Track Prompts in Your Traces with Langtrace
Dylan Zuber, Software Engineer
Jul 1, 2024
Langtrace has introduced a new feature that allows you to attach prompt IDs and versions to your traces. This enhancement enables you to filter and group traces based on prompt information, which is especially useful for debugging and monitoring. In this blog post, we’ll guide you through the process of using this feature in both TypeScript and Python.
Why Track Prompt IDs and Versions?
Tracking prompt IDs and versions in your traces allows you to:
Easily identify which prompts are being used in your application.
Monitor the performance and behavior of specific prompts.
Debug issues more effectively by tracing back to the exact prompt version used.
Getting Prompts from the Langtrace Platform
You can start by retrieving prompts from the Langtrace platform (see the how-to guide in our documentation) and including them in your traces. Here's an example of how to do this:
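The snippet below is a minimal sketch in Python. It assumes the SDK exposes a `get_prompt_from_registry` helper and that the returned prompt carries its text, ID, and version; the exact import path, signature, and field names may differ by SDK version, so check the how-to guide. The registry ID and API key are placeholders.

```python
# Sketch: fetch a prompt from the Langtrace prompt registry.
# Assumes the Python SDK exposes `get_prompt_from_registry`; the exact
# import and signature may differ by SDK version -- see the how-to guide.
from langtrace_python_sdk import langtrace, get_prompt_from_registry

langtrace.init(api_key='<LANGTRACE_API_KEY>')  # placeholder key

# '<PROMPT_REGISTRY_ID>' is a placeholder for your prompt registry ID.
prompt = get_prompt_from_registry('<PROMPT_REGISTRY_ID>')

# The returned prompt carries its text along with an ID and version,
# which you can then attach to your traces as additional attributes.
print(prompt['value'], prompt['id'], prompt['version'])
```

With the prompt's ID and version in hand, the next step is attaching them to your trace spans.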
How to Attach Prompt IDs and Versions
Langtrace provides simple methods to attach prompt IDs and versions to your traces: the withAdditionalAttributes function in TypeScript and the inject_additional_attributes function in Python.
TypeScript Example
In TypeScript, you can wrap your code with the withAdditionalAttributes function to add prompt information to your trace spans. Here’s how you can do it:
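Here is a minimal sketch. It assumes the npm package name `@langtrase/typescript-sdk`, an OpenAI chat-completion call as the wrapped work, and placeholder API keys, prompt ID, and version; substitute your own values and model.

```typescript
// Sketch: attach a prompt ID and version to trace spans in TypeScript.
// The keys, prompt ID/version, and model below are placeholders.
import * as Langtrace from '@langtrase/typescript-sdk'
import OpenAI from 'openai'

Langtrace.init({ api_key: '<LANGTRACE_API_KEY>' })
const openai = new OpenAI({ apiKey: '<OPENAI_API_KEY>' })

export async function run (): Promise<void> {
  // Every span created inside this callback gets the extra attributes.
  const response = await Langtrace.withAdditionalAttributes(async () => {
    return await openai.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: [{ role: 'user', content: 'Say hello!' }]
    })
  }, { prompt_id: '<PROMPT_ID>', prompt_version: '<PROMPT_VERSION>' })
  console.log(response.choices[0].message.content)
}
```

Because the attributes are applied to every span created inside the callback, the same Prompt ID and Prompt Version appear on the LLM call's trace in the Langtrace dashboard.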
Python Example
In Python, you can use the inject_additional_attributes function to achieve the same result. Below is an example demonstrating how to add prompt information to your trace spans:
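Here is the equivalent sketch in Python. It assumes an OpenAI chat-completion call as the wrapped work and placeholder API keys, prompt ID, and version; substitute your own values and model.

```python
# Sketch: attach a prompt ID and version to trace spans in Python.
# The keys, prompt ID/version, and model below are placeholders.
from langtrace_python_sdk import langtrace, inject_additional_attributes
from openai import OpenAI

langtrace.init(api_key='<LANGTRACE_API_KEY>')
client = OpenAI(api_key='<OPENAI_API_KEY>')

def generate():
    return client.chat.completions.create(
        model='gpt-4o-mini',
        messages=[{'role': 'user', 'content': 'Say hello!'}],
    )

# Every span created inside the wrapped callable gets the extra attributes.
response = inject_additional_attributes(
    generate,
    {'prompt_id': '<PROMPT_ID>', 'prompt_version': '<PROMPT_VERSION>'},
)
print(response.choices[0].message.content)
```

Note that inject_additional_attributes takes a callable rather than a result, so the attributes are in place before any spans are created.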
Conclusion

Now, when viewing your traces in the “Traces” tab, you can see the Prompt ID and Prompt Version included with the ingested traces! With the ability to attach prompt IDs and versions to your traces, Langtrace provides a powerful tool for monitoring and debugging your applications. By following the examples provided, you can easily integrate this feature into your workflow and gain deeper insights into your prompt usage.
We’d love to hear from you!
We’d love to hear your feedback on Langtrace! We invite you to join our community on Discord or reach out at support@langtrace.ai and share your experiences, insights, and suggestions. Together, we can continue to set new standards of observability and evaluations in LLM application development.
Happy tracing!
Ready to deploy?
Try out the Langtrace SDK with just 2 lines of code.
Want to learn more?
Check out our documentation to learn more about how Langtrace works.
Join the Community
Check out our Discord community to ask questions and meet other Langtrace users.