Track Prompts in Your Traces with Langtrace

Dylan Zuber

Software Engineer

Jul 1, 2024

Langtrace has introduced a new feature that allows you to attach prompt IDs and versions to your traces. This enhancement enables you to filter and group traces based on prompt information, which is especially useful for debugging and monitoring. In this blog post, we’ll guide you through the process of using this feature in both TypeScript and Python.

Why Track Prompt IDs and Versions?

Tracking prompt IDs and versions in your traces allows you to:

  • Easily identify which prompts are being used in your application.

  • Monitor the performance and behavior of specific prompts.

  • Debug issues more effectively by tracing back to the exact prompt version used.

Getting Prompts from the Langtrace Platform

You can start by retrieving prompts from the Langtrace platform (see the how-to guide in the Langtrace docs) and including them in your traces. Here’s an example of how to do this:

# Must precede any LLM module imports
from langtrace_python_sdk import get_prompt_from_registry

langtrace_prompt = get_prompt_from_registry("<Prompt Registry ID>")
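
The call returns the stored prompt along with its metadata. Here’s a minimal sketch of pulling out the prompt text, assuming the response exposes it under a "value" key and that a specific version can be requested through an options dict (check the how-to guide for the exact response shape):

# Sketch only: the "value" key and options={"prompt_version": ...}
# are assumptions -- consult the prompt registry how-to guide for
# the exact response shape in your SDK version.
langtrace_prompt = get_prompt_from_registry(
    "<Prompt Registry ID>",
    options={"prompt_version": 1},  # pin a specific version (assumed option)
)
prompt_text = langtrace_prompt["value"]  # the stored prompt text (assumed key)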

How to Attach Prompt IDs and Versions

Langtrace provides simple methods to attach prompt IDs and versions to your traces. Depending on the language you are using, you can use either the withAdditionalAttributes function in TypeScript or the inject_additional_attributes function in Python.

TypeScript Example

In TypeScript, you can wrap your code with the withAdditionalAttributes function to add prompt information to your trace spans. Here’s how you can do it:

import * as Langtrace from '@langtrace/typescript-sdk'
import OpenAI from 'openai'

// Initialize Langtrace before instantiating the OpenAI client
Langtrace.init({
  write_spans_to_console: true,
})

const openai = new OpenAI()

export const run = async () => {
  // Wrap the LLM call so its spans carry the prompt metadata
  const response = await Langtrace.withAdditionalAttributes(async () => {
    return await openai.chat.completions.create({
      model: 'gpt-4',
      messages: [
        { role: 'system', content: 'Talk like a pirate' },
        { role: 'user', content: 'Tell me a story in 3 sentences or less.' },
      ],
      stream: false,
    })
  }, { prompt_id: 'prompt1234', prompt_version: '1' })
  return response
}

run().then(() => console.log('done'))
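
Note that withAdditionalAttributes applies the attributes to the spans created inside the wrapped callback, so the span for the chat completion call above is exported with prompt_id and prompt_version already attached.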

Python Example

In Python, you can use the inject_additional_attributes function to achieve the same result. Below is an example demonstrating how to add prompt information to your trace spans:

# Langtrace must be imported and initialized before any LLM module imports
from langtrace_python_sdk import inject_additional_attributes, langtrace

langtrace.init(api_key="<LANGTRACE_API_KEY>")

from openai import OpenAI

client = OpenAI()

def do_llm_stuff(name=""):
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return response

def main():
    response = inject_additional_attributes(
        lambda: do_llm_stuff(name="llm"),
        {"prompt_id": "promptid1234", "prompt_version": "1"},
    )

    # If the function does not take arguments, this syntax also works:
    # response = inject_additional_attributes(
    #     do_llm_stuff, {"prompt_id": "promptid1234", "prompt_version": "1"}
    # )

main()
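
The two features compose naturally: fetch a prompt from the registry, use its text in the completion call, and tag the resulting trace with the same ID and version. Below is a minimal end-to-end sketch, again assuming the registry response exposes the prompt text under a "value" key (check the how-to guide for the exact shape):

from langtrace_python_sdk import (
    get_prompt_from_registry,
    inject_additional_attributes,
    langtrace,
)
from openai import OpenAI

langtrace.init(api_key="<LANGTRACE_API_KEY>")
client = OpenAI()

registry_id = "<Prompt Registry ID>"
# "value" is an assumed key; see the prompt registry how-to guide
prompt = get_prompt_from_registry(registry_id)

def call_llm():
    return client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": prompt["value"]},
            {"role": "user", "content": "Tell me a story in 3 sentences or less."},
        ],
        stream=False,
    )

# The resulting trace records exactly which registry prompt produced it
response = inject_additional_attributes(
    call_llm, {"prompt_id": registry_id, "prompt_version": "1"}
)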

Conclusion

Now, when viewing your traces in the “Traces” tab, you can see the Prompt ID and Prompt Version included with the ingested traces! With the ability to attach prompt IDs and versions to your traces, Langtrace provides a powerful tool for monitoring and debugging your applications. By following the examples provided, you can easily integrate this feature into your workflow and gain deeper insights into your prompt usage.

We’d love to hear from you!

We’d love to hear your feedback on Langtrace! We invite you to join our community on Discord or reach out at support@langtrace.ai and share your experiences, insights, and suggestions. Together, we can continue to set new standards of observability and evaluations in LLM application development.

Happy tracing!
