Integrate Langtrace with LiteLLM
Karthik Kalyanaraman, Cofounder and CTO
Oct 15, 2024
Introduction
We are excited to announce that Langtrace now supports LiteLLM natively. This means:
If you are using LiteLLM for inference, you can set up Langtrace to automatically capture metrics such as token usage, cost, latency, and request metadata.
Additionally, you can add Langtrace to your LiteLLM proxy and configure it to send metrics to any OpenTelemetry-compatible observability tool in addition to Langtrace.
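For the OpenTelemetry route, here is a hedged sketch using the LiteLLM Python SDK; the proxy takes an equivalent YAML config (see the LiteLLM docs). The "otel" callback and the OTEL_* environment variable names follow LiteLLM's OpenTelemetry docs at the time of writing, and the endpoint and header values below are placeholders:

```python
# A sketch of exporting LiteLLM metrics to any OpenTelemetry-compatible
# backend. The "otel" callback and OTEL_* variable names are taken from
# LiteLLM's OpenTelemetry docs; endpoint and header values are placeholders.
import os
import litellm

os.environ["OTEL_EXPORTER"] = "otlp_http"
os.environ["OTEL_ENDPOINT"] = "https://your-otel-collector:4318"   # placeholder
os.environ["OTEL_HEADERS"] = "x-api-key=<your-observability-key>"  # placeholder

litellm.callbacks = ["otel"]  # emit OpenTelemetry spans for every request

response = litellm.completion(
    model="gpt-4o-mini",  # placeholder; any LiteLLM-supported model works
    messages=[{"role": "user", "content": "ping"}],
)
```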
Setup
Sign up for Langtrace, create a project, and get a Langtrace API key
Install the Langtrace SDK
Set up the .env variable
Initialize Langtrace in your code (a minimal end-to-end sketch follows this list)
See the traces in Langtrace
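Here is a minimal end-to-end sketch of the install, .env, and initialization steps in Python. The package name, the langtrace.init signature, and the LANGTRACE_API_KEY variable follow the Langtrace docs at the time of writing; the model name and keys are placeholders, so verify against the current docs:

```python
# Install the SDKs first (shell):
#   pip install langtrace-python-sdk litellm
#
# .env (variable name per the Langtrace docs; values are placeholders):
#   LANGTRACE_API_KEY=<your-langtrace-api-key>
#   OPENAI_API_KEY=<your-openai-api-key>

import os

# Initialize Langtrace before importing LiteLLM so the instrumentation
# can hook the library's calls.
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

import litellm

# This call is now traced automatically: token usage, cost, latency,
# and request metadata appear under your Langtrace project.
response = litellm.completion(
    model="gpt-4o-mini",  # placeholder; any LiteLLM-supported model works
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```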
For more information, check out our docs: https://docs.langtrace.ai/supported-integrations/llm-frameworks/litellm
Useful Resources
Getting started with Langtrace https://docs.langtrace.ai/introduction
Langtrace Website https://langtrace.ai/
Langtrace Discord https://discord.langtrace.ai/
Langtrace GitHub https://github.com/Scale3-Labs/langtrace
Langtrace Twitter (X) https://x.com/langtrace_ai
Langtrace LinkedIn https://www.linkedin.com/company/langtrace/about/
Ready to deploy?
Try out the Langtrace SDK with just 2 lines of code.
Want to learn more?
Check out our documentation to learn more about how Langtrace works.
Join the Community
Check out our Discord community to ask questions and meet customers.