Integrate Langtrace with DeepSeek

Karthik Kalyanaraman

Cofounder and CTO

Dec 2, 2024

Introduction

We are excited to announce that Langtrace now natively supports DeepSeek's family of models. This means Langtrace will automatically capture traces and metrics, including token usage, cost, latency, and model hyperparameters, when you use DeepSeek models.

Setup

  1. Sign up for Langtrace, create a project, and get a Langtrace API key

  2. Install the Langtrace SDK

pip install -U langtrace-python-sdk
  3. Set the environment variable

export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY
  4. Initialize Langtrace in your code

import os
from langtrace_python_sdk import langtrace # Must precede any llm module imports
from openai import OpenAI

langtrace.init(api_key=os.environ['LANGTRACE_API_KEY'])


client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False
)

print(response.choices[0].message.content)
  5. See the traces in Langtrace

For more information, check out our docs: https://docs.langtrace.ai/supported-integrations/llm-tools/deepseek#deepseek

Useful Resources

Ready to try Langtrace?

Try out the Langtrace SDK with just 2 lines of code.


Want to learn more?

Check out our documentation to learn more about how Langtrace works.

Join the Community

Check out our Discord community to ask questions and meet other users.