Integrate Langtrace with xAI's Grok

Karthik Kalyanaraman

Cofounder and CTO

Oct 21, 2024

Introduction

We are excited to announce that Langtrace now natively supports xAI's Grok family of models. This means Langtrace will automatically capture traces and metrics, including token usage, cost, latency, and model hyperparameters, when you use any of the Grok models.

Setup

  1. Sign up to Langtrace, create a project, and get a Langtrace API key

  2. Install the Langtrace SDK

pip install -U langtrace-python-sdk

  3. Set the environment variable

export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY

  4. Initialize Langtrace in your code

import os
from langtrace_python_sdk import langtrace # Must precede any llm module imports
from openai import OpenAI

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

client = OpenAI(
  api_key=os.environ["XAI_API_KEY"],
  base_url="https://api.x.ai/v1",
)

chat_completion = client.chat.completions.create(
  model="grok-beta",
  messages=[
      {
          "role": "user",
          "content": "What is LangChain?",
      }
  ],
)
print(chat_completion.choices[0].message.content)
  5. See the traces in Langtrace
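Since Grok is served through an OpenAI-compatible endpoint, streaming completions are traced the same way as regular ones. Below is a minimal sketch of a streaming variant of the call above; the helper name `stream_grok_reply` is our own illustration, not part of either SDK, and the same environment variables are assumed.

```python
import os

def stream_grok_reply(prompt: str) -> str:
    """Stream a Grok completion, printing tokens as they arrive."""
    # Imports live inside the helper so that langtrace.init() runs before
    # the OpenAI import, mirroring the ordering in the snippet above.
    from langtrace_python_sdk import langtrace
    langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

    from openai import OpenAI
    client = OpenAI(
        api_key=os.environ["XAI_API_KEY"],
        base_url="https://api.x.ai/v1",
    )

    parts = []
    for chunk in client.chat.completions.create(
        model="grok-beta",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # deltas arrive chunk by chunk
    ):
        delta = chunk.choices[0].delta.content or ""
        parts.append(delta)
        print(delta, end="", flush=True)
    return "".join(parts)

if __name__ == "__main__" and os.environ.get("XAI_API_KEY"):
    stream_grok_reply("What is LangChain?")
```

Each streamed chunk is captured as part of the same trace, so token usage and latency still show up in the Langtrace dashboard.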

For more information, check out our docs: https://docs.langtrace.ai/supported-integrations/llm-tools/xai#xai

Useful Resources

Ready to try Langtrace?

Try out the Langtrace SDK with just 2 lines of code.


Want to learn more?

Check out our documentation to learn more about how Langtrace works.

Join the Community

Check out our Discord community to ask questions and meet other users.