Integrate Langtrace with Datadog

Karthik Kalyanaraman

Cofounder and CTO

Oct 17, 2024

Introduction

From day one, we wanted to build an observability solution that is:

  • Open source

  • OpenTelemetry compatible

  • Free of vendor lock-in

As a result of these early design decisions, developers today can use Langtrace's OpenTelemetry-based tracing SDKs with any observability tool of their choice. In this post, we showcase how to use Langtrace to trace your GenAI stack with Datadog.

Setup

  1. Ensure you have a [Datadog](https://datadoghq.com/) account

  2. Install the Langtrace SDK:

pip install -U langtrace-python-sdk
  3. Set the collector endpoint environment variable (in your shell or in a .env file) to point at your OpenTelemetry collector, for example:

export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
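
If you keep this variable in a .env file instead of exporting it in the shell, you can load it at startup before the SDK reads it. A minimal sketch, assuming the python-dotenv package is installed:

from dotenv import load_dotenv

# Load OTEL_EXPORTER_OTLP_ENDPOINT (and any other variables) from a local .env file
load_dotenv()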
  4. Initialize the Langtrace SDK in your application with your OTLP collector endpoint:

import os
from openai import OpenAI

from langtrace_python_sdk import langtrace
from langtrace_python_sdk.utils.with_root_span import with_langtrace_root_span

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import \
    OTLPSpanExporter

# Set up the OTLP exporter with the endpoint from your collector
otlp_exporter = OTLPSpanExporter(
    # This should match your collector's OTLP gRPC endpoint
    endpoint=os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT"),
    insecure=True  # Set to False if using HTTPS
)

# Set up Langtrace with the custom OTLP exporter
langtrace.init(custom_remote_exporter=otlp_exporter)

# rest of your code
# This optional decorator creates a parent root span for the entire application
@with_langtrace_root_span()
def app():
    client = OpenAI(
        api_key="<YOUR_OPENAI_API_KEY>")
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": "How many states of matter are there?"
            }
        ],
    )
    print(response.choices[0].message.content)


app()
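
The collector configuration in the next step also exposes an OTLP HTTP receiver on port 4318, so you can send traces over HTTP instead of gRPC if that suits your setup better. A minimal sketch of swapping in the HTTP exporter (the endpoint value here is an assumption based on that config):

from langtrace_python_sdk import langtrace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import \
    OTLPSpanExporter

# OTLP over HTTP posts traces to the /v1/traces path on the collector
otlp_http_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces"
)

langtrace.init(custom_remote_exporter=otlp_http_exporter)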
  5. Note: In this example, the OTLP collector is set up using a config file, shown below:

receivers:
  otlp:
    protocols:
      http:
        endpoint: "localhost:4318"
      grpc:
        endpoint: "localhost:4317"

processors:
  batch:

exporters:
  datadog:
    api:
      site: "datadoghq.com"
      key: "<DATADOG_API_KEY>"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [datadog]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [datadog]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [datadog]
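
To avoid committing your API key to the config file, recent collector versions also support environment variable substitution; a sketch of the exporters section, assuming DATADOG_API_KEY is exported in the collector's environment:

exporters:
  datadog:
    api:
      site: "datadoghq.com"
      key: ${env:DATADOG_API_KEY}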
  6. With the environment variables set, run your instrumented application using the following command:

python main.py
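
Rather than hard-coding the OpenAI key in your source, you can rely on the OpenAI client reading the OPENAI_API_KEY environment variable when no api_key argument is passed (i.e., construct the client with OpenAI() and no arguments); a sketch:

export OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
python main.py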
  7. Note: Make sure your collector is running and configured correctly. To run the collector locally, you can use the following command:

otelcol-contrib --config otel-collector-config.yaml
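
If you would rather not install the binary, the contrib collector is also published as a Docker image; a sketch using the official otel/opentelemetry-collector-contrib image (the in-container config path is that image's default):

docker run --rm \
  -p 4317:4317 -p 4318:4318 \
  -e DATADOG_API_KEY \
  -v "$(pwd)/otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml" \
  otel/opentelemetry-collector-contrib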
  8. Once the application is running, you should see traces in your Datadog APM dashboard.

For more information, check out our docs: https://docs.langtrace.ai/supported-integrations/observability-tools/datadog

Useful Resources

Ready to try Langtrace?

Try out the Langtrace SDK with just 2 lines of code.


Want to learn more?

Check out our documentation to learn more about how Langtrace works.

Join the Community

Check out our Discord community to ask questions and meet other users.