Monitoring a Langchain RAG Application with SigNoz and Langtrace

Karthik Kalyanaraman

Cofounder and CTO

Dec 12, 2024

Introduction

This guide outlines the steps to set up a Retrieval-Augmented Generation (RAG) chatbot application. The chatbot uses Elasticsearch for document search, Azure OpenAI for generating responses, and Langtrace to send observability data to SigNoz. You can run the chatbot using the code in this GitHub repository.

Step 1: Set Environment Variables for OpenTelemetry Traces

Set up the following environment variables to enable OpenTelemetry tracing and send traces to SigNoz:

export OTEL_EXPORTER_OTLP_ENDPOINT="ingest.us.signoz.cloud:443"
export OTEL_EXPORTER_OTLP_HEADERS="signoz-access-token=<token>"
export OTEL_RESOURCE_ATTRIBUTES="service.name=rag chatbot"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"

Replace <token> with your SigNoz access token.
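If you prefer to keep this configuration in code, the same values can be set from Python before Langtrace is initialized. This is a sketch, not part of the repository; the token remains a placeholder:

import os

# Must be set before langtrace.init() so the OTLP exporter picks them up.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "ingest.us.signoz.cloud:443"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "signoz-access-token=<token>"
os.environ["OTEL_RESOURCE_ATTRIBUTES"] = "service.name=rag chatbot"
os.environ["OTEL_EXPORTER_OTLP_PROTOCOL"] = "grpc"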

Step 2: Install and Set Up Langtrace

Install the Langtrace Python SDK in your application environment:
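
pip install langtrace-python-sdk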

Next, integrate Langtrace into your Flask app. Open the app's app.py file and add the following code at the very top, before any LLM or vector-store imports, so that those libraries are picked up by the instrumentation:

from langtrace_python_sdk import langtrace

langtrace.init()

This enables automatic tracing and observability for your application.
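To make the placement concrete, here is a minimal sketch of how the initialization might sit in a Flask app. The /api/chat route and its response logic are placeholders for illustration, not the chatbot's actual code:

from langtrace_python_sdk import langtrace

# Initialize Langtrace first so that subsequently imported LLM and
# vector-store libraries are covered by the automatic instrumentation.
langtrace.init()

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/api/chat", methods=["POST"])  # placeholder route for illustration
def chat():
    question = request.json.get("question", "")
    # ... retrieval and generation logic would go here ...
    return jsonify({"answer": f"Echo: {question}"})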

Step 3: Configure Azure OpenAI Environment Variables

Ensure the necessary Azure OpenAI environment variables are set up:

export LLM_TYPE=azure
export OPENAI_VERSION=<version>       # e.g., 2023-05-15
export OPENAI_BASE_URL=<base_url>     # Azure OpenAI endpoint URL
export OPENAI_API_KEY=<api_key>       # Azure OpenAI API key
export OPENAI_ENGINE=<deployment_name>  # Azure deployment name

Replace the placeholders with the corresponding values from your Azure OpenAI setup.
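As a rough sketch of how these variables might be consumed, LangChain's Azure OpenAI chat client (from the langchain-openai package) can be built directly from them. Treat this as an illustration rather than the repository's exact wiring:

import os
from langchain_openai import AzureChatOpenAI

# Construct the chat model from the environment variables exported above.
llm = AzureChatOpenAI(
    azure_endpoint=os.environ["OPENAI_BASE_URL"],
    api_key=os.environ["OPENAI_API_KEY"],
    api_version=os.environ["OPENAI_VERSION"],
    azure_deployment=os.environ["OPENAI_ENGINE"],
)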

Step 4: Run the Flask Application

Start the Flask application by running the following command:
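Assuming the entry module is app.py (as instrumented above), the standard Flask CLI command is enough:

flask run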

This will start your RAG application, making it available for interaction.

Step 5: Use the Chatbot

Interact with the chatbot by asking questions. The chatbot uses Elasticsearch to retrieve relevant documents and Azure OpenAI to generate responses grounded in them.
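For example, if the app exposes a POST /api/chat endpoint like the placeholder route sketched earlier (the path and payload shape are assumptions, not guaranteed by the repository), a quick test could look like:

curl -X POST http://localhost:5000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"question": "Which documents mention quarterly revenue?"}'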

Step 6: Observe Traces in SigNoz

After using the chatbot, navigate to your SigNoz traces dashboard. You’ll see detailed traces of your application, including interactions with Elastic Search, Azure OpenAI, and the RAG logic. Use these traces for debugging, performance monitoring, and optimization.

SigNoz dashboard

Conclusion

By following these steps, you can set up and observe a fully functional RAG chatbot application with integrated observability.

Useful Resources

Ready to try Langtrace?

Try out the Langtrace SDK with just 2 lines of code.


Want to learn more?

Check out our documentation to learn more about how Langtrace works.

Join the Community

Check out our Discord community to ask questions and meet customers.