Collect metrics and traces from LangChain services

Learn how to collect metrics and traces from LangChain services.

You can collect metrics and traces from LangChain large language model (LLM) services by instrumenting your Python application with the Splunk Distribution of OpenTelemetry Python and the OpenTelemetry LangChain instrumentation. The LangChain instrumentation pushes metrics and traces to the OTLP receiver.

Complete the following steps to collect metrics and traces from LangChain services and monitor their performance using navigators.

  1. (Prerequisite) Initialize the LangChain instrumentation in your service.
  2. Configure and activate the component for LangChain.
  3. Use the LangChain navigator to monitor the performance of LangChain services.

Prerequisites

Learn how to initialize the LangChain instrumentation in your service.

To use the LangChain instrumentation to push LangChain metrics and traces to the OTLP receiver, you must install and initialize the LangChain instrumentation in your service.

  1. Run the following command:
    pip install git+https://github.com/zhirafovod/opentelemetry-python-contrib.git@instrumentation-genai-langchain#subdirectory=instrumentation-genai/opentelemetry-instrumentation-langchain
  2. Add the following code to your service:

    from opentelemetry.instrumentation.langchain import LangChainInstrumentor

    LangChainInstrumentor().instrument()

Configure and activate the component for LangChain

Learn how to configure and activate the component to monitor LangChain services.

Complete the following steps to configure and activate the component for LangChain.

  1. Deploy the Splunk Distribution of the OpenTelemetry Collector to your host or container platform.
  2. Add the address of the Collector's OTLP receiver to the configuration of your LangChain service. This step is required to send telemetry to the receiver. For example, read the endpoint from the environment:
    otel_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
  3. Add the following environment variables to the configuration of your LangChain service:
    OTEL_RESOURCE_ATTRIBUTES="deployment.environment=<environment_name>"
    OTEL_SERVICE_NAME=<service_name>
    OTEL_INSTRUMENTATION_LANGCHAIN_CAPTURE_MESSAGE_CONTENT=<true_or_false>
    OPENAI_API_KEY=<openai_api_key>
  4. Deploy the Python agent in your LangChain service:
    1. Install the Splunk Distribution of OpenTelemetry Python using the guided setup or manual method.
    2. Install the AI/LLM instrumentation by following the steps on LangChain instrumentation in the OpenTelemetry Python Contrib GitHub repository. You can use zero-code instrumentation or manual instrumentation.
  5. Run your service.
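Putting steps 2 and 3 together, the following sketch shows how a service can read these settings at startup. The variable names follow standard OpenTelemetry conventions; the values assigned here are illustrative placeholders, not required settings, and in a real deployment the variables would be set outside the process.

```python
import os

# Illustrative values; in practice, set these in your deployment
# environment rather than in code.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"
os.environ["OTEL_RESOURCE_ATTRIBUTES"] = "deployment.environment=dev"
os.environ["OTEL_SERVICE_NAME"] = "langchain-demo"

# Step 2: read the Collector's OTLP receiver address from the environment.
otel_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")

# Step 3: the remaining variables are picked up by the OpenTelemetry SDK
# at startup; reading them here only confirms the configuration.
service_name = os.getenv("OTEL_SERVICE_NAME")
```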

Monitor the performance of LangChain services

Learn how to navigate to the LangChain navigator, which you can use to monitor the performance of LangChain services.

Complete the following steps to access the LangChain navigator and monitor the performance of LangChain services. For more information on navigators, see Use navigators and Monitor LLM costs with navigators.

  1. From the Splunk Observability Cloud main menu, select Infrastructure.
  2. Under AI/ML, select AI Frameworks.
  3. Select the LangChain summary card.

Configuration settings

Learn about the configuration settings for the OTLP receiver.

To view the configuration options for the OTLP receiver, see Settings.

Metrics

Learn about the monitoring metrics available for LangChain services.

The following metrics are available for LangChain services. For more information, see Semantic conventions for generative AI metrics in the OpenTelemetry documentation.
Metric name                       Type   Unit   Description
gen_ai.client.operation.duration  float  s      The duration of the GenAI operation.
gen_ai.client.token.usage         int    count  The number of input (prompt) and output (completion) tokens used.
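To illustrate how gen_ai.client.token.usage separates input and output tokens, each data point carries a token-type attribute per the OpenTelemetry GenAI semantic conventions. The dictionaries below are hypothetical sample points for illustration only, not real exporter output, and the exact attribute name and values are assumptions based on those conventions.

```python
# Hypothetical sample data points for gen_ai.client.token.usage.
# Attribute names and values are illustrative assumptions.
sample_points = [
    {"attributes": {"gen_ai.token.type": "input"}, "value": 120},
    {"attributes": {"gen_ai.token.type": "output"}, "value": 45},
]

# The total token count for a call is the sum across token types.
total_tokens = sum(p["value"] for p in sample_points)
```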

Attributes

Learn about the resource attributes available for LangChain services.

The following resource attributes are available for LangChain services.
OpenTelemetry attribute name  Description
telemetry.sdk.name            The name of the telemetry SDK.
telemetry.sdk.language        The language of the telemetry SDK.
telemetry.sdk.version         The version string of the telemetry SDK.
deployment.environment        The environment where the GenAI service is running.
service.name                  The name of the service.
host.name                     The name of the host.
os.type                       The operating system type.
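The deployment.environment attribute above comes from the OTEL_RESOURCE_ATTRIBUTES variable set earlier, which holds a comma-separated list of key=value pairs. The following simplified sketch shows roughly how such a value maps to resource attributes; it is an illustration, not the OpenTelemetry SDK's actual parser.

```python
def parse_resource_attributes(raw: str) -> dict:
    """Split a comma-separated key=value list into a dict (simplified)."""
    pairs = (item.split("=", 1) for item in raw.split(",") if item)
    return {key.strip(): value.strip() for key, value in pairs}

# Example value in the same form as OTEL_RESOURCE_ATTRIBUTES.
attrs = parse_resource_attributes(
    "deployment.environment=prod,service.name=langchain-app"
)
```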

Troubleshoot

Learn how to get help if you can't see your data in Splunk Observability Cloud.

If you can't see your data in Splunk Observability Cloud, you can get help in the following ways:

  • Prospective customers and free trial users can ask a question and get answers through community support in the Splunk Community.