Configure the OTLP receiver to collect LangChain metrics

Learn how to configure the OTLP receiver to collect LangChain metrics.

You can monitor the performance of LangChain large-language model (LLM) services by configuring your LangChain services to send metrics to Splunk Observability Cloud. This solution uses the OTLP receiver together with the LangChain instrumentation, which automatically sets up OpenTelemetry to collect performance metrics and statistics from LangChain services.

Complete the following steps to collect metrics from and monitor the performance of LangChain services.

  1. (Prerequisite) Initialize the LangChain instrumentation in your service.
  2. Configure and activate the component for LangChain.
  3. Use the LangChain navigator to monitor the performance of LangChain services.

Prerequisite

Learn how to initialize the LangChain instrumentation in your service.

To use the OTLP receiver to collect metrics from LangChain, you must initialize the LangChain instrumentation in your service.

Add the following code to your service:
from opentelemetry.instrumentation.langchain import LangChainInstrumentor
LangChainInstrumentor().instrument()                
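If the instrumentation package might not be installed in every environment, a minimal sketch like the following (an illustration, not part of the Splunk distribution) initializes the LangChain instrumentation once at startup and degrades gracefully when the package is missing:

```python
def init_langchain_instrumentation() -> bool:
    """Instrument LangChain once at service startup, before any chains run.

    Returns True when instrumentation is active, or False when the
    opentelemetry-instrumentation-langchain package is not installed.
    """
    try:
        from opentelemetry.instrumentation.langchain import LangChainInstrumentor
    except ImportError:
        # Package missing: the service still runs, just without LangChain telemetry.
        return False
    LangChainInstrumentor().instrument()
    return True
```

Call this before constructing any chains so that all LangChain calls are captured.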
                  

Configure and activate the component for LangChain

Learn how to configure and activate the component to monitor LangChain services.

Complete the following steps to configure and activate the component for LangChain.

  1. Deploy the Splunk Distribution of the OpenTelemetry Collector to your host or container platform.
  2. To activate the OTLP receiver for LangChain manually in the Collector configuration, add the OpenTelemetry receiver address to the configuration of your LangChain service. For example:
    otel_endpoint = os.getenv("OTEL_EXPORTER_OTLP_HTTP_ENDPOINT", "{SPLUNK_COLLECTOR_LISTEN_INTERFACE}:4318")
  3. Add the following environment variables to the configuration of your LangChain service:
    OTEL_RESOURCE_ATTRIBUTES="deployment.environment=<environment_name>"
    OTEL_SERVICE_NAME=<service_name>
    OTEL_INSTRUMENTATION_LANGCHAIN_CAPTURE_MESSAGE_CONTENT=<true_or_false>
    OPENAI_API_KEY=<openai_api_key>
                        
  4. Deploy the Python agent in your LangChain service:
    1. Install the Splunk Distribution of OpenTelemetry Python using the guided setup or manual method.
    2. Install the AI/LLM instrumentation by following the steps for LangChain instrumentation in the OpenTelemetry Python Contrib GitHub repository. You can use either zero-code or manual instrumentation.
  5. Run your service.
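The endpoint setting from step 2 and the environment variables from step 3 can be pulled together in a small sketch. The default listen interface below is a placeholder assumption standing in for `{SPLUNK_COLLECTOR_LISTEN_INTERFACE}`; substitute the address your Collector actually listens on:

```python
import os

# Placeholder for {SPLUNK_COLLECTOR_LISTEN_INTERFACE}; replace with your
# Collector's listen address.
DEFAULT_LISTEN_INTERFACE = "0.0.0.0"

def resolve_otlp_endpoint() -> str:
    """Return the OTLP/HTTP endpoint to export to, honoring the env override.

    4318 is the Collector's default OTLP/HTTP port.
    """
    default = f"http://{DEFAULT_LISTEN_INTERFACE}:4318"
    return os.getenv("OTEL_EXPORTER_OTLP_HTTP_ENDPOINT", default)

# Environment from step 3 (placeholder values, for illustration only):
os.environ.setdefault("OTEL_RESOURCE_ATTRIBUTES", "deployment.environment=dev")
os.environ.setdefault("OTEL_SERVICE_NAME", "langchain-demo")
os.environ.setdefault("OTEL_INSTRUMENTATION_LANGCHAIN_CAPTURE_MESSAGE_CONTENT", "false")

otel_endpoint = resolve_otlp_endpoint()
```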

Monitor the performance of LangChain services

Learn how to navigate to the LangChain navigator, which you can use to monitor the performance of LangChain services.

Complete the following steps to access the LangChain navigator and monitor the performance of LangChain services. For more information on navigators, see Use navigators and Monitor LLM costs with navigators.

  1. From the Splunk Observability Cloud main menu, select Infrastructure.
  2. Under AI/ML, select AI Frameworks.
  3. Select the LangChain summary card.

Configuration settings

Learn about the configuration settings for the OTLP receiver.

To view the configuration options for the OTLP receiver, see Settings.

Metrics

Learn about the monitoring metrics available for LangChain services.

The following metrics are available for LangChain services. For more information, see Semantic conventions for generative AI metrics in the OpenTelemetry documentation.
Metric name                        Type   Unit   Description
gen_ai.client.operation.duration   float  s      The duration of the GenAI operation.
gen_ai.client.token.usage          int    count  Measures the number of input (prompt) and output (completion) tokens used.
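As a rough illustration of how gen_ai.client.token.usage is reported, the semantic conventions split token counts by a gen_ai.token.type attribute ("input" for prompt tokens, "output" for completion tokens). The stand-in recorder below mimics that shape in plain Python; a real service records through an OpenTelemetry histogram instrument instead:

```python
from collections import defaultdict

class TokenUsageRecorder:
    """Stand-in for an OpenTelemetry histogram, keyed the way the GenAI
    semantic conventions key gen_ai.client.token.usage."""

    def __init__(self) -> None:
        self._points: dict[tuple[str, str], list[int]] = defaultdict(list)

    def record(self, count: int, token_type: str) -> None:
        # token_type follows gen_ai.token.type: "input" or "output".
        self._points[("gen_ai.client.token.usage", token_type)].append(count)

    def total(self, token_type: str) -> int:
        return sum(self._points[("gen_ai.client.token.usage", token_type)])

recorder = TokenUsageRecorder()
recorder.record(120, "input")   # prompt tokens for one call
recorder.record(35, "output")   # completion tokens for the same call
```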

Attributes

Learn about the resource attributes available for LangChain services.

The following resource attributes are available for LangChain services.
OpenTelemetry attribute name   Description
telemetry.sdk.name             The name of the telemetry SDK.
telemetry.sdk.language         The language of the telemetry SDK.
telemetry.sdk.version          The version string of the telemetry SDK.
deployment.environment         The environment where the GenAI service is running.
service.name                   The name of the service.
host.name                      The name of the host.
os.type                        The OS type.
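Several of these attributes come from the OTEL_RESOURCE_ATTRIBUTES variable, which is a comma-separated list of key=value pairs. The minimal parser below is shown only to illustrate the format; the OpenTelemetry SDK performs this parsing for you:

```python
def parse_resource_attributes(raw: str) -> dict[str, str]:
    """Parse an OTEL_RESOURCE_ATTRIBUTES-style string into a dict.

    Example input: "deployment.environment=dev,service.name=langchain-demo"
    """
    attrs: dict[str, str] = {}
    for pair in raw.split(","):
        pair = pair.strip()
        if not pair:
            continue
        key, sep, value = pair.partition("=")
        if sep:  # skip malformed entries that have no '='
            attrs[key.strip()] = value.strip()
    return attrs

attrs = parse_resource_attributes("deployment.environment=dev,service.name=langchain-demo")
```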

Troubleshoot

Learn how to get help if you can't see your data in Splunk Observability Cloud.

If you are a Splunk Observability Cloud customer and are not able to see your data in Splunk Observability Cloud, you can get help in the following ways:

  • Prospective customers and free trial users can ask a question and get answers through community support in the Splunk Community.