Monitor LangChain Applications
The Splunk AppDynamics Python Agent helps you monitor GenAI apps built using the LangChain framework.
Prerequisites
Ensure that you set the `enable-langchain` flag to `true` in the `/path/to/appdynamics.cfg` file of the Python Agent. See [instrumentation].
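For reference, a minimal sketch of what this prerequisite could look like in the agent configuration file is shown below. The `[instrumentation]` section name is an assumption made for illustration; confirm the exact location and syntax of the flag in the instrumentation documentation.

```ini
# Sketch only: the section that holds this flag is an assumption;
# verify against the instrumentation documentation.
[instrumentation]
enable-langchain = true
```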
The Python Agent supports the following versions of LangChain components in Splunk AppDynamics:
| Component | Version |
|---|---|
| langchain | <= 0.2.11 |
| langchain-ollama | <= 0.2.0 |
| langchain-chroma | <= 0.1.1 |
| langchain-postgres | <= 0.0.12 |
| chromadb | <= 0.5.20 |
| pgvector | <= 0.2.5 |
Monitor LangChain Ollama APIs
When you monitor Ollama calls, the Python Agent reports the following metrics:
- Input Tokens
- Output Tokens
- Time to First Token (ms)
- Time per Output Token (ms)
- Average Response Time (ms)
- Prompt Count
- Embedding Queries Count
- Errors
To report token metrics, ensure that you install the transformers Python library. See transformers.

```
pip install transformers
```
Example metric path: Agent|Langchain|LLM|llama3.2_latest|Average Response Time (ms)
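For context, the sketch below shows the kind of LangChain Ollama calls that produce these metrics once the app runs under the Python Agent with `enable-langchain` set. The model name (llama3.2) and the locally running Ollama server are assumptions chosen for illustration.

```python
# Minimal sketch of LangChain Ollama calls that the agent can instrument.
# Assumes a local Ollama server with the llama3.2 model pulled, and that
# the app is launched under the Python Agent per the instrumentation docs.
from langchain_ollama import ChatOllama, OllamaEmbeddings

llm = ChatOllama(model="llama3.2")

# Chat calls are the kind of traffic behind Prompt Count, Input/Output Tokens,
# Time to First Token, Time per Output Token, and Average Response Time.
reply = llm.invoke("Summarize why application observability matters in one sentence.")
print(reply.content)

# Embedding calls are the kind of traffic behind Embedding Queries Count.
embeddings = OllamaEmbeddings(model="llama3.2")
vector = embeddings.embed_query("application observability")
print(len(vector))
```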
Create a Custom Dashboard to Monitor Ollama APIs
Monitor LangChain Vectorstores
When you instrument the app with the Python Agent, it captures the following for vectorstores:
- For `langchain_community` and `langchain_{vendor}` vectorstores, the Python Agent captures the following metrics (a usage sketch follows this list):
  - Search Score
  - Average Response Time (ms)
  - Errors
  - Calls
  - Vector Insertion Count
  - Vector Deletion Count

  Note: By default, Splunk AppDynamics SaaS captures these metrics for the following `langchain_{vendor}` vectorstores:
  - langchain_milvus
  - langchain_chroma
  - langchain_postgres

  However, you can set the `langchain-vectorstores-instrumented-modules` flag in `/path/to/appdynamics.cfg` to capture the metrics from a specific LangChain vendor. For more information, see [instrumentation].
- Exit calls to the vector databases: pgvector and chroma.
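As a concrete illustration of the calls behind these metrics, the sketch below uses the langchain-chroma vectorstore with Ollama embeddings; the collection name, sample texts, and embedding model are placeholders chosen for this example.

```python
# Minimal sketch of LangChain Chroma vectorstore calls that the Python Agent
# can instrument when the app runs with enable-langchain set.
from langchain_chroma import Chroma
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3.2")  # assumes a local Ollama server
store = Chroma(collection_name="docs", embedding_function=embeddings)

# Insertions are the kind of calls behind Vector Insertion Count.
ids = store.add_texts([
    "AppDynamics monitors application performance.",
    "LangChain helps build GenAI pipelines.",
])

# Similarity searches are the kind of calls behind Calls, Search Score,
# and Average Response Time (ms).
for doc, score in store.similarity_search_with_score("what monitors apps?", k=1):
    print(score, doc.page_content)

# Deletions are the kind of calls behind Vector Deletion Count.
store.delete(ids=ids)
```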
To capture PGVector and Chroma vectorstore requests and responses in the snapshot exit call details, set the `enable-genai-data-capture` flag to `true` in the `/path/to/appdynamics.cfg` file of the Python Agent. See [instrumentation].
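A sketch of how this flag might sit alongside the other LangChain settings referenced on this page follows; the section name and the value format for the vendor list are assumptions, so confirm them in the instrumentation documentation.

```ini
# Sketch only: section name and value formats are assumptions;
# verify against the instrumentation documentation.
[instrumentation]
enable-langchain = true
enable-genai-data-capture = true
# Hypothetical comma-separated value shown for illustration.
langchain-vectorstores-instrumented-modules = langchain_chroma,langchain_postgres
```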