Troubleshoot data ingestion for AI Agent Monitoring
Troubleshoot issues with the LLM Providers data integration and trace ingestion.
Alpha features described in this document are provided by Splunk to you "as is" without any warranties, maintenance and support, or service-level commitments. Splunk makes this alpha feature available in its sole discretion and may discontinue it at any time. These documents are not yet publicly available and we ask that you keep such information confidential. Use of alpha features is subject to the Splunk Pre-Release Agreement for Hosted Services.
This page applies to:
- AI Agent Monitoring
- AI Agent Security Monitoring
Prerequisite
The troubleshooting topics on this page assume that you have completed the steps in Set up AI Agent Monitoring.
LLM Providers data integration can't be saved
When you try to save the integration, the following error appears:
Splunk Observability Cloud could not establish a connection with LLM provider. Review your authentication credentials and try again.
To resolve this issue, verify that the authentication credentials you entered are correct and that the API token is active, then save the integration again.
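You can also verify the token independently of Splunk Observability Cloud. The following is a minimal sketch, assuming your LLM provider exposes an OpenAI-compatible HTTP API; the base URL, endpoint path, and status-code meanings are illustrative, so check your provider's API reference for the exact details.

```python
# Hedged sketch: check whether an LLM provider API token is still valid by
# calling the provider's model-listing endpoint. Endpoint and status codes
# are assumptions based on common OpenAI-compatible APIs.
import urllib.request
import urllib.error

def classify_status(code: int) -> str:
    """Map an HTTP error status to a likely cause."""
    if code == 401:
        return "invalid or expired token"   # rotate the token
    if code == 429:
        return "rate limit exceeded"        # wait, or raise your provider quota
    return f"provider error ({code})"

def check_token(base_url: str, token: str) -> str:
    """Send an authenticated request and report the outcome."""
    req = urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {token}"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10):
            return "token accepted"
    except urllib.error.HTTPError as err:
        return classify_status(err.code)
```

If this check fails outside Splunk Observability Cloud as well, the problem is with the credentials or the provider account, not with the integration itself.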
Trace data is no longer being ingested
You configured the LLM Providers data integration correctly and confirmed that trace data was being ingested: the AI trace data page displayed traces with a value in the Quality column. However, the page no longer shows new trace data.
To resolve this issue, use the Splunk Observability Cloud main menu to open the LLM Providers data integration settings, then troubleshoot the integration with the following steps.
- Verify that the rate limit isn't exceeded and you haven't run out of tokens for your LLM provider.
- Verify that the LLM model provided in the Model name field is still supported by the provider.
- Verify that the API token you provided is active. Rotate the token if needed.
- Verify that the Number of evaluations setting is appropriate for your trace volume. Splunk Observability Cloud stops running evaluations after it reaches this per-minute limit.
- Verify that the Sampling rate isn't set to 0.