Monitor LLM costs with navigators

Learn how to monitor LLM costs with navigators.

Attention: Beta features described in this document are provided by Splunk to you "as is" without any warranties, maintenance and support, or service-level commitments. Splunk makes this beta feature available in its sole discretion and may discontinue it at any time. Use of beta features is subject to the Splunk General Terms.
Note: This feature is supported only on the LangChain and OpenAI navigators.
Splunk Observability Cloud collects OpenTelemetry generative AI client and model server metrics to track the token usage and costs of large language model (LLM) services. On supported navigators, you can use metric charts to track the cost and token usage for LLM services and models.

For more information on the cost-related metrics and attributes collected by Splunk Observability Cloud, see the LangChain and OpenAI configuration documentation.
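To illustrate how cost can be derived from token-usage data, the following sketch estimates the cost of a single LLM call from its input and output token counts. The model name and per-token prices here are illustrative assumptions for the example only, not Splunk's or any provider's actual rate table; the charts in the navigators compute cost from the collected metrics for you.

```python
# Hypothetical sketch: deriving a per-request LLM cost from token counts,
# the same kind of data reported by OpenTelemetry generative AI metrics.
# PRICING values below are illustrative assumptions, not real rates.

# Price in USD per 1,000 tokens, split by token type (input vs. output).
PRICING = {
    "example-model": {"input": 0.00015, "output": 0.00060},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return an estimated USD cost for one LLM call."""
    rates = PRICING[model]
    return (input_tokens / 1000) * rates["input"] \
        + (output_tokens / 1000) * rates["output"]

# Example: a call that consumed 2,000 input tokens and 500 output tokens.
cost = estimate_cost("example-model", input_tokens=2000, output_tokens=500)
print(f"${cost:.6f}")
```

Summing such per-call estimates over time, grouped by service or model, yields the cost trends that the navigator charts display.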

Complete the following steps to monitor LLM costs with navigators.

  1. From the Splunk Observability Cloud main menu, select Infrastructure.
  2. Under AI/ML, select AI Frameworks.
  3. Select the OpenAI or LangChain navigator.
  4. Use one of the following methods to monitor LLM costs:
    • On the Overview tab, which is selected by default, view the charts for cost and token usage.
    • Select the Table tab, then the name of a service to display the detail view. View the charts for cost and token usage.
    The following screenshot shows the detail view of the LangChain navigator, which includes metric charts for cost and token usage.