Feature preview: Third-party LLM Usage

In addition to large language models (LLMs) hosted on the Splunk platform, version 1.3.1 of Splunk AI Assistant for SPL offers a feature preview that uses third-party LLMs hosted in Azure OpenAI.

Note: Version 1.3.1 of this feature preview is available only to Splunk Cloud platform users.

When you remain opted in to this feature preview, the assistant determines when to use an LLM hosted on the Splunk platform and when to use a third-party LLM. Third-party LLMs can improve the quality of the assistant's responses, depending on factors such as use case and cost.

CAUTION: When your administrator installs or upgrades to version 1.3.1, you are opted in to the Third-party LLM Usage (preview) feature by default. Your administrator can turn participation off or back on at any time.

Requirement

Version 1.3.1 of the Third-party LLM Usage (preview) feature is available only to Splunk Cloud platform users in the following regions:

  • AWS - Canada Central
  • AWS - EU Paris
  • AWS - AP Sydney
  • AWS - US East Virginia
  • AWS - US West Oregon
  • AWS - EU London
  • AWS - AP Tokyo

Participating in the preview

When you install or upgrade to version 1.3.1, the Third-party LLM Usage (preview) feature is turned on by default. Administrators can opt out or back in at any time, and the change takes effect immediately.

Note: Only users with administrator privileges can opt in to or out of this preview feature.

The Third-party LLM Usage (preview) setting applies at the app level and affects all users. It cannot be turned off or on for individual users.
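As a point of orientation, app-level settings in Splunk live in configuration stanzas owned by the app rather than by any user, which is why one toggle reaches every user. The sketch below builds the standard Splunk REST path for such a stanza; the app, conf file, and stanza names are hypothetical placeholders, not the assistant's actual configuration names.

```python
# Sketch: where an app-level setting lives in Splunk's REST API.
# The app, conf file, and stanza names below are HYPOTHETICAL; only the
# /servicesNS/nobody/<app>/configs/conf-<file>/<stanza> pattern is the
# standard Splunk endpoint for app-level configuration.

def app_setting_endpoint(host: str, app: str, conf_file: str, stanza: str) -> str:
    """Build the management-port URL for an app-level configuration stanza.

    The "nobody" owner means the stanza belongs to the app context, not to
    any single user, so a value stored here applies to all users of the app.
    """
    return (f"https://{host}:8089/servicesNS/nobody/{app}"
            f"/configs/conf-{conf_file}/{stanza}")

# Hypothetical names for illustration only:
url = app_setting_endpoint("sh.example.com", "splunk_ai_assistant",
                           "assistant_settings", "llm")
```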

To opt in to or out of this preview feature after the app installation or upgrade, navigate to the Settings tab of the assistant. Select or deselect the Use third-party LLMs option, as shown in the following image:

This image shows the Settings tab of Splunk AI Assistant for SPL. The setting where administrators can toggle the Third-party LLM Usage preview feature on or off is highlighted.

Note: Users without administrator privileges can see the optimization information and the selected setting, but cannot change it.

Using the Third-party LLM Usage (preview) feature

When you opt in to the Third-party LLM Usage (preview), Splunk AI Assistant for SPL can use an external large language model (LLM) hosted in Azure OpenAI. When the app determines that the external LLM is the better choice, that LLM generates the response, which can improve response quality. The app chooses between the available LLMs based on factors such as use case, latency, accuracy, and cost.
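The routing decision described above can be pictured as a function over those factors. The following is a toy illustration only; the criteria names, thresholds, and weights are assumptions for the sake of the example, not the app's actual routing logic.

```python
# Toy illustration of choosing between an internal (Splunk platform hosted)
# and an external (third-party) LLM. The thresholds and inputs below are
# ASSUMPTIONS for illustration; the app's real logic is not documented here.

def choose_llm(complexity: float, latency_budget_ms: int,
               cost_sensitive: bool) -> str:
    """Return "external" when a harder prompt justifies the third-party LLM."""
    if cost_sensitive and complexity < 0.7:
        return "internal"   # cheap path for simple prompts
    if latency_budget_ms < 500:
        return "internal"   # an external round trip may be too slow
    if complexity >= 0.7:
        return "external"   # potential quality win for complex SPL generation
    return "internal"
```

The point of the sketch is only that the choice is made per request from several inputs, so the same user can receive internal responses for some prompts and external responses for others.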

CAUTION: The external LLM endpoint is secure but is outside the Splunk platform data boundary. The search prompt is sent to the third-party LLM and is governed by the third-party LLM provider's data handling policy.

The Third-party LLM Usage (preview) feature includes enterprise-grade compliance and regional data boundaries. Opting in causes no disruption to Splunk AI Assistant for SPL services or responsiveness.

When you opt in, search responses are tagged with their source: internal, meaning the Splunk platform hosted LLM, or external, meaning a third-party LLM. Administrators can view these audit log tags as needed.
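If the audit entries carry such a source tag, an administrator could tally internal versus external responses when reviewing usage. In the sketch below, the `llm_source` field name and the sample line format are hypothetical; only the internal/external tag values come from the documentation.

```python
# Count responses by LLM source tag. The "llm_source=" field name and the
# sample audit line format are HYPOTHETICAL illustrations; only the
# internal/external tag values are documented.

def count_llm_sources(log_lines):
    """Tally audit lines by their internal/external LLM source tag."""
    counts = {"internal": 0, "external": 0}
    for line in log_lines:
        for tag in counts:
            if f"llm_source={tag}" in line:
                counts[tag] += 1
    return counts

sample = [
    "action=assistant_response llm_source=internal user=admin",
    "action=assistant_response llm_source=external user=admin",
    "action=assistant_response llm_source=external user=analyst",
]
```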