Feature preview: Third-party LLM Usage
In addition to Splunk platform-hosted large language models (LLMs), version 1.3.1 of Splunk AI Assistant for SPL offers a feature preview that leverages third-party LLMs hosted in Azure OpenAI.
While you remain opted in to this feature preview, the assistant determines when to use a Splunk platform-hosted LLM and when to use a third-party LLM. Depending on factors such as use case and cost, third-party LLMs can improve the quality of the assistant's responses.
What is supported
Splunk AI Assistant for SPL supports natural language input and is subject to Microsoft's Azure OpenAI acceptable use policy and the content requirements of the Microsoft AI code of conduct. See https://www.microsoft.com/en-us/microsoft-365/legal/docid12 and https://learn.microsoft.com/en-us/legal/ai-code-of-conduct#content-requirements.
Splunk AI Assistant for SPL responds in natural language with a summary of insights synthesized from multiple sources.
Requirements
In version 1.3.1, the Third-party LLM Usage (preview) feature is available only to Splunk Cloud Platform users in the following regions:
- AWS - Canada Central
- AWS - EU Paris
- AWS - AP Sydney
- AWS - US East Virginia
- AWS - US West Oregon
- AWS - EU London
- AWS - AP Tokyo
Participating in the preview
If you are a user with administrator privileges, you are opted in to the Third-party LLM Usage (preview) feature by default when you install version 1.3.1 or upgrade to it. Administrators can opt out or back in at any time, and the change takes effect immediately.
The Third-party LLM Usage (preview) setting applies at the app level and affects all users; it cannot be turned on or off for individual users.
To opt in to or out of this preview feature at any time after installation or upgrade, navigate to the Settings tab of the assistant and select or deselect the Use third-party LLMs option, as shown in the following image:
Using the Third-party LLM Usage (preview) feature
When you opt in to the Third-party LLM Usage (preview) feature, Splunk AI Assistant for SPL can leverage an external large language model (LLM) hosted in Azure OpenAI. When the app determines it is beneficial, this LLM generates the response, which can improve response quality. The app chooses among the available LLMs based on use case, latency, accuracy, and cost.
The Third-party LLM Usage (preview) feature includes enterprise-grade compliance and regional data boundaries. Opting in causes no disruption to Splunk AI Assistant for SPL services or responsiveness.
When you opt in, each search response is tagged with its source: internal, meaning a Splunk platform-hosted LLM generated it, or external, meaning a third-party LLM did. Administrators can view these tags in the audit logs as needed.