Feature preview: Third-party LLM Usage
In addition to the large language models (LLMs) hosted in Splunk Cloud Platform, version 1.3.1 or higher of the Splunk AI Assistant for SPL offers a feature preview that gives customers an option to allow Splunk to use models hosted in Azure OpenAI.
While you remain opted in to this feature preview, the assistant determines when to use a Splunk platform hosted LLM and when to use a third-party LLM. Depending on factors such as use case and cost, third-party LLMs can improve the quality of the assistant's responses.
What is supported
When the administrator opts in to this feature, Splunk AI Assistant for SPL can route eligible requests to a third-party LLM hosted in Azure OpenAI, in addition to the LLMs hosted in Splunk Cloud Platform.
Requirement
Version 1.3.2 of the Third-party LLM Usage (preview) feature is only available for Splunk Cloud Platform users in the following regions:
- AWS - Canada Central
- AWS - AP Mumbai
- AWS - AP Sydney
- AWS - AP Tokyo
- AWS - EU London
- AWS - EU Paris
- AWS - US West Oregon
- AWS - US East Virginia
- Azure - East US (Virginia)
- Azure - UK South (London)
- Azure - West US (California)
- Azure - Japan East (Tokyo)
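As a quick sanity check before enabling the preview, you can compare your deployment's region against the list above. The following sketch is illustrative only: the region strings come from this documentation, but the function name and usage are not part of any Splunk API.

```python
# Regions eligible for the Third-party LLM Usage (preview) feature,
# as listed in this documentation. Not an official API.
SUPPORTED_REGIONS = {
    "AWS - Canada Central",
    "AWS - AP Mumbai",
    "AWS - AP Sydney",
    "AWS - AP Tokyo",
    "AWS - EU London",
    "AWS - EU Paris",
    "AWS - US West Oregon",
    "AWS - US East Virginia",
    "Azure - East US (Virginia)",
    "Azure - UK South (London)",
    "Azure - West US (California)",
    "Azure - Japan East (Tokyo)",
}

def is_region_supported(region: str) -> bool:
    """Return True if the given cloud region is eligible for the preview."""
    return region in SUPPORTED_REGIONS
```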
Participating in the preview
If you are a user with administrator privileges, when you install or upgrade to version 1.3.1, you are opted in by default to the Third-party LLM Usage (preview) feature. Administrators can opt out or back in at any time, and the change takes effect immediately.
The Third-party LLM Usage (preview) setting, whether turned on or off, applies at the app level and affects all users. It cannot be set for individual users.
If you want to opt in to or out of this preview feature at a later date than the app installation or upgrade, navigate to the Settings tab of the assistant. Select or deselect the Use third-party LLMs option, as shown in the following image:
Using the Third-party LLM Usage (preview) feature
When you opt in to the Third-party LLM Usage (preview), Splunk AI Assistant for SPL can use an external large language model (LLM) hosted in Azure OpenAI. The app uses this LLM to generate a response when it deems doing so necessary, which can improve response quality. The app decides when to use the external LLM based on the intent and complexity of the request.
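Conceptually, the routing described above can be sketched as a small decision function. Note this is a hypothetical illustration: the assistant's actual selection logic is not public, and the intent labels, threshold, and function name below are all assumptions.

```python
# Hypothetical sketch of intent/complexity-based model routing.
# None of these names or values reflect Splunk's actual implementation.
COMPLEX_INTENTS = {"spl_generation", "spl_explanation"}  # assumed intent labels

def choose_llm(intent: str, complexity: float) -> str:
    """Pick a model source for a request, given its intent and a
    complexity score between 0.0 and 1.0 (both assumed inputs)."""
    if intent in COMPLEX_INTENTS and complexity > 0.7:
        return "external"  # third-party LLM hosted in Azure OpenAI
    return "internal"      # Splunk platform hosted LLM
```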
The Third-party LLM Usage (preview) feature includes enterprise-grade compliance and regional data boundaries. Opting in causes no disruption to Splunk AI Assistant for SPL services or responsiveness.
When you opt in, search responses are tagged with their source: internal, meaning a Splunk platform hosted LLM, or external, meaning a third-party LLM. Administrators can view these tags in the audit logs as needed.
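An administrator reviewing the audit log might filter entries by that source tag. The sketch below is illustrative only: the record shape and the "source" field name are assumptions, so consult the app's actual audit log schema for the real field names.

```python
# Illustrative audit-log filtering. The dict layout and "source" key
# are assumed for this example, not taken from Splunk's schema.
def external_responses(audit_entries: list[dict]) -> list[dict]:
    """Return only the entries whose response came from a third-party LLM."""
    return [e for e in audit_entries if e.get("source") == "external"]

# Hypothetical audit entries for demonstration.
entries = [
    {"query": "count errors by host", "source": "internal"},
    {"query": "explain this SPL search", "source": "external"},
]
```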