Model runtime options
In addition to the large language models (LLMs) hosted in Splunk Cloud Platform, version 1.4.0 of Splunk AI Assistant for SPL provides the option to use models hosted in Azure OpenAI.
Splunk AI Assistant for SPL determines when to use a Splunk-platform-hosted LLM and when to use a third-party LLM. Depending on factors such as use case and cost, third-party LLMs can improve the quality of the assistant's responses. When you install or upgrade to version 1.4.0, you are opted in to this functionality by default. You can disable it at any time.
Requirements
This functionality is available to Splunk Cloud Platform users in the following regions:
- AWS - Canada Central
- AWS - AP Mumbai
- AWS - AP Sydney
- AWS - AP Tokyo
- AWS - EU London
- AWS - EU Paris
- AWS - US West Oregon
- AWS - US East Virginia
- Azure - East US (Virginia)
- Azure - UK South (London)
- Azure - West US (California)
- Azure - Japan East (Tokyo)
Opt in or out of the model runtime choices feature
When you install version 1.4.0 of Splunk AI Assistant for SPL, you are opted in to this feature by default. You can opt out, or back in, at any time, and the change takes effect immediately. You must have administrator privileges to change this setting.
This setting applies at the app level and affects all users. It cannot be set for individual users.
To opt in to or out of this feature, navigate to the Settings tab of the assistant and select or deselect the Use model runtime choices option.
Use the model runtime choices feature
Splunk AI Assistant for SPL can leverage an external large language model (LLM) hosted in Azure OpenAI. When the assistant deems it necessary, this LLM generates the response provided by the app, which can improve response quality. Splunk AI Assistant for SPL leverages the additional LLM options based on the intent and complexity of the request. The external LLM endpoint is secure but is outside the Splunk platform data boundary: the search prompt is sent to the third-party LLM and is governed by the third-party LLM provider's data handling policy.
This feature includes enterprise-grade compliance and regional data boundaries. Opting in causes no disruption to Splunk AI Assistant for SPL services or responsiveness.
When you opt in, search responses are tagged with their source: internal (the Splunk platform) or external (the third-party LLM). Administrators can view these audit log tags as needed.
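For example, an administrator could aggregate tagged responses by source with a search along these lines. This is only a sketch: the sourcetype and field names (`splunk_ai_assistant_audit`, `llm_source`) are hypothetical placeholders, not documented names, so check the app's actual audit log location and field names in your environment before using it.

```
index=_internal sourcetype=splunk_ai_assistant_audit
| stats count by llm_source
```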