Feature preview: Cisco Deep Time Series Model
Version 5.7.0 of the AI Toolkit offers a preview of the Cisco Deep Time Series Model (CDTSM). The CDTSM is a pretrained, zero-shot, generative AI model designed for forecasting metric time series data across the Splunk platform.
The CDTSM can recognize patterns, trends, and seasonal cycles in a time series' history and make forecasts of how that time series might continue into the near future. CDTSM offers time series analysis without the need for data science expertise.
With the CDTSM you get immediate forecasting capability without model training, support for predictive operations workflows, capacity planning assistance, and proactive operational insights.
Key features of the CDTSM
The Cisco Deep Time Series Model (CDTSM) is a pretrained, zero-shot forecasting model designed for time series data common in operational environments. No per-metric training is required. Built on transformer architecture, this model can generate forecasts without requiring custom training or fine-tuning on your specific data, making it immediately usable for predictive analytics.
Requirements
You must be a Splunk Cloud customer in a supported region to try the preview of the Cisco Deep Time Series Model.
Supported regions
The Cisco Deep Time Series Model is available in all AWS regions for Splunk Cloud customers.
Participating in the preview
Splunk Cloud customers of the AI Toolkit who have administrator privileges can opt in to the CDTSM when they install or upgrade to version 5.7.0 of the AI Toolkit. Administrators must also agree to the Splunk Pre-Release Software License Agreement and Research Addendum. See https://voc.splunk.com/preview/ctsm_aitk
Use cases for the CDTSM
- Infrastructure capacity planning
- Application performance forecasting
- Network traffic prediction
- Resource utilization forecasting
- Operational metrics prediction
- Anomaly detection in time series data
- Trend analysis and extrapolation
CDTSM syntax and parameters
Follow this syntax to call the Cisco Deep Time Series Model:
apply CDTSM <fields_to_forecast> [time_field=<str>] [forecast_k=<int>] [quantiles=<str>] [conf_interval=<int>] [holdback=<int>] [show_input=<bool>]
| Name | Type | Description |
|---|---|---|
| fields_to_forecast | argument | Space-separated field names identifying the time series fields to forecast. |
| time_field | parameter | The field containing timestamp information. |
| forecast_k | parameter | The number of future timestamps to forecast. Default value of 128. |
| holdback | parameter | The number of input data points to hold back from the model. Useful for comparing ground truth against predictions. The holdback cannot be greater than forecast_k. When you specify forecast_k, you must also account for the holdback value. Default value of 0. |
| conf_interval | parameter | The confidence interval, as a percentage, around the forecasted values. By default it is set to 90. Permissible values are 20, 40, 50, 60, 80, 90, and 98. |
| quantiles | parameter | Comma-separated quantile labels signifying any explicit quantile values that you want to see in the forecasts. Permissible values are mean, p1, p5, p10, p20, p25, p30, p40, p50, p60, p70, p75, p80, p90, p95, p99. |
| show_input | parameter | To include the input data in the results, set this to true. When set to false, only the holdback and forecast_k data points are shown in the output. |
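As an illustration, a minimal ML-SPL search applying the model might look like the following sketch. The `cpu_metrics.csv` lookup and the `cpu` field are hypothetical placeholders; any search that produces a time series with a fixed resolution can feed the `apply CDTSM` command:

```spl
| inputlookup cpu_metrics.csv
| apply CDTSM cpu forecast_k=50 holdback=10 conf_interval=80
```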
CDTSM syntax constraints
- You can exclude both the holdback and forecast_k parameters, but you cannot specify both with a value of 0 in the ML-SPL search.
- The value for holdback is 0 by default and cannot be a negative number.
- The value for holdback can't be greater than the number of input data points. The input count minus the holdback is the number of data points fed into the model.
- The value for holdback cannot be greater than forecast_k.
- If the combined value of holdback + forecast_k is greater than 384, model performance might degrade.
- The model supports up to 30 thousand data points. If you pass in more than 30 thousand data points, the model uses the most recent 30 thousand.
- The data interval can be anything other than 0 seconds.
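For instance, the following hypothetical search (the `memory` field is a placeholder) satisfies all of these constraints: holdback (20) is less than forecast_k (148), and their combined value (168) stays under 384:

```spl
| apply CDTSM memory forecast_k=148 holdback=20
```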
Cisco Deep Time Series Model behavior
The model expects data with a fixed resolution. For example, 1 minute, 5 minutes, or 10 minutes.
When the model returns the mean forecast, it is labeled as predicted.
The model returns quantile information only for the first 128 forecast_k data points. After that, the quantile information might not be present. If forecast_k is greater than 384, expect some degraded predictions.
You can forecast more than 1 field by placing those fields after the apply CDTSM command. For example, to forecast 2 fields such as cpu and memory, the ML-SPL search is apply CDTSM cpu memory.
The model requires an input of at least 60 data points.
The time_field parameter is not required. If it is not passed, _time is used as the time_field by default. The algorithm expects every value of the time_field to be populated, and the data must have a fixed resolution.
time_field must be populated with date values.
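A sketch of passing an explicit timestamp field, assuming a source whose timestamps live in a field named event_time (a hypothetical name) rather than _time:

```spl
| apply CDTSM cpu time_field=event_time forecast_k=128
```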
For the holdback and forecast_k parameters, the default value of forecast_k is 128 and the default value of holdback is 0. If these parameters are included in the ML-SPL search, they cannot both be 0.
When you pass in both holdback and forecast_k, you must account for the holdback value within the forecast_k value. For example, if you want 10 new timestamps and require 5 input points to be held back, set holdback to 5 and forecast_k to 10 + 5 = 15. This means holdback is always less than or equal to forecast_k.
The quantiles parameter is not mandatory. If passed in, the value must be a single quantile label such as p10 or p20, or multiple labels in comma-separated value (CSV) format. Permissible quantile values are mean, p1, p5, p10, p20, p25, p30, p40, p50, p60, p70, p75, p80, p90, p95, and p99.
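For example, a hypothetical search requesting three explicit quantiles in CSV format (the `cpu` field is a placeholder):

```spl
| apply CDTSM cpu quantiles=p10,p50,p90
```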
The conf_interval parameter, the confidence interval percentage, is not mandatory. If it is not passed in, conf_interval is 90, meaning the lower90 and upper90 bounds are shown, which correspond to p5 and p95.
The confidence interval is mapped to quantiles. For example, conf_interval 90 means a q0.05 lower bound and a q0.95 upper bound [ lower = (1 - (90/100)) / 2 = 0.05, and upper = 1 - (1 - (90/100)) / 2 = 0.95 ].
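Applying the same mapping to another permissible value shows how the bound labels line up with the supported quantiles:

```
conf_interval = 80
lower = (1 - 80/100) / 2      = 0.10  -> p10 (shown as lower80)
upper = 1 - (1 - 80/100) / 2  = 0.90  -> p90 (shown as upper80)
```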
The CDTSM only supports specific quantiles and in turn supports only the confidence intervals of 20, 40, 50, 60, 80, 90, or 98.
The show_input parameter is not mandatory and defaults to true.
Using Forecast visualizations
To visualize time series forecasts with forecast charts, you must use the forecastviz macro.
Forecast macro example
The following is an example of the forecastviz macro:
forecastviz(384, 20, "cpu", 45)
In this example the first argument is the forecast_k parameter, the second argument is the holdback parameter, the third argument is the raw field of the time series, and the fourth argument is the conf_interval.
Use _time as the time_field for visualizing this forecast. The show_input parameter must be set to true because the forecastviz macro requires the input time series as well.
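Putting the pieces together, an end-to-end search might look like the following sketch. The index, sourcetype, and `cpu` field are hypothetical placeholders, and the macro is invoked with the standard Splunk backtick syntax; the macro arguments mirror the forecast_k, holdback, and conf_interval values passed to the model:

```spl
index=metrics sourcetype=cpu_usage
| timechart span=5m avg(cpu) as cpu
| apply CDTSM cpu forecast_k=128 holdback=20 conf_interval=80 show_input=true
| `forecastviz(128, 20, "cpu", 80)`
```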
In the following image you can see the holdback markers, forecast markers, and confidence interval bands:
Troubleshooting the CDTSM
I opted out when I saw the option to try the CDTSM preview. How can I opt back in?
There is no option in the AI Toolkit itself to opt back in to try the CDTSM preview. Contact Splunk support to get this preview made available.