Kvstore_to_json.py operations in ITE Work
Splunk IT Essentials Work provides a kvstore_to_json.py script that lets you back up or restore ITE Work configuration data, perform bulk service KPI operations, apply time zone offsets to ITE Work objects, and regenerate KPI search schedules.
Usage options
The kvstore_to_json.py script is located in $SPLUNK_HOME/etc/apps/SA-ITOA/bin/.
The kvstore_to_json.py script has these four modes:
Mode 1: Backup and restore operations.
Mode 2: Bulk service KPI operations.
Mode 3: Time zone offset operations.
Mode 4: Regenerate KPI search schedules.
To view all kvstore_to_json.py usage options, specify the -h option.
[root@myserver splunk]# ./bin/splunk cmd python etc/apps/SA-ITOA/bin/kvstore_to_json.py -h
Usage: kvstore_to_json.py [options]
Options:
-h, --help show this help message and exit
-s SPLUNKDPORT, --splunkdport=SPLUNKDPORT
splunkd port. If no option is provided, we will
default to '8089'
-u USERNAME, --username=USERNAME
Splunk username
-p PASSWORD, --password=PASSWORD
Splunk password
-n, --no-prompt Use this option when you want to disable the prompt
version of this script
-v, --verbose Use this option for verbose logging
-f FILE_PATH, --filepath=FILE_PATH
The full path of a directory. Usage depends on mode.
When importing backed up data of version 1.2.0, this
could be a file or a set of files. When working with
service KPIs, this is a directory containing
input.json on entry and output.json on exit.
-m MODE, --mode=MODE Specify the mode of operation - what kind of
operations to perform. Mode is set to: 1 - for
backup/restore operations. 2 - for service KPI
operations.
Backup and restore operations. This is mode 1.:
Use this option when you want to perform backup/restore operations.
-i, --importData Use this option when you want to upload data to the KV
Store. When importing data from version 1.2.0, you can
use filepath as wildcard to upload data from more than
one file. However, filepath must be within quotes if
it is being used as a wildcard
-d, --persist-data Use this option when you want to persist existing
configuration in KV Store during import. NOTE:
Applicable only if importData option is used
-y, --dry-run Use this option when you want only to list objects for
import or backup
-a, --conf-file Use this option when you want to back up .conf files.
-b BR_VERSION, --base-version=BR_VERSION
The original ITSI application version user intends to
backup/restore from.
-e DUPNAME_TAG, --dupname-tag=DUPNAME_TAG
Automatically rename all the duplicated service or
entity names from restoring with a tag. If this option
is not set, the restoring will halt if duplicate names
are detected. The default tag is:
_dup_from_restore_<epoch_timestamp>
Service KPI operations. This is mode 2.:
Use this option when you want to get/create/update/delete KPIs for
existing services.
-g, --get For input, specify a list of service keys with the
keys of KPIs to retrieve. Expected format: [{_key:
<service key>, kpis: [{_key: <KPI key>}]]. Specify []
to get all KPIs from all services. Specify [{_key:
<service key>, kpis: []] to get all KPIs from a
service. Assumes input is available in
file_path/input.json
-c, --create For input, specify a non-empty list of service keys
with their KPIs list. Expected format: [{_key:
<service key>, kpis: [{_key: <KPI key>, <rest of KPI
structure>}]]. Note that only existing services could
be updated with new KPIs only with this option.
Assumes input is available in file_path/input.json
-t, --update For input, specify a non-empty list of service keys
with their KPIs list. Expected format: [{_key:
<service key>, kpis: [{_key: <KPI key>, <rest of KPI
structure>}]]. Note that only existing services and
existing KPIs could be updated using this option.
Assumes input is available in file_path/input.json
-r, --delete For input, specify a list of service keys with the
keys for the KPIs to delete.Expected format: [{_key:
<service key>, kpis: [{_key: <KPI key>}]]. Assumes
input is available in file_path/input.json
Timezone offset operations. This is mode 3.:
Use this option when you want to adjust timezone settings for time
sensitive fields on object configuration.
-q IS_GET, --is_get=IS_GET
For input, specify if you are trying to read objects
or update their timezone offsets.
-o OBJECT_TYPE, --object_type=OBJECT_TYPE
For input, specify a valid object type that contains
time sensitive configuration. This option will apply
offset to all objects on this type unless scoped to a
specific object using object_key parameter. Supported
object types are: "maintenance_calendar" for
maintenance windows, "service" for Services/KPIs
(threshold policies)
-k OBJECT_TITLE, --object_title=OBJECT_TITLE
For input, specify an optional object title of object
type that contains time sensitive configuration. Using
this option will cause the offset change to only apply
to that object.
-z OFFSET_IN_SEC, --offset_seconds=OFFSET_IN_SEC
For input, specify the offset to apply in seconds as a
positive or negative number. This offset should be the
number of seconds that you want to add or subtract
from the current value.
Running the script in a search head cluster environment
When running the kvstore_to_json.py script in a replicated KV store environment, the script works on any member of the search head cluster and does not require execution on the search head cluster captain. As a best practice, however, run the script on the KV store (MongoDB) captain, which might be a different member than the search head cluster captain.
Backup and restore operations (mode 1)
You can no longer perform partial backups using the kvstore_to_json.py mode 1 option. Use the partial backup workflow in the UI instead. For information, see Create a partial backup of ITE Work in this Splunk IT Essentials Work Administration manual.
Use replacement options
The partial restore rules schema provides replacement options, which let you change the name of an object when you run a partial backup/restore operation. Replacement options are useful for renaming objects when moving from a test environment to a production environment.
For example, to back up a service called test_database_service but rename it database_service, and to back up a deep dive called test_database_deep_dive but rename it database_deep_dive, create a rules.json file that contains the following:
[
{
"object_type": "service",
"title_list": "^test_database_service$",
"replacement_rules": [ {
"replacement_key": "title",
"replacement_type": "replace",
"replacement_string": "database_service",
"replacement_pattern": "^test_database_service$"
}]
},
{
"object_type": "entity",
"title_list": ["10.12.*", "*host_database*"]
},
{
"object_type": "deep_dive",
"title_list": "^test_database_deep_dive$",
"replacement_rules": [ {
"replacement_key": "title",
"replacement_type": "replace",
"replacement_string": "database_deep_dive",
"replacement_pattern": "^test_database_deep_dive$"
}]
}
]
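Each replacement rule above amounts to a regular-expression substitution applied to a matching object's title: replacement_pattern is the regex and replacement_string is what it becomes. A minimal sketch of what the first rule does, assuming a standard regex substitution; the python3 one-liner is only for illustration, since the script performs the substitution itself during the restore:

```shell
# Preview the effect of the first replacement rule from rules.json above:
# titles matching ^test_database_service$ become database_service.
python3 -c 'import re; print(re.sub(r"^test_database_service$", "database_service", "test_database_service"))'
```

Previewing patterns this way before a restore helps catch regexes that match nothing, or more than you intended.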
Service KPI operations (mode 2)
kvstore_to_json.py mode 2 options let you run bulk operations on KPIs, including get (-g), create (-c), update (-t), and delete (-r). Use these options to replicate, edit, and copy KPIs to multiple services, for example, when moving your ITSI deployment from a test environment to a production environment.
All service KPI options require you to specify the mode 2 parameter, -m 2. You must also specify the file path parameter, -f, as the full path to the directory containing the input.json file.
Before you can run service KPI operations, you must create an input.json file in the destination directory. The script accepts data input from input.json and sends data output to an output.json file that the script creates in the same directory.
All service KPI operations, except get (-g), require you to specify service and/or KPI keys. You can retrieve these keys from output.json after running the script with the -g option.
Run the kvstore_to_json.py script with the -h option for proper input.json and command syntax.
Get service and KPI keys
Use the get (-g) option to retrieve service and KPI data in JSON format, including service and KPI keys.
- Create an input.json file in the destination directory:
  mkdir <directory_containing_input.json>
  touch input.json
- Edit input.json: Add [] to the file to retrieve JSON data for all services and KPIs, or add specific service and KPI keys to the file to retrieve JSON data for those services and KPIs only. For example:
  [ { "_key": "<service_key>", "kpis": [ { "_key": "<kpi_key>" } ] } ]
- Run the kvstore_to_json.py script using the get (-g) option. Specify the full path to the directory containing the input.json file as the file path (-f) parameter. For example:
  cd $SPLUNK_HOME
  bin/splunk cmd python etc/apps/SA-ITOA/bin/kvstore_to_json.py -u admin -p changeme -m 2 -g -f <directory_containing_input.json> -n
- Review the contents of output.json to identify service and KPI keys. For example:
  [
    {
      "_key": "669c5cec-a492-419d-8659-95a185b4dc5c",
      "kpis": [
        {
          "_key": "f017cc7b2e67f2b3b9152146",
          ...
        }
      ]
    }
  ]
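Once output.json exists, you can pull the service and KPI keys out of it for later create, update, or delete runs. A sketch, using a stub output.json in /tmp so it stays self-contained; point it at your real output.json instead:

```shell
# Stub standing in for the output.json the script writes (keys taken from the example above).
cat > /tmp/output.json <<'EOF'
[
  {
    "_key": "669c5cec-a492-419d-8659-95a185b4dc5c",
    "kpis": [ { "_key": "f017cc7b2e67f2b3b9152146" } ]
  }
]
EOF

# List every service key with its KPI keys.
python3 - <<'EOF'
import json

with open("/tmp/output.json") as f:
    services = json.load(f)

for service in services:
    print("service:", service["_key"])
    for kpi in service.get("kpis", []):
        print("  kpi:", kpi["_key"])
EOF
```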
Create KPIs
Use the -c option to create new KPIs.
- Edit input.json to specify the service key of the service for which you want to create the KPI.
- Add KPI keys for the KPIs that you want to add to the service and any key-value pairs belonging to the KPI that you want to include in the KPI definition. Leave the key field for each KPI empty for ITSI to auto-generate it. For example:
  [
    {
      "_key": "<service_key>",
      "kpis": [
        {
          "title": "<title_of_kpi_to_create>",
          ...
        }
      ]
    }
  ]
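Putting the steps together, a sketch of preparing an input file for a create (-c) run. The directory, service key, and KPI title below are placeholders for illustration, not values from a real deployment:

```shell
# Hypothetical example: write an input.json that adds one new KPI to an existing service.
mkdir -p /tmp/kpi_ops
cat > /tmp/kpi_ops/input.json <<'EOF'
[
  {
    "_key": "669c5cec-a492-419d-8659-95a185b4dc5c",
    "kpis": [
      { "title": "Example CPU Utilization KPI" }
    ]
  }
]
EOF
# The create run would then point -f at that directory, for example:
#   bin/splunk cmd python etc/apps/SA-ITOA/bin/kvstore_to_json.py -u admin -p changeme -m 2 -c -f /tmp/kpi_ops -n
```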
Update KPIs
Use the -t option to update KPIs.
In input.json, specify the service and KPI key for each KPI, and any other key-value pair data that you want to update for the KPI.
Delete KPIs
Use the -r option to delete KPIs.
In input.json, specify the service and KPI keys for all KPIs that you want to delete.
Caution: Make sure to properly validate your JSON input. While the kvstore_to_json.py script does provide some schema validation, incorrect JSON formatting can cause errors.
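A quick way to act on this caution is to run the input file through a JSON parser before invoking the script. A sketch, using a stand-in file in /tmp; substitute the path to your real input.json:

```shell
# Syntax-check input.json before running kvstore_to_json.py (json.tool exits non-zero on bad JSON).
printf '[ { "_key": "<service_key>", "kpis": [] } ]\n' > /tmp/input.json  # stand-in file
python3 -m json.tool /tmp/input.json > /dev/null && echo "input.json parses as valid JSON"
```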
Time zone offset operations (mode 3)
The kvstore_to_json.py mode 3 option lets you apply a time offset to time-sensitive fields in object configurations. You can use this option to correct time zone discrepancies for the following object types:
- maintenance_calendar: Sets an offset for maintenance window start and end times.
- service: Sets an offset for the KPI threshold time policies within a service.
- kpi_threshold_template: Sets an offset for a KPI threshold template. After running the command to set an offset for a KPI threshold template, you must run the command again for each service that uses the KPI threshold template and set the same offset so that they are in sync.
Apply time zone offset
Perform the following steps to set an offset for one of the supported object types.
- Run kvstore_to_json.py, where -o is the object type, -k is the title of the specific object, and -z is the time zone offset in seconds. For example:
  cd $SPLUNK_HOME
  bin/splunk cmd python etc/apps/SA-ITOA/bin/kvstore_to_json.py -m 3 -o service -k "Database Service" -z 1800
- Enter the requested information at the prompts (default interactive mode only). For example:
  >> Enter the splunkd port number OR press the enter key to use [8089] > 8089
  >> Enter splunk username or press enter to use "admin" > admin
  >> Enter splunk password for "admin" >
- The script applies the time zone offset to the specified object. For example:
  1 object(s) match request
  Applying timezone change on requested object(s): [u'Database Service']
  Timezone offset has been applied on the objects requested.
ITSI time-sensitive configurations are normalized to UTC.
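The -z value is a raw number of seconds, so convert hour-and-minute offsets yourself before running the script. A convenience sketch for a +5:30 shift; ITSI only ever sees the final number:

```shell
# Convert an offset of +5 hours 30 minutes into the seconds value that -z expects.
hours=5
minutes=30
offset=$(( hours * 3600 + minutes * 60 ))
echo "$offset"   # 19800; pass as -z 19800 (use a negative number to subtract time)
```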
Regenerate KPI search schedules (mode 4)
The kvstore_to_json.py mode 4 option regenerates the search schedules for your KPIs. Use this command if you have set your KPI saved search schedules to run at the same time in itsi_settings.conf. Run this command to reset the search schedules of all your KPIs to use the new search schedule. See Synchronize KPI searches in ITSI for more information.
- Run kvstore_to_json.py in mode 4. For example:
  cd $SPLUNK_HOME
  bin/splunk cmd python etc/apps/SA-ITOA/bin/kvstore_to_json.py -m 4
- Enter the requested information at the prompts (default interactive mode only).
- You'll see the following message after the KPI search schedules have been reset:
  Retrieving KPIs to reset their saved search scheduling
  Saving updated KPI scheduling
  Done.