Centralized Log Management UI Details
General Configuration Tab
Field | Description | Required |
---|---|---|
Source Name | Name of this source rule. This name must be unique. Appears in the list of defined source rules. | Yes |
Source Type | Specifies the event type of the log source. This field is prepopulated when you start from a Splunk AppDynamics template or an existing source rule. If you are creating a new source rule from scratch, you can specify any value. The value identifies this specific type of log event and can be used for searching and filtering the collected log data. The value appears in the field list for the log data. | Yes |
Source File | The location and name of the log file to serve as a log source. The location must be on the same machine as the analytics-agent. You can use wildcards, and you can specify whether to match files one level deep or all log files in the path directory structure. Note: For optimal performance, we recommend limiting the number of log files per log source to 6000. | Required when the collection type is from the local file system. |
Exclude Files | Exclude, or blocklist, files from the defined source rule(s). Enter the relative path of the file(s) to exclude, using wildcards to exclude multiple files. | No |
TCP Port | Specifies the port on which the analytics-agent collects log data from a network connection. This field is not present if the collection type is From local file system. If no port number is provided, port 514 is used. Both the syslog utility and the analytics-agent need root access to send logs to port 514, because binding to ports below 1024 requires root access. | Required when the collection type is from a network connection. |
Enable path extraction | Select this checkbox to extract fields from the path name. Add a grok pattern in the text box. The syntax for a grok pattern is %{SYNTAX:SEMANTIC}. | No |
Thread Count | Specify the number of threads that the Analytics Agent uses for log processing. Analyze the performance of your environment before increasing the thread count, because higher thread counts can strain system resources. Minimum value: 5. Maximum value: 25. | Required if more than five log files are to be processed. |
File Wait Time | Specify the time in seconds that the agent waits for updates to a log file. After this time limit, the agent begins monitoring a new log file. The default value is 30 seconds. | No |
Start collecting from | Indicate where to begin tailing (collecting the log records). | The default is to collect log records from the beginning of the file. |
Override time zone | Use this to override the time zone for eventTimestamp in log events. If your log events don't have a time zone assigned and you are not overriding it, the host machine time zone is assigned to the log events. You cannot override the time zone in the pickupTimestamp field. If you override the time zone, you must provide a timestamp format. Time zone formats should conform to Joda-Time Available Time Zones, and the timestamp format should be created with the Joda-Time formatting system. | No |
Override timestamp format | You can override the timestamp format if needed. Time zone formats should conform to Joda-Time Available Time Zones, and the timestamp format should be created with the Joda-Time formatting system. | Yes. The override timestamp format is needed to extract the time correctly from the log file. If you do not provide it, eventTimestamp is set equal to pickupTimestamp even when the log line contains a timestamp. |
Auto-Correct duplicate Timestamps | Enable this option to preserve the order of original log messages if the messages contain duplicate timestamps. This adds a counter to preserve the order. | No |
Collect Gzip files | Enable this option to also collect gzip files found in the specified path. | No |
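The path-extraction grok syntax above maps each %{SYNTAX:SEMANTIC} token onto a named capture group. As a rough illustration (the path layout, field name, and regex here are hypothetical, and this is not the agent's actual grok engine), a token such as %{WORD:service} behaves like a named regex group:

```python
import re

# Hypothetical illustration of %{SYNTAX:SEMANTIC}: a grok token such as
# %{WORD:service} compiles down to a named capture group. For a path
# shaped like /var/log/<service>/app.log, the equivalent regex is roughly:
pattern = re.compile(r"/var/log/(?P<service>\w+)/app\.log")

match = pattern.search("/var/log/payments/app.log")
print(match.group("service"))  # payments
```

The extracted value (here, `service`) would then appear as a searchable field on the collected log events.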
Field Extraction Tab
The following table describes the fields and actions on this tab. For a detailed procedure on using Auto or Manual Field Extraction, see Field Extraction for Source Rules.
Section/Field or Action | Description |
---|---|
Add Grok Pattern | |
Message Pattern | This is the grok pattern used to extract fields from the log message. A pattern may be prepopulated when you use a Splunk AppDynamics template or an existing source rule as your starting point. You can add or remove grok patterns as needed. |
Multiline Format | For log files with log records that span multiple lines (containing line breaks), use this field to indicate how the individual records in the log file should be identified. The multiline format is not supported when you are collecting log data from a network connection. |
Extract Key-Value Pairs | |
Field | Shows the field selected for key-value pair configuration. |
Split | The delimiter that separates a key from its value. For example, in "key=value", the split delimiter is the equal sign (=). Multiple comma-separated values can be added. |
Separator | The delimiter that separates two key-value pairs from each other. For example, in "key1=value1;key2=value2", the separator is the semicolon (;). Multiple comma-separated values can be added. |
Trim | A list of characters to remove from the start and/or end of the key or value before storing them. For example, for "_ThreadID_", you can specify the underscore (_) as the trim character to store "ThreadID". Multiple comma-separated values can be added. |
Include | A list of key names to capture from the source. You must provide keys in the include field; if it is left blank, no key-value pairs are collected. |
Actions | |
Upload Sample file | Browse to a local file to upload a sample log file. |
Preview | Use to refresh the Preview grid to see the results of the specified field extraction pattern. |
Auto Field Extraction | |
Definer Sample | Select a message from the preview grid that is representative of the fields you want to extract from the log records. You can select only one definer sample per source rule. |
Refiner Samples | Refines the regular expression to include values that were not included in the original definer sample. |
Counter Sample | Specifies a value to ignore. Refines the regular expression to exclude values that were included by the original definer or refiner sample. |
Manual Field Extraction | |
Regular Expression | Add a regular expression to define the field you want to extract. |
Field Type | Specify the type for the field. |
Field Name | Automatically generated within the regular expression pattern. This name appears in the Fields list in the Analytics Search UI. |
Actions | |
Add Field | Use to add additional regular expressions for extracting more fields. You cannot add more than 20 fields. |
Preview | Use these buttons to filter the viewable results in the preview grid. |
Upload Sample File | Upload a sample file from your local file system to use in the preview grid. |
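The Split, Separator, Trim, and Include settings above compose as follows. This is a minimal sketch of the described semantics (the function name and defaults are illustrative, not the agent's implementation):

```python
def extract_key_values(source, split="=", separator=";", trim="_", include=None):
    """Sketch of the Split / Separator / Trim / Include semantics:
    split pairs apart, split key from value, trim the listed characters,
    and keep only keys named in `include`. Illustrative only."""
    fields = {}
    for pair in source.split(separator):
        if split not in pair:
            continue
        key, value = pair.split(split, 1)
        key = key.strip().strip(trim)
        value = value.strip().strip(trim)
        # A blank include list means no key-value pairs are collected.
        if include and key in include:
            fields[key] = value
    return fields

sample = "_ThreadID_=42;level=INFO;user=alice"
print(extract_key_values(sample, include=["ThreadID", "level"]))
# {'ThreadID': '42', 'level': 'INFO'}
```

Note how "user" is dropped because it is not in the include list, and the underscores around "ThreadID" are trimmed before the key is stored.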
Field Management Tab
Field/Action | Description |
---|---|
Select Any Field to Customize | Select the field to add customizations. You can add multiple customizations to a single field. |
Field Name | This column lists the fields that have customizations. |
Customize | This column shows the specific customizations. For static fields, the display name is shown. Static fields cannot be further customized. |
Mask Value | This customization option masks values in the collected data. Specify the starting and ending position in the data and the character to use as the masking value. |
Replace Value | This customization option replaces the entire value of the field with a static string. |
Rename | This customization option enables you to rename a field to a more recognizable display name. |
Field Type | Use to change the data type of the field, for example, from string to number. Available types are String, Boolean, and Number. Note that after a source rule has been saved with a field of a specified data type, you cannot later change that field's data type. Once the fields are indexed in the analytics database, trying to specify a new type causes a validation error. |
Remove | This customization turns off data collection for the field. It can be reversed at a later time. |
Add Static Field | This action allows you to add a static field to all the log events collected from this source. This field can then be used to search and filter the log data. For example, use this to add Tier, Node, and Application Name to the log data. |
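As an illustration of the Mask Value customization above, assuming 0-based positions with an exclusive end (the UI's exact position convention may differ):

```python
def mask_value(value, start, end, mask_char="*"):
    """Sketch of the Mask Value customization: replace the characters
    between `start` and `end` (0-based, end-exclusive here) with the
    masking character, leaving the rest of the value intact."""
    return value[:start] + mask_char * (end - start) + value[end:]

# Mask the middle digits of a hypothetical card number before indexing:
print(mask_value("4111111111111111", 4, 12))  # 4111********1111
```

Masking happens at collection time, so only the masked form is ever stored in the analytics data.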