Reshape Data

Now, we're going to use Stream's Eval Function to restructure the event, making it easier to read and analyze. We'll get rid of the _raw field, as it's no longer needed after parsing. (All of the information from _raw is now in the event itself.) We'll also move the original Splunk metadata into fields under a new object, to make it clear to downstream consumers that those fields are Splunk-related.

Restructure Event with Eval

  1. In the left Pipelines pane, click Add Function.

  2. Search for Eval.

  3. Click Eval.

  4. Scroll down to display the new Eval Function, and click into its Filter field.

  5. Replace the Filter field's default true value with: splunk_metrics.
    (Be careful not to paste in leading or trailing characters. splunk_metrics is what we set as our Destination Field in the Parser, so this filter will return true for events that have a splunk_metrics field.)

  6. For Evaluate Fields, we're going to enter a number of options. The first row creates a new object. The following rows add fields to that object, and set these fields' values to the values of fields already in the event. The last row overrides sourcetype with the literal (single-quoted) value 'splunk_metrics'. Click + Add Field to add each row:

    | Name | Value Expression |
    | --- | --- |
    | splunk_metadata | {} |
    | splunk_metadata.host | host |
    | splunk_metadata.index | index |
    | splunk_metadata.source | source |
    | splunk_metadata.sourcetype | sourcetype |
    | sourcetype | 'splunk_metrics' |
  7. Click Save.
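The transformation these Eval rows perform can be sketched in Python. This is only an illustration of the effect on the event, not Cribl Stream's actual implementation; the event is modeled as a plain dict, and the sample field values are made up:

```python
# Sketch of the Eval Function's restructuring step (illustrative only).
def restructure(event: dict) -> dict:
    # Filter: only act on events that have a splunk_metrics field
    if not event.get("splunk_metrics"):
        return event

    # First row: create a new object. Following rows: copy the original
    # Splunk metadata fields into it.
    event["splunk_metadata"] = {
        "host": event.get("host"),
        "index": event.get("index"),
        "source": event.get("source"),
        "sourcetype": event.get("sourcetype"),
    }
    # Last row: override sourcetype with the literal value
    event["sourcetype"] = "splunk_metrics"
    return event

# Hypothetical parsed event, before restructuring
event = {
    "splunk_metrics": {"cpu": 0.42},
    "host": "web01",
    "index": "main",
    "source": "/var/log/metrics",
    "sourcetype": "metrics_csv",
}
restructure(event)
```

After this runs, the original metadata lives under `splunk_metadata`, and the top-level `sourcetype` is `splunk_metrics`.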

In the right Preview pane, you should now see parsed events that look like this:

Parsed Event 2

We're pretty close, but we'd also like to get rid of a few fields we no longer need. Because host and sourcetype are used in our partitioning expression, we should keep those. But we no longer need _raw, index, or source.

Remove Fields

  1. In the Eval Function's Remove Fields field, enter the string: _raw, index, source
  2. Click Save to update the Pipeline.
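In the same dict-based sketch used above, the Remove Fields step amounts to deleting keys from the event. Again, this is a hypothetical illustration of the effect, not Cribl's implementation:

```python
# Sketch of the Remove Fields step: drop fields we no longer need,
# keeping host and sourcetype because the partitioning expression uses them.
def remove_fields(event: dict, names: list) -> dict:
    for name in names:
        event.pop(name, None)  # silently ignore fields that aren't present
    return event

# Hypothetical event after the earlier Eval restructuring
event = {
    "_raw": "web01,main,0.42",
    "host": "web01",
    "index": "main",
    "source": "/var/log/metrics",
    "sourcetype": "splunk_metrics",
}
remove_fields(event, ["_raw", "index", "source"])
```

Only `host`, `sourcetype`, and the parsed data remain on the event.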

Now, you should see a set of fields being dropped like this:

Parsed Event 3

We've now taken a regular log event, in semi-structured format, and converted it to a JSON document which can easily be analyzed in a number of systems. Next, let's enrich it with some additional context to make querying it easier.