Route Data

Lastly, now that our Pipeline is configured and we're reshaping data the way we'd like, let's set up a Route to select a subset of our data and send it down the splunk-metrics Pipeline.

important

Configure Route

  1. If necessary, select the Processing submenu and click Pipelines, then give the splunk-metrics Pipeline focus.
  2. At the top, click Attach to Route.
  3. To the right of the default Route, click ..., then click Insert Route Above.
    (This inserts your new Route at the top, so Stream will process it first.)
  4. For Route Name, enter Splunk Metrics.
  5. For Filter, enter source.endsWith('metrics.log').
  6. For Pipeline, select splunk-metrics.
  7. For Output, select s3:s3.
  8. Set the Final slider to No.
  9. Click Save.

By this point, most of the settings above should be self-explanatory. Route Name is descriptive. Filter is a filter expression – like those we've used before – that matches events whose source field endsWith() metrics.log. Pipeline is set to the Pipeline we were just working on, and Output is set to the s3 Destination we created earlier. The only setting that bears explanation is Final, an important concept in Stream.
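
Filter expressions are ordinary JavaScript evaluated against each event, so you can sanity-check one outside Stream. Here's a minimal sketch, with hypothetical source values:

    // Filter expressions use plain JavaScript; endsWith() is a standard
    // String method. These sample source paths are hypothetical.
    const samples = [
      '/opt/splunkforwarder/var/log/splunk/metrics.log',
      '/var/log/messages',
    ];

    for (const source of samples) {
      // The same test the Route applies to each event's source field
      console.log(source, '=>', source.endsWith('metrics.log'));
    }
    // => true for the first path, false for the second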

If Final is set to Yes (true), a matching event is consumed: it's sent down the Route's Pipeline, and no later Routes will see it. If Final is set to No (false), a copy of the event is sent down the Pipeline, while the original continues on, to be evaluated and matched by further Routes.

This concentrates most of the product's branching and routing decisions in one place: the Data Routes page gives you a central spot for troubleshooting. It also provides a simple facility for treating data in different ways – depending on where you might be sending it, or on whether you want it in multiple shapes.
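
To make Final concrete, here's a hypothetical sketch of how Route evaluation could work – an illustration of the semantics above, not Stream's actual implementation:

    // Hypothetical sketch of Route evaluation - not Stream's real code.
    const routes = [
      {
        name: 'Splunk Metrics',
        filter: (event) => event.source.endsWith('metrics.log'),
        pipeline: 'splunk-metrics',
        final: false, // matching events are cloned; originals keep going
      },
      {
        name: 'default',
        filter: () => true,
        pipeline: 'main',
        final: true, // matching events stop here
      },
    ];

    // Stub so the sketch runs standalone
    function sendToPipeline(pipeline, event) {
      console.log(`-> ${pipeline}:`, event.source);
    }

    function routeEvent(event) {
      for (const route of routes) {
        if (!route.filter(event)) continue; // no match: try the next Route
        sendToPipeline(route.pipeline, { ...event }); // send a copy down
        if (route.final) return; // a final Route consumes the event
        // final === false: the original continues to later Routes
      }
    }

    routeEvent({ source: '/var/log/splunk/metrics.log' });
    // -> splunk-metrics: /var/log/splunk/metrics.log
    // -> main: /var/log/splunk/metrics.log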

Now, let's see if we are indeed dropping events.

important

Monitoring

  1. In the top nav, click Monitoring. (Depending on the size of your window, the top nav might consolidate items that don't fit into a pulldown represented by an ellipsis (...). If so, click the ellipsis, then click Monitoring.)
  2. In the Monitoring submenu, click the Data pulldown, then click Pipelines.

Note that the number of events coming into the splunk-metrics Pipeline should be greater than the number of events going out.

By now, events should be showing up in our S3 bucket. Let's validate, via the terminal, what data is being deposited.

important

Validate Output

  1. Look at just one record from the splunk_metrics sourcetype, to verify that its event shape mirrors what we saw in the Preview pane. Copy/paste this command into the terminal:
    mc find minio/logarchives -path '*/splunk_metrics/*.json' -exec "mc cat {}" 2>/dev/null | head -1 | jq .
  2. Next, run this command. Depending on how long it's been since you saved the Route, its output should report anywhere from about 20 to a few hundred events:
    mc find minio/logarchives -path '*/splunk_metrics/*.json' -exec "mc cat {}" 2>/dev/null | wc -l
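
If you'd rather inspect a local copy than pipe everything through jq, a few lines of Node.js will do the same job. This is just a sketch; the local filename is hypothetical (download an object first, e.g., with mc cp):

    // count-events.js - counts newline-delimited JSON events in a local
    // copy of one S3 object. The filename below is hypothetical.
    const fs = require('fs');

    const lines = fs.readFileSync('splunk_metrics_sample.json', 'utf8')
      .split('\n')
      .filter((line) => line.trim().length > 0);

    console.log(`events: ${lines.length}`);
    // Parse the first event to eyeball its shape
    console.log(JSON.parse(lines[0]));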

That's it! We're successfully routing data from a Splunk Universal Forwarder; parsing, reshaping, and enriching it; and then delivering it to an S3 bucket!

So, on to our conclusion.