
Connecting Sources and Destinations

TL;DR

Data Routes connect Sources and Destinations. They work as follows:

  1. Data comes in from a source and enters the Routes at the top of the list.
  2. If the data satisfies a Route’s filter expression, the matching results are sent through the configured Pipeline or Pack and then pushed to that Route’s preconfigured Destination.
    1. Filters are JavaScript-syntax-compatible expressions (any unit of code that resolves to a value), so you can select data at a very granular level.
  3. If the Route has Final enabled, the data stops there.
  4. If the Route has Final disabled, the data continues to the next Route (i.e., return to step 2).

This flow lets a single Source feed multiple Destinations. For example, you can archive a set of raw data, then ship your enriched data off to your SIEM.
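The steps above can be sketched in plain JavaScript. This is a minimal model, not Stream's actual implementation; the route and event shapes are hypothetical, and the route table mirrors the archive-then-SIEM example:

```javascript
function routeEvent(event, routes) {
  const delivered = [];
  for (const route of routes) {
    if (!route.filter(event)) continue; // step 2: filter check
    delivered.push(route.destination);  // matched data goes through the Pipeline to the Destination
    if (route.final) break;             // step 3: Final enabled, data stops here
  }                                     // step 4: Final disabled, continue down the list
  return delivered;
}

// Hypothetical Route table: archive everything raw, then send firewall data to the SIEM.
const routes = [
  { filter: () => true,                                       // catch-all archive Route
    destination: "S3 archive", final: false },
  { filter: (e) => e.__inputId.startsWith("syslog:paloalto"), // firewall data only
    destination: "SIEM", final: true },
];

routeEvent({ __inputId: "syslog:paloalto:514" }, routes);
// → ["S3 archive", "SIEM"]
```

A Windows event would match only the catch-all archive Route and never reach the SIEM.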

Routing is the nexus of Stream. All data coming in must get routed somewhere. All Sources need to be connected (read: routed) to Destinations. Stream doesn’t restrict you to a one-to-one relationship between Sources and Destinations. If you need to send data from a source to two or more places, so be it.

important
  1. With Manage active in the top nav, select the Routing submenu and click Data Routes.
  2. Click Route #2, palo2SecOpsElastic, to expand it.

Data Routes start with a filter: a JavaScript expression that dictates which data gets sent through the Route. If we expand Route #2, palo2SecOpsElastic, we can see an example of a filter: `__inputId.startsWith('syslog:paloalto')`. This means only data coming in from our firewall passes through the rest of the Route. Any other data, say Windows event data, doesn’t fit the bill.
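Because the filter is just an expression evaluated against each event, you can try it out in plain JavaScript. A quick sketch, using a hypothetical event object:

```javascript
// Hypothetical event, with Stream's internal __inputId metadata field set:
const event = { __inputId: "syslog:paloalto:514", action: "deny" };

// The sandbox Route's filter, as a plain JS expression:
const matches = event.__inputId.startsWith("syslog:paloalto"); // → true

// Filters can be as granular as any expression that resolves to a value:
const granular = event.__inputId.startsWith("syslog:paloalto") && event.action === "deny"; // → true
```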

note

`__` (two underscores) denotes a metadata field internal to Stream that will not get passed on to the Destination. These fields serve many purposes; in our example, one is used in a filter.
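To illustrate the idea (a sketch only, not Stream's actual implementation), dropping double-underscore fields before delivery could look like this:

```javascript
// Sketch: internal fields (double-underscore prefix) never reach the Destination.
function stripInternalFields(event) {
  return Object.fromEntries(
    Object.entries(event).filter(([key]) => !key.startsWith("__"))
  );
}

stripInternalFields({ __inputId: "syslog:paloalto:514", message: "deny tcp" });
// → { message: "deny tcp" }
```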

Next, the Route sends the matching results through the selected Pipeline or Pack. We’ll discuss Pipelines in the next section; for now, know that Stream uses Pipelines and Packs to process data by transforming, enriching, masking, and reducing it.

After we shape our data, the Route sends it to the selected Destination. In this case, we have chosen the Security Operations team’s Elasticsearch instance.

Finally, pardon the pun, there is the Final toggle. This is perhaps the most important part of Routes, because it decides whether the data continues down the Route list. That’s right: your data doesn’t have to disappear after it’s been sent to a Destination, as it does with some competitors’ products.

If the Final toggle is set to No, the data that was filtered into the selected Route flows to the next Route and the process begins again, starting with the filter check. If the Final toggle is set to Yes, then the data stops at the end of the Route.

Back to the main point: your data, your way. If IT and SecOps both need data from a firewall in their own SIEM instances, we can create two Routes that filter on the firewall data, enrich it in team-specific Pipelines, and route it to the respective SIEMs. Actually, that’s exactly what we have done.
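The two sandbox Routes could be written out as data like this (the object shapes and Pipeline names are hypothetical): identical filters, but distinct Pipelines and Destinations, with Final set to No on the first Route so the data also reaches the second.

```javascript
// Hypothetical representation of the two firewall Routes from the sandbox:
const firewallRoutes = [
  { name: "palo2SecOpsElastic",
    filter: "__inputId.startsWith('syslog:paloalto')",
    pipeline: "secops_enrichment",            // hypothetical Pipeline names
    destination: "SecOps Elasticsearch",
    final: false },                           // Final = No: data continues to the next Route
  { name: "palo2ITElastic",
    filter: "__inputId.startsWith('syslog:paloalto')",
    pipeline: "it_enrichment",
    destination: "IT Elasticsearch",
    final: true },                            // Final = Yes: data stops here
];
```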

important

Click palo2ITElastic to expand the Route.

Notice this Route has the same filter, but a different Pipeline and Destination.

Speaking of different Pipelines and enriching data for specific needs, let’s talk about Pipelines.