Create Splunk Source

Next, we're going to configure a Splunk Universal Forwarder (UF) to send data to Cribl Stream. The UF is already installed, but not yet configured. To keep things simple, we'll have it send only its own internal logs and the Cribl log directory, which gives us some examples of JSON logging.

important

Add a Splunk Source

  1. With Manage active in Stream's top nav, select the Data submenu and click Sources.
  2. Click the Splunk TCP tile.
  3. There is a default, disabled Source called in_splunk_tcp. Toggle the Enabled slider to Yes.
  4. Click Yes on the dialog that pops up asking if you're sure you want to enable the input.

We now have a running Splunk input in Cribl Stream! Stream will listen on port 9997 for traffic from a Splunk Universal Forwarder. For the sake of simplicity, the Universal Forwarder is already installed; we just need to configure it to talk to Stream.
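
If you want to confirm that Stream is actually listening before configuring the forwarder, you can check the port from the terminal. This is a minimal sketch, assuming the iproute2 ss utility is available in the sandbox shell:

    # Confirm that Cribl Stream is listening on the Splunk TCP port
    ss -ltn | grep ':9997'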

important

Start and Verify the Splunk UF

  1. First, start the UF. Copy and paste the line below into the terminal:

    /opt/splunk/bin/splunk start --seed-passwd cribldemo
  2. When prompted to accept the license agreement, enter y.

  3. Once the UF has started, we need to add a sandbox directory and tell it to monitor the Cribl log directory. Copy the lines below and paste them into the terminal:

    mkdir /sandbox
    /opt/splunk/bin/splunk add monitor /opt/cribl/config-volume/log -auth admin:cribldemo -sourcetype cribl -index cribl
  4. Next, configure the UF to send data to Stream. Copy and paste this command:

    /opt/splunk/bin/splunk add forward-server uf-to-s3:9997 -auth admin:cribldemo
  5. Click Monitoring in the top nav to verify the Splunk input. (An optional CLI check is sketched just after this list.)
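
If you'd like to double-check the forwarder's configuration from the CLI as well, the standard Splunk list commands will show what was just added. A minimal sketch, using the credentials seeded above:

    # The forward-server list should include uf-to-s3:9997
    /opt/splunk/bin/splunk list forward-server -auth admin:cribldemo

    # The monitor list should include /opt/cribl/config-volume/log
    /opt/splunk/bin/splunk list monitor -auth admin:cribldemo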

After 30–60 seconds, you should notice a spike of new events coming in. The UF will clear its backlog of events pretty quickly, and then you'll see a more regular pattern of incoming events. It should look something like this:

[Image: UF Events]

Now that events are flowing, let's confirm that MinIO is writing them to our filesystem. As mentioned earlier, MinIO is configured to write to the /data directory, so our files will be in a path that looks like: /data/logarchives/prefix/<timestamp>/<host>/<sourcetype>. Let's validate that we're seeing the proper data:
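
Because MinIO's backing store is the local /data directory, you can also peek at the archived files directly from the shell before using the MinIO client. A minimal sketch, assuming some files have already been written (the exact subdirectories depend on the Destination's partitioning settings):

    # List a few of the objects MinIO has stored under /data
    find /data/logarchives -type f -name 'CriblOut-*.json' | head -5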

Validate the MinIO Output
  1. Set up the MinIO connection by running the following command:
    mc alias set minio http://minio:9000/ ACCESSKEY SECRETKEY
  2. Validate that files are showing up by using the MinIO client to run the following find command in the terminal. You should see a number of files with the pattern CriblOut-<id>.json:
    mc find minio/logarchives -name '*.json'
  3. Check the files' contents to validate that metrics data is flowing from Splunk's internal logs:
    mc find minio/logarchives -name '*.json' -exec "mc cat {}" | jq -r ._raw | egrep "INFO.*Metrics"
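
To see what one complete archived event looks like, you can pretty-print a single record. Cribl's JSON output is newline-delimited, so each line is typically a self-contained event; this is a minimal sketch:

    # Pull the first archived event and pretty-print its JSON fields
    mc find minio/logarchives -name '*.json' -exec "mc cat {}" | head -1 | jq .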

Success! Now we've got data flowing from Splunk's Universal Forwarder to Cribl, and then on through MinIO to the filesystem. Next, we're going to show how to parse, reshape, and enrich this data.