Notifying Stream

Currently, Stream supports a Push model, where the source-code repository or CI/CD tool needs to make an API call to tell Stream that a change has occurred on the branch.

Once a branch is updated (via a push or a PR merge), the repo or CI/CD tool needs to call the /api/v1/version/sync API endpoint. For the purposes of this course, we've included a script that takes care of this, but it's worth exploring the steps in that script.

Here is what you'll see if you click the Terminal top tab, and then enter cat /tmp/scripts/notify-ls.sh on the command line:

#!/bin/bash

# Authenticate and Save the Token.
TOKEN=$(curl http://gitops-cribl-prd:9000/api/v1/auth/login \
-H 'Content-Type: application/json' \
-d "{\"username\":\"admin\",\"password\":\"cribldemo\"}" 2>/dev/null | \
jq -r .token)

# Set up the Authentication Header
export AUTH_HEAD="Authorization: Bearer $TOKEN"

# Make the notification call
curl -X POST "http://gitops-cribl-prd:9000/api/v1/version/sync" \
-H "accept: application/json" \
-H "${AUTH_HEAD}" \
-d "ref=prod&deploy=true"

This script makes two calls to the Cribl API. The first call authenticates and returns a token; the second posts the sync notification to /api/v1/version/sync, passing that token in the Authorization header.
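The jq -r .token step does the heavy lifting after the first call: it pulls the bearer token out of the JSON login response. Here's a quick way to see that extraction in isolation, using a fabricated sample response (not a real token), plus a guard you might add to the real script so a failed login doesn't produce an empty Authorization header:

```shell
# Sample response shaped like the login reply; the token value is made up.
RESPONSE='{"token":"eyJhbGciOiJIUzI1NiJ9.example"}'

# Same extraction the script performs; -r strips the JSON quotes.
TOKEN=$(echo "$RESPONSE" | jq -r .token)

# Optional guard: bail out if login failed and no token came back.
if [ -z "$TOKEN" ] || [ "$TOKEN" = "null" ]; then
  echo "Login failed; no token returned" >&2
  exit 1
fi
echo "Got token: $TOKEN"
```

Without a guard like this, a bad password would silently send "Authorization: Bearer null" on the sync call, which fails with a much less obvious error.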

Running the Script
  1. Click Terminal on the top nav.
  2. Enter the following into the terminal:
/tmp/scripts/notify-ls.sh

Give it a minute or two to deploy, and then we can take a look at the production instance to see if our configuration changes made it over.

Validating Changes
  1. Click the Stream Prod top tab.

  2. In Cribl's own top nav, click the Cribl logo at left to unroll the product selector.

  3. Select Stream from this menu to expand Stream's top nav.

  4. Click Manage on this top nav.

  5. Click into the default Worker Group.

  6. Click the Data submenu, then click Sources.

  7. On the resulting Manage Sources page, locate and click the Syslog tile. (You can type its name into the filter box, or scroll down to find it.) You should now see the prod_apache Source enabled in this prod environment. (Screenshot: prod_apache Source)

  8. From Stream's top nav, now select Data > Destinations.

  9. Locate and click the Elasticsearch tile. (You can type Elastic into the filter box, or scroll down to find the tile.) You should see both of the Destinations we created, with dev_logs greyed out and prod_logs now enabled. (Screenshot: Prod Destinations)

  10. Click Stream's Processing submenu, and select Pipelines. You can collapse the Pipelines page's right Preview pane (and/or widen your browser) to see the Pipelines' names. You should see the apache_logs Pipeline at the top of the list. (Screenshot: Prod Pipeline)

  11. Click Stream's Routing submenu, and select Data Routes. You should see the apache_logs Route at the top of the list. Notice the small sparkline graph confirming that data is now flowing through this Route. (You can hover over the sparkline to see details.) (Screenshot: Prod Route)

If you want to look at the datagen Sources (Data submenu > Sources > Datagen), you'll also see that the dev_gen Source is present in this prod environment, but greyed out as inactive.

Validating Data Flow in Elasticsearch
  1. Click the Elasticsearch/Kibana top tab.
  2. In the Kibana UI's left nav, click the second button, Discover. (See the screenshot below.)
  3. If dev* is not set as the index pattern, click the drop-down selector and set it.
  4. Click Refresh if necessary. You should now be able to confirm that Stream data is flowing into this downstream service, too.
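If you prefer to confirm this from the command line rather than the Kibana UI, Elasticsearch's _count API reports how many documents match an index pattern. The reply is JSON shaped like {"count":1234,"_shards":{...}}; the sketch below parses a simulated reply (the numbers are made up), but in the lab you could pipe the output of a real curl against your Elasticsearch host's _count endpoint through the same jq filter:

```shell
# Simulated _count reply; replace with real curl output in the lab, e.g.:
#   curl -s "http://<your-es-host>:9200/dev*/_count"
REPLY='{"count":1234,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0}}'

# Pull out the count; a number above zero confirms events are landing.
COUNT=$(echo "$REPLY" | jq -r .count)
echo "Documents in dev* indices: $COUNT"
```

Note that the Elasticsearch hostname and port are assumptions here; use whatever address the Elasticsearch/Kibana tab's environment exposes.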

(Screenshot: Kibana Discover)

That's it! Let's wrap up...