
Here's Where the Fun Begins

Before we start, if this is your first experience with Cribl Search, you should definitely check out our Cribl Search Overview Sandbox. If reading is more your thing, then check out the docs or the tip sheet.

The important information: Search requires no configuration to query Cribl Lake. Every Lake dataset is available in Search by default. Every one. Even the new one you just made!
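Don't believe me? Once you're in Search, any Lake dataset can be pulled up by name right from the query bar. Here's a minimal sketch (the sbx_incident_response dataset we created earlier will come back empty until we send data into it later in this section):

dataset="sbx_incident_response" | limit 10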

Find It
  1. From the product switcher at the upper left, click Search
  2. Click Dismiss on the goat pop-up at the top right (unless you're into that sort of thing, no judgement here)
  3. On the left-hand side, under Available Datasets, hover over default_logs and click Search Now

And boom goes the dynamite... er, data! The data we just plumbed up is already streaming into our Lake, ready for Search to go fishing for insights! Why stop there? As our Canadian friends say, let's "Send it!"

Send It!

For this scenario, we're going to pretend that we're helping our Security team sift through Apache logs for an incident response (luckily, we made that dataset earlier!). Using the fields on the left-hand side, let's narrow things down to a smaller subset: host='web02.cribl.io' and sourcetype='access_common'. Maybe in the future we can add the access_error logs (*wink*), but for now let's start with the basic access logs.

Grab It
  1. On the left-hand side, click host
    1. In the resulting pop-up, click web02.cribl.io
    2. In the resulting pop-up, click Add field in search
  2. Back over on the left, click sourcetype
    1. In the resulting pop-up, click access_common
    2. In the resulting pop-up, click Add field in search
  3. At the top right, click the blue Search button

Your query should now look like this:

dataset="default_logs" host="web02.cribl.io" sourcetype="access_common" | limit 1000 

OK, we've narrowed our scope. Let's actually send this data to our sbx_incident_response dataset.
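Before we do, it never hurts to eyeball how many events match the filter we just built. A quick sketch, assuming the Kusto-style summarize operator is available in your Search environment:

dataset="default_logs" host="web02.cribl.io" sourcetype="access_common" | summarize count()

No surprises? Good. On with the show.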

Send It
  1. In the query box, delete limit 1000 at the end of your query
  2. At the end of the query, append send after the |

    Your query should now look like this:
    dataset="default_logs" host="web02.cribl.io" sourcetype="access_common" | send
  3. Click Search at the top right

Wait, click Search? Yes, click Search. What we've just done is create a query (dataset="default_logs" host="web02.cribl.io" sourcetype="access_common") and push it through the send operator. Clicking Search causes Search to execute the query and then push the results to our default Stream Cloud Worker Group. If you recall, we previously configured Stream to receive this data and send it to our sbx_incident_response dataset in Cribl Lake. Which means...

Check It
  1. Replace the query in the query box with the following:
    dataset="sbx_incident_response" | limit 1000
  2. Click Search

Noice. Our resulting data has been sent to our new dataset and is queryable! You can double-check this by clicking host on the left-hand side and noting how many host entries there are, or by clicking sourcetype and noting how many sourcetype entries there are. Essentially, what we've done is give our security team a smaller subset of data to query. This will speed up query results!
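If you'd rather see those counts as query results instead of squinting at the sidebar, an aggregation does the trick (again a sketch, assuming summarize is available):

dataset="sbx_incident_response" | summarize count() by host, sourcetype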

Export It!

But wait, we forgot the error logs! Security says they want the access_error logs to correlate with the access_common logs. That way, they can tell if any errors got yeeted out between gaps in the logs. Instead of send, let's do something even smoother: export.

Now Export It
  1. Click History at the top left of the query box
  2. Find and click our last query against the default_logs dataset
  3. In the query box, replace access_common with access_error
  4. In the query box, replace send with export to lake sbx_incident_response

    Your query should now look like this:
    dataset="default_logs" host="web02.cribl.io" sourcetype="access_error" | export to lake sbx_incident_response
  5. Click Search at the top right of the query box

*Chef's Kiss*: export is a Search operator that sends query results to a Lake dataset. It is quite literally built for this. Don't take my word for it, though. Let's go check that our data made it into our sbx_incident_response dataset.
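For the curious: if we had wanted both sourcetypes in a single pass, something along these lines should do it. This is only a sketch (the where ... in filter is my assumption, while the export syntax is exactly what we just used), and we're not running it now because access_common is already sitting in the dataset:

dataset="default_logs" host="web02.cribl.io" | where sourcetype in ("access_common", "access_error") | export to lake sbx_incident_response

Anyway, on to the check.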

Constant Vigilance!
  1. Replace the query in the query box with the following:
    dataset="sbx_incident_response" | limit 1000
  2. Click Search

How do we check? Well, we added a second sourcetype for the same host, so let's check that there are now two sourcetypes in our dataset by clicking sourcetype on the left-hand side. There should be access_common and access_error. Yeah? Hell yeah! My mom told me if I wait for some things, then good things will happen, and I waited for some things and got banana bread at work today, dude, hell yeah! Sorry, off-topic again. Almost time to wrap up this fun little dip in the Lake. But first, one last stop: Lakehouse.
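And if clicking through the sidebar isn't your style, the same verification works from the query bar. A sketch (the distinct operator is my assumption here; it should come back with exactly two rows, access_common and access_error):

dataset="sbx_incident_response" | distinct sourcetype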