Example Notebook - Using Microsoft Sentinel Search Jobs
Search in Microsoft Sentinel is built on top of Search jobs. Search jobs are asynchronous queries that fetch records.
The results are returned to a search table that's created in your Log Analytics workspace after you start the Search job.
The search job uses parallel processing to run the search across long time spans, in extremely large datasets.
Using MSTICPy you can create Search jobs from a notebook, check when the requested logs are ready and then query the returned data. In this notebook we take you through an example of doing just this.
Setup
The first thing we need to do is install and configure MSTICPy in order to ensure the features are accessible.
The next step is to connect to Microsoft Sentinel. If you have a MSTICPy configuration file, you can use the workspace details configured there. Otherwise you can pass in your details when initializing the connection.
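A minimal connection sketch is shown below. `MicrosoftSentinel` is MSTICPy's Sentinel API class; the `connect_to_sentinel` wrapper and the `workspace` parameter usage are illustrative assumptions, not part of the notebook.

```python
# Hypothetical helper: connect to Microsoft Sentinel via MSTICPy.
# Assumes workspace details live in msticpyconfig.yaml when no
# workspace name is passed.
def connect_to_sentinel(workspace=None):
    from msticpy.context.azure import MicrosoftSentinel

    sentinel = MicrosoftSentinel()
    if workspace:
        sentinel.connect(workspace=workspace)
    else:
        sentinel.connect()  # fall back to the default workspace in config
    return sentinel
```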
Once connected we can now start a search with create_search. To this function we need to pass in a KQL query to run for the search.
Log queries in a search job are intended to scan very large sets of data.
To support distribution and segmentation, the queries can only search one data source at a time and can only use a subset of KQL, including the following operators:
where
extend
project
project-away
project-keep
project-rename
project-reorder
parse
parse-where
More details on the limitations can be found in the documentation.
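For example, a query restricted to the supported operators above might look like this. `SecurityEvent` and its columns are common Sentinel example names used here as an assumption, not taken from this notebook.

```python
# Example search query using only operators from the supported subset
# (where, extend, project). Table and column names are illustrative.
search_query = """
SecurityEvent
| where EventID == 4688
| extend ProcessPath = tolower(NewProcessName)
| project TimeGenerated, Computer, Account, ProcessPath
"""
```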
In addition to the query we can also provide the following optional parameters:
start and end times for the query - if not provided, this defaults to the last 90 days
a name for the search - if not provided, a random GUID is generated
a limit on the number of results to return - by default this is 1000
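Putting those parameters together, a call to create_search might be sketched as below. The `start_search` wrapper is hypothetical, and `sentinel` is assumed to be a connected MicrosoftSentinel instance from the earlier setup; the parameter names mirror the options listed above.

```python
from datetime import datetime, timedelta

# Hypothetical wrapper around MSTICPy's create_search. The keyword
# names (query, start, end, search_name, limit) follow the optional
# parameters described above; defaults match the documented ones.
def start_search(sentinel, query, search_name, days_back=90, limit=1000):
    end = datetime.utcnow()
    start = end - timedelta(days=days_back)
    sentinel.create_search(
        query=query,
        start=start,
        end=end,
        search_name=search_name,
        limit=limit,
    )
    return search_name
```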
Note: It can take some time to create a search job.
Once a search job is created, it is not immediately ready for querying; it can take some time to run the search and return the data.
We can check the status of our search job with check_search_status and by passing it our search name.
This will print out the current search job's status. Once the status is 'Succeeded', the data is ready for querying and the function will return True.
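Since the job runs asynchronously, one simple pattern is to poll check_search_status until it returns True. The `wait_for_search` helper and its interval/retry parameters are assumptions for illustration; `sentinel` is again assumed to be a connected instance.

```python
import time

# Hypothetical polling sketch: check_search_status returns True once
# the job status reaches 'Succeeded' (per the behaviour described above).
def wait_for_search(sentinel, search_name, interval=30, max_checks=20):
    for _ in range(max_checks):
        if sentinel.check_search_status(search_name):
            return True
        time.sleep(interval)
    return False  # job still not ready after max_checks polls
```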
Once the search job is ready we can use MSTICPy's QueryProvider feature to run a query against the search's dataset and see the results of the search.
The name of the table to query is the name of the search job with _SRCH appended - this is output when you run create_search or check_search_status.
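A query against the results table can then be sketched like this. `exec_query` is QueryProvider's ad hoc query method; the `get_search_results` wrapper and the `take` sample size are illustrative, and `qry_prov` is assumed to be a QueryProvider connected to the same workspace.

```python
# Hypothetical helper: query the search results table, whose name is
# the search name with _SRCH appended, as described above.
def get_search_results(qry_prov, search_name, sample=1000):
    table_name = f"{search_name}_SRCH"
    return qry_prov.exec_query(f"{table_name} | take {sample}")
```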
Once a search job is complete and the data no longer needed we can delete the job and its associated data.
This can be done with delete_search and again passing it the search name.
As with search job creation, the deletion can take some time, but no further action is required once the deletion is started.
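The clean-up step above can be sketched as a one-line helper. `cleanup_search` is a hypothetical wrapper; delete_search and its search-name argument follow the description above.

```python
# Hypothetical clean-up helper: delete_search removes the search job
# and its associated _SRCH table. Deletion runs asynchronously, so no
# further action is needed after this call returns.
def cleanup_search(sentinel, search_name):
    sentinel.delete_search(search_name)
```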
More details about these features can be found at: