GitHub Repository: Azure/Azure-Sentinel-Notebooks
Path: blob/master/Guided Investigation - Fusion Incident.ipynb
Kernel: Python 3.10 - SDK v2

Guided investigation - Fusion Incidents

Notebook Version: 1.1

Data Sources Used:

Microsoft Sentinel uses Fusion, a correlation engine based on scalable machine learning algorithms, to automatically detect multistage attacks (also known as advanced persistent threats, or APTs) by identifying combinations of anomalous behaviors and suspicious activities observed at various stages of the kill chain. On the basis of these discoveries, Microsoft Sentinel generates incidents that would otherwise be difficult to catch. These incidents comprise two or more alerts or activities. By design, they are low-volume, high-fidelity, and high-severity. For more information, see https://aka.ms/SentinelFusion

This notebook takes you through a guided investigation of a Microsoft Sentinel Fusion incident. The investigation focuses on the entities attached to the incident. The notebook can be extended with additional investigation steps based on your specific processes and workflows.

Notebook initialization

The next cell:

  • Checks for the correct Python version

  • Checks versions and optionally installs required packages

  • Imports the required packages into the notebook

  • Sets a number of configuration options.


This should complete without errors. If you encounter errors or warnings, look at the following two notebooks:

If you are running in the Microsoft Sentinel Notebooks environment (Azure Notebooks or Azure ML) you can run live versions of these notebooks:

You may also need to do some additional configuration to successfully use functions such as Threat Intelligence service lookup and Geo IP lookup. There are more details about this in the ConfiguringNotebookEnvironment notebook and in these documents:

from IPython.display import HTML, display
import sys

!{sys.executable} -m pip install azure-mgmt-resourcegraph
!{sys.executable} -m pip install msticpy

extra_imports = [
    "json",
    "bokeh.plotting,show",
    "msticnb,nb",
    "msticpy.nbwidgets,SelectAlert",
    "msticpy.nbwidgets,Progress",
    "msticpy.context.azure,MicrosoftSentinel",
    "msticpy.vis.entity_graph_tools,EntityGraph",
]

display(HTML("<h3>Starting Notebook setup...</h3>"))
import msticpy as mp

mp.init_notebook(
    extra_imports=extra_imports,
    additional_packages=["msticnb>=1.0"],
)
Note: The following cell creates some helper functions used later in the notebook. This cell has no output.
import re


def check_ent(items, entity):
    """Check if an entity of the given type is present."""
    for item in items:
        if item[0].casefold() == entity.casefold():
            return True
    return False


def extract_resourcegroup(azure_id: str) -> str:
    """Extract the subscription/resource-group prefix from an Azure resource ID."""
    azure_id = azure_id.lower()
    m = re.match(r'^/?subscriptions/[^/]+/resourcegroups/[^/]+', azure_id)
    if m is None:
        return ""
    if 'providers/microsoft.devices/iothubs' in azure_id:
        return ""
    start, end = m.span()
    resource_group = azure_id[start:end]
    if resource_group[0] != '/':
        resource_group = "/" + resource_group
    return resource_group


def ti_color_cells(val):
    """Color cells of output dataframe based on severity."""
    color = "none"
    if isinstance(val, str):
        if val.casefold() == "high":
            color = "Red"
        elif val.casefold() in ("warning", "medium"):
            color = "Orange"
        elif val.casefold() in ("information", "low"):
            color = "Green"
    return f"background-color: {color}"


def ent_color_cells(val):
    """Color table cells based on values in the cells."""
    if isinstance(val, int):
        color = "yellow" if val < 3 else "none"
    elif isinstance(val, float):
        color = "yellow" if val > 4.30891 or val < 2.72120 else "none"
    else:
        color = "none"
    return f"background-color: {color}"


def ent_alerts(ent_val):
    """Display a timeline of alerts that reference the entity value."""
    query = f"""
    SecurityAlert
    | where TimeGenerated between(datetime({start})..datetime({end}))
    | where Entities contains '{ent_val}'"""
    alerts_df = qry_prov.exec_query(query)
    if isinstance(alerts_df, pd.DataFrame) and not alerts_df.empty:
        display_timeline(
            data=alerts_df,
            source_columns=["DisplayName", "AlertSeverity", "ProviderName"],
            title=f"Alerts involving {ent_val}",
            group_by="AlertSeverity",
            height=300,
            time_column="TimeGenerated",
        )
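To make the behavior of extract_resourcegroup concrete, here is a quick standalone check. The helper is re-declared so the snippet runs on its own, and the resource IDs are made up for illustration:

```python
import re

def extract_resourcegroup(azure_id: str) -> str:
    # re-declaration of the helper above so this snippet runs standalone
    azure_id = azure_id.lower()
    m = re.match(r'^/?subscriptions/[^/]+/resourcegroups/[^/]+', azure_id)
    if m is None:
        return ""
    if 'providers/microsoft.devices/iothubs' in azure_id:
        return ""
    resource_group = azure_id[m.start():m.end()]
    if resource_group[0] != '/':
        resource_group = "/" + resource_group
    return resource_group

# made-up resource IDs for illustration
vm_id = (
    "/subscriptions/0000-1111/resourceGroups/MyRG"
    "/providers/Microsoft.Compute/virtualMachines/vm1"
)
hub_id = (
    "/subscriptions/0000-1111/resourceGroups/MyRG"
    "/providers/Microsoft.Devices/IotHubs/hub1"
)
print(extract_resourcegroup(vm_id))   # -> /subscriptions/0000-1111/resourcegroups/myrg
print(extract_resourcegroup(hub_id))  # -> "" (IoT Hub resources are excluded)
```

Note that the result is lower-cased and IoT Hub resource IDs are deliberately excluded, matching the Fusion join logic used later in the notebook.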
from datetime import datetime, timedelta, timezone

# papermill default parameters
ws_name = "default"
incident_id = None
end = datetime.now(timezone.utc)
start = end - timedelta(days=5)

Authenticate to Microsoft Sentinel APIs and Select Subscriptions

This cell connects to the Microsoft Sentinel APIs and gets a list of subscriptions the user has access to. To use this, the user must have at least read permissions on the Microsoft Sentinel workspace. In the drop-down, select the name of the subscription that contains the Microsoft Sentinel workspace you want to triage incidents from.

print(
    "Configured workspaces: ",
    ", ".join(msticpy.settings.get_config("AzureSentinel.Workspaces").keys()),
)
import ipywidgets as widgets

ws_param = widgets.Combobox(
    description="Workspace Name",
    value=ws_name,
    options=list(msticpy.settings.get_config("AzureSentinel.Workspaces").keys()),
)
ws_param

Now select the name of the Microsoft Sentinel workspace in the subscription you want to triage incidents from.

Authenticate to Microsoft Sentinel, TI providers and load Notebooklets

If you are using user/device authentication, run the following cell:

  • Click the 'Copy code to clipboard and authenticate' button.

  • This will pop up an Azure Active Directory authentication dialog (in a new tab or browser window). The device code will have been copied to the clipboard.

  • Select the text box and paste (Ctrl-V/Cmd-V) the copied value.

  • You should then be redirected to a user authentication page where you should authenticate with a user account that has permission to query your Log Analytics workspace.

Note: you may occasionally see a JavaScript error displayed at the end of the authentication - you can safely ignore this.
On successful authentication you should see a pop-up schema button. To find your Workspace ID, go to Log Analytics and look at the workspace properties.

Note that you may see a warning relating to the IPStack service when running this cell. This can be safely ignored as it's not used in this case.

from msticpy.context.azure import MicrosoftSentinel
from msticpy.common.exceptions import MsticpyUserConfigError
from msticpy.common.pkg_config import get_config

config_items = get_config("AzureSentinel.Workspaces")[ws_param.value]
try:
    sent_prov = MicrosoftSentinel(ws_name=ws_param.value)
    sent_prov.connect()
except KeyError as e:
    raise MsticpyUserConfigError(
        "Unable to retrieve Sentinel workspace items from config. "
        "Ensure you have SubscriptionId, ResourceGroup and WorkspaceName specified."
    ) from e
from msticpy.common.timespan import TimeSpan

# Authentication
qry_prov = QueryProvider("MSSentinel")
qry_prov.connect(WorkspaceConfig(workspace=ws_param.value))

nb_timespan = TimeSpan(start, end)
qry_prov.query_time.timespan = nb_timespan
md("<hr>")
md("Confirm time range to search", "bold")
qry_prov.query_time

Authentication and Configuration Problems


Click for details about configuring your authentication parameters

The notebook is expecting your Microsoft Sentinel Tenant ID and Workspace ID to be configured in one of the following places:

  • config.json in the current folder

  • msticpyconfig.yaml in the current folder or location specified by MSTICPYCONFIG environment variable.

For help with setting up your config.json file (if this hasn't been done automatically), see the ConfiguringNotebookEnvironment notebook in the root folder of your Azure-Sentinel-Notebooks project. This shows you how to obtain your Workspace and Subscription IDs from the Microsoft Sentinel portal. You can use the Subscription ID to find your Tenant ID. To view the current config.json, run the following in a code cell.
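As a rough sketch of what the notebook expects in msticpyconfig.yaml (all values below are placeholders; the ConfiguringNotebookEnvironment notebook and the msticpy documentation are the authoritative guides), a workspace entry looks like:

```yaml
AzureSentinel:
  Workspaces:
    Default:
      WorkspaceId: "11111111-2222-3333-4444-555555555555"    # placeholder
      TenantId: "66666666-7777-8888-9999-000000000000"       # placeholder
      SubscriptionId: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeee00" # placeholder
      ResourceGroup: "MyResourceGroup"
      WorkspaceName: "MyWorkspace"
```

SubscriptionId, ResourceGroup and WorkspaceName are the items the Sentinel API connection cell above checks for.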

%pfile config.json

For help with setting up your msticpyconfig.yaml, see the Setup section at the end of this notebook and the ConfiguringNotebookEnvironment notebook.

Import and initialize notebooklets

This imports the msticnb package and the notebooklets classes.

These are needed for the notebook operations.

import msticnb as nb

nb.init(query_provider=qry_prov)
ti = nb.DataProviders.instance.tilookup
timespan = TimeSpan(start=start, end=end)

Timeline View of Fusion Incidents

This timeline shows you all fusion incidents in the selected workspace, grouped by the severity of the incidents.

from msticpy.vis.timeline import display_timeline

params = {
    "$top": 500,
    "$filter": (
        "properties/relatedAnalyticRuleIds/any(x:x eq 'BuiltInFusion')"
        f" and properties/createdTimeUtc gt {start.isoformat()}"
        f" and properties/createdTimeUtc lt {end.isoformat()}"
    ),
}
incidents = sent_prov.list_incidents(params)
if isinstance(incidents, pd.DataFrame) and not incidents.empty:
    incidents["Title"] = incidents["properties.title"]
    incidents["Status"] = incidents["properties.status"]
    incidents["date"] = incidents["properties.createdTimeUtc"]
    display_timeline(
        data=incidents,
        source_columns=["Title", "Status"],
        title="Fusion Incidents over time - grouped by severity",
        height=300,
        group_by="properties.severity",
        time_column="date",
    )
else:
    md("No incidents found")
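The OData filter passed to list_incidents above can be built and inspected on its own. This standalone sketch (with fixed times instead of the notebook's start/end) just shows the filter string that would be sent, without any API call:

```python
from datetime import datetime, timedelta, timezone

# fixed times for illustration; the notebook uses its own start/end
end = datetime(2024, 1, 15, tzinfo=timezone.utc)
start = end - timedelta(days=5)

params = {
    "$top": 500,
    "$filter": (
        # restrict to incidents created by the built-in Fusion analytics rule
        "properties/relatedAnalyticRuleIds/any(x:x eq 'BuiltInFusion')"
        # and to the chosen time window
        f" and properties/createdTimeUtc gt {start.isoformat()}"
        f" and properties/createdTimeUtc lt {end.isoformat()}"
    ),
}
print(params["$filter"])
```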

Select Fusion Incident to Investigate

From the table below select the incident you wish to investigate.

from IPython.display import HTML

md("Select an incident to triage:", "bold")


def display_incident(incident):
    details = f"""
    Selected Incident: {incident['properties.title']},
    Incident time: {incident['properties.createdTimeUtc']}
    - Severity: {incident['properties.severity']}
    - Assigned to: {incident['properties.owner.userPrincipalName']}
    - Status: {incident['properties.status']}
    """
    new_idx = [idx.split(".")[-1] for idx in incident.index]
    incident = incident.set_axis(new_idx)
    return (HTML(details), pd.DataFrame(incident))


filtered_incidents = incidents
filtered_incidents["short_id"] = filtered_incidents["id"].apply(
    lambda x: x.split("/")[-1]
)
alert_sel = SelectAlert(
    alerts=filtered_incidents,
    default_alert=incident_id,
    columns=["properties.title", "properties.severity", "properties.status", "name"],
    time_col="properties.createdTimeUtc",
    id_col="short_id",
    action=display_incident,
)
alert_sel.display()

Fusion creates correlations on entities including hosts, accounts, IP addresses, and Azure resources. To investigate Fusion incidents, we recommend starting the investigation with the joined entities. The cell below shows you key details and context relating to this Fusion incident, including:

  • All the associated entities, with the column 'IsFusedEntity' indicating whether the incident is fused on that entity

  • A summary of associated incidents created on those entities in the last 14 days, in the columns 'Number of associated incidents', 'Status of associated incidents' and 'Classification of associated incidents'

  • Related alerts, including any expanded alerts (flagged in the column 'IsExpansionAlert') that fired on the fused entities in the last 7 days
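The core of the fused-entity logic in the next cell is a simple count: an entity key that appears in more than one of the incident's alerts is treated as a join ("fused") entity. A minimal standalone sketch, using hypothetical alert data in place of the parsed 'Entities' JSON:

```python
from collections import Counter

# Hypothetical mapping of alert ID -> entity keys, standing in for the
# parsed 'Entities' JSON of each alert attached to the Fusion incident.
alert_entities = {
    "alert-1": ["10.1.2.3", "victim-host"],
    "alert-2": ["10.1.2.3", "attacker-host"],
}

# An entity appearing in more than one alert is a "fused" (join) entity,
# mirroring the count > 1 test in the cell below.
counts = Counter(key for ents in alert_entities.values() for key in ents)
fused = [key for key, n in counts.items() if n > 1]
print(fused)  # -> ['10.1.2.3']
```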

incident_details = sent_prov.get_incident(
    alert_sel.selected_alert.id.split("/")[-1], entities=True, alerts=True
)
ent_dfs = []
alert_out = []
full_alerts = []

# fused entities
all_entities_count = {}
fusion_join_ent_map = {
    "azure-resource": "resourceid",
    "azureresource": "resourceid",
    "account": "aaduserid",
    "host": "hostname",
    "ip": "address",
    "url": "url",
    "iotdevice": "deviceid",
}
if "Alerts" in incident_details.columns:
    for alert in incident_details.iloc[0]["Alerts"]:
        qry = (
            "SecurityAlert"
            f" | where TimeGenerated between((datetime({start})-7d)..datetime({end}))"
            f" | where SystemAlertId == '{alert['ID']}'"
        )
        df = qry_prov.exec_query(qry)
        df["IsExpansionAlert"] = False
        full_alerts.append(df)
        if df.empty or not df["Entities"].iloc[0]:
            alert_full = {"ID": alert["ID"], "Name": alert["Name"], "Entities": None}
        else:
            alert_full = {
                "ID": alert["ID"],
                "Name": alert["Name"],
                "Entities": json.loads(df["Entities"].iloc[0].lower()),
            }
        for ent in alert_full["Entities"] or []:
            if "type" in ent:
                ent_type = ent["type"].lower()
                if ent_type in fusion_join_ent_map:
                    if ent_type == "azure-resource":
                        rg = extract_resourcegroup(ent[fusion_join_ent_map[ent_type]])
                        if rg:
                            all_entities_count[rg] = all_entities_count.get(rg, 0) + 1
                    elif ent_type == "account" and "objectguid" in ent:
                        ent_key = ent["objectguid"]
                        all_entities_count[ent_key] = all_entities_count.get(ent_key, 0) + 1
                    elif fusion_join_ent_map[ent_type] in ent:
                        ent_key = ent[fusion_join_ent_map[ent_type]]
                        all_entities_count[ent_key] = all_entities_count.get(ent_key, 0) + 1

incident_details["Alerts"] = [alert_out]
full_alerts = pd.concat(full_alerts, axis=0, ignore_index=True)
full_alerts = full_alerts.sort_values(by=["SystemAlertId", "TimeGenerated"])
full_alerts = full_alerts.drop_duplicates(subset=["SystemAlertId"], keep="last")

# expanded alerts on fused entities
incident_created_time = incident_details.iloc[0]["properties.createdTimeUtc"]
for ent_key, count in all_entities_count.items():
    if count > 1:
        query = (
            "SecurityAlert"
            f" | where TimeGenerated between((datetime({incident_created_time})-7d)"
            f"..(datetime({incident_created_time})+7d))"
            f" | where Entities contains '{ent_key}'"
        )
        alerts_df = qry_prov.exec_query(query)
        alerts_df["IsExpansionAlert"] = False
        if isinstance(alerts_df, pd.DataFrame) and not alerts_df.empty:
            for idx, one_alert in alerts_df.iterrows():
                if one_alert["SystemAlertId"] not in set(alert_full["ID"]):
                    one_alert["IsExpansionAlert"] = True
                    full_alerts = pd.concat(
                        [full_alerts, one_alert.to_frame().T], ignore_index=True
                    )
                    alert_full = {
                        "ID": one_alert["SystemAlertId"],
                        "Name": alert["Name"],
                        "Entities": json.loads(one_alert["Entities"].lower()),
                    }
                    alert_out.append(alert_full)

# query all the incidents containing those entities and summarize their status
incidents = sent_prov.get_incidents()
all_incident_details = []
if isinstance(incidents, pd.DataFrame) and not incidents.empty:
    incidents["date"] = pd.to_datetime(incidents["properties.createdTimeUtc"], utc=True)
    filtered_incidents = incidents[
        incidents["date"].between(
            pd.to_datetime(incident_created_time) - timedelta(days=14),
            pd.to_datetime(incident_created_time) + timedelta(days=7),
        )
    ]
    for idx, incident_tmp in filtered_incidents.iterrows():
        incident_details_tmp = sent_prov.get_incident(
            incident_tmp.id.split("/")[-1], entities=True
        )
        all_incident_details.append(incident_details_tmp)

all_incident_details_df = pd.concat(all_incident_details, axis=0, ignore_index=True)

ent_map = {
    "FileHash": "hashValue",
    "Malware": "malwareName",
    "File": "fileName",
    "CloudApplication": "appId",
    "AzureResource": "resourceId",
    "RegistryValue": "registryName",
    "SecurityGroup": "SID",
    "IoTDevice": "deviceId",
    "Mailbox": "mailboxPrimaryAddress",
    "MailMessage": "networkMessageId",
    "SubmissionMail": "submissionId",
    "Account": "accountName",
    "Host": "hostName",
    "Ip": "address",
}
for ent in incident_details["Entities"][0]:
    all_incident_details_matched = pd.DataFrame()
    entities = {k.lower(): v for k, v in ent[1].items()}
    ent_df = pd.json_normalize(entities)
    ent_type = ent[0].lower()
    ent_df["type"] = ent_type
    ent_df["IsFusedEntity"] = False
    ent_key = None
    if ent_type in fusion_join_ent_map:
        if ent_type == "azureresource":
            ent_key = ent_df[fusion_join_ent_map[ent_type]].values[0]
            rg = extract_resourcegroup(ent_df[fusion_join_ent_map[ent_type]].values[0])
            if rg and all_entities_count.get(rg, 0) > 1:
                ent_df["IsFusedEntity"] = True
        elif ent_type == "account" and "objectguid" in ent_df:
            ent_key = ent_df["objectguid"].values[0]
            if all_entities_count.get(ent_key, 0) > 1:
                ent_df["IsFusedEntity"] = True
        elif fusion_join_ent_map[ent_type] in ent_df:
            ent_key = ent_df[fusion_join_ent_map[ent_type]].values[0]
            if all_entities_count.get(ent_key, 0) > 1:
                ent_df["IsFusedEntity"] = True
    elif ent[0] in ent_map:
        try:
            ent_key = ent[1][ent_map[ent[0]]]
        except KeyError:
            ent_key = None

    # check whether other incidents contain the entity
    incident_id_set = {incident_details.iloc[0]["id"]}
    if ent_key is not None:
        for idx, incident_details_tmp in all_incident_details_df.iterrows():
            for ent_tmp in incident_details_tmp["Entities"]:
                if (
                    ent_type == ent_tmp[0].lower()
                    and incident_details_tmp["id"] not in incident_id_set
                ):
                    for k, v in ent_tmp[1].items():
                        if ent_key in v.lower():
                            all_incident_details_matched = pd.concat(
                                [
                                    all_incident_details_matched,
                                    incident_details_tmp.to_frame().T,
                                ],
                                ignore_index=True,
                            )
                            incident_id_set.add(incident_details_tmp["id"])

    # summarize the incidents created on the entity
    associated_incidents_status = "none"
    associated_incidents_classification = "none"
    count_associated_incidents = 0
    if not all_incident_details_matched.empty:
        count_associated_incidents = len(all_incident_details_matched.index)
        associated_incidents_status = json.dumps(
            all_incident_details_matched["properties.status"].value_counts().to_dict()
        )
        if "properties.classification" in all_incident_details_matched.columns:
            associated_incidents_classification = json.dumps(
                all_incident_details_matched["properties.classification"]
                .value_counts()
                .to_dict()
            )
    ent_df["Number of associated incidents"] = count_associated_incidents
    ent_df["Status of associated incidents"] = associated_incidents_status
    ent_df["Classification of associated incidents"] = associated_incidents_classification
    ent_dfs.append(ent_df)

md("Incident Entities:", "bold")
if ent_dfs:
    new_df = pd.concat(ent_dfs, axis=0, ignore_index=True)
    grp_df = new_df.groupby("type")
    for grp in grp_df:
        md(grp[0], "bold")
        display(grp[1].dropna(axis=1))

md("Graph of incident entities:", "bold")
graph = EntityGraph(incident_details.iloc[0])
graph.plot(timeline=True)

md("Fusion incidents with expanded alerts:", "bold")
full_alerts = full_alerts.sort_values(by="TimeGenerated")
display(full_alerts)

Entity Analysis

Below is an analysis of the incident's entities that appear in threat intelligence sources.
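The cell below rolls the individual TI lookup severities up to a single incident-level severity by taking the highest one seen. That precedence can be isolated in a small standalone sketch (the helper name is just for illustration):

```python
def overall_severity(severities):
    # Highest severity wins, using the same precedence as the TI cell
    # in this notebook: high > warning > information > none.
    sev = [s.casefold() for s in severities]
    if "high" in sev:
        return "High"
    if "warning" in sev:
        return "Warning"
    if "information" in sev:
        return "Information"
    return "None"

print(overall_severity(["information", "high"]))  # -> High
print(overall_severity([]))                       # -> None
```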

sev = []
resps = pd.DataFrame()

# For each entity look it up in Threat Intelligence data
md("Looking up entities in TI feeds...")
prog = Progress(completed_len=len(incident_details["Entities"].iloc[0]))
i = 0
result_dfs = []
for ent in incident_details["Entities"].iloc[0]:
    i += 1
    prog.update_progress(i)
    if ent[0] == "Ip":
        resp = ti.lookup_ioc(observable=ent[1]["address"], ioc_type="ipv4")
        result_dfs.append(ti.result_to_df(resp))
        for response in resp[1]:
            sev.append(response[1].severity)
    if ent[0] == "Url" or ent[0] == "DnsResolution":
        if "url" in ent[1]:
            lkup_dom = ent[1]["url"]
        else:
            lkup_dom = ent[1]["domainName"]
        resp = ti.lookup_ioc(lkup_dom, ioc_type="url")
        result_dfs.append(ti.result_to_df(resp))
        for response in resp[1]:
            sev.append(response[1].severity)
    if ent[0] == "FileHash":
        resp = ti.lookup_ioc(ent[1]["hashValue"])
        result_dfs.append(ti.result_to_df(resp))
        for response in resp[1]:
            sev.append(response[1].severity)

if result_dfs:
    resps = pd.concat(result_dfs)
else:
    resps = pd.DataFrame()

# Take overall severity of the entities based on the highest score
if "high" in sev:
    severity = "High"
elif "warning" in sev:
    severity = "Warning"
elif "information" in sev:
    severity = "Information"
else:
    severity = "None"

md("Checking to see if incident entities appear in TI data...")
incident_details["TI Severity"] = severity

# Output TI hits of high, warning or information severity
if incident_details["TI Severity"].iloc[0] in ("High", "Warning", "Information"):
    print("Incident:")
    display(
        incident_details[
            [
                "properties.createdTimeUtc",
                "properties.incidentNumber",
                "properties.title",
                "properties.status",
                "properties.severity",
                "TI Severity",
            ]
        ]
        .style.applymap(ti_color_cells)
        .hide(axis="index")
    )
    md("TI Results:", "bold")
    display(
        resps[["Ioc", "IocType", "Provider", "Severity", "Details"]]
        .sort_values(by="Severity")
        .style.applymap(ti_color_cells)
        .hide(axis="index")
    )
else:
    md("None of the Entities appeared in TI data", "bold")

IP Entity Analysis

Below is an analysis of all IP entities attached to the incident.

# Enrich IP entities using the IP Summary notebooklet
ip_ent_nb = nb.nblts.azsent.network.IpAddressSummary()
if not resps.empty and "ipv4" in resps["IocType"].unique():
    for ip_addr in resps[resps["IocType"] == "ipv4"]["Ioc"].unique():
        folium_map = FoliumMap(width="50%", height="50%")
        try:
            display(HTML(f"<h1>Summary of Activity Related to {ip_addr}:</h1>"))
            ip_ent_nb_out = ip_ent_nb.run(value=ip_addr, timespan=timespan, silent=True)
            md(
                f"{ip_addr} - {ip_ent_nb_out.ip_origin} - {ip_ent_nb_out.ip_type}",
                "bold",
            )
            if (
                isinstance(ip_ent_nb_out.whois, pd.DataFrame)
                and not ip_ent_nb_out.whois.empty
            ):
                md(f"Whois information for {ip_addr}", "bold")
                display(ip_ent_nb_out.whois)
            if ip_ent_nb_out.location:
                md(f"Geo IP details for {ip_addr}", "bold")
                folium_map.add_ip_cluster([ip_ent_nb_out.ip_entity])
                display(folium_map)
            if (
                isinstance(ip_ent_nb_out.related_alerts, pd.DataFrame)
                and not ip_ent_nb_out.related_alerts.empty
            ):
                md(f"Alerts for {ip_addr}", "bold")
                tl = nbdisplay.display_timeline(
                    data=ip_ent_nb_out.related_alerts,
                    source_columns=["AlertName", "Severity"],
                    title=f"Alerts associated with {ip_addr}",
                    height=300,
                )
                display(tl)
            if (
                isinstance(ip_ent_nb_out.ti_results, pd.DataFrame)
                and not ip_ent_nb_out.ti_results.empty
            ):
                md(f"TI results for {ip_addr}", "bold")
                display(ip_ent_nb_out.ti_results)
            if (
                isinstance(ip_ent_nb_out.passive_dns, pd.DataFrame)
                and not ip_ent_nb_out.passive_dns.empty
            ):
                md(f"Passive DNS results for {ip_addr}", "bold")
                display(ip_ent_nb_out.passive_dns)
            if ip_ent_nb_out.host_entities[0]["IpAddresses"]:
                md(f"{ip_addr} belongs to a known host", "bold")
                display(
                    pd.DataFrame.from_records(
                        [
                            {
                                x: ip_ent_nb_out.host_entities[x]
                                for x in ip_ent_nb_out.host_entities.__iter__()
                            }
                        ]
                    )
                )
            print("-" * 162)
            display(HTML("<br><br>"))
        except Exception:
            md(f"Error processing {ip_addr}", "bold")
else:
    md("No IP entities present", "bold")

URL Entity Analysis

Below is an analysis of all URL entities attached to the incident.

url_nb = nb.nblts.azsent.url.URLSummary()
domain_records = pd.DataFrame()
if not resps.empty and "url" in resps["IocType"].unique():
    md("Domain entity enrichment", "bold")
    for url in resps[resps["IocType"] == "url"]["Ioc"].unique():
        md(f"Summary of {url}", "bold")
        url_nb_result = url_nb.run(timespan=timespan, value=url)
        url_nb_result.display_alert_timeline()
        url_nb_result.browse()
else:
    md("No URL entities present", "bold")

User Entity Analysis

Below is an analysis of all User entities attached to the incident.

# Enrich Account entities using the AccountSummary notebooklet
account_nb = nb.nblts.azsent.account.AccountSummary()
user = None
uent = None
if check_ent(incident_details["Entities"][0], "account") or check_ent(
    incident_details["Entities"][0], "mailbox"
):
    md("Account entity enrichment", "bold")
    for ent in incident_details["Entities"][0]:
        if ent[0] == "Account" or ent[0] == "Mailbox":
            if "accountName" in ent[1].keys():
                uent = ent[1]["accountName"]
            elif "aadUserId" in ent[1].keys():
                uent = ent[1]["aadUserId"]
            elif "upn" in ent[1].keys():
                uent = ent[1]["upn"]
            if "upnSuffix" in ent[1].keys():
                user = uent + "@" + ent[1]["upnSuffix"]
            else:
                user = uent
    if user:
        try:
            ac_nb = account_nb.run(timespan=timespan, value=user, silent=True)
            ac_nb.get_additional_data()
            if (
                isinstance(ac_nb.account_activity, pd.DataFrame)
                and not ac_nb.account_activity.empty
            ):
                display(HTML(f"<h1>Summary of Activity Related to {user}:</h1>"))
                md("Recent activity")
                display(ac_nb.account_activity)
            else:
                md(f"No activity found for {user}")
            if (
                isinstance(ac_nb.related_alerts, pd.DataFrame)
                and not ac_nb.related_alerts.empty
            ):
                show(ac_nb.alert_timeline)
            if (
                isinstance(ac_nb.host_logon_summary, pd.DataFrame)
                and not ac_nb.host_logon_summary.empty
            ):
                md(f"Host logons by {user}")
                display(ac_nb.host_logon_summary)
            if (
                isinstance(ac_nb.azure_activity_summary, pd.DataFrame)
                and not ac_nb.azure_activity_summary.empty
            ):
                md(f"Azure activity by {user}")
                display(ac_nb.azure_activity_summary)
                show(ac_nb.azure_timeline_by_provider)
            if (
                isinstance(ac_nb.ip_summary, pd.DataFrame)
                and not ac_nb.ip_summary.empty
            ):
                md(f"IPs used by {user}")
                display(ac_nb.ip_summary)
        except Exception:
            print(f"Error processing {user}")
else:
    md("No Account entities present", "bold")

Host Entity Analysis

Below is an analysis of all Host entities attached to the incident.

# Enrich Host entities using the HostSummary notebooklet
host_nb = nb.nblts.azsent.host.HostSummary()
if check_ent(incident_details["Entities"][0], "host"):
    md("Host entity enrichment", "bold")
    for ent in incident_details["Entities"][0]:
        if ent[0] == "Host":
            if "dnsDomain" in ent[1]:
                host_name = ent[1]["hostName"] + "." + ent[1]["dnsDomain"]
            else:
                host_name = ent[1]["hostName"]
            md(f"Host summary for {host_name}", "bold")
            try:
                display(HTML(f"<h1>Summary of Activity Related to {host_name}:</h1>"))
                host_sum_out = host_nb.run(value=host_name, timespan=timespan)
            except Exception:
                print(f"Error processing {host_name}")
else:
    md("No Host entities present", "bold")

Timeline of other alerts with the same entities

If there are other entity types not analyzed above, a timeline of their appearance in security alerts appears below.
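The per-entity query built by the ent_alerts helper is just a KQL string with the entity value substituted in. For illustration, with made-up placeholder times and an example entity value:

```python
# Placeholders standing in for the notebook's start/end datetimes and an entity value
start, end = "2024-01-10", "2024-01-15"
ent_val = "10.1.2.3"

# Same shape of KQL string that ent_alerts sends to the query provider
query = (
    "SecurityAlert"
    f" | where TimeGenerated between(datetime({start})..datetime({end}))"
    f" | where Entities contains '{ent_val}'"
)
print(query)
```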

ent_map = {
    "FileHash": "hashValue",
    "Malware": "malwareName",
    "File": "fileName",
    "CloudApplication": "appId",
    "AzureResource": "resourceId",
    "RegistryValue": "registryName",
    "SecurityGroup": "SID",
    "IoTDevice": "deviceId",
    "Mailbox": "mailboxPrimaryAddress",
    "MailMessage": "networkMessageId",
    "SubmissionMail": "submissionId",
    "Account": "accountName",
    "Host": "hostName",
    "Ip": "address",
}
for ent in incident_details["Entities"][0]:
    if ent[0] in ent_map and ent_map[ent[0]] in ent[1]:
        ent_alerts(ent[1][ent_map[ent[0]]])