Azure
GitHub Repository: Azure/Azure-Sentinel-Notebooks
Path: blob/master/tutorials-and-examples/example-notebooks/Example - Guided Hunting - Office365-Exploring.ipynb
Kernel: Python 3

Title: Office 365 Explorer

Notebook Version: 1.0
Python Version: Python 3.10 (including Python 3.10 - SDK v2 - AzureML)
Required Packages: kqlmagic, msticpy, pandas, numpy, matplotlib, seaborn, networkx, ipywidgets, ipython, scikit_learn, folium, maxminddb_geolite2, holoviews
Platforms Supported:

  • Azure Notebooks Free Compute

  • Azure Notebooks DSVM

  • OS Independent

Data Sources Required:

  • Log Analytics - OfficeActivity, IPLocation, Azure Network Analytics

Description:

Brings together a series of queries and visualizations to help you investigate the security status of an Office 365 subscription and the activities of individual users.

  • The first section focuses on Tenant-Wide data queries and analysis

  • The second section allows you to focus on individual accounts and examine them for any suspicious activity.

This notebook is intended to be illustrative of the types of data available in Office 365 Activity data and how to query and use them. It is not meant to be used as a prescriptive guide to how to navigate through the data. Feel free to experiment and submit anything interesting you find to the community.

Warning: Example Notebook - No longer supported!

This notebook is meant to be illustrative of specific scenarios and is not actively maintained.
It is unlikely to be runnable directly in your environment. Instead, please use the notebooks in the root of this repo.

Contents

Setup

Make sure that you have installed the packages specified in the setup (uncomment the lines to execute).

Install Packages

The first time this cell runs for a new Azure Notebooks project or local Python environment, it will take several minutes to download and install the packages. In subsequent runs it should complete quickly and confirm that the package dependencies are already installed. Unless you want to upgrade the packages, you can skip execution of the next cell.

If you see any import failures (ImportError) in the notebook, please re-run this cell and answer 'y', then re-run the cell where the failure occurred.

Note: you may see some warnings about incompatibilities between certain packages. These do not affect the functionality of this notebook, but you may need to upgrade the packages producing the warnings to more recent versions.

import sys
import warnings
warnings.filterwarnings("ignore", category=DeprecationWarning)

MIN_REQ_PYTHON = (3, 10)
if sys.version_info < MIN_REQ_PYTHON:
    print('Check the Kernel->Change Kernel menu and ensure that Python 3.10')
    print('or later is selected as the active kernel.')
    sys.exit("Python %s.%s or later is required.\n" % MIN_REQ_PYTHON)

# Package Installs - try to avoid if they are already installed
try:
    import msticpy.sectools as sectools
    import Kqlmagic
    from dns import reversename, resolver
    from ipwhois import IPWhois
    import folium
    print('If you answer "n" this cell will exit with an error in order to avoid the pip install calls,')
    print('This error can safely be ignored.')
    resp = input('msticpy and Kqlmagic packages are already loaded. Do you want to re-install? (y/n)')
    if resp.strip().lower() != 'y':
        sys.exit('pip install aborted - you may skip this error and continue.')
    else:
        print('After installation has completed, restart the current kernel and run '
              'the notebook again skipping this cell.')
except ImportError:
    pass

print('\nPlease wait. Installing required packages. This may take a few minutes...')
!pip install git+https://github.com/microsoft/msticpy --upgrade --user
!pip install Kqlmagic --no-cache-dir --upgrade --user
!pip install holoviews
!pip install dnspython --upgrade
!pip install ipwhois --upgrade
!pip install folium --upgrade
# Uncomment to refresh the maxminddb database
# !pip install maxminddb-geolite2 --upgrade
print('To ensure that the latest versions of the installed libraries '
      'are used, please restart the current kernel and run '
      'the notebook again skipping this cell.')
# Imports
import sys
import warnings

MIN_REQ_PYTHON = (3, 10)
if sys.version_info < MIN_REQ_PYTHON:
    print('Check the Kernel->Change Kernel menu and ensure that Python 3.10')
    print('or later is selected as the active kernel.')
    sys.exit("Python %s.%s or later is required.\n" % MIN_REQ_PYTHON)

import numpy as np
from IPython import get_ipython
from IPython.display import display, HTML, Markdown
import ipywidgets as widgets
import matplotlib.pyplot as plt
import seaborn as sns
sns.set()
import networkx as nx
import pandas as pd
pd.set_option('display.max_rows', 100)
pd.set_option('display.max_columns', 50)
pd.set_option('display.max_colwidth', 100)

import msticpy.sectools as sectools
import msticpy.nbtools as mas
import msticpy.nbtools.kql as qry
import msticpy.nbtools.nbdisplay as nbdisp

# Some of our dependencies (networkx) still use deprecated Matplotlib
# APIs - we can't do anything about it so suppress them from view
from matplotlib import MatplotlibDeprecationWarning
warnings.simplefilter("ignore", category=MatplotlibDeprecationWarning)

WIDGET_DEFAULTS = {'layout': widgets.Layout(width='95%'),
                   'style': {'description_width': 'initial'}}

display(HTML(mas.util._TOGGLE_CODE_PREPARE_STR))

HTML('''
<script type="text/javascript">
IPython.notebook.kernel.execute("nb_query_string='".concat(window.location.search).concat("'"));
</script>
''');

Get WorkspaceId

To find your Workspace Id go to Log Analytics. Look at the workspace properties to find the ID.
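If you want to supply the configuration via a local file instead, a minimal sketch of creating a config.json is shown below. The field names match those read by the following cell; the GUID and name values here are placeholders that you should replace with your own workspace settings.

```python
import json

# Placeholder values - replace with your own workspace settings.
ws_settings = {
    "tenant_id": "00000000-0000-0000-0000-000000000000",
    "subscription_id": "00000000-0000-0000-0000-000000000000",
    "resource_group": "MyWorkspaceRG",
    "workspace_id": "00000000-0000-0000-0000-000000000000",
    "workspace_name": "MyWorkspace",
}

# Write the file next to the notebook so the next cell can pick it up.
with open("config.json", "w") as cfg_file:
    json.dump(ws_settings, cfg_file, indent=4)
```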

import os
from msticpy.nbtools.wsconfig import WorkspaceConfig

ws_config_file = 'config.json'
WORKSPACE_ID = None
TENANT_ID = None
try:
    ws_config = WorkspaceConfig(ws_config_file)
    display(Markdown(f'Read Workspace configuration from local config.json for workspace **{ws_config["workspace_name"]}**'))
    for cf_item in ['tenant_id', 'subscription_id', 'resource_group', 'workspace_id', 'workspace_name']:
        display(Markdown(f'**{cf_item.upper()}**: {ws_config[cf_item]}'))
    if ('cookiecutter' not in ws_config['workspace_id'] or
            'cookiecutter' not in ws_config['tenant_id']):
        WORKSPACE_ID = ws_config['workspace_id']
        TENANT_ID = ws_config['tenant_id']
except:
    pass

if not WORKSPACE_ID or not TENANT_ID:
    display(Markdown('**Workspace configuration not found.**\n\n'
                     'Please go to your Log Analytics workspace, copy the workspace ID'
                     ' and/or tenant Id and paste here.<br> '
                     'Or read the workspace_id from the config.json in your Azure Notebooks project.'))
    ws_config = None
    ws_id = mas.GetEnvironmentKey(env_var='WORKSPACE_ID',
                                  prompt='Please enter your Log Analytics Workspace Id:',
                                  auto_display=True)
    ten_id = mas.GetEnvironmentKey(env_var='TENANT_ID',
                                   prompt='Please enter your Log Analytics Tenant Id:',
                                   auto_display=True)

Read Workspace configuration from local config.json for workspace ASIHuntOMSWorkspaceV4

TENANT_ID: 72f988bf-86f1-41af-91ab-2d7cd011db47

SUBSCRIPTION_ID: 40dcc8bf-0478-4f3b-b275-ed0a94f2c013

RESOURCE_GROUP: ASIHuntOMSWorkspaceRG

WORKSPACE_ID: 52b1ab41-869e-4138-9e40-2a4457f09bf0

WORKSPACE_NAME: ASIHuntOMSWorkspaceV4

Authenticate to Log Analytics

If you are using user/device authentication, run the following cell.

  • Click the 'Copy code to clipboard and authenticate' button.

  • This will pop up an Azure Active Directory authentication dialog (in a new tab or browser window). The device code will have been copied to the clipboard.

  • Select the text box and paste (Ctrl-V/Cmd-V) the copied value.

  • You should then be redirected to a user authentication page where you should authenticate with a user account that has permission to query your Log Analytics workspace.

Use the following syntax if you are authenticating using an Azure Active Directory AppId and Secret:

%kql loganalytics://tenant(aad_tenant).workspace(WORKSPACE_ID).clientid(client_id).clientsecret(client_secret)

instead of

%kql loganalytics://code().workspace(WORKSPACE_ID)

Note: you may occasionally see a JavaScript error displayed at the end of the authentication - you can safely ignore this.
On successful authentication you should see a popup schema button.

if not WORKSPACE_ID or not TENANT_ID:
    try:
        WORKSPACE_ID = ws_id.value
        TENANT_ID = ten_id.value
    except NameError:
        raise ValueError('No workspace or Tenant Id.')

mas.kql.load_kql_magic()
%kql loganalytics://code().tenant(TENANT_ID).workspace(WORKSPACE_ID)
%kql search * | summarize RowCount=count() by Type | project-rename Table=Type
la_table_set = _kql_raw_result_.to_dataframe()
table_index = la_table_set.set_index('Table')['RowCount'].to_dict()
display(Markdown('Current data in workspace'))
display(la_table_set.T)

Current data in workspace

Contents

Office 365 Activity

Log Analytics Queries

if ('OfficeActivity' not in table_index
        or table_index['OfficeActivity'] == 0):
    display(Markdown('<font color="red"><h2>Warning. Office Data not available.</h2></font><br>'
                     'Either Office 365 data has not been imported into the workspace or'
                     ' the OfficeActivity table is empty.<br>'
                     'This workbook is not useable with the current workspace.'))
from msticpy.sectools.geoip import GeoLiteLookup
iplocation = GeoLiteLookup()

# Queries
ad_changes_query = '''
OfficeActivity
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| where RecordType == 'AzureActiveDirectory'
| where Operation in ('Add service principal.', 'Change user password.', 'Add user.', 'Add member to role.')
| where UserType == 'Regular'
| project OfficeId, TimeGenerated, Operation, OrganizationId, OfficeWorkload, ResultStatus,
          OfficeObjectId, UserId = tolower(UserId), ClientIP, ExtendedProperties
'''

office_ops_query = '''
OfficeActivity
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
| extend UserAgent = extractjson("$[0].Value", ExtendedProperties, typeof(string))
| union (
    OfficeActivity
    | where TimeGenerated >= datetime({start})
    | where TimeGenerated <= datetime({end})
    | where RecordType !in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
)
| where UserType == 'Regular'
'''

office_ops_summary_query = '''
let timeRange=ago(30d);
let officeAuthentications = OfficeActivity
| where TimeGenerated >= timeRange
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
| extend UserAgent = extractjson("$[0].Value", ExtendedProperties, typeof(string))
| where Operation == "UserLoggedIn";
officeAuthentications
| union (
    OfficeActivity
    | where TimeGenerated >= timeRange
    | where RecordType !in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
)
| where UserType == 'Regular'
| extend RecordOp = strcat(RecordType, '-', Operation)
| summarize OperationCount=count() by RecordType, Operation, UserId, UserAgent, ClientIP, bin(TimeGenerated, 1h)
// render timeline
'''

office_logons_byua_query = '''
let end = datetime({end});
let threshold={threshold};
let start = end - 1d;
let hist_start = start - 30d;
let hist_end = end;
let officeAuthentications = OfficeActivity
| where TimeGenerated >= hist_start
| where TimeGenerated <= hist_end
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
| extend UserAgent = extractjson("$[0].Value", ExtendedProperties, typeof(string))
| where Operation == "UserLoggedIn";
let lookupWindow = end - start;
let lookupBin = lookupWindow / 2.0;
officeAuthentications
| project-rename Start = TimeGenerated
| extend TimeKey = bin(Start, lookupBin)
| join kind = inner (
    officeAuthentications
    | project-rename End = TimeGenerated
    | extend TimeKey = range(bin(End - lookupWindow, lookupBin), bin(End, lookupBin), lookupBin)
    | mvexpand TimeKey to typeof(datetime)
) on UserAgent, TimeKey
| project timeSpan = End - Start, UserId, ClientIP, UserAgent, Start, End
| summarize Count_ClientIP = dcount(ClientIP) by UserId
| where Count_ClientIP > threshold
| join kind=inner (
    officeAuthentications
    | summarize minTime=min(TimeGenerated), maxTime=max(TimeGenerated) by UserId, UserAgent, ClientIP
) on UserAgent
'''

office_logons_byuser_query = '''
let end = datetime({end});
let start = datetime({start});
let threshold={threshold};
let officeAuthentications = OfficeActivity
| where TimeGenerated >= start
| where TimeGenerated <= end
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
| extend UserAgent = extractjson("$[0].Value", ExtendedProperties, typeof(string))
| where Operation == "UserLoggedIn";
let lookupWindow = 1d;
let lookupBin = lookupWindow / 2.0;
officeAuthentications
| project-rename Start = TimeGenerated
| extend TimeKey = bin(Start, lookupBin)
| join kind = inner (
    officeAuthentications
    | project-rename End = TimeGenerated
    | extend TimeKey = range(bin(End - lookupWindow, lookupBin), bin(End, lookupBin), lookupBin)
    | mvexpand TimeKey to typeof(datetime)
) on UserId, TimeKey
| project timeSpan = End - Start, UserId, ClientIP, UserAgent, Start, End
| summarize Count_ClientIP = dcount(ClientIP) by UserId
| where Count_ClientIP > threshold
| join kind=inner (
    officeAuthentications
    | summarize minTime=min(TimeGenerated), maxTime=max(TimeGenerated) by UserId, UserAgent, ClientIP
) on UserId
'''

# %kql -query office_logons_query
# office_logons_df = _kql_raw_result_.to_dataframe()

# Description: New user agents associated with a clientIP for sharepoint file uploads/downloads.
# DataSource: #OfficeActivity
# Techniques: #Exfiltration
new_user_agents = '''
let end = datetime({end});
let start = datetime({end});
let hist_start = start - 30d;
let hist_end = start;
let historicalUA = OfficeActivity
| where TimeGenerated >= hist_start
| where TimeGenerated <= hist_end
| where UserType == 'Regular'
| summarize op_count = count() by UserId, UserAgent, RecordType, Operation;
let recentUA = OfficeActivity
| where TimeGenerated >= start
| where TimeGenerated <= end
| where UserType == 'Regular'
| summarize op_count = count() by UserId, UserAgent, RecordType, Operation;
recentUA
| join kind=leftanti (
    historicalUA
) on UserId, UserAgent
| where not(isempty(UserId))
'''

user_logon_anom_query = '''
let LogonEvents=() {{
let logonFail=OfficeActivity
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
    and ResultStatus =~ "Failed"
| project TimeGenerated, AccountName=split(UserId, "@").[0],
          AccountDomain = iff(RecordType == "AzureActiveDirectoryAccountLogon", UserDomain, split(UserId, "@").[1]),
          UserId, IpAddress=ClientIP, OrganizationId, ActionType="LogonFailure";
let logonSuccess=OfficeActivity
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| where RecordType in ("AzureActiveDirectoryAccountLogon", "AzureActiveDirectoryStsLogon")
    and ResultStatus =~ "Succeeded"
| project TimeGenerated, AccountName=split(UserId, "@").[0],
          AccountDomain = iff(RecordType == "AzureActiveDirectoryAccountLogon", UserDomain, split(UserId, "@").[1]),
          UserId, IpAddress=ClientIP, OrganizationId, ActionType="Logon";
logonFail
| union logonSuccess}};
let logonSummary = LogonEvents
| summarize count() by ActionType, IpAddress, tostring(AccountName), tostring(AccountDomain), UserId, OrganizationId, bin(TimeGenerated, 1m);
let logon_success = logonSummary | where ActionType == "Logon";
let logon_fail = logonSummary | where ActionType == "LogonFailure";
logon_fail
| join kind = leftouter (logon_success) on IpAddress
| project TimeGenerated, IpAddress, failCount=count_, AccountName, OrganizationId, UserId, successCount=count_1
| extend successRate = 1.0*successCount/(successCount+failCount)
| project TimeGenerated, IpAddress, AccountName, successRate, failCount, successCount, UserId, OrganizationId
'''
# set the origin time to the time of our alert
o365_query_times = mas.QueryTime(units='days',
                                 before=3, after=1,
                                 max_before=60, max_after=20)
o365_query_times.display()

Contents

Tenant-wide Information

AAD Operations Changes to users and groups

print('Getting data...', end=' ')
o365_query = ad_changes_query.format(start=o365_query_times.start,
                                     end=o365_query_times.end)
%kql -query o365_query
ad_changes_df = _kql_raw_result_.to_dataframe()
print('done.')
ad_changes_df[['TimeGenerated', 'Operation', 'OfficeWorkload', 'ResultStatus',
               'OfficeObjectId', 'UserId', 'ClientIP']]
Getting data... done.

Contents

Logon Anomalies

Logon failures from an IP address that subsequently succeed.

print('Getting data...', end=' ')
o365_query = user_logon_anom_query.format(start=o365_query_times.start,
                                          end=o365_query_times.end)
%kql -query o365_query
user_logon_anom_df = _kql_raw_result_.to_dataframe()
print('done.')
user_logon_anom_df.sort_values('failCount')
Getting data... done.
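The success-rate calculation at the heart of the query above can be illustrated in pandas on a toy set of logon events. The column names mirror the query's ActionType and IpAddress columns; the data is made up for illustration.

```python
import pandas as pd

# Toy logon events mirroring the query's ActionType/IpAddress columns (made-up data).
events = pd.DataFrame({
    'IpAddress': ['10.1.1.1'] * 5 + ['10.2.2.2'] * 4,
    'ActionType': ['LogonFailure'] * 4 + ['Logon'] + ['Logon'] * 4,
})

# Count successes and failures per IP, then compute the success rate.
counts = (events.groupby(['IpAddress', 'ActionType'])
          .size().unstack(fill_value=0)
          .rename(columns={'Logon': 'successCount', 'LogonFailure': 'failCount'}))
counts['successRate'] = counts['successCount'] / (counts['successCount'] + counts['failCount'])

# IPs with failures followed by a success and a low success rate stand out.
suspicious = counts.query('failCount > 0 and successCount > 0')
print(suspicious)
```

Here 10.1.1.1 (four failures, then one success, successRate 0.2) is flagged, while 10.2.2.2 (all successes) is not.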

Contents

Summary of O365 Activity Types

Warning: this query can be time-consuming for large O365 subscriptions.

print('Getting data...', end=' ')
o365_query = office_ops_summary_query.format(start=o365_query_times.start,
                                             end=o365_query_times.end)
%kql -query o365_query
office_ops_summary_df = _kql_raw_result_.to_dataframe()
print('done.')
(office_ops_summary_df
 .assign(UserId = lambda x: x.UserId.str.lower())
 .groupby(['RecordType', 'Operation'])
 .aggregate({'ClientIP': 'nunique',
             'UserId': 'nunique',
             'OperationCount': 'sum'}))
Getting data... done.

Contents

Variability of IP Address for users

unique_ip_op_ua = (office_ops_summary_df
                   .assign(UserId = lambda x: x.UserId.str.lower())
                   .groupby(['UserId', 'Operation'])
                   .aggregate({'ClientIP': 'nunique',
                               'OperationCount': 'sum'})).reset_index()
user_ip_op = sns.catplot(x="ClientIP", y="UserId", hue='Operation',
                         data=unique_ip_op_ua, height=5, aspect=2)
user_ip_op.fig.suptitle('Variability of IP Address Usage by user');
Image in a Jupyter notebook
office_ops_summary_df

Contents

Accounts with multiple IPs and Geolocations

restrict_cols = ['RecordType', 'TimeGenerated', 'Operation',
                 'UserId', 'ClientIP', 'UserAgent']
office_ops_summary = office_ops_summary_df[restrict_cols].assign(UserId = lambda x: x.UserId.str.lower())
unique_ip_op_ua['ClientIPCount'] = unique_ip_op_ua['ClientIP']
office_ops_merged = pd.merge(unique_ip_op_ua.query('ClientIP > 1').drop(columns='ClientIP'),
                             office_ops_summary,
                             on=['UserId', 'Operation'])

client_ips = office_ops_merged.query('ClientIP != "<null>" & ClientIP != ""')['ClientIP'].drop_duplicates().tolist()
ip_entities = []
for ip in client_ips:
    ip_entity = mas.IpAddress(Address=ip)
    iplocation.lookup_ip(ip_entity=ip_entity)
    ip_dict = {'Address': ip_entity.Address}
    ip_dict.update(ip_entity.Location.properties)
    ip_entities.append(pd.Series(ip_dict))
ip_locs_df = pd.DataFrame(data=ip_entities)
ip_locs_df

office_ops_summary_ip_loc = pd.merge(office_ops_merged, ip_locs_df,
                                     left_on='ClientIP', right_on='Address', how='left')
(office_ops_summary_ip_loc.groupby(['UserId', 'CountryCode', 'City'])
 .aggregate({'ClientIP': 'nunique',
             'OperationCount': 'sum'})).reset_index()

Contents

User Logons where a User has Logged on from > N IP Addresses in the Period

th_wgt = widgets.IntSlider(value=1, min=1, max=50, step=1,
                           description='Set IP Count Threshold',
                           **WIDGET_DEFAULTS)
th_wgt
print('Getting data...', end=' ')
o365_query = office_logons_byuser_query.format(start=o365_query_times.start,
                                               end=o365_query_times.end,
                                               threshold=th_wgt.value)
%kql -query o365_query
office_logons_byuser_df = _kql_raw_result_.to_dataframe()
print('done.')
office_logons_byuser_df
Getting data... done.
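The threshold filter that query applies - flag any user seen logging on from more than `threshold` distinct IP addresses - is easy to sketch in pandas. The column names mirror the query; the data is made up for illustration.

```python
import pandas as pd

# Made-up logon records: each row is one logon with the client IP it came from.
logons = pd.DataFrame({
    'UserId': ['alice', 'alice', 'alice', 'bob', 'bob', 'carol'],
    'ClientIP': ['1.1.1.1', '2.2.2.2', '3.3.3.3', '4.4.4.4', '4.4.4.4', '5.5.5.5'],
})

threshold = 1  # flag users logging on from more than this many distinct IPs

# Count distinct IPs per user and keep those over the threshold.
ip_counts = logons.groupby('UserId')['ClientIP'].nunique().rename('Count_ClientIP')
flagged = ip_counts[ip_counts > threshold]
print(flagged)
```

With these toy records only alice (three distinct IPs) exceeds the threshold of 1.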

Contents

Matrix of Selected Operation Types by Location and IP

print('Getting data...', end=' ')
o365_query = office_ops_query.format(start=o365_query_times.start,
                                     end=o365_query_times.end)
%kql -query o365_query
office_ops_df = _kql_raw_result_.to_dataframe()
print('done.')

# Get Locations for distinct IPs
client_ips = office_ops_df.query('ClientIP != "<null>" & ClientIP != ""')['ClientIP'].drop_duplicates().tolist()
ip_entities = []
for ip in client_ips:
    ip_entity = mas.IpAddress(Address=ip)
    iplocation.lookup_ip(ip_entity=ip_entity)
    ip_dict = {'Address': ip_entity.Address}
    ip_dict.update(ip_entity.Location.properties)
    ip_entities.append(pd.Series(ip_dict))
ip_locs_df = pd.DataFrame(data=ip_entities)

# Get rid of unneeded columns
restrict_cols = ['OfficeId', 'RecordType', 'TimeGenerated', 'Operation',
                 'OrganizationId', 'UserType', 'UserKey', 'OfficeWorkload',
                 'ResultStatus', 'OfficeObjectId', 'UserId', 'ClientIP', 'UserAgent']
office_ops_restr = office_ops_df[restrict_cols]

# Merge main DF with IP location data
office_ops_locs = pd.merge(office_ops_restr, ip_locs_df, how='right',
                           left_on='ClientIP', right_on='Address', indicator=True)
limit_op_types = ['FileDownloaded', 'FileModified', 'FileUploaded',
                  'UserLoggedIn', 'UserLoginFailed', 'Add member to role.',
                  'Add user.', 'Change user password.', 'Update user.']
office_ops_locs = office_ops_locs[office_ops_locs.Operation.isin(limit_op_types)]

# Calculate operations grouped by location and operation type
cm = sns.light_palette("yellow", as_cmap=True)
country_by_op_count = (office_ops_locs[['Operation', 'RecordType', 'CountryCode', 'City']]
                       .groupby(['CountryCode', 'City', 'Operation'])
                       .count())
display(country_by_op_count.unstack().fillna(0)
        .rename(columns={'RecordType': 'OperationCount'})
        .style.background_gradient(cmap=cm))

# Group by Client IP, Country, Operation
clientip_by_op_count = (office_ops_locs[['ClientIP', 'Operation', 'RecordType', 'CountryCode']]
                        .groupby(['ClientIP', 'CountryCode', 'Operation'])
                        .count())
(clientip_by_op_count.unstack().fillna(0)
 .rename(columns={'RecordType': 'OperationCount'})
 .style.background_gradient(cmap=cm))
Getting data... done.

Contents

Geolocation Map of Client IPs

from msticpy.nbtools.foliummap import FoliumMap
folium_map = FoliumMap()

def get_row_ip_loc(row):
    try:
        _, ip_entity = iplocation.lookup_ip(ip_address=row.ClientIP)
        return ip_entity
    except ValueError:
        return None

off_ip_locs = (office_ops_df[['ClientIP']]
               .drop_duplicates()
               .apply(get_row_ip_loc, axis=1)
               .tolist())
ip_locs = [ip_list[0] for ip_list in off_ip_locs if ip_list]

display(HTML('<h3>External IP Addresses seen in Office Activity</h3>'))
display(HTML('Numbered circles indicate multiple items - click to expand.'))
icon_props = {'color': 'purple'}
folium_map.add_ip_cluster(ip_entities=ip_locs, **icon_props)
display(folium_map.folium_map)

Contents

Distinct User Agent Strings in Use

display(Markdown('### IPs and User Agents - frequency of use'))
display(Markdown('Distinct UserAgents by num of operations'))
office_ops_df[['UserAgent', 'Operation']].groupby(['UserAgent']).count().rename(columns={'Operation': 'OpCount'})

IPs and User Agents - frequency of use

Distinct UserAgents by num of operations

Contents

Graphical Activity Timeline

with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    display(Markdown('### Change in rate of Activity Class (RecordType) and Operation'))
    sns.relplot(data=office_ops_summary_df,
                x='TimeGenerated', y='OperationCount',
                kind='line', aspect=2, hue='RecordType')
    sns.relplot(data=office_ops_summary_df.query('RecordType == "SharePointFileOperation"'),
                x='TimeGenerated', y='OperationCount',
                hue='Operation', kind='line', aspect=2)

Change in rate of Activity Class (RecordType) and Operation

Image in a Jupyter notebook
Image in a Jupyter notebook

Contents

Users With largest Activity Type Count

with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    display(Markdown('### Identify Users/IPs with largest operation count'))
    office_ops = office_ops_summary_df.assign(
        Account=lambda x: (x.UserId.str.extract('([^@]+)@.*', expand=False)).str.lower())
    limit_op_types = ['FileDownloaded', 'FileModified', 'FileUploaded',
                      'UserLoggedIn', 'UserLoginFailed', 'Add member to role.',
                      'Add user.', 'Change user password.', 'Update user.']
    office_ops = office_ops[office_ops.Operation.isin(limit_op_types)]
    sns.catplot(data=office_ops, y='Account', x='OperationCount', hue='Operation', aspect=2)
    display(office_ops.pivot_table('OperationCount', index=['Account'], columns='Operation')
            .style.bar(color='orange', align='mid'))

Identify Users/IPs with largest operation count

Image in a Jupyter notebook
new_df = office_ops_df[['OfficeId', 'RecordType', 'TimeGenerated', 'Operation',
                        'OrganizationId', 'UserType', 'UserKey', 'OfficeWorkload',
                        'ResultStatus', 'OfficeObjectId', 'UserId', 'ClientIP', 'UserAgent']]
pd.merge(new_df, ip_locs_df, how='left', left_on='ClientIP', right_on='Address')

Contents

Office User Investigation

# set the origin time to the time of our alert
o365_query_times_user = mas.QueryTime(units='days',
                                      before=2, after=1,
                                      max_before=60, max_after=20,
                                      auto_display=True)
distinct_users = office_ops_df[['UserId']].sort_values('UserId')['UserId'].str.lower().drop_duplicates().tolist()
distinct_users
user_select = mas.SelectItem(description='Select User Id',
                             item_list=distinct_users,
                             auto_display=True)
# (items=distinct_users)

Contents

Activity Summary

# Provides a summary view of a given account's activity
# For use when investigating an account that has been identified as having
# associated suspect activity or been otherwise compromised.
# All office activity by UserName using UI to set Time range
# Tags: #Persistence, #Discovery, #Lateral Movement, #Collection
user_activity_query = '''
OfficeActivity
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| where UserKey has "{user}" or UserId has "{user}"
'''

print('Getting data...', end=' ')
o365_query = user_activity_query.format(start=o365_query_times_user.start,
                                        end=o365_query_times_user.end,
                                        user=user_select.value)
%kql -query o365_query
user_activity_df = _kql_raw_result_.to_dataframe()
print('done.')
user_activity_df
Getting data... done.

Contents

Operation Breakdown for User

my_df = (user_activity_df[['OfficeId', 'RecordType', 'TimeGenerated', 'Operation',
                           'ResultStatus', 'UserId', 'ClientIP', 'UserAgent']]
         .groupby(['Operation', 'ResultStatus', 'ClientIP'])
         .aggregate({'OfficeId': 'count'})
         .rename(columns={'OfficeId': 'OperationCount', 'ClientIP': 'IPCount'})
         .reset_index())
sns.catplot(x='OperationCount', y="Operation", hue="ClientIP",
            jitter=False, data=my_df, aspect=2.5);
Image in a Jupyter notebook

Contents

IP Count for Different User Operations

my_df2 = (user_activity_df[['OfficeId', 'RecordType', 'TimeGenerated', 'Operation',
                            'ResultStatus', 'UserId', 'ClientIP', 'UserAgent']]
          .groupby(['Operation'])
          .aggregate({'OfficeId': 'count', 'ClientIP': 'nunique'})
          .rename(columns={'OfficeId': 'OperationCount', 'ClientIP': 'IPCount'})
          .reset_index())
sns.barplot(x='IPCount', y="Operation", data=my_df2);
Image in a Jupyter notebook

Contents

Activity Timeline

nbdisp.display_timeline(data=user_activity_df,
                        title='Office Operations',
                        source_columns=['OfficeWorkload', 'Operation', 'ClientIP', 'ResultStatus'],
                        height=200)

Contents

User IP GeoMap

def get_row_ip_loc(row):
    try:
        _, ip_entity = iplocation.lookup_ip(ip_address=row.ClientIP)
        return ip_entity
    except ValueError:
        return None

from msticpy.nbtools.foliummap import FoliumMap
folium_map = FoliumMap()

off_ip_locs = (user_activity_df[['ClientIP']]
               .drop_duplicates()
               .apply(get_row_ip_loc, axis=1)
               .tolist())
ip_locs = [ip_list[0] for ip_list in off_ip_locs if ip_list]

display(HTML('<h3>External IP Addresses seen in Office Activity</h3>'))
display(HTML('Numbered circles indicate multiple items - click to expand.'))
icon_props = {'color': 'purple'}
folium_map.add_ip_cluster(ip_entities=ip_locs, **icon_props)
display(folium_map.folium_map)

Contents

Check for User IPs in Azure Network Flow Data

The full data is available in the Dataframe az_net_query_byip

if ('AzureNetworkAnalytics_CL' not in table_index
        or table_index['AzureNetworkAnalytics_CL'] == 0):
    display(Markdown('<font color="red"><h2>Warning. Azure network flow data not available.</h2></font><br>'
                     'This section of the notebook is not useable with the current workspace.'))

# Azure Network Analytics Base Query
az_net_analytics_query = r'''
AzureNetworkAnalytics_CL
| where SubType_s == 'FlowLog'
| where FlowStartTime_t >= datetime({start})
| where FlowEndTime_t <= datetime({end})
| project TenantId, TimeGenerated,
          FlowStartTime = FlowStartTime_t,
          FlowEndTime = FlowEndTime_t,
          FlowIntervalEndTime = FlowIntervalEndTime_t,
          FlowType = FlowType_s,
          ResourceGroup = split(VM_s, '/')[0],
          VMName = split(VM_s, '/')[1],
          VMIPAddress = VMIP_s,
          PublicIPs = extractall(@"([\d\.]+)[|\d]+", dynamic([1]), PublicIPs_s),
          SrcIP = SrcIP_s,
          DestIP = DestIP_s,
          ExtIP = iif(FlowDirection_s == 'I', SrcIP_s, DestIP_s),
          L4Protocol = L4Protocol_s,
          L7Protocol = L7Protocol_s,
          DestPort = DestPort_d,
          FlowDirection = FlowDirection_s,
          AllowedOutFlows = AllowedOutFlows_d,
          AllowedInFlows = AllowedInFlows_d,
          DeniedInFlows = DeniedInFlows_d,
          DeniedOutFlows = DeniedOutFlows_d,
          RemoteRegion = AzureRegion_s,
          VMRegion = Region_s
| extend AllExtIPs = iif(isempty(PublicIPs), pack_array(ExtIP),
                         iif(isempty(ExtIP), PublicIPs,
                             array_concat(PublicIPs, pack_array(ExtIP))))
| project-away ExtIP
| mvexpand AllExtIPs
{where_clause}
'''

# Build the query parameters
all_user_ips = user_activity_df['ClientIP'].drop_duplicates().tolist()
all_user_ips = [ip for ip in all_user_ips if ip and ip != '<null>']
ip_list = ','.join(['\'{}\''.format(i) for i in all_user_ips])
az_ip_where = f'''
| where (AllExtIPs in ({ip_list})
         or SrcIP in ({ip_list})
         or DestIP in ({ip_list}))
    and (AllowedOutFlows > 0 or AllowedInFlows > 0)'''

print('getting data...')
az_net_query_byip = az_net_analytics_query.format(where_clause=az_ip_where,
                                                  start=o365_query_times_user.start,
                                                  end=o365_query_times_user.end)
net_default_cols = ['FlowStartTime', 'FlowEndTime', 'VMName', 'VMIPAddress',
                    'PublicIPs', 'SrcIP', 'DestIP', 'L4Protocol', 'L7Protocol',
                    'DestPort', 'FlowDirection', 'AllowedOutFlows', 'AllowedInFlows']
%kql -query az_net_query_byip
az_net_comms_df = _kql_raw_result_.to_dataframe()
az_net_comms_df[net_default_cols]

import warnings
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    az_net_comms_df['TotalAllowedFlows'] = az_net_comms_df['AllowedOutFlows'] + az_net_comms_df['AllowedInFlows']
    sns.catplot(x="L7Protocol", y="TotalAllowedFlows", col="FlowDirection",
                data=az_net_comms_df)
    sns.relplot(x="FlowStartTime", y="TotalAllowedFlows", col="FlowDirection",
                kind="line", hue="L7Protocol",
                data=az_net_comms_df).set_xticklabels(rotation=50)

cols = ['VMName', 'VMIPAddress', 'PublicIPs', 'SrcIP', 'DestIP',
        'L4Protocol', 'L7Protocol', 'DestPort', 'FlowDirection',
        'AllExtIPs', 'TotalAllowedFlows']
flow_index = az_net_comms_df[cols].copy()

def get_source_ip(row):
    if row.FlowDirection == 'O':
        return row.VMIPAddress if row.VMIPAddress else row.SrcIP
    else:
        return row.AllExtIPs if row.AllExtIPs else row.DestIP

def get_dest_ip(row):
    if row.FlowDirection == 'O':
        return row.AllExtIPs if row.AllExtIPs else row.DestIP
    else:
        return row.VMIPAddress if row.VMIPAddress else row.SrcIP

flow_index['source'] = flow_index.apply(get_source_ip, axis=1)
flow_index['target'] = flow_index.apply(get_dest_ip, axis=1)
flow_index['value'] = flow_index['L7Protocol']

cm = sns.light_palette("green", as_cmap=True)
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    display(flow_index[['source', 'target', 'value', 'L7Protocol',
                        'FlowDirection', 'TotalAllowedFlows']]
            .groupby(['source', 'target', 'value', 'L7Protocol', 'FlowDirection'])
            .sum().unstack().style.background_gradient(cmap=cm))

nbdisp.display_timeline(data=az_net_comms_df.query('AllowedOutFlows > 0'),
                        overlay_data=az_net_comms_df.query('AllowedInFlows > 0'),
                        title='Network Flows (out=blue, in=green)',
                        time_column='FlowStartTime',
                        source_columns=['FlowType', 'AllExtIPs', 'L7Protocol', 'FlowDirection'],
                        height=300)
getting data...
Image in a Jupyter notebook
Image in a Jupyter notebook

Contents

Rare Combinations of Country/UserAgent/Operation Type

The dataframe below lists combinations in the time period that had fewer than 3 instances. This might help you spot relatively unusual activity.
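Before reading the clustering code, the underlying idea - count how often each Country/UserAgent/Operation combination occurs and keep the combinations seen fewer than 3 times - can be sketched directly in pandas. The column names follow the notebook; the data is made up for illustration.

```python
import pandas as pd

# Made-up activity records with the three features used to judge rarity.
ops = pd.DataFrame({
    'CountryCode': ['US', 'US', 'US', 'US', 'RO'],
    'UserAgent':   ['Edge', 'Edge', 'Edge', 'Edge', 'python-requests'],
    'Operation':   ['FileDownloaded'] * 5,
})

# Count each distinct combination, then keep the rare ones (< 3 instances).
combo_counts = (ops.groupby(['CountryCode', 'UserAgent', 'Operation'])
                .size().rename('Instances').reset_index())
rare = combo_counts[combo_counts['Instances'] < 3]
print(rare)
```

Here the single logon-like record from an unusual country/user-agent pair is surfaced, while the common US/Edge combination is filtered out. The clustering approach used in the notebook generalizes this by encoding the features numerically and grouping with DBSCAN.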

from msticpy.sectools.eventcluster import (dbcluster_events, add_process_features,
                                           char_ord_score, token_count, delim_count)

restrict_cols = ['OfficeId', 'RecordType', 'TimeGenerated', 'Operation',
                 'OrganizationId', 'UserType', 'UserKey', 'OfficeWorkload',
                 'ResultStatus', 'OfficeObjectId', 'UserId', 'ClientIP', 'UserAgent']
feature_office_ops = office_ops_df[restrict_cols]
feature_office_ops = (pd.merge(feature_office_ops, ip_locs_df,
                               how='left', left_on='ClientIP', right_on='Address')
                      .fillna(''))
# feature_office_ops = office_ops_df.copy()
feature_office_ops['country_num'] = feature_office_ops.apply(
    lambda x: char_ord_score(x, 'CountryCode') if x.CountryCode else 0, axis=1)
feature_office_ops['ua_tokens'] = feature_office_ops.apply(
    lambda x: char_ord_score(x, 'UserAgent'), axis=1)
feature_office_ops['user_num'] = feature_office_ops.apply(
    lambda x: char_ord_score(x, 'UserId'), axis=1)
feature_office_ops['op_num'] = feature_office_ops.apply(
    lambda x: char_ord_score(x, 'Operation'), axis=1)

# you might need to play around with the max_cluster_distance parameter.
# decreasing this gives more clusters.
(clustered_ops, dbcluster, x_data) = dbcluster_events(
    data=feature_office_ops,
    cluster_columns=['country_num', 'op_num', 'ua_tokens'],
    time_column='TimeGenerated',
    max_cluster_distance=0.0001)
print('Number of input events:', len(feature_office_ops))
print('Number of clustered events:', len(clustered_ops))

display(Markdown('#### Rarest combinations'))
display(clustered_ops[['TimeGenerated', 'RecordType', 'Operation', 'UserId',
                       'UserAgent', 'ClusterSize', 'OfficeObjectId', 'CountryName']]
        .query('ClusterSize <= 2')
        .sort_values('ClusterSize', ascending=True))
display(Markdown('#### Most common operations'))
display((clustered_ops[['RecordType', 'Operation', 'ClusterSize']]
         .sort_values('ClusterSize', ascending=False)
         .head(10)))
Number of input events: 486
Number of clustered events: 34

Rarest combinations

Most common operations

Contents

Appendices

Available DataFrames

print('List of current DataFrames in Notebook')
print('-' * 50)
current_vars = list(locals().keys())
for var_name in current_vars:
    if isinstance(locals()[var_name], pd.DataFrame) and not var_name.startswith('_'):
        print(var_name)
List of current DataFrames in Notebook
--------------------------------------------------
la_table_set
ad_changes_df
user_logon_anom_df
office_ops_summary_df
unique_ip_op_ua
office_ops_summary
office_ops_merged
ip_locs_df
office_ops_summary_ip_loc
office_logons_byuser_df
office_ops_df
office_ops_restr
office_ops_locs
country_by_op_count
clientip_by_op_count
office_ops
new_df
user_activity_df
my_df
my_df2
az_net_comms_df
flow_index
feature_office_ops
clustered_ops

Saving Data to Excel

To save the contents of a pandas DataFrame to an Excel spreadsheet use the following syntax

# the context manager closes and saves the file
# (ExcelWriter.save() is deprecated and removed in recent pandas versions)
with pd.ExcelWriter('myWorksheet.xlsx') as writer:
    my_data_frame.to_excel(writer, sheet_name='Sheet1')