Incident Response Part 2: What about the other logs?

The second part of the Incident Response series is here! In the last blog, we were lucky enough to have the logs already available in Microsoft 365 Defender or Sentinel. But what can you do if the logs are not ingested in your SIEM or not collected by your EDR? This blog explains how you can still perform incident response using KQL. Spoiler: Azure Data Explorer is your best friend! Furthermore, some practical examples are shared that can help you enrich your current M365D and Sentinel incident response cases. Do you not have any Microsoft security tooling? No problem, it is not needed for this blog!

The incident response series consists of the following parts:

  1. Incident Response Part 1: IR on Microsoft Security Incidents (KQL edition)
  2. Incident Response Part 2: What about the other logs?
  3. Incident Response Part 3: Leveraging Live Response

Free Azure Data Explorer Cluster

You can get a free Azure Data Explorer cluster if you do not have access to a company-owned cluster. This free cluster includes roughly 100 GB of log storage. The documentation below can be used to create your cluster, which is a prerequisite for the rest of this blog.


The reason why certain logs are not ingested in your SIEM differs per organisation: cost, inadequate data quality, ingestion limitations, or simply not having any security tooling running. The reason something is not ingested is not relevant for this blog, since we are going to perform incident response on logs that were never ingested in the first place. An example could be an appliance whose logs are not ingested in Sentinel, but a critical vulnerability requires you to investigate those logs to validate whether exploitation has taken place. Another example is custom incident response scripts that collect specific logs from a compromised system.

Azure Data Explorer

The starting point of this part is a working Azure Data Explorer (ADX) cluster (if you do not have one yet, check the section above). This could also be the cluster you used for the Kusto Detective Agency. The query language in ADX will be familiar to a lot of you; it is the name of the domain this blog is hosted on: KQL. This makes it easier to react to incidents. As in Sentinel or M365D, we again have tables that contain data; in the case of the image below, 100 rows of the table IpInfo are returned.
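As a quick illustration (assuming a table named IpInfo, as in the screenshot below), retrieving a sample of rows works exactly as it does in Sentinel or M365D:

IpInfo
| take 100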

Azure Data Explorer

Windows Security Events

Just as in a real incident investigation, the first step is to collect the needed logs. It is important to collect logs as soon as possible, because data on local systems might be overwritten after some time. So before we can use ADX, we first secure the logs, in this case the Windows Security Events from a compromised workstation.

Windows Security Events are useful for investigating security incidents because they contain a lot of valuable information, such as process creations and Active Directory events. But it may be that you do not ingest those logs, or only partially. The PowerShell script below can be executed (remotely, more on that in the third part of this series) on a compromised device; it saves all Security events to a CSV. This CSV is then loaded into ADX so the logs can be queried using KQL.

# Export all Windows Security Events to a CSV named after today's date
$ExecutionDate = $(Get-Date -f yyyy-MM-dd)
$OutputName = "SecurityEvents-$ExecutionDate.csv"
Get-EventLog -LogName Security | Export-Csv -Path $OutputName -NoTypeInformation

# Report where the CSV was written
if (Test-Path -Path $OutputName) {
    $folderPath = (Get-Item $OutputName).DirectoryName
    Write-Host "Output File Location: $folderPath\$OutputName"
} else {
    Write-Host "File does not exist."
}
Looking for more PowerShell Incident Response Scripts? They are available here!

Ingesting Logs in Azure Data Explorer

The above PowerShell script returned a CSV that we can ingest into ADX to analyse its content. This ingestion consists of the following steps:

  1. Menu -> Data

  2. Quick Actions -> Ingest Data

    Ingesting Data Section ADX

  3. Select the cluster and database, and enter your table name. In this case the table name is SecurityEventsDevice1. If you only want to add more rows to a current table, select ingest data into the existing table.

    Configuring Table Name

  4. Upload the CSV file.

  5. Verify Schema.

    Verify Ingested Schema

  6. Start Ingestion, for larger datasets this can take a few minutes.

  7. Query the data:

SecurityEventsDevice1 //Change this to your table name
| take 10

Very easy, right? Now we can start analysing this device for malicious activities.
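Before digging in, it can help to verify that the CSV columns came across as expected. The getschema operator lists every column of the ingested table together with its inferred type:

SecurityEventsDevice1 //Change this to your table name
| getschema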

ADX Ingestion Documentation:

Analysing Windows Security Events

The KQL part of this blog will show high level examples of useful steps to take when analysing logs. Since all different types of data can be ingested and reviewed in ADX, the KQL queries need to be specifically written for the data.

When analysing the content of any table, I most often start by taking the first 10 rows to get a feeling for the content. In this case, 10 Windows Security Events are returned, and opening one of the rows will show all its fields, including their content. I advise always opening a few rows from different EventIDs to get a feeling for their contents.

SecurityEventsDevice1
| take 10

Ingest Data

Based on those rows, we can collect some statistics that help us find rare behaviours in our environment. The fields that contain the action types are the most valuable for these statistics; in the case of Windows Security Events, the action type is the EventID field. Event identifiers uniquely identify a particular type of event. To get an understanding of what happened on the device, statistics of the EventIDs can be valuable for the investigation.

SecurityEventsDevice1
| summarize TotalEvents = count() by EventID
| sort by TotalEvents
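Beyond raw counts, bucketing the events over time can reveal bursts of activity around the moment of compromise. A minimal sketch, assuming the TimeGenerated column produced by the Get-EventLog export:

SecurityEventsDevice1
| summarize TotalEvents = count() by bin(TimeGenerated, 1h)
| render timechart
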

KQL Solutions

The Azure-Sentinel GitHub repository contains KQL queries for a lot of solutions. Even if you have not ingested the data in Sentinel, the prebuilt KQL logic can be beneficial.
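As a sketch of reusing such logic against an ADX table: a simple failed-logon hunt (EventID 4625), assuming the EventID and TimeGenerated columns from the export above, could look like this:

SecurityEventsDevice1
| where EventID == 4625 //Failed logon attempts
| summarize FailedLogons = count() by bin(TimeGenerated, 1h)
| sort by FailedLogons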

For Windows Security Events the following links are very helpful and can be used if you have ingested the data in ADX:

Microsoft 365

The second example investigates the Unified Audit Log from Microsoft 365, retrieved using the Microsoft Extractor Suite. The results of the Microsoft Extractor Suite can be exported to CSV format for further analysis. For this blog, the publicly available dataset from Invictus Incident Response is used to simulate the activities. Because the dataset is publicly available, you can perform the same actions!


Microsoft Extractor Suite

The Microsoft Extractor Suite is a PowerShell module for the acquisition of data from Microsoft 365 and Azure for Incident Response and Cyber Security purposes. This has been developed by the team of Invictus Incident Response. The following Microsoft data sources can be extracted for analysis: Unified Audit Log, Admin Audit Log, Mailbox Audit Log, Mailbox Rules, Transport Rules, Message Trace Logs, Azure AD Sign-In Logs, Azure AD Audit Logs, Registered OAuth applications in Azure AD.
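A minimal sketch of acquiring the Unified Audit Log with the module; the cmdlet names below are taken from the project's documentation, so verify them against the version you install:

# Install and load the module (assumed PSGallery name: Microsoft-Extractor-Suite)
Install-Module -Name Microsoft-Extractor-Suite
Import-Module -Name Microsoft-Extractor-Suite

# Authenticate to Microsoft 365, then extract the full Unified Audit Log
Connect-M365
Get-UALAll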

The documentation on how to use this PowerShell Module:

To ingest the CSV into Azure Data Explorer, the same steps were performed as for the Windows Security Events (see the section Ingesting Logs in Azure Data Explorer). The only difference is that a new table has been created, named AuditRecordsInvictusIR, which contains the M365 data. One of the first steps is to determine which action types (operations in this case) the dataset contains.

AuditRecordsInvictusIR
| summarize count() by Operations

Operations in the M365 Logs

Based on the listing of all operations, we can determine that multiple alerts have been triggered in our Microsoft 365 environment. This information can of course be very useful in an incident response scenario, so KQL is used to list the alerts. The query below collects all AlertTriggered operations and parses the AlertName and AlertId fields from the AuditData column. Lastly, the arg_max() function is used to only keep the latest status of each triggered alert.

AuditRecordsInvictusIR
| where Operations == 'AlertTriggered'
| extend AlertName = tostring(parse_json(AuditData).Name),
    AlertId = tostring(parse_json(AuditData).AlertId)
| summarize arg_max(CreationDate, AlertName) by AlertId

Alerts from the M365 Logs

The sky is the limit

When using Azure Data Explorer for incident response, it does not matter whether you use Windows Security Event logs, Unified Audit Logs, firewall logs or sign-in logs from an appliance. All data can be analysed using KQL; if the data comes in a less standardised format, some parsing must be done before you can query the events. But Azure Data Explorer has proven that it can be a huge help when performing incident response!
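For example, if an appliance only produces raw text lines, the KQL parse operator can split them into columns before analysis. A hypothetical sketch, assuming a table FirewallRaw with a single RawLine column containing lines like "DENY 10.0.0.5 -> 8.8.8.8:53":

FirewallRaw
| parse RawLine with Action " " SourceIp " -> " DestinationIp ":" DestinationPort
| where Action == "DENY"
| summarize Hits = count() by SourceIp, DestinationIp, DestinationPort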

The last part of the incident response series will also include how you can leverage MDE Live Response to remotely run scripts that can be ingested in Azure Data Explorer. Stay tuned!

Questions? Feel free to reach out to me on any of my socials.