I’ve worked with several customers recently who still like to be able to work offline. It’s been part of their normal processes for a long while and has become something of a habit from working with legacy, on-premises security tools for so long. I try to teach better ways to modernize the processes that are exposed by using a modern, cloud-based SIEM like Azure Sentinel, but habits are hard to break. They have found that, due to the slowness of the old tool’s query language against the sea of data they are storing, it’s just been easier to offload data to an external file and work against that. Of course, thanks to the value of Azure, Log Analytics, and KQL, Azure Sentinel customers don’t have to do this any longer. But, for those that do, here’s a quick solution.
NOTE: For a more current method of doing this, see: How to Automate the Backup of Azure Sentinel Tables to Blob Storage Using PowerShell. However, the more current method only supports certain tables and not custom tables. To back up non-supported and custom tables, continue on…
If you’ve worked with Azure Sentinel, you should know that there’s an option in the Logs blade to export query results, as shown in the image.

However, because, yes, we are talking about customers who want to work offline, a script-based approach is also palatable in this scenario. For that reason, I put together the PowerShell script shown below.
The script does the following:
- Logs into Azure
- Initiates downloading a table stored in Log Analytics (you’ll need to supply the table name and your Log Analytics workspace ID) based on a KQL query that is simply the table name, which returns the entire table
- Saves the downloaded data into a .csv file in a local C:\SentinelTables folder. The folder needs to be created manually (or you can change the destination variable). The filename will be TableName-Date.csv.
You can also just stop the script (Ctrl-C) at any point to quit downloading data and work with what has already been exported locally.
Here’s the PowerShell script:
# Authenticate to Azure (AzureRM module)
Login-AzureRmAccount
#==============================================================
# Define environment variables
#==============================================================
$SavePath = "C:\SentinelTables"
$FileDate = Get-Date -Format "yyyy-MM-dd"
# Fill in your Log Analytics workspace ID
$WorkspaceID = "xxxxx-xxxxx-xxxxx-xxxxx-xxxxxx"
# Change TableName to the table you want to extract
$TableName = "AzureActivity"
# Output CSV path: TableName-Date.csv in the save folder
$OutputCSV = "$SavePath\$TableName-$FileDate.csv"
# Get the table data from Log Analytics (the query is simply the table name)
$TableResult = Invoke-AzureRmOperationalInsightsQuery -WorkspaceId $WorkspaceID -Query $TableName | Select-Object -ExpandProperty Results
$TableResultCount = ($TableResult | Measure-Object).Count
# Fill up the CSV one row at a time, so a stopped run still leaves usable data
If ($TableResultCount -ge 1) {
    foreach ($Result in $TableResult) {
        $Result | Export-Csv $OutputCSV -NoTypeInformation -Append
    }
}
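As written, the KQL query is just the table name, so the script pulls back everything in the table. You can hand Invoke-AzureRmOperationalInsightsQuery any KQL you like instead. Here’s a minimal tweak, assuming you only want the last 7 days of data (the time window is just an example):
# Bound the export to a time range instead of pulling the whole table
$Query = "$TableName | where TimeGenerated > ago(7d)"
$TableResult = Invoke-AzureRmOperationalInsightsQuery -WorkspaceId $WorkspaceID -Query $Query | Select-Object -ExpandProperty Results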
You can also grab the most current version of the script from my GitHub Repo: https://github.com/rod-trent/SentinelPS/blob/master/ExportSentinelTable.ps1
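Once the data is on disk, working offline is just standard PowerShell against the CSV. As a quick example, assuming the AzureActivity export above (OperationName is a column in that table):
# Work offline against the CSV exported above
$Data = Import-Csv $OutputCSV
# Example: top 10 operations by count
$Data | Group-Object OperationName | Sort-Object Count -Descending | Select-Object Count, Name -First 10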
Additionally, this script can be useful for those customers who want to “backup” data from Azure Sentinel to keep it longer than the data retention setting for the Log Analytics workspace. I’m working on a version that will push the data to Azure Blob Storage for that purpose; a rough sketch of the idea is below.
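For the curious, here’s a minimal sketch of what that blob storage push could look like using the Azure.Storage cmdlets that ship alongside AzureRM. The storage account name, key, and container name below are placeholders, and the container is assumed to already exist:
# Sketch: copy the exported CSV up to Azure Blob Storage
# (storage account name, key, and container name are placeholders)
$StorageContext = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "xxxxx"
Set-AzureStorageBlobContent -File $OutputCSV -Container "sentinelbackup" -Blob "$TableName-$FileDate.csv" -Context $StorageContext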
Also, keep in mind that while I’m targeting Azure Sentinel data with this PowerShell script solution, it works for any table in any Log Analytics workspace.
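For instance, pointing the script at a different or custom table is just a matter of changing one variable (custom log tables carry the _CL suffix; MyCustomLog_CL below is a made-up example):
# Any Log Analytics table works here
$TableName = "Syslog"            # a standard table
$TableName = "MyCustomLog_CL"    # a hypothetical custom table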
Thoughts? Script improvements? Let me know….
[Want to discuss this further? Hit me up on Twitter or LinkedIn]