Year: 2018

OMS Query – Patching Status for Meltdown and Spectre

This is a short article to show you how to use OMS Log Analytics to query the status of patches on Microsoft Windows Server platforms.

Please note: official guidance and advice can be found here: Protect your Windows devices against Spectre and Meltdown. This article is just one example of how to monitor patch status using the super cool OMS Log Analytics tools.

If you have not used OMS or Log Analytics, it is well worth spending some time investigating. You have the option of paid, trial and free tiers, and a whole range of interesting preconfigured packs to play with.

Where Log Analytics gets interesting is when you start to increase the amount of information you are gathering and then use custom queries to dig for information, provide proactive notifications and automated actions, and train and develop models to surface insights into your environment. Just imagine a machine learning model applied to data from your syslog server to map out network activity and threats.

For this article I am assuming that you already have OMS enabled and are collecting data but may never have looked into Log Analytics. You’ve probably clicked the Advanced Analytics button a few times and made some progress or gone “Whoa dude, strange things are afoot at the Circle K!” (The last bit might just be me :-))

Let's get cracking:

Head to the OMS workspace that hosts the Log Analytics service for the VMs you want to monitor. At this stage it's worth noting that there are a number of architectural options when considering your OMS workspace design. This article does not go into the patterns you can adopt, but as long as you have some VMs on premises being monitored and the data being collected, you'll be able to continue.

Select Log Search, then open up Advanced Analytics and "Hold on!"

When the Advanced Analytics page has loaded, open a new tab and paste in the query you need for the results you are after. To test it, select Run.


The query you are looking to run is against the Update data, so this needs to be your first input. From there you filter, narrowing down what you are looking for. Once narrowed down, you decide how you want this data summarised, and finally you render all this information as a table.

If this is the very first time you have tried a free-form query, try the topmost line first. It's likely you will get a lot of records, but you will see all the data and can then narrow it down to what you are after.
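As a sketch of that first step, you can run the table name on its own with a row limit so you can see the available columns before adding any filters (the one-day time window here is just an illustrative choice):

Update
| where TimeGenerated > ago(1d)
| take 10

Once you can see columns such as KBID and UpdateState in the results, you can start layering on the where, summarize and render lines shown below.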

I have copied the query below for you to use. Like everything, if you know a better way of doing things please share; I'd certainly be interested!

Update
| where KBID == "4056898" or KBID == "4056890"
| where UpdateState == "Installed" or UpdateState == "Needed"
| summarize hint.strategy=partitioned arg_max(TimeGenerated, *) by Computer, SourceComputerId, UpdateID
| summarize dcount(Computer) by Computer, UpdateState
| render table


Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Auditing Azure VMs: Add Results to Azure Tables | Azure PowerShell

Having read Paulo Marques' article Working with Azure Storage Tables from PowerShell, I decided to edit my auditing scripts to push the results into Azure Tables, giving me a repository I can keep and one that gives me more options. Moving forward we can update or pull this information out on demand, or use it as the basis of a comparison. I find it quite useful to have an independent record of the starting and end state of an environment pre and post any work undertaken.

There are a number of ways you can audit an Azure environment. With most of my customers I have implemented OMS, often using a combination of paid and free tiers to achieve the reporting they need to meet their own requirements and standards.

I'm a big fan of OMS; this script represents only one way to gather information and a chance to try something new in PowerShell.

To get started you'll need to follow the instructions in Paulo's article to install the correct module, and from there I suggest following his guide as this will give you a good understanding of how the commands operate. Once completed, it is a straightforward process to integrate this into any auditing script you currently have. The example below assumes the table has already been created.
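If you still need to create the table, a minimal sketch using the standard Azure.Storage cmdlets of the time might look like this (the resource names and the key retrieval pattern are placeholders for illustration, not part of the script below):

# Sketch: create the target table if it does not already exist
# Names here are illustrative placeholders
$resourceGroup  = "Resource Group"
$storageAccount = "Storage Account"
$tableName      = "Table"

# Grab a key for the storage account and build a storage context
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $key

# Create the table only if it is not already there
if (-not (Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction SilentlyContinue)) {
    New-AzureStorageTable -Name $tableName -Context $ctx
}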


# Variables for the environment set up
# PLEASE NOTE the Azure Table has already been set up
$subscriptionName = "Subscription Name"
$resourceGroup = "Resource Group"
$storageAccount = "Storage Account"
$tableName = "Table"
$PartitionKey = "Partition Key"

# Select the subscription and get a reference to the existing table
Select-AzureRmSubscription -SubscriptionName $subscriptionName
$table = Get-AzureStorageTableTable -resourceGroup $resourceGroup -tableName $tableName -storageAccountName $storageAccount


# Call the status of each server and upload it into the Azure Table
$rowcounter = 1
$RGs = Get-AzureRmResourceGroup

foreach ($RG in $RGs)
{
    $VMs = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName
    foreach ($VM in $VMs)
    {
        $VMDetail = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName -Name $VM.Name -Status

        # Keep the last status in the list (typically the power state)
        foreach ($VMStatus in $VMDetail.Statuses)
        {
            $VMStatusDetail = $VMStatus.DisplayStatus
        }

        Add-StorageTableRow -table $table `
            -partitionKey $PartitionKey `
            -rowKey ([guid]::NewGuid().ToString()) `
            -property @{"ResourceGroup"=$RG.ResourceGroupName;"computerName"=$VM.Name;"status"=$VMStatusDetail}
        $rowcounter++
    }
}

This second example updates the values in the Azure Table. To do this we pull the computer's existing row out of the table, collect the status as before, and push the updated entity back.

# Variables for the environment set up
# PLEASE NOTE the Azure Table has already been set up
$subscriptionName = "Wade - Internal Consumption"
$resourceGroup = "rg-ause-test-platform"
$storageAccount = "rgausetestplatform626"
$tableName = "table01"
$PartitionKey = "AUSSite"

# Select the subscription and get a reference to the existing table
Select-AzureRmSubscription -SubscriptionName $subscriptionName
$table = Get-AzureStorageTableTable -resourceGroup $resourceGroup -tableName $tableName -storageAccountName $storageAccount


# Call the status of each server and update the matching row in the Azure Table
$rowcounter = 1
$RGs = Get-AzureRmResourceGroup

foreach ($RG in $RGs)
{
    $VMs = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName
    foreach ($VM in $VMs)
    {
        $VMDetail = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName -Name $VM.Name -Status

        # Keep the last status in the list (typically the power state)
        foreach ($VMStatus in $VMDetail.Statuses)
        {
            $VMStatusDetail = $VMStatus.DisplayStatus
        }

        # Create the filter and get the original entity
        # (the column name must match exactly; no trailing space)
        [string]$filter = [Microsoft.WindowsAzure.Storage.Table.TableQuery]::GenerateFilterCondition("computerName",[Microsoft.WindowsAzure.Storage.Table.QueryComparisons]::Equal,$VM.Name)
        $computer = Get-AzureStorageTableRowByCustomFilter -table $table -customFilter $filter

        # Change the value
        $computer.status = $VMStatusDetail

        # Update the content
        $computer | Update-AzureStorageTableRow -table $table

        # Get the entity again to check the changes
        Get-AzureStorageTableRowByCustomFilter -table $table -customFilter $filter
    }
}

Remember there is always a better way to do things, and the only way we find out is if you have a go at sharing. I look forward to your versions and updates. Happy scripting!
