Your gateway to Azure – Log Analytics

There are a ton of articles that detail how to get all sorts of data into Log Analytics. My friend Cameron Fuller has demonstrated several ways to do it, for example. Heck, I even wrote a post or two.

Recently it occurred to me that I hadn’t read a lot of articles on WHY you want your data in Log Analytics. For people who already have data being ingested the value is obvious, but if you haven’t started down that road yet, you might be wondering what all the hype is about. This article is for you.

I will tell you right now – Log Analytics is the ‘gateway drug’ of Azure. One hit, and you are hooked. Once you get your data into Log Analytics the possible uses skyrocket. Let’s break down some of the obvious ones first.

Analysis

This one is the get’s the “Duh” award. Get the data into Log Analytics immediately let’s you use the Azure Data Explorer Query Language (aka at one time as Kusto).

The language is easy to understand, easy to write, and there are tons of examples for doing everything from simple queries to complex monsters with charts, predictions, and cross-resource log searches. This is the first and most obvious benefit of data ingestion. Not to diminish what you get by stopping here, but if this is the extent of what you do with your data, you are missing out.
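
To make that concrete, here is the kind of query I mean – a minimal sketch that assumes you are collecting Windows performance counters into the standard Perf table (swap in whatever counters you actually collect):

Perf
| where TimeGenerated > ago(1d)
| where ObjectName == "Processor" and CounterName == "% Processor Time" and InstanceName == "_Total"
| summarize avg(CounterValue) by Computer, bin(TimeGenerated, 1h)
| render timechart

A handful of readable lines gets you an hourly CPU trend across every connected machine – and that is about as basic as Kusto gets.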

Solutions

Built right into Log Analytics is a set of amazing pre-built solutions that can automatically take your logs and turn them into consumable and actionable data points. Need to know how your Operations Manager environment is doing? Connect SCOM to Log Analytics and you are just a few clicks away from seeing performance improvement suggestions, availability recommendations, and even spotting upgrade issues before they occur. The SQL Assessment supplies even more actionable data across your entire connected SQL environment. Most of the solutions come with exquisitely detailed recommendations. Check out this example from my personal SQL Assessment.

There are many different solutions, and more are being added all the time. Container analysis, upgrade assessment, change tracking, malware checking, AD replication status – the list is amazing! Even better, the product team that builds these solutions wants to know what you want to see! Go here to see the full list of solutions currently available, and check out the link at the bottom of the page to leave your suggestions.
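
Just as important, the data these solutions collect is not locked away in their dashboards – it lands in ordinary tables you can query yourself. A quick sketch (the table and column names below assume the Change Tracking solution is enabled in your workspace):

ConfigurationChange
| where TimeGenerated > ago(7d)
| where ConfigChangeType == "Software"
| summarize Changes = count() by Computer, SoftwareName

That gives a per-machine summary of software changes over the last week, ready to feed into the scenarios later in this post.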

The not so obvious, and the most fun!

Ok – we’ve knocked out the most obvious usages, but now let’s look at some of the other fun uses!

Log Analytics queries can be directly exported and imported into PowerBI! Simply craft your query, click export (see below), and LA will automatically craft the connection information and query in a way that PowerBI can understand! All of that data is suddenly available to the power of one of the best BI engines in the business.

Ok – I can hear you all now. PowerBI is just another reporting-type application (it’s not, btw), but what else can we do? How about integration with one of the most powerful automation tool-sets on the market? Connectors are available directly in both Flow and Logic Apps that allow you to query your data and trigger from the returned data. This is where your data integration truly starts!

Imagine some of the possibilities for both your on-prem and cloud resources:

  • Get texts about critical updates as they are found
  • Schedule update installations with a full approval chain
  • Send notifications about changes that occur in your environment, sending the notifications to the appropriate teams based on the change type
  • Send Azure Monitor alerts straight to Event Grid or Event Hubs for further processing
  • Connect your SCOM alerts through LA and right into Azure Automation to run runbooks in the cloud or on-prem
  • Consume logs from your building entry system and schedule software distributions when people leave the office
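
Take the first bullet as an example. Assuming the Update Management solution is feeding your workspace (the table and column names below come from that solution – adjust to whatever you actually collect), a scheduled Flow or Logic App could run something like this and fire off a text whenever rows come back:

Update
| where TimeGenerated > ago(1d)
| where Classification == "Critical Updates" and UpdateState == "Needed"
| summarize MissingCriticalUpdates = dcount(Title) by Computer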

Monitor your Logic Apps with Log Analytics (Preview)

This is a continuation of the blog series that fellow MVP Billy York and I are putting on around the journey from on-prem SCORCH to Automation in Azure. In this post we will look at a new preview Log Analytics solution that will help you monitor and respond to issues that will invariably happen with Logic Apps.

Logic App Management lets heavy users of Logic Apps get ‘at-a-glance’ updates on the success or failure of their apps, or drill down deeply into individual runs for details such as start/stop time, execution duration, action durations, etc… It is especially useful when you have a very large set of Logic Apps. Trying to get a single pane of glass across dozens or hundreds of Logic Apps is difficult without this exciting preview feature.

To add this solution to Log Analytics, you will need…..wait for it…..a Log Analytics workspace! Your existing Log Analytics workspace will do just fine – no need to create a new one. In fact, there are a lot of reasons why you wouldn’t want to! Once you have an eligible workspace, it’s time to add the solution. You can get to this either by adding a resource to the resource group or by adding the solution directly from the workspace – both will start the same wizard.

One option to add the solution is to do it directly from your Log Analytics workspace.
Searching for the solution…
The actual solution is still in preview, but is very stable.

The actual wizard is very simple – just pick your workspace and click create. Once you have the solution provisioned, you can turn it on or off per Logic App by opening “Diagnostic settings” on the Logic App and editing the setting below – if you want to turn it off, just uncheck the “Send to Log Analytics” setting.
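
Once diagnostics are flowing, a quick sanity check is to confirm the workflow events are actually landing in the workspace. This is just a sketch – the Category value matches what the solution’s own query (shown later in this post) filters on, and it can take a few minutes for the first records to show up:

AzureDiagnostics
| where Category == "WorkflowRuntime"
| sort by TimeGenerated desc
| take 10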

For this article, I set up a couple of simple Logic Apps – both of them query RSS feeds, but one has a good feed URL and the other doesn’t. I varied their start times a bit just to show some contrasting data.

The good logic app…..
And the bad one – this URL doesn’t actually work.

I let these apps run for a couple of days. When I look at my Log Analytics workspace, I am greeted with this new tile, and can drill down into it to get a nice Logic App run overview:

New tile!
And some awesome graphs!!!

This is a great overview screen for your apps. Notice that the central graph is slightly off – I have tried it in multiple browsers on multiple themes, and it shows the same in each. Hence the ‘preview’ moniker on the solution, I guess. Regardless, at a glance I can see the Logic Apps that have succeeded or failed, get a list of the errors, and see a visual comparison of the status of my apps. Clicking on the ‘runs’ tab brings up a bit more detail, and if I click the “See All” link in the bottom left of each tile, I get a familiar-looking interface 🙂

A bit more detail……
Our old friend – Kusto!

For anyone wanting to know the actual query, here it is:

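// Completed workflow runs (the left side of the join below)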
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where OperationName == "Microsoft.Logic/workflows/workflowRunCompleted"
| join kind = rightouter
(
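    // Started runs whose trigger completed (right side of the join, so runs that never finished still appear)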
    AzureDiagnostics
    | where Category == "WorkflowRuntime"
    | where OperationName == "Microsoft.Logic/workflows/workflowRunStarted"
    | where resource_runId_s in ((
        AzureDiagnostics
        | where Category == "WorkflowRuntime"
        | where OperationName == "Microsoft.Logic/workflows/workflowTriggerCompleted"
        | project resource_runId_s ))
    | project WorkflowStartStatus=status_s, WorkflowNameFromInnerQuery=resource_workflowName_s, WorkflowIdFromInnerQuery=workflowId_s, resource_runId_s
)
on resource_runId_s
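// Prefer values from the completed-run event; fall back to the started-run event when a run has not completed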
| extend WorkflowStatus=iff(isnotempty(status_s), status_s, WorkflowStartStatus)
| extend WorkflowName=iff(isnotempty(resource_workflowName_s), resource_workflowName_s, WorkflowNameFromInnerQuery)
| extend WorkflowId=iff(isnotempty(workflowId_s), workflowId_s, WorkflowIdFromInnerQuery)
| summarize Count=count() by WorkflowId, WorkflowName, WorkflowStatus

The benefits of having this data in the “Gateway Drug” (trademark pending) of Azure – Log Analytics – should be obvious. Want to export your Logic App data to PowerBI, or, even better, set up alerts when a Logic App fails? Combine that data with feeds from AAD, Graph, Office365, etc…. Having all of this data in an easily queried repository is priceless.
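
As a parting example, here is a sketch of the query I would hang an Azure Monitor log alert off of to catch failed runs – the column names come straight from the solution query above:

AzureDiagnostics
| where Category == "WorkflowRuntime"
| where OperationName == "Microsoft.Logic/workflows/workflowRunCompleted"
| where status_s == "Failed"
| summarize FailedRuns = count() by resource_workflowName_s, bin(TimeGenerated, 15m)

Point an alert rule at that, and you will know about a failed Logic App before anyone has to go looking.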