Understanding Microsoft Automation Options – Again

Microsoft has more automation options than there are flavors of jelly beans. Understanding what each one is capable of can be a chore in and of itself. Here is a quick primer on some of the tools:

  • Azure Automation
  • Azure Logic Apps
  • Azure Function Apps
  • Power Apps
  • System Center Orchestrator

Azure Automation:
Azure Automation is a service that helps you automate manual, long-running, error-prone, and frequently repeated tasks that are commonly performed in a cloud and on-premises environment. It allows you to use runbooks (based on Windows PowerShell or Graphical Runbooks) to automate processes and workflows.
Azure Automation delivers a cloud-based automation, operating system updates, and configuration service that supports consistent management across your Azure and non-Azure environments. It includes process automation, configuration management, update management, shared capabilities, and heterogeneous features.


Pros:

  • Can automate a wide range of tasks and processes
  • Can run on a schedule or be triggered by an event
  • Can be used to automate processes across a variety of systems and platforms

Cons:

  • Requires some coding knowledge, such as PowerShell
  • May require more setup and configuration compared to other services

Azure Logic Apps:
Azure Logic Apps is a cloud service that helps you automate and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations. Logic Apps connects to your on-premises and cloud-based systems and services, and it provides over 200 connectors that you can use to create your logic app workflow.


Pros:

  • Provides a visual designer for building workflows
  • Offers a wide range of connectors to other systems and services
  • Can be triggered by a variety of events and can run on a schedule
  • Can be configured as multi-tenant or single-tenant, which grants the ability to scale in the appropriate manner

Cons:

  • May require more setup and configuration compared to other services, especially for apps that use the “Standard” model

Azure Function Apps:
Azure Functions is a serverless compute service that enables you to run code on demand without having to explicitly provision or manage infrastructure. Functions can be triggered by a variety of inputs, including HTTP requests, timer triggers, and changes to data in other services, and can be used to build a variety of applications and services. Function Apps are best used for data-in/data-out operations. You can build function apps that have a longer session life, but that requires advanced configuration and potentially higher cost.


Pros:

  • Serverless compute model (pay-per-use)
  • Can be triggered by a variety of inputs and events
  • Can be used to build a wide range of applications and services

Cons:

  • Requires coding knowledge
  • May not be suitable for more complex or customized scenarios, especially if you opt for long-running functions

Power Apps and the Power App Platform:
Power Apps is a low-code platform for building custom business applications. With Power Apps, you can create custom business applications for web and mobile devices that can connect to your business data stored in the Common Data Service (CDS) or other data sources.


Pros:

  • Low-code platform for building custom business applications – especially useful for business users that don’t need IT involvement
  • Can connect to a wide range of data sources
  • Provides a visual designer for building apps

Cons:

  • May not be suitable for more complex or customized scenarios, and should not be used for business-critical processes
  • May have limitations in terms of app functionality and capabilities

System Center Orchestrator (SCORCH):
SCORCH is a workflow automation software product that enables you to automate system and application processes across an enterprise. SCORCH is part of the System Center suite of tools and is designed to work with other System Center products, as well as other Microsoft and third-party products. It is the only wholly on-prem tool in this post.


Pros:

  • Can automate a wide range of tasks and processes across an enterprise
  • Provides a visual designer for building workflows
  • Offers a wide range of connectors to other systems and services
  • Can be triggered by a variety of events and can run on a schedule
  • Can be configured to be completely on-prem, should your company not allow cloud services

Cons:

  • First and foremost – dead product walking. Do not invest in SCORCH at this time. Microsoft might choose to invest in this product in the future, but do not invest time in this tool until that happens.
  • May require more setup and configuration compared to other workflow automation tools
  • May not be suitable for more complex or customized workflow scenarios
  • May require a significant investment in terms of time and resources to set up and maintain
  • May have dependencies on other System Center products or require integration with other Microsoft or third-party products

MQTT Data to Azure Log Analytics

Setting up solar on our farm in North Texas has been a treasure trove of data to analyze – everything from inverter load, to PV watts in, to state of charge for the batteries. It’s a massive amount of useful information.

So naturally I want that data in Azure Log Analytics. It’s the perfect platform to handle this amount of data. Small messages, lots of entries. Just ready for reporting and analytics.

To do this, we are going to use IoT Hub as a go-between. To get MQTT data into Azure Log Analytics via IoT Hub, you will need to perform the following high-level steps:

  1. Set up an IoT hub in your Azure account.
  2. Set up a device in your IoT hub.
  3. Configure the device to send data to your IoT hub using the MQTT protocol.
  4. Set up a log analytics workspace in your Azure account.
  5. Connect your IoT hub to your log analytics workspace.
  6. Start sending MQTT data from your device to your IoT hub.

And now some details:

  1. Set up an IoT hub in your Azure account:
  • In the Azure portal, click on “Create a resource” in the top left corner, then search for “IoT Hub” and select it.
  • Follow the prompts to create a new IoT hub, including selecting the subscription, resource group, and region you want to use.
  • Make note of the IoT hub’s name and connection string, as you will need these later.
  2. Set up a device in your IoT hub:
  • In the Azure portal, go to the IoT hub blade and select “IoT Devices” from the left menu.
  • Click the “Add” button to create a new device.
  • Follow the prompts to set up the device, including giving it a unique device ID and generating a device key or certificate.
  • Make note of the device ID and key/certificate, as you will need these later.
  3. Configure the device to send data to your IoT hub using MQTT:
  • The specific steps for this will depend on the device you are using, but generally you will need to specify the MQTT endpoint and authentication information for your IoT hub.
  • The MQTT endpoint will be in the format “YOUR-IOT-HUB-NAME.azure-devices.net”, and you will need to use the device ID and key/certificate you obtained in step 2 to authenticate.
  • Specify the MQTT topic you want to publish data to.
  4. Set up a log analytics workspace:
  • In the Azure portal, click on “Create a resource” in the top left corner, then search for “Log Analytics” and select it.
  • Follow the prompts to create a new log analytics workspace, including selecting the subscription, resource group, and region you want to use.
  • Make note of the workspace ID and primary key, as you will need these later.
  5. Connect your IoT hub to your log analytics workspace:
  • In the Azure portal, go to the IoT hub blade and select “Diagnostic settings” from the left menu.
  • Click the “Add diagnostic setting” button to create a new setting.
  • Select the log analytics workspace you created in step 4 as the destination for the data.
  • Select the data types you want to collect.
  6. Start sending MQTT data from your device to your IoT hub:
  • Using the device ID, key/certificate, MQTT endpoint, and topic you obtained in steps 2 and 3, publish data to your IoT hub using the MQTT protocol.
  • The data should be automatically forwarded to your log analytics workspace and be available for analysis.
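Step 3 is where most devices trip up, so here is a minimal sketch of generating the SAS token that IoT Hub expects in the MQTT password field. The hub name, device ID, and key below are placeholders – substitute your own from steps 1 and 2 – but the HMAC-SHA256 signing scheme over the URL-encoded resource URI and expiry is the standard IoT Hub SAS pattern:

```powershell
# Sketch: build an IoT Hub SAS token for MQTT authentication.
# Hub/device names and the key below are placeholders.
function New-IotHubSasToken {
    param(
        [string]$ResourceUri,  # e.g. "YOUR-IOT-HUB-NAME.azure-devices.net/devices/YOUR-DEVICE-ID"
        [string]$DeviceKey,    # the base64 device key from step 2
        [int]$TtlSeconds = 3600
    )
    # Expiry is seconds since the Unix epoch
    $expiry = [DateTimeOffset]::UtcNow.ToUnixTimeSeconds() + $TtlSeconds
    $encodedUri = [uri]::EscapeDataString($ResourceUri)
    # Sign "<url-encoded-uri>`n<expiry>" with the base64-decoded device key
    $hmac = [System.Security.Cryptography.HMACSHA256]::new([Convert]::FromBase64String($DeviceKey))
    $sig = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes("$encodedUri`n$expiry")))
    "SharedAccessSignature sr=$encodedUri&sig=$([uri]::EscapeDataString($sig))&se=$expiry"
}

# The token goes in the MQTT CONNECT password field; the username is your
# "<hub>.azure-devices.net/<deviceId>" pair (plus an api-version query string, per the docs).
$token = New-IotHubSasToken -ResourceUri "YOUR-IOT-HUB-NAME.azure-devices.net/devices/YOUR-DEVICE-ID" `
    -DeviceKey ([Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("example-key")))
$token
```

Most MQTT client libraries will take that token as the connection password and handle the rest.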

    Adding Azure Alert Rules with PowerShell 7

    Here is a quick and dirty post to create both Metric and Scheduled Query alert rule types in Azure:

To create an Azure Alert rule using PowerShell 7 and the Az module, you will need to install both PowerShell 7 and the Az module.

    PowerShell 7: https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell?view=powershell-7.1

To install the Az module, run the following command in PowerShell:

Install-Module -Name Az

Once both PowerShell 7 and the Az module are installed, you can use the following commands to create an Azure Alert rule.

To create a metric alert rule (using Add-AzMetricAlertRuleV2 – the older classic metric alert cmdlets have been retired):

$criteria = New-AzMetricAlertRuleV2Criteria `
    -MetricName "Percentage CPU" `
    -TimeAggregation Average `
    -Operator GreaterThan `
    -Threshold 90

Add-AzMetricAlertRuleV2 `
    -ResourceGroupName "MyResourceGroup" `
    -Name "MyMetricAlertRule" `
    -TargetResourceId "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Compute/virtualMachines/{vm-name}" `
    -WindowSize 00:30:00 `
    -Frequency 00:05:00 `
    -Condition $criteria `
    -Severity 3

And to create a Scheduled Query rule (this uses the New-AzScheduledQueryRule syntax from current Az.Monitor versions – check Get-Help New-AzScheduledQueryRule if yours differs):

$condition = New-AzScheduledQueryRuleConditionObject `
    -Query "AzureMetrics | where ResourceProvider == 'Microsoft.Compute' and ResourceType == 'virtualMachines' and MetricName == 'Percentage CPU' and TimeGrain == 'PT1H' and TimeGenerated > ago(2h) | summarize AggregateValue=avg(MetricValue) by bin(TimeGenerated, 1h), ResourceId" `
    -TimeAggregation "Average" `
    -MetricMeasureColumn "AggregateValue" `
    -Operator "GreaterThan" `
    -Threshold 90

New-AzScheduledQueryRule `
    -ResourceGroupName "MyResourceGroup" `
    -Name "MyScheduledQueryAlertRule" `
    -Location "East US" `
    -Scope "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.OperationalInsights/workspaces/{workspace-name}" `
    -WindowSize ([TimeSpan]::FromHours(2)) `
    -EvaluationFrequency ([TimeSpan]::FromHours(1)) `
    -CriterionAllOf $condition `
    -ActionGroupResourceId "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Insights/actionGroups/{action-group-name}"

    You will have to replace the main bits you would expect – ResourceGroupName, Subscription-ID, Action-Group-Name, Location, etc.

    Hope this helps!

    Stopwatch vs Measure-Command

    Just a quick one – consider using the Stopwatch class instead of Measure-Command in PowerShell. There are a couple of good reasons that you might want to:

First off, you get more granular measuring units. While this is not normally an issue for routine performance troubleshooting, there may be unique circumstances where you need to get sub-millisecond precision.

In my opinion, the main reason to use the Stopwatch class is the enhanced control it gives you. You can stop and restart the stopwatch at any time – start it, run some commands, stop it, run more commands, and then restart it, without having to add together two or more Measure-Command results. You also get the ability to zero out the stopwatch any time you want, and you can even have multiple stopwatches running at the same time!
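Here is a quick sketch of that control in action – pause the watch around code you don’t want measured, then read the sub-millisecond elapsed value:

```powershell
# Stopwatch gives you pause/resume control that Measure-Command doesn't
$sw = [System.Diagnostics.Stopwatch]::StartNew()
Start-Sleep -Milliseconds 50          # measured
$sw.Stop()
Start-Sleep -Milliseconds 50          # NOT measured - the watch is paused
$sw.Start()
Start-Sleep -Milliseconds 50          # measured again, added to the first interval
$sw.Stop()

"Measured: $($sw.Elapsed.TotalMilliseconds) ms"   # roughly 100 ms, not 150
$sw.Restart()                                     # zero it and start again in one call
$sw.Reset()                                       # or zero it and leave it stopped
```

Note the TotalMilliseconds property – that is a double, so you get the sub-millisecond detail that Measure-Command rounds away in its default display.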

    Azure Monitor Action Group Source IP Addresses

    One of the hurdles a company might run into when moving to Azure, especially with Azure Monitor and Log Analytics, is integration with Action Groups. Rarely are the actions of SMS or Email good enough to integrate with an internal Service Management system, so that leaves webhook as the simplest way to get data back into the datacenter. But that leaves one problem – Firewalls.

    The Azure backend can change at any time, so it’s important to know what IP addresses the Action Group can originate from – and we can do that with Get-AZNetworkServiceTag.

To start, let’s install the Az.Network module. I am specifying AllowClobber and Force, to both update dependencies and upgrade existing modules. Make sure you are running your pwsh window or VSCode as an administrator.

    install-module az.network -allowclobber -force

    If you aren’t already connected, do a quick Connect-AzAccount. It might prompt you for a username/password, and will default to a subscription, so specify an alternative if necessary.

Once connected, we can run the Get-AzNetworkServiceTag cmdlet. We will specify the region we want to get the information for – in this case, EastUS2.

    Get-AzNetworkServiceTag -Location eastus2

    The ‘values’ property is what we are after. It contains a massive amount of resource types, regions, and address ranges. In our case, let’s just get the Action Groups, specifically for EastUS2.

    (Get-AzNetworkServiceTag -Location eastus2).values|where-object {$_.name -eq 'ActionGroup.EastUS2'}

And there we go! A list of 3 ranges. We can now work on our firewall rules and make sure the webhook can get back into the datacenter.

    Log Analytics and RSS Feeds – Why Not?

    The other day I was looking at the latest updates to Azure and noticed the handy RSS feed button, and that immediately made me think about automation that could be triggered from it. Obviously you could make a Power App to handle that, but since my head is in Azure Monitor at the moment, I thought – Why Not?

    Let’s get that data into the gateway drug of Azure – Log Analytics. This post is a primer on integrating these 2 tools, so if you are experienced with Log Analytics Workspace, feel free to skip ahead! But if you want to see some first steps, follow along! First, we will need a couple of things:

    1. A Log Analytics Workspace
    2. A Logic App
    3. A desired outcome – this is handled by the Workspace and we will cover some of the possibilities in the next couple of posts.

    Let’s start. Below you can see my LA Workspace:

    As you can see – it’s completely blank because it’s new. Before we create the Logic App, go ahead and select “Agents Management” and copy down the Workspace ID and the Primary (or secondary, I won’t judge) key.

Now it’s time to set up the Logic App. Before you ask – do I need a Logic App, or can I just use the LA Workspace API to send data? Well, if you are asking that question, then you already know the answer – of course you could! The point here is to make it easy to consume the RSS feed, and not have to write some sort of feed consumer ourselves. We’re lazy, after all. Here’s the blank Logic App:
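(For the curious, here is what the roll-your-own route looks like under the hood – the heart of it is signing each POST to the Log Analytics HTTP Data Collector API with your workspace key. A sketch, with placeholder names; the string-to-sign format is the documented Data Collector API scheme:)

```powershell
# Sketch: build the Authorization header for the Log Analytics HTTP Data Collector API.
# WorkspaceId/SharedKey are the values from Agents Management; names here are placeholders.
function New-LogAnalyticsSignature {
    param(
        [string]$WorkspaceId,
        [string]$SharedKey,      # base64 primary or secondary key
        [int]$ContentLength,     # byte length of the JSON body
        [string]$Rfc1123Date     # e.g. ([DateTime]::UtcNow).ToString('r')
    )
    # The documented string-to-sign for the Data Collector API
    $stringToSign = "POST`n$ContentLength`napplication/json`nx-ms-date:$Rfc1123Date`n/api/logs"
    $hmac = [System.Security.Cryptography.HMACSHA256]::new([Convert]::FromBase64String($SharedKey))
    $hash = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))
    "SharedKey ${WorkspaceId}:$hash"
}
```

You would send that in the Authorization header, along with Log-Type (your custom log name) and x-ms-date headers – which is exactly the busywork the Logic App action saves us.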

The actual Logic App is pretty straightforward – just a timer, followed by an RSS feed grabbing action, then a straight port of the data into the Log Analytics data ingestion action. Something like this:

That’s pretty straightforward. After about 15 minutes you should see the data in your workspace:

Notice how the name we gave to the custom log in the Logic App is here, but has “_CL” appended to it? The workspace does this automatically to any custom log we create – it literally stands for “custom log”. Same thing with the fields in the custom log: the Logic App automatically created the fields, but the workspace appends an underscore and an abbreviation for the field type – ‘_s’ for string, for example.

Now 3 things immediately come to mind – how do we get the right timestamp into the workspace, what are we going to do with the data, and how can we only add new items? Over the next couple of posts, we will go over all three. Depending on your recurrence settings, you might get some dupes, so if you followed along you might want to disable the Logic App for now.

    Why not? Pathfinder 2e API and PowerShell

    In another of my “Why Not?” series – a category of posts that don’t actually set out to show a widespread use, but rather just highlight something cool I found, I present just how much of a nerd I am.

I love tabletop RPG games – DnD, Pathfinder, Shadowrun, Call of Cthulhu, Earthdawn, etc. You name it, I have probably sat around a table playing it with a group of friends. Our new favorite game to play recently – Pathfinder 2e.

    Recently I found something very cool – a really good PF2e character builder/manager called Wanderer’s Guide. I don’t have any contact with the author at all – I am just a fan of the software.

    A bit of background before I continue – I have been looking for an API to find raw PF2e data for quite a while. An app might be in the future, but I simply couldn’t find the data that I wanted. The Archive would be awesome to get API access to, but it’s not to be yet.

After moping around for a couple of months, I found this, and a choir of angels began to sing. Wanderer’s Guide has an API, and it is simply awesome. Grab your free API key, and start to follow along.

We are going to make a fairly standard API call first. Let’s craft the header with the API key you get from your profile. This is pretty straightforward:

    $ApiKey = "<Put your Wanderer's Guide API Key here>"
    $header = @{"Authorization" = "$apikey" }

    Next, let’s look at the endpoints we want to access. Each call will access a category of PF2e data – classes, ancestries, feats, heritages, etc… This lists the categories of data available.

    $baseurl = 'https://wanderersguide.app/api'
    $endpoints = 'spell', 'feat', 'background', 'ancestry', 'heritage'

    Now we are going to iterate through each endpoint and make the call to retrieve the data. But – since Wanderer’s Guide is nice enough to provide the API for free, we aren’t going to be jerks and constantly pull the full list of data each time we run the script. We want to only pull the data once (per session), so we will check to see if we have already done it.

foreach ($endpoint in $endpoints) {
    if ((Test-Path "variable:${endpoint}data") -eq $false){
        "Fetching $endpoint data from $baseurl/$endpoint/all"
        New-Variable -Name ($endpoint + 'data') -Force -Value (Invoke-WebRequest -Uri "$baseurl/$endpoint/all" -Headers $header)
    }
}

The trick here is the New-Variable cmdlet – it lets us create a variable with a dynamic name while simultaneously filling it with the web request data. We can check to see if the variable already exists with the Test-Path cmdlet.
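The pattern in isolation looks like this (using a hypothetical ‘widget’ endpoint name rather than the real ones):

```powershell
# Dynamic variable creation + existence check (hypothetical 'widget' name)
$name = 'widget'
if (-not (Test-Path "variable:${name}data")) {
    "Fetching..."    # only happens on the first run in a session
    New-Variable -Name "${name}data" -Value @{ payload = 'expensive API result' }
}
(Get-Variable "${name}data").Value.payload   # 'expensive API result'
```

Run it twice in the same session and the "Fetching..." branch is skipped the second time – exactly the cache behavior we want for a free API.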

Once we have the data, we need to do some simple parsing. Most of it is pretty straightforward – just convert it from JSON and pick the right property – but a couple of the endpoints need a bit more massaging: Heritages and Backgrounds, specifically.

Here is the full script – it’s really handy to use Out-GridView to parse the data. For example, if you want a background that gives training in Athletics – just pull up the background grid and filter away!

    $ApiKey = "<Put your Wanderer's Guide API Key here>"
    $header = @{"Authorization" = "$apikey" }
    $baseurl = 'https://wanderersguide.app/api'
    $endpoints = 'spell', 'feat', 'background', 'ancestry', 'heritage'
foreach ($endpoint in $endpoints) {
    if ((Test-Path "variable:${endpoint}data") -eq $false){
        "Fetching $endpoint data from $baseurl/$endpoint/all"
        New-Variable -Name ($endpoint + 'data') -Force -Value (Invoke-WebRequest -Uri "$baseurl/$endpoint/all" -Headers $header)
        switch ($endpoint){
            'spell' {$Spells = ($spelldata.content|ConvertFrom-Json).psobject.properties.value.spell|Out-GridView}
            'feat'{$feats = ($featdata.content|ConvertFrom-Json).psobject.properties.value.feat|Out-GridView}
            'ancestry'{$ancestries = ($ancestrydata.content|ConvertFrom-Json).psobject.properties.value.ancestry|Out-GridView}
            'background'{$backgrounds = ($backgrounddata.content|ConvertFrom-Json).psobject.properties|Where-Object {$_.name -eq 'syncroot'}|Select-Object -ExpandProperty value|Out-GridView}
            'heritage'{$heritages = ($heritagedata.content|ConvertFrom-Json).psobject.properties|Where-Object {$_.name -eq 'syncroot'}|Select-Object -ExpandProperty value|Out-GridView}
            default{"Instruction set not defined."}
        }
    }
}


    PowerShell Secrets Gotchas

    PowerShell Secrets Management is released, and it’s off to a very good start, but there are some things you might want to watch out for.

The first one got me almost immediately – right after installing both modules and creating my first store, I tried to create a new secret and was prompted for a password. The problem manifested in 2 different ways:

    “Exception Calling Prompt Unlock Vault” was the first, and occurred when trying to perform pretty much any cmdlet associated with a store. Deleting and recreating the store made no difference.

    The second issue was an exception claiming a null value was passed as a password, when it clearly wasn’t the case:

    "Cannot convert null to 'Microsoft.PowerShell.SecretStore.Authenticate' because it is a non-nullable value type"

There is good news, though – both issues can be solved with a simple Reset-SecretStore.

    The next is an odd one – the scope for vaults is limited to the current user. You can’t add a vault with AllUsers, for example:

    PS  C:\blog (8:10:28 PM) > Set-SecretStoreConfiguration -Scope AllUsers
    Set-SecretStoreConfiguration: AllUsers scope is not yet supported.

So this means that you can’t create a store with your normal account and access it with a service account or admin account. The only two allowed values are “CurrentUser” and “AllUsers”, but the cmdlet fails with the above error if you try AllUsers. This could potentially be a deal breaker for some, but the error message hints that support might be coming in the future.

So that’s it! A quick one this time, but I hope it helps save you a few minutes of frustration.

    EASY PowerShell API Endpoint with FluentD

    One of the biggest problems that I have had with PowerShell is that it’s just too good. I want to use it for everything. Need to perform automation based on monitoring events? Pwsh. Want to update rows in a database when someone clicks a link on a webpage? Pwsh. Want to automate annoying your friends with the push of a button on your phone? Pwsh. I use it for everything. There is just one problem.

    I am lazy and impatient. Ok, that’s two problems. Maybe counting is a third.

I want things to happen instantly. I don’t want to schedule something in Task Scheduler. I don’t want to have to run a script manually. I want an API-like endpoint that will allow me to trigger my shenanigans immediately.

    Oh, and I want it to be simple.

    Enter FluentD. What is Fluentd you ask? From the website – “Fluentd allows you to unify data collection and consumption for a better use and understanding of data.” I don’t necessarily agree with this statement, though – I believe it’s so much more than that. I view it more like an integration engine with a wide community of plug-ins that allow you to integrate a wide variety of toolsets. It’s simple, light-weight, and quick. It doesn’t consume a ton of resources sitting in the background, either. You can run it on a ton of different platforms too – *nix, windows, docker, etc… There is even a slim version for edge devices – IOT or small containers. And I can run it all on-prem if I want.

    What makes it so nice to use with PowerShell is that I can have a web API endpoint stood up in seconds that will trigger my PowerShell scripts. Literally – it’s amazingly simple. A simple config file like this is all it takes:

<source>
  @type http
  port 9880
</source>

Boom – you have a listener on port 9880 ready to accept data. If you want to run a PowerShell script from the data it receives, just expand your config file a little:

<source>
  @type http
  port 9880
</source>
<match **>
  @type exec
  command "e:/tasks/pwsh/pwsh.exe -file e:/tasks/pwsh/events/start-annoyingpeople.ps1"
  <format>
    @type json
  </format>
  <buffer>
    flush_interval 2s
  </buffer>
</match>

With this config file, you are telling FluentD to listen on port 9880 (e.g. http://localhost:9880/automation) for traffic. If it sees a JSON payload (POST request) on that port, it will execute the command specified – in this case, my script to amuse me and annoy my friends. All I have to do is run this as a service on my Windows box (or a process on *nix, of course) and I have a fully functioning PowerShell-executing web API endpoint.
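On the PowerShell side, fluentd’s exec output plugin hands your script the path of the flushed buffer file as the last command-line argument, so a hypothetical start-annoyingpeople.ps1 skeleton might look like this (the function and field names here are mine, purely for illustration):

```powershell
# Hypothetical skeleton for start-annoyingpeople.ps1 - fluentd's exec output
# passes the path of a buffer file of newline-delimited JSON as the last argument.
function Invoke-FluentdBuffer {
    param([string]$BufferFile)
    foreach ($line in (Get-Content $BufferFile)) {
        $payload = $line | ConvertFrom-Json
        # ...your shenanigans go here...
        Write-Output "Triggered with: $($payload | ConvertTo-Json -Compress)"
    }
}

if ($args.Count -ge 1) { Invoke-FluentdBuffer -BufferFile $args[-1] }
```

Each flush interval gets you one buffer file, so a single invocation may carry several events – hence the loop.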

    It doesn’t have to just be web, either. They have over 800 plug-ins for input and output channels. Want SNMP traps to trigger your scripts? You can do it. How about an entry in a log starting your PowerShell fun? Sure! Seriously – take a look at FluentD and how it can up your PowerShell game immensely.

    Use a Config File with PowerShell

    How many times has this occurred?

“Hey App Owner – all of these PowerShell automation scripts you had our team create started failing this weekend. Did something change?”
    “Oh yeah Awesome Automation Team. We flipped over to a new database cluster on Saturday, along with a new web front end. Our API endpoints all changed as well. Didn’t anyone tell you? How long will it take to update your code?”

    Looking at dozens of scripts you personally didn’t create: “Ummmmm”

This scenario happens all of the time, and there are probably a hundred different ways to deal with it. I personally use a config file, and for very specific reasons. If you are coding in PowerShell 7+ (and you should be), then making your code cross-platform compatible should be at the top of your priority list. In the past, I would store things like database connection strings or API endpoints in the Windows registry, and my scripts would just reference that reg key. This worked great – I didn’t have the same property listed in a dozen different scripts, and I only had to update the property in one place. Once I started coding with cross-plat in mind, I obviously had to change my thinking. This is where a simple JSON config file comes in handy. A config file can be placed anywhere, and the structured JSON format makes creating, editing, and reading easy to do. Using a config file allows me to take my whole codebase and move it from one system to another, regardless of platform.

    Here is what a sample config file might look like:

{
    "Environments": [
        {
            "Name": "Prod",
            "CMDBConnectionString": "Data Source=cmdbdatabaseserverAG;MultiSubnetFailover=True;Initial Catalog=CMDB;Integrated Security=SSPI",
            "LoggingDBConnectionString": "Data Source=LoggingAG;MultiSubnetFailover=True;Initial Catalog=Logs;Integrated Security=SSPI",
            "Servers": [
                {
                    "ServerName": "prodserver1",
                    "ModulesDirectory": "e:\\tasks\\pwsh\\modules",
                    "LogsDirectory": "e:\\tasks\\pwsh\\logs\\"
                },
                {
                    "ServerName": "prodserver2",
                    "ModulesDirectory": "e:\\tasks\\pwsh\\modules",
                    "LogsDirectory": "e:\\tasks\\pwsh\\logs\\"
                }
            ]
        },
        {
            "Name": "Dev",
            "CMDBConnectionString": "Data Source=testcmdbdatabaseserverAG;MultiSubnetFailover=True;Initial Catalog=CMDB;Integrated Security=SSPI",
            "LoggingDBConnectionString": "Data Source=testLoggingAG;MultiSubnetFailover=True;Initial Catalog=Logs;Integrated Security=SSPI",
            "Servers": [
                {
                    "ServerName": "testserver1",
                    "ModulesDirectory": "e:\\tasks\\pwsh\\modules",
                    "LogsDirectory": "e:\\tasks\\pwsh\\logs\\"
                },
                {
                    "ServerName": "testserver2",
                    "ModulesDirectory": "e:\\tasks\\pwsh\\modules",
                    "LogsDirectory": "e:\\tasks\\pwsh\\logs\\"
                }
            ]
        }
    ]
}
    And here is what the code that parses it looks like in each script:

    $config = get-content $PSScriptRoot\..\config.json|convertfrom-json
    $ModulesDir = $config.Environments.Servers|where-object {$_.ServerName -eq [Environment]::MachineName}|select-object -expandProperty ModulesDirectory
    $LogsDir = $config.Environments.Servers|where-object {$_.ServerName -eq [Environment]::MachineName}|select-object -expandProperty LogsDirectory
    $DBConnectionString = $config.Environments|where-object {[Environment]::MachineName -in $_.Servers.ServerName}|select-object -expandProperty CMDBConnectionString

    Note that in this particular setup, I use the server name of the server actually running the script to determine if the environment is production or not. That is completely optional – your config file might be as simple as a couple of lines – something like this:

{
    "CMDBConnectionString": "Data Source=cmdbdatabaseserverAG;MultiSubnetFailover=True;Initial Catalog=CMDB;Integrated Security=SSPI",
    "LoggingDBConnectionString": "Data Source=LoggingAG;MultiSubnetFailover=True;Initial Catalog=Logs;Integrated Security=SSPI"
}

When you use a simple file like this, you don’t need to parse the machine name – something like “$DBConnectionString = $config.CMDBConnectionString” would work great!

Now you can simply update the connection string or any other property in one location and start your testing. No need to “Find in All Files” or search and replace across all your scripts. An added plus of using a config file over something like a registry key is that the code stays cross-platform compatible. I’ve used JSON here due to personal preference, but you could use any type of flat file you wanted, including a simple text file. If you use a central command and control tool – something like SMA – then those properties are already stored for you, but this method is handy for those that don’t have that capability or simply want to prepare their scripts for running on K8s or Docker.

    Let me know what you think about this – Do you like this approach or do you have another method you like better? Hit me up on Twitter – @donnie_taylor