#tsql2sday #130 – Automate your stress away – Getting more SSIS Agent Job information

Automation

T-SQL Tuesday was started by Adam Machanic (blog|twitter) and is hosted by a different person each month. The host selects the theme and then the blogging begins. Worldwide, on the second Tuesday of the month (all day, based on GMT), bloggers attend this party by blogging about the theme. This month it is hosted by Elizabeth Noble (blog and twitter).

Thank you Elizabeth

Elizabeth asks

My invitation to you is I want to know what you have automated to make your life easier?

From the Past

I am in the process of migrating my blog to GitHub Pages and whilst doing so, I read my first ever technical blog post (You have to start somewhere). In it I mention this blog post by John Sansom, The Best Database Administrators Automate Everything, which I am pleased to see is still available nearly a decade later.

Here is a quote from his blog entry

## Automate Everything

>That’s right, I said everything. Just sit back and take the _time_ to consider this point for a moment. Let it wander around your mind whilst you consider the processes and tasks that you could look to potentially automate. Now eliminate the word _potentially_ from your vocabulary and evaluate how you could automate **e-v-e-r-y-t-h-i-n-g** that you do.
>Even if you believe that there is only a remote possibility that you will need to repeat a given task, just go ahead and automate it anyway! Chances are that when the need to repeat the process comes around again, you will either be under pressure to get it done, or even better have more important _Proactive Mode_ tasks/projects to be getting on with

I love Automation

I have tried my best at all times to follow this advice over the last decade and, for the most part, I am happy that I have managed it.

– I use PowerShell (a lot!) to automate all sorts of routine tasks including migrating this blog
– I use Jupyter Notebooks to enable myself and others to automate run books, training, documentation, demonstrations and incident response. You can find my notebooks here
– I use Azure DevOps to automate infrastructure creation and changes with Terraform, as well as the delivery and unit testing of code changes.
– I use GitHub Actions to build this blog and to publish the ADSNotebook module
– I use Chocolatey to install and update software
– I have used Desired State Configuration to ensure that infrastructure is as it is expected to be

At every point I am looking for a means to automate the thing I am doing, because it is almost guaranteed that at some point in the future I will need to do it again, or to do it slightly differently.

What's the last thing that you automated?

Following my blog post about Notifying a Teams Channel of a SQL Agent Job result, I was asked if this could be tweaked to reduce the time spent getting information about SSIS execution failures.

Finding SSIS failures

When you run an SSIS package in an Agent Job and it fails, the Agent Job History shows something along these lines

The job failed. The Job was invoked by User MyDomain\MyUserName. The last step to run was step 1 (scheduling ssis package).
Executed as user: NT Service\SQLSERVERAGENT. Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 4:17:12 PM Package execution on IS Server failed. **Execution ID: 123456789**, Execution Status:4. To view the details for the execution, right-click on the Integration Services Catalog, and open the [All Executions] report Started: 4:17:12 PM Finished: 4:17:12 PM Elapsed: 4.493 seconds. The package execution failed. The step failed.

The next step is to open SSMS, go to the SSISDB, click through to the SSIS reports and then scroll through to find the package and then the message. This is not particularly efficient and the SSIS reports are not known for their speedy execution!

This meant that the team member responsible for checking in the morning could see which instance and which job had failed from the Teams message, but then had to manually follow the above steps to find an error message that they could take action on.

Automate it

In the SSISDB database there is a `catalog.event_messages` view, so if I could query that and filter by the Execution ID then I could get the message and place it into the Teams message. Now the Teams message contains the error for the SSIS execution, and each time this happens it probably saves the team member four or five minutes 🙂

In the code below, I

1. check if the failure comes from an SSIS instance
`if($Inst -in ($SSISInstances)){`
2. Get the Execution ID from the Error message
`$ExecutionId = [regex]::matches($BaseerrMessage, 'Execution ID: (\d{3,})').groups[1].value`
3. Create a query for the SSISDB

`$SSISQuery = @"`
`SELECT * FROM catalog.event_messages em`
`WHERE em.operation_id = $ExecutionId`
`AND (em.event_name = 'OnError')`
`ORDER BY em.event_message_id;`
`"@`

4. Set the Error Message and the Execution Path to variables
`$errMessage = $SSISQueryResults.Message`
`$ExecutionPath = $SSISQueryResults.execution_path`
5. Get the Error Message for non-SSIS failures
`}else{`
`$errMessage = $j.group[-1].Message`
`$ExecutionPath = ‘the job’`
`}`
6. Create the Teams message

You will see that I used `SELECT *` because someone will always ask for some extra information in the future!

The full script is below. Happy Automating!

   $webhookurl = "https://outlook.office.com/webhook/ the rest of it here" 
    $SSISInstances = # to identify SSIS instances
    $ProdInstances = # ALL instances for checking
    $startdate = (Get-Date).AddHours(-1)
    
    $AllFailedJobs = foreach ($Instance in $ProdInstances) {
        Write-Host "Connecting to $instance"
        try{
            $smo = Connect-DbaInstance $Instance -ErrorAction Stop
            Write-Host "Connected successfully to $instance"
        }
        catch{
            Write-Host "Failed to connect to $Instance" 
            $errorMessage = $_ | Out-String
            Write-Host $errorMessage
            Continue
        }
    
        Write-Host "Getting Agent Jobs on $instance"
        try {
            $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
            Write-Host "Successfully got Agent Jobs on $instance"
        }
        catch {
            Write-Host "Failed to get agent jobs on $Instance" 
            $errorMessage = $_ | Out-String
            Write-Host $errorMessage
            Continue
        }
        
        $jobs = $agentJobs # | Where-Object { $Psitem.Job -match '^Beard-\d\d\d\d\d' -or $Psitem.Job -like 'BeardJob*' } # if you need to filter
        $FailedJobs = $jobs | Where-Object { $Psitem.Status -ne 'Succeeded' }
        $FailedJobs | Group-Object Job 
        try{
            $smo.ConnectionContext.Disconnect()
            Write-Host "Disconnecting $instance"
        }
        catch{
            Write-Host "Failed disconnect from  $Instance" 
            $errorMessage = $_ | Out-String
            Write-Host $errorMessage
            Continue
        }
    }
    Write-Host "We have  $($AllFailedJobs.Count) Failed Jobs"
    
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true     }
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    
    foreach ($j in $AllFailedJobs) {
     
    $Inst = $j.group[-1].SqlInstance
    $jName = $j.name
    $sname = $j.group[-1].StepName
    $edate = $j.group[-1].EndDate
    if($Inst -in ($SSISInstances)){
        $BaseerrMessage = $j.group[-1].Message
        $ExecutionId = [regex]::matches($BaseerrMessage, 'Execution ID: (\d{3,})').groups[1].value
    $SSISQuery = @"
    SELECT * FROM catalog.event_messages em 
    WHERE em.operation_id = $ExecutionId 
    AND (em.event_name = 'OnError')
    ORDER BY em.event_message_id;
    "@
    
    $SSISQueryResults = Invoke-DbaQuery -SqlInstance $Inst -Database SSISDB -Query $SSISQuery
    $errMessage = $SSISQueryResults.Message
    $ExecutionPath = $SSISQueryResults.execution_path
    }else{
        $errMessage = $j.group[-1].Message
        $ExecutionPath = 'the job'
    }
    
    $Text =  @"
    # **$Inst**   
    ## **$JName**  
    - The Job step that failed is - **$sname**  
    - It failed at - **$edate**  
    - It failed in $ExecutionPath with the message   
    - $errMessage   
    "@
    
    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "There was a Job Failure"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Job Failures "
                "activitySubtitle" = "in the Last 1 hour"
                "activityImage"    = "https://blog.robsewell.com/assets/images/sobrob.jpg"
                "text"             = $text
                "markdown"         = $true
            }
        )
    }
     
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
    
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
     
        Invoke-RestMethod @parameters
    }
    
    if(-not $AllFailedJobs){
      
            $JSONBody = [PSCustomObject][Ordered]@{
                "@type"      = "MessageCard"
                "@context"   = "http://schema.org/extensions"
                "summary"    = "There were no job failures in the last hour at $(Get-Date)"
                "themeColor" = '0078D7'
                "sections"   = @(
                    @{
                        "activityTitle"    = "There were no job failures at $(Get-Date)"
                        "activitySubtitle" = "in the Last hour"
                        "activityImage"    = "https://blog.robsewell.com/assets/images/happyrob.jpg"
                        "text"             = "All is well"
                        "markdown"         = $true
                    }
                )
            }
         
            $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
         
            $parameters = @{
                "URI"         = $webhookurl
                "Method"      = 'POST'
                "Body"        = $TeamMessageBody
                "ContentType" = 'application/json'
            }
            Invoke-RestMethod @parameters
    }

Creating Azure SQL Database AAD Contained Database Users with an SPN using PowerShell, Secrets Management, Azure Key Vault, and dbatools

Following on from my posts about using Secret Management, Good bye Import-CliXml and Running programmes as a different user, I have another use case.

After creating Azure SQL Databases in an Elastic Pool using a process pretty similar to this one I blogged about last year, I needed to be able to programmatically create users and assign permissions.

I need a user to log in with

When I created my Azure SQL Server with Terraform, I set the Azure Admin to be an SPN, as you can see in the image from the portal, and set it to have an identity using the documentation for azurerm_mssql_server.

This allows this user to manage the access for the SQL Server as long as the SQL Server Azure AD identity has Directory Reader privileges. The SQL Server is called temp-beard-sqls and as you can see the identity is assigned to the role.
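If you are not managing the server with Terraform, a similar setup can be sketched with the Az PowerShell module; this is only an illustration and the resource group name and SPN object id below are placeholders:

# Give the Azure SQL logical server a system-assigned managed identity
Set-AzSqlServer -ResourceGroupName 'temp-beard-rg' -ServerName 'temp-beard-sqls' -AssignIdentity

# Make the SPN the Azure AD admin of the server (display name and object id of the SPN)
Set-AzSqlServerActiveDirectoryAdministrator -ResourceGroupName 'temp-beard-rg' `
    -ServerName 'temp-beard-sqls' `
    -DisplayName 'ServicePrincipalName' `
    -ObjectId '00000000-0000-0000-0000-000000000000'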

The privileges required to do this for a single identity are quite high

So now you can assign an Azure Active Directory group to that role and allow less-privileged users to add the identity to this group. The documentation is here and there is a tutorial here explaining the steps you need to take.

What is an Azure SPN?

An Azure service principal is an identity created for use with applications, hosted services, and automated tools to access Azure resources.

https://docs.microsoft.com/en-us/cli/azure/create-an-azure-service-principal-azure-cli?toc=%2Fazure%2Fazure-resource-manager%2Ftoc.json&view=azure-cli-latest

I created the SPN using Azure CLI straight from the Azure Portal by clicking this button

and running

az ad sp create-for-rbac --name ServicePrincipalName

This will quickly create an SPN for you and return the password

Yes I have deleted this one

Add Azure Key Vault to Secret Management

In my previous posts, I have been using the default key vault, which is limited to your local machine and the user that is running the code. It would be better to use Azure Key Vault to store the details for the SPN so that they are safely stored in the cloud and not on my machine, and also so that anyone (or any app) that has permissions to the vault can use them.

First you need to log in to Azure in PowerShell (you will need to have the Az modules installed)

Connect-AzAccount

Be aware, the login box can appear behind the VS Code or Azure Data Studio window!

Once connected, if you have several Azure subscriptions, you can list them with

Get-AzSubscription

You can choose your subscription with

$AzureSubscription = Set-AzContext -SubscriptionName "NAME OF SUBSCRIPTION"

For the Secret Management Module to manage the Azure Key Vault, you first need to register it.

Ensure that you have permissions to connect by following the details in the network security documentation https://docs.microsoft.com/en-us/azure/key-vault/general/network-security and the secure access documentation https://docs.microsoft.com/en-us/azure/key-vault/general/secure-your-key-vault

Then you can run Register-SecretVault. You need to provide the local name for the key vault, the module name Az.KeyVault, and a VaultParameters hashtable with the Key Vault name and the Azure subscription ID. You can register other types of key vault with the Secret Management module in this way and they will require different values for the VaultParameters parameter.

$KeyVaultName = 'beard-key-vault'
Register-SecretVault -Name BeardKeyVault -ModuleName Az.KeyVault -VaultParameters @{ AZKVaultName = $KeyVaultName; SubscriptionId = $AzureSubscription.Subscription }

Adding the SPN details to the Azure Key Vault

Use the value for the AppID (note: NOT the display name) and the value for the password from the Azure CLI output, or create a new secret for the SPN with PowerShell or via the portal. You can then use the following code to add the SPN details and the tenant id to the Azure Key Vault using the Secret Management module

$ClientId = Read-Host "Enter ClientID" -AsSecureString
$SecretFromPortal = Read-Host "Enter Client Secret" -AsSecureString 
$tenantid = Read-Host "Enter TenantId" -AsSecureString 
Set-Secret -Vault BeardKeyVault -Name service-principal-guid -Secret $ClientId
Set-Secret -Vault BeardKeyVault -Name service-principal-secret -SecureStringSecret $SecretFromPortal
Set-Secret -Vault BeardKeyVault -Name Tenant-Id -Secret $tenantid

You can also do this with the Az.KeyVault module by following the instructions here
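For reference, a minimal sketch of doing the same thing directly with the Az.KeyVault module (using the same vault and secret names as above):

$ClientId = Read-Host "Enter ClientID" -AsSecureString
$SecretFromPortal = Read-Host "Enter Client Secret" -AsSecureString

# Set-AzKeyVaultSecret takes a SecureString for the secret value
Set-AzKeyVaultSecret -VaultName 'beard-key-vault' -Name 'service-principal-guid' -SecretValue $ClientId
Set-AzKeyVaultSecret -VaultName 'beard-key-vault' -Name 'service-principal-secret' -SecretValue $SecretFromPortal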

You can see the secrets in the portal

and also at the command line with the Secret Management module using

Get-SecretInfo -Vault RegisteredNameOfVault

Can my user connect?

If I try to connect to the temp-sql-db-beard database on my Azure SQL Database server in Azure Data Studio with my AAD account, it fails.

By the way a great resource for troubleshooting the SQL error 18456 failure states can be found here https://sqlblog.org/2020/07/28/troubleshooting-error-18456

dbatools to the rescue 🙂

dbatools is an open source community collaboration PowerShell module for administering SQL Server. You can find more about it at dbatools.io and get the book that Chrissy and I are writing about dbatools at dbatools.io\book

You can connect to Azure SQL Database with an Azure SPN using the following code. It will get the secrets from the Azure Key Vault that were set above and create a connection. Let's see if I can run a query as the SPN.

$SqlInstance = 'temp-beard-sqls.database.windows.net'
$databasename = 'master'
$appid = Get-Secret -Vault BeardKeyVault -Name service-principal-guid  -AsPlainText
$Clientsecret = Get-Secret -Vault BeardKeyVault -Name service-principal-secret
$credential = New-Object System.Management.Automation.PSCredential ($appid,$Clientsecret)
$tenantid = Get-Secret -Vault BeardKeyVault -Name Tenant-Id -AsPlainText
$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $databasename  -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 

Invoke-DbaQuery -SqlInstance $AzureSql -Database master  -SqlCredential $credential -Query "Select SUSER_NAME() as 'username'" 

Excellent 🙂

Add a user to the user database

I can then add my user to the temp-sql-db-beard database. I need to create a new connection to the user database, as you cannot use the USE [DatabaseName] statement in Azure SQL Database

$Userdatabasename = 'temp-sql-db-beard'

$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $Userdatabasename -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 

Whilst you can use dbatools to create new users in Azure SQL Database, at present you can't create AAD users. You can run a T-SQL script to do this though. This script will create a contained database user in the database. I have added the role membership as well, but this can also be done with Add-DbaDbRoleMember from dbatools, as sketched after the script

$Query = @"
CREATE USER [rob@sewells-consulting.co.uk] FROM EXTERNAL PROVIDER
ALTER ROLE db_datareader ADD MEMBER [rob@sewells-consulting.co.uk]
"@
Invoke-DbaQuery -SqlInstance $AzureSql -Database $Userdatabasename  -SqlCredential $credential -Query $Query
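For completeness, the role membership via dbatools would look something like the sketch below; the parameter names are assumed from the dbatools documentation, so check Get-Help Add-DbaDbRoleMember against your installed version:

# Parameter names assumed; verify with Get-Help Add-DbaDbRoleMember -Full
Add-DbaDbRoleMember -SqlInstance $AzureSql -Database $Userdatabasename `
    -Role db_datareader -User 'rob@sewells-consulting.co.uk' -Confirm:$false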

Let's check the users on the database with dbatools

Get-DbaDbUser -SqlInstance $AzureSql -Database $Userdatabasename  |Out-GridView

I have my user and it is of type External user. Let's see if I can connect

Bingo 🙂

Happy Automating

Because I don't like to see awesome people struggling with PowerShell

Here is the same code using just the Az.KeyVault module

$appid = (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "service-principal-guid").SecretValueText
$Clientsecret = (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "service-principal-secret").SecretValue
$credential = New-Object System.Management.Automation.PSCredential ($appid,$Clientsecret)
$tenantid =  (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "Tenant-Id").SecretValueText
$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $databasename  -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 


Notifying a Teams Channel of a SQL Agent Job result

Following on from yesterday's post about creating an overview of SQL Agent Job results and sending it to a Teams channel, I was given another challenge

Can you write a job step that I can add to SQL Agent jobs that can send the result of that job to a Teams Channel

A person with a need

The use case was for some migration projects that had steps scheduled via SQL Agent Jobs. Instead of the DBA having to estimate when they would finish and keep checking so that they could let the next team know that it was time for their part to start, they wanted it to notify a Teams channel. This turned out to be especially useful, as the job finished earlier than expected at 3am and the off-shore team could begin their work immediately.

Using SQL Agent Job tokens with PowerShell

You can use SQL Agent job tokens in Job step commands to reference the existing instance or job but I did not know if you could use that with PowerShell until I read Kendra Little’s blog post from 2009.

Thank you Kendra
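For context, a token is simply replaced as text in the job step command before it runs. A PowerShell job step using tokens looks something like the sketch below (the script path is a placeholder; the real command used for this post appears further down):

powershell.exe -File C:\Scripts\NotifyTeams.ps1 -SQLInstance $(ESCAPE_SQUOTE(SRVR)) -JobID $(ESCAPE_NONE(JOBID))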

Nothing is ever as easy as you think

So I thought, this is awesome, I can create a function and pass in the Instance and the JobId and all will be golden.

Nope

job_id <> $(JobID)

If we look in the sysjobs table at the Agent Job whose result we want to notify Teams about,

We can see that the job_id is

dc5937c3-766f-47b7-a5a5-48365708659a

If we look at the JobId property with PowerShell

We get

dc5937c3-766f-47b7-a5a5-48365708659a

Awesome, they are the same

But

If we look at the value of the $(JobID) SQL Agent Job Token,

we get

C33759DC6F76B747A5A548365708659A

which makes matching it to the JobId tricky

I tried all sorts of ways of casting and converting this value in SQL and PowerShell and in the end I just decided to manually convert the value

    $CharArray = $JobID.ToCharArray()

    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' + $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' + $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' + $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' + $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]
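
If you prefer something more compact, the token value is just the hex of the GUID's byte array, so a sketch of an alternative conversion (not what the script below uses) is:

    # Parse each pair of hex characters into a byte and let the Guid constructor
    # rebuild the value - this produces the same GUID as the character shuffling above
    $bytes = for ($i = 0; $i -lt $JobID.Length; $i += 2) {
        [convert]::ToByte($JobID.Substring($i, 2), 16)
    }
    $JobGUID = [guid]::new([byte[]]$bytes)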

Send the information to Teams

Following the same pattern as yesterday's post, I created a function to send a message depending on the outcome of the job and post it to the Teams webhook.

Again, I used PowerShell remoting (New-PSSession and Invoke-Command) to run the Teams notification from a machine that can send the message. (I have also included code below that does not require this, so that you can send the message from the same machine that runs the job if required.)

The code below is saved on a UNC share or on the SQL Server as SingleNotifyTeams.ps1

Param(
    $SqlInstance,
    $JobID
)

$webhookurl = ""

$NotifyServer = 'BeardNUC2'
function Notify-TeamsSQlAgentJob {
    Param(
        $SQLInstance,
        $JobID,
        $webhookurl
    )

    $SQLInstance = $SQLInstance 
    # Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $CharArray = $JobID.ToCharArray()

    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' + $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' + $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' + $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' + $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]

    $Job = Get-DbaAgentJob -SQlInstance $SQLInstance | Where jobid -eq $JobGuiD
    $JobName = $Job.Name
    $Jobsteps = Get-DbaAgentJobStep -SQlInstance $SQLInstance -Job $JobName

    $JobStepNames = $Jobsteps.Name -join ' , '
    $JobStartDate = $job.JobSteps[0].LastRunDate
    $JobStatus = $job.LastRunOutcome
    $lastjobstepid = $jobsteps[-1].id
    $Jobstepsmsg = $Jobsteps | Out-String
    $JobStepStatus = ($Jobsteps | Where-Object {$_.id -ne $lastjobstepid -and $_.LastRunDate -ge $JobStartDate} ).ForEach{
        "   $($_.Name)  - $($_.LastRunDate) **$($_.LastRunOutCome)**  
"
    } 
    
    $Text = @"
# **$SqlInstance**   
## **$JobName**  

$Jobstepsmsg

Started at $JobStartDate 
- The individual Job Steps status was  

$JobStepStatus  


"@

    if (( $jobsteps | Where id -ne $lastjobstepid).LastRunOutcome -contains 'Failed') {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "There was a Job Failure"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Failed"
                    "activitySubtitle" = "Work to do - Please investigate the following job by following the steps in the plan at LINKTOPLAN"
                    "activityImage"    = "https://fit93a.db.files.1drv.com/y4mTOWSzX1AfIWx-VdUgY_Qp3wqebttT7FWSvtKK-zAbpTJuU560Qccv1_Z_Oxd4T4zUtd5oVZGJeS17fkgbl1dXUmvbldnGcoThL-bnQYxrTrMkrJS1Wz2ZRV5RVtZS9f4GleZQOMuWXP1HMYSjYxa6w09nEyGg1masI-wKIZfdnEF6L8r83Q9BB7yIjlp6OXEmccZt99gpb4Qti9sIFNxpg"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }
    else {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "The Job Succeeded"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Succeeded"
                    "activitySubtitle" = "All is well - Please continue with the next step in the plan at LINKTOPLAN"
                    "activityImage"    = "https://6f0bzw.db.files.1drv.com/y4mvnTDG9bCgNWTZ-2_DFl4-ZsUwpD9QIHUArsGF66H69zBO8a--FlflXiF7lrL2H3vgya0ogXIDx59hn62wo2tt3HWMbqnnCSp8yPmM1IFNwZMzgvSZBEs_n9B0v4h4M5PfOY45GVSjeFh8md140gWHaFpZoL4Vwh-fD7Zi3djU_r0PduZwNBVGOcoB6SMJ1m4NmMmemWr2lzBn57LutDkxw"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }

    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $NotifyCommand = {
    $parameters = @{
        "URI"         = $Using:webhookurl
        "Method"      = 'POST'
        "Body"        = $Using:TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters
}
    $Session = New-PSSession -ComputerName $NotifyServer
    Invoke-Command -Session $Session -ScriptBlock $NotifyCommand
}

$msg = 'ServerName  = ' + $SQLInstance + 'JobId = ' + $JobID
Write-Host $msg
Notify-TeamsSQLAgentJob -SQlInstance $SqlInstance -JobID $JobID -webhookurl $webhookurl

Then it can be called in a SQL Agent job step, again following the guidelines at dbatools.io/agent

It is called slightly differently, as you need to pass in the SQL Agent tokens as parameters to the script

powershell.exe -File path to Notify-TeamsSQLAgentJob.ps1 -SQLInstance  $(ESCAPE_SQUOTE(SRVR)) -JobID  $(ESCAPE_NONE(JOBID))

SQL Agent Job Step Success and Failure

We need to take another step to ensure that this works as expected: we have to change the On Failure action for each job step to go to the "Notify Teams" step.
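If there are a lot of steps, you can set that On Failure action without clicking through the GUI; a sketch using sp_update_jobstep (the job name and step ids are placeholders for your own):

$Query = @"
EXEC msdb.dbo.sp_update_jobstep
    @job_name        = N'My Migration Job',
    @step_id         = 1,
    @on_fail_action  = 4,   -- 4 = go to the step specified by on_fail_step_id
    @on_fail_step_id = 5;   -- the id of the 'Notify Teams' step
"@
Invoke-DbaQuery -SqlInstance 'MyInstance' -Database msdb -Query $Query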

Making people smile

You can also add images (make sure the usage rights allow) so that the success notification can look like this

and the failure looks like this

Happy Automating !

Here is the code that does not require remoting to another server to send the message

Param(
    $SqlInstance,
    $JobID
)

$webhookurl = "https://outlook.office.com/webhook/5a8057cd-5e1a-4c84-9227-74a309f1c738@b122247e-1ebf-4b52-b309-c2aa7436fc6b/IncomingWebhook/affb85f05804438eb7ffb57665879248/f32fc7e6-a998-4670-8b33-635876559b80"

function Notify-TeamsSQlAgentJob {
    Param(
        $SQLInstance,
        $JobID,
        $webhookurl
    )

    $SQLInstance = $SQLInstance 
    # Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $CharArray = $JobID.ToCharArray()

    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' + $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' + $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' + $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' + $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]

    $Job = Get-DbaAgentJob -SQlInstance $SQLInstance | Where jobid -eq $JobGuiD
    $JobName = $Job.Name
    $Jobsteps = Get-DbaAgentJobStep -SQlInstance $SQLInstance -Job $JobName

    $JobStepNames = $Jobsteps.Name -join ' , '
    $JobStartDate = $job.JobSteps[0].LastRunDate
    $JobStatus = $job.LastRunOutcome
    $lastjobstepid = $jobsteps[-1].id
    $Jobstepsmsg = $Jobsteps | Out-String
    $JobStepStatus = ($Jobsteps | Where-Object {$_.id -ne $lastjobstepid -and $_.LastRunDate -ge $JobStartDate} ).ForEach{
        "   $($_.Name)  - $($_.LastRunDate) **$($_.LastRunOutCome)**  
"
    } 
    
    $Text = @"
# **$SqlInstance**   
## **$JobName**  

$Jobstepsmsg

Started at $JobStartDate 
- The individual Job Steps status was  

$JobStepStatus  


"@

    if (( $jobsteps | Where id -ne $lastjobstepid).LastRunOutcome -contains 'Failed') {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "There was a Job Failure"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Failed"
                    "activitySubtitle" = "Work to do - Please investigate the following job by following the steps in the plan at LINKTOPLAN"
                    "activityImage"    = "https://fit93a.db.files.1drv.com/y4mTOWSzX1AfIWx-VdUgY_Qp3wqebttT7FWSvtKK-zAbpTJuU560Qccv1_Z_Oxd4T4zUtd5oVZGJeS17fkgbl1dXUmvbldnGcoThL-bnQYxrTrMkrJS1Wz2ZRV5RVtZS9f4GleZQOMuWXP1HMYSjYxa6w09nEyGg1masI-wKIZfdnEF6L8r83Q9BB7yIjlp6OXEmccZt99gpb4Qti9sIFNxpg"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }
    else {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "The Job Succeeded"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Succeeded"
                    "activitySubtitle" = "All is well - Please continue with the next step in the plan at LINKTOPLAN"
                    "activityImage"    = "https://6f0bzw.db.files.1drv.com/y4mvnTDG9bCgNWTZ-2_DFl4-ZsUwpD9QIHUArsGF66H69zBO8a--FlflXiF7lrL2H3vgya0ogXIDx59hn62wo2tt3HWMbqnnCSp8yPmM1IFNwZMzgvSZBEs_n9B0v4h4M5PfOY45GVSjeFh8md140gWHaFpZoL4Vwh-fD7Zi3djU_r0PduZwNBVGOcoB6SMJ1m4NmMmemWr2lzBn57LutDkxw"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }

    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters
}

$msg = 'ServerName  = ' + $SQLInstance + 'JobId = ' + $JobID
Write-Host $msg
Notify-TeamsSQLAgentJob -SQlInstance $SqlInstance -JobID $JobID -webhookurl $webhookurl

Sending a SQL Agent Job results overview to a Microsoft Teams Channel

Microsoft Teams is fantastic for collaboration. It enables groups of people (teams, if you like) to communicate, collaborate on documents, hold meetings and much, much more.

SQL Agent Job Overview

Using dbatools, we can create a simple script to gather the results of Agent Jobs from a list of instances. Maybe it would be good to get the job run results every 12 hours, so that at 6am the early-bird DBA can quickly identify any failures that need immediate action, and at 6pm the team can check that everything was OK before they clock off.

Here is an example of such a script

$SqlInstances = (Get-Vm -ComputerName BEARDNUC,BEARDNUC2).Where{$_.State -eq 'Running' -and $_.Name -like '*SQL*'}.Name
$startdate = (Get-Date).AddHours(-12)
$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

and an example of running it.

Create a Teams Channel

If you have permissions, you can create a new Teams channel by clicking on the three dots (…) and choosing Add channel

Then fill in the blanks

Create a Webhook Connector for the channel

Next, you need a connector for the channel. Click on the three dots for the channel and click on Connectors.

Then you can choose the Incoming Webhook connector and click Configure.

Give the connector a name, upload an image if you wish, and click Create.

The resulting screen will give you a URL that you can copy. If you need to find it again, use the three dots again, click Connectors and look at Configured. You can then choose the webhook that you have created, click Manage and you will find the URL.

Send to Teams using PowerShell

Now you can send a message to that Teams channel using PowerShell. You will need to add the webhook URL from your Teams connector

[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webhookurl = ""

    $Text =  @"
# Here is a Title

and a message

Image is from

https://www.flickr.com/photos/157270154@N05/38494483572

Photo by CreditDebitPro
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "This is my summary"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Something Important "
                "activitySubtitle" = "I have something to say"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $text
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters

The code above will send a message that looks like this

Running as a SQL Agent Job

Now we can run this code as a SQL Agent Job and schedule it. However, you may not be able to run that code on your SQL Server: it cannot connect to the internet, so how can we contact the Teams webhook?

There are probably a number of ways to do this, but the solution that I took was to allow a proxy account the ability to use PSRemoting and run the part of the script that connects to Teams on a different machine that does have connectivity.

The script I used was as follows. You will need to add in the SQL instances, or better still dynamically gather them from your source of truth. You will also need the webhook URL and the name of the server that can connect to Teams.

$SQLInstances = 'SQL2005Ser2003','SQL2008Ser12R2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQL2017N5','SQL2019N20','SQL2019N21','SQL2019N22','SQL2019N5'

$startdate = (Get-Date).AddHours(-12)
$webhookurl = ""
$NotifyServer = 'BeardNUC2'

$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

$NotifyCommand = {
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webhookurl = $Using:webhookurl
 
$allJobsMessage = $Using:AllJobs 
    $Text =  @"
# Overview of SQL Agent Jobs in Production since $($Using:startdate)  

$allJobsMessage
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "Overview for the last 12 hours"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Job Failures "
                "activitySubtitle" = "Overview for the last 12 hours since $($Using:startdate)"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $allJobsMessage
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters
}

$Session = New-PSSession -ComputerName $NotifyServer
Invoke-Command -Session $Session -ScriptBlock $NotifyCommand

Then, follow the steps at dbatools.io/agent to create an agent job to run the script above on an instance with the dbatools module available to the SQL Service account. Use or create a proxy with permissions on the notify server and create an Agent Job.

USE [msdb]
GO

/****** Object:  Job [I am a Job that notifies Teams]    Script Date: 27/07/2020 20:27:27 ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 27/07/2020 20:27:28 ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'12 Hour Teams Notify', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=0, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'This job will notify Teams every 12 hours', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'THEBEARD\SQL_SVC', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

/****** Object:  Step [Notify Teams]    Script Date: 27/07/2020 20:27:28 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Notify Teams', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'CmdExec', 
		@command=N'powershell.exe -File C:\temp\AgentJobs\NotifyTeams.ps1', 
		@flags=0, 
		@proxy_name=N'TheBeardIsMighty'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO


When the job runs

The results are posted to the Teams Channel

If you can run the Agent Job on a machine that can connect to Teams and your SQL Instances then you can remove the need to use a remote session by using this code

$SQLInstances = 'SQL2005Ser2003','SQL2008Ser12R2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQL2017N5','SQL2019N20','SQL2019N21','SQL2019N22','SQL2019N5'

$startdate = (Get-Date).AddHours(-12)
$webhookurl = ""


# Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
 
$allJobsMessage = $AllJobs 
    $Text =  @"
# Overview of SQL Agent Jobs in Production since $($startdate)  

$allJobsMessage
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "Overview for the last 12 hours"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Job Results "
                "activitySubtitle" = "Overview for the last 12 hours since $($startdate)"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $allJobsMessage
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters

Happy automating!

Using Secret Management module to run SSMS, VS Code and Azure Data Studio as another user

Following on from my last post about the Secret Management module, I was asked another question.

> Can I use this to run applications as my admin account?

A user with a beard

It is good practice not to log into your workstation with an account with admin privileges. In many shops, you will need to open applications that can do administration tasks with another set of account credentials.

Unfortunately, people being people, they will often store their admin account credentials in a less than ideal manner (OneNote, Notepad++ etc) to make it easier for themselves, so that when they right-click and run as a different user, they can copy and paste the password.

Use the Secret Management module

Again, I decided to use a notebook to show this as it is a fantastic way to share code and results and because it means that anyone can try it out.

The notebook may not render on a mobile device.

Using the notebook, I can quickly store my admin password safely and then open and run the applications using the credential.
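As a flavour of what is in the notebook, here is a minimal sketch; the secret name is made up, the path to SSMS will depend on your installation, and the SecretManagement module with a registered vault is assumed:

# Store the admin credential once (prompts for the admin user name and password)
Set-Secret -Name AdminCred -Secret (Get-Credential)

# Retrieve it as a PSCredential and launch SSMS as that account
$cred = Get-Secret -Name AdminCred
Start-Process -FilePath 'C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\Common7\IDE\Ssms.exe' -Credential $cred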

Good Bye Import-CliXML – Use the Secrets Management module for your labs and demos

Don’t want to read all this? There are two dotnet interactive notebooks here with the relevant information for you to use.

https://beard.media/dotnetnotebooks

Jaap is awesome

I have to start here. For the longest time, whenever anyone has asked me how I store my credentials for use in my demos and labs, I have always referred them to Jaap Brasser's blog post

https://www.jaapbrasser.com/quickly-and-securely-storing-your-credentials-powershell/

Joel is also awesome!

When people wanted a method of storing credentials that didn't involve files on disk, I would suggest Joel Bennett's module BetterCredentials, which uses the Windows Credential Manager

https://www.powershellgallery.com/packages/BetterCredentials/4.5

Microsoft? Also awesome!

In February, Microsoft released the SecretManagement module for preview.

https://devblogs.microsoft.com/powershell/secrets-management-development-release/

Sydney gave a presentation at the European PowerShell Conference which you can watch on YouTube.

Good Bye Import-CliXML

So now I say: it is time to stop using Import-Clixml for storing secrets and use the Microsoft.PowerShell.SecretsManagement module instead.
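The change in practice is small. A sketch of the before and after (the file path and secret name are placeholders, and the default registered vault is assumed for the new cmdlets):

# The old way: a DPAPI-protected file, tied to this user on this machine
Get-Credential | Export-Clixml -Path .\sqladmin.cred
$cred = Import-Clixml -Path .\sqladmin.cred

# With the Secrets Management module instead
Set-Secret -Name SqlAdmin -Secret (Get-Credential)
$cred = Get-Secret -Name SqlAdmin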

Notebooks are as good as blog posts

I love notebooks, and to show some people who had asked about storing secrets, I have created some. So, because I am efficient (OK, lazy), I have embedded them here for you to see. You can find them in my Jupyter Notebook repository

https://beard.media/dotnetnotebooks

in the Secrets folder

Installing and using the Secrets Management Module

These notebooks may not display on a mobile device unfortunately

Using the Secret Management Module in your scripts

Here is a simple example of using the module to provide the credential for a Docker container and then to dbatools to query the container.

These notebooks may not display on a mobile device unfortunately
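If the embedded notebook does not render for you, the idea looks something like this sketch; the secret name, port and container details are all placeholders and the stored credential is assumed to be for the sa account:

# Get the sa credential from the vault (stored earlier with Set-Secret)
$cred = Get-Secret -Name SqlSaCred
$saPassword = $cred.GetNetworkCredential().Password

# Spin up a SQL Server container that uses that password
docker run -d --name dbatools-sql -p 15789:1433 `
    -e ACCEPT_EULA=Y -e "SA_PASSWORD=$saPassword" `
    mcr.microsoft.com/mssql/server:2019-latest

# Give the container a little time to start, then query it with dbatools
Start-Sleep -Seconds 30
Invoke-DbaQuery -SqlInstance 'localhost,15789' -SqlCredential $cred -Query 'SELECT @@VERSION AS SqlVersion'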

Surprised and Honoured and Proud

I have always been extremely proud to be a Cloud and Datacenter Management MVP, and lucky enough to be involved with both the PowerShell community as well as the Data Platform community.

Today, July 1st is the date that many MVPs receive their renewal email to let them know that they have been awarded for another year. There is a lot of F5’ing and frequent checking of emails and “Have you heard yet?” DMs going around.

When I received the news, I was using Azure DevOps to run PowerShell and Terraform to build an Azure SQL Elastic Pool (yes, I will write a blog post about it!). I love technology and within my work, like many people, I work across many different disciplines. Azure, Azure DevOps, SQL Server and Microsoft Data Platform products are the main focus of my time.

I didn’t notice the significance of the information.

I was pleased as punch to be renewed again, proud that what I do is recognised by Microsoft, honoured to spend another year as an MVP. Then my friends pointed out the big news that I had missed.

Photo by pixpoetry on Unsplash

There are two award categories.

I have been awarded for both Cloud and Datacenter Management and Data Platform.

I am beyond words.

Proud, Surprised and Honoured.

Thank you to all of the people who help and support me. You help more than you will ever know.

I am going to go and prop my jaw shut!

Running Jupyter Notebooks as Agent Jobs

Azure Data Studio is a great tool for connecting with your data platform, whether it is in Azure or on your own hardware. Jupyter Notebooks are fantastic: you can have words, pictures, code and code results all saved in one document.

I have created a repository on my GitHub, https://beard.media/Notebooks, where I have stored a number of Jupyter notebooks, both for Azure Data Studio and the new .NET Interactive notebooks.

Another thing that you can do with notebooks is run them as Agent Jobs and save the results of the run.

Notebooks running T-SQL

This works easily for T-SQL notebooks. I am going to use this one that I created, which uses T-SQL to gather permissions using old code that was in a share somewhere. We can run the notebook to get the permissions and save it, and the results will be available for all time (unless you delete the notebook!)

SQL Agent Extension in Azure Data Studio

In Azure Data Studio, if you press CTRL + SHIFT + X it will open the Extensions tab

You can add extra functionality to Azure Data Studio. Search in the top bar for Agent and press the Install button to install the extension. You can connect to an instance in the Connections tab (CTRL + SHIFT + D), right-click on it and click Manage. This will open up the server dashboard (why isn't it instance dashboard?)

and you will also have the SQL Agent dashboard available

It's pretty neat: it has green and red bars against the jobs showing success or failure, and the larger the bar, the longer the run time. On the left you will see a book icon. Click that.

Notebooks in Agent Jobs

You can create an Agent Job to run a notebook. As a notebook is just a JSON file, it can be stored in a database table. This interface will create two tables: one to store the templates and one for the results. Click New Notebook Job

Then navigate to the notebook and select it.

Choose a database for the storage of the template and the results and one for the execution context.

The name of the job will be the file name of the notebook. You can change this, but there is a bug where you can only enter one character at a time in the name before it changes focus, so beware!

Once the job is created, you will see two tables in the storage database: notebooks.nb_materialized and notebooks.nb_template

The materialised table is empty right now

but the template table has a row for the job, which includes the notebook in JSON format.

If you click on the jobs in the Notebook Jobs window in the SQL Agent extension, you can see more information about the job run

You can also run the job from here. It doesn’t have to be run from here, it is just a normal agent job which you can run or schedule in any normal manner. Running it from here gives a pop-up

You have to refresh to see when the job is finished; it will be red if the job failed, green if it succeeded, or orange if some cells failed, like this!

But this is the good bit. Clicking on that icon will open the notebook that was created by that agent job run. Let's see what we get.

You can see that we have the results of the queries that we wrote in the notebook alongside the documentation (or maybe an explanation of the expected results).
If we scroll down a little (and change the theme colour so that you can see the error):

Msg , Level , State , Line 
Duplicate column names are not permitted in SQL PowerShell. To repeat a column, use a column alias for the duplicate column in the format Column_Name AS New_Name.

We have got an error from running the code via SQL PowerShell, which is how the job is run. This error is also inserted into the notebooks.nb_template table.

I edited the notebook locally to remove that block of code

Then edited the job and selected the updated notebook

and re-ran the job and got a green tick.

Now I can open the notebook from the latest run, but notice that from this view I can also open the previous notebook.

If I look in the nb_template table, the last_run_notebook_error has cleared

and if I look in the nb_materialized table I can see two rows, one for each job run. The error from the first run is also stored in this table. The notebook column has the JSON for the notebook if you wish to access it in a different manner.
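If you would rather look at those tables from PowerShell than from the extension, a quick sketch with dbatools (the instance and storage database names are placeholders):

# The template row, including the last_run_notebook_error column mentioned above
Invoke-DbaQuery -SqlInstance 'localhost' -Database 'NotebookStorage' -Query 'SELECT * FROM notebooks.nb_template'

# One row per job run, including the notebook JSON for each materialised run
Invoke-DbaQuery -SqlInstance 'localhost' -Database 'NotebookStorage' -Query 'SELECT * FROM notebooks.nb_materialized'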

Tomorrow, we will see what the job steps look like and how to make this run on an instance which does not and cannot have the required PowerShell.

Spoiler Alert – May contain dbatools 🙂

Use Jupyter Notebooks to Help People on StackOverFlow

I am sat in the PowerShell Saturday in Hamburg. You can see me on the right of this picture writing my previous blog post!

I was talking with my friend Mathias Jessen @IISResetMe on Twitter about notebooks and he said that another great use case was to use them on Stack OverFlow

Now Mathias is an active answerer on Stack OverFlow

and he puts a lot of effort into writing his answers, formatting them, and including code and results. Basically exactly the same as a notebook. However, with a notebook, you can enable people to run the code on their own machines as well.

Mathias says he will use notebooks to help people when he answers their PowerShell questions on Stack OverFlow. If you are a Stack OverFlow Answerer then you can too.

.NET PowerShell Notebooks – Using Pester

My last post had a lot of information about the new .NET PowerShell notebooks including installation instructions.

.NET Notebooks are Jupyter Notebooks that use .NET core to enable C#, F# and PowerShell kernels.

Use Cases

One of the main benefits that I see for Jupyter Notebooks for Ops folk is that the results of the query are saved with the notebook. This makes them fantastic for Incident resolution.

If you have an incident at 3am and you know that you will need that information in the wash-up meeting the next day, then instead of copying and pasting results into a OneNote document or a text file, you can simply run the queries in a notebook and save it.

In the meeting, you can simply open the notebook and the results will be available for everyone to see.

Even better, if you have a template notebook for those scenarios, you can then compare the results to previous occurrences.

Using Pester

Using Pester to validate that an environment is as you expect it to be is a good resource for incident resolution, potentially enabling you to quickly establish an area to concentrate on for the issue. However, if you try to run Pester in a .NET Notebook you will receive an error

Describe: 
Line |
   3 | Describe "Checking Problem ...... by $($ENV:USERDOMAIN) $($ENV:UserName)" {

     | ^ The 'Describe' command was found in the module 'Pester', but the module could not be loaded. For more information, run 'Import-Module Pester'.

Fixing it

When you try to Import-Module Pester you get the following error

Get-Command: C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psm1
Line |
  94 |     $script:SafeCommands['Get-CimInstance'] = Get-Command -Name Get-CimInstance -Module CimCmdlets @safeCommandLookupParameters

     |                                               ^ The term 'Get-CimInstance' is not recognized as the name of a
     | cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included,
     | verify that the path is correct and try again.
 Import-Module: The module to process 'Pester.psm1', listed in field 'ModuleToProcess/RootModule' of module manifest 'C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psd1' was not processed because no valid module was found in any module directory. 

That's odd, why is it failing there? Dongbo Wang from the PowerShell team explains in the issue that I raised:

Yes, it was the CimCmdlets module from the system32 module path that got imported (via the WinCompat feature added in PS7). This is because currently the PS kernel don’t ship all the built-in modules along with it …
The built-in modules are not published anywhere and are platform specific, it’s hard for an application that host powershell to ship them along. We have the issue PowerShell/PowerShell#11783 to track this work.

The way to resolve this is to import the CimCmdlets module from your local PowerShell 7 installation until the issue is resolved

Import-Module 'C:\program files\powershell\7-preview\Modules\CimCmdlets\CimCmdlets.psd1'

Then you can run your Pester
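For example, a trivial check like the sketch below now runs in the notebook (Pester 4 syntax, to match the 4.9.0 module shown in the error above):

Import-Module Pester

Describe "Checking the notebook environment by $($ENV:USERDOMAIN) $($ENV:UserName)" {
    It "is running PowerShell Core" {
        $PSVersionTable.PSEdition | Should -Be 'Core'
    }
}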

You can see all of this, including all the results, in this notebook that I have created and shared on GitHub, and also below as a gist embedded in this blog post.

Sharing Code AND Results 🙂


Notebooks – a brilliant way of sharing what you did and the results that you got, enabling others to follow along. You can do this with this notebook. Download it and open it in your Jupyter Lab and you will be able to run it and see all of the errors and the fix on your machine.