Notifying a Teams Channel of a SQL Agent Job result

Following on from yesterday's post about creating an overview of SQL Agent Job results and sending it to a Teams channel, I was given another challenge:

Can you write a job step that I can add to SQL Agent jobs to send the result of that job to a Teams channel?

A person with a need

The use case came from some migration projects whose steps were scheduled via SQL Agent Jobs. Instead of the DBA having to estimate when they would finish and keep checking so that they could tell the next team it was time for their part to start, they wanted the job to notify a Teams channel. This turned out to be especially useful when a job finished earlier than expected at 3am and the off-shore team could begin their work immediately.

Using SQL Agent Job tokens with PowerShell

You can use SQL Agent job tokens in job step commands to reference the current instance or job, but I did not know whether you could use them with PowerShell until I read Kendra Little's blog post from 2009.

Thank you Kendra

Nothing is ever as easy as you think

So I thought: this is awesome. I can create a function, pass in the instance and the JobId, and all will be golden.

Nope

job_id <> $(JobID)

If we look in the sysjobs table at the Agent Job whose result we want to send to Teams,

We can see that the job_id is

dc5937c3-766f-47b7-a5a5-48365708659a

If we look at the JobId property with PowerShell

We get

dc5937c3-766f-47b7-a5a5-48365708659a

Awesome, they are the same
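If you want to check both values yourself, here is a quick sketch using dbatools and a query against msdb (the job name and instance variable are placeholders):

# Both of these should return the same GUID for the same job
Invoke-DbaQuery -SqlInstance $SqlInstance -Database msdb -Query "SELECT name, job_id FROM dbo.sysjobs WHERE name = 'MyMigrationJob'"
(Get-DbaAgentJob -SqlInstance $SqlInstance -Job 'MyMigrationJob').JobID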

But

If we look at the value of the $(JobID) SQL Agent Job Token,

we get

C33759DC6F76B747A5A548365708659A

which makes matching it to the JobId tricky

I tried all sorts of ways of casting and converting this value in SQL and PowerShell, and in the end I just decided to convert the value manually.

    $CharArray = $JobID.ToCharArray()

    # The token is the GUID's binary (byte) form, so the first three groups
    # are byte-reversed. Reassemble the characters in GUID string order.
    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' +
        $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' +
        $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' +
        $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' +
        $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]
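As an alternative to the character shuffling: because the token is just the GUID's 16 bytes written out in binary order, you could let .NET do the work. This is a sketch rather than the method from the post, so test it first:

# Convert each pair of hex characters to a byte and let the [guid]
# byte-array constructor handle the byte ordering for us
$bytes = [byte[]]::new(16)
for ($i = 0; $i -lt 16; $i++) {
    $bytes[$i] = [System.Convert]::ToByte($JobID.Substring($i * 2, 2), 16)
}
$JobGUID = [guid]::new($bytes).ToString()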

Send the information to Teams

Following the same pattern as yesterday's post, I created a function that builds a message depending on the outcome of the job and posts it to the Teams channel.

Again, I used a remote PowerShell session (New-PSSession and Invoke-Command) to run the Teams notification from a machine that can send the message. (I have also included code below that does not require remoting, so that you can send the message from the same machine that runs the job if required.)

The code below is saved on a UNC share or on the SQL Server as SingleNotifyTeams.ps1

Param(
    $SqlInstance,
    $JobID
)

$webhookurl = ""

$NotifyServer = 'BeardNUC2'
function Notify-TeamsSQLAgentJob {
    Param(
        $SQLInstance,
        $JobID,
        $webhookurl
    )

    # Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $CharArray = $JobID.ToCharArray()

    # The token is the GUID's binary (byte) form, so the first three groups
    # are byte-reversed. Reassemble the characters in GUID string order.
    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' +
        $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' +
        $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' +
        $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' +
        $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]

    $Job = Get-DbaAgentJob -SqlInstance $SQLInstance | Where-Object JobID -eq $JobGUID
    $JobName = $Job.Name
    $Jobsteps = Get-DbaAgentJobStep -SqlInstance $SQLInstance -Job $JobName

    $JobStepNames = $Jobsteps.Name -join ' , '
    $JobStartDate = $job.JobSteps[0].LastRunDate
    $JobStatus = $job.LastRunOutcome
    $lastjobstepid = $jobsteps[-1].id
    $Jobstepsmsg = $Jobsteps | Out-String
    $JobStepStatus = ($Jobsteps | Where-Object {$_.id -ne $lastjobstepid -and $_.LastRunDate -ge $JobStartDate} ).ForEach{
        "   $($_.Name)  - $($_.LastRunDate) **$($_.LastRunOutCome)**  
"
    } 
    
    $Text = @"
# **$SqlInstance**   
## **$JobName**  

$Jobstepsmsg

Started at $JobStartDate 
- The individual Job Steps status was  

$JobStepStatus  


"@

    if (($Jobsteps | Where-Object Id -ne $lastjobstepid).LastRunOutcome -contains 'Failed') {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "There was a Job Failure"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Failed"
                    "activitySubtitle" = "Work to do - Please investigate the following job by following the steps in the plan at LINKTOPLAN"
                    "activityImage"    = "https://fit93a.db.files.1drv.com/y4mTOWSzX1AfIWx-VdUgY_Qp3wqebttT7FWSvtKK-zAbpTJuU560Qccv1_Z_Oxd4T4zUtd5oVZGJeS17fkgbl1dXUmvbldnGcoThL-bnQYxrTrMkrJS1Wz2ZRV5RVtZS9f4GleZQOMuWXP1HMYSjYxa6w09nEyGg1masI-wKIZfdnEF6L8r83Q9BB7yIjlp6OXEmccZt99gpb4Qti9sIFNxpg"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }
    else {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "The Job Succeeded"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Succeeded"
                    "activitySubtitle" = "All is well - Please continue with the next step in the plan at LINKTOPLAN"
                    "activityImage"    = "https://6f0bzw.db.files.1drv.com/y4mvnTDG9bCgNWTZ-2_DFl4-ZsUwpD9QIHUArsGF66H69zBO8a--FlflXiF7lrL2H3vgya0ogXIDx59hn62wo2tt3HWMbqnnCSp8yPmM1IFNwZMzgvSZBEs_n9B0v4h4M5PfOY45GVSjeFh8md140gWHaFpZoL4Vwh-fD7Zi3djU_r0PduZwNBVGOcoB6SMJ1m4NmMmemWr2lzBn57LutDkxw"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }

    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $NotifyCommand = {
        $parameters = @{
            "URI"         = $Using:webhookurl
            "Method"      = 'POST'
            "Body"        = $Using:TeamMessageBody
            "ContentType" = 'application/json'
        }

        Invoke-RestMethod @parameters
    }

    $Session = New-PSSession -ComputerName $NotifyServer
    Invoke-Command -Session $Session -ScriptBlock $NotifyCommand
}

$msg = 'ServerName = ' + $SqlInstance + ' JobId = ' + $JobID
Write-Host $msg
Notify-TeamsSQLAgentJob -SQlInstance $SqlInstance -JobID $JobID -webhookurl $webhookurl

Then it can be called in a SQL Agent job step, again following the guidelines at dbatools.io/agent

It is called slightly differently, as you need to pass in the SQL Agent tokens as parameters to the script

powershell.exe -File <path to SingleNotifyTeams.ps1> -SQLInstance $(ESCAPE_SQUOTE(SRVR)) -JobID $(ESCAPE_NONE(JOBID))

SQL Agent Job Step Success and Failure

We need to take another step to ensure that this works as expected. We have to change the On Failure action for each job step to the “Go To Notify Teams” step
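You can set this in the job step properties GUI, or script it. Here is a sketch using sp_update_jobstep via dbatools, assuming the Notify Teams step is step 2 and using a placeholder job name:

# 4 = 'Go to step' for @on_fail_action in msdb.dbo.sp_update_jobstep
$query = "EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'MyMigrationJob',
    @step_id = 1,
    @on_fail_action = 4,
    @on_fail_step_id = 2;"
Invoke-DbaQuery -SqlInstance $SqlInstance -Database msdb -Query $query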

Making people smile

You can also add images (make sure the usage rights allow) so that the success notification can look like this

and the failure looks like this

Happy Automating!

Here is the code that does not require remoting to another server to send the message

Param(
    $SqlInstance,
    $JobID
)

$webhookurl = "https://outlook.office.com/webhook/5a8057cd-5e1a-4c84-9227-74a309f1c738@b122247e-1ebf-4b52-b309-c2aa7436fc6b/IncomingWebhook/affb85f05804438eb7ffb57665879248/f32fc7e6-a998-4670-8b33-635876559b80"

function Notify-TeamsSQLAgentJob {
    Param(
        $SQLInstance,
        $JobID,
        $webhookurl
    )

    # Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
    [System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $CharArray = $JobID.ToCharArray()

    # The token is the GUID's binary (byte) form, so the first three groups
    # are byte-reversed. Reassemble the characters in GUID string order.
    $JobGUID = $CharArray[8] + $CharArray[9] + $CharArray[6] + $CharArray[7] + $CharArray[4] + $CharArray[5] + $CharArray[2] + $CharArray[3] + '-' +
        $CharArray[12] + $CharArray[13] + $CharArray[10] + $CharArray[11] + '-' +
        $CharArray[16] + $CharArray[17] + $CharArray[14] + $CharArray[15] + '-' +
        $CharArray[18] + $CharArray[19] + $CharArray[20] + $CharArray[21] + '-' +
        $CharArray[22] + $CharArray[23] + $CharArray[24] + $CharArray[25] + $CharArray[26] + $CharArray[27] + $CharArray[28] + $CharArray[29] + $CharArray[30] + $CharArray[31] + $CharArray[32] + $CharArray[33]

    $Job = Get-DbaAgentJob -SqlInstance $SQLInstance | Where-Object JobID -eq $JobGUID
    $JobName = $Job.Name
    $Jobsteps = Get-DbaAgentJobStep -SqlInstance $SQLInstance -Job $JobName

    $JobStepNames = $Jobsteps.Name -join ' , '
    $JobStartDate = $job.JobSteps[0].LastRunDate
    $JobStatus = $job.LastRunOutcome
    $lastjobstepid = $jobsteps[-1].id
    $Jobstepsmsg = $Jobsteps | Out-String
    $JobStepStatus = ($Jobsteps | Where-Object {$_.id -ne $lastjobstepid -and $_.LastRunDate -ge $JobStartDate} ).ForEach{
        "   $($_.Name)  - $($_.LastRunDate) **$($_.LastRunOutCome)**  
"
    } 
    
    $Text = @"
# **$SqlInstance**   
## **$JobName**  

$Jobstepsmsg

Started at $JobStartDate 
- The individual Job Steps status was  

$JobStepStatus  


"@

    if (($Jobsteps | Where-Object Id -ne $lastjobstepid).LastRunOutcome -contains 'Failed') {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "There was a Job Failure"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Failed"
                    "activitySubtitle" = "Work to do - Please investigate the following job by following the steps in the plan at LINKTOPLAN"
                    "activityImage"    = "https://fit93a.db.files.1drv.com/y4mTOWSzX1AfIWx-VdUgY_Qp3wqebttT7FWSvtKK-zAbpTJuU560Qccv1_Z_Oxd4T4zUtd5oVZGJeS17fkgbl1dXUmvbldnGcoThL-bnQYxrTrMkrJS1Wz2ZRV5RVtZS9f4GleZQOMuWXP1HMYSjYxa6w09nEyGg1masI-wKIZfdnEF6L8r83Q9BB7yIjlp6OXEmccZt99gpb4Qti9sIFNxpg"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }
    else {
        $JSONBody = [PSCustomObject][Ordered]@{
            "@type"      = "MessageCard"
            "@context"   = "http://schema.org/extensions"
            "summary"    = "The Job Succeeded"
            "themeColor" = '0078D7'
            "sections"   = @(
                @{
                    "activityTitle"    = "The Job Succeeded"
                    "activitySubtitle" = "All is well - Please continue with the next step in the plan at LINKTOPLAN"
                    "activityImage"    = "https://6f0bzw.db.files.1drv.com/y4mvnTDG9bCgNWTZ-2_DFl4-ZsUwpD9QIHUArsGF66H69zBO8a--FlflXiF7lrL2H3vgya0ogXIDx59hn62wo2tt3HWMbqnnCSp8yPmM1IFNwZMzgvSZBEs_n9B0v4h4M5PfOY45GVSjeFh8md140gWHaFpZoL4Vwh-fD7Zi3djU_r0PduZwNBVGOcoB6SMJ1m4NmMmemWr2lzBn57LutDkxw"
                    "text"             = $text
                    "markdown"         = $true
                }
            )
        }
    }

    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters
}

$msg = 'ServerName = ' + $SqlInstance + ' JobId = ' + $JobID
Write-Host $msg
Notify-TeamsSQLAgentJob -SQlInstance $SqlInstance -JobID $JobID -webhookurl $webhookurl

Sending a SQL Agent Job results overview to a Microsoft Teams Channel

Microsoft Teams is fantastic for collaboration. It enables groups of people (teams, if you like) to communicate, collaborate on documents, hold meetings and much, much more.

SQL Agent Job Overview

Using dbatools, we can create a simple script to gather the results of Agent Jobs from a list of instances. Maybe it would be good to get the job run results every 12 hours, so that at 6am the early-bird DBA can quickly identify any failures that need immediate action, and at 6pm the team can check that everything was OK before they clock off.

Here is an example of such a script

$SqlInstances = (Get-Vm -ComputerName BEARDNUC,BEARDNUC2).Where{$_.State -eq 'Running' -and $_.Name -like '*SQL*'}.Name
# Define the look-back window; $startdate is used by Get-DbaAgentJobHistory below
$startdate = (Get-Date).AddHours(-12)
$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

and an example of running it.

Create a Teams Channel

If you have permissions, you can create a new Teams channel by clicking on the three dots (ellipsis) next to the team name and then Add channel

Then fill in the blanks

Create a Webhook Connector for the channel

Next, you need a connector for the channel. Click on the three dots for the channel and click Connectors

Then you can choose the Incoming Webhook connector and click configure

Give the connector a name and upload an image if you wish and click create

The resulting screen will give you a URL that you can copy. If you need to find it again, use the three dots again, click Connectors and look at Configured. You can then choose the webhook that you have created, click Manage, and you will find the URL.

Send to Teams using PowerShell

Now you can send a message to that Teams channel using PowerShell. You will need to add the webhook URL from your Teams connector

[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webhookurl = ""

    $Text =  @"
# Here is a Title

and a message

Image is from

https://www.flickr.com/photos/157270154@N05/38494483572

Photo by CreditDebitPro
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "This is my summary"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Something Important "
                "activitySubtitle" = "I have something to say"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $text
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters

The code above will send a message that looks like this

Running as a SQL Agent Job

Now we can run this code as a SQL Agent Job and schedule it. However, your SQL Server may not be able to connect to the internet, so how can we contact the Teams webhook?

There are probably a number of ways to do this, but the solution I chose was to allow a proxy account to use PSRemoting and run the part of the script that connects to Teams on a different machine that does have connectivity.

The script I used was as follows. You will need to add in the SQL instances or, better still, dynamically gather them from your source of truth. You will also need the webhook URL and the name of the server that can connect to Teams.

$SQLInstances = 'SQL2005Ser2003','SQL2008Ser12R2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQL2017N5','SQL2019N20','SQL2019N21','SQL2019N22','SQL2019N5'

$startdate = (Get-Date).AddHours(-12)
$webhookurl = ""
$NotifyServer = 'BeardNUC2'

$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

$NotifyCommand = {
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webhookurl = $Using:webhookurl
 
$allJobsMessage = $Using:AllJobs 
    $Text =  @"
# Overview of SQL Agent Jobs in Production since $($Using:startdate)  

$allJobsMessage
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "Overview for the last 12 hours"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Job Failures "
                "activitySubtitle" = "Overview for the last 12 hours since $($Using:startdate)"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $allJobsMessage
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters
}

$Session = New-PSSession -ComputerName $NotifyServer
Invoke-Command -Session $Session -ScriptBlock $NotifyCommand

Then follow the steps at dbatools.io/agent to create an Agent Job that runs the script above on an instance where the dbatools module is available to the SQL Service account, using or creating a proxy that has permissions on the notify server.
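If you do not already have a proxy, here is a sketch of creating the credential and CmdExec proxy (the account name and password are placeholders; the proxy name matches the T-SQL below):

# Create a credential for the proxy account and a CmdExec proxy that uses it
$sql = @"
CREATE CREDENTIAL TeamsNotifyCred WITH IDENTITY = N'THEBEARD\ProxyAccount', SECRET = N'StrongPasswordHere';
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'TheBeardIsMighty', @credential_name = N'TeamsNotifyCred', @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'TheBeardIsMighty', @subsystem_name = N'CmdExec';
"@
Invoke-DbaQuery -SqlInstance $SqlInstance -Database master -Query $sql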

USE [msdb]
GO

/****** Object:  Job [I am a Job that notifies Teams]    Script Date: 27/07/2020 20:27:27 ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 27/07/2020 20:27:28 ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'12 Hour Teams Notify', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=0, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'This job will notify Teams every 12 hours', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'THEBEARD\SQL_SVC', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

/****** Object:  Step [Notify Teams]    Script Date: 27/07/2020 20:27:28 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Notify Teams', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'CmdExec', 
		@command=N'powershell.exe -File C:\temp\AgentJobs\NotifyTeams.ps1', 
		@flags=0, 
		@proxy_name=N'TheBeardIsMighty'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO


When the job runs

The results are posted to the Teams Channel

If you can run the Agent Job on a machine that can connect to Teams and your SQL Instances then you can remove the need to use a remote session by using this code

$SQLInstances = 'SQL2005Ser2003','SQL2008Ser12R2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQL2017N5','SQL2019N20','SQL2019N21','SQL2019N22','SQL2019N5'

$startdate = (Get-Date).AddHours(-12)
$webhookurl = ""


# Import-Module 'C:\Program Files\WindowsPowerShell\Modules\dbatools\1.0.107\dbatools.psd1'
$AllJobs = "
SqlInstance...|...Total...|...Successful...|...FailedJobs...|...FailedSteps...|...Canceled...     
---------------------------------------------  
"
foreach ($Instance in $SQLInstances) {
    Write-Host "Connecting to $instance"
    try{
        $smo = Connect-DbaInstance $Instance -ErrorAction Stop
        Write-Host "Connected successfully to $instance"
    }
    catch{
        Write-Host "Failed to connect to $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

    Write-Host "Getting Agent Jobs on $instance"
    try {
        $AgentJobs = Get-DbaAgentJobHistory -SqlInstance $smo -EnableException -StartDate $startdate 
        Write-Host "Successfully got Agent Jobs on $instance"
    }
    catch {
        Write-Host "Failed to get agent jobs on $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }
    

    $jobs = $agentJobs 
    $NumberOfJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}).Count.ToString("00")
    $NumberOfFailedJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfFailedJobSteps = ($Jobs |Where-Object {$PSitem.StepId -ne 0}|  Where-Object {$PSItem.Status -eq 'Failed'}).StepName.Count.ToString("00")
    $NumberOfSuccessfulJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Succeeded'}).StepName.Count.ToString("00")
    $NumberOfCanceledJobs = ($Jobs |Where-Object {$PSitem.StepId -eq 0} | Where-Object {$PSItem.Status -eq 'Canceled'}).StepName.Count.ToString("00")

     Write-Host "SqlInstance $Instance - Number of Jobs $NumberOfJobs - Number of Successful Jobs $NumberOfSuccessfulJobs  - Number of Failed Jobs $NumberOfFailedJobs"

    $AllJobs = $AllJobs + "$($Instance.Split('.')[0])..........<b>$NumberOfJobs</b>................<b>$NumberOfSuccessfulJobs</b>.........................<b>$NumberOfFailedJobs</b>............................<b>$NumberOfFailedJobSteps</b>..............................<b>$NumberOfCanceledJobs</b>........
"
    try{
        $smo.ConnectionContext.Disconnect()
        Write-Host "Disconnecting $instance"
    }
    catch{
        Write-Host "Failed disconnect from  $Instance" 
        $errorMessage = $_ | Out-String
        Write-Host $errorMessage
        Continue
    }

}

Write-Host "Since $startdate"
Write-Host "$AllJobs"

[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
 
$allJobsMessage = $AllJobs 
    $Text =  @"
# Overview of SQL Agent Jobs in Production since $($startdate)  

$allJobsMessage
"@

    $JSONBody = [PSCustomObject][Ordered]@{
        "@type"      = "MessageCard"
        "@context"   = "http://schema.org/extensions"
        "summary"    = "Overview for the last 12 hours"
        "themeColor" = '0078D7'
        "sections"   = @(
            @{
                "activityTitle"    = "Job Results "
                "activitySubtitle" = "Overview for the last 12 hours since $($startdate)"
                "activityImage"    = "https://live.staticflickr.com/4568/38494483572_a98d623854_k.jpg"
                "text"             = $allJobsMessage
                "markdown"         = $true
            }
        )
    }
 
    $TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
 
    $parameters = @{
        "URI"         = $webhookurl
        "Method"      = 'POST'
        "Body"        = $TeamMessageBody
        "ContentType" = 'application/json'
    }
 
    Invoke-RestMethod @parameters

Happy automating!

Using the Secret Management module to run SSMS, VS Code and Azure Data Studio as another user

Following on from my last post about the Secret Management module, I was asked another question.

> Can I use this to run applications as my admin account?

A user with a beard

It is good practice not to log into your workstation with an account that has admin privileges. In many shops, you will need to open applications that can do administration tasks with another set of account credentials.

Unfortunately, people being people, they will often store their admin account credentials in a less than ideal manner (OneNote, Notepad++ etc.) to make it easier for them, so that when they right-click and run as a different user, they can copy and paste the password.

Use the Secret Management module

Again, I decided to use a notebook to show this as it is a fantastic way to share code and results and because it means that anyone can try it out.

The notebook may not render on a mobile device.

Using the notebook, I can quickly store my admin password safely and open and run the applications using the credential
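If the notebook does not render for you, the heart of it is only a few lines like these (vault registration omitted; the SSMS path and account name are examples for your environment):

# Store the admin password once, prompting for it securely
Set-Secret -Name AdminPassword -Secret (Read-Host -AsSecureString -Prompt 'Admin password')

# Retrieve it and run SSMS as the admin account
$cred = [pscredential]::new('THEBEARD\AdminAccount', (Get-Secret -Name AdminPassword))
Start-Process -FilePath 'C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\Common7\IDE\Ssms.exe' -Credential $cred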

Good Bye Import-CliXML – Use the Secrets Management module for your labs and demos

Don’t want to read all this? There are two dotnet interactive notebooks here with the relevant information for you to use.

https://beard.media/dotnetnotebooks

Jaap is awesome

I have to start here. For the longest time, whenever anyone has asked me how I store my credentials for use in my demos and labs, I have always referred them to Jaap Brasser's blog post

https://www.jaapbrasser.com/quickly-and-securely-storing-your-credentials-powershell/

Joel is also awesome!

When people wanted a method of storing credentials that didn't involve files on disk, I would suggest Joel Bennett's module BetterCredentials, which uses the Windows Credential Manager

https://www.powershellgallery.com/packages/BetterCredentials/4.5

Microsoft? Also awesome!

In February, Microsoft released the SecretManagement module for preview.

https://devblogs.microsoft.com/powershell/secrets-management-development-release/

Sydney Smith gave a presentation at the European PowerShell Conference, which you can watch on YouTube.

Good Bye Import-CliXML

So now I say: it is time to stop using Import-Clixml for storing secrets and to use the Microsoft.PowerShell.SecretsManagement module instead.
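The swap is a small one. Here is a minimal sketch; note that the cmdlet names follow the released SecretManagement module, and the early preview used slightly different ones:

# The old way: the credential lives in an xml file on disk
Get-Credential | Export-Clixml -Path .\sqladmin.xml
$cred = Import-Clixml -Path .\sqladmin.xml

# The new way: the credential lives in a registered secret vault
Set-Secret -Name sqladmin -Secret (Get-Credential)
$cred = Get-Secret -Name sqladmin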

Notebooks are as good as blog posts

I love notebooks, and to show some people who had asked about storing secrets, I have created some. So, because I am efficient (lazy), I have embedded them here for you to see. You can find them in my Jupyter Notebook repository

https://beard.media/dotnetnotebooks

in the Secrets folder

Installing and using the Secrets Management Module

These notebooks may not display on a mobile device unfortunately

Using the Secret Management Module in your scripts

Here is a simple example of using the module to provide the credential for a docker container, and then using dbatools to query the container

These notebooks may not display on a mobile device unfortunately
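If the embed does not display, the gist of that notebook is along these lines (the container port and secret name are examples):

# Build a credential for the container's sa account from the vault
$sqlCred = [pscredential]::new('sa', (Get-Secret -Name DockerSa))

# Use it with dbatools to query the container
Invoke-DbaQuery -SqlInstance 'localhost,15789' -SqlCredential $sqlCred -Query 'SELECT @@VERSION AS Version'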

Surprised and Honoured and Proud

I have always been extremely proud to be a Cloud and Datacenter Management MVP, and lucky enough to be involved with both the PowerShell community as well as the Data Platform community.

Today, July 1st is the date that many MVPs receive their renewal email to let them know that they have been awarded for another year. There is a lot of F5’ing and frequent checking of emails and “Have you heard yet?” DMs going around.

When I received the news, I was using Azure DevOps to run PowerShell and Terraform to build an Azure SQL Elastic Pool (yes, I will write a blog post about it!). I love technology and within my work, like many people, I work across many different disciplines. Azure, Azure DevOps, SQL Server and Microsoft Data Platform products are the main focus of my time.

I didn’t notice the significance of the information.

I was pleased as punch to be renewed again, proud that what I do is recognised by Microsoft, honoured to spend another year as an MVP. Then my friends pointed out the big news that I had missed.

Photo by pixpoetry on Unsplash

There are two award categories.

I have been awarded for both Cloud and Datacenter Management and Data Platform.

I am beyond words.

Proud, Surprised and Honoured.

Thank you to all of the people who help and support me. You help more than you will ever know.

I am going to go and prop my jaw shut!

Running Jupyter Notebooks as Agent Jobs

Azure Data Studio is a great tool for connecting with your data platform, whether it is in Azure or on your own hardware. Jupyter Notebooks are fantastic: you can have words, pictures, code and code results all saved in one document.

I have created a repository on my GitHub https://beard.media/Notebooks where I have stored a number of Jupyter notebooks, both for Azure Data Studio and the new .NET interactive notebooks.

Another thing that you can do with notebooks is run them as Agent Jobs and save the results of the run.

Notebooks running T-SQL

This works easily for T-SQL notebooks. I am going to use this one that I created that uses T-SQL to gather permissions using old code that was in a share somewhere. We can run the notebook and get the permissions and save the notebook and the results will be available for all time (unless you delete the notebook!)

SQL Agent Extension in Azure Data Studio

In Azure Data Studio, if you press CTRL + SHIFT + X it will open the Extensions tab

You can add extra functionality to Azure Data Studio. Search in the top bar for Agent and press the install button to install the extension. You can connect to an instance in the connections tab (CTRL + SHIFT + D), right click on it and click Manage. This will open up the server dashboard (why isn't it instance dashboard?)

and you will also have the SQL Agent dashboard available

It's pretty neat: it has green and red bars against the jobs showing success or failure, and the larger the bar, the longer the run time. On the left you will see a book. Click that

Notebooks in Agent Jobs

You can create an Agent Job to run a notebook. As a notebook is just a json file, it can be stored in a database table. This interface will create two tables: one to store the templates and one for the results. Click New Notebook Job

Then navigate to the notebook and select it.

Choose a database for the storage of the template and the results and one for the execution context.

The name of the job will be the file name of the notebook. You can change this but there is a bug where you can only enter one character at a time in the name before it changes focus so beware!

Once the job is created, you will see two tables in the storage database notebooks.nb_materialized and notebooks.nb_template

The materialised table is empty right now

but the template table has a row for the job which includes the notebook in json format.

If you click on the jobs in the Notebook Jobs window in the SQL Agent extension, you can see more information about the job run

You can also run the job from here. It doesn’t have to be run from here, it is just a normal agent job which you can run or schedule in any normal manner. Running it from here gives a pop-up

You have to refresh to see when the job is finished and it will be red if the job failed, green if it succeeded or orange if some cells failed like this!

But this is the good bit. Clicking on that icon will open the notebook that was created by that agent job run. Let's see what we get

You can see that we have the results of the queries that we wrote in the notebook alongside the documentation (or maybe explanation of the expected results)
If we scroll down a little (and change the theme colour so that you can see the error)

Msg , Level , State , Line 
Duplicate column names are not permitted in SQL PowerShell. To repeat a column, use a column alias for the duplicate column in the format Column_Name AS New_Name.

We have got an error from running the code via SQL PowerShell which is how the job is run. This error is also inserted into the notebooks.nb_template table

I edited the notebook locally to remove that block of code

Then edited the job and selected the updated notebook

and re-ran the job and got a green tick.

Now I can open the notebook from the latest run, but notice that from this view I can also open the previous notebook.

If I look in the nb_template table, the last_run_notebook_error has cleared

and if I look in the nb_materialized table I can see two rows, one for each job run. The error from the first run is also stored in this table. The notebook column has the json for the notebook if you wish to access it in a different manner.
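Because the notebook column holds the raw json, you can also pull a run back out and save it as a notebook file. A sketch; the instance, database and ordering column here are assumptions for your own environment:

# Grab the most recent materialised run and write it out as an .ipynb
$query = "SELECT TOP 1 notebook FROM notebooks.nb_materialized ORDER BY materialized_id DESC"
$run = Invoke-DbaQuery -SqlInstance 'SQL2019N5' -Database NotebookStore -Query $query
$run.notebook | Out-File -FilePath .\LastAgentRun.ipynb -Encoding utf8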

Tomorrow, we will see what the job steps look like and how to make this run on an instance which does not and cannot have the required PowerShell.

Spoiler Alert – May contain dbatools 🙂

Use Jupyter Notebooks to Help People on Stack Overflow

I am sat in the PowerShell Saturday in Hamburg. You can see me on the right of this picture writing my previous blog post!

I was talking with my friend Mathias Jessen @IISResetMe on Twitter about notebooks and he said that another great use case was to use them on Stack Overflow

Now Mathias is an active answerer on Stack Overflow

and he puts a lot of effort into writing his answers, formatting them, including code and results. Basically exactly the same as a Notebook. However, with a Notebook, you can enable people to run the code as well on their own machines.

Mathias says he will use notebooks to help people when he answers their PowerShell questions on Stack Overflow. If you are a Stack Overflow answerer then you can too.

.NET PowerShell Notebooks – Using Pester

My last post had a lot of information about the new .NET PowerShell notebooks including installation instructions.

.NET Notebooks are Jupyter Notebooks that use .NET core to enable C#, F# and PowerShell kernels.

Use Cases

One of the main benefits that I see for Jupyter Notebooks for Ops folk is that the results of the query are saved with the notebook. This makes them fantastic for Incident resolution.

If you have an incident at 3am and you know that you will need that information in the wash-up meeting the next day, then instead of copying and pasting results into a OneNote document or a text file, you can simply run the queries in a notebook and save it.

In the meeting, you can simply open the notebook and the results will be available for everyone to see.

Even better, you can have a template notebook for those scenarios and then compare the results to previous occurrences.

Using Pester

Using Pester to validate that an environment is as you expect it to be is a useful technique for incident resolution, potentially enabling you to quickly establish an area to concentrate on for the issue. However, if you try to run Pester in a .NET Notebook you will receive an error

Describe: 
Line |
   3 | Describe "Checking Problem ...... by $($ENV:USERDOMAIN) $($ENV:UserName)" {

     | ^ The 'Describe' command was found in the module 'Pester', but the module could not be loaded. For more information, run 'Import-Module Pester'.

Fixing it

When you try to Import-Module Pester you get the following error

Get-Command: C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psm1
Line |
  94 |     $script:SafeCommands['Get-CimInstance'] = Get-Command -Name Get-CimInstance -Module CimCmdlets @safeCommandLookupParameters

     |                                               ^ The term 'Get-CimInstance' is not recognized as the name of a
     | cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included,
     | verify that the path is correct and try again.
 Import-Module: The module to process 'Pester.psm1', listed in field 'ModuleToProcess/RootModule' of module manifest 'C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psd1' was not processed because no valid module was found in any module directory. 

That's odd, why is it failing there? Dongbo Wang from the PowerShell team explains in the issue that I raised

Yes, it was the CimCmdlets module from the system32 module path that got imported (via the WinCompat feature added in PS7). This is because currently the PS kernel don’t ship all the built-in modules along with it …
The built-in modules are not published anywhere and are platform specific, it’s hard for an application that host powershell to ship them along. We have the issue PowerShell/PowerShell#11783 to track this work.

The way to resolve this is to import the CimCmdlets module from your local PowerShell 7 installation until the issue is fixed

Import-Module 'C:\program files\powershell\7-preview\Modules\CimCmdlets\CimCmdlets.psd1'

Then you can run your Pester
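For example, a trivial check along these lines runs happily once CimCmdlets has been imported (the Describe block is a placeholder, written in Pester 4 syntax):

Import-Module 'C:\program files\powershell\7-preview\Modules\CimCmdlets\CimCmdlets.psd1'
Import-Module Pester

Describe "Checking the environment" {
    It "Should be running PowerShell 7 or later" {
        $PSVersionTable.PSVersion.Major | Should -BeGreaterThan 6
    }
}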

You can see all of this, including all the results, in this notebook that I have created and shared on GitHub, and also below as a gist embedded in this blog post

Sharing Code AND Results 🙂


Notebooks – A brilliant way of sharing what you did and the results that you got enabling others to follow along. You can do this with this Notebook. Download it and open it in your Jupyter Lab and you will be able to run it and see all of the errors and the fix on your machine.

New .NET Notebooks are here – PowerShell 7 notebooks are here.

Data Science folk used Notebooks for documentation and to show re-runnable research. Azure Data Studio included this notebook functionality, adding a SQL kernel where, with a little bit of faffing, you could run PowerShell, and then a Python kernel that enabled PowerShell. It seems that notebooks are so cool that everyone is creating them these days! I was browsing twitter when I saw this tweet.

PowerShell 7 Notebooks 🙂

A notebook experience for PowerShell 7? That sounds amazing. This will enable a true cross-platform PowerShell notebook experience, which is lacking from the Python version, as that uses Windows PowerShell on Windows and PowerShell Core on other OSes.

The first thing I asked was: will this come to Azure Data Studio? I got an immediate response from Sydney Smith, PowerShell Program Manager, saying it is on the roadmap.

Install dependencies

To be able to run the notebook, you need to install some dependencies. First install the .NET Core SDK, which you can download from https://dotnet.microsoft.com/download. This needs admin permissions to install.

You also need a Python installation. You can use Anaconda, which you can download from https://www.anaconda.com/distribution/. This does not need admin permissions to install.

Add Anaconda to Windows Terminal

I have added the Anaconda prompt to Windows Terminal so that I have one entry point for all my CLIs. Open the settings file and add the code below. (It will also give you an icon and background.)

        {
            // Make changes here to the Anaconda.exe profile
            "guid": "{0caa0dad-35be-5f56-a7ff-afceeeaa6101}",
            "name": "Anaconda",
            "commandline": "cmd.exe /K C:\\Users\\mrrob\\Anaconda3\\Scripts\\activate.bat",
            "hidden": false,
            "backgroundImage": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "icon": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "backgroundImageAlignment": "topRight",
            "backgroundImageStretchMode": "uniform",
            "backgroundImageOpacity": 0.1
        }

and it appears in the drop down

With Anaconda installed, check that the kernel is available on your path. If, like me, you have Azure Data Studio installed, you will have additional kernels, but the important line here is

python3 C:\Users\USERNAME\Anaconda3\share\jupyter\kernels\python3

In Windows Terminal move to a PowerShell 7 prompt and install the dotnet interactive tool

dotnet tool install --global Microsoft.dotnet-interactive

Then you can install the .NET kernel in your Anaconda prompt using this command

dotnet interactive jupyter install

Sometimes new things have errors

I had an error when I tried this first time

Could not execute because the specified command or file was not found.
Possible reasons for this include:
* You misspelled a built-in dotnet command.
* You intended to execute a .NET Core program, but dotnet-interactive does not exist.
* You intended to run a global tool, but a dotnet-prefixed executable with this name could not be found on the PATH.

This is easily fixed by adding %USERPROFILE%\.dotnet\tools to my path with set PATH=%PATH%;%USERPROFILE%\.dotnet\tools

Running jupyter kernelspec list shows that the .NET kernel is installed for C Sharp, F Sharp and .NET PowerShell

Let's open a Notebook

Now you want to play with it!
You can run the lab environment using `jupyter lab`

This opens a browser

You can open existing Azure Data Studio PowerShell notebooks (but not SQL ones)

Sometimes new things have errors Part 2

Unfortunately, I get errors when trying to import Pester which means I can not use my dbachecks notebooks in this blog post. I have raised an issue on the repo here.

Create a New Notebook

But it is easy to create a new Notebook

In the launcher page click the .NET PowerShell button


Which will open a new Notebook in the directory that you launched the lab from. You can then add Code or Markdown as I have described before here.

Then you can add code, markdown and images to create your notebook.

Once you have finished using the notebook lab, you can shut it down in the Anaconda prompt with CTRL + C

Here is a video of running a notebook which anyone can use to create a couple of Docker containers running SQL 2019 and query them with dbatools. You can find the notebook further down this post.

Sharing Notebooks

You can create notebooks to run common tasks. Even better, from the lab you can convert the notebook, including the results, to a variety of formats to share with other non-technical people. I used this functionality this week to export Azure Data Studio Notebooks to HTML and PDF for a Project Manager 🙂

You can find the Export Notebook command in the File menu

Exporting to HTML did not export the images but it does include the results

You can share notebooks via GitHub – Either in a gist like this

or by providing a straight link to the notebook in GitHub https://github.com/SQLDBAWithABeard/Notebooks/blob/master/notebooks/Exploring%20dbatools.ipynb

You can also use Binder https://mybinder.org/

This uses Docker to create an interactive Notebook. Create a GitHub repo like https://github.com/SQLDBAWithABeard/Notebooks (or just clone it), copy your notebooks into the notebooks folder, push the changes to GitHub, then go to https://mybinder.org/ and add the URL of your repository.

You can see what it looks like by clicking the button below which Binder creates for you

Unfortunately the kernel only supports Python for the moment but you can see the possibilities 🙂

How to fork a GitHub repository and contribute to an open source project

I enjoy maintaining open source GitHub repositories such as dbachecks and ADSNotebook. I absolutely love it when people add more functionality to them.

To collaborate with a repository in GitHub you need to follow these steps

  • Fork the repository into your own GitHub
  • Clone the repository to your local machine
  • Create a new branch for your changes
  • Make some changes and commit them with useful messages
  • Push the changes to your repository
  • Create a Pull Request from your repository back to the original one

You will need to have git.exe available which you can download and install from https://git-scm.com/downloads if required
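For reference, the whole flow at the command line looks roughly like this (the user, repository and branch names are examples); each step is explained in detail below:

git clone https://github.com/YOURGITHUBUSERNAME/dbachecks.git   # clone your fork
cd dbachecks
git checkout -b AwesomeNewFeature                 # create and switch to a new branch
# ... make your changes ...
git add .                                         # stage the changes
git commit -m 'Explain why you made the change'   # commit with a useful message
git push --set-upstream origin AwesomeNewFeature  # publish the branch to your fork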

Fork the repository into your own GitHub

A fork is a copy of the original repository. This allows you to make changes without affecting the original project. It does not get updated when the original project gets updated (we will talk about that in the next post). This enables you to code a new feature or a bug fix, test it locally and make sure it is working.

Let’s take dbachecks as our example. Start by going to the project in GitHub. In this case the URL is https://github.com/sqlcollaborative/dbachecks You will see a Fork button at the top right of the page

When you click the button the repository is copied into your own GitHub account

The page will open at https://github.com/YOURGITHUBUSERNAME/NameOfRepository in this case https://github.com/SQLDBAWithABeard/dbachecks You will be able to see that it is a fork of the original repository at the top of the page

Clone the repository to your local machine

Forking the repository has created a remote repository stored on the GitHub servers. Now that the repository has been forked you need to clone it to your local machine to create a local repository so that you can start coding your amazing fix. When you have finished you can then sync it back to your remote repository ready for a Pull Request back to the original repository.

In your browser, at your remote repository that you just created (https://github.com/YOURGITHUBUSERNAME/NameOfRepository if you have closed the page) click on Clone or Download and then the icon to the right to copy the url

You can clone your repository in VS Code or Azure Data Studio by clicking F1 or CTRL + SHIFT + P in Windows or Linux and ⇧⌘P or F1 on a Mac

then start typing clone until you see Git: Clone and press Enter or click it

Paste in the URL that you just copied and click enter. A dialog will open asking you to select a folder. This is the parent directory where your local repository will be created. The clone will create a directory for your repository so you do not need to. I suggest that you use a folder called GitHub or something similar to place all of the repositories that you are going to clone and create.

When it has finished it will ask you if you wish to open the repository

if you click Open it will close anything that you have already got opened and open the folder. If you click Add to Workspace it will add the folder to the workspace and leave everything you already had open as it was and surprisingly clicking Open in New Window will open the folder in a new instance of Visual Studio Code or Azure Data Studio!

and you will also be able to see the local repository files on your computer

You can clone the repository at the command line if you wish by navigating to your local GitHub directory and running git clone TheURLYouCopied

Now your local repository has been created, it’s time to do your magic coding.

Create a new branch for your changes

It is a good idea to create a branch for your amazing new feature. This enables you to work on coding for that feature in isolation. It has the added advantage that if you mess it right royally up, you can just delete that branch and start again with a new one!

To create a branch in VS Code or Azure Data Studio you can click on the branch name at the bottom left.

Or open the Command Palette and type Branch until you see Git: Create Branch

You will be prompted for a branch name

I like to choose a name that relates to the code that I am writing like configurable_engine or removeerroringexample You can see the name of the branch in the bottom left so that you always know which branch you are working on.

The icon shows that the branch is only local and hasn’t been pushed (published) to the remote repository yet

Make some changes and commit them with useful messages

Now you can start writing your code for your awesome new feature, bug fix or maybe just documentation improvement. Keep your commits small and give them useful commit messages that explain why you have made the change as the diff tooling will be able to show what change you have made

Write your code or change the documentation, save the file and in Visual Studio Code or Azure Data Studio you will see that the source control icon has a number on it

Clicking on the icon will show the files that have changes ready

You can write your commit message in the box and click CTRL + ENTER to commit your changes with a message

If you want to do this at the command line, you can use git status to see which files have changes

You will need to git add . or git add .\pathtofile to stage your changes ready for committing, and then git commit -m 'Commit Message' to commit them

Notice that I did exactly what I just said not to do! A better commit message would have been So that people can find the guide to forking and creating a PR

Push the changes to your repository

You only have the changes that you have made in your local repository on your computer. Now you need to push those changes to GitHub, your remote repository. You can click on the publish icon

You will get a pop-up asking you if you wish to stage your changes. I click Yes and never Always so that I can use this prompt as a sanity check that I am doing the right thing

At the command line you can push the branch, if you do that, you will have to tell git where the branch needs to go. If you just type git push it will helpfully tell you

fatal: The current branch AwesomeNewFeature has no upstream branch.
To push the current branch and set the remote as upstream, use

    git push --set-upstream origin AwesomeNewFeature

So you will need to use that command

You can see in the bottom left that the icon has changed

and if you read the output of the git push command you will see what the next step is also.

Create a Pull Request from your repository back to the original one

You can CTRL + click the link in the git push output if you have pushed from the command line, or if you visit either your repository or the original repository in your browser, you will see that there is a Compare & pull request button

You click that and let GitHub do its magic

and it will create a Pull Request for you ready for you to fill in the required information, ask for reviewers and other options. Once you have done that you can click Create pull request and wait for the project maintainer to review it and (hopefully) accept it into their project

You can find the Pull Request that I created here https://github.com/sqlcollaborative/dbachecks/pull/720 and see how the rest of this blog post was created.

If you make more changes to the code in the same branch in your local repository and push them, they will automatically be added to this Pull Request whilst it is open. You can do this if the maintainer or reviewer asks for changes.

Shane has asked for a change

So I can go to my local repository in Azure Data Studio and make the requested change and save the file. If I look in the source control in Azure Data Studio I can again see there is a change waiting to be committed and if I click on the name of the file I can open the diff tool to see what the change was

Once I am happy with my change I can commit it again in the same way as before either in the editor or at the command line. The icon at the bottom will change to show that I have one commit in my local repository waiting to be pushed

To do the same thing at the command line I can type git status and see the same thing.

I can then push my change to my remote repository either in the GUI or by using git push

and it will automatically be added to the Pull Request as you can see

Now that the required changes for the review have been made, the review has been approved by Shane and the pull request is now ready to be merged. (You can also see that dbachecks runs some checks against the code when a Pull Request is made)

Many, many thanks to Shane, who helped with the writing of this post even whilst on a “no tech” holiday.

Go Ahead – Contribute to an Open Source Project

Hopefully you can now see how easy it is to create a fork of a GitHub repository, clone it to your own machine and contribute. There are many open source projects that you can contribute to.

You can use this process to contribute to the Microsoft Docs for example by clicking on the edit button on any page.

You can contribute to other open source projects, or go and find the ones that you use and can help with.