Running Jupyter Notebooks as Agent Jobs

Azure Data Studio is a great tool for connecting with your data platform, whether it is in Azure or on your own hardware. Jupyter Notebooks are fantastic: you can have words, pictures, code and code results all saved in one document.

I have created a repository in my GitHub https://beard.media/Notebooks where I have stored a number of Jupyter Notebooks, both for Azure Data Studio and for the new .NET Interactive notebooks.

Another thing that you can do with notebooks is run them as Agent Jobs and save the results of the run.

Notebooks running T-SQL

This works easily for T-SQL notebooks. I am going to use one that I created which uses T-SQL to gather permissions, using some old code that was in a share somewhere. We can run the notebook, get the permissions and save the notebook, and the results will be available for all time (unless you delete the notebook!)

SQL Agent Extension in Azure Data Studio

In Azure Data Studio, if you press CTRL + SHIFT + X, it will open the Extensions tab.

You can add extra functionality to Azure Data Studio. Search in the top bar for Agent and press the install button to install the extension. You can connect to an instance in the connections tab (CTRL + SHIFT + D), right click on it and click Manage. This will open up the server dashboard (why isn’t it instance dashboard?)

and you will also have the SQL Agent dashboard available

It’s pretty neat: it has green and red bars against the jobs showing success or failure, and the larger the bar, the longer the run time. On the left you will see a book. Click that.

Notebooks in Agent Jobs

You can create an Agent Job to run a notebook. As a notebook is just a JSON file, it can be stored in a database table. This interface will create two tables: one to store the templates and one for the results. Click New Notebook Job.

Then navigate to the notebook and select it.

Choose a database for the storage of the template and the results and one for the execution context.

The name of the job will be the file name of the notebook. You can change this, but there is a bug where you can only enter one character at a time in the name before it changes focus, so beware!

Once the job is created, you will see two tables in the storage database: notebooks.nb_materialized and notebooks.nb_template.

The materialised table is empty right now

but the template table has a row for the job which includes the notebook in json format.
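If you want a look at that row outside Azure Data Studio, here is a quick hedged sketch with dbatools (the storage database name is whatever you chose above; NotebookStorage here is made up):

Invoke-DbaQuery -SqlInstance localhost -Database NotebookStorage -Query 'SELECT * FROM notebooks.nb_template'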

If you click on the jobs in the Notebook Jobs window in the SQL Agent extension, you can see more information about the job run

You can also run the job from here. It doesn’t have to be run from here, though; it is just a normal agent job which you can run or schedule in any normal manner. Running it from here gives a pop-up.

You have to refresh to see when the job is finished. It will be red if the job failed, green if it succeeded, or orange if some cells failed, like this!

But this is the good bit. Clicking on that icon will open the notebook that was created by that agent job run. Let’s see what we get.

You can see that we have the results of the queries that we wrote in the notebook alongside the documentation (or maybe the explanation of the expected results).
If we scroll down a little (and change the theme colour so that you can see the error)

Msg , Level , State , Line 
Duplicate column names are not permitted in SQL PowerShell. To repeat a column, use a column alias for the duplicate column in the format Column_Name AS New_Name.

We have got an error from running the code via SQL PowerShell, which is how the job is run. This error is also inserted into the notebooks.nb_template table.

I edited the notebook locally to remove that block of code

Then edited the job and selected the updated notebook

and re-ran the job and got a green tick.

Now I can open the notebook from the latest run, but notice that from this view I can also open the previous notebook.

If I look in the nb_template table, the last_run_notebook_error has cleared

and if I look in the nb_materialized table I can see two rows, one for each job run. The error from the first run is also stored in this table. The notebook column has the JSON for the notebook if you wish to access it in a different manner.
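For example, here is a hedged sketch to pull the notebook JSON from the latest run and save it as an .ipynb file you can open in Azure Data Studio (the database name and the ORDER BY column are assumptions; check the table on your instance first):

$query = 'SELECT TOP 1 notebook FROM notebooks.nb_materialized ORDER BY run_date DESC'
$row = Invoke-DbaQuery -SqlInstance localhost -Database NotebookStorage -Query $query
$row.notebook | Out-File -FilePath C:\temp\LastRun.ipynb -Encoding utf8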

Tomorrow, we will see what the job steps look like and how to make this run on an instance which does not and cannot have the required PowerShell.

Spoiler Alert – May contain dbatools 🙂

Dynamically Creating Azure Data Studio Notebooks with PowerShell for an Incident Response Index Notebook

Now that Azure Data Studio has PowerShell Notebooks and there is a PowerShell module for creating notebooks, I have been asked, more than once: what is the point? What is the use case? How does this help? I hope that this post will spark some ideas for one particular use case.

I showed my silly example PowerShell code to create a PowerShell Notebook that created a PowerShell Notebook to my good friend Nick.

Nick is a fantastic, clever DBA who isn’t active on social media, which is a great shame, as if he had time to share some of his fantastic work we would all benefit. He looked at that code and, less than an hour later, came back to me with this code and idea, which I have replicated here with his permission.

Thanks Nick.

The Use Case

The use case that Nick has is that he is converting some troubleshooting runbooks from their original locations (you know the sort of places – SharePoint docs, OneNote Notebooks, shared folders, the desktop of the Bastion Host) into a single repository of Azure Data Studio SQL or PowerShell Notebooks.

The idea is to have a single entry point into the troubleshooting steps and for the on-call DBA to create a Notebook from a template for the issue at hand which could be attached to an incident in the incident management solution. I suppose you could call it an Index Notebook.

Work Flow

When the DBA (or another team) opens this Notebook, they can choose the task that they are going to perform and click the link, which will (see the sketch after this list):

  • copy the Notebook to the local machine
  • rename the Notebook with the username and date
  • open it, ready for the work
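A hedged sketch of those three steps as a small helper script, which I will call Open-IncidentNotebook.ps1 (the name and the details are mine, not Nick’s actual code):

param ([string]$Template)   # full path to the template notebook
# name the copy with the username and the date so that it is unique
$fileName = '{0}-{1}-{2}.ipynb' -f $env:USERNAME,
    [IO.Path]::GetFileNameWithoutExtension($Template),
    (Get-Date -Format 'yyyyMMdd-HHmm')
$destination = Join-Path -Path $HOME -ChildPath $fileName
# copy the template so that the original stays intact, then open the copy
Copy-Item -Path $Template -Destination $destination
azuredatastudio $destination   # assumes the Azure Data Studio CLI is on the path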

Once the work has been completed, the DBA can then attach the Notebook to the task or incident that has been created, or use it in the Wash-Up/Post-Incident meeting.

This ensures that the original template notebook stays intact and unchanged, and it is easy (which is always good when you are called out at 3am!) to create a uniquely named notebook.

Azure DevOps

Nick has placed this code into the deploy step in Azure DevOps which will deploy the template Notebooks from source control into the common folder and then this code will dynamically create the index Notebook each time there is a release.

Whilst the initial use case is incident response, this could easily be adapted for Notebooks used for Common Tasks or Run Books.

Notebooks

There are a number of Notebooks for different issues stored in directories. For this post, I have used the Notebooks from Microsoft that explain SQL Server 2019 features and troubleshooting, which you can find in their GitHub repositories by following this link.

The Azure DevOps deploys the Notebooks to a directory which then looks something like this

Some directories of Notebooks in a directory

Create an Index Notebook

Here is the code to create an index Notebook
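The code is Nick’s; here is a hedged sketch of the approach, using the ADSNotebook module commands that appear throughout these posts and the hypothetical Open-IncidentNotebook.ps1 helper sketched earlier:

$root = 'C:\Notebooks'   # hypothetical folder that Azure DevOps deploys the notebooks to
$cells = @(New-ADSWorkBookCell -Type Text -Text '# Incident Response Index')
foreach ($dir in Get-ChildItem -Path $root -Directory) {
    $cells += New-ADSWorkBookCell -Type Text -Text "## $($dir.Name)"
    foreach ($nb in Get-ChildItem -Path $dir.FullName -Filter *.ipynb) {
        # a collapsed code cell per template: running it copies, renames and
        # opens that notebook using the helper sketched earlier
        $command = "& '$root\Open-IncidentNotebook.ps1' -Template '$($nb.FullName)'"
        $cells += New-ADSWorkBookCell -Type Code -Text $command -Collapse
    }
}
New-ADSWorkBook -Path (Join-Path $root 'Index.ipynb') -cells $cells -Type PowerShell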

This creates a Notebook in the root of the folder. It also uses the new -Collapse parameter in New-AdsWorkBookCell, which creates the code blocks with the code collapsed so that it looks neater. The index Notebook looks like this in the root of the folder.

Three O’Clock in the Morning

It’s 3am and I have been called out. I can open up the Index Notebook, find the set of queries I want to run and click the run button.

A new workbook opens up, named with my name and the time and I can get to work 🙂 I think it’s neat.

Here’s a video

Thanks Nick.

Maybe you can find him at SQL Bits next year. Did you know that SQL Bits 2020 was announced?

Check out https://sqlbits.com for more details

Create a PowerShell Notebook for Azure Data Studio with PowerShell

The latest update to the ADSNotebook PowerShell module I blogged about here now enables the creation of PowerShell notebooks with PowerShell.

You can install the module with

Install-Module ADSNotebook

or if you have already installed it you can use

Update-Module ADSNotebook

In the latest release, there is an extra parameter for New-AdsWorkBook, -Type, which will accept either SQL or PowerShell.

Create a PowerShell Notebook with PowerShell, Rob

OK!

Here is some code to create a PowerShell Notebook. First we will create some cells using New-AdsWorkBookCell including all the markdown to add images and links. You can find my notebooks which explain how to write the markdown for your notebooks in my GitHub Presentations Repository

$introCelltext = "# Welcome to my Auto Generated PowerShell Notebook

## dbatools
![image](https://user-images.githubusercontent.com/6729780/68845538-7afcd200-06c3-11ea-952e-e4fe72a68fc8.png)  

dbatools is an open-source PowerShell Module for administering SQL Servers.
You can read more about dbatools and find the documentation at [dbatools.io](https://dbatools.io)
"
$SecondCelltext = "### Installation
You can install dbatools from the PowerShell Gallery using `Install-Module dbatools`
"

$thirdcelltext = "Install-Module dbatools"

$fourthCelltext = "### Getting Help
You should always use `Get-Help` to fins out how to use dbatools (and any PowerShell) commands"

$fifthcelltext = "Get-Help Get-DbaDatabase"
$sixthCelltext = "Try a command now. get the name, owner and collation of the user databases on the local instance"
$seventhCellText = "Get-DbaDatabase -SqlInstance localhost -ExcludeSystem | Select Name, Owner, Collation"

$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext
$fourth = New-ADSWorkBookCell -Type Text -Text $fourthCelltext
$fifth = New-ADSWorkBookCell -Type Code -Text $fifthcelltext
$sixth = New-ADSWorkBookCell -Type Text -Text $sixthCelltext
$seventh = New-ADSWorkBookCell -Type Code -Text $seventhCellText

Then we will create a new workbook using those cells

$path = 'C:\temp\dbatools.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third,$fourth,$fifth,$sixth,$Seventh -Type PowerShell

Then, when that code is run, we can open the Notebook and ta-da!

And it is super quick to run as well

UPDATE – Tyler Leonhardt ( t ) from the PowerShell team asked

Challenge accepted. With extra meta, here is the PowerShell to create a PowerShell Notebook which will create a PowerShell Notebook!!
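A hedged sketch of how that can look, reusing the commands from above (the paths are made up):

$innerCode = @'
$cell = New-ADSWorkBookCell -Type Code -Text '"Hello from the generated notebook"'
New-ADSWorkBook -Path "$HOME\Generated.ipynb" -cells $cell -Type PowerShell
'@
$cells = @(
    New-ADSWorkBookCell -Type Text -Text '# A PowerShell Notebook that creates a PowerShell Notebook'
    New-ADSWorkBookCell -Type Code -Text $innerCode
)
New-ADSWorkBook -Path "$HOME\CreateANotebook.ipynb" -cells $cells -Type PowerShell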

Create Azure Data Studio SQL Notebooks with PowerShell

At PASS Summit today I gave a presentation about SQL Notebooks in Azure Data Studio for the DBA. I demo’d the PowerShell module ADSNotebook, which you can also find on GitHub (where I will be glad to take PRs to improve it 🙂 ).

This module contains 3 commands at present:

  • Convert-ADSPowerShellForMarkdown

This will create the markdown link for embedding PowerShell code in a text cell of a SQL Notebook, as described in this blog post.

  • New-ADSWorkBookCell

This command will create a workbook text cell or a code cell for adding to the New-ADSWorkBook command

  • New-ADSWorkBook

This will create a new SQL Notebook using the cell objects created by New-ADSWorkBookCell

Usage

Convert-ADSPowerShellForMarkdown

Convert-ADSPowerShellForMarkdown -InputText "Get-ChildItem" -LinkText 'This will list the files' -ToClipBoard

Converts the PowerShell so that it works with markdown and sets it to the clipboard for pasting into a workbook cell.

New-ADSWorkBookCell

$introCelltext = "# Welcome to my Auto Generated Notebook

## Automation
Using this we can automate the creation of notebooks for our use
"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext

Creates an Azure Data Studio text cell and sets it to a variable for passing to New-AdsWorkBook.

New-ADSWorkBook

$introCelltext = "# Welcome to my Auto     Generated Notebook

## Automation
Using this we can automate the creation of notebooks for our use
"
$SecondCelltext = "## Running code
The next cell will have some code in it for running

## Server Principals
Below is the code to run against your instance to find the server principals that are enabled"

$thirdcelltext = "SELECT Name
FROM sys.server_principals
WHERE is_disabled = 0"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text  -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext

$path = 'C:\temp\AutoGenerated.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third

Creates 3 cells with New-AdsWorkBookCell to add to the workbook, two text ones and a code one, then creates a SQL Notebook with those cells and saves it as C:\temp\AutoGenerated.ipynb.

Installation

You can install this Module from the PowerShell Gallery using

Install-Module ADSNotebook

Compatibility

This module has been tested on Windows PowerShell 5.1, PowerShell Core 6 and PowerShell 7 on Windows 10 and Ubuntu

Demo

PowerShell Notebooks in Azure Data Studio

The latest release of the insiders edition of Azure Data Studio brings the first edition of PowerShell Notebooks!

You can download the latest insiders edition from the link above; it can be installed alongside the stable release.

To access many of the commands available, use F1 to open the command palette (like many of my tips, this also works in Visual Studio Code). You can then start typing to get the command that you want.

You can then hit enter with the command that you want highlighted, use the mouse or use the shortcut which is displayed to the right.

In a new notebook, you can click the drop down next to kernel and now you can see that PowerShell is available

When you choose the PowerShell kernel, you will get a prompt asking you to configure the Python installation

If you have Python already installed you can browse to the location that it is installed or you can install Python. In the bottom pane you will be able to see the progress of the installation.

When it has completed, you will see

You may also get a prompt asking if you would like to upgrade some packages

Again this will be displayed in the tasks pane

Adding PowerShell

To add PowerShell code to the notebook, click the Code button at the top of the file

or the one you can find by highlighting above or below a block

I did not have IntelliSense, but you can easily write your code in Azure Data Studio or Visual Studio Code and paste it into the block.

Interestingly, Shawn Melton ( t ) did.

This was because he had the PowerShell extension installed and I did not (I know!!).
If you find you don’t have IntelliSense, then install the PowerShell extension!

Clicking the play button (which is only visible when you hover the mouse over it) will run the code
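For example, you could paste and run something like this (any PowerShell will do; this just gives some output to look at):

$PSVersionTable.PSVersion
Get-ChildItem -Path $HOME | Select-Object -First 5 Name, LastWriteTime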

You can clear the results from every code block using the clear results button at the top

Otherwise, you can save the results with the Notebook by saving it. This is the part that is missing from running PowerShell in the Markdown blocks in a SQL Notebook as I described here

I am looking forward to how this develops. You can find my sample PowerShell notebook (with the code results) here

Getting SQL Server installation date with PowerShell using dbatools

Most of my writing time at the moment is devoted to Learn dbatools in a Month of Lunches, which is now available, but here is a short post following a question someone asked me.

How can I get the installation date for SQL Server on my estate into a database with dbatools?

You can get the date that SQL Server was installed using the creation date of the NT Authority\System login using T-SQL

SELECT create_date 
FROM sys.server_principals 
WHERE sid = 0x010100000000000512000000

With dbatools

To do this with dbatools you can use the Get-DbaInstanceInstallDate command

Get-DbaInstanceInstallDate -SqlInstance localhost 

More than one instance

If we want to get the installation date for more than one instance we can simply create an array of instances for the SqlInstance parameter

Get-DbaInstanceInstallDate -SqlInstance localhost, localhost\DAVE

Get the Windows installation date too

You can also get the Windows installation date with the IncludeWindows switch

Get-DbaInstanceInstallDate -SqlInstance localhost, localhost\DAVE -IncludeWindows 

Gather your instances

How you get the instances in your estate is going to be different for each reader, but here is an example using Registered Servers from my local registered servers list. You can also use a Central Management Server.

Get-DbaRegisteredServer -Group local 

So we can gather those instances into a variable and pass that to Get-DbaInstanceInstallDate

$SqlInstances = Get-DbaRegisteredServer -Group local 
Get-DbaInstanceInstallDate -SqlInstance $SqlInstances 

Add to database

To add the results of any PowerShell command to a database, you can pipe the results to Write-DbaDbTableData

$SqlInstances = Get-DbaRegisteredServer -Group local 

$writeDbaDataTableSplat = @{
    SqlInstance = 'localhost'
    Table = 'InstallDate'
    Database = 'tempdb'
    Schema = 'dbo'
    AutoCreateTable = $true
}

Get-DbaInstanceInstallDate -SqlInstance $SqlInstances | Write-DbaDbTableData @writeDbaDataTableSplat

This will create a table called InstallDate and put the results of the Get-DbaInstanceInstallDate command into it. Note – if you want to try this code, I would advise using a different database than tempdb!!

It is important to note that the table created may not have the most optimal data types and that you may want to pre-create the table.
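A hedged sketch of pre-creating it (the column names assume the default output of Get-DbaInstanceInstallDate; check the actual output on your estate first):

$createTable = @"
CREATE TABLE dbo.InstallDate (
    ComputerName sysname NOT NULL,
    InstanceName sysname NOT NULL,
    SqlInstance sysname NOT NULL,
    SqlInstallDate datetime2 NOT NULL,
    WindowsInstallDate datetime2 NULL
)
"@
Invoke-DbaQuery -SqlInstance localhost -Database tempdb -Query $createTable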

So there you go, all the installation dates for your estate in a database table. Hope that helps you Jonny.

MEAP MEAP – #dbatoolsMoL – Live Book edition

It’s been a busy time!

As well as many other things, the fantastical BDFL of dbatools, Chrissy Lemaire @cl, and myself have written enough of a chunk of Learn dbatools in a Month of Lunches that our publisher, Manning Publications, have agreed to release it as a MEAP. Not a textbook, this book is written in a fun, conversational style and split up into chapters that you can read in a lunch-time.

It is impossible for me to hear MEAP and not think of this 🙂

but I expect you are wondering what a MEAP is?

What is MEAP?
A book can take a year or more to write, so how do you learn that hot new technology today? The answer is MEAP, the Manning Early Access Program. In MEAP, you read a book chapter-by-chapter while it’s being written and get the final eBook as soon as it’s finished. If you pre-order the pBook, you’ll get it long before it’s available in stores.

https://www.manning.com/meap-program

Basically, to make it easy to get and for those that like to get in early, you can order the book and get the first 4 chapters (three in reality) RIGHT NOW!! (It also means that Chrissy and I have to write the rest of the book – dang, still going to be busy!)

Simply head over to https://beard.media/bookblog and use the code mlsewell and you can get access to the book too.

This will also give you access to the live book.

live book

The live book is fantastic; you can read the whole book from within your browser. See the three icons that appear to the right of the book?

3 little icons (no porridge)

The left hand one enables you to bookmark an important part so that you can come back to it easily using the bookmarks link in the top right

bookmarks

The middle icon enables you to write notes for yourself, maybe ways that you can use the information or maybe comments about an awesome Italian.

Shoes

The last one is the way that you can make comments and engage us, the authors, in conversation: ask questions, request clarification or wonder about Dutch data manglers.

I think it’s down to PII

If you select a piece of text, another menu opens up

The first icon lets you highlight the text, to make it easier to find later

Hover over the highlight and you can choose different colours for different things.

or even create pretty pictures for Mathias

Mathias – Why isn’t he an MVP?

You can choose to annotate, which is sort of like highlighting and writing a note, with the next icon.

When you want to share a link to a particular part of the book with someone else, you can highlight part of it and click the link icon

It’s easy to start PowerShell as another user as long as you remember when to press SHIFT

Which will highlight the paragraph and open a dialogue at the bottom where you can create and copy the link.

By far the most important part for Chrissy and me is the last link. When you find something wrong you can mark it for our attention. Yes, even with Chrissy and I proofreading each other’s words, the fabulous proofreader Cláudio Silva (b | t) and awesome tech editor Mike Shepard (b | t), as well as many community reviewers, there are still, and will continue to be, issues. So when you find them, highlight them and click the rightmost link.

with with more more than than one one

This will open up as shown so that you can fill in what was wrong. (Please don’t report this error again – Shane (b | t) has beaten you to it!)

You will have noticed on social media and elsewhere that we have left some easter eggs in the book

Whenever you find them or whenever you want to talk about the book on social media, please use the hashtag #dbatoolsMoL – you never know what goodies may end up in your inbox.

Oh, and if you have got this far and don’t know what dbatools in a Month of Lunches is, listen to the hair and read more at https://dbatools.io/meap/

Azure SQL Linux VM – configuring SQL, installing pwsh and connecting and interacting with dbatools

In my posts about using Azure DevOps to build Azure resources with Terraform, I built a Linux SQL VM. I used the Terraform in this GitHub repository and created this.

Connecting with MobaXterm

I had set the network security rules to accept connections only from my static IP, using variables in the Build Pipeline. I use MobaXterm as my SSH client. It’s a free download. I click on Sessions,

choose an SSH session and fill in the remote host address from the portal,

fill in the password, and I am connected.

Configuring SQL

The next task is to configure the SQL installation. Following the instructions on the Microsoft docs site, I run mssql-conf to set the sa password.
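For reference, the command is along these lines:

sudo /opt/mssql/bin/mssql-conf set-sa-password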

I enter the sa password and confirm it.

Now to start SQL
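Again for reference, the command is along the lines of:

sudo systemctl start mssql-server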

Installing pwsh

Installing PowerShell Core (pwsh) is easy with snap
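For reference, the documented snap command is:

sudo snap install powershell --classic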

A couple of minutes of downloads and install

and pwsh is ready for use

Installing dbatools

To install dbatools from the PowerShell Gallery simply run
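Install-Module dbatools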

This will prompt you to allow installing from an untrusted repository

and dbatools is ready to go

Connecting with Azure Data Studio

I can also connect with Azure Data Studio

and connect

Just a quick little post explaining what I did 🙂

Happy Linuxing!

Using Azure DevOps Build Pipeline Templates with Terraform to build an AKS cluster

In the last few posts I have moved from building an Azure SQL DB with Terraform using VS Code, to automating the build process for the Azure SQL DB using Azure DevOps Build Pipelines, to using Task Groups in Azure DevOps to reuse the same build process and build an Azure Linux SQL VM and Network Security Group. This evolution is fantastic, but Task Groups can only be used in the same Azure DevOps project. It would be brilliant if I could use Configuration as Code for the Azure Build Pipeline and store that in a separate source control repository which can be used from any Azure DevOps project.

Luckily, you can 😉 You can use Azure DevOps Job Templates to achieve this. There is a limitation at present: you can only use them for Build Pipelines and not Release Pipelines.

The aim of this little blog series was to have a single Build Pipeline stored as code which I can use to build any infrastructure that I want with Terraform in Azure, and to be able to use it anywhere.

Creating a Build Pipeline Template

I created a GitHub repository to hold my Build Templates. Feel free to use them as a base for your own, but please don’t try and use the repo for your own builds.

The easiest way to create a Build Template is to already have a Build Pipeline. This cannot be done from a Task Group but I still have the Build Pipeline from my automating the build process for the Azure SQL DB using Azure DevOps Build Pipelines blog post.

There is a View YAML button. I can click this to view the YAML definition of the Build Pipeline

I copy that and paste it into a new file in my BuildTemplates repository. (I have replaced my Azure Subscription information in the public repository)

Now I can use this YAML as configuration as code for my Build Pipeline 🙂 It can be used from any Azure DevOps project. Once you start looking at the code and the documentation for the YAML schema, you can begin to write your pipelines as YAML, but sometimes it is easier to just create a build pipeline, or even just a job step, in the browser and click the View YAML button!

Create an AKS Cluster with a SQL 2019 container using Terraform and Build templates

I have a GitHub Repository with the Terraform code to build a simple AKS cluster. This could not have been achieved without Richard Cheney’s article. I am not going to explain how it all works for this blog post, or some of the negatives of doing it this way. Instead let’s build an Azure DevOps Build Pipeline to build it with Terraform using Configuration as Code (the YAML file).

I am going to create a new Azure DevOps Build Pipeline and as in the previous posts connect it to the GitHub Repository holding the Terraform code.

This time I am going to choose the Configuration as code template

I am going to give it a name and it will show me that it needs the path to the yaml file containing the build definition in the current repository.

Clicking the 3 ellipses will pop up a file chooser, and I pick the build.yaml file.

The build.yaml file looks like this. The name is the USER/Repository Name and the endpoint is the name of the GitHub service connection in Azure DevOps. The template value is the name of the build YAML file @ the name given for the repository value.
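A hedged sketch of the shape of that file (the repository and service connection names here are made up; use your own):

resources:
  repositories:
    - repository: BuildTemplates              # the name given for the repository value
      type: github
      name: SQLDBAWithABeard/BuildTemplates   # USER/Repository Name (hypothetical)
      endpoint: SQLDBAWithABeardGitHub        # GitHub service connection name (hypothetical)

jobs:
  - template: Build.yaml@BuildTemplates       # build yaml file @ repository value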

You can find (and change) your GitHub service connection name by clicking on the cog bottom left in Azure DevOps and clicking service connections

I still need to create my variables for my Terraform template (perhaps I can now just leave those in my code?). For the AKS Cluster build, right now I have to add presentation, location, ResourceGroupName, AgentPoolName, ServiceName, VMSize and agent_count.

Then I click save and queue and the job starts running

If I want to edit the pipeline it looks a little different

The variables and triggers can be found under the 3 ellipses on the top right

It also defaults the trigger to automatic deployment.

It takes a bit longer to build

and when I get the Terraform code wrong and the build fails, I can just alter the code, commit it and push, and a new build will start; Terraform will work out what is already built and what needs to be built!

but eventually the job finishes successfully

and the resources are built

and in Visual Studio Code with the Kubernetes extension installed I can connect to the cluster by clicking the 3 ellipses and Add Existing Cluster

I choose Azure Kubernetes Services and click next

Choose my subscription and then add the cluster

and then I can explore my cluster

I can also see the dashboard by right clicking on the cluster name and Open Dashboard

Right clicking on the service name and choosing describe

shows the external IP address, which I can put into Azure Data Studio and connect to my container

So now I can source control my Build Job Steps and hold them in a central repository. I can make use of them in any project. This gives me much more control and saves me from repeating myself repeating myself. The disadvantage is that there is no handy warning when I change the underlying Build Repository that I will be affecting other Build Pipelines, and there is no easy method to see which Build Pipelines are dependent on the build YAML file.

Happy Automating

Using the same Azure DevOps build steps for Terraform with different Pipelines with Task Groups to build an Azure Linux SQL VM

In my last post I showed how to build an Azure DevOps Pipeline for a Terraform build of an Azure SQL DB. This will take the Terraform code and build the required infrastructure.

The plan all along has been to enable me to build different environments depending on the requirement. Obviously, I can repeat the steps from the last post for a new repository containing Terraform code for a different environment, but

If you are going to do something more than once Automate It

who first said this? Anyone know?

The build steps for building the Terraform are the same each time (if I keep a standard folder and naming structure), so it would be much more beneficial if I could keep them in a single place and any alterations to the process would only need to be made in that one place 🙂

Task Groups

Azure DevOps has task groups. On the Microsoft Docs web-page they are described as


A task group allows you to encapsulate a sequence of tasks, already defined in a build or a release pipeline, into a single reusable task that can be added to a build or release pipeline, just like any other task.


https://docs.microsoft.com/en-us/azure/devops/pipelines/library/task-groups?view=azure-devops

If you are doing this with a more complicated existing build pipeline, it is important that you read the Before You Create A Task Group section on the docs page. This will save you time when trying to understand why variables are not available. (Another grey hair on my beard!)

Creating A Task Group

Here’s the thing: creating a task group is so easy it should be the default way you create Azure DevOps Pipelines. Let me walk you through it.

I will use the Build Pipeline from the previous post. Click edit from the build page

Then CTRL and click to select all of the steps

Right click and there’s a Create Task Group button to click!

You can see that it has helpfully added the values for the parameters it requires for the location, Storage Account and the Resource Group.

Remember the grey beard hair above? We need to change those values to use the variables that we will add to the Build Pipeline, using the $(VariableName) syntax.

Once you have done that, click Create.

This will also alter the current Build Pipeline to use the Task Group. Now we have a Task Group that we can use in any build pipeline in this project.

Using the Task Group with a new Build Pipeline to build an Azure Linux SQL VM

Let’s re-use the build steps to create an Azure Linux SQL VM. First I created a new GitHub Repository for my Terraform code. Using the docs, I created the Terraform to create a resource group, a Linux SQL VM, a virtual network, a subnet, a NIC for the VM, a public IP for the VM, and a network security group with two rules, one for SQL and one for SSH. It will look like this.

The next step is to choose the repository

again we are going to select Empty job (although the next post will be about the Configuration as Code 🙂)

As before we will name the Build Pipeline and the Agent Job Step and click the + to add a new task. This time we will search for the Task Group name that we created

I need to add in the variables from the variable.tf in the code, and also those for the Task Group.

and when I click save and queue

It runs for less than 7 minutes

and when I look in the Azure portal

and I can connect in Azure Data Studio

Altering The Task Group

You can find the Task Groups under Pipelines in your Azure DevOps project

Click on the Task Group that you have created; you can then alter and edit it if required and click save.

This will warn you that any changes will affect all pipelines and task groups that are using this task group. To find out what will be affected, click on References, which will show you every pipeline and task group that uses it.

Now I can run the same build steps for any Build Pipeline and alter them all in a single place using Task Groups, simplifying the administration of the Build Pipelines.

The next post will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

The first post showed how to build an Azure SQLDB with Terraform using VS Code

The second post showed how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

Happy Automating!