> Can I use this to run applications as my admin account?
A user with a beard
It is good practice not to log into your workstation with an account that has admin privileges. In many shops, you will need to open applications that perform administration tasks with a separate set of account credentials.
Unfortunately, people being people, they will often store their admin account credentials in a less than ideal manner (OneNote, Notepad++, etc.) to make it easier to copy and paste the password when they right-click and run as a different user.
Use the Secret Management module
Again, I decided to use a notebook to show this, as it is a fantastic way to share code and results, and because it means that anyone can try it out.
The notebook may not render on a mobile device.
Using the notebook, I can quickly and safely store my admin password and then open and run applications using that credential.
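Here is a minimal sketch of the idea using the Microsoft.PowerShell.SecretManagement and Microsoft.PowerShell.SecretStore modules; the vault name, the account name and ssms.exe are my examples, not requirements:

```powershell
# One-time setup: install the modules and register a default vault
Install-Module Microsoft.PowerShell.SecretManagement, Microsoft.PowerShell.SecretStore
Register-SecretVault -Name AdminVault -ModuleName Microsoft.PowerShell.SecretStore -DefaultVault

# Store the admin password once, prompted for securely rather than pasted from Notepad++
Set-Secret -Name AdminPassword -Secret (Read-Host -AsSecureString -Prompt 'Admin password')

# Build a credential and run an application as the admin account
$cred = [pscredential]::new('THEBEARD\AdminRob', (Get-Secret -Name AdminPassword))
Start-Process -FilePath ssms.exe -Credential $cred   # assumes ssms.exe is on the PATH
```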
Azure Data Studio is a great tool for connecting with your data platform whether it is in Azure or on your hardware. Jupyter Notebooks are fantastic: you can have words, pictures, code and code results all saved in one document.
Another thing that you can do with notebooks is run them as Agent Jobs and save the results of the run.
Notebooks running T-SQL
This works easily for T-SQL notebooks. I am going to use one that I created which uses T-SQL to gather permissions, based on some old code that was in a share somewhere. We can run the notebook to get the permissions and save it, and the results will be available for all time (unless you delete the notebook!)
SQL Agent Extension in Azure Data Studio
In Azure Data Studio, if you press CTRL + SHIFT + X it will open the Extensions tab
You can add extra functionality to Azure Data Studio. Search in the top bar for Agent and press the install button to install the extension. You can connect to an instance in the connections tab (CTRL + SHIFT + D), right click on it and click Manage. This will open up the server dashboard (why isn't it instance dashboard?)
and you will also have the SQL Agent dashboard available
It's pretty neat: it has green and red bars against the jobs showing success or failure, and the larger the bar the longer the run time. On the left you will see a book icon. Click that.
Notebooks in Agent Jobs
You can create an Agent Job to run a notebook. As a notebook is just a JSON file, it can be stored in a database table. This interface will create two tables: one to store the templates and one for the results. Click New Notebook Job.
Then navigate to the notebook and select it.
Choose a database for the storage of the template and the results and one for the execution context.
The name of the job will be the file name of the notebook. You can change this, but there is a bug where you can only enter one character at a time in the name before it changes focus, so beware!
Once the job is created, you will see two tables in the storage database: notebooks.nb_materialized and notebooks.nb_template.
The materialised table is empty right now
but the template table has a row for the job which includes the notebook in json format.
If you click on the jobs in the Notebook Jobs window in the SQL Agent extension, you can see more information about the job run
You can also run the job from here. It doesn't have to be run from here; it is just a normal agent job which you can run or schedule in any normal manner. Running it from here gives a pop-up.
You have to refresh to see when the job has finished. It will be red if the job failed, green if it succeeded, or orange if some cells failed, like this!
But this is the good bit. Clicking on that icon will open the notebook that was created by that agent job run. Let's see what we get.
You can see that we have the results of the queries that we wrote in the notebook alongside the documentation (or maybe explanation of the expected results). If we scroll down a little (and change the theme colour so that you can see the error):
```
Msg , Level , State , Line
Duplicate column names are not permitted in SQL PowerShell. To repeat a column, use a column alias for the duplicate column in the format Column_Name AS New_Name.
```
We have got an error from running the code via SQL PowerShell, which is how the job is run. This error is also inserted into the notebooks.nb_template table.
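If you want to see the sort of query that triggers this, here is a hedged repro with Invoke-Sqlcmd, which is SQL PowerShell under the hood (the instance name is an assumption):

```powershell
# Fails: the result set contains two columns called name
Invoke-Sqlcmd -ServerInstance localhost -Query "SELECT name, name FROM sys.databases"

# Works: alias the duplicate column, as the error message suggests
Invoke-Sqlcmd -ServerInstance localhost -Query "SELECT name, name AS DatabaseName FROM sys.databases"
```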
I edited the notebook locally to remove that block of code
Then edited the job and selected the updated notebook
and re-ran the job and got a green tick.
Now I can open the notebook from the latest run, but notice that from this view I can also open the previous notebook.
If I look in the nb_template table, the last_run_notebook_error column has cleared,
and if I look in the notebooks.nb_materialized table I can see two rows, one for each job run. The error from the first run is also stored in this table. The notebook column has the JSON for the notebook if you wish to access it in a different manner.
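If you prefer a query to the GUI, something like this hedged example will show the run history; the storage database name is whatever you chose when you created the job:

```powershell
# Inspect the materialised runs; each row holds a complete notebook as JSON
Invoke-Sqlcmd -ServerInstance localhost -Database NotebookStorage -Query "SELECT * FROM notebooks.nb_materialized"
```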
Tomorrow, we will see what the job steps look like and how to make this run on an instance which does not and cannot have the required PowerShell.
I enjoy maintaining open source GitHub repositories such as dbachecks and ADSNotebook. I absolutely love it when people add more functionality to them.
To collaborate on a repository in GitHub you need to follow these steps:
Fork the repository into your own GitHub
Clone the repository to your local machine
Create a new branch for your changes
Make some changes and commit them with useful messages
Push the changes to your repository
Create a Pull Request from your repository back to the original one
You will need to have git.exe available, which you can download and install from https://git-scm.com/downloads if required.
Fork the repository into your own GitHub
A fork is a copy of the original repository. This allows you to make changes without affecting the original project. It does not get updated when the original project gets updated (we will talk about that in the next post). This enables you to code a new feature or a bug fix, test it locally and make sure it is working.
Let's take dbachecks as our example. Start by going to the project in GitHub. In this case the URL is https://github.com/sqlcollaborative/dbachecks. You will see a Fork button at the top right of the page.
When you click the button the repository is copied into your own GitHub account
Forking the repository has created a remote repository stored on the GitHub servers. Now that the repository has been forked you need to clone it to your local machine to create a local repository so that you can start coding your amazing fix. When you have finished you can then sync it back to your remote repository ready for a Pull Request back to the original repository.
In your browser, at your remote repository that you just created (https://github.com/YOURGITHUBUSERNAME/NameOfRepository if you have closed the page), click on Clone or Download and then the icon to the right to copy the URL.
You can clone your repository in VS Code or Azure Data Studio by clicking F1 or CTRL + SHIFT + P in Windows or Linux and ⇧⌘P or F1 on a Mac
then start typing clone until you see Git: Clone and press Enter or click it
Paste in the URL that you just copied and press Enter. A dialog will open asking you to select a folder. This is the parent directory where your local repository will be created. The clone will create a directory for your repository so you do not need to. I suggest that you use a folder called GitHub or something similar to hold all of the repositories that you are going to clone and create.
When it has finished it will ask you if you wish to open the repository
If you click Open it will close anything that you already have open and open the folder. If you click Add to Workspace it will add the folder to the workspace and leave everything you already had open as it was and, surprisingly, clicking Open in New Window will open the folder in a new instance of Visual Studio Code or Azure Data Studio!
and you will also be able to see the local repository files on your computer
You can clone the repository at the command line if you wish by navigating to your local GitHub directory and running git clone TheURLYouCopied
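For example, substituting your own GitHub user name:

```powershell
# Clone the fork into a parent GitHub folder; git creates the dbachecks directory for you
Set-Location C:\GitHub
git clone https://github.com/YOURGITHUBUSERNAME/dbachecks.git
```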
Now that your local repository has been created, it's time to do your magic coding.
Create a new branch for your changes
It is a good idea to create a branch for your amazing new feature. This enables you to work on coding for that feature in isolation. It has the added advantage that if you mess it right royally up, you can just delete that branch and start again with a new one!
To create a branch in VS Code or Azure Data Studio you can click on the branch name at the bottom left.
Or open the Command Palette and type Branch until you see Git: Create Branch
You will be prompted for a branch name
I like to choose a name that relates to the code that I am writing, like configurable_engine or removeerroringexample. You can see the name of the branch in the bottom left so that you always know which branch you are working on.
The icon shows that the branch is only local and hasn’t been pushed (published) to the remote repository yet
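If you prefer the command line, the equivalent is:

```powershell
# Create a new branch and switch to it in one step
git checkout -b configurable_engine
```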
Make some changes and commit them with useful messages
Now you can start writing your code for your awesome new feature, bug fix or maybe just documentation improvement. Keep your commits small and give them useful commit messages that explain why you have made the change, as the diff tooling will show what change you have made.
Write your code or change the documentation, save the file and in Visual Studio Code or Azure Data Studio you will see that the source control icon has a number on it
Clicking on the icon will show the files that have changes ready
You can write your commit message in the box and press CTRL + ENTER to commit your changes with a message.
If you want to do this at the command line, you can use git status to see which files have changes
You will need to git add . or git add .\pathtofile to stage your changes ready for committing, and then git commit -m 'Commit Message' to commit them.
Notice that I did exactly what I just said not to do! A better commit message would have been 'So that people can find the guide to forking and creating a PR'.
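Putting that together at the command line (the file path is just an example):

```powershell
git status                       # see which files have changed
git add .\docs\Forking.md        # stage a specific file
git commit -m 'So that people can find the guide to forking and creating a PR'
```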
Push the changes to your repository
You only have the changes that you have made in your local repository on your computer. Now you need to push those changes to GitHub, your remote repository. You can click on the publish icon.
You will get a pop-up asking you if you wish to stage your changes. I click Yes and never Always so that I can use this prompt as a sanity check that I am doing the right thing
At the command line you can push the branch; if you do that, you will have to tell git where the branch needs to go. If you just type git push it will helpfully tell you:
```
fatal: The current branch AwesomeNewFeature has no upstream branch.
To push the current branch and set the remote as upstream, use

    git push --set-upstream origin AwesomeNewFeature
```
So you will need to use that command
You can see in the bottom left that the icon has changed
and if you read the output of the git push command you will also see what the next step is.
Create a Pull Request from your repository back to the original one
You can CTRL click the link in the git push output if you have pushed from the command line, or if you visit either your repository or the original repository in your browser you will see that there is a Compare and Pull Request button.
You click that and let GitHub do its magic
and it will create a Pull Request for you ready for you to fill in the required information, ask for reviewers and other options. Once you have done that you can click Create pull request and wait for the project maintainer to review it and (hopefully) accept it into their project
If you make more changes to the code in the same branch in your local repository and push them, they will automatically be added to this Pull Request whilst it is open. You can do this if the maintainer or reviewer asks for changes.
Shane has asked for a change
So I can go to my local repository in Azure Data Studio, make the requested change and save the file. If I look in the source control in Azure Data Studio I can again see there is a change waiting to be committed, and if I click on the name of the file I can open the diff tool to see what the change was.
Once I am happy with my change I can commit it again in the same way as before either in the editor or at the command line. The icon at the bottom will change to show that I have one commit in my local repository waiting to be pushed
To do the same thing at the command line I can type git status and see the same thing.
I can then push my change to my remote repository either in the GUI or by using git push
and it will automatically be added to the Pull Request as you can see
Now that the required changes for the review have been made, the review has been approved by Shane and the pull request is now ready to be merged. (You can also see that dbachecks runs some checks against the code when a Pull Request is made)
Many, many thanks to Shane b | t who helped with the writing of this post even whilst on a “no tech” holiday.
Go Ahead – Contribute to an Open Source Project
Hopefully you can now see how easy it is to create a fork of a GitHub repository, clone it to your own machine and contribute. There are many open source projects that you can contribute to.
You can use this process to contribute to the Microsoft Docs for example by clicking on the edit button on any page.
You can contribute to other open source projects in the same way.
I showed my silly example PowerShell code to create a PowerShell Notebook that created a PowerShell Notebook to my good friend Nick.
Nick is a fantastic, clever DBA who isn’t active on social media, which is a great shame as if he had time to share some of his fantastic work we would all benefit. He looked at that code and less than an hour later, came back to me with this code and idea which I have replicated here with his permission.
Thanks Nick.
The Use Case
The use case that Nick has is that he is converting some troubleshooting runbooks from their original locations (you know the sort of places – SharePoint Docs, OneNote Notebooks, Shared Folders, the desktop of the Bastion Host) into a single repository of Azure Data Studio SQL or PowerShell Notebooks.
The idea is to have a single entry point into the troubleshooting steps and for the on-call DBA to create a Notebook from a template for the issue at hand which could be attached to an incident in the incident management solution. I suppose you could call it an Index Notebook.
Workflow
When the DBA (or another team) opens this Notebook, they can choose the task that they are going to perform and click the link, which will (as sketched below):
copy the Notebook to the local machine
rename the Notebook with the username and date
open it ready for the work.
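A hedged sketch of what each link might run behind the scenes; the paths, the file name convention and the azuredatastudio command-line launcher are all assumptions for illustration:

```powershell
# Copy the template Notebook out, rename it with the user and date, then open it
$template = '\\FileShare\Notebooks\HighCPU.ipynb'                    # example template path
$newName  = '{0}-{1:yyyyMMddHHmm}-HighCPU.ipynb' -f $env:USERNAME, (Get-Date)
$workCopy = Join-Path -Path $env:TEMP -ChildPath $newName
Copy-Item -Path $template -Destination $workCopy
azuredatastudio $workCopy   # assumes the Azure Data Studio CLI is on the PATH
```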
Once the work has been completed, the DBA can then attach the Notebook to the task or incident that has been created, or use it in the Wash-Up/Post-Incident meeting.
This ensures that the original template notebook stays intact and unchanged, and it is easy (which is always good when you are called out at 3am!) to create a uniquely named notebook.
Azure DevOps
Nick has placed this code into the deploy step in Azure DevOps which will deploy the template Notebooks from source control into the common folder and then this code will dynamically create the index Notebook each time there is a release.
Whilst the initial use case is incident response, this could easily be adapted for Notebooks used for Common Tasks or Run Books.
Notebooks
There are a number of Notebooks for different issues stored in directories. For this post, I have used the Notebooks from Microsoft that explain SQL 2019 features and troubleshooting, which you can find in their GitHub repositories by following this link.
The Azure DevOps pipeline deploys the Notebooks to a directory, which then looks something like this:
Some directories of Notebooks in a directory
Create an Index Notebook
Here is the code to create an index Notebook
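Nick's script itself is more sophisticated, so treat this as a hedged, simplified sketch of the shape of it: it walks the directories, adds a Text cell per folder and a collapsed Code cell per Notebook that copies the template out and opens it (without the unique renaming). The $root path, the output file name and the azuredatastudio launcher are assumptions:

```powershell
# Build an index Notebook over a folder of Notebook directories
$root  = 'C:\Notebooks'
$cells = @(New-ADSWorkBookCell -Type Text -Text '# Troubleshooting Index')
foreach ($dir in Get-ChildItem -Path $root -Directory) {
    $cells += New-ADSWorkBookCell -Type Text -Text ('## {0}' -f $dir.Name)
    foreach ($nb in Get-ChildItem -Path $dir.FullName -Filter *.ipynb) {
        # generate the code that the cell will contain when it is run
        $code = "Copy-Item -Path '{0}' -Destination `$env:TEMP`nazuredatastudio (Join-Path `$env:TEMP '{1}')" -f $nb.FullName, $nb.Name
        $cells += New-ADSWorkBookCell -Type Code -Text $code -Collapse
    }
}
New-ADSWorkBook -Path (Join-Path $root 'Index.ipynb') -cells $cells -Type PowerShell
```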
This creates a Notebook in the root of the folder. It also uses the new -Collapse parameter in New-AdsWorkBookCell that creates the code blocks with the code collapsed so that it looks neater. The index Notebook looks like this in the root of the folder:
Three O’Clock in the Morning
It’s 3am and I have been called out. I can open up the Index Notebook, find the set of queries I want to run and click the run button.
A new workbook opens up, named with my name and the time, and I can get to work 🙂 I think it's neat.
Here’s a video
Thanks Nick.
Maybe you can find him at SQL Bits next year. Did you know that SQL Bits 2020 was announced?
The latest update to the ADSNotebook PowerShell module I blogged about here now enables the creation of PowerShell notebooks with PowerShell.
You can install the module with `Install-Module ADSNotebook`, or if you have already installed it you can use `Update-Module ADSNotebook`.
In the latest release, there is an extra -Type parameter for New-AdsWorkBook which will accept either SQL or PowerShell.
Create a PowerShell Notebook with PowerShell, Rob
OK!
Here is some code to create a PowerShell Notebook. First we will create some cells using New-AdsWorkBookCell including all the markdown to add images and links. You can find my notebooks which explain how to write the markdown for your notebooks in my GitHub Presentations Repository
```powershell
$introCelltext = "# Welcome to my Auto Generated PowerShell Notebook

## dbatools

dbatools is an open-source PowerShell Module for administering SQL Servers.
You can read more about dbatools and find the documentation at [dbatools.io](dbatools.io)
"
$SecondCelltext = "### Installation
You can install dbatools from the PowerShell Gallery using `Install-Module dbatools`
"
$thirdcelltext = "Install-Module dbatools"
$fourthCelltext = "### Getting Help
You should always use `Get-Help` to find out how to use dbatools (and any PowerShell) commands"
$fifthcelltext = "Get-Help Get-DbaDatabase"
$sixthCelltext = "Try a command now. Get the name, owner and collation of the user databases on the local instance"
$seventhCellText = "Get-DbaDatabase -SqlInstance localhost -ExcludeSystem | Select Name, Owner, Collation"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext
$fourth = New-ADSWorkBookCell -Type Text -Text $fourthCelltext
$fifth = New-ADSWorkBookCell -Type Code -Text $fifthcelltext
$sixth = New-ADSWorkBookCell -Type Text -Text $sixthCelltext
$seventh = New-ADSWorkBookCell -Type Code -Text $seventhCellText
```
Then we will create a new workbook using those cells
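Something like this will do it, using the new -Type parameter from the latest release (the output path is my example):

```powershell
# Create the PowerShell Notebook from the seven cells above
$path = 'C:\temp\dbatools-intro.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third,$fourth,$fifth,$sixth,$seventh -Type PowerShell
```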
Convert-ADSPowerShellForMarkdown
This will create the markdown link for embedding PowerShell code in a Text Cell for a SQL Notebook as described in this blog post
New-ADSWorkBookCell
This command will create a workbook text cell or a code cell for adding to the New-ADSWorkBook command
New-ADSWorkBook
This will create a new SQL Notebook using the cell objects created by New-ADSWorkBookCell
Usage
Convert-ADSPowerShellForMarkdown
```powershell
Convert-ADSPowerShellForMarkdown -InputText "Get-ChildItem" -LinkText 'This will list the files' -ToClipBoard
```
Converts the PowerShell so that it works with MarkDown and sets it to the clipboard for pasting into a workbook cell
New-ADSWorkBookCell
```powershell
$introCelltext = "# Welcome to my Auto Generated Notebook
## Automation
Using this we can automate the creation of notebooks for our use
"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
```
Creates an Azure Data Studio Text cell and sets it to a variable for passing to New-AdsWorkBook
New-ADSWorkBook
```powershell
$introCelltext = "# Welcome to my Auto Generated Notebook
## Automation
Using this we can automate the creation of notebooks for our use
"
$SecondCelltext = "## Running code
The next cell will have some code in it for running
## Server Principals
Below is the code to run against your instance to find the server principals that are enabled"
$thirdcelltext = "SELECT Name
FROM sys.server_principals
WHERE is_disabled = 0"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext
$path = 'C:\temp\AutoGenerated.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third
```
Creates 3 cells with New-AdsWorkBookCell to add to the workbook, two text ones and a code one, then creates a SQL Notebook with those cells and saves it as C:\temp\AutoGenerated.ipynb.
Installation
You can install this Module from the PowerShell Gallery using `Install-Module ADSNotebook`.
Compatibility
This module has been tested on Windows PowerShell 5.1, PowerShell Core 6 and PowerShell 7 on Windows 10 and Ubuntu.
You can download the latest insiders edition from the link above; it can be installed alongside the stable release.
To access many of the commands available use F1 to open the command palette (like many of my tips this also works in Visual Studio Code). You can then start typing to get the command that you want.
You can then hit enter with the command that you want highlighted, use the mouse or use the shortcut which is displayed to the right.
In a new notebook, you can click the drop-down next to Kernel and now you can see that PowerShell is available.
When you choose the PowerShell kernel, you will get a prompt asking you to configure the Python installation
If you have Python already installed you can browse to the location that it is installed or you can install Python. In the bottom pane you will be able to see the progress of the installation.
When it has completed, you will see
You may also get a prompt asking if you would like to upgrade some packages
Again this will be displayed in the tasks pane
Adding PowerShell
To add PowerShell Code to the notebook click the Code button at the top of the file
or the one you can find by highlighting above or below a block
I did not have intellisense, but you can easily write your code in Azure Data Studio or Visual Studio Code and paste it into the block.
This was because he had the PowerShell extension installed and I did not (I know!!). If you find you don't have intellisense, then install the PowerShell extension!
Clicking the play button (which is only visible when you hover the mouse over it) will run the code
You can clear the results from every code block using the clear results button at the top
Otherwise, you can save the results with the Notebook by saving it. This is the part that is missing from running PowerShell in the Markdown blocks in a SQL Notebook as I described here
I am looking forward to how this develops. You can find my sample PowerShell notebook (with the code results) here
I have done a lot of writing in the last few months but you see no blog posts! My wonderful friend Chrissy and I are writing “dbatools in a Month of Lunches” to be published by Manning. That has taken up a lot of my writing mojo. We have hit a little break whilst we have some reviews done ready for the MEAP (For everyone who asks, the answer is the unfulfilling ‘soon’) so it’s time for a blog post!
SQL Notebooks are cool
I have had a lot of fun with SQL Notebooks recently. I have presented a session about them at a couple of events this month DataGrillen and SQL Saturday Cork. Here is a little snippet
Yes, you can run PowerShell in a SQL Notebook in Azure Data Studio just by clicking a link in the markdown cell. This opens up a lot of excellent possibilities.
I have had several discussions about how SQL Notebooks can be used by SQL DBAs within their normal everyday roles. (Mainly because I don’t really understand what the sorcerers of data science do with notebooks!). I have helped clients to look at some of their processes and use SQL Notebooks to help with them. Creating Disaster Recovery or Change Run-books or Incident Response Templates or using them for product demonstrations. Of course, I needed to use PowerShell in that 🙂
I have really enjoyed working out how to run PowerShell in the markdown in a SQL Notebook in Azure Data Studio and I think Anthony the kubernetes magician did too!
First, before I go any further, I must say this. I was at the European PowerShell Conference when I was working this out and creating my sessions and I said the words
“Cool, I can click a link and run PowerShell, this is neat”
A Beardy fellow in Hannover
This stopped some red team friends of mine in their tracks and they said “Show me”. One of them was rubbing their hands with glee! You can imagine the sort of wicked, devious things that they were immediately considering doing.
Yes, it's funny, but it also carries a serious warning. Without understanding what it is doing, please don't enable PowerShell to be run in a SQL Notebook that someone sent you in an email or that you find on GitHub. In the same way as you don't open the Word document attachment which will get a thousand million trillion pounddollars into your bank account, or run code you copy from the internet on production without understanding what it does, this could be a very dangerous thing to do.
With that warning out of the way, there are loads of really useful and fantastic use cases for this. SQL Notebooks make great run-books or incident response recorders and PowerShell is an obvious tool for this. (If only we could save the PowerShell output in a SQL Notebook, this would be even better)
How on earth did you work this out?
Someone asked me how I worked it out. I didn't! It began with Vicky Harp, PM lead for the SQL Tools team at Microsoft.
I then went and looked at Kevin Cunnane's notebook. Kevin is a member of the tools team working on Azure Data Studio. With SQL Notebooks, you can double-click the markdown cell and see the code that is behind it. To understand how it is working, let's deviate a little.
Keyboard Shortcuts
If you click the cog at the bottom left of Azure Data Studio and choose Keyboard Shortcuts,
you can make Azure Data Studio (and Visual Studio Code) work exactly how you want it to. Typing in the top box will find a command and you can then set the shortcuts that you want to use to save yourself time.
This also enables you to see the command that is called when you use a keyboard shortcut. For example, you can see that for the focus terminal command it says workbench.action.terminal.focus.
It turns out that you can call this as a link in a Markdown document using HTML with <a href=""> and adding command: prior to the command text. When the link is clicked the command will run. Cool 🙂
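For example, using the focus-terminal command we just looked at, a link like this in a markdown cell will focus the terminal when clicked:

<a href="command:workbench.action.terminal.focus">Focus the terminal</a>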
For this to be able to work (you read the warning above?) you need to set the Notebook to be trusted by clicking this button.
This will allow any command to be run. Of course, people with beards will helpfully advise when this is required for a SQL Notebook. (Safe to say people attempting nefarious actions will try the same with your users)
Now that we know how to run an Azure Data Studio command using a link in a markdown cell the next step is to run a PowerShell command. I headed to the Visual Studio Code documentation and found
> Send text from a keybinding
> The workbench.action.terminal.sendSequence command can be used to send a specific sequence of text to the terminal, including escape sequences.
That’s the command we need, however, we still need to craft the command so that it will work as a link. It needs to be converted into a URL.
I started by using this website https://www.url-encode-decode.com/ to do this. This is also how you can check the code in other people's notebooks: use the decode capability.
So I can just put that code into the href link and bingo!
If only it was that easy!!
Some Replacing is required
The + needs to be replaced with a space or %20. You also need to double the \ and replace the %3A with a :. The " needs to be replaced with \u022, the ' with \u027, and the curly braces won't work unless you remove the %0D%0A. Got all that? Good!
Once you have written your PowerShell, encoded it, performed the replacements, you add \u000D at the end of the code to pass an enter to run the code and then place all of that into a link like this
<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 PLACE THE ENCODED CODE HERE %22%7D">Link Text</a>
This means that if you want to add the PowerShell code to set a location and then list the files and folders in that location to a Markdown cell using PowerShell like this
```powershell
Set-Location C:\dbachecks
Get-ChildItem
```
You would end up with a link like this
<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 Set-Location C:%5C%5Cdbachecks \u000D Get-ChildItem \u000D %22%7D">Set Location and list files</a>
This will take a PowerShell command and turn it into a link that will work in an Azure Data Studio markdown cell. It's not magic, it's PowerShell. There is a -ToClipboard parameter which will copy the code to the clipboard ready for you to paste into the cell (on Windows machines only).
Giants
There are many uses for this but here’s one I think is cool.
The link below will go to a notebook, which will show you how the giants upon whose shoulders I stand have enabled me to create a SQL Notebook with a link which will run some PowerShell to create a SQL Notebook which will have all of the Diagnostic Queries in it.
You could possibly use something like it for your incident response SQL Notebook.
It’s also cool that GitHub renders the notebook in a browser (You can’t run PowerShell or T-SQL from there though, you need Azure Data Studio!)
I had set the Network security rules to accept connections only from my static IP using variables in the Build Pipeline. I use MobaXterm as my SSH client. It's a free download. I click on sessions.
Choose an SSH session and fill in the remote host address from the portal,
then fill in the password.
Configuring SQL
The next task is to configure the SQL installation. Following the instructions on the Microsoft docs site I run
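(presumably the documented mssql-conf setup step; the exact command here is my guess):

```bash
# Run in the SSH session to set the edition and SA password
sudo /opt/mssql/bin/mssql-conf setup
```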
Luckily, you can 😉 You can use Azure DevOps Job Templates to achieve this. There is a limitation at present, you can only use them for Build Pipelines and not Release Pipelines.
The aim of this little blog series was to have a single Build Pipeline stored as code which I can use to build any infrastructure that I want with Terraform in Azure and be able to use it anywhere
Creating a Build Pipeline Template
I created a GitHub repository to hold my Build Templates, feel free to use them as a base for your own but please don’t try and use the repo for your own builds.
There is a View YAML button. I can click this to view the YAML definition of the Build Pipeline
I copy that and paste it into a new file in my BuildTemplates repository. (I have replaced my Azure Subscription information in the public repository)
```yaml
jobs:
- job: Build
  pool:
    name: Hosted VS2017
    demands: azureps
  steps:
  - task: AzureCLI@1
    displayName: 'Azure CLI to deploy azure storage for backend'
    inputs:
      azureSubscription: 'PUTYOURAZURESUBNAMEHERE'
      scriptLocation: inlineScript
      inlineScript: |
        # the following script will create Azure resource group, Storage account and a Storage container which will be used to store terraform state
        call az group create --location $(location) --name $(TerraformStorageRG)
        call az storage account create --name $(TerraformStorageAccount) --resource-group $(TerraformStorageRG) --location $(location) --sku Standard_LRS
        call az storage container create --name terraform --account-name $(TerraformStorageAccount)
  - task: AzurePowerShell@3
    displayName: 'Azure PowerShell script to get the storage key'
    inputs:
      azureSubscription: 'PUTYOURAZURESUBNAMEHERE'
      ScriptType: InlineScript
      Inline: |
        # Using this script we will fetch the storage key which is required in the terraform file to authenticate the backend storage account
        $key=(Get-AzureRmStorageAccountKey -ResourceGroupName $(TerraformStorageRG) -AccountName $(TerraformStorageAccount)).Value[0]
        Write-Host "##vso[task.setvariable variable=TerraformStorageKey]$key"
      azurePowerShellVersion: LatestVersion
  - task: qetza.replacetokens.replacetokens-task.replacetokens@3
    displayName: 'Replace tokens in terraform file'
    inputs:
      rootDirectory: Build
      targetFiles: |
        **/*.tf
        **/*.tfvars
      tokenPrefix: '__'
      tokenSuffix: '__'
  - powershell: |
      Get-ChildItem .\Build -Recurse
      Get-Content .\Build\*.tf
      Get-Content .\Build\*.tfvars
      Get-ChildItem Env: | select Name
    displayName: 'Check values in files'
    enabled: false
  - task: petergroenewegen.PeterGroenewegen-Xpirit-Vsts-Release-Terraform.Xpirit-Vsts-Release-Terraform.Terraform@2
    displayName: 'Initialise Terraform'
    inputs:
      TemplatePath: Build
      Arguments: 'init -backend-config="0-backend-config.tfvars"'
      InstallTerraform: true
      UseAzureSub: true
      ConnectedServiceNameARM: 'PUTYOURAZURESUBNAMEHERE'
  - task: petergroenewegen.PeterGroenewegen-Xpirit-Vsts-Release-Terraform.Xpirit-Vsts-Release-Terraform.Terraform@2
    displayName: 'Plan Terraform execution'
    inputs:
      TemplatePath: Build
      Arguments: plan
      InstallTerraform: true
      UseAzureSub: true
      ConnectedServiceNameARM: 'PUTYOURAZURESUBNAMEHERE'
  - task: petergroenewegen.PeterGroenewegen-Xpirit-Vsts-Release-Terraform.Xpirit-Vsts-Release-Terraform.Terraform@2
    displayName: 'Apply Terraform'
    inputs:
      TemplatePath: Build
      Arguments: 'apply -auto-approve'
      InstallTerraform: true
      UseAzureSub: true
      ConnectedServiceNameARM: 'PUTYOURAZURESUBNAMEHERE'
```
Now I can use this YAML as configuration as code for my Build Pipeline 🙂 It can be used from any Azure DevOps project. Once you start looking at the code and the documentation for the YAML schema you can begin to write your pipelines as YAML, but sometimes it is easier to just create a build pipeline, or even just a job step, in the browser and click the View YAML button!
Create an AKS Cluster with a SQL 2019 container using Terraform and Build templates
This time I am going to choose the Configuration as code template
I am going to give it a name and it will show me that it needs the path to the yaml file containing the build definition in the current repository.
Clicking the 3 ellipses will pop up a file chooser and I pick the build.yaml file.
The build.yaml file looks like this. The name is the USER/Repository Name and the endpoint is the name of the endpoint for the GitHub service connection in Azure DevOps. The template value is the name of the build yaml file @ the name given for the repository value.
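A hedged sketch of the shape of that build.yaml; the alias, repository name and endpoint are my placeholders:

```yaml
resources:
  repositories:
    - repository: templates                      # alias used after the @ below
      type: github
      name: YOURGITHUBUSERNAME/BuildTemplates    # USER/Repository Name
      endpoint: MyGitHubServiceConnection        # GitHub service connection name

jobs:
- template: buildTerraform.yml@templates         # build yaml file @ repository alias
```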
You can find (and change) your GitHub service connection name by clicking on the cog bottom left in Azure DevOps and clicking service connections
I still need to create my variables for my Terraform template (perhaps I can now just leave those in my code?) For the AKS Cluster build right now I have to add presentation, location, ResourceGroupName, AgentPoolName, ServiceName, VMSize, agent_count
Then I click save and queue and the job starts running
If I want to edit the pipeline it looks a little different
The variables and triggers can be found under the 3 ellipses on the top right
It also defaults the trigger to automatic deployment.
It takes a bit longer to build
and when I get the Terraform code wrong and the build fails, I can just alter the code, commit it, push and a new build will start and the Terraform will work out what is built and what needs to be built!
but eventually the job finishes successfully
and the resources are built
and in Visual Studio Code with the Kubernetes extension installed I can connect to the cluster by clicking the 3 ellipses and Add Existing Cluster
I choose Azure Kubernetes Services and click next
Choose my subscription and then add the cluster
and then I can explore my cluster
I can also see the dashboard by right clicking on the cluster name and Open Dashboard
Right clicking on the service name and choosing describe
shows the external IP address, which I can put into Azure Data Studio and connect to my container
So now I can source control my Build Job Steps and hold them in a central repository. I can make use of them in any project. This gives me much more control and saves me from repeating myself repeating myself. The disadvantage is that there is no handy warning when I change the underlying Build Repository that I will be affecting other Build Pipelines, and there is no easy method to see which Build Pipelines are dependent on the build yaml file.