Using Secret Management module to run SSMS, VS Code and Azure Data Studio as another user

Following on from my last post about the Secret Management module, I was asked another question.

> Can I use this to run applications as my admin account?

A user with a beard

It is good practice not to log into your workstation with an account that has admin privileges. In many shops, you will need to open applications that can do administration tasks with another set of account credentials.

Unfortunately, people being people, they will often store their admin account credentials in a less than ideal manner (OneNote, Notepad++ etc) to make it easier for them, so that when they right click and run as a different user, they can copy and paste the password.

Use the Secret Management module

Again, I decided to use a notebook to show this as it is a fantastic way to share code and results and because it means that anyone can try it out.

The notebook may not render on a mobile device.

Using the notebook, I can quickly store my admin password safely and open and run the applications using the credential.
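
Here is a minimal sketch of the idea, assuming a secret called AdminPassword has already been stored with Set-Secret; the admin account name and the executable names are only examples, so swap them for your own (you may need full paths).

# Retrieve the stored admin password (returned as a SecureString by default)
$password = Get-Secret -Name AdminPassword
$credential = [pscredential]::new('DOMAIN\AdminUser', $password)

# Run the tools as the admin account
Start-Process -FilePath 'ssms.exe' -Credential $credential
Start-Process -FilePath 'code' -Credential $credential
Start-Process -FilePath 'azuredatastudio' -Credential $credential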

How to fork a GitHub repository and contribute to an open source project

I enjoy maintaining open source GitHub repositories such as dbachecks and ADSNotebook. I absolutely love it when people add more functionality to them.

To collaborate with a repository in GitHub you need to follow these steps:

  • Fork the repository into your own GitHub
  • Clone the repository to your local machine
  • Create a new branch for your changes
  • Make some changes and commit them with useful messages
  • Push the changes to your repository
  • Create a Pull Request from your repository back to the original one

You will need to have git.exe available, which you can download and install from https://git-scm.com/downloads if required.
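
Before going any further, you can check that git is available from a PowerShell prompt; this is just a quick sanity check.

# Confirm git is on the PATH and see which version is installed
Get-Command git
git --version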

Fork the repository into your own GitHub

A fork is a copy of the original repository. This allows you to make changes without affecting the original project. It does not get updated when the original project gets updated (we will talk about that in the next post). This enables you to code a new feature or a bug fix, test it locally and make sure it is working.

Let’s take dbachecks as our example. Start by going to the project in GitHub. In this case the URL is https://github.com/sqlcollaborative/dbachecks. You will see a Fork button at the top right of the page.

When you click the button the repository is copied into your own GitHub account

The page will open at https://github.com/YOURGITHUBUSERNAME/NameOfRepository, in this case https://github.com/SQLDBAWithABeard/dbachecks. You will be able to see that it is a fork of the original repository at the top of the page.

Clone the repository to your local machine

Forking the repository has created a remote repository stored on the GitHub servers. Now that the repository has been forked you need to clone it to your local machine to create a local repository so that you can start coding your amazing fix. When you have finished you can then sync it back to your remote repository ready for a Pull Request back to the original repository.

In your browser, at your remote repository that you just created (https://github.com/YOURGITHUBUSERNAME/NameOfRepository if you have closed the page), click on Clone or Download and then the icon to the right to copy the URL.

You can clone your repository in VS Code or Azure Data Studio by pressing F1 or CTRL + SHIFT + P on Windows or Linux, or ⇧⌘P or F1 on a Mac,

then start typing clone until you see Git: Clone and press Enter or click it

Paste in the URL that you just copied and press Enter. A dialog will open asking you to select a folder. This is the parent directory where your local repository will be created. The clone will create a directory for your repository so you do not need to. I suggest that you use a folder called GitHub or something similar to hold all of the repositories that you are going to clone and create.

When it has finished it will ask you if you wish to open the repository

If you click Open it will close anything that you already have open and open the folder. If you click Add to Workspace it will add the folder to the workspace and leave everything you already had open as it was. Surprisingly, clicking Open in New Window will open the folder in a new instance of Visual Studio Code or Azure Data Studio!

and you will also be able to see the local repository files on your computer

You can clone the repository at the command line if you wish by navigating to your local GitHub directory and running git clone TheURLYouCopied

Now your local repository has been created, it’s time to do your magic coding.

Create a new branch for your changes

It is a good idea to create a branch for your amazing new feature. This enables you to work on coding for that feature in isolation. It has the added advantage that if you mess it right royally up, you can just delete that branch and start again with a new one!

To create a branch in VS Code or Azure Data Studio you can click on the branch name at the bottom left.

Or open the Command Palette and type Branch until you see Git: Create Branch

You will be prompted for a branch name

I like to choose a name that relates to the code that I am writing, like configurable_engine or removeerroringexample. You can see the name of the branch in the bottom left so that you always know which branch you are working on.

The icon shows that the branch is only local and hasn’t been pushed (published) to the remote repository yet
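
If you prefer the command line, creating and switching to a branch is a single command (the branch name below is just an example).

# Create a new branch and switch to it
git checkout -b AwesomeNewFeature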

Make some changes and commit them with useful messages

Now you can start writing your code for your awesome new feature, bug fix or maybe just documentation improvement. Keep your commits small and give them useful commit messages that explain why you have made the change, as the diff tooling will be able to show what change you have made.

Write your code or change the documentation, save the file and in Visual Studio Code or Azure Data Studio you will see that the source control icon has a number on it

Clicking on the icon will show the files that have changes ready

You can write your commit message in the box and press CTRL + ENTER to commit your changes with a message.

If you want to do this at the command line, you can use git status to see which files have changes

You will need to use git add . or git add .\pathtofile to stage your changes ready for committing and then git commit -m 'Commit Message' to commit them.

Notice that I did exactly what I just said not to do! A better commit message would have been "So that people can find the guide to forking and creating a PR".

Push the changes to your repository

You only have the changes that you have made in your local repository on your computer. Now you need to push those changes to GitHub, your remote repository. You can click on the publish icon.

You will get a pop-up asking you if you wish to stage your changes. I click Yes and never Always so that I can use this prompt as a sanity check that I am doing the right thing

At the command line you can push the branch. If you do that, you will have to tell git where the branch needs to go. If you just type git push it will helpfully tell you

fatal: The current branch AwesomeNewFeature has no upstream branch.
To push the current branch and set the remote as upstream, use

    git push --set-upstream origin AwesomeNewFeature

So you will need to use that command

You can see in the bottom left that the icon has changed

and if you read the output of the git push command you will see what the next step is also.

Create a Pull Request from your repository back to the original one

You can CTRL + click the link in the git push output if you have pushed from the command line, or if you visit either your repository or the original repository in your browser you will see that there is a Compare and Pull Request button.

You click that and let GitHub do its magic

and it will create a Pull Request for you ready for you to fill in the required information, ask for reviewers and set other options. Once you have done that you can click Create pull request and wait for the project maintainer to review it and (hopefully) accept it into their project.

You can find the Pull Request that I created here https://github.com/sqlcollaborative/dbachecks/pull/720 and see how the rest of this blog post was created.

If you make more changes to the code in the same branch in your local repository and push them, they will automatically be added to this Pull Request whilst it is open. You can do this if the maintainer or reviewer asks for changes.

Shane has asked for a change

So I can go to my local repository in Azure Data Studio and make the requested change and save the file. If I look in the source control in Azure Data Studio I can again see there is a change waiting to be committed and if I click on the name of the file I can open the diff tool to see what the change was

Once I am happy with my change I can commit it again in the same way as before either in the editor or at the command line. The icon at the bottom will change to show that I have one commit in my local repository waiting to be pushed

To do the same thing at the command line I can type git status and see the same thing.

I can then push my change to my remote repository either in the GUI or by using git push

and it will automatically be added to the Pull Request as you can see

Now that the required changes for the review have been made, the review has been approved by Shane and the pull request is now ready to be merged. (You can also see that dbachecks runs some checks against the code when a Pull Request is made)

Many, many thanks to Shane b | t who helped with the writing of this post even whilst on a “no tech” holiday.

Go Ahead – Contribute to an Open Source Project

Hopefully you can now see how easy it is to create a fork of a GitHub repository, clone it to your own machine and contribute. There are many open source projects that you can contribute to.

You can use this process to contribute to the Microsoft Docs for example by clicking on the edit button on any page.

You can contribute to other open source projects like

or go and find the ones that you use and can help with.

My current VS Code Extensions and using a workspace file

I have been asked a couple of times recently what my Visual Studio Code extensions are at the moment, so I thought I would write a quick post and also look at workspaces and how you can enable and disable extensions within them.

Listing Extensions

From the command line you can list your extensions using

code --list-extensions
code-insiders --list-extensions

My list looks like this

You can also see them in the view on the left of default Visual Studio Code and open them with CTRL + SHIFT + X (unless, like me, you have Snagit installed and it has taken that shortcut).

Installing Extensions

You can install extensions by opening the Extensions view in Visual Studio Code and searching for the extension. The list I have below has the precise names for each extension which you can use to search

You can also install extensions from the command-line with

code --install-extension <extensionid>
code-insiders --install-extension <extensionid>
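
If you are setting up a new machine, the two commands combine nicely. This is just a sketch, assuming code is on your PATH and you are happy to install everything in the list.

# Export the extension list on one machine
code --list-extensions | Set-Content -Path .\extensions.txt

# Install them all on another machine
Get-Content -Path .\extensions.txt | ForEach-Object {
    code --install-extension $_
}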

My Extensions

I am going to list these in alphabetical order by display name for ease (my ease that is!)

Because Chrissy LeMaire and I are writing dbatools in a Month of Lunches using AsciiDoc, it makes sense to have an extension enabling previewing and syntax highlighting. You can find it here

For interacting with Azure I use the Azure Account Extension – ms-vscode.azure-account

I use Azure CLI so I make use of the functionality of the Azure CLI Tools extension ms-vscode.azurecli

For interacting with Azure Repos I use the ms-vsts.team extension

When creating ARM templates, this extension is very useful msazurermtools.azurerm-vscode-tools

I have a few theme extensions, this one is for fun in demos 😉 beardedbear.beardedtheme

The blackboard theme is my default one gerane.theme-blackboard

Chasing closing brackets is much easier with the Bracket Pair Colorizer; I use the beta version coenraads.bracket-pair-colorizer-2

I am rubbish at spelling and typing so I use this to help point out the issues! streetsidesoftware.code-spell-checker

Using the Docker extension adds another view to Visual Studio Code to ease working with containers ms-azuretools.vscode-docker

As an open-source project maintainer it is good to be able to work with GitHub pull requests without leaving Visual Studio Code github.vscode-pull-request-github (Preview)

GitLens is absolutely invaluable when working with source control. It has so many features. This is an absolute must eamodio.gitlens

Working with Kubernetes? This extension adds another view for interacting with your cluster ms-kubernetes-tools.vscode-kubernetes-tools

Visual Studio Live Share enables you to collaborate in real-time in Visual Studio Code with your colleagues or friends. I blogged about this here ms-vsliveshare.vsliveshare

I love writing markdown and this linter assists me to ensure that my markdown is correct davidanson.vscode-markdownlint

The Material Icon Theme ensures that there are pretty icons in my editor! pkief.material-icon-theme

I have both the PowerShell extension ms-vscode.powershell and the PowerShell preview extension ms-vscode.powershell-preview installed but only one can be enabled at a time

This suite of extensions enables easy remote development so that you can develop your PowerShell scripts, for example, inside an Ubuntu container running PowerShell 7 or inside Windows Subsystem for Linux ms-vscode-remote.vscode-remote-extensionpack (Preview)

Writing for cross-platform means looking out for line endings and this extension will display them and any whitespace in your editor medo64.render-crlf

An absolutely essential extension which enables me to back up all of my Visual Studio Code settings, shortcuts and extensions into a GitHub gist and keep all of my machines feeling the same. shan.code-settings-sync

For working with SQL Server within Visual Studio Code and having a view for my instances as well as a linter and intellisense I use ms-mssql.mssql

Yaml files and spaces! I no longer get so confused with this extension to help me 🙂 redhat.vscode-yaml

Workspaces

Now that is a lot of extensions and I don't need all of them every time. I use workspaces to help with this. I will create a workspace file for the project I am working on.

I open or create the folders I will be working on and then click File and Save Workspace As and save the file in the root of the folder.

Now, the next time I want to open the workspace, I can open the workspace file or if I open the folder Visual Studio Code will helpfully prompt me

Now I can have all of my settings retained for that workspace

For this folder, I am ensuring that the PowerShell extension uses the PSScriptAnalyzer Settings file that I have created so that it will show if the code is compatible with the versions of PowerShell I have chosen. I can define settings for a workspace in the settings file, which you can open using CTRL and ,
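
As an illustration, a PSScriptAnalyzer settings file is just a .psd1. The rule and target versions below are assumptions, so adjust them for the versions of PowerShell that you care about, and point the powershell.scriptAnalysis.settingsPath setting in the workspace at the file.

# PSScriptAnalyzerSettings.psd1 - a minimal sketch
@{
    Rules = @{
        PSUseCompatibleSyntax = @{
            Enable         = $true
            TargetVersions = @('5.1', '7.0')
        }
    }
}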

But I can also enable or disable extensions for a workspace

So every time I open this workspace I am only loading the extensions I want.

Building Azure SQL Db with Terraform using Azure DevOps

In my last post I showed how to create a Resource Group and an Azure SQLDB with Terraform using Visual Studio Code to deploy.

Of course, I haven't stopped there; who wants to manually run code to create things? There was a lot of install this and set up that. I would rather give the code to a build system and get it to run it. I can then even set it to automatically deploy new infrastructure when I commit some code to alter the configuration.

This scenario though is to build environments for presentations. Last time I created an Azure SQL DB and tagged it with DataInDevon (By the way you can get tickets for Data In Devon here – It is in Exeter on April 26th and 27th)

If I want to create the same environment but give it tags for a different event (this way I know when I can delete resources in Azure!) or name it differently, I can use Azure DevOps and alter the variables. I could just alter the code, commit the change and trigger a build, or I could create variables and enable them to be set at the time the job is run. I use the former in "work" situations and the latter for my presentations environment.

I have created a project in Azure DevOps for my Presentation Builds. I will be using GitHub to share the code that I have used. Once I clicked on pipelines, this is the page I saw

Clicking new pipeline, Azure DevOps asked me where my code was

I chose GitHub, authorised and chose the repository.

I then chose Empty Job on the next page. See the Configuration as code choice? We will come back to that later and our infrastructure as code will be deployed with a configuration as code 🙂

The next page allows us to give the build a good name and choose the Agent Pool that we want to use. Azure DevOps provides seven different hosted agents running Linux, macOS or Windows, or you can download an agent and run it on your own machines. We will use the default agent for this process.

Clicking on Agent Job 1 enables me to change the name of the Agent Job. I could also choose a different type of Agent for different jobs within the same pipeline. This would be useful for testing different operating systems, for example, but for right now I shall just name it properly.

State

First we need somewhere to store the state of our build so that if we re-run it the Terraform plan step will be able to work out what it needs to do. (This is not absolutely required just for building my presentation environments and this might not be the best way to achieve this but for right now this is what I do and it works.)

I click on the + and search for Azure CLI.

and click on the Add button which gives me some boxes to fill in.

I choose my Azure subscription from the first drop down and choose Inline Script from the second

Inside the script block I put the following code

# the following script will create an Azure resource group, a storage account and a storage container which will be used to store the terraform state
call az group create --location $(location) --name $(TerraformStorageRG)

call az storage account create --name $(TerraformStorageAccount) --resource-group $(TerraformStorageRG) --location $(location) --sku Standard_LRS

call az storage container create --name terraform --account-name $(TerraformStorageAccount)

This will create a Resource Group, a storage account and a container and use some variables to provide the values; we will come back to the variables later.

Access Key

The next thing that we need to do is to enable the job to access the storage account. We don't want to store that key anywhere, but we can use our Azure DevOps variables and some PowerShell to gather the access key and write it to the variable when the job is running. To create the variables I clicked on the variables tab

and then added the variables with the following names TerraformStorageRG, TerraformStorageAccount and location from the previous task and TerraformStorageKey for the next task.

With those created, I go back to Tasks and add an Azure PowerShell task

I then add this code to get the access key and overwrite the variable.

# Using this script we will fetch the storage key which is required in the terraform file to authenticate the backend storage account

$key=(Get-AzureRmStorageAccountKey -ResourceGroupName $(TerraformStorageRG) -AccountName $(TerraformStorageAccount)).Value[0]

Write-Host "##vso[task.setvariable variable=TerraformStorageKey]$key"

Infrastructure as Code

In my GitHub repository I now have the following folders

The manual folders hold the code from the last blog post. In the Build folder, the main.tf file is almost identical and looks like this.

provider "azurerm" {
    version = "=1.24.0"
}

terraform {
  backend "azurerm" {
    key = "terraform.tfstate"
  }
}

resource "azurerm_resource_group" "presentation" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_server" "presentation" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.presentation.name}"
  location                     = "${var.location}"
  version                      = "12.0"
  administrator_login          = "__SQLServerAdminUser__"
  administrator_login_password = "__SQLServerAdminPassword__"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_database" "presentation" {
  name                = "${var.SqlDatabaseName}"
  resource_group_name = "${azurerm_sql_server.presentation.resource_group_name}"
  location            = "${var.location}"
  server_name         = "${azurerm_sql_server.presentation.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"

  tags = {
    environment = "${var.presentation}"
  }
}

The variables.tf file looks like this.

variable "presentation" {
    description = "The name of the presentation - used for tagging Azure resources so I know what they belong to"
    default = "__Presentation__"
}

variable "ResourceGroupName" {
  description = "The Prefix used for all resources in this example"
  default     = "__ResourceGroupName__"
}

variable "location" {
  description = "The Azure Region in which the resources in this example should exist"
  default     = "__location__"
}

variable "SqlServerName" {
  description = "The name of the Azure SQL Server to be created or to have the database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "__SqlServerName__"
}

variable "SQLServerAdminUser" {
  description = "The name of the Azure SQL Server Admin user for the Azure SQL Database"
  default     = "__SQLServerAdminUser__"
}
variable "SQLServerAdminPassword" {
  description = "The Azure SQL Database users password"
  default     = "__SQLServerAdminPassword__"
}
variable "SqlDatabaseName" {
  description = "The name of the Azure SQL database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "__SqlDatabaseName__"
}


variable "Edition" {
  description = "The Edition of the Database - Basic, Standard, Premium, or DataWarehouse"
  default     = "__Edition__"
}

variable "ServiceObjective" {
  description = "The Service Tier S0, S1, S2, S3, P1, P2, P4, P6, P11 and ElasticPool"
  default     = "__ServiceObjective__"
}

It is exactly the same except that the values have been replaced by the value name prefixed and suffixed with __. This will enable me to replace the values with the variables in my Azure DevOps Build job.
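
Conceptually, the token replacement is no more than a find and replace. This PowerShell sketch is only to illustrate the idea (the real work is done by the Replace Tokens task that we add below), with a couple of hard-coded values standing in for the pipeline variables.

# Illustration only - swap __TokenName__ markers for matching variable values
$values = @{ ResourceGroupName = 'beardrules'; location = 'uksouth' }

Get-ChildItem -Path .\Build -Include *.tf, *.tfvars -Recurse | ForEach-Object {
    $content = Get-Content -Path $_.FullName -Raw
    foreach ($name in $values.Keys) {
        $content = $content -replace "__$($name)__", $values[$name]
    }
    Set-Content -Path $_.FullName -Value $content
}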

The backend-config.tf file will store the details of the state that will be created by the first step and use the access key that has been retrieved in the second step.

resource_group_name = "__TerraformStorageRG__"

storage_account_name = "__TerraformStorageAccount__"

container_name = "terraform"

access_key = "__TerraformStorageKey__"

I need to add the following variables to my Azure DevOps Build – Presentation, ResourceGroupName, SqlServerName, SQLServerAdminUser, SQLServerAdminPassword, SqlDatabaseName, Edition, ServiceObjective. Personally I would advise setting the password or any other sensitive values to secret by clicking the padlock for that variable. This will stop the value being written to the log as well as hiding it behind *'s

Because I have tagged the variables with Settable at queue time, I can set the values whenever I run a build, so if I am at a different event I can change the name.

But the build job hasn’t been set up yet. First we need to replace the values in the variables file.

Replace the Tokens

I installed the Replace Tokens Task from the marketplace and added that to the build.

I am going to use a standard naming convention for my infrastructure code files so I add Build to the Root Directory. You can also click the ellipses and navigate to a folder in your repo. In the Target Files I add "**/*.tf" and "**/*.tfvars", which will search all of the folders (**) and only work on files with a .tf or .tfvars extension (*.tf and *.tfvars). The next step is to make sure that the replacement prefix and suffix are correct. It is hidden under Advanced.

Because I often forget this step and to aid in troubleshooting I add another step to read the contents of the files and place them in the logs. I do this by adding a PowerShell step which uses

Get-ChildItem .\Build -Recurse

Get-Content .\Build\*.tf 
Get-Content .\Build\*.tfvars 

Under control options there is a check box to enable or disable the steps so once I know that everything is ok with the build I will disable this step. The output in the log of a build will look like this showing the actual values in the files. This is really useful for finding spaces :-).

Running the Terraform in Azure DevOps

With everything set up we can now run the Terraform. I installed the Terraform task from the marketplace and added a task. We are going to follow the same process as the last blog post, init, plan, apply but this time we are going to automate it 🙂

First we will initialise

I put Build in the Terraform Template path. The Terraform arguments are

init -backend-config="0-backend-config.tfvars"

which will tell Terraform to use the backend-config.tfvars file for the state. It is important to tick the Install terraform checkbox to ensure that terraform is available on the agent and to add the Azure Subscription (or Service Endpoint in a corporate environment).

After the Initialise, I add the Terraform task again add Build to the target path and this time the argument is plan

Again, tick the install terraform checkbox and also the Use Azure Service Endpoint and choose the Azure Subscription.

We also need to tell the Terraform where to find the tfstate file by specifying the variables for the resource group and storage account and the container

Finally, add another Terraform task for the apply remembering to tick the install Terraform and Use Azure checkboxes

The arguments are

apply -auto-approve

This will negate the requirement for the “Only “yes” will be accepted to approve” from the manual steps post!

Build a Thing

Now we can build the environment – Clicking Save and Queue

opens this dialogue

where the variables can be filled in.

The build will be queued and clicking on the build number will open the logs

6 minutes later the job has finished

and the resources have been created.

If I want to look in the logs of the job I can click on one of the steps and take a look. This is the apply step

Do it Again For Another Presentation

So that is good, I can create my environment as I want it. Once my presentation has finished I can delete the Resource Groups. When I need to do the presentation again, I can queue another build and change the variables

The job will run

and the new resource group will be created

all ready for my next presentation 🙂

This is brilliant, I can set up the same solution for different repositories for different presentations (infrastructure) and recreate the above steps.

The next post will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

The first post showed how to build an Azure SQLDB with Terraform using VS Code

Building Azure SQL Db with Terraform with Visual Studio Code

I have been using Terraform for the last week or so to create some infrastructure and decided to bring that knowledge back to a problem that I and others suffer from – building environments for presentations, all for the sake of doing some learning.

What is Terraform?

According to the website


HashiCorp Terraform enables you to safely and predictably create, change, and improve infrastructure. It is an open source tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned


https://www.terraform.io/

This means that I can define my infrastructure as code. If I can do that then I can reliably do the same thing again and again, at work to create environments that have the same configuration or outside of work to repeatedly build the environment I need.

Building an Azure SQL Database with Terraform

To understand how to build a thing, the best place to start is the documentation https://www.terraform.io/docs. For an Azure SQL DB, in the docs you will find a block of code that looks like this

resource "azurerm_resource_group" "test" {
  name     = "acceptanceTestResourceGroup1"
  location = "West US"
}

resource "azurerm_sql_server" "test" {
  name                         = "mysqlserver"
  resource_group_name          = "${azurerm_resource_group.test.name}"
  location                     = "West US"
  version                      = "12.0"
  administrator_login          = "4dm1n157r470r"
  administrator_login_password = "4-v3ry-53cr37-p455w0rd"
}

resource "azurerm_sql_database" "test" {
  name                = "mysqldatabase"
  resource_group_name = "${azurerm_resource_group.test.name}"
  location            = "West US"
  server_name         = "${azurerm_sql_server.test.name}"

  tags = {
    environment = "production"
  }
}

If you read the code, you can see that there are key value pairs defining information about the resource that is being created. Anything inside a ${} is a dynamic reference. So

  resource_group_name          = "${azurerm_resource_group.test.name}"

refers to the name property in the azurerm_resource_group block called test (or the name of the resource group 🙂 )

Infrastructure As Code

So I can put that code into a file (name it main.tf), alter it with the values and "run Terraform", and what I want will be created. Let's take it a step further though, because I want to be able to reuse this code. Instead of hard-coding all of the values I am going to use variables. I can do this by creating another file called variables.tf which looks like this

variable "presentation" {
    description = "The name of the presentation - used for tagging Azure resources so I know what they belong to"
    default = "dataindevon"
}

variable "ResourceGroupName" {
  description = "The Resource Group Name"
  default     = "beardrules"
}

variable "location" {
  description = "The Azure Region in which the resources in this example should exist"
  default     = "uksouth"
}

variable "SqlServerName" {
  description = "The name of the Azure SQL Server to be created or to have the database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "jeremy"
}
variable "SQLServerAdminUser" {
  description = "The name of the Azure SQL Server Admin user for the Azure SQL Database"
  default     = "Beard"
}
variable "SQLServerAdminPassword" {
  description = "The Azure SQL Database users password"
  default     = "JonathanlovesR3ge%"
}

variable "SqlDatabaseName" {
  description = "The name of the Azure SQL database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "jsdb"
}

variable "Edition" {
  description = "The Edition of the Database - Basic, Standard, Premium, or DataWarehouse"
  default     = "Standard"
}

variable "ServiceObjective" {
  description = "The Service Tier S0, S1, S2, S3, P1, P2, P4, P6, P11 and ElasticPool"
  default     = "S0"
}

and my main.tf then looks like this.

provider "azurerm" {
    version = "=1.24.0"
}

resource "azurerm_resource_group" "presentation" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_server" "presentation" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.presentation.name}"
  location                     = "${var.location}"
  version                      = "12.0"
  administrator_login          = "${var.SQLServerAdminUser}"
  administrator_login_password = "${var.SQLServerAdminPassword}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_database" "presentation" {
  name                = "${var.SqlDatabaseName}"
  resource_group_name = "${azurerm_sql_server.presentation.resource_group_name}"
  location            = "${var.location}"
  server_name         = "${azurerm_sql_server.presentation.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"

  tags = {
    environment = "${var.presentation}"
  }
}

You can find these files in my GitHub Repository here.

Alright – deploy something

To deploy the code that I have written I need to download Terraform from https://www.terraform.io/downloads.html and then extract the exe to a folder in my PATH (I chose C:\Windows). Then in Visual Studio Code I installed two extensions: the Terraform extension by Mikael Olenfalk, which enables syntax highlighting and auto-completion for the tf files, and the Azure Terraform extension. You will also need Node.js from here.
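
Once the exe is in place it is worth confirming that it can be found before carrying on; just a quick check.

# Confirm terraform.exe is reachable on the PATH and show the version
Get-Command terraform
terraform version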

With those in place I navigated to the directory holding my files in Visual Studio Code and pressed F1 and started typing azure terraform and chose Azure Terraform Init

I was then prompted to use Cloud Shell and a browser opened to login. Once I had logged in I waited until I saw this

I pressed F1 again and this time chose Azure Terraform plan. This is going to show me what Terraform is going to do if it applies this configuration.

You can see what is going to be created. It is going to create 3 things

Once you have checked that the plan is what you want, press F1 again and choose Azure Terraform Apply

You are then asked to confirm that this is what you want. Only “yes” will be accepted. Then you will see the infrastructure being created

and a minute later

and Jeremy exists in the beardrules resource group

Then once I have finished using the SQL instance, I can press F1 again and choose Azure Terraform Destroy. Again there is a confirmation required.

and you will see the progress for 46 seconds

and all of the resources have gone.

That's a good start. This enables me to create resources quickly and easily and keep the configuration for them safely in source control and easy to use.

In my next post I will create an Azure DevOps pipeline to deploy an Azure SQL DB with Terraform.

The post after will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

Using Docker to run Integration Tests for dbachecks

My wonderful friend André Kamman wrote a fantastic blog post this week SQL Server Container Instances via Cloudshell about how he uses containers in Azure to test code against different versions of SQL Server.

It reminded me that I do something very similar to test dbachecks code changes. I thought this might make a good blog post. I will talk through how I do this locally as I merge a PR from another great friend Cláudio Silva who has added agent job history checks.

GitHub PR VS Code Extension

I use the GitHub Pull Requests extension for VS Code to work with pull requests for dbachecks. This enables me to see all of the information about the Pull Request, merge it, review it, comment on it all from VS Code

I can also see which files have been changed and which changes have been made

Once I am ready to test the pull request I perform a checkout using the extension

This will update all of the files in my local repository with all of the changes in this pull request

You can see at the bottom left that the branch changes from development to the name of the PR.

Running The Unit Tests

The first thing that I do is to run the Unit Tests for the module. These will test that the code is following all of the guidelines that we require and that the tests are formatted in the correct way for the Power Bi to parse. I have blogged about this here and here and we use this Pester in our CI process in Azure DevOps which I described here.

I navigate to the root of the dbachecks repository on my local machine and run

$testresults = Invoke-Pester .\tests -ExcludeTag Integration -Show Fails -PassThru 

and after about a minute

Thank you Cláudio, the code has passed the tests 😉

Running Some Integration Tests

The difference between Unit tests and Integration tests in a nutshell is that the Unit tests are testing that the code is doing what is expected without any other external influences whilst the Integration tests are checking that the code is doing what is expected when running on an actual environment. In this scenario we know that the code is doing what is expected but we want to check what it does when it runs against a SQL Server and even when it runs against multiple SQL Servers of different versions.

Multiple Versions of SQL Server

As I have described before my friend and former colleague Andrew Pruski b | t has many resources for running SQL in containers. This means that I can quickly and easily create fresh uncontaminated instances of SQL 2012, 2014, 2016 and 2017 really quickly.

I can create 4 instances of different versions of SQL in (a tad over) 1 minute. How about you?

Imagine how long it would take to run the installers for 4 versions of SQL and the pain you would have trying to uninstall them and make sure everything is ‘clean’. Even images that have been sysprep’d won’t be done in 1 minute.

Docker Compose Up ?

So what is this magic command that has enabled me to do this? docker-compose uses a YAML file to define multi-container applications. This means that with a file called docker-compose.yml like this

version: '3.7'

services:
    sql2012:
        image: dbafromthecold/sqlserver2012dev:sp4
        ports:  
          - "15589:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2014:
        image: dbafromthecold/sqlserver2014dev:sp2
        ports:  
          - "15588:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2016:
        image: dbafromthecold/sqlserver2016dev:sp2
        ports:  
          - "15587:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2017:
        image: microsoft/mssql-server-windows-developer:2017-latest
        ports:  
          - "15586:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"

and in that directory just run

docker-compose up -d

and 4 SQL containers are available to you. You can interact with them via SSMS if you wish with localhost comma PORTNUMBER. The port numbers in the above file are 15586, 15587, 15588 and 15589.
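
You can also check the containers from PowerShell with dbatools. This is a rough sketch, assuming dbatools is installed and using the sa password from the compose file above.

# Quick connectivity check against each container
$securePassword = ConvertTo-SecureString 'Password0!' -AsPlainText -Force
$cred = [pscredential]::new('sa', $securePassword)
'localhost,15586', 'localhost,15587', 'localhost,15588', 'localhost,15589' | ForEach-Object {
    Connect-DbaInstance -SqlInstance $_ -SqlCredential $cred | Select-Object Name, VersionString
}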

Now it must be noted, as I describe here, that first I pulled the images to my laptop. The first time you run docker-compose up it will take significantly longer if you haven't pulled the images already (pulling the images will take quite a while depending on your broadband speed).

Credential

The next thing is to save a credential to make it easier to automate. I use the method described by my PowerShell friend Jaap Brasser here. I run this code

$CredentialPath = 'C:\MSSQL\BACKUP\KEEP\sacred.xml'
Get-Credential | Export-Clixml -Path $CredentialPath

and then I can create a credential object using

$cred = Import-Clixml $CredentialPath

Check The Connections

I ensure a clean session by removing the dbatools and dbachecks modules and then import the local version of dbachecks and set some variables

$dbacheckslocalpath = 'GIT:\dbachecks\'
Remove-Module dbatools, dbachecks -ErrorAction SilentlyContinue
Import-Module $dbacheckslocalpath\dbachecks.psd1
$cred = Import-Clixml $CredentialPath
$containers = 'localhost,15589', 'localhost,15588', 'localhost,15587', 'localhost,15586'

Now I can start to run my Integration tests. First reset the dbachecks configuration and set some configuration values

# run the checks against these instances
$null = Set-DbcConfig -Name app.sqlinstance -Value $containers
# We are using SQL authentication
$null = Set-DbcConfig -Name policy.connection.authscheme -Value SQL
# sometimes its a bit slower than the default value
$null = Set-DbcConfig -Name policy.network.latencymaxms -Value 100 # because the containers run a bit slow!

Then I will run the dbachecks connectivity checks and save the results to a variable without showing any output

$ConnectivityTests = Invoke-DbcCheck -SqlCredential $cred -Check Connectivity -Show None -PassThru

I can then use Pester to check that dbachecks has worked as expected by testing if the failedcount property returned is 0.

Describe "Testing the checks are running as expected" -Tag Integration {
    Context "Connectivity Checks" {
        It "All Tests should pass" {
            $ConnectivityTests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default settings"
        }
    }
}

What is the Unit Test for this PR?

Next I think about what we need to be testing for this PR. The Unit tests will help us.

Choose some Integration Tests

This check is checking the Agent job history settings and the unit tests are

  • It “Passes Check Correctly with Maximum History Rows disabled (-1)”
  • It “Fails Check Correctly with Maximum History Rows disabled (-1) but configured value is 1000”
  • It “Passes Check Correctly with Maximum History Rows being 10000”
  • It “Fails Check Correctly with Maximum History Rows being less than 10000”
  • It “Passes Check Correctly with Maximum History Rows per job being 100”
  • It “Fails Check Correctly with Maximum History Rows per job being less than 100”

So we will check the same things on real actual SQL Servers. First though we need to start the SQL Server Agent as it is not started by default. We can do this as follows

docker exec -ti integration_sql2012_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2014_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2016_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2017_1 powershell start-service SQLSERVERAGENT

Unfortunately, the agent service won't start in the SQL 2014 container so I can't run agent integration tests for that container, but it's better than no integration tests.

This is What We Will Test

So we want to test if the check will pass with default settings. In general, dbachecks will pass for default instance, agent or database settings values by default.

We also want the check to fail if the configured value for dbachecks is set to default but the value has been set on the instance.

We want the check to pass if the configured value for the dbachecks configuration is set and the instance (agent, database) setting matches it.

If You Are Doing Something More Than Once ……

Let’s automate that. We are going to be repeatedly running those three tests for each setting that we are running integration tests for. I have created 3 functions for this again checking that FailedCount or Passed Count is 0 depending on the test.

function Invoke-DefaultCheck {
    It "All Checks should pass with default for $Check" {
        $Tests = get-variable "$($Check)default"  -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default setting (Yes we may set some values before but you get my drift)"
    }
}
function Invoke-ConfigCheck {
    It "All Checks should fail when config changed for $Check" {
        $Tests = get-variable "$($Check)configchanged"  -ValueOnly
        $Tests.PassedCount | Should -Be 0 -Because "We expect all of the checks to run and fail when we have changed the config values"
    }
}
function Invoke-ValueCheck {
    It "All Checks should pass when setting changed for $Check" {
        $Tests = get-variable "$($Check)valuechanged"  -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass when we have changed the settings to match the config values"
    }
}

Now I can use those functions inside a loop in my Integration Pester Test

    $TestingTheChecks = @('errorlogscount','jobhistory')
    Foreach ($Check in $TestingTheChecks) {
        Context "$Check Checks" {
            Invoke-DefaultCheck
            Invoke-ConfigCheck
            Invoke-ValueCheck
        }
    }

Write Some Integration Tests

So for this new test I have added a value to the TestingTheChecks array then I can test my checks. The default check I can check like this

# run the checks against these instances (SQL2014 agent won't start :-( ))
$null = Set-DbcConfig -Name app.sqlinstance -Value $containers.Where{$_ -ne 'localhost,15588'}
# by default all tests should pass on default instance settings
$jobhistorydefault = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Now I need to change the configurations so that they do not match the defaults and run the checks again

#Change the configuration to test that the checks fail
$null = Set-DbcConfig -Name agent.history.maximumjobhistoryrows -value 1000
$null = Set-DbcConfig -Name agent.history.maximumhistoryrows -value 10000
$jobhistoryconfigchanged = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Next we have to change the instance settings so that they match the dbachecks configuration and run the checks and test that they all pass.

We will (of course) use dbatools for this. First we need to find the command that we need

Find-DbaCommand jobserver

and then work out how to use it

Get-Help Set-DbaAgentServer -Detailed

There is an example that does exactly what we want 🙂 So we can run this.

$setDbaAgentServerSplat = @{
    MaximumJobHistoryRows = 1000
    MaximumHistoryRows = 10000
    SqlInstance = $containers.Where{$_ -ne 'localhost,15588'}
    SqlCredential = $cred
}
Set-DbaAgentServer @setDbaAgentServerSplat
$jobhistoryvaluechanged = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Run the Integration Tests

And then we will check that all of the checks are passing and failing as expected

Invoke-Pester .\DockerTests.ps1

Integration Test For Error Log Counts

There is another integration test there for the error logs count. This works in the same way. Here is the code


#region error Log Count - PR 583
# default test
$errorlogscountdefault = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru
# set a value and then it will fail
$null = Set-DbcConfig -Name policy.errorlog.logcount -Value 10
$errorlogscountconfigchanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru

# set the value and then it will pass
$null = Set-DbaErrorLogConfig -SqlInstance $containers -SqlCredential $cred -LogCount 10
$errorlogscountvaluechanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru
#endregion

Merge the Changes

So with all the tests passing I can merge the PR into the development branch and Azure DevOps will start a build. Ultimately, I would like to add the integration to the build as well following André‘s blog post but for now I used the GitHub Pull Request extension to merge the pull request into development which started a build and then merged that into master which signed the code and deployed it to the PowerShell gallery as you can see here and the result is

https://www.powershellgallery.com/packages/dbachecks/1.1.164

Visual Studio Code Live Sharing Set-Up

There was an announcement on the Visual Studio Code blog about the public preview of Live Share. This enables you to easily collaborate on code by securely sharing your coding session.

It is remarkably easy to set up 🙂

Installation

Open Visual Studio Code, open the Extensions side bar (CTRL + SHIFT + X)

Search for Live Share

Click Install and then reload when it has done

You will notice in the bottom bar it will say finishing the installation and if you open the terminal (CTRL + `) and click on Output and change the drop down on the right to Visual Studio Live Share you can see what it is doing

It is installing the dependencies as shown below

[Client I] Installing dependencies for Live Share…
[Client I] Downloading package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client I] Download complete.
[Client I] Downloading package ‘OmniSharp for Windows (.NET 4.6)’
[Client I] Download complete.
[Client I] Installing package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Moved and validated extracted files.
[Client I] Finished installing.
[Client I] Installing package ‘OmniSharp for Windows (.NET 4.6)’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Finished installing.
[Client I] No workspace id found.

Incidentally, this will also show the location of the log file

You will see in the bottom bar it will now say sign in

Clicking that will open a browser and give you a choice of accounts to sign in with, your GitHub or your Microsoft ID

Choose the one that you want to use and do your 2FA.

You do have 2FA on your Microsoft and GitHub (and all the other services)? If not go and set it up now – here for Microsoft and here for GitHub 

Once you have signed in you will get this notification which you can close

The icon in the bottom will change and show your account name and if you click it it will open the menu

Sharing

To share your session you click on the Share icon in the bottom bar or the Start collaboration session in the menu above. The first time you do this there will be a pop-up as shown

You can decide which way you (or your organisation) want to share. I chose to accept the firewall exception.

The invite link is in your clipboard ready to share with your friends and colleagues (other open source contributors ??)

They can either open the link in a browser

or by using the Join Collaboration Session in the menu in VS Code

Once they do the sharer will get a notification

and the person who has joined will have the same workspace opened in their Visual Studio Code

You can then collaborate on your code and share the session. In the video below, the left hand side is running in my jump box in Azure and the right hand side on my laptop. You can see that if you highlight code in one side it is shown in the other, and if you alter it in one side it is changed in the other. I also saved that file in the joined session rather than from the session that initialised the sharing and it then saved in both sessions 🙂

So that shows how easy it is to install and to use. You can dive deeper using the documentation.

 

Happy Collaborating 🙂

Easily Splatting PowerShell with VS Code

So I always like to show splatting PowerShell commands when I am presenting sessions or workshops and realised that I had not really blogged about it. (This blog is for @dbafromthecold who asked me to 🙂 )

What is Splatting?

Well you will know that when you call a PowerShell function you can use intellisense to get the parameters and sometimes the parameter values as well. This can leave you with a command that looks like this on the screen

Start-DbaMigration -Source $Source -Destination $Destination -BackupRestore -NetworkShare $Share -WithReplace -ReuseSourceFolderStructure -IncludeSupportDbs -NoAgentServer -NoAudits -NoResourceGovernor -NoSaRename -NoBackupDevices

It goes on and on and on, and while it is easy to type once, it is not so easy to see which values have been chosen. It is also not so easy to change the values.

Splatting the parameters makes it much easier to read and also to alter. So instead of the above you can have
$startDbaMigrationSplat = @{
    Source                     = $Source
    NetworkShare               = $Share
    NoAgentServer              = $true
    NoResourceGovernor         = $true
    WithReplace                = $true
    ReuseSourceFolderStructure = $true
    Destination                = $Destination
    NoAudits                   = $true
    BackupRestore              = $true
    NoSaRename                 = $true
    IncludeSupportDbs          = $true
    NoBackupDevices            = $true
}
Start-DbaMigration @startDbaMigrationSplat
This is much easier on the eye, but if you don't know what the parameters are (and are too lazy to use Get-Help – hint: you should always use Get-Help) or like the convenience and efficiency of using the intellisense, this might feel like a backward step that slows your productivity in the cause of easy-on-the-eye code.

Enter EditorServicesCommandSuite by SeeminglyScience for VS Code. Amongst the things it makes available to you is easy splatting, and people are always impressed when I show it.

You can install it from the PowerShell Gallery like all good modules using

Install-Module EditorServicesCommandSuite -Scope CurrentUser

and then add it to your VSCode PowerShell profile, usually found at C:\Users\USERNAME\Documents\WindowsPowerShell\Microsoft.VSCode_profile.ps1

# Place this in your VSCode profile
Import-Module EditorServicesCommandSuite
Import-EditorCommand -Module EditorServicesCommandSuite

and now creating a splat is as easy as this.

Write the command, leave the cursor on a parameter, hit F1 – choose PowerShell: Show Additional Commands (or use a keyboard shortcut), type splat and press enter. Done 🙂

So very easy 🙂
Happy Splatting 🙂

VS Code – Terminal crashes when formatting script

I love VS Code. I love being able to press ALT + SHIFT + F and format my code.

The Problem

Yesterday all I got when I pressed ALT + SHIFT + F was this

I could reproduce it at will. This was very frustrating.

Turning on Verbose Logging

To turn on verbose logging for the PowerShell Editor Services, go to the Cog in the bottom left, click it and then click User Settings.

Search for powershell.developer.editorServicesLogLevel

If you hover over the setting in the left hand pane a pencil will appear, click it and then click replace in settings

This will put the entry in the right hand side where you can change the value. Set it to Verbose and save

a prompt will come up asking if you want to restart PowerShell

When you restart PowerShell, if you click on  Output and choose PowerShell Extension Logs you will see the path to the log file

Reproduce the error

I then reproduced the error and opened the log file. This is what I got:

10/02/2018 09:11:19 [ERROR] – Method “OnListenTaskCompleted” at line 391 of C:\projects\powershelleditorservices\src\PowerShellEditorServices.Protocol\MessageProtocol\ProtocolEndpoint.cs

ProtocolEndpoint message loop terminated due to unhandled exception:

System.AggregateException: One or more errors occurred. —> System.Management.Automation.CommandNotFoundException: The term ‘Invoke-Formatter’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.Invoke(IEnumerable input, PSInvocationSettings settings)
at Microsoft.PowerShell.EditorServices.AnalysisService.InvokePowerShell(String command, IDictionary`2 paramArgMap)
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<InvokePowerShellAsync>d__31.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<Format>d__22.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

Open an issue on GitHub

I couldn't quickly see what was happening so I opened an issue on the vscode-powershell repo by going to issues and clicking new issue and following the instructions.

The Resolution

Keith Hill b | t pointed me to the resolution. Thank you Keith.

Further up in the log file there is a line where the editor services is loading the PSScriptAnalyzer module, and it should have the Invoke-Formatter command exported, but mine did not. It loaded the PSScriptAnalyzer module from my user's module directory

10/02/2018 09:11:01 [NORMAL] – Method “FindPSScriptAnalyzerModule” at line 354 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

PSScriptAnalyzer found at C:\Users\XXXX\Documents\WindowsPowerShell\Modules\PSScriptAnalyzer\1.10.0\PSScriptAnalyzer.psd1

10/02/2018 09:11:01 [VERBOSE] – Method “EnumeratePSScriptAnalyzerCmdlets” at line 389 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

The following cmdlets are available in the imported PSScriptAnalyzer module:
Get-ScriptAnalyzerRule
Invoke-ScriptAnalyzer

I ran

$Env:PSModulePath.Split(';')

to see the module paths

and looked in the .vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules directory. There was no PSScriptAnalyzer folder.

So I copied the PSScriptAnalyzer folder from the normal VS Code PowerShell Extension module folder into that folder and restarted PowerShell and I had my formatting back again 🙂
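
If you want to script that copy rather than do it in Explorer, it is only a couple of lines. The extension version numbers in these paths will almost certainly differ on your machine, so treat them as placeholders.

# Copy PSScriptAnalyzer from the stable extension into the Insiders extension
$source      = "$HOME\.vscode\extensions\ms-vscode.powershell-1.5.1\modules\PSScriptAnalyzer"
$destination = "$HOME\.vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules"
Copy-Item -Path $source -Destination $destination -Recurse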

I then reset the logging mode in my user settings back to Normal

Thank you Keith

How to write a PowerShell function to use Confirm, Verbose and WhatIf

In my last blog post I showed how to run a script with the WhatIf parameter. This assumes that the commands within the script have been written to use the common parameters Confirm, Verbose and WhatIf.

Someone asked me how to make sure that any functions that they write will be able to do this.

It is very easy.

When we define our function we are going to add [cmdletbinding(SupportsShouldProcess)] at the top

function Set-FileContent {
[cmdletbinding(SupportsShouldProcess)]
Param()

and every time we perform an action that will change something we put that code inside a code block like this

if ($PSCmdlet.ShouldProcess("The Item" , "The Change")) {
    # place code here
}

and alter The Item and The Change as appropriate.

I have created a snippet for VS Code to make this quicker for me. To add it to your VS Code, click the settings button bottom right, click User Snippets, choose the powershell json and add the code below between the last two }'s (don't forget the comma).

,
"IfShouldProcess": {
    "prefix": "IfShouldProcess",
    "body": [
        "if ($$PSCmdlet.ShouldProcess(\"The Item\" , \"The Change\")) {",
        "   # Place Code here",
        "}"
    ],
    "description": "Adds an if ShouldProcess block for supporting -WhatIf and -Confirm"
}

and save the powershell.json file

Then when you are writing your code you can simply type “ifs” and tab and the code will be generated for you

As an example I shall create a function wrapped around Set-Content just so that you can see what happens.

function Set-FileContent {
    [cmdletbinding(SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [ValidateNotNullOrEmpty()]
        [string]$Content,
        [Parameter(Mandatory = $true)]
        [ValidateScript( {Test-Path $_ })]
        [string]$File
    )
    if ($PSCmdlet.ShouldProcess("$File" , "Adding $Content to ")) {
        Set-Content -Path $File -Value $Content
    }
}

I have done this because if the file does not exist then Set-Content will create a new file for you, but with this function I can check that the file exists first with the ValidateScript before running the rest of the function.

As you can see I add variables from my PowerShell code into the “The Item” and “The Change”. If I need to add a property of an object I use $($Item.Property).
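
For example, if a function took a file object rather than a path, a property can go straight into the message. This is just a hypothetical fragment; $FileObject is not a parameter of the function above.

if ($PSCmdlet.ShouldProcess("$($FileObject.FullName)" , "Adding $Content to ")) {
    Set-Content -Path $FileObject.FullName -Value $Content
}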

So now, if I want to see what my new function would do if I ran it without actually making any changes I have -WhatIf added to my function automagically.

Set-FileContent -File C:\temp\number1\TextFile.txt -Content "This is the New Content" -WhatIf

If I want to confirm any action I take before it happens I have -Confirm

Set-FileContent -File C:\temp\number1\TextFile.txt -Content "This is the New Content" -Confirm

As you can see it also gives the confirm prompts for the Set-Content command.

You can also see the verbose messages with

Set-FileContent -File C:\temp\number1\TextFile.txt -Content "This is the New Content" -Verbose

So to summarise, it is really very simple to add Confirm, WhatIf and Verbose to your functions by placing  [cmdletbinding(SupportsShouldProcess)] at the top of the function and placing any code that makes a change inside

if ($PSCmdlet.ShouldProcess("The Item" , "The Change")) {

with some values that explain what the code is doing to the The Item and The Change.

Bonus Number 1 – This has added support for other common parameters as well – Debug, ErrorAction, ErrorVariable, WarningAction, WarningVariable, OutBuffer, PipelineVariable, and OutVariable.

Bonus Number 2 – This has automatically been added to your Help

Bonus Number 3 – This has reduced the amount of comments you need to write and improved other people's understanding of what your code is supposed to do 🙂 People can read your code and read what you have entered for the IfShouldProcess and that will tell them what the code is supposed to do 🙂

Now you have seen how easy it is to write more professional PowerShell functions