My current VS Code Extensions and using a workspace file

I have been asked a couple of times recently what my Visual Studio Code extensions are at the moment, so I thought I would write a quick post and also look at workspaces and how you can enable and disable extensions within them.

Listing Extensions

From the command line you can list your extensions using

code --list-extensions
code-insiders --list-extensions

My list looks like this
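
At the moment it returns something like this (the precise list changes all the time, and these are the IDs of the extensions described below):

beardedbear.beardedtheme
coenraads.bracket-pair-colorizer-2
davidanson.vscode-markdownlint
eamodio.gitlens
gerane.theme-blackboard
github.vscode-pull-request-github
medo64.render-crlf
ms-azuretools.vscode-docker
ms-kubernetes-tools.vscode-kubernetes-tools
ms-mssql.mssql
ms-vscode.azure-account
ms-vscode.azurecli
ms-vscode.powershell
ms-vscode-remote.vscode-remote-extensionpack
ms-vsliveshare.vsliveshare
ms-vsts.team
msazurermtools.azurerm-vscode-tools
pkief.material-icon-theme
redhat.vscode-yaml
shan.code-settings-sync
streetsidesoftware.code-spell-checker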

You can also see them in the Extensions view on the left of Visual Studio Code, which you can open with CTRL + SHIFT + X (unless, like me, you have Snagit installed and it has taken that shortcut).

Installing Extensions

You can install extensions by opening the Extensions view in Visual Studio Code and searching for the extension. The list I have below has the precise names for each extension which you can use to search

You can also install extensions from the command-line with

code --install-extension <extensionid>
code-insiders --install-extension <extensionid>

My Extensions

I am going to list these in alphabetical order by display name for ease (my ease that is!)

Because Chrissy LeMaire and I are writing dbatools in a Month of Lunches using AsciiDoc, it makes sense to have an extension enabling preview and syntax highlighting. You can find it here.

For interacting with Azure I use the Azure Account Extension – ms-vscode.azure-account

I use Azure CLI so I make use of the functionality of the Azure CLI Tools extension ms-vscode.azurecli

For interacting with Azure Repos I use the ms-vsts.team extension

When creating ARM templates, this extension is very useful msazurermtools.azurerm-vscode-tools

I have a few theme extensions, this one is for fun in demos 😉 beardedbear.beardedtheme

The blackboard theme is my default one gerane.theme-blackboard

Chasing closing brackets is much easier with the Bracket Pair Colorizer; I use the beta version coenraads.bracket-pair-colorizer-2

I am rubbish at spelling and typing so I use this to help point out the issues! streetsidesoftware.code-spell-checker

Using the Docker extension adds another view to Visual Studio Code to ease working with containers ms-azuretools.vscode-docker

As an open-source project maintainer it is good to be able to work with GitHub pull requests without leaving Visual Studio Code github.vscode-pull-request-github (Preview)

GitLens is absolutely invaluable when working with source control. It has so many features. This is an absolute must eamodio.gitlens

Working with Kubernetes? This extension adds another view for interacting with your cluster ms-kubernetes-tools.vscode-kubernetes-tools

Visual Studio Live Share enables you to collaborate in real-time in Visual Studio Code with your colleagues or friends. I blogged about this here ms-vsliveshare.vsliveshare

I love writing markdown and this linter assists me to ensure that my markdown is correct davidanson.vscode-markdownlint

The Material Icon Theme ensures that there are pretty icons in my editor! pkief.material-icon-theme

I have both the PowerShell extension ms-vscode.powershell and the PowerShell preview extension ms-vscode.powershell-preview installed but only one can be enabled at a time

This suite of extensions enables easy remote development so that you can develop your PowerShell scripts, for example, inside an Ubuntu container running PowerShell 7 or inside Windows Subsystem for Linux ms-vscode-remote.vscode-remote-extensionpack (Preview)

Writing for cross-platform means looking out for line endings and this extension will display them and any whitespace in your editor medo64.render-crlf

An absolutely essential extension which enables me to backup all of my Visual Studio Code settings, shortcuts, extensions into a GitHub gist and keep all of my machines feeling the same. shan.code-settings-sync

For working with SQL Server within Visual Studio Code and having a view for my instances as well as a linter and intellisense I use ms-mssql.mssql

Yaml files and spaces! I no longer get so confused with this extension to help me 🙂 redhat.vscode-yaml

Workspaces

Now that is a lot of extensions and I don't need all of them every time. I use workspaces to help with this. I will create a workspace file for the project I am working on.

I open or create the folders I will be working on and then click File and Save Workspace As and save the file in the root of the folder.

Now, the next time I want to open the workspace, I can open the workspace file or if I open the folder Visual Studio Code will helpfully prompt me

Now I can have all of my settings retained for that workspace

For this folder, I am ensuring that the PowerShell extension uses the PSScriptAnalyzer Settings file that I have created so that it will show if the code is compatible with the versions of PowerShell I have chosen. I can define settings for a workspace in the settings file, which you can open using CTRL and ,
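
In the workspace file the setting looks something like this (the settings file name here is an example; point it at wherever you saved yours):

"settings": {
    "powershell.scriptAnalysis.settingsPath": "PSScriptAnalyzerSettings.psd1"
}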

But I can also enable or disable extensions for a workspace

So every time I open this workspace I am only loading the extensions I want.


Building Azure SQL Db with Terraform using Azure DevOps

In my last post I showed how to create a Resource Group and an Azure SQLDB with Terraform using Visual Studio Code to deploy.

Of course, I haven't stopped there; who wants to manually run code to create things? There was a lot of install this and set up that. I would rather give the code to a build system and get it to run it. I can then even set it to automatically deploy new infrastructure when I commit some code to alter the configuration.

This scenario though is to build environments for presentations. Last time I created an Azure SQL DB and tagged it with DataInDevon (By the way you can get tickets for Data In Devon here – It is in Exeter on April 26th and 27th)

If I want to create the same environment but give it tags for a different event (this way I know when I can delete resources in Azure!) or name it differently, I can use Azure DevOps and alter the variables. I could just alter the code, commit the change and trigger a build, or I could create variables and enable them to be set at the time the job is run. I use the former in “work” situations and the latter for my presentations environment.

I have created a project in Azure DevOps for my Presentation Builds. I will be using GitHub to share the code that I have used. Once I clicked on pipelines, this is the page I saw

Clicking new pipeline, Azure DevOps asked me where my code was

I chose GitHub, authorised and chose the repository.

I then chose Empty Job on the next page. See the Configuration as code choice? We will come back to that later and our infrastructure as code will be deployed with a configuration as code 🙂

The next page allows us to give the build a good name and choose the Agent Pool that we want to use. Azure DevOps gives 7 different hosted agents running Linux, macOS or Windows, or you can download an agent and run it on your own machines. We will use the default agent for this process.

Clicking on Agent Job 1 enables me to change the name of the Agent Job. I could also choose a different type of Agent for different jobs within the same pipeline. This would be useful for testing different OS’s for example but for right now I shall just name it properly.

State

First we need somewhere to store the state of our build so that if we re-run it the Terraform plan step will be able to work out what it needs to do. (This is not absolutely required just for building my presentation environments and this might not be the best way to achieve this but for right now this is what I do and it works.)

I click on the + and search for Azure CLI.

and click on the Add button which gives me some boxes to fill in.

I choose my Azure subscription from the first drop down and choose Inline Script from the second

Inside the script block I put the following code
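
Something like this (the container name is an example; the $() values are the Azure DevOps variables):

# resource group for the Terraform state
az group create --name $(TerraformStorageRG) --location $(location)

# storage account to hold the state file
az storage account create --resource-group $(TerraformStorageRG) --name $(TerraformStorageAccount) --sku Standard_LRS

# container for the tfstate file
az storage container create --name terraform --account-name $(TerraformStorageAccount)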

This will create a Resource Group, a storage account and a container and use some variables to provide the values, we will come back to the variables later.

Access Key

The next thing that we need to do is to enable the job to access the storage account. We don't want to store that key anywhere, but we can use our Azure DevOps variables and some PowerShell to gather the access key and write it to the variable when the job is running. To create the variables I clicked on the variables tab

and then added the variables with the following names TerraformStorageRG, TerraformStorageAccount and location from the previous task and TerraformStorageKey for the next task.

With those created, I go back to Tasks and add an Azure PowerShell task

I then add this code to get the access key and overwrite the variable.
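
Something along these lines, using the ##vso logging command to overwrite the TerraformStorageKey variable for the rest of the job:

# get the first access key for the storage account
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName "$(TerraformStorageRG)" -Name "$(TerraformStorageAccount)").Value[0]
# overwrite the TerraformStorageKey variable with it
Write-Host "##vso[task.setvariable variable=TerraformStorageKey]$key"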

Infrastructure as Code

In my GitHub repository I now have the following folders

The manual folders hold the code from the last blog post. In the Build folder, the main.tf file is identical and looks like this.
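
It is along these lines (a sketch of the shape rather than the exact file):

provider "azurerm" {}

resource "azurerm_resource_group" "presentation" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
}

resource "azurerm_sql_server" "presentation" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.presentation.name}"
  location                     = "${azurerm_resource_group.presentation.location}"
  version                      = "12.0"
  administrator_login          = "${var.SQLServerAdminUser}"
  administrator_login_password = "${var.SQLServerAdminPassword}"
}

resource "azurerm_sql_database" "presentation" {
  name                             = "${var.SqlDatabaseName}"
  resource_group_name              = "${azurerm_resource_group.presentation.name}"
  location                         = "${azurerm_resource_group.presentation.location}"
  server_name                      = "${azurerm_sql_server.presentation.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"

  tags {
    presentation = "${var.Presentation}"
  }
}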

The variables.tf file looks like this.
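
For example (abridged):

variable "ResourceGroupName" {
  default = "__ResourceGroupName__"
}

variable "location" {
  default = "__location__"
}

variable "SqlServerName" {
  default = "__SqlServerName__"
}

# ...and so on for each value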

It is exactly the same except that the values have been replaced by the value name prefixed and suffixed with __. This will enable me to replace the values with the variables in my Azure DevOps Build job.

The backend-config.tf file will store the details of the state that will be created by the first step and use the access key that has been retrieved in the second step.
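
The shape of the pair of files is something like this (the container and key names are examples):

# backend-config.tf - tell Terraform to keep its state in an azurerm backend
terraform {
  backend "azurerm" {}
}

# backend-config.tfvars - the values for that backend, with the access key token replaced during the build
storage_account_name = "__TerraformStorageAccount__"
container_name       = "terraform"
key                  = "terraform.tfstate"
access_key           = "__TerraformStorageKey__"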

I need to add the following variables to my Azure DevOps Build – Presentation, ResourceGroupName, SqlServerName, SQLServerAdminUser, SQLServerAdminPassword, SqlDatabaseName, Edition and ServiceObjective. Personally I would advise setting the password or any other sensitive values to sensitive by clicking the padlock for that variable. This will stop the value being written to the log as well as hiding it behind *'s

Because I have tagged the variables with Settable at queue time, I can set the values whenever I run a build, so if I am at a different event I can change the name.

But the build job hasn’t been set up yet. First we need to replace the values in the variables file.

Replace the Tokens

I installed the Replace Tokens Task from the marketplace and added that to the build.

I am going to use a standard naming convention for my infrastructure code files so I add Build to the Root Directory. You can also click the ellipses and navigate to a folder in your repo. In the Target Files I add **/*.tf and **/*.tfvars, which will search all of the folders (**) and only work on files with a .tf or .tfvars extension. The next step is to make sure that the replacement prefix and suffix are correct. It is hidden under Advanced.

Because I often forget this step and to aid in troubleshooting I add another step to read the contents of the files and place them in the logs. I do this by adding a PowerShell step which uses
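
Something as simple as this does it:

# write the contents of every .tf and .tfvars file into the build log
Get-ChildItem -Path .\Build -Recurse -Include *.tf, *.tfvars |
    ForEach-Object {
        Write-Output "*** $($_.FullName) ***"
        Get-Content -Path $_.FullName
    }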

Under control options there is a check box to enable or disable the steps so once I know that everything is ok with the build I will disable this step. The output in the log of a build will look like this showing the actual values in the files. This is really useful for finding spaces :-).

Running the Terraform in Azure DevOps

With everything set up we can now run the Terraform. I installed the Terraform task from the marketplace and added a task. We are going to follow the same process as the last blog post, init, plan, apply but this time we are going to automate it 🙂

First we will initialise

I put Build in the Terraform Template path. The Terraform arguments are
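
init -backend-config=backend-config.tfvars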

which will tell Terraform to use the backend-config.tfvars file for the state. It is important to tick the Install terraform checkbox to ensure that terraform is available on the agent and to add the Azure Subscription (or Service Endpoint in a corporate environment).

After the Initialise, I add the Terraform task again add Build to the target path and this time the argument is plan

Again, tick the install terraform checkbox and also the Use Azure Service Endpoint and choose the Azure Subscription.

We also need to tell the Terraform where to find the tfstate file by specifying the variables for the resource group and storage account and the container

Finally, add another Terraform task for the apply remembering to tick the install Terraform and Use Azure checkboxes

The arguments are
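
apply -auto-approve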

This will negate the requirement for the “Only “yes” will be accepted to approve” from the manual steps post!

Build a Thing

Now we can build the environment – Clicking Save and Queue

opens this dialogue

where the variables can be filled in.

The build will be queued and clicking on the build number will open the logs

6 minutes later the job has finished

and the resources have been created.

If I want to look in the logs of the job I can click on one of the steps and take a look. This is the apply step

Do it Again For Another Presentation

So that is good, I can create my environment as I want it. Once my presentation has finished I can delete the Resource Groups. When I need to do the presentation again, I can queue another build and change the variables

The job will run

and the new resource group will be created

all ready for my next presentation 🙂

This is brilliant, I can set up the same solution for different repositories for different presentations (infrastructure) and recreate the above steps.

The next post will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

The first post showed how to build an Azure SQLDB with Terraform using VS Code

Building Azure SQL Db with Terraform with Visual Studio Code

I have been using Terraform for the last week or so to create some infrastructure and decided to bring that knowledge back to a problem that I and others suffer from – building environments for presentations, all for the sake of doing some learning.

What is Terraform?

According to the website


HashiCorp Terraform enables you to safely and predictably create, change, and improve infrastructure. It is an open source tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned


https://www.terraform.io/

This means that I can define my infrastructure as code. If I can do that then I can reliably do the same thing again and again, at work to create environments that have the same configuration or outside of work to repeatedly build the environment I need.

Building an Azure SQL Database with Terraform

To understand how to build a thing the best place to start is the documentation https://www.terraform.io/docs. For an Azure SQL DB in the docs you will find a block of code that looks like this
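
At the time of writing it is along these lines:

resource "azurerm_resource_group" "test" {
  name     = "acceptanceTestResourceGroup1"
  location = "West US"
}

resource "azurerm_sql_server" "test" {
  name                         = "mysqlserver"
  resource_group_name          = "${azurerm_resource_group.test.name}"
  location                     = "West US"
  version                      = "12.0"
  administrator_login          = "4dm1n157r470r"
  administrator_login_password = "4-v3ry-53cr37-p455w0rd"
}

resource "azurerm_sql_database" "test" {
  name                = "mysqldatabase"
  resource_group_name = "${azurerm_resource_group.test.name}"
  location            = "West US"
  server_name         = "${azurerm_sql_server.test.name}"

  tags {
    environment = "production"
  }
}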

If you read the code, you can see that there are key value pairs defining information about the resource that is being created. Anything inside a ${} is a dynamic reference. So

${azurerm_resource_group.test.name}

refers to the name property in the azurerm_resource_group block called test (or the name of the resource group 🙂 )

Infrastructure As Code

So I can put that code into a file (name it main.tf) and alter it with the values and “run Terraform” and what I want will be created. Let's take it a step further though because I want to be able to reuse this code. Instead of hard-coding all of the values I am going to use variables. I can do this by creating another file called variables.tf which looks like this.
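
In shape it is something like this (the defaults shown are examples to replace with your own):

variable "ResourceGroupName" {
  default = "beardrules"
}

variable "location" {
  default = "uksouth"
}

variable "SqlServerName" {
  default = "mysqlserver"
}

# ...and a variable block for each of the remaining values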

and my main.tf then looks like this.
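
Something like this, with every value now coming from a variable:

provider "azurerm" {}

resource "azurerm_resource_group" "sqldb" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
}

resource "azurerm_sql_server" "sqlserver" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.sqldb.name}"
  location                     = "${azurerm_resource_group.sqldb.location}"
  version                      = "12.0"
  administrator_login          = "${var.SQLServerAdminUser}"
  administrator_login_password = "${var.SQLServerAdminPassword}"
}

resource "azurerm_sql_database" "sqldb" {
  name                             = "${var.SqlDatabaseName}"
  resource_group_name              = "${azurerm_resource_group.sqldb.name}"
  location                         = "${azurerm_resource_group.sqldb.location}"
  server_name                      = "${azurerm_sql_server.sqlserver.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"
}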

You can find these files in my GitHub Repository here.

Alright – deploy something

To deploy the code that I have written I need to download Terraform from https://www.terraform.io/downloads.html and then extract the exe to a folder in my PATH (I chose C:\Windows). Then in Visual Studio Code I installed two extensions: the Terraform extension by Mikael Olenfalk, which enables syntax highlighting and auto-completion for the tf files, and the Azure Terraform extension. You will also need Node.js from here.

With those in place I navigated to the directory holding my files in Visual Studio Code and pressed F1 and started typing azure terraform and chose Azure Terraform Init

I was then prompted to use Cloud Shell and a browser opened to login. Once I had logged in I waited until I saw this

I pressed F1 again and this time chose Azure Terraform plan. This is going to show me what Terraform is going to do if it applies this configuration.

You can see what is going to be created. It is going to create 3 things

Once you have checked that the plan is what you want, press F1 again and choose Azure Terraform Apply

You are then asked to confirm that this is what you want. Only “yes” will be accepted. Then you will see the infrastructure being created

and a minute later

and Jeremy exists in the beardrules resource group

Then once I have finished using the SQL instance, I can press F1 again and choose Azure Terraform Destroy. Again there is a confirmation required.

and you will see the progress for 46 seconds

and all of the resources have gone.

That's a good start. This enables me to create resources quickly and easily and keep the configuration for them safely in source control, easy to use again.

In my next post I will create an Azure DevOps pipeline to deploy an Azure SQL DB with Terraform.

The post after will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

Using Docker to run Integration Tests for dbachecks

My wonderful friend André Kamman wrote a fantastic blog post this week SQL Server Container Instances via Cloudshell about how he uses containers in Azure to test code against different versions of SQL Server.

It reminded me that I do something very similar to test dbachecks code changes. I thought this might make a good blog post. I will talk through how I do this locally as I merge a PR from another great friend Cláudio Silva who has added agent job history checks.

GitHub PR VS Code Extension

I use the GitHub Pull Requests extension for VS Code to work with pull requests for dbachecks. This enables me to see all of the information about the Pull Request, merge it, review it and comment on it, all from VS Code

I can also see which files have been changed and which changes have been made

Once I am ready to test the pull request I perform a checkout using the extension

This will update all of the files in my local repository with all of the changes in this pull request

You can see at the bottom left that the branch changes from development to the name of the PR.

Running The Unit Tests

The first thing that I do is to run the Unit Tests for the module. These will test that the code is following all of the guidelines that we require and that the tests are formatted in the correct way for the Power BI to parse. I have blogged about this here and here and we use this Pester in our CI process in Azure DevOps, which I described here.

I navigate to the root of the dbachecks repository on my local machine and run
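
# from the root of the local dbachecks repository (the unit tests live in the tests folder)
Invoke-Pester .\tests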

and after about a minute

Thank you Cláudio, the code has passed the tests 😉

Running Some Integration Tests

The difference between Unit tests and Integration tests in a nutshell is that the Unit tests are testing that the code is doing what is expected without any other external influences whilst the Integration tests are checking that the code is doing what is expected when running on an actual environment. In this scenario we know that the code is doing what is expected but we want to check what it does when it runs against a SQL Server and even when it runs against multiple SQL Servers of different versions.

Multiple Versions of SQL Server

As I have described before my friend and former colleague Andrew Pruski b | t has many resources for running SQL in containers. This means that I can quickly and easily create fresh uncontaminated instances of SQL 2012, 2014, 2016 and 2017 really quickly.

I can create 4 instances of different versions of SQL in (a tad over) 1 minute. How about you?

Imagine how long it would take to run the installers for 4 versions of SQL and the pain you would have trying to uninstall them and make sure everything is ‘clean’. Even images that have been sysprep’d won’t be done in 1 minute.

Docker Compose Up?

So what is this magic command that has enabled me to do this? docker-compose uses a YAML file to define multi-container applications. This means that with a file called docker-compose.yml like this
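
(a sketch using Andrew's images; check his Docker Hub repositories for the exact image names and tags)

version: '3'

services:
    sql2012:
        image: dbafromthecold/sqlserver2012dev:sp4
        ports:
            - "15586:1433"
    sql2014:
        image: dbafromthecold/sqlserver2014dev:sp2
        ports:
            - "15587:1433"
    sql2016:
        image: dbafromthecold/sqlserver2016dev:sp2
        ports:
            - "15588:1433"
    sql2017:
        image: dbafromthecold/sqlserver2017dev
        ports:
            - "15589:1433"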

and in that directory just run
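
docker-compose up -d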

and 4 SQL containers are available to you. You can interact with them via SSMS if you wish with localhost,PORTNUMBER. The port numbers in the above file are 15586, 15587, 15588 and 15589

Now it must be noted, as I describe here, that first I pulled the images to my laptop. The first time you run docker-compose it will take significantly longer if you haven't pulled the images already (pulling the images will take quite a while depending on your broadband speed)

Credential

The next thing is to save a credential to make it easier to automate. I use the method described by my PowerShell friend Jaap Brasser here. I run this code
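
# store the sa credential on disk, encrypted for this user on this machine (the path is up to you)
Get-Credential | Export-Clixml -Path $env:USERPROFILE\SQLCredential.xml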

and then I can create a credential object using
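
$cred = Import-Clixml -Path $env:USERPROFILE\SQLCredential.xml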

Check The Connections

I ensure a clean session by removing the dbatools and dbachecks modules and then import the local version of dbachecks and set some variables
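
Something like this, using the ports from the docker-compose file:

# remove any loaded copies and import the version in this repository
Remove-Module dbatools, dbachecks -ErrorAction SilentlyContinue
Import-Module .\dbachecks.psd1

# the four containers
$sql2012 = 'localhost,15586'
$sql2014 = 'localhost,15587'
$sql2016 = 'localhost,15588'
$sql2017 = 'localhost,15589'
$instances = $sql2012, $sql2014, $sql2016, $sql2017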

Now I can start to run my Integration tests. First reset the dbachecks configuration and set some configuration values
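
# start from a clean configuration, then point dbachecks at the containers
Reset-DbcConfig
Set-DbcConfig -Name app.sqlinstance -Value $instances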

Then I will run the dbachecks connectivity checks and save the results to a variable without showing any output
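
# InstanceConnection is the connectivity check tag
$ConnectivityResults = Invoke-DbcCheck -SqlInstance $instances -SqlCredential $cred -Check InstanceConnection -Show None -PassThru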

I can then use Pester to check that dbachecks has worked as expected by testing if the failedcount property returned is 0.
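
Describe "Connectivity Checks" {
    It "The connectivity checks should not have any failures" {
        $ConnectivityResults.FailedCount | Should -Be 0
    }
}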

What is the Unit Test for this PR?

Next I think about what we need to be testing for this PR. The Unit tests will help us.

Choose some Integration Tests

This check is checking the Agent job history settings and the unit tests are

  • It “Passes Check Correctly with Maximum History Rows disabled (-1)”
  • It “Fails Check Correctly with Maximum History Rows disabled (-1) but configured value is 1000”
  • It “Passes Check Correctly with Maximum History Rows being 10000”
  • It “Fails Check Correctly with Maximum History Rows being less than 10000”
  • It “Passes Check Correctly with Maximum History Rows per job being 100”
  • It “Fails Check Correctly with Maximum History Rows per job being less than 100”

So we will check the same things on real actual SQL Servers. First though we need to start the SQL Server Agent as it is not started by default. We can do this as follows
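
These are Windows containers, so one way is to start the service inside each container with docker exec (the container names here are examples; check yours with docker ps):

'dbachecks_sql2012_1', 'dbachecks_sql2016_1', 'dbachecks_sql2017_1' | ForEach-Object {
    docker exec $_ powershell -Command "Start-Service SQLSERVERAGENT"
}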

Unfortunately, the agent service won't start in the SQL 2014 container so I can't run agent integration tests for that container, but it's better than no integration tests.

This is What We Will Test

So we want to test if the check will pass with default settings. In general, dbachecks will pass for default instance, agent or database settings values by default.

We also want the check to fail if the configured value for dbachecks is set to default but the value has been set on the instance.

We want the check to pass if the configured value for the dbachecks configuration is set and the instance (agent, database) setting matches it.

If You Are Doing Something More Than Once ……

Let’s automate that. We are going to be repeatedly running those three tests for each setting that we are running integration tests for. I have created 3 functions for this, again checking that FailedCount or PassedCount is 0 depending on the test.
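
They look something like this (one of the three shown; the function name is mine):

function Invoke-DbcCheckShouldPass {
    param($Check)
    # run the check quietly and assert that nothing failed
    $results = Invoke-DbcCheck -SqlInstance $instances -SqlCredential $cred -Check $Check -Show None -PassThru
    It "$Check should have no failed checks" {
        $results.FailedCount | Should -Be 0
    }
}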

Now I can use those functions inside a loop in my Integration Pester Test
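
In the style of:

Describe "Integration Testing the Checks" {
    foreach ($check in $TestingTheChecks) {
        Invoke-DbcCheckShouldPass -Check $check
    }
}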

Write Some Integration Tests

So for this new test I have added a value to the TestingTheChecks array, then I can test my checks, starting with the default check.

Now I need to change the configurations so that they do not match the defaults and run the checks again

Next we have to change the instance settings so that they match the dbachecks configuration and run the checks and test that they all pass.

We will (of course) use dbatools for this. First we need to find the command that we need
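
# search the dbatools help for a likely command
Find-DbaCommand -Pattern history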

and then work out how to use it with Get-Help

There is an example that does exactly what we want 🙂 So we can run this.
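
The command is Set-DbaAgentServer, so something like:

# set the agent history settings on the containers to match the dbachecks configuration
Set-DbaAgentServer -SqlInstance $instances -SqlCredential $cred -MaximumHistoryRows 10000 -MaximumJobHistoryRows 100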

Run the Integration Tests

And then we will check that all of the checks are passing and failing as expected

Integration Test For Error Log Counts

There is another integration test there for the error logs count. This works in the same way; in outline the code looks like this (assuming the ErrorLogCount check tag and its policy.errorlog.logcount configuration).
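
# set the dbachecks configuration and the instances to the same value, then run the check
Set-DbcConfig -Name policy.errorlog.logcount -Value 30
Set-DbaErrorLogConfig -SqlInstance $instances -SqlCredential $cred -LogCount 30
$ErrorLogResults = Invoke-DbcCheck -SqlInstance $instances -SqlCredential $cred -Check ErrorLogCount -Show None -PassThru
$ErrorLogResults.FailedCount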

Merge the Changes

So with all the tests passing I can merge the PR into the development branch and Azure DevOps will start a build. Ultimately, I would like to add the integration to the build as well following André‘s blog post but for now I used the GitHub Pull Request extension to merge the pull request into development which started a build and then merged that into master which signed the code and deployed it to the PowerShell gallery as you can see here and the result is

https://www.powershellgallery.com/packages/dbachecks/1.1.164

Visual Studio Code Live Sharing Set-Up

There was an announcement on the Visual Studio Code blog about the public preview of Live Share. This enables you to easily collaborate on code by securely sharing your coding session.

It is remarkably easy to set up 🙂

Installation

Open Visual Studio Code, open the Extensions side bar (CTRL + SHIFT + X)

[screenshot: open extensions]

Search for Live Share

[screenshot: search]

Click Install and then reload when it has done

[screenshot: reload]

You will notice in the bottom bar it will say finishing the installation and if you open the terminal (CTRL + `) and click on Output and change the drop down on the right to Visual Studio Live Share you can see what it is doing

[screenshot: finishing installation]

It is installing the dependencies as shown below

[Client I] Installing dependencies for Live Share…
[Client I] Downloading package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client I] Download complete.
[Client I] Downloading package ‘OmniSharp for Windows (.NET 4.6)’
[Client I] Download complete.
[Client I] Installing package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Moved and validated extracted files.
[Client I] Finished installing.
[Client I] Installing package ‘OmniSharp for Windows (.NET 4.6)’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Finished installing.
[Client I] No workspace id found.
Incidentally, this will also show the location of the log file

You will see in the bottom bar it will now say sign in

[screenshot: sign in]

Clicking that will open a browser and give you a choice of accounts to sign in with, your GitHub or your Microsoft ID

[screenshot: sign in choices]

Choose the one that you want to use and do your 2FA.

[screenshot: 2FA]

You do have 2FA on your Microsoft and GitHub (and all the other services)? If not go and set it up now – here for Microsoft and here for GitHub 

Once you have signed in you will get this notification which you can close

[screenshot: close this notification]

The icon in the bottom will change and show your account name and if you click it it will open the menu

[screenshot: sharing menu]

Sharing

To share your session you click on the Share icon in the bottom bar or the Start collaboration session in the menu above. The first time you do this there will be a pop-up as shown

[screenshot: firewall popup]

You can decide which way you (or your organisation) want to share. I chose to accept the firewall exception.

[screenshot: invite link]

The invite link is in your clipboard ready to share with your friends and colleagues (other open source contributors ??)

They can either open the link in a browser

[screenshot: join via browser]

or by using the Join Collaboration Session in the menu in VS Code

[screenshot: join via VS Code]

Once they do the sharer will get a notification

[screenshot: notification of sharing]

and the person who has joined will have the same workspace opened in their Visual Studio Code

[screenshot: shared workspace]

You can then collaborate on your code and share the session. In the video below the left hand side is running in my jump box in Azure and the right hand side on my laptop and you can see that if you highlight code in one side it is shown in the other and if you alter it in one side it is changed in the other. I also saved that file in the joined session rather than from the session that initialised the sharing and it then saved in both sessions 🙂

So that shows how easy it is to install and to use. You can dive deeper using the documentation.


Happy Collaborating 🙂

Easily Splatting PowerShell with VS Code

So I always like to show splatting PowerShell commands when I am presenting sessions or workshops and realised that I had not really blogged about it. (This blog is for @dbafromthecold who asked me to 🙂 )

What is Splatting?

Well you will know that when you call a PowerShell function you can use intellisense to get the parameters and sometimes the parameter values as well. This can leave you with a command that looks like this on the screen
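
A made-up example (the values are illustrative):

Send-MailMessage -SmtpServer smtp.example.com -To dba@example.com -From beard@example.com -Subject 'The Overnight Backups' -Body 'They all failed, panic!' -Priority High -Attachments C:\temp\log.txt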

It goes on and on and on and while it is easy to type once, it is not so easy to see which values have been chosen. It is also not so easy to change the values.
By Splatting the parameters it makes it much easier to read and also to alter. So instead of the above you can have
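
The same command splatted:

$mailSplat = @{
    SmtpServer  = 'smtp.example.com'
    To          = 'dba@example.com'
    From        = 'beard@example.com'
    Subject     = 'The Overnight Backups'
    Body        = 'They all failed, panic!'
    Priority    = 'High'
    Attachments = 'C:\temp\log.txt'
}
Send-MailMessage @mailSplat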
This is much easier on the eye, but if you don't know what the parameters are (and are too lazy to use Get-Help – hint: you should always use Get-Help) or like the convenience and efficiency of using the intellisense, this might feel like a backward step that slows your productivity in the cause of easy-on-the-eye code.
Enter EditorServicesCommandSuite by SeeminglyScience for VS Code. Amongst the things it makes available to you is easy splatting and people are always impressed when I show it
You can install it from the PowerShell Gallery like all good modules using
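
Install-Module EditorServicesCommandSuite -Scope CurrentUser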
and then add it to your VSCode PowerShell profile usually found at C:\Users\USERNAME\Documents\WindowsPowerShell\Microsoft.VSCode_profile.ps1
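
by adding these lines to it:

Import-Module EditorServicesCommandSuite
Import-EditorCommand -Module EditorServicesCommandSuite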
and now creating a splat is as easy as this.
Write the command, leave the cursor on a parameter, hit F1 – choose PowerShell: Show Additional Commands (or use a keyboard shortcut), type splat, press enter. Done 🙂

So very easy 🙂
Happy Splatting 🙂

VS Code – Terminal crashes when formatting script

I love VS Code. I love being able to press ALT + SHIFT + F and format my code.

[gif: formatting]

The Problem

Yesterday all I got when I pressed ALT + SHIFT + F was this

[screenshot: format error]

I could reproduce it at will. This was very frustrating.

Turning on Verbose Logging

To turn on verbose logging for the PowerShell Editor Services go to the Cog in the bottom left, click it and then click User Settings.

Search for powershell.developer.editorServicesLogLevel

[screenshot: powershell.developer.editorServicesLogLevel]

If you hover over the left hand channel a pencil will appear, click it and then click replace in settings

[screenshot: edit settings]

This will put the entry in the right hand side where you can change the value. Set it to Verbose and save

[screenshot: user settings]

a prompt will come up asking if you want to restart PowerShell

[screenshot: start a new session]

When you restart PowerShell, if you click on  Output and choose PowerShell Extension Logs you will see the path to the log file

[screenshot: log file path]

Reproduce the error

I then reproduced the error and opened the log file this is what I got

10/02/2018 09:11:19 [ERROR] – Method “OnListenTaskCompleted” at line 391 of C:\projects\powershelleditorservices\src\PowerShellEditorServices.Protocol\MessageProtocol\ProtocolEndpoint.cs

ProtocolEndpoint message loop terminated due to unhandled exception:

System.AggregateException: One or more errors occurred. —> System.Management.Automation.CommandNotFoundException: The term ‘Invoke-Formatter’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.Invoke(IEnumerable input, PSInvocationSettings settings)
at Microsoft.PowerShell.EditorServices.AnalysisService.InvokePowerShell(String command, IDictionary`2 paramArgMap)
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<InvokePowerShellAsync>d__31.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<Format>d__22.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

Open an issue on GitHub

I couldn't quickly see what was happening so I opened an issue on the vscode-powershell repo by going to issues, clicking new issue and following the instructions

[screenshot: new issue]

The Resolution

Keith Hill b | t pointed me to the resolution. Thank you Keith.

Further up in the log file there is a line where the editor services is loading the PSScriptAnalyzer module and it should have the Invoke-Formatter command exported, but mine was not. It loaded the PSScriptAnalyzer module from my user's module directory

10/02/2018 09:11:01 [NORMAL] – Method “FindPSScriptAnalyzerModule” at line 354 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

PSScriptAnalyzer found at C:\Users\XXXX\Documents\WindowsPowerShell\Modules\PSScriptAnalyzer\1.10.0\PSScriptAnalyzer.psd1

10/02/2018 09:11:01 [VERBOSE] – Method “EnumeratePSScriptAnalyzerCmdlets” at line 389 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

The following cmdlets are available in the imported PSScriptAnalyzer module:
Get-ScriptAnalyzerRule
Invoke-ScriptAnalyzer

I ran
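
# show the module paths
$env:PSModulePath -split ';'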

to see the module paths

[screenshot: module path]

and looked in the .vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules directory. There was no PSScriptAnalyzer folder

[screenshot: no module]

So I copied the PSScriptAnalyzer folder from the normal VS Code PowerShell Extension module folder into that folder and restarted PowerShell and I had my formatting back again 🙂

I then reset the logging mode in my user settings back to Normal

Thank you Keith

How to write a PowerShell function to use Confirm, Verbose and WhatIf

In my last blog post I showed how to run a script with the WhatIf parameter. This assumes that the commands within the script have been written to use the common parameters Confirm, Verbose and WhatIf.

Someone asked me how to make sure that any functions that they write will be able to do this.

It is very easy.

When we define our function we are going to add [cmdletbinding(SupportsShouldProcess)] at the top

and every time we perform an action that will change something we put that code inside a code block like this
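
if ($PSCmdlet.ShouldProcess("The Item", "The Change")) {
    # the code that makes the change goes here
}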

and alter The Item and The Change as appropriate.

I have created a snippet for VS Code to make this quicker for me. To add it to your VS Code. Click the settings button bottom right, Click User Snippets, choose the powershell json and add the code below between the last two }’s (Don’t forget the comma)
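
Mine looks something like this (the name and prefix are up to you):

"IfShouldProcess": {
    "prefix": "ifs",
    "body": [
        "if (\\$PSCmdlet.ShouldProcess(\"The Item\", \"The Change\")) {",
        "    # Code Here",
        "}"
    ],
    "description": "Adds an If Should Process block"
}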

and save the powershell.json file

Then when you are writing your code you can simply type “ifs” and tab and the code will be generated for you

As an example I shall create a function wrapped around Set-Content just so that you can see what happens.

I have done this before because if the file does not exist then Set-Content will create a new file for you, but with this function I can check if the file exists first with the ValidateScript before running the rest of the function.
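
A sketch of it (the function and parameter names are mine):

function Set-FileContent {
    [cmdletbinding(SupportsShouldProcess)]
    param (
        [Parameter(Mandatory = $true)]
        [string]$Value,
        # only accept a file that already exists
        [Parameter(Mandatory = $true)]
        [ValidateScript({ Test-Path $_ })]
        [string]$File
    )
    if ($PSCmdlet.ShouldProcess("$File", "Setting the content to $Value")) {
        Set-Content -Path $File -Value $Value
    }
}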

As you can see I add variables from my PowerShell code into the “The Item” and “The Change”. If I need to add a property of an object I use $($Item.Property).

So now, if I want to see what my new function would do if I ran it without actually making any changes I have -WhatIf added to my function automagically.

If I want to confirm any action I take before it happens I have -Confirm

As you can see it also gives the confirm prompts for the Set-Content command

You can also see the verbose messages by using the -Verbose switch

So to summarise, it is really very simple to add Confirm, WhatIf and Verbose to your functions by placing [cmdletbinding(SupportsShouldProcess)] at the top of the function and placing any code that makes a change inside an if ($PSCmdlet.ShouldProcess()) block with some values that explain what the code is doing to The Item and The Change.

Bonus Number 1 – This has added support for other common parameters as well – Debug, ErrorAction, ErrorVariable, WarningAction, WarningVariable, OutBuffer, PipelineVariable, and OutVariable.

Bonus Number 2 – This has automatically been added to your Help

Bonus Number 3 – This has reduced the amount of comments you need to write and improved other people's understanding of what your code is supposed to do 🙂 People can read your code and read what you have entered for the IfShouldProcess and that will tell them what the code is supposed to do 🙂

Now you have seen how easy it is to write more professional PowerShell functions

TSQL2sDay – Get-PostRoundup

First an apology, this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling, unfortunately I found a major error in Pester (again not strictly true but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, If you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, Default settings, best practices, migrations, Warnings in Agent Jobs, sqlpackage, VLFs, CMS, Disabling Named Pipes, Orphaned users, AG Status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first timers to the TSQL2sDay monthly blog party. Please don’t let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the media of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!



#TSQL2sDay – Starting Out with PowerShell

This month's TSQL2sday is hosted by me!

Surprise! – I chose PowerShell

I am really looking forward to seeing what other people post. Some advanced scripts that will help you and show you how you can use PowerShell to save you time and hopefully some beginner posts explaining experiences or showing you how to start with PowerShell and SQL Server. I decided to go with the latter.

First though a warning.

PowerShell is another language, you are not going to be as proficient in a new language as you are in the language you spend all day working with. You will have to go through the learning curve and you will have to understand how to interpret errors. There is a learning curve just like with any language. Undoubtedly you will get frustrated at times. Reach out for help. Use twitter, use the #powershellhelp in the SQL Server Community Slack channel. There are many other places and plenty of people who will be glad to help you.

Open the Editor

If you are using a Windows machine you will have PowerShell installed. You will find it in your start menu on Windows 7 under All Programs, Accessories, Windows PowerShell folder, and then click Windows PowerShell or Windows PowerShell ISE. On Windows 8 or 10 search for PowerShell.

PowerShell or PowerShell ISE? Use PowerShell instead of cmd.exe for command line usage and ISE for developing scripts, functions and modules. (However, read on before making your choice)

PowerShell on Other O/S’s (and Windows!)

Just as SQL Server is available on Linux, PowerShell is also available cross-platform; you can run and edit PowerShell on Linux and on Mac natively. The best editor to use is VS Code, which is also available cross-platform. In fact, as Microsoft's David Wilson says in this blog post

The PowerShell ISE has been the official editor for PowerShell throughout most of the history of Windows PowerShell. Now with the advent of the cross-platform PowerShell Core, we need a new official editor that’s available across all supported OS platforms and versions. Visual Studio Code is now that editor and the majority of our effort will be focused there.

So my advice is, whichever operating system you are using, use VS Code to write, edit, debug and run your PowerShell as that is the editor where Microsoft are spending the majority of their effort in development. Follow the instructions here to install it. You can also use it with many other languages including T-SQL. I find it a very useful tool

Where to start?

Well that depends what you want to do. Start by reading. Start by trying to accomplish something in PowerShell even if you know how to do it another way. This will also help you to realise when you are trying to use a hammer to put in a screw (using the wrong tool for the job)

What Command?

You don’t know where to start so how do you know the command to use? Here is a first command. You will use it often. I use it everyday still despite using PowerShell daily for many years. But don’t run it yet!
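
Get-Command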

PowerShell is quite intuitive in its command naming. It uses a Verb-Noun syntax. You know without me explaining what this command will do. It will “Get” the “Command”(s). PowerShell uses singular nouns for its commands. I have 8000+ commands on my machine so it would have overloaded you with commands if you had run it without filters!

We use * as a wildcard in PowerShell so we can use that with our Get-Command command to find some commands to run. Type Get-Com and then hit the tab button. This will not only help you as it will auto-complete, it will reduce the number of errors as it will only auto-complete correct command names :-).
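
Get-Command *service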

This will show all commands which end with service. The results will have some different headings. The screenshot below shows just one command to reduce confusion.

[screenshot: get-service]

The type of command, the name of the command, the version of the command and the module it is in. PowerShell uses modules to group commands together and to enable you to install the ones that you require from a repository like the PowerShell Gallery (more on this later).

You can find all of the commands in a module with the Get-Command command like this
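
Get-Command -Module Microsoft.PowerShell.Management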

This will show you all of the commands in the management module which may be a good place to start exploring. In this module for example

  • Test-Connection is like ping but better
  • Test-Path tests the existence of files and file paths
  • *-Service works with services
  • *-Process works with processes
  • *-Computer for power options for a computer
  • Get-ComputerInfo which will give you computer information (surprisingly 🙂 )

This is also something to remember in the future when you want to know what commands a module has or what the precise name of the command is.

Maybe once you install the dbatools module, the sqlserver module or the SSRS module??

HOMEWORK – Examine the names of commands in other modules that you find in
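
Get-Module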

or for more choices
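
Get-Module -ListAvailable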

How do I use that command though?

You need to read! PowerShell comes with a lot of documentation and a lot of it is available on your machine. Let's introduce another command
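
Get-Help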

As you can imagine, this gets the help for a command or returns help topics about PowerShell. Again, I use this every single day and you should too. Let's start with the first one. Let's get help for the Get-ChildItem command
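
Get-Help Get-ChildItem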

Now, some of you may not get very much returned and a message at the bottom which says

Get-Help cannot find the Help files for this cmdlet on this computer. It is displaying only partial help.
— To download and install Help files for the module that includes this cmdlet, use Update-Help.
— To view the Help topic for this cmdlet online, type: “Get-Help Get-ChildItem -Online” or
go to https://go.microsoft.com/fwlink/?LinkID=113308.

When you have something else to do I would recommend running Update-Help as it suggests (It can take a few minutes to run) but for now make use of the -Online switch.

HOMEWORK – Run Update-Help

This is available for all of the Microsoft modules and some of the better open source modules; dbatools is an example of a (brilliant) community module which has -Online URLs for its commands. Running Get-Help (with or without the -Online switch) will give you plenty of information about the command, the switches available, the inputs and outputs.

Maybe start by using the -examples switch
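
Get-Help Get-ChildItem -Examples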

will show you some examples of how to use the command

[screenshot: get-help examples]

Now you can start to explore the file system with this command (also the SQL Server, Registry, Certificate Store and many more things)
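
Get-ChildItem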

I also like to use the ShowWindow switch
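
Get-Help Get-ChildItem -ShowWindow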

which will open up the help in a separate window which can be useful when you are writing a script and it is searchable

[screenshot: get-help showwindow]

But Chrissy prefers to use the -Detailed switch
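
Get-Help Get-ChildItem -Detailed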

You can decide which way you like best.

Now you know how to find a command and how to find out how to use it. If you are using VS Code (or ISE) you have access to IntelliSense and snippets to help you with the syntax. You can run a command and get some results.

HOMEWORK – Use Get-Help to understand how to use commands. (Extra merit marks for teacher's pets who read the topics they find using Get-Help About_*) Also, instant demerit points throughout the course (your career) for students who do not use Get-Help when using any command for the first time.

Hopefully you are confident enough to have run some commands. If you are worried about breaking things only run the commands which start with Get- as these should only be returning information or objects and not changing anything.

This isn’t cmd 

This isn’t cmd. What you are getting back is not text.

It is an object.

Unless you choose to use one of the formatting commands or the command you are using outputs using a formatting command (pro-tip don’t do that) then what you will receive as the results of your command is an object.

This means that there is more that you can do with the results than just have them in text format. You want text file, JSON, XML, CSV, Excel output or to write to a database? Look here for ideas

Objects, Objects, Objects

Objects are awesome. Let's start with a SQL Server Instance object to show you how to explore objects. You will need to install the most useful PowerShell module you will find if you are working with SQL Server – dbatools

Disclosure – I am a contributor to the open source module dbatools. I am presenting a full day workshop in Singapore for the PowerShell Conference Asia in October. I am also proctoring at Chrissy and CK’s PASS Summit in Seattle. I highly recommend that you come and join us at one of those events if you can to further learn how you can use PowerShell to administer SQL Server

You can install it from the PowerShell Gallery using
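
Install-Module dbatools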

You will be asked if you want to trust the repository, say Yes. If this doesn’t work on your machine (you are on an older version of PowerShell) follow the instructions on this page.  Or ask for help in #powershellhelp in Slack

Now that the module is installed we can create a SQL Server Instance object by assigning the results of a command to a variable.

A variable in PowerShell is designated by a $.
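
# create an instance object using dbatools (the instance name is an example)
$smo = Connect-DbaInstance -SqlInstance localhost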

I suggest that you get used to following this step as well. This is a great way of exploring the results of any command and understanding what you can do with it.

We will use the Get-Member command. Another one that I use every single day.

Warning – You do not have to use these methods to accomplish tasks in PowerShell against SQL Server as both Microsoft and the community have released modules which will enable you to do general tasks without “coding”

This is an example to get you comfortable with PowerShell by using a target that you are comfortable with and can recognise.

Now that we have an object we can explore it

By just calling the variable it will display the default properties
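
$smo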

[screenshot: default properties]

If you want to see what else is available you can use Get-Member. You just pipe | to Get-Member
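
$smo | Get-Member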

This will show you all of the events, methods and properties available on the object.  Methods are really useful, enabling you to do things like start agent jobs or enumerate permissions or members

If you wish to see all of the processes that are running on the instance
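
$smo.EnumProcesses() | Out-GridView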

Out-GridView is a useful command allowing you to see the results of your command in a graphical format and easily filter using the search bar at the top.

[screenshot: Out-GridView]

Other things you can do include reading the errorlog
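
$smo.ReadErrorLog() | Out-GridView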

[screenshot: errorlog]

Get the connections to a database

Get the current Traceflags
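
Both of these are methods on the instance object (the database name is an example):

# number of connections to a database
$smo.GetActiveDBConnectionCount('master')
# currently active global trace flags
$smo.EnumActiveGlobalTraceFlags()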

The properties that are available at this level are similar to the view that you see in SSMS Object Explorer for the instance. You can see a databases folder in SSMS and you can expand it. You can do the same in PowerShell
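
$smo.Databases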

That is going to just show you the names of the databases on that instance. You can do the same thing using the pipe | again and Select-Object

This will pass the Databases array of objects “along the pipeline” and then you can use Select-Object to choose them. Multiple properties can be chosen with commas.
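
$smo.Databases | Select-Object Name, Status, RecoveryModel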

[screenshot: databases]

PowerShell will let you select things that do not exist so for example

does not return an error but some of the columns are empty 🙂 You need to use the names of the properties returned from Get-Member in your Select statement (Hmm, sounds a bit like T-SQL)

[screenshot: select]

You can look at a single database default properties using
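
# the default properties of a single database (the name is an example)
$smo.Databases['master']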

You can then use Get-Member to explore deeper and deeper.

You can also look at the Agent from the instance level object
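
$smo.JobServer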

HOMEWORK – Using the code above explore the $SMO object, look at the databases, tables, the columns and the indexes. Explore other properties and select them. (Extra merit points for exploring the SQL Server like a file system using the SQLServer drive. Start with cd SQLSERVER:\ and use Get-ChildItem or its alias dir or ls)

Warning Reminder – You do not have to use these methods to accomplish tasks in PowerShell against SQL Server as both Microsoft and the community have released modules which will enable you to do general tasks without “coding”

But this is a very useful way of understanding and exploring using PowerShell with a type of object that you understand. You will use these methods all of the time and when you need to accomplish the next thing you have the tools. Already. Now.

Recap

You should take from this for the future

  • Get-Command *search* – Find a Command
  • Get-Help NameOfCommand – How do I use the Command ?
  • $var = Some Command here – Set the output of a command to a variable
  • $var | Get-Member – What Events, Methods, Properties do I have on this object
  • $var | Select Property1, Property2, Property3

That’s All Code – Where’s The Easy Buttons ?

There are two answers to this.

Firstly, yes it’s all code, but you will find that code is the future. Every time you run the same piece of code it will do the same thing, every time a junior DBA follows the steps to manually do something in a GUI like SSMS they may make a mistake and a different mistake each time. As DBAs we use stored procedures, views, functions and more, they are all code. Using code is vital for automation, for making things easy, for reducing the risk of mistakes, for getting rid of the mundane.

It is also required for DevopsAgileScrumContinuousIntegrationDeliveryWhatsTheNextBuzzWord which is all the rage (and also a lot of fun with interesting challenges)

As estates get bigger and bigger the GUI becomes less useful to you.

Secondly, remember my warning at the top of this post? There is no easy button. You will have to learn new things, you will make mistakes and get errors, and you will need to ask for help

I Have An ERROR !

Don’t be disheartened, errors happen. The red text looks scary and frightening but it's not.

Read the Error

Let's use Microsoft's sqlserver module. There are all sorts of ways to get it. If you have SSMS you probably have it already. Let's check
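
Get-Module sqlserver -ListAvailable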

A result like this means that you have it

[screenshot: get-module]

If not
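
Install-Module sqlserver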

Again you will be asked if you want to trust the repository, say yes!

If I run
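
# a made-up instance name that is not running
Get-SqlDatabase -ServerInstance 'localhost\NOTRUNNING'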

I get

[screenshot: error]

That error is fairly easy to understand. I can’t connect to an instance that isn’t running

If I run
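
# asking for a login that does not exist (the names are examples)
Get-SqlLogin -ServerInstance 'localhost' -LoginName 'NotALogin'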

I get a different error

[screenshot: sql login error]

The Get-SqlLogin from Microsoft's sqlserver module returns an error if it cannot find the specified login name. The important part of this error message is ObjectNotFound. That tells us what has happened

If we look at the logins
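
Get-SqlLogin -ServerInstance 'localhost' | Select-Object Name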

[screenshot: logins]

and choose one that exists everything is good

[screenshot: one login]

HOMEWORK – Here is another object to explore with Get-Member. Explore the Login object

For example

[screenshot: login script]

But back to errors. Sometimes it isn’t quite so obvious. I have altered my Create-Database function so that it shows an error instead of a helpful message.

[screenshot: db error]

This is an example of the sort of error message that you might not be able to decipher so easily. Even if you read the message it says Exception calling .ctor with 2 arguments. How can you, as a new PowerShell user work out what has happened?

Objects

Remember, PowerShell is great because of objects? The errors are objects too. Let's have a look at the error in full. This is another piece of code that I still use every single day
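
$error[0] | fl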

The $error variable holds an array of error objects for that session. The [0] targets the 1st object in the array which is the latest. We pipe that to fl which is an alias for Format-List. Normally I am very clear about not using aliases in my blog posts as I believe that you should not use aliases in your scripts and functions (this is due to a bad experience trying to google for % to find out what it was in PowerShell many years ago). However, you are only going to use this at the command line so it's ok.

The results are

[screenshot: error detail]

Once you read the error, you will see that there is a generic connection failed error message which you will be used to seeing and now you can go about fixing it.

Errors aren’t scary. You just have to read them. Sometimes you have to expand them to read them. Once you have read them don’t forget that there is also a lot of help out there on the internet. I recommend the #powershellhelp channel in the slack but with the full error message you will be able to get some useful results from whatever your choice of helpful solutions is.

That’s exactly 3000 words – which is slightly (!) more than I intended to write and a lot to read, so if you have got all the way down here. Congratulations, when you see me at Summit or at a SQL Saturday come up and tell me and I will give you a special sticker 🙂

Hopefully this has given you some guidance to starting to use PowerShell with SQL and how you can help yourself.

Good luck.