VS Code – Terminal crashes when formatting script

I love VS Code. I love being able to press ALT + SHIFT + F and format my code.

formatting.gif

The Problem

Yesterday all I got when I pressed ALT + SHIFT + F was this

format error.png

I could reproduce it at will. This was very frustrating.

Turning on Verbose Logging

To turn on verbose logging for the PowerShell Editor Services, go to the cog in the bottom left, click it, and then click User Settings.

Search for powershell.developer.editorServicesLogLevel

powershell.developer.editorServicesLogLevel.png

If you hover over the setting in the left-hand pane, a pencil icon will appear; click it and then click Replace in Settings

edit settings.png

This will put the entry in the right-hand pane, where you can change the value. Set it to Verbose and save

user settigns.png

A prompt will come up asking if you want to restart PowerShell

start a new session.png

When you restart PowerShell, if you click on Output and choose PowerShell Extension Logs, you will see the path to the log file

logfilepath.png

Reproduce the error

I then reproduced the error and opened the log file. This is what I got:

10/02/2018 09:11:19 [ERROR] - Method "OnListenTaskCompleted" at line 391 of C:\projects\powershelleditorservices\src\PowerShellEditorServices.Protocol\MessageProtocol\ProtocolEndpoint.cs

ProtocolEndpoint message loop terminated due to unhandled exception:

System.AggregateException: One or more errors occurred. ---> System.Management.Automation.CommandNotFoundException: The term 'Invoke-Formatter' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.Invoke(IEnumerable input, PSInvocationSettings settings)
at Microsoft.PowerShell.EditorServices.AnalysisService.InvokePowerShell(String command, IDictionary`2 paramArgMap)
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<InvokePowerShellAsync>d__31.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<Format>d__22.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

Open an issue on GitHub

I couldn't quickly see what was happening, so I opened an issue on the vscode-powershell repo by going to Issues, clicking New Issue and following the instructions

new issue.png

The Resolution

Keith Hill b | t pointed me to the resolution. Thank you Keith.

Further up in the log file there is a line where the editor services loads the PSScriptAnalyzer module, which should have the Invoke-Formatter command exported, but mine did not. It loaded the PSScriptAnalyzer module from my user's module directory

10/02/2018 09:11:01 [NORMAL] - Method "FindPSScriptAnalyzerModule" at line 354 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

PSScriptAnalyzer found at C:\Users\XXXX\Documents\WindowsPowerShell\Modules\PSScriptAnalyzer\1.10.0\PSScriptAnalyzer.psd1

10/02/2018 09:11:01 [VERBOSE] - Method "EnumeratePSScriptAnalyzerCmdlets" at line 389 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

The following cmdlets are available in the imported PSScriptAnalyzer module:
Get-ScriptAnalyzerRule
Invoke-ScriptAnalyzer

I ran a command to see the module paths.
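Something like this lists them, splitting the PSModulePath environment variable into one path per line:

$env:PSModulePath -split ';'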

module path.png

and looked in the .vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules directory. There was no PSScriptAnalyzer folder

no module.png

So I copied the PSScriptAnalyzer folder from the normal VS Code PowerShell extension's module folder into that folder, restarted PowerShell, and I had my formatting back again 🙂
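The copy itself is a one-liner along these lines; the extension version and paths are illustrative and will differ on your machine:

Copy-Item -Path "$HOME\.vscode\extensions\ms-vscode.powershell-1.5.1\modules\PSScriptAnalyzer" -Destination "$HOME\.vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules" -Recurse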

I then reset the logging mode in my user settings back to Normal

Thank you Keith

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I needed to blog about how I created it, and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo, so clone it and navigate to the file path.

Create Database

First, let's create a database. Connect to your Azure subscription
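This post pre-dates the Az module, so the AzureRM cmdlets are shown here; the subscription name is illustrative (Connect-AzAccount is the modern equivalent):

Login-AzureRmAccount
# if you have more than one subscription, pick the right one
Select-AzureRmSubscription -SubscriptionName 'My Subscription'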

01 - subscription.png

Then set some variables
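A sketch of the variables; every value here is illustrative, so replace them with your own:

$ResourceGroupName = 'PowerShellCool'
$Location          = 'WestEurope'
$AzureSQLServer    = 'powershellcool-server'
$DatabaseName      = 'tweets'
# uncomment these and set your own values
# $Login    = 'ServerAdminLogin'
# $Password = 'AStrongPasswordHere'
$MyIP              = '1.2.3.4'   # your public IP for the firewall rule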

They should all make sense; take note that you need to set and uncomment the login and password, and choose which IPs to allow through the firewall

Create a Resource Group
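With the variables above in place, a minimal sketch:

New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location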

02 - resource group.png

Create a SQL Server
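A sketch, building a credential from the login and password set earlier:

$Credential = New-Object System.Management.Automation.PSCredential ($Login, (ConvertTo-SecureString -String $Password -AsPlainText -Force))
New-AzureRmSqlServer -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -Location $Location -SqlAdministratorCredentials $Credential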

03 - create server.png

Create a firewall rule. I just use my own IP and also add the rule that allows Azure IPs
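A sketch of both rules; the rule name is illustrative:

# my own IP
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -FirewallRuleName 'MyIP' -StartIpAddress $MyIP -EndIpAddress $MyIP
# the special rule that allows Azure services through
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -AllowAllAzureIPs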

03a - firewall rule.png

Create a database
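For example; the Basic edition keeps the cost down, so pick whatever suits:

New-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -DatabaseName $DatabaseName -Edition Basic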

04 - create database.png

I have used the dbatools module to run the scripts to create the database. You can get it using
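For example, from the PowerShell Gallery (CurrentUser scope avoids needing admin rights):

Install-Module -Name dbatools -Scope CurrentUser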

Run the scripts
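The exact calls are in the repo's scripts; a sketch with dbatools' Invoke-DbaQuery, where the script name is illustrative:

$Cred = New-Object System.Management.Automation.PSCredential ($Login, (ConvertTo-SecureString -String $Password -AsPlainText -Force))
Invoke-DbaQuery -SqlInstance "$AzureSQLServer.database.windows.net" -SqlCredential $Cred -Database $DatabaseName -File '.\CreateTableAndSproc.sql'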

05 - Create Table Sproc.png

This will have created the following in Azure, you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First login (if you need to) and set some variables

Then create the API and get the key
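A sketch with the AzureRM CognitiveServices cmdlets; the account name is illustrative and the F0 SKU is the free tier:

$CSName = 'PSCoolTextAnalytics'
New-AzureRmCognitiveServicesAccount -ResourceGroupName $ResourceGroupName -Name $CSName -Type TextAnalytics -SkuName F0 -Location $Location
# the endpoint URL and the keys for the next steps
(Get-AzureRmCognitiveServicesAccount -ResourceGroupName $ResourceGroupName -Name $CSName).Endpoint
Get-AzureRmCognitiveServicesAccountKey -ResourceGroupName $ResourceGroupName -Name $CSName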

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the GitHub repo. I have submitted a template, but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

Click New Connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (the value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click New Connection, search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow. Click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the GitHub repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click Select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Click Select during import for the SQL Server connection and choose the SQL Server connection, and do the same for the Twitter connection

20 - import stpe 8.png

Then click Import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a new tweet is posted and add a search term. You may need to choose the connection to Twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

Search for detect and choose the Text Analytics Detect Sentiment action

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them, you can retrieve them from the Azure portal.

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__: 0
  • Date: Created At
  • Sentiment: Score
  • Tweet: Tweet Text
  • UserLocation: Location
  • UserName: Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click Save Flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then Change Data Source Settings

32 change data source.png

Click Change Source and enter the server (the value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop up a prompt for the credentials. Choose Database, enter your credentials and click Connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn't run yet. If it does, just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc. and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

How to write a PowerShell function to use Confirm, Verbose and WhatIf

In my last blog post I showed how to run a script with the WhatIf parameter. This assumes that the commands within the script have been written to use the common parameters Confirm, Verbose and WhatIf.

Someone asked me how to make sure that any functions that they write will be able to do this.

It is very easy.

When we define our function we are going to add [cmdletbinding(SupportsShouldProcess)] at the top

and every time we perform an action that will change something, we put that code inside a code block like this
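A minimal sketch of the pattern:

function Set-Something {
    [cmdletbinding(SupportsShouldProcess)]
    param (
        # parameters go here
    )
    if ($PSCmdlet.ShouldProcess("The Item", "The Change")) {
        # code that changes something goes here
    }
}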

and alter The Item and The Change as appropriate.

I have created a snippet for VS Code to make this quicker for me. To add it to your VS Code, click the settings button bottom right, click User Snippets, choose the powershell json and add the code below between the last two }'s (don't forget the comma)
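The snippet is along these lines (a sketch, not my exact original; the backslash escapes the dollar sign so VS Code does not treat it as a snippet variable):

"IfShouldProcess": {
    "prefix": "ifs",
    "body": [
        "if (\\$PSCmdlet.ShouldProcess(\"The Item\", \"The Change\")) {",
        "\t# Code Goes Here",
        "}"
    ],
    "description": "Add an if ShouldProcess block"
}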

and save the powershell.json file

Then when you are writing your code you can simply type "ifs" and press Tab, and the code will be generated for you

As an example I shall create a function wrapped around Set-Content just so that you can see what happens.

I have done this before because if the file does not exist then Set-Content will create a new file for you, but with this function I can check if the file exists first with the ValidateScript before running the rest of the function.
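A sketch of such a function; the name and parameters are illustrative, not the exact function from the original gist:

function Set-FileContent {
    [cmdletbinding(SupportsShouldProcess)]
    param (
        [Parameter(Mandatory)]
        [ValidateScript( { Test-Path $_ } )]   # the file must already exist
        [string]$File,
        [Parameter(Mandatory)]
        [string]$Content
    )
    if ($PSCmdlet.ShouldProcess("$File", "Setting the content to $Content")) {
        Set-Content -Path $File -Value $Content
    }
}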

As you can see, I add variables from my PowerShell code into "The Item" and "The Change". If I need to add a property of an object, I use $($Item.Property).

So now, if I want to see what my new function would do if I ran it without actually making any changes, I have -WhatIf added to my function automagically.

If I want to confirm any action I take before it happens, I have -Confirm. As you can see, this also gives the confirm prompts for the Set-Content command.

You can also see the verbose messages with -Verbose.
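A sketch of all three switches in use with the hypothetical Set-FileContent function above; the comments show the shape of the output rather than a captured transcript:

Set-FileContent -File C:\temp\file.txt -Content 'The new content' -WhatIf
# What if: Performing the operation "Setting the content to The new content" on target "C:\temp\file.txt".

Set-FileContent -File C:\temp\file.txt -Content 'The new content' -Confirm
# Are you sure you want to perform this action?
# Performing the operation "Setting the content to The new content" on target "C:\temp\file.txt".
# [Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "Y"):

Set-FileContent -File C:\temp\file.txt -Content 'The new content' -Verbose
# VERBOSE: Performing the operation "Setting the content to The new content" on target "C:\temp\file.txt".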

So to summarise, it is really very simple to add Confirm, WhatIf and Verbose to your functions by placing [cmdletbinding(SupportsShouldProcess)] at the top of the function and placing any code that makes a change inside an if ($PSCmdlet.ShouldProcess()) block, with some values that explain what the code is doing for The Item and The Change.

Bonus Number 1 – This has added support for other common parameters as well – Debug, ErrorAction, ErrorVariable, WarningAction, WarningVariable, OutBuffer, PipelineVariable, and OutVariable.

Bonus Number 2 – This has automatically been added to your Help

Bonus Number 3 – This has reduced the amount of comments you need to write and improved other people's understanding of what your code is supposed to do 🙂 People can read your code and read what you have entered for the ShouldProcess call, and that will tell them what the code is supposed to do 🙂

Now you have seen how easy it is to write more professional PowerShell functions

How to run a PowerShell script file with Verbose, Confirm or WhatIf

Before you run a PowerShell command that makes a change to something, you should check that it is going to do what you expect. You can do this by using the WhatIf parameter for commands that support it. For example, if you wanted to create a new SQL Agent job category, you would use the awesome dbatools module and write some code like this
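Something like this; the instance and category names are illustrative:

New-DbaAgentJobCategory -SqlInstance sql2017n1 -Category 'Maintenance'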

Before you run it, you can check what it is going to do by adding the WhatIf parameter, which gives a result like this
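A sketch, with the message text approximate:

New-DbaAgentJobCategory -SqlInstance sql2017n1 -Category 'Maintenance' -WhatIf
# What if: Performing the operation "Adding the category Maintenance" on target "sql2017n1".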

This makes it easy to do at the command line, but when we get confident with PowerShell we will want to write scripts that perform tasks using more than one command. So how can we check that those will do what we are expecting without actually running the script and seeing what happens? Of course, there is unit and integration testing that should be performed with Pester when developing the script, but there will still be occasions when we want to see what this script will do this time, in this environment.

Let's take an example. We want to place our SQL Agent jobs into specific custom categories depending on their name. We might write a script like this
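A sketch of the idea; the name patterns and categories are illustrative, and Set-DbaAgentJob does the actual change:

param (
    [string]$Instance
)

# categorise the Agent jobs by their names
foreach ($Job in (Get-DbaAgentJob -SqlInstance $Instance)) {
    $Category = switch -Wildcard ($Job.Name) {
        'Backup*' { 'Backups' }
        'Index*'  { 'Index Maintenance' }
        default   { 'Uncategorised' }
    }
    # create the category first if it does not exist yet
    if (-not (Get-DbaAgentJobCategory -SqlInstance $Instance -Category $Category)) {
        New-DbaAgentJobCategory -SqlInstance $Instance -Category $Category
    }
    Set-DbaAgentJob -SqlInstance $Instance -Job $Job.Name -Category $Category
}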

You can run this script against any SQL instance by calling it and passing an instance parameter from the command line like this
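The script name here is hypothetical:

.\Set-AgentJobCategories.ps1 -Instance sql2017n1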

If you wanted to see what would happen, you could edit the script and add the WhatIf parameter to every changing command, but that's not really a viable solution. What you can do instead is this
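Using the built-in $PSDefaultParameterValues preference variable:

$PSDefaultParameterValues['*:WhatIf'] = $true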

This will set all commands that accept WhatIf to use the WhatIf parameter. It means that if you are using functions that you have written internally, you must ensure that you write your functions to use the common parameters

Once you have set the default value for WhatIf as above, you can simply call your script and see the WhatIf output

which will show the WhatIf output for the script

Once you have checked that everything is as you expected then you can remove the default value for the WhatIf parameter and run the script

and get the expected output

If you wish to see the verbose output or ask for confirmation before any change, you can set those default parameters like this
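For example:

$PSDefaultParameterValues['*:Verbose'] = $true
$PSDefaultParameterValues['*:Confirm'] = $true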

and set them back by setting them to false
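For example:

$PSDefaultParameterValues['*:Verbose'] = $false
$PSDefaultParameterValues['*:Confirm'] = $false
$PSDefaultParameterValues['*:WhatIf']  = $false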

 

Pester 4.2.0 has a Because…… because :-)

I was going through my demo for the South Coast User Group meeting tonight and decided to add some details about the Because parameter available in the Pester pre-release version 4.2.0.

To install a pre-release version you need to get the latest PowerShellGet module. This is pre-installed with PowerShell v6, but for earlier versions open PowerShell as administrator and run
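Something like:

Install-Module -Name PowerShellGet -Force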

You can try out the Pester pre-release version (once you have the latest PowerShellGet) by installing it from the PowerShell Gallery with
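For example; SkipPublisherCheck is needed on systems that ship with an older signed Pester:

Install-Module -Name Pester -AllowPrerelease -Force -SkipPublisherCheck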

There are a number of improvements, as you can see in the change log. I particularly like these:

  • Add -BeTrue to test for truthy values
  • Add -BeFalse to test for falsy values

This release adds the Because parameter to all assertions. This means that you can add a reason why the test has failed. As Jakub Jareš writes here:

  • Reasons force you to think more
  • Reasons document your intent
  • Reasons make your TestCases clearer

So you can do something like this
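A sketch, where the setting and the reason are illustrative:

$AutoClose = $true   # imagine this came from querying the instance

Describe "Checking database settings" {
    It "Should have Auto Close off" {
        $AutoClose | Should -BeFalse -Because 'Auto Close causes unnecessary overhead opening and closing the database'
    }
}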

Which gives an error message like this 🙂

Expected $false, because Auto Close causes unnecessary overhead opening and closing the database, but got $true.

As you can see, the Expected part gives the expected value, then your Because statement, and then the actual result. This means that you could write validation tests like
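For example; the service name and the reason are illustrative:

Describe "Checking the important services" {
    It "Should have the SQL Server service running" {
        (Get-Service -Name MSSQLSERVER).Status | Should -Be 'Running' -Because 'nobody can connect if the service is down'
    }
}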

and get a result with your reason embedded in the failure message

Or if you were looking to validate your SQL Server you could write something like this
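A sketch using dbatools, where the instance name and collation are illustrative:

Describe "Checking the instance build standard" {
    It "Should have the standard collation" {
        (Get-DbaDatabase -SqlInstance sql2017n1 -Database master).Collation |
            Should -Be 'SQL_Latin1_General_CP1_CI_AS' -Because 'all of our instances are built to the same standard'
    }
}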

or maybe your security policies allow Windows Groups as logins on your SQL instances. You could easily link to the documentation and explain why this is important. This way you can build up a set of tests to validate that your SQL Server is configured just so for your environment

Just for fun, the code for that check looks something like this
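A sketch, assuming dbatools' Get-DbaLogin and an illustrative policy:

Describe "Checking the logins" {
    It "Should not have individual Windows logins" {
        # access should be granted through Windows Groups instead
        $WindowsUsers = (Get-DbaLogin -SqlInstance sql2017n1).Where{ $PSItem.LoginType -eq 'WindowsUser' }
        $WindowsUsers.Count | Should -Be 0 -Because 'our security policy grants access through Windows Groups - see the security wiki'
    }
}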

This will be a useful improvement to Pester when it is released and enable you to write validation checks with explanations.

Chrissy has written about dbachecks, the new up-and-coming community-driven open-source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about using dbatools with Pester to validate that everything is how it should be, and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open-source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about using dbatools with Pester to validate that everything is how it should be, and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my PowerBi for Pester file and made it beautiful, and whilst we were discussing this we found that if the Pester tests were not formatted correctly, the Power Bi looked… well, rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were:

  • The Describe title should be in double quotes
  • The Describe should use the plural Tags parameter
  • The Tags should be singular
  • The first Tag should be a unique tag in Get-DbcConfig
  • The Context title should end with $psitem
  • The code should use Get-SqlInstance or Get-ComputerName
  • The code should use the ForEach method
  • The code should not use $_
  • The code should contain a Context block

She asked me if I could write the Pester tests for it, and this is how I did it. I needed to look at the Tags parameter of each Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don't know very much about this, but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester, so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed, so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.

The first thing I did was to get the Pester tests, which we have located in a checks folder, loop through them and get the content of each file with the Raw parameter
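A sketch; the checks folder name comes from the repo layout:

foreach ($CheckFile in (Get-ChildItem -Path .\checks -Filter '*.Tests.ps1')) {
    # $check now holds the raw text of one checks file, ready for parsing
    $check = Get-Content -Path $CheckFile.FullName -Raw
}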

Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
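Reconstructed from that Stack Overflow answer, it looks something like this:

$Describes = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        # match every command whose first element is 'describe'
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true) |
ForEach-Object {
    $CE = $PSItem.CommandElements
    # the second string element is the Describe title
    $secondString = ($CE | Where-Object { $PSItem.StaticType.Name -eq 'string' })[1]
    # the element after the -Tags parameter holds the tags
    $tagIdx = $CE.IndexOf(($CE | Where-Object ParameterName -eq 'Tags')) + 1
    $tags = if ($tagIdx -and $tagIdx -lt $CE.Count) { $CE[$tagIdx].Extent }
    New-Object PSCustomObject -Property @{
        Name = $secondString
        Tags = $tags
    }
}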
As I understand it, this code is using the Parser on $check (which contains the code from the file), finding all of the Describe commands, and creating an object containing the title of the Describe (the element with a StaticType of String) and the values from the Tags parameter.
When I ran this against the database tests file I got a list of the Describe names and their tags. Then it was a simple case of writing some tests for the values.
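A sketch of the shape of those tests; the assertions are illustrative, not the exact ones in dbachecks:

Describe "Checking the Describes in the checks file" {
    $UniqueTags = (Get-DbcCheck).UniqueTag
    @($Describes).ForEach{
        $title = $PSItem.Name.ToString()
        $firstTag = ($PSItem.Tags.Text -split ',')[0].Trim()
        It "$title Should use a double quoted title" {
            $title | Should -Match '^"'
        }
        It "$title Should have a unique first Tag" {
            $UniqueTags | Should -Contain $firstTag
        }
    }
}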

The $Describes variable is wrapped in @() so that the ForEach method will still work if there is only one Describe. The unique tags are returned from our command Get-DbcCheck, which shows all of the checks. We will have a unique tag for each test so that they can be run individually.

Yes, I have tried to ensure that the tags are singular by checking that they do not end with an s, and so had to skip the check for BackupPathAccess and statistics. Filename is a variable that we add to each Describe's Tags so that we can run all of the tests in one file. I added a little if block to the Pester test as well, so that the error is more obvious if the Tags parameter is not passed

I did the same with the context blocks as well
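The same FindAll call, this time matching the Context command (a sketch):

$Contexts = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'context'
    }, $true)

Describe "Checking the Context blocks" {
    @($Contexts).ForEach{
        # each Context title should end with $psitem
        $title = $PSItem.CommandElements[1].Extent.Text
        It "Context $title Should end with psitem" {
            $title | Should -Match 'psitem"$'
        }
    }
}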

This time we look for the Context command and ensure that the string value ends with psitem as the PowerBi parses the last value when creating columns
Finally, I got all of the code and checked that it matches some coding standards

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are, and then loop through each statement apart from the first line to ensure that the code uses our internal commands Get-SqlInstance or Get-ComputerName to get information, that we loop through those arrays using the ForEach method rather than ForEach-Object, that we use $psitem rather than $_ to reference the current item in the array, and that each Describe block has a Context block.

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and ensure that the Power Bi results still look beautiful!

Anyone can run the tests using
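For example; the test file name is illustrative:

Invoke-Pester .\tests\checks.Tests.ps1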

before they create a pull request, and it looks like this:
If everything is green then they can submit their pull request 🙂 If not, they can quickly see that something needs to be fixed (fail early 🙂).
03 fails.png

Converting a Datarow to a JSON object with PowerShell

This is just a quick post. As is frequent with these, it is as much for me to refer to in the future as for anyone else, and the very act of writing it down will help me remember. I encourage you to do the same: share what you learn, because it will help you as well as helping others.

Anyway, I was writing some Pester tests for a module that I was writing when I needed some sample data. I have written before about using JSON for this purpose. This function required some data from a database, so I wrote the query to get the data and used dbatools to run the query against the database using Get-DbaDatabase
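A sketch, assuming the Query() method that dbatools adds to the objects it returns; the query itself came from Kendra's post, so only a placeholder is shown:

$instance = 'sql2017n1'    # illustrative
$Query = 'SELECT ...'      # the deadlock query goes here
$variable = (Get-DbaDatabase -SqlInstance $instance -Database WideWorldImporters).Query($Query)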

Simple enough. I wanted to be able to Mock $variable, so I wrapped the code above in a function; let's call it Run-Query
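A sketch of the wrapper:

function Run-Query {
    param (
        [string]$SqlInstance,
        [string]$Database,
        [string]$Query
    )
    # the one place that touches the database, which makes it easy to Mock in Pester
    (Get-DbaDatabase -SqlInstance $SqlInstance -Database $Database).Query($Query)
}

$variable = Run-Query -SqlInstance $instance -Database WideWorldImporters -Query $Query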

This meant that I could easily separate it for mocking in my test. I ran the code, investigated the $variable variable to ensure it had what I wanted for my test, and then decided to convert it into JSON using ConvertTo-Json

Let's show what happens with an example using WideWorldImporters and a query I found on Kendra Little's blog post about deadlocks

If I investigate the $variable variable I get

data results

The results were just what I wanted, so I thought I would just convert them to JSON and save them in a file; bingo, I have some test data in a mock to ensure my code is doing what I expect. However, when I run
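That is simply:

$variable | ConvertTo-Json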

I get

json error.png

and that's just for one row!

The way to resolve this is to only select the data that we need. The easiest way to do this is to exclude the properties that we don’t need
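The DataRow properties that cause the noise are ItemArray, Table, RowError, RowState and HasErrors, so:

$variable |
    Select-Object -Property * -ExcludeProperty ItemArray, Table, RowError, RowState, HasErrors |
    ConvertTo-Json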

which gave me what I needed and a good use case for -ExcludeProperty

json fixed.png