Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module, a module to consume the SQL Server Diagnostics API.

I have been asked a few times what the process is for creating a module, using GitHub and developing with Pester. Whilst this is not a comprehensive how-to, I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.
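If you prefer the command line, the clone is a single git command from a PowerShell prompt (the URL here is built from the GitHub user and repository names used later in this post):

git clone https://github.com/SQLDBAWithABeard/SQLDiagAPI.git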

Plaster Template

When you create a module there are a number of files that you need and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub Repository. Whilst you could write a PowerShell script to create all of those, there is no need as there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default ‘scaffold’ for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette’s post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but, as with anything, if you find yourself repeating steps then it is time to automate it.

With my Plaster Template created I could simply run

$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
    DestinationPath = "GIT:\SQLDiagAPI"
    FullName        = "Rob Sewell"
    ModuleName      = "SQLDiagAPI"
    ModuleDesc      = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details"
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "SQLDiagAPI"
}
if (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don’t know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
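On Windows 10, for example, the inbox Pester module is signed by Microsoft, so updating from the Gallery needs a couple of extra switches. A minimal sketch:

# Update to the latest Pester from the PowerShell Gallery
# -SkipPublisherCheck is required on Windows 10 because the inbox Pester is signed with a different certificate
Install-Module -Name Pester -Force -SkipPublisherCheck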

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it and, as I am writing code that others will use, consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username even though it is not used and then enter the API Key as the password. It is saved in the root of the user profile folder as, hopefully, user accounts will have access there in most shops.
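When the key is needed, it can be read back with Import-Clixml; the GetNetworkCredential() method on the credential object recovers the key as plain text. A minimal sketch:

# Read the stored credential back and extract the API Key (saved as the password)
$APICred = Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred"
$APIKey = $APICred.GetNetworkCredential().Password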

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more.

The first function I wanted to write was to get the recommendations from the API. I decided to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists. Otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: checking for the stored API Key file with Test-Path and writing a warning if the file did not exist.

Mocking

However, this is not going to work if I have already saved the key to the file. The test must not be reliant on anything external; I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
    Mock Test-Path {$false}
    Mock Write-Warning {"Warning"}

This is what happens when you run this test. When there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case $false. For Write-Warning it will return the string “Warning”.

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used" {
    Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return $false, which will invoke Write-Warning in my code, and in the test that will return “Warning”. So if I have written my code correctly the test will pass without actually running the real Test-Path and interacting with my system, or running Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests.

It 'Checks the Mock was called for Test-Path' {
    $assertMockParams = @{
        'CommandName' = 'Test-Path'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to $false it would test that the mock was called at least the number of times specified. This is another check that I really have called the Mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.
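It is the same pattern with a different command name, something like this:

It 'Checks the Mock was called for Write-Warning' {
    $assertMockParams = @{
        'CommandName' = 'Write-Warning'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}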

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed; I don’t have a function named Get-SQLDiagRecommendations yet!

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests including checking that all of the scripts pass the Script Analyser rules and that all of the functions have the correct help. (thank you June Blender for that test)

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails as this will show the Describe and Context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen.
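So the call becomes

Invoke-Pester .\Tests -Show Fails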

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don’t want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to only run a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build, Unit {
    Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with Show Fails to reduce the output, my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests are running in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher

And now our tests pass

06 - Passing Tests

If you look at the API documentation the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty, and without it we can’t call the API, so I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
    Mock Get-MachineGUID {} -Verifiable
    Mock Write-Warning {"Warning"} -Verifiable
    Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
    Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block, passing an APIKey parameter to Get-SQLDiagRecommendations so that we don’t hit the warnings we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the verifiable mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}


and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}
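I haven’t shown the eventual call to the API in this post, but with the APIKey and the Machine GUID in hand it is essentially an Invoke-RestMethod call. A rough sketch only; the endpoint URL and header name below are my assumptions, not the confirmed API contract:

# Hypothetical sketch - the URL and header name are assumptions, check the API documentation
$headers = @{'Ocp-Apim-Subscription-Key' = $ApiKey}    # assumed subscription key header
$url = "https://ecsapi.portal.azure-api.net/recommendations?clientid=$MachineGUID"    # hypothetical endpoint
Invoke-RestMethod -Uri $url -Headers $headers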

Rinse and repeat

That is basically the process that I follow to write a function. I just write a test, write some code to fix it, write another test, write some code to fix it. I keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my unit tests had passed I ran

Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

To check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns as I think the command has to be called Get-SQLDiagRecommendations with an s! I have tagged the Script Analyzer tests with a tag so I can just run those tests.
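Under the covers that boils down to excluding the plural nouns rule when the analyzer runs, something like this sketch (the real exceptions file in my template may differ):

# Run the Script Analyzer rules against the functions, skipping the plural nouns rule
Invoke-ScriptAnalyzer -Path .\functions -ExcludeRule PSUseSingularNouns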

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

Which tests if I had the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code’s PowerShell extension to add the help scaffolding really easily, as I describe here.

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this, in fact I encourage it. Please fork it, submit PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well; I am very interested to see how we can make use of that with PowerShell.

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam


VSCode – PowerShell extension 1.4.0 new command Out-CurrentFile

Yesterday David Wilson announced version 1.4.0 of the PowerShell extension for VSCode

He also pointed out that there have been over 1 million installs of the extension. 🙂

If you want to install the PowerShell extension you can hit F1 in VSCode and type

ext install PowerShell

or CTRL + SHIFT + X to open the extensions side bar and search for PowerShell and click the green install button.

There are a few enhancements in this release which you can read about here but I noticed the New File API and Out-CurrentFile command that were contributed by Doug Finke.

If your focus is in the editor in VSCode, you can simply CTRL + N and create a new file. You can alter the language that the file uses with CTRL + K, M (that’s CTRL and K and then M not CTRL and K and M!) or set the default new file language as I described here.

01 -new file.gif

If you are using VSCode as your daily PowerShell command line though, you would have to alter your focus from the terminal panel into the editor to do this. Now though you have

$psEditor.Workspace.NewFile()

which enables you to create a new file from the terminal

02- Another new file.gif

The Out-CurrentFile command sends the output of a command through Out-String to a new file. This makes it much easier to keep the results of some commands. You don’t have to pipe them to clip, or highlight and CTRL + C to copy them, and then open a file and paste them. Of course you can use Out-File and then open the file, but this is another way.
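For comparison, the Out-File route looks something like this sketch (the file name is just an example; psedit opens a file in the PowerShell Integrated Console):

# The longer way round - write the output to a file on disk and then open it
Export-SqlLogin -SqlInstance . | Out-File -FilePath .\logins.sql
psedit .\logins.sql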

Let’s see how it works. In this example, I want to export the T-SQL for creating the logins on an instance and add some comments to the code before saving it somewhere safely. I am going to use the Export-SqlLogin command from the dbatools module. MVP Cláudio Silva has written a great post on that command today.

First create a new file

$psEditor.Workspace.NewFile()

and then

Export-SqlLogin -SqlInstance . | Out-CurrentFile 

In the gif above

  • I create a new file,
  • Export the logins to it
  • change the language of the file to T-SQL
  • remove the string quotes and
  • add some comments.

All without leaving VSCode, just another reason that my productivity is increased using VSCode!

Unfortunately, it doesn’t work so well with Pester. (Of course I was going to try Pester!) This makes sense though, as Pester uses Write-Host to display the test results, so nothing goes to the output stream. This means that

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' | Out-CurrentFile

returns this

04 - pester.PNG

and if you use the -PassThru parameter then you get the $Tests object as a string so

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' -Show Summary -PassThru | Out-CurrentFile

shows

05 - pester object.PNG

Which isn’t what I would need but I have other ways of working with Pester output.
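One of those ways is Pester’s own output options, for example writing the results to an NUnit format XML file:

# Write the test results to an NUnit XML file for processing elsewhere
Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' -OutputFile .\TestResults.xml -OutputFormat NUnitXml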

VS Code – Automatic Dynamic PowerShell Help

VS Code is my coding tool of choice. I love that one lightweight editor can do so much, and as PowerShell is usually the language that I write in, I really love the PowerShell extension.

Help

When you write a PowerShell function that is going to be used by someone other than you, you don’t want to be the guy or gal that has to support it indefinitely. You should write good help to enable your users to simply type

Get-Help NAMEOFCOMMAND

and get all of the help that they need to use the command

If we look at one of my favourite dbatools commands Get-DbaLastGoodCheckDB we can see this in action.

Get-Help Get-DbaLastGoodCheckDb -full

This returns

NAME
Get-DbaLastGoodCheckDb

SYNOPSIS
Get date/time for last known good DBCC CHECKDB

SYNTAX
Get-DbaLastGoodCheckDb [-SqlInstance] <DbaInstanceParameter[]> [[-SqlCredential] <PSCredential>] [-Silent] [<CommonParameters>]

DESCRIPTION
Retrieves and compares the date/time for the last known good DBCC CHECKDB, as well as the creation date/time for the database.

This function supports SQL Server 2005+

Please note that this script uses the DBCC DBINFO() WITH TABLERESULTS. DBCC DBINFO has several known weak points, such as:
– DBCC DBINFO is an undocumented feature/command.
– The LastKnowGood timestamp is updated when a DBCC CHECKFILEGROUP is performed.
– The LastKnowGood timestamp is updated when a DBCC CHECKDB WITH PHYSICAL_ONLY is performed.
– The LastKnowGood timestamp does not get updated when a database is in READ_ONLY.

An empty ($null) LastGoodCheckDb result indicates that a good DBCC CHECKDB has never been performed.

SQL Server 2008R2 has a “bug” that causes each database to possess two dbi_dbccLastKnownGood fields, instead of the normal one.
This function will only display the newest timestamp. If -Verbose is specified, the function will announce every time
more than one dbi_dbccLastKnownGood field is encountered.

PARAMETERS
-SqlInstance
The SQL Server that you’re connecting to.

Required? true
Position? 1
Default value
Accept pipeline input? true (ByValue)
Accept wildcard characters? false

-SqlCredential <PSCredential>
Credential object used to connect to the SQL Server as a different user

Required? false
Position? 2
Default value
Accept pipeline input? false
Accept wildcard characters? false

-Silent [<SwitchParameter>]
Use this switch to disable any kind of verbose messages

Required? false
Position? named
Default value False
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).

INPUTS

OUTPUTS

NOTES

Copyright (C) 2016 Jakob Bindslet (jakob@bindslet.dk)

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>.

————————– EXAMPLE 1 ————————–

PS C:\>Get-DbaLastGoodCheckDb -SqlInstance ServerA\sql987

Returns a custom object displaying Server, Database, DatabaseCreated, LastGoodCheckDb, DaysSinceDbCreated, DaysSinceLastGoodCheckDb, Status and
DataPurityEnabled

————————– EXAMPLE 2 ————————–

PS C:\>Get-DbaLastGoodCheckDb -SqlInstance ServerA\sql987 -SqlCredential (Get-Credential sqladmin) | Format-Table -AutoSize

Returns a formatted table displaying Server, Database, DatabaseCreated, LastGoodCheckDb, DaysSinceDbCreated, DaysSinceLastGoodCheckDb, Status
and DataPurityEnabled.
Authenticates with SQL Server using alternative credentials.

RELATED LINKS
DBCC CHECKDB:
https://msdn.microsoft.com/en-us/library/ms176064.aspx
http://www.sqlcopilot.com/dbcc-checkdb.html
Data Purity:
http://www.sqlskills.com/blogs/paul/checkdb-from-every-angle-how-to-tell-if-data-purity-checks-will-be-run/
https://www.mssqltips.com/sqlservertip/1988/ensure-sql-server-data-purity-checks-are-performed/

So anyone who needs to use the command can see what it is, a full description, what each parameter is for, some examples and some links to more information

So what I used to do was put a snippet of code like this at the top of my function and then fill in the blanks

<#
.SYNOPSIS
Short description
.DESCRIPTION
Long description
.EXAMPLE
An example
.PARAMETER ParameterName
Parameter Help
.NOTES
General notes
.LINK
Link to more information
#>

The latest release of the PowerShell extension for VS Code has made that process so much simpler 🙂 Thank you David and Keith

Now you can simply type <# and your help will be dynamically created. You will still have to fill in some of the blanks but it is a lot easier.

Here it is in action in its simplest form

Pester - Simple.gif

But it gets better than that. When you add parameters to your function code they are added to the help as well. Also, all you have to do is to tab between the different entries in the help to move between them

02 - detailed.gif

Now when we run

Get-Help Invoke-AmazingThings -Full

We get this

03 help.PNG

Nice and easy and a great feature added to the VS Code PowerShell extension

Write Good Help.png

Setting the default file type for a new file in VS Code

Just a short post today. When you open a new file in VS Code (Using CTRL + N) it opens by default as a plain text file.

To change the language for the file use CTRL + K, M.

That’s CTRL and K together and then M afterwards separately.

then you can choose the language for the file. It looks like this

01 - Change language

However, if you just want your new file to open as a particular language every time you can change this in the settings.

Click File –> Preferences –> Settings

or by clicking CTRL + ,

02 - Open Preferences.PNG

This opens the settings.json file. Search in the bar for default and scroll down until you see file

03 - File defaults.PNG

If you hover over the setting that you want to change, you will see a little pencil. Click on that and then Copy to Settings which will copy it to your user settings in the right hand pane.

NOTE – You will need to enter powershell and not PowerShell. For other languages, click on the language in the bottom bar and look at the value in the brackets next to the language name
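The end result in the user settings pane is a single entry, like this:

"files.defaultLanguage": "powershell"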

04 - langauge.PNG

Once you have entered the new settings save the file (CTRL + S) and then any new file you open will be using the language you have chosen

It looks like this

05 - Change settings.gif

and now every new file that you open will be opened as a PowerShell file (or whichever language you choose)

You will still be able to change the language with CTRL + K, M.

Just to be clear, because people sometimes get this wrong: that’s CTRL and K, let go and then M. You will know you are doing it correctly when you see

(CTRL + K) was pressed waiting for second key of chord……

06 - waiting for key

If you get it wrong and press CTRL + K + M then you will open the Extensions search for keymaps.


07 - incorrect.PNG

This is a brilliant feature enabling you to copy key mappings for the programmes you use all the time and save you from learning the Code key mappings. You can find the keymaps in the Extensions Marketplace as well as by pressing CTRL + K + M

Using Twitter with VS Code

So today I saw this tweet from Mathias in reply to Stefan and Amanda

01 - tweet.PNG


That looks cool. Twitter in VS Code, one less programme to open. Here’s how it works.

Open VS Code and hit CTRL + P and type ext install twitter or hit CTRL + SHIFT + X to open the extensions and search

02 - install.PNG


Hit install and then reload

03 - reload


Accept the prompt

04 - prompt.PNG

and you will have a Twitter button in the bar at the bottom

05 - bar

If you click it then the top bar will change to the set up wizard

06 - forst time.PNG

Follow it along and create a Twitter App.

07 - want to continue

08 - create an app

Which will open up the website.


09 - create an app.PNG

Just fill in the blanks

010 - fil in the blanks.PNG

and then you will have this window

011 - app settings.PNG

Click on Keys and Access Tokens

012 - settings and appl

Check the App Permissions are set to read and write

014 - app permissions

and then click create my access token

Then go back to VS Code and click

016 - settings details

Now you are shown the settings.json (which you can always find by File –> Preferences –> Settings)

017 - settingsjson.PNG

This bit the wizard doesn’t explain very well (hence this post). If you already have some settings between the curly braces, you will need to put a comma after them and then paste the code below. If you do not, then paste the code below between the curly braces.

"twitter.consumerkey":"",
"twitter.consumersecret":"",
"twitter.accesstokenkey":"",
"twitter.accesstokensecret":"" 

and then paste the relevant keys and tokens from your twitter app between the double quotes

THEN PRESS CTRL + S to save the settings.json – it’s in caps because when I showed someone they were too excited and didn’t save it!!

Now it’s all set up you can use the extension. Click the Twitter button in the bar and

018 - Twitter actions

Now you can have your Home timeline in Code

019 - twitter home.PNG

(Yes, there are still a few seats left for the Europe PowerShell Conference!) You can search, see your mentions and your user page

020- mentions.PNG

and post 🙂 You can just press F1 and start typing twitter to get the commands

021 - comands

022 - tweet.png

and


023 - tweet.PNG

You can also direct message people using D username message. You can see the extension repo on GitHub, which will be a good place to raise issues, bugs and feature requests.

I’m not sure it will increase my productivity!! But it really pleases the nerd in me!


Why VS Code Increases my Productivity

Last week I was showing a co-worker some PowerShell code and he asked what the editor was that I was using. Visual Studio Code I said. Why do you use that? What does it do?

This is what I showed him

Runs on any Operating System

Code (as I shall refer to it) is a free, lightweight, open source editor which runs on all the main operating systems, so you have the same experience on Linux as on Windows and there is less to learn.

Extensions

You can add new languages, themes, debuggers and tools from the extensions gallery to reduce the number of programmes you have open and the need to switch between programmes

You can add extensions using CTRL + SHIFT + X and searching in the bar

01 - Extensions

or by going to the Extensions gallery searching for the extensions and copying the installation command

02 - extensions gallery.PNG

Debugging

There is a rich debugging experience built in.

03 - debugging.PNG

You can learn about debugging from the official docs and Keith Hill wrote a blog post on Hey Scripting Guys about debugging PowerShell

Intellisense

An absolute must to make life simpler. Code has intellisense for PowerShell and T-SQL, which I use the most, but also for many more languages. Read more here.

Git integration

I love the Git integration, it makes it so easy to work with GitHub for me. I can see diffs, commit, undo commits nice and simply. Just open the root folder of the repository and it’s there.

04 - git

This page will give you a good start on using git with Code

No distractions

With full screen mode (F11) or Zen mode (CTRL + K, Z) I can concentrate on coding and not worry about distractions.

Stay in one programme and do it all

I have a Markdown document, a PowerShell script and a T-SQL script all in one Git repository and I can work on all of them and version control in one place. The screencast below also shows some of the new capabilities available in the Insiders version. I managed to leave the screen recording dialogue open as well, apologies, and the mistake was deliberate!

I used the GitLens and SQL beautify extensions as well as the dbatools module in that demo

That’s why I am using Code more and more these days, hope it helps

Happy Automating!


Max Length of a column in a DataTable in PowerShell

Whilst I was writing my Test-DbaLastBackup posts I ran into a common error I get when importing datatables into a database.

I was using this table

01 - table

and when I tried to add the results of the Test-DbaLastBackup I got this

02 -error.PNG

Exception calling “WriteToServer” with “1” argument(s): “The given value of type String from the data source cannot be converted to type nvarchar of the
specified target column.”
At line:356 char:4
+             $bulkCopy.WriteToServer($InputObject)
+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : InvalidOperationException

Hmm, it says that it can’t convert a string to a nvarchar, that doesn’t sound right. To find out what was happening I used a little bit of code that I use every single day

 $Error[0] | Fl -force

All errors from your current session are stored in the $Error array, so [0] accesses the most recent one. fl is an alias for Format-List and the -Force switch expands the object. This is what I saw

03 expanded error.PNG

System.Management.Automation.MethodInvocationException: Exception calling “WriteToServer” with “1” argument(s): “The given value of
type String from the data source cannot be converted to type nvarchar of the specified target column.” —>
System.InvalidOperationException: The given value of type String from the data source cannot be converted to type nvarchar of the
specified target column. —> System.InvalidOperationException: String or binary data would be truncated.

String or binary data would be truncated. OK, that makes sense: one of my columns has larger data than the destination column, but which one? Let’s take a look at some of the data

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : RidetheLightning
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:03 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\RidetheLightning\FULL\SQL2016N3_RidetheLightning_FULL_20170319_000003.bak

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : TheCallofKtulu
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:04 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\TheCallofKtulu\FULL\SQL2016N3_TheCallofKtulu_FULL_20170319_000004.bak

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : TrappedUnderIce
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:04 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\TrappedUnderIce\FULL\SQL2016N3_TrappedUnderIce_FULL_20170319_000004.bak

Hmm, it’s not going to be easy to work out which bit of data is too big here.

All I need to know is the maximum length of the columns in the datatable though, so I have a little snippet that will do that for me

$columns = ($datatable | Get-Member -MemberType Property).Name
foreach ($column in $columns) {
    $max = 0
    foreach ($a in $datatable) {
        if ($max -lt $a.$column.length) {
            $max = $a.$column.length
        }
    }
    Write-Output "$column max length is $max"
}

and the output looks like this

04 - max length.PNG

So we can quickly see that the BackupFiles property is too big; change the table accordingly and there is no more error.
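The table-side fix is a simple column widening. A hypothetical example, since the real table definition is only in the screenshot above (the table and database names are made up):

# Hypothetical - widen the BackupFiles column to fit the longest value found
Invoke-Sqlcmd -ServerInstance . -Database TestResults -Query "ALTER TABLE dbo.TestDbaLastBackup ALTER COLUMN BackupFiles nvarchar(500);"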

It’s pretty quick too, scanning 105 rows in 56 milliseconds in this example.
05 - how long.PNG

I keep this little snippet in my snippets list for PowerShell ISE which you can find here

Here is the code to add this as a snippet to ISE:

## A list of snippets
$snips = Get-IseSnippet
## Add a snippet
if (!$snips.Where{$_.Name -like 'Max Length of Datatable*'}) {
    $snippet = @{
        Title       = 'Max Length of Datatable'
        Description = 'Takes a datatable object and iterates through it to get the max length of the string columns - useful for data loads'
        Text        = @"
`$columns = (`$datatable | Get-Member -MemberType Property).Name
foreach (`$column in `$columns) {
    `$max = 0
    foreach (`$a in `$datatable) {
        if (`$max -lt `$a.`$column.length) {
            `$max = `$a.`$column.length
        }
    }
    Write-Output "`$column max length is `$max"
}
"@
    }
    New-IseSnippet @snippet
}
If you want to add it to your VS Code snippets then you need to edit the powershell.json file, which is located in your user home AppData folder (‘C:\Users\User\AppData\Roaming\Code\User\snippets\powershell.json’) or can be opened by clicking File –> Preferences –> User Snippets and typing PowerShell.
04 user snippets.gif

Then you can add this bit of json inside the curly braces

 "Max Length of Datatable": {
"prefix": "Max Length of Datatable",
"body": [
"$$columns = ($$datatable | Get-Member -MemberType Property).Name",
"foreach($$column in $$Columns) {",
"    $$max = 0",
"    foreach ($$a in $$datatable){",
"        if($$max -lt $$a.$$column.length){",
"            $$max = $$a.$$column.length",
"        }",
"    }",
"    Write-Output \"$$column max length is $$max\"",
"}"
],
"description": "Takes a datatable object and iterates through it to get the max length of the string columns - useful for data loads"
}, 

and you have your snippet ready for use

07 - snippet in vscode.gif

Happy Automating