Pester 4.2.0 has a Because…… because :-)

I was going through my demo for the South Coast User Group meeting tonight and decided to add some details about the Because parameter available in the Pester pre-release version 4.2.0.

To install a pre-release version you need the latest PowerShellGet module. This is pre-installed with PowerShell v6, but for earlier versions open PowerShell as administrator and run

Install-Module PowerShellGet -Force

You can try out the Pester pre-release version (once you have the latest PowerShellGet) by installing it from the PowerShell Gallery with

Install-Module -Name Pester -AllowPrerelease -Force # add -Scope CurrentUser if you are not an administrator

There are a number of improvements, as you can see in the change log. I particularly like the following:

  • Add -BeTrue to test for truthy values
  • Add -BeFalse to test for falsy values
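
For example, a quick sketch of the new truthy and falsy assertions:

Describe "Truthy and falsy values" {
    It "Treats a non-empty string as truthy" {
        'Beard' | Should -BeTrue
    }
    It "Treats zero as falsy" {
        0 | Should -BeFalse
    }
}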

This release adds the Because parameter to all assertions, which means that you can add a reason why the test has failed. As Jakub Jareš writes here:

  • Reasons force you to think more
  • Reasons document your intent
  • Reasons make your TestCases clearer

So you can do something like this
Describe "This shows the Because"{
    It "Should be true" {
        $false | Should -BeTrue -Because "The Beard said so"
    }
}

Which gives an error message like this 🙂

As you can see, the Expected line gives the expected value, then your Because statement and then the actual result, which means that you could write validation tests like this

Describe "My System" {
    Context "Server" {
        It "Should be using XP SP3" {
            (Get-CimInstance -ClassName win32_operatingsystem).Version | Should -Be '5.1.2600' -Because "We have failed to bother to update the App and it only works on XP"
        }
        It "Should be running as rob-xps\mrrob" {
            whoami | Should -Be 'rob-xps\mrrob' -Because "This is the user with the permissions"
        }
        It "Should have SMB1 enabled" {
            (Get-SmbServerConfiguration).EnableSMB1Protocol | Should -BeTrue -Because "We don't care about the risk"
        }
    }
}

and get a result like this

Or if you were looking to validate your SQL Server you could write something like this

It "Backups Should have Succeeeded" {
    $Where = {$_IsEnabled -eq $true -and $_.Name -like '*databasebackup*'}
    $Should = @{
        BeTrue = $true
        Because =  "WE NEED BACKUPS - OMG"
    }
    (Get-DbaAgentJob -SqlInstance $instance| Where-Object $where).LastRunOutcome -NotContains 'Failed' | Should @Should
}

Or maybe your security policies only allow Windows groups as logins on your SQL instances. You could easily link to the documentation and explain why this is important. This way you could build up a set of tests to validate that your SQL Server is just so for your environment.

It "Should only have Windows groups as logins" {
    $Should = @{
        BeFalse = $true
        Because = "Our Security Policies say we must only have Windows groups as logins - See this document"
    }
    (Get-DbaLogin -SqlInstance $instance -WindowsLogins).LoginType -contains 'WindowsUser' | Should @Should
}

Just for fun, these would look like this

and the code looks like

$Instances = 'Rob-XPS', 'Rob-XPS\Bolton'

foreach ($instance in $Instances) {
    $Server, $InstanceName = $Instance.Split('\')
    if ($InstanceName.Length -eq 0) {$InstanceName = 'MSSQLSERVER'}

    Describe "Testing the instance $instance" {
        Context "SQL Agent Jobs" {
            It "Backups Should Have Succeeded" {
                $Where = {$_.IsEnabled -eq $true -and $_.Name -like '*databasebackup*'}
                $Should = @{
                    BeTrue  = $true
                    Because = "WE NEED BACKUPS - OMG"
                }
                (Get-DbaAgentJob -SqlInstance $instance | Where-Object $Where).LastRunOutcome -notcontains 'Failed' | Should @Should
            }
        }
        Context "Logins" {
            It "Should only have Windows groups as logins" {
                $Should = @{
                    BeFalse = $true
                    Because = "Our Security Policies say we must only have Windows groups as logins - See this document"
                }
                (Get-DbaLogin -SqlInstance $instance -WindowsLogins).LoginType -contains 'WindowsUser' | Should @Should
            }
        }
    }
}

This will be a useful improvement to Pester when it is released and will enable you to write validation checks with explanations.

Chrissy has written about dbachecks, the new up-and-coming community-driven open-source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open-source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my Power Bi for Pester file and made it beautiful, and whilst we were discussing this we found that if the Pester tests were not formatted correctly the Power Bi looked… well, rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were

  • The Describe title should be in double quotes
  • The Describe should use the plural Tags parameter
  • The Tags should be singular
  • The first Tag should be a unique tag in Get-DbcConfig
  • The Context title should end with $psitem
  • The code should use Get-SqlInstance or Get-ComputerName
  • The code should use the ForEach method
  • The code should not use $_
  • The code should contain a Context block

She asked me if I could write the Pester tests for it and this is how I did it. I needed to look at the Tags parameter of the Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don't know very much about it, but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester, so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.
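
If you have not met the AST before, here is a minimal sketch (the file path is just an example) that parses a script and lists the commands it calls:

# Parse a script file into an AST and list the unique command names it calls
$tokens = $null
$errors = $null
$ast = [System.Management.Automation.Language.Parser]::ParseFile('C:\temp\Database.Tests.ps1', [ref]$tokens, [ref]$errors)
$commandAsts = $ast.FindAll( { param($node) $node -is [System.Management.Automation.Language.CommandAst] }, $true)
$commandAsts | ForEach-Object { $_.GetCommandName() } | Select-Object -Unique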

The first thing I did was to get the Pester tests, which we have located in a checks folder, loop through them and get the content of each file with the Raw parameter

Describe "Checking that each dbachecks Pester test is correctly formatted for Power Bi and Coded correctly" {
$Checks =(Get-ChildItem$ModuleBase\checks).Where{$_.Name-ne'HADR.Tests.ps1'}
$Checks.Foreach{
$Check =Get-Content$Psitem.FullName-Raw
Context "$($_.Name) - Checking Describes titles and tags" {
Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
$Describes = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true) |
    ForEach {
    $CE = $_.CommandElements
    $secondString = ($CE | Where { $_.StaticType.Name -eq 'string' })[1]
    $tagIdx = $CE.IndexOf(($CE | Where ParameterName -eq 'Tags')) + 1
    $tags = if ($tagIdx -and $tagIdx -lt $CE.Count) {
        $CE[$tagIdx].Extent
    }
    New-Object PSCustomObject -Property @{
        Name = $secondString
        Tags = $tags
    }
}
As I understand it, this code is using the Parser on $check (which contains the code from the file), finding all of the Describe commands and creating an object with the title of the Describe (the second element with a StaticType of string) and the values from the Tags parameter.
When I ran this against the database tests file I got the following results
Then it was a simple case of writing some tests for the values
@($describes).ForEach{
    $title = $PSItem.Name.ToString().Trim('"').Trim('''')
    It "$title Should Use a double quote after the Describe" {
        $PSItem.Name.ToString().StartsWith('"') | Should Be $true
        $PSItem.Name.ToString().EndsWith('"') | Should Be $true
    }
    It "$title should use a plural for tags" {
        $PsItem.Tags | Should Not BeNullOrEmpty
    }
    # a simple test for no esses apart from statistics and Access!!
    if ($null -ne $PSItem.Tags) {
        $PSItem.Tags.Text.Split(',').Trim().Where{($_ -ne '$filename') -and ($_ -notlike '*statistics*') -and ($_ -notlike '*BackupPathAccess*')}.ForEach{
            It "$PsItem Should Be Singular" {
                $_.ToString().EndsWith('s') | Should Be $False
            }
        }
        It "The first Tag Should Be in the unique Tags returned from Get-DbcCheck" {
            $UniqueTags -contains $PSItem.Tags.Text.Split(',')[0].ToString() | Should Be $true
        }
    }
    else {
        It "You haven't used the Tags Parameter so we can't check the tags" {
            $false | Should Be $true
        }
    }
}

The $describes variable is inside @() so that the ForEach method will still work if there is only one Describe. The unique tags are returned from our command Get-DbcCheck, which shows all of the checks. We will have a unique tag for each test so that they can be run individually.
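
A minimal sketch of populating that variable, assuming Get-DbcCheck exposes a UniqueTag property on each check, would be:

# Assumption: Get-DbcCheck returns one object per check with a UniqueTag property
$UniqueTags = (Get-DbcCheck).UniqueTag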

Yes, I have tried to ensure that the tags are singular by checking that they do not end with an s (apart from statistics), and so had to exclude BackupPathAccess and statistics from that check. Filename is a variable that we add to the Tags of each Describe so that we can run all of the tests in one file. I also added a little if block to the Pester test so that the error is more obvious if the Tags parameter was not passed.

I did the same with the context blocks as well

Context "$($_.Name) - Checking Contexts" {
    ## Find the Contexts
    $Contexts = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
    FindAll([Func[Management.Automation.Language.Ast, bool]] {
            param($ast)
            $ast.CommandElements -and
            $ast.CommandElements[0].Value -eq 'Context'
        }, $true) |
        ForEach {
        $CE = $_.CommandElements
        $secondString = ($CE | Where { $_.StaticType.Name -eq 'string' })[1]
        New-Object PSCustomObject -Property @{
            Name = $secondString
        }
    }
    @($Contexts).ForEach{
        $title = $PSItem.Name.ToString().Trim('"').Trim('''')
        It "$Title Should end with `$psitem So that the PowerBi will work correctly" {
            $PSItem.Name.ToString().EndsWith('psitem"') | Should Be $true
        }
    }
}
This time we look for the Context command and ensure that the string value ends with psitem, as the Power Bi parses the last value when creating columns.

Finally, I get all of the code and check that it matches some coding standards.
Context "$($_.Name) - Checking Code" {
    ## This just grabs all the code
    $AST = [System.Management.Automation.Language.Parser]::ParseInput($Check, [ref]$null, [ref]$null)
    $Statements = $AST.EndBlock.statements.Extent
    ## Ignore the filename line
    @($Statements.Where{$_.StartLineNumber -ne 1}).ForEach{
        $title = [regex]::matches($PSItem.text, "Describe(.*)-Tag").groups[1].value.Replace('"', '').Replace('''', '').trim()
        It "$title Should Use Get-SqlInstance or Get-ComputerName" {
            ($PSItem.text -Match 'Get-SqlInstance') -or ($psitem.text -match 'Get-ComputerName')| Should be $true
        }
        It "$title Should use the ForEach Method" {
            ($Psitem.text -match 'Get-SqlInstance\).ForEach{') -or ($Psitem.text -match 'Get-ComputerName\).ForEach{')| Should Be $true# use the \ to escape the )
    }
    It "$title Should not use `$_" {
        ($Psitem.text -match '$_')| Should Be $false
    }
    It "$title Should Contain a Context Block" {
        $Psitem.text -match 'Context'| Should Be $True
    }
}

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are, and then loop through each statement apart from the first line. This ensures that the code is using our internal commands Get-SqlInstance or Get-ComputerName to get information, that we are looping through each of those arrays using the ForEach method rather than ForEach-Object, that we are using $psitem rather than $_ to reference "this item" in the array, and that each Describe block has a Context block.

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and ensure that the Power Bi results still look beautiful!

Anyone can run the tests using

Invoke-Pester .\tests\Unit.Tests.ps1 -show Fails
before they create a Pull Request and it looks like this

If everything is green then they can submit their Pull Request 🙂 If not, they can see quickly that something needs to be fixed (fail early 🙂).
03 fails.png

Converting a Datarow to a JSON object with PowerShell

This is just a quick post. As is frequent with these, it is as much for me to refer to in the future as anything, because the very act of writing it down will aid me in remembering. I encourage you to do the same. Share what you learn because it will help you as well as helping others.

Anyway, I was writing some Pester tests for a module that I was writing when I needed some sample data. I have written before about using JSON for this purpose. This function required some data from a database, so I wrote the query to get the data and used dbatools to run the query against the database using Get-DbaDatabase.

$db = Get-DbaDatabase -SqlInstance $Instance -Database $Database
$variable = $db.Query($Query)

Simple enough. I wanted to be able to Mock $variable. I wrapped the code above in a function, let’s call it Run-Query

function Run-Query {
    Param($Query)
    $db = Get-DbaDatabase -SqlInstance $Instance -Database $Database
    $variable = $db.Query($Query)
    # output the results so that the caller (or a mock) can use them
    $variable
}

This meant that I could easily separate it for mocking in my test. I ran the code, investigated the $variable variable to ensure it had what I wanted for my test and then decided to convert it into JSON using ConvertTo-Json.

Let's show what happens with an example using WideWorldImporters and a query I found on Kendra Little's blog post about deadlocks

$query = @"
SELECT Top 10 CityName, StateProvinceName, sp.LatestRecordedPopulation, CountryName
FROM Application.Cities AS city
JOIN Application.StateProvinces AS sp on
city.StateProvinceID = sp.StateProvinceID
JOIN Application.Countries AS ctry on
sp.CountryID=ctry.CountryID
WHERE sp.StateProvinceName = N'Virginia'
ORDER BY CityName
"@
$db = Get-DbaDatabase -SqlInstance rob-xps -Database WideWorldImporters
$variable = $db.Query($Query)
If I investigate the $variable variable I get

data results

The results were just what I wanted, so I thought I would just convert them to JSON and save them in a file and, bingo, I would have some test data in a mock to ensure my code is doing what I expect. However, when I run

$variable | ConvertTo-Json

I get

json error.png

and that's just for one row!

The way to resolve this is to only select the data that we need. The easiest way to do this is to exclude the properties that we don’t need

$variable | Select-Object * -ExcludeProperty ItemArray, Table, RowError, RowState, HasErrors | ConvertTo-Json

which gave me what I needed and a good use case for -ExcludeProperty

json fixed.png
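
As a rough sketch (the file path and the expected count are purely illustrative), the saved JSON can then be loaded back in and used to mock the Run-Query function from earlier, so the tests never need to touch the database:

Describe "Testing code that calls Run-Query" {
    # Return the saved sample data instead of querying the database
    Mock Run-Query { Get-Content -Path C:\temp\SampleCities.json -Raw | ConvertFrom-Json }

    It "Returns the ten rows from the sample data" {
        (Run-Query -Query 'SELECT 1').Count | Should -Be 10
    }
}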

Handling Missing Instances when Looping with Pester

In my previous posts about writing your first Pester Test and looping through instances I described how you can start to validate that your SQL Server is how YOU want it to be.

Unavailable machines

Once you begin to have a number of tests for a number of instances, you want to be able to handle any machines that are not available cleanly, otherwise you might end up with something like this.

01 - error.png

In this (made up) example we loop through three instances and try to check that the DNS Server entry is correct, but for one of them we get a massive error. If we had created a large number of tests for each machine, we would have a large number of massive errors.

Empty Collection

If we don't successfully create our collection we might have an empty collection, which will give us a different issue: no tests.

02 - no tests.png

If this was in amongst a whole number of tests we would not have tested anything in this Describe block and might be thinking that our tests were OK because we had no failures of our tests. We would be wrong!

Dealing with Empty Collections

One way of dealing with empty collections is to test that they have more than 0 members

if ($instances.count -gt 0) {
    $instances.ForEach{
        ## Tests in here
    }
}
else {Write-Warning "Uh-Oh - The Beard is Sad! - The collection is empty. Did you set `$Instances correctly?"}
Notice the backtick ` before the $ to escape it in the Write-Warning. An empty collection now looks like this

03 - uh-oh.png

Which is much better and provides useful information to the user.

Dealing with Unavailable Machines

If we want to make sure we don't clutter up our test results with a whole load of failures when a machine is unavailable, we can use similar logic.

First we could check if it is responding to a ping (assuming that ICMP is allowed by the firewall and switches) using

Test-Connection -ComputerName $computer -Count 1 -Quiet -ErrorAction SilentlyContinue

This will just try one ping and do it quietly only returning True or False and if there are any errors it shouldn’t mention it

In the example above I am using PSRemoting and we should make sure that that is working too. So whilst I could use

Test-WSMan -ComputerName $computer

this only checks if a WSMAN connection is possible and not other factors that could be affecting the ability to run remote sessions. Having been caught by this before I have always used this function from Lee Holmes (Thank you Lee) and thus can use

$instances.ForEach{
    $computer = $_.Split('\')[0] # To get the computername if there is an instance name
    # Check if machine responds to ping
    if (!(Test-Connection -ComputerName $computer -Count 1 -Quiet -ErrorAction SilentlyContinue))
    {Write-Warning "Uh-Oh - $Computer is not responding to a ping - aborting the tests for this machine"; Return}
    # Check if PSremoting is possible for this machine
    # Requires Test-PSRemoting by Lee Holmes http://www.leeholmes.com/blog/2009/11/20/testing-for-powershell-remoting-test-psremoting/
    if (!(Test-PsRemoting $computer))
    {Write-Warning "Uh-Oh - $Computer is not able to use PSRemoting - aborting the tests for this machine"; Return}
    Describe "Testing Instance $($_)" {
        ## Put tests in here
    }
}
which provides a result like this

04 - better handling.png

Which is much better I think 🙂

Let dbatools do the error handling for you

If your tests are only using the dbatools module then there is built in error handling that you can use. By default dbatools returns useful messages rather than the exceptions from PowerShell (You can enable the exceptions using the -EnableExceptions parameter if you want/need to) so if we run our example from the previous post it will look like

05 - dbatools handling.png

which is fine for a single command, but if we are running multiple commands against each instance we don't really want to waste time and resources repeatedly trying to connect to an instance that we know is not available.

dbatools at the beginning of the loop

We can use Test-DbaConnection to perform a check at the beginning of the loop as we discussed in the previous post

$instances.ForEach{
    if (!((Test-DbaConnection -SqlInstance $_ -WarningAction SilentlyContinue).ConnectSuccess))
    {Write-Warning "Uh-Oh - we cannot connect to $_ - aborting the tests for this instance"; Return}

Notice that we have used -WarningAction SilentlyContinue to hide the warnings from the command this time. Our test now looks like this
06 - dbatools test-dbaconnection.png
Test-DbaConnection performs a number of tests, so you can check for ping, SQL version, domain name and remoting if you want to exclude tests on that basis.
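
A quick sketch of the properties used later in this post:

# These are the Test-DbaConnection properties used in the framework below
$TestConnection = Test-DbaConnection -SqlInstance ROB-XPS -WarningAction SilentlyContinue
$TestConnection.IsPingable             # did the machine answer a ping?
$TestConnection.PsRemotingAccessible   # can we open a remote session?
$TestConnection.ConnectSuccess         # did the SQL connection succeed?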

Round Up

In this post we have covered some methods of ensuring that your Pester Tests return what you expect. You don’t want empty collections of SQL Instances making you think you have no failed tests when you have not actually run any tests.

You can do this by checking how many instances are in the collection

You also don't want to keep running tests against a machine or instance that is not responding or available.

You can do this by checking a ping with Test-Connection or if remoting is required by using the Test-PSRemoting function from Lee Holmes

If you want to use dbatools exclusively you can use Test-DbaConnection

Here is a framework to put your tests inside. You will need to provide the values for the $Instances and place your tests inside the Describe Block

if ($instances.count -gt 0) {
    $instances.ForEach{
        $TestConnection = Test-DbaConnection -SqlInstance $_ -WarningAction SilentlyContinue
        # Check if machine responds to ping
        if (!($TestConnection.IsPingable))
        {Write-Warning "Uh-Oh - The Beard is Sad! - $_ is not responding to a ping - aborting the tests for this instance"; Return}
        # Check if we have remote access to the machine
        if (!($TestConnection.PsRemotingAccessible))
        {Write-Warning "Uh-Oh - The Beard is Sad! - $_ is not able to use PSRemoting - aborting the tests for this instance"; Return}
        # Check if we have SQL connection to the Instance
        if (!($TestConnection.ConnectSuccess))
        {Write-Warning "Uh-Oh - The Beard is Sad! - we cannot connect to SQL on $_ - aborting the tests for this instance"; Return}
        Describe "Testing Instance $($_)" {
            ## Now put your tests in here - separate them with context blocks if you want to
            Context "Networks" { }
        }
    }
}
else
## If the collection is empty
{Write-Warning "Uh-Oh - The Beard is Sad! - The collection is empty. Did you set `$Instances correctly?"}

2 Ways to Loop through collections in Pester

In my last post I showed you how to write your first Pester test to validate something. Here’s a recap

  • Decide the information you wish to test
  • Understand how to get it with PowerShell
  • Understand what makes it pass and what makes it fail
  • Write a Pester Test

You probably have more than one instance that you want to test, so how do you loop through a collection of instances? There are a couple of ways.

Getting the Latest Version of the Module

The magnificent Steve Jones wrote about getting the latest version of Pester and the correct way to do it. You can find the important information here

Test Cases

The first way is to use the TestCases parameter of the It command (the test), which I have written about when using TDD for Pester here

Let's write a test first to check that we can successfully connect to a SQL instance. Running

Find-DbaCommand connection

shows us that the Test-DbaConnection command is the one that we want from the dbatools module. We should always run Get-Help to understand how to use any PowerShell command. This shows us that the results will look like this

01 - gethelp test-dbaconnection

So there is a ConnectSuccess result which returns True or False. Our test can look like this for a single instance

Describe 'Testing connection to ROB-XPS' {
    It "Connects successfully to ROB-XPS" {
        (Test-DbaConnection -SqlInstance ROB-XPS).ConnectSuccess | Should Be $True
    }
}

which gives us some test results that look like this

successful test.png
which is fine for one instance, but we want to check many.

We need to gather the instances into a $Instances variable. In my examples I have hard coded a list of SQL Instances but you can, and probably should, use a more dynamic method, maybe the results of a query to a configuration database. Then we can fill our $TestCases variable, which can be done like this
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'
# Create an empty array
$TestCases = @()
# Fill the Testcases with the values and a Name of Instance
$Instances.ForEach{$TestCases += @{Instance = $_}}
Then we can write our test like this
# Get a list of SQL Servers
# Use whichever method suits your situation
# Maybe from a configuration database
# I'm just using a hard-coded list for example
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'

# Create an empty array
$TestCases = @()

# Fill the Testcases with the values and a Name of Instance
$Instances.ForEach{$TestCases += @{Instance = $_}}
Describe 'Testing connection to SQL Instances' {
    # Put the TestCases 'Name' in <> and add the TestCases parameter
    It "Connects successfully to <Instance>" -TestCases $TestCases {
        # Add a Parameter to the test with the same name as the TestCases Name
        Param($Instance)
        # Write the test using the TestCases Name
        (Test-DbaConnection -SqlInstance $Instance).ConnectSuccess | Should Be $True
    }
}
Within the title of the test we refer to the instance inside <> and add the parameter TestCases with a value of the $TestCases variable. We also need to add a Param() to the test with the same name and then use that variable in the test.
This looks like this
Testcases test.png

Pester is PowerShell

The problem with Test Cases is that we can only easily loop through one collection, but as Pester is just PowerShell we can simply use ForEach if we want to loop through multiple ones, like instances and then databases.

I like to use the ForEach method as it is slightly quicker than other methods. It will only work with PowerShell version 4 and above; below that version you need to pipe the collection to ForEach-Object.
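
For example, the two approaches look like this:

# PowerShell 4 and above - the ForEach method
$Instances.ForEach{ Write-Output "Checking $($_)" }

# Earlier versions - pipe the collection to ForEach-Object
$Instances | ForEach-Object { Write-Output "Checking $($_)" }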

Let's write a test to see if our databases have Trustworthy set on. We can do this using the Trustworthy property returned from Get-DbaDatabase.

We loop through our Instances using the ForEach method and create a Context for each Instance to make the test results easier to read. We then place the call to Get-DbaDatabase inside braces and loop through those and check the Trustworthy property

# Get a list of SQL Servers
# Use whichever method suits your situation
# Maybe from a configuration database
# I'm just using a hard-coded list for example
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'
Describe 'Testing user databases' {
    # Loop through the instances
    $Instances.ForEach{
        # Create a Context for each Instance.
        Context "Testing User Databases on $($_)" {
            # Loop through the User databases on the instance
            (Get-DbaDatabase -SqlInstance $_ -ExcludeAllSystemDb).ForEach{
                # Refer to the database name and Instance name inside a $()
                It "Database $($_.Name) on Instance $($_.Parent.Name) should not have TRUSTWORTHY ON" {
                    $_.Trustworthy | Should Be $false
                }
            }
        }
    }
}
and it looks like this

testdatabasetrustworthy.png

So there you have two different ways to loop through collections in your Pester tests. Hopefully this can help you to write some good tests to validate your environment.
Happy Pestering

Spend a Whole Day With Chrissy & I at SQLBits

If you would like to spend a whole day with Chrissy LeMaire and I at SQLBits in London in February – we have a pre-con on the Thursday
You can find out more about the pre-con at sqlps.io/bitsprecon
and you can register at sqlps.io/bitsreg

Write Your first Pester Test Today

I was in Glasgow this Friday enjoying the fantastic hospitality of the Glasgow SQL User Group @SQLGlasgow and presenting sessions with Andre Kamman, William Durkin and Chrissy LeMaire

I presented “Green is Good Red is Bad – Turning your checklists into Pester Tests”. I had to make sure I had enough energy beforehand so I treated myself to a fabulous burger.

20171110_114933-compressor.jpg

Afterwards I was talking to some of the attendees and realised that maybe I could show how easy it is to start writing your first Pester test. Here are the steps to follow so that you can write your first Pester test

  • Decide the information you wish to test
  • Understand how to get it with PowerShell
  • Understand what makes it pass and what makes it fail
  • Write a Pester Test

The first bit is up to you. I cannot decide what you need to test for on your servers in your environments. Whatever is the most important. For now pick one thing.

Logins – let's pick logins as an example for this post. That you should disable the sa account is advice that you will read all over the internet, and it is often written into estate documentation, so let's write a test for that.

Now we need the PowerShell command to return the information to test for. We need a command that will get information about logins on a SQL server and if it can return disabled logins then all the better.

As always when starting to use PowerShell with SQL Server, I would start with dbatools. If we run Find-DbaCommand we can search for commands in the module that support logins. (If you have chosen something that is not SQL Server related then you can use Get-Command or the internet to find the command you need.)

find-dbacommand.png

Get-DbaLogin. That looks like the one that we want. Now we need to understand how to use it. Always, always use Get-Help to do this. If we run

Get-Help Get-DbaLogin -Detailed

we get all of the information about the command and the examples. Example 8 looks like it will help us

get-dbalogin example

So now try running the command for our disabled sa account

Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled

disabled sa account

So we know that if we have a disabled sa account we get a result. Lets enable the sa account and run the command again

not disabled.png

We don't get a result. Excellent, now we know what happens for a successful test – we get one result, and for a failed test we get zero results. We can check that by running

login count
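
The screenshot shows the count in both states; something like this sketch:

# With sa disabled
(Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count   # returns 1
# With sa enabled
(Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count   # returns 0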

The first one has the account disabled and the second one not. So now we can write our Pester Test. We can start with a Describe Block with a useful title. I am going to add a context block so that you can see how you can group your tests.

describe context

and then we will write our test. Pester Tests use the It keyword. You should always give a useful title to your test

it should

Now we can write our test. We know that the command returns one result when we want it to pass so we can write a test like this

login test.png

The code I have added is

(Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count | Should Be 1
which is
  • the code for getting the information about the thing we wanted to test (The count of the disabled sa logins on the instance)
  • a pipe symbol |
  • The Should key word
  • The Be keyword
  • and the result we want to pass the test (1)
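
Put together, the whole test from the screenshots looks something like this (a minimal sketch using the example instance name):

Describe "SQL Logins on ROB-XPS" {
    Context "sa account" {
        It "Should have the sa account disabled" {
            (Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count | Should Be 1
        }
    }
}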

Ta Da! One Pester test written. You can run the test just by highlighting the code and running it in VS Code (or PowerShell ISE) and it will look like this for a passing test

passing test

It is better to save it for later use and then call it with Invoke-Pester

invoke

So now you can write your first Pester test. Just find the PowerShell to get the information that you need, understand what the results will be for passing and failing tests and write your test 🙂

Getting the Latest Version of the Module

The magnificent Steve Jones wrote about getting the latest version of Pester and the correct way to do it. You can find the important information here

Spend a Whole Day With Chrissy & I at SQLBits

If you would like to spend a whole day with Chrissy LeMaire and I at SQLBits in London in February – we have a pre-con on the Thursday
You can find out more about the pre-con at sqlps.io/bitsprecon
and you can register at sqlps.io/bitsreg

Using Plaster To Create a New PowerShell Module

Chrissy, CK and I presented a pre-con at PASS Summit in Seattle last week

20171031_083328.jpg

Tracey Boggiano came along to our pre-con and afterwards we were talking about creating PowerShell modules. In her blog post she explains how she creates modules by copying the code from another module (dbatools in this case!) and altering it to fit her needs. This is an absolutely perfect way to do things; in our pre-con we mentioned that there is no use in re-inventing the wheel, if someone else has already written the code then make use of it.

I suggested, however, that she use the PowerShell module Plaster to do this. We didn't have enough time to really talk about Plaster, so Tracey, this is for you (and I am looking forward to your blog about using it too 😉).

What is Plaster?

Plaster is a template-based file and project generator written in PowerShell. Its purpose is to streamline the creation of PowerShell module projects, Pester tests, DSC configurations, and more. File generation is performed using crafted templates which allow the user to fill in details and choose from options to get their desired output.

How Do I Get Plaster?

The best way to get Plaster is also the best way to get any PowerShell module, from the PowerShell Gallery

You can just run

Install-Module Plaster

If you get a prompt about the repository not being trusted, don’t worry you can say yes.

Following PowerShell’s Security Guiding Principles, Microsoft doesn’t trust its own repository by default. The advice as always is never trust anything from the internet even if a bearded fellow from the UK recommends it!!

The PowerShell Gallery is a centralised repository where anyone can upload code to share and whilst all uploads are analyzed for viruses and malicious code by Microsoft, user discretion is always advised. If you do not want to be prompted every time that you install a module then you can run

Set-PSRepository -Name PSGallery -InstallationPolicy Trusted

if you and/or your organisation think that that is the correct way forward.

What Can We Do With Plaster?

Now that we have installed the module we can get to the nitty gritty. You can (and should) use Plaster to automate the creation of your module structure. If you are going to do something more than once then automate it!

I created a repository for my Plaster template. You are welcome to take it and modify it for your own needs. I created a folder structure and some default files that I always want to have in my module folder.

module framework.png

So in my template I have created all of the folders to organise the files in the way that I want to for my modules. I have also included the license file and some markdown documents for readme, contributing and installation. If we look in the tests folder

tests folder.png

There are some default test files included as well.

But Plaster is more than just a file and folder template repository. If we look in the installation markdown file, it looks like this

# Installing <%= $PLASTER_PARAM_ModuleName %>
# You can install <%= $PLASTER_PARAM_ModuleName %> from the Powershell Gallery using
Find-Module <%= $PLASTER_PARAM_ModuleName %> | Install-Module
Import-Module <%= $PLASTER_PARAM_ModuleName %>

We can parameterise the content of our files. This will create a very simple markdown file showing how to find and install the module from the PowerShell Gallery, which saves us from having to type the same thing again and again. Let's see how to do that.

The Manifest XML file

The magic happens in the manifest file. You can create one with the New-PlasterManifest command in the template directory – thank you to Mustafa for pointing out that the manifest file creation now requires an extra parameter of TemplateType.

$manifestProperties = @{
    Path            = "PlasterManifest.xml"
    Title           = "Full Module Template"
    TemplateName    = 'FullModuleTemplate'
    TemplateVersion = '0.0.1'
    TemplateType    = 'Item'
    Author          = 'Rob Sewell'
}
New-Item -Path FullModuleTemplate -ItemType Directory
New-PlasterManifest @manifestProperties

This will create a PlasterManifest.xml file that looks like this
<?xml version="1.0" encoding="utf-8"?>
<plasterManifest
    schemaVersion="1.1"
    templateType="Project" xmlns="http://www.microsoft.com/schemas/PowerShell/Plaster/v1">
    <metadata>
        <name>FullModuleTemplate</name>
        <id>220fba73-bf86-49e3-9ec5-c4bc2719d196</id>
        <version>0.0.1</version>
        <title>FullModuleTemplate</title>
        <description>My Plaster Template for PowerShell Modules</description>
        <author>Rob Sewell</author>
        <tags></tags>
    </metadata>
    <parameters></parameters>
    <content></content>
</plasterManifest>
You can see that the parameters and content tags are empty. This is where we will define the parameters which will replace the tokens in our files and the details for how to create our module folder.

Plaster Parameters

At present my parameters tag looks like this
<parameters>
<parameter name="FullName" type="text" prompt="Module author's name" />
<parameter name="ModuleName" type="text" prompt="Name of your module" />
<parameter name="ModuleDesc" type="text" prompt="Brief description on this module" />
<parameter name="Version" type="text" prompt="Initial module version" default="0.0.1" />
<parameter name="GitHubUserName" type="text" prompt="GitHub username" default="${PLASTER_PARAM_FullName}"/>
<parameter name="GitHubRepo" type="text" prompt="Github repo name for this module" default="${PLASTER_PARAM_ModuleName}"/>
</parameters>
So we can set up various parameters, with their names and data types defined, a prompt and, if we want, a default value.
We can then use
<%= $PLASTER_PARAM_WHATEVERTHEPARAMETERNAMEIS %>
in our files to make use of the parameters.

Plaster Content

The other part of the manifest file to create is the content. This tells Plaster what to do when it runs.

Mine is split into 3 parts

<message>
Creating folder structure
</message>
<file source='' destination='docs'/>
<file source='' destination='functions'/>
<file source='' destination='internal'/>
<file source='' destination='tests'/>

We can provide messages to the user with the message tag. I create the folders using the file source tag.
<message>
Deploying common files
</message>
<file source='appveyor.yml' destination=''/>
<file source='contributing.md' destination=''/>
<file source='LICENSE.txt' destination=''/>
<templateFile source='install.md' destination=''/>
<templateFile source='readme.md' destination=''/>
<templateFile source='tests\Project.Tests.ps1' destination=''/>
<templateFile source='tests\Help.Tests.ps1' destination=''/>
<templateFile source='tests\Feature.Tests.ps1' destination=''/>
<templateFile source='tests\Regression.Tests.ps1' destination=''/>
<templateFile source='tests\Unit.Tests.ps1' destination=''/>
<templateFile source='tests\Help.Exceptions.ps1' destination=''/>
<templateFile source='docs\ReleaseNotes.txt' destination=''/>
<file source='module.psm1' destination='${PLASTER_PARAM_ModuleName}.psm1'/>

This part creates all of the required files. You can see that the static files (those which do not require any sort of parameterisation of their contents) use the same file source tag as the folders, with the source defined. The files whose content is parameterised use a templateFile source tag, telling Plaster to look inside them for tokens to be replaced.
The last part of the content creates the module manifest.
<message>
Creating Module Manifest
</message>
<newModuleManifest
destination='${PLASTER_PARAM_ModuleName}.psd1'
moduleVersion='$PLASTER_PARAM_Version'
rootModule='${PLASTER_PARAM_ModuleName}.psm1'
author='$PLASTER_PARAM_FullName'
description='$PLASTER_PARAM_ModuleDesc'
encoding='UTF8-NoBOM'/>
which I have filled in with the parameters for each of the values.

Creating a new module at the command line

Now you can easily create a module with all of the required folders and files that you want by creating a directory and running

Invoke-Plaster -TemplatePath TEMPLATEDIRECTORY -DestinationPath DESTINATIONDIRECTORY

which looks like this

It's that easy 🙂

Create a module without prompts

You can also create a module without needing to answer prompts. We can prefill them in our parameter splat
$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate"
    DestinationPath = "GIT:\NewModule"
    FullName        = "Rob Sewell"
    ModuleName      = "NewModule"
    ModuleDesc      = "Here is a module description"
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "NewModule"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster
Which will look like this

Make Your Own

Hopefully this has given you enough information and shown you how easy it is to automate creating the framework for your new PowerShell modules and parameterising them. Let me know how you get on and share your examples.

Further Reading

Kevin Marquette's blog post is an excellent and detailed post on using Plaster which you should also read for reference, as well as David Christian's post, which has some great content on adding user choice to the parameters, enabling one Plaster template to fulfil multiple requirements.

dbatools with SQL on Docker and running SQL queries

I had a question from my good friend Andrew Pruski (dbafromthecold on Twitter), or SQL Container Man as I call him 🙂

How do you guys run SQL Commands in dbatools

I will answer that at the bottom of this post, but during our discussion Andrew said he wanted to show the version of the SQL running in the Docker Container.

That's easy, I said. Here's how to do it.

You need to have installed Docker first – see this page. You can switch to using Windows containers by right-clicking on the icon in the taskbar and choosing the command. If you have not already, then pull the SQL 2017 image using

docker pull microsoft/mssql-server-windows-developer:latest

This may take a while to download and extract the image, but it's worth it: you will be able to spin up new SQL instances in no time.

You can create a new SQL Docker container like this

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env sa_password=SQL2017Password01 --name SQL2017 microsoft/mssql-server-windows-developer:latest

In only a few seconds you have a SQL 2017 instance up and running (take a look at Andrew's blog at dbafromthecold.com for a great container series with much greater detail).

Now that we have our container we need to connect to it. We need to gather the IP address. We can do this using the docker inspect command, but I like to make things a little more programmatic. This works on my Windows 10 machine for Windows SQL containers. It appears there are some errors with other machines, but there is an alternative below.

$inspect = docker inspect SQL2017
<#
IPAddress": matches the characters IPAddress": literally (case sensitive)
\s matches any whitespace character (equal to [\r\n\t\f\v ])
" matches the character " literally (case sensitive)
1st Capturing Group (\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
#>
$IpAddress = [regex]::matches($inspect,"IPAddress`":\s`"(\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})").groups[1].value

Those two lines of code (and several lines of comments) put the results of the docker inspect command into a variable and then use regex to pull out the IP address.

If you are getting errors with that you can also use

$IPAddress = docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' containername

Thanks Andrew 🙂

Now we just need our credentials to connect to the instance

$cred = Get-Credential -UserName SA -Message "Enter SA Password Here"

and we can connect to our SQL container

$srv = Connect-DbaInstance -SqlInstance $IpAddress -Credential $cred

and get the version

$srv.Version

and many many other properties, just run

$srv | Get-Member

to see them. At the bottom, you will see a ScriptMethod called Query, which means that you can do things like

$Query = @"
SELECT @@Version
"@

$srv.Query($Query)

$srv.Query($Query).column1

Which looks like

It’s slightly different with a Linux SQL container. Switch Docker to run Linux containers by right-clicking on the icon in the taskbar and choosing the command to switch.
If you haven't already, pull the Linux SQL image

docker pull microsoft/mssql-server-linux:2017-latest

and then create a container

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env SA_PASSWORD=SQL2017Password01 --name linuxcontainer microsoft/mssql-server-linux:2017-latest

Now we just need to connect with localhost and the port number which we specified, and we can connect again

$LinuxSQL = 'Localhost,15789'
$linuxsrv = Connect-DbaInstance -SqlInstance $LinuxSQL -Credential $cred
$linuxsrv.Version
$linuxsrv.HostDistribution
$linuxsrv.Query($query).column1

Of course, this isn't restricted to just Connect-DbaInstance; you can do this with any of the dbatools commands.

Get-DbaDatabase -SqlInstance $LinuxSQL -SqlCredential $cred

Go and explore your Docker SQL containers with dbatools 🙂

You can get it using

Install-Module dbatools

and find commands with

Find-DbaCommand database

Don’t forget to use Get-Help with the name of the command to get information about how to use it

Get-Help Find-DbaCommand -detailed

Enjoy 🙂

A Pretty PowerBi Pester Results Template File

I have left the heat and humidity of Singapore, where I have been presenting at the PowerShell Conference Asia and DevOpsDays Singapore, to travel to Seattle for PASS Summit. During my Green is Good – Red is Bad session someone asked me if the Power Bi that I showed at the end would work with any Pester test results object and I said (without thinking) that it would.

It turns out that the PowerBi that I had set up for that session will work with my function to run Pester Tests against an Ola Hallengren installation but some of the formatting and custom columns were specific to that test.

I said that I would share a Power Bi file that people could plug any Pester Test Results into. This is the first iteration of that. I doubt that it will work for every single test but I think it will be a good starting point for people to use.

This is how to use it

Download the file from here.

Run your Pester tests using the PassThru parameter and set the results to a variable. You can also use the Show parameter to reduce the output of the tests to the screen (and also speed up the tests).

$PesterResults = Invoke-Pester -Script C:\temp\PBI-Test01.ps1 -Show Summary -PassThru

Then we convert the $PesterResults object into a JSON file

$PesterResults.TestResult | ConvertTo-Json -Depth 5 | Out-File C:\temp\pbi-test.json

  • Open the Power Bi file you downloaded
  • Click Home, then the words "Edit Queries", then Data source settings
  • Highlight the filename and click Change Source
  • Navigate to the JSON file you just created, click OK, then Close and Apply Changes

Which will load the data from the JSON file and display your pester results. You can then save this file with a new name and keep the template for other tests.

It’s not going to be perfect

It’s not going to work in all circumstances and I expect that with some test results it will display the results in a less than optimal manner but you should be able to modify this to suit your needs.

Please give it a try and see how you get on

Here is a sample report created with Demo 1 from my Green is Good session

You can click around and change the data you can see and also look at the other 4 pages

Here is another one that I created using my dbatools-scripts repo and a config file. Again, have a click around and see what it does.

$Config = (Get-Content GIT:\dbatools-scripts\TestConfig.json) -join "`n" | ConvertFrom-Json
$PesterResults = Invoke-Pester .\dbatools-scripts\ -PassThru
$PesterResults.TestResult | ConvertTo-Json | Out-File C:\temp\dbatools-scripts-pester.json

 

I also created a quick video showing the process too which I will upload when I am not at 35000 feet!!

Enjoy 🙂 Also, let me know if you think it would be better to have the file in Github which would allow contributions but it would only be seen as a binary file and therefore merging will be difficult. I am happy to do so.

TSQL2sDay – Get-PostRoundup

First an apology, this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling; unfortunately, I found a major error in Pester (again, not strictly true, but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, Default settings, best practices, migrations, Warnings in Agent Jobs, sqlpackage, VLFs, CMS, Disabling Named Pipes, Orphaned users, AG Status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first timers to the TSQL2sDay blog monthly blog party. Please don’t let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the media of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

https://twitter.com/VolkerBachmann/status/907664676150022144

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

https://twitter.com/Steve_TSQL/status/907713842943115264

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!