Handling Missing Instances when Looping with Pester

In my previous posts about writing your first Pester Test and looping through instances I described how you can start to validate that your SQL Server is how YOU want it to be.

Unavailable machines

Once you begin to have a number of tests for a number of instances, you want to be able to handle any machines that are not available cleanly, otherwise you might end up with something like this.

01 - error.png

In this (made up) example we loop through three instances and try to check that the DNS Server entry is correct, but for one of them we get a massive error. If we had created a large number of tests for each machine we would have a large number of massive errors.

Empty Collection

If we don’t successfully create our collection we might have an empty collection, which gives us a different issue: no tests.

02 - no tests.png

If this was in amongst a whole number of tests we would not have tested anything in this Describe block, and we might think that our tests were OK because we had no failures. We would be wrong!

Dealing with Empty Collections

One way of dealing with empty collections is to test that they have more than 0 members

if ($instances.count -gt 0) {
    $instances.ForEach{
        ## Tests in here
    }
}
else {Write-Warning "Uh-Oh - The Beard is Sad! - The collection is empty. Did you set `$Instances correctly?"}
Notice the backtick ` before the $ to escape it in the Write-Warning. An empty collection now looks like this

03 - uh-oh.png

which is much better and provides useful information to the user.

Dealing with Unavailable Machines

If we want to make sure we don’t clutter up our test results with a whole load of failures when a machine is unavailable, we can use similar logic.

First we could check if it is responding to a ping (assuming that ICMP is allowed by the firewall and switches) using

Test-Connection -ComputerName $computer -Count 1 -Quiet -ErrorAction SilentlyContinue

This will try just one ping, do it quietly, return only True or False, and not mention any errors.

In the example above I am using PSRemoting, so we should make sure that is working too. Whilst I could use

Test-WSMan -ComputerName $computer

this only checks whether a WSMAN connection is possible and not the other factors that could affect the ability to run remote sessions. Having been caught out by this before, I have always used this function from Lee Holmes (thank you Lee) and so I can use

$instances.ForEach{
    $computer = $_.Split('\')[0] # To get the computer name if there is an instance name
    # Check if machine responds to ping
    if (!(Test-Connection -ComputerName $computer -Count 1 -Quiet -ErrorAction SilentlyContinue))
    {Write-Warning "Uh-Oh - $Computer is not responding to a ping - aborting the tests for this machine"; Return}
    # Check if PSRemoting is possible for this machine
    # Requires Test-PSRemoting by Lee Holmes http://www.leeholmes.com/blog/2009/11/20/testing-for-powershell-remoting-test-psremoting/
    if (!(Test-PsRemoting $computer))
    {Write-Warning "Uh-Oh - $Computer is not able to use PSRemoting - aborting the tests for this machine"; Return}
    Describe "Testing Instance $($_)" {
        ## Put tests in here
    }
}
which provides a result like this

04 - better handling.png

Which is much better I think 🙂

Let dbatools do the error handling for you

If your tests only use the dbatools module then there is built-in error handling that you can use. By default dbatools returns useful messages rather than the exceptions from PowerShell (you can enable the exceptions using the -EnableExceptions parameter if you want or need to), so if we run our example from the previous post it will look like

05 - dbatools handling.png

which is fine for a single command, but if we are running multiple commands against each instance we don’t really want to waste time and resources repeatedly trying to connect to an instance that we know is not available.

dbatools at the beginning of the loop

We can use Test-DbaConnection to perform a check at the beginning of the loop as we discussed in the previous post

$instances.ForEach{
    if (!((Test-DbaConnection -SqlInstance $_ -WarningAction SilentlyContinue).ConnectSuccess))
    {Write-Warning "Uh-Oh - we cannot connect to $_ - aborting the tests for this instance"; Return}
Notice that we have used -WarningAction SilentlyContinue to hide the warnings from the command this time. Our test now looks like
06 - dbatools test-dbaconnection.png
Test-DbaConnection performs a number of checks – ping, SQL version, domain name and remoting among them – so you can exclude tests on that basis.
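
If you want to see those individual checks for yourself, here is a minimal sketch – the property names below are the ones used later in this post, so run the command and inspect the output (or Get-Help) for the full list:

$TestConnection = Test-DbaConnection -SqlInstance ROB-XPS -WarningAction SilentlyContinue
$TestConnection | Select-Object SqlInstance, IsPingable, PsRemotingAccessible, ConnectSuccess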

Round Up

In this post we have covered some methods of ensuring that your Pester Tests return what you expect. You don’t want empty collections of SQL Instances making you think you have no failed tests when you have not actually run any tests.

You can do this by checking how many instances are in the collection

You also don’t want to keep running tests against a machine or instance that is not responding or available.

You can do this by checking a ping with Test-Connection or, if remoting is required, by using the Test-PSRemoting function from Lee Holmes.

If you want to use dbatools exclusively you can use Test-DbaConnection

Here is a framework to put your tests inside. You will need to provide the values for the $Instances and place your tests inside the Describe Block

if ($instances.count -gt 0) {
    $instances.ForEach{
        $TestConnection = Test-DbaConnection -SqlInstance $_ -WarningAction SilentlyContinue
        # Check if machine responds to ping
        if (!($TestConnection.IsPingable))
        {Write-Warning "Uh-Oh - The Beard is Sad! - $_ is not responding to a ping - aborting the tests for this instance"; Return}
        # Check if we have remote access to the machine
        if (!($TestConnection.PsRemotingAccessible))
        {Write-Warning "Uh-Oh - The Beard is Sad! - $_ is not able to use PSRemoting - aborting the tests for this instance"; Return}
        # Check if we have SQL connection to the Instance
        if (!($TestConnection.ConnectSuccess))
        {Write-Warning "Uh-Oh - The Beard is Sad! - we cannot connect to SQL on $_ - aborting the tests for this instance"; Return}
        Describe "Testing Instance $($_)" {
            ## Now put your tests in here - separate them with Context blocks if you want to
            Context "Networks" { }
        }
    }
}
else {
    ## If the collection is empty
    Write-Warning "Uh-Oh - The Beard is Sad! - The collection is empty. Did you set `$Instances correctly?"
}

2 Ways to Loop through collections in Pester

In my last post I showed you how to write your first Pester test to validate something. Here’s a recap

  • Decide the information you wish to test
  • Understand how to get it with PowerShell
  • Understand what makes it pass and what makes it fail
  • Write a Pester Test

You probably have more than one instance that you want to test, so how do you loop through a collection of instances? There are a couple of ways.

Getting the Latest Version of the Module

The magnificent Steve Jones wrote about getting the latest version of Pester and the correct way to do it. You can find the important information here

Test Cases

The first way is to use the Test Case parameter of the It command (the test) which I have written about when using TDD for Pester here

Let’s write a test first to check if we can successfully connect to a SQL Instance. Running

Find-DbaCommand connection

shows us that the Test-DbaConnection command is the one that we want from the dbatools module. We should always run Get-Help to understand how to use any PowerShell command. This shows us that the results will look like this

01 - gethelp test-dbaconnection
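
If you want to run that check yourself, it is just a Get-Help call (assuming dbatools is already installed):

Get-Help Test-DbaConnection -detailed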

So there is a ConnectSuccess result which returns True or False. Our test can look like this for a single instance

Describe 'Testing connection to ROB-XPS' {
    It "Connects successfully to ROB-XPS" {
        (Test-DbaConnection -SqlInstance ROB-XPS).ConnectSuccess | Should Be $True
    }
}

which gives us some test results that look like this

successful test.png
which is fine for one instance but we want to check many.

We need to gather the instances into a $Instances variable. In my examples I have hard-coded a list of SQL Instances but you can, and probably should, use a more dynamic method, maybe the results of a query to a configuration database. Then we can fill our TestCases variable, which can be done like this
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'
# Create an empty array
$TestCases = @()
# Fill the Testcases with the values and a Name of Instance
$Instances.ForEach{$TestCases += @{Instance = $_}}
Then we can write our test like this
# Get a list of SQL Servers
# Use whichever method suits your situation
# Maybe from a configuration database
# I'm just using a hard-coded list for example
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'

# Create an empty array
$TestCases = @()

# Fill the Testcases with the values and a Name of Instance
$Instances.ForEach{$TestCases += @{Instance = $_}}
Describe 'Testing connection to SQL Instances' {
    # Put the TestCases 'Name' in <> and add the TestCases parameter
    It "Connects successfully to <Instance>" -TestCases $TestCases {
        # Add a Parameter to the test with the same name as the TestCases Name
        Param($Instance)
        # Write the test using the TestCases Name
        (Test-DbaConnection -SqlInstance $Instance).ConnectSuccess | Should Be $True
    }
}
Within the title of the test we refer to the instance inside <> and add the parameter TestCases with a value of the $TestCases variable. We also need to add a Param() to the test with the same name and then use that variable in the test.
This looks like this
Testcases test.png

Pester is PowerShell

The problem with Test Cases is that we can only easily loop through one collection, but as Pester is just PowerShell we can simply use ForEach if we want to loop through multiple ones, like instances and then databases.

I like to use the ForEach method as it is slightly quicker than other methods. It only works with PowerShell version 4 and above; below that version you need to pipe the collection to ForEach-Object.
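
For completeness, a minimal sketch of that pipeline alternative – plain PowerShell, nothing Pester-specific:

$Instances | ForEach-Object {
    # $_ is the current instance, just as it is with the ForEach method
    Write-Output "Running tests against $_"
}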

Let’s write a test to see if our databases have Trustworthy set on. We can do this using the Trustworthy property returned by Get-DbaDatabase.

We loop through our Instances using the ForEach method and create a Context for each Instance to make the test results easier to read. We then place the call to Get-DbaDatabase inside parentheses, loop through the databases it returns and check the Trustworthy property.

# Get a list of SQL Servers
# Use whichever method suits your situation
# Maybe from a configuration database
# I'm just using a hard-coded list for example
$Instances = 'ROB-XPS','ROB-XPS\DAVE','ROB-XPS\BOLTON','ROB-XPS\SQL2016'
Describe 'Testing user databases' {
    # Loop through the instances
    $Instances.ForEach{
        # Create a Context for each Instance.
        Context "Testing User Databases on $($_)" {
            # Loop through the User databases on the instance
            (Get-DbaDatabase -SqlInstance $_ -ExcludeAllSystemDb).ForEach{
                # Refer to the database name and Instance name inside a $()
                It "Database $($_.Name) on Instance $($_.Parent.Name) should not have TRUSTWORTHY ON" {
                    $_.Trustworthy | Should Be $false
                }
            }
        }
    }
}
and it looks like this

testdatabasetrustworthy.png

So there you have two different ways to loop through collections in your Pester tests. Hopefully this can help you to write some good tests to validate your environment.
Happy Pestering

Spend a Whole Day With Chrissy & I at SQLBits

If you would like to spend a whole day with Chrissy LeMaire and me at SQLBits in London in February, we have a pre-con on the Thursday.
You can find out more about the pre-con at sqlps.io/bitsprecon
and you can register at sqlps.io/bitsreg

Write Your first Pester Test Today

I was in Glasgow this Friday enjoying the fantastic hospitality of the Glasgow SQL User Group @SQLGlasgow and presenting sessions with Andre Kamman, William Durkin and Chrissy LeMaire

I presented “Green is Good Red is Bad – Turning your checklists into Pester Tests”. I had to make sure I had enough energy beforehand so I treated myself to a fabulous burger.

20171110_114933-compressor.jpg

Afterwards I was talking to some of the attendees and realised that maybe I could show how easy it is to start writing your first Pester test. Here are the steps to follow:

  • Decide the information you wish to test
  • Understand how to get it with PowerShell
  • Understand what makes it pass and what makes it fail
  • Write a Pester Test

The first bit is up to you. I cannot decide what you need to test for on your servers in your environments. Whatever is the most important. For now pick one thing.

Logins – let’s pick logins as an example for this post. It is good practice to disable the sa account – that is advice you will read all over the internet and it is often written into estate documentation – so let’s write a test for that.

Now we need the PowerShell command to return the information to test for. We need a command that will get information about logins on a SQL server and if it can return disabled logins then all the better.

As always when starting to use PowerShell with SQL Server, I would start with dbatools. If we run Find-DbaCommand we can search for commands in the module that support logins. (If you have chosen something that is not SQL Server related then you can use Get-Command or the internet to find the command you need.)

find-dbacommand.png
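
For reference, the search behind that screenshot is simply this (the exact results will depend on your dbatools version):

Find-DbaCommand login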

Get-DbaLogin – that looks like the one that we want. Now we need to understand how to use it. Always, always use Get-Help to do this. If we run

Get-Help Get-DbaLogin -detailed

we get all of the information about the command and the examples. Example 8 looks like it will help us

get-dbalogin example

So now try running the command for our disabled sa account

Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled

disabled sa account

So we know that if we have a disabled sa account we get a result. Let’s enable the sa account and run the command again

not disabled.png

We don’t get a result. Excellent – now we know that for a successful test we get one result and for a failed test we get zero results. We can check that by running

login count
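
The check behind that screenshot is just the count of results. Something like this, assuming one instance with sa disabled and one with it enabled (the instance names are from my lab):

(Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count
(Get-DbaLogin -SqlInstance rob-xps\dave -Login sa -Disabled).Count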

The first one has the account disabled and the second one does not. So now we can write our Pester Test. We can start with a Describe block with a useful title. I am going to add a Context block so that you can see how you can group your tests.

describe context

and then we will write our test. Pester Tests use the It keyword. You should always give a useful title to your test

it should

Now we can write our test. We know that the command returns one result when we want it to pass so we can write a test like this

login test.png

The code I have added is

(Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count | Should Be 1
which is
  • the code for getting the information about the thing we wanted to test (The count of the disabled sa logins on the instance)
  • a pipe symbol |
  • The Should key word
  • The Be keyword
  • and the result we want to pass the test (1)
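
Putting those pieces together, the whole thing from the screenshots looks roughly like this (the titles and instance name are just my examples – use your own):

Describe 'Checking Logins on rob-xps' {
    Context 'sa account' {
        It 'Should have a disabled sa account' {
            (Get-DbaLogin -SqlInstance rob-xps -Login sa -Disabled).Count | Should Be 1
        }
    }
}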

Ta Da! One Pester test written. You can run the test just by highlighting the code and running it in VS Code (or PowerShell ISE) and it will look like this for a passing test

passing test

It is better to save it for later use and then call it with Invoke-Pester

invoke
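
If you are following along, that just means saving the Describe block to a file and pointing Invoke-Pester at it – something like this (the path and file name are only examples):

Invoke-Pester -Script C:\PesterTests\Logins.Tests.ps1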

So now you can write your first Pester test. Just find the PowerShell to get the information that you need, understand what the results will be for passing and failing tests and write your test 🙂

Getting the Latest Version of the Module

The magnificent Steve Jones wrote about getting the latest version of Pester and the correct way to do it. You can find the important information here

Spend a Whole Day With Chrissy & I at SQLBits

If you would like to spend a whole day with Chrissy LeMaire and me at SQLBits in London in February, we have a pre-con on the Thursday.
You can find out more about the pre-con at sqlps.io/bitsprecon
and you can register at sqlps.io/bitsreg

TSQL2sDay – Folks Who Have Made a Difference

tsql2sday

This month’s TSQL2sDay is an absolutely brilliant one hosted by Ewald Cress:

the opportunity to give a shout-out to people (well-known or otherwise) who have made a meaningful contribution to your life in the world of data.

Fabulous, fabulous idea Ewald, I heartily approve

Now this is going to be difficult. There are so many wonderful people in the #SQLFamily who are so gracious and generous and willing to share. I am also lucky enough to be part of the PowerShell community which is also equally filled with amazing people. I do not want to write a novel or a massive list of people, I don’t want to risk missing someone out (Ewald, I’m beginning to question whether ‘fabulous’ should become ‘tricky and challenging’ !!)

So after consideration I am only going to talk about 4 wonderful people and the effect they have had on my life, my career and my community involvement, but know that I truly appreciate the input that all of the people have had and the amazing friendships that I have all over the world. There is no order to this list; these are 4 of the people in equal first with all the other people I haven’t mentioned. This post should really scroll sideways. Interestingly, I noticed after writing this that they are in reverse chronological order in my life!

The Hair!

At PASS Summit this year many people came up to me and said “Hey, Beard ……..” The first person who called me that is an amazing inspiring bundle of talented energy called Chrissy LeMaire

Many moons ago, we exchanged messages over social media and email, chatted after a PowerShell Virtual Group presentation and then one day she asked me to join as an organiser for the Virtual Group.

When dbatools was in its infancy she asked me to help and since then has given me interesting challenges to overcome, from introducing Pester and AppVeyor to the dbatools development process to creating continuous delivery to our private PowerShell gallery for our Summit pre-con, forcing me to learn and implement new and cool things. Our shared love of enabling people to do cool things with PowerShell is so much fun 🙂

She is so generous and giving of her time and knowledge and has an amazing capability to get things done, whether by herself or by encouraging and supporting others.

We have presented at many conferences together, both SQL and PowerShell, and we have the best of times doing so. It is so refreshing to find someone that I am comfortable presenting with and who has the same passion and energy for inspiring people. (It’s also fun to occasionally throw her off her stride mid-presentation – thank you Cathrine 🙂 )

I am proud to call her my buddy. You are so inspiring Chrissy.

Thank you Ma’am

Amazing Couple

A few months after becoming a DBA I was the only DBA at the company as the others all left for various reasons. I was drowning in work, had no idea what I should be doing. I knew I didn’t have the knowledge and during that time I began to be aware of the SQL community and all the fine resources that it provides.

I then found out about a local user group and emailed the leader Jonathan Allen (he surprised me by reminding me of this during our pre-con in Singapore a couple of weeks ago!). Jonathan and his wife Annette run the SQL South West user group and are also members of the SQL Bits committee; Annette is also the regional mentor for the UK. They give an awful lot of time and effort to the SQL Community in the UK. It took a few months before I even had the time to attend a user group, and in those early days they both answered my naive questions and passed on so much of their technical knowledge and methodology to me and I soaked it up.

Later on, they invited me to help them to organise SQL Saturday Exeter, encouraged me to speak, gave me fabulous feedback and pointers to improve, encouraged me to volunteer for SQL Bits and have been incredibly supportive. I love them both very much. Neither like having their photo taken so I can’t embarrass them too much.

Next time you see them give them a hug.

Thank you J and A

The First One

Andrew Pruski dbafromthecold and SQL Containers Man

At the time I am talking about he was not a member of the SQL Community although he possessed all of the qualities that describe such a person. Now he is an established blogger and speaker and attender of SQL events.

He is one of the DBAs who left me on my own!! He is the first SQL DBA I ever worked with, the person who taught me all those important first bits of knowledge about being a SQL DBA. He imparted a great amount of knowledge in a few months with great patience to an eager newbie.

More than that, he showed me that to succeed in IT, you need to do more than just an everyday 9-5, that it requires more time than that. He instilled in me (without realising it) a work ethic and a thirst for doing things right and gaining knowledge that I still have today. He inspired me when I was faced with trying to understand the mountain of knowledge that is SQL Server that it was possible to learn enough. He taught me the importance of testing things, of understanding the impact of the change that is being made. He showed me how to respond in crises and yet was still willing to share and teach during those times.

He has had a greater impact on me than he will ever know and I have told him this privately many times. I will never forgive him for abandoning me all those years ago and yet that is a large part of what made me who I am today. I was forced to have to deal with looking after a large estate by myself and needed to learn to automate fast and he just about left me with the skills to be able to accomplish that.

Massive shout out to you fella. Thank you

All the Others

Seriously, there are so many other people who I wish I could thank.

Every single one of you who blogs or speaks or records webinars that I have watched – thank you.

All of the organisers who ensure that events happen – thank you

All of the volunteers who assist at those events – thank you.

That group of amazing European speakers at the first SQL Saturday Exeter I attended. The cool group, my wife still reminds me of how I came home from that event so inspired by them. How incredibly generous and welcoming they were and how they welcomed me into their group even though I didn’t feel worthy to share their table. They taught me about the lack of egos and humbleness that defines the SQL family. I am proud to call them my friends now. Thank You (You know who you are)

We have a great community, may its ethos continue for a long time.

Comparing Agent Jobs across Availability Group Replicas with PowerShell

On the plane home from PASS Summit I was sat next to someone who had also attended, and when he saw on my laptop that I was part of the SQL Community we struck up a conversation. He asked me how he could compare SQL Agent Jobs across availability group replicas to ensure that they were the same.

He already knew that he could use Copy-DbaAgentJob from dbatools to copy the jobs between replicas and we discussed how to set up an Agent job to accomplish this. The best way to run an Agent Job with a PowerShell script is described here
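
For reference, the copy itself is a one-liner (the instance names here are just my lab examples):

Copy-DbaAgentJob -Source rob-xps -Destination rob-xps\bolton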

Compare-Object

I told him about Compare-Object, a command available in PowerShell for precisely this task. Take these two SQL instances and their respective Agent Jobs

agentjobcompare.png

So we can see that some jobs are the same and some are different. How can we quickly and easily spot the differences?

$Default = Get-DbaAgentJob -SqlInstance rob-xps
$bolton = Get-DbaAgentJob -SqlInstance rob-xps\bolton
Compare-Object $Default $bolton
Those three lines of code will do it. The first two get the agent jobs from each instance and assign them to variables, and the last one compares them. This is the output

comparison.png
The arrows show that the first three jobs are only on the Bolton instance and the bottom three jobs are only on the default instance.

What If ?

Another option I showed was to use the -WhatIf switch on Copy-DbaAgentJob. This parameter is available on all good PowerShell functions and will describe what the command would do if run. WARNING – if you are using the old SQLPS module from prior to the SSMS 2016 release, -WhatIf will actually run the commands, so update your modules.
We can run
Copy-DbaAgentJob -Source rob-xps -Destination rob-xps\bolton -WhatIf

and get the following result

which shows us that there are two jobs on Rob-XPS which would be created on the Bolton instance

And if they have been modified?

That’s good, he said, but what about if the jobs have been modified?

Well, one thing you could do is to compare the jobs’ DateLastModified property by using the -Property parameter and the -PassThru switch
$Default = Get-DbaAgentJob -SqlInstance rob-xps
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
 
$Difference = Compare-Object $Default $dave -Property DateLastModified -PassThru
$Difference | Sort-Object Name | Select-Object OriginatingServer,Name,DateLastModified
This is going to return the jobs which are the same but were modified at a different time
sortedjobcompare.png
so that you can examine when they were changed. Of course, the problem with that is that DateLastModified is a very precise time, so it is pretty much always going to be different. We can fix that, but now it is a little more complex.

Just the Date please

We need to gather the jobs in the same way but create an array of custom objects with a calculated property like this
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
## Create a custom object array with the date instead of the datetime
$DaveJobs = @()
$Dave.ForEach{
    $DaveJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
and then we can compare on the Date field. The full code is
## Get the Agent Jobs
$Default = Get-DbaAgentJob -SqlInstance rob-xps
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
## Create a custom object array with the date instead of the datetime
$DaveJobs = @()
$Dave.ForEach{
    $DaveJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
## Create a custom object array with the date instead of the datetime
$DefaultJobs = @()
$Default.ForEach{
    $DefaultJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
## Perform a comparison
$Difference = Compare-Object $DefaultJobs $DaveJobs -Property date -PassThru
## Sort by name and display
$Difference | Sort-Object Name | Select-Object Server, Name, Date

This will look like this

datecompare.png
Which is much better and hopefully more useful, but it only works with two instances.

I have more than 2 instances

So if we have more than two instances it gets a little more complicated, as Compare-Object only supports two arrays. I threw together a quick function to compare each instance with the main instance. This is very rough and will work for now, but I have also created a feature request issue on the dbatools repository so someone (maybe you??) could go and help create those commands.

Function Compare-AgentJobs {
    Param(
        $SQLInstances
    )
    ## Remove any Jobs* variables from the session
    Get-Variable Jobs* | Remove-Variable
    ## Get the number of instances
    $count = $SQLInstances.Count
    ## Loop through instances
    $SQLInstances.ForEach{
        # Get the jobs and assign to a new dynamic variable
        $Number = [array]::IndexOf($SQLInstances, $_)
        $Job = Get-DbaAgentJob -SqlInstance $_
        New-Variable -Name "Jobs$Number" -Value $Job
    }
    $i = $count - 1
    $Primary = $SQLInstances[0]
    While ($i -gt 0) {
        ## Compare the jobs with the Primary
        $Compare = $SQLInstances[$i]
        Write-Output "Comparing $Primary with $Compare"
        Compare-Object (Get-Variable Jobs0).Value (Get-Variable "Jobs$i").Value
        $i--
    }
}
which looks like this. It’s not perfect but it will do for now until the proper commands are created

compare agent jobs.png

Using Plaster To Create a New PowerShell Module

Chrissy, CK and I presented a pre-con at PASS Summit in Seattle last week

20171031_083328.jpg

Tracey Boggiano T | B came along to our pre-con and afterwards we were talking about creating PowerShell modules. In her blog post she explains how she creates modules by copying the code from another module (dbatools in this case!) and altering it to fit her needs. This is an absolutely perfect way to do things; in our pre-con we mentioned that there is no use in re-inventing the wheel – if someone else has already written the code then make use of it.

I suggested, however, that she use the PowerShell module Plaster to do this. We didn’t have enough time to really talk about Plaster, so Tracey, this is for you (and I am looking forward to your blog about using it too 😉 )

What is Plaster?

Plaster is a template-based file and project generator written in PowerShell. Its purpose is to streamline the creation of PowerShell module projects, Pester tests, DSC configurations, and more. File generation is performed using crafted templates which allow the user to fill in details and choose from options to get their desired output.

How Do I Get Plaster?

The best way to get Plaster is also the best way to get any PowerShell module, from the PowerShell Gallery

You can just run

Install-Module Plaster

If you get a prompt about the repository not being trusted, don’t worry you can say yes.

Following PowerShell’s Security Guiding Principles, Microsoft doesn’t trust its own repository by default. The advice as always is never trust anything from the internet even if a bearded fellow from the UK recommends it!!

The PowerShell Gallery is a centralised repository where anyone can upload code to share, and whilst all uploads are analyzed for viruses and malicious code by Microsoft, user discretion is always advised. If you do not want to be prompted every time you install a module then you can run

Set-PSRepository -Name PSGallery -InstallationPolicy Trusted

if you and/or your organisation think that that is the correct way forward.

What Can We Do With Plaster?

Now that we have installed the module we can get to the nitty gritty. You can (and should) use Plaster to automate the creation of your module structure. If you are going to do something more than once then automate it!

I created a repository for my Plaster Template. You are welcome to take it and modify it for your own needs. I created a folder structure and some default files that I always want to have in my module folder

module framework.png
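
For reference, the layout in that screenshot is roughly this – a simplified sketch of my template repository, using the folders and files referenced in the manifest later in this post:

FullModuleTemplate\
    docs\
    functions\
    internal\
    tests\
    appveyor.yml
    contributing.md
    install.md
    LICENSE.txt
    module.psm1
    PlasterManifest.xml
    readme.md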

So in my template I have created all of the folders to organise the files in the way that I want for my modules. I have also included the license file and some markdown documents for the readme, contributing and installation. If we look in the tests folder

tests folder.png

There are some default test files included as well.

But Plaster is more than just a file and folder template repository. If we look in the installation markdown file, it looks like this

# Installing <%= $PLASTER_PARAM_ModuleName %>
# You can install <%= $PLASTER_PARAM_ModuleName %> from the Powershell Gallery using
Find-Module <%= $PLASTER_PARAM_ModuleName %> | Install-Module
Import-Module <%= $PLASTER_PARAM_ModuleName %>
We can parameterise the content of our files. This will create a very simple markdown file showing how to find and install the module from the PowerShell Gallery, which saves us from having to type the same thing again and again. Let’s see how to do that.

The Manifest XML file

The magic happens in the manifest file. You can create one with the New-PlasterManifest command in the template directory – thank you to Mustafa for letting me know that the manifest file creation now requires an extra parameter of TemplateType.

$manifestProperties = @{
    Path            = "PlasterManifest.xml"
    Title           = "Full Module Template"
    TemplateName    = 'FullModuleTemplate'
    TemplateVersion = '0.0.1'
    TemplateType    = 'Item'
    Author          = 'Rob Sewell'
}
New-Item -Path FullModuleTemplate -ItemType Directory
New-PlasterManifest @manifestProperties
This will create a PlasterManifest.xml file that looks like this
<?xml version="1.0" encoding="utf-8"?>
<plasterManifest schemaVersion="1.1" templateType="Project" xmlns="http://www.microsoft.com/schemas/PowerShell/Plaster/v1">
  <metadata>
    <name>FullModuleTemplate</name>
    <id>220fba73-bf86-49e3-9ec5-c4bc2719d196</id>
    <version>0.0.1</version>
    <title>FullModuleTemplate</title>
    <description>My Plaster Template for PowerShell Modules</description>
    <author>Rob Sewell</author>
    <tags></tags>
  </metadata>
  <parameters></parameters>
  <content></content>
</plasterManifest>
You can see that the parameters and content tags are empty. This is where we will define the parameters which will replace the tokens in our files and the details for how to create our module folder.

Plaster Parameters

At present my parameters tag looks like this
<parameters>
  <parameter name="FullName" type="text" prompt="Module author's name" />
  <parameter name="ModuleName" type="text" prompt="Name of your module" />
  <parameter name="ModuleDesc" type="text" prompt="Brief description on this module" />
  <parameter name="Version" type="text" prompt="Initial module version" default="0.0.1" />
  <parameter name="GitHubUserName" type="text" prompt="GitHub username" default="${PLASTER_PARAM_FullName}"/>
  <parameter name="GitHubRepo" type="text" prompt="Github repo name for this module" default="${PLASTER_PARAM_ModuleName}"/>
</parameters>
So we can set up various parameters, each with a name and data type, a prompt and, if we want one, a default value.
We can then use
<%= $PLASTER_PARAM_WHATEVERTHEPARAMETERNAMEIS %>
in our files to make use of the parameters.

Plaster Content

The other part of the manifest file to create is the content. This tells Plaster what to do when it runs.

Mine is split into 3 parts

<message>
Creating folder structure
</message>
<file source='' destination='docs'/>
<file source='' destination='functions'/>
<file source='' destination='internal'/>
<file source='' destination='tests'/>
We can provide messages to the user with the message tag. I create the folders using the file tag with an empty source and the folder name as the destination
<message>
Deploying common files
</message>
<file source='appveyor.yml' destination=''/>
<file source='contributing.md' destination=''/>
<file source='LICENSE.txt' destination=''/>
<templateFile source='install.md' destination=''/>
<templateFile source='readme.md' destination=''/>
<templateFile source='tests\Project.Tests.ps1' destination=''/>
<templateFile source='tests\Help.Tests.ps1' destination=''/>
<templateFile source='tests\Feature.Tests.ps1' destination=''/>
<templateFile source='tests\Regression.Tests.ps1' destination=''/>
<templateFile source='tests\Unit.Tests.ps1' destination=''/>
<templateFile source='tests\Help.Exceptions.ps1' destination=''/>
<templateFile source='docs\ReleaseNotes.txt' destination=''/>
<file source='module.psm1' destination='${PLASTER_PARAM_ModuleName}.psm1'/>
This part creates all of the required files. You can see that the static files (those which do not require any parameterisation of their contents) use the same file tag as the folders, with the source defined. The files whose content is parameterised use a templateFile tag instead, telling Plaster to look inside them for the tokens to be replaced.
The last part of the content creates the module manifest.
<message>
Creating Module Manifest
</message>
<newModuleManifest
destination='${PLASTER_PARAM_ModuleName}.psd1'
moduleVersion='$PLASTER_PARAM_Version'
rootModule='${PLASTER_PARAM_ModuleName}.psm1'
author='$PLASTER_PARAM_FullName'
description='$PLASTER_PARAM_ModuleDesc'
encoding='UTF8-NoBOM'/>
which I have filled in with the parameters for each of the values.

Creating a new module at the command line

Now you can easily create a module with all of the required folders and files that you want by creating a directory and running

Invoke-Plaster -TemplatePath TEMPLATEDIRECTORY -DestinationPath DESTINATIONDIRECTORY

which looks like this

It’s that easy 🙂

Create a module without prompts

You can also create a module without needing to answer prompts. We can prefill them in our parameter splat
$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate"
    DestinationPath = "GIT:\NewModule"
    FullName        = "Rob Sewell"
    ModuleName      = "NewModule"
    ModuleDesc      = "Here is a module description"
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "NewModule"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster
Which will look like this

Make Your Own

Hopefully this has given you enough information and shown you how easy it is to automate creating the framework for your new PowerShell modules and to parameterise them. Let me know how you get on and share your examples.

Further Reading

Kevin Marquette’s blog post is an excellent and detailed post on using Plaster which you should also read for reference, as is David Christian’s post, which has some great content on adding user choice to the parameters, enabling one Plaster template to fulfil multiple requirements.

dbatools with SQL on Docker and running SQL queries

I had a question from my good friend Andrew Pruski dbafromthecold on Twitter – or SQL Container Man, as I call him 🙂

How do you guys run SQL Commands in dbatools

I will answer that at the bottom of this post, but during our discussion Andrew said he wanted to show the version of the SQL running in the Docker Container.

That’s easy, I said. Here’s how to do it.

You need to have installed Docker first – see this page. You can switch to using Windows containers by right-clicking on the icon in the taskbar and choosing the command. If you have not already, pull the SQL 2017 image using

docker pull microsoft/mssql-server-windows-developer:latest

This may take a while to download and extract the image but it’s worth it – you will be able to spin up new SQL instances in no time.

You can create a new SQL Docker container like this

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env sa_password=SQL2017Password01 --name SQL2017 microsoft/mssql-server-windows-developer:latest

In only a few seconds you have a SQL 2017 instance up and running (take a look at Andrew’s blog at dbafromthecold.com for a great container series with much greater detail).

Now that we have our container we need to connect to it, which means we need to gather the IP address. We can do this using the docker inspect command, but I like to make things a little more programmatic. This works on my Windows 10 machine for Windows SQL containers; it appears to error on some other machines, but there is an alternative below.

$inspect = docker inspect SQL2017
<#
IPAddress": matches the characters IPAddress": literally (case sensitive)
\s matches any whitespace character (equal to [\r\n\t\f\v ])
" matches the character " literally (case sensitive)
1st Capturing Group (\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
#>
$IpAddress = [regex]::matches($inspect,"IPAddress`":\s`"(\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})").groups[1].value

Those two lines of code (and several lines of comments) put the results of the docker inspect command into a variable and then use regex to pull out the IP address.

If you are getting errors with that you can also use

$IPAddress = docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' containername

Thanks Andrew 🙂

Now we just need our credentials to connect to the instance

$cred = Get-Credential -UserName SA -Message "Enter SA Password Here"

and we can connect to our SQL container

$srv = Connect-DbaInstance -SqlInstance $IpAddress -Credential $cred

and get the version

$srv.Version

and many many other properties, just run

$srv | Get-Member

to see them. At the bottom, you will see a ScriptMethod called Query, which means that you can do things like

$Query = @"
SELECT @@Version
"@

$srv.Query($Query)

$srv.Query($Query).column1

Which looks like

It’s slightly different with a Linux SQL container. Switch Docker to run Linux containers by right-clicking on the icon in the taskbar and choosing the command to switch.
If you haven’t already pull the Linux SQL image
docker pull microsoft/mssql-server-linux:2017-latest

and then create a container

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env SA_PASSWORD=SQL2017Password01 --name linuxcontainer microsoft/mssql-server-linux:2017-latest

Now we just need to connect with localhost and the port number which we have specified already and we can connect again

$LinuxSQL = 'Localhost,15789'
$linuxsrv = Connect-DbaInstance -SqlInstance $LinuxSQL -Credential $cred
$linuxsrv.Version
$linuxsrv.HostDistribution
$linuxsrv.Query($query).column1

Of course, this isn’t restricted to just Connect-DbaInstance; you can do this with any dbatools command

Get-DbaDatabase -SqlInstance $LinuxSQL -SqlCredential $cred

Go and explore your Docker SQL containers with dbatools 🙂

You can get dbatools using

Install-Module dbatools

and find commands with

Find-DbaCommand database

Don’t forget to use Get-Help with the name of the command to get information about how to use it

Get-Help Find-DbaCommand -detailed

Enjoy 🙂