Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module, a module to consume the SQL Server Diagnostics API.

I have been asked a few times what the process is for creating a module, using GitHub and developing with Pester. Whilst this is not a comprehensive how-to, I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.

Plaster Template

When you create a module there are a number of files that you need, and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub repository. Whilst you could write a PowerShell script to create all of those, there is no need because there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default 'scaffold' for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette's post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but, as with anything, if you find yourself repeating steps then it is time to automate it.

With my Plaster Template created I could simply run

$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
    DestinationPath = "GIT:\SQLDiagAPI"
    FullName        = "Rob Sewell"
    ModuleName      = "SQLDiagAPI"
    ModuleDesc      = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details"
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "SQLDiagAPI"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
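
Updating the inbox copy is worth doing; on Windows 10 the Pester that ships with PowerShell 5 is signed with a different certificate to the PowerShell Gallery version, so the update needs a couple of extra switches. A minimal sketch:

# Update the inbox Pester that ships with PowerShell 5 (run as administrator)
Install-Module -Name Pester -Force -SkipPublisherCheck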

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it and, as I am writing code that others will use, consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.
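
To show how the stored key might be read back later, here is a minimal sketch using the same path as above; GetNetworkCredential() is the standard way to get the plain-text password out of a PSCredential:

# Import the credential file and extract the API Key
$APIKeyCred = Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred"
$APIKey = $APIKeyCred.GetNetworkCredential().Password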

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more

The first function I wanted to write was to get the recommendations from the API. I decided to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists. Otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: checking for the stored API Key file using Test-Path and writing a warning if the file did not exist.

Mocking

However, this is not going to work if I have already saved the key to the file. The test needs to not be reliant on anything external. I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
Mock Test-Path {$false}
Mock Write-Warning {"Warning"}

This is what happens when you run this test: when there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case $false. For Write-Warning it will return a string of "Warning".

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used"{
Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return $false, which will invoke Write-Warning in my code, and in the test that will return "Warning". So if I have written my code correctly the test will pass without actually running the real Test-Path and interacting with my system, or running the real Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests.

It 'Checks the Mock was called for Test-Path' {
    $assertMockParams = @{
        'CommandName' = 'Test-Path'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to $false it would test that the mock was called at least the number of times specified. This is another check that I really have called the Mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.
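
For completeness, a sketch of that test, mirroring the Test-Path version above:

It 'Checks the Mock was called for Write-Warning' {
    $assertMockParams = @{
        'CommandName' = 'Write-Warning'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}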

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed; I don't have a function named Get-SQLDiagRecommendations.

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests, including checking that all of the scripts pass the Script Analyzer rules and that all of the functions have the correct help (thank you June Blender for that test).

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails, as this will show the Describe and Context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen.

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don't want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to only run a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build , Unit{
Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with the Show Fails to reduce the output my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests are running in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher.

And now our tests pass

06 - Passing Tests

If you look at the API documentation the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty and, as without it we can't call the API, I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
Mock Get-MachineGUID {} -Verifiable
Mock Write-Warning {"Warning"} -Verifiable
Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block and passing an APIKey parameter to Get-SQLDiagRecommendations so that we don't hit the Write-Warnings we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}


and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}

Rinse and repeat

That is basically the process that I follow to write a function. I just write a test, write some code to pass it, write another test, write some code to pass it. I keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my Unit tests had passed, I ran

 Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

to check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns, as I think the command has to be called Get-SQLDiagRecommendations with an S! I have tagged the Script Analyzer tests with a tag so I can just run those tests.
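
For illustration, the plural nouns rule in PSScriptAnalyzer is PSUseSingularNouns, so an exclusion might look something like this (a sketch; the functions folder path is an assumption and the exact mechanism in the template's Exceptions file may differ):

# Run the Script Analyzer rules against the module functions, skipping the plural nouns rule
Invoke-ScriptAnalyzer -Path .\functions -ExcludeRule PSUseSingularNouns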

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

which tests if I have the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code's PowerShell extension to add the help scaffolding really easily, as I describe here.

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this; in fact, I encourage it. Please fork and send PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam.

VSCode – PowerShell extension 1.4.0 new command Out-CurrentFile

Yesterday David Wilson announced version 1.4.0 of the PowerShell extension for VSCode

https://twitter.com/daviwil/status/877559988809457671

He also pointed out that there have been over 1 million installs of the extension. 🙂

If you want to install the PowerShell extension you can hit F1 in VSCode and type

ext install PowerShell

or CTRL + SHIFT + X to open the extensions side bar and search for PowerShell and click the green install button.

There are a few enhancements in this release which you can read about here but I noticed the New File API and Out-CurrentFile command that were contributed by Doug Finke.

If your focus is in the editor in VSCode, you can simply CTRL + N and create a new file. You can alter the language that the file uses with CTRL + K, M (that’s CTRL and K and then M not CTRL and K and M!) or set the default new file language as I described here.

01 -new file.gif

If you are using VSCode as your daily PowerShell command line though, you would have to alter your focus from the terminal panel into the editor to do this. Now though you have

$psEditor.Workspace.NewFile()

which enables you to create a new file from the terminal

02- Another new file.gif

The Out-CurrentFile command sends the output of a command through Out-String to the current file open in the editor. This makes it much easier to keep the results of some commands. You don't have to pipe them to clip, or highlight and CTRL + C to copy them and then open a file and paste them. Of course you can use Out-File and then open the file, but this is another way.
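
For comparison, the traditional route might look like this (a sketch; the output path is made up, and psedit is the helper function the PowerShell Integrated Console provides for opening files in the editor):

# Write the output to a file and then open it in the editor
Export-SqlLogin -SqlInstance . | Out-File C:\temp\logins.sql
psedit C:\temp\logins.sql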

Let's see how it works. In this example, I want to export the T-SQL for creating the logins on an instance and add some comments to the code before saving it somewhere safely. I am going to use the Export-SqlLogin command from the dbatools module. MVP Cláudio Silva has written a great post on that command today.

First create a new file

$psEditor.Workspace.NewFile()

and then

Export-SqlLogin -SqlInstance . | Out-CurrentFile

In the gif above

  • I create a new file,
  • export the logins to it,
  • change the language of the file to T-SQL,
  • remove the string quotes and
  • add some comments.

All without leaving VSCode, just another reason that my productivity is increased using VSCode!

Unfortunately, it doesn't work so well with Pester (of course I was going to try Pester!). This makes sense though, as Pester uses Write-Host to display the test results, so nothing is going to the output stream and

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' | Out-CurrentFile

returns this

04 - pester.PNG

and if you use the -PassThru parameter then you get the $Tests object as a string so

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' -Show Summary -PassThru | Out-CurrentFile

shows

05 - pester object.PNG

Which isn’t what I would need but I have other ways of working with Pester output.

Pester for Presentations – Ensuring it goes ok

Whilst I was at PSConfEU I presented a session on writing Pester tests instead of using checklists. You can see it here.

During the talk I showed the Pester test that I use to make sure that everything is ready for my presentation. A couple of people have asked me about this and wanted to know more, so I thought that I would blog about it.

Some have said that I might be being a little OCD about it 😉 I agree that it could seem like that but there is nothing worse than having things go wrong during your presentation. It makes your heart beat faster and removes the emphasis from the presentation that you give.

When it is something that you as a presenter could have foreseen, like a VM not being started, a database not being restored to the pre-demo state or being logged in as the wrong user, then it is much worse.

I use Pester to ensure that my environment for my presentation is as I expect and, in fact, in Hanover when I ran through my Pester test for my NUC environment I found that one of my SQL Servers had decided to be in a different time zone and therefore the SQL Service would not authenticate and start. I was able to quickly remove the references to that server and save myself from a sea of red during my demos.

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.

What can you test? Everything. Well, specifically everything that you can write a PowerShell command to check. So when I am setting up for my presentation I check the following things. I add new things to my tests as I think of them or as I observe things that may break my presentations. Most recently that was ensuring that my Visual Studio Code session was running under the correct user. I did that like this

Describe "Presentation Test" {
    Context "VSCode" {
        It "Should be using the right username" {
            whoami | Should Be 'TheBeard\Rob'
       }
    }
}

01 - username.PNG

I think about the things that are important to me for my presentation.  I want to ensure that I only have one VS Code window open to avoid that situation where I am clicking through windows looking for the correct window. I can do that using Get-Process

It "Should have Code Insiders Open" {
(Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue)| Should Not BeNullOrEmpty
}
        It "Should have One VS Code Process" {
            (Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue).Count | Should Be 1
        }

I use -ErrorAction SilentlyContinue so that I don’t get a sea of red when I run the tests. Next I want to check my PowerPoint is ready for my presentation

It "Should have PowerPoint Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).Count | Should Not BeNullOrEmpty
}
It "Should have One PowerPoint Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).Count | Should Be 1
}
It "Should have the correct PowerPoint Presentation Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).MainWindowTitle| Should Be ‘dbatools – SQL Server and PowerShell together – PowerPoint’
}

Again I use Get-Process. I check if PowerPoint is open, if there is one PowerPoint open, and I use the MainWindowTitle property to check that it is the right PowerPoint presentation, after nearly starting a presentation for the SqlServer module with the dbatools slides!

I don’t want any distractions when I am presenting. I have the sort of friends who will notice if I get notifications for twitter popping up on my screen and repeatedly send tweets to make people laugh. (I admit, I’m one of those friends – I do this too!)

02 - Friends!!.PNG

Now, I cannot get a test for quiet hours working. You can apparently use a registry key, which of course you can check with PowerShell, but I was unable to get it working. I haven't looked at testing for Presentation Mode, but I test that those programmes are shut down, again using Get-Process

        It "Mail Should be closed" {
            (Get-Process HxMail -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Tweetium should be closed" {
            (Get-Process WWAHost -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Slack should be closed" {
            (Get-Process slack* -ErrorAction SilentlyContinue).Count | Should BE 0
        }

I am generally presenting with SQL Server so I need to make sure that SQL Server is running. I do this with Get-Service

Context "Local SQL" {
        It "DBEngine is running" {
            (Get-Service mssqlserver).Status | Should Be Running
        }
        It "SQL Server Agent is running" {
            (Get-Service sqlserveragent).Status | Should Be Running
        }
        It "DAVE DBEngine is running" {
            (Get-Service mssql*Dave).Status | Should Be Running
        }
        It "DAVE Agent is running" {
            (Get-Service sqlagent*dave).Status | Should Be Running
        }
    }

In this example I am testing that the SQL Service and the Agent service are running on both of my local instances.

I use a NUC running Hyper-V to enable me to show a number of SQL Servers running in a domain environment, so I need to be able to test those too. I set the names of the servers I need into a variable and check that each VM is running and that they respond to ping.

 Context "VM State" {
        $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
        $NUCVMs = Get-VM -ComputerName beardnuc | Where-Object {$_.Name -in $NUCServers}
            foreach($VM in $NUCVms)
                {
                $VMName = $VM.Name
                  It "$VMName Should be Running"{
                    $VM.State | Should Be 'Running'
                  }
			    }
    }
Context "THEBEARD_Domain" {
            $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
            foreach($VM in $NUCServers)
                {
                                 It "$VM Should respond to ping" {
				(Test-Connection -ComputerName $VM -Count 1 -Quiet -ErrorAction SilentlyContinue) | Should be $True
				}
                }
    }

I also need to check that the SQL Service and the Agent Service are running on each server.

  Context "SQL State" {
        $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*'  -and $_.State -eq 'Running'}).Name
        foreach($Server in $SQLServers)
        {
          $DBEngine = Get-service -ComputerName $Server -Name MSSQLSERVER
           It "$Server  DBEngine should be running" {
                $DBEngine.Status | Should Be 'Running'
            }
           It "$Server DBEngine Should be Auto Start" {
            $DBEngine.StartType | Should be 'Automatic'
           }
              $Agent= Get-service -ComputerName $Server -Name SQLSERVERAGENT
              It "$Server Agent should be running" {
                  $Agent.Status | Should Be 'Running'
           }
           It "$Server Agent Should be Auto Start" {
            $Agent.StartType | Should be 'Automatic'
           }
        }
        It "Linux SQL Server should be accepting connections" {
            $cred = Import-Clixml C:\temp\sa.xml
            {Connect-DbaSqlServer -SqlServer LinuxvnextCTP14 -Credential $cred -ConnectTimeout 60} | Should Not Throw
        }

    }
}

I check that the Linux SQL Server is available by storing the credential using Export-CliXml and then using that credential with Connect-DbaSqlServer from dbatools.
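
The credential file itself would have been created beforehand in the same way as the API Key earlier, something like this (a sketch, using the same path the test imports from):

# Save the sa credential for the Linux SQL Server
Get-Credential sa | Export-Clixml -Path C:\temp\sa.xml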

Using a NUC means I sometimes have fun with networking, so I have a couple of tests for that too, testing for the correct DNS servers and gateways.

    It "Should have DNS Servers for correct interface" {
        (Get-DnsClientServerAddress -InterfaceAlias 'Ethernet 3').Serveraddresses | Should Be @('10.0.0.1','10.0.0.2')
    }
    It "Should have correct gateway for alias"{
        (Get-NetIPConfiguration -InterfaceAlias 'Ethernet 3').Ipv4DefaultGateway.NextHop | Should Be '10.0.0.10'
    }

All of those are generic tests that have evolved over time and are run for every presentation, but when there are specific things required for a single presentation I test for those too.

For example, later this week Cláudio Silva and I are presenting on dbatools at TUGAIT. We are showing the Test-DbaMaxMemory, Get-DbaMaxMemory and Set-DbaMaxMemory commands, so we need to ensure that the Max Memory for some servers is (in)correctly set. I use Connect-DbaSqlServer to create an SMO Server object and test that

    It "Max Memory on SQl2012SerAG1 2 and 3 should be 2147483647" {
        (Connect-DbaSqlServer SQL2012Ser08AG1).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG2).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG3).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
    }

We are also showing the Test-DbaIdentityUsage command, so a column needs to be pre-prepared in AdventureWorks2014 to be able to show the error.

    It "ShiftID LastValue Should be 255" {
        $a = Test-DbaIdentityUsage -SqlInstance ROB-XPS -Databases AdventureWorks2014 -NoSystemDb
        $a.Where{$_.Column -eq 'ShiftID'}.LastValue | should Be 255
    }

To ensure that we have orphaned files available for the Find-DbaOrphanedFile command I use this

    It "has Orphaned Files ready"{
        (Find-DbaOrphanedFile -SqlServer SQL2016N2).Count | Should Be 30
    }

There are any number of things that you may want to test to ensure that, as best as possible, the demo gods are not going to come and bite you in the middle of your presentation.

  • Files or Folders exist (or don't exist)
  • Databases, Agent Jobs, Alerts
  • Operators, Logins
  • SSIS packages, SSRS Reports
  • PowerBi files
  • Azure connectivity
  • Azure components

The list is endless, just look at what you require for your presentation.
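
As an illustration, a couple of hedged examples of the sort of checks that list suggests (the folder path, server and database names here are made up):

Context "Demo Requirements" {
    It "Should have the demo scripts folder" {
        Test-Path 'C:\Presentations\dbatools\Demos' | Should Be $true
    }
    It "Should have the restored demo database" {
        (Connect-DbaSqlServer SQL2016N1).Databases['AdventureWorks2014'] | Should Not BeNullOrEmpty
    }
}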

Anything you can check with PowerShell you can test with Pester so build up your Pester presentation tests and reduce the reliance on the demo gods! I’ll still leave this here just to be safe!!

pray to the demo gods.jpg

Pester Test Inception and the Show Parameter

My fantastic friend Andre Kamman b | t and I presented at PSConfEU last week

C_EDtK0XoAA1PL7 (2).jpg

and whilst we were there we were chatting about running Pester tests. He wanted to know how he could run a Pester test and not lose the failed tests as they scrolled past him. In his particular example we were talking about running hundreds of tests on thousands of databases on hundreds of servers.

01 - pesters.gif

I guess it looks something like that!!

I explained about the -Show parameter, which allows you to filter the results that you see. Using Get-Help Invoke-Pester you can see this

   -Show
Customizes the output Pester writes to the screen. Available options are None, Default,
Passed, Failed, Pending, Skipped, Inconclusive, Describe, Context, Summary, Header, All, Fails.

The options can be combined to define presets.
Common use cases are:

None – to write no output to the screen.
All – to write all available information (this is default option).
Fails – to write everything except Passed (but including Describes etc.).

A common setting is also Failed, Summary, to write only failed tests and test summary.

This parameter does not affect the PassThru custom object or the XML output that
is written when you use the Output parameters.

Required?                    false
Position?                    named
Default value                All
Accept pipeline input?       false
Accept wildcard characters?  false

So there are numerous options available to you. Let's see what they look like.

I will use a dummy test which creates 10 Context blocks and runs from 1 to 10 and checks if the number has a remainder when divided by 7

Describe "Only the 7s Shall Pass" {
    $Servers = 0..10
    foreach($Server in $servers)
    {
        Context "This is the context for $Server" {
        foreach($A in 1..10){
            It "Should Not Pass for the 7s" {
                $A % 7 | Should Not Be 0
                }
            }
        }
    }
}

Imagine it is 10 servers running 10 different tests

For the Show parameter All is the default, which is the output that you are used to

02 - All.gif

None does not write anything out. You could use this with -PassThru, which will pass ALL of the test results to a variable, and if you added -OutputFile and -OutputFormat then you can save ALL of the results to a file for consumption by another system. The -Show parameter only affects the output from the Invoke-Pester command to the host, not the output to the files or the variable.
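
A sketch of that combination (the output file path is made up):

$results = Invoke-Pester .\tests -Show None -PassThru -OutputFile C:\temp\AllResults.xml -OutputFormat NUnitXml
$results.FailedCount   # the variable still holds every result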

Header only returns the header from the test results and looks like this (I have included None so that you can see!)

03 - none and header.PNG

Summary, as expected returns only the summary of the results

04 - summary.PNG

You can use more than one value for the Show parameter so if you chose Header, Summary, Describe you would get this

05 - headerdesscribe sumnmary.PNG

You could use Failed to only show the failed tests which looks like this

06 - failed.PNG

but Andre explained that he also wanted to be able to see some progress whilst the test was running. If there were no failures then he would not see anything at all.

So Fails might be the answer (or Failed and Summary, but that would not show the progress).

07 - fails.PNG

Fails shows the Header, Describe and Context and also shows the Summary.

However we carried on talking (PSConfEU is a fantastic place to talk about PowerShell 🙂) and wondered what would happen if you invoked Pester from inside a Pester test. I was pretty sure that it would work, as Pester is just PowerShell, but I thought it would be fun to have a look and see how we could solve that requirement.

So I created 3 "Internal Tests"; these are the ones we don't want to see the output for. I then wrote an overarching Pester test to call them. In that Pester test script I assigned the results of each test to a variable. When you examine it you see

08 - Pester Object.PNG

The custom object that is created shows the counts of all different results of the tests, the time it took and also the test result.

So I could create a Pester test to check the FailedCount property of that test result

$InternalTest1.FailedCount | Should Be 0

To make sure that we don't lose the results of the tests we can output them to a file like this

$InternalTest1 = Invoke-Pester .\Inside1.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test1_Results.xml -OutputFormat NUnitXml

So now we can run Invoke-Pester and point it at that file and it will show the progress and the final result on the screen.
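
Putting that together, the overarching test might look something like this (a sketch based on the file names above):

Describe "Internal Tests" {
    $InternalTest1 = Invoke-Pester .\Inside1.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test1_Results.xml -OutputFormat NUnitXml
    It "Internal Test 1 Should have no failures" {
        $InternalTest1.FailedCount | Should Be 0
    }
}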

09 finale.PNG

You could make use of this in different ways

  • Server 1
    • Database1
    • Database2
    • Database3
    • Database4
  • Server 2
    • Database1
      Database2
      Database3
      Database4
  • Server 3
    • Database1
      Database2
      Database3
      Database4

Or by Test Category

  • Backup
    • Server1
    • Server 2
    • Server 3
    • Server 4
  • Agent Jobs
    • Server 1
    • Server 2
    • Server 3
    • Server 4
  • Indexes
    • Server 1
    • Server 2
    • Server 3
    • Server 4

Your only limitation is your imagination.

As we have mentioned PSConfEU, you really should check out the videos on the YouTube channel. All of the videos that were successfully recorded will be on there. You could start with this one and mark your diaries for April 16-20 2018.


Test the SQL Server database collation with PowerShell and dbatools

If your server collation is different to your database collation then you may find that you get an error similar to this

Cannot resolve the collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "Latin1_General_CI_AS" in the equal to operation.

when your queries use the tempdb

It would be useful to be able to quickly test if that may be the case, and with the dbatools module you can. There is a Test-DbaDatabaseCollation command which will do just that. This page will show you how to install dbatools if you have not already got it.
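
If you do need to install it, the dbatools module is on the PowerShell Gallery, so installation is a one-liner (the -Scope parameter is useful if you are not an administrator):

Install-Module dbatools
# or, without administrator rights
Install-Module dbatools -Scope CurrentUser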

As always you should start with Get-Help when looking at a PowerShell command

Get-Help Test-DbaDatabaseCollation -ShowWindow


01 - Get Help.PNG

There are only 3 parameters: SqlServer, Credential and Detailed.

Let's start with SqlServer

Test-DbaDatabaseCollation -SqlServer SQLvNextN2

This gives a quick and simple output showing the server name, database name and an IsEqual property.

02 - test server

So in this example we can see that the WideWorldImporters database does not have the same collation as the server. If we only wanted to see information about databases with a collation that does not match the server then we could use

(Test-DbaDatabaseCollation -SqlServer SQLvNextN2).Where{$_.IsEqual -eq $false}

03 - equals false

That doesn't give us any further information though. There is the Detailed parameter as well. Let's see what that does.

 Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed

04 - detailed.PNG

This time we get the server name, server collation, database name, database collation and the IsEqual property. This is a collection of objects so we are not bound by just seeing them on the screen; we can use them as I blogged about here.

For example

## Output to a file
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed | Out-File C:\Temp\CollationCheck.txt
## Output to CSV
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed | Export-Csv C:\temp\CollationCheck.csv -NoTypeInformation
## Output to JSON
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed | ConvertTo-Json | Out-File c:\temp\CollationCheck.json
## Look at the files
notepad C:\temp\CollationCheck.json
notepad C:\temp\CollationCheck.csv
notepad C:\temp\CollationCheck.txt

Of course, you will probably want to test more than one server at a time. Let's pass an array of servers and see what happens

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
# Test Db collation
Test-DbaDatabaseCollation -SqlServer $SQLServers -Detailed

05 - servers.PNG

In this example, I am querying my Hyper-V server for all running VMs with SQL in the name, except for my broken SQL2008 box. I love PowerShell's Out-GridView command for many reasons; the ability to sort by columns quickly and simply is one of them. Let's add that to the code and sort by the IsEqual column

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
# Test Db collation
Test-DbaDatabaseCollation -SqlServer $SQLServers -Detailed | Out-GridView

06 - servers ogv.PNG

Excellent, that works a treat. How about Linux? Does this work if SQL is running on Linux? We will have to use the Credential parameter as we need SQL Authentication. This time I have used the Format-Table command to format the output.

$cred = Get-Credential
Test-DbaDatabaseCollation -SqlServer LinuxvNextCTP14 -Credential $cred -Detailed | Format-Table -AutoSize

07 - Linux.PNG

Let's add some Pester tests. If we want to test a list of servers and see if any of their databases have an incorrect collation, we can simply test if the IsEqual flag contains a false.

We can do this using TestCases. Test cases allow Pester to loop through a collection of 'things'. The TestCases parameter takes an array of hashtables. This all sounds very complicated to those unclear about PowerShell, but here is some code to do it.

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
$testCases= @()
$SQLServers.ForEach{$testCases += @{Name = $_}}

The first line gathers the list of SQL Servers from the Hyper-V server as before. You can get this from a text file, CSV, Active Directory, CMS or registered servers list. The second line initialises the TestCases array and the third line iterates through the list of servers and adds a hashtable to the TestCases array.

To make use of the test cases we have to use the -TestCases parameter in the It block of our Pester test and add a param() so that the test knows where to get the values from. To add the value from the test cases into the title of the test we need to reference it inside angle brackets, like <Name>.

If you want to learn more about Pester. I highly recommend The Pester Book by Don Jones and Adam Bertram

Here is the code

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    $testCases= @()
    $SQLServers.ForEach{$testCases += @{Name = $_}}
    It "<Name> databases have the right collation" -TestCases $testCases {
        Param($Name)
        $Collation = Test-DbaDatabaseCollation -SqlServer $Name
        $Collation.IsEqual -contains $false | Should Be $false
    }
}

If we save that as a PowerShell file, we can call it with Invoke-Pester

08 - Servers Pester.PNG

which shows which servers do not have databases with the correct collation. This may be all that is required but what about if you want to check each database on each server with Pester?

I could not see a way to do this with TestCases so I reverted to PowerShell. Pester is just PowerShell code after all.

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    foreach($Server in $SQLServers)
    {
        $CollationTests = Test-DbaDatabaseCollation -SqlServer $Server
        foreach($CollationTest in $CollationTests)
        {
            It "$($Collationtest.Server) database $($CollationTest.Database) should have the correct collation" {
                $CollationTest.IsEqual | Should Be $true
            }
        }
    }
}

In this example, we again gather the names of our SQL servers and then iterate through them. Then we set the results of Test-DbaDatabaseCollation to a variable, iterate through each of the results and test the IsEqual property. We can save that as a file and call it with Invoke-Pester and this time it looks like

09 - Individual databases.PNG

Excellent, we can quickly and easily see which database on which server does not have a matching collation. We can't see in the results of the Pester test what collation it should be though. Let's do that as well.

This time we need to use the Detailed parameter and test that the ServerCollation matches the DatabaseCollation. This will enable Pester to display that information to us. Here is the code

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    foreach($Server in $SQLServers)
    {
        $CollationTests = Test-DbaDatabaseCollation -SqlServer $Server -Detailed
        foreach($CollationTest in $CollationTests)
        {
            It "$($Collationtest.Server) database $($CollationTest.Database) should have the correct collation of $($CollationTest.ServerCollation)" {
                $CollationTest.DatabaseCollation | Should Be $CollationTest.ServerCollation
            }
        }
    }
}

and if we save that as a file and call it with Invoke-Pester (you can just run the code using PowerShell as well) it looks like this

10 - full test.PNG

Now Pester shows us what collation it is expecting and what the collation of the database is as well when it fails the test. (I love the little arrow showing where the difference is!)

Hopefully this post has shown you how you can use Test-DbaDatabaseCollation from the dbatools module to test your servers and combine that with Pester. If you have any questions about the dbatools module, go and ask in the dbatools channel in the SQL Server Community Slack.

Using Pester with dbatools Test-DbaLastBackup

In previous posts I have shown how to use Test-DbaLastBackup from dbatools and how you can make use of the results. Today we will look at using Pester with the results.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.

We shall use it to validate our results. First we need to gather our results as we have seen before. In this example I have set the MaxMB to 5, so change that if you are playing along.

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2016N1','SQL2016N2'
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}

Then we need to write some Pester tests. I tried to use Test Cases, which are the correct method to iterate through collections as Mike Robbins shows here, but Pester does not accept the type of object that is returned from this command for that. It's ok though; because Pester is just PowerShell we can use a foreach loop.
In this scenario we are testing for failures rather than for when the backup test has skipped due to the file path not being a network share or the size being greater than our max size, so our checks use the Should Not assertion. I have also added a test for the time the backup was taken.

Describe "Last Backup Test results - NOTE THIS IGNORES Skipped restores,DBCC and BackupFiles" {
foreach($result in $results)
{
It "$($Result.Database) on $($Result.SourceServer) File Should Exist" {
$Result.FileExists| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Restore should be Success" {
$Result.RestoreResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) DBCC should be Success" {
$Result.DBCCResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Backup Should be less than a week old" {
$Result.BackupTaken| Should BeGreaterThan (Get-Date).AddDays(-7)
}
}

If we run that we get an output like this. Green is Good, Red is Bad 🙂

01 - pester script.PNG

We can save the script to a file and use Invoke-Pester to call it like this.

Invoke-Pester C:\temp\BackupPester.ps1

(Some Restore Frames removed for brevity)

02 -invoke pester.gif

Invoke-Pester can output results to a file, so we can output to XML which can be consumed by many things

$Date = Get-Date -Format ddMMyyyHHmmss
$tempFolder = 'c:\temp\BackupTests\'
Push-Location $tempFolder
$XML = $tempFolder + "BackupTestResults_$Date.xml"
$script = 'C:\temp\BackupPester.ps1'
Invoke-Pester -Script $Script -OutputFile $xml -OutputFormat NUnitXml

will provide an XML file like this

04 - XML output.PNG

We can also make use of the reportunit.exe from http://relevantcodes.com/ to create pretty HTML files from the XML files we created.

This piece of code will download and extract the file if it does not exist in the directory

#download and extract ReportUnit.exe
$url = 'http://relevantcodes.com/Tools/ReportUnit/reportunit-1.2.zip'
$fullPath = Join-Path $tempFolder $url.Split("/")[-1]
$reportunit = $tempFolder + '\reportunit.exe'
if ((Test-Path $reportunit) -eq $false) {
    (New-Object Net.WebClient).DownloadFile($url,$fullPath)
    Expand-Archive -Path $fullPath -DestinationPath $tempFolder
}

and this will run it against the XML and open the file

##run reportunit against report.xml and display result in browser
$HTML = $tempFolder + 'index.html'
& .\reportunit.exe $tempFolder
Invoke-Item $HTML

which will look like

03 - pretty html file.gif

Happy Automating

NOTE – The major 1.0 release of dbatools due in summer 2017 may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

PowerShell, Pester and Ola Hallengrens Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren's Maintenance Solution. If you haven't, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.

Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great how-to-get-started here, which will show you how to get Pester and how to get started with TDD.

This isn't TDD though; this is environment validation, and this is how I went about creating my test.

First I thought about what I would look for in SSMS when I had installed the maintenance solution and made a list of the things that I would check, which looked something like this. This would be the checklist you would create (or have already created) for yourself or a junior following this install. This is how easily you can turn that checklist into a Pester test, remove the human element and open your install to automated testing.
  • SQL Server Agent is running – Otherwise the jobs won’t run 🙂
  • We should have 4 backup jobs with a name of
  • DatabaseBackup – SYSTEM_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – DIFF
  • DatabaseBackup – USER_DATABASES – LOG
  • We should have Integrity Check and Index Optimisation Jobs
  • We should have the clean up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I was being more careful I would have checked if there were the correct folders in the folder I was backing up to.

Ola's script uses a default naming convention, so this makes it easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder, or if there is an Availability Group a CLUSTERNAME$AGNAME folder, and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist.

So now we have our checklist, we just need to turn it into a Pester environmental validation script.

It would be useful to be able to pass in a number of instances, so we will start with a foreach loop and then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name.

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
    BeforeAll {
        $Jobs = Get-SqlAgentJob -ServerInstance $Server
        $srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
        $dbs = $Srv.Databases.Where{$_.status -eq 'Normal'}.name
        if ($InstanceName) {
            $DisplayName = "SQL Server Agent ($InstanceName)"
            $Folder = $ServerName + '$' + $InstanceName
        }
        else {
            $DisplayName = 'SQL Server Agent (MSSQLSERVER)'
            $Folder = $ServerName
        }
    }
    if ($CheckForBackups -eq $true) {
        $CheckForDBFolders = $true
    }
    $Root = $Share + '\' + $Folder
I also set the Agent service display name so I can get its status. I split the jobs up using a Context block, one each for backups, database maintenance and solution clean up, but they all follow the same pattern. First, get the jobs

$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let's test the Agent service. You do this with an It block, and in it put a single test like this

actual-value | Should Be expected-value
So to check the Agent service is running we can do this

(Get-service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test I check using Get-Member. To see what is available for a job, I gathered the Agent Jobs into a variable using the Get-SqlAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and the values using Select-Object

$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this

$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach ($job in $Jobs) {
    $JobName = $Job.Name
    It "$JobName Job Exists" {
        $Job | Should Not BeNullOrEmpty
    }
    It "$JobName Job is enabled" {
        $job.IsEnabled | Should Be 'True'
    }
    It "$JobName Job has schedule" {
        $Job.HasSchedule | Should Be 'True'
    }
    if ($DontCheckJobOutcome -eq $false) {
        It "$JobName Job succeeded" {
            $Job.LastRunOutCome | Should Be 'Succeeded'
        }
    }
}
So I have checked the agent and the jobs, and now I want to check that the folders exist, first for the instance, using Test-Path. The user running the PowerShell session must have privileges and access to list the files and folders.

Context "$Share Share For $Server" {
    It "Should have the root folder $Root" {
        Test-Path $Root | Should Be $true
    }
Then for every database we need to set some variables for the folder path. We don't back up tempdb so we ignore that, and then check if the server is SQL 2012 or above and, if it is, check if the database is a member of an Availability Group and set the folder name appropriately.

foreach ($db in $dbs.Where{$_ -ne 'tempdb'}) {

    if ($Srv.VersionMajor -ge 11) {
        If ($srv.Databases[$db].AvailabilityGroupName) {
            $AG = $srv.Databases[$db].AvailabilityGroupName
            $Cluster = $srv.ClusterName
            $OLAAg = $Cluster + '$' + $AG
            if ($Share.StartsWith('\\') -eq $False) {
                $UNC = $Share.Replace(':','$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $OlaAG
            }
            else {
                $Root = $Share + '\' + $OlaAG
            }
        }
        else {
            if ($Share.StartsWith('\\') -eq $False) {
                $UNC = $Share.Replace(':','$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
            }
            else {
                $Root = $Share + '\' + $Folder
            }
        }
    }
    $db = $db.Replace(' ','')
    $Dbfolder = $Root + "\$db"
    $Full = $Dbfolder + '\FULL'
    $Diff = $Dbfolder + '\DIFF'
    $Log  = $Dbfolder + '\LOG'
    If ($CheckForDBFolders -eq $True) {
        Context "Folder Check for $db on $Server on $Share" {
            It "Should have a folder for $db database" {
                Test-Path $Dbfolder | Should Be $true
            }
But we need some logic for checking for folders because Ola is smart and checks for log shipped databases so as not to break the LSN chain; system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares.

If ($CheckForDBFolders -eq $True) {
    Context "Folder Check for $db on $Server on $Share" {
        It "Should have a folder for $db database" {
            Test-Path $Dbfolder | Should Be $true
        }
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db)) {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
            It 'Has a Log Folder' {
                [System.IO.Directory]::Exists($Log) | Should Be $True
            }
        }
        elseif (($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and $Db -notin ('master','msdb','model') -or ($LSDatabases -contains $db)) {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
        }
        else {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
        }
    } # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad's PowerShell Conference session Chasing the Seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with.
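
A sketch of how the two approaches can be compared (the share path here is made up):

# Compare enumerating the files with reading the directory's last write time
Measure-Command { Get-ChildItem '\\Server1\BackupShare1\SERVER\Database\LOG' -Filter *.trn }
Measure-Command { [System.IO.Directory]::GetLastWriteTime('\\Server1\BackupShare1\SERVER\Database\LOG') }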

If ($CheckForBackups -eq $true) {
    Context "File Check For $db on $Server on $Share" {
        $Fullcreate = [System.IO.Directory]::GetCreationTime($Full)
        $FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
        if ($Fullcreate -eq $FullWrite) {
            It "Has Files in the FULL folder for $db" {
                Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
            }
        }
        else {
            It "Has Files in the FULL folder for $db" {
                $FullCreate | Should BeLessThan $FullWrite
            }
        }
        It 'Full File Folder was written to within the last 7 days' {
            $Fullwrite | Should BeGreaterThan (Get-Date).AddDays(-7)
        }
        if ($Db -notin ('master','msdb','model')) {
            $Diffcreate = [System.IO.Directory]::GetCreationTime($Diff)
            $DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
            if ($Diffcreate -eq $DiffWrite) {
                It "Has Files in the DIFF folder for $db" {
                    Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
                }
            }
            else {
                It "Has Files in the DIFF folder for $db" {
                    $DiffCreate | Should BeLessThan $DiffWrite
                }
            }
            It 'Diff File Folder was written to within the last 24 Hours' {
                $Diffwrite | Should BeGreaterThan (Get-Date).AddHours(-24)
            }
        }
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db)) {
            $Logcreate = [System.IO.Directory]::GetCreationTime($Log)
            $LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
            if ($Logcreate -eq $LogWrite) {
                It "Has Files in the LOG folder for $db" {
                    Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
                }
            }
            else {
                It "Has Files in the LOG folder for $db" {
                    $LogCreate | Should BeLessThan $LogWrite
                }
            }
            It 'Log File Folder was written to within the last 30 minutes' {
                $Logwrite | Should BeGreaterThan (Get-Date).AddMinutes(-30)
            }
        } # Simple Recovery
    }
} # Check for backups
You could just run the script you have created from your checklist as it is, and hopefully this blog post helps you see that you can do so.
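For instance, if you had saved it as MyOlaChecks.ps1 (a hypothetical name), executing it runs every Describe block within it:

## Run the checks script directly - each Describe block executes its tests
.\MyOlaChecks.ps1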
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like so

## Pester Test to check OLA
[CmdletBinding()]
Param(
    $Instance,
    $CheckForBackups,
    $CheckForDBFolders,
    $JobSuffix,
    $Share,
    [switch]$NoDatabaseRestoreCheck,
    [switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this

$Script = @{
    Path = $Path;
    Parameters = @{ Instance = $Instance;
        CheckForBackups = $true;
        CheckForDBFolders = $true;
        JobSuffix = 'BackupShare1';
        Share = '\\Server1\BackupShare1';
        NoDatabaseRestoreCheck = $true;
        DontCheckJobOutcome = $true}
}
Invoke-Pester -Script $Script
but that’s a bit messy, hard to remember and won’t encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it in GitHub: Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on Line 90 of Test-OlaInstance so that the script can find it
Once you have that you can call it for a single instance or a number of instances like so. Here I check for Folders and Backup files
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking that the files and folders exist for 61 databases 🙂 (the three failures were my Integrity Check jobs holding some test corrupt databases)
[Image: pester ola check.PNG]
This gives me a nice and simple automated method of checking if Ola’s maintenance script has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up whether that is UNC or local) I can also add this to an automated build process to ensure that everything has been deployed correctly.
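As a sketch of that last idea (the output file name is an example, and $Script is the splat from earlier), a build step could publish the results and fail the build when any test fails:

## Write the results as NUnit XML for the build server to pick up;
## -EnableExit makes PowerShell exit with the number of failed tests,
## which most build servers treat as a failed step when non-zero
Invoke-Pester -Script $Script -OutputFile 'OlaCheckResults.xml' -OutputFormat NUnitXml -EnableExit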
I hope you find it useful

Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now: click here, and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL set-up. Giving all of your servers the same defaults makes them even easier to manage remotely at scale.

I am comfortable with using SMO to gather and change properties on SQL Instances so I started by doing this

It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
    $Scriptblock = {
        [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
        ## '.' connects to the default instance on the local machine
        $srv = New-Object Microsoft.SqlServer.Management.Smo.Server '.'
        return $srv.BackupDirectory
    }
    $State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
    $State | Should Be 'F:\SQLBACKUP\BACKUPS'
}

This is how to find the properties that you want

  ## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv|gm

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | select *
$srv.Databases[0] | gm

As I added more tests I quickly found that they were taking a long time to run (about 5 seconds per test) and that each test took an age to fail if the server name was incorrect or the server was unavailable.

I fixed the first one by testing with a ping before running the tests

## Check for connectivity
if((Test-Connection $Server -Count 1 -Quiet) -eq $false){
    Write-Error "Could not connect to $Server"
    continue
}

The continue is there because I wanted to loop through an array of servers
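For context, this is roughly how the loop fits together (the server names are illustrative):

## A minimal sketch, assuming $Servers holds the instances to check
$Servers = 'SQLServer1','SQLServer2','SQLServer3'
foreach($Server in $Servers)
{
    ## Skip straight to the next server if this one does not respond
    if((Test-Connection $Server -Count 1 -Quiet) -eq $false){
        Write-Error "Could not connect to $Server"
        continue
    }
    ## ... the Describe block for $Server runs here
}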

I improved the performance using a remote session and a custom object

      Describe "$Server" {
BeforeAll {
$Scriptblock = {
[pscustomobject]$Return = @{}
$srv = ''
$SQLAdmins = $Using:SQLAdmins
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$Return.DBAAdminDb = $Srv.Databases.Name.Contains('DBA-Admin')
$Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
$Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.count -eq $SQLAdmins.Count
$SysAdmins = $Srv.Roles['sysadmin'].EnumMemberNames()
$Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.count -eq $SQLAdmins.Count
$Return.BackupDirectory = $srv.BackupDirectory
$Return.DataDirectory = $srv.DefaultFile

The BeforeAll script block is run, as it sounds like it should be, once before all of the tests; BeforeEach would run once before each of the tests. I define an empty custom object, then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end.
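A tiny standalone illustration of that difference (not part of the real script):

## BeforeAll runs once per Describe; BeforeEach runs again before every It
Describe 'Setup block timing' {
    BeforeAll  { $Script:Calls = 0 }  ## runs once, before any test
    BeforeEach { $Script:Calls++ }    ## runs again before every It
    It 'has run BeforeEach once by the first test'   { $Script:Calls | Should Be 1 }
    It 'has run BeforeEach twice by the second test' { $Script:Calls | Should Be 2 }
}

Back in the real script block, more properties are added before the custom object is returned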

$Return.Alerts82345Exist = ($srv.JobServer.Alerts | Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825}).Count
## Parentheses are needed here because -and binds more tightly than -or
$Return.Alerts82345Enabled = ($srv.JobServer.Alerts | Where {($_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825) -and $_.IsEnabled -eq $true}).Count
$Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
Return $Return
}
try {
    $Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
}
catch {
    Write-Error "Unable to Connect to $Server"
    $Error
    continue
}
I was then able to test against the property of the custom object

   It 'Should have Alerts for Severity 20 and above' {
$Return.Alerts20SeverityPlusExist | Should Be 6
}
It 'Severity 20 and above Alerts should be enabled' {
$Return.Alerts20SeverityPlusEnabled | Should Be 6
}
It 'Should have alerts for 823,824 and 825' {
$Return.Alerts82345Exist |Should Be 3
}
It 'Alerts for 823,824 and 825 should be enabled' {
$Return.Alerts82345Enabled |Should Be 3
}

Occasionally, for reasons I haven’t explored, I had to test against the Value property of the returned object

          It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
$Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments, or groups of servers with different default values, so I parameterised the test results as well. The logical next step was to turn it into a function so that I could use parameter splatting. This also gives me the opportunity to show all of the things that I currently pass as parameters to the test

   $Parms = @{
Servers = 'SQLServer1','SQLServer2','SQLServer3';
SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards';
BackupDirectory = 'C:\MSSQL\Backup';
DataDirectory = 'C:\MSSQL\Data\';
LogDirectory = 'C:\MSSQL\Logs\';
MaxMemMb = '4096';
Collation = 'Latin1_General_CI_AS';
TempFiles = 4 ;
OlaSysFullFrequency = 'Daily';
OlaSysFullStartTime = '21:00:00';
OlaUserFullSchedule = 'Weekly';
OlaUserFullFrequency = 1 ;## 1 for Sunday
OlaUserFullStartTime = '22:00:00';
OlaUserDiffSchedule = 'Weekly';
OlaUserDiffFrequency = 126; ## 126 for every day except Sunday
OlaUserDiffStartTime = '22:00:00';
OlaUserLogSubDayInterval = 15;
OlaUserLoginterval = 'Minute';
HasSPBlitz = $true;
HasSPBlitzCache = $True;
HasSPBlitzIndex = $True;
HasSPAskBrent = $true;
HASSPBlitzTrace =  $true;
HasSPWhoisActive = $true;
LogWhoIsActiveToTable = $true;
LogSPBlitzToTable = $true;
LogSPBlitzToTableEnabled = $true;
LogSPBlitzToTableScheduled = $true;
LogSPBlitzToTableSchedule = 'Weekly';
LogSPBlitzToTableFrequency = 2 ; # 2 means Monday
LogSPBlitzToTableStartTime  = '03:00:00'}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules, which you will have to modify to suit your own environment
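As a hedged sketch of what such a firewall test might look like (the rule display name is an assumption; substitute the rules you create in your own environment):

## A minimal sketch, assuming a firewall rule displayed as 'SQL Server' on the target
It "Should have a firewall rule for SQL Server on $Server" {
    $Rule = Invoke-Command -ComputerName $Server -ScriptBlock {
        Get-NetFirewallRule -DisplayName 'SQL Server' -ErrorAction SilentlyContinue
    }
    $Rule | Should Not BeNullOrEmpty
}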

To be able to run this you will need the Pester module. If you are using Windows 10 it is installed by default; if not

Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here and also in these videos from the conference.
You can find the tests on GitHub here and I will continue to add to the defaults that I check.
This is not a replacement for other SQL configuration tools such as Policy Based Management (PBM), but it is a nice, simple way of producing a report on the current status of a SQL installation, either at a particular point in time when something is wrong, or after an installation and prior to passing the server over to another team or into service


PowerShell Pester Testing for Parameter Validation

This error caught me out. I am putting this post here firstly to remind me if I do it again and also to help others who may hit the same issue.

Today I am rewriting a function to create a Hyper-V VM so that I can properly script the creation of my labs for demos and other things. I am doing this because I want to use DSC to create an availability group and want to be able to tear down and recreate the machines (but that's for another day)

I have also been looking at Pester, which is a framework for running unit tests within PowerShell

You will find some good blog posts about starting with Pester here

Here is the start of the function. I validate the VMName parameter to ensure that a VM with that name does not already exist

function Create-HyperVMFromBase {
    [cmdletbinding()]
    param (
        [Parameter(Mandatory = $true, HelpMessage = "Enter a VMName for the VM that does not exist")]
        [ValidateScript({ !(Get-VM -Name $_) })]
        [string]$VMName,

and my Pester test looks like this

$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path).Replace(".Tests.", ".")
. "$here\$sut"

Describe "Create Hyper V from Base Tests" {
    Context "Parameter Values,Validations and Errors" {
        It exists {
        test-path function:\create-hypervmfrombase | should be $true
        }
        It "Should error when VMName exists" {
        $VMName = (Get-VM|Select -First 1 Name).Name
        create-hypervmfrombase -VMName $VMName |should throw
        }
    }
}

I thought that what I was testing was that the function threw an error when an incorrect parameter was passed, so the Should Throw assertion ought to pass, but what I got was

[Image: pester error3]

So I was getting the correct error but not passing the test. It was a simple fix: adding curly braces around the call to the function, so that Should Throw receives a script block to run itself rather than the already-thrown output of the call

$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path).Replace(".Tests.", ".")
. "$here\$sut"
Describe "Create Hyper V from Base Tests" {
    Context "Parameter Values,Validations and Errors" {
        It exists {
        test-path function:\create-hypervmfrombase | should be $true
        }
        It "Should error when VMName exists" {
        $VMName = (Get-VM|Select -First 1 Name).Name
        {create-hypervmfrombase -VMName $VMName} |should throw
        }
    }
}

and we pass the test.

[Image: pester success2]

PowerShell Pester – "The script failed due to call depth overflow."

This error caught me out. I am putting this post here firstly to remind me if I do it again and also to help others who may hit the same issue.

I have also been looking at Pester, which is a framework for running unit tests within PowerShell

You will find some good blog posts about starting with Pester here

So I created a function script file Create-HyperVFromBase.ps1 and a tests script file Create-HyperVFromBase.tests.ps1 as shown.

[Image: pester scripts]

The tests contained this code

 $here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path).Replace(".Tests.", ".")
. "$here\$sut"
Describe "Create Hyper V from Base Tests" {
    Context "Parameter Values,Validations and Errors" {
        It exists {
        test-path function:\create-hypervmfrombase | should be $true
        } 
}
}

When I ran the test I got the following error

[Image: pester error1]

or

[Image: pester error2]

Googling pester “The script failed due to call depth overflow.” returned only 7 results, but the Reddit link contained the information I needed:

.Replace() is case sensitive. It didn’t remove the .tests. keyword from your file name. So it calls your test script again and repeats the same mistake over and over.
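A quick demonstration at the console shows the behaviour:

## String.Replace() performs an ordinal, case-sensitive match
'Create-HyperVFromBase.tests.ps1'.Replace('.Tests.', '.')
## returns Create-HyperVFromBase.tests.ps1 unchanged, so the script dot-sources itself forever
'Create-HyperVFromBase.Tests.ps1'.Replace('.Tests.', '.')
## returns Create-HyperVFromBase.ps1, the function script, as intended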

and so I renamed the tests script file to Create-HyperVFromBase.Tests.ps1, with a capital T, and bingo

[Image: pester success]

Don’t forget to name your Pester Tests scripts with a capital T when loading the script in this way and remember that Replace() is case sensitive.

$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path).Replace(".Tests.", ".")
. "$here\$sut"