Writing Dynamic and Random Test Cases for Pester

I have written a module, SQLDiagAPI, for consuming the SQL Server Diagnostics API with PowerShell. I blogged about how I used Pester to develop one of the functions. Whilst writing Get-SQLDiagFix I wrote some Pester tests to make sure that the output from the code was as expected.

Pester

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
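
If you need to update the version that ships in the box, a minimal sketch looks like this (on Windows 10 the -Force and -SkipPublisherCheck switches are needed because the inbox Pester is signed with a different certificate):

Install-Module -Name Pester -Force -SkipPublisherCheck
## or, once you have a Gallery version installed
Update-Module -Name Pester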

The Command Get-SQLDiagFix

Get-SQLDiagFix returns the Product Name, Feature Name/Area, KB Number, Title and URL for the Fixes in the Cumulative Updates returned from the SQL Server Diagnostics Recommendations API. One Fix looks like this

07 - Get-SQLDiagFix result.png

This is how I wrote the Pester tests for that command

Mocking the results

In my describe block for each function I mock Get-SQLDiagRecommendations. This is the command that each of the currently available commands in the module uses to get the recommendations from the SQL Server Diagnostics Recommendations API. I did this by creating a json file from the API and saving it in a json folder inside the tests folder.

01 - JSON folder.png
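
If you are wondering how to capture such a fixture in the first place, a hedged sketch (a one-off run with a live API key; the file name comes from the screenshot above) would be:

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 |
    Out-File -FilePath .\tests\json\recommendations.JSON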

I can then mock Get-SQLDiagRecommendations inside a BeforeAll code block using

Describe "Get-SQLDiagFix" -Tags Build , Unit, Fix {
BeforeAll {
    $Recommendations = (Get-Content $PSScriptRoot\json\recommendations.JSON) -join "`n" | ConvertFrom-Json
    Mock Get-SQLDiagRecommendations {$Recommendations}
}
This means that every time the code in the test calls Get-SQLDiagRecommendations it will not use the internet to connect to the API and return an object. Instead it will return the $Recommendations object which is loaded from a file on the file system. I am not, therefore, depending on any external factors and I have a known set of data for my test results.

I also set up some test data in the BeforeAll block of my Output Context code block
Context "Output" {
BeforeAll {
    $Fixes = (Get-Content $PSScriptRoot\json\fix.JSON) -join "`n" | ConvertFrom-Json
    $Products = Get-SQLDiagProduct
    $Features = Get-SQLDiagFeature
}

The fix.JSON file was created from the recommendations.JSON and only contains the properties returned by Get-SQLDiagFix, which is what we are testing here. I can set variables for Products and Features using the commands from the module as these will call Get-SQLDiagRecommendations which we have already mocked.

Test All of the Fixes

I can now test that the code I have written for Get-SQLDiagFix returns the correct data without any parameters using this test with Compare-Object.

It "returns all of the fixes with no parameter" {
    Compare-Object (Get-SQLDiagFix) $Fixes | Should BeNullOrEmpty
}
If there is no difference between the object returned from Get-SQLDiagFix and the $Fixes object which uses the json file, then the code is working as expected and the test will pass.
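
As a quick illustration of why Should BeNullOrEmpty works here, Compare-Object returns nothing at all when its two inputs match:

Compare-Object @(1, 2, 3) @(1, 2, 3)   ## no output, the objects match
Compare-Object @(1, 2, 3) @(1, 2, 4)   ## outputs the difference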

Test Cases

I learned about test cases from Mike Robbins' blog post. Test cases enable you to provide an array of hashtables and loop through the same test for each of them. Here is an example.

There are the following products in the Recommendations API:

  • SQL Server 2012 SP3
  • SQL Server 2016 SP1
  • SQL Server 2016 RTM
  • SQL Server 2014 SP1
  • SQL Server 2014 SP2
and I want to run a test for each product to check that the fixes returned from Get-SQLDiagFix for that product match the $Fixes object filtered by Product. Here is the code
$TestCases = @{ ProductName = 'SQL Server 2012 SP3'},
@{ ProductName = 'SQL Server 2016 SP1'},
@{ ProductName = 'SQL Server 2016 RTM'},
@{ ProductName = 'SQL Server 2014 SP1'},
@{ ProductName = 'SQL Server 2014 SP2'}
It "Returns the correct results with a single product parameter <ProductName>" -TestCases $TestCases {
    param($productname)
    $results = $fixes.Where{$_.Product -in $ProductName}
    Compare-Object (Get-SQLDiagFix -Product $productname) $results | Should BeNullOrEmpty
}
You can click on the image below to see a larger, more readable version.
02 Test Cases.png

The $TestCases variable holds an array of hashtables, one for each product, with a key that matches the parameter that I use in the test and a value of the product name.

I wrote one test, one It code block. I refer to the product in the title inside <> using the same name as the key in the hashtable. The test (It) needs a parameter of -TestCases with a value (in this example) of the $TestCases variable we have just defined. It also needs a param block with a parameter that matches the key from the hashtables.

The expected test results are placed in a $results variable by filtering the $Fixes variable (defined in the BeforeAll code block above) by the parameter $ProductName.

The test will then run for each of the test cases in the $TestCases variable, comparing the results of Get-SQLDiagFix -Product $ProductName with the expected results from the $Fixes variable.

Here are the test results

03 - product test results.png

Multiple Products in Test Cases

I also want to test that Get-SQLDiagFix will work for multiple Products, so I need to create TestCases for those too. I do that in exactly the same way.

$TestCases = @{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2'}
    It "Returns the correct results with multiple product parameter <ProductName>" -TestCases $TestCases {
        param($productname)
        $results = $fixes.Where{$_.Product -in $ProductName}
        Compare-Object (Get-SQLDiagFix -Product $productname) $results | Should BeNullOrEmpty
}
Which looks like this when the tests run
04 - mulitple product test results.png

Single Feature Dynamic Test Cases

Get-SQLDiagFix can also filter the fixes by feature area. The features are returned from Get-SQLDiagFeature. This means that I can create a test for each of the features by using the $Features variable which was defined in the BeforeAll block as

$Features = Get-SQLDiagFeature

Then I can dynamically create test cases using

$TestCases = @()
$Features | Foreach-Object {$TestCases += @{Feature = $_}}
It "Returns the correct results with a single feature <Feature>" -TestCases $TestCases {
    param($Feature)
    $results = $fixes.Where{$_.Feature -in $Feature}
    Compare-Object (Get-SQLDiagFix -Feature $Feature) $results | Should BeNullOrEmpty
}

and the results look like

05 - single feature test results.png

Random Dynamic Multiple Feature Test Cases

I also need to test that Get-SQLDiagFix returns the correct results for multiple features and, whilst I could create those by hand like the products example above, why not let PowerShell do that for me?

I created 10 test cases. Each one has a random number of features (at least 2, because we are testing multiples). I can then write one test to make use of those test cases. This is how I do that

## Generate 10 TestCases of a random number of Features
$TestCases = @()
$x = 10
While ($x -gt 0) {
    ## We are testing multiples so we need at least 2
    $Number = Get-Random -Maximum $Features.Count -Minimum 2
    $Test = @()
    While ($Number -gt 0) {
        $Test += Get-Random $Features
        $Number--
    }
    ## Need unique values
    $Test = $Test | Select-Object -Unique
    $TestCases += @{Feature = $Test}
    $x--
}
It "Returns the correct results with multiple features <Feature>" -TestCases $TestCases {
 param($Feature)
 $results = $fixes.Where{$_.Feature -in $Feature}
 Compare-Object (Get-SQLDiagFix -Feature $Feature) $results | Should BeNullOrEmpty
 }

Now there are 10 tests, each with a random number of features, and the results look like this. Each time the test is run it will use a different set of features for each of the 10 tests, but I will know that I am testing that the code will return the correct results for multiple features.

06 - multiple features.png
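
One aside, and an assumption of mine rather than something these tests do: if a random combination ever fails and you want to reproduce it, you can seed the random number generator before generating the test cases

## Make subsequent Get-Random calls repeatable for this session
Get-Random -SetSeed 42 | Out-Null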

Two Sets of Test Cases?

It is also possible for Get-SQLDiagFix to have one or more products and one or more features passed as parameters, which obviously also need to be tested to ensure the code is returning the correct results. As Pester is just PowerShell we can use normal PowerShell code. This means that I can test for a single product and a single feature using a foreach loop and Test Cases like this

foreach ($Product in $Products) {
    $TestCases = @()
    $Features = Get-SQLDiagFeature -Product $Product
    $Features | Foreach-Object {$TestCases += @{Feature = $_}}
    It "Returns the correct results for a single product parameter $Product with a single feature <Feature>" -TestCases $TestCases {
        param($Feature)
        $results = $fixes.Where{$_.Product -eq $product -and $_.Feature -in $Feature}
        Compare-Object (Get-SQLDiagFix -Product $Product -Feature $Feature) $results | Should BeNullOrEmpty
    }
}

To test for a single product and multiple features I use this code

foreach ($Product in $Products) {
    ## Generate 10 TestCases of a random number of Features
    $TestCases = @()
    $x = 10
    While ($x -gt 0) {
        ## We are testing multiples so we need at least 2
        $Number = Get-Random -Maximum $Features.Count -Minimum 2
        $Test = @()
        While ($Number -gt 0) {
            $Test += Get-Random $Features
            $Number--
        }
        ## Need unique values
        $Test = $Test | Select-Object -Unique
        $TestCases += @{Feature = $Test}
        $x--
    }
    It "Returns the correct results for a single product parameter $Product with multiple features <Feature>" -TestCases $TestCases {
        param($Feature)
        $Test = (Get-SQLDiagFix -Product $Product -Feature $Feature)
        ## If there are no results Compare-Object bombs out even though it is correct
        ## This is a risky fix for that
        if ($Test) {
            $results = $fixes.Where{$_.Product -eq $product -and $_.Feature -in $Feature}
            Compare-Object $Test $results | Should BeNullOrEmpty
        }
    }
}

Because it is dynamically creating the values for the two parameters, I have to check that there are some results before calling Compare-Object, as Compare-Object will throw an error if the object to be compared is empty. I need to do this because it is possible for the test to pick a combination of products and features for which there are no fixes in the results.
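
You can see that failure mode for yourself in a console; Compare-Object refuses a null reference object outright:

Compare-Object $null @(1, 2, 3)
## Cannot bind argument to parameter 'ReferenceObject' because it is null.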

The reason I have commented it as a risky fix is that if someone changes the code and Get-SQLDiagFix does not return any results, then the test would not run and therefore there would be no information from this test that the code had a bug. However, in this suite of tests there are many tests that would fail in that scenario, but be careful in your own usage.

I test for multiple products with a single feature and multiple products with multiple features like this

$Products = @('SQL Server 2012 SP3', 'SQL Server 2016 SP1'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2')
foreach ($Product in $Products) {
    $TestCases = @()
    $Features = Get-SQLDiagFeature -Product $Product
    $Features | Foreach-Object {$TestCases += @{Feature = $_}}
    It "Returns the correct results for multiple products parameter $Product with a single feature <Feature>" -TestCases $TestCases {
        param($Feature)
        $results = $fixes.Where{$_.Product -in $product -and $_.Feature -in $Feature}
        Compare-Object (Get-SQLDiagFix -Product $Product -Feature $Feature) $results | Should BeNullOrEmpty
    }
}
$Products = @('SQL Server 2012 SP3', 'SQL Server 2016 SP1'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'),
@('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2')
foreach ($Product in $Products) {
    ## Generate 10 TestCases of a random number of Features
    $TestCases = @()
    $x = 10
    While ($x -gt 0) {
        ## We are testing multiples so we need at least 2
        $Number = Get-Random -Maximum $Features.Count -Minimum 2
        $Test = @()
        While ($Number -gt 0) {
            $Test += Get-Random $Features
            $Number--
        }
        ## Need unique values
        $Test = $Test | Select-Object -Unique
        $TestCases += @{Feature = $Test}
        $x--
    }
    It "Returns the correct results for multiple products parameter $Product with multiple features <Feature>" -TestCases $TestCases {
        param($Feature)
        $Test = (Get-SQLDiagFix -Product $Product -Feature $Feature)
        ## Annoyingly if there are no results Compare-Object bombs out even though it is correct
        ## This is a risky fix for that
        if ($Test) {
            $results = $fixes.Where{$_.Product -in $product -and $_.Feature -in $Feature}
            Compare-Object $Test $results | Should BeNullOrEmpty
        }
    }
}

You can see all of the unit tests for the SQLDiagAPI module in my GitHub repository

The module is available on the PowerShell Gallery which means that you can install it using

Install-Module SQLDiagAPI

Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module, a module to consume the SQL Server Diagnostics API.

I have been asked a few times what the process is for creating a module, using Github and developing with Pester and whilst this is not a comprehensive how-to I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.

Plaster Template

When you create a module there are a number of files that you need and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub repository. Whilst you could write a PowerShell script to create all of those, there is no need as there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default 'scaffold' for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette's post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but as with anything, if you find yourself repeating steps then it is time to automate it

With my Plaster Template created I could simply run

$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
    DestinationPath = "GIT:\SQLDiagAPI"
    FullName        = "Rob Sewell"
    ModuleName      = "SQLDiagAPI"
    ModuleDesc      = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details "
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "SQLDiagAPI"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it and, as I am writing code that others will use, I must consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.
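
For completeness, this is a sketch of how the key can be read back out; GetNetworkCredential().Password exposes the stored secret as plain text, which is what gets passed to the API:

$APIKey = (Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred").GetNetworkCredential().Password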

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more

The first function I wanted to write was to get the recommendations from the API. I decided to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists, otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: check for the file holding the API Key using Test-Path and write a warning if the file did not exist.

Mocking

However this is not going to work if I have already saved the key to the file. The test needs to not be reliant on anything external. I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
Mock Test-Path {$false}
Mock Write-Warning {"Warning"}

This is what happens when you run this test. When there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case false. For Write-Warning it will return a string of Warning.

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used"{
Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return false, which will invoke Write-Warning in my code, and in the test that will return "Warning". So if I have written my code correctly the test will pass without actually running the real Test-Path and interacting with my system, or running Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests

It 'Checks the Mock was called for Test-Path' {
    $assertMockParams = @{
        'CommandName' = 'Test-Path'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to $false it would test that the mock was called at least the number of times specified. This is another check that I really have called the Mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.
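
That test follows exactly the same pattern; a minimal sketch, assuming the mock should also have been called exactly once:

It 'Checks the Mock was called for Write-Warning' {
    $assertMockParams = @{
        'CommandName' = 'Write-Warning'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}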

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed: I don't have a function named Get-SQLDiagRecommendations yet.

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests including checking that all of the scripts pass the Script Analyser rules and that all of the functions have the correct help. (thank you June Blender for that test)

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails as this will show the Describe and Context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen.

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.
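
For example, the Show values can be combined (the exact values available vary by Pester version, so check the help on your machine); this shows just the failures plus the end-of-run summary:

Invoke-Pester .\tests -Show Failed, Summary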

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don't want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to only run a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build , Unit{
Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with Show Fails to reduce the output, my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests run in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher

And now our tests pass

06 - Passing Tests

If you look at the API documentation the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty and without it we can’t call the API so I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
Mock Get-MachineGUID {} -Verifiable
Mock Write-Warning {"Warning"} -Verifiable
Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block and passing an APIKey parameter to Get-SQLDiagRecommendations so that we don't hit the Write-Warnings we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}

 

and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}

Rinse and repeat

That is basically the process that I follow to write a function. I just write a test, write some code to fix it, write another test, write some code to fix it. I keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my Unit test had passed I run

 Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

To check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns, as I think the command has to be called Get-SQLDiagRecommendations with an S! I have tagged the Script Analyzer tests with a tag so I can just run those tests.

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

Which tests if I have the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code's PowerShell extension to add the help scaffolding really easily as I describe here.

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this, in fact I encourage it. Please fork and give PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam

PowerShell Module for the SQL Server Diagnostics API – 1st Command Get-SQLDiagRecommendations

I saw this blog post about the SQL Server Diagnostics add-on to SSMS and API and thought I would write some PowerShell to work with it as all of the examples use other languages.

SQL Server Diagnostics API

The Diagnostic Analysis API allows you to upload memory dumps to be able to debug and self-resolve memory dump issues from your SQL Server instances and receive recommended Knowledge Base (KB) article(s) from Microsoft, which may be applicable for the fix.

There is also the Recommendations API to view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU which can be filtered by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc).

I have written a module to work with this API. It is not complete. It only has one command as of now but I can see lots of possibilities for improvement and further commands to interact with the API fully and enable SQL Server professionals to use PowerShell for this.

Storing the API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it. I saved my API Key using the Export-CliXML command as described by Jaap Brasser here .

 Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred" 

You need to enter a username even though it is not used and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.

The commands in the module will look for the API Key in that SQLDiag.Cred file by default, but you can also just use the APIKey parameter.

Get-SQLDiagRecommendations

The first function in the module is Get-SQLDiagRecommendations. All this function does is connect to the Recommendations API and return an object containing the information about the latest Cumulative Updates.

If you have already saved your API Key as described above you can use

 Get-SQLDiagRecommendations 

If you want to enter the API Key manually you would use

 Get-SQLDiagRecommendations -APIKey XXXXXXXX

Either way it will return a PowerShell object containing all of the information which looks like this.

07 - Get-SQLRecommendations

One of the beauties of PowerShell is that you can pass objects down a pipeline and use them in other commands. Also, your only limit is your imagination.

You want to export to CSV, HTML, Text file?
Email, Import to database, store in Azure storage?
Embed in Word, Excel  on a SharePoint site?

All of this and much, much more is easily achievable with PowerShell.
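
A couple of hedged sketches along those lines, not something the module provides; note that Export-Csv will flatten the nested recommendations object rather crudely, so you may want to select properties first:

Get-SQLDiagRecommendations | Export-Csv -Path .\Recommendations.csv -NoTypeInformation
Get-SQLDiagRecommendations | ConvertTo-Html | Out-File .\Recommendations.html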

In the future this command will feed other functions in the module that will display this information in a more useful fashion. I am thinking of commands like

Get-SQLDiagRecommendations |
Get-SQLDiagLatestCU -Version SQL2012

or

Get-SQLDiagRecommendations |
Get-SQLDiagKBArticle -Version SQL2012 -Feature BackupRestore

If you have any ideas please join in on GitHub

JSON

For now though you can use Get-SQLDiagRecommendations to output the results to JSON so that you can examine them or consume them.

If you use VS Code follow the steps here and you can export the results to the current file with

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-CurrentFile

Which looks like this

08 - OutCurrentFile

It shows the entire JSON object containing all of the information about all of the latest CUs for SQL Server 2012 and up and each of the KB Articles. I have minimised several of the nodes to try and show as much as possible for SQL Server 2012 SP3.

If you do not use VS Code or you want to export straight to a file then you can

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-File -FilePath PATHTOFILE

Out-GridView

I like Out-GridView, so I quickly gathered the Product, Cumulative Update, Feature Type, KB Number and URL and sent them to Out-GridView like this

$recommendations = Get-SQLDiagRecommendations
$KBs = foreach ($recommendation in $recommendations.Recommendations) {
    $Product = $recommendation.Product
    $CU = $recommendation.Title
    $CreatedOn = $recommendation.CreatedOn
    foreach ($fix in $recommendation.Content.RelevantFixes) {
        $feature = $fix.Title
        foreach ($Kb in $fix.KbArticles) {
            [PSCustomObject]@{
                CreatedOn = $CreatedOn
                Product   = $Product
                CU        = $CU
                Feature   = $feature
                KB        = $Kb.Rel
                Link      = $Kb.href
            }
        }
    }
}
$KBs | Ogv

As you can filter easily in Out-GridView I filtered by 2012 and this is what it looks like

09 - Out-GridView

This will enable you to quickly see any information that you require about the Cumulative Updates for SQL 2012, 2014 and 2016

Github

You can find the module on GitHub. There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this, in fact I encourage it. Please fork and give PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

Tomorrow I have a post explaining the process I used to create the module and how I used Test Driven Development with Pester to write this function.

Why VS Code Increases my Productivity

Last week I was showing a co-worker some PowerShell code and he asked what the editor was that I was using. Visual Studio Code I said. Why do you use that? What does it do?

This is what I showed him

Runs on any Operating System

Code (as I shall refer to it) is a free, lightweight, open source editor which runs on all the main operating systems, so you have the same experience in Linux as on Windows and there is less to learn.

Extensions

You can add new languages, themes, debuggers and tools from the extensions gallery to reduce the number of programmes you have open and the need to switch between programmes

You can add extensions using CTRL + SHIFT + X and searching in the bar

01 - Extensions

or by going to the Extensions gallery searching for the extensions and copying the installation command

02 - extensions gallery.PNG

Debugging

There is a rich debugging experience built in.

03 - debugging.PNG

You can learn about debugging from the official docs and Keith Hill wrote a blog post on Hey Scripting Guys about debugging PowerShell

Intellisense

An absolute must to make life simpler. Code has IntelliSense for PowerShell and T-SQL, which I use the most, but also for many more languages. Read more here.

Git integration

I love the Git integration, it makes it so easy to work with GitHub for me. I can see diffs, commit, undo commits nice and simply. Just open the root folder of the repository and it's there.

04 - git

This page will give you a good start on using git with Code

No distractions

With full screen mode (F11) or Zen mode (CTRL + K, Z) I can concentrate on coding and not worry about distractions.

Stay in one programme and do it all

I have a Markdown document, a PowerShell script and a T-SQL script all in one Git repository and I can work on all of them and version control in one place. The screencast below also shows some of the new capabilities available in the insiders version. I managed to leave the screen recording dialogue open as well, apologies, and the mistake was deliberate!

I used the GitLens and SQL beautify extensions as well as the dbatools module in that demo

That’s why I am using Code more and more these days, hope it helps

Happy Automating!

 

 

Testing SQL Server Access to a share with PowerShell using dbatools

A good security practice is to backup our SQL Servers to a network share but not allow users to be able to browse the share. How can we ensure that our SQL Server has access or test it if it has been set up by someone else?

Let's set this up.

First let's create a share for our backups

$FileShareParams = @{
    Name = 'SQLBackups'
    Description = 'The Place for SQL Backups'
    SourceVolume = (Get-Volume -DriveLetter D)
    FileServerFriendlyName = 'beardnuc'
}
New-FileShare @FileShareParams

This will create us a share called SQLBackups on the D drive of the server beardnuc, but without any permissions, so let's grant permissions to everyone.

$FileSharePermsParams = @{
    Name = 'SQLBackups'
    AccessRight = 'Modify'
    AccountName = 'Everyone'
}
Grant-FileShareAccess @FileSharePermsParams

01 file share.PNG

The share is created and I can access it and create a file

02 - create a file.PNG

and as we can see the permissions are granted for everyone

03 -permissions.PNG

OK, that's not what we want, so let's revoke that permission.

Revoke-FileShareAccess -Name SQLBackups -AccountName 'Everyone'

04 revoked.PNG

Now let's add permissions just for our SQL Server Service Accounts

$FileSharePermsParams = @{
    Name = 'SQLBackups'
    AccessRight = 'Modify'
    AccountName = 'SQL_DBEngine_Service_Accounts'
}
Grant-FileShareAccess @FileSharePermsParams

and explicitly deny our DBA user accounts access to it.

$BlockFileShareParams = @{
    Name = 'SQLBackups'
    AccountName = 'SQL_DBAs_The_Cool_Ones'
}
Block-FileShareAccess @BlockFileShareParams

In the GUI our permissions look like this, and when I try to access as THEBEARD\Rob I get this

07 -no permissions.PNG

So how can I check that I have access from my SQL Server? Sure, I could get the password of the SQL Service account and run a process as that account. I'm not saying that's a good idea, but it could be done. Of course it couldn't be done if you are using Managed Service Accounts or Group Managed Service Accounts, but there is a way.

Enter dbatools to the rescue 😉 The dbatools module (for those that don't know) is a PowerShell module written by amazing folks in the community, designed to make administrating your SQL Server significantly easier using PowerShell. The instructions for installing it are available here. It comprises 182 separate commands at present.

There is a command called Test-SqlPath. As always, start with Get-Help

Get-Help Test-SqlPath -Full

08 - get help.PNG

So it uses master.dbo.xp_fileexist to determine if a file or directory exists, from the perspective of the SQL Server service account. It has three parameters: SqlServer, Path and SqlCredential for SQL Authentication. Of course, if that stored procedure is disabled on your estate then this command will not be of use to you. With that in mind, let's run it and see what it does

Test-SqlPath -SqlServer sql2016n1 -Path \\beardnuc\SQLBackups
09 - path test

That's good, I have access, so let's back a database up.

Backup-SqlDatabase -ServerInstance SQL2016N1 -Database DBA-Admin -CopyOnly -BackupAction Database -BackupFile '\\BeardNuc\SQLBackups\Test-DBA-Admin.bak'

Ah, I can't show you as I don't have access. Better get in touch with the data centre admin to check 😉 Luckily, I am my own data centre admin and have another account I can use 🙂
10 - check

So what if we want to test all of our servers for access to the new share? I tried this

$SQLServers = (Get-VM -ComputerName beardnuc).Where{$_.Name -like '*SQL*' -and $_.Name -notlike 'SQL2008Ser2008'}.Name
Test-SqlPath -SqlServer $SQLServers -Path '\\BeardNuc\SQLBackups'

but unfortunately I hit an error
11 - error.PNG
It seems that at the moment (version 0.8.942) this command only accepts a single server. This is what you should do if you find either a bug or have an idea for dbatools: raise an issue on GitHub.

Navigate to the GitHub repository and click on issues. I generally search for the command name in the issues to see if someone else has beaten me to it.

12 - issues

If those issues don't match yours then click the green New Issue button.

There is a template to fill in which asks you to specify your Windows, PowerShell and SQL versions, with the commands that you need to do so included. Please do this and paste the results in, as it will help the folks to replicate the issues in the case of more complicated bugs.

I created this issue with a potential fix as well. You don't have to do that, just letting the folks know is good enough.
Until that issue is resolved, you can check all of your servers as follows

$SQLServers = (Get-VM -ComputerName beardnuc).Where{$_.Name -like '*SQL*' -and $_.Name -notlike 'SQL2008Ser2008'}.Name
foreach ($Server in $SQLServers) {
    $Test = Test-SqlPath -SqlServer $Server -Path '\\BeardNuc\SQLBackups'
    [PSCustomObject]@{
        Server = $Server
        Result = $Test
    }
}
13 - servers.PNG
and if I remove one of the service accounts from the group, restart the service and run the command again

14 - one fails.PNG
So that's how to use dbatools to check that your SQL Servers have access to a network share, and also how to create an issue on GitHub for dbatools and help it to get even better.

Happy Automating

NOTE – The major 1.0 release of dbatools due in the summer of 2017 may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

The SQL Server Community Collaborative GitHub Organisation is born

My wonderful friend Chrissy LeMaire and I are the creators of two GitHub repositories for SQL Server and PowerShell called dbatools and dbareports

If you are working with SQL Server I highly recommend that you take a look at the vast number of commands available to you at dbatools which will help you complete tasks within SQL Server especially for Instance migrations and also a growing number of best practice implementations

Both of these modules are not just the work of one person any more. We have over 20 people who have collaborated on the modules THANK YOU ALL and more that have provided guidance and comments via the Slack Channels in the SQL Server Community Slack https://sqlps.io/slack and via the Trello boards https://dbatools.io/trello and https://dbareports.io/trello

At SQL Saturday Cambridge this weekend I was proud to join Chrissy in her presentation as we talked about both modules. Here's a fabulous picture of us with Buck Woody

wp_20160910_10_14_58_pro

We had discussed previously that it didn’t feel quite right that these community tools were under our own personal accounts and it also caused some administration issues with allowing access. So with that in mind after a naming discussion in the slack channel we created an organisation to hold them both

 SQL Server Community Collaborative

is born at https://github.com/sqlcollaborative

Nothing much changes except the name. We have even found that all the old links work and GitHub Desktop updated. We will continue to make great commands with all of our fantastic collaborators. Discussions will happen in Slack and organisation in Trello and we will continue to grow and learn and teach and share and create together.

We would love you to come and join us

Remove-SQLDatabaseSafely My First Contribution to DBATools

What is DBA Tools?

A collection of modules for SQL Server DBAs. It initially started out as ‘sqlmigration’, but has now grown into a collection of various commands that help automate DBA tasks and encourage best practices.

You can read more about it here and it is freely available for download on GitHub. I thoroughly recommend that you watch this quick video to see just how easy it is to migrate an entire SQL instance in one command (longer session here).

Installing it is as easy as

Install-Module dbatools

which will get you over 80 commands. Visit https://dbatools.io/functions/ to find out more information about them.

cmdlets

The journey to Remove-SQLDatabaseSafely started with William Durkin b | t who presented to the SQL South West User Group (you can get his slides here).

Following that session I wrote a PowerShell script to gather information about the last used date for databases, which I blogged about here, and then a T-SQL script to take a final backup and create a SQL Agent Job to restore from that backup, which I blogged about here. The team have used this solution (updated to load the DBA Database and a report instead of using Excel) ever since, and it proved invaluable when a read-only database was dropped and could quickly and easily be restored with no fuss.

I was chatting with Chrissy LeMaire who founded DBATools b | t about this process and when she asked for contributions in the SQL Server Community Slack I offered my help and she suggested I write this command. I have learnt so much. I thoroughly enjoyed and highly recommend working on projects collaboratively to improve your skills. It is amazing to work with such incredible professional PowerShell people.

I went back to the basics and thought about what was required and watched one of my favourite videos again, Grant Fritchey's Backup Rant.

I decided that the process should be as follows

  1. Performs a DBCC CHECKDB
  2. Database is backed up WITH CHECKSUM
  3. Database is restored with VERIFY ONLY on the source
  4. An Agent Job is created to easily restore from that backup
  5. The database is dropped
  6. The Agent Job restores the database
  7. Performs a DBCC CHECKDB and drops the database for a final time

This (hopefully) passes all of Grant's checks. This is how I created the command.

I check that the SQL Agent is running, otherwise we won't be able to run the job. I use a while loop with a timeout like this

$agentservice = Get-Service -ComputerName $ipaddr -Name $serviceName
if ($agentservice.Status -ne 'Running') {
    $agentservice.Start()
    $timeout = New-TimeSpan -Seconds 60
    $sw = [diagnostics.stopwatch]::StartNew()
    $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    while ($agentstatus -ne 'Running' -and $sw.elapsed -lt $timeout) {
        $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    }
}

There are a lot more checks and logic than I will describe here to make sure that the process is as robust as possible. For example, the script can exit after errors are found using DBCC CHECKDB or continue and label the database backup file and restore job appropriately. Unless the force option is used it will exit if the job name already exists. We have tried to think of everything but if something has been missed or you have suggestions let us know (details at end of post)

The only thing I didn’t add was a LARGE RED POP UP SAYING ARE YOU SURE YOU WANT TO DROP THIS DATABASE but I considered it!!

Performs a DBCC CHECKDB

Running DBCC CHECKDB with PowerShell is as easy as this

$sourceserver = New-Object Microsoft.SQLServer.Management.Smo.Server "ServerName"
$db = $sourceserver.databases[$dbname]
$null = $db.CheckTables('None')

you can read more on MSDN
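
CheckTables returns a StringCollection of any error messages that DBCC produces, so if you would rather surface them than discard them, a small hedged sketch along these lines works:

$CheckResults = $db.CheckTables('None')
## Surface anything DBCC CHECKDB reported
$CheckResults | ForEach-Object { Write-Warning $_ }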

Database is backed up WITH CHECKSUM

Stuart Moore is my go to for doing backups and restores with SMO

I ensured that the backup was performed with checksum like this

$backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$backup.Action = [Microsoft.SqlServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Final Full Backup of $dbname Prior to Dropping"
$backup.Database = $dbname
$backup.Checksum = $True
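
The snippet above only configures the backup; a hedged sketch of actually running it, assuming $filename holds the backup path and $sourceserver is the SMO server object from earlier:

$device = New-Object Microsoft.SqlServer.Management.Smo.BackupDeviceItem($filename, 'File')
$backup.Devices.Add($device)
$backup.SqlBackup($sourceserver)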

Database is restored with VERIFY ONLY on the source

I used SMO all the way through this command and performed the restore verify only like this

$restoreverify = New-Object 'Microsoft.SqlServer.Management.Smo.Restore'
$restoreverify.Database = $dbname
$restoreverify.Devices.AddDevice($filename, $devicetype)
$result = $restoreverify.SqlVerify($sourceserver)
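
SqlVerify returns a boolean, so a simple hedged follow-up is to warn when verification fails:

if (-not $result) {
    Write-Warning "The backup file $filename failed restore verification"
}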

An Agent Job is created to easily restore from that backup

First I created a category for the Agent Job

Function New-SqlAgentJobCategory {
    param (
        [string]$categoryname,
        [object]$jobServer
    )
    if (!$jobServer.JobCategories[$categoryname]) {
        if ($Pscmdlet.ShouldProcess($sourceserver, "Creating Agent Job Category $categoryname")) {
            try {
                Write-Output "Creating Agent Job Category $categoryname"
                $category = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobCategory
                $category.Parent = $jobServer
                $category.Name = $categoryname
                $category.Create()
                Write-Output "Created Agent Job Category $categoryname"
            }
            catch {
                Write-Exception $_
                throw "FAILED : To Create Agent Job Category $categoryname - Aborting"
            }
        }
    }
}

and then generated the T-SQL for the restore step by using the Script method on the Restore SMO object.

This is how to create an Agent Job

$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job $jobServer, $jobname
$job.Name = $jobname
$job.OwnerLoginName = $jobowner
$job.Description = "This job will restore the $dbname database using the final backup located at $filename"

and then to add a job step to run the restore command

$jobStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep $job, $jobStepName
$jobStep.SubSystem = 'TransactSql' # 'PowerShell'
$jobStep.DatabaseName = 'master'
$jobStep.Command = $jobStepCommand
$jobStep.OnSuccessAction = 'QuitWithSuccess'
$jobStep.OnFailAction = 'QuitWithFailure'
if ($Pscmdlet.ShouldProcess($destination, "Creating Agent JobStep on $destination")) {
    $null = $jobStep.Create()
}
$job.ApplyToTargetServer($destination)
$job.StartStepID = $jobStartStepid
$job.Alter()

The database is dropped

We try 3 different methods to drop the database

$server.KillDatabase($dbname)
$server.databases[$dbname].Drop()
$null = $server.ConnectionContext.ExecuteNonQuery("DROP DATABASE $dbname")
The Agent Job restores the database

To run the Agent Job I call the Start method of the Job SMO Object

$job = $destserver.JobServer.Jobs[$jobname]
$job.Start()
$status = $job.CurrentRunStatus
while ($status -ne 'Idle') {
    Write-Output "Restore Job for $dbname on $destination is $status"
    $job.Refresh()
    $status = $job.CurrentRunStatus
    Start-Sleep -Seconds 5
}
Then we drop the database for the final time with the confidence that we have a safe backup and an easy one-click method to restore it from that backup (as long as the backup is in the same location).

There are further details on the functions page on dbatools.

Some videos of it in action are on YouTube http://dbatools.io/video

You can take a look at the code on GitHub here

You can install it with

Install-Module dbatools

You can provide feedback via the Trello Board or discuss it in the #dbatools channel in the SQL Server Community Slack.

You too can also become a contributor https://dbatools.io/join-us/ Come and write a command to make it easy for DBAs to (this bit is up to your imagination).