.NET PowerShell Notebooks – Using Pester

My last post had a lot of information about the new .NET PowerShell notebooks including installation instructions.

.NET Notebooks are Jupyter Notebooks that use .NET Core to enable C#, F# and PowerShell kernels.

Use Cases

One of the main benefits that I see for Jupyter Notebooks for Ops folk is that the results of the query are saved with the notebook. This makes them fantastic for Incident resolution.

If you have an incident at 3am and you know that you will need that information in the wash-up meeting the next day, then instead of copying and pasting results into a OneNote document or a text file, you can simply run the queries in a notebook and save it.

In the meeting, you can simply open the notebook and the results will be available for everyone to see.

Even better, if you have a template notebook for those scenarios, you can then compare incidents to previous occurrences.

Using Pester

Using Pester to validate that an environment is as you expect it is a good resource for incident resolution, potentially enabling you to quickly establish an area to concentrate on for the issue. However, if you try to run Pester in a .NET Notebook you will receive an error

Describe: 
Line |
   3 | Describe "Checking Problem ...... by $($ENV:USERDOMAIN) $($ENV:UserName)" {

     | ^ The 'Describe' command was found in the module 'Pester', but the module could not be loaded. For more information, run 'Import-Module Pester'.

Fixing it

When you try to Import-Module Pester you get the following error

Get-Command: C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psm1
Line |
  94 |     $script:SafeCommands['Get-CimInstance'] = Get-Command -Name Get-CimInstance -Module CimCmdlets @safeCommandLookupParameters

     |                                               ^ The term 'Get-CimInstance' is not recognized as the name of a
     | cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included,
     | verify that the path is correct and try again.
 Import-Module: The module to process 'Pester.psm1', listed in field 'ModuleToProcess/RootModule' of module manifest 'C:\Users\mrrob\Documents\PowerShell\Modules\Pester\4.9.0\Pester.psd1' was not processed because no valid module was found in any module directory. 

That's odd; why is it failing there? Dongbo Wang from the PowerShell team explains in the issue that I raised:

Yes, it was the CimCmdlets module from the system32 module path that got imported (via the WinCompat feature added in PS7). This is because currently the PS kernel don’t ship all the built-in modules along with it …
The built-in modules are not published anywhere and are platform specific, it’s hard for an application that host powershell to ship them along. We have the issue PowerShell/PowerShell#11783 to track this work.

The way to resolve this is to import the CimCmdlets module from your local PowerShell 7 installation until the issue is resolved

Import-Module 'C:\program files\powershell\7-preview\Modules\CimCmdlets\CimCmdlets.psd1'
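If your PowerShell 7 installation is somewhere else (the path above is from a preview build), a minimal sketch that locates the manifest instead of hard-coding the version folder might look like this (my addition, adjust the root to suit):

# find the CimCmdlets manifest under the PowerShell install root and import it
$cimManifest = Get-ChildItem 'C:\Program Files\PowerShell' -Recurse -Filter CimCmdlets.psd1 |
    Select-Object -First 1
Import-Module $cimManifest.FullName
Import-Module Pester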

Then you can run your Pester checks.

You can see all of this, including the results, in the notebook that I have created and shared on GitHub, and also below as a gist embedded in this blog post.

Sharing Code AND Results 🙂


Notebooks – A brilliant way of sharing what you did and the results that you got, enabling others to follow along. You can do this with this notebook: download it, open it in your Jupyter Lab, and you will be able to run it and see all of the errors and the fix on your machine.

New .NET Notebooks are here – PowerShell 7 notebooks are here.

Data science folk use notebooks for documentation and to show re-runnable research. Azure Data Studio includes this notebook functionality: it added a SQL kernel (where, with a little bit of faffing, you could run PowerShell) and then a Python kernel that enabled PowerShell. It seems that notebooks are so cool that everyone is creating them these days! I was browsing Twitter when I saw this tweet.

PowerShell 7 Notebooks 🙂

A notebook experience for PowerShell 7? That sounds amazing. This will enable a true cross-platform PowerShell notebook experience, which is lacking from the Python version as it uses Windows PowerShell on Windows and PowerShell Core on other OSes.

The first thing I asked was: will this come to Azure Data Studio? I got an immediate response from Sydney Smith, PowerShell Program Manager, saying it is on the roadmap.

Install dependencies

To be able to run the notebook, you need to install some dependencies. First install the .NET Core SDK, which you can download from https://dotnet.microsoft.com/download. This needs admin permissions to install.

You also need a Python installation. You can use Anaconda, which you can download from https://www.anaconda.com/distribution/. This does not need admin permissions to install.

Add Anaconda to Windows Terminal

I have added the Anaconda prompt to Windows Terminal so that I have one entry point for all my CLIs. Open the settings file and add the code below. (It will also give you an icon and background.)

        {
            // Make changes here to the Anaconda.exe profile
            "guid": "{0caa0dad-35be-5f56-a7ff-afceeeaa6101}",
            "name": "Anaconda",
            "commandline": "cmd.exe /K C:\\Users\\mrrob\\Anaconda3\\Scripts\\activate.bat",
            "hidden": false,
            "backgroundImage": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "icon": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "backgroundImageAlignment": "topRight",
            "backgroundImageStretchMode": "uniform",
            "backgroundImageOpacity": 0.1
        }

and it appears in the drop down

With Anaconda installed, check that the kernel is available on your path. If like me you have Azure Data Studio installed, you will have additional kernels, but the important line here is

python3 C:\Users\USERNAME\Anaconda3\share\jupyter\kernels\python3

In Windows Terminal move to a PowerShell 7 prompt and install the dotnet interactive tool

dotnet tool install --global Microsoft.dotnet-interactive

Then you can install the .NET kernel in your Anaconda prompt using this command

dotnet interactive jupyter install

Sometimes new things have errors

I had an error when I tried this the first time

Could not execute because the specified command or file was not found.
Possible reasons for this include:
* You misspelled a built-in dotnet command.
* You intended to execute a .NET Core program, but dotnet-interactive does not exist.
* You intended to run a global tool, but a dotnet-prefixed executable with this name could not be found on the PATH.

This is easily fixed by adding %USERPROFILE%\.dotnet\tools to my path with set PATH=%PATH%;%USERPROFILE%\.dotnet\tools
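If you are at a PowerShell prompt rather than cmd, the equivalent for the current session is

$env:PATH += ";$env:USERPROFILE\.dotnet\tools"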

Running jupyter kernelspec list shows that the .NET kernel is installed for C Sharp, F Sharp and .NET PowerShell

Let's open a Notebook

Now you want to play with it!
You can run the lab environment using `jupyter lab`

This opens a browser

You can open existing Azure Data Studio PowerShell notebooks (but not SQL ones)

Sometimes new things have errors Part 2

Unfortunately, I get errors when trying to import Pester, which means I cannot use my dbachecks notebooks in this blog post. I have raised an issue on the repo here.

Create a New Notebook

But it is easy to create a new Notebook

In the launcher page click the .NET PowerShell button


This will open a new notebook in the directory that you launched the lab from. You can then add code, markdown and images to create your notebook, as I have described before here.

Once you have finished using the notebook lab, you can shut it down in the Anaconda prompt with CTRL + C

Here is a video of running a notebook which anyone can use to create a couple of Docker containers running SQL 2019 and query them with dbatools. You can find the notebook further down this post.

Sharing Notebooks

You can create notebooks to run common tasks. Even better, from the lab you can convert the notebook, including the results, to a variety of formats to share with other non-technical people. I used this functionality this week to export Azure Data Studio notebooks to HTML and PDF for a project manager 🙂

You can find the Export Notebook command in the File menu

Exporting to HTML did not export the images but it does include the results
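If you prefer the command line, the same conversion can be done with jupyter nbconvert; a quick sketch (the notebook name is just an illustration, and the PDF conversion additionally needs a LaTeX installation):

jupyter nbconvert --to html "Exploring dbatools.ipynb"
jupyter nbconvert --to pdf "Exploring dbatools.ipynb"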

You can share notebooks via GitHub – Either in a gist like this

or by providing a straight link to the notebook in GitHub https://github.com/SQLDBAWithABeard/Notebooks/blob/master/notebooks/Exploring%20dbatools.ipynb

You can also use Binder https://mybinder.org/

This uses Docker to create an interactive notebook. Create a GitHub repo like https://github.com/SQLDBAWithABeard/Notebooks (or just clone it), copy your notebooks into the notebooks folder, push the changes to GitHub, then go to https://mybinder.org/ and add the URL of your repository.

You can see what it looks like by clicking the button below which Binder creates for you

Unfortunately the kernel only supports Python for the moment but you can see the possibilities 🙂

Using Docker to run Integration Tests for dbachecks

My wonderful friend André Kamman wrote a fantastic blog post this week SQL Server Container Instances via Cloudshell about how he uses containers in Azure to test code against different versions of SQL Server.

It reminded me that I do something very similar to test dbachecks code changes. I thought this might make a good blog post. I will talk through how I do this locally as I merge a PR from another great friend Cláudio Silva who has added agent job history checks.

GitHub PR VS Code Extension

I use the GitHub Pull Requests extension for VS Code to work with pull requests for dbachecks. This enables me to see all of the information about the pull request, merge it, review it and comment on it, all from VS Code.

I can also see which files have been changed and which changes have been made

Once I am ready to test the pull request I perform a checkout using the extension

This will update all of the files in my local repository with all of the changes in this pull request

You can see at the bottom left that the branch changes from development to the name of the PR.

Running The Unit Tests

The first thing that I do is to run the unit tests for the module. These will test that the code is following all of the guidelines that we require and that the tests are formatted in the correct way for the Power BI to parse. I have blogged about this here and here, and we use this Pester in our CI process in Azure DevOps, which I described here.

I navigate to the root of the dbachecks repository on my local machine and run

$testresults = Invoke-Pester .\tests -ExcludeTag Integration -Show Fails -PassThru 

and after about a minute

Thank you Cláudio, the code has passed the tests 😉

Running Some Integration Tests

The difference between Unit tests and Integration tests in a nutshell is that the Unit tests are testing that the code is doing what is expected without any other external influences whilst the Integration tests are checking that the code is doing what is expected when running on an actual environment. In this scenario we know that the code is doing what is expected but we want to check what it does when it runs against a SQL Server and even when it runs against multiple SQL Servers of different versions.

Multiple Versions of SQL Server

As I have described before, my friend and former colleague Andrew Pruski b | t has many resources for running SQL in containers. This means that I can quickly and easily create fresh, uncontaminated instances of SQL 2012, 2014, 2016 and 2017.

I can create 4 instances of different versions of SQL in (a tad over) 1 minute. How about you?

Imagine how long it would take to run the installers for 4 versions of SQL and the pain you would have trying to uninstall them and make sure everything is ‘clean’. Even images that have been sysprep’d won’t be done in 1 minute.

Docker Compose Up?

So what is this magic command that has enabled me to do this? docker-compose uses a YAML file to define multi-container applications. This means that with a file called docker-compose.yml like this

version: '3.7'

services:
    sql2012:
        image: dbafromthecold/sqlserver2012dev:sp4
        ports:  
          - "15589:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2014:
        image: dbafromthecold/sqlserver2014dev:sp2
        ports:  
          - "15588:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2016:
        image: dbafromthecold/sqlserver2016dev:sp2
        ports:  
          - "15587:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2017:
        image: microsoft/mssql-server-windows-developer:2017-latest
        ports:  
          - "15586:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"

and in that directory just run

docker-compose up -d

and 4 SQL containers are available to you. You can interact with them via SSMS if you wish using localhost,PORTNUMBER. The port numbers in the above file are 15586, 15587, 15588 and 15589.

Now it must be noted, as I describe here, that I had first pulled the images to my laptop. The first time you run docker-compose up it will take significantly longer if you haven't already pulled the images (pulling them will take quite a while depending on your broadband speed).
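Before running the checks I like to confirm that the containers are up and accepting connections. A quick sanity check (my addition, assuming the sa password from the compose file above and the dbatools module):

# list the running containers and their port mappings
docker ps --format "table {{.Names}}\t{{.Ports}}"

# connect to one of them and show the version
$saPassword = ConvertTo-SecureString 'Password0!' -AsPlainText -Force
$sqlCred = New-Object System.Management.Automation.PSCredential('sa', $saPassword)
(Connect-DbaInstance -SqlInstance 'localhost,15589' -SqlCredential $sqlCred).VersionString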

Credential

The next thing is to save a credential to make it easier to automate. I use the method described by my PowerShell friend Jaap Brasser here. I run this code

$CredentialPath = 'C:\MSSQL\BACKUP\KEEP\sacred.xml'
Get-Credential | Export-Clixml -Path $CredentialPath

and then I can create a credential object using

$cred = Import-Clixml $CredentialPath

Check The Connections

I ensure a clean session by removing the dbatools and dbachecks modules and then import the local version of dbachecks and set some variables

$dbacheckslocalpath = 'GIT:\dbachecks\'
Remove-Module dbatools, dbachecks -ErrorAction SilentlyContinue
Import-Module $dbacheckslocalpath\dbachecks.psd1
$cred = Import-Clixml $CredentialPath
$containers = 'localhost,15589', 'localhost,15588', 'localhost,15587', 'localhost,15586'

Now I can start to run my Integration tests. First reset the dbachecks configuration and set some configuration values

# run the checks against these instances
$null = Set-DbcConfig -Name app.sqlinstance -Value $containers
# We are using SQL authentication
$null = Set-DbcConfig -Name policy.connection.authscheme -Value SQL
# sometimes its a bit slower than the default value
$null = Set-DbcConfig -Name policy.network.latencymaxms -Value 100 # because the containers run a bit slow!

Then I will run the dbachecks connectivity checks and save the results to a variable without showing any output

$ConnectivityTests = Invoke-DbcCheck -SqlCredential $cred -Check Connectivity -Show None -PassThru

I can then use Pester to check that dbachecks has worked as expected by testing if the FailedCount property returned is 0.

Describe "Testing the checks are running as expected" -Tag Integration {
    Context "Connectivity Checks" {
        It "All Tests should pass" {
            $ConnectivityTests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default settings"
        }
    }
}

What is the Unit Test for this PR?

Next I think about what we need to be testing for this PR. The Unit tests will help us.

Choose some Integration Tests

This check is checking the Agent job history settings and the unit tests are

  • It “Passes Check Correctly with Maximum History Rows disabled (-1)”
  • It “Fails Check Correctly with Maximum History Rows disabled (-1) but configured value is 1000”
  • It “Passes Check Correctly with Maximum History Rows being 10000”
  • It “Fails Check Correctly with Maximum History Rows being less than 10000”
  • It “Passes Check Correctly with Maximum History Rows per job being 100”
  • It “Fails Check Correctly with Maximum History Rows per job being less than 100”

So we will check the same things on real SQL Servers. First, though, we need to start the SQL Server Agent, as it is not started by default. We can do this as follows

docker exec -ti integration_sql2012_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2014_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2016_1 powershell start-service SQLSERVERAGENT
docker exec -ti integration_sql2017_1 powershell start-service SQLSERVERAGENT

Unfortunately, the agent service won't start in the SQL 2014 container, so I can't run agent integration tests for that container, but it's better than no integration tests.

This is What We Will Test

So we want to test if the check will pass with default settings. In general, dbachecks checks will pass against default instance, agent or database settings.

We also want the check to fail if the configured value for dbachecks is set to default but the value has been set on the instance.

We want the check to pass if the configured value for the dbachecks configuration is set and the instance (agent, database) setting matches it.

If You Are Doing Something More Than Once…

Let's automate that. We are going to be repeatedly running those three tests for each setting that we are running integration tests for. I have created three functions for this, again checking that FailedCount or PassedCount is 0 depending on the test.

function Invoke-DefaultCheck {
    It "All Checks should pass with default for $Check" {
        $Tests = Get-Variable "$($Check)default" -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default setting (Yes we may set some values before but you get my drift)"
    }
}
function Invoke-ConfigCheck {
    It "All Checks should fail when config changed for $Check" {
        $Tests = Get-Variable "$($Check)configchanged" -ValueOnly
        $Tests.PassedCount | Should -Be 0 -Because "We expect all of the checks to run and fail when we have changed the config values"
    }
}
function Invoke-ValueCheck {
    It "All Checks should pass when setting changed for $Check" {
        $Tests = Get-Variable "$($Check)valuechanged" -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass when we have changed the settings to match the config values"
    }
}

Now I can use those functions inside a loop in my Integration Pester Test

    $TestingTheChecks = @('errorlogscount','jobhistory')
    Foreach ($Check in $TestingTheChecks) {
        Context "$Check Checks" {
            Invoke-DefaultCheck
            Invoke-ConfigCheck
            Invoke-ValueCheck
        }
    }

Write Some Integration Tests

So for this new check I have added a value to the TestingTheChecks array and then I can test my checks. The default check can be run like this

# run the checks against these instances (SQL2014 agent wont start :-( ))
$null = Set-DbcConfig -Name app.sqlinstance -Value $containers.Where{$_ -ne 'localhost,15588'}
# by default all tests should pass on default instance settings
$jobhistorydefault = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Now I need to change the configurations so that they do not match the defaults and run the checks again

#Change the configuration to test that the checks fail
$null = Set-DbcConfig -Name agent.history.maximumjobhistoryrows -value 1000
$null = Set-DbcConfig -Name agent.history.maximumhistoryrows -value 10000
$jobhistoryconfigchanged = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Next we have to change the instance settings so that they match the dbachecks configuration and run the checks and test that they all pass.

We will (of course) use dbatools for this. First we need to find the command that we need

Find-DbaCommand jobserver

and then work out how to use it

Get-Help Set-DbaAgentServer -Detailed

There is an example that does exactly what we want 🙂 So we can run this.

$setDbaAgentServerSplat = @{
    MaximumJobHistoryRows = 1000
    MaximumHistoryRows = 10000
    SqlInstance = $containers.Where{$_ -ne 'localhost,15588'}
    SqlCredential = $cred
}
Set-DbaAgentServer @setDbaAgentServerSplat
$jobhistoryvaluechanged = Invoke-DbcCheck -SqlCredential $cred -Check JobHistory -Show None  -PassThru

Run the Integration Tests

And then we will check that all of the checks are passing and failing as expected

Invoke-Pester .\DockerTests.ps1

Integration Test For Error Log Counts

There is another integration test there for the error logs count. This works in the same way. Here is the code


#region error Log Count - PR 583
# default test
$errorlogscountdefault = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru
# set a value and then it will fail
$null = Set-DbcConfig -Name policy.errorlog.logcount -Value 10
$errorlogscountconfigchanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru

# set the value and then it will pass
$null = Set-DbaErrorLogConfig -SqlInstance $containers -SqlCredential $cred -LogCount 10
$errorlogscountvaluechanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None  -PassThru
#endregion
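Once all of the integration tests have run, the containers can be stopped and removed from the directory that holds the compose file with

docker-compose down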

Merge the Changes

So with all the tests passing I can merge the PR into the development branch and Azure DevOps will start a build. Ultimately, I would like to add the integration tests to the build as well, following André's blog post, but for now I used the GitHub Pull Request extension to merge the pull request into development, which started a build, and then merged that into master, which signed the code and deployed it to the PowerShell Gallery, as you can see here, and the result is

https://www.powershellgallery.com/packages/dbachecks/1.1.164

Checking Trace Flags with dbachecks, online docs and PSPowerHour

It's been a few weeks since I have blogged as I have been busy with a lot of other things. One of those is preparing for my SQL PASS Summit pre-con, which has led to me improving the CI/CD for dbachecks by adding auto-creation of online documentation, which you can find at https://dbachecks.readthedocs.io or by running Get-Help with the -Online switch for any dbachecks command.
Get-Help Invoke-DbcCheck -Online
I will blog about how dbachecks uses Azure DevOps to do this another time

PSPowerHour

The PowerShell community members Michael T Lombardi and Warren Frame have created PSPowerHour. PSPowerHour is “like a virtual User Group, with a lightning-demo format, and room for non-PowerShell-specific content. Eight community members will give a demo each PowerHour.”
Chrissy blogged about the first one on the dbatools blog
You can watch the videos on the YouTube channel and keep an eye out for more online PSPowerHours via Twitter or the GitHub page.
While watching the first group of sessions, Andrew Wickham demonstrated using dbatools with trace flags and I thought that needed to be added to dbachecks, so I created an issue. Anyone can do this to file improvements as well as bugs for members of the team to code.

Trace Flags

The previous release of dbachecks brought two new checks for trace flags: one for trace flags expected to be running and one for trace flags not expected to be running.

You will need to have installed dbachecks from the PowerShell Gallery to do this. This can be done using

Install-Module -Name dbachecks

Once dbachecks is installed you can find the checks using

Get-DbcCheck

You can filter using the Pattern parameter

Get-DbcCheck -Pattern traceflag
This will show you
  • the UniqueTag which will enable you to run only that check if you wish
  • AllTags which shows which tags will include that check
  • Config will show you which configuration items can be set for this check

The trace flag checks require the app.sqlinstance configuration, which is the list of SQL instances that the checks will run against. You can also specify the instances as a parameter for Invoke-DbcCheck as well.
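For example (a minimal sketch, using the sql0 instance that appears later in this post):

Invoke-DbcCheck -SqlInstance sql0 -Check TraceFlagsExpected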

The configuration for the expected trace flags is policy.traceflags.expected. By default it is set to null. You can see its current value using

Get-DbcConfig policy.traceflags.expected

So if you want to check that there are no trace flags running, then you can run

$instance = 'sql0'
Set-DbcConfig -Name app.sqlinstance -Value $instance
Invoke-DbcCheck -Check TraceFlagsExpected

Maybe this instance is required to have trace flag 1117 enabled so that all files in a filegroup grow equally. You can set the trace flag you expect to be running using

Set-DbcConfig -Name policy.traceflags.expected -Value 1117

Now when you run the check

Invoke-DbcCheck -Check TraceFlagsExpected

it fails
and gives you the error message

 [-] Expected Trace Flags 1117 exist on sql0 593ms
Expected 1117 to be found in collection @(), because We expect that Trace Flag 1117 will be set on sql0, but it was not found.

So we have a failing test. We need to fix that. We can use dbatools

Enable-DbaTraceFlag -SqlInstance $instance -TraceFlag 1117

This time when we run the check

Invoke-DbcCheck -Check TraceFlagsExpected

it passes.

If you just need to see what trace flags are enabled you can use

Get-DbaTraceFlag -SqlInstance $instance
Reset the configuration for the expected trace flag to an empty array and then set the configuration for trace flags we do not expect to be running to 1117

Set-DbcConfig -Name policy.traceflags.expected -Value @()
Set-DbcConfig -Name policy.traceflags.notexpected -Value 1117

and then run the trace flags not expected to be running check with

Invoke-DbcCheck -Check TraceFlagsNotExpected

It will fail as 1117 is still running
and give the message
[-] Expected Trace Flags 1117 to not exist on sql0 321ms
Expected 1117 to not be found in collection 1117, because We expect that Trace Flag 1117 will not be set on sql0, but it was found.
So to resolve this failing check we need to disable the trace flag, and we can do that with dbatools using

Disable-DbaTraceFlag -SqlInstance $instance -TraceFlag 1117

and now when we run the check

Invoke-DbcCheck -Check TraceFlagsNotExpected

it passes.

The checks also work with multiple trace flags, so you can set multiple values for trace flags that are not expected to be running

Set-DbcConfig -Name policy.traceflags.notexpected -Value 1117, 1118

and as we saw earlier, you can run both trace flag checks using

Invoke-DbcCheck -Check TraceFlag

You can use these, or any of the 95 available checks, to validate that your SQL instances, whether a single one or your whole estate, are as you expect them to be.
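If you want to run every check with your current configuration rather than picking individual ones, Invoke-DbcCheck also has an -AllChecks switch:

Invoke-DbcCheck -AllChecks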

Using the PowerShell AST to find a ForEach Method

In dbachecks we enable people to see what checks are available by running Get-DbcCheck. This gives a number of properties including the ‘type’ of check. This refers to the configuration item or parameter that is required to have a value for this check to run.

For example – Any check to do with SQL Agent is of type Sqlinstance because it requires an instance to be specified but a check for SPN is of type ComputerName because it requires a computer name to run.
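A quick way to see how the checks are spread across those types is to group on that property, for example

Get-DbcCheck | Group-Object -Property Type | Select-Object Count, Name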

Automation for the win

Because I believe in automation, I do not want to hard-code these values anywhere; instead we create them when the module is imported. We use a JSON file to feed Get-DbcCheck and populate that file when we import the module. This is done using the method that I described here, and means that whenever a new check is added it is automatically available in Get-DbcCheck without any extra work.

We use code like this
## Parse the file with AST
$CheckFileAST = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null)
## Old code we can use the describes
$Describes = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true)

@($describes).ForEach{
    $groups += $filename
    $Describe = $_.CommandElements.Where{$PSItem.StaticType.name -eq 'string'}[1]
    $title = $Describe.Value
    $Tags = $PSItem.CommandElements.Where{$PSItem.StaticType.name -eq 'Object[]' -and $psitem.Value -eq $null}.Extent.Text.ToString().Replace(', $filename', '')
    # Choose the type
    if ($Describe.Parent -match "Get-Instance") {
        $type = "Sqlinstance"
    }
    elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo") {
        $type = "ComputerName"
    }
    elseif ($Describe.Parent -match "Get-ClusterObject") {
        $Type = "ClusteNode"
    }
}
First we parse the code with the AST and store that in the CheckFileAST variable. Then we use the FindAll method to find any command elements whose first element is 'describe', which conveniently gets our Describe blocks. We can then simply match the Parent object, which holds the surrounding code, against each of the functions that we use to get the values passed to the tests (Get-ComputerName, Get-Instance, Get-ClusterObject) and set the type appropriately.
which when run against a check like this
Describe "Backup Path Access" -Tags BackupPathAccess, Storage, DISA, $filename {
    @(Get-Instance).ForEach{
        if ($NotContactable -contains $psitem) {
            Context "Testing Backup Path Access on $psitem" {
                It "Can't Connect to $Psitem" {
                    $false | Should -BeTrue -Because "The instance should be available to be connected to!"
                }
            }
        }
        else {
            Context "Testing Backup Path Access on $psitem" {
                $backuppath = Get-DbcConfigValue policy.storage.backuppath
                if (-not $backuppath) {
                    $backuppath = (Get-DbaDefaultPath -SqlInstance $psitem).Backup
                }
                It "can access backup path ($backuppath) on $psitem" {
                    Test-DbaSqlPath -SqlInstance $psitem -Path $backuppath | Should -BeTrue -Because 'The SQL Service account needs to have access to the backup path to backup your databases'
                }
                }
            }
        }
    }
}
will find the Describe block and get the title “Backup Path Access” and the tags BackupPathAccess, Storage, DISA, $filename, and then find the Get-Instance and set the type to SqlInstance.

Until Rob breaks it!

This had worked wonderfully well for six months or so of the life of dbachecks, but this week I broke it!

The problem was the performance of the code. It was taking a long time to run the tests and I am looking at ways to improve this. I started with the Server.Tests file because I thought, why not start with one of the smaller files?

It runs the following checks
  • Server Power Plan Configuration
  • SPNs
  • Disk Space
  • Ping Computer
  • CPUPrioritisation
  • Disk Allocation Unit
  • Instance Connection
and it was looping through the computer names for each check like this
Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
Describe "SPNs" -Tags SPN, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Ping Computer" -Tags PingComputer, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    @(Get-ComputerName).ForEach{
    }
}
I altered it to have only one loop for the computer names like so
@(Get-ComputerName).ForEach{
    Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    }
    Describe "SPNs" -Tags SPN, $filename {
    }
    Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    }
    Describe "Ping Computer" -Tags PingComputer, $filename {
    }
    Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    }
    Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
and immediately in testing, my checks for the Server tag decreased in run time by about 60% 🙂
I was very happy.
Then I added it to the dbachecks module on my machine, loaded the module and realised that my JSON file for Get-DbcCheck was no longer being populated for the type, because this line
elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo")
was no longer true.

AST for other things

So I googled Management.Automation.Language.Ast and the first result led me to docs.microsoft.com. There are a number of different language elements available there, and I found InvokeMemberExpressionAst, which will let me find any methods that have been invoked. So now I can find the loops with
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst]
    }, $true)
When I examined the object returned I could see that I could further limit the result to get only the method for Get-ComputerName, and then, if I choose the Extent, I can get the code of that loop
## New code uses a Computer Name loop to speed up execution so need to find that as well
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst] -and $ast.Expression.Subexpression.Extent.Text -eq 'Get-ComputerName'
    }, $true).Extent

and now I can match the Tags to the type again :-)

if ($ComputerNameForEach -match $title) {
    $type = "ComputerName"
}
and now Get-DbcCheck is returning the right results and the checks are a little faster
You can find dbachecks on the PowerShell Gallery or install it using
Install-Module dbachecks -Scope CurrentUser

 

A PowerShell Pester Check for parsing SQL scripts

I like to write Pester checks to make sure that all is as expected! This is just a quick post as much to help me remember this script 🙂

This is a quick Pester test I wrote to ensure that some SQL scripts in a directory would parse, so there was some guarantee that they were valid T-SQL. It uses the SQLParser.dll and, because it was running on a build server without SQL Server, I had to load the required DLLs from the dbatools module (thank you dbatools 🙂 )

It simply runs through all of the .sql files and runs the parser against them, checking for errors. If a script fails to parse, the Pester failure message will show where it failed as well.

You will need the dbatools module installed on the machine and at least version 4 of the Pester module as well.

Describe "Testing SQL" {
    Context "Running Parser" {
        ## Load assembly
        $Parserdll = (Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\dbatools' -Include Microsoft.SqlServer.Management.SqlParser.dll -Recurse)[0].FullName
        [System.Reflection.Assembly]::LoadFile($Parserdll) | Out-Null
        $TraceDll = (Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\dbatools' -Include Microsoft.SqlServer.Diagnostics.Strace.dll -Recurse)[0].FullName
        [System.Reflection.Assembly]::LoadFile($TraceDll) | Out-Null
        $ParseOptions = New-Object Microsoft.SqlServer.Management.SqlParser.Parser.ParseOptions
        $ParseOptions.BatchSeparator = 'GO'
        $files = Get-ChildItem -Path $Env:Directory -Include *.sql -Recurse ## This variable is set as a Build Process Variable or put your path here
        $files.ForEach{
            It "$($Psitem.FullName) Should Parse SQL correctly" {
                $filename = $Psitem.FullName
                $sql = Get-Content -LiteralPath $filename -Raw
                $Script = [Microsoft.SqlServer.Management.SqlParser.Parser.Parser]::Parse($SQL, $ParseOptions)
                $Script.Errors | Should -BeNullOrEmpty
            }
        }
    }
}
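To use it, point the Directory environment variable at your scripts folder (or hard-code the path in the Get-ChildItem line) and invoke the test file; the names below are just an illustration:

$Env:Directory = 'C:\Git\MySQLScripts'
Invoke-Pester .\ParseSQLFiles.Tests.ps1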

dbachecks – Improved Descriptions

With the latest release of dbachecks we have added a new check for testing that foreign keys and constraints are trusted, thanks to Cláudio Silva b | t.

To get the latest release you will need to run

Update-Module dbachecks

You should do this regularly as we release new improvements frequently.

We have also added better descriptions for the checks, which was suggested by the same person who inspired the previous improvement I blogged about here.

Instead of the description just being the name of the check it is now more of a, well, a description really 🙂

This has the side effect that just running Get-DbcCheck at the command line will not fit all of the information on a normal screen.

You can use the Format-Table command (or its alias ft at the command line) and select the properties to display using

Get-DbcCheck | ft -Property UniqueTag, Description -Wrap

or you can use Format-List (or its alias fl at the command line)

Get-DbcCheck | fl
Or you can use Out-GridView (or its alias ogv at the command line). (Incidentally, could you also thumbs up this issue on GitHub to get Out-GridView functionality in PowerShell 6?)

Get-DbcCheck | ogv

Happy Validating!

Version Update, Code Signing and publishing to the PowerShell Gallery with VSTS

At the fabulous PowerShell Conference EU I presented about Continuous Delivery to the PowerShell Gallery with VSTS and explained how we use VSTS to enable CD for dbachecks. We even released a new version during the session 🙂

So how do we achieve this?

We have a few steps

  • Create a project and link to our GitHub
  • Run unit tests with Pester to make sure that our code is doing what we expect.
  • Update our module version and commit the change to GitHub
  • Sign our code with a code signing certificate
  • Publish to the PowerShell Gallery

Create Project and link to GitHub

First you need to create a VSTS project by going to https://www.visualstudio.com/. This is free for up to 5 users, with 1 concurrent CI/CD queue limited to a maximum of 60 minutes run time, which should be more than enough for your PowerShell module.

Click on Get Started for free under Visual Studio Team Services and fill in the required information. Then on the front page click new project

Fill in the details and click create

Click on builds and then new definition

Next you need to link your project to your GitHub (or other source control provider) repository.

You can either authorise with OAuth or you can provide a PAT token following the instructions here. Once that is complete choose your repo. Save the PAT as you will need it later in the process!

and choose the branch that you want this build definition to run against.

I chose to run the Unit Tests when a PR was merged into the development branch. I will then create another build definition for the master branch to sign the code and update module version. This enables us to push several PRs into the development branch and create a single release for the gallery.

Then I start with an empty process

and give it a suitable name

I chose the hosted queue, but you can download an agent to your build server if you need to do more or your integration tests require access to other resources not available on the hosted agent.

Run Unit Tests with Pester

We have a number of unit tests in our tests folder in dbachecks, so we want to run them to ensure that everything is as it should be and the new code will not break existing functionality (and, for dbachecks, the format expected by the Power BI).

You can use the Pester Test Runner Build Task from the folk at Black Marble by clicking on the + sign next to Phase 1 and searching for Pester

You will need to click Get It Free to install it and then click add to add the task to your build definition. You can pretty much leave it as default if you wish and Pester will run all of the *.Tests.ps1 files that it finds in the directory where it downloads the GitHub repo, which is referred to using the variable $(Build.SourcesDirectory). It will then output the results to an XML file called Test-Pester.XML ready for publishing.

However, as dbachecks has a number of dependent modules, this task was not suitable. I spoke with Chris Gardner b | t from Black Marble at the PowerShell Conference and he says that this can be resolved, so look out for the update. Chris is a great guy and always willing to help; you can often find him in the PowerShell Slack channel answering questions and helping people.

But as you can use PowerShell in VSTS tasks, this is not a problem, although you need to write your PowerShell using try/catch to make sure that your task fails when your PowerShell errors. This is the code I use to install the modules

$ErrorActionPreference = 'Stop'

# Set location to module home path in artifacts directory
try {
    Set-Location $(Build.SourcesDirectory)
    Get-ChildItem
}
catch {
    Write-Error "Failed to set location"

}

# Get the Module versions
Install-Module Configuration -Scope CurrentUser -Force
$Modules = Get-ManifestValue -Path .\dbachecks.psd1 -PropertyName RequiredModules

$PesterVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'Pester'}[0].Get_Item('ModuleVersion')
$PSFrameworkVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'PSFramework'}[0].Get_Item('ModuleVersion')
$dbatoolsVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'dbatools'}[0].Get_Item('ModuleVersion')

# Install Pester
try {
    Write-Output "Installing Pester"
    Install-Module Pester  -RequiredVersion $PesterVersion  -Scope CurrentUser -Force -SkipPublisherCheck
    Write-Output "Installed Pester"

}
catch {
    Write-Error "Failed to Install Pester $($_)"
}
# Install PSFramework
try {
    Write-Output "Installing PSFramework"
    Install-Module PSFramework  -RequiredVersion $PsFrameworkVersion  -Scope CurrentUser -Force 
    Write-Output "Installed PSFramework"

}
catch {
    Write-Error "Failed to Install PSFramework $($_)"
}
# Install dbatools
try {
    Write-Output "Installing dbatools"
    Install-Module dbatools  -RequiredVersion $dbatoolsVersion  -Scope CurrentUser -Force 
    Write-Output "Installed dbatools"

}
catch {
    Write-Error "Failed to Install dbatools $($_)"
}

# Add current folder to PSModulePath
try {
    Write-Output "Adding local folder to PSModulePath"
    $ENV:PSModulePath = $ENV:PSModulePath + ";$pwd"
    Write-Output "Added local folder to PSModulePath"    
    $ENV:PSModulePath.Split(';')
}
catch {
    Write-Error "Failed to add $pwd to PSModulePAth - $_"
}

I use the Configuration module from Joel Bennett to get the required module versions for the required modules and then add the current folder to $ENV:PSModulePath so that the local module will be imported; the modules did not import correctly without it.

Once I have the modules I can then run Pester as follows

try {
    Write-Output "Installing dbachecks"
    Import-Module .\dbachecks.psd1
    Write-Output "Installed dbachecks"

}
catch {
    Write-Error "Failed to Install dbachecks $($_)"
}
$TestResults = Invoke-Pester .\tests -ExcludeTag Integration,IntegrationTests  -Show None -OutputFile $(Build.SourcesDirectory)\Test-Pester.XML -OutputFormat NUnitXml -PassThru

if ($TestResults.failedCount -ne 0) {
    Write-Error "Pester returned errors"
}

As you can see I import the dbachecks module from the local folder, run Invoke-Pester and output the results to an XML file and check that there are no failing tests.

Whether you use the task or PowerShell the next step is to Publish the test results so that they are displayed in the build results in VSTS.

Click on the + sign next to Phase 1 and search for Publish

Choose the Publish Test Results task and leave everything as default unless you have renamed the XML file. This means that on the summary page you will see some test results

 

and on the tests tab you can see more detailed information and drill down into the tests

Trigger

The next step is to trigger a build when a commit is pushed to the development branch. Click on Triggers and tick enable continuous integration

Saving the Build Definition

I would normally save the build definition regularly and ensure that there is a good message in the comment. I always tell clients that this is like a commit message for your build process so that you can see the history of the changes for the build definition.

You can see the history on the edit tab of the build definition

If you want to compare or revert the build definition this can be done using the hamburger menu as shown below.

Update the Module Version

Now we need to create a build definition for the master branch to update the module version and sign the code ready for publishing to the PowerShell Gallery when we commit or merge to master

Create a new build definition as above but this time choose the master branch

Again choose an empty process and name it sensibly, click the + sign next to Phase 1 and search for PowerShell

I change the task version to 2 and use this code. Note that the commit message has ***NO_CI*** in it. Putting this in a commit message tells VSTS not to trigger a build for this commit.

$manifest = Import-PowerShellDataFile .\dbachecks.psd1 
[version]$version = $Manifest.ModuleVersion
Write-Output "Old Version - $Version"
# Add one to the build of the version number
[version]$NewVersion = "{0}.{1}.{2}" -f $Version.Major, $Version.Minor, ($Version.Build + 1) 
Write-Output "New Version - $NewVersion"
# Update the manifest file
try {
    Write-Output "Updating the Module Version to $NewVersion"
    $path = "$pwd\dbachecks.psd1"
    (Get-Content .\dbachecks.psd1) -replace $version, $NewVersion | Set-Content .\dbachecks.psd1 -Encoding string
    Write-Output "Updated the Module Version to $NewVersion"
}
catch {
    Write-Error "Failed to update the Module Version - $_"
}

try {
    Write-Output "Updating GitHub"
    git config user.email "mrrobsewell@outlook.com"
    git config user.name "SQLDBAWithABeard"
    git add .\dbachecks.psd1
    git commit -m "Updated Version Number to $NewVersion ***NO_CI***"

    git push https://$(RobsGitHubPAT)@github.com/sqlcollaborative/dbachecks.git HEAD:master
    Write-Output "Updated GitHub "
}
catch {
    $_ | Fl -Force
    Write-Output "Failed to update GitHub"
}

I use Get-Content/Set-Content as I had errors with Update-ModuleManifest, but Adam Murray g | t uses this code to update the version using the BuildID from VSTS

$newVersion = New-Object version -ArgumentList 1, 0, 0, $env:BUILD_BUILDID
$Public  = @(Get-ChildItem -Path $ModulePath\Public\*.ps1)
$Functions = $public.basename
Update-ModuleManifest -Path $ModulePath\$ModuleName.psd1 -ModuleVersion $newVersion -FunctionsToExport $Functions

You can commit your change by adding your PAT token as a variable under the variables tab. Don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Sign the code with a certificate

The SQL Collaborative uses a code signing certificate from DigiCert, who allow MVPs to use one for free to sign their code for open source projects. Thank you! We had to upload the certificate to the secure files store in the VSTS library. Click on library, secure files and the blue +Secure File button

You also need to add the password as a variable under the variables tab as above. Again don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Then you need to add a task to download the secure file. Click on the + sign next to Phase 1 and search for secure

choose the file from the drop down

Next we need to import the certificate and sign the code. I use a PowerShell task for this with the following code

$ErrorActionPreference = 'Stop'
# read in the certificate from a pre-existing PFX file
# I have checked this with @IISResetMe and this does not go in the store only memory
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new("$(Agent.WorkFolder)\_temp\dbatools-code-signing-cert.pfx","$(CertPassword)")

try {
    Write-Output "Signing Files"
    # find all scripts in your module...
    Get-ChildItem -Filter *.ps1 -Include *.ps1 -Recurse -ErrorAction SilentlyContinue |
        # ...that do not have a signature yet...
        Where-Object {
            ($_ | Get-AuthenticodeSignature).Status -eq 'NotSigned'
        } |
        # ...and apply one
        Set-AuthenticodeSignature -Certificate $cert
    Write-Output "Signed Files"
}
catch {
    $_ | Format-List -Force
    Write-Error "Failed to sign scripts"
}

which will import the certificate into memory and sign all of the scripts in the module folder.
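If you want to verify the result in the build log, a quick summary of the signature status of every script (my addition, not part of the original task) can be produced with

Get-ChildItem -Filter *.ps1 -Recurse | Get-AuthenticodeSignature | Group-Object -Property Status | Select-Object Count, Name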

Publish your artifact

The last step of the master branch build publishes the artifact (your signed module) to VSTS ready for the release task. Again, click the + sign next to Phase 1 and choose the Publish Artifact task (not the deprecated Copy and Publish Artifact task) and give the artifact a useful name

Don’t forget to set the trigger for the master build as well following the same steps as the development build above

Publish to the PowerShell Gallery

Next we create a release to trigger when there is an artifact ready and publish to the PowerShell Gallery.

Click the Releases tab and New Definition

Choose an empty process and name the release definition appropriately

Now click on the artifact and choose the master build definition. If you have not run a build you will get an error like below, but don't worry, click add.

Click on the lightning bolt next to the artifact to open the continuous deployment trigger

and turn on Continuous Deployment so that when an artifact has been created with an updated module version and signed code it is published to the gallery

Next, click on the environment and name it appropriately and then click on the + sign next to Agent Phase and choose a PowerShell step

You may wonder why I don't choose the PowerShell Gallery Packager task. There are two reasons: first, I need to install the required modules for dbachecks (dbatools, PSFramework, Pester) prior to publishing, and second, it appears that the API key is stored in plain text.

I save my API key for the PowerShell Gallery as a variable again making sure to tick the padlock to make it a secret

and then use the following code to install the required modules and publish the module to the gallery

Install-Module dbatools -Scope CurrentUser -Force
Install-Module Pester -Scope CurrentUser -SkipPublisherCheck -Force
Install-Module PSFramework -Scope CurrentUser -Force

Publish-Module -Path "$(System.DefaultWorkingDirectory)/Master - Version Update, Signing and Publish Artifact/dbachecks" -NuGetApiKey "$(GalleryApiKey)"

That's it 🙂

Now we have a process that will automatically run our Pester tests when we commit or merge to the development branch and then update our module version number and sign our code and publish to the PowerShell Gallery when we commit or merge to the master branch

Added Extra – Dashboard

I like to create dashboards in VSTS to show the progress of the various definitions. You can do this under the dashboard tab. Click edit and choose or search for widgets and add them to the dashboard

Added Extra – Badges

You can also enable badges for displaying on your readme in GitHub (or VSTS). For the build definitions this is under the options tab.

For the release definitions, click the environment and then options and integrations.

You can then copy the URL and use it in your readme like this on dbachecks

The SQL Collaborative has joined the preview of enabling public access to VSTS projects, as detailed in this blog post, so you can see the dbachecks build and release without the need to log in, and soon the dbatools process as well.

I hope you found this useful and if you have any questions or comments please feel free to contact me

 

Happy Automating!

dbachecks – Configuration Deep Dive

Today is the day that we have announced dbachecks, a PowerShell module enabling you to validate your SQL instances. You can read more about it here, where you can learn how to install it and see some simple use cases.

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so you can create different configs for different use cases.
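For example, you can set a value for your session, export the whole configuration for a particular environment and import it again later; a minimal sketch using the module's Export-DbcConfig and Import-DbcConfig commands (the path is just an illustration):

Set-DbcConfig -Name policy.dbcc.maxdays -Value 3
Export-DbcConfig -Path C:\temp\Production_config.json
# later, or on another machine
Import-DbcConfig -Path C:\temp\Production_config.json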

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of the config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name - Description
agent.databasemailprofile - Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail - Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname - Name of the DBA Operator in SQL Agent
agent.failsafeoperator - Name of the failsafe operator in SQL Agent
app.checkrepos - Where Pester tests/checks are stored
app.computername - List of Windows Servers that Windows-based tests will run against
app.localapp - Persisted files live here
app.maildirectory - Files for mail are stored here
app.sqlcredential - The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance - List of SQL Server instances that SQL-based tests will run against
app.wincredential - The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck - Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller - The domain controller to process your requests
domain.name - The Active Directory domain that your server is a part of
domain.organizationalunit - The OU that your server should be a part of
mail.failurethreshhold - Number of errors that must be present to generate an email report
mail.from - Email address the email reports should come from
mail.smtpserver - Store the name of the smtp server to send email reports
mail.subject - Subject line of the email report
mail.to - Email address to send the report to
policy.backup.datadir - Destination server data directory
policy.backup.defaultbackupcompreesion - Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours - Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays - Maximum number of days before Full Backups are considered outdated
policy.backup.logdir - Destination server log directory
policy.backup.logmaxminutes - Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod - The number of hours a newly created database is allowed to not have backups
policy.backup.testserver - Destination server for backup tests
policy.build.warningwindow - The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme - Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount - Number of times to ping a server to establish average response time
policy.connection.pingmaxms - Maximum response time in ms
policy.dacallowed - DAC should be allowed $true or disallowed $false
policy.database.autoclose - Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics - Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink - Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics - Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously - Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance - Tolerance percentage for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb - Databases to exclude from the file growth check
policy.database.filegrowthtype - Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue - The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount - The number of Log files expected on a database
policy.database.logfilesizecomparison - How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage - Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf - Max virtual log files
policy.dbcc.maxdays - Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree - Percent disk free
policy.dump.maxcount - Maximum number of expected dumps
policy.hadr.tcpport - The TCPPort for the HADR check
policy.identity.usagepercent - Maximum percentage of max of identity column
policy.invaliddbowner.excludedb - Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name - The database owner account should not be this user
policy.network.latencymaxms - Max network latency average
policy.ola.commandlogenabled - Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled - Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database - The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled - Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled - Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed - Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled - Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled - Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled - Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled - Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled - Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention - Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled - Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled - Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled - Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled - Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention - Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled - Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled - Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention - Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled - Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled - Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled - Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled - Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled - Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled - Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention - Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled - Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation - OLE Automation should be enabled $true or disabled $false
policy.pageverify - Page verify option should be set to this value
policy.recoverymodel.excludedb - Databases to exclude from standard recovery model check
policy.recoverymodel.type - Standard recovery model
policy.storage.backuppath - Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb - Databases to exclude from valid dbowner checks
policy.validdbowner.name - The database owner account should be this user
policy.whoisactive.database - Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession - List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession - List of XE Sessions that should not be running.
policy.xevent.validrunningsession - List of XE Sessions that can be running.
skip.backup.testing - Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping - Skip the ping check for connectivity
skip.connection.remoting - Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled - Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest - Skip the logfilecount test
skip.datafilegrowthdisabled - Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck - Skip data purity check in last good dbcc command
skip.diffbackuptest - Skip the Differential backup test
skip.logfilecounttest - Skip the logfilecount test
skip.logshiptesting - Skip the logshipping test
skip.tempdb1118 - Don't run test for Trace Flag 1118
skip.tempdbfilecount - Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent - Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax - Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc - Don't run test for Temp Database Files on C

So there are a lot of configurations that you can use. Many are already set by default, but you can configure all of them with the values that you need for your own estate.
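
With that many items, filtering the list in the console can help. A minimal sketch, assuming the Name property shown in the output above (the policy.backup.* pattern is just an example):

Get-DbcConfig | Where-Object { $_.Name -like 'policy.backup.*' }

This returns only the backup policy configurations, so you can review one related group of settings at a time.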

The configurations are stored in the registry at HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\
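
If you are curious, you can browse that part of the registry directly; a quick sketch using the built-in registry provider (the exact subkey layout beneath this path belongs to PSFramework and may vary between versions):

Get-ChildItem 'HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\' -Recurse

Generally though, you should manage the values with Get-DbcConfig and Set-DbcConfig rather than editing the registry by hand.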

First Configurations

First I would run this so that you can see all of the configs in a separate window (note this does not work on PowerShell v6)

Get-DbcConfig | Out-GridView

Let's start with the first configurations that you will want to set: the instances and the hosts that you want to check.

You can get the value of the configuration item using

Get-DbcConfigValue -Name app.sqlinstance

As you can see in the image, nothing is returned, so we have no instances configured at present. We have added tab completion to the Name parameter so that you can easily find the right one.

If you want to look at more information about the configuration item you can use

Get-DbcConfig -Name app.sqlinstance

which shows you the name, current value and the description

So let's set our first configuration, for our SQL instance, to localhost. I have included a video so you can see the auto-complete in action as well.

Set-DbcConfig -Name app.sqlinstance -Value localhost

This configuration will be used for any SQL-based checks but not for any Windows-based ones like Services, PowerPlan, SPN, DiskSpace or Cluster, so let's set the app.computername configuration as well.
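
For example, to run the Windows-based checks against the same machine (the same value I use later in this post):

Set-DbcConfig -Name app.computername -Value localhost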

This means that when we run Invoke-DbcCheck with -AllChecks or by specifying a check, it will run against the local machine and default instance unless we specify a SqlInstance when calling Invoke-DbcCheck. So the code below will not use the configuration for app.sqlinstance.

Invoke-DbcCheck -SqlInstance TheBeard

Exclude a Check

You can exclude a check using the -ExcludeCheck parameter of Invoke-DbcCheck. In the example below I am running all of the Server checks but excluding the SPN check, as we are not on a domain.

Invoke-DbcCheck -Check Server -ExcludeCheck SPN

There is a configuration setting to exclude checks as well. (Be careful: this will exclude those checks even if you explicitly specify them when calling Invoke-DbcCheck, but we do give you a warning!)

So now I can run

Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value SPN
Invoke-DbcCheck -Check Server

and all of the server checks except the SPN check will run against the local machine and the default instance that I have set in the config

Creating an environment config and exporting it to use any time we like

Let's make this a lot more useful. We can create a configuration for our production environment and save it to disk (or even source control it!) so that we can use it again and again. We can also pass it to other members of our team, or embed it in an automated process or our CI/CD system.

Let's build up a configuration for a number of tests for my “production” environment. I will not explain them all here but let you read through the code and the comments to see what has been set. You will see that some of them are due to me running the test on a single machine with one drive.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value localhost
# The instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost', 'localhost\PROD1', 'localhost\PROD2', 'localhost\PROD3'
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'dbachecksdemo\dbachecks'
# The database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -Value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'NTLM'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'DBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'DBA Team'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin
# What is the maximum time since I took a Full backup (in days)?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours)?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value DBAAdmin
# Which databases should not be checked for recovery model?
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master', 'msdb', 'tempdb'
# What is my SQL credential?
Set-DbcConfig -Name app.sqlcredential -Value $null
# Should I skip the check for tempdb files on C?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for tempdb file count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping, ExtendedEvent, HADR, PseudoSimple, SPN
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
Get-DbcConfig | Out-GridView

When I run this I get

I can then export this to disk (to store in source control) using

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

and I have a configuration file

which I can use at any time to set the configuration for dbachecks using the Import-DbcConfig command. (This doesn't work in VS Code's integrated terminal, which occasionally does odd things; this appears to be one of them.)

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

So I can import this configuration and run my checks with it any time I like. This means that I can create many different test configurations for my many different environments or estate configurations.
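
For example, a two-liner to load the production settings and run everything with them (using the same path as above):

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json
Invoke-DbcCheck -AllChecks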

Yes, I know “good/best practice” says we should use the same configuration for all of our instances, but we know that isn't the reality. We have instances that were set up 15 years ago that are still in production. We have instances from the companies our organisation has bought over the years that were set up by system administrators. We have instances that were set up by shadow IT and that we now have to support but can't change.

As well as those though, we also have different environments. Our development or test environment will have different requirements to our production environments.

In this hypothetical situation, the four instances for four different applications have four development containers, which are connected to using SQL Authentication. We will need a different configuration for those.

SQL Authentication

We can set up SQL Authentication for connecting to our SQL Instances using the app.sqlcredential configuration. This holds a PSCredential object used for SQL Authenticated connections to your instances. If it is set, the checks will always try to use it. Yes, this means that the same username and password is used for each connection. No, there is currently no way to choose which instances use it and which don't. This may be a limitation, but as you will see further down you can still work around it with different configurations.

To set the SQL Authentication credential, run

Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)

This will give a prompt for you to enter the credential
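
For unattended runs where a prompt is not possible, you could build the PSCredential yourself. A hedged sketch (the account name and password file are hypothetical, and the exported password can only be decrypted by the same user on the same machine):

# Store the password once (encrypted with DPAPI for the current user)
Read-Host 'Password' -AsSecureString | ConvertFrom-SecureString | Set-Content .\sqlcred.txt
# Later, rebuild the credential and set it for dbachecks
$password = Get-Content .\sqlcred.txt | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential ('sqlchecks', $password)
Set-DbcConfig -Name app.sqlcredential -Value $credential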

Development Environment Configuration

Now we know how to set a SQL Authentication configuration, we can create our development environment configuration like so. As you can see below, the values for the checks are different and more checks have been skipped. I won't explain it all; if it doesn't make sense, ask a question in the comments or in the dbachecks channel in the SQL Server Community Slack.

#region Dev Config
# The instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost,1401', 'localhost,1402', 'localhost,1403', 'localhost,1404'
# What is my SQL credential?
Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'sa'
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'SQL'
# The database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'dbachecksdemo\dbachecks'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $false
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What should our database growth value be higher than (kb)?
Set-DbcConfig -Name policy.database.filegrowthvalue -Value 64
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $false
# What is the maximum latency (ms)?
Set-DbcConfig -Name policy.network.latencymaxms -Value 100
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Which databases should not be checked for recovery model?
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master', 'msdb', 'tempdb'
# Should I skip the check for tempdb files on C?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for tempdb file count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
# Which checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping, ExtendedEvent, HADR, SaRenamed, PseudoSimple, SPN, DiskSpace, DatabaseCollation, Agent, Backup, UnusedIndex, LogfileCount, FileGroupBalanced, LogfileSize, MaintenanceSolution, ServerNameMatch

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\development_config.json
#endregion

Using The Different Configurations

Now I have two configurations, one for my production environment and one for my development environment. I can run my checks whenever I like (perhaps you will automate this in some way):

  • Import the production configuration
  • Run my tests with that configuration and create a json file for my Power Bi labelled production
  • Import the development configuration (and enter the SQL authentication credential)
  • Run my tests with that configuration and create a json file for my Power Bi labelled development
  • Start Power Bi to show those results
# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\production_config.json
# Run the tests with the production config and create/update the production json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\development_config.json
# Run the tests with the development config and create/update the development json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development
# Open the Power Bi
Start-DbcPowerBi

I have published the Power Bi so that you can see what it looks like and have a click around (maybe you can see improvements you would like to contribute).

Now we can see how each environment is performing according to its own settings.

Combining Configurations Into One Result Set

As you saw above, by using the Environment parameter of Update-DbcPowerBiDataSource you can add different environments to one report. But what if I want a report for my application APP1 showing both the production and development environments, even though they have different configurations?

Here’s how.

  • Create a configuration for the production environment (I have used the production configuration from above but with only localhost for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Prod_config.json
  • Create a configuration for the development environment (I have used the development configuration from above but with only localhost,1401 for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Dev_config.json

Then run

# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Prod_config.json
# Run the tests with the production config and create/update the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Dev_config.json
# Run the tests with the development config and append the results to the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1 -Append
Start-DbcPowerBi

Notice that this time there is an -Append on the last Update-DbcPowerBiDataSource. This creates a single json file for the Power Bi, and the results look like this. Now we have the results for our application from both the production environment localhost and the development container localhost,1401.

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances

For the last couple of months members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL Instances. Today it is released for you all to start to use 🙂

Validate Your SQL Instances?

What do I mean by validate your SQL Instances? You want to know if your SQL Instances are (still) set up in the way that you want them to be, and that you have not missed any configurations when setting them up. With dbachecks you can use any or all of the 80 checks to ensure one or many SQL Instances are as you want them to be. Using Pester, dbachecks will validate your SQL Instance(s) against default settings or ones that you configure yourself.

Installation

Installation is via the PowerShell Gallery. You will need to open PowerShell on a machine connected to the internet and run

Install-Module dbachecks

If you are not running your process as admin, or you only want (or are able) to install for your own user account, you will need to run

Install-Module dbachecks -Scope CurrentUser

This will also install the PSFramework module used for configuration (and other things beneath the hood) and the latest version (4.2.0 – released on Sunday!) of Pester

Once you have installed the module you can see the commands available by running

Get-Command -Module dbachecks

To be able to use these (and any PowerShell) commands, your first step should always be Get-Help

Get-Help Send-DbcMailMessage
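
Get-Help's standard switches work here as well; for example:

# Full help, including parameter descriptions
Get-Help Invoke-DbcCheck -Full
# Just the usage examples
Get-Help Invoke-DbcCheck -Examples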

80 Checks

At the time of release, dbachecks has 80 checks. You can see all of the checks by running

Get-DbcCheck

(Note this has nothing to do with DBCC CheckDb!) Here is the output of

Get-DbcCheck | Select Group, UniqueTag

so you can see the current checks

Group - UniqueTag
Agent - AgentServiceAccount
Agent - DbaOperator
Agent - FailsafeOperator
Agent - DatabaseMailProfile
Agent - FailedJob
Database - DatabaseCollation
Database - SuspectPage
Database - TestLastBackup
Database - TestLastBackupVerifyOnly
Database - ValidDatabaseOwner
Database - InvalidDatabaseOwner
Database - LastGoodCheckDb
Database - IdentityUsage
Database - RecoveryModel
Database - DuplicateIndex
Database - UnusedIndex
Database - DisabledIndex
Database - DatabaseGrowthEvent
Database - PageVerify
Database - AutoClose
Database - AutoShrink
Database - LastFullBackup
Database - LastDiffBackup
Database - LastLogBackup
Database - VirtualLogFile
Database - LogfileCount
Database - LogfileSize
Database - FileGroupBalanced
Database - AutoCreateStatistics
Database - AutoUpdateStatistics
Database - AutoUpdateStatisticsAsynchronously
Database - DatafileAutoGrowthType
Database - Trustworthy
Database - OrphanedUser
Database - PseudoSimple
Database - AdHocWorkloads
Domain - DomainName
Domain - OrganizationalUnit
HADR - ClusterHealth
HADR - ClusterServerHealth
HADR -
HADR - System.Object[]
Instance - SqlEngineServiceAccount
Instance - SqlBrowserServiceAccount
Instance - TempDbConfiguration
Instance - AdHocWorkload
Instance - BackupPathAccess
Instance - DAC
Instance - NetworkLatency
Instance - LinkedServerConnection
Instance - MaxMemory
Instance - OrphanedFile
Instance - ServerNameMatch
Instance - MemoryDump
Instance - SupportedBuild
Instance - SaRenamed
Instance - DefaultBackupCompression
Instance - XESessionStopped
Instance - XESessionRunning
Instance - XESessionRunningAllowed
Instance - OLEAutomation
Instance - WhoIsActiveInstalled
LogShipping - LogShippingPrimary
LogShipping - LogShippingSecondary
Server - PowerPlan
Server - InstanceConnection
Server - SPN
Server - DiskCapacity
Server - PingComputer
MaintenancePlan - SystemFull
MaintenancePlan - UserFull
MaintenancePlan - UserDiff
MaintenancePlan - UserLog
MaintenancePlan - CommandLog
MaintenancePlan - SystemIntegrityCheck
MaintenancePlan - UserIntegrityCheck
MaintenancePlan - UserIndexOptimize
MaintenancePlan - OutputFileCleanup
MaintenancePlan - DeleteBackupHistory
MaintenancePlan - PurgeJobHistory
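
If you want to narrow that list down, Get-DbcCheck returns objects, so you can filter on the Group column shown above. For example:

Get-DbcCheck | Where-Object { $_.Group -eq 'Database' }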

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so that you can create different configs for different use cases.

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of each config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name - Description
agent.databasemailprofile - Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail - Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname - Name of the DBA Operator in SQL Agent
agent.failsafeoperator - Name of the failsafe operator in SQL Agent
app.checkrepos - Where Pester tests/checks are stored
app.computername - List of Windows Servers that Windows-based tests will run against
app.localapp - Persisted files live here
app.maildirectory - Files for mail are stored here
app.sqlcredential - The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance - List of SQL Server instances that SQL-based tests will run against
app.wincredential - The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck - Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller - The domain controller to process your requests
domain.name - The Active Directory domain that your server is a part of
domain.organizationalunit - The OU that your server should be a part of
mail.failurethreshhold - Number of errors that must be present to generate an email report
mail.from - Email address the email reports should come from
mail.smtpserver - Store the name of the smtp server to send email reports
mail.subject - Subject line of the email report
mail.to - Email address to send the report to
policy.backup.datadir - Destination server data directory
policy.backup.defaultbackupcompreesion - Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours - Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays - Maximum number of days before Full Backups are considered outdated
policy.backup.logdir - Destination server log directory
policy.backup.logmaxminutes - Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod - The number of hours a newly created database is allowed to not have backups
policy.backup.testserver - Destination server for backup tests
policy.build.warningwindow - The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme - Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount - Number of times to ping a server to establish average response time
policy.connection.pingmaxms - Maximum response time in ms
policy.dacallowed - DAC should be allowed $true or disallowed $false
policy.database.autoclose - Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics - Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink - Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics - Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously - Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance - Tolerance percentage for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb - Databases to exclude from the file growth check
policy.database.filegrowthtype - Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue - The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount - The number of Log files expected on a database
policy.database.logfilesizecomparison - How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage - Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf - Max virtual log files
policy.dbcc.maxdays - Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree - Percent disk free
policy.dump.maxcount - Maximum number of expected dumps
policy.hadr.tcpport - The TCPPort for the HADR check
policy.identity.usagepercent - Maximum percentage of max of identity column
policy.invaliddbowner.excludedb - Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name - The database owner account should not be this user
policy.network.latencymaxms - Max network latency average
policy.ola.commandlogenabled - Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled - Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database - The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled - Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled - Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed - Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled - Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled - Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled - Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled - Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled - Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention - Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled - Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled - Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled - Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled - Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention - Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled - Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled - Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention - Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled - Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled - Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled - Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled - Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled - Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled - Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention - Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled - Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation - OLE Automation should be enabled $true or disabled $false
policy.pageverify - Page verify option should be set to this value
policy.recoverymodel.excludedb - Databases to exclude from standard recovery model check
policy.recoverymodel.type - Standard recovery model
policy.storage.backuppath - Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb - Databases to exclude from valid dbowner checks
policy.validdbowner.name - The database owner account should be this user
policy.whoisactive.database - Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession - List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession - List of XE Sessions that should not be running.
policy.xevent.validrunningsession - List of XE Sessions that can be running.
skip.backup.testing - Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping - Skip the ping check for connectivity
skip.connection.remoting - Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled - Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest - Skip the logfilecount test
skip.datafilegrowthdisabled - Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck - Skip data purity check in last good dbcc command
skip.diffbackuptest - Skip the Differential backup test
skip.logfilecounttest - Skip the logfilecount test
skip.logshiptesting - Skip the logshipping test
skip.tempdb1118 - Don't run test for Trace Flag 1118
skip.tempdbfilecount - Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent - Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax - Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc - Don't run test for Temp Database Files on C

Running A Check

You can quickly run a single check by calling Invoke-DbcCheck.

Invoke-DbcCheck -SqlInstance localhost -Check FailedJob

Excellent, my agent jobs have not failed 🙂

Invoke-DbcCheck -SqlInstance localhost -Check LastGoodCheckDb

That's good; all of my databases have had a successful DBCC CHECKDB within the last 7 days.
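
That seven day window is not fixed; it comes from the policy.dbcc.maxdays configuration described below. For example, to be stricter:

Set-DbcConfig -Name policy.dbcc.maxdays -Value 3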

Setting a Configuration

To save me from having to specify the instance I want to run my tests against I can set the app.sqlinstance config to the instances I want to check.

Set-DbcConfig -Name app.sqlinstance -Value localhost, 'localhost\PROD1'

Then whenever I call Invoke-DbcCheck it will run against those instances for the SQL checks

So now if I run

Invoke-DbcCheck -Check LastDiffBackup

I can see that I don't have a diff backup for the databases on both instances. Better stop writing this and deal with that!

The configurations are stored in the registry but you can export them and then import them for re-use easily. I have written another blog post about that.
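
As a quick taste of what that post covers, the export/import round trip looks like this (the path here is just an example):

# Save the current configuration to disk...
Export-DbcConfig -Path C:\temp\myestate_config.json
# ...and load it again in a later session
Import-DbcConfig -Path C:\temp\myestate_config.json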

The Show Parameter

Getting the results of the tests on the screen is cool, but if you are running a lot of tests against a lot of instances then you might find that you have 3 failed tests out of 15000! This will mean a lot of scrolling through green text looking for the red text, and you may find that your PowerShell buffer doesn't hold all of your test results, leaving you very frustrated.

dbachecks supports the Pester Show parameter enabling you to filter the output of the results to the screen. The available values are Summary, None, Fails, Inconclusive, Passed, Pending and Skipped

In my opinion, by far the most useful one is Fails, as this will show you only the failed tests, with the context to enable you to see which tests have failed.

Invoke-DbcCheck -Check Agent -Show Fails

If we run all of the checks tagged as Agent, we can easily see that most passed but The Job That Fails (surprisingly) failed. All of the other tests that were run, for the agent service, operators, failsafe operator, database mail and all other agent jobs, passed in the example below.

Test Results are for other People as well

It is all very well being able to run tests and get the results on our screen. That is very useful for validating a new SQL instance, running a morning check, or as the first step of an incident response. But test results are also useful for other people, so we need to be able to share them.

We have created a Power Bi Dashboard that comes with the dbachecks module to enable easy sharing of the test results. You can also send the results via email using Send-DbcMailMessage. We have an open issue for putting them into a database that we would love you to help resolve.
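
As a hedged sketch of the email option (the addresses and server are placeholders, and I am assuming the common mail parameters here; check Get-Help Send-DbcMailMessage for the full list):

Invoke-DbcCheck -AllChecks -Show Fails -PassThru |
    Send-DbcMailMessage -To 'dbateam@example.com' -From 'dbachecks@example.com' -SmtpServer 'smtp.example.com'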

To get the results into PowerBi you can run

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production

This will run all of the dbachecks using your configuration for your Production environment, output only the failed tests to the screen, and save the results in your Windows\Temp\dbachecks folder with a suffix of Production.

If you then used a different configuration for your development environment and ran

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development

it will run all of the dbachecks using your configuration for your Development environment, output only the failed tests to the screen, and save the results in your Windows\Temp\dbachecks folder with a suffix of Development, so you would end up with two files in the folder.
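
You can verify that the two data source files are there; a small sketch assuming the default location mentioned above:

Get-ChildItem -Path (Join-Path $env:windir 'temp\dbachecks')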

You can then simply run

Start-DbcPowerBi

and as long as you have the (free) Power Bi Desktop you will see this. You will need to refresh the data to get your test results.

Of course it is Power Bi, so you can publish this report. Here it is so that you can click around and see what it looks like.

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Further Reading

There are many more introduction blog posts covering different areas at

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

Tony Wilhelm @TonyWSQL

and all of the other people who have contributed in the dbachecks Slack channel