dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks, and one of the questions I was asked was whether I would write a command to put the results of the checks into a database for historical reporting.

The answer is no, and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object, and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will have. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.
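A sketch of the sort of thing I mean (the instance names and values here are examples, not the full configuration from the original post):

Set-DbcConfig -Name app.sqlinstance -Value sql0, sql1
Set-DbcConfig -Name app.computername -Value sql0, sql1
Set-DbcConfig -Name policy.dbcc.maxdays -Value 7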

Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
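The path here is an example:

Export-DbcConfig -Path \\FileShare\dbachecks\production_config.json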
and then I can use it with
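Import-DbcConfig -Path \\FileShare\dbachecks\production_config.json
Invoke-DbcCheck -AllChecks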
01 - Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier.

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save them to a variable.
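For example (the variable name is mine):

$DbcResults = Invoke-DbcCheck -AllChecks -Show Fails -PassThru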

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results.
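Something like this (the instance, database and table names are examples; the summary is the object itself and the detail sits in its TestResult property):

$DbcResults | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table Summary -AutoCreateTable
$DbcResults.TestResult | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table Results -AutoCreateTable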

and I get two tables, one for the summary

02 - summary

and one for the details

03 - detail
This works absolutely fine and I could continue to add test results in this fashion, but it has no date property, so it is not so useful for reporting.

Create tables and triggers

This is one way of doing it. I am not sure it is the best way, but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better or different solution, please blog about it and reference it in the comments below.

First I created a staging table for the summary results

and a destination table with a primary key and a date column which defaults to today's date

and added an INSERT trigger to the staging table
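A sketch of the T-SQL involved (the table, column and trigger names are illustrative, not the originals; run it with Invoke-DbaQuery or in SSMS):

$tables = @"
CREATE TABLE dbo.SummaryStage (
    TagFilter nvarchar(200),
    TotalCount int,
    PassedCount int,
    FailedCount int,
    SkippedCount int,
    PendingCount int
);
CREATE TABLE dbo.Summary (
    SummaryID int IDENTITY(1, 1) PRIMARY KEY,
    CheckDate date NOT NULL DEFAULT GETDATE(),
    TagFilter nvarchar(200),
    TotalCount int,
    PassedCount int,
    FailedCount int,
    SkippedCount int,
    PendingCount int
);
"@
Invoke-DbaQuery -SqlInstance sql0 -Database dbachecks -Query $tables

# the trigger moves rows from the stage table to the destination, picking up the date default
$trigger = @"
CREATE TRIGGER dbo.trInsertSummary ON dbo.SummaryStage
AFTER INSERT
AS
INSERT INTO dbo.Summary
    (TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount)
SELECT TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount
FROM inserted;
"@
Invoke-DbaQuery -SqlInstance sql0 -Database dbachecks -Query $trigger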

and for the details I do the same thing. A details table

A stage table

with a trigger

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, and Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. That gets the results into the database.
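Along these lines (names as in the sketch above):

$DbcResults | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table SummaryStage -FireTriggers -Truncate -Confirm:$false
$DbcResults.TestResult | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table DetailStage -FireTriggers -Truncate -Confirm:$false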

detail with stage

Which means that I can now query some of this data and also create PowerBi reports for it.

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using
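Get-DbcCheck | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table Checks -AutoCreateTable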

But because the Ola Hallengren Job names are configuration items I need to update the values for those checks which I can do as follows
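A sketch of one way to do it; the config name and the column being updated are assumptions, so check the ola configs in your version first:

# list the Ola-related configuration items
Get-DbcConfig | Where-Object Name -Like '*ola*' | Format-Table Name, Value

$userFullJob = Get-DbcConfigValue -Name ola.jobname.userfull   # assumption: check the list above
Invoke-DbaQuery -SqlInstance sql0 -Database dbachecks -Query "UPDATE dbo.Checks SET [Description] = REPLACE([Description], 'User Full', '$userFullJob') WHERE UniqueTag = 'UserFull'"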

You can get a sample Power Bi report in my GitHub, which also has the code from this blog post.

Then you just need to open it in Power Bi Desktop and

Click Edit Queries
Click Data Source Settings
Click Change Source
Change the Instance and Database names

09 - PowerBi

Then you have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the check and the import that day it will look empty; just click on one of the groups under “Today’s Checks by Group”.

This enables me to filter the results and see what has happened in the past so I can filter by one instance
05 - filter by instance
or I can filter by a group of tests
07 - filter by group
or even by a group of tests for an instance
08 - filter by instance and group

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the PowerBi in my GitHub

Happy Validating!

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL versions, so I have installed:

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programmes I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my docker location from my C:\ drive to my E:\ drive so I didn't run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding
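Something like this (data-root is the key in recent Docker versions; older releases used graph):

{
    "data-root": "E:\\containers"
}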

and restarting the docker service, which created folders like this
01 - folders.png
Then I ran the following to create a volume to hold the backups that I could mount on the containers
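docker volume create sqlbackups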

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the Docker Hub I ran
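The Windows developer image at the time of writing:

docker pull microsoft/mssql-server-windows-developer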

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub so it is just a case of running
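Something along these lines; the repository names are assumptions, so check his Docker Hub for the exact names and tags:

docker pull dbafromthecold/sqlserver2012dev
docker pull dbafromthecold/sqlserver2014dev
docker pull dbafromthecold/sqlserver2016dev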

and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as a single docker run command per instance, so all I needed to run to create 4 SQL containers, one of each version, was
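A sketch (the ports, names and password are examples; the sqlbackups volume is mounted inside each container at C:\SQLBackups):

$password = 'Testing1122@'   # example only, use your own
docker run -d -p 15127:1433 -e sa_password=$password -e ACCEPT_EULA=Y -v sqlbackups:C:\SQLBackups --name SQL2017 microsoft/mssql-server-windows-developer
docker run -d -p 15126:1433 -e sa_password=$password -e ACCEPT_EULA=Y -v sqlbackups:C:\SQLBackups --name SQL2016 dbafromthecold/sqlserver2016dev
docker run -d -p 15124:1433 -e sa_password=$password -e ACCEPT_EULA=Y -v sqlbackups:C:\SQLBackups --name SQL2014 dbafromthecold/sqlserver2014dev
docker run -d -p 15122:1433 -e sa_password=$password -e ACCEPT_EULA=Y -v sqlbackups:C:\SQLBackups --name SQL2012 dbafromthecold/sqlserver2012dev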

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂

02 - creating containers.png

03 - Containers at the ready.png

Storing Credentials

This is not something I would do in a Production environment but I save my credentials using this method that Jaap Brasser b | t shared here
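The path is an example:

Get-Credential | Export-Clixml -Path E:\containers\sqlcred.xml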

which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using
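$cred = Import-Clixml -Path E:\containers\sqlcred.xml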

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable
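$filenames = Get-ChildItem -Path E:\containers\volumes\sqlbackups\_data -Filter *.bak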

and the container connection strings, which are the hostname and the port number
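Using the example ports from above:

$containers = 'localhost,15127', 'localhost,15126', 'localhost,15124', 'localhost,15122'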

then I can restore the databases using dbatools, using a switch statement on the version, which I get from the NameLevel property of Get-DbaSqlBuildReference. I need to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container.
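A sketch of that loop (the filename matching is an assumption about how the AdventureWorks backups are named; Get-DbaSqlBuildReference is called Get-DbaBuild in current dbatools):

foreach ($container in $containers) {
    # NameLevel gives the version name, e.g. 2017
    $version = (Get-DbaSqlBuildReference -SqlInstance $container -SqlCredential $cred).NameLevel
    # pick the AdventureWorks backups that this version can restore
    $files = switch ($version) {
        '2012' { $filenames | Where-Object Name -Match '2012' }
        '2014' { $filenames | Where-Object Name -Match '2012|2014' }
        '2016' { $filenames | Where-Object Name -Match '2012|2014|2016' }
        '2017' { $filenames }
    }
    # the sqlbackups volume is mounted at C:\SQLBackups inside the container
    $paths = $files | ForEach-Object { "C:\SQLBackups\$($_.Name)" }
    Restore-DbaDatabase -SqlInstance $container -SqlCredential $cred -Path $paths -WithReplace
}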
As Get-DbaDatabase gives the container ID as the Computer Name, I have highlighted each container below
04 - databases.png
That is how easy it is to create a number of SQL containers of differing versions for your presentations or exploring needs
Happy Automating!

dbachecks – Configuration Deep Dive

Today is the day that we have announced dbachecks, a PowerShell module enabling you to validate your SQL Instances. You can read more about it here, where you can learn how to install it and see some simple use cases.

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so that you can create different configs for different use cases.

There are 108 configuration items at present. You can see the current configuration by running
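Get-DbcConfig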

which will show you the name of the config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name | Description
agent.databasemailprofile | Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail | Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname | Name of the DBA Operator in SQL Agent
agent.failsafeoperator | Email address of the DBA Operator in SQL Agent
app.checkrepos | Where Pester tests/checks are stored
app.computername | List of Windows Servers that Windows-based tests will run against
app.localapp | Persisted files live here
app.maildirectory | Files for mail are stored here
app.sqlcredential | The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance | List of SQL Server instances that SQL-based tests will run against
app.wincredential | The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck | Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller | The domain controller to process your requests
domain.name | The Active Directory domain that your server is a part of
domain.organizationalunit | The OU that your server should be a part of
mail.failurethreshhold | Number of errors that must be present to generate an email report
mail.from | Email address the email reports should come from
mail.smtpserver | Store the name of the smtp server to send email reports
mail.subject | Subject line of the email report
mail.to | Email address to send the report to
policy.backup.datadir | Destination server data directory
policy.backup.defaultbackupcompreesion | Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours | Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays | Maximum number of days before Full Backups are considered outdated
policy.backup.logdir | Destination server log directory
policy.backup.logmaxminutes | Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod | The number of hours a newly created database is allowed to not have backups
policy.backup.testserver | Destination server for backup tests
policy.build.warningwindow | The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme | Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount | Number of times to ping a server to establish average response time
policy.connection.pingmaxms | Maximum response time in ms
policy.dacallowed | DAC should be allowed $true or disallowed $false
policy.database.autoclose | Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics | Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink | Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics | Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously | Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance | Percentage tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb | Databases to exclude from the file growth check
policy.database.filegrowthtype | Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue | The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount | The number of Log files expected on a database
policy.database.logfilesizecomparison | How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage | Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf | Max virtual log files
policy.dbcc.maxdays | Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree | Percent disk free
policy.dump.maxcount | Maximum number of expected dumps
policy.hadr.tcpport | The TCPPort for the HADR check
policy.identity.usagepercent | Maximum percentage of max of identity column
policy.invaliddbowner.excludedb | Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name | The database owner account should not be this user
policy.network.latencymaxms | Max network latency average
policy.ola.commandlogenabled | Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled | Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database | The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled | Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled | Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed | Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled | Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled | Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled | Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled | Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled | Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention | Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled | Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled | Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled | Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled | Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention | Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled | Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled | Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention | Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled | Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled | Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled | Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled | Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled | Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled | Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention | Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled | Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation | OLE Automation should be enabled $true or disabled $false
policy.pageverify | Page verify option should be set to this value
policy.recoverymodel.excludedb | Databases to exclude from standard recovery model check
policy.recoverymodel.type | Standard recovery model
policy.storage.backuppath | Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb | Databases to exclude from valid dbowner checks
policy.validdbowner.name | The database owner account should be this user
policy.whoisactive.database | Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession | List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession | List of XE Sessions that should not be running.
policy.xevent.validrunningsession | List of XE Sessions that can be running.
skip.backup.testing | Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping | Skip the ping check for connectivity
skip.connection.remoting | Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest | Skip the logfilecount test
skip.datafilegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck | Skip data purity check in last good dbcc command
skip.diffbackuptest | Skip the Differential backup test
skip.logfilecounttest | Skip the logfilecount test
skip.logshiptesting | Skip the logshipping test
skip.tempdb1118 | Don't run test for Trace Flag 1118
skip.tempdbfilecount | Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent | Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax | Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc | Don't run test for Temp Database Files on C

So there are a lot of configurations that you can use. Many are already set by default, but you can configure all of them with the values that you need for your own estate.

The configurations are stored in the registry at HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\

First Configurations

First I would run this so that you can see all of the configs in a separate window (note this does not work on PowerShell v6)
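Get-DbcConfig | Out-GridView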

Let's start with the first configurations that you will want to set. These should be the instances and the hosts that you want to check.

You can get the value of the configuration item using
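Get-DbcConfigValue -Name app.sqlinstance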

As you can see in the image, nothing is returned, so we have no instances configured at present. We have added tab completion to the -Name parameter so that you can easily find the right one.

If you want to look at more information about the configuration item you can use
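Get-DbcConfig -Name app.sqlinstance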

which shows you the name, current value and the description

So let's set our first configuration for our SQL instance to localhost. I have included a video so you can see the auto-complete in action as well.
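Set-DbcConfig -Name app.sqlinstance -Value localhost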

This configuration will be used for any SQL-based checks but not for any Windows-based ones like Services, PowerPlan, SPN, DiskSpace or Cluster, so let's set the app.computername configuration as well.
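Set-DbcConfig -Name app.computername -Value localhost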

This means that when we run Invoke-DbcCheck with AllChecks or by specifying a check, it will run against the local machine and default instance unless we specify a sqlinstance when calling Invoke-DbcCheck. So the code below will not use the configuration for app.sqlinstance.
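For example (the instance name is an example):

Invoke-DbcCheck -SqlInstance sql2016n1 -Check DatabaseCollation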

Exclude a Check

You can exclude a check using the -ExcludeCheck parameter of Invoke-DbcCheck. In the example below I am running all of the Server checks but excluding the SPN check as we are not on a domain.
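Invoke-DbcCheck -Check Server -ExcludeCheck SPN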

 

There is a configuration setting to exclude checks as well. (Be careful, this will exclude them even if you specifically specify a check using Invoke-DbcCheck, but we do give you a warning!)
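Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value SPN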

So now I can run
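Invoke-DbcCheck -Check Server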

and all of the server checks except the SPN check will run against the local machine and the default instance that I have set in the config

Creating an environment config and exporting it to use any time we like

So let's make this a lot more useful. Let's create a configuration for our production environment and save it to disk (or even source control it!) so that we can use it again and again. We can also then pass it to other members of our team or even embed it in an automated process or our CI/CD system.

Let's build up a configuration for a number of tests for my “production” environment. I will not explain them all here but let you read through the code and the comments to see what has been set. You will see that some of them are due to me running the test on a single machine with one drive.
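A much shorter sketch than the original; the values are just examples:

Set-DbcConfig -Name app.sqlinstance -Value localhost
Set-DbcConfig -Name app.computername -Value localhost
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 1          # full backups every day
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26        # diffs at least daily
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30       # log backups every 30 minutes
Set-DbcConfig -Name policy.recoverymodel.type -Value FULL       # production databases are in full recovery
Set-DbcConfig -Name policy.dbcc.maxdays -Value 7                # weekly CHECKDB
Set-DbcConfig -Name policy.diskspace.percentfree -Value 10
Set-DbcConfig -Name skip.backup.testing -Value $true            # Test-DbaLastBackup needs a test server; skip on one drive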

When I run this I get

I can then export this to disk (to store in source control) using
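The path is an example:

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json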

and I have a configuration file

which I can use any time to set the configuration for dbachecks using the Import-DbcConfig command (but this doesn't work in VS Code's integrated terminal, which occasionally does odd things; this appears to be one of them)
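Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json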

So I can import this configuration and run my checks with it any time I like. This means that I can create many different test configurations for my many different environment or estate configurations.

Yes, I know “good/best practice” says we should use the same configuration for all of our instances, but we know that isn't true. We have instances that were set up 15 years ago that are still in production. We have instances from the companies our organisation has bought over the years that were set up by system administrators. We have instances that were set up by shadow IT that we now have to support but can't change.

As well as those though, we also have different environments. Our development or test environment will have different requirements to our production environments.

In this hypothetical situation the four instances for four different applications have 4 development containers which are connected to using SQL Authentication. We will need a different configuration.

SQL Authentication

We can set up SQL Authentication for connecting to our SQL Instances using the app.sqlcredential configuration. This is going to hold a PSCredential object for SQL Authenticated connections to your instance. If this is set, the checks will always try to use it. Yes, this means that the same username and password is being used for each connection. No, there is currently no way to choose which instances use it and which don't. This may be a limitation, but as you will see further down you can still do this with different configurations.

To set the SQL Authentication credential run
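The username is an example:

Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential sqladmin)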

This will give a prompt for you to enter the credential

Development Environment Configuration

So now we know how to set a SQL Authentication configuration, we can create our development environment configuration like so. As you can see below, the values are different for the checks and more checks have been skipped. I won't explain it all; if it doesn't make sense, ask a question in the comments or in the dbachecks channel in the SQL Server Community Slack.
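Again a sketch; the container ports and values are examples:

Set-DbcConfig -Name app.sqlinstance -Value 'localhost,1401', 'localhost,1402', 'localhost,1403', 'localhost,1404'
Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential sa)
Set-DbcConfig -Name policy.recoverymodel.type -Value SIMPLE     # dev databases are in simple recovery
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7          # weekly fulls are enough in dev
Set-DbcConfig -Name skip.diffbackuptest -Value $true            # no diff backups in dev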

Using The Different Configurations

Now I have two configurations, one for my Production environment and one for my development environment. I can run my checks whenever I like (perhaps you will automate this in some way); the steps are below, with a sketch of the whole sequence after the list.

  • Import the production configuration
  • Run my tests with that configuration and create a json file for my Power Bi labelled production
  • Import the development configuration (and enter the SQL authentication credential)
  • Run my tests with that configuration and create a json file for my Power Bi labelled development
  • Start Power Bi to show those results
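A sketch of that sequence (the paths are examples):

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Development_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development

Start-DbcPowerBi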

I have published the Power Bi so that you can see what it would look like and have a click around (maybe you can see improvements you would like to contribute)

Now we can see how each environment is performing according to our settings for each environment.

Combining Configurations Into One Result Set

As you saw above, by using the Environment parameter of Update-DbcPowerBiDataSource you can add different environments to one report. But if I wanted to have a report for my application APP1 showing both production and development environments but they have different configurations how can I do this?

Here’s how.

  • Create a configuration for the production environment (I have used the production configuration one from above but only localhost for the instance)
  • Export it using Export-DbcConfig to C:\Users\dbachecks\Desktop\APP1-Prod_config.json
  • Create a configuration for the development environment (I have used the development configuration one from above but only localhost,1401 for the instance)
  • Export it using Export-DbcConfig to C:\Users\dbachecks\Desktop\APP1-Dev_config.json

Then run
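A sketch of the run:

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\APP1-Prod_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\APP1-Dev_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1 -Append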

Notice that this time there is an Append on the last Update-DbcPowerBiDataSource. This creates a single json file for the Power Bi, and the results look like this. Now we have the results for our application from both the production environment localhost and the development container localhost,1401.

 

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy LeMaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances

For the last couple of months members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL Instances. Today it is released for you all to start to use 🙂

Validate Your SQL Instances?

What do I mean by validate your SQL Instances? You want to know if your SQL Instances are (still) set up in the way that you want them to be or that you have not missed any configurations when setting them up. With dbachecks you can use any or all of the 80 checks to ensure one or many SQL Instances are as you want them to be. Using Pester, dbachecks will validate your SQL Instance(s) against default settings or ones that you configure yourself.

Installation

Installation is via the PowerShell Gallery. You will need to open PowerShell on a machine connected to the internet and run
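Install-Module dbachecks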

If you are not running your process as admin, or you only want (or are able) to install for your own user account, you will need to run
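Install-Module dbachecks -Scope CurrentUser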

This will also install the PSFramework module used for configuration (and other things beneath the hood) and the latest version (4.2.0 – released on Sunday!) of Pester

Once you have installed the module you can see the commands available by running
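Get-Command -Module dbachecks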

To be able to use these (and any PowerShell) commands, your first step should always be Get-Help
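For example:

Get-Help Invoke-DbcCheck -Detailed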

80 Checks

At the time of release, dbachecks has 80 checks. You can see all of the checks by running
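Get-DbcCheck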

(Note this has nothing to do with DBCC CHECKDB!) Here is the output, so you can see the current checks:

Group | UniqueTag
Agent | AgentServiceAccount
Agent | DbaOperator
Agent | FailsafeOperator
Agent | DatabaseMailProfile
Agent | FailedJob
Database | DatabaseCollation
Database | SuspectPage
Database | TestLastBackup
Database | TestLastBackupVerifyOnly
Database | ValidDatabaseOwner
Database | InvalidDatabaseOwner
Database | LastGoodCheckDb
Database | IdentityUsage
Database | RecoveryModel
Database | DuplicateIndex
Database | UnusedIndex
Database | DisabledIndex
Database | DatabaseGrowthEvent
Database | PageVerify
Database | AutoClose
Database | AutoShrink
Database | LastFullBackup
Database | LastDiffBackup
Database | LastLogBackup
Database | VirtualLogFile
Database | LogfileCount
Database | LogfileSize
Database | FileGroupBalanced
Database | AutoCreateStatistics
Database | AutoUpdateStatistics
Database | AutoUpdateStatisticsAsynchronously
Database | DatafileAutoGrowthType
Database | Trustworthy
Database | OrphanedUser
Database | PseudoSimple
Database | AdHocWorkloads
Domain | DomainName
Domain | OrganizationalUnit
HADR | ClusterHealth
HADR | ClusterServerHealth
HADR |
HADR | System.Object[]
Instance | SqlEngineServiceAccount
Instance | SqlBrowserServiceAccount
Instance | TempDbConfiguration
Instance | AdHocWorkload
Instance | BackupPathAccess
Instance | DAC
Instance | NetworkLatency
Instance | LinkedServerConnection
Instance | MaxMemory
Instance | OrphanedFile
Instance | ServerNameMatch
Instance | MemoryDump
Instance | SupportedBuild
Instance | SaRenamed
Instance | DefaultBackupCompression
Instance | XESessionStopped
Instance | XESessionRunning
Instance | XESessionRunningAllowed
Instance | OLEAutomation
Instance | WhoIsActiveInstalled
LogShipping | LogShippingPrimary
LogShipping | LogShippingSecondary
Server | PowerPlan
Server | InstanceConnection
Server | SPN
Server | DiskCapacity
Server | PingComputer
MaintenancePlan | SystemFull
MaintenancePlan | UserFull
MaintenancePlan | UserDiff
MaintenancePlan | UserLog
MaintenancePlan | CommandLog
MaintenancePlan | SystemIntegrityCheck
MaintenancePlan | UserIntegrityCheck
MaintenancePlan | UserIndexOptimize
MaintenancePlan | OutputFileCleanup
MaintenancePlan | DeleteBackupHistory
MaintenancePlan | PurgeJobHistory

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so that you can create different configs for different use cases.

There are 108 configuration items at present. You can see the current configuration by running
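Get-DbcConfig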

which will show you the name of the config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name | Description
agent.databasemailprofile | Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail | Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname | Name of the DBA Operator in SQL Agent
agent.failsafeoperator | Email address of the DBA Operator in SQL Agent
app.checkrepos | Where Pester tests/checks are stored
app.computername | List of Windows Servers that Windows-based tests will run against
app.localapp | Persisted files live here
app.maildirectory | Files for mail are stored here
app.sqlcredential | The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance | List of SQL Server instances that SQL-based tests will run against
app.wincredential | The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck | Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller | The domain controller to process your requests
domain.name | The Active Directory domain that your server is a part of
domain.organizationalunit | The OU that your server should be a part of
mail.failurethreshhold | Number of errors that must be present to generate an email report
mail.from | Email address the email reports should come from
mail.smtpserver | Store the name of the smtp server to send email reports
mail.subject | Subject line of the email report
mail.to | Email address to send the report to
policy.backup.datadir | Destination server data directory
policy.backup.defaultbackupcompreesion | Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours | Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays | Maximum number of days before Full Backups are considered outdated
policy.backup.logdir | Destination server log directory
policy.backup.logmaxminutes | Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod | The number of hours a newly created database is allowed to not have backups
policy.backup.testserver | Destination server for backup tests
policy.build.warningwindow | The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme | Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount | Number of times to ping a server to establish average response time
policy.connection.pingmaxms | Maximum response time in ms
policy.dacallowed | DAC should be allowed $true or disallowed $false
policy.database.autoclose | Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics | Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink | Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics | Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously | Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance | Percentage tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb | Databases to exclude from the file growth check
policy.database.filegrowthtype | Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue | The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount | The number of Log files expected on a database
policy.database.logfilesizecomparison | How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage | Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf | Max virtual log files
policy.dbcc.maxdays | Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree | Percent disk free
policy.dump.maxcount | Maximum number of expected dumps
policy.hadr.tcpport | The TCPPort for the HADR check
policy.identity.usagepercent | Maximum percentage of max of identity column
policy.invaliddbowner.excludedb | Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name | The database owner account should not be this user
policy.network.latencymaxms | Max network latency average
policy.ola.commandlogenabled | Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled | Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database | The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled | Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled | Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed | Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled | Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled | Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled | Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled | Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled | Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention | Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled | Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled | Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled | Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled | Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention | Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled | Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled | Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention | Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled | Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled | Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled | Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled | Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled | Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled | Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention | Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled | Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation | OLE Automation should be enabled $true or disabled $false
policy.pageverify | Page verify option should be set to this value
policy.recoverymodel.excludedb | Databases to exclude from standard recovery model check
policy.recoverymodel.type | Standard recovery model
policy.storage.backuppath | Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb | Databases to exclude from valid dbowner checks
policy.validdbowner.name | The database owner account should be this user
policy.whoisactive.database | Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession | List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession | List of XE Sessions that should not be running.
policy.xevent.validrunningsession | List of XE Sessions that can be running.
skip.backup.testing | Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping | Skip the ping check for connectivity
skip.connection.remoting | Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest | Skip the logfilecount test
skip.datafilegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck | Skip data purity check in last good dbcc command
skip.diffbackuptest | Skip the Differential backup test
skip.logfilecounttest | Skip the logfilecount test
skip.logshiptesting | Skip the logshipping test
skip.tempdb1118 | Don't run test for Trace Flag 1118
skip.tempdbfilecount | Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent | Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax | Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc | Don't run test for Temp Database Files on C

Running A Check

You can quickly run a single check by calling Invoke-DbcCheck.
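For example (the instance name is an example):

Invoke-DbcCheck -SqlInstance localhost -Check FailedJob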

Excellent, my agent jobs have not failed 🙂
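Or check the last known good DBCC CHECKDB:

Invoke-DbcCheck -SqlInstance localhost -Check LastGoodCheckDb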

That's good, all of my databases have had a successful DBCC CHECKDB within the last 7 days.

Setting a Configuration

To save me from having to specify the instance I want to run my tests against I can set the app.sqlinstance config to the instances I want to check.
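The instance names are examples:

Set-DbcConfig -Name app.sqlinstance -Value 'sql0', 'sql1'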

Then whenever I call Invoke-DbcCheck it will run against those instances for the SQL checks

So now if I run
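Invoke-DbcCheck -Check LastDiffBackup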

I can see that I don't have a diff backup for the databases on both instances. Better stop writing this and deal with that!!

The configurations are stored in the registry but you can export them and then import them for re-use easily. I have written another blog post about that.

The Show Parameter

Getting the results of the tests on the screen is cool, but if you are running a lot of tests against a lot of instances then you might find that you have 3 failed tests out of 15000! This will mean a lot of scrolling through green text looking for the red text, and you may find that your PowerShell buffer doesn't hold all of your test results, leaving you very frustrated.

dbachecks supports the Pester Show parameter enabling you to filter the output of the results to the screen. The available values are Summary, None, Fails, Inconclusive, Passed, Pending and Skipped

In my opinion, by far the most useful value is Fails, as this will show you only the failed tests, with the context to enable you to see which tests have failed.
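Invoke-DbcCheck -Check Agent -Show Fails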

If we check all of the checks tagged as Agent we can easily see that most passed but The Job That Fails (surprisingly) failed. All of the other tests that were run for the agent service, operators, failsafe operator, database mail and all other agent jobs all passed in the example below

Test Results are for other People as well

It is all very well and good being able to run tests and get the results on our screen. It will be very useful for people to be able to validate a new SQL instance for example or run a morning check or the first step of an incident response. But test results are also useful for other people so we need to be able to share them

We have created a Power Bi Dashboard that comes with the dbachecks module to enable easy sharing of the test results. You can also send the results via email using Send-DbcMailMessage. We have an open issue for putting them into a database that we would love you to help resolve.

To get the results into PowerBi you can run
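Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production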

This will run all of the dbachecks using your configuration for your Production environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Production

If you then used a different configuration for your development environment and ran
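Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development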

it will run all of the dbachecks using your configuration for your Development environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Development and you would end up with two files in the folder

You can then simply run
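Start-DbcPowerBi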

and as long as you have the (free) Power Bi Desktop then you will see this. You will need to refresh the data to get your test results.

Of course it is Power Bi, so you can publish this report. Here it is so that you can click around and see what it looks like.

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Further Reading

There are many more introduction blog posts covering different areas at

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy LeMaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

Tony Wilhelm @TonyWSQL

and all of the other people who have contributed in the dbachecks Slack channel

Pester 4.2.0 has a Because…… because :-)

I was going through my demo for the South Coast User Group meeting tonight and decided to add some details about the Because parameter available in the Pester pre-release version 4.2.0.

To install a pre-release version you need to get the latest PowerShellGet module. This is pre-installed with PowerShell v6, but for earlier versions open PowerShell as administrator and run
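Install-Module -Name PowerShellGet -Force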

You can try out the Pester pre-release version (once you have the latest PowerShellGet) by installing it from the PowerShell Gallery with
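Something like this (AllowPrerelease needs PowerShellGet 1.6 or later):

Install-Module -Name Pester -AllowPrerelease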

There are a number of improvements, as you can see in the change log. I particularly like these:

  • Add -BeTrue to test for truthy values
  • Add -BeFalse to test for falsy values

This release adds the Because parameter to all assertions. This means that you can add a reason why the test has failed. As Jakub Jareš writes here:

  • Reasons force you to think more
  • Reasons document your intent
  • Reasons make your TestCases clearer

So you can do something like this:
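A minimal sketch (the test is deliberately wrong so that it fails):

Describe "Maths" {
    It "should add up" {
        # this fails, so the error message shows the Because text
        2 + 2 | Should -Be 5 -Because 'my maths teacher told me so'
    }
}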

Which gives an error message like this 🙂

As you can see, the Expected part gives the expected value, then your Because statement, and then the actual result, which means that you could write validation tests like
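A sketch (the path and reason are examples):

Describe "Folders" {
    It "C:\Temp should exist" {
        Test-Path -Path C:\Temp | Should -BeTrue -Because 'our scripts write their logs to C:\Temp'
    }
}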

and get a result like this

Or if you were looking to validate your SQL Server you could write something like this
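A sketch using dbatools to connect (the instance name and the policy in the Because are examples):

$instance = 'localhost'   # example
$srv = Connect-DbaInstance -SqlInstance $instance
Describe "Database settings on $instance" {
    foreach ($db in $srv.Databases | Where-Object IsSystemObject -EQ $false) {
        It "$($db.Name) should be in full recovery" {
            $db.RecoveryModel | Should -Be 'Full' -Because 'we need point-in-time recovery for user databases'
        }
    }
}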

or maybe your security policies allow Windows Groups as logins on your SQL instances. You could easily link to the documentation and explain why this is important. This way you could build up a set of tests to validate your SQL Server is just so for your environment

Just for fun, these would look like this

and the code looks like
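Something along these lines, reusing the $srv connection from above (the policy and the documentation link are hypothetical):

Describe "Logins on $instance" {
    It "should only grant user access via Windows Groups" {
        # hypothetical policy: access is granted via AD groups, not individual Windows logins
        ($srv.Logins | Where-Object LoginType -EQ 'WindowsUser').Count |
            Should -Be 0 -Because 'our security policy (see the internal docs) says access is granted via Windows Groups'
    }
}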

This will be a useful improvement to Pester when it is released and enable you to write validation checks with explanations.

Chrissy has written about dbachecks, the new up-and-coming community-driven open source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

TSQL2sDay – Folks Who Have Made a Difference

tsql2sday

This month's TSQL2sDay is an absolutely brilliant one hosted by Ewald Cress

the opportunity to give a shout-out to people (well-known or otherwise) who have made a meaningful contribution to your life in the world of data.

Fabulous, fabulous idea Ewald, I heartily approve

Now this is going to be difficult. There are so many wonderful people in the #SQLFamily who are so gracious and generous and willing to share. I am also lucky enough to be part of the PowerShell community which is also equally filled with amazing people. I do not want to write a novel or a massive list of people, I don’t want to risk missing someone out (Ewald, I’m beginning to question whether ‘fabulous’ should become ‘tricky and challenging’ !!)

So after consideration I am only going to talk about 4 wonderful people and the effect they have had on my life, my career and my community involvement, but know that I truly appreciate the input that all of these people have had and the amazing friendships that I have all over the world. There is no order to this list; these are 4 of the people in equal first with all the other people I haven't mentioned. This post should really scroll sideways. Interestingly, I noticed after writing this that they are in reverse chronological order in my life!

The Hair!

At PASS Summit this year many people came up to me and said “Hey, Beard ……..” The first person who called me that is an amazing inspiring bundle of talented energy called Chrissy LeMaire

Many moons ago, we exchanged messages over social media and email, chatted after a PowerShell Virtual Group presentation and then one day she asked me to join as an organiser for the Virtual Group.

When dbatools was in its infancy she asked me to help, and since then has given me interesting challenges to overcome, from introducing Pester and AppVeyor to the dbatools development process to creating continuous delivery to our private PowerShell gallery for our summit pre-con, forcing me to learn and implement new and cool things. Our shared love of enabling people to do cool things with PowerShell is so much fun to do 🙂

She is so generous and giving of her time and knowledge and has an amazing capability to get things done, whether by herself or by encouraging and supporting others.

We have presented at many conferences together, both SQL and PowerShell and we have the best of times doing so. It is so refreshing to find someone that I am comfortable presenting with and who has the same passion and energy for inspiring people. (It’s also fun to occasionally throw her off her stride mid-presentation (Thank you Cathrine 🙂 )

I am proud to call her my buddy. You are so inspiring Chrissy.

Thank you Ma’am

Amazing Couple

A few months after becoming a DBA I was the only DBA at the company as the others all left for various reasons. I was drowning in work, had no idea what I should be doing. I knew I didn’t have the knowledge and during that time I began to be aware of the SQL community and all the fine resources that it provides.

I then found out about a local user group and emailed the leader Jonathan Allen (he surprised me by reminding me of this during our pre-con in Singapore a couple of weeks ago!). Jonathan and his wife Annette run the SQL South West user group and are also members of the SQL Bits committee; Annette is also the regional mentor for the UK. They give an awful lot of time and effort to the SQL community in the UK. It took a few months before I even had the time to attend a user group, and in those early days they both answered my naive questions and passed on so much of their technical knowledge and methodology to me, and I soaked it up.

Later on, they invited me to help them to organise SQL Saturday Exeter, encouraged me to speak, gave me fabulous feedback and pointers to improve, encouraged me to volunteer for SQL Bits and have been incredibly supportive. I love them both very much. Neither like having their photo taken so I can’t embarrass them too much.

Next time you see them give them a hug.

Thank you J and A

The First One

Andrew Pruski dbafromthecold and SQL Containers Man

At the time I am talking about he was not a member of the SQL Community, although he possessed all of the qualities that describe such a person. Now he is an established blogger and speaker and attendee of SQL events.

He is one of the DBAs who left me on my own!! He is the first SQL DBA I ever worked with. The person who taught me all those important first bits of knowledge about being a SQL DBA. He imparted a great amount of knowledge in a few months with great patience to an eager newbie.

More than that, he showed me that to succeed in IT, you need to do more than just an everyday 9-5, that it requires more time than that. He instilled in me (without realising it) a work ethic and a thirst for doing things right and gaining knowledge that I still have today. He inspired me when I was faced with trying to understand the mountain of knowledge that is SQL Server that it was possible to learn enough. He taught me the importance of testing things, of understanding the impact of the change that is being made. He showed me how to respond in crises and yet was still willing to share and teach during those times.

He has had a greater impact on me than he will ever know and I have told him this privately many times. I will never forgive him for abandoning me all those years ago and yet that is a large part of what made me who I am today. I was forced to have to deal with looking after a large estate by myself and needed to learn to automate fast and he just about left me with the skills to be able to accomplish that.

Massive shout out to you fella. Thank you

All the Others

Seriously, there are so many other people who I wish I could thank.

Every single one of you who blogs or speaks or records webinars that I have watched – thank you.

All of the organisers who ensure that events happen – thank you

All of the volunteers who assist at those events – thank you.

That group of amazing European speakers at the first SQL Saturday Exeter I attended. The cool group, my wife still reminds me of how I came home from that event so inspired by them. How incredibly generous and welcoming they were and how they welcomed me into their group even though I didn’t feel worthy to share their table. They taught me about the lack of egos and humbleness that defines the SQL family. I am proud to call them my friends now. Thank You (You know who you are)

We have a great community, may its ethos continue for a long time.

Comparing Agent Jobs across Availability Group Replicas with PowerShell

On the plane home from PASS Summit I was sat next to someone who had also attended, and when he saw on my laptop that I was part of the SQL Community we struck up a conversation. He asked me how he could compare SQL Agent Jobs across availability group replicas to ensure that they were the same.

He already knew that he could use Copy-DbaAgentJob from dbatools to copy the jobs between replicas and we discussed how to set up an Agent job to accomplish this. The best way to run an Agent Job with a PowerShell script is described here

Compare-Object

I told him about Compare-Object, a cmdlet available in PowerShell for precisely this task. Take these two SQL instances and their respective Agent Jobs

agentjobcompare.png

So we can see that some jobs are the same and some are different. How can we quickly and easily spot the differences?
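Something like this (the instance names come from my demo machine; -Property Name keeps the comparison to the job names):

$Default = Get-DbaAgentJob -SqlInstance Rob-XPS
$Bolton = Get-DbaAgentJob -SqlInstance Rob-XPS\bolton
Compare-Object $Default $Bolton -Property Name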

Those three lines of code will do it. The first two get the agent jobs from each instance and assign them to variables, and the last one compares them. This is the output
comparison.png
The arrows show that the first three jobs are only on the Bolton instance and the bottom three jobs are only on the default instance.

What If ?

Another option I showed was to use the -WhatIf switch on Copy-DbaAgentJob. This parameter is available on all good PowerShell functions and will describe what the command would do if run. WARNING – if you are using the old SQLPS module from prior to the SSMS 2016 release, -WhatIf will actually run the commands, so update your modules.
We can run
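Copy-DbaAgentJob -Source Rob-XPS -Destination Rob-XPS\bolton -WhatIf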

and get the following result

which shows us that there are two jobs on Rob-XPS which would be created on the Bolton instance

And if they have been modified?

That's good, he said, but what about if the jobs have been modified?
Well, one thing you could do is to compare the jobs' DateLastModified property by using the -Property parameter and the -PassThru switch. This is going to return the jobs which are the same but were modified at a different time
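Reusing the variables from above:

Compare-Object $Default $Bolton -Property DateLastModified -PassThru |
    Sort-Object Name |
    Select-Object Name, DateLastModified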
sortedjobcompare.png
so that you can examine when they were changed. Of course the problem with that is that the DateLastModified is a very precise time so it is pretty much always going to be different. We can fix that but now it is a little more complex.

Just the Date please

We need to gather the jobs in the same way, but create an array of custom objects with a calculated property so that we can then compare on the Date field. The full code looks something like this
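# a calculated Date property strips the time portion of DateLastModified
$Default = Get-DbaAgentJob -SqlInstance Rob-XPS |
    Select-Object Name, @{Name = 'Date'; Expression = { $_.DateLastModified.Date } }
$Bolton = Get-DbaAgentJob -SqlInstance Rob-XPS\bolton |
    Select-Object Name, @{Name = 'Date'; Expression = { $_.DateLastModified.Date } }
Compare-Object $Default $Bolton -Property Name, Date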

This will look like this

datecompare.png
Which is much better and hopefully more useful, but it only works with 2 instances.

I have more than 2 instances

So if we have more than 2 instances it gets a little more complicated, as Compare-Object only supports two arrays. I threw together a quick function to compare each instance with the main instance. This is very rough and will work for now, but I have also created a feature request issue on the dbatools repository so someone (maybe you??) could go and help create those commands
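A rough sketch of such a function (the name, output shape and second instance name are mine, not a dbatools command):

function Compare-AgentJobs {
    param (
        [string]$MainInstance,
        [string[]]$Instances
    )
    $mainJobs = Get-DbaAgentJob -SqlInstance $MainInstance
    foreach ($instance in $Instances) {
        $jobs = Get-DbaAgentJob -SqlInstance $instance
        # compare each instance back to the main instance by job name
        Compare-Object -ReferenceObject $mainJobs -DifferenceObject $jobs -Property Name |
            Select-Object Name, @{
                Name       = 'MissingFrom'
                Expression = { if ($_.SideIndicator -eq '<=') { $instance } else { $MainInstance } }
            }
    }
}

Compare-AgentJobs -MainInstance Rob-XPS -Instances 'Rob-XPS\bolton', 'Rob-XPS\dave'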

which looks like this. It’s not perfect but it will do for now until the proper commands are created

compare agent jobs.png

dbatools with SQL on Docker and running SQL queries

I had a question from my good friend Andrew Pruski, dbafromthecold on Twitter, or SQL Container Man as I call him 🙂

How do you guys run SQL Commands in dbatools

I will answer that at the bottom of this post, but during our discussion Andrew said he wanted to show the version of the SQL running in the Docker Container.

That's easy, I said. Here's how to do it.

You need to have installed Docker first, see this page. You can switch to using Windows containers by right-clicking on the icon in the taskbar and choosing the command. If you have not already, then pull the SQL 2017 image using
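The Windows developer image at the time of writing:

docker pull microsoft/mssql-server-windows-developer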

This may take a while to download and extract the image, but it's worth it; you will be able to spin up new SQL instances in no time.

You can create a new SQL Docker container like this
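The port, name and password are examples:

docker run -d -p 15789:1433 -e sa_password=Testing1122@ -e ACCEPT_EULA=Y --name SQL2017 microsoft/mssql-server-windows-developer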


In only a few seconds you have a SQL 2017 instance up and running (take a look at Andrew's blog at dbafromthecold.com for a great container series with much greater detail)

Now that we have our container we need to connect to it, which means we need to gather its IP address. We can do this with the docker inspect command, but I like to make things a little more programmatic. This works on my Windows 10 machine for Windows SQL containers; it appears there are some errors on other machines, but there is an alternative below.
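A sketch of those two lines, using the container created above:

# Put the JSON output of docker inspect into a variable
$inspect = docker inspect SQL2017
# Use regex to pull the IPAddress value out of the JSON
$ipAddress = [regex]::Match($inspect, '"IPAddress":\s*"(\d{1,3}(?:\.\d{1,3}){3})"').Groups[1].Value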


Those two lines of code (and their comments) put the results of the docker inspect command into a variable and then use regex to pull out the IP address.

If you are getting errors with that, you can also use
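the format option of docker inspect, which avoids the regex (a sketch, assuming the default nat network for Windows containers):

$ipAddress = docker inspect --format "{{ .NetworkSettings.Networks.nat.IPAddress }}" SQL2017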

Thanks Andrew 🙂

Now we just need our credentials to connect to the instance
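A sketch, prompting for the sa account set up when the container was created:

$cred = Get-Credential -UserName sa -Message 'Enter the sa password'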


and we can connect to our SQL container
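using the IP address and credential variables from above:

$srv = Connect-DbaInstance -SqlInstance $ipAddress -SqlCredential $cred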

and get the version
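from the server object that comes back:

# The SMO server object returned by Connect-DbaInstance exposes a Version property
$srv.Version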

and many many other properties, just run
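$srv | Get-Member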

to see them. At the bottom, you will see a ScriptMethod called Query, which means that you can do things like
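this, for example (the query text here is just an illustration):

# The Query ScriptMethod runs T-SQL against the connected instance
$srv.Query("SELECT @@VERSION AS Version")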


Which looks like

It's slightly different with a Linux SQL container. Switch Docker to run Linux containers by right-clicking on the icon in the taskbar and choosing the command.
If you haven't already, pull the Linux SQL image
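which at the time of writing was named as below:

docker pull microsoft/mssql-server-linux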

and then create a container
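Something like this; the Linux images use the SA_PASSWORD and ACCEPT_EULA environment variables, and the name, port, and password are again illustrative:

docker run -d -p 15789:1433 --name SQLLinux -e ACCEPT_EULA=Y -e SA_PASSWORD=Password0! microsoft/mssql-server-linux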


Now we just need to connect with localhost and the port number which we have specified already and we can connect again
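A sketch, using the port that was mapped when the container was created:

$srv = Connect-DbaInstance -SqlInstance 'localhost,15789' -SqlCredential $cred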


Of course, this isn't restricted to just Connect-DbaInstance; you can do this with any dbatools command.

Go and explore your Docker SQL containers with dbatools 🙂

You can get it using
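Install-Module dbatools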

and find commands with
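# Search the dbatools help for a keyword; backup is just an example
Find-DbaCommand -Pattern backup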

Don’t forget to use Get-Help with the name of the command to get information about how to use it

Enjoy 🙂

TSQL2sDay – Get-PostRoundup

First an apology, this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling; unfortunately, I found a major error in Pester (again, not strictly true, but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, default settings, best practices, migrations, warnings in Agent Jobs, sqlpackage, VLFs, CMS, disabling Named Pipes, orphaned users, AG status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first-timers at the monthly TSQL2sDay blog party. Please don't let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the medium of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!


Using Get-SQLDiagFix to get information from the SQL Server Diagnostic API with PowerShell

The SQL Server Diagnostics Preview was announced just over a week ago. It includes an add-on for SQL Server Management Studio to enable you to analyse SQL Server memory dumps and view information on the latest SQL Server cumulative updates for supported versions of SQL Server. Arun Sirpal has written a good blog post showing how to install it and use it in SSMS to analyse dumps.

There is also a developer API available, so I thought I would write some PowerShell to consume it, as there are no PowerShell code examples available in the documentation!

In a previous post I explained how I created the module and a GitHub repository and used Pester to help me to develop the first command, Get-SQLDiagRecommendations. At present the module has 5 commands, all for accessing the Recommendations API.

This post is about the command Get-SQLDiagFix which returns the Product Name, Feature Name/Area, KB Number, Title and URL for the Fixes in the Cumulative Updates returned from the SQL Server Diagnostics Recommendations API.

PowerShell Gallery

The module is available on the PowerShell Gallery which means that you can install it using

Install-Module SQLDiagAPI

as long as you have the latest version of the PowerShellGet module. This is already installed in Windows 10 and with WMF 5 but you can install it on the following systems

  • Windows 8.1 Pro
  • Windows 8.1 Enterprise
  • Windows 7 SP1
  • Windows Server 2016 TP5
  • Windows Server 2012 R2
  • Windows Server 2008 R2 SP1

following the instructions here.

If you are not running your PowerShell using a local administrator account you will need to run

Install-Module SQLDiagAPI -Scope CurrentUser

to install the module.

If you can’t use the PowerShell Gallery you can install it using the instructions on the repository

API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here to get one for the SQL Server Diagnostics API.

01 - APIKey

You will need to store the key to use it. I recommend saving the API Key using the Export-CliXML command as described by Jaap Brasser here .

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as, hopefully, user accounts will have access there in most shops.

This will save you from having to enter the APIKey every time you run the commands as the code is looking for it to be saved in that file.

The Commands

Once you have installed the module and saved the API Key, they will be available whenever you start PowerShell. The first time you install, you may need to run

Import-Module SQLDiagAPI

to load it into your session. Once it is loaded you can view the available commands using

Get-Command -Module SQLDiagAPI

01 - SQLDiagAPI Commands.png

You can find out more about the commands on the GitHub repository, and the help files are in the documentation.

Get-Help

Always, always when starting with a new module or function in PowerShell you should start with Get-Help. I like to use the -ShowWindow parameter to open the help in a separate window as it has all of the help and a handy search box.

Get-Help Get-SQLDiagFix

02 - Get-Help Get-SQLDiagFix.png

Good help should always include plenty of examples to show people how to use the command. There are 12 examples in the help for Get-SQLDiagFix. You can view just the examples using

Get-Help Get-SQLDiagFix -examples

Get All Of The Fixes

The easiest thing to do is to get all of the available fixes from the API. This is done using

Get-SQLDiagFix

which will return all 123 Fixes currently referenced in the API.

03 get-sqldiagfix.png

That is just a lot of information on the screen. If we want to search through that with PowerShell we can use Out-GridView

Get-SQLDiagFix | Select Product, Feature, KB, Title | Out-GridView

05 Get-SQLDiagFix OutGridView Search.gif

Or maybe if you want to put them in a database you could use dbatools

$Fixes = Get-SQLDiagFix | Out-DbaDataTable
Write-DbaDataTable -SqlServer $Server -Database $DB -InputObject $Fixes -Table Fixes -AutoCreateTable

Get Fixes for a Product

If you only want to see the fixes for a particular product you can use the product parameter. To see all of the products available in the API you can run

Get-SQLDiagProduct

06 Get-SQLDiagProduct.png

You can either specify the product

Get-SQLDiagFix -Product 'SQL Server 2016 SP1' | Format-Table

07 Get-SQLDiagFix Product.png

or you can pipe the results of Get-SQLDiagProduct to Get-SQLDiagFix which enables you to search. For example, to search for all fixes for SQL Server 2014 you can do

Get-SQLDiagProduct 2014 | Get-SQLDiagFix | Format-Table -AutoSize

08 - Get-SQLDiagFix Product Search.png

Which will show the fixes available in the API for SQL Server 2014 SP1 and SQL Server 2014 SP2

Get The Fixes for A Feature

The fixes in the API are also categorised by feature area. You can see all of the feature areas using Get-SQLDiagFeature

Get-SQLDiagFeature

09 get-sqldiagfeature.png

You can see the fixes in a particular feature area using the Feature parameter with

Get-SQLDiagFix -Feature Spatial | Format-Table -AutoSize

10 - Get-SQLDiagFix by feature.png

or you can search for a feature with a name like query and show the fixes using

Get-SQLDiagFix -Feature (Get-SQLDiagFeature query) | Format-Table -AutoSize

11 - Get-SQLDiagFix by feature query.png

Get Fixes for a Product and a Feature

You can combine the two approaches above to search for fixes by product and feature area. If you want to see the fixes for SQL Server 2016 to do with backups, you can use

Get-SQLDiagProduct 2016 | Get-SQLDiagFix -Feature (Get-SQLDiagFeature backup) | Format-Table -AutoSize

12 - Get-SQLDiagFix by feature adn product.png

No-one wants to see the words “…restore fails when….”! This is probably a good time to fix that.

Open the KB Article Web-Page

As well as getting the title and KB number of the fix, you can open the web-page for it. This code will take the fixes for all SP1 products in feature areas matching 'al', show them in Out-GridView, and enable you to choose one (or more) and open them in your default browser

Get-SQLDiagProduct SP1 | Get-SQLDiagFix -Feature (Get-SQLDiagFeature -Feature al) `
| Out-GridView -PassThru | ForEach-Object {Start-Process $_.URL}
13 - Open a webpage.gif


There is a YouTube video as well showing how to use the command


You can find the GitHub repository at  https://github.com/SQLDBAWithABeard/SQLDiagAPI