Running Windows and Linux SQL Containers together

Just for fun I decided to spend Christmas Eve getting Windows and Linux SQL containers running together.

WARNING

This is NOT a production-ready solution; in fact, I would not even recommend that you try it.
I definitely wouldn't recommend it on any machine with anything useful on it that you want to use again.
We will be using a re-compiled dockerd.exe created by someone else, and you know the rules about downloading things from the internet and trusting unknown, unverified people, don't you?

Maybe you can try this in an Azure VM or somewhere else safe.

Anyway, with that in mind, let's go.

Linux Containers On Windows

You can run Linux containers on Windows in Docker as follows. You need to be running the latest Docker for Windows.

Right click on the whale in the task bar and select Settings

Notice that I am running Windows containers, as there is a Switch to Linux containers option. If you see Switch to Windows containers then click that first.

Click on Daemon and then tick the experimental features tick box and press apply.

Docker will restart and you can now run Linux containers alongside Windows containers.

So you can pull the Ubuntu container with

docker pull ubuntu:18.04

and then you can run it with

docker run -it --name ubuntu ubuntu:18.04

There you go, one Linux container running 🙂
A good resource for learning bash for SQL Server DBAs is Kellyn Pot’Vin-Gorman’s b | t series on Simple Talk

Type exit to get out of the container, and then remove it with

docker rm ubuntu


Running SQL Linux Containers On Windows

So can we run SQL containers?

Well, we can pull the image successfully.

docker pull mcr.microsoft.com/mssql/server:2019-CTP2.2-ubuntu

If you try that without the experimental features enabled you will get this error.

image operating system “linux” cannot be used on this platform

So you would think that you can use the code from Andrew ‘dbafromthecold’ Pruski’s b | t excellent container series

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env SA_PASSWORD=Testing1122 --name testcontainer mcr.microsoft.com/mssql/server:2019-CTP2.2-ubuntu

When you do, the command will finish successfully but the container won’t be started (as can be seen by the red dot in the docker explorer).

If you look at the logs for the container (I am lazy, I right click on the container and choose show logs in VS Code 🙂 ) you will see

sqlservr: This program requires a machine with at least 2000 megabytes of memory.
/opt/mssql/bin/sqlservr: This program requires a machine with at least 2000 megabytes of memory.
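If you are not in VS Code, the same output is available from the command line with the container name:

docker logs testcontainer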

Now, if you are running Linux containers, this is an easy fix. All you have to do is right click on the whale in the taskbar, choose Settings, then Advanced, move the slider for the Memory and click Apply.

But in Windows containers that option is not available.

If you go a-googling you will find that Shawn Melton created an issue for this many months ago, which gets referenced by this issue for the guest compute service, which references this PR in moby. But as this hasn’t been merged into master yet it is not available. I got bored of waiting for this and decided to look a bit deeper today.

Get It Working Just For Fun

So, you read the warning at the top?

Now let’s get it working. I take zero credit here. All of the work was done by Brian Weeteling b | G in this post.

So you can follow Brian’s examples and check out the source code and compile it as he says, or you can download the exe that he has made available (remember the warning?).

Stop Docker for Windows, and with the file downloaded and unzipped, open an admin PowerShell, navigate to the directory where the dockerd.exe file is, and run

.\dockerd.exe

You will get an output like this and it will keep going for a while.

Leave this window open whilst you are using Docker like this. Once you see that it has finished starting up, open a new PowerShell window or VS Code. You will need to run it as admin. I ran

docker ps -a

to see if it was up and available.

I also had to create a bootx64.efi file at C:\Program Files\Linux Containers which I did by copying and renaming the kernel file in that folder.
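That copy is a one-liner in PowerShell (a sketch – the name of the kernel file in that folder may differ on your machine):

# create bootx64.efi by copying and renaming the existing kernel file
Copy-Item 'C:\Program Files\Linux Containers\kernel' 'C:\Program Files\Linux Containers\bootx64.efi'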

Now I can use a docker-compose file to create 5 containers. Four will be Windows containers from Andrew’s Docker Hub repositories or Microsoft’s Docker Hub for SQL 2012, SQL 2014, SQL 2016, and SQL 2017, and one will be the latest Ubuntu SQL 2019 CTP 2.2 image. Note that you have to use version 2.4 of the docker-compose file format as the platform tag is not available yet in any later version, although it is coming to 3.7 soon.

version: '2.4'

services:
    sql2019:
        image: mcr.microsoft.com/mssql/server:2019-CTP2.2-ubuntu
        platform: linux
        ports:  
          - "15585:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2012:
        image: dbafromthecold/sqlserver2012dev:sp4
        platform: windows
        ports:  
          - "15589:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2014:
        image: dbafromthecold/sqlserver2014dev:sp2
        platform: windows
        ports:  
          - "15588:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2016:
        image: dbafromthecold/sqlserver2016dev:sp2
        platform: windows
        ports:  
          - "15587:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
    sql2017:
        image: microsoft/mssql-server-windows-developer:2017-latest
        platform: windows
        ports:  
          - "15586:1433"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"

Save this code as docker-compose.yml and navigate to the directory in an admin PowerShell or VS Code and run

docker-compose up -d

and now I have Windows and Linux SQL containers running together. This means that I can test some code against all versions of SQL from 2012 to 2019 easily in containers 🙂
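If you want to check that they are all up, a quick sketch like this (using dbatools and the ports and sa password from the compose file above) will return the build of each instance:

# build a credential for the sa password from the compose file
$securePassword = ConvertTo-SecureString 'Password0!' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ('sa', $securePassword)
# ask each container for its build reference
('localhost,15585','localhost,15586','localhost,15587','localhost,15588','localhost,15589').ForEach{
    Get-DbaSqlBuildReference -SqlInstance $PSItem -SqlCredential $cred
}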

So that is just a bit of fun.

To return to the normal Docker, simply press CTRL+C in the admin PowerShell you ran .\dockerd.exe in and you will see the logs showing it shutting down.

You will then be able to start Docker For Windows as usual.

I look forward to the time, hopefully early next year, when all of the relevant PRs have been merged and this is available in Docker for Windows.

Happy Automating 🙂

Gathering all the Logs and Running the Availability Group Failover Detection Utility with PowerShell

30/11/2018 – Function has been updated to deal with named instances.

Last week the Tiger Team released their Availability Group Failover Detection Utility which will provide root cause analysis on Cluster Logs, SQL Error Logs, and the Availability groups extended events logs. There is a blog post here and the tool can be downloaded from the Tiger Team GitHub Repository

A Bit of Faffing*

The readme for the Tiger Team GitHub Repository states:

Repository for Tiger team for “as-is” solutions and tools/scripts that the team publishes.

The important words are “as-is” – sometimes these tools need a bit of faffing, some looking after!

There is a pre-requisite and sometimes a little “fixing” that you need to do to get it to run correctly.

First, install the “Microsoft Visual C++ Redistributable for Visual Studio 2017” from here. On the download page, scroll down to the “Other Tools and Frameworks” section to download the redistributable (x64 version).


Then when you run FailoverDetection.exe you may get strong name validation errors like this one.


Unhandled Exception: System.IO.FileLoadException: Could not load file or assembly 'Microsoft.SqlServer.XEvent.Linq, Version=15.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. Strong name validation failed. (Exception from HRESULT: 0x8013141A) ---> System.Security.SecurityException: Strong name validation failed. (Exception from HRESULT: 0x8013141A)
--- End of inner exception stack trace ---
at FailoverDetector.XeventParser.LoadXevent(String xelFileName, String serverName)

Then you will need to run the sn.exe tool which is in the zip file. Use this syntax.

.\sn.exe -Vr PATHTODLLFile


I had to do it for two DLLs.
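For example, for the assembly named in the error above (the name of the second DLL may differ on your machine):

.\sn.exe -Vr .\Microsoft.SqlServer.XEvent.Linq.dll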

NOTE – If you get an error like this when running sn.exe (or any executable) from PowerShell it means that you have missed the .\ (dot whack) in front of the executable name.


* Faffing – Doing something that is a bit awkward.

Logs required for the Tool

To run the Failover Detection Utility you need to gather the following information from each replica and place it in the specified data folder (there is a sketch of doing this manually after the list).

  • SQL error logs
  • Always On Availability Groups Extended Event Logs
  • System Health Extended Event Logs
  • System log
  • Windows cluster log
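As a rough sketch, gathering those manually for one replica might look like this (hypothetical paths – Get-ClusterLog comes with the FailoverClusters module, and the wevtutil export needs to run on the replica itself):

# generate the cluster log for a node straight into the data folder
Get-ClusterLog -Node Replica1 -Destination C:\FailoverDetection\Data
# export the System event log (run on the replica)
wevtutil epl System C:\FailoverDetection\Data\Replica1_System.evtx
# copy the SQL error logs and the extended event (.xel) files from the instance log directory
Copy-Item '\\Replica1\c$\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Log\ERRORLOG*' C:\FailoverDetection\Data
Copy-Item '\\Replica1\c$\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Log\*.xel' C:\FailoverDetection\Data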

Once you have gathered all of that data then you need to alter the configuration file for the executable.

{
    "Data Source Path": "Path to Data File",
    "Health Level": 3,
    "Instances": [
        "Replica1",
        "Replica2",
        "Replica3"
    ]
}

Running The Tool

Once you have done that you can then run the Failover Detection Utility. You can double click the exe,


or you can run it from the command line.


In both cases it won’t exit, so when you see “Saving Results to JSON file”, you can press enter (sometimes twice!).

The results can be seen in the JSON file, which will be stored in a Results directory in the directory where FailoverDetection.exe exists.

results.PNG

You can also use some switches with the FailoverDetection utility.

-Analyze – When -Analyze is specified as a parameter, the utility will load the configuration file without copying log data. It assumes the log files have already been copied over. It does everything as default mode except copying log data. This option is useful if you already have the data in the local tool execution subdirectories and want to rerun the analysis.

-Show – The utility, after analyzing log data, will display the results in the command console. Additionally, the results will be persisted to a JSON file in the results folder.
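So a rerun of an earlier analysis with the results displayed in the console would be something like this (check the readme for the exact switch syntax on your version):

.\FailoverDetection.exe --Analyze --Show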

They look like this


Again, you need to press enter for the details to come through. The results are still saved to the Results folder as JSON as well, so you won’t lose them.

When You Are Doing Something More Than Once ….

Automate it 🙂

When I saw the data that needed to be gathered for this tool, I quickly turned to PowerShell to enable me to easily gather the information. That has turned into a function which will

  • Download and extract the zip file from the Tiger Team GitHub repository
  • Identify all of the replicas for an Availability Group and dynamically create the configuration JSON file
  • Gather all of the required log files and place them in a specified data folder
  • Run the FailoverDetection.exe with any of the switches
  • Includes -Verbose, -Confirm, -Whatif switches so that you can easily see what is happening, be prompted to confirm before actions or see what would happen if you ran the function
  • You still need to press enter at the end though 🙁
  • and you will still need to install the “Microsoft Visual C++ Redistributable for Visual Studio 2017” and run the strong name tool if needed

This function requires PowerShell version 5, the FailoverClusters module and the dbatools module.

You can get the function from my GitHub Functions Repository here (at the moment – will be adding to dbatools see below)

Load the function by either running the code or if you have it saved as a file dot-sourcing it.

. .\Invoke-SqlFailOverDetection.ps1

There are two .’s with a space in between and then a \ without a space, so Dot Space Dot Whack path to file.

The next thing you should do is what you should always do with a new PowerShell function, look at the help.

Get-Help Invoke-SqlFailOverDetection -Detailed

You will find plenty of examples to get you going and explanations of all of the parameters.

Let’s see it in action.

First let’s run with a -WhatIf switch which will show us what will happen without performing any state changing actions.

$InstallationFolder = 'C:\temp\failoverdetection\new\Install'
$DownloadFolder = 'C:\temp\failoverdetection\new\Download'
$DataFolder = 'C:\temp\failoverdetection\new\Data'
$SQLInstance = 'SQL0'

$invokeSqlFailOverDetectionSplat = @{
    DownloadFolder     = $DownloadFolder
    SQLInstance        = $SQLInstance
    DataFolder         = $DataFolder
    InstallationFolder = $InstallationFolder
}
Invoke-SqlFailOverDetection @invokeSqlFailOverDetectionSplat -WhatIf


So you can see that if we run it without the -WhatIf switch it will

  • Create some directories
  • Download the zip file from the repo
  • Extract the zip file
  • Copy the required logs from each of the replicas to the data folder
  • Create the JSON configuration file
  • Run the executable

NOTE: I have limited the gathering of the system event log to the last 2 days to limit the amount of time spent dealing with a large system log. I gather all of the SQL error logs in the error log path as that works for the first scenario I wrote this for; your mileage may vary.

So if we want to run the command we can remove the -WhatIf switch.

$InstallationFolder = 'C:\temp\failoverdetection\new\Install'
$DownloadFolder = 'C:\temp\failoverdetection\new\Download'
$DataFolder = 'C:\temp\failoverdetection\new\Data'
$SQLInstance = 'SQL0'

$invokeSqlFailOverDetectionSplat = @{
    DownloadFolder     = $DownloadFolder
    SQLInstance        = $SQLInstance
    DataFolder         = $DataFolder
    InstallationFolder = $InstallationFolder
}
Invoke-SqlFailOverDetection @invokeSqlFailOverDetectionSplat

It can take a little while to run depending on the number of replicas, size of logs etc. but once it has started running you can do other things.

It will require being run as an account with permissions to all of the folders specified and Windows and SQL permissions on all of the replicas in the Availability Group.


As you can see below it has gathered all of the results and placed them in the data folder.


The results can be found in the results folder.


If I have already run the tool, I can use the Analyze switch to save gathering the data again. I also use the AlreadyDownloaded switch as I do not need to download the zip file again.

$invokeSqlFailOverDetectionSplat = @{
    DownloadFolder     = $DownloadFolder
    SQLInstance        = $SQLInstance
    DataFolder         = $DataFolder
    InstallationFolder = $InstallationFolder
    AlreadyDownloaded  = $true
    Analyze            = $true
}
Invoke-SqlFailOverDetection @invokeSqlFailOverDetectionSplat
and the results are again saved in the results folder.
I can show the results on the screen as well as saving them as JSON with the Show parameter.
$InstallationFolder = 'C:\temp\failoverdetection\Install'
$DownloadFolder = 'C:\temp\failoverdetection\Download'
$DataFolder = 'C:\temp\failoverdetection\Data'
$SQLInstance = 'SQL0'

$invokeSqlFailOverDetectionSplat = @{
    DownloadFolder     = $DownloadFolder
    SQLInstance        = $SQLInstance
    DataFolder         = $DataFolder
    InstallationFolder = $InstallationFolder
    AlreadyDownloaded  = $true
    Analyze            = $true
    Show               = $true
}
Invoke-SqlFailOverDetection @invokeSqlFailOverDetectionSplat


You will then need to press enter to get the next lot of results.


Why Not Add This To dbatools?

I haven’t added this to dbatools (yet) because I wrote it in this way for a particular need and dbatools requires support for PowerShell V3. I have, however, created an issue in the dbatools GitHub Repository (as this is how you start the process of adding things to dbatools) so hopefully we can get it in there soon as well – in which case I will come back and update this post.

 

Happy Automating!

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks, and one of the questions I got asked was “will you write a command to put the results of the checks into a database for historical reporting?”

The answer is no and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will require. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $SQL0,$SQL1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $SQL0,$SQL1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, PseudoSimple,SPN, TestLastBackupVerifyOnly,IdentityUsage,SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier.

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save them to a variable.
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results.

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables, one for the summary


and one for the details

This works absolutely fine and I could continue to add test results in this fashion but it has no date property so it is not so useful for reporting.

Create tables and triggers

This is one way of doing it. I am not sure it is the best way, but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better/different solution please blog about it and reference it in the comments below.

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today’s date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO
and for the details I do the same thing. A details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A stage table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, plus Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. Then I can get the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False

Which means that I can now query some of this data and also create PowerBi reports for it.
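For example, a quick sketch of a trend query, run with dbatools (assuming the tables above live in the ValidationResults database used below):

$query = "
SELECT TestDate, TotalCount, PassedCount, FailedCount
FROM [dbachecks].[Prod_dbachecks_summary]
ORDER BY TestDate DESC;"
Invoke-DbaSqlQuery -SqlInstance $sql0 -Database ValidationResults -Query $query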

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren job names are configuration items, I need to update the values for those checks, which I can do as follows

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power BI report in my GitHub, which also has the code from this blog post.

Then you just need to open it in Power BI Desktop and

  • Click Edit Queries
  • Click Data Source Settings
  • Click Change Source
  • Change the Instance and Database names


Then you have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the check and the import today, click on one of the groups under “Today’s Checks by Group”.

This enables me to filter the results and see what has happened in the past, so I can filter by one instance, or by a group of tests, or even by a group of tests for an instance.

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the PowerBi in my GitHub

Happy Validating!

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL versions, so I have installed

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programs I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my docker location from my C:\ drive to my E:\ drive so I didn’t run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding

{"data-root": "E:\\containers"}
and restarting the docker service, which created the folder structure under E:\containers.
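A minimal sketch of that whole step in PowerShell (assuming the default docker service name):

# stop docker, point its data root at E:\containers, then start it again
Stop-Service docker
Set-Content -Path 'C:\ProgramData\docker\config\daemon.json' -Value '{"data-root": "E:\\containers"}'
Start-Service docker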
Then I ran
docker volume create SQLBackups
to create a volume to hold the backups that I could mount on the containers
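docker volume inspect will show you where the volume lives on disk (the Mountpoint property), which is the _data path used in the next step:

docker volume inspect SQLBackups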

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Get-ChildItem $Home\Downloads\AdventureWorks* | Copy-Item -Destination E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the DockerHub I ran

docker pull microsoft/mssql-server-windows-developer:latest

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub, so it is just a case of running

docker pull dbafromthecold/sqlserver2016dev:sp1
docker pull dbafromthecold/sqlserver2014dev:sp2
docker pull dbafromthecold/sqlserver2012dev:sp4
and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as

docker run -d -p ExposedPort:InternalPort --name NAME -v VolumeName:LocalFolder -e sa_password=THEPASSWORD -e ACCEPT_EULA=Y IMAGENAME
so all I needed to run to create 4 SQL containers, one of each version, was
docker run -d -p 15789:1433 --name 2017 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y microsoft/mssql-server-windows-developer
docker run -d -p 15788:1433 --name 2016 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2016dev:sp1
docker run -d -p 15787:1433 --name 2014 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2014dev:sp2
docker run -d -p 15786:1433 --name 2012 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2012dev:sp4

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂



Storing Credentials

This is not something I would do in a production environment, but I save my credentials using this method that Jaap Brasser b | t shared here

Get-Credential | Export-Clixml -Path $HOME\Documents\sa.cred
which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using
$cred = Import-Clixml $HOME\Documents\sa.cred

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable

$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name

and the container connection strings, which are the hostname and the port number

$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
then I can restore the databases using dbatools, with a switch statement on the version, which I get from the NameLevel property of Get-DbaSqlBuildReference
$cred = Import-Clixml $HOME\Documents\sa.cred
$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name
$containers.ForEach{
    $Container = $PSItem
    $NameLevel = (Get-DbaSqlBuildReference -SqlInstance $Container -SqlCredential $cred).NameLevel
    switch ($NameLevel) {
        2017 {
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path C:\sqlbackups\ -useDestinationDefaultDirectories -WithReplace | Out-Null
            Write-Verbose -Message "Restored Databases on 2017"
        }
        2016 {
            $Files = $Filenames.Where{$PSItem -notlike '*2017*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2016"
        }
        2014 {
            $Files = $Filenames.Where{$PSItem -notlike '*2017*' -and $PSItem -notlike '*2016*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2014"
        }
        2012 {
            $Files = $Filenames.Where{$PSItem -like '*2012*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2012"
        }
        Default {}
    }
}
I need to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container. Note that Get-DbaDatabase gives the container ID as the Computer Name for each container.
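A quick sketch to verify what was restored where, reusing the $containers and $cred variables from above:

$containers.ForEach{
    # ComputerName comes back as the container ID for each container
    Get-DbaDatabase -SqlInstance $PSItem -SqlCredential $cred |
        Select-Object ComputerName, Name
}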
That is how easy it is to create a number of SQL containers of differing versions for your presentation or exploring needs.
Happy Automating!

dbachecks – Configuration Deep Dive

Today is the day that we have announced dbachecks, a PowerShell module enabling you to validate your SQL instances. You can read more about it here, where you can learn how to install it and see some simple use cases.

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler enabling you to set configuration items at run-time or for your session and enabling you to export and import them so you can create different configs for different use cases

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of the config, the value it is currently set to and the description

You can see all of the configs and their descriptions here

Name – Description

agent.databasemailprofile – Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail – Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname – Name of the DBA Operator in SQL Agent
agent.failsafeoperator – Email address of the DBA Operator in SQL Agent
app.checkrepos – Where Pester tests/checks are stored
app.computername – List of Windows Servers that Windows-based tests will run against
app.localapp – Persisted files live here
app.maildirectory – Files for mail are stored here
app.sqlcredential – The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance – List of SQL Server instances that SQL-based tests will run against
app.wincredential – The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck – Invoke-DbcCheck: The checks that should be skipped by default
domain.domaincontroller – The domain controller to process your requests
domain.name – The Active Directory domain that your server is a part of
domain.organizationalunit – The OU that your server should be a part of
mail.failurethreshhold – Number of errors that must be present to generate an email report
mail.from – Email address the email reports should come from
mail.smtpserver – Store the name of the smtp server to send email reports
mail.subject – Subject line of the email report
mail.to – Email address to send the report to
policy.backup.datadir – Destination server data directory
policy.backup.defaultbackupcompreesion – Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours – Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays – Maximum number of days before Full Backups are considered outdated
policy.backup.logdir – Destination server log directory
policy.backup.logmaxminutes – Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod – The number of hours a newly created database is allowed to not have backups
policy.backup.testserver – Destination server for backup tests
policy.build.warningwindow – The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme – Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount – Number of times to ping a server to establish average response time
policy.connection.pingmaxms – Maximum response time in ms
policy.dacallowed – DAC should be allowed $true or disallowed $false
policy.database.autoclose – Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics – Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink – Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics – Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously – Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance – Percentage of tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb – Databases to exclude from the file growth check
policy.database.filegrowthtype – Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue – The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB
policy.database.logfilecount – The number of Log files expected on a database
policy.database.logfilesizecomparison – How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage – Maximum percentage of Data file Size that logfile is allowed to be
policy.database.maxvlf – Max virtual log files
policy.dbcc.maxdays – Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree – Percent disk free
policy.dump.maxcount – Maximum number of expected dumps
policy.hadr.tcpport – The TCPPort for the HADR check
policy.identity.usagepercent – Maximum percentage of max of identity column
policy.invaliddbowner.excludedb – Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name – The database owner account should not be this user
policy.network.latencymaxms – Max network latency average
policy.ola.commandlogenabled – Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled – Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database – The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled – Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled – Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed – Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled – Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled – Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled – Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled – Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled – Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention – Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled – Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled – Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled – Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled – Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention – Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled – Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled – Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention – Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled – Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled – Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled – Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled – Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled – Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled – Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention – Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled – Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation – OLE Automation should be enabled $true or disabled $false
policy.pageverify – Page verify option should be set to this value
policy.recoverymodel.excludedb – Databases to exclude from standard recovery model check
policy.recoverymodel.type – Standard recovery model
policy.storage.backuppath – Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb – Databases to exclude from valid dbowner checks
policy.validdbowner.name – The database owner account should be this user
policy.whoisactive.database – Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession – List of XE Sessions that should be running
policy.xevent.requiredstoppedsession – List of XE Sessions that should not be running
policy.xevent.validrunningsession – List of XE Sessions that can be running
skip.backup.testing – Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping – Skip the ping check for connectivity
skip.connection.remoting – Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled – Skip validation of datafiles which have growth value equal to zero
skip.database.logfilecounttest – Skip the logfilecount test
skip.datafilegrowthdisabled – Skip validation of datafiles which have growth value equal to zero
skip.dbcc.datapuritycheck – Skip data purity check in last good dbcc command
skip.diffbackuptest – Skip the Differential backup test
skip.logfilecounttest – Skip the logfilecount test
skip.logshiptesting – Skip the logshipping test
skip.tempdb1118 – Don't run test for Trace Flag 1118
skip.tempdbfilecount – Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent – Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax – Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc – Don't run test for Temp Database Files on C

So there are a lot of configurations that you can use. Many are already set by default, but you can configure all of them with the values that you need for your own estate.

The configurations are stored in the registry at HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\

First Configurations

First I would run this so that you can see all of the configs in a separate window (note this does not work on PowerShell v6)

Get-DbcConfig | Out-GridView

Let's start with the first configurations that you will want to set. These should be the instances and the hosts that you want to check.

You can get the value of the configuration item using

Get-DbcConfigValue -Name app.sqlinstance

As you can see in the image, nothing is returned, so we have no instances configured at present. We have added tab completion to the Name parameter so that you can easily find the right one.

If you want to look at more information about the configuration item you can use

Get-DbcConfig -Name app.sqlinstance

which shows you the name, current value and the description

So let's set our first configuration for our SQL instance to localhost. I have included a video so you can see the auto-complete in action as well.

Set-DbcConfig -Name app.sqlinstance localhost

This configuration will be used for any SQL-based checks but not for any Windows-based ones like Services, PowerPlan, SPN, DiskSpace, Cluster, so let's set the app.computername configuration as well, as shown below.
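For example, to point the Windows-based checks at the same machine:

Set-DbcConfig -Name app.computername -Value localhost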

This means that when we run Invoke-DbcCheck with AllChecks or by specifying a check, it will run against the local machine and default instance unless we specify a sqlinstance when calling Invoke-DbcCheck. So the code below will not use the configuration for app.sqlinstance.

Invoke-DbcCheck -SqlInstance TheBeard

Exclude a Check

You can exclude a check using the -ExcludeCheck parameter of Invoke-DbcCheck. In the example below I am running all of the Server checks but excluding the SPN check as we are not on a domain.

Invoke-DbcCheck -Check Server -ExcludeCheck SPN

 

There is a configuration setting to exclude checks as well. (Be careful – this will exclude them even if you specifically specify a check using Invoke-DbcCheck, but we do give you a warning!)

So now I can run

Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value SPN
Invoke-DbcCheck -Check Server

and all of the server checks except the SPN check will run against the local machine and the default instance that I have set in the config

Creating an environment config and exporting it to use any time we like

So let's make this a lot more useful. Let's create a configuration for our production environment and save it to disk (or even source control it!) so that we can use it again and again. We can also then pass it to other members of our team or even embed it in an automated process or our CI/CD system.

Let's build up a configuration for a number of tests for my “production” environment. I will not explain them all here but let you read through the code and the comments to see what has been set. You will see that some of them are due to me running the tests on a single machine with one drive.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value localhost                                                                                                                                                                                                          
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost' ,'localhost\PROD1','localhost\PROD2', 'localhost\PROD3'                                                                                                                                            
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'dbachecksdemo\dbachecks'  
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'      
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $true     
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true    
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL     
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb   
# What authentication scheme are we expecting?                                                                                                            
Set-DbcConfig -Name policy.connection.authscheme -Value 'NTLM'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'DBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'DBA Team'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin 
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30 
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value DBAAdmin
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# What is my SQL Credential
Set-DbcConfig -Name app.sqlcredential -Value $null
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, HADR, PseudoSimple,spn
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
Get-DbcConfig | Out-GridView

When I run this I get

I can then export this to disk (to store in source control) using

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

and I have a configuration file

which I can use any time to set the configuration for dbachecks using the Import-DbcConfig command (but this doesn’t work in VS Code’s integrated terminal – which occasionally does odd things, and this appears to be one of them)

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

So I can import this configuration and run my checks with it any time I like. This means that I can create many different test configurations for my many different environment or estate configurations.

Yes, I know “good/best practice” says we should use the same configuration for all of our instances, but we know that isn’t true. We have instances that were set up 15 years ago that are still in production. We have instances from the companies our organisation has bought over the years that were set up by system administrators. We have instances that were set up by shadow IT that we now have to support but can’t change.

As well as those though, we also have different environments. Our development or test environment will have different requirements to our production environments.

In this hypothetical situation the four instances for four different applications have 4 development containers which are connected to using SQL Authentication. We will need a different configuration.

SQL Authentication

We can set up SQL Authentication for connecting to our SQL instances using the app.sqlcredential configuration. This is going to hold a PSCredential object for SQL Authenticated connections to your instances. If this is set, the checks will always try to use it. Yes, this means that the same username and password is being used for each connection. No, there is currently no way to choose which instances use it and which don't. This may be a limitation, but as you will see further down you can still do this with different configurations.

To set the SQL Authentication run

Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)

This will give a prompt for you to enter the credential

Development Environment Configuration

So now we know how to set a SQL Authentication configuration, we can create our development environment configuration like so. As you can see below, the values are different for the checks and more checks have been skipped. I won't explain it all – if it doesn't make sense ask a question in the comments or in the dbachecks channel in the SQL Server Community Slack.

#region Dev Config
# The instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost,1401','localhost,1402','localhost,1403','localhost,1404'
# What is my SQL credential?
Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'sa'
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'SQL'
# The database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'dbachecksdemo\dbachecks'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $false
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What should our database growth value be higher than (kb)?
Set-DbcConfig -Name policy.database.filegrowthvalue -Value 64
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $false
# What is the maximum latency (ms)?
Set-DbcConfig -Name policy.network.latencymaxms -Value 100
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -Value Simple
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Which databases should not be checked for recovery model?
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for tempdb files on C?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for tempdb file count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
# Which checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent,HADR,SaReNamed,PseudoSimple,spn,DiskSpace,DatabaseCollation,Agent,Backup,UnusedIndex,LogfileCount,FileGroupBalanced,LogfileSize,MaintenanceSolution,ServerNameMatch

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\development_config.json

Using The Different Configurations

Now I have two configurations, one for my Production Environment and one for my development environment. I can run my checks whenever I like (perhaps you will automate this in some way)

  • Import the production configuration
  • Run my tests with that configuration and create a json file for my Power Bi labelled production
  • Import the development configuration (and enter the SQL authentication credential)
  • Run my tests with that configuration and create a json file for my Power Bi labelled development
  • Start Power Bi to show those results
# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\production_config.json
# Run the tests with the production config and create/update the production json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\development_config.json
# Run the tests with the development config and create/update the development json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development
# Open the Power Bi
Start-DbcPowerBi

I have published the Power Bi so that you can see what it would look like and have a click around (maybe you can see improvements you would like to contribute)

Now we can see how each environment is performing according to its own settings

Combining Configurations Into One Result Set

As you saw above, by using the Environment parameter of Update-DbcPowerBiDataSource you can add different environments to one report. But what if I want a report for my application APP1 showing both the production and development environments when they have different configurations?

Here’s how.

  • Create a configuration for the production environment (I have used the production configuration from above but with only localhost for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Prod_config.json
  • Create a configuration for the development environment (I have used the development configuration from above but with only localhost,1401 for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Dev_config.json

Then run

# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Prod_config.json
# Run the tests with the production config and create/update the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Dev_config.json
# Run the tests with the development config and append to the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1 -Append
Start-DbcPowerBi

Notice that this time there is an -Append on the last Update-DbcPowerBiDataSource. This creates a single json file for the Power Bi and the results look like this. Now we have the results for our application from both the production environment localhost and the development container localhost,1401


It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module, and please give feedback to make this module even more useful

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances

For the last couple of months members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL Instances. Today it is released for you all to start to use 🙂

Validate Your SQL Instances?

What do I mean by validate your SQL Instances? You want to know if your SQL Instances are (still) set up in the way that you want them to be and that you have not missed any configurations when setting them up. With dbachecks you can use any or all of the 80 checks to ensure one or many SQL Instances are as you want them to be. Using Pester, dbachecks will validate your SQL Instance(s) against default settings or ones that you configure yourself.

Installation

Installation is via the PowerShell Gallery. You will need to open PowerShell on a machine connected to the internet and run

Install-Module dbachecks

If you are not running your process as admin, or you only want (or are able) to install for your own user account, you will need to run

Install-Module dbachecks -Scope CurrentUser

This will also install the PSFramework module used for configuration (and other things beneath the hood) and the latest version (4.2.0 – released on Sunday!) of Pester

Once you have installed the module you can see the commands available by running

Get-Command -Module dbachecks

To be able to use these (and any PowerShell) commands, your first step should always be Get-Help

Get-Help Send-DbcMailMessage

80 Checks

At the time of release, dbachecks has 80 checks. You can see all of the checks by running

Get-DbcCheck

(Note this has nothing to do with DBCC CheckDb!) Here is the output of

Get-DbcCheck | Select Group, UniqueTag

so you can see the current checks

Group – UniqueTag
Agent – AgentServiceAccount
Agent – DbaOperator
Agent – FailsafeOperator
Agent – DatabaseMailProfile
Agent – FailedJob
Database – DatabaseCollation
Database – SuspectPage
Database – TestLastBackup
Database – TestLastBackupVerifyOnly
Database – ValidDatabaseOwner
Database – InvalidDatabaseOwner
Database – LastGoodCheckDb
Database – IdentityUsage
Database – RecoveryModel
Database – DuplicateIndex
Database – UnusedIndex
Database – DisabledIndex
Database – DatabaseGrowthEvent
Database – PageVerify
Database – AutoClose
Database – AutoShrink
Database – LastFullBackup
Database – LastDiffBackup
Database – LastLogBackup
Database – VirtualLogFile
Database – LogfileCount
Database – LogfileSize
Database – FileGroupBalanced
Database – AutoCreateStatistics
Database – AutoUpdateStatistics
Database – AutoUpdateStatisticsAsynchronously
Database – DatafileAutoGrowthType
Database – Trustworthy
Database – OrphanedUser
Database – PseudoSimple
Database – AdHocWorkloads
Domain – DomainName
Domain – OrganizationalUnit
HADR – ClusterHealth
HADR – ClusterServerHealth
HADR
HADR – System.Object[]
Instance – SqlEngineServiceAccount
Instance – SqlBrowserServiceAccount
Instance – TempDbConfiguration
Instance – AdHocWorkload
Instance – BackupPathAccess
Instance – DAC
Instance – NetworkLatency
Instance – LinkedServerConnection
Instance – MaxMemory
Instance – OrphanedFile
Instance – ServerNameMatch
Instance – MemoryDump
Instance – SupportedBuild
Instance – SaRenamed
Instance – DefaultBackupCompression
Instance – XESessionStopped
Instance – XESessionRunning
Instance – XESessionRunningAllowed
Instance – OLEAutomation
Instance – WhoIsActiveInstalled
LogShipping – LogShippingPrimary
LogShipping – LogShippingSecondary
Server – PowerPlan
Server – InstanceConnection
Server – SPN
Server – DiskCapacity
Server – PingComputer
MaintenancePlan – SystemFull
MaintenancePlan – UserFull
MaintenancePlan – UserDiff
MaintenancePlan – UserLog
MaintenancePlan – CommandLog
MaintenancePlan – SystemIntegrityCheck
MaintenancePlan – UserIntegrityCheck
MaintenancePlan – UserIndexOptimize
MaintenancePlan – OutputFileCleanup
MaintenancePlan – DeleteBackupHistory
MaintenancePlan – PurgeJobHistory
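
If you are only interested in one area, the output of Get-DbcCheck is just objects so you can filter it with standard PowerShell. A quick sketch:

# Show only the Database checks
Get-DbcCheck | Where-Object { $_.Group -eq 'Database' } | Select-Object Group, UniqueTag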

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so that you can create different configs for different use cases

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of each config, the value it is currently set to and its description
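
With 108 of them you will probably want to narrow that down; the output is just objects, so you can filter it. A quick sketch:

# Show only the backup policy configuration items
Get-DbcConfig | Where-Object { $_.Name -like 'policy.backup.*' }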

You can see all of the configs and their descriptions here

Name – Description
agent.databasemailprofile – Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail – Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname – Name of the DBA Operator in SQL Agent
agent.failsafeoperator – Email address of the DBA Operator in SQL Agent
app.checkrepos – Where Pester tests/checks are stored
app.computername – List of Windows Servers that Windows-based tests will run against
app.localapp – Persisted files live here
app.maildirectory – Files for mail are stored here
app.sqlcredential – The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance – List of SQL Server instances that SQL-based tests will run against
app.wincredential – The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck – Invoke-DbcCheck: The checks that should be skipped by default
domain.domaincontroller – The domain controller to process your requests
domain.name – The Active Directory domain that your server is a part of
domain.organizationalunit – The OU that your server should be a part of
mail.failurethreshhold – Number of errors that must be present to generate an email report
mail.from – Email address the email reports should come from
mail.smtpserver – Store the name of the smtp server to send email reports
mail.subject – Subject line of the email report
mail.to – Email address to send the report to
policy.backup.datadir – Destination server data directory
policy.backup.defaultbackupcompreesion – Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours – Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays – Maximum number of days before Full Backups are considered outdated
policy.backup.logdir – Destination server log directory
policy.backup.logmaxminutes – Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod – The number of hours a newly created database is allowed to not have backups
policy.backup.testserver – Destination server for backup tests
policy.build.warningwindow – The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme – Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount – Number of times to ping a server to establish average response time
policy.connection.pingmaxms – Maximum response time in ms
policy.dacallowed – DAC should be allowed $true or disallowed $false
policy.database.autoclose – Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics – Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink – Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics – Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously – Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance – Percentage tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb – Databases to exclude from the file growth check
policy.database.filegrowthtype – Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue – The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB
policy.database.logfilecount – The number of Log files expected on a database
policy.database.logfilesizecomparison – How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage – Maximum percentage of Data file Size that logfile is allowed to be
policy.database.maxvlf – Max virtual log files
policy.dbcc.maxdays – Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree – Percent disk free
policy.dump.maxcount – Maximum number of expected dumps
policy.hadr.tcpport – The TCP port for the HADR check
policy.identity.usagepercent – Maximum percentage of max of identity column
policy.invaliddbowner.excludedb – Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name – The database owner account should not be this user
policy.network.latencymaxms – Max network latency average
policy.ola.commandlogenabled – Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled – Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database – The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled – Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled – Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed – Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled – Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled – Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled – Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled – Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled – Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention – Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled – Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled – Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled – Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled – Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention – Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled – Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled – Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention – Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled – Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled – Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled – Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled – Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled – Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled – Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention – Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled – Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation – OLE Automation should be enabled $true or disabled $false
policy.pageverify – Page verify option should be set to this value
policy.recoverymodel.excludedb – Databases to exclude from standard recovery model check
policy.recoverymodel.type – Standard recovery model
policy.storage.backuppath – Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb – Databases to exclude from valid dbowner checks
policy.validdbowner.name – The database owner account should be this user
policy.whoisactive.database – Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession – List of XE Sessions that should be running
policy.xevent.requiredstoppedsession – List of XE Sessions that should not be running
policy.xevent.validrunningsession – List of XE Sessions that can be running
skip.backup.testing – Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping – Skip the ping check for connectivity
skip.connection.remoting – Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled – Skip validation of datafiles which have growth value equal to zero
skip.database.logfilecounttest – Skip the logfilecount test
skip.datafilegrowthdisabled – Skip validation of datafiles which have growth value equal to zero
skip.dbcc.datapuritycheck – Skip data purity check in last good dbcc command
skip.diffbackuptest – Skip the Differential backup test
skip.logfilecounttest – Skip the logfilecount test
skip.logshiptesting – Skip the logshipping test
skip.tempdb1118 – Don't run test for Trace Flag 1118
skip.tempdbfilecount – Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent – Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax – Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc – Don't run test for Temp Database Files on C

Running A Check

You can quickly run a single check by calling Invoke-DbcCheck.

Invoke-DbcCheck -SqlInstance localhost -Check FailedJob

Excellent, my agent jobs have not failed 🙂

Invoke-DbcCheck -SqlInstance localhost -Check LastGoodCheckDb

That’s good, all of my databases have had a successful DBCC CHECKDB within the last 7 days.
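
That 7 day window is itself a configuration item (policy.dbcc.maxdays in the list above), so if your standards differ you can change it before running the check (more on setting configurations in the next section). A sketch:

# Allow 14 days before a missing DBCC CHECKDB fails the check
Set-DbcConfig -Name policy.dbcc.maxdays -Value 14
Invoke-DbcCheck -SqlInstance localhost -Check LastGoodCheckDb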

Setting a Configuration

To save me from having to specify the instance I want to run my tests against I can set the app.sqlinstance config to the instances I want to check.

Set-DbcConfig -Name app.sqlinstance -Value localhost, 'localhost\PROD1'

Then whenever I call Invoke-DbcCheck it will run against those instances for the SQL checks

So now if I run

Invoke-DbcCheck -Check LastDiffBackup

I can see that I don’t have a diff backup for the databases on either instance. Better stop writing this and deal with that!!

The configurations are stored in the registry but you can export them and then import them for re-use easily. I have written another blog post about that.
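In short, it looks like this (the path is just an example):

# Save the current configuration to disk ...
Export-DbcConfig -Path C:\Users\dbachecks\Desktop\my_config.json
# ... and bring it back in a later session or on another machine
Import-DbcConfig -Path C:\Users\dbachecks\Desktop\my_config.json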

The Show Parameter

Getting the results of the tests on the screen is cool but if you are running a lot of tests against a lot of instances then you might find that you have 3 failed tests out of 15000! This will mean a lot of scrolling through green text looking for the red text, and you may find that your PowerShell buffer doesn’t hold all of your test results, leaving you very frustrated.

dbachecks supports the Pester Show parameter enabling you to filter the output of the results to the screen. The available values are Summary, None, Fails, Inconclusive, Passed, Pending and Skipped

In my opinion, by far the most useful value is Fails as this will show you only the failed tests, with the context to enable you to see which tests have failed

Invoke-DbcCheck -Check Agent -Show Fails

If we check all of the checks tagged as Agent we can easily see that most passed but The Job That Fails (surprisingly) failed. All of the other tests that were run for the agent service, operators, failsafe operator, database mail and the other agent jobs passed in the example below

Test Results are for other People as well

It is all very well and good being able to run tests and get the results on our screen. It will be very useful for validating a new SQL instance, for example, or running a morning check or the first step of an incident response. But test results are also useful for other people, so we need to be able to share them

We have created a Power Bi dashboard that comes with the dbachecks module to enable easy sharing of the test results. You can also send the results via email using Send-DbcMailMessage. We have an open issue for putting them into a database that we would love you to help resolve.
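
As a rough sketch of the email option (the addresses and server name are examples – check Get-Help Send-DbcMailMessage for the full set of parameters):

Invoke-DbcCheck -AllChecks -Show Summary -PassThru |
Send-DbcMailMessage -To dba@example.com -From dbachecks@example.com -SmtpServer smtp.example.com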

To get the results into PowerBi you can run

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production

This will run all of the dbachecks using your configuration for your Production environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Production

If you then used a different configuration for your development environment and ran

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development

it will run all of the dbachecks using your configuration for your Development environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Development and you would end up with two files in the folder

You can then simply run

Start-DbcPowerBi

and as long as you have the (free) Power Bi Desktop then you will see this. You will need to refresh the data to get your test results

Of course it is Power Bi so you can publish this report. Here it is so that you can click around and see what it looks like

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open-source available on GitHub for anyone to contribute

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module, and please give feedback to make this module even more useful

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Further Reading

There are many more introduction blog posts covering different areas at

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

Tony Wilhelm @TonyWSQL

and all of the other people who have contributed in the dbachecks Slack channel

Pester 4.2.0 has a Because…… because :-)

I was going through my demo for the South Coast User Group meeting tonight and decided to add some details about the Because parameter available in the Pester pre-release version 4.2.0.

To install a pre-release version you need to get the latest PowerShellGet module. This is pre-installed with PowerShell v6 but for earlier versions open PowerShell as administrator and run

Install-Module PowerShellGet

You can try out the Pester pre-release version (once you have the latest PowerShellGet) by installing it from the PowerShell Gallery with

Install-Module -Name Pester -AllowPrerelease -Force # -Scope CurrentUser # if not admin

There are a number of improvements, as you can see in the change log. I particularly like these:

  • Add -BeTrue to test for truthy values
  • Add -BeFalse to test for falsy values
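
Truthy and falsy here work the PowerShell way, so as a quick sketch:

Describe "Truthy and falsy" {
    It "treats a non-zero number as true" {
        1 | Should -BeTrue
    }
    It "treats an empty string as false" {
        '' | Should -BeFalse
    }
}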

This release adds the Because parameter to all assertions. This means that you can add a reason why the test has failed. As Jakub Jareš writes here

  • Reasons force you to think more
  • Reasons document your intent
  • Reasons make your TestCases clearer

So you can do something like this
Describe "This shows the Because"{
    It "Should be true" {
        $false | Should -BeTrue -Because "The Beard said so"
    }
}

Which gives an error message like this 🙂

As you can see, the Expected line gives the expected value, then your Because statement and then the actual result. This means that you could write validation tests like

Describe "My System" {
    Context "Server" {
        It "Should be using XP SP3" {
            (Get-CimInstance -ClassName win32_operatingsystem).Version | Should -Be '5.1.2600' -Because "We have failed to bother to update the App and it only works on XP"
        }
        It "Should be running as rob-xps\mrrob" {
            whoami | Should -Be 'rob-xps\mrrob' -Because "This is the user with the permissions"
        }
        It "Should have SMB1 enabled" {
            (Get-SmbServerConfiguration).EnableSMB1Protocol | Should -BeTrue -Because "We don't care about the risk"
        }
    }
}

and get a result like this

Or if you were looking to validate your SQL Server you could write something like this

It "Backups Should have Succeeeded" {
    $Where = {$_IsEnabled -eq $true -and $_.Name -like '*databasebackup*'}
    $Should = @{
        BeTrue = $true
        Because =  "WE NEED BACKUPS - OMG"
    }
    (Get-DbaAgentJob -SqlInstance $instance| Where-Object $where).LastRunOutcome -NotContains 'Failed' | Should @Should
}

or maybe your security policies say that you must only have Windows groups as logins on your SQL instances. You could easily link to the documentation and explain why this is important. This way you could build up a set of tests to validate that your SQL Server is just so for your environment

It "Should only have Windows groups as logins" {
    $Should = @{
        Befalse = $true
        Because = "Our Security Policies say we must only have Windows groups as logins - See this document"
    }
    (Get-DbaLogin -SqlInstance $instance -WindowsLogins).LoginType -contains 'WindowsUser' | Should @Should
}

Just for fun, these would look like this

and the code looks like

$Instances = 'Rob-XPS', 'Rob-XPS\Bolton'

foreach ($instance in $Instances) {
    # Split the server and instance names; default instances get MSSQLSERVER
    $Server, $InstanceName = $instance.Split('\')
    if (-not $InstanceName) {$InstanceName = 'MSSQLSERVER'}

    Describe "Testing the instance $instance" {
        Context "SQL Agent Jobs" {
            It "Backups Should have Succeeded" {
                $Where = {$_.IsEnabled -eq $true -and $_.Name -like '*databasebackup*'}
                $Should = @{
                    BeTrue  = $true
                    Because = "WE NEED BACKUPS - OMG"
                }
                (Get-DbaAgentJob -SqlInstance $instance | Where-Object $Where).LastRunOutcome -notcontains 'Failed' | Should @Should
            }
        }
        Context "Logins" {
            It "Should only have Windows groups as logins" {
                $Should = @{
                    BeFalse = $true
                    Because = "Our Security Policies say we must only have Windows groups as logins - See this document"
                }
                (Get-DbaLogin -SqlInstance $instance -WindowsLogins).LoginType -contains 'WindowsUser' | Should @Should
            }
        }
    }
}

This will be a useful improvement to Pester when it is released and enable you to write validation checks with explanations.

Chrissy has written about dbachecks, the new up and coming community driven open source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about using dbatools with Pester to validate that everything is how it should be, and placed them into a metadata driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

TSQL2sDay – Folks Who Have Made a Difference

tsql2sday

This month’s TSQL2sDay is an absolutely brilliant one hosted by Ewald Cress

the opportunity to give a shout-out to people (well-known or otherwise) who have made a meaningful contribution to your life in the world of data.

Fabulous, fabulous idea Ewald, I heartily approve

Now this is going to be difficult. There are so many wonderful people in the #SQLFamily who are so gracious and generous and willing to share. I am also lucky enough to be part of the PowerShell community, which is equally filled with amazing people. I do not want to write a novel or a massive list of people and I don’t want to risk missing someone out (Ewald, I’m beginning to question whether ‘fabulous’ should become ‘tricky and challenging’ !!)

So after consideration I am only going to talk about 4 wonderful people and the effect they have had on my life, my career and my community involvement, but know that I truly appreciate the input that all of these people have had and the amazing friendships that I have all over the world. There is no order to this list; these are 4 of the people in equal first with all the other people I haven’t mentioned. This post should really scroll sideways. Interestingly, I noticed after writing this that they are in reverse chronological order in my life!

The Hair!

At PASS Summit this year many people came up to me and said “Hey, Beard ……..” The first person who called me that is an amazing inspiring bundle of talented energy called Chrissy LeMaire

Many moons ago, we exchanged messages over social media and email, chatted after a PowerShell Virtual Group presentation and then one day she asked me to join as an organiser for the Virtual Group.

When dbatools was in its infancy she asked me to help, and since then has given me interesting challenges to overcome, from introducing Pester and AppVeyor to the dbatools development process to creating continuous delivery to our private PowerShell gallery for our Summit pre-con, forcing me to learn and implement new and cool things. Our shared love of enabling people to do cool things with PowerShell is so much fun 🙂

She is so generous and giving of her time and knowledge and has an amazing capability to get things done, whether by herself or by encouraging and supporting others.

We have presented at many conferences together, both SQL and PowerShell, and we have the best of times doing so. It is so refreshing to find someone that I am comfortable presenting with and who has the same passion and energy for inspiring people. (It’s also fun to occasionally throw her off her stride mid-presentation – thank you Cathrine 🙂 )

I am proud to call her my buddy. You are so inspiring Chrissy.

Thank you Ma’am

Amazing Couple

A few months after becoming a DBA I was the only DBA at the company as the others all left for various reasons. I was drowning in work, had no idea what I should be doing. I knew I didn’t have the knowledge and during that time I began to be aware of the SQL community and all the fine resources that it provides.

I then found out about a local user group and emailed the leader, Jonathan Allen (he surprised me by reminding me of this during our pre-con in Singapore a couple of weeks ago!). Jonathan and his wife Annette run the SQL South West user group and are also members of the SQL Bits committee; Annette is also the regional mentor for the UK. They give an awful lot of time and effort to the SQL Community in the UK. It took a few months before I even had the time to attend a user group and in those early days they both answered my naive questions and passed on so much of their technical knowledge and methodology to me and I soaked it up.

Later on, they invited me to help them to organise SQL Saturday Exeter, encouraged me to speak, gave me fabulous feedback and pointers to improve, encouraged me to volunteer for SQL Bits and have been incredibly supportive. I love them both very much. Neither like having their photo taken so I can’t embarrass them too much.

Next time you see them give them a hug.

Thank you J and A

The First One

Andrew Pruski dbafromthecold and SQL Containers Man

At the time I am talking about he was not a member of the SQL Community although he possessed all of the qualities that describe such a person. Now he is an established blogger and speaker and attender of SQL events.

He is one of the DBAs who left me on my own!! He is the first SQL DBA I ever worked with; the person who taught me all those important first bits of knowledge about being a SQL DBA. He imparted a great amount of knowledge in a few months with great patience to an eager newbie.

More than that, he showed me that to succeed in IT, you need to do more than just an everyday 9-5, that it requires more time than that. He instilled in me (without realising it) a work ethic and a thirst for doing things right and gaining knowledge that I still have today. He inspired me when I was faced with trying to understand the mountain of knowledge that is SQL Server that it was possible to learn enough. He taught me the importance of testing things, of understanding the impact of the change that is being made. He showed me how to respond in crises and yet was still willing to share and teach during those times.

He has had a greater impact on me than he will ever know and I have told him this privately many times. I will never forgive him for abandoning me all those years ago and yet that is a large part of what made me who I am today. I was forced to have to deal with looking after a large estate by myself and needed to learn to automate fast and he just about left me with the skills to be able to accomplish that.

Massive shout out to you fella. Thank you

All the Others

Seriously, there are so many other people who I wish I could thank.

Every single one of you who blogs or speaks or records webinars that I have watched – thank you.

All of the organisers who ensure that events happen – thank you

All of the volunteers who assist at those events – thank you.

That group of amazing European speakers at the first SQL Saturday Exeter I attended. The cool group, my wife still reminds me of how I came home from that event so inspired by them. How incredibly generous and welcoming they were and how they welcomed me into their group even though I didn’t feel worthy to share their table. They taught me about the lack of egos and humbleness that defines the SQL family. I am proud to call them my friends now. Thank You (You know who you are)

We have a great community, may its ethos continue for a long time.

Comparing Agent Jobs across Availability Group Replicas with PowerShell

On the plane home from PASS Summit I was sat next to someone who had also attended, and when he saw on my laptop that I was part of the SQL Community we struck up a conversation. He asked me how he could compare SQL Agent Jobs across Availability Group replicas to ensure that they were the same.

He already knew that he could use Copy-DbaAgentJob from dbatools to copy the jobs between replicas and we discussed how to set up an Agent job to accomplish this. The best way to run an Agent Job with a PowerShell script is described here

Compare-Object

I told him about Compare-Object, a cmdlet available in PowerShell for precisely this task. Take these two SQL instances and their respective Agent Jobs

[Image: agentjobcompare.png]

So we can see that some jobs are the same and some are different. How can we quickly and easily spot the differences?

$Default = Get-DbaAgentJob -SqlInstance rob-xps
$bolton = Get-DbaAgentJob -SqlInstance rob-xps\bolton
Compare-Object $Default $bolton
Those three lines of code will do it. The first two get the agent jobs from each instance and assign them to a variable and the last one compares them. This is the output
[Image: comparison.png]
The arrows show that the first three jobs are only on the Bolton instance and the bottom three jobs are only on the default instance.
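
If you also want to see the jobs that match, Compare-Object has an -IncludeEqual switch; comparing on the job name keeps the output readable. A quick sketch:

# Show matching jobs (==) as well as the differences, compared by name
Compare-Object $Default $bolton -Property Name -IncludeEqual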

What If ?

Another option I showed was to use the -WhatIf switch on Copy-DbaAgentJob. This parameter is available on all good PowerShell functions and will describe what the command would do if run. WARNING – If you are using the old SQLPS module from prior to the SSMS 2016 release, -WhatIf will actually run the commands, so update your modules.
We can run
Copy-DbaAgentJob -Source rob-xps -Destination rob-xps\bolton -WhatIf

and get the following result

which shows us that there are two jobs on Rob-XPS which would be created on the Bolton instance
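
If you are happy with what it is going to do, run the same command again without -WhatIf and the jobs will be copied:

Copy-DbaAgentJob -Source rob-xps -Destination rob-xps\bolton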

And if they have been modified?

That’s good, he said, but what about if the jobs have been modified?
Well, one thing you could do is to compare the jobs’ DateLastModified property by using the -Property parameter and the -PassThru switch
$Default = Get-DbaAgentJob -SqlInstance rob-xps
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
 
$Difference = Compare-Object $Default $dave -Property DateLastModified -PassThru
$Difference | Sort-Object Name | Select-Object OriginatingServer,Name,DateLastModified
This is going to return the jobs which are the same but were modified at a different time
[Image: sortedjobcompare.png]
so that you can examine when they were changed. Of course the problem with that is that the DateLastModified is a very precise time so it is pretty much always going to be different. We can fix that but now it is a little more complex.

Just the Date please

We need to gather the jobs in the same way but create an array of custom objects with a calculated property like this
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
## Create a custom object array with the date instead of the datetime
$DaveJobs = @()
$Dave.ForEach{
    $DaveJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
and then we can compare on the Date field. The full code is
## Get the Agent Jobs
$Default = Get-DbaAgentJob -SqlInstance rob-xps
$Dave = Get-DbaAgentJob -SqlInstance rob-xps\dave
## Create a custom object array with the date instead of the datetime
$DaveJobs = @()
$Dave.ForEach{
    $DaveJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
## Create a custom object array with the date instead of the datetime
$DefaultJobs = @()
$Default.ForEach{
    $DefaultJobs += [pscustomobject]@{
        Server = $_.OriginatingServer
        Name   = $_.Name
        Date   = $_.DateLastModified.Date
    }
}
## Perform a comparison
$Difference = Compare-Object $DefaultJobs $DaveJobs -Property Date -PassThru
## Sort by name and display
$Difference | Sort-Object Name | Select-Object Server, Name, Date

This will look like this

[Image: datecompare.png]
Which is much better and hopefully more useful but it only works with 2 instances

I have more than 2 instances

So if we have more than 2 instances it gets a little more complicated, as Compare-Object only supports two arrays. I threw together a quick function to compare each instance with the main instance. This is very rough and will work for now, but I have also created a feature request issue on the dbatools repository so someone (maybe you??) could go and help create those commands

Function Compare-AgentJobs {
    Param(
        $SQLInstances
    )
    ## Remove any jobs* variables from the process
    Get-Variable jobs* | Remove-Variable
    ## Get the number of instances
    $count = $SQLInstances.Count
    ## Loop through the instances
    $SQLInstances.ForEach{
        # Get the jobs and assign them to a new dynamic variable
        $Number = [array]::IndexOf($SQLInstances, $_)
        $Job = Get-DbaAgentJob -SqlInstance $_
        New-Variable -Name "Jobs$Number" -Value $Job
    }
    $i = $count - 1
    $Primary = $SQLInstances[0]
    While ($i -gt 0) {
        ## Compare the jobs with the primary
        $Compare = $SQLInstances[$i]
        Write-Output "Comparing $Primary with $Compare "
        Compare-Object (Get-Variable Jobs0).Value (Get-Variable "Jobs$i").Value
        $i--
    }
}
which looks like this. It’s not perfect but it will do for now until the proper commands are created

[Image: compare agent jobs.png]

dbatools with SQL on Docker and running SQL queries

I had a question from my good friend Andrew Pruski dbafromthecold on twitter or SQL Container Man as I call him 🙂

How do you guys run SQL Commands in dbatools

I will answer that at the bottom of this post, but during our discussion Andrew said he wanted to show the version of SQL running in the Docker container.

That’s easy, I said. Here’s how to do it.

You need to have installed Docker first – see this page. You can switch to using Windows containers by right-clicking on the icon in the taskbar and choosing the command. If you have not already, pull the SQL 2017 image using

docker pull microsoft/mssql-server-windows-developer:latest

This may take a while to download and extract the image but it’s worth it – you will be able to spin up new SQL instances in no time

You can create a new SQL Docker container like this

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env sa_password=SQL2017Password01 --name SQL2017 microsoft/mssql-server-windows-developer:latest

In only a few seconds you have a SQL 2017 instance up and running (take a look at Andrew’s blog at dbafromthecold.com for a great container series with much greater detail)
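
You can confirm that the container is up (and see the port mapping) with

docker ps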

Now that we have our container we need to connect to it, so we need to gather the IP address. We can do this with the docker inspect command, but I like to make things a little more programmatic. This works on my Windows 10 machine for Windows SQL containers. It appears there are some errors on other machines, but there is an alternative below

$inspect = docker inspect SQL2017
<#
IPAddress": matches the characters IPAddress": literally (case sensitive)
\s matches any whitespace character (equal to [\r\n\t\f\v ])
" matches the character " literally (case sensitive)
1st Capturing Group (\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
. matches any character (except for line terminators)
\d{1,3} matches a digit (equal to [0-9])
#>
$IpAddress = [regex]::matches($inspect,"IPAddress`":\s`"(\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})").groups[1].value

Those two lines of code (and several lines of comments) put the results of the docker inspect command into a variable and then use regex to pull out the IP address

If you are getting errors with that you can also use

$IPAddress = docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' containername

Thanks Andrew 🙂

Now we just need our credentials to connect to the instance

$cred = Get-Credential -UserName SA -Message "Enter SA Password Here"

and we can connect to our SQL container

$srv = Connect-DbaInstance -SqlInstance $IpAddress -Credential $cred

and get the version

$srv.Version

and many many other properties, just run

$srv | Get-Member

to see them. At the bottom, you will see a ScriptMethod called Query, which means that you can do things like

$Query = @"
SELECT @@Version
"@

$srv.Query($Query)

$srv.Query($Query).column1

Which looks like

It’s slightly different with a Linux SQL container. Switch Docker to run Linux containers by right-clicking on the icon in the taskbar and choosing the command to switch.
If you haven’t already, pull the Linux SQL image
docker pull microsoft/mssql-server-linux:2017-latest

and then create a container

docker run -d -p 15789:1433 --env ACCEPT_EULA=Y --env SA_PASSWORD=SQL2017Password01 --name linuxcontainer microsoft/mssql-server-linux:2017-latest

Now we just need to connect with localhost and the port number which we have specified already and we can connect again

$LinuxSQL = 'Localhost,15789'
$linuxsrv = Connect-DbaInstance -SqlInstance $LinuxSQL -Credential $cred
$linuxsrv.Version
$linuxsrv.HostDistribution
$linuxsrv.Query($query).column1

Of course, this isn’t restricted to just Connect-DbaInstance, you can do this with any dbatools command

Get-DbaDatabase -SqlInstance $LinuxSQL -SqlCredential $cred

Go and explore your Docker SQL containers with dbatools 🙂

You can get it using

Install-Module dbatools

and find commands with

Find-DbaCommand database

Don’t forget to use Get-Help with the name of the command to get information about how to use it

Get-Help Find-DbaCommand -detailed

Enjoy 🙂