dbachecks – Dark Mode Historical Validation PowerBi

In my last post I showed how you can save the results of dbachecks to a database and created a Power BI report. Inspired by Frank Henninger in the #dbachecks Slack channel and Shawn Melton, who explained the difficulties for people with red/green colour blindness, I then created this one 🙂

dark mode

You can find it in my GitHub and have a play with it below

Happy Validating!

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks and one of the questions I was asked was whether I would write a command to put the results of the checks into a database for historical reporting.

The answer is no and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will have. But here is one way of doing it.
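To illustrate the object point: once the results are in a variable you can push them anywhere you like. A quick sketch (the -PassThru parameter is covered below and the file names here are just examples):

# Capture the results as an object rather than just console output
$results = Invoke-DbcCheck -PassThru -Show None
# Then shape and send them however suits you - JSON, CSV, email, a database...
$results | ConvertTo-Json -Depth 5 | Out-File .\LatestResults.json
$results.TestResult | Export-Csv .\LatestResults.csv -NoTypeInformation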

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $sql0,$sql1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $sql0,$sql1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping, ExtendedEvent, PseudoSimple, SPN, TestLastBackupVerifyOnly, IdentityUsage, SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
01 - Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save them to a variable
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables, one for the summary

02 - summary

and one for the details

03 - detail
This works absolutely fine and I could continue to add test results in this fashion, but it has no date property so it is not so useful for reporting.
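If all you needed was a date stamp, one option would be to add it to the objects before writing them with a calculated property. This is just a sketch, and not the route I take below (the table would also need the extra column):

$Testresults | Select-Object *, @{Name = 'TestDate'; Expression = {Get-Date}} |
    Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable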

Create tables and triggers

This is one way of doing it. I am not sure it is the best way but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better/different solution please blog about it and reference it in the comments below.

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today's date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO
and for the details I do the same thing. A details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A staging table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, and Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. That gets the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False
detail with stage
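Putting the pieces together, a scheduled run could be as simple as the sketch below – $Instance and $Database are whichever instance and database hold the staging tables in your environment:

# A nightly run using the production configuration exported earlier
Import-DbcConfig -Path Git:\Production.Json
$Testresults = Invoke-DbcCheck -PassThru -Show Fails
$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False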

Which means that I can now query some of this data and also create Power BI reports for it.
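For example, a quick look at the counts over time might be something like this (a sketch – point the instance and database names at wherever your results tables live):

$query = "SELECT TestDate, TotalCount, PassedCount, FailedCount FROM dbachecks.Prod_dbachecks_summary ORDER BY TestDate"
Invoke-DbaSqlQuery -SqlInstance $sql0 -Database ValidationResults -Query $query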

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren Job names are configuration items I need to update the values for those checks which I can do as follows

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power BI report in my GitHub, which also has the code from this blog post

Then you just need to open it in Power BI Desktop and

  • Click Edit Queries
  • Click Data Source Settings
  • Click Change Source
  • Change the Instance and Database names

09 - PowerBi

Then you will have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the checks and the import today, click on one of the groups under “Today’s Checks by Group” to see the results.

This enables me to filter the results and see what has happened in the past so I can filter by one instance
05 - filter by instance
or I can filter by a group of tests
07 - filter by instance
or even by a group of tests for an instance
08 - filter by instance and insance

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the Power BI report in my GitHub

Happy Validating!

dbachecks – Improved Descriptions

With the latest release of dbachecks we have added a new check for testing that foreign keys and constraints are trusted, thanks to Cláudio Silva b | t

To get the latest release you will need to run

Update-Module dbachecks

You should do this regularly as we release new improvements frequently.

We have also added better descriptions for the checks, which was suggested by the same person who inspired the previous improvement I blogged about here

Instead of the description just being the name of the check it is now more of a, well, a description really 🙂

This has the side effect that just running Get-DbcCheck at the command line will no longer fit all of the information on a normal screen

01 - get-dbccheck.png

You can use the Format-Table command (or its alias ft at the command line) and select the properties to display using

Get-DbcCheck | ft -Property UniqueTag, Description -Wrap

02 - get-dbccheck format table

or you can use Format-List (or its alias fl at the command line)

Get-DbcCheck | fl
03 get-dbccheck format list.png
Or you can use Out-GridView (or its alias ogv at the command line). (Incidentally, could you also thumbs up this issue on GitHub to get Out-GridView functionality in PowerShell 6?)
Get-DbcCheck | ogv
04 - get-dbacheck ogv
Happy Validating !

dbachecks – Which Configuration Item For Which Check ?

I love showing dbachecks to people. It’s really cool seeing how people will use it and listening to their experiences. I was showing it to a production DBA a month or so ago and he said

How Do I Know Which Checks There Are?

OK you just need to run

Get-DbcCheck

and it will show you

01 - get-dbcchecks.png

It will show you the group, the type (does it need a computer name or an instance name), the description, the unique tag for running just that check and all of the tags that will run that check

OK he said, you talked about configurations

How Do I Know Which Configurations There Are?

So to do that you just need to run

Get-DbcConfig

and it will show you

02 - dbcconfig.png

You can see the name, the current value and the description

Ah, that’s cool, he said, so

How Do I Know Which Configuration Is For Which Check?

Well, you just…. , you know…… AHHHHHHH

Ping – light bulb moment!

It’s always really useful to give something you have built to people who have never seen it before and then listen to what they say. Their new eyes and different experiences or expectations will give you lots of insight

None of the amazing contributors to dbachecks had thought of this scenario so I decided to fix this. First I asked for an issue to be raised in GitHub because an issue can be an improvement or a suggestion not just a bug.

Then I fixed it so that it would do what was required. Thank you Nick for this feedback and for helping to improve dbachecks

I improved Get-DbcCheck so that now it shows the configuration item related to each check

It is easier to see (and sort or search) if you use Out-GridView

Get-DbcCheck | Out-GridView

03 - New dbccheck.png

So now you can see which configuration can be set for each check!
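You can also filter on it from the command line. A sketch, assuming the column shown in Out-GridView is surfaced as a property called Config:

# find the checks driven by the backup-related configuration items
Get-DbcCheck | Where-Object { $_.Config -like '*backup*' } | Format-Table UniqueTag, Config -Wrap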

 

Happy Validating!

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL versions, so I have installed

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programmes I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my Docker location from my C:\ drive to my E:\ drive so I didn't run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding

{"data-root": "E:\\containers"}
and restarting the Docker service, which created folders like this
01 - folders.png
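(The restart itself is just the one-liner below, assuming the default Windows service name of docker.)

Restart-Service docker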
Then I ran
docker volume create SQLBackups
to create a volume to hold the backups that I could mount on the containers
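If you want to confirm where Docker has created that volume on disk, docker volume inspect will show you the mount point:

docker volume inspect SQLBackups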

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Get-ChildItem $Home\Downloads\AdventureWorks* | Copy-Item -Destination E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the Docker Hub I ran

docker pull microsoft/mssql-server-windows-developer:latest

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub so it is just a case of running

docker pull dbafromthecold/sqlserver2016dev:sp1
docker pull dbafromthecold/sqlserver2014dev:sp2
docker pull dbafromthecold/sqlserver2012dev:sp4
and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as

docker run -d -p ExposedPort:InternalPort --name NAME -v VolumeName:LocalFolder -e sa_password=THEPASSWORD -e ACCEPT_EULA=Y IMAGENAME
so all I needed to run to create 4 SQL containers, one of each version, was
docker run -d -p 15789:1433 --name 2017 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y microsoft/mssql-server-windows-developer
docker run -d -p 15788:1433 --name 2016 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2016dev:sp1
docker run -d -p 15787:1433 --name 2014 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2014dev:sp2
docker run -d -p 15786:1433 --name 2012 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2012dev:sp4

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂

02 - creating containers.png

03 - Containers at the ready.png

Storing Credentials

This is not something I would do in a Production environment but I save my credentials using this method that Jaap Brasser b | t shared here

Get-Credential | Export-Clixml -Path $HOME\Documents\sa.cred
which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using
$cred = Import-Clixml $HOME\Documents\sa.cred

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable

$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name

and the container connection strings, which are the hostname and the port number

$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
then I can restore the databases using dbatools, with a switch statement on the version which I get from the NameLevel property of Get-DbaSqlBuildReference
$cred = Import-Clixml $HOME\Documents\sa.cred
$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name
$containers.ForEach{
    $Container = $Psitem
    $NameLevel = (Get-DbaSqlBuildReference -SqlInstance $Container -SqlCredential $cred).NameLevel
    switch ($NameLevel) {
        2017 {
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path C:\sqlbackups\ -useDestinationDefaultDirectories -WithReplace | Out-Null
            Write-Verbose -Message "Restored Databases on 2017"
        }
        2016 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2016"
        }
        2014 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*' -and $Psitem -notlike '*2016*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2014"
        }
        2012 {
            $Files = $Filenames.Where{$PSitem -like '*2012*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2012"
        }
        Default {}
    }
}
I need to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container
As Get-DbaDatabase gives the container ID as the Computer Name, I have highlighted each container below
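For example (a sketch using the variables from the restore script above):

$containers.ForEach{
    Get-DbaDatabase -SqlInstance $PSItem -SqlCredential $cred | Select-Object ComputerName, SqlInstance, Name
}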
04 - databases.png
That is how easy it is to create a number of SQL containers of differing versions for your presentation or exploration needs.
Happy Automating!

#tsql2sday – Giving Back – Reprise

It’s TSQL Tuesday again! This month our host is Riley Major (b/t) and the subject is Giving Back. He’s given us two options here (as well as the side option of your favorite 2017 improvement). Pick a way that you’d like to give back to the community and talk about it, or if you already give back, tell us how and why you started.

 

There will be a lot of excellent posts on this subject and one of the things that I like is that you can go to http://tsqltuesday.com and look up all of the entries, which means that for a particular topic you can find a bunch of blog posts from different angles (I mis-typed that as angels first and yes, they are all angels!). Whether it is career improvement or technical, like say Extended Events, it is all there waiting for you 🙂

I wrote about giving back in 2014 when I was about to help organise SQL Saturday Exeter for the first time. Last year I blogged about a few of the wonderful people who made a difference to me. So this post is a little about how I give back and also hopefully some hints that can help you to do the same as well.

How

There are so many ways that you can give back to the community as Riley’s post shows.

Sharing your knowledge is a good way. I share my knowledge in blog posts and in sessions at user groups and at conferences. You can too.

From beginner, introductory posts and talks to expert level deep dives every single one of us has learned from the blog posts that other people have spent their time and effort creating.

YOU can share your knowledge, even if you have only been using a technology for a few months, you have knowledge of the things you have learned and the things that would have made it easier. Write a post about those things.

Answering questions is another way. You can do this on Stack Overflow, SQL Server Central, PowerShell.org, Reddit, Facebook, on Twitter using the #sqlhelp or #PowerShellHelp hashtags, or just in person. I try to answer questions when I see them on Twitter or in the SQL Community Slack

Mentoring or just providing feedback to people is another. You can offer to proofread blog posts or abstract submissions, or you can listen to people’s presentations. I do this and it is a lot of fun

I also share my knowledge via my GitHub. All of my presentation slides and code are available as well as other code that I use. I also contribute to open-source projects such as dbatools and dbachecks. You can do this too. You could open an issue for an improvement or bug, or you can contribute your code. Even if you are not confident writing the code you can fix spelling mistakes or add documentation. Everything helps

You can help with the organisation of events. I have helped to organise the PowerShell Europe Conference, PSDay.UK, SQL Saturday Exeter, the SQL SouthWest user group and the PASS PowerShell Virtual Group. I have also helped with session choices for a number of other events like SQL Grillen and SQLGLA, and I have volunteered at many events from SQL Saturdays to SQL Bits. Everything from setting up and tearing down (i.e. moving heavy things and cleaning up rubbish) to sitting on the information desk, giving out badges and making sure the speakers are on time in their sessions. You can do this too. Just ask the organisers of the events what they need. It is better to do this prior to the event than on the day, but I am sure all help is welcomed. Richard Munn and I talked (waffled?) about this at SQL Bits this year

If you would like to be considered as a volunteer for next year’s SQL Bits please email helpers at sqlbits.com

Why

So why did I start giving back?

I wanted to be useful. I saw the amount of work that Jonathan and Annette were doing organising SQL Saturday Exeter and SQL South West. I hoped that I could help them with that.

Why do I carry on doing it?

Because it is fun 🙂 I enjoy speaking, I enjoy sharing my knowledge and talking to people

That is good but there is more to it as well

You learn so much by writing a presentation or a blog post because you will do research.

You will learn even more when people ask you questions in your sessions or leave comments on your blog posts and you have to go and find the answers

You learn new and useful skills and demonstrate your knowledge to potential employers

My blog is a scrapbook of knowledge that I go back to and use all the time (and sometimes I forget that I have written something and find my own post in the search results!)

But the most important reason is that I feel that it pays back some of the benefit that I have gained from all of those people whose time and effort I made use of for free when I was learning and continue to do so to this day. All of those blog posts and videos and presentations that I consumed have helped to make me the technician I am today. I have skills and abilities that I would not have without them all and by giving back I hope that I am enabling others to develop and see the benefit of sharing so that they will continue to do so in the future and I can learn from them.

Visual Studio Code Live Sharing Set-Up

There was an announcement on the Visual Studio Code blog about the public preview of Live Share. This enables you to easily collaborate on code by securely sharing your coding session.

It is remarkably easy to set up 🙂

Installation

Open Visual Studio Code, open the Extensions side bar (CTRL + SHIFT + X)

01 - open extensions

Search for Live Share

02 - search.png

Click Install and then Reload when it has finished

03 - reload.png

You will notice in the bottom bar it will say finishing the installation and if you open the terminal (CTRL + `) and click on Output and change the drop down on the right to Visual Studio Live Share you can see what it is doing

04 - finishing installation.png

It is installing the dependencies as shown below

[Client I] Installing dependencies for Live Share…
[Client I] Downloading package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client I] Download complete.
[Client I] Downloading package ‘OmniSharp for Windows (.NET 4.6)’
[Client I] Download complete.
[Client I] Installing package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Moved and validated extracted files.
[Client I] Finished installing.
[Client I] Installing package ‘OmniSharp for Windows (.NET 4.6)’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Finished installing.
[Client I] No workspace id found.
Incidentally, this will also show the location of the log file

You will see in the bottom bar it will now say sign in

06 - sign in.png

Clicking that will open a browser and give you a choice of accounts to sign in with, your GitHub or your Microsoft ID

07 - sign in.png

Choose the one that you want to use and do your 2FA.

08 - 2FA.png

You do have 2FA on your Microsoft and GitHub accounts (and all the other services)? If not, go and set it up now – here for Microsoft and here for GitHub

Once you have signed in you will get this notification which you can close

09 - close this notification.png

The icon in the bottom bar will change to show your account name and if you click it, it will open the menu

09 - sharing menu.png

Sharing

To share your session you click on the Share icon in the bottom bar or the Start collaboration session in the menu above. The first time you do this there will be a pop-up as shown

05 - firewall popup.png

You can decide which way you (or your organisation) want to share. I chose to accept the firewall exception.

10 - invite link.png

The invite link is in your clipboard ready to share with your friends and colleagues (other open source contributors ??)

They can either open the link in a browser

11 - join via browser.png

or by using the Join Collaboration Session in the menu in VS Code

12 - Join via VS COde.png

Once they do the sharer will get a notification

13 - notification of sharing.png

and the person who has joined will have the same workspace opened in their Visual Studio Code

14 -shared workspace.png

You can then collaborate on your code and share the session. In the video below the left-hand side is running on my jump box in Azure and the right-hand side on my laptop. You can see that if you highlight code on one side it is shown in the other, and if you alter it on one side it is changed in the other. I also saved that file from the joined session rather than from the session that initialised the sharing and it was then saved in both sessions 🙂

So that shows how easy it is to install and to use. You can dive deeper using the documentation.

 

Happy Collaborating 🙂

Version Update, Code Signing and publishing to the PowerShell Gallery with VSTS

At the fabulous PowerShell Conference EU I presented about Continuous Delivery to the PowerShell Gallery with VSTS and explained how we use VSTS to enable CD for dbachecks. We even released a new version during the session 🙂

So how do we achieve this?

We have a few steps

  • Create a project and link to our GitHub
  • Run unit tests with Pester to make sure that our code is doing what we expect.
  • Update our module version and commit the change to GitHub
  • Sign our code with a code signing certificate
  • Publish to the PowerShell Gallery

Create Project and link to GitHub

First you need to create a VSTS project by going to https://www.visualstudio.com/ This is free for up to 5 users with 1 concurrent CI/CD queue limited to a maximum of 60 minutes run time which should be more than enough for your PowerShell module.

01 - sign up.png

Click on Get Started for free under Visual Studio Team Services and fill in the required information. Then on the front page click new project

02 - New Project.png

Fill in the details and click create

03 - create project.png

Click on builds and then new definition

04- builds.png

Next you need to link your project to your GitHub (or other source control provider's) repository

05 - github auth.png

You can either authorise with OAuth or you can provide a PAT token following the instructions here. Once that is complete choose your repo. Save the PAT as you will need it later in the process!

06 - choose repo.png

and choose the branch that you want this build definition to run against.

07 branch.png

I chose to run the Unit Tests when a PR was merged into the development branch. I will then create another build definition for the master branch to sign the code and update module version. This enables us to push several PRs into the development branch and create a single release for the gallery.

Then I start with an empty process

08 - empty process.png

and give it a suitable name

09 - name it.png

I chose the hosted queue but you can download an agent to your build server if you need to do more or your integration tests require access to other resources not available on the hosted agent.

Run Unit Tests with Pester

We have a number of unit tests in our tests folder in dbachecks so we want to run them to ensure that everything is as it should be and the new code will not break existing functionality (and for dbachecks the format of the Power BI)

You can use the Pester Test Runner Build Task from the folk at Black Marble by clicking on the + sign next to Phase 1 and searching for Pester

10 - Pester task runner.png

You will need to click Get It Free to install it and then click Add to add the task to your build definition. You can pretty much leave it as default if you wish and Pester will run all of the *.Tests.ps1 files that it finds in the directory where it downloads the GitHub repo, which is referred to using the variable $(Build.SourcesDirectory). It will then output the results to an XML file called Test-Pester.XML ready for publishing.

However, as dbachecks has a number of dependent modules, this task was not suitable. I spoke with Chris Gardner b | t from Black Marble at the PowerShell Conference and he says that this can be resolved so look out for the update. Chris is a great guy and always willing to help, you can often find him in the PowerShell Slack channel answering questions and helping people

But as you can use PowerShell in VSTS tasks, this is not a problem, although you need to write your PowerShell using try catch to make sure that your task fails when your PowerShell errors. This is the code I use to install the modules

$ErrorActionPreference = 'Stop'

# Set location to module home path in artifacts directory
try {
    Set-Location $(Build.SourcesDirectory)
    Get-ChildItem
}
catch {
    Write-Error "Failed to set location"

}

# Get the Module versions
Install-Module Configuration -Scope CurrentUser -Force
$Modules = Get-ManifestValue -Path .\dbachecks.psd1 -PropertyName RequiredModules

$PesterVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'Pester'}[0].Get_Item('ModuleVersion')
$PSFrameworkVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'PSFramework'}[0].Get_Item('ModuleVersion')
$dbatoolsVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'dbatools'}[0].Get_Item('ModuleVersion')

# Install Pester
try {
    Write-Output "Installing Pester"
    Install-Module Pester  -RequiredVersion $PesterVersion  -Scope CurrentUser -Force -SkipPublisherCheck
    Write-Output "Installed Pester"

}
catch {
    Write-Error "Failed to Install Pester $($_)"
}
# Install PSFramework
try {
    Write-Output "Installing PSFramework"
    Install-Module PSFramework  -RequiredVersion $PsFrameworkVersion  -Scope CurrentUser -Force 
    Write-Output "Installed PSFramework"

}
catch {
    Write-Error "Failed to Install PSFramework $($_)"
}
# Install dbatools
try {
    Write-Output "Installing dbatools"
    Install-Module dbatools  -RequiredVersion $dbatoolsVersion  -Scope CurrentUser -Force 
    Write-Output "Installed dbatools"

}
catch {
    Write-Error "Failed to Install dbatools $($_)"
}

# Add current folder to PSModulePath
try {
    Write-Output "Adding local folder to PSModulePath"
    $ENV:PSModulePath = $ENV:PSModulePath + ";$pwd"
    Write-Output "Added local folder to PSModulePath"    
    $ENV:PSModulePath.Split(';')
}
catch {
    Write-Error "Failed to add $pwd to PSModulePAth - $_"
}

I use the Configuration module from Joel Bennett to get the required module versions for the required modules and then add the path to $ENV:PSModulePath so that the modules will be imported. I think this is because the modules did not import correctly without it.

Once I have the modules I can then run Pester as follows

try {
    Write-Output "Importing dbachecks"
    Import-Module .\dbachecks.psd1
    Write-Output "Imported dbachecks"

}
catch {
    Write-Error "Failed to import dbachecks $($_)"
}
$TestResults = Invoke-Pester .\tests -ExcludeTag Integration,IntegrationTests  -Show None -OutputFile $(Build.SourcesDirectory)\Test-Pester.XML -OutputFormat NUnitXml -PassThru

if ($TestResults.failedCount -ne 0) {
    Write-Error "Pester returned errors"
}

As you can see I import the dbachecks module from the local folder, run Invoke-Pester and output the results to an XML file and check that there are no failing tests.

Whether you use the task or PowerShell the next step is to Publish the test results so that they are displayed in the build results in VSTS.

Click on the + sign next to Phase 1 and search for Publish

12 - publish test results.png

 

Choose the Publish Test Results task and leave everything as default unless you have renamed the xml file. This means that on the summary page you will see some test results

 

13 - Test on sumary page.png

and on the tests tab you can see more detailed information and drill down into the tests

14 - detailed test report.png

Trigger

The next step is to trigger a build when a commit is pushed to the development branch. Click on Triggers and tick enable continuous integration

15 Trigger.png

Saving the Build Definition

I would normally save the build definition regularly and ensure that there is a good message in the comment. I always tell clients that this is like a commit message for your build process so that you can see the history of the changes for the build definition.

You can see the history on the edit tab of the build definition

16 - build history.png

If you want to compare or revert the build definition this can be done using the hamburger menu as shown below.

17 - build history compare revert.png

Update the Module Version

Now we need to create a build definition for the master branch to update the module version and sign the code ready for publishing to the PowerShell Gallery when we commit or merge to master

Create a new build definition as above but this time choose the master branch

18 - master build.png

Again choose an empty process and name it sensibly, click the + sign next to Phase 1 and search for PowerShell

19 - PowerShell task.png

I change the version to 2 and use this code. Note that the commit message has ***NO_CI*** in it. Putting this in a commit message tells VSTS not to trigger a build for this commit.

$manifest = Import-PowerShellDataFile .\dbachecks.psd1 
[version]$version = $Manifest.ModuleVersion
Write-Output "Old Version - $Version"
# Add one to the build of the version number
[version]$NewVersion = "{0}.{1}.{2}" -f $Version.Major, $Version.Minor, ($Version.Build + 1) 
Write-Output "New Version - $NewVersion"
# Update the manifest file
try {
    Write-Output "Updating the Module Version to $NewVersion"
    $path = "$pwd\dbachecks.psd1"
    (Get-Content .\dbachecks.psd1) -replace $version, $NewVersion | Set-Content .\dbachecks.psd1 -Encoding string
    Write-Output "Updated the Module Version to $NewVersion"
}
catch {
    Write-Error "Failed to update the Module Version - $_"
}

try {
    Write-Output "Updating GitHub"
    git config user.email "mrrobsewell@outlook.com"
    git config user.name "SQLDBAWithABeard"
    git add .\dbachecks.psd1
    git commit -m "Updated Version Number to $NewVersion ***NO_CI***"

    git push https://$(RobsGitHubPAT)@github.com/sqlcollaborative/dbachecks.git HEAD:master
    Write-Output "Updated GitHub"

}
catch {
    $_ | Fl -Force
    Write-Output "Failed to update GitHub"
}

I use Get-Content/Set-Content as I had errors with Update-ModuleManifest, but Adam Murray g | t uses this code to update the version using the BuildID from VSTS

$newVersion = New-Object version -ArgumentList 1, 0, 0, $env:BUILD_BUILDID
$Public  = @(Get-ChildItem -Path $ModulePath\Public\*.ps1)
$Functions = $public.basename
Update-ModuleManifest -Path $ModulePath\$ModuleName.psd1 -ModuleVersion $newVersion -FunctionsToExport $Functions

You can commit your change by adding your PAT token as a variable under the variables tab. Don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

20 - variables.png

Sign the code with a certificate

The SQL Collaborative uses a code signing certificate from DigiCert, who allow MVPs to use one for free to sign their code for open source projects – thank you. We had to upload the certificate to the secure files store in the VSTS library. Click on Library, Secure files and the blue +Secure File button

21 - secure file store.png

You also need to add the password as a variable under the variables tab as above. Again don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Then you need to add a task to download the secure file. Click on the + sign next to Phase 1 and search for secure

22 download secure file.png

choose the file from the drop down

23 - download secure file.png

Next we need to import the certificate and sign the code. I use a PowerShell task for this with the following code

$ErrorActionPreference = 'Stop'
# read in the certificate from a pre-existing PFX file
# I have checked this with @IISResetMe and this does not go in the store only memory
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new("$(Agent.WorkFolder)\_temp\dbatools-code-signing-cert.pfx","$(CertPassword)")

try {
    Write-Output "Signing Files"
    # find all scripts in your module...
Get-ChildItem  -Filter *.ps1 -Include *.ps1 -Recurse -ErrorAction SilentlyContinue |
# ...that do not have a signature yet...
Where-Object {
  ($_ | Get-AuthenticodeSignature).Status -eq 'NotSigned'
  } |
# and apply one
# (add -WhatIf here if you want a dry run with no actual signing)
Set-AuthenticodeSignature -Certificate $cert
Write-Output "Signed Files"
}
catch {
    $_ | Format-List -Force
    Write-Error "Failed to sign scripts"
}

which will import the certificate into memory and sign all of the scripts in the module folder.
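If you want to double-check before publishing, a quick way (just a sketch) is to group the scripts by their signature status and make sure nothing is left as NotSigned:

Get-ChildItem -Filter *.ps1 -Recurse | Get-AuthenticodeSignature | Group-Object Status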

Publish your artifact

The last step of the master branch build publishes the artifact (your signed module) to VSTS ready for the release task. Again, click the + sign next to Phase 1 and choose the Publish Artifact task, not the deprecated Copy and Publish Artifact task, and give the artifact a useful name

24 - publish artifact.png

Don’t forget to set the trigger for the master build as well following the same steps as the development build above

Publish to the PowerShell Gallery

Next we create a release to trigger when there is an artifact ready and publish to the PowerShell Gallery.

Click the Releases tab and New Definition

25 - Reelase creation

Choose an empty process and name the release definition appropriately

26 Release name empty process.png

Now click on the artifact and choose the master build definition. If you have not run a build you will get an error like the one below, but don't worry, click Add.

27 - add artifact.png

Click on the lightning bolt next to the artifact to open the continuous deployment trigger

28 - Choose lightning bolt

and turn on Continuous Deployment so that when an artifact has been created with an updated module version and signed code it is published to the gallery

28 - Continuous deployment trigger

Next, click on the environment and name it appropriately and then click on the + sign next to Agent Phase and choose a PowerShell step

29 - PowerShell Publish step

You may wonder why I don't choose the PowerShell Gallery Packager task. There are two reasons. First, I need to install the required modules for dbachecks (dbatools, PSFramework, Pester) prior to publishing, and second, it appears that the API Key is stored in plain text

30 - PowerShell Gallery Publisher

I save my API key for the PowerShell Gallery as a variable, again making sure to tick the padlock to make it a secret

31 - API Key variable.png

and then use the following code to install the required modules and publish the module to the gallery

Install-Module dbatools -Scope CurrentUser -Force
Install-Module Pester -Scope CurrentUser -SkipPublisherCheck -Force
Install-Module PSFramework -Scope CurrentUser -Force

Publish-Module -Path "$(System.DefaultWorkingDirectory)/Master - Version Update, Signing and Publish Artifact/dbachecks" -NuGetApiKey "$(GalleryApiKey)"

That's it 🙂

Now we have a process that will automatically run our Pester tests when we commit or merge to the development branch, and then update our module version number, sign our code and publish to the PowerShell Gallery when we commit or merge to the master branch

Added Extra – Dashboard

I like to create dashboards in VSTS to show the progress of the various definitions. You can do this under the dashboard tab. Click edit and choose or search for widgets and add them to the dashboard

32 - Dashboard.png

Added Extra – Badges

You can also enable badges for displaying on your readme in GitHub (or VSTS). For the build definitions this is under the Options tab.

33 - Build badges

For the release definitions, click the environment and then Options and Integrations

34 - Release Badge

You can then copy the URL and use it in your readme like this on dbachecks

35 - dbachecks readme badges.png

The SQL Collaborative has joined the preview of enabling public access to VSTS projects, as detailed in this blog post, so you can see the dbachecks build and release without the need to log in, and soon the dbatools process as well

I hope you found this useful and if you have any questions or comments please feel free to contact me

 

Happy Automating!