PASS Summit, SQLTrain and My First US SQL Saturday

Next week is the week when I used to dread looking at Twitter, and especially the #PASSsummit hashtag, watching all of those folk having a great time in great sessions and doing all of the networking. Last year I was lucky enough to attend for the first time and take part in Chrissy LeMaire and CK's pre-con; this year I decided to add a US SQL Saturday to the list.

I shall be attending SQL Saturday Oregon in Portland and presenting about dbatools. I am really lucky: my amazing Dad, now that he is retired, happily plays Dad’s taxi (still!) and frequently drives me to the airport when I go away on trips. This is the first time he has really gone Wow! When I asked why, it turned out to be because Portland is where Grimm is filmed, and he has watched the whole series and loved it!

I am looking forward to ticking another thing off my list and presenting at a US SQL Saturday, to add to 12 SQL Saturday cities in Europe and multiple other events around the world. If you are there, come up and say hi.

I shall also be room monitoring for the PowerShell for the DBA session by Amy Herold t | b. There are still volunteer slots available to help; sign up here if you are going. It is a lot of fun and an excellent way to give something back. SQL Saturdays take a lot of work to organise and the organisers are always glad of some help. You will meet new people and have a lot of fun as well.

To get to Seattle I am going on the SQL Train. A whole bunch of data platform folk travelling on a train together up to Seattle for PASS Summit. It looks like it will be a lot of fun 🙂

Once in Seattle, it is time for the week where others will not want to look at my Twitter feed 🙂 A whole week at PASS Summit watching the sessions and networking (there may be beer involved) with our peers.

My one piece of advice is: please don’t hide away in your hotel room for all of the time that sessions are not on. I know that dealing with a large number of people can be tough and you might need some me time (I will), but there are a lot of activities, both loud and quieter, where you will have the opportunity to meet up and make new friends and contacts. That may really pay back further down the line, and you will have fun too.

On the Tuesday I am doing a pre-conference session, Professional and Proficient PowerShell: From Writing Scripts to Developing Solutions: a whole day where I will be showing how to write PowerShell modules and all of the tips and tricks that I have learnt over the years.

Wednesday sees me embracing my inner André Kamman – unfortunately he is unable to make PASS Summit this year, so I will be delivering the session dbatools PowerShell Library – The Complete Introduction in his place in room 2AB. I shall try to do it in as cool and relaxed a way as he does (it probably won't work, I will get too excited 🙂 )

On Thursday I will be talking about dbachecks in room 6C which will also be streamed on PASSTv.

In between all of that, I shall be attending some networking events, visiting sessions, hanging out with people new and old and walking down the corridors, so if you see me, stop me and say hi, I’d love to meet you 🙂
(note – just before my sessions I may be too nervous to properly have a conversation)

Hopefully I will meet you there, and for the Twitter folk stuck back in Europe: I empathise 🙂

Checking Trace Flags with dbachecks, online docs and PSPowerHour

It’s been a few weeks since I have blogged, as I have been busy with a lot of other things. One of these is preparing for my PASS Summit pre-con, which has led to me improving the CI/CD for dbachecks by adding auto-creation of online documentation, which you can find at https://dbachecks.readthedocs.io or by running Get-Help with the -Online switch for any dbachecks command.
Get-Help Invoke-DbcCheck -Online
01 - online help.png
I will blog about how dbachecks uses Azure DevOps to do this another time.

PSPowerHour

The PowerShell community members Michael T Lombardi and Warren Frame have created PSPowerHour. PSPowerHour is “like a virtual User Group, with a lightning-demo format, and room for non-PowerShell-specific content. Eight community members will give a demo each PowerHour.”
Chrissy blogged about the first one on the dbatools blog
You can watch the videos on the YouTube channel and keep an eye out for more online PSPowerHours via Twitter or the GitHub page.
While watching the first group of sessions, I saw Andrew Wickham demonstrate using dbatools with trace flags, and I thought that needed to be added to dbachecks, so I created an issue. Anyone can do this to file improvements, as well as bugs, for members of the team to code.

Trace Flags

The previous release of dbachecks brought two new checks for trace flags: one for trace flags expected to be running and one for trace flags not expected to be running.
You will need to have installed dbachecks from the PowerShell Gallery to do this. This can be done using
Install-Module -Name dbachecks
Once dbachecks is installed you can find the checks using
Get-DbcCheck
You can filter using the Pattern parameter
Get-DbcCheck -Pattern traceflag
02 - get0dbcconfig.png
This will show you:
  • the UniqueTag, which will enable you to run only that check if you wish
  • AllTags, which shows which tags will include that check
  • Config, which shows which configuration items can be set for this check

The trace flag checks require the app.sqlinstance configuration item, which is the list of SQL instances that the checks will run against. You can also specify the instances as a parameter for Invoke-DbcCheck, as sketched below.
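For example, a minimal sketch of both approaches (the instance names here are just placeholders):

# Set the configuration so every subsequent run uses these instances
Set-DbcConfig -Name app.sqlinstance -Value 'sql0', 'sql1'
Invoke-DbcCheck -Check TraceFlagsExpected

# Or pass the instances for a single run without changing the stored configuration
Invoke-DbcCheck -SqlInstance 'sql0', 'sql1' -Check TraceFlagsExpected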

The configuration item for the expected trace flags is policy.traceflags.expected. By default it is set to null. You can see what value it has using

Get-DbcConfig policy.traceflags.expected

get-dbcconfig.png

So if you want to check that there are no trace flags running, then you can run
$instance = 'sql0'
Set-DbcConfig -Name app.sqlinstance -Value $instance
Invoke-DbcCheck -Check TraceFlagsExpected
check 1.png
Maybe this instance is required to have trace flag 1117 enabled so that all files in a filegroup grow equally. You can set the trace flag you expect to be running using
Set-DbcConfig -Name policy.traceflags.expected -Value 1117
set config.png
Now when you run the check it fails
Invoke-DbcCheck -Check TraceFlagsExpected
not found.png
and gives you the error message

 [-] Expected Trace Flags 1117 exist on sql0 593ms
Expected 1117 to be found in collection @(), because We expect that Trace Flag 1117 will be set on sql0, but it was not found.

So we have a failing test. We need to fix that. We can use dbatools
Enable-DbaTraceFlag -SqlInstance $instance -TraceFlag 1117
set traceflag.png
This time when we run the check
Invoke-DbcCheck -Check TraceFlagsExpected
it passes
passed test
If you just need to see what trace flags are enabled you can use
Get-DbaTraceFlag -SqlInstance $instance
get trace flag.png
Reset the configuration for the expected trace flags to an empty array and then set the configuration for trace flags we do not expect to be running to 1117
Set-DbcConfig -Name policy.traceflags.expected -Value @()
Set-DbcConfig -Name policy.traceflags.notexpected -Value 1117
set config 2.png
and then run the trace flags not expected to be running check with
Invoke-DbcCheck -Check TraceFlagsNotExpected
It will fail as 1117 is still running
not expected fail.png
and give the message
[-] Expected Trace Flags 1117 to not exist on sql0 321ms
Expected 1117 to not be found in collection 1117, because We expect that Trace Flag 1117 will not be set on sql0, but it was found.
So to resolve this failing check we need to disable the trace flag and we can do that with dbatools using
Disable-DbaTraceFlag -SqlInstance $instance -TraceFlag 1117
disable trace flag
and now when we run the check
Invoke-DbcCheck -Check TraceFlagsNotExpected
it passes
passed bnot expected.png
The checks also work with multiple trace flags, so you can set multiple values for trace flags that are not expected to be running
Set-DbcConfig -Name policy.traceflags.notexpected -Value 1117, 1118
and as we saw earlier, you can run both trace flag checks using
Invoke-DbcCheck -Check TraceFlag
multi checks.png
You can use this or any of the 95 available checks to validate that your SQL instances, whether a single instance or your whole estate, are as you expect them to be.
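As a rough sketch (the instance names are placeholders, and you can swap TraceFlag for any tag that Get-DbcCheck lists), checking a small estate might look like this:

Set-DbcConfig -Name app.sqlinstance -Value 'sql0', 'sql1', 'sql2'   # the whole estate
Invoke-DbcCheck -Check TraceFlag -Show Fails                        # only show the failures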

Deploying To a Power Bi Report Server with PowerShell

Just a quick post to share some code that I used to solve a problem I had recently.

I needed to automate the deployment of some Power BI reports to a Power BI Report Server (PBRS) using TFS. I had some modified historical validation dbachecks pbix files whose deployment I wanted to automate, and I wanted to enable the client to quickly and simply deploy the reports as needed.

The manual way

It is always a good idea to understand how to do a task manually before automating it. To deploy to PBRS you need to use the Power BI Desktop optimised for Power BI Report Server; there are instructions here. Then it is easy to deploy to the PBRS by clicking File, then Save As, and choosing Power BI Report Server

manual deploy

If I then want to set the data source to use a different set of credentials, I navigate to the folder that holds the report in PBRS, click the hamburger menu and choose Manage

manage

and I can alter the User Name and Password or the type of connection by clicking on DataSources

testconn.PNG

and change it to use the reporting user for example.

Automation

But I don't want to have to do this each time, and there will be multiple pbix files, so I wanted to automate the solution. The end result was a VSTS or TFS release process so that I could simply drop the pbix files into a Git repository, commit my changes, sync them and have the system deploy them automatically.

As with all good ideas, I started with a Google search and found this post by Bill Anton which gave me a good start (I could not get the connection string change to work in my test environment, but it was not required, so I didn't really examine why).

I wrote a function that I can use via TFS or VSTS by embedding it in a PowerShell script. The function requires the ReportingServicesTools module, which you can install with

Install-Module -Name ReportingServicesTools

The function below is also available via the PowerShell Gallery, and you can get it with

Install-Script -Name PublishPBIXFile

The source code is on GitHub

and the code to call it looks like this

$folderName = 'TestFolder'
$ReportServerURI = 'http://localhost/Reports'
$folderLocation = '/'
$pbixfile = 'C:\Temp\test.pbix'
$description = "Descriptions"

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
code1.PNG

which uploads the report to a folder, which it will create if it does not exist. It will then upload the pbix file, overwriting the existing one if it already exists

numbe3r1.PNG

and uses the username and password specified

code2.PNG

If I want to use a domain reporting user instead, I can do

$UserName1 = 'TheBeard\ReportingUser'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
and it changes
code4 reporting
If we want to use a SQL Authenticated user then
$UserName1 = 'TheReportingUserOfBeard'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    # credential = $cred
    ConnectionUserName = $UserName1
    Secret             = $Password1

}
Publish-PBIXFile @publishPBIXFileSplat
sql auth.PNG
Excellent, it all works from the command line. You can pass in a credential object as well as a username and password (see the sketch below). The reason I enabled username and password? So that I can use TFS or VSTS and store my password as a secret variable.
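If you prefer the credential object route, a sketch would look like this (the Credential parameter belongs to the ByCred parameter set in the full function listing further down):

$cred = Get-Credential
$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    Credential         = $cred
}
Publish-PBIXFile @publishPBIXFileSplat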
Now I simply create a repository which has my pbix files and a PowerShell script and build a quick release process to deploy them whenever there is a change 🙂
The deploy script looks like
[CmdletBinding()]
Param (
    $PBIXFolder,
    $ConnectionStringPassword
)
$VerbosePreference = 'continue'
$ReportServerURI = 'http://TheBeardsAmazingReports/Reports'
Write-Output "Starting Deployment"
function Publish-PBIXFile {
    [CmdletBinding(DefaultParameterSetName = 'ByUserName', SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [string]$FolderName,
        [Parameter(Mandatory = $true)]
        [string]$ReportServerURI,
        [Parameter(Mandatory = $true)]
        [string]$FolderLocation,
        [Parameter(Mandatory = $true)]
        [string]$PBIXFile,
        [Parameter()]
        [string]$Description = "Description of Your report Should go here",
        [Parameter()]
        [ValidateSet('Windows', 'SQL')]
        [string]$AuthenticationType,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$ConnectionUserName,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$Secret,
        [Parameter(Mandatory = $true, ParameterSetName = 'ByCred')]
        [pscredential]$Credential
    )
    $FolderPath = $FolderLocation + $FolderName
    $PBIXName = $PBIXFile.Split('\')[-1].Replace('.pbix', '')
    try {
        Write-Verbose"Creating a session to the Report Server $ReportServerURI"
        # establish session w/ Report Server
        $session = New-RsRestSession-ReportPortalUri $ReportServerURI
        Write-Verbose"Created a session to the Report Server $ReportServerURI"
    }
    catch {
        Write-Warning"Failed to create a session to the report server $reportserveruri"
        Return
    }
    # create folder (optional)
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Creating a folder called $FolderName at $FolderLocation")) {
            $Null = New-RsRestFolder -WebSession $session -RsFolder $FolderLocation -FolderName $FolderName -ErrorAction Stop
        }
    }
    catch [System.Exception] {
        If ($_.Exception.InnerException.Message -eq 'The remote server returned an error: (409) Conflict.') {
            Write-Warning"The folder already exists - moving on"
        }
    }
    catch {
        Write-Warning"Failed to create a folder called $FolderName at $FolderLocation report server $ReportServerURI but not because it already exists"
        Return
    }
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Uploading the pbix from $PBIXFile to the report server ")) {
            # upload copy of PBIX to new folder
            Write-RsRestCatalogItem -WebSession $session -Path $PBIXFile -RsFolder $folderPath -Description $Description -Overwrite
        }
    }
    catch {
        Write-Warning"Failed to upload the file $PBIXFile to report server $ReportServerURI"
        Return
    }
    try {
        Write-Verbose"Getting the datasources from the pbix file for updating"
        # get data source object
        $datasources = Get-RsRestItemDataSource-WebSession $session-RsItem "$FolderPath/$PBIXName"
        Write-Verbose"Got the datasources for updating"
    }
    catch {
        Write-Warning"Failed to get the datasources"
        Return
    }
    try {
        Write-Verbose"Updating Datasource"

        foreach ($dataSourcein$datasources) {
            if ($AuthenticationType -eq 'SQL') {
                $dataSource.DataModelDataSource.AuthType = 'UsernamePassword'
            }
            else {
                $dataSource.DataModelDataSource.AuthType = 'Windows'
            }
            if ($Credential -or $UserName) {
                if ($Credential) {
                    $UserName = $Credential.UserName
                    $Password = $Credential.GetNetworkCredential().Password
                }
                else {
                    $UserName = $ConnectionUserName
                    $Password = $Secret
                }
                $dataSource.CredentialRetrieval = 'Store'
                $dataSource.DataModelDataSource.Username = $UserName
                $dataSource.DataModelDataSource.Secret = $Password
            }
            if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Updating the data source for the report $PBIXName")) {
                # update data source object on server
                Set-RsRestItemDataSource -WebSession $session -RsItem "$folderPath/$PBIXName" -RsItemType PowerBIReport -DataSources $datasource
            }
        }
    }
    catch {
        Write-Warning"Failed to set the datasource"
        Return
    }
    Write-Verbose"Completed Successfully"
}
foreach ($File in (Get-ChildItem $PBIXFolder\*.pbix)) {
    Write-Output"Processing $($File.FullName)"
    ## to enable further filtering later
    if ($File.FullName -like '*') {
        $folderName = 'ThePlaceForReports'
        $folderLocation = '/'
        $UserName = 'TheBeard\ReportingUser'
        $Password = $ConnectionStringPassword
        $pbixfile = $File.FullName
    }
    if ($File.FullName -like '*dbachecks*') {
        $description = "This is the morning daily checks file that....... more info"
    }
    if ($File.FullName -like '*TheOtherReport*') {
        $description = "This is hte other report, it reports others"
    }
    $publishPBIXFileSplat = @{
        ReportServerURI    = $ReportServerURI
        folderLocation     = $folderLocation
        description        = $description
        AuthenticationType = 'Windows'
        pbixfile           = $pbixfile
        folderName         = $folderName
        ConnectionUserName = $UserName
        Secret             = $Password
        Verbose            = $true
    }
    $Results = Publish-PBIXFile @publishPBIXFileSplat
    Write-Output $Results
}
Although the function does not need to be embedded in the script and could be deployed in a module, I have included it here to make it easier for people to use quickly.
Then I create a PowerShell step in VSTS or TFS and call the script with the parameters as shown below, and the Power BI files auto-deploy to Power BI Report Server
vsts.PNG
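For illustration, the arguments for that PowerShell step might look something like this (the folder path and the secret-variable name are assumptions you would adapt to your own repository and release definition):

-PBIXFolder "$(System.DefaultWorkingDirectory)\Reports" -ConnectionStringPassword "$(ReportingUserPassword)"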
and I have my process complete 🙂
Happy Automating 🙂

Using the PowerShell AST to find a ForEach Method

In dbachecks we enable people to see what checks are available by running Get-DbcCheck. This gives a number of properties including the ‘type’ of check. This refers to the configuration item or parameter that is required to have a value for this check to run.

For example, any check to do with SQL Agent is of type Sqlinstance because it requires an instance to be specified, but a check for SPNs is of type ComputerName because it requires a computer name to run.

Automation for the win

Because I believe in automation, I do not want to have to hard-code these values anywhere; instead they are created when the module is imported. We use a JSON file to feed Get-DbcCheck and populate that JSON file when we import the module. This is done using the method that I described here, and it means that whenever a new check is added it is automatically available in Get-DbcCheck without any extra work.
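As a rough illustration of the idea only (this is not the module's actual code; the variable and file names are made up):

# write the collected check information out once at import time...
$checkInfo | ConvertTo-Json -Depth 5 | Set-Content -Path .\checks.json
# ...and Get-DbcCheck can then read the pre-built list straight back in
Get-Content -Path .\checks.json -Raw | ConvertFrom-Json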

We use code like this
## Parse the file with AST
$CheckFileAST = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null)
## Old code we can use the describes
$Describes = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true)

@($describes).ForEach{
    $groups += $filename
    $Describe = $_.CommandElements.Where{$PSItem.StaticType.name -eq 'string'}[1]
    $title = $Describe.Value
    $Tags = $PSItem.CommandElements.Where{$PSItem.StaticType.name -eq 'Object[]' -and $psitem.Value -eq $null}.Extent.Text.ToString().Replace(', $filename', '')
    # Choose the type
    if ($Describe.Parent -match "Get-Instance") {
        $type = "Sqlinstance"
    }
    elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo") {
        $type = "ComputerName"
    }
    elseif ($Describe.Parent -match "Get-ClusterObject") {
        $Type = "ClusteNode"
    }
First we parse the code with the AST and store the result in the $CheckFileAST variable. Then we use the FindAll method to find any command elements whose first element is “describe”, which conveniently gets our Describe blocks. Then we can simply match the Parent object, which holds the surrounding code, against each of the functions we use to get the values passed to the tests (Get-ComputerName, Get-Instance, Get-ClusterObject) and set the type appropriately.
which when run against a check like this
Describe "Backup Path Access" -Tags BackupPathAccess, Storage, DISA, $filename {
    @(Get-Instance).ForEach{
        if ($NotContactable -contains $psitem) {
            Context "Testing Backup Path Access on $psitem" {
                It "Can't Connect to $Psitem" {
                    $false | Should -BeTrue -Because "The instance should be available to be connected to!"
                }
            }
        }
        else {
            Context "Testing Backup Path Access on $psitem" {
                $backuppath = Get-DbcConfigValue policy.storage.backuppath
                if (-not $backuppath) {
                    $backuppath = (Get-DbaDefaultPath -SqlInstance $psitem).Backup
                }
                It "can access backup path ($backuppath) on $psitem" {
                    Test-DbaSqlPath -SqlInstance $psitem -Path $backuppath | Should -BeTrue -Because 'The SQL Service account needs to have access to the backup path to backup your databases'
                }
            }
        }
    }
}
will find the Describe block, get the title “Backup Path Access” and the tags BackupPathAccess, Storage, DISA, $filename, then find the Get-Instance and set the type to SqlInstance.

Until Rob breaks it!

This has worked wonderfully well for the six months or so of the life of dbachecks, but this week I broke it!
The problem was the performance of the code. It was taking a long time to run the tests and I am looking at ways to improve this. I was looking at the Server.Tests file because I thought, why not start with one of the smaller files?
It runs the following checks:
  • Server Power Plan Configuration
  • SPNs
  • Disk Space
  • Ping Computer
  • CPUPrioritisation
  • Disk Allocation Unit
  • Instance Connection
and it was looping through the computer names for each check like this
Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
Describe "SPNs" -Tags SPN, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Ping Computer" -Tags PingComputer, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    @(Get-ComputerName).ForEach{
    }
}
I altered it to have only one loop for the computer names like so
@(Get-ComputerName).ForEach{
    Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    }
    Describe "SPNs" -Tags SPN, $filename {
    }
    Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    }
    Describe "Ping Computer" -Tags PingComputer, $filename {
    }
    Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    }
    Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
and immediately, in testing, my checks for the Server tag decreased in time by about 60% 🙂
I was very happy.
Then I added it to the dbachecks module on my machine, loaded the module and realised that my JSON file for Get-DbcCheck was no longer being populated with the type, because this line
elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo")
was no longer true.

AST for other things

So I googled Management.Automation.Language.Ast and the first result led me to docs.microsoft. There are a number of different language elements available there, and I found InvokeMemberExpressionAst, which will let me find any methods that have been invoked, so now I can find the loops with
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst]
    }, $true)
When I examined the object returned, I could see that I could further limit the result to get only the method for Get-ComputerName, and that if I chose the Extent I could get the code of that loop
## New code uses a Computer Name loop to speed up execution so need to find that as well
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst] -and $ast.expression.Subexpression.Extent.Text -eq 'Get-ComputerName'
    }, $true).Extent

and now I can match the Tags to the type again :-)

if ($ComputerNameForEach -match $title) {
    $type = "ComputerName"
}
and now Get-DbcCheck is returning the right results and the checks are a little faster.
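A quick way to spot-check that (assuming, as in the grid views earlier, that the output exposes UniqueTag and Type properties) is something like:

Get-DbcCheck -Pattern spn | Select-Object UniqueTag, Type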
You can find dbachecks on the PowerShell Gallery or install it using
Install-Module dbachecks -Scope CurrentUser

 

A PowerShell Conference In A Book

A Question

Shortly after the European PowerShell Conference and the PowerShell and DevOps 2018 Summit in the USA, Mike Robbins b | t contacted me with a question.

Interested in writing a chapter in a PowerShell book?
I was intrigued and read on.

A Conference in a Book

There was more to this book than just writing about PowerShell though. Mike was suggesting that a group of wonderful PowerShell experts (Here’s a Twitter list) got together and created a conference in a book.

The book is designed as a conference in a book where each chapter is written independently with content similar to what you would present in a 45 minute presentation.

That’s a neat idea: people who couldn’t come to one of the conferences would be able to get an experience a little bit like attending a conference, but in book form.

OK, there would be no networking, evening entertainment or instant responses to questions, but there would be a bundle of useful information that you can take with you and read anywhere.

It’s All For Charity

(I hope older UK viewers read that in this voice 🙂 )

The bit that clinched it for me was this though

We’re donating all of the royalties from the book to the DevOps Collective Scholarship program https://leanpub.com/causes/devopscollective.

All the money raised by buying this book will go to the DevOps Collective OnRamp Scholarship program.

This scholarship provides

  • ticket to PowerShell and DevOps Global Summit OnRamp track specifically designed for entry-level professionals
  • five nights lodging
  • domestic airfare
  • buddy programme
and half of the slots are reserved for under-represented groups.
I really approve of this idea. Without the help and support of the SQL and PowerShell technical communities I would not be where I am today, and this will help to bring other people in at an early stage in their careers. I am proud that I can give a little back.

Fabulous Editors

So I said yes.

I then had to sit down and write some words. I wrote about how we created dbachecks, the challenges we faced and how we overcame them.

One of my biggest challenges was writing in the wrong English! The book is written in American English and there are zeds where there should be esses and missing u’s in words! My spell checker was covered in red squiggles! The second challenge was getting the code to fit the column limit for the book. I show a lot of the AST code that we use to validate that dbachecks code will work correctly and it doesn't split to 80 characters very easily.

Luckily I had three wonderful, patient editors to help me with all of this. Mike Robbins, Michael T Lombardi and Jeff Hicks each helped me to make the chapter read more fluently, make sense and be spelled correctly!

Thank you very much you three for all the work you have put into this book.

Help Yourself and Others

If you want to attend a PowerShell conference in book form, want 30 chapters of fabulous PowerShell material and want to help grow and diversify our industry, then look no further: you can get the book here

book

 

You can also find all of the authors' Twitter accounts and websites below. You should go and see what they are sharing there as well.

Author Website
Mike F Robbins https://mikefrobbins.com
Jeff Hicks https://jdhitsolutions.com
Michael Lombardi https://appoint.ly/t/michaeltlombardi
Adam Murry https://tikabu.com.au/blog/
Anthony Nocentino http://www.centinosystems.com
Brandon Olin https://devblackops.io
Brian Bunke https://www.brianbunke.com
Don Jones https://donjones.com
Doug Finke https://dfinke.github.io
Emin Atac https://p0w3rsh3ll.wordpress.com
Fred Weinmann https://allthingspowershell.blogspot.com
Graham Beer https://graham-beer.github.io
Irwin Strachan https://pshirwin.wordpress.com
James Petty https://scriptautomaterepeat.com
Jeremy Murrah https://murrahjm.github.io
Justin Sider https://invoke-automation.blog
Luc Dekens http://www.lucd.info
Mark Kraus https://get-powershellblog.blogspot.com
Mark Wragg https://wragg.io
Mike Kanakos https://www.networkadm.in
Mike Shepard https://powershellstation.com
Patrick Gruenauer https://sid-500.com
Prateek Singh https://ridicurious.com
Rob Pleau https://ephos.github.io
Thomas Lee https://tfl09.blogspot.com
Thomas Rayner https://workingsysadmin.com
Thom Schumacher https://powershellposse.com
Tim Curwick https://MadWithPowerShell.com
Tim Warner https://timwarnertech.com
Tommy Maynard https://tommymaynard.com
Tore Groneng https://asaconsultant.blogspot.com
Wesley Kirkland https://wesleyk.me

A PowerShell Pester Check for parsing SQL scripts

I like to write Pester checks to make sure that all is as expected! This is just a quick post as much to help me remember this script 🙂

This is a quick Pester test I wrote to ensure that some SQL scripts in a directory would parse, so there was some guarantee that they were valid T-SQL. It uses the SQLParser.dll, and because it was running on a build server without SQL Server, I had to load the required DLLs from the dbatools module (thank you, dbatools 🙂 )

It simply runs through all of the .sql files, runs the parser against them and checks for errors. In the case of failures it will also output where it failed, in the error message of the failed Pester result.

You will need the dbatools module installed on the machine running the tests and at least version 4 of the Pester module as well.

Describe "Testing SQL" {
    Context "Running Parser" {
        ## Load assembly
        $Parserdll = (Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\dbatools' -Include Microsoft.SqlServer.Management.SqlParser.dll -Recurse)[0].FullName
        [System.Reflection.Assembly]::LoadFile($Parserdll) | Out-Null
        $TraceDll = (Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\dbatools' -Include Microsoft.SqlServer.Diagnostics.Strace.dll -Recurse)[0].FullName
        [System.Reflection.Assembly]::LoadFile($TraceDll) | Out-Null
        $ParseOptions = New-Object Microsoft.SqlServer.Management.SqlParser.Parser.ParseOptions
        $ParseOptions.BatchSeparator = 'GO'
        $files = Get-ChildItem -Path $Env:Directory -Include *.sql -Recurse ## This variable is set as a Build Process Variable or put your path here
        $files.ForEach{
            It "$($Psitem.FullName) Should Parse SQL correctly" {
                $filename = $Psitem.FullName
                $sql = Get-Content -LiteralPath "$fileName"
                $Script = [Microsoft.SqlServer.Management.SqlParser.Parser.Parser]::Parse($SQL, $ParseOptions)
                $Script.Errors | Should -BeNullOrEmpty
            }
        }
    }
}
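To run it, here is a minimal sketch; the test file name is made up, and $Env:Directory is normally set as a build process variable, so set it yourself when running locally:

$Env:Directory = 'C:\Git\MySQLScripts'      # folder containing the .sql files to parse
Invoke-Pester -Script .\ParseSQL.Tests.ps1  # requires Pester v4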

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks, and one of the questions I was asked was: will you write a command to put the results of the checks into a database for historical reporting?

The answer is no, and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object, and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people have. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $sql0,$SQl1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $sql0,$SQl1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, PseudoSimple,SPN, TestLastBackupVerifyOnly,IdentityUsage,SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a JSON file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
01 - Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make the information easier to read.

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save them to a variable
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables one for the summary

02 - summary

and one for the details

03 - detail
This works absolutely fine, and I could continue to add test results in this fashion, but it has no date property, so it is not so useful for reporting.
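(One lighter-weight alternative, if all you need is a date, is to add one to the objects before writing them; the TestDate column name below is just illustrative. I went a different way, as described next.)

$Testresults.TestResult |
    Select-Object *, @{Name = 'TestDate'; Expression = { Get-Date }} |
    Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable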

Create tables and triggers

This is one way of doing it. I am not sure it is the best way, but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better or different solution please blog about it and reference it in the comments below.

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today's date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO
and for the details I do the same thing. A details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A stage table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, and Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. With those, I can get the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False
detail with stage

This means that I can now query some of this data and also create Power BI reports for it.

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren job names are configuration items, I need to update the values for those checks, which I can do as follows

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power BI report in my GitHub repository, which also has the code from this blog post.

Then you just need to open it in Power BI Desktop and:

  • Click Edit Queries
  • Click Data Source Settings
  • Click Change Source
  • Change the Instance and Database names

09 - PowerBi

Then you have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the checks and the import today, click on one of the groups under “Today’s Checks by Group” instead.

This enables me to filter the results and see what has happened in the past so I can filter by one instance
05 - filter by instance
or I can filter by a group of tests
07 - filter by instance
or even by a group of tests for an instance
08 - filter by instance and insance

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the Power BI report in my GitHub repository.

Happy Validating!

dbachecks – Which Configuration Item For Which Check ?

I love showing dbachecks to people. It’s really cool seeing how people will use it and listening to their experiences. I was showing it to a production DBA a month or so ago and he said

How Do I Know Which Checks There Are?

OK you just need to run

Get-DbcCheck

and it will show you

01 - get-dbcchecks.png

It will show you the group, the type (does it need a computer name or an instance name?), the description, the unique tag for running just that check and all the tags that will run that check.

OK he said, you talked about configurations

How Do I Know Which Configurations There Are?

So to do that you just need to run

Get-DbcConfig

and it will show you

02 - dbcconfig.png

You can see the name, the current value and the description

Ah, that's cool, he said, so

How Do I Know Which Configuration Is For Which Check?

Well, you just…. , you know…… AHHHHHHH

Ping – light bulb moment!

It’s always really useful to give something you have built to people who have never seen it before and then listen to what they say. Their new eyes and different experiences or expectations will give you lots of insight

None of the amazing contributors to dbachecks had thought of this scenario, so I decided to fix it. First I asked for an issue to be raised in GitHub, because an issue can be an improvement or a suggestion, not just a bug.

Then I fixed it so that it would do what was required. Thank you Nick for this feedback and for helping to improve dbachecks

I improved Get-DbcCheck so that now it shows the configuration item related to each check

It is easier to see (and sort or search) if you use Out-GridView

Get-DbcCheck | Out-GridView

03 - New dbccheck.png

So now you can see which configuration can be set for each check!
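For example, a quick filter (assuming the new column is exposed as a property called Config, as in the grid above) might look like:

Get-DbcCheck | Where-Object { $PSItem.Config -like '*backup*' } | Select-Object UniqueTag, Config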

 

Happy Validating!

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL Server versions, so I have installed

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programmes I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my Docker storage location from my C:\ drive to my E:\ drive so I didn't run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding

{"data-root": "E:\\containers"}
and restarting the Docker service, which created folders like this
01 - folders.png
Then I ran
docker volume create SQLBackups
to create a volume to hold the backups that I could mount on the containers

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Get-ChildItem $Home\Downloads\AdventureWorks* | Copy-Item -Destination E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the DockerHub I ran

docker pull microsoft/mssql-server-windows-developer:latest

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub, so it is just a case of running

docker pull dbafromthecold/sqlserver2016dev:sp1
docker pull dbafromthecold/sqlserver2014dev:sp2
docker pull dbafromthecold/sqlserver2012dev:sp4
and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as

docker run -d -p ExposedPort:InternalPort --name NAME -v VolumeName:LocalFolder -e sa_password=THEPASSWORD -e ACCEPT_EULA=Y IMAGENAME
so all I needed to run to create four SQL containers, one of each version, was
docker run -d -p 15789:1433 --name 2017 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y microsoft/mssql-server-windows-developer
docker run -d -p 15788:1433 --name 2016 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2016dev:sp1
docker run -d -p 15787:1433 --name 2014 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2014dev:sp2
docker run -d -p 15786:1433 --name 2012 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2012dev:sp4

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂

02 - creating containers.png

03 - Containers at the ready.png

Storing Credentials

This is not something I would do in a production environment, but I save my credentials using this method that Jaap Brasser b | t shared here

Get-Credential | Export-Clixml -Path $HOME\Documents\sa.cred
which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using
$cred = Import-Clixml $HOME\Documents\sa.cred

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable

$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name

and the container connection strings, which are the hostname and the port number

$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
Then I can restore the databases using dbatools, with a switch statement on the version, which I get from the NameLevel property of Get-DbaSqlBuildReference.
$cred = Import-Clixml $HOME\Documents\sa.cred
$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name
$containers.ForEach{
    $Container = $Psitem
    $NameLevel = (Get-DbaSqlBuildReference -SqlInstance $Container -SqlCredential $cred).NameLevel
    switch ($NameLevel) {
        2017 {
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path C:\sqlbackups\ -useDestinationDefaultDirectories -WithReplace | Out-Null
            Write-Verbose -Message "Restored Databases on 2017"
        }
        2016 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2016"
        }
        2014 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*' -and $Psitem -notlike '*2016*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2014"
        }
        2012 {
            $Files = $Filenames.Where{$PSitem -like '*2012*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2012"
        }
        Default {}
    }
}
I need to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container.
As Get-DbaDatabase gives the container ID as the Computer Name, I have highlighted each container below
04 - databases.png
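A quick way to see that for yourself, reusing the $containers and $cred variables from above, is something like:

Get-DbaDatabase -SqlInstance $containers -SqlCredential $cred |
    Select-Object ComputerName, SqlInstance, Name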
That is how easy it is to create a number of SQL Server containers of differing versions for your presentation or exploration needs.
Happy Automating!

Visual Studio Code Live Sharing Set-Up

There was an announcement on the Visual Studio Code blog about the public preview of Live Share. This enables you to easily collaborate on code by securely sharing your coding session.

It is remarkably easy to set up 🙂

Installation

Open Visual Studio Code, open the Extensions side bar (CTRL + SHIFT + X)

01 - open extensions

Search for Live Share

02 - search.png

Click Install and then reload when it has done

03 - reload.png

You will notice in the bottom bar it will say finishing the installation, and if you open the terminal (CTRL + ‘), click on Output and change the drop-down on the right to Visual Studio Live Share, you can see what it is doing

04 - finishing installation.png

It is installing the dependencies as shown below

[Client I] Installing dependencies for Live Share…
[Client I] Downloading package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client I] Download complete.
[Client I] Downloading package ‘OmniSharp for Windows (.NET 4.6)’
[Client I] Download complete.
[Client I] Installing package ‘.NET Core Runtime 2.0.5 for win7-x86’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Moved and validated extracted files.
[Client I] Finished installing.
[Client I] Installing package ‘OmniSharp for Windows (.NET 4.6)’
[Client V] Extracted packed files
[Client I] Validated extracted files.
[Client I] Finished installing.
[Client I] No workspace id found.
Incidentally, this will also show the location of the log file

You will see in the bottom bar it will now say sign in

06 - sign in.png

Clicking that will open a browser and give you a choice of accounts to sign in with, your GitHub or your Microsoft ID

07 - sign in.png

Choose the one that you want to use and do your 2FA.

08 - 2FA.png

You do have 2FA on your Microsoft and GitHub (and all the other services)? If not go and set it up now – here for Microsoft and here for GitHub 

Once you have signed in you will get this notification which you can close

09 - close this notification.png

The icon in the bottom bar will change to show your account name, and if you click it, it will open the menu

09 - sharing menu.png

Sharing

To share your session you click on the Share icon in the bottom bar or the Start collaboration session in the menu above. The first time you do this there will be a pop-up as shown

05 - firewall popup.png

You can decide which way you (or your organisation) want to share. I chose to accept the firewall exception.

10 - invite link.png

The invite link is in your clipboard ready to share with your friends and colleagues (other open source contributors ??)

They can either open the link in a browser

11 - join via browser.png

or by using the Join Collaboration Session in the menu in VS Code

12 - Join via VS COde.png

Once they do the sharer will get a notification

13 - notification of sharing.png

and the person who has joined will have the same workspace opened in their Visual Studio Code

14 -shared workspace.png

You can then collaborate on your code and share the session. In the video below the left-hand side is running in my jump box in Azure and the right-hand side on my laptop. You can see that if you highlight code on one side it is shown on the other, and if you alter it on one side it is changed on the other. I also saved the file in the joined session rather than from the session that initiated the sharing, and it then saved in both sessions 🙂

So that shows how easy it is to install and to use. You can dive deeper using the documentation.

 

Happy Collaborating 🙂