Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module, a module to consume the SQL Server Diagnostics API.

I have been asked a few times what the process is for creating a module, using GitHub and developing with Pester. Whilst this is not a comprehensive how-to, I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.

Plaster Template

When you create a module there are a number of files that you need, and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub repository. Whilst you could write a PowerShell script to create all of those, there is no need as there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default 'scaffold' for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette's post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but, as with anything, if you find yourself repeating steps then it is time to automate it.

With my Plaster Template created I could simply run

$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
    DestinationPath = "GIT:\SQLDiagAPI"
    FullName        = "Rob Sewell"
    ModuleName      = "SQLDiagAPI"
    ModuleDesc      = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details"
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "SQLDiagAPI"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
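For reference, installing or updating it from the Gallery looks like this (a minimal sketch; -SkipPublisherCheck is needed on machines where the inbox copy of Pester that shipped with Windows was signed with a different certificate):

# Install or update Pester from the PowerShell Gallery
Install-Module -Name Pester -Force -SkipPublisherCheck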

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it and, as I am writing code that others will use, I need to consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.
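When a function needs the key it can read the file back and convert the stored password to plain text for the API call. A minimal sketch (GetNetworkCredential().Password is the standard way to extract the plain-text value from a PSCredential object):

# Retrieve the stored API Key from the CliXml file
$APIKey = (Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred").GetNetworkCredential().Password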

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more.

The first function I wanted to write was to get the recommendations from the API. I decided to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists. Otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: checking for the stored API Key file using Test-Path and writing a warning if the file did not exist.

Mocking

However, this is not going to work if I have already saved the key to the file. The test must not rely on anything external. I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
    Mock Test-Path {$false}
    Mock Write-Warning {"Warning"}

This is what happens when you run this test. When there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case false. For Write-Warning it will return a string of Warning.

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used" {
    Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return false, which will invoke Write-Warning in my code, and in the test that will return "Warning". So if I have written my code correctly, the test will pass without actually running the real Test-Path and interacting with my system, or running Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests.

It 'Checks the Mock was called for Test-Path' {
    $assertMockParams = @{
        'CommandName' = 'Test-Path'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to $false it would test that the mock was called at least the number of times specified. This is another check that I really have called the mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.
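That test looks almost identical; a minimal sketch of the equivalent assertion, mirroring the Test-Path test above:

It 'Checks the Mock was called for Write-Warning' {
    $assertMockParams = @{
        'CommandName' = 'Write-Warning'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}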

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed. I don't have a function named Get-SQLDiagRecommendations yet!

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests including checking that all of the scripts pass the Script Analyser rules and that all of the functions have the correct help. (thank you June Blender for that test)

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails, as this will show the describe and context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen.

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don't want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to only run a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build, Unit {
    Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with the Show Fails to reduce the output my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests are running in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher.

And now our tests pass

06 - Passing Tests

If you look at the API documentation, the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty, and without it we can't call the API, so I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
    Mock Get-MachineGUID {} -Verifiable
    Mock Write-Warning {"Warning"} -Verifiable
    Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
    Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block, and passing an APIKey parameter to Get-SQLDiagRecommendations so that we don't hit the warnings we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the verifiable mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}


and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}

Rinse and repeat

That is basically the process that I follow to write a function. I just write a test, write some code to pass it, write another test, write some code to pass it. I keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my unit tests had passed, I ran

 Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

to check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns, as I think the command has to be called Get-SQLDiagRecommendations with an s! I have tagged the Script Analyzer tests so I can just run those tests.
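If you want to run the analyzer directly rather than through the Pester wrapper, the PSScriptAnalyzer module can be called on its own. A minimal sketch (assuming the functions live in a functions folder; PSUseSingularNouns is the plural nouns rule mentioned above):

# Run the Script Analyzer rules against the function files,
# skipping the plural nouns rule
Invoke-ScriptAnalyzer -Path .\functions -Recurse -ExcludeRule PSUseSingularNouns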

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

which tests whether I had the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code's PowerShell extension to add the help scaffolding really easily, as I describe here.

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days, and once it is more complete I will put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this; in fact, I encourage it. Please fork it, submit PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam.

PowerShell Module for the SQL Server Diagnostics API – 1st Command Get-SQLDiagRecommendations

I saw this blog post about the SQL Server Diagnostics add-on to SSMS and API and thought I would write some PowerShell to work with it as all of the examples use other languages.

SQL Server Diagnostics API

The Diagnostic Analysis API allows you to upload memory dumps to be able to debug and self-resolve memory dump issues from your SQL Server instances and receive recommended Knowledge Base (KB) article(s) from Microsoft, which may be applicable for the fix.

There is also the Recommendations API to view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU which can be filtered by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc).

I have written a module to work with this API. It is not complete. It only has one command as of now but I can see lots of possibilities for improvement and further commands to interact with the API fully and enable SQL Server professionals to use PowerShell for this.

Storing the API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it. I saved my API Key using the Export-CliXml command as described by Jaap Brasser here.

 Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred" 

You need to enter a username even though it is not used and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.

The commands in the module will look for the API Key in that SQLDiag.Cred file by default, but you can also just use the APIKey parameter.

Get-SQLDiagRecommendations

The first function in the module is Get-SQLDiagRecommendations. All this function does is connect to the Recommendations API and return an object containing the information about the latest Cumulative Updates.

If you have already saved your API Key as described above you can use

 Get-SQLDiagRecommendations 

If you want to enter the API Key manually you would use

 Get-SQLDiagRecommendations -APIKey XXXXXXXX

Either way it will return a PowerShell object containing all of the information which looks like this.

07 - Get-SQLRecommendations
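If you want to explore that object at the command line you can drill into its properties. A minimal sketch (the Recommendations, Product, Title and CreatedOn property names are taken from the Out-GridView example later in this post):

# Peek at the top-level recommendations - Product holds the version,
# Title the Cumulative Update name
(Get-SQLDiagRecommendations).Recommendations | Select-Object Product, Title, CreatedOn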

One of the beauties of PowerShell is that you can pass objects down a pipeline and use them in other commands. Also, your only limit is your imagination.

You want to export to CSV, HTML, Text file?
Email, Import to database, store in Azure storage?
Embed in Word or Excel on a SharePoint site?

All of this and much, much more is easily achievable with PowerShell.

In the future this command will feed other functions in the module that will display this information in a more useful fashion. I am thinking of commands like

Get-SQLDiagRecommendations |
Get-SQLDiagLatestCU -Version SQL2012

or

Get-SQLDiagRecommendations |
Get-SQLDiagKBArticle -Version SQL2012 -Feature BackupRestore

If you have any ideas please join in on GitHub

JSON

For now though you can use Get-SQLDiagRecommendations to output the results to JSON so that you can examine them or consume them.

If you use VS Code follow the steps here and you can export the results to the current file with

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-CurrentFile

Which looks like this

08 - OutCurrentFile

It shows the entire JSON object containing all of the information about all of the latest CUs for SQL Server 2012 and up and each of the KB Articles. I have minimised several of the nodes to try and show as much as possible for SQL Server 2012 SP3.

If you do not use VS Code or you want to export straight to a file then you can

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-File -FilePath PATHTOFILE

Out-GridView

I like Out-GridView, so I quickly gathered the Product, Cumulative Update, Feature Type, KB Number and URL and output them to Out-GridView like this

$recommendations = Get-SQLDiagRecommendations
$KBs = foreach ($recommendation in $recommendations.Recommendations) {
    $Product = $recommendation.Product
    $CU = $recommendation.Title
    $CreatedOn = $recommendation.CreatedOn
    foreach ($fix in $recommendation.Content.RelevantFixes) {
        $feature = $fix.Title
        foreach ($Kb in $fix.KbArticles) {
            [PSCustomObject]@{
                CreatedOn = $CreatedOn
                Product   = $Product
                CU        = $CU
                Feature   = $feature
                KB        = $Kb.Rel
                Link      = $Kb.href
            }
        }
    }
}
$KBs | Out-GridView

As you can filter easily in Out-GridView I filtered by 2012 and this is what it looks like

09 - Out-GridView

This will enable you to quickly see any information that you require about the Cumulative Updates for SQL Server 2012, 2014 and 2016.

Github

You can find the module on GitHub. There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days, and once it is more complete I will put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this; in fact, I encourage it. Please fork it, submit PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

Tomorrow I have a post explaining the process I used to create the module and how I used Test Driven Development with Pester to write this function.

VSCode – PowerShell extension 1.4.0 new command Out-CurrentFile

Yesterday David Wilson announced version 1.4.0 of the PowerShell extension for VSCode

https://twitter.com/daviwil/status/877559988809457671

He also pointed out that there have been over 1 million installs of the extension. 🙂

If you want to install the PowerShell extension you can hit F1 in VSCode and type

ext install PowerShell

or CTRL + SHIFT + X to open the extensions side bar and search for PowerShell and click the green install button.

There are a few enhancements in this release which you can read about here but I noticed the New File API and Out-CurrentFile command that were contributed by Doug Finke.

If your focus is in the editor in VSCode, you can simply CTRL + N and create a new file. You can alter the language that the file uses with CTRL + K, M (that’s CTRL and K and then M not CTRL and K and M!) or set the default new file language as I described here.

01 -new file.gif

If you are using VSCode as your daily PowerShell command line, you would have to alter your focus from the terminal panel into the editor to do this. Now, though, you have

$psEditor.Workspace.NewFile()

which enables you to create a new file from the terminal

02- Another new file.gif

The Out-CurrentFile command sends the output of a command through Out-String to a new file. This makes it much easier to keep the results of some commands. You don't have to pipe them to clip, or highlight and CTRL + C to copy them, and then open a file and paste them. Of course you can use Out-File and then open the file, but this is another way.

Let's see how it works. In this example, I want to export the T-SQL for creating the logins on an instance and add some comments to the code before saving it somewhere safely. I am going to use the Export-SqlLogin command from the dbatools module. MVP Cláudio Silva has written a great post on that command today.

First create a new file

$psEditor.Workspace.NewFile()

and then

Export-SqlLogin -SqlInstance . | Out-CurrentFile

In the gif above

  • I create a new file,
  • export the logins to it,
  • change the language of the file to T-SQL,
  • remove the string quotes and
  • add some comments.

All without leaving VSCode, just another reason that my productivity is increased using VSCode!

Unfortunately, it doesn't work so well with Pester (of course I was going to try Pester!). This makes sense though, as Pester uses Write-Host to display the test results, so nothing is going to the output stream. This means that

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' | Out-CurrentFile

returns this

04 - pester.PNG

and if you use the -PassThru parameter then you get the $Tests object as a string so

Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' -Show Summary -PassThru | Out-CurrentFile

shows

05 - pester object.PNG

Which isn’t what I would need but I have other ways of working with Pester output.
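One of those ways is to keep the results as an object rather than a string: assign the -PassThru output to a variable and query it. A minimal sketch (property names as per the Pester 3/4 result object):

# Keep the Pester results as an object and pull out just the failures
$results = Invoke-Pester '.\SQL Grillen\Demo Number 3.Tests.ps1' -Show Summary -PassThru
$results.TestResult | Where-Object { -not $_.Passed } | Select-Object Describe, Name, FailureMessage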

dbatools at #SQLSatDublin

This weekend SQL Saturday Dublin occurred. For those that don't know, SQL Saturdays are free conferences with local and international speakers providing great sessions in the Data Platform sphere.

Chrissy LeMaire and I presented our session PowerShell SQL Server: Modern Database Administration with dbatools. You can find slides and code here. We were absolutely delighted to be named Best Speaker, which was decided from the attendees' average evaluation.

Chrissy also won the Best Lightning talk for her 5 minute (technically 4 minutes and 55 seconds) presentation on dbatools as well 🙂

We thoroughly enjoy giving this presentation and I think it shows in the feedback we received.

Feedback

History

We start with a little history of dbatools: how it started as one megalithic script, Start-SQLMigration.ps1, and has evolved into (this number grows so often it is probably wrong by the time you read this) over 240 commands from 60 contributors.

Requirements

We explain the requirements. You can see them here on the download page.

The minimum requirements for the Client are

  • PowerShell v3
  • SSMS / SMO 2008 R2

which we hope will cover a significant majority of people's workstations.

The minimum requirements for the SQL Server are

  • SQL Server 2000
  • No PowerShell for pure SQL commands
  • PowerShell v2 for Windows commands
  • Remote PowerShell enabled for Windows commands

As you can see, the SQL Server does not even need to have PowerShell installed (unless you want to use the Windows commands). We test our commands thoroughly using a test estate that encompasses all versions of SQL Server from 2000 through to 2017, and whenever there is a vNext available we will test against that too.

We recommend, though, that you use PowerShell v5.1 with SSMS or SMO for SQL 2016 on the client.

Installation

We love how easy and simple the installation of dbatools is. As long as you have access to the internet (and permission from your company's security team to install 3rd party tools — please don't break your company's policies) you can simply install the module from the PowerShell Gallery using

Install-Module dbatools

If you are not a local administrator on your machine you can use the -Scope parameter

Install-Module dbatools -Scope CurrentUser

Incidentally, if you or your security team have concerns about the quality or trust of the content in the PowerShell Gallery please read this post which explains the steps that are taken when code is uploaded.

If you cannot use the PowerShell Gallery then you can use this line of code to install from GitHub

Invoke-Expression (Invoke-WebRequest https://dbatools.io/in)

There is a video on the download page showing the installation on a Windows 7 machine and also some other methods of installing the module should you need them.

Website

Next we visit the website dbatools.io and look at the front page. We have our regular joke about how Chrissy doesn’t want to present on migrations but I think they are so cool so she makes me perform the commentary on the video. (Don’t tell anyone but it also helps us to get in as many of the 240+ commands in a one hour session as well 😉 ). You can watch the video on the front page. You definitely should as you have never seen migrations performed so easily.

Then we talk about the comments we have received from well respected people from both SQL and PowerShell community members, so you can trust that it's not just some girl with hair and some bloke with a beard saying that it's awesome.

Contributors

Probably my favourite page on the website is the team page, showing all of the amazing, fabulous, wonderful people who have given their own time freely to make such a fantastic free tool. If we have contributors in the audience we do try to point them out. One of our aims with dbatools is to enable people to receive recognition for the hard work that they put in, and we do this via the team page and our LinkedIn company page, as well as by linking back to the contributors in the help and the web page for every command. I wish I could name check each one of you.

Thank You each and every one !!

Finding Commands

We then look at the command page and the new improved search page and demonstrate how you can use them to find information about the commands that you need and the challenges in keeping this all maintained during a period of such rapid expansion.

Demo

Then it is time for me to say this phrase. “Strap yourselves in, do up your seatbelts, now we are going to show 240 commands in the next 40 minutes. Are you ready!!”

Of course, I am joking, one of the hardest things about doing a one hour presentation on dbatools is the sheer number of commands that we have that we want to show off. Of course we have already shown some of them in the migration video above but we still have a lot more to show and there are a lot more that we wish we had time to show.

Backup and Restore

We start with a restore of one database from a single backup file using Restore-DbaDatabase, showing you the easy-to-read warning that you get if the database already exists and then how to resolve that warning with the WithReplace switch.

Then how to use it to restore an entire instance worth of backups to the latest available time by pointing Restore-DbaDatabase at a folder on a share.

Then how to use Get-DbaDatabase to get all of the databases on an instance and pass them to Backup-DbaDatabase to back up an entire instance.

We look at the backup history of some databases using Get-DbaBackupHistory and Out-GridView, and examine detailed information about a backup file using Read-DbaBackupHeader.
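Roughly, those demo steps look like this (a sketch only — the instance names and paths are placeholders and exact parameters may differ between dbatools versions):

# Restore a single backup file, replacing the existing database
Restore-DbaDatabase -SqlInstance sql2016 -Path \\server\backups\db1.bak -WithReplace

# Restore everything in a share to the latest available time
Restore-DbaDatabase -SqlInstance sql2016 -Path \\server\backups

# Back up every database on an instance via the pipeline
Get-DbaDatabase -SqlInstance sql2016 | Backup-DbaDatabase

# Examine backup history and a backup file header
Get-DbaBackupHistory -SqlInstance sql2016 | Out-GridView
Read-DbaBackupHeader -SqlInstance sql2016 -Path \\server\backups\db1.bak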

We give thanks to Stuart Moore for his amazing work on these and several other backup and restore commands.

SPNs

After a quick reminder that you can search for commands at the command line using Find-DbaCommand, we talk about SPNs and try to find someone, anyone, who actually likes working with SQL Server and SPNs and resolving the issues!!

Then we show Drew's SPN commands Get-DbaSpn, Test-DbaSpn, Set-DbaSpn and Remove-DbaSpn.

Holiday Tasks

We then talk about the things we ensure we run before going on holiday to make sure we leave with a warm fuzzy feeling that everything will be OK until we return:

  • Get-DbaLastBackup will show the last time the database had any type of backup.
  • Get-DbaLastGoodCheckDb which shows the last time that a database had a successful DBCC CheckDb and how we can gather the information for all the databases on all of your instances in just one line of code (see the sketch after this list)
  • Get-DbaDiskSpace which will show the disk information for all of the drives including mount points and whether the disk is in use by SQL
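That one line of code looks something like this (a sketch only — the instance names are placeholders):

# Check the last known good DBCC CHECKDB for every database on a list of instances
Get-DbaLastGoodCheckDb -SqlInstance sql2012, sql2014, sql2016 | Out-GridView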

Testing Your Backup Files By Restoring Them

We ask how many people test their backup files every single day, and Dublin wins marks for a larger percentage than some other places we have given this talk. We show Test-DbaLastBackup in action so that you can see the files being created, because we think it looks cool (and you can see the filenames!). Chrissy has written a great post about how you can set up your own dedicated backup file test server.

Free Space

We show how to gather the file space information using Get-DbaDatabaseFreespace and then how you can put that (or the results of any PowerShell command) into a SQL database table using Out-DbaDataTable and Write-DbaDataTable.
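Strung together, that pipeline looks something like this (a sketch only — the destination instance, database and table names are placeholders, and -AutoCreateTable is assumed to be available in your version of dbatools):

# Gather file space information and write it straight into a SQL table
Get-DbaDatabaseFreespace -SqlInstance sql2016 |
    Out-DbaDataTable |
    Write-DbaDataTable -SqlInstance sql2016 -Database DBA -Table FreeSpace -AutoCreateTable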

SQL Community

Next we talk about how we love to take community members blog posts and turn them into dbatools commands.

We start with Jonathan Kehayias's post about SQL Server max memory (http://bit.ly/sqlmemcalc) and show Get-DbaMaxMemory, Test-DbaMaxMemory and Set-DbaMaxMemory.
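Together they work something like this (a sketch; the instance name is a placeholder and Set-DbaMaxMemory without a value is assumed to apply the calculated recommendation):

# Compare the configured max memory with the calculated recommendation,
# then apply the recommended value
Test-DbaMaxMemory -SqlInstance sql2016
Set-DbaMaxMemory -SqlInstance sql2016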

Then Paul Randal's blog post about Pseudo-Simple Mode, which inspired Test-DbaFullRecoveryModel.

We talked about getting backup history earlier, but now we talk about Get-DbaRestoreHistory, a command inspired by Kenneth Fisher's blog post, to show when a database was restored and which file was used.

Next, a command from Thomas LaRock which he gave us for testing linked servers, Test-DbaLinkedServerConnection.

Glenn Berry's diagnostic information queries are available thanks to André Kamman and the commands Invoke-DbaDiagnosticQuery and Export-DbaDiagnosticQuery. The second one will output all of the results to CSV files.

Adam Machanic's sp_whoisactive is a common tool in the SQL DBA's toolkit and can now be installed using Install-DbaWhoIsActive and run using Invoke-DbaWhoIsActive.

Awesome Contributor Commands

Then we try to fit in as many commands as we can from our fantastic contributors, showing how we can do awesome things with just one line of PowerShell code.

The awesome Find-DbaStoredProcedure, which you can read more about here, in tests searched 37,545 stored procedures on 9 instances in under 9 seconds for a particular string.

Find-DbaOrphanedFile which enables you to identify the files left over from detaching databases.

Don’t know the SQL Admin password for an instance? Reset-SqlAdmin can help you.
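As one-liners, those three look something like this (a sketch; the instance names and search string are placeholders):

# Search every stored procedure on an instance for a string
Find-DbaStoredProcedure -SqlInstance sql2016 -Pattern 'dbatools'

# Identify database files left behind by detached databases
Find-DbaOrphanedFile -SqlInstance sql2016

# Regain sysadmin access to an instance when no password is known
Reset-SqlAdmin -SqlInstance sql2016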

It is normally somewhere around here that we finish and even though we have shown 32 commands (and a few more encapsulated in the Start-SqlMigration command) that is less than 15% of the total number of commands in the module!!!

Somehow, we always manage to fit all of that into 60 minutes and have great fun doing it. Thank you to everyone who has come and seen our sessions in Vienna, Utrecht, PASS PowerShell Virtual Group, Hanover, Antwerp and Dublin.

More

So you want to know more about dbatools ? You can click the link and explore the website

You can look at source code on GitHub

You can join us in the SQL Community Slack in the #dbatools channel

You can watch videos on YouTube

You can see a list of all of the presentations and get a lot of the slides and demos

If you want to see the slides and demos from our Dublin presentation you can find them here

Volunteers

Lastly and most importantly of all: SQL Saturdays are run by volunteers, so massive thanks to Bob, Carmel, Ben and the rest of the team who ensured that SQL Saturday Dublin went so very smoothly.


VS Code – Automatic Dynamic PowerShell Help

VS Code is my coding tool of choice. I love that one lightweight editor can do so much, and as PowerShell is usually the language that I write in, I really love the PowerShell extension.

Help

When you write a PowerShell function that is going to be used by someone other than you, you don’t want to be the guy or gal that has to support it indefinitely. You should write good help to enable your users to simply type

Get-Help NAMEOFCOMMAND

and get all of the help that they need to use the command

If we look at one of my favourite dbatools commands Get-DbaLastGoodCheckDB we can see this in action.

Get-Help Get-DbaLastGoodCheckDb -full

This returns

NAME
Get-DbaLastGoodCheckDb

SYNOPSIS
Get date/time for last known good DBCC CHECKDB

SYNTAX
Get-DbaLastGoodCheckDb [-SqlInstance] <DbaInstanceParameter[]> [[-SqlCredential] <PSCredential>] [-Silent] [<CommonParameters>]

DESCRIPTION
Retrieves and compares the date/time for the last known good DBCC CHECKDB, as well as the creation date/time for the database.

This function supports SQL Server 2005+

Please note that this script uses DBCC DBINFO() WITH TABLERESULTS. DBCC DBINFO has several known weak points, such as:
– DBCC DBINFO is an undocumented feature/command.
– The LastKnownGood timestamp is updated when a DBCC CHECKFILEGROUP is performed.
– The LastKnownGood timestamp is updated when a DBCC CHECKDB WITH PHYSICAL_ONLY is performed.
– The LastKnownGood timestamp does not get updated when a database is in READ_ONLY.

An empty ($null) LastGoodCheckDb result indicates that a good DBCC CHECKDB has never been performed.

SQL Server 2008R2 has a "bug" that causes each database to possess two dbi_dbccLastKnownGood fields, instead of the normal one.
This function will only display the newest timestamp. If -Verbose is specified, the function will announce every time
more than one dbi_dbccLastKnownGood field is encountered.

PARAMETERS
-SqlInstance
The SQL Server that you’re connecting to.

Required? true
Position? 1
Default value
Accept pipeline input? true (ByValue)
Accept wildcard characters? false

-SqlCredential
Credential object used to connect to the SQL Server as a different user

Required? false
Position? 2
Default value
Accept pipeline input? false
Accept wildcard characters? false

-Silent [<SwitchParameter>]
Use this switch to disable any kind of verbose messages

Required? false
Position? named
Default value False
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).

INPUTS

OUTPUTS

NOTES

Copyright (C) 2016 Jakob Bindslet (jakob@bindslet.dk)

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>.

————————– EXAMPLE 1 ————————–

PS C:\>Get-DbaLastGoodCheckDb -SqlInstance ServerA\sql987

Returns a custom object displaying Server, Database, DatabaseCreated, LastGoodCheckDb, DaysSinceDbCreated, DaysSinceLastGoodCheckDb, Status and
DataPurityEnabled

————————– EXAMPLE 2 ————————–

PS C:\>Get-DbaLastGoodCheckDb -SqlInstance ServerA\sql987 -SqlCredential (Get-Credential sqladmin) | Format-Table -AutoSize

Returns a formatted table displaying Server, Database, DatabaseCreated, LastGoodCheckDb, DaysSinceDbCreated, DaysSinceLastGoodCheckDb, Status
and DataPurityEnabled.
Authenticates with SQL Server using alternative credentials.

RELATED LINKS
DBCC CHECKDB:
https://msdn.microsoft.com/en-us/library/ms176064.aspx
http://www.sqlcopilot.com/dbcc-checkdb.html
Data Purity:
CHECKDB From Every Angle: How to tell if data purity checks will be run?
https://www.mssqltips.com/sqlservertip/1988/ensure-sql-server-data-purity-checks-are-performed/

So anyone who needs to use the command can see what it is, a full description, what each parameter is for, some examples and some links to more information.

So what I used to do was put a snippet of code like this at the top of my function and then fill in the blanks

<#
.SYNOPSIS
Short description
.DESCRIPTION
Long description
.EXAMPLE
An example
.PARAMETER ParameterName
Parameter Help
.NOTES
General notes
.LINK
Link to more information
#>

The latest release of the PowerShell extension for VS Code has made that process so much simpler 🙂 Thank you David and Keith

Now you can simply type ## (edited May 2018: you can configure this to <# in the user settings) and your help will be dynamically created. You will still have to fill in some of the blanks, but it is a lot easier.

Here it is in action in its simplest form

Pester - Simple.gif

But it gets better than that. When you add parameters to your function code they are added to the help as well. Also, all you have to do is tab between the different entries in the help to move between them.

02 - detailed.gif

Now when we run

Get-Help Invoke-AmazingThings -Full

We get this

03 help.PNG

Nice and easy, and a great feature added to the VS Code PowerShell extension.

Write Good Help.png