Good Bye Import-CliXML – Use the Secrets Management module for your labs and demos

Don’t want to read all this? There are two dotnet interactive notebooks here with the relevant information for you to use.

https://beard.media/dotnetnotebooks

Jaap is awesome

I have to start here. For the longest time, whenever anyone has asked me how I store my credentials for use in my demos and labs, I have always referred them to Jaap Brasser's blog post

https://www.jaapbrasser.com/quickly-and-securely-storing-your-credentials-powershell/
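For reference, the pattern that post describes is roughly this: save the credential to disk with Export-Clixml, which on Windows protects the password with DPAPI so that only the same user on the same machine can read it back, and re-hydrate it later with Import-Clixml. The file path here is just an example.

# Save a credential to disk, encrypted for the current user on this machine
Get-Credential | Export-Clixml -Path C:\temp\MyCredential.xml

# Later, as the same user on the same machine, read it back as a PSCredential
$Credential = Import-Clixml -Path C:\temp\MyCredential.xml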

Joel is also awesome!

When people wanted a method of storing credentials that didn't involve files on disk, I would suggest Joel Bennett's module BetterCredentials, which uses the Windows Credential Manager

https://www.powershellgallery.com/packages/BetterCredentials/4.5

Microsoft? Also awesome!

In February, Microsoft released the SecretManagement module for preview.

https://devblogs.microsoft.com/powershell/secrets-management-development-release/

Sydney gave a presentation at the European PowerShell Conference which you can watch on YouTube.

Good Bye Import-CliXML

So now I say: it is time to stop using Import-Clixml for storing secrets and to use the Microsoft.PowerShell.SecretsManagement module instead.

Notebooks are as good as blog posts

I love notebooks, and to show some people who had asked about storing secrets, I have created some. So, because I am efficient (read: lazy), I have embedded them here for you to see. You can find them in my Jupyter Notebook repository

https://beard.media/dotnetnotebooks

in the Secrets folder

Installing and using the Secrets Management Module

Unfortunately, these notebooks may not display on a mobile device.
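In case the embedded notebook does not render for you, this is a minimal sketch of the commands it walks through. The cmdlet names here are the ones from the released module and the vault and secret names are just examples; the early preview version differed slightly (it shipped with a built-in default vault).

# Install the SecretManagement module and the local vault extension
Install-Module Microsoft.PowerShell.SecretManagement -Scope CurrentUser
Install-Module Microsoft.PowerShell.SecretStore -Scope CurrentUser

# Register a local vault and make it the default
Register-SecretVault -Name BeardVault -ModuleName Microsoft.PowerShell.SecretStore -DefaultVault

# Store a credential and read it back
Set-Secret -Name sqladmin -Secret (Get-Credential)
$cred = Get-Secret -Name sqladmin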

Using the Secret Management Module in your scripts

Here is a simple example of using the module to provide the credential for a docker container and then to dbatools to query the container

Unfortunately, these notebooks may not display on a mobile device.
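Again, in case the notebook does not render, this is roughly the shape of that demo. The container name, image tag, port and secret name are all illustrative rather than the exact values from the notebook.

# Pull the stored sa credential back out of the vault
$cred = Get-Secret -Name sqladmin

# Spin up a SQL Server container using the same password (names and ports are examples)
docker run -d --name demoSQL -p 15789:1433 `
    -e ACCEPT_EULA=Y `
    -e SA_PASSWORD=$($cred.GetNetworkCredential().Password) `
    mcr.microsoft.com/mssql/server:2019-latest

# Query the container with dbatools using the same credential object
Invoke-DbaQuery -SqlInstance 'localhost,15789' -SqlCredential $cred -Query 'SELECT @@VERSION'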

Version Update, Code Signing and publishing to the PowerShell Gallery with VSTS

At the fabulous PowerShell Conference EU I presented about Continuous Delivery to the PowerShell Gallery with VSTS and explained how we use VSTS to enable CD for dbachecks. We even released a new version during the session 🙂

So how do we achieve this?

We have a few steps

  • Create a project and link to our GitHub
  • Run unit tests with Pester to make sure that our code is doing what we expect.
  • Update our module version and commit the change to GitHub
  • Sign our code with a code signing certificate
  • Publish to the PowerShell Gallery

Create Project and link to GitHub

First you need to create a VSTS project by going to https://www.visualstudio.com/. This is free for up to 5 users with 1 concurrent CI/CD queue limited to a maximum of 60 minutes run time, which should be more than enough for your PowerShell module.

01 - sign up.png

Click on Get Started for free under Visual Studio Team Services and fill in the required information. Then on the front page click new project

02 - New Project.png

Fill in the details and click create

03 - create project.png

Click on builds and then new definition

04- builds.png

Next you need to link your project to your GitHub (or other source control provider's) repository

05 - github auth.png

You can either authorise with OAuth or you can provide a PAT token following the instructions here. Once that is complete choose your repo. Save the PAT as you will need it later in the process!

06 - choose repo.png

and choose the branch that you want this build definition to run against.

07 branch.png

I chose to run the Unit Tests when a PR was merged into the development branch. I will then create another build definition for the master branch to sign the code and update module version. This enables us to push several PRs into the development branch and create a single release for the gallery.

Then I start with an empty process

08 - empty process.png

and give it a suitable name

09 - name it.png

I chose the hosted queue, but you can download an agent to your build server if you need to do more or your integration tests require access to other resources not available on the hosted agent.

Run Unit Tests with Pester

We have a number of unit tests in our tests folder in dbachecks, so we want to run them to ensure that everything is as it should be and that the new code will not break existing functionality (and, for dbachecks, the format of the PowerBi output).

You can use the Pester Test Runner Build Task from the folk at Black Marble by clicking on the + sign next to Phase 1 and searching for Pester

10 - Pester task runner.png

You will need to click Get It Free to install it and then click Add to add the task to your build definition. You can pretty much leave it at the defaults if you wish and Pester will run all of the *.Tests.ps1 files that it finds in the directory where it downloads the GitHub repo, which is referred to using the variable $(Build.SourcesDirectory). It will then output the results to an XML file called Test-Pester.XML ready for publishing.
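If you prefer to see it as plain PowerShell, the task's default behaviour is roughly equivalent to this; a sketch only, not the task's actual implementation, run as an inline PowerShell task step so that the $(Build.SourcesDirectory) variable gets substituted.

# Run every *.Tests.ps1 under the checked-out repo and write NUnit-style results
Invoke-Pester -Script "$(Build.SourcesDirectory)" `
    -OutputFile "$(Build.SourcesDirectory)\Test-Pester.XML" `
    -OutputFormat NUnitXml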

However, as dbachecks has a number of dependent modules, this task was not suitable. I spoke with Chris Gardner from Black Marble at the PowerShell Conference and he says that this can be resolved, so look out for the update. Chris is a great guy and always willing to help; you can often find him in the PowerShell Slack channel answering questions and helping people.

But as you can use PowerShell in VSTS tasks, this is not a problem, although you need to write your PowerShell using try/catch to make sure that your task fails when your PowerShell errors. This is the code I use to install the modules.

$ErrorActionPreference = 'Stop'

# Set location to module home path in artifacts directory
try {
    Set-Location $(Build.SourcesDirectory)
    Get-ChildItem
}
catch {
    Write-Error "Failed to set location"

}

# Get the Module versions
Install-Module Configuration -Scope CurrentUser -Force
$Modules = Get-ManifestValue -Path .\dbachecks.psd1 -PropertyName RequiredModules

$PesterVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'Pester'}[0].Get_Item('ModuleVersion')
$PSFrameworkVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'PSFramework'}[0].Get_Item('ModuleVersion')
$dbatoolsVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'dbatools'}[0].Get_Item('ModuleVersion')

# Install Pester
try {
    Write-Output "Installing Pester"
    Install-Module Pester  -RequiredVersion $PesterVersion  -Scope CurrentUser -Force -SkipPublisherCheck
    Write-Output "Installed Pester"

}
catch {
    Write-Error "Failed to Install Pester $($_)"
}
# Install PSFramework
try {
    Write-Output "Installing PSFramework"
    Install-Module PSFramework  -RequiredVersion $PsFrameworkVersion  -Scope CurrentUser -Force 
    Write-Output "Installed PSFramework"

}
catch {
    Write-Error "Failed to Install PSFramework $($_)"
}
# Install dbatools
try {
    Write-Output "Installing dbatools"
    Install-Module dbatools  -RequiredVersion $dbatoolsVersion  -Scope CurrentUser -Force 
    Write-Output "Installed dbatools"

}
catch {
    Write-Error "Failed to Install dbatools $($_)"
}

# Add current folder to PSModulePath
try {
    Write-Output "Adding local folder to PSModulePath"
    $ENV:PSModulePath = $ENV:PSModulePath + ";$pwd"
    Write-Output "Added local folder to PSModulePath"    
    $ENV:PSModulePath.Split(';')
}
catch {
    Write-Error "Failed to add $pwd to PSModulePAth - $_"
}

I use the Configuration module from Joel Bennett to get the required module versions for the required modules and then add the path to $ENV:PSModulePath so that the modules will be imported. I do this because the modules did not import correctly without it.

Once I have the modules I can then run Pester as follows

try {
    Write-Output "Importing dbachecks"
    Import-Module .\dbachecks.psd1
    Write-Output "Imported dbachecks"
}
catch {
    Write-Error "Failed to import dbachecks $($_)"
}
$TestResults = Invoke-Pester .\tests -ExcludeTag Integration,IntegrationTests  -Show None -OutputFile $(Build.SourcesDirectory)\Test-Pester.XML -OutputFormat NUnitXml -PassThru

if ($TestResults.failedCount -ne 0) {
    Write-Error "Pester returned errors"
}

As you can see I import the dbachecks module from the local folder, run Invoke-Pester and output the results to an XML file and check that there are no failing tests.

Whether you use the task or PowerShell the next step is to Publish the test results so that they are displayed in the build results in VSTS.

Click on the + sign next to Phase 1 and search for Publish

12 - publish test results.png

 

Choose the Publish Test Results task and leave everything as default unless you have renamed the xml file. This means that on the summary page you will see some test results

 

13 - Test on sumary page.png

and on the tests tab you can see more detailed information and drill down into the tests

14 - detailed test report.png

Trigger

The next step is to trigger a build when a commit is pushed to the development branch. Click on Triggers and tick enable continuous integration

15 Trigger.png

Saving the Build Definition

I would normally save the build definition regularly and ensure that there is a good message in the comment. I always tell clients that this is like a commit message for your build process so that you can see the history of the changes for the build definition.

You can see the history on the edit tab of the build definition

16 - build history.png

If you want to compare or revert the build definition this can be done using the hamburger menu as shown below.

17 - build history compare revert.png

Update the Module Version

Now we need to create a build definition for the master branch to update the module version and sign the code ready for publishing to the PowerShell Gallery when we commit or merge to master

Create a new build definition as above but this time choose the master branch

18 - master build.png

Again choose an empty process and name it sensibly, click the + sign next to Phase 1 and search for PowerShell

19 - PowerShell task.png

I change the task version to 2 and use this code. Note that the commit message has ***NO_CI*** in it. Putting this in a commit message tells VSTS not to trigger a build for this commit.

$manifest = Import-PowerShellDataFile .\dbachecks.psd1 
[version]$version = $Manifest.ModuleVersion
Write-Output "Old Version - $Version"
# Add one to the build of the version number
[version]$NewVersion = "{0}.{1}.{2}" -f $Version.Major, $Version.Minor, ($Version.Build + 1) 
Write-Output "New Version - $NewVersion"
# Update the manifest file
try {
    Write-Output "Updating the Module Version to $NewVersion"
    $path = "$pwd\dbachecks.psd1"
    (Get-Content .\dbachecks.psd1) -replace $version, $NewVersion | Set-Content .\dbachecks.psd1 -Encoding string
    Write-Output "Updated the Module Version to $NewVersion"
}
catch {
    Write-Error "Failed to update the Module Version - $_"
}

try {
    Write-Output "Updating GitHub"
    git config user.email "mrrobsewell@outlook.com"
    git config user.name "SQLDBAWithABeard"
    git add .\dbachecks.psd1
    git commit -m "Updated Version Number to $NewVersion ***NO_CI***"

    git push https://$(RobsGitHubPAT)@github.com/sqlcollaborative/dbachecks.git HEAD:master
    Write-Output "Updated GitHub"
}
catch {
    $_ | Fl -Force
    Write-Output "Failed to update GitHub"
}

I use Get-Content and Set-Content because I had errors with Update-ModuleManifest, but Adam Murray uses this code to update the version using the BuildID from VSTS

$newVersion = New-Object version -ArgumentList 1, 0, 0, $env:BUILD_BUILDID
$Public  = @(Get-ChildItem -Path $ModulePath\Public\*.ps1)
$Functions = $public.basename
Update-ModuleManifest -Path $ModulePath\$ModuleName.psd1 -ModuleVersion $newVersion -FunctionsToExport $Functions

You can commit your change by adding your PAT token as a variable under the variables tab. Don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

20 - variables.png

Sign the code with a certificate

The SQL Collaborative uses a code signing certificate from DigiCert, who allow MVPs to use one for free to sign their code for open source projects (thank you!). We had to upload the certificate to the secure files store in the VSTS library. Click on Library, then Secure Files and the blue +Secure File button

21 - secure file store.png

You also need to add the password as a variable under the variables tab as above. Again don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Then you need to add a task to download the secure file. Click on the + sign next to Phase 1 and search for secure

22 download secure file.png

choose the file from the drop down

23 - download secure file.png

Next we need to import the certificate and sign the code. I use a PowerShell task for this with the following code

$ErrorActionPreference = 'Stop'
# read in the certificate from a pre-existing PFX file
# I have checked this with @IISResetMe and this does not go in the store only memory
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new("$(Agent.WorkFolder)\_temp\dbatools-code-signing-cert.pfx","$(CertPassword)")

try {
    Write-Output "Signing Files"
    # find all scripts in your module...
Get-ChildItem  -Filter *.ps1 -Include *.ps1 -Recurse -ErrorAction SilentlyContinue |
# ...that do not have a signature yet...
Where-Object {
  ($_ | Get-AuthenticodeSignature).Status -eq 'NotSigned'
  } |
# and apply one
Set-AuthenticodeSignature -Certificate $cert
Write-Output "Signed Files"
}
catch {
    $_ | Format-List -Force
    Write-Error "Failed to sign scripts"
}

which will import the certificate into memory and sign all of the scripts in the module folder.
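If you want to reassure yourself that the signing step worked, you can list anything that is still not validly signed afterwards, something like this quick sketch.

# List any scripts that do not have a valid signature after the signing step
Get-ChildItem -Filter *.ps1 -Recurse |
    Where-Object { ($_ | Get-AuthenticodeSignature).Status -ne 'Valid' } |
    Select-Object FullName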

Publish your artifact

The last step of the master branch build publishes the artifact (your signed module) to VSTS ready for the release task. Again, click the + sign next to Phase 1 and choose the Publish Artifact task (not the deprecated Copy and Publish Artifact task) and give the artifact a useful name

24 - publish artifact.png

Don’t forget to set the trigger for the master build as well following the same steps as the development build above

Publish to the PowerShell Gallery

Next we create a release to trigger when there is an artifact ready and publish to the PowerShell Gallery.

Click the Releases tab and New Definition

25 - Reelase creation

Choose an empty process and name the release definition appropriately

26 Release name empty process.png

Now click on the artifact and choose the master build definition. If you have not run a build you will get an error like the one below, but don't worry, click Add.

27 - add artifact.png

Click on the lightning bolt next to the artifact to open the continuous deployment trigger

28 - Choose lightning bolt

and turn on Continuous Deployment so that when an artifact has been created with an updated module version and signed code it is published to the gallery

28 - Continuous deployment trigger

Next, click on the environment and name it appropriately and then click on the + sign next to Agent Phase and choose a PowerShell step

29 - PowerShell Publish step

You may wonder why I don't choose the PowerShell Gallery Packager task. There are two reasons: first, I need to install the required modules for dbachecks (dbatools, PSFramework, Pester) prior to publishing, and second, it appears that the API key is stored in plain text.

30 - PowerShell Gallery Publisher

I save my API key for the PowerShell Gallery as a variable again making sure to tick the padlock to make it a secret

31 - API Key variable.png

and then use the following code to install the required modules and publish the module to the gallery

Install-Module dbatools -Scope CurrentUser -Force
Install-Module Pester -Scope CurrentUser -SkipPublisherCheck -Force
Install-Module PSFramework -Scope CurrentUser -Force

Publish-Module -Path "$(System.DefaultWorkingDirectory)/Master - Version Update, Signing and Publish Artifact/dbachecks" -NuGetApiKey "$(GalleryApiKey)"

That's it 🙂

Now we have a process that will automatically run our Pester tests when we commit or merge to the development branch, and then update our module version number, sign our code and publish to the PowerShell Gallery when we commit or merge to the master branch.

Added Extra – Dashboard

I like to create dashboards in VSTS to show the progress of the various definitions. You can do this under the dashboard tab. Click edit and choose or search for widgets and add them to the dashboard

32 - Dashboard.png

Added Extra – Badges

You can also enable badges for displaying on your readme in GitHub (or VSTS). For the build definitions this is under the options tab.

33 - Build badges

for the release definitions, click the environment and then options and integrations

34 - Release Badge

You can then copy the URL and use it in your readme like this on dbachecks

35 - dbachecks readme badges.png

The SQL Collaborative has joined the preview of enabling public access to VSTS projects, as detailed in this blog post, so you can see the dbachecks build and release without the need to log in, and soon the dbatools process as well.

I hope you found this useful and if you have any questions or comments please feel free to contact me

 

Happy Automating!

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew added a bit of magic, and I published it. The magnificent Tobias Weltner, who organises PSConfEU, pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo. So clone it and navigate to the filepath

Create Database

First lets create a database. Connect to your Azure subscription

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

 Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YourSubscriptionIDHere

01 - subscription.png

Then set some variables

# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
# The logical server name: Use a random value or replace with your own value (do not capitalize)
$servername = "server-$(Get-Random)"
# Set an admin login and password for your database
# The login information for the server. You need to set these and uncomment them - don't use these values

# $adminlogin = "ServerAdmin"                
# $password = "ChangeYourAdminPassword1"

# The ip address range that you want to allow to access your server - change as appropriate
# $startip = "0.0.0.0"
# $endip = "0.0.0.0"

# To just add your own IP Address
$startip = $endip = (Invoke-WebRequest 'http://myexternalip.com/raw').Content -replace "`n"

# The database name
$databasename = "tweets"

$AzureSQLServer = "$servername.database.windows.net,1433"
$Table = "table.sql"
$Proc = "InsertTweets.sql"

They should all make sense. Take note that you need to set and uncomment the login and password and choose which IPs to allow through the firewall.

Create a Resource Group

## Create a resource group

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location

02 - resource group.png

Create a SQL Server

## Create a Server

$newAzureRmSqlServerSplat = @{
    SqlAdministratorCredentials = $SqlAdministratorCredentials
    ResourceGroupName = $resourcegroupname
    ServerName = $servername
    Location = $location
}
New-AzureRmSqlServer @newAzureRmSqlServerSplat

03 - create server.png

Create a firewall rule. I just use my own IP and then add the allow Azure IPs rule.

$newAzureRmSqlServerFirewallRuleSplat = @{
    EndIpAddress = $endip
    StartIpAddress = $startip
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    FirewallRuleName = "AllowSome"
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

# Allow Azure IPS

$newAzureRmSqlServerFirewallRuleSplat = @{
    AllowAllAzureIPs = $true
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

03a - firewall rule.png

Create a database

# Create a database

$newAzureRmSqlDatabaseSplat = @{
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    Edition = 'Basic'
    DatabaseName = $databasename
}
New-AzureRmSqlDatabase  @newAzureRmSqlDatabaseSplat

04 - create database.png

I have used the dbatools module to run the scripts to create the database. You can get it using

Install-Module dbatools # -Scope CurrentUser # if not admin process

Run the scripts

# Create a credential

$newObjectSplat = @{
    ArgumentList = $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force)
    TypeName = 'System.Management.Automation.PSCredential'
}
$SqlAdministratorCredentials = New-Object @newObjectSplat

## Using dbatools module

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database = $databasename
    File = $Table,$Proc
    SqlInstance = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat

05 - Create Table Sproc.png

This will have created the following in Azure, you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First login (if you need to) and set some variables

## This creates cognitive services for analysing the tweets

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YOUR SUBSCRIPTION ID HERE

#region variables
# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
$APIName = 'TweetAnalysis'
#endregion

Then create the API and get the key

#Create the cognitive services

$newAzureRmCognitiveServicesAccountSplat = @{
    ResourceGroupName = $resourcegroupname
    Location = $location
    SkuName = 'F0'
    Name = $APIName
    Type = 'TextAnalytics'
}
New-AzureRmCognitiveServicesAccount @newAzureRmCognitiveServicesAccountSplat

# Get the Key

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat 

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the Github repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

click new connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (the value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click new Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow, click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Select during import for the SQL Server Connection and choose the SQL Server Connection and do the same for the Twitter Connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

Search for detect and choose the Text Analytics Detect Sentiment action

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them

#region Forgot the details

# Copy the URL if you forget to save it

$getAzureRmCognitiveServicesAccountSplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccount @getAzureRmCognitiveServicesAccountSplat).Endpoint | Clip

# Copy the Key if you forgot

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat).Key1 | Clip

#endregion

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__         0
  • Date                                 Created At
  • Sentiment                      Score
  • Tweet                              Tweet Text
  • UserLocation                 Location
  • UserName                      Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop-up with a prompt for the credentials. Choose Database and enter your credentials and click connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn’t run yet. If it does just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc. and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

Announcing PSDay.UK – What's a PSDay?

On Thursday evening I attended the joint London WinOps and PowerShell User Group. It was an excellent evening with two great sessions by Jaap Brasser and Filip Verloy.

PSDay.UK

There was also an exciting announcement about PSDay.UK  https://psday.uk

PSDay.UK is a one day PowerShell event providing the opportunity for you to spend a whole day learning PowerShell from renowned experts from the UK and international speaking community. It will be held at

Skills Matter | CodeNode, 10 South Place, London, EC2M 7EB, GB

on

Friday 22nd September 2017  .ics

We will be running two tracks

  • PowerShell Zero to Hero
  • DevOps with PowerShell

Register your interest

Please go and visit the website and have a look and register your interest to get further notifications about the event.

Follow the @PSDayUK twitter account and Facebook page https://www.facebook.com/PSDayUK/ and keep yourself informed on this fantastic new event.

Want to Speak at PSDay.UK ?

We already have some fantastic speakers lined up but we would like to invite people to send us submissions for more sessions. If you have a PowerShell talk that will fit into one of the tracks and experience of delivering sessions at events please send us submissions via the website.
If you have questions about speaking feel free to contact me via twitter at @sqldbawithbeard

What is a PSDay ?

The international PowerShell community has three main global events which run over a number of days, with top-notch international speakers and Microsoft PowerShell team members, delivering in-depth information about the latest PowerShell trends and technologies and connecting national communities with one another.

There are a number of other PowerShell events that have been organised by wonderful volunteers in numerous countries, and we feel there is an opportunity to create national events which complement the global events and help PowerShell enthusiasts and professionals to get in touch and learn from one another, under a similar branding of PSDay.

We foresee PSDays being smaller, one-day national events promoting speakers from the host country, supported by other international speakers, with the aim of increasing the exposure of national PowerShell user groups as well as providing excellent PowerShell training.

There will be a board of PowerShell community folk set up who will approve requests to use the PSDay name and shield logo, providing the event is professionally organised, and who will offer help with technical questions, viral marketing, and experience. We hope that this will enable people to set up their own PSDay in their own country and increase the exposure of the PowerShell community as well as PowerShell knowledge, whilst sharing resources, knowledge, experience and skills and ensuring a good standard of PowerShell community national events.

Further details of this will be forthcoming and we welcome offers of assistance from people with relevant experience

 

 

Pester for Presentations – Ensuring it goes ok

Whilst I was at PSConfEU I presented a session on writing Pester tests instead of using checklists. You can see it here.

During the talk I showed the pester test that I use to make sure that everything is ready for my presentation. A couple of people have asked me about this and wanted to know more so I thought that I would blog about it.

Some have said that I might be being a little OCD about it 😉 I agree that it could seem like that but there is nothing worse than having things go wrong during your presentation. It makes your heart beat faster and removes the emphasis from the presentation that you give.

When it is things that you as a presenter could have been able to foresee, like a VM not being started, a database not being restored to the pre-demo state, or being logged in as the wrong user, then it is much worse.

I use Pester to ensure that my environment for my presentation is as I expect and in fact, in Hanover when I ran through my Pester test for my NUC environment I found that one of my SQL Servers had decided to be in a different time zone and therefore the SQL Service would not authenticate and start. I was able to quickly remove the references to that server and save myself from a sea of red during my demos

For those that don't know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
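Updating the in-box version on Windows 10 looks something like this; the -SkipPublisherCheck switch is needed because the version that ships with Windows is signed with a different certificate to the Gallery version.

# Update Pester from the PowerShell Gallery over the version that ships with Windows 10
Install-Module -Name Pester -Force -SkipPublisherCheck -Scope CurrentUser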

What can you test? Everything. Well, specifically everything that you can write a PowerShell command to check. So when I am setting up for my presentation I check the following things. I add new things to my tests as I think of them or as I observe things that may break my presentations. Most recently that was ensuring that my Visual Studio Code session was running under the correct user. I did that like this

Describe "Presentation Test" {
    Context "VSCode" {
        It "Should be using the right username" {
            whoami | Should Be 'TheBeard\Rob'
       }
    }
}

01 - username.PNG

I think about the things that are important to me for my presentation.  I want to ensure that I only have one VS Code window open to avoid that situation where I am clicking through windows looking for the correct window. I can do that using Get-Process

It "Should have Code Insiders Open" {
(Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue)| Should Not BeNullOrEmpty
}
        It "Should have One VS Code Process" {
            (Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue).Count | Should Be 1
        }

I use -ErrorAction SilentlyContinue so that I don’t get a sea of red when I run the tests. Next I want to check my PowerPoint is ready for my presentation

It "Should have PowerPoint Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).Count | Should Not BeNullOrEmpty
}
It "Should have One PowerPoint Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).Count | Should Be 1
}
It "Should have the correct PowerPoint Presentation Open" {
(Get-Process POWERPNT -ErrorAction SilentlyContinue).MainWindowTitle| Should Be ‘dbatools – SQL Server and PowerShell together – PowerPoint’
}

Again I use Get-Process. I check if PowerPoint is open, if there is one PowerPoint open and I use the MainWindowTitle property to check that it is the right PowerPoint presentation after nearly starting a presentation for SqlServer module with the dbatools slides!

I don’t want any distractions when I am presenting. I have the sort of friends who will notice if I get notifications for twitter popping up on my screen and repeatedly send tweets to make people laugh. (I admit, I’m one of those friends – I do this too!)

02 - Friends!!.PNG

I have not been able to get a test for quiet hours working. You can apparently use a registry key, which of course you can check with PowerShell, but I was unable to get it working. I haven't looked at testing for Presentation Mode, but I do test that those programmes are shut down, again using Get-Process.

        It "Mail Should be closed" {
            (Get-Process HxMail -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Tweetium should be closed" {
            (Get-Process WWAHost -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Slack should be closed" {
            (Get-Process slack* -ErrorAction SilentlyContinue).Count | Should Be 0
        }

I am generally presenting with SQL Server so I need to make sure that SQL Server is running. I do this with Get-Service

Context "Local SQL" {
        It "DBEngine is running" {
            (Get-Service mssqlserver).Status | Should Be Running
        }
        It "SQL Server Agent is running" {
            (Get-Service sqlserveragent).Status | Should Be Running
        }
        It "DAVE DBEngine is running" {
            (Get-Service mssql*Dave).Status | Should Be Running
        }
        It "DAVE Agent is running" {
            (Get-Service sqlagent*dave).Status | Should Be Running
        }
    }

In this example I am testing that the SQL Service and the Agent service are running on both of my local instances.

I use a NUC running Hyper-V to enable me to show a number of SQL Servers running in a domain environment so I need to be able to test those too. I set the values of the servers I need into a variable and check that the VM is running and that they respond to ping

 Context "VM State" {
        $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
        $NUCVMs = Get-VM -ComputerName beardnuc | Where-Object {$_.Name -in $NUCServers}
            foreach ($VM in $NUCVms) {
                $VMName = $VM.Name
                It "$VMName Should be Running" {
                    $VM.State | Should Be 'Running'
                }
            }
    }
Context "THEBEARD_Domain" {
            $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
            foreach ($VM in $NUCServers) {
                It "$VM Should respond to ping" {
                    (Test-Connection -ComputerName $VM -Count 1 -Quiet -ErrorAction SilentlyContinue) | Should Be $True
                }
            }
    }

I also need to check that the SQL Service and the Agent Service are running on each server.

  Context "SQL State" {
        $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*'  -and $_.State -eq 'Running'}).Name
        foreach($Server in $SQLServers)
        {
          $DBEngine = Get-service -ComputerName $Server -Name MSSQLSERVER
           It "$Server  DBEngine should be running" {
                $DBEngine.Status | Should Be 'Running'
            }
           It "$Server DBEngine Should be Auto Start" {
            $DBEngine.StartType | Should be 'Automatic'
           }
              $Agent= Get-service -ComputerName $Server -Name SQLSERVERAGENT
              It "$Server Agent should be running" {
                  $Agent.Status | Should Be 'Running'
           }
           It "$Server Agent Should be Auto Start" {
            $Agent.StartType | Should be 'Automatic'
           }
        }
        It "Linux SQL Server should be accepting connections" {
            $cred = Import-Clixml C:\temp\sa.xml
            {Connect-DbaSqlServer -SqlServer LinuxvnextCTP14 -Credential $cred -ConnectTimeout 60} | Should Not Throw
        }

    }
}

I check that the Linux SQL Server is available by storing the credential using Export-CliXml and then using that credential with Connect-DbaSqlServer from dbatools.
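The credential file referenced in that test was created beforehand with something like this, run once as the same user that runs the tests (the path matches what the test expects).

# Create the encrypted credential file used by the Linux SQL Server test
Get-Credential -Message 'sa credential for the Linux SQL Server' | Export-Clixml -Path C:\temp\sa.xml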

Using a NUC means I sometimes have fun with networking so I have a couple of tests for that too. Testing for the correct DNS Servers and gateways

    It "Should have DNS Servers for correct interface" {
        (Get-DnsClientServerAddress -InterfaceAlias 'Ethernet 3').Serveraddresses | Should Be @('10.0.0.1','10.0.0.2')
    }
    It "Should have correct gateway for alias"{
        (Get-NetIPConfiguration -InterfaceAlias 'Ethernet 3').Ipv4DefaultGateway.NextHop | Should Be '10.0.0.10'
    }

All of those are generic tests that have evolved over time and are run for every presentation but when I have specific things you require for a single presentation I test for those too.

For example, later this week Cláudio Silva and I are presenting on dbatools at TUGAIT. We are showing the Test-DbaMaxMemory, Get-DbaMaxMemory and Set-DbaMaxMemory commands, so we need to ensure that the Max Memory for some servers is (in)correctly set. I use Connect-DbaSqlServer to create an SMO Server object and test that.

    It "Max Memory on SQl2012SerAG1 2 and 3 should be 2147483647" {
        (Connect-DbaSqlServer SQL2012Ser08AG1).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG2).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG3).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
    }

We are also showing the Test-DbaIdentityUsage command so a column needs to be pre-prepared in AdventureWorks2014 to be able to show the error

    It "ShiftID LastValue Should be 255" {
        $a = Test-DbaIdentityUsage -SqlInstance ROB-XPS -Databases AdventureWorks2014 -NoSystemDb
        $a.Where{$_.Column -eq 'ShiftID'}.LastValue | should Be 255
    }

To ensure that we have orphaned files available for the Find-DbaOrphanedFile command I use this

    It "has Orphaned Files ready"{
        (Find-DbaOrphanedFile -SqlServer SQL2016N2).Count | Should Be 30
    }

There are any number of things that you may want to test to ensure that, as best as possible, the demo gods are not going to come and bite you in the middle of your presentation.

  • Files or folders exist (or don't exist)
  • Databases, Agent Jobs, Alerts
  • Operators, Logins
  • SSIS packages, SSRS Reports
  • PowerBi files
  • Azure connectivity
  • Azure components

The list is endless, just look at what you require for your presentation.
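A couple of examples from that list, with made-up paths and server names, might look like this.

Context "Demo Prerequisites" {
    It "Should have the demo script folder" {
        Test-Path 'C:\Presentations\dbatools\demo' | Should Be $true
    }
    It "Should have the AdventureWorks2014 database available" {
        (Connect-DbaSqlServer SQL2016N1).Databases['AdventureWorks2014'] | Should Not BeNullOrEmpty
    }
}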

Anything you can check with PowerShell you can test with Pester so build up your Pester presentation tests and reduce the reliance on the demo gods! I’ll still leave this here just to be safe!!

pray to the demo gods.jpg

Pester Test Inception and the Show Parameter

My fantastic friend Andre Kamman and I presented at PSConfEU last week

C_EDtK0XoAA1PL7 (2).jpg

and whilst we were there we were chatting about running Pester Tests. He wanted to know how he could run a Pester Test and not lose the failed tests as they scrolled past him. In his particular example we were talking about running hundreds of tests on thousands of databases on hundreds of servers

01 - pesters.gif

I guess it looks something like that!!

I explained about the -Show parameter which allows you to filter the results that you see. Using Get-Help Invoke-Pester you can see this

   -Show
Customizes the output Pester writes to the screen. Available options are None, Default,
Passed, Failed, Pending, Skipped, Inconclusive, Describe, Context, Summary, Header, All, Fails.

The options can be combined to define presets.
Common use cases are:

None – to write no output to the screen.
All – to write all available information (this is default option).
Fails – to write everything except Passed (but including Describes etc.).

A common setting is also Failed, Summary, to write only failed tests and test summary.

This parameter does not affect the PassThru custom object or the XML output that
is written when you use the Output parameters.

Required?                    false
Position?                    named
Default value                All
Accept pipeline input?       false
Accept wildcard characters?  false

So there are numerous options available to you. Lets see what they look like

I will use a dummy test which creates 10 Context blocks and runs from 1 to 10 and checks if the number has a remainder when divided by 7

Describe "Only the 7s Shall Pass" {
    $Servers = 0..10
    foreach($Server in $servers)
    {
        Context "This is the context for $Server" {
        foreach($A in 1..10){
            It "Should Not Pass for the 7s" {
                $A % 7 | Should Not Be 0
                }
            }
        }
    }
}

Imagine it is 10 servers running 10 different tests

For the Show parameter All is the default, which is the output that you are used to

02 - All.gif

None does not write anything out. You could use this with -PassThru, which will pass ALL of the test results to a variable, and if you added -OutputFile and -OutputFormat then you can save ALL of the results to a file for consumption by another system. The -Show parameter only affects the output from the Invoke-Pester command to the host, not the output to the files or the variable.

Header only returns the header from the test results and looks like this ( I have included the none so that you can see!)

03 - none and header.PNG

Summary, as expected returns only the summary of the results

04 - summary.PNG

You can use more than one value for the Show parameter so if you chose Header, Summary, Describe you would get this

05 - headerdesscribe sumnmary.PNG

You could use Failed to only show the failed tests which looks like this

06 - failed.PNG

but Andre explained that he also wanted to be able to see some progress whilst the test was running. If there were no failures then he would not see anything at all.

So Fails might be the answer (or Failed and Summary but that would not show the progress)

07 - fails.PNG

Fails shows the Header, Describe and Context and also shows the Summary.

However, we carried on talking (PSConfEU is a fantastic place to talk about PowerShell 🙂) and wondered what would happen if you invoked Pester from inside a Pester test. I was pretty sure that it would work as Pester is just PowerShell, but I thought it would be fun to have a look and see how we could solve that requirement.

So I created 3 "Internal Tests" - these are the ones we don't want to see the output for. I then wrote an overarching Pester test to call them. In that Pester test script I assigned the results of each test to a variable. When you examine it you see

08 - Pester Object.PNG

The custom object that is created shows the counts of all different results of the tests, the time it took and also the test result.

So I could create a Pester Test to check the Failed Count property of that Test result

$InternalTest1.FailedCount | Should Be 0

To make sure that we don’t lose the results of the tests we can output  them to a file like this

$InternalTest1 = Invoke-Pester .\Inside1.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test1_Results.xml -OutputFormat NUnitXml
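Putting those pieces together, the overarching test ends up looking something like this. The Inside1 file name comes from the example above; Inside2 is assumed to be a similar test file.

Describe "Internal Test Runs" {
    It "Internal Test 1 should have no failures" {
        $InternalTest1 = Invoke-Pester .\Inside1.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test1_Results.xml -OutputFormat NUnitXml
        $InternalTest1.FailedCount | Should Be 0
    }
    It "Internal Test 2 should have no failures" {
        $InternalTest2 = Invoke-Pester .\Inside2.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test2_Results.xml -OutputFormat NUnitXml
        $InternalTest2.FailedCount | Should Be 0
    }
}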

So now we can run Invoke-Pester and point it at that file and it will show the progress and the final result on the screen.

09 finale.PNG

You could make use of this in different ways

  • Server 1
    • Database1
    • Database2
    • Database3
    • Database4
  • Server 2
    • Database1
    • Database2
    • Database3
    • Database4
  • Server 3
    • Database1
    • Database2
    • Database3
    • Database4

Or by Test Category

  • Backup
    • Server1
    • Server 2
    • Server 3
    • Server 4
  • Agent Jobs
    • Server 1
    • Server 2
    • Server 3
    • Server 4
  • Indexes
    • Server 1
    • Server 2
    • Server 3
    • Server 4

Your only limitation is your imagination.

As we have mentioned PSConfEU, you really should check out the videos on the YouTube channel. All of the videos that were successfully recorded will be on there. You could start with this one and mark your diaries for April 16-20 2018.

 

 

Using Twitter with VS Code

So today I saw this tweet from Mathias in reply to Stefan and Amanda

01 - tweet.PNG

 

That looks cool: Twitter in VS Code, one less programme to open. Here's how it works.

Open VS Code and hit CTRL + P and type ext install twitter or hit CTRL + SHIFT + X to open the extensions and search

02 - install.PNG

 

Hit install and then reload

03 - reload

 

Accept the prompt

04 - prompt.PNG

and you will have a Twitter button in the bar at the bottom

05 - bar

If you click it then the top bar will change to the set up wizard

06 - forst time.PNG

Follow it along and create a Twitter App.

07 - want to continue

08 - create an app

Which will open up the website.

 

09 - create an app.PNG

Just fill in the blanks

010 - fil in the blanks.PNG

and then  you will have this window

011 - app settings.PNG

Click on keys and Access tokens

012 - settings and appl

Check the App Permissions are set to read and write

014 - app permissions

and then click create my access token

Then go back to VS Code and click

016 - settings details

Now you are shown the settings.json (which you can always find via File -> Preferences -> Settings)

017 - settingsjson.PNG

This bit the wizard doesn't explain very well (hence this post). If you already have some settings between the curly braces, you will need to put a comma and then paste the code below. If you do not, then paste the code below between the curly braces.

"twitter.consumerkey":"",
"twitter.consumersecret":"",
"twitter.accesstokenkey":"",
"twitter.accesstokensecret":"" 

and then paste the relevant keys and tokens from your twitter app between the double quotes

THEN PRESS CTRL + S to save the settings.json – it's in caps because when I showed someone they were too excited and didn't save it!!

Now it's all set up, you can use the extension. Click the Twitter button in the bar and

018 - Twitter actions

Now you can have your Home timeline in Code

019 - twitter home.PNG

Yes, there are still a few seats left for the Europe PowerShell Conference. You can search, and see your mentions and your user page

020- mentions.PNG

and post 🙂 You can just press F1 and start typing twitter to get the commands

021 - comands

022 - tweet.png

and

 

023 - tweet.PNG

You can also message people using D message username. You can see the extension repo on GitHub which will be a good place to raise issues, bugs, feature requests

I'm not sure it will increase my productivity, but it really pleases the nerd in me!

 

 

2016 – That was a Year :-)

It's the time of year for reflection and I have had the most amazing 2016; I am blessed that I love what I do so much. I thoroughly enjoy writing and talking and sharing and commenting and supporting and cherishing all the SQL and PowerShell things. I wrote about using Power Bi to display my checkins. I only started this in June and this is where I have been 🙂

swarm

I learnt about Pester and ended the year incorporating it into dbatools and dbareports. I also started using GitHub. It is quite surprising to me how much time I now spend using both. I also had to start learning DSC for the client I was working with because, as 'the PowerShell guy', I was the one who could pick it up the easiest. I learnt things and then forgot them, causing me to find this Pester post via Google later in the year!! (That's a big reason for blogging, by the way.)

Early in the year we organised SQL Saturday Exeter.

Helping to organise a SQL Saturday is a lot of fun, especially when you do it with good friends, but choosing sessions is by far the most challenging part of it for me. I could have chosen at least 60 of these sessions and I know people were disappointed not to have been chosen. I was also the first person many saw at SQL Bits in Liverpool manning the front of house and getting asked the best question ever

The Beard says

When you go to an event –  Say thank you to the organisers and volunteers

and a TERRIBLE thing happened – I broke my DBA Team mug

WP_20160223_07_51_03_Pro.jpg

Luckily the fine folk at redgate sorted me out with a replacement from deep in the stores somewhere and gave it to me at SQL Saturday Exeter 🙂 Thank you.

I spoke at the PowerShell Conference Europe and met and made some great friends, which led to me speaking at the PowerShell Monday in Munich and the Dutch PowerShell Usergroup. SQL Saturday Dublin was a blast (it's a wonderful city), Manchester had a whole PowerShell track 🙂 and Cambridge was memorable for the appalling journey as well as the chance to share a stage with Chrissy. PowerShell Conference Asia in the sovereign city-state of Singapore was such a good event and place. Lastly of course was Slovenia with its fantastic Christmas lights and awesome event organisation. I visited some user groups too: Southampton, run by my good friends John Martin and Steph Middleton (congratulations to John on his first MVP award yesterday); Cardiff, for the Return of the Battle of the Beards with Terry McCann and Tobiasz Koprowski, where the projector threw its toys out of the pram; and Birmingham, in the school hall which was slightly chilly (there's a joke there for some people).

Amazing things happened

We created https://sqlps.io/vote and https://sqlps.io/ssms and https://sqlps.io/powerbi to enable anyone to influence Microsoft and help to improve the PowerShell SQL experience

and lo and behold there was a new sqlserver module 🙂

I was also invited by Aaron and Chrissy to become an officer for the PASS PowerShell Virtual Chapter, oh, and we made https://sqlps.io/slack to enable people to talk about all things Data Platform - another addition to my life that I didn't have at the beginning of the year. I spend a lot of time in there in the #dbatools and #dbareports channels and have made some fantastic friends. Chrissy and I created the SQL Community Collaborative GitHub team and added dbatools and dbareports, and even more friendships were born.

And that’s the biggest and bestest thing about this year. Some amazing new friends and spending time with all my other friends. I started writing out a list but was terrified I would have missed someone out, so to all my friends

THANK YOU for a brilliant 2016 and 2017 shall be just as good 🙂

Here are a few of my pics from the year with a lot of my friends

 

Making Start-Demo work with multi-line commands without a backtick

I love to speak about PowerShell. I really enjoy giving presentations and when I saw Start-Demo being used at the PowerShell Conference in Hanover I started to make use of it in my presentations.

Start-Demo was written in 2007 by a fella who knows PowerShell pretty well 🙂  https://blogs.msdn.microsoft.com/powershell/2007/03/03/start-demo-help-doing-demos-using-powershell/

It was then updated in 2012 by Max Trinidad http://www.maxtblog.com/2012/02/powershell-start-demo-now-allows-multi-lines-onliners/

This enabled support for multi-line code using backticks at the end of each line. This works well, but I dislike having to use the backticks in foreach loops; it confuses people who think that they need to be included and, to my mind, looks a bit messy.

start-demo

This didn't bother me enough to look at the code, but I did mention it to my friend Luke, who decided to use it as a challenge for his Friday lunchtime codeathon and updated the function so that it works without needing a backtick.

start-demo2

It also works with nested loops

start-demo3

just a little improvement but one I think that works well and looks good

You can find it at

https://github.com/SQLDBAWithABeard/Presentations/blob/master/Start-Demo.ps1

and a little demo showing what it can and cant do

https://github.com/SQLDBAWithABeard/Presentations/blob/master/start-demotest.ps1

Load the Start-Demo.ps1 file and then run

Start-Demo PATHTO\start-demotest.ps1

Enjoy!

 

 

 

Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now - click here - and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL Set-up. Treating all of your servers to the same defaults makes it even easier to manage at scale remotely.

I am comfortable with using SMO to gather and change properties on SQL Instances so I started by doing this

        It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
            $Scriptblock = {
                [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
                $srv = New-Object Microsoft.SqlServer.Management.Smo.Server .
                return $srv.BackupDirectory
            }
            $State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
            $State | Should Be 'F:\SQLBACKUP\BACKUPS'
        }

This is how to find the properties that you want

  ## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv|gm

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | select *
$srv.Databases[0] | gm

I quickly found as I added more tests that it was taking a long time to perform the tests (about 5 seconds each test) and that it took an age to fail each of the tests if the server name was incorrect or the server unavailable.

I fixed the second issue by testing with a ping before running the tests

## Check for connectivity
if ((Test-Connection $Server -Count 1 -Quiet) -eq $false) {
    Write-Error "Could not connect to $Server"
    $_
    continue
}

The continue is there because I wanted to loop through an array of servers

I improved the performance using a remote session and a custom object

      Describe "$Server" {
BeforeAll {
$Scriptblock = {
[pscustomobject]$Return = @{}
$srv = ''
$SQLAdmins = $Using:SQLAdmins
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$Return.DBAAdminDb = $Srv.Databases.Name.Contains('DBA-Admin')
$Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
$Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.count -eq $SQLAdmins.Count
$SysAdmins = $Srv.Roles['sysadmin'].EnumMemberNames()
$Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.count -eq $SQLAdmins.Count
$Return.BackupDirectory = $srv.BackupDirectory
$Return.DataDirectory = $srv.DefaultFile

The BeforeAll script block is run, as it sounds like it should, once before all of the tests; BeforeEach would run once before each of the tests. I define an empty custom object and then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end.

            $Return.Alerts82345Exist = ($srv.JobServer.Alerts | Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825}).Count
            $Return.Alerts82345Enabled = ($srv.JobServer.Alerts | Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825 -and $_.IsEnabled -eq $true}).Count
            $Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
            Return $Return
        }
        try {
            $Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
        }
        catch {
            Write-Error "Unable to Connect to $Server"
            $Error
            continue
        }
I was then able to test against the property of the custom object

        It 'Should have Alerts for Severity 20 and above' {
            $Return.Alerts20SeverityPlusExist | Should Be 6
        }
        It 'Severity 20 and above Alerts should be enabled' {
            $Return.Alerts20SeverityPlusEnabled | Should Be 6
        }
        It 'Should have alerts for 823,824 and 825' {
            $Return.Alerts82345Exist | Should Be 3
        }
        It 'Alerts for 823,824 and 825 should be enabled' {
            $Return.Alerts82345Enabled | Should Be 3
        }

Occasionally, for reasons I haven’t explored I had to test against the value property of the returned object

          It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
$Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments or groups of servers with different default values, so I parameterised the tests as well. The logical next step was to turn it into a function so that I could do some parameter splatting. This also gives me the opportunity to show all of the things that I currently pass as parameters to the test.

   $Parms = @{
Servers = 'SQLServer1','SQLServer2','SQLServer3';
SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards';
BackupDirectory = 'C:\MSSQL\Backup';
DataDirectory = 'C:\MSSQL\Data\';
LogDirectory = 'C:\MSSQL\Logs\';
MaxMemMb = '4096';
Collation = 'Latin1_General_CI_AS';
TempFiles = 4 ;
OlaSysFullFrequency = 'Daily';
OlaSysFullStartTime = '21:00:00';
OlaUserFullSchedule = 'Weekly';
OlaUserFullFrequency = 1 ;## 1 for Sunday
OlaUserFullStartTime = '22:00:00';
OlaUserDiffSchedule = 'Weekly';
OlaUserDiffFrequency = 126; ## 126 for every day except Sunday
OlaUserDiffStartTime = '22:00:00';
OlaUserLogSubDayInterval = 15;
OlaUserLoginterval = 'Minute';
HasSPBlitz = $true;
HasSPBlitzCache = $True;
HasSPBlitzIndex = $True;
HasSPAskBrent = $true;
HASSPBlitzTrace =  $true;
HasSPWhoisActive = $true;
LogWhoIsActiveToTable = $true;
LogSPBlitzToTable = $true;
LogSPBlitzToTableEnabled = $true;
LogSPBlitzToTableScheduled = $true;
LogSPBlitzToTableSchedule = 'Weekly';
LogSPBlitzToTableFrequency = 2 ; # 2 means Monday
LogSPBlitzToTableStartTime  = '03:00:00'}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules, which you will have to modify to suit your own environment. An example is shown below.
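As an illustration, a firewall rule check might look like this; the rule display name is a placeholder, so change it to whatever your environment uses.

It 'Should have an enabled firewall rule for the SQL Server port' {
    (Get-NetFirewallRule -DisplayName 'SQL Server (TCP 1433)' -ErrorAction SilentlyContinue).Enabled | Should Be 'True'
}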

To be able to run this you will need to have the Pester Module. If you are using Windows 10 then it is installed by default, if not

  Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here and also these videos from the conference
You can find the tests on GitHub here and I will continue to add to the defaults that I check.
This is not a replacement for other SQL configuration tools such as PBM, but it is a nice, simple way of giving a report on the current status of a SQL installation, either at a particular point in time when something is wrong, or after an installation prior to passing the server over to another team or into service.
