Version Update, Code Signing and publishing to the PowerShell Gallery with VSTS

At the fabulous PowerShell Conference EU I presented about Continuous Delivery to the PowerShell Gallery with VSTS and explained how we use VSTS to enable CD for dbachecks. We even released a new version during the session 🙂

So how do we achieve this?

We have a few steps

  • Create a project and link to our GitHub
  • Run unit tests with Pester to make sure that our code is doing what we expect.
  • Update our module version and commit the change to GitHub
  • Sign our code with a code signing certificate
  • Publish to the PowerShell Gallery

Create Project and link to GitHub

First you need to create a VSTS project by going to https://www.visualstudio.com/. This is free for up to 5 users with 1 concurrent CI/CD queue limited to a maximum of 60 minutes run time, which should be more than enough for your PowerShell module.

01 - sign up.png

Click on Get Started for free under Visual Studio Team Services and fill in the required information. Then on the front page click new project

02 - New Project.png

Fill in the details and click create

03 - create project.png

Click on builds and then new definition

04- builds.png

Next you need to link your project to your GitHub (or other source control provider's) repository

05 - github auth.png

You can either authorise with OAuth or you can provide a PAT token following the instructions here. Once that is complete choose your repo. Save the PAT as you will need it later in the process!

06 - choose repo.png

and choose the branch that you want this build definition to run against.

07 branch.png

I chose to run the Unit Tests when a PR was merged into the development branch. I will then create another build definition for the master branch to sign the code and update module version. This enables us to push several PRs into the development branch and create a single release for the gallery.

Then I start with an empty process

08 - empty process.png

and give it a suitable name

09 - name it.png

I chose the hosted queue, but you can download an agent to your build server if you need to do more or if your integration tests require access to other resources not available on the hosted agent.

Run Unit Tests with Pester

We have a number of unit tests in the tests folder in dbachecks, so we want to run them to ensure that everything is as it should be and that the new code will not break existing functionality (and, for dbachecks, the format of the Power BI).

You can use the Pester Test Runner Build Task from the folk at Black Marble by clicking on the + sign next to Phase 1 and searching for Pester

10 - Pester task runner.png

You will need to click Get It Free to install it and then click Add to add the task to your build definition. You can pretty much leave it as default if you wish and Pester will run all of the *.Tests.ps1 files that it finds in the directory where it downloads the GitHub repo, which is referred to using the variable $(Build.SourcesDirectory). It will then output the results to an NUnit XML file called Test-Pester.XML ready for publishing.
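
Under the covers, this is broadly equivalent to running something like the following in a PowerShell task (a rough sketch of the idea, not the task's exact implementation):

# Run all of the *.Tests.ps1 files found under the sources directory and write the results as NUnit XML
Invoke-Pester -Script "$(Build.SourcesDirectory)" -OutputFile "$(Build.SourcesDirectory)\Test-Pester.XML" -OutputFormat NUnitXml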

However, as dbachecks has a number of dependent modules, this task was not suitable. I spoke with Chris Gardner b | t from Black Marble at the PowerShell Conference and he says that this can be resolved, so look out for the update. Chris is a great guy and always willing to help; you can often find him in the PowerShell Slack channel answering questions and helping people.

But as you can use PowerShell in VSTS tasks, this is not a problem, although you need to write your PowerShell using try/catch to make sure that your task fails when your PowerShell errors. This is the code I use to install the modules:

$ErrorActionPreference = 'Stop'

# Set location to the sources directory where the repo was downloaded
try {
    Set-Location $(Build.SourcesDirectory)
    Get-ChildItem
}
catch {
    Write-Error "Failed to set location"

}

# Get the Module versions
Install-Module Configuration -Scope CurrentUser -Force
$Modules = Get-ManifestValue -Path .\dbachecks.psd1 -PropertyName RequiredModules

$PesterVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'Pester'}[0].Get_Item('ModuleVersion')
$PSFrameworkVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'PSFramework'}[0].Get_Item('ModuleVersion')
$dbatoolsVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'dbatools'}[0].Get_Item('ModuleVersion')

# Install Pester
try {
    Write-Output "Installing Pester"
    Install-Module Pester  -RequiredVersion $PesterVersion  -Scope CurrentUser -Force -SkipPublisherCheck
    Write-Output "Installed Pester"

}
catch {
    Write-Error "Failed to Install Pester $($_)"
}
# Install PSFramework
try {
    Write-Output "Installing PSFramework"
    Install-Module PSFramework  -RequiredVersion $PsFrameworkVersion  -Scope CurrentUser -Force 
    Write-Output "Installed PSFramework"

}
catch {
    Write-Error "Failed to Install PSFramework $($_)"
}
# Install dbatools
try {
    Write-Output "Installing dbatools"
    Install-Module dbatools  -RequiredVersion $dbatoolsVersion  -Scope CurrentUser -Force 
    Write-Output "Installed dbatools"

}
catch {
    Write-Error "Failed to Install dbatools $($_)"
}

# Add current folder to PSModulePath
try {
    Write-Output "Adding local folder to PSModulePath"
    $ENV:PSModulePath = $ENV:PSModulePath + ";$pwd"
    Write-Output "Added local folder to PSModulePath"    
    $ENV:PSModulePath.Split(';')
}
catch {
    Write-Error "Failed to add $pwd to PSModulePAth - $_"
}

I use the Configuration module from Joel Bennett to get the versions of the required modules from the manifest and then add the current path to $ENV:PSModulePath so that the modules will be imported; they did not import correctly without it.

Once I have the modules I can then run Pester as follows

try {
    Write-Output "Installing dbachecks"
    Import-Module .\dbachecks.psd1
    Write-Output "Installed dbachecks"

}
catch {
    Write-Error "Failed to Install dbachecks $($_)"
}
$TestResults = Invoke-Pester .\tests -ExcludeTag Integration,IntegrationTests  -Show None -OutputFile $(Build.SourcesDirectory)\Test-Pester.XML -OutputFormat NUnitXml -PassThru

if ($TestResults.failedCount -ne 0) {
    Write-Error "Pester returned errors"
}

As you can see I import the dbachecks module from the local folder, run Invoke-Pester and output the results to an XML file and check that there are no failing tests.

Whether you use the task or PowerShell the next step is to Publish the test results so that they are displayed in the build results in VSTS.

Click on the + sign next to Phase 1 and search for Publish

12 - publish test results.png

 

Choose the Publish Test Results task and leave everything as default unless you have renamed the xml file. This means that on the summary page you will see some test results

 

13 - Test on sumary page.png

and on the tests tab you can see more detailed information and drill down into the tests

14 - detailed test report.png

Trigger

The next step is to trigger a build when a commit is pushed to the development branch. Click on Triggers and tick enable continuous integration

15 Trigger.png

Saving the Build Definition

I would normally save the build definition regularly and ensure that there is a good message in the comment. I always tell clients that this is like a commit message for your build process so that you can see the history of the changes for the build definition.

You can see the history on the edit tab of the build definition

16 - build history.png

If you want to compare or revert the build definition this can be done using the hamburger menu as shown below.

17 - build history compare revert.png

Update the Module Version

Now we need to create a build definition for the master branch to update the module version and sign the code ready for publishing to the PowerShell Gallery when we commit or merge to master

Create a new build definition as above but this time choose the master branch

18 - master build.png

Again choose an empty process and name it sensibly, click the + sign next to Phase 1 and search for PowerShell

19 - PowerShell task.png

I change the task version to 2 and use this code. Note that the commit message has ***NO_CI*** in it. Putting this in a commit message tells VSTS not to trigger a build for this commit.

$manifest = Import-PowerShellDataFile .\dbachecks.psd1 
[version]$version = $Manifest.ModuleVersion
Write-Output "Old Version - $Version"
# Add one to the build of the version number
[version]$NewVersion = "{0}.{1}.{2}" -f $Version.Major, $Version.Minor, ($Version.Build + 1) 
Write-Output "New Version - $NewVersion"
# Update the manifest file
try {
    Write-Output "Updating the Module Version to $NewVersion"
    $path = "$pwd\dbachecks.psd1"
    (Get-Content .\dbachecks.psd1) -replace $version, $NewVersion | Set-Content .\dbachecks.psd1 -Encoding string
    Write-Output "Updated the Module Version to $NewVersion"
}
catch {
    Write-Error "Failed to update the Module Version - $_"
}

try {
    Write-Output "Updating GitHub"
    git config user.email "mrrobsewell@outlook.com"
    git config user.name "SQLDBAWithABeard"
    git add .\dbachecks.psd1
    git commit -m "Updated Version Number to $NewVersion ***NO_CI***"

    git push https://$(RobsGitHubPAT)@github.com/sqlcollaborative/dbachecks.git HEAD:master
    Write-Output "Updated GitHub"
}
catch {
    $_ | Format-List -Force
    Write-Output "Failed to update GitHub"
}

I use Get-Content and Set-Content as I had errors with Update-ModuleManifest, but Adam Murray g | t uses this code to update the version using the BuildID from VSTS:

$newVersion = New-Object version -ArgumentList 1, 0, 0, $env:BUILD_BUILDID
$Public  = @(Get-ChildItem -Path $ModulePath\Public\*.ps1)
$Functions = $public.basename
Update-ModuleManifest -Path $ModulePath\$ModuleName.psd1 -ModuleVersion $newVersion -FunctionsToExport $Functions

You can commit your change by adding your PAT token as a variable under the variables tab. Don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

20 - variables.png

Sign the code with a certificate

The SQL Collaborative uses a code signing certificate from DigiCert, who allow MVPs to use one for free to sign their code for open source projects (thank you, DigiCert). We had to upload the certificate to the secure files store in the VSTS library. Click on Library, then Secure Files and the blue +Secure File button

21 - secure file store.png

You also need to add the password as a variable under the variables tab as above. Again don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Then you need to add a task to download the secure file. Click on the + sign next to Phase 1 and search for secure

22 download secure file.png

choose the file from the drop down

23 - download secure file.png

Next we need to import the certificate and sign the code. I use a PowerShell task for this with the following code

$ErrorActionPreference = 'Stop'
# read in the certificate from a pre-existing PFX file
# I have checked this with @IISResetMe and this does not go in the store only memory
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new("$(Agent.WorkFolder)\_temp\dbatools-code-signing-cert.pfx","$(CertPassword)")

try {
    Write-Output "Signing Files"
    # find all scripts in your module...
    Get-ChildItem -Filter *.ps1 -Include *.ps1 -Recurse -ErrorAction SilentlyContinue |
        # ...that do not have a signature yet...
        Where-Object {
            ($_ | Get-AuthenticodeSignature).Status -eq 'NotSigned'
        } |
        # ...and apply one with the certificate we imported above
        Set-AuthenticodeSignature -Certificate $cert
    Write-Output "Signed Files"
}
catch {
    $_ | Format-List -Force
    Write-Error "Failed to sign scripts"
}

which will import the certificate into memory and sign all of the scripts in the module folder.
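
If you want a quick sanity check after signing (a minimal sketch, separate from the build definition), you can list any scripts that did not end up with a valid signature:

# From the module folder, list any scripts whose signature is not valid
Get-ChildItem -Filter *.ps1 -Recurse |
    Get-AuthenticodeSignature |
    Where-Object { $_.Status -ne 'Valid' } |
    Select-Object Path, Status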

Publish your artifact

The last step of the master branch build publishes the artifact (your signed module) to VSTS ready for the release task. Again, click the + sign next to Phase 1 and choose the Publish Artifact task (not the deprecated Copy and Publish Artifact task) and give the artifact a useful name

24 - publish artifact.png

Don’t forget to set the trigger for the master build as well following the same steps as the development build above

Publish to the PowerShell Gallery

Next we create a release to trigger when there is an artifact ready and publish to the PowerShell Gallery.

Click the Releases tab and New Definition

25 - Reelase creation

Choose an empty process and name the release definition appropriately

26 Release name empty process.png

Now click on the artifact and choose the master build definition. If you have not run a build you will get an error like the one below, but don't worry, just click Add.

27 - add artifact.png

Click on the lightning bolt next to the artifact to open the continuous deployment trigger

28 - Choose lightning bolt

and turn on Continuous Deployment so that when an artifact has been created with an updated module version and signed code it is published to the gallery

28 - Continuous deployment trigger

Next, click on the environment and name it appropriately and then click on the + sign next to Agent Phase and choose a PowerShell step

29 - PowerShell Publish step

You may wonder why I don't choose the PowerShell Gallery Packager task. There are two reasons: first, I need to install the required modules for dbachecks (dbatools, PSFramework, Pester) prior to publishing, and second, it appears that the API Key is stored in plain text.

30 - PowerShell Gallery Publisher

I save my API key for the PowerShell Gallery as a variable again making sure to tick the padlock to make it a secret

31 - API Key variable.png

and then use the following code to install the required modules and publish the module to the gallery

Install-Module dbatools -Scope CurrentUser -Force
Install-Module Pester -Scope CurrentUser -SkipPublisherCheck -Force
Install-Module PSFramework -Scope CurrentUser -Force

Publish-Module -Path "$(System.DefaultWorkingDirectory)/Master - Version Update, Signing and Publish Artifact/dbachecks" -NuGetApiKey "$(GalleryApiKey)"

That's it 🙂

Now we have a process that will automatically run our Pester tests when we commit or merge to the development branch, and then update our module version number, sign our code and publish to the PowerShell Gallery when we commit or merge to the master branch.

Added Extra – Dashboard

I like to create dashboards in VSTS to show the progress of the various definitions. You can do this under the dashboard tab. Click edit and choose or search for widgets and add them to the dashboard

32 - Dashboard.png

Added Extra – Badges

You can also enable badges for displaying on your readme in GitHub (or VSTS). For the build definitions this is under the Options tab.

33 - Build badges

For the release definitions, click the environment and then Options and Integrations.

34 - Release Badge

You can then copy the URL and use it in your readme like this on dbachecks

35 - dbachecks readme badges.png

The SQL Collaborative has joined the preview of enabling public access to VSTS projects as detailed in this blog post, so you can see the dbachecks build and release without the need to log in, and soon the dbatools process as well.

I hope you found this useful and if you have any questions or comments please feel free to contact me

 

Happy Automating!

VS Code – Terminal crashes when formatting script

I love VS Code. I love being able to press ALT + SHIFT + F and format my code.

formatting.gif

The Problem

Yesterday all I got when I pressed ALT + SHIFT + F was this

format error.png

I could reproduce it at will. This was very frustrating.

Turning on Verbose Logging

To turn on verbose logging for the PowerShell Editor Services, go to the Cog in the bottom left, click it and then click User Settings.

Search for powershell.developer.editorServicesLogLevel

powershell.developer.editorServicesLogLevel.png

If you hover over the setting on the left hand side a pencil will appear; click it and then click Replace in Settings.

edit settings.png

This will put the entry in the right hand side where you can change the value. Set it to Verbose and save

user settigns.png

a prompt will come up asking if you want to restart PowerShell

start a new session.png

When you restart PowerShell, if you click on Output and choose PowerShell Extension Logs you will see the path to the log file

logfilepath.png
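
If you prefer to watch the log from a PowerShell prompt while you reproduce the issue, you can tail it with something like this (the path is only a placeholder; use the path shown in the Output pane):

# Placeholder path - substitute the log file path shown in the PowerShell Extension Logs output
$logPath = 'C:\Users\XXXX\.vscode-insiders\extensions\ms-vscode.powershell-1.5.1\logs\EditorServices.log'
Get-Content -Path $logPath -Wait -Tail 20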

Reproduce the error

I then reproduced the error and opened the log file; this is what I got:

10/02/2018 09:11:19 [ERROR] – Method “OnListenTaskCompleted” at line 391 of C:\projects\powershelleditorservices\src\PowerShellEditorServices.Protocol\MessageProtocol\ProtocolEndpoint.cs

ProtocolEndpoint message loop terminated due to unhandled exception:

System.AggregateException: One or more errors occurred. —> System.Management.Automation.CommandNotFoundException: The term ‘Invoke-Formatter’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.Invoke(IEnumerable input, PSInvocationSettings settings)
at Microsoft.PowerShell.EditorServices.AnalysisService.InvokePowerShell(String command, IDictionary`2 paramArgMap)
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<InvokePowerShellAsync>d__31.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PowerShell.EditorServices.AnalysisService.<Format>d__22.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

Open an issue on GitHub

I couldn't quickly see what was happening, so I opened an issue on the vscode-powershell repo by going to Issues, clicking New Issue and following the instructions.

new issue.png

The Resolution

Keith Hill b | t pointed me to the resolution. Thank you Keith.

Further up in the log file there is a line where the editor services loads the PSScriptAnalyzer module, which should export the Invoke-Formatter command, but mine did not. It loaded the PSScriptAnalyzer module from my user's module directory

10/02/2018 09:11:01 [NORMAL] – Method “FindPSScriptAnalyzerModule” at line 354 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

PSScriptAnalyzer found at C:\Users\XXXX\Documents\WindowsPowerShell\Modules\PSScriptAnalyzer\1.10.0\PSScriptAnalyzer.psd1

10/02/2018 09:11:01 [VERBOSE] – Method “EnumeratePSScriptAnalyzerCmdlets” at line 389 of C:\projects\powershelleditorservices\src\PowerShellEditorServices\Analysis\AnalysisService.cs

The following cmdlets are available in the imported PSScriptAnalyzer module:
Get-ScriptAnalyzerRule
Invoke-ScriptAnalyzer

I ran

$Env:PSModulePath.Split(';')

to see the module paths

module path.png

and looked in the .vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules directory. There was no PSScriptAnalyzer folder

no module.png

So I copied the PSScriptAnalyzer folder from the normal VS Code PowerShell Extension module folder into that folder, restarted PowerShell and I had my formatting back again 🙂
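
The copy itself is a one-liner; something like this does it (a sketch assuming the default extension install paths and the version number from my machine, so adjust to suit yours):

# Copy PSScriptAnalyzer from the stable extension's modules folder into the Insiders one
$source = "$HOME\.vscode\extensions\ms-vscode.powershell-1.5.1\modules\PSScriptAnalyzer"
$destination = "$HOME\.vscode-insiders\extensions\ms-vscode.powershell-1.5.1\modules"
Copy-Item -Path $source -Destination $destination -Recurse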

I then reset the logging mode in my user settings back to Normal

Thank you Keith

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo. So clone it and navigate to the filepath

Create Database

First, let's create a database. Connect to your Azure subscription

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

 Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YourSubscriptionIDHere

01 - subscription.png

Then set some variables

# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
# The logical server name: Use a random value or replace with your own value (do not capitalize)
$servername = "server-$(Get-Random)"
# Set an admin login and password for your database
# The login information for the server. You need to set these and uncomment them - don't use these values

# $adminlogin = "ServerAdmin"                
# $password = "ChangeYourAdminPassword1"

# The ip address range that you want to allow to access your server - change as appropriate
# $startip = "0.0.0.0"
# $endip = "0.0.0.0"

# To just add your own IP Address
$startip = $endip = (Invoke-WebRequest 'http://myexternalip.com/raw').Content -replace "`n"

# The database name
$databasename = "tweets"

$AzureSQLServer = "$servername.database.windows.net,1433"
$Table = "table.sql"
$Proc = "InsertTweets.sql"

They should all make sense; take note that you need to set and uncomment the login and password and choose which IPs to allow through the firewall.

Create a Resource Group

## Create a resource group

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location

02 - resource group.png

Create a SQL Server

## Create a Server

$newAzureRmSqlServerSplat = @{
    SqlAdministratorCredentials = $SqlAdministratorCredentials
    ResourceGroupName = $resourcegroupname
    ServerName = $servername
    Location = $location
}
New-AzureRmSqlServer @newAzureRmSqlServerSplat

03 - create server.png

Create a firewall rule; I just use my own IP and also allow Azure IPs.

$newAzureRmSqlServerFirewallRuleSplat = @{
    EndIpAddress = $endip
    StartIpAddress = $startip
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    FirewallRuleName = "AllowSome"
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

# Allow Azure IPS

$newAzureRmSqlServerFirewallRuleSplat = @{
    AllowAllAzureIPs = $true
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

03a - firewall rule.png

Create a database

# Create a database

$newAzureRmSqlDatabaseSplat = @{
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    Edition = 'Basic'
    DatabaseName = $databasename
}
New-AzureRmSqlDatabase  @newAzureRmSqlDatabaseSplat

04 - create database.png

I have used the dbatools module to run the scripts that create the table and stored procedure in the database. You can get it using

Install-Module dbatools # -Scope CurrentUser # if not admin process

Run the scripts

# Create a credential

$newObjectSplat = @{
    ArgumentList = $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force)
    TypeName = 'System.Management.Automation.PSCredential'
}
$SqlAdministratorCredentials = New-Object @newObjectSplat

## Using dbatools module

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database = $databasename
    File = $Table,$Proc
    SqlInstance = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat

05 - Create Table Sproc.png

This will have created the following in Azure; you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First login (if you need to) and set some variables

## This creates cognitive services for analysing the tweets

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YOUR SUBSCRIPTION ID HERE

#region variables
# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
$APIName = 'TweetAnalysis'
#endregion

Then create the API and get the key

#Create the cognitive services

$newAzureRmCognitiveServicesAccountSplat = @{
    ResourceGroupName = $resourcegroupname
    Location = $location
    SkuName = 'F0'
    Name = $APIName
    Type = 'TextAnalytics'
}
New-AzureRmCognitiveServicesAccount @newAzureRmCognitiveServicesAccountSplat

# Get the Key

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat 

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the Github repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

click new connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (value of $AzureSQLServer) , Database Name , User Name and Password from the steps above

19 - import step 7.png

Click new Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow. Click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Select during import for the SQL Server Connection and choose the SQL Server Connection and do the same for the Twitter Connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

search for detect and choose the Text Analytics Detect Sentiment

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them, you can retrieve them with this code:

#region Forgot the details

# Copy the URL if you forget to save it

$getAzureRmCognitiveServicesAccountSplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccount @getAzureRmCognitiveServicesAccountSplat).Endpoint | Clip

# Copy the Key if you forgot

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat).Key1 | Clip

#endregion

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__         0
  • Date                                 Created At
  • Sentiment                      Score
  • Tweet                              Tweet Text
  • UserLocation                 Location
  • UserName                      Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop-up with a prompt for the credentials. Choose Database and enter your credentials and click connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn't run yet. If it does fail, just wait until you see some tweets and then click apply changes again.
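
If you want to check whether any tweets have landed yet before refreshing, a quick query with dbatools will tell you (a sketch; the table name dbo.Tweets is an assumption, so check what table.sql actually created):

# Count the rows in the tweets table (table name is an assumption - check table.sql)
$invokeDbaSqlCmdSplat = @{
    SqlInstance   = $AzureSQLServer
    SqlCredential = $SqlAdministratorCredentials
    Database      = $databasename
    Query         = 'SELECT COUNT(*) AS TweetCount FROM dbo.Tweets'
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat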

You will probably want to alter the pictures and links etc. and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

TSQL2sDay – Get-PostRoundup

First, an apology: this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling; unfortunately I found a major error in Pester (again, not strictly true but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, Default settings, best practices, migrations, Warnings in Agent Jobs, sqlpackage, VLFs, CMS, Disabling Named Pipes, Orphaned users, AG Status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first timers at the monthly TSQL2sDay blog party. Please don't let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the medium of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

https://twitter.com/Steve_TSQL/status/907713842943115264

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!

 

 

Writing Dynamic and Random Tests Cases for Pester

I have written a module SQLDiagAPI for consuming the SQL Server Diagnostics API with PowerShell. I blogged about how I used Pester to develop one of the functions. Whilst writing Get-SQLDiagFix I wrote some Pester tests to make sure that the output from the code was as expected.

Pester

For those that don’t know. Pester is a PowerShell module for Test Driven Development

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
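
For reference, installing or updating Pester from the Gallery is a single line (a minimal sketch; -SkipPublisherCheck is needed on Windows PowerShell 5 because of the inbox version):

# Install or update Pester from the PowerShell Gallery
Install-Module -Name Pester -Scope CurrentUser -Force -SkipPublisherCheck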

The Command Get-SQLDiagFix

Get-SQLDiagFix returns the Product Name, Feature Name/Area, KB Number, Title and URL for the Fixes in the Cumulative Updates returned from the SQL Server Diagnostics Recommendations API. One Fix looks like this

07 - Get-SQLDiagFix result.png

This is how I wrote the Pester tests for that command

Mocking the results

In my Describe block for each function I mock Get-SQLDiagRecommendations. This is the command that each of the currently available commands in the module uses to get the recommendations from the SQL Server Diagnostic Recommendations API. I did this by creating a json file from the API and saving it in a json folder inside the tests folder

01 - JSON folder.png
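
Creating that file is a one-off step; something like this would capture the API output (a sketch, assuming you already have a working API Key configured):

# Save the current recommendations from the API so the tests can use them as a mock
Get-SQLDiagRecommendations |
    ConvertTo-Json -Depth 10 |
    Out-File -FilePath .\tests\json\recommendations.JSON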

I can then mock Get-SQLDiagRecommendations inside a BeforeAll code block using

Describe "Get-SQLDiagFix" -Tags Build , Unit, Fix {
BeforeAll {
    $Recommendations = (Get-Content $PSScriptRoot\json\recommendations.JSON) -join "`n" | ConvertFrom-Json
    Mock Get-SQLDiagRecommendations {$Recommendations}
}

This means that every time the code in the test calls Get-SQLDiagRecommendations it will not use the internet to connect to the API and return an object. Instead it will return the $Recommendations object which is loaded from a file on the file system. I am not, therefore, depending on any external factors and I have a known set of data for my test results.

I also have a set of mocks in my Output Context code block:

Context "Output" {
BeforeAll {
    $Fixes = (Get-Content $PSScriptRoot\json\fix.JSON) -join "`n" | ConvertFrom-Json
    $Products = Get-SQLDiagProduct
    $Features = Get-SQLDiagFeature
}

The fix.JSON file was created from the recommendations.JSON and only contains the properties returned by Get-SQLDiagFix, which is what we are testing here. I can set variables for Products and Features using the commands from the module as these will call Get-SQLDiagRecommendations, which we have already mocked.

Test All of the Fixes

I can now test that the code I have written for Get-SQLDiagFix returns the correct data without any parameters using this test with Compare-Object.

It "returns all of the fixes with no parameter" {
    Compare-Object (Get-SQLDiagFix) $Fixes | Should BeNullOrEmpty
}

If there is no difference between the object returned from Get-SQLDiagFix and the $Fixes object which uses the json file then the code is working as expected and the test will pass.

Test Cases

I learned about test cases from Mike Robbins' blog post. Test cases enable you to provide a hash table of options and loop through the same test for each of them. Here is an example

There are the following products in the Recommendation API

  • SQL Server 2012 SP3
  • SQL Server 2016 SP1
  • SQL Server 2016 RTM
  • SQL Server 2014 SP1
  • SQL Server 2014 SP2

and I want to run a test for each product to check that the fixes returned from Get-SQLDiagFix for that product match the $Fixes object filtered by Product. Here is the code:

$TestCases = @{ ProductName = 'SQL Server 2012 SP3'},
@{ ProductName = 'SQL Server 2016 SP1'},
@{ ProductName = 'SQL Server 2016 RTM'},
@{ ProductName ='SQL Server 2014 SP1'},
@{ ProductName = 'SQL Server 2014 SP2'}
It "Returns the correct results with a single product parameter &amp;amp;amp;amp;lt;ProductName&amp;amp;amp;amp;gt;" -TestCases $TestCases {
    param($productname)
    $results = $fixes.Where{$_.Product -in $ProductName}
    Compare-Object (Get-SQLDiagFix -Product $productname) $results | Should BeNullOrEmpty
}

You can click on the image below to see a larger, more readable version.

02 Test Cases.png

The $TestCases variable holds an array of hashtables, one for each product with a Name that matches the parameter that I use in the test and a value of the product name.

I wrote one test, one It code block.  I refer to the product in the title inside <> using the same name as the name in the hashtable. The test (It) needs a parameter of -TestCases with a value (in this example) of the $TestCases variable we have just defined. It also needs a param block with a parameter that matches the Name value from the hashtables.

The expected test results are placed in a $results variable by filtering the $Fixes variable (defined in the BeforeAll code block above) by the parameter $Productname

The test will then run for each of the test cases in the $TestCases variable comparing the results of Get-SQLDiagFix -Product $Productname with the expected results from the $fixes variable

Here are the test results

03 - product test results.png

Multiple Products in Test Cases

I also want to test that Get-SQLDiagFix will work for multiple Products. I need to create TestCases for those too. I do that in exactly the same way

$TestCases = @{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'},
@{ ProductName = 'SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2'}
    It "Returns the correct results with multiple product parameter <ProductName>" -TestCases $TestCases {
        param($productname)
        $results = $fixes.Where{$_.Product -in $ProductName}
        Compare-Object (Get-SQLDiagFix -Product $productname) $results | Should BeNullOrEmpty
}

Which looks like this when the tests run:

04 - mulitple product test results.png

Single Feature Dynamic Test Cases

Get-SQLDiagFix can also filter the fixes by feature area. The features are returned from Get-SQLDiagFeature. This means that I can create a test for each of the features by using the $features variable which was defined in the BeforeAll block as

$Features = Get-SQLDiagFeature

Then I can dynamically create test cases using

$TestCases = @()
$Features | Foreach-Object {$TestCases += @{Feature = $_}}
It "Returns the correct results with a single feature <Feature>" -TestCases $TestCases {
    param($Feature)
    $results = $fixes.Where{$_.Feature -in $Feature}
    Compare-Object (Get-SQLDiagFix -Feature $Feature) $results | Should BeNullOrEmpty
}

and the results look like

05 - single feature test results.png

Random Dynamic Multiple Feature Test Cases

I also need to test that Get-SQLDiagFix returns the correct results for multiple features, and whilst I could create those by hand like the products example above, why not let PowerShell do that for me?

I created 10 test cases. Each one has a random number of features between 2 and the total number of features. I can then write one test to make use of those test cases. This is how I do that

## Generate 10 TestCases of a random number of Features
$TestCases = @()
$x = 10
While ($x -gt 0) {
    ## We are testing multiples so we need at least 2
    $Number = Get-Random -Maximum $Features.Count -Minimum 2
    $Test = @()
    While ($Number -gt 0) {
        $Test += Get-Random $Features
        $Number --
    }
## Need unique values
$Test = $test | Select-Object -Unique
$TestCases += @{Feature = $Test}
$X --
}
It "Returns the correct results with multiple features <Feature>" -TestCases $TestCases {
 param($Feature)
 $results = $fixes.Where{$_.Feature -in $Feature}
 Compare-Object (Get-SQLDiagFix -Feature $Feature) $results | Should BeNullOrEmpty
 }

Now there are 10 tests, each with a random number of features, and the results look like this. Each time the tests run they will use a different set of features for each of the 10 tests, but I will know that I am testing that the code returns the correct results for multiple features.

06 - multiple features.png

Two Sets of Test Cases?

It is also possible for Get-SQLDiagFix to have one or more products and one or more features passed as parameters, which obviously also need to be tested to ensure the code is returning the correct results. As Pester is just PowerShell we can use normal PowerShell code. This means that I can test for a single product and a single feature using a foreach loop and Test Cases like this

foreach ($Product in $Products) {
    $TestCases = @()
    $Features = Get-SQLDiagFeature -Product $Product
    $Features | Foreach-Object {$TestCases += @{Feature = $_}}
    It "Returns the correct results for a single product parameter $Product with a single feature <Feature>" -TestCases $TestCases {
        param($Feature)
        $results = $fixes.Where{$_.Product -eq $product -and $_.Feature -in $Feature}
        Compare-Object (Get-SQLDiagFix -Product $Product -Feature $Feature) $results | Should BeNullOrEmpty
    }
}

To test for a single product and multiple features I use this code

 foreach ($Product in $Products) {
    ## Generate 10 TestCases of a random number of Features
    $TestCases = @()
    $x = 10
    While ($x -gt 0) {
        ## We are testing multiples so we need at least 2
        $Number = Get-Random -Maximum $Features.Count -Minimum 2
        $Test = @()
        While ($Number -gt 0) {
            $Test += Get-Random $Features
            $Number --
        }
        ## Need unique values
        $Test = $test | Select-Object -Unique
        $TestCases += @{Feature = $Test}
        $X --
    }
    It "Returns the correct results for a single product parameter $Product with a multiple features <Feature>" -TestCases $TestCases {
        param($Feature)
        $Test = (Get-SQLDiagFix -Product $Product -Feature $Feature)
        ## If there are no results Compare-Object bombs out even though it is correct
        ## This is a risky fix for that
        if ($Test) {
        $results = $fixes.Where{$_.Product -eq $product -and $_.Feature -in $Feature}
        Compare-Object $test $results | Should BeNullOrEmpty
        }
    }
 } 

Because it is dynamically creating the values for the two parameters, I have to check that there are some results to test (the if ($Test) check) as Compare-Object will throw an error if the object to be compared is empty. I need to do this because it is possible for the test to pick a combination of products and features for which there are no fixes in the results.

The reason I have commented it as a risky fix is that if someone changes the code and Get-SQLDiagFix does not return any results, then the comparison would not run and this test would give no indication that the code had a bug. However, in this suite of tests there are many other tests that would fail in that scenario, but be careful in your own usage.

I test for multiple products with a single feature and multiple products with multiple features like this

 $Products = @('SQL Server 2012 SP3', 'SQL Server 2016 SP1'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2')
 foreach ($Product in $Products) {
     $TestCases = @()
     $Features = Get-SQLDiagFeature -Product $Product
     $Features | Foreach-Object {$TestCases += @{Feature = $_}}
     It "Returns the correct results for multiple products parameter $Product with a single feature <Feature>" -TestCases $TestCases {
         param($Feature)
         $results = $fixes.Where{$_.Product -in $product -and $_.Feature -in $Feature}
         Compare-Object (Get-SQLDiagFix -Product $Product -Feature $Feature) $results | Should BeNullOrEmpty
     }
}
 $Products = @('SQL Server 2012 SP3', 'SQL Server 2016 SP1'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1'),
 @('SQL Server 2012 SP3', 'SQL Server 2016 SP1', 'SQL Server 2016 RTM', 'SQL Server 2014 SP1', 'SQL Server 2014 SP2')
 foreach ($Product in $Products) {
     ## Generate 10 TestCases of a random number of Features
     $TestCases = @()
     $x = 10
     While ($x -gt 0) {
         ## We are testing multiples so we need at least 2
         $Number = Get-Random -Maximum $Features.Count -Minimum 2
         $Test = @()
         While ($Number -gt 0) {
             $Test += Get-Random $Features
             $Number --
         }
     ## Need unique values
     $Test = $test | Select-Object -Unique
     $TestCases += @{Feature = $Test}
     $X --
 }
 It "Returns the correct results for multiple products parameter $Product with a multiple feature <Feature>" -TestCases $TestCases {
     param($Feature)
     $Test = (Get-SQLDiagFix -Product $Product -Feature $Feature)
     ## Annoyingly if there are no results Compare-Object bombs out even though it is correct
     ## This is a risky fix for that
     if ($Test) {
         $results = $fixes.Where{$_.Product -in $product -and $_.Feature -in $Feature}
         Compare-Object $test $results | Should BeNullOrEmpty
        }
    }
 } 

You can see all of the unit tests for the SQLDiagAPI module in my GitHub repository

The module is available on the PowerShell Gallery which means that you can install it using

Install-Module SQLDiagAPI

Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module. A module to consume the SQL Diagnostics API.

I have been asked a few times what the process is for creating a module, using Github and developing with Pester and whilst this is not a comprehensive how-to I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.

Plaster Template

When you create a module there are a number of files that you need and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub repository. Whilst you could write a PowerShell script to create all of those, there is no need as there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default 'scaffold' for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette's post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but as with anything, if you find yourself repeating steps then it is time to automate it
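
Plaster itself comes from the PowerShell Gallery, so getting it is a single line (a minimal sketch):

# Install Plaster from the PowerShell Gallery
Install-Module -Name Plaster -Scope CurrentUser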

With my Plaster Template created I could simply run

$plaster = @{
TemplatePath = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
DestinationPath = "GIT:\SQLDiagAPI"
FullName = "Rob Sewell"
ModuleName = "SQLDiagAPI"
ModuleDesc = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details "
Version = "0.9.0"
GitHubUserName = "SQLDBAWithABeard"
GitHubRepo = "SQLDiagAPI"
}
If(!(Test-Path $plaster.DestinationPath))
{
New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don’t know. Pester is a PowerShell module for Test Driven Development

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it, and if I am writing code that others will use, consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username even though it is not used and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops
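
When a function needs the key it can be read back from that file; a minimal sketch of the idea looks like this (the variable names are just for illustration):

# Import the saved credential and pull the API Key back out as plain text
$APIKeyCred = Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred"
$APIKey = $APIKeyCred.GetNetworkCredential().Password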

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more

The first function I wanted to write was to get the recommendations from the API. I decide to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists. Otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: checking for the API Key file with Test-Path and writing a warning if the file did not exist.

Mocking

However this is not going to work if I have already saved the key to the file. The test needs to not be reliant on anything external. I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
Mock Test-Path {$false}
Mock Write-Warning {"Warning"}

This is what happens when you run this test. When there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case false. For Write-Warning it will return a string of Warning.

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used"{
Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return false, which will invoke Write-Warning in my code, and in the test that will return "Warning". So if I have written my code correctly the test will pass without actually running the real Test-Path and interacting with my system or running Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests

It 'Checks the Mock was called for Test-Path' {
$assertMockParams = @{
'CommandName' = 'Test-Path'
'Times' = 1
'Exactly' = $true
}
Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to false it would test that the mock was called at least the number of times specified. This is another check that I really have called the Mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed I don’t have a function named Get-SQLDiagRecommendations

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests including checking that all of the scripts pass the Script Analyser rules and that all of the functions have the correct help. (thank you June Blender for that test)

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails as this will show the describe and context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.
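
For example, if you only want the summary line at the end, something like this works (the exact values that Show accepts may vary by Pester version, so check the help on your machine)

Invoke-Pester .\Tests -Show Summary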

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don’t want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to run only a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build , Unit{
Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with the Show Fails to reduce the output my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests are running in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher

And now our tests pass

06 - Passing Tests

If you look at the API documentation the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty and without it we can’t call the API so I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
Mock Get-MachineGUID {} -Verifiable
Mock Write-Warning {"Warning"} -Verifiable
Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block, passing an APIKey parameter to Get-SQLDiagRecommendations so that we don’t hit the Write-Warning we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the verifiable mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}

 

and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}

Rinse and repeat

That is basically the process that I follow to write a function. I just write a test, write some code to pass it, write another test, write some code to pass it. I keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my unit tests had passed, I ran

 Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

To check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns, as I think the command has to be called Get-SQLDiagRecommendations with an S! I have tagged the Script Analyzer tests so I can run just those tests.
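
For context, the effect of that exception is roughly the same as excluding the plural-noun rule when running the analyzer yourself. This is a minimal sketch assuming the PSScriptAnalyzer module is installed; the Help.Exceptions.ps1 mechanism in the repository may implement it differently.

# Analyse the module's functions but skip the PSUseSingularNouns rule
Invoke-ScriptAnalyzer -Path .\functions -Recurse -ExcludeRule PSUseSingularNouns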

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

Which tests whether I have the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code’s PowerShell extension to add the help scaffolding really easily, as I describe here

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io , https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack

Contribute

Of course I am happy to have others contribute to this; in fact, I encourage it. Please fork, send PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam

PowerShell Module for the SQL Server Diagnostics API – 1st Command Get-SQLDiagRecommendations

I saw this blog post about the SQL Server Diagnostics add-on to SSMS and API and thought I would write some PowerShell to work with it as all of the examples use other languages.

SQL Server Diagnostics API

The Diagnostic Analysis API allows you to upload memory dumps so that you can debug and self-resolve memory dump issues from your SQL Server instances and receive recommended Knowledge Base (KB) article(s) from Microsoft, which may be applicable for the fix.

There is also the Recommendations API to view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU which can be filtered by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc).

I have written a module to work with this API. It is not complete. It only has one command as of now but I can see lots of possibilities for improvement and further commands to interact with the API fully and enable SQL Server professionals to use PowerShell for this.

Storing the API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it. I saved my API Key using the Export-CliXML command as described by Jaap Brasser here.

 Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred" 

You need to enter a username even though it is not used and then enter the API Key as the password. It is saved in the root of the user profile folder as hopefully user accounts will have access there in most shops.

The commands in the module will look for the API Key in that SQLDiag.Cred file by default but you can also just use the APIKey parameter
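
For illustration, retrieving the key from that file is a one-liner. This is a minimal sketch of the idea; the module’s own implementation may differ slightly.

# Read the credential back in and extract the API Key as plain text
$APIKey = (Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred").GetNetworkCredential().Password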

Get-SQLDiagRecommendations

The first function in the module is Get-SQLDiagRecommendations. All this function does is connect to the Recommendations API and return an object containing the information about the latest Cumulative Updates.
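
Under the hood, a call like this typically amounts to an Invoke-RestMethod request with the key sent in a request header. The sketch below is purely illustrative; the URI and header name are placeholders, not the module’s real endpoint, so check the API portal for the actual details.

# Hypothetical sketch only - the endpoint URI and header name are placeholders
$uri = 'https://<recommendations-endpoint>'                   # placeholder, see the API portal for the real address
Invoke-RestMethod -Uri $uri -Headers @{ apikey = $APIKey }    # header name is a placeholder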

If you have already saved your API Key as described above you can use

 Get-SQLDiagRecommendations 

If you want to enter the API Key manually you would use

 Get-SQLDiagRecommendations -APIKey XXXXXXXX

Either way it will return a PowerShell object containing all of the information which looks like this.

07 - Get-SQLRecommendations

One of the beauties of PowerShell is that you can pass objects down a pipeline and use them in other commands. Also, your only limit is your imagination.

You want to export to CSV, HTML, Text file?
Email, Import to database, store in Azure storage?
Embed in Word or Excel on a SharePoint site?

All of this and much, much more is easily achievable with PowerShell.
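
For example, a quick CSV export of the top-level recommendation properties might look like this (the Product, Title and CreatedOn property names are the same ones used in the Out-GridView example further down)

# Export the top level of each recommendation to CSV
(Get-SQLDiagRecommendations).Recommendations |
    Select-Object Product, Title, CreatedOn |
    Export-Csv -Path .\SQLDiagRecommendations.csv -NoTypeInformation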

In the future this command will feed other functions in the module that will display this information in a more useful fashion. I am thinking of commands like

Get-SQLDiagRecommendations |
Get-SQLDiagLatestCU -Version SQL2012

or

Get-SQLDiagRecommendations |
Get-SQLDiagKBArticle -Version SQL2012 -Feature BackupRestore

If you have any ideas please join in on GitHub

JSON

For now though you can use Get-SQLDiagRecommendations to output the results to JSON so that you can examine them or consume them.

If you use VS Code follow the steps here and you can export the results to the current file with

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-CurrentFile

Which looks like this

08 - OutCurrentFile

It shows the entire JSON object containing all of the information about all of the latest CUs for SQL Server 2012 and up, and each of the KB articles. I have minimised several of the nodes to try to show as much as possible for SQL Server 2012 SP3

If you do not use VS Code or you want to export straight to a file then you can

Get-SQLDiagRecommendations | ConvertTo-Json -Depth 7 | Out-File -FilePath PATHTOFILE

Out-GridView

I like Out-GridView so I quickly gathered the Product, Cumulative Update, Feature Type, KB Number and URL and output them to Out-GridView like this

$recommendations = Get-SQLDiagRecommendations
$KBs = foreach ($recommendation in $recommendations.Recommendations) {
    $Product = $recommendation.Product
    $CU = $recommendation.Title
    $CreatedOn = $recommendation.CreatedOn
    foreach ($fix in $recommendation.Content.RelevantFixes) {
        $feature = $fix.Title
        foreach ($Kb in $fix.KbArticles) {
            [PSCustomObject]@{
                CreatedOn = $CreatedOn
                Product   = $Product
                CU        = $CU
                Feature   = $feature
                KB        = $Kb.Rel
                Link      = $Kb.href
            }
        }
    }
}
$KBs | Out-GridView

As you can filter easily in Out-GridView I filtered by 2012 and this is what it looks like

09 - Out-GridView

This will enable you to quickly see any information that you require about the Cumulative Updates for SQL 2012, 2014 and 2016
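
If you would rather filter in the pipeline before the grid opens, you can do that too. Assuming the Product property contains the version string, as in the screenshot above, something like this would show only the SQL Server 2012 rows

$KBs | Where-Object { $_.Product -like '*2012*' } | Out-GridView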

Github

You can find the module on GitHub. There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations, but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io , https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack

Contribute

Of course I am happy to have others contribute to this; in fact, I encourage it. Please fork, send PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell

Tomorrow I have a post explaining the process I used to create the module and how I used Test Driven Development with Pester to write this function.

Why VS Code Increases my Productivity

Last week I was showing a co-worker some PowerShell code and he asked what the editor was that I was using. Visual Studio Code I said. Why do you use that? What does it do?

This is what I showed him

Runs on any Operating System

Code (as I shall refer to it) is a free, lightweight, open-source editor which runs on all the main operating systems, so you have the same experience on Linux as on Windows and there is less to learn

Extensions

You can add new languages, themes, debuggers and tools from the extensions gallery to reduce the number of programmes you have open and the need to switch between programmes

You can add extensions using CTRL + SHIFT + X and searching in the bar

01 - Extensions

or by going to the Extensions gallery, searching for the extension and copying the installation command

02 - extensions gallery.PNG
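
If you prefer the command line, the code CLI can install extensions too. For example, the identifier below is the one for the PowerShell extension; check the gallery page for the exact ID of the extension you want

code --install-extension ms-vscode.PowerShell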

Debugging

There is a rich debugging experience built in

03 - debugging.PNG

You can learn about debugging from the official docs and Keith Hill wrote a blog post on Hey Scripting Guys about debugging PowerShell

Intellisense

An absolute must to make life simpler. Code has IntelliSense for PowerShell and T-SQL, which I use the most, but also for many more languages. Read more here

Git integration

I love the Git integration; it makes it so easy to work with GitHub for me. I can see diffs, commit and undo commits nice and simply. Just open the root folder of the repository and it’s there

04 - git

This page will give you a good start on using git with Code

No distractions

With full screen mode (F11) or Zen mode (CTRL + K, Z) I can concentrate on coding and not worry about distractions

Stay in one programme and do it all

I have a Markdown document, a PowerShell script and a T-SQL script all in one Git repository, and I can work on all of them and version control them in one place. The screencast below also shows some of the new capabilities available in the Insiders version. I managed to leave the screen recording dialogue open as well, apologies, and the mistake was deliberate!

I used the GitLens and SQL beautify extensions as well as the dbatools module in that demo

That’s why I am using Code more and more these days, hope it helps

Happy Automating!

 

 

Testing SQL Server Access to a share with PowerShell using dbatools

A good security practice is to back up our SQL Servers to a network share but not allow users to browse the share. How can we ensure that our SQL Server has access, or test it if it has been set up by someone else?

Let’s set this up.

First let’s create a share for our backups

$FileShareParams = @{
    Name                   = 'SQLBackups'
    Description            = 'The Place for SQL Backups'
    SourceVolume           = (Get-Volume -DriveLetter D)
    FileServerFriendlyName = 'beardnuc'
}
New-FileShare @FileShareParams

This will create us a share called SQLBackups on the D drive of the server beardnuc, but without any permissions, so let’s grant permissions to everyone

$FileSharePermsParams = @{
    Name        = 'SQLBackups'
    AccessRight = 'Modify'
    AccountName = 'Everyone'
}
Grant-FileShareAccess @FileSharePermsParams

01 file share.PNG

The share is created and I can access it and create a file

02 - create a file.PNG

and as we can see the permissions are granted for everyone

03 -permissions.PNG
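
You can also check this from PowerShell rather than the GUI. Assuming the same Storage module cmdlets used above are available, Get-FileShareAccess will list the access entries for the share

Get-FileShareAccess -Name 'SQLBackups'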

OK, that’s not what we want, so let’s revoke that permission.

Revoke-FileShareAccess -Name SQLBackups -AccountName 'Everyone'

04 revoked.PNG

Now let’s add permissions just for our SQL Server Service Accounts

$FileSharePermsParams = @{
    Name        = 'SQLBackups'
    AccessRight = 'Modify'
    AccountName = 'SQL_DBEngine_Service_Accounts'
}
Grant-FileShareAccess @FileSharePermsParams

and explicitly deny our DBA user accounts from accessing them.

$BlockFileShareParams = @{
    Name        = 'SQLBackups'
    AccountName = 'SQL_DBAs_The_Cool_Ones'
}
Block-FileShareAccess @BlockFileShareParams

In the GUI our permissions look like this

and when I try to access as THEBEARD\Rob I get this

07 -no permissions.PNG

So how can I check that I have access from my SQL Server? Sure, I could get the password of the SQL Service account and run a process as that account; I am not saying that’s a good idea, but it could be done. Of course it couldn’t be done if you are using Managed Service Accounts or Group Managed Service Accounts, but there is another way

Enter dbatools to the rescue 😉 The dbatools module (for those that don’t know) is a PowerShell module written by amazing folks in the community designed to make administrating your SQL Server significantly easier using PowerShell. The instructions for installing it are available here. It comprises 182 separate commands at present

There is a command called Test-SqlPath. As always, start with Get-Help

Get-Help Test-SqlPath -Full

08 - get help.PNG

So it uses master.dbo.xp_fileexist to determine if a file or directory exists from the perspective of the SQL Server service account, and has three parameters: SqlServer, Path and SqlCredential for SQL Authentication. Of course, if that stored procedure is disabled on your estate then this command will not be of use to you. With that in mind, let’s run it and see what it does

Test-SqlPath -SqlServer sql2016n1 -Path \\beardnuc\SQLBackups

09 - path test

That’s good, I have access, so let’s back a database up

Backup-SqlDatabase -ServerInstance SQL2016N1 -Database DBA-Admin -CopyOnly -BackupAction Database -BackupFile '\\BeardNuc\SQLBackups\Test-DBA-Admin.bak'

Ah, I can’t show you as I don’t have access. Better get in touch with the data centre admin to check 😉 Luckily, I am my own data centre admin and have another account I can use 🙂

10 - check

So what if we want to test all of our servers for access to the new share? I tried this

$SQLServers = (Get-VM -ComputerName beardnuc).Where{$_.Name -like '*SQL*' -and $_.Name -notlike 'SQL2008Ser2008'}.Name
Test-SqlPath -SqlServer $SQLServers -Path '\\BeardNuc\SQLBackups'
but unfortunately I hit an error

11 - error.PNG

It seems that at the moment (version 0.8.942) this command only accepts a single server. This is what you should do if you find either a bug or have an idea for dbatools: raise an issue on GitHub.

Navigate to the GitHub repository and click on issues. I generally search for the command name in the issues to see if someone else has beaten me to it

12 - issues

If those issues don’t match yours then click the green New Issue button

There is a template to fill in which asks you to specify your Windows, PowerShell and SQL versions, with the commands that you need to do so included. Please do this and paste the results in, as it will help the folks to replicate the issue in the case of more complicated bugs

I created this issue with a potential fix as well. You don’t have to do that, just letting the folks know is good enough

Until that issue is resolved, you can check all of your servers as follows
$SQLServers = (Get-VM -ComputerName beardnuc).Where{$_.Name -like '*SQL*' -and $_.Name -notlike 'SQL2008Ser2008'}.Name
foreach ($Server in $SQLServers) {
    $Test = Test-SqlPath -SqlServer $Server -Path '\\BeardNuc\SQLBackups'
    [PSCustomObject]@{
        Server = $Server
        Result = $Test
    }
}
13 - servers.PNG

and if I remove one of the service accounts from the group, restart the service and run the command again

14 - one fails.PNG
So that’s how to use dbatools to check that your SQL Servers have access to a network share, and also how to create an issue on GitHub for dbatools and help it to get even better

Happy Automating

NOTE: The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module if you already have the latest version. If you did not install the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

The SQL Server Community Collaborative GitHub Organisation is born

My wonderful friend Chrissy LeMaire and I are the creators of two GitHub repositories for SQL Server and PowerShell called dbatools and dbareports

If you are working with SQL Server I highly recommend that you take a look at the vast number of commands available to you in dbatools, which will help you complete tasks within SQL Server, especially instance migrations, as well as a growing number of best practice implementations

Neither of these modules is just the work of one person any more. We have over 20 people who have collaborated on the modules (THANK YOU ALL) and more who have provided guidance and comments via the channels in the SQL Server Community Slack https://sqlps.io/slack and via the Trello boards https://dbatools.io/trello and https://dbareports/trello

At SQL Saturday Cambridge this weekend I was proud to join Chrissy in her presentation as we talked about both modules. Here’s a fabulous picture of us with Buck Woody

 

wp_20160910_10_14_58_pro

 

We had discussed previously that it didn’t feel quite right that these community tools were under our own personal accounts, and that it also caused some administration issues with allowing access. So, with that in mind, after a naming discussion in the Slack channel we created an organisation to hold them both

 SQL Server Community Collaborative

is born at https://github.com/sqlcollaborative

Nothing much changes except the name. We have even found that all the old links work and GitHub Desktop updated. We will continue to make great commands with all of our fantastic collaborators. Discussions will happen in Slack and organisation in Trello, and we will continue to grow and learn and teach and share and create together.

We would love you to come and join us