Deploying to a Power BI Report Server with PowerShell

Just a quick post to share some code that I used to solve a problem I had recently.

I needed to automate the deployment of some Power BI reports to a Power BI Report Server (PBRS) using TFS. I had some modified historical validation dbachecks pbix files whose deployment I wanted to automate, so that the client could quickly and simply deploy the reports as needed.

The manual way

It is always a good idea to understand how to do a task manually before automating it. To deploy to PBRS you need to use the version of Power BI Desktop optimised for Power BI Report Server. There are instructions here. Then it is easy to deploy to the PBRS by clicking File, then Save As and choosing Power BI Report Server.

manual deploy

If I then want to set the data source to use a different set of credentials, I navigate to the folder that holds the report in PBRS, click the hamburger menu and choose Manage

manage

and I can alter the User Name and Password or the type of connection by clicking on Data Sources

testconn.PNG

and change it to use the reporting user for example.

Automation

But I don't want to have to do this each time and there will be multiple pbix files, so I wanted to automate the solution. The end result was a VSTS or TFS release process so that I could simply drop the pbix into a git repository, commit my changes, sync them and have the system deploy them automatically.

As with all good ideas, I started with a Google search and found this post by Bill Anton which gave me a good start (I could not get the connection string change to work in my test environment, but this was not required so I didn't really examine why).

I wrote a function that I can use via TFS or VSTS by embedding it in a PowerShell script. The function requires the ReportingServicesTools module which you can get by

Install-Module -Name ReportingServicesTools

The function below is also available via the PowerShell Gallery and you can get it with

Install-Script -Name PublishPBIXFile

The source code is on GitHub

and the code to call it looks like this

$folderName = 'TestFolder'
$ReportServerURI = 'http://localhost/Reports'
$folderLocation = '/'
$pbixfile = 'C:\Temp\test.pbix'
$description = "Descriptions"

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
code1.PNG

which uploads the report to a folder, which it will create if it does not exist. It will then upload the pbix file, overwriting the existing one if it already exists

numbe3r1.PNG

and uses the username and password specified

code2.PNG

If I want to use a domain reporting user instead I can do

$UserName1 = 'TheBeard\ReportingUser'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
and it changes

code4 reporting

If we want to use a SQL Authenticated user then
$UserName1 = 'TheReportingUserOfBeard'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    # credential = $cred
    ConnectionUserName = $UserName1
    Secret             = $Password1

}
Publish-PBIXFile @publishPBIXFileSplat
sql auth.PNG
Excellent, it all works from the command line. You can pass in a credential object as well as a username and password. The reason I enabled username and password? So that I can use TFS or VSTS and store my password as a secret variable.
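
If you would rather pass in a credential object, the splat might look something like this instead, assuming $cred has already been created (for example with Get-Credential):

# Assumes $cred was created beforehand, e.g. $cred = Get-Credential
$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    Credential         = $cred
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
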
Now I simply create a repository which has my pbix files and a PowerShell script and build a quick release process to deploy them whenever there is a change 🙂
The deploy script looks like
[CmdletBinding()]
Param (
    $PBIXFolder,
    $ConnectionStringPassword
)
$VerbosePreference = 'continue'
$ReportServerURI = 'http://TheBeardsAmazingReports/Reports'
Write-Output "Starting Deployment"
function Publish-PBIXFile {
    [CmdletBinding(DefaultParameterSetName = 'ByUserName', SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [string]$FolderName,
        [Parameter(Mandatory = $true)]
        [string]$ReportServerURI,
        [Parameter(Mandatory = $true)]
        [string]$FolderLocation,
        [Parameter(Mandatory = $true)]
        [string]$PBIXFile,
        [Parameter()]
        [string]$Description = "Description of Your report Should go here",
        [Parameter()]
        [ValidateSet('Windows', 'SQL')]
        [string]$AuthenticationType,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$ConnectionUserName,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$Secret,
        [Parameter(Mandatory = $true, ParameterSetName = 'ByCred')]
        [pscredential]$Credential
    )
    $FolderPath = $FolderLocation + $FolderName
    $PBIXName = $PBIXFile.Split('\')[-1].Replace('.pbix', '')
    try {
        Write-Verbose"Creating a session to the Report Server $ReportServerURI"
        # establish session w/ Report Server
        $session = New-RsRestSession-ReportPortalUri $ReportServerURI
        Write-Verbose"Created a session to the Report Server $ReportServerURI"
    }
    catch {
        Write-Warning"Failed to create a session to the report server $reportserveruri"
        Return
    }
    # create folder (optional)
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Creating a folder called $FolderName at $FolderLocation")) {
            $Null = New-RsRestFolder -WebSession $session -RsFolder $FolderLocation -FolderName $FolderName -ErrorAction Stop
        }
    }
    catch [System.Exception] {
        If ($_.Exception.InnerException.Message -eq 'The remote server returned an error: (409) Conflict.') {
            Write-Warning "The folder already exists - moving on"
        }
        else {
            Write-Warning "Failed to create a folder called $FolderName at $FolderLocation on report server $ReportServerURI, but not because it already exists"
            Return
        }
    }
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Uploading the pbix from $PBIXFile to the report server ")) {
            # upload copy of PBIX to new folder
            Write-RsRestCatalogItem -WebSession $session -Path $PBIXFile -RsFolder $FolderPath -Description $Description -Overwrite
        }
    }
    catch {
        Write-Warning "Failed to upload the file $PBIXFile to report server $ReportServerURI"
        Return
    }
    try {
        Write-Verbose"Getting the datasources from the pbix file for updating"
        # get data source object
        $datasources = Get-RsRestItemDataSource-WebSession $session-RsItem "$FolderPath/$PBIXName"
        Write-Verbose"Got the datasources for updating"
    }
    catch {
        Write-Warning"Failed to get the datasources"
        Return
    }
    try {
        Write-Verbose"Updating Datasource"

        foreach ($dataSourcein$datasources) {
            if ($AuthenticationType -eq 'SQL') {
                $dataSource.DataModelDataSource.AuthType = 'UsernamePassword'
            }
            else {
                $dataSource.DataModelDataSource.AuthType = 'Windows'
            }
            if ($Credential -or $ConnectionUserName) {
                if ($Credential) {
                    $UserName = $Credential.UserName
                    $Password = $Credential.GetNetworkCredential().Password
                }
                else {
                    $UserName = $ConnectionUserName
                    $Password = $Secret
                }
                $dataSource.CredentialRetrieval = 'Store'
                $dataSource.DataModelDataSource.Username = $UserName
                $dataSource.DataModelDataSource.Secret = $Password
            }
            if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Updating the data source for the report $PBIXName")) {
                # update data source object on server
                Set-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIXName" -RsItemType PowerBIReport -DataSources $datasource
            }
        }
    }
    catch {
        Write-Warning"Failed to set the datasource"
        Return
    }
    Write-Verbose"Completed Successfully"
}
foreach ($File in (Get-ChildItem $PBIXFolder\*.pbix)) {
    Write-Output"Processing $($File.FullName)"
    ## to enable further filtering later
    if ($File.FullName -like '*') {
        $folderName = 'ThePlaceForReports'
        $folderLocation = '/'
        $UserName = 'TheBeard\ReportingUser'
        $Password = $ConnectionStringPassword
        $pbixfile = $File.FullName
    }
    if ($File.FullName -like '*dbachecks*') {
        $description = "This is the morning daily checks file that....... more info"
    }
    if ($File.FullName -like '*TheOtherReport*') {
        $description = "This is hte other report, it reports others"
    }
    $publishPBIXFileSplat = @{
        ReportServerURI    = $ReportServerURI
        folderLocation     = $folderLocation
        description        = $description
        AuthenticationType = 'Windows'
        pbixfile           = $pbixfile
        folderName         = $folderName
        ConnectionUserName = $UserName
        Secret             = $Password
        Verbose            = $true
    }
    $Results = Publish-PBIXFile @publishPBIXFileSplat
    Write-Output $Results
}
Although the function does not need to be embedded in the script and could be deployed in a module, I have included it here to make it easier for people to use quickly.
Then create a PowerShell step in VSTS or TFS and call the script with the parameters as shown below, and the Power BI files auto deploy to the Power BI Report Server.
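
The call might look something like this rough sketch; the script name, folder path and secret variable name are all placeholders that you would swap for whatever you have used in your own repository and variables:

# Hypothetical invocation from a VSTS/TFS PowerShell task
# DeployReports.ps1, the folder path and $(ReportingUserPassword) are placeholder names
.\DeployReports.ps1 -PBIXFolder '$(System.DefaultWorkingDirectory)\Reports' -ConnectionStringPassword '$(ReportingUserPassword)'
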
vsts.PNG
and I have my process complete 🙂
Happy Automating 🙂

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL versions, so I have installed

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programmes I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my Docker location from my C:\ drive to my E:\ drive so I didn't run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding

{"data-root": "E:\\containers"}
and restarting the docker service which created folders like this
01 - folders.png
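
Restarting the Docker service can be done from an elevated PowerShell prompt; a minimal sketch:

# Restart the Docker engine so it picks up the new data-root location (run elevated)
Restart-Service -Name docker
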
Then I ran

docker volume create SQLBackups

to create a volume to hold the backups that I could mount on the containers

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Get-ChildItem $Home\Downloads\AdventureWorks* | Copy-Item -Destination E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the DockerHub I ran

docker pull microsoft/mssql-server-windows-developer:latest

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub  so it is just a case of running

docker pull dbafromthecold/sqlserver2016dev:sp1
docker pull dbafromthecold/sqlserver2014dev:sp2
docker pull dbafromthecold/sqlserver2012dev:sp4
and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as

docker run -d -p ExposedPort:InternalPort --name NAME -v VolumeName:LocalFolder -e sa_password=THEPASSWORD -e ACCEPT_EULA=Y IMAGENAME

so all I needed to run to create four SQL containers, one of each version, was

docker run -d -p 15789:1433 --name 2017 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y microsoft/mssql-server-windows-developer
docker run -d -p 15788:1433 --name 2016 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2016dev:sp1
docker run -d -p 15787:1433 --name 2014 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2014dev:sp2
docker run -d -p 15786:1433 --name 2012 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2012dev:sp4

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂

02 - creating containers.png

03 - Containers at the ready.png

Storing Credentials

This is not something I would do in a Production environment but I save my credentials using this method that Jaap Brasser b | t shared here

Get-Credential | Export-Clixml -Path $HOME\Documents\sa.cred

which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using

$cred = Import-Clixml $HOME\Documents\sa.cred

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable

$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name

and the container connection strings, which are the hostname and the port number

$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'

then I can restore the databases with dbatools, using a switch statement on the version, which I get from the NameLevel property of Get-DbaSqlBuildReference.

$cred = Import-Clixml $HOME\Documents\sa.cred
$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name
$containers.ForEach{
    $Container = $Psitem
    $NameLevel = (Get-DbaSqlBuildReference -SqlInstance $Container -SqlCredential $cred).NameLevel
    switch ($NameLevel) {
        2017 {
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path C:\sqlbackups\ -useDestinationDefaultDirectories -WithReplace | Out-Null
            Write-Verbose -Message "Restored Databases on 2017"
        }
        2016 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2016"
        }
        2014 {
            $Files = $Filenames.Where{$PSitem -notlike '*2017*' -and $Psitem -notlike '*2016*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2014"
        }
        2012 {
            $Files = $Filenames.Where{$PSitem -like '*2012*'}.ForEach{'C:\sqlbackups\' + $Psitem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2012"
        }
        }
        Default {}
    }
}

I need to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container.

As Get-DbaDatabase gives the container ID as the Computer Name, I have highlighted each container below

04 - databases.png

That is how easy it is to create a number of SQL containers of differing versions for your presentation or exploration needs.

Happy Automating!

Version Update, Code Signing and publishing to the PowerShell Gallery with VSTS

At the fabulous PowerShell Conference EU I presented about Continuous Delivery to the PowerShell Gallery with VSTS and explained how we use VSTS to enable CD for dbachecks. We even released a new version during the session 🙂

So how do we achieve this?

We have a few steps

  • Create a project and link to our GitHub
  • Run unit tests with Pester to make sure that our code is doing what we expect.
  • Update our module version and commit the change to GitHub
  • Sign our code with a code signing certificate
  • Publish to the PowerShell Gallery

Create Project and link to GitHub

First you need to create a VSTS project by going to https://www.visualstudio.com/. This is free for up to 5 users, with 1 concurrent CI/CD queue limited to a maximum of 60 minutes run time, which should be more than enough for your PowerShell module.

01 - sign up.png

Click on Get Started for free under Visual Studio Team Services and fill in the required information. Then on the front page click new project

02 - New Project.png

Fill in the details and click create

03 - create project.png

Click on builds and then new definition

04- builds.png

Next you need to link your project to your GitHub (or other source control provider's) repository

05 - github auth.png

You can either authorise with OAuth or you can provide a PAT token following the instructions here. Once that is complete choose your repo. Save the PAT as you will need it later in the process!

06 - choose repo.png

and choose the branch that you want this build definition to run against.

07 branch.png

I chose to run the Unit Tests when a PR was merged into the development branch. I will then create another build definition for the master branch to sign the code and update module version. This enables us to push several PRs into the development branch and create a single release for the gallery.

Then I start with an empty process

08 - empty process.png

and give it a suitable name

09 - name it.png

I chose the hosted queue, but you can download an agent to your build server if you need to do more or your integration tests require access to other resources not available on the hosted agent.

Run Unit Tests with Pester

We have a number of unit tests in our tests folder in dbachecks, so we want to run them to ensure that everything is as it should be and the new code will not break existing functionality (and, for dbachecks, the format of the Power BI file).

You can use the Pester Test Runner Build Task from the folk at Black Marble by clicking on the + sign next to Phase 1 and searching for Pester

10 - Pester task runner.png

You will need to click Get It Free to install it and then click Add to add the task to your build definition. You can pretty much leave it as default if you wish and Pester will run all of the *.Tests.ps1 files that it finds in the directory where it downloads the GitHub repo, which is referred to using the variable $(Build.SourcesDirectory). It will then output the results to an XML file called Test-Pester.XML ready for publishing.

However, as dbachecks has a number of dependent modules, this task was not suitable. I spoke with Chris Gardner b | t from Black Marble at the PowerShell Conference and he says that this can be resolved, so look out for the update. Chris is a great guy and always willing to help; you can often find him in the PowerShell Slack channel answering questions and helping people.

But as you can use PowerShell in VSTS tasks, this is not a problem although you need to write your PowerShell using try catch to make sure that your task fails when your PowerShell errors. This is the code I use to install the modules

$ErrorActionPreference = 'Stop'

# Set location to module home path in artifacts directory
try {
    Set-Location $(Build.SourcesDirectory)
    Get-ChildItem
}
catch {
    Write-Error "Failed to set location"

}

# Get the Module versions
Install-Module Configuration -Scope CurrentUser -Force
$Modules = Get-ManifestValue -Path .\dbachecks.psd1 -PropertyName RequiredModules

$PesterVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'Pester'}[0].Get_Item('ModuleVersion')
$PSFrameworkVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'PSFramework'}[0].Get_Item('ModuleVersion')
$dbatoolsVersion = $Modules.Where{$_.Get_Item('ModuleName') -eq 'dbatools'}[0].Get_Item('ModuleVersion')

# Install Pester
try {
    Write-Output "Installing Pester"
    Install-Module Pester  -RequiredVersion $PesterVersion  -Scope CurrentUser -Force -SkipPublisherCheck
    Write-Output "Installed Pester"

}
catch {
    Write-Error "Failed to Install Pester $($_)"
}
# Install PSFramework
try {
    Write-Output "Installing PSFramework"
    Install-Module PSFramework  -RequiredVersion $PsFrameworkVersion  -Scope CurrentUser -Force 
    Write-Output "Installed PSFramework"

}
catch {
    Write-Error "Failed to Install PSFramework $($_)"
}
# Install dbatools
try {
    Write-Output "Installing dbatools"
    Install-Module dbatools  -RequiredVersion $dbatoolsVersion  -Scope CurrentUser -Force 
    Write-Output "Installed dbatools"

}
catch {
    Write-Error "Failed to Install dbatools $($_)"
}

# Add current folder to PSModulePath
try {
    Write-Output "Adding local folder to PSModulePath"
    $ENV:PSModulePath = $ENV:PSModulePath + ";$pwd"
    Write-Output "Added local folder to PSModulePath"    
    $ENV:PSModulePath.Split(';')
}
catch {
    Write-Error "Failed to add $pwd to PSModulePAth - $_"
}

I use the Configuration module from Joel Bennett to get the versions of the required modules from the manifest and then add the current path to $ENV:PSModulePath so that the modules will be imported. I added this step because the modules did not import correctly without it.

Once I have the modules I can then run Pester as follows

try {
    Write-Output "Importing dbachecks"
    Import-Module .\dbachecks.psd1
    Write-Output "Imported dbachecks"
}
catch {
    Write-Error "Failed to import dbachecks $($_)"
}
$TestResults = Invoke-Pester .\tests -ExcludeTag Integration,IntegrationTests  -Show None -OutputFile $(Build.SourcesDirectory)\Test-Pester.XML -OutputFormat NUnitXml -PassThru

if ($TestResults.failedCount -ne 0) {
    Write-Error "Pester returned errors"
}

As you can see I import the dbachecks module from the local folder, run Invoke-Pester and output the results to an XML file and check that there are no failing tests.

Whether you use the task or PowerShell the next step is to Publish the test results so that they are displayed in the build results in VSTS.

Click on the + sign next to Phase 1 and search for Publish

12 - publish test results.png

 

Choose the Publish Test Results task and leave everything as default unless you have renamed the xml file. This means that on the summary page you will see some test results

 

13 - Test on sumary page.png

and on the tests tab you can see more detailed information and drill down into the tests

14 - detailed test report.png

Trigger

The next step is to trigger a build when a commit is pushed to the development branch. Click on Triggers and tick enable continuous integration

15 Trigger.png

Saving the Build Definition

I would normally save the build definition regularly and ensure that there is a good message in the comment. I always tell clients that this is like a commit message for your build process so that you can see the history of the changes for the build definition.

You can see the history on the edit tab of the build definition

16 - build history.png

If you want to compare or revert the build definition this can be done using the hamburger menu as shown below.

17 - build history compare revert.png

Update the Module Version

Now we need to create a build definition for the master branch to update the module version and sign the code ready for publishing to the PowerShell Gallery when we commit or merge to master

Create a new build definition as above but this time choose the master branch

18 - master build.png

Again choose an empty process and name it sensibly, click the + sign next to Phase 1 and search for PowerShell

19 - PowerShell task.png

I change the version to 2 and use this code. Note that the commit message has ***NO_CI*** in it. Putting this in a commit message tells VSTS not to trigger a build for this commit.

$manifest = Import-PowerShellDataFile .\dbachecks.psd1 
[version]$version = $Manifest.ModuleVersion
Write-Output "Old Version - $Version"
# Add one to the build of the version number
[version]$NewVersion = "{0}.{1}.{2}" -f $Version.Major, $Version.Minor, ($Version.Build + 1) 
Write-Output "New Version - $NewVersion"
# Update the manifest file
try {
    Write-Output "Updating the Module Version to $NewVersion"
    $path = "$pwd\dbachecks.psd1"
    (Get-Content .\dbachecks.psd1) -replace $version, $NewVersion | Set-Content .\dbachecks.psd1 -Encoding string
    Write-Output "Updated the Module Version to $NewVersion"
}
catch {
    Write-Error "Failed to update the Module Version - $_"
}

try {
    Write-Output "Updating GitHub"
git config user.email "mrrobsewell@outlook.com"
git config user.name "SQLDBAWithABeard"
git add .\dbachecks.psd1
git commit -m "Updated Version Number to $NewVersion ***NO_CI***"

git push https://$(RobsGitHubPAT)@github.com/sqlcollaborative/dbachecks.git HEAD:master
Write-Output "Updated GitHub "

}
catch {
    $_ | Fl -Force
    Write-Output "Failed to update GitHub"
}

I use Get-Content/Set-Content as I had errors with Update-ModuleManifest, but Adam Murray g | t uses this code to update the version using the BuildID from VSTS

$newVersion = New-Object version -ArgumentList 1, 0, 0, $env:BUILD_BUILDID
$Public  = @(Get-ChildItem -Path $ModulePath\Public\*.ps1)
$Functions = $public.basename
Update-ModuleManifest -Path $ModulePath\$ModuleName.psd1 -ModuleVersion $newVersion -FunctionsToExport $Functions

You can commit your change by adding your PAT token as a variable under the variables tab. Don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

20 - variables.png

Sign the code with a certificate

The SQL Collaborative uses a code signing certificate from DigiCert, who allow MVPs to use one for free to sign their code for open source projects. Thank you! We had to upload the certificate to the secure files store in the VSTS library. Click on Library, then Secure files and the blue +Secure File button

21 - secure file store.png

You also need to add the password as a variable under the variables tab as above. Again don’t forget to tick the padlock to make it a secret so it is not displayed in the logs

Then you need to add a task to download the secure file. Click on the + sign next to Phase 1 and search for secure

22 download secure file.png

choose the file from the drop down

23 - download secure file.png

Next we need to import the certificate and sign the code. I use a PowerShell task for this with the following code

$ErrorActionPreference = 'Stop'
# read in the certificate from a pre-existing PFX file
# I have checked this with @IISResetMe and this does not go in the store only memory
$cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new("$(Agent.WorkFolder)\_temp\dbatools-code-signing-cert.pfx","$(CertPassword)")

try {
    Write-Output "Signing Files"
    # find all scripts in your module...
Get-ChildItem  -Filter *.ps1 -Include *.ps1 -Recurse -ErrorAction SilentlyContinue |
# ...that do not have a signature yet...
Where-Object {
  ($_ | Get-AuthenticodeSignature).Status -eq 'NotSigned'
  } |
# and apply one
Set-AuthenticodeSignature -Certificate $cert
Write-Output "Signed Files"
}
catch {
    $_ | Format-List -Force
    Write-Error "Failed to sign scripts"
}

which will import the certificate into memory and sign all of the scripts in the module folder.

Publish your artifact

The last step of the master branch build publishes the artifact (your signed module) to VSTS ready for the release task. Again, click the + sign next to Phase one and choose the Publish Artifact task not the deprecated copy and publish artifact task and give the artifact a useful name

24 - publish artifact.png

Don’t forget to set the trigger for the master build as well following the same steps as the development build above

Publish to the PowerShell Gallery

Next we create a release to trigger when there is an artifact ready and publish to the PowerShell Gallery.

Click the Releases tab and New Definition

25 - Reelase creation

Choose an empty process and name the release definition appropriately

26 Release name empty process.png

Now click on the artifact and choose the master build definition. If you have not run a build you will get an error like the one below, but don't worry, click add.

27 - add artifact.png

Click on the lightning bolt next to the artifact to open the continuous deployment trigger

28 - Choose lightning bolt

and turn on Continuous Deployment so that when an artifact has been created with an updated module version and signed code it is published to the gallery

28 - Continuous deployment trigger

Next, click on the environment and name it appropriately and then click on the + sign next to Agent Phase and choose a PowerShell step

29 - PowerShell Publish step

You may wonder why I don't choose the PowerShell Gallery Packager task. There are two reasons. First, I need to install the required modules for dbachecks (dbatools, PSFramework, Pester) prior to publishing, and second, it appears that the API key is stored in plain text

30 - PowerShell Gallery Publisher

I save my API key for the PowerShell Gallery as a variable again making sure to tick the padlock to make it a secret

31 - API Key variable.png

and then use the following code to install the required modules and publish the module to the gallery

Install-Module dbatools -Scope CurrentUser -Force
Install-Module Pester -Scope CurrentUser -SkipPublisherCheck -Force
Install-Module PSFramework -Scope CurrentUser -Force

Publish-Module -Path "$(System.DefaultWorkingDirectory)/Master - Version Update, Signing and Publish Artifact/dbachecks" -NuGetApiKey "$(GalleryApiKey)"

That's it 🙂

Now we have a process that will automatically run our Pester tests when we commit or merge to the development branch and then update our module version number and sign our code and publish to the PowerShell Gallery when we commit or merge to the master branch

Added Extra – Dashboard

I like to create dashboards in VSTS to show the progress of the various definitions. You can do this under the dashboard tab. Click edit and choose or search for widgets and add them to the dashboard

32 - Dashboard.png

Added Extra – Badges

You can also enable badges for displaying on your readme in GitHub (or VSTS). For the build definitions this is under the options tab.

33 - Build badges

For the release definitions, click the environment and then options and integrations

34 - Release Badge

You can then copy the URL and use it in your readme like this on dbachecks

35 - dbachecks readme badges.png

The SQL Collaborative has joined the preview of enabling public access to VSTS projects, as detailed in this blog post, so you can see the dbachecks build and release without the need to log in, and soon the dbatools process as well

I hope you found this useful and if you have any questions or comments please feel free to contact me

 

Happy Automating!

Checking Availability Groups with dbachecks

It’s been 45 days since we released dbachecks

Since then there have been 25 releases to the PowerShell Gallery!! Today release 1.1.119 was released 🙂 There have been over 2000 downloads of the module already.

In the beginning we had 80 checks and 108 configuration items, today we have 84 checks and 125 configuration items!

If you have already installed dbachecks it is important to make sure that you update regularly. You can do this by running

Update-Module dbachecks

If you want to try dbachecks, you can install it from the PowerShell Gallery by running

Install-Module dbachecks # -Scope CurrentUser # if not running as admin

You can read more about installation and read a number of blog posts about using different parts of dbachecks at this link https://dbatools.io/installing-dbachecks/

HADR Tests

Today we updated the HADR tests to add the capability to test multiple availability groups and fix a couple of bugs

Once you have installed dbachecks you will need to set some configuration so that you can perform the tests. You can see all of the configuration items and their values using

Get-DbcConfig | Out-GridView

get-config.png

You can set the values with the Set-DbcConfig command. It has intellisense to make things easier 🙂 To set the values for the HADR tests

Set-DbcConfig -Name app.cluster -Value sql1
Set-DbcConfig -Name app.computername -Value sql0,sql1
Set-DbcConfig -Name app.sqlinstance -Value sql0,sql1
Set-DbcConfig -Name domain.name -Value TheBeard.Local
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true

  • app.cluster requires one of the nodes of the cluster.
  • app.computername requires the Windows computer names of the machines to run operating system checks against
  • app.sqlinstance requires the instance names of the SQL instances that you want to run SQL checks against (these are default instances, but it will accept SERVER\INSTANCE)
  • domain.name requires the domain name the machines are part of
  • skip.hadr.listener.pingcheck is a boolean value which defines whether to skip the listener ping check or not. As this is in Azure I am skipping the check by setting the value to $true
  • policy.hadr.tcpport defaults to 1433, but you can also set this configuration if your SQL is using a different port, as shown in the example after this list
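
For example, if your SQL instances are listening on a non-default port, setting it would look something like this (the port number below is purely illustrative):

# Only needed when not using the default port 1433 - 14433 is an illustrative value
Set-DbcConfig -Name policy.hadr.tcpport -Value 14433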

NOTE – You can find all the configuration items that can skip tests by running

Get-DbcConfig -Name skip*

skips.png

Now we have set the configuration (For the HADR checks – There are many more configurations for other checks that you can set) you can run the checks with

Invoke-DbcCheck -Check HADR

check results.png

This runs the following checks
  • Each node on the cluster should be up
  • Each resource on the cluster should be online
  • Each SQL instance should be enabled for Always On
  • Connection check for the listener and each node
    • Should be pingable (unless skip.hadr.listener.pingcheck is set to true)
    • Should be able to run SQL commands
    • Should be the correct domain name
    • Should be using the correct tcpport
  • Each replica should not be in unknown state
  • Each synchronous replica should be synchronised
  • Each asynchronous replica should be synchronising
  • Each database should be synchronised (or synchronising) on each replica
  • Each database should be failover ready on each replica
  • Each database should be joined to the availability group on each replica
  • Each database should not be suspended on each replica
  • Each node should have the AlwaysOn_Health extended event
  • Each node should have the AlwaysOn_Health extended event running
  • Each node should have the AlwaysOn_Health extended event set to auto start

(Apologies to folks over the pond, I use the Queen's English 😉 )

This is good for us to be able to run this check at the command line but we can do more.

We can export the results and display them with PowerBi. Note we need to add -PassThru so that the results go through the pipeline and that I used -Show Fails so that only the titles of the Describe and Context blocks and any failing tests are displayed to the screen

Invoke-DbcCheck -Check HADR -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment HADR-Test
Start-DbcPowerBi

results.png

This will create a file at C:\Windows\Temp\dbachecks and open the Power BI report. You will need to refresh the data in the report and then you will see

dbachecks.png

Excellent, everything passed 🙂

Saving Configuration for reuse

We can save our configuration using Export-DbcConfig which will export the configuration to a json file

Export-DbcConfig -Path Git:\PesterTests\MyHADRTestsForProd.json

so that we can run this particular set of tests with this configuration by importing the configuration using Import-DbcConfig

Import-DbcConfig -Path Git:\PesterTests\MyHADRTestsForProd.json
Invoke-DbcCheck -Check HADR

In this way you can set up different check configurations for different use cases. This also enables you to make use of the checks in your CI/CD process. For example, I have a GitHub repository for creating a domain, a cluster and a SQL 2017 availability group using VSTS. I have saved a dbachecks configuration to my repository and as part of my build I can import that configuration, run the checks and output them to XML for consumption by the publish test results task of VSTS

After copying the configuration to the machine, I run

Import-DbcConfig -Path C:\Windows\Temp\FirstBuild.json
Invoke-DbcCheck -AllChecks -OutputFile PesterTestResultsdbachecks.xml -OutputFormat NUnitXml

in my build step and then use the publish test results task and VSTS does the rest 🙂

VSTS results.png