Deploying to a Power BI Report Server with PowerShell

Just a quick post to share some code that I used to solve a problem I had recently.

I needed to automate the deployment of some Power BI reports to a Power BI Report Server (PBRS) using TFS. I had some modified historical validation dbachecks .pbix files whose deployment I wanted to automate, and I wanted to enable the client to quickly and simply deploy the reports as needed.

The manual way

It is always a good idea to understand how to do a task manually before automating it. To deploy to PBRS you need to use the version of Power BI Desktop optimised for Power BI Report Server. There are instructions here. Then it is easy to deploy to the PBRS by clicking File, then Save As, and choosing Power BI Report Server.


If I then want to set the data source to use a different set of credentials, I navigate to the folder that holds the report in PBRS, click the hamburger menu, and choose Manage. There I can alter the User Name and Password or the type of connection by clicking on DataSources, and change it to use the reporting user, for example.

Automation

But I don't want to have to do this each time, and there will be multiple .pbix files, so I wanted to automate the solution. The end result was a VSTS or TFS release process so that I could simply drop the .pbix into a git repository, commit my changes, sync them, and have the system deploy them automatically.

As with all good ideas, I started with a Google search and found this post by Bill Anton which gave me a good start. (I could not get the connection string change to work in my test environment, but as it was not required I didn't really examine why.)

I wrote a function that I can use via TFS or VSTS by embedding it in a PowerShell script. The function requires the ReportingServicesTools module, which you can get by running

Install-Module -Name ReportingServicesTools

The function below is also available via the PowerShell Gallery, and you can get it with

Install-Script -Name PublishPBIXFile

The source code is on GitHub, and the code to call the function looks like this

$folderName = 'TestFolder'
$ReportServerURI = 'http://localhost/Reports'
$folderLocation = '/'
$pbixfile = 'C:\Temp\test.pbix'
$description = "Descriptions"
# $UserName1 and $Password1 hold the data source credentials; set them beforehand (in TFS/VSTS they come from pipeline variables)

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat

which uploads the report to a folder, which it will create if it does not exist. It will then upload the .pbix file, overwriting the existing one if it already exists, and set the data source to use the username and password specified.

If I want to use a domain reporting user instead, I can do

$UserName1 = 'TheBeard\ReportingUser'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
and it changes the data source to use those credentials.

If we want to use a SQL Authenticated user then
$UserName1 = 'TheReportingUserOfBeard'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    # credential = $cred
    ConnectionUserName = $UserName1
    Secret             = $Password1

}
Publish-PBIXFile @publishPBIXFileSplat
Excellent, it all works from the command line. You can pass in a credential object as well as a username and password. Why did I enable username and password? So that I can use TFS or VSTS and store my password as a secret variable.
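
For example, a sketch of a call that uses the Credential parameter instead (prompting for the credential here; the other values are the same as above):

$cred = Get-Credential
$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    Credential         = $cred
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat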
Now I simply create a repository which has my .pbix files and a PowerShell script, and build a quick release process to deploy them whenever there is a change 🙂
The deploy script looks like this
[CmdletBinding()]
Param (
    $PBIXFolder,
    $ConnectionStringPassword
)
$VerbosePreference = 'continue'
$ReportServerURI = 'http://TheBeardsAmazingReports/Reports'
Write-Output "Starting Deployment"
function Publish-PBIXFile {
    [CmdletBinding(DefaultParameterSetName = 'ByUserName', SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [string]$FolderName,
        [Parameter(Mandatory = $true)]
        [string]$ReportServerURI,
        [Parameter(Mandatory = $true)]
        [string]$FolderLocation,
        [Parameter(Mandatory = $true)]
        [string]$PBIXFile,
        [Parameter()]
        [string]$Description = "Description of Your report Should go here",
        [Parameter()]
        [ValidateSet('Windows', 'SQL')]
        [string]$AuthenticationType,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$ConnectionUserName,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$Secret,
        [Parameter(Mandatory = $true, ParameterSetName = 'ByCred')]
        [pscredential]$Credential
    )
    $FolderPath = $FolderLocation + $FolderName
    $PBIXName = $PBIXFile.Split('\')[-1].Replace('.pbix', '')
    try {
        Write-Verbose "Creating a session to the Report Server $ReportServerURI"
        # establish session w/ Report Server
        $session = New-RsRestSession -ReportPortalUri $ReportServerURI
        Write-Verbose "Created a session to the Report Server $ReportServerURI"
    }
    catch {
        Write-Warning "Failed to create a session to the report server $ReportServerURI"
        Return
    }
    # create folder (optional)
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Creating a folder called $FolderName at $FolderLocation")) {
            $Null = New-RsRestFolder -WebSession $session -RsFolder $FolderLocation -FolderName $FolderName -ErrorAction Stop
        }
    }
    catch [System.Exception] {
        If ($_.Exception.InnerException.Message -eq 'The remote server returned an error: (409) Conflict.') {
            Write-Warning "The folder already exists - moving on"
        }
        else {
            Write-Warning "Failed to create a folder called $FolderName at $FolderLocation on report server $ReportServerURI (and not because it already exists)"
            Return
        }
    }
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Uploading the pbix from $PBIXFile to the report server ")) {
            # upload copy of PBIX to new folder
            Write-RsRestCatalogItem -WebSession $session -Path $PBIXFile -RsFolder $FolderPath -Description $Description -Overwrite
        }
    }
    catch {
        Write-Warning "Failed to upload the file $PBIXFile to report server $ReportServerURI"
        Return
    }
    try {
        Write-Verbose "Getting the datasources from the pbix file for updating"
        # get data source object
        $datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIXName"
        Write-Verbose "Got the datasources for updating"
    }
    catch {
        Write-Warning "Failed to get the datasources"
        Return
    }
    try {
        Write-Verbose "Updating Datasource"

        foreach ($dataSource in $datasources) {
            if ($AuthenticationType -eq 'SQL') {
                $dataSource.DataModelDataSource.AuthType = 'UsernamePassword'
            }
            else {
                $dataSource.DataModelDataSource.AuthType = 'Windows'
            }
            if ($Credential -or $ConnectionUserName) {
                if ($Credential) {
                    $UserName = $Credential.UserName
                    $Password = $Credential.GetNetworkCredential().Password
                }
                else {
                    $UserName = $ConnectionUserName
                    $Password = $Secret
                }
                $dataSource.CredentialRetrieval = 'Store'
                $dataSource.DataModelDataSource.Username = $UserName
                $dataSource.DataModelDataSource.Secret = $Password
            }
            if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Updating the data source for the report $PBIXName")) {
                # update data source object on server
                Set-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIXName" -RsItemType PowerBIReport -DataSources $dataSource
            }
        }
    }
    catch {
        Write-Warning "Failed to set the datasource"
        Return
    }
    Write-Verbose "Completed Successfully"
}
foreach ($File in (Get-ChildItem $PBIXFolder\*.pbix)) {
    Write-Output "Processing $($File.FullName)"
    ## to enable further filtering later
    if ($File.FullName -like '*') {
        $folderName = 'ThePlaceForReports'
        $folderLocation = '/'
        $UserName = 'TheBeard\ReportingUser'
        $Password = $ConnectionStringPassword
        $pbixfile = $File.FullName
    }
    if ($File.FullName -like '*dbachecks*') {
        $description = "This is the morning daily checks file that....... more info"
    }
    if ($File.FullName -like '*TheOtherReport*') {
        $description = "This is the other report, it reports others"
    }
    $publishPBIXFileSplat = @{
        ReportServerURI    = $ReportServerURI
        folderLocation     = $folderLocation
        description        = $description
        AuthenticationType = 'Windows'
        pbixfile           = $pbixfile
        folderName         = $folderName
        ConnectionUserName = $UserName
        Secret             = $Password
        Verbose            = $true
    }
    $Results = Publish-PBIXFile @publishPBIXFileSplat
    Write-Output $Results
}
Although the function does not need to be embedded in the script and can be deployed in a module, I have included it here to make it easier for people to use quickly.
Then create a PowerShell step in VSTS or TFS and call the script with the parameters as shown below, and the Power BI files auto-deploy to the Power BI Report Server.
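The arguments for that PowerShell step might look something like this (a sketch; the folder path and the secret variable name are hypothetical and will match your own release definition):

-PBIXFolder "$(System.DefaultWorkingDirectory)/ReportRepo" -ConnectionStringPassword "$(ReportingUserPassword)"
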
And I have my process complete 🙂
Happy Automating 🙂

dbachecks – Dark Mode Historical Validation Power BI

In my last post I showed how you can save the results of dbachecks to a database and created a Power BI report. Inspired by Frank Henninger in the #dbachecks Slack channel and Shawn Melton, who explained the difficulties with red/green colour blindness, I then created this one 🙂


You can find it in my GitHub and have a play with it below

Happy Validating!

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks, and one of the questions I was asked was “will you write a command to put the results of the checks into a database for historical reporting?”

The answer is no, and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object, and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will have. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $sql0,$SQl1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $sql0,$SQl1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, PseudoSimple,SPN, TestLastBackupVerifyOnly,IdentityUsage,SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier.

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter of Invoke-DbcCheck. I will run the checks again and save the results to a variable
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results.

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables, one for the summary and one for the details.
This works absolutely fine, and I could continue to add test results in this fashion, but it has no date property, so it is not very useful for reporting.

Create tables and triggers

This is one way of doing it. I am not sure it is the best way, but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better/different solution please blog about it and reference it in the comments below.

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today's date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO
and for the details I do the same thing. A details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A stage table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, and Truncate and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. Then I can get the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False

Which means that I can now query some of this data and also create Power BI reports for it.
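
For example, a quick sketch of pulling the history back for ad-hoc querying (the instance and database names are the ones used elsewhere in this post):

$query = "SELECT TestDate, PassedCount, FailedCount, SkippedCount FROM dbachecks.Prod_dbachecks_summary ORDER BY TestDate"
Invoke-DbaSqlQuery -SqlInstance $sql0 -Database ValidationResults -Query $query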

To enable me to have results for the groups in dbachecks, I have to do a little bit of extra manipulation. I can add all of the checks to the database using

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren job names are configuration items, I need to update the values for those checks, which I can do as follows

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power BI report in my GitHub, which also has the code from this blog post.

Then you just need to open it in Power BI Desktop and

Click Edit Queries
Click Data Source Settings
Click Change Source
Change the Instance and Database names


Then you will have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the checks and the import today, click on one of the groups under “Today’s Checks by Group”.

This enables me to filter the results and see what has happened in the past, so I can filter by one instance, by a group of tests, or even by a group of tests for an instance.

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the Power BI in my GitHub.

Happy Validating!

dbachecks – Configuration Deep Dive

Today is the day that we have announced dbachecks, a PowerShell module enabling you to validate your SQL Instances. You can read more about it here, where you can learn how to install it and see some simple use cases.

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same, but the value might be different for different applications, environments, clients, teams, domains, etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so you can create different configs for different use cases.
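
For example, a minimal sketch of the database owner scenario (the owner value is illustrative):

# Set the expected owner once for this session or config
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# The same check now validates against the new value
Invoke-DbcCheck -Check ValidDatabaseOwner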

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of the config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name – Description
agent.databasemailprofile – Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail – Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname – Name of the DBA Operator in SQL Agent
agent.failsafeoperator – Email address of the DBA Operator in SQL Agent
app.checkrepos – Where Pester tests/checks are stored
app.computername – List of Windows Servers that Windows-based tests will run against
app.localapp – Persisted files live here
app.maildirectory – Files for mail are stored here
app.sqlcredential – The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance – List of SQL Server instances that SQL-based tests will run against
app.wincredential – The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck – Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller – The domain controller to process your requests
domain.name – The Active Directory domain that your server is a part of
domain.organizationalunit – The OU that your server should be a part of
mail.failurethreshhold – Number of errors that must be present to generate an email report
mail.from – Email address the email reports should come from
mail.smtpserver – Store the name of the smtp server to send email reports
mail.subject – Subject line of the email report
mail.to – Email address to send the report to
policy.backup.datadir – Destination server data directory
policy.backup.defaultbackupcompreesion – Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours – Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays – Maximum number of days before Full Backups are considered outdated
policy.backup.logdir – Destination server log directory
policy.backup.logmaxminutes – Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod – The number of hours a newly created database is allowed to not have backups
policy.backup.testserver – Destination server for backup tests
policy.build.warningwindow – The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme – Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount – Number of times to ping a server to establish average response time
policy.connection.pingmaxms – Maximum response time in ms
policy.dacallowed – DAC should be allowed $true or disallowed $false
policy.database.autoclose – Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics – Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink – Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics – Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously – Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance – Percentage tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb – Databases to exclude from the file growth check
policy.database.filegrowthtype – Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue – The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount – The number of Log files expected on a database
policy.database.logfilesizecomparison – How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage – Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf – Max virtual log files
policy.dbcc.maxdays – Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree – Percent disk free
policy.dump.maxcount – Maximum number of expected dumps
policy.hadr.tcpport – The TCPPort for the HADR check
policy.identity.usagepercent – Maximum percentage of max of identity column
policy.invaliddbowner.excludedb – Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name – The database owner account should not be this user
policy.network.latencymaxms – Max network latency average
policy.ola.commandlogenabled – Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled – Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database – The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled – Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled – Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed – Checks to see if Ola Hallengren's solution is installed
policy.ola.outputfilecleanupenabled – Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled – Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled – Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled – Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled – Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention – Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled – Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled – Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled – Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled – Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention – Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled – Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled – Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention – Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled – Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled – Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled – Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled – Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled – Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled – Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention – Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled – Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation – OLE Automation should be enabled $true or disabled $false
policy.pageverify – Page verify option should be set to this value
policy.recoverymodel.excludedb – Databases to exclude from standard recovery model check
policy.recoverymodel.type – Standard recovery model
policy.storage.backuppath – Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb – Databases to exclude from valid dbowner checks
policy.validdbowner.name – The database owner account should be this user
policy.whoisactive.database – Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession – List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession – List of XE Sessions that should not be running.
policy.xevent.validrunningsession – List of XE Sessions that can be running.
skip.backup.testing – Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping – Skip the ping check for connectivity
skip.connection.remoting – Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled – Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest – Skip the logfilecount test
skip.datafilegrowthdisabled – Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck – Skip data purity check in last good dbcc command
skip.diffbackuptest – Skip the Differential backup test
skip.logfilecounttest – Skip the logfilecount test
skip.logshiptesting – Skip the logshipping test
skip.tempdb1118 – Don't run test for Trace Flag 1118
skip.tempdbfilecount – Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent – Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax – Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc – Don't run test for Temp Database Files on C

So there are a lot of configurations that you can use. A lot are already set by default, but you can configure all of them with the values that you need for your own estate.

The configurations are stored in the registry at HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\
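
If you are curious, you can peek at the persisted values with the standard registry provider (a sketch; the key layout can vary between PSFramework versions):

Get-ChildItem HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\ -Recurse |
    ForEach-Object { Get-ItemProperty -Path $_.PSPath }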

First Configurations

First I would run this so that you can see all of the configs in a separate window (note this does not work on PowerShell v6)

Get-DbcConfig | Out-GridView

Let's start with the first configurations that you will want to set. These should be the Instances and the Hosts that you want to check.

You can get the value of the configuration item using

Get-DbcConfigValue -Name app.sqlinstance

As you can see, nothing is returned, so we have no instances configured at present. We have added tab completion to the Name parameter so that you can easily find the right one

If you want to look at more information about the configuration item you can use

Get-DbcConfig -Name app.sqlinstance

which shows you the name, current value and the description

So let's set our first configuration, for our SQL instance, to localhost. I have included a video so you can see the auto-complete in action as well

Set-DbcConfig -Name app.sqlinstance -Value localhost

This configuration will be used for any SQL-based checks but not for any Windows-based ones like Services, PowerPlan, SPN, DiskSpace, or Cluster, so let's set the app.computername configuration as well.
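
For example (assuming the same local test machine as the instance above):

Set-DbcConfig -Name app.computername -Value localhost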

This means that when we run Invoke-DbcCheck with AllChecks or by specifying a check, it will run against the local machine and default instance unless we specify a SqlInstance when calling Invoke-DbcCheck. So the code below will not use the configuration for app.sqlinstance.

Invoke-DbcCheck -SqlInstance TheBeard

Exclude a Check

You can exclude a check using the -ExcludeCheck parameter of Invoke-DbcCheck. In the example below I am running all of the Server checks but excluding the SPN check as we are not on a domain

Invoke-DbcCheck -Check Server -ExcludeCheck SPN

 

There is a configuration setting to exclude checks as well. (Be careful, as this will exclude them even if you specifically specify a check using Invoke-DbcCheck, but we do give you a warning!)

So now I can run

Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value SPN
Invoke-DbcCheck -Check Server

and all of the server checks except the SPN check will run against the local machine and the default instance that I have set in the config

Creating an environment config and exporting it to use any time we like

So let's make this a lot more useful. Let's create a configuration for our production environment and save it to disk (or even source control it!) so that we can use it again and again. We can also then pass it to other members of our team, or even embed it in an automated process or our CI/CD system.

Let's build up a configuration for a number of tests for my “production” environment. I will not explain them all here, but let you read through the code and the comments to see what has been set. You will see that some of them are due to me running the test on a single machine with one drive.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value localhost                                                                                                                                                                                                          
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost' ,'localhost\PROD1','localhost\PROD2', 'localhost\PROD3'                                                                                                                                            
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'dbachecksdemo\dbachecks'  
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'      
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $true     
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true    
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL     
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb   
# What authentication scheme are we expecting?                                                                                                            
Set-DbcConfig -Name policy.connection.authscheme -Value 'NTLM'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'DBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'DBA Team'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin 
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30 
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value DBAAdmin
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# What is my SQL Credential
Set-DbcConfig -Name app.sqlcredential -Value $null
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, HADR, PseudoSimple,spn
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
Get-Dbcconfig | ogv

When I run this I get

I can then export this to disk (to store in source control) using

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

and I have a configuration file

which I can use any time to set the configuration for dbachecks using the Import-DbcConfig command. (But this doesn't work in VS Code's integrated terminal, which occasionally does odd things; this appears to be one of them.)

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\production_config.json

So I can import this configuration and run my checks with it any time I like. This means that I can create many different test configurations for my many different environment or estate configurations.

Yes, I know “good/best practice” says we should use the same configuration for all of our instances, but we know that isn't true. We have instances that were set up 15 years ago that are still in production. We have instances from the companies our organisation has bought over the years that were set up by system administrators. We have instances that were set up by shadow IT that we now have to support but can't change.

As well as those though, we also have different environments. Our development or test environment will have different requirements to our production environments.

In this hypothetical situation the four instances for four different applications have 4 development containers which are connected to using SQL Authentication. We will need a different configuration.

SQL Authentication

We can set up SQL Authentication for connecting to our SQL Instances using the app.sqlcredential configuration. This is going to hold a PSCredential object for SQL Authenticated connections to your instances. If this is set, the checks will always try to use it. Yes, this means that the same username and password is being used for each connection. No, there is currently no way to choose which instances use it and which don't. This may be a limitation, but as you will see further down, you can still do this with different configurations.

To set the SQL Authentication credential, run

Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)

This will give a prompt for you to enter the credential

Development Environment Configuration

So now we know how to set a SQL Authentication configuration, we can create our development environment configuration like so. As you can see below, the values are different for the checks and more checks have been skipped. I won't explain it all; if it doesn't make sense, ask a question in the comments or in the dbachecks channel in the SQL Server Community Slack.

#region Dev Config
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value 'localhost,1401' ,'localhost,1402','localhost,1403', 'localhost,1404' 
# What is my SQL Credential
Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'sa'   
# What authentication scheme are we expecting?  
Set-DbcConfig -Name policy.connection.authscheme -Value 'SQL'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'dbachecksdemo\dbachecks'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompreesion -Value $false
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What should our database growth value be higher than (Mb)?
Set-DbcConfig -Name policy.database.filegrowthvalue -Value 64
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $false 
# What is the maximum latency (ms)?
Set-DbcConfig -Name policy.network.latencymaxms -Value 100
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value Simple
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value DBAAdmin 
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'WORKGROUP'
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, HADR, SaReNamed, PseudoSimple,spn, DiskSpace, DatabaseCollation,Agent,Backup,UnusedIndex,LogfileCount,FileGroupBalanced,LogfileSize,MaintenanceSolution,ServerNameMatch

Export-DbcConfig -Path C:\Users\dbachecks\Desktop\development_config.json

Using The Different Configurations

Now I have two configurations, one for my production environment and one for my development environment. I can run my checks whenever I like (perhaps you will automate this in some way):

  • Import the production configuration
  • Run my tests with that configuration and create a json file for my Power BI labelled Production
  • Import the development configuration (and enter the SQL authentication credential)
  • Run my tests with that configuration and create a json file for my Power BI labelled Development
  • Start Power BI to show those results
# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\production_config.json
# Run the tests with the production config and create/update the production json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\development_config.json
# Run the tests with the development config and create/update the development json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development
# Open the Power BI
Start-DbcPowerBi

I have published the Power BI so that you can see what it would look like and have a click around (maybe you can see improvements you would like to contribute).

Now we can see how each environment is performing according to the settings for that environment.

Combining Configurations Into One Result Set

As you saw above, by using the Environment parameter of Update-DbcPowerBiDataSource you can add different environments to one report. But what if I wanted to have a report for my application APP1 showing both production and development environments, which have different configurations? How can I do this?

Here’s how.

  • Create a configuration for the production environment (I have used the production configuration from above but with only localhost for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Prod_config.json
  • Create a configuration for the development environment (I have used the development configuration from above but with only localhost,1401 for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Dev_config.json

Then run

# Import the production config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Prod_config.json
# Run the tests with the production config and create/update the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1
# Import the development config
Import-DbcConfig C:\Users\dbachecks\Desktop\APP1-Dev_config.json
# Run the tests with the development config and append the results to the APP1 json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1 -Append
Start-DbcPowerBi

Notice that this time there is an Append on the last Update-DbcPowerBiDataSource. This creates a single json file for the Power BI, and the results look like this. Now we have the results for our application from both the production environment localhost and the development container localhost,1401.

 

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open source and available on GitHub for anyone to contribute to.

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come and join the dbachecks channel in the SQL Server Community Slack to get advice, make comments or just join in the conversation.

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances

For the last couple of months, members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL Instances. Today it is released for you all to start to use 🙂

Validate Your SQL Instances?

What do I mean by validate your SQL Instances? You want to know if your SQL Instances are (still) set up in the way that you want them to be, and that you have not missed any configurations when setting them up. With dbachecks you can use any or all of the 80 checks to ensure one or many SQL Instances are as you want them to be. Using Pester, dbachecks will validate your SQL Instance(s) against default settings or ones that you configure yourself.
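
For a flavour of what that looks like, a single check against a single instance might be run like this (a sketch; the instance name is illustrative):

Invoke-DbcCheck -SqlInstance localhost -Check LastFullBackup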

Installation

Installation is via the PowerShell Gallery. You will need to open PowerShell on a machine connected to the internet and run

Install-Module dbachecks

If you are not running your process as admin, or you only want (or are able) to install for your own user account, you will need to run

Install-Module dbachecks -Scope CurrentUser

This will also install the PSFramework module used for configuration (and other things beneath the hood) and the latest version (4.2.0 – released on Sunday!) of Pester

Once you have installed the module you can see the commands available by running

Get-Command -Module dbachecks

To be able to use these (and any PowerShell) commands, your first step should always be Get-Help

Get-Help Send-DbcMailMessage

80 Checks

At the time of release, dbachecks has 80 checks. You can see all of the checks by running

Get-DbcCheck

(Note this has nothing to do with DBCC CheckDb!) Here is the output of

Get-DbcCheck | Select Group, UniqueTag

so you can see the current checks

Group – UniqueTag
Agent – AgentServiceAccount
Agent – DbaOperator
Agent – FailsafeOperator
Agent – DatabaseMailProfile
Agent – FailedJob
Database – DatabaseCollation
Database – SuspectPage
Database – TestLastBackup
Database – TestLastBackupVerifyOnly
Database – ValidDatabaseOwner
Database – InvalidDatabaseOwner
Database – LastGoodCheckDb
Database – IdentityUsage
Database – RecoveryModel
Database – DuplicateIndex
Database – UnusedIndex
Database – DisabledIndex
Database – DatabaseGrowthEvent
Database – PageVerify
Database – AutoClose
Database – AutoShrink
Database – LastFullBackup
Database – LastDiffBackup
Database – LastLogBackup
Database – VirtualLogFile
Database – LogfileCount
Database – LogfileSize
Database – FileGroupBalanced
Database – AutoCreateStatistics
Database – AutoUpdateStatistics
Database – AutoUpdateStatisticsAsynchronously
Database – DatafileAutoGrowthType
Database – Trustworthy
Database – OrphanedUser
Database – PseudoSimple
Database – AdHocWorkloads
Domain – DomainName
Domain – OrganizationalUnit
HADR – ClusterHealth
HADR – ClusterServerHealth
HADR –
HADR – System.Object[]
Instance – SqlEngineServiceAccount
Instance – SqlBrowserServiceAccount
Instance – TempDbConfiguration
Instance – AdHocWorkload
Instance – BackupPathAccess
Instance – DAC
Instance – NetworkLatency
Instance – LinkedServerConnection
Instance – MaxMemory
Instance – OrphanedFile
Instance – ServerNameMatch
Instance – MemoryDump
Instance – SupportedBuild
Instance – SaRenamed
Instance – DefaultBackupCompression
Instance – XESessionStopped
Instance – XESessionRunning
Instance – XESessionRunningAllowed
Instance – OLEAutomation
Instance – WhoIsActiveInstalled
LogShipping – LogShippingPrimary
LogShipping – LogShippingSecondary
Server – PowerPlan
Server – InstanceConnection
Server – SPN
Server – DiskCapacity
Server – PingComputer
MaintenancePlan – SystemFull
MaintenancePlan – UserFull
MaintenancePlan – UserDiff
MaintenancePlan – UserLog
MaintenancePlan – CommandLog
MaintenancePlan – SystemIntegrityCheck
MaintenancePlan – UserIntegrityCheck
MaintenancePlan – UserIndexOptimize
MaintenancePlan – OutputFileCleanup
MaintenancePlan – DeleteBackupHistory
MaintenancePlan – PurgeJobHistory

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same, but the value might be different for different applications, environments, clients, teams, domains, etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler, enabling you to set configuration items at run-time or for your session, and to export and import them so you can create different configs for different use cases.

There are 108 configuration items at present. You can see the current configuration by running

Get-DbcConfig

which will show you the name of the config, the value it is currently set to, and the description

You can see all of the configs and their descriptions here

Name - Description
agent.databasemailprofile - Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail - Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname - Name of the DBA Operator in SQL Agent
agent.failsafeoperator - Email address of the DBA Operator in SQL Agent
app.checkrepos - Where Pester tests/checks are stored
app.computername - List of Windows Servers that Windows-based tests will run against
app.localapp - Persisted files live here
app.maildirectory - Files for mail are stored here
app.sqlcredential - The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance - List of SQL Server instances that SQL-based tests will run against
app.wincredential - The universal Windows credential if default Windows Authentication is not used
command.invokedbccheck.excludecheck - Invoke-DbcCheck: the checks that should be skipped by default
domain.domaincontroller - The domain controller to process your requests
domain.name - The Active Directory domain that your server is a part of
domain.organizationalunit - The OU that your server should be a part of
mail.failurethreshhold - Number of errors that must be present to generate an email report
mail.from - Email address the email reports should come from
mail.smtpserver - Name of the SMTP server used to send email reports
mail.subject - Subject line of the email report
mail.to - Email address to send the report to
policy.backup.datadir - Destination server data directory
policy.backup.defaultbackupcompreesion - Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours - Maximum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays - Maximum number of days before Full Backups are considered outdated
policy.backup.logdir - Destination server log directory
policy.backup.logmaxminutes - Maximum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod - The number of hours a newly created database is allowed to not have backups
policy.backup.testserver - Destination server for backup tests
policy.build.warningwindow - The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme - Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount - Number of times to ping a server to establish average response time
policy.connection.pingmaxms - Maximum response time in ms
policy.dacallowed - DAC should be allowed $true or disallowed $false
policy.database.autoclose - Auto Close should be allowed $true or disallowed $false
policy.database.autocreatestatistics - Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink - Auto Shrink should be allowed $true or disallowed $false
policy.database.autoupdatestatistics - Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously - Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance - Percentage tolerance for checking for balanced files in a filegroup
policy.database.filegrowthexcludedb - Databases to exclude from the file growth check
policy.database.filegrowthtype - Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue - The auto growth value (in kb) should be equal or higher than this value. Example: a value of 65535 means at least 64MB
policy.database.logfilecount - The number of Log files expected on a database
policy.database.logfilesizecomparison - How to compare data and log file size; options are maximum or average
policy.database.logfilesizepercentage - Maximum percentage of Data file size that the log file is allowed to be
policy.database.maxvlf - Max virtual log files
policy.dbcc.maxdays - Maximum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree - Percent disk free
policy.dump.maxcount - Maximum number of expected dumps
policy.hadr.tcpport - The TCP port for the HADR check
policy.identity.usagepercent - Maximum percentage of max of identity column
policy.invaliddbowner.excludedb - Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name - The database owner account should not be this user
policy.network.latencymaxms - Max network latency average
policy.ola.commandlogenabled - Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled - Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database - The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled - Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled - Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed - Checks to see if the Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled - Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled - Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled - Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled - Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled - Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention - Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled - Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled - Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled - Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled - Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention - Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled - Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled - Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention - Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled - Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled - Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled - Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled - Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled - Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled - Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention - Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled - Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation - OLE Automation should be enabled $true or disabled $false
policy.pageverify - Page verify option should be set to this value
policy.recoverymodel.excludedb - Databases to exclude from standard recovery model check
policy.recoverymodel.type - Standard recovery model
policy.storage.backuppath - Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb - Databases to exclude from valid dbowner checks
policy.validdbowner.name - The database owner account should be this user
policy.whoisactive.database - Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession - List of XE Sessions that should be running
policy.xevent.requiredstoppedsession - List of XE Sessions that should not be running
policy.xevent.validrunningsession - List of XE Sessions that can be running
skip.backup.testing - Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping - Skip the ping check for connectivity
skip.connection.remoting - Skip the PowerShell remoting check for connectivity
skip.database.filegrowthdisabled - Skip validation of data files which have growth value equal to zero
skip.database.logfilecounttest - Skip the logfilecount test
skip.datafilegrowthdisabled - Skip validation of data files which have growth value equal to zero
skip.dbcc.datapuritycheck - Skip data purity check in last good dbcc command
skip.diffbackuptest - Skip the Differential backup test
skip.logfilecounttest - Skip the logfilecount test
skip.logshiptesting - Skip the logshipping test
skip.tempdb1118 - Don't run the test for Trace Flag 1118
skip.tempdbfilecount - Don't run the test for Temp Database File Count
skip.tempdbfilegrowthpercent - Don't run the test for Temp Database File Growth in Percent
skip.tempdbfilesizemax - Don't run the test for Temp Database Files Max Size
skip.tempdbfilesonc - Don't run the test for Temp Database Files on C

Running A Check

You can quickly run a single check by calling Invoke-DbcCheck.

Invoke-DbcCheck -SqlInstance localhost -Check FailedJob

Excellent, my agent jobs have not failed 🙂

Invoke-DbcCheck -SqlInstance localhost -Check LastGoodCheckDb

That's good; all of my databases have had a successful DBCC CHECKDB within the last 7 days.
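
The Check parameter maps to Pester tags, so it accepts more than one value. For example, to run both of the checks above in a single call:

Invoke-DbcCheck -SqlInstance localhost -Check FailedJob, LastGoodCheckDb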

Setting a Configuration

To save me from having to specify the instance I want to run my tests against I can set the app.sqlinstance config to the instances I want to check.

Set-DbcConfig -Name app.sqlinstance -Value localhost, 'localhost\PROD1'

Then whenever I call Invoke-DbcCheck it will run against those instances for the SQL checks
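
There is a matching config for the Windows-based checks, so you can set app.computername in the same way (the second server name here is just a placeholder):

Set-DbcConfig -Name app.computername -Value localhost, 'WINSERVER1'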

So now if I run

Invoke-DbcCheck -Check LastDiffBackup

I can see that I don't have a diff backup for the databases on both instances. Better stop writing this and deal with that!

The configurations are stored in the registry but you can export them and then import them for re-use easily. I have written another blog post about that.
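
As a quick sketch of that workflow (assuming the Export-DbcConfig and Import-DbcConfig commands in your version take a Path parameter, as the released module does):

# Save the current session configuration to a file for re-use
Export-DbcConfig -Path C:\temp\Production-Config.json

# Load it back in later, or on another machine
Import-DbcConfig -Path C:\temp\Production-Config.json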

The Show Parameter

Getting the results of the tests on the screen is cool, but if you are running a lot of tests against a lot of instances then you might find that you have 3 failed tests out of 15000! This means a lot of scrolling through green text looking for the red text, and you may find that your PowerShell buffer doesn't hold all of your test results, leaving you very frustrated.

dbachecks supports the Pester Show parameter enabling you to filter the output of the results to the screen. The available values are Summary, None, Fails, Inconclusive, Passed, Pending and Skipped

In my opinion, by far the most useful value is Fails, as this will show you only the failed tests, with enough context to see which tests have failed.

Invoke-DbcCheck -Check Agent -Show Fails

If we run all of the checks tagged as Agent we can easily see that most passed, but The Job That Fails (surprisingly) failed. In the example below, all of the other tests, covering the agent service, operators, failsafe operator, database mail and the other agent jobs, passed.

Test Results are for other People as well

It is all very well and good being able to run tests and get the results on our screen. It will be very useful for people to be able to validate a new SQL instance, for example, or to run a morning check or the first step of an incident response. But test results are also useful for other people, so we need to be able to share them.

We have created a Power Bi Dashboard that comes with the dbachecks module to enable easy sharing of the test results. You can also send the results via email using Send-DbcMailMessage. We have an open issue for putting them into a database that we would love you to help resolve.
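
As a hedged sketch of the email workflow, using the mail.* configuration items from the table above (the exact parameter binding of Send-DbcMailMessage may differ in your version, and the values here are placeholders):

# Set the mail configs once - these values are placeholders
Set-DbcConfig -Name mail.smtpserver -Value 'smtp.thebeard.local'
Set-DbcConfig -Name mail.from -Value 'dbachecks@thebeard.local'
Set-DbcConfig -Name mail.to -Value 'dbateam@thebeard.local'

# Run the checks and pipe the results to the mail command
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Send-DbcMailMessage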

To get the results into PowerBi you can run

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production

This will run all of the dbachecks using your configuration for your Production environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Production

If you then used a different configuration for your development environment and ran

Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development

it will run all of the dbachecks using your configuration for your Development environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Development and you would end up with two files in the folder

You can then simply run

Start-DbcPowerBi

and as long as you have the (free) Power Bi Desktop installed you will see this. You will need to refresh the data to get your test results.

Of course it is Power Bi, so you can publish this report. Here it is so that you can click around and see what it looks like.

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open source and available on GitHub for anyone to contribute to.

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come and join the SQL Server Community Slack, join the dbachecks channel and get advice, make comments or just join in the conversation.

Further Reading

There are many more introduction blog posts covering different areas at

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

Tony Wilhelm @TonyWSQL

and all of the other people who have contributed in the dbachecks Slack channel

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo. So clone it and navigate to the filepath

Create Database

First let's create a database. Connect to your Azure subscription

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

 Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YourSubscriptionIDHere

01 - subscription.png

Then set some variables

# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
# The logical server name: Use a random value or replace with your own value (do not capitalize)
$servername = "server-$(Get-Random)"
# Set an admin login and password for your database
# The login information for the server. You need to set these and uncomment them - don't use these values

# $adminlogin = "ServerAdmin"                
# $password = "ChangeYourAdminPassword1"

# The ip address range that you want to allow to access your server - change as appropriate
# $startip = "0.0.0.0"
# $endip = "0.0.0.0"

# To just add your own IP Address
$startip = $endip = (Invoke-WebRequest 'http://myexternalip.com/raw').Content -replace "`n"

# The database name
$databasename = "tweets"

$AzureSQLServer = "$servername.database.windows.net,1433"
$Table = "table.sql"
$Proc = "InsertTweets.sql"

They should all make sense. Take note that you need to set and uncomment the login and password, and choose which IPs to allow through the firewall.

Create a Resource Group

## Create a resource group

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location

02 - resource group.png

Create a SQL Server

## Create a credential for the server admin first, as it is needed to create the Server

$newObjectSplat = @{
    ArgumentList = $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force)
    TypeName = 'System.Management.Automation.PSCredential'
}
$SqlAdministratorCredentials = New-Object @newObjectSplat

## Create a Server

$newAzureRmSqlServerSplat = @{
    SqlAdministratorCredentials = $SqlAdministratorCredentials
    ResourceGroupName = $resourcegroupname
    ServerName = $servername
    Location = $location
}
New-AzureRmSqlServer @newAzureRmSqlServerSplat

03 - create server.png

Create a firewall rule. I just use my own IP and also allow the Azure IPs

$newAzureRmSqlServerFirewallRuleSplat = @{
    EndIpAddress = $endip
    StartIpAddress = $startip
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    FirewallRuleName = "AllowSome"
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

# Allow Azure IPS

$newAzureRmSqlServerFirewallRuleSplat = @{
    AllowAllAzureIPs = $true
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

03a - firewall rule.png

Create a database

# Create a database

$newAzureRmSqlDatabaseSplat = @{
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    Edition = 'Basic'
    DatabaseName = $databasename
}
New-AzureRmSqlDatabase  @newAzureRmSqlDatabaseSplat

04 - create database.png

I have used the dbatools module to run the scripts that create the table and the stored procedure. You can get it using

Install-Module dbatools # -Scope CurrentUser # if not admin process

Run the scripts

## Using the dbatools module and the credential created earlier

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database = $databasename
    File = $Table,$Proc
    SqlInstance = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat

05 - Create Table Sproc.png

This will have created the following in Azure; you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png
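
If you would rather confirm it from PowerShell than open SSMS, the same dbatools command can run a quick query (the query here is just an illustrative check, not part of the repo scripts):

# List the user tables in the new database to confirm the scripts ran
$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database      = $databasename
    Query         = 'SELECT name FROM sys.tables'
    SqlInstance   = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat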

Create Cognitive Services

Now you can create the Text Analytics Cognitive Services API

First login (if you need to) and set some variables

## This creates cognitive services for analysing the tweets

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YOUR SUBSCRIPTION ID HERE

#region variables
# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
$APIName = 'TweetAnalysis'
#endregion

Then create the API and get the key

#Create the cognitive services

$newAzureRmCognitiveServicesAccountSplat = @{
    ResourceGroupName = $resourcegroupname
    Location = $location
    SkuName = 'F0'
    Name = $APIName
    Type = 'TextAnalytics'
}
New-AzureRmCognitiveServicesAccount @newAzureRmCognitiveServicesAccountSplat

# Get the Key

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat 

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the Github repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

Click New Connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click New Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow. Click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Click Select during import next to the SQL Server connection and choose the SQL Server connection, then do the same for the Twitter connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to Twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

Search for detect and choose the Text Analytics Detect Sentiment action

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them

#region Forgot the details

# Copy the URL if you forget to save it

$getAzureRmCognitiveServicesAccountSplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccount @getAzureRmCognitiveServicesAccountSplat).Endpoint | Clip

# Copy the Key if you forgot

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat).Key1 | Clip

#endregion

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__ - 0
  • Date - Created At
  • Sentiment - Score
  • Tweet - Tweet Text
  • UserLocation - Location
  • UserName - Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop up a prompt for the credentials. Choose Database, enter your credentials and click Connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn’t run yet. If it does, just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be, and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my PowerBi For Pester file and made it beautiful and whilst we were discussing this we found that if the Pester Tests were not formatted correctly the Power Bi looked … well rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were

The Describe title should be in double quotes
The Describe should use the plural Tags parameter
The Tags should be singular
The first Tag should be a unique tag in Get-DbcConfig
The context title should end with $psitem
The code should use Get-SqlInstance or Get-ComputerName
The Code should use the forEach method
The code should not use $_
The code should contain a Context block

She asked me if I could write the Pester Tests for it and this is how I did it. I needed to look at the Tags parameter for the Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don’t know very much about this, but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester, so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.

The first thing I did was to get the Pester Tests which we have located in a checks folder and loop through them and get the content of the file with the Raw parameter

Describe "Checking that each dbachecks Pester test is correctly formatted for Power Bi and Coded correctly" {
$Checks =(Get-ChildItem$ModuleBase\checks).Where{$_.Name-ne'HADR.Tests.ps1'}
$Checks.Foreach{
$Check =Get-Content$Psitem.FullName-Raw
Context "$($_.Name) - Checking Describes titles and tags" {
Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
$Describes = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true) |
    ForEach {
        $CE = $_.CommandElements
        $secondString = ($CE | Where { $_.StaticType.name -eq 'string' })[1]
        $tagIdx = $CE.IndexOf(($CE | Where ParameterName -eq 'Tags')) + 1
        $tags = if ($tagIdx -and $tagIdx -lt $CE.Count) {
            $CE[$tagIdx].Extent
        }
        New-Object PSCustomObject -Property @{
            Name = $secondString
            Tags = $tags
        }
    }
As I understand it, this code is using the Parser on the $check (which contains the code from the file) and finding all of the Describe commands and creating an object of the title of the Describe with the StaticType equal to String and values from the Tag parameter.
When I ran this against the database tests file I got the following results
Then it was a simple case of writing some tests for the values
@($describes).ForEach{
    $title = $PSItem.Name.ToString().Trim('"').Trim('''')
    It "$title Should Use a double quote after the Describe" {
        $PSItem.Name.ToString().StartsWith('"') | Should be $true
        $PSItem.Name.ToString().EndsWith('"') | Should be $true
    }
    It "$title should use a plural for tags" {
        $PsItem.Tags | Should Not BeNullOrEmpty
    }
    # a simple test for no esses apart from statistics and Access!!
    if ($null -ne $PSItem.Tags) {
        $PSItem.Tags.Text.Split(',').Trim().Where{($_ -ne '$filename') -and ($_ -notlike '*statistics*') -and ($_ -notlike '*BackupPathAccess*') }.ForEach{
            It "$PsItem Should Be Singular" {
                $_.ToString().EndsWith('s') | Should Be $False
            }
        }
        It "The first Tag Should Be in the unique Tags returned from Get-DbcCheck" {
            $UniqueTags -contains $PSItem.Tags.Text.Split(',')[0].ToString() | Should Be $true
        }
    }
    else {
        It "You haven't used the Tags Parameter so we can't check the tags" {
            $false | Should be $true
        }
    }
}

The $describes variable is wrapped in @() so that the ForEach method will still work if there is only one Describe. The unique tags are returned from our command Get-DbcCheck, which shows all of the checks. We will have a unique tag for each test so that they can be run individually.
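
For reference, this is roughly how $UniqueTags can be populated (assuming your version of Get-DbcCheck exposes a UniqueTag property, as the released module does):

# Gather every unique tag so the tests can validate the first tag of each Describe
$UniqueTags = (Get-DbcCheck).UniqueTag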

Yes, I have tried to ensure that the tags are singular by checking that they do not end with an s (apart from statistics), which meant excluding BackupPathAccess and statistics. Filename is a variable that we add to each Describe's Tags so that we can run all of the tests in one file. I also added a little if block to the Pester test so that the error is more obvious if the Tags parameter was not passed.

I did the same with the context blocks as well

Context "$($_.Name) - Checking Contexts" {
    ## Find the Contexts
    $Contexts = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
    FindAll([Func[Management.Automation.Language.Ast, bool]] {
            param($ast)
            $ast.CommandElements -and
            $ast.CommandElements[0].Value -eq 'Context'
        }, $true) |
        ForEach {
        $CE = $_.CommandElements
        $secondString = ($CE |Where { $_.StaticType.name -eq 'string' })[1]
        New-Object PSCustomObject -Property @{
            Name = $secondString
        }
    }
    @($Contexts).ForEach{
        $title = $PSItem.Name.ToString().Trim('"').Trim('''')
        It "$Title Should end with `$psitem So that the PowerBi will work correctly" {
            $PSItem.Name.ToString().Endswith('psitem"')| Should Be $true
        }
    }
}
This time we look for the Context command and ensure that the string value ends with psitem, as the PowerBi parses the last value when creating columns.
Finally, I grab all of the code and check that it matches some coding standards:
Context "$($_.Name) - Checking Code" {
    ## This just grabs all the code
    $AST = [System.Management.Automation.Language.Parser]::ParseInput($Check, [ref]$null, [ref]$null)
    $Statements = $AST.EndBlock.statements.Extent
    ## Ignore the filename line
    @($Statements.Where{$_.StartLineNumber -ne 1}).ForEach{
        $title = [regex]::matches($PSItem.text, "Describe(.*)-Tag").groups[1].value.Replace('"', '').Replace('''', '').trim()
        It "$title Should Use Get-SqlInstance or Get-ComputerName" {
            ($PSItem.text -Match 'Get-SqlInstance') -or ($psitem.text -match 'Get-ComputerName')| Should be $true
        }
        It "$title Should use the ForEach Method" {
            ($Psitem.text -match 'Get-SqlInstance\).ForEach{') -or ($Psitem.text -match 'Get-ComputerName\).ForEach{')| Should Be $true# use the \ to escape the )
    }
    It "$title Should not use `$_" {
        ($Psitem.text -match '$_')| Should Be $false
    }
    It "$title Should Contain a Context Block" {
        $Psitem.text -match 'Context'| Should Be $True
    }
}

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are. Then I loop through each statement apart from the first line to ensure that the code uses our internal commands Get-SqlInstance or Get-ComputerName to get information, that we loop through those arrays using the ForEach method rather than ForEach-Object, that we use $psitem rather than $_ to reference the current item in the array, and that each Describe block has a Context block.

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and ensure that the Power Bi results still look beautiful!

Anyone can run the tests using

Invoke-Pester .\tests\Unit.Tests.ps1 -Show Fails

before they create a Pull Request, and it looks like this. If everything is Green then they can submit their Pull Request 🙂 If not, they can see quickly that something needs to be fixed (fail early 🙂 ).
03 fails.png

A Pretty PowerBi Pester Results Template File

I have left the heat and humidity of Singapore where I have been presenting at the PowerShell Conference Asia and DevOpsDays Singapore to travel to Seattle for PASS Summit. During my Green is Good – Red is Bad session someone asked me if the PowerBi that I showed at the end would work with any Pester Test Results object and I said (without thinking) that it would.

It turns out that the PowerBi that I had set up for that session will work with my function to run Pester Tests against an Ola Hallengren installation but some of the formatting and custom columns were specific to that test.

I said that I would share a Power Bi file that people could plug any Pester Test Results into. This is the first iteration of that. I doubt that it will work for every single test but I think it will be a good starting point for people to use.

This is how to use it

Download the file from here.

Run your Pester Tests using the PassThru parameter and set the results to a variable. You can also use the Show parameter to reduce the output of the tests to the screen (and also speed up the tests)

$PesterResults = Invoke-Pester -Script  C:\temp\PBI-Test01.ps1 -Show Summary -PassThru

Then we convert the $PesterResults object into a JSON file

$PesterResults.TestResult | ConvertTo-Json -Depth 5 | Out-File C:\temp\pbi-test.json
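
If you do this often, you could wrap the two steps in a small helper function. This is just a convenience sketch; Export-PesterResultsJson is a hypothetical name, not part of Pester:

function Export-PesterResultsJson {
    # Run a Pester script and write its TestResult objects to a JSON file
    param($Script, $Path)
    $PesterResults = Invoke-Pester -Script $Script -Show Summary -PassThru
    $PesterResults.TestResult | ConvertTo-Json -Depth 5 | Out-File $Path
}

Export-PesterResultsJson -Script C:\temp\PBI-Test01.ps1 -Path C:\temp\pbi-test.json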

Open the Power Bi file you downloaded, click Home and then the words “Edit Queries”, then Data Source Settings. Highlight the filename, click Change Source, navigate to the JSON file you just created, click OK, then Close and apply the changes.

This will load the data from the JSON file and display your Pester results. You can then save this file with a new name and keep the template for other tests.

It’s not going to be perfect

It’s not going to work in all circumstances and I expect that with some test results it will display the results in a less than optimal manner but you should be able to modify this to suit your needs.

Please give it a try and see how you get on

Here is a sample report created with Demo 1 from my Green is Good session

You can click around and change the data you can see and also look at the other 4 pages

Here is another one that I created using my dbatools-scripts repo and a config file. Again, have a click around and see what it does.

$Config = (Get-Content GIT:\dbatools-scripts\TestConfig.json) -join "`n" | ConvertFrom-Json
$PesterResults = Invoke-Pester .\dbatools-scripts\ -PassThru
$PesterResults.TestResult | ConvertTo-Json | Out-File C:\temp\dbatools-scripts-pester.json

 

I also created a quick video showing the process too which I will upload when I am not at 35000 feet!!

Enjoy 🙂 Also, let me know if you think it would be better to have the file in GitHub, which would allow contributions; however, it would only be seen as a binary file and therefore merging will be difficult. I am happy to do so.

TSQL2sDay – Get-PostRoundup

First an apology, this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling, unfortunately I found a major error in Pester (again not strictly true but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, starting with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, default settings, best practices, migrations, warnings in Agent Jobs, sqlpackage, VLFs, CMS, disabling Named Pipes, orphaned users, AG status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first timers in the monthly TSQL2sDay blog party. Please don’t let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the media of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

https://twitter.com/Steve_TSQL/status/907713842943115264

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!

 

 

Enabling Cortana for dbareports PowerBi

Last week at the Birmingham user group I gave a presentation about PowerShell and SQL Server

saved-image-from-tweetium-8

It was a very packed session as I crammed in the new sqlserver module, dbatools and dbareports 🙂 On reflection I think this is a bit too much for a one hour session but at the end of the session I demo’d live Cortana using the dbareports dataset and returning a Cortana PowerBi page.

As always it took a couple of goes to get it right but when it goes correctly it is fantastic. I call it a salary increasing opportunity! Someone afterwards asked me how it was done so I thought that was worth a blog post

There is a video below but the steps are quite straightforward.

Add Cortana Specific Pages

Whilst you can just enable Cortana to access your dataset, as shown later in this post, which lets Cortana search available datasets and return an appropriate visualisation, it is better to provide specific pages for Cortana to use and display. You can do this in PowerBi Desktop.

Start by adding a new page in your report by clicking on the plus button

 

add page.PNG

and then change the size of the report page by clicking on the paintbrush icon in the visualisation page.

page-size

This creates a page that is optimised for Cortana to display and also will be the first place that Cortana will look to answer the question

Power BI first looks for answers in Answer Pages and then searches your datasets and reports for other answers and displays them in the form of visualizations. The highest-scoring results display first as best matches, followed by links to other possible answers and applications. Best matches come from Power BI Answer Pages or Power BI reports.

Rename the page so that it contains the words or phrase you expect to be in the question, such as “Servers By Version”. You will help Cortana and PowerBi get better results if you use some of the column names in your dataset.

Then it is just another report page and you can add visualisations just like any other page

cortana page.PNG

Make Cortana work for you and your users

If your users are likely to use a number of different words in their questions you can assist Cortana to find the right answer by adding alternate names. So maybe if your page is sales by store you might add shop, building, results, amount, orders. This is also useful when Cortana doesn’t understand the correct words as you will notice in the screenshot below I have added “service” for “servers” and “buy” for “by” to help get the right answer. You can add these alternate words by clicking the paintbrush under visualisations and then Page Information

cortana-additional

Publish your PBIX file to PowerBi.com

To publish your PowerBi report to PowerBi.com, use either the Publish button in PowerBi Desktop

publish

or by using the PowerBiPS module

Install-Module -Name PowerBIPS
# Grab the token; this will require a sign in
$authToken = Get-PBIAuthToken -Verbose
Import-PBIFile -authToken $authToken -filePath "Path to PBIX file" -Verbose

Enable Cortana

In your browser log into https://powerbi.com and then click on the cog and then settings

powerbicom.PNG

then click on Datasets

settings

Then choose the dataset – in this case the dbareports SQL Information sample – and click the tick box to allow Cortana to access this dataset, then click Apply

dataset settings.PNG

Use Cortana against your PowerBi data

You can type into the Cortana search box and it will offer the opportunity for you to choose your PowerBi data

cortana-search

but it is so much better when you let it find the answer 🙂

cortana-search-1

and if you want to go to the PowerBi report there is a handy link at the bottom of the Cortana page

cortana-search-2

I absolutely love this. I was so pleased when I got it to work, and the response when I show people is always one of wonder, for both techies and non-techies alike.

The conditions for Cortana to work

You will need to have added your work or school Microsoft ID to the computer or phone that you want to use Cortana on and that account must be able to access the dataset either because it is the dataset owner or because a dashboard using that dataset has been shared with that account.

From this page on PowerBi.com

When a new dataset or custom Cortana Answer Page is added to Power BI and enabled for Cortana it can take up to 30 minutes for results to begin appearing in Cortana. Logging in and out of Windows 10, or otherwise restarting the Cortana process in Windows 10, will allow new content to appear immediately.

It’s not perfect!

When you start using Cortana to query your data you will find that at times it is very frustrating. My wife was in fits of giggles listening to me trying to record the video below as Cortana refused to understand that I was saying “servers” and repeatedly searched Bing for “service”. Whilst you can negate the effect by using the alternate names in the Q and A settings, it is still a bit hit and miss at times.

It is amazing

There is something about giving people the ability to just talk to their device in a meeting and for example with dbareports ask

Which clients are in Bolton

or

When was the last backup for client The Eagles

and get the information they require and a link to the report in PowerBi.com. I am certain that the suits will be absolutely delighted at being able to show off in that way which is why I call it a salary increasing opportunity 🙂

We would love YOU to come and join us at the SQL Community Collaborative

Help us make dbatools, dbareports and Invoke-SQLCmd2 even better. You can join in by forking the repos in GitHub, writing your code and then performing a PR, but we would much rather that you came and discussed new requests in our Trello boards, raised issues in GitHub and generally discussed the modules in the SQL Server Community Slack #dbatools #dbareports. We are also looking for assistance with our wiki pages, Pester tests and AppVeyor integration for our builds, and any comments people want to make.

SQL Server Collaborative GitHub Organisation holding the modules. Go here to raise issues, fork the repositories or download the code

dbatools Trello for discussion about new cmdlets

SQL Server Community Slack where you can find #dbatools and #dbareports as well as over 1100 people discussing all aspects of the Data Platform, events, jobs, presenting

COME AND JOIN US