Deploying To a Power Bi Report Server with PowerShell

Just a quick post to share some code that I used to solve a problem I had recently.

I needed to automate the deployment of some Power Bi reports to a Power Bi Report Server (PBRS) using TFS. I had some modified historical validation dbachecks pbix files whose deployment I wanted to automate so that the client could quickly and simply deploy the reports as needed.

The manual way

It is always a good idea to understand how to do a task manually before automating it. To deploy to PBRS you need to use the version of Power Bi Desktop optimised for Power Bi Report Server. There are instructions here. Then it is easy to deploy to the PBRS by clicking File, then Save As, and choosing Power Bi Report Server

manual deploy

If I then want to set the datasource to use a different set of credentials I navigate to the folder that holds the report in PBRS and click the hamburger menu and Manage

manage

and I can alter the User Name and Password or the type of connection by clicking on DataSources

testconn.PNG

and change it to use the reporting user for example.

Automation

But I don't want to have to do this each time and there will be multiple pbix files, so I wanted to automate the solution. The end result was a VSTS or TFS release process so that I could simply drop the pbix into a git repository, commit my changes, sync them and have the system deploy them automatically.

As with all good ideas, I started with a Google search and found this post by Bill Anton which gave me a good start (I could not get the connection string change to work in my test environment but this was not required so I didn't really examine why)

I wrote a function that I can use via TFS or VSTS by embedding it in a PowerShell script. The function requires the ReportingServicesTools module which you can get by

Install-Module -Name ReportingServicesTools

The function below is also available via the PowerShell Gallery and you can get it with

Install-Script -Name PublishPBIXFile

The source code is on Github

and the code to call it looks like this

$folderName = 'TestFolder'
$ReportServerURI = 'http://localhost/Reports'
$folderLocation = '/'
$pbixfile = 'C:\Temp\test.pbix'
$description = "Descriptions"

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
code1.PNG

This uploads the report to a folder, which it will create if it does not exist. It will then upload the pbix file, overwriting the existing one if it already exists

numbe3r1.PNG

and uses the username and password specified

code2.PNG

If I wanted to use a Domain reporting user instead I can do

$UserName1 = 'TheBeard\ReportingUser'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
and it changes
code4 reporting
If we want to use a SQL Authenticated user then
$UserName1 = 'TheReportingUserOfBeard'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    # credential = $cred
    ConnectionUserName = $UserName1
    Secret             = $Password1

}
Publish-PBIXFile @publishPBIXFileSplat
sql auth.PNG
Excellent, it all works from the command line. You can pass in a credential object as well as a username and password. The reason I enabled username and password? So that I can use TFS or VSTS and store my password as a secret variable.
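
Passing a credential object instead of a username and password looks something like this. This is a minimal sketch based on the function's ByCred parameter set, which you can see in the full function later in this post.

$cred = Get-Credential -Message 'Account to use for the report data source'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    Credential         = $cred
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat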
Now I simply create a repository which has my pbix files and a PowerShell script and build a quick release process to deploy them whenever there is a change 🙂
The deploy script looks like
[CmdletBinding()]
Param (
    $PBIXFolder,
    $ConnectionStringPassword
)
$VerbosePreference = 'continue'
$ReportServerURI = 'http://TheBeardsAmazingReports/Reports'
Write-Output "Starting Deployment"
function Publish-PBIXFile {
    [CmdletBinding(DefaultParameterSetName = 'ByUserName', SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [string]$FolderName,
        [Parameter(Mandatory = $true)]
        [string]$ReportServerURI,
        [Parameter(Mandatory = $true)]
        [string]$FolderLocation,
        [Parameter(Mandatory = $true)]
        [string]$PBIXFile,
        [Parameter()]
        [string]$Description = "Description of Your report Should go here",
        [Parameter()]
        [ValidateSet('Windows', 'SQL')]
        [string]$AuthenticationType,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$ConnectionUserName,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$Secret,
        [Parameter(Mandatory = $true, ParameterSetName = 'ByCred')]
        [pscredential]$Credential
    )
    $FolderPath = $FolderLocation + $FolderName
    $PBIXName = $PBIXFile.Split('\')[-1].Replace('.pbix', '')
    try {
        Write-Verbose"Creating a session to the Report Server $ReportServerURI"
        # establish session w/ Report Server
        $session = New-RsRestSession-ReportPortalUri $ReportServerURI
        Write-Verbose"Created a session to the Report Server $ReportServerURI"
    }
    catch {
        Write-Warning"Failed to create a session to the report server $reportserveruri"
        Return
    }
    # create folder (optional)
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Creating a folder called $FolderName at $FolderLocation")) {
            $Null = New-RsRestFolder -WebSession $session -RsFolder $FolderLocation -FolderName $FolderName -ErrorAction Stop
        }
    }
    catch [System.Exception] {
        If ($_.Exception.InnerException.Message -eq 'The remote server returned an error: (409) Conflict.') {
            Write-Warning "The folder already exists - moving on"
        }
    }
    catch {
        Write-Warning "Failed to create a folder called $FolderName at $FolderLocation report server $ReportServerURI but not because it already exists"
        Return
    }
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Uploading the pbix from $PBIXFile to the report server ")) {
            # upload copy of PBIX to new folder
            Write-RsRestCatalogItem -WebSession $session -Path $PBIXFile -RsFolder $folderPath -Description $Description -Overwrite
        }
    }
    catch {
        Write-Warning"Failed to upload the file $PBIXFile to report server $ReportServerURI"
        Return
    }
    try {
        Write-Verbose"Getting the datasources from the pbix file for updating"
        # get data source object
        $datasources = Get-RsRestItemDataSource-WebSession $session-RsItem "$FolderPath/$PBIXName"
        Write-Verbose"Got the datasources for updating"
    }
    catch {
        Write-Warning"Failed to get the datasources"
        Return
    }
    try {
        Write-Verbose"Updating Datasource"

        foreach ($dataSourcein$datasources) {
            if ($AuthenticationType -eq 'SQL') {
                $dataSource.DataModelDataSource.AuthType = 'UsernamePassword'
            }
            else {
                $dataSource.DataModelDataSource.AuthType = 'Windows'
            }
            if ($Credential -or $UserName) {
                if ($Credential) {
                    $UserName = $Credential.UserName
                    $Password = $Credential.GetNetworkCredential().Password
                }
                else {
                    $UserName = $ConnectionUserName
                    $Password = $Secret
                }
                $dataSource.CredentialRetrieval = 'Store'
                $dataSource.DataModelDataSource.Username = $UserName
                $dataSource.DataModelDataSource.Secret = $Password
            }
            if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Updating the data source for the report $PBIXName")) {
                # update data source object on server
                Set-RsRestItemDataSource -WebSession $session -RsItem "$folderPath/$PBIXName" -RsItemType PowerBIReport -DataSources $datasource
            }
        }
    }
    catch {
        Write-Warning"Failed to set the datasource"
        Return
    }
    Write-Verbose"Completed Successfully"
}
foreach ($File in (Get-ChildItem $PBIXFolder\*.pbix)) {
    Write-Output"Processing $($File.FullName)"
    ## to enable further filtering later
    if ($File.FullName -like '*') {
        $folderName = 'ThePlaceForReports'
        $folderLocation = '/'
        $UserName = 'TheBeard\ReportingUser'
        $Password = $ConnectionStringPassword
        $pbixfile = $File.FullName
    }
    if ($File.FullName -like '*dbachecks*') {
        $description = "This is the morning daily checks file that....... more info"
    }
    if ($File.FullName -like '*TheOtherReport*') {
        $description = "This is hte other report, it reports others"
    }
    $publishPBIXFileSplat = @{
        ReportServerURI    = $ReportServerURI
        folderLocation     = $folderLocation
        description        = $description
        AuthenticationType = 'Windows'
        pbixfile           = $pbixfile
        folderName         = $folderName
        ConnectionUserName = $UserName
        Secret             = $Password
        Verbose            = $true
    }
    $Results = Publish-PBIXFile @publishPBIXFileSplat
    Write-Output $Results
}
Although the function does not need to be embedded in the script and could instead be deployed in a module, I have included it here to make it easier for people to use quickly.
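For example, a minimal sketch of that alternative, assuming the function has been saved to its own file next to the deploy script (the file name and location here are assumptions, adjust to your repository layout), would be to dot-source it instead of pasting the whole function in:

## dot-source the file that defines Publish-PBIXFile instead of embedding the function in the deploy script
. "$PSScriptRoot\PublishPBIXFile.ps1"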
Then I create a PowerShell step in VSTS or TFS and call the script with the parameters as shown below, and the Power Bi files deploy automatically to the Power Bi Report Server
vsts.PNG
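In the PowerShell task, the script path points at the deploy script in the artifact and the arguments pass the two parameters. A sketch of what that ends up running is below; the script name, folder layout and the secret variable name are assumptions, so substitute your own.

## $(System.DefaultWorkingDirectory) is where the release agent downloads the artifacts
## $(ReportingUserPassword) is a secret variable defined on the release, so the password never lives in the repo
.\DeployReports.ps1 -PBIXFolder "$(System.DefaultWorkingDirectory)\PBIXRepo\Reports" -ConnectionStringPassword "$(ReportingUserPassword)"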
and I have my process complete 🙂
Happy Automating 🙂

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks and one of the questions I was asked was whether I would write a command to put the results of the checks into a database for historical reporting.

The answer is no, and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will have. But here is one way of doing it.
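To illustrate that point, here are just a few of the things you can do with the results object once you have it (the PassThru parameter is covered properly below; the file paths are only examples):

$Testresults = Invoke-DbcCheck -PassThru -Show Fails
$Testresults | Get-Member                 # explore the properties available on the results object
$Testresults.TestResult | Export-Csv -Path C:\temp\dbachecks-results.csv -NoTypeInformation
$Testresults.TestResult | ConvertTo-Json -Depth 3 | Out-File -FilePath C:\temp\dbachecks-results.json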

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

# The computername we will be testing
Set-DbcConfig -Name app.computername -Value $sql0,$SQl1
# The Instances we want to test
Set-DbcConfig -Name app.sqlinstance -Value $sql0,$SQl1
# The database owner we expect
Set-DbcConfig -Name policy.validdbowner.name -Value 'THEBEARD\EnterpriseAdmin'
# the database owner we do NOT expect
Set-DbcConfig -Name policy.invaliddbowner.name -Value 'sa'
# Should backups be compressed by default?
Set-DbcConfig -Name policy.backup.defaultbackupcompression -Value $true
# Do we allow DAC connections?
Set-DbcConfig -Name policy.dacallowed -Value $true
# What recovery model should we have?
Set-DbcConfig -Name policy.recoverymodel.type -value FULL
# What should our database growth type be?
Set-DbcConfig -Name policy.database.filegrowthtype -Value kb
# What authentication scheme are we expecting?
Set-DbcConfig -Name policy.connection.authscheme -Value 'KERBEROS'
# Which Agent Operator should be defined?
Set-DbcConfig -Name agent.dbaoperatorname -Value 'The DBA Team'
# Which Agent Operator email should be defined?
Set-DbcConfig -Name agent.dbaoperatoremail -Value 'TheDBATeam@TheBeard.Local'
# Which failsafe operator should be defined?
Set-DbcConfig -Name agent.failsafeoperator -Value 'The DBA Team'
## Set the database mail profile name
Set-DbcConfig -Name agent.databasemailprofile -Value 'DbaTeam'
# Where is the whoisactive stored procedure?
Set-DbcConfig -Name policy.whoisactive.database -Value master
# What is the maximum time since I took a Full backup?
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7
# What is the maximum time since I took a DIFF backup (in hours) ?
Set-DbcConfig -Name policy.backup.diffmaxhours -Value 26
# What is the maximum time since I took a log backup (in minutes)?
Set-DbcConfig -Name policy.backup.logmaxminutes -Value 30
# What is my domain name?
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
# Where is my Ola database?
Set-DbcConfig -Name policy.ola.database -Value master
# Which database should not be checked for recovery model
Set-DbcConfig -Name policy.recoverymodel.excludedb -Value 'master','msdb','tempdb'
# Should I skip the check for temp files on c?
Set-DbcConfig -Name skip.tempdbfilesonc -Value $true
# Should I skip the check for temp files count?
Set-DbcConfig -Name skip.tempdbfilecount -Value $true
# Which Checks should be excluded?
Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value LogShipping,ExtendedEvent, PseudoSimple,SPN, TestLastBackupVerifyOnly,IdentityUsage,SaRenamed
# How many months before a build is unsupported do I want to fail the test?
Set-DbcConfig -Name policy.build.warningwindow -Value 6
## I need to set the app.cluster configuration to one of the nodes for the HADR check
## and I need to set the domain.name value
Set-DbcConfig -Name app.cluster -Value $SQL0
Set-DbcConfig -Name domain.name -Value 'TheBeard.Local'
## I also skip the ping check for the listener as we are in Azure
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution
Export-DbcConfig -Path Git:\Production.Json
and then I can use it with
Import-DbcConfig -Path Git:\Production.Json
Invoke-DbcCheck
01 - Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save the results to a variable
$Testresults = Invoke-DbcCheck -PassThru -Show Fails

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results

$Testresults | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_summary -AutoCreateTable
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable

and I get two tables, one for the summary

02 - summary

and one for the details

03 - detail
This works absolutely fine and I could continue to add test results in this fashion, but the data has no date property, so it is not very useful for reporting.
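A hedged alternative sketch would be to stamp each result with a date on the way in using a calculated property; the rest of this post takes a different route with staging tables and triggers, below.

$RunDate = Get-Date
$Testresults.TestResult |
    Select-Object *, @{Name = 'TestDate'; Expression = {$RunDate}} |
    Write-DbaDataTable -SqlInstance $sql0 -Database tempdb -Table Prod_dbachecks_detail -AutoCreateTable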

Create tables and triggers

This is one way of doing it. I am not sure it is the best way but it works! I always look forward to seeing how people take ideas and move them forward, so if you have a better/different solution please blog about it and reference it in the comments below

First I created a staging table for the summary results

CREATE TABLE [dbachecks].[Prod_dbachecks_summary_stage](
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

and a destination table with a primary key and a date column which defaults to today's date

CREATE TABLE [dbachecks].[Prod_dbachecks_summary](
	[SummaryID] [int] IDENTITY(1,1) NOT NULL,
	[TestDate] [date] NOT NULL,
	[TagFilter] [nvarchar](max) NULL,
	[ExcludeTagFilter] [nvarchar](max) NULL,
	[TestNameFilter] [nvarchar](max) NULL,
	[TotalCount] [int] NULL,
	[PassedCount] [int] NULL,
	[FailedCount] [int] NULL,
	[SkippedCount] [int] NULL,
	[PendingCount] [int] NULL,
	[InconclusiveCount] [int] NULL,
	[Time] [bigint] NULL,
	[TestResult] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_summary] PRIMARY KEY CLUSTERED 
(
	[SummaryID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary] ADD  CONSTRAINT [DF_Prod_dbachecks_summary_TestDate]  DEFAULT (getdate()) FOR [TestDate]
GO

and added an INSERT trigger to the staging table

CREATE TRIGGER [dbachecks].[Load_Prod_Summary] 
   ON   [dbachecks].[Prod_dbachecks_summary_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_summary] 
	([TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult])
	SELECT [TagFilter], [ExcludeTagFilter], [TestNameFilter], [TotalCount], [PassedCount], [FailedCount], [SkippedCount], [PendingCount], [InconclusiveCount], [Time], [TestResult] FROM [dbachecks].[Prod_dbachecks_summary_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_summary_stage] ENABLE TRIGGER [Load_Prod_Summary]
GO
and for the details I do the same thing. A details table
CREATE TABLE [dbachecks].[Prod_dbachecks_detail](
	[DetailID] [int] IDENTITY(1,1) NOT NULL,
	[SummaryID] [int] NOT NULL,
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL,
 CONSTRAINT [PK_Prod_dbachecks_detail] PRIMARY KEY CLUSTERED 
(
	[DetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail]  WITH CHECK ADD  CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary] FOREIGN KEY([SummaryID])
REFERENCES [dbachecks].[Prod_dbachecks_summary] ([SummaryID])
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail] CHECK CONSTRAINT [FK_Prod_dbachecks_detail_Prod_dbachecks_summary]
GO

A stage table

CREATE TABLE [dbachecks].[Prod_dbachecks_detail_stage](
	[ErrorRecord] [nvarchar](max) NULL,
	[ParameterizedSuiteName] [nvarchar](max) NULL,
	[Describe] [nvarchar](max) NULL,
	[Parameters] [nvarchar](max) NULL,
	[Passed] [bit] NULL,
	[Show] [nvarchar](max) NULL,
	[FailureMessage] [nvarchar](max) NULL,
	[Time] [bigint] NULL,
	[Name] [nvarchar](max) NULL,
	[Result] [nvarchar](max) NULL,
	[Context] [nvarchar](max) NULL,
	[StackTrace] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

with a trigger

CREATE TRIGGER [dbachecks].[Load_Prod_Detail] 
   ON   [dbachecks].[Prod_dbachecks_detail_stage]
   AFTER INSERT
AS 
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    INSERT INTO [dbachecks].[Prod_dbachecks_detail] 
([SummaryID],[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace])
	SELECT 
	(SELECT MAX(SummaryID) From [dbachecks].[Prod_dbachecks_summary]),[ErrorRecord], [ParameterizedSuiteName], [Describe], [Parameters], [Passed], [Show], [FailureMessage], [Time], [Name], [Result], [Context], [StackTrace]
	FROM [dbachecks].[Prod_dbachecks_detail_stage]

END
GO

ALTER TABLE [dbachecks].[Prod_dbachecks_detail_stage] ENABLE TRIGGER [Load_Prod_Detail]
GO

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, Truncate, and Confirm:$false to avoid any confirmation prompt, because I want this to run without any interaction, and I can get the results into the database.

$Testresults | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_summary_stage -FireTriggers -Truncate -Confirm:$False
$Testresults.TestResult | Write-DbaDataTable -SqlInstance $Instance -Database $Database -Schema dbachecks -Table Prod_dbachecks_detail_stage -FireTriggers -Truncate -Confirm:$False
detail with stage

Which means that I can now query some of this data and also create PowerBi reports for it.
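For example, a quick query for today's failures might look something like this. This is a sketch only; the database name matches the ValidationResults database used below, so adjust it to wherever you created the tables.

$query = "
SELECT s.TestDate, d.Describe, d.Context, d.Name, d.FailureMessage
FROM dbachecks.Prod_dbachecks_detail d
INNER JOIN dbachecks.Prod_dbachecks_summary s ON s.SummaryID = d.SummaryID
WHERE d.Passed = 0
AND s.TestDate = CAST(GETDATE() AS date);"
Invoke-DbaSqlQuery -SqlInstance $sql0 -Database ValidationResults -Query $query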

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using

Get-DbcCheck | Write-DbaDataTable -SqlInstance $sql0 -Database ValidationResults -Schema dbachecks -Table Checks -Truncate -Confirm:$False -AutoCreateTable

But because the Ola Hallengren Job names are configuration items I need to update the values for those checks which I can do as follows

$query = "
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.systemfull) + "' WHERE [Describe] = 'Ola - `$SysFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserFull) + "' WHERE [Describe] = 'Ola - `$UserFullJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserDiff) + "' WHERE [Describe] = 'Ola - `$UserDiffJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserLog) + "' WHERE [Describe] = 'Ola - `$UserLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.CommandLogCleanup) + "' WHERE [Describe] = 'Ola - `$CommandLogJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.SystemIntegrity) + "' WHERE [Describe] = 'Ola - `$SysIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIntegrity) + "' WHERE [Describe] = 'Ola - `$UserIntegrityJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.UserIndex) + "' WHERE [Describe] = 'Ola - `$UserIndexJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.OutputFileCleanup) + "' WHERE [Describe] = 'Ola - `$OutputFileJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.DeleteBackupHistory) + "' WHERE [Describe] = 'Ola - `$DeleteBackupJobName'
UPDATE [dbachecks].[Checks] SET [Describe] = 'Ola - " + (Get-DbcConfigValue -Name ola.jobname.PurgeBackupHistory) + "' WHERE [Describe] = 'Ola - `$PurgeBackupJobName'
"
Invoke-DbaSqlQuery -SqlInstance $SQL0 -Database ValidationResults -Query $query

You can get a sample Power Bi report in my Github which also has the code from this blog post

Then you just need to open it in PowerBi Desktop and

  • Click Edit Queries
  • Click Data Source Settings
  • Click Change Source
  • Change the Instance and Database names

09 - PowerBi

Then have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the check and the import that day, click on one of the groups under “Today’s Checks by Group”

This enables me to filter the results and see what has happened in the past so I can filter by one instance
05 - filter by instance
or I can filter by a group of tests
07 - filter by instance
or even by a group of tests for an instance
08 - filter by instance and insance

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the PowerBi in my GitHub

Happy Validating!

Checking Availability Groups with dbachecks

It’s been 45 days since we released dbachecks

Since then there have been 25 releases to the PowerShell Gallery!! Today release 1.1.119 was released 🙂 There have been over 2000 downloads of the module already.

In the beginning we had 80 checks and 108 configuration items, today we have 84 checks and 125 configuration items!

If you have already installed dbachecks it is important to make sure that you update regularly. You can do this by running

Update-Module dbachecks

If you want to try dbachecks, you can install it from the PowerShell Gallery by running

Install-Module dbachecks # -Scope CurrentUser # if not running as admin

You can read more about installation and read a number of blog posts about using different parts of dbachecks at this link https://dbatools.io/installing-dbachecks/

HADR Tests

Today we updated the HADR tests to add the capability to test multiple availability groups and fix a couple of bugs

Once you have installed dbachecks you will need to set some configuration so that you can perform the tests. You can see all of the configuration items and their values using

Get-DbcConfig | Out-GridView

get-config.png

You can set the values with the Set-DbcConfig command. It has intellisense to make things easier 🙂 To set the values for the HADR tests

Set-DbcConfig -Name app.cluster -Value sql1
Set-DbcConfig -Name app.computername -Value sql0,sql1
Set-DbcConfig -Name app.sqlinstance -Value sql0,sql1
Set-DbcConfig -Name domain.name -Value TheBeard.Local
Set-DbcConfig -Name skip.hadr.listener.pingcheck -Value $true
  • app.cluster requires one of the nodes of the cluster.
  • app.computername requires the windows computer names of the machines to run operating system checks against
  • app.sqlinstance requires the instance names of the SQL instances that you want to run SQL checks against (These are default instances but it will accept SERVER\INSTANCE)
  • domain.name requires the domain name the machines are part of
  • skip.hadr.listener.pingcheck is a boolean value which defines whether to skip the listener ping check or not. As this is in Azure I am skipping the check by setting the value to $true
  • policy.hadr.tcpport is set to default to 1433 but you can also set this configuration if your SQL is using a different port
NOTE – You can find all the configuration items that can skip tests by running
Get-DbcConfig -Name skip*
skips.png
Now that we have set the configuration (for the HADR checks – there are many more configuration items for other checks that you can set) you can run the checks with
Invoke-DbcCheck -Check HADR
check results.png
This runs the following checks
  • Each node on the cluster should be up
  • Each resource on the cluster should be online
  • Each SQL instance should be enabled for Always On
  • Connection check for the listener and each node
    • Should be pingable (unless skip.hadr.listener.pingcheck is set to true)
    • Should be able to run SQL commands
    • Should be the correct domain name
    • Should be using the correct tcpport
  • Each replica should not be in unknown state
  • Each synchronous replica should be synchronised
  • Each asynchronous replica should be synchronising
  • Each database should be synchronised (or synchronising) on each replica
  • Each database should be failover ready on each replica
  • Each database should be joined to the availability group on each replica
  • Each database should not be suspended on each replica
  • Each node should have the AlwaysOn_Health extended event
  • Each node should have the AlwaysOn_Health extended event running
  • Each node should have the AlwaysOn_Health extended event set to auto start

(Apologies folks over the pond, I use the Queen's English 😉 )

This is good for us to be able to run this check at the command line but we can do more.

We can export the results and display them with PowerBi. Note that we need to add -PassThru so that the results go through the pipeline, and that I used -Show Fails so that only the titles of the Describe and Context blocks and any failing tests are displayed to the screen

Invoke-DbcCheck -Check HADR -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment HADR-Test
Start-DbcPowerBi

results.png

This will create a file at C:\Windows\Temp\dbachecks and open the PowerBi report. You will need to refresh the data in the report and then you will see

dbachecks.png

Excellent, everything passed 🙂

Saving Configuration for reuse

We can save our configuration using Export-DbcConfig which will export the configuration to a json file

Export-DbcConfig -Path Git:\PesterTests\MyHADRTestsForProd.json

so that we can run this particular set of tests with this configuration by importing the configuration using Import-DbcConfig

Import-DbcConfig -Path Git:\PesterTests\MyHADRTestsForProd.json
Invoke-DbcCheck -Check HADR

In this way you can set up different check configurations for different use cases. This also enables you to make use of the checks in your CI/CD process. For example, I have a GitHub repository for creating a domain, a cluster and a SQL 2017 availability group using VSTS. I have saved a dbachecks configuration to my repository and as part of my build I can import that configuration, run the checks and output them to XML for consumption by the publish test results task of VSTS

After copying the configuration to the machine, I run

Import-Dbcconfig -Path C:\Windows\Temp\FirstBuild.json
Invoke-DbcCheck -AllChecks -OutputFile PesterTestResultsdbachecks.xml -OutputFormat NUnitXml
in my build step and then use the publish test results task and VSTS does the rest 🙂
VSTS results.png

 

 

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo. So clone it and navigate to the filepath

Create Database

First let's create a database. Connect to your Azure subscription

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

 Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YourSubscriptionIDHere

01 - subscription.png

Then set some variables

# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
# The logical server name: Use a random value or replace with your own value (do not capitalize)
$servername = "server-$(Get-Random)"
# Set an admin login and password for your database
# The login information for the server. You need to set these and uncomment them - Don't use these values

# $adminlogin = "ServerAdmin"                
# $password = "ChangeYourAdminPassword1"

# The ip address range that you want to allow to access your server - change as appropriate
# $startip = "0.0.0.0"
# $endip = "0.0.0.0"

# To just add your own IP Address
$startip = $endip = (Invoke-WebRequest 'http://myexternalip.com/raw').Content -replace "`n"

# The database name
$databasename = "tweets"

$AzureSQLServer = "$servername.database.windows.net,1433"
$Table = "table.sql"
$Proc = "InsertTweets.sql"

They should all make sense; take note that you need to set and uncomment the login and password and choose which IPs to allow through the firewall

Create a Resource Group

## Create a resource group

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location

02 - resource group.png

Create a SQL Server

## Create a Server

$newAzureRmSqlServerSplat = @{
    SqlAdministratorCredentials = $SqlAdministratorCredentials
    ResourceGroupName = $resourcegroupname
    ServerName = $servername
    Location = $location
}
New-AzureRmSqlServer @newAzureRmSqlServerSplat

03 - create server.png

Create a firewall rule; I just use my own IP and also allow the Azure IPs

$newAzureRmSqlServerFirewallRuleSplat = @{
    EndIpAddress = $endip
    StartIpAddress = $startip
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    FirewallRuleName = "AllowSome"
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

# Allow Azure IPS

$newAzureRmSqlServerFirewallRuleSplat = @{
    AllowAllAzureIPs = $true
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

03a - firewall rule.png

Create a database

# Create a database

$newAzureRmSqlDatabaseSplat = @{
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    Edition = 'Basic'
    DatabaseName = $databasename
}
New-AzureRmSqlDatabase  @newAzureRmSqlDatabaseSplat

04 - create database.png

I have used the dbatools module to run the scripts to create the database. You can get it using

Install-Module dbatools # -Scope CurrentUser # if not admin process

Run the scripts

# Create a credential

$newObjectSplat = @{
    ArgumentList = $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force)
    TypeName = 'System.Management.Automation.PSCredential'
}
$SqlAdministratorCredentials = New-Object @newObjectSplat

## Using dbatools module

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database = $databasename
    File = $Table,$Proc
    SqlInstance = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat

05 - Create Table Sproc.png

This will have created the following in Azure, you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First login (if you need to) and set some variables

## This creates cognitive services for analysing the tweets

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YOUR SUBSCRIPTION ID HERE

#region variables
# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
$APIName = 'TweetAnalysis'
#endregion

Then create the API and get the key

#Create the cognitive services

$newAzureRmCognitiveServicesAccountSplat = @{
    ResourceGroupName = $resourcegroupname
    Location = $location
    SkuName = 'F0'
    Name = $APIName
    Type = 'TextAnalytics'
}
New-AzureRmCognitiveServicesAccount @newAzureRmCognitiveServicesAccountSplat

# Get the Key

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat 

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the Github repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

click new connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click new Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow, click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Select during import for the SQL Server Connection and choose the SQL Server Connection and do the same for the Twitter Connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

search for detect and choose the Text Analytics Detect Sentiment

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them

#region Forgot the details

# Copy the URL if you forget to save it

$getAzureRmCognitiveServicesAccountSplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccount @getAzureRmCognitiveServicesAccountSplat).Endpoint | Clip

# Copy the Key if you forgot

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat).Key1 | Clip

#endregion

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__ – 0
  • Date – Created At
  • Sentiment – Score
  • Tweet – Tweet Text
  • UserLocation – Location
  • UserName – Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop-up with a prompt for the credentials. Choose Database and enter your credentials and click connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn’t run yet. If it does, just wait until you see some tweets and then click apply changes again.
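If you want to check whether the flow has written anything yet, a quick hedged sanity check from PowerShell looks something like this. The table name is an assumption (use whatever table.sql in the repo actually creates) and it reuses the variables and credential from the database creation steps above.

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    SqlInstance   = $AzureSQLServer
    Database      = $databasename
    Query         = 'SELECT COUNT(*) AS TweetCount FROM dbo.Tweets' # dbo.Tweets is a hypothetical table name
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat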

You will probably want to alter the pictures and links etc. and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my PowerBi For Pester file and made it beautiful and whilst we were discussing this we found that if the Pester Tests were not formatted correctly the Power Bi looked … well rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were

  • The Describe title should be in double quotes
  • The Describe should use the plural Tags parameter
  • The Tags should be singular
  • The first Tag should be a unique tag in Get-DbcConfig
  • The Context title should end with $psitem
  • The code should use Get-SqlInstance or Get-ComputerName
  • The code should use the ForEach method
  • The code should not use $_
  • The code should contain a Context block

She asked me if I could write the Pester Tests for it and this is how I did it. I needed to look at the Tags parameter for the Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don't know very much about this but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.
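Before showing the real code, here is a minimal sketch of the idea, not the dbachecks tests themselves: parse a piece of script text with the Parser and then use FindAll to pull out every command it contains.

# a throwaway piece of script text to parse - single quotes stop the variables expanding
$code = 'Describe "Testing Backups" -Tags Backup, $filename { Context "Testing backups on $psitem" { } }'
$tokens = $null
$errors = $null
# build the AST for the text
$ast = [System.Management.Automation.Language.Parser]::ParseInput($code, [ref]$tokens, [ref]$errors)
# find every command in the AST (the $true searches nested script blocks too) and list the command names
$ast.FindAll( { param($node) $node -is [System.Management.Automation.Language.CommandAst] }, $true).ForEach{
    $PSItem.GetCommandName()   # returns Describe and Context
}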

The first thing I did was to get the Pester Tests, which we have located in a checks folder, loop through them and get the content of each file with the Raw parameter

Describe "Checking that each dbachecks Pester test is correctly formatted for Power Bi and Coded correctly" {
    $Checks = (Get-ChildItem $ModuleBase\checks).Where{$_.Name -ne 'HADR.Tests.ps1'}
    $Checks.Foreach{
        $Check = Get-Content $Psitem.FullName -Raw
        Context "$($_.Name) - Checking Describes titles and tags" {
Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
$Describes = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true) |
    ForEach {
    $CE = $_.CommandElements
    $secondString = ($CE | Where { $_.StaticType.name -eq 'string' })[1]
    $tagIdx = $CE.IndexOf(($CE | Where ParameterName -eq 'Tags')) + 1
    $tags = if ($tagIdx -and $tagIdx -lt $CE.Count) {
        $CE[$tagIdx].Extent
    }
    New-Object PSCustomObject -Property @{
        Name = $secondString
        Tags = $tags
    }
}
As I understand it, this code uses the Parser on $check (which contains the code from the file), finds all of the Describe commands, and creates an object holding the title of the Describe (the element with a StaticType of string) and the values from the Tags parameter.
When I ran this against the database tests file I got the following results
Then it was a simple case of writing some tests for the values
@($describes).Foreach{
    $title = $PSItem.Name.ToString().Trim('"').Trim('''')
    It "$title Should Use a double quote after the Describe" {
        $PSItem.Name.ToString().Startswith('"')| Should be $true
        $PSItem.Name.ToString().Endswith('"')| Should be $true
    }
    It "$title should use a plural for tags" {
        $PsItem.Tags| Should Not BeNullOrEmpty
    }
    # a simple test for no esses apart from statistics and Access!!
    if ($null -ne $PSItem.Tags) {
        $PSItem.Tags.Text.Split(',').Trim().Where{($_ -ne '$filename') -and ($_ -notlike '*statistics*') -and ($_ -notlike '*BackupPathAccess*') }.ForEach{
            It "$PsItem Should Be Singular" {
                $_.ToString().Endswith('s')| Should Be $False
            }
        }
        It "The first Tag Should Be in the unique Tags returned from Get-DbcCheck" {
            $UniqueTags -contains $PSItem.Tags.Text.Split(',')[0].ToString()| Should Be $true
        }
    }
    else {
        It "You haven't used the Tags Parameter so we can't check the tags" {
            $false| Should be $true
        }
    }
}

The Describes variable is inside @() so that if there is only one the ForEach Method will still work. The unique tags are returned from our command Get-DbcCheck which shows all of the checks. We will have a unique tag for each test so that they can be run individually.

Yes, I have tried to ensure that the tags are singular by ensuring that they do not end with an s (apart from statistics) and so had to exclude BackupPathAccess and statistics from the check. Filename is a variable that we add to each Describe's Tags so that we can run all of the tests in one file. I added a little if block to the Pester as well so that the error is more obvious if the Tags parameter was not passed

I did the same with the context blocks as well

Context "$($_.Name) - Checking Contexts" {
    ## Find the Contexts
    $Contexts = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$tokens, [ref]$errors).
    FindAll([Func[Management.Automation.Language.Ast, bool]] {
            param($ast)
            $ast.CommandElements -and
            $ast.CommandElements[0].Value -eq 'Context'
        }, $true) |
        ForEach {
        $CE = $_.CommandElements
        $secondString = ($CE | Where { $_.StaticType.name -eq 'string' })[1]
        New-Object PSCustomObject -Property @{
            Name = $secondString
        }
    }
    @($Contexts).ForEach{
        $title = $PSItem.Name.ToString().Trim('"').Trim('''')
        It "$Title Should end with `$psitem So that the PowerBi will work correctly" {
            $PSItem.Name.ToString().Endswith('psitem"')| Should Be $true
        }
    }
}
This time we look for the Context command and ensure that the string value ends with psitem as the PowerBi parses the last value when creating columns
Finally I get all of the code and check that it matches some coding standards
Context "$($_.Name) - Checking Code" {
    ## This just grabs all the code
    $AST = [System.Management.Automation.Language.Parser]::ParseInput($Check, [ref]$null, [ref]$null)
    $Statements = $AST.EndBlock.statements.Extent
    ## Ignore the filename line
    @($Statements.Where{$_.StartLineNumber -ne 1}).ForEach{
        $title = [regex]::matches($PSItem.text, "Describe(.*)-Tag").groups[1].value.Replace('"', '').Replace('''', '').trim()
        It "$title Should Use Get-SqlInstance or Get-ComputerName" {
            ($PSItem.text -Match 'Get-SqlInstance') -or ($psitem.text -match 'Get-ComputerName')| Should be $true
        }
        It "$title Should use the ForEach Method" {
        ($Psitem.text -match 'Get-SqlInstance\).ForEach{') -or ($Psitem.text -match 'Get-ComputerName\).ForEach{') | Should Be $true # use the \ to escape the )
    }
    It "$title Should not use `$_" {
        ($Psitem.text -match '$_')| Should Be $false
    }
    It "$title Should Contain a Context Block" {
        $Psitem.text -match 'Context'| Should Be $True
    }
}

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are, and then loop through each statement apart from the first line to ensure that the code uses our internal commands Get-SQLInstance or Get-ComputerName to get information, that we loop through each of those arrays using the ForEach method rather than ForEach-Object, that we use $psitem rather than $_ to reference the “This Item” in the array, and that each Describe block has a Context block.

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and ensure that the Power Bi results still look beautiful!

Anyone can run the tests using

Invoke-Pester .\tests\Unit.Tests.ps1 -show Fails
before they create a Pull request and it looks like
if everything is Green then they can submit their Pull Request 🙂 If not they can see quickly that something needs to be fixed. (fail early 🙂 )
03 fails.png

A Pretty PowerBi Pester Results Template File

I have left the heat and humidity of Singapore where I have been presenting at the PowerShell Conference Asia and DevOpsDays Singapore to travel to Seattle for PASS Summit. During my Green is Good – Red is Bad session someone asked me if the PowerBi that I showed at the end would work with any Pester Test Results object and I said (without thinking) that it would.

It turns out that the PowerBi that I had set up for that session will work with my function to run Pester Tests against an Ola Hallengren installation but some of the formatting and custom columns were specific to that test.

I said that I would share a Power Bi file that people could plug any Pester Test Results into. This is the first iteration of that. I doubt that it will work for every single test but I think it will be a good starting point for people to use.

This is how to use it

Download the file from here.

Run your Pester Tests using the PassThru Parameter and set the results to a variable, you can also use the Show Parameter to reduce the output of the tests to the screen (and also speed up the tests)

$PesterResults = Invoke-Pester -Script  C:\temp\PBI-Test01.ps1 -Show Summary -PassThru

Then we convert the $PesterResults object into a JSON file

$PesterResults.TestResult | ConvertTo-Json -Depth 5 | Out-File C:\temp\pbi-test.json

Open the Power Bi file you downloaded

  • Click Home
  • then the words “Edit Queries”
  • then data source settings
  • highlight the filename and click change source
  • then navigate to the JSON file you just created, click ok and close and then apply changes.

Which will load the data from the JSON file and display your pester results. You can then save this file with a new name and keep the template for other tests.

It’s not going to be perfect

It’s not going to work in all circumstances and I expect that with some test results it will display the results in a less than optimal manner but you should be able to modify this to suit your needs.

Please give it a try and see how you get on

Here is a sample report created with Demo 1 from my Green is Good session

You can click around and change the data you can see and also look at the other 4 pages

Here is another one that I created using my dbatools-scripts repo and a config file. Again, have a click around and see what it does.

$Config = (Get-Content GIT:\dbatools-scripts\TestConfig.json) -join "`n" | ConvertFrom-Json
$PesterResults = Invoke-Pester .\dbatools-scripts\ -PassThru
$PesterResults.TestResult | ConvertTo-Json | Out-File C:\temp\dbatools-scripts-pester.json

 

I also created a quick video showing the process, which I will upload when I am not at 35,000 feet!!

Enjoy 🙂 Also, let me know if you think it would be better to have the file in GitHub, which would allow contributions, but it would only be seen as a binary file and therefore merging would be difficult. I am happy to do so.

Enabling Cortana for dbareports PowerBi

Last week at the Birmingham user group I gave a presentation about PowerShell and SQL Server

saved-image-from-tweetium-8

It was a very packed session as I crammed in the new sqlserver module, dbatools and dbareports 🙂 On reflection I think this was a bit too much for a one-hour session, but at the end of the session I demo'd live Cortana using the dbareports dataset and returning a Cortana PowerBi page.

As always it took a couple of goes to get it right but when it goes correctly it is fantastic. I call it a salary increasing opportunity! Someone afterwards asked me how it was done so I thought that was worth a blog post

There is a video below but the steps are quite straightforward.

Add Cortana Specific Pages

Whilst you can just enable Cortana to access your dataset, as shown later in this post, which allows Cortana to search available datasets and return an appropriate visualisation, it is better to provide specific pages for Cortana to use and display. You can do this in PowerBi Desktop

Start by adding a new page in your report by clicking on the plus button

 

add page.PNG

and then change the size of the report page by clicking on the paintbrush icon in the visualisation page.

page-size

This creates a page that is optimised for Cortana to display and also will be the first place that Cortana will look to answer the question

Power BI first looks for answers in Answer Pages and then searches your datasets and reports for other answers and displays them in the form of visualizations. The highest-scoring results display first as best matches, followed by links to other possible answers and applications. Best matches come from Power BI Answer Pages or Power BI reports.

Rename the page so that it contains the words or phrase you expect to be in the question, such as “Servers By Version”. You will help Cortana and PowerBi to get better results if you use some of the column names in your dataset

Then it is just another report page and you can add visualisations just like any other page

cortana page.PNG

Make Cortana work for you and your users

If your users are likely to use a number of different words in their questions you can assist Cortana to find the right answer by adding alternate names. So maybe if your page is sales by store you might add shop, building, results, amount, orders. This is also useful when Cortana doesn't understand the correct words: as you will notice in the screenshot below, I have added “service” for “servers” and “buy” for “by” to help get the right answer. You can add these alternate words by clicking the paintbrush under visualisations and then Page Information

cortana-additional

Publish your PBIX file to PowerBi.com

To publish your PowerBi report to PowerBi.com, use either the Publish button in PowerBi Desktop

publish

or the PowerBiPS module

Install-Module -Name PowerBIPS
#Grab the token, will require a sign in
$authToken = Get-PBIAuthToken -Verbose
Import-PBIFile -authToken $authToken -filePath "Path to PBIX file" -Verbose

Enable Cortana

In your browser log into https://powerbi.com and then click on the cog and then settings

powerbicom.PNG

then click on Datasets

settings

Then choose the dataset – in this case the dbareports SQL Information sample – and click the tick box to allow Cortana to access this dataset and then click apply

dataset settings.PNG

Use Cortana against your PowerBi data

You can type into the Cortana search box and it will offer the opportunity for you to choose your PowerBi data

cortana-search

but it is so much better when you let it find the answer 🙂

cortana-search-1

and if you want to go to the PowerBi report there is a handy link at the bottom of the Cortana page

cortana-search-2

I absolutely love this, I was so pleased when I got it to work and the response when I show people is always one of wonder for both techies and non-techies alike

The conditions for Cortana to work

You will need to have added your work or school Microsoft ID to the computer or phone that you want to use Cortana on and that account must be able to access the dataset either because it is the dataset owner or because a dashboard using that dataset has been shared with that account.

From this page on PowerBi.com

When a new dataset or custom Cortana Answer Page is added to Power BI and enabled for Cortana it can take up to 30 minutes for results to begin appearing in Cortana. Logging in and out of Windows 10, or otherwise restarting the Cortana process in Windows 10, will allow new content to appear immediately.

It’s not perfect!

When you start using Cortana to query your data you will find that at times it is very frustrating. My wife was in fits of giggles listening to me trying to record the video below as Cortana refused to understand that I was saying “servers” and repeatedly searched Bing for “service”. Whilst you can negate the effect by adding alternate names in the Q and A settings, it is still a bit hit and miss at times.

It is amazing

There is something about giving people the ability to just talk to their device in a meeting and for example with dbareports ask

Which clients are in Bolton

or

When was the last backup for client The Eagles

and get the information they require and a link to the report in PowerBi.com. I am certain that the suits will be absolutely delighted at being able to show off in that way, which is why I call it a salary increasing opportunity 🙂

We would love YOU to come and join us at the SQL Community Collaborative

Help us make dbatools, dbareports and Invoke-SQLCmd2 even better. You can join in by forking the repos in GitHub, writing your code and then performing a PR, but we would much rather that you came and discussed new requests in our Trello boards, raised issues in GitHub and generally discussed the modules in the SQL Server Community Slack #dbatools #dbareports. We are also looking for assistance with our wiki pages, Pester tests and AppVeyor integration for our builds, and any comments people want to make

SQL Server Collaborative GitHub Organisation holding the modules. Go here to raise issues, fork the repositories or download the code

dbatools Trello for discussion about new cmdlets

SQL Server Community Slack where you can find #dbatools and #dbareports as well as over 1100 people discussing all aspects of the Data Platform, events, jobs, presenting

COME AND JOIN US

 

PowerBi and API – Visualising my Checkins

For my own amusement, and also to show my wife where I have been, I use the Swarm check-in app on my phone and check in to places. Also for my own amusement, I used PowerBi to visualise the data via the API and allow me to filter it in various ways.

Whilst at the PowerShell Conference in Asia I was showing the mobile app to a group over some food and saying how easy it was and June Blender, the mother of PowerShell help, said that I ought to blog about it. So I have 🙂

Follow these steps and you can create this report.

powerbi8.PNG

You can also download the blank report and add your own access token to it should you wish. Details at the end of the post

I am using the swarm API but the principle is the same for any other API that provides you with data. For example, I used the same principles to create the embedded reports on the PASS PowerShell Virtual Chapter page showing the status of the cards suggesting improvements to the sqlserver module for the product team to work on. Hopefully, this post will give you some ideas to work on and show you that it is quite easy to get excellent data visualisation from APIs

First up we need to get the data. I took a look at the Swarm developers page (the Trello is here by the way). I had to register for an app, which gave me a client id and a secret. I then followed the steps here to get my user token. I was only interested in my own check-ins, so I used the steps under Token flow Client applications to get my access token, which I used in a URL like this.

https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=ACCESS_TOKEN&v=YYYYMMDD

I added the limit of 5000 as the default number of check-ins returned was too small for my needs, and the date was that day’s date.
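
If you want the v parameter to always be the day you run the script rather than a hard-coded value, a one-liner like this works ($AccessToken here is just a placeholder for the token you obtain in the next step):

# Build the API versioning parameter from today's date; the API expects YYYYMMDD
$v = Get-Date -Format 'yyyyMMdd'
# $AccessToken is a placeholder for the token returned by the OAuth steps below
$CheckinUrl = "https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=$AccessToken&v=$v"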

You can get the access token in PowerShell using code I got from the magnificent Stephen Owen’s blog post

## Enter the details
$Clientid =''  ## Enter ClientId from foursquare
$redirect = '' ## enter redirect url from client app in foursquare
##Create the URL:
$URL = "https://foursquare.com/oauth2/authenticate?client_id=$Clientid&response_type=token&redirect_uri=$redirect"
## function from https://foxdeploy.com/2015/11/02/using-powershell-and-oauth/
Function Show-OAuthWindow {
    param([string]$URL) # the URL to load in the login window
    Add-Type -AssemblyName System.Windows.Forms
    $form = New-Object -TypeName System.Windows.Forms.Form -Property @{Width = 440; Height = 640}
    $web = New-Object -TypeName System.Windows.Forms.WebBrowser -Property @{Width = 420; Height = 600; Url = $URL}
    $DocComp = {
        $Global:uri = $web.Url.AbsoluteUri
        if ($Global:uri -match "error=[^&]*|code=[^&]*") { $form.Close() }
    }
    $web.ScriptErrorsSuppressed = $true
    $web.Add_DocumentCompleted($DocComp)
    $form.Controls.Add($web)
    $form.Add_Shown({$form.Activate()})
    $form.ShowDialog() | Out-Null
}
#login to get an access code then close the redirect window
Show-OAuthWindow -URL $URl
## grab the token
$regex = '(?<=access_token=)(.*)'
$authCode  = ($uri | Select-string -pattern $regex).Matches[0].Value
$global:AuthToken = $authCode
Write-output "Received a token, $AuthToken"
Write-Output "So the URL for your PowerBi Data is :-"
$PowerBiUrl = "https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=$AuthToken&v=20160829"
$PowerBiUrl | Clip

I checked the URL in a browser and confirmed that it returned a JSON object. Keep that URL safe, you will need it in a minute; the code above has placed it in your clipboard. If you want to jump straight to the report using the download, stop here and go to the end
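
If you would rather do that check from PowerShell than a browser, a minimal sketch like this works, assuming the usual response/checkins/items shape of the Swarm API:

# Quick sanity check of the data from PowerShell (assumes $PowerBiUrl from the script above)
$data = Invoke-RestMethod -Uri $PowerBiUrl
# The Swarm API wraps results in a response object; count the check-ins returned
$data.response.checkins.items.Count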

So now let’s move to Power BI. Go to powerbi.com and download PowerBi Desktop. It’s free. You will need to create an account using a school or work email address if you wish to put your reports on powerbi.com

Once you have downloaded and installed PowerBi Desktop you will be faced with a window like this

powerbi

Start by clicking Get Data

powerbi2

Then choose Web, paste the URL from above into the URL box and press OK, which will give you this

powerbi3

Now we need to put the data into a format that is of more use to us

power1

I clicked on the Record link for response, then converted it to a table, then clicked the little icon at the top of the column to expand the value.items column, and then the value.items column again. It doesn’t look like much yet but we are a step closer.

Next I looked in the table for the venue column, expanded that and the location column and the formatted address column.

power2

You can also expand the categories so that you can look at those too by expanding Value.items.venue.categories and Value.items.venue.categories1

powerbi4.gif

Now you will see that we have some duplicates in the data so we need to remove those. I did that by deleting the first 3 columns and then clicking remove duplicates under Delete Rows

power3b.gif

Then click Close and Apply. Then click on the Data button, as we need to rename and remove some more columns so that our data makes a little sense. I renamed the columns like this

Value.items.createdAt –> CreatedAt
Value.items.shout –> Comment
Value.items.venue.name –> VenueName
Value.items.venue.location.address –> VenueAddress
Value.items.timeZoneOffset –> TimeZoneOffset
Value.items.venue.location.lat –> VenueLat
Value.items.venue.location.lng –> VenueLong
Value.items.venue.location.postalCode –> VenuePostalCode
Value.items.venue.location.cc –> CountryCode
Value.items.venue.location.city –> City
Value.items.venue.location.state –> State
Value.items.venue.location.country –> Country
Value.items.venue.location.formattedAddress –> VenueAddress
Value.items.venue.url –> VenueURL
Value.items.venue.categories.name –> Category
Value.items.venue.categories.pluralName –> Categories

and removed all of the other columns. You can also do this in the Edit Queries window; I am just showing you that there are multiple ways to do the same thing. A rough PowerShell view of the flattened shape we are aiming for is shown below.
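
Purely as an illustration of that target shape, and not part of the Power BI steps, here is a PowerShell-only sketch; it assumes the usual Swarm response layout and the $PowerBiUrl built earlier, and only projects a handful of the renamed columns:

# Illustrative sketch: project the check-in JSON into a flat table like the one built above
$items = (Invoke-RestMethod -Uri $PowerBiUrl).response.checkins.items
$flat = $items | Select-Object @{Name = 'CreatedAt'; Expression = { $_.createdAt }},
    @{Name = 'Comment'; Expression = { $_.shout }},
    @{Name = 'VenueName'; Expression = { $_.venue.name }},
    @{Name = 'City'; Expression = { $_.venue.location.city }},
    @{Name = 'Country'; Expression = { $_.venue.location.country }},
    @{Name = 'Category'; Expression = { $_.venue.categories[0].name }}
# Show the first few flattened rows
$flat | Select-Object -First 5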

powerbi5.gif

Once you have done that you should have a window that looks like this. Notice I renamed the query to checkins as well

powerbi4.PNG

Now we need to create a calculated column for the time and a measure for the count of checkins. This is done using this code

Time = VAR UnixDays = [createdAt]/(60*60*24)
RETURN (DATEVALUE("1/1/1970")+UnixDays)

CountCheckins = COUNT(checkins[Time])
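
For comparison only, the same Unix-epoch conversion that the DAX column performs can be checked in PowerShell (version 5 or later); the value used here is just an example:

# createdAt is seconds since 1 January 1970; convert an example value to a date and time
[DateTimeOffset]::FromUnixTimeSeconds(1472476800).DateTime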

Now we can move onto the report side of things. First we are going to download a custom visual. Go to the PowerBi Custom Visuals Page and download the Timeline visual.

powerbi5.PNG

and then import it into your PowerBi report. I have embedded a YouTube video below showing the steps I took to turn this into the PowerBi report. It’s pretty easy; you will be able to click on the visuals and then click on the data columns and alter them until you have the report that you want.

Once you have done this, you can upload it to PowerBi if you wish by clicking on the Publish button in PowerBi desktop and signing into PowerBi.com with your work email address.

powerbi6.PNG

and your report is available for you on PowerBi.com 🙂 By clicking on the pins on a visualisation you can add them to a dashboard.

powerbi8.gif

Once you have a dashboard you can then use the natural language query to ask questions of your data. Here are some examples

How many checkins are in GB
How many checkins are in airports
How many checkins by month
How many checkins by month in GB
Which airports
Show me hotel venuename and time
How many hotels by country
Show me hotel venuename and checkins count
metro stations venuename and count checkins as a map
Show me count checkins in Amsterdam by category as a donut

powerbi7.PNG

If you want to use the blank report, download it from here, open it in PowerBi Desktop, click Edit Queries and then Source, add your own URL, and click Apply and then Refresh

powerbi9.gif

Hopefully, this has given you some ideas of ways that you can create some reports from many of the data sources available to you via API

DBA Database scripts are on Github

It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren’s Maintenance Solution, Adam Machanic’s sp_WhoIsActive, this fantastic script for logging the results from sp_WhoIsActive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of dbo.InstanceList in my DBA database I am able to target instances by SQL Version, OS Version, Environment, Data Centre, System, Client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live. A minimal sketch of the targeting pattern is shown below.
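
Purely as an illustration of that pattern, and not the actual agent job code, the loop looks something like this. $CentralDBAServer and the Environment filter are assumptions for the sketch, while the table and column names match the query used later in this post:

# Sketch only: pick up the instances flagged for a given environment and loop through them
# $CentralDBAServer is a placeholder for the server that hosts the DBA Database
$InstanceQuery = @"
SELECT ServerName, InstanceName, Port
FROM dbo.InstanceList
WHERE Environment = 'Development'
"@
$Instances = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database DBADatabase -Query $InstanceQuery
foreach ($Instance in $Instances) {
    $Connection = "$($Instance.ServerName)\$($Instance.InstanceName),$($Instance.Port)"
    # The real job would now install or update the DBA scripts on $Connection
    Write-Output "Would deploy DBA scripts to $Connection"
}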

I talked to a lot of people in Hannover and they all suggested that I placed the scripts onto GitHub and after some how-to instructions from a few people (Thank you Luke) I spent the weekend updating and cleaning up the code and you can now find it on GitHub here

github

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use: some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome

 

 

Power Bi, PowerShell and SQL Agent Jobs

Continuing my series on using Power Bi with my DBA Database I am going to show in this post how I create the most useful daily report for DBAs – The SQL Agent Job report. You can get the scripts and reports here

Please note this project became dbareports.io

AG1

This gives a quick overview of the status of the Agent Jobs across the estate and also quickly identifies recent failed jobs enabling the DBA to understand their focus and prioritise their morning efforts.

I gather the information into two tables, AgentJobDetail

CREATE TABLE [Info].[AgentJobDetail](
[AgetnJobDetailID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[Category] [nvarchar](50) NOT NULL,
[JobName] [nvarchar](250) NOT NULL,
[Description] [nvarchar](750) NOT NULL,
[IsEnabled] [bit] NOT NULL,
[Status] [nvarchar](50) NOT NULL,
[LastRunTime] [datetime] NOT NULL,
[Outcome] [nvarchar](50) NOT NULL,
CONSTRAINT [PK_info.AgentJobDetail] PRIMARY KEY CLUSTERED
(
[AgetnJobDetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

and AgentJobServer

CREATE TABLE [Info].[AgentJobServer](
[AgentJobServerID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[NumberOfJobs] [int] NOT NULL,
[SuccessfulJobs] [int] NOT NULL,
[FailedJobs] [int] NOT NULL,
[DisabledJobs] [int] NOT NULL,
[UnknownJobs] [int] NOT NULL,
CONSTRAINT [PK_Info.AgentJobServer] PRIMARY KEY CLUSTERED
(
[AgentJobServerID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The Detail table holds the results of every Agent Job and the Server table holds a roll up for each server. The script to gather this information is based on the script I used to put the information into an Excel Sheet as described in my post How I Check Hundreds of Agent Jobs in 60 Seconds with PowerShell which I also altered to send an HTML email to the DBA team each morning. This however is a much better solution and allows for better monitoring and trending.

As I have explained in my previous posts, I use an Instance List table to hold the information about each instance in the estate and a series of PowerShell scripts which run via Agent Jobs to gather the information into various tables. Those posts describe the use of the Write-Log function and the methodology of gathering the required information and looping through each instance, so I won’t repeat that here. There is an extra check I do, however, for Express Edition as this does not contain the Agent service

$edition = $srv.Edition
if ($Edition -eq 'Express') {
    Write-Log -Path $LogFile -Message "No Information gathered as this Connection $Connection is Express"
    continue
}

The Agent Job information can be found in SMO by exploring the $srv.JobServer.Jobs object and I gather the information by iterating through each job and setting the values we require to variables

try {
    $JobCount = $srv.JobServer.jobs.Count
    $successCount = 0
    $failedCount = 0
    $UnknownCount = 0
    $JobsDisabled = 0
    #For each job on the server
    foreach ($job in $srv.JobServer.Jobs)
    {
        $jobName = $job.Name;
        $jobEnabled = $job.IsEnabled;
        $jobLastRunOutcome = $job.LastRunOutcome;
        $Category = $Job.Category;
        $RunStatus = $Job.CurrentRunStatus;
        $Time = $job.LastRunDate;
        if ($Time -eq '01/01/0001 00:00:00')
        {$Time = ''}
        $Description = $Job.Description;
        #Counts for jobs Outcome
        if ($jobEnabled -eq $False)
        {$JobsDisabled += 1}
        elseif ($jobLastRunOutcome -eq "Failed")
        {$failedCount += 1; }
        elseif ($jobLastRunOutcome -eq "Succeeded")
        {$successCount += 1; }
        elseif ($jobLastRunOutcome -eq "Unknown")
        {$UnknownCount += 1; }
    }    
}

I found that some jobs had names and descriptions that had ' in them, which would cause the SQL update or insert statement to fail, so I use the replace method to replace the single ' with two ''

if ($Description -eq $null) {
    $Description = ' '
}
$Description = $Description.replace('''', '''''')
if ($jobName -eq $Null) {
    $jobName = 'None'
}
$JobName = $JobName.replace('''', '''''')

I then insert the data per job after checking that it does not already exist, which allows me to re-run the job, without any additional work, should a number of servers be uncontactable at the time the job runs

IF NOT EXISTS (
SELECT [AgetnJobDetailID]
FROM [DBADatabase].[Info].[AgentJobDetail]
where jobname = '$jobName'
and InstanceID = (SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
and lastruntime = '$Time'
)
INSERT INTO [Info].[AgentJobDetail]
([Date]
,[InstanceID]
,[Category]
,[JobName]
,[Description]
,[IsEnabled]
,[Status]
,[LastRunTime]
,[Outcome])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$Category'
,'$jobName'
,'$Description'
,'$jobEnabled'
,'$RunStatus'
,'$Time'
,'$jobLastRunOutcome')

I put this in a here-string variable and pass it to Invoke-SQLCmd. I do the same with the roll up, using the query below (a minimal sketch of the here-string pattern follows it)

INSERT INTO [Info].[AgentJobServer]
([Date]
,[InstanceID]
,[NumberOfJobs]
,[SuccessfulJobs]
,[FailedJobs]
,[DisabledJobs]
,[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$JobCount'
,'$successCount'
,'$failedCount'
,'$JobsDisabled'
,'$UnknownCount')
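
To make the here-string and Invoke-SQLCmd step concrete, here is a minimal sketch of the pattern; $CentralDBAServer is a placeholder rather than the real server name, and the other variables are those set in the loop above:

# Sketch of the pattern: expand the PowerShell variables inside a here-string and run it
# against the DBA Database ($CentralDBAServer is a placeholder for the central server name)
$Query = @"
INSERT INTO [Info].[AgentJobServer]
([Date],[InstanceID],[NumberOfJobs],[SuccessfulJobs],[FailedJobs],[DisabledJobs],[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID] FROM [DBADatabase].[dbo].[InstanceList]
  WHERE [ServerName] = '$ServerName' AND [InstanceName] = '$InstanceName' AND [Port] = '$Port')
,'$JobCount','$successCount','$failedCount','$JobsDisabled','$UnknownCount')
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database DBADatabase -Query $Query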

This job runs as a SQL Agent Job every morning, half an hour or so before the DBA arrives for the morning shift, vastly improving the ability of the DBA to prioritise their morning routine.

To create the report open Power Bi Desktop and click Get Data

ag2

Then choose SQL Server and click connect

ag3

Enter the Connection string, the database and the query to gather the data

ag5

The query is

Select IL.InstanceID,
IL.ServerName,
IL.InstanceName,
IL.Environment,
IL.Location,
AJD.Category,
AJD.Date,
AJD.Description,
AJD.IsEnabled,
AJD.JobName,
AJD.LastRunTime,
AJD.Outcome,
AJD.Status
FROM [dbo].[InstanceList] IL
JOIN [Info].[AgentJobDetail] AJD
ON IL.InstanceID = AJD.InstanceID
WHERE LastRunTime > DATEADD(Day,-31,GETDATE())

Once we have gathered the data we then create some extra columns and measures for the reports. First I create a date column from the datetime Date Column

DayDate = DATE(YEAR('Agent Job Detail'[Date]),MONTH('Agent Job Detail'[Date]),DAY('Agent Job Detail'[Date]))

I also do the same for the LastRunTime. I create a day of the week column so that I can report on job outcomes by day

DayOfWeek = CONCATENATE(WEEKDAY('Agent Job Detail'[Date],2),FORMAT('Agent Job Detail'[Date]," -dddd"))

My friend Terry McCann b | t helped me create a column that returns true if the last run time is within 24 hours of the current time, to help identify the recent jobs that have failed. NOTE – On a Monday morning you will need to change this if you do not check your jobs on the weekend.

Last Run Relative Hour = ((1.0*(NOW()-'Agent Job Detail'[LastRunTime]))*24)<24

I create a measure each for Succeeded, Failed and Unknown; the Succeeded one is shown below

Succeeded = IF('Agent Job Detail'[Outcome] = "Succeeded"
, 1
, 0)

Next we have to create some measures for the sum of failed jobs and the averages. This is the code for the 7 day sum

Failed7Days = CALCULATE(SUM('Agent Job Detail'[Failed]),FILTER (
ALL ( 'Agent Job Detail'[Last Run Date] ),
'Agent Job Detail'[Last Run Date] > ( MAX ( 'Agent Job Detail'[Last Run Date] ) - 7 )
&& 'Agent Job Detail'[Last Run Date] <= MAX ( 'Agent Job Detail'[Last Run Date] ) ) )

and for the 7 Day average

Failed7DayAverage = DIVIDE([Failed7Days],7)

I did the same for 30 days. I used the TechNet reference for DAX expressions and got ideas from Chris Webb’s blog

ag6

First I created the 30 day historical trend chart using a Line and Clustered Column chart, with the last run date as the axis, the Succeeded measure as the column, and the Failed, Failed 7 Day Average and Failed 30 Day Average measures as the lines

I then formatted the lines and title and column

ag7

To create the gauge, which shows how well we have done today, I created a measure to quickly identify today’s jobs

LastRun Relative Date Offset = INT('Agent Job Detail'[LastRunTime] - TODAY())

which I use as a filter for the gauge, as shown below. I also created two measures, zero and twenty, for the minimum and maximum of the gauge

ag8

The rest of the report is made up of measures for the 7 day and 30 day averages, a slicer for environment, and two tables: one to show the historical job counts and one to show the jobs that have failed in the last 24 hours, using the Last Run Relative Hour measure from above

ag9

There are many other reports that you may want to create, maybe by day of the week or by category, depending on your needs. Once you have the data gathered you are free to play with it as you see fit. Please add any further examples of reports you run or would like to run in the comments below.

Once you have your report written you can publish it to PowerBi.com and create a dashboard and query it with natural language. I have explained the process in previous posts

For example – How many Jobs failed today

ag110

Which server had most failed jobs

ag11

or, using the category field, which database maintenance jobs failed today

ag13

I hope these posts have given you ideas about how you can use Powershell, a DBA Database and Power Bi to help you to manage and report on your environment.

You can get the scripts and reports here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs