Deploying to a Power BI Report Server with PowerShell

Just a quick post to share some code that I used to solve a problem I had recently.

I needed to automate the deployment of some Power BI reports to a Power BI Report Server (PBRS) using TFS. I had some modified historical validation dbachecks pbix files that I wanted to automate the deployment of, enabling the client to quickly and simply deploy the reports as needed.

The manual way

It is always a good idea to understand how to do a task manually before automating it. To deploy to PBRS you need to use the Power BI Desktop optimised for Power BI Report Server. There are instructions here. Then it is easy to deploy to the PBRS by clicking File, then Save As, and choosing Power BI Report Server.

If I then want to set the data source to use a different set of credentials, I navigate to the folder that holds the report in PBRS, click the hamburger menu and choose Manage. I can then alter the User Name and Password, or the type of connection, by clicking on Data Sources and change it to use the reporting user, for example.

Automation

But I don't want to have to do this each time, and there will be multiple pbix files, so I wanted to automate the solution. The end result was a VSTS or TFS release process so that I could simply drop the pbix files into a Git repository, commit my changes, sync them and have the system deploy them automatically.
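
The day-to-day workflow then is the usual Git routine, run from the repository folder (the file name here is just an example):

git add .\MorningChecks.pbix
git commit -m "Updated the morning checks report"
git push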

As with all good ideas, I started with a Google search and found this post by Bill Anton which gave me a good start. (I could not get the connection string change to work in my test environment, but it was not required so I didn't really examine why.)

I wrote a function that I can use via TFS or VSTS by embedding it in a PowerShell script. The function requires the ReportingServicesTools module, which you can install with

Install-Module -Name ReportingServicesTools
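
If you want to confirm that the REST cmdlets the function relies on are available after installing, you can check with:

Get-Command -Module ReportingServicesTools -Name New-RsRestSession, New-RsRestFolder, Write-RsRestCatalogItem, Get-RsRestItemDataSource, Set-RsRestItemDataSource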

The function below is also available via the PowerShell Gallery and you can get it with

Install-Script -Name PublishPBIXFile

The source code is on GitHub

and the code to call it looks like this:

$folderName = 'TestFolder'
$ReportServerURI = 'http://localhost/Reports'
$folderLocation = '/'
$pbixfile = 'C:\Temp\test.pbix'
$description = "Descriptions"

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
which uploads the report to a folder, creating the folder if it does not exist. It then uploads the pbix file, overwriting the existing one if it already exists, and uses the username and password specified.
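
Because the function supports ShouldProcess, you can also preview what a run would do without changing anything:

Publish-PBIXFile @publishPBIXFileSplat -WhatIf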

If I want to use a domain reporting user instead, I can do

$UserName1 = 'TheBeard\ReportingUser'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'Windows'
    ConnectionUserName = $UserName1
    Secret             = $Password1
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat
and it changes the data source to use the domain reporting user.

If we want to use a SQL authenticated user then
$UserName1 = 'TheReportingUserOfBeard'

$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    # credential = $cred
    ConnectionUserName = $UserName1
    Secret             = $Password1

}
Publish-PBIXFile @publishPBIXFileSplat
Excellent, it all works from the command line. You can pass in a credential object as well as a username and password. The reason I enabled username and password? So that I can use TFS or VSTS and store my password as a secret variable.
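For example, to pass a credential object instead, something like this works from the command line (a sketch; Get-Credential prompts interactively, so it's less suited to a release pipeline):
$cred = Get-Credential
$publishPBIXFileSplat = @{
    ReportServerURI    = $ReportServerURI
    folderLocation     = $folderLocation
    description        = $description
    pbixfile           = $pbixfile
    folderName         = $folderName
    AuthenticationType = 'SQL'
    Credential         = $cred
    Verbose            = $true
}
Publish-PBIXFile @publishPBIXFileSplat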
Now I simply create a repository which has my pbix files and a PowerShell script and build a quick release process to deploy them whenever there is a change 🙂
The deploy script looks like this:
[CmdletBinding()]
Param (
    $PBIXFolder,
    $ConnectionStringPassword
)
$VerbosePreference = 'continue'
$ReportServerURI = 'http://TheBeardsAmazingReports/Reports'
Write-Output "Starting Deployment"
function Publish-PBIXFile {
    [CmdletBinding(DefaultParameterSetName = 'ByUserName', SupportsShouldProcess)]
    Param(
        [Parameter(Mandatory = $true)]
        [string]$FolderName,
        [Parameter(Mandatory = $true)]
        [string]$ReportServerURI,
        [Parameter(Mandatory = $true)]
        [string]$FolderLocation,
        [Parameter(Mandatory = $true)]
        [string]$PBIXFile,
        [Parameter()]
        [string]$Description = "Description of Your report Should go here",
        [Parameter()]
        [ValidateSet('Windows', 'SQL')]
        [string]$AuthenticationType,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$ConnectionUserName,
        [Parameter(ParameterSetName = 'ByUserName')]
        [string]$Secret,
        [Parameter(Mandatory = $true, ParameterSetName = 'ByCred')]
        [pscredential]$Credential
    )
    $FolderPath = $FolderLocation + $FolderName
    $PBIXName = $PBIXFile.Split('\')[-1].Replace('.pbix', '')
    try {
        Write-Verbose "Creating a session to the Report Server $ReportServerURI"
        # establish session w/ Report Server
        $session = New-RsRestSession -ReportPortalUri $ReportServerURI
        Write-Verbose "Created a session to the Report Server $ReportServerURI"
    }
    catch {
        Write-Warning "Failed to create a session to the report server $ReportServerURI"
        Return
    }
    # create folder (optional)
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Creating a folder called $FolderName at $FolderLocation")) {
            $Null = New-RsRestFolder -WebSession $session -RsFolder $FolderLocation -FolderName $FolderName -ErrorAction Stop
        }
    }
    catch {
        if ($_.Exception.InnerException.Message -eq 'The remote server returned an error: (409) Conflict.') {
            Write-Warning "The folder already exists - moving on"
        }
        else {
            Write-Warning "Failed to create a folder called $FolderName at $FolderLocation on report server $ReportServerURI, and not because it already exists"
            Return
        }
    }
    try {
        if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Uploading the pbix from $PBIXFile to the report server ")) {
            # upload copy of PBIX to new folder
            Write-RsRestCatalogItem -WebSession $session -Path $PBIXFile -RsFolder $FolderPath -Description $Description -Overwrite
        }
    }
    catch {
        Write-Warning "Failed to upload the file $PBIXFile to report server $ReportServerURI"
        Return
    }
    try {
        Write-Verbose "Getting the datasources from the pbix file for updating"
        # get data source object
        $datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIXName"
        Write-Verbose "Got the datasources for updating"
    }
    catch {
        Write-Warning "Failed to get the datasources"
        Return
    }
    try {
        Write-Verbose "Updating Datasource"

        foreach ($dataSource in $datasources) {
            if ($AuthenticationType -eq 'SQL') {
                $dataSource.DataModelDataSource.AuthType = 'UsernamePassword'
            }
            else {
                $dataSource.DataModelDataSource.AuthType = 'Windows'
            }
            if ($Credential -or $ConnectionUserName) {
                if ($Credential) {
                    $UserName = $Credential.UserName
                    $Password = $Credential.GetNetworkCredential().Password
                }
                else {
                    $UserName = $ConnectionUserName
                    $Password = $Secret
                }
                $dataSource.CredentialRetrieval = 'Store'
                $dataSource.DataModelDataSource.Username = $UserName
                $dataSource.DataModelDataSource.Secret = $Password
            }
            if ($PSCmdlet.ShouldProcess("$ReportServerURI", "Updating the data source for the report $PBIXName")) {
                # update data source object on server
                Set-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIXName" -RsItemType PowerBIReport -DataSources $dataSource
            }
        }
    }
    catch {
        Write-Warning "Failed to set the datasource"
        Return
    }
    Write-Verbose "Completed Successfully"
}
foreach ($File in (Get-ChildItem $PBIXFolder\*.pbix)) {
    Write-Output "Processing $($File.FullName)"
    ## to enable further filtering later
    if ($File.FullName -like '*') {
        $folderName = 'ThePlaceForReports'
        $folderLocation = '/'
        $UserName = 'TheBeard\ReportingUser'
        $Password = $ConnectionStringPassword
        $pbixfile = $File.FullName
    }
    if ($File.FullName -like '*dbachecks*') {
        $description = "This is the morning daily checks file that....... more info"
    }
    if ($File.FullName -like '*TheOtherReport*') {
        $description = "This is the other report, it reports others"
    }
    $publishPBIXFileSplat = @{
        ReportServerURI    = $ReportServerURI
        folderLocation     = $folderLocation
        description        = $description
        AuthenticationType = 'Windows'
        pbixfile           = $pbixfile
        folderName         = $folderName
        ConnectionUserName = $UserName
        Secret             = $Password
        Verbose            = $true
    }
    $Results = Publish-PBIXFile @publishPBIXFileSplat
    Write-Output $Results
}
Although the function does not need to be embedded in the script and can be deployed in a module, I have included it here to make it easier for people to use quickly. I then create a PowerShell step in VSTS or TFS and call the script with its parameters, and the Power BI files auto-deploy to the Power BI Report Server.
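The arguments for that PowerShell step look something like this (a sketch; the folder path and the secret variable name depend on how you configured the release):
-PBIXFolder "$(System.DefaultWorkingDirectory)\ReportsRepo" -ConnectionStringPassword "$(ConnectionStringPassword)"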
and I have my process complete 🙂
Happy Automating 🙂

Using the PowerShell AST to find a ForEach Method

In dbachecks we enable people to see what checks are available by running Get-DbcCheck. This gives a number of properties including the ‘type’ of check. This refers to the configuration item or parameter that is required to have a value for this check to run.

For example, any check to do with SQL Agent is of type Sqlinstance because it requires an instance to be specified, but a check for SPNs is of type ComputerName because it requires a computer name to run.
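
So, for example, you can see which checks need which input (a sketch; I'm assuming the properties are exposed as UniqueTag and Type, as described above):

Get-DbcCheck | Select-Object UniqueTag, Type
Get-DbcCheck | Where-Object { $_.Type -eq 'ComputerName' }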

Automation for the win

Because I believe in automation, I do not want to have to hard code these values anywhere but create them when the module is imported, so we use a JSON file to feed Get-DbcCheck and populate that JSON file when we import the module. This is done using the method that I described here and means that whenever a new check is added it is automatically available in Get-DbcCheck without any extra work.
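
Conceptually, the import-time code does something like this (a minimal sketch and not the actual dbachecks implementation; $ModuleRoot and the file layout are placeholders):

## Sketch: parse every check file at import time and persist the metadata to JSON
$checks = foreach ($checkFile in Get-ChildItem "$ModuleRoot\checks\*.Tests.ps1") {
    $check = Get-Content $checkFile.FullName -Raw
    # the AST parsing shown below pulls the title, tags and type out of $check
    [pscustomobject]@{
        File = $checkFile.Name
        # Title, Tags and Type are populated from the AST results
    }
}
$checks | ConvertTo-Json | Set-Content "$ModuleRoot\checks.json"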

We use code like this
## Parse the file with AST
$CheckFileAST = [Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null)
## Old code we can use the describes
$Describes = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true)

@($describes).ForEach{
    $groups += $filename
    $Describe = $_.CommandElements.Where{$PSItem.StaticType.name -eq 'string'}[1]
    $title = $Describe.Value
    $Tags = $PSItem.CommandElements.Where{$PSItem.StaticType.name -eq 'Object[]' -and $psitem.Value -eq $null}.Extent.Text.ToString().Replace(', $filename', '')
    # Choose the type
    if ($Describe.Parent -match "Get-Instance") {
        $type = "Sqlinstance"
    }
    elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo") {
        $type = "ComputerName"
    }
    elseif ($Describe.Parent -match "Get-ClusterObject") {
        $Type = "ClusteNode"
    }
First we parse the code with the AST and store that in the $CheckFileAST variable. Then we use the FindAll method to find any command elements that match "Describe", which conveniently gets our Describe blocks. We can then simply match the Parent object, which holds the code of the block, against each of the functions we use to get the values passed to the tests (Get-ComputerName, Get-Instance, Get-ClusterObject) and set the type appropriately.
When run against a check like this
Describe "Backup Path Access" -Tags BackupPathAccess, Storage, DISA, $filename {
    @(Get-Instance).ForEach{
        if ($NotContactable -contains $psitem) {
            Context "Testing Backup Path Access on $psitem" {
                It "Can't Connect to $Psitem" {
                    $false | Should -BeTrue -Because "The instance should be available to be connected to!"
                }
            }
        }
        else {
            Context "Testing Backup Path Access on $psitem" {
                $backuppath = Get-DbcConfigValue policy.storage.backuppath
                if (-not $backuppath) {
                    $backuppath = (Get-DbaDefaultPath -SqlInstance $psitem).Backup
                }
                It "can access backup path ($backuppath) on $psitem" {
                    Test-DbaSqlPath -SqlInstance $psitem -Path $backuppath | Should -BeTrue -Because 'The SQL Service account needs to have access to the backup path to backup your databases'
                }
            }
        }
    }
}
will find the Describe block and get the title “Backup Path Access” and the tags BackupPathAccess, Storage, DISA, $filename, and then find the Get-Instance and set the type to SqlInstance.

Until Rob breaks it!

This has worked wonderfully well for the six months or so of the life of dbachecks, but this week I broke it!
The problem was the performance of the code. It was taking a long time to run the tests and I am looking at ways to improve this. I was looking at the Server.Tests file because I thought, why not start with one of the smaller files?
It runs the following checks:

  • Server Power Plan Configuration
  • SPNs
  • Disk Space
  • Ping Computer
  • CPUPrioritisation
  • Disk Allocation Unit
  • Instance Connection
and it was looping through the computer names for each check like this
Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
Describe "SPNs" -Tags SPN, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Ping Computer" -Tags PingComputer, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    @(Get-ComputerName).ForEach{
    }
}
Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    @(Get-ComputerName).ForEach{
    }
}
I altered it to have only one loop for the computer names like so
@(Get-ComputerName).ForEach{
    Describe "Server Power Plan Configuration" -Tags PowerPlan, $filename {
    }
    Describe "SPNs" -Tags SPN, $filename {
    }
    Describe "Disk Space" -Tags DiskCapacity, Storage, DISA, $filename {
    }
    Describe "Ping Computer" -Tags PingComputer, $filename {
    }
    Describe "CPUPrioritisation" -Tags CPUPrioritisation, $filename {
    }
    Describe "Disk Allocation Unit" -Tags DiskAllocationUnit, $filename {
    }
}
Describe "Instance Connection" -Tags InstanceConnection, Connectivity, $filename {
    @(Get-Instance).ForEach{
    }
}
and immediately in testing, my checks for the Server tag decreased in run time by about 60% 🙂
I was very happy.
Then I added it to the dbachecks module on my machine, loaded the module and realised that my JSON file for Get-DbcCheck was no longer being populated for the type, because this line
elseif ($Describe.Parent -match "Get-ComputerName" -or $Describe.Parent -match "AllServerInfo")
was no longer true: the Get-ComputerName call is no longer inside the Describe block, so the code held by Parent no longer contains it.

AST for other things

So I googled Management.Automation.Language.Ast and the first result led me to docs.microsoft.com. There are a number of different language elements available there and I found InvokeMemberExpressionAst, which will let me find any methods that have been invoked, so now I can find the loops with
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst]
    }, $true)
When I examined the object returned, I could see that I could further limit the result to get only the method for Get-ComputerName, and then by choosing the Extent I can get the code of that loop
## New code uses a Computer Name loop to speed up execution so need to find that as well
$ComputerNameForEach = $CheckFileAST.FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param ($ast)
        $ast -is [System.Management.Automation.Language.InvokeMemberExpressionAst] -and $ast.Expression.Subexpression.Extent.Text -eq 'Get-ComputerName'
    }, $true).Extent

and now I can match the Tags to the type again :-)

if ($ComputerNameForEach -match $title) {
    $type = "ComputerName"
}
and now Get-DbcCheck is returning the right results and the checks are a little faster.
You can find dbachecks on the PowerShell Gallery or install it using
Install-Module dbachecks -Scope CurrentUser
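and then run the server checks discussed above with something like this (the server names are placeholders, and I'm assuming the Server tag as used above):
Invoke-DbcCheck -ComputerName Server1, Server2 -Check Server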

 

Hitting the Wall – #TSQL2sDay

Photo by Farrel Nobel on Unsplash

Welcome to another edition of T-SQL Tuesday!

This T-SQL Tuesday is hosted by Wayne Sheffield ( blog | twitter ) and he has asked us to talk about

[…] a time when you ran up against your own brick wall, and how you worked it out or dealt with it.

Hitting The Wall

When stuck in a problem in the past, I could often be found glued to a keyboard and screen for many hours. I would try this way and that way, I would Google and read Stack Overflow looking for ways around the particular issue I was seeing, trying different things, and finally I would get frustrated and fed up and stop.

Maybe I would go and walk the dog, maybe just sit somewhere else, but I would often find that I had an idea for how to solve my problem and quickly go back to the keyboard, frustrated that I hadn't thought of it earlier, and frequently rinse and repeat.

A Different Way

It took me many years to realise this, and I wish I had done so sooner, but once I made the connection that leaving the problem to one side for a little while often meant I found a way to a solution, I started setting a time limit.

30 minutes

If I have been stuck on a problem for 30 minutes, I (mostly, I still sometimes fail at this) stop, take a break, go for a walk or do something different, and the number of times that I then arrive, if not at a solution, then at a path to a solution, is remarkable.

Ask

The other thing to do at this point in a troublesome problem is to ask. Twitter, Google, Slack, Stack Overflow. These are all excellent resources where you can quickly find people who are willing and capable of helping.

Don’t be like me and take years to work this out 🙂

A PowerShell Conference In A Book

A Question

Shortly after the European PowerShell Conference and the PowerShell and DevOps 2018 Summit in the USA, Mike Robbins b | t contacted me with a question.

Interested in writing a chapter in a PowerShell book?
I was intrigued and read on.

A Conference in a Book

There was more to this book than just writing about PowerShell though. Mike was suggesting that a group of wonderful PowerShell experts (Here’s a Twitter list) got together and created a conference in a book.

The book is designed as a conference in a book where each chapter is written independently with content similar to what you would present in a 45 minute presentation.

That’s a neat idea; people who couldn’t come to one of the conferences would be able to get an experience a little bit like attending a conference, but in book form.

OK, there would be no networking, evening entertainment or instant responses to questions, but a bundle of useful information that you can take with you and read anywhere.

It’s All For Charity

(I hope older UK viewers read that in this voice 🙂 )

The bit that clinched it for me was this though

We’re donating all of the royalties from the book to the DevOps Collective Scholarship program https://leanpub.com/causes/devopscollective.

All the money raised by buying this book will go to the DevOps Collective OnRamp Scholarship program.

This scholarship provides

  • a ticket to the PowerShell and DevOps Global Summit OnRamp track, specifically designed for entry-level professionals
  • five nights lodging
  • domestic airfare
  • buddy programme
and half of the slots are reserved for under-represented groups.
I really approve of this idea; without the help and support of the SQL and PowerShell technical communities I would not be where I am today, and this will help to bring other people in at an early stage in their career. I am proud that I can give a little back.

Fabulous Editors

So I said yes.

I then had to sit down and write some words. I wrote about how we created dbachecks, the challenges we faced and how we overcame them.

One of my biggest challenges was writing in the wrong English! The book is written in American English, and there are zeds where there should be esses and missing u’s in words! My spell checker was covered in red squiggles! The second challenge was getting the code to fit the column limit for the book. I show a lot of the AST code that we use to validate that dbachecks code will work correctly, and it doesn’t split to 80 characters very easily.

Luckily I had three wonderful, patient editors to help me with all of this. Mike Robbins, Michael T Lombardi and Jeff Hicks each helped me to make the chapter read more fluently, make sense and be spelled correctly!

Thank you very much you three for all the work you have put into this book.

Help Yourself and Others

If you want to attend a PowerShell conference in book form, want 30 chapters of fabulous PowerShell material and want to help grow and diversify our industry, then look no further: you can get the book here


You can also find all of the authors’ Twitter accounts and websites below. You should go and see what they are sharing there as well.

Author Website
Mike F Robbins https://mikefrobbins.com
Jeff Hicks https://jdhitsolutions.com
Michael Lombardi https://appoint.ly/t/michaeltlombardi
Adam Murry https://tikabu.com.au/blog/
Anthony Nocentino http://www.centinosystems.com
Brandon Olin https://devblackops.io
Brian Bunke https://www.brianbunke.com
Don Jones https://donjones.com
Doug Finke https://dfinke.github.io
Emin Atac https://p0w3rsh3ll.wordpress.com
Fred Weinmann https://allthingspowershell.blogspot.com
Graham Beer https://graham-beer.github.io
Irwin Strachan https://pshirwin.wordpress.com
James Petty https://scriptautomaterepeat.com
Jeremy Murrah https://murrahjm.github.io
Justin Sider https://invoke-automation.blog
Luc Dekens http://www.lucd.info
Mark Kraus https://get-powershellblog.blogspot.com
Mark Wragg https://wragg.io
Mike Kanakos https://www.networkadm.in
Mike Shepard https://powershellstation.com
Patrick Gruenauer https://sid-500.com
Prateek Singh https://ridicurious.com
Rob Pleau https://ephos.github.io
Thomas Lee https://tfl09.blogspot.com
Thomas Rayner https://workingsysadmin.com
Thom Schumacher https://powershellposse.com
Tim Curwick https://MadWithPowerShell.com
Tim Warner https://timwarnertech.com
Tommy Maynard https://tommymaynard.com
Tore Groneng https://asaconsultant.blogspot.com
Wesley Kirkland https://wesleyk.me