TSQL2sDay – Get-PostRoundup

First, an apology: this round-up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke at the PowerShell Saturday in Stuttgart last weekend and had intended to write this blog post whilst travelling. Unfortunately, I found a major error in Pester (again not strictly true but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, default settings, best practices, migrations, warnings in Agent Jobs, sqlpackage, VLFs, CMS, disabling Named Pipes, orphaned users, AG status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first-timers at the monthly TSQL2sDay blog party. Please don't let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the medium of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

https://twitter.com/VolkerBachmann/status/907664676150022144

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

https://twitter.com/Steve_TSQL/status/907713842943115264

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!


Adding a PowerShell Job Step to an existing SQL Agent Job with PowerShell

In my last post I showed how to add a T-SQL Job step to an existing SQL Agent Job. The process is exactly the same for a PowerShell job step.

As before, I gathered the required jobs using the Get-SQLAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the Where method.
I put all of the jobs that I required across the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file, and of course you can add more logic to filter those servers as required.)

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Of course, to add a PowerShell job step the target server needs to be SQL Server 2008 or higher. If you have an estate with older versions it is worth creating an SMO server object (you can use a snippet), checking the version and then getting the jobs like this

foreach($Server in $Servers)
{
 $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
 if($srv.VersionMajor -ge 10)
 {
    $Jobs = $srv.JobServer.Jobs

and you could choose to create a CmdExec job step for earlier versions in an else code block.
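Completing that skeleton, a hedged sketch of what the else block might look like (the CmdExec command and script path are purely illustrative):

foreach($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    if($srv.VersionMajor -ge 10)
    {
        $Jobs = $srv.JobServer.Jobs
        ## create the PowerShell job steps as shown below
    }
    else
    {
        ## No PowerShell subsystem before SQL Server 2008, so a CmdExec
        ## step calling powershell.exe is one option, for example
        ## $NewStep.SubSystem = 'CmdExec'
        ## $NewStep.Command = 'powershell.exe -File "C:\Scripts\MyScript.ps1"'
    }
}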

Once I have the Jobs I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property

job-step-properties

We need to set the name, the parent (the job), the command, the subsystem, the on-fail action, the on-success action and the ID for the job step.
I set the command to a variable to make the code easier to read

$Command = "Get-Process"

The rest of the properties I fill in inside the loop. To find out what the properties can hold I look at MSDN for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step, starting at 1, so this example will add a new job step that will be the first to run.

$Name = $Job.Name
$JobServer = $srv.JobServer
$Job = $JobServer.Jobs[$Name]
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'a descriptive name for my PowerShell script'
$NewStep.Parent = $Job
$NewStep.Command = $Command
$NewStep.SubSystem = 'PowerShell'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties all we need to do is create it and alter the job

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

$Command = "Get-Process"
foreach($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    if($srv.VersionMajor -ge 10)
    {
       $Jobs = $srv.JobServer.Jobs.Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
       foreach($Job in $Jobs)
       {
           $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
           $NewStep.Name = 'a descriptive name for my PowerShell script'
           $NewStep.Parent = $Job
           $NewStep.Command = $Command
           $NewStep.SubSystem = 'PowerShell'
           $NewStep.OnFailAction = 'QuitWithFailure'
           $NewStep.OnSuccessAction = 'GoToNextStep'
           $NewStep.ID = 1
           $NewStep.create()
           $Job.Alter()
       }
    }

}

Happy Automating

VS Code PowerShell Snippets

Just a quick post, as much as a reminder for me as anything, but also useful to those that attended my sessions last week where I talked about snippets in PowerShell ISE

Jeff Hicks wrote a post explaining how to create snippets in VS Code for PowerShell

I love using snippets so I went and converted my snippets list for ISE (available on GitHub) into the JSON required for VS Code (available on GitHub)

Here is an example of a snippet

[code]"SMO-Server": {
        "prefix": "SMO-Server",
        "body": [
            "$$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $$Server"
        ],
        "description": "Creates a SQL Server SMO Object"
    },

I followed this process in this order

Click File –> Preferences –> User Snippets and type PowerShell, or edit $env:APPDATA\Code\User\snippets\powershell.json directly

In order, I converted the code in the existing snippets' "Text" like this

        Replace `$ with $$
        Replace \ with \\
        Replace " with \"
        \r for new line
        \t for tab
        Each line wrapped in ""
        , at the end of each line in the body except the last one
        Look out for red or green squiggles 🙂
I then add:
The name of the snippet first, before the :, wrapped in ""
The prefix, which is what you type to get the snippet
The body, which is the code following the above find and replaces
The description, which is the description!!
and save and I have snippets in VS Code 🙂
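As a rough illustration of those steps, here is a one-line hypothetical ISE snippet body converted with PowerShell's Replace method (backslashes are replaced first so the escape characters just added are not doubled up):

$IseSnippetText = '$srv = New-Object Microsoft.SqlServer.Management.Smo.Server "ServerName"'
$VsCodeBody = $IseSnippetText.Replace('\', '\\').Replace('"', '\"').Replace('$', '$$')
$VsCodeBody
## $$srv = New-Object Microsoft.SqlServer.Management.Smo.Server \"ServerName\"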
snippets.gif
That should help you to convert existing ISE snippets into VS Code PowerShell snippets and save you time and keystrokes 🙂

Quickly Creating Test Users in SQL Server with PowerShell using the sqlserver module and dbatools

One of the most visited posts on my blog is nearly two and a half years old now – Add User to SQL Server Database Role with PowerShell and Quickly Creating Test Users. I thought it was time to update it and use the latest sqlserver module and the dbatools module.

You can get the latest version of the sqlserver module by installing SSMS 2016. The PASS PowerShell Virtual Chapter have created a short link to make this easier for you to remember: https://sqlps.io/dl

Once you have downloaded and installed SSMS you can load the module.

Import-Module sqlserver

There is one situation where you will get an error loading the sqlserver module into PowerShell. If you have the SQLPS module already imported then you will get the following error:

Import-Module : The following error occurred while loading the extended type data file:

sqlserver-module-error

In that case you will need to remove the SQLPS module first.

Remove-Module sqlps
Import-Module sqlserver

The original post dealt with creating a number of test users for a database and assigning them to different roles quickly and easily.

First let's quickly create a list of Admin users and a list of Service Users and save them in text files.

$i = 0
while($i -lt 100)
{
"Beard_Service_User$i" | Out-File 'C:\temp\Users.txt' -Append
$i++
}

$i = 0
while($i -lt 10)
{
"Beard_Service_Admin_$i" | Out-File 'C:\temp\Admins.txt' -Append
$i++
}

Now that we have those users in files we can assign them to a variable by using Get-Content

$Admins = Get-Content 'C:\temp\Admins.txt'

Of course we can use any source for our users – a database, an Excel file, Active Directory or even just type them in.

We can use the Add-SQLLogin command from the sqlserver module to add our users as SQL Logins, but at present we cannot add them as database users and assign them to a role.

If we want to add a Windows Group or a Windows User to our SQL Server we can do so using:

Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType WindowsUser -DefaultDatabase tempdb -Enable -GrantConnectSql 

Notice that we need to enable and grant connect SQL to the user.

If we want to add a SQL login the code is pretty much the same but we either have to enter the password in an authentication box or pass in a PSCredential object holding the username and password. Keeping credentials secure in PowerShell scripts is outside the scope of this post and the requirement is for non-live environments, so we will pass in the same password for all users as a string to the script. You may want or be required to achieve this in a different fashion.

 $Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential

We can ensure that we are not trying to add logins that already exist using

 if(!($srv.Logins.Contains($User)))
{

The $srv is a SQL Server SMO server object which you can create using a snippet. I blogged about snippets here and you can find my list of snippets on GitHub here. However, today I am going to use the dbatools module to create an SMO server object using the Connect-DbaSqlServer command and assign the server and the database to variables:

 # Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

Once we have our Logins we need to create our database users:

 $usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

and add them to a database role.

#Add User to the Role
$db.roles[$role].AddMember($User)

I created a little function to call in the script and then simply loop through our users and admins and call the function.

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole  -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole  -LoginType SQLLogin
}

To check that they have been added correctly I simply use the Get-DbaRoleMember command from dbatools and output it to Out-GridView using the alias ogv as I am on the command line:

Get-DbaRoleMember -SqlInstance $server |ogv

which looks like this:


get-dbarole-memebr

When we need to clean up the logins and users we can use the Get-SQLLogin and Remove-SQLLogin commands from the sqlserver module to remove the logins, and if we do that first we can then use the dbatools command Remove-SQLOrphanUser to remove the orphaned users 🙂 (I thought that was rather cunning!)

(Get-SqlLogin -ServerInstance $server).Where{$_.Name -like '*Beard_Service_*'}|Remove-SqlLogin

Remove-SQLOrphanUser -SqlServer $Server -databases $database

The Remove-SQLLogin will prompt for confirmation and the result of the Remove-SQLOrphanUser looks like this

remove-them-all

When you are looking at doing this type of automation with PowerShell, you should remember always to make use of Get-Command, Get-Help and Get-Member. They will enable you to work out how to do an awful lot. I have a short video on YouTube about this.
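For instance, a minimal sketch of that discovery workflow, using the modules and the example database from this post:

## Find login-related commands in the sqlserver and dbatools modules
Get-Command -Module sqlserver, dbatools -Name *Login*

## Read the help and the examples for a command
Get-Help Add-SqlLogin -Full

## Explore the properties and methods of an SMO database object
$srv.Databases['TheBeardsDatabase'] | Get-Member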


And when you get stuck, come and ask in the SQL Server Slack at https://sqlps.io/slack. You will find a powershellhelp channel in there.

Here is the complete code:

#Requires -module sqlserver
#Requires -module dbatools

### Define some variables
$server = ''
$Password = "Password"
$Database = 'TheBeardsDatabase'
$Admins = Get-Content 'C:\temp\Admins.txt'
$Users = Get-Content 'C:\temp\Users.txt'
$LoginType = 'SQLLogin'
$userrole =  'Users'
$adminrole = 'Admin'

# Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

function Add-UserToRole
{
param
(
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Password,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$User,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Server,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Role,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateSet("SQLLogin", "WindowsGroup", "WindowsUser")]
[string]$LoginType
)

if(!($srv.Logins.Contains($User)))
{
if($LoginType -eq 'SQLLogin')
{
$Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential
}
elseif($LoginType -eq 'WindowsGroup' -or $LoginType -eq 'WindowsUser')
{
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql
}
}
if (!($db.Users.Contains($User)))
{

# Add user to database

$usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

}
#Add User to the Role
$db.roles[$role].AddMember($User)
}

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole -LoginType SQLLogin
}

Get-DbaRoleMember -SqlInstance $server |ogv

Happy Automating!


SQL VNext sp_configure on Windows and Linux with dbatools

This weekend I set up some SQL vNext virtual machines, two on Windows and one on Linux so that I could test some scenarios and build an availability group.

IMPORTANT NOTE :- The names of dbatools commands with a Sql prefix WILL CHANGE in a later release of dbatools. dbatools will use Dba throughout in the future, as the sqlserver PowerShell module uses the Sql prefix.

I used PowerShell version 5.1.14393.693 and SQL Server vNext CTP 1.3 running on Windows Server 2016 and Ubuntu 16.04 in this blog post

I set up one Windows box with some changes to the default configuration; these are for my lab only.

Min Server Memory to 2Gb
Max Server Memory to 6Gb
Remote Admin Connections to 1
Backup Compression to 1
Ad Hoc Distributed Queries to 1

I can do this in PowerShell by creating a SQL SMO Server object using Connect-DbaSqlServer from dbatools and then altering the Configuration Properties values and calling the Alter() method

$WinSQl1 = 'SQLvNextN1'
$win1 = Connect-DbaSqlServer -SqlServer $WinSQl1
$win1.Configuration.Properties['DefaultBackupCompression'].ConfigValue = 1
$win1.Configuration.Properties['MinServerMemory'].ConfigValue = 2048
$win1.Configuration.Properties['MaxServerMemory'].ConfigValue = 6144
$win1.Configuration.Properties['RemoteAccess'].ConfigValue = 1
$win1.Configuration.Properties['OptimizeAdhocWorkloads'].ConfigValue = 1
$win1.Configuration.Alter()

I can see the change by using Get-DbaSpConfigure from the popular PowerShell module dbatools. I like to output to Out-GridView (alias ogv at the CLI)

$WinSQl1 = 'SQLvNextN1'
Get-DbaSpConfigure -SqlServer $WinSQl1 | ogv

1 SPConfig ogv.PNG

Out-GridView can filter in the top bar and it works very quickly. It is a very useful tool

2 - ogv filter.gif

You can also rearrange the columns and use the Add Criteria button to filter your results

3-spconfig-ogv

You can use Export-SQLSpConfigure to export the configuration to a .sql file which is useful for DR or documentation purposes. There is also a corresponding Import-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $winConfigPath
notepad $winConfigPath

4 - Export-SQLSpConfigure.gif

Chrissy LeMaire b | t showed a neat trick when we were in Utrecht to compare the configuration of two servers

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$propcompare = foreach ($prop in $Win1SPConfigure) {
[pscustomobject]@{
Config = $prop.DisplayName
'Windows Node1 setting' = $prop.RunningValue
'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
}
}
$propcompare | ogv

So when we compare the two Windows SQL nodes we can see that at this point there are differences in the configuration

5-compare-configures


Now we can copy the configuration from SQLvNextN1 to SQLvNextN2 using the Copy-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $WinSQl2

6 - Copy Configurations.gif

and we can see that those configurations are now the same for those two servers by comparing them like before.

7 - compare configurations.PNG

We can use Get-DbaSpConfigure with Linux servers as well but we need to use SQL authentication like so (No, I don’t know why I used two v’s!)

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$linuxSpConfigure | ogv

and compare the three servers like so

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$propcompare = foreach ($prop in $Win1SPConfigure) {
[pscustomobject]@{
Config = $prop.DisplayName
'Windows Node1 setting' = $prop.RunningValue
'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
'Linux Setting' = $linuxSpConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
}
}
$propcompare | ogv

and see the differences for the Linux Server

8 - Compare Linux Configs.PNG

We can export the Linux configuration using Export-SqlSpConfigure

9 - Export Linux Config.PNG

So now let's copy the configuration changes from the Windows server to the Linux server

10 - Copy Configuration to Linux.gif

You will notice that there was some red text there. This is because there are unsupported features on Linux, as you can see in the SQL Server on Linux Release Notes. The errors from the command are

Database Mail XPs to 0. Feature may not be supported.
SMO and DMO XPs to 1. Feature may not be supported.
Ole Automation Procedures to 0. Feature may not be supported.
xp_cmdshell to 0. Feature may not be supported.
Ad Hoc Distributed Queries to 0. Feature may not be supported.
Replication XPs to 0. Feature may not be supported.
contained database authentication to 0. Feature may not be supported.
hadoop connectivity to 0. Feature may not be supported.
polybase network encryption to 1. Feature may not be supported.
remote data archive to 0. Feature may not be supported.
allow polybase export to 0. Feature may not be supported.

But if we compare the configurations again, we can see that the settings we wanted have been altered successfully

11 - Compare Linux Configs again.PNG

Let's reset the configuration on the Linux server using Import-SqlSpConfigure and the Linuxconfig-backup.sql file created before any changes were made
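The gif below shows this being run. As a sketch only, and assuming Import-SqlSpConfigure takes the same -SqlServer, -SqlCredential and -Path parameters as the Export command used earlier (the file path here is hypothetical), the call would look like:

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
## Assumption - the parameters mirror Export-SqlSpConfigure above
Import-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path 'C:\Temp\Linuxconfig-backup.sql'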

12 - Import backup Linux configuration.gif

Interestingly this time we only had an error for Database Mail XPs, but as you can see below the Linux Configuration has been reset back to the original values.

13-configuration-comapre-back-to-original

Let's export the configuration from the Windows server to a file and import it onto the Linux server to see what happens

14 - Import Windows configuration to Linux.gif

Again only an error for Database Mail XPs, and we can see the configurations are again matching.

15 - Final Compare.PNG

Hopefully, this post has been of use in exploring the *SpConfigure* commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export and import it to and from files, as well as copy it between servers, for both Windows and Linux. Remember, these commands have been tested from SQL 2000 to SQL vNext so they should work for you across your estate.


Happy Automating!


Converting SQL Agent Job Duration to TimeSpan using PowerShell

When you look in msdb for the SQL Agent Job duration you will find that it is an int.

sysjobshistoiry

This is also the same when you look at Get-SQLAgentJobHistory from the sqlserver module. (You can get this by downloading the latest SSMS release from here)

agentjobhistoryproperties

This means that when you look at the various duration of the Agent Jobs you get something like this

duration.PNG

The first job took 15 hours 41 minutes 53 seconds, the second 1 minute 25 seconds and the third 21 seconds. This makes it quite tricky to calculate the duration in a suitable datatype. In T-SQL people use scripts like the following from MSSQLTips.com

((run_duration/10000*3600 + (run_duration/100)%100*60 + run_duration%100 + 31 ) / 60)  as 'RunDurationMinutes'

I needed more information than the number of minutes so I have this which will convert the Run Duration to a timespan

$FormattedDuration = @{Name = 'FormattedDuration' ; Expression = {[timespan]$_.RunDuration.ToString().PadLeft(6,'0').insert(4,':').insert(2,':')}}

formatted.PNG

So how did I get to there?

First I tried to just convert it. In PowerShell you can define a datatype in square brackets and PowerShell will try to convert it

timespan

It did its best but it converted it to ticks! So we need to convince PowerShell that this is a proper timespan. First we need to convert the run duration to a standard length. You can use the PadLeft method of a string to do this, which will pad the start of the string with a character of your choice until it reaches the length you specify.

Let's have a length of 6 and preceding zeros: PadLeft(6,'0')

padlefterror

But this works only if it is a string!! Remember red text is useful; it will often contain the information you need to resolve your error. Luckily there is a method to turn an int into a string. I am using the foreach method to demonstrate

padleft-with-string

Now every string is 6 characters long starting with zeros. So all that is left is to format this with colons to separate the hours and minutes and the minutes and seconds. We can do this with the insert method. You can find out the methods using Get-Member or its alias gm

methods.PNG

So the insert method takes an int for the startindex and a string value to enter

insert

There we go, now we have some properly formatted timespans; however, they are still strings. We can then convert them using [timespan]. Now we can format the results within the select by using an expression as shown below
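Putting those steps together at the console, with a hypothetical run duration of 15 hours, 41 minutes and 53 seconds stored as the int 154153:

$RunDuration = 154153
$RunDuration.ToString().PadLeft(6,'0')                             ## '154153'
$RunDuration.ToString().PadLeft(6,'0').Insert(4,':').Insert(2,':') ## '15:41:53'
[timespan]$RunDuration.ToString().PadLeft(6,'0').Insert(4,':').Insert(2,':') ## a real timespan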

select

and as you can see it is a timespan now

timespan property.PNG

On a slight side note, I needed the durations for Agent Jobs with a certain name within the last 6 days.

getting-agent-jobs

I did this by passing an array of servers (which I got from my dbareports database) to Get-SQLAgentJobHistory. I then used the Where method to filter for JobName and the Job Outcome step of the history. I compared the RunDate property to Get-Date (today) adding -6 days using the AddDays method 🙂 A sketch of that is below.
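A sketch of that, with hypothetical server names and job name pattern, and assuming the Job Outcome entries are the ones whose StepID is 0:

$Servers = 'Server1','Server2'
$FormattedDuration = @{Name = 'FormattedDuration' ; Expression = {[timespan]$_.RunDuration.ToString().PadLeft(6,'0').Insert(4,':').Insert(2,':')}}
(Get-SqlAgentJobHistory -ServerInstance $Servers).Where{
$_.JobName -like '*PartOfJobName*' -and
$_.StepID -eq 0 -and
$_.RunDate -gt (Get-Date).AddDays(-6)
} | Select-Object JobName, RunDate, $FormattedDuration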

Hopefully this will be of use to people and also I have it recorded for the next time I need to do it 🙂


DBA Database scripts are on Github

It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, this fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList in my DBA database I am able to target instances, by SQL Version, OS Version, Environment, Data Centre, System, Client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts allowing you to target the development environment and test before they get put onto live.

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you, Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here

github

I have added the DBA Database project, the PowerShell scripts, the Agent Job creation scripts to call those scripts and everything else I use, plus some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome


Using PowerShell to set Extended Events Sessions to AutoStart

When you look after more than a few SQL Servers you will need to perform the same actions against a number of them, and that is where PowerShell will be of great benefit. Recently I needed to ensure that all SQL Servers had a certain Extended Events session set to auto-start and that it was running. I have used the Always On health session in the example below but you could use the same code and do this for any Extended Events session. Just note that the code below checks for the existence of an Availability Group, which may not be what you require.

As always, when I started to look at PowerShell for a solution I turned to MSDN and found this page, and a quick search also found Mike Fal's blog post which showed me how to get going.

I used my DBA Database as described in my previous posts and created a query to check for all of the servers that were active and contactable

SELECT
IL.ServerName
FROM [dbo].[InstanceList] IL
WHERE NotContactable = 0
AND Inactive = 0

and used Invoke-SQLCMD to gather the Server Names


$Results = (Invoke-Sqlcmd -ServerInstance $DBADatabaseServer -Database DBADatabase -Query $query -ErrorAction Stop).ServerName

Then it was a case of looping through the servers, connecting to the XEvent store and checking if the required extended event was set to auto-start and running, and if not, altering those settings

## Can we connect to the XEStore?
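## NOTE - the SQLSERVER: drive (and its XEvent folder) comes from the sqlps module, so that needs to be loaded for Test-Path to find it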
if(Test-Path SQLSERVER:\XEvent\$Server)
{
$XEStore = get-childitem -path SQLSERVER:\XEvent\$Server -ErrorAction SilentlyContinue  | where {$_.DisplayName -ieq 'default'}
$AutoStart = $XEStore.Sessions[$XEName].AutoStart
$Running = $XEStore.Sessions[$XEName].IsRunning
Write-Output "$server for $AGNames --- $XEName -- $AutoStart -- $Running"
if($AutoStart -eq $false)

{
$XEStore.Sessions[$XEName].AutoStart = $true
$XEStore.Sessions[$XEName].Alter()
}

if($Running -eq $false)
{
$XEStore.Sessions[$XEName].Start()
}
}

Very quick and simple and hopefully of use to people; this could easily be turned into a function. The full script is below and also available here on the PowerShell Gallery or by running Save-Script -Name Set-ExtendedEventsSessionstoAutoStart -Path <path>

<#
.Synopsis
   Connects to the servers in the DBA Database and for Servers above 2012 sets alwayson_health Extended Events Sessions to Auto-Start and starts it if it is not running
.DESCRIPTION
   Sets Extended Events Sessions to Auto-Start and starts it if it is not running
.EXAMPLE
   Alter the XEvent name and DBADatabase name or add own server list and run
.NOTES
   AUTHOR - Rob Sewell
   BLOG - http://sqldbawithabeard.com
   DATE - 20/03/2016
#>
$DBADatabaseServer = '' ## Add the address of the instance that holds the DBADatabase
$XEName = 'AlwaysOn_health'
## Query to gather the servers required
$Query = @"

SELECT 

IL.ServerName

FROM [dbo].[InstanceList] IL

WHERE NotContactable = 0

AND Inactive = 0

"@

Try 
{
$Results = (Invoke-Sqlcmd -ServerInstance $DBADatabaseServer -Database DBADatabase -Query $query -ErrorAction Stop).ServerName
}

catch 
{
Write-Error "Unable to Connect to the DBADatabase - Please Check"
}

foreach($Server in $Results)

    {
        try
            {
            $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Server
            }
        catch
            {
            Write-Output " Failed to connect to $Server"
            continue
            }
            # To ensure we have a connection to the server
            if (!( $srv.version)){
            Write-Output " Failed to Connect to $Server"
            continue
            }
        if($srv.versionmajor -ge '11')
            {
            ## NOTE this checks if there are Availability Groups - you may need to change this
            if ($srv.AvailabilityGroups.Name)
                {
                $AGNames = $srv.AvailabilityGroups.Name   
                ## Can we connect to the XEStore?                             
                if(Test-Path SQLSERVER:\XEvent\$Server)
                    {
                    $XEStore = get-childitem -path SQLSERVER:\XEvent\$Server -ErrorAction SilentlyContinue  | where {$_.DisplayName -ieq 'default'} 
                    $AutoStart = $XEStore.Sessions[$XEName].AutoStart
                    $Running = $XEStore.Sessions[$XEName].IsRunning
                    Write-Output "$server for $AGNames --- $XEName -- $AutoStart -- $Running"
                    if($AutoStart -eq $false)
                    
                        {
                        $XEStore.Sessions[$XEName].AutoStart = $true
                        $XEStore.Sessions[$XEName].Alter()
                        }
                    
                      if($Running -eq $false)
                        {
                        $XEStore.Sessions[$XEName].Start()
                        } 
                    }
                else
                    {
                    Write-Output "Failed to connect to XEvent on $Server"
                    }
                }

            else
                {
                ## Write-Output "No AGs on $Server"
                }
            }
        else
            {
            ##  Write-Output "$server not 2012 or above"
            }
} 


Power Bi, PowerShell and SQL Agent Jobs

Continuing my series on using Power Bi with my DBA Database I am going to show in this post how I create the most useful daily report for DBAs – The SQL Agent Job report. You can get the scripts and reports here

Please note this project became dbareports.io

AG1

This gives a quick overview of the status of the Agent Jobs across the estate and also quickly identifies recent failed jobs enabling the DBA to understand their focus and prioritise their morning efforts.

I gather the information into two tables, AgentJobDetail

CREATE TABLE [Info].[AgentJobDetail](
[AgetnJobDetailID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[Category] [nvarchar](50) NOT NULL,
[JobName] [nvarchar](250) NOT NULL,
[Description] [nvarchar](750) NOT NULL,
[IsEnabled] [bit] NOT NULL,
[Status] [nvarchar](50) NOT NULL,
[LastRunTime] [datetime] NOT NULL,
[Outcome] [nvarchar](50) NOT NULL,
CONSTRAINT [PK_info.AgentJobDetail] PRIMARY KEY CLUSTERED
(
[AgetnJobDetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

and AgentJobServer

CREATE TABLE [Info].[AgentJobServer](
[AgentJobServerID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[NumberOfJobs] [int] NOT NULL,
[SuccessfulJobs] [int] NOT NULL,
[FailedJobs] [int] NOT NULL,
[DisabledJobs] [int] NOT NULL,
[UnknownJobs] [int] NOT NULL,
CONSTRAINT [PK_Info.AgentJobServer] PRIMARY KEY CLUSTERED
(
[AgentJobServerID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The Detail table holds the results of every Agent Job and the Server table holds a roll up for each server. The script to gather this information is based on the script I used to put the information into an Excel Sheet as described in my post How I Check Hundreds of Agent Jobs in 60 Seconds with PowerShell which I also altered to send an HTML email to the DBA team each morning. This however is a much better solution and allows for better monitoring and trending.

As I have explained in my previous posts, I use an InstanceList table to hold the information about each instance in the estate and a series of PowerShell scripts which run via Agent Jobs to gather the information into various tables. Those posts describe the use of the Write-Log function and the methodology of gathering the required information and looping through each instance, so I won't repeat that here. There is one extra check I do, however, for Express Edition, as it does not include the Agent service

$edition = $srv.Edition
if ($Edition -eq 'Express') {
    Write-Log -Path $LogFile -Message "No Information gathered as this Connection $Connection is Express"
    continue
}

The Agent Job information can be found in SMO by exploring the $srv.JobServer.Jobs object and I gather the information by iterating through each job and setting the values we require to variables

try {
    $JobCount = $srv.JobServer.jobs.Count
    $successCount = 0
    $failedCount = 0
    $UnknownCount = 0
    $JobsDisabled = 0
    #For each job on the server
    foreach ($job in $srv.JobServer.Jobs)
    {
        $jobName = $job.Name;
        $jobEnabled = $job.IsEnabled;
        $jobLastRunOutcome = $job.LastRunOutcome;
        $Category = $Job.Category;
        $RunStatus = $Job.CurrentRunStatus;
        $Time = $job.LastRunDate;
        if ($Time -eq '01/01/0001 00:00:00')
        {$Time = ''}
        $Description = $Job.Description;
        #Counts for jobs Outcome
        if ($jobEnabled -eq $False)
        {$JobsDisabled += 1}
        elseif ($jobLastRunOutcome -eq "Failed")
        {$failedCount += 1; }
        elseif ($jobLastRunOutcome -eq "Succeeded")
        {$successCount += 1; }
        elseif ($jobLastRunOutcome -eq "Unknown")
        {$UnknownCount += 1; }
    }    
}

I found that some jobs had names and descriptions with ' in them, which would cause the SQL update or insert statement to fail, so I use the replace method to replace each ' with ''

if ($Description -eq $null) {
    $Description = ' '
}
$Description = $Description.replace('''', '''''')
if ($jobName -eq $Null) {
    $jobName = 'None'
}
$JobName = $JobName.replace('''', '''''')

I then insert the data per job after checking that it does not already exist, which allows me to re-run the job without any additional work should a number of servers be uncontactable at the time the job runs

IF NOT EXISTS (
SELECT [AgetnJobDetailID]
FROM [DBADatabase].[Info].[AgentJobDetail]
where jobname = '$jobName'
and InstanceID = (SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
and lastruntime = '$Time'
)
INSERT INTO [Info].[AgentJobDetail]
([Date]
,[InstanceID]
,[Category]
,[JobName]
,[Description]
,[IsEnabled]
,[Status]
,[LastRunTime]
,[Outcome])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$Category'
,'$jobName'
,'$Description'
,'$jobEnabled'
,'$RunStatus'
,'$Time'
,'$jobLastRunOutcome')

I put this in a here-string variable and pass it to Invoke-SQLCmd. I do the same with the roll-up using this query

INSERT INTO [Info].[AgentJobServer]
([Date]
,[InstanceID]
,[NumberOfJobs]
,[SuccessfulJobs]
,[FailedJobs]
,[DisabledJobs]
,[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$JobCount'
,'$successCount'
,'$failedCount'
,'$JobsDisabled'
,'$UnknownCount')
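As a minimal sketch of that pattern (assuming $CentralDBAServer holds the instance with the DBADatabase on it, and that the other variables have been set in the gathering loop):

$Query = @"
INSERT INTO [Info].[AgentJobServer]
([Date],[InstanceID],[NumberOfJobs],[SuccessfulJobs],[FailedJobs],[DisabledJobs],[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID] FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName' AND [InstanceName] = '$InstanceName' AND [Port] = '$Port')
,'$JobCount','$successCount','$failedCount','$JobsDisabled','$UnknownCount')
"@
## The double-quoted here-string expands the PowerShell variables before the query runs
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database DBADatabase -Query $Query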

This job runs as a SQL Agent Job every morning, half an hour or so before the DBA arrives for the morning shift, vastly improving the ability of the DBA to prioritise their morning routine.

To create the report open Power Bi Desktop and click Get Data

ag2

Then choose SQL Server and click connect

ag3

Enter the Connection string, the database and the query to gather the data

ag5

The query is

Select IL.InstanceID,
IL.ServerName,
IL.InstanceName,
IL.Environment,
IL.Location,
AJD.Category,
AJD.Date,
AJD.Description,
AJD.IsEnabled,
AJD.JobName,
AJD.LastRunTime,
AJD.Outcome,
AJD.Status
FROM [dbo].[InstanceList] IL
JOIN [Info].[AgentJobDetail] AJD
ON IL.InstanceID = AJD.InstanceID
WHERE LastRunTime > DATEADD(Day,-31,GETDATE())

Once we have gathered the data we then create some extra columns and measures for the reports. First I create a date column from the datetime Date Column

DayDate = DATE(YEAR('Agent Job Detail'[Date]),MONTH('Agent Job Detail'[Date]),DAY('Agent Job Detail'[Date]))

I also do the same for the LastRunTime. I create a day of the week column so that I can report on job outcomes by day

DayOfWeek = CONCATENATE(WEEKDAY('Agent Job Detail'[Date],2),FORMAT('Agent Job Detail'[Date]," -dddd"))

My friend Terry McCann b | t helped me create a column that returns true if the last run time is within 24 hours of the current time, to help identify the recent jobs that have failed. NOTE – On a Monday morning you will need to change this if you do not check your jobs at the weekend.

Last Run Relative Hour = ((1.0*(NOW()-'Agent Job Detail'[LastRunTime]))*24)<24

I create a measure for Succeeded, Failed and Unknown

Succeeded = IF('Agent Job Detail'[Outcome] = "Succeeded"
, 1
, 0)

Next we have to create some measures for the sum of failed jobs and the averages. This is the code for the 7 day sum

Failed7Days = CALCULATE(SUM('Agent Job Detail'[Failed]),FILTER (
ALL ( 'Agent Job Detail'[Last Run Date] ),
'Agent Job Detail'[Last Run Date] > ( MAX ( 'Agent Job Detail'[Last Run Date] ) - 7 )
&& 'Agent Job Detail'[Last Run Date] <= MAX ( 'Agent Job Detail'[Last Run Date] ) ) )

and for the 7 Day average

Failed7DayAverage = DIVIDE([Failed7Days],7)

I did the same for 30 days. I used the TechNet reference for DAX expressions and got ideas from Chris Webb's blog

ag6

First I created the 30 day historical trend chart using a Line and Clustered column chart, with the last run date as the axis, the Succeeded measure as the column and the Failed, Failed 7 Day Average and Failed 30 Day Average measures as the lines

I then formatted the lines and title and column

ag7

To create the gauge, which shows how well we have done today, I created a measure to quickly identify today's jobs

LastRun Relative Date Offset = INT('Agent Job Detail'[LastRunTime] - TODAY())

which I use as a filter for the gauge as shown below. I also create two measures, zero and twenty, for the minimum and maximum of the gauge

ag8

The rest of the report is measures for the 7 day average and 30 day average, a slicer for environment and two tables, one to show the historical job counts and one to show the jobs that have failed in the last 24 hours using the Last Run Relative Hour measure from above

ag9

There are many other reports that you can or may want to create maybe by day of the week or by category depending on your needs. Once you have the data gathered you are free to play with the data as you see fit. Please add any further examples of reports you can run or would like to run in the comments below.

Once you have your report written you can publish it to PowerBi.com and create a dashboard and query it with natural language. I have explained the process in previous posts

For example – How many Jobs failed today

ag110

Which server had the most failed jobs

ag11

or using the category field which database maintenance jobs failed today

ag13

I hope these posts have given you ideas about how you can use PowerShell, a DBA Database and Power Bi to help you to manage and report on your environment.

You can get the scripts and reports here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Populating My DBA Database for Power Bi with PowerShell – Databases

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

db1

db2

You will find so many items of data that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report, publish it to powerbi.com and query it with natural language

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible. I won’t include the try catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, as the code above creates an empty SMO Server object with the name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the version property. There is no justifiable reason for choosing that property, you could choose any one, but that's the one I have always used. I use an if statement to do this. (This post about Snippets will show you the best way to learn PowerShell code.) The reference I use for exiting a loop in the way that you want is this one. In this case we use a continue to carry on iterating the loop

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases just remove everything after the pipe symbol, or if you wish to exclude certain databases from the gathering (maybe the database you keep your change log table and DBA team info in) you can do that here as well

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a $db and then to select all of the properties so that you can see the values and identify the ones you want. Again, this is a good thing to do when exploring new objects

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screenshot below that there are 170 properties available to you on a SQL 2014 instance. You can gather any or all of that information as long as you ensure that you have columns with the correct data types in your table and that your script has the logic to deal with properties that do not exist, although I have had fewer issues with this for the database object than for the server object

db3

You can look for the property that you want by using the Get-Member cmdlet as shown above, or use MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post, checking the table for an entry for that database, updating the row if it exists and inserting one if it does not, along the lines of the sketch below.
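A minimal sketch of that update-or-insert pattern; the $InstanceID and $SizeMB variables are illustrative stand-ins for values gathered in the loop, and only a couple of columns are shown:

$Query = @"
SELECT [DatabaseID] FROM [DBADatabase].[Info].[Databases]
WHERE [Name] = '$Name' AND [InstanceID] = '$InstanceID'
"@
$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
if ($Exists)
{
## Update the row we already have for this database
$Query = "UPDATE [Info].[Databases] SET [DateChecked] = GetDate(), [SizeMB] = '$SizeMB' WHERE [DatabaseID] = '$($Exists.DatabaseID)'"
}
else
{
## Insert a row for a database we have not seen before
$Query = "INSERT INTO [Info].[Databases] ([InstanceID],[Name],[DateAdded],[DateChecked],[SizeMB]) VALUES ('$InstanceID','$Name',GetDate(),GetDate(),'$SizeMB')"
}
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query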

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table as this (in general) needs a human bean to understand. When the script runs every night it will pick up new databases and I add a default value of “Not Entered” to the table which makes it easier to identify the databases that need this additional work. (This also means that as a Team Leader I can monitor that my team are doing this) It can also be added to any scripts which create new databases for deployment.

Then we need to create some measures and calculated columns for our report. I did this as I realised that I needed it when making the report rather than all up front.

I created two calculated columns for the size of the databases, one for Gb and one for Tb, by clicking on the data icon on the left and then new measure

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I created a donut chart to show the size of the databases in Gb by client (and no, my real clients are not rock bands 🙂 ) as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

db4

The other donut chart is the number of clients per location (and those are SQL user group locations in the UK and my hometown Bolton)

db5

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

db6

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan I created a colourful slicer for environment and the client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases then this report is correct but I doubt that’s what you want 🙂

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

db7

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69Gb of data in Exeter and it isn’t mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows the data is filtered but it is not shown visually very well as there are over a thousand databases using full recovery model.

db8

If we are asked about the Integration environment we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 Gb of data. It is also easier to see the compatibility level and recovery model as the ratios are larger

db9

Once we have created the report in the way that we want, we can then publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the publish button and entering your powerbi credentials, but if you want your data to automatically refresh (and this is the point of the exercise, to remove manual work) then you will need to install and configure the PowerBi gateway and schedule a refresh. I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation and a pop up will ask you which dashboard you wish to pin it to (Another recent update to Power Bi)

db10

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect but it is getting better all the time

db11

You have to remember to use the names of the columns correctly

db12

But once you have the query correct you can alter it by adding “as a VISUALISATION” and choose the visualisation

db13

db14

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

db15

This doesn’t work quite as you expect

db16

But this does

db17

How about this (and yes it felt wrong to type!)

db18

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

db19

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

db20

Or the DBA team manager might choose this one

db21

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee 🙂

As always if you have any questions or comments please feel free to post them on the blog.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs