PowerShell, Pester and Ola Hallengren's Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren's Maintenance Solution. If you haven't, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great how to get started here, which will show you how to get Pester and how to get started with TDD.
This isn't TDD though, this is Environment Validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution and made a list of the things that I would check, which looked something like this. This is the checklist you would create (or have already created) for yourself or a junior following this install, and this is how easily you can turn that checklist into a Pester test, removing the human element and opening your install up to automated testing.
  • SQL Server Agent is running – Otherwise the jobs won’t run 🙂
  • We should have 4 backup jobs named
      ○ DatabaseBackup - SYSTEM_DATABASES - FULL
      ○ DatabaseBackup - USER_DATABASES - FULL
      ○ DatabaseBackup - USER_DATABASES - DIFF
      ○ DatabaseBackup - USER_DATABASES - LOG
  • We should have Integrity Check and Index Optimisation Jobs
  • We should have the clean up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I were being more careful I would also have checked that the correct folders existed in the location I was backing up to.

Ola's script uses a default naming convention, which makes this easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder (or, if there is an Availability Group, a CLUSTERNAME$AGNAME folder) and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist.

So now we have our checklist, we just need to turn it into a Pester Environment Validation script.

It would be useful to be able to pass in a number of instances, so we will start with a foreach loop and then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name.

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
    BeforeAll {
        $Jobs = Get-SqlAgentJob -ServerInstance $Server
        $srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
        $dbs = $Srv.Databases.Where{$_.status -eq 'Normal'}.name
        if($InstanceName)
        {
            $DisplayName = "SQL Server Agent ($InstanceName)"
            $Folder = $ServerName + '$' + $InstanceName
        }
        else
        {
            $DisplayName = 'SQL Server Agent (MSSQLSERVER)'
            $Folder = $ServerName
        }
    }
    if($CheckForBackups -eq $true)
    {
        $CheckForDBFolders = $true
    }
    $Root = $Share + '\' + $Folder
I also set the Agent service display name so I can get its status. I split the jobs up using Context blocks, one each for Backups, Database maintenance and Solution clean up, but they all follow the same pattern. First, get the jobs:

$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let's test the Agent service. You do this with an It block, and in it put a single test like this:

actual-value | Should Be expected-value
So to check that the Agent service is running we can do this

(Get-service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test I check using Get-Member. To see what is available for a job, I gathered the Agent Jobs into a variable using the Get-SqlAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and the values using Select-Object.

$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this

$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach($job in $Jobs)
{
    $JobName = $Job.Name
    It "$JobName Job Exists" {
        $Job | Should Not BeNullOrEmpty
    }
    It "$JobName Job is enabled" {
        $job.IsEnabled | Should Be 'True'
    }
    It "$JobName Job has schedule" {
        $Job.HasSchedule | Should Be 'True'
    }
    if($DontCheckJobOutcome -eq $false)
    {
        It "$JobName Job succeeded" {
            $Job.LastRunOutCome | Should Be 'Succeeded'
        }
    }
}
So I have checked the agent and the jobs, and now I want to check that the folders exist. First for the instance, using Test-Path, so the user running the PowerShell session must have privileges and access to list the files and folders.

Context "$Share Share For $Server" {
    It "Should have the root folder $Root" {
        Test-Path $Root | Should Be $true
    }
Then for every database we need to set some variables for the folder path. We don't back up tempdb so we ignore that, and then check if the server is SQL 2012 or above and, if it is, check if the database is a member of an Availability Group and set the folder name appropriately.

  foreach($db in $dbs.Where{$_ -ne 'tempdb'})
{

if($Srv.VersionMajor -ge 11)
{
If($srv.Databases[$db].AvailabilityGroupName)
{
$AG = $srv.Databases[$db].AvailabilityGroupName
$Cluster = $srv.ClusterName
$OLAAg = $Cluster + '$' + $AG
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $OlaAG
}
else
{
$Root = $Share + '\' + $OLAAg
}
}
else
{
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
}
else
{
$Root = $Share + '\' + $Folder
}
}
}
$db = $db.Replace(' ','')
$Dbfolder = $Root + "\$db"
$Full = $Dbfolder + '\FULL'
$Diff = $Dbfolder + '\DIFF'
$Log  = $Dbfolder + '\LOG'
If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
} 
But we need some logic for checking for folders because Ola is smart and checks for Log Shipping databases so as not to break the LSN chain; system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares.

If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
}
if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
It 'Has a Log Folder' {
[System.IO.Directory]::Exists($Log) | Should Be $True
}
} #
elseif(($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and $Db -notin ('master','msdb','model') -or ( $LSDatabases -contains $db) )
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
} #
else
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
}#
} # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad's PowerShell Conference session Chasing the Seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with.

If($CheckForBackups -eq $true)
{
    Context "File Check For $db on $Server on $Share" {
        $Fullcreate = [System.IO.Directory]::GetCreationTime($Full)
        $FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
        if($Fullcreate -eq $FullWrite)
        {
            It "Has Files in the FULL folder for $db" {
                Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
            }
        }
        else
        {
            It "Has Files in the FULL folder for $db" {
                $FullCreate | Should BeLessThan $FullWrite
            }
        }
        It 'Full File Folder was written to within the last 7 days' {
            $Fullwrite | Should BeGreaterThan (Get-Date).AddDays(-7)
        }
        if($Db -notin ('master','msdb','model'))
        {
            $Diffcreate = [System.IO.Directory]::GetCreationTime($Diff)
            $DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
            if($Diffcreate -eq $DiffWrite)
            {
                It "Has Files in the DIFF folder for $db" {
                    Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the DIFF folder for $db" {
                    $DiffCreate | Should BeLessThan $DiffWrite
                }
            }
            It 'Diff File Folder was written to within the last 24 Hours' {
                $Diffwrite | Should BeGreaterThan (Get-Date).AddHours(-24)
            }
        }
        if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
        {
            $Logcreate = [System.IO.Directory]::GetCreationTime($Log)
            $LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
            if($Logcreate -eq $LogWrite)
            {
                It "Has Files in the LOG folder for $db" {
                    Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the LOG folder for $db" {
                    $LogCreate | Should BeLessThan $LogWrite
                }
            }
            It 'Log File Folder was written to within the last 30 minutes' {
                $Logwrite | Should BeGreaterThan (Get-Date).AddMinutes(-30)
            }
        } # Simple Recovery
    }
} # Check for backups
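
If you want to compare the two approaches on your own share, a rough sketch using Measure-Command (mentioned above) might look like this; $Full is assumed to be one of the FULL backup folders:

$gci = Measure-Command { Get-ChildItem $Full\*.bak }
$io  = Measure-Command {
    [System.IO.Directory]::GetCreationTime($Full)
    [System.IO.Directory]::GetLastWriteTime($Full)
}
"Get-ChildItem      : $($gci.TotalMilliseconds) ms"
"System.IO.Directory: $($io.TotalMilliseconds) ms"
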
You could just run the script you have created from your checklist as it is, and hopefully this blog post helps you see how easily you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this

[CmdletBinding()]
## Pester Test to check OLA
Param(
$Instance,
$CheckForBackups,
$CheckForDBFolders,
$JobSuffix ,
$Share ,
[switch]$NoDatabaseRestoreCheck,
[switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this

$Script = @{
    Path = $Path;
    Parameters = @{ Instance = $Instance;
        CheckForBackups = $true;
        CheckForDBFolders = $true;
        JobSuffix = 'BackupShare1';
        Share = '\\Server1\BackupShare1';
        NoDatabaseRestoreCheck = $true;
        DontCheckJobOutcome = $true}
}
Invoke-Pester -Script $Script
but that's a bit messy, hard to remember and won't encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it on GitHub as Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on Line 90 of Test-OlaInstance so that the script can find it.
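
The wrapper is essentially just building the hashtable above inside a function. A much-simplified, hypothetical sketch is shown below; the real Test-OlaInstance.ps1 on GitHub has full parameter handling, help and examples, and the hard-coded path is an assumption you would change to wherever Test-Ola.ps1 lives:

function Test-OLAInstance {
    param(
        [string[]]$Instance,
        [string]$Share,
        [switch]$CheckForBackups
    )
    $Script = @{
        Path       = 'C:\Scripts\Test-Ola.ps1' # assumption - point this at your copy of Test-Ola.ps1
        Parameters = @{
            Instance        = $Instance
            Share           = $Share
            CheckForBackups = $CheckForBackups.IsPresent
        }
    }
    Invoke-Pester -Script $Script
}
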
Once you have that you can call it for a single instance or a number of instances like so. Here I check for Folders and Backup files
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking if the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases)
pester ola check.PNG
This gives me a nice and simple automated method of checking if Ola's maintenance solution has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.
I hope you find it useful

Converting SQL Agent Job Duration to TimeSpan using PowerShell

When you look in msdb for the SQL Agent Job duration you will find that it is an int.

sysjobshistoiry

This is also the same when you look at Get-SQLAgentJobHistory from the sqlserver module. (You can get this by downloading the latest SSMS release from here)

agentjobhistoryproperties

This means that when you look at the various duration of the Agent Jobs you get something like this

duration.PNG

The first job took 15 hours 41 minutes 53 seconds, the second 1 minute 25 seconds, the third 21 seconds. This makes it quite tricky to calculate the duration in a suitable datatype. In T-SQL people use scripts like the following from MSSQLTips.com

((run_duration/10000*3600 + (run_duration/100)%100*60 + run_duration%100 + 31 ) / 60)  as 'RunDurationMinutes'

I needed more information than the number of minutes so I have this which will convert the Run Duration to a timespan

$FormattedDuration = @{Name = 'FormattedDuration' ; Expression = {[timespan]$_.RunDuration.ToString().PadLeft(6,'0').insert(4,':').insert(2,':')}}

formatted.PNG

So how did I get to there?

First I tried to just convert it. In PowerShell you can define a datatype in square brackets and PowerShell will try to convert it

timespan

It did its best, but it converted it to ticks! So we need to convince PowerShell that this is a proper timespan. First we need to convert the run duration to a standard length. You can use the PadLeft method of a string to do this, which ensures that a string has a set length by preceding the current string with a value you choose until it reaches that length.
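
Roughly what that first attempt looks like, assuming a RunDuration of 154153 (the 15:41:53 job above):

[timespan]154153
# PowerShell treats the int as ticks, so this returns 00:00:00.0154153 rather than 15:41:53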

Let's have a length of 6 and preceding zeros: PadLeft(6,'0')

padlefterror

But this works only if it is a string!! Remember red text is useful, it will often contain the information you need to resolve your error. Luckily there is a method to turn an int to a string. I am using the foreach method to demonstrate

padleft-with-string
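
Roughly what that looks like, assuming $Jobs holds the Get-SqlAgentJobHistory output and using the durations from the screenshots above:

$Jobs.foreach{ $_.RunDuration.ToString().PadLeft(6,'0') }
# 154153
# 000125
# 000021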

Now every string is 6 characters long, starting with zeros. So all that is left is to format this with colons to separate the hours from the minutes and the minutes from the seconds. We can do this with the insert method. You can find out the methods using Get-Member or its alias gm

methods.PNG

So the insert method takes an int for the startindex and a string value to enter

insert

There we go, now we have some properly formatted timespans; however, they are still strings. We can then convert them using [timespan]. Now we can format the results within the select by using an expression as shown below

select
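
In code, that select is a sketch along these lines, reusing the $FormattedDuration calculated property defined at the top of the post and assuming $Jobs holds the job history:

$Jobs | Select-Object JobName, RunDate, $FormattedDuration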

and as you can see it is a timespan now

timespan property.PNG

On a slight side note. I needed the durations for Agent Jobs with a certain name within the last 6 days.

getting-agent-jobs

I did this by passing an array of servers (which I got from my dbareports database) to Get-SqlAgentJobHistory. I then used the Where method to filter for the JobName and the job outcome step of the history, and compared the RunDate property to Get-Date (today) minus 6 days using the AddDays method 🙂
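
A sketch of that filtering is below; the job name is just a placeholder, the servers are assumed to come from the dbareports database, and StepID 0 is the job outcome row of the history:

$Servers = 'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2'   # would come from dbareports
$History = Get-SqlAgentJobHistory -ServerInstance $Servers
$History.where{ $_.JobName -eq 'DatabaseBackup - USER_DATABASES - FULL' -and
                $_.StepID -eq 0 -and
                $_.RunDate -gt (Get-Date).AddDays(-6) } |
    Select-Object RunDate, JobName, $FormattedDuration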

Hopefully this will be of use to people and also I have it recorded for the next time I need to do it 🙂

 

 

Remove-SQLDatabaseSafely My First Contribution to DBATools

What is DBA Tools?

A collection of modules for SQL Server DBAs. It initially started out as ‘sqlmigration’, but has now grown into a collection of various commands that help automate DBA tasks and encourage best practices.

You can read more about it here and it is freely available for download on GitHub. I thoroughly recommend that you watch this quick video to see just how easy it is to migrate an entire SQL instance in one command (longer session here).

Installing it is as easy as

Install-Module dbatools

which will get you over 80 commands. Visit https://dbatools.io/functions/ to find out more information about them

cmdlets

The journey to Remove-SQLDatabaseSafely started with William Durkin b | t who presented to the SQL South West User Group  (You can get his slides here)

Following that session I wrote a PowerShell script to gather information about the last used date for databases, which I blogged about here, and then a T-SQL script to take a final backup and create a SQL Agent Job to restore from that backup, which I blogged about here. The team have used this solution (updated to load the DBA Database and a report instead of using Excel) ever since, and it proved invaluable when a read-only database was dropped and could quickly and easily be restored with no fuss.

I was chatting with Chrissy LeMaire, who founded dbatools b | t, about this process, and when she asked for contributions in the SQL Server Community Slack I offered my help and she suggested I write this command. I have learnt so much. I thoroughly enjoyed it and highly recommend working on projects collaboratively to improve your skills. It is amazing to work with such incredible, professional PowerShell people.

I went back to basics and thought about what was required, and watched one of my favourite videos again: Grant Fritchey's Backup Rant.

I decided that the process should be as follows

  1. Performs a DBCC CHECKDB
  2. Database is backed up WITH CHECKSUM
  3. Database is restored with VERIFY ONLY on the source
  4. An Agent Job is created to easily restore from that backup
  5. The database is dropped
  6. The Agent Job restores the database
  7. Performs a DBCC CHECKDB and drops the database for a final time

This (hopefully) passes all of Grant's checks. This is how I created the command.

I check that the SQL Agent is running, otherwise we won't be able to run the job. I use a while loop with a timeout like this

$agentservice = Get-Service -ComputerName $ipaddr -Name $serviceName
if ($agentservice.Status -ne 'Running') {
    $agentservice.Start()
    $timeout = New-TimeSpan -Seconds 60
    $sw = [diagnostics.stopwatch]::StartNew()
    $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    while ($agentstatus -ne 'Running' -and $sw.elapsed -lt $timeout) {
        $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    }
}

There are a lot more checks and logic than I will describe here, to make sure that the process is as robust as possible. For example, the script can exit after errors are found using DBCC CHECKDB, or continue and label the database backup file and restore job appropriately. Unless the force option is used it will exit if the job name already exists. We have tried to think of everything, but if something has been missed or you have suggestions let us know (details at the end of the post).

The only thing I didn’t add was a LARGE RED POP UP SAYING ARE YOU SURE YOU WANT TO DROP THIS DATABASE but I considered it!!

Performs a DBCC CHECKDB

Running DBCC CHECKDB with Powershell is as easy as this

$sourceserver = New-Object Microsoft.SQLServer.Management.Smo.Server "ServerName"
$db = $sourceserver.databases[$dbname]
$null = $db.CheckTables('None')

you can read more on MSDN

Database is backed up WITH CHECKSUM

Stuart Moore is my go to for doing backups and restores with SMO

I ensured that the backup was performed with checksum like this

$backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$backup.Action = [Microsoft.SqlServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Final Full Backup of $dbname Prior to Dropping"
$backup.Database = $dbname
$backup.Checksum = $True
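
The snippet above only configures the backup; a sketch of the remaining SMO calls to point it at a file and run it might look like this (the $filename and $sourceserver variables are assumed to be set elsewhere in the command):

$backupdevice = New-Object Microsoft.SqlServer.Management.Smo.BackupDeviceItem($filename, 'File')
$backup.Devices.Add($backupdevice) # add the destination file as a backup device
$backup.SqlBackup($sourceserver)   # run the backup against the source SMO Server object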

Database is restored with VERIFY ONLY on the source

I used SMO all the way through this command and performed the restore verify only like this

$restoreverify = New-Object 'Microsoft.SqlServer.Management.Smo.Restore'
$restoreverify.Database = $dbname
$restoreverify.Devices.AddDevice($filename, $devicetype)
$result = $restoreverify.SqlVerify($sourceserver)

An Agent Job is created to easily restore from that backup

First I created a category for the Agent Job

Function New-SqlAgentJobCategory {
    param ([string]$categoryname,
        [object]$jobServer)
    if (!$jobServer.JobCategories[$categoryname]) {
        if ($Pscmdlet.ShouldProcess($sourceserver, "Creating Agent Job Category $categoryname")) {
            try {
                Write-Output "Creating Agent Job Category $categoryname"
                $category = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobCategory
                $category.Parent = $jobServer
                $category.Name = $categoryname
                $category.Create()
                Write-Output "Created Agent Job Category $categoryname"
            }
            catch {
                Write-Exception $_
                throw "FAILED : To Create Agent Job Category $categoryname - Aborting"
            }
        }
    }
}

and then generated the T-SQL for the restore step by using the Script method on the Restore SMO object
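
A sketch of what that scripting step might look like; the variable names are illustrative rather than the exact dbatools code, and $devicetype is assumed from the verify step above:

$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Database = $dbname
$restore.Devices.AddDevice($filename, $devicetype)
# Script() returns the T-SQL that would perform the restore, which becomes the job step command
$jobStepCommmand = ($restore.Script($sourceserver)) -join "`r`n"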

This is how to create an Agent Job

$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job $jobServer, $jobname
$job.Name = $jobname
$job.OwnerLoginName = $jobowner
$job.Description = "This job will restore the $dbname database using the final backup located at $filename"

and then to add a job step to run the restore command

$jobStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep $job, $jobStepName
$jobStep.SubSystem = 'TransactSql' # 'PowerShell'
$jobStep.DatabaseName = 'master'
$jobStep.Command = $jobStepCommmand
$jobStep.OnSuccessAction = 'QuitWithSuccess'
$jobStep.OnFailAction = 'QuitWithFailure'
if ($Pscmdlet.ShouldProcess($destination, "Creating Agent JobStep on $destination")) {
    $null = $jobStep.Create()
}
$job.ApplyToTargetServer($destination)
$job.StartStepID = $jobStartStepid
$job.Alter()

 

The database is dropped

We try 3 different methods to drop the database

$server.KillDatabase($dbname)
$server.databases[$dbname].Drop()
$null = $server.ConnectionContext.ExecuteNonQuery("DROP DATABASE $dbname")
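
The idea is to fall through from the gentlest method that works; a rough sketch of that logic (not the exact dbatools code) looks like this:

try {
    # KillDatabase drops any open connections and then drops the database
    $server.KillDatabase($dbname)
}
catch {
    try {
        $server.Databases[$dbname].Drop()
    }
    catch {
        $null = $server.ConnectionContext.ExecuteNonQuery("DROP DATABASE $dbname")
    }
}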

The Agent Job restores the database

To run the Agent Job I call the Start method of the Job SMO object
    $job = $destserver.JobServer.Jobs[$jobname]
    $job.Start()
    $status = $job.CurrentRunStatus
    while ($status -ne 'Idle') {
        Write-Output " Restore Job for $dbname on $destination is $status"
        $job.Refresh()
        $status = $job.CurrentRunStatus
        Start-Sleep -Seconds 5
    }
Then we drop the database for the final time, confident that we have a safe backup and an easy one-click method to restore it from that backup (as long as the backup is in the same location).
There are further details on the functions page on dbatools
Some videos of it in action are on YouTube http://dbatools.io/video
You can take a look at the code on GitHub here

You can install it with

Install-Module dbatools
You can provide feedback via the Trello Board or discuss it in the #dbatools channel in the SQL Server Community Slack
You too can become a contributor https://dbatools.io/join-us/ Come and write a command to make it easy for DBAs to (this bit is up to your imagination).

Using the new SQLServer Powershell module to get SQL Agent Job Information

So with the July release of SSMS everything changed for using PowerShell with SQL. You can read the details here. As I mentioned in my previous post, the name of the module has changed to sqlserver.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

You can download the latest SSMS release here. Once you have installed it and rebooted you can start to look at the new PowerShell cmdlets

Import-module sqlserver

Take a look at cmdlets

  Get-command -module sqlserver

Today I want to look at agent jobs

  Get-command *sqlagent*

getcomand sqlagent

So I decided to see how to gather the information I gather for the DBADatabase as described here

This is the query I use to insert the data for the server level agent job information.

  $Query = @"
INSERT INTO [Info].[AgentJobServer]
 ([Date]
 ,[InstanceID]
 ,[NumberOfJobs]
 ,[SuccessfulJobs]
 ,[FailedJobs]
 ,[DisabledJobs]
 ,[UnknownJobs])
 VALUES
 (GetDate()
 ,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
 ,'$JobCount'
 ,'$successCount'
 ,'$failedCount'
 ,'$JobsDisabled'
 ,'$UnknownCount')
"@

So Get-SqlAgentJob looks like the one I need. Let's take a look at the help. This should be the starting point whenever you use a new cmdlet

  Get-Help Get-SqlAgentJob -Full

Which states

Returns a SQL Agent Job object for each job that is present in the target instance of SQL Agent.

That sounds like it will meet my needs. Let's take a look

 Get-SqlAgentJob -ServerInstance $Connection|ft -AutoSize

sqlinstances

I can get the information I require like this

 $JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count

NOTE – That code is for PowerShell V4 and V5, if you are using earlier versions of PowerShell you would need to use

 $JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection|Where-Object {$_.LastRunOutcome -eq 'Succeeded'}).Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object {$_.LastRunOutcome -eq 'Failed'}).Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.IsEnabled -eq $false}).Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.LastRunOutcome -eq 'Unknown'}).Count

But to make the code more performant it is better to do this

  [pscustomobject]$Jobs= @{}
$Jobs.JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$Jobs.successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$Jobs.failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$Jobs.JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$Jobs.UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count
$Jobs

jobs

Using Measure-Command showed that this completed in
TotalSeconds : 0.9889336
Rather than
TotalSeconds : 2.9045701
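
Those timings came from wrapping each version in Measure-Command, along these lines:

Measure-Command {
    # paste either version of the job-counting code in here
    $JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
} | Select-Object TotalSeconds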

Note that

  (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.Enabled -eq $false}.Count

Does not work. I had to check the properties using

  Get-SqlAgentJob -ServerInstance $Connection |Get-Member -Type Properties

Which showed me

IsEnabled Property bool IsEnabled {get;set;}

So I tested this against the various SQL versions I had in my lab using this code

 $Table = $null
$Table = New-Object System.Data.DataTable "Jobs"
$Col1 = New-Object System.Data.DataColumn ServerName,([string])
$Col2 = New-Object System.Data.DataColumn JobCount,([int])
$Col3 = New-Object System.Data.DataColumn SuccessCount,([int])
$Col4 = New-Object System.Data.DataColumn FailedCount,([int])
$Col5 = New-Object System.Data.DataColumn DisabledCount,([int])
$Col6 = New-Object System.Data.DataColumn UnknownCount,([int])

$Table.Columns.Add($Col1)
$Table.Columns.Add($Col2)
$Table.Columns.Add($Col3)
$Table.Columns.Add($Col4)
$Table.Columns.Add($Col5)
$Table.Columns.Add($Col6)
foreach ($ServerName in $DemoServers)
{
## $ServerName
$InstanceName =  $ServerName|Select-Object InstanceName -ExpandProperty InstanceName
$Port = $ServerName| Select-Object Port -ExpandProperty Port
$ServerName = $ServerName|Select-Object ServerName -ExpandProperty ServerName
$Connection = $ServerName + '\' + $InstanceName + ',' + $Port
try
{
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
}
catch
{
"Failed to connect to $Connection"
}
if (!( $srv.version)){
"Failed to Connect to $Connection"
continue
}
[pscustomobject]$Jobs= @{}
$JobHistory = Get-SqlAgentJob -ServerInstance $Connection
$Row = $Table.NewRow()
$Row.ServerName = $ServerName
$Row.JobCount = $JobHistory.Count
$Row.SuccessCount = $JobHistory.where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$Row.FailedCount = $JobHistory.where{$_.LastRunOutcome -eq 'Failed'}.Count
$Row.DisabledCount = $JobHistory.where{$_.IsEnabled -eq $false}.Count
$Row.UnknownCount = $JobHistory.where{$_.LastRunOutcome -eq 'Unknown'}.Count
$Table.Rows.Add($row)
}
$Table|ft
Here are the results
job data table

I also had a look at Get-SQLAgentJobHistory Lets take a look at the help

 Get-help get-SQLAgentJobHistory -showwindow

DESCRIPTION

Returns the JobHistory present in the target instance of SQL Agent.

This cmdlet supports the following modes of operation to return the JobHistory:

  1. By specifying the Path of the SQL Agent instance.
  2. By passing the instance of the SQL Agent in the input.
  3. By invoking the cmdlet in a valid context.

So I ran

 Get-SqlAgentJobHistory -ServerInstance sql2014ser12r2

And got back a whole load of information: every job history entry available on the server. Too much to look at immediately and work out what to do.

So I looked at just one job

 Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive'

And got back the last month's worth of history for that one job, as that is the schedule used to purge the job history for this server. So then I added -Since Yesterday to only get the last 24 hours' history

 Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday

agentjobdetail

The Since Parameter is described as

-Since <SinceType>

A convenient abbreviation to avoid using the -StartRunDate parameter.
It can be specified with the -EndRunDate parameter.

Do not specify a -StartRunDate parameter, if you want to use it.

Accepted values are:
– Midnight (gets all the job history information generated after midnight)
– Yesterday (gets all the job history information generated in the last 24 hours)
– LastWeek (gets all the job history information generated in the last week)
– LastMonth (gets all the job history information generated in the last month)

When I run

 Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |Measure-Object

I get

Count : 3

And if I run

 Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |select RunDate,StepID,Server,JobName,StepName,Message|Out-GridView

I get

agent job out gridview

Which matches the view I see in SSMS Agent Job History

jobhistory

So Get-SqlAgentJobHistory will enable you to use PowerShell to gather information about the Job history for each step of the Agent Jobs and also the message which I can see being very useful.

Come and join us in the SQL Community Slack to discuss these CMDLets and all things SQL Community https://sqlps.io/slack

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute

https://sqlps.io/vote for SQLPS sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets

trellocount

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and already it seems a good place for people to get help, with 150+ members in a few days. You can get an invite here https://sqlps.io/slack

Come and join us

DBA Database scripts are on Github

It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, this fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of dbo.InstanceList in my DBA database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here

github

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use: some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome

 

 

Backing up to URL container name – case is important

If you use SQL Backup to URL to back up your databases to Azure blob storage, remember that the container name is case-sensitive

So

BACKUP LOG [DatabaseName]
TO URL = N'https://storageaccountname.blob.core.windows.net/containername/databasename_log_dmmyyhhss.trn'
WITH CHECKSUM, NO_COMPRESSION, CREDENTIAL = N'credential'

will work but

BACKUP LOG [DatabaseName]
TO URL = N'https://storageaccountname.blob.core.windows.net/CONTAINERNAME/databasename_log_dmmyyhhss.trn'
WITH CHECKSUM, NO_COMPRESSION, CREDENTIAL = N'credential'

will give a (400) Bad Request error which may not be easy to diagnose

Msg 3271, Level 16, State 1, Line 1
A nonrecoverable I/O error occurred on file "https://storageacccountname.blob.core.windows.net/CONTAINERNAME/databasename_log_dmmyyhhss.trn':" Backup to URL received an exception from the remote endpoint. 
Exception Message: The remote server returned an error: (400) Bad Request..
Msg 3013, Level 16, State 1, Line 1
BACKUP LOG is terminating abnormally.

If you are using Ola Hallengren's jobs to perform your backups then your job step will look like this

sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d DBA-Admin -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @URL = 'https://storageaccountname.blob.core.windows.net/containername', @Credential = 'credential', @BackupType = 'LOG', @ChangeBackupType = 'Y', @Verify = 'Y', @CheckSum = 'Y', @LogToTable = 'Y'" -b

Note the @ChangeBackupType = 'Y' parameter, which is not set by default but I think is very useful. If you have just created a database and take log backups every 15 minutes but differential (or full) backups every night, the log backup will fail until a full backup has been taken. This parameter will check if a log backup is possible and, if not, take a full backup, meaning that you can still keep to your RTO/RPO requirements even for newly created databases

Power Bi, PowerShell and SQL Agent Jobs

Continuing my series on using Power Bi with my DBA Database, I am going to show in this post how I create the most useful daily report for DBAs, the SQL Agent Job report. You can get the scripts and reports here

Please note this project became dbareports.io

AG1

This gives a quick overview of the status of the Agent Jobs across the estate and also quickly identifies recent failed jobs enabling the DBA to understand their focus and prioritise their morning efforts.

I gather the information into two tables, AgentJobDetail

CREATE TABLE [Info].[AgentJobDetail](
[AgetnJobDetailID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[Category] [nvarchar](50) NOT NULL,
[JobName] [nvarchar](250) NOT NULL,
[Description] [nvarchar](750) NOT NULL,
[IsEnabled] [bit] NOT NULL,
[Status] [nvarchar](50) NOT NULL,
[LastRunTime] [datetime] NOT NULL,
[Outcome] [nvarchar](50) NOT NULL,
CONSTRAINT [PK_info.AgentJobDetail] PRIMARY KEY CLUSTERED
(
[AgetnJobDetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

and AgentJobServer

CREATE TABLE [Info].[AgentJobServer](
[AgentJobServerID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[NumberOfJobs] [int] NOT NULL,
[SuccessfulJobs] [int] NOT NULL,
[FailedJobs] [int] NOT NULL,
[DisabledJobs] [int] NOT NULL,
[UnknownJobs] [int] NOT NULL,
CONSTRAINT [PK_Info.AgentJobServer] PRIMARY KEY CLUSTERED
(
[AgentJobServerID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The Detail table holds the results of every Agent Job and the Server table holds a roll up for each server. The script to gather this information is based on the script I used to put the information into an Excel Sheet as described in my post How I Check Hundreds of Agent Jobs in 60 Seconds with PowerShell which I also altered to send an HTML email to the DBA team each morning. This however is a much better solution and allows for better monitoring and trending.

As I have explained in my previous posts, I use an InstanceList table to hold the information about each instance in the estate and a series of PowerShell scripts which run via Agent Jobs to gather the information into various tables. Those posts describe the use of the Write-Log function and the methodology of gathering the required information and looping through each instance, so I won't repeat that here. There is one extra check I do, however, for Express Edition, as this does not include the Agent service

$edition = $srv.Edition
if ($Edition -eq 'Express') {
    Write-Log -Path $LogFile -Message "No Information gathered as this Connection $Connection is Express"
    continue
}

The Agent Job information can be found in SMO by exploring the $srv.JobServer.Jobs object and I gather the information by iterating through each job and setting the values we require to variables

try {
    $JobCount = $srv.JobServer.jobs.Count
    $successCount = 0
    $failedCount = 0
    $UnknownCount = 0
    $JobsDisabled = 0
    #For each job on the server
    foreach ($job in $srv.JobServer.Jobs)
    {
        $jobName = $job.Name;
        $jobEnabled = $job.IsEnabled;
        $jobLastRunOutcome = $job.LastRunOutcome;
        $Category = $Job.Category;
        $RunStatus = $Job.CurrentRunStatus;
        $Time = $job.LastRunDate;
        if ($Time -eq '01/01/0001 00:00:00')
        {$Time = ''}
        $Description = $Job.Description;
        #Counts for jobs Outcome
        if ($jobEnabled -eq $False)
        {$JobsDisabled += 1}
        elseif ($jobLastRunOutcome -eq "Failed")
        {$failedCount += 1; }
        elseif ($jobLastRunOutcome -eq "Succeeded")
        {$successCount += 1; }
        elseif ($jobLastRunOutcome -eq "Unknown")
        {$UnknownCount += 1; }
    }    
}

I found that some jobs had names and descriptions with ' in them, which would cause the SQL update or insert statement to fail, so I use the Replace method to replace the single quote with two single quotes

if ($Description -eq $null) {
    $Description = ' '
}
$Description = $Description.replace('''', '''''')
if ($jobName -eq $Null) {
    $jobName = 'None'
}
$JobName = $JobName.replace('''', '''''')

I then insert the data per job after checking that it does not already exist, which allows me to re-run the job, should a number of servers be uncontactable at the time of the job running, without any additional work

IF NOT EXISTS (
SELECT [AgetnJobDetailID]
FROM [DBADatabase].[Info].[AgentJobDetail]
where jobname = '$jobName'
and InstanceID = (SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
and lastruntime = '$Time'
)
INSERT INTO [Info].[AgentJobDetail]
([Date]
,[InstanceID]
,[Category]
,[JobName]
,[Description]
,[IsEnabled]
,[Status]
,[LastRunTime]
,[Outcome])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$Category'
,'$jobName'
,'$Description'
,'$jobEnabled'
,'$RunStatus'
,'$Time'
,'$jobLastRunOutcome')

I put this in a here-string variable and pass it to Invoke-Sqlcmd. I do the same with the roll up, using this query

INSERT INTO [Info].[AgentJobServer]
([Date]
,[InstanceID]
,[NumberOfJobs]
,[SuccessfulJobs]
,[FailedJobs]
,[DisabledJobs]
,[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$JobCount'
,'$successCount'
,'$failedCount'
,'$JobsDisabled'
,'$UnknownCount')
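
A sketch of how those here-strings get run against the DBA Database; the central server and database names here are assumptions for illustration:

# $Query is one of the here-strings shown above
Invoke-Sqlcmd -ServerInstance 'CentralServer\DBA' -Database 'DBADatabase' -Query $Query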

This job runs as a SQL Agent Job every morning, half an hour or so before the DBA arrives for the morning shift, vastly improving the ability of the DBA to prioritise their morning routine.

To create the report open Power Bi Desktop and click Get Data

ag2

Then choose SQL Server and click connect

ag3

Enter the connection string, the database and the query to gather the data

ag5

The query is

Select IL.InstanceID,
IL.ServerName,
IL.InstanceName,
IL.Environment,
IL.Location,
AJD.Category,
AJD.Date,
AJD.Description,
AJD.IsEnabled,
AJD.JobName,
AJD.LastRunTime,
AJD.Outcome,
AJD.Status
FROM [dbo].[InstanceList] IL
JOIN [Info].[AgentJobDetail] AJD
ON IL.InstanceID = AJD.InstanceID
WHERE LastRunTime > DATEADD(Day,-31,GETDATE())

Once we have gathered the data we then create some extra columns and measures for the reports. First I create a date column from the datetime Date Column

DayDate = DATE(YEAR('Agent Job Detail'[Date]),MONTH('Agent Job Detail'[Date]),DAY('Agent Job Detail'[Date]))

I also do the same for the LastRuntime. I create a day of the week column so that I can report on jobs outcome by day

DayOfWeek = CONCATENATE(WEEKDAY('Agent Job Detail'[Date],2),FORMAT('Agent Job Detail'[Date]," -dddd"))

My friend Terry McCann b | t helped me create a column that returns true if the last run time is within 24 hours of the current time, to help identify the recent jobs that have failed. NOTE – on a Monday morning you will need to change this if you do not check your jobs at the weekend.

Last Run Relative Hour = ((1.0*(NOW()-'Agent Job Detail'[LastRunTime]))*24)<24

I create a measure for Succeeded, Failed and Unknown

Succeeded = IF('Agent Job Detail'[Outcome] = "Succeeded"
, 1
, 0)

Next we have to create some measures for the sum of failed jobs and the averages. This is the code for the 7 day sum

Failed7Days = CALCULATE(SUM('Agent Job Detail'[Failed]),FILTER (
ALL ( 'Agent Job Detail'[Last Run Date] ),
'Agent Job Detail'[Last Run Date] > ( MAX ( 'Agent Job Detail'[Last Run Date] ) - 7 )
&& 'Agent Job Detail'[Last Run Date] <= MAX ( 'Agent Job Detail'[Last Run Date] ) ) )

and for the 7 Day average

Failed7DayAverage = DIVIDE([Failed7Days],7)

I did the same for 30 days. I used the TechNet reference for DAX expressions and got ideas from Chris Webb's blog

ag6
First I created the 30 day historical trend chart using a Line and Clustered Column chart, with the last run date as the axis, the Succeeded measure as the column and the Failed, Failed 7 Day Average and Failed 30 Day Average measures as the lines

I then formatted the lines and title and column

ag7

To create the gauge, which shows how well we have done today, I created a measure to quickly identify today's jobs

LastRun Relative Date Offset = INT('Agent Job Detail'[LastRunTime] - TODAY())

which I use as a filter for the gauge as shown below. I also create two measures, zero and twenty, for the minimum and maximum of the gauge

ag8

The rest of the report is measures for the 7 day average and 30 day average, a slicer for environment and two tables: one to show the historical job counts and one to show the jobs that have failed in the last 24 hours, using the Last Run Relative Hour measure from above

ag9

There are many other reports that you can or may want to create, maybe by day of the week or by category, depending on your needs. Once you have the data gathered you are free to play with it as you see fit. Please add any further examples of reports you run or would like to run in the comments below.

Once you have your report written you can publish it to PowerBi.com and create a dashboard and query it with natural language. I have explained the process in previous posts

For example – How many Jobs failed today

ag110

Which server had most failed jobs

ag11

or using the category field which database maintenance jobs failed today

ag13

I hope these posts have given you ideas about how you can use Powershell, a DBA Database and Power Bi to help you to manage and report on your environment.

You can get the scripts and reports here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Using Power Bi with my DBA Database

Every good DBA should have a DBA database. A place to store information about all of their instances and databases.

I have an InstanceList table which looks like this

CREATE TABLE [dbo].[InstanceList](
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[ServerName] [nvarchar](50) NOT NULL,
[InstanceName] [nvarchar](50) NOT NULL,
[Port] [int] NOT NULL,
[AG] [bit] NULL,
[Inactive] [bit] NULL CONSTRAINT [DF_InstanceList_Inactive] DEFAULT ((0)),
[Environment] [nvarchar](25) NULL,
[Location] [nvarchar](30) NULL,
CONSTRAINT [PK_InstanceList_ID] PRIMARY KEY CLUSTERED
(
[InstanceID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

I use this as the basis for all of my information gathering. By adding Server Name, Instance Name, Port, Environment and Location to the table, I use overnight Agent Jobs to run PowerShell scripts to gather information about all of the instances. This way the information is dynamic and gathered from the server, so when we add RAM and change max memory this is updated the next time the script runs. You can also automate your installation and decommission procedures (using PowerShell scripts) to add the information to the DBA database automatically.
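
The nightly gathering scripts all follow the same basic pattern; a minimal sketch, assuming the DBA database is called DBADatabase on a central instance called CentralServer\DBA:

$InstanceList = Invoke-Sqlcmd -ServerInstance 'CentralServer\DBA' -Database 'DBADatabase' `
    -Query "SELECT ServerName, InstanceName, Port FROM dbo.InstanceList WHERE Inactive = 0"

foreach ($Instance in $InstanceList) {
    # build the connection in the same ServerName\InstanceName,Port format used throughout these posts
    $Connection = $Instance.ServerName + '\' + $Instance.InstanceName + ',' + $Instance.Port
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Connection
    # ... gather the information for this instance and write it back to the DBA database
}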

I have 4 scripts

  • ServerInfo which gathers Windows OS information such as version and edition of the operating system, number of processors, amount of RAM, IP address, domain name etc
  • SQLInfo which gathers information about the instance such as SQL version, edition, collation, max and min memory, MAXDOP , service accounts and start modes, default file locations etc
  • Database information such as size, data usage, index usage, last backup dates, owner and many more
  • Agent Job which gathers the outcome of the jobs that have run, their names, category into two tables one for a server rollup and one for details about each job

Recently I have received a lot of requests for information from various sources: auditors asking about encryption and backup policies, project managers asking about database and SQL versions, compliance asking about numbers of Windows 2003 servers or SQL 2005 servers, and system teams asking which servers in a particular location can be turned off at which time, dependent on which system they are supporting, for a power down.

Before we had the DBA database holding all of the information about the instances we would have struggled to compile this information, and when I saw Power Bi was released to GA I thought that it would be a good place to start to learn about it. By using data that I understood and answering questions where I knew the format of the answer, I could be more confident about experimenting; i.e. if I know I have 100 servers then any result for servers that exceeds that is incorrect.

I have never been a BI guy and I claim no expertise in the correct methods of manipulating the data. There may very well be better methods of achieving these results, and if there are please feel free to comment below so that I can improve my knowledge and keep on learning.

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I have no servers in Bolton, it is where I am from originally!!

I downloaded Power BI Desktop from powerbi.com and ran the installer and the first screen you see is this one

1

I then clicked on Get Data

2

And then SQL Server and filled in the details for my DBA Database and clicked connect

3

I used my current Windows credentials

4

It then asked me which tables I wanted to load so I said all of them 🙂

5

Once I had loaded the data I looked at the queries and renamed some of the columns to make more sense to me. I also created some calculated columns by clicking New Column

I created a relative date column using this code from Chris Webb http://blog.crossjoin.co.uk/2013/01/24/building-relative-date-reports-in-powerpivot/

Relative Date Offset = INT([Date] - TODAY())
Relative Date = IF([Relative Date Offset] = 0
, "Today"
, "Today " & IF([Relative Date Offset] > 0, "+", "") & [Relative Date Offset])

This will enable me to show data for the last day

I also did the same for days of the week

DayOfWeek = CONCATENATE(WEEKDAY('Info AgentJobDetail'[LastRunTime],2),FORMAT('Info AgentJobDetail'[LastRunTime]," -dddd"))

Because I struggled to show the information about the Operating system I also created two columns for OS name and OS edition by adding columns as shown below

Operating System Version = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And

Operating System Edition = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

Then I started to play with the data.

This is probably not how a professional would phrase it but I would say that if you don’t know how to use a new application be brave and give it a try.

OBVIOUSLY you are a PROFESSIONAL DBA and will not do anything that would endanger production, use a backup of your database and work locally if you need to.

The first thing I wanted to know was how many servers I had by operating system, how many by SQL version and where they were located, so that I could answer the questions I had been asked. I had already written a query to get the correct information to give to the requestors, so I knew the correct answers, which was also an advantage. I did this like this

I expanded the Info ServerOSInfo query and dragged the ServerName field to the report which created a table of names

6

I then changed the ServerName values to Count

7

I then dragged the calculated column Operating System Version to the table

8

If I click on the table and then donut chart in the visualisations it changes to

9

So you can quickly see how you want the data displayed

I then decided to look at the number of SQL 2005 instances that I had and as I had relationships between SQLInfo and Instancelist and Clients I could build a more dynamic report.

I created a donut chart with SQLVersion as the legend and InstanceID as the values and a table of SQLVersion, ServerName and Instance Name. I also created a card that was count of InstanceID

10

Now it starts getting really useful. If I want to know how many SQL 2005 instances I have I simply click on SQL2005 in the donut chart and the rest of the report changes

11

This is very cool and I hope you can see how useful this could be and how brilliant it would be to enable relevant people within the organisation the ability to look at that report and answer their own questions.

Let's take it to the next step. I have a Location column in the InstanceList table which consists of town names. If I choose a map and drag that column to the Location field and set Values and Color Saturation to the count of InstanceID

12

and create two tables, one of Client with a count of InstanceID and one of Location with a count of InstanceID, I can do this

13

Look at how it dynamically changes as you click on the data labels – This is very cool and makes me smile every time!! I altered the colour saturation colours to make it easier to see. Now if I am asked about SQL 2005 servers I can quickly click on SQL 2005 and

14

I can see that there are 32 instances, most are in Southampton, and which clients they support

If I click a location rather than SQL version the report alters like so

15

So you can simply pass the report file to your colleagues to enable them to use it, or you can publish it to PowerBi.com. I am not going to go into any detail about the costs or licensing etc; I will just say it is as easy as clicking publish. If you wish to have the information automatically refreshed there are some more steps to go through, detailed here, which enable you to connect your on-premises database to PowerBi using the data management gateway; alternatively you can simply refresh the data in the report, publish it and replace the existing report.

Once the report is in PowerBi.com you can enable Q and A on the data. This is some kind of supernatural, mystical, magical query language which enables you to query your data with natural language, alters the results as you type and even copes with (deliberate for the screenshot) spelling mistakes 🙂

16

I also created a report for my Agent Jobs to enable me to quickly and easily see which Jobs have failed in the last day

17

I did this by filtering the report by Relative Date Offset greater than -1 (today) and isenabled = True and Outcome = Failed

There are many many more ways I can see this being useful and I hope I have given you some ideas and encouraged you to try for yourself and find out more

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Setting Up and Using Azure VM SQL Automated Backup (and Restore)

This weekend I was creating some Azure VMs to test and was required to use the GUI for some screenshots. I have always used my PowerShell scripts described here to create my test systems, and with a new job taking up a lot of my time I had missed the announcements about Azure SQL Automated Backup and Azure SQL Automated Patching, so I was surprised to see this screen

1

I read the announcement and also the details on MSDN https://msdn.microsoft.com/en-us/library/azure/dn906091.aspx which show that this requires the SQL Server IaaS Agent. This is a default option on new virtual machines.

There are some other considerations too. It is only supported for SQL Server 2014 and Windows Server 2012 and 2012 R2 at present, and you can set a retention period to a maximum of 30 days, but it is automated. You do not have to decide upon the backup strategy; Azure will decide the frequency and type of backups depending upon the workload of the database and some other factors, such as:

A full backup is taken
– when an instance is added to use Managed Backup
– when transaction log growth is 1 GB or more
– at least once a week
– if the log chain is broken
– when a database is created

A transaction log backup is taken
– if no log backup is found
– when transaction log space used is 5 MB or larger
– at least once every two hours
– any time the transaction log backup is lagging behind a full database backup (the goal is to keep the log chain ahead of the full backup)

From <https://msdn.microsoft.com/en-gb/library/dn449496(v=sql.120).aspx>

There are some restrictions
– Only database backups are supported
– System databases are not supported so you need to back those up yourself
– You can only back up to Azure storage
– Maximum backup size is 1 TB as this is the maximum size for a blob in Azure storage
– Simple recovery is not supported
– Maximum retention is 30 days – if you are required to keep your backups for longer than 30 days for regulatory or other reasons you could use Azure Automation to copy the files to another storage account in Azure (see the sketch below)
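
As an illustration of that last point, here is a minimal sketch of copying the backup blobs to a second storage account using the classic Azure storage cmdlets. The account and container names are placeholders, and you would need to wrap something like this in an Azure Automation runbook with the appropriate credentials:

# Build a context for the storage account that Managed Backup writes to
$srcKey      = (Get-AzureStorageKey -StorageAccountName 'sourceaccount').Primary
$srcContext  = New-AzureStorageContext -StorageAccountName 'sourceaccount' -StorageAccountKey $srcKey

# Build a context for a second storage account used for longer term retention
$destKey     = (Get-AzureStorageKey -StorageAccountName 'archiveaccount').Primary
$destContext = New-AzureStorageContext -StorageAccountName 'archiveaccount' -StorageAccountKey $destKey

# Copy every blob in the backup container to the archive container
Get-AzureStorageBlob -Container 'backupcontainer' -Context $srcContext |
    Start-AzureStorageBlobCopy -DestContainer 'archivecontainer' -DestContext $destContext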

How to set it up.

If you are using the GUI then you will find SQL Automated Backup in the optional config blade of the setup. You can follow the steps here to set it up. If (like me) you want to use PowerShell then use the following code after you have created your Virtual Machine

# Get the storage account key and build a storage context to hold the backups
$storageaccount = "<storageaccountname>"
$storageaccountkey = (Get-AzureStorageKey -StorageAccountName $storageaccount).Primary
$storagecontext = New-AzureStorageContext -StorageAccountName $storageaccount -StorageAccountKey $storageaccountkey

# Prompt for the backup encryption password and build the auto backup configuration
# (10 day retention, encryption enabled)
$encryptionpassword = (Get-Credential -Message 'Backup Encryption Password' -UserName 'IGNOREUSER').Password
$autobackupconfig = New-AzureVMSqlServerAutoBackupConfig -StorageContext $storagecontext -Enable -RetentionPeriod 10 -EnableEncryption -CertificatePassword $encryptionpassword

# Apply the configuration to the VM via the SQL Server IaaS Agent extension
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoBackupSettings $autobackupconfig | Update-AzureVM
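
If you want to confirm what the extension has been configured with from PowerShell rather than the portal, something along these lines should work – I am assuming here that the Get-AzureVMSqlServerExtension cmdlet from the same classic Azure module is available and that it exposes the auto backup settings:

# Inspect the SQL Server IaaS Agent extension settings on the VM
$vm = Get-AzureVM -ServiceName <vmservicename> -Name <vmname>
($vm | Get-AzureVMSqlServerExtension).AutoBackupSettings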

Once you have run the code, Azure will take care of the rest. Add a couple of databases to your instance and look in the storage account and you will see this

2

3

And in the automaticbackup container you will find the Certificates and master key backups

4

It will also create a credential

5

You can use the same credential to back up your system databases. If, like me, you use Ola Hallengren's excellent Maintenance Solution then simply change your system databases backup job as follows


USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id = 1,
		@command = N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = ''SYSTEM_DATABASES'', @URL = ''https://myaccount.blob.core.windows.net/mycontainer'',
		@Credential = ''AutoBackup_Credential'', @BackupType = ''FULL'', @Verify = ''Y'', @CleanupTime = NULL, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'
GO

If you need to restore your database then you can use the GUI and when you choose restore you will see this screen

6

Enter your storage account and the key, which you can get from the Azure portal. You will notice that the credential has already been selected; click connect and

7

There are all of your backups ready to restore to any point in time that you choose. By clicking script the T-SQL is generated which looks like this


USE [master]
BACKUP LOG [Test] TO  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_LogBackup_2015-07-16_06-21-26.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,
NOFORMAT, NOINIT,  NAME = N'Test_LogBackup_2015-07-16_06-21-26',
NOSKIP, NOREWIND, NOUNLOAD,  NORECOVERY ,  STATS = 5
RESTORE DATABASE [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714201240+00.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714202740+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714224241+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715005741+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715031242+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715052742+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715074243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715095743+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715121243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150716060004+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NOUNLOAD,  STATS = 5
GO

There is an important note. Remember this when you have just set it up so that you don’t think that you have done it wrong (which is what I did!)

When you enable Automated Patching for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the portal will not show that Automated Patching is configured. Wait several minutes for the agent to be installed, configured. After that the portal will reflect the new settings.

From <https://msdn.microsoft.com/en-us/library/azure/dn961166.aspx>

And also look out for this

8

The password I had chosen was not complex enough, but the PowerShell script had succeeded and not given me the warning

To set up SQL Automated Patching you follow similar steps. The setting is again on the OS Config blade; click enable and then you can choose the frequency and duration of the patching.
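
If, as with the backup configuration, you would rather do this with PowerShell, the classic Azure module has a matching config cmdlet. This is a sketch only and the window values are just examples:

# Patch on Sundays, starting at 02:00, with a 120 minute maintenance window, Important updates only
$autopatchingconfig = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek 'Sunday' -MaintenanceWindowStartingHour 2 -MaintenanceWindowDuration 120 -PatchCategory 'Important'
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $autopatchingconfig | Update-AzureVM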

It is important to remember to choose your maintenance window correctly. If you have set up your SQL VMs correctly you will have them in the same availability set and be using either mirroring or Availability Groups, which ensures availability during the underlying host patching. However, I had it confirmed by Principal Software Engineering Manager Sethu Srinivasan, via Microsoft PFE Arvind Shyamsundar, that SQL Automated Patching is not HA aware, so you will need to ensure that the maintenance windows on each VM do not overlap.

Scheduling Ola Hallengrens Maintenance Solution Default Jobs with Powershell

If you are a SQL Server DBA you should know about Ola Hallengren and will probably have investigated his Maintenance Solution.

If you haven’t please start here https://ola.hallengren.com/

You can also watch his presentation at SQLBits at this link

http://sqlbits.com/Sessions/Event9/Inside_Ola_Hallengrens_Maintenance_Solution

where he talks about and demonstrates the solution.

It is possible to just run his script to install the solution and schedule the jobs, and know that you have made a good start in keeping your databases safe. You should be more proactive than that and set up specific jobs for your own particular requirements, but you can and should find that information in other places, including the FAQ on Ola's site.

I particularly like the @ChangeBackupType parameter which, when running a transaction log or differential backup, will change the backup type to full if that backup type cannot be taken (for example, for a database that has no full backup yet). This is excellent for picking up new databases and backing them up soon after creation.

When you run the script the jobs are created but not scheduled, and it is for this reason I created this function. All it does is schedule the jobs so that I know they will run when a new server is created and all the databases will be backed up. I can then go back at a later date and schedule them correctly for the server's workload or tweak them according to specific needs, but this allows me that fuzzy feeling of knowing that the backups and other maintenance will be performed.

To accomplish this I pass a single parameter, $Server, to the function. This is the connection string and should be in the format SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port
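
As a rough sketch (the function name here is mine, purely for illustration), the shell of the function looks something like this:

Function Set-OlaDefaultJobSchedules
{
    param
    (
        # Connection string - SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port
        [Parameter(Mandatory = $true)]
        [string]$Server
    )
    # the code described below goes in here
}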

I then create a $srv SMO object as usual

$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server

Create a JobServer object and a Jobs array which holds the Jobs

$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs

And set the schedule for each job. I pick each Job using the Where-Object Cmdlet and break out if the job does not exist

$Job = $Jobs | Where-Object {$_.Name -eq 'DatabaseBackup - SYSTEM_DATABASES - FULL'}
if ($Job -eq $null)
{
    Write-Output "No Job with that name"
    break
}

Then I create a Schedule object and set its properties and create the schedule

$Schedule = new-object Microsoft.SqlServer.Management.Smo.Agent.JobSchedule ($job, 'Daily - Midnight ++ Not Sunday')
$Schedule.ActiveEndDate = Get-Date -Month 12 -Day 31 -Year 9999
$Schedule.ActiveEndTimeOfDay = '23:59:59'
$Schedule.FrequencyTypes = "Weekly"
$Schedule.FrequencyRecurrenceFactor = 1
$Schedule.FrequencySubDayTypes = "Once"
$Schedule.FrequencyInterval = 126 # Weekdays 62 + Saturdays 64 - https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.agent.jobschedule.frequencyinterval.aspx
$Schedule.ActiveStartDate = get-date
$schedule.ActiveStartTimeOfDay = '00:16:00'
$Schedule.IsEnabled = $true
$Schedule.Create()

I have picked this example for the blog as it shows some of the less obvious gotchas. Setting the active end date could only be achieved by using the Get-Date Cmdlet and defining the date. The schedule frequency interval above is for every day except Sundays. This is achieved by using the following table from MSDN, which is always my first port of call when writing these scripts

WeekDays.Sunday = 1
WeekDays.Monday = 2
WeekDays.Tuesday = 4
WeekDays.Wednesday = 8
WeekDays.Thursday = 16
WeekDays.Friday = 32
WeekDays.Saturday = 64
WeekDays.WeekDays = 62
WeekDays.WeekEnds = 65
WeekDays.EveryDay = 127

Combine values using an OR logical operator to set more than a single day. For example, combine WeekDays.Monday and WeekDays.Friday (FrequencyInterval = 2 + 32 = 34) to schedule an activity for Monday and Friday.

It is easy using this to set up whichever schedule you wish by combining the numbers. I would advise commenting it in the script so that your future self or following DBAs can understand what is happening.
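
As a quick sanity check of the arithmetic, you can combine the values with PowerShell's -bor operator (or simple addition, as each day is a distinct bit):

$Monday = 2
$Friday = 32
$Monday -bor $Friday      # 34 - Monday and Friday, as in the example above

$WeekDays = 62
$Saturday = 64
$WeekDays -bor $Saturday  # 126 - every day except Sunday, the value used in the schedule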

You can tweak this script or use the code to work with any Agent Jobs and set the schedules accordingly, and you can check that you have set the schedules correctly with this code

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs
$ActiveStartTimeOfDay = @{Name = "ActiveStartTimeOfDay"; Expression = {$_.JobSchedules.ActiveStartTimeOfDay}}
$FrequencyInterval = @{Name = "FrequencyInterval"; Expression = {$_.JobSchedules.FrequencyInterval}}
$FrequencyTypes = @{Name = "FrequencyTypes"; Expression = {$_.JobSchedules.FrequencyTypes}}
$IsEnabled = @{Name = "IsEnabled"; Expression = {$_.JobSchedules.IsEnabled}}
$Jobs | Where-Object {$_.Category -eq 'Database Maintenance'} | Select-Object Name, $IsEnabled, $FrequencyTypes, $FrequencyInterval, $ActiveStartTimeOfDay | Format-Table -AutoSize

You can get the script from Script Center via the link below or by searching for “Ola” using the script browser add-in straight from ISE

browser

https://gallery.technet.microsoft.com/scriptcenter/Schedule-Ola-Hallengrens-a66a3c89