Using the new SQLServer PowerShell module to get SQL Agent Job Information

So with the July release of SSMS everything changed for using PowerShell with SQL. You can read the details here. As I mentioned in my previous post, the name of the module has changed to SqlServer.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

You can download the latest SSMS release here. Once you have installed it and rebooted, you can start to look at the new PowerShell cmdlets.

Take a look at the new cmdlets.

Today I want to look at agent jobs
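
You can see what is available with Get-Command. Something along these lines (the wildcard filter is just one way of narrowing it down) lists the agent-related cmdlets:

Get-Command -Module SqlServer -Name *SqlAgent*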

[Image: Get-Command SqlAgent output]

So I decided to see how to gather the information I gather for the DBADatabase, as described here.

This is the query I use to insert the data for the server-level agent job information.

So Get-SqlAgentJob looks like the one I need. Let's take a look at the help, which should be the starting point whenever you use a new cmdlet.
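
For example:

Get-Help Get-SqlAgentJob -Full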

Which states

Returns a SQL Agent Job object for each job that is present in the target instance of SQL Agent.

That sounds like it will meet my needs. Let's take a look.

[Image: sqlinstances]

I can get the information I require with something like this (the instance name here is made up):
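
$Connection = 'SQL2014Server1' ## example instance name
$Jobs = (Get-SqlAgentJob -ServerInstance $Connection).Where{$_.IsEnabled -eq $true}
$Jobs | Select-Object Name, Category, LastRunDate, LastRunOutcome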

NOTE – That code is for PowerShell V4 and V5. If you are using earlier versions of PowerShell you would need the pipeline with Where-Object instead, something like this:
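
$Jobs = Get-SqlAgentJob -ServerInstance $Connection | Where-Object {$_.IsEnabled -eq $true}
$Jobs | Select-Object Name, Category, LastRunDate, LastRunOutcome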

But to make the code more performant it is better to get the jobs once into a variable and then filter the objects in memory, along these lines:
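
$Jobs = Get-SqlAgentJob -ServerInstance $Connection
$EnabledJobs = $Jobs.Where{$_.IsEnabled -eq $true}
$EnabledJobs | Select-Object Name, Category, LastRunDate, LastRunOutcome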

[Image: jobs]

Using Measure-Command showed that this completed in
TotalSeconds : 0.9889336
Rather than
TotalSeconds : 2.9045701

Note that guessing at the property name, like this
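
$Jobs.Where{$_.Enabled -eq $true}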

Does not work. I had to check the properties using
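
$Jobs | Get-Member -MemberType Property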

Which showed me

IsEnabled Property bool IsEnabled {get;set;}

So I tested this against the various SQL versions I had in my lab using code along these lines (the instance names here are invented):
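
$Instances = 'SQL2005Server','SQL2008Server','SQL2012Server','SQL2014Server','SQL2016Server' ## invented lab names
foreach ($Instance in $Instances) {
    $Jobs = Get-SqlAgentJob -ServerInstance $Instance
    [pscustomobject]@{
        Instance = $Instance
        Jobs     = $Jobs.Count
        Enabled  = $Jobs.Where{$_.IsEnabled -eq $true}.Count
        Failed   = $Jobs.Where{$_.LastRunOutcome -eq 'Failed'}.Count
    }
}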

Here are the results
[Image: job data table]

I also had a look at Get-SqlAgentJobHistory. Let's take a look at the help.

DESCRIPTION

Returns the JobHistory present in the target instance of SQL Agent.

This cmdlet supports the following modes of operation to return the JobHistory:

  1. By specifying the Path of the SQL Agent instance.
  2. By passing the instance of the SQL Agent in the input.
  3. By invoking the cmdlet in a valid context.

So I ran it with no filters at all
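
Get-SqlAgentJobHistory -ServerInstance $Connection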

And got back a whole load of information. Every job history available on the server. Too much to take in immediately to work out what to do with.

So I looked at just one job using the -JobName parameter (the job name here is an example)
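
Get-SqlAgentJobHistory -ServerInstance $Connection -JobName 'DatabaseBackup - USER_DATABASES - FULL'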

And got back the last month's worth of history for that one job, as that is the schedule used to purge the job history for this server. So then I added -Since Yesterday to only get the last 24 hours' history
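
Get-SqlAgentJobHistory -ServerInstance $Connection -JobName 'DatabaseBackup - USER_DATABASES - FULL' -Since Yesterday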

[Image: agentjobdetail]

The Since Parameter is described as

-Since <SinceType>

A convenient abbreviation to avoid using the -StartRunDate parameter.
It can be specified with the -EndRunDate parameter.

Do not specify a -StartRunDate parameter, if you want to use it.

Accepted values are:
– Midnight (gets all the job history information generated after midnight)
– Yesterday (gets all the job history information generated in the last 24 hours)
– LastWeek (gets all the job history information generated in the last week)
– LastMonth (gets all the job history information generated in the last month)

When I count those results with Measure-Object
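
Get-SqlAgentJobHistory -ServerInstance $Connection -JobName 'DatabaseBackup - USER_DATABASES - FULL' -Since Yesterday | Measure-Object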

I get

Count : 3

And if I send the same results to Out-GridView
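
Get-SqlAgentJobHistory -ServerInstance $Connection -JobName 'DatabaseBackup - USER_DATABASES - FULL' -Since Yesterday | Out-GridView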

I get

[Image: agent job Out-GridView]

Which matches the view I see in SSMS Agent Job History

[Image: jobhistory]

So Get-SqlAgentJobHistory will enable you to use PowerShell to gather information about the job history for each step of the Agent Jobs, and also the message, which I can see being very useful.

Come and join us in the SQL Community Slack to discuss these cmdlets and all things SQL community: https://sqlps.io/slack

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets

[Image: trellocount]

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and it already seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us


PowerShell CMDLets added for SQL2016 Always Encrypted

The post on the SQLServer blog at TechNet by the SQL Server Tools Team today made me jump out of my seat.

The July update for SSMS includes the first substantial improvement in SQL PowerShell in many years. We owe a lot of thanks for this effort to the great collaboration with our community. We have several new CMDLETs to share with you

In one release there are twenty-five new CMDLets for the new sqlserver module

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

So SQLPS will continue to work, but it will not be updated and will not contain the new cmdlets, nor any future ones.

So what new things do we have?

This month we introduce CMDLETs for the following areas:

  • Always Encrypted
  • SQL Agent
  • SQL Error Logs

Chrissy LeMaire has written about the new SQL Agent cmdlets

Aaron Nelson has written about the new Get-SqlErrorLog cmdlet

Laerte Junior has written about Invoke-SQLCmd

All four of us will be presenting a webinar on the new cmdlets via the PowerShell Virtual Chapter on Wed, Jul 06 2016 at 12:00 Eastern Daylight Time. If you can't make it, a recording will be made available on YouTube on the VC channel: https://sqlps.io/video

Always Encrypted CMDlets

That leaves the Always Encrypted CMDLets and there are 17 of those!

Add-SqlColumnEncryptionKeyValue Adds a new encrypted value for an existing column encryption key object in the database.
Complete-SqlColumnMasterKeyRotation Completes the rotation of a column master key.
Get-SqlColumnEncryptionKey Returns all column encryption key objects defined in the database, or returns one column encryption key object with the specified name.
Get-SqlColumnMasterKey Returns the column master key objects defined in the database, or returns one column master key object with the specified name.
Invoke-SqlColumnMasterKeyRotation Initiates the rotation of a column master key.
New-SqlAzureKeyVaultColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in Azure Key Vault.
New-SqlCngColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store supporting the Cryptography Next Generation (CNG) API.
New-SqlColumnEncryptionKey Creates a new column encryption key object in the database.
New-SqlColumnEncryptionKeyEncryptedValue Produces an encrypted value of a column encryption key.
New-SqlColumnEncryptionSettings Creates a new SqlColumnEncryptionSettings object that encapsulates information about a single column’s encryption, including CEK and encryption type.
New-SqlColumnMasterKey Creates a new column master key object in the database.
New-SqlCspColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store with a Cryptography Service Provider (CSP) supporting Cryptography API (CAPI).
Remove-SqlColumnEncryptionKey Removes the column encryption key object from the database.
Remove-SqlColumnEncryptionKeyValue Removes an encrypted value from an existing column encryption key object in the database.
Remove-SqlColumnMasterKey Removes the column master key object from the database.
Set-SqlColumnEncryption Encrypts, decrypts or re-encrypts specified columns in the database.

That seems to cover setting up Always Encrypted with PowerShell, removing it and getting information about it. When the new SSMS update drops you will be able to start using all of this new functionality.

Just remember Import-Module sqlserver

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets

[Image: trellocount]


We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and it already seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us

Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now – click here – and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL Set-up. Treating all of your servers to the same defaults makes it even easier to manage at scale remotely.

I am comfortable with using SMO to gather and change properties on SQL Instances so I started by doing this

It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
    $Scriptblock = {
        [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
        $srv = New-Object Microsoft.SqlServer.Management.Smo.Server '.'
        return $srv.BackupDirectory
    }
    $State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
    $State | Should Be 'F:\SQLBACKUP\BACKUPS'
}

This is how to find the properties that you want

## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')

## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv | Get-Member

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | Select-Object *
$srv.Databases[0] | Get-Member

I quickly found as I added more tests that it was taking a long time to perform the tests (about 5 seconds each test) and that it took an age to fail each of the tests if the server name was incorrect or the server was unavailable.

I fixed the first one by testing with a ping before running the tests

## Check for connectivity
if ((Test-Connection $Server -Count 1 -Quiet) -eq $false) {
    Write-Error "Could not connect to $Server"
    continue
}

The continue is there because I wanted to loop through an array of servers

I improved the performance using a remote session and a custom object

Describe "$Server" {
    BeforeAll {
        $Scriptblock = {
            [pscustomobject]$Return = @{}
            $srv = ''
            $SQLAdmins = $Using:SQLAdmins
            [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
            $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Using:Server ## pass the server name into the remote session
            $Return.DBAAdminDb = $srv.Databases.Name.Contains('DBA-Admin')
            $Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
            $Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.Count -eq $SQLAdmins.Count
            $SysAdmins = $srv.Roles['sysadmin'].EnumMemberNames()
            $Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.Count -eq $SQLAdmins.Count
            $Return.BackupDirectory = $srv.BackupDirectory
            $Return.DataDirectory = $srv.DefaultFile
The BeforeAll script block is run, as it sounds like it should, once before all of the tests, BeforeEach would run once before each of the tests. I define an empty custom object and then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end

            $Return.Alerts82345Exist = ($srv.JobServer.Alerts | Where-Object {$_.MessageID -eq 823 -or $_.MessageID -eq 824 -or $_.MessageID -eq 825}).Count
            ## parentheses are needed below so that the -and applies to all three message ids
            $Return.Alerts82345Enabled = ($srv.JobServer.Alerts | Where-Object {($_.MessageID -eq 823 -or $_.MessageID -eq 824 -or $_.MessageID -eq 825) -and $_.IsEnabled -eq $true}).Count
            $Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
            return $Return
        }
        try {
            $Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
        }
        catch {
            Write-Error "Unable to Connect to $Server"
            $Error
            continue
        }
I was then able to test against the property of the custom object

It 'Should have Alerts for Severity 20 and above' {
    $Return.Alerts20SeverityPlusExist | Should Be 6
}
It 'Severity 20 and above Alerts should be enabled' {
    $Return.Alerts20SeverityPlusEnabled | Should Be 6
}
It 'Should have alerts for 823,824 and 825' {
    $Return.Alerts82345Exist | Should Be 3
}
It 'Alerts for 823,824 and 825 should be enabled' {
    $Return.Alerts82345Enabled | Should Be 3
}

Occasionally, for reasons I haven't explored, I had to test against the value property of the returned object

It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
    $Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments or groups of servers with different default values, so I parameterised the test values as well. The logical next step was to turn it into a function so that I could use parameter splatting. This also gives me the opportunity to show all of the things that I currently pass as parameters to the test

$Parms = @{
    Servers = 'SQLServer1','SQLServer2','SQLServer3'
    SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards'
    BackupDirectory = 'C:\MSSQL\Backup'
    DataDirectory = 'C:\MSSQL\Data\'
    LogDirectory = 'C:\MSSQL\Logs\'
    MaxMemMb = '4096'
    Collation = 'Latin1_General_CI_AS'
    TempFiles = 4
    OlaSysFullFrequency = 'Daily'
    OlaSysFullStartTime = '21:00:00'
    OlaUserFullSchedule = 'Weekly'
    OlaUserFullFrequency = 1 ## 1 for Sunday
    OlaUserFullStartTime = '22:00:00'
    OlaUserDiffSchedule = 'Weekly'
    OlaUserDiffFrequency = 126 ## 126 for every day except Sunday
    OlaUserDiffStartTime = '22:00:00'
    OlaUserLogSubDayInterval = 15
    OlaUserLogInterval = 'Minute'
    HasSPBlitz = $true
    HasSPBlitzCache = $true
    HasSPBlitzIndex = $true
    HasSPAskBrent = $true
    HasSPBlitzTrace = $true
    HasSPWhoIsActive = $true
    LogWhoIsActiveToTable = $true
    LogSPBlitzToTable = $true
    LogSPBlitzToTableEnabled = $true
    LogSPBlitzToTableScheduled = $true
    LogSPBlitzToTableSchedule = 'Weekly'
    LogSPBlitzToTableFrequency = 2 ## 2 means Monday
    LogSPBlitzToTableStartTime = '03:00:00'
}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules, which you will have to modify to suit your own environment

To be able to run this you will need to have the Pester module. If you are using Windows 10 it is installed by default; if not:

Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here and also these videos from the conference
You can find the tests on GitHub here and I will continue to add to the defaults that I check.
This is not a replacement for other SQL configuration tools such as PBM, but it is a nice simple way of producing a report on the current status of a SQL installation, either at a particular point in time when something is wrong or after an installation, prior to passing the server over to another team or into service


DBA Database scripts are on Github

It started with a tweet from Dusty

[Image: tweets]

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_WhoIsActive, this fantastic script for logging the results from sp_WhoIsActive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList in my DBA database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here

[Image: github]

I have added the DBA Database project, the PowerShell scripts, the Agent Job creation scripts to call those scripts and everything else I use: some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome


Backing up to URL container name – case is important

If you use SQL Backup to URL to back up your databases to Azure blob storage, remember that case is important for the container name

So something like this, with the container name all in lower case (the storage account, container and credential names here are examples)
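
Backup-SqlDatabase -ServerInstance 'SQL2014Server1' -Database 'DBADatabase' -BackupFile 'https://beardstorage.blob.core.windows.net/sqlbackupcontainer/DBADatabase.bak' -SqlCredential 'AzureBackupCredential'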

will work but
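
Backup-SqlDatabase -ServerInstance 'SQL2014Server1' -Database 'DBADatabase' -BackupFile 'https://beardstorage.blob.core.windows.net/SQLBackupContainer/DBADatabase.bak' -SqlCredential 'AzureBackupCredential' ## same container name, different case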

will give a (400) Bad Request error, which may not be easy to diagnose

If you are using Ola Hallengren's jobs to perform your backup then the heart of your job step is a call to his DatabaseBackup procedure. Run from PowerShell it would look something like this (the URL, credential and other parameter values are examples, not the exact job step text):
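
$Query = @"
EXECUTE [dbo].[DatabaseBackup]
@Databases = 'USER_DATABASES',
@URL = 'https://beardstorage.blob.core.windows.net/sqlbackupcontainer',
@Credential = 'AzureBackupCredential',
@BackupType = 'LOG',
@ChangeBackupType = 'Y'
"@
Invoke-Sqlcmd -ServerInstance 'SQL2014Server1' -Database 'master' -Query $Query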

Note the @ChangeBackupType = 'Y' parameter, which is not set by default but I think is very useful. If you have just created a database and take log backups every 15 minutes but differential (or full) backups every night, the log backup will fail until a full backup has been taken. This parameter will check if a log backup is possible and, if not, take a full backup, meaning that you can still keep to your RTO/RPO requirements even for newly created databases

Power Bi, PowerShell and SQL Agent Jobs

Continuing my series on using Power Bi with my DBA Database, I am going to show in this post how I create the most useful daily report for DBAs – the SQL Agent Job report. You can get the scripts and reports here

Please note this project became dbareports.io

[Image: AG1]

This gives a quick overview of the status of the Agent Jobs across the estate and also quickly identifies recent failed jobs enabling the DBA to understand their focus and prioritise their morning efforts.

I gather the information into two tables: AgentJobDetail

and AgentJobServer

The Detail table holds the results of every Agent Job and the Server table holds a roll up for each server. The script to gather this information is based on the script I used to put the information into an Excel Sheet as described in my post How I Check Hundreds of Agent Jobs in 60 Seconds with PowerShell which I also altered to send an HTML email to the DBA team each morning. This however is a much better solution and allows for better monitoring and trending.

As I have explained in my previous posts I use an Instance List table to hold the information about each instance in the estate and a series of PowerShell scripts which run via Agent Jobs to gather the information into various tables. These posts describe the use of the Write-Log function and the methodology of gathering the required information and looping through each instance, so I won't repeat that here. There is an extra check I do, however, for Express Edition, as this does not include the Agent service. A minimal version of that check looks like this ($srv is the SMO server object; the Write-Log call and its parameters are assumptions):
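
if ($srv.Edition -like '*Express*') {
    Write-Log -Path $LogFile -Message "$Connection is Express Edition - no Agent service" ## Write-Log parameters assumed
    continue
}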

The Agent Job information can be found in SMO by exploring the $srv.JobServer.Jobs object, and I gather the information by iterating through each job and setting the values we require to variables. In outline it looks like this (the property list in the real script is longer):
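
foreach ($Job in $srv.JobServer.Jobs) {
    $JobName        = $Job.Name
    $Description    = $Job.Description
    $Category       = $Job.Category
    $Enabled        = $Job.IsEnabled
    $LastRunTime    = $Job.LastRunDate
    $LastRunOutcome = $Job.LastRunOutcome
}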

I found that some jobs had names and descriptions with ' in them, which would cause the SQL update or insert statement to fail, so I use the Replace method to replace the ' with '', like this:
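
$JobName = $Job.Name.Replace("'","''")
$Description = $Job.Description.Replace("'","''")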

I then insert the data per job after checking that it does not already exist, which allows me to re-run the job, should a number of servers be uncontactable at the time of the job running, without any additional work

I put this in a here-string variable and pass it to Invoke-Sqlcmd, in shape something like this (the table and column names below are illustrative rather than the exact schema). I do the same with the roll-up for the server table using a similar query
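
$Query = @"
IF NOT EXISTS (SELECT 1 FROM [Info].[AgentJobDetail]
WHERE JobName = '$JobName' AND LastRunTime = '$LastRunTime')
INSERT INTO [Info].[AgentJobDetail]
(JobName, Category, Enabled, LastRunTime, LastRunOutcome)
VALUES ('$JobName','$Category','$Enabled','$LastRunTime','$LastRunOutcome')
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query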

This job runs as a SQL Agent Job every morning, half an hour or so before the DBA arrives for the morning shift, vastly improving the ability of the DBA to prioritise their morning routine.

To create the report open Power Bi Desktop and click Get Data

[Image: ag2]

Then choose SQL Server and click connect

[Image: ag3]

Enter the connection string, the database and the query to gather the data

[Image: ag5]

The query is

Once we have gathered the data we then create some extra columns and measures for the reports. First I create a date column from the datetime Date Column

I also do the same for the LastRunTime. I create a day of the week column so that I can report on job outcomes by day

My friend Terry McCann b | t helped me create a column that returns true if the last run time is within 24 hours of the current time, to help identify the recent jobs that have failed. NOTE – on a Monday morning you will need to change this if you do not check your jobs on the weekend.

I create a measure for Succeeded, Failed and Unknown

Next we have to create some measures for the sum of failed jobs and the averages. This is the code for the 7 day sum

and for the 7 Day average

I did the same for 30 days. I used the TechNet reference for DAX expressions and got ideas from Chris Webb's blog

[Image: ag6]
First I created the 30 day historical trend chart using a Line and Clustered Column chart, with the last run date as the axis, the Succeeded measure as the column, and the Failed, Failed 7 Day Average and Failed 30 Day Average measures as the lines

I then formatted the lines and title and column

[Image: ag7]

To create the gauge, which shows how well we have done today, I created a measure to quickly identify today's jobs

which I use as a filter for the gauge as shown below. I also create two measures, zero and twenty, for the minimum and maximum of the gauge

[Image: ag8]

The rest of the report is measures for the 7 day average and 30 day average, a slicer for environment and two tables, one to show the historical job counts and one to show the jobs that have failed in the last 24 hours using the Last Run Relative Hour measure from above

[Image: ag9]

There are many other reports that you can or may want to create maybe by day of the week or by category depending on your needs. Once you have the data gathered you are free to play with the data as you see fit. Please add any further examples of reports you can run or would like to run in the comments below.

Once you have your report written you can publish it to PowerBi.com and create a dashboard and query it with natural language. I have explained the process in previous posts

For example – How many Jobs failed today

[Image: ag110]

Which server had the most failed jobs

[Image: ag11]

or, using the category field, which database maintenance jobs failed today

[Image: ag13]

I hope these posts have given you ideas about how you can use PowerShell, a DBA Database and Power Bi to help you to manage and report on your environment.

You can get the scripts and reports here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Populating My DBA Database for Power Bi with PowerShell – Databases

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

[Image: db1]

[Image: db2]

There are so many items of data that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source of SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell job step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report, publish it to powerbi.com and query it with natural language

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible. I won't include the try catch in the examples below. A sketch of such a function (the exact logging in my script differs):
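
function Catch-Block {
    param ([string]$Action)
    $Message = $Action + ' ' + $Error[0].Exception
    Write-Log -Path $LogFile -Message $Message -Level Error ## Write-Log as above; parameters assumed
    Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EntryType Error -EventId 1 -Message $Message
}

I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port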

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, as the code above creates an empty SMO Server object with the Name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property, you could choose any one, but that's the one I have always used. I use an if statement to do this. (This post about snippets will show you the best way to learn PowerShell code.) The reference I use for exiting a loop in the way that you want is this one; in this case we use continue to carry on iterating the loop

if (!($srv.Version)) {
    Catch-Block "Failed to Connect to $Connection"
    continue
}

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases just remove everything after the pipe symbol. Or, if you wish to exclude certain databases from the collection, maybe the database you keep your change log table and DBA team info in, you can do that here as well

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a $db and then to select all of the properties so that you can see the values and identify the ones you want. Again this a good thing to do when exploring new objects

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screenshot below that there are 170 properties available to you on a SQL 2014 instance. You can gather any or all of that information, as long as you ensure that you have columns with the correct data types in your table and that your script has the logic to deal with properties that do not exist, although I have had fewer issues with this for the database object than the server object

[Image: db3]

You can look for the property that you want by using the Get-Member cmdlet as shown above, or use MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing “PowerShell SMO” and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post by checking the SQL Info table for an entry for that instance and updating the table if it exists and inserting if it does not.

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table as this (in general) needs a human bean to understand. When the script runs every night it will pick up new databases and I add a default value of “Not Entered” to the table which makes it easier to identify the databases that need this additional work. (This also means that as a Team Leader I can monitor that my team are doing this) It can also be added to any scripts which create new databases for deployment.

Then we need to create some measures and calculated columns for our report. I did this as I realised that I needed it when making the report rather than all up front.

I created two calculated columns for size for the databases one for Gb and one for Tb by clicking on the data icon on the left and then new measure

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I create a donut chart to show the size of the database in Gb by client (and no, my real clients are not rock bands 🙂 ) as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

[Image: db4]
The other donut chart is number of clients per location (and those are SQL User group locations in the UK and my hometown Bolton)

[Image: db5]

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

[Image: db6]

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan I created a colourful slicer for environment and the client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases then this report is correct but I doubt that’s what you want 🙂

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

[Image: db7]

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69Gb of data in Exeter and it isn’t mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows the data is filtered but it is not shown visually very well as there are over a thousand databases using full recovery model.

[Image: db8]

If we are asked about the Integration environment we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 Gb of data. It is also easier to see the compatibility level and recovery model as the ratios are larger

[Image: db9]

Once we have created the report in the way that we want, we can then publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the publish button and entering your powerbi credentials, but if you want your data to automatically refresh (and this is the point of the exercise, to remove manual work) then you will need to install and configure the PowerBi gateway and schedule a refresh; I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation and a pop up will ask you which dashboard you wish to pin it to (Another recent update to Power Bi)

[Image: db10]

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect but it is getting better all the time

[Image: db11]

You have to remember to use the names of the columns correctly

[Image: db12]

But once you have the query correct you can alter it by adding “as a VISUALISATION” and choosing the visualisation

[Image: db13]

[Image: db14]

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

[Image: db15]

This doesn’t work quite as you expect

[Image: db16]

But this does

[Image: db17]

How about this (and yes it felt wrong to type!)

[Image: db18]

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

[Image: db19]

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

[Image: db20]

Or the DBA team manager might choose this one

[Image: db21]

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee 🙂

As always if you have any questions or comments please feel free to post them on the blog.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Using Power Bi with my DBA Database

Every good DBA should have a DBA database. A place to store information about all of their instances and databases.

I have an InstanceList table which looks like this

CREATE TABLE [dbo].[InstanceList](
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[ServerName] [nvarchar](50) NOT NULL,
[InstanceName] [nvarchar](50) NOT NULL,
[Port] [int] NOT NULL,
[AG] [bit] NULL,
[Inactive] [bit] NULL CONSTRAINT [DF_InstanceList_Inactive] DEFAULT ((0)),
[Environment] [nvarchar](25) NULL,
[Location] [nvarchar](30) NULL,
CONSTRAINT [PK_InstanceList_ID] PRIMARY KEY CLUSTERED
(
[InstanceID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

I use this as the basis for all of my information gathering. By adding Server name, Instance Name, Port, Environment and Location to the table I use overnight Agent jobs to run PowerShell scripts to gather information about all of the instances. This way the information is dynamic and gathered from the server, so when we add RAM and change Max memory this is updated the next time the script runs. You can also automate your installation and decommission procedures (using PowerShell scripts) to add the information to the DBA database automatically

I have 4 scripts

  • ServerInfo which gathers Windows OS information such as version and edition of the operating system, number of processors, amount of RAM, IP address, domain name etc
  • SQLInfo which gathers information about the instance such as SQL version, edition, collation, max and min memory, MAXDOP , service accounts and start modes, default file locations etc
  • Database information such as size, data usage, index usage, last backup dates, owner and many more
  • Agent Job which gathers the outcome of the jobs that have run, their names and category, into two tables, one for a server rollup and one for details about each job

Recently I have received a lot of requests for information from various sources: auditors asking about encryption and backup policies, project managers asking about database and SQL versions, compliance asking about numbers of Windows 2003 servers or SQL 2005 servers, system teams asking which servers in a particular location can be turned off at which time, dependent on which system they are supporting for a power down

Before we had the DBA database holding all of the information about the instances we would have struggled to compile this information, and when I saw Power Bi was released to GA I thought that it would be a good place to start to learn about it. By using data that I understood and answering questions where I knew the format of the answer, I could be more confident about experimenting – i.e. if I know I have 100 servers then any result for servers that exceeds that is incorrect

I have never been a BI guy, I claim no expertise in the correct methods of manipulating the data. There may very well be better methods of achieving these results and if there are, please feel free to comment below so that I can improve my knowledge and keep on learning

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I have no servers in Bolton, it is where I am from originally!!

I downloaded Power BI Desktop from powerbi.com and ran the installer and the first screen you see is this one

[Image: 1]

I then clicked on Get Data

[Image: 2]

And then SQL Server and filled in the details for my DBA Database and clicked connect

[Image: 3]

I used my current Windows credentials

[Image: 4]

It then asked me which tables I wanted to load so I said all of them 🙂

[Image: 5]

Once I had loaded the data I looked at the queries and renamed some of the columns to make more sense to me. I also created some calculated columns by clicking New Column

I created a relative date column using this code from Chris Webb http://blog.crossjoin.co.uk/2013/01/24/building-relative-date-reports-in-powerpivot/

Relative Date Offset = INT([Date] - TODAY())
Relative Date = IF([Relative Date Offset] = 0
, "Today"
, "Today " & IF([Relative Date Offset] > 0, "+", "") & [Relative Date Offset])

This will enable me to show data for the last day

I also did the same for days of the week

DayOfWeek = CONCATENATE(WEEKDAY('Info AgentJobDetail'[LastRunTime],2),FORMAT('Info AgentJobDetail'[LastRunTime]," -dddd"))

Because I struggled to show the information about the Operating system I also created two columns for OS name and OS edition by adding columns as shown below

Operating System Version = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And

Operating System Edition = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

Then I started to play with the data.

This is probably not how a professional would phrase it but I would say that if you don’t know how to use a new application be brave and give it a try.

OBVIOUSLY you are a PROFESSIONAL DBA and will not do anything that would endanger production, use a backup of your database and work locally if you need to.

The first thing I wanted to know was how many servers I had by operating system, how many by SQL version and the location of them so that I could answer the questions I had been asked. I had already written a query to get the correct information to give to the requestors so I knew the correct answers which was also an advantage. I did this like this

I expanded the Info ServerOSInfo query and dragged the ServerName field to the report which created a table of names

[Image: 6]

I then changed the ServerName values to Count

[Image: 7]

I then dragged the calculated column Operating System Version to the table

[Image: 8]

If I click on the table and then donut chart in the visualisations it changes to

[Image: 9]

So you can quickly see how you want the data displayed

I then decided to look at the number of SQL 2005 instances that I had and as I had relationships between SQLInfo and Instancelist and Clients I could build a more dynamic report.

I created a donut chart with SQLVersion as the legend and InstanceID as the values and a table of SQLVersion, ServerName and Instance Name. I also created a card that was count of InstanceID

[Image: 10]

Now it starts getting really useful. If I want to know how many SQL 2005 instances I have I simply click on SQL2005 in the donut chart and the rest of the report changes

[Image: 11]

This is very cool and I hope you can see how useful this could be and how brilliant it would be to enable relevant people within the organisation the ability to look at that report and answer their own questions.

Let's take it to the next step. I have a Location column in the InstanceList table which comprises town names. If I choose a map and drag that column to the Location field and set Values and Color Saturation to the count of InstanceID

[Image: 12]

and create two tables, one of client with a count of InstanceID and one of location with a count of InstanceID, I can do this

[Image: 13]

Look at how it dynamically changes as you click on the data labels – This is very cool and makes me smile every time!! I altered the colour saturation colours to make it easier to see. Now if I am asked about SQL 2005 servers I can quickly click on SQL 2005 and

[Image: 14]

I can see that there are 32 instances, most are in Southampton, and which clients they support

If I click a location rather than SQL version the report alters like so

[Image: 15]

So you can simply pass the report file to your colleagues to enable them to use it, or you can publish it to Powerbi.com. I am not going to go into any detail about the costs or licensing etc; I will just say it is as easy as clicking publish. If you wish to have the information automatically refreshed there are some more steps that you would need to go through, detailed here, which enable you to connect your on-premise database to Powerbi using the data management gateway. Alternatively you can simply refresh the data in the report and then publish it and replace the existing report.

Once the report is in powerbi.com you can enable Q and A on the data. This is some kind of supernatural mystical magical query language which enables you to query your data with natural language and will alter the results as you type and even cope with (deliberate for screenshot) spelling mistakes 🙂

[Image: 16]

I also created a report for my Agent Jobs to enable me to quickly and easily see which Jobs have failed in the last day

[Image: 17]

I did this by filtering the report by Relative Date Offset greater than -1 (today) and isenabled = True and Outcome = Failed

There are many many more ways I can see this being useful and I hope I have given you some ideas and encouraged you to try for yourself and find out more

I have written further posts about this

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Scheduling Ola Hallengren's Maintenance Solution Default Jobs with PowerShell

If you are a SQL Server DBA you should know about Ola Hallengren and will probably have investigated his Maintenance Solution.

If you haven’t please start here https://ola.hallengren.com/

You can also watch his presentation at SQLBits at this link

http://sqlbits.com/Sessions/Event9/Inside_Ola_Hallengrens_Maintenance_Solution

where he talks about and demonstrates the solution.

It is possible to just run his script to install the solution and schedule the jobs and know that you have made a good start in keeping your databases safe. You should be more proactive than that and set up specific jobs for your own special requirements, but you can and should find that information in other places, including the FAQ on Ola's site

I particularly like the parameter @ChangeBackupType which when running the transaction log or differential backup will change the backup type to full if the backup type cannot be taken. This is excellent for picking up new databases and backing them up soon after creation

When you run the script the jobs are created but not scheduled, and it is for this reason I created this function. All it does is schedule the jobs so that I know that they will be run when a new server is created and all the databases will be backed up. I can then go back at a later date and schedule them correctly for the server's workload, or tweak them according to specific needs, but this allows me that fuzzy feeling of knowing that the backups and other maintenance will be performed.

To accomplish this I pass a single parameter, $Server, to the function. This is the connection string and should be in the format SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port

I then create a $srv SMO object as usual

$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server

Create a JobServer object and a Jobs array which holds the Jobs

$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs

And set the schedule for each job. I pick each Job using the Where-Object Cmdlet and break out if the job does not exist

$Job = $Jobs | Where-Object {$_.Name -eq 'DatabaseBackup - SYSTEM_DATABASES - FULL'}
if ($Job -eq $null) {
    Write-Output "No Job with that name"
    break
}

Then I create a Schedule object and set its properties and create the schedule

$Schedule = new-object Microsoft.SqlServer.Management.Smo.Agent.JobSchedule ($job, 'Daily - Midnight ++ Not Sunday')
$Schedule.ActiveEndDate = Get-Date -Month 12 -Day 31 -Year 9999
$Schedule.ActiveEndTimeOfDay = '23:59:59'
$Schedule.FrequencyTypes = "Weekly"
$Schedule.FrequencyRecurrenceFactor = 1
$Schedule.FrequencySubDayTypes = "Once"
$Schedule.FrequencyInterval = 126 # Weekdays 62 + Saturdays 64 - https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.agent.jobschedule.frequencyinterval.aspx
$Schedule.ActiveStartDate = get-date
$schedule.ActiveStartTimeOfDay = '00:16:00'
$Schedule.IsEnabled = $true
$Schedule.Create()

I have picked this example for the blog as it shows some of the less obvious gotchas. Setting the active end date could only be achieved by using the Get-Date cmdlet and defining the date. The schedule frequency interval above is for every day except Sundays. This is achieved by using the following table from MSDN, which is always my first port of call when writing these scripts

WeekDays.Sunday = 1
WeekDays.Monday = 2
WeekDays.Tuesday = 4
WeekDays.Wednesday = 8
WeekDays.Thursday = 16
WeekDays.Friday = 32
WeekDays.Saturday = 64
WeekDays.WeekDays = 62
WeekDays.WeekEnds = 65
WeekDays.EveryDay = 127

Combine values using an OR logical operator to set more than a single day. For example, combine WeekDays.Monday and WeekDays.Friday (FrequencyInterval = 2 + 32 = 34) to schedule an activity for Monday and Friday.

It is easy using this to set up whichever schedule you wish by combining the numbers. I would advise commenting it in the script so that your future self or following DBAs can understand what is happening.

You can tweak this script or use the code to work with any Agent Jobs and set the schedules accordingly and you can check that you have set the schedules correctly with this code

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs
$ActiveStartTimeOfDay = @{Name = "ActiveStartTimeOfDay"; Expression = {$_.JobSchedules.ActiveStartTimeOfDay}}
$FrequencyInterval = @{Name = "FrequencyInterval"; Expression = {$_.JobSchedules.FrequencyInterval}}
$FrequencyTypes = @{Name = "FrequencyTypes"; Expression = {$_.JobSchedules.FrequencyTypes}}
$IsEnabled = @{Name = "IsEnabled"; Expression = {$_.JobSchedules.IsEnabled}}
$Jobs | Where-Object {$_.Category -eq 'Database Maintenance'} | Select-Object Name, $IsEnabled, $FrequencyTypes, $FrequencyInterval, $ActiveStartTimeOfDay | Format-Table -AutoSize

You can get the script from Script Center via the link below or by searching for “Ola” using the script browser add-in straight from ISE

[Image: browser]

https://gallery.technet.microsoft.com/scriptcenter/Schedule-Ola-Hallengrens-a66a3c89