Quickly Creating Test Users in SQL Server with PowerShell using the sqlserver module and dbatools

One of the most visited posts on my blog is nearly two and a half years old now – Add User to SQL Server Database Role with PowerShell and Quickly Creating Test Users. I thought it was time to update it and use the latest sqlserver module and the dbatools module.

You can get the latest version of the sqlserver module by installing SSMS 2016. The PASS PowerShell Virtual Chapter have created a short link to make this easier for you to remember: https://sqlps.io/dl

Once you have downloaded and installed SSMS you can load the module.

Import-Module sqlserver

There is one situation where you will get an error loading the sqlserver module into PowerShell. If you have the SQLPS module already imported then you will get the following error:

Import-Module : The following error occurred while loading the extended type data file:

sqlserver-module-error

In that case you will need to remove the SQLPS module first.

Remove-Module sqlps
Import-Module sqlserver

The original post dealt with creating a number of test users for a database and assigning them to different roles quickly and easily.

First let’s quickly create a list of Service Users and a list of Admins and save them in text files.

$i = 0
while($i -lt 100)
{
"Beard_Service_User$i" | Out-File 'C:\temp\Users.txt' -Append
$i++
}

$i = 0
while($i -lt 10)
{
"Beard_Service_Admin_$i" | Out-File 'C:\temp\Admins.txt' -Append
$i++
}

Now that we have those users in files we can assign them to a variable by using Get-Content:

$Admins = Get-Content 'C:\temp\Admins.txt'

Of course we can use any source for our users – a database, an Excel file, Active Directory or even just type them in.

We can use the Add-SQLLogin command from the sqlserver module to add our users as SQL Logins, but at present we cannot add them as database users and assign them to a role.

If we want to add a Windows Group or a Windows User to our SQL Server we can do so using:

Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType WindowsUser -DefaultDatabase tempdb -Enable -GrantConnectSql 

Notice that we need to enable and grant connect SQL to the user.

If we want to add a SQL login the code is pretty much the same, but we either have to enter the password in an authentication box or pass in a PSCredential object holding the username and password. Keeping credentials secure in PowerShell scripts is outside the scope of this post, and the requirement here is for non-live environments, so we will pass the same password for all users into the script as a string. You may want, or be required, to achieve this in a different fashion.

 $Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential

We can ensure that we are not trying to add logins that already exist using

 if(!($srv.Logins.Contains($User)))
{

The $srv is a SQL Server Management Objects (SMO) Server object which you can create using a snippet. I blogged about snippets here and you can find my list of snippets on github here. However, today I am going to use the dbatools module to create an SMO Server object using the Connect-DbaSqlServer command and assign the server and the database to variables:

 # Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

Once we have our Logins we need to create our database users:

 $usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

and add them to a database role.

#Add User to the Role
$db.roles[$role].AddMember($User)

I created a little function to call in the script and then simply loop through our users and admins and call the function.

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole  -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole  -LoginType SQLLogin
}

To check that they have been added correctly I simply use the Get-DbaRoleMember command from dbatools and output it to Out-GridView using the alias ogv as I am on the command line:

Get-DbaRoleMember -SqlInstance $server |ogv

which looks like this:

 

get-dbarole-memebr

When we need to clean up the logins and users we can use the Get-SqlLogin and Remove-SqlLogin commands from the sqlserver module to remove the logins, and if we do that first we can then use the dbatools command Remove-SQLOrphanUser to remove the orphaned users 🙂 (I thought that was rather cunning!)

(Get-SqlLogin -ServerInstance $server).Where{$_.Name -like '*Beard_Service_*'}|Remove-SqlLogin

Remove-SQLOrphanUser -SqlServer $Server -databases $database

Remove-SqlLogin will prompt for confirmation, and the result of Remove-SQLOrphanUser looks like this:

remove-them-all

When you are looking at doing this type of automation with PowerShell, you should always remember to make use of Get-Command, Get-Help and Get-Member. That will enable you to work out how to do an awful lot. I have a short video on YouTube about this:

 

and when you get stuck come and ask in the SQL Server Slack at https://sqlps.io/slack. You will find a powershellhelp channel in there.

Here is the complete code:

#Requires -module sqlserver
#Requires -module dbatools

### Define some variables
$server = ''
$Password = "Password"
$Database = 'TheBeardsDatabase'
$Admins = Get-Content 'C:\temp\Admins.txt'
$Users = Get-Content 'C:\temp\Users.txt'
$LoginType = 'SQLLogin'
$userrole =  'Users'
$adminrole = 'Admin'

# Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

function Add-UserToRole
{
param
(
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Password,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$User,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Server,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Role,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateSet("SQLLogin", "WindowsGroup", "WindowsUser")]
[string]$LoginType
)

if(!($srv.Logins.Contains($User)))
{
if($LoginType -eq 'SQLLogin')
{
$Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential
}
elseif($LoginType -eq 'WindowsGroup' -or $LoginType -eq 'WindowsUser')
{
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql
}
}
if (!($db.Users.Contains($User)))
{

# Add user to database

$usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

}
#Add User to the Role
$db.roles[$role].AddMember($User)
}

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole -LoginType SQLLogin
}

Get-DbaRoleMember -SqlInstance $server |ogv

Happy Automating!

 

 

 


SQL VNext sp_configure on Windows and Linux with dbatools

This weekend I set up some SQL vNext virtual machines, two on Windows and one on Linux so that I could test some scenarios and build an availability group.

IMPORTANT NOTE: The names of dbatools commands with a Sql prefix WILL CHANGE in a later release of dbatools. dbatools will use Dba throughout in the future, as the sqlserver PowerShell module uses the Sql prefix.

I used PowerShell version 5.1.14393.693 and SQL Server vNext CTP 1.3 running on Windows Server 2016 and Ubuntu 16.04 in this blog post.

I set up one Windows box with some changes to the default configuration; these are for my lab only.

Min Server Memory to 2GB
Max Server Memory to 6GB
Remote Access to 1
Backup Compression Default to 1
Optimize for Ad Hoc Workloads to 1

I can do this in PowerShell by creating a SQL SMO Server object using Connect-DbaSqlServer from dbatools, then altering the Configuration Properties values and calling the Alter() method:

$WinSQl1 = 'SQLvNextN1'
$win1 = Connect-DbaSqlServer -SqlServer $WinSQl1
$win1.Configuration.Properties['DefaultBackupCompression'].ConfigValue = 1
$win1.Configuration.Properties['MinServerMemory'].ConfigValue = 2048
$win1.Configuration.Properties['MaxServerMemory'].ConfigValue = 6144
$win1.Configuration.Properties['RemoteAccess'].ConfigValue = 1
$win1.Configuration.Properties['OptimizeAdhocWorkloads'].ConfigValue = 1
$win1.Configuration.Alter()

I can see the change by using Get-DbaSpConfigure from the popular PowerShell module dbatools. I like to output to Out-GridView (alias ogv at the CLI):

$WinSQl1 = 'SQLvNextN1'
Get-DbaSpConfigure -SqlServer $WinSQl1 | ogv

1 SPConfig ogv.PNG

Out-GridView can filter in the top bar and it works very quickly. It is a very useful tool.

2 - ogv filter.gif

You can also rearrange the columns and use the Add Criteria button to filter your results

3-spconfig-ogv

You can use Export-SQLSpConfigure to export the configuration to a .sql file which is useful for DR or documentation purposes. There is also a corresponding Import-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $winConfigPath
notepad $winConfigPath

4 - Export-SQLSpConfigure.gif

Chrissy LeMaire b | t showed a neat trick when we were in Utrecht to compare the configuration of two servers

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$propcompare = foreach ($prop in $Win1SPConfigure) {
[pscustomobject]@{
Config = $prop.DisplayName
'Windows Node1 setting' = $prop.RunningValue
'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
}
}
$propcompare | ogv

So when we compare the two Windows SQL nodes we can see that at this point there are differences in the configuration:
5-compare-configures

 

Now we can copy the configuration from SQLvNextN1 to SQLvNextN2 using the Copy-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $WinSQl2

6 - Copy Configurations.gif

and we can see that those configurations are now the same for those two servers by comparing them like before.

7 - compare configurations.PNG

We can use Get-DbaSpConfigure with Linux servers as well but we need to use SQL authentication like so (No, I don’t know why I used two v’s!)

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$linuxSpConfigure | ogv

and compare the three servers like so

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$propcompare = foreach ($prop in $Win1SPConfigure) {
[pscustomobject]@{
Config = $prop.DisplayName
'Windows Node1 setting' = $prop.RunningValue
'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
'Linux Setting' = $linuxSpConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
}
}
$propcompare | ogv

and see the differences for the Linux Server

8 - Compare Linux Configs.PNG

We can export the Linux configuration using Export-SqlSpConfigure.

9 - Export Linux Config.PNG

So now let's copy the configuration changes from the Windows server to the Linux server.

10 - Copy Configuration to Linux.gif

You will notice that there was some red text. This is because there are unsupported features in Linux, as you can see in the SQL Server on Linux Release Notes. The errors from the command are:

Database Mail XPs to 0. Feature may not be supported.
SMO and DMO XPs to 1. Feature may not be supported.
Ole Automation Procedures to 0. Feature may not be supported.
xp_cmdshell to 0. Feature may not be supported.
Ad Hoc Distributed Queries to 0. Feature may not be supported.
Replication XPs to 0. Feature may not be supported.
contained database authentication to 0. Feature may not be supported.
hadoop connectivity to 0. Feature may not be supported.
polybase network encryption to 1. Feature may not be supported.
remote data archive to 0. Feature may not be supported.
allow polybase export to 0. Feature may not be supported.

But if we compare the configurations again, we can see that the settings we wanted have been altered successfully

11 - Compare Linux Configs again.PNG

Let's reset the configuration on the Linux server using Import-SqlSpConfigure and the Linuxconfig-backup.sql file created before any changes.

12 - Import backup Linux configuration.gif

Interestingly this time we only had an error for Database Mail XPs, but as you can see below the Linux Configuration has been reset back to the original values.

13-configuration-comapre-back-to-original

Let's export the configuration from the Windows server to a file and import it onto the Linux server to see what happens.

14 - Import Windows configuration to Linux.gif

Again only an error for Database Mail XPs, and we can see the configurations are again matching.

15 - Final Compare.PNG

Hopefully, this post has been of use in exploring the *SpConfigure* commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export and import it to and from files, and copy it between servers, for both Windows and Linux. Remember these commands have been tested from SQL 2000 to SQL vNext, so they should work for you across your estate.

 

Happy Automating!

 

Adding a T-SQL Job Step to a SQL Agent Job with PowerShell

In my last post, I explained how to alter an existing job step across many servers. I also had cause to add a T-SQL job step to a large number of jobs. This is how I did it.
As before, I gathered the required jobs using the Get-SQLAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl.

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the Where method.
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file.)

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Then I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property

job-step-properties

We need to set the name, the parent (the job), the database, the command, the subsystem, the on-fail action, the on-success action and the ID for the job step.
I set the command to a variable to make the code easier to read:

$Command = "SELECT Name from sys.databases"

The rest of the properties I fill in inside the loop. To find out what the properties can hold I look at MSDN for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step starting at 1, so this example will add a new job step that will be the first to run.

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties, all we need to do is create it and alter the job:

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
$Command = "Select name from sys.databases"
foreach($Job in $Jobs)
{
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1
$NewStep.create()
$Job.Alter()
}

Hopefully this will help you if you need to add a T-SQL job step to a large number of servers.
Happy Automating!

Altering a Job Step on Hundreds of SQL Servers with PowerShell

I flew to Utrecht last week to present with Chrissy LeMaire and Sander Stad to the joint Dutch SQL and PowerShell User Groups. Whilst I was sat at the airport I got a phone call from my current client. “We need to change the backup path for all of the servers to a different share, how long will it take you?”

About 5 minutes  (PowerShell is very powerful – be careful when following these examples 😉 )

We will use the sqlserver module, so you will need to have installed the latest version of SSMS from https://sqlps.io/dl

This code was run using PowerShell version 5 and will not work on PowerShell version 3 or lower as it uses the Where method.

Let's grab all of our jobs on the estate. (You will need to fill the $Servers variable with the names of your instances, maybe from a database or CMS or a text file.)

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Once we have the jobs we need to iterate only through the ones we need to. This step could also have been done in the line above. Let's assume we are using the Ola Hallengren Solution to back up our estate.

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})

Then, because I have to target a specific job step, I can iterate through those and filter in the same way.

foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})

Now all I need to do is to replace C:\Backup with C:\MSSQL\Backup (in this example I am using my lab's backup paths).

$Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")

And then call the Alter method

$Step.Alter()

And that is all there is to it. Here is the full script I used

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})
{
foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})
{
$Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")
$Step.Alter()
}
}

In only a few minutes I had altered several hundred instances worth of Ola Hallengren Jobs 🙂

This is one of the many reasons I love PowerShell; it enables me to perform mass changes very quickly and easily. Of course, you need to make sure that you know that what you are changing is what you want to change. I have caused severe issues by altering the SQL alerts frequency to 1 second instead of one hour on an estate!! Although the beauty of PowerShell meant that I was able to change it back very quickly once the problem was realised.
You can change a lot of settings. If you look at what is available at a job step level:
job-step-properties
Happy Automating!

PowerShell, Pester and Ola Hallengren's Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren's Maintenance Solution. If you haven't, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great guide on how to get started here, which will show you how to get Pester and how to get started with TDD.
This isn't TDD though; this is environment validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution, and made a list of the things that I would check, which looked something like this. This would be the checklist you would create (or have already created) for yourself or a junior following this install. This is how easily you can turn that checklist into a Pester test, remove the human element and open your install to automated testing:
  • SQL Server Agent is running – Otherwise the jobs won’t run 🙂
  • We should have 4 backup jobs with a name of
  • DatabaseBackup – SYSTEM_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – DIFF
  • DatabaseBackup – USER_DATABASES – LOG
  • We should have Integrity Check and Index Optimisation Jobs
  • We should have the clean up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I was being more careful I would have checked if there were the correct folders in the folder I was backing up to.

Ola's script uses a default naming convention, so this makes it easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder, or if there is an Availability Group a CLUSTERNAME$AGNAME folder, and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist.

So now we have our checklist, we just need to turn it into a Pester environment validation script.

It would be useful to be able to pass in a number of instances, so we will start with a foreach loop and then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name:

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
BeforeAll {$Jobs = Get-SqlAgentJob -ServerInstance $Server
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$dbs = $Srv.Databases.Where{$_.status -eq 'Normal'}.name
if($InstanceName)
{
$DisplayName = "SQL Server Agent ($InstanceName)"
$Folder = $ServerName + '$' + $InstanceName
}
else
{
$DisplayName = 'SQL Server Agent (MSSQLSERVER)'
$Folder = $ServerName
}
}
if($CheckForBackups -eq $true)
{
$CheckForDBFolders = $true
}
$Root = $Share + '\' + $Folder

I also set the Agent service display name so I can get its status. I split the jobs up using a Context block, one each for Backups, Database maintenance and Solution clean-up, but they all follow the same pattern. First get the jobs:

$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let's test the Agent service. You do this with an It block and in it put a single test like this:

actual-value | Should Be expected-value
So to check that the Agent service is running we can do this:

(Get-service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test I check using Get-Member. To see what is available for a job, I gathered the Agent Jobs into a variable using the Get-SQLAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and the values using Select-Object:

$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this:

$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach($job in $Jobs)
{
$JobName = $Job.Name
It "$JobName Job Exists" {
$Job | Should Not BeNullOrEmpty
}
It "$JobName Job is enabled" {
$job.IsEnabled | Should Be 'True'
}
It "$JobName Job has schedule" {
$Job.HasSchedule | Should Be 'True'
}
if($DontCheckJobOutcome -eq $false)
{
It "$JobName Job succeeded" {
$Job.LastRunOutCome | Should Be 'Succeeded'
}
}
}
So I have checked the agent and the jobs, and now I want to check that the folders exist; first for the instance, using Test-Path, so the user running the PowerShell session must have privileges and access to list the files and folders:

Context "$Share Share For $Server" {
It "Should have the root folder $Root" {
Test-Path $Root | Should Be $true
}
Then for every database we need to set some variables for the folder path. We don't back up tempdb so we ignore that, and then check if the server is SQL2012 or above; if it is, we check if the database is a member of an availability group and set the folder name appropriately:

  foreach($db in $dbs.Where{$_ -ne 'tempdb'})
{

if($Srv.VersionMajor -ge 11)
{
If($srv.Databases[$db].AvailabilityGroupName)
{
$AG = $srv.Databases[$db].AvailabilityGroupName
$Cluster = $srv.ClusterName
$OLAAg = $Cluster + '$' + $AG
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $OlaAG
}
else
{
$Root = $Share + '\' + $OLAAg
}
}
else
{
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
}
else
{
$Root = $Share + '\' + $Folder
}
}
}
$db = $db.Replace(' ','')
$Dbfolder = $Root + "\$db"
$Full = $Dbfolder + '\FULL'
$Diff = $Dbfolder + '\DIFF'
$Log  = $Dbfolder + '\LOG'
If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
} 
But we need some logic for checking for folders, because Ola is smart and checks for Log Shipping databases so as not to break the LSN chain, system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares:

If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
}
if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
It 'Has a Log Folder' {
[System.IO.Directory]::Exists($Log) | Should Be $True
}
} #
elseif((($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and ($Db -notin ('master','msdb','model'))) -or ( $LSDatabases -contains $db) )
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
} #
else
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
}#
} # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad's PowerShell Conference session Chasing the Seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with:

If($CheckForBackups -eq $true)
{
Context "File Check For $db on $Server on $Share" {
$Fullcreate = [System.IO.Directory]::GetCreationTime($Full)
$FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
if($Fullcreate -eq $FullWrite)
{
It "Has Files in the FULL folder for $db" {
Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the FULL folder for $db" {
$FullCreate | Should BeLessThan $FullWrite
}
}
It 'Full File Folder was written to within the last 7 days' {
$Fullwrite |Should BeGreaterThan (Get-Date).AddDays(-7)
}
if($Db -notin ('master','msdb','model'))
{
$Diffcreate = [System.IO.Directory]::GetCreationTime($Diff)
$DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
if($Diffcreate -eq $DiffWrite)
{
It "Has Files in the DIFF folder for $db" {
Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the DIFF folder for $db" {
$DiffCreate | Should BeLessThan $DiffWrite
}
}
It 'Diff File Folder was written to within the last 24 Hours' {
$Diffwrite |Should BeGreaterThan (Get-Date).AddHours(-24)
}
}
if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
{
$Logcreate = [System.IO.Directory]::GetCreationTime($Log)
$LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
if($Logcreate -eq $LogWrite)
{
It "Has Files in the LOG folder for $db" {
Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the LOG folder for $db" {
$LogCreate | Should BeLessThan $LogWrite
}
}
It 'Log File Folder was written to within the last 30 minutes' {
$Logwrite |Should BeGreaterThan (Get-Date).AddMinutes(-30)
}
}# Simple Recovery
}
}# Check for backups
You could just run the script you have just created from your checklist; hopefully this blog post can help you see that you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this:

[CmdletBinding()]
## Pester Test to check OLA
Param(
$Instance,
$CheckForBackups,
$CheckForDBFolders,
$JobSuffix ,
$Share ,
[switch]$NoDatabaseRestoreCheck,
[switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this

$Script = @{
Path = $Path;
Parameters = @{ Instance = $Instance;
CheckForBackups = $true;
CheckForDBFolders = $true;
JobSuffix = 'BackupShare1';
Share = '\\Server1\BackupShare1';
NoDatabaseRestoreCheck= $true;
DontCheckJobOutcome = $true}
}
Invoke-Pester -Script $Script
but that's a bit messy, hard to remember and won't encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it on GitHub: Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on line 90 of Test-OlaInstance so that the script can find it.
Once you have that you can call it for a single instance or a number of instances like so. Here I check for folders and backup files:
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking if the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases.)
pester ola check.PNG
This gives me a nice and simple automated method of checking if Ola's maintenance script has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.
I hope you find it useful

Remove-SQLDatabaseSafely My First Contribution to DBATools

What is DBA Tools?

A collection of modules for SQL Server DBAs. It initially started out as ‘sqlmigration’, but has now grown into a collection of various commands that help automate DBA tasks and encourage best practices.

You can read more about it here and it is freely available for download on GitHub. I thoroughly recommend that you watch this quick video to see just how easy it is to migrate an entire SQL instance in one command (longer session here).

Installing it is as easy as
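One route, assuming you have access to the PowerShell Gallery, is:

# Assumes the PowerShell Gallery is reachable; see dbatools.io for other install options
Install-Module -Name dbatools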

which will get you over 80 commands. Visit https://dbatools.io/functions/ to find out more information about them.

cmdlets

The journey to Remove-SQLDatabaseSafely started with William Durkin b | t, who presented to the SQL South West User Group (you can get his slides here).

Following that session I wrote a PowerShell script to gather information about the last used date for databases, which I blogged about here, and then a T-SQL script to take a final backup and create a SQL Agent Job to restore from that backup, which I blogged about here. The team have used this solution (updated to load the DBA Database and a report instead of using Excel) ever since, and it proved invaluable when a read-only database was dropped and could quickly and easily be restored with no fuss.

I was chatting with Chrissy LeMaire, who founded dbatools b | t, about this process, and when she asked for contributions in the SQL Server Community Slack I offered my help and she suggested I write this command. I have learnt so much. I thoroughly enjoyed and highly recommend working on projects collaboratively to improve your skills. It is amazing to work with such incredible, professional PowerShell people.

I went back to the basics and thought about what was required and watched one of my favourite videos again, Grant Fritchey's Backup Rant.

I decided that the process should be as follows

  1. Performs a DBCC CHECKDB
  2. Database is backed up WITH CHECKSUM
  3. Database is restored with VERIFY ONLY on the source
  4. An Agent Job is created to easily restore from that backup
  5. The database is dropped
  6. The Agent Job restores the database
  7. Performs a DBCC CHECKDB and drops the database for a final time

This (hopefully) passes all of Grant's checks. This is how I created the command.

I check that the SQL Agent is running, otherwise we won't be able to run the job. I use a while loop with a timeout like this:
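A minimal sketch of that idea (the real command contains more robust logic; $ServerName is assumed to hold the target computer name and a default instance is assumed):

# Sketch only - wait up to 60 seconds for the SQL Agent service to be running
# (default instance service name; a named instance would be 'SQLAgent$InstanceName')
$AgentService = Get-Service -ComputerName $ServerName -Name 'SQLSERVERAGENT'
$WaitSeconds = 0
while ($AgentService.Status -ne 'Running' -and $WaitSeconds -lt 60) {
    Start-Sleep -Seconds 5
    $WaitSeconds += 5
    $AgentService.Refresh()
}
if ($AgentService.Status -ne 'Running') {
    Write-Warning "SQL Agent is not running on $ServerName so the restore job will not be able to run"
}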

There are a lot more checks and logic than I will describe here to make sure that the process is as robust as possible. For example, the script can exit after errors are found using DBCC CHECKDB, or continue and label the database backup file and restore job appropriately. Unless the force option is used it will exit if the job name already exists. We have tried to think of everything, but if something has been missed or you have suggestions, let us know (details at the end of the post).

The only thing I didn’t add was a LARGE RED POP UP SAYING ARE YOU SURE YOU WANT TO DROP THIS DATABASE but I considered it!!

Performs a DBCC CHECKDB

Running DBCC CHECKDB with PowerShell is as easy as this:
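For example, via the SMO CheckTables method (a sketch; $srv is assumed to be an SMO Server object and $DBName the database name):

# Runs DBCC CHECKDB (with REPAIR_NONE) and returns the output messages
$db = $srv.Databases[$DBName]
$db.CheckTables('None')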

You can read more on MSDN.

Database is backed up WITH CHECKSUM

Stuart Moore is my go-to for doing backups and restores with SMO.

I ensured that the backup was performed with checksum like this
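A sketch of an SMO backup with checksum (again assuming $srv and $DBName, plus a $BackupFile path):

$backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$backup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
$backup.Database = $DBName
$backup.Checksum = $true   # the WITH CHECKSUM option
$backup.Devices.AddDevice($BackupFile, 'File')
$backup.SqlBackup($srv)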

Database is restored with VERIFY ONLY on the source

I used SMO all the way through this command and performed the restore verify only like this
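Something along these lines (a sketch using the same assumed variables):

$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Devices.AddDevice($BackupFile, 'File')
$restore.Database = $DBName
# SqlVerify performs RESTORE VERIFYONLY and returns $true or $false
if (-not $restore.SqlVerify($srv)) {
    Write-Warning "The backup of $DBName did not verify"
}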

An Agent Job is created to easily restore from that backup

First I created a category for the Agent Job
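A sketch, with 'Rationalisation' as an assumed category name:

# Create the job category if it does not already exist
if (-not $srv.JobServer.JobCategories['Rationalisation']) {
    $category = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobCategory($srv.JobServer, 'Rationalisation')
    $category.Create()
}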

and then generated the TSQL for the restore step by using the script method on the Restore SMO object
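which can be as simple as this (the Script method returns the T-SQL that the Restore object would execute):

$RestoreScript = $restore.Script($srv)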

This is how to create an Agent Job
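A sketch, assuming a $JobName variable and the category created above:

$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job($srv.JobServer, $JobName)
$job.Category = 'Rationalisation'
$job.Create()
# The job needs a target server before it can be run
$job.ApplyToTargetServer($srv.Name)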

and then to add a job step to run the restore command
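In outline it is the same JobStep approach shown in the job step posts above (a sketch; $RestoreScript comes from the Script method):

$JobStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep($job, "Restore $DBName")
$JobStep.SubSystem = 'TransactSql'
$JobStep.Command = $RestoreScript
$JobStep.OnSuccessAction = 'QuitWithSuccess'
$JobStep.OnFailAction = 'QuitWithFailure'
$JobStep.Create()
$job.Alter()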

 

The database is dropped

We try 3 different methods to drop the database
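The real logic is more thorough, but the shape of it is something like this sketch:

# 1. Kill connections and drop via SMO
try {
    $srv.KillDatabase($DBName)
}
catch {
    # 2. A plain SMO drop
    try {
        $srv.Databases[$DBName].Drop()
    }
    catch {
        # 3. Fall back to T-SQL
        $Query = "ALTER DATABASE [$DBName] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE [$DBName];"
        Invoke-Sqlcmd -ServerInstance $srv.Name -Query $Query
    }
}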

The Agent Job restores the database

To run the Agent Job I call the start method of the Job SMO Object
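For example (a sketch; the command also waits for the job to finish):

$job.Start()
# Poll until the job is idle again
do {
    Start-Sleep -Seconds 5
    $job.Refresh()
}
while ($job.CurrentRunStatus -ne 'Idle')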
Then we drop the database for the final time with the confidence that we have a safe backup and an easy one click method to restore it from that backup (as long as the backup is in the same location)
There are further details on the functions page on dbatools
Some videos of it in action are on YouTube http://dbatools.io/video
You can take a look at the code on GitHub here

You can install it with
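(as above, assuming access to the PowerShell Gallery)

Install-Module -Name dbatools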

You can provide feedback via the Trello Board or discuss it in the #dbatools channel in the SQL Server Community Slack.
You too can become a contributor: https://dbatools.io/join-us/. Come and write a command to make it easy for DBAs to… (this bit is up to your imagination).

Using the new SQLServer PowerShell module to get SQL Agent Job Information

So with the July release of SSMS everything changed for using PowerShell with SQL. You can read the details here. As I mentioned in my previous post, the name of the module has changed to sqlserver.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

You can download the latest SSMS release here. Once you have installed it and rebooted you can start to look at the new PowerShell CMDlets.

Take a look at the new cmdlets.

Today I want to look at agent jobs
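Something like this lists the Agent-related ones (assuming the module is imported):

Get-Command -Module SqlServer -Name *SqlAgent*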

getcomand sqlagent

So I decided to see how to gather the information I collect for the DBADatabase, as described here.

This is the query I use to insert the data for the server level agent job information.

So Get-SQLAgentJob looks like the one I need. Let's take a look at the help; this should be the starting point whenever you use a new cmdlet.
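For example:

Get-Help Get-SqlAgentJob -Full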

Which states

Returns a SQL Agent Job object for each job that is present in the target instance of SQL Agent.

That sounds like it will meet my needs. Let's take a look.

sqlinstances

I can get the information I require like this
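A sketch of the sort of gathering I mean (the property names come from the SMO Job object; the exact code I used differs):

$Jobs = (Get-SqlAgentJob -ServerInstance $Instance).Where{$_.IsEnabled -eq $true}
[pscustomobject]@{
    Instance   = $Instance
    JobCount   = $Jobs.Count
    FailedJobs = $Jobs.Where{$_.LastRunOutcome -eq 'Failed'}.Count
}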

NOTE – That code is for PowerShell V4 and V5; if you are using earlier versions of PowerShell you would need to use
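the pipeline equivalent, for example:

$Jobs = Get-SqlAgentJob -ServerInstance $Instance | Where-Object {$_.IsEnabled -eq $true}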

But to make the code more performant it is better to do this
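I can only reconstruct the comparison here, but the difference between the .Where() method and filtering on the pipeline can be measured like this:

Measure-Command { (Get-SqlAgentJob -ServerInstance $Instance).Where{$_.IsEnabled -eq $true} }
Measure-Command { Get-SqlAgentJob -ServerInstance $Instance | Where-Object {$_.IsEnabled -eq $true} }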

jobs

Using Measure-Command showed that this completed in
TotalSeconds : 0.9889336
Rather than
TotalSeconds : 2.9045701

Note that
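(my guess at the line in question, filtering on a property name that the Job object does not have):

$Jobs | Where-Object {$_.Enabled -eq $true}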

Does not work. I had to check the properties using
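For example:

$Jobs | Get-Member -MemberType Property -Name *Enabled*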

Which showed me

IsEnabled Property bool IsEnabled {get;set;}

So I tested this against the various SQL versions I had in my lab using this code
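A rough sketch of that sort of loop (assuming $Instances holds the lab instance names):

foreach ($Instance in $Instances) {
    $Jobs = Get-SqlAgentJob -ServerInstance $Instance
    [pscustomobject]@{
        Instance    = $Instance
        EnabledJobs = $Jobs.Where{$_.IsEnabled -eq $true}.Count
    }
}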

Here are the results
job data table

I also had a look at Get-SqlAgentJobHistory. Let's take a look at the help:

DESCRIPTION

Returns the JobHistory present in the target instance of SQL Agent.

This cmdlet supports the following modes of operation to return the JobHistory:

  1. By specifying the Path of the SQL Agent instance.
  2. By passing the instance of the SQL Agent in the input.
  3. By invoking the cmdlet in a valid context.

So I ran
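(a likely form of the call):

Get-SqlAgentJobHistory -ServerInstance $server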

And got back a whole load of information: every job history available on the server. Too much to look at immediately to work out what to do.

So I looked at just one job
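(again reconstructed; the -JobName parameter narrows the output to a single job):

Get-SqlAgentJobHistory -ServerInstance $server -JobName 'The Name Of The Job'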

And got back the last month's worth of history for that one job, as that is the schedule used to purge the job history for this server. So then I added -Since Yesterday to only get the last 24 hours' history.

agentjobdetail

The Since Parameter is described as

-Since <SinceType>

A convenient abbreviation to avoid using the -StartRunDate parameter.
It can be specified with the -EndRunDate parameter.

Do not specify a -StartRunDate parameter, if you want to use it.

Accepted values are:
– Midnight (gets all the job history information generated after midnight)
– Yesterday (gets all the job history information generated in the last 24 hours)
– LastWeek (gets all the job history information generated in the last week)
– LastMonth (gets all the job history information generated in the last month)

When I run
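(reconstructed; counting the filtered history):

(Get-SqlAgentJobHistory -ServerInstance $server -JobName 'The Name Of The Job' -Since Yesterday).Count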

I get

Count : 3

And if I run
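(the same call piped to Out-GridView):

Get-SqlAgentJobHistory -ServerInstance $server -JobName 'The Name Of The Job' -Since Yesterday | Out-GridView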

I get

agent job out gridview

Which matches the view I see in SSMS Agent Job History

jobhistory

So Get-SqlAgentJobHistory will enable you to use PowerShell to gather information about the job history for each step of the Agent Jobs, and also the message, which I can see being very useful.

Come and join us in the SQL Community Slack to discuss these CMDLets and all things SQL Community https://sqlps.io/slack

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute:

https://sqlps.io/vote for the SQLPS sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets

trellocount

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and already it seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us

PowerShell CMDLets added for SQL2016 Always Encrypted

The post on the SQLServer blog at TechNet by the SQL Server Tools Team today made me jump out of my seat.

The July update for SSMS includes the first substantial improvement in SQL PowerShell in many years. We owe a lot of thanks for this effort to the great collaboration with our community. We have several new CMDLETs to share with you

In one release there are twenty-five new CMDLets for the new sqlserver module.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

So SQLPS will still continue to work but will not be updated and will not contain the new CMDlets or the future new CMDlets.

So what new things do we have?

This month we introduce CMDLETs for the following areas:

  • Always Encrypted
  • SQL Agent
  • SQL Error Logs

Chrissy LeMaire has written about the new SQL Agent cmdlets

Aaron Nelson has written about the new Get-SqlErrorLog cmdlet

Laerte Junior has written about Invoke-SQLCmd

All four of us will be presenting a webinar on the new CMDlets via the PowerShell Virtual Chapter Wed, Jul 06 2016 12:00 Eastern Daylight Time If you cant make it a recording will be made available on YouTube on the VC Channel https://sqlps.io/video

Always Encrypted CMDlets

That leaves the Always Encrypted CMDLets, and there are 17 of those!

Add-SqlColumnEncryptionKeyValue Adds a new encrypted value for an existing column encryption key object in the database.
Complete-SqlColumnMasterKeyRotation Completes the rotation of a column master key.
Get-SqlColumnEncryptionKey Returns all column encryption key objects defined in the database, or returns one column encryption key object with the specified name.
Get-SqlColumnMasterKey Returns the column master key objects defined in the database, or returns one column master key object with the specified name.
Invoke-SqlColumnMasterKeyRotation Initiates the rotation of a column master key.
New-SqlAzureKeyVaultColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in Azure Key Vault.
New-SqlCngColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store supporting the Cryptography Next Generation (CNG) API.
New-SqlColumnEncryptionKey Creates a new column encryption key object in the database.
New-SqlColumnEncryptionKeyEncryptedValue Produces an encrypted value of a column encryption key.
New-SqlColumnEncryptionSettings Creates a new SqlColumnEncryptionSettings object that encapsulates information about a single column’s encryption, including CEK and encryption type.
New-SqlColumnMasterKey Creates a new column master key object in the database.
New-SqlCspColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store with a Cryptography Service Provider (CSP) supporting Cryptography API (CAPI).
Remove-SqlColumnEncryptionKey Removes the column encryption key object from the database.
Remove-SqlColumnEncryptionKeyValue Removes an encrypted value from an existing column encryption key object in the database.
Remove-SqlColumnMasterKey Removes the column master key object from the database.
Set-SqlColumnEncryption Encrypts, decrypts or re-encrypts specified columns in the database.
 

 

That seems to cover setting up Always Encrypted with PowerShell, removing it and getting information about it. When the new SSMS update is dropped you will be able to start using all of this new functionality.

Just remember Import-Module sqlserver

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute:

https://sqlps.io/vote for the SQLPS sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets

trellocount

 

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and already it seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us

Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now – click here – and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL Set-up. Treating all of your servers to the same defaults makes it even easier to manage at scale remotely.

I am comfortable with using SMO to gather and change properties on SQL instances, so I started by doing this:

        It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
$Scriptblock = {
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server .
return $srv.BackupDirectory}
$State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
$State | Should Be 'F:\SQLBACKUP\BACKUPS'
}

This is how to find the properties that you want:

  ## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv|gm

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | select *
$srv.Databases[0] | gm

I quickly found as I added more tests that it was taking a long time to perform the tests (about 5 seconds each test) and that it took an age to fail each of the tests if the server name was incorrect or the server unavailable.

I fixed the first one by testing with a ping before running the tests

   ## Check for connectivity
if((Test-Connection $Server -count 1 -Quiet) -eq $false){
Write-Error "Could not connect to $Server"
$_
continue
}

The continue is there because I wanted to loop through an array of servers

I improved the performance using a remote session and a custom object

      Describe "$Server" {
BeforeAll {
$Scriptblock = {
[pscustomobject]$Return = @{}
$srv = ''
$SQLAdmins = $Using:SQLAdmins
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Using:Server
$Return.DBAAdminDb = $Srv.Databases.Name.Contains('DBA-Admin')
$Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
$Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.count -eq $SQLAdmins.Count
$SysAdmins = $Srv.Roles['sysadmin'].EnumMemberNames()
$Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.count -eq $SQLAdmins.Count
$Return.BackupDirectory = $srv.BackupDirectory
$Return.DataDirectory = $srv.DefaultFile

The BeforeAll script block is run, as it sounds like it should, once before all of the tests, BeforeEach would run once before each of the tests. I define an empty custom object and then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end

   $Return.Alerts82345Exist = ($srv.JobServer.Alerts |Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825}).Count
$Return.Alerts82345Enabled = ($srv.JobServer.Alerts |Where {($_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825) -and $_.IsEnabled -eq $true}).Count
$Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
Return $Return
}
try {
$Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
}
catch {
Write-Error "Unable to Connect to $Server"
$Error
continue
}

I was then able to test against the property of the custom object

   It 'Should have Alerts for Severity 20 and above' {
$Return.Alerts20SeverityPlusExist | Should Be 6
}
It 'Severity 20 and above Alerts should be enabled' {
$Return.Alerts20SeverityPlusEnabled | Should Be 6
}
It 'Should have alerts for 823,824 and 825' {
$Return.Alerts82345Exist |Should Be 3
}
It 'Alerts for 823,824 and 825 should be enabled' {
$Return.Alerts82345Enabled |Should Be 3
}

Occasionally, for reasons I haven't explored, I had to test against the Value property of the returned object:

          It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
$Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments or groups of servers with different default values, so I parameterised the test results as well, and then the logical step was to turn it into a function so that I could do some parameter splatting. This also gives me the opportunity to show all of the things that I am currently passing as parameters to the test:

   $Parms = @{
Servers = 'SQLServer1','SQLServer2','SQLServer3';
SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards';
BackupDirectory = 'C:\MSSQL\Backup';
DataDirectory = 'C:\MSSQL\Data\';
LogDirectory = 'C:\MSSQL\Logs\';
MaxMemMb = '4096';
Collation = 'Latin1_General_CI_AS';
TempFiles = 4 ;
OlaSysFullFrequency = 'Daily';
OlaSysFullStartTime = '21:00:00';
OlaUserFullSchedule = 'Weekly';
OlaUserFullFrequency = 1 ;## 1 for Sunday
OlaUserFullStartTime = '22:00:00';
OlaUserDiffSchedule = 'Weekly';
OlaUserDiffFrequency = 126; ## 126 for every day except Sunday
OlaUserDiffStartTime = '22:00:00';
OlaUserLogSubDayInterval = 15;
OlaUserLoginterval = 'Minute';
HasSPBlitz = $true;
HasSPBlitzCache = $True;
HasSPBlitzIndex = $True;
HasSPAskBrent = $true;
HasSPBlitzTrace = $true;
HasSPWhoisActive = $true;
LogWhoIsActiveToTable = $true;
LogSPBlitzToTable = $true;
LogSPBlitzToTableEnabled = $true;
LogSPBlitzToTableScheduled = $true;
LogSPBlitzToTableSchedule = 'Weekly';
LogSPBlitzToTableFrequency = 2 ; # 2 means Monday
LogSPBlitzToTableStartTime  = '03:00:00'}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules, which you will have to modify to suit your own environment.

To be able to run this you will need to have the Pester module. If you are using Windows 10 then it is installed by default; if not:

  Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here, and also in these videos from the conference.
You can find the tests on GitHub here and I will continue to add to the defaults that I check.
This is not a replacement for other SQL configuration tools such as PBM, but it is a nice, simple way of giving a report on the current status of a SQL installation, either at a particular point in time when something is wrong or after an installation prior to passing the server over to another team or into service.


DBA Database scripts are on Github

It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, this fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList in my DBA database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here.

github

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use: some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power BI files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome