Making Start-Demo work with multi-line commands without a backtick

I love to speak about PowerShell. I really enjoy giving presentations, and when I saw Start-Demo being used at the PowerShell Conference in Hanover I started to make use of it in my own presentations.

Start-Demo was written in 2007 by a fella who knows PowerShell pretty well :-)  https://blogs.msdn.microsoft.com/powershell/2007/03/03/start-demo-help-doing-demos-using-powershell/

It was then updated in 2012 by Max Trinidad http://www.maxtblog.com/2012/02/powershell-start-demo-now-allows-multi-lines-onliners/

This enabled support for multi-line code using backticks at the end of each line. This works well, but I dislike having to use the backticks in foreach loops: it confuses people, who think that they need to be included, and to my mind it looks a bit messy.


This didn't bother me enough to look at the code, but I did mention it to my friend Luke t | g, who decided to use it as a challenge for his Friday lunch-time codeathon and updated the function so that it works without needing a backtick.


It also works with nested loops


It's just a little improvement, but one that I think works well and looks good.

You can find it at

https://github.com/SQLDBAWithABeard/Presentations/blob/master/Start-Demo.ps1

and a little demo showing what it can and can't do

https://github.com/SQLDBAWithABeard/Presentations/blob/master/start-demotest.ps1

Load the Start-Demo.ps1 file and then run

Start-Demo PATHTO\start-demotest.ps1
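
If you want to try it yourself, a minimal way of doing that (assuming both files are saved in your current folder) is to dot-source the script and then call the function:

# Dot-source Start-Demo.ps1 so the Start-Demo function is loaded into the session
. .\Start-Demo.ps1
# Then point it at the demo file
Start-Demo .\start-demotest.ps1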

Enjoy!

 

 

 

Remove-SQLDatabaseSafely My First Contribution to DBATools

What is DBA Tools?

A collection of modules for SQL Server DBAs. It initially started out as ‘sqlmigration’, but has now grown into a collection of various commands that help automate DBA tasks and encourage best practices.

You can read more about it here, and it is freely available for download on GitHub. I thoroughly recommend that you watch this quick video to see just how easy it is to migrate an entire SQL instance in one command (there is a longer session here).

Installing it is as easy as

Install-Module dbatools

which will get you over 80 commands. Visit https://dbatools.io/functions/ to find out more about them.


The journey to Remove-SQLDatabaseSafely started with William Durkin b | t, who presented to the SQL South West User Group (you can get his slides here).

Following that session I wrote a PowerShell script to gather information about the last used date for databases, which I blogged about here, and then a T-SQL script to take a final backup and create a SQL Agent Job to restore from that backup, which I blogged about here. The team have used this solution (updated to load the DBA Database and a report instead of using Excel) ever since, and it proved invaluable when a read-only database was dropped and could quickly and easily be restored with no fuss.

I was chatting about this process with Chrissy LeMaire b | t, who founded DBATools, and when she asked for contributions in the SQL Server Community Slack I offered my help and she suggested I write this command. I have learnt so much. I thoroughly enjoyed working on a project collaboratively, and highly recommend it as a way to improve your skills. It is amazing to work with such incredible, professional PowerShell people.

I went back to basics, thought about what was required, and watched one of my favourite videos again: Grant Fritchey's Backup Rant.

I decided that the process should be as follows:

  1. Performs a DBCC CHECKDB
  2. Database is backed up WITH CHECKSUM
  3. Database is restored with VERIFY ONLY on the source
  4. An Agent Job is created to easily restore from that backup
  5. The database is dropped
  6. The Agent Job restores the database
  7. A DBCC CHECKDB is performed and the database is dropped for a final time

This (hopefully) passes all of Grant's checks. This is how I created the command.

I check that the SQL Agent is running, otherwise we won't be able to run the job. I use a while loop with a timeout like this:

$agentservice = Get-Service -ComputerName $ipaddr -Name $serviceName
if ($agentservice.Status -ne 'Running')
{
    $agentservice.Start()
    $timeout = New-TimeSpan -Seconds 60
    $sw = [diagnostics.stopwatch]::StartNew()
    $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    while ($agentstatus -ne 'Running' -and $sw.Elapsed -lt $timeout)
    {
        # poll the service until it reports Running or the 60 second timeout expires
        $agentstatus = (Get-Service -ComputerName $ipaddr -Name $serviceName).Status
    }
}

There are a lot more checks and logic than I will describe here to make sure that the process is as robust as possible. For example, the script can exit after errors are found using DBCC CHECKDB, or continue and label the database backup file and restore job appropriately. Unless the force option is used, it will exit if the job name already exists. We have tried to think of everything, but if something has been missed or you have suggestions, let us know (details at the end of the post).

The only thing I didn’t add was a LARGE RED POP UP SAYING ARE YOU SURE YOU WANT TO DROP THIS DATABASE but I considered it!!

Performs a DBCC CHECKDB

Running DBCC CHECKDB with PowerShell is as easy as this:

# Requires the SQL Server SMO assemblies to be loaded
$sourceserver = New-Object Microsoft.SqlServer.Management.Smo.Server "ServerName"
$db = $sourceserver.Databases[$dbname]
$null = $db.CheckTables('None')

You can read more about the CheckTables method on MSDN.
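
If you want to act on what DBCC finds, one approach (a sketch rather than the exact logic in the command) is to capture the messages that CheckTables returns and wrap the call so a failure stops the process before anything is dropped:

try
{
    # CheckTables returns a collection of message strings from DBCC
    $results = $db.CheckTables('None')
    $results | ForEach-Object { Write-Output $_ }
}
catch
{
    throw "DBCC CHECKDB failed for $dbname - investigate before dropping the database"
}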

Database is backed up WITH CHECKSUM

Stuart Moore is my go-to for doing backups and restores with SMO.

I ensured that the backup was performed with checksum like this

$backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
$backup.Action = [Microsoft.SqlServer.Management.SMO.BackupActionType]::Database
$backup.BackupSetDescription = "Final Full Backup of $dbname Prior to Dropping"
$backup.Database = $dbname
$backup.Checksum = $True
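
That snippet only configures the backup object. A rough sketch of the remaining steps (adding a file device and actually running the backup; the exact code in the command may differ) looks like this, where $filename is the backup file path used for the verify below:

# Point the backup at a file device and run it with the checksum setting above
$device = New-Object Microsoft.SqlServer.Management.Smo.BackupDeviceItem $filename, 'File'
$backup.Devices.Add($device)
$backup.SqlBackup($sourceserver)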

Database is restored with VERIFY ONLY on the source

I used SMO all the way through this command and performed the restore with VERIFY ONLY like this:

$restoreverify = New-Object 'Microsoft.SqlServer.Management.Smo.Restore'
$restoreverify.Database = $dbname
$restoreverify.Devices.AddDevice($filename, $devicetype)
$result = $restoreverify.SqlVerify($sourceserver)
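
SqlVerify returns a boolean, so acting on the result is simple enough (again, a sketch rather than the exact code):

if (-not $result)
{
    # The backup could not be verified, so it is not safe to drop the database
    throw "The backup at $filename failed RESTORE VERIFYONLY - not safe to drop $dbname"
}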

An Agent Job is created to easily restore from that backup

First I created a category for the Agent Job

Function New-SqlAgentJobCategory
{
    param (
        [string]$categoryname,
        [object]$jobServer
    )
    if (!$jobServer.JobCategories[$categoryname])
    {
        if ($Pscmdlet.ShouldProcess($sourceserver, "Creating Agent Job Category $categoryname"))
        {
            try
            {
                Write-Output "Creating Agent Job Category $categoryname"
                $category = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobCategory
                $category.Parent = $jobServer
                $category.Name = $categoryname
                $category.Create()
                Write-Output "Created Agent Job Category $categoryname"
            }
            catch
            {
                Write-Exception $_
                throw "FAILED : To Create Agent Job Category $categoryname - Aborting"
            }
        }
    }
}
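
A call to it might look something like this (the category name is only an example, and $jobServer is the SMO JobServer object used in the job creation code below):

New-SqlAgentJobCategory -categoryname 'Database Restore Jobs' -jobServer $jobServer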

and then generated the T-SQL for the restore step by using the Script method on the Restore SMO object.
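
Roughly speaking that step looks like the sketch below (not the exact code in the command); $filename and $devicetype are the same values used for the verify earlier, and Script() returns the RESTORE statement as text rather than running it:

$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Database = $dbname
$restore.Devices.AddDevice($filename, $devicetype)
$restore.ReplaceDatabase = $true
# The generated T-SQL becomes the command for the Agent Job step
$jobStepCommand = $restore.Script($sourceserver) | Out-String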

This is how to create an Agent Job

$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job $jobServer, $jobname
$job.Name = $jobname
$job.OwnerLoginName = $jobowner
$job.Description = "This job will restore the $dbname database using the final backup located at $filename"
and then to add a job step to run the restore command
$jobStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep $job, $jobStepName
$jobStep.SubSystem = 'TransactSql' # could also be 'PowerShell'
$jobStep.DatabaseName = 'master'
$jobStep.Command = $jobStepCommand
$jobStep.OnSuccessAction = 'QuitWithSuccess'
$jobStep.OnFailAction = 'QuitWithFailure'
if ($Pscmdlet.ShouldProcess($destination, "Creating Agent JobStep on $destination"))
{
    $null = $jobStep.Create()
}
$job.ApplyToTargetServer($destination)
$job.StartStepID = $jobStartStepid
$job.Alter()

The database is dropped

We try 3 different methods to drop the database

$server.KillDatabase($dbname)
$server.databases[$dbname].Drop()
$null = $server.ConnectionContext.ExecuteNonQuery("DROP DATABASE [$dbname]")
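
A rough sketch of how those three attempts might be chained (the real error handling in the command is more thorough) is:

try
{
    # KillDatabase kills existing connections and then drops the database
    $server.KillDatabase($dbname)
}
catch
{
    try
    {
        $server.Databases[$dbname].Drop()
    }
    catch
    {
        $null = $server.ConnectionContext.ExecuteNonQuery("DROP DATABASE [$dbname]")
    }
}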
The Agent Job restores the database
To run the Agent Job I call the Start method of the Job SMO object:
$job = $destserver.JobServer.Jobs[$jobname]
$job.Start()
$status = $job.CurrentRunStatus
while ($status -ne 'Idle')
{
    Write-Output "Restore Job for $dbname on $destination is $status"
    $job.Refresh()
    $status = $job.CurrentRunStatus
    Start-Sleep -Seconds 5
}
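
Once the loop exits it is worth confirming that the job actually succeeded before carrying on; something like this (a sketch, not necessarily the exact check in the command):

$job.Refresh()
if ($job.LastRunOutcome -ne 'Succeeded')
{
    throw "The restore job $jobname on $destination did not succeed"
}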
Then we drop the database for the final time, with the confidence that we have a safe backup and an easy one-click method to restore it from that backup (as long as the backup is in the same location).

There are further details on the functions page on dbatools.

Some videos of it in action are on YouTube http://dbatools.io/video

You can take a look at the code on GitHub here

You can install it with

Install-Module dbatools
You can provide feedback via the Trello Board or discuss it in the #dbatools channel in the SQL Server Community Slack.

You too can become a contributor (https://dbatools.io/join-us/). Come and write a command to make it easy for DBAs to (this bit is up to your imagination).

Using the new SqlServer PowerShell module to get SQL Agent Job Information

So with the July release of SSMS everything changed for using PowerShell with SQL. You can read the details here. As I mentioned in my previous post, the name of the module has changed to sqlserver.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

You can download the latest SSMS release here. Once you have installed it and rebooted, you can start to look at the new PowerShell cmdlets.

 Import-module sqlserver

Take a look at the cmdlets with

 Get-command -module sqlserver

Today I want to look at agent jobs

 Get-command *sqlagent*


So I decided to see how to gather the information that I collect for the DBADatabase, as described here.

This is the query I use to insert the data for the server level agent job information.

 $Query = @"
INSERT INTO [Info].[AgentJobServer]
 ([Date]
 ,[InstanceID]
 ,[NumberOfJobs]
 ,[SuccessfulJobs]
 ,[FailedJobs]
 ,[DisabledJobs]
 ,[UnknownJobs])
 VALUES
 (GetDate()
 ,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
 ,'$JobCount'
 ,'$successCount'
 ,'$failedCount'
 ,'$JobsDisabled'
 ,'$UnknownCount')
"@

So Get-SqlAgentJob looks like the one I need. Let's take a look at the help, which should be the starting point whenever you use a new cmdlet:

 Get-Help Get-SqlAgentJob -Full

Which states

Returns a SQL Agent Job object for each job that is present in the target instance of SQL Agent.

That sounds like it will meet my needs. Let's take a look:

Get-SqlAgentJob -ServerInstance $Connection|ft -AutoSize


I can get the information I require like this


$JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count

NOTE – That code is for PowerShell v4 and v5. If you are using earlier versions of PowerShell you would need to use:

$JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection|Where-Object {$_.LastRunOutcome -eq 'Succeeded'}).Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object {$_.LastRunOutcome -eq 'Failed'}).Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.IsEnabled -eq $false}).Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.LastRunOutcome -eq 'Unknown'}).Count

But to make the code more performant it is better to do this

 [pscustomobject]$Jobs= @{}
$Jobs.JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$Jobs.successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$Jobs.failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$Jobs.JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$Jobs.UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count
$Jobs


Using Measure-Command showed that this completed in
TotalSeconds : 0.9889336
Rather than
TotalSeconds : 2.9045701
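
Each of those lines still calls Get-SqlAgentJob separately though, so another way to cut the round trips (and the pattern I use in the lab test further down) is to call it once and filter the saved collection, for example:

# One call to the instance, then filter the collection in memory
$AgentJobs = Get-SqlAgentJob -ServerInstance $Connection
$Jobs = [pscustomobject]@{
    JobCount     = $AgentJobs.Count
    successCount = $AgentJobs.Where{$_.LastRunOutcome -eq 'Succeeded'}.Count
    failedCount  = $AgentJobs.Where{$_.LastRunOutcome -eq 'Failed'}.Count
    JobsDisabled = $AgentJobs.Where{$_.IsEnabled -eq $false}.Count
    UnknownCount = $AgentJobs.Where{$_.LastRunOutcome -eq 'Unknown'}.Count
}
$Jobs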

Note that

 (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.Enabled -eq $false}.Count

Does not work. I had to check the properties using

 Get-SqlAgentJob -ServerInstance $Connection |Get-Member -Type Properties

Which showed me

IsEnabled Property bool IsEnabled {get;set;}

So I tested this against the various SQL versions I had in my lab using this code

$Table = $null
$Table = New-Object System.Data.DataTable "Jobs"
$Col1 = New-Object System.Data.DataColumn ServerName,([string])
$Col2 = New-Object System.Data.DataColumn JobCount,([int])
$Col3 = New-Object System.Data.DataColumn SuccessCount,([int])
$Col4 = New-Object System.Data.DataColumn FailedCount,([int])
$Col5 = New-Object System.Data.DataColumn DisabledCount,([int])
$Col6 = New-Object System.Data.DataColumn UnknownCount,([int])

$Table.Columns.Add($Col1)
$Table.Columns.Add($Col2)
$Table.Columns.Add($Col3)
$Table.Columns.Add($Col4)
$Table.Columns.Add($Col5)
$Table.Columns.Add($Col6)
foreach ($ServerName in $DemoServers)
{
## $ServerName
$InstanceName =  $ServerName|Select-Object InstanceName -ExpandProperty InstanceName
$Port = $ServerName| Select-Object Port -ExpandProperty Port
$ServerName = $ServerName|Select-Object ServerName -ExpandProperty ServerName
$Connection = $ServerName + '\' + $InstanceName + ',' + $Port
try
{
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
}
catch
{
"Failed to connect to $Connection"
}
if (!( $srv.version)){
"Failed to Connect to $Connection"
continue
}
[pscustomobject]$Jobs= @{}
$JobHistory = Get-SqlAgentJob -ServerInstance $Connection
$Row = $Table.NewRow()
$Row.ServerName = $ServerName
$Row.JobCount = $JobHistory.Count
$Row.SuccessCount = $JobHistory.where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$Row.FailedCount = $JobHistory.where{$_.LastRunOutcome -eq 'Failed'}.Count
$Row.DisabledCount = $JobHistory.where{$_.IsEnabled -eq $false}.Count
$Row.UnknownCount = $JobHistory.where{$_.LastRunOutcome -eq 'Unknown'}.Count
$Table.Rows.Add($row)
}
$Table|ft
Here are the results

I also had a look at Get-SqlAgentJobHistory. Let's take a look at the help:

Get-help get-SQLAgentJobHistory -showwindow

DESCRIPTION

Returns the JobHistory present in the target instance of SQL Agent.

This cmdlet supports the following modes of operation to return the JobHistory:

  1. By specifying the Path of the SQL Agent instance.
  2. By passing the instance of the SQL Agent in the input.
  3. By invoking the cmdlet in a valid context.

So I ran

Get-SqlAgentJobHistory -ServerInstance sql2014ser12r2 

And got back a whole load of information: every job history available on the server. Too much to look at immediately to work out what to do.

So I looked at just one job

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive'

And got back the last month's worth of history for that one job, as that is the schedule used to purge the job history for this server. So then I added -Since Yesterday to only get the last 24 hours of history.

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday


The Since Parameter is described as

-Since <SinceType>

A convenient abbreviation to avoid using the -StartRunDate parameter.
It can be specified with the -EndRunDate parameter.

Do not specify a -StartRunDate parameter, if you want to use it.

Accepted values are:
– Midnight (gets all the job history information generated after midnight)
– Yesterday (gets all the job history information generated in the last 24 hours)
– LastWeek (gets all the job history information generated in the last week)
– LastMonth (gets all the job history information generated in the last month)

When I run

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |Measure-Object

I get

Count : 3

And if I run

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |select RunDate,StepID,Server,JobName,StepName,Message|Out-GridView

I get


Which matches the view I see in SSMS Agent Job History


So Get-SqlAgentJobHistory will enable you to use PowerShell to gather information about the job history for each step of the Agent Jobs, and also the message, which I can see being very useful.

Come and join us in the SQL Community Slack to discuss these CMDLets and all things SQL Community https://sqlps.io/slack

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day-to-day work. There are two Trello boards set up for YOU to use to contribute:

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets


We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and already it seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us

PowerShell CMDLets added for SQL2016 Always Encrypted

The post on the SQLServer blog at TechNet by the SQL Server Tools Team today made me jump out of my seat.

The July update for SSMS includes the first substantial improvement in SQL PowerShell in many years. We owe a lot of thanks for this effort to the great collaboration with our community. We have several new CMDLETs to share with you

In one release there are twenty-five new CMDLets for the new sqlserver module

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

So SQLPS will still continue to work, but it will not be updated and will not contain the new cmdlets or any future ones.

So what new things do we have?

This month we introduce CMDLETs for the following areas:

  • Always Encrypted
  • SQL Agent
  • SQL Error Logs

Chrissy LeMaire has written about the new SQL Agent cmdlets

Aaron Nelson has written about the new Get-SqlErrorLog cmdlet

Laerte Junior has written about Invoke-SQLCmd

All four of us will be presenting a webinar on the new cmdlets via the PowerShell Virtual Chapter on Wed, Jul 06 2016 at 12:00 Eastern Daylight Time. If you can't make it, a recording will be made available on YouTube on the VC channel: https://sqlps.io/video

Always Encrypted CMDlets

That leaves the Always Encrypted CMDLets and there are 17 of those!

Add-SqlColumnEncryptionKeyValue Adds a new encrypted value for an existing column encryption key object in the database.
Complete-SqlColumnMasterKeyRotation Completes the rotation of a column master key.
Get-SqlColumnEncryptionKey Returns all column encryption key objects defined in the database, or returns one column encryption key object with the specified name.
Get-SqlColumnMasterKey Returns the column master key objects defined in the database, or returns one column master key object with the specified name.
Invoke-SqlColumnMasterKeyRotation Initiates the rotation of a column master key.
New-SqlAzureKeyVaultColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in Azure Key Vault.
New-SqlCngColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store supporting the Cryptography Next Generation (CNG) API.
New-SqlColumnEncryptionKey Creates a new column encryption key object in the database.
New-SqlColumnEncryptionKeyEncryptedValue Produces an encrypted value of a column encryption key.
New-SqlColumnEncryptionSettings Creates a new SqlColumnEncryptionSettings object that encapsulates information about a single column’s encryption, including CEK and encryption type.
New-SqlColumnMasterKey Creates a new column master key object in the database.
New-SqlCspColumnMasterKeySettings Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store with a Cryptography Service Provider (CSP) supporting Cryptography API (CAPI).
Remove-SqlColumnEncryptionKey Removes the column encryption key object from the database.
Remove-SqlColumnEncryptionKeyValue Removes an encrypted value from an existing column encryption key object in the database.
Remove-SqlColumnMasterKey Removes the column master key object from the database.
Set-SqlColumnEncryption Encrypts, decrypts or re-encrypts specified columns in the database.
 

 

That seems to cover setting up Always Encrypted with PowerShell, removing it and getting information about it. When the new SSMS update is dropped you will be able to start using all of this new functionality.

Just remember Import-Module sqlserver

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day-to-day work. There are two Trello boards set up for YOU to use to contribute:

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets


 

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items, and already it seems a good place for people to get help, with 150+ members in a few days. You can get an invite here: https://sqlps.io/slack

Come and join us

Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now – click here – and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL Set-up. Treating all of your servers to the same defaults makes it even easier to manage at scale remotely.

I am comfortable with using SMO to gather and change properties on SQL Instances so I started by doing this

It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
    $Scriptblock = {
        [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
        $srv = New-Object Microsoft.SqlServer.Management.Smo.Server .
        return $srv.BackupDirectory
    }
    $State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
    $State | Should Be 'F:\SQLBACKUP\BACKUPS'
}
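
Tests like this are normally saved in a *.Tests.ps1 file and run with Invoke-Pester, something like this (the file name is hypothetical):

Invoke-Pester .\SQLDefaults.Tests.ps1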

This is how to find the properties that you want:

  ## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv|gm

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | select *
$srv.Databases[0] | gm

I quickly found as I added more tests that it was taking a long time to perform the tests (about 5 seconds each test) and that it took an age to fail each of the tests if the server name was incorrect or the server unavailable.

I fixed the first one by testing with a ping before running the tests

## Check for connectivity
if ((Test-Connection $Server -Count 1 -Quiet) -eq $false) {
    Write-Error "Could not connect to $Server"
    continue
}

The continue is there because I wanted to loop through an array of servers

I improved the performance using a remote session and a custom object

      Describe "$Server" {
BeforeAll {
$Scriptblock = {
[pscustomobject]$Return = @{}
$srv = ''
$SQLAdmins = $Using:SQLAdmins
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$Return.DBAAdminDb = $Srv.Databases.Name.Contains('DBA-Admin')
$Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
$Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.count -eq $SQLAdmins.Count
$SysAdmins = $Srv.Roles['sysadmin'].EnumMemberNames()
$Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.count -eq $SQLAdmins.Count
$Return.BackupDirectory = $srv.BackupDirectory
$Return.DataDirectory = $srv.DefaultFile

The BeforeAll script block is run, as it sounds like it should be, once before all of the tests; BeforeEach would run once before each of the tests. I define an empty custom object and then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end.

   $Return.Alerts82345Exist = ($srv.JobServer.Alerts |Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825}).Count
$Return.Alerts82345Enabled = ($srv.JobServer.Alerts |Where {$_.Messageid -eq 823 -or $_.Messageid -eq 824 -or $_.Messageid -eq 825 -and $_.IsEnabled -eq $true}).Count
$Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
Return $Return
}
try {
    $Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
}
catch {
    Write-Error "Unable to Connect to $Server"
    $Error
    continue
}

I was then able to test against the property of the custom object

   It 'Should have Alerts for Severity 20 and above' {
$Return.Alerts20SeverityPlusExist | Should Be 6
}
It 'Severity 20 and above Alerts should be enabled' {
$Return.Alerts20SeverityPlusEnabled | Should Be 6
}
It 'Should have alerts for 823,824 and 825' {
$Return.Alerts82345Exist |Should Be 3
}
It 'Alerts for 823,824 and 825 should be enabled' {
$Return.Alerts82345Enabled |Should Be 3
}

Occasionally, for reasons I haven't explored, I had to test against the value property of the returned object:

          It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
$Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments or groups of servers with different default values, so I parameterised the tests as well. The logical next step was to turn it into a function so that I could use parameter splatting. This also gives me the opportunity to show all of the things that I am currently passing to the tests as parameters:

   $Parms = @{
Servers = 'SQLServer1','SQLServer2','SQLServer3';
SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards';
BackupDirectory = 'C:\MSSQL\Backup';
DataDirectory = 'C:\MSSQL\Data\';
LogDirectory = 'C:\MSSQL\Logs\';
MaxMemMb = '4096';
Collation = 'Latin1_General_CI_AS';
TempFiles = 4 ;
OlaSysFullFrequency = 'Daily';
OlaSysFullStartTime = '21:00:00';
OlaUserFullSchedule = 'Weekly';
OlaUserFullFrequency = 1 ;## 1 for Sunday
OlaUserFullStartTime = '22:00:00';
OlaUserDiffSchedule = 'Weekly';
OlaUserDiffFrequency = 126; ## 126 for every day except Sunday
OlaUserDiffStartTime = '22:00:00';
OlaUserLogSubDayInterval = 15;
OlaUserLoginterval = 'Minute';
HasSPBlitz = $true;
HasSPBlitzCache = $True;
HasSPBlitzIndex = $True;
HasSPAskBrent = $true;
HASSPBlitzTrace =  $true;
HasSPWhoisActive = $true;
LogWhoIsActiveToTable = $true;
LogSPBlitzToTable = $true;
LogSPBlitzToTableEnabled = $true;
LogSPBlitzToTableScheduled = $true;
LogSPBlitzToTableSchedule = 'Weekly';
LogSPBlitzToTableFrequency = 2 ; # 2 means Monday
LogSPBlitzToTableStartTime  = '03:00:00'}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules, which you will have to modify to suit your own environment.

To be able to run this you will need the Pester module. If you are using Windows 10 it is installed by default; if not:

Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here, and also in these videos from the conference.

You can find the tests on GitHub here and I will continue to add to the defaults that I check.

This is not a replacement for other SQL configuration tools such as PBM, but it is a nice, simple way of giving a report on the current status of a SQL installation, either at a particular point in time when something is wrong or after an installation prior to passing the server over to another team or into service.

DBA Database scripts are on Github

It started with a tweet from Dusty


The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, this fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList in my DBA database I am able to target instances, by SQL Version, OS Version, Environment, Data Centre, System, Client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts allowing you to target the development environment and test before they get put onto live.

I talked to a lot of people in Hannover and they all suggested that I put the scripts onto GitHub. After some how-to instructions from a few people (thank you, Luke) I spent the weekend updating and cleaning up the code, and you can now find it on GitHub here.


I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use: some of the DBA scripts I use (with links to those you need to go and get yourself for licensing reasons) and the Power BI files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome

 

 

A Day In The Life of a SQLBits Volunteer


So SQLBits has finished for another year. About 1500 people registered for the largest SQL conference in Europe, held over four days in Liverpool last week: two days of full-day sessions and two days of shorter sessions, a pub quiz, and a party that was (literally) out of this world.

It is organised and run by volunteers

Yes, you’re right. Wow.

Of course we must pay special thanks to Simon, Darren, Chris, Jonathan, Annette, Allan and Alex, without forgetting all that JRJ has done. They ensured that the venue, sessions, speakers, sponsors, audio visual and (not forgetting) the amazing party by Sneaky Experience (thank you Julia) all occurred at the right time in the right place with the right people. The work and effort they have put in cannot be over-estimated, but this post isn't about those generous, inspiring, amazing folk.

Several people asked me what a volunteer does and how they can become one, and hopefully this post will answer both of those questions. More than 40 people volunteered to help at SQLBits.

Some volunteers arrived on Tuesday to help with the set up. Making sure that we knew where everything was and helping to lift and carry all of the things that needed moving.


 

We wanted to begin the process of packing the swag bags so that all the attendees on Wednesday could have them as they arrived, but unfortunately we were let down by a delivery firm and that was not possible. That job was completed by some very dedicated helpers during all of Wednesday and finished off by all helpers after Wednesday's sessions had finished. Many, many thanks and respect particularly to Bob and Conan for their dedication to the bag-packing cause, although I did hear “If I ever see another bag again ……..”

 

 

 

 

All of the registration desks with badges, lanyards and booklets were set up ready for the next day, the covers were put on the tables and the bean bags were put out. (They were needed later!!)

 

 

We started every morning with a helpers briefing at 7am, where we ensured that everyone knew what was happening that day. Most helpers took some part in registration, whether it was being at the desks handing out badges, assisting speakers and sponsors on arrival with questions, or some of the other jobs that needed doing such as putting the table cloths on the tables in the domes and getting out the bags.


During every session there were two room monitors. Their responsibility was to ensure that everything went smoothly during the day and during each of the sessions on Friday and Saturday. They assisted the speakers with drinks, timings and the other foibles that technology can throw, and they would have known what to do if there were issues with the domes.

 

They also made sure that everyone was in the correct session and answered lots of questions. The bit that hopefully most attendees missed was that, in the case of any issues, they called in one of the 5 or 6 super helpers who were patrolling the arena, who could go and get the AV guys or the venue staff to iron out the wrinkles, make sure that the water coolers got re-filled, etc.

They collected all of your feedback forms and took them to the main desk. There were always people at the front desk helping, although Alex, Annette and Terri spent almost all of their time there.

Of course it doesn't stop there. During the breaks, and particularly at lunchtime, the helpers were available and indeed noticeable. They assisted with the queues at lunchtime and for the keynote, answered questions and helped attendees with session choices. There were always people with questions that they were willing to answer, or they would find someone who could.

Once the sessions had finished, the helpers didn't. They manned (and womanned) the registration desks in the evening, returned all of the feedback forms and helped with sorting out the venue ready for the next day. There was also a de-brief for any lessons learned.


They even gave out the free drinks tokens at the entrance to the party! (More Pictures Here)


So why do they do it? They are all part of this amazing community dedicated to providing great events for people to learn. Many of them are volunteers at other events and/or user group leaders and attendees who want to give something back to the community. They were able to choose the full-day and hour-long sessions that they attended, so they were still able to learn from the amazing speakers at the brilliant sessions available. Everyone was able to choose which days they helped; it doesn't have to be for every day you are at the conference. They are a wonderful group of people and I have made some fantastic friendships by volunteering (and I get great hugs 🙂).

You also get asked the most brilliant questions like “Excuse me, I have a Dalek in my van, what would you like me to do with it?”

If that sounds like something you would like to do, you just need to email annette@sqlbits.com with 'I would like to be a helper next year at SQLBits' in the subject. You can also offer to help at your local user group or at other events; speak to your user group leaders for more details on that.

If it’s not for you, that’s cool but please, for me, the next time you see them at an event just say thank you. It really means a lot and they thoroughly deserve every word of thanks that we give them.
