Using Pester with dbatools Test-DbaLastBackup

In previous posts I have shown how to use Test-DbaLastBackup from dbatools and how you can make use of the results. Today we will look at using Pester with the results.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.

We shall use it to validate our results. First we need to gather our results as we have seen before. In this example I have set the MaxMB to 5, so change that if you are playing along.

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2016N1','SQL2016N2'
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}
Then we need to write some Pester tests. I tried to use test cases, which are the correct method to iterate through collections as Mike Robbins shows here, but Pester does not accept the type of object that is returned from this command for that. It's OK though: because Pester is just PowerShell, we can use a foreach loop.
In this scenario we are testing for failures rather than for tests skipped because the file path is not a network share or the size is greater than our maximum, so our checks use the Should Not assertion. I have also added a test for the time the backup was taken.
Describe "Last Backup Test results - NOTE THIS IGNORES Skipped restores,DBCC and BackupFiles" {
foreach($result in $results)
{
It "$($Result.Database) on $($Result.SourceServer) File Should Exist" {
$Result.FileExists| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Restore should be Success" {
$Result.RestoreResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) DBCC should be Success" {
$Result.DBCCResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Backup Should be less than a week old" {
$Result.BackupTaken| Should BeGreaterThan (Get-Date).AddDays(-7)
}
}
}
If we run that we get an output like this. Green is good, red is bad 🙂
01 - pester script.PNG
We can save the script to a file and use Invoke-Pester to call it like this.
Invoke-Pester C:\temp\BackupPester.ps1
(Some Restore Frames removed for brevity)
02 -invoke pester.gif
Invoke-Pester can output results to a file, so we can output to XML, which can be consumed by many things.
$Date = Get-Date -Format ddMMyyyyHHmmss
$tempFolder = 'c:\temp\BackupTests\'
Push-Location $tempFolder
$XML = $tempFolder + "BackupTestResults_$Date.xml"
$script = 'C:\temp\BackupPester.ps1'
Invoke-Pester -Script $Script -OutputFile $xml -OutputFormat NUnitXml
which will provide an XML file like this
04 - XML output.PNG
We can also make use of the reportunit.exe from http://relevantcodes.com/ to create pretty HTML files from the XML files we created
This piece of code will download and extract the file if it does not exist in the directory
#download and extract ReportUnit.exe
$url = 'http://relevantcodes.com/Tools/ReportUnit/reportunit-1.2.zip'
$fullPath = Join-Path $tempFolder $url.Split("/")[-1]
$reportunit = $tempFolder + '\reportunit.exe'
if((Test-Path $reportunit) -eq $false)
{
(New-Object Net.WebClient).DownloadFile($url,$fullPath)
Expand-Archive -Path $fullPath -DestinationPath $tempFolder
}
and this will run it against the XML and open the file
##run reportunit against report.xml and display result in browser
$HTML = $tempFolder  + 'index.html'
& .\reportunit.exe $tempFolder
Invoke-Item $HTML
which will look like this
03 - pretty html file.gif
Happy Automating

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

Backing up SQL Server on Linux using Ola Hallengren's Maintenance Solution

With the release of SQL Server vNext CTP 1.4, SQL Agent became available for use on Linux. To install it you need to upgrade your SQL Server to CTP 1.4. On Ubuntu you do this with

sudo apt-get update
sudo apt-get install mssql-server

Once you have CTP 1.4 you can install SQL Agent as follows

sudo apt-get update
sudo apt-get install mssql-server-agent
sudo systemctl restart mssql-server

For different flavours of Linux follow the steps here.

Once you have done that you will see that the Agent is now available

01 - SSMS Agent Linux.PNG

So now I can schedule backups and maintenance for my Linux SQL databases using the agent. I immediately turned to Ola Hallengren's Maintenance Solution. I downloaded the SQL file and ran it against my Linux server once I had changed the path for the backups to a directory I had created at /var/opt/mssql/backups. Notice that it is specified using Windows notation with C:\ at the root.

SET @CreateJobs = 'Y'          -- Specify whether jobs should be created. 
SET @BackupDirectory     = N'C:\var\opt\mssql\backups' -- Specify the backup root directory. 
SET @CleanupTime         = 350         -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted. 
SET @OutputFileDirectory = NULL         -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used. 
SET @LogToTable          = 'Y'          -- Log commands to a table.

The stored procedures were created

03 - stored procedures

and the jobs were created

04 - jobs.PNG

Now the jobs are not going to run as they are, because they have CmdExec steps and this is not supported in SQL on Linux, so we have to make some changes to the steps. As I blogged previously, this is really easy using PowerShell.

First we need to grab the jobs into a variable. We will use Get-SqlAgentJob from the sqlserver module, which ships with SSMS 2016 or later. You can get it from https://sqlps.io/dl. As we are targeting a Linux SQL Server we will use SQL authentication, which we will provide via Get-Credential, and then take a look at the jobs.

Import-Module sqlserver
$cred = Get-Credential
$Jobs = Get-SqlAgentJob -ServerInstance LinuxvVNext -Credential $cred
$jobs |ft -auto

 

05 Powershell jobs.PNG

Once the jobs were in the variable I decided to filter out only the jobs that call the stored procedures to perform the backups, DBCC and index optimisation, and loop through them first. Backups are the most important, after all.

## Find the jobs we want to change
foreach($Job in $jobs.Where{$_.Name -like '*DATABASES*'})

Then it is simply a case of replacing the sqlcmd text in the command to return it to T-SQL, adding the database name (I installed Ola's stored procedures into the master database) and changing the subsystem to use T-SQL instead of CmdExec.

## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'master'
## Alter the jobstep
$job.jobsteps[0].Alter()

We can check that it has done this using PowerShell

$Jobs = Get-SqlAgentJob -ServerInstance LinuxvVNext -Credential $cred
foreach($Job in $jobs.Where{$_.Name -like '*DATABASES*'})
{
foreach($step in $Job.JobSteps)
{
$step | Select Parent, Name, Command,DatabaseName,Subsystem
}
}
06 - Jobs changed.PNG

or by looking in SSMS if you prefer

07 - jobs changed ssms.PNG

Now let's run the jobs and check the history using Get-SqlAgentJobHistory.

Get-SqlAgentJobHistory -ServerInstance linuxvnextctp14 -Credential $cred | select RunDate,StepID,Server,JobName,StepName,Message|Out-GridView
08 - ogv for jobs.PNG

Which pretty much matches what you see in SSMS

09 - ssms jobs view.PNG

and if you look in the directory you see the files exactly as you would expect them to be

10 - Files in Linux

We still need to change the other jobs that Ola's script creates. If we look at the command steps

 

11 - job comands.PNG

We can see that the CommandLog Cleanup job can use the same PowerShell code as the backup jobs, while the sp_delete_backuphistory and sp_purgejobhistory jobs need to refer to the msdb database instead of master. For the moment the Output File Cleanup job is the only one that cannot be run on Linux. Hopefully soon we will be able to run PowerShell job steps and that will be resolved as well.

Here is the full snippet of code to change all of the jobs

$server = 'Linuxvnextctp14'
$cred = Get-Credential
$Jobs = Get-SqlAgentJob -ServerInstance $server -Credential $cred
## Find the jobs we want to change
foreach($Job in $jobs)
{
if($Job.Name -like '*DATABASES*' -or $Job.Name -like '*CommandLog*')
{
## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'master'
## Alter the jobstep
$job.jobsteps[0].Alter()
}
if($Job.Name -like '*history*')
{
## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d msdb -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'msdb'
## Alter the jobstep
$job.jobsteps[0].Alter()
}
}
 Happy Automating

Taking dbatools Test-DbaLastBackup a little further

In a previous post I showed how easy it is to test your backups using Test-DbaLastBackup

Today I thought I would take it a little further and show you how PowerShell can be used to transmit or store this information in the manner you require

Test-DbaLastBackup returns an object of information

SourceServer  : SQL2016N2
TestServer    : SQL2016N2
Database      : FadetoBlack
FileExists    : True
RestoreResult : Success
DbccResult    : Success
SizeMB        : 1243.26
BackupTaken   : 3/18/2017 12:36:07 PM
BackupFiles   : Z:\SQL2016N2\FadetoBlack\FULL_COPY_ONLY\SQL2016N2_FadetoBlack_FULL_COPY_ONLY_20170318_123607.bak

which shows the server, the database name, if the file exists, the restore result, the DBCC result, the size of the backup file, when the backup was taken and the path used

Text File

As it is an object we can make use of that in PowerShell. We can output the results to a file

Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -MaxMB 5 | Out-File C:\temp\Test-Restore.txt
notepad C:\temp\Test-Restore.txt

01 - out file.PNG

CSV

Or maybe you need a CSV

 Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -MaxMB 5 | Export-Csv  C:\temp\Test-Restore.csv -NoTypeInformation

02 - csv file.PNG

JSON

Maybe you want some JSON

 Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1| ConvertTo-Json | Out-file c:\temp\test-results.json

06 - json results.PNG

HTML

Or an HTML page


$Results = Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1
$Results | ConvertTo-Html | Out-File c:\temp\test-results.html

03 - html.PNG

Excel

Or perhaps you want a nice colour-coded Excel sheet to show your manager or the auditors

Import-Module dbatools

$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
## Run the test and save to a variable
$Results = Test-DbaLastBackup -SqlServer $server -Destination $TestServer
# Set the filename
$TestDate = Get-Date
$Date = Get-Date -Format ddMMyyyy_HHmmss
$filename = 'C:\Temp\TestResults_' + $Date + '.xlsx'
# Create a .com object for Excel
$xl = new-object -comobject excel.application
$xl.Visible = $true # Set this to False when you run in production
$wb = $xl.Workbooks.Add() # Add a workbook
$ws = $wb.Worksheets.Item(1) # Add a worksheet
$cells=$ws.Cells
$col = 1
$row = 3
## Create a legend
$cells.item($row,$col)="Legend"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex = 34
$row ++
$cells.item($row,$col)="True or Success"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex = 10
$row ++
$cells.item($row,$col)="False or Failed"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 3
$row ++
$cells.item($row,$col)="Skipped"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 16
$row ++
$cells.item($row,$col)="Backup Under 7 days old"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 4
$row ++
$cells.item($row,$col)="Backup Over 7 days old"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 3
## Create a header
$col ++
$row = 3
$cells.item($row,$col)="Source Server"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Test Server"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Database"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="File Exists"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Restore Result"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="DBCC Result"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Size Mb"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Backup Date"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Backup Files"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col = 2
$row = 4
foreach($result in $results)
{
$col = 2
$cells.item($row,$col)=$Result.SourceServer
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col ++
$cells.item($row,$col)=$Result.TestServer
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.Database
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.FileExists
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.FileExists -eq 'True')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.FileExists -eq 'False')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.RestoreResult
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.RestoreResult -eq 'Success')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.RestoreResult -eq 'Failed')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.DBCCResult
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.DBCCResult -eq 'Success')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.DBCCResult -eq 'Failed')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.SizeMb
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.BackupTaken
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.BackupTaken -gt (Get-Date).AddDays(-7))
{
    $Cells.item($row,$col).Interior.ColorIndex= 4
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
$col++
$cells.item($row,$col)=$Result.BackupFiles
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$row++
}
[void]$ws.cells.entireColumn.Autofit()
## Add the title after the autofit
$col = 2
$row = 1
$cells.item($row,$col)="This report shows the results of the test backups performed on $TestServer for $Server on $TestDate"
$cells.item($row,$col).font.size=18
$Cells.item($row,$col).Columnwidth = 10
$wb.Saveas($filename)
$xl.quit()

It looks like this. Green is Good, Red is Bad, Grey is don’t care!

Email

You might need to email the results; here I am using Gmail as an example. With two-factor authentication you need to use an app password in the credential.

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
## Run the test and save to a variable
$Results = Test-DbaLastBackup -SqlServer $server -Destination $TestServer -MaxMB 5
$to = ''
$smtp = 'smtp.gmail.com'
$port = 587
$cred = Get-Credential
$from = 'Beard@TheBeard.Local'
$subject = 'The Beard Reports on Backup Testing'
$Body = $Results | Format-Table | Out-String
Send-MailMessage -To $to -From $from -Body $Body -Subject $subject -SmtpServer $smtp -Priority High -UseSsl -Port $port -Credential $cred
07 -email
You can of course attach any of the above files using the -Attachments parameter of Send-MailMessage.
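For example, a minimal sketch of attaching the Excel workbook created above (the $filename variable is the one set in the Excel example; any of the other files would work the same way):

## Attach a results file to the email - $filename is assumed to be the Excel file created earlier in this post
Send-MailMessage -To $to -From $from -Body $Body -Subject $subject -SmtpServer $smtp -Priority High -UseSsl -Port $port -Credential $cred -Attachments $filename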

Database

Of course, as good data professionals we probably want to put the data into a database where we can ensure that it is kept safe and secure

dbatools has a couple of commands to help with that too. We can use Out-DbaDataTable to create a DataTable object and Write-DbaDataTable to write it to a database.

Create a table

USE [TestResults]
GO
CREATE TABLE [dbo].[backuptest](
[SourceServer] [nvarchar](250) NULL,
[TestServer] [nvarchar](250) NULL,
[Database] [nvarchar](250) NULL,
[FileExists] [nvarchar](10) NULL,
[RestoreResult] [nvarchar](200) NULL,
[DBCCResult] [nvarchar](200) NULL,
[SizeMB] [int] NULL,
[Backuptaken] [datetime] NULL,
[BackupFiles] [nvarchar](300) NULL
) ON [PRIMARY]
GO
then add the data
Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2005Ser2003','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2012Ser08AG3','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3'
## Run the test for each server and save to a variable (This uses PowerShell v4 or above code)
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}
## Convert to a datatable
$DataTable = Out-DbaDataTable -InputObject $Results
## Write to the database
Write-DbaDataTable -SqlServer $Server -Database TestResults -Schema dbo -Table backuptest -KeepNulls -InputObject $DataTable

and query it

08 - Database.PNG
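If you want to read the rows back with PowerShell as well, a minimal sketch using Invoke-Sqlcmd from the sqlserver module (any query tool will do; the server and database names are the ones used above) looks like this

## Read the stored test results back out of the TestResults database
Invoke-Sqlcmd -ServerInstance $Server -Database TestResults -Query 'SELECT SourceServer, [Database], RestoreResult, DBCCResult, BackupTaken FROM dbo.backuptest' |
Format-Table -AutoSize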

Hopefully that has given you some ideas of how you can make use of this great command, and also shown one of the benefits of PowerShell: the ability to use objects for different purposes.

Happy Automating

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

Testing Your SQL Server Backups the Easy Way with PowerShell & dbatools

In a previous post I wrote about how easy it was to restore a whole SQL Server's user databases from a directory using the dbatools module. Maybe it is a good idea to look at for disaster recovery scenarios, but even PowerShell is going to be useless if your backups don't work.

But setting up a solution to test your backups (technically, test your restores) is difficult, isn't it?

Let's use the dbatools module and see how easy it is.

The dbatools module has a command called Test-DbaLastBackup. If you look at the page or at the help using

 

Get-Help Test-DbaLastBackup -ShowWindow

 

you will see that this command

Restores all or some of the latest backups and performs a consistency check

1. Gathers information about the last full backups
2. Restores the backups to the Destination with a new name. If no Destination is specified, the originating SqlServer will be used.
3. The database is restored as “dbatools-testrestore-$databaseName” by default, but you can change dbatools-testrestore to whatever you would like using -Prefix
4. The internal file names are also renamed to prevent conflicts with the original database
5. A consistency check is then performed
6. And the test database is finally dropped

So, if you only have one SQL Server but want to ensure that you are testing your backup files, then as long as you have the disk space you can simply run

Test-DbaLastBackup -SqlServer sql2016n2

and the latest backups that have been taken will be restored using a different name with different filenames, checked for consistency and then dropped

01 - simple test backups.PNG

and as you can see an object is returned
SourceServer  : SQL2016N2
TestServer    : SQL2016N2
Database      : FadetoBlack
FileExists    : True
RestoreResult : Success
DbccResult    : Success
SizeMB        : 1243.26
BackupTaken   : 3/18/2017 12:36:07 PM
BackupFiles   : Z:\SQL2016N2\FadetoBlack\FULL_COPY_ONLY\SQL2016N2_FadetoBlack_FULL_COPY_ONLY_20170318_123607.bak

which shows the Server, the database, if the file exists, the restore result, the DBCC result, the size of the backup file, when it was taken and the path used

You don’t have to use the same server and in many shops you would not want to. You can specify a destination server and you can also pipe the results to Out-GridView to enable easy filtering.

Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 | OGV

Note that you need to be backing up to a shared location, i.e. a path that starts with \\. I have fudged this a little in the demo for the keen-eyed.

03 - with OGV.PNG

Maybe you only want to test the backups for the important databases, or some backups are restored using other means and you don't need to test them this way. There is a databases parameter and you can tab through the database names.

04 - choosing databases.gif

In the ISE you can see the drop down of database names

05 - in ISE.PNG
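Putting that together, a sketch of testing only a couple of chosen databases looks like this. The parameter name is as at the dbatools version used in this post, and the second database name is purely illustrative.

## Test only the selected databases
Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -Databases FadetoBlack, TheImportantDB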

If you have limited space you might not want to test the largest databases so you can use the MaxMb parameter to only restore databases under this size. In the example below, you can see that Fadetoblack was skipped and the system databases were skipped as they are not backing up to a shared location

06 - max mb.PNG

The databases are restored onto the server using a different name and the files are also named differently to avoid any conflicts. The default prefix is dbatools-testrestore- but you can change this using the prefix switch if you wish

02 data files.PNG
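For example, a quick sketch of changing that prefix (the value here is just illustrative)

## Use a custom prefix for the restored test databases instead of dbatools-testrestore-
Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -Prefix 'BeardTest-'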

You may not want to use your special, super quick storage for performing your test restores. If you have separate data drives that you would like to use for the restores, you can specify those with the -DataDirectory and -LogDirectory parameters. If you do not use these switches then the command will use the default data and log locations.
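A sketch of redirecting the test restores to other drives might look like this (the paths are illustrative)

## Restore the test databases to specific data and log directories rather than the defaults
Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -DataDirectory 'D:\TestRestores\Data' -LogDirectory 'L:\TestRestores\Log'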

It's possible to reduce the number of checks that are done. If you only want to verify the backup then you can use the -VerifyOnly switch, you can skip the DBCC check by using the -NoCheck switch, and you can leave the test restore databases on the server using the -NoDrop switch.
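Here are minimal sketches of those three options

## Only verify the backup files, without performing the test restore
Test-DbaLastBackup -SqlServer sql2016n2 -VerifyOnly
## Restore the backups but skip the DBCC consistency check
Test-DbaLastBackup -SqlServer sql2016n2 -NoCheck
## Keep the test restore databases on the server instead of dropping them
Test-DbaLastBackup -SqlServer sql2016n2 -NoDrop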

Happy Automating

 

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

 Install-Module dbatools

Then you can use

 Update-dbatools


Restoring an entire SQL Server's user databases with PowerShell using dbatools

All the good DBAs backup their databases.

A significant number of SQL DBAs use Ola Hallengren's maintenance solution to do so.

This gives a folder structure like this

01 - folder structure.PNG

 

In my lab I had installed SQL 2016 on a server running Server 2016 TP5, which expired, so I needed to re-install Windows and therefore restore all of my user databases again. This was so easy using the dbatools module that I thought it was worth sharing to show how easy your disaster recovery process could be.

Having re-installed Windows and SQL and copied the backup files back to the server (although I could have used a network location), I then had to restore all of the user databases.

This is how I restored all of my user databases using the dbatools module command Restore-SQLBackupFromDirectory

 Restore-SqlBackupFromDirectory -SqlServer SQL2016N2 -Path '\\sql2016n2\c$\MSSQL\Backup\SQL2016N2'

Here it is in action

02 - Restore in action.PNG

This is the output

03 - output.PNG

That’s it. As simple as that. An entire SQL Server’s user databases restored in one line of code. The latest full, latest diff and latest log backups for each user database all restored to the default file and log locations without issue.

If you look at the help for the command using

 Get-Help Restore-SqlBackupFromDirectory -ShowWindow

You will see that there is a -ReuseSourceFolderStructure switch which will use the file structure from the backup file if you have a complex file structure for your SQL files. You can also use a -NoRecovery switch so that you can add further backups; maybe you could use this for setting up mirroring or Always On Availability Groups.
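Combined with the path used above, a sketch using both switches might look like this

## Restore keeping the original folder structure and leave the databases in a restoring state so further backups can be applied
Restore-SqlBackupFromDirectory -SqlServer SQL2016N2 -Path '\\sql2016n2\c$\MSSQL\Backup\SQL2016N2' -ReuseSourceFolderStructure -NoRecovery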

If you want to check the restore history of your SQL Server then you can use the Get-DbaRestoreHistory command. I like to use Out-GridView as it enables you to filter and sort easily

 Get-DbaRestoreHistory -SqlServer sql2016N2 | ogv

 

04 get-database restore history

So you can see each file that was restored and where it was restored to

Happy Automating

 

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

 Install-Module dbatools

Then you can use

 Update-dbatools

 

Adding a PowerShell Job Step to an existing SQL Agent Job with PowerShell

In my last post I showed how to add a T-SQL Job step to an existing SQL Agent Job. The process is exactly the same for a PowerShell job step.

As before I gathered the required jobs using the Get-SqlAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the where method
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file, and of course you can add more logic to filter those servers as required.)

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Of course, to add a PowerShell job step the target server needs to be SQL 2008 or higher. If you have an estate with older versions it is worth creating an SMO server object (you can use a snippet), checking the version and then getting the jobs like this

foreach($Server in $Servers)
{
 $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
 if($srv.VersionMajor -ge 10)
 {
    $Jobs = $srv.JobServer.Jobs

and you could choose to create a CmdExec job step for earlier versions in an else code block, as sketched below.
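A rough sketch of that else block might look like the following. This is not from the original post; the step properties simply mirror the JobStep pattern shown later in this post, and the command text is purely illustrative.

    else
    {
       ## For instances older than SQL 2008 there is no PowerShell subsystem,
       ## so create a CmdExec step that shells out to powershell.exe instead
       $Jobs = $srv.JobServer.Jobs.Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
       foreach($Job in $Jobs)
       {
           $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
           $NewStep.Name = 'a descriptive name for my CmdExec step'
           $NewStep.Parent = $Job
           $NewStep.Command = 'powershell.exe -NoProfile -Command "Get-Process"' ## illustrative command
           $NewStep.SubSystem = 'CmdExec'
           $NewStep.OnFailAction = 'QuitWithFailure'
           $NewStep.OnSuccessAction = 'GoToNextStep'
           $NewStep.ID = 1
           $NewStep.Create()
           $Job.Alter()
       }
    }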

Once I have the Jobs I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property

job-step-properties

We need to set the name, the parent (the job), the command, the subsystem, the on fail action, the on success action and the ID for the job step.
I set the command to a variable to make the code easier to read

$Command = "Get-Process"

The rest of the properties I fill in inside the loop. To find out what the properties can hold I look at MSDN for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step starting at 1, so this example will add a new job step that will be the first to run.

$Name = $Job.Name
$JobServer = $srv.JobServer
$Job = $JobServer.Jobs[$Name]
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'a descriptive name for my PowerShell script'
$NewStep.Parent = $Job
$NewStep.Command = $Command
$NewStep.SubSystem = 'PowerShell'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties all we need to do is create it and alter the job

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

foreach($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    if($srv.VersionMajor -ge 10)
    {
       $Jobs = $srv.JobServer.Jobs.Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
       foreach($Job in $Jobs)
       {
           $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
           $NewStep.Name = 'a descriptive name for my PowerShell script'
           $NewStep.Parent = $Job
           $NewStep.Command = $Command
           $NewStep.SubSystem = 'PowerShell'
           $NewStep.OnFailAction = 'QuitWithFailure'
           $NewStep.OnSuccessAction = 'GoToNextStep'
           $NewStep.ID = 1
           $NewStep.create()
           $Job.Alter()
       }
    }

}

Happy Automating

VS Code PowerShell Snippets

Just a quick post, as much as a reminder for me as anything, but also useful to those that attended my sessions last week where I talked about snippets in PowerShell ISE

Jeff Hicks wrote a post explaining how to create snippets in VS Code for PowerShell

I love using snippets so I went and converted my snippets list for ISE (available on GitHub) into the json required for VS Code (available on GitHub)

Here is an example of a snippet
"SMO-Server": {
        "prefix": "SMO-Server",
        "body": [
            "$$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $$Server"
        ],
        "description": "Creates a SQL Server SMO Object"
    },

I followed this process in this order

Click File –> Preferences –> User Snippets and type PowerShell, or edit $env:APPDATA\Code\User\snippets\powershell.json

In order, I converted the code in the existing snippets' "Text" like this

        Replace `$ with $$
        Replace \ with \\
        Replace ” with \”
        \r for new line
        \t for tab
        Each line in “”
        , at the end of each line in the body   except the last one
        Look out for red or green squiggles 🙂
I then add
The name of the snippet, first before the : in “”
The prefix is what you type to get the snippet
The body is the code following the above Find and Replaces
The description is the description!!
and save and I have snippets in VS Code 🙂
snippets.gif
That should help you to convert existing ISE snippets into VS Code PowerShell snippets and save you time and keystrokes 🙂