Following a SQL Server upgrade, the backup share held a mixture of backups, some from the old version and some from the new one. I was asked if I had a script that could get the SQL version from every backup file in the share.
You can get more information about a backup using Read-DbaBackupHeader, and as it is PowerShell it is easy to put that information into any format you wish, maybe even into a database with Write-DbaDataTable
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) November 23, 2018
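The script itself was in an image in the tweet, so here is a minimal sketch of the approach. The share path and instance name are made up, and the output property names may vary between dbatools versions, so check Get-Help Read-DbaBackupHeader -Full on yours.
Import-Module dbatools
## Read the header of every backup file in the share and report which SQL version wrote it
$BackupFiles = Get-ChildItem -Path '\\beardnuc\SQLBackups' -Filter *.bak -Recurse
foreach ($File in $BackupFiles) {
    ## -SqlInstance was called -SqlServer in older dbatools versions
    Read-DbaBackupHeader -SqlInstance sql2016n1 -Path $File.FullName |
        Select-Object DatabaseName, SqlVersion, BackupFinishDate
}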
Support for PowerShell Core in dbatools is coming along very nicely. Following some hard work by the dbatools team and some PowerShell community members like Mathias Jessen, it is now possible to run a large number of dbatools commands in PowerShell Core running on Windows. There is still a little work to do to get it working on Linux and Mac, but I hear the team are working hard on that.
So the code example you see above was running on Windows 10 using PowerShell 6.1.1, the current latest stable release. This is excellent news, and congratulations to all those working hard to make it happen.
If you want to try PowerShell Core, you can follow the installation instructions in the PowerShell documentation.
The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)
I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling. Unfortunately, I found a major error in Pester (again, not strictly true, but it makes a good story!!)
I explained it with this slide in my presentation
Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) September 12, 2017
What a response. You wonderful people. I salute you with a Rimmer salute
There are 34 TSQL2sDay posts about dbatools, about getting started with PowerShell, whether you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, Default settings, best practices, migrations, Warnings in Agent Jobs, sqlpackage, VLFs, CMS, Disabling Named Pipes, Orphaned users, AG Status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.
I am really pleased to see so many first-timers at the TSQL2sDay monthly blog party. Please don’t let this be your only TSQL2sDay post. Come back next month and write a post on that topic.
Here they are below in the media of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) September 12, 2017
Martin writes about a script he wrote for backup testing for #tsql2sday – The comparison vs assignment is a very good point for beginners https://t.co/5JOxEaD3wF
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) September 12, 2017
Yet another #tsql2sday 1sttimer – Claudio writes about best practices and how to test them with dbatools and Pester https://t.co/kpL5zHnng7
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) September 12, 2017
Daniel (another #tsql2sday 1sttimer) – blogs about installing dbatools in Visual Studio to exclude tables from Scaffold-DbContext https://t.co/5c69uy2TWN
— Rob He/Him robsewell@tech.lgbt & @counter.soci (@sqldbawithbeard) September 12, 2017
A good security practice is to back up our SQL Servers to a network share but not allow users to browse the share. How can we ensure that our SQL Server has access, or test it if it has been set up by someone else?
Let’s set this up.
First, let’s create a share for our backups
$FileShareParams = @{
    Name                   = 'SQLBackups'
    Description            = 'The Place for SQL Backups'
    SourceVolume           = (Get-Volume -DriveLetter D)
    FileServerFriendlyName = 'beardnuc'
}
New-FileShare @FileShareParams
This will create a share called SQLBackups on the D drive of the server beardnuc, but without any permissions. Let’s grant permissions to everyone
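The grant was shown as a screenshot in the original. A sketch using the matching storage cmdlet might look like the following; remember that share permissions combine with the NTFS permissions on the folder, which is why a user can still be denied.
## Grant everyone full access to the new share (demo only - not something to do in production)
$FileShareAccessParams = @{
    Name        = 'SQLBackups'
    AccountName = 'Everyone'
    AccessRight = 'Full'
}
Grant-FileShareAccess @FileShareAccessParams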
and when I try to access as THEBEARD\Rob I get this
So how can I check that I have access from my SQL Server? Sure, I could get the password of the SQL service account and run a process as that account. I’m not saying that’s a good idea, but it could be done. Of course, it couldn’t be done if you are using Managed Service Accounts or Group Managed Service Accounts, but there is a way
Enter dbatools to the rescue 😉 The dbatools module (for those that don’t know) is a PowerShell module written by amazing folks in the community, designed to make administering your SQL Server significantly easier using PowerShell. The instructions for installing it are available here. It comprises 182 separate commands at present
There is a command called Test-SqlPath. As always, start with Get-Help
Get-Help Test-SqlPath -Full
So it uses master.dbo.xp_fileexist to determine if a file or directory exists from the perspective of the SQL Server service account, and it has three parameters: SqlServer, Path and SqlCredential for SQL Authentication. Of course, if that stored procedure is disabled on your estate then this command will not be of use to you. With that in mind, let’s run it and see what it does
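A minimal example against one of my lab instances and the share created above:
## Can the SQL Server service account on SQL2016N1 see the backup share?
Test-SqlPath -SqlServer SQL2016N1 -Path '\\BeardNuc\SQLBackups'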
Ah, I can’t show you as I don’t have access. Better get in touch with the data centre admin to check 😉 Luckily, I am my own data centre admin and have another account I can use 🙂
So what if we want to test all of our servers for access to the new share? I tried this
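(The attempt was a screenshot in the original; it amounted to passing the whole collection of names straight in, something like this.)
## The attempt - passing an array of server names in one call
Test-SqlPath -SqlServer $SQLServers -Path '\\BeardNuc\SQLBackups'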
It seems that at the moment (version 0.8.942) this command only accepts a single server. This is what you should do if you find a bug or have an idea for dbatools: raise an issue on GitHub
Navigate to the GitHub repository and click on issues. I generally search for the command name in the issues to see if someone else has beaten me to it
If those issues don’t match yours then click the green New Issue button
There is a template to fill in which asks you to specify your Windows, PowerShell and SQL versions, and it includes the commands that you need to get them. Please do this and paste the results in, as it will help the folks replicate the issue in the case of more complicated bugs
I created this issue with a potential fix as well. You don’t have to do that; just letting the folks know is good enough
Until that issue is resolved, you can check all of your servers as follows
$SQLServers = (Get-VM -ComputerName beardnuc).Where{$_.Name -like '*SQL*' -and $_.Name -notlike 'SQL2008Ser2008'}.Name
foreach ($Server in $SQLServers) {
    $Test = Test-SqlPath -SqlServer $Server -Path '\\BeardNuc\SQLBackups'
    [PSCustomObject]@{
        Server = $Server
        Result = $Test
    }
}
and if I remove one of the service accounts from the group, restart the service and run the command again
So that’s how to use dbatools to check that your SQL Servers have access to a network share, and also how to create an issue on GitHub for dbatools and help it to get even better
Happy Automating
NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using
Get-Module dbatools
and update it using an Administrator PowerShell session with
Update-Module dbatools
You may find that you get no output from Update-Module because you already have the latest version. If you have not installed the module from the PowerShell Gallery, you can do so with Install-Module dbatools.
With SQL Server vNext CTP 1.4 now available and providing SQL Agent capability on Linux, I wrote here about using Ola Hallengren’s scripts on Linux SQL Servers. So, can Test-DbaLastBackup work with Linux?
Yes it does!!
and I caught the database being restored in SSMS as well
Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.
We shall use it to validate our results. First we need to gather our results as we have seen before. In this example I have set the MaxMB to 5, so change that if you are playing along
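The gathering step was shown on screen in the original; a sketch with my lab server names:
Import-Module dbatools
## Gather the results into a variable for Pester to check - MaxMB 5 keeps the demo small
$results = Test-DbaLastBackup -SqlServer SQL2016N2 -Destination SQL2016N1 -MaxMB 5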
Then we need to write some Pester tests. I tried to use Test Cases, which are the correct method to iterate through collections as Mike Robbins shows here, but Pester does not accept the type of object that is returned from this command for that. It’s OK though; because Pester is just PowerShell, we can use a foreach loop.
In this scenario, we are testing for failures rather than when the backup test has skipped due to the file path not being a network share or the size being greater than our max size, so our checks are using the Should Not assertion. I have also added a test for the time the backup was taken.
Describe "Last Backup Test results - NOTE THIS IGNORES Skipped restores,DBCC and BackupFiles" {
foreach($result in $results)
{
It "$($Result.Database) on $($Result.SourceServer) File Should Exist" {
$Result.FileExists| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Restore should be Success" {
$Result.RestoreResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) DBCC should be Success" {
$Result.DBCCResult| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Backup Should be less than a week old" {
$Result.BackupTaken| Should BeGreaterThan (Get-Date).AddDays(-7)
}
}
If we run that, we get an output like this. Green is good, red is bad 🙂
We can save the script to a file and use Invoke-Pester to call it like this.
Invoke-Pester C:\temp\BackupPester.ps1
(Some Restore Frames removed for brevity)
Invoke-Pester can output results to a file, so we can output to XML, which can be consumed by many things
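A sketch of that step; NUnitXml is the format that ReportUnit, used below, expects, and the paths are examples:
## Save the test results as NUnit-style XML
$tempFolder = 'C:\temp'
Invoke-Pester C:\temp\BackupPester.ps1 -OutputFormat NUnitXml -OutputFile (Join-Path $tempFolder 'report.xml')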
and this will run it against the XML and open the file
## Run ReportUnit against report.xml and display the result in the browser
$HTML = Join-Path $tempFolder 'index.html'
& .\reportunit.exe $tempFolder
Invoke-Item $HTML
which will look like
Happy Automating
NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using
Get-Module dbatools
and update it using an Administrator PowerShell session with
Update-Module dbatools
You may find that you get no output from Update-Module because you already have the latest version. If you have not installed the module from the PowerShell Gallery, you can do so with Install-Module dbatools.
With the release of SQL Server vNext CTP 1.4, SQL Agent was released for use on Linux. To install it on Ubuntu you need to upgrade your SQL Server to CTP 1.4. On Ubuntu you do this with
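(the commands were shown inline in the original; these are the standard upgrade steps from Microsoft’s documentation at the time, assuming the mssql-server repository is already registered)
sudo apt-get update
sudo apt-get install mssql-server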
Once you have done that you will see that the Agent is now available
So now I can schedule backups and maintenance for my Linux SQL databases using the agent. I immediately turned to Ola Hallengren’s Maintenance Solution. I downloaded the SQL file and ran it against my Linux server, once I had changed the path for the backups to a directory I had created at /var/opt/mssql/backups. Notice that it is specified using Windows notation, with C:\ at the root
SET @CreateJobs = 'Y' -- Specify whether jobs should be created.
SET @BackupDirectory = N'C:\var\opt\mssql\backups' -- Specify the backup root directory.
SET @CleanupTime = 350 -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted.
SET @OutputFileDirectory = NULL -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used.
SET @LogToTable = 'Y' -- Log commands to a table.
The stored procedures were created
and the jobs were created
Now the jobs are not going to run as they are, because they have CmdExec steps and these are not supported in SQL Server on Linux, so we have to make some changes to the steps. As I blogged previously, this is really easy using PowerShell
First we need to grab the jobs into a variable. We will use Get-SqlAgentJob from the sqlserver module, which you need to download SSMS 2016 or later to get. You can get it from https://sqlps.io/dl. As we are targeting a Linux SQL Server we will use SQL authentication, which we will provide via Get-Credential, and then take a look at the jobs
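A sketch of that step; the server name is from my lab, and the full snippet at the end of the post uses the same lines:
$server = 'Linuxvnextctp14'
$cred = Get-Credential
## SQL authentication as we are targeting a Linux instance
$Jobs = Get-SqlAgentJob -ServerInstance $server -Credential $cred
$Jobs | Format-Table Name, IsEnabled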
Once the jobs were in the variable, I decided to filter out only the jobs that call the stored procedures to perform the backups, DBCC checks and index optimisation, and loop through them first. Backups are the most important, after all
## Find the jobs we want to change
foreach($Job in $jobs.Where{$_.Name -like '*DATABASES*'})
Then it is simply a case of replacing the sqlcmd text in the command to return it to T-SQL, adding the database name (I installed Ola’s stored procedures into the master database) and changing the subsystem to use T-SQL instead of CmdExec
## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'master'
## Alter the jobstep
$job.jobsteps[0].Alter()
We can check that it has done this using PowerShell
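A quick sketch of such a check, re-reading the first step of each backup job:
## Confirm the first step of each backup job now uses the TransactSQL subsystem
foreach ($Job in $Jobs.Where{$_.Name -like '*DATABASES*'}) {
    $Job.JobSteps[0] | Select-Object Name, SubSystem, DatabaseName
}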
and if you look in the directory you see the files exactly as you would expect them to be
We still need to change the other jobs that Ola’s script creates. If we look at the command steps
We can see that the CommandLog Cleanup job can use the same PowerShell code as the backup jobs, while the sp_delete_backuphistory and sp_purge_jobhistory jobs need to refer to the msdb database instead of master. For the moment, the Output File Cleanup job is the one that cannot be run on Linux. Hopefully soon we will be able to run PowerShell job steps and that will be resolved as well
Here is the full snippet of code to change all of the jobs
$server = 'Linuxvnextctp14'
$cred = Get-Credential
$Jobs = Get-SqlAgentJob -ServerInstance $server -Credential $cred
## Find the jobs we want to change
foreach ($Job in $jobs) {
    if ($Job.Name -like '*DATABASES*' -or $Job.Name -like '*CommandLog*') {
        ## replace the text as required
        $job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b', '')
        ## Change the subsystem
        $job.jobsteps[0].subsystem = 'TransactSQL'
        ## Add the databasename
        $job.jobsteps[0].DatabaseName = 'master'
        ## Alter the jobstep
        $job.jobsteps[0].Alter()
    }
    if ($Job.Name -like '*history*') {
        ## replace the text as required
        $job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d msdb -Q "' , '').Replace('" -b', '')
        ## Change the subsystem
        $job.jobsteps[0].subsystem = 'TransactSQL'
        ## Add the databasename
        $job.jobsteps[0].DatabaseName = 'msdb'
        ## Alter the jobstep
        $job.jobsteps[0].Alter()
    }
}
Today I thought I would take it a little further and show you how PowerShell can be used to transmit or store this information in the manner you require
Test-DbaLastBackup returns an object of information
which shows the server, the database name, if the file exists, the restore result, the DBCC result, the size of the backup file, when the backup was taken and the path used
Text File
As it is an object, we can make use of that in PowerShell. We can output the results to a file
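A sketch of that step (the path is just an example):
## Write the results out as a plain-text table
$Results | Format-Table -AutoSize | Out-File -FilePath C:\temp\BackupTestResults.txt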
It looks like this. Green is good, red is bad, grey is don’t care!
Email
You might need to email the results. Here I am using Gmail as an example. With two-factor authentication you need to use an app password in the credential
Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
## Run the test and save to a variable
$Results = Test-DbaLastBackup -SqlServer $Server -Destination $TestServer -MaxMB 5
## Email parameters - fill in the recipient address
$to = ''
$smtp = 'smtp.gmail.com'
$port = 587
$cred = Get-Credential
$from = 'Beard@TheBeard.Local'
$subject = 'The Beard Reports on Backup Testing'
$Body = $Results | Format-Table | Out-String
Send-MailMessage -To $to -From $from -Body $Body -Subject $subject -SmtpServer $smtp -Priority High -UseSsl -Port $port -Credential $cred
You can of course attach any of the above files using the -Attachments parameter of Send-MailMessage
Database
Of course, as good data professionals we probably want to put the data into a database where we can ensure that it is kept safe and secure
dbatools has a couple of commands to help with that too. We can use Out-DbaDataTable to create a DataTable object and Write-DbaDataTable to write it to a database
Create a table
USE [TestResults]
GO
CREATE TABLE [dbo].[backuptest](
[SourceServer] [nvarchar](250) NULL,
[TestServer] [nvarchar](250) NULL,
[Database] [nvarchar](250) NULL,
[FileExists] [nvarchar](10) NULL,
[RestoreResult] [nvarchar](200) NULL,
[DBCCResult] [nvarchar](200) NULL,
[SizeMB] [int] NULL,
[Backuptaken] [datetime] NULL,
[BackupFiles] [nvarchar](300) NULL
) ON [PRIMARY]
GO
then add the data
Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2005Ser2003','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2012Ser08AG3','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3'
## Run the test for each server and save to a variable (This uses PowerShell v4 or above code)
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}
## Convert to a datatable.
$DataTable = Out-DbaDataTable -InputObject $Results
## Write to the database
Write-DbaDataTable -SqlServer $Server -Database TestResults -Schema dbo -Table backuptest -KeepNulls -InputObject $DataTable
and query it
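The query was shown as a screenshot; a sketch using Invoke-Sqlcmd from the same sqlserver module:
## Read the test results back out of the table
Invoke-Sqlcmd -ServerInstance $Server -Database TestResults -Query 'SELECT * FROM dbo.backuptest' | Format-Table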
Hopefully that has given you some ideas of how you can make use of this great command and also one of the benefits of PowerShell and the ability to use objects for different purposes
Happy Automating
NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using
Get-Module dbatools
and update it using an Administrator PowerShell session with
Update-Module dbatools
You may find that you get no output from Update-Module because you already have the latest version. If you have not installed the module from the PowerShell Gallery, you can do so with Install-Module dbatools.
In a previous post I wrote about how easy it is to restore a whole SQL Server’s user databases from a directory using the dbatools module. That might be a good approach for disaster recovery scenarios, but even PowerShell is going to be useless if your backups don’t work
But setting up a solution to test your backups (technically test your restores) is difficult isn’t it?
The dbatools module has a command called Test-DbaLastBackup. If you look at the page, or at the help using
Get-Help Test-DbaLastBackup -ShowWindow
you will see that this command
Restores all or some of the latest backups and performs a consistency check
1. Gathers information about the last full backups
2. Restores the backups to the Destination with a new name. If no Destination is specified, the originating SqlServer will be used.
3. The database is restored as “dbatools-testrestore-$databaseName” by default, but you can change dbatools-testrestore to whatever you would like using -Prefix
4. The internal file names are also renamed to prevent conflicts with the original database
5. A consistency check is then performed
6. And the test database is finally dropped
So, if you only have one SQL Server but want to ensure that you are testing your backup files, then as long as you have the disk space you can simply run
Test-DbaLastBackup -SqlServer sql2016n2
and the latest backups that have been taken will be restored using a different name with different filenames, checked for consistency and then dropped
and as you can see an object is returned
SourceServer : SQL2016N2
TestServer : SQL2016N2
Database : FadetoBlack
FileExists : True
RestoreResult : Success
DbccResult : Success
SizeMB : 1243.26
BackupTaken : 3/18/2017 12:36:07 PM
BackupFiles : Z:\SQL2016N2\FadetoBlack\FULL_COPY_ONLY\SQL2016N2_FadetoBlack_FULL_COPY_ONLY_20170318_123607.bak
which shows the Server, the database, if the file exists, the restore result, the DBCC result, the size of the backup file, when it was taken and the path used
You don’t have to use the same server and in many shops you would not want to. You can specify a destination server and you can also pipe the results to Out-GridView to enable easy filtering.
Note you need to be backing up to a shared location, i.e. a path that starts with \\ (I have fudged this a little in the demo for the keen-eyed)
Maybe you only want to test the backups for the important databases, or some backups are restored using other means and you don’t need to test them this way. There is a Databases parameter, for which you can tab through the database names
In the ISE you can see the drop down of database names
If you have limited space you might not want to test the largest databases, so you can use the MaxMB parameter to only restore databases under that size. In the example below, you can see that Fadetoblack was skipped, and the system databases were skipped as they are not backing up to a shared location
The databases are restored onto the server using a different name, and the files are also named differently to avoid any conflicts. The default prefix is dbatools-testrestore- but you can change this using the -Prefix parameter if you wish
You may not want to use your special, super quick storage for performing your test restores. If you have separate data drives that you would like to use for the restores, you can specify those with the -DataDirectory and -LogDirectory parameters. If you do not use these parameters then the command will use the default data and log locations.
It’s possible to reduce the number of checks that are done. If you only want to verify the backup, you can use the -VerifyOnly switch; you can skip the DBCC check by using the -NoCheck switch; and you can leave the test restore databases on the server using the -NoDrop switch
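Pulling those parameters together into one hedged sketch (the parameter names are as they were in the 0.8.x releases, and the paths and database name are examples):
## Important databases only, capped at 500MB, restored to slower drives and left on the server for inspection
Test-DbaLastBackup -SqlServer sql2016n2 -Destination sql2016n1 -Databases FadetoBlack -MaxMB 500 -DataDirectory 'D:\TestRestore\Data' -LogDirectory 'D:\TestRestore\Logs' -NoCheck -NoDrop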
Happy Automating
NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using
Get-Module dbatools
and update it using an Administrator PowerShell session with
Update-Module dbatools
You may find that you get no output from Update-Module because you already have the latest version. If you have not installed the module from the PowerShell Gallery, you can do so with Install-Module dbatools.
In my last post, I explained how to alter an existing job step across many servers. I also had cause to add a T-SQL Job step to a large number of jobs as well. This is how I did it.
As before, I gathered the required jobs using the Get-SqlAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl
This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier, as it uses the Where method
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file.)
To find out what is available for the job step object that we will create below, you can run
$NewStep | Get-Member -MemberType Property
We need to set the name, the parent (The job), the database, the command, the subsystem, the on fail action, on success action and the id for the job step.
I set the command to a variable to make the code easier to read
$Command = "SELECT Name from sys.databases"
The rest of the properties I fill in inside the loop. To find out what the properties can hold, I look at MSDN for the Microsoft.SqlServer.Management.Smo.Agent.JobStep class. The ID property is the number of the job step, starting at 1, so this example will add a new job step that will be the first to run
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1
Once the object has all of the properties all we need to do is create it and alter the job
$NewStep.create()
$Job.Alter()
and putting it all together it looks like this
$Jobs = (Get-SqlAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
$Command = "Select name from sys.databases"
foreach ($Job in $Jobs) {
    $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
    $NewStep.Name = 'A descriptive name for the job step'
    $NewStep.Parent = $Job
    $NewStep.DatabaseName = 'master'
    $NewStep.Command = $Command
    $NewStep.SubSystem = 'TransactSql'
    $NewStep.OnFailAction = 'QuitWithFailure'
    $NewStep.OnSuccessAction = 'GoToNextStep'
    $NewStep.ID = 1
    $NewStep.Create()
    $Job.Alter()
}
Hopefully this will help you if you need to add a T-SQL Job Step to a large number of servers
Happy Automating
I flew to Utrecht last week to present with Chrissy LeMaire and Sander Stad to the joint Dutch SQL and PowerShell User Groups. Whilst I was sat at the airport, I got a phone call from my current client: “We need to change the backup path for all of the servers to a different share. How long will it take you?”
About 5 minutes (PowerShell is very powerful – be careful when following these examples 😉 )
We will use the sqlserver module, so you will need to have installed the latest version of SSMS from https://sqlps.io/dl
This code was run using PowerShell version 5 and will not work on PowerShell version 3 or lower, as it uses the Where method.
Let’s grab all of our jobs on the estate. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file.)
$Jobs = Get-SQLAgentJob -ServerInstance $Servers
Once we have the jobs, we need to iterate only through the ones we need to change. This step could also have been done in the line above. Let’s assume we are using the Ola Hallengren Solution to back up our estate
Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})
Then because I have to target a specific job step I can iterate through those and filter in the same way
foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})
Now all I need to do is replace C:\Backup with C:\MSSQL\Backup (in this example I am using my lab’s backup paths)
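The replacement itself was a screenshot in the original; inside the step loop above it amounts to a sketch like this:
## Swap the old backup path for the new one in the step command, then save the change
$Step.Command = $Step.Command.Replace('C:\Backup', 'C:\MSSQL\Backup')
$Step.Alter()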
In only a few minutes I had altered several hundred instances worth of Ola Hallengren Jobs 🙂
This is one of the many reasons I love PowerShell, it enables me to perform mass changes very quickly and easily. Of course, you need to make sure that you know that what you are changing is what you want to change. I have caused severe issues by altering the SQL alerts frequency to 1 second instead of one hour on an estate!! Although the beauty of PowerShell meant that I was able to change it very quickly once the problem was realised
You can change a lot of settings. If you look at what is available at a job step level
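The original showed this as a screenshot; a sketch, with $Step being one of the job steps from the loop above:
## List the properties you can set on a job step
$Step | Get-Member -MemberType Property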
Happy Automating