SQL VNext sp_configure on Windows and Linux with dbatools

This weekend I set up some SQL vNext virtual machines, two on Windows and one on Linux so that I could test some scenarios and build an availability group.

IMPORTANT NOTE: The names of dbatools commands with a Sql prefix WILL CHANGE in a later release of dbatools. In future dbatools will use the Dba prefix throughout, as the sqlserver PowerShell module uses the Sql prefix

In this blog post I used PowerShell version 5.1.14393.693 and SQL Server vNext CTP 1.3 running on Windows Server 2016 and Ubuntu 16.04

I set up one Windows box with some changes to the default configuration; these are for my lab only:

Min Server Memory to 2GB
Max Server Memory to 6GB
Remote Admin Connections to 1
Backup Compression to 1
Ad Hoc Distributed Queries to 1

I can do this in PowerShell by creating a SQL SMO Server object using Connect-DbaSqlServer from dbatools, altering the Configuration property values and then calling the Alter() method

$WinSQl1 = 'SQLvNextN1'
$win1 = Connect-DbaSqlServer -SqlServer $WinSQl1
$win1.Configuration.DefaultBackupCompression.ConfigValue = 1
$win1.Configuration.MinServerMemory.ConfigValue = 2048 # values are in MB
$win1.Configuration.MaxServerMemory.ConfigValue = 6144
$win1.Configuration.RemoteDacConnectionsEnabled.ConfigValue = 1 # remote admin connections
$win1.Configuration.AdHocDistributedQueriesEnabled.ConfigValue = 1
$win1.Configuration.Alter()

I can see the changes by using Get-DbaSpConfigure from the popular PowerShell module dbatools. I like to output to Out-GridView (alias ogv at the CLI)

$WinSQl1 = 'SQLvNextN1'
Get-DbaSpConfigure -SqlServer $WinSQl1 | ogv

[Image: SPConfig in Out-GridView]

Out-GridView can filter using the top bar and it works very quickly. It is a very useful tool

[Gif: filtering in Out-GridView]

You can also rearrange the columns and use the Add Criteria button to filter your results

[Image: sp_configure in Out-GridView]

You can use Export-SqlSpConfigure to export the configuration to a .sql file, which is useful for DR or documentation purposes. There is also a corresponding Import-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $winConfigPath
notepad $winConfigPath

[Gif: Export-SqlSpConfigure]

Chrissy LeMaire showed a neat trick when we were in Utrecht to compare the configuration of two servers

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

So when we compare the two Windows SQL nodes we can see that at this point there are differences in the configuration.
[Image: compare configurations]

 

Now we can copy the configuration from SQLvNextN1 to SQLvNextN2 using the Copy-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $WinSQl2

[Gif: Copy Configurations]

and we can see that those configurations are now the same for those two servers by comparing them like before.

[Image: compare configurations]

We can use Get-DbaSpConfigure with Linux servers as well, but we need to use SQL authentication, like so. (No, I don’t know why I used two v’s!)

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$linuxSpConfigure | ogv

and compare the three servers like so

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$linuxSpConfigure = Get-DbaSpConfigure -SqlServer $linuxSQL -SqlCredential $cred
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
        'Linux Setting' = $linuxSpConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

and see the differences for the Linux Server

[Image: Compare Linux Configs]

We can export the Linux configuration using Export-SqlSpConfigure.
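
A sketch of that command, assuming Export-SqlSpConfigure accepts the same -SqlCredential parameter as Get-DbaSpConfigure; this creates the Linuxconfig-backup.sql file that I use later to reset the configuration

$LinuxConfigPath = 'C:\Temp\Linuxconfig-backup.sql'
Export-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path $LinuxConfigPath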

[Image: Export Linux Config]

So now let's copy the configuration changes from the Windows Server to the Linux Server.
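
Something like this should do it (a sketch; I am assuming Copy-SqlSpConfigure takes a -DestinationSqlCredential parameter for the SQL authenticated connection to the Linux instance)

$WinSQl1 = 'SQLvNextN1'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $linuxSQL -DestinationSqlCredential $cred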

[Gif: Copy Configuration to Linux]

You will notice that there was some red text. This is because there are unsupported features in Linux, as you can see in the SQL Server on Linux Release Notes. The errors from the command are:

Database Mail XPs to 0. Feature may not be supported.
SMO and DMO XPs to 1. Feature may not be supported.
Ole Automation Procedures to 0. Feature may not be supported.
xp_cmdshell to 0. Feature may not be supported.
Ad Hoc Distributed Queries to 0. Feature may not be supported.
Replication XPs to 0. Feature may not be supported.
contained database authentication to 0. Feature may not be supported.
hadoop connectivity to 0. Feature may not be supported.
polybase network encryption to 1. Feature may not be supported.
remote data archive to 0. Feature may not be supported.
allow polybase export to 0. Feature may not be supported.

But if we compare the configurations again, we can see that the settings we wanted have been altered successfully

[Image: Compare Linux Configs again]

Let's reset the configuration on the Linux server using Import-SqlSpConfigure and the Linuxconfig-backup.sql file created before any changes.
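
A sketch of that command, reusing the $linuxSQL and $cred variables from above and assuming Import-SqlSpConfigure takes -Path and -SqlCredential parameters to match its siblings

Import-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path 'C:\Temp\Linuxconfig-backup.sql'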

[Gif: Import backup Linux configuration]

Interestingly this time we only had an error for Database Mail XPs, but as you can see below the Linux Configuration has been reset back to the original values.

[Image: configuration compare back to original]

Let's export the configuration from the Windows server to a file and import it onto the Linux server to see what happens.

[Gif: Import Windows configuration to Linux]

Again only an error for Database Mail XPs, and we can see the configurations are again matching.

[Image: Final Compare]

Hopefully, this post has been of use in exploring the *SpConfigure* commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export it to and import it from files, and copy it between servers, for both Windows and Linux. Remember, these commands have been tested from SQL 2000 to SQL vNext, so they should work for you across your estate.

 

Happy Automating!

 

DBA Database scripts are on Github

It started with a tweet from Dusty


The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, a fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList table in my DBA database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing to be installed. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.
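
As an illustration, targeting an environment can be as simple as a query against that table. A minimal sketch, using the Invoke-SQLCMD2 function and the InstanceList columns shown in the posts below (the central server name here is hypothetical):

$Query = "SELECT ServerName, InstanceName, Port FROM dbo.InstanceList WHERE Environment = 'Development'"
$DevInstances = Invoke-Sqlcmd2 -ServerInstance 'CENTRALDBASERVER' -Database 'DBADatabase' -Query $Query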

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here


I have added the DBA Database project, the PowerShell scripts, the Agent Job creation scripts that call those scripts, some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons), the Power Bi files and everything else I use. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome

 

 

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following reports

[Images: the two example reports]

You will find so many items of data that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The SQLInfo table is created using this code

CREATE TABLE [Info].[SQLInfo](
	[SQLInfoID] [int] IDENTITY(1,1) NOT NULL,
	[DateChecked] [datetime] NULL,
	[DateAdded] [datetime] NULL,
	[ServerName] [nvarchar](50) NULL,
	[InstanceName] [nvarchar](50) NULL,
	[SQLVersionString] [nvarchar](100) NULL,
	[SQLVersion] [nvarchar](100) NULL,
	[ServicePack] [nvarchar](3) NULL,
	[Edition] [nvarchar](50) NULL,
	[ServerType] [nvarchar](30) NULL,
	[Collation] [nvarchar](30) NULL,
	[IsHADREnabled] [bit] NULL,
	[SQLServiceAccount] [nvarchar](35) NULL,
	[SQLService] [nvarchar](30) NULL,
	[SQLServiceStartMode] [nvarchar](30) NULL,
	[BAckupDirectory] [nvarchar](256) NULL,
	[BrowserAccount] [nvarchar](50) NULL,
	[BrowserStartMode] [nvarchar](25) NULL,
	[IsSQLClustered] [bit] NULL,
	[ClusterName] [nvarchar](25) NULL,
	[ClusterQuorumstate] [nvarchar](20) NULL,
	[ClusterQuorumType] [nvarchar](30) NULL,
	[C2AuditMode] [nvarchar](30) NULL,
	[CostThresholdForParallelism] [tinyint] NULL,
	[MaxDegreeOfParallelism] [tinyint] NULL,
	[DBMailEnabled] [bit] NULL,
	[DefaultBackupCComp] [bit] NULL,
	[FillFactor] [tinyint] NULL,
	[MaxMem] [int] NULL,
	[MinMem] [int] NULL,
	[RemoteDacEnabled] [bit] NULL,
	[XPCmdShellEnabled] [bit] NULL,
	[CommonCriteriaComplianceEnabled] [bit] NULL,
	[DefaultFile] [nvarchar](100) NULL,
	[DefaultLog] [nvarchar](100) NULL,
	[HADREndpointPort] [int] NULL,
	[ErrorLogPath] [nvarchar](100) NULL,
	[InstallDataDirectory] [nvarchar](100) NULL,
	[InstallSharedDirectory] [nvarchar](100) NULL,
	[IsCaseSensitive] [bit] NULL,
	[IsFullTextInstalled] [bit] NULL,
	[LinkedServer] [nvarchar](max) NULL,
	[LoginMode] [nvarchar](20) NULL,
	[MasterDBLogPath] [nvarchar](100) NULL,
	[MasterDBPath] [nvarchar](100) NULL,
	[NamedPipesEnabled] [bit] NULL,
	[OptimizeAdhocWorkloads] [bit] NULL,
	[InstanceID] [int] NULL,
	[AGListener] [nvarchar](150) NULL,
	[AGs] [nvarchar](150) NULL,
 CONSTRAINT [PK__SQL__50A5926BC7005F29] PRIMARY KEY CLUSTERED 
(
	[SQLInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

ALTER TABLE [Info].[SQLInfo]  WITH CHECK ADD  CONSTRAINT [FK_SQLInfo_InstanceList] FOREIGN KEY([InstanceID])
REFERENCES [dbo].[InstanceList] ([InstanceID])
GO

ALTER TABLE [Info].[SQLInfo] CHECK CONSTRAINT [FK_SQLInfo_InstanceList]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT
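
Creating the log and the source is a one-off task that needs an administrative PowerShell session, using the New-EventLog cmdlet from that post

New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT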

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report and publish it to powerbi.com

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible.

function Catch-Block
{
    param ([string]$Additional)
    $ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
    $Message = " This message came from the Automated Powershell script updating the DBA Database with Server Information"
    $Msg = $Additional + $ErrorMessage + " " + $Message
    Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
    Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an additional parameter which holds each custom error message, which I write to both the event log and the log file to enable easy troubleshooting, and I include the message from the error by accessing it with $_. I won't include the try catch in the examples below. I gather all of the server names from the InstanceList table and set the results into an array variable called $ServerNames holding the server name, instance name and port.

$Query = @"
SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  WHERE Inactive = 0
    AND NotContactable = 0
"@
$AlltheServers = Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
$ServerNames = $AlltheServers | Select ServerName,InstanceName,Port

I then loop through the array, create a $Connection variable for my SMO connection string and connect to the server.

foreach ($ServerName in $ServerNames)
{
    ## $ServerName
    $InstanceName = $ServerName | Select InstanceName -ExpandProperty InstanceName
    $Port = $ServerName | Select Port -ExpandProperty Port
    $ServerName = $ServerName | Select ServerName -ExpandProperty ServerName
    $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

    try
    {
        $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, as the code above creates an empty SMO Server object with the name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property, you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one; in this case we use a continue to carry on iterating the loop.

if (!( $srv.version )){
    Catch-Block " Failed to Connect to $Connection"
    continue
}

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv | Get-Member

As you can see from the screenshot below, on my SQL2014 server there are 184 properties. I haven't chosen to gather all of them, only the ones that are of interest to me, our team or others who request information from our team, such as auditors and project managers.

[Screenshot: output of $srv | Get-Member]

You can choose to use any or all of these properties, as long as you ensure you have the columns in your table with the correct data type and that you have the correct knowledge and logic to stop the script from erroring if/when the property is not available. Here is an example:

if ($srv.IsHadrEnabled -eq $True)
{
    $IsHADREnabled = $True
    $AGs = $srv.AvailabilityGroups | Select Name -ExpandProperty Name | Out-String
    $Expression = @{Name = 'ListenerPort' ; Expression = {$_.Name + ',' + $_.PortNumber }}
    $AGListener = $srv.AvailabilityGroups.AvailabilityGroupListeners | select $Expression | select ListenerPort -ExpandProperty ListenerPort
}
else
{
    $IsHADREnabled = $false
    $AGs = 'None'
    $AGListener = 'None'
}
$BackupDirectory = $srv.BackupDirectory

I check if the IsHadrEnabled property is true. If it is, I gather the information about the Availability Group names and the listener port; if it isn't, I set the values to None.

You will find that not all of the properties that you want are at the root of the Server SMO object. If you want your max and min memory values, or you want to know if remote admin connections or xp_cmdshell are enabled, you will need to look at the $srv.Configuration object

 $MaxMem = $srv.Configuration.MaxServerMemory.ConfigValue
 $MinMem = $srv.Configuration.MinServerMemory.ConfigValue
 $RemoteDacEnabled = $srv.Configuration.RemoteDacConnectionsEnabled.ConfigValue
 $XPCmdShellEnabled = $srv.Configuration.XPCmdShellEnabled.ConfigValue

You can look for the property that you want by using the Get-Member cmdlet as shown above, by using MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post by checking the SQL Info table for an entry for that instance and updating the table if it exists and inserting if it does not.
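
In outline, that pattern looks something like this (a sketch only; the real script builds the full column list from all of the variables gathered above, and $InstanceID here stands in for however you identify the instance)

$Exists = Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT SQLInfoID FROM [Info].[SQLInfo] WHERE InstanceID = '$InstanceID'"
if ($Exists)
{
    # an entry exists for this instance so update it
    $Query = "UPDATE [Info].[SQLInfo] SET DateChecked = GETDATE(), MaxMem = '$MaxMem', MinMem = '$MinMem' WHERE InstanceID = '$InstanceID'"
}
else
{
    # no entry for this instance so insert one
    $Query = "INSERT INTO [Info].[SQLInfo] (DateChecked, DateAdded, InstanceID, MaxMem, MinMem) VALUES (GETDATE(), GETDATE(), '$InstanceID', '$MaxMem', '$MinMem')"
}
Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query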

There are other uses for gathering this information than just reporting on it. You can target different versions of SQL with different scripts. You can identify values that are outside what is expected and change them. If xp_cmdshell should not be enabled, write the T-SQL to gather the connection strings of all of the servers from the DBADatabase where the SQLInfo table has XPCmdShellEnabled = 1, loop through them exactly as above, change the value of $srv.Configuration.XPCmdShellEnabled.ConfigValue to 0 and then call Alter().
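
A sketch of that remediation loop, using only the pieces shown above (test it somewhere safe first!)

$Query = "SELECT IL.ServerName, IL.InstanceName, IL.Port FROM dbo.InstanceList IL JOIN Info.SQLInfo SI ON IL.InstanceID = SI.InstanceID WHERE SI.XPCmdShellEnabled = 1"
$Targets = Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
foreach ($Target in $Targets)
{
    $Connection = $Target.ServerName + '\' + $Target.InstanceName + ',' + $Target.Port
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Connection
    $srv.Configuration.XPCmdShellEnabled.ConfigValue = 0 # disable xp_cmdshell
    $srv.Configuration.Alter() # write the change back to the server
}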

It is a very powerful way of dynamically targeting your estate if you are looking after many instances and with great power comes great responsibility.

ALWAYS TEST THESE AND ANY SCRIPTS YOU FIND OR SCRIPTS YOU WRITE BEFORE YOU RUN THEM IN YOUR PRODUCTION ENVIRONMENT

Yeah, I shouted, and some people thought it was rude. But it's important; it needs to be repeated and drilled in so that it becomes habitual. You can do great damage to your estate with only a few lines of PowerShell and a DBA Database, so please be very careful and ensure that you have a suitable test subset of servers that you can use to test

The other thing we can do is report on the data. With Power Bi we can create self-service reports and dashboards and also make use of the natural language query at powerbi.com, so that when your systems team ask "What are all the servers in X data center?" you can enable them to answer it themselves, or when the compliance officer asks how many SQL 2005 instances we have and which clients they serve, you can give them a dashboard they can query themselves.

This is how I create the two reports you see at the top. I start by connecting to the data source, my DBA Database.

[Image: connecting to the data source]

And I use this query

SELECT 
	IL.ServerName
	,IL.InstanceName
	  ,IL.Location
	  ,IL.Environment
	  ,IL.Inactive
	  ,IL.NotContactable
	  ,SI.[SQLInfoID]
      ,SI.[DateChecked]
      ,SI.[DateAdded]
      ,SI.[ServerName]
      ,SI.[InstanceName]
      ,SI.[SQLVersionString]
      ,SI.[SQLVersion]
      ,SI.[ServicePack]
      ,SI.[Edition]
      ,SI.[ServerType]
      ,SI.[Collation]
      ,SI.[IsHADREnabled]
      ,SI.[SQLServiceAccount]
      ,SI.[SQLService]
      ,SI.[SQLServiceStartMode]
      ,SI.[BAckupDirectory]
      ,SI.[BrowserAccount]
      ,SI.[BrowserStartMode]
      ,SI.[IsSQLClustered]
      ,SI.[ClusterName]
      ,SI.[ClusterQuorumstate]
      ,SI.[ClusterQuorumType]
      ,SI.[C2AuditMode]
      ,SI.[CostThresholdForParallelism]
      ,SI.[MaxDegreeOfParallelism]
      ,SI.[DBMailEnabled]
      ,SI.[DefaultBackupCComp]
      ,SI.[FillFactor]
      ,SI.[MaxMem]
      ,SI.[MinMem]
      ,SI.[RemoteDacEnabled]
      ,SI.[XPCmdShellEnabled]
      ,SI.[CommonCriteriaComplianceEnabled]
      ,SI.[DefaultFile]
      ,SI.[DefaultLog]
      ,SI.[HADREndpointPort]
      ,SI.[ErrorLogPath]
      ,SI.[InstallDataDirectory]
      ,SI.[InstallSharedDirectory]
      ,SI.[IsCaseSensitive]
      ,SI.[IsFullTextInstalled]
      ,SI.[LinkedServer]
      ,SI.[LoginMode]
      ,SI.[MasterDBLogPath]
      ,SI.[MasterDBPath]
      ,SI.[NamedPipesEnabled]
      ,SI.[OptimizeAdhocWorkloads]
      ,SI.[InstanceID]
      ,SI.[AGListener]
      ,SI.[AGs]
        FROM [DBADatabase].[Info].[SQLInfo] as SI
  JOIN [DBADatabase].[dbo].[InstanceList] as IL
  ON IL.InstanceID =  SI.InstanceID

This means I can easily add any and all of the data to the reports if I choose, or query using it in powerbi.com

First I created 3 measures.

AG = DISTINCTCOUNT(Query1[AGs])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I click on map

[Image]

And drag the Location column to Location and the Instances measure to both Values and Color Saturation

[Image]

I then click on edit and format the title and change the colours for the data

[Image]

Next I created a heat map for Instances by Edition. The picture shows the details

[Image]

And a column chart for Instances by Version

[Image]

I also add a table showing the number of instances in each location and a slicer for environment.

Even though you have added one slicer, you are able to slice the data by clicking on the charts. If I click on Developer Edition I can quickly see which versions and locations they are in

[Image]

This works for the map and the column chart as well. This has all been created using live data as a base, with all identifying information altered; Bolton is where I was born, the other locations are chosen at random, and all other figures and rollups have also been altered.

[Image]

To create the other report I create two donut charts, for Instances by Version and by Location, using steps similar to my previous post, and then add some tables for location, edition and xp_cmdshell enabled, as well as some cards showing the total numbers of Servers, Instances and Availability Groups, and a slicer for environment, to create a report like this. You can use the donut charts to slice the data as well.

[Image]

But there are so many different points of information gathered by this script that you get extra value using the natural language query on powerbi.com.

Click Publish and enter your powerbi.com credentials and then log into powerbi.com in a browser and you will see your report and your dataset. (Note, you can easily filter to find your dashboards, reports and data sets)

[Image]

Click the plus sign to create a new dashboard and click the pin on any of the objects in your report to pin them to the dashboard

[Image]

Then you can view (and share) your dashboard

[Image]

Once you have done this you can query your data using natural language. It will cope with spelling mistakes and expects the column names so you may want to think about renaming them in your report by right clicking on them after you get your data.

You can ask it questions and build up information on the fly and alter it as you need it. As a DBA, doing this and imagining enabling others to ask these questions whenever they want from a browser, as many times as they like, was very cool!

[Images: natural language query examples]

Pretty cool, I think you and any of your ‘requestors’ would agree

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Making a Change Log Easier With PowerShell

Having a Change Log is a good thing: a quick and simple place to find out what has changed on a server and when. This can be invaluable when troubleshooting and matching a change to a symptom, especially when assessed alongside your performance counter collection. Here is a simple way to make use of a change log and automate it.

Create a simple table


USE [MDW]
GO

CREATE TABLE [dbo].[ChangeLog](
 [ChangeID] [int] IDENTITY(1,1) PRIMARY KEY,
 [Date] [datetime] NOT NULL,
 [Server] [varchar](50) NOT NULL,
 [UserName] [nvarchar](50) NOT NULL,
 [Change] [nvarchar](max) NOT NULL
)

GO

You can keep this on a central server or create a database on each server, whichever fits your needs best. You can add other columns if you want your information in a different format

Once you have your table you can create a couple of Powershell functions to easily and quickly add to and retrieve data from the table. I make use of Invoke-SQLCMD2 in these functions

This can then be included in any automation tasks that you use to update your environments, whether you are using automated deployment methods for releases or using SCCM to patch your environments, making it part of your usual deployment process and therefore easy to automate.

To add a new change

<#
.Synopsis
 A function to add a ChangeLog information
.DESCRIPTION
 Load function for adding a change to the changelog table in the MDW database on MDWSERVER.
 Use Get-ChangeLog $Server to see details
 Inputs the username of the account running powershell into the database as the user
REQUIRES Invoke-SQLCMD2
http://sqldbawithabeard.com
.EXAMPLE
 Add-ChangeLog SERVERNAME "Altered AutoGrowth Settings for TempDB to None"

 Adds ServerName UserName and Altered AutoGrowth Settings for TempDB to None to the change log table
#>
Function Add-ChangeLog
{
[CmdletBinding()]
Param(
 [Parameter(Mandatory=$True)]
 [string]$Server,

 [Parameter(Mandatory=$True)]
 [string]$Change
)

$UserName = $env:USERDOMAIN + '\' + $env:USERNAME

$Query = "INSERT INTO [dbo].[ChangeLog]
 ([Date]
 ,[Server]
 ,[UserName]
 ,[Change])
 VALUES
 (GetDate()
 ,'$Server'
 ,'$UserName'
 ,'$Change')
"
Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query $Query -Verbose
}

You can then run

Add-ChangeLog SERVERNAME "Added New Database SuperAppData"

to add the change to the change log

To retrieve the data you can use

<#
.Synopsis
 A function to get ChangeLog information
.DESCRIPTION
 Load function for finding ChangeLog information. Information is selected from the MDW Database on SERVERNAME
REQUIRES Invoke-SQLCMD2
http://sqldbawithabeard.com
.EXAMPLE
 Get-ChangeLog SERVERNAME
#>
Function Get-ChangeLog
{
 [CmdletBinding()]
 [OutputType([int])]
 Param
 (
 # Server Name Required
 [Parameter(Mandatory=$true)]
 $Server
 )

$a = @{Expression={$_.Date};Label="Date";width=15}, `
@{Expression={$_.Server};Label="Server";width=10},
@{Expression={$_.UserName};Label="UserName";width=20}, `
@{Expression={$_.Change};Label="Change";width=18}

Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query "SELECT * FROM dbo.ChangeLog WHERE Server = '$Server';" -Verbose|Format-table $a -Auto -Wrap

}

and use

Get-ChangeLog SERVERNAME

to find out what changed and when. Happy Automating!

Changing Delay Between Responses for SQL Alerts with Powershell

So you have read on SQLSkills.com that you should have alerts for severity levels 16 to 24 and errors 823, 824 and 825, or maybe you have used sp_Blitz and received the Blitz Result: No SQL Server Agent Alerts Configured, and like a good and conscientious DBA you have set them up.

Hopefully you also have Jonathan Allen's blog on your feed and, if you have looked at his historical posts, have seen this one where the lack of a delay in response broke the Exchange Server!

However, sometimes the oft-used delay between responses of 1 minute is too much. Alerts should be actionable after all; maybe you sync your email every 15 minutes and don't need to see 15 alerts for the same error, or you decide that a certain level of error requires a lesser response and therefore you only need to know about it every hour or three. Or possibly you want to enforce a certain delay for all servers and want to set up a system to check regularly and enforce your rule.

Whatever the reason, changing the delay between responses for every alert on every server with SSMS could be time-consuming, so (of course) I will use PowerShell to do the job.

To find the alerts I follow the process I use when finding any new property in PowerShell.


$server = 'SERVERNAME'

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server

I know that the Alerts will be found under the JobServer Property


$srv.JobServer.Alerts|Get-Member

Shows me

DelayBetweenResponses   Property   int DelayBetweenResponses {get;set;}

And

 Alter                   Method     void Alter(), void IAlterable.Alter()

So I use both of those as follows


Foreach($Alert in $srv.JobServer.Alerts)

{

$Alert.DelayBetweenResponses = 600 # This is in seconds

$Alert.Alter()

}

And place it in a foreach loop for the servers I want to change; there is a sketch of that outer loop after the filtering examples below. If I only want to change certain alerts I can do so by filtering on Name


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.Name -eq 'NameOfAlert'})

Or by category


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.CategoryName -eq 'Category Name'})
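
For completeness, here is a sketch of the outer loop over the servers, assuming a Servers.txt file with one server name per line as I use elsewhere on this blog

$Servers = Get-Content 'PATH\TO\Servers.txt'
foreach ($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    foreach ($Alert in $srv.JobServer.Alerts)
    {
        $Alert.DelayBetweenResponses = 600 # This is in seconds
        $Alert.Alter()
    }
}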

When you have 5 minutes go and look at the results of


$srv.JobServer|Get-Member

And explore and let me know what you find

Generating T-SQL Randomly with Powershell

I have a lab on my laptop running various servers so that I can problem solve and learn and recently I wanted to add several months of data into a database. I had created a stored procedure to take some parameters perform some logic and insert the data.

To execute the stored procedure in T-SQL I simply run this

EXECUTE [dbo].[usp_Insert_DriveSpace] 'Server1','C','2014-11-05','100','25'

which uses the server name, drive letter, date, capacity and free space to add the data

In my wisdom I decided to create some data that was more 'real-life'. I was interested in storing drive space data and will be learning how to write reports on it. To do this I had pre-populated some tables in the database with 10 server names, each with 5 drives, so I needed 10*5*90 or 4500 statements

I wanted to populate this with about 3 months of data as if it had been gathered every day. I read this post about using CTEs to create sequences and I am sure it can be done this way but I don’t have the T-SQL skills to do so. If someone can (or has) done that please let me know as I am trying to improve my T-SQL skills and would be interested in how to approach and solve this problem with T-SQL

I solved it with Powershell in this way.

I created an array of Servers and an array of Drives to enable me to iterate through each.

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'

Set the drive capacity for each drive. To make my life slightly easier I standardised my ‘servers’

$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

I needed to create a date. You can use Get-Date to get today's date, and to get dates or times in the future or the past you can use the AddDays() method. You can also add ticks, milliseconds, seconds, minutes, hours, months or years

(Get-Date).AddDays(1)

I then needed to format the date. This is slightly confusing: if you just use Get-Date to get the current date (and time) then you can use the -Format or -UFormat switch to format the output

Get-Date -Format yyyyMMdd
Get-Date -UFormat %Y%m%d

However this does not work once you have used the AddDays() method. You have to use the ToString() method

 $Date = (get-date).AddDays(-7).ToString('yyyy-MM-dd')

To replicate gathering data each day I decided to use a while loop. I set $x to -98 and pressed CTRL and J to bring up Snippets, typed w and picked the while loop. You can find out more about snippets in my previous post. I started in the past so that all the identity keys incremented in a real-life manner, oldest to newest.

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {

I could then use the while loop to generate data for each day and loop through each server and each drive and generate the T-SQL but I wanted more!

I wanted to generate some random numbers for the free space available for each drive. I used the Get-Random cmdlet. If you are going to use it, make sure you read this post so that you don't get caught by the gotcha. I decided to set the free space for my OS, Data and Log file drives to somewhere between 3 and 70GB, as in this imaginary scenario these drives are carefully monitored and the data and log file sizes are under the control of a careful DBA, but still able to go below thresholds.

if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3

I set the TempDB drive to have either 4, 7 or 11GB free so that I can try to colour code my reports depending on values; if one field only has three values it makes it simpler to verify.

I set the Backup Drive to somewhere between 50 and 0 so that I will hit 0 sometimes!!

Here is the full script. It generated 4500 T-SQL statements in just under 16 seconds

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'
$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {
            if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'D')
            {
            $Free = Get-Random -InputObject 4,7,11
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$DDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'E')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$EDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'F')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$FDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'G')
            {
            $Free = Get-Random -Maximum 50 -Minimum 0
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$GDriveCapacity','$Free'"
            }
        }
    }
    $X++
}

Once it had run I simply copied the output into SSMS and was on my way
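
If you would rather not copy and paste from the console, you could swap each Write-Host for Write-Output and send the statements straight to a file instead. A sketch, with a hypothetical script name:

# with Write-Host swapped for Write-Output in the script above
.\Generate-DriveSpaceInserts.ps1 | Out-File 'C:\Temp\DriveSpaceInserts.sql'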

Emailing Disk Space Alerting With Powershell

 

A DBA doesn’t want to run out of space on their servers, even in their labs! To avoid this happening I wrote a Powershell script to provide some alerts by email.

This is the script and how I worked my way through the solution. I hope it is of benefit to others.

The script works in the following way

  • Iterates through a list of servers
  • Runs a WMI query to gather disk information
  • If the free space has fallen below a threshold, checks to see if it has emailed before and if not emails a warning
  • Resets if free space has risen above the threshold
  • Logs what it does but manages the space the logs use

As you will have seen before, I use a Servers text file in my scripts. This is a text file with a single server name on each line. You could also use a query against a DBA or MDW database using Invoke-SQLCMD2, whichever is the most suitable for you.
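
A sketch of the query alternative, using InstanceList style column names from my DBA Database posts (the server and database names here are hypothetical; adjust them to your own):

$Query = "SELECT ServerName FROM dbo.InstanceList WHERE Inactive = 0 AND NotContactable = 0"
$Servers = (Invoke-Sqlcmd2 -ServerInstance 'CENTRALSERVER' -Database 'DBADatabase' -Query $Query).ServerName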

$Servers = Get-Content 'PATH\TO\Servers.txt'
foreach($Server in $Servers)
{

The WMI query is a very simple one to gather the disk information. I format the results and place them in variables for reuse

$Disks = Get-WmiObject win32_logicaldisk -ComputerName $Server | Where-Object {$_.drivetype -eq 3}
foreach($Disk in $Disks)
{
    $TotalSpace = [math]::Round(($Disk.Size/1073741824),2) # change to GB and 2 decimal places
    $FreeSpace = [Math]::Round(($Disk.FreeSpace/1073741824),2) # change to GB and 2 decimal places
    $UsedSpace = $TotalSpace - $FreeSpace
    $PercentFree = [Math]::Round((($FreeSpace/$TotalSpace)*100),2) # percent free to 2 decimal places

Use a bit of logic to check if the freespace is below a threshold and see if the email has already been sent

# Check if percent free below warning level
if ($PercentFree -le $SevereLevel)
{
    # if text file has been created (ie email should already have been sent) do nothing
    if(Test-Path $CheckFileSevere) {}
    # if percent free below warning level and text file does not exist create text file and email
    else
    {

If it has not, create a uniquely named text file and build the email body using HTML and the values stored in the variables

New-Item $CheckFileSevere -ItemType File
#Create Email Body
$EmailBody = ''
$EmailBody += " "

and then send it

$Subject = "URGENT Disk Space Alert 1%"
$Body = $EmailBody
$msg = new-object Net.Mail.MailMessage
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$smtp.port = '25'
$msg.From = $From
$msg.Sender = $Sender
$msg.To.Add($To)
$msg.Subject = $Subject
$msg.Body = $Body
$msg.IsBodyHtml = $True
$smtp.Send($msg)

If the free space is above all of the warning levels, check for the existence of the text file and delete it if found, so that the next time the threshold is breached the script will send an email.

if(Test-Path $CheckFile)
{
    Remove-Item $CheckFile -Force

To enable logging create a log file each day

$Logdate = Get-Date -Format yyyyMMdd
$LogFile = $Location + 'logfile' + $LogDate + '.txt'
# if daily log file does not exist create one
if(!(Test-Path $LogFile))
{
    New-Item $Logfile -ItemType File

And write the info to it at each action

$logentrydate = (Get-Date).DateTime
$Log = $logentrydate + ' ' + $ServerName + ' ' + $DriveLetter + ' ' + $VolumeName + ' ' + $PercentFree + ' -- Severe Email Sent'
Add-Content -Value $Log -Path $Logfile

Making sure that you clean up after

# delete any logfiles older than 7 days
Get-ChildItem -Path $Location *logfile* | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-7) } | Remove-Item -Force

I run the script in a PowerShell step in an SQL Agent Job every 5 minutes, and now I know when the servers in my lab are running out of space, with an email like this

[Image: disk space alert email]

You can find the script here

Refreshing A SQL Mirrored Database Using Powershell

 

SQL mirroring is a means of providing high availability for your SQL database. It is available in Standard Edition and, although the feature is deprecated, it is still widely utilised. You can read more about it on MSDN here, and Jes Borland wrote a useful post answering many questions here

There are situations where you may need to refresh these databases. Disaster Recovery is an obvious one but also during development to provide testing or development environments to test your High Availability implementations, run through disaster scenarios, create run books or ensure that the code changes still work with mirroring. There are other scenarios but this post covers the automation of restoring a mirrored database from a backup.

I have mentioned it before and no doubt I shall again: John Sansom wrote a great post about automation and I am a strong follower of that principle.

To refresh a SQL mirror the following steps are required; there are some gotchas that you need to be aware of which I will discuss later:

remove mirroring
restore the principal database from backup
perform a transaction log backup of the principal database
restore both backups on the mirror server with no recovery
recreate mirroring
resolve orphaned users
check mirroring status

Regular blog followers will know that I prefer to use Powershell when I can (and where it is relevant to do so) and so I have used Powershell to automate all of the steps above

The script requires some variables to be set up at the beginning. You can easily change this and make the script into a function and call it if you desire, but for this post I shall consider the script as a standalone. The reasoning for this is that I imagine that it will be placed into a run book or stored for use in a repository for specific use and therefore reduces any pre-requisites for using it.

Set variables as follows, the last three variables set the types for the backup action type and device type and do not need to be altered.

# Set up some variables

$PrincipalServer = '' # Enter Principal Server Name
$MirrorServer = '' # Enter Mirror Server Name
$DBName = '' # Enter Database Name
$FileShare = '' # Enter FileShare with trailing slash
$LocationReplace = $FileShare + $DBName + 'Refresh.bak'
$LocationTran = $FileShare + $DBName + 'formirroring.trn'

$PrincipalEndPoint = 'TCP://SERVERNAME:5022' # Change as required
$MirrorEndpoint = 'TCP://SERVERNAME:5022' # Change as required
$WitnessEndpoint = 'TCP://SERVERNAME:5022' # Change as required

$Full = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Tran = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
$File = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

 

After some error checking the first thing is to create server and database SMO objects.

# Create Server objects
$Principal = New-Object Microsoft.SQLServer.Management.SMO.Server $PrincipalServer
$Mirror = New-Object Microsoft.SQLServer.Management.Smo.server $MirrorServer

#Create Database Objects
$DatabaseMirror = $Mirror.Databases[$DBName]
$DatabasePrincipal = $Principal.Databases[$DBName]

(Added Extra – Use New-IseSnippet to create a SMO Server Snippet and use CTRL + J to find it

New-IseSnippet -Title SMO-Server -Description "Create A SQL Server SMO Object" -Text "`$srv = New-Object Microsoft.SqlServer.Management.Smo.Server `$server"

)

Remove Mirroring

Before we can restore the database we need to remove mirroring

$DatabasePrincipal.ChangeMirroringState([Microsoft.SqlServer.Management.Smo.MirroringOption]::Off)

restore the principal database from backup

Once mirroring has been removed we can restore the database. Stuart Moore's great series provides all the code you need to backup and restore databases with PowerShell. There is however a bug which can catch you out. Here's the code

$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($LocationReplace,$File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.Devices.add($restoredevice)
#Perform Restore
$restore.sqlrestore($PrincipalServer)
$restore.Devices.Remove($restoredevice)

The bug is as follows: if your restore is going to take longer than 10 minutes and you are using a version of SQL earlier than SQL 2012 SP1 CU8, then you will find that the restore fails after 10 minutes. This is the default timeout. You may try to set the

$srv.ConnectionContext.StatementTimeout

value to a larger value or 0, and this will work on SQL 2012 SP1 CU8 and later, but prior to that you will still face the same error. The simple workaround is to use Invoke-SQLCmd2 and to script the restore as follows

#Set up Restore using refresh backup

$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($LocationReplace,$File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.Devices.add($restoredevice)
#Perform Restore
$restore.sqlrestore($PrincipalServer) # if query time < 600 seconds
# $query = $restore.Script($PrincipalServer) # if using Invoke-SQLCMD2
$restore.Devices.Remove($restoredevice)

perform a transaction log backup of the principal database

We need to have a full and a transaction log backup to set up mirroring. Again you may need to use the script method if your backup will take longer than 600 seconds.

#Setup Trans Backup
$Backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$Full = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Tran = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
$File = [Microsoft.SqlServer.Management.Smo.DeviceType]::File
$Backup.Action = $Tran
$Backup.BackupSetDescription = "Log Backup of " + $DBName
$Backup.Database = $DBName
$BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($LocationTran,$File)
$Backup.Devices.Add($BackupDevice)
# Perform Backup
$Backup.SqlBackup($PrincipalServer) # if query time < 600 seconds
# $query = $Backup.Script($PrincipalServer) # if using Invoke-SQLCMD2
$Backup.Devices.Remove($BackupDevice)

# Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used

Restore both backups on the mirror server with no recovery

To complete the mirroring set up we need to restore the backups onto the mirror server with no recovery as follows

#Set up Restore of Full Backup on Mirror Server
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($LocationReplace,$File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.NoRecovery = $true
$restore.Devices.add($restoredevice)
$restore.sqlrestore($MirrorServer) # if query time < 600 seconds
# $query = $restore.Script($MirrorServer) # if using Invoke-SQLCMD2
$restore.Devices.Remove($restoredevice)

# Invoke-Sqlcmd2 -ServerInstance $MirrorServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used

# Set up Restore of Log Backup on Mirror Server
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($LocationTran,$File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.NoRecovery = $true
$restore.Devices.add($restoredevice)
$restore.sqlrestore($MirrorServer)
$restore.Devices.Remove($restoredevice)

Recreate mirroring

You recreate mirroring in the same way as you would if you were using T-SQL: simply add the principal endpoint to the mirror, and the mirror and witness endpoints to the principal.

#Recreate Mirroring
$DatabaseMirror.MirroringPartner = $PrincipalEndPoint
$DatabaseMirror.Alter()
$DatabasePrincipal.MirroringPartner = $MirrorEndpoint
$DatabasePrincipal.MirroringWitness = $WitnessEndpoint
$DatabasePrincipal.Alter()

Resolve orphaned users

You will need to resolve any users and permissions on your destination servers. I do not know a way to do this with PowerShell and would be interested if anyone has found a way to replace the password or the SID on a user object; please contact me if you know.

Many people do this with the sp_help_revlogin stored procedure, which will create the T-SQL for recreating the logins. However, PowerShell cannot read the output in the messages window where the procedure prints the script. If you know that your logins are staying static, then run sp_help_revlogin, store the output in a sql file and call it with Invoke-SQLCmd2

$SQL = '' # Path to File
Invoke-Sqlcmd2 -ServerInstance $Server -Database master -InputFile $SQL

The other option is to set up an SSIS package following this blog post and call it from PowerShell as follows

Invoke-Command -ComputerName $Server -ScriptBlock {DTExec.exe /File "PATHTOPackage.dtsx"}

This requires Powershell Remoting to have been set up on the server which may or may not be available to you in your environment.

IMPORTANT NOTE – The script does not include any methods for resolving orphaned users so you will need to test and then add your own solution to the script.

check mirroring status

Lastly, you want to check that the script has run successfully and that mirroring is synchronised (I am from the UK!!). To do this I check the time and file used for the last database backup using this script

#Check that correct file and backup date used

$query = "SELECT TOP 1 [rs].[destination_database_name] as 'database',
[rs].[restore_date] as 'restoredate',
[bs].[backup_finish_date] as 'backuptime',
[bmf].[physical_device_name] as 'Filename'
FROM msdb..restorehistory rs
INNER JOIN msdb..backupset bs
ON [rs].[backup_set_id] = [bs].[backup_set_id]
INNER JOIN msdb..backupmediafamily bmf
ON [bs].[media_set_id] = [bmf].[media_set_id]
ORDER BY [rs].[restore_date] DESC"

Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database msdb -Query $query | Format-Table -AutoSize -Wrap

and that mirroring has synchronised using the following PowerShell command

$DatabasePrincipal | select Name, MirroringStatus, IsAccessible | Format-Table -AutoSize

Depending on your needs you may add some error checking using the results of the above scripts. As I said at the top of the post, you can turn this script into a function and call it at will, add it to an Agent Job for regular scheduling, or just keep it in a folder ready to be run when required. The choice is yours, but all the usual rules apply. Don't believe anything you read on this blog post, don't run any scripts on production, test before running any scripts, and understand what the code is doing before you run it, or I am not responsible if you break anything

Here is the script

<# 
.NOTES 
    Name: Refresh Mirrored Database
    Author: Rob Sewell  http://sqldbawithabeard.com
    Requires: Invoke-SQLCMD2 (included)
    Version History: 
                    1.2 22/08/2014 
.SYNOPSIS 
    Refreshes a mirrored database
.DESCRIPTION 
    This script will refresh a mirrored database, recreate mirroring and check the status of mirroring.
    Further details on the website
    Requires the variables at the top of the script to be filled in
    IMPORTANT - Orphaned users are not resolved by this script without additions. See blog post for options
#>  
# Load Invoke-SQLCMD2


#Load the assemblies the script requires
[void][reflection.assembly]::LoadWithPartialName( "Microsoft.SqlServer.Management.Common" );
[void][reflection.assembly]::LoadWithPartialName( "Microsoft.SqlServer.SmoEnum" );
[void][reflection.assembly]::LoadWithPartialName( "Microsoft.SqlServer.Smo" );
[void][reflection.assembly]::LoadWithPartialName( "Microsoft.SqlServer.SmoExtended" );
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") 
[System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")|Out-Null

# Set up some variables

$PrincipalServer = '' # Enter Principal Server Name
$MirrorServer = '' # Enter Mirror Server Name
$DBName = '' # Enter Database Name
$FileShare = '' # Enter FileShare with trailing slash
$LocationReplace = $FileShare + $DBName + 'Refresh.bak'
$LocationFUll = $FileShare + $DBName + 'formirroring.bak'
$LocationTran = $FileShare + $DBName + 'formirroring.trn'

$PrincipalEndPoint = 'TCP://SERVERNAME:5022' # Change as required
$MirrorEndpoint = 'TCP://SERVERNAME:5022' # Change as required
$WitnessEndpoint = 'TCP://SERVERNAME:5022' # Change as required

$Full = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Database
$Tran = [Microsoft.SQLServer.Management.SMO.BackupActionType]::Log
$File = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

###################### 
<# 
.SYNOPSIS 
Runs a T-SQL script. 
.DESCRIPTION 
Runs a T-SQL script. Invoke-Sqlcmd2 only returns message output, such as the output of PRINT statements when -verbose parameter is specified 
.INPUTS 
None 
    You cannot pipe objects to Invoke-Sqlcmd2 
.OUTPUTS 
   System.Data.DataTable 
.EXAMPLE 
Invoke-Sqlcmd2 -ServerInstance "MyComputer\MyInstance" -Query "SELECT login_time AS 'StartTime' FROM sysprocesses WHERE spid = 1" 
This example connects to a named instance of the Database Engine on a computer and runs a basic T-SQL query. 
StartTime 
----------- 
2010-08-12 21:21:03.593 
.EXAMPLE 
Invoke-Sqlcmd2 -ServerInstance "MyComputer\MyInstance" -InputFile "C:\MyFolder\tsqlscript.sql" | Out-File -filePath "C:\MyFolder\tsqlscript.rpt" 
This example reads a file containing T-SQL statements, runs the file, and writes the output to another file. 
.EXAMPLE 
Invoke-Sqlcmd2  -ServerInstance "MyComputer\MyInstance" -Query "PRINT 'hello world'" -Verbose 
This example uses the PowerShell -Verbose parameter to return the message output of the PRINT command. 
VERBOSE: hello world 
.NOTES 
Version History 
v1.0   - Chad Miller - Initial release 
v1.1   - Chad Miller - Fixed Issue with connection closing 
v1.2   - Chad Miller - Added inputfile, SQL auth support, connectiontimeout and output message handling. Updated help documentation 
v1.3   - Chad Miller - Added As parameter to control DataSet, DataTable or array of DataRow Output type 
#> 
function Invoke-Sqlcmd2 { 
    [CmdletBinding()] 
    param( 
        [Parameter(Position = 0, Mandatory = $true)] [string]$ServerInstance, 
        [Parameter(Position = 1, Mandatory = $false)] [string]$Database, 
        [Parameter(Position = 2, Mandatory = $false)] [string]$Query, 
        [Parameter(Position = 3, Mandatory = $false)] [string]$Username, 
        [Parameter(Position = 4, Mandatory = $false)] [string]$Password, 
        [Parameter(Position = 5, Mandatory = $false)] [Int32]$QueryTimeout = 600, 
        [Parameter(Position = 6, Mandatory = $false)] [Int32]$ConnectionTimeout = 15, 
        [Parameter(Position = 7, Mandatory = $false)] [ValidateScript( {test-path $_})] [string]$InputFile, 
        [Parameter(Position = 8, Mandatory = $false)] [ValidateSet("DataSet", "DataTable", "DataRow")] [string]$As = "DataRow" 
    ) 
 
    if ($InputFile) { 
        $filePath = $(resolve-path $InputFile).path 
        $Query = [System.IO.File]::ReadAllText("$filePath") 
    } 
 
    $conn = new-object System.Data.SqlClient.SQLConnection 
      
    if ($Username) 
    { $ConnectionString = "Server={0};Database={1};User ID={2};Password={3};Trusted_Connection=False;Connect Timeout={4}" -f $ServerInstance, $Database, $Username, $Password, $ConnectionTimeout } 
    else 
    { $ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance, $Database, $ConnectionTimeout } 
 
    $conn.ConnectionString = $ConnectionString 
     
    #Following EventHandler is used for PRINT and RAISERROR T-SQL statements. Executed when -Verbose parameter specified by caller 
    if ($PSBoundParameters.Verbose) { 
        $conn.FireInfoMessageEventOnUserErrors = $true 
        $handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {Write-Verbose "$($_)"} 
        $conn.add_InfoMessage($handler) 
    } 
     
    $conn.Open() 
    $cmd = new-object system.Data.SqlClient.SqlCommand($Query, $conn) 
    $cmd.CommandTimeout = $QueryTimeout 
    $ds = New-Object system.Data.DataSet 
    $da = New-Object system.Data.SqlClient.SqlDataAdapter($cmd) 
    [void]$da.fill($ds) 
    $conn.Close() 
    switch ($As) { 
        'DataSet' { Write-Output ($ds) } 
        'DataTable' { Write-Output ($ds.Tables) } 
        'DataRow' { Write-Output ($ds.Tables[0]) } 
    } 
 
} #Invoke-Sqlcmd2

# Check for existence of Backup file with correct name
If (!(Test-Path $LocationReplace)) {
    Write-Output " There is no file called " 
    Write-Output $LocationReplace
    Write-Output "Please correct and re-run"
    break
}

# Remove Old Backups
if (Test-Path $locationFull) {
    Remove-Item $LocationFUll -Force
}

if (Test-Path $locationTran) {
    Remove-Item $LocationTran -Force
}

# Create Server objects
$Principal = New-Object Microsoft.SQLServer.Management.SMO.Server $PrincipalServer
$Mirror = New-Object Microsoft.SQLServer.Management.Smo.server $MirrorServer

#Create Database Objects
$DatabaseMirror = $Mirror.Databases[$DBName]
$DatabasePrincipal = $Principal.Databases[$DBName]

# If database is on Mirror server fail it over to Principal
if ($DatabasePrincipal.IsAccessible -eq $False) {
    $DatabaseMirror.ChangeMirroringState([Microsoft.SqlServer.Management.Smo.MirroringOption]::Failover) 
}

# remove mirroring

$DatabasePrincipal.ChangeMirroringState([Microsoft.SqlServer.Management.Smo.MirroringOption]::Off)

# Define backup action and device types used throughout
$Full = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
$Tran = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Log
$File = [Microsoft.SqlServer.Management.Smo.DeviceType]::File

#Set up Restore using refresh backup

$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($LocationReplace, $File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.Devices.Add($restoredevice)
#Perform Restore
$restore.SqlRestore($PrincipalServer) # if query time < 600 seconds
# $query = $restore.Script($PrincipalServer) # if using Invoke-SQLCMD2
$restore.Devices.Remove($restoredevice)

# Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used

# Set up Full Backup
$Backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = $Full
$Backup.BackupSetDescription = "Full Backup of " + $DBName
$Backup.Database = $DatabasePrincipal.Name
$BackupDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($LocationFull, $File)
$Backup.Devices.Add($BackupDevice)
# Perform Backup
$Backup.SqlBackup($PrincipalServer) # if query time < 600 seconds
# $query = $Backup.Script($PrincipalServer) # if using Invoke-SQLCMD2
$Backup.Devices.Remove($BackupDevice)

# Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used

#Set up Trans Backup
$Backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$Backup.Action = $Tran
$Backup.BackupSetDescription = "Log Backup of " + $DBName
$Backup.Database = $DBName
$BackupDevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($LocationTran, $File)
$Backup.Devices.Add($BackupDevice)
# Perform Backup
$Backup.SqlBackup($PrincipalServer) # if query time < 600 seconds
# $query = $Backup.Script($PrincipalServer) # if using Invoke-SQLCMD2
$Backup.Devices.Remove($BackupDevice)

# Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used

#Set up Restore of Full Backup on Mirror Server
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($LocationFull, $File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.NoRecovery = $true
$restore.Devices.Add($restoredevice)
$restore.SqlRestore($MirrorServer) # if query time < 600 seconds
# $query = $restore.Script($MirrorServer) # if using Invoke-SQLCMD2
$restore.Devices.Remove($restoredevice)

# Invoke-Sqlcmd2 -ServerInstance $MirrorServer -Database master -Query $query -ConnectionTimeout 0 # comment out if not used


# Set up Restore of Log Backup on Mirror Server
$restore = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Restore
$restoredevice = New-Object -TypeName Microsoft.SqlServer.Management.Smo.BackupDeviceItem($LocationTran, $File)
$restore.Database = $DBName
$restore.ReplaceDatabase = $True
$restore.NoRecovery = $true
$restore.Devices.Add($restoredevice)
$restore.SqlRestore($MirrorServer)
$restore.Devices.Remove($restoredevice)

#Recreate Mirroring
$DatabaseMirror.MirroringPartner = $PrincipalEndPoint
$DatabaseMirror.Alter()
$DatabasePrincipal.MirroringPartner = $MirrorEndpoint
$DatabasePrincipal.MirroringWitness = $WitnessEndpoint
$DatabasePrincipal.Alter()

# Resolve Orphaned Users if needed

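The orphaned users step is just a placeholder above. As a hedged sketch of one way to fill it in, the snippet below finds SQL users in the refreshed database whose SIDs no longer match a server login and remaps each one to a login of the same name using the Invoke-Sqlcmd2 function defined earlier. It assumes matching logins already exist on the principal server.

# Sketch only - assumes a SQL login with the same name exists for each orphaned user
$orphans = Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database $DBName -Query "
SELECT dp.name
FROM sys.database_principals dp
LEFT JOIN sys.server_principals sp ON dp.sid = sp.sid
WHERE dp.type = 'S' AND sp.sid IS NULL
AND dp.name NOT IN ('guest', 'dbo', 'INFORMATION_SCHEMA', 'sys')"

foreach ($orphan in $orphans) {
    # Remap the database user to the server login with the same name
    $fix = "ALTER USER [$($orphan.name)] WITH LOGIN = [$($orphan.name)];"
    Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database $DBName -Query $fix
}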

#Check that correct file and backup date used

$query = "SELECT TOP 20 [rs].[destination_database_name] as 'database', 
[rs].[restore_date] as 'restoredate', 
[bs].[backup_finish_date] as 'backuptime', 
[bmf].[physical_device_name] as 'Filename'
FROM msdb..restorehistory rs
INNER JOIN msdb..backupset bs
ON [rs].[backup_set_id] = [bs].[backup_set_id]
INNER JOIN msdb..backupmediafamily bmf 
ON [bs].[media_set_id] = [bmf].[media_set_id] 
ORDER BY [rs].[restore_date] DESC"

Invoke-Sqlcmd2 -ServerInstance $PrincipalServer -Database msdb -Query $query |Format-Table -AutoSize -Wrap

$DatabasePrincipal | select Name, MirroringStatus, IsAccessible |Format-Table -AutoSize


You Have To Start Somewhere


The hardest part is looking at the blank page and beginning to type. It’s much easier to go and play with the settings of the site, to look at plugins and other cool things. The only other blog I have written was http://wombatsdojogle.wordpress.com. That was a little easier as there was always ‘something’ that needed to be written about. Whether it was training or route planning or every day on the road, I had material and an obvious thing to write.

This is a little harder for me, so I will begin as follows.

Across different careers, from working in secure units to working for a small family firm doing everything from delivery driving to office work, to working for myself selling things via eBay and at car boot sales, I have always been interested in computers. I was (still am) the fella who could fix and sort things out. Towards the end I was getting paid for it too. I helped small businesses and individuals, set up systems, reinstalled operating systems and dealt with viruses. You know the sort of thing. When things dried up and circumstances changed, meaning I could spend some time away from home, I got a job at an arts university. In a small team my responsibilities ranged from password resets and printer installs to rolling out new PCs and laptops and helping to merge Active Directory domains. I loved it. The travel to and from work was a pain at times but the job was grand, though the pay wasn’t!!

I joined MyWork in a service desk role. Not your typical log-and-flog sort of place, but a 24/7 team responsible for everything from first answering the phone to a significant amount of second-line fixing and routine IT tasks invaluable to the running of MyWork. A couple of years later, after many suggestions of jobs I should apply for and plenty of encouragement from colleagues, I applied for and got the position of Oracle DBA. That didn’t work out quite as expected, and two months later I was asked to move to be a SQL DBA. That was 18 months ago, and I am astonished by how much I have learnt so far and still slightly daunted by the sheer amount there still is to learn.

The reason I was asked to move is that responsibility for the SQL estate had moved to the team and the one SQL DBA was struggling with the sheer amount of work required. He had joined only a few months earlier and found that best practice had not been applied particularly well to SQL, and with more than 700 databases to support he couldn’t keep up.

He and I began to make changes. Permissions for developers were removed: no more sysadmin for developers on live systems. Backups were run seven days a week and checked every day. Service accounts were set up to run the various SQL services per server. Documentation was begun. We started doing all the good things that should be done.

There were arguments and outbursts. Developers took time to understand that we were doing things in the best interests of MyWork, not to annoy them or stop them working. We got things wrong, for sure. We didn’t communicate well with colleagues in other teams at times, but we had the backing of our line management.

Then my colleague left to go to pastures new. I had been a SQL DBA for exactly six months and I was on my own. Then my line manager left, so I had to look after the general maintenance of the Oracle estate as well. There are also some Ingres databases critical to MyWork, and they became my responsibility too.

For a few months I somehow managed to keep everything going without making any major booboos. It was a struggle. I was fighting my lack of knowledge and the sheer amount of work, and running much too hard on caffeine and nicotine. At the end of last year some salvation arrived: first an Oracle DBA joined, then a team lead (also an Oracle DBA) and another SQL DBA. Not the many-years-experienced SQL DBA I had hoped for who could advise and teach me, but I sure am glad he’s here.

In the last few weeks I am beginning to see the benefit of this. No longer on call all the time. Not as much firefighting. Able to plan my day instead of walking in and dealing with whoever or whatever was shouting loudest. I was finally able to go to the local SQL User Group, http://sqlsouthwest.co.uk/, for the first time last month and meet up with some fabulous people.

I decided to start writing a blog about my experience. I hope it will show me how far I have come, how much I have learnt and the way I have done it. It may be of use to people, and hopefully it will increase my interaction with the rest of the SQL community, who are without doubt the most interactive and helpful group of people, mostly without egos.

I have an idea for my next post. It will be about the challenge of checking backups, the time that used to take, and how I resolved it.

The idea for it started with reading a blog post by John Sansom (http://www.johnsansom.com), who can be found on Twitter as @SQLBrit.

The blog post is one of the most read on his blog and is titled

The Best Database Administrators Automate Everything

Here is a quote from that blog entry:


Automate Everything

That’s right, I said everything. Just sit back and take the time to consider this point for a moment. Let it wander around your mind whilst you consider the processes and tasks that you could look to potentially automate. Now eliminate the word potentially from your vocabulary and evaluate how you could automate e-v-e-r-y-t-h-i-n-g that you do.

Even if you believe that there is only a remote possibility that you will need to repeat a given task, just go ahead and automate it anyway! Chances are that when the need to repeat the process comes around again, you will either be under pressure to get it done, or even better have more important Proactive Mode tasks/projects to be getting on with.


I have tried my best to follow this advice. I haven’t always succeeded. Many times I just didn’t have the time to spare to write the automation, even though it would save me time later. Now, with more assistance in my team, I am starting to resolve that.

My interest in PowerShell was piqued when I wanted to organise my photos and a colleague pointed me at a script to sort them into year and month folders. That interest encouraged me to create my favourite automation process, which I will describe next time.


photo credit: emdot via photopin cc