SQL vNext sp_configure on Windows and Linux with dbatools

This weekend I set up some SQL vNext virtual machines, two on Windows and one on Linux so that I could test some scenarios and build an availability group.

IMPORTANT NOTE: The names of dbatools commands with a Sql prefix WILL CHANGE in a later release of dbatools. dbatools will use Dba throughout in the future, as the sqlserver PowerShell module uses the Sql prefix.

In this blog post I used PowerShell version 5.1.14393.693 and SQL Server vNext CTP 1.3, running on Windows Server 2016 and Ubuntu 16.04.

I set up one Windows box with some changes to the default configuration; these are for my lab only:

Min Server Memory to 2GB
Max Server Memory to 6GB
Remote Admin Connections to 1
Backup Compression to 1
Optimize Ad Hoc Workloads to 1

I can do this in PowerShell by creating an SMO Server object with Connect-DbaSqlServer from dbatools, altering the Configuration Properties values, and then calling the Alter() method:

$WinSQl1 = 'SQLvNextN1'
$win1 = Connect-DbaSqlServer -SqlServer $WinSQl1
$win1.Configuration.Properties['DefaultBackupCompression'].ConfigValue = 1
$win1.Configuration.Properties['MinServerMemory'].ConfigValue = 2048
$win1.Configuration.Properties['MaxServerMemory'].ConfigValue = 6144
$win1.Configuration.Properties['RemoteDacConnectionsEnabled'].ConfigValue = 1
$win1.Configuration.Properties['OptimizeAdhocWorkloads'].ConfigValue = 1
$win1.Configuration.Alter()

I can see the changes by using Get-DbaSpConfigure from the popular PowerShell module dbatools. I like to pipe the output to Out-GridView (alias ogv at the CLI):

$WinSQl1 = 'SQLvNextN1'
Get-DbaSpConfigure -SqlServer $WinSQl1 | ogv

[Screenshot: sp_configure output in Out-GridView]

Out-GridView can filter in the top bar and it works very quickly. It is a very useful tool

[Animation: filtering results in Out-GridView]

You can also rearrange the columns and use the Add Criteria button to filter your results

[Screenshot: sp_configure results filtered with Add Criteria]

You can use Export-SqlSpConfigure to export the configuration to a .sql file, which is useful for DR or documentation purposes. There is also a corresponding Import-SqlSpConfigure command.

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $WinConfigPath
notepad $WinConfigPath

[Animation: Export-SqlSpConfigure writing the configuration to a file]

Chrissy LeMaire showed a neat trick when we were in Utrecht to compare the configuration of two servers:

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

So when we compare the two Windows SQL nodes we can see that at this point there are differences in the configuration.

[Screenshot: configuration comparison of the two Windows nodes]

 

Now we can copy the configuration from SQLvNextN1 to SQLvNextN2 using the Copy-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $WinSQl2

[Animation: Copy-SqlSpConfigure copying the configuration]

and we can see that those configurations are now the same for those two servers by comparing them like before.

[Screenshot: matching configurations after the copy]

We can use Get-DbaSpConfigure with Linux servers as well, but we need to use SQL authentication, like so (no, I don't know why I used two v's!):

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$linuxSpConfigure | ogv

and compare the three servers like so

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$linuxSpConfigure = Get-DbaSpConfigure  -SqlServer $linuxSQL -SqlCredential $cred
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
        'Linux Setting' = $linuxSpConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

and see the differences for the Linux Server

[Screenshot: comparing the Windows and Linux configurations]

We can export the Linux configuration using Export-SqlSpConfigure.
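
A minimal sketch of the call, assuming Export-SqlSpConfigure accepts the same -SqlCredential parameter as Get-DbaSpConfigure (the backup file name is the one referred to further down):

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
# assuming -SqlCredential is supported here as it is on Get-DbaSpConfigure
Export-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path 'C:\Temp\Linuxconfig-backup.sql'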

[Screenshot: exporting the Linux configuration]

So now let's copy the configuration changes from the Windows server to the Linux server.
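
Based on the Windows-to-Windows copy above, the call would look something like this sketch, assuming Copy-SqlSpConfigure takes a -DestinationSqlCredential parameter for the SQL-authenticated Linux instance:

$WinSQl1 = 'SQLvNextN1'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
# -DestinationSqlCredential is an assumption; the Linux instance only accepts SQL authentication here
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $linuxSQL -DestinationSqlCredential $cred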

[Animation: copying the configuration to the Linux server]

You will notice that there was some red text there. This is because there are unsupported features in Linux, as you can see in the SQL Server on Linux Release Notes. The errors from the command are:

Database Mail XPs to 0. Feature may not be supported.
SMO and DMO XPs to 1. Feature may not be supported.
Ole Automation Procedures to 0. Feature may not be supported.
xp_cmdshell to 0. Feature may not be supported.
Ad Hoc Distributed Queries to 0. Feature may not be supported.
Replication XPs to 0. Feature may not be supported.
contained database authentication to 0. Feature may not be supported.
hadoop connectivity to 0. Feature may not be supported.
polybase network encryption to 1. Feature may not be supported.
remote data archive to 0. Feature may not be supported.
allow polybase export to 0. Feature may not be supported.

But if we compare the configurations again, we can see that the settings we wanted have been altered successfully

[Screenshot: comparing the configurations after the copy]

Let's reset the configuration on the Linux server using Import-SqlSpConfigure and the Linuxconfig-backup.sql file created before any changes were made.
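
Assuming Import-SqlSpConfigure mirrors the parameters of the export command, the sketch is:

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
# parameter names assumed to mirror Export-SqlSpConfigure
Import-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path 'C:\Temp\Linuxconfig-backup.sql'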

[Animation: importing the backed-up Linux configuration]

Interestingly, this time we only had an error for Database Mail XPs, but as you can see below, the Linux configuration has been reset back to the original values.

[Screenshot: configuration compared back to the original values]

Let's export the configuration from the Windows server to a file and import it onto the Linux server to see what happens.
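
Reusing the variables from above, a sketch under the same assumptions:

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $WinConfigPath
# import the Windows configuration onto the Linux instance (SQL authentication assumed as before)
Import-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path $WinConfigPath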

[Animation: importing the Windows configuration onto Linux]

Again only an error for Database Mail XPs, and we can see the configurations are again matching.

[Screenshot: final comparison of all three servers]

Hopefully, this post has been of use in exploring the *SpConfigure* commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export and import it to and from files, and copy it between servers, for both Windows and Linux. Remember, these commands have been tested from SQL 2000 to SQL vNext, so they should work for you across your estate.

 

Happy Automating!

 


DBA Database scripts are on GitHub

It started with a tweet from Dusty

[Embedded tweets]

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_WhoIsActive, this fantastic script for logging the results from sp_WhoIsActive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of dbo.InstanceList in my DBA Database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.
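
As a rough illustration of that targeting, here is a minimal sketch of a query against dbo.InstanceList, using columns that appear in the posts below (the Environment filter value is just an example):

$Query = @"
SELECT ServerName, InstanceName, Port
FROM dbo.InstanceList
WHERE Inactive = 0
  AND NotContactable = 0
  AND Environment = 'Development' -- example filter; any InstanceList column works
"@
# $CentralDBAServer holds the instance hosting the DBA Database
$Targets = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database DBADatabase -Query $Query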

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (thank you, Luke) I spent the weekend updating and cleaning up the code. You can now find it on GitHub here.

[Screenshot: the GitHub repository]

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts that call those scripts, along with everything else I use: some of the DBA scripts (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable, and I have spoken to various people who have extended it in interesting ways, so I look forward to hearing about what you do with it.

As always, questions and comments welcome

 

 

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following reports.

[Screenshots: the two finished reports]

You will find so many items of data, though, that I expect you will want to create different reports for your own requirements. You will also want to put the report onto powerbi.com and explore the natural language querying, as I show at the end of this post.

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The SQLInfo table is created using this code

CREATE TABLE [Info].[SQLInfo](
	[SQLInfoID] [int] IDENTITY(1,1) NOT NULL,
	[DateChecked] [datetime] NULL,
	[DateAdded] [datetime] NULL,
	[ServerName] [nvarchar](50) NULL,
	[InstanceName] [nvarchar](50) NULL,
	[SQLVersionString] [nvarchar](100) NULL,
	[SQLVersion] [nvarchar](100) NULL,
	[ServicePack] [nvarchar](3) NULL,
	[Edition] [nvarchar](50) NULL,
	[ServerType] [nvarchar](30) NULL,
	[Collation] [nvarchar](30) NULL,
	[IsHADREnabled] [bit] NULL,
	[SQLServiceAccount] [nvarchar](35) NULL,
	[SQLService] [nvarchar](30) NULL,
	[SQLServiceStartMode] [nvarchar](30) NULL,
	[BAckupDirectory] [nvarchar](256) NULL,
	[BrowserAccount] [nvarchar](50) NULL,
	[BrowserStartMode] [nvarchar](25) NULL,
	[IsSQLClustered] [bit] NULL,
	[ClusterName] [nvarchar](25) NULL,
	[ClusterQuorumstate] [nvarchar](20) NULL,
	[ClusterQuorumType] [nvarchar](30) NULL,
	[C2AuditMode] [nvarchar](30) NULL,
	[CostThresholdForParallelism] [tinyint] NULL,
	[MaxDegreeOfParallelism] [tinyint] NULL,
	[DBMailEnabled] [bit] NULL,
	[DefaultBackupCComp] [bit] NULL,
	[FillFactor] [tinyint] NULL,
	[MaxMem] [int] NULL,
	[MinMem] [int] NULL,
	[RemoteDacEnabled] [bit] NULL,
	[XPCmdShellEnabled] [bit] NULL,
	[CommonCriteriaComplianceEnabled] [bit] NULL,
	[DefaultFile] [nvarchar](100) NULL,
	[DefaultLog] [nvarchar](100) NULL,
	[HADREndpointPort] [int] NULL,
	[ErrorLogPath] [nvarchar](100) NULL,
	[InstallDataDirectory] [nvarchar](100) NULL,
	[InstallSharedDirectory] [nvarchar](100) NULL,
	[IsCaseSensitive] [bit] NULL,
	[IsFullTextInstalled] [bit] NULL,
	[LinkedServer] [nvarchar](max) NULL,
	[LoginMode] [nvarchar](20) NULL,
	[MasterDBLogPath] [nvarchar](100) NULL,
	[MasterDBPath] [nvarchar](100) NULL,
	[NamedPipesEnabled] [bit] NULL,
	[OptimizeAdhocWorkloads] [bit] NULL,
	[InstanceID] [int] NULL,
	[AGListener] [nvarchar](150) NULL,
	[AGs] [nvarchar](150) NULL,
 CONSTRAINT [PK__SQL__50A5926BC7005F29] PRIMARY KEY CLUSTERED 
(
	[SQLInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

ALTER TABLE [Info].[SQLInfo]  WITH CHECK ADD  CONSTRAINT [FK_SQLInfo_InstanceList] FOREIGN KEY([InstanceID])
REFERENCES [dbo].[InstanceList] ([InstanceID])
GO

ALTER TABLE [Info].[SQLInfo] CHECK CONSTRAINT [FK_SQLInfo_InstanceList]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging to a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source of SQLAUTOSCRIPT.
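
The one-off setup from that article boils down to two standard cmdlets, run once in an elevated session:

# Run once, as administrator, on the machine that runs the script
New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT
# write a test entry to confirm the new log works
Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 1 -EntryType Information -Message "Log created"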

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell job step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works, how to create the report and how to publish it to powerbi.com.

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible.

function Catch-Block
{
    param ([string]$Additional)
    $ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
    $Message = " This message came from the Automated Powershell script updating the DBA Database with Server Information"
    $Msg = $Additional + $ErrorMessage + " " + $Message
    Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
    Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an Additional parameter which holds each custom error message, which I write to both the event log and the text log to enable easy troubleshooting, and I include the message from the error by accessing it with $_. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port:

$Query = @"
SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  WHERE Inactive = 0
    AND NotContactable = 0
"@
$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
$ServerNames = $AlltheServers | Select ServerName,InstanceName,Port

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
    $InstanceName = $ServerName | Select InstanceName -ExpandProperty InstanceName
    $Port = $ServerName | Select Port -ExpandProperty Port
    $ServerName = $ServerName | Select ServerName -ExpandProperty ServerName
    $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

    try
    {
        $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, because the code above creates an empty SMO Server object with the Name property set to the $Connection variable if it can't connect to that server, and doesn't error as you may expect.

The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property, you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one; in this case we use continue to carry on iterating the loop:

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv | Get-Member

As you can see from the screenshot below, on my SQL2014 server there are 184 properties. I haven't chosen to gather all of them, only the ones that are of interest to me, our team, or others who request information from our team, such as auditors and project managers.

[Screenshot: Get-Member output showing the Server object properties]

You can choose to use any or all of these properties as long as you ensure you have the columns in your table with the correct data type and that you have the correct knowledge and logic to stop the script from erroring if/when the property is not available. Here is an example

if ($srv.IsHadrEnabled -eq $True)
{
    $IsHADREnabled = $True
    $AGs = $srv.AvailabilityGroups | Select Name -ExpandProperty Name | Out-String
    $Expression = @{Name = 'ListenerPort'; Expression = {$_.Name + ',' + $_.PortNumber}}
    $AGListener = $srv.AvailabilityGroups.AvailabilityGroupListeners | Select $Expression | Select ListenerPort -ExpandProperty ListenerPort
}
else
{
    $IsHADREnabled = $false
    $AGs = 'None'
    $AGListener = 'None'
}
$BackupDirectory = $srv.BackupDirectory

I check if the property IsHADREnabled is true and, if it is, I gather the information about the Availability Group names and the listener port; if it isn't, I set the values to None.

You will find that not all of the properties that you want are at the root of the Server SMO object. If you want your max and min memory values, or you want to know if remote admin connections or xp_cmdshell are enabled, you will need to look at the $srv.Configuration object:

 $MaxMem = $srv.Configuration.MaxServerMemory.ConfigValue
 $MinMem = $srv.Configuration.MinServerMemory.ConfigValue
 $RemoteDacEnabled = $srv.Configuration.RemoteDacConnectionsEnabled.ConfigValue
 $XPCmdShellEnabled = $srv.Configuration.XPCmdShellEnabled.ConfigValue

You can look for the property that you want by using the Get-Member cmdlet as shown above, use MSDN to find it starting from here, or try GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post: checking the SQLInfo table for an entry for that instance, updating the row if it exists and inserting one if it does not.
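
In outline, and trimmed down to a couple of columns for readability, that update-or-insert step looks something like this sketch:

$Exists = Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT SQLInfoID FROM Info.SQLInfo WHERE ServerName = '$ServerName' AND InstanceName = '$InstanceName'"
if ($Exists)
{
    # row already there - update it with the latest values (column list abbreviated)
    $Query = "UPDATE Info.SQLInfo SET DateChecked = GETDATE(), MaxMem = $MaxMem, MinMem = $MinMem WHERE SQLInfoID = $($Exists.SQLInfoID)"
}
else
{
    # first time this instance has been seen - insert a new row
    $Query = "INSERT INTO Info.SQLInfo (DateChecked, DateAdded, ServerName, InstanceName, MaxMem, MinMem) VALUES (GETDATE(), GETDATE(), '$ServerName', '$InstanceName', $MaxMem, $MinMem)"
}
Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query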

There are other uses for gathering this information than just reporting on it. You can target different versions of SQL with different scripts. You can identify values that are outside what is expected and change them. If xp_cmdshell should not be enabled, write the T-SQL to gather the connection strings of all of the servers from the DBADatabase where the SQLInfo table has XPCmdShellEnabled = 1, loop through them exactly as above, change the value of $srv.Configuration.XPCmdShellEnabled.ConfigValue to 0 and then call Alter().
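
Here is a hedged sketch of that xp_cmdshell remediation, combining the queries and SMO patterns already shown (and heeding the warning that follows):

$Query = @"
SELECT IL.ServerName, IL.InstanceName, IL.Port
FROM dbo.InstanceList IL
JOIN Info.SQLInfo SI ON SI.InstanceID = IL.InstanceID
WHERE SI.XPCmdShellEnabled = 1
"@
$Offenders = Invoke-Sqlcmd2 -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
foreach ($Offender in $Offenders)
{
    $Connection = $Offender.ServerName + '\' + $Offender.InstanceName + ',' + $Offender.Port
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Connection
    if (!($srv.Version)) { continue } # same connection check as before
    $srv.Configuration.XPCmdShellEnabled.ConfigValue = 0
    $srv.Configuration.Alter() # apply the change, as in the sp_configure examples above
}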

It is a very powerful way of dynamically targeting your estate if you are looking after many instances and with great power comes great responsibility.

ALWAYS TEST THESE AND ANY SCRIPTS YOU FIND OR SCRIPTS YOU WRITE BEFORE YOU RUN THEM IN YOUR PRODUCTION ENVIRONMENT

Yeah, I shouted, and some people thought it was rude. But it's important; it needs to be repeated and drilled in so that it becomes habitual. You can do great damage to your estate with only a few lines of PowerShell and a DBA Database, so please be very careful and ensure that you have a suitable test subset of servers that you can use for testing.

The other thing we can do is report on the data. With Power Bi we can create self-service reports and dashboards and also make use of the natural language query at powerbi.com, so that when your systems team asks "What are all the servers in X data center?" you can enable them to answer it themselves, or when the compliance officer asks how many SQL 2005 instances you have and which clients they serve, you can give them a dashboard they can query themselves.

This is how I create the two reports you see at the top. I start by connecting to the data source, my DBA Database

[Screenshot: connecting to the DBA Database data source]

And I use this query

SELECT 
	IL.ServerName
	,IL.InstanceName
	  ,IL.Location
	  ,IL.Environment
	  ,IL.Inactive
	  ,IL.NotContactable
	  ,SI.[SQLInfoID]
      ,SI.[DateChecked]
      ,SI.[DateAdded]
      ,SI.[ServerName]
      ,SI.[InstanceName]
      ,SI.[SQLVersionString]
      ,SI.[SQLVersion]
      ,SI.[ServicePack]
      ,SI.[Edition]
      ,SI.[ServerType]
      ,SI.[Collation]
      ,SI.[IsHADREnabled]
      ,SI.[SQLServiceAccount]
      ,SI.[SQLService]
      ,SI.[SQLServiceStartMode]
      ,SI.[BAckupDirectory]
      ,SI.[BrowserAccount]
      ,SI.[BrowserStartMode]
      ,SI.[IsSQLClustered]
      ,SI.[ClusterName]
      ,SI.[ClusterQuorumstate]
      ,SI.[ClusterQuorumType]
      ,SI.[C2AuditMode]
      ,SI.[CostThresholdForParallelism]
      ,SI.[MaxDegreeOfParallelism]
      ,SI.[DBMailEnabled]
      ,SI.[DefaultBackupCComp]
      ,SI.[FillFactor]
      ,SI.[MaxMem]
      ,SI.[MinMem]
      ,SI.[RemoteDacEnabled]
      ,SI.[XPCmdShellEnabled]
      ,SI.[CommonCriteriaComplianceEnabled]
      ,SI.[DefaultFile]
      ,SI.[DefaultLog]
      ,SI.[HADREndpointPort]
      ,SI.[ErrorLogPath]
      ,SI.[InstallDataDirectory]
      ,SI.[InstallSharedDirectory]
      ,SI.[IsCaseSensitive]
      ,SI.[IsFullTextInstalled]
      ,SI.[LinkedServer]
      ,SI.[LoginMode]
      ,SI.[MasterDBLogPath]
      ,SI.[MasterDBPath]
      ,SI.[NamedPipesEnabled]
      ,SI.[OptimizeAdhocWorkloads]
      ,SI.[InstanceID]
      ,SI.[AGListener]
      ,SI.[AGs]
        FROM [DBADatabase].[Info].[SQLInfo] as SI
  JOIN [DBADatabase].[dbo].[InstanceList] as IL
  ON IL.InstanceID =  SI.InstanceID

So that I can easily add any and all of the data to the reports if I choose, or query using it in powerbi.com.

First I created 3 measures.

AG = DISTINCTCOUNT(Query1[AGs])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I click on the Map visual.

[Screenshot: selecting the Map visual]

And drag the Location column to Location and the Instances measure to both Values and Color Saturation.

[Screenshot: the map with Location and the Instances measure applied]

I then click on edit and format the title and change the colours for the data

[Screenshot: the formatted map]

Next I created a heat map for Instances by Edition. The picture shows the details.

[Screenshot: heat map of Instances by Edition]

And a column chart for Instances by Version

[Screenshot: column chart of Instances by Version]

I also add a table showing the number of instances in each location and a slicer for environment.

Even though you have added one slicer, you are able to slice the data by clicking on the charts. If I click on Developer Edition I can quickly see which versions and locations they are in

[Screenshot: the report sliced by Developer Edition]

This works for the map and the column chart as well. This has all been created using live data as a base with all identifying information altered; Bolton is where I was born, the other locations are chosen at random, and all other figures and rollups have also been altered.

[Screenshot: the finished first report]

To create the other report I create two donut charts, for Instances by version and by location, using steps similar to my previous post. I then add some tables for location, edition and xp_cmdshell enabled, as well as some cards showing total numbers of Servers, Instances and Availability Groups, and a slicer for environment, to create a report like this. You can use the donut charts to slice the data as well.

[Screenshot: the second report]

But there are so many different points of information gathered by this script that you get extra value using the natural language query on powerbi.com.

Click Publish and enter your powerbi.com credentials and then log into powerbi.com in a browser and you will see your report and your dataset. (Note, you can easily filter to find your dashboards, reports and data sets)

[Screenshot: the report and dataset on powerbi.com]

Click the plus sign to create a new dashboard and click the pin on any of the objects in your report to pin them to the dashboard

[Screenshot: pinning report objects to a new dashboard]

Then you can view (and share) your dashboard

[Screenshot: the finished dashboard]

Once you have done this you can query your data using natural language. It will cope with spelling mistakes and expects the column names so you may want to think about renaming them in your report by right clicking on them after you get your data.

You can ask it questions and build up information on the fly and alter it as you need it. As a DBA doing this and imagining enabling others to be able to ask these questions whenever they want from a browser and as many times as they like, it was very cool!

[Screenshots: natural language queries against the dashboard on powerbi.com]

Pretty cool, I think you and any of your ‘requestors’ would agree

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Making a Change Log Easier With PowerShell

Having a Change Log is a good thing. A quick and simple place to find out what has changed on a server and when. This can be invaluable when troubleshooting, matching a change to a symptom especially when assessed alongside your performance counter collection. Here is a simple way to make use of a change log and automate it

Create a simple table


USE [MDW]
GO

CREATE TABLE [dbo].[ChangeLog](
 [ChangeID] [int] IDENTITY(1,1) PRIMARY KEY,
 [Date] [datetime] NOT NULL,
 [Server] [varchar](50) NOT NULL,
 [UserName] [nvarchar](50) NOT NULL,
 [Change] [nvarchar](max) NOT NULL
)

GO

You can keep this on a central server or create a database on each server, whichever fits your needs best. You can add other columns if you want your information in a different format

Once you have your table you can create a couple of PowerShell functions to easily and quickly add to and retrieve data from the table. I make use of Invoke-SQLCMD2 in these functions.

This can then be included in any automation tasks that you use to update your environments, whether you are using automated deployment methods for releases or using SCCM to patch your environments, making it easy to update and easy to automate as part of your usual deployment process.

To add a new change

<#
.Synopsis
 A function to add ChangeLog information
.DESCRIPTION
 Load function for adding a change to the changelog table in the MDW database on MDWSERVER.
 Use Get-ChangeLog $Server to see details
 Inputs the username of the account running powershell into the database as the user
REQUIRES Invoke-SQLCMD2
http://sqldbawithabeard.com
.EXAMPLE
 Add-ChangeLog SERVERNAME "Altered AutoGrowth Settings for TempDB to None"

 Adds ServerName UserName and Altered AutoGrowth Settings for TempDB to None to the change log table
#>
Function Add-ChangeLog
{
[CmdletBinding()]
Param(
 [Parameter(Mandatory=$True)]
 [string]$Server,

 [Parameter(Mandatory=$True)]
 [string]$Change
)

$UserName = $env:USERDOMAIN + '\' + $env:USERNAME

$Query = "INSERT INTO [dbo].[ChangeLog]
 ([Date]
 ,[Server]
 ,[UserName]
 ,[Change])
 VALUES
 (GetDate()
 ,'$Server'
 ,'$UserName'
 ,'$Change')
"
Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query $Query -Verbose
}

You can then run

Add-ChangeLog SERVERNAME "Added New Database SuperAppData"

to add the change to the change log

To retrieve the data you can use

<#
.Synopsis
 A function to get ChangeLog information
.DESCRIPTION
 Load function for finding ChangeLog information. Information is selected from the MDW Database on SERVERNAME
REQUIRES Invoke-SQLCMD2
http://sqldbawithabeard.com
.EXAMPLE
 Get-ChangeLog SERVERNAME
#>
Function Get-ChangeLog
{
 [CmdletBinding()]
 Param
 (
 # Server Name Required
 [Parameter(Mandatory=$true)]
 $Server
 )

$a = @{Expression={$_.Date};Label="Date";width=15}, `
@{Expression={$_.Server};Label="Server";width=10},
@{Expression={$_.UserName};Label="UserName";width=20}, `
@{Expression={$_.Change};Label="Change";width=18}

Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query "SELECT * FROM dbo.ChangeLog WHERE Server = '$Server';" -Verbose|Format-table $a -Auto -Wrap

}

and use

Get-ChangeLog SERVERNAME

to find out what changed and when. Happy Automating!

Changing Delay Between Responses for SQL Alerts with PowerShell

So you have read on SQLSkills.com that you should have alerts for severity levels 16 to 24 and for errors 823, 824 and 825, or maybe you have used sp_Blitz and received the Blitz Result: No SQL Server Agent Alerts Configured, and like a good and conscientious DBA you have set them up.

Hopefully you also have Jonathan Allen's blog on your feed, and if you look at his historical posts you will have seen this one, where the lack of a delay in response broke the Exchange Server!

However, sometimes the oft-used delay between responses of 1 minute is too much. Alerts should be actionable, after all; maybe you sync your email every 15 minutes and don't need to see 15 alerts for the same error, or you decide that a certain level of errors requires a lesser response and therefore you only need to know about them every hour or three. Or possibly you want to enforce a certain delay for all servers and want to set up a system to check regularly and enforce your rule.

Whatever the reason, changing the delay between responses for every alert on every server with SSMS could be time-consuming, so (of course) I will use PowerShell to do the job.

To find the alerts I follow the process I use when finding any new property in PowerShell.


$server = 'SERVERNAME'

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server

I know that the Alerts will be found under the JobServer Property


$srv.JobServer.Alerts|Get-Member

Shows me

DelayBetweenResponses   Property   int DelayBetweenResponses {get;set;}

And

 Alter                   Method     void Alter(), void IAlterable.Alter()

So I use both of those as follows


Foreach($Alert in $srv.JobServer.Alerts)
{
    $Alert.DelayBetweenResponses = 600 # This is in seconds
    $Alert.Alter()
}

And place it in a foreach loop for the servers I want to change, as sketched further down. If I only want to change certain alerts I can do so by filtering on Name:


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.Name -eq 'NameOfAlert'})

Or by category


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.CategoryName -eq 'Category Name'})
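
Put together for a set of servers, it might look like this sketch (the server names are placeholders):

$Servers = 'SERVER1','SERVER2','SERVER3' # placeholder server names
foreach ($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    foreach ($Alert in $srv.JobServer.Alerts)
    {
        $Alert.DelayBetweenResponses = 600 # This is in seconds
        $Alert.Alter()
    }
}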

When you have 5 minutes go and look at the results of


$srv.JobServer|Get-Member

And explore and let me know what you find

Generating T-SQL Randomly with PowerShell

I have a lab on my laptop running various servers so that I can problem-solve and learn, and recently I wanted to add several months of data into a database. I had created a stored procedure to take some parameters, perform some logic and insert the data.

To execute the stored procedure in T-SQL I simply run this

EXECUTE [dbo].[usp_Insert_DriveSpace] 'Server1','C','2014-11-05','100','25'

which uses the server name, drive letter, date, capacity and free space to add the data

In my wisdom I decided to create some data that was more 'real-life'. I was interested in storing drive space data and will be learning how to write reports on it. To do this I had pre-populated some tables in the database with 10 server names, each with 5 drives, so for 90 days of data I needed 10*5*90 or 4500 statements.

I wanted to populate this with about 3 months of data, as if it had been gathered every day. I read this post about using CTEs to create sequences and I am sure it can be done this way, but I don't have the T-SQL skills to do so. If someone can (or has) done that, please let me know, as I am trying to improve my T-SQL skills and would be interested in how to approach and solve this problem with T-SQL.

I solved it with PowerShell in this way.

I created an array of Servers and an array of Drives to enable me to iterate through each.

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'

I set the drive capacity for each drive. To make my life slightly easier I standardised my 'servers'.

$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

I needed to create a date. You can use Get-Date to get today's date, and to get dates or times in the future or the past you can use the AddDays() method. You can also add ticks, milliseconds, seconds, minutes, hours, months or years.

(Get-Date).AddDays(1)

I then needed to format the date. This is slightly confusing. If you just use Get-Date to get the current date (and time), then you can use the -Format or -UFormat parameter to format the output:

Get-Date -Format yyyyMMdd
Get-Date -UFormat %Y%m%d

However this does not work once you have used the AddDays() method. You have to use the ToString() method

 $Date = (get-date).AddDays(-7).ToString('yyyy-MM-dd')

To replicate gathering data each day I decided to use a while loop. I set $x to -98, pressed CTRL and J to bring up Snippets, typed w and picked the while loop. You can find out more about snippets in my previous post. I started at -98 so that all the identity keys incremented in a real-life manner, oldest to newest.

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {

I could then use the while loop to generate data for each day and loop through each server and each drive and generate the T-SQL but I wanted more!

I wanted to generate some random numbers for the free space available on each drive, so I used the Get-Random cmdlet. If you are going to use it, make sure you read this post so that you don't get caught by the gotcha. I decided to set the free space for my OS, data and log file drives to somewhere between 3 and 70GB free, as in this imaginary scenario these drives are carefully monitored and the data and log file sizes are under the control of a careful DBA, but still able to go below thresholds.

if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3

I set the TempDB drive to have either 4, 7 or 11GB free so that I can try to colour-code my reports depending on values; if one field only has three values it makes it simpler to verify.

I set the Backup Drive to somewhere between 50 and 0 so that I will hit 0 sometimes!!

Here is the full script. It generated 4500 T-SQL statements in just under 16 seconds

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'
$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {
            if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'D')
            {
            $Free = Get-Random -InputObject 4,7,11
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$DDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'E')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$EDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'F')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$FDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'G')
            {
            $Free = Get-Random -Maximum 50 -Minimum 0
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$GDriveCapacity','$Free'"
            }
        }
    }
    $X++
}

Once it had run I simply copied the output into SSMS and was on my way
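
If you would rather skip the copy and paste, each statement could just as easily be appended to a .sql file instead of written to the host; a small hypothetical tweak along these lines (the output path is mine):

# hypothetical alternative to Write-Host: append each statement to a script file
$Sql = "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"
Add-Content -Path 'C:\Temp\InsertDriveSpace.sql' -Value $Sql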

Emailing Disk Space Alerting With PowerShell

 

A DBA doesn’t want to run out of space on their servers, even in their labs! To avoid this happening I wrote a Powershell script to provide some alerts by email.

This is the script and how I worked my way through the solution. I hope it is of benefit to others.

The script works in the following way

  • Iterates through a list of servers
  • Runs a WMI query to gather disk information
  • If the free space has fallen below a threshold, checks to see if it has emailed before and if not emails a warning
  • Resets if free space has risen above the threshold
  • Logs what it does but manages the space the logs use

As you will have seen before, I use a Servers text file in my scripts. This is a text file with a single server name on each line. You could also use a query against a DBA or MDW database using Invoke-SQLCMD2, whichever is the most suitable for you.
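
For the query approach, a minimal sketch along the lines of the InstanceList queries in the earlier posts (the server, database and table names here are assumptions):

# alternative to the text file: pull the server list from a central database
$Servers = Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database MDW -Query "SELECT ServerName FROM dbo.InstanceList WHERE Inactive = 0" | Select-Object -ExpandProperty ServerName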

$Servers = Get-Content 'PATH\TO\Servers.txt'
foreach($Server in $Servers)
{

The WMI query is a very simple one to gather the disk information. I format the results and place them in variables for reuse

$Disks = Get-WmiObject win32_logicaldisk -ComputerName $Server | Where-Object {$_.drivetype -eq 3}
foreach($Disk in $Disks)
{
    $TotalSpace = [math]::Round(($Disk.Size/1073741824),2) # change to GB and 2 decimal places
    $FreeSpace = [Math]::Round(($Disk.FreeSpace/1073741824),2) # change to GB and 2 decimal places
    $UsedSpace = $TotalSpace - $FreeSpace
    $PercentFree = [Math]::Round((($FreeSpace/$TotalSpace)*100),2) # percentage free to 2 decimal places

Use a bit of logic to check if the free space is below a threshold and see if the email has already been sent.

# Check if percent free below warning level
if ($PercentFree -le $SevereLevel)
{
    # if text file has been created (ie email should already have been sent) do nothing
    if(Test-Path $CheckFileSevere) {}
    # if percent free below warning level and text file does not exist, create text file and email
    else
    {

If it has not, create a uniquely named text file and create the email body using HTML and the values stored in the variables.

New-Item $CheckFileSevere -ItemType File
# Create Email Body
$EmailBody = ''
$EmailBody += " "

and then send it

$Subject = "URGENT Disk Space Alert 1%"
$Body = $EmailBody
$msg = new-object Net.Mail.MailMessage
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$smtp.port = '25'
$msg.From = $From
$msg.Sender = $Sender
$msg.To.Add($To)
$msg.Subject = $Subject
$msg.Body = $Body
$msg.IsBodyHtml = $True
$smtp.Send($msg)

If the free space is above all of the warning levels, check for the existence of the text file and delete it if found, so that the next time the free space falls below the threshold the script will send an email.

if(Test-Path $CheckFile)
{
    Remove-Item $CheckFile -Force
}

To enable logging, create a log file each day:

$Logdate = Get-Date -Format yyyyMMdd
$LogFile = $Location + 'logfile' + $LogDate + '.txt'
# if daily log file does not exist create one
if(!(Test-Path $LogFile))
{
    New-Item $Logfile -ItemType File
}

And write the info to it at each action

$logentrydate = (Get-Date).DateTime
$Log = $logentrydate + ' ' + $ServerName + ' ' + $DriveLetter + ' ' + $VolumeName + ' ' + $PercentFree + ' -- Severe Email Sent'
Add-Content -Value $Log -Path $Logfile

Making sure that you clean up after

# delete any logfiles older than 7 days
Get-ChildItem -Path $Location *logfile* | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-7)} | Remove-Item -Force

I run the script in a PowerShell step in a SQL Agent job every 5 minutes, and now I know when the servers in my lab are running out of space with an email like this:

[Screenshot: the disk space alert email]

You can find the script here