SQL VNext sp_configure on Windows and Linux with dbatools

This weekend I set up some SQL vNext virtual machines, two on Windows and one on Linux so that I could test some scenarios and build an availability group.

IMPORTANT NOTE: The names of dbatools commands with a Sql prefix WILL CHANGE in a later release of dbatools. dbatools will use the Dba prefix throughout in the future, as the sqlserver PowerShell module uses the Sql prefix.

I used PowerShell version 5.1.14393.693 and SQL Server vNext CTP 1.3 running on Windows Server 2016 and Ubuntu 16.04 in this blog post

I set up one Windows box with some changes to the default configuration; these are for my lab only:

Min Server Memory to 2Gb
Max Server Memory to 6Gb
Remote Access to 1
Backup Compression to 1
Optimize for Ad Hoc Workloads to 1

I can do this in PowerShell by creating a SQL SMO Server object using Connect-DbaSqlServer from dbatools, altering the Configuration Properties values and calling the Alter() method:

$WinSQl1 = 'SQLvNextN1'
$win1 = Connect-DbaSqlServer -SqlServer $WinSQl1
$win1.Configuration.Properties['DefaultBackupCompression'].ConfigValue = 1
$win1.Configuration.Properties['MinServerMemory'].ConfigValue = 2048
$win1.Configuration.Properties['MaxServerMemory'].ConfigValue = 6144
$win1.Configuration.Properties['RemoteAccess'].ConfigValue = 1
$win1.Configuration.Properties['OptimizeAdhocWorkloads'].ConfigValue = 1
$win1.Configuration.Alter()

I can see the change by using the Get-DbaSpConfigure command from the popular PowerShell module dbatools. I like to output to Out-GridView (alias ogv at the CLI):

$WinSQl1 = 'SQLvNextN1'
Get-DbaSpConfigure -SqlServer $WinSQl1 | ogv

1 SPConfig ogv.PNG

Out-GridView can filter in the top bar and it works very quickly. It is a very useful tool

2 - ogv filter.gif

You can also rearrange the columns and use the Add Criteria button to filter your results

3-spconfig-ogv

You can use Export-SqlSpConfigure to export the configuration to a .sql file, which is useful for DR or documentation purposes. There is also a corresponding Import-SqlSpConfigure command.

$WinSQl1 = 'SQLvNextN1'
$WinConfigPath = 'C:\Temp\Winconfig.sql'
Export-SqlSpConfigure -SqlServer $WinSQl1 -Path $WinConfigPath
notepad $WinConfigPath

4 - Export-SQLSpConfigure.gif

Chrissy LeMaire (blog | twitter) showed a neat trick when we were in Utrecht to compare the configuration of two servers:

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

So when we compare the two Windows SQL nodes we can see that at this point there are differences in the configuration
5-compare-configures


Now we can copy the configuration from SQLvNextN1 to SQLvNextN2 using the Copy-SqlSpConfigure command

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $WinSQl2

6 - Copy Configurations.gif

and we can see that those configurations are now the same for those two servers by comparing them like before.

7 - compare configurations.PNG

We can use Get-DbaSpConfigure with Linux servers as well but we need to use SQL authentication like so (No, I don’t know why I used two v’s!)

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$linuxSpConfigure = Get-DbaSpConfigure -SqlServer $linuxSQL -SqlCredential $cred
$linuxSpConfigure | ogv

and compare the three servers like so

$WinSQl1 = 'SQLvNextN1'
$WinSQl2 = 'SQLvNextN2'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$Win1SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl1
$Win2SPConfigure = Get-DbaSpConfigure -SqlServer $WinSQl2
$linuxSpConfigure = Get-DbaSpConfigure -SqlServer $linuxSQL -SqlCredential $cred
$propcompare = foreach ($prop in $Win1SPConfigure) {
    [pscustomobject]@{
        Config = $prop.DisplayName
        'Windows Node1 setting' = $prop.RunningValue
        'Windows Node2 Setting' = $Win2SPConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
        'Linux Setting' = $linuxSpConfigure | Where DisplayName -eq $prop.DisplayName | Select -ExpandProperty RunningValue
    }
}
$propcompare | ogv

and see the differences for the Linux Server

8 - Compare Linux Configs.PNG

We can export the Linux configuration using Export-SqlSpConfigure.
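As a sketch, the call is the same as for Windows with the credential added (I am assuming -SqlCredential is accepted alongside -Path here, as it is for Get-DbaSpConfigure; check Get-Help Export-SqlSpConfigure on your version):

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
$LinuxConfigPath = 'C:\Temp\Linuxconfig-backup.sql'
Export-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path $LinuxConfigPath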

9 - Export Linux Config.PNG

So now let's copy the configuration changes from the Windows server to the Linux server.
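The call behind the gif below is roughly this (a sketch; I am assuming the destination credential parameter is named -DestinationSqlCredential, so verify with Get-Help Copy-SqlSpConfigure on your version):

$WinSQl1 = 'SQLvNextN1'
$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
Copy-SqlSpConfigure -Source $WinSQl1 -Destination $linuxSQL -DestinationSqlCredential $cred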

10 - Copy Configuration to Linux.gif

You will notice that there was some red text there. This is because there are unsupported features in Linux, as you can see in the SQL Server on Linux Release Notes. The errors from the command are:

Database Mail XPs to 0. Feature may not be supported.
SMO and DMO XPs to 1. Feature may not be supported.
Ole Automation Procedures to 0. Feature may not be supported.
xp_cmdshell to 0. Feature may not be supported.
Ad Hoc Distributed Queries to 0. Feature may not be supported.
Replication XPs to 0. Feature may not be supported.
contained database authentication to 0. Feature may not be supported.
hadoop connectivity to 0. Feature may not be supported.
polybase network encryption to 1. Feature may not be supported.
remote data archive to 0. Feature may not be supported.
allow polybase export to 0. Feature may not be supported.

But if we compare the configurations again, we can see that the settings we wanted have been altered successfully

11 - Compare Linux Configs again.PNG

Let's reset the configuration on the Linux server using Import-SqlSpConfigure and the Linuxconfig-backup.sql file created before any changes.
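A sketch of the import, assuming its parameters mirror Export-SqlSpConfigure (again, Get-Help Import-SqlSpConfigure will confirm the names on your version):

$linuxSQL = 'LinuxvvNext'
$cred = Get-Credential -UserName SA -Message "Linux SQL Auth"
Import-SqlSpConfigure -SqlServer $linuxSQL -SqlCredential $cred -Path 'C:\Temp\Linuxconfig-backup.sql'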

12 - Import backup Linux configuration.gif

Interestingly, this time we only had an error for Database Mail XPs, but as you can see below, the Linux configuration has been reset back to the original values.

13-configuration-comapre-back-to-original

Let's export the configuration from the Windows server to a file and import it onto the Linux server to see what happens.

14 - Import Windows configuration to Linux.gif

Again only an error for Database Mail XPs, and we can see the configurations are again matching.

15 - Final Compare.PNG

Hopefully, this post has been of use in exploring the *SpConfigure* commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export and import it to and from files, and copy it between servers, for both Windows and Linux. Remember, these commands have been tested from SQL 2000 to SQL vNext, so they should work for you across your estate.


Happy Automating!


Populating My DBA Database for Power Bi with PowerShell – Databases

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

db1

db2

There are so many items of data that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post.

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code:

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT.
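Creating and writing to that event log only takes a couple of cmdlets; a sketch (the New-EventLog line needs to run once, in an elevated session, and the message text is just an example):

# Run once, as administrator, to register the log and the source
New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT
# The script can then write entries like this
Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 1 -EntryType Information -Message "DBADatabase update started"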

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report, publish it to powerbi.com and query it with natural language.

I create a function called Catch-Block to save keystrokes, and I put my commands inside a try/catch to make the scripts as robust as possible. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port.
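For reference, a minimal sketch of what a Catch-Block style function might look like (this is my reconstruction, assuming Jason Wasser's Write-Log is already loaded with its usual -Message, -Path and -Level parameters, and that the event log above exists):

Function Catch-Block ($Message) {
    # Log the failure to the text log and the event log, then let the caller carry on
    Write-Log -Message $Message -Level Error -Path $LogFile
    Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 2 -EntryType Error -Message $Message
}

The query that gathers the server names is: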

$Query = @"
SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  WHERE Inactive = 0
    AND NotContactable = 0
"@
$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
$ServerNames = $AlltheServers | Select ServerName,InstanceName,Port

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
    ## $ServerName
    $InstanceName = $ServerName | Select InstanceName -ExpandProperty InstanceName
    $Port = $ServerName | Select Port -ExpandProperty Port
    $ServerName = $ServerName | Select ServerName -ExpandProperty ServerName
    $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

    try
    {
        $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, because the code above creates an empty SMO Server object, with the Name property set to the $Connection variable, if it cannot connect to that server, and it does not error as you might expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property; you could choose any one, but that's the one I have always used. I use an if statement to do this. (This post about Snippets will show you the best way to learn PowerShell code.) The reference I use for exiting a loop in the way that you want is this one. In this case we use continue to carry on iterating the loop.

if (!($srv.Version)) {
    Catch-Block "Failed to Connect to $Connection"
    continue
}

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases, just remove everything after the pipe symbol. Or, if you wish to exclude certain databases from the collection, maybe the database you keep your Change Log table and DBA Team info in, you can do that here as well:

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a single $db and then to select all of the properties so that you can see the values and identify the ones you want. Again, this is a good thing to do when exploring new objects:

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screenshot below that there are 170 properties available to you on a SQL 2014 instance. You can gather any or all of that information, as long as you ensure that you have columns with the correct data types in your table and that your script has the logic to deal with properties that do not exist, although I have had fewer issues with this for the database object than for the server object.

db3
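One way to write that "properties that do not exist" logic is to test for the property before reading it; a sketch (the property name and the fallback value here are just examples):

# Guard against properties that do not exist on older versions of SQL Server
if ($db.PSObject.Properties['AvailabilityGroupName']) {
    $AGName = $db.AvailabilityGroupName
}
else {
    $AGName = 'Not Supported'
}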

You can look for the property that you want by using the Get-Member cmdlet as shown above, or use MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post, by checking the Databases table for an entry for that database, updating the row if it exists and inserting one if it does not.
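Stripped right down, that upsert pattern looks something like this sketch ($InstanceID, $SizeMB and the shortened column lists are placeholders; the real script sets every column in the table):

$Query = @"
SELECT DatabaseID FROM [Info].[Databases]
WHERE InstanceID = $InstanceID AND [Name] = '$Name'
"@
$DatabaseID = (Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query).DatabaseID
if ($DatabaseID) {
    # An entry exists, so update it
    $Update = "UPDATE [Info].[Databases] SET DateChecked = GETDATE(), SizeMB = $SizeMB WHERE DatabaseID = $DatabaseID"
    Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Update
}
else {
    # No entry, so insert a new row for this database
    $Insert = "INSERT INTO [Info].[Databases] (InstanceID, [Name], DateAdded, SizeMB) VALUES ($InstanceID, '$Name', GETDATE(), $SizeMB)"
    Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Insert
}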

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table, as this (in general) needs a human being to understand. When the script runs every night it will pick up new databases, and I add a default value of "Not Entered" to the table, which makes it easier to identify the databases that need this additional work. (This also means that, as a Team Leader, I can monitor that my team are doing this.) It can also be added to any scripts which create new databases for deployment.

Then we need to create some measures and calculated columns for our report. I did this as I realised that I needed it when making the report rather than all up front.

I created two calculated columns for the size of the databases, one for Gb and one for Tb, by clicking on the data icon on the left and then adding a new column

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I created a donut chart to show the size of the databases in Gb by client (and no, my real clients are not rock bands 🙂 ), as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

db4
The other donut chart is number of clients per location (and those are SQL User group locations in the UK and my hometown Bolton)

db5

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

db6

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan I created a colourful slicer for environment and the client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases then this report is correct but I doubt that’s what you want 🙂

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

db7

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69Gb of data in Exeter and it isn’t mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows the data is filtered but it is not shown visually very well as there are over a thousand databases using full recovery model.

db8

If we are asked about the Integration environment, we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 Gb of data. It is also easier to see the compatibility level and recovery model, as the ratios are larger.

db9

Once we have created the report in the way that we want, we can then publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the publish button and entering your powerbi credentials, but if you want your data to refresh automatically (and this is the point of the exercise, to remove manual work) then you will need to install and configure the PowerBi gateway and schedule a refresh. I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation and a pop up will ask you which dashboard you wish to pin it to (Another recent update to Power Bi)

db10

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect but it is getting better all the time

db11

You have to remember to use the names of the columns correctly

db12

But once you have the query correct you can alter it by adding “as a VISUALISATION” and choose the visualisation

db13

db14

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

db15

This doesn’t work quite as you expect

db16

But this does

db17

How about this (and yes it felt wrong to type!)

db18

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

db19

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

db20

Or the DBA team manager might choose this one

db21

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee 🙂

As always if you have any questions or comments please feel free to post them on the blog.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

#tsql2sday #60 – Something New Learned – Problem Step Recorder

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog | twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

This month’s blog party is hosted by Chris Yates (blog | twitter), who asked people to share something newly learned.

I love being a part of the SQL community. It gives me the opportunity to learn as much as I want to about anything I can think of within the data field. In the last couple of months I have presented at the Newcastle User Group and learnt about migrating SQL using PowerShell with Stuart Moore. At our user group in Exeter http://sqlsouthwest.co.uk/ we had Steph Middleton talking about version control for databases, and lightning talks from Pavol Rovensky on Mocking in C#, John Martin on Azure fault domains and availability sets (using a pen and a whiteboard!), Annette Allen on Database Unit Testing and Terry McCann on SQL Certifications. We also had Jonathan Allen talking about some free tools and resources to help manage both large and small SQL environments. I went to SQL Relay in Southampton and saw Stuart Moore (again!), Scott Klein, Alex Yates and James Skipworth, and I joined the PASS DBA fundamentals virtual chapter webinar for Changing Your Habits to Improve the Performance of Your T-SQL by Mickey Stuewe, and that’s only the ‘in-person’ learning that I did. I also read a lot of blog posts!

But instead of repeating what I learnt from others within the community, I thought I would write a blog post that I have been meaning to write for a few weeks, about a solution pre-built into Windows that appears to not be well known: Problem Step Recorder.

What is PSR?

I found out about a little-known tool included in the Windows operating system a couple of months ago which enables you to record what you are doing by taking screenshots of every mouse click. The tool is Step Recorder, also known as PSR. It is included by default in Windows 7, Windows 8 and 8.1, and Windows Server 2008 and above.

What does it do?

Simply put, it records “This is what I did”. There are many situations when this can be useful:

  • You can use this during installations to help create documentation. “This is what I did” when I installed X and now you can follow those steps and I know I haven’t missed anything.
  • You can use it when communicating with 3rd parties or other support teams. “This is what I did” when I got this error and here are all of the steps so that you can re-create the issue and I know that I haven’t missed anything
  • You can use this when resolving high priority incidents. “This is what I did” when System X broke, it includes all of the times of my actions.
    I still keep my notepad by my keyboard out of habit but I have a record of the exact steps that I took to try to resolve the issue which will be very useful for reporting on the incident in the near future and also placing into a Knowledge Base for others to use if it happens again and I know I haven’t missed anything
  • For assisting family members. Like many, I am “The IT guy” and PSR enables me to provide clear instructions with pictures showing exactly where I clicked to those family members who are having trouble with “The internet being broken”

It does this by automatically taking a screenshot after every mouse click or program event, with a timestamp and a description of what happened. It does not record keystrokes though, so if you need to record what you have typed there are some manual steps required.

So how do you access PSR?

Simple. Type “psr” into the run box, cmd or PowerShell and it will open

Untitled picture

Once you click on Start Record it will start recording your clicks and taking screenshots. However, I always open the settings first, by clicking on the drop-down to the left of the help icon, and change the number of recent screen captures to store to the maximum value of 100.

1Untitled picture

If you do not, you will get no warning, but PSR will only save the last 25 screenshots it takes and your results will look like the below. It will still record your actions but not keep the screenshots.


Step 16: (‎09/‎11/‎2014 13:47:45) User left click on “Chris Yates (@YatesSQL) | Twitter (tab item)”

No screenshots were saved for this step.


Step 17: (‎09/‎11/‎2014 13:47:47) User left click on “The SQL Professor | ‘Leadership Through Service’ (text)”

No screenshots were saved for this step.


Step 18: (‎09/‎11/‎2014 13:47:47) User left click on “T-SQL Tuesday #60 – Something New Learned | The SQL Professor (text)” in “T-SQL Tuesday #60 – Something New Learned | The SQL Professor – Google Chrome”

untitled

You can also set the name and location of the saved file in the settings but if you leave it blank it will prompt for a location and name once you click Stop Record
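Both of these settings can also be supplied from the command line if you want to script PSR. The switches below are from memory, so treat this as a sketch and confirm them with psr.exe /? on your machine:

# Start recording with no GUI, save to a named file and keep up to 100 screenshots
psr.exe /start /gui 0 /output "C:\Temp\Recording.zip" /sc 1 /maxsc 100
# ... perform the actions you want recorded ...
psr.exe /stop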

How do I add keyboard input?

PSR allows you to add keyboard input manually. You may need this if you need to include the text you have entered into prompts or address bars, or if you wish to add a further comment. You can do this by clicking Add Comment, drawing a box around the relevant part of the screen for the text input and entering the text into the box.

2Untitled picture

In the results this looks like

Step 1: (‎09/‎11/‎2014 12:56:22) User Comment: “http://www.microsoft.com/en-gb/download/details.aspx?id=42573”

untitled1

What do the results look like?

Once you have finished the actions that you want to record (or when you think you are close to 100 screenshots) click stop record and the following screen will be displayed

3Untitled picture

This allows you to review what PSR has recorded. You can then save it to a location of your desire. It is saved as a zip file which has a single .mht file in it. You can open the file without unzipping the archive and it will open in Internet Explorer. As you can see from the shots below you can run PSR on your client and it will still record actions in your RDP sessions although it does not record as much detail. The first two are on my SCOM server in my lab and the second two are on the laptop using the SCOM console


Step 11: (‎09/‎11/‎2014 13:02:13) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled2


Step 12: (‎09/‎11/‎2014 13:02:16) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled3


Step 13: (‎09/‎11/‎2014 13:06:25) User right click on “Management Packs (tree item)” in “Agent Managed – THEBEARDMANAGEMENTGROUP – Operations Manager”

untitled4


Step 14: (‎09/‎11/‎2014 13:06:27) User left click on “Import Management Packs… (menu item)”

untitled5

You can then use the zip file as you wish. Maybe you email it to your third party support team (once you have edited any confidential data) or you can attach it to your incident in your IT Service Management solution or attach it to a report. If you wish to create documentation you can open the .mht file in Word, edit it as you see fit and save it appropriately.

So that is one of the many things that I have learnt recently and I am looking forward to seeing what others have learnt especially as many will have just been to the SQL PASS Summit. You will be able to find the other posts in this blog party in the comments on Chris’s page

SQL Server Operators and Notifications with Powershell – Strange Enumerate issue fixed by @napalmgram

Alerting of issues across the SQL Server estate is important and recently I needed to audit the operators and the notifications that they were receiving.

I created a SQL Server Object

2013-09-04_125056

One of the important things to remember when investigating SMO is the Get-Member cmdlet. This will show all methods and properties of the object

$server | Get-Member

gave me the JobServer Property

$Server.JobServer|gm

includes the Operator Property

$Server.JobServer.Operators | gm
2013-09-04_125717

has the EnumJobNotifications and EnumNotifications methods

So it was easy to loop through each server in the servers.txt file and enumerate the notifications for each Operator

2013-09-04_130052

and create a simple report
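The code in those screenshots was along these lines (a reconstruction from the description, so treat it as a sketch):

foreach ($SQLServer in (Get-Content .\servers.txt)) {
    $server = New-Object "Microsoft.SqlServer.Management.Smo.Server" $SQLServer
    foreach ($Operator in $server.JobServer.Operators) {
        Write-Output "Operator $($Operator.Name)"
        $Operator.EnumJobNotifications() | Select JobName
        $Operator.EnumNotifications() | Select AlertName
    }
}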

However this does not work, as it does not perform the second enumerate. Try it yourself: switch round the EnumJobNotifications and EnumNotifications methods in that script and see what happens.

So I ended up with two functions

2013-09-04_174005
2013-09-04_173953

and I thought I could do this

2013-09-04_174056

But that doesn’t work

So I tried this

2013-09-04_174112

and that doesn’t work either

Now the reports are coming out showing the correct number of lines but not displaying them. I spent a period of time on my Azure boxes trying to work a way around this. I set the outputs of both enums to variables and noted that they are different types of object.

2013-09-05_113931

Job Notifications are System.Object and Alert Notifications are System.Array

I tried to enumerate through each member of the array and display them, but got too tired to finish. I had contacted my friend Stuart Moore (Twitter | Blog), who had a look and resolved it by simply piping the Enumerates to Format-Table. Thank you, Stuart.

So the final script is as follows

2013-09-05_114601

and the script is

#############################################################################################
#
# NAME: Show-SQLServerOperators.ps1
# AUTHOR: Rob Sewell http://sqldbawithabeard.com
# DATE:03/09/2013
#
# COMMENTS: Load function for Enumerating Operators and Notifications
# ————————————————————————

Function Show-SQLServerOperators ($SQLServer) {
    Write-Output "############### $SQLServer ##########################"
    Write-Output "#####################################################`n"     

    $server = New-Object "Microsoft.SqlServer.Management.Smo.Server" $SQLServer
    $SMO = 'Microsoft.SqlServer.Management.Smo' # used to build the Operator type name below
        
        
    foreach ($Operator in $server.JobServer.Operators) {
        $Operator = New-Object ("$SMO.Agent.Operator") ($server.JobServer, $Operator)

        $OpName = $Operator.Name
        Write-Output "Operator $OpName"
        Write-Output "`n###### Job Notifications   ######"
        $Operator.EnumJobNotifications()| Select JobName | Format-Table
        Write-Output "#####################################################`n"  
        Write-Output "`n###### Alert Notifications  #######"
        $Operator.EnumNotifications() | Select AlertName | Format-Table
        Write-Output "#####################################################`n"  
                 
    }
 
}        
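Loading the function and running it for every server in the servers.txt list is then simple (a usage sketch; dot-source the script first):

# Dot-source the function, then call it for each server in the list
. .\Show-SQLServerOperators.ps1
Get-Content .\servers.txt | ForEach-Object { Show-SQLServerOperators $_ }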

Those Pesky ‘s

Changing Domain Names in a Column

A quick little post for today. Not particularly SQL related but the points at the end are relevant.

I had a task when moving a service to a new development area to change the domain name within columns in several tables from “DOMAIN1\USER” to “DOMAIN2\USER”

In SQL I was able to do this quite easily as follows

USE [DATABASENAME]
GO
-- Declare variables (wide enough to hold the real domain names)
DECLARE @Live nvarchar(50)
DECLARE @Dev nvarchar(50)

-- Set the variables to the Domains
SET @Live = 'Live Domain'
SET @Dev = 'Dev Domain'

-- Update tables (a single batch, so the variables stay in scope)
UPDATE [TABLENAME]
SET [User] = REPLACE([User], @Live, @Dev)

UPDATE [TABLENAME]
SET [Group] = REPLACE([Group], @Live, @Dev)
GO

I also had to do the same for some Oracle databases too and this is where the fun started!

I needed to create the update scripts for documentation for the Oracle databases.

I wanted to create

  update schema.tablename set userid = replace ('DOMAIN1\USER', 'DOMAIN1', 'DOMAIN2') WHERE USERID = 'DOMAIN1\USER';

for each userid in the table. I had trouble with the script I found in our DBA area, as it kept failing with

ORA-00911: invalid character

at the \

as it wouldn’t add the ‘ ‘ around DOMAIN1\USER

Not being an Oracle DBA, but wanting to solve the issue once and for all, I tried a whole host of solutions trying to find the escape character. I asked the Oracle DBAs but they were unable to help. Checking the IT Pros handbook (also known as Google!) made me more confused, but in the end I solved it.

   select 'update schema.table set userid = replace (''' || userid || ''', ''DOMAIN1'', ''DOMAIN2'') WHERE USERID = ''' || USERID || ''';' FROM schema.tablename;

A whole host of ‘s in there!!

I put this in my blog as it is relevant to my situation and an experience I have had that I couldn’t easily solve. Maybe it will help another person searching for the same thing.

It raises some interesting points

The script provided (I use that term loosely; it had the right name and was in the right place to use for this process) had obviously not been run, as it didn’t work, or someone had manually added the ‘s. I wasn’t going to do that for the number of users required.

If it is no good, if it doesn’t do what I expected, or if it is still in development, then mark it as such, so that everyone knows: in the name of the script, in the comments in the script, or by keeping live tested scripts in one place. Whichever method you choose is fine, as long as it is appropriate to your environment and everyone knows about it.

I probably say a dozen times a day to my new colleague

“In case you/I get run over by a bus”

It is all very well being the one who knows everything, but it is pointless if you aren’t there. SPOFs (Single Points of Failure) apply to people as well as hardware.

Enable your service to be supported by preparing proper documentation.

This doesn’t have to be reams of paperwork. It can sometimes be as simple as placing things in a recognised place or a single comment in the script.

I hold my hands up. I am guilty of this too. I have been so busy I haven’t done this as much as I should have over the last few months of last year. I have tried but not done as well as I should have. In my defence, I have spent plenty of time recently rectifying this, which is why this situation was so memorable.

Some links I have read in the past related to this, by people who know more than me.

Documentation It Doesn’t Suck – Brent Ozar

Your Lack Of Documentation is Costing you More than you Think – John Samson

Do You Document Your SQL Server Instances? – Brad McGehee