Listing the SQL Server SysAdmins With PowerShell

A very short blog today just to pass on this little script.

I was required to list all of the SysAdmins across a large estate. Obviously I turned to PowerShell :-)

I iterated through my server list collection, created a server SMO object for each server and used the EnumServerRoleMembers method to display all of the sysadmin members.

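The script itself was shown as a screenshot in the original post. A minimal sketch of the approach it describes, assuming a sqlservers.txt file of instance names (the file name and output formatting are my assumptions), might look like this:

# Load the SMO assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

# sqlservers.txt is an assumed list of instance names, one per line
$sqlservers = Get-Content 'sqlservers.txt'

foreach ($sqlserver in $sqlservers)
{
    # Create a server SMO object for each instance
    $srv = New-Object 'Microsoft.SqlServer.Management.Smo.Server' $sqlserver
    Write-Output "$sqlserver sysadmin members:"
    # Change the role name here to enumerate other server roles
    $srv.Roles['sysadmin'].EnumServerRoleMembers()
}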

This will work from SQL 2000 to SQL 2012. You can see how easily you can change the role name in the script to enumerate other server roles.

Another way you could do it is to use this query

SELECT c.name AS Sysadmin_Server_Role_Members
FROM sys.server_principals a
INNER JOIN sys.server_role_members b
ON a.principal_id = b.role_principal_id AND a.type = 'R' AND a.name = 'sysadmin'
INNER JOIN sys.server_principals c
ON b.member_principal_id = c.principal_id

and pass that with Invoke-SQLCMD through to every server (if you had to use PowerShell :-) ). That query won't work with SQL 2000 though.
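A sketch of that alternative, assuming the query above is held in a $query variable and the same sqlservers.txt list of instance names, might be:

# Invoke-Sqlcmd ships with the SQLPS module (SQL Server 2012)
Import-Module SQLPS -DisableNameChecking

foreach ($sqlserver in (Get-Content 'sqlservers.txt'))
{
    # $query holds the sysadmin T-SQL shown above
    Invoke-Sqlcmd -ServerInstance $sqlserver -Database master -Query $query
}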

How I Check Hundreds of SQL Agent Jobs in 60 Seconds with PowerShell

Checking that your Agent Jobs have completed successfully is a vital part of any DBA's responsibility. It is essential to ensure that all of the hard work you have put into setting up the jobs can be quickly and easily checked. In a large estate this can be very time consuming and, if done manually, prone to human error. I have repeatedly mentioned John Sansom's blog post entitled "The Best DBAs Automate Everything" and I follow that advice. Today I will share with you one of the first scripts that I wrote.

When I started as a DBA I was told that my first job every morning was to check the Agent Jobs and resolve any errors, and this is still something I do before anything else. (Except coffee; experience has taught me that you get your coffee before you log into your computer, otherwise on the bad days you can miss out on coffee for many an hour.) I have two scripts to do this. The first sends me an email if the number of failed jobs on a server is greater than zero. This helps me to quickly and simply identify where to start in the case of multiple failures and is also a backup to the second script.
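The first script is not shown in this post; a minimal sketch of that check, where the addresses, SMTP server and instance name are all placeholders, could look like this:

# Count failed, enabled Agent Jobs and send an email if there are any
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$srv = New-Object 'Microsoft.SqlServer.Management.Smo.Server' 'SERVERNAME'
$failed = @($srv.JobServer.Jobs | Where-Object { $_.IsEnabled -and $_.LastRunOutcome -eq 'Failed' }).Count
if ($failed -gt 0)
{
    Send-MailMessage -To 'dba@example.com' -From 'checks@example.com' `
        -Subject "$($srv.Name) has $failed failed Agent Jobs" -SmtpServer 'smtp.example.com'
}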

The second script runs on a different server, creates an Excel worksheet and colour codes it. This makes it very simple to quickly scroll through the sheet and spot any red cells, which designate failed jobs, and also provides a nice, easy to understand method of showing management that on a specific day everything went well (or badly).

As with any PowerShell script which manipulates Office applications, you first need to create an object and add the workbook and worksheet to it. I also set a filename date variable and a date variable for the sheet.
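The set-up code was embedded in the original post; a sketch of it, where the sheet name and date formats are my assumptions, might be:

# Create the Excel COM object and add a workbook and worksheet
$xl = New-Object -ComObject Excel.Application
$wb = $xl.Workbooks.Add()
$ws = $wb.Worksheets.Item(1)

# A date for the file name and a friendlier one for the sheet
$FileNameDate = Get-Date -Format yyyyMMdd
$SheetDate = (Get-Date).ToLongDateString()
$ws.Name = "AgentJobs $FileNameDate"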

 

When you use PowerShell to manipulate Excel you can access individual cells by identifying them by row and column. I use this to create a description for the workbook, as follows.
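Something like the following, where the text and the cell position are illustrative rather than the original's:

$cells = $ws.Cells
# Row 1, Column 3 holds a description at the top of the workbook
$cells.Item(1,3) = "Agent Job Results for $SheetDate"
$cells.Item(1,3).Font.Size = 16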

There are lots of properties that you can play with within Excel. As with anything in PowerShell, the best way to find what you need is to use the Get-Member cmdlet. If you run

($cells.item(1,3)|Get-Member).Count

You will see that there are 185 methods and properties available to you (in Office 2013 on Windows 8.1).

The snippet above creates the following

(screenshot: the workbook description created by the snippet)

As you can see, we are going to colour code the Job Status according to the three available results: Successful, Failed and Unknown. We are also going to colour code the date column to show when the job was last run; this will enable you to easily identify a job whose last run was successful but which didn't kick off last night for some reason.

The next step is a fairly standard loop through the available servers, picking them from a SQLServers text file, a list of the server names (ServerName\Instance if required) that you wish to check. You could also just create an array of server names or pick them from a table with Invoke-SQLCmd, but whichever way you do it you need to be able to iterate through the array and then through the .Jobs collection in the JobServer namespace, as follows
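A sketch of that loop, again assuming the SQLServers.txt file:

# Load SMO and read the list of servers to check
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$sqlservers = Get-Content 'SQLServers.txt'

foreach ($sqlserver in $sqlservers)
{
    $srv = New-Object 'Microsoft.SqlServer.Management.Smo.Server' $sqlserver
    # Iterate the .Jobs collection in the JobServer namespace
    foreach ($Job in $srv.JobServer.Jobs)
    {
        # Write a row for this job (see the colour-coding sketch below)
    }
}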

 

 

What the script then does is use the following properties of the $Job object and write the Excel file according to the logic in the description (a sketch of that logic follows the list)

$Job.Name
$Job.IsEnabled
$Job.LastRunOutcome
$Job.LastRunDate
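A sketch of that colour-coding, run inside the job loop above; the ColorIndex values and cell positions are my choices, not necessarily the original script's:

# LastRunOutcome is an SMO CompletionResult: Succeeded, Failed, Unknown etc.
$cells.Item($row, $col) = $Job.LastRunOutcome.ToString()
switch ($Job.LastRunOutcome.ToString())
{
    'Succeeded' { $cells.Item($row, $col).Interior.ColorIndex = 4 }  # green
    'Failed'    { $cells.Item($row, $col).Interior.ColorIndex = 3 }  # red
    'Unknown'   { $cells.Item($row, $col).Interior.ColorIndex = 15 } # grey
}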

To finish up, save the workbook to a share available to all of the DBA team and quit Excel. Notice that I use a double whammy to make sure Excel is really gone: first I quit the COM object and then I stop the process. I do this because I found that on my server quitting the COM object left the Excel process running, and I ended up with dozens and dozens of them. If you have Excel open before you run this script, either comment out the last line or save your work (you should save your work anyway, regularly!)
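The tail of the script, with a placeholder share path, would then be something like:

# Save to the DBA team share, quit the COM object, then kill any
# lingering Excel process (the double whammy described above)
$xl.DisplayAlerts = $false
$wb.SaveAs("\\Share\DBA\AgentJobResults$FileNameDate.xlsx")
$xl.Quit()
Stop-Process -Name *excel*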

As always I take no responsibility for your environment, that’s your Job! Don’t run this on Production unless you know what it is doing and are happy that you have first tested it somewhere safely away from any important systems. Make sure that you understand the correct time to run this job and have qualified the impact on the box it is running on.

Here is a screen shot of the finished Excel Sheet

(screenshot: the finished colour-coded workbook)

As you can see, the Data Transfer job needs investigation! The reason I add two yellow rows above and below each server's list of jobs is to help me identify any server that is not responding, as that will be easily recognised as two lots of yellow with nothing between them.

I have considered improving this script by inputting the data into a database and running a report from that database but have not had the need to do so yet.

Enjoy this script; you can find it here

 

If you have any questions please use the comment box below

 

 

Rationalisation of Database with PowerShell and T-SQL part two

In the previous post I showed the script to create a colour-coded Excel workbook showing the last used date for all of the databases on the servers in my sqlservers.txt file. After gathering that information over several months, someone then has to make a decision as to which databases can be removed.

Obviously there will be some databases that are read-only, or which, while not set specifically as read-only, may only be used for reference without data being added. You should hopefully have knowledge of these databases and be able to take them off the list quickly.

There are other challenges for a DBA to overcome prior to any action. Many questions need to be answered such as

Who owns the database?
Who is the service owner responsible for the service/application in use by the database?
Even though they may be the service owner who will ultimately sign off permission to remove the database, are they aware of how important it is to their people? Or of what times of the year it is important to them?
You may find test and development databases that have not been used for months but will they be required next week?
Is it important enough for them to take the time to give the permission?

And plenty more… Add some in the comments below.

Our primary responsibility is the data. We need to be able to ensure that the data is safe and can be made available quickly and easily. In this situation we need to have a valid backup and a quick and easy method of restoring it. I chose to solve this by creating a T-SQL script which will:

take a final backup of the database to a rationalisation folder
verify that backup
create an Agent Job that will restore the database from that backup

The reasoning for these steps is best explained by watching this video, and yes, I always perform the last step too :-)

I could have used PowerShell to do this by examining the SMO for the Server and the JobServer, but this time I decided to challenge myself by writing it in T-SQL, as I am weaker in that area. The script below is the result of that work. It works for me. I expect that there are other ways of doing this, so please feel free to point out any errors or suggestions. That is how I learn. Hopefully these posts will be of use to other DBAs like myself.

As always with anything you read on the internet: validate and test. This script works for me on SQL Server 2005, 2008, 2008 R2 and 2012, but if you are thinking of running it in your own production environment – DON'T.

Well, not until you have tested it somewhere safe first :-)

The first challenge I encountered was that I wanted to be able to run the script, and perform all of these steps, by changing only the name of the database. That will also lead on to a stored procedure, and then I can automate more of this process and schedule it at times to suit the database servers as well. I accomplished this by using a temp table and populating it with the variables I will need, as shown below

-- Drop temp table if it exists
IF OBJECT_ID('tempdb..#vars') IS NOT NULL
DROP TABLE #vars

-- Create table to hold global variables
create table #vars (DBName nvarchar(50), PATH nvarchar(300),DataName nvarchar(50),LogName nvarchar (50),DataLoc nvarchar (256),LogLoc nvarchar (256))
insert into #vars (DBName) values ('DATABASENAME')

-- Declare and set variables

DECLARE @DBName nvarchar(50)
Set @DBName = (SELECT DBName FROM #vars)

DECLARE @PATH nvarchar(300)
Set @Path = (SELECT 'PATH TO RATIONALISATION FOLDER WITH TRAILING SLASH' + @DBName + '_LastGolden_' + convert(varchar(50),GetDate(),112) + '.bak' )

DECLARE @DataName nvarchar(50)
Set @DataName = (SELECT f.name
FROM sys.master_files F
join sys.databases D
on d.database_id = f.database_id
WHERE F.type = 0
AND d.Name = @DBNAME)

-- Print @DataName

DECLARE @LogName nvarchar (50)
Set @LogName = (SELECT f.name
FROM sys.master_files F
join sys.databases D
on d.database_id = f.database_id
WHERE F.type = 1
AND d.Name = @DBNAME)

-- PRINT @LogName

Declare @DataLoc nvarchar (256)
Set @DataLoc = (SELECT f.physical_name
FROM sys.master_files F
join sys.databases D
on d.database_id = f.database_id
WHERE F.type = 0
AND d.Name = @DBNAME)

--Print @DataLoc

Declare @LogLoc nvarchar (256)
Set @LogLoc = (SELECT f.physical_name
FROM sys.master_files F
join sys.databases D
on d.database_id = f.database_id
WHERE F.type = 1
AND d.Name = @DBNAME)

--Print @LogLoc

update #vars Set PATH = @PATH
update #vars Set DataName = @DataName
update #vars Set LogName = @LogName
update #vars Set DataLoc = @DataLoc
update #vars Set LogLoc = @LogLoc

-- Select * from #vars

I then use the variables throughout the script by selecting them from the temp table as follows

DECLARE @DBName nvarchar(50)
Set @DBName = (Select DBName from #vars)

And using the variables to create and execute the T-SQL for each of the steps above.

It is pointless to move on to the next step if the previous one has failed, so I created some error handling as follows

if @@error != 0 raiserror('Rationalisation Script failed at Verify Restore', 20, -1) with log
GO

I created the T-SQL for the Agent Job by first creating the restore script and adding it to a variable, and then right-clicking a previously created restore database job and using the Script Job as CREATE To New Query Editor Window command

It was then a case of adding single quotes and reading through the code until it would successfully run.

You can find the script here

The process I have used is to change the database name in the script, run it, then run the Agent Job and check that the database has been created. Then and only then can I drop the database and disable any jobs for the database. Yes, that was the last step in the video :-) As Grant says, "a file is just a file, a backup is a restored database".

Using this script you can reduce the footprint and load on your servers by removing unneeded or unused databases, whilst still guaranteeing that should there be a requirement for them you KNOW you can easily restore them. You will still need to take some additional steps, like adding a step to the Agent Job to recreate any users and any other jobs that the database needs, but that is more specific to your environment and you will be best placed to achieve this.

Rationalisation of Database with PowerShell and T-SQL part one

I have recently been involved in a project to rationalise databases. It is easy in a large organisation for database numbers to rapidly increase and sometimes the DBA may not be aware of or be able to control the rise if they don’t have knowledge of all of the database servers on the estate.

There are lots of benefits of rationalisation to the business. Reduced CPU usage = reduced heat released = lower air-con bill for the server room, and less storage used = quicker backups and fewer tapes used, or better still less requirement for that expensive new SAN. You may be able to consolidate data and provide one version of the truth for the business as well. Removing servers can release licensing costs which could then be diverted elsewhere or pay for other improvements.

William Durkin (b | t) presented to the SQL South West User Group about this and will be doing the session at SQL Saturday in Exeter in March 2014. Please check out his session for a more detailed view.

I needed to be able to identify databases that could possibly be deleted and realised that an easy way to achieve this would be to use a script to check for usage of the database.

No need to reinvent the wheel, so I went to Aaron Bertrand's blog http://sqlblog.com/blogs/aaron_bertrand/archive/2008/05/06/when-was-my-database-table-last-accessed.aspx and used his script. Instead of using an audit file I decided to use PowerShell so that I could output the results to Excel and colour code them. This made it easier to check the results and also easier to show to managers and service owners.


##############################################################################################
# NAME: lastdbusage.ps1
# AUTHOR: Rob Sewell
# http://sqldbawithabeard.com
# DATE: 19/10/2013
#
# COMMENTS: Fill Excel WorkBook with details of last access times for each database
#
# NOTES : Does NOT work with SQL 2000 boxes
##############################################################################################

# Set SQL Query
$query = "WITH agg AS
(
SELECT
max(last_user_seek) last_user_seek,
max(last_user_scan) last_user_scan,
max(last_user_lookup) last_user_lookup,
max(last_user_update) last_user_update,
sd.name dbname
FROM
sys.dm_db_index_usage_stats, master..sysdatabases sd
WHERE
sd.name not in('master','tempdb','model','msdb')
AND
database_id = sd.dbid group by sd.name
)
SELECT
dbname,
last_read = MAX(last_read),
last_write = MAX(last_write)
FROM
(
SELECT dbname, last_user_seek, NULL FROM agg
UNION ALL
SELECT dbname, last_user_scan, NULL FROM agg
UNION ALL
SELECT dbname, last_user_lookup, NULL FROM agg
UNION ALL
SELECT dbname, NULL, last_user_update FROM agg
) AS x (dbname, last_read, last_write)
GROUP BY
dbname
ORDER BY 1;"

#Open Excel
$xl = new-object -comobject excel.application
$wb = $xl.Workbooks.Add()

# Load SMO extension
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null;

# Get List of sql servers to check
$sqlservers = Get-Content 'D:\SkyDrive\Documents\Scripts\Powershell Scripts\sqlservers.txt'

# Loop through each sql server from sqlservers.txt
foreach($sqlserver in $sqlservers)
{
# Get the time SQL was restarted
$svr = New-Object 'Microsoft.SqlServer.Management.Smo.Server' $SQLServer
$db = $svr.Databases['TempDB']
$CreateDate = $db.CreateDate

#Run Query against SQL Server
$Results = Invoke-Sqlcmd -ServerInstance $sqlServer -Query $query -Database master
# Add a new sheet
$ws = $wb.Worksheets.Add()
$name = "$sqlserver"
# Name the Sheet
$ws.name = $Name
$cells=$ws.Cells
$xl.Visible = $true
#define some variables to control navigation
$row = 2
$col = 2
$cells.item($row,$col)=$SQLServer + ' Was Rebooted at ' + $CreateDate
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$row=3
$col=2
# Set some titles
$cells.item($row,$col)="Server"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)="Database"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 40
$col++
$cells.item($row,$col)="Last Read"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 20
$col++
$cells.item($row,$col)="Last Write"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 20
$col++

foreach($result in $results)
{
# Check if value is NULL
$DBNull = [System.DBNull]::Value
$LastRead = $Result.last_read
$LastWrite = $Result.last_write

$row++
$col=2
$cells.item($Row,$col)=$sqlserver
$col++
$cells.item($Row,$col)=$Result.dbname
$col++
if($LastRead -eq $DBNull)
{
$LastRead = "Not Since Last Reboot"
$colour = "46"
$cells.item($Row,$col).Interior.ColorIndex = $colour
$cells.item($Row,$col)= $LastRead
}
else
{
$cells.item($Row,$col)= $LastRead
}
$col++
if($LastWrite -eq $DBNull)
{
$LastWrite = "Not Since Last Reboot"
$colour = "46"
$cells.item($Row,$col).Interior.ColorIndex = $colour
$cells.item($Row,$col)= $LastWrite
}
else
{
$cells.item($Row,$col)= $LastWrite
}
}
}

$xl.DisplayAlerts = $false

$wb.Saveas("D:\DatabaseLastAccessTimeFeb2014.xlsx")
$xl.quit()
Stop-Process -Name *excel*

What the script does is place the query in a variable, get the contents of the SQL Server text file holding all my known SQL Servers, and run the query against each of them, storing the results in a variable. It then creates an Excel workbook and a new sheet for each server, and populates the sheet, including a bit of colour formatting, before saving it. The results look like this

(screenshot: the colour-coded usage workbook)

The tricky bit was understanding how to match the NULL result from the query. This was done by assigning [System.DBNull]::Value to a variable and comparing against that.

Of course these stats are reset when SQL Server restarts, so I also included the SQL Server restart time, using the CreateDate property of TempDB. I gathered these stats for a few months before starting any rationalisation.

My next post will be about the next step in the process. You can get the script here

Installing Windows Server 2003 on Hyper-V

A quick post, as much for next time (if there is one) in case I forget.

I created a new Virtual Machine in Hyper-V and installed Windows Server 2003 Standard, but after the reboot I had no network connection despite setting it up to connect to my internal network.

I tried re-adding the network adapter in Hyper-V and rebooting, which didn't work, so I checked Device Manager, which displayed No Device.

To fix this, I downloaded and installed Windows Server 2003 SP2 and then installed Integration Services from the Action Menu of the Hyper-V Virtual Machine Connection.

Then, after a reboot, I had a network adapter.

Why You Should Visit the Sponsors at #SQLSatExeter and Other Community Events

 

 

SQL Saturdays and other community events rely on sponsors. You know that you will often be entered into a raffle for a prize in exchange for your contact details, and that there will be freebies of various types from many vendors, but there is more that you can get from visiting the sponsors.

FREE COFFEE

At SQL Saturday Exeter on the 22nd March 2014 we are putting free coffee amongst the sponsors. Yes, it's a ruse in some ways to put you in the same room as the sponsors whilst your mind is buzzing with all the new SQL learning you have been doing and you are feeling confident and inspired about SQL and what you can achieve.

We need the sponsors to put on the events, and the sponsors need us to help put them in contact with purchasers of their wares. It is good for all community events if the sponsors can put SQL Saturday Exeter (or another community event) into their CRM as the point of first contact, or the place a decision was made for a purchase. It will mean that when they analyse their data in readiness for next year's budget, community events will still be important to them; they will spend their money, and we will continue to benefit from superb free or very cheap training and learning, networking and downright good fun at next year's events.

DIRECT CONTACT WITH THE PEOPLE WHO MAKE YOUR TOOLS

For example, I use Red Gate's SQL Monitor, making use of the graphs to baseline, to see when there are variations to that baseline, and to get alerted about long running queries, deadlocks and plenty of other useful DBA information.

Whilst at the Red Gate stand at SQL Saturday in Cambridge I got talking to Daniel Rothig, one of the developers of SQL Monitor, and was able to ask him about using it.

I wanted to know how best to use the baselining feature and how best to describe some of the detail I was seeing to non-technical people. He and Jonathan Allen were able to give me some examples and knowledge to improve my capabilities in this area. It was fantastic to be able to discuss the product with him and see where they want to take it.

THE SPONSORS WIN TOO

Obviously the sponsors need customers and that is why they put a large amount of marketing activity into SQL Community Events. They want to put their products in front of the people who will be using them and make sales.

But there is a further benefit too. Daniel asked me to show him how I used the tool and what I would improve if I could. I explained that I was having trouble getting the regex correct for writing exceptions to the alerts for long running queries, and I said that I wished there was a button I could press to automatically ignore that query. He said he would take that back to the team. Excellent: I was able to suggest a way to improve a good tool, to make me work smarter and my experience better.

I don't know if my idea will make it to production but I hope so. Daniel also said it was useful to see the way users of their software navigated the application and used the features, and that that knowledge would help future development.

Daniel said

“I’m sure we can make a sale or two on a SQL Saturday – but then, why am I there, and not a sales team? We’ve found it’s more valuable to meet people in the community, learn about their jobs and problems, and search for a gleam in their eyes when we show them our solutions. We take home those first impressions, and the feedback from long-time users, to make our software more focused, relevant, and useful.

And the conversations are always great fun – so come and say hi!”

It’s a win all ways round

WE’LL PUT BISCUITS WITH THE COFFEE TOO AT SQL SAT EXETER!!

Find out more about SQL Saturday Exeter at http://sqlsouthwest.co.uk/ 

Viewing SQL Endpoint Permissions with PowerShell

A quick and simple post today as I have been very busy. I needed to list the users with permissions on mirroring endpoints today so I wrote this script and figured it was worth sharing.

It's a simple script which takes a server name from a Read-Host prompt, displays the available endpoints, asks which one you want, and shows you the permissions.

$Server = Read-Host "Please Enter the Server"
# Create the SMO server object first so we can list its endpoints
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Server
$Endpoints = $srv.Endpoints | Select-Object -ExpandProperty Name
$EndpointName = Read-Host "Please Enter the Endpoint Name `n Available Names are `n $Endpoints"
$Endpoint = $srv.Endpoints[$EndpointName]
$Endpoint.EnumObjectPermissions()

and here's a screenshot of the results

(screenshot: the EnumObjectPermissions output)

If you want to do it with T-SQL

select s.name as grantee,
e.name as endpoint,
p.permission_name as permission,
p.state_desc as state_desc
from sys.server_permissions p
join sys.server_principals s on s.principal_id = p.grantee_principal_id
join sys.endpoints e on p.major_id = e.endpoint_id
where p.type='CO'
