A look at the SQL Assessment Intelligence Pack in Operational Insights

Operational Insights is a service that has been added in preview to Azure. It enables you to collect, combine, correlate and visualize all your machine data in one place. It can collect data from all of your machines either via SCOM or by using an agent. Once the data is collected, Operational Insights has a number of Intelligence Packs with pre-configured rules and algorithms that provide analysis in various areas, including SQL Server.

http://azure.microsoft.com/en-gb/services/operational-insights/

I thought I would take a look. I have an installation of SCOM in my lab on my laptop and I read the instructions to see how to connect it to Operational Insights. (You don’t have to have a SCOM installation to use Operational Insights; you can make use of an agent instead, just follow the steps from the page below.)

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/

It really is very simple

If you have an Azure subscription already you can sign into the portal and join the preview program by clicking

New –> App Services –> Operational Insights

and create a new Operational Insights Workspace.

Once you have done that, if you have an installation of SCOM 2012 you need to be running Service Pack 1; then download and install the System Center Operational Insights Connector for Operations Manager and import the MPB files into SCOM.

If you have SCOM 2012 R2 the connector is already installed, and connecting your SCOM installation to Operational Insights is very easy, as you can see at

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/?step2=withaccount&step3=SCOMcustomer

  1. In the Operations Manager Console, click Administration.
  2. Under Administration, select System Center Advisor, and then click Advisor Connection.
  3. Click Register to Advisor Service.
  4. Sign in with your Microsoft or Organizational account.
  5. Choose an existing Operational Insights workspace from the drop-down menu.
  6. Confirm your changes.
  7. In the System Center Advisor Overview page, Under Actions, click Add a Computer/Group.
  8. Under Options, select Windows Server or All Instance Groups, and then search for and add the servers that you want to collect data from.

That is it. No really, that is it. I was amazed how quickly I was able to get this done in my lab, and it would not take very long in a large implementation of SCOM either, as you will already have your groups of computers defined, which will make it easy to decide which groups to use. You could use a separate workspace for each type of server or split up the information per service. It really is very customisable.

Once you have done that, go and add some of the Intelligence Packs. Each Intelligence Pack will change the amount and type of data that is collected. As of November 23rd there are:

Alert Management – for your SCOM Alerts

Change Tracking – Tracking Configuration Changes

Log Management – for event log collection and interrogation

System Update Assessment – Missing Security Updates

Malware Assessment – Status of Anti-Malware and Anti-Virus scans

Capacity Planning – Identify Capacity and Utilisation bottlenecks

SQL Assessment – The risk and health of SQL Server Environment

There are also two ‘coming soon’ Intelligence Packs:

AD Assessment – Risk and health of Active Directory

Security – Explore security related data and help identify security breaches

You then (if you are like me) have a period of frustration whilst you wait for all of the data to be uploaded and aggregated, but once it is, you sign into the Operational Insights Portal at

https://preview.opinsights.azure.com and it will look like this

opsman1

There is a lot of information there. As it is on my laptop and the lab is not running all of the time and is not connected to the internet most of the time I am not surprised that there are some red parts to my assessment!!

Obviously I was interested in the SQL Assessment and I explored it a bit further

Clicking on the SQL Assessment tile takes you to a screen which shows the SQL Assessment broken down into six Focus Areas:

Security and Compliance; Availability and Business Continuity; Performance and Scalability; Upgrade, Migration and Deployment; Operations and Monitoring; and Change and Configuration Management. MSDN (http://msdn.microsoft.com/en-us/library/azure/dn873967.aspx) gives some more information about each one:

Security and Compliance - Safeguard the reputation of your organization by defending yourself from security threats and breaches, enforcing corporate policies, and meeting technical, legal and regulatory compliance requirements.

Availability and Business Continuity - Keep your services available and your business profitable by ensuring the resiliency of your infrastructure and by having the right level of business protection in the event of a disaster.

Performance and Scalability - Help your organization to grow and innovate by ensuring that your IT environment can meet current performance requirements and can respond quickly to changing business needs.

Upgrade, Migration and Deployment - Position your IT department to be the key driver of change and innovation, by taking full advantage of new enabling technologies to unlock more business value for organizational units, workforce and customers.

Operations and Monitoring - Lower your IT maintenance budget by streamlining your IT operations and implementing a comprehensive preventative maintenance program to maximize business performance.

Change and Configuration Management - Protect the day-to-day operations of your organization and ensure that changes won’t negatively affect the business by establishing change control procedures and by tracking and auditing system configurations.

You will be able to see some dials showing you how well you are doing in each area for the servers whose data has been collected.

opsman2

Each area will have the High Priority Recommendations shown below the dial and you can click on them to see more information about those recommendations

opsman3

You can also click the dial or the ‘see all’ link to enter the search area, where you can customise how you wish to see the data that has been collected. This looks a bit confusing at first.

opsman4

The top bar contains the search, the timescale, and some buttons to save the search, view the saved searches and view the search history, all of which will be shown in the right-hand column below.

The left column contains a bar graph for the search and all of the filters. The middle column contains the results of the search and can be viewed in list or tabular format and exported to CSV using the button below. A little bit of experimentation will give you a better understanding of how the filtering works and how you can make use of that for your environment

Looking at the search for the Operations and Monitoring Focus Area shown above:

Type:SQLAssessmentRecommendation IsRollup=true RecommendationPeriod=2014-11 FocusArea="Operations and Monitoring" RecommendationResult=Failed | sort RecommendationWeight desc

I saw that RecommendationResult=Failed and changed it to RecommendationResult=Passed. This enabled me to see all of the recommendations that had been passed in the Focus Area, and clicking the export button downloaded a CSV file. I then deleted RecommendationResult=Passed from the search, which gave me all of the recommendations that make up that Focus Area.

Operations and Monitoring Focus Area

Recommendation
Enable Remote Desktop on servers.
Enable Remote Desktop on virtual machines.
Ensure computers are able to download updates.
Configure event logs to overwrite or archive old events automatically.
Review event log configuration to ensure event data is retained automatically. This relates to System Logs
Review event log configuration to ensure event data is retained automatically. This relates to Application Logs

I decided then to do the same for each of the Focus Areas for the SQL Assessment Intelligence Pack

Security and Compliance Focus Area

Recommendation
Change passwords that are the same as the login name.
Remove logins with blank passwords.
LAN Manager Hash for Passwords Stored
Investigate why unsigned kernel modules were loaded.
Apply security best practices to contained databases.
Enable User Account control on all computers.
Consider disabling the xp_cmdshell extended stored procedure.
Implement Windows authentication on Microsoft Azure-hosted SQL Server deployments.
Avoid using the Local System account to run the SQL Server service.
Avoid adding users to the db_owner database role.
Ensure only essential users are added to the SQL Server sysadmin server role.
Disable SQL Server guest user in all user databases.
Avoid running SQL Server Agent jobs using highly-privileged accounts.
Configure the SQL Server Agent service to use a recommended account.
Apply Windows password policies to SQL Server logins.
Investigate failures to validate the integrity of protected files.
Investigate failures to validate kernel modules.

Availability and Business Continuity Focus Area

Recommendation
Schedule full database backups at least weekly.
Optimize your backup strategy with Microsoft Azure Blob Storage.
Avoid using the Simple database recovery model.
Ensure all installations of Windows are activated.
Investigate logical disk errors.
Reduce the maximum Kerberos access token size.
Investigate connection failures due to SSPI context errors.
Set the PAGE_VERIFY database option to CHECKSUM.
Increase free space on system drives.
Investigate a write error on a disk.
Check the network access to Active Directory domain controllers.
Review DNS configuration on non-DNS servers.
Increase free space on system drives.
Investigate memory dumps.
Increase free space on system drives.
Investigate why the computer shut down unexpectedly.
Enable dynamic DNS registration for domain-joined servers.

Performance and Scalability Focus Area

Recommendation
Increase the number of tempdb database files.
Configure the tempdb database to reduce page allocation contention.
Ensure all tempdb database files have identical initial sizes and growth increments.
Set autogrowth increments for database files and log files to fixed values rather than percentage values.
Set autogrowth increments for transaction log files to less than 1GB.
Modify auto-grow settings to use a fixed size growth increment of less than 1GB and consider enabling Instant File Initialization.
Change your Affinity Mask and Affinity I/O MASK settings to prevent conflicts.
Resolve issues caused by excessive virtual log files.
Modify the database file layout for databases larger than 1TB.
Set the AUTO_CLOSE option to OFF for frequently accessed databases.
Review memory requirements on servers with less than 4GB of physical memory installed.
Configure system SiteName properties to be dynamic.
Align the Max Degree of Parallelism option to the number of logical processors.
Align the Max Degree of Parallelism option to the number of logical processors.
Consider disabling the AUTO_SHRINK database option.
Review memory requirements on computers with high paging file use.
Ensure SQL Server does not consume memory required by other applications and system components.
Consider changing your power saving settings to optimize performance.
Increase the initial size of the tempdb database.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review your paging file settings.
Review and optimize memory cache configuration.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review the system processor scheduling mode.
Review network provider ordering settings.
Remove invalid entries from the PATH environment variable.
Remove network entries from the PATH environment variable.
Investigate processes that use a large number of threads.
Avoid hosting user database files on the same disk volume as tempdb database files.
Review processes with large working set sizes.
Reduce the length of the PATH environment variable.
Reduce the number of entries in the PATH environment variable.
Ensure SQL Server does not consume memory required by other applications and system components.
Enable the backup compression default configuration option.
Ensure the DNS Client service is running and is set to start automatically.
Consider compressing database tables and indexes.

Upgrade, Migration and Deployment Focus Area

Recommendation
Ensure all devices run supported operating system versions.
Ensure that the guest user is enabled in the msdb database.
Avoid using the Affinity64 Mask configuration setting in new development work.
Avoid using the Affinity Mask configuration setting in new development work.
Avoid using the Affinity I/O Mask configuration setting in new development work.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Affinity64 I/O Mask configuration setting in new development work.
Configure SQL Server to accept incoming connections.
Configure SQL Server instances and firewalls to allow communication over TCP/IP.

As I have no data for Change and Configuration Management, I was not able to see the recommendations for that Focus Area in my Operational Insights workspace.

As you can see from the type and description of the recommendations above, these are all areas that a DBA will be concerned about, and the benefit of having all of this information gathered, pre-sorted, prioritised and presented in this manner is that it enables you to work towards a better SQL environment and track your progress. You can read more about the SQL Assessment Intelligence Pack here:

http://msdn.microsoft.com/en-us/library/azure/dn873958.aspx

As well as the pre-determined queries that are built into the Intelligence pack you can search your data in any way that you require enabling you to present information about the health and risk of your SQL Environment to your team or your management with ease. The “with ease” bit is dependent on you understanding the language and structure of the search queries.

You will need to put this page into your bookmarks:

http://msdn.microsoft.com/library/azure/dn873997.aspx

as it contains the syntax and definitions you need to search your data.

A very useful page for a starter like me is

http://blogs.msdn.com/b/dmuscett/archive/2014/10/19/advisor-searches-collection.aspx

by Daniele Muscetta, which has a list of useful Operational Insights search queries, such as:

SQL Recommendation by Computer

Type=SQLAssessmentRecommendation IsRollup=false RecommendationResult=Failed | measure count() by Computer

If you click the star to the right of the search box you will find the saved searches. For the SQL Assessment Intelligence Pack there are

Did the agent pass the prerequisite check (if not, SQL Assessment data won’t be complete)
Focus Areas
How many SQL Recommendation are affecting a Computer a SQL Instance or a Database?
How many times did each unique SQL Recommendation trigger?
SQL Assesments passed by Server
SQL Recommendation by Computer
SQL Recommendation by Database
SQL Recommendation by Instance

You can use these and you can save your own searches which show the data in a way that is valuable to you.

Overall I am impressed with this tool and can see how it can be beneficial for a DBA as well as for System Administrators. I was amazed how easy it was to set up and how quickly I was able to start manipulating the data once it had been uploaded.

Changing Delay Between Responses for SQL Alerts with Powershell

So you have read on SQLSkills.com that you should have alerts for severity levels 16 to 24 and for errors 823, 824 and 825, or maybe you have used sp_blitz and received the Blitz result ‘No SQL Server Agent Alerts Configured’, and like a good and conscientious DBA you have set them up.

Hopefully you also have Jonathan Allen’s blog on your feed, and if you look at his historical posts you will have seen this one, where the lack of a delay in response broke the Exchange Server!

However, sometimes the oft-used delay between responses of 1 minute is too much. Alerts should be actionable, after all; maybe you sync your email every 15 minutes and don’t need to see 15 alerts for the same error, or you decide that a certain level of error requires a lesser response and therefore you only need to know about it every hour or three. Or possibly you want to enforce a certain delay for all servers and want to set up a system to check regularly and enforce your rule.

Whatever the reason, changing the delay between responses for every alert on every server with SSMS could be time-consuming, so (of course) I will use PowerShell to do the job.

To find the alerts I follow the process I use when finding any new property in PowerShell:


# Requires the SQL Server SMO assemblies to be loaded (for example via the SQLPS module)
$Server = 'SERVERNAME'

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server

I know that the Alerts will be found under the JobServer Property


$srv.JobServer.Alerts|Get-Member

Shows me

DelayBetweenResponses   Property   int DelayBetweenResponses {get;set;}

And

 Alter                   Method     void Alter(), void IAlterable.Alter()

So I use both of those as follows


foreach($Alert in $srv.JobServer.Alerts)
{
    $Alert.DelayBetweenResponses = 600 # This is in seconds
    $Alert.Alter()
}

And place it in a foreach loop for the servers I want to change; a sketch of the whole thing is shown after the category example below. If I only want to change certain alerts I can do so by filtering on Name:


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.Name -eq 'NameOfAlert'})

Or by category


Foreach($Alert in $srv.JobServer.Alerts|Where-Object {$_.CategoryName -eq 'Category Name'})
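
Putting the pieces together, a minimal sketch of the whole job might look like this (assuming a Servers.txt file with one server name per line, and that the SMO assemblies are loaded):

$Servers = Get-Content 'PATH\TO\Servers.txt'
foreach($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    foreach($Alert in $srv.JobServer.Alerts)
    {
        $Alert.DelayBetweenResponses = 600 # seconds
        $Alert.Alter()
    }
}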

When you have 5 minutes go and look at the results of


$srv.JobServer|Get-Member

And explore and let me know what you find
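
As a starting point, here are a couple of the properties I find interesting; a quick sketch using standard SMO property names:

# Jobs and their most recent outcome
$srv.JobServer.Jobs | Select-Object Name, IsEnabled, LastRunDate, LastRunOutcome

# Operators and where their notifications go
$srv.JobServer.Operators | Select-Object Name, EmailAddress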

#tsql2sday #60 – Something New Learned – Problem Step Recorder

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

This month’s blog party is hosted by Chris Yates (blog | twitter), who asked people to share something newly learned.

I love being a part of the SQL community. It gives me the opportunity to learn as much as I want to about anything I can think of within the data field. In the last couple of months I have presented at the Newcastle User Group and learnt about migrating SQL using PowerShell with Stuart Moore. At our user group in Exeter http://sqlsouthwest.co.uk/ we had Steph Middleton talking about version control for databases and lightning talks from Pavol Rovensky on Mocking in C#, John Martin on Azure fault domains and availability sets (using a pen and a whiteboard!), Annette Allen on Database Unit Testing, and Terry McCann on SQL Certifications. We also had Jonathan Allen talking about some free tools and resources to help manage both large and small SQL environments. I went to SQL Relay in Southampton and saw Stuart Moore (again!), Scott Klein, Alex Yates and James Skipworth, and I joined the PASS DBA Fundamentals virtual chapter webinar for Changing Your Habits to Improve the Performance of Your T-SQL by Mickey Stuewe, and that’s only the ‘in-person’ learning that I did. I also read a lot of blog posts!

But instead of repeating what I learnt from others within the community, I thought I would write the blog post that I have been meaning to write for a few weeks, about a solution pre-built into Windows that appears not to be well known: Problem Step Recorder.

What is PSR?

I found out about a little-known tool included in the Windows operating system a couple of months ago which enables you to record what you are doing by taking screenshots of every mouse click. The tool is Step Recorder, also known as PSR. It is included by default in Windows 7, Windows 8 and 8.1, and Windows Server 2008 and above.

What does it do?

Simply put, it records “This is what I did”. There are many situations when this can be useful:

  • You can use this during installations to help create documentation. “This is what I did” when I installed X and now you can follow those steps and I know I haven’t missed anything.
  • You can use it when communicating with 3rd parties or other support teams. “This is what I did” when I got this error and here are all of the steps so that you can re-create the issue and I know that I haven’t missed anything
  • You can use this when resolving high priority incidents. “This is what I did” when System X broke, it includes all of the times of my actions.
    I still keep my notepad by my keyboard out of habit, but I have a record of the exact steps that I took to try to resolve the issue, which will be very useful for reporting on the incident in the near future and for placing into a knowledge base for others to use if it happens again, and I know I haven’t missed anything.
  • For assisting family members. Like many, I am “The IT guy” and PSR enables me to provide clear instructions with pictures showing exactly where I clicked to those family members who are having trouble with “The internet being broken”

It does this by automatically taking a screenshot after every mouse click or program event, with a timestamp and a description of what happened. It does not record keystrokes, though, so if you need to record what you have typed there are some manual steps required.

So how do you access PSR?

Simple. Type “psr” into the run box, cmd or PowerShell and it will open

Untitled picture

Once you click on Start Record it will start recording your clicks and taking screenshots. However, I always open the settings first, by clicking on the drop-down to the left of the help icon, and change the number of recent screen captures to store to the maximum value of 100.

1Untitled picture

If you do not, you will get no warning, but PSR will only save the last 25 screenshots it takes and your results will look like the example below. It will still record your actions but not keep the screenshots.

Step 16: (‎09/‎11/‎2014 13:47:45) User left click on “Chris Yates (@YatesSQL) | Twitter (tab item)”

No screenshots were saved for this step.

Step 17: (‎09/‎11/‎2014 13:47:47) User left click on “The SQL Professor | ‘Leadership Through Service’ (text)”

No screenshots were saved for this step.

Step 18: (‎09/‎11/‎2014 13:47:47) User left click on “T-SQL Tuesday #60 – Something New Learned | The SQL Professor (text)” in “T-SQL Tuesday #60 – Something New Learned | The SQL Professor – Google Chrome”

untitled

You can also set the name and location of the saved file in the settings, but if you leave it blank it will prompt for a location and name once you click Stop Record.
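
PSR can also be driven from the command line, which is handy if you want to start a recording from a script; the switch names below are worth verifying with psr.exe /? on your build of Windows:

# Start recording with screenshots, keep up to 100 of them, and save to the given file
psr.exe /start /output "C:\temp\MyRecording.zip" /sc 1 /maxsc 100 /gui 0

# ... do the work you want to record ...

# Stop recording and write the file
psr.exe /stop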

How do I add keyboard input?

PSR allows you to add keyboard input manually. You may need this if you want to include the text you have entered into prompts or address bars, or if you wish to add a further comment. You can do this by clicking Add Comment, drawing a box around the relevant part of the screen and typing the text into the box.

2Untitled picture

In the results this looks like

Step 1: (‎09/‎11/‎2014 12:56:22) User Comment: “http://www.microsoft.com/en-gb/download/details.aspx?id=42573

untitled1

What do the results look like?

Once you have finished the actions that you want to record (or when you think you are close to 100 screenshots), click Stop Record and the following screen will be displayed:

3Untitled picture

This allows you to review what PSR has recorded. You can then save it to a location of your choice. It is saved as a zip file containing a single .mht file. You can open the file without unzipping the archive and it will open in Internet Explorer. As you can see from the shots below, you can run PSR on your client and it will still record actions in your RDP sessions, although it does not record as much detail. The first two are on my SCOM server in my lab and the second two are on the laptop using the SCOM console.

Step 11: (‎09/‎11/‎2014 13:02:13) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled2

Step 12: (‎09/‎11/‎2014 13:02:16) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled3

Step 13: (‎09/‎11/‎2014 13:06:25) User right click on “Management Packs (tree item)” in “Agent Managed – THEBEARDMANAGEMENTGROUP – Operations Manager”

untitled4

Step 14: (‎09/‎11/‎2014 13:06:27) User left click on “Import Management Packs… (menu item)”

untitled5

You can then use the zip file as you wish. Maybe you email it to your third party support team (once you have edited any confidential data) or you can attach it to your incident in your IT Service Management solution or attach it to a report. If you wish to create documentation you can open the .mht file in Word, edit it as you see fit and save it appropriately.

So that is one of the many things that I have learnt recently, and I am looking forward to seeing what others have learnt, especially as many will have just been to the SQL PASS Summit. You will be able to find the other posts in this blog party in the comments on Chris’s page.

Generating T-SQL Randomly with Powershell

I have a lab on my laptop running various servers so that I can problem-solve and learn, and recently I wanted to add several months of data into a database. I had created a stored procedure to take some parameters, perform some logic and insert the data.

To execute the stored procedure in T-SQL I simply run this

EXECUTE [dbo].[usp_Insert_DriveSpace] 'Server1','C','2014-11-05','100','25'

which uses the server name, drive letter, date, capacity and free space to add the data

In my wisdom I decided to create some data that was more ‘real-life’. I was interested in storing drive space data and will be learning how to write reports on it. To do this I had pre-populated some tables in the database with 10 server names, each with 5 drives, so I needed 10 * 5 * 90, or 4500, statements.

I wanted to populate this with about 3 months of data, as if it had been gathered every day. I read this post about using CTEs to create sequences and I am sure it can be done that way, but I don’t have the T-SQL skills to do so. If someone can (or has) done that please let me know, as I am trying to improve my T-SQL skills and would be interested in how to approach and solve this problem with T-SQL.

I solved it with Powershell in this way.

I created an array of Servers and an array of Drives to enable me to iterate through each.

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'

Set the drive capacity for each drive. To make my life slightly easier I standardised my ‘servers’

$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

I needed to create a date. You can use Get-Date to get today’s date, and to get dates or times in the future or the past you can use the AddDays() method. You can also add ticks, milliseconds, seconds, minutes, hours, months or years.

(Get-Date).AddDays(1)

I then needed to format the date. This is slightly confusing. If you just use Get-Date to get the current date (and time) then you can use the -Format or -UFormat parameter to format the output:

Get-Date -Format yyyyMMdd
Get-Date -UFormat %Y%m%d

However, this does not work once you have used the AddDays() method; instead you have to use the ToString() method:

 $Date = (get-date).AddDays(-7).ToString('yyyy-MM-dd')
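
For what it is worth, there is also a middle ground: Get-Date will happily format a date that you pass back in via its -Date parameter, so the following produces the same string; shown here only as an aside:

Get-Date -Date (Get-Date).AddDays(-7) -Format 'yyyy-MM-dd'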

To replicate gathering data each day I decided to use a while loop. I set $x to -98, pressed CTRL + J to bring up snippets, typed ‘w’ and picked the while loop. You can find out more about snippets in my previous post. I started at a negative number so that all the identity keys incremented in a real-life manner, oldest to newest.

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {

I could then use the while loop to generate data for each day and loop through each server and each drive and generate the T-SQL but I wanted more!

I wanted to generate some random numbers for the free space available on each drive. I used the Get-Random cmdlet; if you are going to use it, make sure you read this post so that you don’t get caught by the gotcha. I decided to set the free space for my OS, Data and Log file drives to somewhere between 3 and 70 GB free, as in this imaginary scenario these drives are carefully monitored and the data and log file sizes are under the control of a careful DBA, but still able to go below thresholds.

if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3

I set the TempDB drive to have either 4, 7 or 11 GB free so that I can try to colour-code my reports depending on values, and if one field only has three values it makes it simpler to verify.

I set the Backup drive to somewhere between 50 and 0 so that I will hit 0 sometimes!
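
One Get-Random behaviour worth knowing is that -Maximum is exclusive, so Get-Random -Maximum 70 -Minimum 3 returns values from 3 to 69. A quick throwaway check demonstrates this:

# Generate a large sample and confirm the minimum and maximum values actually produced
$sample = 1..10000 | ForEach-Object { Get-Random -Maximum 70 -Minimum 3 }
$sample | Measure-Object -Minimum -Maximum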

Here is the full script. It generated 4500 T-SQL statements in just under 16 seconds

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'
$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {
            if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'D')
            {
            $Free = Get-Random -InputObject 4,7,11
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$DDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'E')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$EDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'F')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$FDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'G')
            {
            $Free = Get-Random -Maximum 50 -Minimum 0
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$GDriveCapacity','$Free'"
            }
        }
    }
    $X++
}

Once it had run I simply copied the output into SSMS and was on my way.
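
If copying and pasting does not appeal, inside the loop you could replace Write-Host with something that writes each statement to a script file, or runs it immediately with Invoke-Sqlcmd; a sketch, where the path, instance and database names are only placeholders:

$sql = "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"

# Append the statement to a .sql file to run later in SSMS
Add-Content -Path 'C:\temp\InsertDriveSpace.sql' -Value $sql

# Or execute it straight away (Invoke-Sqlcmd comes with the SQLPS module)
Invoke-Sqlcmd -ServerInstance 'LABSERVER' -Database 'LABDATABASE' -Query $sql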

Emailing Disk Space Alerting With Powershell

 

A DBA doesn’t want to run out of space on their servers, even in their labs! To avoid this happening I wrote a Powershell script to provide some alerts by email.

This is the script and how I worked my way through the solution. I hope it is of benefit to others.

The script works in the following way

  • Iterates through a list of servers
  • Runs a WMI query to gather disk information
  • If the free space has fallen below a threshold, checks to see if it has emailed before and if not emails a warning
  • Resets if free space has risen above the threshold
  • Logs what it does but manages the space the logs use

As you will have seen before, I use a Servers text file in my scripts. This is a text file with a single server name on each line. You could also use a query against a DBA or MDW database using Invoke-SQLCMD2, whichever is the most suitable for you; a sketch of the query version follows the text-file one below.

$Servers = Get-Content 'PATH\TO\Servers.txt'
foreach($Server in $Servers)
{
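
If you prefer the DBA-database route, the equivalent first lines might look like this, using the standard Invoke-Sqlcmd cmdlet; the table and column names are purely illustrative:

# Hypothetical query against a DBA database that records the servers you manage
$Query = 'SELECT ServerName FROM dbo.Servers WHERE IsActive = 1'
$Servers = Invoke-Sqlcmd -ServerInstance 'MANAGEMENTSERVER' -Database 'DBADATABASE' -Query $Query |
    Select-Object -ExpandProperty ServerName

The foreach loop over $Servers then stays exactly the same.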

The WMI query is a very simple one to gather the disk information. I format the results and place them in variables for reuse

$Disks = Get-WmiObject win32_logicaldisk -ComputerName $Server | Where-Object {$_.drivetype -eq 3}

# loop over each fixed disk
foreach($Disk in $Disks)
{
    $TotalSpace = [math]::Round(($Disk.Size/1073741824),2) # change to GB and 2 decimal places
    $FreeSpace = [math]::Round(($Disk.FreeSpace/1073741824),2) # change to GB and 2 decimal places
    $UsedSpace = $TotalSpace - $FreeSpace
    $PercentFree = [math]::Round((($FreeSpace/$TotalSpace)*100),2) # percentage free, 2 decimal places

Use a bit of logic to check if the freespace is below a threshold and see if the email has already been sent

# Check if percent free below warning level
if ($PercentFree -le $SevereLevel)
{
    # if the text file has been created (i.e. an email should already have been sent) do nothing
    if(Test-Path $CheckFileSevere)
    {}
    # if percent free is below the warning level and the text file does not exist, create the text file and email
    else
    {

If it has not, create a uniquely named text file and build the email body using HTML and the values stored in the variables:

New-Item $CheckFileSevere -ItemType File

# Create Email Body (the HTML markup is built up here from the values gathered above)
$EmailBody = ''
$EmailBody += " "

and then send it

 $Subject = "URGENT Disk Space Alert 1%" $Body = $EmailBody $msg = new-object Net.Mail.MailMessage $smtp = new-object Net.Mail.SmtpClient($smtpServer) $smtp.port = '25' $msg.From = $From $msg.Sender = $Sender $msg.To.Add($To) $msg.Subject = $Subject $msg.Body = $Body $msg.IsBodyHtml = $True $smtp.Send($msg) 

If the free space is above all of the warning levels, check for the existence of the text file and delete it if it is found, so that an email will be sent the next time the threshold is crossed.

if(Test-Path $CheckFile)
{
    Remove-Item $CheckFile -Force

To enable logging create a log file each day

$Logdate = Get-Date -Format yyyyMMdd
$LogFile = $Location + 'logfile' + $LogDate + '.txt'

# if the daily log file does not exist, create one
if(!(Test-Path $LogFile))
{
    New-Item $Logfile -ItemType File

And write the info to it at each action

$logentrydate = (Get-Date).DateTime
$Log = $logentrydate + ' ' + $ServerName + ' ' + $DriveLetter + ' ' + $VolumeName + ' ' + $PercentFree + ' -- Severe Email Sent'
Add-Content -Value $Log -Path $Logfile

Making sure that you clean up afterwards:

# delete any logfiles older than 7 days
Get-ChildItem -Path $Location *logfile* | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-7)} | Remove-Item -Force

I run the script in a PowerShell step in an SQL Agent job every 5 minutes, and now I know when the servers in my lab are running out of space, with an email like this:

image

You can find the script here

Number of VLFs and Autogrowth Settings Colour Coded to Excel with PowerShell

So you have read up on VLFs

No doubt you will have read this post by Kimberly Tripp and this one and maybe this one too and you want to identify the databases in your environment which have a large number of VLFs and also the initial size and the autogrowth settings of the log files.

There are several posts about doing this with PowerShell, like this one or this one. As is my wont, I chose to output to Excel and colour-code the cells depending on the number of VLFs or the type of autogrowth.

There is not a pure SMO way of identifying the number of VLFs in a log file that I am aware of and it is simple to use DBCC LOGINFO to get that info.

I also wanted to include the autogrowth settings, size, space used, the logical name and the file path. I started by getting all of my servers into a $Servers array as follows:

$Servers = Get-Content 'PATHTO\sqlservers.txt'

Whilst presenting at the Newcastle User Group, Chris Taylor b | t asked a good question. He asked if that was the only way to do this or if you could use your DBA database.

It is much better to make use of the system you already use to record your databases. It will also make it much easier for you to be able to run scripts against more specific groups of databases without needing to keep multiple text files up to date. You can accomplish this as follows

$Query = 'SELECT Name FROM dbo.databases WHERE CONDITION meets your needs'
$Servers = Invoke-Sqlcmd -ServerInstance MANAGEMENTSERVER -Database DBADATABASE -Query $query

I then create a foreach loop and a server SMO object (did you read my blog post about snippets? The code for an SMO Server snippet is there), and return the number of rows from DBCC LOGINFO and the information I wanted:

foreach ($Server in $Servers)
    {
      $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
      foreach ($db in $srv.Databases|Where-Object {$_.isAccessible -eq $True})
      {
        $DB.ExecuteWithResults('DBCC LOGINFO').Tables[0].Rows.Count
         $db.LogFiles | Select Growth,GrowthType,Size, UsedSpace,Name,FileName
       }
    }

It’s not very pretty or particularly user friendly so I decided to put it into Excel

I did this by using my Excel Snippet

$snippet = @{
      Title = 'Excel Object';
      Description = 'Creates a Excel Workbook and Sheet';
      Text = @'
      # Create a .com object for Excel
    `$xl = new-object -comobject excel.application
    `$xl.Visible = `$true # Set this to False when you run in production
    `$wb = `$xl.Workbooks.Add() # Add a workbook
    `$ws = `$wb.Worksheets.Item(1) # Add a worksheet
    `$cells=`$ws.Cells
    #Do Some Stuff - perhaps -
      `$cells.item(`$row,`$col)=`'Server`'
      `$cells.item(`$row,`$col).font.size=16
      `$Cells.item(`$row,`$col).Columnwidth = 10
      `$col++
    `$wb.Saveas(`'C:\temp\Test`$filename.xlsx`')
    `$xl.quit()
    Stop-Process -Name EXCEL
    '@
    }
    New-IseSnippet @snippet

and placed the relevant bits into the foreach loop

foreach ($Server in $Servers)
    {
      $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
      foreach ($db in $srv.Databases|Where-Object {$_.isAccessible -eq $True})
      {
        $VLF = $DB.ExecuteWithResults('DBCC LOGINFO').Tables[0].Rows.Count
        $logFile = $db.LogFiles | Select Growth,GrowthType,Size, UsedSpace,Name,FileName
        $Name = $DB.name
        $cells.item($row,$col)=$Server
        $col++
        $cells.item($row,$col)=$Name
        $col++
        $cells.item($row,$col)=$VLF
        $col++
        $col++
        $Type = $logFile.GrowthType.ToString()
        $cells.item($row,$col)=$Type
        $col++
        $cells.item($row,$col)=($logFile.Size)
        $col++
        $cells.item($row,$col)=($logFile.UsedSpace)
        $col++
        $cells.item($row,$col)=$logFile.Name
        $col++
        $cells.item($row,$col)=$logFile.FileName

I had to use the ToString() method on the Type property to get Excel to display the text. I wanted to set the colour of the VLF cells to yellow or red dependent on their value, and the colour of the growth type cell to red if the value was Percent. This was achieved like this:

if($VLF -gt $TooMany)
    {
      $cells.item($row,$col).Interior.ColorIndex = 6 # Yellow
    }
    if($VLF -gt $WayTooMany)
    {
      $cells.item($row,$col).Interior.ColorIndex = 3 # Red
    }
    if($Type -eq 'Percent')
    {
      $cells.item($row,$col).Interior.ColorIndex = 3 #Red
    }
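
The $TooMany and $WayTooMany thresholds are not shown in the fragments above; they are just variables you set to whatever limits you are comfortable with. The values below are only illustrative, so pick numbers that suit your environment:

# VLF count thresholds used for the colour coding (illustrative values only)
$TooMany = 50      # above this the cell turns yellow
$WayTooMany = 500  # above this the cell turns red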

I also found an excellent post which has many, many snippets of code for working with Excel sheets.

I used

$cells.item($row,$col).HorizontalAlignment = 3 #center
$cells.item($row,$col).HorizontalAlignment = 4 #right
$ws.UsedRange.EntireColumn.AutoFit()

although I had to move the Title so that it came after the line above in order for it to look right.


image

You can find the script here. As always test it somewhere safe first, understand what it is doing and any questions get in touch.

PowerShell Snippets A Great Learning Tool

 

When I talk to people about PowerShell they often ask how they can easily learn the syntax. Here’s a good tip:

Open PowerShell ISE and press CTRL + J

image

You will find a number of snippets that will enable you to write your scripts easily. Jonathan Medd (PowerShell MVP) has written a good post about snippets on the Hey, Scripting Guy! blog, so I will not repeat that here but suggest that you go and read his post.

Not only are there default snippets for you to use, but you can create your own snippets. However, there isn’t a snippet for creating a new snippet, so here is the code to do that:

 
$snippet1 = @{
    Title = "New-Snippet";
    Description = "Create a New Snippet";
    Text = @"
`$snippet = @{
    Title = `"Put Title Here`";
    Description = `"Description Here`";
    Text = @`"
    Code in Here 
`"@
}
New-IseSnippet @snippet
"@
}
New-IseSnippet @snippet1 -Force

I frequently use the SQL Server SMO object in my code, so I created this snippet (note the backticks, which stop the variables being expanded when the snippet is created):

 
$snippet = @{
    Title = "SMO-Server";
    Description = "Creates a SQL Server SMO Object";
    Text = @"
    `$srv = New-Object Microsoft.SqlServer.Management.Smo.Server `$Server
"@
}
New-IseSnippet @snippet

I also use data tables a lot, so I created a snippet for that too, again escaping the variables with backticks:

 
$snippet = @{
    Title = "New-DataTable";
    Description = "Creates a Data Table Object";
    Text = @"
    # Create Table Object
    `$table = New-Object system.Data.DataTable `$TableName

    # Create Columns
    `$col1 = New-Object system.Data.DataColumn NAME1,([string])
    `$col2 = New-Object system.Data.DataColumn NAME2,([decimal])

    # Add the Columns to the table
    `$table.columns.add(`$col1)
    `$table.columns.add(`$col2)

    # Create a new Row
    `$row = `$table.NewRow()

    # Add values to new row
    `$row.NAME1 = 'VALUE'
    `$row.NAME2 = 'VALUE'

    # Add new row to table
    `$table.Rows.Add(`$row)
"@
}
New-IseSnippet @snippet
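
Once you have created a few, Get-IseSnippet will list the snippet files you have saved (user-defined snippets live as .snippets.ps1xml files under your WindowsPowerShell\Snippets folder), and removing one is just a matter of deleting its file; for example:

# List your user-defined snippets
Get-IseSnippet

# Remove a snippet you no longer want (the name filter is illustrative)
Get-IseSnippet | Where-Object {$_.Name -like 'SMO-Server*'} | Remove-Item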

Denniver Reining has created a Snippet Manager which you can use to further expand your snippets usage and it is free as well.

If you have further examples of useful snippets please feel free to post them in the comments below