Triggering a System Center Configuration Manager deployment task

A slightly different topic today.

Once you have built up knowledge, you become the person people ask to solve things. This is something I really enjoy: taking a problem, solving it for people, and in the process teaching them and enabling them to automate more things.

A colleague was performing a new deployment of a product via SCCM and wanted to trigger the clients to check in and receive the new deployment immediately instead of waiting for the scheduled cycle.

They had found some code that would do this:

Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null

Their idea was to use this code with a text file containing the machine names and PowerShell Remoting.
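
As a rough sketch, that approach would have looked something like this (the path to the text file is just an example):

# read the machine names from a text file and run the found code on each one
$PCs = Get-Content -Path C:\temp\machines.txt

Invoke-Command -ComputerName $PCs -ScriptBlock {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
}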

I looked at it a different way and gave them a function: they provide the Collection Name (in SCCM, a collection is a list of machines for a specific purpose), and the function imports the SCCM module, connects to the site, gets the names of the machines in the collection and runs the command on each one.

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# PS script to run

$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
# open the SCCM site drive - replace <Site Code> with your site code,
# or add a $SiteCode parameter and use Set-Location "$($SiteCode):\"
cd <Site Code>:\
# Get Computer names in collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

Invoke-Command -ComputerName $PCs -ScriptBlock $scriptblock -ThrottleLimit 50

}  
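
Loading the function and calling it is then a single line; the collection name here is just an example:

# trigger the deployment cycles on every machine in the collection
Trigger-DeploymentCycle -CollectionName 'New Application Deployment'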

This would work very well, but they wanted some error checking to identify machines they were unable to connect to following the deployment, so the final solution, which runs a little slower, is below.

Set up the function and parameters and create the log files

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# Create log file
$StartTime = Get-Date
$Date = Get-Date -Format ddMMyyHHss
$Errorlogpath = "C:\temp\SCCMError" + $Date + ".txt"
$Successlogpath = "C:\temp\SCCMSuccess" + $Date + ".txt"
New-Item -Path $Errorlogpath -ItemType File
New-Item -Path $Successlogpath -ItemType File

$StartLog = "Script Started at $StartTime"
$StartLog | Out-File -FilePath $Successlogpath -Append

Create the script block, import the SCCM module, connect to the SCCM site and get the machines in the collection. Note that you will have to replace <Site Code> with your own site code.

$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
# open the SCCM site drive - replace <Site Code> with your site code,
# or add a $SiteCode parameter and use Set-Location "$($SiteCode):\"
cd <Site Code>:\
# Get Computer names in collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

I wanted to give them progress output, so I needed the number of machines in the collection, which I had from the Count property, and then the position of each machine within the array as I looped through them with a foreach, which I did with

foreach ($PC in $PCs)
{
$a = [array]::IndexOf($PCs, $PC) + 1
Write-Output "Connecting to PC - $PC -- $a of $count"

I then pinged the machine, ran the script block, wrote to the log files and finally opened the log files

if (Test-Connection $PC -Quiet -Count 1)
{   
# Run command on PC
Invoke-Command -ComputerName $PC -scriptblock $scriptblock
$Success = "SUCCESS - finished - $PC -- $a of $count" 
 $Success | Out-File -FilePath $Successlogpath -Append
Write-Output $Success
}
else
{
$ErrorMessage = "ERROR - $PC is not available -- $a of $count"
$ErrorMessage | Out-File -FilePath $Errorlogpath -Append
Write-Output $ErrorMessage
}
}

notepad $Errorlogpath
notepad $Successlogpath
}

Now they can load the function into their PowerShell sessions and type

Trigger-DeploymentCycle COLLECTIONNAME

and they will be able to manually trigger the tasks. This function will trigger the following tasks for the list of PCs in the collection:

Machine Policy Assignment Request: {00000000-0000-0000-0000-000000000021}
Machine Policy Evaluation: {00000000-0000-0000-0000-000000000022}
Software Inventory: {00000000-0000-0000-0000-000000000002}
Application Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000121}

Here is the list of other tasks you can trigger:

Discovery Data Collection Cycle: {00000000-0000-0000-0000-000000000003}
Hardware Inventory Cycle: {00000000-0000-0000-0000-000000000001}
Machine Policy Retrieval and Evaluation Cycle: {00000000-0000-0000-0000-000000000021}
Software Metering Usage Report Cycle: {00000000-0000-0000-0000-000000000031}
Software Updates Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000108}
Software Updates Scan Cycle: {00000000-0000-0000-0000-000000000113}
Windows Installer Source List Update Cycle: {00000000-0000-0000-0000-000000000032}
Software Update Deployment Re-evaluation: {00000000-0000-0000-0000-000000000114}
Refresh Default Management Point: {00000000-0000-0000-0000-000000000023}
Refresh Location (AD Site or Subnet): {00000000-0000-0000-0000-000000000024}
Cleanup Policy: {00000000-0000-0000-0000-000000000040}
Validate Assignments: {00000000-0000-0000-0000-000000000042}
Certificate Maintenance: {00000000-0000-0000-0000-000000000051}
Branch DP Scheduled Maintenance: {00000000-0000-0000-0000-000000000061}
Branch DP Provisioning Status Reporting: {00000000-0000-0000-0000-000000000062}
Refresh Proxy Management Point: {00000000-0000-0000-0000-000000000037}
State Message Upload: {00000000-0000-0000-0000-000000000111}
State Message Cache Cleanup: {00000000-0000-0000-0000-000000000112}

You can find the function here

Trigger-Deployment

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.

Show AutoGrowth Events with Powershell to CSV

This week I was reading Pinal Dave's post about Autogrowth Events:

http://blog.sqlauthority.com/2015/02/03/sql-server-script-whenwho-did-auto-grow-for-the-database/

As it happened, I had a requirement to make use of the script only a few days later. I was asked to provide the information in a CSV so that the person who required the information could manipulate it in Excel.

I am a great believer in automation: if you are going to do something more than once, automate it. So I wrote two functions, added them to TFS, and now they will be available to all of my team members the next time they load PowerShell.

Why two functions? Well, Pinal Dave's script gets the information from the default trace for a single database, but there may be times when you need to know the autogrowth events that happened across a server with multiple databases.

I use a very simple method for doing this, as I have not found the correct way to parse the default trace with PowerShell. The functions rely on Invoke-SQLCMD2, which I also have in my functions folder, and pass the query from Pinal Dave's blog post as a here-string:

$Results = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query $Query

To output to CSV I use the Export-CSV cmdlet

if($CSV)
{
$Results| Export-Csv -Path $CSV
}

And to open the CSV I add a [switch] parameter. You can find out more about parameters here or by

Get-Help about_Functions_Advanced_Parameters

so the parameter block of my function looks like this:

param
(
[Parameter(Mandatory=$true)]
[string]$Server,
[Parameter(Mandatory=$true)]
[string]$Database,
[Parameter(Mandatory=$false)]
[string]$CSV,
[Parameter(Mandatory=$false)]
[switch]$ShowCSV
)
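
Putting those pieces together, the shape of the database version of the function is roughly as follows. This is only a sketch: the here-string is a placeholder for the query from Pinal Dave's post (which filters on the $Database parameter), and opening the CSV with Invoke-Item when -ShowCSV is passed is my assumption; the full working versions are in the Script Center links at the end of this post.

function Show-AutogrowthDatabase
{
    param
    (
        [Parameter(Mandatory=$true)]
        [string]$Server,
        [Parameter(Mandatory=$true)]
        [string]$Database,
        [Parameter(Mandatory=$false)]
        [string]$CSV,
        [Parameter(Mandatory=$false)]
        [switch]$ShowCSV
    )

    # here-string placeholder for the autogrowth query from Pinal Dave's post,
    # which uses $Database in its WHERE clause
    $Query = @"
/* autogrowth query against the default trace goes here */
"@

    # run the query against the server
    $Results = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query $Query

    # export the results to CSV if a path was supplied
    if ($CSV)
    {
        $Results | Export-Csv -Path $CSV
    }

    # open the CSV if requested
    if ($ShowCSV -and $CSV)
    {
        Invoke-Item -Path $CSV
    }

    $Results
}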

Now when I am asked again to provide this information it is as easy as typing

Show-AutogrowthServer -Server SQL2014Ser12R2 

or

Show-AutogrowthDatabase -Server SQL2014Ser12R2 -Database Autogrowth

and the results will be displayed as below

[screenshot: autogrowth results]

Just a side note: Pinal Dave's script uses @@servername in the where clause, and if you have renamed your host the results will be blank. The resolution to this is to run the following T-SQL

 sp_dropserver 'OLDSERVERNAME';
GO
sp_addserver NEWSERVERNAME, local;
GO
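
If you want to check whether a server is affected before running the script, a quick check along these lines (again using Invoke-SQLCMD2, with the server name from the examples above) should do it:

# compare the name SQL Server believes it has with the actual machine name
$Query = "SELECT @@SERVERNAME AS ServerName, SERVERPROPERTY('MachineName') AS MachineName"
$Names = Invoke-Sqlcmd2 -ServerInstance SQL2014Ser12R2 -Database master -Query $Query
if (($Names.ServerName -split '\\')[0] -ne $Names.MachineName)
{
    Write-Output "@@SERVERNAME ($($Names.ServerName)) does not match the machine name ($($Names.MachineName)) so the default trace query will return nothing"
}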

You can find the scripts here

Show-AutoGrowthServer

Show-AutoGrowthDatabase

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post  should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.

Uploading a Source Folder to Azure File Storage

Azure File Storage enables you to present an Azure Storage Account to your IaaS VMs as a share using SMB. You can find out further details here:

http://azure.microsoft.com/en-gb/documentation/articles/storage-dotnet-how-to-use-files/ 

Once you have created your Azure File Storage Account and connected your Azure Virtual Machines to it, you may need to upload data from your premises into the storage to enable it to be accessed by the Virtual Machines

To accomplish this I wrote a function and called it Upload-ToAzureFileStorage

I started by creating a source folder and files to test

New-Item -Path C:\temp\TestUpload\New1 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New2 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New3 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New4 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New5 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\b -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\c -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\d -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\1 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\2 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\3 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\4 -ItemType Directory

New-Item -Path C:\temp\TestUpload\New1\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New2\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New3\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New4\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New5\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\1\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\2\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\3\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\4\file.txt -ItemType File

Then we needed to connect to the subscription, get the storage account access key and create a context to store them

#Select Azure Subscription
Select-AzureSubscription -SubscriptionName $AzureSubscriptionName

# Get the Storage Account Key
$StorageAccountKey = (Get-AzureStorageKey -StorageAccountName $StorageAccountName).Primary

# create a context for account and key
$ctx=New-AzureStorageContext $StorageAccountName $StorageAccountKey

The Get-AzureStorageShare  cmdlet shows the shares available for the context so we can check if the share exists

$S = Get-AzureStorageShare -Context $ctx -ErrorAction SilentlyContinue|Where-Object {$_.Name -eq $AzureShare}

and if it doesn't exist, create it using New-AzureStorageShare

$s = New-AzureStorageShare $AzureShare -Context $ctx
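
Putting those two lines together, the share handling in the function is essentially this:

# use the share if it already exists, otherwise create it
$s = Get-AzureStorageShare -Context $ctx -ErrorAction SilentlyContinue | Where-Object {$_.Name -eq $AzureShare}
if (!$s)
{
    $s = New-AzureStorageShare $AzureShare -Context $ctx
}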

Purely for the sake of doing it a different way, we can check for the existence of the directory in Azure File Storage that we are going to upload the files to like this

$d = Get-AzureStorageFile -Share $s -ErrorAction SilentlyContinue|select Name

if ($d.Name -notcontains $AzureDirectory)

and if it doesn't exist, create it using New-AzureStorageDirectory

$d = New-AzureStorageDirectory -Share $s -Path $AzureDirectory

Now that we have the directory created in the storage account we need to create any subfolders. First get the folders

# get all the folders in the source directory
$Folders = Get-ChildItem -Path $Source -Directory -Recurse

We can then iterate through them using a foreach loop. If we do this and select the FullName property the results will be

C:\temp\TestUpload\New1
C:\temp\TestUpload\New2
C:\temp\TestUpload\New3
C:\temp\TestUpload\New4
C:\temp\TestUpload\New5
C:\temp\TestUpload\New1\list
C:\temp\TestUpload\New1\list\a
C:\temp\TestUpload\New1\list\b
C:\temp\TestUpload\New1\list\c
C:\temp\TestUpload\New1\list\d
C:\temp\TestUpload\New1\list\a\1
C:\temp\TestUpload\New1\list\a\2
C:\temp\TestUpload\New1\list\a\3
C:\temp\TestUpload\New1\list\a\4

but to create new folders we need to remove the “C:\temp\TestUpload” prefix and replace it with the directory name in Azure. I chose to do this as follows using the Substring method and the length of the source folder path.

foreach($Folder in $Folders)
 {
 $f = ($Folder.FullName).Substring(($source.Length))
 $Path = $AzureDirectory + $f

and tested that the results came out as I wanted

AppName\New1
AppName\New2
AppName\New3
AppName\New4
AppName\New5
AppName\New1\list
AppName\New1\list\a
AppName\New1\list\b
AppName\New1\list\c
AppName\New1\list\d
AppName\New1\list\a\1
AppName\New1\list\a\2
AppName\New1\list\a\3
AppName\New1\list\a\4

I could then create the new folders in Azure using New-AzureStorageDirectory again and close the foreach loop

New-AzureStorageDirectory -Share $s -Path $Path -ErrorAction SilentlyContinue
}

I followed the same process with the files

$Files = Get-ChildItem -Path $Source -Recurse -File

foreach($File in $Files)
 {
 $f = ($File.FullName).Substring(($Source.Length))
 $Path = $AzureDirectory + $f

and then created the files using Set-AzureStorageFileContent. This has a -Force and a -Confirm switch, and I added those into my function by using a [switch] parameter

#upload the files to the storage

 if($Confirm)
 {
 Set-AzureStorageFileContent -Share $s -Source $File.FullName -Path $Path -Confirm
 }
 else
 {
 Set-AzureStorageFileContent -Share $s -Source $File.FullName -Path $Path -Force
 }
}
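
Once the function is loaded, calling it looks something like this. The parameter names follow the variables used above and the values are examples only; the full parameter set is in the Script Center download below.

# upload the test folder into the AppName directory on the share
Upload-ToAzureFileStorage -AzureSubscriptionName 'MySubscription' -StorageAccountName 'mystorageaccount' -AzureShare 'myshare' -AzureDirectory 'AppName' -Source 'C:\temp\TestUpload'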

You can download the function from the Script Center

https://gallery.technet.microsoft.com/scriptcenter/Recursively-upload-a-bfb615fe

As always, any comments or queries are welcome, and obviously the internet lies, so please understand and test all code you find before using it in production.

Twas 2 Days Before Xmas or Thank you SQLFamily

Twas 2 days before Xmas & all through the office,
not a creature was stirring not even old Maurice.
With merriment going on outside of his window
There sat a bearded DBA without much to do

No changes can be made through the holiday season
We’re on skeleton support, which is a good reason
Ensure you are making the most of your time
You mustn’t be wasting the company dime

The backups are checked, there isn’t an issue
So documentation writing should ensue
Instead he decided to procrastinate
And so, this little ditty he proceeded to create

Looking back over last year he did ruminate
About all the progress he had made, it was great
So much had been learned, so many improvements
Derived using content from fine ladies and gents

Impossible to estimate how much it would cost
Or calculate the amount of revenue lost
For all that he would have been unable to do
Or the times that he knew how to get out of a stew

But also the friends old, new and the rest
The talking and dining and drinking and jest
I am lucky to be a part of the SQL Family
So thank you one and all, with love from me


Giving Back – #TSQL2sday

T-SQL Tuesday, which was started by Adam Machanic (blog|twitter) and is now starting its 6th year, is hosted by a different person each month. The host selects the theme, and then the blogging begins. Worldwide, on the second Tuesday of the month (all day, based on GMT time), bloggers attend this party by blogging about the theme. This month it is hosted by Wayne Sheffield blog|twitter and in the spirit of the holiday season it is about giving. This is my post on giving back, in the little ways as well as the bigger ones, how you can give back and why it not only benefits others but also yourself

What’s the SQL Family?

The SQL Family (or SQL community call it what you will) is a fabulous place to be. Full of many wonderful, talented, passionate and generous people. Every method of interaction that you wish for can be found if you look. Ever wanted to know how others do it? or Does this happen in other shops? or I wish I had access to someone who knows about ‘insert new shiny thing here’?

I guess that is how I joined. I had no peers in my shop. I had no one to turn to to ask questions or get advice. I had no support and I turned to the internet. Now we all know that the internet lies. It is full of accidental and deliberate mistruths, of part information and downright bad advice. You have to be careful where you go and who you trust. I gradually found myself going back to the same resources, and from those I found out about PASS and user groups.

I am in the UK. I found the list of UK SQL User Groups

You can find more here both local and virtual

User Groups

I found a user group near me and went along to a meeting not knowing what to expect. I found a group of people like me willing to give up their time to learn and share knowledge. A wide range of people from DBAs, Developers, BI Professionals and SysAdmins. Working for International multi regional companies looking after many hundreds of SQL Servers to single sysadmins looking after the whole kit and caboodle and everything in between. A wealth and breadth of knowledge to tap into. You will learn so much not only from the sessions but also the conversation with all these other talented people

Come along.

Simply coming along will bring benefit. Other people will be interested in what you have to say even if you are in week 0 of your first ever job. Your view will still be valued. Everyone can learn from everybody and NO-ONE knows it all.

There will come a point where you will pass on a piece of knowledge or an idea or a way of working and someone will say thank you I didn’t know that. You just gave back. It may even be someone you look up to, someone whose knowledge and experience far outweighs yours whose word you hang on to. That feels good.

You may ask the questions that others thought but didn't ask, and boy are they glad you asked the question. You just gave back. It's something I do often. I ask questions, and sometimes I ask questions I know the answer to so that they will be of benefit to the group.

What will you get? More than you can ever put in. Free training, often free pizza, knowledge, advice, guidance, contacts, support, a network of people in your field, notice of job openings, swag, fun, friends, more social events and more and more

The user groups are run by volunteers in their own time out of the goodness of their hearts. They will always need your help. Turn up 5 minutes earlier and help set out the chairs or put out the handouts or assist with the tech. You just gave back. Afterwards before going to the pub clear the tables, help carry the boxes, put the pizza evidence in the bin. You just gave back

SQL Saturdays and other community events

SQL Saturdays are held all over the world most every Saturday. You can find out more about them here: https://www.sqlsaturday.com/ There are also other, larger events such as SQL Bits and SQL Relay here in the UK. Everything I wrote about user groups counts here, just on a slightly larger scale. You will be able to attend several sessions across many different areas for free on a Saturday.

These events are also run by volunteers and they will also need your help. If you can spare some time to help on a registration desk you just gave back. A room monitor to ensure the speaker and delegates have everything they need, the room is tidy and the session runs to time. You just gave back. Putting things out and tidying them away again. You just gave back.

You can become a volunteer by asking the people organising the events if they would like your help. These events will all have twitter feeds and emails and facebook pages and many methods of getting in touch. Contact them and offer your help if you can. You just gave back.

If you fancy taking the next step then you can get involved in organising the events. This is hard work, great fun, a good thing to add to your CV and you just gave back. There are so many areas to get involved in when organising an event. Premises and technology, speakers and printers, volunteers and sponsors all need co-ordination. Websites, twitter feeds, feedback forms, posters and marketing materials all need designing and producing. There are so many ways in which you will be able to provide value to the event, and you just gave back.

Oh and whilst I am at it, when you attend an event

Say Thank You to the volunteers. You just gave back.

Speaking and Blogging

All the events named above need speakers. The bigger events like the SQL Saturdays and the large events like SQL Bits will generally have more established speakers but every user group will need speakers and they will be more likely to accept new speakers and will be very supportive if you say that you are a new speaker. Every speaker had to make their first presentation at some point and they all know how it feels and can provide guidance and advice. You will feel that you don’t have anything to speak about that others will want to hear about. You do. Your experience and your knowledge or how you solved something or created something will be of interest to people. Of course, you need to check with the user group leaders and members if your idea for a presentation is suitable. Like anything you do that is new, researching it and taking advice from people with more experience is always useful. Maybe you can start with a lightning talk. Give it a go. You just gave back.

What do you get back from Speaking and Blogging?

I’ll tell you a secret. The vast majority of my posts ( This is an exception) are written for the benefit of one person. Me.

I write posts to record my learning for myself. To document my knowledge. I use my posts to help me to do my job. I know that I wrote the method of doing something somewhere and this is part of my store of knowledge. I write posts to answer people's questions so that I have a place to point them to. Occasionally people will email me asking a question and if it requires a detailed response I will write a post and email them to tell them that this is the answer. I often point my work colleagues at my blog when they ask me questions about Azure or PowerShell. You could also see your blog as an extension of your CV, use it when job hunting and develop it in that way.

I also write posts to expand my knowledge, and this is the same for speaking. When I am writing a blog post or a presentation I will read and watch videos and ensure I know more about it. The process of creating that content will improve my own knowledge and work practices, and you will find that, as you write your blog posts, you will have a deeper knowledge also. When you give your presentations you will learn as you answer questions, or find the answer to the question afterwards (it's OK to do that), and you will be improving yourself and your knowledge. You will also be giving back.

Putting your information online will enable people to find it. Sure you can worry about SEO and getting to the top of search pages but you know that sometimes the answer is on the ninth page. What you write will be of benefit to others and by taking the time to post you will be giving back to the community

You can do one, many or all of those and you will be giving back. I hope you do

I will be giving back. You will find me at SQL Bits where I shall be room monitoring and volunteering.

You will find me at SQL Saturday Exeter. I am again one of the fabulous SQL South West team, who are again organising a SQL Saturday in Exeter in the UK on April 24th/25th 2015. You can find out more here: http://sqlsouthwest.co.uk/sql-saturday-372/

You still have time, if you are quick, to submit a session to speak or present a pre-con at Exeter. Submissions close on 15th December and we would love to have yours

I shall carry on blogging and hopefully present at some user groups again this year. If you see me anywhere, come up and say hi to me. You just gave back.

Making a Change Log Easier With PowerShell

Having a Change Log is a good thing. A quick and simple place to find out what has changed on a server and when. This can be invaluable when troubleshooting, matching a change to a symptom especially when assessed alongside your performance counter collection. Here is a simple way to make use of a change log and automate it

Create a simple table


USE [MDW]
GO

CREATE TABLE [dbo].[ChangeLog](
 [ChangeID] [int] IDENTITY(1,1) PRIMARY KEY ,
 [Date] [datetime] NOT NULL,
 [Server] [varchar](50) NOT NULL,
 [UserName] [nvarchar](50) NOT NULL,
 [Change] [nvarchar](max) NOT NULL
)

GO

You can keep this on a central server or create a database on each server, whichever fits your needs best. You can add other columns if you want your information in a different format

Once you have your table you can create a couple of PowerShell functions to easily and quickly add to and retrieve data from the table. I make use of Invoke-SQLCMD2 in these functions.

This can then be included in any automation tasks that you use to update your environments, whether you are using automated deployment methods for releases or using SCCM to patch your environments, making the change log easy to keep up to date by making it part of your usual deployment process.
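
As an example, a deployment step might record its change like this, using the Add-ChangeLog function defined below (the server name and the deployment step itself are just placeholders):

# run the release step and then record it in the change log
$Server = 'SQL2014Ser12R2'

# ... deploy the release or apply the patch here ...

Add-ChangeLog $Server "Deployed SuperAppData release 1.2 via the automated release process"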

To add a new change

<#
.Synopsis
 A function to add ChangeLog information
.DESCRIPTION
 Load function for adding a change to the changelog table in the MDW database on MDWSERVER.
 Use Get-ChangeLog $Server to see details
 Inputs the username of the account running powershell into the database as the user
REQUIRES Invoke-SQLCMD2

http://sqldbawithabeard.com

.EXAMPLE
 Add-ChangeLog SERVERNAME "Altered AutoGrowth Settings for TempDB to None"

 Adds ServerName UserName and Altered AutoGrowth Settings for TempDB to None to the change log table
#>
Function Add-ChangeLog
{
[CmdletBinding()]
Param(
 [Parameter(Mandatory=$True)]
 [string]$Server,

 [Parameter(Mandatory=$True)]
 [string]$Change
)

$UserName = $env:USERDOMAIN + '\' + $env:USERNAME

$Query = "INSERT INTO [dbo].[ChangeLog]
 ([Date]
 ,[Server]
 ,[UserName]
 ,[Change])
 VALUES
 (GetDate()
 ,'$Server'
 ,'$UserName'
 ,'$Change')
"
Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query $Query -Verbose
}

You can then run

Add-ChangeLog SERVERNAME "Added New Database SuperAppData"

to add the change to the change log

To retrieve the data you can use

<#
.Synopsis
 A function to get ChangeLog information
.DESCRIPTION
 Load function for finding ChangeLog information. Information is selected from the MDW Database on SERVERNAME
REQUIRES Invoke-SQLCMD2

http://sqldbawithabeard.com

.EXAMPLE
 Get-ChangeLog SERVERNAME
#>
Function Get-ChangeLog
{
 [CmdletBinding()]
 [OutputType([int])]
 Param
 (
 # Server Name Required
 [Parameter(Mandatory=$true)]
 $Server
 )

$a = @{Expression={$_.Date};Label="Date";width=15}, `
@{Expression={$_.Server};Label="Server";width=10},
@{Expression={$_.UserName};Label="UserName";width=20}, `
@{Expression={$_.Change};Label="Change";width=18}

Invoke-Sqlcmd2 -ServerInstance MDWSERVER -Database "MDW" -Query "SELECT * FROM dbo.ChangeLog WHERE Server = '$Server';" -Verbose|Format-table $a -Auto -Wrap

}

and use

Get-ChangeLog SERVERNAME

to find out what changed and when. Happy Automating!

A look at the SQL Assessment Intelligence Pack in Operational Insights

Operational Insights is a service that has been added in preview to Azure. It enables you to collect, combine, correlate and visualize all your machine data in one place. It can collect data from all of your machines either via SCOM or by using an agent. Once the data is collected Operational Insights has a number of Intelligence Packs which have pre-configured rules and algorithms to provide analysis in various areas including for SQL Server

http://azure.microsoft.com/en-gb/services/operational-insights/

I thought I would take a look. I have an installation of SCOM in my lab on my laptop and I read the instructions to see how to connect it to Operational Insights. (You don't have to have a SCOM installation to use Operational Insights; you can make use of an agent as well, just follow the steps from the page below.)

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/

It really is very simple

If you have an Azure subscription already you can sign into the portal and join the preview program by clicking

New –> App Services –> Operational Insights

and create a new Operational Insights Workspace.

Once you have done that, if you have an installation of SCOM 2012, you need to be running Service Pack 1, then download and install the System Center Operational Insights Connector for Operations Manager and import the MPB files into SCOM.

If you have SCOM 2012 R2 the connector is already installed, and connecting your SCOM to Operational Insights is very, very easy, as you can see at

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/?step2=withaccount&step3=SCOMcustomer

  1. In the Operations Manager Console, click Administration.
  2. Under Administration, select System Center Advisor, and then click Advisor Connection.
  3. Click Register to Advisor Service.
  4. Sign in with your Microsoft or Organizational account.
  5. Choose an existing Operational Insights workspace from the drop down menu
  6. Confirm your changes.
  7. In the System Center Advisor Overview page, Under Actions, click Add a Computer/Group.
  8. Under Options, select Windows Server or All Instance Groups, and then search for and add the servers that you want to collect data from.

That is it. No really, that is it. I was amazed how quickly I was able to get this done in my lab, and it would not take very long in a large implementation of SCOM either, as you will have your groups of computers defined, which will make it easy to decide which groups to use. You could use a separate workspace for each type of server or split up the information per service. It really is very customisable.

Once you have done that, go and add some of the Intelligence Packs. Each Intelligence Pack will change the amount and type of data that is collected. As of November 23rd there are:

Alert Management – for your SCOM Alerts

Change Tracking – Tracking Configuration Changes

Log Management – for event log collection and interrogation

System Update Assessment – Missing Security Updates

Malware Assessment – Status of Anti-Malware and Anti-Virus scans

Capacity Planning – Identify Capacity and Utilisation bottlenecks

SQL Assessment – The risk and health of SQL Server Environment

There are also two ‘coming soon’ Intelligence packs

AD Assessment – Risk and health of Active Directory

Security – Explore security related data and help identify security breaches

You then (if you are like me) have a period of frustration whilst you wait for all of the data to be uploaded and aggregated but once it is you sign into the Operational Insights Portal

https://preview.opinsights.azure.com and it will look like this

[screenshot: Operational Insights portal overview]

There is a lot of information there. As it is on my laptop, and the lab is not running all of the time and is not connected to the internet most of the time, I am not surprised that there are some red parts to my assessment!!

Obviously I was interested in the SQL Assessment and I explored it a bit further

Clicking on the SQL Assessment tile takes you to a screen which shows the SQL Assessment broken down into six Focus Areas:

Security and Compliance, Availability and Business Continuity, Performance and Scalability, Upgrade, Migration and Deployment, Operations and Monitoring, and Change and Configuration Management. MSDN http://msdn.microsoft.com/en-us/library/azure/dn873967.aspx gives some more information about each one:

Security and Compliance – Safeguard the reputation of your organization by defending yourself from security threats and breaches, enforcing corporate policies, and meeting technical, legal and regulatory compliance requirements.

Availability and Business Continuity – Keep your services available and your business profitable by ensuring the resiliency of your infrastructure and by having the right level of business protection in the event of a disaster.

Performance and Scalability – Help your organization to grow and innovate by ensuring that your IT environment can meet current performance requirements and can respond quickly to changing business needs.

Upgrade, Migration and Deployment – Position your IT department to be the key driver of change and innovation, by taking full advantage of new enabling technologies to unlock more business value for organizational units, workforce and customers.

Operations and Monitoring – Lower your IT maintenance budget by streamlining your IT operations and implementing a comprehensive preventative maintenance program to maximize business performance.

Change and Configuration Management – Protect the day-to-day operations of your organization and ensure that changes won’t negatively affect the business by establishing change control procedures and by tracking and auditing system configurations.

You will be able to see some dials showing you how well you are doing in each area for the servers whose data has been collected.

[screenshot: SQL Assessment Focus Area dials]

Each area will have the High Priority Recommendations shown below the dial and you can click on them to see more information about those recommendations

[screenshot: High Priority Recommendations]

You can also click the dial or the see all link to enter the search area, where you can customise how you wish to see the data that has been collected. This looks a bit confusing at first.

[screenshot: Operational Insights search area]

The top bar contains the search box, the timescale, and some buttons to save the search, view the saved searches and view the search history, all of which will be shown in the right hand column below.

The left column contains a bar graph for the search and all of the filters. The middle column contains the results of the search and can be viewed in list or tabular format and exported to CSV using the button below. A little bit of experimentation will give you a better understanding of how the filtering works and how you can make use of that for your environment

By looking at the search for the Operations and Monitoring Focus Area shown above

Type:SQLAssessmentRecommendation IsRollup=true RecommendationPeriod=2014-11 FocusArea="Operations and Monitoring" RecommendationResult=Failed | sort RecommendationWeight desc

I saw that RecommendationResult=Failed and changed it to RecommendationResult=Passed. This enabled me to see all of the recommendations that had been passed in the Focus Area, and clicking the export button downloaded a CSV file. I deleted RecommendationResult=Passed from the search and that gave me all of the recommendations that made up that Focus Area.
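
For reference, the search for the passed recommendations in that Focus Area is simply the original query with the result changed:

Type:SQLAssessmentRecommendation IsRollup=true RecommendationPeriod=2014-11 FocusArea="Operations and Monitoring" RecommendationResult=Passed | sort RecommendationWeight desc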

Operations and Monitoring Focus Area

Recommendation
Enable Remote Desktop on servers.
Enable Remote Desktop on virtual machines.
Ensure computers are able to download updates.
Configure event logs to overwrite or archive old events automatically.
Review event log configuration to ensure event data is retained automatically. This relates to System Logs
Review event log configuration to ensure event data is retained automatically. This relates to Application Logs

I decided then to do the same for each of the Focus Areas for the SQL Assessment Intelligence Pack

Security and Compliance Focus Area

Recommendation
Change passwords that are the same as the login name.
Remove logins with blank passwords.
LAN Manager Hash for Passwords Stored
Investigate why unsigned kernel modules were loaded.
Apply security best practices to contained databases.
Enable User Account control on all computers.
Consider disabling the xp_cmdshell extended stored procedure.
Implement Windows authentication on Microsoft Azure-hosted SQL Server deployments.
Avoid using the Local System account to run the SQL Server service.
Avoid adding users to the db_owner database role.
Ensure only essential users are added to the SQL Server sysadmin server role.
Disable SQL Server guest user in all user databases.
Avoid running SQL Server Agent jobs using highly-privileged accounts.
Configure the SQL Server Agent service to use a recommended account.
Apply Windows password policies to SQL Server logins.
Investigate failures to validate the integrity of protected files.
Investigate failures to validate kernel modules.

Availability and Business Continuity Focus Area

Recommendation
Schedule full database backups at least weekly.
Optimize your backup strategy with Microsoft Azure Blob Storage.
Avoid using the Simple database recovery model.
Ensure all installations of Windows are activated.
Investigate logical disk errors.
Reduce the maximum Kerberos access token size.
Investigate connection failures due to SSPI context errors.
Set the PAGE_VERIFY database option to CHECKSUM.
Increase free space on system drives.
Investigate a write error on a disk.
Check the network access to Active Directory domain controllers.
Review DNS configuration on non-DNS servers.
Increase free space on system drives.
Investigate memory dumps.
Increase free space on system drives.
Investigate why the computer shut down unexpectedly.
Enable dynamic DNS registration for domain-joined servers.

Performance and Scalability Focus Area

Recommendation
Increase the number of tempdb database files.
Configure the tempdb database to reduce page allocation contention.
Ensure all tempdb database files have identical initial sizes and growth increments.
Set autogrowth increments for database files and log files to fixed values rather than percentage values.
Set autogrowth increments for transaction log files to less than 1GB.
Modify auto-grow settings to use a fixed size growth increment of less than 1GB and consider enabling Instant File Initialization.
Change your Affinity Mask and Affinity I/O MASK settings to prevent conflicts.
Resolve issues caused by excessive virtual log files.
Modify the database file layout for databases larger than 1TB.
Set the AUTO_CLOSE option to OFF for frequently accessed databases.
Review memory requirements on servers with less than 4GB of physical memory installed.
Configure system SiteName properties to be dynamic.
Align the Max Degree of Parallelism option to the number of logical processors.
Align the Max Degree of Parallelism option to the number of logical processors.
Consider disabling the AUTO_SHRINK database option.
Review memory requirements on computers with high paging file use.
Ensure SQL Server does not consume memory required by other applications and system components.
Consider changing your power saving settings to optimize performance.
Increase the initial size of the tempdb database.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review your paging file settings.
Review and optimize memory cache configuration.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review the system processor scheduling mode.
Review network provider ordering settings.
Remove invalid entries from the PATH environment variable.
Remove network entries from the PATH environment variable.
Investigate processes that use a large number of threads.
Avoid hosting user database files on the same disk volume as tempdb database files.
Review processes with large working set sizes.
Reduce the length of the PATH environment variable.
Reduce the number of entries in the PATH environment variable.
Ensure SQL Server does not consume memory required by other applications and system components.
Enable the backup compression default configuration option.
Ensure the DNS Client service is running and is set to start automatically.
Consider compressing database tables and indexes.

Upgrade, Migration and Deployment Focus Area

Recommendation
Ensure all devices run supported operating system versions.
Ensure that the guest user is enabled in the msdb database.
Avoid using the Affinity64 Mask configuration setting in new development work.
Avoid using the Affinity Mask configuration setting in new development work.
Avoid using the Affinity I/O Mask configuration setting in new development work.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Affinity64 I/O Mask configuration setting in new development work.
Configure SQL Server to accept incoming connections.
Configure SQL Server instances and firewalls to allow communication over TCP/IP.

As I have no data for Change and Configuration Management I was not able to see the recommendations in my Operational Insights workspace.

Edit: Daniele Muscetta has said in the comments that this is a bug which is being tracked

As you can see from the type and description of the recommendations above these are all areas that a DBA will be concerned about and the benefit of having all of this information gathered, pre-sorted, prioritised and presented to you in this manner will enable you to work towards a better SQL environment and track your progress. You can read more about the SQL Assessment Intelligence Pack here

http://msdn.microsoft.com/en-us/library/azure/dn873958.aspx

As well as the pre-determined queries that are built into the Intelligence pack you can search your data in any way that you require enabling you to present information about the health and risk of your SQL Environment to your team or your management with ease. The “with ease” bit is dependent on you understanding the language and structure of the search queries.

You will need to put this page into your bookmarks

http://msdn.microsoft.com/library/azure/dn873997.aspx

as it contains the syntax and definitions to search your data

A very useful page for a starter like me is

http://blogs.msdn.com/b/dmuscett/archive/2014/10/19/advisor-searches-collection.aspx

by Daniele Muscetta which has a list of useful Operational Insights search queries such as

SQL Recommendation by Computer

Type=SQLAssessmentRecommendation IsRollup=false RecommendationResult=Failed | measure count() by Computer

If you click the star to the right of the search box you will find the saved searches. For the SQL Assessment Intelligence Pack there are

Did the agent pass the prerequisite check (if not, SQL Assessment data won’t be complete)
Focus Areas
How many SQL Recommendations are affecting a Computer, a SQL Instance or a Database?
How many times did each unique SQL Recommendation trigger?
SQL Assessments passed by Server
SQL Recommendation by Computer
SQL Recommendation by Database
SQL Recommendation by Instance

You can use these and you can save your own searches which show the data in a way that is valuable to you.

Overall I am impressed with this tool and can see how it can be beneficial for a DBA as well as for System Administrators. I was amazed how easy it was to set up and how quickly I was able to start manipulating the data once it had been uploaded.