#tsql2sday – Giving Back – Reprise

It’s T-SQL Tuesday again! This month our host is Riley Major (b/t) and the subject is Giving Back. He’s given us two options here (as well as the side option of your favorite 2017 improvement). Pick a way that you’d like to give back to the community and talk about it, or if you already give back, tell us how and why you started.

 

There will be a lot of excellent posts on this subject, and one of the things that I like is that you can go to http://tsqltuesday.com and look up all of the entries. That means that for a particular topic you can find a bunch of blog posts from different angles (I mis-typed that as angels first, and yes, they are all angels!), whether it is career improvement or something technical like, say, Extended Events. It is all there waiting for you 🙂

I wrote about giving back in 2014 when I was about to help organise SQL Saturday Exeter for the first time. Last year I blogged about a few of the wonderful people who made a difference to me. So this post is a little about how I give back and also hopefully some hints that can help you to do the same as well.

How

There are so many ways that you can give back to the community as Riley’s post shows.

Sharing your knowledge is a good way. I share my knowledge in blog posts and in sessions at user groups and at conferences. You can too.

From beginner, introductory posts and talks to expert-level deep dives, every single one of us has learned from the blog posts that other people have spent their time and effort creating.

YOU can share your knowledge. Even if you have only been using a technology for a few months, you have knowledge of the things you have learned and the things that would have made it easier. Write a post about those things.

Answering questions is another way. You can do this on Stack Overflow, SQL Server Central, PowerShell.org, Reddit, Facebook, on Twitter using the #sqlhelp or #PowerShellHelp hashtags, or just in person. I try to answer questions when I see them on Twitter or in the SQL Community Slack.

Mentoring, or just providing feedback to people, is another. You can offer to proofread blog posts or abstract submissions, or you can listen to people’s presentations. I do this and it is a lot of fun.

I also share my knowledge via my GitHub. All of my presentation slides and code are available, as well as other code that I use. I also contribute to open-source projects such as dbatools and dbachecks. You can do this too. You could open an issue for an improvement or bug, or you can contribute your code. Even if you are not confident writing code, you can fix spelling mistakes or add documentation. Everything helps.

You can help with the organisation of events. I have helped to organise the PowerShell Europe Conference, PSDay.UK, SQL Saturday Exeter, the SQL SouthWest user group and the PASS PowerShell Virtual Group. I have also helped with session choices for a number of other events like SQL Grillen and SQLGLA, and I have volunteered at many events from SQL Saturdays to SQLBits. Everything from setting up and tearing down (i.e. moving heavy things and cleaning up rubbish) to sitting on the information desk, giving out badges and making sure the speakers are on time in their sessions. You can do this too. Just ask the organisers of the events what they need. It is better to do this prior to the event than on the day, but I am sure all help is welcomed. Richard Munn and I talked (waffled?) about this at SQLBits this year.

If you would like to be considered as a volunteer for next year’s SQLBits please email helpers at sqlbits.com

Why

So why did I start giving back?

I wanted to be useful. I saw the amount of work that Jonathan and Annette were doing organising SQL Saturday Exeter and SQL South West. I hoped that I could help them with that.

Why do I carry on doing it?

Because it is fun 🙂 I enjoy speaking, I enjoy sharing my knowledge and talking to people.

That is good, but there is more to it as well.

You learn so much by writing a presentation or a blog post because you will do research.

You will learn even more when people ask you questions in your sessions or leave comments on your blog posts and you have to go and find the answers.

You learn new and useful skills and demonstrate your knowledge to potential employers.

My blog is a scrapbook of knowledge that I go back to and use all the time (and sometimes I forget that I have written something and find my own post in the search results!)

But the most important reason is that I feel it pays back some of the benefit that I have gained from all of those people whose time and effort I made use of for free when I was learning, and continue to do so to this day. All of those blog posts and videos and presentations that I consumed have helped to make me the technician I am today. I have skills and abilities that I would not have without them all, and by giving back I hope that I am enabling others to develop and see the benefit of sharing, so that they will continue to do so in the future and I can learn from them.


Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open-source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about using dbatools with Pester to validate that everything is how it should be, and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my Power BI for Pester file and made it beautiful, and whilst we were discussing this we found that if the Pester tests were not formatted correctly the Power BI looked … well, rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were

  • The Describe title should be in double quotes
  • The Describe should use the plural Tags parameter
  • The Tags should be singular
  • The first Tag should be a unique tag in Get-DbcConfig
  • The Context title should end with $psitem
  • The code should use Get-SqlInstance or Get-ComputerName
  • The code should use the ForEach method
  • The code should not use $_
  • The code should contain a Context block

She asked me if I could write the Pester tests for it, and this is how I did it. I needed to look at the Tags parameter of each Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don’t know very much about this, but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester, so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.

The first thing I did was to get the Pester Tests which we have located in a checks folder and loop through them and get the content of the file with the Raw parameter

Describe "Checking that each dbachecks Pester test is correctly formatted for Power Bi and Coded correctly" {
    $Checks = (Get-ChildItem $ModuleBase\checks).Where{$_.Name -ne 'HADR.Tests.ps1'}
    $Checks.ForEach{
        $Check = Get-Content $PSItem.FullName -Raw
        Context "$($_.Name) - Checking Describes titles and tags" {
Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
$Describes = [Management.Automation.Language.Parser]::ParseInput($Check, [ref]$tokens, [ref]$errors).
FindAll([Func[Management.Automation.Language.Ast, bool]] {
        param($ast)
        $ast.CommandElements -and
        $ast.CommandElements[0].Value -eq 'describe'
    }, $true) |
    ForEach {
        $CE = $_.CommandElements
        $secondString = ($CE | Where { $_.StaticType.Name -eq 'string' })[1]
        $tagIdx = $CE.IndexOf(($CE | Where ParameterName -eq 'Tags')) + 1
        $tags = if ($tagIdx -and $tagIdx -lt $CE.Count) {
            $CE[$tagIdx].Extent
        }
        New-Object PSCustomObject -Property @{
            Name = $secondString
            Tags = $tags
        }
    }
As I understand it, this code is using the Parser on $Check (which contains the code from the file), finding all of the Describe commands, and creating an object with the title of the Describe (the second string element) and the values from the Tags parameter.
When I ran this against the database tests file I got the following results.
Then it was a simple case of writing some tests for the values.
@($describes).ForEach{
    $title = $PSItem.Name.ToString().Trim('"').Trim('''')
    It "$title Should Use a double quote after the Describe" {
        $PSItem.Name.ToString().StartsWith('"') | Should Be $true
        $PSItem.Name.ToString().EndsWith('"') | Should Be $true
    }
    It "$title should use a plural for tags" {
        $PSItem.Tags | Should Not BeNullOrEmpty
    }
    # a simple test for no esses apart from statistics and Access!!
    if ($null -ne $PSItem.Tags) {
        $PSItem.Tags.Text.Split(',').Trim().Where{($_ -ne '$filename') -and ($_ -notlike '*statistics*') -and ($_ -notlike '*BackupPathAccess*')}.ForEach{
            It "$PSItem Should Be Singular" {
                $_.ToString().EndsWith('s') | Should Be $false
            }
        }
        It "The first Tag Should Be in the unique Tags returned from Get-DbcCheck" {
            $UniqueTags -contains $PSItem.Tags.Text.Split(',')[0].ToString() | Should Be $true
        }
    }
    else {
        It "You haven't used the Tags Parameter so we can't check the tags" {
            $false | Should Be $true
        }
    }
}

The Describes variable is inside @() so that if there is only one, the ForEach method will still work. The unique tags are returned from our command Get-DbcCheck, which shows all of the checks. We will have a unique tag for each test so that they can be run individually.
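As a quick illustration of the @() trick (a made-up snippet, not from the module), compare what you get with and without it:

$result = 'just one Describe'     # a scalar, not an array
@($result).GetType().Name         # Object[] - @() guarantees an array
@($result).ForEach{ "Checking: $PSItem" }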

Yes, I have tried to ensure that the tags are singular by checking that they do not end with an s, which meant excluding BackupPathAccess and statistics. $filename is a variable that we add to the Tags of each Describe so that we can run all of the tests in one file. I also added a little if block to the Pester so that the error is more obvious if the Tags parameter was not passed.

I did the same with the context blocks as well

Context "$($_.Name) - Checking Contexts" {
    ## Find the Contexts
    $Contexts = [Management.Automation.Language.Parser]::ParseInput($Check, [ref]$tokens, [ref]$errors).
    FindAll([Func[Management.Automation.Language.Ast, bool]] {
            param($ast)
            $ast.CommandElements -and
            $ast.CommandElements[0].Value -eq 'Context'
        }, $true) |
        ForEach {
            $CE = $_.CommandElements
            $secondString = ($CE | Where { $_.StaticType.Name -eq 'string' })[1]
            New-Object PSCustomObject -Property @{
                Name = $secondString
            }
        }
    @($Contexts).ForEach{
        $title = $PSItem.Name.ToString().Trim('"').Trim('''')
        It "$title Should end with `$psitem So that the PowerBi will work correctly" {
            $PSItem.Name.ToString().EndsWith('psitem"') | Should Be $true
        }
    }
}
This time we look for the Context command and ensure that the string value ends with psitem, as the Power BI parses the last value when creating columns.
Finally, I get all of the code and check if it matches some coding standards.
Context "$($_.Name) - Checking Code" {
    ## This just grabs all the code
    $AST = [System.Management.Automation.Language.Parser]::ParseInput($Check, [ref]$null, [ref]$null)
    $Statements = $AST.EndBlock.Statements.Extent
    ## Ignore the filename line
    @($Statements.Where{$_.StartLineNumber -ne 1}).ForEach{
        $title = [regex]::matches($PSItem.Text, "Describe(.*)-Tag").Groups[1].Value.Replace('"', '').Replace('''', '').Trim()
        It "$title Should Use Get-SqlInstance or Get-ComputerName" {
            ($PSItem.Text -match 'Get-SqlInstance') -or ($PSItem.Text -match 'Get-ComputerName') | Should Be $true
        }
        It "$title Should use the ForEach Method" {
            ($PSItem.Text -match 'Get-SqlInstance\).ForEach{') -or ($PSItem.Text -match 'Get-ComputerName\).ForEach{') | Should Be $true # use the \ to escape the )
        }
        It "$title Should not use `$_" {
            ($PSItem.Text -match '\$_') | Should Be $false
        }
        It "$title Should Contain a Context Block" {
            $PSItem.Text -match 'Context' | Should Be $True
        }
    }
}

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are, and then loop through each statement apart from the first line to ensure that the code is using our internal commands Get-SqlInstance or Get-ComputerName to get information, that we are looping through each of those arrays using the ForEach method rather than ForEach-Object, that we are using $psitem rather than $_ to reference the current item in the array, and that each Describe block has a Context block.
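To show what that regex is doing, here is a worked example using a made-up Describe line (the title and tags are invented for illustration):

# A hypothetical line from a checks file, just to show the regex at work
$text = 'Describe "Database Collation" -Tags DatabaseCollation, $filename {'
$title = [regex]::matches($text, "Describe(.*)-Tag").Groups[1].Value.Replace('"', '').Trim()
$title # Database Collation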

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and that the Power BI results still look beautiful!

Anyone can run the tests using

Invoke-Pester .\tests\Unit.Tests.ps1 -show Fails
before they create a Pull Request. If everything is green then they can submit their Pull Request 🙂 If not, they can see quickly that something needs to be fixed (fail early 🙂 )

Why Volunteer at SQLBits?

WARNING – Contains Orange and light-hearted photos 😉

I have returned home from SQLBits 2017 The Disco Edition. I am exhausted, my body is pointing out to me in a variety of ways that this is the only week of the year that I spend so much time on my feet. Why would anyone do it?

Many months of work

First though, you need to know that the SQLBits conference is organised and run by volunteers. All of the committee spend many, many long hours of their own free time, for many months before and after the event, to ensure that the experience of attendees, sponsors, speakers and volunteers is trouble-free. I think that they do an amazing and fantastic job and repeatedly pull off the best, most enjoyable conference that I have been to.

Thank you Simon, Chris, Darren, Allan, Alex, Jonathan, Annette

Thank you also to their families as well, who undoubtedly miss out on time with them whilst they are organising everything to do with the event, from finding venues, organising dates, speakers, marketing, website, sponsors, printing, audio visual, THE PARTY!! and all the other big and small things that it takes to make an event of that size occur.

Orange Shirted Wonderful Folk

There is another group of volunteers that you will have seen at SQLBits. For the last couple of years we have been the ones in the orange shirts.

Here is the Class of 2017


I think this is a brilliant colour as it makes us easy to spot (although a couple of attendees who also had orange tops on did get stopped and asked for directions 🙂 )

What do they do?

These folk come in early and get everything set up. Sometimes we have to explain that the event isn’t ready for you yet

We sort out the registration desk and greet every attendee, speaker and sponsor and assist them.


We help the speakers get set up


and ensure they have everything they need.


Aaron Bertrand and John Martin from SentryOne said that it is the best experience for a speaker that they have had anywhere.

We direct and assist the attendees to be in the right place, sometimes with some flair!


We ensure that any issues are resolved quickly and with as little distraction as possible. The room is too hot, too cold, too noisy or too quiet. The projector isn’t working or the speaker’s microphone has a buzz, there is too much light or too little. The water coolers are empty. The rubbish needs picking up. All these and many other minor complications are communicated and passed to the correct people to get resolved.


Sometimes we have to resolve our own issues. We had folks who were called by their work and had to stop helping and go back to their day jobs for a few hours. We all understand what it is like for people working in technology and adapt and manage accordingly. In almost every photo I took, there is someone in an orange shirt to be seen.


We answer numerous questions from the 1500 or so attendees (and the odd sheep) who came this year.


From timings and locations to taxi numbers or restaurants. Unfortunately I did not beat last year’s “Best question I have been asked at SQLBits”, which was

Excuse me, I have a Dalek in the van. What would you like me to do with it?

I was even asked questions on the way back to the hotel gone midnight on Saturday!!

We stay afterwards and help to get ready for the next day, putting out the new signs for the domes and the required paperwork.

So why do we do it?

I asked the guys and gals this question and in their own words, this is why they do it

Being a volunteer at SQLBits is not easy. I’m writing this three days after the event and my legs are still sore. Most days are 11 hours long and you will be standing for most of them. Very often the sessions are full, so you’ll be giving up your seat to one of the attendees. Lunches and breaks are shorter as you are either cleaning down the last session or getting ready for the next. When things go wrong, and they do, you’ll need to get them fixed as quickly as possible even if you have not had coffee yet.

You do get to attend sessions but you might not always get your first choice. This can be both a good and bad thing. Very often I have filled in on sessions that I normally wouldn’t attend as they are outside my direct area of work, only to find them the most interesting as I get to see how the other half lives.

So why do I keep coming back? Well, it’s fun. We have a laugh; even on reception when it’s busy you get to joke with the attendees, speakers and other helpers. There is pizza, beer and jokes while bag packing. Odd espresso calls! Working alongside someone else is a great way to get to know them. I live outside the normal SQL community structures; my nearest user group is 150 miles away. So I don’t get to interact with other SQL family members as often as others. But even so, I know as soon as I walk into SQLBits, there will be a chorus of, “Hey Conan, how have you been?” from people I haven’t seen in a year. There is also something about wearing a bright orange shirt that seems to attract interactions from the attendees.

All because of the experience that is being a volunteer.

Conan Farrell


I owe a lot to the SQL Community. It was at SQLBits a few years ago that someone convinced me to start speaking. That encouragement and acceptance from the #SQLFamily put into motion a series of events that led to me quitting the best job of my life last year in favour of an adventure to set up my own company. It’s been a wonderful journey and if it had not been for SQLBits (and SQL Relay, and SQL Saturdays, and others) I wouldn’t have taken it.

I have a debt to pay. And it’s wonderful to be able to contribute towards giving other people the same opportunities that the community has given me.

Also, for similar reasons, I recently joined the committee for SQL Relay. While I’ve been a sponsor and a speaker a lot before, I am fairly inexperienced at organising events. Helping out with the awesome SQLBits team has been a great learning curve. I hope to take what I have learned and apply it to my role for SQL Relay.

Finally, frankly, there are so many great people in the SQL Community it is just wonderful to be able to work with them during the day. (And share some beers with them in the evening!)

Alex Yates


I volunteer at SQLBits and in the wider Data Platform community for various reasons. The community is the glue in our industry and being part of that glue is both a privilege and an honour. Without volunteers these awesome events couldn’t function and we couldn’t share knowledge as freely. There also would not be the same opportunities to network with such great people and experts in the same field. I am proud to be part of the SQL Family and facilitate the learning that being a volunteer offers to professionals just like me. Stronger together.

Paul Andrew


When I volunteered, I felt I’d like to give back to the SQL Community a bit of the much I’ve received from them. I wanted to become more engaged.
I didn’t even dream it would be like this, the terrific team I found here was so great that I can barely wait for the next one.

Miguel Oliveira

Volunteering gives you the experience of the effort and reward of running an event. You build an appreciation of generating success, creating networks of colleagues and friends, and being quick to react, and it helps you walk away with the feeling of “I contributed to this”. Everyone should volunteer, even if it’s just once.

Ben Watt


This year was my 10th SQLBits (remembering we skipped unlucky 13) and the 7th I’ve been on the helping team for. I started helping when I learned my original conference companions were moving on to different technologies. Something I’ve never understood.

Historically I’ve worked in multiple companies as a lone DBA, something I’ve found many at SQLBits can relate to. Through user groups and SQLBits I’ve met lots of others in the same boat over the years. It can be a frustrating job defending servers, implementing best practice and writing SQL when all your application team want to do is add columns and ship the next release!

Yes, there are good networking opportunities and the parties are great, but at the core is great quality training. I’ve had a great time playing a small part in helping deliver the experience over the years. Occasionally I have to shout about feedback forms or seating arrangements, but on the whole folk are fine with it. If I’m honest, they are long days. A typical day involves a start and end meeting in addition to the sessions you are monitoring. Add to that hotel commutes, catching up with the friends you made last year, meals, drinks etc. and it’s 1am. Oh, and tomorrow’s meeting is at 7.30, eek….

It’s been a great journey watching the progression of events. Each one adds to the last, they run a lot slicker now. The sponsors are first class and always embrace the theme with their stalls and giveaways. I do wish more folk would enter the prize draws and stop me winning though, it’s getting embarrassing now 🙂

Richard Doering 


The best Data Platform conference in Europe! I couldn’t miss the opportunity to be much more than just an attendee.

Hubert Kobierzewski

I volunteer for SQLBits to get experience from a multi-day event that has a lot of attendees. Lots of takeaways to make local community events run smoother for the delegates.
That’s my main reason

Jens Vestergard

There is no way I could not use this next photo to celebrate this year’s Most Valuable Helper.


Thank you Shaun Atkinson, you were amazing, and thank you for looking after my former intern James as a first-time volunteer.

How can I do it next year?

We always welcome new people to our team. If you would like to volunteer your time to help at the next SQLBits please send an email to helpers@sqlbits.com with a subject of “I would like to volunteer” or something similar. Nearer the time of next year’s SQLBits (no, I don’t know when or where it is; I will know when you do) the awesome, amazing, fantastic, brilliant, organising amazeballs also known as Annette will get in touch with you with further details.

You can expect to have a lot of fun, make new friends and help make the best data platform conference the best data platform conference.

Oh, and sometimes there’s free pizza 🙂


Another Thank You

Some lovely people (I am really sorry but I didn’t get everyone’s name) brought sweets, biscuits, cakes and other goodies for the crew. Thank you, they were very much appreciated.

Pictures because smiles 🙂

Not all of the volunteers wanted to give feedback publicly, but I had some cracking photos so I thought I would share them as well. Enjoy.


PowerShell, Pester and Ola Hallengren’s Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren’s Maintenance Solution. If you haven’t, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQLBits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great getting-started guide here which will show you how to get Pester and how to get started with TDD.
This isn’t TDD though, this is Environment Validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution, and made a list of the things that I would check, which looked something like this. This would be the checklist you would create (or have already created) for yourself or a junior following this install. This is how easily you can turn that checklist into a Pester test, remove the human element and open your install up to automated testing.
  • SQL Server Agent is running – otherwise the jobs won’t run 🙂
  • We should have 4 backup jobs with a name of
      • DatabaseBackup – SYSTEM_DATABASES – FULL
      • DatabaseBackup – USER_DATABASES – FULL
      • DatabaseBackup – USER_DATABASES – DIFF
      • DatabaseBackup – USER_DATABASES – LOG
  • We should have Integrity Check and Index Optimisation jobs
  • We should have the clean up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I was being more careful I would have checked if there were the correct folders in the folder I was backing up to.

Ola’s script uses a default naming convention, which makes this easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder, or if there is an Availability Group a CLUSTERNAME$AGNAME folder, and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist.

So now we have our checklist, we just need to turn it into a Pester Environmental Validation script.

It would be useful to be able to pass in a number of instances so we will start with a foreach loop and then a Describe Block then split the server name and instance name, get the agent jobs and set the backup folder name

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
    BeforeAll {
        $Jobs = Get-SqlAgentJob -ServerInstance $Server
        $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
        $dbs = $srv.Databases.Where{$_.Status -eq 'Normal'}.Name
        if ($InstanceName)
        {
            $DisplayName = "SQL Server Agent ($InstanceName)"
            $Folder = $ServerName + '$' + $InstanceName
        }
        else
        {
            $DisplayName = 'SQL Server Agent (MSSQLSERVER)'
            $Folder = $ServerName
        }
    }
    if ($CheckForBackups -eq $true)
    {
        $CheckForDBFolders = $true
    }
    $Root = $Share + '\' + $Folder
I also set the Agent service display name so I can get its status. I split the jobs up using a Context block, one each for backups, database maintenance and solution clean-up, but they all follow the same pattern. First, get the jobs

$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let’s test the Agent service. You do this with an It block, and in it put a single test like this

actual-value | Should Be expected-value
So to check the Agent service is running we can do this

(Get-Service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test, I check using Get-Member. So, to see what is available for a job, I gathered the Agent jobs into a variable using the Get-SqlAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and their values using Select-Object

$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this

$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach ($Job in $Jobs)
{
    $JobName = $Job.Name
    It "$JobName Job Exists" {
        $Job | Should Not BeNullOrEmpty
    }
    It "$JobName Job is enabled" {
        $Job.IsEnabled | Should Be 'True'
    }
    It "$JobName Job has schedule" {
        $Job.HasSchedule | Should Be 'True'
    }
    if ($DontCheckJobOutcome -eq $false)
    {
        It "$JobName Job succeeded" {
            $Job.LastRunOutcome | Should Be 'Succeeded'
        }
    }
}
So I have checked the agent and the jobs, and now I want to check that the folders exist. First for the instance, using Test-Path, so the user running the PowerShell session must have privileges and access to list the files and folders

Context "$Share Share For $Server" {
    It "Should have the root folder $Root" {
        Test-Path $Root | Should Be $true
    }
Then for every database we need to set some variables for the folder path. We don’t back up tempdb, so we ignore that, and then check if the server is SQL 2012 or above; if it is, we check if the database is a member of an Availability Group and set the folder name appropriately

foreach ($db in $dbs.Where{$_ -ne 'tempdb'})
{
    if ($Srv.VersionMajor -ge 11)
    {
        if ($srv.Databases[$db].AvailabilityGroupName)
        {
            $AG = $srv.Databases[$db].AvailabilityGroupName
            $Cluster = $srv.ClusterName
            $OLAAg = $Cluster + '$' + $AG
            if ($Share.StartsWith('\\') -eq $false)
            {
                $UNC = $Share.Replace(':', '$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $OLAAg
            }
            else
            {
                $Root = $Share + '\' + $OLAAg
            }
        }
        else
        {
            if ($Share.StartsWith('\\') -eq $false)
            {
                $UNC = $Share.Replace(':', '$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
            }
            else
            {
                $Root = $Share + '\' + $Folder
            }
        }
    }
    $db = $db.Replace(' ', '')
    $Dbfolder = $Root + "\$db"
    $Full = $Dbfolder + '\FULL'
    $Diff = $Dbfolder + '\DIFF'
    $Log = $Dbfolder + '\LOG'
If ($CheckForDBFolders -eq $True)
{
    Context "Folder Check for $db on $Server on $Share" {
        It "Should have a folder for $db database" {
            Test-Path $Dbfolder | Should Be $true
        }
But we need some logic for checking the folders, because Ola is smart and checks for log shipped databases so as not to break the LSN chain, system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares

If ($CheckForDBFolders -eq $True)
{
    Context "Folder Check for $db on $Server on $Share" {
        It "Should have a folder for $db database" {
            Test-Path $Dbfolder | Should Be $true
        }
        if ($Db -notin ('master', 'msdb', 'model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
            It 'Has a Log Folder' {
                [System.IO.Directory]::Exists($Log) | Should Be $True
            }
        }
        elseif ((($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and $Db -notin ('master', 'msdb', 'model')) -or ($LSDatabases -contains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
        }
        else
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
        }
    } # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I watched Øyvind Kallstad’s PowerShell Conference session Chasing the seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with

If ($CheckForBackups -eq $true)
{
    Context "File Check For $db on $Server on $Share" {
        $FullCreate = [System.IO.Directory]::GetCreationTime($Full)
        $FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
        if ($FullCreate -eq $FullWrite)
        {
            It "Has Files in the FULL folder for $db" {
                Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
            }
        }
        else
        {
            It "Has Files in the FULL folder for $db" {
                $FullCreate | Should BeLessThan $FullWrite
            }
        }
        It 'Full File Folder was written to within the last 7 days' {
            $FullWrite | Should BeGreaterThan (Get-Date).AddDays(-7)
        }
        if ($Db -notin ('master', 'msdb', 'model'))
        {
            $DiffCreate = [System.IO.Directory]::GetCreationTime($Diff)
            $DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
            if ($DiffCreate -eq $DiffWrite)
            {
                It "Has Files in the DIFF folder for $db" {
                    Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the DIFF folder for $db" {
                    $DiffCreate | Should BeLessThan $DiffWrite
                }
            }
            It 'Diff File Folder was written to within the last 24 Hours' {
                $DiffWrite | Should BeGreaterThan (Get-Date).AddHours(-24)
            }
        }
        if ($Db -notin ('master', 'msdb', 'model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            $LogCreate = [System.IO.Directory]::GetCreationTime($Log)
            $LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
            if ($LogCreate -eq $LogWrite)
            {
                It "Has Files in the LOG folder for $db" {
                    Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the LOG folder for $db" {
                    $LogCreate | Should BeLessThan $LogWrite
                }
            }
            It 'Log File Folder was written to within the last 30 minutes' {
                $LogWrite | Should BeGreaterThan (Get-Date).AddMinutes(-30)
            }
        } # Simple Recovery
    }
} # Check for backups
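If you want to try the timing comparison on your own folders, a rough harness looks like this (the UNC path is just an example):

# Compare the two approaches on a folder - the path here is an example
$folder = '\\Server1\BackupShare1\SERVER1\master\LOG'
(Measure-Command { Get-ChildItem $folder }).TotalMilliseconds
(Measure-Command {
    [System.IO.Directory]::GetCreationTime($folder)
    [System.IO.Directory]::GetLastWriteTime($folder)
}).TotalMilliseconds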
You could just run the script you have created from your checklist; hopefully this blog post can help you see that you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this

[CmdletBinding()]
## Pester Test to check OLA
Param(
$Instance,
$CheckForBackups,
$CheckForDBFolders,
$JobSuffix ,
$Share ,
[switch]$NoDatabaseRestoreCheck,
[switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this

$Script = @{
    Path = $Path
    Parameters = @{
        Instance = $Instance
        CheckForBackups = $true
        CheckForDBFolders = $true
        JobSuffix = 'BackupShare1'
        Share = '\\Server1\BackupShare1'
        NoDatabaseRestoreCheck = $true
        DontCheckJobOutcome = $true
    }
}
Invoke-Pester -Script $Script
but that’s a bit messy, hard to remember and won’t encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it in GitHub: Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on line 90 of Test-OlaInstance so that the script can find it.
Once you have that you can call it for a single instance or a number of instances like so. Here I check for Folders and Backup files
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking if the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases.)
This gives me a nice and simple automated method of checking if Ola’s maintenance script has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.
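As a sketch of what that build step might look like (the file names and instance here are examples, not part of the repo), Pester can write NUnit-style XML that most CI servers understand and exit with a non-zero code when tests fail:

# Example build step - write NUnit XML for the CI server and fail the build on test failures
$Script = @{
    Path       = '.\Test-Ola.ps1'   # example path to the test script
    Parameters = @{ Instance = 'SQL2014Ser12R2'; Share = 'H:\'; CheckForBackups = $true }
}
Invoke-Pester -Script $Script -OutputFile .\OlaResults.xml -OutputFormat NUnitXml -EnableExit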
I hope you find it useful

Scheduling Ola Hallengren’s Maintenance Solution Default Jobs with PowerShell

If you are a SQL Server DBA you should know about Ola Hallengren and will probably have investigated his Maintenance Solution.

If you haven’t please start here https://ola.hallengren.com/

You can also watch his presentation at SQLBits at this link

http://sqlbits.com/Sessions/Event9/Inside_Ola_Hallengrens_Maintenance_Solution

where he talks about and demonstrates the solution.

It is possible to just run his script to install the solution, schedule the jobs, and know that you have made a good start in keeping your databases safe. You should be more proactive than that and set up specific jobs for your own particular requirements, but you can and should find that information in other places, including the FAQ on Ola’s site.

I particularly like the parameter @ChangeBackupType which, when running the transaction log or differential backup, will change the backup type to FULL if that backup type cannot be taken. This is excellent for picking up new databases and backing them up soon after creation.
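For illustration, this is roughly the call the log backup job makes (the instance name and directory are examples); note the @ChangeBackupType = 'Y' parameter:

# Roughly what the USER_DATABASES - LOG job runs (instance and directory are examples)
Invoke-Sqlcmd -ServerInstance 'SQL2014Ser12R2' -Database 'master' -Query "
EXECUTE dbo.DatabaseBackup
    @Databases = 'USER_DATABASES',
    @Directory = 'H:\Backups',
    @BackupType = 'LOG',
    @ChangeBackupType = 'Y'"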

When you run the script the jobs are created but not scheduled, and it is for this reason I created this function. All it does is schedule the jobs so that I know they will be run when a new server is created and all the databases will be backed up. I can then go back at a later date and schedule them correctly for the server’s workload, or tweak them according to specific needs, but this allows me that fuzzy feeling of knowing that the backups and other maintenance will be performed.

To accomplish this I pass a single parameter $Server to the function. This is the connection string and should be in the format SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port.

I then create a $srv SMO object as usual

$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server

Create a JobServer object and a Jobs array which holds the Jobs

$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs

And set the schedule for each job. I pick each Job using the Where-Object Cmdlet and break out if the job does not exist

$Job = $Jobs | Where-Object {$_.Name -eq 'DatabaseBackup - SYSTEM_DATABASES - FULL'}
if ($Job -eq $null)
{
    Write-Output "No Job with that name"
    break
}

Then I create a Schedule object and set its properties and create the schedule

$Schedule = new-object Microsoft.SqlServer.Management.Smo.Agent.JobSchedule ($job, 'Daily - Midnight ++ Not Sunday')
$Schedule.ActiveEndDate = Get-Date -Month 12 -Day 31 -Year 9999
$Schedule.ActiveEndTimeOfDay = '23:59:59'
$Schedule.FrequencyTypes = "Weekly"
$Schedule.FrequencyRecurrenceFactor = 1
$Schedule.FrequencySubDayTypes = "Once"
$Schedule.FrequencyInterval = 126 # Weekdays 62 + Saturdays 64 - https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.agent.jobschedule.frequencyinterval.aspx
$Schedule.ActiveStartDate = get-date
$schedule.ActiveStartTimeOfDay = '00:16:00'
$Schedule.IsEnabled = $true
$Schedule.Create()

I have picked this example for the blog as it shows some of the less obvious gotchas. Setting the active end date could only be achieved by using the Get-Date cmdlet and defining the date. The schedule frequency interval above is for every day except Sundays. This is achieved by using the following table from MSDN, which is always my first port of call when writing these scripts

WeekDays.Sunday = 1
WeekDays.Monday = 2
WeekDays.Tuesday = 4
WeekDays.Wednesday = 8
WeekDays.Thursday = 16
WeekDays.Friday = 32
WeekDays.Saturday = 64
WeekDays.WeekDays = 62
WeekDays.WeekEnds = 65
WeekDays.EveryDay = 127

Combine values using an OR logical operator to set more than a single day. For example, combine WeekDays.Monday and WeekDays.Friday (FrequencyInterval = 2 + 32 = 34) to schedule an activity for Monday and Friday.

It is easy using this to set up whichever schedule you wish by combining the numbers. I would advise commenting it in the script so that your future self or following DBAs can understand what is happening.
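For example, to build a Monday, Wednesday and Friday schedule you could let PowerShell do the maths with a bitwise OR, which also documents the intent:

# Monday (2) + Wednesday (8) + Friday (32) = 42
$Monday = 2; $Wednesday = 8; $Friday = 32
$Schedule.FrequencyInterval = $Monday -bor $Wednesday -bor $Friday # 42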

You can tweak this script or use the code to work with any Agent Jobs and set the schedules accordingly and you can check that you have set the schedules correctly with this code

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs
$ActiveStartTimeOfDay = @{Name = "ActiveStartTimeOfDay"; Expression = {$_.JobSchedules.ActiveStartTimeOfDay}}
$FrequencyInterval = @{Name = "FrequencyInterval"; Expression = {$_.JobSchedules.FrequencyInterval}}
$FrequencyTypes = @{Name = "FrequencyTypes"; Expression = {$_.JobSchedules.FrequencyTypes}}
$IsEnabled = @{Name = "IsEnabled"; Expression = {$_.JobSchedules.IsEnabled}}
$Jobs | Where-Object {$_.Category -eq 'Database Maintenance'} | Select-Object Name, $IsEnabled, $FrequencyTypes, $FrequencyInterval, $ActiveStartTimeOfDay | Format-Table -AutoSize

You can get the script from Script Center via the link below or by searching for “Ola” using the script browser add-in straight from ISE


https://gallery.technet.microsoft.com/scriptcenter/Schedule-Ola-Hallengrens-a66a3c89

#TSQL2sDay Why My Head is Always in The Cloud

Today’s post is my first for the T-SQL Tuesday series. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. If you want to catch up on all the fun to date, check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Azure Balloon – Credit http://owenrichardson.com/

This one is hosted by Jorge Segarra (@SQLChicken), who said: This month’s topic is all about the cloud. What’s your take on it? Have you used it? If so, let’s hear your experiences. Haven’t used it? Let’s hear why or why not? Do you like/dislike recent changes made to cloud services? It’s clear skies for writing! So let’s hear it folks, where do you stand with the cloud?

My wife would tell you that my head is always in the cloud and she’s right (she usually is), just not like that picture! I would love to float gracefully above the land and gaze upon the view, but it’s the landing that bothers me and will always stop me from trying it.

Credit http://owenrichardson.com/

She’s right, pedantically and literally too, because this year I have spent a lot of time with my head and my fingers and my thinking in virtual machines using Windows Azure. That is where I have learnt a lot of my SQL and PowerShell this year. After SQL Saturday Exeter and SQLBits in Nottingham this year I have needed a place to practice and learn, an environment to try things and break things and mend them again and experiment.

I learn just as well by doing things as I do by reading about them. Stuart Moore @napalmgram has a great post called Learning to Play with SQL Server, and whilst I haven’t been as rough with my Azure SQL instances as he suggests, I have been able to practice at will without worry and, thanks to my MSDN subscription, without cost. I have taken examples from blog posts and demos from user group sessions and run them on my Windows Azure VMs.

Every single blog post I have written this year that has examples has been written in Azure, with screenshots from Azure. Whilst some of my PowerShell scripts in the PowerShell Box of Tricks series had already been written to solve one particular problem or another at MyWork, every single one was refined and demoed, all the screenshots were from Azure, and several were developed on Azure too.

My first ever session, to the SQL South West user group, was about spinning up and shutting down VMs in Azure, and was an interesting experience in Murphy’s Law which meant I ended up having to deliver it on Azure.

The second time I talked was about the PowerShell Box of Tricks series, to the Cardiff user group. Having learnt my lesson from the first time, I had bought a mini HDMI to VGA converter and tested it using a couple of monitors at home, and it worked wonderfully. However, when I got to Cardiff my little Asus convertible didn’t provide enough grunt to power the funky presentation screen. Luckily, thanks to Stuart Moore @napalmgram, who was also there doing his excellent PowerShell Backup and Restore session and who let me use his Mac, I was able to deliver the session using Office Web App to run the PowerPoint from my SkyDrive whilst all the demos were on ………Yup, you guessed it, Windows Azure!!!

So I feel qualified to answer Jorge’s questions and take part in T-SQL Tuesday this time round.

I like Azure. I like the ease with which I can spin up and down machines or any PaaS services at will. I love that I can do it with PowerShell, because I really enjoy using PowerShell in my day-to-day work and at home too. Living as I do in a beautifully convenient bungalow in the country, I still enjoy the frustration of watching that spinning ring as my videos buffer on our 1.8Mbps at best internet connection. Whilst that does have an impact on using Azure, it is a damn sight better than waiting many days trying to download one single file. Something like an ISO file for the latest SQL Server CTP, for example.

There is no way I would have got a look at SQL Server 2014 if it wasn’t for Azure. I was able to spin up a SQL Server 2014 machine in only a few minutes and log in and have a play and then delete it. I have done the same with Server 2012 and 2012 R2. It has enabled me to try setting up Availability Groups and other technologies not yet implemented at MyWork

I wouldn’t have been able to do any of that on my machines at home, as I don’t have anything capable of running Hyper-V whilst this 8-year-old desktop still keeps hanging on despite the odd noises. (Negotiations are currently in place to replace it with something shiny and new. Just need that lottery win now!!)

I have also transferred my Cricket Averages database to WASD and am talking with a friend of mine about developing an app that will use the mobile service as well.

The rate of change is much quicker in the cloud; things change and change quickly. As quickly as I had written my post about spinning up and shutting down VMs in Azure, Microsoft changed the rules and didn’t charge for machines that were turned off. New services appear all the time. New services move quickly from Preview to release, and as Grant Fritchey noticed this week, new views have been added to Windows Azure SQL Database under the covers. I think this is something we are just going to have to live with. The scale of the cloud means it is much easier to test improvements at large scale, and that means they can be released quicker. It makes it more challenging to keep up, I admit, but it’s a constant drip of new things rather than a big bang all at once.

Azure has brought me to where I am today and I think it will continue to be part of my future. If I remember to submit my PowerShell session for SQL Saturday Exeter (submit yours here) and it gets chosen, then you will be able to see me there (if you register here) using Azure to give back to the SQL community.

SQL Saturday Exeter–What’s the Point? My Experience of 2013 SQLSatExeter

 

Disclaimer – I am on the committee organising the next SQL Saturday Exeter. To be kept up to date about SQL Saturday #269 in the South West, follow @SQLSatExeter and #SQLSatExeter on Twitter and see details at the bottom. This post is about my experience at this year’s event.

In March this year the SQL South West User Group hosted SQL Saturday #194 in Exeter. I was a new member of the user group, having finally been able to join them for the first time in January. At that meeting Chris Testa-O’Neill presented a session and was very passionate about the SQL community and the benefit of SQL Saturdays and other events. I am always keen to learn new things and find ways of developing my skills. As I haven’t won the lottery, I also look out for good deals as well!!

SQL SATURDAY PRE-CONS ARE EXCEPTIONAL VALUE

It was relatively easy to persuade my bosses to pay for my pre-con. For £150 I was able to spend a whole day in a room with about a dozen people being trained in SQL Server Security by Denny Cherry @mrdenny. The conversation went along the lines of

“I want to go to this training session being delivered by this guy. Link to MVP page. It’s £150 and is in Exeter so no other costs required”

My boss – “OK”

Of course there was a little more fun and games to be had with the payment, but it was easy for me to get training sorted, and £150 is not going to break the training budget.

Looking back through my notes from the session today, I realise quite how much I have taken from it into my role at work. I can’t really comment on which and what though, that wouldn’t be good security!!

I remember an enjoyable day with plenty of technical learning, a lot of questions and answers and plenty of laughs as well. But more than that was the opportunity to mix with other professionals and talk with them. During the breaks and at lunch there were plenty of opportunities to chew the fat, learn how others do things, make new friends and put faces to twitter handles. (NOTE: I do look pretty much like my twitter profile picture, so if you see me at SQL community events I expect you to come up and say hi; that’s part of the benefit of attending these events, having a good natter.)

Take a look at the end of this post for details of 2014 Pre-Cons

SQL SATURDAY – CAN’T GET CHEAPER THAN FREE

SQL Saturdays are FREE

SQL Saturdays offer sessions from internationally renowned and local SQL speakers on subjects relevant to you and your job, your future career, your development plan, or just to challenge yourself by learning about something outside of your comfort zone. For nothing. Add in the networking opportunities, the prizes from the sponsors (if you were at Exeter this year, the beer and the pasty), and if you added it up it’s a sizeable investment in yourself, your career and your development (did I mention a free beer and pasty?).

NOT BAD FOR FREE!!

To enable that, SQL Saturday organisers have to go out and talk sponsors into putting their hands into their pockets. They will only do that if it is worthwhile to them. You can make it easier for the organisers by going and spending time with the sponsors during the breaks, chatting with them and giving them your details. Also, if you choose to use one of their products, please tell the sponsors you spoke to them at a SQL Saturday. They are (usually) data professionals who will record that and use it to make future decisions, which we hope will include sponsoring SQL Saturdays.

This year on the Saturday I went to the following sessions

A temporary fix for a short term problem by Ian Meade
Advanced SQL Server 2012 HA and DR Architectures by Christian Bolton
Busting common T-SQL myths by Dave Morrison
Power View and the Cube by Régis Baccaro
Natural Born Killers, performance issues to avoid by Richard Douglas
Tracking server performance without slowing it down by Jonathan Allen which I also Room Monitored
Increasing Business and IT collaboration by Chris Testa-O’Neill

It was a really good day. I learnt so much from all those knowledgeable and talented people. It really kicked me on in my development at work. I was able to take from each of those sessions and use that knowledge to do my job better, and I made new friends and new contacts. Just going back to my notes today has reminded me of something that I need to look into for work 🙂 Some of the conversations I have had at events this year have been fascinating – learning how other people do the same thing you do in a completely different but equally valid way, problem-solving with a different set and type of minds than the ones at MyWork, laughing at the same things and moaning about similar frustrations. All have been both entertaining and rewarding, and I think they are worth mentioning as things I enjoyed about going to SQL community events this year; they play a part in the reason I shall continue to go to them. (Just hope my boss doesn’t read this and think he won’t have to pay as I will go anyway!)

It’s busy and hectic, the sessions come along thick and fast and there are lots of people around to talk to. I wish I had made use of the SQL Saturday mobile phone app and I definitely recommend researching ahead of time and planning your day out.

This year’s sessions have not been decided yet, but I have seen some of the submissions and there are some fabulous sessions there. You could also submit a session yourself. Choosing the sessions will be tough, but we want to offer the opportunity to speak to as many people as possible, both new and experienced speakers.

You can submit your sessions at this link http://www.sqlsaturday.com/269/callforspeakers.aspx

ROUND-UP – SQL SATURDAY EXETER – WHY WOULDN’T YOU COME?

For a newbie, as I was last time, SQL Saturday Exeter was a revelation.

An opportunity to learn without spending thousands of my own or MyWork’s money to sit in a lecture room and listen to a trainer.

A chance to develop my understanding in a friendly environment amongst my peers where I could ask questions.

A place to meet new people and build relationships with people who have helped me with situations at work throughout the year. I reckon I’m in credit already.

This year I have attended SQLBits and SQL Saturday Cambridge, and this month I shall be at SQL Relay in Cardiff and in Bristol. That all started with SQL Saturday #194 in Exeter in 2013.

WHAT ABOUT NEXT YEAR’S SQL SATURDAY EXETER?

Next year’s SQL Saturday in Exeter, SQL Saturday #269, will be held at the same place – the Jury’s Inn Hotel, Exeter – on March 21/22nd 2014.

We had such amazing submissions for our pre-cons that we have had to find more rooms to be able to fit them all in. You can see for yourself the quality of the sessions and speakers for SQL Saturday Exeter 2014 at the following link

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

What do you think? I want to split myself into 8 and go to every one!

WHAT SHOULD YOU DO NOW?

I suggest that you should book Saturday 22nd March 2014 out in your calendar right this minute. Done that? Good.

Now go to this link

http://www.sqlsaturday.com/269/

and register for FREE to attend and let us know @SQLSatExeter

Next make yourself a coffee (Other beverages are available) and head to the pre-con page

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

This bit is up to you, the choice is hard. I can’t tell you which one of our eight fabulous sessions you want to go to. It’s not for me to say which amazing speaker you want to spend a day with for a bargain price but if you need further info please get in touch and we will try and help. Unfortunately our human cloning experiment is not stable enough to allow you to go to more than one!

Then, let me know you have done so and come and say hi when you are here.

PowerShell won’t save when running as a scheduled job

Or, How SQLBits put me in touch with Laerte and solved a problem

I have a scheduled PowerShell job which I use to create an Excel file, colour-coded for backup checks (I will blog about it another time). It works brilliantly on my desktop, saving the file to a UNC path and emailing the team the location. It works brilliantly when run in PowerShell on the server. When I schedule it to run, though, it doesn’t do so well. The job completes without errors but no file is saved.
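For context, the save step in the job looks something like this minimal sketch (not the real script – that is for another post – and the UNC path is an example):

# Minimal sketch of the pattern - create Excel via COM and save to a UNC path
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Add()
$workbook.Worksheets.Item(1).Cells.Item(1, 1) = 'Backup checks'
$workbook.SaveAs('\\Server1\Reports\BackupChecks.xlsx') # this is the step that silently fails when scheduled
$excel.Quit()
[void][Runtime.InteropServices.Marshal]::ReleaseComObject($excel)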

If you examine the processes running at the time you can see the Excel process is running, so I knew it was doing something but couldn’t work out why it was failing.

It was one of those jobs that gets put to the bottom of the list, because the service worked OK; I just needed to have it running on the server rather than a desktop for resilience, recovery and security purposes. Every now and then I would try and work out what was going on, but new work and new problems would always arrive, and it has been like that for 6 or maybe even 9 months.

As you know, I attended SQLBits this weekend and I went into a session with Laerte Junior. Laerte is a SQL Server MVP and can be found at Simple-Talk as well as on his own blog http://shellyourexperience.com/ or on twitter @LaerteSQLDBA. Oh, and he loves Star Wars 🙂

After a fascinating session I asked him if I could show him my problem. He very graciously said yes, and after looking at the code and listening to me explain the problem he suggested this very simple solution, which he said had taken him a great deal of searching to find. It’s a bug with COM objects and requires the creation of the folders shown below. I came into work today, tried it and it worked. HOORAY, another thing off my list, and big thanks to Laerte.

#Region Bug_Jobs_ComObjects
# (32Bit, always)
# Create folder
# New-Item -Path C:\Windows\System32\config\systemprofile\Desktop -ItemType Directory
#
# (64Bit)
# Create folder
# New-Item -Path C:\Windows\SysWOW64\config\systemprofile\Desktop -ItemType Directory
#EndRegion Bug_Jobs_ComObjects
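If you prefer something you can run repeatedly, here is a version of the fix that only creates the folders when they are missing:

# Create the profile Desktop folders only if they do not already exist
foreach ($path in 'C:\Windows\System32\config\systemprofile\Desktop',
                  'C:\Windows\SysWOW64\config\systemprofile\Desktop')
{
    if (-not (Test-Path $path))
    {
        New-Item -Path $path -ItemType Directory | Out-Null
    }
}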

 

This worked for me; however, I had already implemented another fix for a possible gotcha, so I will tell you about that one too.

Sometimes PowerShell cannot save to UNC paths because of IE Enhanced Security.

Either log in as the user and add the server to the intranet sites zone, or disable the warning in the registry as follows

[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap] "UNCAsIntranet"=dword:00000000
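If you would rather apply that setting from PowerShell than with a .reg file, here is a sketch (the same caveats about testing apply):

# Apply the same UNCAsIntranet setting via the registry provider
$key = 'HKCU:\Software\Policies\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name 'UNCAsIntranet' -Value 0 -Type DWord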

 

Please don’t ever trust anything you read on the internet, and certainly don’t implement it on production servers without first both understanding what it will do and testing it thoroughly. This solution worked for me in my environment; I hope it is of use to you in yours, but I know nothing about your environment and you know little about mine.