Enabling Cortana for dbareports PowerBi

Last week at the Birmingham user group I gave a presentation about PowerShell and SQL Server


It was a very packed session as I crammed in the new sqlserver module, dbatools and dbareports 🙂 On reflection I think this is a bit too much for a one-hour session, but at the end of the session I demo’d Cortana live using the dbareports dataset and returning a Cortana PowerBi page.

As always it took a couple of goes to get right, but when it works it is fantastic. I call it a salary increasing opportunity! Someone asked me afterwards how it was done, so I thought that was worth a blog post.

There is a video below but the steps are quite straightforward.

Add Cortana Specific Pages

Whilst you can simply enable Cortana to access your dataset, as shown later in this post, which lets Cortana search available datasets and return an appropriate visualisation, it is better to provide specific pages for Cortana to use and display. You can do this in PowerBi Desktop

Start by adding a new page in your report by clicking on the plus button

 

add page.PNG

and then change the size of the report page by clicking on the paintbrush icon in the Visualizations pane and choosing the Cortana page size.

page-size

This creates a page that is optimised for Cortana to display, and it will also be the first place that Cortana looks to answer the question

Power BI first looks for answers in Answer Pages and then searches your datasets and reports for other answers and displays them in the form of visualizations. The highest-scoring results display first as best matches, followed by links to other possible answers and applications. Best matches come from Power BI Answer Pages or Power BI reports.

Rename the page so that it contains the words or phrase you expect to be in the question, such as “Servers By Version”. You will help Cortana and PowerBi return better results if you use some of the column names from your dataset

Then it is just another report page and you can add visualisations just like any other page

cortana page.PNG

Make Cortana work for you and your users

If your users are likely to use a number of different words in their questions you can help Cortana find the right answer by adding alternate names. So if your page is sales by store you might add shop, building, results, amount, orders. This is also useful when Cortana doesn’t understand the correct words; you will notice in the screenshot below that I have added “service” for “servers” and “buy” for “by” to help get the right answer. You can add these alternate words by clicking the paintbrush under Visualizations and then Page Information

cortana-additional

Publish your PBIX file to PowerBi.com

To publish your PowerBi report to PowerBi.com, either use the Publish button in PowerBi Desktop

publish

or use the PowerBiPS module

Install-Module -Name PowerBIPS
#Grab the token, will require a sign in
$authToken = Get-PBIAuthToken -Verbose
Import-PBIFile -authToken $authToken -filePath "Path to PBIX file" -Verbose

Enable Cortana

In your browser log into https://powerbi.com and then click on the cog and then settings

powerbicom.PNG

then click on Datasets

settings

Then choose the dataset – in this case the dbareports SQL Information sample – tick the box to allow Cortana to access this dataset and then click Apply

dataset settings.PNG

Use Cortana against your PowerBi data

You can type into the Cortana search box and it will offer you the opportunity to choose your PowerBi data

cortana-search

but it is so much better when you let it find the answer 🙂

cortana-search-1

and if you want to go to the PowerBi report there is a handy link at the bottom of the Cortana page

cortana-search-2

I absolutely love this. I was so pleased when I got it to work, and the response when I show people is always one of wonder, from techies and non-techies alike.

The conditions for Cortana to work

You will need to have added your work or school Microsoft ID to the computer or phone on which you want to use Cortana, and that account must be able to access the dataset, either because it is the dataset owner or because a dashboard using that dataset has been shared with it.

From this page on PowerBi.com

When a new dataset or custom Cortana Answer Page is added to Power BI and enabled for Cortana it can take up to 30 minutes for results to begin appearing in Cortana. Logging in and out of Windows 10, or otherwise restarting the Cortana process in Windows 10, will allow new content to appear immediately.

It’s not perfect!

When you start using Cortana to query your data you will find that at times it is very frustrating. My wife was in fits of giggles listening to me trying to record the video below as Cortana refused to understand that I was saying “servers” and repeatedly searched Bing for “service”. Whilst you can negate the effect by using alternate names in the Q and A settings, it is still a bit hit and miss at times.

It is amazing

There is something about giving people the ability to just talk to their device in a meeting and for example with dbareports ask

Which clients are in Bolton

or

When was the last backup for client The Eagles

and get the information they require and a link to the report in PowerBi.com. I am certain that the suits will be absolutely delighted at being able to show off in that way, which is why I call it a salary increasing opportunity 🙂

We would love YOU to come and join us at the SQL Community Collaborative

Help us make dbatools, dbareports and Invoke-SQLCmd2 even better. You can join in by forking the repos on GitHub, writing your code and performing a PR, but we would much rather you came and discussed new requests on our Trello boards, raised issues in GitHub and generally discussed the modules in the SQL Server Community Slack #dbatools #dbareports. We are also looking for assistance with our wiki pages, Pester tests and AppVeyor integration for our builds, and any comments people want to make

SQL Server Collaborative GitHub Organisation holding the modules. Go here to raise issues, fork the repositories or download the code

dbatools Trello for discussion about new cmdlets

SQL Server Community Slack where you can find #dbatools and #dbareports as well as over 1100 people discussing all aspects of the Data Platform, events, jobs, presenting

COME AND JOIN US

 

Speaking? You? Go on. #tsql2sday #84

This is a blog post for this month’s T-SQL Tuesday, hosted by Andy Yun (b|t). T-SQL Tuesday is a monthly blog event started by Adam Machanic (b|t). The topic this month was advice for new speakers. Thanks for hosting, Andy. I have created a channel in the SQL Server Community Slack for presenting which everyone can use to ask and to answer questions

I think you should share what you know with others.

You will be amazing.

I will give you some great advice I learnt from a fantastic person’s blog post

  1. Start speaking
  2. Keep going
  3. Listen to feedback
  4. That’s it.

Kendra has said it all, you don’t need to read any further 😉


However…

Not all plain sailing

I love giving sessions but I never knew or thought that I would. My journey to speaking started at my SQL user group in Exeter with two fabulous people, Jonathan and Annette Allen, who encouraged me to share some PowerShell with the group. I was terrified and didn’t think I was worthy; my HDMI output wasn’t strong enough to power the projector, so I had to transfer my slides and demo to Jonathan’s laptop. It was a fraught and frustrating experience.

My second presentation was done on Stuart Moore’s MacBook Pro, using Office Online for the presentation and Azure for the demos. Again a change right at the last minute, and using a machine I didn’t know (and a different keyboard set-up).

Stuff will go wrong. Murphy’s Law will always rear its head somewhere, and no matter how often you test and re-test your demos, sometimes an odd thing will make them stop working

There will be problems and issues, you can mitigate some of them by following the 6 P’s

Proper Preparation Prevents Pretty Poor Performance.

You can read some great blog posts in this T-SQL Tuesday series, and also this one from Steve Jones, or any of these. But also accept that these things happen, and be prepared to shine on through the darkness if the power runs out, to use pen and paper, or even plastic cups like John Martin :-)

You never know you might enjoy it

I found I enjoyed it and wanted to do more, and since then I have presented sessions in a wide variety of places. It was very strange to have sat watching and listening to all of these fantastic presenters thinking I could never do that, and then to find out that it is actually something I enjoy doing and find fun. You can do that too.

Equally, it’s ok to not enjoy it, to decide it’s not worth the stress and hassle, and to support the community in a different way instead, but at least give it a go

You will be nervous

quote-joan-jett-you-want-to-have-butterflies-in-your-185899

I shared a train across Germany with someone who had attended the PSMonday conference in Munich, and they were astonished when I said that I get very nervous before speaking. It’s ok to be nervous; the trick is to make use of that nervous energy and turn it into something positive.

I get very nervous before presentations. My hands shake, I sweat, I either babble or lose my voice. I fret and fidget and check everything a thousandillion times. I find it is better for me if I am sat in the room during the previous presentation, as that generally helps me to feel more relaxed; I can listen to their talk and, out of respect for the presenter and the organisation, it forces me to sit quietly.

You will find your own way to deal with this, maybe listening to music on headphones or just sitting quietly somewhere. Don’t worry if it is not immediately obvious; try some different things, talk with others and, believe me, it will be ok.

Don’t try to numb it with alcohol

Once I get up and it’s ‘my’ turn, I take a few deep breaths, presenter mode suddenly turns on and I forget all about being nervous.

173101-everything-you-want-is-on-the-other-side-of-fear

Something to talk about

I have nothing to talk about.

Or everyone else knows more than I do.

Or X Y and Z talk about this much better than I do.

I’m scared

Richard Munn and I gave an impromptu session at SQL Relay in Cardiff where we talked about speaking and hopefully encouraged people to start, and these statements came up.

wp_20161004_13_59_32_pro

Here’s (a little of) what we said

No-one knows everything. Many people know a hell of a lot but not everything. You know a lot more than you realise and you also know things that no-one else does.

If you are stuck for things to talk about think about the you of 6 months or a year ago and something that you have learnt in that time and write the session that you wish you could have seen then. There will be other people at a similar stage who will appreciate it.

Don’t be scared, they are only people.

Practice

My dog is the one person who has been present at my presentations the most. He has listened (sometimes intently) to me practicing.

You need to practice speaking out loud.

You need to understand the timings

You need to be comfortable with hearing yourself speaking out aloud

You need to practice speaking out loud

A double reminder because I think it is important. You should practice and practice and practice with an eye on your timings, and if you have a good friend who is technical, or maybe a small group at work at lunchtime, ask them if they will listen and give feedback.

Wanna chat?

I am very passionate about community involvement and lucky enough to be involved in two fantastic communities – the SQL community and the PowerShell community and have made some great friends along the way. I was amazed and proud when very soon after my second presentation someone told me that I had inspired them to start to present.

Since then I have gone out of my way to encourage other people to speak and to blog and am really enjoying watching them blossom. If you want to have a chat via email or via slack about speaking or blogging or getting involved in the community please feel free to contact me and I promise you I will get back to you. Better still go to the SQL Community Slack and ask questions in #presentingorspeaking

Go find out more

We are good at sharing and learning technical content but we can share and learn about so much more, about all aspects of our life. Go and read all of the other posts in this T-SQL Tuesday for starters 🙂 and develop

PowerBi and API – Visualising my Checkins

For my own amusement and also to show my wife where I have been I use the Swarm check-in app on my phone and check-in to places. Also for my own amusement I used PowerBi to visualise the data via the API and allow me to filter it in various ways.

Whilst at the PowerShell Conference in Asia I was showing the mobile app to a group over some food and saying how easy it was, and June Blender, the mother of PowerShell help, said that I ought to blog about it. So I have 🙂

Follow these steps and you can create this report.

powerbi8.PNG

You can also download the blank report and add your own access token to it should you wish. Details at the end of the post

I am using the swarm API but the principle is the same for any other API that provides you with data. For example, I used the same principles to create the embedded reports on the PASS PowerShell Virtual Chapter page showing the status of the cards suggesting improvements to the sqlserver module for the product team to work on. Hopefully, this post will give you some ideas to work on and show you that it is quite easy to get excellent data visualisation from APIs

First up we need to get the data. I took a look at the Swarm developers page (the Trello is here, by the way). I had to register for an app, which gave me a client id and a secret. I then followed the steps here to get my user token. I was only interested in my own check-ins, so I used the steps under Token flow Client applications to get my access token, which I used in a URL like this.

https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=ACCESS_TOKEN&v=YYYYMMDD

I added the limit of 5000 as the default number of checkins returned was too small for my needs, and the date is simply the current date.
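If you would rather build that URL in PowerShell than by hand, something like this works (ACCESS_TOKEN is just a placeholder for the token you will get in a moment)

$AccessToken = 'ACCESS_TOKEN' ## placeholder - the OAuth code below fetches the real one
$Version = (Get-Date).ToString('yyyyMMdd') ## the v parameter is today's date
$CheckinUrl = "https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=$AccessToken&v=$Version"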

You can get the access token in PowerShell using code I got from the magnificent Stephen Owen’s blog post

## Enter the details
$Clientid =''  ## Enter ClientId from foursquare
$redirect = '' ## enter redirect url from client app in foursquare
##Create the URL:
$URL = "https://foursquare.com/oauth2/authenticate?client_id=$Clientid&response_type=token&redirect_uri=$redirect"
## function from https://foxdeploy.com/2015/11/02/using-powershell-and-oauth/
Function Show-OAuthWindow {
param($URL)
Add-Type -AssemblyName System.Windows.Forms
$form = New-Object -TypeName System.Windows.Forms.Form -Property @{Width=440;Height=640}
$web  = New-Object -TypeName System.Windows.Forms.WebBrowser -Property @{Width=420;Height=600;Url=($URL -f ($Scope -join "%20")) }
$DocComp  = {
$Global:uri = $web.Url.AbsoluteUri
if ($Global:Uri -match "error=[^&]*|code=[^&]*") {$form.Close() }
}
$web.ScriptErrorsSuppressed = $true
$web.Add_DocumentCompleted($DocComp)
$form.Controls.Add($web)
$form.Add_Shown({$form.Activate()})
$form.ShowDialog() | Out-Null
}
#login to get an access code then close the redirect window
Show-OAuthWindow -URL $URL
## grab the token
$regex = '(?<=access_token=)(.*)'
$authCode  = ($uri | Select-string -pattern $regex).Matches[0].Value
$global:AuthToken = $authCode
Write-output "Received a token, $AuthToken"
Write-Output "So the URL for your PowerBi Data is :-"
$PowerBiUrl = "https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=$AuthToken&v=20160829"
$PowerBiUrl | Clip

I checked the URL in a browser and confirmed that it returned a JSON object. Keep that URL safe, you will need it in a minute; the code above has placed it in your clipboard. If you want to jump straight to the report using the download, stop here and go to the end
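You can do the same check from PowerShell with Invoke-RestMethod. This is just a quick sketch; response.checkins.items is the property path the Foursquare API returns

## Sanity-check the URL from PowerShell instead of a browser
$result = Invoke-RestMethod -Uri $PowerBiUrl
$result.response.checkins.items | Select-Object -First 1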

So now let’s move to Power BI. Go to powerbi.com and download PowerBi Desktop. It’s free. You will need to create an account using a school or work email address if you wish to put your reports on powerbi.com

Once you have downloaded and installed PowerBi Desktop you will be faced with a window like this

powerbi

Start by clicking Get Data

powerbi2

Then choose Web, paste the URL from above into the box and press OK, which will give you this

powerbi3

Now we need to put the data into a format that is of more use to us

power1

I clicked on the Record link for response, then converted it to a table, then used the little icon at the top of the column to expand the value.items column, and then the value.items column again. It doesn’t look much yet but we are a step closer.

Next I looked in the table for the venue column, expanded that and the location column and the formatted address column.

power2

You can also expand the categories so that you can look at those too by expanding Value.items.venue.categories and Value.items.venue.categories1

powerbi4.gif

Now you will see that we have some duplicates in the data so we need to remove those. I did that by deleting the first 3 columns and then clicking remove duplicates under Delete Rows

power3b.gif

Then click close and apply. Then click on the data button as we need to rename and remove some more columns so that our data makes a little sense. I renamed the columns like this

Value.items.createdAt –> CreatedAt
Value.items.shout –> Comment
Value.items.venue.name –> VenueName
Value.items.venue.location.address –> VenueAddress
Value.items.timeZoneOffset –> TimeZoneOffset
Value.items.venue.location.lat –> VenueLat
Value.items.venue.location.lng –> VenueLong
Value.items.venue.location.postalCode –> VenuePostalCode
Value.items.venue.location.cc –> CountryCode
Value.items.venue.location.city –> City
Value.items.venue.location.state –> State
Value.items.venue.location.country –> Country
Value.items.venue.location.formattedAddress –> VenueAddress
Value.items.venue.url –> VenueURL
Value.items.venue.categories.name –> Category
Value.items.venue.categories.pluralName –> Categories

and remove all of the other columns. You can also do this in the Edit Queries window, I am just showing you that there are multiple ways to do the same thing

powerbi5.gif

Once you have done that you should have a window that looks like this. Notice I renamed the query to checkins as well

powerbi4.PNG

Now we need to create a calculated column for the time and a measure for the count of checkins. This is done using this code

Time = VAR UnixDays = [createdAt]/(60*60*24)
RETURN (DATEVALUE("1/1/1970")+UnixDays)
CountCheckins = COUNT(checkins[Time])
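The Time column simply converts the Unix epoch value in createdAt (seconds since 1 January 1970) into days and adds it to that date. If you want to sanity-check a value outside Power BI, the same arithmetic in PowerShell looks like this (the sample value is made up)

$createdAt = 1472476800 ## hypothetical createdAt value from a checkin
[datetime]'1970-01-01' + [timespan]::FromSeconds($createdAt) ## 29 August 2016 00:00:00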

and we can move on to the report side of things. First we are going to download a custom visual. Go to the PowerBi Custom Visuals page and download the Timeline visual.

powerbi5.PNG

and then import it into your PowerBi report. I have embedded a YouTube video below showing the steps I took to turn this into the PowerBi report. It’s pretty easy; you will be able to click on the visuals and then click on the data columns and alter them until you have the report that you want.

Once you have done this, you can upload it to PowerBi if you wish by clicking on the Publish button in PowerBi desktop and signing into PowerBi.com with your work email address.

powerbi6.PNG

and your report is available for you on PowerBi.com 🙂 By clicking on the pins on a visualisation you can add it to a dashboard.

powerbi8.gif

Once you have a dashboard you can then use the natural language query to ask questions of your data. Here are some examples

How many checkins are in GB
How many checkins are in airports
How many checkins by month
How many checkins by month in GB
Which airports
Show me hotel venuename and time
How many hotels by country
Show me hotel venuename and checkins count
metro stations venuename and count checkins as a map
Show me count checkins in Amsterdam by category as a donut

powerbi7.PNG

If you want to use the blank report, download it from here, open it in PowerBi Desktop, click Edit Queries, then Source, add your own URL, then click Apply and Refresh

powerbi9.gif

Hopefully, this has given you some ideas of ways that you can create reports from the many data sources available to you via APIs

PSConfAsia 2016

I have just got back to the UK from Singapore following the amazing PSConfAsia conference. I must say that Matt, Milton, Sebastian and Ben did a fantastic job organising this conference, and they were proud that there was a notable increase in attendees from last year.

sebastians-photo

 

The conference began (unofficially) with a PowerShell User group session in the Microsoft Offices on Wednesday where Ravi Chaganti spoke about DSC

WP_20161019_19_56_07_Pro (2).jpg

and then Desmond Lee led a Q and A session. In the end we decided that all the answers were

It Depends and Test in your Environment

That evening, I even managed to jump on the PASS PowerShell Virtual Chapter session by Scott Sutherland, Hacking SQL Servers on Scale using PowerShell, the recording of which is here. A session organised and managed online in three different time zones by Aaron, Chrissy and myself 🙂

On Thursday the conference proper started with a pre-con day at the Amazon Web Services office. Yes, you read that right. This conference really highlighted the cross-platform direction and adoption of open-source that Microsoft is taking. Jason Yoder spent all day teaching a group “PowerShell for Beginners” in one room

WP_20161020_09_24_43_Pro.jpg

while the Amazon Web Services team showed DevOps on AWS with PowerShell in the morning and June Blender gave a SAPIEN Toolmaking Seminar. After this we went back to the Microsoft offices for another user group where Jason Yoder gave a(nother) session, with Jaap Brasser, on PowerShell Tips and Tricks (Demo)

WP_20161020_19_26_24_Pro (2).jpg

Friday started with The PowerShell Team represented by Kenneth Hansen & Angel Calvo talking about PowerShell Past, Present and Future. It was really good that there was such great access to the product team at the conference and I saw lots of interaction around the conference as well, in addition to the sessions they provided.

Next up for me was another session from the PowerShell Team, this time Hemant Mahawar & Jason Shirk taking us on a Journey Through the Ages of PowerShell Security

Execution Policy is not a security feature

That took us to lunch, we were treated to excellent lunches at this conference

WP_20161020_12_07_14_Pro (2).jpg

After lunch I sat in the PowerShell Team’s Ask Us Anything session, although I was mainly preparing for my own session, Powershell Profile Prepares Perfect Production Purlieu, which followed. There were excellent sessions on JEA, Nano Server, Chef and DSC, Containers, ETS and securing PowerShell against malware, whilst I attended Flynn Bundy’s session about Windows Containers and Building GUIs with XAML with David Das Neves

WP_20161021_15_58_12_Pro (2).jpg

That evening, organisers, speakers and attendees all went to the Penny Black pub on Marina Bay and enjoyed some food, refreshments and networking

Saturday started slowly after the rain (another impressive ‘feature’ of Singapore), but the first session was a brilliant one with Hemant Mahawar & Jason Shirk talking Pragmatic PowerShell and answering questions. I am glad Jason used Carnac to show what he was typing so that people could (just about :-) ) keep up. I then attended the excellent session about contributing with Microsoft.

The rest of the day had amazing sessions on Azure Automation, IoT, AWS Cloud Formation, Centralised Repository Server, Chef, Puppet, Professional Help, Nano Server, Docker, DSC, Release Pipeline and of course some bearded fella talking about installing SQL scripts and creating Pester tests for them and combining PowerShell, SQL, SSRS, PowerBi and Cortana 🙂

Jason Yoder's photo.jpg

My takeaways from the conference were that Microsoft is very open to all members of the open source community, DevOps is a very important topic and also the following points from the PowerShell team

The PowerShell Team want YOU to contribute:
  • Interact with them
  • File bugs
  • Feature requests
  • Documentation
  • Tests
  • Code

and

Fixing is better than complaining 🙂 @HemanMahawar #psconfasia You can help fix the documentation. Use the contribute button on the doc

and

If you are thinking of starting or run a PowerShell usergroup Microsoft would like help. Tag 1 of the team such as @ANGELCALVOS #psconfasia

Special thanks and congratulations must go to Matt, Milton, Sebastian and Ben for their excellent organisation and for creating an awesome event. I am looking forward to seeing how they can better it next year, and also hoping that seeing all the fabulous speakers and sessions will inspire some attendees from this year’s event to share their own knowledge and experience at local user groups and even next year’s conference.

PowerShell, Pester and Ola Hallengrens Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren’s Maintenance Solution. If you haven’t, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great how-to-get-started guide here, which will show you how to get Pester and how to get started with TDD.
This isn’t TDD though, this is environment validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution, and made a list of the things that I would check, which looked something like this. This would be the checklist you would create (or have already created) for yourself or a junior following the install. This is how easily you can turn that checklist into a Pester test, remove the human element and open your install up to automated testing, as the short sketch after the list shows.
  • SQL Server Agent is running – otherwise the jobs won’t run 🙂
  • We should have 4 backup jobs named:
      • DatabaseBackup – SYSTEM_DATABASES – FULL
      • DatabaseBackup – USER_DATABASES – FULL
      • DatabaseBackup – USER_DATABASES – DIFF
      • DatabaseBackup – USER_DATABASES – LOG
  • We should have Integrity Check and Index Optimisation jobs
  • We should have the clean-up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded
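Here is that sketch, the first checklist item as a Pester test. It is only an illustration; the $DisplayName logic for named instances is shown in full further down

Describe "Ola Hallengren installation on $Server" {
It 'SQL Server Agent is running' {
(Get-Service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
}
}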

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I was being more careful I would also have checked that the correct folders existed in the folder I was backing up to.

Ola’s script uses a default naming convention, which makes this easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder, or, if there is an Availability Group, a CLUSTERNAME$AGNAME folder, and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist

So now we have our checklist, we just need to turn it into a Pester environment validation script

It would be useful to be able to pass in a number of instances, so we will start with a foreach loop and then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
BeforeAll {$Jobs = Get-SqlAgentJob -ServerInstance $Server
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$dbs = $Srv.Databases.Where{$_.status -eq 'Normal'}.name
if($InstanceName)
{
$DisplayName = "SQL Server Agent ($InstanceName)"
$Folder = $ServerName + '$' + $InstanceName
}
else
{
$DisplayName = 'SQL Server Agent (MSSQLSERVER)'
$Folder = $ServerName
}
}
if($CheckForBackups -eq $true)
{
$CheckForDBFolders = $true
}
$Root = $Share + '\' + $Folder 
I also set the Agent service display name so I can get its status. I split the jobs up using Context blocks, one each for Backups, Database maintenance and Solution clean-up, but they all follow the same pattern. First, get the jobs

$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let’s test the Agent service. You do this with an It block, and in it put a single test like this

actual-value | Should Be expected-value
So to check the Agent service is running we can do this

(Get-service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find the right values for any test I check using Get-Member. To see what is available for a job I gathered the agent jobs into a variable using the Get-SqlAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here), and then explored their properties using Get-Member and their values using Select-Object

$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this

$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach($job in $Jobs)
{
$JobName = $Job.Name
It "$JobName Job Exists" {
$Job | Should Not BeNullOrEmpty
}
It "$JobName Job is enabled" {
$job.IsEnabled | Should Be 'True'
}
It "$JobName Job has schedule" {
$Job.HasSchedule | Should Be 'True'
}
if($DontCheckJobOutcome -eq $false)
{
It "$JobName Job succeeded" {
$Job.LastRunOutCome | Should Be 'Succeeded'
}
}
}
So I have checked the agent and the jobs, and now I want to check the folders exist. First for the instance, using Test-Path, so the user running the PowerShell session must have privileges and access to list the files and folders

Context "$Share Share For $Server" {
It "Should have the root folder $Root" {
Test-Path $Root | Should Be $true
}
Then for every database we need to set some variables for the folder path. We don’t back up tempdb so we ignore that, then check if the server is SQL2012 or above and, if it is, whether the database is a member of an availability group, and set the folder name appropriately

  foreach($db in $dbs.Where{$_ -ne 'tempdb'})
{

if($Srv.VersionMajor -ge 11)
{
If($srv.Databases[$db].AvailabilityGroupName)
{
$AG = $srv.Databases[$db].AvailabilityGroupName
$Cluster = $srv.ClusterName
$OLAAg = $Cluster + '$' + $AG
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $OlaAG
}
else
{
$Root = $Share + '\' + $OLAAg
}
}
else
{
if($Share.StartsWith('\\') -eq $False)
{
$UNC = $Share.Replace(':','$')
$Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
}
else
{
$Root = $Share + '\' + $Folder
}
}
}
$db = $db.Replace(' ','')
$Dbfolder = $Root + "\$db"
$Full = $Dbfolder + '\FULL'
$Diff = $Dbfolder + '\DIFF'
$Log  = $Dbfolder + '\LOG'
If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
} 
But we need some logic for checking for folders, because Ola is smart and checks for log shipping databases so as not to break the LSN chain, system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares

If($CheckForDBFolders -eq $True)
{
Context "Folder Check for $db on $Server on $Share" {
It "Should have a folder for $db database" {
Test-Path $Dbfolder |Should Be $true
}
if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
It 'Has a Log Folder' {
[System.IO.Directory]::Exists($Log) | Should Be $True
}
} #
elseif(($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and $Db -notin ('master','msdb','model') -or ( $LSDatabases -contains $db) )
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
It 'Has a Diff Folder' {
[System.IO.Directory]::Exists($Diff) | Should Be $True
}
} #
else
{
It 'Has a Full Folder' {
[System.IO.Directory]::Exists($Full) | Should Be $True
}
}#
} # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more performance issues. I first used Get-ChildItem, but in folders where a log backup runs every 15 minutes it soon became very slow. So I decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad’s PowerShell Conference session Chasing the Seconds (slides and video) and tried the methods in there with Measure-Command, but this was the best I came up with (there is a rough timing harness after the code below if you want to experiment yourself)

If($CheckForBackups -eq $true)
{
Context "File Check For $db on $Server on $Share" {
$Fullcreate = [System.IO.Directory]::GetCreationTime($Full)
$FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
if($Fullcreate -eq $FullWrite)
{
It "Has Files in the FULL folder for $db" {
Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the FULL folder for $db" {
$FullCreate | Should BeLessThan $FullWrite
}
}
It 'Full File Folder was written to within the last 7 days' {
$Fullwrite |Should BeGreaterThan (Get-Date).AddDays(-7)
}
if($Db -notin ('master','msdb','model'))
{
$Diffcreate = [System.IO.Directory]::GetCreationTime($Diff)
$DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
if($Diffcreate -eq $DiffWrite)
{
It "Has Files in the DIFF folder for $db" {
Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the DIFF folder for $db" {
$DiffCreate | Should BeLessThan $DiffWrite
}
}
It 'Diff File Folder was written to within the last 24 Hours' {
$Diffwrite |Should BeGreaterThan (Get-Date).AddHours(-24)
}
}
if($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ( $LSDatabases -notcontains $db))
{
$Logcreate = [System.IO.Directory]::GetCreationTime($Log)
$LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
if($Logcreate -eq $LogWrite)
{
It "Has Files in the LOG folder for $db" {
Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
}
}
else
{
It "Has Files in the LOG folder for $db" {
$LogCreate | Should BeLessThan $LogWrite
}
}
It 'Log File Folder was written to within the last 30 minutes' {
$Logwrite |Should BeGreaterThan (Get-Date).AddMinutes(-30)
}
}# Simple Recovery
}
}# Check for backups
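If you want to compare the two approaches on your own shares, that rough timing harness looks something like this, assuming $Full points at one of the backup folders

## Rough comparison of the two approaches on a UNC folder
Measure-Command { Get-ChildItem $Full\*.bak } ## enumerates every file - slow when log backups arrive every 15 minutes
Measure-Command { [System.IO.Directory]::GetLastWriteTime($Full) } ## reads folder metadata only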
You could just run the script you have created from your checklist; hopefully this blog post helps you see that you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this

[CmdletBinding()]
## Pester Test to check OLA
Param(
$Instance,
$CheckForBackups,
$CheckForDBFolders,
$JobSuffix ,
$Share ,
[switch]$NoDatabaseRestoreCheck,
[switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this

$Script = @{
Path = $Path;
Parameters = @{ Instance = $Instance;
CheckForBackups = $true;
CheckForDBFolders = $true;
JobSuffix = 'BackupShare1';
Share = '\\Server1\BackupShare1';
NoDatabaseRestoreCheck= $true;
DontCheckJobOutcome = $true}
}
Invoke-Pester -Script $Script
but that’s a bit messy, hard to remember and won’t encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it in GitHub as Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on line 90 of Test-OlaInstance so that the script can find it
Once you have that you can call it for a single instance or a number of instances like so. Here I check for Folders and Backup files
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking if the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases)
pester ola check.PNG
This gives me a nice and simple automated method of checking if Ola’s maintenance script has been correctly installed. I can use it for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.
I hope you find it useful

The SQL Server Community Collaborative GitHub Organisation is born

My wonderful friend Chrissy LeMaire and I are the creators of two GitHub repositories for SQL Server and PowerShell called dbatools and dbareports

If you are working with SQL Server I highly recommend that you take a look at the vast number of commands available to you in dbatools, which will help you complete tasks within SQL Server, especially instance migrations, and also includes a growing number of best practice implementations

Both of these modules are not just the work of one person any more. We have over 20 people who have collaborated on the modules – THANK YOU ALL – and more who have provided guidance and comments via the channels in the SQL Server Community Slack https://sqlps.io/slack and via the Trello boards https://dbatools.io/trello and https://dbareports/trello

At SQL Saturday Cambridge this weekend I was proud to join Chrissy in her presentation as we talked about both modules. Here’s a fabulous picture of us with Buck Woody

 

wp_20160910_10_14_58_pro

 

We had discussed previously that it didn’t feel quite right that these community tools were under our own personal accounts, and it also caused some administration issues with allowing access. So with that in mind, after a naming discussion in the Slack channel, we created an organisation to hold them both

 SQL Server Community Collaborative

is born at https://github.com/sqlcollaborative

Nothing much changes except the name. We have even found that all the old links work and GitHub Desktop updated itself. We will continue to make great commands with all of our fantastic collaborators. Discussions will happen in Slack, organisation in Trello, and we will continue to grow and learn and teach and share and create together.

We would love you to come and join us

 

Converting SQL Agent Job Duration to TimeSpan using PowerShell

When you look in msdb for the SQL Agent Job duration you will find that it is an int.

sysjobshistoiry

This is also the same when you look at Get-SQLAgentJobHistory from the sqlserver module. (You can get this by downloading the latest SSMS release from here)

agentjobhistoryproperties

This means that when you look at the various durations of the Agent Jobs you get something like this

duration.PNG

The first job took 15 hours 41 minutes 53 seconds, the second 1 minute 25 seconds and the third 21 seconds. This makes it quite tricky to calculate the duration as a suitable datatype. In T-SQL, people use scripts like the following from MSSQLTips.com

((run_duration/10000*3600 + (run_duration/100)%100*60 + run_duration%100 + 31 ) / 60)  as 'RunDurationMinutes'
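To see how that formula decodes the int, take the first job above: a run_duration of 154153 gives 154153/10000 = 15 hours, (154153/100)%100 = 41 minutes and 154153%100 = 53 seconds (integer division throughout), which matches the 15 hours 41 minutes 53 seconds we expect.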

I needed more information than the number of minutes, so I wrote this, which converts the run duration to a timespan

$FormattedDuration = @{Name = 'FormattedDuration' ; Expression = {[timespan]$_.RunDuration.ToString().PadLeft(6,'0').insert(4,':').insert(2,':')}}

formatted.PNG
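To use the calculated property, drop it into a Select-Object like any other property. A minimal sketch, with a hypothetical server name

Get-SqlAgentJobHistory -ServerInstance 'Server1' | Select-Object JobName, RunDate, RunDuration, $FormattedDuration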

So how did I get to there?

First I tried to just convert it. In PowerShell you can define a datatype in square brackets and PowerShell will try to convert it

timespan

It did its best, but it converted it to ticks! So we need to convince PowerShell that this is a proper timespan. First we need to convert the run duration to a standard length; you can use the PadLeft method of a string to do this, which ensures a string has a set length by preceding it with a value you choose until it reaches that length.

Let’s have a length of 6 and preceding zeros: PadLeft(6,'0')

padlefterror

But this only works if it is a string!! Remember, red text is useful; it will often contain the information you need to resolve your error. Luckily there is a method to turn an int into a string. I am using the foreach method to demonstrate

padleft-with-string

Now every string is 6 characters long, starting with zeros. So all that is left is to format this with colons to separate the hours and minutes and the minutes and seconds. We can do this with the Insert method. You can find the methods using Get-Member or its alias gm

methods.PNG

So the Insert method takes an int for the start index and a string value to insert

insert
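Worked through by hand for a single value, using the first job’s duration from above

$padded = (154153).ToString().PadLeft(6,'0') ## '154153'
$padded.Insert(4,':').Insert(2,':') ## '1541:53' after the first insert, '15:41:53' after the second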

There we go, now we have some properly formatted timespans; however, they are still strings. We can convert them using [timespan]. Now we can format the results within the select by using an expression, as shown below

select

and as you can see it is a timespan now

timespan property.PNG

On a slight side note. I needed the durations for Agent Jobs with a certain name within the last 6 days.

getting-agent-jobs

I did this by passing an array of servers (which I got from my dbareports database) to Get-SqlAgentJobHistory. I then used the Where method to filter on JobName and the job outcome step of the history, and compared the RunDate property to Get-Date (today) minus 6 days using the AddDays method 🙂
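Put together, that looks roughly like this. The server and job names are hypothetical, and StepID 0 is the job outcome row in the history

$Servers = 'Server1','Server2' ## hypothetical - I took mine from my dbareports database
$History = Get-SqlAgentJobHistory -ServerInstance $Servers
$History.Where{$_.JobName -like 'DatabaseBackup*' -and $_.StepID -eq 0 -and $_.RunDate -gt (Get-Date).AddDays(-6)} | Select-Object JobName, RunDate, $FormattedDuration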

Hopefully this will be of use to people, and I also have it recorded for the next time I need to do it 🙂