How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and DevOps Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works


You will find all of the resources and the scripts to do all of the below in the GitHub repo, so clone it and navigate to the folder.

Create Database

First let's create a database. Connect to your Azure subscription

01 - subscription.png
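These examples assume the AzureRM module, which was current when this was written (the newer Az module has equivalent commands). A minimal login sketch, with a placeholder subscription id:

Add-AzureRmAccount
# If you have more than one subscription, pick the one to use
Set-AzureRmContext -SubscriptionId 'your-subscription-id'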

Then set some variables

They should all make sense. Take note that you need to uncomment and set the login and password, and choose which IPs to allow through the firewall.
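A sketch of roughly what the variables look like – the names and values here are illustrative (apart from $AzureSQLServer, which later steps refer to), so adjust them to suit:

# Illustrative values - change these to suit your environment
$ResourceGroupName = 'PowerShellCool'
$Location = 'West Europe'
$AzureSQLServer = 'pscooltweets'   # must be unique across Azure
$DatabaseName = 'tweets'
# Uncomment and set these before running
# $Login = 'beardadmin'
# $Password = 'SuperSecretPassword!'
$MyIP = '86.1.2.3'                 # the public IP to allow through the firewall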

Create a Resource Group

02 - resource group.png
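With the AzureRM module this is a one-liner, assuming the variables above:

New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location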

Create a SQL Server

03 - create server.png
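A sketch, assuming you have uncommented and set $Login and $Password above – the server needs a PSCredential for the SQL admin login:

$SecurePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ($Login, $SecurePassword)
New-AzureRmSqlServer -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -Location $Location -SqlAdministratorCredentials $Credential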

Create a firewall rule. I just use my own IP and add the rule to allow Azure IPs

03a - firewall rule.png
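Roughly like this – one rule for my own IP and the switch that allows Azure IPs through:

New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -FirewallRuleName 'MyIP' -StartIpAddress $MyIP -EndIpAddress $MyIP
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -AllowAllAzureIPs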

Create a database

04 - create database.png
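Again a single cmdlet – a Basic tier database is plenty for this:

New-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -DatabaseName $DatabaseName -Edition Basic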

I have used the dbatools module to run the scripts to create the database. You can get it using
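Install-Module dbatools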

Run the scripts

05 - Create Table Sproc.png
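Roughly like this – Invoke-DbaQuery is the name of the command in current dbatools releases, and the script file name here is illustrative (use the scripts from the repo):

$SqlCredential = Get-Credential   # the SQL admin login created above
Invoke-DbaQuery -SqlInstance "$AzureSQLServer.database.windows.net" -SqlCredential $SqlCredential -Database $DatabaseName -File '.\CreateTableAndSproc.sql'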

This will have created the following in Azure, which you can see in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First, log in (if you need to) and set some variables

Then create the API and get the key
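A sketch with the AzureRM cmdlets – $CognitiveServicesName is a placeholder name of your choosing:

$CognitiveServicesName = 'TweetSentiment'   # illustrative
New-AzureRmCognitiveServicesAccount -ResourceGroupName $ResourceGroupName -Name $CognitiveServicesName -Type TextAnalytics -SkuName F0 -Location $Location
# Grab the endpoint and one of the keys for the Flow connection later
(Get-AzureRmCognitiveServicesAccount -ResourceGroupName $ResourceGroupName -Name $CognitiveServicesName).Endpoint
(Get-AzureRmCognitiveServicesAccountKey -ResourceGroupName $ResourceGroupName -Name $CognitiveServicesName).Key1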

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key


Create the Flow

I have exported the Flow to a zip file and also the JSON for a PowerApp (no details about that in this post). Both are available in the GitHub repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

Click New Connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (the value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click New Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow. Click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png


Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Click select during import next to the SQL Server connection, choose the SQL Server connection, and do the same for the Twitter connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png


Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to Twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

Search for detect and choose the Text Analytics Detect Sentiment action

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them, you can retrieve them from the Azure portal.

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__ – 0
  • Date – Created At
  • Sentiment – Score
  • Tweet – Tweet Text
  • UserLocation – Location
  • UserName – Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop up a prompt for the credentials. Choose Database, enter your credentials and click connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn't run yet. If it does, just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc. and then you can publish the report

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

Announcing PSDay.UK – What's a PSDay?

On Thursday evening I attended the joint London WinOps and PowerShell User Group. It was an excellent evening with two great sessions by Jaap Brasser and Filip Verloy.

PSDay.UK

There was also an exciting announcement about PSDay.UK – https://psday.uk

PSDay.UK is a one day PowerShell event providing the opportunity for you to spend a whole day learning PowerShell from renowned experts from the UK and international speaking community. It will be held at

Skills Matter | CodeNode, 10 South Place, London, EC2M 7EB, GB

on

Friday 22nd September 2017  .ics

We will be running two tracks

  • PowerShell Zero to Hero
  • DevOps with PowerShell

Register your interest

Please go and visit the website and have a look and register your interest to get further notifications about the event.

Follow the @PSDayUK twitter account and Facebook page https://www.facebook.com/PSDayUK/ and keep yourself informed on this fantastic new event.

Want to Speak at PSDay.UK ?

We already have some fantastic speakers lined up but we would like to invite people to send us submissions for more sessions. If you have a PowerShell talk that will fit into one of the tracks and experience of delivering sessions at events please send us submissions via the website.
If you have questions about speaking feel free to contact me via twitter at @sqldbawithbeard

What is a PSDay ?

The international PowerShell community has three main global events which run over a number of days, with top-notch international speakers and Microsoft PowerShell team members, delivering in-depth information about the latest PowerShell trends and technologies, and connecting national communities with one another.

There are a number of other PowerShell events that have been organised by wonderful volunteers in numerous countries, and we feel there is an opportunity to create national events which complement the global events and help PowerShell enthusiasts and professionals to get in touch and learn from one another, under the shared PSDay branding.

We foresee PSDays as smaller, one-day national events promoting speakers from the host country, supported by other international speakers, with the aim of increasing the exposure of national PowerShell user groups as well as providing excellent PowerShell training.

A board of PowerShell community folk will be set up to approve requests to use the PSDay name and shield logo, providing the event is professionally organised, and to offer help with technical questions, marketing and experience. We hope that this will enable people to set up their own PSDay in their own country and increase the exposure of the PowerShell community as well as PowerShell knowledge, whilst sharing resources, knowledge, experience and skills and ensuring a good standard of national PowerShell community events.

Further details will be forthcoming and we welcome offers of assistance from people with relevant experience.


Pester for Presentations – Ensuring it goes ok

Whilst I was at PSConfEU I presented a session on writing Pester tests instead of using checklists. You can see it here

During the talk I showed the pester test that I use to make sure that everything is ready for my presentation. A couple of people have asked me about this and wanted to know more so I thought that I would blog about it.

Some have said that I might be being a little OCD about it 😉 I agree that it could seem like that but there is nothing worse than having things go wrong during your presentation. It makes your heart beat faster and removes the emphasis from the presentation that you give.

When it is something that you as a presenter could have foreseen, like a VM not being started, a database not being restored to its pre-demo state, or being logged in as the wrong user, then it is much worse

I use Pester to ensure that my environment for my presentation is as I expect. In fact, in Hanover, when I ran through my Pester test for my NUC environment, I found that one of my SQL Servers had decided to be in a different time zone, so the SQL service would not authenticate and start. I was able to quickly remove the references to that server and save myself from a sea of red during my demos

For those that don't know, Pester is a PowerShell module for Test Driven Development:

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester
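To update it to the latest version on Windows 10 you need a couple of extra switches, because the version that ships in the box is signed with a different certificate to the one in the Gallery:

Install-Module -Name Pester -Force -SkipPublisherCheck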

What can you test? Everything. Well, specifically everything that you can write a PowerShell command to check. So when I am setting up for my presentation I check the following things. I add new things to my tests as I think of them or as I observe things that may break my presentations. Most recently that was ensuring that my Visual Studio Code session was running under the correct user. I did that like this

Describe "Presentation Test" {
    Context "VSCode" {
        It "Should be using the right username" {
            whoami | Should Be 'TheBeard\Rob'
       }
    }
}

01 - username.PNG

I think about the things that are important to me for my presentation. I want to ensure that I only have one VS Code window open to avoid that situation where I am clicking through windows looking for the correct window. I can do that using Get-Process

It "Should have Code Insiders Open" {
(Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue)| Should Not BeNullOrEmpty
}
        It "Should have One VS Code Process" {
            (Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue).Count | Should Be 1
        }

I use -ErrorAction SilentlyContinue so that I don’t get a sea of red when I run the tests. Next I want to check my PowerPoint is ready for my presentation

It "Should have PowerPoint Open" {
    (Get-Process POWERPNT -ErrorAction SilentlyContinue) | Should Not BeNullOrEmpty
}
It "Should have One PowerPoint Open" {
    (Get-Process POWERPNT -ErrorAction SilentlyContinue).Count | Should Be 1
}
It "Should have the correct PowerPoint Presentation Open" {
    (Get-Process POWERPNT -ErrorAction SilentlyContinue).MainWindowTitle | Should Be 'dbatools – SQL Server and PowerShell together – PowerPoint'
}

Again I use Get-Process. I check if PowerPoint is open, if there is only one PowerPoint process, and I use the MainWindowTitle property to check that it is the right PowerPoint presentation, after nearly starting a presentation for the SqlServer module with the dbatools slides!

I don’t want any distractions when I am presenting. I have the sort of friends who will notice if I get notifications for twitter popping up on my screen and repeatedly send tweets to make people laugh. (I admit, I’m one of those friends – I do this too!)

02 - Friends!!.PNG

Now, I cannot get a test for quiet hours working. You can apparently use a registry key, which of course you can check with PowerShell, but I was unable to get it working. I haven't looked at testing for Presentation Mode, but I do test that those programmes are shut down, again using Get-Process

        It "Mail Should be closed" {
            (Get-Process HxMail -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Tweetium should be closed" {
            (Get-Process WWAHost -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Slack should be closed" {
            (Get-Process slack* -ErrorAction SilentlyContinue).Count | Should BE 0
        }

I am generally presenting with SQL Server so I need to make sure that SQL Server is running. I do this with Get-Service

Context "Local SQL" {
        It "DBEngine is running" {
            (Get-Service mssqlserver).Status | Should Be Running
        }
        It "SQL Server Agent is running" {
            (Get-Service sqlserveragent).Status | Should Be Running
        }
        It "DAVE DBEngine is running" {
            (Get-Service mssql*Dave).Status | Should Be Running
        }
        It "DAVE Agent is running" {
            (Get-Service sqlagent*dave).Status | Should Be Running
        }
    }

In this example I am testing that the SQL Service and the Agent service are running on both of my local instances.

I use a NUC running Hyper-V to enable me to show a number of SQL Servers running in a domain environment so I need to be able to test those too. I set the values of the servers I need into a variable and check that the VM is running and that they respond to ping

 Context "VM State" {
        $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
        $NUCVMs = Get-VM -ComputerName beardnuc | Where-Object {$_.Name -in $NUCServers}
            foreach($VM in $NUCVms)
                {
                $VMName = $VM.Name
                  It "$VMName Should be Running"{
                    $VM.State | Should Be 'Running'
                  }
			    }
    }
Context "THEBEARD_Domain" {
            $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
            foreach($VM in $NUCServers)
                {
                                 It "$VM Should respond to ping" {
				(Test-Connection -ComputerName $VM -Count 1 -Quiet -ErrorAction SilentlyContinue) | Should be $True
				}
                }
    }

I also need to check that the SQL service and the Agent service are running on each server

  Context "SQL State" {
        $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*'  -and $_.State -eq 'Running'}).Name
        foreach($Server in $SQLServers)
        {
          $DBEngine = Get-service -ComputerName $Server -Name MSSQLSERVER
           It "$Server  DBEngine should be running" {
                $DBEngine.Status | Should Be 'Running'
            }
           It "$Server DBEngine Should be Auto Start" {
            $DBEngine.StartType | Should be 'Automatic'
           }
              $Agent= Get-service -ComputerName $Server -Name SQLSERVERAGENT
              It "$Server Agent should be running" {
                  $Agent.Status | Should Be 'Running'
           }
           It "$Server Agent Should be Auto Start" {
            $Agent.StartType | Should be 'Automatic'
           }
        }
        It "Linux SQL Server should be accepting connections" {
            $cred = Import-Clixml C:\temp\sa.xml
            {Connect-DbaSqlServer -SqlServer LinuxvnextCTP14 -Credential $cred -ConnectTimeout 60} | Should Not Throw
        }

    }
}

I check that the Linux SQL Server is available by storing the credential using Export-CliXml and then using that credential with Connect-DbaSqlServer from dbatools

Using a NUC means I sometimes have fun with networking so I have a couple of tests for that too. Testing for the correct DNS Servers and gateways

    It "Should have DNS Servers for correct interface" {
        (Get-DnsClientServerAddress -InterfaceAlias 'Ethernet 3').Serveraddresses | Should Be @('10.0.0.1','10.0.0.2')
    }
    It "Should have correct gateway for alias"{
        (Get-NetIPConfiguration -InterfaceAlias 'Ethernet 3').Ipv4DefaultGateway.NextHop | Should Be '10.0.0.10'
    }

All of those are generic tests that have evolved over time and are run for every presentation, but when I have specific requirements for a single presentation I test for those too.

For example, later this week Cláudio Silva and I are presenting on dbatools at TUGAIT. We are showing the Test-DbaMaxMemory, Get-DbaMaxMemory and Set-DbaMaxMemory commands, so we need to ensure that the Max Memory for some servers is (in)correctly set. I use Connect-DbaSqlServer to create an SMO server object and test that

    It "Max Memory on SQl2012SerAG1 2 and 3 should be 2147483647" {
        (Connect-DbaSqlServer SQL2012Ser08AG1).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG2).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG3).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
    }

We are also showing the Test-DbaIdentityUsage command so a column needs to be pre-prepared in AdventureWorks2014 to be able to show the error

    It "ShiftID LastValue Should be 255" {
        $a = Test-DbaIdentityUsage -SqlInstance ROB-XPS -Databases AdventureWorks2014 -NoSystemDb
        $a.Where{$_.Column -eq 'ShiftID'}.LastValue | should Be 255
    }

To ensure that we have orphaned files available for the Find-DbaOrphanedFile command I use this

    It "has Orphaned Files ready"{
        (Find-DbaOrphanedFile -SqlServer SQL2016N2).Count | Should Be 30
    }

There are any number of things that you may want to test to ensure that, as best as possible, the demo gods are not going to come and bite you in the middle of your presentation.

  • Files or Folders exist (or don't exist)
  • Databases, Agent Jobs, Alerts
  • Operators, Logins
  • SSIS packages, SSRS Reports
  • PowerBi files
  • Azure connectivity
  • Azure components

The list is endless, just look at what you require for your presentation.

Anything you can check with PowerShell you can test with Pester so build up your Pester presentation tests and reduce the reliance on the demo gods! I’ll still leave this here just to be safe!!

pray to the demo gods.jpg

Pester Test Inception and the Show Parameter

My fantastic friend Andre Kamman b | t and I presented at PSConfEU last week

C_EDtK0XoAA1PL7 (2).jpg

and whilst we were there we were chatting about running Pester Tests. He wanted to know how he could run a Pester Test and not lose the failed tests as they scrolled past him. In his particular example we were talking about running hundreds of tests on thousands of databases on hundreds of servers

01 - pesters.gif

I guess it looks something like that!!

I explained about the -Show parameter which allows you to filter the results that you see. Using Get-Help Invoke-Pester you can see this

   -Show
Customizes the output Pester writes to the screen. Available options are None, Default,
Passed, Failed, Pending, Skipped, Inconclusive, Describe, Context, Summary, Header, All, Fails.

The options can be combined to define presets.
Common use cases are:

None – to write no output to the screen.
All – to write all available information (this is default option).
Fails – to write everything except Passed (but including Describes etc.).

A common setting is also Failed, Summary, to write only failed tests and test summary.

This parameter does not affect the PassThru custom object or the XML output that
is written when you use the Output parameters.

Required?                    false
Position?                    named
Default value                All
Accept pipeline input?       false
Accept wildcard characters?  false

So there are numerous options available to you. Lets see what they look like

I will use a dummy test which creates 10 Context blocks and runs from 1 to 10 and checks if the number has a remainder when divided by 7

Describe "Only the 7s Shall Pass" {
    $Servers = 0..10
    foreach($Server in $servers)
    {
        Context "This is the context for $Server" {
        foreach($A in 1..10){
            It "Should Not Pass for the 7s" {
                $A % 7 | Should Not Be 0
                }
            }
        }
    }
}

Imagine it is 10 servers running 10 different tests

For the Show parameter All is the default, which is the output that you are used to

02 - All.gif

None does not write anything out. You could use this with -PassThru, which will pass ALL of the test results to a variable, and if you add -OutputFile and -OutputFormat then you can save ALL of the results to a file for consumption by another system. The -Show parameter only affects the output from the Invoke-Pester command to the host, not the output to the files or the variable.

Header only returns the header from the test results and looks like this (I have included None so that you can see!)

03 - none and header.PNG

Summary, as expected, returns only the summary of the results

04 - summary.PNG

You can use more than one value for the Show parameter so if you chose Header, Summary, Describe you would get this

05 - headerdesscribe sumnmary.PNG

You could use Failed to only show the failed tests which looks like this

06 - failed.PNG

but Andre explained that he also wanted to be able to see some progress whilst the test was running. If there were no failures then he would not see anything at all.

So Fails might be the answer (or Failed and Summary but that would not show the progress)

07 - fails.PNG

Fails shows the Header, Describe and Context and also shows the Summary.

However, we carried on talking. PSConfEU is a fantastic place to talk about PowerShell 🙂 and we wondered what would happen if you invoked Pester from inside a Pester test. I was pretty sure that it would work, as Pester is just PowerShell, but I thought it would be fun to have a look and see how we could solve that requirement

So I created 3 "Internal Tests": these are the ones we don't want to see the output for. I then wrote an overarching Pester test to call them. In that Pester test script I assigned the results of each test to a variable. When you examine it you see

08 - Pester Object.PNG

The custom object that is created shows the counts of all different results of the tests, the time it took and also the test result.

So I could create a Pester test to check the FailedCount property of that test result

$InternalTest1.FailedCount | Should Be 0

To make sure that we don't lose the results of the tests, we can output them to a file like this

$InternalTest1 = Invoke-Pester .\Inside1.Tests.ps1 -Show None -PassThru -OutputFile C:\temp\Internal_Test1_Results.xml -OutputFormat NUnitXml

So now we can run Invoke-Pester and point it at that file and it will show the progress and the final result on the screen.

09 finale.PNG

You could make use of this in different ways

  • Server 1
    • Database1
    • Database2
    • Database3
    • Database4
  • Server 2
    • Database1
    • Database2
    • Database3
    • Database4
  • Server 3
    • Database1
    • Database2
    • Database3
    • Database4

Or by Test Category

  • Backup
    • Server1
    • Server 2
    • Server 3
    • Server 4
  • Agent Jobs
    • Server 1
    • Server 2
    • Server 3
    • Server 4
  • Indexes
    • Server 1
    • Server 2
    • Server 3
    • Server 4

Your only limitation is your imagination.

As we have mentioned PSConfEU, you really should check out the videos on the YouTube channel. All of the videos that were successfully recorded will be on there. You could start with this one and mark your diaries for April 17-20, 2018


Using Twitter with VS Code

So today I saw this tweet from Mathias in reply to Stefan and Amanda

01 - tweet.PNG


That looks cool. Twitter in VS Code, one less programme to open. Here's how it works

Open VS Code and hit CTRL + P and type ext install twitter, or hit CTRL + SHIFT + X to open the extensions and search

02 - install.PNG


Hit install and then reload

03 - reload


Accept the prompt

04 - prompt.PNG

and you will have a Twitter button in the bar at the bottom

05 - bar

If you click it then the top bar will change to the set up wizard

06 - forst time.PNG

Follow it along and create a Twitter App.

07 - want to continue

08 - create an app

Which will open up the website.


09 - create an app.PNG

Just fill in the blanks

010 - fil in the blanks.PNG

and then you will have this window

011 - app settings.PNG

Click on Keys and Access Tokens

012 - settings and appl

Check the App Permissions are set to read and write

014 - app permissions

and then click Create my access token

Then go back to VS Code and click

016 - settings details

Now you are shown the settings.json (which you can always find via File –> Preferences –> Settings)

017 - settingsjson.PNG

This is the bit the wizard doesn't explain very well (hence this post). If you already have some settings between the curly braces, you will need to put a comma after them and then paste the code below. If you do not, then paste the code below between the curly braces

"twitter.consumerkey":"",
"twitter.consumersecret":"",
"twitter.accesstokenkey":"",
"twitter.accesstokensecret":"" 

and then paste the relevant keys and tokens from your Twitter app between the double quotes

THEN PRESS CTRL + S to save the settings.json – it's in caps because when I showed someone, they were too excited and didn't save it!!

Now it's all set up you can use the extension. Click the Twitter button in the bar and

018 - Twitter actions

Now you can have your Home timeline in Code

019 - twitter home.PNG

Yes, there are still a few seats left for the Europe PowerShell Conference. You can search, and see your mentions and your user page

020- mentions.PNG

and post 🙂 You can just press F1 and start typing twitter to get the commands

021 - comands

022 - tweet.png

and


023 - tweet.PNG

You can also message people using D username message. You can see the extension repo on GitHub, which will be a good place to raise issues, bugs and feature requests

I'm not sure it will increase my productivity!! But it really pleases the nerd in me!


2016 – That was a Year :-)

It's the time of year for reflection and I have had the most amazing 2016. I am blessed that I love what I do so much. I thoroughly enjoy writing and talking and sharing and commenting and supporting and cherishing all the SQL and PowerShell things. I wrote about using Power Bi to display my check-ins. I only started this in June and this is where I have been 🙂

swarm

I learnt about Pester and ended the year incorporating it into dbatools and dbareports. I also started using GitHub. It is quite surprising to me how much time I now spend using both. I also had to start learning DSC for the client I was working with because, as 'the PowerShell guy', I was the one who could pick it up the easiest. I learnt things and then forgot them, causing me to find this Pester post via Google later in the year!! (That's a big reason for blogging, by the way)

Early in the year we organised SQL Saturday Exeter

Helping to organise a SQL Saturday is a lot of fun, especially when you do it with good friends, but choosing sessions is by far the most challenging part of it for me. I could have chosen at least 60 of these sessions and I know people were disappointed not to have been chosen. I was also the first person many saw at SQL Bits in Liverpool manning the front of house and getting asked the best question ever

The Beard says

When you go to an event – Say thank you to the organisers and volunteers

and a TERRIBLE thing happened – I broke my DBA Team mug

WP_20160223_07_51_03_Pro.jpg

Luckily the fine folk at Redgate sorted me out with a replacement from deep in the stores somewhere and gave it to me at SQL Saturday Exeter 🙂 Thank you.

I spoke at the PowerShell Conference Europe and met and made some great friends, which led to me speaking at the PowerShell Monday in Munich and the Dutch PowerShell Usergroup. SQL Saturday Dublin was a blast, it's a wonderful city; Manchester had a whole PowerShell track 🙂 and Cambridge was memorable for the appalling journey as well as the chance to share a stage with Chrissy. PowerShell Conference Asia in the sovereign city-state of Singapore was such a good event and place. Lastly, of course, was Slovenia with its fantastic Christmas lights and awesome event organisation. I visited some user groups too: Southampton, run by my good friends John Martin and Steph Middleton (congratulations to John on his first MVP award yesterday); Cardiff, for the Return of the Battle of the Beards with Terry McCann and Tobiasz Koprowski, where the projector threw its toys out of the pram; and Birmingham, in the school hall, which was slightly chilly (there's a joke there for some people)

Amazing things happened

We created https://sqlps.io/vote and https://sqlps.io/ssms and https://sqlps.io/powerbi to enable anyone to influence Microsoft and help to improve the PowerShell SQL experience

and lo and behold there was a new sqlserver module 🙂

I was also invited by Aaron and Chrissy to become an officer for the PASS PowerShell Virtual Chapter, oh, and we made https://sqlps.io/slack to enable people to talk about all things Data Platform – another addition to my life that I didn't have at the beginning of the year. I spend a lot of time in there in the #dbatools and #dbareports channels and have made some fantastic friends. Chrissy and I created the SQL Community Collaborative GitHub team and added dbatools and dbareports, and even more friendships were born

And that’s the biggest and bestest thing about this year. Some amazing new friends and spending time with all my other friends. I started writing out a list but was terrified I would have missed someone out, so to all my friends

THANK YOU for a brilliant 2016 and 2017 shall be just as good 🙂

Here are a few of my pics from the year with a lot of my friends


Making Start-Demo work with multi-line commands without a backtick

I love to speak about PowerShell. I really enjoy giving presentations and when I saw Start-Demo being used at the PowerShell Conference in Hanover I started to make use of it in my presentations.

Start-Demo was written in 2007 by a fella who knows PowerShell pretty well 🙂  https://blogs.msdn.microsoft.com/powershell/2007/03/03/start-demo-help-doing-demos-using-powershell/

It was then updated in 2012 by Max Trinidad http://www.maxtblog.com/2012/02/powershell-start-demo-now-allows-multi-lines-onliners/

This enabled support for multi-line code using backticks at the end of each line. This works well, but I dislike having to use the backticks in foreach loops; it confuses people, who think that they need to be included, and to my mind it looks a bit messy

start-demo

This didn’t bother me enough to look at the code but I did mention it to my friend Luke t | g who decided to use it as a challenge for his Friday lunch-time codeathon and updated the function so that it works without needing a backtick

start-demo2

It also works with nested loops

start-demo3

just a little improvement but one I think that works well and looks good

You can find it at

https://github.com/SQLDBAWithABeard/Presentations/blob/master/Start-Demo.ps1

and a little demo showing what it can and can't do

https://github.com/SQLDBAWithABeard/Presentations/blob/master/start-demotest.ps1

Load the Start-Demo.ps1 file and then run

Start-Demo PATHTO\start-demotest.ps1

Enjoy!


Some Pester Tests for SQL Defaults

When I was at PowerShell Conference EU in Hannover last month (the videos are available now – click here – and the slides and code here) I found out about Irwin Strachan's Active Directory Operations Test, which got me thinking.

I decided to do the same for my usual SQL Set-up. Treating all of your servers to the same defaults makes it even easier to manage at scale remotely.

I am comfortable with using SMO to gather and change properties on SQL Instances so I started by doing this

It 'Should have a default Backup Directory of F:\SQLBACKUP\BACKUPS' {
    $Scriptblock = {
        [void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
        $srv = New-Object Microsoft.SqlServer.Management.Smo.Server .
        return $srv.BackupDirectory
    }
    $State = Invoke-Command -ComputerName ROB-SURFACEBOOK -ScriptBlock $Scriptblock
    $State | Should Be 'F:\SQLBACKUP\BACKUPS'
}

This is how to find the properties that you want

## Load the Assemblies
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
## Create a Server SMO object
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server SERVERNAME

## Explore it
$srv | Get-Member

## If you find an array pick the first one and expand and then explore that
$srv.Databases[0] | Select-Object *
$srv.Databases[0] | Get-Member

I quickly found as I added more tests that it was taking a long time to perform the tests (about 5 seconds each test) and that it took an age to fail each of the tests if the server name was incorrect or the server unavailable.

I fixed the first one by testing with a ping before running the tests

## Check for connectivity
if ((Test-Connection $Server -Count 1 -Quiet) -eq $false) {
    Write-Error "Could not connect to $Server"
    continue
}

The continue is there because I wanted to loop through an array of servers

I improved the performance using a remote session and a custom object

      Describe "$Server" {
BeforeAll {
$Scriptblock = {
[pscustomobject]$Return = @{}
$srv = ''
$SQLAdmins = $Using:SQLAdmins
[void][reflection.assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo');
$srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
$Return.DBAAdminDb = $Srv.Databases.Name.Contains('DBA-Admin')
$Logins = $srv.Logins.Where{$_.IsSystemObject -eq $false}.Name
$Return.SQLAdmins = @(Compare-Object $Logins $SQLAdmins -SyncWindow 0).Length - $Logins.count -eq $SQLAdmins.Count
$SysAdmins = $Srv.Roles['sysadmin'].EnumMemberNames()
$Return.SQLAdmin = @(Compare-Object $SysAdmins $SQLAdmins -SyncWindow 0).Length - $SysAdmins.count -eq $SQLAdmins.Count
$Return.BackupDirectory = $srv.BackupDirectory
$Return.DataDirectory = $srv.DefaultFile

The BeforeAll script block is run, as it sounds like it should, once before all of the tests, BeforeEach would run once before each of the tests. I define an empty custom object and then create an SMO object and add the properties I am interested in testing to it. I then return the custom object at the end

            $Return.Alerts82345Exist = ($srv.JobServer.Alerts | Where-Object {$_.MessageId -eq 823 -or $_.MessageId -eq 824 -or $_.MessageId -eq 825}).Count
            $Return.Alerts82345Enabled = ($srv.JobServer.Alerts | Where-Object {($_.MessageId -eq 823 -or $_.MessageId -eq 824 -or $_.MessageId -eq 825) -and $_.IsEnabled -eq $true}).Count
            $Return.SysDatabasesFullBackupToday = $srv.Databases.Where{$_.IsSystemObject -eq $true -and $_.Name -ne 'tempdb' -and $_.LastBackupDate -lt (Get-Date).AddDays(-1)}.Count
            return $Return
        }
        try {
            $Return = Invoke-Command -ScriptBlock $Scriptblock -ComputerName $Server -ErrorAction Stop
        }
        catch {
            Write-Error "Unable to Connect to $Server"
            $Error
            continue
        }

I was then able to test against the property of the custom object

It 'Should have Alerts for Severity 20 and above' {
    $Return.Alerts20SeverityPlusExist | Should Be 6
}
It 'Severity 20 and above Alerts should be enabled' {
    $Return.Alerts20SeverityPlusEnabled | Should Be 6
}
It 'Should have alerts for 823,824 and 825' {
    $Return.Alerts82345Exist | Should Be 3
}
It 'Alerts for 823,824 and 825 should be enabled' {
    $Return.Alerts82345Enabled | Should Be 3
}

Occasionally, for reasons I haven't explored, I had to test against the Value property of the returned object

          It "The Full User Database Backup should be scheduled Weekly $OlaUserFullSchedule" {
$Return.OlaUserFullSchedule.value | Should Be $OlaUserFullSchedule
}

I wanted to be able to run the tests against environments or groups of servers with different default values, so I parameterised the test values as well. The logical next step was to turn it into a function so that I could use parameter splatting. This also gives me the opportunity to show all of the things that I am currently passing as parameters to the test

$Parms = @{
    Servers = 'SQLServer1','SQLServer2','SQLServer3';
    SQLAdmins = 'THEBEARD\Rob','THEBEARD\SQLDBAsAlsoWithBeards';
    BackupDirectory = 'C:\MSSQL\Backup';
    DataDirectory = 'C:\MSSQL\Data\';
    LogDirectory = 'C:\MSSQL\Logs\';
    MaxMemMb = '4096';
    Collation = 'Latin1_General_CI_AS';
    TempFiles = 4;
    OlaSysFullFrequency = 'Daily';
    OlaSysFullStartTime = '21:00:00';
    OlaUserFullSchedule = 'Weekly';
    OlaUserFullFrequency = 1; ## 1 for Sunday
    OlaUserFullStartTime = '22:00:00';
    OlaUserDiffSchedule = 'Weekly';
    OlaUserDiffFrequency = 126; ## 126 for every day except Sunday
    OlaUserDiffStartTime = '22:00:00';
    OlaUserLogSubDayInterval = 15;
    OlaUserLoginterval = 'Minute';
    HasSPBlitz = $true;
    HasSPBlitzCache = $true;
    HasSPBlitzIndex = $true;
    HasSPAskBrent = $true;
    HasSPBlitzTrace = $true;
    HasSPWhoisActive = $true;
    LogWhoIsActiveToTable = $true;
    LogSPBlitzToTable = $true;
    LogSPBlitzToTableEnabled = $true;
    LogSPBlitzToTableScheduled = $true;
    LogSPBlitzToTableSchedule = 'Weekly';
    LogSPBlitzToTableFrequency = 2; ## 2 means Monday
    LogSPBlitzToTableStartTime = '03:00:00'
}

Test-SQLDefault @Parms

I have some other tests which always return what I want, particularly the firewall rules which you will have to modify to suit your own environment

To be able to run this you will need the Pester module. If you are using Windows 10 then it is installed by default; if not

Find-Module -Name 'Pester' | Install-Module

You can find more about Pester here and here and also these videos from the conference
You can find the tests on GitHub here and I will continue to add to the defaults that I check.
This is not a replacement for other SQL configuration tools such as PBM, but it is a nice, simple way of reporting on the current status of a SQL installation, either at a particular point in time when something is wrong, or after an installation, prior to passing the server over to another team or into service


DBA Database scripts are on Github

It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_WhoIsActive, this fantastic script for logging the results from sp_WhoIsActive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList table in my DBA database I am able to target instances by SQL version, OS version, environment, data centre, system, client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.
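In outline, the job does something like the sketch below. The column names and filter here are illustrative rather than the exact schema, so check the repo for the real scripts:

## Illustrative sketch only - see the repo for the real schema and scripts
$Query = "SELECT ServerName, InstanceName FROM dbo.InstanceList WHERE Environment = 'Development'"
$Instances = Invoke-Sqlcmd -ServerInstance DBADatabaseServer -Database DBADatabase -Query $Query
foreach ($Instance in $Instances) {
    ## run the scripts marked as needing installing against each instance here
}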

I talked to a lot of people in Hannover and they all suggested that I place the scripts onto GitHub, and after some how-to instructions from a few people (Thank you Luke) I spent the weekend updating and cleaning up the code and you can now find it on GitHub here

github

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use. Some of the DBA scripts I use (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files are there as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome