Using Get-SQLDiagFix to get information from the SQL Server Diagnostic API with PowerShell

The SQL Server Diagnostics Preview was announced just over a week ago. It includes an add-on for SQL Server Management Studio that enables you to analyse SQL Server memory dumps and view information on the latest SQL Server cumulative updates for supported versions of SQL Server. Arun Sirpal has written a good blog post showing how to install it and use it in SSMS to analyse dumps.

There is also a developer API available so I thought I would write some PowerShell to consume it as there are no PowerShell code examples available in the documentation!

In a previous post I explained how I created the module and a GitHub repository and used Pester to help me develop the first command, Get-SQLDiagRecommendations. At present the module has 5 commands, all for accessing the Recommendations API.

This post is about the command Get-SQLDiagFix which returns the Product Name, Feature Name/Area, KB Number, Title and URL for the Fixes in the Cumulative Updates returned from the SQL Server Diagnostics Recommendations API.

PowerShell Gallery

The module is available on the PowerShell Gallery which means that you can install it using

Install-Module SQLDiagAPI

as long as you have the latest version of the PowerShellGet module. This is already installed in Windows 10 and with WMF 5 but you can install it on the following systems

  • Windows 8.1 Pro
  • Windows 8.1 Enterprise
  • Windows 7 SP1
  • Windows Server 2016 TP5
  • Windows Server 2012 R2
  • Windows Server 2008 R2 SP1

following the instructions here.

If you are not running your PowerShell using a local administrator account you will need to run

Install-Module SQLDiagAPI -Scope CurrentUser

to install the module.

If you can’t use the PowerShell Gallery you can install it using the instructions on the repository.

API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here to get one for the SQL Server Diagnostics API.

01 - APIKey

You will need to store the key to use it. I recommend saving the API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as, hopefully, user accounts will have access there in most shops.

This will save you from having to enter the APIKey every time you run the commands as the code is looking for it to be saved in that file.
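This is a minimal sketch of how that stored key can be read back out of the file; the property path is the standard pattern for a credential saved with Export-CliXml, though the module's own code may differ.

# Read the stored credential back and extract the API Key (saved as the password)
$Credential = Import-Clixml -Path "${env:\userprofile}\SQLDiag.Cred"
$APIKey = $Credential.GetNetworkCredential().Password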

The Commands

Once you have installed the module and the API Key it will be available whenever you start PowerShell. The first time you install it you may need to run

Import-Module SQLDiagAPI

to load it into your session. Once it is loaded you can view the available commands using

Get-Command -Module SQLDiagAPI

01 - SQLDiagAPI Commands.png

You can find out more about the commands on the GitHub Repository and the Help files are in the documentation.

Get-Help

Always, always when starting with a new module or function in PowerShell you should start with Get-Help. I like to use the -ShowWindow parameter to open the help in a separate window as it has all of the help and a handy search box.

Get-Help Get-SQLDiagFix

02 - Get-Help Get-SQLDiagFix.png

Good help should always include plenty of examples to show people how to use the command. There are 12 examples in the help for Get-SQLDiagFix. You can view just the examples using

Get-Help Get-SQLDiagFix -examples

Get All Of The Fixes

The easiest thing to do is to get all of the available fixes from the API. This is done using

Get-SQLDiagFix

which will return all 123 Fixes currently referenced in the API.

03 get-sqldiagfix.png

That is just a lot of information on the screen. If we want to search through that with PowerShell we can use Out-GridView

Get-SQLDiagFix | Select Product, Feature, KB, Title | Out-GridView

05 Get-SQLDiagFix OutGridView Search.gif

Or maybe if you want to put them in a database you could use dbatools

$Fixes = Get-SQLDiagFix | Out-DbaDataTable
Write-DbaDataTable -SqlServer $Server -Database $DB -InputObject $Fixes -Table Fixes -AutoCreateTable

Get Fixes for a Product

If you only want to see the fixes for a particular product you can use the product parameter. To see all of the products available in the API you can run

Get-SQLDiagProduct

06 Get-SQLDiagProduct.png

You can either specify the product

Get-SQLDiagFix -Product 'SQL Server 2016 SP1' | Format-Table

07 Get-SQLDiagFix Product.png

or you can pipe the results of Get-SQLDiagProduct to Get-SQLDiagFix which enables you to search. For example, to search for all fixes for SQL Server 2014 you can do

Get-SQLDiagProduct 2014 | Get-SQLDiagFix | Format-Table -AutoSize

08 - Get-SQLDiagFix Product Search.png

This will show the fixes available in the API for SQL Server 2014 SP1 and SQL Server 2014 SP2.

Get The Fixes for A Feature

The fixes in the API are also categorised by feature area. You can see all of the feature areas using Get-SQLDiagFeature

Get-SQLDiagFeature

09 get-sqldiagfeature.png

You can see the fixes in a particular feature area using the Feature parameter with

Get-SQLDiagFix -Feature Spatial | Format-Table -AutoSize

10 - Get-SQLDiagFix by feature.png

or you can search for a feature with a name like query and show the fixes using

Get-SQLDiagFix -Feature (Get-SQLDiagFeature query) | Format-Table -AutoSize

11 - Get-SQLDiagFix by feature query.png

Get Fixes for a Product and a Feature

You can combine the two approaches above to search for fixes by product and feature area. If you want to see the fixes for SQL Server 2016 to do with backups you can use

Get-SQLDiagProduct 2016 | Get-SQLDiagFix -Feature (Get-SQLDiagFeature backup) | Format-Table -AutoSize

12 - Get-SQLDiagFix by feature adn product.png

No-one wants to see the words “…restore fails when….”! This is probably a good time to fix that.

Open the KB Article Web-Page

As well as getting the title and KB number of the fix, you can open the web-page. This code will open the fixes for all SP1 products in the feature areas matching ‘al’ in Out-GridView and enable you to choose one (or more) and open them in your default browser

Get-SQLDiagProduct SP1 | Get-SQLDiagFix -Feature (Get-SQLDiagFeature -Feature al) |
Out-GridView -PassThru | ForEach-Object {Start-Process $_.URL}

13 - Open a webpage.gif

 

There is a YouTube video as well showing how to use the command.

 

You can find the GitHub repository at  https://github.com/SQLDBAWithABeard/SQLDiagAPI


Creating a PowerShell Module and TDD for Get-SQLDiagRecommendations

Yesterday I introduced the first command in the SQLDiagAPI module. A module to consume the SQL Diagnostics API.

I have been asked a few times what the process is for creating a module, using GitHub and developing with Pester. Whilst this is not a comprehensive how-to, I hope it will give some food for thought when you decide to write a PowerShell module or start using Pester for code development. I also hope it will encourage you to give it a try and to blog about your experience.

This is my experience from nothing to a module with a function using Test Driven Development with Pester. There are some details missing in some places but if something doesn’t make sense then ask a question. If something is incorrect then point it out. I plan on never stopping learning!

There are many links to further reading and I urge you to not only read the posts linked but also to read further and deeper. That’s a generic point for anyone in the IT field and not specific to PowerShell. Never stop learning. Also, say thank you to those that have taken their time to write content that you find useful. They will really appreciate that.

Github Repository

I created a new repository in GitHub and used Visual Studio Code to clone the repository by pressing F1, typing clone, choosing Git Clone and following the prompts. I started with this because I was always planning to share this code and because source controlling it is the best way to begin.

Plaster Template

When you create a module there are a number of files that you need and I have a number of generic tests that I add. I also have a structure that I create for the artifacts and a number of markdown documents that come with a GitHub Repository. Whilst you could write a PowerShell script to create all of those, there is no need as there is Plaster. Plaster is a PowerShell module that enables you to set up the default scaffolding for your PowerShell module structure and tokenise some files. This makes it much easier to have a default ‘scaffold’ for the module, a structure for the files and folders, and to create a new module simply. I used Kevin Marquette’s post on Plaster to create myself a template module. You can find my Plaster Template here.

You do not need to use Plaster at all but, as with anything, if you find yourself repeating steps then it is time to automate it.

With my Plaster Template created I could simply run

$plaster = @{
    TemplatePath    = "GIT:\PlasterTemplate" #(Split-Path $manifestProperties.Path)
    DestinationPath = "GIT:\SQLDiagAPI"
    FullName        = "Rob Sewell"
    ModuleName      = "SQLDiagAPI"
    ModuleDesc      = "This is a module to work with the SQL Server Diagnostics (Preview) API. See https://blogs.msdn.microsoft.com/sql_server_team/sql-server-diagnostics-preview/ for more details "
    Version         = "0.9.0"
    GitHubUserName  = "SQLDBAWithABeard"
    GitHubRepo      = "SQLDiagAPI"
}
If (!(Test-Path $plaster.DestinationPath)) {
    New-Item -ItemType Directory -Path $plaster.DestinationPath
}
Invoke-Plaster @plaster -Verbose

This created my module. It created this folder and file structure and included some default tests and markdown documents pre-populated.

00 - module

Pester

For those that don’t know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.
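One note from experience: if you are updating the copy of Pester that ships inbox with Windows 10 / PowerShell 5, the Gallery version is signed by a different publisher, so the update needs a couple of extra switches:

# The inbox Pester is signed by Microsoft, so replacing it from the Gallery
# requires both -Force and -SkipPublisherCheck
Install-Module Pester -Force -SkipPublisherCheck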

API Key

Now that I have the module I started to think about the commands. I decided to start with the recommendations API which is described as

Customers will be able to keep their SQL Server instances up-to-date by easily reviewing the recommendations for their SQL Server instances. Customers can filter by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.) and view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU.

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it and, as I am writing code that others will use, I need to consider how they can repeat the steps that I take. I decided to save my API Key using the Export-CliXml command as described by Jaap Brasser here.

Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred"

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as, hopefully, user accounts will have access there in most shops.

TDD

I approached writing this module using Test Driven Development with Pester. This means that I have to write my tests before I write my code. There are many reasons for doing this which are outside the scope of this blog post. This is a very good post to read more.

The first function I wanted to write was to get the recommendations from the API. I decided to call it Get-SQLDiagRecommendations.

I decided that the first test should be to ensure that the API Key exists, otherwise I would not be able to use it when calling the API. I already had an idea of how I would approach it: checking for the saved API Key file with Test-Path and writing a warning if the file did not exist.

Mocking

However, this is not going to work if I have already saved the key to the file. The test must not rely on anything external. I need to be able to test this functionality without actually checking my system. I will use Mock to do this. You can read more about mocking with Pester here.

I added this to my Pester test

Context "Requirements" {
Mock Test-Path {$false}
Mock Write-Warning {"Warning"}

This is what happens when you run this test. When there is a call to Test-Path in the code you have written, instead of actually running Test-Path it will return whatever is inside the curly braces, in this case $false. For Write-Warning it will return a string of Warning.

This means that I can write a test like this

It "Should throw a warning if there is no API Key XML File and the APIKey Parameter is not used"{
Get-SQLDiagRecommendations -ErrorAction SilentlyContinue | Should Be "Warning"
}

So I know that when running my code in this test, Test-Path will return $false, which will invoke Write-Warning in my code, and in the test that will return “Warning”. So if I have written my code correctly, the test will pass without actually running the real Test-Path and interacting with my system, or running the real Write-Warning, which makes it easier to test that warnings are thrown correctly.

The name of the test will also let me (and others) know in the future what I was trying to achieve. This means that if I (or someone else) changes the code and the test fails they can understand what was meant to happen. They can then either write a new test for the changed code if the requirements are now different or alter the code so that it passes the original test.

I use

-ErrorAction SilentlyContinue

so that the only red text that I see on the screen is the results of the test and not any PowerShell errors.

Asserting

I can also check that I have successfully called my Mocks using Assert-MockCalled. This command will check that a command that has been mocked has been called successfully during the test in the scope of the Describe (or in this case Context) block of the tests.

It 'Checks the Mock was called for Test-Path' {
    $assertMockParams = @{
        'CommandName' = 'Test-Path'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}

I specify the command name, the number of times that I expect the mock to have been called and, because I know that it will be exactly 1 time, I set Exactly to $true. If I set Exactly to $false it would test that the mock was called at least the number of times specified. This is another check that I really have called the Mocks that I defined and that the results are correct and dependent only on the code.

I set up the same test for Write-Warning.
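That companion test mirrors the Test-Path assertion; a sketch of how it might look (the module's actual test wording may differ):

It 'Checks the Mock was called for Write-Warning' {
    $assertMockParams = @{
        'CommandName' = 'Write-Warning'
        'Times'       = 1
        'Exactly'     = $true
    }
    Assert-MockCalled @assertMockParams
}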

Failed Test

I can now run my Pester tests using

Invoke-Pester .\Tests

and see that some failed.

02 - Failed Pester tests

Of course it failed; I don’t have a function named Get-SQLDiagRecommendations yet.

So why run the test?

I need to ensure that my test fails before I write the code to pass it. If I don’t do that I may mistakenly write a test that passes and therefore not be correctly testing my code.

You can also see that it has run all of the .Tests.ps1 files in the tests directory and has taken 42 seconds to run. The tests directory includes a number of Pester tests, including checking that all of the scripts pass the Script Analyzer rules and that all of the functions have the correct help. (Thank you June Blender for that test.)

Show

I can reduce the output of the tests using the Show parameter of Invoke-Pester. I will often use Fails, as this will show the Describe and Context titles and only the tests that fail. This will run much quicker as it will not need to output all of the passed tests to the screen.

03 - Pester show fails

Now the test is running in less than half of the time. You can filter the output in further ways using Show. You can run

Get-Help Invoke-Pester

to see how else you can do this.
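For instance, Show takes a flags-style list, so values can be combined; a couple of other options as a sketch:

Invoke-Pester .\Tests -Show Summary          # just the final summary line
Invoke-Pester .\Tests -Show Describe, Fails  # block titles plus any failures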

Tags

As I am going to be writing tests and then writing code to pass the tests repeatedly, I don’t want to run all of these tests all of the time, so I can use the Tags parameter of Invoke-Pester to only run a certain suite of tests. In the Unit.Tests.ps1 file the Describe block looks like this

Describe "Get-SQLDiagRecommendations" -Tags Build , Unit{
Context "Requirements" {

So I can run just the tests tagged Unit and skip all of the other tests. Combined with Show Fails to reduce the output, my Invoke-Pester code looks like this

Invoke-Pester .\tests -Show Fails -Tag Unit

04 - Pester Tags

Now I am only running the tests that I need for writing the code for the command, and the tests are running in under half a second 🙂 This is so much better when I am going to be running them repeatedly.

The other tests have different tags and I will show them running later in the post.

Code

Finally, we can write some code to pass our failing test

function Get-SQLDiagRecommendations {
    [cmdletbinding()]
    Param([string]$ApiKey)
    if (!$ApiKey) {
        if (!(Test-Path "${env:\userprofile}\SQLDiag.Cred")) {
            Write-Warning "You have not created an XML File to hold the API Key or provided the API Key as a parameter
You can export the key to an XML file using Get-Credential | Export-CliXml -Path `"`${env:\userprofile}\SQLDiag.Cred`"
You can get a key by following the steps here https://ecsapi.portal.azure-api.net/ "
        }
    }
}

Which would look like this if the file does not exist and the API Key parameter is not used

05 - Warning

I like to provide users with a useful message that they can follow rather than a lot of red text that they need to decipher

And now our tests pass

06 - Passing Tests

If you look at the API documentation the API requires a callerid as well as the APIKey. In the examples it uses the value from
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\MachineGUID

We can get that using Get-ItemProperty, and without it we can’t call the API, so I wrote tests like this.

It "Returns a warning if unable to get Machine GUID" {
Mock Get-MachineGUID {} -Verifiable
Mock Write-Warning {"Warning"} -Verifiable
Get-SQLDiagRecommendations -APIKey dummykey | Should Be "Warning"
Assert-VerifiableMocks
}

I am not saying this is the correct way to write your tests. I am showing that you can test multiple things in an It block and if any one of them fails the entire test fails.

I am mocking the internal function Get-MachineGUID and Write-Warning just in the scope of this It block, and passing an APIKey parameter to Get-SQLDiagRecommendations so that we don’t hit the warnings we tested for above, and then using Assert-VerifiableMocks to verify that the mocks have been called. It does not verify how many times, just that all of the mocks in that block have been called.

The test fails as expected and then I write the code to pass the test. This is the internal function to get the Machine GUID

function Get-MachineGUID {
    try {
        (Get-ItemProperty registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\ -Name MachineGuid).MachineGUID
    }
    catch {
        Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    }
}

 

and this is the call to the internal function and warning message

$MachineGUID = Get-MachineGUID
if ($MachineGUID.length -eq 0) {
    Write-Warning "Failed to get Machine GUID from HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\"
    break
}

Rinse and repeat

That is basically the process that I follow to write a function. I write a test, write some code to pass it, write another test, write some code to pass that, and keep going until I have finished writing the code and all the tests have passed.

Best Practice Code

Once that was done and my Unit tests had passed I ran

Invoke-Pester .\tests -Tag ScriptAnalyzer -Show Fails

to check that the PowerShell code that I had written conformed to the Script Analyzer rules. I added an exception to the Help.Exceptions.ps1 file to not run the rule for plural nouns, as I think the command has to be called Get-SQLDiagRecommendations with an s! I have tagged the Script Analyzer tests with a tag so I can just run those tests.
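For illustration only, excluding that rule with the PSScriptAnalyzer cmdlets directly looks roughly like this; the shape of the module's actual Help.Exceptions.ps1 file may well differ:

# PSUseSingularNouns is the rule that objects to plural nouns such as
# Get-SQLDiagRecommendations, so exclude it when analysing the functions folder
Invoke-ScriptAnalyzer -Path .\functions -ExcludeRule 'PSUseSingularNouns'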

Help

As that had all passed I could then run

Invoke-Pester .\tests -Tag Help

which tests whether I had the correct help for my functions. Of course that failed, but I could use the nifty new feature in VS Code’s PowerShell extension to add the help scaffolding really easily, as I describe here.

Then I could run all 563 of the Pester tests in the tests folder and be happy that everything was OK.

11 - All Pester passed.PNG

By the end I had written the module, which you can find here

There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this, in fact I encourage it. Please fork and send PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

As always, I highly recommend that if you want to know more about Pester you head over here and purchase this book by Adam.

PowerShell Module for the SQL Server Diagnostics API – 1st Command Get-SQLDiagRecommendations

I saw this blog post about the SQL Server Diagnostics add-on to SSMS and API and thought I would write some PowerShell to work with it as all of the examples use other languages.

SQL Server Diagnostics API

The Diagnostic Analysis API allows you to upload memory dumps to be able to debug and self-resolve memory dump issues from your SQL Server instances and receive recommended Knowledge Base (KB) article(s) from Microsoft, which may be applicable for the fix.

There is also the Recommendations API to view the latest Cumulative Updates (CU) and the underlying hotfixes addressed in the CU, which can be filtered by product version or by feature area (e.g. Always On, Backup/Restore, Column Store, etc.).

I have written a module to work with this API. It is not complete. It only has one command as of now but I can see lots of possibilities for improvement and further commands to interact with the API fully and enable SQL Server professionals to use PowerShell for this.

Storing the API Key

To use the API you need an API Key. An API Key is a secret token that identifies the application to the API and is used to control access. You can follow the instructions here https://ecsapi.portal.azure-api.net/ to get one for the SQL Server Diagnostics API.

01 - APIKey

I will need to store the key to use it. I saved my API Key using the Export-CliXml command as described by Jaap Brasser here.

 Get-Credential | Export-CliXml -Path "${env:\userprofile}\SQLDiag.Cred" 

You need to enter a username, even though it is not used, and then enter the API Key as the password. It is saved in the root of the user profile folder as, hopefully, user accounts will have access there in most shops.

The commands in the module will look for the API Key in that SQLDiag.Cred file by default but you can also just use the APIKey parameter.

Get-SQLDiagRecommendations

The first function in the module is Get-SQLDiagRecommendations. All this function does is connect to the Recommendations API and return an object containing the information about the latest Cumulative Updates.

If you have already saved your API Key as described above you can use

 Get-SQLDiagRecommendations 

If you want to enter the API Key manually you would use

 Get-SQLDiagRecommendations -APIKey XXXXXXXX

Either way it will return a PowerShell object containing all of the information which looks like this.

07 - Get-SQLRecommendations

One of the beauties of PowerShell is that you can pass objects down a pipeline and use them in other commands. Also, your only limit is your imagination.

You want to export to CSV, HTML, Text file?
Email, Import to database, store in Azure storage?
Embed in Word, Excel or on a SharePoint site?

All of this and much, much more is easily achievable with PowerShell.

In the future this command will feed other functions in the module that will display this information in a more useful fashion. I am thinking of commands like

Get-SQLDiagRecommendations |
Get-SQLDiagLatestCU -Version SQL2012

or

Get-SQLDiagRecommendations |
Get-SQLDiagKBArticle -Version SQL2012 -Feature BackupRestore

If you have any ideas please join in on GitHub

JSON

For now though you can use Get-SQLDiagRecommendations to output the results to JSON so that you can examine them or consume them.

If you use VS Code follow the steps here and you can export the results to the current file with

 Get-SQLDiagRecommendations |ConvertTo-Json -Depth 7 |Out-CurrentFile 

Which looks like this

08 - OutCurrentFile

It shows the entire JSON object containing all of the information about all of the latest CUs for SQL Server 2012 and up and each of the KB Articles. I have minimised several of the nodes to try and show as much as possible for SQL Server 2012 SP3.

If you do not use VS Code or you want to export straight to a file then you can

 Get-SQLDiagRecommendations |ConvertTo-Json -Depth 7 |Out-File -Path PATHTOFILE 

Out-GridView

I like Out-GridView so I quickly gathered the Product, Cumulative Update, Feature Type, KB Number and URL and output them to Out-GridView like this

$recommendations = Get-SQLDiagRecommendations
$KBs = foreach ($recommendation in $recommendations.Recommendations) {
    $Product = $recommendation.Product
    $CU = $recommendation.Title
    $CreatedOn = $recommendation.CreatedOn
    foreach ($fix in $recommendation.Content.RelevantFixes) {
        $feature = $fix.Title
        foreach ($Kb in $fix.KbArticles) {
            [PSCustomObject]@{
                CreatedOn = $CreatedOn
                Product   = $Product
                CU        = $CU
                Feature   = $feature
                KB        = $Kb.Rel
                Link      = $Kb.href
            }
        }
    }
}
$KBs | Out-GridView

As you can filter easily in Out-GridView I filtered by 2012 and this is what it looks like

09 - Out-GridView

This will enable you to quickly see any information that you require about the Cumulative Updates for SQL Server 2012, 2014 and 2016.

Github

You can find the module on GitHub. There are instructions and a script to install it easily.

Right now it has only got the one function to get the SQL recommendations but I will look at expanding that over the next few days and, once it is more complete, put it onto the PowerShell Gallery and maybe move it into the SQL Server Community GitHub Organisation, home of https://dbatools.io, https://dbareports.io, Invoke-SQLCmd2 and the SSIS Reporting pack.

Contribute

Of course I am happy to have others contribute to this, in fact I encourage it. Please fork and send PRs and make this a useful module with more commands. There is the Diagnostic Analysis API to work with as well, and I am very interested to see how we can make use of that with PowerShell.

Tomorrow I have a post explaining the process I used to create the module and how I used Test Driven Development with Pester to write this function.

dbatools at #SQLSatDublin

This weekend SQL Saturday Dublin occurred. For those that don’t know, SQL Saturdays are free conferences with local and international speakers providing great sessions in the Data Platform sphere.

Chrissy LeMaire and I presented our session PowerShell SQL Server: Modern Database Administration with dbatools. You can find slides and code here. We were absolutely delighted to be named Best Speaker, which was decided from the attendees’ average evaluations.

Chrissy also won the Best Lightning talk for her 5 minute (technically 4 minutes and 55 seconds) presentation on dbatools as well 🙂

We thoroughly enjoy giving this presentation and I think it shows in the feedback we received.

Feedback

History

We start with a little history of dbatools, how it started as one megalithic script, Start-SQLMigration.ps1, and has evolved into (this number grows so often it is probably wrong by the time you read this) over 240 commands from 60 contributors.

Requirements

We explain the requirements. You can see them here on the download page.

The minimum requirements for the Client are

  • PowerShell v3
  • SSMS / SMO 2008 R2

which we hope will cover a significant majority of people’s workstations.

The minimum requirements for the SQL Server are

  • SQL Server 2000
  • No PowerShell for pure SQL commands
  • PowerShell v2 for Windows commands
  • Remote PowerShell enabled for Windows commands

As you can see, the SQL Server does not even need to have PowerShell installed (unless you want to use the Windows commands). We test our commands thoroughly using a test estate that encompasses all versions of SQL Server from 2000 through to 2017 and whenever there is a vNext available we will test against that too.

We recommend, though, that you use PowerShell v5.1 with SSMS or SMO for SQL 2016 on the client.

Installation

We love how easy and simple the installation of dbatools is. As long as you have access to the internet (and permission from your company’s security team to install 3rd party tools. Please don’t break your company’s policies) you can simply install the module from the PowerShell Gallery using

Install-Module dbatools

If you are not a local administrator on your machine you can use the -Scope parameter

Install-Module dbatools -Scope CurrentUser

Incidentally, if you or your security team have concerns about the quality or trust of the content in the PowerShell Gallery please read this post which explains the steps that are taken when code is uploaded.

If you cannot use the PowerShell Gallery then you can use this line of code to install from GitHub

Invoke-Expression (Invoke-WebRequest https://dbatools.io/in)

There is a video on the download page showing the installation on a Windows 7 machine and also some other methods of installing the module should you need them.

Website

Next we visit the website dbatools.io and look at the front page. We have our regular joke about how Chrissy doesn’t want to present on migrations but I think they are so cool so she makes me perform the commentary on the video. (Don’t tell anyone but it also helps us to get in as many of the 240+ commands in a one hour session as well 😉 ). You can watch the video on the front page. You definitely should as you have never seen migrations performed so easily.

Then we talk about the comments we have received from well respected people from both the SQL and PowerShell communities, so you can trust that it’s not just some girl with hair and some bloke with a beard saying that it’s awesome.

Contributors

Probably my favourite page on the website is the team page showing all of the amazing, fabulous, wonderful people who have given their own time freely to make such a fantastic free tool. If we have contributors in the audience we do try to point them out. One of our aims with dbatools is to enable people to receive recognition for the hard work that they put in, and we do this via the team page, our LinkedIn company page, as well as by linking back to the contributors in the help and the web-page for every command. I wish I could name check each one of you.

Thank You each and every one !!

Finding Commands

We then look at the command page and the new improved search page and demonstrate how you can use them to find information about the commands that you need and the challenges in keeping this all maintained during a period of such rapid expansion.

Demo

Then it is time for me to say this phrase. “Strap yourselves in, do up your seatbelts, now we are going to show 240 commands in the next 40 minutes. Are you ready!!”

Of course, I am joking, one of the hardest things about doing a one hour presentation on dbatools is the sheer number of commands that we have that we want to show off. Of course we have already shown some of them in the migration video above but we still have a lot more to show and there are a lot more that we wish we had time to show.

Backup and Restore

We start with a restore of one database from a single backup file using Restore-DbaDatabase, showing you the easy to read warning that you get if the database already exists and then how to resolve that warning with the WithReplace switch.

Then how to use it to restore an entire instance’s worth of backups to the latest available time by pointing Restore-DbaDatabase at a folder on a share.

Then how to use Get-DbaDatabase to get all of the databases on an instance and pass them to Backup-DbaDatabase to back up an entire instance.
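As a rough sketch of those demos (the instance and share names here are invented, and older dbatools versions use -SqlServer where later ones use -SqlInstance):

# Restore a single backup file, replacing the existing database
Restore-DbaDatabase -SqlInstance sql2016n1 -Path '\\FileShare\Backups\AdventureWorks.bak' -WithReplace

# Point it at a folder on a share to restore an entire instance to the latest available time
Restore-DbaDatabase -SqlInstance sql2016n1 -Path '\\FileShare\Backups'

# Back up every database on an instance
Get-DbaDatabase -SqlInstance sql2016n1 | Backup-DbaDatabase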

We look at the Backup history of some databases using Get-DbaBackupHistory and Out-GridView and examine detailed information about a backup file using Read-DbaBackupHeader.

We give thanks to Stuart Moore for his amazing work on these and several other backup and restore commands.

SPNs

After a quick reminder that you can search for commands at the command line using Find-DbaCommand, we talk about SPNs and try to find someone, anyone, who actually likes working with SQL Server and SPNs and resolving the issues!!
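A quick sketch of that search (Find-DbaCommand takes a pattern to match against command names and help):

# Find every dbatools command that mentions SPNs
Find-DbaCommand -Pattern spn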

Then we show Drew’s SPN commands: Get-DbaSpn, Test-DbaSpn, Set-DbaSpn and Remove-DbaSpn.

Holiday Tasks

We then talk about the things we ensure we run before going on holiday to make sure we leave with a warm fuzzy feeling that everything will be OK until we return:

  • Get-DbaLastBackup will show the last time the database had any type of backup.
  • Get-DbaLastGoodCheckDb which shows the last time that a database had a successful DBCC CheckDb and how we can gather the information for all the databases on all of your instances in just one line of code (see the sketch after this list).
  • Get-DbaDiskSpace which will show the disk information for all of the drives including mount points and whether the disk is in use by SQL.
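A sketch of that one-liner pattern, with invented instance names (older dbatools versions use -SqlServer rather than -SqlInstance):

# Last known good DBCC CHECKDB for every database on all of these instances
Get-DbaLastGoodCheckDb -SqlInstance 'sql2016n1','sql2016n2','sql2016n3' | Out-GridView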

Testing Your Backup Files By Restoring Them

We ask how many people test their backup files every single day and Dublin wins marks for a larger percentage than some other places we have given this talk. We show Test-DbaLastBackup in action so that you can see the files being created, because we think it looks cool (and you can see the filenames!). Chrissy has written a great post about how you can set up your own dedicated backup file test server.
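A minimal sketch of that command (instance names invented): it restores the most recent backups onto the destination server, runs DBCC CHECKDB against the copies, then drops them and reports the results.

# Verify the last backups from sql2016n1 by restoring them on a dedicated test instance
Test-DbaLastBackup -SqlInstance sql2016n1 -Destination sqltest01 | Out-GridView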

Free Space

We show how to gather the file space information using Get-DbaDatabaseFreespace and then how you can put that (or the results of any PowerShell command) into a SQL database table using Out-DbaDataTable and Write-DbaDataTable.
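That pattern looks something like this sketch (server and database names are placeholders, and the parameter names match the dbatools of the time):

# Capture database free space information and write it to a table in a DBA database
$space = Get-DbaDatabaseFreespace -SqlServer sql2016n1 | Out-DbaDataTable
Write-DbaDataTable -SqlServer sql2016n1 -Database DBA -InputObject $space -Table FreeSpace -AutoCreateTable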

SQL Community

Next we talk about how we love to take community members’ blog posts and turn them into dbatools commands.

We start with Jonathan Kehayias’s post about SQL Server Max memory (http://bit.ly/sqlmemcalc) and show Get-DbaMaxMemory, Test-DbaMaxMemory and Set-DbaMaxMemory.

Then Paul Randal’s blog post about Pseudo-Simple Mode, which inspired Test-DbaFullRecoveryModel.

We talked about getting backup history earlier but now we talk about Get-DbaRestoreHistory, a command inspired by Kenneth Fisher’s blog post, to show when a database was restored and which file was used.

Next, a command from Thomas LaRock which he gave us for testing linked servers, Test-DbaLinkedServerConnection.

Glenn Berry’s diagnostic information queries are available thanks to André Kamman and the commands Invoke-DbaDiagnosticQuery and Export-DbaDiagnosticQuery. The second one will output all of the results to csv files.

Adam Machanic’s sp_whoisactive is a common tool in SQL DBAs’ toolkits and can now be installed using Install-DbaWhoIsActive and run using Invoke-DbaWhoIsActive.

Awesome Contributor Commands

Then we try to fit in as many commands as we can from our fantastic contributors, showing how we can do awesome things with just one line of PowerShell code.

The awesome Find-DbaStoredProcedure, which you can read more about here, in tests searched 37,545 stored procedures on 9 instances in under 9 seconds for a particular string.

Find-DbaOrphanedFile, which enables you to identify the files left over from detaching databases.

Don’t know the SQL Admin password for an instance? Reset-SqlAdmin can help you.

It is normally somewhere around here that we finish and even though we have shown 32 commands (and a few more encapsulated in the Start-SqlMigration command) that is less than 15% of the total number of commands in the module!!!

Somehow, we always manage to fit all of that into 60 minutes and have great fun doing it. Thank you to everyone who has come and seen our sessions in Vienna, Utrecht, PASS PowerShell Virtual Group, Hanover, Antwerp and Dublin.

More

So you want to know more about dbatools? You can click the link and explore the website.

You can look at source code on GitHub

You can join us in the SQL Community Slack in the #dbatools channel

You can watch videos on YouTube

You can see a list of all of the presentations and get a lot of the slides and demos

If you want to see the slides and demos from our Dublin presentation you can find them here

Volunteers

Lastly, and most importantly of all: SQL Saturdays are run by volunteers, so massive thanks to Bob, Carmel, Ben and the rest of the team who ensured that SQL Saturday Dublin went so very smoothly.

 

 

Pester for Presentations – Ensuring it goes ok

Whilst I was at PSConfEU I presented a session on writing Pester tests instead of using checklists. You can see it here.

During the talk I showed the Pester test that I use to make sure that everything is ready for my presentation. A couple of people have asked me about this and wanted to know more, so I thought that I would blog about it.

Some have said that I might be being a little OCD about it 😉 I agree that it could seem like that but there is nothing worse than having things go wrong during your presentation. It makes your heart beat faster and removes the emphasis from the presentation that you give.

When it is things that you as a presenter could have foreseen, like a VM not being started, a database not being restored to the pre-demo state, or being logged in as the wrong user, then it is much worse.

I use Pester to ensure that my environment for my presentation is as I expect and, in fact, in Hanover when I ran through my Pester test for my NUC environment I found that one of my SQL Servers had decided to be in a different time zone and therefore the SQL Service would not authenticate and start. I was able to quickly remove the references to that server and save myself from a sea of red during my demos.

For those that don’t know, Pester is a PowerShell module for Test Driven Development.

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands

If you have PowerShell version 5 then you will have Pester already installed, although you should update it to the latest version. If not, you can get Pester from the PowerShell Gallery; follow the instructions on that page to install it. This is a good post to start learning about Pester.

What can you test? Everything. Well, specifically everything that you can write a PowerShell command to check. So when I am setting up for my presentation I check the following things. I add new things to my tests as I think of them or as I observe things that may break my presentations. Most recently that was ensuring that my Visual Studio Code session was running under the correct user. I did that like this

Describe "Presentation Test" {
    Context "VSCode" {
        It "Should be using the right username" {
            whoami | Should Be 'TheBeard\Rob'
       }
    }
}

01 - username.PNG

I think about the things that are important to me for my presentation. I want to ensure that I only have one VS Code window open to avoid that situation where I am clicking through windows looking for the correct window. I can do that using Get-Process.

It "Should have Code Insiders Open" {
(Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue)| Should Not BeNullOrEmpty
}
        It "Should have One VS Code Process" {
            (Get-Process 'Code - Insiders' -ErrorAction SilentlyContinue).Count | Should Be 1
        }

I use -ErrorAction SilentlyContinue so that I don’t get a sea of red when I run the tests. Next I want to check my PowerPoint is ready for my presentation

        It "Should have PowerPoint Open" {
            (Get-Process POWERPNT  -ErrorAction SilentlyContinue).Count | Should Not BeNullOrEmpty
        }
       It "Should have One PowerPoint Open" {
            (Get-Process POWERPNT  -ErrorAction SilentlyContinue).Count | Should Be 1
        }
        It "Should have the correct PowerPoint Presentation Open" {
            (Get-Process POWERPNT  -ErrorAction SilentlyContinue).MainWindowTitle| Should Be 'dbatools - SQL Server and PowerShell together - PowerPoint'
        }

Again I use Get-Process. I check if PowerPoint is open, if there is one PowerPoint open, and I use the MainWindowTitle property to check that it is the right PowerPoint presentation, after nearly starting a presentation for the SqlServer module with the dbatools slides!

I don’t want any distractions when I am presenting. I have the sort of friends who will notice if I get notifications for twitter popping up on my screen and repeatedly send tweets to make people laugh. (I admit, I’m one of those friends – I do this too!)

02 - Friends!!.PNG

Now, I cannot get a test for quiet hours working. You can apparently use a Registry key, which of course you can check with PowerShell, but I was unable to get it working. I haven’t looked at testing for Presentation Mode but I test that those programmes are shut down, again using Get-Process.

        It "Mail Should be closed" {
            (Get-Process HxMail -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Tweetium should be closed" {
            (Get-Process WWAHost -ErrorAction SilentlyContinue).Count | Should Be 0
        }
        It "Slack should be closed" {
            (Get-Process slack* -ErrorAction SilentlyContinue).Count | Should BE 0
        }

I am generally presenting with SQL Server so I need to make sure that SQL Server is running. I do this with Get-Service

Context "Local SQL" {
        It "DBEngine is running" {
            (Get-Service mssqlserver).Status | Should Be Running
        }
        It "SQL Server Agent is running" {
            (Get-Service sqlserveragent).Status | Should Be Running
        }
        It "DAVE DBEngine is running" {
            (Get-Service mssql*Dave).Status | Should Be Running
        }
        It "DAVE Agent is running" {
            (Get-Service sqlagent*dave).Status | Should Be Running
        }
    }

In this example I am testing that the SQL Service and the Agent service are running on both of my local instances.

I use a NUC running Hyper-V to enable me to show a number of SQL Servers running in a domain environment so I need to be able to test those too. I set the values of the servers I need into a variable and check that the VM is running and that they respond to ping

 Context "VM State" {
        $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
        $NUCVMs = Get-VM -ComputerName beardnuc | Where-Object {$_.Name -in $NUCServers}
            foreach($VM in $NUCVms)
                {
                $VMName = $VM.Name
                  It "$VMName Should be Running"{
                    $VM.State | Should Be 'Running'
                  }
			    }
    }
Context "THEBEARD_Domain" {
            $NUCServers = 'BeardDC1','BeardDC2','LinuxvNextCTP14','SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQL2008Ser12R2'
            foreach($VM in $NUCServers)
                {
                                 It "$VM Should respond to ping" {
				(Test-Connection -ComputerName $VM -Count 1 -Quiet -ErrorAction SilentlyContinue) | Should be $True
				}
                }
    }

I also need to check that the SQL Service and the Agent Service are running on each server.

  Context "SQL State" {
        $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*'  -and $_.State -eq 'Running'}).Name
        foreach($Server in $SQLServers)
        {
          $DBEngine = Get-service -ComputerName $Server -Name MSSQLSERVER
           It "$Server  DBEngine should be running" {
                $DBEngine.Status | Should Be 'Running'
            }
           It "$Server DBEngine Should be Auto Start" {
            $DBEngine.StartType | Should be 'Automatic'
           }
              $Agent= Get-service -ComputerName $Server -Name SQLSERVERAGENT
              It "$Server Agent should be running" {
                  $Agent.Status | Should Be 'Running'
           }
           It "$Server Agent Should be Auto Start" {
            $Agent.StartType | Should be 'Automatic'
           }
        }
        It "Linux SQL Server should be accepting connections" {
            $cred = Import-Clixml C:\temp\sa.xml
            {Connect-DbaSqlServer -SqlServer LinuxvnextCTP14 -Credential $cred -ConnectTimeout 60} | Should Not Throw
        }

    }
}

I check that the Linux SQL Server is available by storing the credential using Export-CliXml and then using that credential with Connect-DbaSqlServer from dbatools.

Using a NUC means I sometimes have fun with networking, so I have a couple of tests for that too, testing for the correct DNS servers and gateways.

    It "Should have DNS Servers for correct interface" {
        (Get-DnsClientServerAddress -InterfaceAlias 'Ethernet 3').Serveraddresses | Should Be @('10.0.0.1','10.0.0.2')
    }
    It "Should have correct gateway for alias"{
        (Get-NetIPConfiguration -InterfaceAlias 'Ethernet 3').Ipv4DefaultGateway.NextHop | Should Be '10.0.0.10'
    }

All of those are generic tests that have evolved over time and are run for every presentation, but when I have specific requirements for a single presentation I test for those too.

For example, later this week Cláudio Silva and I are presenting on dbatools at TUGAIT. We are showing the Test-DbaMaxMemory, Get-DbaMaxMemory and Set-DbaMaxMemory commands so we need to ensure that the Max Memory for some servers is (in)correctly set. I use Connect-DbaSqlServer to create an SMO Server object and test that

    It "Max Memory on SQl2012SerAG1 2 and 3 should be 2147483647" {
        (Connect-DbaSqlServer SQL2012Ser08AG1).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG2).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
        (Connect-DbaSqlServer SQL2012Ser08AG3).Configuration.MaxServerMemory.RunValue | Should Be 2147483647
    }

We are also showing the Test-DbaIdentityUsage command so a column needs to be pre-prepared in AdventureWorks2014 to be able to show the error

    It "ShiftID LastValue Should be 255" {
        $a = Test-DbaIdentityUsage -SqlInstance ROB-XPS -Databases AdventureWorks2014 -NoSystemDb
        $a.Where{$_.Column -eq 'ShiftID'}.LastValue | should Be 255
    }

To ensure that we have orphaned files available for the Find-DbaOrphanedFile command I use this

    It "has Orphaned Files ready"{
        (Find-DbaOrphanedFile -SqlServer SQL2016N2).Count | Should Be 30
    }

There are any number of things that you may want to test to ensure that, as best as possible, the demo gods are not going to come and bite you in the middle of your presentation.

  • Files or folders exist (or don’t exist)
  • Databases, Agent Jobs, Alerts
  • Operators, Logins
  • SSIS packages, SSRS reports
  • Power BI files
  • Azure connectivity
  • Azure components

The list is endless, just look at what you require for your presentation.
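As a flavour of what those might look like, a sketch with invented paths and names, in the same style as the tests above:

Context "Demo Assets" {
    It "Demo folder should exist" {
        Test-Path 'C:\Presentations\dbatools\demo' | Should Be $true
    }
    It "AdventureWorks2014 should be restored and ready" {
        (Connect-DbaSqlServer -SqlServer SQL2016N1).Databases['AdventureWorks2014'] | Should Not BeNullOrEmpty
    }
}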

Anything you can check with PowerShell you can test with Pester so build up your Pester presentation tests and reduce the reliance on the demo gods! I’ll still leave this here just to be safe!!

pray to the demo gods.jpg

Why VS Code Increases my Productivity

Last week I was showing a co-worker some PowerShell code and he asked what the editor was that I was using. Visual Studio Code, I said. Why do you use that? What does it do?

This is what I showed him

Runs on any Operating System

Code (as I shall refer to it) is a free, lightweight, open source editor which runs on all the main operating systems, so you have the same experience on Linux as on Windows and there is less to learn.

Extensions

You can add new languages, themes, debuggers and tools from the extensions gallery to reduce the number of programmes you have open and the need to switch between them.

You can add extensions using CTRL + SHIFT + X and searching in the bar

01 - Extensions

or by going to the Extensions gallery, searching for the extension and copying the installation command

02 - extensions gallery.PNG
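The command the gallery gives you is pasted into the Quick Open box (CTRL + P). For the PowerShell extension it looks something like this (the exact extension identifier here is illustrative):

ext install PowerShell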

Debugging

There is a rich debugging experience built in.

03 - debugging.PNG

You can learn about debugging from the official docs and Keith Hill wrote a blog post on Hey Scripting Guys about debugging PowerShell

Intellisense

An absolute must to make life simpler. Code has intellisense for PowerShell and T-SQL, which I use the most, but also for many more languages. Read more here.

Git integration

I love the Git integration; it makes it so easy to work with GitHub for me. I can see diffs, commit, and undo commits nice and simply. Just open the root folder of the repository and it’s there.

04 - git

This page will give you a good start on using git with Code

No distractions

With full screen mode (F11) or Zen mode (CTRL + K, Z) I can concentrate on coding and not worry about distractions.

Stay in one programme and do it all

I have a Markdown document, a PowerShell script and a T-SQL script all in one Git repository and I can work on all of them and version control them in one place. The screencast below also shows some of the new capabilities available in the Insiders version. I managed to leave the screen recording dialogue open as well, apologies, and the mistake was deliberate!

I used the GitLens and SQL beautify extensions as well as the dbatools module in that demo.

That’s why I am using Code more and more these days, hope it helps

Happy Automating!

 

 

Why Volunteer at SQLBits ?

WARNING – Contains Orange and light-hearted photos 😉

I have returned home from SQLBits 2017, The Disco Edition. I am exhausted; my body is pointing out to me, in a variety of ways, that this is the only week of the year that I spend so much time on my feet. Why would anyone do it?

Many months of work

First though, you need to know that the SQLBits conference is organised and run by volunteers. All of the committee spend many, many long hours of their own free time, for many months before and after the event, to ensure that the experience of attendees, sponsors, speakers and volunteers is trouble free. I think that they do an amazing and fantastic job and repeatedly pull off the best, most enjoyable conference that I have been to.

Thank you Simon, Chris, Darren, Allan, Alex, Jonathan, Annette

Thank you also to their families as well, who undoubtedly miss out on time with them whilst they are organising everything to do with the event, from finding venues, organising dates, speakers, marketing, website, sponsors, printing, audio visual, THE PARTY!! and all the other big and small things that it takes to make an event of that size occur.

Orange Shirted Wonderful Folk

There is another group of volunteers that you will have seen at SQLBits. For the last couple of years we have been the ones in the orange shirts.

Here is the Class of 2017

WP_20170406_17_27_05_Pro.jpg

I think this is a brilliant colour as it makes us easy to spot (although a couple of attendees who also had orange tops on did get stopped and asked for directions 🙂 )

What do they do?

These folk come in early and get everything set up. Sometimes we have to explain that the event isn’t ready for you yet

WP_20170406_07_31_20_Pro.jpg

We sort out the registration desk and greet every attendee, speaker and sponsor and assist them.

WP_20170405_08_11_25_Pro.jpg

We help the speakers get set up

WP_20170405_08_15_38_Pro.jpg

and ensure they have everything they need.

aaron bertrand.jpg

Aaron Bertrand (above) and John Martin from SentryOne said that it is the best experience for a speaker that they have had anywhere.

We direct and assist the attendees to be in the right place, sometimes with some flair!

WP_20170404_21_13_03_Pro.jpg

We ensure that any issues are resolved quickly and with as little distraction as possible. The room is too hot, too cold, too noisy or too quiet. The projector isn’t working or the speakers microphone has a buzz, there is too much light or too little. The water coolers are empty. The rubbish needs picking up. All these and many other minor complications are communicated and passed to the correct people to get resolved.

WP_20170404_17_51_33_Pro (2).jpg

Sometimes we have to resolve our own issues. We had folks who were called by their work and had to stop helping and go back to their day jobs for a few hours. We all understand what it is like for people working in technology and adapt and manage accordingly. In almost every photo I took, there is someone in an orange shirt to be seen.

WP_20170406_10_43_41_Pro.jpg

We answer numerous questions from the 1500 or so attendees (and the odd sheep) who came this year.

rchard.jpg

From timings and locations to taxi numbers or restaurants. Unfortunately I did not beat last year’s “Best question I have been asked at SQLBits”, which was

Excuse me, I have a Dalek in the van. What would you like me to do with it?

I was even asked questions on the way back to the hotel gone midnight on Saturday!!

We stay afterwards and help to get ready for the next day, putting out the new signs for the domes and the required paperwork.

So why do we do it?

I asked the guys and gals this question and in their own words, this is why they do it

Being a volunteer at SQLBits is not easy. I’m writing this three days after the event and my legs are still sore. Most days are 11 hours long and you will be standing for most of them. Very often the sessions are full, so you’ll be giving up your seat to one of the attendees. Lunches and breaks are shorter as you are either cleaning down the last session or getting ready for the next. When things go wrong, and they do, you’ll need to get them fixed as quickly as possible even if you have not had coffee yet.

You do get to attend sessions but you might not always get your first choice. This can be both a good and bad thing. Very often I have filled in on sessions that I normally wouldn’t attend as they are outside my direct area of work, only to find them the most interesting as I get to see how the other half lives.

So why do I keep coming back? Well it’s fun. We have a laugh, even on reception when it’s busy you get to joke with the attendees, speakers and other helpers. There is pizza, beer and jokes while bag packing. Odd espresso calls! Working alongside someone else is a great way to get to know them. I live outside the normal SQL community structures, my nearest user group is 150 miles away. So I don’t get to interact with other SQL family members as often as others. But even so, I know as soon as I walk into SQL Bits, there will be a chorus of, “Hey Conan, how have you been?” from people I haven’t seen in a year. There is also something about wearing a bright orange shirt that seems to attract interactions from the attendees.

All because of the of the experience that is being a volunteer.

Conan Farrell

WP_20170406_10_33_15_Pro (2).jpg

I owe a lot to the SQL Community. It was at SQLBits a few years ago that someone convinced me to start speaking. That encouragement and acceptance from the #SQLFamily put into motion a series of events that lead to me quitting the best job of my life last year in favour of an adventure to set up my own company. It’s been a wonderful journey and if it had not been for SQLBits (and SQL Relay, and SQL Saturdays, and others) I wouldn’t have taken it.

I have a debt to pay. And it’s wonderful to be able to contribute towards giving other people the same opportunities that the community has given me.

Also, for similar reasons, I recently joined the committee for SQL Relay. While I’ve been a sponsor and a speaker a lot before, I am fairly inexperienced at organising events. Helping out with the awesome SQLBits team has been a great learning curve. I hope to take what I have learned and apply it to my role for SQL Relay.

Finally, frankly, there are so many great people in the SQL Community it is just wonderful to be able to work with them during the day. (And share some beers with them in the evening!)

Alex Yates

alex2

I volunteer at SQLBits and in the wider Data Platform community for various reasons. The community is the glue in our industry and being part of that glue is both a privilege and an honour. Without volunteers these awesome events couldn’t function and we couldn’t share knowledge as freely. There also would not be the same opportunities to network with such great people and experts in the same field. I am proud to be part of the SQL Family and facilitate the learning that being a volunteer offers to professionals just like me. Stronger together.

Paul Andrew

WP_20170404_16_03_17_Pro (2).jpg

When I volunteered, I felt I’d like to give back to the SQL Community a little of what I’ve received from them. I wanted to become more engaged.
I didn’t even dream it would be like this; the terrific team I found here was so great that I can barely wait for the next one.

Miguel Oliveira

Volunteering gives you the experience of the effort and reward of running an event. You build an appreciation of generating success, creating networks of colleagues and friends, and being quick to react, and it helps you walk away with the feeling of “I contributed to this”. Everyone should volunteer, even if it’s just once.

Ben Watt

WP_20170405_09_31_24_Pro (2).jpg

This year was my 10th SQLBits (remembering we skipped unlucky 13) and the 7th I’ve been on the helping team for. I started helping when I learned my original conference companions were moving on to different technologies. Something I’ve never understood.

Historically I’ve worked in multiple companies as a lone DBA, something I’ve found many at SQLBits can relate to. Through user groups and SQLBits I’ve met lots of others in the same boat over the years. It can be a frustrating job defending servers, implementing best practice and writing SQL when all your application team want to do is add columns and ship the next release!

Yes, there are good networking opportunities and the parties are great but at the core is great quality training. I’ve had a great time playing a small part in helping deliver the experience over the years. Occasionally I have to shout about feedback forms or seating arrangements but on the whole folk are fine with it. If I’m honest they are long days. A typical day involves a start and end meeting in addition to the sessions you are monitoring. Add to that hotel commutes, catching up with the friends you made last year, meals, drinks etc and it’s 1am. Oh, and tomorrow’s meeting is 7.30, eek…

It’s been a great journey watching the progression of events. Each one adds to the last, they run a lot slicker now. The sponsors are first class and always embrace the theme with their stalls and giveaways. I do wish more folk would enter the prize draws and stop me winning though, it’s getting embarrassing now 🙂

Richard Doering 

rich2

The best Data Platform conference in Europe! I couldn’t miss the opportunity to be much more than just an attendee.

Hubert Kobierzewski

I volunteer for SQLBits to get experience from a multi-day event that has a lot of attendees. Lots of takeaways to make local community events run smoother for the delegates.
That’s my main reason.

Jens Vestergard

There is no way I could not use this next photo to celebrate this year’s Most Valuable Helper.

WP_20170407_20_45_12_Pro (2).jpg

Thank you Shaun Atkinson, you were amazing, and thank you for looking after my former intern James as a first time volunteer.

How can I do it next year?

We always welcome new people to our team. If you would like to volunteer your time to help at the next SQLBits please send an email to helpers@sqlbits.com with a subject of “I would like to volunteer” or something similar. Nearer the time of next year’s SQLBits (no, I don’t know when or where it is; I will know when you do) the awesome, amazing, fantastic, brilliant organising amazeballs also known as Annette will get in touch with you with further details.

You can expect to have a lot of fun, make new friends and help make the best data platform conference the best data platform conference.

Oh, and sometimes there’s free pizza 🙂

WP_20170404_18_31_58_Pro

Another Thank You

Some lovely people (I am really sorry but I didn’t get everyone’s name) brought sweets, biscuits, cakes and other goodies for the crew. Thank you, they were very much appreciated.

Pictures because smiles 🙂

Not all of the volunteers wanted to give feedback publicly but I had some cracking photos so I thought I would share them as well. Enjoy
