I have been asked a couple of times recently what my Visual Studio Code extensions are at the moment, so I thought I would write a quick post and also look at workspaces and how you can enable and disable extensions within them.
Listing Extensions
From the command line you can list your extensions using the code CLI.
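Assuming the code command is on your PATH (the installer normally adds it), the flag is:

code --list-extensions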
You can also see them in the Extensions view on the left of a default Visual Studio Code install and open them with CTRL + SHIFT + X (unless, like me, you have Snagit installed and it has taken that shortcut).
Installing Extensions
You can install extensions by opening the Extensions view in Visual Studio Code and searching for the extension. The list I have below has the precise names for each extension which you can use to search
You can also install extensions from the command line with the same CLI.
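For example, using one of the extensions from the list below:

code --install-extension gerane.theme-blackboard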
I am going to list these in alphabetical order by display name for ease (my ease that is!)
Because Chrissy LeMaire and I are writing dbatools in a Month of Lunches using AsciiDoc, it makes sense to have an extension that provides AsciiDoc previewing and syntax highlighting; you can find it here
For interacting with Azure I use the Azure Account Extension – ms-vscode.azure-account
I use Azure CLI so I make use of the functionality of the Azure CLI Tools extension ms-vscode.azurecli
For interacting with Azure Repos I use the ms-vsts.team extension
When creating ARM templates, this extension is very useful msazurermtools.azurerm-vscode-tools
I have a few theme extensions, this one is for fun in demos 😉 beardedbear.beardedtheme
The blackboard theme is my default one gerane.theme-blackboard
Chasing closing brackets is much easier with Bracket Pair Colorizer; I use the beta version coenraads.bracket-pair-colorizer-2
I am rubbish at spelling and typing so I use this to help point out the issues! streetsidesoftware.code-spell-checker
Using the Docker extension adds another view to Visual Studio Code to ease working with containers ms-azuretools.vscode-docker
As an open-source project maintainer, it is good to be able to work with GitHub pull requests without leaving Visual Studio Code github.vscode-pull-request-github (Preview)
GitLens is absolutely invaluable when working with source control. It has so many features. This is an absolute must eamodio.gitlens
Working with Kubernetes? This extension adds another view for interacting with your cluster ms-kubernetes-tools.vscode-kubernetes-tools
Visual Studio Live Share enables you to collaborate in real-time in Visual Studio Code with your colleagues or friends. I blogged about this here ms-vsliveshare.vsliveshare
I love writing markdown and this linter assists me to ensure that my markdown is correct davidanson.vscode-markdownlint
The Material Icon Theme ensures that there are pretty icons in my editor! pkief.material-icon-theme
I have both the PowerShell extension ms-vscode.powershell and the PowerShell preview extension ms-vscode.powershell-preview installed but only one can be enabled at a time
This suite of extensions enables easy remote development so that you can develop your PowerShell scripts, for example, inside an Ubuntu container running PowerShell 7 or inside Windows Subsystem for Linux ms-vscode-remote.vscode-remote-extensionpack (Preview)
Writing for cross-platform means looking out for line endings and this extension will display them and any whitespace in your editor medo64.render-crlf
An absolutely essential extension which enables me to back up all of my Visual Studio Code settings, shortcuts and extensions into a GitHub gist and keep all of my machines feeling the same. shan.code-settings-sync
For working with SQL Server within Visual Studio Code and having a view for my instances as well as a linter and intellisense I use ms-mssql.mssql
YAML files and spaces! I no longer get so confused with this extension to help me 🙂 redhat.vscode-yaml
Workspaces
Now that is a lot of extensions and I don't need all of them every time. I use workspaces to help with this. I will create a workspace file for the project I am working on.
I open or create the folders I will be working on and then click File and Save Workspace As and save the file in the root of the folder.
Now, the next time I want to open the workspace, I can open the workspace file or if I open the folder Visual Studio Code will helpfully prompt me
Now I can have all of my settings retained for that workspace
For this folder, I am ensuring that the PowerShell extension uses the PSScriptAnalyzer Settings file that I have created so that it will show if the code is compatible with the versions of PowerShell I have chosen. I can define settings for a workspace in the settings file, which you can open using CTRL and ,
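The PSScriptAnalyzer settings file is itself just a small PowerShell data file. A rough sketch (the target versions here are examples rather than the exact ones I use) looks like this, and the PowerShell extension is pointed at it with (if I have the name right) the powershell.scriptAnalysis.settingsPath setting:

@{
    Rules = @{
        PSUseCompatibleSyntax = @{
            Enable         = $true
            TargetVersions = @('4.0', '5.1', '6.2')
        }
    }
}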
But I can also enable or disable extensions for a workspace
So every time I open this workspace I am only loading the extensions I want
You can download the latest Insiders edition from the link above; it can be installed alongside the stable release.
To access many of the commands available use F1 to open the command palette (like many of my tips this also works in Visual Studio Code). You can then start typing to get the command that you want.
You can then hit enter with the command that you want highlighted, use the mouse or use the shortcut which is displayed to the right.
In a new notebook, you can click the drop down next to kernel and now you can see that PowerShell is available
When you choose the PowerShell kernel, you will get a prompt asking you to configure the Python installation
If you have Python already installed you can browse to the location where it is installed, or you can install Python. In the bottom pane you will be able to see the progress of the installation.
When it has completed, you will see
You may also get a prompt asking if you would like to upgrade some packages
Again this will be displayed in the tasks pane
Adding PowerShell
To add PowerShell Code to the notebook click the Code button at the top of the file
or the one you can find by highlighting above or below a block
I did not have intellisense, but you can easily write your code in Azure Data Studio or Visual Studio Code and paste it in the block.
This was because he had the PowerShell extension installed and I did not (I know!!). If you find you don't have intellisense then install the PowerShell extension!
Clicking the play button (which is only visible when you hover the mouse over it) will run the code
You can clear the results from every code block using the clear results button at the top
Otherwise, you can save the results with the Notebook by saving it. This is the part that is missing from running PowerShell in the Markdown blocks in a SQL Notebook as I described here
I am looking forward to how this develops. You can find my sample PowerShell notebook (with the code results) here
I have done a lot of writing in the last few months but you see no blog posts! My wonderful friend Chrissy and I are writing “dbatools in a Month of Lunches” to be published by Manning. That has taken up a lot of my writing mojo. We have hit a little break whilst we have some reviews done ready for the MEAP (For everyone who asks, the answer is the unfulfilling ‘soon’) so it’s time for a blog post!
SQL Notebooks are cool
I have had a lot of fun with SQL Notebooks recently. I have presented a session about them at a couple of events this month, DataGrillen and SQL Saturday Cork. Here is a little snippet
(Embedded tweet from @sqldbawithbeard, 21 May 2019)
Yes, you can run PowerShell in a SQL Notebook in Azure Data Studio just by clicking a link in the markdown cell. This opens up a lot of excellent possibilities.
I have had several discussions about how SQL Notebooks can be used by SQL DBAs within their normal everyday roles. (Mainly because I don’t really understand what the sorcerers of data science do with notebooks!). I have helped clients to look at some of their processes and use SQL Notebooks to help with them. Creating Disaster Recovery or Change Run-books or Incident Response Templates or using them for product demonstrations. Of course, I needed to use PowerShell in that 🙂
I have really enjoyed working out how to run PowerShell in the markdown in a SQL Notebook in Azure Data Studio and I think Anthony the kubernetes magician did too!
First, before I go any further, I must say this. I was at the European PowerShell Conference when I was working this out and creating my sessions and I said the words
“Cool, I can click a link and run PowerShell, this is neat”
A Beardy fellow in Hannover
This stopped some red team friends of mine in their tracks and they said “Show me”. One of them was rubbing their hands with glee! You can imagine the sort of wicked, devious things that they were immediately considering doing.
Yes, it's funny but it also carries a serious warning. Without understanding what it is doing, please don't enable PowerShell to be run in a SQL Notebook that someone sent you in an email or that you find on GitHub. In the same way as you don't open the Word document attachment which will get a thousand million trillion pounddollars into your bank account or run code you copy from the internet on production without understanding what it does, this could be a very dangerous thing to do.
With that warning out of the way, there are loads of really useful and fantastic use cases for this. SQL Notebooks make great run-books or incident response recorders and PowerShell is an obvious tool for this. (If only we could save the PowerShell output in a SQL Notebook, this would be even better)
How on earth did you work this out?
Someone asked me how I worked it out. I didn't! It began with Vicky Harp, PM lead for the SQL Tools team at Microsoft.
I then went and looked at Kevin Cunnane's notebook. Kevin is a member of the tools team working on Azure Data Studio. With SQL Notebooks, you can double click the markdown cell and see the code that is behind it. To understand how it is working, let's deviate a little.
Keyboard Shortcuts
If you click the cog at the bottom left of Azure Data Studio and choose Keyboard Shortcuts
you can make Azure Data Studio (and Visual Studio Code) work exactly how you want it to. Typing in the top box will find a command and you can then set the shortcuts that you want to use to save yourself time.
This also enables you to see the command that is called when you use a keyboard shortcut. For example, you can see that for the focus terminal command it says workbench.action.terminal.focus.
It turns out that you can call this as a link in a Markdown document using HTML with <a href=""> and adding command: prior to the command text. When the link is clicked the command will run. Cool 🙂
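For example, a link like this in a trusted notebook will move focus to the terminal when it is clicked:

<a href="command:workbench.action.terminal.focus">Focus the terminal</a>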
For this to be able to work (you read the warning above?) you need to set the Notebook to be trusted by clicking this button.
This will allow any command to be run. Of course, people with beards will helpfully advise when this is required for a SQL Notebook. (Safe to say people attempting nefarious actions will try the same with your users)
Now that we know how to run an Azure Data Studio command using a link in a markdown cell the next step is to run a PowerShell command. I headed to the Visual Studio Code documentation and found
Send text from a keybinding: The workbench.action.terminal.sendSequence command can be used to send a specific sequence of text to the terminal, including escape sequences
That’s the command we need, however, we still need to craft the command so that it will work as a link. It needs to be converted into a URL.
I started by using this website https://www.url-encode-decode.com/ to do this. This is also how you can check the code in other people's notebooks: use the decode capability.
So I can just put that code into the href link and bingo!
If only it was that easy!!
Some Replacing is required
The + needs to be replaced with a space or %20
You also need to double the \ and replace the %3A with a :. The " needs to be replaced with \u022, the ' with \u027, and the curly braces won't work unless you remove the %0D%0A. Got all that? Good!
Once you have written your PowerShell, encoded it, performed the replacements, you add \u000D at the end of the code to pass an enter to run the code and then place all of that into a link like this
<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 PLACE THE ENCODED CODE HERE %22%7D">Link Text</a>
This means that if you want to add the PowerShell code to set a location and then list the files and folders in that location to a Markdown cell using PowerShell like this
Set-Location C:\dbachecks
Get-ChildItem
You would end up with a link like this
<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 Set-Location C:%5C%5Cdbachecks \u000D Get-ChildItem \u000D %22%7D">Set Location and list files</a>
This will take a PowerShell command and turn it into a link that will work in an Azure Data Studio markdown cell. It's not magic, it's PowerShell. There is a -ToClipboard parameter which will copy the code to the clipboard ready for you to paste into the cell (on Windows machines only)
Giants
There are many uses for this but here’s one I think is cool.
The link below will go to a notebook, which will show you how the giants upon whose shoulders I stand have enabled me to create a SQL Notebook with a link which will run some PowerShell to create a SQL Notebook which will have all of the Diagnostic Queries in it.
You could possibly use something like it for your incident response SQL Notebook.
It’s also cool that GitHub renders the notebook in a browser (You can’t run PowerShell or T-SQL from there though, you need Azure Data Studio!)
Both of these run a random query in a single thread so I thought I would use PoshRSJob by Boe Prox b | t to run multiple queries at the same time 🙂
To install PoshRSJob, like with any PowerShell module, you run
Install-Module -Name PoshRSJob
I downloaded the AdventureWorksBOLWorkload zip from Pieter's blog post and extracted it to my C:\temp folder. I created an Invoke-RandomWorkload function which you can get from my functions repository on GitHub. The guts of the function are
1..$NumberOfJobs | Start-RSJob -Name "WorkLoad" -Throttle $Throttle -ScriptBlock {
    # Get the queries
    $Queries = Get-Content -Delimiter $Using:Delimiter -Path $Using:PathToScript
    # Pick a random query from the input object
    $Query = Get-Random -InputObject $Queries
    # Run the query
    Invoke-SqlCmd -ServerInstance $Using:SqlInstance -Credential $Using:SqlCredential -Database $Using:Database -Query $Query
    # Choose a random number of milliseconds to wait
    $a = Get-Random -Maximum 2000 -Minimum 100
    Start-Sleep -Milliseconds $a
}
which will create $NumberOfJobs jobs and then run $Throttle of them at a time in the background until they have all completed. Each job will run a random query from the query file using Invoke-SqlCmd. Why did I use Invoke-SqlCmd and not Invoke-DbaQuery from dbatools? dbatools creates runspaces in the background to help with logging, and creating runspaces inside background jobs causes errors
You can get the function here. The full code is below
# With thanks to Jonathan Kehayias and Pieter Vanhove
<#
.SYNOPSIS
Runs a random workload against a database using a sql file
.DESCRIPTION
Runs a random workload against a database using PoshRSJobs to create parallel jobs to run random
queries from a T-SQL file by default it uses the AdventureWorksBOLWorkload.sql from Pieter Vanhove
.PARAMETER SqlInstance
The SQL instance to run the queries against
.PARAMETER SqlCredential
The SQL Credential for the Instance if required
.PARAMETER Database
The name of the database to run the queries against
.PARAMETER NumberOfJobs
The number of jobs to create - default 10
.PARAMETER Delay
The delay in seconds for the output for the running jobs - default 10
.PARAMETER Throttle
The number of parallel jobs to run at a time - default 5
.PARAMETER PathToScript
The path to the T-SQL script holding the queries - default 'C:\temp\AdventureWorksBOLWorkload\AdventureWorksBOLWorkload.sql'
.PARAMETER Delimiter
The delimiter in the T-SQL Script between the queries - default ------
.PARAMETER ShowOutput
Shows the output from the jobs
.EXAMPLE
Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014 -NumberOfJobs 100 -Delay 10 -Throttle 10
Runs 100 queries with a maximum of 10 at a time against the AdventureWorks2014 database on $SQL2019CTP23
.EXAMPLE
$x = 10
while($X -gt 0){
Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014 -NumberOfJobs 1000 -Delay 10 -Throttle 10
$x--
}
Runs 1000 queries with a maximum of 10 at a time against the AdventureWorks2014 database on $SQL2019CTP23 10 times in a loop
.NOTES
With thanks to Pieter Vanhove
https://blogs.technet.microsoft.com/msftpietervanhove/2016/01/08/generate-workload-on-your-azure-sql-database/
and
Jonathan Kehayias
…. communities will come together once again in the sixth great Global Azure Bootcamp event! Each user group will organize their own one day deep dive class on Azure the way they see fit and how it works for their members. The result is that thousands of people get to learn about Azure and join together online under the social hashtag #GlobalAzure!
Saturday Is Free Learning
I am a part of the team organising the event in Exeter. Now there is a little story here. We had chosen this date by chance to hold an event we call Data In Devon giving people in the South West (of UK) the chance to access a whole day of high quality data and technical sessions for free on a Saturday.
When the Global Azure Bootcamp was announced, we had a conversation with the organisers and they agreed that we could add Exeter as a venue as we had already decided to have a whole track dedicated to Azure. You can find our schedule here https://sqlsouthwest.co.uk/data-in-devon-saturday-schedule/ and you can register to attend via this form
Now, we have some costs obviously; not a lot, but venues are not free and neither is food 😉. We have a couple of sponsors (feel free to contact me if your company is interested in sponsoring the event) but we also have some paid training days on Friday 25th April.
Friday Is Training Day
It’s a great opportunity to get cheap high-quality training from some of the best in their areas of expertise. There are still some tickets for £175 and the price will rise only to £200. I think that £200 is fantastic value to be able to spend a day learning from
PowerShell is cross-platform, it works exactly the same on Windows, on Linux and Mac. It is awesome for automation and amazing for administration.
We will cover
the basics about PowerShell, PowerShell security
how to open PowerShell, how to install PowerShell
4 vital commands to enable you to be able to help yourself
The PowerShell Gallery and how to find, install and use additional modules
Reading the language
Working with output
Why Red text is a good thing and how to learn from the errors
We will even delve into scripting with PowerShell and how to validate your environment
There will also be the opportunity to learn about any areas of PowerShell, Automation, CI/CD that you have questions about. This is a beginner level session in which I will teach you to be comfortable with PowerShell and confident in being able to use it in the future
Attendees wanting to follow along should bring a laptop.
It reminded me that I do something very similar to test dbachecks code changes. I thought this might make a good blog post. I will talk through how I do this locally as I merge a PR from another great friend Cláudio Silva who has added agent job history checks.
GitHub PR VS Code Extension
I use the GitHub Pull Requests extension for VS Code to work with pull requests for dbachecks. This enables me to see all of the information about the pull request, merge it, review it and comment on it, all from VS Code
I can also see which files have been changed and which changes have been made
Once I am ready to test the pull request I perform a checkout using the extension
This will update all of the files in my local repository with all of the changes in this pull request
You can see at the bottom left that the branch changes from development to the name of the PR.
Running The Unit Tests
The first thing that I do is to run the Unit Tests for the module. These will test that the code is following all of the guidelines that we require and that the tests are formatted in the correct way for the Power Bi to parse. I have blogged about this here and here and we use these Pester tests in our CI process in Azure DevOps, which I described here.
I navigate to the root of the dbachecks repository on my local machine and run
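The exact invocation is not shown here; as a sketch, assuming the repository's Pester tests sit in a tests folder, it is along the lines of:

Invoke-Pester .\tests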
Thank you Cláudio, the code has passed the tests 😉
Running Some Integration Tests
The difference between Unit tests and Integration tests in a nutshell is that the Unit tests are testing that the code is doing what is expected without any other external influences whilst the Integration tests are checking that the code is doing what is expected when running on an actual environment. In this scenario we know that the code is doing what is expected but we want to check what it does when it runs against a SQL Server and even when it runs against multiple SQL Servers of different versions.
Multiple Versions of SQL Server
As I have described before my friend and former colleague Andrew Pruski b | t has many resources for running SQL in containers. This means that I can quickly and easily create fresh uncontaminated instances of SQL 2012, 2014, 2016 and 2017 really quickly.
I can create 4 instances of different versions of SQL in (a tad over) 1 minute. How about you?
Imagine how long it would take to run the installers for 4 versions of SQL and the pain you would have trying to uninstall them and make sure everything is ‘clean’. Even images that have been sysprep’d won’t be done in 1 minute.
Docker Compose Up?
So what is this magic command that has enabled me to do this? docker compose uses a YAML file to define multi-container applications. This means that with a file called docker-compose.yml like this
and 4 SQL containers are available to you. You can interact with them via SSMS if you wish, using localhost,PORTNUMBER. The port numbers in the above file are 15586, 15587, 15588 and 15589
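With the file in place, one command run from the folder that holds it brings all of the containers up in the background:

docker-compose up -d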
Now it must be noted, as I describe here, that I first pulled the images to my laptop. The first time you run docker compose it will take significantly longer if you haven't pulled the images already (pulling the images will take quite a while depending on your broadband speed)
Credential
The next thing is to save a credential to make it easier to automate. I use the method described by my PowerShell friend Jaap Brasser here. I run this code
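The code is not reproduced above; a sketch of that approach (the path is a placeholder, and Export-Clixml encrypts the credential for the current user on the current machine only):

$cred = Get-Credential sqladmin
$cred | Export-Clixml -Path C:\temp\sqlcred.xml
# and in later sessions
$cred = Import-Clixml -Path C:\temp\sqlcred.xml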
Now I can start to run my Integration tests. First reset the dbachecks configuration and set some configuration values
# run the checks against these instances
$null = Set-DbcConfig -Name app.sqlinstance -Value $containers
# We are using SQL authentication
$null = Set-DbcConfig -Name policy.connection.authscheme -Value SQL
# sometimes it's a bit slower than the default value
$null = Set-DbcConfig -Name policy.network.latencymaxms -Value 100 # because the containers run a bit slow!
Then I will run the dbachecks connectivity checks and save the results to a variable without showing any output
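That line is not shown above; a sketch, assuming the connectivity checks carry the InstanceConnection tag and using the credential saved earlier:

$ConnectivityTests = Invoke-DbcCheck -SqlCredential $cred -Check InstanceConnection -Show None -PassThru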
I can then use Pester to check that dbachecks has worked as expected by testing if the failedcount property returned is 0.
Describe "Testing the checks are running as expected" -Tag Integration {
Context "Connectivity Checks" {
It "All Tests should pass" {
$ConnectivityTests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default settings"
}
}
}
What is the Unit Test for this PR?
Next I think about what we need to be testing for this PR. The Unit tests will help us.
Choose some Integration Tests
This check is checking the Agent job history settings and the unit tests are
It "Passes Check Correctly with Maximum History Rows disabled (-1)"
It "Fails Check Correctly with Maximum History Rows disabled (-1) but configured value is 1000"
It "Passes Check Correctly with Maximum History Rows being 10000"
It "Fails Check Correctly with Maximum History Rows being less than 10000"
It "Passes Check Correctly with Maximum History Rows per job being 100"
It "Fails Check Correctly with Maximum History Rows per job being less than 100"
So we will check the same things on real, actual SQL Servers. First, though, we need to start the SQL Server Agent as it is not started by default. We can do this as follows
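The way I started it is not shown here; one hedged option, assuming Windows-based containers (CONTAINERNAME is a placeholder for each container's name), is to start the service inside each container:

docker exec CONTAINERNAME net start SQLSERVERAGENT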
Unfortunately, the agent service won't start in the SQL 2014 container so I can't run agent integration tests for that container, but it's better than no integration tests.
This is What We Will Test
So we want to test if the check will pass with default settings. In general, dbachecks will pass for default instance, agent or database settings values by default.
We also want the check to fail if the configured value for dbachecks is set to default but the value has been set on the instance.
We want the check to pass if the configured value for the dbachecks configuration is set and the instance (agent, database) setting matches it.
If You Are Doing Something More Than Once ……
Let's automate that. We are going to be repeatedly running those three tests for each setting that we are running integration tests for. I have created 3 functions for this, again checking that FailedCount or PassedCount is 0 depending on the test.
function Invoke-DefaultCheck {
    It "All Checks should pass with default for $Check" {
        $Tests = Get-Variable "$($Check)default" -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass with default setting (Yes we may set some values before but you get my drift)"
    }
}
function Invoke-ConfigCheck {
    It "All Checks should fail when config changed for $Check" {
        $Tests = Get-Variable "$($Check)configchanged" -ValueOnly
        $Tests.PassedCount | Should -Be 0 -Because "We expect all of the checks to run and fail when we have changed the config values"
    }
}
function Invoke-ValueCheck {
    It "All Checks should pass when setting changed for $Check" {
        $Tests = Get-Variable "$($Check)valuechanged" -ValueOnly
        $Tests.FailedCount | Should -Be 0 -Because "We expect all of the checks to run and pass when we have changed the settings to match the config values"
    }
}
Now I can use those functions inside a loop in my Integration Pester Test
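A sketch of what that loop can look like (the $Checks collection and the exact Describe wording are assumptions; the variable naming convention matches the functions above, so $Checks would hold names like 'errorlogscount'):

foreach ($Check in $Checks) {
    Describe "Integration testing $Check" -Tag Integration {
        Context "$Check Checks" {
            Invoke-DefaultCheck
            Invoke-ConfigCheck
            Invoke-ValueCheck
        }
    }
}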
And then we will check that all of the checks are passing and failing as expected
Invoke-Pester .\DockerTests.ps1
Integration Test For Error Log Counts
There is another integration test there for the error logs count. This works in the same way. Here is the code
#region error Log Count - PR 583
# default test
$errorlogscountdefault = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None -PassThru
# set a value and then it will fail
$null = Set-DbcConfig -Name policy.errorlog.logcount -Value 10
$errorlogscountconfigchanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None -PassThru
# set the value and then it will pass
$null = Set-DbaErrorLogConfig -SqlInstance $containers -SqlCredential $cred -LogCount 10
$errorlogscountvaluechanged = Invoke-DbcCheck -SqlCredential $cred -Check ErrorLogCount -Show None -PassThru
#endregion
Merge the Changes
So with all the tests passing I can merge the PR into the development branch and Azure DevOps will start a build. Ultimately, I would like to add the integration tests to the build as well, following André's blog post, but for now I used the GitHub Pull Request extension to merge the pull request into development, which started a build, and then merged that into master, which signed the code and deployed it to the PowerShell Gallery as you can see here, and the result is
To find a command you can use the dbatools command Find-DbaCommand
For commands for services, run
Find-DbaCommand Service
There is a whole bundle of commands returned
This is how you can find any dbatools command. There is also a -Tag parameter on Find-DbaCommand.
Find-DbaCommand -Tag Service
This returns
How to use any PowerShell command
Always always start with Get-Help
Get-Help Get-DbaService -Detailed
This will show you all the information about the command including examples 🙂
All of these commands below require that the account running the PowerShell is a Local Admin on the host.
One Host Many Hosts
Now I have used just one host for all of the examples on this page. Do not be fooled; wherever I have $ComputerName you can set it to as many hosts as you like
$ComputerName = 'SQL0','SQL1'
You can even get those names from a database, an Excel sheet or a CMS.
Getting the Services
So to get the services on a machine run
$ComputerName = 'Name of Computer'
Get-DbaService -ComputerName $ComputerName
I will use the alias ft (Format-Table) in some of the examples; that is fine for the command line, but use the full command name in any code that you write that other people will use
You have an object returned so you can output to anything you want: CSV, JSON, text file, email, Azure storage, database; the world is your oyster.
Getting the Services for one instance
The Get-DbaService command has a number of parameters. There is an InstanceName parameter enabling you to get only the services for one instance. If we just want the default instance services
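For example, the default instance is named MSSQLSERVER, so:

Get-DbaService -ComputerName $ComputerName -InstanceName MSSQLSERVER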
You can also use the -Type parameter to get only services of a particular type. You can get one of the following: "Agent", "Browser", "Engine", "FullText", "SSAS", "SSIS", "SSRS", "PolyBase"
You can use Start-DbaService and Stop-DbaService to start and stop the services. They each have ComputerName, InstanceName and Type parameters like Get-DbaService.
The full text service was started with the engine service which is why it gave a warning. You can see this if you have all of the services stopped and just want to start the engine services with the type parameter.
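Something like this (using the same $ComputerName variable as above):

Start-DbaService -ComputerName $ComputerName -Type Engine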
We can stop them by type as well, although this will show an extra requirement. Let's start our MIRROR instance services again and then try to stop just the engine type.
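Using the same MIRROR instance as before:

Start-DbaService -ComputerName $ComputerName -InstanceName MIRROR
Stop-DbaService -ComputerName $ComputerName -InstanceName MIRROR -Type Engine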
You will get a warning due to the dependent services
WARNING: [10:31:02][Update-ServiceStatus] (MSSQL$MIRROR on SQL0) The attempt to stop the service returned the following error: The service cannot be stopped because other services that are running are dependent on it.
WARNING: [10:31:02][Update-ServiceStatus] (MSSQL$MIRROR on SQL0) Run the command with ‘-Force’ switch to force the restart of a dependent SQL Agent
So all you have to do is use the force Luke (or whatever your name is!)
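The same command with the Force switch does the job:

Stop-DbaService -ComputerName $ComputerName -InstanceName MIRROR -Type Engine -Force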
You can also stop the services for an entire host, again you will need the Force parameter.
Start-DbaService -ComputerName $ComputerName | ft
Stop-DbaService -ComputerName $ComputerName -Force | ft
Restarting Services
It will come as no surprise by now to learn that Restart-DbaService follows the same pattern. It also has ComputerName, InstanceName and Type parameters like Get-DbaService, Start-DbaService and Stop-DbaService (consistency is great; it's one of the things that is being worked on towards 1.0 as you can see in the Bill of Health)
Again you will need -Force for dependent services; you can restart all of the services on a host with
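(using the same $ComputerName variable as above)

Restart-DbaService -ComputerName $ComputerName -Force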
Now none of that answers @g-kannan’s question. Restarting only services with a certain service account.
With PowerShell you can pipe commands together so that the results of the first command are piped into the second. So we can get all of the engine services on a host for an instance with Get-DbaService and start them with Start-DbaService like this
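A sketch using the MIRROR instance from earlier:

Get-DbaService -ComputerName $ComputerName -InstanceName MIRROR -Type Engine | Start-DbaService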
You can do the same thing with syntax that may make more sense to you if you are used to T-SQL as follows
(Get-DbaService -ComputerName $ComputerName -Type Engine) | Where State -eq 'Stopped'
and to then start only those services you could do
(Get-DbaService -ComputerName $ComputerName -Type Engine) | Where State -eq 'Stopped' | Start-DbaService
(note – you would just use Start-DbaService in this case as it won't start services that are already started!)
# Stop just one of the engine services
Stop-DbaService -ComputerName $ComputerName -InstanceName MIRROR -Type Engine
# Get the engine services
Get-DbaService -ComputerName $ComputerName -Type Engine
# This will only start the one engine service that is stopped
Start-DbaService -ComputerName $ComputerName -Type Engine
Come On Rob! Answer the question!
So now that you know a lot more about these commands, you can restart only the services using a particular service account by using Get-DbaService to get the services
Get-DbaService -ComputerName $ComputerName -Type Engine | Where StartName -eq 'thebeard\sqlsvc'
and then once you know that you have the right ‘query’ you can pipe that to Restart-DbaService (Like making sure your SELECT query returns the correct rows for your WHERE clause before running the DELETE or UPDATE)
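Putting it together (the service account is the one from the example above, and Force takes care of the dependent services):

Get-DbaService -ComputerName $ComputerName -Type Engine | Where StartName -eq 'thebeard\sqlsvc' | Restart-DbaService -Force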
Michael Karpenko wrote a comment pointing out that I had not supported named instances, which was correct as it had not been written for that. Thank you Michael 🙂 I have updated the code to deal with named instances.
Confusing results
I also realised as we started testing the code that if you had run the code once and then ran it again against a different availability group, the tool does not clear out the data folder that it uses, so you can get confusing results.
In the image below I had looked at the default instance and then a MIRROR named instance. As you can see the results json on the left shows the default instance SQLClusterAG while the one on the right shows both the SQLClusterAG and the MirrrAG instance results.
This is not so useful if you don’t notice this at first with the expanded json!! Now you may in this situation want to see the combined results from all of the availability groups on one cluster. You could gather all of the data from each instance and then add it to the data folder easily enough.
By cleaning out the data folder before running the utility the results are as expected.
Archive the data for historical analysis
One of the production DBAs pointed out that having gathered the information, it would be useful to hold it for better analysis of repeated issues. I have added an archiving step so that when the tool runs, if there is already data in the data gathering folder, it will copy that to an archive folder and name it with the date and time that the cluster log was created, as this is a good estimation of when the analysis was performed. If an archive folder location is not provided it will create an archive folder in the data folder. This is not an ideal solution though, as the utility will copy all of the files and folders from there to its own location, so it is better to define an archive folder in the parameters.
Get-EventLog is sloooooooooooow
I was running the tool and noticed it sat running the system event log task for a long, long time. I ran some tests using a variation of the dbatools prompt.
This will show in the prompt how long it took to run the previous statement.
In the image above (which you can click to get a larger version as with all images on this blog) you can see that it took 18ms to set the date variable, FOUR MINUTES and FORTY THREE seconds to get the system log in the last 2 days using Get-EventLog and 29.1 seconds using Get-WinEvent and a FilterHashtable.
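For reference, a sketch of the two approaches (the log name and the two-day window are as described above):

$after = (Get-Date).AddDays(-2)
# the slow way
Get-EventLog -LogName System -After $after
# the much quicker way
Get-WinEvent -FilterHashtable @{ LogName = 'System'; StartTime = $after }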
Getting the function
This function requires PowerShell version 5 and the dbatools module.
30/11/2018 – Function has been updated to deal with named instances.
Last week the Tiger Team released their Availability Group Failover Detection Utility which will provide root cause analysis on Cluster Logs, SQL Error Logs, and the Availability groups extended events logs. There is a blog post here and the tool can be downloaded from the Tiger Team GitHub Repository
A Bit of Faffing*
It states on the readme for the Tiger Team GitHub Repository:
Repository for Tiger team for “as-is” solutions and tools/scripts that the team publishes.
The important words are “as-is”: sometimes these tools need a bit of faffing, some looking after!
There is a pre-requisite and sometimes a little “fixing” that you need to do to get it to run correctly.
First, install the “Microsoft Visual C++ Redistributable for Visual Studio 2017” from here. On the download page, scroll down to the “Other Tools and Frameworks” section to download the redistributable (x64 version).
Then when you run FailoverDetection.exe you may get strong name validation errors like.
Unhandled Exception: System.IO.FileLoadException: Could not load file or assembly 'Microsoft.SqlServer.XEvent.Linq, Version=15.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. Strong name validation failed. (Exception from HRESULT: 0x8013141A) ---> System.Security.SecurityException: Strong name validation failed. (Exception from HRESULT: 0x8013141A)
--- End of inner exception stack trace ---
at FailoverDetector.XeventParser.LoadXevent(String xelFileName, String serverName)
Then you will need to run the sn.exe tool which is in the zip file. Use this syntax.
.\sn.exe -Vr PATHTODLLFile
I had to do it for two DLLs.
NOTE – If you get an error like this when running sn.exe (or any executable) from PowerShell it means that you have missed the .\ (dot whack) in front of the executable name.
* Faffing – Doing something that is a bit awkward. See Link.
Logs required for the Tool
To run the Failover Detection Utility you need to gather the following information from each replica and place it in the specified data folder.
SQL error logs
Always On Availability Groups Extended Event Logs
System Health Extended Event Logs
System log
Windows cluster log
Once you have gathered all of that data then you need to alter the configuration file for the executable.
{
  "Data Source Path": "Path to Data File",
  "Health Level": 3,
  "Instances": [
    "Replica1",
    "Replica2",
    "Replica3"
  ]
}
Running The Tool
Once you have done that you can then run the Failover Detection Utility. You can double click the exe,
or you can run it from the command line.
In both cases it won't exit, so when you see 'Saving Results to JSON file' you can press enter (sometimes twice!).
The results can be seen in the JSON file, which will be stored in a Results directory in the directory where FailoverDetection.exe exists.
You can also use some switches with the FailoverDetection utility.
--Analyze – When "--Analyze" is specified as a parameter, the utility will load the configuration file without copying log data. It assumes the log files have already been copied over. It does everything as default mode except copying log data. This option is useful if you already have the data in the local tool execution subdirectories and want to rerun the analysis.
--Show – The utility, after analyzing the log data, will display the results in the command console. Additionally, the results will be persisted to a JSON file in the results folder.
They look like this
Again, you need to press enter for the details to come through. The results are still saved to the Results folder as json as well so you won’t lose them.
When You Are Doing Something More Than Once ….
Automate it 🙂
When I saw the data that needed to be gathered for this tool, I quickly turned to PowerShell to enable me to easily gather the information. That has turned into a function which will
Download and extract the zip file from the Tiger Team GitHub repository
Identify all of the replicas for an Availability Group and dynamically create the configuration JSON file
Gather all of the required log files and place them in a specified data folder
Run the FailoverDetection.exe with any of the switches
Includes -Verbose, -Confirm and -WhatIf switches so that you can easily see what is happening, be prompted to confirm before actions, or see what would happen if you ran the function
You still need to press enter at the end though 🙁
and you will still need to install the “Microsoft Visual C++ Redistributable for Visual Studio 2017” and run the strong names tool if needed
This function requires PowerShell version 5, the FailoverClusters module and the dbatools module.
So you can see that if we run it without the -WhatIf switch it will
Create some directories
Download the zip file from the repo
Extract the zip file
Copy the required logs from each of the replicas to the data folder
Create the JSON configuration file
Run the executable
NOTE – I have limited the gathering of the system event log to the last 2 days to limit the amount of time spent dealing with a large system log. I gather all of the SQL Error logs in the Error log path as that works for the first scenario I wrote this for; your mileage may vary.
So if we want to run the command we can remove the -WhatIf switch.
It can take a little while to run depending on the number of replicas, size of logs etc but once it has started running you can do other things.
It will require being run as an account with permissions to all of the folders specified and Windows and SQL permissions on all of the replicas in the Availability Group.
As you can see below it has gathered all of the results and placed them in the data folder.
The results can be found in the results folder.
If I have already run the tool, I can use the Analyze switch to save gathering the data again. I also use the AlreadyDownloaded switch as I do not need to download the zip file again.
You will then need to press enter to get the next lot of results.
Why Not Add This To dbatools?
I haven't added this to dbatools (yet) because I wrote it in this way for a particular need and dbatools requires support for PowerShell V3. I have, however, created an issue in the dbatools GitHub Repository (as this is how you start the process of adding things to dbatools) so hopefully we can get it in there soon as well, in which case I will come back and update this post.
Following an upgrade to SQL Server the backup share had a number of backups, some from the old version and some from the newer version. I was asked if I had a script to be able to get the SQL Version from the backup file from all of the files in the backup share.
You can get more information about the backup using Read-DbaBackupHeader and as it is PowerShell it is easy to put this information into any format that you wish, maybe into a database with Write-DbaDataTable
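A sketch of the sort of thing that is possible (the share path, instance name and table name here are placeholders):

$files = (Get-ChildItem -Path '\\server\backupshare' -Filter *.bak -Recurse).FullName
Read-DbaBackupHeader -SqlInstance sql01 -Path $files |
    Write-DbaDataTable -SqlInstance sql01 -Database DBAAdmin -Table BackupHeaders -AutoCreateTable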
(Embedded tweet from @sqldbawithbeard, 23 November 2018)
Support for PowerShell Core in dbatools is coming along very nicely. Following some hard work by the dbatools team and some PowerShell community members like Mathias Jessen, it is now possible to run a large number of dbatools commands in PowerShell Core running on Windows. There is still a little bit of work to do to get it working on Linux and Mac, but I hear the team are working hard on that.
So the code example you see above was running on Windows 10 using PowerShell 6.1.1, the current latest stable release. This is excellent news and congratulations to all those working hard to make this work
If you want to try PowerShell Core, you can follow the instructions