Generating a Workload against AdventureWorks with PowerShell

For a later blog post I have been trying to generate some workload against an AdventureWorks database.

I found this excellent blog post by Pieter Vanhove t
https://blogs.technet.microsoft.com/msftpietervanhove/2016/01/08/generate-workload-on-your-azure-sql-database/
which references this 2011 post by Jonathan Kehayias t
https://www.sqlskills.com/blogs/jonathan/the-adventureworks2008r2-books-online-random-workload-generator/

Both of these run a random query in a single thread so I thought I would use PoshRSJob by Boe Prox b | t to run multiple queries at the same time 🙂

To install PoshRSJob, like with any PowerShell module, you run

Install-Module -Name PoshRSJob

I downloaded the AdventureWorksBOLWorkload zip from Pieter's blog post and extracted it to my C:\temp folder. I created an Invoke-RandomWorkload function, which you can get from my functions repository on GitHub. The guts of the function are

    1.. $NumberOfJobs | Start-RSJob -Name "WorkLoad"  -Throttle $Throttle -ScriptBlock  {

        # Get the queries
        $Queries = Get-Content -Delimiter $Using:Delimiter -Path $Using:PathToScript 
        # Pick a Random Query from the input object 
        $Query = Get-Random -InputObject $Queries 
        # Run the Query
        Invoke-SqlCmd -ServerInstance  $Using:SqlInstance -Credential $Using:SqlCredential -Database $Using:Database -Query $Query 
        # Choose a random number of milliseconds to wait
        $a = Get-Random -Maximum 2000 -Minimum 100; 
        Start-Sleep -Milliseconds $a;     
    } 

which will create $NumberOfJobs jobs and then run them in the background, $Throttle at a time, until they have all completed. Each job runs a random query from the query file using Invoke-SqlCmd. Why did I use Invoke-SqlCmd and not Invoke-DbaQuery from dbatools? dbatools creates runspaces in the background to help with logging, and creating runspaces inside background jobs causes errors.
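While the jobs are running you can check on them from another PowerShell prompt using the PoshRSJob cmdlets; a quick sketch of grouping them by state would be

Get-RSJob -Name WorkLoad | Group-Object -Property State | Select-Object Name, Count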

Then I can run the function with

Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014  -NumberOfJobs 1000 -Delay 10 -Throttle 10

and create a random workload. Creating lots of background jobs takes resources so when I wanted to run a longer workload I created a loop.

$x = 10
while ($x -gt 0) {
    Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014 -NumberOfJobs 1000 -Delay 10 -Throttle 10
    $x--
}

You can get the function here. The full code is below

# With thanks to Jonathan Kehayias and Pieter Vanhove

<#
.SYNOPSIS
Runs a random workload against a database using a sql file

.DESCRIPTION
Runs a random workload against a database using PoshRSJobs to create parallel jobs to run random 
queries from a T-SQL file by default it uses the AdventureWorksBOLWorkload.sql from Pieter Vanhove

.PARAMETER SqlInstance
The SQL instance to run the queries against

.PARAMETER SqlCredential
The SQL Credential for the Instance if required

.PARAMETER Database
The name of the database to run the queries against

.PARAMETER NumberOfJobs
The number of jobs to create - default 10

.PARAMETER Delay
The delay in seconds for the output for the running jobs - default 10

.PARAMETER Throttle
The number of parallel jobs to run at a time - default 5

.PARAMETER PathToScript
The path to the T-SQL script holding the queries - default 'C:\temp\AdventureWorksBOLWorkload\AdventureWorksBOLWorkload.sql'

.PARAMETER Delimiter
The delimiter in the T-SQL Script between the queries - default ------

.PARAMETER ShowOutput
Shows the output from the jobs

.EXAMPLE
Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014 -NumberOfJobs 100 -Delay 10 -Throttle 10 

Runs 100 queries with a maximum of 10 at a time against the AdventureWorks2014 database on $SQL2019CTP23

.EXAMPLE
 $x = 10
 while ($x -gt 0) {
     Invoke-RandomWorkload -SqlInstance $SQL2019CTP23 -SqlCredential $cred -Database AdventureWorks2014 -NumberOfJobs 1000 -Delay 10 -Throttle 10
     $x--
 }

Runs 1000 queries with a maximum of 10 at a time against the AdventureWorks2014 database on $SQL2019CTP23 10 times in a loop


.NOTES
With thanks to Pieter Vanhove
https://blogs.technet.microsoft.com/msftpietervanhove/2016/01/08/generate-workload-on-your-azure-sql-database/
and
Jonathan Kehayias
The AdventureWorks2008R2 Books Online Random Workload Generator
#>
function Invoke-RandomWorkload {
    #Requires -Module PoshRsJob
    #Requires -Module SQLServer
    Param(
        [string]$SqlInstance,
        [pscredential]$SqlCredential,
        [string]$Database,
        [int]$NumberOfJobs = 10,
        [int]$Delay = 10,
        [int]$Throttle = 5,
        [string]$PathToScript = 'C:\temp\AdventureWorksBOLWorkload\AdventureWorksBOLWorkload.sql',
        [string]$Delimiter = "------",
        [switch]$ShowOutput
    )
    # Check if there are old Workload Jobs
    $WorkloadJobs = Get-RSJob -Name Workload
    if ($WorkloadJobs) {
        Write-Output "Removing Old WorkLoad Jobs"
        $WorkloadJobs | Stop-RSJob
        $WorkloadJobs | Remove-RSJob
    }
    Write-Output "Creating Background Jobs"
    1..$NumberOfJobs | Start-RSJob -Name "WorkLoad" -Throttle $Throttle -ScriptBlock {
        # Get the queries
        $Queries = Get-Content -Delimiter $Using:Delimiter -Path $Using:PathToScript
        # Pick a Random Query from the input object
        $Query = Get-Random -InputObject $Queries
        # Run the Query
        Invoke-SqlCmd -ServerInstance $Using:SqlInstance -Credential $Using:SqlCredential -Database $Using:Database -Query $Query
        # Choose a random number of milliseconds to wait
        $a = Get-Random -Maximum 2000 -Minimum 100
        Start-Sleep -Milliseconds $a
    }
    $runningJobs = (Get-RSJob -Name WorkLoad -State Running).Count
    While ($runningJobs -ne 0) {
        $jobs = Get-RSJob -Name WorkLoad
        $runningJobs = $Jobs.Where{$PSItem.State -eq 'Running'}.Count
        $WaitingJobs = $Jobs.Where{$PSItem.State -eq 'NotStarted'}.Count
        $CompletedJobs = $Jobs.Where{$PSItem.State -eq 'Completed'}.Count
        Write-Output "$runningJobs jobs running - $WaitingJobs jobs waiting - $CompletedJobs jobs finished"
        Start-Sleep -Seconds $Delay
    }
    Write-Output "Jobs have finished"
    if ($ShowOutput) {
        Write-Output "WorkLoad Jobs Output below -"
        Get-RSJob -Name WorkLoad | Receive-RSJob
    }
    Write-Output "Removing Old WorkLoad Jobs"
    Get-RSJob -Name WorkLoad | Remove-RSJob
    Write-Output "Finished"
}

Creating SQL Server Containers for versions 2012-2017

I am working on my dbatools and dbachecks presentations for SQL Saturday Finland, SQLDays, SQL Saturday Cork and SQLGrillen. I want to show the two modules running against a number of SQL versions, so I have installed

  • 2 Domain Controllers
  • 2 SQL 2017 instances on Windows 2016 with an Availability Group and WideWorldImporters database
  • 1 Windows 2016 jump box with all the programmes I need
  • 1 Windows 2016 with containers

using a VSTS build and this set of ARM templates and scripts

I wanted to create containers running SQL2017, SQL2016, SQL2014 and SQL2012 and restore versions of the AdventureWorks database onto each one.

Move Docker Location

I redirected my docker location from my C:\ drive to my E:\ drive so I didn't run out of space. I did this by creating a daemon.json file in C:\ProgramData\docker\config and adding

{"data-root": "E:\\containers"}
and restarting the docker service which created folders like this
01 - folders.png
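If you prefer to restart the service from PowerShell rather than the Services console, this should be all that is needed (assuming the default service name of docker and an elevated prompt)

# Restart the Docker engine so it picks up the new data-root
Restart-Service -Name docker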
Then I ran
docker volume create SQLBackups
to create a volume to hold the backups that I could mount on the containers

AdventureWorks Backups

I downloaded all the AdventureWorks backups from GitHub and copied them to E:\containers\volumes\sqlbackups\_data

Get-ChildItem $Home\Downloads\AdventureWorks* | Copy-Item -Destination E:\containers\volumes\sqlbackups\_data

Getting the Images

To download the SQL 2017 image from the DockerHub I ran

docker pull microsoft/mssql-server-windows-developer:latest

and waited for it to download and extract

I also needed the images for other versions. My good friend Andrew Pruski b | t has versions available for us to use on his Docker Hub  so it is just a case of running

docker pull dbafromthecold/sqlserver2016dev:sp1
docker pull dbafromthecold/sqlserver2014dev:sp2
docker pull dbafromthecold/sqlserver2012dev:sp4
and waiting for those to download and extract (This can take a while!)

Create the containers

Creating the containers is as easy as

docker run -d -p ExposedPort:InternalPort --name NAME -v VolumeName:LocalFolder -e sa_password=THEPASSWORD -e ACCEPT_EULA=Y IMAGENAME
so all I needed to run to create 4 SQL containers, one of each version, was
docker run -d -p 15789:1433 --name 2017 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y microsoft/mssql-server-windows-developer
docker run -d -p 15788:1433 --name 2016 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2016dev:sp1
docker run -d -p 15787:1433 --name 2014 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2014dev:sp2
docker run -d -p 15786:1433 --name 2012 -v sqlbackups:C:\SQLBackups -e sa_password=PruskiIsSQLContainerMan! -e ACCEPT_EULA=Y dbafromthecold/sqlserver2012dev:sp4

and just a shade over 12 seconds later I have 4 SQL instances ready for me 🙂

02 - creating containers.png

03 - Containers at the ready.png
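If you want to check them from the command line as well as in the screenshots, docker ps lists the running containers along with their names and mapped ports

docker ps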

Storing Credentials

This is not something I would do in a Production environment but I save my credentials using this method that Jaap Brasser b | t shared here

Get-Credential | Export-Clixml -Path $HOME\Documents\sa.cred
which means that I can get the credentials in my PowerShell session (as long as it is the same user that created the file) using
$cred = Import-Clixml $HOME\Documents\sa.cred
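A quick way to prove the saved credential works is to connect to one of the containers with it; a sketch using dbatools (Connect-DbaInstance is the current cmdlet name, this post uses the older Connect-DbaSqlServer later on)

# Connect to the 2017 container on its exposed port using the saved sa credential
Connect-DbaInstance -SqlInstance 'bearddockerhost,15789' -SqlCredential $cred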

Restoring the databases

I restored all of the AdventureWorks databases that each instance will support onto each instance, so 2017 has all of them whilst 2012 only has the 2012 versions.

First I needed to get the filenames of the backup files into a variable

$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name

and the container connection strings, which are the hostname and the port number

$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
then I can restore the databases using dbatools, with a switch statement on the version, which I get from the NameLevel property of Get-DbaSqlBuildReference
$cred = Import-Clixml $HOME\Documents\sa.cred
$containers = 'bearddockerhost,15789', 'bearddockerhost,15788', 'bearddockerhost,15787', 'bearddockerhost,15786'
$filenames = (Get-ChildItem '\\bearddockerhost\e$\containers\volumes\sqlbackups\_data').Name
$containers.ForEach{
    $Container = $Psitem
    $NameLevel = (Get-DbaSqlBuildReference -SqlInstance $Container -SqlCredential $cred).NameLevel
    switch ($NameLevel) {
        2017 {
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path C:\sqlbackups\ -useDestinationDefaultDirectories -WithReplace | Out-Null
            Write-Verbose -Message "Restored Databases on 2017"
        }
        2016 {
            $Files = $Filenames.Where{$PSItem -notlike '*2017*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2016"
        }
        2014 {
            $Files = $Filenames.Where{$PSItem -notlike '*2017*' -and $PSItem -notlike '*2016*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2014"
        }
        2012 {
            $Files = $Filenames.Where{$PSItem -like '*2012*'}.ForEach{'C:\sqlbackups\' + $PSItem}
            Restore-DbaDatabase -SqlInstance $Container -SqlCredential $cred -Path $Files -useDestinationDefaultDirectories -WithReplace
            Write-Verbose -Message "Restored Databases on 2012"
        }
        Default {}
    }
}
I needed to create the file paths for each backup file by getting the correct backups and appending the names to C:\SQLBackups, which is where the volume is mounted inside the container.
As Get-DbaDatabase gives the container ID as the Computer Name, I have highlighted each container below.
04 - databases.png
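If you want to produce a listing like that yourself, something along these lines will do it (Get-DbaDatabase is a dbatools command and ComputerName is the property that holds the container ID)

$containers.ForEach{
    Get-DbaDatabase -SqlInstance $PSItem -SqlCredential $cred | Select-Object ComputerName, SqlInstance, Name
}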
That is how easy it is to create a number of SQL containers of differing versions for your presentation or exploration needs.
Happy Automating!

Getting SQL Server File Sizes and Space Used with dbatools

I read a great blog post on SQL Buffet about answering the question "How big is the database?" using T-SQL, and wondered how much I could do with the dbatools module.

The dbatools module (for those that don't know) is a PowerShell module written by amazing folks in the community, designed to make administering your SQL Server significantly easier using PowerShell. The instructions for installing it are available here. It comprises 182 separate commands at present (11 March 2017, version 0.8.938).

I know that there is a Get-DbaDatabaseFreespace command written by Mike Fal b | t using Glenn Berry's diagnostic queries.

The first thing, as always, is to look at the help

 Get-Help Get-DbaDatabaseFreespace -ShowWindow

which will show you the help for the command and some examples

Let's look at the details for a single instance

 Get-DbaDatabaseFreespace -sqlserver $server

This is what it looks like

02 - singel server.gif

and yes, it really is that fast; I have not sped this up. 232 ms to get those details for an instance with 19 databases.

03 - Measure Command.PNG

What information do you get? Let's look at the information for a single database; you get an object for each file.

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_System
FileGroup            : PRIMARY
PhysicalName         : F:\DBA-Admin_System.MDF
FileType             : ROWS
UsedSpaceMB          : 3
FreeSpaceMB          : 253
FileSizeMB           : 256
PercentUsed          : 1
AutoGrowth           : 0
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 16777213
AutoGrowthPossibleMB : 0
UnusableSpaceMB      : 16777213

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_Log
FileGroup            :
PhysicalName         : G:\DBA-Admin_Log.LDF
FileType             : LOG
UsedSpaceMB          : 32
FreeSpaceMB          : 224
FileSizeMB           : 256
PercentUsed          : 12
AutoGrowth           : 256
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 2528
AutoGrowthPossibleMB : 2304
UnusableSpaceMB      : 0

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_User
FileGroup            : UserFG
PhysicalName         : F:\DBA-Admin_User.NDF
FileType             : ROWS
UsedSpaceMB          : 1
FreeSpaceMB          : 255
FileSizeMB           : 256
PercentUsed          : 0
AutoGrowth           : 256
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 5119
AutoGrowthPossibleMB : 4864
UnusableSpaceMB      : 0

There is a lot of useful information returned for each file. It's better if you use Out-GridView, as then you can order by columns and filter in the top bar.
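That is just a case of piping the results to Out-GridView, something like

Get-DbaDatabaseFreespace -SqlServer $server | Out-GridView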
04 - single server ogv.gif

 

As always, PowerShell uses the permissions of the account running the session to connect to the SQL Server, unless you provide a separate credential for SQL Authentication. If you need to connect with a different Windows account you will need to hold Shift down, right click on the PowerShell icon and click Run as different user.
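If it is SQL Authentication that you need, the command will accept a credential; a minimal sketch, assuming the -SqlCredential parameter name used by most dbatools commands, would be

$SqlCred = Get-Credential
Get-DbaDatabaseFreespace -SqlServer $server -SqlCredential $SqlCred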

Lets get the information for a single database. The command has dynamic parameters which populate the database names to save you time and keystrokes

05 dynamic parameters.gif

But you may want to gather information about more than one server. Let's take a list of servers and place them into a variable. You can add your servers to this variable in a number of ways, maybe by querying your CMDB or by using your registered servers or central management server.

$SQLServers = 'SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQLvNextN2'
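If you keep your servers in a central management server you could build that list from there instead of typing it; a hedged sketch (Get-DbaRegServer is the cmdlet name in current dbatools, older builds called it Get-DbaRegisteredServerName, and 'MyCMSServer' is a placeholder)

$SQLServers = (Get-DbaRegServer -SqlInstance 'MyCMSServer').ServerName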

and then

Get-DbaDatabaseFreespace -SqlInstance $SQLServers | Out-GridView

 

06 - Many servers ogv.PNG

As you can see, you get a warning, quite correctly, that the information for the asynchronous secondary node of the availability group databases is not available. I did not have all of my servers running, so there are a couple of 'could not connect' warnings as well. You can still filter very quickly. dbatools is tested from SQL2000 to SQL vNext, as you can see below (although I don't have a SQL2000 instance).

07 - filter ogv

 

Not only on Windows, this command will work against SQL running on Linux as well

08 - linux.PNG

So if we want to know the total size of the files on disk for the database, we need to look at the FileSizeMB property

$server = 'SQL2014Ser12R2'
$dbName = 'AdventureWorksDW2014'
Get-DbaDatabaseFreespace -SqlServer $server -database $dbName |
Select Database,FileName,FileSizeMB

Of course that’s an easy calculation here

08a - filesize.PNG

but if we have numerous files then it may be tougher. We can use the Measure-Object command to sum the properties. We need to do a bit of preparation here and set a couple of calculated properties to make it more readable.

$server = 'SQL2014Ser12R2'
$dbName = 'AdventureWorksDW2014'
$database = @{Name = 'Database'; Expression = {$dbname}}
$FileSize = @{Name = 'FileSize'; Expression = {$_.Sum}}
Get-DbaDatabaseFreespace -SqlServer $server -database $dbName |
Select Database,FileSizeMB |
Measure-Object FileSizeMB -Sum |
Select $database ,Property, $filesize

09 - filessize

Maybe we want to look at all of the databases on an instance. Again, we have to do a little more work here

$server = 'SQL2014Ser12R2'
$srv = Connect-DbaSqlServer $server
$SizeonDisk = @()
$srv.Databases | ForEach-Object {
    $dbName = $_.Name
    $database = @{Name = 'Database'; Expression = {$dbname}}
    $FileSize = @{Name = 'FileSize'; Expression = {$_.Sum}}
    $SizeOnDisk += Get-DbaDatabaseFreespace -SqlServer $server -database $dbName | Select Database,FileSizeMB | Measure-Object FileSizeMb -Sum | Select $database, Property, $Filesize
}
$SizeOnDisk

10 - size on disk

If we wanted the databases ordered by the size of their files we could do this

$SizeOnDisk |Sort-Object Filesize -Descending

11 - size sorted.PNG

As it is PowerShell, we have an object and we can use it any way we like. Maybe we want that information in a text file, or a CSV, or an Excel file, or in an email; PowerShell can do that.

 ## In a text file
$SizeonDisk | Out-file C:\temp\Sizeondisk.txt
Invoke-Item C:\temp\Sizeondisk.txt
## In a CSV
$SizeonDisk | Export-Csv C:\temp\Sizeondisk.csv -NoTypeInformation
notepad C:\temp\Sizeondisk.csv
## Email
Send-MailMessage -SmtpServer $smtp -From DBATeam@TheBeard.local -To JuniorDBA-Smurf@TheBeard.Local `
-Subject "Smurf this needs looking At" -Body $SizeonDisk
## Email as Attachment
Send-MailMessage -SmtpServer $smtp -From DBATeam@TheBeard.local -To JuniorDBA-Smurf@TheBeard.Local `
-Subject "Smurf this needs looking At" -Body "Smurf" -Attachments C:\temp\Sizeondisk.csv

I had a little play with Boe Prox's PoshCharts (you have to use the dev branch) to see if I could get some nice charts, and unfortunately the bar charts did not come out so well, but luckily the donut and pie charts did. (I'm a DBA, I love donuts!)

$SizeonDisk| Out-PieChart -XField Database -YField FileSize -Title "UsedSpaceMB per Database on $Server" -IncludeLegend -Enable3D
$SizeonDisk| Out-DoughnutChart -XField Database -YField FileSize -Title "UsedSpaceMB per Database on $Server" -IncludeLegend -Enable3D

12 - donuts.PNG

So the point is, whatever you or your process requires you can pretty much bet that PowerShell can enable it for you to automate.

You can make use of all of the properties exposed by the command. If you want to only see the files with less than 20% space free

 Get-DbaDatabaseFreespace -SqlServer $server | Where-Object {$_.PercentUsed -gt 80}

13 - percent used.PNG

You can also use the command to check file growth settings

 Get-DbaDatabaseFreespace -SqlServer $server | Where-Object {$_.AutoGrowType  -ne 'Mb'}

14 - autogrowth.PNG

Or maybe you want to know the FileSize, Used and Free Space per database

$server = 'SQL2014Ser12R2'
$srv = Connect-DbaSqlServer $server
$SizeonDisk = @()
$srv.Databases | ForEach-Object {
    $dbName = $_.Name
    $database = @{Name = 'Database'; Expression = {$dbname}}
    $MB = @{Name = 'Mbs'; Expression = {$_.Sum}}
    $SizeOnDisk += Get-DbaDatabaseFreespace -SqlServer $server -database $dbName | Select Database,FileSizeMB, UsedSpaceMB, FreeSpaceMb | Measure-Object FileSizeMb, UsedSpaceMB, FreeSpaceMb -Sum | Select $database, Property, $Mb
}
$SizeOnDisk

15 totals.PNG

Hopefully that has given you a quick insight into another one of the fabulous dbatools commands. Any questions, comment below or head over to the SQL Server Community Slack via https://sqlps.io/slack

Happy Automating

 

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module if you already have the latest version. If you did not install the module from the PowerShell Gallery using

 Install-Module dbatools

Then you can use

 Update-dbatools

 

Add Adventure Works Database to Windows Azure VM

This has been an interesting journey. The Adventure Works database is frequently used in blogs and reference books, and I wanted to install it in my Windows Azure Learning Lab. I also wanted to automate the process.

The easiest way is to download the Windows Azure MDF file from http://msftdbprodsamples.codeplex.com/, jump through all the security warnings in Internet Explorer, save the file, and then create the database as follows

CREATE DATABASE AdventureWorks2012
ON (FILENAME = 'PATH TO \AdventureWorks2012_Data.mdf')
FOR ATTACH_REBUILD_LOG ;

That is the way I will do it from now on! After reading this page I tried to download the file with PowerShell, but it would not work as I could not provide a direct link to the file. Maybe someone can help me with that. So I thought I would use my SkyDrive to hold the MDF file and map a drive on the server.

To do this you need to add the Desktop Experience feature to the server. This can be done as follows

Import-Module ServerManager
Add-WindowsFeature Desktop-Experience -restart

This will take a few minutes to install, reboot and then configure the updates before you can log back in. While it is doing this, log into your SkyDrive, navigate to a folder and copy the URL to Notepad.

It will look something like this

https://skydrive.live.com/?lc=2137#cid=XXXXXXXXXXXXXXXX&id=CYYYYYYYYYYYYYYYY

Copy the GUID after the cid=

and write this command

net use T:  \\d.docs.live.net@SSL\XXXXXXXXXXXXXXXX /user:$user $password

I keep this in a script and pass the user and password in via Read-Host
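A sketch of that wrapper script, reusing the GUID placeholder from above, might look like this

# Prompt for the Microsoft account and password, then map the SkyDrive folder as T:
$user = Read-Host 'Enter your Microsoft account'
$password = Read-Host 'Enter your password'
net use T: \\d.docs.live.net@SSL\XXXXXXXXXXXXXXXX /user:$user $password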

However, if you try to copy the item from the folder you will get an error

The file size exceeds the limit allowed and cannot be saved

So you will need to alter a registry key as follows
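The limit in question is normally controlled by the WebClient (WebDAV) service's FileSizeLimitInBytes value; the key path, value name and new size below are my assumptions, so verify them against your server before changing anything

# Raise the WebDAV download limit (the default is roughly 47MB) and restart the WebClient service
# Key path, value name and maximum are assumptions - check them on your server first
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\WebClient\Parameters' -Name FileSizeLimitInBytes -Value 4294967295
Restart-Service -Name WebClient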