dbachecks – Dark Mode Historical Validation PowerBi

In my last post I showed how you can save the results of dbachecks to a database and created a PowerBi report. Inspired by Frank Henninger in the #dbachecks slack channel and Shawn Melton, who explained the difficulties for people with red/green colour blindness, I then created this one 🙂

dark mode

You can find it in my GitHub and have a play with it below

Happy Validating!

dbachecks – Save the results to a database for historical reporting

I gave a presentation at SQL Day in Poland last week on dbachecks and one of the questions I got asked was whether I would write a command to put the results of the checks into a database for historical reporting.

The answer is no and here is the reasoning. The capability is already there. Most good PowerShell commands will only return an object and the beauty of an object is that you can do anything you like with it. Your only limit is your imagination 🙂 I have written about this before here. The other reason is that it would be very difficult to write something that was easily configurable for the different requirements that people will have. But here is one way of doing it.

Create a configuration and save it

Let’s define a configuration and call it production. This is something that I do all of the time so that I can easily run a set of checks with the configuration that I want.

Now I can export that configuration to a json file and store it on a file share or in source control using the code below. This makes it easy to embed the checks into an automation solution.
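Something like this, with an example path (the production values themselves are set beforehand with Set-DbcConfig):

# Export the current session configuration so it can be shared or version controlled
Export-DbcConfig -Path \\FileShare\dbachecks\Production_config.json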
and then I can use it with
01 - Invoke-DbcCheck
I would use one of the Show parameter values here if I was running it at the command line, probably Fails, to make reading the information easier.
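A sketch of that usage, assuming the exported file from above:

Import-DbcConfig -Path \\FileShare\dbachecks\Production_config.json
Invoke-DbcCheck -AllChecks -Show Fails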

Add results to a database

This only gets us the test results on the screen, so if we want to save them to a database we have to use the PassThru parameter for Invoke-DbcCheck. I will run the checks again and save them to a variable.
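# Capture the full results object instead of only writing to the screen
$results = Invoke-DbcCheck -AllChecks -Show Fails -PassThru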

Then I can use the dbatools Write-DbaDataTable command to write the results to a table in a database. I need to do this twice, once for the summary and once for the test results.
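A sketch with example instance, database and table names; the summary properties are the count properties on Pester's results object:

# Summary - the top-level counts
$results | Select-Object TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount |
    Write-DbaDataTable -SqlInstance sql0 -Database tempdb -Table Summary -AutoCreateTable
# Details - the individual test results
$results.TestResult |
    Write-DbaDataTable -SqlInstance sql0 -Database tempdb -Table Detail -AutoCreateTable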

and I get two tables, one for the summary

02 - summary

and one for the details

03 - detail
This works absolutely fine and I could continue to add test results in this fashion, but it has no date property so it is not so useful for reporting.

Create tables and triggers

This is one way of doing it. I am not sure it is the best way but it works! I always look forward to how people take ideas and move them forward so if you have a better/different solution please blog about it and reference it in the comments below

First I created a staging table for the summary results

and a destination table with a primary key and a date column which defaults to today's date

and added an INSERT trigger to the staging table
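A rough sketch of that pattern for the summary side, run here with dbatools' Invoke-DbaQuery and example table and column names (the details tables follow the same pattern):

# Example names only - the stage columns should match what Write-DbaDataTable sends
$tables = @"
CREATE TABLE dbo.SummaryStage (
    TagFilter nvarchar(max), TotalCount int, PassedCount int,
    FailedCount int, SkippedCount int, PendingCount int);
CREATE TABLE dbo.Summary (
    SummaryID int IDENTITY(1,1) PRIMARY KEY,
    TestDate date NOT NULL DEFAULT GETDATE(),
    TagFilter nvarchar(max), TotalCount int, PassedCount int,
    FailedCount int, SkippedCount int, PendingCount int);
"@
$trigger = @"
CREATE TRIGGER dbo.InsertSummary ON dbo.SummaryStage AFTER INSERT AS
INSERT INTO dbo.Summary (TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount)
SELECT TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount FROM inserted;
"@
Invoke-DbaQuery -SqlInstance sql0 -Database dbachecks -Query $tables
Invoke-DbaQuery -SqlInstance sql0 -Database dbachecks -Query $trigger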

and for the details I do the same thing. A details table

A stage table

with a trigger

Then I can use Write-DbaDataTable with a couple of extra parameters: FireTriggers to run the trigger, Truncate, and Confirm:$false to avoid any confirmation, because I want this to run without any interaction. This gets the results into the database.
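Something like this, using the example staging tables sketched above:

# Truncate the staging tables, write the new results and fire the INSERT triggers
$results | Select-Object TagFilter, TotalCount, PassedCount, FailedCount, SkippedCount, PendingCount |
    Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table SummaryStage -FireTriggers -Truncate -Confirm:$false
$results.TestResult |
    Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table DetailStage -FireTriggers -Truncate -Confirm:$false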

detail with stage

Which means that I can now query some of this data and also create PowerBi reports for it.

To enable me to have results for the groups in dbachecks I have to do a little bit of extra manipulation. I can add all of the checks to the database using
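# Example instance, database and table names
Get-DbcCheck | Write-DbaDataTable -SqlInstance sql0 -Database dbachecks -Table Checks -AutoCreateTable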

But because the Ola Hallengren Job names are configuration items I need to update the values for those checks which I can do as follows

You can get a sample Power Bi report in my Github which also has the code from this blog post

Then you just need to open it in PowerBi Desktop and

  • Click Edit Queries
  • Click Data Source Settings
  • Click Change Source
  • Change the Instance and Database names

09 - PowerBi

Then have an interactive report like this. Feel free to click around and see how it works. Use the arrows at the bottom right to go full-screen. NOTE – it filters by “today”, so if I haven’t run the checks and the import today, click on one of the groups under “Today’s Checks by Group” to see results.

This enables me to filter the results and see what has happened in the past so I can filter by one instance
05 - filter by instance
or I can filter by a group of tests
07 - filter by instance
or even by a group of tests for an instance
08 - filter by instance and insance

Hopefully, this will give you some ideas of what you can do with your dbachecks results. You can find all of the code and the PowerBi in my GitHub

Happy Validating!

dbachecks – Configuration Deep Dive

Today is the day that we have announced dbachecks, a PowerShell module enabling you to validate your SQL Instances. You can read more about it here, where you can learn how to install it and see some simple use cases.

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler enabling you to set configuration items at run-time or for your session and enabling you to export and import them so you can create different configs for different use cases

There are 108 configuration items at present. You can see the current configuration by running
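Get-DbcConfig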

which will show you the name of the config, the value it is currently set to and the description

You can see all of the configs and their descriptions here

Name | Description
agent.databasemailprofile | Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail | Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname | Name of the DBA Operator in SQL Agent
agent.failsafeoperator | Email address of the DBA Operator in SQL Agent
app.checkrepos | Where Pester tests/checks are stored
app.computername | List of Windows Servers that Windows-based tests will run against
app.localapp | Persisted files live here
app.maildirectory | Files for mail are stored here
app.sqlcredential | The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance | List of SQL Server instances that SQL-based tests will run against
app.wincredential | The universal Windows if default Windows Authentication is not used
command.invokedbccheck.excludecheck | Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller | The domain controller to process your requests
domain.name | The Active Directory domain that your server is a part of
domain.organizationalunit | The OU that your server should be a part of
mail.failurethreshhold | Number of errors that must be present to generate an email report
mail.from | Email address the email reports should come from
mail.smtpserver | Store the name of the smtp server to send email reports
mail.subject | Subject line of the email report
mail.to | Email address to send the report to
policy.backup.datadir | Destination server data directory
policy.backup.defaultbackupcompreesion | Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours | Maxmimum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays | Maxmimum number of days before Full Backups are considered outdated
policy.backup.logdir | Destination server log directory
policy.backup.logmaxminutes | Maxmimum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod | The number of hours a newly created database is allowed to not have backups
policy.backup.testserver | Destination server for backuptests
policy.build.warningwindow | The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme | Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount | Number of times to ping a server to establish average response time
policy.connection.pingmaxms | Maximum response time in ms
policy.dacallowed | DAC should be allowed $true or disallowed $false
policy.database.autoclose | Auto Close should be allowed $true or dissalowed $false
policy.database.autocreatestatistics | Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink | Auto Shrink should be allowed $true or dissalowed $false
policy.database.autoupdatestatistics | Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously | Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance | Percentage for Tolerance for checking for balanced files in a filegroups
policy.database.filegrowthexcludedb | Databases to exclude from the file growth check
policy.database.filegrowthtype | Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue | The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount | The number of Log files expected on a database
policy.database.logfilesizecomparison | How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage | Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf | Max virtual log files
policy.dbcc.maxdays | Maxmimum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree | Percent disk free
policy.dump.maxcount | Maximum number of expected dumps
policy.hadr.tcpport | The TCPPort for the HADR check
policy.identity.usagepercent | Maxmimum percentage of max of identity column
policy.invaliddbowner.excludedb | Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name | The database owner account should not be this user
policy.network.latencymaxms | Max network latency average
policy.ola.commandlogenabled | Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled | Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database | The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled | Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled | Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed | Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled | Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled | Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled | Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled | Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled | Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention | Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled | Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled | Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled | Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled | Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention | Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled | Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled | Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention | Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled | Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled | Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled | Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled | Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled | Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled | Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention | Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled | Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation | OLE Automation should be enabled $true or disabled $false
policy.pageverify | Page verify option should be set to this value
policy.recoverymodel.excludedb | Databases to exclude from standard recovery model check
policy.recoverymodel.type | Standard recovery model
policy.storage.backuppath | Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb | Databases to exclude from valid dbowner checks
policy.validdbowner.name | The database owner account should be this user
policy.whoisactive.database | Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession | List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession | List of XE Sessions that should not be running.
policy.xevent.validrunningsession | List of XE Sessions that can be be running.
skip.backup.testing | Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping | Skip the ping check for connectivity
skip.connection.remoting | Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest | Skip the logfilecount test
skip.datafilegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck | Skip data purity check in last good dbcc command
skip.diffbackuptest | Skip the Differential backup test
skip.logfilecounttest | Skip the logfilecount test
skip.logshiptesting | Skip the logshipping test
skip.tempdb1118 | Don't run test for Trace Flag 1118
skip.tempdbfilecount | Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent | Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax | Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc | Don't run test for Temp Database Files on C

So there are a lot of configurations that you can use. A lot are already set by default, but you can configure all of them with the values that you need for your own estate.

The configurations are stored in the registry at HKCU:\Software\Microsoft\WindowsPowerShell\PSFramework\

First Configurations

First I would run this so that you can see all of the configs in a separate window (note this does not work on PowerShell v6)
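# Out-GridView is the separate window; it is not available in PowerShell v6
Get-DbcConfig | Out-GridView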

Let's start with the first configurations that you will want to set. These should be the instances and the hosts that you want to check.

You can get the value of the configuration item using
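Get-DbcConfigValue -Name app.sqlinstance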

As you can see in the image, nothing is returned, so we have no instances configured at present. We have added tab completion to the Name parameter so that you can easily find the right one.

If you want to look at more information about the configuration item you can use
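Get-DbcConfig -Name app.sqlinstance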

which shows you the name, current value and the description

So let's set our first configuration, for our SQL instance, to localhost. I have included a video so you can see the auto-complete in action as well.
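Set-DbcConfig -Name app.sqlinstance -Value localhost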

This configuration will be used for any SQL-based checks but not for any Windows-based ones like Services, PowerPlan, SPN, DiskSpace or Cluster, so let's set the app.computername configuration as well.
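Set-DbcConfig -Name app.computername -Value localhost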

This means that when we run Invoke-DbcCheck with AllChecks or by specifying a check, it will run against the local machine and default instance unless we specify a SqlInstance when calling Invoke-DbcCheck. So the code below will not use the configuration for app.sqlinstance.
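For example, with a made-up instance name:

Invoke-DbcCheck -SqlInstance sql0 -Check DatabaseCollation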

Exclude a Check

You can exclude a check using the -ExcludeCheck parameter of Invoke-DbcCheck. In the example below I am running all of the Server checks but excluding the SPN check as we are not on a domain.
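Invoke-DbcCheck -Check Server -ExcludeCheck SPN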

 

There is a configuration setting to exclude checks as well. (Be careful: this will exclude them even if you specifically specify a check when calling Invoke-DbcCheck, but we do give you a warning!)
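Set-DbcConfig -Name command.invokedbccheck.excludecheck -Value SPN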

So now I can run
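Invoke-DbcCheck -Check Server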

and all of the server checks except the SPN check will run against the local machine and the default instance that I have set in the config

Creating an environment config and exporting it to use any time we like

So let's make this a lot more useful. Let's create a configuration for our production environment and save it to disk (or even source control it!) so that we can use it again and again. We can also then pass it to other members of our team or even embed it in an automated process or our CI/CD system.

Let's build up a configuration for a number of tests for my “production” environment. I will not explain them all here but let you read through the code and the comments to see what has been set. You will see that some of them are due to me running the test on a single machine with one drive.
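A shortened sketch of the sort of values involved (the instance names and thresholds here are examples, not the full list):

Set-DbcConfig -Name app.sqlinstance -Value 'sql0', 'sql1'
Set-DbcConfig -Name app.computername -Value 'sql0', 'sql1'
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 1
Set-DbcConfig -Name policy.dbcc.maxdays -Value 7
Set-DbcConfig -Name policy.recoverymodel.type -Value FULL
# single machine with one drive, so skip the backup restore testing
Set-DbcConfig -Name skip.backup.testing -Value $true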

When I run this I get

I can then export this to disk (to store in source control) using
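# Example path - somewhere under source control is a good choice
Export-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json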

and I have a configuration file

which I can use any time to set the configuration for dbachecks using the Import-DbcConfig command (but this doesn't work in VS Code's integrated terminal, which occasionally does odd things; this appears to be one of them)
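Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json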

So I can import this configuration and run my checks with it any time I like. This means that I can create many different test configurations for my many different environment or estate configurations.

Yes, I know “good/best practice” says we should use the same configuration for all of our instances but we know that isn’t true. We have instances that were set up 15 years ago that are still in production. We have instances from the companies our organisation has bought over the years that were set up by system administrators. We have instances that were set up by shadow IT that we now have to support but can't change.

As well as those though, we also have different environments. Our development or test environment will have different requirements to our production environments.

In this hypothetical situation the four instances for four different applications have 4 development containers which are connected to using SQL Authentication. We will need a different configuration.

SQL Authentication

We can set up SQL Authentication for connecting to our SQL Instances using the app.sqlcredential configuration. This is going to hold a PSCredential object for SQL Authenticated connections to your instances. If this is set, the checks will always try to use it. Yes, this means that the same username and password is being used for each connection. No, there is currently no way to choose which instances use it and which don't. This may be a limitation but as you will see further down you can still do this with different configurations.

To set the SQL Authentication run
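Set-DbcConfig -Name app.sqlcredential -Value (Get-Credential)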

This will give a prompt for you to enter the credential

Development Environment Configuration

So now that we know how to set a SQL Authentication configuration we can create our development environment configuration like so. As you can see below, the values are different for the checks and more checks have been skipped. I won't explain it all; if it doesn't make sense ask a question in the comments or in the dbachecks channel in the SQL Server Community Slack.

Using The Different Configurations

Now I have two configurations, one for my production environment and one for my development environment. I can run my checks whenever I like (perhaps you will automate this in some way); there is a sketch of the whole workflow after the list below.

  • Import the production configuration
  • Run my tests with that configuration and create a json file for my Power Bi labelled production
  • Import the development configuration (and enter the SQL authentication credential)
  • Run my tests with that configuration and create a json file for my Power Bi labelled development
  • Start Power Bi to show those results
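A sketch of that workflow (the paths and environment names are examples):

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Production_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production
Import-DbcConfig -Path C:\Users\dbachecks\Desktop\Development_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Development
Start-DbcPowerBi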

I have published the Power Bi so that you can see what it would look like and have a click around (maybe you can see improvements you would like to contribute)

Now we can see how each environment is performing according to the settings for that environment.

Combining Configurations Into One Result Set

As you saw above, by using the Environment parameter of Update-DbcPowerBiDataSource you can add different environments to one report. But if I wanted to have a report for my application APP1 showing both production and development environments but they have different configurations how can I do this?

Here’s how.

  • Create a configuration for the production environment (I have used the production configuration one from above but only localhost for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Prod_config.json
  • Create a configuration for the development environment (I have used the development configuration one from above but only localhost,1401 for the instance)
  • Export it to C:\Users\dbachecks\Desktop\APP1-Dev_config.json

Then run
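something along these lines, assuming the Append switch on Update-DbcPowerBiDataSource as referenced below:

Import-DbcConfig -Path C:\Users\dbachecks\Desktop\APP1-Prod_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1-Prod
Import-DbcConfig -Path C:\Users\dbachecks\Desktop\APP1-Dev_config.json
Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment APP1-Dev -Append
Start-DbcPowerBi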

Notice that this time there is an Append on the last run; this creates a single json file for the Power Bi, and the results look like this. Now we have the results for our application from both the production environment (localhost) and the development container (localhost,1401).

 

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open source and available on GitHub for anyone to contribute to.

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

and all of the other people who have contributed in the dbachecks Slack channel

Announcing dbachecks – Configurable PowerShell Validation For Your SQL Instances

For the last couple of months members of the dbatools team have been working on a new PowerShell module called dbachecks. This open source PowerShell module will enable you to validate your SQL Instances. Today it is released for you all to start to use 🙂

Validate Your SQL Instances?

What do I mean by validate your SQL Instances? You want to know if your SQL Instances are (still) set up in the way that you want them to be or that you have not missed any configurations when setting them up. With dbachecks you can use any or all of the 80 checks to ensure one or many SQL Instances are as you want them to be. Using Pester, dbachecks will validate your SQL Instance(s) against default settings or ones that you configure yourself.

Installation

Installation is via the PowerShell Gallery. You will need to open PowerShell on a machine connected to the internet and run
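Install-Module -Name dbachecks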

If you are not running your process as admin, or you only want to (or are only able to) install for your own user account, you will need to
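Install-Module -Name dbachecks -Scope CurrentUser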

This will also install the PSFramework module used for configuration (and other things beneath the hood) and the latest version (4.2.0 – released on Sunday!) of Pester

Once you have installed the module you can see the commands available by running
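Get-Command -Module dbachecks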

To be able to use these (and any PowerShell) commands, your first step should always be Get-Help
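For example:

Get-Help Invoke-DbcCheck -Detailed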

80 Checks

At the time of release, dbachecks has 80 checks. You can see all of the checks by running
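Get-DbcCheck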

(Note this has nothing to do with DBCC CHECKDB!) Here is the output of that command

so you can see the current checks

Group | UniqueTag
Agent | AgentServiceAccount
Agent | DbaOperator
Agent | FailsafeOperator
Agent | DatabaseMailProfile
Agent | FailedJob
Database | DatabaseCollation
Database | SuspectPage
Database | TestLastBackup
Database | TestLastBackupVerifyOnly
Database | ValidDatabaseOwner
Database | InvalidDatabaseOwner
Database | LastGoodCheckDb
Database | IdentityUsage
Database | RecoveryModel
Database | DuplicateIndex
Database | UnusedIndex
Database | DisabledIndex
Database | DatabaseGrowthEvent
Database | PageVerify
Database | AutoClose
Database | AutoShrink
Database | LastFullBackup
Database | LastDiffBackup
Database | LastLogBackup
Database | VirtualLogFile
Database | LogfileCount
Database | LogfileSize
Database | FileGroupBalanced
Database | AutoCreateStatistics
Database | AutoUpdateStatistics
Database | AutoUpdateStatisticsAsynchronously
Database | DatafileAutoGrowthType
Database | Trustworthy
Database | OrphanedUser
Database | PseudoSimple
Database | AdHocWorkloads
Domain | DomainName
Domain | OrganizationalUnit
HADR | ClusterHealth
HADR | ClusterServerHealth
HADR |
HADR | System.Object[]
Instance | SqlEngineServiceAccount
Instance | SqlBrowserServiceAccount
Instance | TempDbConfiguration
Instance | AdHocWorkload
Instance | BackupPathAccess
Instance | DAC
Instance | NetworkLatency
Instance | LinkedServerConnection
Instance | MaxMemory
Instance | OrphanedFile
Instance | ServerNameMatch
Instance | MemoryDump
Instance | SupportedBuild
Instance | SaRenamed
Instance | DefaultBackupCompression
Instance | XESessionStopped
Instance | XESessionRunning
Instance | XESessionRunningAllowed
Instance | OLEAutomation
Instance | WhoIsActiveInstalled
LogShipping | LogShippingPrimary
LogShipping | LogShippingSecondary
Server | PowerPlan
Server | InstanceConnection
Server | SPN
Server | DiskCapacity
Server | PingComputer
MaintenancePlan | SystemFull
MaintenancePlan | UserFull
MaintenancePlan | UserDiff
MaintenancePlan | UserLog
MaintenancePlan | CommandLog
MaintenancePlan | SystemIntegrityCheck
MaintenancePlan | UserIntegrityCheck
MaintenancePlan | UserIndexOptimize
MaintenancePlan | OutputFileCleanup
MaintenancePlan | DeleteBackupHistory
MaintenancePlan | PurgeJobHistory

108 Configurations

One of the things I have been talking about in my presentation “Green is Good Red is Bad” is configuring Pester checks so that you do not have to keep writing new tests for the same thing but with different values.

For example, a different user for a database owner. The code to write the test for the database owner is the same but the value might be different for different applications, environments, clients, teams, domains etc. I gave a couple of different methods for achieving this.

With dbachecks we have made this much simpler enabling you to set configuration items at run-time or for your session and enabling you to export and import them so you can create different configs for different use cases

There are 108 configuration items at present. You can see the current configuration by running

which will show you the name of the config, the value it is currently set to and the description

You can see all of the configs and their descriptions here

Name | Description
agent.databasemailprofile | Name of the Database Mail Profile in SQL Agent
agent.dbaoperatoremail | Email address of the DBA Operator in SQL Agent
agent.dbaoperatorname | Name of the DBA Operator in SQL Agent
agent.failsafeoperator | Email address of the DBA Operator in SQL Agent
app.checkrepos | Where Pester tests/checks are stored
app.computername | List of Windows Servers that Windows-based tests will run against
app.localapp | Persisted files live here
app.maildirectory | Files for mail are stored here
app.sqlcredential | The universal SQL credential if Trusted/Windows Authentication is not used
app.sqlinstance | List of SQL Server instances that SQL-based tests will run against
app.wincredential | The universal Windows if default Windows Authentication is not used
command.invokedbccheck.excludecheck | Invoke-DbcCheck: The checks that should be skipped by default.
domain.domaincontroller | The domain controller to process your requests
domain.name | The Active Directory domain that your server is a part of
domain.organizationalunit | The OU that your server should be a part of
mail.failurethreshhold | Number of errors that must be present to generate an email report
mail.from | Email address the email reports should come from
mail.smtpserver | Store the name of the smtp server to send email reports
mail.subject | Subject line of the email report
mail.to | Email address to send the report to
policy.backup.datadir | Destination server data directory
policy.backup.defaultbackupcompreesion | Default Backup Compression check should be enabled $true or disabled $false
policy.backup.diffmaxhours | Maxmimum number of hours before Diff Backups are considered outdated
policy.backup.fullmaxdays | Maxmimum number of days before Full Backups are considered outdated
policy.backup.logdir | Destination server log directory
policy.backup.logmaxminutes | Maxmimum number of minutes before Log Backups are considered outdated
policy.backup.newdbgraceperiod | The number of hours a newly created database is allowed to not have backups
policy.backup.testserver | Destination server for backuptests
policy.build.warningwindow | The number of months prior to a build being unsupported that you want warning about
policy.connection.authscheme | Auth requirement (Kerberos, NTLM, etc)
policy.connection.pingcount | Number of times to ping a server to establish average response time
policy.connection.pingmaxms | Maximum response time in ms
policy.dacallowed | DAC should be allowed $true or disallowed $false
policy.database.autoclose | Auto Close should be allowed $true or dissalowed $false
policy.database.autocreatestatistics | Auto Create Statistics should be enabled $true or disabled $false
policy.database.autoshrink | Auto Shrink should be allowed $true or dissalowed $false
policy.database.autoupdatestatistics | Auto Update Statistics should be enabled $true or disabled $false
policy.database.autoupdatestatisticsasynchronously | Auto Update Statistics Asynchronously should be enabled $true or disabled $false
policy.database.filebalancetolerance | Percentage for Tolerance for checking for balanced files in a filegroups
policy.database.filegrowthexcludedb | Databases to exclude from the file growth check
policy.database.filegrowthtype | Growth Type should be 'kb' or 'percent'
policy.database.filegrowthvalue | The auto growth value (in kb) should be equal or higher than this value. Example: A value of 65535 means at least 64MB.
policy.database.logfilecount | The number of Log files expected on a database
policy.database.logfilesizecomparison | How to compare data and log file size, options are maximum or average
policy.database.logfilesizepercentage | Maximum percentage of Data file Size that logfile is allowed to be.
policy.database.maxvlf | Max virtual log files
policy.dbcc.maxdays | Maxmimum number of days before DBCC CHECKDB is considered outdated
policy.diskspace.percentfree | Percent disk free
policy.dump.maxcount | Maximum number of expected dumps
policy.hadr.tcpport | The TCPPort for the HADR check
policy.identity.usagepercent | Maxmimum percentage of max of identity column
policy.invaliddbowner.excludedb | Databases to exclude from invalid dbowner checks
policy.invaliddbowner.name | The database owner account should not be this user
policy.network.latencymaxms | Max network latency average
policy.ola.commandlogenabled | Ola's CommandLog Cleanup should be enabled $true or disabled $false
policy.ola.commandlogscheduled | Ola's CommandLog Cleanup should be scheduled $true or disabled $false
policy.ola.database | The database where Ola's maintenance solution is installed
policy.ola.deletebackuphistoryenabled | Ola's Delete Backup History should be enabled $true or disabled $false
policy.ola.deletebackuphistoryscheduled | Ola's Delete Backup History should be scheduled $true or disabled $false
policy.ola.installed | Checks to see if Ola Hallengren solution is installed
policy.ola.outputfilecleanupenabled | Ola's Output File Cleanup should be enabled $true or disabled $false
policy.ola.outputfilecleanupscheduled | Ola's Output File Cleanup should be scheduled $true or disabled $false
policy.ola.purgejobhistoryenabled | Ola's Purge Job History should be enabled $true or disabled $false
policy.ola.purgejobhistoryscheduled | Ola's Purge Job History should be scheduled $true or disabled $false
policy.ola.systemfullenabled | Ola's Full System Database Backup should be enabled $true or disabled $false
policy.ola.systemfullretention | Ola's Full System Database Backup retention number of hours
policy.ola.systemfullscheduled | Ola's Full System Database Backup should be scheduled $true or disabled $false
policy.ola.systemintegritycheckenabled | Ola's System Database Integrity should be enabled $true or disabled $false
policy.ola.systemintegritycheckscheduled | Ola's System Database Integrity should be scheduled $true or disabled $false
policy.ola.userdiffenabled | Ola's Diff User Database Backup should be enabled $true or disabled $false
policy.ola.userdiffretention | Ola's Diff User Database Backup retention number of hours
policy.ola.userdiffscheduled | Ola's Diff User Database Backup should be scheduled $true or disabled $false
policy.ola.userfullenabled | Ola's Full User Database Backup should be enabled $true or disabled $false
policy.ola.userfullretention | Ola's Full User Database Backup retention number of hours
policy.ola.userfullscheduled | Ola's Full User Database Backup should be scheduled $true or disabled $false
policy.ola.userindexoptimizeenabled | Ola's User Index Optimization should be enabled $true or disabled $false
policy.ola.userindexoptimizescheduled | Ola's User Index Optimization should be scheduled $true or disabled $false
policy.ola.userintegritycheckenabled | Ola's User Database Integrity should be enabled $true or disabled $false
policy.ola.userintegritycheckscheduled | Ola's User Database Integrity should be scheduled $true or disabled $false
policy.ola.userlogenabled | Ola's Log User Database Backup should be enabled $true or disabled $false
policy.ola.userlogretention | Ola's Log User Database Backup retention number of hours
policy.ola.userlogscheduled | Ola's Log User Database Backup should be scheduled $true or disabled $false
policy.oleautomation | OLE Automation should be enabled $true or disabled $false
policy.pageverify | Page verify option should be set to this value
policy.recoverymodel.excludedb | Databases to exclude from standard recovery model check
policy.recoverymodel.type | Standard recovery model
policy.storage.backuppath | Enables tests to check if servers have access to centralized backup location
policy.validdbowner.excludedb | Databases to exclude from valid dbowner checks
policy.validdbowner.name | The database owner account should be this user
policy.whoisactive.database | Which database should contain the sp_WhoIsActive stored procedure
policy.xevent.requiredrunningsession | List of XE Sessions that should be running.
policy.xevent.requiredstoppedsession | List of XE Sessions that should not be running.
policy.xevent.validrunningsession | List of XE Sessions that can be be running.
skip.backup.testing | Don't run Test-DbaLastBackup by default (it's not read-only)
skip.connection.ping | Skip the ping check for connectivity
skip.connection.remoting | Skip PowerShell remoting check for connectivity
skip.database.filegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.database.logfilecounttest | Skip the logfilecount test
skip.datafilegrowthdisabled | Skip validation of datafiles which have growth value equal to zero.
skip.dbcc.datapuritycheck | Skip data purity check in last good dbcc command
skip.diffbackuptest | Skip the Differential backup test
skip.logfilecounttest | Skip the logfilecount test
skip.logshiptesting | Skip the logshipping test
skip.tempdb1118 | Don't run test for Trace Flag 1118
skip.tempdbfilecount | Don't run test for Temp Database File Count
skip.tempdbfilegrowthpercent | Don't run test for Temp Database File Growth in Percent
skip.tempdbfilesizemax | Don't run test for Temp Database Files Max Size
skip.tempdbfilesonc | Don't run test for Temp Database Files on C

Running A Check

You can quickly run a single check by calling Invoke-DbcCheck.
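For example, with a made-up instance name:

Invoke-DbcCheck -SqlInstance sql0 -Check FailedJob
Invoke-DbcCheck -SqlInstance sql0 -Check LastGoodCheckDb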

Excellent, my agent jobs have not failed 🙂

That's good, all of my databases have had a successful DBCC CHECKDB within the last 7 days.

Setting a Configuration

To save me from having to specify the instance I want to run my tests against I can set the app.sqlinstance config to the instances I want to check.

Then whenever I call Invoke-DbcCheck it will run against those instances for the SQL checks

So now if I run
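Invoke-DbcCheck -Check LastDiffBackup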

I can see that I don't have a diff backup for the databases on both instances. Better stop writing this and deal with that!!

The configurations are stored in the registry but you can export them and then import them for re-use easily. I have written another blog post about that.

The Show Parameter

Getting the results of the tests on the screen is cool, but if you are running a lot of tests against a lot of instances then you might find that you have 3 failed tests out of 15000! This will mean a lot of scrolling through green text looking for the red text, and you may find that your PowerShell buffer doesn't hold all of your test results, leaving you very frustrated.

dbachecks supports the Pester Show parameter enabling you to filter the output of the results to the screen. The available values are Summary, None, Fails, Inconclusive, Passed, Pending and Skipped

In my opinion by far the most useful one is Fails, as this will show you only the failed tests with the context to enable you to see which tests have failed.
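Invoke-DbcCheck -Check Agent -Show Fails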

If we check all of the checks tagged as Agent we can easily see that most passed but The Job That Fails (surprisingly) failed. All of the other tests that were run for the agent service, operators, failsafe operator, database mail and all other agent jobs all passed in the example below

Test Results are for other People as well

It is all very well and good being able to run tests and get the results on our screen. It will be very useful for people to be able to validate a new SQL instance for example or run a morning check or the first step of an incident response. But test results are also useful for other people so we need to be able to share them

We have created a Power Bi Dashboard that comes with the dbachecks module to enable easy sharing of the test results. You can also send the results via email using Send-DbcMailMessage. We have an open issue for putting them into a database that we would love you to help resolve.

To get the results into PowerBi you can run
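Invoke-DbcCheck -AllChecks -Show Fails -PassThru | Update-DbcPowerBiDataSource -Environment Production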

This will run all of the dbachecks using your configuration for your Production environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Production

If you then used a different configuration for your development environment and ran

it will run all of the dbachecks using your configuration for your Development environment, output only the failed tests to the screen and save the results in your windows\temp\dbachecks folder with a suffix of Development and you would end up with two files in the folder

You can then simply run
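Start-DbcPowerBi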

and as long as you have the (free) Power Bi Desktop then you will see this. You will need to refresh the data to get your test results.

Of course it is Powerbi so you can publish this report. Here it is so that you can click around and see what it looks like

It’s Open Source – We Want Your Ideas, Issues, New Code

dbachecks is open source and available on GitHub for anyone to contribute to.

We would love you to contribute. Please open issues for new tests, enhancements and bugs. Please fork the repository and add code to improve the module. Please give feedback to make this module even more useful.

You can also come in the SQL Server Community Slack and join the dbachecks channel and get advice, make comments or just join in the conversation

Further Reading

There are many more introduction blog posts covering different areas at

Thank You

I want to say thank you to all of the people who have enabled dbachecks to get this far. These wonderful people have used their own time to ensure that you have a useful tool available to you for free

Chrissy Lemaire @cl

Fred Weinmann @FredWeinmann

Cláudio Silva @ClaudioESSilva

Stuart Moore @napalmgram

Shawn Melton @wsmelton

Garry Bargsley @gbargsley

Stephen Bennett @staggerlee011

Sander Stad @SQLStad

Jess Pomfret @jpomfret

Jason Squires @js0505

Shane O’Neill @SOZDBA

Tony Wilhelm @TonyWSQL

and all of the other people who have contributed in the dbachecks Slack channel

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and Devops Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

 

You will find all of the resources and the scripts to do all of the below in the GitHub repo. So clone it and navigate to the filepath

Create Database

First let's create a database. Connect to your Azure subscription
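A sketch with the AzureRM cmdlets that were current at the time (the newer Az module has direct equivalents):

Login-AzureRmAccount
# pick the subscription to use if you have more than one
Set-AzureRmContext -SubscriptionId 'your-subscription-id'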

01 - subscription.png

Then set some variables
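An example of the sort of variables involved (all of the names and the IP here are examples):

$ResourceGroupName = 'psconfeutweets-rg'
$Location = 'West Europe'
$AzureSQLServer = 'psconfeutweets'       # must be globally unique
$DatabaseName = 'TweetAnalysis'
# $Login = 'sqladmin'
# $Password = 'A-Str0ng-Pa55word!'
$MyIP = '1.2.3.4'                        # the IP to allow through the firewall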

They should all make sense; take note that you need to set and uncomment the login and password and choose which IPs to allow through the firewall.

Create a Resource Group
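New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location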

02 - resource group.png

Create a SQL Server
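# build a credential from the login and password variables set above
$Cred = New-Object System.Management.Automation.PSCredential ($Login, (ConvertTo-SecureString $Password -AsPlainText -Force))
New-AzureRmSqlServer -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -Location $Location -SqlAdministratorCredentials $Cred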

03 - create server.png

Create a firewall rule; I just use my own IP and also allow the Azure IPs
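# one rule for my own IP and one to let the Azure services in
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -FirewallRuleName 'MyIP' -StartIpAddress $MyIP -EndIpAddress $MyIP
New-AzureRmSqlServerFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -AllowAllAzureIPs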

03a - firewall rule.png

Create a database
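New-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $AzureSQLServer -DatabaseName $DatabaseName -Edition Basic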

04 - create database.png

I have used the dbatools module to run the scripts to create the database. You can get it using
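Install-Module -Name dbatools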

Run the scripts

05 - Create Table Sproc.png

This will have created the following in Azure, you can see it in the portal

07 - portal.png

You can connect to the database in SSMS and you will see

06 - show table.png

Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First login (if you need to) and set some variables

Then create the API and get the key
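A sketch using the AzureRM CognitiveServices cmdlets; the account name, sku and location are examples, so check what is available in your subscription:

$CSName = 'TweetSentiment'
New-AzureRmCognitiveServicesAccount -ResourceGroupName $ResourceGroupName -Name $CSName -Type TextAnalytics -SkuName F0 -Location 'West Europe'
Get-AzureRmCognitiveServicesAccountKey -ResourceGroupName $ResourceGroupName -Name $CSName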

You will need to accept the prompt

08 -cognitive service

Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!

09 cognitiveservice key

 

Create the Flow

I have exported the Flow to a zip file and also the json for a PowerApp (no details about that in this post). Both are available in the Github repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text

16 - import step 3.png

Click Add and fill in the Account Key and the Site URL from the steps above

17 import step 5.png

click new connection and search for SQL Server

18 - import step 6.png

Enter the SQL Server Name (value of $AzureSQLServer), Database Name, User Name and Password from the steps above

19 - import step 7.png

Click new Connection and search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow, click Import

11 - import flow.png

12 - choose import.png

and choose the import.zip from the Github Repo

13 import step 1.png

 

Click on Create as new and choose a name

14 - import step 2.png

Click select during import next to Sentiment and choose the Sentiment connection

15 impot step 3.png

Select during import for the SQL Server Connection and choose the SQL Server Connection and do the same for the Twitter Connection

20 - import stpe 8.png

Then click import

21 - imported.png

Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.

22 - importblank.png

 

Choose the trigger When a New Tweet is posted and add a search term. You may need to choose the connection to twitter by clicking the three dots

23 - importblank 1.png

Click Add an action

24 - add action.png

search for detect and choose the Text Analytics Detect Sentiment

25 - choose sentuiment.png

Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them, you can retrieve the key again with Get-AzureRmCognitiveServicesAccountKey.

26 - enter details.png

Click in the text box and choose Tweet Text

27 - choose tweet text.png

Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure

28 - choose sql server execute stored procedure.png

Choose the stored procedure [dbo].[InsertTweet]

29 - choose stored procedure.png

Fill in as follows

  • __PowerAppsID__         0
  • Date                                 Created At
  • Sentiment                      Score
  • Tweet                              Tweet Text
  • UserLocation                 Location
  • UserName                      Tweeted By

as shown below

30 stored procedure info.png

Give the flow a name at the top and click save flow

31 flow created.png

Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings

32 change data source.png

Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes

33 apply changes.png

It will then pop-up with a prompt for the credentials. Choose Database and enter your credentials and click connect

34 - creds.png

and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn’t run yet. If it does just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc and then you can publish the report.

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.

Using the AST in Pester for dbachecks

TagLine – My goal – Chrissy will appreciate Unit Tests one day 🙂

Chrissy has written about dbachecks, the new up-and-coming community-driven open source PowerShell module for SQL DBAs to validate their SQL Server estate. We have taken some of the ideas that we have presented about a way of using dbatools with Pester to validate that everything is how it should be and placed them into a metadata-driven framework to make things easy for anyone to use. It is looking really good and I am really excited about it. It will be released very soon.

Chrissy and I will be doing a pre-con at SQLBits where we will talk in detail about how this works. You can find out more and sign up here

Cláudio Silva has improved my PowerBi For Pester file and made it beautiful and whilst we were discussing this we found that if the Pester Tests were not formatted correctly the Power Bi looked … well rubbish to be honest! Chrissy asked if we could enforce some rules for writing our Pester tests.

The rules were

  • The Describe title should be in double quotes
  • The Describe should use the plural Tags parameter
  • The Tags should be singular
  • The first Tag should be a unique tag in Get-DbcConfig
  • The Context title should end with $psitem
  • The code should use Get-SqlInstance or Get-ComputerName
  • The code should use the ForEach method
  • The code should not use $_
  • The code should contain a Context block

She asked me if I could write the Pester Tests for it and this is how I did it. I needed to look at the Tags parameter for the Describe. It occurred to me that this was a job for the Abstract Syntax Tree (AST). I don't know very much about this but I sort of remembered reading a blog post by Francois-Xavier Cat about using it with Pester, so I went and read that and found an answer on Stack Overflow as well. These looked just like what I needed so I made use of them. Thank you very much to Francois-Xavier and wOxxOm for sharing.

The first thing I did was to get the Pester Tests which we have located in a checks folder and loop through them and get the content of the file with the Raw parameter
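A sketch of that loop (the path is an example):

$CheckFiles = (Get-ChildItem -Path .\checks -Filter *.Tests.ps1).FullName
foreach ($File in $CheckFiles) {
    # Raw gives one string per file, ready to be parsed
    $check = Get-Content -Path $File -Raw
    # ... parse $check as shown next
}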

Then I decided to look at the Describes using the method that wOxxOm (I know no more about this person!) showed.
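A sketch of that approach, adapted from the Stack Overflow answer, so treat it as illustrative rather than the exact code in the module:

$ast = [System.Management.Automation.Language.Parser]::ParseInput($check, [ref]$null, [ref]$null)
$Describes = $ast.FindAll( {
        param ($node)
        $node -is [System.Management.Automation.Language.CommandAst] -and
        $node.GetCommandName() -eq 'Describe'
    }, $true) | ForEach-Object {
    $elements = $PSItem.CommandElements
    # the title is the second string element - the first is the word Describe itself
    $title = @($elements | Where-Object { $PSItem.StaticType.Name -eq 'String' })[1]
    # the element that follows the -Tags parameter holds the tag values
    $tagsParam = $elements | Where-Object { $PSItem.ParameterName -like 'Tag*' } | Select-Object -First 1
    [PSCustomObject]@{
        Name = $title.Extent.Text
        Tags = if ($tagsParam) { $elements[$elements.IndexOf($tagsParam) + 1].Extent.Text }
    }
}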
As I understand it, this code is using the Parser on the $check (which contains the code from the file) and finding all of the Describe commands and creating an object of the title of the Describe with the StaticType equal to String and values from the Tag parameter.

When I ran this against the database tests file I got the following results

Then it was a simple case of writing some tests for the values

The Describes variable is inside @() so that if there is only one the ForEach Method will still work. The unique tags are returned from our command Get-DbcCheck which shows all of the checks. We will have a unique tag for each test so that they can be run individually.

Yes, I have tried to ensure that the tags are singular by ensuring that they do not end with an s (apart from statistics) and so had to not check BackupPathAccess and statistics. Filename is a variable that we add to each Describe's Tags so that we can run all of the tests in one file. I added a little if block to the Pester test as well so that the error, if the Tags parameter was not passed, was more obvious.

I did the same with the context blocks as well

This time we look for the Context command and ensure that the string value ends with psitem as the PowerBi parses the last value when creating columns
Finally I got all of the code and check if it matches some coding standards

I trim the title from the Describe block with some regex so that it is easy to see where the failures (or passes) are, and then loop through each statement apart from the first line to ensure that the code is using our internal commands Get-SQLInstance or Get-ComputerName to get information, that we are looping through each of those arrays using the ForEach method rather than ForEach-Object, that we are using $psitem rather than $_ to reference the “This Item” in the array, and that each Describe block has a Context block.

This should ensure that any new tests that are added to the module follow the guidance we have set up on the Wiki and ensure that the Power Bi results still look beautiful!

Anyone can run the tests using
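The path here is a guess at the layout, but the idea is simply:

Invoke-Pester .\tests\Unit.Tests.ps1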

before they create a Pull request and it looks like
if everything is Green then they can submit their Pull Request 🙂 If not they can see quickly that something needs to be fixed. (fail early 🙂 )
03 fails.png

A Pretty PowerBi Pester Results Template File

I have left the heat and humidity of Singapore where I have been presenting at the PowerShell Conference Asia and DevOpsDays Singapore to travel to Seattle for PASS Summit. During my Green is Good – Red is Bad session someone asked me if the PowerBi that I showed at the end would work with any Pester Test Results object and I said (without thinking) that it would.

It turns out that the PowerBi that I had set up for that session will work with my function to run Pester Tests against an Ola Hallengren installation but some of the formatting and custom columns were specific to that test.

I said that I would share a Power Bi file that people could plug any Pester Test Results into. This is the first iteration of that. I doubt that it will work for every single test but I think it will be a good starting point for people to use.

This is how to use it

Download the file from here.

Run your Pester Tests using the PassThru Parameter and set the results to a variable, you can also use the Show Parameter to reduce the output of the tests to the screen (and also speed up the tests)
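For example (the script path is a placeholder):

$PesterResults = Invoke-Pester -Script C:\Path\To\Your\Tests -Show Summary -PassThru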

Then we convert the $PesterResults object into a JSON file
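Something like this (the depth and output path are examples):

$PesterResults | ConvertTo-Json -Depth 5 | Out-File -FilePath C:\temp\PesterResults.json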

Open the Power Bi file you downloaded

  • Click Home
  • Click the words “Edit Queries”
  • Click Data Source Settings
  • Highlight the filename and click Change Source
  • Navigate to the JSON file you just created, click OK and Close and then apply the changes.

Which will load the data from the JSON file and display your pester results. You can then save this file with a new name and keep the template for other tests.

It’s not going to be perfect

It’s not going to work in all circumstances and I expect that with some test results it will display the results in a less than optimal manner but you should be able to modify this to suit your needs.

Please give it a try and see how you get on

Here is a sample report created with Demo 1 from my Green is Good session

You can click around and change the data you can see and also look at the other 4 pages

Here is another one that I created using my dbatools-scripts repo and a config file. Again, have a click around and see what it does.

 

I also created a quick video showing the process too which I will upload when I am not at 35000 feet!!

Enjoy 🙂 Also, let me know if you think it would be better to have the file in Github which would allow contributions but it would only be seen as a binary file and therefore merging will be difficult. I am happy to do so.

TSQL2sDay – Get-PostRoundup

First an apology, this round up is late!

The reason for that is an error in the PowerShell testing module Pester (That’s not completely true as you shall see!!)

I spoke in Stuttgart at the PowerShell Saturday last weekend and had intended to write this blog post whilst travelling, unfortunately I found a major error in Pester (again not strictly true but it makes a good story!!)

I explained it with this slide in my presentation

Yep, I forgot to pack my NUC with my VMs on it and had to re-write all my demos!!

But anyway, on to the TSQL2sDay posts

What a response. You wonderful people. I salute you with a Rimmer salute

There are 34 TSQL2sDay posts about dbatools, about starting with PowerShell, If you should learn PowerShell, SSAS, SSRS, Log Shipping, backups, restores, Pester, Default settings, best practices, migrations, Warnings in Agent Jobs, sqlpackage, VLFs, CMS, Disabling Named Pipes, Orphaned users, AG Status, AG Agent Jobs, logging, classes, auditing, copying files, ETL and more.

I am really pleased to see so many first timers to the TSQL2sDay blog monthly blog party. Please don’t let this be your only TSQL2sDay post. Come back next month and write a post on that topic.

Here they are below in the media of tweets, so that you can also go and follow these wonderful people who are so willing to share their knowledge. Say thank you to them, ask them questions, interact.

Learn, Share, Network

Volker wrote about testing best practices with dbatools

Dave explains why PowerShell is so useful to him in his ETL processes

Steve writes about the time he has saved using PowerShell to automate restores and audit SQL Server instances

Nate talks about copying large files like SQL Server backups using BITS with PowerShell

Warren talks about his experience as a beginner, the amount of things he automates and his DBReboot module

THANK YOU every single one and apologies if I have missed anyone!

 

 

Enabling Cortana for dbareports PowerBi

Last week at the Birmingham user group I gave a presentation about PowerShell and SQL Server

saved-image-from-tweetium-8

It was a very packed session as I crammed in the new sqlserver module, dbatools and dbareports 🙂 On reflection I think this is a bit too much for a one hour session but at the end of the session I demo’d live Cortana using the dbareports dataset and returning a Cortana PowerBi page.

As always it took a couple of goes to get it right but when it goes correctly it is fantastic. I call it a salary increasing opportunity! Someone afterwards asked me how it was done so I thought that was worth a blog post

There is a video below but the steps are quite straightforward.

Add Cortana Specific Pages

Whilst you can just enable Cortana to access your dataset (as shown later in this post), which lets Cortana search available datasets and return an appropriate visualisation, it is better to provide specific pages for Cortana to use and display. You can do this in PowerBi Desktop.

Start by adding a new page in your report by clicking on the plus button

 

add page.PNG

and then change the size of the report page by clicking on the paintbrush icon in the Visualizations pane.

page-size

This creates a page that is optimised for Cortana to display, and it will also be the first place that Cortana looks to answer the question.

Power BI first looks for answers in Answer Pages and then searches your datasets and reports for other answers and displays them in the form of visualizations. The highest-scoring results display first as best matches, followed by links to other possible answers and applications. Best matches come from Power BI Answer Pages or Power BI reports.

Rename the page so that it contains the words or phrase you expect to be in the question, such as “Servers By Version”. You will help Cortana and PowerBi return better results if you use some of the column names from your dataset.

Then it is just another report page and you can add visualisations just like any other page

cortana page.PNG

Make Cortana work for you and your users

If your users are likely to use a number of different words in their questions, you can assist Cortana to find the right answer by adding alternate names. So if your page is sales by store, you might add shop, building, results, amount, orders. This is also useful when Cortana doesn’t understand the correct words; as you will notice in the screenshot below, I have added “service” for “servers” and “buy” for “by” to help get the right answer. You can add these alternate words by clicking the paintbrush under Visualizations and then Page Information.

cortana-additional

Publish your PBIX file to PowerBi.com

You can publish your PowerBi report to PowerBi.com either via the Publish button in PowerBi Desktop

publish

or by using the PowerBiPS module

Install-Module -Name PowerBIPS
# Grab the token; this will require a sign in
$authToken = Get-PBIAuthToken -Verbose
Import-PBIFile -authToken $authToken -filePath "Path to PBIX file" -Verbose

Enable Cortana

In your browser log into https://powerbi.com and then click on the cog and then settings

powerbicom.PNG

then click on Datasets

settings

Then choose the dataset – in this case the dbareports SQL Information sample – and click the tick box to allow Cortana to access this dataset, then click Apply.

dataset settings.PNG

Use Cortana against your PowerBi data

You can type into the Cortana search box and it will offer the opportunity for you to choose your PowerBi data

cortana-search

but it is so much better when you let it find the answer 🙂

cortana-search-1

and if you want to go to the PowerBi report there is a handy link at the bottom of the Cortana page

cortana-search-2

I absolutely love this. I was so pleased when I got it to work, and the response when I show people is always one of wonder, for techies and non-techies alike.

The conditions for Cortana to work

You will need to have added your work or school Microsoft ID to the computer or phone that you want to use Cortana on, and that account must be able to access the dataset, either because it is the dataset owner or because a dashboard using that dataset has been shared with that account.

From this page on PowerBi.com

When a new dataset or custom Cortana Answer Page is added to Power BI and enabled for Cortana it can take up to 30 minutes for results to begin appearing in Cortana. Logging in and out of Windows 10, or otherwise restarting the Cortana process in Windows 10, will allow new content to appear immediately.

It’s not perfect!

When you start using Cortana to query your data you will find that at times it is very frustrating. My wife was in fits of giggles listening to me trying to record the video below as Cortana refused to understand that I was saying “servers” and repeatedly searched Bing for “service”. Whilst you can reduce the effect by using the alternate names in the Q and A settings, it is still a bit hit and miss at times.

It is amazing

There is something about giving people the ability to just talk to their device in a meeting and for example with dbareports ask

Which clients are in Bolton

or

When was the last backup for client The Eagles

and get the information they require and a link to the report in PowerBi.com. I am certain that the suits will be absolutely delighted at being able to show off in that way which is why I call it a salary increasing opportunity 🙂

We would love YOU to come and join us at the SQL Community Collaborative

Help us make dbatools, dbareports and Invoke-SQLCmd2 even better. You can join in by forking the repos in GitHub, writing your code and then submitting a PR, but we would much rather that you came and discussed new requests in our Trello boards, raised issues in GitHub and generally discussed the modules in the SQL Server Community Slack #dbatools #dbareports. We are also looking for assistance with our wiki pages, Pester tests and AppVeyor integration for our builds, and any comments people want to make.

SQL Server Collaborative GitHub Organisation holding the modules. Go here to raise issues, fork the repositories or download the code

dbatools Trello for discussion about new cmdlets

SQL Server Community Slack where you can find #dbatools and #dbareports as well as over 1100 people discussing all aspects of the Data Platform, events, jobs, presenting

COME AND JOIN US

 

PowerBi and API – Visualising my Checkins

For my own amusement and also to show my wife where I have been I use the Swarm check-in app on my phone and check-in to places. Also for my own amusement I used PowerBi to visualise the data via the API and allow me to filter it in various ways.

Whilst at the PowerShell Conference in Asia I was showing the mobile app to a group over some food and saying how easy it was and June Blender, the mother of PowerShell help, said that I ought to blog about it. So I have 🙂

Follow these steps and you can create this report.

powerbi8.PNG

You can also download the blank report and add your own access token to it should you wish. Details at the end of the post.

I am using the swarm API but the principle is the same for any other API that provides you with data. For example, I used the same principles to create the embedded reports on the PASS PowerShell Virtual Chapter page showing the status of the cards suggesting improvements to the sqlserver module for the product team to work on. Hopefully, this post will give you some ideas to work on and show you that it is quite easy to get excellent data visualisation from APIs

First up we need to get the data. I took a look at the Swarm developers page (the Trello is here, by the way). I had to register for an app, which gave me a client id and a secret. I then followed the steps here to get my user token. I was only interested in my own check-ins, so I used the steps under Token flow Client applications to get my access token, which I used in a URL like this.

https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=ACCESS_TOKEN&v=YYYYMMDD

I added the limit of 5000 as the default number of check-ins returned was too small for my needs, and the date was that day’s date.

You can do this in Powershell using code I got from the magnificent Stephen Owen’s blog post

## Enter the details
$Clientid = ''  ## Enter ClientId from foursquare
$redirect = ''  ## Enter redirect url from client app in foursquare
## Create the URL:
$URL = "https://foursquare.com/oauth2/authenticate?client_id=$Clientid&response_type=token&redirect_uri=$redirect"
## function adapted from https://foxdeploy.com/2015/11/02/using-powershell-and-oauth/
Function Show-OAuthWindow {
    param ([string]$URL)
    Add-Type -AssemblyName System.Windows.Forms
    $form = New-Object -TypeName System.Windows.Forms.Form -Property @{Width = 440; Height = 640}
    $web  = New-Object -TypeName System.Windows.Forms.WebBrowser -Property @{Width = 420; Height = 600; Url = $URL}
    $DocComp = {
        $Global:uri = $web.Url.AbsoluteUri
        ## close the window once the redirect contains an error or the access token
        if ($Global:uri -match "error=[^&]*|access_token=[^&]*") { $form.Close() }
    }
    $web.ScriptErrorsSuppressed = $true
    $web.Add_DocumentCompleted($DocComp)
    $form.Controls.Add($web)
    $form.Add_Shown({ $form.Activate() })
    $form.ShowDialog() | Out-Null
}
## login to get an access token, then the redirect window closes itself
Show-OAuthWindow -URL $URL
## grab the token from the redirect URL
$regex = '(?<=access_token=)(.*)'
$global:AuthToken = ($Global:uri | Select-String -Pattern $regex).Matches[0].Value
Write-Output "Received a token, $AuthToken"
Write-Output "So the URL for your PowerBi Data is :-"
$PowerBiUrl = "https://api.foursquare.com/v2/users/self/checkins?limit=5000&oauth_token=$AuthToken&v=20160829"
$PowerBiUrl | clip

I checked the URL in a browser and confirmed that it returned a json object. Keep that URL safe; you will need it in a minute. The code above has placed it in your clipboard. If you want to jump straight to the report using the download, stop here and go to the end.
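If you would rather check from PowerShell than a browser, a quick sketch along these lines (my addition here, not part of the original steps, and assuming $PowerBiUrl is still set from the script above) confirms that the URL returns JSON and shows a few of the check-in properties we will expand in Power BI

$response = Invoke-RestMethod -Uri $PowerBiUrl
## the check-ins sit under response.checkins.items in the Foursquare JSON
$response.response.checkins.items |
    Select-Object -First 5 createdAt,
        @{Name = 'VenueName'; Expression = { $_.venue.name }},
        @{Name = 'City'; Expression = { $_.venue.location.city }}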

So now let’s move to Power BI. Go to powerbi.com and download PowerBi Desktop. It’s free. You will need to create an account using a school or work email address if you wish to put your reports on powerbi.com

Once you have downloaded and installed PowerBi Desktop you will be faced with a window like this

powerbi

Start by clicking Get Data

powerbi2

Then choose Web, paste the URL from above into the box and press OK, which will give you this

powerbi3

Now we need to put the data into a format that is of more use to us

power1

I clicked on the Record link for response, then Convert to Table, then the little icon at the top of the column to expand the Value.items column, and then the Value.items column again. It doesn’t look much yet, but we are a step closer.

Next I looked in the table for the venue column, expanded that and the location column and the formatted address column.

power2

You can also expand the categories so that you can look at those too by expanding Value.items.venue.categories and Value.items.venue.categories1

powerbi4.gif

Now you will see that we have some duplicates in the data, so we need to remove those. I did that by deleting the first three columns and then clicking Remove Duplicates under Delete Rows.

power3b.gif

Then click Close and Apply. Next, click on the Data button, as we need to rename and remove some more columns so that our data makes a little sense. I renamed the columns like this

Value.items.createdAt –> CreatedAt
Value.items.shout –> Comment
Value.items.venue.name –> VenueName
Value.items.venue.location.address –> VenueAddress
Value.items.timeZoneOffset –> TimeZoneOffset
Value.items.venue.location.lat –> VenueLat
Value.items.venue.location.lng –> VenueLong
Value.items.venue.location.postalCode –> VenuePostalCode
Value.items.venue.location.cc –> CountryCode
Value.items.venue.location.city –> City
Value.items.venue.location.state –> State
Value.items.venue.location.country –> Country
Value.items.venue.location.formattedAddress –> VenueAddress
Value.items.venue.url –> VenueURL
Value.items.venue.categories.name –> Category
Value.items.venue.categories.pluralName –> Categories

and remove all of the other columns. You can also do this in the Edit Queries window; I am just showing you that there are multiple ways to do the same thing.
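As an aside, if you would rather shape the data before it ever reaches Power BI, a rough PowerShell alternative like the one below (my own sketch based on the same Foursquare JSON shape, not the approach used in this post) flattens the same properties into a CSV that Power BI can import instead

$data = Invoke-RestMethod -Uri $PowerBiUrl
$data.response.checkins.items | ForEach-Object {
    [PSCustomObject]@{
        CreatedAt       = $_.createdAt
        Comment         = $_.shout
        VenueName       = $_.venue.name
        VenueAddress    = ($_.venue.location.formattedAddress -join ', ')
        TimeZoneOffset  = $_.timeZoneOffset
        VenueLat        = $_.venue.location.lat
        VenueLong       = $_.venue.location.lng
        VenuePostalCode = $_.venue.location.postalCode
        CountryCode     = $_.venue.location.cc
        City            = $_.venue.location.city
        State           = $_.venue.location.state
        Country         = $_.venue.location.country
        Category        = $_.venue.categories[0].name      ## first category only
        Categories      = $_.venue.categories[0].pluralName
    }
} | Export-Csv -Path .\checkins.csv -NoTypeInformation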

powerbi5.gif

Once you have done that you should have a window that looks like this. Notice I renamed the query to checkins as well

powerbi4.PNG

Now we need to create a calculated column for the time and a measure for the count of checkins. This is done using this code

Time = VAR UnixDays = [createdAt]/(60*60*24)
RETURN (DATEVALUE("1/1/1970") + UnixDays)

CountCheckins = COUNT(checkins[Time])
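The createdAt value is a Unix timestamp in seconds, so the DAX above divides by 86400 to turn it into days and adds those days to 1 January 1970. If you want to sanity check a value outside Power BI, a quick PowerShell equivalent (my addition, using a made-up example timestamp) looks like this

## example timestamp only; substitute one of your own createdAt values
$createdAt = 1472463600
## .NET helper for the same conversion
[DateTimeOffset]::FromUnixTimeSeconds($createdAt).UtcDateTime
## or mirroring the DAX arithmetic directly: days since 1 Jan 1970
(Get-Date -Date '1970-01-01').AddDays($createdAt / (60 * 60 * 24))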

Now we can move onto the report side of things. First we are going to download a custom visual. Go to the PowerBi Custom Visuals Page and download the Timeline visual.

powerbi5.PNG

and then import it into your PowerBi report. I have embedded a YouTube video below showing the steps I took to turn this into the PowerBi report. It’s pretty easy; you will be able to click on the visuals, then click on the data columns and alter them until you have the report that you want.

Once you have done this, you can upload it to PowerBi if you wish by clicking on the Publish button in PowerBi desktop and signing into PowerBi.com with your work email address.

powerbi6.PNG

and your report is available for you on PowerBi.com 🙂 By clicking on the pins on a visualisation you can add them to a dashboard.

powerbi8.gif

Once you have a dashboard you can then use the natural language query to ask questions of your data. Here are some examples

How many checkins are in GB
How many checkins are in airports
How many checkins by month
How many checkins by month in GB
Which airports
Show me hotel venuename and time
How many hotels by country
Show me hotel venuename and checkins count
metro stations venuename and count checkins as a map
Show me count checkins in Amsterdam by category as a donut

powerbi7.PNG

If you want to use the blank report, download it from here, open it in PowerBi Desktop, click Edit Queries and then Source, add your own URL, then click Apply and Refresh.

powerbi9.gif

Hopefully, this has given you some ideas of ways that you can create reports from the many data sources available to you via an API.