Populating My DBA Database for Power Bi with PowerShell – Databases

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

db1

db2

There are so many items of data available that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post.

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT.
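If you want to do the same, the event log and its source only need to be created once on the machine that runs the script. A minimal sketch of that one-off step, using the built-in cmdlets and the log and source names above, would be (run from an elevated session):

# One-off setup: create the event log and source used by the script (requires an elevated session)
New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT
# The script can then write entries like this
Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 1 -EntryType Information -Message "DBADatabase Databases update started"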

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works, how to create the report, and how to publish it to powerbi.com and query it with natural language.
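The script also assumes that the SQL Server SMO assemblies and the Invoke-Sqlcmd cmdlet are available. A SQL Agent PowerShell job step loads the sqlps module for you; if you run the script elsewhere, a minimal sketch of loading what is needed looks like this:

# Load SMO and the SQL PowerShell module if they are not already available
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
Import-Module sqlps -DisableNameChecking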

I create a function called Catch-Block to save keystrokes, and I put my commands inside a try/catch to make the scripts as robust as possible. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port.
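I haven't shown Catch-Block itself here, but it is only a few lines. A minimal sketch, assuming the Write-Log function and the SQLAutoScript event log described above (your logging parameters may differ), would be:

function Catch-Block
{
param ([string]$Additional)
# Write the failure and the last error to the log file and the event log
$Message = $Additional + ' ' + $Error[0].Exception.Message
Write-Log -Path $LogFile -Message $Message -Level Error
Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 1 -EntryType Error -Message $Message
}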

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
$InstanceName = $ServerName | Select InstanceName -ExpandProperty InstanceName
$Port = $ServerName | Select Port -ExpandProperty Port
$ServerName = $ServerName | Select ServerName -ExpandProperty ServerName
$Connection = $ServerName + '\' + $InstanceName + ',' + $Port

try
{
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object. The code above creates an empty SMO Server object with the Name property set to the $Connection variable if you cannot connect to that server, and does not error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no particular justification for choosing that property; you could choose any one, but that is the one I have always used. I use an if statement to do this. (This post about Snippets will show you the best way to learn PowerShell code.) The reference I use for exiting a loop in the way that you want is this one; in this case we use continue to carry on iterating the loop.

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases, just remove everything after the pipe symbol. If you wish to exclude certain databases from the collection (maybe the database you keep your Change log table and DBA Team info in), you can do that here as well.

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name
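If you want to do both at once, skipping the system databases and also excluding a named database, the two conditions can simply be combined in the Where-Object script block (EXCLUDENAME is just a placeholder):

foreach($db in $srv.databases | Where-Object {$_.IsSystemObject -eq $false -and $_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name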

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a $db and then to select all of the properties so that you can see the values and identify the ones you want. Again, this is a good thing to do when exploring new objects.

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screenshot below that there are 170 properties available to you on a SQL 2014 instance. You can gather any or all of that information, as long as you ensure that you have columns with the correct data types in your table and that your script has logic to deal with properties that do not exist, although I have had fewer issues with this for the database object than for the server object.

db3
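Where a property may not be supported or populated on older versions of SQL Server, a simple guard keeps the collection from failing on those instances. This is just a sketch of the idea, using AvailabilityGroupName as the example and falling back to null rather than erroring:

# Guard a property that may not be supported on an older instance
$AGName = $null
try { $AGName = $db.AvailabilityGroupName } catch { $AGName = $null }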

You can look for the property that you want by using the Get-Member cmdlet as shown above, by using MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing “Powershell SMO” and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post, checking the Databases table for an existing entry for that database, updating the row if it exists and inserting one if it does not.
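In outline, that update-or-insert step looks something like the sketch below. This is a trimmed illustration with only a couple of columns rather than the full script, and it assumes you have already looked up $InstanceID for the instance and gathered values such as $SizeMB from the SMO database object:

# Sketch of the update-or-insert pattern for each database (column list trimmed for brevity)
$Query = @"
IF EXISTS (SELECT 1 FROM [Info].[Databases] WHERE [Name] = '$Name' AND [InstanceID] = $InstanceID)
    UPDATE [Info].[Databases]
    SET [DateChecked] = GETDATE(), [SizeMB] = $SizeMB
    WHERE [Name] = '$Name' AND [InstanceID] = $InstanceID
ELSE
    INSERT INTO [Info].[Databases] ([InstanceID],[Name],[DateAdded],[DateChecked],[SizeMB])
    VALUES ($InstanceID,'$Name',GETDATE(),GETDATE(),$SizeMB)
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query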

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table, as this (in general) needs a human being to understand it. When the script runs every night it will pick up new databases, and I add a default value of “Not Entered” to the table, which makes it easier to identify the databases that need this additional work. (This also means that as a Team Leader I can monitor that my team are doing this.) It can also be added to any scripts which create new databases for deployment.
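For the deployment-script case, that could be as simple as the rough sketch below. It assumes the Clients table already contains a 'Not Entered' row and that the deployment script knows the DatabaseID and InstanceID of the database it has just created:

# Sketch: give a newly created database the default 'Not Entered' client so it shows up for follow-up
$Query = @"
INSERT INTO [dbo].[ClientDatabaseLookup] ([ClientID],[DatabaseID],[InstanceID],[Notes])
SELECT ClientID, $DatabaseID, $InstanceID, 'Added by deployment script'
FROM [dbo].[Clients]
WHERE ClientName = 'Not Entered'
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query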

Then we need to create some measures and calculated columns for our report. I did this as I realised I needed them while making the report, rather than all up front.

I created two calculated columns for the size of the databases, one for Gb and one for Tb, by clicking on the data icon on the left and then New Column

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I created a donut chart to show the size of the databases in Gb by client (and no, my real clients are not rock bands 🙂 ) as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

db4
The other donut chart shows the number of clients per location (and those are SQL User Group locations in the UK and my hometown, Bolton).

db5

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

db6

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan I created a colourful slicer for environment and the client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases then this report is correct but I doubt that’s what you want 🙂

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

db7

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69Gb of data in Exeter and it isn’t mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows the data is filtered but it is not shown visually very well as there are over a thousand databases using full recovery model.

db8

If we are asked about the Integration environment we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 Gb of data. It is also easier to see the compatibility level and recovery model as the ratios are larger.

db9

Once we have created the report in the way that we want, we can publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the Publish button and entering your Power BI credentials, but if you want your data to refresh automatically (and that is the point of the exercise, to remove manual work) then you will need to install and configure the Power BI gateway and schedule a refresh. I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation and a pop up will ask you which dashboard you wish to pin it to (Another recent update to Power Bi)

db10

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect but it is getting better all the time

db11

You have to remember to use the names of the columns correctly

db12

But once you have the query correct you can alter it by adding “as a VISUALISATION” and choosing the visualisation

db13

db14

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

db15

This doesn’t work quite as you expect

db16

But this does

db17

How about this (and yes it felt wrong to type!)

db18

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

db19

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

db20

Or the DBA team manager might choose this one

db21

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee 🙂

As always if you have any questions or comments please feel free to post them on the blog.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs


#TSQL2sDay Why My Head is Always in The Cloud

Today's post is my first for the TSQL2sDay series. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. If you want to catch up on all the fun to date, check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Azure Balloon - Credit http://owenrichardson.com/

This one is hosted by Jorge Segarra (@SQLChicken), who said: “This month’s topic is all about the cloud. What’s your take on it? Have you used it? If so, let’s hear your experiences. Haven’t used it? Let’s hear why or why not. Do you like/dislike recent changes made to cloud services? It’s clear skies for writing! So let’s hear it folks, where do you stand with the cloud?”

My wife would tell you that my head is always in the cloud and she’s right (she usually is), just not like that picture! I would love to float gracefully above the land and gaze upon the view, but it’s the landing that bothers me and will always stop me from trying it.

Credit http://owenrichardson.com/

She’s right, pedantically and literally too, because this year I have spent a lot of time with my head, my fingers and my thinking in Virtual Machines using Windows Azure. That is where I have learnt a lot of my SQL and PowerShell this year. After SQL Saturday Exeter and SQL Bits in Nottingham I needed a place to practice and learn, an environment to try things, break things, mend them again and experiment.

I learn just as well by doing things as I do by reading about them. Stuart Moore @napalmgram has a great post called Learning to Play with SQL Server and whilst I haven’t been as rough with my Azure SQL instances as he suggests, I have been able to practice at will without worry and, thanks to my MSDN subscription, without cost. I have taken examples from blog posts and demos from User Group sessions and run them on my Windows Azure VMs.

Every single blog post I have written this year that has examples has been written in Azure and screen shots from Azure. Whilst some of my Powershell scripts in the PowerShell Box of Tricks series had already been written to solve one particular problem or another at MyWork, every single one was refined and demo’d and all the screen shots were from Azure and several were developed on Azure too

My first ever session, to the SQL South West user group, was about Spinning up and Shutting Down VMs in Azure. It was an interesting experience in Murphy’s Law which meant I ended up having to deliver it on Azure.

The second time I spoke it was about the PowerShell Box of Tricks series, to the Cardiff User Group. Having learnt my lesson from the first time, I had bought a mini HDMI to VGA converter and tested it using a couple of monitors at home, and it worked wonderfully. However, when I got to Cardiff my little Asus convertible didn’t provide enough grunt to power the funky presentation screen. Luckily, thanks to Stuart Moore @napalmgram, who was also there doing his excellent PowerShell Back Up and Restore session and who let me use his Mac, I was able to deliver the session using Office Web App to run the PowerPoint from my SkyDrive whilst all the demos were on ………Yup, you guessed it, Windows Azure!!!

So I feel qualified to answer Jorge’s questions and take part in T-SQL Tuesday this time round.

I like Azure. I like the ease I can spin up and down machines or any PaaS services at will. I love that I can do it with PowerShell because I really enjoy using PowerShell in my day to day work and at home too. Living as I do in a beautifully convenient bungalow in the country, I still enjoy the frustration of watching that spinning ring as my videos buffer on our 1.8Mbs at best internet connection. Whilst that does have an impact on using Azure it is a damn sight better than waiting many days trying to download one single file. Something like an ISO file for the latest SQL Server CTP for example.

There is no way I would have got a look at SQL Server 2014 if it wasn’t for Azure. I was able to spin up a SQL Server 2014 machine in only a few minutes and log in and have a play and then delete it. I have done the same with Server 2012 and 2012 R2. It has enabled me to try setting up Availability Groups and other technologies not yet implemented at MyWork

I wouldn’t have been able to do any of that on my machines at home as I don’t have anything capable of running Hyper-V whilst this 8 year old desktop still keeps hanging on despite the odd noises. (Negotiations are currently in place to replace it with something shiny and new. Just need that lottery win now !!)

I have also transferred my Cricket Averages database to WASD and am talking with a friend of mine about developing an app that will use the mobile service as well.

The rate of change is much quicker in the cloud; things change and change quickly. As quickly as I had written my post about Spinning up and Shutting Down VMs in Azure, Microsoft changed the rules and stopped charging for machines that were turned off. New services appear all the time. New services move quickly through from Preview to release and, as Grant Fritchey noticed this week, new views have been added to Windows Azure SQL Database under the covers. I think this is something we are just going to have to live with. The scale of the cloud means it is much easier to test improvements at large scale and that means they can be released quicker. It makes it more challenging to keep up, I admit, but it’s a constant drip of new things rather than a big bang all at once.

Azure has brought me to where I am today and I think it will continue to be part of my future. If I remember to submit my PowerShell session for SQL Saturday Exeter (Submit yours here) and it gets chosen then you will be able to see me there (if you register here) using Azure to give back to the SQL Community

SQL Server 2014 CTP on Azure. PowerShell won’t create XTP tables

Having a couple of hours to play I created a SQL 2014 box on Azure with Windows Server 2012 R1 and followed This Post on SQLServerCentral  to create a File Group and table with T-SQL. Everything went well. Don’t try and provision an extra small box with that image though as it is so slow as to be unusable!!

I then decided to take a look at doing it with PowerShell and I cannot create the table. I think it is to do with my index creation. Can anyone see my mistake, as I am going round in circles now? I have posted to DBAStackExchange so you can answer there.

Here’s the T-SQL that Klaus uses to create a database and table with a XTP File Group

-- Create new database
CREATE DATABASE TestDatabase
GO
-- Add a MEMORY_OPTIMIZED_DATA filegroup to the database.
ALTER DATABASE TestDatabase
ADD FILEGROUP XTPFileGroup CONTAINS MEMORY_OPTIMIZED_DATA
GO
-- Add a new file to the previously created file group
ALTER DATABASE TestDatabase ADD FILE
(
NAME = N'HekatonFile1',
FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\HekatonFile1'
)
TO FILEGROUP [XTPFileGroup]
GO
-- Let's create a new Memory Optimized Table
CREATE TABLE TestTable
(
Col1 INT NOT NULL,
Col2 VARCHAR(100) NOT NULL,
Col3 VARCHAR(100) NOT NULL,
CONSTRAINT chk_PrimaryKey PRIMARY KEY NONCLUSTERED HASH (Col1) WITH (BUCKET_COUNT = 1024)
) WITH (MEMORY_OPTIMIZED = ON)
GO

And here is my PowerShell

# To Load SQL Server Management Objects into PowerShell
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
$server= new-Object Microsoft.SqlServer.Management.Smo.Server("(local)")
$dbname = "HekatonTest"
$db = New-Object Microsoft.SqlServer.Management.Smo.Database($server, $dbname)
# Add FileGroups
$FG1Name = "PRIMARY"
$Normalfg = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, $FG1Name)
$db.FileGroups.Add($Normalfg)
$FG2Name = "MemOpt"
$MemOptFG = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, $FG2Name)
$MemOptFG.FileGroupType = "MemoryOptimizedDataFileGroup"
$db.FileGroups.Add($MemOptFG)
#Create datafiles
$normallogname = "HekatonTest_Data"
$dbdfnormal = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($Normalfg, $normallogname)
$dbnormalfile= $server.Information.MasterDBPath + '\' + $normallogname + '.mdf'
$normalfg.Files.Add($dbdfnormal)
$dbdfnormal.FileName = $dbnormalfile
$dbdfnormal.Size = [double](5.0 * 1024.0)
$dbdfnormal.GrowthType = 'Percent'
$dbdfnormal.Growth = 25.0
$dbdfnormal.IsPrimaryFile = 'True'
$MemOptFilename = "MemOpt_Data"
$MemOptDataFile = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($MemOptFG, $MemOptFilename)
$MemOptDataFilePath= $server.Information.MasterDBPath + '\' + $MemOptFilename + '.ndf'
$MemOptFG.Files.Add($MemOptDataFile)
$MemOptDataFile.FileName = $MemOptDataFilePath

#Create Database
$db.Create() 
#Create Table
$dbname = "HekatonTest"
$db = $server.databases[$dbname]
$tb = new-object Microsoft.SqlServer.Management.Smo.Table($db, "MemOptTable")
#Add Columns
$col1 = new-object Microsoft.SqlServer.Management.Smo.Column($tb,"Col1", [Microsoft.SqlServer.Management.Smo.DataType]::Int)
$col2 = new-object Microsoft.SqlServer.Management.Smo.Column($tb, "Col2", [Microsoft.SqlServer.Management.Smo.DataType]::varchar(100))
$col3 = new-object Microsoft.SqlServer.Management.Smo.Column($tb, "Col3", [Microsoft.SqlServer.Management.Smo.DataType]::varchar(100))
$tb.Columns.Add($col1)
$tb.Columns.Add($col2)
$tb.Columns.Add($col3)
$C1Name =$col1.Name
#Create Index
$IX = New-Object Microsoft.SqlServer.Management.Smo.Index ($Tb, "PK_PrimaryKeyName")   
$IX_col = New-Object Microsoft.SqlServer.Management.Smo.IndexedColumn ($IX,$C1Name)
$IX.IndexedColumns.Add($IX_col)
$IX.IndexKeyType = "DriPrimaryKey"
$IX.IndexType = "HashIndex"
$IX.BucketCount = "1024"
$IX.FileGroup = "PRIMARY"
$IX.IsMemoryOptimized = $true
$Tb.Indexes.Add($IX)
$tb.FileGroup = "MemOpt"
$tb.Durability = "SchemaAndData"
$tb.IsMemoryOptimized = $true
$tb.Create()

I can create the filegroups and the database with PowerShell and use SSMS to create Memory Optimised (XTP) tables, and if I take out the index creation then the table creates, but it isn’t XTP, obviously.

All you get is the frustrating Create() failed error which doesn’t help me to find the problem.

I know I can take the T-SQL and put it into Invoke-Sqlcmd but I figured you ought to be able to do it with PowerShell.
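That fallback is only a couple of lines; a sketch would look something like this, with the working T-SQL from Klaus’s example above pasted into the here-string:

# Fallback: run the working T-SQL through Invoke-Sqlcmd instead of building the table with SMO
$tsql = @"
-- paste the CREATE TABLE TestTable ... WITH (MEMORY_OPTIMIZED = ON) statement from above here
"@
Invoke-Sqlcmd -ServerInstance '(local)' -Database 'TestDatabase' -Query $tsql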

Can anyone point me in the right direction? For interest only, maybe it’s not available in the CTP. Answer over here