Populating My DBA Database for Power Bi with PowerShell – Databases

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

db1

db2

There are so many items of data available that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying, as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report, publish it to powerbi.com and query it with natural language

I create a function called Catch-Block to save keystrokes and put my commands inside a try/catch to make the scripts as robust as possible. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}
catch{
Catch-Block "Failed to gather the server names from the InstanceList table"
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, because the code above creates an empty SMO Server object with the Name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property; you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one. In this case we use a continue to carry on iterating the loop.

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases, just remove everything after the pipe symbol. If you wish to exclude certain databases from the collection, perhaps the database in which you keep your change log table and DBA team information, you can do that here as well

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name
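
If you need to exclude more than one database, a variant along these lines should also work (an untested sketch: it assumes PowerShell 3.0 or later for the -notin operator, and the database names are only examples)

$ExcludedDatabases = 'DBADatabase','DBA-Admin'
foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false -and $_.Name -notin $ExcludedDatabases })
{
$Name = $db.Name
$Parent = $db.Parent.Name
}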

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a $db and then select all of the properties so that you can see the values and identify the ones you want. Again, this is a good thing to do when exploring new objects

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screen shot below that there are 170 properties available to you on a SQL 2014 instance. You can gather any or all of that information as long as you ensure that you have columns with the correct data types in your table and that your script has the logic to deal with properties that do not exist, although I have had fewer issues with this for the database object than for the server object.

db3

You can look for the property that you want by using the Get-Member cmdlet as shown above, use MSDN to find it starting from here, or GoogleBingDuckDuckGo "Powershell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post, checking the Databases table for an entry for that database and instance, updating the row if it exists and inserting one if it does not.
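
As a simplified sketch of that pattern (assuming $CentralDBAServer, $CentralDatabaseName, $ServerName, $InstanceName and $Name are set as above; the column lists are trimmed for brevity and the full version is in the downloadable script)

$InstanceID = (Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT InstanceID FROM [dbo].[InstanceList] WHERE ServerName = '$ServerName' AND InstanceName = '$InstanceName'").InstanceID
$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT Name FROM [Info].[Databases] WHERE Name = '$Name' AND InstanceID = '$InstanceID'"
if ($Exists)
{
$Query = "UPDATE [Info].[Databases] SET [DateChecked] = GetDate() /* ...one SET line per column... */ WHERE Name = '$Name' AND InstanceID = '$InstanceID'"
}
else
{
$Query = "INSERT INTO [Info].[Databases] ([InstanceID],[Name],[DateAdded],[DateChecked] /* ...and the rest of the columns... */) VALUES ('$InstanceID','$Name',GetDate(),GetDate() /* ...and the rest of the values... */)"
}
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query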

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table as this (in general) needs a human bean to understand. When the script runs every night it will pick up new databases and I add a default value of “Not Entered” to the table which makes it easier to identify the databases that need this additional work. (This also means that as a Team Leader I can monitor that my team are doing this) It can also be added to any scripts which create new databases for deployment.
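
One way to add that default automatically (a sketch only; it assumes a 'Not Entered' row already exists in the Clients table) is to insert a placeholder lookup row for any database that has no client assigned yet

$Query = @"
INSERT INTO [dbo].[ClientDatabaseLookup] (ClientID, DatabaseID, InstanceID, Notes)
SELECT C.ClientID, D.DatabaseID, D.InstanceID, 'Not Entered'
FROM [Info].[Databases] AS D
CROSS JOIN [dbo].[Clients] AS C
WHERE C.ClientName = 'Not Entered'
AND NOT EXISTS (SELECT 1 FROM [dbo].[ClientDatabaseLookup] AS CDL WHERE CDL.DatabaseID = D.DatabaseID)
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query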

Then we need to create some measures and calculated columns for our report. I did this as I realised I needed them while making the report, rather than all up front.

I created two calculated columns for the size of the databases, one for GB and one for TB, by clicking on the data icon on the left and then New Column

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I created a donut chart to show the size of the databases in GB by client (and no, my real clients are not rock bands 🙂 ) as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

db4
The other donut chart shows the number of clients per location (and those are SQL user group locations in the UK plus my hometown, Bolton)

db5

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

db6

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan, I created a colourful slicer for environment and client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases, then this report is correct, but I doubt that's what you want 🙂

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

db7

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69 GB of data in Exeter, and that it isn't mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows that the data is filtered, but it is not shown very well visually as there are over a thousand databases using the full recovery model.

db8

If we are asked about the Integration environment we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 GB of data. It is also easier to see the compatibility level and recovery model as the ratios are larger.

db9

Once we have created the report in the way that we want, we can publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the publish button and entering your powerbi credentials, but if you want your data to refresh automatically (and this is the point of the exercise, to remove manual work) then you will need to install and configure the Power BI gateway and schedule a refresh. I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation; a pop-up will ask you which dashboard you wish to pin it to (another recent update to Power Bi).

db10

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect, but it is getting better all the time.

db11

You have to remember to use the names of the columns correctly

db12

But once you have the query correct you can alter it by adding "as a VISUALISATION" and choosing the visualisation

db13

db14

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

db15

This doesn’t work quite as you expect

db16

But this does

db17

How about this (and yes it felt wrong to type!)

db18

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

db19

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

db20

Or the DBA team manager might choose this one

db21

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee 🙂

As always if you have any questions or comments please feel free to post them on the blog.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Enterprise Strategies – A #TSQL2sDay post

This month's TSQL2sDay blog post party is hosted by Jen McCown and is about Enterprise Strategy.

Adam Machanic started T-SQL Tuesday over 5 years ago and you will find many brilliant posts under that heading if you search for them

Managing SQL Servers at enterprise scale is not a straightforward task. Your aim as a DBA should be to simplify it as much as possible and to automate everything that you possibly can. This post by John Sansom could have been written for this month's party and I recommend that you read it.

So here are a few points that I think you should consider if you look after SQL in an Enterprise environment.

  • Enterprise Strategy will undoubtedly garner a whole host of excellent posts and Jen will provide a round-up post which I am certain will be an excellent resource. Take a look here
  • Know where your instances are and have a single place that you can reference them from. Some people recommend a Central Management Server but I find this too restrictive for my needs. I use an InstanceList table in my DBA Database with the following columns: [ServerName], [InstanceName], [Port], [AG], [Inactive], [Environment] and [Location]. This enables me to target instances not just by name but by environment (Dev, Test, Pre-Prod, Live etc) or by location, or, by joining the InstanceList table with another table, by application or any number of other factors. I also capture information about the servers at Windows and SQL level in this database so I can target the SQL 2012 servers specifically if need be, or use any other metric. This is very powerful and enables far greater flexibility than the CMS in my opinion.
  • Use PowerShell (no surprise that I would mention this!). PowerShell is a brilliant tool for automation and I use it all of the time
  • Get used to using this piece of PowerShell code
	 $Query = @"
	 SELECT [ServerName],[InstanceName],[Port]
	  FROM [DBADatabase].[dbo].[InstanceList]
	  Where Inactive = 0 AND NotContactable = 0
	"@
	try{
	$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
	$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
	}
	catch{
	Write-Error "Failed to gather the server names from the InstanceList table: $_"
	}
	foreach ($ServerName in $ServerNames)
	{
	## $ServerName
	 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
	 $Port = $ServerName| Select Port -ExpandProperty Port
	$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
	 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port
	
	 try
	 {
	 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Notice the query variable above; this is where the power lies, as it enables you to gather all the instances that you need for your task, as described in the bullet above. Once you get used to doing this you can do things like identify all the instances with Remote DAC disabled using a query against the DBA Database and then enable it on all servers by adding this code to the loop above

$srv.RemoteDacEnabled = $true
$srv.Alter()
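
Putting the pieces together, a hedged sketch of that idea (test it on a safe subset first) which targets only the instances recorded with Remote DAC disabled in the SQLInfo table described in the posts below might look like this

$Query = @"
SELECT IL.[ServerName],IL.[InstanceName],IL.[Port]
FROM [DBADatabase].[dbo].[InstanceList] AS IL
JOIN [DBADatabase].[Info].[SQLInfo] AS SI ON SI.InstanceID = IL.InstanceID
WHERE IL.Inactive = 0 AND IL.NotContactable = 0 AND SI.RemoteDacEnabled = 0
"@
$ServerNames = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
foreach ($ServerName in $ServerNames)
{
$Connection = $ServerName.ServerName + '\' + $ServerName.InstanceName + ',' + $ServerName.Port
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
if (!($srv.Version)){ continue } ## skip anything we cannot connect to
$srv.RemoteDacEnabled = $true
$srv.Alter()
}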

Very quick, very simple and very, very powerful. You can also use this to run T-SQL scripts against the instances you target, but there are some added complications with Invoke-SqlCmd that you need to be aware of

  • BE CAREFUL. Test, understand and test again before you run any script on a live system, especially a script like this which enables you to target ALL of your servers. You must check that your $ServerNames array contains only the instances you need before you make any changes. You need to be ultra-cautious when it is possible to do great damage
  • Write scripts that are robust and handle errors gracefully. I use Jason Wasser's (@wasserja) Write-Log function to write to a text file and wrap my commands in a try/catch block.
  • Include comments in your scripts to assist either the future you or the folks in your position in 5 years' time. I would also add one of my bugbears: use the description block in Agent Jobs. The first place any DBA is going to go when a job fails is the properties of that job. Please fill in that block so that anyone troubleshooting knows some information about what the job does, or at the very least has a link to some documentation about it
  • Finally in my list, don't overdo the alerts. Alerting is vital for any DBA; it is a brilliant way to ensure that you quickly know about any issues affecting your estate. All alerts should be actionable, and in some cases you can automate the action that you take, but the message here is: don't send messages to the DBA team email for every single tiny thing or they will get swamped and ignore the vital one. This holds for whichever alerting or monitoring system you use

This is but a small subsection of the things that you need to consider when responsible for a large SQL estate, but if you need help and advice, or just moral support, and you don't already interact with the SQL community, then make today the day you start. Maybe this post by Thomas LaRock is a good place to start, or your nearest user group/chapter, or the #sqlfamily hashtag, or give me a shout and I will gladly help.

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

1

2

There are so many items of data available that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The SQLInfo table is created using this code

CREATE TABLE [Info].[SQLInfo](
	[SQLInfoID] [int] IDENTITY(1,1) NOT NULL,
	[DateChecked] [datetime] NULL,
	[DateAdded] [datetime] NULL,
	[ServerName] [nvarchar](50) NULL,
	[InstanceName] [nvarchar](50) NULL,
	[SQLVersionString] [nvarchar](100) NULL,
	[SQLVersion] [nvarchar](100) NULL,
	[ServicePack] [nvarchar](3) NULL,
	[Edition] [nvarchar](50) NULL,
	[ServerType] [nvarchar](30) NULL,
	[Collation] [nvarchar](30) NULL,
	[IsHADREnabled] [bit] NULL,
	[SQLServiceAccount] [nvarchar](35) NULL,
	[SQLService] [nvarchar](30) NULL,
	[SQLServiceStartMode] [nvarchar](30) NULL,
	[BAckupDirectory] [nvarchar](256) NULL,
	[BrowserAccount] [nvarchar](50) NULL,
	[BrowserStartMode] [nvarchar](25) NULL,
	[IsSQLClustered] [bit] NULL,
	[ClusterName] [nvarchar](25) NULL,
	[ClusterQuorumstate] [nvarchar](20) NULL,
	[ClusterQuorumType] [nvarchar](30) NULL,
	[C2AuditMode] [nvarchar](30) NULL,
	[CostThresholdForParallelism] [tinyint] NULL,
	[MaxDegreeOfParallelism] [tinyint] NULL,
	[DBMailEnabled] [bit] NULL,
	[DefaultBackupCComp] [bit] NULL,
	[FillFactor] [tinyint] NULL,
	[MaxMem] [int] NULL,
	[MinMem] [int] NULL,
	[RemoteDacEnabled] [bit] NULL,
	[XPCmdShellEnabled] [bit] NULL,
	[CommonCriteriaComplianceEnabled] [bit] NULL,
	[DefaultFile] [nvarchar](100) NULL,
	[DefaultLog] [nvarchar](100) NULL,
	[HADREndpointPort] [int] NULL,
	[ErrorLogPath] [nvarchar](100) NULL,
	[InstallDataDirectory] [nvarchar](100) NULL,
	[InstallSharedDirectory] [nvarchar](100) NULL,
	[IsCaseSensitive] [bit] NULL,
	[IsFullTextInstalled] [bit] NULL,
	[LinkedServer] [nvarchar](max) NULL,
	[LoginMode] [nvarchar](20) NULL,
	[MasterDBLogPath] [nvarchar](100) NULL,
	[MasterDBPath] [nvarchar](100) NULL,
	[NamedPipesEnabled] [bit] NULL,
	[OptimizeAdhocWorkloads] [bit] NULL,
	[InstanceID] [int] NULL,
	[AGListener] [nvarchar](150) NULL,
	[AGs] [nvarchar](150) NULL,
 CONSTRAINT [PK__SQL__50A5926BC7005F29] PRIMARY KEY CLUSTERED 
(
	[SQLInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

ALTER TABLE [Info].[SQLInfo]  WITH CHECK ADD  CONSTRAINT [FK_SQLInfo_InstanceList] FOREIGN KEY([InstanceID])
REFERENCES [dbo].[InstanceList] ([InstanceID])
GO

ALTER TABLE [Info].[SQLInfo] CHECK CONSTRAINT [FK_SQLInfo_InstanceList]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report and publish it to powerbi.com

I create a function called Catch-Block to save keystrokes and put my commands inside a try/catch to make the scripts as robust as possible.

function Catch-Block
{
param ([string]$Additional)
$ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
$Message = " This message came from the Automated Powershell script updating the DBA Database with Server Information"
$Msg = $Additional + $ErrorMessage + " " + $Message
Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an additional parameter which holds a custom error message. I write this, together with the message from the error variable (accessed with $_), to both the event log and the text log to enable easy troubleshooting. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}
catch{
Catch-Block "Failed to gather the server names from the InstanceList table"
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, because the code above creates an empty SMO Server object with the Name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property; you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one. In this case we use a continue to carry on iterating the loop.

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

If you wish to view all of the different properties that you can gather information on in this way, you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts.)

$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv | Get-Member

As you can see from the screenshot below, on my SQL 2014 server there are 184 properties. I haven't chosen to gather all of them, only the ones that are of interest to me, our team or others who request information from our team, such as auditors and project managers.

3

You can choose to use any or all of these properties as long as you ensure you have the columns in your table with the correct data type and that you have the logic to stop the script from erroring if/when a property is not available. Here is an example

if ($srv.IsHadrEnabled -eq $True)
 {$IsHADREnabled = $True
 $AGs = $srv.AvailabilityGroups|Select Name -ExpandProperty Name|Out-String
 $Expression = @{Name = 'ListenerPort' ; Expression = {$_.Name + ',' + $_.PortNumber }}
 $AGListener =  $srv.AvailabilityGroups.AvailabilityGroupListeners|select $Expression|select ListenerPort -ExpandProperty ListenerPort
 }
 else
 {
 $IsHADREnabled = $false
 $AGs = 'None'
 $AGListener = 'None'
 }
 $BackupDirectory = $srv.BackupDirectory

I check if the property IsHADREnabled is true and if it is I then gather the information about the Availability Group names and the listener port and if it doesn’t exist I set the values to None.

You will find that not all of the properties that you want are at the root of the Server SMO object. If you want your max and min memory values, or you want to know if remote admin connections or xp_cmdshell are enabled, you will need to look at the $srv.Configuration object

 $MaxMem = $srv.Configuration.MaxServerMemory.ConfigValue
 $MinMem = $srv.Configuration.MinServerMemory.ConfigValue
 $RemoteDacEnabled = $srv.Configuration.RemoteDacConnectionsEnabled.ConfigValue
 $XPCmdShellEnabled = $srv.Configuration.XPCmdShellEnabled.ConfigValue

You can look for the property that you want by using the Get-Member cmdlet as shown above or use MSDN to find it starting from here or by GoogleBingDuckDuckGo ing “Powershell SMO” and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post by checking the SQL Info table for an entry for that instance and updating the table if it exists and inserting if it does not.

There are other uses for gathering this information than just reporting on it. You can target different versions of SQL with different scripts. You can identify values that are outside what is expected and change them. If xp_cmdshell should not be enabled, write the T-SQL to gather the connection strings of all of the servers from the DBADatabase where the SQLInfo table has XPCmdShellEnabled = 1, loop through them exactly as above, change the value of $srv.Configuration.XPCmdShellEnabled.ConfigValue to 0 and then call $srv.Alter()
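
A minimal sketch of that whole loop, assuming the InstanceList and SQLInfo tables are populated as described in this post (and, as below, test it first), might look like this

$Query = @"
SELECT IL.[ServerName],IL.[InstanceName],IL.[Port]
FROM [DBADatabase].[dbo].[InstanceList] AS IL
JOIN [DBADatabase].[Info].[SQLInfo] AS SI ON SI.InstanceID = IL.InstanceID
WHERE IL.Inactive = 0 AND IL.NotContactable = 0 AND SI.XPCmdShellEnabled = 1
"@
$ServerNames = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
foreach ($ServerName in $ServerNames)
{
$Connection = $ServerName.ServerName + '\' + $ServerName.InstanceName + ',' + $ServerName.Port
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
if (!($srv.Version)){ continue } ## skip anything we cannot connect to
$srv.Configuration.XPCmdShellEnabled.ConfigValue = 0
$srv.Alter()
}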

It is a very powerful way of dynamically targeting your estate if you are looking after many instances and with great power comes great responsibility.

ALWAYS TEST THESE AND ANY SCRIPTS YOU FIND OR SCRIPTS YOU WRITE BEFORE YOU RUN THEM IN YOUR PRODUCTION ENVIRONMENT

Yeah, I shouted, and some people thought it was rude. But it's important; it needs to be repeated and drilled in so that it becomes habitual. You can do great damage to your estate with only a few lines of PowerShell and a DBA Database, so please be very careful and ensure that you have a suitable test subset of servers that you can use.
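
One simple way to build that test subset, assuming you record your environments in the Environment column of the InstanceList table as described earlier, is to filter the targeting query so that a script under development can only ever see development instances

$Query = @"
SELECT [ServerName],[InstanceName],[Port]
FROM [DBADatabase].[dbo].[InstanceList]
Where Inactive = 0 AND NotContactable = 0 AND Environment = 'Dev'
"@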

The other thing we can do is report on the data. With Power Bi we can create self-service reports and dashboards and also make use of the natural language query at powerbi.com, so that when your systems team ask "What are all the servers in X data center?" you can enable them to answer it themselves, or when the compliance officer asks how many SQL 2005 instances you have and which clients they serve, you can give them a dashboard they can query themselves.

This is how I create the two reports you see at the top. I start by connecting to the data source, my DBA Database

4

And I use this query

SELECT IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,SI.[SQLInfoID]
,SI.[DateChecked]
,SI.[DateAdded]
,SI.[ServerName]
,SI.[InstanceName]
,SI.[SQLVersionString]
,SI.[SQLVersion]
,SI.[ServicePack]
,SI.[Edition]
,SI.[ServerType]
,SI.[Collation]
,SI.[IsHADREnabled]
,SI.[SQLServiceAccount]
,SI.[SQLService]
,SI.[SQLServiceStartMode]
,SI.[BAckupDirectory]
,SI.[BrowserAccount]
,SI.[BrowserStartMode]
,SI.[IsSQLClustered]
,SI.[ClusterName]
,SI.[ClusterQuorumstate]
,SI.[ClusterQuorumType]
,SI.[C2AuditMode]
,SI.[CostThresholdForParallelism]
,SI.[MaxDegreeOfParallelism]
,SI.[DBMailEnabled]
,SI.[DefaultBackupCComp]
,SI.[FillFactor]
,SI.[MaxMem]
,SI.[MinMem]
,SI.[RemoteDacEnabled]
,SI.[XPCmdShellEnabled]
,SI.[CommonCriteriaComplianceEnabled]
,SI.[DefaultFile]
,SI.[DefaultLog]
,SI.[HADREndpointPort]
,SI.[ErrorLogPath]
,SI.[InstallDataDirectory]
,SI.[InstallSharedDirectory]
,SI.[IsCaseSensitive]
,SI.[IsFullTextInstalled]
,SI.[LinkedServer]
,SI.[LoginMode]
,SI.[MasterDBLogPath]
,SI.[MasterDBPath]
,SI.[NamedPipesEnabled]
,SI.[OptimizeAdhocWorkloads]
,SI.[InstanceID]
,SI.[AGListener]
,SI.[AGs]
FROM [DBADatabase].[Info].[SQLInfo] as SI
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID = SI.InstanceID

So that I can easily add any and all of the data to the reports if I choose, or query using them in powerbi.com

First I created 3 measures.

AG = DISTINCTCOUNT(Query1[AGs])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I click on map

5

And drag the location column to location and the Instances measure to both the Values and Color Saturation

6

I then click on edit and format the title and change the colours for the data

7

Next I created a heat map for Instances by Edition. The picture shows the details

8

And a column chart for Instances by Version

9

I also add a table showing the number of instances in each location and a slicer for environment.

Even though you have added one slicer, you are able to slice the data by clicking on the charts. If I click on Developer Edition I can quickly see which versions and locations they are in

10

This works for the map and the column chart as well. This has all been created using live data as a base, with all identifying information altered: Bolton is where I was born, the other locations are chosen at random, and all other figures and rollups have also been altered.

11

To create the other report I created two donut charts, for Instances by version and by location, using steps similar to my previous post, and then added some tables for location, edition and xp_cmdshell enabled, as well as some cards showing total numbers of Servers, Instances and Availability Groups and a slicer for environment, to create a report like this. You can use the donut charts to slice the data as well

12

But there are so many different points of information gathered by this script that you get extra value using the natural language query on powerbi.com.

Click Publish and enter your powerbi.com credentials and then log into powerbi.com in a browser and you will see your report and your dataset. (Note, you can easily filter to find your dashboards, reports and data sets)

13

Click the plus sign to create a new dashboard and click the pin on any of the objects in your report to pin them to the dashboard

14

Then you can view (and share) your dashboard

15

Once you have done this you can query your data using natural language. It will cope with spelling mistakes, but it expects the column names, so you may want to think about renaming them in your report by right-clicking on them after you get your data.

You can ask it questions and build up information on the fly, altering it as you need. As a DBA, doing this and imagining enabling others to ask these questions whenever they want from a browser, as many times as they like, was very cool!

16

17

18

19

20

Pretty cool, I think you and any of your ‘requestors’ would agree

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Populating My DBA Database for Power Bi with PowerShell – Server Info

Following my last post about using Power Bi with my DBA Database, I have been asked if I would share the PowerShell scripts which I use to populate my database. They are the secondary part of my DBADatabase, which I also use to automate the installation and upgrade of all of my DBA scripts, as I started to blog about in the post Installing and upgrading default scripts automation – part one – Introduction, a series I will continue later.

In this post I will show how to create the following report

1

You will find the latest version of my DBADatabase creation scripts here.

I create the following tables

dbo.ClientDatabaseLookup
dbo.Clients
dbo.InstanceList
dbo.InstanceScriptLookup
dbo.ScriptList
Info.AgentJobDetail
Info.AgentJobServer
Info.Databases
Info.Scriptinstall
Info.ServerOSInfo
Info.SQLInfo

By adding Server name, Instance Name , Port, Environment, NotContactable, and Location into the InstanceList table I can gather all of the information that I need and also easily add more information to other tables as I need to.

The NotContactable column is so that I am able to add instances that I cannot contact due to permission or environment issues. I can still gather information about them manually and add it to the table. I use the same script but change it to generate the SQL query rather than run it, save the query, and then run the query manually to insert the data. This is why I have the DateAdded and DateChecked columns, so that I know how recent the data is. I don't go as far as recording each change, however, as that will be added to a DBA-Admin database on every instance which stores every change to the instance.
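
As a sketch of that "generate rather than run" switch (the file path is only an example), instead of executing the INSERT with Invoke-Sqlcmd you can write it to a file to review and run by hand later

$Query | Out-File -FilePath "C:\Temp\ManualServerInsert.sql" -Append
## and later, after reviewing the file:
## Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -InputFile "C:\Temp\ManualServerInsert.sql"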

The ServerOSInfo table is created like so

/****** Object: Table [Info].[ServerOSInfo]    Script Date: 26/08/2015 19:50:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [Info].[ServerOSInfo](
[ServerOSInfoID] [int] IDENTITY(1,1) NOT NULL,
[DateAdded] [datetime] NULL,
[DateChecked] [datetime] NULL,
[ServerName] [nvarchar](50) NULL,
[DNSHostName] [nvarchar](50) NULL,
[Domain] [nvarchar](30) NULL,
[OperatingSystem] [nvarchar](100) NULL,
[NoProcessors] [tinyint] NULL,
[IPAddress] [nvarchar](15) NULL,
[RAM] [int] NULL,
CONSTRAINT [PK__ServerOS__50A5926BC7005F29] PRIMARY KEY CLUSTERED
(
[ServerOSInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report

I create a function called Catch-Block to save keystrokes and put my commands inside a try/catch to make the scripts as robust as possible.

function Catch-Block{
param ([string]$Additional)
$ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
$Message = " This message came from the Automated Powershell script updating the
DBA Database with Server Information"
$Msg = $Additional + $ErrorMessage + " " + $Message
Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an additional parameter which holds a custom error message. I write this, together with the message from the error variable (accessed with $_), to both the event log and the text log to enable easy troubleshooting. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $Servers

$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT DISTINCT [ServerName] FROM [DBADatabase].[dbo].[InstanceList] WHERE Inactive = 0 OR NotContactable = 1"
$Servers = $AlltheServers| Select ServerName -ExpandProperty ServerName

I then loop through the array and gather the information with three WMI queries.

foreach($Server in $Servers)
{
Write-Log -Path $LogFile -Message "Gathering Info for $Server "
$DNSHostName = 'NOT GATHERED'
$Domain = 'NOT GATHERED'
$OperatingSystem = 'NOT GATHERED'
$IP = 'NOT GATHERED'
try{
$Info = get-wmiobject win32_computersystem -ComputerName $Server -ErrorAction Stop|select DNSHostName,Domain,
@{Name="RAM";Expression={"{0:n0}" -f($_.TotalPhysicalMemory/1gb)}},NumberOfLogicalProcessors

I give the variables some default values in case they are not picked up, and set the error action for the command to Stop so that any failure exits the try block. The first query gathers the DNSHostName, the domain name, the amount of RAM in GB and the number of logical processors; the second gathers the operating system version. The third was the most interesting to do. There are many methods of gathering the IP address using PowerShell and I tried a few of them before finding one that would work with all of the server versions in my estate. This is a good point to say that this works in my lab and in my shop but may not necessarily work in yours, so understand, check and test this, and any other script that you find on the internet, before you let them anywhere near your production environment.

Unfortunately the one that worked everywhere remotely errored on the local server, so I added a check to see if the server name in the variable matches the $env:COMPUTERNAME environment variable

$OS =  gwmi Win32_OperatingSystem  -ComputerName $Server| select @{name='Name';Expression={($_.caption)}} 
if($Server -eq $env:COMPUTERNAME)
{$IP = (Get-WmiObject -ComputerName $Server -class win32_NetworkAdapterConfiguration -Filter 'ipenabled = "true"' -ErrorAction Stop).ipaddress[0] }
else {$IP = [System.Net.Dns]::GetHostAddresses($Server).IPAddressToString }
Write-Log -Path $LogFile -Message "WMI Info gathered for $Server "

Once I have all of the information I check if the server already exists in the ServerOSInfo table and choose to either insert or update.

	$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT [ServerName] FROM [DBADatabase].[Info].[ServerOSInfo] WHERE ServerName = '$Server'"
	
	if ($Exists)
	{
	$Query = @"
	UPDATE [Info].[ServerOSInfo]
	   SET [DateChecked] = GetDate()
	      ,[ServerName] = '$Server'
	      ,[DNSHostName] = '$DNSHostName'
	      ,[Domain] = '$Domain'
	      ,[OperatingSystem] = '$OperatingSystem'
	      ,[NoProcessors] = '$NoProcessors'
	      ,[IPAddress] = '$IP'
	      ,[RAM] = '$RAM'
	WHERE ServerName = '$Server'
	"@
	}
	else
	{
	$Query = @"
	INSERT INTO [Info].[ServerOSInfo]
	           ([DateChecked]
	           ,[DateAdded]
	           ,[ServerName]
	           ,[DNSHostName]
	           ,[Domain]
	           ,[OperatingSystem]
	           ,[NoProcessors]
	           ,[IPAddress]
	           ,[RAM])
	     VALUES
	   ( GetDate()
	      ,GetDate()
	      ,'$Server'
	      ,'$DNSHostName'
	      ,'$Domain'
	      ,'$OperatingSystem'
	      ,'$NoProcessors'
	      ,'$IP'
	      ,'$RAM')
	"@
	}
	Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query

And that’s it. Now if you wish to gather different data about your servers then you can examine the data available to you by

get-wmiobject Win32_OperatingSystem -ComputerName $Server | Get-Member
get-wmiobject win32_computersystem -ComputerName $Server | Get-Member

If you find something that you want to gather, you can add the property to the script and gather that information as well. Make sure that you add the column to the table and to both the insert and update statements in the PowerShell script
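
For example (an illustrative sketch only, with a hypothetical LastBootTime column), gathering the last boot time would look like this

$OSInfo = Get-WmiObject Win32_OperatingSystem -ComputerName $Server
$LastBoot = $OSInfo.ConvertToDateTime($OSInfo.LastBootUpTime)
## then add a [LastBootTime] column to [Info].[ServerOSInfo] and a
## ,[LastBootTime] = '$LastBoot' line to both the UPDATE and the INSERT queries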

Creating the report in Power Bi

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I was born in Bolton and SQL SouthWest is based in Exeter 🙂

Open Power Bi Desktop and click get data. Add the connection details for your DBA Database server and database and add the query

SELECT SOI.[ServerOSInfoID]
,SOI.[DateChecked]
,SOI.[ServerName]
,SOI.[DNSHostName]
,SOI.[Domain]
,SOI.[OperatingSystem]
,SOI.[NoProcessors]
,SOI.[IPAddress]
,SOI.[RAM]
,IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
FROM [DBADatabase].[Info].[ServerOSInfo] as SOI
JOIN [dbo].[InstanceList] as IL
ON IL.ServerName = SOI.[ServerName]

2

Create a new column for the Operating Edition by clicking data on the left and using this code as described in my previous post

Operating System Edition = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And one for OS Version using this code

OS Version = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

I also created new measures to count the distinct number of servers and the number of instances as follows

Servers = DISTINCTCOUNT(Query1[Servers Name])
Instances = COUNT(Query1[Instance])

Then in the report area I start by creating a new text box to add a title to the report, and I set the page-level filter to Inactive is False so that decommissioned servers are not included

3

I then create a donut chart for the number of servers by Operating System by clicking the donut chart in the visualisations and then dragging the OS version to the Details and the Servers Name to the Values

4

I then click the format button and add a proper title and the background colour

5

Then I create the server numbers by location in the same way, by clicking donut chart and adding location and the count of server names, with the formatting applied in the same way as the previous donut

6

I created a number of cards to hold single values for Domain, Instance, Server, RAM, Processors and the number of Not Contactable servers to provide a quick, easy view of those figures, especially when you filter the report by clicking on a value within a donut chart. I find that managers really like this feature. They are all created in the same way, by clicking the card in the visualisations and choosing the value

7

I also add a table showing the number of servers by operating system and the number of servers by location by dragging those values to a table visualisation. I find that slicers are a very useful way of enabling information to be displayed as required; use the live visualisation to do this. I add the environment column as a slicer so that I can easily see values for the live environment or the development environment

I create a separate page in the report to display all of the server data, as this can be useful for other teams such as the systems (server admin) team. I give them a lot of different slicers: Domain, Location, Environment, OS Version, Edition and NotContactable, with a table holding all of the relevant values to enable them to quickly see details

8

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Using Power Bi with my DBA Database

Every good DBA should have a DBA database. A place to store information about all of their instances and databases.

I have an InstanceList table which looks like this

CREATE TABLE [dbo].[InstanceList](
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[ServerName] [nvarchar](50) NOT NULL,
[InstanceName] [nvarchar](50) NOT NULL,
[Port] [int] NOT NULL,
[AG] [bit] NULL,
[Inactive] [bit] NULL CONSTRAINT [DF_InstanceList_Inactive] DEFAULT ((0)),
[Environment] [nvarchar](25) NULL,
[Location] [nvarchar](30) NULL,
CONSTRAINT [PK_InstanceList_ID] PRIMARY KEY CLUSTERED
(
[InstanceID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

I use this as the basis for all of my information gathering. By adding Server Name, Instance Name, Port, Environment and Location to the table, I can use overnight Agent jobs to run PowerShell scripts to gather information about all of the instances. This way the information is dynamic and gathered from the server, so when we add RAM and change max memory this is updated the next time the script runs. You can also automate your installation and decommission procedures (using PowerShell scripts) to add the information to the DBA database automatically
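
For example, an installation script could end with something like this (a sketch; the server name and values are hypothetical)

$Query = @"
INSERT INTO [dbo].[InstanceList] ([ServerName],[InstanceName],[Port],[AG],[Inactive],[Environment],[Location])
VALUES ('NEWSERVER01','MSSQLSERVER',1433,0,0,'Dev','Bolton')
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database 'DBADatabase' -Query $Query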

I have 4 scripts

  • ServerInfo, which gathers Windows OS information such as the version and edition of the operating system, the number of processors, the amount of RAM, the IP address, the domain name etc
  • SQLInfo, which gathers information about the instance such as SQL version, edition, collation, max and min memory, MAXDOP, service accounts and start modes, default file locations etc
  • Databases, which gathers database information such as size, data usage, index usage, last backup dates, owner and many more
  • Agent Jobs, which gathers the outcome of the jobs that have run, their names and categories, into two tables: one for a server rollup and one for details about each job

Recently I have received a lot of requests for information from various sources: auditors asking about encryption and backup policies, project managers asking about database and SQL versions, compliance asking about the number of Windows 2003 or SQL 2005 servers, and system teams asking which servers in a particular location can be turned off at which time, dependent on which system they are supporting, for a power-down

Before we had the DBA database holding all of the information about the instances, we would have struggled to compile this information, and when I saw that Power Bi had been released to GA I thought it would be a good place to start learning about it. By using data that I understood and answering questions where I knew the format of the answer, I could be more confident about experimenting: if I know I have 100 servers then any result for servers that exceeds that is incorrect

I have never been a BI guy and I claim no expertise in the correct methods of manipulating the data. There may very well be better methods of achieving these results, and if there are please feel free to comment below so that I can improve my knowledge and keep on learning

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I have no servers in Bolton, it is where I am from originally!!

I downloaded Power BI Desktop from powerbi.com and ran the installer and the first screen you see is this one

1

I then clicked on Get Data

2

And then SQL Server and filled in the details for my DBA Database and clicked connect

3

I used my current Windows credentials

4

It then asked me which tables I wanted to load so I said all of them 🙂

5

Once I had loaded the data I looked at the queries and renamed some of the columns to make more sense to me. I also created some calculated columns by clicking New Column

I created a relative date column using this code from Chris Webb http://blog.crossjoin.co.uk/2013/01/24/building-relative-date-reports-in-powerpivot/

Relative Date Offset = INT([Date] - TODAY())
Relative Date = IF([Relative Date Offset] = 0
, "Today"
, "Today " & IF([Relative Date Offset] > 0, "+", "") & [Relative Date Offset])

This will enable me to show data for the last day

I also did the same for days of the week

DayOfWeek = CONCATENATE(WEEKDAY('Info AgentJobDetail'[LastRunTime],2),FORMAT('Info AgentJobDetail'[LastRunTime]," -dddd"))

Because I struggled to show the information about the operating system, I also created two columns for OS version and OS edition by adding columns as shown below

Operating System Version = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And

Operating System Edition = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

Then I started to play with the data.

This is probably not how a professional would phrase it but I would say that if you don’t know how to use a new application be brave and give it a try.

OBVIOUSLY you are a PROFESSIONAL DBA and will not do anything that would endanger production; use a backup of your database and work locally if you need to.

The first thing I wanted to know was how many servers I had by operating system, how many by SQL version, and their locations, so that I could answer the questions I had been asked. I had already written a query to get the correct information to give to the requestors, so I knew the correct answers, which was also an advantage. I did it like this

I expanded the Info ServerOSInfo query and dragged the ServerName field to the report which created a table of names

6

I then changed the ServerName values to Count

7

I then dragged the calculated column Operating System Version to the table

8

If I click on the table and then donut chart in the visualisations it changes to

9

So you can quickly see how you want the data displayed

I then decided to look at the number of SQL 2005 instances that I had and, as I had relationships between SQLInfo, InstanceList and Clients, I could build a more dynamic report.

I created a donut chart with SQLVersion as the legend and InstanceID as the values and a table of SQLVersion, ServerName and Instance Name. I also created a card that was count of InstanceID

10

Now it starts getting really useful. If I want to know how many SQL 2005 instances I have I simply click on SQL2005 in the donut chart and the rest of the report changes

11

This is very cool and I hope you can see how useful this could be, and how brilliant it would be to give the relevant people within the organisation the ability to look at that report and answer their own questions.

Let's take it to the next step. I have a location column in the InstanceList table which comprises town names. If I choose a map and drag that column to the Location field and set Values and Color Saturation to the Count of InstanceID

12

and create two tables, one of Client with a count of InstanceID and one of Location with a count of InstanceID, I can do this

13

Look at how it dynamically changes as you click on the data labels – this is very cool and makes me smile every time!! I altered the colour saturation to make it easier to see. Now if I am asked about SQL 2005 servers I can quickly click on SQL 2005 and

14

I can see that there are 32 instances, that most of them are in Southampton, and which clients they support

If I click a location rather than SQL version the report alters like so

15

So you can simply pass the report file to your colleagues to enable them to use it, or you can publish it to PowerBi.com. I am not going to go into any detail about the costs or licensing; I will just say it is as easy as clicking publish. If you wish to have the information automatically refreshed there are some more steps to go through, detailed here, which enable you to connect your on-premise database to Power BI using the data management gateway. Alternatively, you can simply refresh the data in the report, publish it again and replace the existing report.

Once the report is in powerbi.com you can enable Q and A on the data. This is some kind of supernatural, mystical, magical query language which enables you to query your data with natural language, altering the results as you type and even coping with (deliberate for the screenshot) spelling mistakes 🙂

16

I also created a report for my Agent Jobs to enable me to quickly and easily see which Jobs have failed in the last day

17

I did this by filtering the report by Relative Date Offset greater than -1 (today), IsEnabled = True and Outcome = Failed

There are many, many more ways I can see this being useful and I hope I have given you some ideas and encouraged you to try for yourself and find out more

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Setting Up and Using Azure VM SQL Automated Backup (and Restore)

This weekend I was creating some Azure VMs to test and was required to use the GUI for some screenshots. I have always used my PowerShell scripts described here to create my test systems, and with a new job taking up a lot of my time I had missed the announcements about Azure SQL Automated Backup and Azure SQL Automated Patching, so I was surprised to see this screen

1

I read the announcement and also the details on MSDN https://msdn.microsoft.com/en-us/library/azure/dn906091.aspx which show that this requires the SQL Server IaaS Agent. This is a default option on new virtual machines.

There are some other considerations too. It is only supported for SQL Server 2014 on Windows Server 2012 and 2012 R2 at present, and you can set a retention period to a maximum of 30 days, but it is automated. You do not have to decide upon the backup strategy; Azure will decide the frequency and type of backups depending upon the workload of the database and some other factors, such as

A full backup is taken
– When an instance is added to use Managed Backup
– When transaction log growth is 1 GB or more
– At least once a week
– If the log chain is broken
– When a database is created

A transaction log backup is taken
– If no log backup is found
– When transaction log space used is 5 MB or larger
– At least once every two hours
– Any time the transaction log backup is lagging behind a full database backup (the goal is to keep the log chain ahead of the full backup)

From <https://msdn.microsoft.com/en-gb/library/dn449496(v=sql.120).aspx>

There are some restrictions
– Only database backups are supported
– System databases are not supported, so you need to back those up yourself
– You can only back up to Azure storage
– Maximum backup size is 1 TB as this is the maximum size for a blob in Azure storage
– Simple recovery is not supported
– Maximum retention is 30 days – if you are required to keep your backups for longer than 30 days for regulatory or other reasons you could simply use Azure Automation to copy the files to another storage account in Azure (see the sketch after this list)
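As an illustration of that last point, blobs can be copied between storage accounts with the storage cmdlets, which an Azure Automation runbook could run on a schedule. This is a minimal sketch using the classic storage cmdlets; the account names, keys and container names are placeholders

## Minimal sketch - account, key and container names are placeholders
$srcCtx  = New-AzureStorageContext -StorageAccountName 'sourceaccount' -StorageAccountKey $srcKey
$destCtx = New-AzureStorageContext -StorageAccountName 'archiveaccount' -StorageAccountKey $destKey

## Start an asynchronous server-side copy of every blob in the backup container
Get-AzureStorageBlob -Container 'backupcontainer' -Context $srcCtx |
    Start-AzureStorageBlobCopy -DestContainer 'archivecontainer' -DestContext $destCtx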

How to set it up.

If you are using the GUI then you will find SQL Automated Backup in the optional config blade of the setup. You can follow the steps here to set it up. If (like me) you want to use PowerShell then use the following code after you have created your Virtual Machine

$storageaccount = "<storageaccountname>"
$storageaccountkey = (Get-AzureStorageKey -StorageAccountName $storageaccount).Primary
$storagecontext = New-AzureStorageContext -StorageAccountName $storageaccount -StorageAccountKey $storageaccountkey

$encryptionpassword = (Get-Credential -message 'Backup Encryption Password' -User 'IGNOREUSER').password
$autobackupconfig = New-AzureVMSqlServerAutoBackupConfig -StorageContext $storagecontext -Enable -RetentionPeriod 10 -EnableEncryption -CertificatePassword $encryptionpassword
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoBackupSettings $autobackupconfig | Update-AzureVM
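Before moving on, you can check what was configured. A quick way to do this (a sketch, assuming the same classic Azure module as above) is to query the SQL Server extension on the VM

## Inspect the SQL Server IaaS extension settings on the VM
$vm = Get-AzureVM -ServiceName <vmservicename> -Name <vmname>
($vm | Get-AzureVMSqlServerExtension).AutoBackupSettings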

Once you have run the configuration code, Azure will take care of the rest. Add a couple of databases to your instance, look in the storage account and you will see this

2

3

And in the automaticbackup container you will find the Certificates and master key backups

4

It will also create a credential

5

You can use the same credential to back up your system databases. If, like me, you use Ola Hallengren's excellent Maintenance Solution then simply change your system databases backup job as follows


USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id=1 ,
		@command=N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = ''SYSTEM_DATABASES'', @URL = ''https://myaccount.blob.core.windows.net/mycontainer''
		,  @Credential = ''AutoBackup_Credential'', @BackupType = ''FULL'', @Verify = ''Y'', @CleanupTime = NULL, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'
GO

If you need to restore your database then you can use the GUI and when you choose restore you will see this screen

6

Enter your storage account and the key, which you can get from the Azure portal. You will notice that the credential has already been selected; click connect and

7

There are all of your backups, ready to restore to any point in time that you choose. By clicking Script, the T-SQL is generated, which looks like this


USE [master]
BACKUP LOG [Test] TO  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_LogBackup_2015-07-16_06-21-26.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,
NOFORMAT, NOINIT,  NAME = N'Test_LogBackup_2015-07-16_06-21-26',
NOSKIP, NOREWIND, NOUNLOAD,  NORECOVERY ,  STATS = 5
RESTORE DATABASE [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714201240+00.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714202740+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714224241+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715005741+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715031242+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715052742+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715074243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715095743+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715121243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150716060004+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NOUNLOAD,  STATS = 5
GO

There is an important note. Remember this when you have just set it up so that you don’t think that you have done it wrong (which is what I did!)

When you enable Automated Patching for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the portal will not show that Automated Patching is configured. Wait several minutes for the agent to be installed and configured. After that the portal will reflect the new settings.

From <https://msdn.microsoft.com/en-us/library/azure/dn961166.aspx>

And also look out for this

8

The password I had chosen was not complex enough, but the PowerShell script had succeeded and had not given me the warning

To set up SQL Automated Patching you follow similar steps. The setting is again on the OS Config blade; click enable and then you can choose the frequency and duration of the patching. The PowerShell equivalent is shown below.
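For completeness, here is a sketch of the PowerShell route, assuming the same classic cmdlet family as the backup example above; the day, window and category values are just illustrations

## Configure automated patching - Sunday, 02:00 start, 60 minute window (illustrative values)
$autopatchingconfig = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek 'Sunday' -MaintenanceWindowStartingHour 2 -MaintenanceWindowDuration 60 -PatchCategory 'Important'
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $autopatchingconfig | Update-AzureVM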

It is important to remember to choose your maintenance window correctly. If you have set up your SQL VMs correctly you will have them in an availability set and be using either mirroring or Availability Groups, with the VMs in the same availability set to ensure availability during the underlying host patching. However, I had it confirmed by Principal Software Engineering Manager Sethu Srinivasan via Microsoft PFE Arvind Shyamsundar that SQL Automated Patching is not HA aware, so you will need to ensure that you set the maintenance windows on each VM so that they do not overlap

Scheduling Ola Hallengrens Maintenance Solution Default Jobs with Powershell

If you are a SQL Server DBA you should know about Ola Hallengren and will probably have investigated his Maintenance Solution.

If you haven’t please start here https://ola.hallengren.com/

You can also watch his presentation at SQLBits at this link

http://sqlbits.com/Sessions/Event9/Inside_Ola_Hallengrens_Maintenance_Solution

where he talks about and demonstrates the solution.

It is possible to just run his script to install the solution and schedule the jobs and know that you have made a good start in keeping your databases safe. You should be more proactive than that and set up specific jobs for your own particular requirements, but you can and should find that information in other places, including the FAQ on Ola's site

I particularly like the parameter @ChangeBackupType, which, when running the transaction log or differential backup, will change the backup type to full if that backup type cannot be taken (for example, for a database with no existing full backup). This is excellent for picking up new databases and backing them up soon after creation

When you run the script the jobs are created but not scheduled, and it is for this reason I created this function. All it does is schedule the jobs so that I know that they will be run when a new server is created and all the databases will be backed up. I can then go back at a later date and schedule them correctly for the server's workload or tweak them according to specific needs, but this allows me that fuzzy feeling of knowing that the backups and other maintenance will be performed.

To accomplish this I pass a single parameter, $Server, to the function. This is the connection string and should be in the format SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port

I then create a $srv SMO object as usual

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server

Create a JobServer object and a Jobs array which holds the Jobs

$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs

And set the schedule for each job. I pick each Job using the Where-Object Cmdlet and break out if the job does not exist

$Job = $Jobs|Where-Object {$_.Name -eq 'DatabaseBackup - SYSTEM_DATABASES - FULL'}
       if ($Job -eq $Null)
       {Write-Output "No Job with that name"
       break}

Then I create a Schedule object and set its properties and create the schedule

$Schedule = new-object Microsoft.SqlServer.Management.Smo.Agent.JobSchedule ($job, 'Daily - Midnight ++ Not Sunday')
$Schedule.ActiveEndDate = Get-Date -Month 12 -Day 31 -Year 9999
$Schedule.ActiveEndTimeOfDay = '23:59:59'
$Schedule.FrequencyTypes = "Weekly"
$Schedule.FrequencyRecurrenceFactor = 1
$Schedule.FrequencySubDayTypes = "Once"
$Schedule.FrequencyInterval = 126 # Weekdays 62 + Saturday 64 - https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.agent.jobschedule.frequencyinterval.aspx
$Schedule.ActiveStartDate = get-date
$schedule.ActiveStartTimeOfDay = '00:16:00'
$Schedule.IsEnabled = $true
$Schedule.Create()

I have picked this example for the blog as it shows some of the less obvious gotchas. Setting the active end date could only be achieved by using the Get-Date cmdlet and defining the date. The schedule frequency interval above is for every day except Sundays. This is achieved by using the following table from MSDN, which is always my first port of call when writing these scripts

WeekDays.Sunday = 1
WeekDays.Monday = 2
WeekDays.Tuesday = 4
WeekDays.Wednesday = 8
WeekDays.Thursday = 16
WeekDays.Friday = 32
WeekDays.Saturday = 64
WeekDays.WeekDays = 62
WeekDays.WeekEnds = 65
WeekDays.EveryDay = 127

Combine values using an OR logical operator to set more than a single day. For example, combine WeekDays.Monday and WeekDays.Friday (FrequencyInterval = 2 + 32 = 34) to schedule an activity for Monday and Friday.

It is easy using this to set up whichever schedule you wish by combining the numbers. I would advise commenting it in the script so that your future self or following DBAs can understand what is happening.
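To make the combination self-documenting, a small helper along these lines could build the value from day names. Get-FrequencyInterval is a hypothetical name for illustration, not part of the original function

## Hypothetical helper - builds the FrequencyInterval bitmask from day names
$WeekDayValues = @{
    Sunday = 1; Monday = 2; Tuesday = 4; Wednesday = 8
    Thursday = 16; Friday = 32; Saturday = 64
}
function Get-FrequencyInterval {
    param([string[]]$Days)
    $Interval = 0
    foreach ($Day in $Days) { $Interval = $Interval -bor $WeekDayValues[$Day] }
    $Interval
}
## Every day except Sunday - returns 126, matching the schedule above
Get-FrequencyInterval -Days Monday, Tuesday, Wednesday, Thursday, Friday, Saturday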

You can tweak this script or use the code to work with any Agent Jobs and set the schedules accordingly, and you can check that you have set the schedules correctly with this code

   $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
   $JobServer = $srv.JobServer
   $Jobs = $JobServer.Jobs
   $ActiveStartTimeOfDay = @{Name = "ActiveStartTimeOfDay"; Expression = {$_.JobSchedules.ActiveStartTimeOfDay}}
   $FrequencyInterval = @{Name = "FrequencyInterval"; Expression = {$_.JobSchedules.FrequencyInterval}}
   $FrequencyTypes = @{Name = "FrequencyTypes"; Expression = {$_.JobSchedules.FrequencyTypes}}
   $IsEnabled = @{Name = "IsEnabled"; Expression = {$_.JobSchedules.IsEnabled}}
   $Jobs|Where-Object{$_.Category -eq 'Database Maintenance'}|Select-Object Name,$IsEnabled,$FrequencyTypes,$FrequencyInterval,$ActiveStartTimeOfDay|Format-Table -AutoSize

You can get the script from Script Center via the link below or by searching for “Ola” using the script browser add-in straight from ISE

browser

https://gallery.technet.microsoft.com/scriptcenter/Schedule-Ola-Hallengrens-a66a3c89

Triggering a System Center Configuration Manager deployment task

A slightly different topic today.

Once you have built up knowledge, you become the person that people ask to solve things. This is something I really enjoy, taking a problem and solving it for people and in the process teaching them and enabling them to automate more things.

A colleague was performing a new deployment of a product via SCCM and wanted to trigger the clients to update and receive the new update instead of waiting for it to be scheduled.

They had found some code that would do this

Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null

They had the idea of using this command with a text file containing the machine names and PowerShell remoting.

I looked at it a different way and gave them a function so that they could provide the collection name (in SCCM a collection is a list of machines for a specific purpose) and the function would import the SCCM module, connect to the site, get the names of the machines in the collection and run the command on each one

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# PS script block to run on each machine
$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
## open the drive for SCCM - replace <Site Code> with your site code, or add a $SiteCode parameter and use cd "$($SiteCode):"
cd <Site Code>:\
# Get computer names in the collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

Invoke-Command -ComputerName $PCs -ScriptBlock $scriptblock -ThrottleLimit 50

}

This would work very well, but they wanted some error checking to enable them to identify machines they were unable to connect to following the deployment, so the final solution, which will run a little slower, is below.

Set up function and parameters and create log files

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# Create log file
$StartTime = Get-Date
$Date = Get-Date -Format ddMMyyHHss
$Errorlogpath = "C:\temp\SCCMError" + $Date + ".txt"
$Successlogpath = "C:\temp\SCCMSuccess" + $Date + ".txt"
New-Item -Path $Errorlogpath -ItemType File
New-Item -Path $Successlogpath -ItemType File

$StartLog = "Script Started at $StartTime"
$StartLog | Out-File -FilePath $Successlogpath -Append

Create the script block, import the SCCM module, connect to the SCCM site and get the machines in the collection. Note that you will have to replace <Site Code> with your own site code

 

$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
#open drive for SCCM 
cd <Site Code>:\ ## replace <Site Code> with your site code, or add a $SiteCode parameter and use cd "$($SiteCode):"
# Get Computer names in collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

I then loop through the machines in the collection with foreach ($PC in $PCs). I wanted to give them a progress output, so I needed to identify the number of machines in the collection using the Count property and output the position of the current item within the array, which I did with

$a= [array]::IndexOf($PCs, $PC) + 1
Write-Output " Connecting to PC - $PC -- $a of $count"

I then pinged the machine, ran the script block, wrote to the log files and finally opened the log files

if (Test-Connection $PC -Quiet -Count 1)
{   
# Run command on PC
Invoke-Command -ComputerName $PC -scriptblock $scriptblock
$Success = "SUCCESS - finished - $PC -- $a of $count" 
 $Success | Out-File -FilePath $Successlogpath -Append
Write-Output $Success
}
else
{
$ErrorMessage = "ERROR - $PC is not available -- $PC -- $a of $count"
$ErrorMessage| Out-File -FilePath $Errorlogpath -Append 
Write-Output $ErrorMessage
}
}

notepad $Errorlogpath
notepad $Successlogpath

Now they can load the function into their PowerShell sessions and type

Trigger-DeploymentCycle -CollectionName COLLECTIONNAME

and they will be able to manually trigger the tasks. This function will trigger the following tasks for a list of PCs in a collection.

Machine Policy Assignment Request: {00000000-0000-0000-0000-000000000021}
Machine Policy Evaluation: {00000000-0000-0000-0000-000000000022}
Software Inventory: {00000000-0000-0000-0000-000000000002}
Application Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000121}

Here is the list of other tasks you can trigger:

Discovery Data Collection Cycle: {00000000-0000-0000-0000-000000000003}
Hardware Inventory Cycle: {00000000-0000-0000-0000-000000000001}
Machine Policy Retrieval and Evaluation Cycle: {00000000-0000-0000-0000-000000000021}
Software Metering Usage Report Cycle: {00000000-0000-0000-0000-000000000031}
Software Updates Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000108}
Software Updates Scan Cycle: {00000000-0000-0000-0000-000000000113}
Software Update Deployment Re-evaluation Cycle: {00000000-0000-0000-0000-000000000114}
Windows Installer Source List Update Cycle: {00000000-0000-0000-0000-000000000032}
Refresh Default Management Point: {00000000-0000-0000-0000-000000000023}
Refresh Location (AD Site or Subnet): {00000000-0000-0000-0000-000000000024}
Refresh Proxy Management Point: {00000000-0000-0000-0000-000000000037}
Cleanup Policy: {00000000-0000-0000-0000-000000000040}
Validate Assignments: {00000000-0000-0000-0000-000000000042}
Certificate Maintenance: {00000000-0000-0000-0000-000000000051}
Branch DP Scheduled Maintenance: {00000000-0000-0000-0000-000000000061}
Branch DP Provisioning Status Reporting: {00000000-0000-0000-0000-000000000062}
State Message Upload: {00000000-0000-0000-0000-000000000111}
State Message Cache Cleanup: {00000000-0000-0000-0000-000000000112}

You can find the function here

Trigger-Deployment

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.

Show AutoGrowth Events with Powershell to CSV

This week I was reading Pinal Dave's post about Autogrowth Events

http://blog.sqlauthority.com/2015/02/03/sql-server-script-whenwho-did-auto-grow-for-the-database/

As it happened, I had a requirement to make use of the script only a few days later. I was asked to provide the information in a CSV so that the person who required the information could manipulate it in Excel.

I am a great believer in automation. If you are going to do something more than once then automate it, so I wrote two functions, added them to TFS and now they will be available to all of my team members next time they load PowerShell.

Why two functions? Well, Pinal Dave's script gets the information from the default trace for a single database, but there may be times when you need to know the autogrowth events that happened on a server with multiple databases.

I use a very simple method for doing this as I have not found the correct way to parse the default trace with PowerShell. The functions rely on Invoke-Sqlcmd2, which I also have in my functions folder, and pass the query from Pinal Dave's blog post as a here-string
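For anyone unfamiliar with them, a here-string preserves the T-SQL verbatim, line breaks and quotes included. The body below is deliberately a placeholder, since the actual query comes from Pinal Dave's post

## The query body is elided - paste the T-SQL from the blog post between the markers
$Query = @"
<T-SQL from Pinal Dave's post goes here>
"@

The $Query variable is then passed to Invoke-Sqlcmd2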

$Results = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query $Query

To output to CSV I use the Export-CSV cmdlet

if($CSV)
{
$Results| Export-Csv -Path $CSV
}

And to open the CSV I add a [switch] parameter. You can find out more about parameters here or by

Get-Help about_Functions_Advanced_Parameters

so the parameter block of my function looks like

param
(
[Parameter(Mandatory=$true)]
[string]$Server,
[Parameter(Mandatory=$true)]
[string]$Database,
[Parameter(Mandatory=$false)]
[string]$CSV,
[Parameter(Mandatory=$false)]
[switch]$ShowCSV
)
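Inside the function the switch can then gate opening the file. This is a sketch of how that might look rather than the exact body of my function

## Open the CSV in the default handler (usually Excel) when -ShowCSV is supplied
if ($ShowCSV -and $CSV)
{
Invoke-Item -Path $CSV
}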

Now when I am asked again to provide this information it is as easy as typing

Show-AutogrowthServer -Server SQL2014Ser12R2 

or

Show-AutogrowthDatabase -Server SQL2014Ser12R2 -Database Autogrowth

and the results will be displayed as below

autogrowth

Just a side note: Pinal Dave's script uses @@SERVERNAME in the WHERE clause, and if you have renamed your host the results will be blank. The resolution to this is to run the following T-SQL

sp_dropserver 'OLDSERVERNAME';
GO
sp_addserver 'NEWSERVERNAME', local;
GO
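To check whether a server is affected before running the functions, you could compare @@SERVERNAME with SERVERPROPERTY('ServerName'). A sketch using the same Invoke-Sqlcmd2 helper as above

## If the two names differ, the host has been renamed and @@SERVERNAME is stale
$Names = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query "SELECT @@SERVERNAME AS LoggedName, SERVERPROPERTY('ServerName') AS ActualName"
if ($Names.LoggedName -ne $Names.ActualName)
{
Write-Warning "Server has been renamed: @@SERVERNAME returns $($Names.LoggedName) but the host is $($Names.ActualName)"
}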

You can find the scripts here

Show-AutoGrowthServer

Show-AutoGrowthDatabase

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.