Using Power Bi with my DBA Database

Every good DBA should have a DBA database. A place to store information about all of their instances and databases.

I have an InstanceList table which looks like this

CREATE TABLE [dbo].[InstanceList](
[InstanceID] [int] IDENTITY(1,1) NOT NULL,
[ServerName] [nvarchar](50) NOT NULL,
[InstanceName] [nvarchar](50) NOT NULL,
[Port] [int] NOT NULL,
[AG] [bit] NULL,
[Inactive] [bit] NULL CONSTRAINT [DF_InstanceList_Inactive] DEFAULT ((0)),
[Environment] [nvarchar](25) NULL,
[Location] [nvarchar](30) NULL,
CONSTRAINT [PK_InstanceList_ID] PRIMARY KEY CLUSTERED
(
[InstanceID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

I use this as the basis for all of my information gathering. By adding Server Name, Instance Name, Port, Environment and Location to the table I can use overnight Agent jobs to run PowerShell scripts (there is a sketch of the pattern below) that gather information about all of the instances. This way the information is dynamic and gathered from the server, so when we add RAM and change max memory this is updated the next time the script runs. You can also automate your installation and decommission procedures (using PowerShell scripts) to add the information to the DBA database automatically.

I have 4 scripts

  • ServerInfo which gathers Windows OS information such as version and edition of the operating system, number of processors, amount of RAM, IP address, domain name etc
  • SQLInfo which gathers information about the instance such as SQL version, edition, collation, max and min memory, MAXDOP, service accounts and start modes, default file locations etc
  • A database script which gathers information such as size, data usage, index usage, last backup dates, owner and many more
  • Agent Job which gathers the outcome of the jobs that have run, their names and categories into two tables: one for a server rollup and one for details about each job
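To give an idea of the pattern these scripts follow, here is a minimal sketch. The instance name, database name, and the gathered-info table and column names are placeholders for illustration; the real scripts gather far more than one setting.

# A sketch of the overnight gathering pattern - all names are placeholders
Import-Module sqlps -DisableNameChecking

$query = "SELECT ServerName, InstanceName, Port FROM dbo.InstanceList WHERE Inactive = 0"
$instances = Invoke-Sqlcmd -ServerInstance 'DBAServer' -Database 'DBADatabase' -Query $query

foreach ($row in $instances)
{
    # connect to each instance with SMO using the details from InstanceList
    $connection = "$($row.ServerName)\$($row.InstanceName),$($row.Port)"
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $connection

    # gather whatever is needed - max server memory, for example
    $maxmem = $srv.Configuration.MaxServerMemory.ConfigValue

    # write the value back to the DBA database
    $insert = "INSERT INTO dbo.SQLInfo (ServerName, InstanceName, MaxMemoryMB) VALUES ('$($row.ServerName)','$($row.InstanceName)',$maxmem)"
    Invoke-Sqlcmd -ServerInstance 'DBAServer' -Database 'DBADatabase' -Query $insert
}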

Recently I have received a lot of requests for information from various sources: auditors asking about encryption and backup policies, project managers asking about database and SQL versions, compliance asking about the numbers of Windows 2003 servers or SQL 2005 servers, and systems teams asking which servers in a particular location can be turned off at which time, dependent on which system they are supporting, for a power down.

Before we had the DBA database holding all of the information about the instances we would have struggled to compile this information. When I saw Power BI was released to GA I thought that it would be a good place to start to learn about it. By using data that I understood, and answering questions where I knew the format of the answer, I could be more confident about experimenting – i.e. if I know I have 100 servers then any result for servers that exceeds that is incorrect.

I have never been a BI guy and I claim no expertise in the correct methods of manipulating the data. There may very well be better methods of achieving these results and if there are please feel free to comment below so that I can improve my knowledge and keep on learning.

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I have no servers in Bolton, it is where I am from originally!!

I downloaded Power BI Desktop from powerbi.com and ran the installer and the first screen you see is this one

1

I then clicked on Get Data

2

And then SQL Server, filled in the details for my DBA database and clicked Connect

3

I used my current Windows credentials

4

It then asked me which tables I wanted to load so I said all of them 🙂

5

Once I had loaded the data I looked at the queries and renamed some of the columns to make more sense to me. I also created some calculated columns by clicking New Column

I created a relative date column using this code from Chris Webb http://blog.crossjoin.co.uk/2013/01/24/building-relative-date-reports-in-powerpivot/

Relative Date Offset=INT([Date] - TODAY())
Relative Date=IF([Relative Date Offset]=0
, "Today"
, "Today " & IF([Relative Date Offset]>0, "+", "") & [Relative Date Offset])

This will enable me to show data for the last day

I also did the same for days of the week

DayOfWeek = CONCATENATE(WEEKDAY('Info AgentJobDetail'[LastRunTime],2),FORMAT('Info AgentJobDetail'[LastRunTime]," -dddd"))

Because I struggled to show the information about the operating system I also created two columns for OS version and OS edition by adding columns as shown below

Operating System Version = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And

Operating System Edition = SWITCH('Info ServerOSInfo'[OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

Then I started to play with the data.

This is probably not how a professional would phrase it but I would say that if you don’t know how to use a new application be brave and give it a try.

OBVIOUSLY you are a PROFESSIONAL DBA and will not do anything that would endanger production, use a backup of your database and work locally if you need to.

The first thing I wanted to know was how many servers I had by operating system, how many by SQL version and where they were located, so that I could answer the questions I had been asked. I had already written a query to get the correct information to give to the requestors, so I knew the correct answers, which was also an advantage.
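For comparison, the sort of sanity check I could run against the report looks something like this. It is only a sketch; the instance, database and table names are placeholders standing in for my real ones.

# a sketch of the sanity check - instance, database and table names are placeholders
$query = @"
SELECT OperatingSystem, COUNT(*) AS Servers
FROM dbo.ServerOSInfo
GROUP BY OperatingSystem;
"@
Invoke-Sqlcmd -ServerInstance 'DBAServer' -Database 'DBADatabase' -Query $query | Format-Table -AutoSize

Knowing the numbers that the query returns, I could check the report as I built it. I did this like this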

I expanded the Info ServerOSInfo query and dragged the ServerName field to the report which created a table of names

6

I then changed the ServerName values to Count

7

I then dragged the calculated column Operating System Version to the table

8

If I click on the table and then donut chart in the visualisations it changes to

9

So you can quickly see how you want the data displayed

I then decided to look at the number of SQL 2005 instances that I had and, as I had relationships between SQLInfo, InstanceList and Clients, I could build a more dynamic report.

I created a donut chart with SQLVersion as the legend and InstanceID as the values and a table of SQLVersion, ServerName and Instance Name. I also created a card that was count of InstanceID

10

Now it starts getting really useful. If I want to know how many SQL 2005 instances I have I simply click on SQL2005 in the donut chart and the rest of the report changes

11

This is very cool and I hope you can see how useful this could be, and how brilliant it would be to give relevant people within the organisation the ability to look at that report and answer their own questions.

Let's take it to the next step. I have a Location column in the InstanceList table which comprises town names. If I choose a map and drag that column to the Location field and set Values and Color Saturation to the Count of InstanceID

12

and create two tables, one of Client with a count of InstanceID and one of Location with a count of InstanceID, I can do this

13

Look at how it dynamically changes as you click on the data labels – This is very cool and makes me smile every time!! I altered the colour saturation colours to make it easier to see. Now if I am asked about SQL 2005 servers I can quickly click on SQL 2005 and

14

I can see that there are 32 instances, most are in Southampton, and which clients they support

If I click a location rather than SQL version the report alters like so

15

So you can simply pass the report file to your colleagues to enable them to use it, or you can publish it to powerbi.com. I am not going to go into any detail about the costs or licensing; I will just say it is as easy as clicking publish. If you wish to have the information automatically refreshed there are some more steps that you would need to go through, which are detailed here, to connect your on-premises database to Power BI using the data management gateway. Alternatively you can simply refresh the data in the report, publish it again and replace the existing report.

Once the report is in powerbi.com you can enable Q&A on the data. This is some kind of supernatural mystical magical query language which enables you to query your data with natural language. It will alter the results as you type and even cope with (deliberate for the screenshot) spelling mistakes 🙂

16

I also created a report for my Agent Jobs to enable me to quickly and easily see which Jobs have failed in the last day

17

I did this by filtering the report by Relative Date Offset greater than -1 (today), IsEnabled = True and Outcome = Failed

There are many, many more ways I can see this being useful and I hope I have given you some ideas and encouraged you to try for yourself and find out more.

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Setting Up and Using Azure VM SQL Automated Backup (and Restore)

This weekend I was creating some Azure VMs to test and was required to use the GUI for some screenshots. I have always used my PowerShell scripts described here to create my test systems, and with a new job taking up a lot of my time I had missed the announcement about Azure SQL Automated Backup and Azure SQL Automated Patching, so I was surprised to see this screen

1

I read the announcement and also the details on MSDN https://msdn.microsoft.com/en-us/library/azure/dn906091.aspx which show that this requires the SQL Server IaaS Agent. This is a default option on new virtual machines.

There are some other considerations too. It is only supported for SQL Server 2014 and Windows Server 2012 and 2012 R2 at present, and you can set a retention period to a maximum of 30 days, but it is automated. You do not have to decide upon the backup strategy; Azure will decide the frequency and type of backups dependent upon the workload of the database and some other factors, such as

A full backup is taken
– When an instance is added to use Managed Backup
– When transaction log growth is 1GB or more
– At least once a week
– If the log chain is broken
– When a database is created

A transaction log backup is taken
– If no log backup is found
– When transaction log space used is 5MB or larger
– At least once every two hours
– Any time the transaction log backup is lagging behind a full database backup. The goal is to keep the log chain ahead of the full backup.

From <https://msdn.microsoft.com/en-gb/library/dn449496(v=sql.120).aspx>

There are some restrictions
– Only database backups are supported
– System databases are not supported so you need to back those up yourself
– You can only back up to Azure storage
– Maximum backup size is 1TB as this is the maximum size for a blob in Azure storage
– Simple recovery is not supported
– Maximum retention is 30 days – if you are required to keep your backups for longer than 30 days for regulatory or other reasons you could simply use Azure Automation to copy the files to another storage account in Azure (see the sketch below)
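Something like this could handle that copy. It is an untested sketch using the classic Azure storage cmdlets from the same module used later in this post; the account names, keys and container names are all placeholders:

# copy the backup blobs to a second storage account for longer retention (sketch)
$sourcekey = (Get-AzureStorageKey -StorageAccountName 'backupaccount').Primary
$sourcectx = New-AzureStorageContext -StorageAccountName 'backupaccount' -StorageAccountKey $sourcekey
$archivekey = (Get-AzureStorageKey -StorageAccountName 'archiveaccount').Primary
$archivectx = New-AzureStorageContext -StorageAccountName 'archiveaccount' -StorageAccountKey $archivekey

Get-AzureStorageBlob -Container 'automaticbackup' -Context $sourcectx |
    Start-AzureStorageBlobCopy -DestContainer 'archive' -DestContext $archivectx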

How to set it up.

If you are using the GUI then you will find SQL Automated Backup in the optional config blade of the setup. You can follow the steps here to set it up. If (like me) you want to use PowerShell then use the following code after you have created your virtual machine

# get the storage account key and build a storage context
$storageaccount = "<storageaccountname>"
$storageaccountkey = (Get-AzureStorageKey -StorageAccountName $storageaccount).Primary
$storagecontext = New-AzureStorageContext -StorageAccountName $storageaccount -StorageAccountKey $storageaccountkey

# prompt for an encryption password and build the automated backup configuration
$encryptionpassword = (Get-Credential -Message 'Backup Encryption Password' -UserName 'IGNOREUSER').Password
$autobackupconfig = New-AzureVMSqlServerAutoBackupConfig -StorageContext $storagecontext -Enable -RetentionPeriod 10 -EnableEncryption -CertificatePassword $encryptionpassword

# apply the settings to the VM via the SQL Server IaaS Agent extension
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoBackupSettings $autobackupconfig | Update-AzureVM
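If you want to check what has been applied, the extension settings can be read back with the matching Get cmdlet; a quick sketch using the same placeholders:

# read back the SQL Server IaaS Agent extension settings (sketch)
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Get-AzureVMSqlServerExtension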

Once you have run the code, Azure will take care of the rest. Add a couple of databases to your instance and look in the storage account and you will see this

2

3

And in the automaticbackup container you will find the Certificates and master key backups

4

It will also create a credential

5

You can use the same credential to back up your system databases. If like me you use Ola Hallengren's excellent Maintenance Solution then simply change your system databases backup job as follows


USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id=1 ,
		@command=N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = ''SYSTEM_DATABASES'', @URL = ''https://myaccount.blob.core.windows.net/mycontainer''
		,  @Credential = ''AutoBackup_Credential'', @BackupType = ''FULL'', @Verify = ''Y'', @CleanupTime = NULL, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'
GO

If you need to restore your database then you can use the GUI and when you choose restore you will see this screen

6

Enter your storage account and the key which you can get from the Azure portal. You will notice that the credential has already been selected, click connect and

7

There are all of your backups ready to restore to any point in time that you choose. By clicking script the T-SQL is generated which looks like this


USE [master]
BACKUP LOG [Test] TO  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_LogBackup_2015-07-16_06-21-26.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,
NOFORMAT, NOINIT,  NAME = N'Test_LogBackup_2015-07-16_06-21-26',
NOSKIP, NOREWIND, NOUNLOAD,  NORECOVERY ,  STATS = 5
RESTORE DATABASE [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714201240+00.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714202740+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714224241+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715005741+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715031242+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715052742+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715074243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715095743+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715121243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150716060004+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NOUNLOAD,  STATS = 5
GO

There is an important note. Remember this when you have just set it up so that you don’t think that you have done it wrong (which is what I did!)

When you enable Automated Patching for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the portal will not show that Automated Patching is configured. Wait several minutes for the agent to be installed and configured. After that the portal will reflect the new settings.

From <https://msdn.microsoft.com/en-us/library/azure/dn961166.aspx>

And also look out for this

8

The password I had chosen was not complex enough, but the PowerShell script had succeeded and not given me the warning.

To set up SQL Automated Patching you follow similar steps. The setting is again on the optional config blade; click enable and then you can choose the frequency and duration of the patching.
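In PowerShell it looks much like the backup configuration, using the matching cmdlet from the same module. A sketch; verify the parameter names against your module version:

# configure automated patching - a two hour window early on a Sunday (sketch)
$autopatchingconfig = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek 'Sunday' -MaintenanceWindowStartingHour 2 -MaintenanceWindowDuration 120 -PatchCategory 'Important'
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $autopatchingconfig | Update-AzureVM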

It is important to remember to choose your maintenance window correctly. If you have set up your SQL VMs correctly you will have them in an availability set and be using either mirroring or Availability Groups, with the VMs in the same availability set to ensure availability during the underlying host patching. However, I had it confirmed by Principal Software Engineering Manager Sethu Srinivasan (twitter) via Microsoft PFE Arvind Shyamsundar (blog|twitter) that SQL Automated Patching is not HA-aware, so you will need to set the maintenance windows on each VM to ensure that they do not overlap.

#tsql2sday #60 – Something New Learned – Problem Step Recorder

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

This month's blog party is hosted by Chris Yates (blog|twitter) who asked people to share something newly learned.

I love being a part of the SQL community. It gives me the opportunity to learn as much as I want to about anything I can think of within the data field. In the last couple of months I have presented at Newcastle User Group and learnt about migrating SQL using PowerShell with Stuart Moore. At our user group in Exeter http://sqlsouthwest.co.uk/ we had Steph Middleton talking about version control for databases and lightning talks from Pavol Rovensky on Mocking in C#, John Martin on Azure fault domains and availability sets (using a pen and a whiteboard!), Annette Allen on Database Unit Testing and Terry McCann on SQL Certifications. We also had Jonathan Allen talking about some free tools and resources to help manage both large and small SQL environments. I went to SQL Relay in Southampton and saw Stuart Moore (again!), Scott Klein, Alex Yates and James Skipworth, and I joined the PASS DBA fundamentals virtual chapter webinar for Changing Your Habits to Improve the Performance of Your T-SQL by Mickey Stuewe – and that's only the 'in-person' learning that I did. I also read a lot of blog posts!

But instead of repeating what I learnt from others within the community I thought I would write a blog post that I have been meaning to write for a few weeks, about a solution pre-built into Windows that appears not to be well known: Problem Step Recorder.

What is PSR?

A couple of months ago I found out about a little-known tool included in the Windows operating system which enables you to record what you are doing by taking screenshots of every mouse click. The tool is Step Recorder, also known as PSR. It is included by default in Windows 7, Windows 8 and 8.1, and Windows Server 2008 and above.

What does it do?

Simply put, it records "This is what I did". There are many situations when this can be useful

  • You can use this during installations to help create documentation. “This is what I did” when I installed X and now you can follow those steps and I know I haven’t missed anything.
  • You can use it when communicating with 3rd parties or other support teams. “This is what I did” when I got this error and here are all of the steps so that you can re-create the issue and I know that I haven’t missed anything
  • You can use this when resolving high priority incidents. “This is what I did” when System X broke, it includes all of the times of my actions.
    I still keep my notepad by my keyboard out of habit but I have a record of the exact steps that I took to try to resolve the issue which will be very useful for reporting on the incident in the near future and also placing into a Knowledge Base for others to use if it happens again and I know I haven’t missed anything
  • For assisting family members. Like many, I am “The IT guy” and PSR enables me to provide clear instructions with pictures showing exactly where I clicked to those family members who are having trouble with “The internet being broken”

It does this by automatically taking a screenshot after every mouse click or program event, with a timestamp and a description of what happened. It does not record keystrokes though, so if you need to record what you have typed there are some manual steps required.

So how do you access PSR?

Simple. Type “psr” into the run box, cmd or PowerShell and it will open

Untitled picture

Once you click on Start Record it will start recording your clicks and taking screenshots. However, I always open the settings first by clicking on the drop down to the left of the help icon and change the number of recent screen captures to store to the maximum value of 100.

1Untitled picture

If you do not, you will get no warning but PSR will only save the last 25 screenshots it takes, and your results will look like the below. It will still record your actions but not keep the screenshots.


Step 16: (‎09/‎11/‎2014 13:47:45) User left click on “Chris Yates (@YatesSQL) | Twitter (tab item)”

No screenshots were saved for this step.


Step 17: (‎09/‎11/‎2014 13:47:47) User left click on “The SQL Professor | ‘Leadership Through Service’ (text)”

No screenshots were saved for this step.


Step 18: (‎09/‎11/‎2014 13:47:47) User left click on “T-SQL Tuesday #60 – Something New Learned | The SQL Professor (text)” in “T-SQL Tuesday #60 – Something New Learned | The SQL Professor – Google Chrome”

untitled

You can also set the name and location of the saved file in the settings, but if you leave it blank it will prompt for a location and name once you click Stop Record.
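You can also drive PSR from the command line if you prefer. This is a sketch from memory, so check the switches with psr.exe /? on your build of Windows before relying on them:

# start recording without the GUI, keep up to 100 screenshots and save straight to a zip
psr.exe /start /gui 0 /output C:\Temp\MySteps.zip /maxsc 100

# ...perform the actions you want recorded...

# stop recording and write the file
psr.exe /stop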

How do I add keyboard input?

PSR allows you to add keyboard input manually. You may need this if you need to include the text you have entered into prompts or address bars, or if you wish to add further comment. You can do this by clicking Add Comment, drawing a box around the relevant part of the screen for the text input and entering the text into the box

2Untitled picture

In the results this looks like

Step 1: (‎09/‎11/‎2014 12:56:22) User Comment: “http://www.microsoft.com/en-gb/download/details.aspx?id=42573

untitled1

What do the results look like?

Once you have finished the actions that you want to record (or when you think you are close to 100 screenshots), click Stop Record and the following screen will be displayed

3Untitled picture

This allows you to review what PSR has recorded. You can then save it to a location of your choice. It is saved as a zip file which has a single .mht file in it. You can open the file without unzipping the archive and it will open in Internet Explorer. As you can see from the shots below, you can run PSR on your client and it will still record actions in your RDP sessions, although it does not record as much detail. The first two are on my SCOM server in my lab and the second two are on the laptop using the SCOM console


Step 11: (‎09/‎11/‎2014 13:02:13) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled2


Step 12: (‎09/‎11/‎2014 13:02:16) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled3


Step 13: (‎09/‎11/‎2014 13:06:25) User right click on “Management Packs (tree item)” in “Agent Managed – THEBEARDMANAGEMENTGROUP – Operations Manager”

untitled4


Step 14: (‎09/‎11/‎2014 13:06:27) User left click on “Import Management Packs… (menu item)”

untitled5

You can then use the zip file as you wish. Maybe you email it to your third party support team (once you have edited any confidential data), or you attach it to your incident in your IT Service Management solution or to a report. If you wish to create documentation you can open the .mht file in Word, edit it as you see fit and save it appropriately.

So that is one of the many things that I have learnt recently and I am looking forward to seeing what others have learnt, especially as many will have just been to the SQL PASS Summit. You will be able to find the other posts in this blog party in the comments on Chris's page.

Generating T-SQL Randomly with PowerShell

I have a lab on my laptop running various servers so that I can problem solve and learn, and recently I wanted to add several months of data into a database. I had created a stored procedure to take some parameters, perform some logic and insert the data.

To execute the stored procedure in T-SQL I simply run this

EXECUTE [dbo].[usp_Insert_DriveSpace] 'Server1','C','2014-11-05','100','25'

which uses the server name, drive letter, date, capacity and free space to add the data

In my wisdom I decided to create some data that was more 'real-life'. I was interested in storing drive space data and will be learning how to write reports on it. To do this I had pre-populated some tables in the database with 10 server names, each with 5 drives, so I needed 10*5*90 or 4500 statements.

I wanted to populate this with about 3 months of data as if it had been gathered every day. I read this post about using CTEs to create sequences and I am sure it can be done this way, but I don't have the T-SQL skills to do so. If someone can (or has) done that please let me know, as I am trying to improve my T-SQL skills and would be interested in how to approach and solve this problem with T-SQL.

I solved it with PowerShell in this way.

Created an array of Servers and an array of Drives to enable me to iterate through each.

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'

Set the drive capacity for each drive. To make my life slightly easier I standardised my ‘servers’

$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

I needed to create a date. You can use Get-Date to get today's date, and to get dates or times in the future or the past you can use the AddDays() method. You can also add ticks, milliseconds, seconds, minutes, hours, months or years

(Get-Date).AddDays(1)

I then needed to format the date. This is slightly confusing. If you just use Get-Date to get the current date (time) then you can use the -Format or -UFormat parameter to format the output

Get-Date -Format yyyyMMdd
Get-Date -UFormat %Y%m%d

However this does not work once you have used the AddDays() method. You have to use the ToString() method

 $Date = (get-date).AddDays(-7).ToString('yyyy-MM-dd')

To replicate gathering data each day I decided to use a while loop. I set $x to -98, pressed CTRL and J to bring up Snippets, typed w and picked the while loop. You can find out more about snippets in my previous post. I started in the past so that all the identity keys incremented in a real-life manner, oldest to newest.

$x = -98
while ($x -le 0)
{
    $Date = (get-date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {

I could then use the while loop to generate data for each day and loop through each server and each drive and generate the T-SQL but I wanted more!

I wanted to generate some random numbers for the free space available for each drive. I used the Get-Random cmdlet. If you are going to use it, make sure you read this post so that you don't get caught by the gotcha. I decided to set the free space for my OS, Data and Log file drives to somewhere between 70 and 3 GB free, as in this imaginary scenario these drives are carefully monitored and the data and log file sizes are under the control of a careful DBA, but still able to go below thresholds.

if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3

I set the TempDB drive to have either 4, 7 or 11 GB free so that I can try to colour code my reports depending on values, and if one field only has three values it makes it simpler to verify.

I set the Backup Drive to somewhere between 50 and 0 so that I will hit 0 sometimes!!

Here is the full script. It generated 4500 T-SQL statements in just under 16 seconds

$Servers = 'Server1','Server2','Server3','Server4','Server5','Server6','Server7','Server8','Server9','Server10'
$Drives = 'C','D','E','F','G'
$CDriveCapacity = 100
$DDriveCapacity = 50
$EDriveCapacity = 200
$FDriveCapacity = 200
$GDriveCapacity = 500

$x = -98
while ($x -le 0)
{
    $Date = (Get-Date).AddDays($x).ToString('yyyy-MM-dd')

    foreach($Server in $Servers)
    {
        foreach ($Drive in $Drives)
        {
            if($Drive -eq 'C')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$CDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'D')
            {
            $Free = Get-Random -InputObject 4,7,11
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$DDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'E')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$EDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'F')
            {
            $Free = Get-Random -Maximum 70 -Minimum 3
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$FDriveCapacity','$Free'"
            }
            elseif($Drive -eq 'G')
            {
            $Free = Get-Random -Maximum 50 -Minimum 0
            Write-Host "EXECUTE [dbo].[usp_Insert_DriveSpace] '$Server','$Drive','$Date','$GDriveCapacity','$Free'"
            }
        }
    }
    $x++
}

Once it had run I simply copied the output into SSMS and was on my way.
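If you wanted to skip the copy and paste, swapping the Write-Host calls for Write-Output would let you capture the statements and run them straight away. A sketch, where the script name, instance and database names are placeholders:

# capture the generated statements (requires Write-Output rather than Write-Host)
$sql = .\Generate-DriveSpace.ps1 | Out-String

# run them against the lab instance in one batch
Invoke-Sqlcmd -ServerInstance 'MyLabInstance' -Database 'MyLabDatabase' -Query $sql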

SQL Saturday Exeter–What’s the Point? My Experience of 2013 SQLSatExeter


Disclaimer – I am on the committee organising the next SQL Saturday Exeter. To be kept up to date about SQL Saturday #269 in the South West, follow @SQLSatExeter and #SQLSatExeter on twitter and see details at the bottom. This post is about my experience at this year's event.

In March this year the SQL South West User Group hosted SQL Saturday #194 in Exeter. I was a new member of the User Group, having finally been able to join them for the first time in January. At that meeting Chris Testa-O'Neill presented a session and was very passionate about the SQL community and the benefit of SQL Saturdays and other events. I am always keen to learn new things and find ways of developing my skills. As I haven't won the lottery I also look out for good deals as well!!

SQL SATURDAY PRE-CONS ARE EXCEPTIONAL VALUE

It was relatively easy to persuade my bosses to pay for my pre-con. For £150 I was able to spend a whole day in a room with about a dozen people being trained in SQL Server Security by Denny Cherry @mrdenny. The conversation went along the lines of

“I want to go to this training session being delivered by this guy. Link to MVP page. It’s £150 and is in Exeter so no other costs required”

My boss – “OK”

Of course there was a little more fun and games to be had with the payment but it was easy for me to get training sorted and £150 is not going to break the training budget.

Looking back through my notes from the session today I realise quite how much I have taken from it into my role at work. I can't really comment on which and what though, that wouldn't be good security!!

I remember an enjoyable day with plenty of technical learning, a lot of questions and answers and plenty of laughs as well. But more than that was the opportunity to mix with other professionals and talk with them. During the breaks and at lunch there were plenty of opportunities to chew the fat, learn how others do things, make new friends and put faces to twitter handles. (NOTE: I do look pretty much like my twitter profile picture, so if you see me at SQL Community events I expect you to come up and say hi, that's part of the benefit of attending these events, having a good natter)

Take a look at the end of this post for details of 2014 Pre-Cons

SQL SATURDAY – CAN’T GET CHEAPER THAN FREE

SQL Saturdays are FREE

SQL Saturdays offer sessions from internationally renowned and local SQL speakers on subjects relevant to you and your job, your future career, your development plan, or just to challenge yourself by learning about something outside of your comfort zone. For nothing. Add in the networking opportunities, the prizes from the sponsors (if you were at Exeter this year, the beer and the pasty), and if you add it up it's a sizeable investment in yourself, your career and your development (did I mention a free beer and pasty?)

NOT BAD FOR FREE!!

To enable that, SQL Saturday organisers have to go out and talk sponsors into putting their hands into their pockets. They will only do that if it is worthwhile to them. You can make it easier for the organisers by going and spending time with the sponsors during the breaks, chatting with them and giving them your details. Also, if you choose to use one of their products please tell the sponsors you spoke to them at a SQL Saturday. They are (usually) data professionals who will record that and use that to make future decisions which will we hope include sponsoring SQL Saturdays.

This year on the Saturday I went to the following sessions

A temporary fix for a short term problem by Ian Meade
Advanced SQL Server 2012 HA and DR Architectures by Christian Bolton
Busting common T-SQL myths by Dave Morrison
Power View and the Cube by Régis Baccaro
Natural Born Killers, performance issues to avoid by Richard Douglas
Tracking server performance without slowing it down by Jonathan Allen which I also Room Monitored
Increasing Business and IT collaboration by Chris Testa-O’Neill

It was a really good day. I learnt so much from all those knowledgeable and talented people. It really kicked me on in my development at work. I was able to take from each of those sessions and use that knowledge to do my job better, and I made new friends and new contacts. Just going back to my notes today has reminded me of something that I need to look into for work 🙂 Some of the conversations I have had at events this year have been fascinating – learning how other people do the same thing you do in a completely different but equally valid way, problem-solving with a different set and type of minds than the ones at MyWork, laughing at the same things and moaning about similar frustrations. All have been both entertaining and rewarding, and I think they are worth mentioning as things I enjoyed about going to SQL Community events this year, and they play a part in the reason I shall continue to go to them (just hope my boss doesn't read this and think he won't have to pay as I will go anyway!)

It’s busy and hectic, the sessions come along thick and fast and there are lots of people around to talk to. I wish I had made use of the SQL Saturday mobile phone app and I definitely recommend researching ahead of time and planning your day out.

This year's sessions have not been decided yet but I have seen some of the submissions and there are some fabulous sessions there. You could also submit a session yourself. Choosing the sessions will be tough, but we want to offer the opportunity to speak to as many people as possible, both new and experienced speakers.

You can submit your sessions at this link http://www.sqlsaturday.com/269/callforspeakers.aspx

ROUND-UP – SQL SATURDAY EXETER – WHY WOULDN'T YOU COME?

For a newbie, as I was last time, SQL Saturday Exeter was a revelation.

An opportunity to learn without spending thousands of my own or MyWork's money to sit in a lecture room and listen to a trainer.

A chance to develop my understanding in a friendly environment amongst my peers where I could ask questions.

A place to meet new people and build relationships with people who have helped me with situations at work throughout the year. I reckon I'm in credit already.

This year I have attended SQL Bits and SQL Saturday Cambridge, and this month I shall be at SQL Relay in Cardiff and in Bristol. That all started with SQL Saturday #194 in Exeter in 2013.

WHAT ABOUT NEXT YEAR'S SQL SATURDAY EXETER?

Next year's SQL Saturday in Exeter, SQL Saturday #269, will be held at the same place, the Jury's Inn Hotel Exeter, on March 21/22nd 2014.

We had such amazing submissions for our pre-cons that we have had to find more rooms to be able to fit them all in. You can see for yourself the quality of the sessions and speakers for SQL Saturday Exeter 2014 at the following link

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

What do you think? I want to split myself into 8 and go to every one!

WHAT SHOULD YOU DO NOW?

I suggest that you should book Saturday 22nd March 2014 out in your calendar right this minute. Done that? Good.

Now go to this link

http://www.sqlsaturday.com/269/

and register for FREE to attend and let us know @SQLSatExeter

Next make yourself a coffee (Other beverages are available) and head to the pre-con page

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

This bit is up to you, the choice is hard. I can't tell you which one of our eight fabulous sessions you want to go to. It's not for me to say which amazing speaker you want to spend a day with for a bargain price, but if you need further info please get in touch and we will try and help. Unfortunately our human cloning experiment is not stable enough to allow you to go to more than one!

Then, let me know you have done so and come and say hi when you are here.

SQL Server 2014 CTP on Azure. PowerShell won't create XTP tables

Having a couple of hours to play, I created a SQL 2014 box on Azure with Windows Server 2012 and followed this post on SQLServerCentral to create a filegroup and table with T-SQL. Everything went well. Don't try and provision an extra small box with that image though, as it is so slow as to be unusable!!

I then decided to take a look at doing it with PowerShell and I cannot create the table. I think it is to do with my index creation. Can anyone see my mistake, as I am going round in circles now? I have posted to DBA Stack Exchange, so answer there.

    -- Create new database
CREATE DATABASE TestDatabase
GO
--Add MEMORY_OPTIMIZED_DATA filegroup to the database.
ALTER DATABASE TestDatabase
ADD FILEGROUP XTPFileGroup CONTAINS MEMORY_OPTIMIZED_DATA
-- Add a new file to the previously created filegroup
ALTER DATABASE TestDatabase ADD FILE
(
NAME = N'HekatonFile1',
FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\HekatonFile1')
TO FILEGROUP [XTPFileGroup]
GO
-- Let's create a new Memory Optimized Table
CREATE TABLE TestTable
(
Col1 INT NOT NULL,
Col2 VARCHAR(100) NOT NULL,
Col3 VARCHAR(100) NOT NULL,
CONSTRAINT chk_PrimaryKey PRIMARY KEY NONCLUSTERED HASH (Col1) WITH (BUCKET_COUNT = 1024)
) WITH (MEMORY_OPTIMIZED = ON)
GO

And here is my PowerShell

# To Load SQL Server Management Objects into PowerShell
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')  | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended')  | out-null
$server= new-Object Microsoft.SqlServer.Management.Smo.Server("(local)")
$dbname = "HekatonTest"
$db = New-Object Microsoft.SqlServer.Management.Smo.Database($server, $dbname)
# Add FileGroups
$FG1Name = "PRIMARY"
$Normalfg = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, $FG1Name)
$db.FileGroups.Add($Normalfg)
$FG2Name = "MemOpt"
$MemOptFG = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, $FG2Name)
$MemOptFG.FileGroupType = "MemoryOptimizedDataFileGroup"
$db.FileGroups.Add($MemOptFG)
#Create datafiles
$normallogname = "HekatonTest_Data"
$dbdfnormal = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($Normalfg, $normallogname)
$dbnormalfile= $server.Information.MasterDBPath + '\' + $normallogname + '.mdf'
$normalfg.Files.Add($dbdfnormal)
$dbdfnormal.FileName = $dbnormalfile
$dbdfnormal.Size = [double](5.0 * 1024.0)
$dbdfnormal.GrowthType = 'Percent'
$dbdfnormal.Growth = 25.0
$dbdfnormal.IsPrimaryFile = 'True'
$MemOptFilename = "MemOpt_Data"
$MemOptDataFile = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($MemOptFG, $MemOptFilename)
$MemOptDataFilePath= $server.Information.MasterDBPath + '\' + $MemOptFilename + '.ndf'
$MemOptFG.Files.Add($MemOptDataFile)
$MemOptDataFile.FileName = $MemOptDataFilePath

#Create Database
$db.Create() 
#Create Table
$dbname = "HekatonTest"
$db = $server.databases[$dbname]
$tb = new-object Microsoft.SqlServer.Management.Smo.Table($db, "MemOptTable")
#Add Columns
$col1 = new-object Microsoft.SqlServer.Management.Smo.Column($tb,"Col1", [Microsoft.SqlServer.Management.Smo.DataType]::Int)
$col2 = new-object Microsoft.SqlServer.Management.Smo.Column($tb, "Col2", [Microsoft.SqlServer.Management.Smo.DataType]::varchar(100))
$col3 = new-object Microsoft.SqlServer.Management.Smo.Column($tb, "Col3", [Microsoft.SqlServer.Management.Smo.DataType]::varchar(100))
$tb.Columns.Add($col1)
$tb.Columns.Add($col2)
$tb.Columns.Add($col3)
$C1Name =$col1.Name
#Create Index
$IX = New-Object Microsoft.SqlServer.Management.Smo.Index ($Tb, "PK_PrimaryKeyName")   
$IX_col = New-Object Microsoft.SqlServer.Management.Smo.IndexedColumn ($IX,$C1Name)
$IX.IndexedColumns.Add($IX_col)
$IX.IndexKeyType = "DriPrimaryKey"
$IX.IndexType = "HashIndex"
$IX.BucketCount = "1024"
$IX.FileGroup = "PRIMARY"
$IX.IsMemoryOptimized = $true
$Tb.Indexes.Add($IX)
$tb.FileGroup = "MemOpt"
$tb.Durability = "SchemaAndData"
$tb.IsMemoryOptimized = $true
$tb.Create()

I can create the filegroups and the database with PS, and I can use SSMS to create Memory Optimised (XTP) tables. If I take out the index creation then the table creates, but it isn't XTP obviously.

All you get is the frustrating Create() failed error which doesn’t help me to find the problem.

I know I can take the T-SQL and put it into Invoke-Sqlcmd but I figured you ought to be able to do it with PowerShell.
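For completeness, that fallback would look something like this (a sketch; the script path is a placeholder):

# fallback: run the known-good T-SQL through Invoke-Sqlcmd (sketch)
Invoke-Sqlcmd -ServerInstance '(local)' -InputFile 'C:\Temp\CreateXTPTable.sql'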

Can anyone point me in the right direction? For interest only, maybe it’s not available in the CTP. Answer over here

12 Things I learnt at SQLBits XI

  • The Helpers are awesome

clip_image001

  • Often Sessions fill up very quickly – Get there early

clip_image002

  • You can learn as much outside of the sessions as you can in them

  • There are amazing prizes

A LEGO R2D2!!!

  • Bring your sense of humour

  • The SQL community contains the most gracious and generous people, willing to help

  • You can connect with your user group and get a mini SQL Bits every month

Find your User Group Here http://sqlsouthwest.co.uk/national_ug.htm

  • If there is no user group in your area people will help you to start one

RT @fatherjack Interesting chat about a potential new user group in the uk. Anyone around Newcastle area looking for some free training?

  • Every session is videoed and will be available online. For free.

  • You will learn and have fun

Too awesome for words! "@justjonlevett: Lego Server! @fusionio #sqlbits pic.twitter.com/bhxPaTIq4K"

More blogs about SQL Bits XI and Photos

SQL Bits Facebook https://www.facebook.com/SQLBits

Roger Van Unen Gallery https://plus.google.com/photos/109984741094039234638/albums/5874913179986208577

JR’s Gallery https://skydrive.live.com/?cid=7b73b60f4c7d77c9&id=7B73B60F4C7D77C9%212222

Steve Jones Blog http://voiceofthedba.wordpress.com/2013/05/06/fun-at-sql-bits/

Chris Webb's Blog http://cwebbbi.wordpress.com/2013/05/05/sqlbits-xi-summary/

Find more from the Facebook Page or #sqlbits

Finally a BIG Thank you to all these people http://sqlbits.com/about/WhosWho.aspx

and the fantastic helpers without whom SQL Bits would never happen

Till Next year