2016 – That was a Year :-)

It's the time of year for reflection and I have had the most amazing 2016. I am blessed that I love what I do so much. I thoroughly enjoy writing and talking and sharing and commenting and supporting and cherishing all the SQL and PowerShell things. I wrote about using Power Bi to display my checkins. I only started this in June and this is where I have been 🙂

swarm

I learnt about Pester and ended the year incorporating it into dbatools and dbareports. I also started using GitHub. It is quite surprising to me how much time I now spend using both. I also had to start learning DSC for the client I was working with because, as 'the PowerShell guy', I was the one who could pick it up most easily. I learnt things and then forgot them, causing me to find this Pester post via Google later in the year!! (That's a big reason for blogging, by the way)

Early in the year we organised SQL Saturday Exeter

Helping to organise a SQL Saturday is a lot of fun, especially when you do it with good friends, but choosing sessions is by far the most challenging part of it for me. I could have chosen at least 60 of these sessions and I know people were disappointed not to have been chosen. I was also the first person many saw at SQL Bits in Liverpool, manning the front of house and getting asked the best question ever.

The Beard says

When you go to an event –  Say thank you to the organisers and volunteers

and a TERRIBLE thing happened – I broke my DBA Team mug

WP_20160223_07_51_03_Pro.jpg

Luckily the fine folk at redgate sorted me out with a replacement from deep in the stores somewhere and gave it to me at SQL Saturday Exeter 🙂 Thank you.

I spoke at the PowerShell Conference Europe and met and made some great friends, which led to me speaking at the PowerShell Monday in Munich and the Dutch PowerShell Usergroup. SQL Saturday Dublin was a blast, it's a wonderful city, Manchester had a whole PowerShell Track 🙂 and Cambridge was memorable for the appalling journey as well as the chance to share a stage with Chrissy. PowerShell Conference Asia in the sovereign city-state of Singapore was such a good event and place. Lastly of course was Slovenia with its fantastic Christmas lights and awesome event organisation. I visited some user groups too: Southampton, run by my good friends John Martin and Steph Middleton (congratulations to John on his first MVP award yesterday), Cardiff for the Return of the Battle of the Beards with Terry McCann and Tobiasz Koprowski, where the projector threw its toys out of the pram, and Birmingham in the school hall, which was slightly chilly (there's a joke there for some people)

Amazing things happened

We created https://sqlps.io/vote and https://sqlps.io/ssms and https://sqlps.io/powerbi to enable anyone to influence Microsoft and help to improve the PowerShell SQL experience

and lo and behold there was a new sqlserver module 🙂

I was also invited by Aaron and Chrissy to become an officer for the PASS PowerShell Virtual Chapter oh and we made https://sqlps.io/slack to enable people to talk about all things Data Platform – Another addition to my life that I didn’t have at the beginning of the year. I spend a lot of time in there in the #dbatools and #dbareports channels and have made some fantastic friends. Chrissy and I created the SQL Community Collaborative GitHub team and added dbatools and dbareports and even more friendships were born

And that’s the biggest and bestest thing about this year. Some amazing new friends and spending time with all my other friends. I started writing out a list but was terrified I would have missed someone out, so to all my friends

THANK YOU for a brilliant 2016 and 2017 shall be just as good 🙂

Here are a few of my pics from the year with a lot of my friends

 

Populating My DBA Database for Power Bi with PowerShell – Server Info

Following my last post about using Power Bi with my DBA Database, I have been asked if I would share the PowerShell scripts which I use to populate my database. They are the secondary part to my DBADatabase, which I also use to automate the installation and upgrade of all of my DBA scripts, as I started to blog about in Installing and upgrading default scripts automation – part one – Introduction, a series I will continue later.

In this post I will show how to create the following report

1

You will find the latest version of my DBADatabase creation scripts here.

I create the following tables

dbo.ClientDatabaseLookup
dbo.Clients
dbo.InstanceList
dbo.InstanceScriptLookup
dbo.ScriptList
Info.AgentJobDetail
Info.AgentJobServer
Info.Databases
Info.Scriptinstall
Info.ServerOSInfo
Info.SQLInfo

By adding Server Name, Instance Name, Port, Environment, NotContactable, and Location into the InstanceList table I can gather all of the information that I need and also easily add more information to other tables as I need to.

The NotContactable column is so that I am able to add instances that I am not able to contact due to permission or environment issues. I can still gather information about them manually and add it to the table. I use the same script but change it to generate the SQL query rather than run it, save the query, and then run the query manually to insert the data. This is why I have the DateAdded and DateChecked columns, so that I know how recent the data is. I don't go as far as recording the change history, however, as that will be added to a DBA-Admin database on every instance which stores every change to the instance.
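As a rough illustration of what I mean by generating the query rather than running it, the sketch below swaps the Invoke-Sqlcmd call for writing the query text to a file. The file path is an assumption for illustration only, and $Query is the variable built later in this post.

# Instead of executing the statement against the central server...
# Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
# ...write the query text out so it can be reviewed and run manually later
$QueryFile = "C:\Temp\ManualServerInfoInsert.sql"   ## assumed path, for illustration only
$Query | Out-File -FilePath $QueryFile -Append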

The ServerOSInfo table is created like so

/****** Object: Table [Info].[ServerOSInfo]    Script Date: 26/08/2015 19:50:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [Info].[ServerOSInfo](
[ServerOSInfoID] [int] IDENTITY(1,1) NOT NULL,
[DateAdded] [datetime] NULL,
[DateChecked] [datetime] NULL,
[ServerName] [nvarchar](50) NULL,
[DNSHostName] [nvarchar](50) NULL,
[Domain] [nvarchar](30) NULL,
[OperatingSystem] [nvarchar](100) NULL,
[NoProcessors] [tinyint] NULL,
[IPAddress] [nvarchar](15) NULL,
[RAM] [int] NULL,
CONSTRAINT [PK__ServerOS__50A5926BC7005F29] PRIMARY KEY CLUSTERED
(
[ServerOSInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT.
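If you want to create that event log yourself, a minimal sketch (run once from an elevated PowerShell session) would be something like this; the log and source names come from this post and the approach from the Hey, Scripting Guy! article linked above:

# Create the event log and source once (requires an elevated PowerShell session)
New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT

# Write a test entry to confirm the log and source work
Write-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT -EventId 1 -EntryType Information -Message "Event log created for the DBADatabase update scripts"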

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName= 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell job step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here or you can read on to see how it works and how to create the report.
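For reference, the body of the PowerShell job step can simply call the saved script; the UNC path and file name below are examples only, not the actual location of my script:

# Contents of the SQL Agent PowerShell job step - the path is an example only
& '\\FileShare\Scripts\DBADatabaseServerUpdate.ps1'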

I create a function called Catch-Block to save keystrokes, and I put my commands inside a try/catch to make the scripts as robust as possible.

function Catch-Block{
param ([string]$Additional)
# Build the error message from the custom text passed in plus the exception details
$ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
$Message = " This message came from the Automated Powershell script updating the
DBA Database with Server Information"
$Msg = $Additional + $ErrorMessage + " " + $Message
# Write the error to the log file and to the SQLAutoScript event log
Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}
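The try/catch wrappers are not shown in the excerpts in this post, but a minimal sketch of how Catch-Block might be used could look like this (the Get-WmiObject call and the message text are my illustrations rather than lines from the full script):

try
{
    # Example command that could fail - for illustration only
    $Info = Get-WmiObject win32_computersystem -ComputerName $Server -ErrorAction Stop
}
catch
{
    # Pass a description of the step that failed; Catch-Block adds the exception details
    Catch-Block " Failed to gather win32_computersystem information for $Server"
}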

I give the function an Additional parameter which holds a custom error message for each step. This is written to both the event log and the text file to enable easy troubleshooting, and the message from the error record is included by accessing it with $_. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $Servers

$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT DISTINCT [ServerName] FROM [DBADatabase].[dbo].[InstanceList] WHERE Inactive = 0 OR NotContactable = 1"
$Servers = $AlltheServers| Select ServerName -ExpandProperty ServerName

I then loop through the array and gather the information with three WMI queries.

foreach($Server in $Servers)
{
# Log which server is being processed
Write-Log -Path $LogFile -Message "Gathering Info for $Server"
# Default values in case a property cannot be gathered
$DNSHostName = 'NOT GATHERED'
$Domain = 'NOT GATHERED'
$OperatingSystem = 'NOT GATHERED'
$IP = 'NOT GATHERED'
try{
# First WMI query - host name, domain, RAM in GB and logical processor count
$Info = get-wmiobject win32_computersystem -ComputerName $Server -ErrorAction Stop|select DNSHostName,Domain,
@{Name="RAM";Expression={"{0:n0}" -f($_.TotalPhysicalMemory/1gb)}},NumberOfLogicalProcessors

I give the variables some default values in case they are not picked up, and set the error action for the command to Stop so that a failure exits the try block. The first query gathers the DNSHostName, the domain name, the amount of RAM in GB and the number of logical processors; the second gathers the operating system version; but the third was the most interesting to do. There are many methods of gathering the IP address using PowerShell and I tried a few of them before finding one that would work with all of the server versions in my estate. This is a good point to say that this works in my lab and in my shop but may not necessarily work in yours, so understand, check and test this and any other script that you find on the internet before you let it anywhere near your production environment.

Unfortunately the method that worked everywhere remotely errored on the local server, so I added a check to see if the server name in the variable matches the $env:COMPUTERNAME environment variable.

# Second WMI query - the operating system caption
$OS =  gwmi Win32_OperatingSystem  -ComputerName $Server| select @{name='Name';Expression={($_.caption)}}
# Third query - the IP address; the local server needs the WMI method, remote servers use a DNS lookup
if($Server -eq $env:COMPUTERNAME)
{$IP = (Get-WmiObject -ComputerName $Server -class win32_NetworkAdapterConfiguration -Filter 'ipenabled = "true"' -ErrorAction Stop).ipaddress[0] }
else {$IP = [System.Net.Dns]::GetHostAddresses($Server).IPAddressToString }
Write-Log -Path $LogFile -Message "WMI Info gathered for $Server "

Once I have all of the information I check whether the server already exists in the ServerOSInfo table and choose to either insert or update.

	$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT [ServerName] FROM [DBADatabase].[Info].[ServerOSInfo] WHERE ServerName = '$Server'"
	
	if ($Exists)
	{
	$Query = @"
	UPDATE [Info].[ServerOSInfo]
	   SET [DateChecked] = GetDate()
	      ,[ServerName] = '$Server'
	      ,[DNSHostName] = '$DNSHostName'
	      ,[Domain] = '$Domain'
	      ,[OperatingSystem] = '$OperatingSystem'
	      ,[NoProcessors] = '$NOProcessors'
	      ,[IPAddress] = '$IP'
	      ,[RAM] = '$RAM'
	WHERE ServerName = '$Server'
	"@
	}
	else
	{
	$Query = @"
	INSERT INTO [Info].[ServerOSInfo]
	           ([DateChecked]
	           ,[DateAdded]
	           ,[ServerName]
	           ,[DNSHostName]
	           ,[Domain]
	           ,[OperatingSystem]
	           ,[NoProcessors]
	           ,[IPAddress]
	           ,[RAM])
	     VALUES
	   ( GetDate()
	      ,GetDate()
	      ,'$Server'
	      ,'$DNSHostName'
	      ,'$Domain'
	      ,'$OperatingSystem'
	      ,'$NoProcessors'
	      ,'$IP'
	      ,'$RAM')
	"@
	}
	Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query

And that's it. Now if you wish to gather different data about your servers, you can examine the data available to you by running

get-wmiobject Win32_OperatingSystem -ComputerName $Server | Get-Member
get-wmiobject win32_computersystem -ComputerName $Server | Get-Member

If you find something that you want to gather you can then add the property to the script and gather that information as well. Make sure that you add the column to the table and to both the insert and update statements in the PowerShell script.
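For example (purely as an illustration, this is not part of the original scripts), adding the Manufacturer property from win32_computersystem might look like this, with a matching [Manufacturer] column added to the table and '$Manufacturer' added to both the INSERT and UPDATE statements:

# Manufacturer is an example extra property available from win32_computersystem
$Info = get-wmiobject win32_computersystem -ComputerName $Server -ErrorAction Stop |
    select DNSHostName,Domain,Manufacturer,
    @{Name="RAM";Expression={"{0:n0}" -f($_.TotalPhysicalMemory/1gb)}},NumberOfLogicalProcessors
$Manufacturer = $Info.Manufacturer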

Creating the report in Power Bi

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I was born in Bolton and SQL SouthWest is based in Exeter 🙂

Open Power Bi Desktop and click Get Data. Add the connection details for your DBA Database server and database and add the query

SELECT SOI.[ServerOSInfoID]
      ,SOI.[DateChecked]
      ,SOI.[ServerName]
      ,SOI.[DNSHostName]
      ,SOI.[Domain]
      ,SOI.[OperatingSystem]
      ,SOI.[NoProcessors]
      ,SOI.[IPAddress]
      ,SOI.[RAM]
      ,IL.ServerName
      ,IL.InstanceName
      ,IL.Location
      ,IL.Environment
      ,IL.Inactive
      ,IL.NotContactable
FROM [DBADatabase].[Info].[ServerOSInfo] AS SOI
JOIN [dbo].[InstanceList] AS IL
    ON IL.ServerName = SOI.[ServerName]

2

Create a new column for the Operating System Edition by clicking Data on the left and using this code, as described in my previous post

Operating System Edition = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And one for OS Version using this code

OS Version = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

I also created a new measure to count the distinct number of servers and instances as follows

Servers = DISTINCTCOUNT(Query1[Servers Name])
Instances = COUNT(Query1[Instance])

Then in the report area I start by creating a new text box to add a title to the report and setting the page level filter to Inactive is False so that decommissioned servers are not included

3

I then create a donut chart for the number of servers by operating system by clicking the donut chart in the visualisations pane and then dragging OS Version to Details and Servers Name to Values

4

I then click the format button and add a proper title and the background colour

5

Then I create the server numbers by location in the same way, by clicking the donut chart, adding Location and the count of server names, and adding the formatting in the same way as the previous donut

6

I created a number of cards to hold single values for Domain, Instance, Server, RAM, Processors and the number of Not Contactable servers to provide a quick, easy view of those figures, especially when you filter the report by clicking on a value within a donut chart. I find that managers really like this feature. They are all created in the same way by clicking the card in the visualisations and choosing the value

7

I also add a table for the number of servers by operating system and the number of servers by location by dragging those values to a table visualisation. I find that slicers are a very useful way of enabling information to be displayed as required; I use the slicer visualisation to do this and add the Environment column to slice by, so that I can easily see values for the live environment or the development environment

I create a separate page in the report to display all of the server data as this can be useful for other teams such as the systems (server admin) team. I give them a lot of different slicers: Domain, Location, Environment, OS Version, Edition and NotContactable, with a table holding all of the relevant values to enable them to quickly see details

8

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

Speaking at PowerShell Virtual Chapter and SQL Cardiff User Group this month

Just a quick post to say that I will be speaking at the PowerShell Virtual Chapter meeting this Thursday at 4pm GMT 12pm EDT and also at the Cardiff SQL User Group on Tuesday 31st March

I will be giving my Making Powershell Useful for your Team presentation

You have heard about PowerShell and maybe spent a little bit of time exploring some of the ways in which it will benefit you at work. You want to be able to perform tasks more specific to your organisation and need to share them with your team. I will show you how you can achieve this by demonstrating

  • An easy way to learn the syntax
  • How to explore SQL Server with Powershell
  • How to turn your one off scripts into shareable functions
  • How to ensure that your team can easily and quickly make use of and contribute to PowerShell solutions
  • Where else to go for help

You can find out more about the Virtual Chapter here

http://powershell.sqlpass.org/ 

and the Cardiff meeting here

http://www.meetup.com/Cardiff-SQL-Server-User-Group/events/219492623/ 

The Cardiff meeting has been named The Battle Of The Beards as it features Tobiasz Koprowski talking about Windows Azure SQL Database – Tips and Tricks for beginners and Terry McCann with SSRS Inception. I will be giving the same presentation as at the Virtual Chapter

I hope to see you at one or both sessions

#tsql2sday #60 – Something New Learned – Problem Step Recorder

What is T-SQL Tuesday?

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

This month's blog party is hosted by Chris Yates (blog | twitter) who asked people to share something newly learned.

I love being a part of the SQL community. It gives me the opportunity to learn as much as I want to about anything I can think of within the data field. In the last couple of months I have presented at Newcastle User Group and learnt about migrating SQL using PowerShell with Stuart Moore. At our user group in Exeter http://sqlsouthwest.co.uk/ we had Steph Middleton talking about version control for databases and lightning talks from Pavol Rovensky on Mocking in C#, John Martin on Azure fault domains and availability sets (using a pen and a whiteboard!), Annette Allen on Database Unit Testing and Terry McCann on SQL Certifications. We also had Jonathan Allen talking about some free tools and resources to help manage both large and small SQL environments. I went to SQL Relay in Southampton and saw Stuart Moore (again!), Scott Klein, Alex Yates and James Skipworth, and I joined the PASS DBA fundamentals virtual chapter webinar for Changing Your Habits to Improve the Performance of Your T-SQL by Mickey Stuewe, and that's only the 'in-person' learning that I did. I also read a lot of blog posts!

But instead of repeating what I learnt from others within the community, I thought I would write a blog post that I have been meaning to write for a few weeks about a solution pre-built into Windows that appears to not be well known: Problem Step Recorder.

What is PSR?

I found out about a little-known tool included in the Windows operating system a couple of months ago which enables you to record what you are doing by taking a screenshot of every mouse click. The tool is Steps Recorder, also known as PSR. It is included by default in Windows 7, Windows 8 and 8.1, and Windows Server 2008 and above.

What does it do?

Simply put, it records "This is what I did". There are many situations when this can be useful:

  • You can use this during installations to help create documentation. “This is what I did” when I installed X and now you can follow those steps and I know I haven’t missed anything.
  • You can use it when communicating with 3rd parties or other support teams. “This is what I did” when I got this error and here are all of the steps so that you can re-create the issue and I know that I haven’t missed anything
  • You can use this when resolving high priority incidents. “This is what I did” when System X broke, it includes all of the times of my actions.
    I still keep my notepad by my keyboard out of habit but I have a record of the exact steps that I took to try to resolve the issue which will be very useful for reporting on the incident in the near future and also placing into a Knowledge Base for others to use if it happens again and I know I haven’t missed anything
  • For assisting family members. Like many, I am “The IT guy” and PSR enables me to provide clear instructions with pictures showing exactly where I clicked to those family members who are having trouble with “The internet being broken”

It does this by automatically taking a screenshot after every mouse click or program event, with a timestamp and a description of what happened. It does not record keystrokes though, so if you need to record what you have typed there are some manual steps required.

So how do you access PSR?

Simple. Type “psr” into the run box, cmd or PowerShell and it will open

Untitled picture

Once you click on Start Record it will start recording your clicks and taking screenshots. However, I always open the settings first, by clicking on the drop down to the left of the help icon, and change the number of recent screen captures to store to the maximum value of 100.

1Untitled picture

If you do not, you will get no warning, but PSR will only save the last 25 screenshots it takes and your results will look like the below. It will still record your actions but not keep the screenshots.


Step 16: (‎09/‎11/‎2014 13:47:45) User left click on “Chris Yates (@YatesSQL) | Twitter (tab item)”

No screenshots were saved for this step.


Step 17: (‎09/‎11/‎2014 13:47:47) User left click on “The SQL Professor | ‘Leadership Through Service’ (text)”

No screenshots were saved for this step.


Step 18: (‎09/‎11/‎2014 13:47:47) User left click on “T-SQL Tuesday #60 – Something New Learned | The SQL Professor (text)” in “T-SQL Tuesday #60 – Something New Learned | The SQL Professor – Google Chrome”

untitled

You can also set the name and location of the saved file in the settings but if you leave it blank it will prompt for a location and name once you click Stop Record
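If you would rather not rely on remembering the GUI settings, psr.exe can also be started from PowerShell (or cmd) with command-line switches for the output path and screenshot limit. This is a sketch from memory rather than from the original post, so check psr.exe /? on your version of Windows before relying on the exact switch names:

# Start recording silently, keep screenshots and raise the limit to 100
psr.exe /start /gui 0 /sc 1 /maxsc 100 /output "C:\Temp\Recording.zip"

# ...carry out the steps you want to capture, then stop and save the file
psr.exe /stop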

How do I add keyboard input?

PSR allows you to add keyboard input manually. You may need this if you need to include the text you have entered into prompts or address bars, or if you wish to add a further comment. You can do this by clicking Add Comment, drawing a box around the relevant part of the screen for the text input and entering the text into the box.

2Untitled picture

In the results this looks like

Step 1: (‎09/‎11/‎2014 12:56:22) User Comment: “http://www.microsoft.com/en-gb/download/details.aspx?id=42573

untitled1

What do the results look like?

Once you have finished the actions that you want to record (or when you think you are close to 100 screenshots) click stop record and the following screen will be displayed

3Untitled picture

This allows you to review what PSR has recorded. You can then save it to a location of your choice. It is saved as a zip file which contains a single .mht file. You can open the file without unzipping the archive and it will open in Internet Explorer. As you can see from the shots below, you can run PSR on your client and it will still record actions in your RDP sessions, although it does not record as much detail. The first two are on my SCOM server in my lab and the second two are on the laptop using the SCOM console.


Step 11: (‎09/‎11/‎2014 13:02:13) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled2


Step 12: (‎09/‎11/‎2014 13:02:16) User left click on “Input Capture Window (pane)” in “SCOM on ROB-LAPTOP – Virtual Machine Connection”

untitled3


Step 13: (‎09/‎11/‎2014 13:06:25) User right click on “Management Packs (tree item)” in “Agent Managed – THEBEARDMANAGEMENTGROUP – Operations Manager”

untitled4


Step 14: (‎09/‎11/‎2014 13:06:27) User left click on “Import Management Packs… (menu item)”

untitled5

You can then use the zip file as you wish. Maybe you email it to your third party support team (once you have edited any confidential data) or you can attach it to your incident in your IT Service Management solution or attach it to a report. If you wish to create documentation you can open the .mht file in Word, edit it as you see fit and save it appropriately.

So that is one of the many things that I have learnt recently and I am looking forward to seeing what others have learnt especially as many will have just been to the SQL PASS Summit. You will be able to find the other posts in this blog party in the comments on Chris’s page

#TSQL2sDay Why My Head is Always in The Cloud

Today's post is my first for the TSQL2sDay series. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. Want to catch up on all the fun to date? Check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Azure Balloon - Credit http://owenrichardson.com/

This one is hosted by Jorge Segarra (@SQLChicken) who said: This month's topic is all about the cloud. What's your take on it? Have you used it? If so, let's hear your experiences. Haven't used it? Let's hear why or why not? Do you like/dislike recent changes made to cloud services? It's clear skies for writing! So let's hear it folks, where do you stand with the cloud?

My wife would tell you that my head is always in the cloud and she's right (she usually is), just not like that picture! I would love to float gracefully above the land and gaze upon the view, but it's the landing that bothers me and will always stop me from trying it.

Credit http://owenrichardson.com/

She’s right, pedantically and literally too, because this year I have spent a lot of time with my head and my fingers and my thinking in Virtual Machines using Windows Azure. That is where I have learnt a lot of my SQL and Powershell this year. After SQL Saturday Exeter and SQL Bits in Nottingham this year I have needed a place to practice and learn, an environment to try things and break things and mend them again and experiment.

I learn just as well by doing things as I do by reading about them. Stuart Moore @napalmgram has a great post called Learning to Play with SQL Server and whilst I haven't been as rough with my Azure SQL instances as he suggests, I have been able to practice at will without worry and, thanks to my MSDN subscription, without cost. I have taken examples from blog posts and demos from user group sessions and run them on my Windows Azure VMs.

Every single blog post I have written this year that has examples has been written in Azure and screen shots from Azure. Whilst some of my Powershell scripts in the PowerShell Box of Tricks series had already been written to solve one particular problem or another at MyWork, every single one was refined and demo’d and all the screen shots were from Azure and several were developed on Azure too

My first ever session to the SQL South West user group, about Spinning up and Shutting Down VMs in Azure, was an interesting experience in Murphy's Law which meant I ended up having to deliver it on Azure.

The second time I talked was about the PowerShell Box of Tricks series, to the Cardiff User Group. Having learnt my lesson from the first time, I had bought a mini HDMI to VGA converter and I had tested it using a couple of monitors at home and it worked wonderfully. However, when I got to Cardiff my little Asus convertible didn't provide enough grunt to power the funky presentation screen. Luckily, thanks to Stuart Moore @napalmgram, who was also there doing his excellent PowerShell Backup and Restore session and let me use his Mac, I was able to deliver the session using Office Web App to run the PowerPoint from my SkyDrive whilst all the demos were on ………Yup, you guessed it, Windows Azure !!!

So I feel qualified to answer Jorge’s questions and take part in T-SQL Tuesday this time round.

I like Azure. I like the ease with which I can spin up and down machines or any PaaS services at will. I love that I can do it with PowerShell because I really enjoy using PowerShell in my day to day work and at home too. Living as I do in a beautifully convenient bungalow in the country, I still enjoy the frustration of watching that spinning ring as my videos buffer on our 1.8Mbps at best internet connection. Whilst that does have an impact on using Azure, it is a damn sight better than waiting many days trying to download one single file. Something like an ISO file for the latest SQL Server CTP for example.

There is no way I would have got a look at SQL Server 2014 if it wasn’t for Azure. I was able to spin up a SQL Server 2014 machine in only a few minutes and log in and have a play and then delete it. I have done the same with Server 2012 and 2012 R2. It has enabled me to try setting up Availability Groups and other technologies not yet implemented at MyWork

I wouldn’t have been able to do any of that on my machines at home as I don’t have anything capable of running Hyper-V whilst this 8 year old desktop still keeps hanging on despite the odd noises. (Negotiations are currently in place to replace it with something shiny and new. Just need that lottery win now !!)

I have also transferred my Cricket Averages database to WASD and am talking with a friend of mine about developing an app that will use the mobile service as well.

The rate of change is much quicker in the cloud; things change and change quickly. As quickly as I had written my post about Spinning up and Shutting Down VMs in Azure, Microsoft changed the rules and didn't charge for machines that were turned off. New services appear all the time. New services move quickly through from preview to release and, as Grant Fritchey noticed this week, new views have been added to Windows Azure SQL Database under the covers. I think this is something we are just going to have to live with. The scale of the cloud means it is much easier to test improvements at large scale and that means they can be released quicker. It makes it more challenging to keep up, I admit, but it's a constant drip of new things rather than a big bang all at once.

Azure has brought me to where I am today and I think it will continue to be part of my future. If I remember to submit my PowerShell session for SQL Saturday Exeter (Submit yours here) and it gets chosen then you will be able to see me there (if you register here) using Azure to give back to the SQL Community

Lessons Learnt from my first talk at SQL SouthWest

The timing was good enough that I could offer to do a talk based on my previous post on Windows Azure for my SQL user group, SQL SouthWest, when Jonathan and Annette (@FatherJack and @MrsFatherJack) put out a call for volunteers.

I did my best with the 7 P's. I ran through it at lunchtime, I made sure I had power and an HDMI lead after checking with Jonathan, and I got a glass of water. I knew the first line I was going to say.

However, I neglected to check that I would have HDMI in at the location, so everything that was on my laptop was useless! My laptop did very odd things to the USB stick when I tried to transfer to Jonathan's laptop, and he didn't have PowerShell V3 installed, so whilst Neil Hambly @Neil_Hambly from Confio was speaking I was busy ignoring a very interesting talk on waits to install and configure Azure PowerShell on my Azure VM. Sorry Neil.

But in the end it more or less worked and we are lucky to have such a patient and supportive user group who helped me along the way as well. Thank you folks

Things I took away from the evening

  1. Double check you have all the connections
  2. Practice and Practice some more
  3. Think about the times when something is running and what you will say when there is nothing to see
  4. Presenting completely inside a RDP session adds unnecessary complication
  5. The Demo Gods WILL hit you and the curse of the red text will fall upon you during the presentation. Accept it and move on.
  6. Have an opening line
  7. Remember to breathe (especially when the demo falls over)
  8. Enjoy it!

It didn’t go perfectly but people gave me some good feedback and I am pleased to say that I have pointed people towards something new that will help them and passed over my knowledge and that to me is what the SQL Community is all about. I have a load of other ideas for things I can talk about and blog about so it is going to be a very busy time for me as I work my way through them and do all the other exciting things coming my way in the SQL world.

Visit your own User Group – You can find them here http://www.sqlpass.org/

If you are in the South West UK then come and join our group. Free training and conversation with like minded people once a month, and pizza too. What could be better!!