Populating My DBA Database for Power Bi with PowerShell – Server Info

Following my last post about using Power Bi with my DBA Database, I have been asked if I would share the PowerShell scripts which I use to populate my database. They are the companion to my DBADatabase, which I also use to automate the installation and upgrade of all of my DBA scripts, as I started to blog about in Installing and upgrading default scripts automation – part one – Introduction, a series I will continue later.

In this post I will show how to create the following report

[Screenshot 1]

You will find the latest version of my DBADatabase creation scripts here.

I create the following tables

dbo.ClientDatabaseLookup
dbo.Clients
dbo.InstanceList
dbo.InstanceScriptLookup
dbo.ScriptList
Info.AgentJobDetail
Info.AgentJobServer
Info.Databases
Info.Scriptinstall
Info.ServerOSInfo
Info.SQLInfo

By adding ServerName, InstanceName, Port, Environment, NotContactable and Location to the InstanceList table I can gather all of the information that I need, and I can easily add more information to other tables as I need to.
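For reference, registering a new instance is just a single insert into that table. Here is a minimal sketch using Invoke-Sqlcmd; the column list is taken from the description above and the values are purely illustrative, so check them against your copy of the creation scripts.

$InsertInstance = @"
INSERT INTO [dbo].[InstanceList]
           ([ServerName],[InstanceName],[Port],[Environment],[NotContactable],[Inactive],[Location])
VALUES     ('SQLServer01','MSSQLSERVER',1433,'Production',0,0,'Exeter') -- illustrative values only
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $InsertInstance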

The NotContactable column is there so that I can add instances that I am not able to contact because of permission or environment issues. I can still gather information about them manually and add it to the table: I use the same script, but change it to generate the SQL query rather than run it, save the query, and then run the query manually to insert the data. This is also why I have the DateAdded and DateChecked columns, so that I know how recent the data is. I don't go as far as recording every change, however, as that will be added to a DBA-Admin database on every instance which stores every change to the instance.
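The switch between running the statement and just generating it is simple. A sketch of the idea (not the exact mechanism in my script) is to write the $Query built further down out to a file instead of passing it to Invoke-Sqlcmd:

# Sketch only: for servers I cannot reach, write the INSERT/UPDATE out to a file
# to be reviewed and run by hand, instead of executing it directly.
$GenerateOnly = $true   # hypothetical switch, not in the downloadable script
if ($GenerateOnly) {
    $Query | Out-File -FilePath "C:\Temp\ManualServerInfo_$Server.sql" -Append
}
else {
    Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
}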

The ServerOSInfo table is created like so

/****** Object: Table [Info].[ServerOSInfo]    Script Date: 26/08/2015 19:50:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [Info].[ServerOSInfo](
[ServerOSInfoID] [int] IDENTITY(1,1) NOT NULL,
[DateAdded] [datetime] NULL,
[DateChecked] [datetime] NULL,
[ServerName] [nvarchar](50) NULL,
[DNSHostName] [nvarchar](50) NULL,
[Domain] [nvarchar](30) NULL,
[OperatingSystem] [nvarchar](100) NULL,
[NoProcessors] [tinyint] NULL,
[IPAddress] [nvarchar](15) NULL,
[RAM] [int] NULL,
CONSTRAINT [PK__ServerOS__50A5926BC7005F29] PRIMARY KEY CLUSTERED
(
[ServerOSInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable logging to a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT.
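The event log and source only need to be created once, in an elevated PowerShell session, on the machine that runs the script; following the linked article it comes down to a single cmdlet:

# Run once, elevated, on the machine that will run the collection script:
# creates the custom event log and source used by the Catch-Block function below.
New-EventLog -LogName SQLAutoScript -Source SQLAUTOSCRIPT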

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName= 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell job step and schedule it to run at whatever frequency you wish; I gather the information every week. You can get the script from here, or you can read on to see how it works and how to create the report.
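If you prefer to script the Agent job rather than click through SSMS, something along these lines will create it. The job name, the script path and the missing schedule are placeholders for your own setup rather than anything from my script, so adjust to taste and add a schedule with sp_add_jobschedule or in SSMS.

# Rough sketch: create an Agent job with a PowerShell job step that runs the collection script.
$JobQuery = @"
EXEC msdb.dbo.sp_add_job @job_name = N'DBADatabase - Server Info';
EXEC msdb.dbo.sp_add_jobstep @job_name = N'DBADatabase - Server Info',
     @step_name = N'Run server info collection',
     @subsystem = N'PowerShell',
     @command   = N'& "C:\DBAScripts\DBADatabaseServerUpdate.ps1"';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'DBADatabase - Server Info';
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database msdb -Query $JobQuery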

I create a function called Catch-Block to save keystrokes, and I put my commands inside try/catch blocks to make the script as robust as possible.

function Catch-Block{
    param ([string]$Additional)
    $ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
    $Message = " This message came from the Automated Powershell script updating the DBA Database with Server Information"
    $Msg = $Additional + $ErrorMessage + " " + $Message
    Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
    Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}
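Each command in the script is wrapped in the same pattern. The snippet below is a sketch of one wrapped call rather than a line lifted from the script, and it is this try/catch scaffolding that I strip out of the fragments further down.

try {
    # Any command that talks to a server or the DBADatabase sits inside a try block
    $Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT ServerName FROM Info.ServerOSInfo WHERE ServerName = '$Server'" -ErrorAction Stop
}
catch {
    # Pass a description of what failed; Catch-Block adds the exception details and logs both
    Catch-Block "Failed to check Info.ServerOSInfo for $Server"
}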

I give the function an Additional parameter which holds a custom error message for each command. I write this to both the event log and the text log to enable easy troubleshooting, and I include the message from the error itself by accessing it with $_. I won't include the try/catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $Servers.

$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT DISTINCT [ServerName] FROM [DBADatabase].[dbo].[InstanceList] WHERE Inactive = 0 OR NotContactable = 1"
$Servers = $AlltheServers | Select-Object -ExpandProperty ServerName

I then loop through the array and gather the information with three WMI queries.

foreach($Server in $Servers)
{
Write-Log -Path $LogFile -Message "Gathering Info for $Server "
$DNSHostName = 'NOT GATHERED'
$Domain = 'NOT GATHERED'
$OperatingSystem = 'NOT GATHERED'
$IP = 'NOT GATHERED'
try{
$Info = get-wmiobject win32_computersystem -ComputerName $Server -ErrorAction Stop|select DNSHostName,Domain,
@{Name="RAM";Expression={"{0:n0}" -f($_.TotalPhysicalMemory/1gb)}},NumberOfLogicalProcessors

I give the variables some default values in case they are not picked up, and I set the error action for the command to Stop so that a failure drops straight into the catch block. The first query gathers the DNSHostName, the domain name, the amount of RAM in GB and the number of logical processors; the second gathers the operating system version; the third was the most interesting to do. There are many methods of gathering the IP address with PowerShell and I tried a few of them before finding one that worked with all of the server versions in my estate. This is a good point to say that this works in my lab and in my shop but may not necessarily work in yours, so understand, check and test this, and any other script that you find on the internet, before you let it anywhere near your production environment.

Unfortunately, the method that worked everywhere remotely errored on the local server, so I added a check to see whether the server name in the variable matches the COMPUTERNAME environment variable.

$OS = Get-WmiObject Win32_OperatingSystem -ComputerName $Server | select @{Name='Name';Expression={($_.Caption)}}
if($Server -eq $env:COMPUTERNAME)
{$IP = (Get-WmiObject -ComputerName $Server -class win32_NetworkAdapterConfiguration -Filter 'ipenabled = "true"' -ErrorAction Stop).ipaddress[0] }
else {$IP = [System.Net.Dns]::GetHostAddresses($Server).IPAddressToString }
Write-Log -Path $LogFile -Message "WMI Info gathered for $Server "

Once I have all of the information, I check whether the server already exists in the ServerOSInfo table and choose either to insert or to update.

	$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT [ServerName] FROM [DBADatabase].[Info].[ServerOSInfo] WHERE ServerName = '$Server'"
	
	if ($Exists)
	{
	$Query = @"
	UPDATE [Info].[ServerOSInfo]
	   SET [DateChecked] = GetDate()
	      ,[ServerName] = '$Server'
	      ,[DNSHostName] = '$DNSHostName'
	      ,[Domain] = '$Domain'
	      ,[OperatingSystem] = '$OperatingSystem'
	      ,[NoProcessors] = '$NOProcessors'
	      ,[IPAddress] = '$IP'
	      ,[RAM] = '$RAM'
	WHERE ServerName = '$Server'
	"@
	}
	else
	{
	$Query = @"
	INSERT INTO [Info].[ServerOSInfo]
	           ([DateChecked]
	           ,[DateAdded]
	           ,[ServerName]
	           ,[DNSHostName]
	           ,[Domain]
	           ,[OperatingSystem]
	           ,[NoProcessors]
	           ,[IPAddress]
	           ,[RAM])
	     VALUES
	   ( GetDate()
	      ,GetDate()
	      ,'$Server'
	      ,'$DNSHostName'
	      ,'$Domain'
	      ,'$OperatingSystem'
	      ,'$NoProcessors'
	      ,'$IP'
	      ,'$RAM')
	"@
	}
	Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query

And that's it. Now, if you wish to gather different data about your servers, you can examine the data available to you by running:

get-wmiobject Win32_OperatingSystem -ComputerName $Server | Get-Member
get-wmiobject win32_computersystem -ComputerName $Server | Get-Member

If you find something that you want to gather, you can add the property to the script and collect that information as well. Make sure that you add the column to the table and to both the INSERT and UPDATE statements in the PowerShell script.
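For example, if you decided you also wanted the hardware manufacturer, the change is one more property in the select plus a matching column in Info.ServerOSInfo and in the two statements. This is a hypothetical extension rather than part of my script:

# Hypothetical extension: also capture the manufacturer from Win32_ComputerSystem.
# Remember to add a [Manufacturer] column to the table and to the INSERT and UPDATE.
$Info = Get-WmiObject Win32_ComputerSystem -ComputerName $Server -ErrorAction Stop |
    Select-Object DNSHostName, Domain, Manufacturer,
    @{Name="RAM";Expression={"{0:n0}" -f ($_.TotalPhysicalMemory/1gb)}},
    NumberOfLogicalProcessors
$Manufacturer = $Info.Manufacturer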

Creating the report in Power Bi

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I was born in Bolton and SQL SouthWest is based in Exeter 🙂

Open Power Bi Desktop and click Get Data. Add the connection details for your DBA Database server and database, and add the query:

SELECT SOI.[ServerOSInfoID]
      ,SOI.[DateChecked]
      ,SOI.[ServerName]
      ,SOI.[DNSHostName]
      ,SOI.[Domain]
      ,SOI.[OperatingSystem]
      ,SOI.[NoProcessors]
      ,SOI.[IPAddress]
      ,SOI.[RAM]
      ,IL.ServerName
      ,IL.InstanceName
      ,IL.Location
      ,IL.Environment
      ,IL.Inactive
      ,IL.NotContactable
FROM [DBADatabase].[Info].[ServerOSInfo] AS SOI
JOIN [dbo].[InstanceList] AS IL
    ON IL.ServerName = SOI.[ServerName]

[Screenshot 2]

Create a new column for the Operating System Edition by clicking Data on the left and using this code, as described in my previous post:

Operating System Edition = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And one for OS Version using this code

OS Version = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

I also created new measures to count the distinct number of servers and the number of instances, as follows:

Servers = DISTINCTCOUNT(Query1[Servers Name])
Instances = COUNT(Query1[Instance])

Then, in the report area, I start by creating a new text box to add a title to the report, and I set a page-level filter of Inactive = False so that decommissioned servers are not included.

[Screenshot 3]

I then create a donut chart for the number of servers by operating system by clicking the donut chart in the Visualisations pane and then dragging OS Version to Details and Servers Name to Values.

[Screenshot 4]

I then click the Format button and add a proper title and a background colour.

[Screenshot 5]

Then I create the server numbers by location in the same way: click the donut chart, add Location and the count of server names, and apply the same formatting as the previous donut.

[Screenshot 6]

I created a number of cards to hold single values for Domain, Instance, Server, RAM, Processors and the number of NotContactable servers, to provide a quick, easy view of those figures, especially when you filter the report by clicking on a value within a donut chart. I find that managers really like this feature. They are all created in the same way: click the card in the Visualisations pane and choose the value.

[Screenshot 7]

I also add tables for the number of servers by operating system and the number of servers by location by dragging those values to a table visualisation. I find that slicers are a very useful way of displaying information as required; using the slicer visualisation, I add the Environment column so that I can easily see values for just the live environment or the development environment.

I create a separate page in the report to display all of the server data, as this can be useful for other teams such as the systems (server admin) team. I give them plenty of slicers (Domain, Location, Environment, OS Version, Edition and NotContactable) with a table holding all of the relevant values, so they can quickly see the details they need.

[Screenshot 8]

You can get all of the scripts here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs

#TSQL2sDay Why My Head is Always in The Cloud

Today's post is my first for the TSQL2sDay series. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. If you want to catch up on all the fun to date, check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Azure Balloon – Credit http://owenrichardson.com/

This one is hosted by Jorge Segarra (@SQLChicken), who said: "This month's topic is all about the cloud. What's your take on it? Have you used it? If so, let's hear your experiences. Haven't used it? Let's hear why or why not. Do you like/dislike recent changes made to cloud services? It's clear skies for writing! So let's hear it folks, where do you stand with the cloud?"

My wife would tell you that my head is always in the cloud and she's right (she usually is), just not like that picture! I would love to float gracefully above the land and gaze upon the view, but it's the landing that bothers me and will always stop me from trying it.

Credit http://owenrichardson.com/

She's right, pedantically and literally too, because this year I have spent a lot of time with my head, my fingers and my thinking in virtual machines on Windows Azure. That is where I have learnt a lot of my SQL and PowerShell this year. After SQL Saturday Exeter and SQL Bits in Nottingham I have needed a place to practice and learn, an environment to try things, break things, mend them again and experiment.

I learn just as well by doing things as I do by reading about them. Stuart Moore (@napalmgram) has a great post called Learning to Play with SQL Server, and whilst I haven't been as rough with my Azure SQL instances as he suggests, I have been able to practice at will without worry and, thanks to my MSDN subscription, without cost. I have taken examples from blog posts and demos from user group sessions and run them on my Windows Azure VMs.

Every single blog post I have written this year that has examples was written in Azure, with screenshots from Azure. Whilst some of the scripts in my PowerShell Box of Tricks series had already been written to solve one particular problem or another at MyWork, every single one was refined and demoed in Azure, all the screenshots were taken there, and several were developed there too.

My first ever session to the SQL South West user group, about Spinning up and Shutting Down VMs in Azure, was an interesting experience in Murphy's Law which meant I ended up having to deliver it on Azure.

My second talk, on the PowerShell Box of Tricks series, was to the Cardiff User Group. Having learnt my lesson from the first time, I had bought a mini HDMI to VGA converter and tested it on a couple of monitors at home, and it worked wonderfully. However, when I got to Cardiff my little Asus convertible didn't provide enough grunt to power the funky presentation screen. Luckily, Stuart Moore (@napalmgram), who was also there doing his excellent PowerShell Backup and Restore session, let me use his Mac, so I was able to deliver the session using Office Web Apps to run the PowerPoint from my SkyDrive whilst all the demos were on … yup, you guessed it, Windows Azure!!!

So I feel qualified to answer Jorge’s questions and take part in T-SQL Tuesday this time round.

I like Azure. I like the ease with which I can spin machines or any PaaS services up and down at will. I love that I can do it with PowerShell, because I really enjoy using PowerShell in my day-to-day work and at home too. Living as I do in a beautifully convenient bungalow in the country, I still enjoy the frustration of watching that spinning ring as my videos buffer on our 1.8 Mbps (at best) internet connection. Whilst that does have an impact on using Azure, it is a damn sight better than waiting many days to download one single file – something like an ISO of the latest SQL Server CTP, for example.

There is no way I would have got a look at SQL Server 2014 if it wasn't for Azure. I was able to spin up a SQL Server 2014 machine in only a few minutes, log in, have a play and then delete it. I have done the same with Server 2012 and 2012 R2. It has enabled me to try setting up Availability Groups and other technologies not yet implemented at MyWork.

I wouldn't have been able to do any of that on my machines at home, as I don't have anything capable of running Hyper-V while this 8-year-old desktop still keeps hanging on despite the odd noises. (Negotiations are currently in place to replace it with something shiny and new. Just need that lottery win now!!)

I have also transferred my Cricket Averages database to WASD and am talking with a friend of mine about developing an app that will use the mobile service as well.

The rate of change is much quicker in the cloud; things change, and change quickly. Almost as soon as I had written my post about Spinning up and Shutting Down VMs in Azure, Microsoft changed the rules and stopped charging for machines that were turned off. New services appear all the time. New services move quickly from Preview to release, and as Grant Fritchey noticed this week, new views have been added to Windows Azure SQL Database under the covers. I think this is something we are just going to have to live with. The scale of the cloud means it is much easier to test improvements at large scale, and that means they can be released quicker. It makes it more challenging to keep up, I admit, but it's a constant drip of new things rather than a big bang all at once.

Azure has brought me to where I am today and I think it will continue to be part of my future. If I remember to submit my PowerShell session for SQL Saturday Exeter (submit yours here) and it gets chosen, then you will be able to see me there (if you register here) using Azure to give back to the SQL community.

SQL Saturday Exeter–What’s the Point? My Experience of 2013 SQLSatExeter

 

Disclaimer – I am on the committee organising the next SQL Saturday Exeter. To be kept up to date about SQL Saturday #269 in the South West, follow @SQLSatExeter and #SQLSatExeter on Twitter and see details at the bottom. This post is about my experience at this year's event.

In March this year the SQL South West User Group hosted SQL Saturday #194 in Exeter. I was a new member of the user group, having finally been able to join them for the first time in January. At that meeting Chris Testa-O'Neill presented a session and was very passionate about the SQL community and the benefit of SQL Saturdays and other events. I am always keen to learn new things and find ways of developing my skills. As I haven't won the lottery, I also look out for good deals!!

SQL SATURDAY PRE-CONS ARE EXCEPTIONAL VALUE

It was relatively easy to persuade my bosses to pay for my pre-con. For £150 I was able to spend a whole day in a room with about a dozen people being trained in SQL Server Security by Denny Cherry @mrdenny. The conversation went along the lines of

“I want to go to this training session being delivered by this guy. Link to MVP page. It’s £150 and is in Exeter so no other costs required”

My boss – “OK”

Of course there was a little more fun and games to be had with the payment but it was easy for me to get training sorted and £150 is not going to break the training budget.

Looking back through my notes from the session today, I realise quite how much I have taken from it into my role at work. I can't really say which parts and where I used them, though – that wouldn't be good security!!

I remember an enjoyable day with plenty of technical learning, a lot of questions and answers and plenty of laughs as well. But more than that, there was the opportunity to mix with other professionals and talk with them. During the breaks and at lunch there were plenty of opportunities to chew the fat, learn how others do things, make new friends and put faces to Twitter handles. (Note: I do look pretty much like my Twitter profile picture, so if you see me at SQL community events I expect you to come up and say hi – that's part of the benefit of attending these events, having a good natter.)

Take a look at the end of this post for details of 2014 Pre-Cons

SQL SATURDAY – CAN’T GET CHEAPER THAN FREE

SQL Saturdays are FREE

SQL Saturdays offer sessions from internationally renowned and local SQL speakers on subjects relevant to you and your job, your future career, your development plan, or just to challenge yourself by learning about something outside of your comfort zone. For nothing. Add in the networking opportunities and the prizes from the sponsors (if you were at Exeter this year, the beer and the pasty), and if you added it all up it's a sizeable investment in yourself, your career and your development (did I mention the free beer and pasty?).

NOT BAD FOR FREE!!

To enable that, SQL Saturday organisers have to go out and talk sponsors into putting their hands into their pockets. They will only do that if it is worthwhile to them. You can make it easier for the organisers by going and spending time with the sponsors during the breaks, chatting with them and giving them your details. Also, if you choose to use one of their products, please tell the sponsors you spoke to them at a SQL Saturday. They are (usually) data professionals who will record that and use it to make future decisions, which we hope will include sponsoring SQL Saturdays.

This year on the Saturday I went to the following sessions

A temporary fix for a short term problem by Ian Meade
Advanced SQL Server 2012 HA and DR Architectures by Christian Bolton
Busting common T-SQL myths by Dave Morrison
Power View and the Cube by Régis Baccaro
Natural Born Killers, performance issues to avoid by Richard Douglas
Tracking server performance without slowing it down by Jonathan Allen which I also Room Monitored
Increasing Business and IT collaboration by Chris Testa-O’Neill

It was a really good day. I learnt so much from all those knowledgeable and talented people, and it really kicked me on in my development at work. I was able to take something from each of those sessions and use that knowledge to do my job better, and I made new friends and new contacts. Just going back to my notes today has reminded me of something that I need to look into for work 🙂 Some of the conversations I have had at events this year have been fascinating: learning how other people do the same thing you do in a completely different but equally valid way, problem-solving with a different set and type of minds than the ones at MyWork, laughing at the same things and moaning about similar frustrations. All have been both entertaining and rewarding, and they are worth mentioning as things I enjoyed about going to SQL community events this year and part of the reason I shall continue to go to them. (Just hope my boss doesn't read this and think he won't have to pay, as I will go anyway!)

It's busy and hectic: the sessions come along thick and fast and there are lots of people around to talk to. I wish I had made use of the SQL Saturday mobile phone app, and I definitely recommend researching ahead of time and planning your day out.

This year's sessions have not been decided yet, but I have seen some of the submissions and there are some fabulous sessions there. You could also submit a session yourself. Choosing the sessions will be tough, but we want to offer the opportunity to speak to as many people as possible, both new and experienced speakers.

You can submit your sessions at this link http://www.sqlsaturday.com/269/callforspeakers.aspx

ROUND-UP – SQL SATURDAY EXETER – WHY WOULDN'T YOU COME?

For a newbie, as I was last time, SQL Saturday Exeter was a revelation.

An opportunity to learn without spending thousands of my own or MyWorks money to sit in a lecture room and listen to a trainer.

A chance to develop my understanding in a friendly environment amongst my peers where I could ask questions.

A place to meet new people and build relationships who have helped me with situations at work throughout the year. I reckon I’m in credit already

This year I have attended SQL Bits and SQL Saturday Cambridge, and this month I shall be at SQL Relay in Cardiff and in Bristol. That all started with SQL Saturday #194 in Exeter in 2013.

WHAT ABOUT NEXT YEAR'S SQL SATURDAY EXETER?

Next year's SQL Saturday in Exeter, SQL Saturday #269, will be held at the same place – Jury's Inn Hotel Exeter – on March 21st/22nd 2014.

We had such amazing submissions for our pre-cons that we have had to find more rooms to be able to fit them all in. You can see for yourself the quality of the sessions and speakers for SQL Saturday Exeter 2014 at the following link:

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

What do you think? I want to split myself into 8 and go to every one!

WHAT SHOULD YOU DO NOW?

I suggest that you should book Saturday 22nd March 2014 out in your calendar right this minute. Done that? Good.

Now go to this link

http://www.sqlsaturday.com/269/

and register for FREE to attend and let us know @SQLSatExeter

Next, make yourself a coffee (other beverages are available) and head to the pre-con page:

http://sqlsouthwest.co.uk/sql-saturday-269-precon-training-day-details/

This bit is up to you; the choice is hard. I can't tell you which one of our eight fabulous sessions you want to go to. It's not for me to say which amazing speaker you want to spend a day with for a bargain price, but if you need further info please get in touch and we will try to help. Unfortunately our human cloning experiment is not stable enough to allow you to go to more than one!

Then, let me know you have done so and come and say hi when you are here.

Lessons Learnt from my first talk at SQL SouthWest

The timing was good enough that, when Jonathan and Annette (@FatherJack and @MrsFatherJack) put out a call for volunteers, I could offer to do a talk for my SQL user group, SQL SouthWest, based on my previous post on Windows Azure.

I did my best with the 7 Ps: I ran through it at lunchtime, I made sure I had power and an HDMI lead after checking with Jonathan, I got a glass of water, and I knew the first line I was going to say.

However, I neglected to check that I would have an HDMI input at the venue, so everything that was on my laptop was useless! My laptop did very odd things to the USB stick when I tried to transfer the files to Jonathan's laptop, and he didn't have PowerShell V3 installed, so whilst Neil Hambly (@Neil_Hambly) from Confio was speaking I was busy ignoring a very interesting talk on waits in order to install and configure Azure PowerShell on my Azure VM. Sorry Neil.

But in the end it more or less worked, and we are lucky to have such a patient and supportive user group who helped me along the way as well. Thank you, folks.

Things I took away from the evening

  1. Double check you have all the connections
  2. Practice and Practice some more
  3. Think about the times when something is running and what you will say when there is nothing to see
  4. Presenting completely inside a RDP session adds unnecessary complication
  5. The Demo Gods WILL hit you and the curse of the red text will fall upon you during the presentation. Accept it and move on.
  6. Have an opening line
  7. Remember to breathe (especially when the demo falls over)
  8. Enjoy it!

It didn't go perfectly, but people gave me some good feedback, and I am pleased to say that I pointed people towards something new that will help them and passed on my knowledge – and that, to me, is what the SQL community is all about. I have a load of other ideas for things I can talk and blog about, so it is going to be a very busy time for me as I work my way through them and do all the other exciting things coming my way in the SQL world.

Visit your own User Group – You can find them here http://www.sqlpass.org/

If you are in the South West of the UK then come and join our group. Free training and conversation with like-minded people once a month, and pizza too – what could be better?!