Parsing XML Child Nodes and Converting to DateTime with PowerShell

Posted on

As part of my organiser role for SQLSaturday Exeter (Training Day Information here and Saturday Information here) I needed to get some schedule information to input into a database.

I had read Steve Jones' blog posts on Downloading SQL Saturday Data and followed the steps there to download the data from the SQL Saturday website for our event.

A typical session is stored in the XML like this:

 <event>
 <importID>27608</importID>
 <speakers>
 <speaker>
 <id>27608</id>
 <name>William Durkin</name>
 </speaker>
 </speakers>
 <track>Track 2</track>
 <location>
 <name>Buccaneer's Refuge </name>
 </location>
 <title>Stories from the Trenches: Upgrading SQL with Minimal Downtime</title>
 <description>SQL Server has come a long way in the last few years, with Microsoft investing heavily in High Availability features. This session will show you how to use these features to enable you to safely upgrade a SQL Server, while ensuring you have a return path if things should go wrong. You will leave the session knowing what features you can use to upgrade either the OS, Hardware or SQL Server version while keeping your maintenance window to a minimum. The session will apply to Standard Edition as well as Enterprise Edition, so doesn't only apply to 'High Rollers'!</description>
 <startTime>4/25/2015 3:20:00 PM</startTime>
 <endTime>4/25/2015 4:10:00 PM</endTime>
 </event>

I needed to output the following details – Speaker Name, Room, Start Time, Duration and Title

To accomplish this I examined the node for William's session


$i = 372
$baseURL = "http://www.sqlsaturday.com/eventxml.aspx?sat="
$DestinationFile = "E:\SQLSatData\SQLSat" + $i + ".xml"
$sourceURL = $baseURL + $i

$doc = New-Object System.Xml.XmlDocument
$doc.Load($sourceURL)
$doc.Save($DestinationFile)

$Sessions = $doc.GuidebookXML.events

$Sessions.event[39]

I then established that to get the speaker's name I had to obtain the value from the child node, which I accomplished as follows


$Speaker = @{Name="Speaker"; Expression = {$_.speakers.speaker.name}}

$Sessions.event[39]|select $Speaker #To check that it worked

This is an easy way to obtain sub (or child) properties within a select in PowerShell, and I would recommend that you practise and understand the syntax of @{Name=""; Expression = {}}, which will enable you to perform all kinds of manipulation on those objects. You are not limited to obtaining child properties; you can perform calculations as well.
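For instance, the Expression script block can run any calculation, not just walk down to a child node. A minimal sketch, using a made-up object rather than the SQLSaturday XML:

```powershell
# A hypothetical session object standing in for a node from the XML
$session = [pscustomobject]@{
    title       = 'Example Session'
    description = 'A short abstract.'
}

# One calculated property transforms a value, the other computes a number
$TitleUpper = @{Name = 'TitleUpper'; Expression = { $_.title.ToUpper() } }
$DescLength = @{Name = 'DescriptionLength'; Expression = { $_.description.Length } }

$session | Select-Object $TitleUpper, $DescLength
```

The same @{Name=""; Expression={}} hashtable shape works anywhere Select-Object accepts a property.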

I did the same thing to get the room and the start time


$Room = @{Name="Room"; Expression = {$_.location.name}}
$StartTime = @{Name="StartTime"; Expression = {$_.StartTime}}
$Sessions.event[39]|select $Speaker,$Room,$StartTime #To check that it worked

I then needed duration and thought that I could use


$Duration = @{Name ="Duration"; Expression = {($_.EndTime) - ($_.StartTime)}}

$Sessions.event[39]|select $Duration

However, that just gave me a blank result, so to troubleshoot I ran


$Sessions.event[39].endtime - $sessions.event[39].startTime

Which errored with the (obvious when I thought about it) message

Cannot convert value "4/25/2015 4:10:00 PM" to type "System.Int32". Error: "Input string was not in a correct format."
At line:1 char:1
+ $Sessions.event[39].endtime - $sessions.event[39].startTime
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastFromStringToInteger

The value was stored as a string.

Running


$Sessions.event[39].endtime |Get-Member

showed me that there was a method called ToDateTime, but there is an easier way. By defining the datatype of an object, PowerShell will convert it for you, so the resulting code looks like this


$Sessions = $doc.GuidebookXML.events
$Speaker = @{Name="Speaker"; Expression = {$_.speakers.speaker.name}}
$Room = @{Name="Room"; Expression = {$_.location.name}}
$Duration = @{Name ="Duration"; Expression = {[datetime]($_.EndTime) - [datetime]($_.StartTime)}}
$startTime = @{Name="StartTime"; Expression = {[datetime]($_.StartTime)}}
$Sessions.event|select $Speaker,$Room,$Starttime,$Duration,Title |Format-Table -AutoSize -Wrap

and the resulting entry is finally as I required it. I believe that this will use the regional settings of the machine that you are running it on, but I have not verified that. If anyone in a different region would like to run this code and check whether that is the case, I will update the post accordingly
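One way to take regional settings out of the equation, if they do turn out to matter, is to parse with an explicit format against the invariant culture. A sketch, assuming the feed always uses the M/d/yyyy h:mm:ss tt layout shown in the XML above:

```powershell
# Parse the feed's date strings with a fixed format and the invariant
# culture, so the machine's regional settings cannot change the result
$inv   = [System.Globalization.CultureInfo]::InvariantCulture
$start = [datetime]::ParseExact('4/25/2015 3:20:00 PM', 'M/d/yyyy h:mm:ss tt', $inv)
$end   = [datetime]::ParseExact('4/25/2015 4:10:00 PM', 'M/d/yyyy h:mm:ss tt', $inv)

# The subtraction now works regardless of locale
($end - $start).TotalMinutes
```

The same [datetime]::ParseExact call could be used inside the $StartTime and $Duration expressions in place of the plain [datetime] cast.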


Hopefully you have learnt from this how you can extend select in the pipeline and how defining the datatype can be beneficial. Any questions, please comment below

Speaking at PowerShell Virtual Chapter and SQL Cardiff User Group this month

Posted on

Just a quick post to say that I will be speaking at the PowerShell Virtual Chapter meeting this Thursday at 4pm GMT / 12pm EDT and also at the Cardiff SQL User Group on Tuesday 31st March

I will be giving my Making PowerShell Useful for your Team presentation

You have heard about PowerShell and maybe spent a little bit of time exploring some of the ways in which it will benefit you at work. You want to be able to perform tasks more specific to your organisation and need to share them with your team. I will show you how you can achieve this by demonstrating:

  • An easy way to learn the syntax
  • How to explore SQL Server with PowerShell
  • How to turn your one-off scripts into shareable functions
  • How to ensure that your team can easily and quickly make use of and contribute to PowerShell solutions
  • Where else to go for help

You can find out more about the Virtual Chapter here

http://powershell.sqlpass.org/ 

and the Cardiff meeting here

http://www.meetup.com/Cardiff-SQL-Server-User-Group/events/219492623/ 

The Cardiff meeting has been named The Battle Of The Beards as it features Tobiasz Koprowski talking about Windows Azure SQL Database – Tips and Tricks for Beginners and Terry McCann with SSRS Inception. I will be giving the same presentation as at the Virtual Chapter

I hope to see you at one or both sessions

Triggering a System Center Configuration Manager deployment task

Posted on Updated on

A slightly different topic today.

Once you have built up knowledge, you become the person that people ask to solve things. This is something I really enjoy: taking a problem and solving it for people, and in the process teaching them and enabling them to automate more things.

A colleague was performing a new deployment of a product via SCCM and wanted to trigger the clients to update and receive the new update instead of waiting for it to be scheduled.

They had found some code that would do this

Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null

They had the idea of using this command with a text file containing the machine names and PowerShell Remoting.

I looked at it a different way and gave them a function so that they could provide the collection name (in SCCM, a collection is a list of machines for a specific purpose) and the function would import the SCCM module, connect to the site, get the names of the machines in the collection and run the command on each one

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# PS script to run

$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
#open drive for SCCM 
cd <Site Code>:\ # replace <Site Code> with your site code, or add a $SiteCode parameter and use cd "$($SiteCode):\"
# Get Computer names in collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

Invoke-Command -ComputerName $PCs -ScriptBlock $scriptblock -ThrottleLimit 50

}  

This would work very well, but they wanted some error checking to enable them to identify machines they were unable to connect to following the deployment, so the final solution, which runs a little slower, is below.

Set up function and parameters and create log files

function Trigger-DeploymentCycle
{
param
(
[string]$CollectionName
)

# Create log file
$StartTime = Get-Date
$Date = Get-Date -Format ddMMyyHHss
$Errorlogpath = "C:\temp\SCCMError" + $Date + ".txt"
$Successlogpath = "C:\temp\SCCMSuccess" + $Date + ".txt"
New-Item -Path $Errorlogpath -ItemType File
New-Item -Path $Successlogpath -ItemType File

$StartLog = "Script Started at $StartTime"
$StartLog | Out-File -FilePath $Successlogpath -Append

Create the script block, import the SCCM module, connect to the SCCM site and get the machines in the collection. Note that you will have to replace <Site Code> with your own site code

 

$scriptblock = {
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000121}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000021}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000022}"|Out-Null
    Invoke-WMIMethod -Namespace root\ccm -Class SMS_CLIENT -Name TriggerSchedule "{00000000-0000-0000-0000-000000000002}"|Out-Null
    }

## import SCCM module
Import-Module (Join-Path $(Split-Path $env:SMS_ADMIN_UI_PATH) ConfigurationManager.psd1)
#open drive for SCCM 
cd <Site Code>:\ # replace <Site Code> with your site code, or add a $SiteCode parameter and use cd "$($SiteCode):\"
# Get Computer names in collection
$PCs = (Get-CMDeviceCollectionDirectMembershipRule -CollectionName $CollectionName).rulename
$Count = $PCs.count
Write-Output "Total number of PCs = $Count"

I wanted to give them a progress output, so I needed to identify the number of machines in the collection by using the Count property. I then needed to output the position of the item within the array, which I did with

$a= [array]::IndexOf($PCs, $PC) + 1
Write-Output " Connecting to PC - $PC -- $a of $count"
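The counter arithmetic can be checked in isolation with a throwaway array (the PC names here are made up, not real collection members):

```powershell
# Sample array standing in for the collection membership
$PCs   = @('PC01', 'PC02', 'PC03')
$count = $PCs.Count

foreach ($PC in $PCs) {
    # IndexOf is zero-based, so add 1 for a human-friendly "n of total"
    $a = [array]::IndexOf($PCs, $PC) + 1
    Write-Output "Connecting to PC - $PC -- $a of $count"
}
```

Note that [array]::IndexOf returns the first match, so the counter assumes the collection does not contain duplicate names.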

I then pinged the machine, ran the script block and wrote to the log files, and finally opened the log files

if (Test-Connection $PC -Quiet -Count 1)
{
    # Run command on PC
    Invoke-Command -ComputerName $PC -ScriptBlock $scriptblock
    $Success = "SUCCESS - finished - $PC -- $a of $count"
    $Success | Out-File -FilePath $Successlogpath -Append
    Write-Output $Success
}
else
{
    $ErrorMessage = "ERROR - $PC is not available -- $PC -- $a of $count"
    $ErrorMessage | Out-File -FilePath $Errorlogpath -Append
    Write-Output $ErrorMessage
}
}

notepad $Errorlogpath
notepad $Successlogpath

Now they can load the function into their PowerShell sessions and type

Trigger-DeploymentCycle COLLECTIONNAME

and they will be able to manually trigger the tasks. This function will trigger the following tasks for a list of PCs in a collection.

Machine Policy Assignment Request: {00000000-0000-0000-0000-000000000021}
Machine Policy Evaluation: {00000000-0000-0000-0000-000000000022}
Software Inventory: {00000000-0000-0000-0000-000000000002}
Application Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000121}

Here is the list of other tasks you can trigger:

Discovery Data Collection Cycle: {00000000-0000-0000-0000-000000000003}
Hardware Inventory Cycle: {00000000-0000-0000-0000-000000000001}
Machine Policy Retrieval and Evaluation Cycle: {00000000-0000-0000-0000-000000000021}
Software Metering Usage Report Cycle: {00000000-0000-0000-0000-000000000031}
Software Updates Deployment Evaluation Cycle: {00000000-0000-0000-0000-000000000108}
Software Updates Scan Cycle: {00000000-0000-0000-0000-000000000113}
Software Update Deployment Re-evaluation: {00000000-0000-0000-0000-000000000114}
Windows Installer Source List Update Cycle: {00000000-0000-0000-0000-000000000032}
Refresh Default Management Point: {00000000-0000-0000-0000-000000000023}
Refresh Location (AD site or Subnet): {00000000-0000-0000-0000-000000000024}
Refresh Proxy Management Point: {00000000-0000-0000-0000-000000000037}
Cleanup Policy: {00000000-0000-0000-0000-000000000040}
Validate Assignments: {00000000-0000-0000-0000-000000000042}
Certificate Maintenance: {00000000-0000-0000-0000-000000000051}
Branch DP Scheduled Maintenance: {00000000-0000-0000-0000-000000000061}
Branch DP Provisioning Status Reporting: {00000000-0000-0000-0000-000000000062}
State Message Upload: {00000000-0000-0000-0000-000000000111}
State Message Cache Cleanup: {00000000-0000-0000-0000-000000000112}

You can find the function here

Trigger-Deployment

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.

Show AutoGrowth Events with Powershell to CSV

Posted on Updated on

This week I was reading Pinal Dave's post about autogrowth events

http://blog.sqlauthority.com/2015/02/03/sql-server-script-whenwho-did-auto-grow-for-the-database/

As it happened, I had a requirement to make use of the script only a few days later. I was asked to provide the information in a CSV so that the person who required it could manipulate it in Excel.

I am a great believer in automation. If you are going to do something more than once then automate it, so I wrote two functions, added them to TFS, and now they will be available to all of my team members next time they load PowerShell.

Why two functions? Well, Pinal Dave's script gets the information from the default trace for a single database, but there may be times when you need to know the autogrowth events that happened on a server with multiple databases.

I use a very simple method for doing this, as I have not found the correct way to parse the default trace with PowerShell. The functions rely on Invoke-Sqlcmd2, which I also have in my functions folder, and pass the query from Pinal Dave's blog post as a here-string

$Results = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query $Query
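I won't reproduce Pinal Dave's script here, but the here-string pattern looks like this. The query below is a simplified stand-in of my own, not his exact T-SQL; it reads the default trace and filters on the auto-grow event classes (92 for data files, 93 for log files):

```powershell
# A here-string keeps the multi-line T-SQL readable inside the function.
# The query path lookup assumes the default trace is enabled on the instance.
$Query = @"
SELECT  te.name AS EventName,
        t.DatabaseName,
        t.FileName,
        t.StartTime,
        t.ApplicationName
FROM    sys.fn_trace_gettable(
            (SELECT TOP 1 path FROM sys.traces WHERE is_default = 1), DEFAULT) AS t
JOIN    sys.trace_events AS te ON t.EventClass = te.trace_event_id
WHERE   t.EventClass IN (92, 93)
ORDER BY t.StartTime DESC;
"@
```

The $Query variable is then passed to Invoke-Sqlcmd2 exactly as shown above.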

To output to CSV I use the Export-CSV cmdlet

if($CSV)
{
$Results| Export-Csv -Path $CSV
}

And to open the CSV I add a [switch] parameter. You can find out more about parameters here or by running

Get-Help about_Functions_Advanced_Parameters
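Putting those pieces together, the shape of such a function might look something like this. This is a simplified sketch, not the actual Show-Autogrowth functions; Show-ResultsAsCsv is a hypothetical name and the Invoke-Item call is my assumption about how the -ShowCSV switch opens the file:

```powershell
function Show-ResultsAsCsv
{
    param
    (
        [Parameter(Mandatory=$true)]
        [object[]]$Results,
        [Parameter(Mandatory=$false)]
        [string]$CSV,
        [Parameter(Mandatory=$false)]
        [switch]$ShowCSV
    )

    # Export only when a path was supplied
    if ($CSV)
    {
        $Results | Export-Csv -Path $CSV -NoTypeInformation
    }
    # The switch gates opening the file in its default application
    if ($ShowCSV -and $CSV)
    {
        Invoke-Item $CSV
    }
}
```

Because $ShowCSV is a [switch], callers just add -ShowCSV with no value to turn the behaviour on.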

so the parameter block of my function looks like

param
(
[Parameter(Mandatory=$true)]
[string]$Server,
[Parameter(Mandatory=$true)]
[string]$Database,
[Parameter(Mandatory=$false)]
[string]$CSV,
[Parameter(Mandatory=$false)]
[switch]$ShowCSV
)

Now when I am asked again to provide this information it is as easy as typing

Show-AutogrowthServer -Server SQL2014Ser12R2 

or

Show-AutogrowthDatabase -Server SQL2014Ser12R2 -Database Autogrowth

and the results will be displayed as below


Just a side note: Pinal Dave's script uses @@SERVERNAME in the WHERE clause, and if you have renamed your host the results will be blank. The resolution is to run the following T-SQL

sp_dropserver 'OLDSERVERNAME';
GO
sp_addserver 'NEWSERVERNAME', 'local';
GO

You can find the scripts here

Show-AutoGrowthServer

Show-AutoGrowthDatabase

and all of my Script Center Submissions are here

As always – The internet lies, fibs and deceives and everything you read including this post should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.

Uploading a Source Folder to Azure File Storage

Posted on

Azure File Storage enables you to present an Azure Storage Account to your IaaS VMs as a share using SMB. You can find out further details here

http://azure.microsoft.com/en-gb/documentation/articles/storage-dotnet-how-to-use-files/ 

Once you have created your Azure File Storage account and connected your Azure virtual machines to it, you may need to upload data from your premises into the storage so that it can be accessed by the virtual machines

To accomplish this I wrote a function and called it Upload-ToAzureFileStorage

I started by creating a source folder and files to test

New-Item -Path C:\temp\TestUpload\New1 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New2 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New3 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New4 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New5 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\b -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\c -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\d -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\1 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\2 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\3 -ItemType Directory
New-Item -Path C:\temp\TestUpload\New1\list\a\4 -ItemType Directory

New-Item -Path C:\temp\TestUpload\New1\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New2\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New3\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New4\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New5\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\1\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\2\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\3\file.txt -ItemType File
New-Item -Path C:\temp\TestUpload\New1\list\a\4\file.txt -ItemType File

Then we needed to connect to the subscription, get the storage account access key and create a context to store them

#Select Azure Subscription
Select-AzureSubscription -SubscriptionName $AzureSubscriptionName

# Get the Storage Account Key
$StorageAccountKey = (Get-AzureStorageKey -StorageAccountName $StorageAccountName).Primary

# create a context for account and key
$ctx=New-AzureStorageContext $StorageAccountName $StorageAccountKey

The Get-AzureStorageShare cmdlet shows the shares available for the context, so we can check if the share exists

$S = Get-AzureStorageShare -Context $ctx -ErrorAction SilentlyContinue|Where-Object {$_.Name -eq $AzureShare}

and if it doesn't exist, create it using New-AzureStorageShare

$s = New-AzureStorageShare $AzureShare -Context $ctx

For the sake of doing it a different way, we can check for the existence of the directory in Azure File Storage that we are going to upload the files to like this

$d = Get-AzureStorageFile -Share $s -ErrorAction SilentlyContinue|select Name

if ($d.Name -notcontains $AzureDirectory)

and if it doesn't exist, create it using New-AzureStorageDirectory

$d = New-AzureStorageDirectory -Share $s -Path $AzureDirectory

Now that we have the directory created in the storage account we need to create any subfolders. First get the folders

# get all the folders in the source directory
$Folders = Get-ChildItem -Path $Source -Directory -Recurse

We can then iterate through them using a foreach loop. If we do this and select the FullName property, the results will be

C:\temp\TestUpload\New1
C:\temp\TestUpload\New2
C:\temp\TestUpload\New3
C:\temp\TestUpload\New4
C:\temp\TestUpload\New5
C:\temp\TestUpload\New1\list
C:\temp\TestUpload\New1\list\a
C:\temp\TestUpload\New1\list\b
C:\temp\TestUpload\New1\list\c
C:\temp\TestUpload\New1\list\d
C:\temp\TestUpload\New1\list\a\1
C:\temp\TestUpload\New1\list\a\2
C:\temp\TestUpload\New1\list\a\3
C:\temp\TestUpload\New1\list\a\4

but to create the new folders we need to remove the "C:\temp\TestUpload" prefix and replace it with the directory name in Azure. I chose to do this using the Substring method and the length of the source folder path.

foreach($Folder in $Folders)
 {
 $f = ($Folder.FullName).Substring(($source.Length))
 $Path = $AzureDirectory + $f

and tested that the results came out as I wanted

AppName\New1
AppName\New2
AppName\New3
AppName\New4
AppName\New5
AppName\New1\list
AppName\New1\list\a
AppName\New1\list\b
AppName\New1\list\c
AppName\New1\list\d
AppName\New1\list\a\1
AppName\New1\list\a\2
AppName\New1\list\a\3
AppName\New1\list\a\4

I could then create the new folders in Azure using New-AzureStorageDirectory again

New-AzureStorageDirectory -Share $s -Path $Path -ErrorAction SilentlyContinue

I followed the same process with the files

$files = Get-ChildItem -Path $Source -Recurse -File

foreach($File in $Files)
 {
 $f = ($file.FullName).Substring(($Source.Length))
 $Path = $AzureDirectory + $f

and then created the files using Set-AzureStorageFileContent. This has a -Force and a -Confirm switch, and I added those into my function by using a [switch] parameter

#upload the files to the storage

 if($Confirm)
 {
 Set-AzureStorageFileContent -Share $s -Source $File.FullName -Path $Path -Confirm
 }
 else
 {
 Set-AzureStorageFileContent -Share $s -Source $File.FullName -Path $Path -Force
 }

You can download the function from the Script Center

https://gallery.technet.microsoft.com/scriptcenter/Recursively-upload-a-bfb615fe

As always, any comments or queries are welcome and obviously the internet lies, so please understand and test all code you find before using it in production

Twas 2 Days Before Xmas or Thank you SQLFamily

Posted on Updated on

Twas 2 days before Xmas & all through the office,
not a creature was stirring not even old Maurice.
With merriment going on outside of his window
There sat a bearded DBA without much to do

No changes can be made through the holiday season
We’re on skeleton support, which is a good reason
Ensure you are making the most of your time
You mustn’t be wasting the company dime

The backups are checked, there isn’t an issue
So documentation writing should ensue
Instead he decided to procrastinate
And so, this little ditty he proceeded to create

Looking back over last year he did ruminate
About all the progress he had made, it was great
So much had been learned, so many improvements
Derived using content from fine ladies and gents

Impossible to estimate how much it would cost
Or calculate the amount of revenue lost
For all that he would have been unable to do
Or the times that he knew how to get out of a stew

But also the friends old, new and the rest
The talking and dining and drinking and jest
I am lucky to be a part of the SQL Family
So thank you one and all, with love from me

beard

Giving Back – #TSQL2sday

Posted on Updated on

T-SQL Tuesday, which was started by Adam Machanic (blog|twitter) and is now starting its 6th year, is hosted by a different person each month. The host selects the theme and then the blogging begins. Worldwide, on the second Tuesday of the month (all day, based on GMT time), bloggers attend this party by blogging about the theme. This month it is hosted by Wayne Sheffield (blog|twitter) and, in the spirit of the holiday season, it is about giving. This is my post on giving back, in the little ways as well as the bigger ones: how you can give back and why it benefits not only others but also yourself

What’s the SQL Family?

The SQL Family (or SQL community, call it what you will) is a fabulous place to be, full of many wonderful, talented, passionate and generous people. Every method of interaction that you wish for can be found if you look. Ever wanted to know how others do it? Or does this happen in other shops? Or I wish I had access to someone who knows about 'insert new shiny thing here'?

I guess that that is how I joined. I had no peers in my shop. I had no one to turn to, to ask questions or get advice. I had no support, and I turned to the internet. Now, we all know that the internet lies. It is full of accidental and deliberate mistruths, of partial information and downright bad advice. You have to be careful where you go and who you trust. I gradually found myself going back to the same resources, and from those I found out about PASS and user groups

I am in the UK. I found the list of UK SQL User Groups

You can find more here both local and virtual

User Groups

I found a user group near me and went along to a meeting, not knowing what to expect. I found a group of people like me willing to give up their time to learn and share knowledge: a wide range of people, from DBAs, developers, BI professionals and sysadmins working for international, multi-regional companies looking after many hundreds of SQL Servers, to single sysadmins looking after the whole kit and caboodle, and everything in between. A wealth and breadth of knowledge to tap into. You will learn so much, not only from the sessions but also from the conversation with all these other talented people

Come along.

Simply coming along will bring benefit. Other people will be interested in what you have to say even if you are in week 0 of your first ever job. Your view will still be valued. Everyone can learn from everybody and NO-ONE knows it all.

There will come a point where you will pass on a piece of knowledge or an idea or a way of working and someone will say thank you I didn’t know that. You just gave back. It may even be someone you look up to, someone whose knowledge and experience far outweighs yours whose word you hang on to. That feels good.

You may ask the questions that others thought but didn't ask, and boy, are they glad you asked the question. You just gave back. It's something I do often. I ask questions, and sometimes I ask questions I know the answer to so that the answers will be of benefit to the group.

What will you get? More than you can ever put in. Free training, often free pizza, knowledge, advice, guidance, contacts, support, a network of people in your field, notice of job openings, swag, fun, friends, more social events and more and more

The user groups are run by volunteers in their own time out of the goodness of their hearts. They will always need your help. Turn up 5 minutes earlier and help set out the chairs or put out the handouts or assist with the tech. You just gave back. Afterwards before going to the pub clear the tables, help carry the boxes, put the pizza evidence in the bin. You just gave back

SQL Saturdays and other community events

SQL Saturdays are held all over the world most every Saturday. You can find out more about them here https://www.sqlsaturday.com/ There are also other larger events such as SQL Bits and SQL Relay here in the UK. Everything I wrote about user groups counts here, just on a slightly larger scale. You will be able to attend several sessions across many different areas for free on a Saturday

These events are also run by volunteers and they will also need your help. If you can spare some time to help on a registration desk you just gave back. A room monitor to ensure the speaker and delegates have everything they need, the room is tidy and the session runs to time. You just gave back. Putting things out and tidying them away again. You just gave back.

You can become a volunteer by asking the people organising the events if they would like your help. These events will all have twitter feeds and emails and facebook pages and many methods of getting in touch. Contact them and offer your help if you can. You just gave back.

If you fancy taking the next step then you can get involved in organising the events. This is hard work, great fun, a good thing to add to your CV and you just gave back. There are so many areas to get involved organising an event. Premises and technology, speakers and printers, volunteers and sponsors all need co-ordination. Websites,twitter feeds, feedback forms, posters, marketing materials all need designing and producing. There are so many ways in which you will be able to provide value to the event and you just gave back

Oh and whilst I am at it, when you attend an event

Say Thank You to the volunteers. You just gave back.

Speaking and Blogging

All the events named above need speakers. The bigger events like the SQL Saturdays and the large events like SQL Bits will generally have more established speakers but every user group will need speakers and they will be more likely to accept new speakers and will be very supportive if you say that you are a new speaker. Every speaker had to make their first presentation at some point and they all know how it feels and can provide guidance and advice. You will feel that you don’t have anything to speak about that others will want to hear about. You do. Your experience and your knowledge or how you solved something or created something will be of interest to people. Of course, you need to check with the user group leaders and members if your idea for a presentation is suitable. Like anything you do that is new, researching it and taking advice from people with more experience is always useful. Maybe you can start with a lightning talk. Give it a go. You just gave back.

What do you get back from Speaking and Blogging?

I’ll tell you a secret. The vast majority of my posts ( This is an exception) are written for the benefit of one person. Me.

I write posts to record my learning for myself. To document my knowledge. I use my posts to help me to do my job. I know that I wrote the method of doing something somewhere and this is part of my store of knowledge. I write posts to answer peoples questions so that I have a place to point them to. Occasionally people will email me asking a question and if it requires a detailed response I will write a post and email them to tell them that this is the answer. I often point my work colleagues at my blog when they ask me questions about Azure or Powershell. You could also see your blog as an extension of your CV and use it when job hunting and develop it in that way

I also write posts to expand my knowledge, and this is the same for speaking. When I am writing a blog post or a presentation I will read and watch videos and ensure I know more about it. The process of creating that content will improve my own knowledge and work practices, and you will find that, as you write your blog posts, you will have a deeper knowledge also. When you give your presentations you will learn as you answer questions, or find the answer to the question afterwards (it's OK to do that), and you will be improving yourself and your knowledge. You will also be giving back.

Putting your information online will enable people to find it. Sure you can worry about SEO and getting to the top of search pages but you know that sometimes the answer is on the ninth page. What you write will be of benefit to others and by taking the time to post you will be giving back to the community

You can do one, many or all of those and you will be giving back. I hope you do

I will be giving back. You will find me at SQL Bits where I shall be room monitoring and volunteering.

You will find me at SQL Saturday Exeter. I am again one of the fabulous SQL South West team

who are again organising a SQL Saturday in Exeter in the UK on April 24th/25th 2015. You can find out more here http://sqlsouthwest.co.uk/sql-saturday-372/ 

You still have time, if you are quick, to submit a session to speak or present a pre-con at Exeter. Submissions close on 15th December and we would love to have yours

I shall carry on blogging and hopefully present at some user groups again this year. If you see me anywhere, come up and say hi to me. You just gave back