Setting Up and Using Azure VM SQL Automated Backup (and Restore)

This weekend I was creating some Azure VMs for testing and needed to use the GUI for some screenshots. I have always used my PowerShell scripts described here to create my test systems and, with a new job taking up a lot of my time, had missed the announcements about Azure SQL Automated Backup and Azure SQL Automated Patching, so I was surprised to see the new options on the VM creation screen.

I read the announcement and also the details on MSDN at https://msdn.microsoft.com/en-us/library/azure/dn906091.aspx, which show that this requires the SQL Server IaaS Agent. This is a default option on new virtual machines.

There are some other considerations too. It is only supported for SQL Server 2014 on Windows Server 2012 and 2012 R2 at present, and you can set a retention period to a maximum of 30 days, but it is automated. You do not have to decide upon the backup strategy; Azure will decide the frequency and type of backups depending upon the workload of the database and some other factors, such as:

A full backup is taken
– when an instance is added to use Managed Backup
– when transaction log growth is 1 GB or more
– at least once a week
– if the log chain is broken
– when a database is created

A transaction log backup is taken
– if no log backup is found
– when transaction log space used is 5 MB or larger
– at least once every two hours
– any time the transaction log backup is lagging behind a full database backup (the goal is to keep the log chain ahead of the full backup)

From <https://msdn.microsoft.com/en-gb/library/dn449496(v=sql.120).aspx>
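The conditions above come from the SQL Server Managed Backup documentation, which is what Automated Backup configures under the covers. Managed Backup exposes its configuration through msdb, so you can see what the automation has decided for a database with a quick query (a sketch; 'Test' is a hypothetical database name):

-- Show the Managed Backup configuration for a database (SQL Server 2014)
SELECT * FROM msdb.smart_admin.fn_backup_db_config('Test');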

There are some restrictions:
– Only database backups are supported
– System databases are not supported, so you need to back those up yourself
– You can only back up to Azure storage
– Maximum backup size is 1 TB, as this is the maximum size for a blob in Azure storage
– The simple recovery model is not supported
– Maximum retention is 30 days; if you are required to keep your backups for longer than 30 days for regulatory or other reasons, you could simply use Azure Automation to copy the files to another storage account in Azure (a sketch follows this list)
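A minimal sketch of that copy using the classic storage cmdlets, which you could drop into an Azure Automation runbook (the account and container names here are hypothetical, and the copies run as asynchronous server-side operations):

# Build contexts for the source (backup) and destination (archive) storage accounts
$sourcekey = (Get-AzureStorageKey -StorageAccountName 'backupaccount').Primary
$sourcecontext = New-AzureStorageContext -StorageAccountName 'backupaccount' -StorageAccountKey $sourcekey
$archivekey = (Get-AzureStorageKey -StorageAccountName 'archiveaccount').Primary
$archivecontext = New-AzureStorageContext -StorageAccountName 'archiveaccount' -StorageAccountKey $archivekey

# Start a server-side copy of every blob in the backup container to the archive container
Get-AzureStorageBlob -Container 'backups' -Context $sourcecontext |
ForEach-Object {
    Start-AzureStorageBlobCopy -SrcContainer 'backups' -SrcBlob $_.Name -Context $sourcecontext `
        -DestContainer 'archive' -DestBlob $_.Name -DestContext $archivecontext
}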

How to set it up.

If you are using the GUI then you will find SQL Automated Backup in the optional configuration blade of the setup. You can follow the steps here to set it up. If (like me) you want to use PowerShell, then use the following code after you have created your virtual machine:

# Build a storage context for the account that will hold the backups
$storageaccount = "<storageaccountname>"
$storageaccountkey = (Get-AzureStorageKey -StorageAccountName $storageaccount).Primary
$storagecontext = New-AzureStorageContext -StorageAccountName $storageaccount -StorageAccountKey $storageaccountkey

# Prompt for the password that will encrypt the backups (the user name is ignored)
$encryptionpassword = (Get-Credential -Message 'Backup Encryption Password' -UserName 'IGNOREUSER').Password

# Build the auto-backup configuration and apply it to the VM via the SQL Server IaaS extension
$autobackupconfig = New-AzureVMSqlServerAutoBackupConfig -StorageContext $storagecontext -Enable -RetentionPeriod 10 -EnableEncryption -CertificatePassword $encryptionpassword
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoBackupSettings $autobackupconfig | Update-AzureVM
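
Before moving on you can confirm that the settings were applied by querying the extension (a quick check; AutoBackupSettings is the property I have seen the classic cmdlets return, so treat the name as an assumption):

# Inspect the SQL Server IaaS extension settings on the VM
$vm = Get-AzureVM -ServiceName <vmservicename> -Name <vmname>
($vm | Get-AzureVMSqlServerExtension).AutoBackupSettings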

Once you have run the code, Azure will take care of the rest. Add a couple of databases to your instance, then look in the storage account and you will see the backups appear.

And in the automaticbackup container you will find the certificate and master key backups.

It will also create a credential

You can use the same credential to back up your system databases. If, like me, you use Ola Hallengren's excellent Maintenance Solution, then simply change your system databases backup job as follows:


USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id = 1,
    @command = N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = ''SYSTEM_DATABASES'', @URL = ''https://myaccount.blob.core.windows.net/mycontainer'', @Credential = ''AutoBackup_Credential'', @BackupType = ''FULL'', @Verify = ''Y'', @CleanupTime = NULL, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'
GO
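
To check the updated command you can read the job step back with sp_help_jobstep, a documented msdb procedure:

EXEC msdb.dbo.sp_help_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id = 1
GO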

If you need to restore your database then you can use the GUI. When you choose restore, you will see a screen for connecting to your backups in Azure storage.

Enter your storage account and the key, which you can get from the Azure portal. You will notice that the credential has already been selected; click Connect.

There are all of your backups, ready to restore to any point in time that you choose. By clicking Script, the T-SQL is generated, which looks like this:


USE [master]
BACKUP LOG [Test] TO  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_LogBackup_2015-07-16_06-21-26.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,
NOFORMAT, NOINIT,  NAME = N'Test_LogBackup_2015-07-16_06-21-26',
NOSKIP, NOREWIND, NOUNLOAD,  NORECOVERY ,  STATS = 5
RESTORE DATABASE [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714201240+00.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714202740+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714224241+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715005741+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715031242+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715052742+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715074243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715095743+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715121243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150716060004+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NOUNLOAD,  STATS = 5
GO

There is an important note. Remember this when you have just set it up, so that you don't think you have done it wrong (which is what I did!):

When you enable Automated Patching for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the portal will not show that Automated Patching is configured. Wait several minutes for the agent to be installed and configured. After that the portal will reflect the new settings.

From <https://msdn.microsoft.com/en-us/library/azure/dn961166.aspx>

And also look out for this

The password I had chosen was not complex enough, but the PowerShell script had succeeded and not given me the warning.

To set up SQL Automated Patching you follow similar steps. The setting is again on the OS Config blade: click Enable and then you can choose the frequency and duration of the patching.

It is important to choose your maintenance window correctly. If you have set up your SQL VMs properly, you will have them in an availability set, using either mirroring or Availability Groups, to ensure availability during the underlying host patching. However, I had it confirmed by Principal Software Engineering Manager Sethu Srinivasan, via Microsoft PFE Arvind Shyamsundar, that SQL Automated Patching is not HA-aware, so you will need to set the maintenance windows on each VM to ensure that they do not overlap.
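
If you would rather configure the patching from PowerShell as well, a minimal sketch using the matching cmdlet from the same module (the day and window values are only examples):

# Enable Automated Patching with a Sunday 02:00 window of 120 minutes
$autopatchingconfig = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek 'Sunday' -MaintenanceWindowStartingHour 2 -MaintenanceWindowDuration 120
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $autopatchingconfig | Update-AzureVM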

Scheduling Ola Hallengren's Maintenance Solution Default Jobs with PowerShell

If you are a SQL Server DBA you should know about Ola Hallengren and will probably have investigated his Maintenance Solution.

If you haven't, please start here: https://ola.hallengren.com/

You can also watch his presentation at SQLBits at this link

http://sqlbits.com/Sessions/Event9/Inside_Ola_Hallengrens_Maintenance_Solution

where he talks about and demonstrates the solution.

It is possible to just run his script to install the solution, schedule the jobs, and know that you have made a good start in keeping your databases safe. You should be more proactive than that and set up specific jobs for your own particular requirements, but you can and should find that information in other places, including the FAQ on Ola's site.

I particularly like the parameter @ChangeBackupType, which, when a transaction log or differential backup cannot be taken, changes the backup type to full. This is excellent for picking up new databases and backing them up soon after creation.

When you run the script the jobs are created but not scheduled, and it is for this reason that I created this function. All it does is schedule the jobs so that I know they will be run when a new server is created and all the databases will be backed up. I can then go back at a later date and schedule them correctly for the server's workload, or tweak them according to specific needs, but this gives me that fuzzy feeling of knowing that the backups and other maintenance will be performed.

To accomplish this I pass a single parameter, $Server, to the function. This is the connection string and should be in the format SERVERNAME, SERVERNAME\INSTANCENAME or SERVERNAME\INSTANCENAME,Port.

I then create a $srv SMO object as usual

# Requires the SMO assemblies to be loaded first (for example via the sqlps module)
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server

Create a JobServer object and a Jobs array which holds the Jobs

$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs

And then set the schedule for each job. I pick each job using the Where-Object cmdlet and break out if the job does not exist:

$Job = $Jobs | Where-Object {$_.Name -eq 'DatabaseBackup - SYSTEM_DATABASES - FULL'}
if ($Job -eq $null)
{
    Write-Output "No Job with that name"
    break
}

Then I create a Schedule object, set its properties and create the schedule:

$Schedule = new-object Microsoft.SqlServer.Management.Smo.Agent.JobSchedule ($job, 'Daily - Midnight ++ Not Sunday')
$Schedule.ActiveEndDate = Get-Date -Month 12 -Day 31 -Year 9999
$Schedule.ActiveEndTimeOfDay = '23:59:59'
$Schedule.FrequencyTypes = "Weekly"
$Schedule.FrequencyRecurrenceFactor = 1
$Schedule.FrequencySubDayTypes = "Once"
$Schedule.FrequencyInterval = 126 # Weekdays 62 + Saturdays 64 - https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.agent.jobschedule.frequencyinterval.aspx
$Schedule.ActiveStartDate = get-date
$schedule.ActiveStartTimeOfDay = '00:16:00'
$Schedule.IsEnabled = $true
$Schedule.Create()

I have picked this example for the blog as it shows some of the less obvious gotchas. Setting the active end date could only be achieved by using the Get-Date cmdlet and defining the date. The schedule frequency interval above is for every day except Sunday. This is achieved by using the following table from MSDN, which is always my first port of call when writing these scripts:

WeekDays.Sunday = 1
WeekDays.Monday = 2
WeekDays.Tuesday = 4
WeekDays.Wednesday = 8
WeekDays.Thursday = 16
WeekDays.Friday = 32
WeekDays.Saturday = 64
WeekDays.WeekDays = 62
WeekDays.WeekEnds = 65
WeekDays.EveryDay = 127

Combine values using an OR logical operator to set more than a single day. For example, combine WeekDays.Monday and WeekDays.Friday (FrequencyInterval = 2 + 32 = 34) to schedule an activity for Monday and Friday, as in the sketch below.
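
In PowerShell you can make that intent explicit with the -bor operator rather than hard-coding the sum, as in this small sketch against the $Schedule object above:

# Monday (2) OR Friday (32) = 34
$Schedule.FrequencyInterval = 2 -bor 32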

It is easy using this to set up whichever schedule you wish by combining the numbers. I would advise commenting it in the script so that your future self or following DBAs can understand what is happening.

You can tweak this script or use the code to work with any Agent jobs and set the schedules accordingly, and you can check that you have set the schedules correctly with this code:

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$JobServer = $srv.JobServer
$Jobs = $JobServer.Jobs
$ActiveStartTimeOfDay = @{Name = "ActiveStartTimeOfDay"; Expression = {$_.JobSchedules.ActiveStartTimeOfDay}}
$FrequencyInterval = @{Name = "FrequencyInterval"; Expression = {$_.JobSchedules.FrequencyInterval}}
$FrequencyTypes = @{Name = "FrequencyTypes"; Expression = {$_.JobSchedules.FrequencyTypes}}
$IsEnabled = @{Name = "IsEnabled"; Expression = {$_.JobSchedules.IsEnabled}}
$Jobs | Where-Object {$_.Category -eq 'Database Maintenance'} | Select-Object Name, $IsEnabled, $FrequencyTypes, $FrequencyInterval, $ActiveStartTimeOfDay | Format-Table -AutoSize

You can get the script from the Script Center via the link below, or by searching for "Ola" using the Script Browser add-in straight from ISE.

https://gallery.technet.microsoft.com/scriptcenter/Schedule-Ola-Hallengrens-a66a3c89

Spinach and Database Development - SQLSatExeter Keynote

Last weekend, we held our SQL Saturday event in Exeter. It was a brilliant event for many reasons, but we were delighted to have a world-exclusive keynote video by Phil Factor about spinach and database development. With many thanks to those that made it possible, and particularly to Phil Factor, I have linked to the video here and also transcribed it. Please watch and read and understand the message.

What has spinach got to do with Database Development?

Generations of children were fed spinach in preference to more nutritious things, such as cardboard, because of the persistence of bad data.

It wasn't in fact the decimal point error of legend, but confusion over the way that iron was measured in late 19th-century data. As a result, nutritionists persisted in believing for generations that it was a rich source of the iron that the body needs in order to create blood cells. In fact, the very little iron there is in spinach isn't in a form that can be readily absorbed by the body anyway.

The consequences of bad data can be dire

Guarding the quality of your data is about the most important thing that you as a data professional can do. You may think that performance is important, but that would just deliver you the wrong answer faster. Resilience? It would just make it more likely that you'd be able to deliver the wrong answer. Delivery? Yep, you got it: the wrong answer quicker.

The spinach example is a good one because bad data is hard to detect and can go unnoticed for generations. This is probably because people don't inspect and challenge data as much as they should. You would have thought it strange that a vegetable like spinach should have fifty times as much iron as any other vegetable, but the fact came from a very reputable source, so people just shrugged and accepted it.

We have a touching faith in data.

We, as a culture, assume it's correct and complete. We like to believe that it's impossible that either prejudice, bias, criminality or foolishness could affect the result; worse, we think that valuable truth can be sifted from any data no matter the source. If there's enough of it then there must be value in it. It's like panning for gold dust from a river. The sad truth is that this is a delusion, but a very common one in our society. We are, in our mass culture, in the bronze age rather than the information age, struggling with silvery toys imbued with mystical magical powers.

A good database professional must be unequivocal.

Bad data cannot be cleaned in the same way that one can clean mud off a diamond. If a dataset contains bad data then the entire dataset must be rejected.

There's no such thing as data cleansing.

You as a DBA may be asked to take out data that seems absurd, such as ages that are negative or ages so great that the person couldn't possibly be alive, but that still leaves you, in the same dataset, with data that is plausible but wrong.

Only in very exceptional circumstances when you know precisely why a minority of your data is wrong would you be justified in correcting it.

Statistics can help us to make very confident assertions about large datasets if they conform to one of the common distributions, but they cannot tell us anything about individual items of data. You can of course remove outliers, but in fact outliers are just items of data that don't conform to your assumptions about the data, and the whole point of data analysis is to test your assumptions. By cleaning data, by removing outliers, you can prove almost anything scientifically.

A well designed database is defended in depth at every possible opportunity.

Depth is actually an interesting analogy, because experience tells us that bad data seems to leak in under pressure, through every crack, when the database is working hard. Like the crew you will see in a World War 2 submarine movie, in a well-used OLTP database we swivel our eyes in terror, savouring the futility of any remediation as ghastly drips run down the walls of our database, and wish we had put in more constraints.

In terms of the defence of data, check constraints and foreign key constraints are excellent of course, and triggers are good, but there are other ways of getting warnings of errors in data, such as sudden changes in the distribution of data and other anomalies. One check I like to do is the tourism check, where you check your data all the way through back to source. This technique once famously picked up the fact that a famous motor manufacturer was reporting its deceleration figures in yards per second when it should have been metres per second.

When you start putting in check constraints you say to yourself, this couldn't possibly happen. This is the voice of superstition. A famous programmer of the 1970s took to putting a message in his code saying "this error could never happen", and he put it in places where it couldn't possibly ever be executed. The funny thing was, the more he tested the program, the more that error appeared on the screen, and it is the same with constraints: the more traps you set, the more critters you catch, and you're left wondering how on earth all that bad data was getting in.

It's misleading to go on about the value of the great flood of big data. There's a strong superstition that data has some sort of intrinsic mystical value all of its own.

Unless you can prove that data is correct it's valueless, because if you trust it you can end up with generations of children compelled to eat spinach for no good reason at all.

Instances and Ports with PowerShell

Just a quick post, and a day late for #SQLNewBlogger. There are some excellent posts on that hashtag and I recommend that you read them.

When you know a server name but not the names of the instances or the ports that they are using, this function will be of use:


<#
.SYNOPSIS
Shows the Instances and the Port Numbers on a SQL Server

.DESCRIPTION
This function will show the Instances and the Port Numbers on a SQL Server using WMI

.PARAMETER Server
The Server Name

.EXAMPLE
Get-SQLInstancesPort Fade2Black

This will display the instances and the port numbers on the server Fade2Black
.NOTES
AUTHOR: Rob Sewell sqldbawithabeard.com
DATE: 22/04/2015
#>
function Get-SQLInstancesPort
{
    param ([string]$Server)

    # Load the SMO and SQL WMI management assemblies
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | Out-Null

    $mc = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer $Server
    $Instances = $mc.ServerInstances

    # Calculated properties for the TCP port and the host (parent) name
    $Port = @{Name = "Port"; Expression = {$_.ServerProtocols['Tcp'].IPAddresses['IPAll'].IPAddressProperties['TcpPort'].Value}}
    $Parent = @{Name = "Parent"; Expression = {$_.Parent.Name}}

    foreach ($Instance in $Instances)
    {
        $Instance | Select-Object $Parent, Name, $Port
    }
}
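
If you have a number of servers to check, you can simply loop over them (the server names here are hypothetical):

'Fade2Black','RideTheLightning' | ForEach-Object { Get-SQLInstancesPort $_ }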

Checking SQL Server User Role Membership with PowerShell

SQLDBAwithTheBeard:

Please go and check the new SQL bloggers posting here: https://twitter.com/search?q=%23sqlnewblogger There are some brilliant new and older bloggers adding great value to the SQL community.

This is my most viewed post, so I thought it made a good candidate to be updated and reblogged.

Originally posted on SQL DBA with A Beard:

As you know, I love PowerShell!

I use it all the time in my daily job as a SQL DBA and at home whilst learning as well.

Not only do I use PowerShell for automating tasks such as Daily Backup Checks, Drive Space Checks, Service Running Checks, File Space Checks, Failed Agent Job Checks, SQL Error Log Checks, DBCC Checks and more but also for those questions which come up daily and interfere with concentrating on a complex or time consuming task.

I have developed a series of functions over time which save me time and effort whilst still enabling me to provide a good service to my customers. I keep them all in a functions folder and call them whenever I need them. I also have a very simple GUI which I have set up for my colleagues to enable them to easily answer simple questions quickly and easily…

PowerShelling SQL Saturday Sessions to the Guidebook app

Following on from my previous post about parsing XML, where I used the information from Steve Jones' blog post to get information from the SQL Saturday website, I thought that this information and script might be useful for others performing the same task.

Edit – This post was written prior to the updates to the SQL Saturday website over the weekend. When it came back up the script worked perfectly, but the website is unavailable at the moment again, so I will check and update as needed once it is back.

We are looking at using the Guidebook app to provide an app for our attendees with all the session details for SQL Saturday Exeter.

The Guidebook admin website requires the data for the sessions in a certain format. You can choose CSV or XLS.

In the admin portal you can download the template, which gives you an Excel file to fill in.

So now all we need to do is to fill it with data.

I have an Excel Object snippet which I use to create new Excel objects when using PowerShell to manipulate Excel. Here it is for you. Once you have run the code, you will be able to press CTRL + J and choose the New Excel Object snippet at any time.


$snippet = @{
Title = "New Excel Object";
Description = "Creates a New Excel Object";
Text = @"
# Create a .com object for Excel
`$xl = new-object -comobject excel.application
`$xl.Visible = `$true # Set this to False when you run in production
`$wb = `$xl.Workbooks.Add() # Add a workbook

`$ws = `$wb.Worksheets.Item(1) # Add a worksheet

`$cells=`$ws.Cells
<#
Do Some Stuff

perhaps

`$cells.item(`$row,`$col)="Server"
`$cells.item(`$row,`$col).font.size=16
`$Cells.item(`$row,`$col).Columnwidth = 10
`$col++
#>

`$wb.Saveas("C:\temp\Test`$filename.xlsx")
`$xl.quit()
"@
}
New-IseSnippet @snippet
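
Once created, the snippet is stored in your ISE snippets folder; you can list the snippets you have created with Get-IseSnippet:

# Show the user-created ISE snippets
Get-IseSnippet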

I needed to change the snippet code to open the existing template file by using:

$wb = $xl.Workbooks.Open($GuideBookPath)

In the More Help tab of the Excel workbook it says:

2. Make sure that your dates are in the following format: MM/DD/YYYY (i.e. 4/21/2011). If the dates are in any other format, such as “April 21, 2011” or “3-Mar-2012”, Gears will not be able to import the data and you will receive an error message.
3. Make sure that your times are in the following format: HH:MM AM/PM (i.e. 2:30 PM, or 11:15 AM). If the times are in any other format, such as “3:00 p.m.” or “3:00:00 PM”, Gears will not be able to import the data and you will receive an error message.

So we need to do some manipulation of the data we gather. As before, I selected the information from the XML as follows:

$Speaker = @{Name="Speaker"; Expression = {$_.speakers.speaker.name}}
$Room = @{Name="Room"; Expression = {$_.location.name}}
$startTime = @{Name="StartTime"; Expression = {[datetime]($_.StartTime)}}
$Endtime = @{Name ="EndTime"; Expression = {[datetime]($_.EndTime)}}
$Talks = $Sessions.event|Where-Object {$_.title -ne 'Coffee Break' -and $_.title -ne 'Room Change' -and $_.title -ne 'Lunch Break' -and $_.title -ne 'Raffle and Cream Tea'}| select $Speaker,$Room,$Starttime,$Endtime,Title,Description |Sort-Object StartTime

I then looped through the $Talks array and wrote each line to Excel like this


foreach ($Talk in $Talks)
{
$Date = $Talk.StartTime.ToString('MM/dd/yyyy') ## to put the info in the right format
$Start = $talk.StartTime.ToString('hh:mm tt') ## to put the info in the right format
$End = $Talk.Endtime.ToString('hh:mm tt') ## to put the info in the right format
$Title = $Talk.Title
$Description = $Talk.Description
$Room = $Talk.Room
$col = 2
$cells.item($row,$col) = $Title
$col ++
$cells.item($row,$col) = $Date
$col ++
$cells.item($row,$col) = $Start
$col ++
$cells.item($row,$col) = $End
$col ++
$cells.item($row,$col) = $Room
$col ++
$col ++ # incremented twice to leave one template column blank
$cells.item($row,$col) = $Description
$row++
}

I know that I converted the string to a DateTime and then back to a string again, but that was the easiest (quickest) way to obtain the correct format for the Excel file.

Then, to finish, save the file and quit Excel:

$wb.Save()
$xl.quit()
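
One optional extra: quitting Excel this way can leave the Excel.exe process running because PowerShell still holds COM references. A small sketch of the usual clean-up (standard .NET interop calls, not part of the original script):

# Release the COM references so the Excel.exe process can exit
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($wb) | Out-Null
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($xl) | Out-Null
[GC]::Collect()
[GC]::WaitForPendingFinalizers()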

Then you upload the file in the Guidebook admin area, wait for the email confirmation, and all of your sessions are available in the Guidebook app.

I hope that is useful to others. The full script is below.

## From http://www.sqlservercentral.com/blogs/steve_jones/2015/01/26/downloading-sql-saturday-data/

$i = 372
$baseURL = "http://www.sqlsaturday.com/eventxml.aspx?sat="
$DestinationFile = "E:\SQLSatData\SQLSat" + $i + ".xml"
$GuideBookPath = 'C:\temp\Guidebook_Schedule_Template.xls'
$sourceURL = $baseURL + $i

$doc = New-Object System.Xml.XmlDocument
$doc.Load($sourceURL)
$doc.Save($DestinationFile)

$Sessions = $doc.GuidebookXML.events
$Speaker = @{Name="Speaker"; Expression = {$_.speakers.speaker.name}}
$Room = @{Name="Room"; Expression = {$_.location.name}}
$startTime = @{Name="StartTime"; Expression = {[datetime]($_.StartTime)}}
$Endtime = @{Name ="EndTime"; Expression = {[datetime]($_.EndTime)}}

$Talks = $Sessions.event|Where-Object {$_.title -ne 'Coffee Break' -and $_.title -ne 'Room Change' -and $_.title -ne 'Lunch Break' -and $_.title -ne 'Raffle and Cream Tea'}| select $Speaker,$Room,$Starttime,$Endtime,Title,Description |Sort-Object StartTime

# Create a .com object for Excel
$xl = new-object -comobject excel.application
$xl.Visible = $true # Set this to False when you run in production
$wb = $xl.Workbooks.Open($GuideBookPath)
$ws = $wb.Worksheets.item(1)

$cells=$ws.Cells

$cells.item(2,1) = '' # To clear that entry
$cells.item(3,1) = '' # To clear that entry

$col = 2
$row = 2

foreach ($Talk in $Talks)
{
$Date = $Talk.StartTime.ToString('MM/dd/yyyy') ## to put the info in the right format
$Start = $talk.StartTime.ToString('hh:mm tt') ## to put the info in the right format
$End = $Talk.Endtime.ToString('hh:mm tt') ## to put the info in the right format
$Title = $Talk.Title
$Description = $Talk.Description
$Room = $Talk.Room
$col = 2
$cells.item($row,$col) = $Title
$col ++
$cells.item($row,$col) = $Date
$col ++
$cells.item($row,$col) = $Start
$col ++
$cells.item($row,$col) = $End
$col ++
$cells.item($row,$col) = $Room
$col ++
$col ++ # incremented twice to leave one template column blank
$cells.item($row,$col) = $Description
$row++
}

$wb.Save()
$xl.quit()