Adding a T-SQL Job Step to a SQL Agent Job with PowerShell

In my last post, I explained how to alter an existing job step across many servers. I also had cause to add a T-SQL job step to a large number of jobs. This is how I did it.
As before, I gathered the required jobs using the Get-SQLAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the .Where() method.
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS, or a text file.)
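For example, a couple of ways you might fill $Servers (the file path, server and table names below are only placeholders for illustration):

# From a text file with one instance name per line
$Servers = Get-Content -Path 'C:\temp\sqlservers.txt'

# Or from an inventory database using Invoke-Sqlcmd from the sqlserver module
$Servers = (Invoke-Sqlcmd -ServerInstance 'InventoryServer' -Database 'Inventory' -Query 'SELECT InstanceName FROM dbo.Instances').InstanceName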

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Then I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property

[screenshot: job step properties]

We need to set the name, the parent (the job), the database, the command, the subsystem, the on-fail action, the on-success action and the ID for the job step.
I set the command in a variable to make the code easier to read.

$Command = "SELECT Name from sys.databases"

The rest of the properties I fill in inside the loop. To find out what the properties can hold, I look at the MSDN page for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step, starting at 1, so this example will add a new job step that will be the first to run.

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties, all we need to do is create it and alter the job.

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
$Command = "SELECT Name from sys.databases"
foreach($Job in $Jobs)
{
    $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
    $NewStep.Name = 'A descriptive name for the job step'
    $NewStep.Parent = $Job
    $NewStep.DatabaseName = 'master'
    $NewStep.Command = $Command
    $NewStep.SubSystem = 'TransactSql'
    $NewStep.OnFailAction = 'QuitWithFailure'
    $NewStep.OnSuccessAction = 'GoToNextStep'
    $NewStep.ID = 1
    $NewStep.Create()
    $Job.Alter()
}
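Once it has run, a quick way to check that the new step landed everywhere might be something like this (a sketch, reusing the same filter; the Parent.Parent property chain walks from the job back to the server):

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
$Jobs | ForEach-Object {
    [PSCustomObject]@{
        Server    = $_.Parent.Parent.Name
        Job       = $_.Name
        FirstStep = ($_.JobSteps | Select-Object -First 1).Name
    }
}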

Hopefully this will help you if you need to add a T-SQL job step to a large number of servers.
Happy Automating


Altering a Job Step on Hundreds of SQL Servers with PowerShell

I flew to Utrecht last week to present with Chrissy LeMaire and Sander Stad to the joint Dutch SQL and PowerShell User Groups. Whilst I was sat at the airport, I got a phone call from my current client. “We need to change the backup path for all of the servers to a different share, how long will it take you?”

About 5 minutes  (PowerShell is very powerful – be careful when following these examples 😉 )

We will use the sqlserver module, so you will need to have installed the latest version of SSMS from https://sqlps.io/dl

This code was run using PowerShell version 5 and will not work on PowerShell version 3 or lower as it uses the .Where() method.

Let's grab all of our jobs on the estate. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS, or a text file.)

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Once we have the jobs, we need to iterate through only the ones we need. This step could also have been done in the line above. Let's assume we are using the Ola Hallengren Solution to back up our estate.

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})

Then, because I have to target a specific job step, I can iterate through those and filter in the same way.

foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})

Now all I need to do is replace C:\Backup with C:\MSSQL\Backup (in this example I am using my lab's backup paths).

$Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")

And then call the Alter method

$Step.Alter()

And that is all there is to it. Here is the full script I used

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})
{
    foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})
    {
        $Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")
        $Step.Alter()
    }
}
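One thing worth knowing: the .NET String.Replace() method used here is case-sensitive, so the command only changes when the text matches exactly. If the path casing varies across your estate, a sketch using PowerShell's case-insensitive -replace operator instead might look like this (the regex special characters in the old path are escaped first):

$OldPath = [regex]::Escape("Directory = N'C:\Backup'")
$NewPath = "Directory = N'C:\MSSQL\Backup'"
$Step.Command = $Step.Command -replace $OldPath, $NewPath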

In only a few minutes I had altered several hundred instances' worth of Ola Hallengren jobs 🙂

This is one of the many reasons I love PowerShell: it enables me to perform mass changes very quickly and easily. Of course, you need to make sure that what you are changing is what you want to change. I have caused severe issues by altering the SQL alert frequency to 1 second instead of one hour across an estate!! Although the beauty of PowerShell meant that I was able to change it back very quickly once the problem was realised.
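Because it is so easy to change hundreds of servers at once, it can be worth doing a read-only pass first to see exactly what would change, and only calling Alter() once you are happy. A minimal sketch of that idea (not from the original script):

Foreach($Job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.IsEnabled -eq $true})
{
    foreach ($Step in $Job.JobSteps.Where{$_.Name -like '*DatabaseBackup*'})
    {
        # Output what would change without calling Alter()
        [PSCustomObject]@{
            Server     = $Job.Parent.Parent.Name
            Job        = $Job.Name
            Step       = $Step.Name
            NewCommand = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")
        }
    }
}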
You can change a lot of settings. Take a look at what is available at the job step level:

[screenshot: job step properties]
Happy Automating

Using the PowerShell SQL Provider with SQL Authentication

Whilst having a conversation with Chrissy LeMaire last week about using the SQL Provider, she asked if it could use SQL Authentication. I had no idea, but said I would find out. This is how and what I did.

I am going to demonstrate this in Visual Studio Code with the PowerShell extension installed, but I have also tested this in PowerShell version 5.1.14393.576 on Windows 10 and PowerShell version 5.0.10586.117 on Windows 8.1, and I believe it will work across other versions.

First we need to import the sqlserver module

Import-Module sqlserver

Then if we run Get-PSDrive we can see our SQL Server PS Drive

[screenshot: Get-PSDrive output showing the SQLSERVER: drive]

A quick dir and we can see what is in that drive

[screenshot: dir of the SQLSERVER: drive]

We can navigate this like it is a file system. For example, below I have changed to the ‘SQL directory’, run dir and found the machine name ROB-SURFACEBOOK, then changed to that ‘directory’ and run dir to find the instances, navigated to the default instance and run dir to see what is available.

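Roughly, the commands were along these lines (a sketch; ROB-SURFACEBOOK is my machine name, so substitute your own):

cd SQLSERVER:\SQL
dir
cd ROB-SURFACEBOOK
dir
cd DEFAULT
dir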

This would enable me to do this to get the database names for example

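Something like this (a sketch, assuming the default instance on my machine):

dir SQLSERVER:\SQL\ROB-SURFACEBOOK\DEFAULT\Databases | Select-Object Name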

You can connect to remote SQL instances in the same manner by running

CD SQLSERVER:\SQL\SERVERNAME

However I don’t have permission using this account

[screenshot: failed to connect error]

We need to create a new PS Drive so first, as always, start with Get-Help

[screenshot: Get-Help New-PSDrive output]

So we can create a new PS Drive with  a credential

Parameters
-Credential Specifies a user account that has permission to perform this action. The default is the current user.

Does it accept SQL Server?

Notes
New-PSDrive is designed to work with the data exposed by any provider. To list the providers available in your session, use Get-PSProvider.

[screenshot: Get-PSProvider output showing the SqlServer provider]

Yes it does 🙂

So we can add a new PSDrive using

New-PSDrive -Name SQL2016N1-SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1 `
    -Description 'This is the SQL2016N1 SQL PSDrive using the SQL Authenticated login DemoLogin' `
    -Credential (Get-Credential)


Which will prompt us for a credential. Once we enter the credential we will have our SQL Server drive for that instance

[screenshot: access denied error]

Except we won't!! This is because the account that VS Code (or PowerShell) is running as does not have permissions on the SQL Server. So I need to run VS Code (or PowerShell) as a user with those permissions, by right-clicking on the icon whilst holding Shift down and clicking Run as different user. Obviously you won't need to do this if the currently logged-on user has permission on the remote SQL Server.


and then enter the credentials of the user


Now that we are running as the correct user

[screenshot: whoami confirming the current user]
We repeat the code from above and we will have our PS Drive


Nope 🙂 This time we get the following error

New-PSDrive : SQL Server PowerShell provider error: The root path for a drive with credentials needs to include the instance name.

which makes sense if you think about it, as the credentials will only be valid at the instance level

So we will add the instance name to the command. As this is the default instance we add DEFAULT

New-PSDrive -Name SQL2016N1-SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1\DEFAULT `
    -Description 'This is the SQL2016N1 SQL PSDrive using the SQL Authenticated login DemoLogin' `
    -Credential (Get-Credential)


Success!! So we will be able to see it in our list of PS Drives using Get-PSDrive

[screenshot: Get-PSDrive showing the new SQL Authentication drive]

and we can see further details like this

[screenshot: details of the new PSDrive]

and we can navigate to it using cd NAMEOFDRIVE

[screenshot: navigating the new drive]

and perform our tasks

[screenshot: listing databases and logins through the new drive]

If you take a look at the SQL Server you will see that all that is being run is T-SQL

[screenshot: the T-SQL being run against the instance]

Unfortunately you cannot use the -Persist parameter with the SqlServer provider to persist the mapping across sessions.

This post has shown you how to set up SQL Server PS Drives using SQL Authentication. The important points are

  • Use Get-Help New-PSDrive -ShowWindow for all the help and examples
  • You must be running PowerShell or VS Code as a user with Windows Permissions on the SQL Server
  • You must specify the instance name
  • Use "New-PSDrive -Name PSDRIVENAME -PSProvider SqlServer -Root SQLSERVER:\SQL\SERVER\INSTANCE -Description 'DESCRIPTION' -Credential (Get-Credential)" (see the full sketch below)
  • You can then navigate the remote SQL Server like a file directory from the command line using SQL Authentication
  • It’s just running T-SQL against the instance
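Putting those points together, a minimal end-to-end sketch (the drive name, instance and description here are examples only):

Import-Module sqlserver
New-PSDrive -Name SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1\DEFAULT `
    -Description 'SQL Authenticated drive for SQL2016N1' `
    -Credential (Get-Credential)
cd SQLAuth:
dir .\Databases | Select-Object Name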

A Whole Day of PowerShell and SQL – Join Chrissy LeMaire & I at #sqlsatvienna


SQL Saturday Vienna 2017 is only just around the corner.

On Friday, January 20, 2017, a lot of local and international speakers will gather to deliver sessions relating to SQL Server and all related services. It's a fabulous agenda – take a look. I am sure that it will be a brilliant day and everyone will gain awesome knowledge from all of the renowned speakers.

The day before a SQL Saturday there are pre-cons: full-day training sessions at a very reasonable price that help the organisers to fund the cost of the SQL Saturday. This year, I have the pleasure of doing a pre-con with Chrissy LeMaire on Thursday, January 19, 2017 at the JUFA Hotel in Vienna.


Our session is aimed at DBAs, from junior to senior, who want to embrace the latest methods and increase their toolset, but anyone working with SQL who wants to know more about PowerShell will benefit.

Come and join us for a great day of PowerShell and SQL full of demos and useful solutions that you will be able to take back to your workplace. We will improve your knowledge and skills, show you how the modern DBA will work and have fun at the same time.

You can get a ticket here

Here is a sneak peek at the agenda (don't tell Chrissy I showed you!)

[screenshots: pre-con agenda]

 

Running SQL Queries with Visual Studio Code

Reading this blog post by Shawn Melton, Introduction of Visual Studio Code for DBAs, reminded me that whilst I use Visual Studio Code (which I shall refer to as Code from here on) for writing PowerShell and Markdown, and love how easily it interacts with GitHub, I hadn't tried T-SQL. If you are new to Code (or even if you are not), go and read Shawn's blog post, but here are the steps I took to run T-SQL using Code.

To download Code go to this link https://code.visualstudio.com/download and choose your operating system. Code works on Windows, Linux and Mac


Once you have downloaded and installed it, hit CTRL+SHIFT+P, which will open up the command palette.


Once you start typing the results will filter so type ext and then select Extensions : Install Extension


Which will open the Extensions tab ( You could have achieved the same end result just by clicking this icon)


But then you would not have learned about the command palette 🙂

So, with the extensions tab open, search for mssql and then click install


Once it has installed the button will change to Reload so click it


And you will be prompted to Reload the window


Accept the prompt and then open a new file (CTRL N) and then change the language for the file.

You can do this by pressing CTRL+K and then M (not CTRL+K CTRL+M) or by clicking the language button.


And then choose SQL


This will start a download so make sure you are connected (and allowed to connect to the internet)


Once it has finished it will show this


And offer you the chance to read the release notes


Which you can get for any extension anytime by finding the extension in the extensions tab and clicking on it. This has links to tutorials as well as information about the release


The mssql extension enables IntelliSense for T-SQL when you open a .sql file or when you change the language to SQL as shown above for a new file.


Write your T-SQL query and press CTRL+SHIFT+E, or right-click and choose Execute Query. This will ask you to choose a connection profile (and display any existing profiles).


Choose Create Connection Profile and answer the prompts


The query will then run


You can then output the results to CSV or JSON if you wish.


You can find a video showing this whole process with some typos and an error here

Using SQL with VS Code

2016 – That was a Year :-)

It's the time of year for reflection and I have had the most amazing 2016; I am blessed that I love what I do so much. I thoroughly enjoy writing and talking and sharing and commenting and supporting and cherishing all the SQL and PowerShell things. I wrote about using Power BI to display my check-ins. I only started this in June and this is where I have been 🙂


I learnt about Pester and ended the year incorporating it into dbatools and dbareports. I also started using GitHub. It is quite surprising to me how much time I now spend using both. I also had to start learning DSC for the client I was working with because, as ‘the PowerShell guy’, I was the one who could pick it up the easiest. I learnt things and then forgot them, causing me to find this Pester post via Google later in the year!! (That's a big reason for blogging, by the way.)

Early in the year we organised SQL Saturday Exeter.

Helping to organise a SQL Saturday is a lot of fun, especially when you do it with good friends, but choosing sessions is by far the most challenging part of it for me. I could have chosen at least 60 of these sessions and I know people were disappointed not to have been chosen. I was also the first person many saw at SQL Bits in Liverpool manning the front of house and getting asked the best question ever

The Beard says

When you go to an event –  Say thank you to the organisers and volunteers

and a TERRIBLE thing happened – I broke my DBA Team mug

[photo: my broken DBA Team mug]

Luckily the fine folk at Redgate sorted me out with a replacement from deep in the stores somewhere and gave it to me at SQL Saturday Exeter 🙂 Thank you.

I spoke at the PowerShell Conference Europe and met and made some great friends, which led to me speaking at the PowerShell Monday in Munich and the Dutch PowerShell Usergroup. SQL Saturday Dublin was a blast, it's a wonderful city, Manchester had a whole PowerShell track 🙂 and Cambridge was memorable for the appalling journey as well as the chance to share a stage with Chrissy. PowerShell Conference Asia in the sovereign city-state of Singapore was such a good event and place. Lastly of course was Slovenia with its fantastic Christmas lights and awesome event organisation. I visited some user groups too: Southampton, run by my good friends John Martin and Steph Middleton (congratulations to John on his first MVP award yesterday), Cardiff for the Return of the Battle of the Beards with Terry McCann and Tobiasz Koprowski, where the projector threw its toys out of the pram, and Birmingham in the school hall, which was slightly chilly (there's a joke there for some people).

Amazing things happened

We created https://sqlps.io/vote and https://sqlps.io/ssms and https://sqlps.io/powerbi to enable anyone to influence Microsoft and help to improve the PowerShell SQL experience

and lo and behold there was a new sqlserver module 🙂

I was also invited by Aaron and Chrissy to become an officer for the PASS PowerShell Virtual Chapter. Oh, and we made https://sqlps.io/slack to enable people to talk about all things Data Platform, another addition to my life that I didn't have at the beginning of the year. I spend a lot of time in there in the #dbatools and #dbareports channels and have made some fantastic friends. Chrissy and I created the SQL Community Collaborative GitHub team and added dbatools and dbareports, and even more friendships were born.

And that’s the biggest and bestest thing about this year. Some amazing new friends and spending time with all my other friends. I started writing out a list but was terrified I would have missed someone out, so to all my friends

THANK YOU for a brilliant 2016 and 2017 shall be just as good 🙂

Here are a few of my pics from the year with a lot of my friends

 

Deploying a Windows Data Science Virtual Machine to Azure with PowerShell easily

This weekend (10 December 2016), I went to Slovenia for a SQL Saturday. As always, it was an amazing event, well organised by Mladen Prajdic, Dejan Sarka, and Matija Lah in a fabulous setting amongst fabulous scenery. I highly recommend it, and Ljubljana is a wonderful place to be in December with all the lights and markets.


Whilst I was there I was asked by someone if you could deploy data science virtual machines in Azure with PowerShell. I said I was sure that it could be done and agreed I would write a blog post, so here it is.

According to the Azure documentation

The Data Science Virtual Machine running on a Windows Server 2012 contains popular tools for data exploration, modeling and development activities. The main tools include Microsoft R Server Developer Edition (An enterprise ready scalable R framework) , Anaconda Python distribution, Jupyter notebooks for Python and R, Visual Studio Community Edition with Python, R and node.js tools, Power BI desktop, SQL Server 2016 Developer edition including support In-Database analytics using Microsoft R Server. It also includes open source deep learning tools like Microsoft Cognitive Toolkit (CNTK 2.0) and mxnet; ML algorithms like xgboost, Vowpal Wabbit. The Azure SDK and libraries on the VM allows you to build your applications using various services in the cloud that are part of the Cortana Analytics Suite which includes Azure Machine Learning, Azure data factory, Stream Analytics and SQL Datawarehouse, Hadoop, Data Lake, Spark and more.

I have created a function to wrap around the process to make it easier for non-PowerShell people to do this. There are a series of steps to follow below and you should be able to create a machine in about 10 minutes once you have completed the pre-requisites.

Enable Programmatic Deployment

First, an annoyance. To be able to deploy Data Science virtual machines in Azure programmatically, you first have to log in to the portal and click some buttons.

In the Portal, click New, then Marketplace, and then search for data science. Choose the Windows Data Science Machine and under the blue Create button you will see a link which says “Want to deploy programmatically? Get started”. Clicking this will lead to the following blade.

[screenshot: the programmatic deployment blade]

Click Enable and then Save, and you can then move to PowerShell 🙂

Azure PowerShell Cmdlets

Follow the instructions here to install the Azure PowerShell modules. In the examples you see here I am using Windows 10 and PowerShell version 5.1.14393.479 and I installed the Azure modules using the Install-Module method

Get the script

To install a data science VM, we'll use the New-WindowsDataScienceVM.ps1 script. In this post, I'm using version 1.2, but any version of this script published in the PowerShell Gallery is fine.

To install the New-WindowsDataScienceVM script from the PowerShell gallery, type:

Install-Script New-WindowsDataScienceVM

For more information about using the PowerShellGet cmdlets to install scripts and modules from the PowerShell Gallery, read this page. The PowerShellGet module is included in PowerShell 5.0 and later on Windows 10, but you can install PowerShellGet for PowerShell 3.0 and 4.0. If you cannot connect to the gallery or prefer not to install the module, you can also find the script on GitHub.
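If you want to check which version of the script you have, or update it later, the standard PowerShellGet cmdlets can help (a quick sketch):

# Which version of the script is installed?
Get-InstalledScript -Name New-WindowsDataScienceVM

# Update to the latest version published in the Gallery
Update-Script -Name New-WindowsDataScienceVM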

Login to Azure

You can log in to Azure using the command

Login-AzureRMAccount

which will pop up a prompt for you to log into Azure.


Enable Simple Mode

The New-WindowsDataScienceVM function comes with a Simple switch parameter.

If you use -Simple, the function prompts you only for the admin username and password for the virtual machine. It creates a randomly-named, standard_DS1_v2-sized machine in the ukwest data centre with standard, locally redundant storage in a randomly named Resource Group. All of the required objects have random names, too. If that is not what you want, there is more information at the end of this post. I am considering offering a pop-up to choose location in Simple Mode. Let me know here if that would be something you would like

To create a simple data science VM, run:

New-WindowsDataScienceVM -Simple

Enter Local Admin Password

When you run the function, it prompts for a local admin username and password to log into the virtual machine. The password must have 3 of the following: 1 upper case letter, 1 lower case letter, 1 special character and 1 number. Don't lose it, you will need it.


Grab a cuppa; creating your VM and its resources will take 5 to 10 minutes. (In my testing it reliably took between 7 and 8 minutes.) The screen will look like this

[screenshot: the deployment in progress]

When the script has finished running you will have deployed a set of resources like this

[screenshot: the deployed resources in the portal]

Login to the Virtual Machine

Copy and paste the correct code from the output at the end of the script to launch the RDP session and save the RDP file to your documents folder for later use.

Or you can find the Virtual machine name in the portal or by running

Get-AzureRmVM -ResourceGroupName <ResourceGroup> | Where-Object {$_.Name -like 'DSVM*'}

You can then use the code below to download an RDP file and log into the virtual machine.

Get-AzureRmRemoteDesktopFile -ResourceGroupName <ResourceGroup> -Name <VMName>  -LocalPath C:\WIP\DataScienceVM.rdp -Launch


You will need to log in with the local admin account you set up previously, which means that you will need to click on More choices and then machinename\Username. In this case the machine name is DSVMZIAgd.

You can copy the correct Virtual Machine name and Username from the output at the end of the script.


If you have forgotten your password, you can reset it in the Portal.

Enjoy the Data Science Virtual Machine

You are then logged in and can carry on. Once the Azure PowerShell modules and script are installed, you will be able to have a machine up and running within 10 minutes.


Cleaning Up

To remove the resource group and ALL resources in the resource group, including the data science VM, run:

Remove-AzureRmResourceGroup -Name <ResourceGroup>  -Force

This will remove ALL resources in that resource group, so be careful if you have deployed anything else.
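If you want to double-check exactly what you are about to delete, you could list the resources in the group first; a sketch with the AzureRM cmdlets:

Get-AzureRmResource |
    Where-Object { $_.ResourceGroupName -eq '<ResourceGroup>' } |
    Select-Object Name, ResourceType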

Customising the Deployment

If you want to use different settings for the deployment or want to script the creation of a number of machines, you can run

Get-Help New-WindowsDataScienceVM -Full

and see all the options and further examples. Any questions please feel free to comment