Running Jupyter Notebooks as Agent Jobs

Azure Data Studio is a great tool for connecting to your data platform, whether it is in Azure or on your own hardware. Jupyter Notebooks are fantastic: you can have words, pictures, code and code results all saved in one document.

I have created a repository on my GitHub at https://beard.media/Notebooks where I have stored a number of Jupyter notebooks, both for Azure Data Studio and for the new .NET interactive notebooks.

Another thing that you can do with notebooks is run them as Agent Jobs and save the results of the run.

Notebooks running T-SQL

This works easily for T-SQL notebooks. I am going to use one that I created which uses T-SQL to gather permissions, using old code that was in a share somewhere. We can run the notebook, get the permissions and save the notebook, and the results will be available for all time (unless you delete the notebook!)

SQL Agent Extension in Azure Data Studio

In Azure Data Studio, if you press CTRL + SHIFT + X it will open the Extensions tab

You can add extra functionality to Azure Data Studio. Search in the top bar for Agent and press the install button to install the extension. You can connect to an instance in the connections tab (CTRL + SHIFT + D), right click on it and click Manage. This will open up the server dashboard (why isn’t it instance dashboard?)

and you will also have the SQL Agent dashboard available

It's pretty neat: it has green and red bars against the jobs showing success or failure, and the larger the bar, the longer the run time. On the left you will see a book icon. Click that

Notebooks in Agent Jobs

You can create an Agent Job to run a notebook. As a notebook is just a json file, it can be stored in a database table. This interface will create two tables: one to store the templates and one for the results. Click New Notebook Job

Then navigate to the notebook and select it.

Choose a database for the storage of the template and the results and one for the execution context.

The name of the job will be the file name of the notebook. You can change this but there is a bug where you can only enter one character at a time in the name before it changes focus so beware!

Once the job is created, you will see two tables in the storage database: notebooks.nb_materialized and notebooks.nb_template

The materialised table is empty right now

but the template table has a row for the job which includes the notebook in json format.
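If you peek inside that json (or open any .ipynb file in a text editor), it looks something along these lines. This is a hand-trimmed illustration of the Jupyter notebook format rather than the exact contents of the table, so treat the values as placeholders:

{
  "metadata": { "kernelspec": { "name": "SQL", "display_name": "SQL" } },
  "nbformat": 4,
  "nbformat_minor": 2,
  "cells": [
    { "cell_type": "markdown", "source": ["# Permissions gathering notebook"] },
    { "cell_type": "code", "source": ["SELECT @@VERSION;"], "outputs": [] }
  ]
}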

If you click on the jobs in the Notebook Jobs window in the SQL Agent extension, you can see more information about the job run

You can also run the job from here, although it doesn’t have to be; it is just a normal agent job which you can run or schedule in the usual way. Running it from here gives a pop-up

You have to refresh to see when the job is finished. It will be red if the job failed, green if it succeeded, or orange if some cells failed, like this!

But this is the good bit. Clicking on that icon will open the notebook that was created by that agent job run. Let's see what we get

You can see that we have the results of the queries that we wrote in the notebook alongside the documentation (or maybe explanation of the expected results)
If we scroll down a little (and change the theme colour so that you can see the error)

Msg , Level , State , Line 
Duplicate column names are not permitted in SQL PowerShell. To repeat a column, use a column alias for the duplicate column in the format Column_Name AS New_Name.

We have got an error from running the code via SQL PowerShell which is how the job is run. This error is also inserted into the notebooks.nb_template table

I edited the notebook locally to remove that block of code

Then edited the job and selected the updated notebook

and re-ran the job and got a green tick.

Now I can open the notebook from the latest run, but notice that from this view I can also open the previous notebook.

If I look in the nb_template table, the last_run_notebook_error has cleared

and if I look in the nb_materialized table I can see two rows, one for each job run. The error from the first run is also stored in this table. The notebook column has the json for the notebook if you wish to access it in a different manner.
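If you want to pull those saved notebooks back out of the table yourself, a quick sketch with dbatools would be something like this. The storage database name and the output folder are placeholders for whatever you chose:

# Pull each materialised run out of the table and save it as an .ipynb file
# 'NotebookStorage' and C:\temp are assumptions - use your own storage database and path
$query = 'SELECT notebook FROM notebooks.nb_materialized'
$runs = Invoke-DbaQuery -SqlInstance 'localhost' -Database 'NotebookStorage' -Query $query
$i = 0
foreach ($run in $runs) {
    $i++
    $run.notebook | Out-File -FilePath "C:\temp\AgentJobRun$i.ipynb" -Encoding utf8
}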

Tomorrow, we will see what the job steps look like and how to make this run on an instance which does not and cannot have the required PowerShell.

Spoiler Alert – May contain dbatools 🙂

Use Jupyter Notebooks to Help People on Stack Overflow

I am sat at the PowerShell Saturday in Hamburg. You can see me on the right of this picture writing my previous blog post!

I was talking with my friend Mathias Jessen (@IISResetMe on Twitter) about notebooks and he said that another great use case was to use them on Stack Overflow

Now Mathias is an active answerer on Stack Overflow

and he puts a lot of effort into writing his answers, formatting them, including code and results. Basically exactly the same as a Notebook. However, with a Notebook, you can enable people to run the code as well on their own machines.

Mathias says he will use notebooks to help people when he answers their PowerShell questions on Stack Overflow. If you are a Stack Overflow answerer then you can too.

New .NET Notebooks are here – PowerShell 7 notebooks are here.

Data Science folk have used Notebooks for documentation and to show re-runnable research. Azure Data Studio included this notebook functionality and added a SQL kernel where, with a little bit of faffing, you could run PowerShell, and then a Python kernel that enabled PowerShell. It seems that notebooks are so cool that everyone is creating them these days! I was browsing Twitter when I saw this tweet.

PowerShell 7 Notebooks 🙂

A notebook experience for PowerShell 7? That sounds amazing. This will enable a true cross-platform PowerShell Notebook experience, which is lacking from the Python version as it uses Windows PowerShell on Windows and PowerShell Core on other OSes.

The first thing I asked was: will this come to Azure Data Studio? I got an immediate response from Sydney Smith, PowerShell Program Manager, saying it is on the roadmap.

Install dependencies

To be able to run the notebook, you need to install some dependencies. First, install the .NET Core SDK, which you can download from https://dotnet.microsoft.com/download (this needs admin permissions to install).

You also need a Python installation. You can use Anaconda, which you can download from https://www.anaconda.com/distribution/ (this does not need admin permissions to install).

Add Anaconda to Windows Terminal

I have added the Anaconda prompt to Windows Terminal so that I have one entry point for all my CLIs. Open the settings file and add the code below. (It will also give you an icon and background.)

        {
            // Make changes here to the Anaconda.exe profile
            "guid": "{0caa0dad-35be-5f56-a7ff-afceeeaa6101}",
            "name": "Anaconda",
            "commandline": "cmd.exe /K C:\\Users\\mrrob\\Anaconda3\\Scripts\\activate.bat",
            "hidden": false,
            "backgroundImage": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "icon": "C:\\Users\\mrrob\\Anaconda3\\Menu\\anaconda-navigator.ico",
            "backgroundImageAlignment": "topRight",
            "backgroundImageStretchMode": "uniform",
            "backgroundImageOpacity": 0.1
        }

and it appears in the drop down

With Anaconda installed, check that the kernel is available on your path. If, like me, you have Azure Data Studio installed, you will have additional kernels, but the important line here is

python3 C:\Users\USERNAME\Anaconda3\share\jupyter\kernels\python3

In Windows Terminal move to a PowerShell 7 prompt and install the dotnet interactive tool

dotnet tool install --global Microsoft.dotnet-interactive

Then you can install the .NET kernel in your Anaconda prompt using this command

dotnet interactive jupyter install

Sometimes new things have errors

I had an error when I tried this first time

Could not execute because the specified command or file was not found.
Possible reasons for this include:
* You misspelled a built-in dotnet command.
* You intended to execute a .NET Core program, but dotnet-interactive does not exist.
* You intended to run a global tool, but a dotnet-prefixed executable with this name could not be found on the PATH.

This is easily fixed by adding %USERPROFILE%\.dotnet\tools to my path:

set PATH=%PATH%;%USERPROFILE%\.dotnet\tools
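If you are at a PowerShell prompt rather than cmd, the equivalent for the current session is

$env:PATH = "$env:PATH;$env:USERPROFILE\.dotnet\tools"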

Running jupyter kernelspec list shows that the .NET kernel is installed for C Sharp, F Sharp and .NET PowerShell

Let's open a Notebook

Now you want to play with it!
You can run the lab environment using `jupyter lab`

This opens a browser

You can open existing Azure Data Studio PowerShell notebooks (but not SQL ones)

Sometimes new things have errors Part 2

Unfortunately, I get errors when trying to import Pester, which means I cannot use my dbachecks notebooks in this blog post. I have raised an issue on the repo here.

Create a New Notebook

But it is easy to create a new Notebook

In the launcher page click the .NET PowerShell button


Which will open a new Notebook in the directory that you launched the lab from. You can then add Code or Markdown as I have described before here.

Then you can add code, markdown and images to create your notebook.

Once you have finished using the notebook lab, you can shut it down in the Anaconda prompt with CTRL + C

Here is a video of running a notebook which anyone can use to create a couple of Docker containers running SQL 2019 and query them with dbatools. You can find the notebook further down this post.

Sharing Notebooks

You can create notebooks to run common tasks. Even better, from the lab you can convert the notebook, including the results, to a variety of formats to share with non-technical people. I used this functionality this week to export Azure Data Studio Notebooks to HTML and PDF for a project manager 🙂

You can find the Export Notebook command in the File menu

Exporting to HTML did not export the images but it does include the results
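If you prefer the command line, the nbconvert tool that ships with Jupyter does the same sort of export. For example (the file name is just a placeholder):

jupyter nbconvert --to html "Exploring dbatools.ipynb"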

You can share notebooks via GitHub, either in a gist like this

or by providing a straight link to the notebook in GitHub https://github.com/SQLDBAWithABeard/Notebooks/blob/master/notebooks/Exploring%20dbatools.ipynb

You can also use Binder https://mybinder.org/

This uses Docker to create an interactive Notebook. Create a GitHub repo like https://github.com/SQLDBAWithABeard/Notebooks (or just clone it), copy your notebooks into the notebooks folder, push the changes to GitHub, and then go to https://mybinder.org/ and add your repository URL.

You can see what it looks like by clicking the button below which Binder creates for you

Unfortunately the kernel only supports Python for the moment but you can see the possibilities 🙂

Dynamically Creating Azure Data Studio Notebooks with PowerShell for an Incident Response Index Notebook

Now that Azure Data Studio has PowerShell Notebooks and there is a PowerShell module for creating notebooks, I have been asked, more than once: what is the point? What is the use case? How does this help? I hope that this post will spark some ideas for one particular use case.

I showed my good friend Nick my silly example PowerShell code that creates a PowerShell Notebook which itself creates a PowerShell Notebook.

Nick is a fantastic, clever DBA who isn’t active on social media, which is a great shame as if he had time to share some of his fantastic work we would all benefit. He looked at that code and less than an hour later, came back to me with this code and idea which I have replicated here with his permission.

Thanks Nick.

The Use Case

The use case that Nick has is that he is converting some troubleshooting runbooks from their original locations (you know the sort of places – Sharepoint Docs, OneNote Notebooks, Shared Folders, the desktop of the Bastion Host) into a single repository of Azure Data Studio SQL or PowerShell Notebooks.

The idea is to have a single entry point into the troubleshooting steps and for the on-call DBA to create a Notebook from a template for the issue at hand which could be attached to an incident in the incident management solution. I suppose you could call it an Index Notebook.

Work Flow

When the DBA (or another team) opens this Notebook, they can choose the task that they are going to perform and click the link which will

  • copy the Notebook to the local machine
  • rename the Notebook with the username and date
  • open it ready for the work

Once the work has been completed, the DBA can then attach the Notebook to the task or incident that has been created or use it in the Wash-Up/ Post Incident meeting.

This ensures that the original template notebook stays intact and unchanged, and it is easy (which is always good when you are called out at 3am!) to create a uniquely named notebook.
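Nick's actual code isn't shown in this post, but to give an idea, here is a rough sketch of the sort of PowerShell such a link could run. The share path, the naming convention and the azuredatastudio command-line launcher are my assumptions, not his implementation:

# A rough sketch only - the template location and naming convention are assumptions
$template    = '\\FileShare\Runbooks\HighCPU\HighCPU.ipynb'      # hypothetical template notebook
$newName     = "HighCPU-$env:USERNAME-$(Get-Date -Format 'yyyyMMdd-HHmm').ipynb"
$workingCopy = Join-Path -Path $env:TEMP -ChildPath $newName

Copy-Item -Path $template -Destination $workingCopy              # copy the template to the local machine
azuredatastudio $workingCopy                                     # open the renamed copy ready for the work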

Azure DevOps

Nick has placed this code into the deploy step in Azure DevOps, which deploys the template Notebooks from source control into the common folder, and then the code dynamically creates the index Notebook each time there is a release.

Whilst the initial use case is incident response, this could easily be adapted for Notebooks used for Common Tasks or Run Books.

Notebooks

There are a number of Notebooks for different issues stored in directories. For this post, I have used the Notebooks from Microsoft that explain SQL Server 2019 features and troubleshooting, which you can find in their GitHub repositories by following this link.

The Azure DevOps release deploys the Notebooks to a directory, which then looks something like this

Some directories of Notebooks in a directory

Create an Index Notebook

Here is the code to create an index Notebook

This creates a Notebook in the root of the folder. It also uses the new -Collapse parameter in New-AdsNoteBookCell, which creates the code blocks with the code collapsed so that they look neater. The index Notebook looks like this in the root of the folder
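Nick's code isn't reproduced here, but here is a minimal sketch of the idea using the ADSNotebook module commands covered further down this page. The folder layout, the link text and the Start-IncidentNotebook.ps1 helper are my assumptions (the helper would do the copy/rename/open described in the Work Flow section above):

# A minimal sketch, not Nick's actual code. Assumes the template notebooks sit in sub-folders
# under $rootFolder, that Convert-ADSPowerShellForMarkdown returns the markdown link when
# -ToClipBoard is not used, and that Start-IncidentNotebook.ps1 is a hypothetical helper script.
$rootFolder = 'C:\Runbooks'   # wherever the Azure DevOps release drops the template notebooks

$cells = foreach ($notebook in Get-ChildItem -Path $rootFolder -Filter *.ipynb -Recurse) {
    $code = ".\Start-IncidentNotebook.ps1 -TemplatePath '$($notebook.FullName)'"
    $link = Convert-ADSPowerShellForMarkdown -InputText $code -LinkText "Work on $($notebook.BaseName)"
    New-ADSWorkBookCell -Type Text -Text "## $($notebook.Directory.Name) - $($notebook.BaseName)`n`n$link"
}

New-ADSWorkBook -Path (Join-Path -Path $rootFolder -ChildPath 'Index.ipynb') -cells $cells -Type PowerShell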

Three O’Clock in the Morning

It’s 3am and I have been called out. I can open up the Index Notebook, find the set of queries I want to run and click the run button.

A new workbook opens up, named with my name and the time and I can get to work 🙂 I think it’s neat.

Here’s a video

Thanks Nick.

Maybe you can find him at SQL Bits next year. Did you know that SQL Bits 2020 was announced?

Check out https://sqlbits.com for more details

Create a PowerShell Notebook for Azure Data Studio with PowerShell

The latest update to the ADSNotebook PowerShell module, which I blogged about here, now enables the creation of PowerShell notebooks with PowerShell.

You can install the module with

Install-Module ADSNotebook

or if you have already installed it you can use

Update-Module ADSNotebook

In the latest release, there is an extra -Type parameter for New-AdsWorkBook which will accept either SQL or PowerShell.

Create a PowerShell Notebook with PowerShell, Rob

OK!

Here is some code to create a PowerShell Notebook. First, we will create some cells using New-AdsWorkBookCell, including all the markdown to add images and links. You can find my notebooks, which explain how to write the markdown for your notebooks, in my GitHub Presentations Repository.

$introCelltext = "# Welcome to my Auto Generated PowerShell Notebook

## dbatools
![image](https://user-images.githubusercontent.com/6729780/68845538-7afcd200-06c3-11ea-952e-e4fe72a68fc8.png)  

dbatools is an open-source PowerShell Module for administering SQL Servers.
You can read more about dbatools and find the documentation at [dbatools.io](https://dbatools.io)
"
$SecondCelltext = "### Installation
You can install dbatools from the PowerShell Gallery using `Install-Module dbatools`
"

$thirdcelltext = "Install-Module dbatools"

$fourthCelltext = "### Getting Help
You should always use `Get-Help` to fins out how to use dbatools (and any PowerShell) commands"

$fifthcelltext = "Get-Help Get-DbaDatabase"
$sixthCelltext = "Try a command now. get the name, owner and collation of the user databases on the local instance"
$seventhCellText = "Get-DbaDatabase -SqlInstance localhost -ExcludeSystem | Select Name, Owner, Collation"

$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext
$fourth = New-ADSWorkBookCell -Type Text -Text $fourthCelltext
$fifth = New-ADSWorkBookCell -Type Code -Text $fifthcelltext
$sixth = New-ADSWorkBookCell -Type Text -Text $sixthCelltext
$seventh = New-ADSWorkBookCell -Type Code -Text $seventhCellText

Then we will create a new workbook using those cells

$path = 'C:\temp\dbatools.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third,$fourth,$fifth,$sixth,$Seventh -Type PowerShell

Then, when that code is run we can open the Notebook and ta-da

And it is super quick to run as well

UPDATE – Tyler Leonhardt from the PowerShell team asked

Challenge accepted, with extra meta, here is the PowerShell to create a PowerShell Notebook which will create a PowerShell Notebook!!

Create Azure Data Studio SQL Notebooks with PowerShell

At PASS Summit today I gave a presentation about SQL Notebooks in Azure Data Studio for the DBA. I demo’d the PowerShell module ADSSQLNotebook.

which you can also find on GitHub (where I will be glad to take PR’s to improve it 🙂 )

This module contains only 3 commands at present:

  • Convert-ADSPowerShellForMarkdown

This will create the markdown link for embedding PowerShell code in a Text Cell for a SQL Notebook as described in this blog post

  • New-ADSWorkBookCell

This command will create a workbook text cell or a code cell for adding to the New-ADSWorkBook command

  • New-ADSWorkBook

This will create a new SQL Notebook using the cell objects created by New-ADSWorkBookCell

Usage

Convert-ADSPowerShellForMarkdown

Convert-ADSPowerShellForMarkdown -InputText "Get-ChildItem" -LinkText 'This will list the files' -ToClipBoard

Converts the PowerShell so that it works with Markdown and copies it to the clipboard for pasting into a workbook cell

New-ADSWorkBookCell

$introCelltext = "# Welcome to my Auto Generated Notebook

## Automation
Using this we can automate the creation of notebooks for our use
"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext

Creates an Azure Data Studio Text cell and assigns it to a variable for passing to New-AdsWorkBook

New-ADSWorkBook

$introCelltext = "# Welcome to my Auto     Generated Notebook

## Automation
Using this we can automate the creation of notebooks for our use
"
$SecondCelltext = "## Running code
The next cell will have some code in it for running

## Server Principals
Below is the code to run against your instance to find the server principals that are enabled"

$thirdcelltext = "SELECT Name
FROM sys.server_principals
WHERE is_disabled = 0"
$Intro = New-ADSWorkBookCell -Type Text -Text $introCelltext
$second = New-ADSWorkBookCell -Type Text  -Text $SecondCelltext
$third = New-ADSWorkBookCell -Type Code -Text $thirdcelltext

$path = 'C:\temp\AutoGenerated.ipynb'
New-ADSWorkBook -Path $path -cells $Intro,$second,$third

Creates 3 cells with New-AdsWorkBookCell to add to the workbook,
two text ones and a code one, then creates a SQL Notebook with
those cells and saves it as C:\temp\AutoGenerated.ipynb

Installation

You can install this Module from the PowerShell Gallery using

Install-Module ADSNotebook

Compatibility

This module has been tested on Windows PowerShell 5.1, PowerShell Core 6 and PowerShell 7 on Windows 10 and Ubuntu

Demo

PowerShell Notebooks in Azure Data Studio

The latest release of the insiders edition of Azure Data Studio brings the first edition of PowerShell Notebooks!

You can download the latest insiders edition from the link above; it can be installed alongside the stable release.

To access many of the commands available, use F1 to open the command palette (like many of my tips, this also works in Visual Studio Code). You can then start typing to get the command that you want.

You can then hit enter with the command that you want highlighted, use the mouse or use the shortcut which is displayed to the right.

In a new notebook, you can click the drop-down next to Kernel and you can now see that PowerShell is available

When you choose the PowerShell kernel, you will get a prompt asking you to configure the Python installation

If you have Python already installed you can browse to the location where it is installed, or you can install Python. In the bottom pane you will be able to see the progress of the installation.

When it has completed, you will see

You may also get a prompt asking if you would like to upgrade some packages

Again this will be displayed in the tasks pane

Adding PowerShell


To add PowerShell Code to the notebook click the Code button at the top of the file

or the one you can find by highlighting above or below a block

I did not have IntelliSense, but you can easily write your code in Azure Data Studio or Visual Studio Code and paste it into the block.

Interestingly, Shawn Melton did.

This was because he had the PowerShell extension installed and I did not (I know!!)
If you find you don't have IntelliSense then install the PowerShell extension!

Clicking the play button (which is only visible when you hover the mouse over it) will run the code

You can clear the results from every code block using the clear results button at the top

Otherwise, you can save the results with the Notebook by saving it. This is the part that is missing from running PowerShell in the Markdown blocks in a SQL Notebook as I described here

I am looking forward to how this develops. You can find my sample PowerShell notebook (with the code results) here

PowerShell in SQL Notebooks in Azure Data Studio

I have done a lot of writing in the last few months but you see no blog posts! My wonderful friend Chrissy and I are writing “dbatools in a Month of Lunches” to be published by Manning. That has taken up a lot of my writing mojo. We have hit a little break whilst we have some reviews done ready for the MEAP (For everyone who asks, the answer is the unfulfilling ‘soon’) so it’s time for a blog post!

SQL Notebooks are cool

I have had a lot of fun with SQL Notebooks recently. I have presented a session about them at a couple of events this month, DataGrillen and SQL Saturday Cork. Here is a little snippet

Yes, you can run PowerShell in a SQL Notebook in Azure Data Studio just by clicking a link in the markdown cell. This opens up a lot of excellent possibilities.

I have had several discussions about how SQL Notebooks can be used by SQL DBAs within their normal everyday roles. (Mainly because I don’t really understand what the sorcerers of data science do with notebooks!). I have helped clients to look at some of their processes and use SQL Notebooks to help with them. Creating Disaster Recovery or Change Run-books or Incident Response Templates or using them for product demonstrations. Of course, I needed to use PowerShell in that 🙂

I have really enjoyed working out how to run PowerShell in the markdown in a SQL Notebook in Azure Data Studio and I think Anthony the kubernetes magician did too!

OK, enough magic puns, let's talk about PowerShell in SQL Notebooks. You can read about how to create a SQL Notebook and run T-SQL queries here (you no longer need the Insider Edition, by the way).

PowerShell in Markdown!

First, before I go any further, I must say this. I was at the European PowerShell Conference when I was working this out and creating my sessions and I said the words

“Cool, I can click a link and run PowerShell, this is neat”

A Beardy fellow in Hannover

This stopped some red team friends of mine in their tracks and they said “Show me”. One of them was rubbing their hands with glee! You can imagine the sort of wicked, devious things that they were immediately considering doing.

Yes, it’s funny but it also carries a serious warning. Without understanding what it is doing, please don’t enable PowerShell to be run in a SQL Notebook that someone sent you in an email or that you find on GitHub. In the same way that you don’t open the Word document attachment which will get a thousand million trillion pounddollars into your bank account, or run code you copy from the internet on production without understanding what it does, this could be a very dangerous thing to do.

With that warning out of the way, there are loads of really useful and fantastic use cases for this. SQL Notebooks make great run-books or incident response recorders and PowerShell is an obvious tool for this. (If only we could save the PowerShell output in a SQL Notebook, this would be even better)

How on earth did you work this out?

Someone asked me how I worked it out. I didn’t! It began with Vicky Harp, PM lead for the SQL Tools team at Microsoft

I then went and looked at Kevin Cunnane‘s notebook. Kevin is a member of the tools team working on Azure Data Studio. With SQL Notebooks, you can double-click the markdown cell and see the code that is behind it. To understand how it is working, let's deviate a little.

Keyboard Shortcuts

If you click the cog at the bottom left of Azure Data Studio and choose Keyboard Shortcuts

you can make Azure Data Studio (and Visual Studio Code) work exactly how you want it to. Typing in the top box will find a command and you can then set the shortcuts that you want to use to save yourself time.

This also enables you to see the command that is called when you use a keyboard shortcut. For example, you can see that for the focus terminal command it says workbench.action.terminal.focus.

It turns out that you can call this as a link in a Markdown document using HTML with <a href=""> and adding command: prior to the command text. When the link is clicked the command will run. Cool 🙂
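For example, a markdown cell containing the link below should give you a clickable link that simply puts focus on the terminal, using the command we found above:

<a href="command:workbench.action.terminal.focus">Focus the terminal</a>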

For this to be able to work (you read the warning above?) you need to set the Notebook to be trusted by clicking this button.

This will allow any command to be run. Of course, people with beards will helpfully advise when this is required for a SQL Notebook. (Safe to say people attempting nefarious actions will try the same with your users)

Now that we know how to run an Azure Data Studio command using a link in a markdown cell the next step is to run a PowerShell command. I headed to the Visual Studio Code documentation and found

Send text from a keybinding
The workbench.action.terminal.sendSequence command can be used to send a specific sequence of text to the terminal, including escape sequences

That’s the command we need, however, we still need to craft the command so that it will work as a link. It needs to be converted into a URL.

I started by using this website https://www.url-encode-decode.com/ to do this. This is also how you can check the code in other people's notebooks: use the decode capability.

Encoding Set-Location C:\dbachecks gives Set-Location+C%3A%5Cdbachecks

So I can just put that code into the href link and bingo!

If only it was that easy!!

Some Replacing is required

The + needs to be replaced with a space or %20

You also need to double the \ and replace the %3A with a :
The " needs to be replaced with \u022, the ' with \u027, the curly braces won’t work unless you remove the %0D%0A. Got all that? Good!

Once you have written your PowerShell, encoded it, performed the replacements, you add \u000D at the end of the code to pass an enter to run the code and then place all of that into a link like this

<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 PLACE THE ENCODED CODE HERE %22%7D">Link Text</a>

This means that if you want to add the PowerShell code to set a location and then list the files and folders in that location to a Markdown cell using PowerShell like this

Set-Location C:\dbachecks
Get-ChildItem

You would end up with a link like this

<a href="command:workbench.action.terminal.sendSequence?%7B%22text%22%3A%22 Set-Location C:%5C%5Cdbachecks \u000D Get-ChildItem \u000D %22%7D">Set Location and list files</a>

Doing something more than once?

I don’t want to remember that all of the time so I wrote a PowerShell function. You can find it on GitHub https://github.com/SQLDBAWithABeard/Functions/blob/master/Convert-ADSPowerShellForMarkdown.ps1

This will take a PowerShell command and turn it into a link that will work in an Azure Data Studio markdown cell. It's not magic, it's PowerShell. There is a -ToClipboard parameter which will copy the code to the clipboard ready for you to paste into the cell (on Windows machines only).

Giants

There are many uses for this but here’s one I think is cool.

The link below will go to a notebook, which will show you how the giants upon whose shoulders I stand

Glenn Berry,
Chrissy LeMaire,
André Kamman,
Gianluca Sartori

have enabled me to create a SQL Notebook with a link which will run some PowerShell to create a SQL Notebook which will have all of the Diagnostic Queries in it.

You could possibly use something like it for your incident response SQL Notebook.

It’s also cool that GitHub renders the notebook in a browser (You can’t run PowerShell or T-SQL from there though, you need Azure Data Studio!)

https://github.com/SQLDBAWithABeard/Presentations/blob/master/2019/Berlin%20SQL%20User%20Group/04%20-%20Glenn%20Berry%20Notebook.ipynb


What's a SQL Notebook in Azure Data Studio?

Azure Data Studio is a cross-platform database tool for data professionals using the Microsoft family of on-premises and cloud data platforms on Windows, MacOS, and Linux.

Recently Vicky Harp tweeted

By the way, you can watch a recording from SQLBits of Vicky’s session


So in the interest of learning about something new I decided to give it a try.

Install The Insiders Edition

Unlike Visual Studio Code, which has a link to the insiders download on the front page, you will have to visit the GitHub repository for the links to download the insiders release of Azure Data Studio. Scroll down and you will see

Try out the latest insiders build from master:

See the change log for additional details of what’s in this release.

Once you have installed it, you can connect to an instance, right-click and choose New Notebook, or you can use File – New Notebook.

Incidentally, I use the docker-compose file here to create the containers, and I map C:\MSSQL\BACKUP\KEEP on my local machine (where my backups are) to /var/opt/mssql/backups on the containers on lines 10 and 17 of the docker-compose, so change as required. If you want to follow along then put the ValidationResults.bak in the folder on your local machine.
The Create-Ag.ps1 shows the code and creates an AG with dbatools. But I digress!

Install Notebook Dependencies

Once you click New Notebook you will get a prompt to install the dependencies.

It will show its output

and take a few minutes to run

It took all but 11 minutes on my machine


Create a Notebook

OK, so now that we have the dependencies installed we can create a notebook. I decided to use the ValidationResults database that I use for my dbachecks demos and describe here. I need to restore it from my local folder that I have mapped as a volume to my container. Of course, I use dbatools for this 🙂

# U: sqladmin P: dbatools.IO
$cred = Get-Credential
$restoreDbaDatabaseSplat = @{
    SqlInstance = $sqlinstance1
    SqlCredential = $cred
    UseDestinationDefaultDirectories = $true
    Path = '/var/opt/mssql/backups/ValidationResults.bak'
}
Restore-DbaDatabase @restoreDbaDatabaseSplat

I had already got a connection saved to the instance in Azure Data Studio; you may need to create a new one using the new connection icon at the top left and fill in the details. The password is in the code above.



Now I can start with my notebook. I am faced with this



I click on text and provide an intro


Once I had written that and clicked out, I couldn’t see what to do straight away!

Then I saw the code and text buttons at the top 🙂 Right, let's get on with it 🙂 I hit the code button and paste in the T-SQL to reset the dates in the database to simulate dbachecks having been run this morning.


There’s a run cell button on the right and when I press it

Cool 🙂

If the SQL query has results then they are shown as well

This is fun and I can see plenty of uses for it. Go and have a play with SQL notebooks 🙂

Source Control

I used CTRL K, CTRL O to open a folder and saved my notebook in my local Presentations folder, which is source controlled. When I opened the explorer (CTRL + SHIFT + E) I could see that the folder and the file are colour coded green and have a U next to them, marking them as Untracked. I could also see that the source control icon has a 1 for the number of files with changes and, in the bottom left, that I am in the master branch.

If I click on the source control icon (or CTRL + SHIFT + G) I can see the files with the changes and can enter a commit message

I then press CTRL + ENTER to commit my change and get this pop-up


As I only have one file and it has all the changes for this commit I click yes. If I had changed more than one file and only wanted to commit a single one at a time I would hover my mouse over the file and click the + to stage my change.



If I make a further change to the notebook and save it, I can see that the source control provider recognises the change but this time the folder the file is in and the file are colour coded brown with an M to show that they have been modified.

Unlike Visual Studio Code, when you then click on the source control icon and click on the change it does not show the differences in the notebook although this works with SQL files.

When I have made all my changes and committed them with good commit messages


I can see that there are 3 local changes ready to be pushed to my remote repository (GitHub in this case) and 0 remote commits in this branch by looking at the bottom left

I can click on the “roundy roundy” icon (I don't know its proper name 😊) and synchronise my changes. This comes with a pop-up

Personally, I never press OK, Don’t Show Again, because I like the double check and the chance to think “Is this really what I want to do right now?”. Once I press OK, my changes will be synced with the remote repository. All of this means that you can find the notebook I have used in my Presentations GitHub Repository, which means that you can run the Notebook too, using the docker-compose file here and the instructions further up in the post.
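Incidentally, the source control view is just driving git for you; if you prefer a terminal, the same flow would be roughly

git add .
git commit -m "Add SQL Notebook"
git push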