Persisting databases with named volumes on Windows with docker compose

For all things containers I refer to my good friend Andrew Pruski. Known as dbafromthecold on Twitter, he blogs at https://dbafromthecold.com

I was reading his latest blog post, Using docker named volumes to persist databases in SQL Server, and decided to give it a try.

His instructions worked perfectly, so I thought I would try them using a docker-compose file, as I like the ease of spinning up containers that way.

I created a docker-compose file like this, which maps the backup folder on my Windows 10 laptop to a directory in the container, and two more local folders to the system folders in the container, in the same way as Andrew does in his blog.

version: '3.7'

services:
    2019-CTP23:
        image: mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu
        ports:  
          - "15591:1433"
          - "5022:5022"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
        volumes: 
          - C:\MSSQL\BACKUP\KEEP:/var/opt/mssql/backups
          - C:\MSSQL\DockerFiles\datafiles:/var/opt/sqlserver
          - C:\MSSQL\DockerFiles\system:/var/opt/mssql

and then from the directory I ran

docker-compose up -d

This builds the containers defined in the docker-compose file; the -d flag runs them in the background. This was the result.

UPDATE – 2019-03-27

I have no idea why, but today it has worked as expected using the above docker-compose file. I had tried this a couple of times, restarted Docker, and restarted my laptop, and was consistently getting the results below – however, today it has worked.

So feel free to carry on reading – it's a fun story, and it shows how you can persist the databases in a new container – but the above docker-compose file has worked!

The command completed successfully but, as you can see on the left, the container is red because it is not running. (I am using the Docker Explorer extension for Visual Studio Code.)
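If you are not using the extension, the terminal tells the same story. Listing all containers, including stopped ones, shows an Exited status next to the failed container.

docker ps -a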

I inspected the logs from the container using

docker logs ctp23_2019-CTP23_1

which returned

This is an evaluation version. There are [153] days left in the evaluation period.
This program has encountered a fatal error and cannot continue running at Tue Mar 26 19:40:35 2019
The following diagnostic information is available:
Reason: 0x00000006 Status: 0x40000015 Message: Kernel bug check Address: 0x6b643120
Parameters: 0x10861f680
Stacktrace: 000000006b72d63f 000000006b64317b 000000006b6305ca
000000006b63ee02 000000006b72b83a 000000006b72a29d
000000006b769c02 000000006b881000 000000006b894000
000000006b89c000 0000000000000001
Process: 7 – sqlservr
Thread: 11 (application thread 0x4)
Instance Id: e01b154f-7986-42c6-ae13-c7d34b8b257d
Crash Id: 8cbb1c22-a8d6-4fad-bf8f-01c6aa5389b7
Build stamp: 0e53295d0e1704ae5b221538dd6e2322cd46134e0cc32be49c887ca84cdb8c10
Distribution: Ubuntu 16.04.6 LTS
Processors: 2
Total Memory: 4906205184 bytes
Timestamp: Tue Mar 26 19:40:35 2019
Ubuntu 16.04.6 LTS
Capturing core dump and information to /var/opt/mssql/log…
dmesg: read kernel buffer failed: Operation not permitted
No journal files were found.
No journal files were found.
Attempting to capture a dump with paldumper
WARNING: Capture attempt failure detected
Attempting to capture a filtered dump with paldumper
WARNING: Attempt to capture dump failed. Reference /var/opt/mssql/log/core.sqlservr.7.temp/log/paldumper-debug.log for details
Attempting to capture a dump with gdb
WARNING: Unable to capture crash dump with GDB. You may need to
allow ptrace debugging, enable the CAP_SYS_PTRACE capability, or
run as root.

which told me that …………. it hadn’t worked. So I removed the containers with

docker-compose down

I thought I would create the volumes ahead of time, as Andrew's blog mentioned, with

docker volume create mssqlsystem
docker volume create mssqluser

and then use the volume names in the docker-compose file, mapped to the system folders in the container. This time the result was

ERROR: Named volume “mssqlsystem:/var/opt/sqlserver:rw” is used in service “2019-CTP23” but no declaration was found in the volumes section.

So that didn't work either 🙂
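With hindsight, the error message explains exactly what is missing: in a version 3 compose file, named volumes must be declared in a top-level volumes section, and marking them as external tells docker-compose to use the pre-created volumes rather than creating its own. A minimal sketch of what that declaration would look like (I did not try this at the time):

version: '3.7'

services:
    2019-CTP23:
        image: mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu
        ports:
          - "15591:1433"
          - "5022:5022"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
        volumes:
          - C:\MSSQL\BACKUP\KEEP:/var/opt/mssql/backups
          - mssqluser:/var/opt/sqlserver
          - mssqlsystem:/var/opt/mssql

volumes:
    mssqluser:
        external: true
    mssqlsystem:
        external: true

But I didn't know that at the time, so I kept digging.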

I decided to inspect the volume definition using

docker volume inspect mssqlsystem
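which returns JSON along these lines (the CreatedAt value here is illustrative):

[
    {
        "CreatedAt": "2019-03-26T19:00:00Z",
        "Driver": "local",
        "Labels": {},
        "Mountpoint": "/var/lib/docker/volumes/mssqlsystem/_data",
        "Name": "mssqlsystem",
        "Options": {},
        "Scope": "local"
    }
]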

I can see that the mountpoint is /var/lib/docker/volumes/mssqlsystem/_data, so I decided to try a docker-compose file like this

version: '3.7'

services:
    2019-CTP23:
        image: mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu
        ports:  
          - "15591:1433"
          - "5022:5022"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
        volumes: 
          - C:\MSSQL\BACKUP\KEEP:/var/opt/mssql/backups
          - /var/lib/docker/volumes/mssqluser/_data:/var/opt/sqlserver
          - /var/lib/docker/volumes/mssqlsystem/_data:/var/opt/mssql

and then ran docker-compose up without the -d flag so that I could see all of the output

You can see in the output that the system database files are being moved. That looks like it is working, so I used CTRL + C to stop the container and return to the terminal. I then ran docker-compose up -d again.

I created a special database for Andrew.
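If you want to create one the same way, one option is to run sqlcmd inside the container. The container name and database name here are mine, so adjust as required.

docker exec -it ctp23_2019-CTP23_1 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'Password0!' -Q "CREATE DATABASE [DatabaseForAndrew]"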

I could then remove the container with

docker-compose down

To make sure there is nothing up my sleeve I altered the docker-compose file to use a different name and port but kept the volume definitions the same.

version: '3.7'

services:
    2019-CTP23-Mk1:
        image: mcr.microsoft.com/mssql/server:2019-CTP2.3-ubuntu
        ports:  
          - "15592:1433"
          - "5022:5022"
        environment:
          SA_PASSWORD: "Password0!"
          ACCEPT_EULA: "Y"
        volumes: 
          - C:\MSSQL\BACKUP\KEEP:/var/opt/mssql/backups
          - /var/lib/docker/volumes/mssqluser/_data:/var/opt/sqlserver
          - /var/lib/docker/volumes/mssqlsystem/_data:/var/opt/mssql

I ran docker-compose up -d again and connected to the new container and lo and behold the database is still there.
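You can verify it from the terminal as well by querying sys.databases inside the new container. The container name follows docker-compose's project_service_index pattern, so yours may differ.

docker exec -it ctp23_2019-CTP23-Mk1_1 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'Password0!' -Q "SELECT name FROM sys.databases"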

So after doing all of this, I have learned that to persist the databases and use docker-compose files I had to map the volume to the mountpoint of the docker volume. Except I haven't: I have learned that sometimes weird things happen with Docker on my laptop!!

What's a SQL Notebook in Azure Data Studio?

Azure Data Studio is a cross-platform database tool for data professionals using the Microsoft family of on-premises and cloud data platforms on Windows, macOS, and Linux.

Recently Vicky Harp tweeted

By the way, you can watch a recording from SQLBits of Vicky’s session


So in the interest of learning about something new I decided to give it a try.

Install The Insiders Edition

Unlike Visual Studio Code, which has a link to the insiders download on its front page, you will have to visit the GitHub repository for the links to download the insiders release of Azure Data Studio. Scroll down and you will see

Try out the latest insiders build from master:

See the change log for additional details of what’s in this release.

Once you have installed it, you can connect to an instance, right-click and choose New Notebook, or you can use File – New Notebook

Incidentally, I use the docker-compose file here to create the containers, and I map C:\MSSQL\BACKUP\KEEP on my local machine (where my backups are) to /var/opt/mssql/backups on the containers on lines 10 and 17 of the docker-compose file, so change as required. If you want to follow along then put ValidationResults.bak in that folder on your local machine.
The Create-Ag.ps1 shows the code and creates an AG with dbatools. But I digress!

Install Notebook Dependencies

Once you click New Notebook you will get a prompt to install the dependencies.

It will show its output

and take a few minutes to run

It took about 11 minutes on my machine.


Create a Notebook

OK, so now that we have the dependencies installed we can create a notebook. I decided to use the ValidationResults database that I use for my dbachecks demos and describe here. I need to restore it from my local folder that I have mapped as a volume to my container. Of course, I use dbatools for this 🙂

# U: sqladmin P: dbatools.IO
$cred = Get-Credential
# Assumption: $sqlinstance1 is the instance exposed by the docker-compose file;
# adjust the port to whatever your compose file maps
$sqlinstance1 = 'localhost,15592'
$restoreDbaDatabaseSplat = @{
    SqlInstance = $sqlinstance1
    SqlCredential = $cred
    UseDestinationDefaultDirectories = $true
    Path = '/var/opt/mssql/backups/ValidationResults.bak'
}
Restore-DbaDatabase @restoreDbaDatabaseSplat

I had already got a connection saved to the instance in Azure Data Studio; you may need to create a new one using the new connection icon at the top left and filling in the details. The password is in the code above.



Now I can start with my notebook. I am faced with this



I click on text and provide an intro


Once I had written that and clicked out, I couldn’t see what to do straight away!

Then I saw the code and text buttons at the top 🙂 Right, let's get on with it 🙂 I hit the code button and pasted in the T-SQL to reset the dates in the database to simulate dbachecks having been run this morning.


There’s a run cell button on the right and when I press it

Cool 🙂

If the SQL query has results then they are shown as well

This is fun and I can see plenty of uses for it. Go and have a play with SQL notebooks 🙂

Source Control

I used CTRL + K, CTRL + O to open a folder and saved my notebook in my local Presentations folder, which is source controlled. When I open the explorer (CTRL + SHIFT + E) I can see that the folder and the file are colour coded green and have a U next to them, marking them as Untracked. I can also see that the source control icon has a 1 for the number of files with changes, and in the bottom left that I am in the master branch.

If I click on the source control icon (or CTRL + SHIFT + G) I can see the files with the changes and can enter a commit message

I then press CTRL + ENTER to commit my change and get this pop-up


As I only have one file and it has all the changes for this commit I click yes. If I had changed more than one file and only wanted to commit a single one at a time I would hover my mouse over the file and click the + to stage my change.



If I make a further change to the notebook and save it, I can see that the source control provider recognises the change but this time the folder the file is in and the file are colour coded brown with an M to show that they have been modified.

Unlike in Visual Studio Code, when you then click on the source control icon and click on the change, it does not show the differences for the notebook, although this does work with SQL files.

When I have made all my changes and committed them with good commit messages


Looking at the bottom left, I can see that there are 3 local commits ready to be pushed to my remote repository (GitHub in this case) and 0 remote commits in this branch

I can click on the “roundy roundy” icon (I don't know its proper name 😊) and synchronise my changes. This comes with a pop-up

Personally I never press OK, Don't Show Again, because I like the double check and to think “Is this really what I want to do right now?”. Once I press OK my changes will be synced with the remote repository. All of this means that you can find the notebook I have used in my Presentations GitHub Repository, which means that you can run the notebook too, using the docker-compose file here and the instructions further up in the post.
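Under the hood this is all just git, so the same flow from a terminal would be roughly the following (the file name is whatever you saved your notebook as):

git add "MyNotebook.ipynb"
git commit -m "Add SQL notebook"
git push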

#TSQL2sDay – NomNomNomNomNom

The topic for this month's T-SQL Tuesday #112, hosted by Shane O'Neill (Blog / Twitter), is about “dipping into your cookie jar”. This phrase means “when times get tough, how do you dip into your reserves to keep going?”. Shane asks the following:

That is what I want from the contributors of this T-SQL Tuesday, those memories that they can think back on for sustenance. Like the humble cookie, I want a humble brag.

Mmmm Cookies


Photo by Pille-Riin Priske on Unsplash

I’m not good at bragging, I’m generally convinced that all of you are better than me. Yes, I am aware that it is irrational. This has made writing this post really hard. Sure, I get immense pleasure and satisfaction from solving a problem, that’s a form of instant fulfillment. Certainly, I enjoy teaching people and passing over my knowledge for them to use.
I am not going to write about technical things that I have done because they don’t give me sustenance in that way.

So what does give me sustenance when times are hard?

People.

The things I am most proud of are the things other people do where I have played a small part. These are the things I look back at and help to energise me. Things like

  • A couple of people to whom I suggested blogging and then speaking, and who are now seen as experts in their niche.
  • The people I mentored as new speakers who are now speaking all over the continent.

The most recent story was a DBA who sat in a full day pre-con at a SQL Saturday, took loads of notes and waited at the end to ask questions. We were looking at some code and she was telling me it wasn't very good and apologising for it. It was good; it performed the required actions over a large estate, and I told her so. I asked about her job and, with a big sigh, she told a story of being stuck in a rut, dealing with a lot of legacy systems, not enjoying it and not being able to move on. We had a long talk.

Cut to this year's SQL Bits and she came running up to me all energised. She has a new job, doing something “Cool in the cloud”, and she said the things she had learned had helped her to land this role.

In all of these cases, it is the person involved who has done all of the hard work, but it is these things that keep me going. The thank yous and the smiles I see on those people's faces as they do the thing that they love and enjoy their success and progression 🙂

Cake !!!!!


Photo by Prince Abid on Unsplash

Hey, that's cake and not cookies, Rob.

I know. The biggest thing that keeps me going when times are tough, though, is the security I am able to provide. Nearly 20 years ago my life was very different. Without a job, having had to give up a career, struggling to deal with my wife's serious illnesses, and suddenly responsible for the entire household without the means to provide, I was in a very bleak place and saw no way out.

So to have found a career that is my hobby, to be able to work and also to have fun, to have a social world that provides me with friends and entertainment in many countries and the opportunity to experience different cultures and still be able to live comfortably. Thats a blessing and what keeps me going.

Also being able to pay my dad back for turning up with sacks of potatoes by taking him to football matches and comedy shows 🙂

Acknowledge what you have got, tell your loved ones that you love them, enjoy life and use your cookies when you need them but don’t forget the cake 🙂

#DataInDevon – Getting up to speed with PowerShell or spend a day with one of four other MVPs :-)

Saturday 27th April is Global Azure Bootcamp day

What’s Global Azure Bootcamp?

The website says it best


…. communities will come together once again in the sixth great Global Azure Bootcamp event! Each user group will organize their own one day deep dive class on Azure the way they see fit and how it works for their members. The result is that thousands of people get to learn about Azure and join together online under the social hashtag #GlobalAzure!

Saturday Is Free Learning

I am a part of the team organising the event in Exeter. Now, there is a little story here. We had chosen this date by chance to hold an event we call Data In Devon, giving people in the South West (of the UK) the chance to access a whole day of high-quality data and technical sessions for free on a Saturday.

When the Global Azure Bootcamp was announced, we had a conversation with the organisers and they agreed that we could add Exeter as a venue, as we had already decided to have a whole track dedicated to Azure. You can find our schedule here https://sqlsouthwest.co.uk/data-in-devon-saturday-schedule/ and you can register to attend via this form.

Now, we have some costs obviously – not a lot, but venues are not free and neither is food 😉. We have a couple of sponsors (feel free to contact me if your company is interested in sponsoring the event) but we also have some paid training days on Friday 26th April.

Friday Is Training Day

It's a great opportunity to get cheap, high-quality training from some of the best in their areas of expertise. There are still some tickets at £175, and the price will only rise to £200 after that. I think that £200 is fantastic value to be able to spend a day learning from

Alex Whittles – Data Platform MVP – BI in Azure
John Martin – Data Platform MVP – Infrastructure as Code with Terraform
Terry McCann – Data Platform MVP – Machine Learning: From model to production using the cloud, containers and DevOps
William Durkin – Data Platform MVP – Performance Pain Reduction for Data Platform Projects

and myself – Getting up to speed with PowerShell

You can sign up for any of these sessions by following the instructions here https://sqlsouthwest.co.uk/training-day-schedule/#Pricing. We don't have a fancy website or booking system as we wanted to keep costs down.

The details of my training day are below

Getting up to speed with PowerShell

PowerShell is cross-platform; it works the same on Windows, Linux, and macOS. It is awesome for automation and amazing for administration.

We will cover

  • the basics about PowerShell and PowerShell security
  • how to open PowerShell, how to install PowerShell
  • 4 vital commands to enable you to be able to help yourself (see the sketch after this list)
  • The PowerShell Gallery and how to find, install and use additional modules
  • Reading the language
  • Working with output
  • Why red text is a good thing and how to learn from the errors
  • We will even delve into scripting with PowerShell and how to validate your environment
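To give a flavour of what I mean by vital commands, these are the kind of built-in discoverability commands that let you help yourself (my illustration, not necessarily the exact four from the course):

Get-Command -Name *Process*     # find commands by name
Get-Help Get-Process -Examples  # read the documentation and worked examples
Get-Process | Get-Member        # discover the properties and methods of output
Get-Module -ListAvailable       # see which modules are installed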

There will also be the opportunity to learn about any areas of PowerShell, automation, or CI/CD that you have questions about. This is a beginner-level session in which I will teach you to be comfortable with PowerShell and confident in being able to use it in the future.

Attendees wanting to follow along should bring a laptop.