Creating Azure SQL Database AAD Contained Database Users with an SPN using PowerShell, Secrets Management, Azure Key Vault, and dbatools

Following on from my posts about using the Secret Management module, Good bye Import-CliXml, and running programs as a different user, I have another use case.

After creating Azure SQL Databases in an Elastic Pool using a process pretty similar to this one I blogged about last year, I needed to be able to programmatically create users and assign permissions.

I need a user to log in with.

When I created my Azure SQL Server with Terraform, I set the Azure AD admin to be an SPN, as you can see in the image from the portal, and set it to have an identity using the documentation for azurerm_mssql_server.

This allows that user to manage access for the SQL Server, as long as the SQL Server's Azure AD identity has Directory Reader privileges. The SQL Server is called temp-beard-sqls and, as you can see, the identity is assigned to the role.

The privileges required to do this for a single identity are quite high,

so now you can assign an Azure Active Directory group to that role and allow less-privileged users to add the identity to the group. The documentation is here and there is a tutorial here explaining the steps you need to take.
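Adding the SQL Server's identity to such a group can itself be scripted. Here is a minimal sketch using the Az modules; the resource group and group names are hypothetical, and it assumes the group already holds the Directory Readers role:

# Sketch only - look up the SQL Server's managed identity and add it to an AAD group
# 'temp-beard-rg' and 'SQL-Directory-Readers' are made-up names for illustration
$sqlServer = Get-AzSqlServer -ResourceGroupName 'temp-beard-rg' -ServerName 'temp-beard-sqls'
$group = Get-AzADGroup -DisplayName 'SQL-Directory-Readers'
Add-AzADGroupMember -TargetGroupObjectId $group.Id -MemberObjectId $sqlServer.Identity.PrincipalId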

What is an Azure SPN?

An Azure service principal is an identity created for use with applications, hosted services, and automated tools to access Azure resources.

https://docs.microsoft.com/en-us/cli/azure/create-an-azure-service-principal-azure-cli?toc=%2Fazure%2Fazure-resource-manager%2Ftoc.json&view=azure-cli-latest

I created the SPN using Azure CLI straight from the Azure Portal by clicking this button

and running

az ad sp create-for-rbac --name ServicePrincipalName

This will quickly create an SPN for you and return the password.

Yes, I have deleted this one.
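For reference, the command returns JSON along these lines; the values below are placeholders rather than real output:

{
  "appId": "00000000-0000-0000-0000-000000000000",
  "displayName": "ServicePrincipalName",
  "password": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "tenant": "00000000-0000-0000-0000-000000000000"
}

It is the appId and password that we will store in the Key Vault shortly, not the displayName.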

Add Azure Key Vault to Secret Management

In my previous posts, I have been using the default key vault, which is limited to your local machine and the user that is running the code. It would be better to use Azure Key Vault to store the details for the SPN so that they are safely stored in the cloud and not on my machine, and also so that anyone (or any app) that has permissions to the vault can use them.
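Granting an app or a colleague access to the secrets can then be done with a one-liner. A minimal sketch with the Az.KeyVault module, assuming the vault uses access policies and that $AppId holds the SPN's application ID:

# Sketch only - allow the SPN (by its application ID) to read secrets from the vault
Set-AzKeyVaultAccessPolicy -VaultName 'beard-key-vault' -ServicePrincipalName $AppId -PermissionsToSecrets Get, List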

First you need to log in to Azure in PowerShell (you will need to have the Az* modules installed).

Connect-AzAccount

Be aware, the login box can appear behind the VS Code or Azure Data Studio window!

Once connected, if you have several Azure subscriptions, you can list them with

Get-AzSubscription

You can choose your subscription with

$AzureSubscription = Set-AzContext -SubscriptionName "NAME OF SUBSCRIPTION"

For the Secret Management Module to manage the Azure Key Vault, you first need to register it.

Ensure that you have permissions to connect by following the details in the network security documentation https://docs.microsoft.com/en-us/azure/key-vault/general/network-security and the secure access documentation https://docs.microsoft.com/en-us/azure/key-vault/general/secure-your-key-vault

Then you can run Register-SecretVault. You need to provide the local name for the key vault, the module name Az.KeyVault, and a VaultParameters hashtable with the Key Vault name and the Azure subscription ID. You can register other types of key vault with the Secret Management module in this way and they will require different values for the VaultParameters parameter.

$KeyVaultName = 'beard-key-vault'
Register-SecretVault -Name BeardKeyVault -ModuleName Az.KeyVault -VaultParameters @{ AZKVaultName = $KeyVaultName; SubscriptionId = $AzureSubscription.Subscription }
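You can quickly confirm that the registration worked with the SecretManagement cmdlets:

Get-SecretVault                        # lists all registered vaults
Test-SecretVault -Name BeardKeyVault   # returns True if the vault can be reached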

Adding the SPN details to the Azure Key Vault

Use the value for the AppId (note: NOT the display name) and the password from the Azure CLI output, or create a new secret for the SPN with PowerShell or via the portal. You can then use the following code to add the SPN details and the tenant ID to the Azure Key Vault using the Secret Management module.

$ClientId = Read-Host "Enter ClientID" -AsSecureString
$SecretFromPortal = Read-Host "Enter Client Secret" -AsSecureString 
$tenantid = Read-Host "Enter TenantId" -AsSecureString 
Set-Secret -Vault BeardKeyVault -Name service-principal-guid -Secret $ClientId
Set-Secret -Vault BeardKeyVault -Name service-principal-secret -SecureStringSecret $SecretFromPortal
Set-Secret -Vault BeardKeyVault -Name Tenant-Id -Secret $tenantid

You can also do this with the Az.KeyVault module by following the instructions here

You can see the secrets in the portal

and also at the command line with the Secret Management module using

Get-SecretInfo -Vault RegisteredNameOfVault

Can my user connect?

If I try to connect in Azure Data Studio to the temp-sql-db-beard database on my Azure SQL server with my AAD account, it fails.

By the way, a great resource for troubleshooting the SQL error 18456 failure states can be found here: https://sqlblog.org/2020/07/28/troubleshooting-error-18456

dbatools to the rescue 🙂

dbatools is an open source community collaboration PowerShell module for administering SQL Server. You can find out more about it at dbatools.io and get the book that Chrissy and I are writing about dbatools at dbatools.io/book

You can connect to Azure SQL Database with an Azure SPN using the following code. It will get the secrets from the Azure Key Vault that were set above and create a connection. Let's see if I can run a query as the SPN.

$SqlInstance = 'temp-beard-sqls.database.windows.net'
$databasename = 'master'
$appid = Get-Secret -Vault BeardKeyVault -Name service-principal-guid  -AsPlainText
$Clientsecret = Get-Secret -Vault BeardKeyVault -Name service-principal-secret
$credential = New-Object System.Management.Automation.PSCredential ($appid,$Clientsecret)
$tenantid = Get-Secret -Vault BeardKeyVault -Name Tenant-Id -AsPlainText
$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $databasename  -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 

Invoke-DbaQuery -SqlInstance $AzureSql -Database master  -SqlCredential $credential -Query "Select SUSER_NAME() as 'username'" 

Excellent 🙂

Add a user to the user database

I can then add my user to the temp-sql-db-beard database. I need to create a new connection to the user database, as you cannot use the USE [DatabaseName] statement in Azure SQL Database.

$Userdatabasename = 'temp-sql-db-beard'

$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $Userdatabasename -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 

Whilst you can use dbatools to create new users in Azure SQL Database, at present you can't create AAD users. You can run a T-SQL script to do this though. This script will create a contained database user in the database. I have added the role membership as well, but that can also be done with Add-DbaDbRoleMember from dbatools (there is a sketch after the script below).

$Query = @"
CREATE USER [rob@sewells-consulting.co.uk] FROM EXTERNAL PROVIDER
ALTER ROLE db_datareader ADD MEMBER [rob@sewells-consulting.co.uk]
"@
Invoke-DbaQuery -SqlInstance $AzureSql -Database $Userdatabasename  -SqlCredential $credential -Query $Query
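As mentioned, the role membership step could instead be done with dbatools. A minimal sketch, assuming the user has already been created by the script above:

# Sketch only - add the newly created contained user to the db_datareader role
Add-DbaDbRoleMember -SqlInstance $AzureSql -Database $Userdatabasename -Role db_datareader -User 'rob@sewells-consulting.co.uk' -Confirm:$false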

Let's check the users on the database with dbatools

Get-DbaDbUser -SqlInstance $AzureSql -Database $Userdatabasename | Out-GridView

I have my user and it is of type External user. Let's see if I can connect.

Bingo 🙂

Happy Automating

Because I don't like to see awesome people struggling with PowerShell

Here is the same code using just the Az.KeyVault module

$appid = (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "service-principal-guid").SecretValueText
$Clientsecret = (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "service-principal-secret").SecretValue
$credential = New-Object System.Management.Automation.PSCredential ($appid,$Clientsecret)
$tenantid = (Get-AzKeyVaultSecret -vaultName "beard-key-vault" -name "Tenant-Id").SecretValueText
$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $databasename  -SqlCredential $credential -Tenant $tenantid  -TrustServerCertificate 
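Note that the SecretValueText property was removed in version 2.0 of the Az.KeyVault module; on newer versions you would retrieve the plain-text values with the -AsPlainText switch instead, like this:

# On Az.KeyVault 2.0 and later
$appid = Get-AzKeyVaultSecret -VaultName "beard-key-vault" -Name "service-principal-guid" -AsPlainText
$tenantid = Get-AzKeyVaultSecret -VaultName "beard-key-vault" -Name "Tenant-Id" -AsPlainText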


Building Azure SQL Db with Terraform using Azure DevOps

In my last post I showed how to create a Resource Group and an Azure SQLDB with Terraform using Visual Studio Code to deploy.

Of course, I haven't stopped there. Who wants to manually run code to create things? There was a lot of "install this" and "set up that". I would rather give the code to a build system and get it to run it. I can then even set it to automatically deploy new infrastructure when I commit some code to alter the configuration.

This scenario, though, is to build environments for presentations. Last time I created an Azure SQL DB and tagged it with DataInDevon. (By the way, you can get tickets for Data In Devon here – it is in Exeter on April 26th and 27th.)

If I want to create the same environment but give it tags for a different event (this way I know when I can delete resources in Azure!) or name it differently, I can use Azure DevOps and alter the variables. I could just alter the code, commit the change, and trigger a build, or I could create variables and enable them to be set at the time the job is run. I use the former in "work" situations and the latter for my presentations environment.

I have created a project in Azure DevOps for my Presentation Builds. I will be using GitHub to share the code that I have used. Once I clicked on pipelines, this is the page I saw

Clicking new pipeline, Azure DevOps asked me where my code was

I chose GitHub, authorised and chose the repository.

I then chose Empty Job on the next page. See the Configuration as code choice? We will come back to that later and our infrastructure as code will be deployed with a configuration as code 🙂

The next page allows us to give the build a good name and choose the Agent Pool that we want to use. Azure DevOps provides seven different hosted agents running Linux, macOS or Windows, or you can download an agent and run it on your own machines. We will use the default agent for this process.

Clicking on Agent Job 1 enables me to change the name of the Agent Job. I could also choose a different type of agent for different jobs within the same pipeline. This would be useful for testing different OSes, for example, but for right now I shall just name it properly.

State

First we need somewhere to store the state of our build so that if we re-run it, the Terraform plan step will be able to work out what it needs to do. (This is not absolutely required just for building my presentation environments, and this might not be the best way to achieve it, but for right now this is what I do and it works.)

I click on the + and search for Azure CLI.

and click on the Add button which gives me some boxes to fill in.

I choose my Azure subscription from the first drop down and choose Inline Script from the second

Inside the script block I put the following code

# The following script will create an Azure resource group, a storage account and a storage container which will be used to store the Terraform state
call az group create --location $(location) --name $(TerraformStorageRG)

call az storage account create --name $(TerraformStorageAccount) --resource-group $(TerraformStorageRG) --location $(location) --sku Standard_LRS

call az storage container create --name terraform --account-name $(TerraformStorageAccount)

This will create a resource group, a storage account and a container, using some variables to provide the values; we will come back to the variables later.

Access Key

The next thing that we need to do is to enable the job to access the storage account. We don't want to store that key anywhere, but we can use our Azure DevOps variables and some PowerShell to gather the access key and write it to the variable while the job is running. To create the variables I clicked on the Variables tab

and then added the variables with the following names: TerraformStorageRG, TerraformStorageAccount and location from the previous task, and TerraformStorageKey for the next task.

With those created, I go back to Tasks and add an Azure PowerShell task

I then add this code to get the access key and overwrite the variable.

# Using this script we will fetch the storage key which is required in the Terraform file to authenticate to the backend storage account

$key=(Get-AzureRmStorageAccountKey -ResourceGroupName $(TerraformStorageRG) -AccountName $(TerraformStorageAccount)).Value[0]

Write-Host "##vso[task.setvariable variable=TerraformStorageKey]$key"

Infrastructure as Code

In my GitHub repository I now have the following folders

The manual folders hold the code from the last blog post. In the Build folder, the main.tf file is identical and looks like this.

provider "azurerm" {
    version = "=1.24.0"
}

terraform {
  backend "azurerm" {
    key = "terraform.tfstate"
  }
}

resource "azurerm_resource_group" "presentation" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_server" "presentation" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.presentation.name}"
  location                     = "${var.location}"
  version                      = "12.0"
  administrator_login          = "__SQLServerAdminUser__"
  administrator_login_password = "__SQLServerAdminPassword__"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_database" "presentation" {
  name                = "${var.SqlDatabaseName}"
  resource_group_name = "${azurerm_sql_server.presentation.resource_group_name}"
  location            = "${var.location}"
  server_name         = "${azurerm_sql_server.presentation.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"

  tags = {
    environment = "${var.presentation}"
  }
}

The variables.tf file looks like this.

variable "presentation" {
    description = "The name of the presentation - used for tagging Azure resources so I know what they belong to"
    default = "__Presentation__"
}

variable "ResourceGroupName" {
  description = "The Prefix used for all resources in this example"
  default     = "__ResourceGroupName__"
}

variable "location" {
  description = "The Azure Region in which the resources in this example should exist"
  default     = "__location__"
}

variable "SqlServerName" {
  description = "The name of the Azure SQL Server to be created or to have the database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "__SqlServerName__"
}

variable "SQLServerAdminUser" {
  description = "The name of the Azure SQL Server Admin user for the Azure SQL Database"
  default     = "__SQLServerAdminUser__"
}
variable "SQLServerAdminPassword" {
  description = "The Azure SQL Database users password"
  default     = "__SQLServerAdminPassword__"
}
variable "SqlDatabaseName" {
  description = "The name of the Azure SQL database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "__SqlDatabaseName__"
}


variable "Edition" {
  description = "The Edition of the Database - Basic, Standard, Premium, or DataWarehouse"
  default     = "__Edition__"
}

variable "ServiceObjective" {
  description = "The Service Tier S0, S1, S2, S3, P1, P2, P4, P6, P11 and ElasticPool"
  default     = "__ServiceObjective__"
}

It is exactly the same except that the values have been replaced by the value name prefixed and suffixed with __. This will enable me to replace the values with the variables in my Azure DevOps Build job.

The backend-config.tfvars file will store the details of the state storage that was created by the first step and use the access key that was retrieved in the second step.

resource_group_name = "__TerraformStorageRG__"

storage_account_name = "__TerraformStorageAccount__"

container_name = "terraform"

access_key = "__TerraformStorageKey__"

I need to add the following variables to my Azure DevOps build: Presentation, ResourceGroupName, SqlServerName, SQLServerAdminUser, SQLServerAdminPassword, SqlDatabaseName, Edition and ServiceObjective. Personally, I would advise setting the password and any other sensitive values to secret by clicking the padlock for that variable. This will stop the value being written to the log as well as hiding it behind *'s.

Because I have tagged the variables with Settable at queue time, I can set the values whenever I run a build, so if I am at a different event I can change the name.

But the build job hasn’t been set up yet. First we need to replace the values in the variables file.

Replace the Tokens

I installed the Replace Tokens Task from the marketplace and added that to the build.

I am going to use a standard naming convention for my infrastructure code files, so I add Build to the Root Directory. You can also click the ellipses and navigate to a folder in your repo. In the Target Files I add "**/*.tf" and "**/*.tfvars", which will search all of the folders (**) and only work on files with a .tf or .tfvars extension. The next step is to make sure that the replacement prefix and suffix are correct. This setting is hidden under Advanced.

Because I often forget this step, and to aid in troubleshooting, I add another step to read the contents of the files and place them in the logs. I do this by adding a PowerShell step which uses

Get-ChildItem .\Build -Recurse

Get-Content .\Build\*.tf 
Get-Content .\Build\*.tfvars 

Under Control Options there is a check box to enable or disable the steps, so once I know that everything is OK with the build I will disable this step. The output in the log of a build will look like this, showing the actual values in the files. This is really useful for finding spaces :-).

Running the Terraform in Azure DevOps

With everything set up we can now run the Terraform. I installed the Terraform task from the marketplace and added a task. We are going to follow the same process as the last blog post, init, plan, apply but this time we are going to automate it 🙂

First we will initialise

I put Build in the Terraform Template path. The Terraform arguments are

init -backend-config="0-backend-config.tfvars"

which will tell Terraform to use the backend-config.tfvars file for the state. It is important to tick the Install terraform checkbox to ensure that Terraform is available on the agent, and to add the Azure Subscription (or Service Endpoint in a corporate environment).

After the Initialise, I add the Terraform task again, add Build to the target path, and this time the argument is plan.

Again, tick the Install terraform checkbox and also Use Azure Service Endpoint, and choose the Azure Subscription.

We also need to tell Terraform where to find the tfstate file by specifying the variables for the resource group, the storage account and the container.

Finally, add another Terraform task for the apply, remembering to tick the Install terraform and Use Azure checkboxes.

The arguments are

apply -auto-approve

This will negate the requirement for the “Only “yes” will be accepted to approve” from the manual steps post!

Build a Thing

Now we can build the environment – Clicking Save and Queue

opens this dialogue

where the variables can be filled in.

The build will be queued and clicking on the build number will open the logs

6 minutes later the job has finished

and the resources have been created.

If I want to look in the logs of the job I can click on one of the steps and take a look. This is the apply step

Do it Again For Another Presentation

So that is good, I can create my environment as I want it. Once my presentation has finished I can delete the Resource Groups. When I need to do the presentation again, I can queue another build and change the variables

The job will run

and the new resource group will be created

all ready for my next presentation 🙂

This is brilliant, I can set up the same solution for different repositories for different presentations (infrastructure) and recreate the above steps.

The next post will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

The first post showed how to build an Azure SQLDB with Terraform using VS Code

Building Azure SQL Db with Terraform with Visual Studio Code

I have been using Terraform for the last week or so to create some infrastructure and decided to bring that knowledge back to a problem that I and others suffer from – building environments for presentations, all for the sake of doing some learning.

What is Terraform?

According to the website


HashiCorp Terraform enables you to safely and predictably create, change, and improve infrastructure. It is an open source tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned


https://www.terraform.io/

This means that I can define my infrastructure as code. If I can do that then I can reliably do the same thing again and again, at work to create environments that have the same configuration or outside of work to repeatedly build the environment I need.

Building an Azure SQL Database with Terraform

To understand how to build a thing, the best place to start is the documentation at https://www.terraform.io/docs. For an Azure SQL DB, in the docs you will find a block of code that looks like this

resource "azurerm_resource_group" "test" {
  name     = "acceptanceTestResourceGroup1"
  location = "West US"
}

resource "azurerm_sql_server" "test" {
  name                         = "mysqlserver"
  resource_group_name          = "${azurerm_resource_group.test.name}"
  location                     = "West US"
  version                      = "12.0"
  administrator_login          = "4dm1n157r470r"
  administrator_login_password = "4-v3ry-53cr37-p455w0rd"
}

resource "azurerm_sql_database" "test" {
  name                = "mysqldatabase"
  resource_group_name = "${azurerm_resource_group.test.name}"
  location            = "West US"
  server_name         = "${azurerm_sql_server.test.name}"

  tags = {
    environment = "production"
  }
}

If you read the code, you can see that there are key-value pairs defining information about the resource that is being created. Anything inside a ${} is a dynamic reference. So

  resource_group_name          = "${azurerm_resource_group.test.name}"

refers to the name property in the azurerm_resource_group block called test (or the name of the resource group 🙂 )

Infrastructure As Code

So I can put that code into a file (name it main.tf), alter it with my values, and "run Terraform", and what I want will be created. Let's take it a step further though, because I want to be able to reuse this code. Instead of hard-coding all of the values I am going to use variables. I can do this by creating another file called variables.tf which looks like

variable "presentation" {
    description = "The name of the presentation - used for tagging Azure resources so I know what they belong to"
    default = "dataindevon"
}

variable "ResourceGroupName" {
  description = "The Resource Group Name"
  default     = "beardrules"
}

variable "location" {
  description = "The Azure Region in which the resources in this example should exist"
  default     = "uksouth"
}

variable "SqlServerName" {
  description = "The name of the Azure SQL Server to be created or to have the database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "jeremy"
}
variable "SQLServerAdminUser" {
  description = "The name of the Azure SQL Server Admin user for the Azure SQL Database"
  default     = "Beard"
}
variable "SQLServerAdminPassword" {
  description = "The Azure SQL Database users password"
  default     = "JonathanlovesR3ge%"
}

variable "SqlDatabaseName" {
  description = "The name of the Azure SQL database on - needs to be unique, lowercase between 3 and 24 characters including the prefix"
  default     = "jsdb"
}

variable "Edition" {
  description = "The Edition of the Database - Basic, Standard, Premium, or DataWarehouse"
  default     = "Standard"
}

variable "ServiceObjective" {
  description = "The Service Tier S0, S1, S2, S3, P1, P2, P4, P6, P11 and ElasticPool"
  default     = "S0"
}

and my main.tf then looks like this.

provider "azurerm" {
    version = "=1.24.0"
}

resource "azurerm_resource_group" "presentation" {
  name     = "${var.ResourceGroupName}"
  location = "${var.location}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_server" "presentation" {
  name                         = "${var.SqlServerName}"
  resource_group_name          = "${azurerm_resource_group.presentation.name}"
  location                     = "${var.location}"
  version                      = "12.0"
  administrator_login          = "${var.SQLServerAdminUser}"
  administrator_login_password = "${var.SQLServerAdminPassword}"
    tags = {
    environment = "${var.presentation}"
  }
}

resource "azurerm_sql_database" "presentation" {
  name                = "${var.SqlDatabaseName}"
  resource_group_name = "${azurerm_sql_server.presentation.resource_group_name}"
  location            = "${var.location}"
  server_name         = "${azurerm_sql_server.presentation.name}"
  edition                          = "${var.Edition}"
  requested_service_objective_name = "${var.ServiceObjective}"

  tags = {
    environment = "${var.presentation}"
  }
}

You can find these files in my GitHub Repository here.

Alright – deploy something

To deploy the code that I have written I need to download Terraform from https://www.terraform.io/downloads.html and then extract the exe to a folder in my PATH (I chose C:\Windows). Then in Visual Studio Code I installed two extensions: the Terraform extension by Mikael Olenfalk, which enables syntax highlighting and auto-completion for the .tf files, and the Azure Terraform extension. You will also need Node.js from here.

With those in place I navigated to the directory holding my files in Visual Studio Code and pressed F1 and started typing azure terraform and chose Azure Terraform Init

I was then prompted to use Cloud Shell and a browser opened to log in. Once I had logged in, I waited until I saw this

I press F1 again and this time choose Azure Terraform plan. This is going to show me what Terraform is going to do if it applies this configuration.

You can see what is going to be created. It is going to create 3 things.

Once you have checked that the plan is what you want, press F1 again and choose Azure Terraform Apply

You are then asked to confirm that this is what you want. Only “yes” will be accepted. Then you will see the infrastructure being created

and a minute later

and Jeremy exists in the beardrules resource group

Then, once I have finished using the SQL instance, I can press F1 again and choose Azure Terraform Destroy. Again, there is a confirmation required.

and you will see the progress for 46 seconds

and all of the resources have gone.

That's a good start. This enables me to create resources quickly and easily, keep the configuration for them safely in source control, and make them easy to reuse.
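If you would rather not use the extension, the same workflow from a terminal in the folder holding the .tf files is just the underlying Terraform commands, which the F1 commands above essentially wrap:

terraform init      # download the azurerm provider
terraform plan      # show what would be created
terraform apply     # create the resources (asks for "yes")
terraform destroy   # remove everything again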

In my next post I will create an Azure DevOps pipeline to deploy an Azure SQL DB with Terraform.

The post after will show how to use Azure DevOps Task Groups to use the same build steps in multiple pipelines and build an Azure Linux SQL Server VM

The post after that will show how to use Azure DevOps templates to use the same build steps across many projects and build pipelines and will build a simple AKS cluster

How I created PowerShell.cool using Flow, Azure SQL DB, Cognitive Services & PowerBi

Last weekend I was thinking about how to save the tweets for PowerShell Conference Europe. This annual event occurs in Hanover and this year it is on April 17-20, 2018. The agenda has just been released and you can find it on the website http://www.psconf.eu/

I ended up creating an interactive PowerBi report to which my good friend and Data Platform MVP Paul Andrew b | t added a bit of magic and I published it. The magnificent Tobias Weltner b | t who organises PSConfEU pointed the domain name http://powershell.cool at the link. It looks like this.

During the monthly #PSTweetChat

I mentioned that I need to blog about how I created it and Jeff replied

so here it is! Looking forward to seeing the comparison between the PowerShell and DevOps Summit and the PowerShell Conference Europe 🙂

This is an overview of how it works

You will find all of the resources and the scripts to do all of the below in the GitHub repo, so clone it and navigate to the file path.

Create Database

First let's create a database. Connect to your Azure subscription

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

 Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YourSubscriptionIDHere


Then set some variables

# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
# The logical server name: Use a random value or replace with your own value (do not capitalize)
$servername = "server-$(Get-Random)"
# Set an admin login and password for your database
# The login information for the server. You need to set these and uncomment them - don't use these values

# $adminlogin = "ServerAdmin"                
# $password = "ChangeYourAdminPassword1"

# The ip address range that you want to allow to access your server - change as appropriate
# $startip = "0.0.0.0"
# $endip = "0.0.0.0"

# To just add your own IP Address
$startip = $endip = (Invoke-WebRequest 'http://myexternalip.com/raw').Content -replace "`n"

# The database name
$databasename = "tweets"

$AzureSQLServer = "$servername.database.windows.net,1433"
$Table = "table.sql"
$Proc = "InsertTweets.sql"

They should all make sense. Take note that you need to set and uncomment the login and password and choose which IPs to allow through the firewall.
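One thing to watch if you run these snippets straight down the page: the Create a SQL Server step below uses $SqlAdministratorCredentials, which the post only builds later in the Run the scripts section. You can create it now from the login and password you have just set; it is the same code as appears later:

# Build the admin credential from the $adminlogin and $password set above
$SqlAdministratorCredentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $adminlogin, (ConvertTo-SecureString -String $password -AsPlainText -Force)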

Create a Resource Group

## Create a resource group

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location


Create a SQL Server

## Create a Server

$newAzureRmSqlServerSplat = @{
    SqlAdministratorCredentials = $SqlAdministratorCredentials
    ResourceGroupName = $resourcegroupname
    ServerName = $servername
    Location = $location
}
New-AzureRmSqlServer @newAzureRmSqlServerSplat


Create a firewall rule. I just use my own IP and also add the allow Azure IPs rule.

$newAzureRmSqlServerFirewallRuleSplat = @{
    EndIpAddress = $endip
    StartIpAddress = $startip
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    FirewallRuleName = "AllowSome"
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat

# Allow Azure IPS

$newAzureRmSqlServerFirewallRuleSplat = @{
    AllowAllAzureIPs = $true
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
}
New-AzureRmSqlServerFirewallRule @newAzureRmSqlServerFirewallRuleSplat


Create a database

# Create a database

$newAzureRmSqlDatabaseSplat = @{
    ServerName = $servername
    ResourceGroupName = $resourcegroupname
    Edition = 'Basic'
    DatabaseName = $databasename
}
New-AzureRmSqlDatabase  @newAzureRmSqlDatabaseSplat


I have used the dbatools module to run the scripts to create the database. You can get it using

Install-Module dbatools # -Scope CurrentUser # if not admin process

Run the scripts

# Create a credential

$newObjectSplat = @{
    ArgumentList = $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force)
    TypeName = 'System.Management.Automation.PSCredential'
}
$SqlAdministratorCredentials = New-Object @newObjectSplat

## Using dbatools module

$invokeDbaSqlCmdSplat = @{
    SqlCredential = $SqlAdministratorCredentials
    Database = $databasename
    File = $Table,$Proc
    SqlInstance = $AzureSQLServer
}
Invoke-DbaSqlCmd @invokeDbaSqlCmdSplat


This will have created the following in Azure; you can see it in the portal


You can connect to the database in SSMS and you will see


Create Cognitive Services

Now you can create the Text Analysis Cognitive Services API

First log in (if you need to) and set some variables

## This creates cognitive services for analysing the tweets

## Log in to your Azure subscription using the Add-AzureRmAccount command and follow the on-screen directions.

Add-AzureRmAccount

## Select the subscription

Set-AzureRmContext -SubscriptionId YOUR SUBSCRIPTION ID HERE

#region variables
# The data center and resource name for your resources
$resourcegroupname = "twitterresource"
$location = "WestEurope"
$APIName = 'TweetAnalysis'
#endregion

Then create the API and get the key

#Create the cognitive services

$newAzureRmCognitiveServicesAccountSplat = @{
    ResourceGroupName = $resourcegroupname
    Location = $location
    SkuName = 'F0'
    Name = $APIName
    Type = 'TextAnalytics'
}
New-AzureRmCognitiveServicesAccount @newAzureRmCognitiveServicesAccountSplat

# Get the Key

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat 

You will need to accept the prompt


Copy the Endpoint URL as you will need it. Then save one of the keys for the next step!


Create the Flow

I have exported the Flow to a zip file and also the JSON for a PowerApp (no details about that in this post). Both are available in the GitHub repo. I have submitted a template but it is not available yet.

Navigate to https://flow.microsoft.com/ and sign in

Creating Connections

You will need to set up your connections. Click New Connection and search for Text


Click Add and fill in the Account Key and the Site URL from the steps above


Click New Connection and search for SQL Server


Enter the SQL Server Name (value of $AzureSQLServer) , Database Name , User Name and Password from the steps above


Click New Connection, search for Twitter and create a connection (the authorisation pop-up may be hidden behind other windows!)

Import the Flow

If you have a premium account you can import the flow. Click Import


and choose the import.zip from the GitHub repo


Click on Create as new and choose a name


Click select during import next to Sentiment and choose the Sentiment connection


Click select during import for the SQL Server connection and choose the SQL Server connection, then do the same for the Twitter connection


Then click import


Create the flow without import

If you do not have a premium account you can still create the flow using these steps. I have created a template but it is not available at the moment. Create the connections as above and then click Create from blank.


Choose the trigger When a new tweet is posted and add a search term. You may need to choose the connection to Twitter by clicking the three dots.


Click Add an action


Search for detect and choose Text Analytics Detect Sentiment


Enter the name for the connection, the account key and the URL from the creation of the API above. If you forgot to copy them

#region Forgot the details

# Copy the URL if you forget to save it

$getAzureRmCognitiveServicesAccountSplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccount @getAzureRmCognitiveServicesAccountSplat).Endpoint | Clip

# Copy the Key if you forgot

$getAzureRmCognitiveServicesAccountKeySplat = @{
    Name = $APIName
    ResourceGroupName = $resourcegroupname
}
(Get-AzureRmCognitiveServicesAccountKey @getAzureRmCognitiveServicesAccountKeySplat).Key1 | Clip

#endregion


Click in the text box and choose Tweet Text


Click New Step and add an action. Search for SQL Server and choose SQL Server – Execute Stored Procedure


Choose the stored procedure [dbo].[InsertTweet]


Fill in as follows

  • __PowerAppsID__ : 0
  • Date : Created At
  • Sentiment : Score
  • Tweet : Tweet Text
  • UserLocation : Location
  • UserName : Tweeted By

as shown below


Give the flow a name at the top and click save flow


Connect PowerBi

Open the PSConfEU Twitter Analysis Direct.pbix from the GitHub repo in PowerBi Desktop. Click the arrow next to Edit Queries and then change data source settings


Click Change source and enter the server (value of $AzureSQLServer) and the database name. It will alert you to apply changes


It will then pop up with a prompt for the credentials. Choose Database, enter your credentials and click connect


and your PowerBi will be populated from the Azure SQL Database 🙂 This will fail if there are no records in the table because your flow hasn't run yet. If it does, just wait until you see some tweets and then click apply changes again.

You will probably want to alter the pictures and links etc. and then you can publish the report.

Happy Twitter Analysis

Don't forget to keep an eye on your flow runs to make sure they have succeeded.