Developing an Azure Service Bus PowerShell Module – Part 1
This is part one of a multi-part post. Here are links to all the posts in this series.
Developing an Azure Service Bus PowerShell Module – Part 1
Part 2 – Developing the First Test Using Pester
Part 3 – More Tests, this time with Pester's Mock
Part 4 – Coding the PowerShell Module
Introduction
On my current project, we've had a work item on our backlog for some time to provide a way for us to receive and submit messages to Azure Service Bus queues. What we really need is a way to receive dead-letter messages and re-submit them after fixing the underlying issue that caused them to dead-letter in the first place.
We have been using the excellent Service Bus Explorer for this purpose. Unfortunately, that tool doesn't serialize the message body quite right when we re-submit batches of messages; we could only repair and re-submit messages one at a time. So, my team decided we needed to build our own tooling for this purpose. We didn't want to build anything too fancy, a couple of PowerShell cmdlets would be perfect for what we were envisioning. I took on the task because I wanted to test drive Pester, PowerShell's unit test framework.
Starting with the Goal in Mind
I figure we need two cmdlets:
- Receive-QueueMessage
- Submit-QueueMessage
Receive and Submit seem like good names as they work well with the semantics of queues. Unfortunately, Receive and Submit are not standard PowerShell verbs, so I decided to go with the following:
- Read-QueueMessage
- New-QueueMessage
The parameter sets of the cmdlets should look something like this:
Read-QueueMessage -Namespace <string> -QueueName <string> -Deadletter <flag> -Count <int> -CorrelationId <guid> -Peek <flag> -PolicyName <string> -SKey <string>
New-QueueMessage -Namespace <string> -QueueName <string> -Message <string> -CorrelationId <guid> -PolicyName <string> -Key <string>
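To make this a little more concrete, here is one possible shape of the Read-QueueMessage parameter block. This is only a rough sketch; which parameters are mandatory, their types and the default values are my assumptions, and the real implementation comes later in the series.
[powershell]
function Read-QueueMessage
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true)][string]$Namespace,
        [Parameter(Mandatory = $true)][string]$QueueName,
        [switch]$Deadletter,
        [int]$Count = 1,
        [guid]$CorrelationId,
        [switch]$Peek,
        [Parameter(Mandatory = $true)][string]$PolicyName,
        [Parameter(Mandatory = $true)][string]$SKey
    )
    # Implementation to follow later in the series.
}
[/powershell]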
Creating the Project
My team uses Git and Visual Studio Team Services for our source control repository. I started by creating a feature branch in Git by branching off the main develop branch.

Next I created the project. I used the PowerShell Module Project template and named the project Avanade.ServiceBus, which will be the name of the module.
Note: You will need to install PowerShell Tools for Visual Studio 2017 for this tooling to show up.

The project template creates a module file, a Pester test file and the module manifest file. Initially, they don't do very much. I'm going to start by stubbing out the functions that will become my cmdlets; to begin with, all they will do is throw an exception.
[powershell]
<# Read-QueueMessage #>
function Read-QueueMessage
{
    throw [NotImplementedException]
}

<# New-QueueMessage #>
function New-QueueMessage
{
    throw [NotImplementedException]
}

Export-ModuleMember -Function Read-QueueMessage
Export-ModuleMember -Function New-QueueMessage
[/powershell]
Next, I coded the Pester test file. The first thing we need is some boilerplate code so that the test file will load the module it is testing.
[powershell]
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force
[/powershell]
Next, I add two simple tests. The tests don't do anything other than call the module functions, which will throw the NotImplementedException. That will indicate the test file is wired up to the module file correctly. The test file looks like this.
[powershell]
#
# This is a PowerShell Unit Test file.
# You need a unit test framework such as Pester to run PowerShell Unit tests.
# You can download Pester from http://go.microsoft.com/fwlink/?LinkID=534084
#
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force

Describe "Read-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            Read-QueueMessage
        }
    }
}

Describe "New-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            New-QueueMessage
        }
    }
}
[/powershell]
Finally, I right-click the test file in Solution Explorer and choose Execute as Script.

The results show up in the PowerShell Interactive Window.

As expected, both tests are failing because the stubbed module functions are throwing the NotImplementedException. This tells me that the test file is wired up to the module file and everything is working as expected.
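If you prefer to run the tests outside of Visual Studio, the same file can be executed from any PowerShell prompt with Pester's own runner (assuming the Pester module is installed):
[powershell]
# Runs every *.Tests.ps1 file found under the current directory
Invoke-Pester
[/powershell]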
This is a good time to commit my changes.

This post covered getting started creating a PowerShell module using Visual Studio 2017. It highlighted using Visual Studio's Git integration for source control operations and Test-Driven Development using Pester. In the next post, I will add additional tests and explore Pester's capabilities in depth.
Avanade Announces New Microsoft Azure Stack Solution
Great news! We can now talk about the new Azure Stack offering that many of us have been working on for many months now. Here is the official press release: http://www.prnewswire.com/news-releases/avanade-announces-new-microsoft-azure-stack-solution-300469372.html
Azure Service Bus Monitoring and Alerting using Azure Function and Application Insights
Having designed and architected solutions for our clients on the Azure cloud for many years, we know that Service Bus plays an integral part in most application architectures whenever a messaging layer is involved. At the same time, we also know that there is no straight answer when customers ask us about the native monitoring and alerting capabilities of Service Bus. For visual dashboards, you need to drill down to the overview section of the queue blade.

For diagnostics, only operational logs are available natively.
There are a few third-party products on the market that have built a good story around monitoring and alerting for Azure Service Bus, but they come at an additional cost.
In the quest to answer our customers' question of how to get monitoring and alerting capabilities for Azure Service Bus, I figured out that the answer lies within Azure itself. This blog post illustrates a proof-of-concept solution built as part of one of our customer engagements. The PoC solution uses native Azure services, including:
- Service Bus
- Functions
- Application Insights
- Application Insights Analytics
- Application Insights Alerts
- Dashboard
The only service that would add cost to your monthly Azure bill would be Functions (assuming Application Insights is already part of your application architecture). You would need to weigh the cost of purchasing a third-party monitoring product against the Functions cost.
Let's dive into the actual solution.
Step 1: Create an Azure Service Bus Queue
This is of course a prerequisite, since we will be monitoring and alerting around this queue. For the PoC, I created a queue (named queue2) under a Service Bus namespace with the root manage shared access key. I also filled up the queue using one of my favorite tools, Service Bus Explorer.

Step 2: Create an Azure Function
The next step is to create a function. The function's logic is to:
- Query the service bus to fetch all the queues and topics available under it.
- Get the count of active and dead letter messages
- Create custom telemetry metric
- And finally log the metric to Application Insight
I chose to use C#, but other languages are available. I also configured the function to trigger every 5 seconds, so it's almost real time.
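For reference, a five-second schedule is expressed as a six-field CRON expression in the timer trigger binding. A minimal function.json might look like this (a sketch; your binding name and settings may differ):
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "*/5 * * * * *"
    }
  ],
  "disabled": false
}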

Step 3: Add Application Insight to Function
Application Insights will be used by the function to log the Service Bus telemetry. Create or reuse an Application Insights instance and use its instrumentation key in the C# code. I have pasted below the function code used in my PoC. The logging part of the code relies on the custom metrics concept of Application Insights. For the PoC, I created two custom metrics: "Active Message Count" and "Dead Letter Count".
Sample Function:
#r "Microsoft.ServiceBus"
using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System.Text.RegularExpressions;
using System.Net.Http;
using static System.Environment;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
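// Walk every topic/subscription and every queue in the namespace and log their message counts.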
var namespaceManager = NamespaceManager.CreateFromConnectionString(
Env("ServiceBusConnectionString"));
foreach(var topic in await namespaceManager.GetTopicsAsync())
{
foreach(var subscription in await namespaceManager.GetSubscriptionsAsync(topic.Path))
{
await LogMessageCountsAsync(
$"{Escape(topic.Path)}.{Escape(subscription.Name)}",
subscription.MessageCountDetails, log);
}
}
foreach(var queue in await namespaceManager.GetQueuesAsync())
{
await LogMessageCountsAsync(Escape(queue.Path),
queue.MessageCountDetails, log);
}
}
private static async Task LogMessageCountsAsync(string entityName,
MessageCountDetails details, TraceWriter log)
{
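// Send the counts to Application Insights as custom metrics, plus a trace that carries them as properties.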
var telemetryClient = new TelemetryClient();
telemetryClient.InstrumentationKey = "YOUR INSTRUMENTATION KEY";
var telemetry = new TraceTelemetry(entityName);
telemetry.Properties.Add("Active Message Count", details.ActiveMessageCount.ToString());
telemetry.Properties.Add("Dead Letter Count", details.DeadLetterMessageCount.ToString());
telemetryClient.TrackMetric(new MetricTelemetry("Active Message Count", details.ActiveMessageCount));
telemetryClient.TrackMetric(new MetricTelemetry("Dead Letter Count", details.DeadLetterMessageCount));
telemetryClient.TrackTrace(telemetry);
}
private static string Escape(string input) => Regex.Replace(input, @"[^A-Za-z0-9]+", "_");
private static string Env(string name) => GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);
Step 4: Test your function
The next step is to test your function by running it. If everything is set up right, you should start seeing telemetry in Application Insights. When you select one of the traces, you should be able to view "Active Message Count" and "Dead Letter Count" under custom data. In the screenshot below, my queue2 has 17 active messages and 0 dead-letter messages.

Step 5: Add an Application Insight Analytics Query
The next step is to use Application Insights Analytics to render a Service Bus chart for monitoring. From the Application Insights blade, click the Analytics icon. AI Analytics is a separate portal with a query window. You need to write a query that renders a time chart for a queue based on those custom metrics. You can use the sample query below as a start.
Sample Query:
traces
| where message has 'queue2'
| extend activemessagecount = todouble(customDimensions.["Active Message Count"])
| summarize avg(timestamp) by activemessagecount
| order by avg_timestamp asc
| render timechart

Step 6: Publish the Chart to a Dashboard
The AI Analytics chart can be published (via the pin icon) to an Azure dashboard, which lets monitoring users keep an eye on the Service Bus metrics whenever they log in to the Azure portal and removes the need to drill down into the Service Bus blade.
Refer to this to learn more about creating and publishing charts to dashboards.

Step 7: Add Alerts on the Custom Metrics
The last step is to create Application Insights alerts. For the PoC, I created two alerts, on "Active Message Count" and "Dead Letter Count", each with a threshold. These will alert monitoring users by email if the message count exceeds the threshold. You can also send these alerts to external monitoring tools via a webhook.

Attached is a sample email from an Azure Application Insights alert:

Hopefully these steps give you an idea of how the custom solution above, built entirely from native Azure services, can provide basic monitoring and alerting capabilities for Service Bus and, for that matter, other Azure services as well. The key is to define the custom metrics you want to monitor against and then set up the solution.
Azure Table Storage and PowerShell, The Hard Way
In my previous post I gave a quick overview of the Shared Key authentication scheme used by the Azure storage service and demonstrated how to authenticate and access the BLOB storage API through PowerShell. The file and queue services follow an authentication scheme that aligns with the BLOB requirements; the table service, however, is a bit different. I felt it might help the more tortured souls out there (like myself) if I tried to describe the nuances.
Azure Storage REST API, Consistently Inconsistent
Like the REST of all things new Microsoft (read Azure), the mantra is consistency. From a modern administrative perspective you should have a consistent experience across whatever environment and toolset you require. If you are a traditional administrator/engineer of the Microsoft stack, the tooling takes the form of PowerShell cmdlets. If you use Python, bash, etc., there is effectively equivalent tooling available. My gripes notwithstanding, I think Microsoft has done a tremendous job in this regard. I also make no claim that my preferences are necessarily the correct ones. The 'inconsistencies' I will be discussing are not really issues for you if you use the mainline SDK(s). As usual, I'll be focusing on how things work behind the scenes and my observations.
Shared Key Authentication, but Not All Are Equal
In exploring the shared key authentication to the BLOB REST API, we generated and encoded the HTTP request signature. The string we needed to encode looked something like this:
GET
/*HTTP Verb*/
/*Content-Encoding*/
/*Content-Language*/
/*Content-Length (include value when zero)*/
/*Content-MD5*/
/*Content-Type*/
/*Date*/
/*Range*/
x-ms-date:Sun, 11 Oct 2009 21:49:13 GMT
x-ms-version:2009-09-19
/*CanonicalizedHeaders*/
/myaccount/mycontainer
comp:metadata
restype:container
timeout:20
The table service takes a much simpler and yet arcane format that is encoded in an identical fashion.
GET
application/json;odata=nometadata
Mon, 15 May 2017 17:29:11 GMT
/billing73d55f68/fabriclogae0bced538344887a4021ae5c3b61cd0GlobalTime(PartitionKey='407edc6d872271f853085a7a18387784',RowKey='02519075544040622622_407edc6d872271f853085a7a18387784_0_2952_2640')

In this case there are far fewer headers and query parameters to deal with, however there are now fairly rigid requirements. A Date header must be specified, as opposed to either Date or x-ms-date (or both) in the BLOB case. A Content-Type header must also be specified as part of the signature, and no additional header details are required. The canonical resource component is very different from the BLOB service. The canonical resource still takes a format of <storage account name>/<table name>/<query parameters>. At the table service level only the comp query parameter is to be included. As an example, to query the table service properties for the storage account the request would look something like https://myaccount.table.core.windows.net?restype=service&comp=properties. The canonical resource would be /myaccount/?comp=properties.
Generating the Signature with PowerShell
We will reuse our encoding function from the previous post and include a new method for generating the signature.
Function EncodeStorageRequest
{
[CmdletBinding()]
param
(
[Parameter(Mandatory = $true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
[String[]]$StringToSign,
[Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
[String]$SigningKey
)
PROCESS
{
foreach ($item in $StringToSign)
{
$KeyBytes = [System.Convert]::FromBase64String($SigningKey)
$HMAC = New-Object System.Security.Cryptography.HMACSHA256
$HMAC.Key = $KeyBytes
$UnsignedBytes = [System.Text.Encoding]::UTF8.GetBytes($item)
$KeyHash = $HMAC.ComputeHash($UnsignedBytes)
$SignedString=[System.Convert]::ToBase64String($KeyHash)
Write-Output $SignedString
}
}
}
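The GetTableTokenStringToSign helper used in the calls below is not reproduced in the post. A minimal sketch, following the table-service Shared Key string to sign (the verb, an often-empty Content-MD5, the Content-Type, the Date and the canonical resource, each on its own line), might look like the following; the optional ContentMD5 parameter is my addition for completeness.
Function GetTableTokenStringToSign
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$Verb,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [System.Uri]$Resource,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$Date,
        [Parameter(Mandatory=$false,ValueFromPipelineByPropertyName=$true)]
        [String]$ContentType,
        [Parameter(Mandatory=$false,ValueFromPipelineByPropertyName=$true)]
        [String]$ContentMD5
    )
    # The account name is the first label of the host name.
    $AccountName=$Resource.Host.Split('.') | Select-Object -First 1
    # The canonical resource is /<account><path>; of the query string, only comp is carried over.
    $CanonicalResource="/$AccountName$($Resource.AbsolutePath)"
    if($Resource.Query -match '[?&]comp=([^&]+)')
    {
        $CanonicalResource+="?comp=$($Matches[1])"
    }
    # Verb, Content-MD5, Content-Type, Date and canonical resource, newline delimited.
    Write-Output ([String]::Join("`n",@($Verb,$ContentMD5,$ContentType,$Date,$CanonicalResource)))
}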
$AccountName='myaccount'
$AccessKey='vyAEEzbcnIAkLKti1leDbfrAOQBu5bx52zyCkW0fGIBCsS+DDGXpfidOeAWyg7do8ujft1mFhnz9kmliycmiXA=='
$Uri="https://$AccountName.table.core.windows.net/tables"
$SignatureParams=@{
Resource=$Uri;
Date=[DateTime]::UtcNow.ToString('R');
Verb='GET';
ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
'x-ms-version'= '2016-05-31';
'DataServiceVersion'='3.0;Netfx';
'Accept-Charset'='UTF-8';
'Accept'='application/json;odata=fullmetadata';
'Date'=$SignatureParams.Date;
'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams=@{
Uri=$SignatureParams.Resource;
Method=$SignatureParams.Verb;
Headers=$TableHeaders;
ContentType=$SignatureParams.ContentType;
ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams -Verbose
$Tables=$Response.Content | ConvertFrom-Json | Select-Object -ExpandProperty value
PS C:\WINDOWS\system32> $Tables|fl

odata.type     : acestack.Tables
odata.id       : https://acestack.table.core.windows.net/Tables('provisioninglog')
odata.editLink : Tables('provisioninglog')
TableName      : provisioninglog
The astute reader will notice we had to pass some different headers along. All table requests require a DataServiceVersion or MaxDataServiceVersion header (or both). These values align with maximum versions of the REST API, which I won't bother belaboring. We also retrieved JSON rather than XML; a number of content types are available, and the format returned is dictated by the Accept header. In the example we retrieved it with full OData metadata; other valid types include minimalmetadata and nometadata (atom/xml is returned for earlier data service versions). In another peculiarity, XML is the only format returned when retrieving service properties or stats.
Putting It to Greater Use With Your Old Friend OData
You likely want to actually read some data out of tables. Now that authorizing the request is out of the way, it is a 'simple' matter of applying the appropriate OData query parameters. We will start with retrieving a list of all entities within a table. This will return a maximum of 1000 results (unless limited using the $top parameter), and a link to any subsequent pages of data will be returned in the response headers. In the following example we will query all entities in the fabriclogaeGlobalTime table in the fabrixstuffz storage account. In the interest of brevity I will limit this to 3 results.
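The request itself is not reproduced in the post. A sketch of what it might look like, reusing the helper functions above, is shown here; the table name comes from the text, and the $AccountName and $AccessKey variables are assumed to point at that storage account.
$TableName='fabriclogaeGlobalTime'
$Uri="https://$AccountName.table.core.windows.net/$TableName()?`$top=3"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='GET';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'='2016-05-31';
    'DataServiceVersion'='3.0;Netfx';
    'Accept'='application/json;odata=nometadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$Response=Invoke-WebRequest -Uri $Uri -Method GET -Headers $TableHeaders -ContentType $SignatureParams.ContentType -UseBasicParsing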
This should yield a result looking like this.
Cache-Control: no-cache
Transfer-Encoding: chunked
Content-Type: application/json;odata=nometadata;streaming=true;charset=utf-8
Server: Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: 56afccf3-0002-0104-0285-d382b4000000
x-ms-version: 2016-05-31
X-Content-Type-Options: nosniff
x-ms-continuation-NextPartitionKey: 1!44!NDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODQ-
x-ms-continuation-NextRowKey: 1!88!MDI1MTkwNjc4NDkwNDA1NzI1NjlfNDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODRfMF8yOTUyXzI2NDA-
Date: Tue, 23 May 2017 05:27:28 GMT
{
"value": [
{
"PartitionKey": "407edc6d872271f853085a7a18387784",
"RowKey": "02519067840040580939_407edc6d872271f853085a7a18387784_0_2952_2640",
"Timestamp": "2017-05-23T05:25:55.6307353Z",
"EventType": "Time",
"TaskName": "FabricNode",
"dca_version": -2147483648,
"epoch": "1",
"localTime": "2017-05-23T05:21:07.4129436Z",
"lowerBound": "2017-05-23T05:19:56.173659Z",
"upperBound": "2017-05-23T05:19:56.173659Z"
},
{
"PartitionKey": "407edc6d872271f853085a7a18387784",
"RowKey": "02519067843040711216_407edc6d872271f853085a7a18387784_0_2952_2640",
"Timestamp": "2017-05-23T05:20:53.9265804Z",
"EventType": "Time",
"TaskName": "FabricNode",
"dca_version": -2147483648,
"epoch": "1",
"localTime": "2017-05-23T05:16:07.3678218Z",
"lowerBound": "2017-05-23T05:14:56.1606307Z",
"upperBound": "2017-05-23T05:14:56.1606307Z"
},
{
"PartitionKey": "407edc6d872271f853085a7a18387784",
"RowKey": "02519067846040653329_407edc6d872271f853085a7a18387784_0_2952_2640",
"Timestamp": "2017-05-23T05:15:52.7217857Z",
"EventType": "Time",
"TaskName": "FabricNode",
"dca_version": -2147483648,
"epoch": "1",
"localTime": "2017-05-23T05:11:07.3406081Z",
"lowerBound": "2017-05-23T05:09:56.1664211Z",
"upperBound": "2017-05-23T05:09:56.1664211Z"
}
]
}
You should recognize a relatively standard OData response, with our desired values present within an array as the value property. There are two response headers to note here: x-ms-continuation-NextPartitionKey and x-ms-continuation-NextRowKey. These headers are the continuation token for retrieving the next available value(s). The service returns results in pages with a maximum length of 1000 results, unless limited using the $top query parameter as in the previous example. If one were so inclined, they could continue to send GET requests, including the continuation token(s), until all results are enumerated.
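As a rough sketch (continuing from the request above), the continuation headers are simply echoed back as NextPartitionKey and NextRowKey query parameters on the next signed GET:
$NextPartitionKey=$Response.Headers['x-ms-continuation-NextPartitionKey']
$NextRowKey=$Response.Headers['x-ms-continuation-NextRowKey']
if($NextPartitionKey)
{
    # Re-sign and send the same GET with the continuation token appended to the query string.
    $NextUri="$Uri&NextPartitionKey=$NextPartitionKey&NextRowKey=$NextRowKey"
}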
Creating (or updating) table entities is a slightly different exercise, which can become somewhat convoluted (at least in PowerShell or other scripts). Conceptually, all that is required to create an entity is a POST request to the table resource URI with a body containing the entity and the appropriate required headers. The complexity is primarily a result of the metadata overhead associated with the server OData implementation. We'll examine this by inserting an entity into a fictional customers table.
$TableName='fakecustomers'
$Uri="https://$AccountName.table.core.windows.net/$TableName"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='POST';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'='2016-05-31';
    'DataServiceVersion'='3.0;Netfx';
    'Accept-Charset'='UTF-8';
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$PartitionKey='mypartitionkey'
$RowKey='row771'
$TableEntity=New-Object PSObject -Property @{
    "Address"="Mountain View";
    "Name"="Buckaroo Banzai";
    "Age"=33;
    "AmountDue"=200.23;
    "FavoriteItem"="oscillation overthruster";
    "CustomerCode@odata.type"="Edm.Guid";
    "CustomerCode"="c9da6455-213d-42c9-9a79-3e9149a57833";
    "CustomerSince@odata.type"="Edm.DateTime";
    "CustomerSince"="2008-07-10T00:00:00";
    "IsActive"=$true;
    "NumberOfOrders@odata.type"="Edm.Int64";
    "NumberOfOrders"="255";
    "PartitionKey"=$PartitionKey;
    "RowKey"=$RowKey
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Body=$($TableEntity|ConvertTo-Json);
    Headers=$TableHeaders;
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams
You should end up receiving the inserted object as a response:
PS C:\Windows\system32> $Response.Content | ConvertFrom-Json
PartitionKey : mypartitionkey
RowKey : row772
Timestamp : 2017-05-23T06:17:53.7244968Z
CustomerCode : c9da6455-213d-42c9-9a79-3e9149a57833
FavoriteItem : oscillation overthruster
AmountDue : 200.23
IsActive : True
CustomerSince : 2008-07-10T00:00:00
Name : Buckaroo Banzai
NumberOfOrders : 255
Age : 33
Address : Mountain View
You should notice that the object we submitted had some extra properties not present on the inserted entity. The API requires that for any entity property where the (.Net) data type cannot be automatically inferred, a type annotation must be specified. In this case CustomerCode=c9da6455-213d-42c9-9a79-3e9149a57833 is a GUID (as opposed to a string), so it requires the companion property CustomerCode@odata.type=Edm.Guid. If you would like a more complete explanation, the format is detailed here.
Three ways to do the same thing
You've got to give it to Microsoft, they certainly keep things interesting. In the above example, I showed one of three ways that you can insert an entity into a table. The service supports Insert, Insert or Merge (Upsert), and Insert or Replace operations (there are also individual Replace and Merge operations). In the following example I will show the Upsert operation using the same table and entity as before.
$Uri="https://$AccountName.table.core.windows.net/$TableName(PartitionKey='$PartitionKey',RowKey='$RowKey')"
$SignatureParams=@{
Resource=$Uri;
Date=[DateTime]::UtcNow.ToString('R');
Verb='MERGE';
ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableEntity | Add-Member -MemberType NoteProperty -Name 'NickName' -Value 'MrMan'
$TableHeaders=[ordered]@{
'x-ms-version'= '2016-05-31'
'DataServiceVersion'='3.0;Netfx'
'Accept-Charset'='UTF-8'
'Accept'='application/json;odata=fullmetadata';
'Date'=$SignatureParams.Date;
'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams = @{
Method= 'MERGE';
Uri= $Uri;
Body= $($TableEntity|ConvertTo-Json);
Headers= $TableHeaders;
ContentType= 'application/json;odata=fullmetadata'
}
$Response=Invoke-WebRequest @RequestParams
This should yield a response with the meaningful details of the operation in the headers.
PS C:\Windows\system32> $Response.Headers
Key Value
--- -----
x-ms-request-id 48489e3d-0002-005c-6515-d545b8000000
x-ms-version 2016-05-31
X-Content-Type-Options nosniff
Content-Length 0
Cache-Control no-cache
Date Thu, 25 May 2017 05:08:58 GMT
ETag W/"datetime'2017-05-25T05%3A08%3A59.5530222Z'"
Server Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0
Now What?
I'm sure I've bored most of you enough already so I won't belabor any more of the operations, but I hope that I've given you a little more insight into the workings of another key element of the Azure Storage Service(s). As always, if you don't have a proclivity for doing things the hard way, feel free to check out a module supporting most of the Table (and BLOB) service functionality on the Powershell Gallery or GitHub.
Azure BLOB Storage and PowerShell: The Hard Way
Shared Key Authentication Scheme
In a previous post I covered my general love/hate affair with PowerShell, particularly with respect to the Microsoft cloud. For the majority of you that cannot be bothered to read it, I expressed a longstanding grudge against the Azure Cmdlets, rooted in the Switch-AzureMode fiasco. As an aside, those of you enjoying the Azure Stack technical previews may notice a similar problem arising with the 'AzureRM Profile', but I digress. More importantly, there was a general theme of understanding the abstractions placed in front of you as an IT professional. By now, most of you should be familiar with the OAuth Bearer tokens used throughout the Microsoft cloud. They are nearly ubiquitous, with the exception of a few services, most importantly storage. The storage service is authenticated with Shared Key authentication or a Shared Access Signature. I will be focusing on the former.
Anatomy of the Signature
The Authorization header of the HTTP requests backing the Azure Storage services takes the following form:
Authorization: SharedKey <Storage Account Name>:<AccessSignature>
The access signature is an HMAC-SHA256 encoded string (Signature) which is constructed mostly from the components of the backing HTTP request. The gritty details are (somewhat) clearly documented on MSDN, but, for example, the string to be encoded for getting the list of blobs in a container looks something like this.
GET
x-ms-date:Mon, 08 May 2017 23:28:20 GMT
x-ms-version:2016-05-31
/nosaashere/certificates
comp:list
restype:container
Let's examine the properties of a request for creating a BLOB snapshot (https://nosaashere.blob.core.windows.net/nosaashere/managedvhds/Provisioned.vhdx?comp=snapshot):
- PUT (verb)
- x-ms-date:Mon, 08 May 2017 23:28:21 GMT (canonical date header)
- x-ms-version:2016-05-31 (canonical header)
- /nosaashere/managedvhds/Provisioned.vhdx (canonical resource)
- comp:snapshot (canonical resource query)
A more advanced request (like this example for appending data to a Page BLOB) will show how additional headers come into scope as we include an MD5 hash to verify the content, a content length, and other required API headers.
PUT
4096000
32qczJv1wUlqnJPQRdBUzw==
x-ms-blob-type:PageBlob
x-ms-date:Mon, 08 May 2017 23:28:39 GMT
x-ms-page-write:Update
x-ms-range:bytes=12288000-16383999
x-ms-version:2016-05-31
/nosaashere/managedvhds/Provisioned.vhdx
comp:page
The general idea is the verb, standard and custom request headers, canonical headers, canonical resource and query are presented as a newline delimited string. This string is encoded using the HMAC256 algorithm with the storage account key. This base64 encoded string is used for crafting the Authorization header. The Authorization header is passed with the other headers used to sign the request. If the server is able to match the signature, the request is authenticated.
Putting this in some PoSh
First things first, we need to generate the string to sign. This function will take arguments for the desired HTTP request (URI, Verb, Query, Headers) parameters and create the previously described string.
Function GetTokenStringToSign
{
[CmdletBinding()]
param
(
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[ValidateSet('GET','PUT','DELETE')]
[string]$Verb="GET",
[Parameter(Mandatory=$true,ValueFromPipelineByPropertyName = $true)]
[System.Uri]$Resource,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[long]$ContentLength,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[String]$ContentLanguage,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[String]$ContentEncoding,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[String]$ContentType,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[String]$ContentMD5,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[long]$RangeStart,
[Parameter(Mandatory = $false,ValueFromPipelineByPropertyName = $true)]
[long]$RangeEnd,[Parameter(Mandatory = $true,ValueFromPipelineByPropertyName = $true)]
[System.Collections.IDictionary]$Headers
)
$ResourceBase=($Resource.Host.Split('.') | Select-Object -First 1).TrimEnd("`0")
$ResourcePath=$Resource.LocalPath.TrimStart('/').TrimEnd("`0")
$LengthString=[String]::Empty
$Range=[String]::Empty
if($ContentLength -gt 0){$LengthString="$ContentLength"}
if($RangeEnd -gt 0){$Range="bytes=$($RangeStart)-$($RangeEnd-1)"}
$SigningPieces = @($Verb, $ContentEncoding,$ContentLanguage, $LengthString,$ContentMD5, $ContentType, [String]::Empty, [String]::Empty, [String]::Empty, [String]::Empty, [String]::Empty, $Range)
foreach ($item in $Headers.Keys)
{
$SigningPieces+="$($item):$($Headers[$item])"
}
$SigningPieces+="/$ResourceBase/$ResourcePath"
if ([String]::IsNullOrEmpty($Resource.Query) -eq $false)
{
$QueryResources=@{}
$QueryParams=$Resource.Query.Substring(1).Split('&')
foreach ($QueryParam in $QueryParams)
{
$ItemPieces=$QueryParam.Split('=')
$ItemKey = ($ItemPieces|Select-Object -First 1).TrimEnd("`0")
$ItemValue = ($ItemPieces|Select-Object -Last 1).TrimEnd("`0")
if($QueryResources.ContainsKey($ItemKey))
{
$QueryResources[$ItemKey] = "$($QueryResources[$ItemKey]),$ItemValue"
}
else
{
$QueryResources.Add($ItemKey, $ItemValue)
}
}
$Sorted=$QueryResources.Keys|Sort-Object
foreach ($QueryKey in $Sorted)
{
$SigningPieces += "$($QueryKey):$($QueryResources[$QueryKey])"
}
}
$StringToSign = [String]::Join("`n",$SigningPieces)
Write-Output $StringToSign
}
Once we have the string to sign, it is a simple step to create the required HMACSHA256 hash using the storage account key. The following function takes the two arguments and returns the encoded signature.
Function EncodeStorageRequest
{
[CmdletBinding()]
param
(
[Parameter(Mandatory = $true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
[String[]]$StringToSign,
[Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
[String]$SigningKey
)
PROCESS
{
foreach ($item in $StringToSign)
{
$KeyBytes = [System.Convert]::FromBase64String($SigningKey)
$HMAC = New-Object System.Security.Cryptography.HMACSHA256
$HMAC.Key = $KeyBytes
$UnsignedBytes = [System.Text.Encoding]::UTF8.GetBytes($item)
$KeyHash = $HMAC.ComputeHash($UnsignedBytes)
$SignedString=[System.Convert]::ToBase64String($KeyHash)
Write-Output $SignedString
}
}
}
Now that we have a signature it is time to pass it on to the storage service API, for the following examples we will focus on BLOB. Let's return to the first example, retrieving a list of the BLOBs in the certificates container of the nosaashere storage account. This only requires the date and version API headers. This request would take the format:
GET https://nosaashere.blob.core.windows.net/certificates?restype=container&comp=list
x-ms-date:Mon, 08 May 2017 23:28:20 GMT
x-ms-version:2016-05-31

To create the signature we can use the above function.
$StorageAccountName='nosaashere'
$ContainerName='certificates'
$AccessKey="WMTyrXNLHL+DF4Gwn1HgqMrpl3s8Zp7ttUevo0+KN2adpByHaYhX4OBY7fLNyzw5IItopGDAr8iQDxrhoHHiRg=="
$BlobContainerUri="https://$StorageAccountName.blob.core.windows.net/$ContainerName?restype=container&comp=list"
$BlobHeaders= @{
"x-ms-date"=[DateTime]::UtcNow.ToString('R');
"x-ms-version"='2016-05-31';
}
$UnsignedSignature=GetTokenStringToSign -Verb GET -Resource $BlobContainerUri -Headers $BlobHeaders
$StorageSignature=EncodeStorageRequest -StringToSign $UnsignedSignature -SigningKey $AccessKey
#Now we should have a 'token' for our actual request.
$BlobHeaders.Add('Authorization',"SharedKey $($StorageAccountName):$($StorageSignature)")
$Result=Invoke-RestMethod -Uri $BlobContainerUri -Headers $BlobHeaders -UseBasicParsing
If you make your call without using the -OutFile parameter you will find a weird looking string rather than the nice friendly XmlDocument you were expecting.
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://nosaashere.blob.core.windows.net/" ContainerName="certificates">
<Blobs>
<Blob>
<Name>azurestackroot.as01.cer</Name>
<Properties>
<Last-Modified>Fri, 05 May 2017 20:31:33 GMT</Last-Modified>
<Etag>0x8D493F5B8410E96</Etag>
<Content-Length>1001</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>O2/fcFtzb9R6alGEgXDZKA==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<BlobType>BlockBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
<ServerEncrypted>false</ServerEncrypted>
</Properties>
</Blob>
<Blob>
<Name>azurestackroot.as02.cer</Name>
<Properties>
<Last-Modified>Wed, 03 May 2017 22:54:49 GMT</Last-Modified>
<Etag>0x8D4927767174A24</Etag>
<Content-Length>1001</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>arONICHXLfRUr61IH/XHbw==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<BlobType>BlockBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
<ServerEncrypted>false</ServerEncrypted>
</Properties>
</Blob>
<Blob>
<Name>azurestackroot.as03.cer</Name>
<Properties>
<Last-Modified>Wed, 15 Mar 2017 19:43:50 GMT</Last-Modified>
<Etag>0x8D46BDB9AB84CFD</Etag>
<Content-Length>1001</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>sZZ30o/oMO57VMfVR7ZBGg==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<BlobType>BlockBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
<ServerEncrypted>false</ServerEncrypted>
</Properties>
</Blob>
<Blob>
<Name>azurestackroot.as04.cer</Name>
<Properties>
<Last-Modified>Wed, 26 Apr 2017 22:45:41 GMT</Last-Modified>
<Etag>0x8D48CF5F7534F4B</Etag>
<Content-Length>1001</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>rnkI6VPz9i1pXOick4qDSw==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<BlobType>BlockBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
<ServerEncrypted>false</ServerEncrypted>
</Properties>
</Blob>
</Blobs>
<NextMarker />
</EnumerationResults>
What, pray tell, is this? In a weird confluence of events, there is a long-standing 'issue' with the Invoke-RestMethod and Invoke-WebRequest Cmdlets and the UTF-8 BOM. Luckily, .Net has lots of support for this stuff. Generally, most people just use the OutFile parameter and pipe it along to the Get-Content Cmdlet. If you are like me, we'll look for the UTF-8 preamble and strip it from the string.
$UTF8ByteOrderMark=[System.Text.Encoding]::Default.GetString([System.Text.Encoding]::UTF8.GetPreamble())
if($Result.StartsWith($UTF8ByteOrderMark,[System.StringComparison]::Ordinal))
{
$Result=$Result.Remove(0,$UTF8ByteOrderMark.Length)
}
[Xml]$ResultXml=$Result
Now you'll see something you should be able to work with:
PS C:\Users\chris> $ResultXml.EnumerationResults
ServiceEndpoint ContainerName Blobs NextMarker
--------------- ------------- ----- ----------
https://nosaashere.blob.core.windows.net/ certificates Blobs
PS C:\Users\chris> $ResultXml.EnumerationResults.Blobs.Blob
Name Properties
---- ----------
azurestackroot.as01.cer Properties
azurestackroot.as02.cer Properties
azurestackroot.as03.cer Properties
azurestackroot.as04.cer Properties
All storage service requests return a good deal of information in the response headers. Enumeration-style operations, like the previous example, return the relevant data in the response body. Many operations, like retrieving container or BLOB metadata, return the relevant data only in the response headers. Let's modify our previous request, noting the change in the query parameter. You will also need to use the Invoke-WebRequest Cmdlet (or your other favorite method) so that you can access the response headers.
$BlobContainerUri="https://$StorageAccountName.blob.core.windows.net/$ContainerName?restype=container&comp=metadata"
$BlobHeaders= @{ "x-ms-date"=[DateTime]::UtcNow.ToString('R'); "x-ms-version"='2016-05-31'; }
$UnsignedSignature=GetTokenStringToSign -Verb GET -Resource $BlobContainerUri -Headers $BlobHeaders
$StorageSignature=EncodeStorageRequest -StringToSign $UnsignedSignature -SigningKey $AccessKey
$BlobHeaders.Add('Authorization',"SharedKey $($StorageAccountName):$($StorageSignature)")
$Response=Invoke-WebRequest -Uri $BlobContainerUri -Headers $BlobHeaders -UseBasicParsing
$ContainerMetadata=$Response.Headers
We should have the resulting metadata key-value pairs present in the form x-ms-meta-<Key Name>.
C:\Users\chris> $ContainerMetaData
Key Value
--- -----
Transfer-Encoding chunked
x-ms-request-id 5f15423e-0001-003d-066d-ca0167000000
x-ms-version 2016-05-31
x-ms-meta-dispo 12345
x-ms-meta-stuff test
Date Thu, 11 May 2017 15:41:16 GMT
ETag "0x8D4954F4245F500"
Last-Modified Sun, 07 May 2017 13:45:01 GMT
Server Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
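The same pattern extends to write operations. As a rough sketch (assuming the helper functions above, and using a placeholder local file and BLOB name), uploading a small block BLOB looks something like this:
$FileName='C:\temp\example.txt'
$Bytes=[System.IO.File]::ReadAllBytes($FileName)
$BlobUri="https://$StorageAccountName.blob.core.windows.net/$ContainerName/example.txt"
$BlobHeaders=[ordered]@{
    'x-ms-blob-type'='BlockBlob';
    'x-ms-date'=[DateTime]::UtcNow.ToString('R');
    'x-ms-version'='2016-05-31';
}
$UnsignedSignature=GetTokenStringToSign -Verb PUT -Resource $BlobUri -Headers $BlobHeaders -ContentLength $Bytes.Length -ContentType 'application/octet-stream'
$StorageSignature=EncodeStorageRequest -StringToSign $UnsignedSignature -SigningKey $AccessKey
$BlobHeaders.Add('Authorization',"SharedKey $($StorageAccountName):$($StorageSignature)")
Invoke-WebRequest -Uri $BlobUri -Method Put -Headers $BlobHeaders -Body $Bytes -ContentType 'application/octet-stream' -UseBasicParsing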
Where to go from here?
With the authentication scheme in hand, you can now access all of the storage services. This includes creating snapshots, uploading and downloading files. If you are not inclined to do things the hard way, feel free to check out a module supporting most of the BLOB service functionality on the PowerShell Gallery or GitHub.