
Part 4 – Coding the PowerShell Module

This is part four of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I developed the Pester unit tests for the New-QueueMessage and Read-QueueMessage cmdlets I am developing for an Azure Service Bus PowerShell module. In this post, I will illustrate developing each of the functions.

New-SasToken

Figuring out how to generate the SAS token took some digging. Fortunately, there are code samples on MSDN (linked below) that illustrate how to correctly compute the HMAC-SHA256 hash required for the token. The token is made up of four parts:

  • the name of the Azure Service Bus namespace
  • the HMAC-SHA256 hash
  • the expiration time (the number of seconds since the Unix epoch, midnight January 1, 1970 UTC)
  • the SAS Policy name

The HMAC-SHA256 hash is computed over another string, which consists of:

  • the name of the Azure Service Bus namespace
  • a newline character
  • the expiration time

First, we need to compute the token expiration time. The code for that looks like this.

[powershell]
$origin = [DateTime]"1/1/1970 00:00"
$diff = New-TimeSpan -Start $origin -End $Expiry
$tokenExpirationTime = [Convert]::ToInt32($diff.TotalSeconds)
[/powershell]

Then I create the string that will be hashed.

[powershell]
$stringToSign = [Web.HttpUtility]::UrlEncode($Namespace) + "`n" + $tokenExpirationTime
[/powershell]

The next step is to new up an instance of the HMACSHA256 class, which will do the work of computing the hash. The Key property of the HMACSHA256 instance is set to a byte array that contains the SAS Policy key from the Azure portal.

Here's the code to new up the HMACSHA256 class. Again, note that you don't set the Key property to the key from the Azure portal directly; it's set to a byte array created from the key.

[powershell]
$hmacsha = New-Object -TypeName System.Security.Cryptography.HMACSHA256
$hmacsha.Key = [Text.Encoding]::UTF8.GetBytes($Key)
[/powershell]

Next, the hash is computed with the HMACSHA256 instance and converted to a Base64 string, which is what goes into the token.

[powershell]
$hash = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign))
$signature = [Convert]::ToBase64String($hash)
[/powershell]

The last step is to create the token. The token is a formatted string of the form SharedAccessSignature sr={namespace}&sig={signature}&se={expirationTime}&skn={policyName}.

Here's the code to create the token. Note the grave (backtick) character at the end of each line; it's PowerShell's line-continuation character.

[powershell]
$token = [string]::Format([Globalization.CultureInfo]::InvariantCulture, `
    "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}", `
    [Web.HttpUtility]::UrlEncode($Namespace), `
    [Web.HttpUtility]::UrlEncode($signature), `
    $tokenExpirationTime, `
    $PolicyName)
[/powershell]

Now, when I run the Pester unit tests, the last test no longer throws the NotImplementedException; it passes. This is a good time to commit my changes to source control.

New-QueueMessage

One of the parameters of New-QueueMessage is a PSCustomObject that should contain a property named Body. Any other properties on that object will be assigned to the BrokerProperties header parameter of the web request. The first thing the function needs to do is break that object apart.

[powershell]
$body = $Message.Body
$Message.psobject.properties.Remove("Body")
[/powershell]

Next, I set up the parameters for the Invoke-WebRequest cmdlet.

[powershell] $uri = "https://$Namespace.servicebus.windows.net/$QueueName/messages" $token = New-SasToken -Namespace $Namespace -Policy $PolicyName -Key $Key $headers = @{ "Authorization"="$token"; "Content-Type"="application/atom+xml;type=entry;charset=utf-8" } $headers.Add("BrokerProperties", $(ConvertTo-Json -InputObject $Message -Compress)) [/powershell]

Finally, the Invoke-WebRequest call. The normal output of the command is redirected to $null. If an error occurs, Invoke-WebRequest will write the error to PowerShell's error stream.

[powershell] Invoke-WebRequest -Uri $uri -Headers $headers -Method Post -Body $body > $null [/powershell]

When I run my Pester tests, two of my unit tests are now passing. This is a good time to commit my changes to source control.

Read-QueueMessage

Read-QueueMessage is a lot like New-QueueMessage. Make the call to the Service Bus REST API endpoint and use the results to construct a PSCustomObject that contains the BrokerProperties and the body of the response. I'll start by constructing the parameters required for the Invoke-WebRequest cmdlet.

[powershell] $uri = "https://$Namespace.servicebus.windows.net/$QueueName/messages/head" $token = New-SasToken -Namespace $Namespace -Policy $PolicyName -Key $Key $headers = @{ "Authorization"="$token" } [/powershell]

Next make the call to the Service Bus REST API.

[powershell] $response = Invoke-WebRequest -Uri $uri -Headers $headers -Method Delete [/powershell]

Finally, construct the PSCustomObject that contains the brokered message properties and the body of the message.

[powershell]
$brokeredMessage = ConvertFrom-Json -InputObject $response.Headers.BrokerProperties
Add-Member -InputObject $brokeredMessage `
    -MemberType NoteProperty `
    -Name "Body" `
    -Value $response.Content
# Emit the message object as the cmdlet's output
$brokeredMessage
[/powershell]

When I run my Pester tests, all tests are passing. This is a good time to commit my changes.

In this post, I walked through the process of developing the three functions that make up my module. The module isn't done yet; it doesn't support all the envisioned scenarios. For example, I want the Read-QueueMessage cmdlet to support the Service Bus Peek operation and retrieving messages from the dead-letter queue. Comment-based help is also needed so users can use the Get-Help cmdlet to see information about the cmdlets in the module. However, describing that process would be redundant, and this series of posts has become long enough.
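For the curious, comment-based help is just a structured comment block placed above or inside a function. A minimal sketch for New-QueueMessage might look like this; the help keywords are standard, but the text here is only illustrative.

[powershell]
<#
.SYNOPSIS
    Sends a message to an Azure Service Bus queue.
.PARAMETER Namespace
    The name of the Azure Service Bus namespace.
.PARAMETER Message
    A PSCustomObject with a Body property; any other properties become BrokerProperties.
.EXAMPLE
    New-QueueMessage "sb-ycajp" "usagerequest" $message $key
#>
function New-QueueMessage {
    # ...
}
[/powershell]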

These posts have described using Visual Studio 2017 and the PowerShell Tools for Visual Studio 2017 plug-in to develop a simple but useful PowerShell module that can send messages to and receive messages from Azure Service Bus. They have also described using Git for source control and the Pester unit test framework to support the development best practice of Test-Driven Development. My goal has been to show these technologies working together to create something useful.

Resources:

Service Bus authentication with Shared Access Signatures

Service Bus HTTP Client

Shared Access Signature authentication with Service Bus

Receive and Delete Message (Destructive Read)

Send Message (to an Azure Service Bus Queue)

Part 3 – More Tests, this time with Pester’s Mock

This is part three of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I developed a Pester unit test for the New-SasToken helper function in the module I'm developing. In this post, I'm going to develop tests for the New-QueueMessage and Read-QueueMessage cmdlets. This post will also highlight using the Pester Mock command.

Mock

Good unit tests have several qualities. First (obviously), they should test your code. The trick here is to test your code without testing all the stuff your code depends on. So, the next good quality of a unit test is that it doesn't have a bunch of external dependencies. Another good quality of a unit test is that it should be fast. On a large project, your test suite will grow to have hundreds if not thousands of tests. If a developer cannot run the test suite quickly, they'll just stop running it. A test suite with a slow execution time quickly becomes a burden to the team rather than the asset it should be.

Tests that have external dependencies are slow. Consider code that calls a web service. First, the web service call itself is slow compared to code running on your development machine. Additionally, network latency may impact your test such that it's unusable for other team members located around the world on a distributed team (a very common scenario here at Avanade). These problems are solved by mocking external dependencies. Pester has a Mock command that can mock most PowerShell cmdlets. I will use the Mock command to mock the Invoke-WebRequest cmdlet and remove the dependencies my test suite would otherwise have on the Azure Service Bus REST API.
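To see how this works in isolation, here's a minimal illustration (not part of the module): Mock replaces a cmdlet for the scope of the test, and Assert-MockCalled verifies the mocked cmdlet was actually invoked.

[powershell]
Describe "Mock illustration" {
    It "Intercepts Get-Date" {
        # Replace Get-Date with a canned value for the scope of this test
        Mock Get-Date { [DateTime]"1/1/1980 00:00" }

        $result = Get-Date

        $result.Year | Should Be 1980
        Assert-MockCalled Get-Date -Times 1
    }
}
[/powershell]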

Unit Tests for New-QueueMessage

The parameter set for the New-QueueMessage cmdlet is in the first post in this series. Here it is again.

New-QueueMessage -Namespace <string> -QueueName <string> -Message <string> -CorrelationId <guid> -PolicyName <string> -Key <string>

Before I code the unit test, I'm going to add these parameters to my cmdlet. Here's what the cmdlet looks like.

[powershell]
<# New-QueueMessage #>
function New-QueueMessage {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string] $QueueName,

        [Parameter(Mandatory=$true, Position=2)]
        [pscustomobject] $Message,

        [Parameter(Mandatory=$true, Position=3)]
        [string] $Key,

        [Parameter(Mandatory=$false)]
        [string] $PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [guid] $CorrelationId
    )

    throw [NotImplementedException]
}
[/powershell]

The unit test will in some ways be much like the test for New-SasToken. The difference is that I need to create a mock for the Invoke-WebRequest cmdlet. Ideally, what my mock returns should look exactly like what the real Invoke-WebRequest cmdlet would return. The Microsoft documentation for Send Message shows the Response should look like this.

HTTP/1.1 201 Created
Transfer-Encoding: chunked
Content-Type: application/xml; charset=utf-8
Server: Microsoft-HTTPAPI/2.0
Date: Tue, 01 Jul 2014 23:00:22 GMT

0

Of course, Invoke-WebRequest does not simply return that text. It returns an object of type HtmlWebResponseObject. HtmlWebResponseObject doesn't expose a usable constructor, so we can't make one of those and have our mock return it. However, what we return from the mock just needs to be good enough. We'll use a Hashtable.
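A Hashtable is good enough because PowerShell resolves dot notation against Hashtable keys, so code written against a real response object behaves the same against the fake:

[powershell]
$fakeResponse = @{ "StatusCode" = "201" }
$fakeResponse.StatusCode   # returns 201, just like a property on a real response object
[/powershell]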

Another consideration is what New-QueueMessage should return. I believe it should return nothing if it is successful, and write any error messages to the error stream (i.e., using the Write-Error cmdlet).

My unit test looks like this.

[powershell] Describe "New-QueueMessage" { Context "Only required parameters in the correct order" { It "Should not return anything" { # Prepare $namespace = "sb-ycajp" $queueName = "usagerequest" $message = [pscustomobject] @{ "Body"="Test message"; } $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ=" $mockWebResponse = @{ "StatusCode"="201"; "StatusDescrip-tion"="Created"; "Headers"=@{"Transfer-Encoding"="chunked";"Strict-Transport-Security"="max-age=31536000"; "Content-Type"="application/xml; charset=utf-8";"Date"="Sat, 05 Aug 2017 23:16:50 GMT";"Server"="Microsoft-HTTPAPI/2.0"} } Mock Invoke-WebRequest { returns $mockWebResponse }

# Operate $results = New-QueueMessage $namespace $queueName $message $key 2>&1

# Assert [string]::IsNullOrEmpty($results) | Should Be $true } } } [/powershell]

The redirection happening at the end of the call to New-QueueMessage redirects any error output to the $results variable, which should not contain anything if the test is successful.
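If the 2>&1 idiom is unfamiliar, here's a standalone illustration (not from the module): it merges the error stream into the success stream so that Write-Error output can be captured in a variable.

[powershell]
function Test-ErrorStream {
    Write-Error "Something went wrong"
}

$results = Test-ErrorStream 2>&1   # $results now holds the ErrorRecord
$results.GetType().Name            # ErrorRecord
[/powershell]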

When I run the tests, the results are just what I expect to see: all tests failing with the NotImplementedException. This seems like another minor milestone, so I'm going to commit my changes to source control.

Unit Test for Read-QueueMessage

The unit test for Read-QueueMessage is much like the one for New-QueueMessage. The mock web response is a little different, but the procedure used to develop it is the same as that for New-QueueMessage. Just get the expected response from the Microsoft documentation and craft the Hashtable so it looks the same.

Here's the Read-QueueMessage cmdlet after adding the parameters.

[powershell]
<# Read-QueueMessage #>
function Read-QueueMessage {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string] $QueueName,

        [Parameter(Mandatory=$true, Position=2)]
        [string] $Key,

        [Parameter(Mandatory=$false)]
        [Switch] $Deadletter,

        [Parameter(Mandatory=$false)]
        [Switch] $Peek,

        [Parameter(Mandatory=$false)]
        [int] $Count,

        [Parameter(Mandatory=$false)]
        [string] $PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [guid] $CorrelationId
    )

    throw [NotImplementedException]
}
[/powershell]

The Microsoft documentation indicates I will receive a Response that looks like this:

HTTP/1.1 200 OK
Transfer-Encoding: chunked
Content-Type: application/atom+xml;type=entry;charset=utf-8
Server: Microsoft-HTTPAPI/2.0
BrokerProperties: {"DeliveryCount":1,"EnqueuedSequenceNumber":0,"EnqueuedTimeUtc":"Tue, 01 Jul 2014 23:00:23 GMT","Label":"M1","MessageId":"3a146f76afee41648677887ffced72d8","SequenceNumber":1,"State":"Active","TimeToLive":10}
Date: Tue, 01 Jul 2014 23:00:23 GMT

12
This is a message.
0

I don't want to return most of that content, as it's related to the HTTP response, not the queue message. I plan to create a PSCustomObject that has the BrokerProperties and the body of the web response.

My unit test looks like this.

[powershell] Describe "Read-QueueMessage" { Context "Only required parameters in the correct order" { It "Should return a message" { # Prepare $namespace = "sb-ycajp" $queueName = "usagerequest" $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ=" $body = "Test message" $mockWebResponse = @{ "StatusCode"="200"; "StatusDescription"="OK"; "Headers"=@{ "Transfer-Encoding"="chunked"; "Content-Type"="application/atom+xml;type=entry;charset=utf-8"; "Server"="Microsoft-HTTPAPI/2.0"; "BrokerProperties"="{""DeliveryCount"": 1, ""EnqueuedSequenceNumber"": 0, ""Enqueued-TimeUtc"":""Tue, 01 Jul 2014 23:00:23 GMT"", ""Label"":""M1"", ""Message-Id"":""3a146f76afee41648677887ffced72d8"", ""SequenceNumber"":1, ""State"":""Active"", ""TimeToLive"":10}"; "Date"="Tue, 01 Jul 2014 23:00:23 GMT" }; "Content"="$body" } Mock Invoke-WebRequest { return $mockWebResponse }

# Operate $message = Read-QueueMessage $namespace $queueName $key

# Assert $message.Body | Should Be $body } } } [/powershell]

I've only illustrated developing three unit tests. The actual project has quite a few more, ensuring that all of the envisioned scenarios work as expected.

In this post, I've continued the TDD approach to developing a PowerShell module using Pester for unit testing my code. I've highlighted using the Mock statement to mock a PowerShell cmdlet so my tests don't have any external dependencies. So far, I've only developed the unit tests. In my next post, I will develop each of the functions in my module.

Resources:

Send Message (to an Azure Service Bus Queue)

Receive and Delete Message (Destructive Read)

Part 2 – Developing the First Test Using Pester

This is part two of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I covered getting started creating a PowerShell module that will allow my team to send and receive messages to Azure Service Bus Queues. In this post, I'm going to start developing the unit tests using Pester, PowerShell's unit test framework.

Test-Driven Development

Test-Driven Development (TDD) is a best practice no matter what language you develop with. Having a good suite of tests gives developers the freedom to change code without worrying about breaking the code (or at least they will immediately know when they have broken the code). Without a good suite of tests for your code, you risk code-rot. So, first I'm going to develop a couple of tests.

My colleague Chris Speers has blogged many times about the advantages of the Azure REST APIs when compared to other options that are available. I want to avoid dependencies on other components, so it follows that the cmdlets in my module will be wrappers around the Azure Service Bus REST APIs.

The first thing you need to do to use the REST APIs is authenticate to them. There are two ways your client code can authenticate to Azure Service Bus: with a Simple Web Token (SWT) or with a Shared Access Signature (SAS) token. Microsoft recommends the Shared Access Signature approach. When you create an Azure Service Bus namespace, Azure will generate an SAS Policy and an SAS key that the Service Bus client code will use to authenticate to Azure Service Bus. While the initial default policy has all rights, SAS policies can be crafted so that specific clients have specific privileges scoped to specific Service Bus resources.

My module is going to need a helper function that creates an SAS token. So the first thing I do is create a stub for that function.

[powershell]
<# New-SasToken #>
function New-SasToken {
    throw [NotImplementedException]
}
[/powershell]

Because I plan for this to be an internal helper function, I do not add an Export-ModuleMember statement for that function to the module file. However, that means that the function will not be in scope in the Pester test. To test that function with Pester, I need to wrap the tests for that function in an InModuleScope block. InModuleScope gives Pester access to module members that are not public.

When the SAS token is created, it needs to have the expiration time encoded in it. Usually, you would set the token to expire a few minutes in the future. That means that a different token will be returned each time the test is run, and that's not what I want. I'm going to add the ability to pass an optional parameter that has the expiration time for the token. That way in my test, I can pass a specific expiration time for the token and compare it to a known good token.

The token also contains the name of the SAS Policy encoded in the token. As mentioned above, when you create the namespace using the Azure portal, it will automatically create an SAS Policy named RootManageSharedAccessKey. You can find these settings in Azure by clicking the Connection String link in the Overview blade for your Service Bus namespace.

Since new policies may be created, we also need to be able to pass the policy name to the New-SasToken function. I will add an optional parameter with the default value of RootManageSharedAccessKey. Finally, the token also contains the name of the Service Bus namespace, so I need a parameter for that as well. My New-SasToken function stub now looks like this.

[powershell]
<# New-SasToken #>
function New-SasToken {
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string]$Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$Key,

        [Parameter(Mandatory=$false)]
        [string]$PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [DateTime]$Expiry = $([DateTime]::UtcNow.AddSeconds(20 * 60))
    )

    throw [NotImplementedException]
}
[/powershell]

The next thing is to develop the unit test for this function. The unit test itself is drop-dead simple. Basically, I just call the New-SasToken function passing the correct parameters and compare the token that is returned to a known good token. But where do I get a known good token? That took a bit of research and I've attached links to the resources I used in the Resources section at the end of this post. Let's just say it involved reading the Microsoft documentation and running some great sample code that I found on MSDN. My unit test now looks like this.

[powershell]
InModuleScope "Avanade.ServiceBus" {
    Describe "New-SasToken" {
        Context "Expiry Parameter" {
            It "Should return a specific token" {
                # Prepare
                $namespace = "sb-ycajp"
                $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ="
                $expiry = [DateTime]"1/1/1980 00:00"

                # Operate
                $token = New-SasToken $namespace -Key $key -Expiry $expiry

                # Assert
                $token | Should Be "SharedAccessSignature sr=sb-ycajp&sig=17PCSRT%2flklQiCnT4E0o1XmVxp%2fhM7xBvIf8UwC9tG4%3d&se=315532800&skn=RootManageSharedAccessKey"
            }
        }
    }
}
[/powershell]

When I run my test suite, all tests are failing as expected, throwing a NotImplementedException. At this point, I've coded my first test that is more than just a placeholder. It seems like a good time to commit my changes to source control.

This post has covered the development of a unit test following the practice of Test-Driven Development. In my next post, I will add tests for the New-QueueMessage and the Read-QueueMessage cmdlets.

 

Resources:

Service Bus authentication with Shared Access Signatures

Service Bus HTTP Client

Developing an Azure Service Bus PowerShell Module – Part 1

This is part one of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

On my current project, we've had a work item on our backlog for some time to provide a way for us to receive and submit messages to Azure Service Bus queues. What we really need is a way to receive dead-letter messages and re-submit them after fixing the underlying issue that caused them to dead-letter in the first place.

We have been using the excellent Service Bus Explorer for this purpose. Unfortunately, that tool doesn't serialize the message body quite right when we re-submit batches of messages; we could only repair and re-submit messages one at a time. So, my team decided we needed to build our own tooling for this purpose. We didn't want to build anything too fancy, a couple of PowerShell cmdlets would be perfect for what we were envisioning. I took on the task because I wanted to test drive Pester, PowerShell's unit test framework.

Starting with the Goal in Mind

I figure we need two cmdlets:

  • Receive-QueueMessage
  • Submit-QueueMessage

Receive and Submit seem like good names as they work well with the semantics of queues. Unfortunately, they don't both pass muster against PowerShell's approved verb list (Submit is not an approved verb), so I decided to go with the following:

  • Read-QueueMessage
  • New-QueueMessage

The parameter sets of the cmdlets should look something like this:

Read-QueueMessage -Namespace <string> -QueueName <string> -Deadletter <flag> -Count <int> -CorrelationId <guid> -Peek <flag> -PolicyName <string> -Key <string>

New-QueueMessage -Namespace <string> -QueueName <string> -Message <string> -CorrelationId <guid> -PolicyName <string> -Key <string>

Creating the Project

My team uses Git and Visual Studio Team Services for our source control repository. I started by creating a feature branch in Git by branching off the main develop branch.

Next I created the project. I used the PowerShell Module Project template and named the project Avanade.ServiceBus, which will be the name of the module.

Note: You will need to install PowerShell Tools for Visual Studio 2017 for this tooling to show up.

The project template creates a module file, a Pester test file and the module manifest file. Initially, they don't do very much. I'm going to start by stubbing out the functions that will become my cmdlets. Initially, all they will do is throw an exception.

[powershell]
<# Read-QueueMessage #>
function Read-QueueMessage {
    throw [NotImplementedException]
}

<# New-QueueMessage #>
function New-QueueMessage {
    throw [NotImplementedException]
}

Export-ModuleMember -Function Read-QueueMessage
Export-ModuleMember -Function New-QueueMessage
[/powershell]

Next, I coded the Pester test file. The first thing we need is some boilerplate code so that the test file will load the module it is testing.

[powershell]
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force
[/powershell]

Next, I add two simple tests. The tests don't do anything other than call the module functions, which will throw the NotImplementedException. That will indicate the test file is wired up to the module file correctly. The test file looks like this.

[powershell]
#
# This is a PowerShell Unit Test file.
# You need a unit test framework such as Pester to run PowerShell Unit tests.
# You can download Pester from http://go.microsoft.com/fwlink/?LinkID=534084
#
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force

Describe "Read-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            Read-QueueMessage
        }
    }
}

Describe "New-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            New-QueueMessage
        }
    }
}
[/powershell]

Finally, I right-click the test file in Solution Explorer and choose Execute as Script.

The results show up in the PowerShell Interactive Window.

As expected, both tests are failing because the stubbed module functions are throwing the NotImplementedException. This tells me that the test file is wired up to the module file and everything is working as expected.

This is a good time to commit my changes.

This post covered getting started creating a PowerShell module using Visual Studio 2017. It highlighted using Visual Studio's Git integration for source control operations and Test-Driven Development using Pester. In the next post, I will add additional tests and explore Pester's capabilities in depth.

Resources:

Approved Verbs for Windows PowerShell Commands

PowerShell Tools for Visual Studio 2017

Azure Table Storage and PowerShell, The Hard Way

In my previous post, I gave a quick overview of the Shared Key authentication scheme used by the Azure storage service and demonstrated how to authenticate to and access the BLOB storage API through PowerShell. The file and queue services follow an authentication scheme that aligns with the BLOB requirements; however, the table service is a bit different. I felt it might help the more tortured souls out there (like myself) if I tried to describe the nuances.

Azure Storage REST API, Consistently Inconsistent

Like the REST of all things new Microsoft (read Azure), the mantra is consistency. From a modern administrative perspective, you should have a consistent experience across whatever environment and toolset you require. If you are a traditional administrator/engineer of the Microsoft stack, the tooling takes the form of PowerShell cmdlets. If you use Python, bash, etc., there is effectively equivalent tooling available. My gripes notwithstanding, I think Microsoft has done a tremendous job in this regard. I also make no claim that my preferences are necessarily the correct ones. The 'inconsistencies' I will be discussing are not really issues for you if you use the mainline SDK(s). As usual, I'll be focusing on how things work behind the scenes and my observations.

Shared Key Authentication, but Not All Are Equal

In exploring the shared key authentication to the BLOB REST API, we generated and encoded the HTTP request signature.  The string we needed to encode looked something like this:

GET                                      /*HTTP Verb*/
                                         /*Content-Encoding*/
                                         /*Content-Language*/
                                         /*Content-Length (include value when zero)*/
                                         /*Content-MD5*/
                                         /*Content-Type*/
                                         /*Date*/
                                         /*Range*/
x-ms-date:Sun, 11 Oct 2009 21:49:13 GMT
x-ms-version:2009-09-19                  /*CanonicalizedHeaders*/
/myaccount/mycontainer\ncomp:metadata\nrestype:container\ntimeout:20    /*CanonicalizedResource*/

The table service takes a much simpler and yet arcane format that is encoded in an identical fashion.

GET
application/json;odata=nometadata
Mon, 15 May 2017 17:29:11 GMT
/billing73d55f68/fabriclogae0bced538344887a4021ae5c3b61cd0GlobalTime(PartitionKey='407edc6d872271f853085a7a18387784',RowKey='02519075544040622622_407edc6d872271f853085a7a18387784_0_2952_2640')

In this case there are far fewer headers and query parameters to deal with; however, there are now fairly rigid requirements. A Date header must be specified, as opposed to either Date or x-ms-date (or both) in the BLOB case. A Content-Type header must also be specified as part of the signature, and no additional header details are required. The canonical resource component is very different from the BLOB service. It still takes the format of <storage account name>/<table name>/<query parameters>, but at the table service level only the comp query parameter is to be included. As an example, to query the table service properties for the storage account, the request would look something like https://myaccount.table.core.windows.net?restype=service&comp=properties, and the canonical resource would be /myaccount/?comp=properties.

Generating the Signature with PowerShell

We will reuse our encoding function from the previous post and include a new method for generating the string to sign.


Function EncodeStorageRequest
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory=$true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
        [String[]]$StringToSign,
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [String]$SigningKey
    )
    PROCESS
    {
        foreach ($item in $StringToSign)
        {
            $KeyBytes = [System.Convert]::FromBase64String($SigningKey)
            $HMAC = New-Object System.Security.Cryptography.HMACSHA256
            $HMAC.Key = $KeyBytes
            $UnsignedBytes = [System.Text.Encoding]::UTF8.GetBytes($item)
            $KeyHash = $HMAC.ComputeHash($UnsignedBytes)
            $SignedString = [System.Convert]::ToBase64String($KeyHash)
            Write-Output $SignedString
        }
    }
}
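The GetTableTokenStringToSign helper used below generates that string to sign. Here is a minimal sketch of what it might look like; the parameter names are assumed to match the splatted $SignatureParams, and the empty second line of the output is the Content-MD5 placeholder from the documented format.

Function GetTableTokenStringToSign
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory=$true)]
        [Uri]$Resource,
        [Parameter(Mandatory=$true)]
        [String]$Verb,
        [Parameter(Mandatory=$true)]
        [String]$ContentType,
        [Parameter(Mandatory=$true)]
        [String]$Date
    )
    # Canonicalized resource: /{account}{path}; only the comp query parameter is included
    $StorageAccount = $Resource.Host.Split('.')[0]
    $CanonicalizedResource = "/$StorageAccount$($Resource.AbsolutePath)"
    if ($Resource.Query -match 'comp=([^&]+)')
    {
        $CanonicalizedResource += "?comp=$($Matches[1])"
    }
    # Verb, Content-MD5 (empty), Content-Type, Date, CanonicalizedResource
    Write-Output "$Verb`n`n$ContentType`n$Date`n$CanonicalizedResource"
}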

$AccountName='myaccount'
$AccessKey='vyAEEzbcnIAkLKti1leDbfrAOQBu5bx52zyCkW0fGIBCsS+DDGXpfidOeAWyg7do8ujft1mFhnz9kmliycmiXA=='
$Uri="https://$AccountName.table.core.windows.net/tables"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='GET';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31';
    'DataServiceVersion'='3.0;Netfx';
    'Accept-Charset'='UTF-8';
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Headers=$TableHeaders;
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams -Verbose
$Tables=$Response.Content | ConvertFrom-Json | Select-Object -ExpandProperty value


PS C:\WINDOWS\system32> $Tables | fl

odata.type     : acestack.Tables
odata.id       : https://acestack.table.core.windows.net/Tables('provisioninglog')
odata.editLink : Tables('provisioninglog')
TableName      : provisioninglog

The astute reader will notice we had to pass some different headers along. All table requests require a DataServiceVersion and/or MaxDataServiceVersion header; these values align with maximum versions of the REST API, which I won't bother belaboring. We also retrieved JSON rather than XML, and a number of content types are available, dictated by the Accept header. In the example we retrieved it with full OData metadata; other valid types include minimalmetadata and nometadata (atom/xml is returned from earlier data service versions). In another peculiarity, XML is the only format returned when retrieving service properties or stats.

Putting It to Greater Use With Your Old Friend OData

You likely want to actually read some data out of tables. Now that authorizing the request is out of the way, it is a 'simple' matter of applying the appropriate OData query parameters. We will start with retrieving a list of all entities within a table. This will return a maximum of 1000 results (unless limited using the $top parameter), and a link to any subsequent pages of data will be returned in the response headers. In the following example we will query all entities in the fabriclogaeGlobalTime table in the fabrixstuffz storage account. In the interest of brevity, I will limit this to 3 results, as shown in the query sketch below.
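The query itself reuses the signing helpers above. A minimal sketch (with the table name assumed from the sample response) looks like this; the $top OData parameter limits the page size:

$TableName='fabriclogaeGlobalTime'
$Uri="https://$AccountName.table.core.windows.net/$TableName()?`$top=3"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='GET';
    ContentType='application/json;odata=nometadata';
}
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'='2016-05-31';
    'DataServiceVersion'='3.0;Netfx';
    'Accept-Charset'='UTF-8';
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Headers=$TableHeaders;
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams

The next listing, which the entity-creation discussion below refers back to, prepares and signs a POST that inserts an entity into a fictional customers table.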


$TableName='fakecustomers'
$Uri="https://$AccountName.table.core.windows.net/$TableName"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='POST';
    ContentType='application/json;odata=nometadata'; 
} 
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31'
    'DataServiceVersion'='3.0;Netfx'
    'Accept-Charset'='UTF-8'
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$PartitionKey='mypartitionkey'
$RowKey='row771'
$TableEntity=New-Object PSObject -Property @{
    "Address"="Mountain View";
    "Name"="Buckaroo Banzai";
    "Age"=33;
    "AmountDue"=200.23;
    "FavoriteItem"="oscillation overthruster";
    "CustomerCode@odata.type"="Edm.Guid";
    "CustomerCode"="c9da6455-213d-42c9-9a79-3e9149a57833";
    "CustomerSince@odata.type"="Edm.DateTime";
    "CustomerSince"="2008-07-10T00:00:00";
    "IsActive"=$true;
    "NumberOfOrders@odata.type"="Edm.Int64"
    "NumberOfOrders"="255";
    "PartitionKey"=$PartitionKey;
    "RowKey"=$RowKey
}
$RequestParams=@{
    Uri=$SignatureParams.Resource;
    Method=$SignatureParams.Verb;
    Headers=$TableHeaders;
    ContentType=$SignatureParams.ContentType;
    ErrorAction='STOP'
}
$Response=Invoke-WebRequest @RequestParams

Running the query should yield a result looking like this.


Cache-Control: no-cache
Transfer-Encoding: chunked
Content-Type: application/json;odata=nometadata;streaming=true;charset=utf-8
Server: Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: 56afccf3-0002-0104-0285-d382b4000000
x-ms-version: 2016-05-31
X-Content-Type-Options: nosniff
x-ms-continuation-NextPartitionKey: 1!44!NDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODQ-
x-ms-continuation-NextRowKey: 1!88!MDI1MTkwNjc4NDkwNDA1NzI1NjlfNDA3ZWRjNmQ4NzIyNzFmODUzMDg1YTdhMTgzODc3ODRfMF8yOTUyXzI2NDA-
Date: Tue, 23 May 2017 05:27:28 GMT
{
    "value":  [
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067840040580939_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:25:55.6307353Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:21:07.4129436Z",
                      "lowerBound":  "2017-05-23T05:19:56.173659Z",
                      "upperBound":  "2017-05-23T05:19:56.173659Z"
                  },
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067843040711216_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:20:53.9265804Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:16:07.3678218Z",
                      "lowerBound":  "2017-05-23T05:14:56.1606307Z",
                      "upperBound":  "2017-05-23T05:14:56.1606307Z"
                  },
                  {
                      "PartitionKey":  "407edc6d872271f853085a7a18387784",
                      "RowKey":  "02519067846040653329_407edc6d872271f853085a7a18387784_0_2952_2640",
                      "Timestamp":  "2017-05-23T05:15:52.7217857Z",
                      "EventType":  "Time",
                      "TaskName":  "FabricNode",
                      "dca_version":  -2147483648,
                      "epoch":  "1",
                      "localTime":  "2017-05-23T05:11:07.3406081Z",
                      "lowerBound":  "2017-05-23T05:09:56.1664211Z",
                      "upperBound":  "2017-05-23T05:09:56.1664211Z"
                  }
              ]
}

You should recognize a relatively standard OData response, with our desired values present within an array as the value property. There are two response headers to note here: x-ms-continuation-NextPartitionKey and x-ms-continuation-NextRowKey. These headers are the continuation token for retrieving the next available value(s). The service will return results in pages with a maximum length of 1000 results, unless limited using the $top query parameter as in the previous example. If one were so inclined, they could continue to send GET requests, including the continuation token(s), until all results are enumerated, as sketched below.
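A rough sketch of that enumeration loop, assuming the same helpers and headers as above and a base table URI with no other query parameters, might look like this:

$AllRows=@()
$NextPK=$null
$NextRK=$null
do
{
    # Append the continuation token from the previous page, if any
    $PageUri=$Uri
    if ($NextPK) { $PageUri="${Uri}?NextPartitionKey=$NextPK&NextRowKey=$NextRK" }
    $SignatureParams=@{
        Resource=$PageUri;
        Date=[DateTime]::UtcNow.ToString('R');
        Verb='GET';
        ContentType='application/json;odata=nometadata';
    }
    $RequestSignature=GetTableTokenStringToSign @SignatureParams
    $TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
    $TableHeaders['Date']=$SignatureParams.Date
    $TableHeaders['Authorization']="SharedKey $($AccountName):$($TableToken)"
    $Response=Invoke-WebRequest -Uri $PageUri -Method Get -Headers $TableHeaders -ContentType $SignatureParams.ContentType
    $AllRows+=($Response.Content | ConvertFrom-Json).value
    # The continuation headers are only present when another page exists
    $NextPK=$Response.Headers['x-ms-continuation-NextPartitionKey']
    $NextRK=$Response.Headers['x-ms-continuation-NextRowKey']
} while ($NextPK)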

Creating (or updating) table entities is a somewhat different exercise, which can become slightly convoluted (at least in PowerShell or other scripts). Conceptually, all that is required to create an entity is a POST request to the table resource URI with a body containing the entity and the appropriate required headers. The complexity is primarily a result of the metadata overhead associated with the server OData implementation. We'll examine this by inserting an entity into a fictional customers table, as prepared in the POST listing shown earlier.

You should end up receiving the inserted object as a response:


PS C:\Windows\system32> $Response.Content | ConvertFrom-Json
PartitionKey : mypartitionkey
RowKey : row772
Timestamp : 2017-05-23T06:17:53.7244968Z
CustomerCode : c9da6455-213d-42c9-9a79-3e9149a57833
FavoriteItem : oscillation overthruster
AmountDue : 200.23
IsActive : True
CustomerSince : 2008-07-10T00:00:00
Name : Buckaroo Banzai
NumberOfOrders : 255
Age : 33
Address : Mountain View 

You should notice that the object we submitted had some extra properties not present on the inserted entity. The API requires that for any entity property where the (.NET) data type cannot be automatically inferred, a type annotation must be specified. In this case, CustomerCode=c9da6455-213d-42c9-9a79-3e9149a57833 is a GUID (as opposed to a string) and therefore requires a companion property CustomerCode@odata.type=Edm.Guid. If you would like a more complete explanation, the format is detailed here.

Three ways to do the same thing

You've got to give it to Microsoft; they certainly keep things interesting. In the above example, I showed one of three ways that you can insert an entity into a table. The service supports Insert, Insert or Merge (Upsert), and Insert or Replace operations (there are also individual Replace and Merge operations). In the following example, I will show the Upsert operation using the same table and entity as before.


$Uri="https://$AccountName.table.core.windows.net/$TableName(PartitionKey='$PartitionKey',RowKey='$RowKey')"
$SignatureParams=@{
    Resource=$Uri;
    Date=[DateTime]::UtcNow.ToString('R');
    Verb='MERGE';
    ContentType='application/json;odata=nometadata';
} 
$RequestSignature=GetTableTokenStringToSign @SignatureParams
$TableToken=EncodeStorageRequest -StringToSign $RequestSignature -SigningKey $AccessKey
$TableEntity | Add-Member -MemberType NoteProperty -Name 'NickName' -Value 'MrMan'
$TableHeaders=[ordered]@{
    'x-ms-version'= '2016-05-31'
    'DataServiceVersion'='3.0;Netfx'
    'Accept-Charset'='UTF-8'
    'Accept'='application/json;odata=fullmetadata';
    'Date'=$SignatureParams.Date;
    'Authorization'="SharedKey $($AccountName):$($TableToken)"
}
$RequestParams = @{
    Method= 'MERGE';
    Uri= $Uri;
    Body= $($TableEntity|ConvertTo-Json);
    Headers= $TableHeaders;
    ContentType= 'application/json;odata=fullmetadata'
}
$Response=Invoke-WebRequest @RequestParams 

This should yield a response with the meaningful details of the operation in the headers.


PS C:\Windows\system32> $Response.Headers
Key                    Value
---                    -----  
x-ms-request-id        48489e3d-0002-005c-6515-d545b8000000
x-ms-version           2016-05-31 
X-Content-Type-Options nosniff
Content-Length         0
Cache-Control          no-cache
Date                   Thu, 25 May 2017 05:08:58 GMT
ETag                   W/"datetime'2017-05-25T05%3A08%3A59.5530222Z'"
Server                 Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0

Now What?

I'm sure I've bored most of you enough already, so I won't belabor any more of the operations, but I hope that I've given you a little more insight into the workings of another key element of the Azure Storage Service(s). As always, if you don't have a proclivity for doing things the hard way, feel free to check out a module supporting most of the Table (and BLOB) service functionality on the PowerShell Gallery or GitHub.