
Part 4 – Coding the PowerShell Module

This is part four of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I developed the Pester unit tests for the New-QueueMessage and Read-QueueMessage cmdlets I am developing for an Azure Service Bus PowerShell module. In this post, I will illustrate developing each of the functions.

New-SasToken

Figuring out how to generate the SAS token took some digging. Fortunately, there are code samples on MSDN (linked below) that illustrate how to correctly compute the SHA hash required for the token. The token is made of four parts:

  • the URL-encoded name of the Azure Service Bus namespace
  • the Base64-encoded HMAC-SHA256 signature
  • the expiration time (the number of seconds since the beginning of the epoch at midnight January 1, 1970 UTC)
  • the SAS Policy name

The SHA 256 hash is computed over another string, which consists of:

  • the URL-encoded name of the Azure Service Bus namespace
  • a newline character
  • the expiration time

First, we need to compute the token expiration time. The code for that looks like this.

[powershell]
$origin = [DateTime]"1/1/1970 00:00"
$diff = New-TimeSpan -Start $origin -End $Expiry
$tokenExpirationTime = [Convert]::ToInt32($diff.TotalSeconds)
[/powershell]

Then I create the string that will be hashed.

[powershell]
$stringToSign = [Web.HttpUtility]::UrlEncode($Namespace) + "`n" + $tokenExpirationTime
[/powershell]

The next step is to new up an instance of the HMACSHA256 class, which will do the work of computing the hash. The Key property of the HMACSHA256 instance is set to a byte array created from the SAS Policy key in the Azure portal.

Here's the code to new up the HMACSHA256 class. Note that the Key property is not set to the key string from the Azure portal directly; it's set to a byte array created from that key.

[powershell]
$hmacsha = New-Object -TypeName System.Security.Cryptography.HMACSHA256
$hmacsha.Key = [Text.Encoding]::UTF8.GetBytes($Key)
[/powershell]

Next, the hash is computed with the HMACSHA256 class instance and converted to a Base64 string; that string is what is used in the token.

[powershell]
$hash = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign))
$signature = [Convert]::ToBase64String($hash)
[/powershell]

The last step is to create the token. The token is a formatted string of the form SharedAccessSignature sr={namespace}&sig={signature}&se={expiry}&skn={policyName}.

Here's the code to create the token. Note the grave accent (backtick) at the end of each line; it is PowerShell's line-continuation character.

[powershell]
$token = [string]::Format([Globalization.CultureInfo]::InvariantCulture, `
    "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}", `
    [Web.HttpUtility]::UrlEncode($Namespace), `
    [Web.HttpUtility]::UrlEncode($signature), `
    $tokenExpirationTime, `
    $PolicyName)
[/powershell]

Now, when I run the Pester unit test, I get the following.

Note that the last test no longer throws the NotImplementedException. Now that test passes. This is a good time to commit my changes to source control.

New-QueueMessage

One of the parameters of New-QueueMessage is a PSCustomObject that should contain a property named Body. Any other properties on that object will be assigned to the BrokerProperties header parameter of the web request. The first thing the function needs to do is break that object apart.

[powershell]
$body = $Message.Body
$Message.psobject.properties.Remove("Body")
[/powershell]

Next, set up the parameters for the Invoke-WebRequest cmdlet.

[powershell]
$uri = "https://$Namespace.servicebus.windows.net/$QueueName/messages"
$token = New-SasToken -Namespace $Namespace -Policy $PolicyName -Key $Key
$headers = @{
    "Authorization" = "$token";
    "Content-Type"  = "application/atom+xml;type=entry;charset=utf-8"
}
$headers.Add("BrokerProperties", $(ConvertTo-Json -InputObject $Message -Compress))
[/powershell]

Finally, the Invoke-WebRequest call. The normal output of the command is redirected to $null. If an error occurs, Invoke-WebRequest writes the error to PowerShell's error stream.

[powershell]
Invoke-WebRequest -Uri $uri -Headers $headers -Method Post -Body $body > $null
[/powershell]

When I run my Pester tests, I get the following.

Two of my unit tests are passing now. This is a good time to commit my changes to source control.
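With New-QueueMessage implemented, a quick interactive sanity check might look like the following sketch. The namespace, queue name, and key are placeholder values, not real credentials.

[powershell]
# Placeholder values for illustration only
$message = [pscustomobject] @{
    "Body"  = "Test message";   # becomes the request body
    "Label" = "M1"              # any other property is sent in the BrokerProperties header
}
New-QueueMessage -Namespace "sb-ycajp" -QueueName "usagerequest" `
    -Message $message -Key "<your SAS policy key>"
[/powershell]

On success the call produces no output; a failure is written to the error stream.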

Read-QueueMessage

Read-QueueMessage is a lot like New-QueueMessage. Make the call to the Service Bus REST API endpoint and use the results to construct a PSCustomObject that contains the BrokerProperties and the body of the response. I'll start by constructing the parameters required for the Invoke-WebRequest cmdlet.

[powershell]
$uri = "https://$Namespace.servicebus.windows.net/$QueueName/messages/head"
$token = New-SasToken -Namespace $Namespace -Policy $PolicyName -Key $Key
$headers = @{ "Authorization" = "$token" }
[/powershell]

Next, make the call to the Service Bus REST API.

[powershell]
$response = Invoke-WebRequest -Uri $uri -Headers $headers -Method Delete
[/powershell]

Finally, construct the PSCustomObject that contains the brokered message properties and the body of the message.

[powershell]
$brokeredMessage = ConvertFrom-Json -InputObject $response.Headers.BrokerProperties
Add-Member -InputObject $brokeredMessage `
    -MemberType NoteProperty `
    -Name "Body" `
    -Value $response.Content
$brokeredMessage  # output the message object
[/powershell]

When I run my Pester tests, I get the following output.

All tests are passing. This is a good time to commit my changes.
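Reading the message back can be sanity-checked the same way; again, the values below are placeholders.

[powershell]
# Placeholder values for illustration only
$message = Read-QueueMessage -Namespace "sb-ycajp" -QueueName "usagerequest" `
    -Key "<your SAS policy key>"
$message.Body           # the message body
$message.DeliveryCount  # one of the BrokerProperties
[/powershell]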

In this post, I walked through the process of developing the three functions that make up my module. The module isn't done yet; it doesn't support all the envisioned scenarios. For example, I want to support the Service Bus Peek operation and retrieving messages from the dead-letter queue in the Read-QueueMessage cmdlet. Comment-based help is also needed so users can use the Get-Help cmdlet to see information about the cmdlets in the module. However, describing that process would be redundant, and this series of posts has become long enough.

These posts have described using Visual Studio 2017 and the PowerShell Tools for Visual Studio 2017 plug-in to develop a simple but useful PowerShell module that can send and receive messages to Azure Service Bus. They have also described using Git for source control and the Pester unit test framework in support of the development best practice of Test-Driven Development. My goal has been to show these technologies working together to create something useful.

Resources:

Service Bus authentication with Shared Access Signatures

Service Bus HTTP Client

Shared Access Signature authentication with Service Bus

Receive and Delete Message (Destructive Read)

Send Message (to an Azure Service Bus Queue)

Part 3 – More Tests, this time with Pester’s Mock

This is part three of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I developed a Pester unit test for the New-SasToken helper function in the module I'm developing. In this post, I'm going to develop tests for the New-QueueMessage and Read-QueueMessage cmdlets. This post will also highlight using the Pester Mock command.

Mock

Good unit tests have several qualities. First (obviously), they should test your code. The trick here is to test your code without testing all the stuff your code depends on. So, the next good quality of a unit test is that it doesn't have a bunch of external dependencies. Another good quality of a unit test is that it should be fast. On a large project, your test suite will grow to have hundreds if not thousands of tests. If a developer cannot run the test suite quickly, they'll just stop running it. A test suite with a slow execution time quickly becomes a burden to the team rather than the asset it should be.

Tests that have external dependencies are slow. Consider code that calls a web service. First, the web service call itself is slow compared to code running on your development machine. Additionally, network latency may impact your test such that it's unusable for other team members located around the world on a distributed team (a very common scenario here at Avanade). These problems are solved by mocking external dependencies. Pester has a Mock command that can mock most PowerShell cmdlets. I will use the Mock command to mock the Invoke-WebRequest cmdlet and remove the dependencies my test suite would otherwise have on the Azure Service Bus REST API.
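As a minimal sketch of how this works (this example is not part of the module's test suite), Mock replaces a cmdlet with a script block for the duration of the test, and Assert-MockCalled verifies the mock was actually invoked:

[powershell]
Describe "Mock sketch" {
    It "Never hits the network" {
        # Replace Invoke-WebRequest with a canned response
        Mock Invoke-WebRequest { @{ "StatusCode" = "200" } }

        $response = Invoke-WebRequest -Uri "https://example.com"

        $response.StatusCode | Should Be "200"
        Assert-MockCalled Invoke-WebRequest -Times 1
    }
}
[/powershell]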

Unit Tests for New-QueueMessage

The parameter set for the New-QueueMessage cmdlet is in the first post in this series. Here it is again.

New-QueueMessage -Namespace <string> -QueueName <string> -Message <string> -CorrelationId <guid> -PolicyName <string> -Key <string>

Before I code the unit test, I'm going to add these parameters to my cmdlet. Here's what the cmdlet looks like.

[powershell]
<# New-QueueMessage #>
function New-QueueMessage {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string] $QueueName,

        [Parameter(Mandatory=$true, Position=2)]
        [pscustomobject] $Message,

        [Parameter(Mandatory=$true, Position=3)]
        [string] $Key,

        [Parameter(Mandatory=$false)]
        [string] $PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [guid] $CorrelationId
    )

    throw [NotImplementedException]
}
[/powershell]

The unit test will in some ways be much like the test for New-SasToken. The difference is that I need to create a mock for the Invoke-WebRequest cmdlet. Ideally, what my mock returns should look exactly like what the real Invoke-WebRequest cmdlet would return. The Microsoft documentation for Send Message shows the Response should look like this.

HTTP/1.1 201 Created
Transfer-Encoding: chunked
Content-Type: application/xml; charset=utf-8
Server: Microsoft-HTTPAPI/2.0
Date: Tue, 01 Jul 2014 23:00:22 GMT

0

Of course, Invoke-WebRequest does not simply return that text. It returns an object of type HtmlWebResponseObject. HtmlWebResponseObject doesn't expose a usable constructor, so we can't make one of those and have our mock return it. However, what we return from the mock just needs to be good enough. We'll use a Hashtable.

Another consideration is what New-QueueMessage should return. I believe it should return nothing if it is successful, and write any error messages to the error stream (i.e., using the Write-Error cmdlet).

My unit test looks like this.

[powershell]
Describe "New-QueueMessage" {
    Context "Only required parameters in the correct order" {
        It "Should not return anything" {
            # Prepare
            $namespace = "sb-ycajp"
            $queueName = "usagerequest"
            $message = [pscustomobject] @{ "Body"="Test message"; }
            $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ="
            $mockWebResponse = @{
                "StatusCode"="201";
                "StatusDescription"="Created";
                "Headers"=@{
                    "Transfer-Encoding"="chunked";
                    "Strict-Transport-Security"="max-age=31536000";
                    "Content-Type"="application/xml; charset=utf-8";
                    "Date"="Sat, 05 Aug 2017 23:16:50 GMT";
                    "Server"="Microsoft-HTTPAPI/2.0"
                }
            }
            Mock Invoke-WebRequest { return $mockWebResponse }

            # Operate
            $results = New-QueueMessage $namespace $queueName $message $key 2>&1

            # Assert
            [string]::IsNullOrEmpty($results) | Should Be $true
        }
    }
}
[/powershell]

The redirection at the end of the call to New-QueueMessage merges any error output into the $results variable, which should not contain anything if the call is successful.
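To see why that works, remember that 2>&1 merges PowerShell's error stream into the success stream, so any error record ends up in the assigned variable. A small illustration:

[powershell]
# 2>&1 merges the error stream into the success stream
$results = & { Write-Error "something failed" } 2>&1
$results -is [System.Management.Automation.ErrorRecord]  # True
[string]::IsNullOrEmpty($results)                        # False, so the assertion would fail
[/powershell]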

When I run the tests, the results look like this.

Again, just what I expect to see: all tests failing with the NotImplementedException. This seems like another minor milestone, so I'm going to commit my changes to source control.

Unit Test for Read-QueueMessage

The unit test for Read-QueueMessage is much like the one for New-QueueMessage. The mock web response is a little different, but the procedure used to develop it is the same as that for New-QueueMessage. Just get the expected response from the Microsoft documentation and craft the Hashtable so it looks the same.

Here's the Read-QueueMessage cmdlet after adding the parameters.

[powershell]
<# Read-QueueMessage #>
function Read-QueueMessage {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string] $Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string] $QueueName,

        [Parameter(Mandatory=$true, Position=2)]
        [string] $Key,

        [Parameter(Mandatory=$false)]
        [Switch] $Deadletter,

        [Parameter(Mandatory=$false)]
        [Switch] $Peek,

        [Parameter(Mandatory=$false)]
        [int] $Count,

        [Parameter(Mandatory=$false)]
        [string] $PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [guid] $CorrelationId
    )

    throw [NotImplementedException]
}
[/powershell]

The Microsoft documentation indicates I will receive a Response that looks like this:

HTTP/1.1 200 OK
Transfer-Encoding: chunked
Content-Type: application/atom+xml;type=entry;charset=utf-8
Server: Microsoft-HTTPAPI/2.0
BrokerProperties: {"DeliveryCount":1,"EnqueuedSequenceNumber":0,"EnqueuedTimeUtc":"Tue, 01 Jul 2014 23:00:23 GMT","Label":"M1","MessageId":"3a146f76afee41648677887ffced72d8","SequenceNumber":1,"State":"Active","TimeToLive":10}
Date: Tue, 01 Jul 2014 23:00:23 GMT

12
This is a message.
0

I don't want to return most of that content, as it's related to the HTTP response, not the queue message. I plan to create a PSCustomObject that has the BrokerProperties and the body of the web response.

My unit test looks like this.

[powershell]
Describe "Read-QueueMessage" {
    Context "Only required parameters in the correct order" {
        It "Should return a message" {
            # Prepare
            $namespace = "sb-ycajp"
            $queueName = "usagerequest"
            $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ="
            $body = "Test message"
            $mockWebResponse = @{
                "StatusCode"="200";
                "StatusDescription"="OK";
                "Headers"=@{
                    "Transfer-Encoding"="chunked";
                    "Content-Type"="application/atom+xml;type=entry;charset=utf-8";
                    "Server"="Microsoft-HTTPAPI/2.0";
                    "BrokerProperties"="{""DeliveryCount"": 1, ""EnqueuedSequenceNumber"": 0, ""EnqueuedTimeUtc"":""Tue, 01 Jul 2014 23:00:23 GMT"", ""Label"":""M1"", ""MessageId"":""3a146f76afee41648677887ffced72d8"", ""SequenceNumber"":1, ""State"":""Active"", ""TimeToLive"":10}";
                    "Date"="Tue, 01 Jul 2014 23:00:23 GMT"
                };
                "Content"="$body"
            }
            Mock Invoke-WebRequest { return $mockWebResponse }

            # Operate
            $message = Read-QueueMessage $namespace $queueName $key

            # Assert
            $message.Body | Should Be $body
        }
    }
}
[/powershell]

I've only illustrated developing three unit tests. The actual project has quite a few more, ensuring that all of the envisioned scenarios work as expected.

In this post, I've continued the TDD approach to developing a PowerShell module using Pester for unit testing my code. I've highlighted using the Mock statement to mock a PowerShell cmdlet so my tests don't have any external dependencies. So far, I've only developed the unit tests. In my next post, I will develop each of the functions in my module.

Resources:

Send Message (to an Azure Service Bus Queue)

Receive and Delete Message (Destructive Read)

Part 2 – Developing the First Test Using Pester

This is part two of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

In my last post, I covered getting started creating a PowerShell module that will allow my team to send and receive messages to Azure Service Bus Queues. In this post, I'm going to start developing the unit tests using Pester, PowerShell's unit test framework.

Test-Driven Development

Test-Driven Development (TDD) is a best practice no matter what language you develop with. Having a good suite of tests gives developers the freedom to change code without worrying about breaking the code (or at least they will immediately know when they have broken the code). Without a good suite of tests for your code, you risk code-rot. So, first I'm going to develop a couple of tests.

My colleague Chris Speers has blogged many times about the advantages of the Azure REST APIs when compared to other options that are available. I want to avoid dependencies on other components, so it follows that the cmdlets in my module will be wrappers around the Azure Service Bus REST APIs.

The first thing you need to do to use the REST APIs is authenticate to them. There are two ways your client code can authenticate to Azure Service Bus: with a Simple Web Token (SWT) or with a Shared Access Signature (SAS) token. Microsoft recommends the Shared Access Signature approach. When you create an Azure Service Bus namespace, Azure will generate an SAS Policy and an SAS key that the Service Bus client code will use to authenticate to Azure Service Bus. While the initial default policy has all rights, SAS policies can be crafted so that specific clients have specific privileges scoped to specific Service Bus resources.

My module is going to need a helper function that creates an SAS token. So the first thing I do is create a stub for that function.

[powershell]
<# New-SasToken #>
function New-SasToken {
    throw [NotImplementedException]
}
[/powershell]

Because I plan for this to be an internal helper function, I do not add an Export-ModuleMember statement for that function to the module file. However, that means that the function will not be in scope in the Pester test. To test that function with Pester, I need to wrap the tests for that function in an InModuleScope block. InModuleScope gives Pester access to module members that are not public.

When the SAS token is created, it needs to have the expiration time encoded in it. Usually, you would set the token to expire a few minutes in the future. That means that a different token will be returned each time the test is run, and that's not what I want. I'm going to add the ability to pass an optional parameter that has the expiration time for the token. That way in my test, I can pass a specific expiration time for the token and compare it to a known good token.

The token also contains the name of the SAS Policy encoded in the token. As mentioned above, when you create the namespace using the Azure portal, it will automatically create an SAS Policy named RootManageSharedAccessKey. You can find these settings in Azure by clicking the Connection String link in the Overview blade for your Service Bus namespace.

Since new policies may be created, we also need to be able to pass the policy name to the New-SasToken function. I will add an optional parameter with the default value of RootManageSharedAccessKey. Finally, the token also contains the name of the Service Bus namespace, so I need a parameter for that as well. My New-SasToken function stub now looks like this.

[powershell]
<# New-SasToken #>
function New-SasToken {
    param (
        [Parameter(Mandatory=$true, Position=0)]
        [string]$Namespace,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$Key,

        [Parameter(Mandatory=$false)]
        [string]$PolicyName = "RootManageSharedAccessKey",

        [Parameter(Mandatory=$false)]
        [DateTime]$Expiry = $([DateTime]::UtcNow.AddSeconds(20 * 60))
    )

    throw [NotImplementedException]
}
[/powershell]

The next thing is to develop the unit test for this function. The unit test itself is drop-dead simple. Basically, I just call the New-SasToken function passing the correct parameters and compare the token that is returned to a known good token. But where do I get a known good token? That took a bit of research and I've attached links to the resources I used in the Resources section at the end of this post. Let's just say it involved reading the Microsoft documentation and running some great sample code that I found on MSDN. My unit test now looks like this.

[powershell]
InModuleScope "Avanade.ServiceBus" {
    Describe "New-SasToken" {
        Context "Expiry Parameter" {
            It "Should return a specific token" {
                # Prepare
                $namespace = "sb-ycajp"
                $key = "ggbkU/HOBDSYTTS0ljICEfn1dVdcxpfebcrAmR4HUXQ="
                $expiry = [DateTime]"1/1/1980 00:00"

                # Operate
                $token = New-SasToken $namespace -Key $key -Expiry $expiry

                # Assert
                $token | Should Be "SharedAccessSignature sr=sb-ycajp&sig=17PCSRT%2flklQiCnT4E0o1XmVxp%2fhM7xBvIf8UwC9tG4%3d&se=315532800&skn=RootManageSharedAccessKey"
            }
        }
    }
}
[/powershell]

When I run my test suite, I get the following.

As expected, all tests are failing, throwing a NotImplementedException. At this point, I've coded my first test that is more than just a placeholder. It seems like a good time to commit my changes to source control.

This post has covered the development of a unit test following the practice of Test-Driven Development. In my next post, I will add tests for the New-QueueMessage and the Read-QueueMessage cmdlets.

 

Resources:

Service Bus authentication with Shared Access Signatures

Service Bus HTTP Client

Developing an Azure Service Bus PowerShell Module – Part 1

This is part one of a multi-part post. Here are links to all the posts in this series.

Developing an Azure Service Bus PowerShell Module – Part 1

Part 2 – Developing the First Test Using Pester

Part 3 – More Tests, this time with Pester's Mock

Part 4 – Coding the PowerShell Module

Introduction

On my current project, we've had a work item on our backlog for some time to provide a way for us to receive and submit messages to Azure Service Bus queues. What we really need is a way to receive dead-letter messages and re-submit them after fixing the underlying issue that caused them to dead-letter in the first place.

We have been using the excellent Service Bus Explorer for this purpose. Unfortunately, that tool doesn't serialize the message body quite right when we re-submit batches of messages; we could only repair and re-submit messages one at a time. So, my team decided we needed to build our own tooling for this purpose. We didn't want to build anything too fancy, a couple of PowerShell cmdlets would be perfect for what we were envisioning. I took on the task because I wanted to test drive Pester, PowerShell's unit test framework.

Starting with the Goal in Mind

I figure we need two cmdlets:

  • Receive-QueueMessage
  • Submit-QueueMessage

Receive and Submit seem like good names as they work well with the semantics of queues. Unfortunately, Receive and Submit are not standard PowerShell verbs, so I decided to go with the following:

  • Read-QueueMessage
  • New-QueueMessage

The parameter sets of the cmdlets should look something like this:

Read-QueueMessage -Namespace <string> -QueueName <string> -Deadletter <flag> -Count <int> -CorrelationId <guid> -Peek <flag> -PolicyName <string> -Key <string>

New-QueueMessage -Namespace <string> -QueueName <string> -Message <string> -CorrelationId <guid> -PolicyName <string> -Key <string>

Creating the Project

My team uses Git and Visual Studio Team Services for our source control repository. I started by creating a feature branch in Git by branching off the main develop branch.

Next I created the project. I used the PowerShell Module Project template and named the project Avanade.ServiceBus, which will be the name of the module.

Note: You will need to install PowerShell Tools for Visual Studio 2017 for this tooling to show up.

The project template creates a module file, a Pester test file and the module manifest file. Initially, they don't do very much. I'm going to start by stubbing out the functions that will become my cmdlets. Initially, all they will do is throw an exception.

[powershell]
<# Read-QueueMessage #>
function Read-QueueMessage {
    throw [NotImplementedException]
}

<# New-QueueMessage #>
function New-QueueMessage {
    throw [NotImplementedException]
}

Export-ModuleMember -Function Read-QueueMessage
Export-ModuleMember -Function New-QueueMessage
[/powershell]

Next, I coded the Pester test file. The first thing we need is some boilerplate code so that the test file will load the module it is testing.

[powershell]
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force
[/powershell]

Next, I add two simple tests. The tests don't do anything other than call the module functions, which will throw the NotImplementedException. That will indicate the test file is wired up to the module file correctly. The test file looks like this.

[powershell]
#
# This is a PowerShell Unit Test file.
# You need a unit test framework such as Pester to run PowerShell Unit tests.
# You can download Pester from http://go.microsoft.com/fwlink/?LinkID=534084
#
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.ps1', '.psm1'
Import-Module "$here\$sut" -Force

Describe "Read-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            Read-QueueMessage
        }
    }
}

Describe "New-QueueMessage" {
    Context "No Parameters" {
        It "Should Throw Exception" {
            New-QueueMessage
        }
    }
}
[/powershell]

Finally, I right-click the test file in Solution Explorer and choose Execute as Script.

The results show up in the PowerShell Interactive Window.

As expected, both tests are failing because the stubbed module functions are throwing the NotImplementedException. This tells me that the test file is wired up to the module file and everything is working as expected.

This is a good time to commit my changes.

This post covered getting started creating a PowerShell module using Visual Studio 2017. It highlighted using Visual Studio's Git integration for source control operations and Test-Driven Development using Pester. In the next post, I will add additional tests and explore Pester's capabilities in depth.

Resources:

Approved Verbs for Windows PowerShell Commands

PowerShell Tools for Visual Studio 2017

Azure Service Bus Monitoring and Alerting using Azure Function and Application Insights


Having designed and architected solutions for our clients on the Azure cloud for many years, we know that Service Bus plays an integral part in most application architectures that involve a messaging layer. At the same time, we know there is no straight answer when customers ask us about the native monitoring and alerting capabilities of Service Bus. For visual dashboards, you need to drill down to the overview section of the queue blade.

For diagnostics, only operational logs are available natively.

Although there are a few 3rd-party products on the market with a good monitoring and alerting story for Azure Service Bus, they come at an additional cost.

In the quest to answer our customers' question of how to get monitoring and alerting capabilities for Azure Service Bus, I found that the answer lies within Azure itself. This blog post illustrates a proof-of-concept solution built as part of one of our customer engagements. The PoC solution uses native Azure services, including:

  • Service Bus
  • Functions
  • Application Insights
  • Application Insight Analytics
  • Application Insight Alerts
  • Dashboard

The only service that would add cost to your monthly Azure bill is Functions (assuming Application Insights is already part of your application architecture). You would need to weigh the cost of purchasing a 3rd-party monitoring product against the Functions cost.

Let's dive into the actual solution.

Step 1: Create an Azure Service Bus Queue

This is of course a prerequisite, since we will be monitoring and alerting on this queue. For the PoC, I created a queue (named queue2) under a Service Bus namespace with the root managed key. I also filled the queue using one of my favorite tools, Service Bus Explorer.

Step 2: Create an Azure Function

The next step is to create a function. The function's logic is to:

  1. Query the service bus to fetch all the queues and topics available under it.
  2. Get the count of active and dead letter messages
  3. Create custom telemetry metric
  4. And finally log the metric to Application Insight

I chose C#, but other languages are available. I also configured the function to trigger every 5 seconds, so the telemetry is almost real time.
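For reference, the 5-second trigger is expressed as a six-field CRON expression in the function's function.json binding. The fragment below is a sketch of that configuration; the binding name matches the myTimer parameter in the function code.

```
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "*/5 * * * * *"
    }
  ],
  "disabled": false
}
```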

Step 3: Add Application Insight to Function

Application Insights will be used to log the Service Bus telemetry from the function. Create or reuse an Application Insights instance and use its instrumentation key in the C# code. I have pasted the function code used in my PoC below. The logging part of the code relies on the custom metrics concept in Application Insights. For the PoC, I created 2 custom metrics: "Active Message Count" and "Dead Letter Count".

Sample Function:

#r "Microsoft.ServiceBus"
using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System.Text.RegularExpressions;
using System.Net.Http;
using static System.Environment;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
var namespaceManager = NamespaceManager.CreateFromConnectionString(
Env("ServiceBusConnectionString"));

foreach(var topic in await namespaceManager.GetTopicsAsync())
{
foreach(var subscription in await namespaceManager.GetSubscriptionsAsync(topic.Path))
{
await LogMessageCountsAsync(
$"{Escape(topic.Path)}.{Escape(subscription.Name)}",
subscription.MessageCountDetails, log);
}
}
foreach(var queue in await namespaceManager.GetQueuesAsync())
{
await LogMessageCountsAsync(Escape(queue.Path),
queue.MessageCountDetails, log);
}
}

private static async Task LogMessageCountsAsync(string entityName,
MessageCountDetails details, TraceWriter log)
{
var telemetryClient = new TelemetryClient();
telemetryClient.InstrumentationKey = "YOUR INSTRUMENTATION KEY";
var telemetry = new TraceTelemetry(entityName);
telemetry.Properties.Add("Active Message Count", details.ActiveMessageCount.ToString());
telemetry.Properties.Add("Dead Letter Count", details.DeadLetterMessageCount.ToString());
telemetryClient.TrackMetric(new MetricTelemetry("Active Message Count", details.ActiveMessageCount));
telemetryClient.TrackMetric(new MetricTelemetry("Dead Letter Count", details.DeadLetterMessageCount));
telemetryClient.TrackTrace(telemetry);
}
private static string Escape(string input) => Regex.Replace(input, @"[^A-Za-z0-9]+", "_");
private static string Env(string name) => GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);

Step 4: Test your function

The next step is to test your function by running it. If everything is set up right, you should start seeing the telemetry in Application Insights. When you select one of the traces, you should be able to view the "Active Message Count" and "Dead Letter Count" under custom data. In the screenshot below, my queue2 has 17 active messages and 0 dead-letter messages.

Step 5: Add an Application Insight Analytics Query

The next step is to use AI Analytics to render a Service Bus chart for monitoring. From the AI blade, click on the Analytics icon. AI Analytics is a separate portal with a query window. You need to write a query that renders a time chart for a queue based on those custom metrics. You can use the sample query below as a start.

Sample Query:

traces
| where message has 'queue2'
| extend activemessagecount = todouble( customDimensions.["Active Message Count"])
| summarize avg(timestamp) by activemessagecount
| order by avg_timestamp asc
| render timechart

Step 6: Publish the Chart to a Dashboard

The AI Analytics chart can be published (via the pin icon) to an Azure dashboard, which enables monitoring users to actively monitor the Service Bus metrics when they log in to the Azure portal. This removes the need to drill down into the Service Bus blade.

Refer to this to learn more about creating and publishing charts to dashboards.

Step 7: Add Alerts on the Custom Metrics

The last step is to create Application Insights alerts. For the PoC, I created 2 alerts, on "Active Message Count" and "Dead Letter Count", with a threshold. These will alert monitoring users by email if the message count exceeds the threshold limit. You can also send these alerts to external monitoring tools via a web hook.

Attached is a sample email from an Azure AI alert:

I hope these steps give you an idea of how this custom solution, built with native Azure services, can provide basic monitoring and alerting capabilities for Service Bus, and for that matter for other Azure services as well. The key is to define the custom metrics you want to monitor against and then set up the solution.