Ruling Your Resource Groups with an Iron Fist
If it is not obvious by now, we deploy a lot of resources to Azure. The rather expensive issue we encounter is that people rarely remember to clean up after themselves, and it goes without saying we have received some staggeringly large bills. To remedy this, we enforced a simple yet draconian policy around persisting Resource Groups: if your Resource Group does not have values for our required tag set (Owner and Solution), it can be deleted at any time. At the time the tagging edict went out we had well over 1,000 resource groups. As our lab manager began to script the removal using the Azure cmdlets, we ran into a new issue: the synchronous operations of the cmdlets simply took too long. We now use a little script we call "The Reaper" to "fire and forget".
If you read my previous post about Azure AD and PowerShell, you may have noticed my predilection for doing things via the REST API. "The Reaper" simply uses the module from that post to obtain an authorization token, then uses the REST API to evaluate Resource Groups for tag compliance and delete the offenders asynchronously. There are a few caveats to note: it only works with organizational accounts, and it will not delete any resources that have a lock.
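This is not the script itself, but a minimal sketch of the "fire and forget" idea: a DELETE against the Resource Group URI returns almost immediately (HTTP 202 Accepted) while Azure finishes the removal in the background. The subscription ID, group name and $token variable below are placeholders, not values from the real script.

[powershell]
# Minimal sketch only - $token is assumed to hold a bearer token obtained as in the previous post,
# and the subscription ID / resource group name are placeholders
$subscriptionId = "00000000-0000-0000-0000-000000000000"
$resourceGroup  = "rg-untagged-example"
$uri = "https://management.azure.com/subscriptions/$subscriptionId/resourcegroups/$resourceGroup" +
       "?api-version=2016-02-01"

# The DELETE is accepted asynchronously (202), so the script can move straight on to the next offender
Invoke-RestMethod -Method Delete -Uri $uri -Headers @{ Authorization = "Bearer $token" }
[/powershell]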
It is published in the PowerShell Gallery, so you can obtain it like so:
[powershell]
# Just download it
Save-Script -Name thereaper -Path "C:\myscripts"

# Install the script (with the module dependency)
Install-Script -Name thereaper
[/powershell]
The required parameters are a PSCredential and an array of strings for the tags to inspect. Notable switch parameters are AllowEmptyTags (only checks for the presence of the required tags) and DeleteEmpty (removes Resource Groups with no Resources even if they are tagged properly). There is also a SubscriptionFilters parameter taking an array of Subscription IDs to limit the scope; otherwise, all Subscriptions your account has access to will be evaluated. If you would simply like to see what the results would be, use the WhatIf switch. Usage is as follows:
[powershell]
$Credential = New-Object PSCredential("username@tenant.com", ("YourPassword" | ConvertTo-SecureString -AsPlainText -Force))
$results = .\thereaper.ps1 -Credential $Credential `
    -RequiredTags "Owner","Solution" -DeleteEmpty `
    -SubscriptionFilters "49f4ba3e-72ec-4621-8e9e-89d312eafd1f","554503f6-a3aa-4b7a-a5a9-641ac65bf746"
[/powershell]
A standard liability waiver applies; I hold no responsibility for the Resource Groups you destroy.
Automation Frameworks & Threshold for Creation
Introduction
Years ago I was walking through an office and saw an administrator log onto a system, check the C drive, update a spreadsheet, log off, and then repeat the whole process on the next system. In pure disbelief, I stood and watched for a minute, then asked, "What are you doing?". The response was exactly what I feared: "One of the monthly maintenance tasks, checking C drive space". I calmly asked him to please stop and went back to my desk to write a script.
As administrators and consultants we constantly have to evaluate when to automate and when to just do the task. There are many variables: the time it takes to automate, the level of production access or change required (and security's opinion about this), how long the task takes now, who is paying, how long you will have to keep doing it, and what else you have to do.
Automation Frameworks
While there are tasks we can automate, and this programmer takes it to a new level of task automation including scripts for emailing his wife and making coffee (if I may quote Wayne and Garth, "we're not worthy, we're not worthy"), there is another side: automation frameworks built for multiple scenarios and reuse. The PowerShell Deployment Toolkit (PDT) was an amazing framework for deploying System Center. It took deployment from days and weeks of installations and months of fixing deployments down to minutes, hours and days, and could still flex to a variety of deployment patterns.
However, there was a learning curve, a list of prerequisites (some documented and some not), a few tricks, and custom code you sometimes had to reverse engineer. This PowerShell framework could have deployed anything; you just had to write your own XML manifest for what you wanted to deploy, but that would take a lot of time learning the XML syntax the deployment framework understood, testing the deployment, working through 'undocumented features' and so on. I actually extended the existing scripts for our own needs, but now a total of one person knows what those extra XML properties mean.
New Thresholds
Cloud technologies like Azure are changing at a pace unseen in the enterprise space. VMs, or IaaS compute, have shifted through Classic and ARM deployments, versions V1 and V2, not to mention API versions. Extensions keep being added and DSC brings another layer to the table. These layers are all shifting with and without each other. Recently a property of VM.OSProfile changed on me from virtualharddisks to VHDs, which broke a script.
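One cheap mitigation, shown as a hedged sketch below (the property names are purely illustrative), is to check which property actually exists before using it, so a rename fails loudly instead of silently breaking the script.

[powershell]
# Hedged sketch - assumes $vm came from one of the Azure VM cmdlets; the property names are illustrative
$osProfile = $vm.OSProfile
$diskProperty = @('VHDs', 'VirtualHardDisks') |
    Where-Object { $_ -in $osProfile.PSObject.Properties.Name } |
    Select-Object -First 1
if (-not $diskProperty) { throw "Expected disk property not found - the API shape has changed again" }
$disks = $osProfile.$diskProperty
[/powershell]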
When we consider developing frameworks like PDT in the cloud space with tools like PowerShell and Resource Group Templates (RGTs), we have to weigh this threshold for creation against a new set of values. Is a script with 30 parameters at the top and some blocks of code you have to run manually enough? Or do you build a script with if/then/else, switch and validation logic around that original code, where the logic flow becomes more code than the actual deployment? The next level is script that generates the RGT JSON or PowerShell syntax dynamically. If this complex code had been using the property VM.OSProfile.virtualharddisks, how would the script have responded, and would the time to develop (and in the future maintain) this framework, around what is already a fairly streamlined deployment, be worth trading for the time to deploy manually?
Azure Resource Group Templates are a great example: the language is JSON and fairly straightforward at a glance, with inputs (parameters), variables, resources and outputs. With Azure's rate of change, writing code around this could take weeks and months, as opposed to simply managing several RGTs as the deployments are needed. DevOps methodologies are starting to introduce this level of automation into code development, and Infrastructure as Code is rapidly changing what we can tear down and redeploy, and how quickly.
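For comparison, consuming a hand-managed RGT from PowerShell is essentially a one-liner. This sketch assumes the AzureRM module, an existing resource group called "rg-demo", and template files named azuredeploy.json / azuredeploy.parameters.json; none of these names come from a real deployment.

[powershell]
# Minimal sketch, assuming the AzureRM module and the file names shown above
New-AzureRmResourceGroupDeployment -ResourceGroupName "rg-demo" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json" `
    -Verbose
[/powershell]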
Investing Time
If you do want to invest time, it could be spent on scripts and policy to reduce cloud costs, such as turning off VMs over the weekend or removing machines that are not tagged properly and were probably deployed for testing and never deleted; that could save more dollars than the time it takes you to execute a complex deployment script every month. Perhaps write PowerShell modules to help with things like reporting or authentication. Maybe it is worth just reading about what's new instead; I found out about Azure Resource Manager policies months after they were released, and keeping up is almost becoming a full-time job.
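As a rough illustration of the kind of cost-saving script I mean (a sketch only, assuming the AzureRM cmdlets, an active login and our Owner tag convention), deallocating every VM that carries no Owner tag is only a couple of lines:

[powershell]
# Hedged sketch - deallocates any VM without an Owner tag; assumes the AzureRM cmdlets and an active login
Get-AzureRmVM | Where-Object { -not $_.Tags -or -not $_.Tags.ContainsKey('Owner') } |
    ForEach-Object { Stop-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force }
[/powershell]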
Summary
This article doesn't have the answers; it's meant to add a new perspective and raise questions about what you automate and how far you push the modularization and reuse of code.
Disk to LUN Mapping for a Windows VM using Microsoft Storage Spaces in Azure
When multiple data disks are attached to a Windows VM using Microsoft Storage Spaces in Azure, it can be difficult to identify the LUN a disk belongs to. This is especially troublesome when there is a need to delete, clone, or migrate specific disks from one VM to another VM in Azure. A scenario where this may come in handy is if you need to migrate only 1 or 2 disks associated with Microsoft Storage Spaces while leaving other disks in place on the VM.
When a disk is not associated with a storage pool, the disk's LUN can easily be seen in the Location property of the General tab of the disk's properties in the "Disk Management" utility.
When a disk is associated with a storage pool, the disk will not show up in the "Disk Management" utility, so its LUN is not easily viewable.
The following can help identify the Disk to LUN Mapping on your Windows VM in Azure when Microsoft Storage Spaces are used.
We will collect some information on the disks displayed in Device Manager and use it to identify the LUNs of the disks you would like to delete, clone, or migrate. You will need to do these steps for each of the "Microsoft Storage Space Device" objects under "Disk drives" in the Device Manager utility.
- Physical Device Object Name – Can be used as the unique identifier for the “Microsoft Storage Space Device”.
- Volume and Drive Letter(s) – Will show what volumes are associated with the storage space device.
- Volume Capacity – This is optional but helpful to see.
- Power relations – Uniquely identifies each “Microsoft Virtual Disk” used by the “Microsoft Storage Space Device”.
- LUN ID(s) – This is derived by manually converting the last 6 characters of each of the “Power relations” values (see above) from hex to decimal.
1. Open Device Manager
a. Remote Desktop into your Azure VM that has the disks you would like to delete, clone, or migrate.
b. Right-Click Start Menu –> Select “Device Manager”
2. Get properties of the “Disk drives” object.
Note: This will need to be done for each of the “Microsoft Storage Space Device” objects.
a. Right Click on the “Microsoft Storage Space Device” object and select properties.
3. Get the “Microsoft Storage Space Device” object’s “Physical Device Object name” property.
a. Select the “Details” tab of the “Microsoft Storage Space Device” properties window.
b. Select the “Physical Device Object name” property in the “Property” field dropdown menu.
c. Write down the “Physical Device Object Name” value for this “Microsoft Storage Space Device”.
Sample Value: “\Device\Space1”
4. Get the Volumes associated with the “Microsoft Storage Space Device”.
a. Select the “Volumes” tab of the “Microsoft Storage Space Device” properties window.
b. Click the “Populate” button to retrieve all volume information for this “Microsoft Storage Space Device”.
c. For each volume in the “Microsoft Storage Space Device” make note of the “Volume” and “Capacity” fields.
Sample Value “Volume”: “volume-on-pool-a (M:)”
Sample Value “Capacity”: “2016 MB”
5. Get the “Power relations” properties for the “Microsoft Storage Space Device”.
a. Select the “Details” tab of the “Microsoft Storage Space Device” properties window.
b. Select the “Power relations” property in the “Property” field dropdown menu.
c. Write down the “Power relations” value(s) for this “Microsoft Storage Space Device”.
Sample “Power relations” Value(s):
SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000001
SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000000
6. Identify the LUN information in the “Power relations” Value string(s) and convert from HEX to Decimal.
Note: The LUN information is stored in hexadecimal format as the last 6 characters of the “Power relations” value string(s). At this time it needs to be manually identified and converted to decimal for LUN identification. See the sample data below:
Sample “Power relations” value: “SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\00001c”
Sample hexadecimal value (last six characters from above): “00001c”
Sample LUN ID (converted from HEX): “28”
a. Get the last 6 characters of the “Power relation” string(s).
b. Convert the last 6 characters from hexadecimal to decimal using Calc.exe (in Programmer mode), or with the PowerShell snippet shown after these steps. This is the LUN ID.
c. Note down the LUN ID with the other collected information.
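If you would rather not use Calc.exe, the same conversion can be done with a couple of lines of PowerShell. This is a minimal sketch that simply parses the sample value from step 6; substitute your own “Power relations” string.

[powershell]
# Minimal sketch - converts the last 6 characters of a "Power relations" value from hex to a LUN ID
$powerRelation = 'SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\00001c'    # sample value from step 6
$hexPart = $powerRelation.Substring($powerRelation.Length - 6)    # "00001c"
$lunId   = [Convert]::ToInt32($hexPart, 16)                       # 28
"Hex '{0}' -> LUN {1}" -f $hexPart, $lunId
[/powershell]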
Below are some example tables that were created using the method above. These can be used to aggregate the data when multiple disks are involved.
Volumes Table

| Microsoft Storage Space Device (Physical Device Object Name) | Drive Letter | Volume Name | Capacity |
|---|---|---|---|
| \Device\Space1 | M: | volume-on-pool-a (M:) | 2016 MB |
| \Device\Space2 | N: | volume-on-pool-b (N:) | 2016 MB |

Power Relations Table (LUNs)

| Microsoft Storage Space Device (Physical Device Object Name) | Power relations | HEX (Manual - Last 6 of Power Relations) | LUN (Manual - Convert Hex to Decimal) |
|---|---|---|---|
| \Device\Space1 | SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000001 | 000001 | 1 |
| \Device\Space1 | SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000000 | 000000 | 0 |
| \Device\Space2 | SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000003 | 000003 | 3 |
| \Device\Space2 | SCSI\Disk&Ven_Msft&Prod_Virtual_Disk\000002 | 000002 | 2 |
Generating HTML Reports in PowerShell – Part 4
Almost done
I'm sure you would like to have something other than "Your Logo Here" at the top of these reports; well, you can. I have modified the module to accept two methods for rendering logos. The module will now accept a file path to a logo, or alternatively you can code your base64 string into the module, which is the default if no files are specified.
Pre-requisites
Again, we are going to use the code that creates the Azure VMs recordset and other session variables we built up in Part 1. The module is available through the PowerShell Gallery or can be installed from PowerShell using Install-Module -Name ReportHTML. Additionally, you can contribute via the ReportHTML GitHub project, which also has all the example scripts for download. You can get the example code for Part 4 from these GitHub links: with comments and without comments.
Using images for Logos (Example 12)
This example uses two JPG files, clientlogo and mainlogo, and has them encoded into base64 strings by the Get-HTMLClose function.
[powershell]
####### Example 12 ########
# The two logo files are stored in the report path
$MainLogoFile = Join-Path $ReportOutputPath "ACELogo.jpg"
$ClientLogoFile = Join-Path $ReportOutputPath "YourLogo.jpg"

$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 12")
$rpt += Get-HtmlContentOpen -HeaderText "Size Summary"
$rpt += Get-HtmlContentTable ($RMVMArray | group Size | select Name, Count | sort Count -Descending)
$rpt += Get-HtmlContentClose

# In this case we are going to swap the logos around by switching the files passed
# to the ClientLogoFile and MainLogoFile parameters
$rpt += Get-HtmlClose -ClientLogoFile $MainLogoFile -MainLogoFile $ClientLogoFile

Test-Report -TestName Example12
[/powershell]
Change logos in the module (Example 13)
This example shows how the default option works. There are 5 client logo base64 strings encoded into the module. Simply calling the module with -ClientLogoType ClientLogo1 through ClientLogo5 uses a switch statement to select which logo to use. You can use PowerShell or a website to create the encoded string, which can then be put into the module. This obviously breaks receiving updates, so I would recommend not using it, but it's there at the moment. Below shows the code where the ClientLogo and MainLogo strings are located.
[powershell]
####### Example 13 ########
$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 13")
$rpt += Get-HtmlContentOpen -HeaderText "Size Summary"
$rpt += Get-HtmlContentTable ($RMVMArray | group Size | select Name, Count | sort Count -Descending)
$rpt += Get-HtmlContentClose

# We have been using Get-HtmlClose up until now, which has a default of ClientLogo1
# In this case we can specify ClientLogo5
$rpt += Get-HtmlClose -ClientLogoType ClientLogo5

Test-Report -TestName Example13
[/powershell]
Using images for Logos (Example 14)
In this example we pass the base64 string in directly, which makes moving a report around without an image file easier. This uses the ClientLogoBase64 and MainLogoBase64 parameters.
Edit: Please note I uploaded some bad code relating to Get-HtmlClose and logos; please update to 1.0.0.12. Apologies for any inconvenience.
[powershell]
####### Example 14 ########
# For this we need to read the files and create the strings. You could do this once
# and code the base64 strings into the script
$MainLogoFilePath = Join-Path $ReportOutputPath "ACELogo.jpg"
$ClientLogoFilePath = Join-Path $ReportOutputPath "YourLogo.jpg"
$MainBase64 = [Convert]::ToBase64String((Get-Content $MainLogoFilePath -Encoding Byte))
$ClientBase64 = [Convert]::ToBase64String((Get-Content $ClientLogoFilePath -Encoding Byte))

# If you output $ClientBase64 and copy the content into a here-string, you can create
# the logo image without access to the file

$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 14")
$rpt += Get-HtmlContentOpen -HeaderText "Size Summary"
$rpt += Get-HtmlContentTable ($RMVMArray | group Size | select Name, Count | sort Count -Descending)
$rpt += Get-HtmlContentClose
$rpt += Get-HtmlClose -ClientLogoBase64 $ClientBase64 -MainLogoBase64 $MainBase64

Test-Report -TestName Example14
[/powershell]
Conclusion
I really hope you find this useful and that it helps you generate HTML reports on the fly to make someone's job easier. There is another function, ConvertTo-AdvHTML, here that requires some HTML knowledge; its output could be intertwined with the ReportHTML module for advanced usage. There is also a module here which has column sorting that could be merged in. There is a lot of potential for expanding what is there and a lot of room to improve as well. Please feel free to post suggestions, contact me, or contribute via GitHub.
Good luck.
Generating HTML Reports in PowerShell – Part 3
Welcome Back
You're still here; hopefully you got the content working from Part 1 and Part 2 in this series about generating HTML reports in PowerShell. Now let's have a look at creating a pie chart in the script. The module is available through the PowerShell Gallery or can be installed from PowerShell using Install-Module -Name ReportHTML. Additionally, you can contribute via the ReportHTML GitHub project, which also has all the example scripts for download.
Pre-requisites
Again, we are going to use the code that creates the Azure VMs recordset and other session variables we built up in Part 1. You can get the code for Part 3 from these GitHub links: with comments and without comments.
Module Changes
I have added two functions to the module to help create pie charts. The first creates an object containing the properties for the chart; there is a lot of room for expansion and options for different chart styles, which I will discuss at the end of this post. Create-HTMLPieChartObject creates a custom object with default chart properties that we pass to the Create-HTMLPieChart function along with a grouped recordset. Simply using the PowerShell Group-Object cmdlet on an array creates an array with Name and Count properties that are used as the data points on the chart. Alternatively, you can create your own array with Name and Count headings, or use an expression to rename existing headings to Name and Count, meaning you don't have to use Group-Object at all. Please make sure to update your local module files from GitHub.
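For example (a hedged sketch only; $SomeRecordset and its Location and VMCount columns are made up for illustration), calculated properties let you rename any two columns to Name and Count so the recordset can feed the chart function without Group-Object:

[powershell]
# Hedged sketch - $SomeRecordset and its Location/VMCount columns are hypothetical
$PieChartData = $SomeRecordset |
    Select-Object @{Name = 'Name';  Expression = { $_.Location }},
                  @{Name = 'Count'; Expression = { $_.VMCount }}
[/powershell]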
Creating a Pie Chart (Example 9) First we will create a very basic chart showing a summary of virtual machine power state.
[powershell]
####### Example 9 ########
# First we create a PieChart Object and load it into a variable
$PieChartObject = Create-HTMLPieChartObject

# Have a look at what is in this object
$PieChartObject

# Let's set one property
$PieChartObject.Title = "VMs Power State"

$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 9")
$rpt += Get-HtmlContentOpen -HeaderText "Chart Series"
$rpt += Create-HTMLPieChart -PieChartObject $PieChartObject -PieChartData ($RMVMArray | group PowerState)
$rpt += Get-HtmlContentClose
$rpt += Get-HtmlClose

Test-Report -TestName Example9
[/powershell]
Pie Chart & Additional Properties (Example 10)
Let's have a look at some of the properties we can set.
[powershell]
####### Example 10 ########
$PieChartObject = Create-HTMLPieChartObject
$PieChartObject.Title = "VMs Sizes Deployed"

# There is a lot of data, so let's make the pie chart a little bigger and explode the largest value
$PieChartObject.Size.Height = 600
$PieChartObject.Size.Width = 600
$PieChartObject.ChartStyle.ExplodeMaxValue = $true

$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 10")
$rpt += Get-HtmlContentOpen -HeaderText "Chart Series"

# To summarize the data I have simply changed the group-by property to Size
$rpt += Create-HTMLPieChart -PieChartObject $PieChartObject -PieChartData ($RMVMArray | group Size)
$rpt += Get-HtmlContentClose
$rpt += Get-HtmlClose

Test-Report -TestName Example10
[/powershell]
Pie Chart & Results Table (Example 11)
Here is a quick example with a pie chart and results table.
[powershell]
####### Example 11 ########
$PieChartObject1 = Create-HTMLPieChartObject
$PieChartObject1.Title = "VMs Powerstate"

$PieChartObject2 = Create-HTMLPieChartObject
$PieChartObject2.Title = "VMs Sizes"
$PieChartObject2.Size.Height = 800
$PieChartObject2.Size.Width = 800
$PieChartObject2.ChartStyle.ExplodeMaxValue = $true

$rpt = @()
$rpt += Get-HtmlOpen -TitleText ($ReportName + " Example 11")
$rpt += Get-HtmlContentOpen -HeaderText "Power Summary"
$rpt += Create-HTMLPieChart -PieChartObject $PieChartObject1 -PieChartData ($RMVMArray | group PowerState)
$rpt += Get-HtmlContentTable ($RMVMArray | group PowerState | select Name, Count)
$rpt += Get-HtmlContentClose
$rpt += Get-HtmlContentOpen -HeaderText "VM Size Summary"
$rpt += Create-HTMLPieChart -PieChartObject $PieChartObject2 -PieChartData ($RMVMArray | group Size)
$rpt += Get-HtmlContentTable ($RMVMArray | group Size | select Name, Count | sort Count -Descending)
$rpt += Get-HtmlContentClose
$rpt += Get-HtmlClose

Test-Report -TestName Example11
[/powershell]
More Charting Properties
There are a lot of properties that can be added to this object. I even thought about creating a couple of default sets of properties, for instance defaults for 'Exploded', 'SLA' or 'DisplayValues'. If you want to help expand this chart object, please contribute and edit the GitHub project. The options for charting can be found in Microsoft's Class Reference.
Additionally, I would like to create some chart functions for line and bar charts; however, I haven't had the time to create those functions recently. If you want to explore this, please contribute via GitHub, or reach out and we can discuss how to go about it.
Summary
We have one component left to cover: changing the logos displayed at the top left and top right of the report. Currently these base64 strings are hard-coded in the module. I will be working on some more dynamic options before posting about this in Part 4.