All posts tagged with azure

How to disable the Azure AD password expiration policy through PowerShell

June 22, 2020 - Søren Alsbjerg Hørup

We recently encountered a problem with our automatic tests of a cloud solution. The solution utilizes Azure AD as identity provider and currently holds several test user accounts used by our automatic tests.

The tests were green for several weeks, but suddenly turned red because a password had expired! No problem, we thought - we would simply disable password expiration for the test users in the AD. But after traversing the Azure Portal, we did not find any way to disable or change the password expiration policy (WTF!)

After some Googling, I came to the conclusion that it is not possible to change the policy through the portal, but that it is possible through PowerShell (or "PowerShelling" - is that a term I can use? :-P)

Firstly, the AzureAD module must be installed in PowerShell:

Install-Module AzureAD 

This will populate the PowerShell session with Azure AD-specific cmdlets.

Next, a connection to the specific Azure AD tenant must be established:

Connect-AzureAD -TenantId <GUID>

The GUID can be found in the Portal under Tenant ID.


Lastly, the following command gets the test user from the AD and sets the password policy to “DisablePasswordExpiration”:

Get-AzureADUser -ObjectId "<user object id or UPN>" | Set-AzureADUser -PasswordPolicies DisablePasswordExpiration

That’s it! Password should no longer expire for the given user!

Hangfire - .NET background processing made easy

April 29, 2020 - Søren Alsbjerg Hørup

The cloud project I am currently working on had the requirement that we needed to ingest, process, and write several gigs of data to a CosmosDB every 15 min.

For the processing part, we needed something that could scale, since the amount of data was proportional to the number of customers we have hooked up to the system.

Since the project consisted mainly of C# .NET Core developers, the initial processing was done in C# using async operations. This worked well, but was not really scalable - one member of the team suggested using Hangfire for the processing, which turned out to be a great fit for our use case. (Wish it was my idea, but it was not…)

Hangfire is an open-source .NET Core library which manages distributed background jobs. It does this by starting a Server in the application, to which jobs can be submitted. Job types include fire-and-forget, delayed jobs, recurring jobs, and continuations.

Hangfire uses a database to ensure information and metadata about jobs are persisted. In our case, we simply use an Azure SQL server. Multiple instances of the application hosting the Hangfire server help with the processing of the jobs.

This architecture makes it possible to e.g. submit a job for each connected customer, which is then processed by one or more nodes. If resources become a problem, we scale the application horizontally to include more instances - which can even happen automatically depending on CPU load or another metric. In our case, we use Kubernetes for that.
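The pattern is easy to illustrate outside of .NET as well. Below is a minimal Python sketch of the fire-and-forget idea - jobs submitted to a shared store and consumed by several workers - with an in-memory queue standing in for Hangfire's database-backed store (illustration only, not Hangfire's actual API):

```python
import queue
import threading

# Shared job store; Hangfire persists this in a database, here it is in-memory.
jobs = queue.Queue()
results = []
lock = threading.Lock()

def worker():
    """Consume jobs until a None sentinel arrives (one per worker)."""
    while True:
        customer_id = jobs.get()
        if customer_id is None:
            jobs.task_done()
            break
        with lock:
            results.append(f"processed customer {customer_id}")
        jobs.task_done()

# Two workers stand in for two application instances hosting the Hangfire server.
workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()

# Submit one fire-and-forget job per connected customer.
for customer_id in range(5):
    jobs.put(customer_id)

jobs.join()           # wait until all submitted jobs are processed
for _ in workers:     # shut the workers down
    jobs.put(None)
for w in workers:
    w.join()

print(len(results))
```

Scaling horizontally then simply means starting more worker instances against the same store.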

What I really like about Hangfire is the fact that one can simply start with one instance of the application hosting the Hangfire server, and scale up later if needed.

Oh! and Hangfire comes with its own integrated dashboard that allows one to see submitted jobs. Neat!

Although we are not yet in production, my gut feeling is good on this one. Highly recommended!

Pulling a new image in a Container App Service

January 21, 2020 - Søren Alsbjerg Hørup

Containers are now first-class citizens in App Services. A Container App Service can be created with a specific image that is pulled from Docker Hub or another registry.

Pulling the image only happens when the App Service is started or restarted. Automatic pulling of the image does not seem to be supported, meaning that one has to manually restart the App Service every time the image has been updated and needs to be re-pulled.

Luckily, App Service exposes a Webhook which will pull the latest image and restart the container if necessary. This can be enabled by setting Continuous Deployment to On. Afterwards, the Webhook URL can be copied and used as part of a CI/CD pipeline.


The URL has the form (following the Kudu/scm convention - verify against the exact URL shown in the portal):

https://${appservicename}:{password}@{appservicename}.scm.azurewebsites.net/docker/hook

The webhook uses basic authentication. POSTing to the URL will pull the latest image. curl can be used for this purpose as such:

curl --data '' https://\${appservicename}:{password}@{appservicename}.scm.azurewebsites.net/docker/hook

Note: Remember to escape the dollar sign if using a Bash Shell
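The same POST can also be issued from code instead of curl. A small Python sketch using only the standard library (the app name and password below are hypothetical, and the webhook path should be verified against the portal):

```python
import base64
import urllib.request

def build_webhook_request(app_name: str, password: str) -> urllib.request.Request:
    """Build an authenticated POST request for the App Service CD webhook.

    The basic-auth user name is the app name prefixed with '$'. The host
    and path follow the Kudu (scm) convention; check the URL shown in the
    portal for your own app before using this.
    """
    url = f"https://{app_name}.scm.azurewebsites.net/docker/hook"
    credentials = f"${app_name}:{password}".encode()
    token = base64.b64encode(credentials).decode()
    request = urllib.request.Request(url, data=b"", method="POST")
    request.add_header("Authorization", f"Basic {token}")
    return request

# Hypothetical app name and publish password:
req = build_webhook_request("myapp", "publish-password")
# urllib.request.urlopen(req) would then trigger the pull.
```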

For CI/CD, this can be easily integrated. An example from one of my GitHub projects:

name: Dockerize
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Login to Docker
        run: docker login -u ${{secrets.DOCKER_USERNAME}} -p ${{secrets.DOCKER_PASSWORD}}
      - name: Build the Docker image
        run: docker build . --file Dockerfile --tag horup/outfitty:latest --tag horup/outfitty:$(date +%s)
      - name: Push the Docker image
        run: docker push horup/outfitty
      - name: Trigger App Service
        run: "curl --data '' ${{secrets.APP_SERVICE_WEBHOOK_URL}}"

Here, the last step POSTs empty data to the Webhook URL. Note that the URL is kept as a secret in GitHub, since it contains basic authentication credentials. Also remember to escape the dollar sign when storing it as a secret.

That’s it!

Accessing The Microsoft Graph API using .NET

January 03, 2020 - Søren Alsbjerg Hørup

I recently needed to fetch all users contained within a Microsoft Azure Active Directory tenant from within a trusted application running on a Windows 10 computer.

For this, I implemented a .NET Core console application that utilizes the Microsoft Graph API to fetch the needed data.

Microsoft provides NuGet packages that make this a breeze. Assuming the application has been registered in Azure Active Directory and a Client Secret has been created, access can be obtained by constructing an IConfidentialClientApplication object using ConfidentialClientApplicationBuilder like so:

IConfidentialClientApplication confidentialClientApplication = ConfidentialClientApplicationBuilder
    .Create(clientId)
    .WithTenantId(tenantId)
    .WithClientSecret(secret)
    .Build();

Where clientId is the Guid of the application, tenantId is the Guid of the Azure Active Directory tenant, and secret is the client secret. The IConfidentialClientApplication and ConfidentialClientApplicationBuilder types are exposed by the Microsoft.Identity.Client NuGet package.

To access the Graph API, a GraphServiceClient must be constructed. This object provides properties and methods that can be chained to construct queries towards the API.
This type is provided by the Microsoft.Graph NuGet package.

GraphServiceClient needs an instance of an IAuthenticationProvider for it to be able to get an access token.
Microsoft provides ClientCredentialProvider, which takes our IConfidentialClientApplication as a parameter. ClientCredentialProvider is provided by the Microsoft.Graph.Auth NuGet package.

IAuthenticationProvider authProvider = new ClientCredentialProvider(confidentialClientApplication);


Note: The Microsoft.Graph.Auth package is currently in preview. Make sure to check “Include prerelease” to be able to find this package if you use the NuGet Package Manager in VS2019

Since ClientCredentialProvider implements the IAuthenticationProvider interface, we can now instantiate the GraphServiceClient:

GraphServiceClient graphClient = new GraphServiceClient(authProvider);

With this, we can do queries towards the Graph API. The following example gets all users of the Active Directory and returns them as an IGraphServiceUsersCollectionPage for further processing.

IGraphServiceUsersCollectionPage users = await graphClient.Users
    .Request()
    .Select(e => new { e.DisplayName, e.Id })
    .GetAsync();
That’s it! Remember to provide the needed API Permissions for the application if you intend to execute the query above.

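One thing to be aware of: a collection page holds only the first batch of results, and subsequent batches are fetched via a next link (the Microsoft.Graph SDK exposes this as the page's NextPageRequest). The underlying OData paging pattern can be sketched in Python, with a hypothetical fetch function standing in for the authenticated HTTP calls:

```python
def fetch_page(url):
    # Stand-in for an authenticated GET against the Graph API; each page
    # carries an "@odata.nextLink" until the last one.
    pages = {
        "/users?page=1": {"value": ["alice", "bob"], "@odata.nextLink": "/users?page=2"},
        "/users?page=2": {"value": ["carol"]},  # no nextLink: last page
    }
    return pages[url]

def all_users(start_url):
    """Follow next links until the whole collection has been gathered."""
    users, url = [], start_url
    while url:
        page = fetch_page(url)
        users.extend(page["value"])
        url = page.get("@odata.nextLink")  # None on the last page
    return users

print(all_users("/users?page=1"))  # ['alice', 'bob', 'carol']
```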

Cosmos DB does not support Decimal, nor Float

September 30, 2019 - Søren Alsbjerg Hørup

Similar to the Table Storage post, Cosmos DB does not support Decimal.

It does not even support single-precision floating points, only double-precision.

Not a big issue, but beware: if floats are used, they are ignored and not automatically converted to doubles, meaning the data is not persisted.

Azure Table Storage does not support Decimal

September 17, 2019 - Søren Alsbjerg Hørup

As the title suggests, Azure Table Storage does not support the Decimal fixed-point type of .NET. One needs to be very aware of this, since no warnings or errors are provided when using the Table Storage API.

I have just spent an hour trying to figure out why my decimal values were all zeroed out when retrieving the rows back from the table. Changing decimal to double fixed the issue.

For my application, double is fine, but for applications requiring fixed-point arithmetic this is definitely a con.
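The difference between binary floating point and fixed-point decimals is easy to demonstrate. A quick Python illustration (the same issue applies to .NET's double vs decimal):

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 or 0.2 exactly, so the sum
# drifts slightly; a fixed-point decimal type keeps the values exact.
float_exact = (0.1 + 0.2 == 0.3)                                     # False
decimal_exact = (Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
print(float_exact, decimal_exact)
```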

As far as I can see, CosmosDB has the same limitation.

Oh well…

Azure Data Studio

August 27, 2019 - Søren Alsbjerg Hørup

I have always used SQL Server Management Studio (SSMS) when interfacing with an MS SQL server or Azure SQL Server. Recently, I have begun using Azure Data Studio, an open-source alternative from Microsoft (although with limitations).

I have not used it for long, but my experience can be summed up in the following bullets:

  • Round trips and queries towards the server do not block the UI like in SSMS, but are instead handled in the background as they should be.
  • Even though background processing works without blocking, the UI seldom shows this fact, meaning that completed background work might show up after a couple of seconds without warning.
  • Intellisense is in there for general SQL, but it is not as context-aware as in SSMS, e.g. it does not automatically suggest table names unless specified with the full schema.
  • It does not provide much help with designing tables; everything needs to be done in SQL.
  • It provides extensions, similar to VSCode, that can be installed to increase functionality, e.g. connecting to other databases such as Postgres.
  • It has built-in support for data charting (although I was unable to get this feature to work).

Azure Data Studio is very nice if you know SQL and use it frequently. If, however, database management and table/schema design is something you seldom do, and you thus require a bit more help in the process, SSMS is in my opinion more attractive, since it provides much more functionality for designing tables and the like.

In any case, I think Azure Data Studio will replace SSMS in the future for the majority of users - I will definitely use it for simpler queries that I know by hand.

Azure Blob Storage Benchmark

July 25, 2019 - Søren Alsbjerg Hørup

Another quick benchmark I did regarding Azure was the transfer of messages with 65K payloads to Azure Blob Storage.

My test application was a .NET Core Console application that:

  • Created 1000 tasks.
  • Each task would instantiate a message with a unique GUID and populate it with a payload of 65K.
  • Each task would then push the message to Blob Storage.
  • This process would be repeated 1000 times.

In total, 1000 x 1000 Blobs were created, transmitted and stored during the run of the application.
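The shape of the test application can be sketched like so - a scaled-down Python/asyncio stand-in for the C# tasks, with the actual Blob Storage upload simulated:

```python
import asyncio
import uuid

PAYLOAD = b"x" * 65_000  # 65K payload per message
uploaded = 0

async def upload_blob(blob_id, payload):
    """Stand-in for the real Blob Storage upload call."""
    global uploaded
    await asyncio.sleep(0)  # yield control, as a real network call would
    uploaded += 1

async def task_worker(rounds):
    # Each task repeatedly creates a uniquely named message and pushes it.
    for _ in range(rounds):
        await upload_blob(str(uuid.uuid4()), PAYLOAD)

async def main(tasks=10, rounds=10):  # scaled down from 1000 x 1000
    await asyncio.gather(*(task_worker(rounds) for _ in range(tasks)))

asyncio.run(main())
print(uploaded)  # 100
```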

The result: my app was consuming 28% of my CPU while using 600 Mbit/s of bandwidth - which is the limit of my upload.

Clearly, Azure is not the limiting factor here, neither is my CPU.

Mosquitto Performance on low-end VM

July 24, 2019 - Søren Alsbjerg Hørup

I am currently designing a new topology for inter-controller communication using Mosquitto as the broker. For fun, I wanted to see how far I could push Mosquitto, so I started a low-end Ubuntu VM in Azure and wrote a simple .NET test application using MQTTnet to put a bit of load onto it.

I chose the B1s VM (1 vCPU and 1 GiB of memory), installed Ubuntu and Mosquitto, and opened the default 1883 port for MQTT. This would serve as my broker.

For the benchmarking application I wrote a .NET Core Console app that would:

  • Instantiate N Tasks
  • Each Task instantiates its own MQTTnet client and connects to the broker.
  • Each Task subscribes to a “ping/{clientid}” topic and each second sends a ping message containing Environment.TickCount + a 65K payload to the ping topic with its own client id.
  • The ping would be returned to the Task due to the subscription, and the ping time could be measured.

With N = 200, 200 msg/s would be transmitted to the broker and back again, resulting in a theoretical bandwidth requirement of 13 MB/s plus overhead.

For the benchmark I decided to run the client on my own dev machine, since I can currently upload 50MB/s using the fiber connection that I have available - plenty of bandwidth for my test purposes.

Starting the client app with N=200, I saw the ping times fluctuating from 60ms to 300ms, indicating a bottleneck somewhere (needs to be analyzed further before concluding anything)

Using Azure Metrics, I could quickly measure the CPU and bandwidth usage of the VM. The CPU usage peaked at 28.74%, and the upload from the VM to my client peaked at 15.61 MB/s, or 125 Mbit/s.

I did not expect this kind of bandwidth/performance from this class of VM in Azure. Of course, I only tested the burst performance and not the long-running performance of the VM, which might differ due to the VM class being a ‘burstable VM’.

Conclusion after this test: if you want an MQTT broker, start with a small burstable VM, see how it performs over time, and upgrade if needed.

Mailing mortgage bond prices using Azure Functions, PuppeteerSharp, MSSQL, and SendGrid

July 05, 2019 - Søren Alsbjerg Hørup

The mortgage interest rate in Denmark is approaching an all-time low this summer. The current 30-year bond has an interest rate of 1%, and it seems the rate will stay the same or decline even more this year.

I myself have a 3% mortgage loan on my house, so it makes perfect sense to convert from my 3% loan to the 1% loan currently being offered. Bond prices, and thus the interest rate, go up and down each day - therefore it makes perfect sense to follow the prices and take the loan when the interest rate is at its lowest.

To follow the bond price, and thus the interest rate, I implemented an Azure Function which uses PuppeteerSharp, a high-level C# API to control Chromium over the DevTools protocol, to connect to my mortgage provider, fetch the daily price of my target mortgage bond, and email it to me using SendGrid.

Azure Functions do not allow GDI applications to run, due to their restricted sandbox, and will therefore not run Chromium. Therefore, I have Chromium running on a separate low-powered Windows VM in the cloud, which my Azure Function connects to using PuppeteerSharp.

After connecting, the Azure Function spawns a new page on the remote Chromium, navigates the page to the mortgage provider, and uses jQuery to fetch the price of the bond. The price is saved in an SQL database, so that a comparison can be made to determine whether the price has changed and by how much.
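The compare-and-notify logic itself is tiny. A Python sketch with hypothetical names (the real version runs inside the Azure Function against SQL and SendGrid):

```python
def price_update(stored_price, fetched_price):
    """Return a notification message if the bond price moved, else None.

    stored_price is the last price persisted in the database (None on the
    very first run); fetched_price is today's scraped price.
    """
    if stored_price is None or fetched_price == stored_price:
        return None  # nothing to report (or nothing to compare against yet)
    change = fetched_price - stored_price
    return f"Bond price: {fetched_price:.2f} ({change:+.2f})"

print(price_update(99.40, 99.65))  # Bond price: 99.65 (+0.25)
print(price_update(99.65, 99.65))  # None
```

When the function returns a message, it is handed to the email service; otherwise nothing is sent.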

When a price change is detected, the new price and the change are emailed to me using SendGrid, an easy-to-use email service. Since I actively check my email all day anyway, this allows me to easily monitor the bond price each day without having to visit the mortgage provider.

Getting an email with just the necessary information really helps in monitoring the trend of the bond and deciding when to act, since it is pushed automatically to me (my mailbox) with all unnecessary clutter removed, such as the prices of the other bonds.

My next step is to find another domain where I can apply these principles - or perhaps feature-creep and add graph and/or prediction support to my bond monitor :-)

Azure 502.5 error with ASP.NET Core 2.1

August 23, 2018 - Søren Alsbjerg Hørup

I recently updated the .NET Core version of one of my ASP.NET Core projects from 2.0 to 2.1. To my surprise, the application failed to start after publishing to Azure, with a 502.5 error thrown in my face.

Investigation showed that the target version of .NET Core I was using was not available on the App Service instance I was targeting.

To fix this, I changed my Publish configuration from Framework-dependent to Self-contained.


This change deploys the complete framework alongside my application, with the downsides that:

  • Deployment takes a bit longer, since the framework needs to be deployed as well.
  • I need to manage the framework alongside my app, e.g. update it if security bugs or similar are found in the framework.

In any case, this allows me to deploy 2.1 applications to Azure until 2.1 is properly deployed as part of the infrastructure.

DNS for Azure VM

April 19, 2017 - Søren Alsbjerg Hørup

I needed several VMs in Azure for testing purposes and I required a semi-stable HTTP server on each of them. Normally I give the VM a public IP, but since this was for testing purposes I did not want to waste any of my public IPs.

Azure VMs can still be accessed from the outside even without a public IP, but the IP changes on each reboot (at least). Azure allows one to set up a DNS address for the dynamic IP to be used instead of the IP, which is awesome.

This can be done by:

  • Selecting the VM’s resource group under Resource Groups in the Azure portal.
  • Select the Public IP address associated with the VM.
  • Select Configuration and then specify the optional DNS name label.

Depending on the location of the VM, the DNS name will look something like this (here for a VM in West Europe):

{name}.westeurope.cloudapp.azure.com

Obviously, one must choose a name which does not conflict with another DNS name in the Azure cloud.