Azure Front Door and access restrictions

Many of you may already be familiar with Azure Front Door, but if not, let me summarise.

Azure Front Door is a cloud-based service from Microsoft Azure that provides a scalable and secure way to route traffic to various backend services, such as web applications, APIs, and microservices. It acts as a global load balancer that can intelligently distribute traffic across multiple regions based on geographic location, latency, and other metrics.

Some of the key features of Azure Front Door include:

  1. Global load balancing: Azure Front Door can intelligently distribute traffic across multiple backend services located in different regions, ensuring optimal performance and availability for users worldwide.
  2. Security: Azure Front Door provides SSL termination, DDoS protection, and other security features to help protect your backend services from malicious attacks.
  3. Traffic routing: Azure Front Door can route traffic based on user location, content type, URL path, and other criteria, making it easy to implement complex traffic routing scenarios.
  4. High availability: Azure Front Door is designed to provide high availability and reliability, with built-in redundancy and automatic failover capabilities.
  5. Analytics: Azure Front Door provides detailed analytics and monitoring capabilities, including real-time metrics, logs, and alerts, to help you optimize your traffic routing and improve the performance of your backend services.

Overall, Azure Front Door is a powerful tool for managing and optimizing traffic to your backend services, helping to ensure high performance, scalability, and security for your applications and APIs.

When you create an Azure web application, out of the box you get a website whose hostname ends with .azurewebsites.net. Many customers want to use a custom domain instead, so the URL is much cleaner and nicer, for example gregorsuttie.com instead of gregorsuttie.azurewebsites.net.

So if you are using Azure Front Door as part of a solution along with a custom domain, and you would like to restrict access so that users cannot hit the .azurewebsites.net hostname directly, you can go to Networking on the App Service and use what's called an access restriction.

Once the restriction is in place, anyone browsing to the .azurewebsites.net address directly will see an error (HTTP 403 Forbidden) instead:-

You can even whitelist IP addresses so that certain users can still use the .azurewebsites.net hostname, as well as the Kudu interface.
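
If you prefer to codify this rather than clicking through the portal, the same restriction can be expressed in Bicep. The snippet below is a rough, hypothetical sketch (the parameter names and the allowlisted IP are placeholders): it denies everything by default, allows only traffic arriving via the AzureFrontDoor.Backend service tag that carries your Front Door profile's ID in the x-azure-fdid header, and allowlists one IP for the Kudu (SCM) site.

@description('Name of the existing App Service to lock down')
param webAppName string

@description('The frontDoorId (GUID) of your Front Door profile')
param frontDoorId string

resource webApp 'Microsoft.Web/sites@2022-09-01' existing = {
  name: webAppName
}

resource accessRestrictions 'Microsoft.Web/sites/config@2022-09-01' = {
  parent: webApp
  name: 'web'
  properties: {
    // Deny anything that does not match an Allow rule below
    ipSecurityRestrictionsDefaultAction: 'Deny'
    ipSecurityRestrictions: [
      {
        name: 'Allow-FrontDoor-Only'
        action: 'Allow'
        priority: 100
        tag: 'ServiceTag'
        ipAddress: 'AzureFrontDoor.Backend'
        headers: {
          // Only requests that came through *your* Front Door profile
          'x-azure-fdid': [ frontDoorId ]
        }
      }
    ]
    // Optional: allow a specific IP to keep using the Kudu/SCM site
    scmIpSecurityRestrictionsDefaultAction: 'Deny'
    scmIpSecurityRestrictions: [
      {
        name: 'Allow-Admin-IP'
        action: 'Allow'
        priority: 100
        ipAddress: '203.0.113.10/32' // placeholder IP
      }
    ]
  }
}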

Don’t forget to subscribe to my YouTube Channel. And my Newsletter



Ignite 2023 In-Person Review

This blog post covers my recent trip to Seattle for the Microsoft Ignite conference last week. I always give my honest opinion and have no filter: last year's Ignite wasn't great, but this year's was very much improved, and here is why.

This year's Ignite was in a new venue, right next door to last year's, in the newer Seattle Convention Center, and the venue itself works great: not huge amounts of walking, and the layout just works. The weather was good, the atmosphere was good all week, there was swag, and even the Ignite bag option made a welcome return. Satya was there in person, and attending his keynote was awesome.

I enjoyed this year's Ignite for several reasons. I thought the sessions were of a good standard; some people are still saying there wasn't enough deep-dive content, but all said and done I thought it hit the mark.

I felt like I was witnessing the dawn of a new era. It really did feel like this is the age of AI. Now, I know we're all sick of hearing about copilots and AI, but this is the start of a huge change in the industry: our way of working is already changing and we are just starting out on the journey. I can't wait to see where we are this time next year; as these language models and the tooling around them get better, we can invent new ways of working, faster and smarter. Bring it on.

If you or your company aren't seriously looking into AI and what it offers, I think you'll be missing out for sure.

So Ignite was better this year for the following reasons:-

  • Speakers hung around after sessions to talk
  • The Ask the Experts area was super popular
  • The venue made it easy to find rooms and move around
  • The community lounge had some heavy hitters stop by to hang out, chat, and network
  • The keynotes were in a cool venue with a nice atmosphere
  • The content was all about the era of AI, but it's exciting times, and a lot of the sessions demoed what's new and what's coming next
  • We had some new MVP networking meetups on campus with the Azure Cosmos DB team and the Azure MVP leads
  • Networking with old and new friends, enjoying learning about new technologies, and generally having fun. A big thing for me is meeting the people I connect with online, so it's so much fun to meet them in person and say hi.
  • The after-parties were pretty good; I really enjoyed the silent disco, not gonna lie

Highlights included meeting Brendan Burns, Mark Russinovich, Erin Chapple, and some guys called Patch and Switch (dunno who they are really), and going to watch the Seattle Kraken, which was awesome!

See you in Vegas next year!



OpenSource walkthrough step by step LIVE!

Last night I interviewed a chap called John Aziz on my stream. John is 22 years of age and is a Gold Microsoft Student Ambassador. His name popped up when I saw this tweet from Savannah Ostrowski, who is the Product Lead for the Azure Developer CLI.

I love open source and I hear good things about Hacktoberfest, a month-long celebration of open-source projects, their maintainers, and the entire community of contributors.

I wanted to know more about John and his open-source contributions, especially as they relate to the Azure Developer CLI, which, if you're into Azure at all (not just devs), I think you should check out asap.

John came onto my stream last night and introduced himself, then introduced Hacktoberfest and talked people through how they can get started in open source, and in Hacktoberfest itself.

John then picks an issue that needs resolving and starts to work on it LIVE. OK, now this is seriously impressive; I honestly haven't seen anyone do this before. I'm sure people do this on streams, but I had never spoken to John before, he's 22, and he doesn't even know the Go programming language.

I was thoroughly impressed; if you're interested in watching, the link is below.

John is going to be working at Microsoft some day, probably not long from now; I have no doubt about that. I hope you enjoy the video and learning about open source and how to get started.



3 New Azure Bicep Features

Recently I invited Freek Berson onto my Azure livestream to discuss Bicep and to demo three of his favourite new features.

Freek and I chatted about

  • What is Bicep?
  • How is Bicep being adopted?
  • Why is IaC important?
  • Demo: Bicep Parameter Files (see the sketch after this list)
  • Demo: Opinionated Formatting
  • Demo: Deployment Stacks
  • What you need to use these features
  • Call to Action
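
To give a flavour of the first demo, here is a rough sketch of what the new .bicepparam parameter file format looks like. It assumes a main.bicep that declares location and tags parameters, and the values are made up:

// main.bicepparam - points at the template it supplies values for
using './main.bicep'

param location = 'westeurope'
param tags = {
  environment: 'dev'
  owner: 'gregor'
}

You can then point a deployment at the parameter file, for example with az deployment group create --resource-group my-rg --parameters main.bicepparam (you'll need a recent version of the Azure CLI and Bicep tooling for this).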

You can catch the livestream below

Please subscribe and leave comments 🙂

Don’t forget to subscribe to my YouTube Channel. And my Newsletter




Dapr 101: with Azure Greg and Marc Duiker

This past week I was joined by Marc Duiker on my Azure stream. Marc came on to cover an intro to Dapr and show a few examples of why Dapr is such an interesting project if you are building something cloud native, especially if you are interested in learning more about microservices and looking for a way to make the complex areas of microservices far less complex.

“Dapr (Distributed Application Runtime) is a free and open source runtime system designed to support cloud native and serverless computing. Its initial release supported SDKs and APIs for Java, .NET, Python, and Go, and targeted the Kubernetes cloud deployment system.”

In our livestream Marc introduces Dapr and runs through just a couple of slides before we dive into Visual Studio Code, crack open the code, and he starts showing me some demonstrations of using the Dapr CLI, covering the following:-

  • Dapr 101: start building distributed applications with ease
  • Dapr building block APIs
  • When is it really useful to use Dapr?
  • Demo: State management using Dapr
  • Demo: Resiliency built into Dapr
  • Demo: Workflows using Dapr
  • Demo: Chaining Workflows using Dapr
  • Demo: Observability using Dapr

If you would like to learn more then you can watch the video here:

Please subscribe and leave comments 🙂

Don’t forget to subscribe to my YouTube Channel. And my Newsletter




Outstanding Contribution to Microsoft Community – Global Winner

Proud of this one, and I decided to blog about it so I have a record of it here.

Four years ago I had no Azure experience; hard work pays off. Time to plan what's next…



Azure VM Extensions: Part 3 Refactoring our code

In this last part on Azure VM extensions I will make a couple of changes to refactor and improve things. Once you have more time, it's always worth going back and refactoring your code; it's a good feeling to go back and improve upon it.

In this case I wanted to use managed identities for the CustomScriptExtension, but I couldn't get it working at first and, due to time pressures, I resorted to using SAS tokens. I soon realised that this is not the best way to go, and I really wanted to revisit the codebase and get managed identities working.

I see a lot of people create system-assigned managed identities, and I try my best not to use these as they are tied to a single resource; I always create a user-assigned managed identity first, from the Azure portal or Bicep, and then use that.

So I refactored my Bicep code for the CustomScriptExtension to use the managed identity I've created; the code no longer needs to generate a new SAS token on every run, and a user-assigned managed identity is more secure.

@description('Deploy required userManagedIdentity')
module userManagedIdentity './modules/Microsoft.ManagedIdentity/userAssignedIdentities/deploy.bicep' =  {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: userManagedIdentityName
  params: {
    name: userManagedIdentityName
    location: location
    tags: tags
  }
  dependsOn: resourceGroups
}

The above Bicep code creates our user-assigned managed identity, and then we can make use of it within our CustomScriptExtension like so.

module virtualMachineName_ZabixxInstaller './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: 'ZabixxInstaller'
  params: {
    enableAutomaticUpgrade: false
    name: 'ZabixxInstaller'
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    virtualMachineName: virtualMachineNameBackend
    location: location
    autoUpgradeMinorVersion: true
    settings: {
      // No SAS token appended any more - the blobs are fetched using the managed identity
      fileUris: [
        'https://${storageAccountName}.blob.core.windows.net/${containername}/InstallZabbixAgent.ps1'
        'https://${storageAccountName}.blob.core.windows.net/${containername}/zabbix.zip'
      ]
    }
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgent.ps1'
      // Tell the extension which identity to use when downloading the fileUris
      managedIdentity: {
        objectId: userManagedIdentity.outputs.principalId
      }
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}
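
One thing worth calling out: for the extension to download those blobs with the user-assigned identity, the identity has to be attached to the VM, and it also needs a data-plane role on the storage account. Below is a rough sketch of the role assignment; it assumes the storage account sits in the same scope as the deployment and reuses the storageAccountName and userManagedIdentityName parameters from above, so adjust it to your setup.

resource storageAccount 'Microsoft.Storage/storageAccounts@2022-09-01' existing = {
  name: storageAccountName
}

// Grant the user-assigned identity read access to blobs so the
// CustomScriptExtension can pull the fileUris without a SAS token
resource blobReaderRoleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(storageAccount.id, userManagedIdentityName, 'StorageBlobDataReader')
  scope: storageAccount
  properties: {
    // Built-in 'Storage Blob Data Reader' role
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1')
    principalId: userManagedIdentity.outputs.principalId
    principalType: 'ServicePrincipal'
  }
}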

Summary

In summary, we went from generating a SAS token from the Azure storage account to using a user-assigned managed identity, which is more secure.


Don’t forget to subscribe to my YouTube Channel. And my Newsletter




Azure VM Extensions: Part 2 CustomScriptExtension

This blog post covers some of the battles I have had trying to install some software onto a VM within Azure. There are many ways to go about this and at the end of the day, yeah you live to fight another day.

High level requirements:

1 – Install the Zabbix Agent (which is a Windows service) onto the same VM and ensure the service starts correctly.

Like most things in life, there is more than one way to do something: I could have used Run Commands, or I could have added this step onto the DSC from part 1, etc.

I went with using an Azure CustomScriptExtension, maybe not the best option, who knows, but here is how to get this working.

module virtualMachineName_ZabixxInstaller './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: 'ZabixxInstaller'
  params: {
    enableAutomaticUpgrade: false
    name: 'ZabixxInstaller'
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    virtualMachineName: virtualMachineNameBackend
    location: location
    autoUpgradeMinorVersion: true
    settings: {
       fileUris: [ 
        'https://${storageAccountName}.blob.core.windows.net/${containername}/InstallZabbixAgentGS.ps1?${DSCSAS}'
        'https://${storageAccountName}.blob.core.windows.net/${containername}/zabbix.zip?${DSCSAS}'
        ]
    }
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgentGS.ps1'
      managedIdentity: {}
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}

So let's break this code down. Firstly, I make use of the already written Azure Bicep resource modules, which you can grab here:- https://github.com/Azure/ResourceModules/

I’m also using version 1.10 of this extension so make sure to watch out for that.

The fileUris point to a storage account that has public access DISABLED (enabling public access is not what you want). Notice the part at the end of each URL, which is a SAS token. I tried to get this working using a managed identity and gave up as I was running out of time; the docs say you can use a managed identity, so maybe I will go back and try this now that I have more time, and then update this blog post.

“If I generate a new SAS each time you deploy based on deployment time, it will re-run the Script extensions every time, since that SAS value changes. If you move to Managed Identity, the script extension does not have to process when you redeploy, it will be skipped, since the configuration/settings didn’t change. If you want it to redeploy, with no changes, then you can change the value of the forceUpdateTag value.”
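
To illustrate that last point about forceUpdateTag: on the underlying Microsoft.Compute/virtualMachines/extensions resource it is just a string property, and changing its value is what forces the script to re-run. A rough sketch, using the native resource rather than the wrapper module and reusing the same parameters as above, might look like this:

@description('Bump this value to force the CustomScriptExtension to re-run')
param scriptRunVersion string = 'v1'

resource zabbixInstaller 'Microsoft.Compute/virtualMachines/extensions@2023-03-01' = {
  name: '${virtualMachineNameBackend}/ZabixxInstaller'
  location: location
  properties: {
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    autoUpgradeMinorVersion: true
    // Changing this value re-runs the script even if the settings are unchanged
    forceUpdateTag: scriptRunVersion
    settings: {
      fileUris: [
        'https://${storageAccountName}.blob.core.windows.net/${containername}/InstallZabbixAgentGS.ps1?${DSCSAS}'
      ]
    }
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgentGS.ps1'
    }
  }
}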

Huge shout out to https://github.com/brwilkinson, who helped me with this a LOT and was super helpful. I owe it to him for helping me get this working, and I owe it to him to go back and test it using a managed identity.

Summary

So, in summary, I am generating a SAS token from the existing storage account, like I do in part 1, and then passing it into the fileUris so that I don't run into permission issues.


Don’t forget to subscribe to my YouTube Channel. And my Newsletter




Azure VM Extensions: Part 1 DSC Extensions

This blog post covers some of the battles I have had trying to install some software onto a VM within Azure. There are many ways to go about this and at the end of the day, yeah you live to fight another day.

High level requirements:

1 – Install Octopus Deploy Tentacle onto a VM and have the agent running.
2 – Install the Zabbix Agent (which is a Windows service) onto the same VM and ensure the service starts correctly. (I'll cover this in part 2 of the blog post.)

Now, as said previously, there are many ways to do this. I looked up what is available from Octopus Deploy, as they always have good stuff; they used to have an Azure extension you could use, but they binned that off in favour of PowerShell DSC. I have never used PowerShell DSC, so I'm game for learning anything I know nothing about. It turns out they have the PowerShell DSC configuration available to download, so I just needed to write the Bicep that grabs this from an Azure storage account and away we go…

I have the Bicep for creating the VM, I let that do its thing, and I decide to create a separate module for the VM DSC extension. So I grab the zip file from Octopus Deploy, read through the code (all good), and add it to my Azure storage account manually. Now I need to figure out the Bicep code within the extension to download the zip file and have it run. The issue I run into is that the Azure storage account isn't public, so I need to figure out how to generate a SAS token from the storage account and use that to grab the file from storage and download it to the VM.

In order to get a SAS token covering all blobs (which might not be what you want, so be careful) you can do something like this:-

// Generate an account-level SAS that grants read and list on the blob service
// (this covers ALL blobs in the account, so scope it down if you need to)
var allBlobDownloadSAS = listAccountSAS(storageAccount.name, '2022-09-01', {
  signedProtocol: 'https'
  signedResourceTypes: 'sco' // service, container and object level operations
  signedPermission: 'rl'     // read + list only
  signedServices: 'b'        // blob service only
  signedExpiry: '2024-01-01T00:00:00Z'
}).accountSasToken

output sasToken string = allBlobDownloadSAS

So this would be part of the storage account module, and then you can consume the SAS token like so:-

var DSCSAS = storageAccount.outputs.sasToken

Now here comes the bit to pay attention to. The Bicep for the VM extension is as follows; the URL only worked for me if I appended the SAS token at the end. I don't want to make the storage account publicly accessible, so I needed the SAS token.

@description('Add the Octopus Deploy Tentacle to the Backend VM')
module vmName_dscExtension_OctopusDeployExtension './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = if (addVirtualMachine) {
  scope: resourceGroup(myResourceGroupName)
  name: 'vmName_dscExtension'
  params: {
    autoUpgradeMinorVersion: true
    enableAutomaticUpgrade: false
    name: 'vmName_dscExtension'
    publisher: 'Microsoft.Powershell'
    type: 'DSC'
    typeHandlerVersion: '2.77'
    virtualMachineName: virtualMachineNameBackend
    location: location
    settings: {
      configuration: {
        url: 'https://${storageAccountName}.blob.core.windows.net/${containername}/OctopusTentacle.zip?${DSCSAS}'
        script: 'OctopusTentacle.ps1'
        function: 'OctopusTentacle'
      }
      configurationArguments: {
        ApiKey: tentacleApiKey
        OctopusServerUrl: tentacleOctopusServerUrl
        Environments: tentacleEnvironments
        Roles: tentacleRoles
        ServerPort: tentaclePort
      }
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}

No doubt you can do this with a managed identity, but I couldn't get it working and had spent a LOT of time on this, so I gave up and used the SAS token instead.

Note
The DSC extension takes care of unzipping the files onto the VM and running the PowerShell script called OctopusTentacle.ps1

Summary
In summary, this is how you can create a SAS token in Bicep from a storage account, and how you can reference a blob in that storage account and install it on a VM.
I realise the way I am doing this might not be the best, so if you find an alternative or have feedback, please do let me know.


Don’t forget to subscribe to my YouTube Channel. And my Newsletter




Azure Databricks talking to Azure SQL Database using Private Endpoints

At work a colleague reached out asking for some help with getting some Python code to query an Azure SQL database and getting it all working. This is right up my street; fixing things I don't use on a day-to-day basis is the kind of challenge I just love working on.

I will list out the steps to achieve this. Bear in mind we already have Azure SQL deployed as well as Azure Databricks at this point, and when we try to query the Azure SQL database we see errors like the ones below:-

“com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host <redacted>.database.windows.net, port 1433 has failed. Error: “connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.”.

as well as this one

com.microsoft.sqlserver.jdbc.SQLServerException: Cannot open server “<server name redacted>” requested by the login. The login failed. ClientConnectionId:c5977705-8b83-4f12-a4ce-0268ac868798

OK, so on reading these errors you might be tempted to look into whitelisting IP addresses.

Let's write down the steps to fix this issue; maybe it will help someone, probably me when I forget I wrote this blog post next week 🙂

OK, so we did the following steps:-

  • Added a new subnet to our databricks-vnet.
  • Found the Azure SQL server instance in the portal, went to the Networking tab, clicked Private access, then clicked + to create a private endpoint. On the Virtual Network tab, choose the virtual network you are using for Databricks and select the new subnet. Make sure to keep 'Integrate with private DNS zone' ticked.
  • Once the private endpoint has been created, click on it and go to DNS configuration, then click the link towards the bottom under the heading Private DNS Zone to be taken to your private DNS zone. Now click on 'Virtual network links', click + to add a new virtual network link, and choose the Databricks VNet; don't tick 'Enable auto registration'.

So it's just like any other private endpoint config; just remember to do the virtual network link. You also don't need to whitelist any IP addresses or anything like that.
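
If you would rather capture this in code than click through the portal, the same setup can be sketched in Bicep roughly like below. The parameter names are placeholders, and it assumes the Azure SQL server and the Databricks VNet/subnet already exist:

param location string = resourceGroup().location

@description('Resource ID of the Azure SQL logical server')
param sqlServerId string

@description('Resource ID of the Databricks virtual network')
param databricksVnetId string

@description('Resource ID of the new subnet created for the private endpoint')
param privateEndpointSubnetId string

// Private DNS zone that Azure SQL private endpoints resolve through
resource privateDnsZone 'Microsoft.Network/privateDnsZones@2020-06-01' = {
  name: 'privatelink.database.windows.net'
  location: 'global'
}

// Link the zone to the Databricks VNet (auto registration left off,
// matching the portal steps above)
resource vnetLink 'Microsoft.Network/privateDnsZones/virtualNetworkLinks@2020-06-01' = {
  parent: privateDnsZone
  name: 'databricks-vnet-link'
  location: 'global'
  properties: {
    registrationEnabled: false
    virtualNetwork: {
      id: databricksVnetId
    }
  }
}

// Private endpoint for the SQL server in the new subnet
resource sqlPrivateEndpoint 'Microsoft.Network/privateEndpoints@2021-05-01' = {
  name: 'pe-sqlserver'
  location: location
  properties: {
    subnet: {
      id: privateEndpointSubnetId
    }
    privateLinkServiceConnections: [
      {
        name: 'pe-sqlserver-connection'
        properties: {
          privateLinkServiceId: sqlServerId
          groupIds: [
            'sqlServer'
          ]
        }
      }
    ]
  }
}

// Wire the private endpoint up to the DNS zone so the server name
// resolves to the private IP from inside the VNet
resource dnsZoneGroup 'Microsoft.Network/privateEndpoints/privateDnsZoneGroups@2021-05-01' = {
  parent: sqlPrivateEndpoint
  name: 'default'
  properties: {
    privateDnsZoneConfigs: [
      {
        name: 'sql-config'
        properties: {
          privateDnsZoneId: privateDnsZone.id
        }
      }
    ]
  }
}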

Don’t forget to subscribe to my YouTube Channel. And my Newsletter