3 New Azure Bicep Features

Recently I invited Freek Berson onto my Azure livestream to discuss Bicep and for Freek to demo three of his favourite new features.

Freek and I chatted about:

  • What is Bicep?
  • How is Bicep being adopted?
  • Why is IaC important?
  • Demo: Bicep Parameter Files
  • Demo: Opinionated Formatting
  • Demo: Deployment Stacks
  • What you need to use these features
  • Call to Action

You can catch the livestream below.

Please subscribe and leave comments 🙂

Don’t forget to subscribe to my YouTube Channel and my Newsletter.




Dapr 101: with Azure Greg and Marc Duiker

This past week I was joined by Marc Duiker on my Azure stream. Marc came on to give an intro to Dapr and show a few examples of why Dapr is such an interesting project if you are doing a Cloud Native project, especially if you are interested in learning more about microservices and looking for a way to make the complex areas of microservices far less complex.

“Dapr (Distributed Application Runtime) is a free and open source runtime system designed to support cloud native and serverless computing. Its initial release supported SDKs and APIs for Java, .NET, Python, and Go, and targeted the Kubernetes cloud deployment system.”

In our livestream Marc introduces Dapr and runs through just a couple of slides before we dive into Visual Studio Code, crack open the code, and he starts showing me some demonstrations of using the Dapr CLI, covering the following:-

  • Dapr 101: start building distributed applications with ease
  • Dapr building block APIs
  • When is it really useful to use Dapr?
  • Demo: State management using Dapr
  • Demo: Resiliency built into Dapr
  • Demo: Workflows using Dapr
  • Demo: Chaining Workflows using Dapr
  • Demo: Observability using Dapr

If you would like to learn more, you can watch the video here:

Please subscribe and leave comments 🙂

Don’t forget to subscribe to my YouTube Channel and my Newsletter.




Outstanding Contribution to Microsoft Community – Global Winner

Proud of this one, and I decided to write it up so I have a record of it on my blog.

Four years ago I had no Azure experience; hard work pays off. Time to plan what’s next…



Azure VM Extensions: Part 3 Refactoring our code

In this last part on Azure VM Extensions I will make a couple of changes to refactor the code and make things better. Once you have more time, go back and refactor your code; it’s a good feeling to go back and improve upon it.

In this case I wanted to use Managed Identities for the CustomScriptExtension, but I couldn’t get it working at first, and due to time pressures I resorted to using SAS tokens. I soon realised that this is not the best way to go, and I really wanted to revisit the codebase and get Managed Identities working.

I see a lot of people create System-Assigned Managed Identities, and I try my best not to use these as they are tied to a resource. I always create a Managed Identity from the Azure Portal or Bicep first and then use that.

So I refactored my Bicep code for the CustomScriptExtension to use the Managed Identity I’ve created. The code no longer needs to generate a new SAS token on each run; using a User Assigned Managed Identity is more secure.

@description('Deploy required userManagedIdentity')
module userManagedIdentity './modules/Microsoft.ManagedIdentity/userAssignedIdentities/deploy.bicep' = {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: userManagedIdentityName
  params: {
    name: userManagedIdentityName
    location: location
    tags: tags
  }
  dependsOn: [
    resourceGroups
  ]
}

The above Bicep code creates our User Assigned Managed Identity, which we can then make use of within our CustomScriptExtension like so.

module virtualMachineName_ZabixxInstaller './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: 'ZabixxInstaller'
  params: {
    enableAutomaticUpgrade: false
    name: 'ZabixxInstaller'
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    virtualMachineName: virtualMachineNameBackend
    location: location
    autoUpgradeMinorVersion: true
    settings: {
      fileUris: [
        'https://${storageAccountName}.blob.core.windows.net/${containername}/InstallZabbixAgent.ps1'
        'https://${storageAccountName}.blob.core.windows.net/${containername}/zabbix.zip'
      ]
    }
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgent.ps1'
      managedIdentity: {
        objectId: userManagedIdentity.outputs.principalId
      }
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}
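For the extension to actually fetch the blobs with that identity, two more things need to be true: the identity must be assigned to the VM itself, and it must have read access to the storage. Below is a minimal sketch of granting it the built-in Storage Blob Data Reader role; it assumes a storageAccount resource symbol is in scope, so treat the names as illustrative rather than my exact code.

// Built-in 'Storage Blob Data Reader' role definition
var storageBlobDataReaderRoleId = subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1')

// Grant the identity read access to the blobs the extension downloads.
// Assumes 'storageAccount' is a resource symbol for the storage account.
resource blobReaderAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(storageAccount.id, userManagedIdentityName, storageBlobDataReaderRoleId)
  scope: storageAccount
  properties: {
    roleDefinitionId: storageBlobDataReaderRoleId
    principalId: userManagedIdentity.outputs.principalId
    principalType: 'ServicePrincipal'
  }
}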

Summary

In summary, we went from generating a SAS token off of the Azure Storage account to using a User Assigned Managed Identity, which is more secure.


Don’t forget to subscribe to my YouTube Channel and my Newsletter.




Azure VM Extensions: Part 2 CustomScriptExtension

This blog post covers some of the battles I have had trying to install software onto a VM within Azure. There are many ways to go about this, and at the end of the day you live to fight another day.

High level requirements:

1 – Install Zabbix Agent (which is a Windows service) onto the same VM and ensure the service starts correctly.

Like most things in life, there is more than one way to do something: I could have used Run Commands, or I could have added this step to the DSC configuration from Part 1, and so on.

I went with an Azure CustomScriptExtension. Maybe not the best option, who knows, but here is how to get it working.

module virtualMachineName_ZabixxInstaller './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = {
  scope: resourceGroup(multiTenantResourceGroupName)
  name: 'ZabixxInstaller'
  params: {
    enableAutomaticUpgrade: false
    name: 'ZabixxInstaller'
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    virtualMachineName: virtualMachineNameBackend
    location: location
    autoUpgradeMinorVersion: true
    settings: {
       fileUris: [ 
        'https://${storageAccountName}.blob.core.windows.net/${containername}/InstallZabbixAgentGS.ps1?${DSCSAS}'
        'https://${storageAccountName}.blob.core.windows.net/${containername}/zabbix.zip?${DSCSAS}'
        ]
    }
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgentGS.ps1'
      managedIdentity: {}
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}

So let’s break this code down. Firstly, I make use of the already-written Azure Bicep Resource Modules, which you can grab here:- https://github.com/Azure/ResourceModules/

I’m also using version 1.10 of this extension, so make sure to watch out for that.

The fileUris point to a storage account that has public access DISABLED; enabling public access is not what you want. Notice the part at the end of each URI, which is a SAS token. I tried to get this working using a Managed Identity and gave up as I was running out of time. The docs say you can use a Managed Identity, so maybe I will go back and try it now that I have more time, then update this blog post.

“If I generate a new SAS each time you deploy based on deployment time, it will re-run the Script extensions every time, since that SAS value changes. If you move to Managed Identity, the script extension does not have to process when you redeploy, it will be skipped, since the configuration/settings didn’t change. If you want it to redeploy, with no changes, then you can change the value of the forceUpdateTag value.”
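On that last point, if you do want the extension to re-run when nothing has changed, forceUpdateTag is the lever. Here is a minimal, hypothetical sketch using the native extension resource rather than the wrapper module; the parameter names are illustrative.

param location string
param virtualMachineNameBackend string
// utcNow() is only allowed as a parameter default; a fresh value on every
// deployment means the extension re-runs every time
param deploymentTime string = utcNow()

resource vm 'Microsoft.Compute/virtualMachines@2023-03-01' existing = {
  name: virtualMachineNameBackend
}

resource zabbixInstallerRerun 'Microsoft.Compute/virtualMachines/extensions@2023-03-01' = {
  parent: vm
  name: 'ZabixxInstaller'
  location: location
  properties: {
    publisher: 'Microsoft.Compute'
    type: 'CustomScriptExtension'
    typeHandlerVersion: '1.10'
    autoUpgradeMinorVersion: true
    forceUpdateTag: deploymentTime // any new value forces a re-run
    protectedSettings: {
      commandToExecute: 'powershell.exe -ExecutionPolicy Unrestricted -File InstallZabbixAgentGS.ps1'
    }
  }
}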

Huge shout out to https://github.com/brwilkinson, who helped me with this a LOT and was super helpful. I owe it to him for helping me get this working, and I owe it to him to go back and test it using Managed Identity.

Summary

So in summary, I am generating a SAS token from the existing storage account like I do in Part 1 and then passing that into fileUris so that I don’t run into permission issues.


Don’t forget to subscribe to my YouTube Channel and my Newsletter.




Azure VM Extensions: Part 1 DSC Extensions

This blog post covers some of the battles I have had trying to install software onto a VM within Azure. There are many ways to go about this, and at the end of the day you live to fight another day.

High level requirements:

1 – Install Octopus Deploy Tentacle onto a VM and have the agent running.
2 – Install Zabbix Agent (which is a Windows service) onto the same VM and ensure the service starts correctly. (I’ll cover this in part 2 of the blog post.)

Now, as said previously, there are many ways to do this. I looked up what is available from Octopus Deploy as they always have good stuff; they used to have an Azure extension you could use, but they binned that off in favour of PowerShell DSC. I have never used PowerShell DSC, so I’m game for learning anything I know nothing about. It turns out they have the PowerShell DSC configuration available to download, so I just needed to write the Bicep that grabs this from an Azure storage account and away we go…

I have the Bicep for creating the VM, so I let that do its thing and decided to create a separate module for the VM DSC extension. I grabbed the zip file from Octopus Deploy, read through the code (all good), and added it to my Azure storage account manually. Then I needed to figure out the Bicep code within the extension to download the zip file and have it run. The issue I ran into is that the Azure storage account isn’t public, so I needed to work out how to generate a SAS token from the storage account and use that to grab the file from storage and download it to the VM.

In order to get a SAS token out for all blobs (which might not be what you want, so be careful) you can do something like this:-

var allBlobDownloadSAS = listAccountSas(storageAccount.name, '2022-09-01', {
  signedProtocol: 'https'
  signedResourceTypes: 'sco'
  signedPermission: 'rl'
  signedServices: 'b'
  signedExpiry: '2024-01-01T00:00:00Z'
}).accountSasToken

output sasToken string = allBlobDownloadSAS

So this would be part of the storage account module, and you can then just use the SAS token like so (bear in mind that returning a SAS token as an output makes it visible in the deployment history, so treat it as sensitive):-

var DSCSAS = storageAccount.outputs.sasToken

Now here comes the bit to pay attention to. The Bicep for the VM extension is as follows. The URL only worked for me if I appended the SAS token at the end; I don’t want to make the storage account publicly accessible, so I needed the SAS token.

@description('Add the Octopus Deploy Tentacle to the Backend VM') 
module vmName_dscExtension_OctopusDeployExtension './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' =  if (addVirtualMachine) { 
  scope: resourceGroup(myResourceGroupName) 
  name: 'vmName_dscExtension' 
  params: { 
    autoUpgradeMinorVersion: true 
    enableAutomaticUpgrade: false 
    name: 'vmName_dscExtension' 
    publisher: 'Microsoft.Powershell' 
    type: 'DSC' 
    typeHandlerVersion: '2.77' 
    virtualMachineName: virtualMachineNameBackend 
    location: location 
    settings: { 
      configuration: {
        url: 'https://${storageAccountName}.blob.core.windows.net/${containername}/OctopusTentacle.zip?${DSCSAS}'
        script: 'OctopusTentacle.ps1'
        function: 'OctopusTentacle'
      }
      configurationArguments: { 
        ApiKey: tentacleApiKey 
        OctopusServerUrl: tentacleOctopusServerUrl 
        Environments: tentacleEnvironments 
        Roles: tentacleRoles 
        ServerPort: tentaclePort 
      } 
    } 
  } 
  dependsOn: [ 
    resourceGroups 
    virtualMachineBackend 
  ] 
}

No doubt you can do this with a Managed Identity, but I couldn’t get it working and had spent a LOT of time on this, so I gave up and used the SAS token instead.
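There is also a middle ground worth knowing about: the DSC extension supports protected settings, including configurationUrlSasToken, so the SAS token and the API key don’t have to sit in the readable settings block. Here is a sketch of the same module call with the secrets moved; it assumes the ResourceModules wrapper passes protectedSettings through, so verify that against the module’s parameters.

@description('Sketch: same DSC extension with secrets moved into protectedSettings')
module vmName_dscExtension_protected './modules/Microsoft.Compute/virtualMachines/extensions/deploy.bicep' = if (addVirtualMachine) {
  scope: resourceGroup(myResourceGroupName)
  name: 'vmName_dscExtension'
  params: {
    autoUpgradeMinorVersion: true
    enableAutomaticUpgrade: false
    name: 'vmName_dscExtension'
    publisher: 'Microsoft.Powershell'
    type: 'DSC'
    typeHandlerVersion: '2.77'
    virtualMachineName: virtualMachineNameBackend
    location: location
    settings: {
      configuration: {
        url: 'https://${storageAccountName}.blob.core.windows.net/${containername}/OctopusTentacle.zip'
        script: 'OctopusTentacle.ps1'
        function: 'OctopusTentacle'
      }
      configurationArguments: {
        // non-secret arguments can stay in settings
        OctopusServerUrl: tentacleOctopusServerUrl
        Environments: tentacleEnvironments
        Roles: tentacleRoles
        ServerPort: tentaclePort
      }
    }
    protectedSettings: {
      // SAS token for configuration.url, kept out of the readable settings
      configurationUrlSasToken: '?${DSCSAS}'
      configurationArguments: {
        ApiKey: tentacleApiKey // the secret is encrypted at rest
      }
    }
  }
  dependsOn: [
    resourceGroups
    virtualMachineBackend
  ]
}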

Note
The DSC extension takes care of unzipping the files onto the VM and running the PowerShell script called OctopusTentacle.ps1.

Summary
In summary, this is how you can create a SAS token in Bicep from a storage account, reference a blob in that storage account, and install it on a VM.
I realise the way I am doing this might not be best, so if you find an alternative or have feedback do please let me know.


Don’t forget to subscribe to my YouTube Channel and my Newsletter.




Azure Databricks talking to Azure SQL Database using Private Endpoints

At work a colleague reached out asking for some help getting some Python code querying an Azure SQL database working. This is right up my street; fixing things I don’t use on a day-to-day basis is the kind of challenge I just love working on.

I will list out the steps to achieve this. Bear in mind that we have Azure SQL deployed as well as Azure Databricks at this point, and when we try to query the Azure SQL Database we see errors like the ones below:-

“com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host <redacted>.database.windows.net, port 1433 has failed. Error: “connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.”.

as well as this one

com.microsoft.sqlserver.jdbc.SQLServerException: Cannot open server “<server name redacted>” requested by the login. The login failed. ClientConnectionId:c5977705-8b83-4f12-a4ce-0268ac868798

Reading these errors, you might think the fix is to start whitelisting IP addresses.

Let’s write the steps down to fix this issue; maybe it will help someone, probably me when I forget I wrote this blog next week 🙂

OK, so we did the following steps:-

  • Added a new subnet to our databricks-vnet.
  • Found our Azure SQL Server instance in the portal, went to the Networking tab, clicked Private access, then clicked + to create a Private Endpoint. On the Virtual Network tab, choose the virtual network you’re using for Databricks and select the new subnet. Make sure to keep ‘Integrate with Private DNS Zone’ ticked.
  • Once the Private Endpoint has been created, click on it and go to DNS configuration, then click the link towards the bottom under the heading Private DNS Zone to be taken to your Private DNS Zone. Now click on ‘Virtual network links’. Again click + to add a new virtual network link and choose the Databricks VNet; don’t tick ‘Enable auto registration’.

So it’s just like any other Private Endpoint configuration; just remember to add the virtual network link. You also don’t need to whitelist any IP addresses or anything like that.
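If you prefer to codify this rather than click through the portal, here is a rough Bicep sketch of the same setup. All names are illustrative, and it assumes the SQL server, VNet and subnet already exist.

param location string = resourceGroup().location
param sqlServerId string        // resource ID of the Azure SQL server
param databricksSubnetId string // the new subnet added to databricks-vnet
param databricksVnetId string   // the Databricks VNet to link to the DNS zone

// Private DNS zone for Azure SQL; the suffix resolves to .database.windows.net
resource privateDnsZone 'Microsoft.Network/privateDnsZones@2020-06-01' = {
  name: 'privatelink${environment().suffixes.sqlServerHostname}'
  location: 'global'
}

// The Private Endpoint in the new subnet, targeting the SQL server
resource privateEndpoint 'Microsoft.Network/privateEndpoints@2023-04-01' = {
  name: 'pe-sql-databricks'
  location: location
  properties: {
    subnet: {
      id: databricksSubnetId
    }
    privateLinkServiceConnections: [
      {
        name: 'sqlConnection'
        properties: {
          privateLinkServiceId: sqlServerId
          groupIds: [
            'sqlServer'
          ]
        }
      }
    ]
  }
}

// The equivalent of ticking 'Integrate with Private DNS Zone' in the portal
resource dnsZoneGroup 'Microsoft.Network/privateEndpoints/privateDnsZoneGroups@2023-04-01' = {
  parent: privateEndpoint
  name: 'default'
  properties: {
    privateDnsZoneConfigs: [
      {
        name: 'sqlConfig'
        properties: {
          privateDnsZoneId: privateDnsZone.id
        }
      }
    ]
  }
}

// The easy-to-forget step: link the zone to the Databricks VNet
resource vnetLink 'Microsoft.Network/privateDnsZones/virtualNetworkLinks@2020-06-01' = {
  parent: privateDnsZone
  name: 'databricks-vnet-link'
  location: 'global'
  properties: {
    registrationEnabled: false // matches leaving auto registration unticked
    virtualNetwork: {
      id: databricksVnetId
    }
  }
}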

Don’t forget to subscribe to my YouTube Channel and my Newsletter.



Learn the Azure Fundamentals – Full Course Free for everyone

We have wrapped up our Azure Fundamentals AZ-900 course, which we ran on Lighthall.co.

Huge thank you to Lisa Hoving, Simon Lee and Matt Boyd for helping deliver this course.

The PDFs are also available.

Part 1

Part 2

Part 3

Part 4


Describe cloud concepts (25–30%)

  1. Describe cloud computing
  2. Describe the benefits of using cloud services
  3. Describe cloud service types


Describe Azure architecture and services (35–40%)

  1. Describe the core architectural components of Azure (first 3 topics)
  2. Describe the core architectural components of Azure (last 4 topics)
  3. Describe Azure compute and networking services (first 3 topics)
  4. Describe Azure compute and networking services (last 3 topics)
  5. Describe Azure storage services
  6. Describe Azure identity, access, and security (first 4 topics)
  7. Describe Azure identity, access, and security (last 4 topics)

Describe Azure management and governance (30–35%)

  1. Describe cost management in Azure
  2. Describe features and tools in Azure for governance and compliance
  3. Describe features and tools for managing and deploying Azure resources
  4. Describe monitoring tools in Azure

The full playlist on YouTube can be found here – https://www.youtube.com/playlist?list=PLrDWgkiCvaPReqv4uagsi9oCjADR9ADBO


Special thank you to Lisa Hoving, Simon Lee and Matt Boyd. We all hope you enjoy the content and find it useful.

Don’t forget to subscribe to my YouTube Channel and my Newsletter.



Why you should start a Newsletter.

If you are reading this and wondering whether you should start your own newsletter, the answer is yes. Go do it now, what are you waiting for?

By the way, you can sign up for my newsletter below:-

https://gregors-newsletter.beehiiv.com/subscribe

I had been meaning to create a newsletter for probably the best part of four years and just never got around to it. The main reason I created one is to keep in touch with the people who read it and find it useful. I’m also using it for side quests, but more on that in the newsletter.

Now I am only on week 2 of my newsletter and still figuring the format out. I probably should have done that first, but hey, getting started is half the battle, right? Go choose somewhere to create your newsletter and get started; you won’t regret it.

I looked at Beehiiv and Mailchimp, and I am not here to say which is better because quite frankly I just wanted to get going.

I signed up to both and ended up going with Beehiiv. Why? Who knows, it just seemed to appeal more for whatever reason. Honestly it could easily have been Mailchimp, but anyhoo, I went with Beehiiv.

It’s easy to use, it’s easy to see your list of subscribers, and it’s easy to create posts that can look pretty neat once you spend some time tweaking the look and feel. It’s not a solid 10 out of 10 for me just yet, so I would give it maybe an 8 out of 10, but at this point I have 150 subscribers and I’m off and running.

So why should you create a newsletter? Well, if you want a list of people who follow you, along with their email addresses should you want to reach them, then what better way to do it than with a weekly newsletter? That’s what it’s for in the end.

Don’t forget to subscribe to my YouTube Channel and my Newsletter.



Azure App Service

Moving Azure Web Apps between App Service Plans

This past week a customer asked me why they couldn’t move some Azure Web Apps from one App Service Plan to another. I had a quick look into the issue and learned something new, so why not blog about it, as it’s been a while.

The customer had two App Service Plans, which were like so:-

App Service Plan 1: West Europe, resource group rg-apps, Premium V2 P1v2 pricing tier, OS type Windows
App Service Plan 2: West Europe, resource group rg-apps, Premium V2 P1v2 pricing tier, OS type Windows

Now, if you look at the docs (Manage App Service plan – Azure App Service | Microsoft Learn), it states that:

“You can move an app to another App Service plan, as long as the source plan and the target plan are in the same resource group, geographical region, and of the same OS type. Any change in type such as Windows to Linux or any type that is different from the originating type is not supported.”

The next part of the docs was the reason why we couldn’t move the apps from App Service Plan 1 to App Service Plan 2: the plans must also share the same webspace.

To check this, I went to each App Service Plan and, on the overview screen, clicked JSON View; there you can find the JSON property called “webspace”, and the two plans had differing values.

So if you ever run into this issue, check this webspace value, as that just might be the problem. I am blogging this in case I ever come across it again and forget what to check.

There was a third App Service Plan, and the customer could move apps into that one without issue. There are a couple of ways to solve the problem, and I’ll let you figure out what they might be, but the customer decided to leave things as-is for now.

Don’t forget to subscribe to my YouTube Channel and my Newsletter.