Azure App Service

Azure Web app using Azure SQL using Private Endpoints

Ok so this blog post covers deploying an Azure Web App that talks to an Azure SQL database, and then securing access to that database using a VNet and a Private Endpoint.

First we will deploy the web application that talks to Azure SQL. This won't use a VNet or a Private Endpoint, so it is insecure and open to the internet; then we will tighten it down by adding the VNet and Private Endpoint.

What are Azure Private Endpoints?

An Azure Private Endpoint is a network interface that connects your virtual network privately to a service powered by Azure Private Link. Traffic between your virtual network and the service traverses the Microsoft Azure backbone network, eliminating exposure to the public internet.

Ok, let's get to it.

Step 1

Firstly, follow the steps in this Microsoft article, which walks you through deploying a web app talking to a local database and then deploying an Azure SQL Database once you publish to Azure (all steps are within the following link):

https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-dotnet-sqldatabase

At this point you should have an Azure Resource Group, an Azure App service plan (hosting plan) and an Azure Web Application deployed and working.

If the database has no tables then you need to enable Code First Migrations: go to the Tools menu, select NuGet Package Manager and then Package Manager Console, and in the console type Enable-Migrations and press Enter.

Read the section titled Enable Code First Migrations in Azure in the Microsoft tutorial linked above, and make sure to publish to Azure again after this step.

Step 2

Now we need to VNet integrate this, so we will start off by creating a new Azure VNet. I created my VNet with a 10.1.0.0/16 address space and then created subnets like so:-

webappsSubnet: 10.1.2.0/24
sqlSubnet: 10.1.1.0/24

And then I clicked save.
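If you prefer code over portal clicks, the VNet from this step could be sketched in Bicep like so. This is only a sketch (the VNet name is made up), and note the delegation of webappsSubnet to Microsoft.Web/serverfarms, which App Service VNet integration requires:

```bicep
// Sketch of the Step 2 VNet in Bicep - the VNet name is illustrative
resource vnet 'Microsoft.Network/virtualNetworks@2022-07-01' = {
  name: 'vnet-webapp-sql'
  location: resourceGroup().location
  properties: {
    addressSpace: {
      addressPrefixes: [
        '10.1.0.0/16'
      ]
    }
    subnets: [
      {
        name: 'webappsSubnet'
        properties: {
          addressPrefix: '10.1.2.0/24'
          delegations: [
            {
              name: 'webapp-delegation'
              properties: {
                // App Service regional VNet integration needs this delegation
                serviceName: 'Microsoft.Web/serverfarms'
              }
            }
          ]
        }
      }
      {
        name: 'sqlSubnet'
        properties: {
          addressPrefix: '10.1.1.0/24'
        }
      }
    ]
  }
}
```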

Now we have a VNet with 2 subnets, so let's VNet integrate both the SQL Server and the Web Application.

Step 3

Go to the Web App you deployed to Azure, select Networking, choose VNet Integration, select your VNet and then choose the webappsSubnet.

Once you add VNet integration it should look something like the following:-
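The same VNet integration can be expressed in Bicep. A hedged sketch (the app and VNet names are made up); in ARM/Bicep, regional VNet integration is a networkConfig child resource that must be named virtualNetwork:

```bicep
// Sketch only - app and VNet names are illustrative
resource webApp 'Microsoft.Web/sites@2022-03-01' existing = {
  name: 'my-web-app'
}

resource vnet 'Microsoft.Network/virtualNetworks@2022-07-01' existing = {
  name: 'vnet-webapp-sql'
}

// Attaches the web app to the delegated webappsSubnet
resource vnetIntegration 'Microsoft.Web/sites/networkConfig@2022-03-01' = {
  parent: webApp
  name: 'virtualNetwork'
  properties: {
    subnetResourceId: '${vnet.id}/subnets/webappsSubnet'
    swiftSupported: true
  }
}
```

Remember the subnet you point at needs to be delegated to Microsoft.Web/serverfarms.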

Step 4

Ok, so next do the same for the Azure SQL Server you deployed from the Microsoft guide.

On the Networking tab of your Azure SQL Server, make sure Public Access is set to Disable like the following:-

Now click on the Private Access tab and select Create a private endpoint.

Create a Private Endpoint in Azure

In the second screen, make sure to select the correct VNet and choose the sqlSubnet.

Now we have set up a Private Endpoint for Azure SQL. Once the connection state shows Approved (this can take a few seconds), we are all set.
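The portal wizard does this for you, but the Private Endpoint itself could be sketched in Bicep roughly like this (server, VNet and endpoint names are all illustrative):

```bicep
// Sketch only - names are illustrative
resource sqlServer 'Microsoft.Sql/servers@2021-11-01' existing = {
  name: 'my-sql-server'
}

resource vnet 'Microsoft.Network/virtualNetworks@2022-07-01' existing = {
  name: 'vnet-webapp-sql'
}

resource sqlPrivateEndpoint 'Microsoft.Network/privateEndpoints@2022-07-01' = {
  name: 'pe-my-sql-server'
  location: resourceGroup().location
  properties: {
    subnet: {
      id: '${vnet.id}/subnets/sqlSubnet'
    }
    privateLinkServiceConnections: [
      {
        name: 'pe-my-sql-server-connection'
        properties: {
          privateLinkServiceId: sqlServer.id
          // 'sqlServer' is the Private Link sub-resource for Azure SQL
          groupIds: [
            'sqlServer'
          ]
        }
      }
    ]
  }
}
```

Note that for the web app to resolve the server name to the private IP you also need a privatelink.database.windows.net Private DNS zone linked to the VNet; the portal wizard offers to create this for you.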

Troubleshooting


At this point your web application should be able to communicate with your backend Azure SQL Server using a Private Endpoint. If you delete the Private Endpoint you will see this:-

You will also see this if you don't have the database populated; you should see the following if you have run Enable-Migrations and re-published the code.

Connecting to the Database from your local pc

If you want to check that you have tables and data you can use a number of tools to connect to your new Azure SQL Server. I tend to use SQL Server Management Studio because I am old 🙂 but before you can connect you need to change the Azure SQL Server firewall to allow your IP address to connect to the database. To do this go to the Azure SQL Server, then Networking, click on Public Access and fill it in like so:-

Give the rule a decent name so you know whose IP address you have whitelisted, in case you need to add several.

Note: don't tick the box that says Allow Azure services and resources to access this server; it's not recommended.
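If you'd rather script the firewall rule, a hedged Bicep sketch follows. The server name and the IP address are placeholders; substitute your own public IP:

```bicep
// Sketch only - server name and IP are illustrative
resource sqlServer 'Microsoft.Sql/servers@2021-11-01' existing = {
  name: 'my-sql-server'
}

// Allows a single client IP through the server firewall
resource allowMyIp 'Microsoft.Sql/servers/firewallRules@2021-11-01' = {
  parent: sqlServer
  name: 'my-home-ip'
  properties: {
    startIpAddress: '203.0.113.10'
    endIpAddress: '203.0.113.10'
  }
}
```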

Once you connect you should see something like this:-

Summary

To summarise this blog post: we initially deployed an Azure App Service with an Azure SQL backend database. Then we VNet integrated the Web App and SQL Server and used a Private Endpoint so that communication from the Web App to Azure SQL traverses the Microsoft Azure backbone network, eliminating exposure to the public internet.

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Azure Data Fundamentals DP-900 on LinkedIn Learning

All 4 parts of the exam prep are now available on LinkedIn Learning, at the links below:-

Part 1- https://www.linkedin.com/learning/azure-data-fundamentals-dp-900-cert-prep-1-core-data-concepts

Part 2 – https://www.linkedin.com/learning/azure-data-fundamentals-dp-900-cert-prep-2-working-with-relational-data-on-azure-17091985

Part 3 – https://www.linkedin.com/learning/azure-data-fundamentals-dp-900-cert-prep-3-working-with-non-relational-data-on-azure/

Part 4 – https://www.linkedin.com/learning/azure-data-fundamentals-dp-900-cert-prep-4-analytics-workloads-on-azure

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Enabling Defender for Cloud using Bicep

In this blog post I show you how to enable Defender for Cloud using Bicep.

Microsoft Defender for Cloud is a cloud-based security solution that helps protect resources and workloads running in Azure, on-premises, or in other clouds.

As always I try to make use of the following GitHub repository: https://github.com/Azure/ResourceModules/. This is where I go to make use of the hundreds of already written Bicep modules, which I can reuse very quickly.

I start by cloning the repository and then lifting the files I need to make whatever I need to deploy work; in this case I want the following folder: https://github.com/Azure/ResourceModules/tree/84fe9dfd578a22079b03bbdee3554b9ac51c2dc2/modules/Microsoft.Security/azureSecurityCenter

I store the files in a modules folder.

// Defender for Cloud parameters

param defenderAutoProvision string = 'On'
param defenderAppServicesPricingTier string = 'Standard'
param defenderVirtualMachinesPricingTier string = 'Standard'
param defenderSqlServersPricingTier string = 'Standard'
param defenderStorageAccountsPricingTier string = 'Standard'
param defenderDnsPricingTier string = 'Standard'
param defenderArmPricingTier string = 'Standard'

module enableDefenderForCloudOnSubscription 'modules/defenderForCloud.bicep' = {
  name: 'defenderForCloud'
  params: {
    scope: subscription().id
    workspaceId: createLogWorkspace.outputs.resourceID
    autoProvision: defenderAutoProvision
    virtualMachinesPricingTier: defenderVirtualMachinesPricingTier
    sqlServersPricingTier: defenderSqlServersPricingTier
    storageAccountsPricingTier: defenderStorageAccountsPricingTier
    appServicesPricingTier: defenderAppServicesPricingTier
    dnsPricingTier: defenderDnsPricingTier
    armPricingTier: defenderArmPricingTier
  }
}

To run this I use a very small PowerShell script that contains the following:-

$deploymentID = (New-Guid).Guid
$location = 'westeurope'

az deployment sub create `
    --name $deploymentID `
    --location $location `
    --template-file ./main-deployment-1.bicep `
    --parameters location=$location `
    --confirm-with-what-if `
    --output none

And this will enable Defender for Cloud and you can change the parameters as you like.
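For reference, the module I'm calling above ultimately boils down to subscription-scope resources roughly like the following sketch. The plan names are the documented Defender plan identifiers; the exact internals of the ResourceModules module may differ:

```bicep
// Sketch of what the module deploys under the hood
targetScope = 'subscription'

var standardTierPlans = [
  'VirtualMachines'
  'SqlServers'
  'AppServices'
  'StorageAccounts'
  'Dns'
  'Arm'
]

// Pricing updates must run one at a time, hence the batch size of 1
@batchSize(1)
resource pricings 'Microsoft.Security/pricings@2022-03-01' = [for plan in standardTierPlans: {
  name: plan
  properties: {
    pricingTier: 'Standard'
  }
}]

// Turns on automatic provisioning of the monitoring agent
resource autoProvisioning 'Microsoft.Security/autoProvisioningSettings@2019-01-01' = {
  name: 'default'
  properties: {
    autoProvision: 'On'
  }
}
```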

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Azure App Service

Creating Azure Architecture Diagrams from scratch (almost)

Part of my job is creating Azure architecture diagrams, and to be honest I never really got on that well with Visio. It's a great product, but there was a lot of swearing when fiddling around with spacing and drawing arrows the way I wanted them, and I just never felt proud of what I had created (it's me, not Visio, let's be clear).

I stumbled across a video from https://twitter.com/LiorKamrat

Lior teaches you how to use PowerPoint (yes, you read that correctly, PowerPoint) to create really awesome looking architecture diagrams. Definitely watch all of his video if, like me, you don't get on well with Visio.

Whilst tweeting about this, Dave Brannan asked if I had used draw.io inside Visual Studio Code. I had dabbled with draw.io but not within VS Code, so I installed the extension and started looking into it. Not bad I must say; I do like it and it's very easy to create cracking diagrams.

Here is one I created from a sample diagram in the Microsoft docs about using Private Endpoints from a Web App to a SQL Server database. I can reuse these super easily, and I use the GitHub repository David-Summers/Azure-Design (an Azure stencil collection for Visio, highly functional and always up to date) to copy in .png or .svg files, whichever you prefer.

How about Draw.io?

If you install the Draw.io Integration extension by Henning Dieterichs in VS Code and then create a new empty file with the .drawio extension, you end up with this:-

This means you can use VS Code along with Draw.io to create diagrams – very cool.

Summary

I like the PowerPoint way of doing it as I can open up existing diagrams and easily create new diagrams off them. I am sure you can do the same in any tool, but I particularly like how the PowerPoint diagrams turn out.

Massive thanks to Lior Kamrat for creating the video and I hope you find this post useful.

In Lior’s video he has some awesome links to Azure icon sets and example PowerPoints.

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Azure PostgreSQL Flexible Server using Bicep

In this blog post I show you how to create a new Azure PostgreSQL Flexible Server using Bicep. The Single Server offering is retiring at the start of 2024, so many of you will need to migrate to the Flexible Server offering.

Azure PostgreSQL Flexible Server is a fully managed, cloud-based PostgreSQL database service that provides the ability to scale compute and storage resources independently, making it more flexible and cost-effective than other Azure PostgreSQL offerings.

As always I try to make use of the following GitHub repository: https://github.com/Azure/ResourceModules/. This is where I go to make use of the hundreds of already written Bicep modules, which I can reuse very quickly.

I start by cloning the repository and then lifting the files I need to make whatever I need to deploy work; in this case I want the following folder: https://github.com/Azure/ResourceModules/tree/main/modules/Microsoft.DBforPostgreSQL/flexibleServers

I store the files in a modules folder.

// Azure PostgreSQL Server details

@secure()
param administratorLoginPassword string // referenced by the module below
param administratorLogin string = 'postgresqladmin'
param skuName string = 'Standard_D4s_v3'
param tier string = 'GeneralPurpose'
param availabilityZonestring string = '1'
param backupRetentionDays int = 20
param geoRedundantBackup string = 'Enabled'
param highAvailability string = 'SameZone'
param storageSizeGB int = 1024
param version string = '14'
param servername string = 'gregorspostgresql'

@description('Deploy an Azure PostgreSQL Server')
module createPostgresFlexibleServer 'modules/psqlflexibleServer_modules/deploy.bicep' = {
  scope: resourceGroup(dataTierRg)
  name: 'createPostgresFlexibleServer'
  params: {
    administratorLogin: administratorLogin
    administratorLoginPassword: administratorLoginPassword
    name: servername
    skuName: skuName 
    tier: tier 
    location: location
    availabilityZone: availabilityZonestring 
    backupRetentionDays: backupRetentionDays 
    geoRedundantBackup: geoRedundantBackup
    highAvailability: highAvailability
    storageSizeGB: storageSizeGB 
    version: version 
  }
}

To run this I use a very small PowerShell script that contains the following:-
$deploymentID = (New-Guid).Guid
$location = 'westeurope'

az deployment sub create `
        --name $deploymentID `
        --location $location `
        --template-file ./main-deployment-1.bicep `
        --parameters location=$location  `
        --confirm-with-what-if `
        --output none
     

And this will deploy an Azure PostgreSQL Flexible server and you can change the parameters as you like.

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Azure SQL Server VNet Integrated using Bicep

I have a terrible memory so this blog post is mainly to remind me how to VNet Integrate Azure SQL.

The code below creates an Azure SQL Server and VNet integrates it. The VirtualNetworkRule is the key part, and the following is how to go about it.

I use this existing Bicep repo for all of the Bicep that I write – https://github.com/Azure/ResourceModules/

@description('Deploy an Azure SQL Server')
module createAzureSQL 'modules/azuresql_modules/deploy.bicep' = if (deployAzureSQL) {
  scope: resourceGroup(dataTierRg)
  name: azureSQLServerName
  params: {
    name: azureSQLServerName
    location: sqllocation 
    administratorLogin: azureSQLServerAdminLogin
    administratorLoginPassword: azureSQLAdminPassword
    tags: tags
    virtualNetworkRules: [
      {
        name: 'vnet-rule-${azureSQLServerName}'
        serverName: azureSQLServerName
        ignoreMissingVnetServiceEndpoint: false 
        virtualNetworkSubnetId: '/subscriptions/${subscriptionID}/resourceGroups/${appTierRg}/providers/Microsoft.Network/virtualNetworks/${appVNetName}/subnets/dataSubNet'
      }
    ]
  }
  dependsOn: [
    newRG
    createAppVNet
  ]
}

To get this to work you should also add a service endpoint into your subnet like the following:-

@description('An array of the subnets for the Application VNet.') 
var appSubnets = {
  shared: [
    {
      name: 'appSubnet'
      addressPrefix: '172.16.0.0/24'
      delegations: [
        {
          name: 'delegation'
          properties: {
            serviceName: 'Microsoft.Web/serverfarms'
          }
        }
      ]
    }
    {
      name: 'dataSubNet'
      addressPrefix: '172.16.1.0/24'
      serviceEndpoints: [
        {
          service: 'Microsoft.Sql'
        }
      ]
    }
  ]
}

Let me know if you found this example useful.



My 2022 Yearly Review

2022 was a strange year for me; it felt like I hadn't done much, but then I started writing this blog post and it turns out it wasn't too bad after all. I also put in quite a few MVP nominations for people deserving of the award, and helped a good number of people get started on their first ever LinkedIn Learning course.

My first-ever LinkedIn Learning course went live in March of this past year, and then I finished the second part I was asked to do.

January

February

  • Glasgow Azure User Group.

March

  • Visit Intercept offices, The Netherlands.

April

  • Glasgow Azure User Group.

May

  • DevOps on Azure workshop – remotely
  • Deep Dive Governance and Azure Policy Workshop – remotely.
  • DevOps on Azure workshop – remotely.

June

  • Glasgow Azure User Group.
  • Intercept Summer Party.

July

  • Deep Dive Governance and Azure Policy Workshop – remotely.

August

  • Glasgow Azure User Group.

September

  • On holiday playing golf in the Scottish Pairs 2022 event.

October

  • Azure Cloud Essentials Workshop at the Microsoft office in Copenhagen, Denmark.
  • Earned the Microsoft Certified: Azure Network Engineer Associate Exam AZ-700.
  • Disneyland Paris holiday.
  • Ignite Conference in Seattle, USA.
  • Glasgow Azure User Group.

November

  • Azure DevOps and Azure Cloud Essentials Workshop at the Microsoft Reactor in Stockholm, Sweden.
  • Reached 15k twitter followers.

December

  • Data Workshop in Vienna at the Microsoft Austria office.
  • Intercept Xmas Party in the Netherlands.
  • Festive Tech Calendar ran for the whole month.

Summary

I visited a lot of new countries and gave talks at some very cool venues, next year I hope we can do more of those.

I still want to visit Croatia, Switzerland and hopefully Iceland too.

Happy New Year if you’re reading this far down.



Microsoft Ignite – In person Seattle review

This is my review of Ignite, in person, in Seattle.

This was my first ever time in Seattle; I really enjoyed the city and look forward to returning (I love being in the United States). I was delighted to be able to attend this year's event.

This year's in-person Ignite has been, well, different. To be fair, I think I had been greatly spoiled by the previous Ignite I attended back in 2019 in Orlando.

Microsoft tried out some new ideas, and below is my feedback and findings from Days 1 and 2 (it was supposed to be a 3-day event, but it wasn't in person in Seattle, at least).

No Ignite bag was a little disappointing (it was a thing each year until now): a swag-free conference. I do get it, but a lot of people weren't pleased; I think we are going to have to get used to swag-free conferences. If you were a sponsor, the list of what you had to do to be able to give away anything was pretty long.

The vendors were down on a separate floor, which was odd in my opinion. You had to go into rooms to speak to some of them, and when I asked around, that isn't something people liked or felt comfortable doing.

The food was fine; I'm not sure the information on what was available where was up to scratch, but there you have it.

The venue itself, at least the Hub, was cool. No MVP / RD wall was disappointing (nor dragon popcorn), but it was a nice looking area. However, for me the area didn't work when it came to sessions: many speakers had no microphones on day 1 and you couldn't hear anything unless you were right up front (some sessions were packed). Hopefully they learn that with speaker areas holding so many people, you need to make sure everyone can hear what's being said. If a session is in person and not recorded, you simply miss out, which annoyed some people I spoke to.

The keynote was weird with Satya not being there in person; again, it's odd to have an in-person Ignite without a live keynote from Satya.

In summary, Day 1 was rescued by the people I met, the MVP after party (not all MVPs got the invite) and playing table tennis and chatting with people from all over.

Day 2 was noticeably quieter; it felt like a third of the people had decided to go enjoy the sun or something else, and I was thinking to myself, where is everyone? It very much had the vibe that the big speakers had left town, and the Hub area was surprisingly quiet. The Ask the Experts areas were quiet too, especially on day 2; it had that last-day-of-a-conference feeling far too early on, which was not great.

Some sessions were ok, but the content was mostly high-level marketing material pitched at beginner to intermediate levels. Content around the Hub was hard to hear, and that for me didn't work.

So let's cover the good, the bad and the ugly from my point of view for this year's Ignite.

GOOD
We were back in person at Ignite, meeting friends and new people from around the world is always the best part of a conference for me – going to dinner at night and hearing what people are working on and building and the questions they ask is the best thing going. Seattle is a very nice place to visit.

BAD
No Satya, no swag, a 2-day conference not the 3 advertised (many flew home on Saturday because of this), no MVP wall, no sticker swap, no store to buy merchandise or Surface laptops, and no book store either. The energy just wasn't there for me; some people tried to get the crowd to hoot and holler but it just wasn't happening.

UGLY
After reflection skipping this part, ask me in person.

Ways I would try to improve Ignite

  • Less marketing content, more deep dives.
  • Have the person delivering the keynote actually be there in person, especially Satya.
  • Tell people it's a swag-free conference up front, to avoid disappointment.
  • Bring back the MVP / RD wall, book store, merchandise stands etc.
  • Inform people better about any after-event parties.
  • Have working wifi at the start of the conference.
  • Have session speakers speak slowly; this isn't a race, and if you need more time, have a longer session. Not everyone is a native English speaker, and rambling through content at 100 mph is an awful experience for a lot of people.
  • Tell people that where you're sitting is not necessarily where the speaker is; some rooms were both called Tahoma 3, for example, and one had the speaker while the other was earphones only. It wasn't clear.
  • Avoid full sessions where people are told to go watch online; after travelling a long way to be there in person, this wasn't a great experience (have bigger rooms).
  • Two days isn't long enough; lots of content running at the same time was not fun. Yes, it's recorded, but I came to see people talk, and the content needs to be spread out more.
  • Offer the chance to get your photo taken professionally, so you can get a nice photo of yourself at the event.
  • Have more fun things on during the drinks / nibbles at the end of each day.

Summary
In summary, Microsoft tried a new way of doing things from what I gathered and it fell short.

Would I go next year? – it depends.



Bicep Scenarios

I have been working with Bicep a lot on a recent project, and I tweeted a reply to a tweet as you can see below. In order to reply properly, I decided to write this blog post, hopefully as a way to give constructive feedback.

Let's take the example of VNet peering. If you google/bing for vnet peering bicep you will most likely end up here: https://docs.microsoft.com/en-us/azure/templates/microsoft.network/virtualnetworks/virtualnetworkpeerings?pivots=deployment-language-bicep

This is somewhat helpful in that it shows you the format; however it won't help a lot of people, as it doesn't have any examples of how it works.
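To make the point concrete, here is roughly what a minimal peering looks like in Bicep. This is a hedged sketch (the VNet names are made up), and remember that a peering needs to be created in both directions to come up as Connected:

```bicep
// Sketch only - VNet names are illustrative
resource hubVnet 'Microsoft.Network/virtualNetworks@2022-07-01' existing = {
  name: 'vnet-hub'
}

resource spokeVnet 'Microsoft.Network/virtualNetworks@2022-07-01' existing = {
  name: 'vnet-spoke'
}

// One direction of the peering; a matching spoke-to-hub peering is also needed
resource hubToSpoke 'Microsoft.Network/virtualNetworks/virtualNetworkPeerings@2022-07-01' = {
  parent: hubVnet
  name: 'peer-hub-to-spoke'
  properties: {
    remoteVirtualNetwork: {
      id: spokeVnet.id
    }
    allowVirtualNetworkAccess: true
    allowForwardedTraffic: false
    allowGatewayTransit: false
    useRemoteGateways: false
  }
}
```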

I always use https://github.com/Azure/ResourceModules as I can look at an example; they have deployment example links for every resource, like so:-

Now with this example I can see what the values look like and get a feel for what is needed. The first screenshot does explain the values, which is ok at best; an example would be good, and a common scenario would be even better.

When it comes to documentation, examples can be what makes or breaks it.
For me, scenarios are the answer to what I would like to see. If I want to create a key vault, there's every chance I need to create and populate a secret, and maybe I need an access policy or a managed identity to go along with it. These are real-world scenarios that the documentation doesn't cover. Yes, we can't cover everything, but if we go back to the first screenshot above, it isn't helping me a great deal, truth be told.

Now, there is a common scenarios page from the Bicep team (https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/scenarios-secrets) which is a good start, but I feel we really need a lot more of these, or at least links to GitHub repositories for further help.

When trying to figure out how to get a load balancer to play nicely with 2 virtual machines, I had to manually deploy it and then reverse engineer the Bicep. The sample in the documentation was a good start; it didn't cover what I needed, but it was a good start.

I've had DMs saying that there aren't enough people with enough time to write all of the scenarios, which is fine; finding a way to accept public contributions of example scenarios might be one way to go about it.

Either way, I feel there is room for improvement in a lot of the Microsoft docs. It's an unpopular opinion, but there you have it.

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.



Reverse Engineering Arm Templates to use with Bicep

When working with Bicep there are times when the documentation isn’t there, and the examples aren’t there yet either.

Recently I needed to deploy Azure Database for PostgreSQL Flexible Server using Bicep. The docs for doing this are ok (examples would make the Bicep docs so much better), but if you have been using Bicep for a while you'll manage to get it working.

I tend to go to the following website when I am working with Bicep and use these modules as much as I can: https://github.com/Azure/ResourceModules/. Note that they are constantly being worked on and changed, so bear that in mind.

I figured out the Bicep for my PostgreSQL server and now I need to add in a couple of extensions. I'm new to PostgreSQL and had never touched it, so I googled for an article, and it turns out it's super simple to manually add in any extension; this article shows you how: https://docs.microsoft.com/en-us/azure/postgresql/flexible-server/concepts-extensions

So I decided to download the ARM template for my PostgreSQL server before the extensions were added, and then compare it to the ARM template after I had added the extensions into PostgreSQL by hand, to see what had been added.

Comparing both files I see the following:-

I then use a website that generates Bicep code from existing ARM templates: https://bicepdemo.z22.web.core.windows.net/. Using this, I click on the Decompile button (top right) and point it at the ARM template I downloaded from the Azure portal after I had added in the extension manually. This generates the Bicep code for me, and I can see the section I needed to add in the extension.

resource flexibleServers_psql_dev_weu_name_azure_extensions 'Microsoft.DBforPostgreSQL/flexibleServers/configurations@2022-01-20-preview' = {
  parent: flexibleServers_psql_dev_weu_name_resource
  name: 'azure.extensions'
  properties: {
    value: 'LTREE,UUID-OSSP'
    source: 'user-override'
  }
}

So now I have the missing extension code I need for my Bicep code and we can remove the manually added extensions – redeploy the code and we are all good.

Summary

If you don’t know the Bicep code for what you need and you can't find any samples, try manually deploying your service, download the ARM template, and use https://bicepdemo.z22.web.core.windows.net/ to decompile the ARM template back into Bicep.

If you have questions reach out to me here in the comments below or on twitter.

Don’t forget to subscribe to my YouTube Channel.