Deploying AI Applications to Azure Web Apps: A Practical Architecture Guide

Stuff I learned from Ignite 2025

Azure Web Apps (part of Azure App Service) remains one of the most effective platforms for hosting production AI-enabled applications on Azure. With first-class support for managed identities, private networking, and native integration with Azure AI services, it provides a strong balance between operational simplicity and enterprise-grade security.

This article walks through a reference architecture for deploying AI applications to Azure Web Apps, grounded in current guidance and capabilities as of Microsoft Ignite 2025. The focus is on real-world concerns: identity, networking, configuration, and infrastructure as code.


Why Azure Web Apps for AI Workloads

Azure Web Apps is well-suited for AI-powered APIs and frontends that act as orchestrators rather than model hosts. In this pattern:

  • Models are hosted in managed services such as Azure OpenAI Service
  • The Web App handles request validation, prompt construction, tool calling, and post-processing
  • Stateful data is stored externally (e.g., databases or caches)

Key benefits include:

  • Built-in autoscaling and OS patching
  • Native support for managed identities
  • Tight integration with Azure networking and security controls
  • Straightforward CI/CD and infrastructure-as-code support

Reference Architecture Overview


Conceptual architecture showing Azure Web App securely accessing Azure OpenAI via private endpoints.

At a high level, the architecture looks like this:

  1. Client calls the AI application hosted on Azure Web Apps
  2. Azure Web App authenticates using a managed identity
  3. Requests are sent to Azure OpenAI Service over a private endpoint
  4. Secrets and configuration are resolved from Azure Key Vault
  5. Observability data flows to Azure Monitor and Application Insights

This design avoids API keys in code, minimizes public exposure, and supports enterprise networking requirements.


Application Design Considerations for AI Apps

Stateless by Default

Azure Web Apps scale horizontally. Your AI application should:

  • Treat each request independently
  • Store conversation state externally (e.g., Redis or Cosmos DB)
  • Avoid in-memory session affinity for chat history

This aligns naturally with AI inference patterns, where each request sends the full prompt or context.
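As a minimal Python sketch of this pattern (the `ChatStore` here is a hypothetical stand-in for a Redis or Cosmos DB client):

```python
from dataclasses import dataclass, field

@dataclass
class ChatStore:
    """Stand-in for an external store such as Redis or Cosmos DB."""
    _data: dict = field(default_factory=dict)

    def load(self, session_id: str) -> list[dict]:
        return self._data.get(session_id, [])

    def save(self, session_id: str, messages: list[dict]) -> None:
        self._data[session_id] = messages

def handle_turn(store: ChatStore, session_id: str, user_message: str, call_model) -> str:
    # Rebuild the full context from external state -- nothing lives in
    # process memory, so any instance behind the load balancer can serve
    # the next request for this session.
    history = store.load(session_id)
    messages = history + [{"role": "user", "content": user_message}]
    reply = call_model(messages)
    store.save(session_id, messages + [{"role": "assistant", "content": reply}])
    return reply
```

Because every turn loads and persists state externally, scaling out (or recycling an instance) never loses a conversation.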

Latency and Token Costs

When calling large language models:

  • Batch or compress prompts where possible
  • Avoid unnecessary system messages
  • Cache deterministic responses when feasible

These optimizations are application-level but directly affect infrastructure cost and scale behavior.
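For instance, deterministic calls (temperature 0) can be cached on a hash of the full request; a Python sketch, with an in-memory dict standing in for a shared cache such as Redis:

```python
import hashlib
import json

class DeterministicCache:
    """Cache model responses for deterministic calls (e.g. temperature=0).

    The key hashes the prompt plus all parameters, so any change produces
    a fresh model call instead of a stale cache hit.
    """
    def __init__(self, call_model):
        self._call_model = call_model
        self._cache: dict[str, str] = {}
        self.calls = 0  # how often we actually hit the model

    def complete(self, prompt: str, **params) -> str:
        key = hashlib.sha256(
            json.dumps({"prompt": prompt, **params}, sort_keys=True).encode()
        ).hexdigest()
        if key not in self._cache:
            self.calls += 1
            self._cache[key] = self._call_model(prompt, **params)
        return self._cache[key]
```

Repeated identical requests then cost nothing in tokens or latency, which directly reduces both spend and scale-out pressure.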


Identity and Security with Managed Identities

One of the most important design decisions is how the Web App authenticates to AI services.

Azure Web Apps support system-assigned (and user-assigned) managed identities, which should be preferred over API keys for calling Azure AI services.

Benefits:

  • No secrets in configuration
  • Automatic credential rotation
  • Centralized access control via Azure RBAC

For example, the Web App’s managed identity can be granted the Cognitive Services OpenAI User role on the Azure OpenAI resource.
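In Bicep, that grant is a role assignment scoped to the OpenAI resource. A sketch, assuming `openAi` and `appService` are resource symbols declared elsewhere in the template; the GUID is the built-in Cognitive Services OpenAI User role definition ID and is worth verifying against the Azure built-in roles reference:

```bicep
// Grant the Web App's system-assigned identity access to Azure OpenAI.
resource openAiUserAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(openAi.id, appService.id, 'openai-user')
  scope: openAi
  properties: {
    principalId: appService.identity.principalId
    // 'Cognitive Services OpenAI User' built-in role definition ID
    roleDefinitionId: subscriptionResourceId(
      'Microsoft.Authorization/roleDefinitions',
      '5e0bd9bd-7b93-4f28-af87-19fc36ad61bd'
    )
    principalType: 'ServicePrincipal'
  }
}
```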


Networking: Public vs Private Access

For development or low-risk workloads, public endpoints may be acceptable. For production and regulated environments, private networking is strongly recommended.


Private endpoint architecture eliminating public exposure of AI services.

Key components:

  • VNet-integrated Azure Web App
  • Private Endpoint for Azure OpenAI Service
  • Private DNS zone resolution

This ensures that AI traffic never traverses the public internet.


Secure Configuration with Azure Key Vault

Application configuration typically includes:

  • Model deployment names
  • Token limits
  • Feature flags
  • Non-secret operational settings

Secrets (if any remain) should live in Azure Key Vault, accessed using the Web App’s managed identity. Azure Web Apps natively support Key Vault references in app settings, eliminating the need for runtime SDK calls in many cases.
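For example, an app setting can reference a Key Vault secret directly, and App Service resolves it at runtime using the managed identity (the vault and secret names below are placeholders):

```
AZURE_OPENAI_API_KEY = @Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/)
```

The application reads this like any other environment variable; no Key Vault SDK code is required.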


Infrastructure as Code: Bicep Example

Below is a simplified Bicep example deploying:

  • An Azure Web App
  • A system-assigned managed identity
  • Secure app settings

resource appService 'Microsoft.Web/sites@2023-01-01' = {
  name: 'ai-webapp-prod'
  location: resourceGroup().location
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    serverFarmId: appServicePlan.id
    siteConfig: {
      appSettings: [
        {
          name: 'AZURE_OPENAI_ENDPOINT'
          value: 'https://my-openai-resource.openai.azure.com/'
        }
        {
          name: 'APPLICATIONINSIGHTS_CONNECTION_STRING'
          value: appInsights.properties.ConnectionString
        }
      ]
    }
  }
}

This approach keeps infrastructure declarative and auditable, while relying on Azure-native identity instead of secrets.


Terraform vs Bicep for AI Web Apps

Aspect | Bicep | Terraform
Azure-native support | Excellent | Very good
Multi-cloud | No | Yes
Learning curve | Lower for Azure teams | Higher
Azure feature parity | Immediate | Sometimes delayed

For Azure-only AI workloads, Bicep offers tighter alignment with new App Service and Azure AI features. Terraform remains valuable in multi-cloud or heavily standardized environments.


Observability and Monitoring

AI applications require more than standard HTTP metrics. At minimum, you should capture:

  • Request latency (end-to-end)
  • Token usage (where available)
  • Model error rates
  • Throttling or quota-related failures

Azure Web Apps integrates natively with Application Insights, enabling correlation between HTTP requests and outbound AI calls when instrumented correctly.
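As an illustration, a thin wrapper can emit latency and token-usage telemetry regardless of SDK. This is a sketch: `track_metric` stands in for an Application Insights client call, and the `usage.prompt_tokens` / `usage.completion_tokens` fields follow the common OpenAI-style response shape, which may differ by SDK:

```python
import time

def call_with_telemetry(call_model, prompt: str, track_metric) -> str:
    """Invoke the model and record latency and token usage as custom metrics."""
    start = time.perf_counter()
    response = call_model(prompt)
    elapsed_ms = (time.perf_counter() - start) * 1000
    track_metric("ai_request_latency_ms", elapsed_ms)
    usage = getattr(response, "usage", None)
    if usage is not None:  # token counts are not always returned
        track_metric("ai_prompt_tokens", usage.prompt_tokens)
        track_metric("ai_completion_tokens", usage.completion_tokens)
    return response.text
```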


Deployment Checklist

  • Azure Web App deployed with managed identity
  • Azure OpenAI access granted via RBAC
  • Private endpoints enabled for production
  • Secrets removed from code and configuration
  • Application Insights enabled and validated
  • Prompt and token usage reviewed for cost efficiency

Further Reading

  • Azure Web Apps overview – Microsoft Learn
  • Azure OpenAI Service security and networking
  • Managed identities for Azure resources
  • Private endpoints and App Service VNet integration
  • Infrastructure as Code with Bicep

Deploying AI applications to Azure Web Apps is less about model hosting and more about secure orchestration. By combining managed identities, private networking, and infrastructure as code, you can build AI-powered systems that are scalable, auditable, and production-ready without unnecessary complexity.

I hope you found this article useful.



Azure Web Apps and AI — What’s New from Microsoft Ignite 2025

Microsoft Ignite 2025 reaffirmed that AI is now a first-class development model across the Azure platform, not just an optional add-on. Among the many announcements at Ignite, updates to Azure App Service (Web Apps) and the broader Azure AI ecosystem directly affect how developers build intelligent, cloud-native applications — including web apps augmented with AI capabilities, semantic search, and agentic workflows.

This article walks through:

  • Key App Service / Web Apps announcements
  • The AI platform enhancements that Web Apps can leverage
  • Architectural patterns for integrating AI into web workloads
  • Infrastructure automation with Terraform and secure deployment patterns

What’s New in Azure App Service at Ignite 2025

At Ignite 2025, the Azure App Service team announced a set of updates focused on modernization, developer productivity, and AI integration readiness:

1. Managed Instance on Azure App Service (Public Preview)

A major new capability — Managed Instance on Azure App Service — entered public preview. It is designed to simplify migration of legacy web applications (especially classic ASP.NET and .NET apps with Windows dependencies) to a managed PaaS runtime with minimal code changes.

Key benefits:

  • Run legacy Windows web apps with Hyper-V nested virtualization.
  • Use configuration and installation scripts for dependencies rather than rewriting code.
  • Maintain automatic OS and .NET patching and updates.
  • Direct RDP access to instances for troubleshooting in complex migration scenarios.

This matters for AI because modernization often precedes adding AI-driven features — a classic app first needs to run reliably on App Service before layering in services like semantic search, chat UI APIs, or automated content enrichment workflows.


2. Enhanced Runtime and Language Support

Alongside the Managed Instance preview, App Service continues to expand support for modern runtimes, frameworks, and developer experiences. While not AI-specific, these enhancements make it easier to host intelligent applications built with:

  • .NET 8/9 and ASP.NET Core
  • Node.js 20+
  • Java 25 and beyond
  • Containers on App Service (Linux) with improved tooling

These runtimes interoperate cleanly with Azure AI APIs and SDKs. For example, a Node.js web app can call Azure AI Search or an LLM API directly via SDK or REST.


Azure AI Platform Enhancements from Ignite 2025

Ignite 2025 made clear that the foundation for AI-driven web apps is not just the compute layer, but platform services that encapsulate semantic understanding, retrieval-augmented generation (RAG), and agentic workflows.

1. Microsoft Foundry — Unified AI Agent Platform

At Ignite, Azure AI Foundry was rebranded and enhanced as “Microsoft Foundry”, a unified platform for building, deploying, and governing enterprise-grade AI agents across workloads. Foundry now supports:

  • Multi-agent orchestration
  • Open standards for models (including Anthropic Claude and OpenAI models)
  • Seamless integration with enterprise data and APIs

Foundry is not App Service, but it is the AI backbone you are likely to call from web apps for:

  • Conversational AI interfaces
  • Workflow automation (e.g., ticket triaging, contextual assistants)
  • Long-running agentic tasks tied to user sessions or backend triggers

More on Foundry’s updates is available in the Microsoft Tech Community announcement and documentation.

2. Semantic Retrieval and AI Search

Azure AI Search (formerly Cognitive Search) continues to evolve with RAG-friendly patterns that integrate vector search, semantic ranking, and LLMs. This makes it much easier to add “chat with your data” experiences into web UIs.

Azure AI Search documentation is here:
🔗 https://learn.microsoft.com/azure/search/what-is-azure-ai-search

Typical patterns for Web Apps include:

  • Document ingestion pipelines (Azure Blob, OneLake, Cosmos DB)
  • Indexer + semantic search for natural language queries
  • LLM integration to summarize and respond conversationally
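The retrieve-then-generate flow behind these patterns fits in a few lines; in this Python sketch, `search` and `generate` are hypothetical stand-ins for the Azure AI Search client and a model call:

```python
def answer_with_rag(question: str, search, generate, top_k: int = 3) -> str:
    """Minimal RAG orchestration: retrieve context, then ground the model on it."""
    docs = search(question, top=top_k)                 # semantic/vector retrieval
    context = "\n\n".join(d["content"] for d in docs)  # concatenate retrieved chunks
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return generate(prompt)
```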

Patterns for Integrating AI into Azure Web Apps

Below are architectural patterns you might adopt when extending web apps with AI capabilities following Ignite 2025.

Pattern 1 — RAG-Driven Conversational UI

  1. Web App Frontend on App Service (e.g., React, Blazor).
  2. API Layer in Azure Functions or .NET backend to handle requests.
  3. Azure AI Search + vector store for semantic retrieval.
  4. OpenAI or Foundry Models for response generation.

Flow:

  1. User asks a question in web UI.
  2. Backend calls Azure AI Search to retrieve relevant documents.
  3. Retrieved context is sent to a generative model.
  4. Model output is returned to the UI.
For example, calling a completion endpoint with the Azure.AI.OpenAI SDK, authenticating with the managed identity rather than an API key:

// AZURE_OPENAI_ENDPOINT is set in app settings; DefaultAzureCredential
// resolves to the Web App's managed identity at runtime.
var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
var client = new Azure.AI.OpenAI.OpenAIClient(
    new Uri(endpoint), new Azure.Identity.DefaultAzureCredential());
var response = await client.GetCompletionsAsync(
    deploymentOrModelName: "gpt-4.1-enterprise",
    new Azure.AI.OpenAI.CompletionsOptions
    {
        Prompts = { "Summarize these docs for a tech user: ..." }
    });

Pattern 2 — Agentic Backend Workflows

If your web app needs to trigger longer-running workflows (e.g., order fulfillment automation, customer support routing), you can:

  1. Expose an HTTP trigger from your Web App or Azure Function.
  2. Hand off processing to a Foundry agent (via API) that orchestrates multi-step logic.
  3. Use Queues (Service Bus, Storage Queues) for reliable message passing.

This pattern decouples UI from backend processing, letting agents execute tasks with traceability and governance — critical for compliance.
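A minimal sketch of that handoff, using Python's standard `queue` module as an in-process stand-in for Service Bus or Storage Queues:

```python
import json
import queue

def enqueue_task(q: queue.Queue, task_type: str, payload: dict) -> str:
    """Web App / Function side: accept the request, enqueue it, return immediately."""
    message = {"type": task_type, "payload": payload}
    q.put(json.dumps(message))  # a Service Bus send would replace this in production
    return "accepted"

def worker_step(q: queue.Queue, handlers: dict) -> str:
    """Agent/worker side: pull one message and dispatch to the matching handler."""
    message = json.loads(q.get())
    return handlers[message["type"]](message["payload"])
```

The UI gets an immediate acknowledgement, while the agent processes the message on its own schedule, with the queue providing durability and retry semantics.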


Infrastructure as Code — Terraform & Bicep

Automation and repeatability are essential. Below is a sample Terraform snippet to provision an App Service with a Managed Identity (for secure calls to Azure AI).


# Requires azurerm provider 3.x or later (azurerm_app_service_plan and
# azurerm_app_service are deprecated in favour of the resources below).
provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "rg" {
  name     = "rg-webapp-ai"
  location = "westeurope"
}

resource "azurerm_service_plan" "plan" {
  name                = "asp-webai"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  os_type             = "Linux"
  # Premium v4 as announced at Ignite; fall back to "P1v3" if P1v4 is not
  # yet available in your region or provider version.
  sku_name            = "P1v4"
}

resource "azurerm_linux_web_app" "webapp" {
  name                = "webapp-ai"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  service_plan_id     = azurerm_service_plan.plan.id

  # System-assigned managed identity for keyless calls to Azure AI services
  identity {
    type = "SystemAssigned"
  }

  site_config {
    application_stack {
      dotnet_version = "8.0"
    }
  }

  app_settings = {
    "WEBSITE_RUN_FROM_PACKAGE" = "1"
    "AZURE_OPENAI_ENDPOINT"    = var.openai_endpoint
    "AZURE_OPENAI_MODEL"       = var.openai_model
  }
}

A couple of notes on the snippet:

  1. Use managed identities to authenticate to Azure AI services instead of static keys.
  2. You can extend this snippet with Private Endpoints, Key Vault references, and app settings for AI service integration.

Terraform vs Bicep vs ARM:

  • Terraform excels with multi-cloud teams and state management.
  • Bicep/ARM provide first-class Azure tooling and tighter integration with Azure RBAC.
  • Choose based on team skills and governance requirements.

Security, Governance, and Identity

Ignite announcements highlighted enterprise-grade security for AI agents and services, including identity integration and access policies. For Web Apps calling AI APIs:

  • Use Managed Identity and Azure RBAC instead of connection strings.
  • Store secrets (if needed) in Azure Key Vault with MSI access.
  • Secure backend APIs with Azure AD tokens.

Summary

Ignite 2025 reinforced that AI is deeply woven into the future of Azure compute and application platforms:

  • Azure App Service continues to modernize with managed instances, enhanced runtimes, and improved migration tooling.
  • The Azure AI ecosystem — especially Microsoft Foundry and Azure AI Search — enables developers to add semantic, conversational, and agentic capabilities to web apps.
  • Architectural patterns (RAG, agent workflows) and secure automation (Terraform, managed identity) help deliver production-grade intelligent applications.

As you plan your next wave of web apps, consider AI as a core design axis, not an afterthought.



Using GitHub Copilot with Azure

This is my entry for this year’s Azure Back To School. Massive shout-out to Dwayne Natwick for organising this every year!

If you’re building on Azure, integrating GitHub Copilot into your workflow can save time, reduce friction, and help with infrastructure, deployment, and debugging, not just writing business logic.

Below is an explanation of how to use it, what works well, and what to be aware of.

What “Copilot + Azure” means today

Here’s how the two worlds overlap currently:

GitHub Copilot for Azure: a VS Code extension that lets you ask about Azure, manage resources, deploy, and diagnose, all from the Copilot chat interface.

Copilot + Azure DevOps / Azure Repos: You can use Copilot with Azure Repos for suggestions, commit messages, and PR descriptions.

Agentic DevOps vision: Copilot is evolving into autonomous “agents” that can perform multi-step tasks, such as refactoring, testing, and fixing bugs. Azure pipelines, boards, and resource operations may tie into that.

Inner sourcing/knowledge reuse via MCP server: You can integrate Azure DevOps/Azure MCP server with Copilot, allowing it to suggest content from your organisation’s own modules or documentation.

Azure Boards integration: you can assign work items from Azure Boards to a Copilot coding agent, and track progress.

So it’s not just “autocomplete in VS Code + Azure SDK”; Copilot is pushing into infrastructure, operations, and agentic automation.

How to get started (practical steps)

  • Install Copilot and the Azure extension
    • In VS Code, get the GitHub Copilot for Azure extension.
    • You need a Copilot license (Pro, Business, etc.)
  • Use @azure prompts in Copilot Chat
    • Once the extension is active, you can prefix prompts with @azure to query Azure resource info, diagnose, or even do operations.

Examples:

@azure Deploy an Azure Function HTTP trigger with .NET 8
@azure What are the cost tiers for Azure SQL in West Europe?
@azure Diagnose why my webapp is showing 500 errors
  • Deploy from within the editor
    • Copilot for Azure can suggest CLI commands, resource templates, or deployment steps without you switching to the Azure Portal.
  • Use Copilot in code + infra files
    • When you edit ARM templates, Bicep, Terraform alongside app code, Copilot can help complete resource definitions, parameter scaffolding, and provide relevant snippets.
  • Explore agentic features
    • Try Copilot agents for multi-file changes, code refactors, or cloud migration tasks. (Feature availability depends on your subscription and preview access.)
  • Enable MCP / context servers
    • For tighter integration (e.g. your organization’s code modules, Azure DevOps context), you can configure MCP (Model Context Protocol) servers so Copilot has richer context.

Benefits you’ll see

  • Faster iteration: fewer trips to Azure portal or docs.
  • Context-aware suggestions: Copilot knows your Azure setup or resource naming.
  • Multi-step automation: not just code, but “deploy, test, monitor” flows.
  • Consistency: reuse corporate standards or templates via inner sourcing.
  • Better dev-ops synergy: bridging code and cloud operations in one interface.

Example scenario

Let’s say you’re building a serverless API on Azure Functions + Cosmos DB.

@azure Create an Azure Function project with HTTP trigger, .NET 8

Copilot responds with scaffolding commands and a project template.

@azure Provision a Cosmos DB instance with RU/s 400, region North Europe

Next, try:
@azure Deploy this function and connect to Cosmos DB

It issues az commands or points you to CI/CD YAML.

If you see any errors after deployment:

@azure Diagnose 500 error in function logs

It helps inspect logs and points you to misconfigurations or missing settings.

Meanwhile, inside code files, Copilot suggests resource names, configuration keys, and helper snippets.

Future direction & what to watch

  • More tasks handed over to the coding agent (not just suggestions).
  • Better integration into Azure services (monitoring, cost, policies).
  • Richer context via MCP so that Copilot is aware of your entire org’s patterns.
  • Tighter link between Azure Boards / Pipelines and Copilot agents (you can already assign work items to Copilot).
  • Stronger guardrails: security analysis baked into generated infra code.

Summary

I use GitHub Copilot on a daily basis, and I recommend you at least take a look at it, if not use it daily too!




Microsoft Build: My Takeaways from This Year’s Conference

This year’s Microsoft Build was full of new releases, new services, new ways of doing things, and, yep, lots of AI. The following are just some of the areas that grabbed my interest while attending the conference. Because I was working at the event, I haven’t yet had time to try the demos, check out all of the announcements, and get hands-on with the newly released tech, but I will in the coming weeks.

If, like me, you haven’t managed to catch up on the announcements, you can read the Book of News for Build 2025.

If you are looking to upgrade your existing .NET Framework application to .NET Core, then check out these links:

Docs: https://learn.microsoft.com/en-us/dotnet/core/porting/upgrade-assistant-overview
Upgrade Extension: https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.upgradeassistant

If you would like Copilot to help you upgrade your version of a .NET Core application, then here are some very useful links.

There is also some help available if you are just starting out and would like help deploying your application to Azure – Quickstart: Deploy your application to Azure with agent mode in GitHub Copilot for Azure

If you are interested in hosting remote MCP servers in Azure App Service, then this article has you covered – https://techcommunity.microsoft.com/blog/appsonazureblog/host-remote-mcp-servers-in-azure-app-service/4405082

Two of my favourite recent announcements were the GitHub Copilot coding agent and the new SRE Agent coming soon (you can sign up for this preview now!). You can read more about the SRE Agent here – https://techcommunity.microsoft.com/blog/azurepaasblog/introducing-azure-sre-agent/4414569

Interested in some AI labs? Then look no further than https://ai.azure.com/labs

Maybe you like creating videos? Within Azure you can now create high-quality visual content with GPT-Image-1 and Sora on Azure OpenAI, tailored for professional use cases – https://github.com/Azure-Samples/visionary-lab



Upgrading Your .NET Applications: Exploring .NET Upgrade Assistants at Microsoft Build

Microsoft Build is the flagship event for developers, showcasing the latest tools, frameworks, and innovations to empower modern software development. Among the highlights in recent years has been the focus on modernizing .NET applications, particularly through the .NET Upgrade Assistant tools. These tools streamline the transition from legacy .NET Framework to modern .NET (formerly .NET Core) and support upgrades between .NET versions. Additionally, the integration of GitHub Copilot in Visual Studio Code (VS Code) has added an AI-powered dimension to the upgrade process, making it smarter and more efficient. In this blog post, I’ll dive into the .NET Upgrade Assistant for migrating from .NET Framework to .NET Core, explore the .NET Core Upgrade Assistant, and highlight how Copilot in VS Code enhances these processes.

The Need for .NET Modernization

The .NET ecosystem has evolved significantly since the days of .NET Framework. With the introduction of .NET Core (now simply .NET), Microsoft unified its development platform to support cross-platform applications, improved performance, and modern cloud-native architectures. However, many organizations still rely on .NET Framework applications built years ago, which are tied to Windows and lack the scalability and features of modern .NET. Upgrading to .NET 8 or 9 (the latest Long-Term Support and Standard-Term Support versions as of 2025) unlocks benefits like enhanced performance, new APIs, and better cloud integration.

The challenge? Migrating legacy applications can be complex, involving changes to project structures, dependencies, and codebases. This is where the .NET Upgrade Assistant comes in, offering automated tools to simplify the process. At Microsoft Build, these tools have been showcased as critical for developers looking to modernize their applications efficiently.

.NET Upgrade Assistant: From .NET Framework to .NET Core

The .NET Upgrade Assistant is a powerful tool designed to help developers migrate .NET Framework applications to modern .NET. Available as both a Visual Studio extension and a command-line interface (CLI) tool, it automates many manual tasks, such as updating project files, converting to SDK-style projects, and addressing code incompatibilities. Let’s break down its key features and how it was highlighted at Microsoft Build.

Key Features of the .NET Upgrade Assistant

  1. Project File Conversion: The .NET Upgrade Assistant converts legacy .NET Framework project files to the modern SDK-style format used by .NET Core and beyond. This is a critical step, as the SDK-style format simplifies project configuration and supports cross-platform development. The tool leverages the try-convert utility to automate this process, reducing the need for manual edits.
  2. Code Analysis and Fixes: The assistant includes a robust analysis engine that scans your codebase for incompatibilities, such as deprecated APIs or platform-specific dependencies. It generates a detailed report with status icons (e.g., green checkmarks for successful upgrades, yellow warnings for issues needing attention, or red Xs for failures) and logs actions in the Visual Studio Output window. This helps developers prioritize fixes and ensure a smooth migration.
  3. Incremental Upgrades: For complex applications, such as ASP.NET web apps, the tool supports a side-by-side incremental upgrade approach. This creates a new .NET project alongside the existing .NET Framework project, allowing developers to migrate endpoints gradually while keeping the application functional. This is particularly useful for large-scale projects where a full rewrite isn’t feasible.
  4. NuGet Package Management: The assistant updates NuGet package references to compatible versions for the target .NET version. Recent updates, as announced at Microsoft Build, also support upgrading to Centralized Package Management (CPM), which simplifies dependency management across multiple projects.
  5. Extensibility: The tool supports third-party extensions through package and API mappings, allowing vendors to define how their libraries should be upgraded. This ensures compatibility with external dependencies, a common pain point in migrations.
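For reference, the SDK-style format that the conversion in step 1 produces is dramatically leaner than a legacy project file; a minimal web project targeting .NET 9 looks roughly like this:

```xml
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <Nullable>enable</Nullable>
  </PropertyGroup>
</Project>
```

Package references and source files are picked up by convention, which is why the conversion removes most of the boilerplate found in old .csproj files.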

Using the .NET Upgrade Assistant in Visual Studio

To use the .NET Upgrade Assistant in Visual Studio:

  1. Install the Extension: Available from the Visual Studio Marketplace, the extension integrates seamlessly with Visual Studio 2022 (version 17.1 or newer). You can verify installation by checking for an “Upgrade” option when right-clicking a project in Solution Explorer.
  2. Run the Upgrade: Right-click your project, select “Upgrade,” and follow the wizard to choose options like in-place upgrades (modifying the original project) or side-by-side upgrades (creating a copy). Select the target framework (e.g., .NET 8.0 or 9.0) and let the tool handle project file updates and code fixes.
  3. Review and Test: After the upgrade, review the generated report for any issues. Thorough testing is crucial, as some manual refactoring may be required, especially for ASP.NET to ASP.NET Core migrations.

Microsoft Build sessions have emphasized the tool’s ability to reduce migration time by automating repetitive tasks, with real-world examples showing successful upgrades of complex solutions. However, as noted in Build discussions, manual intervention is often needed for edge cases, such as unsupported APIs or third-party dependencies.

.NET Core Upgrade Assistant: Moving Between .NET Versions

For developers already on .NET Core or earlier .NET versions (e.g., .NET 5 or 6), the .NET Upgrade Assistant also supports upgrades to newer versions, such as .NET 8 or 9. This process is generally simpler than migrating from .NET Framework, as the project structure and APIs are more aligned. Key aspects include:

  • Target Framework Updates: The assistant updates the <TargetFramework> property in project files (e.g., from net6.0 to net9.0). This is often the only change needed for simple projects, as highlighted in Microsoft Build demos.
  • Dependency Resolution: The tool identifies and updates NuGet packages to versions compatible with the target framework, addressing security vulnerabilities or deprecated packages.
  • Code Assessment: Enhanced in 2024, the assistant’s code assessment features scan for potential issues at the source code level, providing a dashboard with issue severity and remediation effort estimates. This was a major focus at Build, showcasing how developers can pinpoint and resolve issues quickly.

For example, a Build session demonstrated upgrading a .NET 6 Razor Pages project to .NET 9, where the assistant updated NuGet packages like Microsoft.EntityFrameworkCore from version 6.0 to 9.0 and flagged a test failure for manual review. The process was completed with minimal manual changes, thanks to the tool’s automation.

GitHub Copilot in VS Code: Enhancing Upgrades

At Microsoft Build 2025, a significant highlight was the integration of GitHub Copilot with the .NET Upgrade Assistant, particularly through the “GitHub Copilot app modernization – Upgrade for .NET” extension. While this extension doesn’t yet support direct .NET Framework to .NET migrations, it excels at modernizing .NET Core projects and enhancing the upgrade experience in VS Code.

How Copilot Helps

  1. AI-Powered Guidance: Copilot analyzes your codebase and generates an upgrade plan, suggesting changes like updating target frameworks or modernizing APIs. It uses natural language prompts, allowing you to ask, “Upgrade my solution to .NET 9,” and it responds with a step-by-step plan.
  2. Automated Code Changes: Copilot applies transformations automatically, such as updating NuGet packages or refactoring code to use newer APIs. It commits changes to Git at each step, enabling easy rollbacks if needed.
  3. Learning from Manual Fixes: When manual intervention is required, Copilot learns from your changes and applies them to similar issues later, reducing repetitive work. This was showcased at Build with a demo upgrading a .NET 6 MVC project, where Copilot adapted to developer fixes in real time.
  4. Integration with VS Code: In VS Code, Copilot’s inline suggestions and chat interface make it easy to interact with the upgrade process. For example, you can enable Agent Mode, select the “Upgrade” tool, and let Copilot guide you through the process.

Getting Started in VS Code

To use Copilot for .NET upgrades in VS Code:

  1. Install Extensions: Ensure the GitHub Copilot and C# Dev Kit extensions are installed. A GitHub Copilot subscription is required.
  2. Enable Agent Mode: Go to the Copilot Chat window, select “Agent,” and choose the “Upgrade” tool.
  3. Start the Upgrade: Use a prompt like “Upgrade my project to .NET 9.” Copilot will analyze the project, apply changes, and provide a report with Git commit hashes and next steps.

Build sessions highlighted Copilot’s ability to reduce upgrade time by automating repetitive tasks and providing intelligent suggestions, though some limitations were noted, such as incomplete support for .NET Framework migrations.

Best Practices and Considerations

  • Backup Your Code: Always back up your project before running upgrades, as both the .NET Upgrade Assistant and Copilot make significant changes.
  • Test Thoroughly: Automated tools handle much of the process, but manual testing is essential to catch runtime issues, especially for complex applications.
  • Check Dependencies: Ensure third-party dependencies support the target .NET version. The assistant’s code assessment helps identify these issues early.
  • Leverage Community Feedback: Microsoft Build emphasized community contributions to the .NET Upgrade Assistant’s GitHub repository, where developers can report issues or suggest features.

Summary

Microsoft Build has positioned the .NET Upgrade Assistant as a cornerstone for modernizing .NET applications, offering robust tools for transitioning from .NET Framework to .NET Core and upgrading between .NET versions. The integration of GitHub Copilot in VS Code adds an AI-driven layer, making upgrades smarter and more interactive. Whether you’re using Visual Studio for a guided experience or VS Code with Copilot’s AI assistance, these tools empower developers to modernize their applications with confidence. As .NET continues to evolve, leveraging these assistants ensures your applications stay performant, secure, and ready for the future.

For more details, check out the .NET Upgrade Assistant on the Visual Studio Marketplace or explore Copilot’s capabilities at Microsoft Learn.



Automating Deployment of Azure Policies using Bicep

Introduction

This blog post is part of this year’s Azure Spring Clean, an event run to promote well-managed Azure tenants. To achieve this, the event features community-driven articles that highlight best practices, lessons learned, and help with some of the more difficult topics of Azure management.

Azure Policy is a powerful governance tool that helps organizations enforce compliance across their Azure environments. By automating the deployment of Azure Policies using Bicep and the Azure Verified Modules (AVM) GitHub repository, you can ensure consistent policy enforcement while leveraging modular, reusable infrastructure as code.

This guide assumes you already have your environment set up in VS Code, including Bicep tooling and Azure CLI authentication.

Prerequisites

Before deploying Azure Policies with Bicep, ensure you have:

  • VS Code with the Bicep extension installed.
  • Azure CLI installed and authenticated (az login).
  • Bicep CLI installed (az bicep install if needed).
  • Git installed and the Azure Verified Modules (AVM) repository cloned.
  • Appropriate permissions to create and assign policies in Azure.

Deploying Policies to Management Groups and Subscriptions

Deploying policies at the management group level is a best practice for organizations that manage multiple subscriptions under a common governance framework. By applying policies at this higher level, you can ensure:

  • Consistency: Enforce compliance standards across all subscriptions within the management group without the need for redundant deployments.
  • Efficiency: Reduce operational overhead by managing policies centrally instead of applying them individually to each subscription.
  • Scalability: As new subscriptions are added to the management group, they automatically inherit the assigned policies, ensuring continuous compliance.

To apply policies at different scopes, use the following commands:

Deploying to a Management Group

location='westeurope'
managementGroupId='mg-demo'

az deployment mg create \
  --management-group-id $managementGroupId \
  --location $location \
  --template-file main.bicep \
  --parameters @parameters.json \
  --name MGPolicyDeployment

Deploying to a Subscription

az deployment sub create \
  --location eastus \
  --template-file main.bicep \
  --parameters @parameters.json \
  --name SUBPolicyDeployment
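Both commands above pass a JSON parameters file. As a minimal, hypothetical example matching the toggle parameters used later in this post, parameters.json could look like this:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "deployAllowedLocations": { "value": true },
    "deployIso27001Policy": { "value": true },
    "deployAzureSecurityBenchmark": { "value": true }
  }
}
```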

You can also verify the assignments in the Azure Portal under Policy -> Assignments.

Let’s take a look at some example Azure Policy assignments you may want to add to your management groups. In this example I would add them to a file called deployPolicyMg.bicep:

targetScope = 'managementGroup'

param deployAllowedLocations bool = true
param deployIso27001Policy bool = true
param deployAzureSecurityBenchmark bool = true
param primaryLocation string = 'westeurope' // example value
param allowedLocations array = ['westeurope'] // example value

@description('Policy Assignment Management Group - Allowed Locations')
module assignAllowedLocationPolicy 'policyAssignmentMg.bicep' = if (deployAllowedLocations) {
  name: 'AllowedLocations'
  params: {
    name: 'Allowed Locations'
    displayName: 'Allowed Locations'
    policyDefinitionId: '/providers/Microsoft.Authorization/policyDefinitions/e56962a6-4747-49cd-b67b-bf8b01975c4c'
    location: primaryLocation
    identity: 'None'
    parameters: {
      listOfAllowedLocations: {
        value: allowedLocations
      }
    }
  }
}

@description('Policy Assignment Management Group - ISO 27001-2013')
module assignIso27001Policy 'policyAssignmentMg.bicep' = if (deployIso27001Policy) {
  name: 'Iso27001'
  params: {
    name: 'ISO 27001-2013'
    displayName: 'ISO 27001-2013'
    policyDefinitionId: '/providers/Microsoft.Authorization/policySetDefinitions/89c6cddc-1c73-4ac1-b19c-54d1a15a42f2'
    location: primaryLocation
    identity: 'SystemAssigned'
    roleDefinitionIds: []
  }
}

@description('Policy Assignment Management Group - Azure Security Benchmark')
module assignAscPolicy 'policyAssignmentMg.bicep' = if (deployAzureSecurityBenchmark) {
  name: 'AzureSecurityBenchmark'
  params: {
    name: 'Azure Security Benchmark'
    displayName: 'Azure Security Benchmark'
    policyDefinitionId: '/providers/Microsoft.Authorization/policySetDefinitions/1f3afdf9-d0c9-4c3d-847f-89da613e70a8'
    location: primaryLocation
    identity: 'None'
    roleDefinitionIds: []
  }
}
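The modules above reference a policyAssignmentMg.bicep file that isn’t shown in this post. As a rough sketch of what such a module could contain (the parameter names come from the calls above; the API version and identity handling are my assumptions, not the author’s actual module):

```bicep
targetScope = 'managementGroup'

param name string
param displayName string
param policyDefinitionId string
param location string
param identity string = 'None'
param parameters object = {}
param roleDefinitionIds array = []

// Create the policy assignment at the management group scope.
resource assignment 'Microsoft.Authorization/policyAssignments@2022-06-01' = {
  name: name
  location: location
  identity: identity == 'SystemAssigned' ? { type: 'SystemAssigned' } : { type: 'None' }
  properties: {
    displayName: displayName
    policyDefinitionId: policyDefinitionId
    parameters: parameters
  }
}

// Note: role assignments for the managed identity (roleDefinitionIds) are
// omitted here for brevity.
```

A subscription-scoped version (policyAssignmentSub.bicep) would be near-identical, apart from targetScope = 'subscription'.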

The above code example makes use of some parameters which I set in a .bicepparam file, which would look like this:

using './deployPolicyMg.bicep'

param deployAllowedLocations = true
param deployIso27001Policy = true
param deployAzureSecurityBenchmark = true


Let’s take a look at some example Azure Policy assignments you may want to add to a subscription. In this example I would add them to a file called deployPolicySub.bicep:

targetScope = 'subscription'

param tagAtSubscriptionLevel bool = true
param ownerTagResourceGroupsPolicy bool = true
param deployedByTagResourceGroupsPolicy bool = true
param primaryLocation string = 'westeurope' // example value
param ownerTagName string = 'Owner' // example value
param deployByTagName string = 'DeployedBy' // example value

@description('Assign Policies to Subscription - Require an Owner tag on resource groups')
module assignRequireRgOwnerTagPolicy 'policyAssignmentSub.bicep' = if (tagAtSubscriptionLevel && ownerTagResourceGroupsPolicy) {
  name: 'requireRgOwnerTagPolicy'
  params: {
    name: 'Require an Owner tag on resource groups'
    displayName: 'Require an Owner tag on resource groups'
    policyDefinitionId: '/providers/Microsoft.Authorization/policyDefinitions/96670d01-0a4d-4649-9c89-2d3abc0a5025'
    location: primaryLocation
    identity: 'None'
    parameters: {
      tagName: {
        value: ownerTagName
      }
    }
  }
}

@description('Assign Policies to Subscription - Require a DeployedBy tag on resource groups')
module assignRequireRgDeployedByTagPolicy 'policyAssignmentSub.bicep' = if (tagAtSubscriptionLevel && deployedByTagResourceGroupsPolicy) {
  name: 'requireRgDeployedByTagPolicy'
  params: {
    name: 'Require a DeployedBy tag on resource groups'
    displayName: 'Require a DeployedBy tag on resource groups'
    policyDefinitionId: '/providers/Microsoft.Authorization/policyDefinitions/96670d01-0a4d-4649-9c89-2d3abc0a5025'
    location: primaryLocation
    identity: 'None'
    parameters: {
      tagName: {
        value: deployByTagName
      }
    }
  }
}

The code above is an example of how you could assign Azure Policies to a subscription. It makes use of some parameters which I set in a .bicepparam file, which would look like this:

using './deployPolicySub.bicep'

param tagAtSubscriptionLevel = true
param ownerTagResourceGroupsPolicy = true
param deployedByTagResourceGroupsPolicy = true

Conclusion

By leveraging Azure Bicep and Azure Verified Modules, you can automate and standardize Azure Policy deployment efficiently. Start using AVM today to maintain governance and compliance effortlessly!



Festive Tech Calendar 2024 YouTube Videos

This year’s Festive Tech Calendar videos are available on YouTube with the link to the playlist.

A huge THANK YOU to everyone who took part, everyone who watched the videos and learned something, and especially those who donated to this year’s charity – thanks from all of us at the Festive Tech Calendar team!

Videos were published for Days 1–11, 13–23, and 25–31 of the calendar.




Get TenantId for any Azure Subscription

How I Used GitHub Copilot to Write a PowerShell GUI for Azure Tenant ID Lookup

When tasked with creating a PowerShell GUI to retrieve the Azure Tenant ID for any subscription, I decided to rely entirely on GitHub Copilot. Here’s how I did it, without manually writing a single line of code myself. (repo -> https://github.com/gsuttie/getTenantIdFromAzureSubscriptionId)

Setting Up

  1. Open Visual Studio Code: My preferred development environment. I enabled GitHub Copilot for code suggestions.
  2. Define Goals:
    • A user-friendly GUI for inputting an Azure Subscription ID.
    • Backend logic to retrieve the Tenant ID using Azure PowerShell.
    • Automatically generate inline documentation and a comprehensive README file.

Prompting GitHub Copilot

  • I started by creating a new PowerShell file and giving Copilot the following prompt: Create a PowerShell script for a GUI that accepts an Azure Subscription ID, retrieves the Tenant ID using `Get-AzSubscription`, and displays it. Include inline comments and generate a README.
  • I then tweaked the prompt a few times; the end result can be found in the GitHub repo linked above.

Documentation and README

I added another comment to the script:

Generate a README file explaining the purpose of this script, its usage, prerequisites, and examples.

Copilot produced a structured README covering:

  • Purpose: Explaining the script’s function.
  • Usage: Step-by-step instructions on running the script.
  • Prerequisites: Details about Azure PowerShell modules and authentication requirements.
  • Example: A sample input and output demonstration.

Testing and Tweaking

I tested the script on a sample Azure environment. While functional, the GUI layout needed minor adjustments. I prompted Copilot with:

Improve the alignment and spacing of GUI elements.

This fine-tuned the interface, making it visually cleaner.

Final Output

With GitHub Copilot, I:

  • Built a functional PowerShell GUI to retrieve Azure Tenant IDs.
  • Included inline comments and documentation.
  • Generated a detailed README without writing any code manually.

Summary

GitHub Copilot significantly accelerated the development process. While it handled 95% of the work, reviewing and testing were key to ensuring functionality and usability. This approach is ideal for tasks where speed and automation are priorities.




Bicep snippets using Azure Verified Modules

This blog post lists my articles that demonstrate using Bicep, with code snippets for various different scenarios.

The list of snippets below makes use of the Azure Verified Modules GitHub repo from Microsoft.