
TFS

TFS TF400917 – Upgrading TFS to move to Azure DevOps

A customer at work had an issue upgrading from TFS 2017 to TFS 2019, with a view to moving to Azure DevOps, so I thought I would blog the issue and the fix in case anyone else runs into the same sort of problem.

The error message was TF400917: The current configuration is not valid for this feature. Googling it takes you to this link: https://docs.microsoft.com/en-us/azure/devops/reference/xml/process-configuration-xml-element?view=azure-devops-2020

From here we can run a tool called witadmin (it ships with Visual Studio); you can read more here: https://docs.microsoft.com/en-us/azure/devops/reference/witadmin/witadmin-customize-and-manage-objects-for-tracking-work?view=azure-devops-2020

If you run this tool and export the config like so:-

witadmin exportprocessconfig /collection:CollectionURL /p:ProjectName [/f:FileName]

We can export the process config and check for invalid items – in our case there were duplicate State values within the XML. So I exported the XML, made a change by hand, and then imported the file with the duplicate item removed using the following command:-

witadmin importprocessconfig /collection:CollectionURL /p:ProjectName [/f:FileName] /v

For both of the commands above you have to supply the CollectionURL, ProjectName, and a FileName, and importing the corrected config fixed the issue. The devil here is in the detail: find the invalid entries. In our case it was a duplicate State of Completed. I removed one and saved – nope, not that one – so I added it back in, removed the other, and re-imported the config. Problem solved.
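To give a feel for what you are looking for, here is a hypothetical fragment of an exported process configuration showing the kind of duplicate State entry that triggers TF400917 – the element shape follows the ProcessConfiguration XML schema, but the exact states will differ per project:

<States>
  <State value="Active" type="InProgress" />
  <State value="Completed" type="Complete" />
  <State value="Completed" type="Complete" />   <!-- duplicate entry - remove one -->
</States>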

Note
You can also download an add-on for Visual Studio called the TFS Process Template Editor, which can help with the task of migrating from TFS to Azure DevOps. The link to download it is https://marketplace.visualstudio.com/items?itemName=KarthikBalasubramanianMSFT.TFSProcessTemplateEditor

With the above tool you can visualize the config for your TFS setup, which helps you see what's going on under the hood a little better – a useful tool!

Kudos to https://twitter.com/samsmithnz for telling me about this – Sam rocks!



Azure App Service

Troubleshooting App Services in Azure

In this blog post, I wanted to cover how to go about troubleshooting an App Service in Azure – in this case a web app with a SQL Server backend where users have reported slow website performance.

The first thing I tend to look at is the backend store, in this case Azure SQL, and we have some really great tooling we can use to troubleshoot performance issues with Azure SQL.

The first port of call was to open up the Azure Portal, go to the Resource Group with the issues, click on the SQL database, and head to the Intelligent Performance section on the left-hand menu, as highlighted below:-

Performance Overview
This currently has a recommendations area that suggests adding 5 different indexes, all of which are set as HIGH impact.

Indexes can sometimes have adverse effects, so it's recommended to look at the suggestions, copy the script from the recommendations, and consider whether the index will indeed help with the performance of your queries.

Query Performance Insight
The second area I look at is Query Performance Insight. From here we can see the average CPU, Data IO, and Log IO on the database across the last 24 hours. We also get an insight into which queries are running and taking the longest time to complete.

I changed the graph above to show the last 7 days, and I can see CPU is maxed out at 100% for a long period within the last 7 days, as seen below:-

Long Running Queries
This area identifies queries which are taking a long time to complete, and it is always worth checking regularly.
The following is a screenshot of long-running queries within the database for the past week. To find this information, select the database instance in the portal, select Query Performance Insight, then select Long running queries; I then chose Custom and changed the time period to Past week.

We can see above that the yellow query has the longest duration this past week; you can click on the yellow area and it will show you the details of that long-running query.
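Query Performance Insight is built on top of Query Store, so if you prefer T-SQL you can pull much the same information yourself. A minimal sketch (adjust the time window and columns to taste):

-- Top 5 queries by total duration over the last 7 days, from Query Store.
SELECT TOP 5
       q.query_id,
       SUM(rs.avg_duration * rs.count_executions) AS total_duration_us,
       SUM(rs.count_executions)                   AS executions,
       qt.query_sql_text
FROM sys.query_store_query q
JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan p ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats rs ON p.plan_id = rs.plan_id
JOIN sys.query_store_runtime_stats_interval i
       ON rs.runtime_stats_interval_id = i.runtime_stats_interval_id
WHERE i.start_time > DATEADD(DAY, -7, SYSUTCDATETIME())
GROUP BY q.query_id, qt.query_sql_text
ORDER BY total_duration_us DESC;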

Automatic Tuning

Azure SQL Database built-in intelligence automatically tunes your databases to optimize performance. What can automatic tuning do for you?

  • Automated performance tuning of databases
  • Automated verification of performance gains
  • Automated rollback and self-correction
  • Tuning history
  • Tuning action Transact-SQL (T-SQL) scripts for manual deployments
  • Proactive workload performance monitoring
  • Scale out capability on hundreds of thousands of databases
  • Positive impact to DevOps resources and the total cost of ownership

I would recommend turning this on and leaving it configured like the following:-

This means that Azure will tune the indexes using built-in intelligence and create indexes when it thinks you need them based on usage patterns. A word of caution here: these recommendations aren't always correct, so please bear this in mind.
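If you prefer scripting this rather than clicking through the portal, automatic tuning can also be enabled with T-SQL against the database itself – a minimal sketch:

-- Enable automatic index management and automatic plan correction.
ALTER DATABASE CURRENT
SET AUTOMATIC_TUNING (CREATE_INDEX = ON, DROP_INDEX = ON, FORCE_LAST_GOOD_PLAN = ON);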

Log Analytics
I always recommend adding the Azure SQL Analytics solution to a Log Analytics workspace in the subscription; this gives us further insight into Azure SQL. Once you turn this on you need to wait some time before it gathers a decent amount of data.

The screenshot below shows the type of information we can get from it. This screenshot was taken not long after the solution was turned on, so given more time it will have much more useful detail:-

From here we can get more information about deadlocks, timeouts, etc.


Now let's take a look at the website, which is in an App Service in Azure, and see what tools we can use to help troubleshoot issues with its performance.

I always recommend adding Application Insights to Azure resources when possible, and here, if we click on the App Insights for the web app, we can instantly get some basic info. If you click on the Application Dashboard, as seen below, we get a high-level view of what's going on in our App Service.

The Application dashboard for a typical web app might look something like this: –
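Beyond the dashboard, you can query the telemetry directly from the Logs blade. A minimal Kusto sketch against the standard requests table to surface the slowest pages over the last day:

requests
| where timestamp > ago(24h)
| summarize avgDurationMs = avg(duration), requestCount = count() by name
| order by avgDurationMs desc
| take 10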

Ok, so let's now do some further investigation into our App Service issues. This time I chose the App Service itself and then chose Diagnose and solve problems from the left-hand menu. This feature is underused in my opinion and is very useful indeed; it can be pretty helpful with recommendations and also with pointing out things you may want to think about remediating.

Once in the Diagnose and solve problems area, I usually click on Availability and Performance within the Troubleshooting categories section; if you do, you'll see something like this:-

In the image above we can see that we have some App Performance issues to go and investigate. Clicking into the App Performance section, we get in-depth details about the performance, and we get observations such as Slow Request Execution with details of the web page, average latency, total execution time, etc. The detail here is very helpful in tracking down potential issues in the code or in the configuration of your web application. There are a number of options to check within each of the 6 troubleshooting categories; an example is shown below for the Availability and Performance section:-

Summary
In summary, there are a number of really awesome tools to aid us with troubleshooting App Service performance issues – go check them out the next time your web app is running poorly.



Azure Functions

Azure Durable Functions – Support Caller

I wrote an Azure Durable Function which makes a phone call to out-of-hours support engineers when an alert is raised within their production Azure environment, and I wanted to talk about how I did it and what I used.

When an alert is raised within the customer's Azure environment, I send an HTTP POST to my Azure Durable Function endpoint from the reporting tool we use, which is PRTG (you can do the same from Azure just as easily). We use PRTG to monitor Azure resources for things like high CPU and the amount of free disk space remaining.

Durable Functions were chosen so that I can make use of what's called an orchestrator function – you can read more about Durable Functions here: https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp

If you read the above article you'll get a good grasp of what an orchestrator in Durable Functions can do. To convey why I used them, here are my workflow requirements:-

  1. Receive details of the alert.
  2. Retrieve the support people’s phone numbers.
  3. If an alert is raised call the first number 3 times in 5 minutes, if answered by a human, read out the alert message and some extra content and ask the user to acknowledge the issue by pressing 1 on the keypad.
  4. If the support engineer doesn't answer after 3 attempts, then move on to the next number.
  5. If the support engineer answers and presses 1 stop the orchestration.

1 – Receive details of the alert
This is really easy to do; here I have a template set up in PRTG which forwards the details of the alert to my durable function, like so:-
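On the Functions side, the endpoint that PRTG posts to is a standard Durable Functions HTTP starter. A minimal sketch, assuming the C# in-process model – the function and orchestrator names here are illustrative, not the production code:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class AlertReceiver
{
    [FunctionName("AlertReceiver")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        // The body contains the alert details forwarded by PRTG.
        string alertDetails = await new StreamReader(req.Body).ReadToEndAsync();

        // Kick off the orchestration that will work through the call list.
        string instanceId = await starter.StartNewAsync("MainOrchestrator", input: alertDetails);

        return new OkObjectResult($"Started orchestration {instanceId}");
    }
}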

2 – Retrieve the support people’s phone numbers
I am storing the support people's phone numbers in a CSV file which is uploaded to a simple Azure storage account; this allows the customer to edit the support numbers easily.
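Reading those numbers back out is a small activity function. A minimal sketch using the Azure.Storage.Blobs SDK – the container name, file name, and app setting are assumptions for illustration:

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class GetNumbersFromStorage
{
    [FunctionName("GetNumbersFromStorage")]
    public static async Task<List<string>> Run([ActivityTrigger] IDurableActivityContext context)
    {
        // Container and blob names are illustrative - match them to your storage account.
        var blobClient = new BlobClient(
            Environment.GetEnvironmentVariable("StorageConnectionString"),
            blobContainerName: "config",
            blobName: "support-numbers.csv");

        var numbers = new List<string>();
        using var reader = new StreamReader(await blobClient.OpenReadAsync());
        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            // One phone number per row; take the first column if there are several.
            numbers.Add(line.Split(',')[0].Trim());
        }
        return numbers;
    }
}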

3 – Making the call
Here I make use of the Twilio REST API: I create a CallResource object by calling its Create method. Twilio has a thing called TwiML, from which you can craft a message of your own, and it will read this message out to the person who picks up the phone call. All of the details about who to call, what the call says, and the action they need to take are stored in config, so it can be very easily changed for different customers.

The code to make a call is actually really simple.

// Place the outbound call; Gather posts the pressed digit back to the callback URL.
var call = CallResource.Create(
    twiml: new Twilio.Types.Twiml(
        $"<Response><Gather action='{callbackHandlerUrl}' numDigits='1'>" +
        $"<Say>{messageToReadToUser}</Say></Gather></Response>"),
    to: to,
    from: from);

4 / 5 – Answering the call
This was the tricky part. Figuring out if they had picked up the call was my initial challenge, and I tried numerous things from the Twilio docs which were misleading and didn't seem to work as I expected. The samples for this part of the documentation are sadly lacking.
Now when the call comes in, the support engineer is asked to press 1 to acknowledge they have received the call so that the orchestration can end. Part of this involves having a callback URL, so that Twilio can post details back to a URL of your choice and you can get the details of the call – things like call length, etc., and whether they pressed 1 during the call.
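For completeness, the callback endpoint can be another plain HTTP-triggered function: Twilio posts the Gather result as form data, with the pressed key in the Digits field. A minimal sketch – the function name and the way the orchestration instance ID is passed on the URL are illustrative:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class CallbackHandler
{
    [FunctionName("CallbackHandler")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient client)
    {
        // Twilio sends the digit(s) the engineer pressed in the "Digits" form field.
        string digits = req.Form["Digits"];
        string instanceId = req.Query["instanceId"]; // assumed to be appended to the action URL

        if (digits == "1" && !string.IsNullOrEmpty(instanceId))
        {
            // Tell the waiting orchestration that the call was acknowledged.
            await client.RaiseEventAsync(instanceId, "Acknowledged", true);
        }

        // Empty TwiML response so Twilio ends the call cleanly.
        return new ContentResult { Content = "<Response/>", ContentType = "application/xml" };
    }
}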


Orchestration
The orchestration part was pretty tricky for me to get right. Huge thanks to @marcduiker – he was an enormous help to me on this; figuring out how to do some of the steps proved tricky but very interesting!

Marc is putting together an Azure Functions University series where you can go and learn all about Azure Functions – please go check that out.


The orchestration logic was something like the following:-

MainOrchestrator – this function's job is to be the orchestrator. Within this function we call sub-orchestrators and also activity functions – think of an activity function as a separate function that does one thing. I had a GetNumbersFromStorage activity function and a SendNotification activity function. The idea behind Durable Functions is to be able to call multiple Azure Functions using patterns, one of which is the orchestrator pattern.

RetryOrchestrator – this function's job is to work out what to do when the call wasn't answered the first time: do we need to make another call, how many times have we called this number, and have we ensured that the calls are spread out over 5 minutes so we don't make multiple calls at the same time?
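To make the pattern concrete, here is a simplified sketch of an orchestrator for the workflow requirements above – my own illustration rather than the production code, and it folds the retry handling into the main orchestrator instead of a separate sub-orchestrator:

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class MainOrchestrator
{
    [FunctionName("MainOrchestrator")]
    public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string alertDetails = context.GetInput<string>();

        // Activity: read the on-call numbers from the CSV in blob storage.
        var numbers = await context.CallActivityAsync<List<string>>("GetNumbersFromStorage", null);

        foreach (string number in numbers)
        {
            for (int attempt = 1; attempt <= 3; attempt++)
            {
                // Activity: place the Twilio call for this alert.
                await context.CallActivityAsync("SendNotification", new { number, alertDetails });

                // Wait for the callback handler to raise the "Acknowledged" event,
                // or time out and retry. Durable timers keep this replay-safe.
                using var cts = new CancellationTokenSource();
                Task<bool> acknowledged = context.WaitForExternalEvent<bool>("Acknowledged");
                Task timeout = context.CreateTimer(
                    context.CurrentUtcDateTime.AddSeconds(100), cts.Token); // ~3 calls in 5 minutes

                if (await Task.WhenAny(acknowledged, timeout) == acknowledged)
                {
                    cts.Cancel();
                    return; // The engineer pressed 1 - stop the orchestration.
                }
            }
        }
    }
}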


Twilio
To make this all work I created a Twilio account and purchased a number, which you can then use to make the calls. It costs 2 pence per call, or 7 pence per call if you want to detect whether someone answered using answering machine detection, so there are options available.

Summary
Durable Functions have a lot of great use cases – definitely check them out and build something yourself to get a handle on how they work. The Azure Durable Functions docs are really good.




AKS Zero To Hero – Series for everyone

Richard Hooper and I have started a new series called AKS Zero to Hero. The aim is for Richard to teach me AKS, going from zero knowledge to hopefully becoming a hero when it comes to AKS.

We see a lot of customers either already using AKS or wanting help getting started with it, so it's about time I got up to speed. Whether you are new to AKS or a seasoned professional, we will be covering as much AKS content as we possibly can, and the aim is to have content out each week.

We will be taking an ASP.NET Core project, open-sourced on GitHub at https://github.com/CloudFamily/AKS_Zero_to_Hero, deploying it to AKS, and covering as many areas of AKS as we possibly can. The series will run for a while, so please hit subscribe and click the bell notification to be alerted when a new video drops.

The YouTube playlist for all of our videos thus far can be found below.

Please give us feedback, ask questions, etc., and we can try to answer them in an ask-me-anything session, which we will be planning within the next month.

Don’t forget to subscribe to my own YouTube Channel.



Using the latest version of the Azure CLI

In this blog post, I wanted to quickly cover how you can keep the Azure CLI up to date on your local system and within Azure. I use the Azure CLI as my go-to choice for writing deployment scripts in Azure. The reason you want to keep it up to date is for new additions as well as bug fixes for previous versions.

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation.

It's super simple to keep this up to date: open a PowerShell or Bash window and type:-

az upgrade

But maybe you want to keep it up to date without having to keep checking; you can do this with the following command:-

az config set auto-upgrade.enable=yes

Better yet, you can keep the Azure CLI up to date without ever being prompted by using the following command:-

az config set auto-upgrade.prompt=no

And that's it – now you no longer need to worry about whether you are using the latest version of the Azure CLI.
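If you ever want to confirm which version you are on, the CLI can report it directly – this prints the version of the CLI core plus any installed extensions:

az version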

You can read more on this at the following URL: – https://docs.microsoft.com/en-us/cli/azure/update-azure-cli?WT.mc_id=AZ-MVP-5003451

Don’t forget to subscribe to my YouTube Channel.



Skylines Summer Sessions

Over the summer this past year, Richard Hooper and I were interviewing people around the world on Azure-related content, and it has been an absolute blast – so much fun chatting with some very talented individuals who have a passion for their profession.

We talked about all things Azure and threw in some fun questions along the way. If you haven't checked out the content, it's around 30 minutes per video, with slides and demos galore.

Check out the speakers and content -> SkyLines Summer Sessions

A huge thank you to the amazing people who work at Skylines Academy – Amy, Brette, and Nick – thanks for setting this up and organizing it.


Thomas Maurer talks to us about Azure Arc with a very cool demo.

Thomas Maurer

Richard Hooper talks to us about AKS with a very cool little demo.

Richard Hooper

Dwayne Natwick talks to us about Virtual Machine Scale Sets and Virtual Machine Availability Sets.

Dwayne Natwick

Peter De Tender talks to us about Terraform on Azure.

Peter De Tender

Maarten Goet talks to us about Azure Sentinel.

Maarten Goet

Wesley Haakman talks to us about Cloud Solution Providers (CSPs) and Managed Service Providers (MSPs).

Wesley Haakman

Shannon Kuehn talks to us about Azure VMware Solution (AVS).

Shannon Kuehn

Joe Carlyle talks to us about Azure FireWall.

Joe Carlyle

April Edwards talks to us about A/B Testing in Azure.

April Edwards

Adam Bertram talks to us about PowerShell.

Adam Bertram

Sarah Lean talks to us about Datacenter Migration & Azure Migrate.

Sarah Lean

Sam Smith talks to us about common mistakes with DevOps.

Sam Smith

Gwyneth Peña talks to us about her journey to becoming an Azure MVP and a Cloud Engineer.

Gwyneth Peña S.

Wim Matthyssen talks to us about Azure spend and how to take control.

Wim Matthyssen

Pete Gallagher talks to us about Azure IoT.

Pete Gallagher

Michael Levan talks to us about using Octopus Deploy with Azure.

Michael Levan



Review of the year

Wow, what a year it's been.

  • Started a new job as an Azure Architect @ Intercept.
  • Gave workshops at work on GitHub Actions, Azure PaaS, and Azure Governance.
  • Renewed as an MVP
  • Helped 9 people become MVPs.
  • Gave or took part in 53 user group talks.
  • Helped organise this year's Festive Tech Calendar, Global Azure Bootcamp UK / Ireland, and the Skylines Summer Sessions.
  • Spoke at Scottish Summit.
  • Passed the following exams: –
    • AI-900 Azure AI Fundamentals.
    • DP-900 Azure Data Fundamentals
    • DP-200 Implementing an Azure Data Solution.
    • DP-201 Designing an Azure Data Solution.
    • AZ-104 Azure Administrator Associate.
  • Became a Microsoft Certified Trainer.
  • Started my own YouTube channel.
  • Started the CloudFamily Podcast with Richard Hooper – https://anchor.fm/cloudfamily
  • Blogging – made a conscious effort to slow down blogging to spend time on other things, still managed over 10,000 views each month.

What's next, I hear you ask?

  • Speaking at Scottish Summit 2021
  • Azure AI-100 Designing and Implementing an Azure AI Solution
  • DP-300 Administering Relational Databases on Microsoft Azure
  • PL-900: Microsoft Power Platform Fundamentals

I don’t have much more planned than that.

Thank you
I have way too many people to thank in 2020. Honestly, I am very fortunate to know some amazing people from the communities – everyone I work with day to day, everyone I speak with on Twitter, everyone involved in running user groups, event organizers, and more.
I am grateful to each and every person I speak to, and I do my best to get back to everyone and help where I can. I have noticed that none of you sleep – most of you are up at silly hours of the night.

Highlights this year
This one is easy: helping people who are passionate about helping others. That is something I will always take time out of my day to do.

Summary
Next year: more of the same – helping people get started and sharing people's content, as it's tough to create content and people spend a lot of time on it.
I'd like to see events that focus on newcomers to our communities and highlight their work.

It's been a horrid year for everyone. Keep safe, keep your chin up, and keep a smile on your face as much as you can.

Gregor



Azure Logic App Api call save a file to Blob Storage

I wanted to see how easy it would be to create a Logic App that calls an API, returns data from it, and then translates the contents into a CSV file – any excuse to learn something new and play with Logic Apps, which is not something I have done a lot of, to be honest.

So my goal was to pick an API, call it using a GET request, grab the JSON from the API, convert it to CSV, and then create a file in blob storage. In this blog post I will show you how I went about it.

I want my Logic App to do this each month, grabbing the data from the API and creating a new blob, so let's take a look at the end result and go through it step by step.

So we have 5 steps to accomplish this task.

Recurrence – this just runs the Logic App on a schedule; I am running it once each month.
HTTP – here is where I give it the API URL, which in my case is https://geocatalogus.nl/api/3/action/datastore_search?resource_id=ecbe6732-5a6b-4858-84db-b03c410ff7aa, and I set the Method to GET.
Parse JSON – here I grab the JSON response from the HTTP step above and parse it by supplying an example of the body from the JSON returned in the HTTP step. This looks like so:-

The Body (the green part in the screenshot) is taken from the Dynamic content list, where I just typed Body and then clicked on it.
Create CSV Table – now I want to interrogate the parsed JSON from above and find the part of the JSON I am interested in; for this API that is the part called records, which holds the data.


Again I clicked in the From field above and chose Dynamic content, which lists the parts of the JSON returned, and from there I chose records. I left Columns set to Automatic, and that's all I needed to do here.
Create Blob – now I want to create a new blob in Azure Storage, so I chose that for my last step and gave it the connection details for my Azure Storage blob container, like so:-

When I run the Logic App, it calls the API within the HTTP step and parses the returned JSON; I then use the Create CSV Table step to format the data and save the output from that step using a Create Blob step.
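If you peek at the code view behind the designer, the Create CSV Table step is the built-in Table action. A rough sketch of what it looks like – the exact path into the JSON depends on your Parse JSON schema, and this API nests its records under result:

"Create_CSV_table": {
    "type": "Table",
    "inputs": {
        "format": "CSV",
        "from": "@body('Parse_JSON')?['result']?['records']"
    },
    "runAfter": {
        "Parse_JSON": [ "Succeeded" ]
    }
}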

And that is all there is to it. I did this just to learn something new and to remind myself how cool Logic Apps are and how easy they are to use.

Don’t forget to subscribe to my YouTube Channel.



Windows Docker containers

At work, I was trying to take a C++ ISAPI DLL project and see if I could get it running in a Windows container. In this blog post, I will cover my findings from working with Windows Containers – for anyone who doesn't know, there is no GUI, so my blog post will cover how to do some steps using PowerShell. The end goal here is to containerize a legacy Windows IIS web app and move it to AKS without re-writing it.


I had never used Docker on a real project until now, so I had barely used it (please bear this in mind) – there may well be a better way to do some of the following. I have written this to give you a starter for ten if you need to work with a Windows container and do some legacy work.

The following are some tips on how to do things with Windows containers and what I have been learning over the last 2 weeks.

You have the option to use all manner of Windows container base images; for the work I was doing I used the following:-

FROM mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2016

SHELL ["powershell"]
COPY SetupGregor.ps1 .
RUN powershell -File .\SetupGregor.ps1
COPY Setup C:/Setup

RUN reg import .\odbcinistuff.reg
RUN Start-Process msiexec.exe -ArgumentList '/i', 'msodbcsql.msi', '/qn', 'IACCEPTMSSQLCMDLNUTILSLICENSETERMS=YES' -Wait

RUN Set-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Services\InetInfo\Parameters'  -Name PoolThreadLimit -Value 512 -Type DWord

# Install windows features
RUN Install-WindowsFeature NET-Framework-45-ASPNET ; \
     Install-WindowsFeature Web-Asp-Net45 ; \
     Install-WindowsFeature Web-Static-Content ; \
     Install-WindowsFeature Web-Http-Errors ; \
     Install-WindowsFeature Web-Default-Doc ; \
     Install-WindowsFeature Web-ISAPI-Filter ; \
     Install-WindowsFeature Web-Stat-Compression ; \
     Install-WindowsFeature Web-ISAPI-Ext

# IIS stuff
RUN Install-WindowsFeature Web-Mgmt-Service; \
New-ItemProperty -Path HKLM:\software\microsoft\WebManagement\Server -Name EnableRemoteManagement -Value 1 -Force; \
Set-Service -Name wmsvc -StartupType automatic; 

# Add user for Remote IIS Manager Login
RUN net user iisadmin <putyourpasswordhere> /ADD; \
net localgroup administrators iisadmin /add;

COPY Setup/LogMonitor.exe c:/LogMonitor
COPY Setup/LogMonitorConfig.json c:/LogMonitor

CMD Write-Host IIS Started... ; \
    while ($true) { Start-Sleep -Seconds 3600 }

The above is just a sample of what's possible; let's cover the lines one by one below.

  • FROM mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2016 – here I am saying use the windowsservercore-ltsc2016 Windows image.
  • SHELL ["powershell"] – here, since I am using a Windows container, I want PowerShell as the shell.
  • COPY SetupGregor.ps1 . – here I copy a single file into my container.
  • RUN powershell -File .\SetupGregor.ps1 – here I am running a PowerShell file.
  • COPY Setup C:/Setup – here I am copying a full folder into the container.
  • RUN reg import .\odbcinistuff.reg – here I import a registry file I exported from a test server, which sets up the ODBC System DSNs that I needed.
  • RUN Start-Process msiexec.exe -ArgumentList '/i', 'msodbcsql.msi', '/qn', 'IACCEPTMSSQLCMDLNUTILSLICENSETERMS=YES' -Wait – here I am running an MSI silently in my container to install the ODBC SQL drivers; -Wait stops the build moving on before the install has finished.
  • RUN Set-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Services\InetInfo\Parameters' -Name PoolThreadLimit -Value 512 -Type DWord – here is a sample of how to set a registry value inside my Windows container.
  • RUN Install-WindowsFeature NET-Framework-45-ASPNET; – yep, you guessed it, I'm installing Windows features in my Windows container.
  • RUN Install-WindowsFeature Web-Mgmt-Service; \
    New-ItemProperty -Path HKLM:\software\microsoft\WebManagement\Server -Name EnableRemoteManagement -Value 1 -Force; \ – here I am setting up the ability to connect remotely into IIS running on my container. This helps enormously, as you can see the IIS settings etc. from outside your Windows container.
  • Set-Service -Name wmsvc -StartupType automatic; – here I make sure the management service starts automatically.
  • RUN net user iisadmin <putyourpasswordhere> /ADD; \ net localgroup administrators iisadmin /add; – here I create a user I can use to connect into IIS on the container; I also run the app pool using this account.
  • COPY Setup/LogMonitor.exe c:/LogMonitor
    COPY Setup/LogMonitorConfig.json c:/LogMonitor – here I am copying LogMonitor (https://github.com/microsoft/windows-container-tools/tree/master/LogMonitor) This is an opensource .exe which you can use to monitor logs like IIS and the event viewer etc, its a C++ project which I have built, in case
    you don’t have the tooling handy – you can find that here – https://github.com/gsuttie/LogMonitor
  • CMD Write-Host IIS Started... ; \
     while ($true) { Start-Sleep -Seconds 3600 } – here I keep the Windows container running for as long as IIS is running; if you stop IIS the container will shut down (restart the app pool instead if you need to make changes – it saves you having to restart the container).

Build your container image using a Dockerfile like the one above (note that image tags must be lowercase):-

docker image build --tag win2016gregorsdemo .

Run a container from your image:

docker run --name remoteiisGregor -d -p 8000:80 win2016gregorsdemo

It will start up almost instantly, and then you can get the IP address like so:

docker inspect --format '{{ .NetworkSettings.Networks.nat.IPAddress }}' remoteiisGregor

Now you can connect to IIS on the container from your local desktop using IIS Manager.

Add in the IP address, and when asked for a username and password, use the user created in the Dockerfile.

Username: iisadmin
Password: ********* (whatever you added in the Dockerfile)

And voilà – you should now be able to connect to IIS running inside a Windows container.

Now, to check settings within the container, you can connect to it by doing the following:-

docker ps -a

This will give you the container ID, like the following:-

Now you can grab the first few characters of the container ID and type this:

docker exec -it 32d powershell

And now you are connected to the container in a PowerShell shell and can check folders, run commands, etc.
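For example, once inside you can inspect IIS using the WebAdministration module, which comes with the features installed above:

Import-Module WebAdministration
Get-Website                   # list sites and their bindings
Get-ChildItem IIS:\AppPools   # list application pools and their state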

Summary
You may ask why am I doing this. Well, when a customer asks if they can go to AKS with an existing solution that needs to run on Windows containers, I thought, yeah, let's get it working.

This is a brief blog post which doesn't go into huge detail. If you have questions, please just ask – I don't have much time under my belt with Docker, but I learned a lot and figured out a number of things.