Category: Azure

AI-102 Azure Study Guide

In this blog post I cover all of the resources I came across whilst studying for the AI-102 exam.

AI-102: AI Engineer on GitHub:- https://github.com/MicrosoftLearning/AI-102-AIEngineer

And this too:-

https://microsoftlearning.github.io/AI-102-AIEngineer/

Plan and Manage an Azure Cognitive Services Solution (15-20%)
Select the appropriate Cognitive Services resource



Azure DP-300

Azure DP-300 Exam Study guide

Plan and Implement Data Platform Resources (15-20%)
Deploy resources by using manual methods



Azure App Service

Troubleshooting App Services in Azure

In this blog post, I wanted to cover how to go about troubleshooting an App Service in Azure: a web app with a SQL Server backend, where users have reported slow website performance.

The first thing I tend to look at is the backend store, in this case Azure SQL, and we have some really great tooling we can use to troubleshoot performance issues with Azure SQL.

The first port of call is to open up the Azure Portal, go to the Resource Group with the issues, click on the SQL database, and head to the Intelligent Performance section on the left-hand menu, as highlighted below:-

Performance Overview
This currently has a recommendations area suggesting five different indexes, all rated as HIGH impact.

Indexes can sometimes have adverse effects, so it's recommended to look at each suggestion, copy the script from the recommendation, and consider whether the index will indeed help with the performance of your queries.

Query Performance Insight
The second area I look at is Query Performance Insight. From here we can see the average CPU, Data IO and Log IO on the database over the last 24 hours. We also get an insight into which queries are running and taking the longest time to complete.

I changed the graph above to show the last 7 days, and I can see CPU is maxed out at 100% for a long period within that window, as seen below:-

Long Running Queries
This area identifies queries which are taking a long time to complete, and it's always worth checking this regularly.
The following is a screenshot of long-running queries within the database for the past week. To find this information, select the database instance in the portal, select Query Performance Insight, then Long running queries; I then chose Custom and changed the time period to Past week.

We can see above that the yellow query is the one with the longest duration this past week; you can click on the yellow area and it will show you the details of that long-running query.

Automatic Tuning

Azure SQL Database built-in intelligence automatically tunes your databases to optimize performance. What can automatic tuning do for you?

  • Automated performance tuning of databases
  • Automated verification of performance gains
  • Automated rollback and self-correction
  • Tuning history
  • Tuning action Transact-SQL (T-SQL) scripts for manual deployments
  • Proactive workload performance monitoring
  • Scale out capability on hundreds of thousands of databases
  • Positive impact to DevOps resources and the total cost of ownership

I would recommend turning this on and leaving it configured like the following:-

This means that Azure will tune the indexes using built-in intelligence and create indexes when it thinks you need them based on usage patterns. A word of caution here: these recommendations aren't always correct, so please bear this in mind.

Log Analytics
I always recommend adding the Azure SQL Analytics workspace solution to the subscription, as this gives us further insight into SQL in Azure. Once you turn this on, you need to wait some time before it can gather a decent amount of data.

The screenshot below shows the type of information we can get from it. This screenshot was taken not long after the solution was turned on, so given more time it will show much more useful detail:-

From here we can get more information about deadlocks, timeouts, etc.


Now let's take a look at the website, which is in an App Service in Azure, and see what tools we can use to help us troubleshoot performance issues.

I always recommend adding Application Insights to Azure resources when possible, and here if we click on the App Insights for the web app we can instantly get some basic info. If you click on the Application Dashboard, as seen below, we get a high-level view of what's going on in our App Service.
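
As an aside, if the web app is ASP.NET Core, wiring Application Insights into the application itself is a one-liner. The following is just a minimal sketch, assuming the Microsoft.ApplicationInsights.AspNetCore NuGet package is installed and an instrumentation key or connection string is present in configuration:

using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Collects request, dependency and exception telemetry automatically.
        // Assumes APPINSIGHTS_INSTRUMENTATIONKEY (or a connection string) is configured.
        services.AddApplicationInsightsTelemetry();

        services.AddControllersWithViews();
    }
}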

The Application dashboard for a typical web app might look something like this:-

Ok, so let's now do some further investigation into our App Service issues. This time I chose the App Service itself and then chose Diagnose and solve problems from the left-hand menu. This feature is underused in my opinion; it can be very helpful with recommendations and in pointing out things that you may want to think about remediating.

Once in the Diagnose and solve problems area, I usually click on Availability and Performance within the Troubleshooting categories section, and if you do, you'll see something like this:-

In the image above we can see that we have some App Performance issues to go and investigate. Clicking into the App Performance section, we get in-depth details about the performance, along with Observations that say things like Slow Request Execution, with details of the web page, average latency, total execution time, etc. The detail here is very helpful in tracking down potential issues in the code or the configuration of your web application. There are a number of options to check within each of the 6 troubleshooting categories; an example is shown below for the Availability and Performance section:-

Summary
In summary, there are a number of really awesome tools to aid us with troubleshooting App Service performance issues, so go check them out the next time your web app is running poorly.



Azure Functions

Azure Durable Functions – Support Caller

I wrote an Azure Durable Function which makes a phone call to out-of-hours support engineers when an alert is raised within their production Azure environment, and I wanted to talk about how I did it and what I used.

When an alert is raised within the customer's Azure environment, I send an HTTP POST to my Azure Durable Function endpoint from the reporting tool we use, which is PRTG (you can do the same from Azure just as easily). We use PRTG to monitor Azure resources for things like high CPU and the amount of free disk space remaining.

Durable Functions were chosen so that I can make use of what's called an orchestrator function – you can read more about Durable Functions here:- https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp

If you read the above article you'll get a good grasp of what an orchestrator in Durable Functions can do. To convey why I used them, I had the following workflow requirements:-

  1. Receive details of the alert.
  2. Retrieve the support people’s phone numbers.
  3. If an alert is raised, call the first number up to 3 times within 5 minutes; if answered by a human, read out the alert message and some extra content, and ask the engineer to acknowledge the issue by pressing 1 on the keypad.
  4. If the support engineer doesn't answer after 3 attempts, move on to the next number.
  5. If the support engineer answers and presses 1, stop the orchestration.

1 – Receive details of the alert
This is really easy to do. Here I have a template set up in PRTG which forwards the details of the alert to my Durable Function endpoint like so:-
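
Behind the scenes, the endpoint PRTG posts to is just an HTTP-triggered function that kicks off the orchestration. Here is a minimal sketch of what that starter function can look like (the function names match my setup, but the body is simplified for illustration):

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HttpStart
{
    // PRTG posts the alert details to this endpoint; we hand them to the orchestrator.
    [FunctionName("HttpStart")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        string alertBody = await req.Content.ReadAsStringAsync();

        // Start the orchestration and pass the raw alert through as input.
        string instanceId = await starter.StartNewAsync("MainOrchestrator", null, alertBody);

        // Returns 202 Accepted with status-query URLs for the orchestration.
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}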

2 – Retrieve the support people’s phone numbers
I am storing the support people's phone numbers in a CSV file which is uploaded to a simple Azure Storage account; this allows the customer to edit the support numbers easily.
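
For illustration, an activity function to read those numbers might look something like the following. This is a sketch assuming the Azure.Storage.Blobs SDK; the container name, file name, and app setting here are placeholders rather than my exact values:

using System;
using System.Collections.Generic;
using System.Linq;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class GetNumbersFromStorage
{
    // Downloads the CSV of support phone numbers from blob storage.
    [FunctionName("GetNumbersFromStorage")]
    public static List<string> Run([ActivityTrigger] IDurableActivityContext context)
    {
        // Container and blob names are illustrative placeholders.
        var blobClient = new BlobClient(
            Environment.GetEnvironmentVariable("StorageConnectionString"),
            "support-config",
            "support-numbers.csv");

        string csv = blobClient.DownloadContent().Value.Content.ToString();

        // One phone number per line; ignore blank lines.
        return csv.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries).ToList();
    }
}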

3 – Making the call
Here I make use of the Twilio REST API: I create a CallResource object and then call the Create method. Twilio has a markup called TwiML which lets you compose a message of your own, and Twilio will read this message out to the person who picks up the phone call. All of the details about who you call, what the call says, and the action they need to take are stored in config, so it can be very easily changed for different customers.

The code to make a call is actually really simple.

// Assumes TwilioClient.Init(accountSid, authToken) has been called earlier;
// numDigits='1' makes Twilio call back as soon as one key is pressed.
var call = CallResource.Create(
    twiml: new Twilio.Types.Twiml(
        $"<Response><Gather action='{callbackhandlerURL}' numDigits='1'>" +
        $"<Say>{messageToReadToUser}</Say></Gather></Response>"),
    to: to,
    from: from);

4 / 5 – Answering the call
This was the tricky part: figuring out if they had picked up the call was my initial challenge, and I tried numerous things from the Twilio docs which were misleading and didn't seem to work as I expected. The samples for this part of the documentation are sadly lacking.
Now when the call comes in, the support engineer is asked to press 1 to acknowledge they have received the call so the orchestration can end. Part of this involves having a callback URL, so that Twilio can post details back to a URL of your choice and you can get the details of the call, things like call length and whether they pressed 1 during the call.
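
To give you a feel for that callback side, here is a rough sketch of an HTTP-triggered function Twilio could post back to. The function name, event name, and the way the orchestration instance id is passed are illustrative rather than my exact code; Twilio does send the pressed keys in a form field called Digits:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class CallbackHandler
{
    // Twilio posts the <Gather> result here as form data; "Digits" holds the key presses.
    [FunctionName("CallbackHandler")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient client)
    {
        var form = await req.ReadFormAsync();
        string digits = form["Digits"];

        // The orchestration instance id is passed on the callback URL (illustrative).
        string instanceId = req.Query["instanceId"];

        if (digits == "1")
        {
            // Signal the waiting orchestration that the engineer has acknowledged.
            await client.RaiseEventAsync(instanceId, "Acknowledged", true);
        }

        // Reply with TwiML so the engineer hears a confirmation before hanging up.
        return new ContentResult
        {
            Content = "<Response><Say>Thank you, the alert has been acknowledged.</Say></Response>",
            ContentType = "application/xml"
        };
    }
}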


Orchestration
The orchestration part was pretty tricky for me to get right. Huge thanks to @marcduiker, who was an enormous help to me on this; figuring out how to do some of the steps proved tricky but very interesting!

Marc is putting together an Azure Functions University series where you can go and learn all about Azure Functions – please go check that out.


The orchestration logic was something like the following:-

MainOrchestrator – this function's job is to be the orchestrator. Within this function we call sub-orchestrators and also activity functions; think of an activity function as a separate function that does one thing. I had a GetNumbersFromStorage activity function and a SendNotification activity function. The idea behind Durable Functions is to be able to call multiple Azure Functions using patterns, one of which is the orchestrator pattern.
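
A minimal sketch of that orchestrator shape looks like the following (simplified from my actual code, with the sub-orchestrator input trimmed down for clarity):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class MainOrchestrator
{
    [FunctionName("MainOrchestrator")]
    public static async Task Run(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // Activity function: read the on-call phone numbers from the CSV in blob storage.
        var numbers = await context.CallActivityAsync<List<string>>("GetNumbersFromStorage", null);

        foreach (var number in numbers)
        {
            // Sub-orchestrator: call this number up to 3 times, spread over 5 minutes.
            // (In the real code the alert details travel along with the number.)
            bool acknowledged = await context.CallSubOrchestratorAsync<bool>(
                "RetryOrchestrator", number);

            if (acknowledged)
            {
                return; // Someone pressed 1, so the orchestration can stop here.
            }
        }
    }
}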

RetryOrchestrator – this function's job is to work out what to do when the call wasn't answered the first time: do we need to make another call, how many times have we called this number, and have we ensured that the calls are spread out over 5 minutes so we don't make multiple calls at the same time?
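
And here is a sketch of that retry logic, using a durable timer raced against an external event, which is the standard Durable Functions pattern for human interaction. The 100-second spacing and the Acknowledged event name are illustrative; the event is raised by the Twilio callback function shown earlier:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class RetryOrchestrator
{
    [FunctionName("RetryOrchestrator")]
    public static async Task<bool> Run(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string number = context.GetInput<string>();

        for (int attempt = 0; attempt < 3; attempt++)
        {
            // Activity function wrapping the Twilio CallResource.Create call shown earlier.
            await context.CallActivityAsync("SendNotification", number);

            // Race a durable timer against the "Acknowledged" external event so the
            // 3 attempts are spread across roughly 5 minutes and we stop the moment
            // the engineer presses 1.
            using (var cts = new CancellationTokenSource())
            {
                Task timeout = context.CreateTimer(
                    context.CurrentUtcDateTime.AddSeconds(100), cts.Token);
                Task acknowledged = context.WaitForExternalEvent("Acknowledged");

                if (await Task.WhenAny(acknowledged, timeout) == acknowledged)
                {
                    cts.Cancel(); // Tidy up the outstanding timer.
                    return true;
                }
            }
        }

        return false; // No acknowledgement after 3 attempts; move to the next number.
    }
}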


Twilio
To make this all work I created a Twilio account and purchased a number, which is used to make the calls. It costs 2 pence per call, or 7 pence per call if you also want to detect whether someone answered using answering machine detection, so there are options available.

Summary
Durable Functions have a lot of great use cases; definitely check them out and build something yourself to get a handle on how they work. The Azure Durable Functions docs are really good.




AKS Zero To Hero – Series for everyone

Richard Hooper and I have started a new series called AKS Zero to Hero. The aim is for Richard to teach me AKS, taking me from zero knowledge to hopefully becoming a hero when it comes to AKS.

We see a lot of customers either already using AKS or wanting help getting started with it, so it's about time I got up to speed. Whether you are new to AKS or a seasoned professional, we will be covering as much AKS content as we possibly can, and the aim is to have content out each week.

We will be taking an ASP.NET Core project, which will be open-sourced on GitHub at https://github.com/CloudFamily/AKS_Zero_to_Hero, deploying it to AKS, and covering as many areas of AKS as we possibly can. The series will run for a while, so please hit subscribe and click on the bell notification to be alerted when a new video drops.

The YouTube playlist for all of our videos thus far can be found below.

Please give us feedback, ask questions, etc., and we can try to answer them in an ask-me-anything session which we will be planning within the next month.

Don’t forget to subscribe to my own YouTube Channel.



Using the latest version of the Azure CLI

In this blog post, I wanted to quickly cover how you can keep the Azure CLI up to date on your local system and within Azure. I use the Azure CLI as my go-to choice for writing deployment scripts in Azure. The reason you want to keep it up to date is for new additions as well as bug fixes to previous versions.

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation.

It's super simple to keep this up to date, and you can do so by opening a PowerShell or Bash window and typing:-

az upgrade

But maybe you want to keep it up to date without having to keep checking; you can do this by using the following command:-

az config set auto-upgrade.enable=yes

Better yet, you can keep the Azure CLI up to date without ever being prompted by using the following command:-

az config set auto-upgrade.prompt=no

And that's it, you no longer need to worry about whether you're using the latest version of the Azure CLI.

You can read more on this at the following URL:- https://docs.microsoft.com/en-us/cli/azure/update-azure-cli?WT.mc_id=AZ-MVP-5003451

Don’t forget to subscribe to my YouTube Channel.



Azure Logic App Api call save a file to Blob Storage

I wanted to see how easy it would be to create a Logic App that calls an API, returns data from it, and then translates the contents into a CSV file. Any excuse to learn something new and play with Logic Apps, which is not something I have done a lot of, to be honest.

So my goal was to pick an API, call it using a GET request, grab the JSON from the API, convert it to a CSV file, and then create a file on blob storage. In this blog post I will show you how I went about it.

I want my Logic App to do this each month, grabbing the data from the API and creating a new blob, so let's take a look at the end result and go through it step by step.

So we have 5 steps to accomplish this task.

Recurrence – this just runs the Logic App on a schedule; I am running it once each month.
HTTP – here is where I give it the API URL, which in my case is https://geocatalogus.nl/api/3/action/datastore_search?resource_id=ecbe6732-5a6b-4858-84db-b03c410ff7aa
and I set the method to GET.
Parse JSON – here I grab the JSON response from the HTTP step above and then parse it by supplying an example of the body from the JSON returned in that step. This looks like so:-

The Body (green part in the screenshot) is taken from the Dynamic content, where I just typed Body and then clicked on it.
Create CSV Table – now I want to interrogate the parsed JSON from above and find the part I am interested in; for this API that is the part called records, which holds the data.


Again I clicked on the From field above and chose Dynamic content, which lists the parts of the JSON returned, and from there I chose records. I left the Columns set to Automatic, and that's all I needed to do here.
Create Blob – now I want to create a new blob in Azure Storage, so I chose that for my last step and gave it the connection details to my Azure Storage blob container like so:-

When I run the Logic App, it calls the API within the HTTP step and parses the returned JSON; I then use the Create CSV Table step to format the data and save the output from that step by using a Create Blob step.

And that is all there is to it, I did this just to learn something new and remind myself how cool Logic Apps are and how easy they are to use.

Don’t forget to subscribe to my YouTube Channel.



Immutable storage for Azure Storage Blobs

If you have storage blobs containing things like backups or files, Azure now has immutable storage for Azure Storage Blobs generally available in all public regions.

Immutable means unable to change or be changed, so if a customer has, let's say, a backup, they can store it knowing it cannot be altered, which for some companies is very nice to have.

To take advantage of, or to test out, immutable storage, let's go through what we need to do.

  • First of all, create a storage account.
  • Click on Containers and create a new container, give it a name and choose Private (no anonymous access).
  • Once created click on the name of your new container and then upload some files.
  • Once you have uploaded some files, click on Access Policy on the left-hand side. Notice we have 2 sections, Stored access policies and Immutable blob storage; under Immutable blob storage, select Add policy.
  • We now have 2 options to choose from
    • Time-based retention
    • Legal hold

Time-based retention allows us to set a retention interval of between 1 day and 400 years; this also makes the files immutable.

Note:- You cannot change this value to 0 at any time. Upon the expiration of the retention interval, the data will continue to be in a non-modifiable state but can be deleted. Retention policy changes may require some time to take effect, and only 5 edits are permitted to the policy.

Legal hold retention means you add a tag to the blob container; each legal hold policy needs to be associated with 1 or more tags. Tags are used as a name identifier, such as a case ID, to categorize and view records.

You cannot delete or modify any files within the container whilst there is either a Time-based retention policy or a Legal hold policy in place; however, if you delete the legal hold policy, you can then delete or modify files within the container.
With Time-based retention, you can additionally allow protected appends and you can change the retention interval.
Time-based retention policies need to be locked in order to be compliant; to add a lock, click on the 3 dots and choose Lock policy.

Note:- Once you apply the lock you cannot remove it, and just before you click Save on applying the lock you will see the following reminder:-

Summary
I can see some people needing to keep backups immutable for a number of legal reasons, and this new feature will be very handy for them.

Don’t forget to subscribe to my YouTube Channel.



Azure Certification Prep

Hi folks, this blog post comes to you as part of the Azure Back to School community event run by Dwayne Natwick.

Many thanks to Dwayne for allowing me to take part.

You can read more about the event on the official website

I am talking to you today about Azure Certification Prep, let’s not waste any more time and dive right in.

Whether you're starting your Azure certification journey or a long way down the road, the first thing you need to do is some homework on the exams:-

  • Which exam is the right one for you at this time?
  • Have you read the official exam page from Microsoft?
  • Have you read up on any changes about the exam that may have taken place?

I always suggest people start with the AZ-900 Azure Fundamentals exam, and this goes for everyone, including experienced Azure users. If you are new to the exams or haven't done any in a while, this exam will set you up with the following:-

  • Learn to study (learn to take notes and try to remember the content you’ve read)
  • Get you into the habit of reading, trying to recall information, sitting practice tests.
  • There is also the no small feat of booking the actual exam and sitting it; lots of people put this off and dread exams.

Skills Measured
This area is key to your success in the actual exam; the content in this section is in essence what Microsoft will be testing you on, so ensure you are comfortable with all of it before taking the exam. Exams are usually broken down into around 4 areas, and you'll see a percentage next to each section. An example of this would be the AZ-900 exam, which has the following:-

Describe Cloud Concepts (15-20%)

This means that 15-20% of your exam will be on this subject area. Some exams have areas weighted as much as 35-45%, and that is the area you want to be very sharp on, as lots of questions will come from it.

Advice
Whilst studying you tend to get a feel for how it is going; I would advise you to try some practice tests and gauge from them whether you're ready or not. Book your exam, as this will help concentrate your mind and ensure you study; there is nothing like an exam deadline coming up to make you want to read the content, learn, and pass your exam.

So at this point we have checked out the official exam guide and we know what topics we are going to be tested on. Let's assume we are sitting the exam at home; we need to know what to expect when doing so.

Sitting an exam in your house

There are some rules which you need to adhere to when taking an exam from your office/home, and some of these are as follows:-

  • No paper is allowed for taking notes, so if you need to take notes there is a way to do this on-screen within the test; get familiar with it as you may need it.
  • No one is allowed to enter the room at any time during your exam, and you can't leave either.
  • You're not allowed to read the exam questions out loud, cover your face, or leave the webcam perimeter box for any reason; if you do, you're very likely to fail.
  • You have to take photos of your identification and of the desk you're sitting at, behind where you are sitting, and to the left and right of where you are sitting.
  • The rules are there for people to read, so be sure you know what you can and cannot do; otherwise you run the risk of an instant failure, which would not be fun.

Advice
Ensure your PC/laptop is charged, arrive 30 minutes early, and go through your identification steps as soon as you can, as it can be a little unsettling at times and you don't want to be flustered before the exam begins.

Learning Resources
I always go in search of good learning resources for an exam, and I always start off with Microsoft Learn. Here you will find learning paths for your exam which you can go through at your own pace; make sure not to miss these as they really are excellent.

My favorite Learning Resources

Practice Tests
Whilst studying, I always think it's a good idea to do some knowledge checks; you may read some content in week 1 and forget it by week 2. Practice tests help reinforce my learning and help me recall things I need to know for each exam. I have used practice tests from several different places, and here are some of the people who have good training as well as good practice tests:-

Advice

  • Don't start studying and then put it off; you're likely to forget some of what you learned.
  • Book the exam 2 weeks out, as this will focus your mind.
  • Read the questions carefully, even in the practice tests.
  • No exam question has answers which are wildly wrong; Microsoft don't do this anymore, so you won't be able to rule out, let's say, 2 of 4 answers right away for being wildly wrong with regards to the question.
  • Take the exam once you're fairly confident you know the answers to most of the questions; you won't get them all right, but try to wait until you're at least somewhat confident in your knowledge.
  • ALWAYS check the official exam page in case the skills measured area has been updated – exams are updated regularly and you don't want to get caught out.

Wrap Up
No one likes sitting exams, but just think of the number of things you have learned. And don't be afraid to fail an exam; I have failed 3, and it just made me more determined to pass the next time and learn what I didn't know the first time around.

If you need advice about anything exam related, please do reach out on Twitter.

Don’t forget to subscribe to my YouTube Channel.



GitHub Actions 101

In this blog post series I am going to cover my journey to learning about GitHub Actions.

To get started with learning about GitHub Actions, let's start by describing what they are.


So what exactly are GitHub Actions?

“GitHub Actions makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want.

GitHub Actions help you automate your software development workflows in the same place you store code and collaborate on pull requests and issues. You can write individual tasks, called actions, and combine them to create a custom workflow.

You can write your own actions to use in your workflow or share the actions you build with the GitHub community.

Workflows are custom automated processes that you can set up in your repository to build, test, package, release, or deploy any code project on GitHub.”
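
To make that concrete, here is a minimal example workflow. Treat this as a sketch of the kind of file you might commit to .github/workflows/ci.yml for a .NET project; the file name, SDK version, and steps are illustrative:

name: CI

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository so the job can see the code.
      - uses: actions/checkout@v2

      # Install the .NET SDK (the version here is illustrative).
      - uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'

      # Build and test on every push and pull request.
      - run: dotnet build --configuration Release
      - run: dotnet test --no-build --configuration Release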


To get started with learning about GitHub Actions, let's start off by listing some of the best resources I have come across.

Don't forget to check out my YouTube Channel.