To Don’t List

I had two weeks off, my plans fell through for the second week, and so I have had some downtime to figure out my next move. I've been searching for something of late and wasn't sure what it was, but now I know I have found what I needed.

Off the back of finding what I needed, I am creating a To Don't list that I can check regularly to ensure I am on track. So here goes.

  • Don’t consume Twitter, use it to post and leave it there.
  • Same with LinkedIn, share content but don't spend time scrolling through it.
  • Don’t purchase a book from Amazon, get half way through it and put it down for good.
  • Don’t keep saying yes to every new project that people want me to work on, work on you’re own projects.
  • Don’t spend time gathering resources for a project and then move onto the next thing because it might be a better idea.
  • Don’t start the podcast back up and then let it slide again, I learned a lot from speaking to people all over the globe and it was a lot of fun.
  • Don’t spend time blogging just for the sake of blogging.
  • Don’t work on cool Azure projects at work and not blog about them either.
  • Don’t do any more exams unless I need to know about the thing for work, you’ve done loads, concentrate on your other projects.

Ok so this was a very quick blog post to keep me in check. I will probably add to it as I go.

Don’t forget to subscribe to my YouTube Channel.



1 year in…

Background

Just before Covid started I applied for a new job at a company called Intercept in the Netherlands. I had previously met only one of their employees, Wesley Haakman, at Ignite in Orlando in 2019 (I think it was). I spoke to him about what they were up to, then spoke to Holly Lehman and asked her opinion on the company; by this point I was swaying towards accepting the position offered, and after I spoke to Holly I was convinced it was the right move for me.

I joined Intercept later that year, on September 1st, 2020. One year ago today.

Intercept isn't like any other company I had worked for before. Other companies were good for different reasons, but where I work now is different, more on that later.

During my time here at Intercept I reckon I have worked on at the very least 26 different projects, using a large variety of Azure services. My background is as a developer, DevOps engineer, SRE, and then a development team manager.


Tech

I chose Intercept mainly for my love of Azure and wanting to work on it day to day.

I have worked with the following Azure technologies thus far:-

Azure Networking, Azure Data Factory, Databricks, Event Hubs, Synapse Analytics, App Service, Azure Functions, Container Registry, Virtual Machines, Azure DevOps, Cosmos DB, MySQL, PostgreSQL, Azure SQL, Azure Active Directory, Azure AD B2C, Azure Key Vault, Security Center, Azure API Management, Azure API for FHIR, Azure Event Grid, Logic Apps, Notification Hubs, Service Bus, Automation, Azure Backup, Azure Lighthouse, Azure Monitor, Azure Policy, Azure Portal, Cloud Shell, Cost Management, Azure CDN, Communication Services, Azure Migrate, Site Recovery, Application Gateway, Azure Bastion, Azure DNS, Azure Firewall, Azure Front Door, ExpressRoute, Load Balancer, Traffic Manager, VPN Gateway, Azure Storage, Azure Data Lake Storage and more.


Workshops

I have presented 15 or more workshops covering Azure DevOps, GitHub Actions, Azure Fundamentals and Cost Management. During Covid we have delivered them all remotely; I cannot wait until we deliver them in person at Microsoft offices around Europe. Really looking forward to that.


Culture

The culture at Intercept is simply awesome; you'll have to take my word for it. I don't want to go on about it, but I regularly thank my lucky stars that I work here. We have a lot of fun, and three times a week we have what we call the Intercept Cafe, where you can drop in and shoot the breeze with anyone who joins (I love this idea!).

Days off are encouraged, we work flexible hours, everyone speaks English very well (except me of course) and we win as a team, lose as a team, but we mainly kick butt as a team.


Day to Day

Day to day we create designs for customers all around Europe and then implement those designs. We also help customers improve existing solutions, with teams of people in what we call the continuous improvement teams. We only do Azure at Intercept, and we work mainly with ISVs (Independent Software Vendors).


Intercept are Hiring

I would never normally write a blog post about work and mention we're hiring, but things are different at Intercept. I went on holiday and had ZERO stress or knot in my stomach about returning to work, and that was a first.

We have colleagues in the Netherlands, England, Scotland, Northern Ireland, Germany and South Africa, and a recent colleague joined from South America and moved to the Netherlands. We chat, play bingo, play Xbox, and more.

Take a look at our vacancies page https://intercept.cloud/en/vacancies/


Other Stuff

This summer I decided to down tools, stop studying for exams, stop blogging, stop recording content and just enjoy the summer; thankfully it's been the best summer I can remember. I have enjoyed golfing and still do my bit in the community helping people, but I haven't been sat at my laptop anywhere near as much as I have in the past.

I was renewed as an MCT and MVP and auditioned to be a trainer on LinkedIn Learning. The training part is something I have been thinking about doing for a while now; finding the time was impossible, but now I think I may have decided that's the way I want to go.

I realise I love helping people, whatever that entails: helping people get a leg up, become MVPs, mentoring them, and so on is what I love doing.


Career

People still ask me what I want to do, whether I have any career goals. I can honestly say that at this moment I am enjoying life at Intercept. I love working with Azure, and I love learning, hence all the exams; I have a passion for learning and a passion for helping people, and that will never leave me. If you're not currently learning something, you're going backwards.

Don’t forget to subscribe to my YouTube Channel.



How to set environment variables for use with an Azure Function

There are many ways to pass variables into an Azure Function.

In this quick blog post I will show you how to test the Azure Function locally with local settings, then use app settings from the Azure Portal, and then use values stored within Azure Key Vault in case we need to store and retrieve secrets.

Ok, so to run our Azure Function locally I prefer to use C# Azure Functions; they just work, and I can debug them in VS Code or Visual Studio. I am a dinosaur and have been using Visual Studio since it first came out, so I tend to stick with that over VS Code. Yes, I am that old.

Moving on, let's show how we can make use of variables whilst debugging locally. For this we just need to create a local.settings.json file like the one below:-

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "DemoUsername": "azuregreg",
    "DemoPassword": "letmein"
  }
}

So our code can now use a stored username and password to test with locally. We don't check this file in, as it's only for testing locally (add it to a .gitignore file).

Ok, so we can now test our Azure Function with code that accesses the local variables, and away we go.
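If you're not in Visual Studio, you can also run and hit the function from the terminal with the Azure Functions Core Tools; a quick sketch (the route assumes the default Function1 name used in the code later in this post):

```shell
# From the folder containing host.json and local.settings.json
# (requires the Azure Functions Core Tools to be installed)
func start

# In another terminal, call the local HTTP trigger
curl http://localhost:7071/api/Function1
```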

So our function now outputs the following (remember, this is from local settings):-

this is gregors demo function - the username is azuregreg and the password is letmein


Now for the interesting part: how do we store variables in Azure and make use of them?

So, we have some options available to us. For things like a username and password we can store them in Azure Key Vault; if they are simple settings then we can store them in app settings within the Azure Portal.

Let's take a look at storing these variables in the App Settings section of the Azure Portal for our Azure Function.

In the screenshot above we are in the Azure Portal, having clicked into our Azure Function app, then Configuration, and then + New application setting.

Now we can add configuration values for our code. We can store settings here, but sometimes we need to store secrets, and for this we use Azure Key Vault, which we will return to shortly.

So let's add the username and password settings into the application settings and see how we can use them first of all.

So I went ahead and added them into the application settings section like so:-

In the screenshot above I clicked Show Values so that I can show you, the reader, the values. I set exactly the same properties as in the local.settings.json file used when debugging locally, then clicked Save.
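If you prefer the command line over clicking through the portal, the same two settings can be added with the Azure CLI; a sketch, assuming a made-up function app name and resource group:

```shell
# Add (or update) the two application settings on the function app
az functionapp config appsettings set \
  --name my-demo-func-app \
  --resource-group my-demo-rg \
  --settings "DemoUsername=azuregregviaappsettings" "DemoPassword=letmeinviaappsettings"
```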

Ok, so now we have added some application settings, how does the code need to change to pick up these values from the portal? The good news is that we don't need to change our code at all; it works exactly the same way as with the local.settings.json file.

So the above works, but I hate it when people say "yeah, but it works". Yes, it works, but "can we make it better?" should always be the question. Getting things working and making them as secure as they can be are different things altogether.

So why would we want to store the username and password in Key Vault? Glad you asked.

Maybe there is a username and password which you don't want everyone to know. You can add these values to Key Vault and not give anyone access to read, say, the password, yet your application can still read and use the value from the Key Vault without anyone being able to see the password in plain text. Sounds good to me, so let's go set this up.

If you want to test this out, I created a C# Azure Function which is basically just the following code; as a reminder, the local.settings.json is at the start of the blog post.

C# Azure function code below.

[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Locally these come from local.settings.json; in Azure they come from
    // app settings (or Key Vault references in app settings)
    string username = Environment.GetEnvironmentVariable("DemoUsername");
    string password = Environment.GetEnvironmentVariable("DemoPassword");

    string responseMessage = $"this is gregors demo function - the username is {username} and the password is {password}";

    return new OkObjectResult(responseMessage);
}

First we need to create a Key Vault. Once you've done that, locate it, click on Secrets, and create 2 new secrets; let's call them username and password for simplicity. Set the values like so:-

username: azuregregfromkeyvault
password: letmeinviakeyvault
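For those who prefer scripting it, the Key Vault and both secrets can also be created with the Azure CLI; a sketch with made-up vault and resource group names, using the secret values that appear in the final output later in this post:

```shell
# Create the vault (names and location here are examples)
az keyvault create --name my-demo-kv --resource-group my-demo-rg --location westeurope

# Add the two secrets
az keyvault secret set --vault-name my-demo-kv --name username --value azuregregfromkeyvault
az keyvault secret set --vault-name my-demo-kv --name password --value letmeinviakeyvault
```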

When we run our Azure HTTP Function at this point, it still picks up the values from the application settings we added in the app settings section of the Azure Function in the portal.

So our function now outputs:-

this is gregors demo function - the username is azuregregviaappsettings and the password is letmeinviaappsettings

Let's now change our existing app settings so that we get the details from Key Vault rather than simply storing them in our app settings (where anyone with access can see them); only people with the correct RBAC rights can see our Key Vault secrets.

Ok, now that we have our 2 secrets, let's try to access them from the app settings section of our Azure Function. Go back to the Azure Function, click on Configuration, and we will be back at the screen that shows our current application settings.

Now let's edit the existing app setting called DemoUsername by finding the row and selecting the edit button as below:-

Let's paste in the following:
@Microsoft.KeyVault(SecretUri=https://<your keyvault name>.vault.azure.net/secrets/username), replacing <your keyvault name> with the name of your Key Vault, like below.

Once you’ve done this for the username and password we can now use the values from our Azure Function.

Before this works we need a couple more steps: we need to create a managed identity and then create an access policy within Key Vault.

To create a managed identity, go to your Azure Function and then, under Settings, select Identity. Change the status to On and click Save; also take a copy of the Object ID, as we will need it later on.

Next we need to create an access policy within Key Vault. Go into your Key Vault, select Access Policies, and then choose the + Add Access Policy link. Where it says Select principal, click the words none selected, paste in the Object ID we took a note of above, and then select it. Then, within the Secret permissions drop-down, select Get and List and click Add.
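Both of these steps can also be scripted. Here is a hedged sketch with the Azure CLI, again using made-up resource names:

```shell
# Turn on the system-assigned managed identity and capture its object (principal) ID
principalId=$(az functionapp identity assign \
  --name my-demo-func-app \
  --resource-group my-demo-rg \
  --query principalId --output tsv)

# Grant that identity permission to Get and List secrets in the vault
az keyvault set-policy --name my-demo-kv \
  --object-id "$principalId" \
  --secret-permissions get list
```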

Now go back to your Azure Function, select Configuration, edit both your DemoUsername and DemoPassword app settings, and then click Save. They should now look like this:-

Notice in the screenshot above the green tick next to the words Key Vault Reference; if this is red, then recheck your steps with the managed identity and the access policy above.

So our function now outputs:- 

this is gregors demo function - the username is azuregregfromkeyvault and the password is letmeinviakeyvault

And that is how you can run and test our function locally using local.settings.json, store environment variables in app settings, and also store them in Key Vault if they are secrets.

Don’t forget to subscribe to my YouTube Channel.



Deploying your Azure Function using Azure DevOps

In this blog post we will cover how to deploy your Azure Function using Azure DevOps, to get away from right-click publish or deploying straight from VS Code, for example (ideally we want to run tests against our code). In my case I will be deploying a PowerShell Azure Function.

In Azure DevOps we have a new project with our Azure Function code in a repository.

We select Create Pipeline.

We can then choose, Azure Repos Git like so:-

Then we need to select the repo; next we select PowerShell Function App to Windows on Azure.

Next we need to select the correct Azure Subscription, and then choose an existing Function App Name, then click Validate and configure.

So now we have a pipeline YAML file which will build and deploy our Azure PowerShell Function to Azure, and the YAML looks like the following:-
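The generated file isn't reproduced here, so below is a representative sketch of what the wizard produces. The task versions, variable values and service connection name are assumptions, so treat it as an outline rather than the exact generated file:

```yaml
trigger:
  - master

variables:
  azureSubscription: 'my-service-connection'  # example service connection name
  functionAppName: 'my-powershell-func'       # example function app name

pool:
  vmImage: 'windows-latest'

steps:
  # Zip up the function app code
  - task: ArchiveFiles@2
    displayName: 'Archive the function app'
    inputs:
      rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
      includeRootFolder: false
      archiveType: zip
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

  # Deploy the zip to the existing Windows Function App
  - task: AzureFunctionApp@1
    displayName: 'Deploy to the Function App'
    inputs:
      azureSubscription: '$(azureSubscription)'
      appType: functionApp
      appName: '$(functionAppName)'
      package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
```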

We can set our pipeline to build and deploy the code whenever changes are committed to the Azure Repo. As highlighted above, we do this by using a trigger of master, which means any commits to this branch will run the pipeline.

This is pretty simple stuff, but it's nice to know that Azure DevOps can tell from the repo that we have a PowerShell Azure Function and can create our YAML for us.



What advice would you give…

I see people on Twitter asking what advice you would give to younger people, or what career advice you would give to people who are just getting started.

Ok, so, I have been in the I.T. industry for 25 years. I didn't do any Computer Science courses; I am a self-taught programmer, and here is my advice. Yep, that's right, my advice. You may disagree, and that's great; it would be boring if we all agreed all the time.

  • Ask for help, don’t be shy to ask for help always.
  • Ask for feedback, on all types of things, feedback is important for growth, you may not like it but you’ll learn more from negative feedback than you will from positive feedback.
  • Work like someone is watching you, never slack, put a shift in each and every day.
  • Do something you enjoy doing, at your job you will be doing it day in and day out for a long time most likely.
  • Find a mentor, and talk to them regularly.
  • Be passionate about what you do. Learn something outside your comfort zone, as often as you can, staying in your comfort zone teaches you next to nothing.
  • Don’t try and learn it all, no one knows it all, and neither will you.
  • Pick a subject area and be someone who stands out in this area.
  • If you are thinking about doing something and it’s a challenge, go for it, just do it, stop overthinking.
  • If you have a job that you don’t like – find another job.
  • Be honest and trustworthy and you will go a long way, no BS, leave that for others.
  • Thank the people who help you.
  • Give back when you can.
  • Network with like-minded people, there are lots of people out there just like you.
  • Work smart, not hard.
  • Share failures and successes, no one wins all the time, you’ll learn more from your failures.

Don’t forget to subscribe to my YouTube Channel.



AI-102 Azure Study Guide

In this blog post I cover all of the resources I came across whilst studying for the AI-102 exam.

AI-102: AI Engineer on GitHub:- https://github.com/MicrosoftLearning/AI-102-AIEngineer

And this too:-

https://microsoftlearning.github.io/AI-102-AIEngineer/

Plan and Manage an Azure Cognitive Services Solution (15-20%)
Select the appropriate Cognitive Services resource



People Networking

I got some great advice from my sister when I was younger regarding networking with people and it stayed with me.

I get people saying to me that I know everyone. I smile at that, because it means my hard work is paying off. What do I mean by hard work?

I am different to a lot of people. A lot of people look at things like Twitter, see follower counts and so on, and say they don't care about numbers; I believe them, I really do. I, on the other hand, think of it differently: I see social media as a way to network with people. If I have the opportunity to engage with people I can help, or who can help me, then I will always grab that opportunity.

If you walk into a conference and you know people and people know you, there is an ice breaker for starters. But not everyone is a social butterfly; trust me, neither am I, but I am working on it.

Imagine you could find all of the like-minded people who can help you day to day with questions and explanations, without ever leaving your seat or having to trawl through wrong Stack Overflow answers.

Going back to the whole "everyone knows you" thing: I took some time to find the people I want to connect with. I work with Azure daily, so I looked for all the Azure-minded people who work at Microsoft; I follow them, and that way I learn more than you would think.

When I went to Ignite 2019 I recognized a lot of people and that was amazing, I spoke to people I had interacted with a little bit on Twitter and now I chat with them regularly.

Summary

There is no ego here; if you have an ego then we can't be friends. I don't follow people to get my numbers up (although the image below does crack me up); I follow people so that I can network, as networking brings opportunities, wide-ranging opportunities at that.

I have had more opportunities than I could ever imagine; I've spent the Covid time mostly saying no to people. Networking has meant I have been able to help a lot of people with the Azure exams and with getting started learning Azure.

My 2 cents is to network your backside off.

Don’t forget to subscribe to my YouTube Channel.



Azure DP-300

Azure DP-300 Exam Study guide

Plan and Implement Data Platform Resources (15-20%)
Deploy resources by using manual methods



TFS

TFS TF400917 – Upgrading TFS to move to Azure Devops

A customer at work had an issue today upgrading from TFS 2017 to TFS 2019, with a view to moving to Azure DevOps, so I thought I would blog the issue and the fix in case anyone else runs into the same sort of problem. The error we hit is shown below:-

The error message was "TF400917: The current configuration is not valid for this feature". Googling it takes you to this link:- https://docs.microsoft.com/en-us/azure/devops/reference/xml/process-configuration-xml-element?view=azure-devops-2020

From here we can run a tool called witadmin (it ships with Visual Studio); you can read more here -> https://docs.microsoft.com/en-us/azure/devops/reference/witadmin/witadmin-customize-and-manage-objects-for-tracking-work?view=azure-devops-2020

If you run this tool and export the config like so:-

witadmin exportprocessconfig /collection:CollectionURL /p:ProjectName [/f:FileName]

This exports the process config so we can check for invalid items; in our case there were duplicate State values within the XML. So I exported the XML, made the change by hand, and then imported the file, with the duplicate item removed, using the following command:-

witadmin importprocessconfig /collection:CollectionURL /p:ProjectName [/f:FileName] /v

For both of the commands above you have to supply the CollectionURL, ProjectName and a FileName; importing the corrected config fixed the issue. The devil here is in the detail: find the invalid entries. In our case it was a duplicate State of Completed. I removed one and saved (nope, not that one), so I added it back in, removed the other, and re-imported the config. Problem solved.

Note
You can also download an add-on for Visual Studio which can help with the task of migrating from TFS to Azure DevOps which is called the TFS Process Template Editor. The link to download this is https://marketplace.visualstudio.com/items?itemName=KarthikBalasubramanianMSFT.TFSProcessTemplateEditor

With the above tool you can visualize the config for your TFS setup, which can help you see what's going on under the hood a little better. A useful tool!

Kudos to https://twitter.com/samsmithnz for telling me about this – Sam rocks!



Azure App Service

Troubleshooting App Services in Azure

In this blog post I wanted to cover how to go about troubleshooting an App Service in Azure: a web app with a SQL Server backend where users have reported slow performance of the website.

The first thing I tend to look at is the backend store, in this case Azure SQL, and we have some really great tooling we can use to troubleshoot performance issues with Azure SQL.

The first port of call was to open up the Azure Portal, go to the Resource Group with the issues, click on the SQL database, and head to the Intelligent Performance section in the left-hand menu, as highlighted below:-

Performance Overview
This currently has a recommendations area that suggests adding 5 different indexes, all of which are marked as HIGH impact.

Indexes can sometimes have adverse effects, so it's recommended to look at the suggestions, copy the script from the recommendations, and consider whether the index will indeed help the performance of your queries.

Query Performance Insight
The second area I look at is Query Performance Insight. From here we can see the average CPU, Data IO and Log IO on the database across the last 24 hours. We also get an insight into which queries are running and taking the longest time to complete.

I changed the graph above to show the last 7 days, and I can see CPU is maxed out at 100% for a long period within the last 7 days, as seen below:-

Long Running Queries
This area identifies queries which are taking a long time to complete, and it is always worth checking regularly.
The following is a screenshot of long-running queries within the database for the past week. To find this information, select the database instance in the portal, select Query Performance Insight, select Long running queries, then choose Custom and change the time period to Past week.

We can see above that the yellow query has the longest duration this past week; you can click on the yellow area and it will show you the details of that long-running query.

Automatic Tuning

Azure SQL Database built-in intelligence automatically tunes your databases to optimize performance. What can automatic tuning do for you?

  • Automated performance tuning of databases
  • Automated verification of performance gains
  • Automated rollback and self-correction
  • Tuning history
  • Tuning action Transact-SQL (T-SQL) scripts for manual deployments
  • Proactive workload performance monitoring
  • Scale out capability on hundreds of thousands of databases
  • Positive impact to DevOps resources and the total cost of ownership

I would recommend turning this on and leaving it like the following:-

This means that Azure will tune the indexes using built-in intelligence and create indexes when it thinks you need them based on usage patterns. A word of caution here: these recommendations aren't always correct, so please bear this in mind.

Log Analytics
I always recommend adding the Azure SQL Analytics workspace solution to the subscription; this gives us further insight into the SQL Server in Azure. Once you turn this on, you need to wait some time before it can gather a decent amount of data.

The screenshot below shows the type of information we can get from it. This screenshot was taken not long after the solution was turned on, so given more time it will show much more useful detail:-

From here we can get more information about deadlocks, timeouts, etc.
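Once data is flowing, you can also query the workspace from the command line. Here is a rough sketch using the Azure CLI; the workspace GUID is a placeholder, and the table and category names depend on your diagnostic settings, so verify them against your own workspace:

```shell
# Pull the 10 most recent deadlock records from the Log Analytics workspace
az monitor log-analytics query \
  --workspace "<your-workspace-guid>" \
  --analytics-query "AzureDiagnostics
    | where Category == 'Deadlocks'
    | sort by TimeGenerated desc
    | take 10"
```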


Now let's take a look at the website, which is in an App Service in Azure, and see what tools we can use to help troubleshoot performance issues.

I always recommend adding Application Insights to Azure resources when possible, and here, if we click on the App Insights for the web app, we can instantly get some basic info. If you click on the Application Dashboard, as seen below, we get a high-level view of what's going on in our App Service.

The Application dashboard for a typical web app might look something like this: –

Ok, so let's now do some further investigation into our App Service issues. This time I chose the App Service itself and then Diagnose and solve problems from the left-hand menu. This feature is underused in my opinion, and it is very useful indeed; I'm not sure many people have looked at it, but it can be pretty helpful with recommendations and with pointing out things that you may want to remediate.

Once in the Diagnose and solve problems area I usually click on Availability and Performance within the Troubleshooting categories section and if you do, you’ll see something like this: –

In the image above we can see that we have some App Performance issues to go and investigate. Clicking into the App Performance section, we get in-depth details about the performance, and we get Observations that say things like Slow Request Execution, with details of the web page, average latency, total execution time, and so on. The detail here is very helpful in tracking down potential issues in the code or the configuration of your web application. There are a number of options to check within each of the 6 troubleshooting categories; an example is shown below for the Availability and Performance section:-

Summary
In summary, there are a number of really awesome tools to aid us with troubleshooting App Service performance issues; go check them out the next time your web app is running poorly.