Microsoft Ignite 2019 – My Review

I have just returned from Microsoft Ignite 2019, which was held in Orlando, Florida. Here is my take on the experience of attending the event.

I stayed at the Best Western Orlando Convention Centre hotel, which was pretty handy for the event and meant I could walk to the venue each day. (I’d recommend staying close by if you’re planning to attend next year.)

Friday
I arrived on the Friday, which was great: it gave me time to get used to the location, find my way around where I was staying, and figure out where everything was, which I would also recommend. I met up with other people who were attending and we went out for food and drinks that night, chatting with a group from Germany who were mostly MVPs. A great start to my time in Florida.

Saturday
I did more of the same on Saturday and spent a good deal of the day around the Hyatt Regency hotel, which is a great area to hang out as this is where a number of the Microsoft speakers were staying. We had drinks and food at the hotel bar, and again I spent a good deal of time meeting attendees from all around the globe.

Sunday
Sunday offers a pre-day, where you can attend a full day of learning the day before the conference starts. This is normally $500, but you get it free as an MVP. It was awesome, as I managed to meet a good number of Azure MVPs from around the world and chat with them during the day.

Monday to Friday was spent going to sessions, hanging out in the Hub, and visiting the Microsoft booths, where I asked a lot of questions of product team members (which is invaluable).

On Thursday night we got to go to Universal Studios and Islands of Adventure from 7:30pm until midnight for the Ignite celebration. The parks are open for attendees only, there is free food and drink, and all of the rides are free, which was incredible.

Overall, the experience of attending Ignite was just wow: the size of the venue, the number of attendees, and the number of sessions were hard to comprehend. I will be going next year without a doubt and highly recommend it to anyone. The networking opportunities are endless, meeting both Microsoft staff and people I know from Twitter and elsewhere.

Tips for next year: –

  • Arrive early and acclimatise
  • Don’t overdo it, there is a lot of walking and the fear of missing out will always be there.
  • The parties at night are amazing so pace yourself throughout each day.
  • If you’re an MVP, try to be there on the Sunday.
  • Attend sessions if you have questions, otherwise watch them at the HUB or when you get home.
  • Be prepared for not a lot of sleep, you can sleep when you get home.
  • Enjoy the Microsoft store and endless freebies from the vendors.
  • Attend – honestly, it’s off-the-charts good fun and the opportunities to learn and meet people are endless.

If you can’t afford to attend or don’t fancy travelling to Florida, then try to attend an Ignite Tour venue near you (they have already begun, with more coming soon).



Microsoft Security Code Analysis for Azure DevOps – Part 3 BinSkim

Microsoft has recently released a new set of security tooling for Azure DevOps called Microsoft Security Code Analysis.

The Microsoft Security Code Analysis Extension is a collection of tasks for the Azure DevOps Services platform. These tasks automatically download and run secure development tools in the build pipeline.

In this post I’ll cover BinSkim and how to use it.


BinSkim
BinSkim is a Portable Executable (PE) light-weight scanner that validates compiler/linker settings and other security-relevant binary characteristics. The build task provides a command line wrapper around the BinSkim.exe application. BinSkim is an open source tool.

Setup:

  1. Open your team project from your Azure DevOps Account.
  2. Navigate to the Build tab under Build and Release
  3. Select the Build Definition into which you wish to add the BinSkim build task.
    New – Click New and follow the steps detailed to create a new Build Definition.
    Edit – Select the Build Definition. On the subsequent page, click Edit to begin editing the Build Definition.
  4. Click + to navigate to the Add Tasks pane.
  5. Find the BinSkim build task either from the list or using the search box and then click Add.
  6. The BinSkim build task should now be a part of the Build Definition. Add it after the publishing steps for your build artifacts.

Customizing the BinSkim Build Task:

  1. Click the BinSkim task to see the different options available within.
  2. Set the build configuration to Debug to produce *.pdb debug files. They are used by BinSkim to map issues found in the output binary back to source code.
  3. Choose Type = Basic & Function = Analyze to avoid researching and creating your own command line.
  4. Target – One or more specifiers to a file, directory, or filter pattern that resolves to one or more binaries to analyze.
    • Multiple targets should be separated by a semicolon(;).
    • Can be a single file or contain wildcards.
    • Directories should always end with \*
    • Examples:
      • *.dll;*.exe
      • $(BUILD_STAGINGDIRECTORY)\*
      • $(BUILD_STAGINGDIRECTORY)\*.dll;$(BUILD_STAGINGDIRECTORY)\*.exe;
    • Make sure the first argument to BinSkim.exe is the verb analyze using full paths, or paths relative to the source directory.
    • For Command Line input, multiple targets should be separated by a space.
    • You can omit the /o or /output file parameter; it will be added for you or replaced.
    • Standard Command Line Configuration
      • analyze $(Build.StagingDirectory)\* --recurse --verbose
      • analyze *.dll *.exe --recurse --verbose
      • Note that the trailing \* is very important when specifying a directory or directories for the target.
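
For pipelines defined in YAML, the same configuration can be sketched as a build step. Note that the task name/version (BinSkim@3) and the input names below are assumptions based on the classic editor fields – check the task picker in your own organisation for the exact names.

```yaml
# Sketch of a BinSkim step in an Azure Pipelines YAML build.
# Assumes the Microsoft Security Code Analysis extension is installed;
# task and input names may differ between extension versions.
steps:
- task: BinSkim@3
  displayName: 'Run BinSkim'
  inputs:
    InputType: 'Basic'                             # Type = Basic
    Function: 'analyze'                            # Function = Analyze
    AnalyzeTarget: '$(Build.StagingDirectory)\*'   # note the trailing \*
    AnalyzeRecurse: true
    AnalyzeVerbose: true
```

Place this step after the step that copies your binaries (and their *.pdb files) to the staging directory, matching step 6 of the setup above.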

    BinSkim User Guide

    For more details on BinSkim, whether command line arguments, rules by ID, or exit codes, visit the BinSkim User Guide.



Microsoft Security Code Analysis for Azure DevOps – Part 2 Credential Scanner

Microsoft has recently released a new set of security tooling for Azure DevOps called Microsoft Security Code Analysis.

The Microsoft Security Code Analysis Extension is a collection of tasks for the Azure DevOps Services platform. These tasks automatically download and run secure development tools in the build pipeline.

In this post I’ll show you how to get the new extension and how to go about using it.

Credential Scanner (aka CredScan) is a tool developed and maintained by Microsoft to identify credential leaks such as those in source code and configuration files. Some of the commonly found types of credentials are default passwords, SQL connection strings and Certificates with private keys.
The CredScan build task is included in the Microsoft Security Code Analysis Extension. This page has the steps needed to configure & run the build task as part of your build definition.

Let’s start by adding CredScan to a build for an existing project – I’ll use the AzureAdventCalendar project which I already have set up within my Azure DevOps project at https://dev.azure.com.

Setup:

  1. Open your team project from your Azure DevOps Account.
  2. Navigate to the Build tab under Build and Release
  3. Select the Build Definition into which you wish to add the CredScan build task.
    New – Click New and follow the steps detailed to create a new Build Definition.
    Edit – Select the Build Definition. On the subsequent page, click Edit to begin editing the Build Definition.
  4. Click + to navigate to the Add Tasks pane.
  5. Find the CredScan build task either from the list or using the search box and then click Add.
  6. The Run CredScan build task should now be a part of the Build Definition.


Customizing the CredScan Build Task:

Available options include: –

  • Output Format – TSV/ CSV/ SARIF/ PREfast
  • Tool Version (Recommended: Latest)
  • Scan Folder – The folder in your repository to scan
  • Searchers File Type – Options to locate the searchers file used for scanning.
  • Suppressions File – A JSON file can be used for suppressing issues in the output log (more details in the Resources section).
  • (New) Verbose Output – self-explanatory
  • Batch Size – The number of concurrent threads used to run Credential Scanners in parallel. Defaults to 20 (Value must be in the range of 1 to 2147483647).
  • (New) Match Timeout – The amount of time to spend attempting a searcher match before abandoning the check.
  • (New) File Scan Read Buffer Size – Buffer size while reading content in bytes. (Defaults to 524288)
  • (New) Maximum File Scan Read Bytes – Maximum number of bytes to read from a given file during content analysis. (Defaults to 104857600)
  • Run this task (under Control Options) – Specifies when the task should run. Choose “Custom conditions” to specify more complex conditions.

*Version – Build task version within Azure DevOps. Not frequently used.
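
As a sketch, the options above map onto a YAML pipeline step roughly as follows. The task name/version (CredScan@2) and input names are assumptions – verify them against the task picker or the extension’s documentation.

```yaml
# Sketch of a CredScan step in an Azure Pipelines YAML build.
# Input names are assumptions and may differ between extension versions.
steps:
- task: CredScan@2
  displayName: 'Run Credential Scanner'
  inputs:
    toolMajorVersion: 'V2'                         # Tool Version
    outputFormat: 'sarif'                          # tsv / csv / sarif / prefast
    scanFolder: '$(Build.SourcesDirectory)'        # folder in the repo to scan
    suppressionsFile: 'CredScanSuppressions.json'  # optional suppressions file
    verboseOutput: true
```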


Resources

Local suppressions scenarios and examples

Two of the most common suppression scenarios are detailed below: –

1. Suppress all occurrences of a given secret within the specified path

The hash key of the secret from the CredScan output file is required, as shown in the sample below:

{
  "tool": "Credential Scanner",
  "suppressions": [
    {
      "hash": "CLgYxl2FcQE8XZgha9/UbKLTkJkUh3Vakkxh2CAdhtY=",
      "_justification": "Secret used by MSDN sample, it is fake."
    }
  ]
}

Warning: The hash key is generated by a portion of the matching value or file content. Any source code revision could change the hash key and disable the suppression rule.

2. To suppress all secrets in a specified file (or to suppress the secrets file itself)

The file expression can be a file name or any postfix portion of the full file path/name. Wildcards are not supported.

Example
File to be suppressed: [InputPath]\src\JS\lib\angular.js
Valid suppression rules:
  • [InputPath]\src\JS\lib\angular.js – suppresses the file in the specified path
  • \src\JS\lib\angular.js
  • \JS\lib\angular.js
  • \lib\angular.js
  • angular.js – suppresses any file with the same name
{
  "tool": "Credential Scanner",
  "suppressions": [
    {
      "file": "\\files\\AdditonalSearcher.xml",
      "_justification": "Additional CredScan searcher specific to my team"
    },
    {
      "file": "\\files\\unittest.pfx",
      "_justification": "Legitimate UT certificate file with private key"
    }
  ]
}

Warning: All future secrets added to the file will also get suppressed automatically.


Secrets management guidelines
While detecting hard-coded secrets in a timely manner and mitigating the risks is helpful, it is even better to prevent secrets from getting checked in altogether. In this regard, Microsoft has released the CredScan Code Analyzer as part of the Microsoft DevLabs extension for Visual Studio. While in early preview, it provides developers an inline experience for detecting potential secrets in their code, giving them the opportunity to fix those issues in real time. For more information, please refer to this blog on Managing Secrets Securely in the Cloud.
Below are a few additional resources to help you manage secrets and access sensitive information from within your applications in a secure manner:


Extending search capabilities
CredScan relies on a set of content searchers commonly defined in the buildsearchers.xml file. The file contains an array of XML serialized objects that represent a ContentSearcher object. The program is distributed with a set of searchers that have been well tested but it does allow you to implement your own custom searchers too.

A content searcher is defined as follows:

  • Name – The descriptive searcher name to be used in CredScan output file. It is recommended to use camel case naming convention for searcher names.
  • RuleId – The stable opaque id of the searcher.
    • CredScan default searchers are assigned with RuleIds like CSCAN0010, CSCAN0020, CSCAN0030, etc. The last digit is reserved for potential searcher regex group merging or division.
    • RuleId for customized searchers should have its own namespace in the format of: CSCAN-{Namespace}0010, CSCAN-{Namespace}0020, CSCAN-{Namespace}0030, etc.
    • The fully qualified searcher name is the combination of the RuleId and the searcher name, e.g. CSCAN0010.KeyStoreFiles, CSCAN0020.Base64EncodedCertificate, etc.
  • ResourceMatchPattern – Regex of file extensions to check against searcher
  • ContentSearchPatterns – Array of strings containing Regex statements to match. If no search patterns are defined, all files matching the resource match pattern will be returned.
  • ContentSearchFilters – Array of strings containing Regex statements to filter searcher specific false positives.
  • MatchDetails – A descriptive message and/or mitigation instructions to be added for each match of the searcher.
  • Recommendation – Provides the suggestions field content for a match using PREfast report format.
  • Severity – An integer to reflect the severity of the issue (Highest = 1).
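
Putting the fields above together, a custom searcher might look something like the sketch below. The element names follow the field list above, but the exact schema is an assumption – check the buildsearchers.xml shipped with CredScan for the authoritative format. The searcher name, rule ID, and regexes are hypothetical.

```xml
<!-- Hypothetical custom searcher; verify the schema against buildsearchers.xml -->
<ContentSearcher>
  <Name>MyTeamApiKey</Name>
  <RuleId>CSCAN-MYTEAM0010</RuleId>
  <!-- only scan config-like files -->
  <ResourceMatchPattern>\.(config|json|xml)$</ResourceMatchPattern>
  <ContentSearchPatterns>
    <string>myteam_api_key\s*=\s*[0-9a-f]{32}</string>
  </ContentSearchPatterns>
  <ContentSearchFilters>
    <!-- filter out the all-zero placeholder value used in samples -->
    <string>myteam_api_key\s*=\s*0{32}</string>
  </ContentSearchFilters>
  <MatchDetails>Possible MyTeam API key found in source.</MatchDetails>
  <Recommendation>Remove and rotate the key; store secrets in Azure Key Vault.</Recommendation>
  <Severity>1</Severity>
</ContentSearcher>
```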


Join me in part 3, where I cover BinSkim.



Microsoft Security Code Analysis for Azure DevOps – Part 1

Microsoft has recently released a new set of security tooling for Azure DevOps called Microsoft Security Code Analysis.

The Microsoft Security Code Analysis Extension is a collection of tasks for the Azure DevOps Services platform. These tasks automatically download and run secure development tools in the build pipeline.

In this post I’ll show you what the tools cover; in part 2, I’ll show you them in action in Azure DevOps.


Credential Scanner
Passwords and other secrets stored in source code are currently a big problem. Credential Scanner is a static analysis tool that detects credentials, secrets, certificates, and other sensitive content in your source code and your build output.

More Information


BinSkim
BinSkim is a Portable Executable (PE) light-weight scanner that validates compiler/linker settings and other security-relevant binary characteristics. The build task provides a command line wrapper around the BinSkim.exe application. BinSkim is an open source tool.

More Information (BinSkim on GitHub)


TSLint
TSLint is an extensible static analysis tool that checks TypeScript code for readability, maintainability, and functionality errors. It is widely supported across modern editors and build systems and can be customized with your own lint rules, configurations, and formatters. TSLint is an open source tool.

More Information on Github


Roslyn Analyzers
Microsoft’s compiler-integrated static analysis tool for analyzing managed code (C# and VB).

More Information (Roslyn Analyzers on docs.microsoft.com)


Microsoft Security Risk Detection
Security Risk Detection is Microsoft’s unique cloud-based fuzz testing service for identifying exploitable security bugs in software.

More Information (MSRD on docs.microsoft.com)


Anti-Malware Scanner
The Anti-Malware Scanner build task is now included in the Microsoft Security Code Analysis Extension. It must be run on a build agent which has Windows Defender already installed.

More Information


Analysis and Post-Processing of Results

The Microsoft Security Code Analysis extension has three build tasks to help you process and analyze the results found by the security tools tasks.

  • The Publish Security Analysis Logs build task preserves log files from the build for investigation and follow-up.
  • The Security Report build task collects all issues reported by all tools and adds them to a single summary report file.
  • The Post-Analysis build task allows customers to inject build breaks and fail the build should an analysis tool report security issues found in the code that was scanned.

Publish Security Analysis Logs
The Publish Security Analysis Logs build task preserves the log files of the security tools run during the build. They can be published to Azure DevOps Server artifacts (as a zip file), or copied to an accessible file share from your private build agent.

More Information


Security Report
The Security Report build task parses the log files created by the security tools run during the build and creates a summary report file with all issues found by the analysis tools.
The task can be configured to report findings for specific tools or for all tools, and you can also choose what level of issues (errors or errors and warnings) should be reported.

More Information


Post-Analysis (Build Break)
The Post-Analysis build task enables the customer to inject a build break and fail the build in case one or more analysis tools report findings or issues in the code.
Individual build tasks will succeed, by design, as long as the tool completes successfully, whether there are findings or not. This is so that the build can run to completion, allowing all tools to run.
To fail the build based on security issues found by one of the tools run in the build, add and configure this build task.
The task can be configured to break the build for issues found by specific tools or for all tools, and also based on the severity of issues found (errors or errors and warnings).

More Information
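
As a sketch, the three post-processing tasks could sit at the end of a YAML pipeline like this. The task names and versions (SdtReport, PublishSecurityAnalysisLogs, PostAnalysis) and their inputs are assumptions – confirm them in the task picker before use.

```yaml
# Sketch: post-processing steps at the end of the build.
# Task names/versions and inputs are assumptions; verify in the task picker.
steps:
- task: SdtReport@1                       # Security Report: one summary file
  inputs:
    AllTools: true                        # include findings from every tool
- task: PublishSecurityAnalysisLogs@2     # preserve the raw tool logs
  inputs:
    ArtifactName: 'CodeAnalysisLogs'
    ArtifactType: 'Container'             # publish as a build artifact (zip)
- task: PostAnalysis@1                    # break the build on findings
  inputs:
    AllTools: true
    BreakBuild: true
```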



Azure DevOps Generator – New Content

Microsoft recently open sourced the Azure DevOps Generator, and it’s had some new content added which I wanted to highlight. You can use this tool to learn all sorts of Azure DevOps tips and tricks: building code, seeing how it hangs together, deploying, and even checking your code for vulnerabilities, with ARM templates, GitHub resources and more.


I can’t stress how useful this resource has been for me for spinning up test Azure DevOps projects for blog posts, testing security add-ons, etc. (more blogs to follow very soon). Please play with this and learn; the demo generator has a lot more in it than the last time I checked, and I was pleasantly surprised. It’s an awesome tool.

The following is a quick tour of what is there at present: –

General Tab
The general tab is for creating projects in Azure DevOps from existing project templates. This will give you full source code, build and release pipelines, wikis, example kanban boards with issues, and more.
Note: There are different types of project if you scroll down the list.


DevOps Labs Tab

On this tab we have more sample projects, but this time they cover concepts like using Terraform, Ansible, Docker, Azure Key Vault, and more. If you want to learn more about these, then here is a great way to give them a spin.


Microsoft Learn Tab
Using Microsoft Learn we can learn how to do things like: –

  • Create a build pipeline with Azure Pipelines
  • Scan code for vulnerabilities in Azure Pipelines
  • Manage database changes in Azure Pipelines
  • Run non-functional tests in Azure Pipelines

Microsoft Cloud Adoption Framework Tab

The Cloud Adoption Plan template creates a backlog for managing cloud adoption efforts based on the guidance in the Microsoft Cloud Adoption Framework.


Private Tab

Azure DevOps Demo Generator enables users to create their own templates from existing projects and provision new projects using the extracted template. The capacity to have custom templates can be helpful in many situations, such as constructing custom training content, providing only certain artifacts, etc.


You can even create a template from an existing project you have within Azure DevOps by selecting ‘Create New Template’ – this is super nice, I’ll leave you to explore this further.

Enjoy!



Azure Advent Calendar Participant Information

The Azure Advent Calendar runs from December 1st through to December 25th this year.

For people taking part (entry is now closed – apologies) we have set up a YouTube channel to host your entries on your behalf; we will send you back the YouTube link once we have uploaded and scheduled the video.

For participants, please send us the video via a file share such as OneDrive. If you do not have one, message @pixel_robots and he will send you a link where you can send us your video.

On the day of your entry please publish your blog post live to the world and just add a link back to the website which is https://azureadventcalendar.com/

On each individual day we will tweet out the content for each of the 3 entries and use the hashtag #azureadventcalendar

If you need to add any artwork then please use the following image: –

Any questions please reach out to @Gregor_Suttie or @Pixel_Robots via twitter.



Top 20 Azure Influencers

I was thrilled to learn that I’ve been included on Nigel Frank International’s list of the 20 best Microsoft Azure influencers on Twitter.

The line-up was revealed earlier this week, and highlighted a broad cross-section of people from around the world who’ve made a name in the Azure sphere in one way or another.

Included are a host of Microsoft MVPs and personnel, from the firm’s CTO Mark Russinovich and Regional Director Carsten Rachfahl to prolific bloggers, speakers and independent voices in the Azure community, such as Jennelle Crothers and Joanne Klein.

I’m delighted to be included alongside such esteemed professionals, and huge congratulations to everyone who made the list.


To read the full article, follow this link: https://www.nigelfrank.com/blog/top-20-microsoft-azure-influencers-on-twitter/




Azure Security Exam – AZ-500 Study plan

This is my study plan for October for the Azure AZ-500 exam

I’ll be using the EDX course pretty much on its own; I did this for the AZ-400 Azure DevOps exam, and we will see how that goes.

Week 1 – Manage identity and access (20-25%) – Studied for it first week in October.
Week 2 – Implement platform protection (35-40%) – Studied for it second week in October.
Week 3 – Manage security operations (15-20%)
Week 4 – Secure data and applications (30-35%)

Sit the exam

Week 1 – At the end of week 1 I have gone through the entire Manage Identity and Access section on https://openedx.microsoft.com/courses/course-v1:Microsoft+AZ-500.0+2019_T2/course/.



Azure Security articles in September

I decided to make September a month of Azure Security learning for myself. The following is a list of existing articles and also new security articles which I have written: –

  • Azure Policies – Learn what they are and why they are super useful and super easy to setup.
  • Azure Managed Service Identity – Managed Service Identity allows you to securely access your Azure resources and avoid storing credentials in your code.
  • Azure Role-Based Access Control – Role-based access control (RBAC) is a system that provides fine-grained access management of Azure resources.
  • Azure KeyVault – The Azure KeyVault Service is where you store certificate keys, passwords and more instead of having them stored within your application.
  • Azure DevOps Open Source Scan your code – Scan your code for open source vulnerabilities and learn what’s out of date within your project and also what vulnerabilities those versions may contain.
  • Azure DevOps – Secure DevOps Kit for Azure (AzSK) – The “Secure DevOps Kit for Azure” is a collection of scripts, tools, extensions, automations, etc. that caters to the end-to-end Azure subscription and resource security needs of DevOps teams, using extensive automation and smoothly integrating security into native DevOps workflows.
  • Intro to Azure Security – “Introduction to Azure Security” is written to provide a comprehensive look at the security available with Microsoft Azure.
  • Azure security documentation – everything you wanted to know about security within Azure.
  • Azure Api Management using Okta to secure using OAuth 2.0 – use Okta to secure your APIs within Azure API Management

Enjoy!



Azure Api Management using Okta to secure using OAuth 2.0

This blog post will cover how to move an existing or new API into Azure API Management and then secure it using Okta.


Okta – “The Okta Identity Cloud provides secure identity management with Single Sign-On, Multi-factor Authentication, Lifecycle Management (Provisioning), and more”.

I had access to a development tenant within Okta which looks something like this:-



I created a new application, called it ‘Azure API Management’, and chose Web as the platform and OpenID Connect as the sign-on method, like so: –

Now that we have filled this out, we can go back and edit it and see the screen which shows us important details including the Client ID, Secret, and Login redirect URIs, all of which are needed to get this working.


Azure API Management

Within Azure, create a new instance of Azure API Management. Once this has been created, go down the left hand menu and, under Security, select OAuth 2.0 and then select Add; I gave it the name Okta.

The client registration URL is important here; you can find yours within your new application within Okta, under the Sign On tab – look for the section that says OpenID Connect ID Token.

The other details which are very important are as follows (in red)

and further down that screen where you see the ClientID and Client Secret: –


That’s it for Azure, so let’s switch back to Okta.

Now we need to check the Sign On tab and take a note of some important settings


At this stage we haven’t added any APIs to Azure API Management, so let’s do that by following this excellent example: – https://docs.microsoft.com/en-us/azure/api-management/import-and-publish

Once you have imported an API you can test it in a number of ways, including using tools like Postman, but you can also use the API Management developer portal, which you can launch from your Azure API Management instance back in Azure, seen in the link below: –


Now that we have the Developer Portal open, select APIs from the header and then click on the API you imported in a previous step.


Click the Try it button.

To check that things are talking to Okta and try to get a token, change the drop down under the Authorization section from No auth to Authorization code. This will attempt to go off to Okta, and you should see a login prompt for Okta.


Once you enter details and click Sign in, if all is set up correctly you’ll now see something like this:-

We now see when the access token will expire, and at the bottom it shows lots of **** where the access token is added but hidden.

Other things of note

I had to create/edit an assignment (user) within Okta because my user wasn’t set up with a username – so under Assignments within your application, make sure users have a username set up.

Note
The important part here is that you can access APIs in API Management and by default they’ll always just work; the trick is to make them require an Okta token. Inbound policies are the magic that makes this happen.

Lastly, we need to add what’s called an inbound policy to check that the token is valid – otherwise the calls will always succeed, with or without using Okta.

To add an inbound policy, go to your Azure API Management instance within Azure, then the developer portal; select your API, then select All operations (or the API call you wish to secure), and then select Inbound processing, like so:-

Here we have several options for the inbound policy and in this example I chose validate JWT and filled it out as below: –
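
The screenshot isn’t reproduced here, but as a sketch a validate-jwt inbound policy pointed at Okta could look like the snippet below. The Okta domain, authorization server, and audience are placeholders – substitute your own tenant’s OpenID Connect metadata URL and client/audience values.

```xml
<!-- Sketch of an inbound policy validating an Okta-issued JWT.
     {yourOktaDomain} and the audience value are placeholders. -->
<inbound>
  <base />
  <validate-jwt header-name="Authorization"
                failed-validation-httpcode="401"
                failed-validation-error-message="Unauthorized. Token is missing or invalid.">
    <openid-config url="https://{yourOktaDomain}/oauth2/default/.well-known/openid-configuration" />
    <audiences>
      <audience>api://default</audience>
    </audiences>
  </validate-jwt>
</inbound>
```

With this in place, calls without a valid bearer token from Okta return a 401 instead of succeeding silently.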

You can read more here on API management policies.

And that is how you go about integrating Okta with Azure API Management.

Feel free to get in touch if you have any questions.