
Power Platform DevOps : Part 2 – ALM Framework and Provisioning Tool


Welcome to Part 2 of a series of Articles co-authored by Dylan Haskins and Eugene Van Staden covering our thoughts, strategies and tools for ALM and DevOps for the Power Platform and PowerApps Portals.

In Part 1 : ALM Fundamentals and Maturity we covered the fundamentals of Application Lifecycle Management (ALM) and provided a maturity model and checklist. The referenced white paper covers the topics of Solutions, Publishers, Layering and Patching in depth, so we won’t repeat them here.

In our original planning of these articles we were going to share each component of our ALM process gradually, culminating in a complete solution. As we worked through it with some of our colleagues, however, we decided that a far better approach would be to share the complete solution and all its magical goodness upfront, followed by some day-to-day usage examples for Consultants and Developers, so that you can get access to all the benefits immediately.

We will then, over a series of articles, deconstruct the entire toolset and deep-dive into each of the components, for those readers who wish to gain a complete understanding of all the moving parts.


Our ALM Framework and Provisioning Tool

We have made the complete source code for our Framework and Provisioning tool available here.

Our aim in putting together this framework was to make it as simple, automated and streamlined as possible, to stand up a fully integrated continuous deployment environment that meets all of the elements outlined in the ALM Maturity model for the Microsoft Power Platform

To get going with the Framework all you need is the following :

  • A basic understanding of Azure DevOps (sign up for a free account if you don’t already have one)
  • Basic Consultant or Developer experience of the Microsoft Power Platform / Dynamics 365
  • The ability to open Windows PowerShell and copy / paste 😀
  • You will also need a Power Apps / Dynamics 365 Development environment (Create one via the Power Platform Admin centre)
  • Additionally you will need a Power Apps / Dynamics 365 environment that your solutions will be continuously deployed to. We prefer to have a separate “Deployment Staging” environment, since we will be deploying to it continuously and automatically; from there an approval step can deploy to additional environments like Test, Pre-Production and Production. (Create one via the Power Platform Admin centre)

Side Note : Creating these additional environments has become a lot simpler with the introduction of capacity-based instances in the Power Platform. You no longer require a license for each Sandbox / Production environment, only 1 GB of free storage capacity to create an environment.

To get started you just need to perform the following steps :

  • Right click on Start and select Windows PowerShell (Admin)
  • Copy and Paste the following lines of code into the PowerShell window :
Set-ExecutionPolicy Unrestricted -Force
(Invoke-WebRequest -UseBasicParsing:$true -Uri "<raw URL of Provision.ps1 from the repository linked above>").Content | Out-File .\Provision.ps1
.\Provision.ps1
  • Press Enter

This will launch the Provisioning tool and the Welcome screen / Instructions.

Note : Aside from a few prompts in the script (credentials, Git repository name, etc.) this is a fully automated script that performs all of these tasks for you.

After pressing “Enter” to continue, the first thing the script will do is check for the pre-requisites and install any that are missing. These are Git for Windows and the Azure CLI.
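As a rough illustration, a prerequisite check of this kind can be sketched in shell (the actual script is PowerShell and also installs anything missing; `check_tool` is a hypothetical helper, not the framework's code):

```shell
#!/bin/sh
# Sketch of a prerequisite check similar in spirit to what the
# provisioning script does for Git and the Azure CLI.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing"
  fi
}

check_tool sh
check_tool definitely-not-a-real-tool
```

The real script goes one step further and downloads and installs any tool it reports as missing before continuing.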

You will then be asked for the name of your Azure DevOps organisation (which, if you didn’t already have one, you should have signed up for earlier). Just enter the NAME, i.e. the last part of the organisation URL, not the full URL.
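Since the prompt expects just the organisation name, pasting the full URL will cause errors later on. A small sketch of how such an input could be normalised (a hypothetical `org_name` helper, not part of the actual provisioning script):

```shell
#!/bin/sh
# Accept either a bare organisation name or a pasted
# Azure DevOps URL and return just the name.
org_name() {
  v="${1%/}"        # drop a trailing slash
  v="${v#*://}"     # drop the scheme if a full URL was pasted
  case "$v" in
    */*)
      v="${v#*/}"   # drop the host
      v="${v%%/*}"  # keep only the first path segment
      ;;
  esac
  printf '%s\n' "$v"
}

org_name "https://dev.azure.com/contoso"   # contoso
org_name "contoso"                         # contoso
```

If the script only receives the bare name, it can build the organisation URL itself consistently everywhere it is needed.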

You will then be directed to a browser to login to your Azure DevOps organisation.

Minimise the browser window and continue with the PowerShell script.

You will then be asked if you would like to Create a New Project in Azure DevOps or connect to an Existing one. We are going to create a new one called Power Platform DevOps.

Note : The default setting for the new project will apply the SCRUM template and select Git as the source control system; you can change the template by modifying the PowerShell script.

You will then be asked for a name for the Git Repository that should be created for the ALM Framework to use. We’ll create one called Power Platform Repo

In the background a new Git repository will be created, and the DevOps source code template from the repository linked at the beginning of this article will be automatically imported into your Azure DevOps repo. It will also be cloned locally to your root drive, in a folder called \Dev\Repos\<Name of your Repository>.

The script will then also provision an Azure AD app registration, which is required to run the PowerApps Solution Checker as part of our pipeline. This new app registration is called “Power App Checker”; more on this in Part 6 : Deconstructing the Framework : ADO Pipeline.

Next you will be asked to Connect to your CDS / Dynamics 365 Tenant.

Enter the Username and Password for your Developer Account or a Service Account that has System Administrator permissions on the Development and Deployment Staging environments (that you created earlier).

You will then be presented with a list of the available Environments in your Tenant.

Select your Development environment, in our case 0 (Zero).

You can then either connect to an existing unmanaged solution (if you have already started customising) or you can create a new solution (along with a new publisher). We will choose to Create a new solution called Power Platform DevOps

Next select your Deployment Staging environment, in our case 1 (One)

From here the script carries on by itself and does a number of things, each of which will be covered in more detail in our articles that deconstruct the Framework. The rest of the tasks that happen in the background are :

  • Unmanaged solution version number will be updated
  • Unmanaged and Managed versions of the solution will be exported, unpacked using Solution Packager and committed to source control
  • Your environment-specific details will be injected into the configuration files in your source control repository (to enable the tools to work automatically going forward)
  • Two variable groups will be created in Azure DevOps that contain the URLs, Usernames and Passwords (stored as secrets) for your Development and your Deployment staging environments
    • These variable groups can be found in Azure DevOps under Pipeline -> Library
  • An automated Azure DevOps build and deploy pipeline will be created and run using the settings you have previously provided
  • Two browser windows will be opened
    • Your Azure DevOps repository
    • Your Azure DevOps Pipeline
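The version update in the first step above typically means incrementing the last (revision) segment of the solution version. A minimal sketch of that arithmetic (illustrative only; the framework's actual logic may differ):

```shell
#!/bin/sh
# Bump the revision (last) segment of a four-part
# solution version number, e.g. 1.0.0.4 -> 1.0.0.5
bump_revision() {
  stem="${1%.*}"    # e.g. 1.0.0
  rev="${1##*.}"    # e.g. 4
  printf '%s.%s\n' "$stem" "$((rev + 1))"
}

bump_revision "1.0.0.4"   # 1.0.0.5
```

Keeping this number in lock-step between the solution and the pipeline build number is what gives the traceability mentioned later in the article.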

In the Pipeline browser window you will see the newly created pipeline running the automated build for our “Initial Commit”.

Note : If your view looks slightly different to the screenshots here, it is because we have enabled the preview feature for Multi-Stage Pipelines, which changes the way pipelines are viewed and gives them greater flexibility. You can enable this from the Preview Features menu on the top right in Azure DevOps.

Opening up the Pipeline will show the different stages, currently “Build Stage” and “Deployment” and their progress.

You can click on the Build Job to see its progress

Here you can see our Build and Deployment Jobs completed successfully; this means that our Solution has been successfully packed, checked and deployed to our Deployment Staging environment.

At a high level, the following steps are performed :

  • Checkout our latest source code
  • Update the Azure DevOps Pipeline build number to match our CDS / D365 Solution version number (this gives us great traceability)
  • Build the Solution
  • Run our Unit and UI Tests (none currently exist, but failures here would fail the pipeline and ensure regressions are minimised)
  • Pack the Solution file and run it against the PowerApps Solution Checker service (any Critical issues detected here will fail the Build; lower level issues will be logged as warnings for review)
  • Archive any reference data that the Framework exported (using the Configuration Migration Utility)
  • Prepare the artifacts for Deployment
  • If the build stage is successful the Deployment stage will trigger which will Deploy the Solution (and Data) to our Deployment Staging environment
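The high-level steps above map naturally onto a two-stage multi-stage pipeline. A skeleton of what such a build.yaml can look like (an illustrative sketch only; the stage, job and environment names here are assumptions, not the framework's actual file):

```yaml
# Illustrative two-stage skeleton of an Azure DevOps pipeline
stages:
- stage: Build
  jobs:
  - job: BuildSolution
    steps:
    - checkout: self
    - task: NuGetToolInstaller@0
      inputs:
        versionSpec: 5.5.1
    # build, run tests, pack the solution,
    # run the PowerApps Solution Checker, publish artifacts

- stage: Deployment
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeployToStaging
    environment: Deployment-Staging
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
          # import the managed solution and reference data
```

Because the Deployment stage depends on Build and only runs on success, a failed test or a Critical Solution Checker issue stops anything from reaching the staging environment.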

That concludes Part 2. In Part 3 : Day-to-Day ALM for Consultants we will take a look at how a Consultant might use the tools available to participate in the DevOps process and get their customisations and data deployed automatically.

6 replies on “Power Platform DevOps : Part 2 – ALM Framework and Provisioning Tool”

During setup the build job fails at the NuGet restore step. An error is thrown:

Error parsing solution file at d:\a\1\s\DemoAgileProject.sln: Exception has been thrown by the target of an invocation. The project file could not be loaded. Could not load file or assembly ‘Microsoft.Build.Framework, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a’ or one of its dependencies. The system cannot find the file specified. d:\a\1\s\DemoAgileProject.sln

Thanks for reaching out. Unfortunately there was a recent update to the build agents that requires a newer version of NuGet to be specified.
You just need to update the NuGet Installer part of the build.yaml file to the following (it has been updated on GitHub now as well) :

- task: NuGetToolInstaller@0
  displayName: 'Use NuGet 5.5.1'
  inputs:
    versionSpec: 5.5.1

Hey Dylan – great write-up!

I’m running into an issue at the point where I would create a project in my DevOps environment and am wondering if you would have any insight as to a solution or workaround? Here’s what I’m seeing:

Would you like to [C]reate a new Azure DevOps Project or [S]elect and existing one (Default [S]): C
Please enter the Name of the Project you wish to Create: PPDevOps
Creating DevOps Project PPDevOps
ValidationError: A potentially dangerous Request.Path value was detected from the client (:).
ValidationError: A potentially dangerous Request.Path value was detected from the client (:).
ValidationError: A potentially dangerous Request.Path value was detected from the client (:).
Cloning Git Repo PPDevOps locally
If prompted for credentials, enter the same credentials you used for
fatal: repository ‘\Dev\Repos\PPDevOps’ does not exist
Confirming Git User Details
Cleaning up Git Repository
error: pathspec ‘master’ did not match any file(s) known to git
Initialized empty Git repository in C:/Users/scott/Downloads/PowerPlatformDevOps/.git/
warning: LF will be replaced by CRLF in Provision.ps1.
The file will have its original line endings in your working directory
warning: LF will be replaced by CRLF in Provision_Full.ps1.
The file will have its original line endings in your working directory
usage: git remote add [<options>] <name> <url>

    -f, --fetch           fetch the remote branches
    --tags                import all tags and associated objects when fetching
                          or do not fetch any tag at all (--no-tags)
    -t, --track <branch>  branch(es) to track
    -m, --master <branch> master branch
    --mirror[=(push|fetch)]
                          set up remote as a mirror to push to or fetch from

Connecting to Power Platform
Press Enter to Connect to your CDS / D365 Tenant or [Q]uit:

Thanks so much for the response, Dylan!

The updated tooling is very nice! I am still encountering the same error about a ‘potentially dangerous path’ when trying to create a project in ADO. I’ll take a deeper dive on things and see if I can figure it out. Assuming I do, I’ll be sure to report back the resolution.


Hey Scott,

What did you specify when asked for your Azure DevOps Organisation ? This must just be the name (the last part of the URL) and not the full URL; if you specified the full URL it will error.
