Welcome to Part 2 of a series of articles co-authored by Dylan Haskins and Eugene Van Staden, covering our thoughts, strategies and tools for ALM and DevOps for the Power Platform and PowerApps Portals.
In Part 1 : ALM Fundamentals and Maturity we covered the fundamentals of Application Lifecycle Management (ALM) and provided a maturity model and checklist. The referenced white paper covers the topics of Solutions, Publishers, Layering and Patching in depth (so we won't cover them again here).
In our original planning of these articles we were going to gradually share each of the components of our ALM process, culminating in a complete solution. However, as we worked through it with some of our colleagues, we decided that a far better approach would be to share the complete solution and all its magical goodness upfront, followed by some day-to-day usage examples for Consultants and Developers, so that you can immediately get access to all the benefits.
We will then, over a series of articles, deconstruct the entire toolset and deep-dive into each of the components, for those readers who wish to gain a complete understanding of all the moving parts.
The series consists of the following parts:
- Part 1 : ALM Fundamentals and Maturity
- Part 2 : Our ALM Framework and Provisioning Tool
- Part 3 : Day-to-Day ALM for Consultants
- Part 4 : Day-to-Day ALM for Developers
- Part 5 : Deconstructing the Framework : Solutions Project
- Part 6 : Deconstructing the Framework : ADO Pipeline
Our ALM Framework and Provisioning Tool
We have made the complete source code for our Framework and Provisioning tool available at https://github.com/dylanhaskins/PowerPlatformCICD.
Our aim in putting together this framework was to make it as simple, automated and streamlined as possible to stand up a fully integrated continuous deployment environment that meets all of the elements outlined in the ALM Maturity Model for the Microsoft Power Platform.
To get going with the Framework all you need is the following :
- A basic understanding of Azure DevOps (you can sign up for an account at https://dev.azure.com)
- Basic Consultant or Developer experience of the Microsoft Power Platform / Dynamics 365 (https://make.powerapps.com)
- The ability to open Windows PowerShell and copy / paste 😀
- You will also need a Power Apps / Dynamics 365 Development environment (Create one via the Power Platform Admin centre)
- Additionally, you will need a Power Apps / Dynamics 365 environment that your solutions will be continuously deployed to. We prefer a separate “Deployment Staging” environment, as we will be deploying to it continuously and automatically; from there we can add an approval step to deploy to additional environments such as Test, Pre-Production and Production. (Create one via the Power Platform Admin centre)
Side Note : Creating these additional environments has become a lot simpler with the introduction of capacity-based instances in the Power Platform; you no longer require a license for each Sandbox / Production environment, only 1 GB of free storage capacity to create an environment.
To get started you just need to perform the following steps :
- Right click on Start and select Windows PowerShell (Admin)
- Copy and Paste the following lines of code into the PowerShell window :
Set-ExecutionPolicy Unrestricted -Force
(Invoke-WebRequest https://raw.githubusercontent.com/dylanhaskins/PowerPlatformCICD/master/Provision.ps1 -UseBasicParsing:$true).Content | Out-File .\Provision.ps1
.\Provision.ps1
- Press Enter
This will launch the Provisioning tool and the Welcome screen / Instructions.
Note : Aside from a few questions asked in the script (credentials, Git repository name, etc.), this is a fully automated script that performs all of these tasks for you.
You will then be asked for the name of your Azure DevOps organisation (which you should have signed up for earlier if you didn't already have one). Just enter the NAME, i.e. the last part of the URL https://dev.azure.com/name
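If you're ever unsure which part of the URL is the organisation name, this tiny snippet (ours, purely illustrative, not part of the framework) shows the extraction:

```python
# Illustration only: the organisation NAME is the first path segment of
# your Azure DevOps URL, e.g. https://dev.azure.com/contoso -> contoso
from urllib.parse import urlparse

def devops_org_name(url: str) -> str:
    """Return the organisation segment of an Azure DevOps URL."""
    return urlparse(url).path.strip("/").split("/")[0]

print(devops_org_name("https://dev.azure.com/contoso"))  # -> contoso
```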
You will then be directed to a browser to log in to your Azure DevOps organisation.
Minimise the browser window and continue with the PowerShell script.
You will then be asked if you would like to Create a New Project in Azure DevOps or connect to an Existing one. We are going to create a new one called Power Platform DevOps.
Note : The default setting for the new project will apply the Scrum template and select Git as the source control framework; you can change the template by modifying the PowerShell script.
You will then be asked for a name for the Git repository that should be created for the ALM Framework to use. We'll create one called Power Platform Repo.
In the background, a new Git repository will be created, and the DevOps source code template from the repository linked at the beginning of this article will be automatically imported into your Azure DevOps repo, as well as cloned locally to your root drive in a folder called \Dev\Repos\<Name of your Repository>.
The script will then also provision an Azure AD app registration, which is required to run the PowerApps Solution Checker as part of our pipeline. This new app registration is called “Power App Checker”; more on this in Part 6 : Deconstructing the Framework : ADO Pipeline.
Next you will be asked to Connect to your CDS / Dynamics 365 Tenant.
Enter the username and password for your Developer or Service account that has system administrator permissions to the Development and Deployment Staging environments (that you created earlier).
You will then be presented with a list of available Environments in your Tenant.
Select your Development environment, in our case 0 (zero).
You can then either connect to an existing unmanaged solution (if you have already started customising) or create a new solution (along with a new publisher). We will choose to create a new solution called Power Platform DevOps.
Next, select your Deployment Staging environment, in our case 1 (one).
From here the script carries on by itself and does a number of things, each of which will be covered in more detail in our articles that deconstruct the Framework. The rest of the tasks that happen in the background are :
- Unmanaged solution version number will be updated
- Unmanaged and Managed versions of the solution will be exported, unpacked using Solution Packager and committed to source control
- Your environment-specific details will be injected into your source control repository's configuration (to enable the tools to work automatically going forward)
- Two variable groups will be created in Azure DevOps that contain the URLs, Usernames and Passwords (stored as secrets) for your Development and your Deployment staging environments
- These variable groups can be found in Azure DevOps under Pipelines -> Library
- An automated Azure DevOps build and deploy pipeline will be created and run using the settings you have previously provided
- Two browser windows will be opened
- Your Azure DevOps repository
- Your Azure DevOps Pipeline
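To give a flavour of the first of these background tasks, here is a minimal sketch of bumping the revision segment of a solution version string. This is purely illustrative (the framework's own implementation is in its PowerShell scripts, and the function name here is ours); it just shows the idea of incrementing the last segment of a major.minor.build.revision version:

```python
# Illustrative sketch (not the framework's actual code) of bumping the
# revision segment of a CDS / Dynamics 365 solution version string.
def bump_solution_version(version: str) -> str:
    """Increment the last (revision) segment of 'major.minor.build.revision'."""
    parts = version.split(".")
    parts[-1] = str(int(parts[-1]) + 1)
    return ".".join(parts)

print(bump_solution_version("1.0.0.3"))  # -> 1.0.0.4
```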
In the Pipeline browser window you will see the newly created pipeline running the automated build for our “Initial Commit”.
Note : If your view looks slightly different from the screenshots here, it is because we have enabled the preview feature for multi-stage pipelines, which changes the way pipelines are viewed and gives greater flexibility. You can enable this from the Preview Features menu on the top right in Azure DevOps.
Opening up the Pipeline will show the different stages, currently “Build Stage” and “Deployment” and their progress.
You can click on the Build Job to see its progress.
Here you can see our Build and Deployment jobs completed successfully, which means that our Solution has been successfully packed, checked and deployed to our Deployment Staging environment.
At a high level, the following steps are performed :
- Checkout our latest source code
- Update the Azure DevOps Pipeline build number to match our CDS / D365 Solution version number (this gives us great traceability)
- Build the Solution
- Run our Unit and UI Tests (none currently exist, but failures here would fail the pipeline and ensure regressions are minimised)
- Pack the Solution file and run it against the PowerApps Solution Checker service (any critical issues detected here will fail the build; lower-severity issues will be logged as warnings for review)
- Archive any reference data that the Framework exported (using the Configuration Migration Utility)
- Prepare the artifacts for Deployment
- If the Build stage is successful, the Deployment stage will trigger, which deploys the Solution (and data) to our Deployment Staging environment
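The stages and steps above correspond to a multi-stage Azure Pipelines definition. The sketch below is illustrative only: the stage, job, environment and variable names are our assumptions, not the framework's actual azure-pipelines.yml. It also shows the ##vso[build.updatebuildnumber] logging command, which is how a script step can set the build number to match the solution version:

```yaml
# Illustrative sketch only; names and steps are assumptions, not the
# framework's actual pipeline definition.
trigger:
  - master

stages:
  - stage: Build
    jobs:
      - job: BuildJob
        steps:
          - checkout: self
          - script: echo "##vso[build.updatebuildnumber]$(SolutionVersion)"
            displayName: Match build number to solution version
          # ... build the solution, run tests, pack it, run Solution Checker,
          # archive reference data and publish the deployment artifacts ...
  - stage: Deployment
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: DeployStaging
        environment: DeploymentStaging
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "Deploy the solution and data to Deployment Staging"
```

The Deployment stage's condition on the Build stage is what gives the "failures here would fail the pipeline" behaviour described above.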
That concludes Part 2. In Part 3 : Day-to-Day ALM for Consultants we will take a look at how a Consultant might use the tools available to participate in the DevOps process and get their customisations and data deployed automatically.