
If you’ve ever wondered what’s actually happening inside an automated ALM pipeline for Dynamics 365 business rules, this post is for you. We’re going beyond the high-level overview and into the technical detail – the repository structure, the YAML pipelines, the service connections, and a live walkthrough of pushing a real change from dev all the way through to test.

We show you how the fictitious Contoso Bank manages its North52 business rules using Azure DevOps, and everything described here can be applied directly to your own environment.


The Repository Structure

Understanding the folder structure of the Git repo is the foundation for everything else. At the top level there are three main folders and a readme.

Pipelines YAML

This folder contains the two pipeline definition files: the export pipeline and the deploy pipeline. Keeping them separate from the actual solution files keeps the repo clean and navigable.

Solutions Archive

This is where exported zip files live. Inside, there is a subfolder for each solution. Opening the business-rules-credit-card folder reveals two subfolders: managed and unmanaged. Every time the pipeline runs, it drops a versioned zip into each – for example, business-rules-credit-card-managed-1.0.5.0.zip.

This archive matters most when something goes wrong. Rolling back a deployment means pointing the deploy pipeline at an older version zip. No rebuilding, no guesswork.

Solutions Unpacked

This is where source control actually happens. When the export pipeline runs, it unpacks the unmanaged solution into individual XML files – folders for entities, workflows, North52 formulas, and all the other components that make up your solution.

Git tracks changes at the file level. When a business rule is updated in dev and the pipeline runs, you can see exactly which files changed, what the old values were, and what the new values are. You get a proper diff – not just two zip files to compare blindly.

The archive and unpacked folders serve different purposes deliberately. The archive holds deployable packages. The unpacked folder is purely for visibility and traceability – the human-readable version of what’s inside those zips.

The Export Pipeline YAML – Step by Step

Pipeline Name and Trigger

The pipeline name includes the target solution variable and today’s date, so each run shows up in history with a meaningful label. Trigger is set to none – this is a manually triggered pipeline. Two parameters are exposed at runtime: commit message and work item ID.
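A minimal sketch of that header, with illustrative variable and parameter names (the $(Rev:.r) suffix is a common extra, not from the original repo, that keeps run names unique when the pipeline runs twice in a day):

```yaml
# Run name: target solution plus today's date
name: $(TargetSolution)-$(Date:yyyyMMdd)$(Rev:.r)

trigger: none   # manual runs only

parameters:
  - name: commitMessage
    displayName: Commit Message
    type: string
  - name: workItemId
    displayName: Work Item ID
    type: string
```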

Variables and Semantic Versioning

Target solution, major, and minor are set as UI pipeline variables – locked down and not editable at runtime. The patch variable is defined in the YAML using a counter function that takes major and minor as its key. Every pipeline run increments patch by one. If you change the major or minor value in the UI, the key changes and the counter resets to zero automatically. The solution version variable combines all three into a clean semantic version: major.minor.patch.0.
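Sketched in YAML using Azure DevOps's built-in counter expression, and assuming major, minor, and TargetSolution are the locked UI variables described above:

```yaml
variables:
  # Counter keyed on major.minor: increments on each run,
  # and resets to the seed (0) whenever the key changes
  patch: $[counter(format('{0}.{1}', variables['major'], variables['minor']), 0)]
  solutionVersion: $(major).$(minor).$(patch).0
```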

Checkout and Publish Customizations

The first stage begins by checking out the repo with persist credentials set to true. Without this, the pipeline can pull the repo but cannot push commits back to it. Power Platform Build Tools are then installed. The publish customizations task connects to the dev environment via the Dataverse Dev service connection and publishes all pending changes – without this step, unpublished developer changes would be missed in the export.
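Using the task names from Microsoft's Power Platform Build Tools, those opening steps might look roughly like this (the 'Dataverse Dev' service connection name follows this post's naming; yours will differ):

```yaml
steps:
  - checkout: self
    persistCredentials: true   # needed to push commits back to the repo

  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformPublishCustomizations@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'Dataverse Dev'
```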

Set Version, Export, and Unpack

The set solution version task stamps the computed version number onto the solution in Dataverse before export, so zip files carry the correct version internally. Two export tasks follow – one with managed set to false (for source control) and one with managed set to true (for deployment). Both use async operations with a 60-minute timeout to accommodate large solutions.
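A hedged sketch of those three tasks, with illustrative output paths:

```yaml
  - task: PowerPlatformSetSolutionVersion@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'Dataverse Dev'
      SolutionName: $(TargetSolution)
      SolutionVersionNumber: $(solutionVersion)

  - task: PowerPlatformExportSolution@2   # unmanaged, for source control
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'Dataverse Dev'
      SolutionName: $(TargetSolution)
      SolutionOutputFile: $(Build.ArtifactStagingDirectory)/$(TargetSolution).zip
      Managed: false
      AsyncOperation: true
      MaxAsyncWaitTime: '60'   # minutes, for large solutions

  - task: PowerPlatformExportSolution@2   # managed, for deployment
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'Dataverse Dev'
      SolutionName: $(TargetSolution)
      SolutionOutputFile: $(Build.ArtifactStagingDirectory)/$(TargetSolution)_managed.zip
      Managed: true
      AsyncOperation: true
      MaxAsyncWaitTime: '60'
```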

The unpack task then extracts the unmanaged zip into the solutions unpacked folder, with overwrite files set to true so each export cleanly replaces the previous unpacked state. This is what produces meaningful diffs in Git.
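As a sketch, with the target folder name assumed to match the repo structure described earlier:

```yaml
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: $(Build.ArtifactStagingDirectory)/$(TargetSolution).zip
      SolutionTargetFolder: solutions-unpacked/$(TargetSolution)
      SolutionType: Unmanaged
      OverwriteFiles: true   # each export cleanly replaces the previous state
```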

PowerShell: Archive and Versioned Filenames

Two PowerShell steps follow. The first ensures the archive directory structure exists. The second copies the exported zip files from the staging directory into the archive with versioned filenames – so business-rules-credit-card-managed.zip becomes business-rules-credit-card-managed-1.0.3.0.zip in the managed subfolder. If either source file is missing, the script throws an error and stops immediately.
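A minimal sketch of those two steps combined into one script task, assuming the folder names used in this post:

```yaml
  - powershell: |
      # Ensure the archive directory structure exists
      $archive = "solutions-archive/$(TargetSolution)"
      New-Item -ItemType Directory -Force -Path "$archive/managed", "$archive/unmanaged" | Out-Null

      # Fail fast if either exported zip is missing from staging
      $managed   = "$(Build.ArtifactStagingDirectory)/$(TargetSolution)_managed.zip"
      $unmanaged = "$(Build.ArtifactStagingDirectory)/$(TargetSolution).zip"
      if (-not (Test-Path $managed) -or -not (Test-Path $unmanaged)) {
        throw "Exported zip file(s) missing from the staging directory."
      }

      # Copy into the archive with versioned filenames
      Copy-Item $managed   "$archive/managed/$(TargetSolution)-managed-$(solutionVersion).zip"
      Copy-Item $unmanaged "$archive/unmanaged/$(TargetSolution)-unmanaged-$(solutionVersion).zip"
    displayName: 'Archive solutions with versioned filenames'
```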

Git Commit and Work Item Linking

The Git commit step sets the Git identity to whoever triggered the pipeline, so commits show the actual person’s name rather than a generic service account. It stages all changes and builds the commit message by combining the input message, the work item ID prefixed with a hash, and the solution version – for example: Updated threshold #15-v1.0.3.0.

A safety check skips the commit entirely if there are no actual changes. The step also won’t run during a pull request build – only on direct pipeline runs. A final step publishes the solution zips as pipeline artifacts, providing a backup download option.
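A sketch of the commit step, using the predefined Build.RequestedFor variables for the Git identity (branch handling simplified):

```yaml
  - powershell: |
      git config user.name  "$(Build.RequestedFor)"
      git config user.email "$(Build.RequestedForEmail)"
      git add --all

      # Safety check: skip the commit entirely if nothing changed
      if (git status --porcelain) {
        git commit -m "${{ parameters.commitMessage }} #${{ parameters.workItemId }}-v$(solutionVersion)"
        git push origin HEAD:$(Build.SourceBranchName)
      } else {
        Write-Host "No changes to commit."
      }
    displayName: 'Commit and push changes'
    condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))

  - publish: $(Build.ArtifactStagingDirectory)
    artifact: solution-zips   # backup download option
```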

The Deploy Pipeline YAML – Step by Step

Parameters and Runtime Control

Three parameters are exposed when running the pipeline. Deploy to Test and Deploy to Production are booleans that render as checkboxes. Solution Version is a text field defaulting to “latest.” Target solution is again set as a UI variable.
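Sketched as YAML runtime parameters (names illustrative):

```yaml
parameters:
  - name: deployToTest
    displayName: Deploy to Test
    type: boolean
    default: false
  - name: deployToProd
    displayName: Deploy to Production
    type: boolean
    default: false
  - name: solutionVersion
    displayName: Solution Version
    type: string
    default: latest
```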

Version Resolution

Inside each stage, after checking out the repo, a PowerShell script resolves which zip file to deploy. If a specific version is entered (e.g. 1.0.2.0), it looks for an exact match in the solutions archive managed folder. If that file doesn’t exist, it lists all available versions in the log output rather than failing silently. If the version is left at the default “latest,” it scans the folder, sorts the files, and grabs the most recent one. The resolved zip path is then set as a pipeline variable for the import step.
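A hedged sketch of that resolution script and the import step it feeds. The sort here parses the version out of the filename so 1.0.10.0 correctly sorts after 1.0.9.0 (an assumption about the intent; the original script's exact sort logic isn't shown):

```yaml
  - powershell: |
      $folder = "solutions-archive/$(TargetSolution)/managed"
      if ("${{ parameters.solutionVersion }}" -eq "latest") {
        # Pick the highest version by parsing it out of the filename
        $zip = Get-ChildItem "$folder/*.zip" |
          Sort-Object { [version]($_.BaseName -replace '^.*-managed-', '') } |
          Select-Object -Last 1
      } else {
        $path = "$folder/$(TargetSolution)-managed-${{ parameters.solutionVersion }}.zip"
        if (-not (Test-Path $path)) {
          Write-Host "Version not found. Available versions:"
          Get-ChildItem "$folder/*.zip" | ForEach-Object { Write-Host "  $($_.Name)" }
          throw "No zip for version ${{ parameters.solutionVersion }}."
        }
        $zip = Get-Item $path
      }
      # Hand the resolved path to the import task as a pipeline variable
      Write-Host "##vso[task.setvariable variable=resolvedZip]$($zip.FullName)"
    displayName: 'Resolve solution zip'

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: 'Dataverse Test'   # the prod stage uses 'Dataverse Prod'
      SolutionInputFile: $(resolvedZip)
      AsyncOperation: true
      MaxAsyncWaitTime: '60'
```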

Production Stage Safety

The production stage depends on the test stage. If both checkboxes are ticked, production waits for test to succeed before running – if test fails, production is automatically skipped and your production environment is protected. If only the production checkbox is ticked, the dependency is satisfied immediately and production runs independently. You get safety when deploying to both, and flexibility when you only need one.
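One way to express that in YAML is a stage condition that accepts either a succeeded or a skipped test stage (a sketch; stage and parameter names are assumptions):

```yaml
stages:
  - stage: DeployTest
    condition: eq('${{ parameters.deployToTest }}', 'true')
    jobs:
      - job: Test
        steps: []   # checkout, resolve version, import, publish

  - stage: DeployProd
    dependsOn: DeployTest
    # Run when requested, and only if Test succeeded or was skipped entirely
    condition: >-
      and(eq('${{ parameters.deployToProd }}', 'true'),
          in(dependencies.DeployTest.result, 'Succeeded', 'Skipped'))
    jobs:
      - job: Prod
        steps: []   # same steps against the Dataverse Prod connection
```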

Service Connections

Three service connections are configured in Azure DevOps project settings: Dataverse Dev, Dataverse Test, and Dataverse Prod. Each uses the server URL, tenant ID, application ID, and a client secret. The export pipeline uses the Dev connection. The deploy pipeline uses Test and Prod depending on which stages run.

Live Walkthrough: Lowering the Maximum Credit Score Threshold

Work Item #15 specifies that the maximum required credit score for an eligibility rule should be lowered from 850 to 845. The change has been made in the North52 formula editor in the dev environment. Now it needs to move through the pipeline.

Running the Export Pipeline

In the Dev Export Git Sync pipeline, the variables confirm the target solution and current version numbers. Running the pipeline prompts for a commit message (“Lowered max credit score from 850 to 845”) and the work item ID (15). The pipeline then runs every step in sequence: publish customizations, compute version, export both solutions, unpack, archive with versioned filenames, commit to Git with the linked work item, and publish artifacts.

Reviewing the Git Commit and Diff

In the repo, the commit appears exactly as expected. The diff view shows the unpacked formula file with the old value (850) in red and the new value (845) in green. The commit details show who ran the pipeline, and the linked work item is visible. Clicking through to the work item confirms the commit has been automatically connected there too – full traceability in both directions.

Deploying to Test

With the test checkbox selected and the version left at “latest,” the deploy pipeline checks out the repo, resolves the most recent managed zip from the archive, imports the solution into the test environment via the Dataverse Test service connection, and publishes customizations. The pipeline summary confirms only the test stage ran – production was skipped as expected.

What Was Achieved End to End

From a single work item to a deployed test environment, every step is automated, linked, and auditable:

  • Requirement captured in Azure DevOps as a work item.
  • Change made in the North52 formula editor in dev.
  • Export pipeline triggered – solution exported, versioned, unpacked, archived, and committed to Git with the work item linked.
  • Diff visible in Git showing exactly what changed.
  • Deploy pipeline triggered – managed solution imported to test from the archive, no manual steps.

This is what enterprise-grade ALM looks like for Dynamics 365 Power Platform business rules.