If you’ve ever stared at a production issue and wondered “who changed this rule, and when?” — you’re not alone. For teams managing business rules in Dynamics 365 and Power Platform, tracking changes has traditionally meant digging through emails, chasing down spreadsheets, and hoping someone remembered to document what they did. There’s a better way.
This post walks through a complete Application Lifecycle Management (ALM) solution that brings full version control, automated deployments, and end-to-end auditability to your North52 business rules — using Azure DevOps as the backbone.

Why Version Control Matters for Business Rules
Business rules aren’t static. Credit card eligibility criteria, approval thresholds, compliance requirements — these change constantly in response to market conditions, regulatory updates, and evolving business needs. The rule that works today may look completely different six months from now.
The danger of managing these changes manually is what we call “black-box deployments”: changes going to production with no clear record of who made them, what exactly changed, or why. In regulated industries, that’s not just inconvenient — it can mean failing a compliance audit.
What teams need is complete traceability: a clear chain from the original business requirement, through development and testing, all the way to production deployment.
The Four-Step ALM Workflow
The solution is a four-step process that, once configured, runs almost entirely on autopilot.
Step 1 — Request
Every change starts as a work item in Azure DevOps. Whether it’s a user story, a bug fix, or a formal change request, each change has a trackable origin before a single line of logic is touched.
Step 2 — Build
A developer or business analyst makes the update directly in the North52 formula editor within the development environment. No complex coding — just a straightforward configuration change to the business logic.
Step 3 — Commit
This is where the automation earns its keep. An Azure DevOps pipeline exports the solution from dev, unpacks it, and commits it to your Git repository. The commit message includes a direct link back to the originating work item, creating a permanent and searchable audit trail.
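As a sketch, the export stage might use the Microsoft Power Platform Build Tools tasks along these lines. The solution name, service connection name, and repo paths here are assumptions for illustration, not the contents of the actual dev-export-sync.yml:

```yaml
# dev-export-sync.yml — illustrative sketch; names and paths are assumptions
trigger: none  # run manually when a change is ready to commit

parameters:
  - name: workItemId   # links the commit back to the originating work item
    type: string

pool:
  vmImage: 'windows-latest'

steps:
  - checkout: self
    persistCredentials: true   # allow the pipeline to push the commit back

  - task: PowerPlatformToolInstaller@2

  # Export the solution from the dev environment
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'dev-service-connection'   # SPN-backed service connection
      SolutionName: 'North52Rules'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/North52Rules.zip'

  # Unpack to component-level XML so Git can diff granular changes
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/North52Rules.zip'
      SolutionTargetFolder: 'solutions-unpacked/North52Rules'

  # Commit with the work item reference (AB#nnn auto-links in Azure DevOps)
  - script: |
      git config user.email "pipeline@example.com"
      git config user.name "Export Pipeline"
      git add solutions-unpacked solutions-archive
      git commit -m "Export North52Rules AB#${{ parameters.workItemId }}"
      git push origin HEAD:main
```

The `AB#` prefix in the commit message is what makes Azure DevOps render the commit as a link on the work item itself.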
Step 4 — Deploy
A separate pipeline picks up the committed code and deploys it as a managed solution — first to test, and then to production when you’re ready. Crucially, the exact same managed solution that was validated in test is what gets deployed to production. No manual steps, no opportunity for human error.
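A hedged sketch of the corresponding deploy pipeline, again with illustrative names. Note that it reads the managed zip straight from the Git repo rather than from any artifact produced by the export pipeline:

```yaml
# deploy-solution.yml — illustrative sketch; names and paths are assumptions
trigger: none   # run on demand; the only input is the Git repo

pool:
  vmImage: 'windows-latest'

steps:
  - checkout: self

  - task: PowerPlatformToolInstaller@2

  # Import the managed zip from the versioned archive committed to the repo
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'test-service-connection'
      SolutionInputFile: 'solutions-archive/North52Rules/managed/North52Rules_1.0.1.0_managed.zip'
```

Pointing the same pipeline at the production service connection deploys the identical zip, which is what guarantees "what was tested is what ships."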
Seeing It in Action: The Platinum Card Example
To make this concrete, consider a real change scenario: updating the minimum total assets threshold for platinum card eligibility. A work item is created in Azure DevOps (Work Item #16), specifying that the threshold for customers with a credit rating between 760 and 770 should increase from $200,000 to $300,000.
A functional consultant opens the North52 Decision Suite and makes the update. No zip files are emailed. No manual exports happen. Instead, the export pipeline is triggered and tagged with Work Item #16 — that tag is the critical link connecting the business request to the technical change.
The pipeline automatically exports the solution, unpacks it, and commits it to source control with a descriptive commit message. The result is an instant, detailed diff showing exactly what changed: the value moved from 200,000 to 300,000. The commit is linked back to the work item, and anyone can see who exported it and when.
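Because the solution is unpacked to XML, the diff for this change collapses to a single value. The element name below is hypothetical (the real North52 schema isn't shown here), but the shape of the diff is the point:

```diff
- <value>200000</value>
+ <value>300000</value>
```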
Deploying to test or production follows the same pipeline-driven process — no manual intervention required.
How the Repository Is Structured
The Azure DevOps repository is organized into three main areas.
Pipelines YAML
Two pipeline definition files — one for exporting from dev (dev-export-sync.yml) and one for deploying to test and production (deploy-solution.yml).
Solutions Archive
Versioned zip files of every solution export, organized by solution name and split into managed and unmanaged subfolders. Every version is retained, which means rolling back is always an option.
Solutions Unpacked
The unpacked XML source files that enable the diff view. When the pipeline exports a solution, it breaks it apart at the component level so Git can track granular changes over time.
This structure gives you the best of both worlds: archive zips for easy deployment and unpacked source control for detailed version history.
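Assuming illustrative folder and file names, the layout might look something like this:

```
├── pipelines/
│   ├── dev-export-sync.yml
│   └── deploy-solution.yml
├── solutions-archive/
│   └── North52Rules/
│       ├── managed/
│       │   └── North52Rules_1.0.1.0_managed.zip
│       └── unmanaged/
│           └── North52Rules_1.0.1.0.zip
└── solutions-unpacked/
    └── North52Rules/
        └── ...
```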
Semantic Versioning and Rollbacks
The export pipeline automatically applies semantic versioning to each solution using a major.minor.patch.0 format. Major and minor numbers are controlled through pipeline variables, while the patch number auto-increments with every run and resets when you bump the major or minor. The result is clean, meaningful version numbers like 1.0.1.0 rather than arbitrary build IDs.
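In Azure DevOps YAML, this auto-increment-and-reset behavior can be expressed with the built-in counter() expression, which is seeded per unique prefix string — change the prefix and the count starts over. The variable names below are assumptions:

```yaml
# Semantic versioning via a pipeline counter — sketch; variable names are assumptions
variables:
  major: 1
  minor: 0
  # counter() is scoped to its prefix string, so the patch number resets
  # automatically whenever major or minor changes
  patch: $[counter(format('{0}.{1}', variables['major'], variables['minor']), 0)]
  solutionVersion: '$(major).$(minor).$(patch).0'
```

Bumping `minor` from 0 to 1 changes the counter prefix from `1.0` to `1.1`, so the next run's patch number starts again at 0.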
The deploy pipeline supports both “deploy latest” and “deploy a specific version” modes. If you specify a version number, the pipeline looks for an exact match in the archive. If that version doesn’t exist, it lists all available versions in the log so you can pick the right one. This makes rollbacks straightforward — you specify the version, and the pipeline handles the rest.
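One way to sketch that version-selection logic as a pipeline step, using PowerShell with hypothetical paths and parameter names:

```yaml
# Sketch of "deploy latest" vs. "deploy a specific version" selection
parameters:
  - name: solutionVersion
    type: string
    default: 'latest'

steps:
  - powershell: |
      $archive = "solutions-archive/North52Rules/managed"
      if ("${{ parameters.solutionVersion }}" -eq "latest") {
        # Newest zip by name — version numbers sort lexically in this scheme
        $zip = Get-ChildItem $archive -Filter *.zip | Sort-Object Name | Select-Object -Last 1
      } else {
        $zip = Get-ChildItem $archive -Filter "*${{ parameters.solutionVersion }}*.zip"
        if (-not $zip) {
          Write-Host "Version not found. Available versions:"
          Get-ChildItem $archive -Filter *.zip | ForEach-Object { Write-Host $_.Name }
          exit 1
        }
      }
      # Hand the chosen file to the import task via a pipeline variable
      Write-Host "##vso[task.setvariable variable=SolutionZip]$($zip.FullName)"
    displayName: 'Resolve solution version'
```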
Decoupled Pipelines: The Key Design Principle
One of the most important aspects of this architecture is that the export and deploy pipelines are completely decoupled. There is no pipeline resource link or artifact dependency between them. The export pipeline writes to the Git repo; the deploy pipeline reads from it. That’s the entire connection.
This means you can export without deploying, deploy an older version without re-exporting, or run both pipelines independently at different times. The Git repository is the single source of truth that ties the whole system together.
Authentication and Service Connections
Pipelines authenticate to Dynamics 365 environments through service connections using Service Principal (SPN) authentication — one each for dev, test, and production. Service Principal authentication is more secure than a named user account because it isn't tied to a specific person, so access doesn't break when someone leaves the organization.
The quickest way to set these up is the Power Platform CLI command pac admin create-service-principal, which registers the app in Entra ID, creates the application user in Dataverse, and assigns the System Administrator role — all in a single command.
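Once the service connections exist, pipeline tasks reference them by name. A minimal connectivity check might look like this (the connection name is an assumption):

```yaml
# Verify the SPN-backed service connection works before running real steps
- task: PowerPlatformToolInstaller@2

- task: PowerPlatformWhoAmi@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'dev-service-connection'   # name of your service connection
```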
Key Benefits at a Glance
Linked Requirements
Every technical change traces back to a business requirement via Azure DevOps work items, creating a clear chain from request to implementation.
Single Source of Truth
The Git repository is always current, always versioned, and always automated — no more chasing down who has the latest version.
Consistent Deployments
The same pipeline and the same managed solution move from dev to test to production. No variation, no manual steps, no human error.
Compliance-Ready Audit Trail
Every change is recorded automatically with who made it, when, what changed, and why. In regulated industries, this can be the difference between passing and failing a compliance audit.
Faster Releases
No waiting for someone to manually export and import solutions. Once configured, this process largely runs itself.