AI Governance Institute

Practical Governance for Enterprise AI


Model Deployment Gate Process

Require formal approval before new model versions, prompt changes, or configuration updates are deployed to production AI systems.

Objective

Prevent untested or unapproved changes from reaching production and causing harm or compliance failures.

Maturity Levels

1. Initial: Model changes are deployed without formal review or approval.

2. Developing: Informal review exists for major changes, but minor changes (prompts, configs) are deployed without gates.

3. Defined: All production changes require documented testing results and sign-off before deployment.

4. Managed: Deployment gate metrics (approval rate, rejection reasons, cycle time) are tracked and reported.

5. Optimizing: Gate criteria are continuously refined based on post-deployment incident data.

Evidence Requirements

What an auditor or assessor would expect to see for this control.

  • Deployment gate checklist showing all required tests passed and approvals obtained before each production deployment
  • Test result reports for evaluation, bias assessment, and security tests executed as gate conditions
  • Approval records with approver identity, timestamp, and scope for each deployment event
  • Gate exception records for any deployment that bypassed standard criteria, with risk acceptance sign-off
  • Deployment log showing version promoted, timestamp, deployer identity, and target environment

Implementation Notes

Key steps

  • Define which change types require which level of gate: model version changes, prompt changes, and configuration changes may warrant different levels of scrutiny.
  • Require evaluation results as a gate artifact: minimum performance thresholds, bias checks, and adversarial robustness scores must be documented before approval.
  • Include business sign-off alongside technical approval for high-risk systems — the person responsible for the business outcome should approve production changes.
  • Log deployment events with the approver, timestamp, change description, and evaluation evidence — this is your change management audit record.
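The steps above can be sketched as a simple gate-check routine. This is a minimal illustration, not a prescribed implementation: the record fields, the 5% hallucination threshold, and the required approval roles are assumptions chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DeploymentGateRecord:
    """Hypothetical gate submission; field names are illustrative."""
    change_description: str
    hallucination_rate: float     # fraction of eval samples flagged as hallucinations
    injection_suite_passed: bool  # adversarial prompt-injection suite result
    approvals: dict = field(default_factory=dict)  # role -> approver name

# Assumed approval roles for a high-risk, customer-facing system.
REQUIRED_ROLES = {"technical", "business"}

def evaluate_gate(record: DeploymentGateRecord):
    """Return (approved, blocking_reasons) for a proposed deployment."""
    reasons = []
    if record.hallucination_rate > 0.05:  # example threshold
        reasons.append(
            f"hallucination rate {record.hallucination_rate:.1%} exceeds 5%"
        )
    if not record.injection_suite_passed:
        reasons.append("prompt injection suite failed")
    missing = REQUIRED_ROLES - record.approvals.keys()
    if missing:
        reasons.append(f"missing approvals: {sorted(missing)}")
    return (not reasons, reasons)
```

A change that meets every criterion passes; any single failed criterion blocks the deployment and the reasons become the rejection record that feeds the gate metrics described above.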

Example Implementation

AI platform team deploying a prompt-based summarization model to a customer-facing product

Deployment Gate Checklist — Summarization Model v1.4

Required artifacts (must be attached before gate review):

  • Model/prompt change description (what changed and why)
  • Evaluation results: ROUGE scores, hallucination rate, 50-sample human eval
  • Adversarial test results: prompt injection suite (pass/fail)
  • Bias/fairness check results (for user-facing features)
  • Rollback plan and tested rollback time
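A checklist like this can be verified mechanically before the gate review meeting. The artifact keys below mirror the bullet list above and are illustrative, not a fixed schema.

```python
# Required gate artifacts, mirroring the checklist above (key names are assumed).
REQUIRED_ARTIFACTS = [
    "change_description",
    "evaluation_results",
    "adversarial_test_results",
    "bias_fairness_results",
    "rollback_plan",
]

def missing_artifacts(attached: dict) -> list:
    """Return the required artifacts that are absent or empty."""
    return [name for name in REQUIRED_ARTIFACTS if not attached.get(name)]
```

Running this check when a change request is filed surfaces the "missing required artifact" blocking criterion before reviewers spend time on the submission.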

Gate approvals required:

  Approver    Role                                  Sign-off Date
  J. Rivera   ML Lead (technical sign-off)          ____________
  S. Chen     Product Lead (business sign-off)      ____________
  M. Okafor   Compliance (if risk tier = Critical)  ____________

Blocking criteria: Any of the following blocks deployment until resolved:

  • Hallucination rate > 5% on evaluation set
  • Any open Critical security finding
  • Missing required artifact

Log entry: Deployment event logged with approver names, artifact versions, and gate decision timestamp
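Such a log entry might be emitted as structured JSON so it can serve as the audit record. A minimal sketch follows; the field names and values are illustrative assumptions, not a mandated format.

```python
import json
from datetime import datetime, timezone

# Illustrative deployment log entry; field names and values are assumptions.
entry = {
    "event": "model_deployment",
    "model_version": "summarization-v1.4",
    "target_environment": "production",
    "deployer": "ci-pipeline",
    "gate_decision": "approved",
    "approvers": ["J. Rivera", "S. Chen"],
    "artifact_versions": {
        "eval_report": "eval-report-v1.4",
        "injection_suite": "injection-suite-v1.4",
    },
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
log_line = json.dumps(entry, sort_keys=True)
```

Keeping the entry machine-readable makes it straightforward to compute the approval-rate and cycle-time metrics called for at the Managed maturity level.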

Control Details

Control ID
CHM-002
Typical owner
AI Engineering / AI Governance Team
Implementation effort
Medium
Agent-relevant
Yes

Tags

deployment gate · change control · approval process · MLOps