AI Governance Institute

Practical Governance for Enterprise AI

Change Management
CHM-004 · Low effort

AI Model Change Documentation

Record what changed between model versions, why the change was made, what testing was performed, and who approved the deployment.

Objective

Maintain an auditable record of AI system evolution that supports accountability, incident investigation, and regulatory examination.

Maturity Levels

1. Initial: Changes are not documented; the reason for past decisions cannot be reconstructed.

2. Developing: Major changes are documented informally, but minor changes and the rationale behind them are not captured.

3. Defined: A standard change record template captures the change description, rationale, test results, approver, and deployment date for all production changes.

4. Managed: Change documentation is reviewed during governance meetings; quality and completeness are assessed.

5. Optimizing: Change records are automatically generated from deployment pipeline artifacts and linked to evaluation results.
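At the Optimizing level, record assembly can be automated rather than hand-written. The sketch below shows one way a change record might be built from deployment and evaluation artifacts; the payload shapes, field names, and the `build_change_record` helper are illustrative assumptions, not a standard, and a real pipeline would load these payloads from its CI/CD system's outputs.

```python
import json
from datetime import datetime, timezone

def build_change_record(deploy_meta: dict, eval_results: dict) -> dict:
    """Assemble a change record from pipeline artifacts (hypothetical shapes)."""
    return {
        "change_id": deploy_meta["change_id"],
        "system": deploy_meta["system"],
        "version_before": deploy_meta["previous_version"],
        "version_after": deploy_meta["new_version"],
        "description": deploy_meta["description"],
        "rationale": deploy_meta["rationale"],
        # Summarize each evaluation suite down to its pass/fail status.
        "test_summary": {name: r["status"] for name, r in eval_results.items()},
        "approver": deploy_meta["approver"],
        "deployed_at": datetime.now(timezone.utc).isoformat(timespec="minutes"),
    }

# Example artifact payloads, mirroring the worked example later in this page.
deploy_meta = {
    "change_id": "CHG-2026-0147",
    "system": "Credit Risk Scorer",
    "previous_version": "system-prompt-v3.1",
    "new_version": "system-prompt-v3.2",
    "description": "Tightened output format constraints",
    "rationale": "INC-0023: 0.3% of outputs failed to parse",
    "approver": "J. Rivera",
}
eval_results = {
    "regression_suite": {"status": "pass", "cases": 500},
    "format_validation": {"status": "pass", "cases": 1000},
    "bias_check": {"status": "pass"},
}

record = build_change_record(deploy_meta, eval_results)
print(json.dumps(record, indent=2))
```

Generating the record in the same pipeline run that deploys the change keeps the audit trail complete by construction: a deployment cannot ship without its record.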

Evidence Requirements

What an auditor or assessor would expect to see for this control.

  • Change log entries for all model updates including description, rationale, version before and after, test summary, and approver
  • Updated model cards or technical documentation at each version reflecting current model characteristics
  • Traceability records linking each change to the business or regulatory requirement that drove it
  • Stakeholder notification records confirming downstream users were informed of material changes before deployment
  • Documentation completeness review records confirming all required fields were populated before deployment approval

Implementation Notes

Key steps

  • Use a standard template for every model change record: what changed, why, what tests were run, what risks were identified, who approved, and the deployment date.
  • Retain change records for at least as long as the model is in production plus your required audit retention period.
  • Link change records to incident records: when an incident occurs, the change history should be one of the first things examined.
  • For third-party models, document the vendor's stated change description alongside your internal validation results.
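The standard template and the pre-deployment completeness review described above can be sketched in code. The following Python model is one possible shape, assuming the fields listed in the key steps; the `ChangeRecord` class and `missing_fields` helper are hypothetical, and the completeness check is the kind of gate that could block deployment approval when required fields are blank.

```python
from dataclasses import dataclass, fields

@dataclass
class ChangeRecord:
    """Standard change record template (field names are illustrative)."""
    change_id: str
    system: str
    change_type: str
    description: str       # what changed
    rationale: str         # why it changed
    tests_run: str         # what tests were run
    risks_identified: str  # what risks were identified
    approvers: str         # who approved
    deployment_date: str   # when it shipped
    rollback_target: str   # version to restore if the change fails

def missing_fields(record: ChangeRecord) -> list[str]:
    """Names of required fields left blank; an empty list means complete."""
    return [f.name for f in fields(record) if not getattr(record, f.name).strip()]

rec = ChangeRecord(
    change_id="CHG-2026-0147",
    system="Credit Risk Scorer",
    change_type="Prompt update (v3.1 to v3.2)",
    description="Tightened output format constraints",
    rationale="INC-0023: outputs failed downstream parsing",
    tests_run="Regression (500 cases), format validation (1,000 cases), bias check",
    risks_identified="None (prompt-only change)",
    approvers="J. Rivera; M. Okafor",
    deployment_date="2026-04-14T11:30Z",
    rollback_target="",  # left blank to show the completeness check firing
)
print(missing_fields(rec))  # -> ['rollback_target']
```

Running the completeness check before approval, rather than during a later audit, is what turns the template from a documentation habit into an enforceable control.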

Example Implementation

An MLOps team ships a prompt update to a Critical risk-tier AI system.

Model Change Record — Risk Scoring System

Change ID: CHG-2026-0147
System: Credit Risk Scorer
Change type: Prompt update (system prompt v3.1 → v3.2)
Change description: Tightened output format constraints to prevent free-text rationales that failed downstream parsing
Reason: Production incident INC-0023: 0.3% of outputs failed to parse, defaulting to manual review
Tests run: Regression suite (500 cases), format validation (1,000 cases), bias check
Test results: Regression: 0 regressions; format pass rate: 100%; bias check: pass (demographic parity delta < 1%)
Risks identified: None (prompt-only change; model weights unchanged)
Approvers: J. Rivera (ML Lead) · M. Okafor (Compliance)
Deployment date: 2026-04-14T11:30Z
Linked incident: INC-0023
Rollback target: system-prompt-v3.1

Control Details

Control ID
CHM-004
Typical owner
AI Engineering / Compliance
Implementation effort
Low effort
Agent-relevant
No

Tags

change management · documentation · audit trail · change records