GAMP 5

GAMP 5 – Risk-Based Approach to Compliant Computerized Systems

This topic is part of the SG Systems Global regulatory & operations glossary.

Updated October 2025 • Validation & Data Integrity • CSV, Part 11 / Annex 11, Lifecycle & Quality Risk Management

GAMP 5 (Good Automated Manufacturing Practice, 5th edition) is industry guidance, published by ISPE, for applying a lifecycle, risk-based approach to the validation and ongoing control of computerized systems used in regulated environments. It harmonizes how manufacturers and suppliers specify, verify, and maintain systems so that they are fit for intended use, meet data integrity expectations, and remain compliant over their operational life. GAMP 5 is not a regulation, nor a one-time project plan; it is a way of working that connects business process understanding, GxP quality risk management, and proportionate verification to yield credible evidence that systems do what they should, consistently, under control. In practical terms, GAMP 5 complements the legal requirements for electronic records and signatures—such as 21 CFR Part 11 and EU Annex 11—and the broader expectations of Computer System Validation (CSV) and Data Integrity. The guidance promotes supplier assessment, critical thinking, scalable documentation, and practical testing strategies that emphasize what matters most to patient and product risk.

“Validate what matters, where it matters, in a way that generates trust without wasting effort—this is the spirit of GAMP 5.”

That spirit shows up in how teams define scope (intended use), analyze process and data risks, categorize software to right-size effort, leverage supplier development and testing evidence, and focus verification where failure would have meaningful impact. It also shows up in how organizations sustain compliance: managing change deliberately through Change Control, keeping configurations and infrastructure known, reviewing periodic risks and audit trails, and measuring the health of controls with metrics that drive action instead of paperwork. When applied well, GAMP 5 leads to systems that are easier to run, easier to defend, and more resilient to the inevitable change of business needs and technology stacks.

TL;DR: GAMP 5 is a lifecycle, risk-based method for specifying, verifying, and maintaining computerized systems so they are fit for intended use and compliant with Part 11/Annex 11. It scales validation by software category and risk, leverages supplier evidence, and emphasizes ongoing control—configuration management, security, backup/restore, archival, and periodic review—over one-off documentation.

1) What GAMP 5 Covers (Scope & Intent)

GAMP 5 addresses the full lifecycle of computerized systems that influence product quality, patient safety, or data used to make quality decisions. This includes MES, LIMS, ERP interfaces, labeling systems, historians, e-signature platforms, WMS/SCM, and laboratory instruments with software. The lifecycle typically spans concept, project (requirements through release), and operation (use, monitoring, change, incident/problem management, periodic review, retirement). Within that lifecycle, GAMP 5 expects teams to define intended use precisely, perform quality risk management to identify GxP-critical functions and records, and ensure proportionate control and verification of those elements. Evidence should show that functional requirements have been met, security and access controls are effective, records are attributable and contemporaneous with audit trails, and that backup/restore and archival preserve integrity for the retention period (see Data Retention & Archival).

2) Risk-Based Thinking & Software Categories

GAMP 5’s most practical lever is scaling effort by risk and by software category. Commercial off-the-shelf (COTS) platforms configured to support a process differ from custom code; instrumentation firmware differs from a spreadsheet with macros. By classifying software appropriately, teams avoid over-testing low-risk vendor-managed components while reserving rigor for custom/high-impact functions and integrations. Risk assessment starts with process understanding: where could the system permit or conceal a quality-critical error, miscalculate a dose, mislabel a product, lose an audit trail, or allow an unauthorized change? The output is a mapped set of critical functions and data with targeted controls—role-based access, Dual Verification for irreversible actions, range and limit checks (SPC control limits), barcode and label template binding (Barcode Validation with GS1/GTIN integrity), device time sync, and secure configurations. Verification aligns to risk: configuration testing focuses on the configured behavior that expresses requirements; integration testing focuses on data handoffs between systems; and negative testing challenges error paths and privilege boundaries.
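The "scale effort by risk and by software category" idea can be sketched as a simple planning lookup. This is an illustrative sketch only: the function names, risk labels, and rigor descriptions are assumptions for this example, not GAMP 5 definitions (the category numbers follow the familiar GAMP convention of 3 = non-configured, 4 = configured, 5 = custom).

```python
from dataclasses import dataclass

# Proportionate verification by (GAMP software category, assessed risk).
# Category numbers follow the common convention: 3 = non-configured product,
# 4 = configured product, 5 = custom application. Rigor text is illustrative.
RIGOR = {
    (3, "low"):  "leverage supplier evidence; verify installation and intended use",
    (3, "high"): "add targeted functional checks on GxP-critical functions",
    (4, "low"):  "test configured behavior against requirements",
    (4, "high"): "configuration, integration, and negative testing",
    (5, "low"):  "code review plus functional testing",
    (5, "high"): "full lifecycle verification: design review, unit, integration, negative tests",
}

@dataclass
class SystemFunction:
    name: str
    gamp_category: int   # 3, 4, or 5
    risk: str            # "low" or "high", from quality risk assessment

def verification_strategy(fn: SystemFunction) -> str:
    """Return a proportionate verification approach for one function."""
    return RIGOR[(fn.gamp_category, fn.risk)]

print(verification_strategy(SystemFunction("label template binding", 4, "high")))
```

The point is not the table itself but that the mapping is explicit, reviewable, and applied consistently rather than re-argued per project.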

3) Roles, Responsibilities & Supplier Engagement

GAMP 5 encourages clear delineation of responsibilities. The business process owner articulates intended use and acceptance criteria; quality ensures that risk management and validation methods are sound; IT/OT manages infrastructure, security, and service continuity; and suppliers provide development practices, test evidence, defect handling, and product roadmaps germane to risk. Supplier assessment is not a rubber stamp; it should examine the vendor’s quality management system, release management, cybersecurity posture, support model, and the maturity of documentation available for leveraged validation. Where the supplier provides application configuration tools (e.g., workflow builders), governance must ensure that configured objects are version-controlled, change-managed, and traceable to requirements so that subsequent changes can be tested proportionately under Change Control without re-validating the world.

4) Specifications That Matter

Documentation should exist to explain what the system must do (user requirements), how it will be realized (functional/design specifications for configurations and interfaces), and how the team will verify it (test plans, protocols, and acceptance criteria). GAMP 5 advocates living, right-sized specifications linked to risk, not encyclopedic tomes. For an eBMR, this may look like a master set of step types, interlock rules, parameter ranges, label bindings, and data capture rules sourced from the approved eMMR. For a WMS integration, specifications should define the transaction catalog (e.g., goods receipt, component release, pick/consume, finished goods release), acknowledgements, retries, and reconciliation reporting so that orphaned or duplicate messages cannot silently corrupt genealogy (see Batch Genealogy).
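The reconciliation reporting mentioned above can be sketched in a few lines. The message IDs and structures here are invented for illustration; the technique is simply pairing sent transactions with acknowledgements so orphaned or duplicated messages surface in a report instead of silently corrupting genealogy.

```python
from collections import Counter

def reconcile(sent_ids, acked_ids):
    """Return (orphaned, duplicates): IDs sent but never acknowledged,
    and IDs acknowledged more than once."""
    sent, acked = Counter(sent_ids), Counter(acked_ids)
    orphaned = sorted(set(sent) - set(acked))
    duplicates = sorted(mid for mid, n in acked.items() if n > 1)
    return orphaned, duplicates

# Hypothetical transaction IDs (GR = goods receipt, PK = pick/consume)
sent = ["GR-001", "GR-002", "PK-010", "PK-011"]
acked = ["GR-001", "PK-010", "PK-010"]   # PK-010 acked twice; GR-002, PK-011 missing
print(reconcile(sent, acked))            # (['GR-002', 'PK-011'], ['PK-010'])
```

A real implementation would run per transaction type on a schedule and route findings into the exception workflow rather than printing them.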

5) Testing with Critical Thinking

GAMP 5 favors risk-based testing that proves the requirement, not just the happy path. That means explicit negative tests (wrong label template, out-of-range weight during gravimetric weighing, expired lot violating FEFO, disabled audit trail), privilege boundary tests (attempting a critical action without role), and timing tests (unsynchronized clocks, failed backup, network dropouts). Testing should be traceable (requirement ↔ test), independent where possible, and executed in a controlled environment that mirrors production for critical elements. Evidence is stronger when it includes system-generated artifacts (event logs, audit trails, device printouts) and when failures are recorded transparently with deviations feeding CAPA for systemic issues. For repetitive configurations (e.g., many eBMR steps of the same type), leverage design qualification patterns and template testing with parameter variation rather than brute-force duplicates that add cost but little confidence.
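The requirement ↔ test traceability check is mechanical enough to automate. This sketch uses invented requirement and test IDs; it flags both directions of a gap: requirements with no test, and tests that trace to nothing.

```python
def trace_gaps(requirements, trace):
    """trace maps test ID -> requirement ID.
    Return (untested requirements, tests linked to unknown requirements)."""
    covered = set(trace.values())
    untested = sorted(set(requirements) - covered)
    unlinked = sorted(t for t, r in trace.items() if r not in requirements)
    return untested, unlinked

reqs = ["URS-01", "URS-02", "URS-03"]
trace = {"TC-101": "URS-01", "TC-102": "URS-01", "TC-103": "URS-99"}
print(trace_gaps(reqs, trace))  # (['URS-02', 'URS-03'], ['TC-103'])
```

Run before protocol approval, a check like this turns the traceability matrix from a static deliverable into a gate.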

6) Data Integrity, Security & Part 11/Annex 11 Controls

Because computerized systems create and transform GxP data, GAMP 5 threads data integrity through every decision. Records must be attributable, legible, contemporaneous, original/true copy, and accurate (ALCOA+). Practically, that yields unique user identities (no shared logins), role-based access with least privilege, enforced meaning-of-signature for e-signatures, secure computer-generated audit trails for create/modify/delete/report, time synchronization across platforms, validated backup/restore with periodic restore tests, and long-term archival that preserves readability and metadata. Where labels and variable data are generated, Barcode Validation and template control prevent mismatches; where equipment states influence execution, Asset Calibration Status interlocks block use in out-of-tolerance conditions. The system should make the correct action the easy action, with technical interlocks sealing the gaps that SOPs and training alone cannot reliably cover.
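One way audit-trail review becomes actionable rather than ceremonial is an automated attributability check. This is a sketch under stated assumptions: the field names and the shared-login blocklist are invented for illustration, not a prescribed schema.

```python
SHARED_ACCOUNTS = {"admin", "operator"}                    # assumed known shared logins
REQUIRED = ("user", "timestamp", "action", "record_id")    # assumed minimal fields

def attribution_findings(audit_trail):
    """Flag audit-trail entries that break ALCOA 'attributable':
    missing required fields, or actions performed under a shared account."""
    findings = []
    for entry in audit_trail:
        if any(k not in entry for k in REQUIRED):
            findings.append(("incomplete", entry))
        elif entry["user"] in SHARED_ACCOUNTS:
            findings.append(("shared_login", entry))
    return findings

trail = [
    {"user": "jdoe", "timestamp": "2025-10-02T08:01:00Z", "action": "modify", "record_id": "B-1042"},
    {"user": "admin", "timestamp": "2025-10-02T08:05:00Z", "action": "delete", "record_id": "B-1042"},
]
print(attribution_findings(trail))   # flags the delete performed under 'admin'
```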

7) Operation, Change & Periodic Review

After go-live, control shifts to operations. Incident and problem management capture and analyze issues; change proposals evaluate impact on validated state, data integrity, and training; and configuration baselines stay current and retrievable. Periodic review checks whether the system remains fit for intended use given process changes, supplier updates, vulnerability advisories, and audit-trail trends. Where a trend suggests control erosion—rising overrides, frequent label reprints, late backups—risk should be reassessed and a CAPA opened with effectiveness checks. Retirement plans must preserve records and their context; where a system is replaced, data migration and decommissioning should be validated to ensure continued accessibility of required evidence, consistent with Retention & Archival policy.
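The trend-triggered reassessment described above can be encoded as a simple rule. The threshold and streak length here are assumptions for illustration, not GAMP 5 requirements; the point is that "control erosion" has a defined trigger instead of relying on someone noticing.

```python
def needs_reassessment(monthly_override_rates, threshold=0.02, rising_months=3):
    """Flag when the override rate for critical actions exceeds the threshold
    for N consecutive months (illustrative trigger for risk review / CAPA)."""
    streak = 0
    for rate in monthly_override_rates:
        streak = streak + 1 if rate > threshold else 0
        if streak >= rising_months:
            return True
    return False

rates = [0.010, 0.015, 0.025, 0.030, 0.041]   # overrides per critical action, by month
print(needs_reassessment(rates))  # True: three consecutive months above 2%
```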

8) Common Failure Modes & How to Avoid Them

  • Document-heavy, risk-light validation. Huge binders, thin thinking. Fix: lead with intended use and risk; right-size specs and tests.
  • Ignoring supplier evidence. Re-testing vendor-proven behaviors while missing your configuration risks. Fix: leverage supplier QA and focus on your use and integrations.
  • Weak configuration management. Lost baselines, untraceable changes. Fix: version everything; script and store environment “as code” where feasible.
  • Audit trail present, not useful. No periodic review, time drift, or unmonitored edits. Fix: review trails; sync clocks; alert on critical events.
  • Interface blind spots. Orphaned transactions and silent failures. Fix: acknowledgements, retries with visibility, and reconciliation reports.
  • Label/UDI mismatches. Templates outside control. Fix: bind templates to masters and enforce scan-back at print/apply (respecting GS1/GTIN rules).
  • Training lag. New configuration effective before users trained. Fix: training gating via Document Control.

9) Metrics That Prove Control

  • Change implementation lead time with validation effort vs. risk category.
  • Defect discovery distribution (supplier vs. configuration vs. integration) and escape rate to production.
  • Audit-trail review findings and closure timeliness.
  • Backup/restore drill success and mean restore time for critical datasets.
  • Override/exception rate for critical actions; label mismatch prevention incidents.
  • Periodic review on-time rate and resulting CAPA effectiveness.
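Two of the metrics above, backup/restore drill success and mean restore time, can be computed from a drill log. The log structure here is hypothetical; the sketch shows that these are straightforward aggregates once drills are recorded consistently.

```python
def restore_metrics(drills):
    """drills: list of (succeeded: bool, minutes: float).
    Return (success_rate, mean_restore_minutes over successful drills)."""
    successes = [minutes for ok, minutes in drills if ok]
    rate = len(successes) / len(drills) if drills else 0.0
    mean = sum(successes) / len(successes) if successes else float("nan")
    return rate, mean

drills = [(True, 42.0), (True, 38.0), (False, 0.0), (True, 55.0)]
print(restore_metrics(drills))  # (0.75, 45.0)
```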

10) How This Fits with V5

V5 by SG Systems Global is engineered to embody GAMP 5 principles in day-to-day operations so that validation is sustainable rather than episodic. In V5 MES, master definitions (eMMR) generate executable eBMR steps with enforced sequencing, parameter limits, Dual Verification, device data capture (including gravimetric weighing), and Barcode Validation with GS1/GTIN controls. V5 QMS manages Document Control, validation packages, supplier assessments, Change Control, deviations and CAPA, with audit-trail and e-signature controls aligned to Part 11/Annex 11 and GMP/cGMP. V5 WMS enforces FEFO/FIFO, Directed Picking, and Goods Receipt checks, feeding clean genealogy to MES and CoA processes. For distribution contexts, controls align with GDP expectations (storage/transport), while food-sector deployments map to GFSI-benchmarked schemes. Platform-wide, immutable audit trails, time sync, backup/restore verification, and configurable security roles reduce the gap between “validated state” and “operational reality.” Dashboards surface the metrics above so periodic review and management oversight are part of normal work—not a scramble before inspection.


11) FAQ

Q1. Is GAMP 5 mandatory?
No. It is guidance widely accepted by regulators and industry as good practice for CSV. Compliance obligations flow from laws and predicate rules; GAMP 5 provides a practical way to demonstrate fitness for intended use and data integrity across GxP domains.

Q2. Do we have to re-validate on every vendor patch?
Not necessarily. Use risk-based change control: assess impact on intended use, security, and data integrity; review supplier release notes; and test proportionately. Critical functions and integrations get priority regression coverage.

Q3. Can we leverage supplier testing?
Yes. Leverage supplier SDLC evidence and certification where appropriate, and supplement with configuration/integration testing specific to your intended use and environment.

Q4. How much documentation is “enough”?
Enough to show traceability from requirements to verified functionality, that risks are controlled, and that ongoing operation (security, backup/restore, audit trails, training) is effective. More pages do not equal more control.

Q5. How does GAMP 5 apply to spreadsheets and small tools?
Apply proportionate controls: lock critical formulas, restrict access, protect cells, version files, and review audit logs if available. For high-impact tools, consider migrating to controlled application functionality within MES/LIMS.
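A minimal sketch of file versioning for such tools, assuming a file-based spreadsheet: recording a hash alongside each approved version lets a reviewer confirm the copy in use matches the controlled copy. The register format is illustrative.

```python
import hashlib

def file_fingerprint(path):
    """Return the SHA-256 hash of a file, for comparison against the
    hash recorded when the version was approved."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```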


Related Reading
• Foundations: CSV | GxP | GMP / cGMP | 21 CFR Part 11 | EU Annex 11
• Execution & Integrity: eMMR | eBMR | Gravimetric Weighing | Barcode Validation | GS1 / GTIN
• Materials & Distribution: Goods Receipt | FEFO | FIFO | GDP | GFSI