Product Quality Review (PQR)

Product Quality Review (PQR) – Evidence, Trends, and Decisions that Improve the Next Lot

This topic is part of the SG Systems Global regulatory & operations glossary.

Updated October 2025 • Lifecycle Quality & Continuous Improvement • QA, Manufacturing, Regulatory

Product Quality Review (PQR) is the structured, periodic assessment of each product’s end-to-end performance: materials, manufacturing, in-process and release results, deviations and complaints, changes made and changes needed. The goal isn’t a binder; it’s capability. A credible PQR mines the year’s data from your MES, LIMS, and WMS, connects it into the eBMR story, and issues clear actions under MOC and CAPA. If your PQR doesn’t change specifications, methods, suppliers, or controls when evidence says it should, it’s just admin work with better fonts. In many markets PQR is synonymous with the Annual Product Review (APR); regardless of label, the discipline is the same: analyze, decide, and improve.

“A PQR that doesn’t alter how you make or release product is a history book, not a control system.”

TL;DR: PQR consolidates a product’s yearly evidence—materials, process performance, IPC/release results, Deviations/NC, complaints, and changes—into decisions: keep, tighten, or fix. Data and records live in eBMR, LIMS, and WMS; evidence integrity is under Part 11/Annex 11 with audit trails and retention. Outcomes route through MOC, Change Control, and CAPA and are tracked in CPV and SPC Control Limits.

1) Purpose: From Compliance Ritual to Capability Engine

PQR exists to verify that the process continues to deliver quality product and to make changes when it doesn’t. That means reconciling the designed state with observed reality: Do the IPC controls still predict release? Are methods robust in routine use? Are specs aligned to capability or masking risk? Are supplier lots driving variability in the WMS history? Is the trend in CPV tightening or drifting? The PQR answers these with evidence and ends with decisions, signed and tracked.

2) Scope and Inputs—What Goes Into a Credible PQR

  • Batch history and genealogy. Pull from Lot Traceability and the eBMR: materials, equipment, users, and step outcomes.
  • Analytical results. All release/stability/IPC data from LIMS, including OOT patterns and any OOS investigations.
  • Quality events. Summaries and trends of Deviations/NC, NCRs, complaints, and recalls, with closure and CAPA effectiveness.
  • Changes and validation. Relevant MOC/Change Control items, associated re-qualification (IQ/OQ/PQ) and CSV evidence.
  • Supply and logistics. Supplier performance, Goods Receipt issues, FIFO/FEFO adherence, and storage/handling signals from WMS.
  • Regulatory and QMS. Impact from inspections and Internal Audit findings, alignment with ICH Q10 and cGMP.

3) Evidence Integrity—Trust the Numbers or Stop

PQR conclusions are only as credible as the records behind them. Enforce unique credentials and signature meaning per Part 11; confirm computer-generated Audit Trails are complete; and verify retention under Data Retention & Archival. If the data pipeline allows silent edits or orphan spreadsheets, fix that with priority—PQRs built on sand collapse during inspection.
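As a minimal illustration of the kind of completeness check this implies, the sketch below scans exported audit-trail events for missing attribution. The event fields and rules here are hypothetical, not a V5 or Part 11 schema; a real check would run against the system of record.

```python
from dataclasses import dataclass

# Hypothetical audit-trail event; real systems expose richer schemas.
@dataclass
class AuditEvent:
    record_id: str
    user: str       # attributable user per Part 11
    timestamp: str  # ISO 8601 from the system clock
    action: str     # e.g. "create", "modify", "sign"
    reason: str = ""

def find_integrity_gaps(events):
    """Flag events missing the attributes an auditor will ask for."""
    gaps = []
    for e in events:
        missing = [f for f in ("user", "timestamp", "action")
                   if not getattr(e, f)]
        if e.action == "modify" and not e.reason:
            missing.append("reason")  # modifications need a stated reason
        if missing:
            gaps.append((e.record_id, missing))
    return gaps

events = [
    AuditEvent("BATCH-001", "jdoe", "2025-03-01T10:00:00Z", "create"),
    AuditEvent("BATCH-001", "", "2025-03-02T09:15:00Z", "modify"),
]
print(find_integrity_gaps(events))  # the second event lacks user and reason
```

Any non-empty result is a data-pipeline finding in its own right: fix the pipeline before drawing PQR conclusions from it.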

4) Analysis Flow—From Signals to Decisions

A disciplined PQR follows a repeatable logic:

  • Describe. Summarize batches, volumes, and significant events; identify data gaps to correct next cycle.
  • Diagnose. Trend critical attributes with SPC Control Limits and examine OOT excursions.
  • Decide. For each finding, choose: keep as is, tighten, or remediate via CAPA and MOC.
  • Deploy. Drive approved changes through Document Control so methods, instructions, and limits update in production, and evidence lands in the next PQR automatically.
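The Diagnose and Decide steps above can be sketched in a few lines. This is an illustration only, assuming results arrive as plain numeric lists; the limits are individuals-chart mean ± 3σ, and the decision thresholds are invented for the example, not a validated SPC rule set.

```python
from statistics import mean, stdev

def control_limits(results):
    """Mean +/- 3 sigma limits from historical results (individuals chart)."""
    m, s = mean(results), stdev(results)
    return m - 3 * s, m + 3 * s

def diagnose(results, lcl, ucl):
    """Return indices of excursions beyond the control limits."""
    return [i for i, x in enumerate(results) if not lcl <= x <= ucl]

def decide(excursions):
    """Map the diagnosis to the PQR's three outcomes."""
    if not excursions:
        return "keep"        # stable: no action beyond monitoring
    if len(excursions) <= 2:
        return "tighten"     # isolated signals: investigate, tighten controls
    return "remediate"       # systemic: open CAPA / MOC

history = [99.8, 100.1, 100.0, 99.9, 100.2, 100.1, 99.7, 100.0]
lcl, ucl = control_limits(history)
this_year = [100.0, 99.9, 101.5, 100.1]
print(decide(diagnose(this_year, lcl, ucl)))  # one excursion -> "tighten"
```

The point is not the arithmetic but the shape: every signal terminates in one of three explicit outcomes, which is what the Deploy step then pushes into production.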

5) What to Trend—The Short List that Matters

  • CTQs and release results. Potency, impurities, pH, moisture—trend distributions and capability; connect to CPV.
  • IPC predictors. Do in-process measures correlate with and predict release results? If not, fix sampling, timing, or controls in the MES.
  • Deviations/complaints. Repeat causes, time-to-closure, and CAPA sustainability.
  • Supplier/material effects. Lot-to-lot shifts linked by genealogy, FIFO/FEFO.
  • Release velocity. Lag from last test to Lot Release and to Finished-Goods Release; blockers are usually fixable design issues.
  • Validation health. Frequency of re-qualification (IQ/OQ/PQ) and CSV changes driven by product findings.

6) Specifications—Tighten, Loosen, or Hold

PQR is the right forum to test specification realism. If capability is strong (results sit well inside the limits), consider tightening limits to protect the customer; if capability is marginal for sound reasons, plan process changes or enhanced sampling rather than loosening limits to hide risk. Document specification proposals under Change Control with supporting CPV evidence and final approval captured via secure e-signature in the QMS. Avoid “rounded passes”—harmonize calculation and rounding rules with the official methods in LIMS.
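One way to ground a specification proposal in capability evidence is the Cpk index (distance from the mean to the nearest limit, in 3σ units). The sketch below uses hypothetical assay data and illustrative decision thresholds (1.33 and 2.0 are common rules of thumb, not regulatory requirements).

```python
from statistics import mean, stdev

def cpk(results, lsl, usl):
    """Process capability: distance from mean to nearest spec limit, in 3-sigma units."""
    m, s = mean(results), stdev(results)
    return min(usl - m, m - lsl) / (3 * s)

def spec_recommendation(results, lsl, usl):
    """Illustrative thresholds only; actual criteria belong in the QMS."""
    c = cpk(results, lsl, usl)
    if c >= 2.0:
        return c, "capability supports tightening limits via Change Control"
    if c >= 1.33:
        return c, "hold current limits; continue CPV monitoring"
    return c, "marginal capability: improve the process, do not loosen limits"

assay = [99.0, 99.2, 98.8, 99.1, 99.3, 98.9, 99.0, 99.2]
print(spec_recommendation(assay, 95.0, 105.0))
```

Either way the output is a documented, numbers-backed proposal for Change Control rather than a judgment call.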

7) Methods and Instruments—Reality Check

If a method generates disproportionate OOT/OOS or analyst comments, fix the method or the training, not the data. Tie instrument suitability to Asset Calibration Status and challenge data integrity in the workflow—audit trails, users, time sync. Re-qualify equipment under IQ/OQ/PQ where justified and update SOPs through Document Control.

8) Manufacturing Controls—Make Gates Do the Work

PQR outcomes must live in operations. If trends show wrong-material picks or premature line starts, wire harder interlocks: Directed Picking with disposition checks; Label Verification before pack; MES step blocks based on IPC status; and automatic QA holds at Goods Receipt. If the PQR recommends a change that doesn’t become a gate, expect to see the same issue next year.
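To make the “gates do the work” idea concrete, here is a minimal sketch of a disposition-checked pick interlock. Function, status, and lot names are hypothetical, not V5 configuration; the point is that the refusal is enforced by the system, not by procedure.

```python
def allow_pick(material_lot, disposition, ipc_status):
    """Hard interlock: refuse the pick unless QA disposition and IPC gate both pass."""
    if disposition != "RELEASED":
        return False, f"{material_lot}: blocked - disposition is {disposition}"
    if ipc_status != "PASS":
        return False, f"{material_lot}: blocked - upstream IPC is {ipc_status}"
    return True, f"{material_lot}: pick allowed"

# A quarantined lot cannot be picked no matter what the operator intends.
print(allow_pick("LOT-7731", "QUARANTINE", "PASS"))
print(allow_pick("LOT-7731", "RELEASED", "PASS"))
```

A PQR action that lands as a rule like this one is self-evidencing: the next cycle can count blocked attempts instead of hoping the SOP was read.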

9) Complaints and Field Performance—Close the Loop

Map complaint categories against manufacturing signatures and supplier lots using genealogy. Where field signals correlate to specific operations or materials, prioritize CAPAs that change those operations—not just more inspection. PQR is where customer pain translates into plant control changes with measurable effects in the next cycle’s SPC charts and KPIs.
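The mapping described above is, mechanically, a tally over complaint records joined to genealogy. A minimal sketch, with hypothetical categories and supplier lot IDs:

```python
from collections import Counter

# Hypothetical complaint records already joined to genealogy:
# (complaint_category, supplier_lot)
complaints = [
    ("leaking seal", "SUP-A-101"),
    ("leaking seal", "SUP-A-101"),
    ("leaking seal", "SUP-A-102"),
    ("discoloration", "SUP-B-330"),
]

by_pair = Counter(complaints)
# The dominant pairing points CAPA at a material/operation,
# not at adding more end-of-line inspection.
print(by_pair.most_common(1))
```

Even this trivial tally makes the prioritization argument auditable: the CAPA targets the pairing the data names, and next year's counts show whether it worked.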

10) Roles and Governance—Who Owns What

Quality leads the PQR and approves the report; manufacturing and labs own the truth of execution; regulatory ensures commitments align with filings and ICH Q10; supply chain supplies material data; and IT/OT ensures the record systems are reliable. Keep the segregation of duties clean—authors do not approve their own conclusions. All commitments must be traceable to owners and due dates in the QMS, with progress visible and auditable.

11) Common Failure Patterns—and Antidotes

  • Binder obedience. Report produced, zero actions. Antidote: require at least one improvement or a justified “no change” with data.
  • Spreadsheet wilderness. Off-system “analysis” that can’t be audited. Antidote: analyze from controlled exports; keep calculations under CSV.
  • Rounded passes. Precision hides failure. Antidote: harmonize rounding to method; review raw and calculated values in LIMS.
  • Weak gating. Same deviation every month. Antidote: enforce interlocks in MES/WMS and validate during re-qualification.
  • Spec creep. Limits change without capability evidence. Antidote: route via Change Control with CPV support.
  • CAPA theater. Actions close, nothing changes. Antidote: require effectiveness metrics in the next PQR.

12) Output—What the PQR Must Produce

  • Conclusion on product state: capable, at risk, or non-capable with justification.
  • Action register: approved CAPA and MOC with owners/dates.
  • Specification/method updates proposed via Change Control.
  • Validation impact: required IQ/OQ/PQ or CSV testing.
  • Supply-side actions: supplier qualification adjustments or material handling changes in WMS.
  • Next-cycle measurement plan embedded in CPV.

13) Metrics that Prove PQR is Working

Track year-over-year reductions in NCR rates, fewer OOS incidents, improved process capability, shorter time from batch completion to Lot Release, and tangible improvements in order-to-ship lead time. Track the percentage of PQR actions implemented on schedule and their effectiveness measured in the next cycle’s SPC charts. If these metrics don’t move, your PQR is observation, not control.
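The on-schedule action percentage is simple to compute from the action register. A sketch with hypothetical due and closure dates:

```python
from datetime import date

# Hypothetical PQR action register: (due_date, closed_date or None if open)
actions = [
    (date(2025, 3, 1),  date(2025, 2, 20)),
    (date(2025, 4, 15), date(2025, 5, 2)),   # closed late
    (date(2025, 6, 1),  None),               # still open
    (date(2025, 2, 1),  date(2025, 1, 28)),
]

on_schedule = sum(1 for due, closed in actions
                  if closed is not None and closed <= due)
pct = 100 * on_schedule / len(actions)
print(f"{pct:.0f}% of PQR actions implemented on schedule")
```

Publishing this number per product, per cycle, is what separates a tracked commitment from a ticked checkbox.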

14) How This Fits with V5 by SG Systems Global

V5 Solution Overview. The V5 platform unifies product evidence—eBMR for execution, LIMS for analytical truth, and WMS for materials—under governed versions and attributable signatures. That makes the PQR a pull from living systems, not a spreadsheet archaeology project.

V5 MES. In the V5 MES, product run histories, IPC controls, and step outcomes are queryable by product and time. Interlocks derived from PQR findings (e.g., tighter limits, new line-clearance prompts) are configured once and enforced automatically across the next lots, with all changes recorded in the eBMR and Audit Trails.

V5 QMS. Within the V5 QMS, PQR reports, approvals, and actions live under Document Control. Decisions spawn CAPA and MOC, each with due dates and effectiveness checks. The next cycle pulls status automatically, so “closed” means evidenced, not just ticked.

V5 WMS. The V5 WMS supplies material identity, Directed Picking, bin/location rules, and FEFO logic, allowing PQR-driven material controls (e.g., shelf-life or storage changes) to become hard system gates.

Bottom line: V5 converts PQR from a retrospective report into a forward-leaning control loop: insights trigger interlocks, specifications, and SOP changes that are enforced tomorrow and measured in the next review.

15) FAQ

Q1. Is PQR the same as APR?
In many contexts, yes. APR and PQR both assess a product’s performance and drive improvements. Use the local regulatory term but keep the discipline identical: analyze evidence, decide, and implement under governance.

Q2. Where should PQR data live?
In the systems of record: execution in the eBMR, analytical truth in LIMS, materials in WMS, and the PQR report and actions under Document Control with attributable approvals.

Q3. How do we prove that PQR actions worked?
Embed actions into the process via MOC/Change Control, then track effect in the next cycle’s CPV and SPC metrics. If the curve doesn’t move, the CAPA was noise.

Q4. What if the data quality is weak?
Don’t average garbage. Address root causes via CAPA—fix audit trails, close spreadsheet gaps, re-train, and tie devices to Asset Calibration Status. Draw conclusions only once the evidence is trustworthy.

Q5. How often should we do PQR?
Annually at minimum. High-risk or unstable products justify more frequent interim reviews, with final annual consolidation. The cadence should reflect risk and CPV signals.


Related Reading
• Lifecycle & Performance: Annual Product Review (APR) | CPV | SPC Control Limits | KPI
• Records & Governance: eBMR | 21 CFR Part 11 | Annex 11 | Audit Trail (GxP) | Data Retention & Archival | Document Control
• Systems & Execution: MES | LIMS | WMS
• Actions & Decisions: CAPA | MOC | Change Control | Deviation / Nonconformance | Lot Release