Two disciplines. One blind spot.
MES and data integrity are designed for the same outcome — yet they're almost always managed by different teams, on different timelines, with different definitions of done.
The MES team asks: Does it work?
- System goes live on schedule
- Batch records are electronic
- Users are trained and signed off
- Validation is complete and filed
- Project is closed
The regulator asks: Can you prove it?
- Every record is attributable to a named individual
- Audit trails capture the right data fields
- No hybrid records exist alongside the system
- Administrators can't alter what they shouldn't
- Records remain readable for their full retention period
The MES team delivers a functioning system. The regulator expects a controlled system. These are not the same thing — and the difference is rarely discovered until an inspection.
You may have a problem if any of these sound familiar.
Not a checklist. Not a self-assessment. Just five things we hear — almost every time — when we start working with a new client.
"The system will alarm if any parameters are exceeded."
Alarms tell you when something went wrong. They do not tell you what happened before, during, or after the excursion: who acknowledged it, how quickly, what action was taken, and whether the record reflects reality. An alarm is a notification. A data integrity control is a system of evidence. They are not the same thing.
"We review the administrative audit trail on a yearly basis."
This one matters more than most people realise. Administrative audit trails record system-level activity — user changes, configuration edits, access modifications. They do not capture what happened during a batch. If your team isn't reviewing production audit trails prior to batch release, you are missing the data that tells you whether the record accurately reflects what occurred on the floor. A yearly admin review is not a substitute — and it was never designed to be.
"Quality reviews the batch record. They'd catch a DI issue."
Batch record review catches content errors. It does not catch structural data integrity failures — the kind embedded in how the system was configured. If the audit trail is scoped incorrectly, no amount of batch record review will surface it. The gap is invisible until an inspector asks for something you can't produce.
"We haven't had any findings on this system."
Absence of findings is not evidence of compliance. It is evidence that no one has looked carefully yet. The FDA's data integrity guidance specifically calls out computerised systems as a focus area, and audit trail review features prominently in recent warning letters. The question isn't whether you'll face scrutiny. It's whether you'll be ready when you do.
"Do we actually need a risk assessment for this system?"
The answer is almost always yes — but more importantly, the question itself is a signal. A risk assessment isn't a bureaucratic checkbox. It's the document that defines what could go wrong, how likely it is, and what controls exist to prevent it. Without it, your validation scope is a guess, your audit trail configuration is a guess, and your readiness for an inspection is built on assumptions rather than evidence.
"The companies who intentionally integrate their data integrity — not just as procedural expectations, but as part of their operational fabric — set themselves up for long-term success."
Contact us.
info@inadvertentqc.com
(919) 592-1972
