Continued from Part 1…
GLP
When I started in the industry back in 1990, I was a statistician and a statistical programmer supporting GLP studies in general toxicology, reproductive toxicology, and pharmacology.
We followed well-written SOPs and validated all of our computer programs.
Why? Because as 21 CFR Part 58 was interpreted, these activities were required.
- Our QAU audited our department on a yearly basis
- They audited our computer system validation deliverables for every system as we produced them
- They QC’d our work products in every study report right back to the raw data
Our QAU taught us the importance of working in a way that would give all of our customers confidence in the conclusions of the reports we contributed to.
While none of us used the phrase “data integrity,” the way we worked helped ensure data integrity for the studies in non-human animals.
GCP
Our counterparts in the clinical statistics group, who worked with data from human subjects, did not have SOPs requiring system validation.
Why not? As we understood the world at that time, there was no regulatory requirement to do so. Our QAU was not engaged with the clinical statistics group in any way.
1997
What game-changing regulatory action happened in 1997?
FDA promulgated 21 CFR Part 11: Electronic Records; Electronic Signatures.
Post-1997
As a consequence, I soon found myself project-managing the retrospective evaluation of over 100 clinical systems. Validation was only just starting to be talked about in clinical.
The GLP QAU was restructuring and beginning to extend its reach into GCP.
Our initial efforts were met with resistance, anger, even derision.
One PhD clinical statistician, when questioned by an auditor about a table in a study report, shouted, “I’m a PhD statistician! I don’t make mistakes!”
At a meeting between clinical management and the combined GLP/GCP audit function, a very senior manager on the clinical side called one of the auditors a Nazi. (There’s just nothing that stops a dialog better than calling one of the participants a Nazi.)
Fortunately, we’ve made progress in the 17 years since.
Nearly all sponsors have a clinical audit function that provides compliance advice to clinical teams and routinely performs audits of clinical investigator sites, internal processes, and service providers.
However, for a number of complex reasons, we’re still wrestling with the meaning of data integrity and how to apply the Part 11 meaning of computer system validation, especially in clinical research.
Let’s review.
Why do we validate computer systems?
Why do we bother? It’s not just because QA makes us.
We validate computer systems to ensure they perform accurately, reliably, and consistently with their intended performance AND so we have “the ability to discern invalid or altered records” (21 CFR Part 11).
It’s hard for anyone to discern invalid or altered records when the recorder uses invisible ink!
Before we look at 2 case studies and how they give the impression of using invisible ink, let’s work towards agreement on the answer to the question…
Are there any regulatory provisions for draft GLP or draft GCP data?
Case Study 1: Flow Cytometry
Flow cytometry has become an important tool to assess the cellular impact of biologics on clinical trial subjects and is often a key technology used to establish biomarkers.
For background:
- A flow cytometer is a lab instrument made up of 3 components: fluidics, optics, and an electronic computer system.
- The fluidic system streams individual cells suspended in fluid through a beam of light.
- As each cell passes through the light, it scatters light and fluoresces, creating light signals.
- The optic system detects the light signals and converts them to electronic signals.
- These are stored by the computer in a standard format, allowing the scientist to perform analyses.
Once the data are saved, they can be subset through a process called gating, which lets the scientist focus the analysis on the cells of interest. Gates can be changed manually.
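To make gating concrete, here’s a minimal sketch of what a gate amounts to computationally: a set of boundaries used to subset the saved event data. The channel names, values, and rectangular gate are my own illustration, not the workings of any particular cytometer or analysis package.

```python
import numpy as np

# Synthetic event data: each row is one cell (event), each column a measured signal.
# Channel names and values are invented for illustration only.
rng = np.random.default_rng(0)
events = {
    "FSC": rng.normal(50_000, 10_000, size=10_000),  # forward scatter (cell size)
    "SSC": rng.normal(30_000, 8_000, size=10_000),   # side scatter (granularity)
}

# A "gate" is a region of interest; here, a rectangle on the two scatter channels.
gate = {"FSC": (40_000, 70_000), "SSC": (20_000, 45_000)}

# Gating = keeping only the events that fall inside the region.
in_gate = (
    (events["FSC"] >= gate["FSC"][0]) & (events["FSC"] <= gate["FSC"][1])
    & (events["SSC"] >= gate["SSC"][0]) & (events["SSC"] <= gate["SSC"][1])
)

print(f"{in_gate.sum()} of {in_gate.size} events fall inside the gate")
```

Move either boundary and a different subset of cells flows into every downstream statistic, which is exactly why a record of gate changes matters.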
When scientists do change the gates, the changes may not be included in an audit trail, because:
- The instrument was not designed with an audit trail
- The instrument has an audit trail, but the lab did not turn it on
- There is an audit trail, but for it to work, the gates must be saved first, and the scientist believes “I can change the gates as often as I like! It’s not data until I say it’s data!”
When this happens, the previous gate setting has been recorded in invisible ink. That is to say, the previous gate values haven’t been recorded at all.
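For contrast, here’s a sketch of the kind of record an audit trail would preserve when a gate is changed: who changed it, when, and what the old and new boundaries were. The structure and field names are assumptions for illustration only, not the design of any instrument’s software.

```python
from datetime import datetime, timezone

audit_trail = []  # append-only log of gate changes

def change_gate(gates, channel, new_range, user, reason):
    """Apply a new gate range and record the old and new values."""
    old_range = gates.get(channel)
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "channel": channel,
        "old_range": old_range,
        "new_range": new_range,
        "reason": reason,
    })
    gates[channel] = new_range
    return gates

gates = {"FSC": (40_000, 70_000)}
change_gate(gates, "FSC", (45_000, 65_000), user="analyst1", reason="exclude debris")

for entry in audit_trail:
    print(entry)
```

Without something like this, the previous gate values simply vanish the moment they are overwritten. That is the invisible ink.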
FDA Warning Letters issued to manufacturers of active pharmaceutical ingredients and finished pharmaceuticals say that’s not ok. Why is it ok for a GLP or GCP study?
Case Study 2: Electronic Data Capture
Electronic data capture (EDC) systems have become sponsors’ technology of choice for recording case report form data in clinical trials. The records in the EDC system form part of the subjects’ case histories, for which the investigator bears sole responsibility under 21 CFR Part 312.
This puts investigators in the uncomfortable position of being responsible for records over which they have limited control once they have saved them.
I want to talk with you now about what can happen with these records before the investigator saves them.
In a well-meaning effort to make electronic Case Report Form (eCRF) data as clean as possible as quickly as possible, some EDC systems are designed to let queries fire on entered data as soon as the cursor leaves the current data entry field, before the data on the eCRF are saved. Let’s look at what that means in practice, using 2 scenarios.
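As a simplified illustration of that timing, the sketch below models an edit check that evaluates the moment the cursor leaves a field, before the eCRF page is saved. The field name, range check, and query text are invented; real EDC systems configure this behavior through their own rules engines.

```python
# Hypothetical model of an EDC edit check that fires on field exit,
# before the eCRF page has been saved. Names and limits are invented.

unsaved_entries = {}   # values typed but not yet saved
open_queries = []      # queries raised against entered data

def on_field_blur(field, value):
    """Called when the cursor leaves a data entry field."""
    unsaved_entries[field] = value
    # Edit check: systolic blood pressure expected within a plausible range.
    if field == "SYSBP" and not (60 <= value <= 250):
        open_queries.append({
            "field": field,
            "value": value,
            "query": "Value outside expected range (60-250 mmHg). Please verify.",
        })

def on_save():
    """Called when site staff save the eCRF page."""
    saved = dict(unsaved_entries)
    unsaved_entries.clear()
    return saved

on_field_blur("SYSBP", 400)
print(open_queries)  # a query already exists, yet no saved record does
print(on_save())     # only now do the entries become saved records
```

Note the gap the sketch exposes: a query can exist against a value that never becomes a saved record at all.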
Head here for Part 3 and this series’ conclusions…