When it comes to the purchase of any computerized system, the supplier must understand the data integrity requirements and develop the technical controls needed to ensure data integrity, according to the PIC/S data integrity guidance released last year.

Paul Smith, Global Compliance Marketing Specialist at Agilent Technologies, began his portion of the May 24 Redica webinar, “Are Laboratories Perpetuating Data Integrity Problems?” by referring to the PIC/S document.

[Related: View the webinar recording, including the presentation, here.]

He explained that global regulators want to see organizations move toward implementing technical solutions with technical controls when it comes to data integrity in the laboratory. This was a point he and his co-presenter Bob McDowall, a pharma consultant, returned to time and again throughout the webinar.

Smith then examined data integrity trends in FDA Warning Letters and 483 observations (Figure 1). His data covers the period right before and at the start of the COVID-19 pandemic.

FIGURE 1 | Two of the Strongest Trends in FDA Warning Letters

The chart on the right covers a narrower time period, five years compared with ten for the chart on the left (Figure 1). Its data comes from Warning Letters that mention out-of-specification (OOS) results. Smith emphasized that FDA Warning Letters are typically written at a high level; anyone looking for more detail about data integrity violations specific to laboratory equipment should turn to FDA 483s, which provide more specifics in that area.

[Related: An analysis of 483 observations issued from 2012 to 2021 using Redica’s 483 Report (also known as the Barb Report after GMP Consultant Barbara W. Unger) found 137 primary 483 observations involving system controls in laboratories. To see how you could run your own analysis of 483 observations involving laboratories, contact us today for a walkthrough of our platform.]

“One of the more recent trends is an increasing expansion of remediation actions and things laboratories have to do in order to be considered compliant after they receive a warning letter,” Smith explained. “Data integrity and OOS requirements apply to everything the lab does, but they also apply to instrument qualification as well as sample analysis.”

Regulators Evolve Approach to Data Integrity

Next, he delved into the Data Quality Triangle (Figure 2).

FIGURE 2 | Data Quality Triangle

“If you add a foundation layer to that triangle, that foundation layer becomes data governance,” Smith said. “If an organization’s laboratory doesn’t have data governance that is good or strong, then in principle, everything above it becomes at risk.”

He then explained that global regulators look for more information when they evaluate data integrity today than they did in the past. For example, FDA previously might report in a Warning Letter that the investigator found evidence of fraud. Now investigators are more likely to report that the system’s technical controls did not prevent it, for example, someone deleting data or holding inappropriate administrator rights in the software.
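
To make the idea of technical controls concrete, the following is a minimal sketch, not taken from the webinar, of a hypothetical data store that keeps an append-only audit trail and blocks ordinary users from deleting records (retiring them rather than erasing them). All names here (LabDataStore, retire, the role constants) are illustrative assumptions, not features of any real laboratory system or vendor product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, List

# Illustrative role names; real lab software defines its own permission model.
ANALYST = "analyst"
ADMIN = "admin"


@dataclass
class AuditEntry:
    timestamp: str
    user: str
    action: str
    detail: str


@dataclass
class Record:
    record_id: str
    content: str
    active: bool = True  # records are deactivated, never physically erased


class LabDataStore:
    """Toy data store showing two technical controls: an append-only
    audit trail and role-based restrictions on removing records."""

    def __init__(self) -> None:
        self._records: Dict[str, Record] = {}
        self._audit: List[AuditEntry] = []

    def _log(self, user: str, action: str, detail: str) -> None:
        self._audit.append(AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            user=user, action=action, detail=detail))

    def create(self, user: str, record_id: str, content: str) -> None:
        self._records[record_id] = Record(record_id, content)
        self._log(user, "CREATE", f"record {record_id}")

    def retire(self, user: str, role: str, record_id: str, reason: str) -> None:
        # Control 1: analysts cannot remove data; the denied attempt is logged.
        if role != ADMIN:
            self._log(user, "DENIED", f"attempted retire of {record_id}")
            raise PermissionError("only an administrator may retire a record")
        # Control 2: even an administrator only deactivates the record;
        # the data and its audit history remain available for review.
        self._records[record_id].active = False
        self._log(user, "RETIRE", f"record {record_id}: {reason}")

    def audit_trail(self) -> List[AuditEntry]:
        return list(self._audit)


if __name__ == "__main__":
    store = LabDataStore()
    store.create("asmith", "HPLC-0042", "peak area 1532.7")
    try:
        store.retire("asmith", ANALYST, "HPLC-0042", "rerun")
    except PermissionError:
        pass  # the denied deletion attempt is still captured below
    for entry in store.audit_trail():
        print(entry)
```

The point of the sketch is simply that deletion is prevented by the system itself and every attempt leaves a trace, rather than relying on procedures and people to behave correctly.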

“In simple terms, an auditor is going to evaluate the integrity of everything your lab does, and begin to evaluate that against two things: the scientific and regulatory validity of the work you are performing and the data integrity of how the work was done and documented,” he explained.

Regulatory Guidance Supports Data Governance

For a closer look at specific data integrity regulatory guidance, McDowall pulled some examples from the 2021 PIC/S guidance. Although it is an internal document (“by regulators for regulators”), PIC/S has made it available to industry because it reveals the inspectorates’ mindset.

This guidance document addresses the crucial role of data governance. 

“It is the foundation of data integrity,” he said. “One of those elements is data ownership, but it is also management leadership. So, one of the key things is making certain that the data are owned by a key person and also that the integrity of that data is ensured.”

Per McDowall, this shows that ensuring data integrity requires two things from a supplier: an understanding of data integrity and regulatory requirements, and the ability to develop software with the technical controls needed to ensure data integrity and compliance.


McDowall and Smith also analyzed 104 FDA 483 observations and Warning Letter citations involving infrared spectroscopy equipment. They found that if a laboratory has not performed a thorough evaluation of its software, there is a 40% chance it will receive a 483 observation citing a lack of data integrity controls, audit trail issues, or an inability to protect records.

“The point we would really make is, if you do not evaluate software properly, it perpetuates your data integrity problem,” McDowall said. 

In Part II, Paul Smith and Bob McDowall provide a deeper analysis, including lessons learned from three Warning Letters, the importance of continual improvement, and the cost of noncompliance.

Additional Data Integrity Resources

Data Integrity and Your Clinical Investigator: What the Data Shows

Can Quality Culture Prevent GMP Issues Such as Data Integrity?

FDA Regulators Address Data Integrity and Lab Audit Trails

What Can Regulatory Data Tell Us About Data Integrity Trends?

Synergy at the Intersection of Data Integrity and Quality Culture

[Related: Click here to access the full webinar recording, including the slide presentation.]

Get a Demo

We can show you insights into any of your key suppliers, FDA investigators, inspection trends, and much more.

Request a Demo