This is the first in a series of posts based on a Redica Systems webinar featuring Ulrich Köllisch and Peter Baker in a wide-ranging Q&A on the multi-step process for reaching what they call “quality intelligence.”

Briefly, quality intelligence (QI) itself is a program of risk and data management that they see as an absolute precondition to applying Artificial Intelligence and Machine Learning (AI/ML) in life sciences manufacturing. And as Baker and Köllisch tell it, implementing QI is a multi-step process that includes two “key enablers,” data governance and knowledge management.

Find the full webinar here (alongside expanded biographies of our two guests). This article discusses the basics of QI, along with issues surrounding the first key enabler, data governance.

Ulrich Köllisch, who does most of the interviewing, is manager and subject matter expert on data integrity at GxP-CC, co-heads a PDA.org special interest group, and participates actively in other industry knowledge groups. Peter Baker, who mostly serves as the interviewee, is a former FDA investigator and current president of Live Oak Quality Assurance.

“What is Quality Intelligence?”

Köllisch’s question is a little hard to answer, at least quickly. QI itself “is not something that you could just write an SOP for and implement,” Baker says, “in the next month or the next quarter or even the next year.”

So Baker begins with the essentials, getting to quality intelligence by way of the two key enablers described in the ICH Q10 Pharmaceutical Quality System guidance document.

As mentioned above, these are data governance, with its risk-management tools listed in Figure 1 below, “and also this concept of knowledge management,” which is the totality of a site’s objective and tacit knowledge, integrated into a risk-management plan.

Baker explains QI in terms of a “road map to success.” To reach the destination, firms need to make “pit stops” — achieve specific goals outlined in regulatory guidance, such as those enablers in Q10. (“Guidance” can include actual guidance documents as well as inspection documents, such as FDA warning letters. “Especially when they talk about vigilant monitoring of inter- and intra-batch variability,” says Baker.)


Figure 1 | Peter Baker’s roadmap for reaching organizational quality intelligence

The First Two Stops

Baker details the Roadmap more explicitly in a previous Redica Systems webinar, and in this article, we will concentrate mostly on the second stop.

  • Data Integrity. ALCOA+ concepts form its foundation: Attributable, Legible, Contemporaneous, Original, and Accurate, plus Complete, Consistent, Enduring, and Available.
  • Data governance. Integration of quality risk management (a key enabler) throughout operations. Includes data and process mapping, as well as qualitative risk assessment. See the 2021 PIC/S guidance “Good Practices for Data Management and Integrity.”

Key Enabler: Data Governance

In Baker and Köllisch’s understanding, the fundamentals of data governance are found in the 2021 PIC/S guidance, “Good Practices for Data Management and Integrity.” It requires a written, risk-based strategy on the management of certain kinds of data, whether that’s the log related to a WFI system or the half-dozen audit trails associated with a chromatography system, says Baker.

Other types of information are important too: life sciences facilities are also full of on-the-job knowledge about what affects the process and how. Data governance includes the gathering up of this kind of information, in the form of qualitative risk assessments.

These assessments are what investigators are looking for. They’ll ask, “What is your risk-based strategy? Is it batch-by-batch, periodic, or as-needed?” Baker says. “It doesn’t have to be [that] you’re reviewing it batch-by-batch every single day.”

If there’s no written strategy, it’s an automatic 483, because the grace period for adopting the principles in the PIC/S guidance is “pretty much over,” Baker says.

“The EMA concept paper on the Annex 11 revision is also very clear on that point,” adds Köllisch. “Any grace periods — for example, for outdated systems — are over.”

And regulators seem to be focusing on data handling ever more intensely. In Redica Systems analyses of FDA warning letters, “about 80% of those have to do with data integrity keywords,” such as “accuracy, loss of confidence in the accuracy of data or the completeness of the data,” says Baker.

And that fits with his own conclusions based on reading 483s in the Redica Systems database, he says.

Which Comes First, Data Integrity or Governance?

While the Roadmap to Quality Intelligence positions Data Integrity, built on the ALCOA+ principles, as a necessary step before Data Governance, it’s not that simple. “Often it’s seen vice-versa,” says Köllisch.

“Data governance leads us to data integrity,” he says, as outlined in this BioPhorum position paper on FAIR data principles (Findable, Accessible, Interoperable, and Reusable). But it only works “if we get our systems under control, and our data governance and our quality culture in place,” Köllisch says.

Baker agrees. Despite some managers seeming to believe that training employees on ALCOA+ principles would handle data integrity concerns, training is “just the very, very first step in the process,” he says. Data governance “is how you ensure ALCOA.”

Verification and Review for Less-Critical Data

Less-critical data generates a lot of interest among quality professionals, says Köllisch. So he and Baker tackle it directly.

How much verification or review is enough?

First, by “less-critical data,” Köllisch means “data supporting our quality management system, or maybe data which supports the computer system validation.”

The 2021 PIC/S guidance refers to this data as “non-critical audit trails,” notes Baker, and the “easy way” to handle it follows an SOP with a decision tree that deals with it “once every six months or every year.”

“But I don’t think that’s the right way to go,” Baker says. Citing ISPE’s GAMP 5 Appendix M-12 (“Critical Thinking”), Baker says that with “the use of rigid tools or overly prescriptive methods, you’re going to guide yourself into doing either too much or too little,” instead of following a true risk-based approach. “It impedes critical thinking with use of the qualitative tools that we’ve been talking about,” he says.

Instead of an SOP and decision tree, that kind of risk management “should be done according to [ICH guideline Q9(R1) “Quality Risk Management”], in the risk acceptance section,” Baker says. “They say risk acceptance depends on many factors,” and the frequency of document review “should be done on a case-by-case basis.”

Keeping Eyes on the Road

It’s not always possible for everyone in the industry to keep up as regulatory agencies refine their views. And plenty of useful guidelines and hints are sprinkled throughout agency publications, sometimes where you might not expect them.

Redica Systems keeps your eyes on the road, helping your teams by delivering the relevant regulatory surveillance signals to your inbox.

With Redica Systems, you can keep up with the FDA road signs that are most important to your company’s journey to quality intelligence. And you’ll take advantage of the most comprehensive and digestible feed of regulations and standards, built by our team of experts using advanced machine learning models.

Contact Redica Systems for a walk-through today!

Get a Demo

We can show you insights into any of your key suppliers, FDA investigators, inspection trends, and much more.

Request a Demo