PEER-REVIEWED

Dissecting the 21 CFR Part 11 Controversy

Trudy Yin

ABSTRACT

The original intent of 21 CFR Part 11, which governs the use of electronic records and electronic signatures, was to encourage industry to adopt new technologies, but the regulation has had the opposite effect. Revisions of its guidance documents and the continual redirection of the rule have not ended the confusion and debate surrounding this subject. This paper diagnoses the problems of Part 11 within the framework of the three root causes identified in Gordon B. Richman's "The Part 11 Controversy-A Root Cause Analysis and Science-Based Solutions," and explores the proposed risk-based remediation.

INTRODUCTION

Published in 2005, Gordon B. Richman's article "The Part 11 Controversy-A Root Cause Analysis and Science-Based Solutions" (1) identifies three root causes of the well-known controversy surrounding the requirements of the 21 CFR Part 11 regulations. These three root causes are summarized as follows:
• The US Food and Drug Administration's failure to recognize the reality gap between perceptions and practices
• Industry's inadequate transfer of process-based quality concepts to computerized systems
• The lack of understanding of good software/system engineering fundamentals (e.g., the validation concept).
The discussion then turns to possible solutions for remediating these Part 11-derived issues. Where incorrect preconceptions persist, or where basic components of good practice such as audit trails and traceability are neglected, the author sees the need for an overall cultural change, because industry's poor practices are considered the core of non-compliance. For example, pharmaceutical companies place blind faith in their own in-house or purchased systems until those systems are challenged by FDA. The lack of cross-functional teams disconnects the development and business sides of an organization. On the regulator's side, the published guidance, which was supposed to help clarify the confusion, often poses an additional burden on companies. As a result, both FDA and industry need to make an effort to eliminate the disparity in interpreting the rule. FDA's 21st century risk-based approach, which builds upon the use of hazard control, appears more welcome to industry. It is also the most effective way to help companies correct previous misconceptions and thereby move toward quality software and systems based on good science and practices.

FAILURE TO RECOGNIZE THE REALITY GAP BETWEEN PERCEPTIONS AND PRACTICES

Two of the three underlying root causes of the Part 11 failure point to industry's malpractice. This idea rests on the argument that the Part 11 final rule was built upon the computer science fundamentals of good software and systems engineering practices, which FDA expected the pharmaceutical and healthcare sectors to have already acknowledged and understood. This assumption not only misjudged the reality of the targeted audience, but also overestimated the agency's own competence in this area. What most analysts have overlooked is the deficiency within the agency's inspectorate. Inspectors themselves may not have been proficient enough to audit companies against the Part 11 standard when the regulation was first published. These inspectors held various opinions on what constitutes Part 11 compliance. Many of them are not software engineers and, therefore, are not entirely comfortable with electronic documentation (2). The requirements themselves are ambiguous and disconnected from what the regulator should look for and what industry should do to fulfill them. The title of Part 11, "Electronic Records; Electronic Signatures," which, as Richman's article indicates, might better have been "Good Software and System Engineering Practices," is a good example of what is "off-track" about the regulation. This explains why few warning letters were issued in the late 1990s when the Electronic Record and Electronic Signature (ERES) requirements were first published (3). Although industry has been expecting either a revocation or an update of the regulation, there are reasons why FDA will not rescind it. Janet Woodcock of the Center for Drug Evaluation and Research (CDER) explains that a rule on electronic records is unavoidable because industry needs direction on how to maintain electronic records and good manufacturing practice (GMP) documentation as the use of automation continues to grow (4). Moreover, Part 11 affects other federal regulations (5) and coincides with worldwide standards (e.g., ISO standards, EU GMP Guide Annex 11), so revoking the rule would trigger a negative chain reaction. After more than a decade, the anticipated revision or amendment of Part 11 seems an endless wait.

ABOUT THE AUTHOR
Trudy Yin is president of and a consultant with TY Consulting Services. She provides global quality, regulatory compliance, and validation support for pharmaceutical, biotechnology, and medical device clients. Trudy may be reached by e-mail at [email protected] or [email protected].

Journal of Validation Technology [Winter 2010]

INADEQUATE TRANSFER OF PROCESS-BASED QUALITY CONCEPTS TO THE COMPUTERIZED SYSTEM

In 2003, FDA re-issued a guidance document for the purpose of clarification and re-focus. Although many have argued that the revised guidance document compounded the problem, there is no concrete evidence or business case to support this claim. With its improvements, the 2003 guidance reflects a paradigm shift from a product-oriented to a process-centric emphasis (6). Since Part 11 first went into effect in 1997, this process-centric focus (which originated in the 1980s) has not been fully transferred into the computer-based environment. The entire idea (i.e., the necessity of the ERES requirements) should therefore be treated as an emerging area and should not be expected to come out perfect the first time. FDA also realized that the initial broad interpretation of the regulation made implementation impractical; the new guidance was, therefore, created to narrow the scope and interpretation of Part 11. Under this new "narrow scope," the agency exercises enforcement discretion on Part 11 requirements related to validation, audit trails, legacy systems, copies of records, and record retention (7). The new scope still covers a broad range of documents, but the requirements are set forth in the predicate rules, with which industry should be familiar or to which it already adheres. Today, organizations are aware that compliance cannot be achieved by technology alone, but only together with the implementation of comprehensive administrative, procedural, design, and process controls. The message of a systematic and unified solution is surfacing.

LACK OF UNDERSTANDING OF GOOD SOFTWARE/SYSTEM ENGINEERING FUNDAMENTALS

The pharmaceutical industry has been charged with having inadequate software quality knowledge. This is the third root cause of the Part 11 failure. FDA's data on the 3,140 medical device recalls between 1992 and 1998 support this charge: good software engineering practices and validation were evidently insufficient during that period. The FDA study revealed 242 recalls due to software failures, and most of these failures were caused by defects introduced during post-market software modifications (see Figure 1) (8, 9). Because the software quality assurance concept has not been effectively translated, this claim redirects attention to software engineers. Good software and systems engineering practice, had it been adopted by the software industry, would have avoided what is historically known as the "software crisis," meaning that, statistically, software development has encountered more failures than successes (10). Moreover, professionals from cross-functional groups (i.e., information technology, legal, quality, and compliance) often share common terminology but mean different things by it (11); there is a lack of agreement on defining those terms. For instance, in common software parlance validation is no more than the execution of test scripts, which is quite different from the process validation concept adopted by the pharmaceutical industry. With the inherent confusion of these terms and their implications, "validating everything," although expensive, becomes a safeguard for regulatory compliance. The high cost of implementation and the difficulty of defining what falls under the regulatory requirement explain why industry continues to resist Part 11 (2).


ISSUE ON VALIDATION

Although industry has had misconceptions about the requirements for computer-related system validation, practitioners still agree that validation is the main reason for the high cost of Part 11 implementation (7). A poll conducted by NuGenesis in 2001 (12, 13) shows relatively high figures for the time and cost of Part 11 implementation (see Figures 2 and 3). From the industry perspective, the obstacles to Part 11 compliance arose because resources and technologies for implementation were lacking and the cost of transition skyrocketed (13). These were simply economic risks that business owners were unwilling to take. By 2003, according to some researchers, the cost of Part 11 compliance had reached hundreds of millions, even billions, of dollars industry-wide (2). The economic impact of this rule was not something FDA had expected. A reform of the collective mindset, although it may take considerable time and effort, is particularly important for tackling this misconception. Validation should not be treated as something additional to the basic components of good software and systems engineering practice (e.g., traceability, documentation, audit trails, and source code review) that should have been "built in" to the product (1). This idea is echoed in the process-centric paradigm mentioned above, whereby validation should be "a process," not "an event" (9). This "built-in" quality concept has long been supported by the quality-by-design (QbD) initiative. The QbD idea counterbalances "quality by inspection" and finished-product testing, which are considered the old validation practices.
FDA has redefined process validation as "the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality product" in its 2008 draft guidance Process Validation: General Principles and Practices (14). To stay abreast of the new risk-based directive, this guidance contains major updates that parallel the alternative validation approach (i.e., real-time quality assurance within the process analytical technology [PAT] framework and design space development to relieve excess revalidation/submission burden). A robust system never calls for volumes of paper documentation in the validation package. Costs will also adjust as a pattern of success is established (i.e., by eliminating extra spending on redundant or non-value-added testing and on fixing noncompliance issues). In other words, if requirements are tested,

Figure 1: Medical device recalls between 1992 and 1998.

Figure 2: Obstacles for Part 11 compliance.

defects are fixed, and the quality of the product is achieved and maintained at every step of the product development lifecycle, there will be a huge relief of regulatory burden and long-term cost savings. Part 11 compliance is the foundation of quality improvement, which is inseparable from lean manufacturing and Six Sigma goals.

RISK-BASED SOLUTION

As many would agree, the replacement of the traditional "validate everything" approach with a risk-based approach is the biggest advance of the 21st century. One intent of FDA's 21st Century Initiative is to align Part 11 more closely with risk-based elements (11). These have already been adopted in many industrial sectors, such as hazard analysis and critical control points (HACCP) in the food industry and failure mode and effects analysis (FMEA) in the aerospace sector. This risk-based approach is considerably more beneficial than the old prescriptive approach to regulation. Because the concept and application of risk management coincide with other international standards (e.g., ISO 14971, ICH Q9) and are already common practice in many industrial sectors worldwide, the earlier "reality gap" issue is now being bridged.

Figure 3: Economic impact of Part 11 compliance.
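FMEA, mentioned above, makes risk ranking concrete: each failure mode is rated for severity, occurrence, and detectability (commonly on 1 to 10 scales), and the product of the three ratings, the risk priority number (RPN), orders remediation work. The sketch below illustrates the arithmetic; the failure modes listed are hypothetical examples, not taken from the article.

```python
# Minimal FMEA sketch: rank failure modes by risk priority number (RPN).
# RPN = severity * occurrence * detection, each rated 1 (best) to 10 (worst).
# The failure modes below are illustrative only.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk priority number for one failure mode."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return severity * occurrence * detection

failure_modes = [
    # (description, severity, occurrence, detection)
    ("audit trail silently disabled", 9, 3, 8),
    ("weak password policy", 6, 5, 4),
    ("report template typo", 2, 4, 2),
]

# Highest-RPN items are addressed first.
ranked = sorted(failure_modes, key=lambda fm: rpn(fm[1], fm[2], fm[3]), reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {desc}")
```

Under this scheme, a severe failure that is hard to detect (such as a disabled audit trail) outranks frequent but minor cosmetic defects, which is exactly the prioritization the risk-based approach intends.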

TESTING METHODOLOGY

In today's computerized world, complexity increases as multiple hardware and software components, operational environments, processes, personnel, and standard operating procedures (SOPs) all become one integrated whole. The Good Automated Manufacturing Practice (GAMP) five software categories set general boundaries by grouping software by level of complexity. These categories can be used as a guide for making testing and validation decisions. In response to the growing demand for high productivity, quality, and compliance in drug and device manufacturing, the use of commercial off-the-shelf (COTS) software solutions has gained popularity. To qualify a COTS software vendor, either a formal vendor audit or a less rigorous survey can be conducted, commensurate with the level of risk the software represents. Sometimes, market reputation alone serves as vendor qualification. The use of a risk-based approach helps decision makers determine the degree of testing needed to validate a complex computerized system. A gap analysis is useful for indicating current compliance status. Validation activities can, therefore, be prioritized according to the results of this evaluation. As noted, validation should be treated as an integral part of the product lifecycle, from development through retirement. Yet these testing activities should have different focuses during different lifecycle phases to avoid unnecessary redundancy. The unit test, for example, should check that each unit is free from coding and design faults; the integration test should verify that components fit together and function properly after integration; the system test checks whether the fully integrated system meets specifications; and finally the system is turned over to the user for user acceptance testing to ensure the product meets user requirements. Users can then validate the product using the logical testing sequence of installation qualification (IQ), operational qualification (OQ), and performance qualification (PQ), but the overall computer system validation (CSV) is greatly simplified by not repeating what has been done before each handover or technology transfer. Besides satisfying regulatory requirements, user organizations also need to assess a system's trustworthiness from the business perspective and find the break-even point between cost and quality. A recent development is to select a methodology, such as the spiral model or agile methods, that further supports iterative software development with early user involvement. When the user requirement specification is defined at an early stage, the software is less likely to fail its intended use after commercialization.
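The progression from unit testing to integration testing described above can be sketched in a few lines. The module and test names below are invented purely for illustration; they do not come from any system discussed in the article.

```python
# Hypothetical module and tests illustrating the testing levels described
# above: unit tests exercise one component in isolation, integration tests
# exercise components working together.

def dose_per_kg(total_mg: float, weight_kg: float) -> float:
    """First unit under test: a single calculation."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return total_mg / weight_kg

def format_label(total_mg: float, weight_kg: float) -> str:
    """Second unit: formatting, which depends on the calculation above."""
    return f"{dose_per_kg(total_mg, weight_kg):.2f} mg/kg"

# Unit level: confirm each component is free of coding and design faults.
def test_unit_calculation():
    assert dose_per_kg(100.0, 50.0) == 2.0

# Integration level: confirm the components fit together after integration.
def test_integration_label():
    assert format_label(100.0, 50.0) == "2.00 mg/kg"

test_unit_calculation()
test_integration_label()
print("unit and integration checks passed")
```

System testing and user acceptance testing would sit above these levels, exercising the fully assembled application against specifications and user requirements rather than individual functions, which is why repeating the lower-level checks there adds cost without adding assurance.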

STRATEGIC COMPLIANCE

For a successful Part 11 implementation, it is recommended to follow a step-by-step process as follows:
• Determine the applicability of Part 11 by drawing out a decision tree (Figure 4). Questions such as "Does the system create, modify, store, or transmit data in a digital format?" and "Are the data or records involved in a GxP-related process?" can be asked.
• Plan and perform risk assessments and define software categories. The newest GAMP 5 model has added another dimension (i.e., risk management) to its original V-model (15). Risk assessments should be used iteratively during the entire software lifecycle. GAMP software categories, intended use(s), and the complexity of the system are all good indicators of "how much validation is enough."


• Validation master planning. Define the scope and extent of validation, commensurate with the levels of risk, in a validation master plan (VMP) supported by the formal use of risk assessments. Careful up-front planning in which users and COTS vendors or in-house developers work together breaks down silos and facilitates a common process/system understanding.
• Implement applicable Part 11 controls. Key controls include requirements for security measures, the authenticity of electronic signatures, the reliability of e-records, the integrity of the data, and the management of audit trails. If the previous steps have been closely followed, the implementation of these controls will fall into place almost automatically.
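The applicability questions in the first step above form a simple two-question decision tree, and expressing them as code makes the logic unambiguous. The `System` record below is a hypothetical sketch introduced only for illustration; it is not an artifact of the article or of any real compliance tool.

```python
# Sketch of the Part 11 applicability decision tree (both questions must be
# answered "yes" for Part 11 to apply). The System dataclass is hypothetical.
from dataclasses import dataclass

@dataclass
class System:
    name: str
    handles_digital_records: bool  # creates, modifies, stores, or transmits digital data
    gxp_related: bool              # data or records support a GxP-regulated process

def part11_applies(system: System) -> bool:
    """Question 1 AND question 2 from the decision tree."""
    return system.handles_digital_records and system.gxp_related

lims = System("LIMS", handles_digital_records=True, gxp_related=True)
menu_app = System("cafeteria menu app", handles_digital_records=True, gxp_related=False)
print(part11_applies(lims))      # subject to Part 11
print(part11_applies(menu_app))  # not subject to Part 11
```

Systems that pass both questions then proceed to the risk assessment, risk management master plan, and validation master plan steps described above.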

SUMMARY

The causal relationships drawn out here lead to a conclusion that is not the end but only the beginning of treating the Part 11 symptoms. After all, root cause analysis is done in order to eliminate those causes. The science-based solution, built upon the use of hazard control, points in exactly the direction toward which FDA and industry are gradually converging. To summarize, the new Part 11 solutions can be grouped into the following three themes, each contrasted with the old:
• Narrow vs. broad interpretation of Part 11
• A process/system-centric vs. a result-oriented mindset or culture
• A risk-based vs. a prescriptive approach toward compliance.
Because advancement in the computer-related environment will not be held back by any public dispute, stakeholders will need to move rapidly out of their comfort zone and into the new generation of Part 11 compliance.

Figure 4: Step-wise Part 11 implementation. The decision tree asks: Does the system create, modify, store, or transmit data in a digital format? If no, the system is not subject to 21 CFR Part 11 requirements. If yes: Are the system or its records involved in a GxP-related process? If no, it is not subject; if yes, it is subject to 21 CFR Part 11 requirements and the following steps apply: risk assessment; risk management master plan; validation master plan; prioritize and determine the extent of validation tasks according to each risk level; implement applicable Part 11 controls for each risk level.

REFERENCES
1. G.B. Richman, "The Part 11 Controversy-A Root Cause Analysis and Science-Based Solutions," Pharmaceutical Technology, ABI/INFORM Global, 2005, pp. S16-S27.
2. Barbara DePompa Reimers, "Easing the Pain of Part 11" [online], BIO-IT World, Apr. 15, 2003 [cited Jun. 25, 2009], available from: http://www.bio-itworld.com/archive/041503/strategic_pain.html.
3. Mark A. Heller and Stephanie Philbin, "Part 11: The FDA's New View" [online], BIO-IT World, Apr. 15, 2003 [cited Jun. 27, 2009], available from: http://www.bio-itworld.com/archive/041503/strategic_fda.html.
4. "FDA Director Explains the Changes" [online], BIO-IT World, Apr. 15, 2003 [cited Jun. 25, 2009], available from: http://www.bio-itworld.com/archive/041503/strategic_changes.html.
5. L. Huber, "21 CFR Part 11: Past, Present, Future," Journal of GXP Compliance, Vol. 12, No. 1, October 2007, pp. 22-37.
6. "21 CFR Part 11: Strategies and Best Practices for Compliance Process Control Solution," Mastering Regulatory Compliance-Best Practices Executive Series, Amadeus International, QC, Canada, 2008.
7. FDA, Guidance for Industry: Part 11, Electronic Records; Electronic Signatures-Scope and Application, August 2003.
8. FDA, General Principles of Software Validation; Final Guidance for Industry and FDA Staff [online], Jan. 11, 2002 [cited Aug. 15, 2009], available from: http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm085281.htm.
9. "Software Validation for Compliance Process Control," Mastering Regulatory Compliance-Best Practices Executive Series, Amadeus International, QC, Canada, 2008.
10. E. Georgiadou, "Software Process and Product Improvement: A Historical Perspective," Cybernetics and Systems Analysis, Vol. 39, No. 1, Jan./Feb. 2003, pp. 125-141.
11. J. Avellanet, "How Part 11 Compliance Impacts QbD," Pharmaceutical Manufacturing, Putman, Vol. 6, Issue 9, October 2007.
12. "(News and Analysis) Are Device Manufacturers Ready to Comply with 21 CFR Part 11 Requirements?" [online], Medical Device and Diagnostic Industry, December 2001 [cited Jul. 18, 2009], available from: http://dev.devicelink.com/mddi/archive/01/12/007.html.
13. Oliver Schmidt, "GMP News: Poll on 21 CFR Part 11" [online], European Compliance Academy, December 4, 2002 [cited Jul. 18, 2009], available from: http://www.gmp-compliance.org/eca_news_268.html.
14. FDA, Draft Guidance: Process Validation: General Principles and Practices, November 2008.
15. GAMP 5, A Risk-Based Approach to Compliant GxP Computerized Systems, ISPE, fifth edition, February 2008.

JVT


ARTICLE ACRONYM LISTING
CDER: Center for Drug Evaluation and Research
COTS: Commercial Off-The-Shelf
CSV: Computer System Validation
ERES: Electronic Record and Electronic Signature
EU: European Union
FDA: US Food and Drug Administration
FMEA: Failure Mode and Effects Analysis
GAMP: Good Automated Manufacturing Practice
GMP: Good Manufacturing Practice
HACCP: Hazard Analysis and Critical Control Point
ICH: International Conference on Harmonisation
IQ: Installation Qualification
ISO: International Organization for Standardization
OQ: Operational Qualification
PAT: Process Analytical Technology
PQ: Performance Qualification
QbD: Quality by Design
SOPs: Standard Operating Procedures
VMP: Validation Master Plan
