The issuance of the new FDA Process Validation Guidance in January of this year is a significant event for several reasons. Fundamentally, the definition of what constitutes acceptable process validation differs dramatically from the conventional definition first put forth in the 1987 guidance. This guidance, more than any of the ICH documents or previous FDA guidance issued since the Critical Path Initiative in 2003, attempts to legislate the transformation envisioned by the agency. In 2004 the FDA issued its landmark guidance Pharmaceutical cGMPs for the 21st Century—A Risk-Based Approach, which advocated a more scientific approach to demonstrating process quality. Since that time there have been many discussions regarding how to implement the principles of Quality by Design (QbD) and risk-based decision making. Few could deny that the adoption of these new principles has proceeded at a glacial pace. There are many reasons the industry has been slow to embrace these new concepts, despite their potential benefits in terms of process predictability and business performance. The underlying challenge, beyond the development of organizational expertise and resource allocation, is the significant paradigm shift in compliance thinking.
Historically, the foundation of our industry’s compliance philosophy rested on the three quality pillars of inspection, testing, and documentation, while the 2004 FDA guidance advocates a product quality philosophy based upon process understanding and the scientific application of risk to maximize the potential for predictable product performance. To do this, the industry found itself supplanting industry standard practice—and the three-validation-lot rule of thumb—with a more descriptive methodology that required the industry to design and defend its approach to process and product development. This new, descriptive approach requires a thorough understanding of statistics, probability, and risk. Even if the development team were equipped to meet this challenge, the compliance organization often found itself lacking the necessary linkage between the old philosophy and the new. Consequently, in the minds of most quality professionals, the concept of risk-based process validation meant adding risk to the compliance equation rather than reducing it.
So now the agency has drawn a line in the sand with its new process validation guidance. If compliance professionals are to make the transition to the new guidance, there will have to be a clear roadmap to articulate, in broad terms, the necessary quality attributes for each stage of the new guidance. I would advocate the following deliverables to ensure a clear compliance position as the process moves through the three stages of process validation.
STAGE 1
Stage 1 of the new guidance requires identifying the critical process parameters that drive process stability. The guidance goes further to describe establishing the knowledge, design, and control space for the process. This phase is intended to identify as many sources of variation as reasonably possible and to establish the first correlation between process performance and product performance. Deliverables from this stage should reference some risk-based assessment of potential failure modes in the process based upon the product design. This assessment should map easily to the areas of process characterization performed. In addition, this stage should begin the discussion regarding sample size, sampling technique, in-process testing metrics, and measurement system capability. To be complete, the Stage 1 activity should also clearly define and defend which parameters are not critical to process stability. This understanding is essential to establishing the common ground for the Stage 2 compliance argument.
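To make the link between the risk assessment and the characterization plan concrete, the minimal Python sketch below ranks hypothetical failure modes with an FMEA-style risk priority number (RPN). The parameters, scores, scales, and threshold are illustrative assumptions, not requirements taken from the guidance.

```python
# Minimal sketch: ranking hypothetical failure modes by RPN so that Stage 1
# characterization effort maps to the highest-risk process parameters.
from dataclasses import dataclass

@dataclass
class FailureMode:
    parameter: str   # candidate process parameter (illustrative)
    severity: int    # impact on product quality, 1 (low) to 10 (high)
    occurrence: int  # likelihood of occurrence, 1 to 10
    detection: int   # difficulty of detection, 1 (easy) to 10 (hard)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

failure_modes = [
    FailureMode("granulation moisture", severity=8, occurrence=5, detection=4),
    FailureMode("blend time", severity=6, occurrence=3, detection=3),
    FailureMode("compression force", severity=7, occurrence=4, detection=2),
]

RPN_THRESHOLD = 100  # assumed cut-off; the actual threshold is a quality decision
for fm in sorted(failure_modes, key=lambda f: f.rpn, reverse=True):
    action = "characterize" if fm.rpn >= RPN_THRESHOLD else "justify as non-critical"
    print(f"{fm.parameter:22s} RPN={fm.rpn:4d} -> {action}")
```

Parameters falling below the agreed threshold would be the ones defended as non-critical in the Stage 1 deliverable.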
STAGE 2
This stage introduces a new concept called the Process Performance Qualification (PPQ). Right away this could cause confusion with the Performance Qualification (PQ) historically performed as the last stage of equipment qualification. Understanding the components of the PPQ is essential to garnering buy-in. The challenge most compliance professionals will have with this stage is that the level of characterization will vary depending upon the thoroughness of the work performed in Stage 1. Risk is always part of any quality assessment; however, it is rarely quantified in routine quality decision making. In this case, the sampling plan and acceptance criteria will have to be justified based upon the Stage 1 risk assessment and the final control space.
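One way to ground the sampling discussion, assuming an attribute (pass/fail) acceptance criterion, is the "success-run" relation n = ln(1 − C) / ln(R). The sketch below computes the zero-failure sample size for a few illustrative confidence/reliability targets; the percentages are assumptions for illustration, not values from the guidance.

```python
# Minimal sketch: zero-failure ("success-run") sample size for an attribute
# acceptance criterion, n = ln(1 - confidence) / ln(reliability).
import math

def success_run_sample_size(confidence: float, reliability: float) -> int:
    """Units that must all pass to claim `reliability` with `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for confidence, reliability in [(0.95, 0.90), (0.95, 0.95), (0.95, 0.99)]:
    n = success_run_sample_size(confidence, reliability)
    print(f"C={confidence:.0%}, R={reliability:.0%} -> n = {n}")
```

A higher-risk outcome from the Stage 1 assessment would normally be argued to justify a higher reliability target, and therefore a larger PPQ sample size.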
STAGE 3
The last stage—and perhaps the most confusing for compliance professionals—will be the monitoring portion of the process. All Quality Management Systems (QMS) require product performance monitoring as part of their annual product review. However, the new guidance is asking for more than that. The agency will look for evidence of monitoring of critical input and output parameters as well, in keeping with the new philosophy of process understanding driving product performance. The reality of all new processes is that it is nearly impossible to anticipate the variability from the six Ishikawa factors (man, machine, measurement, materials, methods, and environment) until routine manufacturing begins in earnest. So it is reasonable to establish a data-gathering phase before establishing alert and action limits for any process correction. One new advantage of this three-stage approach is that change control assessment will be more straightforward: the critical variables have been identified a priori, and the need to revalidate should be much easier to determine.
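As a simple illustration of the data-gathering idea, the sketch below derives provisional alert and action limits from hypothetical baseline assay results using the common mean ± 2σ / ± 3σ convention. Both the data and the convention are assumptions, not prescriptions from the guidance.

```python
# Minimal sketch: provisional alert/action limits from baseline monitoring data.
import statistics

baseline_assay = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2,
                  99.9, 100.6, 99.3, 100.1, 100.7, 99.6]  # % label claim (hypothetical)

mean = statistics.mean(baseline_assay)
sd = statistics.stdev(baseline_assay)

print(f"mean = {mean:.2f}, sd = {sd:.2f}")
print(f"alert limits  (+/-2 sd): {mean - 2*sd:.2f} to {mean + 2*sd:.2f}")
print(f"action limits (+/-3 sd): {mean - 3*sd:.2f} to {mean + 3*sd:.2f}")
```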
CONCLUSION
There is no doubt that the bar has been formally raised for all compliance professionals with this new guidance. The transformation recommended by ICH Q8 and Q9 is now formally required. If we are to be successful as an industry, it will be essential that we learn to adapt to the changing role and requirements of the new compliance philosophy. If we cannot, the road ahead will be a difficult one for the agency and industry alike.
Tuesday, July 26, 2011
Respiratory Hazards of Aerosol-Dispensed Cleaning Agents
Many doing critical cleaning have used aerosol-dispensed cleaning agents to reach certain portions of valuable surfaces so that the cleaning agent could do its “magic” and be removed along with the soil by a person using a fabric wiper. In this column, I want to address the chronic health hazard to the respiratory system associated with doing that work; in the subsequent column, I will address the acute flammability hazard the same work poses to staff. While this may be old business for some, others may not be aware of either hazard because aerosol-dispensed cleaners are common and have been used for so long.
AEROSOLS DEFINED
A liquid is converted to an aerosol when it is expanded under pressure through an orifice and sheared by the associated frictional forces from a continuous stream into small or tiny droplets. Most commercial aerosol cans produce fluid droplets ranging in size from barely visible (50 to 60 microns) down to the submicroscopic level.
DROP SIZE DETERMINES ALL
The hazard presented by an aerosol droplet to the respiratory system first depends upon the site within the respiratory system where it is deposited, and secondly upon its inherent toxicity at the point of deposition. The path to that site is determined by its aerodynamics in the air stream of breath, and aerodynamics is all about droplet size.
Droplets of a respirable aerosol larger than around 30 microns (possibly barely visible to the eye) are not well entrained in incoming air. It is likely that many settle outside the body, because their inertia outweighs the buoyant force of the moving breath, though some are trapped in the nose and mouth on inhalation. This is called an inertial separation mechanism.
Droplets sized between around 10 and 30 microns (not visible to the eye) penetrate into the curving, tortuous path that is the throat (pharynx) but impact on and stick to wet tissue surfaces. So they become deposited in the airways of the head. Smaller droplets are carried deeper, into the lung itself.
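A minimal sketch of the size-based reasoning above, using the approximate cutoffs from the text (roughly 30 microns for inertial separation and 10 to 30 microns for deposition in the head airways); the thresholds are rough illustrations, not regulatory values.

```python
# Minimal sketch: rough deposition region for an inhaled droplet by diameter.
def deposition_region(diameter_um: float) -> str:
    if diameter_um > 30:
        return "settles out or is trapped in the nose/mouth (inertial separation)"
    if diameter_um >= 10:
        return "deposits in the airways of the head (pharynx)"
    return "can be carried toward the deep lung"

for d in (50, 20, 5, 0.5):
    print(f"{d:>4} micron droplet: {deposition_region(d)}")
```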
SOLVENT AEROSOL VS. SOLVENT VAPOR
Solvent in the vapor phase would penetrate to this position as well. But the crucial difference, what makes exposure limits of solvent aerosols so low, is essentially density of contact.
The mass of liquid solvent held in a droplet of less than 1 micron diameter contains huge numbers of solvent molecules, all of which are deposited where that droplet becomes trapped on wet lung tissue; vapor also contains huge numbers of solvent molecules, but they condense over the more than fifty square meters of wet lung tissue.1
Inhaling aerosols damages the lungs by administering a large dose of whatever toxic harm the solvent presents to a myriad of sites where oxygen transfer with the blood is accomplished. Aerosols thus become an amplifier for the application of toxic damage to the lungs—the applied dose is exaggerated versus exposure to solvent vapor. Consequently, solvents that are less toxic and have higher exposure limits must be treated as if they were more toxic and had lower exposure limits.
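A back-of-the-envelope sketch of the "density of contact" argument: even a droplet of about 1 micron carries an enormous number of solvent molecules, all delivered to one spot. The density (~1 g/cm³) and molar mass (~100 g/mol) are illustrative assumptions, not properties of any particular solvent.

```python
# Minimal sketch: order-of-magnitude count of solvent molecules in a 1 micron droplet.
import math

AVOGADRO = 6.022e23            # molecules per mole
density_g_per_cm3 = 1.0        # assumed solvent density
molar_mass_g_per_mol = 100.0   # assumed solvent molar mass
diameter_cm = 1.0e-4           # 1 micron expressed in centimeters

volume_cm3 = math.pi / 6.0 * diameter_cm**3
moles = volume_cm3 * density_g_per_cm3 / molar_mass_g_per_mol
print(f"~{moles * AVOGADRO:.1e} molecules in a single 1 micron droplet")
# All of these land where the droplet sticks; the same molecules arriving as
# vapor would be spread over tens of square meters of lung surface.
```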
EXPOSURE LIMITS (GASP!)
TLVs® for solvent aerosols are typically two orders of magnitude below those for solvent vapors. Yes, that’s two orders of magnitude because of the concentration effect noted above.2
RELIEF FROM THE HOSPITAL
The hazards described above can be overcome, as can most hazards involved with the use of solvents. The approach is simple: obtain and use personal protective equipment. An operator applying solvents to surfaces via aerosol delivery should, at a minimum, wear an N95 hospital breathing mask (and possibly appropriate gloves) during and after use; at maximum, a self-contained respiration system.
The N95 rating means the mask will remove at least 95% of particles (droplets) 0.3 microns and larger. Every hospital supply store sells them at a reasonable cost.
FIRE NEXT TIME
In next month’s column we’ll cover the other hazard of using aerosol-dispensed cleaning solvents, flammability, and present that in a graphic manner which may be new to some.
References:
- Luttrell, W. E., Stull, K. R., and Jederberg, W. W., Toxicology Principles for the Industrial Hygienist, American Industrial Hygiene Association, 2008, ISBN 1931504881.
- Poppendorf, W., Industrial Hygiene Control of Airborne Chemical Hazards, CRC Press, 2006, ISBN 0849395283, Chapter 4, Table 4.1 and Figure 4.2, pages 80 to 82. This analysis is somewhat flawed by there being only a small number of solvent aerosols for which there is epidemiological data and so an exposure limit.
The Fourth State of Matter—Part 2
Plasma, the fourth state of matter, is a powerful tool for critical cleaning, controlling contamination, and achieving the appropriate surface quality. A wide range of applications are in use; even more have been proposed. Examples include removing minute traces of contamination from semiconductor wafers, preparing surfaces for coatings, removing masks and markings, decontaminating and sterilizing medical devices, and skin rejuvenation.
In the previous column we explained that plasma is a mixture of atoms, ions, and electrons that results when sufficient energy (heat or voltage) is applied to a gas. The efficacy of plasma for surface modification and cleaning is due to the energy of the plasma particles and/or the ultraviolet light they emit, as well as to the chemical interaction of plasma constituents with surface molecules.
Plasma parameters, including the choice of gas and the pressure mode (vacuum or atmospheric), provide considerable versatility. Two of the many areas of applicability are medical device cleaning and surface preparation for bonding and coating.
MEDICAL DEVICE PROCESSING
Because plasma can destroy as well as remove protein, there is potential utility in biomedical applications, including cleaning of reusable medical instruments such as forceps and dental drills. Plasma cleaning could be inserted as an added step between washing and autoclaving for high-risk situations. Research conducted at The University of Edinburgh1 indicates that plasmas can destroy and remove proteins, including the prions believed to cause transmissible spongiform encephalopathies (TSEs) such as “mad cow disease.”
CLEANING PRIOR TO BONDING OR COATING
Plasma not only can remove trace levels of contaminants but also can “roughen” the surface to improve the adhesion of a subsequent bonding, marking, or coating. Plastics and composite materials that might be damaged by traditional cleaning chemicals are among the materials treated by plasma. Atmospheric plasmas are line-of-sight and can be used for treating localized regions of a surface or can be scanned across a larger region.
Just as there is a large diversity of options for cleaning agents and methods in traditional liquid cleaning, plasma provides many options. Because much of the effect of plasma on surfaces is a combination of physical momentum and chemical reaction, the choice of gas to be used is an important parameter. Table 1 summarizes chemical and physical properties and uses of major gases used in plasma cleaning.
Many of the applications for plasma are still untapped. No cleaning or contamination removal method can fit all applications. Plasma cleaning and surface treatment provides additional tools that can be considered in the quest to use the most efficacious technique to achieve the desired surface properties.
References:
- H. Baxter et al., “Application of epifluorescence scanning for monitoring the efficacy of protein removal by RF gas–plasma decontamination,” New Journal of Physics 11 (2009).
- K. Sautter and W. Moffat, “Gas Plasma—A Dry Process for Cleaning and Surface Treatment,” in Handbook for Critical Cleaning: Cleaning Agents and Systems, B. Kanegsberg and E. Kanegsberg, editors, CRC Press, 2011.
The Importance of Ongoing Facility Monitoring
Facility Monitoring and the routine periodic documentation of this information are vital to maintaining the cleanroom facility at optimal operational efficiency.
Regardless of the manufacturer you select for your facility monitoring instrumentation, a decision has to be made early on as to how stringent the sensor tolerance must be for your specific application. For example, can you accept a 3% tolerance on your relative humidity reading, or do you need a 0.5% tolerance? This is where you do not want to buy on price alone; you also want to know about on-site calibration or the turnaround time for off-site calibration.
The first line of defense is the monitoring of the cleanroom’s pressurization, from the main cleanroom outward to lesser clean areas. Whether you have a remote monitoring system or gages on the wall, it is imperative that the Cleanroom Manager be aware of the pressure readings on a daily basis.
If you are using gages that are measuring the pressure differential, have these gages mounted in a panel box on the wall just outside of the gown room where no one can avoid seeing them upon entering the room. This panel box will provide access for the calibration of these gages on an annual basis, as there is no such thing as a “For Reference Only” sticker in lieu of a calibration sticker when it comes to the first line of defense of monitoring.
It is also beneficial to monitor the ongoing pressure differential across the HEPA filters versus the initial pressure drop when the filters were new. Usually monitoring one HEPA filter is sufficient to give a snapshot of all of the HEPA filters, so that you will know when to change them in accordance with current industry guidelines.
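A minimal sketch of this trending idea: compare the current pressure drop across the monitored HEPA filter with its clean-filter baseline and flag it once it crosses an agreed change-out multiple. The 2x rule of thumb and the readings below are assumptions for illustration; use the limits in your own SOP or the filter manufacturer's data.

```python
# Minimal sketch: flag a HEPA filter for change-out when its pressure drop
# exceeds an assumed multiple of the clean-filter baseline.
initial_dp = 0.50        # inches w.g. across the new filter (illustrative)
changeout_factor = 2.0   # assumed rule of thumb: investigate/replace at ~2x initial

weekly_readings = [0.52, 0.55, 0.61, 0.70, 0.83, 0.97, 1.04]  # inches w.g.

for week, dp in enumerate(weekly_readings, start=1):
    status = "investigate/replace" if dp >= changeout_factor * initial_dp else "ok"
    print(f"week {week}: dP = {dp:.2f} in. w.g. -> {status}")
```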
Regarding airborne particle counting: where will you place the sensors in a remote monitoring installation application that will provide you with real readings?
Or, if a designated, trained technician will be doing manual readings with a particle counter, make sure that the particle counter, hose, and probe are “zero counted” prior to taking any readings. Also note the elevation of the particle counter probe on the report sheet, as well as the operational mode of the room when the counts were taken.
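A minimal sketch of a manual count record that captures the details called out above (zero-count verification, probe elevation, and the room's operational state); the field names and example values are illustrative assumptions rather than a standard format.

```python
# Minimal sketch: a manual particle-count record with the metadata noted above.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ParticleCountRecord:
    location: str
    probe_elevation_m: float   # height of the probe above the floor
    room_state: str            # e.g., "as-built", "at-rest", or "operational"
    zero_count_passed: bool    # counter, hose, and probe zero-checked beforehand
    counts_per_m3_0_5um: int
    timestamp: datetime

record = ParticleCountRecord(
    location="Filling room, location 3",
    probe_elevation_m=1.2,
    room_state="operational",
    zero_count_passed=True,
    counts_per_m3_0_5um=2800,
    timestamp=datetime(2011, 7, 20, 9, 30),
)

assert record.zero_count_passed, "Zero count the counter, hose, and probe first"
print(record)
```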
Don’t forget the Viable (Microbial) Monitoring Program, if applicable.
When doing any monitoring, it is the monitoring Technician’s responsibility to be observant of any items that may have a negative impact on the cleanroom’s environment, such as unsealed penetrations in the walls or ceiling, broken light lenses, unseated or broken ceiling tiles, etc., and to report them to the Cleanroom Manager.
In summation, history has proven that a continued, routine, documented Monitoring Program, in conjunction with good protocol, personnel discipline, and housekeeping, is a vital part of your cleanroom’s Standard Operating Procedures.
Thursday, July 14, 2011
Pharmaceutical Sciences Update
Formation of a polymer drug matrix and eventual release of drug molecule from the matrix.
Fatty acid and water-soluble polymer-based controlled release drug delivery system
Sustained release capsule formulations based on three components—drug, water-soluble polymer, and water-insoluble fatty acid—were developed. Theophylline, acetaminophen, and glipizide, representing a wide spectrum of aqueous solubility, were used as model drugs. Povidone and hydroxypropyl cellulose were selected as water-soluble polymers. Stearic acid and lauric acid were selected as water-insoluble fatty acids. Fatty acid, polymer, and drug mixture was filled into size #0 gelatin capsules and heated for two hours at 50°C. The drug particles were trapped into molten fatty acid and released at a controlled rate through pores created by the water-soluble polymer when capsules were exposed to an aqueous dissolution medium. Manipulation of the formulation components enabled release rates of glipizide and theophylline capsules to be similar to commercial Glucotrol® XL tablets and Theo-24® capsules, respectively. The capsules also exhibited satisfactory dissolution stability after exposure to 30°C/60% relative humidity (RH) in open petri dishes and to 40°C/75% RH in closed high-density polyethylene bottles. A computational fluid dynamic-based model was developed to quantitatively describe the drug transport in the capsule matrix and the drug release process. The simulation results showed a diffusion-controlled release mechanism from these capsules.

Desai D, Kothari S, Chen W, et al. Fatty acid and water-soluble polymer-based controlled release drug delivery system. J Pharm Sci. 2011;100(5):1900-1912. Correspondence to Divyakant Desai, Research and Development, Bristol-Myers Squibb Company, New Brunswick, N.J. 08903-0191. Telephone: (732) 227-6458; divyakant.desai@bms.com.
Unfolding of mAb1 and mAb2 with GuHCl. (a) Steady-state equilibrium measurement and (b) kinetic unfolding of mAb1.
Evaluation of a non-Arrhenius model for therapeutic monoclonal antibody aggregation
Understanding antibody aggregation is of great significance for the pharmaceutical industry. We studied the aggregation of five different therapeutic monoclonal antibodies (mAbs) using size-exclusion chromatography-high-performance liquid chromatography (SEC-HPLC), fluorescence spectroscopy, electron microscopy, and light scattering methods at various temperatures with the aim of gaining insight into the aggregation process and developing models of it. In particular, we find that the kinetics can be described by a second-order model and are non-Arrhenius. Thus, we developed a non-Arrhenius model to connect accelerated aggregation experiments at high temperature to long-term storage experiments at low temperature. We evaluated our model by predicting mAb aggregation and comparing it with long-term behavior. Our results suggest that the number of monomers and mAb conformations within aggregates vary with the size and age of the aggregates and that only certain sizes of aggregates are populated in the solution. We also proposed a kinetic model based on conformational changes of proteins and monomer peak loss kinetics from SEC-HPLC. This model could be employed for a detailed analysis of mAb aggregation kinetics.

Kayser V, Chennamsetty N, Voynov V, et al. Evaluation of a non-Arrhenius model for therapeutic monoclonal antibody aggregation. J Pharm Sci. 2011;100(7):2526-2542. Correspondence to Bernhardt L. Trout, Department of Chemical Engineering, Massachusetts Institute of Technology, Cambridge, Mass. 02139. Telephone: 617-258-5021; trout@mit.edu.
Structure of lamivudine.
Biowaiver monographs for immediate release solid oral dosage forms: lamivudine
Literature data relevant to the decision to allow a waiver of in vivo bioequivalence (BE) testing for the approval of immediate release (IR) solid oral dosage forms containing lamivudine as the only active pharmaceutical ingredient were reviewed. The solubility and permeability data of lamivudine, as well as its therapeutic index, its pharmacokinetic properties, data indicating excipient interactions, and reported BE/bioavailability (BA) studies were taken into consideration. Lamivudine is highly soluble, but its permeability characteristics are not well defined. Reported BA values in adults ranged from 82% to 88%. Therefore, lamivudine is assigned to the biopharmaceutics classification system (BCS) class III, noting that its permeability characteristics are near the border of BCS class I. Lamivudine is not a narrow therapeutic index drug. Provided that (a) the test product contains only excipients present in lamivudine IR solid oral drug products approved in the International Conference on Harmonization or associated countries in usual amounts and (b) the test product as well as the comparator product fulfill the BCS dissolution criteria for very rapidly dissolving, a biowaiver can be recommended for new lamivudine multisource IR products and major post-approval changes of marketed drug products.

Strauch S, Jantratid E, Dressman JB, et al. Biowaiver monographs for immediate release solid oral dosage forms: lamivudine. J Pharm Sci. 2011;100(6):2054-2063. Correspondence to D. M. Barends, RIVM—National Institute for Public Health and the Environment, Bilthoven, The Netherlands. Telephone: 31-30-2744209; dirk.barends@rivm.nl.
Optical microscopy (a and b) and SEM (c and d) images of curcumin-loaded PLGA microparticles prepared by conventional solvent evaporation and homogenization technique; (a and c) after washing with water; (b and d) after washing with 10% (w/v) Tween 80. For (a and b), magnification is 400× and bar is 10 μm. For (c and d), magnification is 5000× and bar is 10 μm.
Highly loaded, sustained-release microparticles of curcumin for chemoprevention
Curcumin, a dietary polyphenol, has preventive and therapeutic potential against several diseases. Because of the chronic nature of many of these diseases, sustained-release dosage forms of curcumin could be of significant clinical value. However, the extreme lipophilicity and instability of curcumin are significant challenges in its formulation development. The objectives of this study were to fabricate an injectable microparticle formulation that can sustain curcumin release over a one-month period and to determine its chemopreventive activity in a mouse model. Microparticles were fabricated using poly(d, l-lactide-co-glycolide) polymer. A conventional emulsion solvent evaporation method of preparing microparticles resulted in crystallization of curcumin outside of microparticles and poor entrapment (∼1%, w/w loading). Rapid solvent removal using a vacuum dramatically increased drug entrapment (∼38%, w/w loading; 76% encapsulation efficiency). Microparticles sustained curcumin release over four weeks in vitro, and the drug release rate could be modulated by varying the polymer molecular weight and/or composition. A single subcutaneous dose of microparticles sustained curcumin liver concentration for nearly a month in mice. Hepatic glutathione-s-transferase and cyclooxygenase-2 activities, biomarkers for chemoprevention, were altered following treatment with curcumin microparticles. The results of these studies suggest that sustained-release microparticles of curcumin could be a novel and effective approach for cancer chemoprevention.

QUALITY CONTROL - Part 3 of 3 | OOS: The Last Resort
Practical advice for handling the second phase of out-of-specification investigations

In the previous two articles, we discussed the background of laboratory out-of-specification (OOS) investigations and Phase I of the OOS investigation as outlined in the U.S. Food and Drug Administration (FDA) OOS guidance. If Phase I (see Figure 1) of the laboratory investigation does not identify an attributable laboratory error as the cause of the original OOS observation, the OOS guidance allows the organization to move into an expanded laboratory investigation (Phase IIB) in parallel with a review of the production activities (Phase IIA).

How the two parts of the investigation are coordinated depends on the corporate culture. Review of production records should be a priority, because there is a reasonable possibility that it will reveal that the OOS is attributable to an error in the operations area. The Phase IIB (Step 11) laboratory investigation focuses on retesting to demonstrate that the original value does not represent the material. There are a number of conflicting opinions and expectations that must be considered as the organization develops its OOS retest policy, procedure, and protocols:
Unfortunately, many do not comprehend the impact of completing these tests with failing results. The laboratory might consider a policy or procedure that requires that the analyst inform laboratory management when moving to the second stage of testing so that management is aware of the potential problem. Because the rest of the organization does not understand that these procedures already include the retest, it will expect the laboratory to move to the OOS observation investigation mode. In reality, the organization should move into the investigation of a confirmed material failure, which will be driven by quality assurance and will involve the whole organization.

Any laboratory OOS standard operating procedure should clearly identify the company’s policy on the use of outlier testing and the expectation that the cause of the original OOS observation (a deviation) will be identified. The following observations in an FDA warning letter demonstrate that the investigators are looking for the identification of the cause of the deviation: “Your OOS procedure contains no provision for conducting retest of new samples, yet your firm released these batches for distribution based on passing repeat test results without conducting a thorough investigation as required under 21 C.F.R. §211.192 to support your conclusion and rationale to release the affected lots … . Your firm’s OOS investigation relating to impurity levels for _____ concluded that the root cause was laboratory error, but the investigation did not identify what specific laboratory error occurred.”

The regulatory expectation requires that retesting be conducted according to an approved protocol. Since the landmark 1993 Barr ruling, industry representatives have asked the FDA and others for a recommendation on what should be in the protocol. This is not addressed in the OOS guidance. The following should be identified in the OOS procedure as required content for the retest protocol, and the points should be covered in each retest protocol:

The sample that will be used for the retesting. Retest replicates should come from the same sample that was used for the original test. Under circumstances in which the original sample is unstable, the sampling procedure will identify this and proceduralize resampling in the event of retest. If the original sample is inadequate, the sampling plan does not meet current regulatory expectations that the laboratory sample be adequate for the original test and any necessary retesting in the event of an OOS observation. If the laboratory continues, a resample will have to be justified and a resampling plan prepared and approved by the quality unit before resampling occurs and retesting is begun. The original, inadequate sampling plan should be investigated and corrected through the site deviation system.

The method that will be used for the testing. The method used to obtain the original result will be used. For each retest sample preparation, the analyst will perform the complete procedure, including the preparation of mobile phase, standards, system suitability solutions, and other required preparations. If any compromises to this are allowed, they are identified in the protocol. For example, the analysts might batch the analytical and standard preparations on a chromatograph and use a common system suitability preparation to evaluate the operation of the chromatographic system.

The analysts who will perform the retesting.
The draft OOS guidance stated that the original analyst should not be included in the retesting. This requirement could cause a hardship for some laboratories. The final guidance suggests that at least two analysts, not including the original analyst, be involved in the retesting. Some feel that there is value in having the original analyst participate. Any analyst, including the original, could have a procedural bias that impacts his/her results. That potential must be considered in the preparation of the retest protocol and the evaluation of the data. Suggestion: Retesting should be performed by three analysts, including the original. The protocol should have acceptance criteria for intra- and inter-analyst result agreement, as well as the requirement that any lack of conformance to these criteria be resolved before the data analysis for an outlier begins.

The number of replicates that will be tested. The purpose of retesting is to demonstrate that the original value, the OOS observation, does not accurately reflect the material under test. This is done by demonstrating that the original value is a statistical outlier when compared to multiple, independent retest results of the same material. Two retests will not provide any statistical confidence. Five or six retests will provide borderline confidence. Several possible retest scenarios are shown in Figure 2. Scenario 1 has been published and accepted by many. It is based on a chromatographic procedure in which each injection is calculated as a separate result. However, the individual results lack the independence appropriate for the evaluation. In the other scenarios, the complete test procedure is followed for each retest sample preparation. This includes replicate injections and calculations defined by the procedure. Suggestion: Use nine retests, three by each of three analysts. The protocol should require intra- and inter-analyst agreement as part of the overall outlier study.

How the results will be evaluated and what constitutes an outlier. The protocol should identify acceptance criteria for the test results before they are evaluated for the outlier. Where possible, the laboratory should use a statistician to lead it through this evaluation. If a statistician is not available, the laboratory can use one of the standard statistical packages. Test method system suitability criteria must be met. There should be acceptance criteria for intra- and inter-analyst agreement. For the determination that the original value is an outlier, a number of statistical routines have been used. The Student's t test is one routine that is used often; however, the use of any routine to identify a value as an outlier must be justified. Any criteria for determining whether or not the original value is an outlier must be established in the protocol before retesting is initiated. Suggestion: One possible criterion to consider is the following: the original is an outlier if it is outside of the calculated average ±3σ for the retest results.

When retesting is complete, the laboratory will report its complete investigation, results, and conclusions to quality assurance, which will be responsible for determining the disposition of the material. Often the firm looks for ways to measure the performance of the laboratory OOS investigation process. Performance measures that should be considered include:
QUALITY CONTROL - Life Cycle | Stick with Six
By Pedram Alaedini
The case for Lean Six Sigma in pharmaceutical formulation and process development
The evolution of the pharmaceutical industry has made product formulation, process development, and life cycle management more critical than ever. Expiring patents and the rise of generics, in addition to increasing drug development costs and regulatory requirements, are forcing pharmaceutical companies to develop products faster, cheaper, and better compared with just a few years ago.

Formulation and process development is one of the key—and at the same time most difficult and costly—segments in product development and commercialization. It is also one of the most critical functions, in which the future of a product, its potential success, the level of quality risks, and the strength of its patents, manufacturing capabilities, and customer satisfaction are determined.
For the development of products with good formulation and manufacturing processes in the least time possible at reasonable costs, the trial-and-error approach isn’t the best one—but shortcuts should be avoided. Indeed, the process of product development requires a systematic approach that ensures that only the right steps are taken, and in the most logical sequence.
With the increasingly competitive landscape of the industry, rapid and significant productivity improvements are becoming a prerequisite for every pharmaceutical company’s survival. Six Sigma tools and techniques, used in conjunction with methodologies common to Lean practice, have fostered a powerful performance system that has enabled noticeable productivity growth in other industries over the past few years. Indeed, Lean Six Sigma provides the basis for the strong complementary relationship that is shared by process, quality, and performance, a relationship that leads to sustainable competitive advantages.
The Trouble with Formulation and Process Development Projects
Usually, the objective of any pharmaceutical formulation or process development project is formulating robust products and introducing these products to the market quickly and within budget, while at the same time complying with regulatory and customer requirements. In addition, to be able to produce formulated products at commercial scales, processes must be easily scalable, allowing for a smooth handover from R&D to manufacturing groups.

Today, however, many formulation and product development projects suffer from a combination of problems that beget budget overruns, scale-up issues, time delays, and extreme frustration on the part of everyone involved. Most of these issues stem from:
- Lack of active involvement of top management and infrequent management reviews;
- Lack of required experience or talent assigned to projects;
- Unrealistic timelines;
- Poorly defined project scopes and goals;
- Lack of clarity regarding the activities required to achieve objectives;
- Lack of systems, basic infrastructure, and support from various departments; and
- Inappropriate recognition and reward.
The Power of Lean Six Sigma
The ability to rapidly and effectively bring innovative and high quality products to market has become a hallmark of any successful consumer-driven enterprise. This is particularly true in the pharmaceutical industry, where radically shortened product development cycle times and drastically increased product quality levels remain the crucial differentiating factors between the best-performing companies and the rest of the industry. Speed to market is achieved by maximizing effectiveness in product formulation and process design and development, as well as in manufacturing stages.

Over the past 20 years in other industries and functional groups within the pharmaceutical industry, both Lean and Six Sigma methodologies have proven it is possible to achieve dramatic improvements in cost, quality, and time by focusing on process performance. Lean methods reduce waste, cycle time, and non-value added work, thereby improving information and material flow throughout the process. Six Sigma tools, on the other hand, are used to identify root causes of variation in processes, shift the process averages to optimal levels, and reduce variation around the average to find the best operating conditions, identify high-performance operating windows, and design robust products and processes.
Using either one of these methodologies by itself has limitations, however: Six Sigma will eliminate defects but will not address the question of how to optimize process flow, and Lean principles exclude the advanced statistical tools often required to achieve the process capabilities needed to be truly Lean. It is important, therefore, to understand how these two methods complement each other. And, while each approach can result in dramatic improvements, utilizing both methods simultaneously holds the promise of being able to address all types of process problems with the most appropriate toolkit. Lean Six Sigma is a systematic approach to redesigning business operations to minimize the waste (Lean) and variations (Sigma) that occur through process repetition.
Implementing Lean Six Sigma
Lean Six Sigma applications in the pharmaceutical industry have so far focused on factory-based pharmaceutical manufacturing environments. Manufacturing dozens of batches of product is repetitive; minimal variation in output is a key goal. Consequently, savings realized from minimizing waste and cost through a coordinated Lean Six Sigma program across the manufacturing sites of a global pharmaceutical manufacturing company can be highly significant.

In recent years, there has been widespread interest in applying Lean Six Sigma approaches to pharmaceutical development activities because of the possibility that they could reduce waste, cost, cycle time, and variability in outputs. In the context of formulation and process development, Lean Six Sigma can be used to develop product understanding and process controls prior to technology transfer to production and to further optimize the process at commercial manufacturing facilities.
While product development is clearly a unique environment, the work performed across projects is similar and can benefit from some of the same optimization tools and methods that are applied to manufacturing. This is especially true for tasks that occur further downstream in the product development process, where manufacturing capability becomes an essential competitive advantage. It is possible to manage, standardize, and continuously improve the product development process as long as there is a solid understanding of, and allowances are made for, those characteristics of the product development environment that are indeed unique.
Obviously, there must be a balance. Not everything can be predicted through scientific analysis, but scientists must always work toward the goal of better product understanding during the formulation and process development phases. Later, in manufacturing, defects may be very easy to identify but costly to correct. Conversely, in the early design phases, potential defects are more difficult to identify because of the need for predictive ability, but once identified, they can be fairly easy to avoid.
Implementing Lean Six Sigma in pharmaceutical formulation and process development presents the same challenges as in other industries and functional groups and requires the same level of commitment on the part of senior management. In addition, in order to ensure proper execution, full-time dedicated champions must be assigned to assist in the implementation process.
In general, the principles forming the foundation for rapid and high quality formulation and process development teams, and the steps that constitute focal points that must be considered both in creating the development process and in managing formulation and process development projects, are:
Systematic approach to product development. The basic elements of the product development system—people, processes, and technology—must be fully integrated, aligned, and designed to be mutually supportive. With this in mind, the company must establish formal procedures for a streamlined process from pre-formulation to formulation design and development, through scale-up and technology transfer. This should also extend to support of manufacturing activities for a predetermined period. Indeed, Lean Sigma approaches are particularly appropriate for optimization of repetitive or routine activities such as checking and analyzing analytical data or batch defect rates.
Customer-first approach. The customer-first philosophy, both internal and external, results in a deep understanding of customer values and requirements, a necessary first step in any product development process. The customer for formulation and process development groups includes the patient, sales and marketing organizations, technology transfer and product support groups, manufacturing facilities, and perhaps many others.
Front-loaded process. Early scientific and engineering diligence and systematic problem solving and troubleshooting, along with true cross-functional participation, are the keys to maximizing the effectiveness of the product development process. Strong pre-formulation support to generate sufficient knowledge on active pharmaceutical ingredient properties, resulting in full understanding of potential development requirements, will be extremely beneficial through all future product development, manufacturing, and regulatory steps involved in approval and the commercialization process.
Continuous learning and improvement. Continuous learning and improvement for all involved must be a fundamental component of every job performed, rather than just a special one-time initiative.
Parallel and simultaneous execution. Concurrent scientific analysis and engineering force the formulation and process development teams to do the most they can with only the portion of the data and information that is available at any given time and is unlikely to change.
Standardization to create flexibility. Standardized skills and processes allow for program customization, broader scope of individual responsibility, a just-in-time human resource strategy, and flexible product development capacities. These standards are also crucial to downstream Lean manufacturing capabilities.
With the proper commitment and the correct approach, Lean Six Sigma will pay off significantly in terms of reduced development cycle times, higher product quality, reduced life cycle cost, and improved customer satisfaction. The only question now: Does a company have the dedication, over the long term, to challenge its scientists and engineers, openly view and address its weaknesses, understand its limitations, and eventually transform itself into a well-respected, world-class organization?
OUTSOURCING - Stability Testing | The Benefits of Outsourcing Stability Testing
By Ryan Williams and Sean Gavor
In general, companies perform stability testing to look for evidence of degradation and the formation of impurities and to ensure that the active ingredients are still within specification. Tablets, oral medications, injectables, and topicals all need to demonstrate stability. Every formulation of the drug product must be tested, and each drug product is subject to a variety of tests.
Companies define the requirements for stability testing in each product’s regulatory submission. International Conference on Harmonization (ICH) guidelines recommend that all testing be performed at approximately the same time. This means that at every stability interval, samples must be pulled from storage and tested within a few days of the target date.
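As an illustration of how tightly those pull dates are managed, the sketch below builds a stability pull schedule with a testing window around each target date. The time points (0 to 24 months) and the ±5 day window are common conventions used here as assumptions, not values from any specific submission.

```python
# Minimal sketch: stability pull schedule with a testing window per time point.
from datetime import date, timedelta

def pull_schedule(start: date, months=(0, 3, 6, 9, 12, 18, 24), window_days=5):
    rows = []
    for m in months:
        target = start + timedelta(days=round(m * 30.44))  # average month length
        rows.append((m, target,
                     target - timedelta(days=window_days),
                     target + timedelta(days=window_days)))
    return rows

for m, target, earliest, latest in pull_schedule(date(2011, 8, 1)):
    print(f"{m:>2} mo: target {target}, pull/test between {earliest} and {latest}")
```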
Fortunately, even stability programs that are run in house can be outsourced, so it is never too late to turn your stability testing over to a qualified partner.
Contract labs with on-site stability storage eliminate the risk of missing these testing windows and reduce the time samples are outside their stability chambers. It is these same stability chambers that offer the most compelling reason for outsourcing: purchasing, qualifying, and maintaining stability chambers can be an expensive proposition.
Both reach-in and walk-in versions of stability chambers must be continuously monitored for temperature and humidity, and there must be mechanisms in place to regulate the temperature and humidity so that each chamber operates within specified limits. These requirements make stability chambers costly to install and maintain.
Contract providers will have chambers and backup chambers on an uninterrupted power supply, with backup generators and 24/7 monitoring systems that feature alarms and backup alarms to notify personnel in the event of a temperature or humidity excursion. They will also have the staff and resources to ensure the chambers are serviced, inspected, calibrated, and qualified regularly and to maintain the significant paperwork involved in keeping the chambers consistent with current good manufacturing practices (cGMP) and ICH guidelines.
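A minimal sketch of the excursion monitoring described above: logged chamber readings are compared against setpoint tolerances, and anything out of range raises an alarm. The 25°C/60% RH setpoint and the ±2°C / ±5% RH tolerances are illustrative assumptions.

```python
# Minimal sketch: flag temperature or humidity excursions in chamber data.
SETPOINT_C, TOL_C = 25.0, 2.0     # assumed long-term condition and tolerance
SETPOINT_RH, TOL_RH = 60.0, 5.0

readings = [  # (hour, temperature C, relative humidity %) - hypothetical log
    (0, 25.1, 59.8),
    (1, 25.3, 60.4),
    (2, 27.6, 61.0),   # temperature excursion
    (3, 25.0, 66.2),   # humidity excursion
]

for hour, temp, rh in readings:
    problems = []
    if abs(temp - SETPOINT_C) > TOL_C:
        problems.append(f"temperature {temp:.1f} C")
    if abs(rh - SETPOINT_RH) > TOL_RH:
        problems.append(f"humidity {rh:.1f} %RH")
    if problems:
        print(f"hour {hour}: ALARM - " + ", ".join(problems))
```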
In addition, by offering a suite of chambers, contract testing providers can help companies meet regulatory requirements for distribution into countries with different climate conditions.
The ICH has established four zones for stability testing, each with different specifications, limits, and time points. Zone I conditions are for products that will be distributed in the United States, Canada, the United Kingdom, and Northern Europe. Zone II includes countries on the Mediterranean such as Portugal and Greece and more tropical parts of Japan. Zone III conditions are hot and dry, for places such as Iran, Iraq, and the Sudan. Zone IV conditions are hot and humid, about 40°C and 75% humidity, simulating the rain forests of Brazil and many countries in Southeast Asia.
Even though your facility may follow GMPs, some early-stage stability methods are not formalized for outside use. Only validated methods should be used for stability testing; however, for some early-stage programs the robustness of the method may not be fully understood.
For example, Celsis received a method to be used for stability testing of a pharmaceutical product. While following the written transfer protocol, Celsis found that its results did not match those of the customer’s lab. The customer’s lab manager reviewed the instructions provided and confirmed that these were the same steps. But when the Celsis analyst talked directly with the company’s technician, the analyst learned that the tech had mixed the sample for longer than had been indicated in the provided method.
Small details like this must be determined prior to the execution of a stability program. Otherwise, your early data—and months or years of internal testing—may not be useful, and expiration dates may be affected.
Method transfer can be as simple as having the contract lab run your protocol to demonstrate that the test can be executed accurately and precisely. Some contract labs can also help you write a formal protocol if you do not have one. In either case, method transfer can be an important step to ensure that you can trust the accuracy of the results generated.
For example, a Celsis International client asked that a product be tested under extremely humid conditions. Celsis was able to create and qualify a difficult-to-maintain chamber condition of 40°C and 90% humidity for this project.
Other examples of non-standard stability testing conducted to meet client needs include an environment with humidity below 20%—an extremely dry chamber—and a number of studies that cycled samples from minus 20°C to 40°C in 12 hours and back down to minus 20°C over the next 12 hours, repeating this up-and-down cycle every 12 hours for five days or more.
Some providers, including Celsis, also offer a special chamber for photostability storage. Photostability testing is required to demonstrate that the final packaging configuration is suitable for protecting a photo-liable product from photodegradation. Photostability can also be used during method validation to determine photodegradants during forced degradation studies.
Throughout the supply chain, there are containers on trucks or ships reaching very high temperatures during the summer months or freezing during a cold winter. Not all warehouses are climate controlled. And consider the large animal veterinarian who must keep all types of medications in his or her vehicle throughout the year.
Freeze/thaw and shipping studies are separate studies that can help evaluate overall stability. If your samples are found to degrade faster in higher temperatures, for example, a shipping study will identify the conditions at which the product can be shipped safely.
Even before a product is manufactured, a company may run a number of accelerated stability programs on formulation batches to evaluate the product’s feasibility. By using higher temperatures and higher humidities than expected, these accelerated programs are designed to predict the shelf life of a product prior to demonstrating it in real time.
Some multiple-dose containers of sterile products, such as IVs, have a resealable fabric. One aspect of stability testing for these types of products involves repeatedly opening the container and removing a dose to ensure that the correct number of doses is in the container, that the container seals up, and that the re-entry does not introduce contaminants.
Similarly, PET is required for multiple-use containers. PET ensures that over the life of the product the preservative will still be present and the bioactivity of the preservative will be maintained within specification.
Look for a comprehensive stability chamber qualification, calibration, and preventive maintenance program; qualified personnel running the program; and current, thorough SOPs, based on cGMP and ICH protocol, that govern every aspect of the program.
Discuss the lab’s process for maintaining files for studies. Is the system paper-based or electronic? What backups are in place? Ask what you can expect for reporting.
You should expect to receive a summary report at each time point. In some cases this will include a brief history of the testing along with a table showing the full results to date. At the end of the study, a full and final report should be issued.
Finally, you don’t want to be the lab’s first or only stability customer. It’s important that the partner you select can accurately anticipate and meet the testing requirements and volume your stability program entails. For example, Celsis has more than 30 years’ experience conducting stability studies, with 50 to 100 stability programs conducted simultaneously.
Contract labs invest hundreds of thousands of dollars in stability storage chambers, testing equipment, and qualification and maintenance of equipment. More money is spent on staffing, so they have the resources to jump in and do all the required testing within the proscribed time frame—be it every three months for a new product or three lots a year for a released product.
Best of all, the right contract lab will offer a turnkey program that means you won’t have to worry about the varying workload, temperature changes, chamber qualification, and reporting. When your program has been reliably transferred to the right outsourcing partner, the word stability will bring on of a feeling of calm.
A good partner will be able to offer sophisticated equipment, turnkey programs, and more
Unless your product is made and used within the same day, stability testing is required to demonstrate how long the product can be stored safely before it starts to degrade. It’s the science behind the expiration date.

In general, companies perform stability testing to look for evidence of degradation and the formation of impurities and to ensure that the active ingredients are still within specification. Tablets, oral medications, injectables, and topicals all need to demonstrate stability. Every formulation of the drug product must be tested, and each drug product is subject to a variety of tests.
Companies define the requirements for stability testing in each product’s regulatory submission. International Conference on Harmonisation (ICH) guidelines recommend that all testing be performed at approximately the same time. This means that at every stability interval, samples must be pulled from storage and tested within a few days of the target date.
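For planning purposes, that scheduling logic can be sketched in a few lines of Python. The intervals and the testing window below are illustrative assumptions, not requirements taken from any particular protocol or submission.

# Illustrative sketch: planning stability pull dates and testing windows.
# The intervals (in months) and the +/- window are assumptions for illustration.
from datetime import date, timedelta

def pull_schedule(start, months=(0, 3, 6, 9, 12, 18, 24, 36), window_days=3):
    """Return (interval, target date, earliest, latest) for each stability pull."""
    schedule = []
    for m in months:
        # Approximate a calendar month as 30.44 days for planning purposes.
        target = start + timedelta(days=round(m * 30.44))
        schedule.append((m, target,
                         target - timedelta(days=window_days),
                         target + timedelta(days=window_days)))
    return schedule

for m, target, earliest, latest in pull_schedule(date(2011, 7, 1)):
    print(f"{m:>2} mo: pull and test between {earliest} and {latest} (target {target})")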
Fortunately, a stability program that is already running in house can be transferred to a contract lab, so it is never too late to turn your stability testing over to a qualified partner.
Key Considerations
In addition to maintaining a more consistent workflow in the lab, managers may choose to outsource stability testing to minimize the risk of transporting samples. Temperature excursions may occur while the samples are in transit from the stability storage facility to the lab for testing and then back into storage. These changes have the potential to affect the test results and, therefore, the projected expiration date of the product.

Contract labs with on-site stability storage eliminate this risk and reduce the time samples are outside their stability chambers. It is these same stability chambers that offer the most compelling reason for outsourcing: Purchasing, qualifying, and maintaining stability chambers can be an expensive proposition.
Both reach-in and walk-in versions of stability chambers must be continuously monitored for temperature and humidity, and there must be mechanisms in place to regulate the temperature and humidity so that each chamber operates within specified limits. These requirements make stability chambers costly to install and maintain.
Contract providers will have chambers and backup chambers on an uninterruptible power supply, with backup generators and 24/7 monitoring systems that feature alarms and backup alarms to notify personnel in the event of a temperature or humidity excursion. They will also have the staff and resources to ensure the chambers are serviced, inspected, calibrated, and qualified regularly and to maintain the significant paperwork involved in keeping the chambers consistent with current good manufacturing practices (cGMP) and ICH guidelines.
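As a rough illustration of what such a monitoring system checks, the short Python sketch below flags temperature and humidity excursions against a chamber setpoint. The 25°C/60% RH setpoint and the tolerances are assumptions chosen only for the example, not limits quoted from any guideline.

# Minimal sketch of the kind of excursion check a chamber-monitoring system performs.
# The setpoint and tolerances below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ChamberSpec:
    temp_c: float = 25.0    # setpoint, degrees C
    temp_tol: float = 2.0   # allowed deviation, +/- degrees C
    rh_pct: float = 60.0    # setpoint, percent relative humidity
    rh_tol: float = 5.0     # allowed deviation, +/- percent RH

def check_reading(spec, temp_c, rh_pct):
    """Return a list of alarm messages for any out-of-limit reading."""
    alarms = []
    if abs(temp_c - spec.temp_c) > spec.temp_tol:
        alarms.append(f"TEMP excursion: {temp_c:.1f} C outside "
                      f"{spec.temp_c - spec.temp_tol:.1f} to {spec.temp_c + spec.temp_tol:.1f} C")
    if abs(rh_pct - spec.rh_pct) > spec.rh_tol:
        alarms.append(f"RH excursion: {rh_pct:.1f}% outside "
                      f"{spec.rh_pct - spec.rh_tol:.1f}% to {spec.rh_pct + spec.rh_tol:.1f}%")
    return alarms

print(check_reading(ChamberSpec(), temp_c=28.4, rh_pct=61.0))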
In addition, by offering a suite of chambers, contract testing providers can help companies meet regulatory requirements for distribution into countries with different climate conditions.
The ICH has established four climatic zones for stability testing, each with different specifications, limits, and time points. Zone I covers temperate climates such as the United Kingdom, Northern Europe, and Canada. Zone II covers subtropical and Mediterranean climates, including the United States, countries such as Portugal and Greece, and much of Japan. Zone III conditions are hot and dry, for places such as Iran, Iraq, and Sudan. Zone IV conditions are hot and humid, about 30°C and 75% relative humidity, simulating the rain forests of Brazil and many countries in Southeast Asia.
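For quick reference, the Python lookup below lists long-term storage conditions commonly cited for the climatic zones, including the later subdivision of Zone IV into IVa and IVb. Treat the values as an illustrative assumption and confirm them against the current guidelines and your product’s registration commitments before relying on them.

# Commonly cited long-term storage conditions for the climatic zones, shown only
# as an illustrative lookup; confirm against current ICH/WHO guidance before use.
ZONE_CONDITIONS = {
    "I":   {"label": "temperate",                  "temp_c": 21, "rh_pct": 45},
    "II":  {"label": "subtropical/Mediterranean",  "temp_c": 25, "rh_pct": 60},
    "III": {"label": "hot and dry",                "temp_c": 30, "rh_pct": 35},
    "IVa": {"label": "hot and humid",              "temp_c": 30, "rh_pct": 65},
    "IVb": {"label": "hot and very humid",         "temp_c": 30, "rh_pct": 75},
}

def condition_for(zone):
    c = ZONE_CONDITIONS[zone]
    return f"Zone {zone} ({c['label']}): {c['temp_c']} C / {c['rh_pct']}% RH"

print(condition_for("IVb"))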
Case study
Get Started with a Method Transfer
More complex stability programs should always start with a method transfer, which is used to demonstrate that the lab you’ve selected can produce accurate and precise results. Even though your facility may follow GMPs, some early-stage stability methods are not formalized for outside use. Only validated methods should be used for stability testing; however, for some early-stage programs, the robustness of the method may not yet be fully understood.
For example, Celsis received a method to be used for stability testing of a pharmaceutical product. While following the written transfer protocol, Celsis found that its results did not match those of the customer’s lab. The customer’s lab manager reviewed the instructions provided and confirmed that these were the same steps. But when the Celsis analyst talked directly with the company’s technician, the analyst learned that the tech had mixed the sample for longer than had been indicated in the provided method.
Small details like this must be determined prior to the execution of a stability program. Otherwise, your early data—and months or years of internal testing—may not be useful, and expiration dates may be affected.
Method transfer can be as simple as having the contract lab run your protocol to demonstrate that the test can be executed accurately and precisely. Some contract labs can also help you write a formal protocol if you do not have one. In either case, method transfer can be an important step to ensure that you can trust the accuracy of the results generated.
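A minimal sketch of the arithmetic behind such a comparison is shown below in Python. The assay values and the plus-or-minus 2.0% acceptance criterion are invented for illustration only; a real transfer protocol defines its own replicates, statistics, and limits.

# Illustrative method-transfer comparison: mean assay results from the sending
# and receiving labs are compared against an assumed acceptance criterion.
from statistics import mean, stdev

sending_lab   = [99.1, 98.7, 99.4, 98.9, 99.2, 99.0]   # % label claim (assumed data)
receiving_lab = [98.4, 98.9, 98.6, 99.1, 98.5, 98.8]   # % label claim (assumed data)

diff = mean(receiving_lab) - mean(sending_lab)
print(f"Sending lab:   mean {mean(sending_lab):.2f}%, "
      f"RSD {100 * stdev(sending_lab) / mean(sending_lab):.2f}%")
print(f"Receiving lab: mean {mean(receiving_lab):.2f}%, "
      f"RSD {100 * stdev(receiving_lab) / mean(receiving_lab):.2f}%")
verdict = "PASS" if abs(diff) <= 2.0 else "FAIL"
print(f"Mean difference {diff:+.2f}% -> {verdict} against an example +/-2.0% criterion")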
Extreme Testing
Some products may require non-standard storage at conditions for which a manufacturer may not have qualified chambers. Contract labs are not always limited by the standard or zone conditions, however. Ask if the provider has variable chambers capable of being qualified at non-standard conditions.

For example, a Celsis International client asked that a product be tested under extremely humid conditions. Celsis was able to create and qualify a difficult-to-maintain chamber condition of 40°C and 90% humidity for this project.
Other examples of non-standard stability testing conducted to meet client needs include an environment with humidity below 20%—an extremely dry chamber—and a number of studies that cycled samples from minus 20°C to 40°C in 12 hours and back down to minus 20°C over the next 12 hours, repeating this up-and-down cycle every 12 hours for five days or more.
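The cycling study described above is straightforward to express as a setpoint schedule. The Python sketch below generates one, assuming linear ramps between the two temperatures; a real chamber program would follow its own qualified profile.

# Sketch of the cycling profile described above: 12 hours ramping from -20 C up
# to 40 C, then 12 hours back down, repeated for five days. Linear ramps are an
# assumption made only for illustration.
def cycling_profile(days=5, low=-20.0, high=40.0, step_hours=12):
    setpoints = []
    for hour in range(days * 24 + 1):
        phase = hour % (2 * step_hours)        # position within one 24-hour cycle
        if phase <= step_hours:                # ramp up for the first 12 hours
            frac = phase / step_hours
        else:                                  # ramp back down for the next 12 hours
            frac = (2 * step_hours - phase) / step_hours
        setpoints.append((hour, low + frac * (high - low)))
    return setpoints

for hour, temp in cycling_profile()[:25]:      # print the first full 24-hour cycle
    print(f"hour {hour:3d}: setpoint {temp:6.1f} C")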
Some providers, including Celsis, also offer a special chamber for photostability storage. Photostability testing is required to demonstrate that the final packaging configuration is suitable for protecting a photo-labile product from photodegradation. Photostability exposure can also be used during method validation to identify photodegradants during forced degradation studies.
Throughout the supply chain, there are containers on trucks or ships reaching very high temperatures during the summer months or freezing during a cold winter. Not all warehouses are climate controlled. And consider the large animal veterinarian who must keep all types of medications in his or her vehicle throughout the year.
Freeze/thaw and shipping studies are separate studies that can help evaluate overall stability. If your samples are found to degrade faster in higher temperatures, for example, a shipping study will identify the conditions at which the product can be shipped safely.
Even before a product is manufactured, a company may run a number of accelerated stability programs on formulation batches to evaluate the product’s feasibility. By using higher temperatures and higher humidities than expected, these accelerated programs are designed to predict the shelf life of a product prior to demonstrating it in real time.
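The extrapolation behind such predictions is often based on the Arrhenius relationship between temperature and degradation rate. The Python sketch below shows the idea; the degradation rate, activation energy, and specification limit are assumptions made for illustration and do not come from the article.

# Minimal Arrhenius sketch: scaling an accelerated degradation rate down to a
# long-term storage temperature. All numbers are illustrative assumptions.
import math

R = 8.314      # J/(mol*K), gas constant
Ea = 83_000    # J/mol, assumed activation energy (~20 kcal/mol)

def rate_at(temp_c, ref_temp_c, ref_rate):
    """Scale a degradation rate from a reference temperature via Arrhenius."""
    t_ref, t_new = ref_temp_c + 273.15, temp_c + 273.15
    return ref_rate * math.exp(-Ea / R * (1.0 / t_new - 1.0 / t_ref))

k_40 = 0.30                        # assumed potency loss, %/month, at 40 C accelerated storage
k_25 = rate_at(25.0, 40.0, k_40)   # predicted rate at 25 C long-term storage
months_to_5pct = 5.0 / k_25        # months until an assumed 5% potency loss
print(f"Predicted rate at 25 C: {k_25:.3f} %/month")
print(f"Predicted time to 5% loss: {months_to_5pct:.0f} months")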
Testing Support
Beyond its standard and specialized storage conditions, a good outsourcing lab will be able to offer a full range of chemistry and microbiological testing options. In addition to assay, dissolution, and impurities testing, other common tests include pH, color, sterility, endotoxin, and preservative efficacy testing (PET).

Some multiple-dose containers of sterile products, such as IVs, have a resealable closure. One aspect of stability testing for these types of products involves repeatedly opening the container and removing a dose to ensure that the correct number of doses is in the container, that the container reseals properly, and that the re-entry does not introduce contaminants.
Similarly, PET is required for multiple-use containers. PET ensures that, over the life of the product, the preservative is still present and its bioactivity remains within specification.
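The pass/fail arithmetic behind PET is a log-reduction calculation against the initial inoculum. The Python sketch below illustrates it with invented counts and an example criterion; the compendial tests define their own organisms, time points, and acceptance limits.

# Sketch of the log-reduction arithmetic behind preservative efficacy testing.
# The counts and the 2.0-log example criterion are assumptions for illustration.
import math

initial_cfu_per_ml = 5.0e5
counts = {7: 2.0e3, 14: 4.0e2, 28: 3.5e2}   # day -> surviving CFU/mL (assumed data)

for day, cfu in counts.items():
    log_reduction = math.log10(initial_cfu_per_ml) - math.log10(cfu)
    status = "meets" if log_reduction >= 2.0 else "falls below"
    print(f"Day {day:2d}: {log_reduction:.1f} log10 reduction ({status} an example 2.0-log criterion)")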
What to Expect
When selecting a contract lab for your stability storage and testing program, choose one that is registered with the U.S. Food and Drug Administration (FDA) and, as the FDA recommends, schedule an on-site audit, or at least make certain that third parties regularly review the lab’s facilities and systems.

Look for a comprehensive stability chamber qualification, calibration, and preventive maintenance program; qualified personnel running the program; and current, thorough SOPs, based on cGMP and ICH guidelines, that govern every aspect of the program.
Discuss the lab’s process for maintaining files for studies. Is the system paper-based or electronic? What backups are in place? Ask what you can expect for reporting.
You should expect to receive a summary report at each time point. In some cases this will include a brief history of the testing along with a table showing the full results to date. At the end of the study, a full and final report should be issued.
Finally, you don’t want to be the lab’s first or only stability customer. It’s important that the partner you select can accurately anticipate and meet the testing requirements and volume your stability program entails. For example, Celsis has more than 30 years’ experience conducting stability studies, with 50 to 100 stability programs conducted simultaneously.
Contract labs invest hundreds of thousands of dollars in stability storage chambers, testing equipment, and qualification and maintenance of equipment. More money is spent on staffing, so they have the resources to jump in and do all the required testing within the prescribed time frame, be it every three months for a new product or three lots a year for a released product.
Best of all, the right contract lab will offer a turnkey program, which means you won’t have to worry about the varying workload, temperature changes, chamber qualification, and reporting. When your program has been reliably transferred to the right outsourcing partner, the word stability will bring on a feeling of calm.
Suppliers of Potassium Iodide Catch Up after Japan Tsunami
By Tim Donald
A ferry rests amid destroyed houses in Miyako in Iwate, the second-largest prefecture in Japan, after the 9.0 earthquake and subsequent tsunami that struck March 11.
Orders are being filled, but government stockpiles may be inadequate, manufacturers say
Suppliers of potassium iodide (KI) medications in the United States are meeting the demand for their products after being swamped with orders in the wake of the radiation leak in Fukushima, Japan, following March’s earthquake and tsunami. However, U.S. stockpiles of KI are expiring, and manufacturers and legislators have expressed concerns about whether current stocks would be sufficient to protect the public in the event of a nuclear accident in this country.

In the wake of the massive earthquake and tsunami that damaged the Fukushima Daiichi nuclear power plant in Japan, the two U.S. companies that produce KI medications experienced a tremendous spike in orders. Officials of Anbex Inc. of Williamsburg, Va., and Fleming Pharmaceuticals of Fenton, Mo., told CNN at the time that they were inundated with calls and orders after the incident, which has been called the worst nuclear emergency in Japan since World War II and the worst in the world since Chernobyl.
The two companies are the only suppliers of KI medications with U.S. Food and Drug Administration (FDA) approval. Anbex produces 130 mg iOSAT KI tablets, and Fleming produces an oral solution, ThyroShield. KI has been found to protect the thyroid, especially in young children, from the cancer-causing effects of radioactive iodine that would be released in a nuclear emergency.
Both companies now report that their KI products are back in stock and that they are filling orders after experiencing a mad scramble in the wake of the Japanese emergency.
“We had a significant increase in business [after the Japan incident]. Currently we have no trouble, we have plenty of tablets,” said Alan Morris, president of Anbex.
Deborah Fleming Wurdack, chief administrative officer of Fleming Pharmaceuticals, agreed. “Really, it was a frenzy, and as expected, it has quieted down,” she said. “Orders are not up much over last year now that we’ve made it through that rush period. It’s funny how people react.”
Fleming Wurdack said most orders during the rush came from U.S. companies that wanted the product for their employees working in Japan. Some orders also came from companies on the West Coast concerned about radiation reaching the United States via the jet stream. A few large companies ordered the product for their employees stationed all over the country. “They decided it would be a good thing to stockpile,” she said.
Counterfeit Concerns
In times of drug shortages, counterfeit concerns come to the fore. Both companies said they deal with trusted suppliers and have few concerns about the legitimacy of their raw materials.

“We make sure that we don’t buy anything that could be counterfeit. We do quantitative and qualitative analysis of everything we get in, and we know it’s real stuff,” Morris said.
“We deal with all FDA-approved suppliers, so they do keep track” of where their materials are coming from, Fleming Wurdack said. “During the peak of the frenzy we watched eBay to make sure there weren’t counterfeit products out there. We found one or two and contacted them immediately, and they pulled their products off of eBay. Because there are only two FDA-approved products in the United States, it’s pretty easy to keep an eye on.”
Stockpiles Adequate?
Both Morris and Fleming Wurdack said their companies are prepared to ramp up production again in the event of a similar emergency in this country. Morris, however, expressed concern over whether increased production would be sufficient to meet needs.

“I don’t know what the demand parameter would be,” he said. “We could fill a certain level of demand, and above that we couldn’t fill. And the government wouldn’t have [KI], and lots of people would get cancer.”
Morris and Fleming Wurdack both noted that many currently stockpiled KI medications in this country are nearing or have reached their expiration dates.
About a third of the ThyroShield stockpile has expired, Fleming Wurdack said, and the other two-thirds will expire within about the next 12 months. “None of what has expired has been replaced by the federal government,” she said. “There are states that have replaced some of theirs.”
Upon request, the Nuclear Regulatory Commission (NRC) supplies a certain quantity of KI at no charge to states, she explained.
“[ThyroShield] has been disseminated to a large number of states, but the federal government has not renewed the contract with us and has not replenished the supply, so the burden has fallen on the individual states; if they want to have a fresh supply, they have to purchase it themselves. We have taken orders from several states to replenish their expiring stocks,” she said.
Regarding KI tablets, Morris said, “The government has replaced some of it, but they have not replaced everything.” He went further, saying, “There are stockpile supplies in the U.S., but they are terribly inadequate. They are a fraction of what they would have to be in a serious nuclear accident. The government knows that, of course, but they have not acted appropriately.”
Fleming Wurdack agreed, although in a more measured tone.
“Even if they stockpiled enough, let’s say, for those within a 20-mile radius of a nuclear power plant, the jet stream could carry it certainly much further than that. The jet stream carried radiation over to this country from Japan in small amounts.”
According to the Environmental Protection Agency, “trace amounts of radioactive isotopes consistent with the Japanese nuclear incident … far below levels of public health concern” were detected at air monitoring stations across the country as of March 28. At this time, the Centers for Disease Control and Prevention “does not recommend that people in the United States take [KI] or iodine supplements in response to the nuclear power plant explosions in Japan.”
Addressing Supply
Morris applauded the efforts of Rep. Edward J. Markey, D-Mass., and others in Congress to address the KI supply issue. Thirty members of Congress, led by Markey, sent a letter to President Obama on May 9, asking him to implement a law passed in 2002 that would increase supplies of KI at nuclear sites in this country.

“Although this law has been on the books since 2002, it has, inexplicably, yet to be implemented,” the letter states.
Markey was the author of an amendment to the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 calling for KI to be made available to state and local governments to meet the needs of people living within a 20-mile radius of a nuclear power plant. That provision of the law was never implemented by the administration of George W. Bush, and so far the Obama administration has upheld the Bush administration’s position on the issue.
Before that law was passed, distribution of enough KI to cover those living within a 10-mile radius of a nuclear plant was allowed to states that requested it from the NRC, according to a May 10 press release from Markey’s office.
Increased vigilance is likely to improve questionable sources of ingredients
Over the past five years, pharmaceutical companies have received plenty of negative press about the safety of their products. Of course, recalls have received much of the press coverage; all current recalls can be found on the U.S. Food and Drug Administration (FDA) website. Anyone logging on finds a pretty lengthy list, which indicates the FDA is doing its part to keep drugs safe. But what is the pharmaceutical industry doing to improve drug safety?

Many of the safety issues are related to pharmaceutical supply chain management. Much of drug manufacturing takes place outside the United States, in countries like China and India. Over the past few years, there have been questions about the safety of the supply chain and how that supply chain affects the overall safety of the final drug product. Such questions are uppermost in the minds of both manufacturers and consumers.
Rx-360, a volunteer industry consortium, was established in 2009 to create and monitor a global quality system that meets the expectations of industry and regulators and assures patient safety by enhancing product quality and authenticity throughout the supply chain. Martin Van Trieste of Amgen is the founder and former chair of Rx-360 and now serves on its board of directors, a position in which he provides strategic direction, promotes Rx-360’s vision and mission publicly, and recruits new member companies to join the consortium.

Report Raises Red Flags

Drug safety issues and their relationship to the pharmaceutical supply chain were addressed in a 2010 report titled “Achieving Global Supply Chain Visibility, Control & Collaboration in Life-Sciences: Regulatory Necessity, Business Imperative,” sponsored by PricewaterhouseCoopers (PWC) and conducted by Axendia. In the report, which contains data regarding outsourcing and the supply chain, some of the numbers were staggering but not surprising. For example, 70% of pharmaceutical industry executives indicated that they have key suppliers in China, and 57% in India. The industry has encountered much concern about obtaining inadequate products from these regions, a problem often attributed to poorly regulated supply chains. Is that statistic any indication that China might be one of the countries where the global supply chain has been faltering? And how has the global supply chain been affected by global outsourcing over the past few years?

“The numbers are certainly eye-catching but not surprising to me,” said Wynn Bailey, head of supply chain strategies at PWC. Bailey explained that it is no surprise for a pharmaceutical, biotechnology, or even a medical product and supply company to have a manufacturing presence in China. There are several reasons for this: these companies initially became active in China because of its value as a source of raw materials or intermediate materials.

“Many companies will not yet go so far as to source active pharmaceutical ingredient (API) from Chinese suppliers, though many of those companies expect this situation to change within the next few years. However, companies are also increasingly looking at China as a place to do their own manufacturing. They see China as a market with a lot of potential customers and an economy that’s developing to the point where people can afford to buy Western medicines,” said Bailey.

“There are numerous quality manufacturers in China, so we must not paint all manufacturers in China with the same brush based on a group of bad actors,” said Van Trieste. “It is a pharmaceutical company’s responsibility to select an outsourcing partner that is qualified, and to actively monitor the performance of that outsourcing partner on an ongoing basis. Ultimately, it is the pharmaceutical company’s responsibility to protect the patients that they serve.”

Amgen does not outsource manufacturing in China. The vast majority of Amgen’s products are produced at one of several Amgen manufacturing plants in California, Colorado, Rhode Island, Puerto Rico, and the Netherlands.
“If you’re a major player in this industry, you cannot afford to not have a presence in China,” said Bailey, who explained that drug companies must effectively manage their activities in China, paying particular attention to any potential risk in the supply chains associated with that market. He has observed that drug companies increasingly rely on the presence “on the ground” in China of representatives who know the local market and are able to manage the logistical relationships with local supply chain sources in real time.

Decreasing Impact of Global Outsourcing

In the report, the group measured the expected global trend for individual pharma companies over the next three years; firms expect global outsourcing to increase by about 78%. Based on this figure, does the industry have growing concerns about the quality of supply chains?

“The [drug] companies, in some cases, don’t have a lot of choices other than to source raw materials or intermediate products from markets like China, because that’s where they are available and because the economics of doing that are in many ways somewhat compelling,” said Bailey. “I think drug companies need to do a better job of identifying the significant risks that exist, so that they can assure that the processes used by their suppliers meet the quality standards of developed markets…”

Some of that assurance would be in the form of FDA audits performed periodically on supply chain sources outside the United States to keep a closer eye on them and identify risks before they disrupt the supply chain and compromise drug safety.

“In spite of some of the past quality issues that have surfaced, I don’t think we’re going to see a move out of China or India as a source for raw materials or products,” said Bailey. “There are two kinds of risk management activities. One is assuring that the manufacturing processes themselves are meeting GMP [good manufacturing practices], and the other is to protect against counterfeit activity. From a global pharma company’s perspective, they need to protect against criminal activity in the inbound supply chain and the supply chain for finished goods; the focus is more around assuring the quality, assuring that the requirements are met.”

To ensure that no counterfeit product reaches the patient, companies have begun to invest heavily in track-and-trace technology.

“We have seen several high-profile instances in the last year or two where very sophisticated criminals have broken into pharmaceutical warehouses, stealing truckloads or multiple truckloads of products that they go and sell on the open market,” Bailey said. “Obviously, in an age where terrorism is of particular concern, you want to make sure that you have processes and procedures in place to assure that the product you are selling under your brand name isn’t being tampered with. So those kinds of things I think are high on the agenda of people in pharmaceutical and biotech supply chains these days.”

As Van Trieste noted: “Over the last several years there have been various legislative initiatives aimed at improving the pharmaceutical supply chain, none of which have become law. I cannot speculate about the legislative process and what, if any, new requirements will become laws and when, [although] many of the ideas proposed in draft legislation have been embraced by many pharmaceutical companies and suppliers. However, I do believe that it is inevitable that additional legislation will eventually be enacted.
I believe that stricter criminal penalties should be prescribed by legislation for those who counterfeit or intentionally adulterate pharmaceuticals for economic gain.”

Dr. James Netterwald is a biomedical writer based in New Jersey who writes articles and blogs on all things related to the pharmaceutical and biotechnology industry. He started Biopharmacomm LLC in 2009; his clients include medical education companies, medical advertising companies, science publishing companies, pharma-biotech companies, and public relations companies. More information on his writing can be found at www.nasw.org/users/netterjr.
Ensure Compliance Excellence
By Sharon Johnson
An effective strategy for partnering with a global outsourcing organization
On Aug. 6, 2009, U.S. Food and Drug Administration (FDA) Commissioner Margaret Hamburg, MD, made her first presentation to the Food and Drug Law Institute, announcing a new enforcement policy to ensure regulatory compliance in the pharmaceutical and biotechnology industry. Dr. Hamburg stated that in order for the FDA to be a strong agency, one the U.S. public can count on, the agency must appropriately enforce laws to assure that only safe and effective medical products are introduced into the marketplace.

One theme Dr. Hamburg stressed was that this new policy applies not only to the drug/biotech applicant owners but also to the outsourcing companies with which they work. Both the pharma/biotech company and the outsource organization are inspected by regulatory agencies and must have equal understanding of all relevant current good manufacturing practice (cGMP) requirements.
Now, more than a year after Dr. Hamburg’s address and at a time of potential growth in outsourcing, it is timely to consider the relationship between the pharma/biotech applicant holder and its outsourcing partners with regard to cGMP compliance, the FDA’s 2009 enforcement policy, and the expectations of regulatory agencies around the world. This article examines a standard way of working between the applicant holder and the pharma/biotech outsourcing company to ensure a mutually successful relationship of cGMP and policy compliance. The pharma/biotech company is referred to as the contract giver, and the outsourcing company is referred to as the contract receiver.
In the 2010 Annual Outsourcing Survey of more than 350 sponsor-side respondents, 44% reported that their outsourcing spending grew in the previous year (2009), and 45% projected that it would grow in 2010.
The 2009 growth was primarily experienced by Big Pharma (28%), followed by small/mid-tier pharma (16%) and virtual pharma (15%). Of the companies that outsource at least half of their commercial manufacturing, 59% outsource to preferred providers, 78% have never received a warning letter due to an inspection, and 81% would describe their relationship with a provider as a partnership. Nearly half (47%) of the companies said more than half of their outsourcing is with preferred providers.
Many organizations view the use of outsourcing organizations as an approach to driving efficiency, so it should come as no surprise that the practice is becoming more common.
Pharma/biotech companies must ensure that all their outsource providers’ locations have a current, robust, consistent quality management system (QMS) program in use. Even those contractor sites not used by the applicant holder must have an excellent compliance program, because the regulatory outcomes of any one of the contractor’s sites have the potential to affect all the contractor’s sites, and, consequently, the pharma/biotech applicant holder.
Companies are coping with change, including cost pressure, consolidation, loss of patent protection, the uncertainties of healthcare reform, increased Phase IV and comparator studies, and heightened regulatory oversight. As a result, the pharmaceutical industry is trending toward greater use of outsourcing, using a few key preferred providers as partners.
Following Dr. Hamburg’s expectations and the requirements of other worldwide regulatory agencies, the applicant holder should expect from its outsourcing contractor a strong collaborative relationship, with strategic partnering to ensure safe and effective drug products and excellence in compliance. This article focuses on effective ways for pharma/biotech and contractor partners to work together to ensure the highest level of compliance, with the partnership creating positive results for both parties.

Work Effectively
This section describes an industry standard for contracting with an outsourcing organization that encompasses both the contract giver and contract receiver. Such a standard will promote best practices and high-quality end products consumers can trust.

There are three components essential to ensure a strong partnership between the contract giver and contract receiver. Excellence in cGMP compliance will be attainable and sustainable if these practices are followed:
- Select a highly qualified contractor (e.g., a contractor who understands, implements, and sustains cGMP compliance with the latest in regulatory expectations).
- Develop a clear partnership arrangement in which both parties understand what being a partner means and what it takes to develop an effective partnership, including trust and constant communication regarding the expectations of both sides.
- Establish a process that instills the ability to think forward, with both partners not only agreeing on the current interpretation of a regulatory/compliance requirement but also collaborating to anticipate the next steps needed to ensure continuous improvement as the partnership’s standard way of working.
Six Steps to Improve Compliance Preparedness
- Improve operating systems, management systems, and behaviors;
- Partner with an outsource organization in whose quality of work and data you have complete confidence;
- Ramp up automation to enhance efficiency;
- Increase frequency of quality checks;
- Institute more efficient corrective action; and
- Maintain a current, robust, and consistent QMS program.
Select a Partner
Selecting the right partner is one of the most critical decisions necessary to ensure a successful business outcome. To understand the term partnership and its application in the pharma/biotech space, there are important quality and regulatory compliance considerations that must be evaluated by both the contract giver and the contract receiver.

The contract giver must have a clear, comprehensive, easily understood contracting program that assures confidence in the outsourcing partner. The contract receiver must have confidence in the contract giver’s requirements, which should fit the contract giver’s business plan and not conflict with its overall quality program, so that a compliant, sustainable business can be maintained. The selection process for each party should include a due diligence program that covers critical quality and compliance factors before work begins and continues throughout the work cycle between both parties. Due diligence becomes the standard work practice, in which evaluation with continual feedback is a routine process.
The contract giver’s due diligence program should assess regulatory compliance factors, such as the contractor’s QMS and the knowledge and experience the contractor brings to the work to be performed. The contract receiver should provide staff with sufficient education, knowledge, and experience in the industry to perform their associated tasks.
In the due diligence process, the contract giver will expect complete transparency from the contract receiver as a standard way of working. One of the most important factors in the due diligence process is assessment of the transparency the contractor provides regarding all facets of regulatory compliance processes, metrics, and action plans for continuous improvements through corrective actions/preventive actions (CAPA). This sharing of details is of the utmost importance for both parties and should be done as standard practice to assure that the right metrics are being evaluated and a trusted partnership has been established.
Other qualities to expect from the contract receiver include a satisfactory regulatory compliance history from worldwide agency inspections and a high level of regulatory intelligence. The contractor should have a thorough understanding of the complex, ever-changing regulations around the world, understand the impact a change to cGMP regulations or new guidance would have on the product, and be able to offer solutions to ensure continued compliance. It is incumbent on the contract giver to thoughtfully evaluate the contract receiver’s critical thinking and escalation process in light of regulatory expectations for worldwide compliance.
The selection process should address how communications will take place between the contract giver and the contract receiver’s various sites. Internal communication among the contractor’s multiple sites, which may be worldwide, must be a key factor in such processes as complaint handling and change control. The contractor’s metrics for continuous improvement and the system used to report back to the contract giver’s business are critical for long-term success. The communication systems must incorporate a quality agreement so that responsibility and accountability are made known throughout both organizations.
Case study
A Partnership Strategy for Optimal Compliance
Table 1. Examples of the QMS program include global standards for the following:
- Corporate quality policies, standards, and standard operating procedures (SOPs);
- Business unit and site-specific SOPs;
- SOP for investigation handling and root-cause analysis;
- Internal audit program with dedicated staff;
- CAPA system;
- cGMP training programs;
- Process for regulatory actions such as field alerts or product recall, including reviews at all sites for effectiveness evaluations of CAPA and best practices results sharing;
- Response to regulatory agency observations;
- Batch records not right first time;
- Response to contract giver’s audit observations, shared throughout the organization for CAPA and best practice sharing;
- Lean processes and tools, Black Belt staff;
- Supplier-management program; and
- Quality agreements (customer and supplier).
In a recent 12-month period, Catalent underwent 52 inspections from 20 global regulatory authorities. It is common to average more than 20 customer audits within a single month. Catalent employs a robust process for partnering with the pharma/biotech industry. This partnering includes a transparent QMS program, pro-active outreach to customers, and participation in setting future standards and guidance for the industry.
Transparent QMS Program

Catalent’s process for partnering is a QMS program that includes high-level policies and standards driven by global working groups as well as site-specific standard operating procedures to ensure enactment of the corporate programs. Advanced electronic tools are used throughout the company for consistency.
The QMS is a continuous, proactive, and interactive program of improvement to strengthen current processes. The QMS is coordinated from a central quality organization with staff strategically based throughout Catalent sites.
This central function provides an internal audit program, supplier-auditing program, and cGMP training that includes root-cause analysis, and provides for cross-functional ways of working within Catalent’s current quality, safety, and operations. It shares performance metrics with customers, including specific metrics around their products, as well as Catalent’s compliance improvement initiatives. Examples of the Catalent QMS program of global standards can be found in Table 1.
Catalent’s Approach
Daily: Cross-functional management teams walk the production floor every morning and review critical site quality and operation metrics—the daily stand-up. The team analyzes the most current input and status of operations and issues, acts on opportunities for improvement with site owners and timelines, and determines actions needed to mitigate risk. The site-management team reviews current metrics such as complaints, process delays, deviations/investigations, and safety issues. This floor review involves operations, warehouse, quality, and safety for the most current assessment of Catalent’s internal work processes/practices.

Monthly/quarterly: Catalent works with customers through routine face-to-face meetings to cover quality and operational business. The agreed-upon metrics are reviewed, and highlights of corrective/preventive actions (CAPA) are covered, as well as future recommendations for continuous improvement. The senior-most level of Catalent leadership is involved in the customer discussion and helps drive the partnership toward mutually beneficial outcomes. These meetings, along with routine interactions at the site, continue to build the trust both parties need to ensure compliance and progress.
Internally, the senior vice president of quality and regulatory leads a weekly review with a cross-business audience, including the executive team, regarding ongoing regulatory inspection activity, regulatory commitments, and customer quality commitments.
On a monthly basis, a deeper review is carried out for quality performance, quality and compliance highlights, and quality initiatives with the executive and business leadership teams. Regulatory inspection outcomes, customer audit outcomes, and global quality/operation initiatives are discussed through global telecoms to ensure that best practices are leveraged across all sites for consistent action. During these discussions, the management team, including senior-level participants, assesses the root cause of any problem, reviews customer comments and marketplace feedback, and develops ways to improve the local site, the business unit, or Catalent’s global operations.
A formal quality management review, led by the CEO, occurs quarterly and is attended by the executive leadership team, associated functional heads, Catalent business unit vice presidents, site general managers, and site quality leaders.
Annually/quarterly: A management review process, co-led by Catalent’s most senior quality and operations leadership and customer representatives, includes transparent discussion of current and past scorecard measures. This session generates a collaborative, long-term action plan that is based on continuous-improvement initiatives.
Catalent collaborates with regulatory and industry organizations to achieve industry improvement, establish policies, and effect change. International regulatory agencies have solicited the company’s senior-level representatives to share their expertise and intelligence, as well as advice on policy and industry improvements. Some participate on advisory boards and even play a role in shaping policies.
It is essential for pharma/biotech companies to constantly reassess their QMS strategy, seeking ways to improve. They must also make sure their outsourcing providers are working as trusted partners, equally committed to achieving compliance success. Strategic partnering for excellence involves selecting a highly qualified contractor, developing a clear agreement wherein both parties understand and thrive, and having a process to look to the future.
With a robust and stringent yet flexible strategy, a company can be confident of being inspection-ready at all times and producing safe and effective pharma/biotech products, whether processes are performed in house or by a global outsourcing organization. Ultimately, the characteristics of the contract receiver must support the contract giver’s success.
Thinking Forward
Both outsource partners must not only manage the processes in the approved quality agreement but also work with high attention to detail and constant review of systems. This includes a rigorous management of the review process, including an internal audit program that is robust and meets the contract giver’s expectations relative to its auditing program. It should include a strong investigation process with a foundation in root-cause analysis and cGMP investigations of deviations and change orders. If the contract giver has agreed to sub-contract work from its contractors, it is important that the contract receiver has a supplier auditing program that meets the contract giver’s expectations and takes strong stewardship of materials and costs.

A strategy to work within industry organizations, pharmacopeias, and regulatory forums helps to further strengthen the partnership as a relationship. A contract receiver that is working within standards-setting bodies such as pharmacopeias or regulatory review processes is a contract receiver that expects to help provide solutions to its partners, with a thorough understanding of how standards are written and set for the industry. This should mean participation on pharma/biotech association advisory boards and participation in industry working groups or even expert committees or panels within organizations, such as national pharmacopeias. Contract receivers should be actively involved with international industry associations and standards and guidance-setting bodies, including academia, to ensure that they understand current regulatory interpretation and can help influence pharma/biotech directions.