Tuesday, July 20, 2010

IN THE LAB - In Vitro Assays | Alternative to Animal Testing Saves Time, Money: In vitro Irritection assay system can reduce such testing

By Brian Dell
IMAGE COURTESY OF CELSIS ANALYTICAL SERVICES

The drive to refine, reduce and/or find alternatives for animal-based testing of pharmaceuticals, cosmetics, and personal care products has significantly accelerated the demand for validated in vitro studies that are viable equivalents to animal testing. The move to in vitro testing formats is hastened further by the high development costs and lack of assay robustness typical of animal-based testing, such as the long-established Draize test, which evaluates the acute toxicity of pharmaceutical or cosmetic products. By using in vitro assays in lieu of the traditional and often controversial practice of animal testing, manufacturers are realizing benefits characterized by the four Rs: reduction, refinement, replacement, and reproducibility.

With the alignment of certain factors—cost reduction, performance optimization, and a changing culture of acceptability around animal testing—in vitro assay formats will continue to gain acceptance by both product developers and the regulating bodies that ensure product safety.
A Time for Change

Over the last century, in vivo testing emerged as the preferred method to evaluate toxicity and to provide information on the potential dangers to humans from chemicals, pharmaceutical products, and raw materials. Until recently, animal testing was a common “gate” through which most products had to pass before being cleared for human use. Several factors, however, have shifted the scientific community’s thinking on, and approach to, animal testing.

First, ever-mounting pressure on manufacturers to cut costs and timelines to market is forcing researchers to find more efficient assays. The costs involved in caring for laboratory animals and developing and conducting validated animal studies are substantial, limiting the ability of young companies to compete. For example, with drug development, the timeline to develop and complete current good manufacturing practices-compliant preclinical studies can add years to a drug’s launch date. Thus, current market forces reward those manufacturers who find effective ways to do more at reduced cost and without sacrificing consumer safety.

The second factor driving the increased use of in vitro test methods is the availability of new assay and screening technologies, which offer substantial benefits over in vivo platforms. With the technical advances of the last 30 years, in vitro test methods demonstrate superior efficacy as defined by the four Rs.
Goal of Today’s Researchers

In vitro testing offers a compelling business case. Improved reproducibility and consistency in lot-to-lot test data yield highly correlated results. The opportunity to administer assays in a high-throughput format means a quicker turnaround time for results: typically 48 hours vs. two to three weeks with traditional in vivo methods. These efficiencies reduce costs enough to enable testing of smaller amounts of more substances at an earlier development stage.

The technical advances afforded by modern in vitro test methods make it feasible to reduce the number of animals used for testing, refine the historical data provided by in vivo testing methods, replace existing in vivo tests with cheaper, faster in vitro approaches, and ultimately deliver a reproducibility level that is unobtainable with animal-based test methods. Several in vitro assays have been proven to achieve these goals.

A prime example of a proven in vitro alternative is the use of the Irritection Assay System (Celsis International Ltd.) as a substitute for the Draize test. Since the 1940s, the Draize test has used ocular and dermal testing on live rabbits to gauge acute pharmaceutical product toxicity. The Irritection Assay System is a standardized and quantifiable alternative model that detects and predicts the ocular or dermal irritation of potential raw materials or compounds.

For new pharmaceuticals, the Irritection Assay System evaluates the ocular and/or dermal irritation caused by a test sample. During the testing process, a proprietary protein solution is placed in the well of a Petri dish to mimic the eye proteins, and a membrane disk is placed on top of the protein layer. Different volumes or dilutions of the test sample are incubated on the membrane; the membrane is observed for deterioration, and the remaining material in each well is mixed and tested for turbidity by measuring optical density against a standard curve. Once the density scores are compared, the irritancy score is calculated.
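The turbidity-to-score step above can be illustrated with a small sketch. All numbers here are hypothetical stand-ins (the `STANDARD_CURVE` points and the `classify` cut-offs are not Irritection's proprietary values); the point is only the mechanic of reading an irritancy score off a standard curve by interpolating optical density:

```python
# Sketch: converting optical-density (OD) readings from an in vitro
# irritation assay into a relative irritancy score via a standard curve.
# Curve points and scoring thresholds are illustrative only.

def interpolate(curve, od):
    """Linearly interpolate an irritancy value from sorted (OD, score) pairs."""
    curve = sorted(curve)
    if od <= curve[0][0]:
        return curve[0][1]
    if od >= curve[-1][0]:
        return curve[-1][1]
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if x0 <= od <= x1:
            return y0 + (y1 - y0) * (od - x0) / (x1 - x0)

def classify(score):
    """Map a numeric score to a coarse irritancy category (illustrative cut-offs)."""
    if score < 1.0:
        return "minimal"
    if score < 2.5:
        return "mild"
    return "severe"

# Hypothetical standard curve: turbidity (OD units) vs. irritancy score.
STANDARD_CURVE = [(0.05, 0.0), (0.20, 1.0), (0.60, 2.5), (1.20, 4.0)]

if __name__ == "__main__":
    for od in (0.10, 0.40, 0.90):
        s = interpolate(STANDARD_CURVE, od)
        print(f"OD {od:.2f} -> score {s:.2f} ({classify(s)})")
```

A sample producing higher turbidity than any standard simply saturates at the top of the curve, which mirrors how a crude irritant/non-irritant call is made.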

Early in product development, an in vitro assay such as the Irritection format can determine whether animal testing is necessary. When Irritection results are positive for irritation or corrosion, there is no need to prove this again in an animal model. The pharmaceutical can be discarded, or the manufacturer can reformulate and reevaluate it using the Irritection test format. This practice reduces the total number of animals used in the research phase of product development. Additionally, both time and money are saved by the early determination of a product’s irritancy score. Working toward the same decision point in animal studies is significantly more time-consuming and expensive.
Refining in Vitro Methods

In vitro assays also serve to refine in vivo test information, which typically suffers from high variability across animal subjects. A prime example of an in vitro assay format that plays a refining role is the bacterial endotoxins test (BET: USP <85>). BET was originally designed to replace the rabbit pyrogen test for detecting endotoxin from Gram-negative bacteria in a test sample. Gram-negative bacteria are pyrogenic and raise body temperature by inducing the inflammatory response. Consequently, manufacturers must prove that pharmaceutical products intended for parenteral administration are clear of pyrogenic contamination prior to clinical use.

The U.S., European, and Japanese Pharmacopeias currently recognize two test methods for pyrogen testing. The rabbit pyrogen test (USP <151>) involves measuring the rise in temperature in rabbits following intravenous injection of a test solution. The febrile responses of rabbits to intravenous pyrogens vary from animal to animal due to individual sensitivities. Certain drugs, such as cytokines, antibiotics, select sedatives/analgesics, plasma proteins, and radiopharmaceuticals, cannot be tested this way at all. Variability, limited applicability, and a lack of standardized controls limit the accuracy of rabbit testing.

The in vitro BET is based on the coagulation of Limulus amebocyte lysate following endotoxin exposure: enzymes in the lysate recognize endotoxin and trigger a measurable clotting reaction. Standardized controls and lot consistency eliminate the variability risk posed by animal subjects, making BET more consistent and accurate than its animal analog. While BET is limited to detecting only endotoxin pyrogens, its ability to produce refined test data is increasingly recognized. Although some products still require animal testing, BET is now mandatory for many USP monographs.
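One quantitative form of the BET, the kinetic-turbidimetric technique described in USP <85>, times how quickly the lysate reaction clouds each well: the logarithm of the onset time is linear in the logarithm of endotoxin concentration. A minimal sketch of that read-out follows; the standard series and onset times are hypothetical illustrations, not compendial values:

```python
# Sketch of the kinetic-turbidimetric BET idea: fit a straight line to
# log10(onset time) vs. log10(endotoxin concentration) for the standards,
# then invert the line to estimate an unknown's concentration.
import math

def fit_loglog(standards):
    """Least-squares fit of log10(onset time) vs. log10(concentration)."""
    xs = [math.log10(c) for c, _ in standards]
    ys = [math.log10(t) for _, t in standards]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def concentration_from_onset(slope, intercept, onset_time):
    """Invert the fitted line to estimate endotoxin concentration (EU/mL)."""
    return 10 ** ((math.log10(onset_time) - intercept) / slope)

# Hypothetical standard series: (concentration in EU/mL, onset time in s).
# Higher endotoxin levels clot the lysate faster.
STANDARDS = [(0.01, 3000), (0.1, 1500), (1.0, 750), (10.0, 375)]

slope, intercept = fit_loglog(STANDARDS)
print(concentration_from_onset(slope, intercept, 1000))
```

The standardized reference curve is exactly what removes animal-to-animal variability from the measurement: every lot is read against the same line.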
Replacing Animals with In Vitro Assays

Replacement occurs when an in vitro assay is deemed acceptable as an equivalent alternative to the animal test. The FDA has advocated for in vitro testing during drug development to assess safety issues like toxicity and drug-drug interactions. The FDA Guidance for Industry regarding drug interaction studies outlines the use of cellular and subcellular products for determining the inhibition and induction of key metabolizing enzymes.1

Products such as cryopreserved hepatocytes or microsomes from human donors fulfill the requirements for pharmaceutical companies to assess these safety concerns. By screening new chemical entities early in the discovery stage and testing drug compounds before exposing humans or animals, companies can reduce the rate at which new drugs are lost in clinical studies to safety and effectiveness issues, saving time and money while reducing the potential for adverse side effects in patients.

A final but also significant benefit of in vitro testing is the high lot-to-lot reproducibility and consistency of its test results. Many examples of variability have been noted when animals are used in testing. In fact, the biggest criticism of the Draize test is that it is unreliable and imprecise—the range of dermal responses among test animals extends from no effect to a very drastic skin irritation that includes necrosis. Studies suggest that the Draize test is only effective as a crude tool to distinguish irritants from non-irritants. The introduction of validated in vitro models such as Irritection and BET allows more consistent, measurable, and reproducible testing.

In conclusion, in vitro test methods deliver enhanced reproducibility and consistency in comparison to animal testing and are, consequently, gaining greater acceptance as strategic and necessary tools to reduce, refine, and replace animal use in drug research and product testing. As global manufacturers strive to deliver regulatory-compliant products to market in ever shorter time frames, in vitro methods such as the use of cryopreserved cellular research products and the Irritection and BET assays are not only viable alternatives to animal testing, but they are also strategic tools that can significantly reduce product development timelines and enhance the economic outcomes of the companies who use them.

Dell is associate director, biology, at Celsis Analytical Services. Reach him at bdell@celsis.com or (314) 885-1163.
References

1. United States Food and Drug Administration. Guidance for Industry: Drug Interaction Studies - Study Design, Data Analysis, and Implications for Dosing and Labeling (2006 draft). Available at: www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/ucm072101.pdf. Accessed June 13, 2010.

FORMULATION - Thermal Analysis | In-House Generation of Nitrogen for Thermal Analysis: In addition to saving money, method is safe, convenient

A broad range of thermal analysis (TA) techniques measure the effects of temperature on stability and other physical properties of polymers that are used in pharmaceuticals and pharmaceutical packaging. When TA measurements are taken, the chemical nature of the polymer should not change because of a reaction with oxygen or water vapor in the sample chamber.

To minimize the possibility of oxidation or other reactions, a high-purity inert gas such as nitrogen is passed through the sample chamber to displace the air. The DSC-1 Differential Scanning Calorimeter (Mettler-Toledo, Columbus, Ohio), for example, uses a flow of nitrogen on the order of 200 mL/minute at a pressure of two to five pounds per square inch (psi) to maintain optimum analytical conditions.

A high-pressure tank, a dewar, or an in-house nitrogen generator can supply nitrogen for thermal analysis. An in-house generator provides significant benefits compared with other approaches, including increased safety and convenience, lower cost, and diminished use of energy.
IMAGE COURTESY OF PARKER HANNIFIN CORP.
Figure 1. Nitrogen is generated from air using a hollow-fiber membrane bundle. It has a small diameter, so many fibers are bundled together to provide a large surface area for the permeation of oxygen and water.
Generation of Nitrogen for Analysis

In-house generation of nitrogen from ambient air requires the removal of oxygen, water vapor, and particulate matter. Nitrogen can be generated from air using a hollow-fiber membrane that permits oxygen and water vapor to permeate the membrane while the nitrogen flows through the tube. A fiber membrane has a small internal diameter, so a large number of fibers are bundled together to provide a large surface area for the permeation of oxygen and water (see Figure 1, left).

The Model N2-04 Nitrogen Generator (Parker Hannifin Corporation, Haverhill, Mass.) can provide six standard liters per minute (SLPM) of 99% nitrogen at 145 psi. Compressed air is first filtered to remove liquids and particulate matter; the filters are equipped with float drains to empty liquids that accumulate inside the filter housing. Separation of the nitrogen takes place inside the membrane bundle. The nitrogen is then directed downstream while oxygen and water molecules are ported to the atmosphere at a low pressure. A 0.01 µm membrane performs the final filtration, and the gas is delivered to the analyzer (see Figure 2, p. 27). The nitrogen has an atmospheric dew point of -58°F (-50°C), contains no particulate matter greater than 0.01 µm and no suspended liquids, is hydrocarbon- and phthalate-free, and is commercially sterile.

In-house generators can provide nitrogen with a purity as high as 99.9999%, with CO, CO2, H2O, and O2 levels at less than one part per million, and at a flow rate and pressure compatible with thermal analysis.
Benefits of In-House Generator

An in-house nitrogen generator is safer, significantly more convenient, and less expensive than other methods. In addition, use of an in-house generator dramatically reduces environmental impact.

Because an in-house nitrogen generator will not alter the atmospheric composition of the air in the lab, it is considerably safer than a tank or a dewar. The generator separates the O2 from the N2 and vents it to the atmosphere; the 200 mL/min of N2 that is used by the TA instrument is then vented back to the room as well. The net change to the room atmosphere is zero.

In contrast, serious hazards are present when nitrogen gas is supplied to a thermal analyzer using a high-pressure gas tank or a liquid tank. If the contents of a full tank were suddenly vented into the laboratory, up to 9,000 L of gas would be released into the atmosphere. This volume would displace the equivalent amount of laboratory air, thereby reducing the breathable oxygen and potentially creating an asphyxiation hazard for laboratory occupants.
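The displacement arithmetic is simple to sketch. Assuming the released nitrogen mixes evenly with the room air, the remaining oxygen fraction scales with the fraction of air displaced; the 90,000 L room volume (a 5 m x 6 m x 3 m lab) is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope sketch of the asphyxiation hazard: if a volume of
# nitrogen suddenly displaces an equal volume of well-mixed laboratory
# air, what oxygen fraction remains?

def oxygen_fraction_after_release(room_volume_l, released_n2_l, o2_fraction=0.209):
    """O2 fraction assuming released N2 displaces an equal volume of mixed air."""
    remaining_air = room_volume_l - released_n2_l
    return o2_fraction * remaining_air / room_volume_l

# Illustrative room: 5 m x 6 m x 3 m = 90 m^3 = 90,000 L of air.
room_l = 90_000
print(f"{oxygen_fraction_after_release(room_l, 9_000) * 100:.1f}% O2")
```

For the article's 9,000 L release in such a room, this works out to roughly 18.8% O2, below the 19.5% level that OSHA treats as oxygen-deficient.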

Use of an in-house generator eliminates the possibility of injury or damage that can occur when a gas tank is transported and installed. A standard gas tank is quite heavy and can pose a significant hazard to staff and facilities if the valve on a full tank is compromised during transport. In many facilities, trained technicians are used to replace gas tanks.

When a dewar flask or a high-pressure liquid tank is used, the possibility of user contact with liquid nitrogen, which has a boiling point of -196°C, must be considered. As with a high-pressure tank, a leak in the delivery system could release a significant amount of gas into the laboratory.

Roland Brunell, special projects manager at Danafilms Inc., of Marlborough, Mass., a manufacturer of films for packaging, said that safety was “the primary reason that an in-house generator was selected for the DSC used for analyzing polyethylene films.”
Figure 2. Schematic design of a N2-04 nitrogen generator.
IMAGE COURTESY OF PARKER HANNIFIN CORP.
More Convenient

When an in-house generator is used, the gas can be supplied continuously, 24 hours a day, seven days a week, with no user interaction other than routine annual maintenance. With tank gas or a dewar, on the other hand, the user must pay close attention to the level of gas in the tank and replace it periodically to ensure that the gas will not be depleted in the middle of a long series of analyses. For safety reasons, tanks are typically stored outside in a remote area, so replacing a cylinder can be time-consuming as well as inconvenient in inclement weather. The analyst may also need a qualified handler to move the tanks, and a pressurized tank can be a significant hazard if the laboratory is located in a seismic zone.

If a nitrogen tank must be replaced during a series of analyses, analytical work will be interrupted to restart the system and wait for a stable baseline. In addition, if a series of automated analyses is desired, perhaps overnight, the analyst must ensure that a sufficient volume of gas is available before starting the sequence.

An in-house nitrogen generator allows for continuous operation of the thermal analyzer, and calibration only requires the measurement of a standard sample at a user-specified interval to ensure proper operation of the system. When a new tank is installed, however, the system may need recalibration to ensure accuracy, a time-consuming procedure that decreases laboratory efficiency and throughput.

The maintenance requirements for the in-house nitrogen generator are minimal. The readily accessible filters are typically replaced once a year, a process that takes about 10 minutes for all three filters.
Table 1. Annual Costs of In-House Generation vs. High-Pressure Tanks (U.S. $)
IMAGE COURTESY OF PARKER HANNIFIN CORP.
Lower Costs

In addition to significant improvements in safety and convenience, use of an in-house generator provides economic benefits in comparison with a gas tank or liquid nitrogen. The running cost of operating an in-house generator is extremely low, because the gas is obtained from laboratory air, with no electricity required. The running costs and maintenance for an in-house generator add up to a few hundred dollars a year for periodic filter replacement.

In contrast, the expense associated with a liquid or gas nitrogen tank is higher. The actual cost for using nitrogen gas from tanks is usually significantly greater than just the cost of obtaining the gas tank. The time involved in changing tanks, ordering tanks, maintaining inventory, and conducting related activities adds to the cost.

Hidden costs of tank gas can include the transportation demurrage and paperwork—purchase orders, inventory control and invoice payment. Additional costs are associated with the time required to transport the tank from the storage area, install the tank, replace the used tank in storage, and wait for the system to re-equilibrate after the tank has been replaced.

While the calculation of the precise cost of nitrogen gas from tanks is dependent on a broad range of local parameters as well as amount of gas used, significant savings are probable with in-house generation of nitrogen. According to Brunell of Danafilms, the payback period of the in-house nitrogen generator is about two years.

Table 1 (see left) shows a cost comparison between supplying gas with a tank vs. generating gas in house. For the analysis, we assumed that a single tank of gas is consumed weekly, that the tanks cost $60 each, and that four tanks are kept in house, with each tank replaced once each month. The analysis does not include incidental expenses, such as the costs associated with handling the gas tank, downtime, ordering tanks, and other related activities. As this comparison shows, the cost of using an in-house generator is tied solely to maintenance (filter replacement) and is estimated at about $1,000 per year, or approximately $20 per week.
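The arithmetic behind that comparison can be reproduced from the stated assumptions (one $60 tank per week vs. roughly $1,000 per year in filter maintenance); as in Table 1, incidental handling costs are excluded:

```python
# Sketch of the Table 1 running-cost comparison, using only the
# assumptions stated in the article.

WEEKS_PER_YEAR = 52

def annual_tank_cost(tanks_per_week=1, cost_per_tank=60.0):
    """Yearly spend on tank gas at the stated consumption rate."""
    return tanks_per_week * cost_per_tank * WEEKS_PER_YEAR

def annual_generator_cost(maintenance_per_year=1000.0):
    """Yearly spend for the in-house generator (filter replacement only)."""
    return maintenance_per_year

tank = annual_tank_cost()
gen = annual_generator_cost()
print(f"Tank gas:  ${tank:,.0f}/yr (${tank / WEEKS_PER_YEAR:.2f}/wk)")
print(f"Generator: ${gen:,.0f}/yr (${gen / WEEKS_PER_YEAR:.2f}/wk)")
print(f"Annual savings: ${tank - gen:,.0f}")
```

On these assumptions, tank gas runs $3,120 per year against $1,000 for the generator, roughly $2,120 in annual running-cost savings before the generator's purchase price is considered.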

A Global View of Regulations Affecting Nanomaterials


The 2000s have been characterized by an unprecedented exploration into research and development of nanotechnology and nanomaterials. Despite a slow start, new regulatory initiatives are popping up like mushrooms internationally. Many of these initiatives have yet to materialize or remain soft-law initiatives, and their impact on the development of more authoritative and prescriptive regulatory measures is most likely to be limited.

This is due to a number of transnational regulatory challenges that include: (1) whether to adapt existing legislation or develop a new regulatory framework, (2) whether nanomaterials should be considered as different from their bulk counterparts, (3) how to define nanotechnology and nanomaterials, and (4) how to deal with the profound limitations of risk assessment when it comes to nanomaterials.

In this opinion, I discuss these and related issues and conclude that the development of a new authoritative and prescriptive regulatory framework might be the only way to effectively address these challenges while ensuring a transparent and informed decision-making process.

Hansen SF. A global view of regulations affecting nanomaterials. Published online ahead of print June 8, 2010. Wiley Interdisciplinary Reviews: Nanomedicine and Nanobiotechnology. Correspondence to Steffen F. Hansen, Department of Environmental Engineering, Technical University of Denmark at srh@env.dtu.dk.


Figure 2. Uptake processes of therapeutic nucleic acids (displayed for plasmid DNA and small interfering RNA) mediated by bioresponsive polymers.

Review: Bioresponsive Polymers for the Delivery of Therapeutic Nucleic Acids

Polymers present an interesting option for the delivery of genes and other therapeutic nucleic acids. In the delivery process, the polymeric carriers face many different delivery tasks and different physiological microenvironments. Polymers can be designed to respond to microenvironmental differences with changes in their physicochemical properties, enabling them to perform individual delivery tasks.

Cleavage of covalent bonds, disassembly of noncovalent interactions, and changes in protonation, conformation, or hydrophilicity/lipophilicity can trigger such dynamic physicochemical adjustments. The polymeric carrier has to bind the therapeutic nucleic acid stably during the extracellular delivery phase and protect it against degradation in the bloodstream. At the intracellular site of action, the polyplex has to disassemble to an extent that leaves the nucleic acid functionally accessible. Polyplexes need to be shielded in the circulation and remain inert against numerous possible biological interactions, yet they should actively interact with the target cell surface through electrostatic or ligand-receptor interactions. Lipid-membrane destabilization at the cell membrane or at nontarget sites is usually associated with undesired cytotoxicity; the analogous biophysical event, however, is required within an endocytic vesicle for polyplex transfer into the cytosol.

Strategies are presented for how bioresponsive polymers can be designed and incorporated into polyplexes. Examples include dynamic stabilization of the polymer/nucleic acid core and transient activation of properties required for crossing lipid-membrane barriers. Bioresponsive delivery domains at the polyplex surface, required for shielding, deshielding, and cell targeting, also contribute to better performance.

Edinger D, Wagner E. Bioresponsive polymers for the delivery of therapeutic nucleic acids. Published online ahead of print June 8, 2010. Wiley Interdisciplinary Reviews: Nanomedicine and Nanobiotechnology. Correspondence to Ernst Wagner, Pharmaceutical Biotechnology, LMU University at ernst.wagner@cup.uni-muenchen.de.

The evolving use of inkjet technology



By Neil Canavan

A collaborative effort between the University of Leeds and Durham University in England and U.S.-based GlaxoSmithKline (GSK) may soon change the way certain drugs are manufactured by “printing” the active pharmaceutical ingredient (API) on the surface of an otherwise inert tablet. It’s hoped this new process will assure quality control, enhance safety, and, at the same time, reduce cost.

We first focused on drops in solution, which is fairly straightforward. Now we’re looking at droplets with the drug in suspension.
—Nik Kapur, PhD, University of Leeds

Allan Clarke, director of innovative manufacturing at GSK, initiated the project. Clarke wanted to revolutionize the processes for making GSK’s most potent orally administered drugs. These compounds, which have therapeutic activity ranging from a few micrograms to a few milligrams, present demanding quality control challenges. Hormonal agents represent one example of this type of compound. “We have a number of such compounds in our portfolio, which require the controlled, consistent application of micrograms of API as part of a tablet that weighs over 200 milligrams,” Clarke said.

Highly potent drugs also demand cumbersome and expensive safety and containment procedures. “This is necessary because if you inhaled even a small quantity of API, you would get a pharmaceutical dose,” he said. At first blush, the solution to both of these problems was just that: put the API in solution. This method would allow for a so-called “shirtsleeves” working environment without onerous respirators and would facilitate the measurement of small doses by creating a liquid of known API concentration, which could then be aliquoted—printed with an inkjet-like nozzle—onto the surface of a tablet. Simple enough.

Multiple Images Needed

It was up to GSK to invent the rest: drug formulations that result in manageable drops and imaging technologies that verify the drop volume or drug dose as it is being applied. “We’re talking about multiple images for every single tablet, at a rate of six tablets per second, per nozzle, on a system of multiple nozzles. That’s very different than conventional manufacture where you blend a 600-kilogram batch of API and excipients and feed it into a tablet press, and then statistically sample for quality assurance,” Clarke said.

After developing the process and building industrial-scale equipment for low-dose products, GSK reached out to Nik Kapur, PhD, an expert in the fluid mechanics of droplets at the University of Leeds. GSK asked him to create dynamic models that would allow for expansion of the program to higher dose products.

“We first focused on drops in solution,” said Dr. Kapur, “which is fairly straightforward. Now we’re looking at droplets with the drug in suspension.” The idea is to enable the application of compounds that are not readily solubilized, as well as to increase the loading capacity per drop for drugs that require higher doses than those previously used.
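The dosing arithmetic such models feed into can be sketched directly: a target dose, a suspension concentration, and a droplet diameter together determine the droplet count per tablet. All values below are illustrative assumptions, not GSK's formulation numbers:

```python
# Sketch of droplet-printing dose arithmetic: how many droplets of a
# given diameter and API concentration deliver a target microgram dose?
import math

def drop_volume_ul(diameter_mm):
    """Volume of a spherical droplet in microliters (1 mm^3 == 1 uL)."""
    r = diameter_mm / 2
    return (4 / 3) * math.pi * r ** 3

def drops_needed(target_dose_ug, conc_ug_per_ul, diameter_mm):
    """Round up to the whole number of droplets covering the target dose."""
    dose_per_drop = conc_ug_per_ul * drop_volume_ul(diameter_mm)
    return math.ceil(target_dose_ug / dose_per_drop)

# Hypothetical case: 50 ug dose, 0.1 ug/uL suspension, 1 mm droplets.
print(drops_needed(50, 0.1, 1.0))
```

Because dose per drop scales with the cube of droplet diameter, the small suspension-driven changes in droplet formation that Dr. Kapur photographs translate directly into dosing error, which is why per-drop imaging matters.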

In accomplishing this expansion, imaging was again key: “Droplets are incredibly complicated things. When you start to look at them using high-speed photography, you see how very small changes in the suspension have a dramatic impact on the way droplets form,” Dr. Kapur said.

Shrinking Droplets

He is also interested in scale. The typical inkjet nozzle might be 20 to 30 microns across, whereas GSK is looking to dispense droplet volumes that might be millimeters in diameter. “My goal is to try to understand the influence of all the interrelated variables,” said Dr. Kapur. Ideally, at the end of what is slated to be a two-year collaboration (funded in part by England’s Technology Strategy Board) Dr. Kapur hopes to supply GSK with a recipe card, “a guide for the formulation of drugs that result in a workable droplet and the mechanical parameters that they can play with to achieve the desired deposition.”

This technology will apply only to immediate-release therapeutics, however. And just as a printer cartridge dispenses multiple inks, printing more than one API could make combination drugs possible.

Inhalable Measles Vaccine Poised to Enter Clinical Trials: The new method could cut vaccine waste


An inhalable, immunogenic vaccine particle has been identified as a lead candidate vaccine for measles and is slated to enter clinical trials in India by summer’s end. Of the 163,000 measles-related deaths of children each year, roughly two-thirds occur in India.

We’re betting that the extraordinary large surface area in the alveoli in the deep lung will give us a therapeutic advantage.
—Robert Sievers, PhD, the University of Colorado’s Cooperative Institute for Research in Environmental Sciences

Accounting for this high mortality rate, despite the longstanding existence of an effective vaccine, is the phenomenon of “vaccine wastage,” said Robert Sievers, PhD, professor of chemistry and fellow at the University of Colorado’s Cooperative Institute for Research in Environmental Sciences. Dr. Sievers pioneered the methods used to create the particle.

“Presently, a lyophilized material goes out in multi-dose vials. You reconstitute it with local water that you probably had to purify, then inject one person, then another, but to avoid bacterial contamination you have to destroy what’s left within three to six hours.” The amount of wasted vaccine worldwide is estimated to exceed 40%. Clearly, the conditions on the ground require an innovation in vaccine manufacture.

A Roundabout Path

Despite that need, Dr. Sievers’ goals in embarking on this work were quite different. “For many years, my students and I were concerned about atmospheric quality and pollutants, and especially particulate matter in the air.” The necessity of making reference materials for those studies led to the creation and patenting of a method called carbon dioxide assisted nebulization with a bubble dryer (CAN-BD).

“This gave us nice, fine particles,” said Dr. Sievers. So fine, in fact, that pharmaceutical companies took notice, encouraging Dr. Sievers and his team to increase their study of medical aerosols and eventually leading to the measles project, which was recognized and rewarded with funding by the National Institutes of Health and the Gates Foundation.

“If you think about it,” said Dr. Sievers, “it’s an historical anomaly that we started out sticking needles into your body.” It works, but that doesn’t mean it’s the best way to deliver the active agent. Consider measles, an infection of the lung. “We’re betting that the extraordinary large surface area in the alveoli in the deep lung will give us a therapeutic advantage,” not necessarily superior in immunogenicity to parenteral administration of vaccine, but certainly equivalent.

Particle Size Crucial

The key is particle size, in this case a diminutive dimension that cannot be achieved with commercially available nebulizers like those used for asthma medications. Dr. Sievers discovered that CO2, when it is brought into contact with vaccine under high pressure, creates micro-bubbles that prevent particle aggregation upon drying at atmospheric pressure.

The use of CO2 was combined with an innovation in the constitution of the vaccine. “The biggest challenge in formulation was finding the proper sugar. Sorbitol is the standard, but we used a new inhalable excipient, myo-inositol,” said Dr. Sievers, “that gives you particles that are not nearly as sticky [and are] easily dispersed.” And easy to handle: A vaccine that requires no water or needles can be packaged as a single dose and is inhaled from an easily portable bag, thus minimizing vaccine waste.

“This could well open a new direction for vaccine delivery. I’m eager to see the results of the human trials,” said Tom Jin, MD, of the Aeras Global TB Vaccine Foundation. Dr. Jin’s interest is directly related to the efforts of Aeras, a nonprofit research organization dedicated to the development of new TB vaccines to improve pediatric coverage in developing countries. Aerosol delivery of vaccine is one component of the group’s work.

The foundation’s lead vaccine candidate, AERAS-402, was formulated with mannitol and powdered by way of freeze-drying. Dr. Jin was not eager to reinvent the excipient. “We’re getting two to four micron particle size, and that’s suitable for pulmonary delivery.” The CAN-BD method “is new, and people in the field are very interested in trying it,” he added.

Ensure Quality in Peptide Manufacturing Processes: Quality-centric approach needed at each step in process



By Shawn Shirzadi

As peptide-based therapies become increasingly viable drug discovery and development targets, the industry is paying more attention to the quality concerns that underlie peptide-manufacturing processes. Peptide synthesis for pharmaceutical manufacturing can be tedious and time-consuming, given the complexity of the product and the lengthy, intricate synthesis process. Regulatory compliance, quality control, and assurance efforts are critical for the successful development and manufacture of peptides as active pharmaceutical ingredients.

As a key element in the peptide production process, quality should be built into every step and should be considered a process parameter, not a process outcome. This will assure the purity of the final product and effectively satisfy regulatory oversight.

Peptides being purified at the American Peptide Company’s facility in Vista, Calif.
IMAGE COURTESY OF THE AMERICAN PEPTIDE COMPANY

Elements of Quality

Achieving product quality and purity requires a meticulous quality-centric approach from discovery to the final product release. Although quality encompasses all activities designed to ensure adequacy of manufactured products, pharmaceutical industry protocols are usually divided into two separate functions: quality assurance (QA), which oversees the entire manufacturing process and is responsible for the final release and disposition of the product, and quality control (QC), which is responsible for analytical testing and characterization of raw materials and finished products.

Essentially, QC monitors the endpoints of a production run: what comes in and what goes out. QA, by contrast, is responsible for quality throughout the entire manufacturing process.

The analytical chemists responsible for QC also ensure that analytical methods are developed and subsequently validated. Their assessment of the structural integrity and purity of the peptide is critical during the development stages of a product. Without rigorous analytical characterization and evaluation of potential impurities at the start of each manufacturing project, problems can be missed that will resurface as product recalls at a later point in the process, sometimes with devastating effects on patient health and safety.

Quality System Is the Sum of Its Parts

A quality system in a pharmaceutical manufacturing environment is composed of several components, including but not always limited to facilities and equipment, laboratory controls, materials, packaging, and labeling. These components should be designed to incorporate redundancies and fail-safes, because failure of one component can mean failure of the entire operation.

The facility and equipment component is a critical part of overall quality management, requiring consistent monitoring, maintenance, validation, and possibly calibration. Regular evaluation of the humidity, ventilation, air control system, compressed gases, and water systems is key. These facility- and equipment-specific considerations should be addressed during facility design and continually improved as needs evolve.

For example, a quality standard operating procedure should mandate regular cleaning and maintenance procedures, contamination prevention, and regular testing and monitoring of the controlled environment. Lighting, flooring, potable water, and sanitary facilities, as well as sanitization and pest control, are also important considerations.

Equipment and facility assets, such as a pharmaceutical-grade water system and emergency power supply systems, must be validated prior to use (installation qualification, operational qualification, and performance qualification). Cleanrooms and all other controlled areas must be qualified prior to use.

A technician works on peptide process development that will provide high-quality peptide compounds.
IMAGE COURTESY OF THE AMERICAN PEPTIDE COMPANY

Achieving Compliance with a Focus on Quality

A focus on quality must have regulatory compliance as its ultimate goal. Adherence to current good manufacturing practices (cGMPs) and a robust documentation program can ensure reproducible, verifiable quality procedures that not only stand up to regulatory scrutiny but also guarantee a high-purity final product.

The U.S. Food and Drug Administration (FDA) mandates cGMPs, which are obligatory prerequisites to establishing a robust and reproducible manufacturing process. Apart from general guidelines, including the Code of Federal Regulations (CFR) and the International Conference on Harmonisation’s Q7 Good Manufacturing Practice Guide for Active Pharmaceutical Ingredients, there is only one guideline specifically dedicated to peptides. The FDA’s Guidance for Industry for the Submission of Chemistry, Manufacturing, and Controls Information for Synthetic Peptide Substances, issued in 1994, stipulates that the lot-release specifications—a set of tests and acceptance criteria that must be met before a product is released—must be sufficient to ensure the identity, purity, strength, and/or potency of the peptide and to demonstrate lot-to-lot consistency.

Every product manufactured under cGMP must undergo a battery of analytical tests. Each batch should be provided with a lot-specific certificate of analysis (COA) documenting specifications, test methods, and results. A typical COA contains information on appearance, solubility, purity by gradient high-performance liquid chromatography, and molecular weight, as well as peptide counter ion, water, and residual organic solvent content.
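The lot-release logic described above can be sketched as a set of specifications checked against measured results. In this minimal Python illustration, the test names and acceptance limits are hypothetical, chosen only to show the shape of the check, not any company’s actual criteria:

```python
# Hypothetical lot-release specifications for a peptide API (illustrative only;
# not American Peptide Company's actual acceptance criteria).
specs = {
    "purity_by_hplc_pct": (95.0, 100.0),
    "water_content_pct": (0.0, 8.0),
    "residual_solvent_ppm": (0.0, 500.0),
}

lot_results = {
    "purity_by_hplc_pct": 98.2,
    "water_content_pct": 4.1,
    "residual_solvent_ppm": 120.0,
}

def lot_passes(specs, results):
    """A lot is releasable only when every tested attribute falls
    within its acceptance limits."""
    return all(low <= results[test] <= high for test, (low, high) in specs.items())

print(lot_passes(specs, lot_results))  # True for this lot
```

The point of the structure is the all-or-nothing release decision: a single out-of-specification result, on any attribute, blocks release of the entire lot.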

Documentation and Quality System

Documentation of manufacturing processes, along with all related in-process and final-release testing, is essential for maintaining compliance with regulatory oversight. Extensive documentation is required of production, change control, vendor audits, qualification process, and raw materials testing and release.

Specifically, documentation demonstrates compliance not only with cGMPs but also with 21 CFR 211.42 (Design and construction features), which stipulates that “any building or buildings used in the manufacture, processing, packing, or holding of a drug product shall be of suitable size, construction, and location to facilitate cleaning, maintenance, and proper operations.” Firms must also prove compliance with 21 CFR 211.63 (Equipment design, size, and location), which indicates that “equipment used in the manufacture, processing, packing, or holding of a drug product shall be of appropriate design, adequate size, and suitably located to facilitate operations for its intended use and for its cleaning and maintenance.”

An important step in the documentation process is the focus on accurate labeling and label accountability. Lack of strict controls in this area can spell disaster, because issues with mislabeling often lead to recalls.

Implementing Quality

Based on the customer’s needs for a specific peptide manufacturing project, the product manufacturer develops a manufacturing project design scheme. From there, a conscientious peptide manufacturer will help define the parameters for the engineering function, including operational and compliance requirements. Beyond guided tours of facilities, engineering the project involves assisting with the client’s preparation of regulatory documents, including chemistry, manufacturing, and controls documents and drug master files, and discussing any discrepancies, product testing, and technical support. This dialogue should be ongoing, starting at the beginning of the project, before the initial design, and continuing after product release.

With an approved production batch record, the process begins with the qualification of raw materials as well as equipment used in the process. The manufacturer must ensure that the in-process testing and verification of critical steps are documented within the production batch records.

Systems Approach to Quality

Bearing all these elements in mind in a holistic fashion is critical for implementing a sound quality system. Indeed, the key word for any effective approach to quality outcomes is “system.” Because every component of a manufacturing process contributes to the quality of the final product, moving forward without a comprehensive systems approach to the entire process means that even a minor misstep can compromise final outcomes. The approach that considers every minuscule aspect of the manufacturing operation—from documentation to capital equipment—is the surest way to guarantee final quality.

Shirzadi is vice president of quality at American Peptide Company. Reach him at (408) 733-7604; for more information e-mail sales@americanpeptide.com or go to www.americanpeptide.com.

Formulation evolves rapidly from tablets to needle-free injection


By Gina Shaw
Formula Racing
IMAGE COURTESY OF DAVID CIPOLLA, PHD.

Back when I was a kid in the 1970s, there were pretty much two ways for oral drugs to be delivered: standard tablets and capsules, and capsules were still fairly novel. I remember an old TV commercial that showed a capsule splitting apart and hundreds of tiny round particles spilling forth.

Drug formulation, of course, goes back a lot farther than the 1960s and 1970s—indeed, you could take it back centuries. What was Socrates’ cup of poisoned hemlock, after all, but a fatal drug mixture?

But over the past three or four decades, the formulation of drugs—for both oral and parenteral delivery—has taken several giant leaps forward from the simple tablets and capsules of the 1960s and 1970s.

Oral Arguments

Drug tablets, of course, need a coating—to improve their appearance and stability, mask odor and taste, reduce dust, improve bulk handling, and, not least of all for the producer, help the consumer identify the brand. Back in the 1960s, most tablets were coated with sugar. But that coating had its drawbacks, said Yidan Lan, PhD, a senior scientist at BASF Pharma Solutions. “As we moved into the 1980s, film-coated tablets were developed, and with the original film coating, stability improved, and the coating process was also more consistent, less time consuming, and just better from an appearance point of view.”

Those particle-spilling capsules, usually called hard-shelled capsules, were not particularly popular in their early days, said Dr. Lan, so those, too, evolved. “We moved to the softgel approach. That was done to achieve some kind of water solubility and also to increase the bioavailability of the drug.”

The faster and much more economical film coating technology that was adopted in the 1970s could also be functionalized for time-release delivery. The immediate-release tablets and capsules of the 1970s started dissolving as soon as you swallowed them—that was that. Controlled-release dosage form technology became important in that era; most of these had first-order release patterns.

But the big goal in the 1970s was something called “zero-order release,” said Rick Soltero, PhD, president of PharmaDirections, a drug development consulting and management company. “That was the single biggest change in oral drug formulation in the 1970s and 1980s.” Zero-order release involves a mechanism that ensures that a steady amount of the drug is released over time, improving efficacy by maximizing bioavailability while minimizing side effects and peak/trough fluctuations.
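The distinction between the two release patterns can be made concrete with a few lines of arithmetic. In the sketch below (a Python illustration with hypothetical rate constants and dose, not data from any marketed product), a zero-order form releases the same amount in every interval, while a first-order form front-loads its release:

```python
import math

DOSE_MG = 100.0  # total drug in the dosage form (hypothetical)

def zero_order_released(t_hours, k0=10.0):
    """Zero-order: a constant k0 mg/h is released until the dose is exhausted."""
    return min(DOSE_MG, k0 * t_hours)

def first_order_released(t_hours, k1=0.2):
    """First-order: release rate is proportional to the drug remaining,
    so the cumulative fraction released is 1 - e^(-k1 * t)."""
    return DOSE_MG * (1.0 - math.exp(-k1 * t_hours))

for t in (0, 2, 4, 8):
    print(t, zero_order_released(t), round(first_order_released(t), 1))
```

Printed side by side, the zero-order column climbs in equal 20 mg steps, while the first-order column releases roughly a third of the dose in the first two hours and then tails off, exactly the peak-and-trough behavior zero-order designs were meant to avoid.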

Two revolutionary developments helped to make zero-order release a reality. The first was the osmotic pump, a technology patented in the 1980s by ALZA, which would later be acquired by Johnson & Johnson. The technology, called OROS, consisted of an osmotic core and a semi-permeable membrane. As soon as a swallowed OROS-based drug reached the stomach, water would be drawn by osmotic pressure through the membrane to saturate the drug, expelling it gradually as a liquid through tiny laser-drilled delivery orifices in the membrane. Osmotic pump tablets helped to reduce side effects and kept bloodstream levels of the drug regular and predictable.

IMAGE COURTESY OF DAVID CIPOLLA, PHD.
Figure 1. Example of Rapid Absorption of Small Molecule Drugs From the Lung. AERx Morphine (8.8 mg loaded dose) vs. IV Morphine (4 mg). Absorption is shown for a small molecule, morphine. Absorption is rapid, and once the drug is in the bloodstream, its clearance mimics that of IV administration. Note that the bioavailability is almost 100%. The 8.8 mg loaded dose in the packet translated into about a 5 mg lung dose, based on a 60% delivery efficiency.
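The caption’s dose arithmetic is straightforward to verify: the lung dose is simply the loaded dose scaled by the delivery efficiency. A quick check, using the figures cited in the caption:

```python
loaded_dose_mg = 8.8   # AERx packet loaded dose, from the caption
efficiency = 0.60      # delivery efficiency, from the caption

lung_dose_mg = loaded_dose_mg * efficiency
print(round(lung_dose_mg, 2))  # 5.28, i.e. "about 5 mg"
```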

Functional Polymers Emerge

This technology is still used in drug formulation today, but it has been eclipsed somewhat by the development of functional polymers, which are used to coat single-unit tablets. These polymers simply act as excipients for the drug itself and can take a number of forms, such as ion exchange resins, polymeric adsorbents, cellulosic polymers, and polymeric coatings. Functional polymers not only allow for extended release, but can also mask the taste of a drug and improve its chemical and physical stability.

As the use of functional polymers evolved, multiple layers of coatings could be used to control the drug’s release. You might, for example, have an outer release control polymer, an inner protective coating or additional release-control layer, and then the drug’s core. “With advanced technologies today, you can also make very small mini-tablets,” said Dr. Lan. “A film-coated, single-unit tablet can now achieve the same drug release profile as the osmotic pump, without the need for drilling holes in the tablet.”

Another advance in this field, in the 1990s, involved the development of multi-particle drugs. “Originally, people were always trying to make it simple. The drug is included in a single tablet, it’s released, that’s it,” said Dr. Lan. “But multiple-unit dosage forms allow for increased uniformity of plasma levels and better reproducible bioavailability. Mini-matrix tablets combine the advantages of multiple unit dosage with those of matrix tablets, as their manufacturing technique is well established.”

“If you have multiple small particles, even if some have problems, the rest are still good, so you prevent ‘drug dumping,’ which is another advantage over single-unit tablets,” Dr. Lan said.

Another relatively recent method of controlled release is the “floating” tablet. Floating tablets have relatively low density—about the same as water—so they float on the gastric contents, prolonging the drug’s residence in the stomach and its contact with the absorbing mucosa. This is particularly useful for drugs that are poorly soluble and those that have poor bioavailability.

Yet another advance in oral delivery that came about in the early 1990s was the use of penetration enhancers—generally, combinations of fatty acids—to help enhance the bioavailability of poorly soluble drugs. “Elan was a propagator of a fair amount of that,” said Dr. Soltero. “It was based on a capric acid moiety and was used to get poorly soluble drugs, even some small peptides, into the bloodstream. Emisphere even claimed they had something that would get proteins through the GI tract, as did Novex, but I always thought that was more magic than reality.”

In oral delivery today, that’s still the biggest need, Dr. Soltero said: for poorly soluble drugs to be delivered successfully with good bioavailability. “There’s only one easy method for doing that that’s been successful so far, and that’s been increasing the number of particles by decreasing the particle size and thus increasing the total surface area. For that, nanotechnology is opening things up. Another thing that’s been effective has involved micelles, using things like surfactants or other ingredients to try to utilize the oil/water properties of the drug so that it can get across the GI tract.”
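The particle-size approach Dr. Soltero describes rests on simple geometry: for a fixed mass of drug divided into uniform spheres, total surface area scales as the inverse of particle radius, so milling from 5 µm down to 200 nm multiplies the dissolving surface 25-fold. A brief sketch in Python (the sizes and density are illustrative, not drawn from the article):

```python
import math

def total_surface_area_mm2(mass_mg, density_mg_per_mm3, radius_um):
    """Total surface area of a fixed drug mass split into uniform spheres."""
    r_mm = radius_um / 1000.0
    particle_volume = (4.0 / 3.0) * math.pi * r_mm ** 3
    n_particles = (mass_mg / density_mg_per_mm3) / particle_volume
    return n_particles * 4.0 * math.pi * r_mm ** 2  # simplifies to 3m / (rho * r)

milled = total_surface_area_mm2(100.0, 1.3, 5.0)  # 5 um jet-milled particles
nano = total_surface_area_mm2(100.0, 1.3, 0.2)    # 200 nm nanoparticles
print(nano / milled)  # ratio equals 5.0 / 0.2, i.e. 25x more surface
```

Because dissolution rate is proportional to the wetted surface area, that 25-fold gain in surface is what makes nanosizing effective for poorly soluble compounds.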

IMAGE COURTESY OF DAVID CIPOLLA, PHD.
Figure 2. AERx Morphine Post-Op Pain Result: Pain Intensity Visual Analog Scale (VAS). This figure shows that if you achieve similar pharmacokinetic profiles via inhalation, then you also achieve an effective pharmacodynamic response. (LOCF stands for last observation carried forward.)

Piercing Needs

Of course, not all drugs are orally delivered. For parenteral formulations, particularly given the recent growth of biologics such as proteins and peptides, the past several decades have seen an evolution toward convenience for the patient.

“We started out with a lot of lyophilized products, which addressed the concerns about stability at room temperature,” said David Cipolla, PhD, senior director of pharmaceutical sciences for Aradigm Corporation. “However, these weren’t very convenient. Patients or healthcare professionals needed to reconstitute them, add saline or another buffer, and mix—sometimes for many minutes—to get the drug to dissolve. Then they had to transfer it into a syringe or injection device. When I was at Genentech, we developed the first growth hormone liquid formulation, Nutropin hGH, which removes the reconstitution step. Now most biological products are in liquid formats to provide greater convenience.”

For regular users of injectable medications, such as diabetics who need insulin, the past two decades have been marked by the development of pen-based injection systems. “There are now systems that have their own cartridges, designed especially for short-term duration; the drugs can spend a couple of weeks or a month in an injection system. To do that, preservatives had to be added to ensure that the formulation would remain stable and to maintain sterility,” Dr. Cipolla said.

Today, some injection devices shield the patient from ever seeing the needle—and some don’t use needles at all. In July 2009, the Food and Drug Administration (FDA) approved Sumavel DosePro, Zogenix’s needle-free sumatriptan injection for acute migraine and cluster headaches. “The liquid is in a half-milliliter container, a small glass vial, powered by a nitrogen cylinder,” Dr. Cipolla said. “The patient pushes the device up against their skin, the cylinder expels nitrogen gas and presses on the plunger, shooting liquid out through a tiny orifice in the device at such high force that it enters the skin and flows into the subcutaneous fat space. They’re not advertising it as pain-free. There’s still some sensation but it’s faster than human reaction time, so by the time you hear it actuated, the injection is complete.”

Another approach is the use of inhalation. Aradigm Corporation, for example, has developed the AERx pulmonary drug delivery platform for drugs such as morphine (see Figures 1-3).

IMAGE COURTESY OF DAVID CIPOLLA, PHD.
Figure 3. Aerosol Particle Size Influence on Lung Deposition Using the AERx System. Gamma scintigraphic studies are often used to determine the amount of (labeled) drug delivered to the lung and to some extent the deposition within the lungs. Three-dimensional single-photon emission computerized tomography studies like those above can give a clear picture of deposition in the lung (same subject across the four images using various AERx configurations). Clearly, at constant inhalation flow rates, with near monodisperse aerosols (GSD ~ 1.3), particle size affects deposition site.

The Solubility Challenge

As peptides, proteins, and now monoclonal antibodies have become increasingly important, drug formulators have been challenged to increase their solubility, because for many of those drugs, the required dose is very high. “The challenge has been, how do you get it all into a volume of formulation small enough to inject?” Dr. Cipolla asked. “If the viscosity becomes too high, it’s very hard to push the drug through the syringe. If you use a very wide-bore needle, that’s not too comfortable for the patient.”

That’s what happened with Nutropin Depot, a sustained-release growth hormone designed using Alkermes’ ProLease technology and brought to market by Genentech and Alkermes in December 1999. “It would be delivered once or twice monthly instead of daily or every few days, using microspheres that released the hormone over a long duration,” said Dr. Cipolla. “It sounded great, but the 21-gauge, half-inch needle was not well accepted. The manufacturing COGs [cost of goods] and market economics weren’t working well, either—so it was eventually withdrawn in 2004.” Alkermes has had greater success applying its sustained-release microsphere technology to deliver risperidone for schizophrenia and naltrexone for alcohol dependence.

Another innovation in parenteral delivery over the past 10 to 15 years has been reduced particle size in suspensions. “In the past, material was lyophilized, then jet-milled down to a few microns in size. But now you can create nanoparticles that are submicron in size,” Dr. Cipolla said.

“One of the leading early technologies was ALZA’s. They were able to create high concentrations of nanosuspensions, and with the smaller particles, have much greater surface area relative to the volume of the drug. For something that’s poorly soluble, the surface area guides how rapidly the drug can be dissolved. This approach is being used in nanoparticles for pills as well as nanosuspensions for injection,” he added. (One of the first nanosuspension products was, in fact, oral—Wyeth’s Rapamune [sirolimus] oral solution, approved by the FDA in 1999.)

In the early 2000s, the development of Captisol (a modified cyclodextrin) gave solubility another great leap forward. “These look like little donuts,” said Dr. Cipolla. “If you have a drug that’s poorly soluble, the cyclodextrin can help solubilize it. The hydrophobic region of the drug can position itself within the ‘donut’ and increase its solubility by a factor of 10 to 100.” The first two approved drugs using the Captisol formulation were Zeldox/Geodon for injection (ziprasidone mesylate), a Pfizer therapy for schizophrenia approved in Europe in 2002, and Vfend I.V. (voriconazole), a Pfizer antifungal that became the first Captisol-enabled drug to receive U.S. regulatory approval in the same year.


Pegylation

In the early 2000s, pegylation swept the market. This covalent attachment of polyethylene glycol polymer chains to a drug helps to shield the agent from the host’s immune system, prolongs its circulatory time by reducing renal clearance, and can improve the water solubility of hydrophobic drugs.

“The drug you saw pegylation first applied to was interferon alpha. PegIntron and Pegasys were the first two versions,” said Dr. Cipolla. “They reduced the frequency of administration from every other day to once a week, because the systemic half-life is increased and the drug stays in the bloodstream much longer.”
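The dosing-interval arithmetic behind that change follows from first-order elimination: the fraction of drug remaining after an interval t is 2^(-t/t_half), so a longer half-life keeps trough levels up over a week the way a short half-life does over a day or two. A sketch with illustrative round-number half-lives (not PegIntron’s or Pegasys’ actual pharmacokinetics):

```python
def fraction_remaining(interval_h, half_life_h):
    """First-order elimination: fraction of the dose still in circulation
    after one dosing interval."""
    return 2.0 ** (-interval_h / half_life_h)

# Illustrative round numbers: an ~8 h half-life dosed every other day
# vs. a pegylated ~80 h half-life dosed once a week.
short = fraction_remaining(48, 8)        # 2**-6, about 1.6% left at 48 h
pegylated = fraction_remaining(168, 80)  # about 23% left at one week
print(short, pegylated)
```

A roughly tenfold-longer half-life leaves the one-week trough far higher than the unmodified protein’s 48-hour trough, which is why the dosing interval could be stretched from every other day to weekly without losing coverage.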

“Other polymer chains are also being added to proteins and peptides to keep them around longer,” said Dr. Soltero. “They’re changing the formulations by modifying the molecule itself. In terms of drug delivery, that means we’re going full circle in a way—back to how do we change the drug itself so that it has the characteristics of a long-term formulation, without worrying about the formulation itself to make it happen.”

Shaw is a freelance writer based in Montclair, N.J. Reach her at ginashaw@vagabondmedia.com.