Friday, February 26, 2010

Sizing Up Particle Analysis

By James Netterwald, PhD, MT (ASCP)
Over the last several years, more methods have been standardized

In 2004, The AAPS Journal published a report by Burgess and colleagues on particle size analysis, covering everything from the fundamentals of particle structure and morphology characterization to measurement. The report, co-sponsored by the U.S. Food and Drug Administration (FDA) and the United States Pharmacopeia (USP), was based on a 2003 American Association of Pharmaceutical Scientists (AAPS) workshop on particle size analysis. Together, the workshop and report (Burgess et al., AAPS J. 2004;6(3):23-24) reviewed what was then the current thinking about particle analysis.
Since that workshop and the published report, progress has been made in particle engineering of pharmaceutical products, but experts continue to call for harmonization of particle analysis methods. Quality by design approaches, sampling and sampling distribution methods, and methods for particle shape analysis have all been cited as areas that have improved since 2004, with progress expected to continue.
Particles have physical properties, as well as properties that can be attributed to the manufacturing process, particle “processability,” and quality, all of which determine the in vitro and in vivo performance of the particle. “There are a number of factors that affect particle size measurements, and this report touched on a number of them,” said William Kopesky, vice president of analytical services at Particle Technology Labs, Downers Grove, Ill. “The report covered a variety of techniques available for particle size determination; each presents its own pros and cons. Each technique should be applied to a different situation or set of samples depending on the question the researcher is trying to answer.”
The report also outlined the complexities and subjectivity of selecting an appropriate particle-sizing method and discussed how to express data generated by such methods. One problem identified by the report was that sieving methods failed to control particle size tightly enough to ensure consistent extended release from tablets.
The wide range of available dosage forms—solids, aerosols, suspensions, emulsions, liposomes, microspheres, and nanoparticles—was included in the report, as were the difficulties encountered in the size analysis of particles as diverse as those found in solid dosage forms and dispersed systems. For example, the impact of particle size on the therapeutic efficacy of an inhalation product “renders clinical bioavailability and bioequivalence potentially irrelevant,” the report noted.
Now, even if you have a handle on particle size, you still have to consider how shape plays a factor in the instrumental result.
—William Kopesky, Particle Technology Labs
Predictable therapeutic targeting, dose delivery, pharmacokinetics, and pharmacodynamics of liposome-delivered drugs are dependent on control of vesicle size. The report emphasized that sampling methods can affect both how particles are analyzed and how data generated by an analysis are expressed.
The overall purpose of the AAPS workshop was, according to the report, “to acknowledge the importance of each of these factors and to provide a forum for debate and discussion for individuals from all sectors of the scientific community with interest in pharmaceutical particle size analysis.”
Issue Is Harmonization
Andy Dunham, PhD, senior research director of technology resources at Baxter Healthcare Corporation, Deerfield, Ill., recalled the state of the field when the workshop was held. “Around 2004, particle engineering and dosage form design were starting to grow rapidly in the pharmaceutical industry. The industry recognized the need to establish common points for discussion and definition regarding process control and product characterization techniques.”
According to Kopesky, the workshop “was trying to illustrate that there has been and continues to be a harmonization issue using different techniques. With all the available different sizing techniques, each one working on a different analytical principle, you do get different data sets. One of the hurdles that everyone in this industry battles on a daily basis is comparing between analytical techniques, for example, microscopy versus laser diffraction … or even on the same analytical principle, say between different instrument manufacturers,” Kopesky said.

A Malvern Morphologi G3S Particle Characterization System, a state-of-the-art automated image analyzer.
“I think that we’ve seen some standardization of methods over the last several years, but we are still not necessarily there yet on all particle sizing technologies. … To my knowledge, there has been more standardization of methods since the report was published,” Kopesky added. For example, in 2005, the USP released a monograph, “Light Diffraction Measurement of Particle Size” (USP <429>), which outlined a technique commonly used in the pharmaceutical industry as manufacturers have moved away from sieving to obtain better resolution of particle size distribution.
The International Conference on Harmonisation (ICH) has pulled the USP, the Japanese Pharmacopoeia, and the European Pharmacopoeia together “to get everybody on the same page when it comes to laser diffraction, describing the technique, and trying to specify more about what the exceptions and criteria are for validating a particle size method. Diffraction has been used routinely in our laboratory due to its popularity, and we reference the USP <429> document, which has been very helpful in the pharma industry, especially to show compliance and to standardize the technique,” Kopesky said.
Determination of Particle Shape
Particle shape determination is another current concern in the field of particle engineering. The 2004 report stated that “the measurement and expression of particle size is intimately bound with the shape and morphology of the constituent units that make up the ensemble of particles.” The importance of particle shape was described within the context of measurement using different analytical methods. According to the report, if all particles were spherical, such measurements would have low variation between methods. Because pharmaceutical particles are rarely, if ever, spherical, however, measurement can be problematic. This challenge highlights the importance of particle shape and morphology.
“People have often been content with equivalent sphere diameter … but now they are saying that is not good enough anymore. Now, even if you have a handle on particle size, you still have to consider how shape plays a factor in the instrumental result,” Kopesky said. Major industry manufacturers have also adopted particle shape image analysis approaches that can be used in conjunction with more classic techniques like laser diffraction-based approaches.
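The equivalent-sphere idea Kopesky refers to can be sketched in a few lines of Python: a sizing instrument reports the diameter of the sphere whose volume matches the particle's, so two very differently shaped particles can report the same "size." The rod-shaped particle below is purely hypothetical.

```python
import math

def volume_equivalent_diameter(volume):
    """Diameter of a sphere with the same volume as the particle."""
    return (6.0 * volume / math.pi) ** (1.0 / 3.0)

# Hypothetical rod-shaped particle: a cylinder 2 um in diameter, 10 um long
rod_volume = math.pi * (1.0 ** 2) * 10.0  # pi * radius^2 * length, in um^3
d_eq = volume_equivalent_diameter(rod_volume)
print(round(d_eq, 2))  # ~3.91 um, despite the particle being 10 um long
```

A shape-blind instrument would report this elongated particle as a ~3.9 um sphere, which is exactly why image-based shape analysis is used alongside diffraction.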
Particle size analysis is just one element of the quality of the product, and much of the quality analysis of particles has been subsumed by quality-by-design approaches, according to Anthony Hickey, PhD, DSc, professor of molecular pharmaceutics, Eshelman School of Pharmacy, University of North Carolina at Chapel Hill. “At around the time of this report, there was a new initiative that was just coming out of the FDA on quality by design. There were also new analytical technologies that enabled you to measure product properties in real time or online, with feedback, so that you could actually ensure the product’s quality without necessarily having to go back and remake it. You could do back sampling and continuous processes that would give you very high quality products.”
According to Dr. Dunham, progress has been made in particle size analysis since the 2004 report. The USP adopted written standards regarding lipid emulsions (USP <729>), and the FDA emphasized the need for improving the characterization of protein aggregates. Although Baxter is not currently participating in the working group, the company sees a continuing need for the outputs and actions described in the 2004 report. For example, the impact of particle size distribution on product safety and performance of emerging pharmaceuticals should be better understood.
Also, the report described the importance of developing descriptors for particle size distribution data. A mass-weighted descriptor, rather than one that is number- or surface area-weighted, is the most relevant descriptor of the content of active pharmaceutical ingredient as a function of particle size, according to the report.
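The weighting distinction the report draws can be made concrete with a toy calculation. For particles of constant density, mass is proportional to diameter cubed, so a mass-weighted mean is dominated by the coarse fraction, where most of the active ingredient resides. The particle sizes below are hypothetical.

```python
def number_weighted_mean(diameters):
    # Simple arithmetic mean: every particle counts equally
    return sum(diameters) / len(diameters)

def mass_weighted_mean(diameters):
    # Weight each particle by its mass (proportional to d^3 at constant
    # density); this is the De Brouckere mean, dominated by coarse particles
    return sum(d ** 4 for d in diameters) / sum(d ** 3 for d in diameters)

# Hypothetical sample: many fines plus a few coarse particles (sizes in um)
sizes = [1.0] * 90 + [10.0] * 10
print(number_weighted_mean(sizes))  # 1.9 um: the fines dominate by count
print(round(mass_weighted_mean(sizes), 2))  # ~9.92 um: coarse dominate by mass
```

The two means differ by a factor of five for the same sample, which is why the report insists on stating which descriptor is being used.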
Setting Standards
But not everyone agrees that significant progress has been made. “I don’t think [the AAPS report] had a huge effect on the field because, in the conclusion, the suggestion was that we would have another meeting three years later, and I don’t think it ever happened,” said Dr. Hickey. The conclusion stated that “improvement of currently accepted methods for particle size analysis of pharmaceutical products will require ongoing participation by those involved with this activity. … A second meeting was proposed that would occur at a defined period following the first meeting (two years) to review the effect of the action items, and the passage of time, on industry and regulatory practice.”
Particle engineering and dosage form design were starting to grow rapidly in the pharmaceutical industry. The industry recognized the need to establish common points for discussion and definition regarding process control and product characterization techniques.
—Andy Dunham, PhD, Baxter Healthcare Corporation
Other experts contend that there was sufficient coverage of methods of particle analysis and therefore no need to develop additional techniques. Dr. Dunham said that “setting standards for how to use this technology is a current need for the industry and will require technically sound agreement between academia, industry, and regulatory bodies on the relationships between particle characteristics and product performance.”
Kopesky agreed that harmonization of methods is still a critical need. “The difference in the analytical techniques themselves is just one of the common problems that continue to hinder the advancement of particle size testing. Each analytical technique looks at a different parameter or different view of the particle because you are dealing with a 3-D object that is very often nonspherical and non-monodispersed. Also, there are several different variables to be measured by each technique, and each one is looking at a different set, so comparing between different techniques can be difficult.”
According to Dr. Hickey, “every particle sizing method has a range of functionalities. There is no one method that will let you measure particles of all scales and scrutinies, from boulders down to nanoparticles. The key is representative sampling, so that you are looking for something that reflects the properties of the product you are working with.”
The 2004 report clearly stated the need for harmonization of particle-sizing methods, Dr. Hickey said. “There is a need for written and physical standards for calibration of all particle-sizing methods. These standards should be reproducible, sensitive [for product control], and accurate, if absolute particle size is important.” Moving forward, harmonization will likely be a key issue in the field.
Dr. Netterwald is president and CEO of BioPharmaComm LLC. Reach him at
Impact of Particle Size on Aerosols

In 2004, the AAPS Journal published a report on particle size analysis, based on a workshop held the previous year, in which inhaled aerosols were identified as highly dependent on particle size and morphology. Anthony Hickey, PhD, DSc, professor of molecular pharmaceutics in the Eshelman School of Pharmacy, University of North Carolina at Chapel Hill, is an aerosol particle engineer who has done a good deal of particle sizing for other reasons.
Particle size is particularly important to the efficacy of an aerosol, Dr. Hickey said. “One of the things that came out of the 2003 AAPS workshop was that there are a lot of issues related to how to sample powder forms or droplet forms in the first place. … It is clear that sampling is key to really understanding your product.”
Because particle size is such an important element in the efficacy of inhaled drugs, particle size analysis must reflect the chemistry of the drug so that what is actually measured is the particle size of the drug, not just general particle size. “A lot of these aerosol products are not just drugs but are excipients or propellants or other things.”
Our understanding of laser diffraction, and our ability to build bigger and better detectors and to develop algorithms that better reflect particle behavior and size, has come a long way, Dr. Hickey said. “Even more mechanical methods of sampling, such as inertial sampling, now have new impactors that are more suited to pharmaceutical products such as aerosol particles.”

Mass Spec Evolution Drives Field’s Growth

By Gina Shaw
Advanced MS techniques and tools have revolutionized the pharma lab
Editor’s Note: This article on the history and impact of mass spectrometry in the pharmaceutical industry is the second in a new series for Pharmaceutical Formulation & Quality. In “PharmaTools: Technologies That Changed Pharma and Biotech,” we look at various technologies such as mass spectrometry that have played a key role and had an indelible impact on the pharma and biotech industries. In our next two issues, we will examine the evolution of liquid chromatography and gas chromatography. To view other mass spectrometry materials, please see below.
Even in the midst of a world economic downturn, some products continue to attract buyers. Within the pharmaceutical industry, one standout is mass spectrometry (MS) systems. According to Strategic Directions International, a market research firm that tracks instrument business trends, the mass spectrometry market, already a $2 billion annual concern, is expected to grow at a 9% annual rate through 2012.
“The market will be led for the foreseeable future by the more advanced methods, including Fourier transform (FT)-MS, tandem LC-MS, and quadrupole time-of-flight (QTOF) LC-MS,” the report states.
In the last issue of Pharmaceutical Formulation & Quality, we looked at the overall history of mass spectrometry in the pharmaceutical industry and how its evolution from the “expert-from-Switzerland” mode to easy-to-operate, benchtop tandem LC-MS systems has made it a ubiquitous tool in virtually every pharma lab (“Bringing Mass Spec to the Masses,” September 2009, pgs. 16-21). In this issue, we’ll explore the history of some of the more advanced methods and adaptations of mass spectrometry that are now helping to drive the field’s growth.
One technology that waited decades for its time to come is time-of-flight (TOF). W.E. Stephens, at the University of Pennsylvania, developed the concept in 1946. In 1948, Cameron and Eggers, at Oak Ridge Laboratories, built the first TOF instrument with very low mass resolution. But it took more than five decades for improvements in electronics, software, and engineering design to make TOF mass spectrometry the indispensable industry tool it is today.
“The issue was fundamentally low performance in terms of analytical service, resolution, sensitivity, and mass accuracy,” says Iain Mylchreest, PhD, vice president and general manager of life sciences mass spectrometry for Thermo Fisher Scientific. “Quadrupoles were much easier to interface with, so TOF took a back burner until the supporting technology and the means to interface with that technology caught up.”
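The underlying time-of-flight relationship helps explain why electronics held TOF back: heavier ions arrive later, but the arrival-time differences to be resolved are tiny. This is a simplified sketch of an idealized linear TOF; real instruments add reflectrons, extraction delays, and other corrections, and the instrument parameters below are illustrative only.

```python
import math

ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs
AMU = 1.66053906660e-27              # kg per atomic mass unit

def flight_time(mass_amu, charge, accel_voltage, tube_length):
    """Idealized linear-TOF flight time: t = L * sqrt(m / (2 * z * e * V))."""
    m = mass_amu * AMU
    velocity = math.sqrt(2 * charge * ELEMENTARY_CHARGE * accel_voltage / m)
    return tube_length / velocity

# Two singly charged ions one mass unit apart, 20 kV acceleration, 1 m tube
t1 = flight_time(1000.0, 1, 20000.0, 1.0)
t2 = flight_time(1001.0, 1, 20000.0, 1.0)
print(t1)       # roughly 16 microseconds
print(t2 - t1)  # roughly 8 nanoseconds: the gap the electronics must resolve
```

Separating adjacent masses thus demands nanosecond-scale timing, which commodity electronics could not deliver until decades after the concept was published.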
Quad TOF Vital for Proteomics

This image shows a 1990s-vintage mass spectrometer made by a company that is now part of Waters Corp. At the time, it was the largest, and arguably the most complex, spectrometer the company ever built. Only three of the systems were shipped; the systems took an engineer months to install. The advanced mass spec technologies available today are smaller and much easier to install.
TOF made a key leap forward in 1984 when Gary Glish, PhD, now the president of the American Society for Mass Spectrometry, published the first paper (in Analytical Chemistry) on quadrupole TOF mass spectrometry. “The evolution of the quad TOF over the past two decades has been very important, particularly for proteomics and metabolomics,” said Gary Siuzdak, PhD, senior director of the Scripps Center for Mass Spectrometry in La Jolla, Calif. “Having quad in the front and TOF in the back gives you accurate mass measurements.”
Previous quad-TOF instruments notoriously lacked robustness and accuracy, but that has changed in recent years, according to Dr. Siuzdak. “Previously, they tried hard to achieve five PPM [parts per million] accuracy, but more often than not the range was more like 20 or 30. Now, with new techniques and improved detectors, they’re routinely getting sub-five PPM accuracy, and in some reports I’ve even heard of sub-part-per-million accuracy.”
Engineering improvements, such as changing the composition of the flight tubes, have made this possible. “They’re now making them out of a ceramic-like material that has a very low coefficient of expansion, so even if the temperature changes in the room, the flight tube won’t change,” Dr. Siuzdak said. “When you’re talking about PPM accuracy, having a flight tube change in size can make a pretty big difference.”
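The PPM figures quoted here translate into very small absolute mass differences, as a quick calculation shows; the m/z values below are hypothetical.

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass accuracy expressed in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical peptide ion with theoretical m/z 500.2500
print(round(ppm_error(500.2525, 500.2500), 2))  # ~5.0 ppm: a 0.0025 Da offset
print(round(ppm_error(500.2515, 500.2500), 2))  # ~3.0 ppm: within a 5 ppm window
```

At m/z 500, five PPM is only 2.5 millidaltons, which is why even slight thermal expansion of a flight tube matters.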
The evolution of the quad TOF over the past two decades has been very important, particularly for proteomics and metabolomics.
—Gary Siuzdak, PhD, Scripps Center for Mass Spectrometry
Quad-TOF instruments are particularly useful for proteomic and metabolomic accuracy. “Now, people can create a profile of the sample using quad TOF without generating MS-MS data and then do comparative analysis between the profiles and look at which peaks are changing significantly between different samples, such as with a benign versus malignant cancer sample,” Dr. Siuzdak explains. “Instead of getting fragmentation data on everything, you can go after what’s significantly different.
“It gives you a more manageable quantity of information, and the fragmentation you get is of higher quality, only focusing on the molecules that matter. This all means that we can do much more direct lead analysis.”
The coupling of TOF with MALDI (matrix-assisted laser desorption/ionization), a soft ionization technique that allows the analysis of biomolecules and large organic molecules that are vulnerable to fragmentation with conventional ionization, has advanced both tools, Dr. Mylchreest said. “It’s a natural marriage, since MALDI is a pulse technology and TOF deals with pulses of ions.”
Introduced before electrospray was in wide use, MALDI allowed the direct analysis of big proteins and peptides, something very difficult to do with the technology of the time. “It was attractive to the biologists, because it had gel spots you could directly analyze in the mass spectrometer and get some idea of molecular weights, which you could never do before. You had to use chromatographic methodologies, which were very inaccurate,” Dr. Mylchreest said.
Pioneered in the early 1980s, the first MALDI instruments were linear, single-stage instruments. “With those, the primary application within proteomics was measuring the mass of intact proteins,” says Ronan O’Malley, PhD, MALDI product manager for the Waters Corporation. “The next development in MALDI, in the late 1980s and early 1990s, was the introduction of a reflectron into the instrument, an ion mirror that has the advantage of lengthening the flight tube, thereby increasing TOF and improving the mass accuracy that can be achieved.”
But the earliest MALDI machines had significant disadvantages. “The M in MALDI is for matrix. In the early days, you never knew why you’d get a good signal with a matrix in one case and not in another,” said Richard Caprioli, PhD, director of the Mass Spectrometry Research Center at Vanderbilt University in Tennessee and a developer of MALDI MS imaging.
“We might play all day with sample preparation to get our signal and try to understand these things, which is fine in academia, but not what they’re getting paid for in pharma. Today, there is a much better functional and chemical understanding of how to get a better signal,” Dr. Caprioli said.
Advantages of MALDI MS

The Synapt G2 System is a quadrupole time-of-flight mass spectrometer that recently came on the market.
Dual-stage MALDI, which came on the scene around the turn of the 21st century, allowed people to select ions in the first stage of mass spectrometry and fragment in the second. “This gave much more specificity, allowing you to add fragmentation to the molecular mass experiments,” said Dr. O’Malley. “Proteomics was the mainstream market for this capability, but there were also applications in quality control for formulated compounds and for analyzing oligonucleotides.”
Not long after, MALDI sources were combined with orthogonal instrumentation. “With an orthogonal system measuring TOF from the pusher to detector, you don’t have to take into account the TOF from source to detector,” said Dr. Mylchreest. “That’s important in MALDI, because there can be variations in the uniformity of the analyte across the target plate. With an orthogonal system, that doesn’t matter anymore. This brought the advantages of high resolution and exact mass to MALDI.”
Another advantage that MALDI MS brought to the table is speed. “The ability to do 2-5K laser shots per second—and you only need 10 or even less to give you a good analysis—enables a really rapid screening process,” said Dr. Caprioli. “Other MS techniques, although very valuable, are much slower. LC-MS might take one to three hours, whereas MALDI would take you one second to acquire the same data.”
Like TOF, Fourier transform mass spectrometry (FTMS) also took decades to reach its full potential. First developed in the mid-1970s, Fourier transform ion cyclotron resonance (FTICR) mass analysis made FTMS applicable to the study of biomolecules. But FTMS and FTICR have only taken off within the last decade.
“You’re dealing with big magnet technology, which wasn’t that advanced back then,” said Dr. Mylchreest. “The magnets were huge, expensive, and weren’t shielded, and it took a long time to adapt that more academic technology for commercial usage.”
Today, FTICR offers ultra-high resolution along with impressive stability and accuracy. “It’s possible, with a skilled user, to get on the order of 500 parts per billion accuracy,” said Dr. Siuzdak. “This can really allow you to nail down elemental composition with relatively low ambiguity.”
FT has one significant downside: It is much more complex than TOF and quadrupole instrumentation. “FT instruments require a more advanced user,” said Dr. Siuzdak. “Within a day or so you can get reasonably familiar with [a] TOF or quadrupole instrument. With an FT system, especially the ICR instruments, it can take longer to learn all their aspects.” He added that some instruments on the market have made this easier, but they don’t offer resolution and accuracy as high as that attained from FTICR.
Ion Mobility Plays Important Role

Kevin Shanks, forensics manager at AIT Labs, prepares a time-of-flight mass spectrometer for use.
Driving the utility of many of these advanced instruments is ion mobility. As a technique, it’s been around for decades. Some of the first measurements were reported by researchers as early as the 1930s. Researchers at Bell Labs developed an instrument in 1967 that was “essentially an ion mobility drift tube combined with an orthogonal time-of-flight type analyzer,” said Alistair Wallace, PhD, Synapt product manager for Waters. But, just as with TOF, MALDI, and Fourier transform, ion mobility’s time had not yet arrived. “Electronics then were far less evolved, and the analyzers in use at that time were only capable of analyzing a single ion arrival event.”
It took another 25 years for pioneers like Michael Bowers’ group at the University of California, Santa Barbara, and David Clemmer’s at Indiana University to move ion mobility mass spectrometry into the modern age. “Today, ion mobility is consistently increasing and driving the performance one can get from things like a TOF analyzer,” said Dr. Wallace.
Today, a technician or a grad student can come along, put in samples, and get accurate measurements from one to five PPM. That’s been a massive change for industry.
—Iain Mylchreest, PhD, Thermo Fisher Scientific
He compares the TOF analyzer to a big molecular dustbin. “You throw thousands of ions in there, and at the end of the day the limiting factor is the speed with which you can acquire the data. The faster and more powerful the electronics are, the more you can get out,” Dr. Wallace said.
Introduced in 2006, Waters’ Synapt instrument takes advantage of tri-wave technology to perform ion mobility at the limits of MS detection as it is currently known. “It can trap and accumulate ions prior to ion mobility separation, and the tight radial confinement of the T-wave enables you to get very high transmission of ions—nearly 100%—through the entire device,” said Dr. Wallace.
The impact of all of these advanced MS technologies on the pharmaceutical industry has been nothing short of revolutionary. “They’ve opened up new areas in terms of very high resolution mass analysis and accurate mass, something that has always been a big challenge in pharma. A lot of us have big fish stories on accurate mass,” said Dr. Mylchreest.
“They also made these experiments available to every lab. Today, a technician or a grad student can come along, put in samples, and get accurate measurements from one to five PPM. That’s been a massive change for industry. Although they all have different characteristics as to what they can do in terms of resolution, capabilities, and performance, they all address the same market space: opening up drug metabolism and structural analysis. They’ve opened up areas in proteomics and peptide sequencing and characterization that simply couldn’t be done before,” Dr. Mylchreest said.
Shaw is a freelance writer based in Montclair, N.J. Reach her at
Mass Spectrometry Resources
For more information and tools that can help you learn more about mass spectrometry, check out the resources below from spectroscopyNOW.com, a free online resource for the spectroscopy community published by Wiley-Blackwell, which also publishes Pharmaceutical Formulation & Quality.
Mass Spectrometry: A Primer
Mass spectrometry is the characterization of matter through the separation and detection of gas-phase ions according to their mass-to-charge ratio (m/z). Prior to the mid-1980s to the early 1990s, mass spectrometry was primarily applicable to matter that existed in, or could be put into, the gas phase as neutral molecules (or atoms) before ionization could occur. This requirement (for the then-available ionization processes) limited applications of the technique to the analysis of matter that was volatile and not thermally labile. Since then, various desorption/ionization techniques have been developed that drive gas-phase ions from the condensed phase (either in solution or in a solid matrix) for separation according to their m/z values. To learn more, read the full article
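For multiply charged ions, such as the [M + nH]n+ species produced by electrospray, the observed m/z follows directly from this definition. Here is a minimal sketch using a hypothetical 10 kDa protein.

```python
PROTON_MASS = 1.007276  # Da, mass of the added proton per charge

def mz_for_charge(neutral_mass, n):
    """m/z of an [M + nH]n+ ion: (M + n * proton mass) / n."""
    return (neutral_mass + n * PROTON_MASS) / n

# A hypothetical 10,000 Da protein observed at several charge states
for n in (5, 10, 20):
    print(n, round(mz_for_charge(10000.0, n), 2))
```

Higher charge states bring large molecules into the limited m/z range of the analyzer, which is what makes electrospray so useful for intact proteins.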
Mass Spectrometry Bibliography for Beginners
This listing, which features books, journals, and Web sites, is intended for absolute beginners in using mass spectrometry.
Mass Spectrometry Glossary of Terms
The following is a list of some of the most common terms and abbreviations used in mass spectrometry. It is grouped under various headings to aid quick retrieval. The list has been compiled by Professor Anthony Mallet, University of Greenwich.
Mass Spectral Libraries - Reproducibility in EI, API and Tandem Mass Spectrometry
A number of publications have recently appeared in which attempts have been made to determine the reproducibility of tandem mass spectra libraries. For several decades, libraries of EI spectra have been used successfully for compound identification, especially from GC-MS data, while, in contrast, many difficulties seem to be present in trying to achieve the same success for tandem mass spectrometry data. This short essay attempts to answer why this difference occurs, what it is about EI spectra that makes them so reproducible across a wide variety of instruments, laboratories, and long periods of time, and what methods have been tried to improve the success rate of the latter. To learn more, read the full article at

Make Way for the Modular Cleanroom

By Lori Valigra
Companies adopt quicker-to-build facility and cleanroom designs
Competition and pressure to keep prices down are factors pushing biopharmaceutical companies to get new products to market faster. This increased pace means they must design and build new facilities and cleanrooms faster while keeping contamination control a priority. The semiconductor industry faced similar pressures and responded by building more highly automated cleanrooms run by fewer humans.
“Pharma is viewing new facilities like semiconductor companies did 15 years ago,” said Sterling Kline, director of project development at Integrated Project Services (IPS), a technical consulting and design company. It takes pharma companies three years to break ground, finish construction, and turn over the plant for qualification and validation. It takes semiconductor companies 12 months. “The pressures of Wall Street, pharmaceutical products coming off patents, healthcare costs, and return on R&D are changing the way pharma does projects. Pharma is now trying to build facilities that suit the product they think will be their next blockbuster.”
But pharma faces different challenges than do semiconductor companies when bringing in new technologies. The process from fermentation to formulation is not the same as the electronics industry and generally involves more people, said Thomas Hansz, president of the consultancy Facility Planning and Resources Inc. (FPR). “The process is semi-automated in pharma,” he said. “There are always people transferring things along.”
In addition, biopharma companies are generally slower to adopt new technologies. However, they are opening up to two new ideas: isolators, which are essentially closed glass hoods that surround manufacturing equipment and are accessible to humans through glove ports, and modular walls and ceilings that are prefabricated off-site. Both technologies are gaining traction after having ramped up for the past several years. And both minimize the impact of humans, who remain the biggest contamination risk by shedding hair and skin and breathing moisture into the air.
“The [biopharma] industry is a technology-driven business that is conservative at the same time. It makes improvements on what has worked before rather than wholesale changes. So it is evolutions, not revolutions,” said Bryan Phelan, managing partner at AdvanceTEC LLC, a contractor that designs and builds cleanrooms. And while pharma has resisted an influx of ideas from the semiconductor industry to some extent, it is starting to open up to cross-pollination. “People who built semiconductor plants 10 to 15 years ago now have careers with pharma plants,” Phelan said. There are areas that do not translate, however, such as airflow requirements and finishes on cleanroom surfaces.
Still, the industries are more alike than either wants to admit, said Tim Loughran, a consultant currently working with Cleanroom Construction Associates LLC, formerly with AdvanceTEC. “We did a project for Cambrex Biosciences where we applied microelectronics technology to a biotech research facility by using utility chases [a corridor behind the wall] that biotech and pharma hadn’t considered.”
Added Ralph Kraft, president of cleanroom services company R. Kraft Inc., “People need to look at their process from outside the box and ask, ‘What can I use from semiconductors in pharma?’”
Going Modular

Cambrex and Merck are among the growing number of companies installing prefabricated, modular walls and ceilings instead of traditional stud-and-drywall construction. One big difference is that the modules are cleaner than drywall, which puts particulates into the air when it is cut to size during construction or when a cart hits it in an operational plant. Manufacturers such as Plascore and AES Clean Technology build the modules off-site; they are then assembled at the manufacturing site, either a new plant or a retrofit.
Plascore’s wall panels are up to 4 feet wide and 12 feet long, and they are made with an aluminum honeycomb internal structure and a coating of antimicrobial unplasticized polyvinyl chloride (uPVC), which is said to hold up better than painted walls and is chemically resistant to cleaners. They also have coved (rounded) corners, chemically cold-welded seams, and as few ledges as possible to reduce particle trapping. “There is a general trend away from dry wall toward metal composite,” said Matt DeFer, product development engineer at Plascore. “It is a very nonporous finish.”
Decreasing latent particles can make a huge difference. “Modular systems create a much cleaner work site. If you do have to cut them, the aluminum honeycomb particles are small metal chips that fall directly onto the floor,” said AdvanceTEC’s Phelan. “When drywall gets cut, there are airborne particles, which do get wiped and filtered down, but a lot remain as latent particles that are everywhere, in the duct work, on surfaces, and on components.” He said companies have told him that it can take five to six months to requalify if latent particles are found during validation. Once a room is qualified, installing a new piece of equipment or changing the configuration of a modular room is a cleaner process and adds to the cleanroom design’s flexibility.
The pressures of Wall Street, pharmaceutical products coming off patent, healthcare costs, and the demand for a return on R&D are changing the way pharma does projects, said Sterling Kline of Integrated Project Services (IPS). Pharma is now trying to build facilities that suit the product it thinks will be its next blockbuster.
Phelan figures about half of new facilities are modular, and he expects that figure to grow as biopharma takes to the just-in-time building and installation of modular systems. “Fewer people are on the construction site, so there is less contamination,” he added.
The top surface of the modular ceiling panels is strong enough to walk on. That allows equipment, ductwork, and utilities to be installed in the overhead crawl space at the same time the cleanroom below is being built. “Sequencing the construction process can trim weeks to months,” Phelan said of building a factory. The walkable ceilings also allow the facility to be serviced from above, reducing the contamination risk.
Added Loughran, “There is more assurance of the validation of the facility because it is built cleaner.” He said the modules can be reconfigured more easily after a facility is built. Loughran said prefabricated modules allow a biopharma company to delay the decision to move forward with a production facility for six months or more. “They can get through Phase II and be in Phase III before they pull the trigger for a production facility,” he said.
Such modules played a role in late 2005 when Merck began expanding its West Point, Pa., roller-bottle vaccine processing facility to meet market demand. The 38,000-square-foot expansion included 14,000 square feet of current good manufacturing practice (cGMP) space. For the $52 million project, Merck set a goal of beating its historical best of 27 months to design and build a plant. Working with the consultancy IPS, it completed the project in a record 24 months.
Merck built a 10-foot interstitial space between floors so that it could work on two elevations in parallel and keep very heavy building equipment from damaging the ceiling, said Steve Franey, RA, a technical architect at IPS who was senior project architect for the expansion. In a presentation at an International Society for Pharmaceutical Engineering conference, Franey and Merck’s senior project engineer showed that the collaboration helped save six to eight weeks on the project, and the modular wall and ceiling system saved four weeks.
There is more demand for flexible and interchangeable facilities, but Hansz of FPR does not expect biopharma to go to standardized plug-and-play modules that any company can use. “They will always be tweaked,” he said.
Separating Humans from Machines

Isolators made by companies such as Skan AG, Bosch, and Steris are also gaining popularity among biopharma companies as part of a trend to decrease the amount of cleanroom space needed. Isolators are an alternative to restricted access barrier systems (RABS). While RABS use an aseptic filter in a cleanroom, contamination is possible each time a door opens. Isolators, by contrast, are pressurized and use vaporized hydrogen peroxide (VHP) as a decontaminating gas. The technology dates to the 1990s, when Lilly, Upjohn, and Merck used it with limited success; early systems required an 18-hour decontamination cycle, since improved to three hours using VHP.
IPS’s Kline said the Food and Drug Administration prefers closed systems such as isolators, and companies using high-value active pharmaceutical ingredients (APIs) are once again eyeing them to protect their investments. “Folks were not making changes until 2002. Products were at $1 to $2 per vial. But biologics are $100 per vial. They can’t afford to take risks,” said Kline. Isolator designs started in 2002 are just now up and running, and they are catching on, he said.
Piggybacking on that trend is the move toward disposable contact parts as an alternative to cleaning. “This is the latest trend in biologic API factories,” said Kline. “The contact part is disposed of every run.”
While isolators carry a bigger initial price tag, running from $9 million to $45 million compared to $1 million for a RABS, the overall cost to operate them is lower, Kline said, pointing to a Merck study. Isolators prevent contamination and are less expensive to build than a traditional cleanroom, and a RABS needs another $5 million of investment for aseptic air; with isolators, there is a lower bioburden to be filtered, Kline said.
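As a rough illustration of the trade-off Kline describes, the capital figures quoted above can be plugged into a simple break-even comparison. The annual operating costs in this sketch are hypothetical placeholders chosen only to show the mechanics; the article does not give them.

```python
# Break-even sketch: isolator vs. RABS total cost of ownership.
# Capital figures come from the article; the annual operating costs
# below are hypothetical placeholders, not reported numbers.

ISOLATOR_CAPITAL = 9_000_000          # low end of the $9M-$45M range cited
RABS_CAPITAL = 1_000_000 + 5_000_000  # base RABS plus $5M aseptic-air investment

def cumulative_cost(capital, annual_opex, years):
    """Total cost of ownership after a given number of years."""
    return capital + annual_opex * years

def break_even_year(cap_a, opex_a, cap_b, opex_b, horizon=30):
    """First year at which option A becomes cheaper than option B."""
    for year in range(1, horizon + 1):
        if cumulative_cost(cap_a, opex_a, year) < cumulative_cost(cap_b, opex_b, year):
            return year
    return None

# Hypothetical: isolator runs at $1M/yr, RABS at $2M/yr (cleanroom HVAC, gowning).
year = break_even_year(ISOLATOR_CAPITAL, 1_000_000, RABS_CAPITAL, 2_000_000)
print(year)  # -> 4 under these assumed operating costs
```

Under these assumptions the isolator's higher up-front cost is recovered in a few years, which is consistent with the direction of Kline's argument, though the actual crossover depends entirely on the real operating numbers.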
With isolators in place at various stages of the biopharma process, Kline said clients are starting to talk about not needing separate filling and formulation rooms. Combining the two would decrease line losses on pipes, he said. “I haven’t seen anyone do this yet, but we are discussing it with folks.”
Nanoproducts, which can penetrate the cell membrane, are on the horizon and bring new challenges. Kline said the FDA is looking at future contamination control for nanotech.
Improving Processes, Communications
Kraft, Hansz, and others see room for improvement in biopharma when it comes to communications among people involved in new factory design. “They need to build better relationships with support organizations, more of a partner than a vendor relationship,” Kraft said, adding that there should be more openness in the biopharma corporate culture.
“We’ve been getting people from the marketing side involved,” said Hansz. Adding marketing to the production, manufacturing, finance, and research staff discussions is a recent trend. “They can have input into the future product lines and what is involved. It gives a glimpse into what is down the road.”
Bringing the disparate parties into one room can also reduce process steps and trim the amount of floor space required; one client was able to eliminate 1,500 square feet of floor space, Hansz said. “They can rethink their processes and use automation effectively. And they can decrease the volume of contamination control spaces, decrease the amount of space in which equipment needs to be exposed to the clean environment.”
Kline added that it is important to have everyone who represents the process or the product design around the same table, using a team approach during the project’s conceptual phase. They should pick the project’s electives and must-haves. “The challenge is being part of projects without that type of input. Sometimes a project is 50% to 70% done, and the people who occupy the room get into it too late,” he said. “A cross-functional team early on is the key to success.”