Monday, November 29, 2010

Step By Step Towards Selecting The Perfect Workbench


James Anderson
Purchasing a workbench or workstation may at first seem like a simple task.
Your employees have work to do, and they need an efficient, comfortable, and practical place to do it. But behind the deceptively simple proposition of purchasing a workbench may lurk a number of variables that must be considered to make sure you get what you actually need.
Employee needs vary widely among industries and applications. What’s perfect for an automotive dealership won’t work in a laboratory. What works for manufacturing facilities just won’t fly for a classroom setting. And a configuration that suits one laboratory may not be appropriate for another.
So, whether you are looking for technical workstations, height-adjustable workstations, assembly workstations, industrial benches, packing and shipping benches, or accessory systems, take the time to perform the necessary upfront work by following this step-by-step self-examination that will help you choose the right workbench for all your needs.
The number one consideration–what work are you doing on the bench?
There’s one overriding consideration that will affect just about every aspect of your workbench purchasing decision: what work will you be performing on it? The answer to this question will affect everything from the size of the workbench to the surface material, to storage requirements, to ergonomic considerations.
Once you determine what work will be done on the bench, do an analysis of tasks associated with the work and use it to make a checklist of features needed to perform them. For example, say you’re in the business of assembling and maintaining cell phones, and you need to furnish a workspace for your repair technicians. You want a small workbench, perhaps one that is height-adjustable to bring the detailed repair job up to an optimal work zone and distance. Along with the workbench, you will also need an excellent lighting accessory. You’ll likely also need bins above the work surface to provide direct access to small parts, and an articulating arm that can hold assembly guidelines or diagrams. And depending on the flow of your repair operations, you might want to consider a material transfer work surface, or even a conveyor workstation, both of which can cost-effectively expedite material handling.
Or maybe you’re working in a pharmaceutical lab, where the work surface material becomes a more important part of the decision. Depending on the liquids and solids you’re handling, you might want either a stainless steel or epoxy resin chemical-resistant work surface to ensure long-lasting durable use. If your laboratory is in a cleanroom environment, your workstation will need to meet certain NSF International public health and safety standards. You might also need to store a combination of small beakers and instruments with large testing equipment–requiring a variety of storage solutions both above and below the work surface.
SIZING UP THE SOLUTION
The size of your workbench is determined by a number of factors. First up is how much space is available in the work environment–how big a footprint will it occupy? With today’s modular workbenches making maximum use of vertical space, you may not need as big a workbench as you think. Next, how much work surface area does your application demand, both in terms of width (left to right) and depth (front to back)? Does the entire work surface need to be within easy arm’s reach (by, say, an assembly technician)? Can you position needed items above the work surface on a vertical accessory system for easier access? Will you be working with large equipment or parts? If so, you may not only need a larger work surface, but might also need to factor in the weight-bearing capacity of your workbench.
WORKSTATION MEETS WORKFLOW
After thinking about size and footprint, you should consider whether your company’s workspace, type of work, and workflow are best served by a group of workstations laid out in a particular configuration. Some companies offer modular workstations that are specifically designed to accommodate different configurations, and thus different types of workflow. Use a design that positions your team for maximum efficiency.
If you’re operating with a progressive workflow, you may want to configure your workbenches to create an integrated, moving production line. Flow racks can then be used to stage and deliver parts using gravity, providing point-of-use storage while reducing material handling time and cost.
If your team functions in cells or groups, it may be served best by different shaped configurations that encourage easy communication. Some workstations are available in modules, so they can easily be combined to create everything from in-line and in-line back-to-back configurations to T-, U-, X-, and Y-shaped configurations.
Finally, consider transforming from stationary to mobile workbenches. Effortlessly accomplished with mobility-enhancing accessories, mobile workbenches provide easy, smooth-rolling relocation. This will accommodate both day-to-day and future changes as well as simplify cleaning activities.
STORAGE–EVERYTHING IN ITS PLACE AND A PLACE FOR EVERYTHING
Spend some time doing some careful planning to get a workstation that exactly addresses your storage needs with little or no wasted space. Simplify your storage decisions by reducing the items being stored to only those that directly address your workbench applications. When doing your planning exercise, consider the size, shape, weight, quantity, and fragility of the items to be stored, as well as how accessible they need to be, and how much security they demand.
Do you need a home for shipping documents? A bar code scanner? Test equipment? Small parts? Tools? There are plenty of options for storage, both above and below the work surface. From plastic parts bins to a variety of shelving options to every size and configuration of drawer, there’s a lot to consider.
After determining exactly what needs to be stored, zero in on making the workspace more efficient. Create a designated storage location for every item. Modular drawer cabinet interiors are ideal for custom configuring to produce almost infinite layout options. This high level of organization is particularly important if different people are using the same workbench at different times. Time savings are maximized and inventory control becomes a non-issue.
LET THERE BE LIGHT
The lighting needs of the different workbench tasks are an important consideration. Does each station need separate lighting? Does the room itself have lighting deficiencies? Does the room light cast an unwanted color? And if you decide you need to equip your workbenches with lighting accessories, are your technicians best served by overhead fluorescent lighting or a swing arm that can be easily positioned and/or moved out of the way when not needed? Do you need an accessory that can diffuse the light and reduce glare?
POWER TO YOUR PEOPLE
After you weigh your lighting needs and options, you should next move on to your electrical requirements. From cleanrooms to quality control departments to research and development functions, having a convenient source of power at each workbench can be essential. There are diverse options to consider—from power beams and air beams to air supply brackets and cable management accessories. You can narrow your selections down to the necessary few by asking the right questions:
First consider the applications. Will each workbench be home to a computer monitor and other computer equipment? Do you need a data beam? Will the tasks at hand require compressed air, and what is the source of that air?
How many outlets do you need at each workbench (and how much power)? Where should the outlets be positioned? Do you require a ground-fault circuit interrupter (GFCI) to provide protection against severe shock and electrocution? Consider cord management, both from an aesthetic point of view, as well as the safety factor. To keep power cords from becoming tripwires, cable trays may be needed.
THE RIGHT ACCESSORIES MAKE THE SPACE MORE EFFICIENT
Think about the impact that add-on accessories might have in improving the employees’ job functions. No matter what the task, there’s an accessory option to help get the job done more efficiently and conveniently. By taking advantage of the abundant vertical space above the work surface, and the many interchangeable accessory options available, you can create a highly efficient work center that is tailored completely to the needs inherent to jobs being performed in the workspace. Examples include shelving for manuals or instruments, parts bin rails, a monitor bracket, or a keyboard holder.
PAY ATTENTION TO ERGONOMICS FOR SAFETY AND PRODUCTIVITY
It is essential to factor in ergonomics to improve safety and productivity. To minimize stress and strain for seated employees, a 30.5-inch work surface height will accommodate 99.5 percent of all male and 99.9 percent of all female workers. The optimal work surface height for standing employees depends entirely on the type of work being performed. Precision work usually requires a higher work surface, while heavier work demands a lower work surface.
But what if different shifts are using the same bench, or different tasks are being performed on it? If so, consider an adjustable-height workstation. With such a bench, users can adjust the bench height with the simple turn of a crank or with a motor drive, and the work surface can move between approximately 25 and 41 inches.
STANDARDIZATION LEADS TO IMPROVED ADAPTABILITY
Most companies have multiple departments, from manufacturing to testing to shipping. Consider using a common workbench platform throughout the facility to gain such benefits as better utilization of inventory, easier reconfiguration, interchangeability of accessories, and aesthetic appeal.
Standardization allows efficient swapping of accessories among departments, and facilitates adjustments if work tasks change. Colors and designs match, and there are no surprises when employees shift to a different department.
PUTTING IT TOGETHER WITH DESIGN ASSISTANCE
If this list of considerations raises questions you cannot readily answer, you may prefer expert help with your quest for the perfect workbench. Many workbench providers offer design planning assistance to guide you through the process and advise you of the most appropriate choices. Free services such as surveys and CAD drawings can make the process virtually painless.
LAST STEPS
If you choose a workbench provider who offers maximum breadth of product and flexibility, you’ll be able to view all of your workbenches as part of a complete picture, even though each may have been custom-built to accomplish a unique task.
The end result? Many smart steps for each department and one giant leap for your business.

Multi-Disciplinary Research Cleanroom Facility


Jelle Hanse
The world-wide need for the development of replacement human tissue, via regenerative medicine and stem cell applications, has resulted in the development of a new state-of-the-art cleanroom and laboratory research facility at Loughborough University.
The new 770m² multi-disciplinary Centre for Biological Engineering is the latest investment in research to improve human health by Loughborough University and the East Midlands Development Agency. The University aims to realize regenerative medicine, cell technologies, and plasma medicine by combining the human cell and tissue research programs of three university departments: the Department of Chemical Engineering, the Wolfson School of Mechanical and Manufacturing Engineering, and the Department of Electronics and Electrical Engineering.
To combine the three different fields of research in one multi-disciplinary research center required flexibility and transparency in the design. The team used its previous experience in designing and constructing cell and tissue cleanroom facilities, and its modular cleanroom construction method, to ensure that the facility was highly versatile and space-efficient, yet complied with all the regulatory requirements. The result is a superb, advanced new facility featuring the latest equipment, one the customer is delighted with and proud to show to other research institutes throughout the United Kingdom and abroad.
CLASS II CENTRE FOR BIOLOGICAL ENGINEERING RESEARCH AREA
The Centre for Biological Engineering Research Area is one of the key areas in the new Centre for Biological Engineering. A range of laboratories for microbial, animal, and human cell culture research will enable the center’s staff to compete with biological engineers on a global scale. Facilities associated with the Cell Technologies Group (Department of Chemical Engineering) include a range of new culture vessels (from 150ml spinner flasks to conventional 5L Stirred Tank Reactors), a Fluorescence-Activated Cell Sorting (FACS) system, fluorescent microscopy, and state-of-the-art analytical equipment, all housed within a suite of Class II research laboratories. The aim of the work is to understand the interaction of the cell with the engineering environment for informed scale-up.
To ensure minimum risk of exposure to biological agents, the laboratory research area has been built to Microbiology Containment Level II, in accordance with the 1995 EC Biological Agents Directive. An isolated air system, special room pressure regimes, and strict staff operating policies ensure the safety of both the Center’s staff and the surroundings.
cGMP AUTOMATED CELL CULTURE FACILITY
Within the Centre for Biological Engineering, the Healthcare Engineering Group of the Wolfson School of Mechanical and Manufacturing Engineering has a dedicated cGMP cleanroom suite that focuses on optimized automated human cell culture. Designed to run to an EU GMP Classification Grade B standard, the facility includes cryogenic storage facilities, a manual cell culturing area, and an area for automated cell culture containing an automated cell culture machine, designed to run in an EU GMP Classification Grade A environment.
The facility is intended to aseptically culture, expand, differentiate, and harvest adherent cells to be used in the GMP/cGMP manufacture of clinical trial Phases I, II, and III and licensed product batches of somatic cell therapeutic medicinal products. Products are anticipated to include cells and cell lines for allogeneic therapy and autologous cells for individual patients.
Approval from the Medicines and Healthcare products Regulatory Agency (MHRA) is essential for frontline medicine and healthcare product research; however, no MHRA reference guidelines were available for this type of research work. Extensive discussions mapped out this new territory, and the cGMP facility was designed to be MHRA compliant.
BIOELECTRICAL ENGINEERING FACILITY
The Centre for Biological Engineering also houses the Plasma Medicine Group (Department of Electronics and Electrical Engineering), with extensive facilities for research into how gas plasmas and pulsed electric fields could preferentially control skin infection, accelerate wound healing, and suppress tumor growth. This laboratory suite integrates an advanced atmospheric gas plasma laboratory with a Class II cell laboratory so that engineers, physicists, and life scientists can work together under the same roof. The facility represents the world's first integrated laboratory in plasma medicine with co-located plasma and cell laboratories.
Finally, the three departments will share common autoclave, storage, and office areas that will support the research.
The Centre for Biological Engineering brings together three fields of study: Biology, Engineering, and Medicine. Each department contributes to the common goal: the realization of regenerative medicine, cell technologies, and plasma medicine for the regeneration of human cells and tissue. The partnership has resulted in an exciting, innovative new cleanroom and laboratory facility with immense potential for groundbreaking developments in human cell and tissue regeneration.
Jelle Hanse is the Export Executive at Clean Modules Ltd, responsible for sales, marketing, and business development of Clean Modules Ltd on the international market. Jelle and the team at Clean Modules Ltd specialize in the design and build of cleanrooms, laboratories, and associated clean environments, especially using pre-built and modular construction for use worldwide. While covering all industries, Clean Modules Ltd has particular specialist knowledge of the Life Sciences, Healthcare, Pharmaceutical, Biotechnical, and related industries.

Finding The Optimal Analytical Test Part 1


Barbara Kanegsberg
Ed Kanegsberg
What analytical test should you use? A recent and instructive case study does not provide a miracle all-purpose analytical technique (there is no such thing) but rather illustrates how a decision might be reached and the considerations involved. It illustrates the importance of exploring beyond the methods in one’s immediate comfort zone, of understanding the features and limitations of any analytical method, and of adopting a scientific, rational, and defensible approach.
Testing involves “somehow perturbing the area to be analyzed and observing the results.”1 The key to successful analytical testing is perturbing and observing the material to be analyzed in a meaningful, reproducible manner that meets the requirements at hand. This is easier said than done; the process of selecting, developing, and validating a rational analytical method can involve interminable testing and discussion.
The case study involved collaboration between regulators at the South Coast Air Quality Management District (SCAQMD) in southern California and the Independent Lubricant Manufacturers Association (ILMA). The approach to selecting the method involved determining and emulating likely manufacturing situations, testing and evaluation, assessment of method practicality, refinement of a standardized method, and consensus building.
In this instance, the goal was to determine the evaporative loss of relatively volatile materials from metalworking fluids so that a method could be included in an SCAQMD regulation aimed at reducing VOC air emissions.
THE COLLABORATION
John M. Burke is Director of Engineering Services at Houghton International, Inc. in Valley Forge, PA. In 2008, as Chair of the Safety, Health, Environmental, and Regulatory Affairs (SHERA) Committee at ILMA, he recalls, “I received a call from an ILMA member in southern California who described discussions with SCAQMD regarding all sorts of metalworking fluids, including vanishing oil, rust preventatives, and other protective fluids. As we learned more about the efforts to determine the VOC content, we knew we needed to be involved.”
The ILMA involvement developed into a major scientific effort, one that benefited the environment and considered requirements of industry. Naveen Berry, Planning and Rules Manager at SCAQMD, Diamond Bar, CA, described the association with ILMA as “a truly collaborative effort.”
WHICH METHOD?
Berry explains that “we considered EPA Method 24;3 it has many strengths. However, it is recognized as an appropriate method for coatings, but not for metalworking fluids and lubricants.” Burke adds that “Method 24 requires application of heat. While the temperature conditions are appropriate for aqueous solutions, there is a point when you heat a lubricant at which you go past evaporating it; you are cooking it. The breakdown products are artifactual and do not truly emulate VOCs that would be released with normal, lower temperature evaporation. In addition, Method 24 uses a laboratory oven. Placing a given sample at four different positions within the oven might yield four different results.”
Readers might want to note these limitations, because EPA Method 24 has been known to be used in manufacturing applications. In addition, loss and/or modification of a mixture of analytes during sample preparation are important considerations for everyone involved in manufacturing. Many complex mixtures can be readily modified during evaporation, especially with heating. For example, years ago, one of us (BK) participated in aerospace studies of Non-Volatile Residue (NVR) to qualify higher-boiling point solvents as replacements for lower-boiling point chlorinated and fluorinated solvents. We found that the evaporation temperature had to be lowered and that the position of the sample in the laboratory oven had to be specified.
“In 2008 there was no standardized method for measuring VOCs in metalworking fluids,” continues Burke. “SCAQMD proposed a test method that they termed Method 313L.4 It called for determining VOCs by separation using gas chromatography (GC) with a flame ionization detector (FID).” Burke recounts that “after a year of testing with 313L, we could not get consistent test results among laboratories. In fact, we saw run-to-run variability even using the same GC system and the same technician.”
Berry explains that SCAQMD decided not to pursue Method 313L, GC/FID, for the current version of Rule 1144 because it did not meet the requirements of ASTM E-691 in terms of reproducibility. He adds that the project “raises the issue of how to treat semi-volatiles. When you test a mixture of multiple solvents in a gas chromatograph, generally you see distinct peaks. Some of the semi-volatiles did not form distinct peaks. They formed a smear.”5
THERMOGRAVIMETRIC ANALYSIS (TGA)
Burke recounts that “we were in a quandary. How could we agree to regulatory limits? How could we find a consistent test method? What constitutes volatility? Then, one of our company scientists suggested using TGA. TGA use is called out in military standards to determine the volatility of hydraulic oil. The thought was that perhaps TGA could be adapted to the problem at hand.”
On the surface, TGA might seem like EPA Method 24, and Burke notes that if TGA were used at the temperature called out for EPA Method 24, the results were indeed similar. ILMA contended that 110°C +/- 5°C was too hot and that modification of the fluid was occurring. Burke recounts that “we finally said that we really need to define what constitutes volatility.”
CHARACTERIZING VOLATILITY
The SCAQMD/ILMA committee designed a test to characterize the evaporative properties of oils, using a time/temperature combination that might emulate real-world conditions. Burke notes that “it was important that we agreed to methods and to pass/fail criteria in advance. We tested three common naphthenic oils, with three different viscosities,* obtained from a single refinery. We selected a 26 week time span using the rationale that ‘two turns’ of oil per year would replicate actual production conditions. The temperature was 40°C, emulating warm but realistic factory conditions. We agreed that we would observe the volatility curves (rate of change of fluid weight). If the curves flattened out before 26 weeks, we would stop earlier. We were surprised, not very pleasantly, by the results.”
The samples were still evaporating at 26 weeks. Burke explains that “while the most viscous oil was beginning to asymptotically approach a leveling-off point, the other oils were still evaporating steadily. We thought that the medium viscosity oil would not be very evaporative, but it was almost 100% evaporative.”
“Since a 26 week testing protocol would be a bit cumbersome, the next step was to determine time and temperature conditions that emulated the behavior of the oils in the 26 week protocol. We tested five temperatures (71, 81, 91, 101, and 111°C) over shorter time periods. For each temperature, 120 minute-by-minute data points were collected.” Burke explains that they were able to set analytical conditions that matched the results of the six month study to within 1%.
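The condition-matching exercise Burke describes lends itself to a simple computation. The sketch below, with entirely hypothetical TGA readings, shows one way to scan short time/temperature runs for the point whose percent weight loss best matches the loss measured in the 26 week, 40°C reference study; it illustrates the logic only, not the committee's actual procedure or data.

```python
# Illustrative sketch of the condition-matching logic (hypothetical data,
# not the SCAQMD/ILMA data): scan accelerated TGA runs for the
# time/temperature point whose evaporative loss best matches the loss
# observed in the 26-week, 40°C reference study.

def percent_loss(initial_mg: float, final_mg: float) -> float:
    """Evaporative loss as a percentage of starting sample weight."""
    return 100.0 * (initial_mg - final_mg) / initial_mg

reference_loss = 38.2  # hypothetical percent loss over 26 weeks at 40°C

# Hypothetical accelerated runs: temperature (°C) -> sample weights (mg).
# A real run would hold 120 minute-by-minute readings per temperature.
minutes = [0, 40, 80, 120]
accelerated_runs = {
    71: [50.0, 49.1, 48.3, 47.6],
    91: [50.0, 46.4, 43.5, 41.0],
    111: [50.0, 41.9, 35.8, 30.9],
}

best = None
for temp_c, weights in accelerated_runs.items():
    for t, w in zip(minutes, weights):
        error = abs(percent_loss(weights[0], w) - reference_loss)
        if best is None or error < best[0]:
            best = (error, temp_c, t)

error, temp_c, t = best
print(f"Closest match: {temp_c}°C for {t} min (off by {error:.2f} points)")
```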
In summary, the group characterized the evaporative properties of oils used in lubricants, evaluated analytical approaches, and selected a promising method, TGA. In the next column, we continue with how the TGA method was tested and optimized, the current status, and a look to the future.
* The oils were characterized as 40 second, 60 second, and 100 second oils. The units refer to Saybolt Universal Seconds, a viscosity unit: the time for a given volume of oil to flow through a certain orifice. The longer the time, the more viscous the fluid.
Metalworking Fluid Restrictions May Affect You
We suggest that you, the reader, peruse Rule 1144.2 Metalworking fluids are being restricted to less evaporative materials in the SCAQMD area. SCAQMD rules sometimes are precedents for regulations in other areas and states. Features of the rule may impact your manufacturing facility either by current or future regulatory mandate or by corporate edict. If metalworking fluids that you or your suppliers use are reformulated or restricted, critical cleaning processes and approaches to determining surface residue on high-value product may also need to be revisited.
References
  1. B. Schiefelbein in Kanegsberg and Kanegsberg, “Find the Contaminant by Perturbing the Surface: XPS and Auger (Part 1),” Controlled Environments Magazine, November, 2007.
  2. SCAQMD Rule 1144, “Metalworking Fluids and Direct Contact Lubricants,” Amended July 9, 2010 http://aqmd.gov/rules/reg/reg11_tofc.html
  3. EPA Method 24, “Determination Of Volatile Matter Content, Water Content, Density, Volume Solids, and Weight Solids Of Surface Coatings,” www.epa.gov/ttn/emc/promgate/m-24.pdf
  4. Appendix I SCAQMD Method 313: Determination of Volatile Organic Compounds (VOC) by Gas Chromatography/Mass Spectrometry (GC/MS) http://www.aqmd.gov/rules/cas/app1.html
  5. Clean Air Solvent Certification Protocol http://www.aqmd.gov/rules/cacc/index.html and www.aqmd.gov/rules/cacc/CACCprotocol.pdf

Barbara Kanegsberg and Ed Kanegsberg, Ph.D., “The Cleaning Lady” and “The Rocket Scientist,” are independent consultants in surface quality, including critical/precision cleaning, contamination control, and validation. They are editors of The Handbook for Critical Cleaning, CRC Press; an expanded second edition is scheduled for publication in the 4th quarter of 2010. Contact BFK Solutions LLC, 310-459-3614.



Saturday, November 27, 2010

QUALITY CONTROL - Microbial Testing | Rapid Microbiological Methods in Lean Manufacturing



By Judy Madden
Change can reduce costs, waste
By stripping waste and other steps that have no value from the production process, lean manufacturing can result in significant savings. But why stop there? If your company uses traditional testing methods to release its products to market, the adoption of rapid microbiological methods (RMMs) could be one of the most important additions you make to your company’s lean initiatives.
Rapid testing tools can shorten microbial hold times to just 24 hours for microbial limits and cut sterility testing time by more than 50%.
Lean quality is achieved by extending the principles of lean manufacturing to microbial testing. RMMs are being adopted by an increasing number of pharmaceutical companies, contract manufacturers, co-packers, and testing operations around the world to reduce lead times while ensuring product safety. Implementing rapid methods can unlock huge savings by shortening your microbial hold times to just 24 hours for microbial limits and cutting sterility testing time by more than 50%.

How Rapid Methods Work

Products that are susceptible to microbial contamination but are expected to be shipped contamination-free are screened before being released into distribution. That screening adds multiple days to the production cycle and extends lead times. Micro-screening typically takes three to seven days for microbial limits testing and 14 days or more for sterility. That’s a long time to wait when you have a good production process in place that is turning out products free from bioburden almost all the time. The wait time is frustrating and expensive.
Well-validated rapid methods are just as effective as traditional microbiological methods, but faster. Results can be delivered in as few as 24 hours using a rapid method featuring adenylate kinase (AK). AK-amplified bioluminescence combines the use of adenosine triphosphate (ATP), the gold standard of rapid detection, with a patented enzyme technology to deliver accurate results even faster.
All living organisms contain the compound ATP, a vital part of energy metabolism. When ATP is detected in the standard bioluminescence test, a reaction occurs that generates a photon of yellow-green light, similar to that of a firefly. Although it is a very sensitive technique, it is limited because an organism can contain only a finite amount of the metabolite ATP.
AK is another vital part of energy metabolism in all living things. When supplied with an excess of adenosine diphosphate (ADP), AK acts as a catalyst, converting ADP into ATP. Because AK is an enzyme rather than a metabolite, it can be harnessed to generate an almost unlimited amount of its product. After 25 minutes, for example, the amount of ATP can be 1,000 times greater than the organism originally contained. As a result of this amplification and the reduced dependence upon microbial growth for detection, AK-amplified bioluminescence provides even faster results when used to screen for microbial contamination.
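The arithmetic behind that amplification is straightforward. The sketch below models the behavior implied above: intrinsic ATP is fixed, while AK supplied with excess ADP generates ATP at a roughly constant rate, so the detectable signal grows with incubation time. The rate constant is hypothetical, chosen only to reproduce the 1,000-fold figure cited above.

```python
# Minimal sketch of the amplification arithmetic implied above: intrinsic
# ATP is fixed, while AK with excess ADP generates ATP at a roughly constant
# rate. The rate below is hypothetical, chosen so that 25 minutes of
# incubation yields the ~1,000-fold gain cited in the text.

intrinsic_atp = 1.0        # arbitrary units: the ATP the organism contains
ak_rate = 999.0 / 25.0     # hypothetical ATP generated per minute

for minutes in (0, 5, 25):
    fold = (intrinsic_atp + ak_rate * minutes) / intrinsic_atp
    print(f"{minutes:>2} min: {fold:7.1f}x the intrinsic ATP signal")
```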

Benefits of Rapid Methods

When deciding whether to implement RMMs as part of your company’s lean initiatives, you should first consider the benefits.
Financial Benefits: Hundreds of manufacturing facilities worldwide have saved millions by implementing rapid methods, but how much could your company potentially save? At Celsis, the answer can be calculated using readily available data from your company’s finance, manufacturing, and quality departments. This information is used to populate the Celsis Value Creation Model, which then projects your company’s customized five-year net present value (NPV) and payback period for implementing a Celsis rapid detection system. In our experience, most companies realize payback on their investment in just six to nine months, with an average five-year NPV in excess of $500,000 per facility.
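For readers who want a feel for the underlying arithmetic, here is a minimal, generic NPV and payback sketch. It is not the Celsis Value Creation Model; the cash flows and discount rate are hypothetical placeholders.

```python
# Generic five-year NPV and payback arithmetic (not the Celsis Value
# Creation Model). All dollar figures and the discount rate are hypothetical.

def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value; cash_flows[0] is the year-0 (upfront) amount."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def payback_months(upfront: float, monthly_saving: float) -> float:
    """Months of steady savings needed to recover the upfront investment."""
    return upfront / monthly_saving

# Hypothetical case: a $150k system yielding $20k/month in savings from
# reduced inventory, faster release, and lower waste, discounted at 10%/year.
flows = [-150_000.0] + [240_000.0] * 5
print(f"5-year NPV: ${npv(0.10, flows):,.0f}")
print(f"Payback: {payback_months(150_000, 20_000):.1f} months")
```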
Cost-Per-Test Savings: While there is an increased cost for assay materials, the operational and financial benefits far outweigh the expense. When you combine all of the associated benefits, the cost of rapid methods per assay is far less than that of traditional methods. If your company stores batches or products awaiting test results, you will reduce the working capital tied up in inventory and clear out space in the warehouse. If you carry safety stock, you will further benefit from cutting it down to size. If you manage a brand, you will benefit by offering your distributors a faster response time and by reducing the risk of contaminated product reaching consumers.
Expedited Order Fulfillment: Going lean reduces the time it takes you to produce, package, and/or test products, allowing you to fill orders quickly and more efficiently. Reducing the time required to complete orders leads to faster invoicing and faster payment. Additionally, without the need to house quarantined finished products, in-process and raw materials, and safety stock in micro-hold, your company will have extra warehouse space that will increase its capacity for additional business.
Reduced Working Capital Requirements: By reducing inventory requirements, you can manage a shorter cash cycle and dedicate more dollars to building your business rather than financing the conversion of raw materials to finished goods. This reclaimed working capital can then be used for productive purposes: growing revenue, developing new products, or funding other projects in functional areas that will bring value to the organization.
Enhanced Risk Management, Faster Recovery: What happens when there is a contamination event? At these times, when so much is at risk, the benefits of rapid methods are actually doubled. By improving efficiency in your product testing, you also reduce your recovery time if a contamination occurs. The faster a problem is identified, the more quickly you can begin corrective action and recover. RMMs reduce the potential negative impact on your customer relationships and minimize risk for your bottom line.
This benefit is not limited to a contamination event involving product released into distribution; it also includes those detected by your current traditional methods. The cost and timing associated with recovery can be significant and, to the extent that customer service is impacted, those costs can be even greater and more difficult to quantify.
Reduced Waste: Not only do rapid methods reduce the amount of product that must be disposed of when a contamination occurs, they also reduce waste in testing supplies. Think about the vast volumes of broth, containers, and/or plates that your company goes through each month to run the multiple assays necessary to screen a single sample of product. With rapid methods, these bulky traditional testing supplies are significantly reduced. For example, the AK-based assay effectively screens for bacteria, yeast, or mold contamination from a single broth enrichment, and the assay requires only a small cuvette.
Rapid microbial methods (RMMs) can pay for themselves in six to nine months.

Adopting Rapid Methods in Pharma

The pharmaceutical industry is an ideal environment for a rapid absence/presence screen. The manufacturing and packaging of pharmaceuticals today are conducted under conditions that produce a “clean” product at least 99% of the time. If a positive is rarely expected, there is rarely anything to enumerate.
With 99% or more of products testing negative for microbial contamination, lean principles suggest that a simple, rapid absence/presence screen will provide the actionable information necessary to manage the majority of production in a time- and cost-effective fashion. That leaves less than 1% of production to be managed on an exception basis for further evaluation against product specification. The use of enumeration technologies to “count to zero” typically results in limited implementation because of cost, limited product applicability, protocol complexity, or resource limitations. Quite simply, if the technology can’t be broadly implemented, then the financial and operational efficiencies won’t be realized.
Transitioning pharmaceuticals and other regulated products to a rapid release method is straightforward and is, in fact, encouraged by many global regulatory bodies, including the U.S. Food and Drug Administration (FDA). The preferred approach is to “tell them what you’re going to do, do it, and then tell them that you’ve done it.” That is a reasonably accurate description of the FDA’s comparability protocol. The centerpiece, or “do it” part, is validating the product or group of products for routine release using the rapid method. This typically entails side-by-side testing to ensure that the rapid method is as sensitive as the traditional method. During this stage, lab staff become more experienced and confident in the rapid system so that when the validation data are collected, the RMM is effectively implemented.
Your RMM provider may have regulatory compliance expertise on staff that you can tap into, as well as drug master files (DMFs) accepted by the FDA. The DMFs include data for specificity, limit of detection, robustness, ruggedness, and equivalence, and they can be used to supplement or streamline the validation of your rapid system. This may save significant time, both in your preparation and in the FDA’s review and approval process.
Controlling costs and operating more efficiently are priorities for everyone in the pharmaceutical value chain. Rapid microbial screening is a significant opportunity for manufacturers and contractors to reduce costs, risk, and working capital requirements through a leaner, more efficient quality screening process.

QUALITY CONTROL - Analytical Methods | LC/MS/MS Laboratory Workhorse for Generics



By Johnny Cardenas
Drug companies leverage new opportunities
The window of opportunity opens wide for generic pharmaceutical companies when blockbuster drugs come off patent. As we approach the next few years, when a number of notable blockbusters such as Lipitor, Plavix, Advair, and Singulair come to the end of their patent protection, generic pharmaceutical companies are noticeably ramping up their drug development capabilities and are poised to reap big profits. These companies have evolved into giants in recent years, and the contract research organizations (CROs) they depend on are proliferating in many parts of the world. Because the window of opportunity is relatively short and time-sensitive, generic drug development must start early and move quickly so that products are available the minute they can be put on the market legally.
Whether increasing their own laboratory bandwidth or outsourcing more work to CROs, generic pharmaceutical companies are implementing more studies with shorter and shorter timelines. By far the most common research activity in generic drug development is the bioequivalence study, which identifies and quantitates metabolized molecules to document the pharmaceutical/therapeutic equivalence between a generic drug and its brand-name counterpart.
Many generic companies have dozens of studies going on simultaneously, requiring the processing of hundreds of thousands of samples. High performance liquid chromatography (HPLC) with triple quadrupole mass spectrometry (LC/MS/MS) is the gold standard for this pharmaceutical bioanalytical quantitation, and instrument manufacturers have been working at breakneck speed to deliver system improvements that will help drug developers move faster and more efficiently.
LC/MS/MS replaced LC-UV (liquid chromatography with UV detection) analysis for these bioequivalence studies in the 1990s, providing vast improvements in sensitivity, selectivity, and throughput. Emerging powerhouses such as India and China are also adopting this analytical technology as they join the global competition.
Figure 1: This figure illustrates 1,300 plasma injections over three days on the AB SCIEX QTRAP 5500 System. It demonstrates excellent reproducibility for the target compound and internal standard responses, with a 3.3% coefficient of variation (CV) on the peak areas and a 2.3% CV on the peak area ratios.
LC/MS/MS has revolutionized the speed at which clinical and preclinical studies can be done, but throughput is only one of the capabilities generic pharmaceutical companies require. They also need ruggedness and robustness: systems that can deliver reproducible results over time and run the highest number of sample injections without cleanup. A rugged LC/MS/MS system will produce the same data day to day, and a robust one will deliver maximum time between failures (see Figure 1, above).
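For context, reproducibility figures like those in Figure 1 are simple to compute. The sketch below, using hypothetical peak areas rather than the cited study's data, calculates the coefficient of variation of raw peak areas (a gauge of system stability) and of analyte/internal-standard area ratios (which show how the internal standard corrects for run-to-run drift).

```python
# How CV figures like those in Figure 1 are typically computed. The peak
# areas below are hypothetical, not data from the cited study.

import statistics

def cv_percent(values: list[float]) -> float:
    """Coefficient of variation: sample std dev as a percent of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

analyte_areas = [10520, 10110, 10840, 10300, 10680, 10450]  # hypothetical
istd_areas = [20410, 19750, 20990, 20080, 20720, 20330]     # hypothetical
ratios = [a / i for a, i in zip(analyte_areas, istd_areas)]

print(f"Peak-area CV:  {cv_percent(analyte_areas):.1f}%")
print(f"Area-ratio CV: {cv_percent(ratios):.1f}%")
```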

Canadian Case Study

Pharmascience is one of two large generic pharmaceutical companies based in Canada and produces an impressive portfolio of single-source prescription drugs. In Canada alone, the company fills more than 21 million prescriptions a year. According to Tristan Booth, PhD, the company’s vice president of innovative development, “we’ve improved our high-throughput bioanalytical lab with multiple high performance LC/MS/MS systems, to the point where we can do 50 studies in one year, processing 80,000 to 100,000 samples.” With seven AB SCIEX LC/MS/MS systems powering this bioanalytical lab, the company has now been able to devote more resources to discovery, including Phase 1 clinical trials.
Increased detection sensitivity has also improved the efficiency of generic pharmaceutical research. Biological samples contain compounds that can interfere with an assay. Preparation steps such as liquid- or solid-phase extraction can clean up the sample before processing, but they add time and cost to the study. With increased LC/MS/MS instrument sensitivity, these samples can be diluted rather than prepared with extraction techniques, allowing five to ten times less sample to be injected to obtain results.
Even highly polar analytes that are difficult to separate from a matrix can be analyzed more clearly with the latest highly sensitive instruments. The limit of quantitation (LOQ), which is the lowest concentration at which quantitative results can be accurately reproduced, can be lowered with newer, more sensitive systems such as the AB SCIEX QTRAP 5500 System.
Figure 2a: A liquid chromatography chromatogram displaying the excellent signal-to-noise (S/N) ratio in an analysis of budesonide from a dried blood spot sample using the new LC/MS/MS method; the ratio is 15 with a 3.2% coefficient of variation (CV). Figure 2b (below) shows a calibration curve for budesonide verifying that the accuracy and precision of this new LC/MS/MS method on dried blood spot samples are well within the limits of acceptance.
“Advancements in LC/MS/MS technology have delivered increased sensitivity for metabolized molecule identification from small sample concentrations in the sub-picogram range,” said Dr. Booth. Increases in selectivity allow Pharmascience to analyze 1,000 samples in two to three days, a significant improvement in throughput over just a few years ago.
Technological advances in LC/MS/MS, such as the TurboV Ion Source (AB SCIEX), provide increased sensitivity that reduces sample prep requirements, delivering improvements in LOQ, throughput, and robustness. Curtain Gas technology (AB SCIEX) protects the mass spec interface region and quadrupole analyzer from contamination, reducing routine maintenance requirements.
Data processing and advanced algorithms are another dynamic factor in successful bioanalytical analysis for generic drug development. As throughput increases on smaller and smaller sample volumes and concentrations, quantitation becomes more challenging. The ability of new triple quadrupole systems to monitor ever-increasing numbers of components has led to the need to improve data processing speed and accuracy. Mass spec companies continue to develop novel algorithms and software to keep pace with throughput and data generation advances.

CROs Add Analysis Bandwidth

Generic pharmaceutical companies are also ramping up by using CROs, which have significantly more LC/MS/MS analysis bandwidth. Ideally, both organizations will use the same LC/MS/MS platform to speed up the cross-validation process and eliminate method transfer issues. “Any CRO we use must have validated our methods,” said Dr. Booth. “If they are using the same LC/MS/MS platform that we have, validation is not an issue.”
As these generic drug companies look to the future, it’s clear that growth in opportunities will slow significantly as industry reaches the impending patent cliff, when the last of the big blockbusters go off patent. What looks like gloom and doom for big pharma will be a major catalyst for the growth of generic drug companies, which are embarking on new areas, including researching novel drug formulas, exploring new dosage forms, and even looking at drug repurposing. “We are diversifying our pipeline in anticipation of this market change,” said Dr. Booth. “Our LC/MS/MS capability has freed up resources so we can do more work in Phase 1 clinical trials, safety testing, dose escalations, etc.”
Figure 2b.
As generic companies move beyond synthesizing blockbuster drugs to new and innovative endeavors, their analysis challenges expand to include qualitative pharmacokinetic studies, such as absorption, distribution, metabolism, and excretion. Complementary LC/MS/MS technology can meet these needs and streamline the transfer of methods from discovery to development. “It’s important to us to have backup and downstream LC/MS/MS systems that are all on one platform so we don’t waste time changing methods as we move down the pipeline,” said Dr. Booth.
The latest development for the analysis of small molecules in drug development is dried blood spot (DBS) sampling. DBS sample collection is easier and less invasive, and shipping and storage costs are significantly reduced. However, these small spots, between 20 and 100 times smaller than plasma samples, present new analysis challenges and require very high sensitivity detection. Many pharmaceutical companies and instrument manufacturers are working on ways to improve DBS sampling and analysis.
To demonstrate the feasibility of LC/MS/MS analysis on these samples, AB SCIEX recently developed a method for the quantitation of inhaled corticosteroids used in the treatment of asthma using DBS samples. These drugs, which are absorbed through the nasal passage and lungs before systemic metabolism, are found in very low concentrations in the blood. The use of DBS sampling for the pharmacokinetic study of these corticosteroids presents even lower sample concentrations.
Using the latest LC/MS/MS technology in the AB SCIEX QTRAP 5500 System, we were able to develop an accurate, verifiable method that enables pharmacokinetic and toxicokinetic studies of corticosteroids such as budesonide and fluticasone propionate. Figures 2a and 2b (see above) show that quantification of inhalation drugs with low systemic circulation is feasible with DBS using ultra HPLC and high sensitivity triple quadrupole instruments. Sufficient sensitivity for 5-10 pg/mL LOQs can be achieved.
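As a rough illustration of how such LOQ figures relate to measured signal-to-noise, a common rule of thumb takes the LOQ as the concentration giving S/N = 10. Assuming the signal scales linearly with concentration, one can extrapolate from an S/N measured at a known concentration; the sketch below uses illustrative numbers, not the study's actual data.

```python
# Rule-of-thumb LOQ estimate: the concentration that would give S/N = 10,
# assuming signal scales linearly with concentration. Values below are
# illustrative, not the actual study data.

def estimate_loq(conc_pg_ml: float, measured_sn: float,
                 target_sn: float = 10.0) -> float:
    """Scale a measured S/N linearly to the concentration giving target_sn."""
    return conc_pg_ml * target_sn / measured_sn

# E.g., an S/N of 15 measured on a (hypothetical) 10 pg/mL sample lands
# the estimated LOQ inside the 5-10 pg/mL range quoted above.
print(f"Estimated LOQ: {estimate_loq(10.0, 15.0):.1f} pg/mL")
```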
At virtually every point in the generic drug discovery pipeline, LC/MS/MS is an essential analytical tool, enabling research to advance and keep pace with ever-changing market requirements.

IN THE LAB - Lab Notebook | Analytical Methods Validation: Design and Execution



By Clifford Nilsen, CSSBB
The mechanics of analytical methods validation
Editor’s Note: This is the third in a series of articles on analytical methods validation (the second part appeared on p. 34 of our June/July issue). The final article will continue to review the mechanics of analytical method validation, with an application of statistical methods that best support the validation effort.
This third installment of the series on analytical method validation, the process of demonstrating through laboratory studies that an analytical method is suitable for its intended use, will focus on the mechanics of analytical method validation.

Stability Indication (Forced Degradation Studies)

There are two kinds of forced degradation studies: chemical and physical. Chemical degradation is usually accomplished by acid hydrolysis, base hydrolysis, and oxidation.
Acid hydrolysis: Expose the sample to a mineral acid such as hydrochloric or sulfuric acid, starting with 100% working amount plus 10 mL of 0.5 N acid. Reflux the acidified sample for 30 minutes, cool to room temperature, neutralize it with 10 mL of 0.5 N NaOH, dilute to a final volume, and assay the resulting sample using high performance liquid chromatography (HPLC) with a photodiode array detector to determine the amount of degradation, if any, and the peak purity of the principal analyte. If possible, strive to achieve between 15% and 30% degradation. If the initial experiment results in a degree of degradation far outside the 15% to 30% range, then adjust the acid strength and/or reflux time accordingly and repeat the experiment. Measuring the concentration of principal analyte in the degraded sample versus an unadulterated sample and comparing that value to the starting amount of sample can determine percent degradation.
Base hydrolysis: Expose the sample to a base such as sodium hydroxide or potassium hydroxide, starting with 100% working amount plus 10 mL of 0.5 N base. Reflux the alkaline sample for 30 minutes, cool to room temperature, neutralize it with 10 mL of 0.5 N mineral acid, dilute to a final volume, and assay the resulting sample using HPLC with a photodiode array detector to determine the amount of degradation, if any, and the peak purity of the principal analyte. If possible, strive to achieve between 15% and 30% degradation. If the initial experiment results in a degree of degradation far outside the 15% to 30% range, then adjust the base strength and/or reflux time accordingly and repeat the experiment. Measuring the concentration of principal analyte in the degraded sample versus an unadulterated sample and comparing that value to the starting amount of sample can determine percent degradation.
Oxidation: Expose the sample to 10% hydrogen peroxide starting with 100% working amount plus 10 mL of 10% hydrogen peroxide. Let the sample stand for 30 minutes at room temperature with occasional swirling, dilute to a final volume, and assay the resulting sample using HPLC with a photodiode array detector to determine the amount of degradation, if any, and the peak purity of the principal analyte. If possible, strive to achieve between 15% and 30% degradation. If the initial experiment results in a degree of degradation far outside the 15% to 30% range, then adjust the hydrogen peroxide strength and/or standing time accordingly and repeat the experiment. Measuring the concentration of principal analyte in the degraded sample versus an unadulterated sample and comparing that value to the starting amount of sample can determine percent degradation.
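Each of the procedures above ends with the same calculation. As a minimal sketch, the percent-degradation comparison and the 15% to 30% acceptance check might look like this (the assay concentrations are hypothetical):

```python
# The percent-degradation comparison described in each procedure above:
# assay the stressed sample against an unadulterated control and express
# the loss of principal analyte. Concentrations below are hypothetical.

def percent_degradation(control_conc: float, stressed_conc: float) -> float:
    """Loss of principal analyte relative to the unadulterated control."""
    return 100.0 * (control_conc - stressed_conc) / control_conc

def within_target(pct: float, low: float = 15.0, high: float = 30.0) -> bool:
    """Check the preferred 15% to 30% degradation window."""
    return low <= pct <= high

pct = percent_degradation(control_conc=0.100, stressed_conc=0.078)  # mg/mL
verdict = "accept" if within_target(pct) else "adjust conditions and repeat"
print(f"Degradation: {pct:.1f}% -> {verdict}")
```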
Physical degradation is achieved by exposure to heat and light.
Heat: Expose a portion of the sample to 60°C dry heat for five days. Allow the sample to cool down, then assay the resulting heat-exposed sample using HPLC with a photodiode array detector to determine the amount of degradation, if any, and the peak purity of the principal analyte. If possible, strive to achieve between 15% and 30% degradation. If the initial experiment results in a degree of degradation far outside the 15% to 30% range, then adjust the heating time accordingly and repeat the experiment. Measuring the concentration of principal analyte in the degraded sample versus an unadulterated sample and comparing that value to the starting amount of sample can determine percent degradation.
Light (photostability): Expose a portion of sample to 1.2 million lux-hours of cool white light and 200 watt-hours/m² of 360 nm ultraviolet A light, using a suitable photostability system (contact the author for more information). After light exposure, assay the resulting sample using HPLC with a photodiode array detector to determine the amount of degradation, if any, and the peak purity of the principal analyte. Report the percent degradation, if any, that is observed. Measuring the concentration of principal analyte in the degraded sample versus an unadulterated sample and comparing that value to the starting amount of sample can determine percent degradation.
For each forced degradation, there should be no interfering degradants under the principal analyte peak as determined by peak purity analysis. Peak purity can be evaluated using purity parameters (factors), peak ratios, spectral overlays, or ratiograms. If this criterion is met, then the method is deemed to be stability indicating.

Selectivity

Process a placebo (product without the active ingredient) as a sample at 10 times the working concentration. Assay the placebo as a sample to make sure that no interferences are present that will prevent detection of the active ingredient(s).
Typical interfering agents are preservatives and dyes that are often very ultraviolet active. For HPLC methods, there should be no peaks greater than noise at the retention time of the principal analyte.

Linearity and Range

Prepare a series of five standards that span a range of 50% to 150% of the analyte working concentration. For example, if one were performing a linearity study on acetaminophen (APAP), and the working concentration was 0.1 mg/mL, then one might proceed as follows:
  • Make 200 mL of a solution of APAP in alcohol with a concentration of 1 mg/mL (10 times the working concentration). This is the APAP stock solution.
  • Prepare the five working standards according to Table 1 (see below, left).
Table 1. Acetaminophen (APAP) Linearity Standards
Using the HPLC method for the product, inject each standard six times. For each standard—50%, 75%, 100%, 125%, and 150% of the working concentration, respectively—calculate the percent relative standard deviation of the peak areas to determine injection precision at each level. Plot the mean area counts for each standard level versus concentration, performing a linear regression on the resulting curve.
In most cases, it is desirable to have an injection precision, in terms of peak area relative standard deviation, of less than 2% for each standard level, i.e., 50% to 150% of the working concentration. The coefficient of determination (r2) for the standard curve should typically be > 99.9%, and the Pearson correlation coefficient (r) should be > 0.9995. Contrary to common belief, r2 is not the linear correlation coefficient: r is the correlation coefficient, and r2 is the coefficient of determination, which here represents the proportion of the variation in area counts that can be explained by concentration.
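A short sketch of these linearity calculations, using only the Python standard library (3.10+) and hypothetical peak areas: per-level %RSD of the six replicate injections, then a regression of mean area against concentration, reporting r and r2.

```python
# Sketch of the linearity calculations described above, on hypothetical
# peak-area data: %RSD per standard level, then linear regression of mean
# area versus concentration, reporting r and r^2. Requires Python 3.10+.

import statistics

def rsd_percent(values: list[float]) -> float:
    """Relative standard deviation of replicate injections, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Five APAP levels (mg/mL), six injections each -- hypothetical areas.
levels = {
    0.050: [251_100, 249_800, 252_400, 250_600, 251_900, 250_200],
    0.075: [376_300, 374_900, 377_800, 375_600, 376_900, 375_100],
    0.100: [501_500, 499_400, 503_200, 500_800, 502_100, 499_900],
    0.125: [626_800, 624_500, 628_900, 625_700, 627_600, 625_000],
    0.150: [752_000, 749_300, 754_600, 750_900, 753_200, 750_100],
}

for conc, areas in levels.items():
    print(f"{conc:.3f} mg/mL: %RSD = {rsd_percent(areas):.2f}%")

x = list(levels)                                   # concentrations
y = [statistics.mean(a) for a in levels.values()]  # mean area per level
slope, intercept = statistics.linear_regression(x, y)
r = statistics.correlation(x, y)
print(f"slope = {slope:.0f}, intercept = {intercept:.0f}")
print(f"r = {r:.5f}, r^2 = {r*r:.5f}")
```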

FORMULATION - Protein Stability | Recombinant Albumin, a Robust Excipient



By Phuong Tran, PhD; Geoffrey Francis; Larissa Chirkova, PhD; and Sally Grosvenor
Reduces excipients required in manufacturing process
The development and production of therapeutic proteins and peptides are rapidly expanding in the pharmaceutical industry, with the manufacture of monoclonal antibodies (mAbs) a primary focus. Currently, there are more than 25 approved mAbs worldwide; 240 therapeutics are in clinical studies, and 26 of these are in Phase III clinical trials.1 Recombinant protein molecules are attractive drug candidates due to their site-specific binding and effectiveness at low concentrations, factors which lead to fewer side effects.
During the manufacturing process, transport, and storage, the protein therapeutic can be exposed to a variety of stresses that promote protein instability and degradation. Protein instability can be the result of physical and chemical degradation. Physical protein degradation, such as aggregation, denaturation, and adsorption, is often the result of changes in temperature, shear stresses, and lyophilization, while oxidation of the protein during processing and storage is also common. Instability of the protein therapeutic can occur in either the liquid or solid form and is dependent on the protein sequence, isoelectric point, hydrophobicity, and carbohydrate content.2 Because degradation has the potential to increase the immunogenicity and decrease the efficacy and shelf life of the protein drug product, protein stability is a key issue in the final product.
Figure 1: Suppression of merozoite surface protein (MSP-2) protein aggregation in the presence of rAlbumin. rAlbumin suppressed aggregation of the MSP-2 protein (3.5 mg/ml) in a concentration-dependent manner following a single freeze–thaw cycle.
To protect against degradation, protein therapeutics are usually formulated with excipients to provide the product with an acceptable shelf life for storage and shipping. As defined by the International Pharmaceutical Excipients Council, excipients are any substance other than the active drug that have been appropriately evaluated for safety and are included in a drug delivery system to: (1) aid processing of the system during manufacture; (2) protect, support, or enhance stability and/or bioavailability; (3) assist in product identification; or (4) enhance any other attribute of the overall safety and effectiveness of the drug product during storage and use.3
Human serum albumin (HSA) has been used as an excipient in a number of therapeutic protein formulations, including erythropoietin, antihemophilic factor, and interferon beta-1a.4 Because it is the most abundant protein in human blood, the potential for HSA to elicit an immunogenic response is minimal. However, due to regulatory concerns over the risk of blood-borne contaminants, such as prions or viruses, in these blood-derived products, formulation scientists have moved away from its use in drug formulation.
This article investigates the use of a recombinant human albumin (rAlbumin), expressed in Saccharomyces cerevisiae and manufactured to current good manufacturing practice, as an excipient and examines its ability to prevent or minimize physical and chemical degradation of drug substances in various test formulations. Drug substances, including transforming growth factor-β3 (TGF-β3), insulin-like growth factor–I (IGF-I), and a malarial antigen vaccine protein, were formulated in the presence of recombinant human albumin, and the effect on protein stability was examined.

Protects Against Aggregation

Aggregation, generally described as the association of misfolded proteins, is a major problem encountered during the manufacturing process of therapeutic proteins, resulting in significant product loss and potentially increasing the immunogenicity of the drug product. There are numerous process operations during which protein aggregation can occur, such as refolding, purification, mixing, freeze-thawing, freeze-drying, and reconstitution. Aggregation can also occur during transport and storage. The formation of these aggregates is generally concentration dependent, which is a particular challenge for protein therapeutics formulated at high concentrations.
Figure 2: The effect of rAlbumin in protecting insulin-like growth factor-I (IGF-I) against oxidation in a concentration-dependent manner following exposure to H2O2.
In this study, rAlbumin was evaluated for its ability to suppress amyloid fibril formation by the merozoite surface protein (MSP-2) after a single freeze-thaw cycle. Amyloid-like fibril formation was measured by its effect on light scattering at λ 320 nm. A range of rAlbumin concentrations was evaluated for the ability to suppress aggregation of MSP-2, which was chosen as the model for investigating aggregation due to its tendency to form amyloid-like fibril aggregates.5,6
At various concentrations, rAlbumin was dissolved in a solution of phosphate buffered saline (PBS); the MSP-2 protein (3.5 mg/ml) was then added to all samples, followed by a single freeze-thaw cycle. Freeze-thawing is one of the numerous process conditions during which protein aggregation can occur.2 Samples were then plated in a 96-well plate and stored at 2°C to 8°C. Absorbance readings were taken at λ 320 nm at multiple time intervals over a five-day period.
A variety of excipients in common use within the industry to improve protein stability were also compared with rAlbumin for their ability to inhibit protein aggregation. The excipients rAlbumin (15.0 mg/ml), glycine (20.0 mg/ml), PEG 400 (1.0 mg/ml), polysorbate 80 (0.82 mg/ml), and polysorbate 80 (8.2 mg/ml) were tested in the same model described above. Absorbance readings were taken at multiple time intervals at λ 320 nm. Results indicated that aggregation was suppressed by 50% at a 1:1 molar ratio of the antigen to rAlbumin and reduced by 80% in the presence of the highest concentration of rAlbumin (see Figure 1). rAlbumin also suppressed aggregation of the MSP-2 antigen to a greater extent compared with other commercially available excipients when the antigen was formulated in PBS pH 6.4 (see Figure 2).
Although surfactants like polysorbate 80 have proven beneficial during the manufacturing process by reducing stress-induced aggregation, they are also known to adversely affect protein stability during storage by acting as a pro-oxidant and increasing the oxidation of the therapeutic drug substance.7
Figure 3: The effect of rAlbumin compared to polysorbate 80 in preventing the nonspecific binding of transforming growth factor-β3 (TGF-β3) to polypropylene containers.
The mechanism by which albumin inhibits aggregation is not well understood. HSA is known to sequester >90% of the Alzheimer’s disease-related peptides Aβ (1-40) and Aβ (1-42) in blood serum, presumably affecting the ability of the Aβ peptides to aggregate.8 HSA is also known to bind and transport metal ions such as Cu. Complexes between Cu and Aβ peptides are involved in Aβ aggregation; therefore, HSA is capable of reducing Cu-induced aggregation.9

rAlbumin Acts as Antioxidant

Oxidation of a protein therapeutic is a problem for manufacturers, particularly during storage. Factors that can affect the oxidation rate of proteins during storage include oxygen (head space), light, the physical state of the product, and temperature. Modifications to proteins through oxidation can lead to a range of functional consequences, such as altered binding activities, increased susceptibility to aggregation and proteolysis, increased or decreased uptake by cells, and altered immunogenicity. It is for these reasons that U.S. Food and Drug Administration guidelines recommend that oxidation be controlled in the product formulation of therapeutic proteins. During storage, the methionine residues are often the most susceptible to oxidation; formulation excipients are often used to protect the protein from this oxidation.
IGF-I, an important anabolic growth factor, is susceptible to oxidation, particularly during storage, and was therefore chosen as an appropriate model to investigate rAlbumin’s ability to inhibit oxidation.10 To test the functionality of rAlbumin as an antioxidant, pharmaceutically relevant oxidation conditions were modeled using trace amounts of the oxidizing agent hydrogen peroxide (H2O2). Both rAlbumin (0, 10.0, 15.0, and 20.0 mg/ml) and L-methionine (0, 0.1, 0.2, and 0.3 mg/ml) were dissolved in solutions of PBS buffer. The IGF-I protein (20 µg/ml) was then added to all samples, followed by H2O2 to a final concentration of 0.0005%, and incubated for eight hours at 37°C. The reaction was terminated with catalase, and the degree of oxidation was analyzed by reverse-phase high performance liquid chromatography (HPLC). The percentage of oxidized IGF-I was calculated against the main IGF-I peak for all samples. Oxidation of IGF-I was significantly reduced by the presence of increasing concentrations of rAlbumin (see Figure 3). At the highest concentration, oxidation of IGF-I was reduced by 93%. It is noteworthy that the initial IGF-I sample already contained 11.6% of the oxidized form due to storage alone.
The ability of rAlbumin to act as an antioxidant following exposure to hydrogen peroxide was also compared to that of the commonly used antioxidant L-methionine (see Figure 4). The oxidative protection of IGF-I by rAlbumin was achieved at molar concentrations approximately 13-fold lower than those of L-methionine.
HSA is known to have an antioxidant function, primarily due to a single free thiol at position Cys 34, and this single thiol of human serum albumin acts as a potential scavenger for reactive oxygen and nitrogen species. As an antioxidant, albumin also binds and transports metal ions, such as Cu2+ and Fe3+, thus reducing the availability of these ions to cause oxidation.11

Nonspecific Adsorption Reduced by rAlbumin

Like most proteins, protein therapeutics are susceptible to nonspecific adsorption to various surfaces. This loss of product can significantly decrease the concentration in solution, altering the efficacy of the drug. In addition to product loss, protein adsorption can lead to structural change, denaturation, and inactivation due to aggregation. Nonspecific protein adsorption is a particular problem for liquid product at low concentrations. Many surfaces the drug product can come into contact with during the manufacturing process lead to product loss due to nonspecific adsorption. These include, but are not limited to, delivery pumps, silicone tubing, and glass and plastic containers.
HSA has been used as a blocking agent to prevent therapeutic proteins from binding to various surfaces. The mechanism is not well understood, but albumin is believed to bind to charged surfaces through oppositely charged functional groups on the molecule. Hydrophobic interactions occur at lower strength and are more easily reversible.12 TGF-β3, an active pharmaceutical ingredient in scarless wound healing, is a hydrophobic protein with a propensity to adsorb to container surfaces. The percentage loss of TGF-β3 due to nonspecific binding to polypropylene or glass vial surfaces in the absence and presence of rAlbumin was examined.
To test the effectiveness of this model, TGF-β3 (0.5–60 µg/ml) was added to a polypropylene or glass container containing citrate buffer pH 3.6. Each sample was mixed and centrifuged for three minutes at 13,500 rpm. Samples were transferred to HPLC vials for analysis via reverse-phase HPLC. Percentage recoveries of TGF-β3 were calculated against the TGF-β3 reference standard. rAlbumin was then assessed for its ability to prevent the loss of TGF-β3 to the container surface. Citrate buffer pH 3.6 was added to a polypropylene test container, followed by rAlbumin (0–0.5 mg/ml), then TGF-β3 (0.2 µg/ml). Each sample was mixed and centrifuged as described above. rAlbumin (0.1 mg/ml) was also compared against polysorbate 80 (0.1 mg/ml), using the method described above, for ability to protect TGF-β3 (0.2–1.0 µg/ml) against nonspecific adsorption to glass surfaces.
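The recovery arithmetic itself is simple; a minimal sketch, with invented peak areas, might look like this:

# Percent recovery of TGF-beta3 calculated against the reference standard.
# Both peak areas are invented for illustration.
reference_area = 185_400  # mean peak area of the TGF-beta3 reference standard
sample_area = 176_900     # peak area recovered from the test container

recovery = 100 * sample_area / reference_area
print(f"TGF-beta3 recovery: {recovery:.1f}%")  # -> 95.4%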
Figure 4: The effect of rAlbumin (0.1 mg/ml) compared to polysorbate 80 (0.1 mg/ml) in preventing the nonspecific binding of transforming growth factor-β3 (TGF-β3) (0.2–1.0 µg/ml) to glass vials.
In this study, rAlbumin significantly reduced the protein loss that occurred due to the nonspecific binding of TGF-β3 to glass and plastic. In the absence of rAlbumin, nonspecific binding of TGF-β3 increased progressively at concentrations below 60 µg/ml, and recovery of the protein was significantly reduced at the lowest concentrations. In the presence of rAlbumin, however, the nonspecific binding of TGF-β3 to vessel surfaces was minimal, with >95% recovery achieved using just 0.05 mg/ml of rAlbumin. The benefit of adding rAlbumin was then compared to that of polysorbate 80 in preventing nonspecific binding of proteins to plastic and glass surfaces. Polysorbate 80 is widely used in the formulation of biotherapeutic products to address aggregation and nonspecific binding, but its suitability as a protein stabilizer raises certain concerns, because it is a potential source of peroxides. In this comparison, rAlbumin prevented the nonspecific binding of TGF-β3 at least as well as polysorbate 80 in plastic containers and significantly better in glass vials.

Multipurpose Excipient

Formulation of therapeutic proteins and peptides that provide optimal product stability during process, storage, and shipping is essential for the biopharmaceutical manufacturer. Numerous excipients that reduce protein degradation, which occurs through physical or chemical pathways, are in common use within the industry. Generally, although each excipient has a specific function in stabilizing a protein substance, it may also stimulate and enhance alternate pathways of protein degradation.
In this study, rAlbumin was found to be an effective multipurpose excipient. rAlbumin protects proteins against aggregation, especially amyloid-like fibril products, acts as an antioxidant in preventing protein oxidation, and serves as a blocking agent to prevent nonspecific adsorption to surfaces. rAlbumin reduces the total number of excipients required, simplifying the formulation strategy. Further, it could be argued that the use of a single or reduced number of excipients to formulate drugs may accelerate development time by more quickly allowing more “universal” excipient mixes to be derived—and that rAlbumin could facilitate such an approach.
Dr. Tran is senior scientist in the development and discovery group, Francis is chief scientist, Dr. Chirkova is discovery and development manager, and Grosvenor is scientific communications manager at Novozymes Biopharma, Adelaide, Australia. For more information, go to www.biopharma.novozymes.com or contact Phuong Tran at phut@novozymes.com.

References

  1. Reichert JM. Antibodies to watch in 2010. MAbs. 2010;2(1):84-100.
  2. Bogard WC Jr., Dean RT, Deo Y, et al. Practical considerations in the production, purification, and formulation of monoclonal antibodies for immunoscintigraphy and immunotherapy. Semin Nucl Med. 1989;19(3):202-220.
  3. International Pharmaceutical Excipients Council (IPEC), Pharmaceutical Quality Group (PQG). The Joint-PQG Good Manufacturing Practices Guide for Pharmaceutical Excipients. 2006. Available at: http://ipecamericas.org/node/132. Accessed September 25, 2010.
  4. Berezenko S. Formulation of biotherapeutics: avoiding human and animal excipients. Eur BioPharm Rev. Summer 2005.
  5. Wang W, Singh S, Zeng DL, et al. Antibody structure, instability, and formulation. J Pharm Sci. 2007;96(1):1-26.
  6. Kueltzo LA, Wang W, Randolph TW, et al. Effects of solution conditions, processing parameters, and container materials on aggregation of a monoclonal antibody during freeze-thawing. J Pharm Sci. 2008;97(5):1801-1812.
  7. Wang W, Wang YJ, Wang DQ. Dual effects of Tween 80 on protein stability. Int J Pharm. 2008;347(1-2):31-38.
  8. Milojevic J, Raditsis A, Melacini G. Human serum albumin inhibits Abeta fibrillization through a “monomer-competitor” mechanism. Biophys J. 2009;97(9):2585-2594.
  9. Perrone L, Mothes E, Vignes M, et al. Copper transfer from Cu-Abeta to human serum albumin inhibits aggregation, radical production and reduces Abeta toxicity. Chembiochem. 2010;11(1):110-118.
  10. Fransson JR. Oxidation of human insulin-like growth factor I in formulation studies. 3. Factorial experiments of the effects of ferric ions, EDTA, and visible light on methionine oxidation and covalent aggregation in aqueous solution. J Pharm Sci. 1997;86(9):1046-1050.
  11. Fransson J, Hagman A. Oxidation of human insulin-like growth factor I in formulation studies, II. Effects of oxygen, visible light, and phosphate on methionine oxidation in aqueous solution and evaluation of possible mechanisms. Pharm Res. 1996;13(10):1476-1481.
  12. Jeyachandran YL, Mielczarski E, Rai B, et al. Quantitative and qualitative evaluation of adsorption/desorption of bovine serum albumin on hydrophilic and hydrophobic surfaces. Langmuir. 2009;25(19):11614-11620.

Outsource Analytical Testing for QC



By Paul Smith
Can save money, provide essential flexibility
Outsourcing your analytical testing can be one of the best decisions you ever make, or it can cause sleepless nights for years to come if your outsourced results are scrutinized during a U.S. Food and Drug Administration (FDA) or other regulatory audit. The outcome will depend on your approach to selecting your outsourcing partner and on the working relationship you develop with that company. All analysis must be carried out to meet a valid business decision or requirement; contract analysis is no different.
Analysis costs money, and laboratory analysis is outsourced to save money. During the lifetime of a drug product, the approach to costs changes—from research and development (R&D) to routine supply to product maturity (generic manufacture). For example, during the R&D phase, during which the process, chemistry, sources of materials (and therefore potential impurities), and formulations may change, the emphasis is on time and maximizing the analytical information available to make timely scientific decisions on an evolving platform. This can result in the use of hyphenated analytical techniques and technology that may:
  • not routinely be available outside of an R&D facility;
  • not be sufficiently robust for a routine quality control (QC) laboratory; and/or
  • not be required to support routine current good manufacturing practice (cGMP).
In contrast, QC analysis carried out to support routine manufacture has a strong cost focus, which becomes stronger as a product matures into generic manufacture. The business and analytical requirements thus change during the lifetime of a drug. For pharmaceutical contract QC analysis, the drug company usually has to perform analytical technology transfer to allow the contract QC laboratories to perform the testing. Providing documented evidence of a successful transfer is fraught with potential problems, especially when there are gaps in the contract laboratories’ core technology; for example, the contract laboratories may not have access to necessary analytical techniques, such as nuclear magnetic resonance.
Anything that is outside the core expertise of the contract laboratories will be a potential high-risk area. Therefore, even the transfer of gradient high performance liquid chromatography (HPLC) methods could be a problem if the method was not robust or if the laboratory is used to isocratic HPLC systems.
These examples are mentioned because the nature of the analysis requirements directly affects:
  • the type of contract testing laboratories needed;
  • the process used for deciding on a supplier;
  • the response time required for the results;
  • the approach to analytical quality normally used; and
  • the time component of the decision.
In addition to QC testing, transfer of a manufacturing process to a different facility will require environmental waste-stream monitoring, both as part of process optimization and to comply with discharge consent limits, depending on what the waste streams contain, local legislation, and the consent limits that apply.
Additionally, unless a fully dedicated manufacturing facility is being used, analysis of cleaning validation samples is usually required. For contract facilities new to cGMP manufacture, it is critical that any “history of use” information cover use of the facility prior to cGMP operation and that a risk assessment be performed. If this information is not available, how do you know the facility was not used to manufacture pesticides or other potentially high-risk materials?
Figure 1: Contrasting analytical requirements for quality control, environmental, and cleaning validation.

Different Types of Contract Analysis

Aspects of contract pharmaceutical QC analysis, environmental testing, and cleaning validation testing are highlighted in Figure 1, which shows some of the differences among these analytical sample types.
There is usually strong competition among service providers, along with well-established accreditation schemes and interlaboratory collaboration opportunities to demonstrate compliance and conformance. As shown in Figure 1, however, the approach taken to sample analysis can be very different from that used in routine pharmaceutical QC analysis. Using HPLC as an example, in pharmaceutical QC testing, samples are tested in an analytical “run” or injection sequence, in which pharmacopeial system suitability requirements (e.g., USP <621>) define the standardization and the criteria for demonstrating that the results are valid within the run. The run or injection sequence is repeated each time samples are tested, and the ratio of setup injections to sample injections is high (it may take several hours to test one sample).
Additionally, analytical equipment may not be dedicated to a particular kind of testing, forcing the analyst to take time each day to set up the HPLC system for testing samples. In contrast, efficient environmental testing laboratories will dedicate analytical equipment to particular methods, allowing those methods to run 24/7.
Generally, this approach provides greater stability for HPLC equipment, which is especially important for trace analysis; fewer analytical problems are experienced with dedicated use of HPLC. For environmental testing, this dedication and 24/7 operation means that as soon as a particular sample requires testing and sample preparation is complete, it is added to the run. Overall, this approach reduces the cost per sample and increases the return on investment for the equipment. The impact of instrument downtime for repair or maintenance becomes much more significant where equipment is dedicated, however.
Outsourcing saves money. Sometimes, more importantly, it provides essential flexibility in a resource-constrained QC or development department, allowing more projects to be supported. The service provider’s flexibility is another important consideration; a testing laboratory with limited flexibility would constrain any benefits it might offer.
Figure 2: Components of the data quality triangle based on USP <1058>.

Analytical Data Quality

In the June/July edition of Pharmaceutical Formulation & Quality, Lori Valigra discussed analytical instrument qualification and introduced the data quality triangle from USP general chapter <1058> as part of her article, “Qualifying Analytical Instruments: General chapter <1058> clarifies terminology, classifies instruments.”1 This critical data quality triangle from USP <1058> is reproduced in Figure 2. The principles are fundamentally applicable to all analytical testing laboratories.
A few differences exist among laboratories in relation to the data triangle shown in Figure 2, however:
  • the level where the greatest emphasis is placed;
  • the detail applied at each level; and
  • how the information is documented.
Failing to ensure the performance of the analytical instrument (by instrument qualification) or the method (by method validation) introduces a risk that the results may be invalid. A full and robust answer to the deceptively simple question, “How do you know your analytical results are valid?” almost always relies on all aspects of the data triangle, because all of its levels contribute evidence for valid data. The different ways in which the data triangle is interpreted can be both a risk and an opportunity. Change can be slow in large pharmaceutical companies, particularly when there is an opportunity to relax standards to appropriate GMP levels.
Therefore, contracting out services can be used to facilitate and accelerate change; for example, a company may adopt the standards of work that the contract lab uses, standards that are appropriate to the work, rather than the company’s own default standards. There are risks: Any change can become an area of focus in a regulatory audit, and adopting different standards can mean an increased risk from the standards of work applied in the contract laboratories.
Contract environmental testing laboratories will place significant emphasis on the results obtained from the analysis of control samples (the top of the triangle). The results of these control samples are plotted in control charts such as Shewhart charts, both to demonstrate the quality of the results and to demonstrate and monitor the ongoing performance of the analytical system, including equipment, method, and analyst. Additionally, contract environmental testing laboratories are often ISO 17025 accredited, so the results are reported with uncertainty figures. For contract environmental testing laboratories, the ongoing system suitability is particularly significant in defending the question “How do you know your results are valid?” For cleaning validation samples, good design of the analytical method, run injection sequence, and sample testing can mean that the results are reported as a pass or fail against the limit test being applied, rather than numerically as in pharmaceutical QC testing.
This type of “self-contained” screening analysis, where results are reported as pass or fail, rather than numerical values, can be much more efficient. Emphasis is focused on the samples that fail or are close to the limit. These are the high-risk samples, where further cleaning of the process equipment is required. However, this approach needs careful consideration at the method development and validation stage, because it is fundamental to how the method will be used.
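To make the control-charting practice described above concrete, here is a minimal Python sketch of a Shewhart individuals chart for control-sample results; the historical recoveries and the new result are invented for illustration.

from statistics import mean, stdev

def shewhart_limits(history):
    """Center line and 3-sigma control limits from historical control results."""
    center = mean(history)
    sigma = stdev(history)
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical control-sample recoveries (%) from previous runs.
history = [99.8, 100.2, 100.1, 99.6, 100.4, 99.9, 100.0, 100.3]
lcl, center, ucl = shewhart_limits(history)

new_result = 101.9  # today's control sample
print(f"LCL = {lcl:.2f}, CL = {center:.2f}, UCL = {ucl:.2f}")
print("in control:", lcl <= new_result <= ucl)  # False here: investigate the run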

Technology Transfer

When the analysis is performed with an in-house method that is not internationally accepted, the QC laboratory that developed the method is responsible for training the contract laboratory and transferring the analytical capability to test the samples. This is where many problems can occur. The robustness of the analytical methods and the capability and expertise of the scientists in the contract laboratories become significant. It is at this stage that the partnership relationship, essential for the success of the services being contracted out, is developed. This relationship is particularly important if analytical problems arise.
A technology transfer protocol documents and demonstrates how the analysis capability is transferred from the pharmaceutical company to the contract laboratories. However, a poorly designed protocol can “prove” that the contract laboratories’ results are different even when the difference has no practical importance. Care must be taken when selecting acceptance criteria for subjective tests, such as appearance and solution color, and for low-level impurities. Over-reliance on Student t-test comparisons can be troublesome, and not necessarily appropriate, for comparing low-level impurities. Analytical problems and out-of-specification (OOS) result investigations are well-known areas of high FDA compliance focus and, therefore, potential risk. Because of this high risk, OOS investigations, as well as the laboratory’s procedure for returning an instrument to routine use following maintenance or a breakdown, should be scrutinized when evaluating contract laboratories.
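As a hedged illustration of the t-test caution above, the sketch below runs a standard two-sample t-test (via SciPy) on invented low-level impurity results; near the quantitation limit, a tiny and practically irrelevant offset can still come out statistically significant.

from scipy import stats

# Hypothetical low-level impurity results (% w/w) from the originating and
# contract laboratories; all values are invented.
originating = [0.051, 0.049, 0.050, 0.052, 0.050, 0.051]
contract = [0.054, 0.053, 0.055, 0.054, 0.053, 0.054]

t_stat, p_value = stats.ttest_ind(originating, contract)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A "significant" p-value here reflects a difference of ~0.003% w/w, far
# below any practical acceptance criterion, so statistical significance
# alone is a poor basis for accepting or rejecting a transfer.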
As part of the analytical technology transfer process, the analytical instrumentation used at both sites must be considered. If gradient HPLC instruments are from different suppliers, the “dead volume” and other performance attributes of the instruments can play an unseen problematic role in the results obtained. In my experience of technology transfer, there are generally fewer problems if both laboratories use the same analytical equipment for some tests, including HPLC.
Sometimes this is not possible. For example, the contract laboratory may be in a country where, for logistical reasons, another instrument is more popular. If using the same instrumentation is not possible, reduce analytical transfer risks by qualifying the full working range of instrument use. Testing all the instruments in the same rigorous way would establish a common baseline, allowing ready identification of any differences in performance between the instruments. These differences can then be taken into consideration when transferring methods between instruments.
Figure 3: Options for approval of a contract laboratory.

Service Provider Selection

Each company uses a different approach when selecting contract service providers. The number of variables involved can make the decision complex, so at the evaluation stage companies often use a balanced scorecard or a weighted Kepner-Tregoe decision analysis model.2 This balanced approach ensures a fair and unbiased decision and prevents settling on a company for reasons that are not significant in the wider business and compliance context. The disadvantage of this approach is that a company with limited experience contracting out a particular service may base the weighting factors (which aspects of the decision are more important) on theoretical considerations. This can result in a poor structure and, therefore, a poor decision. However, narrowing the selection to the top few candidates and then considering the potential risks helps balance this out.
Because the requirements for testing QC pharmaceutical compounds are quite different from those for environmental samples or screening-type analysis, using different selection processes, perhaps simply applying different weighting factors in the Kepner-Tregoe model, would be ideal. In addition to the selection against the laboratory requirements, there is also the wider business requirement for approval of the contract laboratories or supplier. Some options that can be considered are summarized in Figure 3. The greater the risk, the farther down the triangle the decision is made.
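To make the weighting idea concrete, here is a minimal sketch of a weighted-score comparison in the Kepner-Tregoe style; the criteria, weights, and scores are invented for illustration.

# Minimal sketch of a weighted-score supplier comparison. Criteria,
# weights, and scores are illustrative assumptions, not recommendations.
criteria = {  # weight (importance, 1-10)
    "technical capability": 9,
    "quality system / audit history": 8,
    "turnaround time": 6,
    "cost": 5,
}

suppliers = {  # score per criterion (1-10)
    "Lab A": {"technical capability": 8, "quality system / audit history": 9,
              "turnaround time": 6, "cost": 5},
    "Lab B": {"technical capability": 6, "quality system / audit history": 7,
              "turnaround time": 9, "cost": 9},
}

for name, scores in suppliers.items():
    total = sum(criteria[c] * scores[c] for c in criteria)
    print(f"{name}: weighted score = {total}")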
Complex pharmaceutical QC testing requires a physical audit with full technology transfer and ongoing analytical result monitoring. But for contract environmental analysis, or any other widely available analytical testing, completion of a supplier questionnaire by a laboratory with the required ISO accreditation might suffice. Preferably, the laboratory will supply analytical raw data, such as chromatograms or infrared spectra, along with the certificate of analysis, even though some laboratories charge more for this data. Receiving and reviewing the analytical raw data reduces the risk of compliance problems. Finally, when evaluating the data, ask yourself if it looks real, like the kind of data you would normally observe in your laboratory. Such thinking can be critical to identifying potential problems and risks.
Smith is the validation program manager for Europe in Analytical Sciences and Laboratory Services at PerkinElmer. Reach him at paul.smith@perkinelmer.com.

References

  1. Valigra L. Qualifying Analytical Instruments: General chapter <1058> clarifies terminology, classifies instruments. Pharmaceutical Formulation & Quality website. June/July 2010. Available at: http://www.pharmaquality.com/ME2/Audiences/dirmod.asp?sid=325598564E8C4B3EB736C7159241312D&nm=Browse+Articles&type=Publishing&mod=Publications%3A%3AArticle&mid=D3E3C719D8D44216836DCA4F4144BEC4&tier=4&id=C4030354EB4D4737BB47516C6D815CEB&AudID=5648A5C28C97462DBBDB309539B820EF. Accessed September 24, 2010.
  2. Kepner CH, Tregoe BB. The Rational Manager: A Systematic Approach to Problem Solving and Decision-Making. New York: McGraw-Hill; 1965.