Sunday, February 13, 2011

Manufacturing Tablets: tablets dosage form advantages and disadvantages


In the tablet-pressing process, it is important that all ingredients be dry, powdered, and of as uniform a grain size as possible. The main guideline in manufacture is to ensure that each tablet contains an equal, appropriate amount of active ingredient, so the ingredients must be well mixed. Compressed tablets are subjected to great pressure in order to compact the material. If a sufficiently homogeneous mix of the components cannot be obtained with simple mixing, the ingredients must be granulated prior to compression to assure an even distribution of the active compound in the final tablet. Two basic techniques are used to prepare powders for granulation into a tablet: wet granulation and dry granulation.
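
As a rough quantitative illustration of what "well-mixed" means in practice, blend homogeneity is commonly judged by assaying samples from several locations in the mix and computing the relative standard deviation (RSD). The assay values and the acceptance limit in this sketch are invented for illustration:

```python
# Sketch: blend uniformity check via relative standard deviation (RSD).
# The assay values and the 5% acceptance limit are illustrative only.
import statistics

# Percent of label claim measured at different locations in the blend
assays = [98.2, 101.5, 99.8, 100.9, 97.6, 100.3]

mean = statistics.mean(assays)
rsd = 100 * statistics.stdev(assays) / mean  # sample std dev, as % of mean

print(f"mean = {mean:.1f}% of label claim, RSD = {rsd:.2f}%")
if rsd <= 5.0:  # hypothetical acceptance limit
    print("Blend uniformity acceptable; proceed to compression.")
else:
    print("Blend not sufficiently homogeneous; consider granulation.")
```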

Powders that can be mixed well do not require granulation and can be compressed into tablets through direct compression.

Direct Compression
This method is used when a group of ingredients can be blended and placed in a tablet press to make a tablet without any of the ingredients having to be changed. This is not very common, because many tablets contain active pharmaceutical ingredients at concentrations that do not allow for direct compression, or excipients whose formulation properties are not conducive to it.

Granulation is the process of collecting particles together by creating bonds between them. There are several different methods of granulation. The most popular, used in over 70% of formulations in tablet manufacture, is wet granulation. Dry granulation is another method used to form granules.

Wet granulation for tablets
Wet granulation is a process of adding a liquid binder or adhesive to the powder mixture. The amount of liquid must be properly controlled: overwetting will cause the granules to be too hard, and underwetting will cause them to be too soft and friable. Aqueous solutions have the advantage of being safer to handle than organic solvents.

Procedure of Wet Granulation for tablets
Step 1: Weighing and blending - the active ingredient, filler, and disintegration agents are weighed and mixed.
Step 2: The wet granulate is prepared by adding the liquid binder/adhesive. Examples of binders/adhesives include aqueous preparations of cornstarch, natural gums such as acacia, cellulose derivatives such as methyl cellulose, CMC, gelatins, and povidone. Ingredients are placed within a granulator, which helps ensure the correct density of the composition.
Step 3: Screening the damp mass into pellets or granules.
Step 4: Drying the granulation.
Step 5: Dry screening - after the granules are dried, they are passed through a screen of smaller size than the one used for the wet mass, to select granules of uniform size and allow even fill in the die cavity.
Step 6: Lubrication - a dry lubricant, antiadherent, and glidant are added to the granules, either by dusting over the spread-out granules or by blending with them. The lubricant reduces friction between the tablet and the walls of the die cavity, and the antiadherent reduces sticking of the tablet to the die and punch.
Step 7: Tableting - the last step, in which the granulation is fed into the die cavity and compressed between a lower and an upper punch.
Water may be used as the liquid binder, but many active ingredients are not compatible with water. Water mixed into the powder can form bonds between powder particles that are strong enough to lock them together. However, once the water dries, those bonds may not be strong enough to hold, and the mass may fall apart. Povidone, also known as polyvinylpyrrolidone (PVP), is one of the most commonly used pharmaceutical binders. PVP and a solvent are mixed with the powders to form bonds during the process, and the solvent then evaporates. Once the solvent has evaporated and the powders have formed a densely held mass, the granulation is milled, which results in the formation of granules.


Dry granulation for tablets
This process is used when the product to be granulated is sensitive to moisture and heat. Dry granulation can be conducted on a press using slugging tooling or on a roller compactor, commonly referred to as a Chilsonator. Dry granulation equipment offers a wide range of pressures and roll types to attain proper densification. However, the process may require repeated compaction steps to attain the proper granule end point.

Process times are often reduced and equipment requirements are streamlined; therefore the cost is reduced. However, dry granulation often produces a higher percentage of fines or noncompacted products, which could compromise the quality or create yield problems for the tablet. It requires drugs or excipients with cohesive properties.


Some granular chemicals are suitable for direct compression (free flowing) e.g. potassium chloride.
Tableting excipients with good flow characteristics and compressibility allow for direct compression of a variety of drugs.

Fluidized bed granulation
It is a multiple-step process performed in the same vessel to pre-heat, granulate, and dry the powders. Today it is a commonly used method in pharmaceuticals because it allows the individual company to more fully control the powder preparation process. It requires only one piece of machinery, which mixes all the powders and granules on a bed of air.


Tablet Compaction Simulator
Tablet formulations are designed and tested using a laboratory machine called a Tablet Compaction Simulator or Powder Compaction Simulator. This is a computer-controlled device that can measure the punch positions, punch pressures, friction forces, die wall pressures, and sometimes the tablet internal temperature during the compaction event. Numerous experiments with small quantities of different mixtures can be performed to optimise a formulation. Mathematically corrected punch motions can be programmed to simulate any type and model of production tablet press. Small differences in production machine stiffness can change the strain rate during compaction by large amounts, affecting temperature and compaction behaviour. To simulate true production conditions in today's high-speed tablet presses, modern Compaction Simulators are very powerful and strong.
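
Punch-pressure and density data from such a simulator are often reduced with standard powder-compaction models. One common choice, offered here only as a sketch with synthetic numbers rather than anything specific to these machines, is the Heckel equation, ln(1/(1 - D)) = kP + A, where D is the relative density of the compact at pressure P:

```python
# Sketch: Heckel analysis of punch pressure vs. compact density data.
# The data points are synthetic; a real study would use values recorded
# by the compaction simulator.
import numpy as np

pressure_mpa = np.array([50, 100, 150, 200, 250])       # punch pressure P
rel_density = np.array([0.70, 0.80, 0.86, 0.90, 0.93])  # relative density D

# Heckel equation: ln(1 / (1 - D)) = k * P + A
y = np.log(1.0 / (1.0 - rel_density))
k, A = np.polyfit(pressure_mpa, y, 1)

# 1/k (the "mean yield pressure") indicates how readily the material
# deforms plastically; lower values mean easier plastic deformation.
print(f"k = {k:.4f} 1/MPa, A = {A:.3f}, mean yield pressure = {1/k:.0f} MPa")
```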

Initial quantities of active pharmaceutical ingredients are very expensive to produce, and using a Compaction Simulator reduces the amount of powder required for development.

Load controlled tests are particularly useful for designing multi-layer tablets where layer interface conditions must be studied.

Test data recorded by the Simulators must meet the regulations for security, completeness, and quality to support new or modified drug filings, and show that the designed manufacturing process is robust and reliable.

Tablet coating: tablets dosage form advantages and disadvantages


Many tablets today are coated after being pressed. Although sugar-coating was popular in the past, the process has many drawbacks. Modern tablet coatings are polymer and polysaccharide based, with plasticizers and pigments included. Tablet coatings must be stable and strong enough to survive the handling of the tablet, must not make tablets stick together during the coating process, and must follow the fine contours of embossed characters or logos on tablets. Coatings can also facilitate printing on tablets, if required. Coatings are necessary for tablets that have an unpleasant taste, and a smoother finish makes large tablets easier to swallow. Tablet coatings are also useful to extend the shelf-life of components that are sensitive to moisture or oxidation. Opaque materials like titanium dioxide can protect light-sensitive actives from photodegradation. Special coatings (for example with pearlescent effects) can enhance brand recognition.

If the active ingredient of a tablet is sensitive to acid, or is irritant to the stomach lining, an enteric coating can be used, which is resistant to stomach acid and dissolves in the high pH of the intestines. Enteric coatings are also used for medicines that can be negatively affected by taking a long time to reach the small intestine where they are absorbed. Coatings are often chosen to control the rate of dissolution of the drug in the gastro-intestinal tract. Some drugs will be absorbed better at different points in the digestive system. If the highest percentage of absorption of a drug takes place in the stomach, a coating that dissolves quickly and easily in acid will be selected. If the rate of absorption is best in the large intestine or colon, then a coating that is acid resistant and dissolves slowly would be used to ensure it reached that point before dispersing. The area of the gastro-intestinal tract with the best absorption for any particular drug is usually determined by clinical trials.

This is the last stage in tablet formulation, and it is done to protect the tablet from temperature and humidity, to mask the taste, to impart special characteristics and distinction to the product, and to prevent inadvertent contact with the drug substance. The most common forms of tablet coating are sugar coating and film coating.

Coating is also performed for the following reasons:

Controlling site of drug release
Providing controlled, continuous release or reduce the frequency of drug dosing
Maintaining physical or chemical drug integrity
Enhancing product acceptance and appearance
Sugar coating is done by rolling the tablets in heavy syrup, in a similar process to candy making. It is done to give tablets an attractive appearance and to make pill-taking less unpleasant. However, the process is tedious and time-consuming, and it requires the expertise of a highly skilled technician. It also adds a substantial amount of weight to the tablet, which can create problems in packaging and distribution.

In comparison to sugar coating, film coating is more durable, less bulky, and less time-consuming, though it is less effective at masking the tablet's appearance. One purpose of such a coating is to prevent dissolution of the tablet in the stomach, where the stomach acid may degrade the active ingredient or where the time of passage may compromise its effectiveness, in favor of dissolution in the small intestine, where the active principle is better absorbed.

Tablet presses:tablets dosage form advantages and disadvantages


Tablet presses, also called tableting machines, range from small, inexpensive bench-top models that make one tablet at a time (single-station presses), at no more than a few thousand tablets an hour and with only around half a ton of pressure, to large, computerized, industrial models (multi-station rotary or eccentric presses) that can make hundreds of thousands to millions of tablets an hour at much greater pressure. Some tablet presses can make extremely large tablets, such as some toilet cleaning and deodorizing products or dishwasher detergent tablets. Others can make smaller tablets, from regular aspirin down to some the size of a BB pellet. Tablet presses may also be used to form tablets out of a wide variety of materials, from powdered metals to cookie crumbs. The tablet press is an essential piece of machinery for any pharmaceutical and nutraceutical manufacturer.

Pill-splitters
It is sometimes necessary to split tablets into halves or quarters. Tablets are easier to break accurately if scored, but there are devices called pill-splitters which cut unscored and scored tablets. Tablets with special coatings (for example enteric coatings or controlled-release coatings) should not be broken before use, as this will expose the tablet core to the digestive juices, short-circuiting the intended delayed-release effect.


Tablet Dosage Form: Tablets dosage form advantages and disadvantages


A tablet is usually a compressed preparation that contains:

5-10% of the drug (active substance);
80% of fillers, disintegrants, lubricants, glidants, and binders; and
10% of compounds which ensure easy disintegration, disaggregation, and dissolution of the tablet in the stomach or the intestine.
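
As a worked example of these proportions, the component masses for one tablet and for a batch follow directly; the 250 mg tablet weight, the exact percentage split, and the batch size below are hypothetical:

```python
# Sketch: component masses for a tablet, using the rough percentages
# above. The 250 mg tablet weight and batch size are hypothetical.
tablet_mass_mg = 250.0
composition = {                      # one possible split of the fractions
    "active substance": 0.08,        # within the 5-10% range above
    "fillers/binders/lubricants": 0.80,
    "disintegration aids": 0.12,     # ~10%, rounded so the total is 100%
}

batch_size = 100_000  # tablets
for name, frac in composition.items():
    per_tablet = tablet_mass_mg * frac
    per_batch_kg = per_tablet * batch_size / 1e6   # mg -> kg
    print(f"{name}: {per_tablet:.1f} mg/tablet, {per_batch_kg:.2f} kg/batch")
```
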
The disintegration time can be modified for a rapid effect or for sustained release.

Special coatings can make the tablet resistant to the stomach acids such that it only disintegrates in the duodenum as a result of enzyme action or alkaline pH.

Pills can be coated with sugar, varnish, or wax to disguise the taste.

Some tablets are designed with an osmotically active core, surrounded by an impermeable membrane with a pore in it. This allows the drug to percolate out from the tablet at a constant rate as the tablet moves through the digestive tract.

Why Clean Room Face Masks are Necessary within Critical Environments

While we strive to remain as clean and hygienic as possible, the truth of the matter is that human beings, by their very nature, carry a plethora of contaminant sources every day. While we tend to build up immunity to the various germs that plague us, we are unfortunately incapable of fully preventing the spread of these bacteria. Because people are a significantly prevalent source of contamination, clean room face masks have become a necessity in critical environments. Face masks have certainly evolved from when they were originally worn in medical facilities years ago. Today's clean room face masks are designed to reduce germ dispersion through the mouth, and more specifically, to protect controlled environments from human contamination.
Clean room face masks tend to boast superior properties of filtration efficiency as well as comfort. Because many clean room workers and operators are required to wear their gear for extended durations of time, clean room face masks may be customized according to each individual's specific needs. The outer material of clean room face masks typically consists of fractured film, which is an ideal consistency for extremely critical environments. Ear-loops are often made with polyurethane because of its non-shedding properties. Additionally, the ear-loops are typically knitted so as to remain comfortable for long periods of wear. In another effort to enhance comfort and eradicate shedding, headbands are also often made with polyurethane.
Covering the head is often a complicated task because of our own needs as well as the needs of the particular environment that we will inhabit. It is important to make sure that no facial hair is exposed, as it offers yet another way for contamination to enter a clean room. Although the primary goal of sporting a clean room face mask is to reduce the waste that we may bring into an area, it also serves to protect us from potentially hazardous chemicals or solutions. Being exposed to various chemicals does have the potential to trigger negative reactions within us. While different clean rooms boast different classifications, it is always better to be as cautious as possible so as not to contaminate anything within the space. Classifications are determined by the number of particles in the air allowed per cubic foot. Standards vary depending upon both the industry and the application, though it is usually safe to assume that intense preventative measures should be taken against any level of contamination whenever possible.

Ideal for use in pharmaceutical, medical, and biotechnological environments among many others, clean room face masks help to reduce the amount of contamination within an environment and thus inevitably lessen the number of product recalls that can occur within these industries. Government agencies demand strict guidelines for all of their projects and plans, thus clean room face masks are essential when operating within these sites as well.

Although clean room face masks are a crucial contamination control product, there are a variety of other products that must be implemented and used simultaneously in order to maintain the levels of cleanliness demanded by most critical environments. Clean room equipment and products are vital to the productivity, profitability, and safety of a critical environment. Aside from reducing contamination within these locations, controlled products can help to positively impact the lives of others while improving the quality of life globally.

The Importance of Bacteria Identification in Clean Rooms

A comprehensive environmental monitoring program of clean rooms should include routine monitoring of both viable and non-viable airborne particulates. Although there is no requirement for the microbial identification of all contaminants present in these controlled environments, an environmental control program shall include an appropriate level of bacteria identification obtained from sampling. There are several methods of bacterial identification available.
The first step toward correct bacterial identification, especially of a clean room isolate, is Gram staining, since it can provide elucidating clues about the source of the microbial contamination. If identification of isolates reveals Gram-positive cocci, the contamination may derive from humans. If it reveals Gram-positive rods, the contamination may derive from dust or from strains resistant to disinfectants. If it reveals Gram-negative rods, the contamination may derive from water or any moistened surface.
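
This triage logic is simple enough to capture as a lookup table. The sketch below only mirrors the rules of thumb above; it is not a validated identification scheme:

```python
# Sketch: first-pass triage of a clean room isolate by Gram stain result,
# mirroring the rules described above. Illustrative only; real microbial
# identification goes well beyond this step.
LIKELY_SOURCE = {
    ("positive", "cocci"): "human-borne contamination",
    ("positive", "rods"): "dust or disinfectant-resistant strains",
    ("negative", "rods"): "water or a moistened surface",
}

def triage(gram: str, morphology: str) -> str:
    return LIKELY_SOURCE.get((gram, morphology),
                             "no rule of thumb; identify further")

print(triage("positive", "cocci"))   # -> human-borne contamination
print(triage("negative", "rods"))    # -> water or a moistened surface
```
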
Microbial identification in pharmaceutical clean rooms is required for several reasons associated with quality assurance: determination of organisms from the manufacturing environment; bacteria identification from final product testing; demonstrating absence of named organisms from non-sterile products and water; quality control of fermentation stocks in biotechnology; and confirmation of test organisms in validation processes.
More and more, the Food and Drug Administration (FDA) expects bacterial identification to aid in determining the usual flora of a specific site, to evaluate the effectiveness of cleaning, and to troubleshoot the source of contamination when action levels are exceeded or sterility tests become contaminated.

Lab Equipment for Clean Rooms and Critical Environment

There are some facets of medicine, industry, and scientific research where there is a need for an environment as free as possible of any outside pollutants or substances that could introduce unwanted factors or variables into whatever procedure is being investigated, developed, or operated on. For this purpose, laboratories with critical environments called 'clean rooms' have been developed. There is a huge variety of facilities, apparatus, and clothing designed for a very wide range of research and control laboratories. Cole-Parmer stocks the equipment you need for a clean room as well as all the usual laboratory equipment and glassware you require.

A clean room's level of contamination has to be controlled, with a specified limit on the number of particles per cubic meter. For instance, the air in a typical urban street contains approximately 35,000,000 particles per cubic meter. An ISO 1 clean room, by contrast, may contain no particles at all larger than 0.5 microns, and only 12 particles per cubic meter of smaller sizes.

Critical environments may be as large as an entire factory manufacturing sensitive foods or materials, biotechnology, electronic technology, or medicines, or as small as a pre-term baby's incubator. Obviously, the degree of sterility varies greatly. Some 'clean rooms' may be moderate, such as an ICU hospital ward where all instruments and equipment are sterilized but protective clothing may be limited to a face mask and sterile gloves. In an operating theater, sterile gowns and foot and head coverings are usually added, and the room is generally closed to anyone not involved in the actual surgery or medical procedure. In very critical environments, more sophisticated clothing is sometimes worn, even to the extent of helmets and separate breathing apparatus. Cole-Parmer has a range of protective clothing, pro-clean overalls, gloves, head and foot covers, and masks to prevent contamination.

Everything inside the clean room is sterilized and/or decontaminated. Even cleaning materials and tools are specialized for use inside the controlled environment and kept sterile; Cole-Parmer supplies all the necessary specialized cleaning agents, mops, and brushes. Airflow, filtration, air pressure, humidity, and temperature are generally also controlled. The entrances and exits are where the highest precautions are normally taken, with a 'gray room' where clothing is changed before entering a vacuum chamber, air lock, or air shower, where even the sterile clothing is decontaminated before entrance into the clean room itself. Cole-Parmer provides extremely useful layered adhesive-coated mats that capture dirt and dust so you don't track them into the clean room; when the top layer is dirty, you peel it off to expose a fresh surface underneath.

As you can see, there is a whole industry devoted to producing conditions for sterility and decontamination, ranging from architecture, building, and engineering to protective clothing manufacture, as well as machinery, apparatus, tools, and equipment. You will find most of the apparatus and equipment you need at Cole-Parmer.
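
The class limits come from a simple formula in ISO 14644-1: a class N room permits at most C_N = 10^N x (0.1/D)^2.08 particles per cubic meter at particle size >= D micrometers. A short sketch of that calculation:

```python
# Sketch: maximum particle concentrations per ISO 14644-1.
# C_N = 10**N * (0.1 / D)**2.08 particles per cubic meter, for particles
# of size >= D micrometers in an ISO class N clean room.
def iso_limit(iso_class: float, particle_size_um: float) -> float:
    return 10**iso_class * (0.1 / particle_size_um) ** 2.08

for n in (1, 5, 7):
    limits = {d: iso_limit(n, d) for d in (0.1, 0.5)}
    # For ISO 1 at 0.5 um the formula gives < 1 particle/m^3, which is
    # why ISO 1 air is described as having none at that size.
    print(f"ISO {n}: " + ", ".join(
        f">= {d} um: {c:,.0f}/m^3" for d, c in limits.items()))
```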

Wednesday, February 9, 2011

Active Pharmaceutical Ingredients

An API, or active ingredient, is the biologically active substance in a drug. APIs are also found in pesticides. Some drugs may contain more than one active ingredient. APIs are usually suspended in a liquid medium or mixed with an excipient for delivery; the excipient is a pharmaceutically inert substance.
There is no standardization in drugs containing APIs, especially in the case of herbal mixtures. A risk arises for patients on multiple medications: an API can react adversely with the ingredients of other medicines, and this can be life-threatening. Although there are online services to help identify the APIs in a patient's medicines, many patients are not aware of them.
A large amount of API production is outsourced to countries like India and China, especially through contract manufacture. API and intermediate outsourcing contributes eighty percent of the total outsourced manufacture in pharmaceuticals.
Most of the contract manufacturing in India is outsourced for generic formulations, but as infrastructure in India develops further and markets grow, new chemical entities will become part of outsourcing as well.
Fine chemicals, or APIs, are subject to good manufacturing practices and hence to many rules and regulations. They are manufactured in separate factories, where the active ingredients are mixed with excipients, solvents, or inactive pigments.
More than twenty-five percent of APIs and bulk drugs from India were exported to the EU. The growth in exports to European countries is exponential, promising a bright future for pharma companies in India.
A CEP is a benchmark for the quality manufacture of APIs, accorded by the EDQM in Europe. According to Pharmexcil, the Pharmaceutical Export Promotion Council of India, as many as one hundred and thirty-six pharmaceutical companies have received approval for a CEP. The certificate is granted when a company complies with a monograph of the European Pharmacopoeia.
There is great demand for API intermediates, the manufacture of which is mostly outsourced to India and China. With improvements in manufacturing processes in these countries, the industry's prospects are very bright.

Monday, February 7, 2011

Induced Grating Technology in Particle Size Analysis


By: Lab Manager Magazine -

Problem: Currently, most particle size analyzers measure nanoparticles using scattered light; however, in some cases this presents many physical restrictions and also requires the input of the refractive index as a measurement condition. Future trends in science indicate the need for accurate sizing of nano and even sub-nano particles, particularly in the area of drug development and pharmaceuticals.
Solution: Although traditional particle sizing methods using scattered light alone have not been ruled out as a solution to this problem, a new technology that has received considerable attention and an Editors’ Choice bronze award for best new product at Pittcon 2009 is the induced grating (IG) method from Shimadzu Scientific Instruments’ IG-1000. IG is a new technique for measuring the size of nanoparticles using dielectrophoresis and diffracted light that delivers excellent reproducibility and acquires stable data, particularly for sub-10 nm particles.
How it Works
Particle size is measured using the diffusion rate of a grating that is composed of particles in the liquid. The diffusion rate of large particles is slow and that of small particles, especially nanometer particles, is fast. The diffusion behavior of particles can be monitored by detecting the change of primary diffracted light.
In the Shimadzu IG-1000, a diffraction grating is formed by drawing particles toward the electrodes when dielectrophoresis is on, and the diffraction grating disappears when the dielectrophoresis is turned off and particles are released away from the electrodes. The decay process of this particle density diffraction grating is measured via the change in intensity of the diffracted light, and a relationship between particle size and diffusion rate is then established.
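
The article does not spell out the IG-1000's exact data reduction, but the standard link between diffusion rate and particle size for spheres in a liquid is the Stokes-Einstein relation, D = kB*T/(3*pi*eta*d). A generic sketch, with an invented diffusion coefficient, of solving it for the hydrodynamic diameter d:

```python
# Sketch: hydrodynamic diameter from a measured diffusion coefficient
# via the Stokes-Einstein relation, D = kB*T / (3*pi*eta*d).
# The diffusion coefficient below is a made-up example value; the IG
# instrument's own data reduction may differ in detail.
import math

KB = 1.380649e-23      # Boltzmann constant, J/K

def hydrodynamic_diameter(diff_coeff_m2_s, temp_k=298.15, eta_pa_s=0.00089):
    """Particle diameter in meters (eta defaults to water at 25 C)."""
    return KB * temp_k / (3 * math.pi * eta_pa_s * diff_coeff_m2_s)

d = hydrodynamic_diameter(5.0e-11)           # example: D = 5e-11 m^2/s
print(f"diameter ~ {d * 1e9:.1f} nm")        # -> roughly 10 nm
```
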
Stable measurement with good reproducibility is possible because IG utilizes optical signals emitted by the diffraction grating formed by the particles and not scattered light emitted by the particles. Even in the single nano region, a good S/N ratio can be obtained.
The new measurement principle is resistant to contamination and, even if the sample is mixed with small amounts of foreign particles, information about the particles to be analyzed is captured reliably. The filtering of samples in order to remove coarse particles is not required.
An alternating voltage is applied to cyclically arranged electrodes, and a cyclic concentration distribution of microscopic particles is formed in the liquid by dielectrophoresis. Although the cyclic concentration distribution of microscopic particles acts as a diffraction grating (a particle concentration diffraction grating), if the alternating voltage is stopped, the grating diffuses and disappears (patent pending).
The cyclically arranged electrodes also function as a diffraction grating, although the light created is weaker than the diffracted light created by the particle concentration diffraction grating. The electrode configuration has been modified as shown in the figure so that the pitch of the electrode diffraction grating is half that of the particle concentration diffraction grating (patent pending). In this way there is a more precise measurement.
The IG method also ensures high reproducibility and the acquisition of stable data. In particular, high reproducibility for particle sizes of less than 10 nm removes the uncertainty of particle analysis in the single nano region.

The Analytical Lab As Strategic Asset


By: Cozette Cuppett -

These days, laboratory operations are more visible than ever to management, whether they’re the organization’s shining star for profits or a capital expense black hole. If you are managing a lab and a budget, chances are you’ve gotten to know your organization’s purchasing and finance team—and they’ve gotten to know you—much better within the last year.
This increased visibility can be unnerving, especially for lab managers who have previously been more focused on the science than the business of the laboratory. Prepare for increased exposure and expectations of today’s management teams. Draw upon your experience and use the information at hand to confidently address budgetary, resource allocation, and other project management inquiries. Develop a strategy and a realistic implementation plan to enable your operations to meet or exceed your organization’s demands. Most important, deliver meaningful results. Position yourself so that your interactions across functions in the organization build your credentials rather than destroy your self-esteem.
Easier said than done. Begin by taking stock of your laboratory. Review your management’s expectations, factor in external influences that are out of your control, and determine how you’re going to deliver on your goals. Put yourself in a state of readiness so that you can recognize the short-term opportunities that will allow you to justify and drive a longer-term transformation of your laboratory into a strategic business asset—whether this means putting business systems in place to understand where to focus your efforts and assets or introducing forward-looking technology platforms to meet the needs of an ever-evolving business climate.
Get a clear picture of your current operations:
Assets and liabilities

Ongoing review of asset utilization and internal analytical process workflow is increasingly a way of life in the analytical laboratory. It’s necessary to plan strategic projects, justify capital requests, decommission assets, shift resources as necessary, and, in general, understand the facility’s operations.
Tools such as the Waters Empower™ 2 Business Intelligence Manager™ (BIM) provide a web-based dashboard software solution for rapid analysis of chromatography instrumentation performance data for faster, more qualified decisions on laboratory and business operations. Designed with proven business intelligence concepts that have been successful across many industries, the BIM allows lab managers and system administrators using Empower 2 Enterprise chromatography software to critically understand and exploit the strengths of their laboratories and identify areas that need added support.
Large volumes of complex information, such as chromatographic system usage, method analysis, and process flow, can be presented and visualized using dashboard tools like Waters Empower™ 2 Business Intelligence Manager.
As time passes, many laboratory technologies no longer provide a significant benefit to the laboratory—whether they are warehoused, sit unused on the lab bench, or consume more supplies and service time than is paid back in analytical impact. To determine the value of your facility’s technologies, take advantage of instrument vendors’ service and support organizations and asset management solutions. These services assist in evaluating where your technology is in its life cycle, so that you can intelligently decide when to decommission instruments or shift them to other departments where they’ll best achieve capacity utilization. If the technology no longer fits your organization’s needs, many times trade-in opportunities exist whereby you can get credit toward the purchase of a newer, more efficient or higher-capability model.
Embrace opportunities for change
The supply-side shortage of acetonitrile (ACN) has created an impetus for change in many laboratories. Even in facilities that are not directly impacted by this solvent shortage, the potential risk it poses to product supply and revenue generation is enough to catch the attention of senior management. With attention comes opportunity.
Savvy laboratory managers are leveraging the solvent shortage1 in conjunction with internal sustainability initiatives as a way to promote investment in technologies that not only minimize solvent consumption and disposal and their associated costs, but also improve laboratory productivity. Two technologies that have been cited by industry as tools that support greener laboratory operations are UltraPerformance LC® (UPLC®) and supercritical fluid chromatography (SFC).
By employing sub–2 μm particles, UPLC delivers more efficient chromatographic separations, enabling the instrument to use less solvent in shorter run times while maintaining or improving the performance achieved with traditional HPLC.3 For example, the USP human insulin related-compounds assay consumes 20 mL of ACN per sample with a 68-minute run time by HPLC, whereas a UPLC separation of similar performance consumes 1.7 mL per sample with a 27-minute run time.4 This translates into a 92 percent decrease in acetonitrile consumption and a greater than 250 percent improvement in throughput. In a business environment where acetonitrile is being rationed and laboratory productivity is intensely monitored, this type of process improvement has been the basis of internal recognition awards for several of Waters' customers by their own senior management teams.
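
The quoted savings follow directly from the per-sample figures; a quick arithmetic check:

```python
# Sketch: verifying the solvent-savings and throughput figures quoted
# above for the HPLC vs. UPLC insulin assay.
hplc_acn_ml, hplc_run_min = 20.0, 68.0
uplc_acn_ml, uplc_run_min = 1.7, 27.0

acn_savings = 100 * (1 - uplc_acn_ml / hplc_acn_ml)
# Throughput scales as 1 / run time, so UPLC runs at this percentage
# of the HPLC sample rate.
throughput_pct = 100 * (hplc_run_min / uplc_run_min)

print(f"acetonitrile reduction: {acn_savings:.0f}%")     # ~92%
print(f"throughput vs. HPLC:    {throughput_pct:.0f}%")  # >250%
```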


Technologies such as UPLC can greatly decrease analytical run time and solvent consumption when compared to traditional HPLC—shown here, a 92 percent decrease in acetonitrile consumption and a greater than 250 percent improvement in throughput.
Alternatively, using carbon dioxide as its primary solvent, SFC enables scientists to generate excellent chromatographic results, particularly for chiral and preparative separations. In a solvent-intensive application like preparative chromatography, SFC simultaneously reduces solvent costs and shortens drydown time for collected sample fractions.
Adopting alternative technologies is one approach a lab manager can take to address problematic external influences such as the solvent shortage. Adapting processes is another. With high-purity acetonitrile being prioritized for quality control, scientists in development laboratories find themselves in a position where methanol and other solvents are increasingly part of their method scouting and optimization protocols.
Where methods are being created or changed to use less acetonitrile, an opportunity is created for laboratory managers to modify existing method development and validation practices; for example, to make quality-by-design (QbD) part of the process. By introducing a design of experiment (DoE) approach into method development studies5,6,7, scientists can define a knowledge space where the impact of the chosen chromatographic parameters on separation performance is fully characterized. Once such a method is validated, operating within that design space provides chemists with a statistically defendable range of chromatographic parameters that can be used without having to invoke regulatory change control processes.
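
For a sense of what a DoE study enumerates, the sketch below generates a minimal full-factorial design; the factors and levels are invented, and commercial DoE packages add the statistical modeling and non-exhaustive optimal designs on top of this basic idea:

```python
# Sketch: enumerating a minimal full-factorial design of experiments (DoE)
# over chromatographic method parameters. The factors and levels are
# invented for illustration.
from itertools import product

factors = {
    "column_temp_C": [30, 40, 50],
    "gradient_time_min": [5, 10],
    "pH": [2.5, 4.5, 6.8],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(runs)} runs required for the full factorial")  # 3*2*3 = 18
for run in runs[:3]:   # show the first few experimental conditions
    print(run)
```
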
This regulatory flexibility can pay significant dividends downstream as methods are transferred to different laboratories and where unforeseen changes in materials and processes occur. In the past, DoE approaches were limited to individuals who had access to statisticians for appropriate study design and whose laboratories had the analytical capacity to run the relatively large sample sets required. Commercially available DoE software, such as FusionAE™ from S-Matrix® and the highly efficient, fast chromatographic separations provided by UPLC, now make the DoE approach tenable for more laboratories.
Build a strong platform for the future
Technology standardization has much to offer laboratory management: efficiencies in operator training, standard operating procedure (SOP) maintenance, service and support, purchasing decisions, and technology transfer. On the flip side, many scientists shudder at the thought of being confined to a predefined, standardized solution to their application challenge. Platform technologies introduce flexibility to standardization. Built from the core facets of the standardized technology, these platforms allow scientists to customize components to achieve specialized tasks.
Take, for example, UPLC. This liquid chromatography platform technology first manifested itself in the ACQUITY UPLC® system and its bridged ethylene hybrid (BEH) columns, launched in 2004 by Waters with the core functionality needed to support mainstream LC separations. Since its launch, the UPLC platform has expanded to include the nanoACQUITY UPLC® system, which adapts the hardware and columns to support sample-limited and two-dimensional applications such as those encountered in life science laboratories. For scientists exploring the use of chip-based technologies with mass spectrometry, the TRIZAIC™ UPLC® system with nanoTile™ technology brings simplified user interaction and increased consistency to nanoscale separations.
Further evolution of the UPLC platform is evident in its open architecture UPLC configuration and user interface that supports walk-up sample analysis and quantification. The platform even extends to the manufacturing floor, in the PATROL™ UPLC® process analyzer for online and atline analysis of production processes. UPLC also transcends typical vendor boundaries, with the ACQUITY UPLC and nanoACQUITY UPLC systems being controlled by many key suppliers’ chromatography and mass spectrometry (MS) software packages. This provides access to UPLC for scientists who have already standardized on a specific data management platform. In addition, expansion of UPLC column offerings and specialized application kits broadens the platform’s use to include separation of amino acids, peptides, oligonucleotides, aflatoxins, and perfluorinated compounds, to name a few.
The platform concept is not limited to chromatography. Due to the individual design of their ionization sources, switching among mass spectrometers can require different optimization settings, in particular the ionization and fragmentation parameters. With the Xevo™ MS platform from Waters, scientists can efficiently move between a tandem quadrupole and time-of-flight MS with the Xevo TQ and Xevo QTof, respectively, and expect the same ionization settings to transfer between the instruments. Moreover, tools such as IntelliStart™ automate system setup and optimization steps, removing the subjective influence of individual chemists and increasing the accessibility of these instruments to more analysts.
As organizations move toward lean operation, where any unnecessary step is stripped from a process, platform technologies are a natural fit. They provide a base level of consistency that facilitates servicing, training, and procurement while offering the versatility necessary to accomplish business-critical tasks.
Source smart and make your investments deliver
Today’s business environment is making everyone work and invest smarter. In the laboratory, this may mean stretching available capital by purchasing used instrumentation. Some original equipment manufacturers offer certified pre-owned instruments for sale at a significant discount. These systems are refurbished by certified technicians using ISO-documented processes. Whether you purchase new or used technology, once your capital is spent, the expectation is that you will demonstrate results. Whom you source the necessary equipment from is as strategic a decision as what equipment you buy.
The value of every technology investment is dependent on the implementation. How often is an instrument purchased and then not used to its full potential? Or worse, it sits idle on the lab bench—misused, misunderstood, or abandoned completely. Often this is the result of insufficient training, education, and application support services either at the time of purchase or throughout the technology’s lifetime in the laboratory. By not availing your laboratory of these services from the technology vendor, instruments can languish in an obscure corner of the laboratory, never fully achieving the promise of the technology or delivering the expected return on investment.
This is a period of simultaneous challenge and opportunity for lab managers. Laboratory transformation and investment are taking place, but not without a comprehensive understanding of existing operational capabilities and requirements, justification and demonstration of return on investment, and detailed implementation plans. Leverage today’s short-term business challenges as an opportunity to transform your laboratory into one of your organization’s greatest assets.
References:
1. “A Solvent Dries Up,” Alex Tullo. Chemical & Engineering News, 86(47), November 24, 2008.
2. Green Analytical Chemistry at Pfizer. Mark Harding. British Pharmaceutical Conference, September 2008.
3. ACQUITY UltraPerformance LC by Design. Waters System Technology Note, 720000880EN.
4. Transfer of the USP Human Insulin-Related Compounds HPLC Method to the ACQUITY UPLC System. Tanya Jenkins and Patricia McConville. Waters Application Note, 720001396, 2005.
5. “A Quality-by-Design Methodology for Rapid LC Method Development, Part I.” Ira Krull, Michael Swartz, Joseph Turpin, Patrick H. Lukulay, and Richard Verseput. LCGC North America, December 2008.
6. “A Quality-by-Design Methodology for Rapid LC Method Development, Part II.” Ira Krull, Michael Swartz, Joseph Turpin, Patrick H. Lukulay, and Richard Verseput. LCGC North America, January 2009.
7. “A Quality-by-Design Methodology for Rapid LC Method Development, Part III.” Michael Swartz, Ira Krull, Joseph Turpin, Patrick H. Lukulay, and Richard Verseput. LCGC North America, April 2009.

Modular Lab Design


By: Steve Hackman

Budgets are tight. Schedules are compressed. Change is a constant. The continual need for new and upgraded laboratory space begs a question: Can one design approach fit all?
Fundamental to the process of laboratory facility planning is an understanding of some basic design principles that ensure future adaptability. Laboratories of all types—from corporate to clinical, research to instructional, and forensic to biocontainment—must have the flexibility to adapt to future, as-yet-unknown changes in technology and scientific processes.
While each lab type remains unique, the purposeful application of modular design, zoning of tasks and implementation of flexible planning concepts will produce the most efficient and cost-effective solutions. Modular design is by no means a cookie-cutter approach but rather a simplified approach to achieving a wide range of goals in laboratory design.
By following 10 core planning and design principles, it is possible to achieve a highly functional and highly adaptable facility.
Modular design
The first and most fundamental concept of laboratory planning is the application of modular design. This approach maintains the highest level of flexibility by allowing the functional requirements of the lab to influence the form—to design from the inside out. The modular approach provides interchangeability of spaces as well as opportunities for increased efficiency.
1. Integrated module
The module is the basic building block for organizing the laboratory. It is the unit of space required for lab occupants and equipment to function safely and effectively and is created by considering the depth of useful zones on both sides of an aisle. Ideally, the length of the module is calculated as a multiple of the width, for added flexibility in two directions, thus making it easier to adapt to a new scientific process with optional space orientation. A module that is integrated allows for multiple room sizes that share a common denominator or fit within a holistic, implicit approach. When the integrated module is implemented, even a multistory facility can be effectively designed to accommodate such varied functions as parking, imaging, vivarium, patients, labs and specialty processes.
The module can be combined to form large, open labs or divided in half, thirds, or quarter-size increments for various lab support requirements.
2. Right-sizing
Right-sizing of infrastructure such as mechanical and structural systems is key to long-term adaptability. Systems design should be fully integrated into the planning process at the earliest stages and utilize modular concepts to allow future changes to occur with minimal disruption. Air systems, power, data and piped services such as lab gases and water must be planned to anticipate future requirements and minimize a “fatal flaw” that could impede flexibility. The structural grid should pose minimal obstruction to work tasks. Floor-to-floor heights should permit space for routing future utilities for a range of scientific tasks, from wet chemistry and bioprocess experiments to physical/electrical and computational science.
3. Limited inventory
Using fewer basic design elements will result in a greater number of workable solutions. This is the principle of less is more, in which minimizing the inventory of customized spaces or modular differences can actually help achieve more flexibility of options. Customization is still possible through the ability to create endless configurations from a limited library of parts and pieces—not unlike the limitless variations of Tinkertoys™ or Legos™. Casework elements should be standardized based on current or anticipated scientific needs, to maximize options for reconfiguration. The design phase of a new or renovated laboratory project is the best opportunity to develop or improve upon current building standards; for example, replacing odd-sized spaces with like-sized universal rooms that can serve varied functions.
Zoning of tasks
The second concept of laboratory planning, zoning of tasks, refers to the way a building is organized—the interrelationship of spaces and the distribution of those spaces. Zoning embodies the thoughtful arrangement of laboratory activities to achieve the highest degree of flexibility. The application of this concept benefits the occupants by enhancing lab safety and fostering interaction.
For the St. Jude Children’s Research Building in Memphis, SmithGroup introduced multiple planning options with the implementation of a two-directional integrated module and a lean approach to systems distribution and room sizing.
4. Lab/office zoning
The lab/office relationship is critical to maintaining connectivity between the research bench and desk functions. There are several advantages to zoning offices and labs separately. First, costs and operational energy savings are realized; for example, unlike in the laboratory, office air can be recirculated, making office systems easier to maintain for human comfort. Through proper zoning, lab flexibility is increased by limiting the potential obstacle of offices “inside” the laboratory. The desired connectivity can be maintained between the lab and office zones by designing for proper proximities in conjunction with a networked lab management system.
5. Space distribution
Providing appropriate space distribution of lab and non-lab functions is the outcome of a collaborative programming and planning process. An understanding of what is unique about each lab, combined with satisfying critical relationships and adjacencies, will produce a synergistic plan. Benchmarking of similar facilities is important to gain insight from peer institutions. Building elements such as stairs and elevators, mechanical risers, and utility rooms should be positioned outside of the zones that require the most flexibility. The grouping of multiple labs to create neighborhoods with amenities can enhance team dynamics and foster interaction. The careful placement of like spaces such as offices, lab support and open labs can turn an average facility into an efficient working whole.
6. Separated support
Sequestering support zones can free general wet labs to function as generic labs, for use by multiple occupants and multiple purposes. The quantity of lab support varies based on the science; for example, there is a need for a higher ratio of lab support in a biomedical research facility. It is more common to find biosafety labs, clean rooms, imaging and specialized functions in today’s laboratories. These unique space types often require a higher level of infrastructure and greater construction costs. When collocated and distinctly zoned, they are more readily shared, improving operational costs and minimizing utility distribution.
With this accomplished, the generic labs can support a wider array of functions and remain adaptable. Zoning within the generic lab includes the separation of wetbench functions—such as sinks and services, fume hoods, and other exhaust devices—from dry-bench areas to improve performance and flexibility.
At the Arizona Biomedical Collaborative in Phoenix, a central lab corridor provides ready access and increased interaction among the open labs, lab support and the office zone, increasing efficiency and flexibility.
Flexible planning
Flexible planning is the final concept of laboratory design, integrating the principles of modular design and zoning of tasks within the entire facility. The benefit of clear organization, open labs and selection of highly functional casework provides long-term sustainability for the entire facility.
7. Clear organization
A laboratory’s design is most successful when the purity of the modular concept is translated clearly and follows the tenet of “keep it simple.” Lab safety is increased with organized pathways for egress and materials handling, alcoves for fume hoods, and functional zones for efficient wet-bench work. A single corridor scheme with offices opposite labs reinforces collaboration. Ghost corridors provide a simple and efficient interconnection between labs and lab support while also serving the purpose of internal lab circulation. By organizing the lab with clear and distinctive concepts that are meaningful to the particular facility, the life expectancy of the lab and its ability to adapt to change are greatly extended.
8. Open environment
There are many advantages to an open laboratory environment, where contiguous modules form a generic lab zone. Open labs are less costly to construct due to fewer walls and material interfaces. They are inherently safer, providing occupants with a higher level of visual and audible knowledge of a potential threat or incident. Open labs are more easily assigned and reconfigured for people and processes, leading to shared opportunities for a variety of functions. More important is the potential for collaboration and interdisciplinary activities that occurs in the open lab environment.
Transparency by the use of architectural techniques and ample glass keeps lab interiors bright and open. At the Lawrence Berkeley National Laboratory’s Molecular Foundry, researchers gain a sense of what’s going on beyond their immediate tasks. Photo Credit: David Wakely, courtesy SmithGroup
9. Functional furniture
Today, casework solutions have emerged that are quite flexible—just add wheels. Simple, table-based systems achieve improved performance over fixed casework, with the added benefit of mobility. By maintaining a limited kit-of-parts inventory for the lab casework and furniture, the lab can be more easily reconfigured, resulting in savings of time and money when physical changes are necessary. This is accomplished with interchangeable parts, adjustable shelving and countertops, mounting devices that make more efficient use of vertical space above the bench, mobile base cabinets, and flexible connections to ceiling-mounted service outlets. Good laboratory design must also include attention to ergonomic issues such as proper casework selection, materials, task lighting and seating.
Simple, open labs allow for maximum transparency between and through modules. A kit-of-parts approach to casework selection permits endless configurations utilizing mobile pedestals and ceiling-mounted services.
10. Sustainable
Truly adaptable laboratory space is inherently sustainable. Sustainable design strategies—including the use of daylighting, energy-efficient building systems and sustainable materials—will assist in reducing the environmental impact while producing a building that functions longer than its normal life expectancy.
But consideration for a building’s long-term use and carbon footprint suggests that a simplified, low-tech but robust approach, like a historic factory or warehouse loft building, is an appropriate design model that adapts easily to change. This concept of sustainability through simplicity and flexibility challenges the understandable impulse to overdesign in order to meet every potential contingency. A low-tech approach can successfully allocate space for future needs without overinvesting in current and soon-obsolete technologies. For example, it may be sufficient to provide limited bench utility services in strategic locations instead of in an entire lab.
The loft-like versatility of the lab designed for the Arizona Biomedical Collaborative in Phoenix allows for the adaption of wet and dry benches while maximizing the use of daylight. Photo Credit: Bill Timmerman, courtesy SmithGroup
Conclusion
Laboratories are expensive buildings and must be designed to evolve with the continual shifts in scientific discovery and advances in technology. The concepts of design, when grounded in the 10 core principles above, will result in a facility that is highly functional yet has the flexibility to adapt to future change.
Good design looks easy. However, given the complexity of a laboratory, simple and fundamental principles are in fact often difficult to apply and achieve. The rigorous application of modular design, zoning of tasks and implementation of flexible planning concepts will improve the process of design and ultimately the use of the facility to achieve the greatest adaptability possible.

Cross Training


By: Allison Champion -

When managers think about laboratory performance optimization, many typically think of obtaining a new machine or instrument. There are new technologies that can be purchased or new programs that can be implemented, but many managers often overlook the possibility of utilizing something they already have—human performance optimization in the form of cross training.
This isn’t aerobics mixed with weight lifting, but a method used in the industry to expand knowledge among employees, and it’s becoming more and more common as managers find the need for employees to be capable in a multitude of settings.
At Ashland Analytical Services & Technology (AS&T), the laboratory arm of Ashland Inc., cross training programs are part of the process used to keep the lab a top performer. AS&T is a comprehensive laboratory that provides analytical testing and problem-solving services for Ashland and also operates as a contract laboratory providing services for other companies.
Holly Sennhenn, senior research chemist, watches as Emily Stawicki, technician, prepares a sample for distillation to determine nitrogen content using the Kjeldahl method of nitrogen analysis.
“I was hired to work in one area of spectroscopy, but after a few months on the job additional support was needed in another spectroscopy area, so I was chosen to cross-train,” said Holly Sennhenn, a senior research chemist at Ashland. “Having capabilities in both areas turned out to be beneficial, because the techniques complemented each other. I could adjust my focus in either area depending on the demand, which helped alleviate bottlenecks in our workflow,” she said.
Ashland’s cross training program
Ashland’s analytical laboratory is divided into three specialty groups: Materials Characterization (MAT), Spectroscopy/Microscopy (SAM) and Separations/Environmental Analysis (SEA).
Initially, cross training was implemented at Ashland after ISO 9001 certification was obtained in 1996. Around that time, AS&T began tracking the workload in a more process-centered fashion. A newly implemented laboratory information management system was used to gather the information needed to make decisions about where more expertise should be distributed. The cross training came about naturally as the workload rose in some areas and decreased in others.
Much of the cross training ended up happening in the MAT group. Work done in that area of the lab tends to generate massive amounts of numerical data and results that come from routine analyses performed by technicians. Ashland managers realized they could improve productivity by cross training these technicians to perform different analytical tests. Having knowledge in a wide variety of techniques allows technicians to share the often heavy workload.
“One of the primary reasons for cross training is coverage,” said Benjamin Chew, laboratory manager at AS&T. “We want people to have the ability to take sick leave or go on vacation without worrying about their work piling up.” He also notes the importance of cross training in keeping the lab running. “If an emergency comes up requiring immediate answers, we need to be able to address that,” he said. “For a service group that supports multiple internal business groups and multiple contract projects, as ours does, this is very important. If we have a single person assigned to run an analysis and that person is out of the office for a length of time, multiple projects will stop. That simply cannot happen.”
Joe Curran, research chemist, at Ashland’s Dublin facility, trains Wanxing Nancy Ni, chemist, Ashland Shanghai Technical Center, in performing acid value titrations.
The SAM and SEA departments employ the highest concentration of scientists with Ph.D. and Master of Science degrees in the entire lab. Their work does not generate as much numerical data as the tests performed in the MAT lab do, but rather provides interpretation of the data produced by the wide array of spectroscopic and chromatographic techniques performed. Because of this, it is more difficult to cross-train employees in these areas of the lab; however, Ashland has been successful in applying its cross training techniques among those with high-level skill sets.
The AS&T team uses a proficiency matrix to determine who to cross-train and in what areas. Everyone in the lab rates themselves on a skill-level scale of 0, 1, 3, 4, 6 or 8 for each technique. If a person indicates a zero, it means that the person has no knowledge or skill in the technique and no skill in running the instruments required for it. A skill level of 8 indicates that the person is considered an expert on that technique and can develop and implement new methods that address nonstandard requests.
The matrix is used to determine what areas could use more expertise, who would benefit from the training and who is knowledgeable enough to train someone new in a certain technique. Each trainee is assigned a mentor to help orient him or her to the technology and, when applicable, the new area of the lab. Thorough documentation of the trainee’s work is kept and reviewed by managers throughout the training period.
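To illustrate how such a matrix can drive these decisions, here is a minimal sketch in Python, assuming a simple in-memory table; the analyst labels, skill cutoffs and the two-person coverage rule are illustrative assumptions, not details of Ashland's actual system.

# Minimal sketch of a cross-training proficiency matrix (illustrative only).
# Skill scale follows the article: 0, 1, 3, 4, 6 or 8 per technique, where
# 0 means no knowledge and 8 means an expert who can develop new methods.
matrix = {
    "Analyst A": {"FTIR": 8, "Raman": 6, "GC-MS": 1, "Kjeldahl": 0},
    "Analyst B": {"FTIR": 1, "Raman": 0, "GC-MS": 4, "Kjeldahl": 3},
    "Analyst C": {"FTIR": 0, "Raman": 3, "GC-MS": 8, "Kjeldahl": 6},
}

EXPERT, WORKING = 8, 4  # assumed cutoffs for "can train others" and "can cover"

def coverage_gaps(matrix):
    """Return techniques that fewer than two people can cover independently."""
    techniques = {t for skills in matrix.values() for t in skills}
    return sorted(t for t in techniques
                  if sum(1 for s in matrix.values() if s.get(t, 0) >= WORKING) < 2)

def training_plan(matrix, technique):
    """Pick the strongest analyst as mentor; list weaker analysts as trainees."""
    ranked = sorted(matrix, key=lambda p: matrix[p].get(technique, 0), reverse=True)
    mentor = ranked[0] if matrix[ranked[0]].get(technique, 0) >= EXPERT else None
    trainees = [p for p in ranked[1:] if matrix[p].get(technique, 0) < WORKING]
    return mentor, trainees

for gap in coverage_gaps(matrix):
    mentor, trainees = training_plan(matrix, gap)
    print(f"{gap}: mentor={mentor}, training candidates={trainees}")

Even a toy version makes the single-point-of-failure problem visible: any technique with only one person at working level becomes a coverage risk the moment that person takes leave.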
“So much of analytical knowledge is gained through experience,” said Sennhenn. “You could work in this profession for 20 years and still encounter new problems. Working closely with a mentor is one of the best ways to learn, because throughout the course of the mentoring process you are able to draw on your mentor’s experience and also see how he or she approaches problem solving. You also gain confidence as your mentor validates your results.”
Benefits and challenges of cross training
Ashland has found that one of the most direct benefits of cross training is the amount of individual career development that takes place among staff members. Each person is considered an “expert” in at least one technique, but having a working knowledge of several other techniques enhances the problem-solving skills each person can bring to the team. Managers have found that those who have been cross-trained have a greater appreciation for the strengths and weaknesses of each technique and can better focus on solving problems. Cross training also makes staff members more valuable to Ashland when opportunities arise to contribute AS&T expertise to outside clients.
James Listebarger, senior staff scientist, shows Kristen Harvey, technician, a spectrum of elements within a material, collected using Ashland’s Energy Dispersive X-Ray Spectrometer system.
Though valuable, cross training is complex as well. “Oftentimes during the course of analytical testing, the results you get lead to more questions,” said Sennhenn. “Being cross-trained allows you to continue to investigate the problem beyond one technique. This is beneficial because performing an analysis from beginning to end helps you see the full picture and makes it easier to piece together the information from the multiple techniques. It also helps you approach problem solving differently because you have in-depth knowledge of the other testing that is available.”
As beneficial as cross training is, problems arise in finding the time to perform the training. Ideally, employees would be cross-trained before a critical need for their skills arises, but that requires the ability to anticipate what future demand will surface. Workloads in various areas of the lab are very dependent on how well the businesses within Ashland are performing, so lab managers continuously track trends with consistent data gathering and work to understand the effects that different business maneuvers are likely to have on their workloads. The future plans and eventual actions of employees are other factors that affect the lab’s workload.
“Potential retirements and a method for transferring knowledge are a major reason for cross training,” said Chew. “We once had four analysts retire at the same time—a cumulative loss of 148 years of experience. We had been planning for this in the years prior by hiring new employees and placing them in roles where they could learn as much as possible, utilizing cross training as a teaching tool.”
Lab managers also understand that training is only the first step to an employee’s success. “Training may involve something as simple as showing a technician the commands used to run a specific instrument,” said Chew. “However, it is the knowledge of chemistry specific to the company that needs to be taught. Certain job skills are transferable—good general lab techniques, attention to detail and clear technical writing—but knowledge of a material and how it will behave under a specific analysis can only be picked up when you actually run the sample yourself. The basic skills to run a sample can be taught in as little as a week; the skills to interpret those results can take years.”

Lab Safety Revisited

By: Vince McLeod, Glenn Ketcham -

It’s human nature to become complacent and relaxed in a familiar and comfortable setting. Things become routine, and you are able to navigate most of the day on autopilot. This is just as true in your lab, which can become like home or a comfortable old friend. But take a minute now and think back to when you first started working in a lab, and everything was new and challenging.
You were probably a bit nervous at the beginning, not wanting to seem completely clueless. You also may have been somewhat awed by the equipment, the chemicals, the procedures and the ease with which others in the lab moved from task to task. You studied those with more experience and took your cues from them on how to conduct yourself and approach specific operations. Some things might have looked wrong or even dangerous, but you were reassured when you saw others walk past without a hint of concern. These people were your role models and the keepers of laboratory knowledge; anything that passed muster with them must be okay. Out in front of these lab guardians was the principal investigator (PI), who ran the lab. You met with the PI when you were hired. He or she probably gave you a lab tour and set up a schedule to meet with you about your research progress. After that, you were handed off to the lab staff for your practical training. As everything was new and unfamiliar, you accepted things as the way they should be.
Now that a good number of years separate you from those first impressions, you have probably experienced firsthand successes, as well as some close calls and mishaps. YOU are now probably one of the role models that others look up to. Is the message you want to pass on to those folks reflected in the lab you manage? Some yoga instructors tell us we should close our eyes and then open them as if we were a child, to see the world anew. We would ask you to do the same with your laboratory. Take a walk through your lab and look at everything with the eyes of a child, as though seeing it honestly and without prejudice for the very first time.
A host of problems, from inappropriately stored waste and blocked hood baffles to unlabeled and mislabeled containers and residual spills. Photo courtesy of Mark Yanchisin
Let’s start in the hallway outside the lab. Is the entrance to your lab labeled with up-to-date emergency contact and safety information? If a security guard heard a freezer alarm going off at 5 a.m. or a custodian discovered water pouring through the ceiling from a plumbing problem on the floor above, would they easily be able to contact the right lab staff who could take appropriate action? Is there an up-to-date call list in the lab should you need to call in “all hands”? These are such easy things to do, and they can potentially preserve weeks, months or years of stored science as well as countless dollars in replacement costs.
We now move inside. What is your first impression when looking with “fresh eyes”? Is the lab neat, orderly and organized, or does it trend more toward chaos and entropy? Would your mother be proud of what she saw if she came to visit? The first impression that greets an outside inspector often sets the tone for what follows. The floors and aisles should be free from trip and spill hazards, such as randomly stored boxes, chemicals and waste. The absorbent, disposable paper bench coat should not be so contaminated that it documents the chemical usage of the bench student’s entire graduate career. Bench coats should be replaced periodically and whenever gross contamination occurs. Are the waste receptacles overfull? Are they properly contained and labeled? Peek in the garbage cans. Is there any inappropriate waste? Look at the sinks. Is the lab keeping up with the dirty glassware, or does this area resemble a mountain of precariously stacked dishes in a fraternity house kitchen? Does the glassware prevent quick and safe access to the emergency eyewash at the end of the bench?
As we stroll up and down each aisle, let’s focus in more detail. Look at what is on the benches and on the shelving between common benches. The bench top is where much of the science occurs. Are chemical containers labeled appropriately so that in case of an emergency, or even just a minor spill, anyone could decipher their contents? We have seen inappropriate labeling of secondary chemical containers on many occasions, usually done by individuals wanting to prevent others in the lab from using their stock solutions. Inappropriate labeling of containers can present a hazard. It also is one of the most cited regulatory violations and can carry very stiff fines. Are chemicals stored properly and only kept out for the need at hand? Are sharps properly managed? Pull open some drawers; you may be surprised by what you find. Do bench-top equipment and apparatuses look properly assembled? Look at the glassware; are there chips, cracks or star cracks on round-bottom flasks? Is there food on the bench? Are there any exposed electrical hazards or unguarded moving parts on equipment (e.g., missing belt guards on vacuum pumps)?
Take this opportunity to find and follow the electrical and extension cords. Are the cords in good repair? Things to look for include frayed or damaged insulation, missing ground pins on the plugs, electrical cords plugged into other cords and extension cords supplying power to equipment that is, for all practical purposes, stationary (e.g., freezers). If you find you have many extension cords and power strips throughout the lab, you may need to have additional outlets installed. Make sure that your important samples and operations are protected by emergency power (often identifiable as a red power receptacle).
Incompatible storage of chemicals. Photo courtesy of Mark Yanchisin
We now come to the fume hood, one of our most critical yet most misused pieces of safety equipment. A number of conditions must be maintained in order for a fume hood to effectively capture and exhaust contaminants. Unfortunately, most of these can be short-circuited by the user, resulting in potential exposures to the user and others in the lab. For the hood to work properly, air must be able to enter the face of the hood and sweep from the front to the rear with sufficient velocity but without excessive turbulence that can cause eddies. The hood sash must be positioned to produce effective capture velocity (capture velocity = volume flow/face area). The more open the sash, the greater the face area and the lower the capture velocity. The sash is also designed to provide user protection from explosions and other experiments gone awry in the hood. If the sash is up, this protection is removed. Make sure work is conducted at least six inches back from the face of the fume hood. Be sure that the air exhaust slots are not blocked at the rear of the hood. (This is where the air must go to leave along with the contaminants.) The front airfoil should not be obstructed; this provides direction to the air that sweeps the bottom surface of the hood. The same is true for biosafety cabinets. Make sure the front grill is not obstructed. This is critical for effective containment. As the interior needs to be decontaminated regularly, it is even more important that the front grill be free of clutter.
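To make the sash relationship concrete, here is a minimal numerical sketch in Python, assuming a constant-volume hood with a fixed exhaust flow and a 1.2 m wide sash opening; the flow value and the 0.4-0.6 m/s acceptable band are illustrative assumptions, not figures from this article.

# Capture velocity = volume flow / face area. The face area grows as the sash
# opens, so velocity falls. All numbers below are assumed example values.
EXHAUST_FLOW = 0.30  # m^3/s, fixed for a constant-volume hood
SASH_WIDTH = 1.20    # m, width of the sash opening

def capture_velocity(sash_height_m):
    face_area = SASH_WIDTH * sash_height_m  # m^2
    return EXHAUST_FLOW / face_area         # m/s

for height in (0.3, 0.5, 0.7, 0.9):
    v = capture_velocity(height)
    band = "within assumed band" if 0.4 <= v <= 0.6 else "outside assumed band"
    print(f"sash at {height:.1f} m -> {v:.2f} m/s ({band})")

Running the loop shows why sash discipline matters: this hypothetical hood delivers 0.50 m/s with the sash at 0.5 m, but fully open at 0.9 m the velocity drops to roughly 0.28 m/s.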
Where do we begin? Photo courtesy of Mark Yanchisin
You get the idea. Continue on and look at your safety equipment: showers, eyewashes, fire extinguishers, etc. Are these easily accessible, tested and ready to use? Look at your storage cabinets and check for chemical compatibility, proper labeling, and secure shelving. Peek inside your refrigerators. What do you see?
Now that you’ve looked around carefully at the physical lab, take a look at the people in it. In many cases, these are people entrusted to you for their education as graduate students and post-docs or perhaps for their continued livelihood as technical staff. If you are the lab manager or PI, you are the role model and they will follow your lead. Your behavior will establish proper conduct in the lab. Are they making good choices? Are they sufficiently skilled to do the tasks necessary for their assignments? Do you intervene when necessary?
Most of the science world has heard of the recent tragedy at UCLA where Sheri Sangji, a 23-year-old research assistant, died as a result of severe burns received during an experiment. Though we are not privy to information beyond that provided by the public press and professional news groups [1,2,3], it appears this was a case of an inexperienced chemist performing a dangerous procedure without proper training, equipment and supervision – a case of nonchalant, ad hoc procedures when in fact great care was needed. Now imagine speaking with the parents or spouse of someone in your lab who suffered a serious injury (or, even worse, died) and trying to explain why this happened. Could anything have been done to prevent it? From our years practicing safety in lab environments and investigating accidents, the answer is almost always “yes.”
So, now you have looked with “fresh eyes.” We hope you are satisfied with what you have seen. If you are not satisfied and you’re considering approaches for improvement, we have three suggestions:
Lead by example. Those working in the lab will generally follow the lead of those in charge. Make it a point to put on a lab coat and eye protection whenever you go into the lab. Consistently and impartially enforce rules. Discipline doesn’t have to be heavy-handed, but it does need to be consistent. If you see someone doing something that seems odd, don’t hesitate to ask about it. You might be very glad you did.
Set a time for cleanup. One of the largest and most productive synthetic organic chemistry labs we’ve had the pleasure to work with had mandatory cleanup time. All work had to cease one hour before the lab meeting each week. All lab staff had to spend that hour cleaning and maintaining their areas. The lab manager announced the cleanup time and walked through the lab during that hour to oversee and address issues, with the motto “That which gets measured, gets done.” The research manager also embraced the “lead by example” approach. Things never got out of hand; the lab was one of the best in terms of safety and compliance, and it produced quality science.
Have a productive method for bringing issues forward. This should be framed as a cooperative “we are in it for the common good” approach rather than a “gotcha”-type attitude. Require lab members, from the glass washer to the most senior staff member, to identify two or three safety issues at each lab meeting. Everyone should be able to come up with two issues regardless of how good a lab might be. These topics can then be opened to discussion and flagged for follow-up. This will set the stage for reinforcing safety as an important value in the culture of your laboratory.

Simplifying Pure Water Systems



Problem: Water purification systems are required for a variety of applications, all of which require different degrees of purity. With a wide range of pure water systems available, it can be difficult to know which one will provide the best results for your lab. The water system should be suited to the available feed water, as well as to the application and capacity requirements of the laboratory. Not all feed water is the same, and some applications require the removal of specific contaminants. For example, tap water in North America commonly contains chlorine, which can damage reverse osmosis membranes and shorten the lifespan of the deionization cartridges within your pure water system. A system that effectively removes the chlorine extends cartridge and membrane life, saving the laboratory unnecessary expense. Matching the correct system to the type of feed water and application can be a challenging task, but if performed correctly, the validity of the resulting data can be maximized and the laboratory can be run as cost-effectively as possible.

Solution: A feed water analysis service provides a detailed and accurate way to simplify the selection of an appropriate pure water system. Through a customized summary and system recommendation based on the quality of the feed water, users receive clear guidance on what will best suit their needs. The Thermo Scientific H2O SELECT water testing program is a free, no-obligation service in which customers receive a system configuration recommendation based on which contaminants need to be removed from the water to achieve the desired purity, along with an estimated lifespan for the consumables. This will aid the laboratory in effective budgeting of resources.
Established to test the feed water of customers considering the purchase of a water purification system, the H2O SELECT™ kit is the most comprehensive water testing program currently available, providing expert analysis of feed water. Based on the customized results, the best system for your application will be identified. A summary of the results and a system recommendation are produced from the quality of the feed water, the laboratory applications it is used for, and the daily water demand and budget. Because water systems differ, some cartridges have a lower capacity than others, resulting in increased operational cost. This program will identify annual operating costs and inform you of them before you purchase the system.
The H2O SELECT water analysis program is extremely easy to use: Having requested a free kit from Thermo Scientific and filled the test bottle with feed water, the scientist just needs to fill out a brief questionnaire and return it with the sample for analysis. Upon evaluation, the resulting data is matched to the laboratory requirements stated on the questionnaire. Based on this, a water system recommendation along with estimated annual operating costs will be mailed directly to the laboratory.

Evolution of the Clean Room


By: John Buie - Published: January 11, 2011

Although the principles of clean room design go back more than 150 years to the beginning of bacterial control in hospitals, the clean room itself is a relatively modern development. It was the need for a clean environment for industrial manufacturing during the 1950s that led to the modern clean room as we know it.
A clean room is a rigorously controlled environment that has a low level of environmental pollutants such as dust, airborne microbes, aerosol particles and chemical vapors. The air entering a clean room is filtered and then continuously circulated through high efficiency particulate air (HEPA) and/or ultra-low particulate air (ULPA) filters to remove internally generated contaminants. Staff wearing protective clothing must enter and exit through airlocks, while equipment and furniture inside the clean room is specially designed to produce minimal particles.
While more than 30 different industry segments utilize clean rooms, 70 percent of U.S. clean room floor space is in the semiconductor and other electronic components, pharmaceutical, and biotechnology industries.
1939 – 1945
Development of the modern clean room began during the Second World War to improve the quality and reliability of instrumentation used in manufacturing guns, tanks and aircraft. During this time, HEPA filters were also developed to contain the dangerous radioactive, microbial or chemical contaminants that resulted from experiments into nuclear fission, as well as research into chemical and biological warfare.
While clean rooms for manufacturing and military purposes were being developed, the importance of ventilation for contamination control in hospitals was being realized. The use of ventilation in a medical setting gradually became standard practice during this time.
1950s – 1960s
The evolution of clean rooms gained momentum as a result of NASA’s space travel program in the 1950s and 1960s. It was during this time that the concept of ‘laminar flow’ was introduced, which marked a turning point in clean room technology.
In the late 1950s, Sandia Corporation (which later became Sandia National Laboratories) began investigating the excessive contamination levels found in clean rooms. Researchers found that clean rooms were being operated at the upper practical limits of cleanliness levels and identified a need to develop alternative clean room designs.
In 1960, Blowers and Crew in Middlesbrough, UK were the first to improve contamination control by creating a unidirectional airflow from an air diffuser fitted over the entire ceiling of an operating room. In practice, the airflow was disturbed by air currents and the movement of people, but the idea of unidirectional flow was born.
Also in 1960, McCrone Associates began developing advanced particle handling techniques using tungsten needles and collodion. These techniques, which later became industry standards, were incorporated into the McCrone Associates Class 100 clean room.
In 1961, Professor Sir John Charnley and Hugh Howorth, working in a hospital in Manchester, UK, managed to significantly improve unidirectional airflow by creating a downward flow of air from a much smaller area of the ceiling, directly over the operating table.
Also in 1961, the first standard written for clean rooms, known as Technical Manual TO 00-25-203, was published by the United States Air Force. This standard considered clean room design and airborne particle standards, as well as procedures for entry, clothing and cleaning.
In 1962, Sandia Corp. launched the Whitfield Ultra-clean room, which was publicized in Time Magazine, creating a great deal of interest. Instead of simply using filters to clean incoming air, Whitfield used filtered air to keep the room clean by introducing a change of ultra-clean air every six seconds.
In 1962, Patent No. 3158457 for the laminar flow room was issued. It was known as an “ultra clean room.”
By 1965, several vertical down-flow rooms were in operation in which the air flow ranged between 15 m/min (50 ft/min) and 30 m/min (100 ft/min). It was during this time that the specification of 0.46 m/s air velocity and the requirement for 20 air changes an hour became the accepted standard.
In 1966, Patent No. 3273323 was submitted and issued for the “laminar flow airhood apparatus.”
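To put the 1965 figures above in perspective, here is a quick back-of-envelope calculation in Python; the 3 m ceiling height is an assumed example value.

# Air changes per hour (ACH) = volumetric flow / room volume * 3600.
# In a vertical down-flow room the flow is face velocity times ceiling area,
# so the area cancels and ACH depends only on velocity and ceiling height.
def ach_downflow(velocity_m_s, ceiling_height_m):
    return velocity_m_s * 3600 / ceiling_height_m

print(ach_downflow(0.46, 3.0))  # ~552 changes/hour at the 0.46 m/s standard
print(3600 / 6)                 # one change every six seconds, as in the Whitfield room: 600/hour

In other words, a unidirectional down-flow room at the accepted velocity delivers well over an order of magnitude more than the 20-changes-per-hour baseline.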
1970s
By the early 1970s the principle of “laminar flow” had been translated from the laboratory to wide application in production and manufacturing processes.
1980s – 1990s
The 1980s saw continued interest in the development of the clean room. By this stage, clean room technology had also become of particular interest to food manufacturers.
During the late 1980s, STERIS (formerly known as Amsco) developed the use of hydrogen peroxide gas for the decontamination of clean rooms, and marketed the idea under the trademark VHP (vaporized hydrogen peroxide). Hydrogen peroxide gas rapidly became the most widely used method of sterilization, due to its unique combination of rapid antimicrobial efficacy, material compatibility and safety.
In 1980, Daldrop + Dr.Ing.Huber developed an innovative clean room ceiling, known as ‘Euro Clean’, to meet the rising challenges from industry at the beginning of the 80s.
In 1987, a patent was filed for a system of partitioning the clean room to allow zones of particularly high-level cleanliness. This improved the efficiency of individual clean rooms by allowing areas to adopt different degrees of cleanliness according to the location and need.
In 1991, a patent was filed for a helmet system that can be used in a medical clean room in which the user is protected from contaminated air in the environment, while the patient is protected from contaminated air being exhausted from the user’s helmet. Such a device decreases the possibility of operating room personnel being contaminated with viruses carried by the patients being operated upon.
In 1998/1999, CRC Clean Room Consulting GmbH introduced the clean room filter fan. This involved integrating a filter fan unit (filter, ventilator and motor) directly into the clean room ceiling.
2000s
The pace of clean room technology transformation has accelerated over recent years. Since the year 2000, there have been significant advances in new clean room technology, which have helped to streamline manufacturing and research processes, while also reducing the risk of contamination. Most of the technological developments of the past decade have been directed towards the manufacture of sterile products, particularly aseptically filled products.
In 2003, Eli Lilly pioneered the development of a new system for the prevention and containment of cross-contamination during the manufacture of pharmaceutical powders using a specially designed “fog cart”. This allows the operator to be covered by an exceptionally fine fog of water on exit from a critical area, virtually eliminating the risk of transferring dust traces beyond their proper confines.
In 2009, the University of Southampton, UK opened a Nanofabrication Centre containing a clean room with nanofabrication facilities, making it possible to manufacture high-speed and non-volatile “universal memory” devices for industry that could process information faster than anything achieved with conventional technologies.
The Future of Clean Rooms
Clean room facilities in the United States have been predicted to grow fourfold between a 1998 baseline and 2015, to an estimated 180 million square feet.
The most common applications of clean rooms currently are in the manufacture of semiconductor and other electronic components, as well as in the pharmaceutical and biotechnology industries. In addition to these traditional applications, clean room technology has more recently been applied to micro- and nano-system processes, and this looks certain to be an area of growth in coming years. The development of clean room technology is likely to continue to be driven by certain key factors including the increasingly technical use of exotic physical and biological phenomena, the central role of increasingly fine structures, the creation and use of materials of the highest purity, and the increasingly broad-based utilization of biotechnology. Given the scale of these challenges, clean room technology looks set to remain indispensable to production in coming years.

Pharmaceuticals, Biotechnology and Stem Cells





(Image source: Wikimedia)

With recent developments in haematopoietic stem cell research expanding rapidly into medical laboratories across the world, clean room technology is becoming an increasingly lucrative field for those of us in the industry. The construction of these new clean room laboratories can present many challenges, most notably in pharmaceutical science and biotechnology, where clean rooms have been playing an instrumental role for years. Implementation of new clean room suite technology is constantly in high demand from researchers all over the world, and a recent article from PharmTech.com sheds some light on how recent developments in this technological avenue have affected European markets, particularly in the area of HEPA filtration systems:

“The HEPA filtration system in pharmaceutical clean rooms presents some distinct challenges. A terminal filtration system, in which ducts connect the individual HEPA filters to the air handling system, is the most common choice for Class 1000-100000 clean rooms. It is critical that the ducting does not contain any acoustic lining that could harbor microbial growth. In situations where a plenum configuration is selected, the sealing technique becomes a critical consideration. The gel sealant, popular in other applications – such as animal research and the production of medical devices – since the 1960s, has been less enthusiastically received by the pharmaceutical industry because the U-channels constructed to seat the filter units make sanitizing difficult.”

High efficiency particulate air systems are a staple of clean room technology, and new developments in sanitization will most likely revolutionize the field in the coming years. As the transportation of human cell tissue for transplantation or genetic observation becomes a more common application for medical facilities, it is likely that clean rooms will become a larger part of most institutions’ budgets.