Thursday, February 3, 2011

PCR Gets Personal, Faster



By Lori Valigra
FIGURE 1. The patented sample retention system of Thermo Fisher Scientific’s NanoDrop 2000 and 2000c spectrophotometers allows a sample to be pipetted directly onto an optical measurement surface.
Over the past three decades, polymerase chain reaction (PCR), which allows researchers to generate multiple copies of a piece of DNA, has become one of the most important techniques in biology. The procedure enables researchers to answer previously unanswerable questions about specific genes and nucleic acids, and to do so at an accelerated pace that shaves hours, and even days, off experiments aimed at identifying biomarkers and potential drug targets.
“You can do studies you couldn’t do in the past when there wasn’t access to sufficient material,” said Jo Vandesompele, PhD, of the Center for Medical Genetics Ghent and Ghent University Hospital in Belgium. “Now it’s possible to have a low input amount and to generate multiple copies.” Dr. Vandesompele said the basic principle of PCR hasn’t changed much in 25 years, but the methodology, techniques, instrumentation, and software have improved to make it easier to use.
American biochemist Kary Mullis, PhD, is generally credited with conceiving the PCR technique in 1983 while at Cetus Corp., one of the first biotech companies, though the process was described earlier by Norwegian scientist Kjell Kleppe, PhD, and Nobel laureate H. Gobind Khorana, PhD. (Dr. Mullis received the Nobel Prize for chemistry for his work in 1993.) Prior to PCR, commonly used biochemistry methods such as enzyme-linked immunosorbent assay (ELISA), northern blot, and western blot took more time and put more demands on lab workers.
Still, in the early days of PCR there were no instruments. The repeated cycles of heating and cooling that characterize today’s PCR weren’t automated then, so researchers had to use a hot water bath, which was time consuming and labor intensive, Dr. Vandesompele explained. “People had to add enzymes after each cycle,” he said. “So the development of PCR instruments itself was a milestone in the industry.” Another major move forward was the early introduction of the heat-stable Taq polymerase, an enzyme that can withstand the high temperatures of thermal cycling. With the PCR instrument and Taq, users no longer had to add fresh enzyme at intervals, which made the technology easy to use, and popular. A third milestone, according to Dr. Vandesompele, was the hot-start enzyme, in which the polymerase’s activity is inhibited at room temperature and activated only at high temperature, reducing nonspecific amplification during the initial setup of PCR. “So the three major steps in getting PCR technology mainstream were instrumentation, Taq, and hot start,” he said.

Quantitative PCR: The Big Leap Forward

In 1993 came the biggest milestone since the advent of PCR itself: quantitative PCR, also known as real-time PCR and qPCR, pioneered by scientist Russell Higuchi, PhD, of Roche. While classic PCR generates multiple copies of a piece of DNA, qPCR can quantify how many copies of the target DNA were present in the reaction to begin with.
“The PCR of a quarter century ago gave you a qualitative answer, a yes/no answer. It allowed you, from a single copy within DNA, to make many amplifications so you were able to answer the question of whether the thing you are looking for is there or not,” said Manju Sethi, PhD, senior product manager for the NanoDrop product line at Thermo Fisher Scientific in Wilmington, Del., which makes equipment for quality control of samples prior to PCR.
So while classic PCR revolutionized genomics by quickly revealing whether or not the sample had something in it, there were some shortcomings, Dr. Sethi explained. “Sometimes a yes/no answer is not the end goal. In order to find out how much of a particular gene there is, you need quantification. What is a particular gene I’m looking at expressing, how much is it expressing, what is the rate at which it is expressing? The next frontier was qPCR.”
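To make that quantitative leap concrete, the sketch below shows relative quantification with the widely used 2^-ΔΔCt (Livak) method, in which a target gene's cycle-threshold (Ct) values are normalized to a reference gene and compared between a test and a control sample. The Ct numbers are purely illustrative and are not taken from any instrument or study discussed here.

```python
# Minimal sketch of relative quantification with the 2^-delta-delta-Ct (Livak) method.
# Ct values below are illustrative, not from any study cited in this article.

def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of a target gene in a test sample vs. a control sample,
    each normalized to a reference (housekeeping) gene."""
    delta_ct_test = ct_target_test - ct_ref_test   # normalize test sample
    delta_ct_ctrl = ct_target_ctrl - ct_ref_ctrl   # normalize control sample
    delta_delta_ct = delta_ct_test - delta_ct_ctrl
    return 2 ** (-delta_delta_ct)                  # assumes ~100% amplification efficiency

# Example: the target's Ct drops from 28.0 to 25.0 while the reference gene
# stays at 20.0 in both samples, i.e. roughly an 8-fold increase in expression.
print(fold_change(25.0, 20.0, 28.0, 20.0))  # -> 8.0
```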
TABLE 1. Advances in PCR Instrumentation in Pharma. Key innovation drivers in PCR have been the need to increase sample throughput and the number of data points while reducing reaction volumes and time to results. The invention of real-time PCR (qPCR) in the early 1990s was a significant advance that allowed researchers to rapidly and accurately quantify gene expression levels in multiple samples. In 1999, Bio-Rad launched the first affordable non-laser-based qPCR instrument (the iCycler iQ). By reducing the cost and size of real-time detection instruments, the then-novel technology of qPCR was made accessible to many more biotech and pharma labs.
The invention of qPCR was a major achievement, agreed Rachel Scott, PhD, senior product manager of the gene expression division at Bio-Rad Laboratories Inc. in Hercules, Calif., which offers reagents and instruments. By improving the turnaround time so that scientists could get faster results, “it led to the ability to quickly and accurately quantify expressed levels in multiple samples,” Dr. Scott said. She said that in 1999, Bio-Rad was the first to launch an affordable, white light-based (rather than laser-based) real-time PCR instrument, the iCycler iQ. This development reduced the cost and size of the PCR instrument from that of a large copier to desktop size. “It made the technology more accessible to every researcher and pharmaceutical and biotech lab,” she said.
Life Technologies Corp.’s Applied Biosystems division was also an early player in the PCR market, with reagents and equipment. “In 1995, we came up with the very first real-time PCR instrument. That was the biggest contribution Applied Biosystems made to the PCR community,” said Junko Stevens, PhD, director of research and development, PCR reagents, at Life Technologies in Carlsbad, Calif. The following year, the company came up with a universal TaqMan probe design. “So we made PCR from a semi-quantitative to a quantitative tool,” she said, adding that the industry is now moving toward user-centric innovation, so new systems and reagents must be easier for customers to use.
Thermo Fisher’s Dr. Sethi emphasized the need for quality control of samples in the qPCR process. Indeed, because it is error prone and time consuming, sample preparation remains one of the most challenging aspects of PCR. “With regular PCR, it’s not important to know how much you are starting with because even if you have one copy, you’re going to amplify it and get your answer, yes or no,” she explained. “But with qPCR, in order for you to have a meaningful answer at the end, you really need to understand what you are putting in at the beginning. That’s where quality control of nucleic acid becomes important. You put in a known amount and then, when you get your result, you can back-calculate and see it is correct.”
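As a rough illustration of the back-calculation Dr. Sethi describes, the sketch below performs absolute quantification against a standard curve built from known input amounts: Ct is modeled as a linear function of log10(copies), and an unknown sample's Ct is inverted through that line. The slope, intercept, and Ct values are invented for the example; the rule of thumb that a slope near -3.32 implies roughly 100% amplification efficiency is standard.

```python
# Hedged sketch of back-calculating starting copy number from a qPCR standard curve.
# The slope/intercept and Ct values are made up for illustration only.

def efficiency(slope):
    """Amplification efficiency implied by the standard-curve slope
    (a slope of about -3.32 corresponds to ~100% efficiency)."""
    return 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

slope, intercept = -3.32, 37.0                         # fit from a dilution series of known standards
print(round(efficiency(slope), 3))                     # ~1.0, i.e. ~100% efficiency
print(round(copies_from_ct(24.0, slope, intercept)))   # estimated starting copies (~8,200)
```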
The most accurate conventional method for determining the concentration of nucleic acid is measuring the absorbance of the sample, because it is well known and understood that nucleic acid absorbs at a wavelength of 260 nanometers, she said. Conventional spectrophotometers, which have been used to determine nucleic acid concentration for many years, require the user to configure settings before taking a measurement, and the sample must be diluted with water or a buffer and put into a cuvette. Thermo’s NanoDrop technology, introduced in 2002, eliminated these requirements. The user can simply pipette a one-microliter sample onto a flat surface on the instrument, click a button that says “measure,” and, in less than five seconds, get the concentration of the nucleic acid.
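The arithmetic behind such an absorbance reading is standard Beer-Lambert math: at a 1 cm path length, one A260 unit corresponds to roughly 50 ng/µL of double-stranded DNA (about 40 ng/µL for RNA and 33 ng/µL for single-stranded DNA), and the A260/A280 ratio serves as a quick purity check. The sketch below applies those conventional conversion factors; it is generic spectrophotometry math, not the NanoDrop's own software.

```python
# Generic A260-based nucleic acid quantification (not NanoDrop firmware).
# Conversion factors assume a 1 cm equivalent path length.

FACTORS = {"dsDNA": 50.0, "RNA": 40.0, "ssDNA": 33.0}  # ng/uL per A260 unit

def nucleic_acid_conc(a260, sample_type="dsDNA", dilution=1.0):
    """Concentration in ng/uL from an absorbance reading at 260 nm."""
    return a260 * FACTORS[sample_type] * dilution

def purity_ratio(a260, a280):
    """A260/A280 ratio; ~1.8 is commonly taken as 'pure' DNA, ~2.0 as pure RNA."""
    return a260 / a280

print(nucleic_acid_conc(0.5))              # 25.0 ng/uL of dsDNA
print(round(purity_ratio(0.5, 0.27), 2))   # ~1.85, within the usual range for DNA
```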

New Reagents Advance PCR Technology

The PCR market has evolved, with some big advances in reagents and in other areas such as high-resolution melt screening technology, which allows sample comparisons based on their DNA melting profiles. “It’s fast, it’s affordable, and it’s a very rapid screening tool,” said Bio-Rad’s Dr. Scott. “It’s accurate to single-base change resolution, as accurate as sequencing, but with greater sensitivity. That’s one of the advantages of the high-resolution melt approach.”
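For readers unfamiliar with melt analysis, the toy sketch below shows the core idea: the melting temperature (Tm) appears as the peak of -dF/dT, the negative derivative of fluorescence with respect to temperature, and a sequence change shifts that peak slightly. Real high-resolution melt software does far more (normalization, difference plots, genotype clustering), and the curve here is simulated rather than measured.

```python
import numpy as np

# Toy melt-curve analysis: locate Tm as the peak of -dF/dT.
# The fluorescence curve is simulated (a simple sigmoid), not real HRM data.

temps = np.linspace(70.0, 90.0, 201)                             # temperature sweep, deg C
tm_true, width = 82.0, 1.5
fluorescence = 1.0 / (1.0 + np.exp((temps - tm_true) / width))   # dye signal falls as DNA melts

neg_deriv = -np.gradient(fluorescence, temps)                    # -dF/dT
tm_estimate = temps[np.argmax(neg_deriv)]                        # peak of the melt curve
print(round(tm_estimate, 1))                                     # ~82.0 deg C
```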
“We are increasing the sensitivity of reagents to allow for smaller reaction volumes,” added Viresh Patel, PhD, marketing manager at Bio-Rad, who said that smaller instrument sizes, higher throughput, and smaller reaction volumes are all PCR industry trends. “We want to create faster and more sensitive enzymes through polymerase engineering to reduce the volume of the reaction and generate the same signal level that an instrument can detect to give you real-time detection. Paired with buffering and chemistry, this can give you higher performing reagents that allow you to go smaller and reduce the reaction volume, but still maintain the performance you’d get with a larger reaction volume.”
Dr. Patel said Bio-Rad is working on new applications of qPCR beyond gene expression. The company is using its core technology for unique applications like its EpiQ chromatin analysis kit, introduced last November. EpiQ is for epigenetics, which plays a role in regulating gene expression. The product, a reagent kit using qPCR technology, allows a researcher to quantitatively determine the chromatin state of a target gene. “The kit significantly reduces the number of steps and time required to get results, as well as the number of cells needed to get reproducible results,” he said.
In addition to the EpiQ kit, Bio-Rad is developing enzymes with improved capabilities such as PCR polymerase and is starting to branch into reverse transcriptase. Dr. Patel said the advances in reagents are starting to keep up with advancements in instrumentation and detection technologies on the hardware side. “We are improving the speed and sensitivity of polymerase to increase the tolerance to contaminants or PCR inhibitors,” he said. “From an enzyme standpoint, we’ve developed novel binding proteins that increase the affinity of polymerase for DNA, so it effectively bypasses the effects of the inhibitor.”
TABLE 2. Validating PCR Experiments. Shown above is a detailed section from the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines. They outline the essential information that authors should provide to serve as a benchmark for assessing the quality of qPCR assays reported in published studies.

Guidelines to Standardize PCR Techniques

Despite advances in reagents, equipment, and software to analyze tremendous amounts of data, PCR has lacked standardization. “Everyone was doing qPCR their own way with selection of the primers and the probes, the things that go into the assay,” said Dr. Sethi. “As a consequence, they published papers based on how they were doing it, and a lot of papers were retracted, because if your peers cannot duplicate what you do, you can’t prove the work is airtight. One of the big challenges remaining is the lack of standardization on how PCR is practiced.”
Last year a group of scientists came out with a set of de facto guidelines called MIQE, or Minimum Information for Publication of Quantitative Real-Time PCR Experiments, that act like a cookbook on how to perform effective PCR. Dr. Vandesompele explained that MIQE comprises quality assurance guidelines that cover each aspect of the qPCR workflow. “PCR has become a very easy-to-use technology, but it’s also dangerous for making mistakes and generating meaningless data,” he said. MIQE is an 85-item checklist of parameters that records which probe or primer was used in a PCR experiment and under what conditions, whether quality control was performed on the sample beforehand, and the sample’s concentration at the start of the test.
Dr. Scott said that, from a reproducibility standpoint, the MIQE guidelines are a major improvement when it comes to validating the thinking of scientists behind the design and implementation of their experiments. Added Dr. Patel, “The issue of reproducibility is a challenge—and how to truly identify bias in amplification. If you repeat a process 40 to 50 times in an experiment, you are bound to introduce some level of bias. The MIQE guidelines try to put structure around this.” Not all researchers have adopted the guidelines, and peer-reviewed publications are just starting to require authors to adhere to them.

Toward Faster, More Direct Results

Because qPCR has been around for nearly 20 years, Dr. Sethi does not expect big changes going forward. “The new frontiers in qPCR are new targets,” she said. Jeremy Gillespie, PhD, group product manager at Thermo Fisher Scientific Genomics in the United Kingdom, said that in the next 10 years, he expects more biomarkers to come to fruition. “We’ll see thousands of samples with fewer targets. If you can increase sensitivity and reproducibility in qPCR and volumes, you’re going to make strides into quicker identification of targets,” he said. “If you’ve got a low expressed gene and have the sensitivity to pick it out, then that’s an advancement.”
Another trend is toward higher throughput of thousands and thousands of samples at a time, said Richard Kurtz, PhD, senior marketing manager in Bio-Rad’s gene expression division. Dr. Stevens of Life Technologies said her company is going toward much higher throughput, with its most recent OpenArray platform able to handle 3,072 reactions in a very small volume. Other companies that have moved in that direction include Roche, whose LightCycler can handle 1,536 reactions; Fluidigm, with the BioMark that can handle up to 10,000 reactions; and WaferGen with its SmartChip that has 5,184 nanowells. “I see pharmaceutical and biotech companies opting for high-throughput, low-volume qPCR to get more biomarkers,” said Dr. Vandesompele.
“There are trends in the industry toward more specificity, more end-user focus, and multiplexing PCR,” said Dr. Stevens. “People want to interrogate more targets from a single sample, so they need multiplexing. They want five different fluorescent colors coming out of one well. This can be used in pathogen detection, pharmaceutical manufacturing, DNA virus detection, and a recent new application, protein quantification.” New applications for PCR include copy number variation and epigenetics. Life Technologies introduced a kit for copy number variation last year and is working on a kit to interrogate epigenetics on a qPCR instrument that it expects to have out next summer.
And with sample prep a continuing challenge, Dr. Stevens predicts its eventual elimination. “Now biological samples are prepped, and then PCR is done,” she said. “But in five years, we expect to completely bypass that process, so you get an answer from a biological sample and bypass the sample prep.”
