QUALITY ASSURANCE and PROCESS CONTROL
Low-Level Conductivity Standards Increase Accuracy
The lower the conductivity, the higher the purity.
Purified water, and especially ultrapure water, is of great importance in semiconductor manufacturing, in power generation, and in chemical, pharmaceutical, and food and beverage processing operations. In recent years, conductivity testing of purified water has risen to new levels of importance. Understanding this measurement technique and the variables that affect instrument accuracy aids in the development of valid quality assurance procedures and appropriate process control.
Purified Water Testing
Water conductivity measurement provides an excellent assessment of contamination arising from soluble and ionizable substances. Conductivity proportionately reflects the degree of water purity—the lower the conductivity, the higher the purity.
Theoretically, pure water is essentially nonconductive. In practice, water molecules self-ionize to a limited extent, producing a small but non-zero conductivity. In addition to the conductivity produced by this self-ionization, gases such as CO2 and ammonia dissolve in the water, ionize, and contribute to the conductivity. Water purified by customary methods contains small amounts of these gases. Conductivity arising from these ions is considered intrinsic to the water. Other ions, considered extraneous, such as sodium and chloride, may also be present and will affect conductivity.
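The intrinsic conductivity of pure water can be estimated from the self-ionization equilibrium. The short sketch below is our own illustration, using approximate literature values for the limiting molar conductivities of the hydrogen and hydroxide ions; it is not part of any compendial method.

```python
# Estimate the intrinsic conductivity of pure water at 25 C.
# Limiting molar conductivities (approximate literature values, S*cm^2/mol):
LAMBDA_H = 349.8    # hydrogen ion
LAMBDA_OH = 198.0   # hydroxide ion

ION_CONC_MOL_PER_L = 1.0e-7  # [H+] = [OH-] from Kw = 1e-14 at 25 C

# kappa (S/cm) = (lambda_H + lambda_OH) * concentration in mol/cm^3
kappa_s_per_cm = (LAMBDA_H + LAMBDA_OH) * ION_CONC_MOL_PER_L / 1000.0

print(round(kappa_s_per_cm * 1e6, 3), "uS/cm")  # ~0.055 uS/cm
```

The result, about 0.055 µS/cm at 25° C, is the theoretical floor for water conductivity before dissolved gases and extraneous ions are considered.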
Determining the Conductivity of Water
The electrical conductivity of water is determined by measuring the water's ability to pass electric current. Conductance is the reciprocal of electrical resistance (C = 1/R). The ohm is the unit of electrical resistance: a circuit with a resistance of 1 ohm passes a current of 1 ampere at 1 V. Since conductance is the reciprocal of resistance, its classical unit is the reciprocal ohm, or mho (“ohm” spelled backward).
Pure water conductivity is expressed in SI units (microsiemens per centimeter, µS/cm) rather than in the traditional unit, µmho/cm; the two are numerically identical. When working with water purification systems, purity is usually expressed as the reciprocal quantity, resistivity: a conductivity of 1 µS/cm corresponds to a resistivity of 1 megohm-cm. Deionized (DI) water with a resistivity of 16 megohm-cm would have a conductivity of 0.0625 µS/cm.
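The reciprocal relationship between the two quantities can be captured in a pair of trivial helper functions (the function names are ours):

```python
def conductivity_to_resistivity(us_per_cm: float) -> float:
    """Convert conductivity (uS/cm) to resistivity (megohm-cm)."""
    return 1.0 / us_per_cm

def resistivity_to_conductivity(megohm_cm: float) -> float:
    """Convert resistivity (megohm-cm) to conductivity (uS/cm)."""
    return 1.0 / megohm_cm

# The 16 megohm-cm DI water example from the text:
print(conductivity_to_resistivity(0.0625))  # 16.0 megohm-cm
print(resistivity_to_conductivity(16.0))    # 0.0625 uS/cm
```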
In November 1998, the US Pharmacopeia 23 (USP 23) established a new method for qualifying purified water. Testing of effluent water, as defined in this standard, is well suited for determining the completion of vessel Clean-in-Place (CIP) operations. USP permits acceptance of Purified Water made from acceptable feed water based solely upon conductivity measurements.
Pharmaceutical manufacturers adhere to the methods in the USP compendium. Because the FDA generally recognizes and endorses USP methods, this method has come into widespread use in the pharmaceutical industry. Outside of the pharmaceutical industry, this method is useful and instructive for the development of methodology in other applications and therefore merits review as a model.
Water conductivity is dependent upon temperature and the nature and concentration of intrinsic and extraneous ions. The testing procedure is divided into three stages. The USP Stage 1 qualification requires the use of a table (see Table 1) that relates temperature and conductivity. It is important to note that the relationship between temperature and conductivity in water is dependent on the nature of the dissolved substances as well as their activity (ionization). Considerations include:
- Size, number, and valence of conductive particles or ions
- Size and number of insoluble dispersed particles
Theoretically, pure water will exhibit conductivity dependent on the ionization of the water molecules alone. As conductive impurities are added, conductivity increases. The rate of increase in conductivity does not, however, correlate directly with the rate of increase of impurities. This is due to “crowding” in the solution: as ions become more concentrated, their mutual interactions reduce their mobility, so conductivity rises more slowly as the concentration of conductive impurities increases.
Impurities also affect the behavior of conductivity with temperature. These multiple interactions are reflected in the USP 23 Stage 1 Chart (Table 1) and account for the specific effect on conductivity associated with both temperature level and water purity.
In Stage 1 testing, the conductivity and temperature of the sample are measured and the temperature value is located in the table. Reading across, the corresponding conductivity limit is found. If the conductivity of the sample does not exceed the table value, the water meets the requirement of the test.
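The Stage 1 lookup can be sketched as a simple table search. The temperature/limit pairs below are an illustrative excerpt only; the current compendium should always be consulted for the official values. Following common practice, a measured temperature falling between entries is rounded down, which yields the lower (more conservative) limit.

```python
# Illustrative excerpt of Stage 1 temperature/conductivity limits
# (deg C -> uS/cm). Consult the official USP table for real use.
STAGE1_LIMITS = {0: 0.6, 5: 0.8, 10: 0.9, 15: 1.0, 20: 1.1,
                 25: 1.3, 30: 1.4, 35: 1.5, 40: 1.7}

def stage1_passes(temp_c: float, conductivity_us_cm: float) -> bool:
    """Return True if the sample meets the Stage 1 limit.

    The measured temperature is rounded DOWN to the nearest table
    entry (0-40 C covered by this excerpt).
    """
    usable = [t for t in STAGE1_LIMITS if t <= temp_c]
    limit = STAGE1_LIMITS[max(usable)]
    return conductivity_us_cm <= limit

print(stage1_passes(25.0, 1.2))  # True: limit at 25 C is 1.3 uS/cm
print(stage1_passes(23.0, 1.2))  # False: rounds down to 20 C, limit 1.1
```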
If the conductivity of the sample exceeds the table value, Stage 2 testing is performed. At this stage a volume of the sample is equilibrated with room air at 25° C and mixed. If the conductivity changes by less than 0.1 µS/cm per 5 minutes of mixing and does not exceed 2.1 µS/cm, the sample meets the requirements of this test.
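The Stage 2 decision can be sketched as a check on successive readings taken at 5-minute intervals while the equilibrated sample is mixed (the function name is ours; the limits are those stated above):

```python
def stage2_passes(readings_us_cm: list[float]) -> bool:
    """Stage 2 sketch: readings taken every 5 minutes while the
    sample, equilibrated with room air at 25 C, is mixed.

    Passes when the most recent 5-minute change is < 0.1 uS/cm
    and the final reading does not exceed 2.1 uS/cm.
    """
    deltas = [abs(b - a) for a, b in zip(readings_us_cm, readings_us_cm[1:])]
    stable = bool(deltas) and deltas[-1] < 0.1
    return stable and readings_us_cm[-1] <= 2.1

print(stage2_passes([1.8, 1.95, 2.0]))  # True: last change 0.05, value 2.0
print(stage2_passes([1.8, 2.0, 2.3]))   # False: exceeds 2.1 uS/cm
```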
If the water does not pass Stage 2 requirements, Stage 3 testing must be performed. In this test, a saturated potassium chloride solution is added to the same water sample and pH is measured, using the Stage 3 pH and Conductivity Requirements Table to determine the conductivity limit at the measured pH value. If the measured conductivity is greater than this value, or if the pH is outside of the range of 5.0 to 7.0, the water does not meet the requirements of the test.
The conductivity meter used for measuring electrical conductivity is analogous to an ohmmeter used for measuring resistance. Commercial meters vary in style and design, but most consist of a sensing probe or cell, a source of alternating current, and a readout or display.
Sensing cells and probes can be configured in different ways, depending on the application. The classic cell is composed of two conductive metal plates (electrodes), each 1 cm square, spaced 1 cm apart. This configuration encloses a 1 cm3 volume of sample (water). The dipping probe usually consists of a nonconductive cylindrical core with two evenly spaced ring electrodes. These rings function in the same manner as the plates in the classic cell to provide a conductive path through the water.
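The geometry above determines the cell constant, K = electrode spacing / electrode area, which relates the measured conductance to the specific conductivity of the sample. For the classic cell (1 cm spacing, 1 cm2 plates), K = 1 cm-1. A minimal sketch, with function names of our own choosing:

```python
def cell_constant(plate_spacing_cm: float, plate_area_cm2: float) -> float:
    """Cell constant K = spacing / area, in 1/cm."""
    return plate_spacing_cm / plate_area_cm2

def specific_conductivity(conductance_us: float, k_per_cm: float) -> float:
    """Specific conductivity (uS/cm) = measured conductance (uS) * K."""
    return conductance_us * k_per_cm

k = cell_constant(1.0, 1.0)            # classic cell: K = 1 /cm
print(specific_conductivity(0.0625, k))  # 0.0625 uS/cm
```

Probes with other geometries simply carry a different K, which is why cells are characterized (and calibrated) individually.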
The electrical resistance model also provides a useful analogy for understanding conductivity measurement. The method for measuring electrical resistance consists of applying a known direct current (DC) voltage to the resistor, connected in series with a current meter (ammeter). The current meter displays the current allowed to pass by the resistor. From Ohm's law [R = E/I] we can calculate the resistor's value. For example: if E = 10 V, and I = 0.1 amp, then the resistor's resistance, R = 100 ohms.
To measure water conductivity, the sensing cell or probe, in combination with the sample, is substituted for the resistor. Also, alternating current (AC) is applied instead of DC, because DC polarizes the cell and electrolyzes the water into hydrogen and oxygen. The AC frequency is selected to optimize instrument sensitivity.
Although AC excitation prevents electrolysis of the water, other variables must be considered in this process:
- Phase difference between current and voltage, if not constant, can cause the measured conductivity to shift between meter calibration and sample measurement.
- Capacitance of the cell depends on, among other things, the conductivity of the sample. As capacitance changes, conductance changes; the measurement then departs from strict Ohm's-law behavior, and sample measurements may not correlate with the calibration measurement.
- Body capacitance of the analyst is usually managed by appropriate instrument design.
- Appropriate choice of AC frequency for different ranges can prevent problems associated with resonance in the system. Meters designed for high-level conductivity may be suspect at very low levels.
- User-made cable modifications can affect cable inductance and resistance.
- AC coupling from outside sources such as power lines, motors, fluorescent lighting, and other electric wiring can generate discrepancies between laboratory calibration and remote testing.
Any of these factors can influence test results. In most cases they are negligible; when measuring levels below 10 µS/cm, however, these variables warrant investigation.
It is reasonable to assume that if calibration is performed immediately before sample measurement, the phase difference between the voltage and current within the system is negligible. Ohm's law is then used to calculate the conductivity. From R = E/I, and C = 1/R, then C = I/E. If voltage, E, is held constant, the conductivity is directly proportional to the current, I, through the cell. Alternately, when current is held constant, the conductivity is proportional to the reciprocal of the voltage (1/E). The analyst does not usually deal with these choices, as they are typically a function of the meter’s design. These relationships, however, are important for understanding measurement compromise.
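The relationships in this paragraph reduce to simple arithmetic. The sketch below reworks the resistor example given earlier (E = 10 V, I = 0.1 A):

```python
def conductance_siemens(current_a: float, voltage_v: float) -> float:
    """From Ohm's law R = E/I, conductance C = 1/R = I/E (siemens)."""
    return current_a / voltage_v

# The resistor example from the text: E = 10 V, I = 0.1 A
c = conductance_siemens(0.1, 10.0)
print(c)        # 0.01 S, i.e. R = 100 ohms
print(1.0 / c)  # 100.0 ohms

# At constant voltage, conductance scales directly with current:
print(conductance_siemens(0.2, 10.0))  # doubled current -> 0.02 S
```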
The conductivity of water and other aqueous solutions is highly dependent on temperature. Also, the relationship is non-linear. The conductivity of pure water ranges from about 0.01 µS/cm at 0° C to 0.8 µS/cm at 100° C. These effects can be seen in Figure 1.
It is customary to report conductivity and resistivity values referenced to 25° C. The conductivity-versus-temperature relationship will vary, depending on the composition, concentration, and nature of the dissolved electrolytes in the sample. It is important to note that at 5 µS/cm and below, the dissociation of water itself contributes to the conductivity and increases the temperature coefficient from about 2% per °C to about 5% per °C. It is therefore helpful to make measurements at these levels at 25° C ±0.1° C. When qualifying water under USP 23, uncompensated measurements are used, and conductivity and temperature are measured simultaneously. (Note: For further details regarding temperature compensation techniques, the reader is referred to ASTM D1125.)
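For contexts outside USP qualification, readings are often corrected to the 25° C reference with a linear model using a coefficient like those quoted above. The sketch below is an approximation of that common practice, not the USP procedure (which uses uncompensated readings) and not the more detailed methods of ASTM D1125:

```python
def compensate_to_25c(measured_us_cm: float, temp_c: float,
                      alpha_pct_per_c: float = 2.0) -> float:
    """Linear compensation of a conductivity reading to 25 C.

    alpha is the temperature coefficient in %/C: roughly 2%/C for
    typical electrolytes, closer to 5%/C below about 5 uS/cm, where
    water dissociation dominates. This linear form is an approximation.
    """
    return measured_us_cm / (1.0 + (alpha_pct_per_c / 100.0) * (temp_c - 25.0))

# A 1.10 uS/cm reading at 30 C, compensated at 2%/C:
print(round(compensate_to_25c(1.10, 30.0), 3))  # 1.0 uS/cm at 25 C
```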
Calibrating the Conductimeter
The conductivity measurement of effluent water can be accomplished either in the lab or in process. In either case, calibrating the measuring device is a critical step toward achieving maximum accuracy and validating the measurement.
The pharmaceutical standard for qualifying purified water, USP 23 <645>, requires that the conductivity meter be calibrated to an accuracy of ±0.1 µS/cm. The ideal method for calibrating the instrument is with a low-level conductivity standard (Photo 2) that defines the meter's calibration curve. For maximum accuracy, the conductivity level of the calibration standard should be close to the expected sample level. Calibrating with a 1000 µS/cm standard and measuring a sample of approximately 1.0 µS/cm would place the standard and the sample three orders of magnitude (3 logs) apart, seriously degrading accuracy.
The alternate method of meter calibration, which substitutes resistors for the measurement cell, is generally accomplished on the instrument bench, and does not account for cell constant, drift, contamination, or mechanical change in the measurement cell. Only low-level liquid solution calibration validates performance of both the conductivity cell and the meter electronics. This type of calibration also complies with cGMP procedures and is easier to defend in an FDA audit.
Liquid calibration of the conductimeter facilitates CIP operations by eliminating the need for serial sampling and lab testing. A flow-through cell, mounted in the discharge port of the vessel, monitors the effluent water used in the final rinse. Prior to mounting, the cell is calibrated with a low-level conductivity standard. After the operation is complete, the cell readings are revalidated with an alternate lot or level of the standard. Continuous flow conductimetric testing of final rinse effectively reduces cleaning costs and the time required for preparing a vessel for production.