1. Introduction
|
By definition, method validation is the process of establishing the performance characteristics and limitations of a method, and of identifying the influences that may change these characteristics and to what extent. Which analytes can it determine, in which matrices, and in the presence of which interferences? Within these conditions, what levels of precision and accuracy can be achieved?
|
|
Validation of a test or calibration method aims at providing assurance of the reliability of data. The ongoing reliability and comparability of data can be guaranteed only through the implementation of a quality assurance system, including the application of method validation according to internationally accepted procedures and performance criteria.
|
|
2. When should a method be validated?
|
The test or calibration methods used by laboratories normally come from standards bodies or other professionally established international bodies such as the Association of Official Analytical Chemists (AOAC International). Even such standard methods require some validation: as a minimum, the laboratory needs to determine the accuracy and precision of the method in its own hands. The limit of detection may also need to be established where relevant, e.g. in environmental work.
|
However, when laboratories use test or calibration methods that do not come from such a standard source, full validation will be required.
|
|
|
3. Factors to be Considered in Method Validation
The minimum requirements of method validation are bias, precision and limit of detection (LOD), discussed below. For methods developed from scratch in-house, a much more comprehensive approach covering the other parameters described below is required. Few laboratories take this approach, however; the norm is to adopt, and perhaps slightly modify, standard methods.
|
|
Precision: Precision is a measure of the degree of repeatability of an analytical method under normal operation, and is normally expressed as the percent relative standard deviation (%RSD) for a statistically significant number of samples.
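As a simple illustration of the calculation, the following Python sketch computes the %RSD for a set of replicate results; the replicate values are invented for demonstration only.

import statistics

def percent_rsd(results):
    # Percent relative standard deviation of replicate results
    mean = statistics.mean(results)
    sd = statistics.stdev(results)  # sample standard deviation (n - 1)
    return 100.0 * sd / mean

# Hypothetical replicate determinations of the same sample (e.g. mg/L)
replicates = [10.2, 9.9, 10.1, 10.4, 9.8, 10.0]
print(f"%RSD = {percent_rsd(replicates):.2f}%")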
|
|
The two most common precision measures are 'repeatability' and 'reproducibility'. These are expressions of the two extreme measures of precision that can be obtained. Repeatability (the smallest expected precision) gives an idea of the variability to be expected when a method is performed by a single analyst on one piece of equipment over a short time scale. If a sample is analyzed by a number of laboratories for comparative purposes, then the more meaningful precision measure is reproducibility (the largest measure of precision).
|
|
In practice, the laboratory is usually interested in the extent of variability that occurs over time when it operates the method. This is called 'intermediate precision' and describes the variability when the method is deployed in the same laboratory, perhaps on different pieces of equipment and using different analysts on the staff. This is expected to give a value between repeatability and reproducibility.
|
|
Accuracy: Accuracy is popularly used to describe the exactness of an analytical method, i.e. the closeness of agreement between the value found and the value that is accepted either as a conventional true value or as an accepted reference value. Strictly, accuracy is a qualitative concept, and the correct quantitative term is 'bias'.
|
|
The bias of a method is an expression of how close the mean of a set of results (produced by the method) is to the true value. Bias is usually determined by the study of relevant reference materials or by spiking studies.
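As an illustration, the Python sketch below estimates bias and percent recovery from replicate results on a certified reference material; the certified value and the results are hypothetical.

import statistics

def bias_and_recovery(results, reference_value):
    # Bias (absolute and relative) and recovery of a set of results against a reference value
    mean = statistics.mean(results)
    bias = mean - reference_value
    relative_bias = 100.0 * bias / reference_value
    recovery = 100.0 * mean / reference_value
    return bias, relative_bias, recovery

# Hypothetical replicate results on a reference material certified at 50.0 mg/kg
crm_results = [48.9, 49.5, 49.2, 50.1, 49.0]
bias, rel_bias, recovery = bias_and_recovery(crm_results, 50.0)
print(f"bias = {bias:+.2f} mg/kg ({rel_bias:+.1f}%), recovery = {recovery:.1f}%")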
|
|
Limit of detection: The limit of detection (LOD) is defined as the lowest concentration of an analyte in a sample that can be distinguished from a blank. It is expressed as a concentration at a specified signal-to-noise ratio, usually two-to-one or three-to-one.
|
|
Where measurements are made at low analyte levels, e.g. in trace analysis, it is important to know the lowest concentration of the analyte that can be confidently detected by the method. As a method is used at lower and lower levels, the precision deteriorates; effectively the measurement becomes subject to increasing 'noise'. The limit of detection is the point at which, with a defined probability, it becomes possible to distinguish signal from noise. Normally a 95% probability is the relevant level.
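One common way of estimating the LOD, sketched below in Python with invented blank data, is to take a signal three standard deviations above the mean blank and convert it to a concentration using the calibration slope; other approaches (e.g. signal-to-noise on chromatograms) exist, and the appropriate one depends on the method.

import statistics

def lod_from_blanks(blank_signals, calibration_slope, k=3.0):
    # LOD (in concentration units) taken as k blank standard deviations above the mean blank,
    # converted to concentration via the calibration slope
    sd_blank = statistics.stdev(blank_signals)
    return k * sd_blank / calibration_slope

# Hypothetical blank absorbances and a calibration slope of 0.045 absorbance units per mg/L
blanks = [0.011, 0.013, 0.010, 0.012, 0.014, 0.011, 0.012]
print(f"LOD ≈ {lod_from_blanks(blanks, 0.045):.3f} mg/L")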
|
|
Limit of quantification: The limit of quantification (LOQ) is defined as the lowest concentration of an analyte in a sample that can be determined with acceptable precision and accuracy under the stated operational conditions of the method. Like the LOD, the LOQ is expressed as a concentration, with the precision and accuracy of the measurement also reported.
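In practice a candidate LOQ is often verified by analysing replicates at that level and checking that the precision (and recovery) meet a preset criterion. The Python sketch below assumes an acceptance limit of 10 %RSD and uses invented replicate data.

import statistics

def verify_loq(replicates_at_loq, nominal_conc, max_rsd_percent=10.0):
    # Check whether results at a candidate LOQ meet a %RSD acceptance criterion
    mean = statistics.mean(replicates_at_loq)
    rsd = 100.0 * statistics.stdev(replicates_at_loq) / mean
    recovery = 100.0 * mean / nominal_conc
    return rsd <= max_rsd_percent, rsd, recovery

# Hypothetical replicate results at a candidate LOQ of 0.050 mg/L
loq_replicates = [0.048, 0.052, 0.047, 0.055, 0.051, 0.046]
ok, rsd, recovery = verify_loq(loq_replicates, 0.050)
print(f"%RSD = {rsd:.1f}%, recovery = {recovery:.0f}%, acceptable: {ok}")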
|
|
The overall effect is that, between the LOD and the LOQ, the laboratory will state that the analyte is detectable but will not offer a result, as the precision is unacceptable. Below the LOD the analyte is not detected, and above the LOQ it is measured.
|
|
Selectivity/Specificity:
|
For analytical methods, the power of discrimination between the analyte and closely related substances (e.g. isomers, metabolites, degradation products, endogenous substances, matrix constituents) is important.
Therefore, potentially interfering substances must be chosen, and relevant blank samples must be analyzed, to detect the presence of possible interferences and to estimate their effect.
|
In general, analytical methods can be said to consist of a measurement stage, which may or may not be preceded by an isolation stage. It is necessary to establish that the signal produced at the measurement stage, which has been attributed to the analyte, is due only to the analyte and not to the presence of something chemically or physically similar, or arising by coincidence. Whether or not other compounds interfere with the measurement of the analyte will depend on the effectiveness of the isolation stage and the selectivity/specificity of the measurement stage. Selectivity and specificity are measures that assess the reliability of measurement in the presence of interferences. Specificity is generally considered to be 100% selectivity, but this usage is not universal. Where the measurement stage is not specific, it may still be possible to state that certain analytes do not interfere.
|
|
Linearity and Range:
|
Linearity is the ability of the method to elicit test results that are directly proportional to analyte concentration within a given range. Linearity is generally reported as the variance of the slope of the regression line. Traditionally, linearity was regarded as a desirable property of methods, as only linear calibration curves could be easily interpreted. With the ready availability of computing power this is now of little importance, and non-linear calibrations can readily be dealt with.
Range is the interval between the lower and upper levels of analyte (inclusive) that have been demonstrated to be determined with the stated precision and accuracy using the method as written.
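For illustration, the Python sketch below fits an ordinary least-squares line to a hypothetical calibration series, reports the slope, intercept and correlation coefficient, and then back-calculates an unknown from its response; the concentrations and responses are invented.

import statistics

# Hypothetical calibration standards (mg/L) and instrument responses
conc = [0.0, 1.0, 2.0, 5.0, 10.0, 20.0]
response = [0.002, 0.046, 0.091, 0.228, 0.452, 0.897]

# Ordinary least-squares fit: response = slope * conc + intercept (Python 3.10+)
slope, intercept = statistics.linear_regression(conc, response)
r = statistics.correlation(conc, response)
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, r = {r:.4f}")

# Back-calculate the concentration of an unknown sample from its response
unknown_response = 0.300
print(f"unknown ≈ {(unknown_response - intercept) / slope:.2f} mg/L")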
|
|
Ruggedness: Ruggedness is the degree of reproducibility of the results obtained under a variety of conditions, expressed as %RSD. These conditions include different laboratories, analysts, instruments, reagents, days, etc.
|
|
Robustness: Robustness is the capacity of a method to remain unaffected by small, deliberate variations in method parameters. The robustness of a method is evaluated by deliberately varying its parameters and observing the effect on the results.
|
|
4. Conclusion
|
Standard methods will have been developed collaboratively by a group of experts, and in theory this development should include consideration of all the necessary aspects of validation. However, the responsibility remains with the user to ensure that the validation documentation is complete for their needs. Even if the validation is complete, the user still has to ensure that it establishes that the method is fit for its intended purpose.
|
|
|
|
Reference |
|
The Fitness for Purpose of Analytical Methods: A Laboratory Guide to Method Validation and Related Topics (1998)
|