Ann-Cathrin Volz details the potential risks and options for monitoring DNA purity ratios
Not below 1.7 and not above 2.0. Do these numbers sound familiar? Anyone who has ever worked with nucleic acids knows them by heart and tries to adhere strictly to the limits of the 260/280 and 260/230 ratios for their nucleic acid samples. But have you ever taken the time to investigate the background of these purity ratios? It could pay off: the ratios leave much more room for interpretation than a simple pass/fail classification.
Absorbance-based purity assessment
Besides gel electrophoresis, DNA/RNA purity is classically measured by spectrophotometric absorbance. Alternatively, fluorescence-based quantification methods offer a more sensitive approach and may allow DNA to be detected down to the femtogram range. However, they require additional handling steps to stain DNA samples with specifically binding fluorescent dyes. In absorbance-based measurements, DNA concentrations can be calculated directly from the optical density (OD) at 260 nm using the DNA-specific extinction coefficient. This circumvents the need for a separately prepared reference standard curve. However, most common contaminants found in nucleic acid preparations absorb in the range between 230 and 340 nm (Fig. 1).
Since potential contaminants also absorb light at 260 nm, they can interfere with the calculation of DNA concentrations (Fig. 2A). Guanidine hydrochloride and sodium acetate, both salts frequently used for nucleic acid purification, tend to decrease DNA absorbance at 260 nm and consequently lead to an underestimation of its concentration (Fig. 2A+B). In contrast, cellular protein residues (here represented by BSA), or the phenol used to remove them, show substantial absorbance at 260 nm and thereby lead to an overestimation of the actual DNA concentration. The same applies to particulate substances, e.g. those derived from precipitated sample components.
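The direct calculation from OD mentioned above can be sketched in a few lines. This is an illustrative example, not any vendor's software; it assumes the widely used conversion factor of 50 ng/µL per A260 unit for double-stranded DNA and a path length normalised to 1 cm.

```python
def dsdna_concentration(a260: float, path_length_cm: float = 1.0,
                        factor: float = 50.0) -> float:
    """Estimate dsDNA concentration in ng/µL from absorbance at 260 nm.

    Assumes the standard conversion of 50 ng/µL per A260 unit for dsDNA
    (use ~40 for RNA, ~33 for ssDNA) and Beer-Lambert linearity.
    """
    return a260 * factor / path_length_cm

# A sample with A260 = 0.5 measured over a 1 cm path:
print(dsdna_concentration(0.5))  # 25.0 ng/µL
```

Note that this figure is only trustworthy if the 260 nm signal really comes from DNA; as described above, contaminants absorbing at 260 nm shift the estimate up or down.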
DNA purity ratios
Miscalculation of DNA concentrations inevitably leads to the use of incorrect DNA quantities in downstream applications. Furthermore, contaminants may interfere with downstream assays. To save time and cost, quality assessment of the isolated nucleic acids, or of the purification method itself, is an important prerequisite. Purity ratios are typically used in absorbance-based evaluation. They exploit the fact that potential contaminants alter absorbance not only at 260 nm but also at other wavelengths. Phenol, proteins and salts, for example, show increased absorbance at 230 nm. Contamination with such substances may therefore be uncovered by examining the 260/230 ratio, which should be ~2.2 for DNA samples. Furthermore, phenol, particulate substances and several amino acids such as tyrosine, tryptophan and phenylalanine display increased absorbance at 280 nm. These substances can in turn be identified by considering the 260/280 ratio, for which values of ~1.8 are generally targeted. Examining these ratios not only reveals the presence of impurities, but also allows them to be traced back to a group of substances.
Although the quantification of DNA is significantly affected by salts such as guanidine hydrochloride, their impact on downstream applications is often negligible. Depending on the application, this makes it acceptable to disregard reduced 260/230 ratios as long as 260/280 ratios are in the normal range. Proteins, on the other hand, can significantly interfere with various downstream applications. If the 260/230 and 260/280 ratios are both outside the normal range, this should be considered a valid reason to revise the purification protocol.
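The decision logic described above can be sketched as a short routine. The acceptance windows used here are illustrative assumptions built around the targets mentioned in the text (~1.8 for 260/280, ~2.2 for 260/230); individual labs and applications may set their own limits.

```python
def assess_purity(a230: float, a260: float, a280: float):
    """Classify a DNA sample by its purity ratios.

    Assumed acceptance windows (illustrative, not universal):
    260/280 between 1.7 and 2.0, 260/230 between 2.0 and 2.3.
    """
    r280 = a260 / a280
    r230 = a260 / a230
    ok_280 = 1.7 <= r280 <= 2.0
    ok_230 = 2.0 <= r230 <= 2.3
    if ok_280 and ok_230:
        verdict = "pure"
    elif ok_280 and not ok_230:
        # Low 260/230 with a normal 260/280 points towards salts,
        # which are often tolerable downstream.
        verdict = "salt contamination likely; often tolerable downstream"
    else:
        # A deviating 260/280 suggests protein or phenol carry-over,
        # a valid reason to revise the purification protocol.
        verdict = "protein/phenol contamination suspected; revise protocol"
    return r280, r230, verdict

# A clean sample: 260/280 ≈ 1.82, 260/230 ≈ 2.08
print(assess_purity(0.48, 1.0, 0.55))
```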
In addition to the ratios described, the absorbance at 320 or 340 nm is also often considered. An increased absorbance here occurs almost exclusively due to light scatter by particulate substances. Particulates not only raise the absorbance at 260 nm and 340 nm, they increase the OD consistently over the entire wavelength range from 220 to 360 nm and beyond. Since their impact on downstream applications is limited, this spectral behaviour can be exploited to perform a background correction: if the absorbance at 320-340 nm is elevated, that OD is simply subtracted from the whole spectrum, including the absorbance at 260 nm. This way, the calculated DNA concentration is corrected back to its actual value.
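Because the scatter contribution is essentially flat, this correction amounts to subtracting a constant baseline. A minimal sketch, assuming the spectrum is held as a wavelength-to-OD mapping and that DNA itself does not absorb at the chosen reference wavelength:

```python
def background_correct(spectrum: dict, ref_wavelength: int = 340) -> dict:
    """Subtract the OD at a reference wavelength (320-340 nm, where DNA
    does not absorb) from every point of the spectrum, removing the flat
    offset caused by light scatter from particulate substances."""
    baseline = spectrum[ref_wavelength]
    return {wl: od - baseline for wl, od in spectrum.items()}

# Hypothetical spectrum with a 0.10 scatter offset across all wavelengths:
raw = {230: 0.55, 260: 1.10, 280: 0.65, 340: 0.10}
corrected = background_correct(raw)
print(corrected[260])  # ~1.0 once the scatter offset is removed
```

After correction, the 260 nm value again reflects the DNA alone, and concentrations calculated from it return to their actual level.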
The ideal solution
BMG Labtech equips its single- to multi-mode microplate readers with a UV/vis spectrometer capable of acquiring full absorbance spectra from 220-1,000 nm in under one second per well. This enables the simultaneous acquisition of OD values at 230, 260, 280 and 340 nm, much faster than any filter- or monochromator-based reader. All the data needed to calculate purity ratios and monitor possible impurities can thus be acquired in a blink.
Ann-Cathrin Volz is with BMG Labtech