Improving Western blot reproducibility

Sam Egel reports on the advantages of IR fluorescence imaging technology

Obtaining reproducible data from chemiluminescent Western blots imaged on film is difficult. The need for consistent results has driven the development of infrared (IR) fluorescence imaging technology that overcomes the variability inherent in chemiluminescent detection.

Limitations of chemiluminescence make it difficult to achieve consistent Western blot results. Chemiluminescence is an enzymatic reaction that generates light to indicate the presence of a target antigen. The rate of this reaction varies over time, so light production is not constant. Blots imaged at different stages in the reaction, due to delays such as waiting for the darkroom, will be inconsistent. Even identical blots imaged at the same time can produce inconsistent results (see Fig. 1).
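
To make the timing effect concrete, the toy model below assumes chemiluminescent light output decays exponentially after substrate addition; the initial intensity, half-life and time points are invented for illustration, not measured kinetics.

```python
import math

# Toy model: chemiluminescent light output decaying exponentially after
# substrate addition. The half-life and intensities are assumed values
# for illustration only.
def chemi_signal(initial_intensity, minutes_after_substrate, half_life_min=10.0):
    decay_rate = math.log(2) / half_life_min
    return initial_intensity * math.exp(-decay_rate * minutes_after_substrate)

# Two identical blots with the same amount of antigen, imaged at different
# times (e.g. one waited for darkroom access), give different band
# intensities even though nothing biological changed.
same_antigen_signal = 1000.0
print("imaged at  5 min:", round(chemi_signal(same_antigen_signal, 5)))   # ~707
print("imaged at 25 min:", round(chemi_signal(same_antigen_signal, 25)))  # ~177
```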

Accuracy is further compromised by stripping and re-probing chemiluminescent blots to detect multiple targets. Stripping can remove antigen from the membrane and lead to inconsistent replicates.

The limitations of film

Film's non-linear response to low- and high-intensity signals also compromises data integrity. Neither very weak nor very strong signals produce a proportional change in optical density (darkness level) on film, which limits the accuracy of quantification.

This non-linear response, called ‘reciprocity failure’, stems from the way film records signals using light-sensitive silver halide grains. Photons strike silver ions in these grains and reduce them to elemental silver, forming a ‘latent’ image. During development, the elemental silver catalyses the conversion of the grain to dark metallic silver. Optical density increases as additional grains are converted to metallic silver. However, the relationship between the number of photons that strike the film and the number of grains converted to silver is not always linear.

Low-intensity signals do not produce photons rapidly enough to irreversibly activate silver grains. Activated grains are unstable; they must absorb multiple photons to create a latent image, or they will revert to the unactivated state and a latent image will not form, even with long exposures. This is called ‘low-intensity reciprocity failure’ and is demonstrated in the ‘toe’ region of the graph in Fig. 2.

Strong signals cause the film response to plateau and become saturated. When many silver grains are activated, each additional photon is statistically unlikely to strike an unactivated grain. Photons that do not activate a silver grain are not recorded as optical density, causing dramatic under-representation of strong signals. This is called ‘high-intensity reciprocity failure’ and is demonstrated in the ‘shoulder’ region of the graph in Fig. 2.
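
For illustration, the toe-and-shoulder behaviour described above can be sketched with a simple logistic model of optical density versus log exposure; the curve shape and parameter values here are assumptions chosen for demonstration, not measured film characteristics.

```python
import math

# Illustrative only: a logistic "characteristic curve" mapping
# log10(exposure) to film optical density (OD). The curve shape and
# parameter values are assumptions chosen to show a toe and a shoulder.
def film_od(exposure, od_max=3.0, midpoint=0.0, slope=2.0):
    log_e = math.log10(exposure)
    return od_max / (1.0 + math.exp(-slope * (log_e - midpoint)))

# An ideal linear detector reports a signal proportional to exposure.
def linear_signal(exposure, gain=1.0):
    return gain * exposure

# Each row increases exposure 10-fold. Film OD rises in large steps only
# in the middle of the curve; the steps shrink in the toe (low exposure)
# and the shoulder (high exposure), while the linear detector's reading
# grows 10-fold on every row.
for e in [0.01, 0.1, 1.0, 10.0, 100.0]:
    print(f"exposure {e:8.2f}   film OD {film_od(e):4.2f}   linear {linear_signal(e):8.2f}")
```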

Artifacts from the film development process can obscure image data. Bands can appear blurry if the blot or film shifts during handling. Strong signals can blow out and bleed into other bands. Mechanical damage from the developer and even static electricity can leave marks on film. Such artifacts can make a blot difficult to interpret, limiting its analytical usefulness.

Office scanners are often used to digitise film for analysis. However, scanners cannot accurately reproduce the original image data, and the dynamic range of scanned images is 2.6 times lower than that of CCD images. Among other problems, scanners truncate large signal peaks to limit output optical density to a factory preset range. This compromises data integrity because not all peaks are recorded in the same way.
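
As a hypothetical illustration of that truncation, clipping peaks of different heights to the same preset maximum destroys the ratio between them; the band intensities and cap value below are invented for demonstration.

```python
# Hypothetical peak intensities for three bands on a scanned film
# (arbitrary optical-density units); the values are invented.
peaks = [0.8, 2.4, 4.0]

# If the scanner caps output optical density at a factory preset maximum
# (the cap value here is assumed), every peak above the cap is recorded
# at the same level and the true ratio between strong bands is lost.
od_cap = 2.0
recorded = [min(p, od_cap) for p in peaks]

print("true ratio, band 3 vs band 2:    ", peaks[2] / peaks[1])        # ~1.67
print("recorded ratio, band 3 vs band 2:", recorded[2] / recorded[1])  # 1.0
```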

Unlike chemiluminescent substrates, IR fluorophores provide a stable, reproducible signal over time. Reproducibility is enhanced because timing no longer affects imaging results. Properly stored IR blots are stable for 18 months or more and can be re-imaged later with the same results.

The linear, proportional relationship between IR fluorescent signal and amount of antigen (Fig. 3) enables accurate comparison and protein quantification. Moreover, IR fluorescence technology provides over six logs of linear dynamic range (LDR). By contrast, manufacturers claim an LDR of about 100-fold for CCD imagers, although values closer to 15-fold have been observed in practice. Film's LDR varies from eight-fold to roughly 10-fold. With the wide LDR of IR imaging, a single digital image contains the information provided by all possible film or CCD exposures, with much deeper analytical detail.
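
To put those figures on a common scale, the short calculation below converts each quoted fold-range into orders of magnitude (log10 units); the fold values are simply the ones cited above, with "over six logs" taken as at least a 1,000,000-fold range.

```python
import math

# Linear dynamic range expressed as a fold-range (maximum / minimum
# quantifiable signal), converted to orders of magnitude for comparison.
ldr_folds = {
    "film (best case)": 10,
    "CCD (observed)": 15,
    "CCD (manufacturer claim)": 100,
    "IR fluorescence": 10**6,   # "over six logs" = at least 1,000,000-fold
}

for name, folds in ldr_folds.items():
    print(f"{name:25s} {folds:>9,d}-fold = {math.log10(folds):4.1f} logs")
```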

Fewer manual steps

IR blots are imaged directly to a digital medium, which saves time and eliminates variability caused by substrate incubation, film exposure, and digital scanning of film. This streamlined workflow leads to greater data reproducibility.

Finally, IR technology allows multiple targets to be detected on the same blot using spectrally distinct fluorophores. Multiplex detection improves quantification accuracy by allowing one channel to be used for normalisation, eliminating the error-prone stripping and re-probing process.
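
A minimal sketch of that normalisation step, assuming band intensities have already been quantified from the two channels; the values, lane count and protein roles here are hypothetical.

```python
# Hypothetical band intensities quantified from a two-colour IR blot:
# the target protein in one channel and a loading control (e.g. a
# housekeeping protein) in the other, for three replicate lanes.
target_signal = [15200.0, 18100.0, 9800.0]
loading_control = [5000.0, 6100.0, 3300.0]

# Normalise each target band to its own lane's loading control so that
# differences in loading or transfer do not masquerade as changes in
# target abundance. No stripping/re-probing is needed because both
# targets are imaged from the same blot in separate channels.
normalized = [t / c for t, c in zip(target_signal, loading_control)]

for lane, value in enumerate(normalized, start=1):
    print(f"lane {lane}: normalised target = {value:.2f}")
```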

IR technology is an excellent alternative to film exposure, providing reproducible results with stable, quantitative Western blot signals and a streamlined workflow.

For more information, visit www.scientistlive.com/eurolab

Sam Egel is a technical writer, Biotechnology, with Licor Biosciences.
