A comparison between back- and front-illuminated sensors

Dr Gerhard Holst tackles a common question in the discussion about back- versus front-illuminated sensors

Germany-based camera manufacturer PCO develops scientific CMOS (sCMOS) cameras with both back- and front-illuminated sensor technology. When comparing the two, one question is often raised: why is a backside-illuminated sensor more sensitive than a frontside-illuminated one?
All image sensors have light-sensitive pixels, but what does that mean? The pixels allow a spatial localisation of incoming light and consist of various electronic components interconnected with metal wires to enable digitisation. The basic light-to-charge-carrier conversion elements are the photodiodes. Both the lateral area and the volume of a pixel are shared by photodiodes, metal wiring, transistors and capacitors. Thus, the sensitivity of an image sensor strongly depends on how much of the total pixel area is used for light-to-charge conversion, or in other words, is light-sensitive.

Fill Factor

The fill factor of an image sensor describes the ratio of the light-sensitive area to the total area of a pixel:

fill factor = (light-sensitive area of pixel) / (total area of pixel)

For example, in interline transfer CCD image sensors, where the pixel area was shared by the photodiode and the shielded register, the fill factor was in the range of 30%. This means a minimum of 70% of the incoming light would have been lost. The same principle holds true for CMOS image sensors, where transistors, capacitors and wires occupy valuable light-converting space. During CCD sensor development, measures were devised to compensate for the fill factor loss. The most effective was simply adding microlenses on top of the image sensor.
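The fill-factor arithmetic above can be made concrete with a short sketch. The pixel and photodiode dimensions below are assumed purely for illustration (they are not taken from any sensor mentioned in the article); they are chosen to reproduce the roughly 30% fill factor of an interline transfer CCD:

```python
# Fill-factor arithmetic with assumed, illustrative dimensions.
pixel_pitch_um = 6.5                       # pixel is pitch x pitch (assumed value)
pixel_area = pixel_pitch_um ** 2           # total pixel area: 42.25 um^2
photodiode_area = 12.7                     # light-sensitive area in um^2 (assumed value)

fill_factor = photodiode_area / pixel_area
light_lost = 1.0 - fill_factor             # fraction falling on non-sensitive structures

print(f"fill factor: {fill_factor:.0%}")   # ~30%, as in the interline CCD example
print(f"light lost:  {light_lost:.0%}")    # ~70%
```

With these numbers, roughly 70% of the incoming light misses the photodiode entirely, which is exactly the loss the microlenses were introduced to recover.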

Fig. 1 illustrates the differences in light collection for perpendicularly impinging light. While some of the light is scattered, reflected or absorbed by the structures of the image sensor, the microlenses focus the light onto the charge-converting photodiodes much more efficiently than without them (Fig. 1a and 1b).

By this measure, the CMOS image sensor shown in Fig. 1 has a total quantum efficiency of about 50% – quite good considering there are additional loss mechanisms in image sensors. The best quantum efficiencies achieved in interline transfer CCDs (with 30% fill factor) have been around 50 to 70%. In more recent sCMOS image sensors with similar fill factors, quantum efficiencies above 80% have been achieved by optimising the microlenses and the manufacturing process. But the microlenses are, in most cases, made of mouldable plastic, which significantly attenuates any UV light transmission.
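One way to see how microlenses lift the total quantum efficiency above the bare fill factor is a simple multiplicative sketch. The microlens gain and internal conversion efficiency below are assumptions chosen for illustration, not measured values:

```python
# Rough decomposition: total QE ~ optical collection x internal conversion.
# All numbers are illustrative assumptions, not measured sensor data.
geometric_fill_factor = 0.30   # interline-CCD-style pixel, as in the text
microlens_gain = 2.3           # assumed factor by which lenses redirect light onto the diode
conversion_eff = 0.85          # assumed internal light-to-charge conversion efficiency

optical_collection = min(geometric_fill_factor * microlens_gain, 1.0)
total_qe = optical_collection * conversion_eff

print(f"collection: {optical_collection:.0%}")  # 69%
print(f"total QE:   {total_qe:.0%}")            # ~59%, inside the 50-70% range quoted
```

The point of the sketch is structural: the microlenses raise the optical collection well above the geometric fill factor, but the total QE is still capped by the remaining loss mechanisms.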

Furthermore, microlenses introduce a new influence, since the performance of these optical devices depends on the angle of incidence. This means that the microlenses add a more pronounced angular dependency to the quantum efficiency, as can be seen in Fig. 2.

The blue curve shows the horizontal angular dependence of the relative quantum efficiency of the interline transfer CCD image sensor. From an incident angle of 5° upwards, the relative quantum efficiency drops significantly, while the vertical dependence is less pronounced. This can be explained by the fact that in the vertical direction the light-sensitive area covers nearly the whole pixel, while horizontally half or more of the area is occupied by the shielded register.
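The angular behaviour described above can be caricatured with a toy model: the response is flat inside a small acceptance cone and rolls off beyond it. The 5° acceptance angle matches the text, but the roll-off rate is an assumption; this is not a reconstruction of the measured curves in Fig. 2:

```python
# Toy model of a microlens-equipped pixel's angular response:
# flat inside an acceptance cone, linear roll-off outside it.
# Acceptance angle from the text; roll-off rate is an assumption.
def relative_qe(angle_deg, acceptance_deg=5.0, rolloff_per_deg=0.03):
    """Relative quantum efficiency versus incidence angle (illustrative)."""
    excess = max(abs(angle_deg) - acceptance_deg, 0.0)
    return max(1.0 - rolloff_per_deg * excess, 0.0)

for angle in (0, 5, 10, 20):
    print(f"{angle:2d} deg -> {relative_qe(angle):.2f}")
```

A model like this is only meant to make the qualitative shape tangible: no loss up to the acceptance angle, then a steady decline as more of the focused light misses the photodiode.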

From Frontside to Backside Illumination

However, the microlenses cannot collect and focus incoming light at all ray angles. Furthermore, the semiconductor manufacturing process adds layers above the photodiodes containing the wiring, transistors and capacitors (Fig. 3). The electronics in these layers can scatter and absorb light, reducing the light-to-charge conversion.

The loss of light due to physical blocking and scattering by electronics is more pronounced in CMOS image sensors with small pixel pitches and high pixel counts (>4 MP) than in many CCD sensors. Due to the massive adoption of CMOS sensors (e.g. in smartphone cameras), semiconductor manufacturers have developed methods to process the wafer with the image sensors effectively reversed, with a large part of the substrate physically and chemically etched away. This results in image sensors that are illuminated from the back, so the light reaches the photodiodes more directly (Fig. 3b).

Current backside-illuminated CMOS image sensors achieve quantum efficiencies better than 90%. However, the introduction of an additional surface (the surface of the backside) also adds dark current and noise sources; the caveat is that many backside-illuminated image sensors have a higher dark current than their frontside-illuminated counterparts.

Backside-Illuminated Image Sensors with Microlenses

The advantage of having fewer layers above the photodiodes (higher sensitivity) also brings a disadvantage: decreased sharpness, technically described by the modulation transfer function (MTF). Due to the remaining substrate above the photodiodes, backside-illuminated image sensors generally show a decreased MTF, and light arriving at particular angles can be scattered or incorrectly guided into the neighbouring pixel. Luckily, the same microlens method, initially developed to increase the fill factor, now improves the MTF.

Fig. 4 illustrates light rays hitting a backside-illuminated image sensor at an angle (Fig. 4a) and shows that the microlenses (Fig. 4b) help collect the light at the photodiodes belonging to the pixel on which the light impinged.

Conversely, as discussed for Fig. 2, the introduction of microlenses again has an impact on the angular dependence of the quantum efficiency. This means that backside-illuminated image sensors without microlenses are largely independent of the incident angle – even better than the red curve in Fig. 2.

The Answer

Returning to our initial question: backside-illuminated image sensors have fewer obstacles in the path of the incoming light as it reaches the volume of the pixel where the conversion to charge carriers takes place. Therefore, backside-illuminated CMOS image sensors convert more of the light into charge carriers, resulting in larger signals and better images.

Dr Gerhard Holst is Head of Science & Research Business Development at PCO
