mix pixels of different colors. Images must be Debayer-processed first and
then stacked.
Balancing colors
The CCD chip's sensitivity to red, green and blue light differs. This means an
exposure of a uniformly illuminated white surface does not produce the same
signal in pixels covered with different color filters. Blue pixels usually
gather less light (they have lower quantum efficiency) than green and red
pixels. The result is a more or less yellowish image (yellow being a
combination of red and green).
The effect described above is compensated by so-called “white balancing”.
White balancing brightens the less intensive colors (or darkens the more
intensive ones) to achieve a color-neutral appearance of white and/or gray
tones. Usually one color (e.g. green) is taken as the reference, and the
other colors (red and blue) are brightened or darkened to match it.
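The reference-channel scaling described above can be sketched as follows. This is a minimal illustration, not the SIPS implementation: it assumes the frame is a floating-point RGB array and balances on the overall channel means (in practice the statistic would come from a known white or gray area of the image), with green as the reference.

```python
import numpy as np

def white_balance(rgb):
    """Scale the red and blue channels so their means match green's.

    Assumes `rgb` is an H x W x 3 float array. Channels dimmer than
    green are brightened; brighter ones are darkened.
    """
    means = rgb.reshape(-1, 3).mean(axis=0)  # mean signal of R, G, B
    gains = means[1] / means                 # green (index 1) is the reference
    return rgb * gains

# Example: a yellowish frame where the blue channel gathered less light.
frame = np.dstack([np.full((2, 2), 0.8),   # red
                   np.full((2, 2), 0.8),   # green
                   np.full((2, 2), 0.5)])  # blue (weaker)
balanced = white_balance(frame)
```

After balancing, all three channel means equal the green mean, so a gray surface appears color-neutral.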
Automatic white balancing can be relatively easy on ordinary images, where
all colors are represented approximately uniformly, but it is almost
impossible on images of deep-space objects. Consider, for instance, an image
of an emission nebula dominated by the deep-red hydrogen-alpha line: any
attempt to brighten green and blue to create a color-neutral image results
in a totally wrong color representation. Astronomical images are therefore
usually color balanced manually.
As already described in the “Brightness and Contrast – Image Stretching”
chapter, an image can be visually brightened by altering its stretch limits.
The SIPS “Histogram and Stretch” tool displays the stretching curve and
allows its limits and shape to be adjusted for the red, green and blue
colors individually.
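The idea of per-channel stretch limits can be sketched as a simple linear stretch; the limit values below are hypothetical, chosen only to show that each channel may be stretched differently (here the blue channel is lifted more than red and green):

```python
import numpy as np

def stretch(channel, lo, hi):
    # Map the input range [lo, hi] linearly to [0, 1], clipping values
    # below lo to 0 and above hi to 1 (a linear stretch curve).
    return np.clip((channel - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))  # stand-in for a calibrated RGB image

# Hypothetical per-channel limits, adjusted independently as in the
# Histogram and Stretch tool.
limits = {"r": (0.1, 0.9), "g": (0.1, 0.9), "b": (0.0, 0.7)}
stretched = np.dstack([
    stretch(img[..., 0], *limits["r"]),
    stretch(img[..., 1], *limits["g"]),
    stretch(img[..., 2], *limits["b"]),
])
```

Narrowing a channel's limits brightens that color in the displayed image, which is one manual way to balance the colors of a deep-sky image.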