Calvin Klatt

Feeling Blue but Seeing Red - Part 3: Analysis and Conclusions

In two previous postings (Part 1, Part 2) I assessed the sensitivity of the camera I have been using, my “super” camera, the ZWO ASI6200MC, and the very similar monochrome camera, the ZWO ASI6200MM.


The results show several things:

- The efficiency or sensitivity of the camera varies significantly across the optical spectrum.

- The claimed “quantum efficiency” of over 90% is not really achieved in practice.

- The colour camera’s separation of R, G and B is a near-complete failure. Presumably the expensive Sony cameras built around this same chip rely on heavy-duty on-board image processing to compensate.



Figure 1 shows the efficiency (the percentage of photons that should produce a signal in the CMOS sensor) of the ZWO ASI6200MC colour camera. The sensitivities of the Red, Green and Blue channels, as well as the total, are shown.



Figure 2 shows the efficiency (the percentage of photons that should produce a signal in the CMOS sensor) of the ZWO ASI6200MM monochrome camera, with no filter and with RGB filters from a company called Astronomik. The passband of a Hydrogen-alpha filter, only 4 nm wide, is also shown; it appears here as just a spike.


A brief review of how the camera images are processed is necessary now to better understand the implications.


A colour camera produces images with three colour channels, an RGB image, usually saved as a single FITS file. FITS (Flexible Image Transport System) is the data format most widely used in astronomy for transporting, analyzing, and archiving scientific data. I use some free software to extract TIFF files from it, one for each colour (R, G and B). I use a tool in Photoshop to load those three and form a colour image, and a few other tools to enhance the result and produce the final image. In many cases the colour separation is minimal and the final images look almost like greyscale.
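
For anyone who prefers to script that extraction step, here is a minimal sketch in Python. It assumes the FITS file already holds a debayered (3, height, width) RGB cube in the primary HDU and that the astropy and tifffile packages are installed; the file names are hypothetical.

```python
# Sketch: split a debayered RGB FITS cube into three 16-bit TIFFs,
# one per colour channel. Assumes the primary HDU holds a
# (3, height, width) array; file names are made up for illustration.
from astropy.io import fits
import numpy as np
import tifffile

with fits.open("target_rgb.fits") as hdul:
    cube = hdul[0].data  # expected shape (3, H, W): R, G and B planes

for plane, colour in zip(cube, ("R", "G", "B")):
    # FITS data is often float or 32-bit; clip into the 16-bit range
    plane16 = np.clip(plane, 0, 65535).astype(np.uint16)
    tifffile.imwrite(f"target_{colour}.tif", plane16)
```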


The equivalent imaging with a monochrome camera would involve successively placing R, G and then B filters in front of the camera. There would be three FITS files, one for each colour, from which TIFFs can be created. These would be loaded into Photoshop and the rest of the processing would be the same.


It is generally claimed that if one can switch the filters quickly (and perhaps refocus) then the overall time required to observe a target is roughly the same for colour and monochrome cameras. The colour camera observes the three colours at the same time, while the monochrome camera collects only one colour at a time; the monochrome camera’s increased efficiency compensates. A rough back-of-envelope check is sketched below.
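
The following small calculation illustrates the claim, assuming an RGGB Bayer matrix on the colour camera and ignoring differences in filter transmission and sensor efficiency (both simplifying assumptions, not measurements).

```python
# Effective per-colour exposure per pixel for the same total time T.
T = 3.0  # total hours on target (arbitrary example)

# One-shot colour: all T hours, but each colour reaches only a
# fraction of the pixels (RGGB matrix: R 1/4, G 1/2, B 1/4).
osc = {"R": 0.25 * T, "G": 0.50 * T, "B": 0.25 * T}

# Monochrome + filters: every pixel sees each colour, for T/3 each.
mono = {c: T / 3 for c in "RGB"}

for c in "RGB":
    print(f"{c}: colour {osc[c]:.2f} h/px  vs  mono {mono[c]:.2f} h/px")
# R and B favour the monochrome camera, G favours the colour camera,
# so under these assumptions the totals come out roughly comparable.
```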


The main negative, in my opinion, for monochrome RGB imaging is that you might collect one colour and then get clouded over for a week or a month. Perhaps by the time you observe again the target is gone and you have to wait a year for it to roll around again. This imaging also requires more effort to switch filters and to make sure that the focus is correct. More work = bad. More work outside in the middle of the night during Canadian winters = very bad.


The comparison of the two cameras reveals a negative aspect of colour camera imaging. The colours are all mixed up! Efficiency for any single colour is quite low. The camera’s built-in UV and IR cutoffs also prevent us from exploring those wavelengths (IR in particular is useful for lunar imaging). Colours should be more vibrant with the monochrome camera for the same total exposure time.


For a very faint galaxy target, the monochrome camera wins hands-down because the final image will be essentially monochrome no matter what we do. The monochrome camera can be used with no filter and it will be VASTLY more sensitive. Compare the yellow lines in the two graphs if you want proof of that.


Note that for most astronomical imaging a “luminance filter” would be used with the monochrome camera, passing only wavelengths between 400 and 700 nm but with near 100% transparency inside this band. There is also a filter that passes only IR wavelengths, useful for sharp lunar imaging. This IR imaging is not possible with the colour camera: I placed a very cheap lunar IR filter in front of the colour camera and verified that no photons made it through.


This “luminance” filter leads us to another topic – using a luminance channel (to measure luminosity) in addition to the RGB imagery. Luminosity is the overall brightness of the image, pixel by pixel. It has no colour information, just brightness. If you combine a luminance frame with RGB frames, the RGB information will provide ONLY colour information (i.e. saturation and hue). The resulting image will have the crispness of the luminance layer, with colour layered on from RGB channels.
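
As a concrete illustration, here is a much-simplified sketch of that combination using numpy. Real processing tools do this in a proper colour space (Lab, for instance), so treat the per-pixel rescaling below as an assumption made for clarity rather than how any particular software works.

```python
import numpy as np

def lrgb_combine(L, rgb, eps=1e-6):
    """Combine a luminance frame with an RGB stack.

    L:   2-D array in 0..1, the sharp/deep luminance frame
    rgb: 3-D array (height, width, 3) in 0..1, the colour data
    """
    # Brightness implied by the RGB stack (simple average; weighted
    # sums such as Rec.709 are another common choice).
    implied = rgb.mean(axis=2)
    # Rescale each pixel's RGB so its brightness matches L: detail
    # comes from the luminance frame, hue and saturation from RGB.
    gain = L / (implied + eps)
    return np.clip(rgb * gain[..., None], 0.0, 1.0)
```

The point to notice is that the RGB data only sets the ratio between the channels; the per-pixel brightness is taken entirely from the luminance frame.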



Figure 3 shows how luminosity, hue, and saturation combine to produce colour images.


LRGB imaging sounds even more complicated, but it may be the ideal use of a monochrome camera. For example, consider an image based on 1 hour of luminance data (letting all light through from 400 nm to 700 nm). It will be greyscale, but it will blow away any colour camera for sensitivity and the resulting detail. Then imagine that 30 minutes each of R, G and B are collected, for a total observation time of 2.5 hours. Such an image can be expected to be superior to one from a colour camera collecting three colours simultaneously for 2.5 hours.


I do not yet have a monochrome camera, so I can’t really say if this is true. I certainly don’t relish wandering out in the middle of the night to change filters. However, my new ZWO ASI6200MM Pro monochrome camera is arriving in a few days (June 13, 2022), packaged with RGB filters, and we shall see.


Clear weather to all.
