As a beginner, this question seemed like a no-brainer.
Why on earth would I want a monochrome camera that produces black and white images? Give me the colour thanks! Nobody wants to see black and white photos of space! And why do scientists with access to million dollar telescopes and cameras prefer black and white anyway? Those fools. The colour camera is cheaper.
Then of course, you inevitably try to make your photos better and better and realise that colour cameras are degrading your image, for a number of reasons. Professional deep space and planetary astrophotographers do enjoy colour photos, but to create them they use high-end monochrome cameras and filters, building their images one channel at a time. It’s at least three times as much work and imaging time, but the results are dramatically sharper and cleaner.
Creating colour from monochrome
Let’s go back a step. How do you create a colour image from a monochrome camera?
Capture three images of the same target, one each through a red, green, and blue filter. These three channels can then be aligned and combined into an RGB colour image.
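If you like, the channel-combine step can be sketched in a few lines of Python – the arrays here are tiny synthetic stand-ins for real aligned frames:

```python
import numpy as np

# Hypothetical example: three aligned mono frames, one per filter,
# represented as small constant arrays instead of real captures.
red = np.full((4, 4), 0.8)
green = np.full((4, 4), 0.5)
blue = np.full((4, 4), 0.2)

# Stack the aligned channels along a third axis to form an RGB image.
rgb = np.dstack([red, green, blue])
print(rgb.shape)  # (4, 4, 3)
```

In practice the hard part is the alignment, not the stacking – the channels must be registered to sub-pixel accuracy first.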
But it’s three times the work – why would you want to do that?
The answer lies in the engineering required to create an RGB (colour) camera chip. Ultimately, all the little light wells on a chip are monochrome. They can’t themselves differentiate between red, green, or blue photons; they just add the photon to the well. Colour chips instead place a tiny colour filter over each pixel well, giving you an array of single-colour pixels. Herein lies the next problem. Three colours don’t divide evenly into a repeating 2×2 tile of pixels, so the array must be biased towards one colour – usually green, since the human eye is most sensitive to it. This is called a Bayer filter or colour filter array (CFA).
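The bias is easy to see if you lay out the repeating RGGB tile yourself – a quick Python sketch:

```python
import numpy as np

# The 2x2 RGGB tile, the repeating unit of the Bayer colour filter array.
tile = np.array([["R", "G"],
                 ["G", "B"]])

# Tile it across a (hypothetical) 4x4 sensor and count each colour.
cfa = np.tile(tile, (2, 2))
counts = {c: int((cfa == c).sum()) for c in "RGB"}
print(counts)  # {'R': 4, 'G': 8, 'B': 4} - half the pixels are green
```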
See the problem?
There are far more green pixels than red or blue. This inherent bias is corrected during demosaicing, the process that converts the raw array into a colour digital image. Typically all this happens in-camera unless you shoot raw and demosaic yourself, but either way, the process averages the results from a neighbourhood of pixels. This effectively downsamples your image below the actual resolution of the chip and creates artifacts such as false colours, colour bleed, and jagged edges. With a monochrome camera, you get a 1:1 unblemished result based on the number of photons that hit each pixel well, with none of this immediate jiggery pokery degrading your image from the outset.
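To make the averaging concrete, here’s a toy sketch of one bilinear demosaicing step – the raw values are invented for illustration:

```python
import numpy as np

# A red pixel never measures green directly: the green value there must
# be interpolated from neighbours, so it is an average, not a reading.
# Hypothetical 3x3 patch of raw values with a red pixel at the centre;
# in a Bayer layout, its four orthogonal neighbours are green.
raw = np.array([[10, 20, 10],
                [20, 99, 20],
                [10, 20, 10]], dtype=float)

# Bilinear estimate of green at the red pixel: mean of the four
# orthogonal green neighbours.
green_at_red = raw[[0, 1, 1, 2], [1, 0, 2, 1]].mean()
print(green_at_red)  # 20.0
```

Every interpolated value like this is a guess averaged from surrounding pixels – which is exactly where the false colours and colour bleed come from.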
But wait. It gets worse.
Pixel wells on a camera chip are microscopic. They are incredible feats of engineering and miniaturisation that allow functional pixels of such small size to sit side-by-side, but doing so requires a small gap between each pixel. That gap is dead area where photons literally slip between the cracks and are never recorded at all. A colour camera compounds this loss: on top of the gaps, the colour filters themselves absorb some of the light even within their own passband, so there is a further loss of performance and sensitivity. The fraction of arriving photons a sensor actually converts into signal is called its quantum efficiency (QE), and a colour chip’s QE is always lower than that of its mono equivalent.
Compared here are the quantum efficiency (QE) graphs for the ZWO ASI174 colour vs the equivalent mono version. You can see that the green channel, with more pixels, peaks slightly higher than red or blue, but none of the colour channels can match the high QE of the mono sensor. By virtue of the reduced QE alone, there is a loss of signal. Put simply, it takes a longer exposure with a colour camera to achieve the same pixel saturation you get with a mono camera.
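As a rough illustration of what reduced QE costs you in exposure time – the QE figures below are invented for the arithmetic, not read off the real ASI174 curves:

```python
# Illustrative QE values only (assumptions, not measured figures).
qe_mono = 0.78
qe_colour_green = 0.60

exposure_mono = 120.0  # seconds, hypothetical

# To collect the same number of photoelectrons, exposure time scales
# inversely with quantum efficiency.
exposure_colour = exposure_mono * qe_mono / qe_colour_green
print(round(exposure_colour))  # 156 seconds for the same signal
```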
But wait, there’s more.
Loss of signal
When a photon arrives at a mono chip, one of two things happens: it either hits a pixel well and is recorded, or it hits a gap (or is otherwise lost). With a colour chip, there is a third case: it can hit a filter in the colour filter array and be discarded because it doesn’t match that filter’s colour. In an RGGB Bayer array, roughly 50% of green photons and 75% of red and blue photons land on a filter of the wrong colour and are blocked entirely. This loss over the surface area of the chip means both detail and signal are lost overall.
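Those fractions fall straight out of the 2×2 RGGB tile – a quick sanity check:

```python
# Fraction of photons of each colour blocked by the CFA, derived from
# the 2x2 RGGB tile: 2 green pixels, 1 red, 1 blue.
tile_counts = {"R": 1, "G": 2, "B": 1}
tile_size = sum(tile_counts.values())  # 4 pixels per tile

# A photon only survives if it lands on a filter of its own colour.
blocked = {c: 1 - n / tile_size for c, n in tile_counts.items()}
print(blocked)  # {'R': 0.75, 'G': 0.5, 'B': 0.75}
```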
I know what you’re thinking. Why oh why did I buy this piece-of-crap colour DSLR / high speed CMOS or expensive one-shot-colour CCD?
We’ve all been there. Depending on how far you’ve come with your photography, there’s usually a point where you simply must take the monochrome/filter-wheel upgrade path to get the sharp, precise, high quality images you see from Hubble and the professionals. You can bet they aren’t using colour cameras. No-sir-ree, Bob.
The case for colour
But there is a case for them. Ok maybe 2 cases.
The first, most obvious one is simplicity – a colour camera saves you three to four times the capture and processing work. A good nebula photo requires at least 30-40 subframes to reduce the noise with stacking. Using a mono camera means you’ll need 90-120 subframes, or even more if you shoot your luminance in Ha as well as R, G, and B channels. I shoot with a one-shot-colour CCD, and those who follow my socials know I pump out a lot of images quickly for this reason. I even use it for narrowband work, which is a bit of an astro faux-pas, but I get away with it. On Instagram.
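The bookkeeping is simple enough to sketch, using the subframe counts above:

```python
# Rough comparison of total subframes needed, using the figures from
# the text: ~30 subframes per channel to stack out the noise.
subs_per_channel = 30

osc_total = subs_per_channel        # one-shot-colour: a single pass
lrgb_total = subs_per_channel * 4   # mono: L (or Ha) + R + G + B

print(osc_total, lrgb_total)  # 30 vs 120 subframes
```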
The other case is a practical one. Mono is best for planetary and most other targets, but sometimes the object you are imaging is changing quickly, like a shadow transit of Jupiter’s moons, or the planet’s surface itself, which completes a full rotation in just under 10 hours. Post-processing derotation may help in this case, but will still “smear” a nearby moon as the colour channels go out of alignment between filter captures. Another example is a brilliant green comet racing across a star field. Good luck trying to image these in mono and align the channels later. It’s a trade-off between getting enough frames and getting them quickly enough that the target hasn’t moved significantly between channels. In these cases, colour cameras are a convenient way to capture the event quickly and still produce a beautiful colour image.
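For the Jupiter case, a hypothetical back-of-the-envelope shows how quickly the channels drift apart – the per-filter capture time here is an assumption, not a recommendation:

```python
# How far Jupiter rotates between the start of one filter run and the
# next. The 10-minute capture time per channel is an assumed figure.
rotation_period_min = 9.9 * 60          # Jupiter's ~9.9-hour day
deg_per_min = 360 / rotation_period_min # ~0.6 degrees per minute

minutes_per_filter = 10                 # assumed capture time per channel
drift = deg_per_min * minutes_per_filter
print(round(drift, 1))  # degrees of rotation between channels
```

Several degrees of rotation between the R, G, and B captures is more than enough to misalign surface features – hence the smearing that derotation can only partly undo.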
The answer: Get both. Maybe.
The moral to the story, if there is one, is that you need both! At least, that’s what I keep telling myself. I’m not sure my bank balance agrees with me.
If you want to specialise and achieve the best possible solar, planetary, or nebula photos, then monochrome is the way to go. The loss in detail, clarity, and signal that comes with colour imaging cannot be recovered with post processing alone. For casual imagers, or anyone wanting to view images “live,” then colour is still an attractive option.
Now if anyone would like to donate to my observatory grade, liquid nitrogen cooled, scientific large format mono CCD fund, please send me a message via Google+, Instagram or email@example.com! I take any currency including PayPal or bitcoin.