Astrophotography, Pixel by Pixel: Part 6 - Dirty Buckets and Calibration Frames

Welcome, one and all, to another installment of this landmark blog series! Much of the discussion here once again builds upon the concepts covered in the previous entries, which you can find here:

Astrophotography, Pixel by Pixel: Part 1 – Well Depth, Pixel Size, and Quantum Efficiency

Astrophotography, Pixel by Pixel: Part 2 – Focal Ratio Effects

Astrophotography, Pixel by Pixel: Part 3 – Gain/ISO and Offset

Astrophotography, Pixel by Pixel: Part 4 – ADUs and You

Astrophotography, Pixel by Pixel: Part 5 – One Shot Color, Monochrome Sensors, and Filters 

And so let’s dive on back down to the tiny world of individual pixels and our analogy of buckets catching rain. Without further adu, it’s time to talk about dirty buckets. 

What Actually Fills our Buckets? 

Throughout this entire series, we have assumed that our buckets and the incoming light are absolutely pristine: everything our buckets gather has been only the light from the object we care about. Sadly, this is not the case in the real world. There are several ways that our buckets get filled with things other than rain water or photons from our target. I am, of course, talking about the notorious villain: noise. (There is a difference between unwanted signal and true noise, but for the purposes of this article I am loosely lumping the two together. What we are focusing on here is distinguishing between the photons, and resultant electrons, that come from the target we are imaging and everything else that makes it into our pixel buckets.)

In our extended (to near the breaking point) analogy, noise partially fills our buckets with things besides the drops of water we care about. There are two primary sources of noise, plus a third source that behaves slightly differently and can drown out our hard-won data.

Unfortunately, our pixel buckets are very dumb (but we still love them anyway): they do not differentiate between sources of electrons/water that fill them. All that our pixels care about is filling up with electrons, draining them at the end of each exposure, and sending out the value to be recorded in our frames.

 

Figure 1: The dumb but lovable Pixel

 

We know better. By understanding the different sources of photons or electrons that fill our buckets but are not from our target, we can take steps to reduce the effect of the noise, leaving the contents of our pixels mostly from our target.

 

Figure 2: What's actually inside our pixels

 

The comparison of the quantity of electrons that come from our intended target to the quantity of electrons that come from noise is called the Signal to Noise Ratio (SNR). A significant amount of the game of astrophotography is maximizing the SNR: a higher SNR means cleaner images that reveal more of the faint signal from the target we are imaging.
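To make this concrete, here is a toy sketch in Python with completely made-up electron counts; it just shows what the ratio means at the level of a single pixel.

```python
target_electrons = 900   # electrons that came from our target (the signal)
noise_electrons = 100    # everything else: dark current, read noise, sky glow

# Treating "noise" loosely as the total unwanted contribution, as this
# article does, the ratio is simply:
snr = target_electrons / noise_electrons
print(f"SNR ~ {snr:.0f}")  # SNR ~ 9
```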

So what follows is a discussion of the different sources of noise in our images and the steps we can take to mitigate their effects on our final images, thus improving the SNR.

Dark Current or Thermal Noise and Dark Frames 

Dark current and thermal noise are, for our purposes, the same thing (specifically, dark current is the process by which this source of unwanted electrons is generated, and dark noise is the random variation in that accumulation, which grows roughly as the square root of the accumulated dark signal). This noise is generated whenever our sensor is on and taking an image. There are two variables that affect this source of noise: temperature and frame duration. The amount of noise (recall this is a filling of the bucket that is not due to light incoming from our target) increases the hotter our sensor is, and it also increases the longer our exposure lasts.

In our bucket analogy, this is like the buckets slowly gathering water on their own while they are out trying to capture and measure the rain we care about. It is almost like dew gathering: more is collected over time, and the amount depends on the temperature. (But do not confuse this analogy with actual dew forming on our gear; that is something different, and something we want to prevent as much as possible.)

As the sensor heats up, that heat liberates electrons in the silicon: the more heat there is, the more electrons are generated, so a hotter chip creates more electrons than a cooler chip. This is one of the reasons why a cooled camera is so useful for astrophotography. Even on relatively warm nights, a cooled camera with temperature regulation lets us push the sensor down to about -15 °C or -20 °C, which dramatically reduces the amount of dark current. This ultimately helps with the SNR, which gives us cleaner looking images.

The creation of these electrons happens at a rate, such as 0.05 electrons per second per pixel. This is why the duration also matters. Let's keep that same rate of dark current electron creation and do some quick math for different exposure durations. If we shoot for 60 seconds, then on average each pixel would have 3 electrons counted in the final output that were not from our target. If we shot for 5 minutes, or 300 seconds, this increases to a total of 15 electrons.

Now imagine we had a warmer sensor and the rate of electron creation was instead 0.13 electrons per second. With the same 60 and 300 second exposures, the electrons created would be 7.8 and 39 respectively. A huge difference! In addition to cooling down our camera sensor, there is another technique that can help reduce the effects of dark current on our final images: dark frames.
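For anyone who wants to check that arithmetic, here is a small Python sketch reproducing the numbers above (the rates are the example values from the text, not properties of any particular camera):

```python
import math

def dark_electrons(rate_e_per_s, exposure_s):
    """Average dark-current electrons accumulated in one pixel."""
    return rate_e_per_s * exposure_s

for rate in (0.05, 0.13):   # cooled vs. warmer sensor rates from the text
    for t in (60, 300):     # 60 s and 300 s exposures
        e = dark_electrons(rate, t)
        # The random scatter (dark noise) is roughly sqrt(e),
        # since dark current follows Poisson statistics.
        print(f"rate={rate} e/s, t={t} s -> {e:g} e (~{math.sqrt(e):.1f} e scatter)")
```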

These are taken by completely covering the objective of the telescope or putting the lens cap on the camera lens. In fact, you can even remove the camera from your lens or telescope entirely and cap it; we don't want anything besides the dark current to be collected by our sensor.

Then take some frames of the exact same duration, same ISO/gain setting, and (as much as possible) same temperature as your light frames. The goal here is to match exactly the quantity of electrons created by the dark current and record that as a separate image. Since the quantity of electrons generated depends in part on the temperature of the sensor, a temperature-regulated camera helps ensure that we match the dark current as closely as possible.

Figure 3: Contents of a Dark Frame

 

Then we can take the dark frame and subtract it from our light frames. This improves our SNR, and so improves our final result.
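In software, that subtraction looks something like the following minimal NumPy sketch. The names `dark_frames` and `light_frames` are hypothetical stand-ins for your already-loaded float image arrays; real tools such as Deep Sky Stacker or PixInsight handle the file reading and this math for you.

```python
import numpy as np

# Hypothetical inputs: 'dark_frames' and 'light_frames' are lists of 2-D
# float arrays, with all darks matching the lights' duration, gain/ISO,
# and temperature.
master_dark = np.median(np.stack(dark_frames), axis=0)

# Subtract the master dark from each light. Clipping at zero avoids negative
# pixel values where the dark happened to read slightly higher than the light.
calibrated = [np.clip(light - master_dark, 0, None) for light in light_frames]
```

Median-combining several darks into a single master dark is a common choice because the median suppresses random outliers, such as cosmic ray hits, that would survive a simple average.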

 

Figure 4: Result of Dark Frame Subtraction

 

Notice here that the read noise present in our dark frame was subtracted out of our light frame right along with the dark current. What's the deal with that? Well, the exact description of that source of noise, and how we can help manage it, brings us right into:

Read Noise and Bias Frames 

Another source of noise, or electrons that make it into our images that are not part of the target we care about, is called Read Noise. This is generated by the process of measuring and draining the pixels, and it is introduced in every single frame taken, no matter the type, be it a light frame, dark frame, or flat frame (to be discussed in depth in a future installment).

This source is not affected by exposure duration or sky conditions, though it is affected by sensor temperature. It is a roughly fixed amount of noise introduced every time an image is read out, which is why it appears in our light frames, our dark frames, and our flat frames alike.

Luckily, there is a way to help mitigate this source of noise, similar to reducing dark current noise: bias frames. To record the read noise by itself, cover the objective of the telescope or put the lens cap on your camera in exactly the same way as for the dark frames. (Again, you can even remove your camera from your lens or telescope entirely and cap the camera.) Then, at the same gain/ISO setting and temperature, take a series of images at the fastest possible exposure time. This will create frames that record only the read noise.

 

Figure 5: Contents of a Bias Frame

 

There is no (or minimal) dark current since the exposures are so short. The only thing being recorded is the electron count that is present simply because the camera took a picture at all. Once you have accurately recorded the bias frames, they are used in pre-processing to remove the effect of the read noise from your flat frames; the read noise in your light frames is removed via dark frame subtraction. This further improves your SNR, resulting in cleaner final images.
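Here is a hedged sketch of how that pre-processing step might look, again with hypothetical `bias_frames` and `flat_frames` lists of float arrays standing in for your loaded images:

```python
import numpy as np

# Hypothetical inputs: 'bias_frames' and 'flat_frames' are lists of 2-D float
# arrays, with the biases taken at the same gain/ISO as everything else.
master_bias = np.median(np.stack(bias_frames), axis=0)

# Subtracting the master bias from each flat leaves (mostly) just the flat's
# illumination pattern, which is the part of the flat we actually want.
calibrated_flats = [np.clip(flat - master_bias, 0, None) for flat in flat_frames]
```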

These are not directly subtracted from the light frames in the same way the dark frames are applied. Can you see why, based on the discussion above?

Recall that the read noise is present in every type of frame taken, including the dark frames themselves. Therefore, when we take our dark frames and subtract them from our light frames, we are actually subtracting the dark current and the read noise at the same time. It's a two-fer!
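A little made-up bookkeeping shows why the two-fer works:

```python
# Made-up per-pixel electron counts, ignoring random scatter:
target, sky_glow, dark_current, bias_level = 900, 200, 15, 50

light_frame = target + sky_glow + dark_current + bias_level
dark_frame = dark_current + bias_level

# Dark subtraction removes the dark current AND the bias/read signal at once:
print(light_frame - dark_frame)  # 1100 = target + sky_glow
```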

Sky Glow and Light Pollution 

Recall that our pixel buckets make no differentiation between the types of incoming light they capture; it is all the same to them. A large source of photons reaching our pixels that is not from our intended target is ambient sky glow. This generally has two sources: light pollution from electric lights, and the moon.

While you may be unlucky in the amount of light pollution in your region, we are lucky that this effect can be demonstrated in our persistent analogy. Recall that we are imagining our sensor as an array of buckets out on the lawn, each recording how much rain fell directly in the area that particular bucket covers.

Light pollution from electric lights would be like a very inconsiderate neighbor leaving their sprinklers on while you have your buckets out on the lawn. The water from the sprinklers makes it to your buckets as well, and the quantity captured from one source or another is completely indistinguishable. The water that you care about gets mixed in with the water from the sprinkler. And there is no worse neighbor with leaky sprinklers than the full moon. 

There are two ways to mitigate this source of unwanted water. The first is to image from a dark-sky location when the moon is not out; the best way to improve your SNR in this regard is to get away from the source of the pollution! If all settings, including exposure duration, are exactly the same, a dark-sky site will result in less sky glow reaching our pixels.

Figure 6: Different Amounts of Light Pollution
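To put rough numbers on the dark-site advantage, here is a sketch using a simplified form of the standard CCD signal-to-noise equation; every rate below is invented purely for illustration:

```python
import math

def snr(target_rate, sky_rate, dark_rate, read_noise, t):
    """Simplified per-pixel SNR: target signal over the combined shot noise
    of everything collected, plus read noise (rates in electrons/second)."""
    signal = target_rate * t
    noise = math.sqrt(signal + sky_rate * t + dark_rate * t + read_noise ** 2)
    return signal / noise

t = 300  # a 5-minute exposure
print(f"dark site: SNR ~ {snr(0.5, 0.2, 0.05, 2.0, t):.1f}")   # ~9.9
print(f"city site: SNR ~ {snr(0.5, 5.0, 0.05, 2.0, t):.1f}")   # ~3.7
```

With only the sky-glow rate changed, the same exposure at the brighter site loses well over half its SNR, which is exactly why getting away from the lights pays off so well.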

 

The other option is to use a filter that can distinguish between the different sources of water or light. This would be like setting a special screen over the top of all your buckets that only lets real rain water through, keeping out the water from your neighbors. A more robust discussion of the operation and effects of filters can be found in Part 5 of this series, which for extra convenience I've linked to again here (links to all the other parts of this series are at the top of the article).

Putting it all Together 

So let’s go back to our very first image of a pixel and see how all of the concepts above play out. We have a pixel that is filled to a specific quantity of electrons. That is all the pixel knows. It has no idea if the quantity comes from the target we care about, dark current, read noise, or sky glow from the moon or light pollution. 

Since you are smarter than the pixel (I'm not too sure I can make that claim about myself…yet), we can very closely mimic the quantity of electrons created by these sources that are not the target we care about. We can take some dark frames and subtract their values from each image. This by itself removes both the dark current and the bias signal. We are left with far cleaner data, better SNR, and therefore a much improved final image.

These calibration frames are key to creating the best-looking images, and applying all of these steps is done very easily, and nearly automatically, by calibration and stacking software such as Deep Sky Stacker, Nebulosity, or PixInsight.

There is one more important kind of calibration frame: Flats. Our next installment in this series will dive into the effects of these frames, how they interact with dark and bias frames, and some techniques on how to take the flat frames. Stay tuned!

 

Takeaways 

  • We want to maximize Signal to Noise Ratio (SNR)
  • Signal comes from our target, Noise comes from many different sources
  • Dark and Bias frames help improve our SNR
  • Dark skies or filters (light pollution specific or narrowband) are your best friend

You can read the next, and final, installment of the blog series here.
