Chris Peterson wrote: geckzilla wrote:
moontrail wrote: Thank you, geckzilla, for the explanation. If I understood it right, there is an unintended sensor saturation effect that causes the difference from what would be expected.
Maybe not unintended... but yeah, once every star is white, the only thing left to provide a sense of relative brightness is the size of the dots. If the dots don't spread much, then they all look about the same in brightness.
Modern DSLR cameras have about 12 bits of dynamic range, meaning that they can capture about 9 stellar magnitudes between their noise level and their saturation level. That's muddied a little by the way color is captured, but it's a reasonable approximation.
Also, the dynamic range of DSLRs varies considerably with make and model and ISO setting. According to the internet, the Nikon D810 currently has one of the highest dynamic ranges on the market: almost 15 bits (or ~11 stellar magnitudes). But that is at ISO 50. Most deep sky DSLR astrophotography is done with ISOs of 800 or more, where the D810 gives a DR no better than about 11 bits (or ~8 stellar magnitudes). Also note that the dynamic range is not to be confused with the bit depth of the image. If I can trust the Exif data, this APOD was made with a Canon EOS 5D Mark III, at ISO 1000, which gives a DR of about 11 bits (pretty close to its maximum rating of almost 12 bits at ISO 100). Source: http://www.dxomark.com/Cameras/
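For anyone who wants to check the bits-to-magnitudes figures quoted above: a dynamic range of N bits is a brightness ratio of 2^N, and 5 stellar magnitudes correspond to a factor of 100, so the conversion is just 2.5·log10(2^N). A quick Python sketch (the specific bit values are the ones mentioned in this thread):

```python
import math

def bits_to_magnitudes(bits):
    """Convert a dynamic range in bits (brightness ratio of 2**bits)
    to stellar magnitudes (5 magnitudes = a factor of 100)."""
    return 2.5 * math.log10(2 ** bits)

# Bit depths discussed above: typical DSLR, D810 at ISO 50,
# D810/5D Mk III at high ISO, and a heavily pushed ISO.
for bits in (12, 15, 11, 6):
    print(f"{bits} bits ~ {bits_to_magnitudes(bits):.1f} magnitudes")
```

This reproduces the approximations in the posts: 12 bits comes out near 9 magnitudes, 15 bits near 11, and 11 bits near 8.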
(I have a Nikon D5100 with only a fixed tripod or an alt-az tracking mount, so I cannot make an individual exposure longer than about 30 seconds. I tend to use very high ISOs to compensate and just accept a lot of noise. If I really push the ISO, my DR can be as low as 6 bits. In those cases, I don't really care if I saturate the brighter stars.)