The Milky Way Over Ontario (APOD 29 Jul 2008)

Comments and questions about the APOD on the main view screen.
Post Reply
User avatar
emc
Equine Locutionist
Posts: 1307
Joined: Tue Jul 17, 2007 12:15 pm
AKA: Bear
Location: Ed’s World
Contact:

The Milky Way Over Ontario (APOD 29 Jul 2008)

Post by emc » Tue Jul 29, 2008 5:19 pm

This beautiful capture would be an awesome setting for a bountiful backyard birthday barbeque… pass the meatballs please!
Ed
Casting Art to the Net
Sometimes the best path is a new one.

User avatar
Pete
Science Officer
Posts: 145
Joined: Sun Jan 01, 2006 8:46 pm
AKA: Long John LeBone
Location: Toronto, ON

Post by Pete » Tue Jul 29, 2008 7:11 pm

Amazing photo.

Ontario's a big place; on her website, the photographer specifies the location as Binbrook (southern ON, near Hamilton).

I need to get away from city skies...

User avatar
orin stepanek
Plutopian
Posts: 8200
Joined: Wed Jul 27, 2005 3:41 pm
Location: Nebraska

Post by orin stepanek » Tue Jul 29, 2008 7:24 pm

I wish I could go outside at night and see the Milky Way as it shows up in these photos. Light pollution makes all but the brightest stars impossible to see. All the more reason to tune into APOD every day. :lol:

Orin

Smile today; tomorrow's another day!

User avatar
BMAONE23
Commentator Model 1.23
Posts: 4076
Joined: Wed Feb 23, 2005 6:55 pm
Location: California

Post by BMAONE23 » Tue Jul 29, 2008 7:33 pm

Anyone able to try this?
Telescope with tracking mount, eyepiece connected to a CCD imager (w/remote), hardwired to a PC monitor or laptop. Sit back and watch the show on the big screen. :D

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Post by Chris Peterson » Wed Jul 30, 2008 4:09 am

BMAONE23 wrote:Anyone able to try this?
Telescope with tracking mount, eyepiece connected to a CCD imager (w/remote), hardwired to a PC monitor or laptop. Sit back and watch the show on the big screen. :D
Sure, lots of people do it. You can use an ordinary astronomical CCD camera, or an integrating video camera. Either way, it only takes a few seconds integration time to bring out much more than the eye can see, with or without optics. That's nearly real time. Of course, if you want to collect light even longer, the detail becomes incredible.
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Which camera?

Post by henk21cm » Thu Jul 31, 2008 8:32 pm

Chris Peterson wrote:You can use an ordinary astronomical CCD camera, or an integrating video camera. Either way, it only takes a few seconds integration time to bring out much more than the eye can see, with or without optics.
Could you give me some more details please? Which brand of astronomical CCD camera? I tried a 15-second exposure (the maximum) with a 6-megapixel Canon PowerShot. Not particularly lots of stars, just as much as the eye can see. Not yet connected: a Philips PCVC 750K webcam with a removable lens. In your opinion (tick appropriate):
- Junk?
- Nice for a church during holidays?
- Good stuff?
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Fri Aug 01, 2008 12:07 am

henk21cm wrote:Could you give me some more details please? Which brand of astronomical CCD camera? I tried a 15-second exposure (the maximum) with a 6-megapixel Canon PowerShot. Not particularly lots of stars, just as much as the eye can see. Not yet connected: a Philips PCVC 750K webcam with a removable lens. In your opinion (tick appropriate):
- Junk?
- Nice for a church during holidays?
- Good stuff?
With a 15-second exposure using the PowerShot, you should see quite a few more stars than you can with your eye. Digicams can have problems focusing on the sky, however, and it only takes a tiny focus error to significantly reduce sensitivity.

The webcam is potentially very capable, depending on the software you use. For deep sky objects, you want to collect and stack many frames, and you want each frame to be exposed as long as the webcam will allow.

Integrating video cameras, such as the StellaCam or MallinCam, can expose internally for several seconds, while still maintaining a standard video output suitable for connecting to a video monitor. These provide a very simple, and reasonably cost effective way to do near real-time video imaging.

Conventional astronomical CCD cameras, like those made by SBIG, can do something similar, by continuously shooting and stacking exposures that are several seconds long (or longer). These cameras are significantly more sensitive than any of the video options, and require a computer, but provide by far the best results.
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Sat Aug 02, 2008 9:04 am

G'day Chris,
you wrote:The webcam is potentially very capable, depending on the software you use. For deep sky objects, you want to collect and stack many frames, and you want each frame to be exposed as long as the webcam will allow.
Philips packs some additional software with the cam, but it is not very versatile. I'm still looking for an API for that camera, since then I can write software of my own. The same goes for the Canon camera.

You triggered another question: stacking of frames. As far as I understand, stacking frames reduces noise while the features of the subject are enhanced: noise grows with the square root of the number of frames, whereas the subject grows linearly with the number of frames. Popular software like RegiStax uses this method successfully. From what you write I get the idea (possibly my error) that the sensitivity increases as well. Is that my misconception, or is it true?
Chris wrote:Conventional astronomical CCD cameras, like those made by SBIG, can do something similar, by continuously shooting and stacking exposures that are several seconds long (or longer). These cameras are significantly more sensitive than any of the video options, and require a computer, but provide by far the best results.
Is their threshold (the level below which light is not noticed) lower? If you need a computer, an Ethernet connection would be fine, since working in the morning dew is not a task suitable for any PC.

Do you prefer to read the raw (not yet de-Bayered) images, or do you just read the JPEGs? Is there a trick, instead of the Bayer algorithm, to enhance the image by stacking the green, red, and blue values into one grayscale 'intensity' pixel?
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

User avatar
Case
Commander
Posts: 616
Joined: Sat Jun 09, 2007 10:08 pm
Location: (52°N, 06°E)

Re: Which camera?

Post by Case » Sat Aug 02, 2008 1:51 pm

henk21cm wrote:stacking of frames.
One could stack frames in several ways. AFAIK, one could average the frames (mainly noise reduction, e.g. in planetary photography), one could add up the energy values (to get a 'deeper' image of combined exposure time, for faint objects such as nebulae), or a combination of both.
Something like this:
[image: frame-stacking visualization]

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Sat Aug 02, 2008 7:34 pm

henk21cm wrote:You triggered another question: stacking of frames. As far as I understand, stacking frames reduces noise while the features of the subject are enhanced: noise grows with the square root of the number of frames, whereas the subject grows linearly with the number of frames. Popular software like RegiStax uses this method successfully. From what you write I get the idea (possibly my error) that the sensitivity increases as well. Is that my misconception, or is it true?
Usually, the most meaningful way to define sensitivity is in terms of signal-to-noise. A low noise camera might collect half as much signal as a high noise camera (that is, the low noise camera is less sensitive to light), but it will still show more detail. While cameras are often rated in terms of quantum efficiency (QE), or the percentage of incident photons they will record, S/N is far more important. With video cameras, nearly all have about the same QE, but some are much more sensitive, simply because they control noise better.

When you stack, you boost the S/N, and therefore you boost the sensitivity. In the stacked image, you will be able to detect fainter features.
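For illustration, here is a minimal NumPy sketch of that stacking effect. The star brightness and noise level are invented numbers, chosen only to make the √N improvement visible:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented numbers: a faint signal of 5 on read noise of sigma = 10
# per frame. Stacking N frames improves S/N by roughly sqrt(N).
signal, sigma, n_frames = 5.0, 10.0, 100
frames = signal + rng.normal(0.0, sigma, size=(n_frames, 64, 64))

single = frames[0]
stacked = frames.mean(axis=0)

snr_single = single.mean() / single.std()     # ~0.5
snr_stacked = stacked.mean() / stacked.std()  # ~5, i.e. ~sqrt(100) better
print(f"{snr_single:.2f} -> {snr_stacked:.2f}")
```

With 100 frames the stacked image's noise drops by a factor of ten, so features ten times fainter rise above the noise floor.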
Is their threshold (level below which light is not noticed) lower?
Generally, electronic image sensors have no lower limit. They simply record a certain percentage of incident photons- typically 30-50% for consumer type devices. The longer you collect, the more signal you have. The minimum signal that you can measure is determined by the noise. It is common to consider a S/N of 3 to mark the threshold of detection.
Do you prefer to read the raw (not yet Bayered) images or do you just read the JPG's? Is there a trick -in stead of the Bayer algorithm- to enhance the image by stacking the greens, the red and the blue to one grayscalish 'intensity' pixel?
I prefer to avoid color in general. The only color camera I use is a webcam, and I save its individual frames already de-Bayered, but without compression (or losslessly compressed). In some cases I do convert to grayscale by weighting and summing the individual color pixels. This provides a boost in S/N at the expense of the color information. There are some advantages to working with raw frames from the camera, but this is often not possible, or may require specially modified hardware or software.
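As a sketch of that weighted-sum conversion (the weights below are assumptions for illustration, not anyone's recommendation): equal weights simply pool the photons from all three channels, while luminance-style weights mimic the eye's response.

```python
import numpy as np

# Assumed weights, purely for illustration: (1, 1, 1) pools the photons
# from all channels; (0.299, 0.587, 0.114) is a luminance-style choice.
def to_intensity(rgb, weights=(1.0, 1.0, 1.0)):
    w = np.asarray(weights, dtype=np.float64)
    return (rgb.astype(np.float64) * w).sum(axis=-1)

rgb = np.random.default_rng(1).integers(0, 256, size=(4, 4, 3))
gray = to_intensity(rgb)
print(gray.shape)  # (4, 4)
```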
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Sat Aug 02, 2008 7:38 pm

Case wrote:AFAIK, one could average the frames (mainly noise reduction in e.g. planetary photography), one could add up the energy values (to get a 'deeper' image of combined exposure time, for faint objects e.g. nebula) or a combination of both.
In practice, there is seldom a difference between the two, unless you restrict the depth of the workspace- that is, you convert to something like 16-bit integers after the operation. If you don't limit your data depth, the S/N is exactly the same after averaging or summing. The actual values of individual pixels will be different, but since you normally stretch the data for display, either will look exactly the same.
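This is easy to verify numerically: in floating point the average is just the sum divided by a constant, and a constant rescaling multiplies mean and standard deviation alike, so the S/N is unchanged (a small NumPy sketch with made-up numbers):

```python
import numpy as np

rng = np.random.default_rng(2)
frames = 100.0 + rng.normal(0.0, 15.0, size=(50, 32, 32))

summed = frames.sum(axis=0)
averaged = frames.mean(axis=0)   # = summed / 50, a pure rescaling

# Rescaling scales mean and standard deviation by the same factor,
# so S/N is identical as long as we stay in floating point.
snr_sum = summed.mean() / summed.std()
snr_avg = averaged.mean() / averaged.std()
print(np.isclose(snr_sum, snr_avg))  # True
```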
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Sat Aug 02, 2008 8:50 pm

Case wrote:One could stack frames in one of several ways.
Something like this:
[image: frame-stacking visualization]
Nice visualization, Case. I'll focus on the last row of your image. White shades characteristically have a value of 192 or higher. Let's assume 192. When you add up 10 images, the whites add up to 1920 or higher. The image itself can only cope with 0-255. Instead of unsigned 8-bit integers, you will have to switch to unsigned 16-bit integers. Then, when saving the image, you will have to renormalize it to the 0-255 range. As a result you get a similar image to the one in the first row (adding and then dividing by n, here 10).

Apart from stretching the image (shift the lowest pixel value to 0, calculate the difference between the highest and lowest pixel values, and multiply by a factor until the highest shifted pixel value is again 255), there is no difference between the two methods.
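A sketch of the two steps just described: accumulate in 16-bit to avoid overflow, then linearly stretch back to 0-255 (the pixel values are the ones from the example):

```python
import numpy as np

# Ten 8-bit frames with whites at 192 sum to 1920, overflowing uint8,
# so accumulate in uint16.
frames = np.full((10, 4, 4), 192, dtype=np.uint8)
total = frames.sum(axis=0, dtype=np.uint16)
print(total.max())  # 1920

# Linear stretch: shift the minimum to 0, scale the maximum to 255.
def stretch(img):
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:  # flat image, nothing to stretch
        return np.zeros(img.shape, dtype=np.uint8)
    return ((img - lo) / (hi - lo) * 255.0).astype(np.uint8)

ramp = np.arange(0, 1920, 120, dtype=np.uint16)
print(stretch(ramp).min(), stretch(ramp).max())  # 0 255
```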
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Sat Aug 02, 2008 9:21 pm

Chris Peterson wrote:Usually, the most meaningful way to define sensitivity is in terms of signal-to-noise. A low noise camera might collect half as much signal as a high noise camera (that is, the low noise camera is less sensitive to light), but it will still show more detail.
Earlier this season, when the sky was clear (2007-11-17), I did a 15-second exposure of the Pleiades. The image is rather noisy. So I agree, noise is definitely an important factor.
Chris Peterson wrote:When you stack, you boost the S/N, and therefore you boost the sensitivity. In the stacked image, you will be able to detect fainter features.
With that in mind, next time I see the Pleiades I'll take a lot of images and stack them.
Chris Peterson wrote:Generally, electronic image sensors have no lower limit. They simply record a certain percentage of incident photons- typically 30-50% for consumer type devices. The longer you collect, the more signal you have. The minimum signal that you can measure is determined by the noise. It is common to consider a S/N of 3 to mark the threshold of detection.
A CCD is a charge-coupled device. Incident photons rip off electrons, and as a result one plate of the capacitor gets a tiny electric charge. Since the capacitor is etched on a semiconductor surface, it is not an ideal capacitor: it leaks. If the exponential decay time of the RC network is short compared to the exposure time, the charge dissipates. A large charge dissipates as fast as a small one, but since an exponential decay never reaches zero, what remains of a large charge might still be just sufficient to detect. That was my 'picture' so far. From what you write, I get the idea that thermal excitation is far more important than leakage current and leakage time.
And about throwing away the color you wrote:This provides a boost in S/N at the expense of the color information. There are some advantages to working with raw frames from the camera, but this is often not possible, or may require specially modified hardware or software.
There is an article on the CodeProject website on a wrapper class for the Canon API. I'll dig into that article and see whether I can get raw images working. The webcam is next.
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Sat Aug 02, 2008 10:24 pm

henk21cm wrote:A CCD is a charge-coupled device. Incident photons rip off electrons, and as a result one plate of the capacitor gets a tiny electric charge. Since the capacitor is etched on a semiconductor surface, it is not an ideal capacitor: it leaks...
Some CMOS detectors store charge on capacitors. CCDs do not; they store charge in potential wells established by precise voltages on carefully placed electrode structures. Charge dissipation in CCDs is extremely low; I've worked with CCD cameras utilizing single exposures of several months, with no loss of stored charge. The problem is that thermal electrons can be created, and these are indistinguishable from signal electrons. Thermal electron generation is reduced by cooling the sensor. For extremely long exposures, such as I mentioned above, cryogenic cooling to LN2 temperatures is common; for shorter exposures of a few hours or less, thermoelectric cooling is typically used, reaching around -20°C. This cooling may reduce thermal noise to near zero, leaving only readout noise (minimized by using fewer individual exposures) and the inevitable statistical noise on the signal itself.
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

User avatar
Case
Commander
Posts: 616
Joined: Sat Jun 09, 2007 10:08 pm
Location: (52°N, 06°E)

Re: Which camera?

Post by Case » Sun Aug 03, 2008 2:12 am

Chris Peterson wrote:In practice, there is seldom a difference between the two [...] either will look exactly the same.
henk21cm wrote:[...] there is no difference between the two methods.
Thanks for the clarification, guys. That makes perfect sense.

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Sun Aug 03, 2008 6:28 pm

Chris Peterson wrote:Some CMOS detectors store charge on capacitors. CCDs do not, they store charge in potential wells established by precise voltages on carefully placed electrode structures.
It was the late seventies or early eighties when one of my educators told me about CCDs. In those days it was 'capacitors'; that is why I brought in the notion of capacitors. The Wikipedia article on CCDs must date from that time as well:

<<Quote: Not all image sensors use CCD technology; for example, CMOS chips are also commercially available. :etouQ>>

which confirms your statement. You can update the rest of the article in the wiki:

<<Quote: An image is projected by a lens on the capacitor array (the photoactive region), causing each capacitor to accumulate an electric charge proportional to the light intensity at that location. :etouQ>>

where the 30 year old capacitors pop up again.
you fortunately wrote:Charge dissipation in CCDs is extremely low; I've worked with CCD cameras utilizing single exposures of several months, with no loss of stored charge. The problem is that thermal electrons can be created, and these are indistinguishable from signal electrons.
OK, losing charge is no problem within 15 s if you did not lose any after several months. Thermal noise, as usual, is the culprit. When doing long exposures on ordinary photographic film, the Schwarzschild effect degraded the film's effective long-exposure sensitivity. Is something similar the case with CCDs? I have never read anything about that for CCDs. Did you?

Next wacky idea came up:
  1. Take 36 images of 15 seconds.
  2. Take 108 images of 5 seconds.
Both have the same cumulative exposure time. When the frames are added, the two resulting images must be the same. The S/N ratio of the latter must even be slightly better, by √3. From what I understand from your reply, there should not be a difference between the two techniques regarding the faintest star visible. Is that correct and congruent with your experience?
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Sun Aug 03, 2008 9:42 pm

henk21cm wrote:It was the late seventies or early eighties when one of my educators told me about CCDs. In those days it was 'capacitors'; that is why I brought in the notion of capacitors.
It's true that the pixel structure contains something like a capacitor for storing the initial charge. In most cases, though, this is still quite different in operation from a conventional capacitor. For instance, many CCDs allow you to apply a bias to the pixel that actually prevents photoelectrons from being created; basically, an electronic shutter. I built a camera in the 1970s, using a first-generation CCD. The devices have improved quite a lot since then!
OK, losing charge is no problem within 15 s if you did not lose any after several months. Thermal noise, as usual, is the culprit. When doing long exposures on ordinary photographic film, the Schwarzschild effect degraded the film's effective long-exposure sensitivity. Is something similar the case with CCDs? I have never read anything about that for CCDs. Did you?
There is no analog in CCDs to reciprocity failure in film.
Next wacky idea came up:
  1. Take 36 images of 15 seconds.
  2. Take 108 images of 5 seconds.
Both have the same cumulative exposure time. When the frames are added, the resulting two images must be the same. The S/N ratio of the latter must be even slightly better, √3. From what i understand from your reply, there should not be a difference between both techniques, regarding the faintest star visible. Is that correct and congruent with your experiences?
As a rule, the example with fewer images will have better S/N. They both have the same signal, and they both have the same dark current signal (and therefore dark current noise). But each time you read the camera, you inject readout noise, and those injections add in quadrature. If you have R noise in a single exposure, you'll have 6R in 36 exposures and about 10.4R in 108 exposures. That's why you always want to try for fewer, longer exposures. It's also why no video camera approaches a long-exposure camera for imaging dim astronomical objects: for short exposures, readout noise simply swamps the signal.
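The arithmetic behind those figures, for reference: N reads of noise R add in quadrature to √N·R.

```python
import math

# Readout noise R is injected once per read; N reads add in quadrature,
# giving a combined readout noise of sqrt(N) * R.
R = 1.0
noise_36 = math.sqrt(36) * R    # 6R for 36 exposures
noise_108 = math.sqrt(108) * R  # about 10.4R for 108 exposures
print(noise_36, round(noise_108, 2))
```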
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Sun Aug 03, 2008 10:25 pm

Chris Peterson wrote: As a rule, the example with fewer images will have better S/N. But each time you read the camera, you inject readout noise.
Ah, the readout noise (thanks for teaching me a new concept) breaks the symmetry. Although the readout noise limits the applicability of my camera, it leaves logic intact. The outcome of my wacky idea would have been anti-logical: "To do difficult things, like capturing dim objects, requires the most elaborate way to get there."

So there are four contributions:

S: the wanted signal
D: the dark current noise
R: the readout noise
T: the thermal noise.

About D you said that it is small, maybe negligible compared to T. The ratio between R and T is unknown; I suspect R/T is smaller than 1. Since R and T are independent contributions, you can add them quadratically. I suppose the T contribution (thermal noise energy) is proportional to the exposure time, just like the wanted signal. Since R is independent of time, the fewer the frames, the smaller the R contribution. That leaves me no other choice than the maximum exposure time: 15 seconds.
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: Which camera?

Post by Chris Peterson » Mon Aug 04, 2008 12:34 am

henk21cm wrote: So there are four contributions:

S: the wanted signal
D: the dark current noise
R: the readout noise
T: the thermal noise.
D and T are the same thing. The dark current shows as a steady increase in charge, related to temperature. This can be subtracted from the image, which is the main purpose of a dark frame. What can't be subtracted is the noise component of the dark current, nominally the square root of the dark current signal.

There is also noise associated with the signal itself. The statistical noise is just the square root of the signal, which is why you want as many photons as possible. When you make astronomical images, there is also signal from the sky background. This also has a noise component that can't be removed.

The noise sources aren't all characterized by Poisson or Gaussian behavior, but to a reasonable approximation you can simply add them quadratically to evaluate their total contribution.
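A small worked example of that quadrature sum (all the electron counts below are invented, just to show the bookkeeping):

```python
import math

# Hypothetical electron counts for one exposure (invented numbers):
signal = 10000.0   # photoelectrons from the object
sky = 2500.0       # sky background electrons
dark = 400.0       # dark current electrons
read = 10.0        # readout noise, electrons RMS

# Poisson noise on each signal-like term is its square root;
# independent sources add in quadrature.
total_noise = math.sqrt(signal + sky + dark + read**2)
snr = signal / total_noise
print(f"S/N = {snr:.1f}")  # ~87.7
```

Note that the sky and dark *signals* can be subtracted, but their square-root noise terms stay in the denominator, as described above.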
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

henk21cm
Science Officer
Posts: 225
Joined: Mon Feb 04, 2008 9:47 pm
Location: The Netherlands

Re: Which camera?

Post by henk21cm » Tue Aug 05, 2008 9:20 pm

Chris Peterson wrote:The dark current shows as a steady increase in charge, related to temperature. This can be subtracted from the image, which is the main purpose of a dark frame.
Chris, back in town, I found your reply. I had read the words "dark frame" before and hadn't a clue what they meant. From your reply I now understand: there is an increase in pixel brightness, which I suppose is proportional to time (a steady increase). So if you take a 5-minute image, the next thing you do is put the cap on the lens, take a 5-minute image of complete darkness without any subject, and subtract this dark image from your first 5-minute image. That removes the 'background light' due to dark current. This dark current is caused by thermal excitations across the band gap (ΔE/k) in semiconductors.
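A toy NumPy sketch of that calibration (the dark pattern and noise levels are invented): the fixed dark-current pattern subtracts out, leaving only the noise of both frames.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented numbers: a fixed dark-current pattern plus random noise.
dark_pattern = rng.uniform(50, 150, size=(8, 8))
light_image = 1000.0 + dark_pattern + rng.normal(0, 5, size=(8, 8))
dark_frame = dark_pattern + rng.normal(0, 5, size=(8, 8))

# The pattern subtracts out; only the noise of both frames remains.
calibrated = light_image - dark_frame
print(round(calibrated.mean()))  # ~1000
```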
What can't be subtracted is the noise component of the dark current, nominally the square root of the dark current signal.
So there must be a systematic, evenly distributed part and a random part.
When you make astronomical images, there is also signal from the sky background. This also has a noise component that can't be removed.
Rayleigh scattering, as with sunlight: red is scattered less than blue, which is why the sky is blue.
Regards,
 Henk
21 cm: the universal wavelength of hydrogen

Animation
Ensign
Posts: 17
Joined: Fri May 09, 2008 7:01 pm

what about natural viewing?

Post by Animation » Thu Aug 07, 2008 8:43 pm

All,

Are there places you can go to see deep details without light-gathering? How far out from the city do you have to go? What locales or countries (that are safe) are good for star-gazing, in case I'm ever on vacation somewhere cool?

I'm talking naked-eye stuff.

Lewis

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: what about natural viewing?

Post by Chris Peterson » Thu Aug 07, 2008 9:02 pm

Animation wrote:Are there places you can go to see deep details without light-gathering? How far out from the city do you have to go? What locales or countries (that are safe) are good for star-gazing, in case I'm ever on vacation somewhere cool?
My basic answer would be no. That's not to say you can't go to a good dark-sky site and be overwhelmed by the Milky Way. But you won't see color, and even with a telescope you won't approach the detail that even a cheap camera can manage with just a few seconds exposure. While bad skies can hide everything, the fundamental limitation is the sensitivity of the eye, not the sky.

If you want dark skies, you need to be at least 20 or 30 miles from any cities. Also, a surprising amount of light pollution is local. Just a few neighbors with bad outside lighting can ruin your sky as much as a giant city 10 miles away. A small town just a few miles away can be worse than a city 20 miles away.

There are many places with dark skies. In the U.S., every state has good sites. Vast areas west of the Mississippi are dark. And of course, many undeveloped countries have good dark sites. There isn't a big difference in darkness between the best sites in the world, and the best sites within 100 miles of wherever you live.
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

starnut
Science Officer
Posts: 114
Joined: Mon Apr 10, 2006 4:55 am

Re: what about natural viewing?

Post by starnut » Fri Aug 08, 2008 1:10 am

Chris Peterson wrote:
If you want dark skies, you need to be at least 20 or 30 miles from any cities. Also, a surprising amount of light pollution is local. Just a few neighbors with bad outside lighting can ruin your sky as much as a giant city 10 miles away. A small town just a few miles away can be worse than a city 20 miles away.
Going to a higher elevation, such as a mountaintop, far from the light pollution also helps. I did that a few times and noticed that I could see more stars than I could in the lowland.

Gary
Fight ignorance!

User avatar
Chris Peterson
Abominable Snowman
Posts: 18202
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA
Contact:

Re: what about natural viewing?

Post by Chris Peterson » Fri Aug 08, 2008 1:29 am

starnut wrote:Going to a higher elevation, such as a mountaintop, far from the light pollution also helps. I did that a few times and noticed that I could see more stars than I could in the lowland.
Very true. I live at over 9000 feet elevation, and the skies here are somewhat darker than the models estimate. That's because there is less water vapor and fewer particulates to scatter light from the nearest city (40 miles away). I see a low light dome in that direction, but it doesn't spread to the surrounding sky.
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

Post Reply