Lethal levels of background radiation?

Ask questions, find resources, browse the virtual shelves.
greenew
Asternaut
Posts: 2
Joined: Wed Nov 12, 2008 8:37 pm

Lethal levels of background radiation?

Post by greenew » Tue Feb 16, 2010 8:40 pm

In discussions concerning life on other planets/solar systems, the assertion has been made that life would not develop in systems located closer to the core of our galaxy (or any galaxy, for that matter). This is due to the high levels of background radiation caused by higher star densities and radiation flux. My question is, just how high is too high? What level of background radiation would make life untenable? Don't we live protected from direct radiation from our sun by our atmosphere and planetary magnetic field? Couldn't the same processes protect planets closer to the galactic core?

On a related note, there are frequent mentions of the temperature of space, particularly in areas of star formation and planetary nebulas. What does this mean? Is it the temperature of the 'dust' particles? If an object is placed in such a region, does its temperature become the 'temperature of space' of that region?

Chris Peterson
Abominable Snowman
Posts: 18109
Joined: Wed Jan 31, 2007 11:13 pm
Location: Guffey, Colorado, USA

Re: Lethal levels of background radiation?

Post by Chris Peterson » Wed Feb 17, 2010 5:39 am

greenew wrote:In discussions concerning life on other planets/solar systems, the assertion has been made that life would not develop in systems located closer to the core of our galaxy (or any galaxy, for that matter). This is due to the high levels of background radiation caused by higher star densities and radiation flux. My question is, just how high is too high? What level of background radiation would make life untenable? Don't we live protected from direct radiation from our sun by our atmosphere and planetary magnetic field? Couldn't the same processes protect planets closer to the galactic core?
Nobody knows what would be too high, because nobody really knows what kinds of life might be possible. I think the concern with radiation in dense parts of the galaxy has less to do with constant radiation than with the effects of occasional bursts from supernovas. Such events may have affected the development of life on Earth; a planet in a dense region would experience extremely high radiation levels much more often. Also, in dense regions planetary systems might not remain stable long enough for life to form, or at least not for anything beyond primitive life.
On a related note, there are frequent mentions of the temperature of space, particularly in areas of star formation and planetary nebulas. What does this mean? Is it the temperature of the 'dust' particles? If an object is placed in such a region, does its temperature become the 'temperature of space' of that region?
It is usually the temperature of the gas in that region. In the case of nebulas, it can range from tens of thousands to millions of degrees. Above the Earth, where the ISS orbits, the gas temperature can reach a couple of thousand degrees. This is not the temperature a condensed object will reach in that area. In a nebula or above the Earth, an object is likely to be very cold. Its temperature is set by the balance between the rate at which it absorbs heat from its environment (and the transfer efficiency from gases at near-vacuum pressures is very low) and the rate at which it can radiate heat back to space (typically much higher), so it equilibrates far below the gas temperature.
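To put rough numbers on that balance, here is a back-of-the-envelope sketch in Python. The gas density, temperature, and composition are assumed order-of-magnitude values for ~400 km altitude, sunlight is deliberately ignored, and the object is treated as a perfect blackbody; it is an illustration of the argument above, not a spacecraft thermal model.

```python
# Why an object at ISS altitude stays cold even though the surrounding gas
# is over a thousand degrees. All inputs are assumed, order-of-magnitude
# values; sunlight and Earthshine are ignored.
import math

k_B   = 1.380649e-23      # Boltzmann constant, J/K
sigma = 5.670374e-8       # Stefan-Boltzmann constant, W m^-2 K^-4

T_gas = 1500.0            # assumed thermospheric gas temperature, K
n_gas = 4e13              # assumed number density at ~400 km, m^-3
m_gas = 16 * 1.66054e-27  # atomic oxygen mass, kg

# Kinetic-theory flux of gas particles onto a surface is n*<v>/4, and each
# particle delivers roughly 2*k_B*T of energy on average.
v_mean    = math.sqrt(8 * k_B * T_gas / (math.pi * m_gas))
q_heating = (n_gas * v_mean / 4) * (2 * k_B * T_gas)   # W per m^2

# Equilibrium temperature if a shaded blackbody must radiate that heat away.
T_eq = (q_heating / sigma) ** 0.25

print(f"heating from gas : {q_heating:.1e} W/m^2")  # ~6e-4 W/m^2
print(f"equilibrium temp : {T_eq:.0f} K")           # ~10 K
```

Even at a gas temperature of 1500 K, the heat actually delivered to a surface is well under a milliwatt per square metre, so a shaded object would settle near 10 K; in practice it is sunlight and Earthshine, not the gas, that set the temperature of real hardware in orbit.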
Chris

*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com

RJN
Baffled Boffin
Posts: 1667
Joined: Sat Jul 24, 2004 1:58 pm
Location: Michigan Tech

Re: Lethal levels of background radiation?

Post by RJN » Wed Feb 17, 2010 2:14 pm

Interesting questions. It appears that different organisms can endure different levels of radiation. For example, Deinococcus radiodurans (D. rad) bacteria can survive in much higher radiation environments than humans. An APOD on this is here: http://antwrp.gsfc.nasa.gov/apod/ap090830.html .

One might speculate that a certain amount of radiation could be a cheap way for a species to obtain the mutations necessary for evolution to proceed at an optimal pace. Too little radiation means too few mutations, and the species doesn't evolve as fast as competing species and dies off. Conversely, too much radiation leads to cell damage and the species again dies off. All species have (I think) self-check mechanisms that regulate the number of mutations and let them tolerate higher levels of radiation, with D. rad leading humans in this department, but extensive checking might take a lot of energy.
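Purely to illustrate that trade-off, here is a toy calculation in Python with made-up coefficients: it assumes the supply of useful mutations grows roughly linearly with dose while unrepaired damage grows faster, which produces a hump-shaped net benefit with a "sweet spot" in between. It is not a biological model, just a picture of the argument above.

```python
# Toy trade-off: benefit from mutation supply grows ~linearly with dose,
# damage grows faster (quadratically here), so the net is hump-shaped.
# The coefficients are arbitrary, made-up numbers.
import numpy as np

dose    = np.linspace(0.0, 10.0, 1001)   # arbitrary dose units
benefit = 1.0 * dose                     # useful mutation supply
damage  = 0.15 * dose**2                 # unrepaired damage
net     = benefit - damage

best = dose[np.argmax(net)]
print(f"toy optimum dose ~ {best:.2f}  (analytically 1.0/(2*0.15) ~ 3.33)")
```

Where the peak sits, and how sharp it is, depends entirely on the made-up coefficients; the only point is that "some but not too much" can come out ahead of both extremes.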

neufer
Vacationer at Tralfamadore
Posts: 18805
Joined: Mon Jan 21, 2008 1:57 pm
Location: Alexandria, Virginia

Re: Lethal levels of background radiation?

Post by neufer » Wed Feb 17, 2010 3:08 pm

http://en.wikipedia.org/wiki/Near-Earth_supernova wrote:
<<A near-Earth supernova is an explosion resulting from the death of a star that occurs close enough to the Earth (roughly less than 100 light-years away) to have noticeable effects on its biosphere. On average, a supernova explosion occurs within 10 parsecs of the Earth every 240 million years. Gamma rays are responsible for most of the adverse effects a supernova can have on a living terrestrial planet. In Earth's case, gamma rays induce a chemical reaction in the upper atmosphere, converting molecular nitrogen into nitrogen oxides, depleting the ozone layer enough to expose the surface to harmful solar and cosmic radiation. Phytoplankton and reef communities would be particularly affected, which could badly deplete the base of the marine food chain.

Speculation as to the effects of a nearby supernova on Earth often focuses on large stars as Type II supernova candidates. Several prominent stars within a few hundred light years from the Sun are candidates for becoming supernovae in as little as a millennium. One example is Betelgeuse, a red supergiant 427 light-years from Earth. Though spectacular, these "predictable" supernovae are thought to have little potential to affect Earth.

Recent estimates predict that a Type II supernova would have to be closer than eight parsecs (26 light-years) to destroy half of the Earth's ozone layer. Such estimates rely mostly on atmospheric modeling and consider only the known radiation flux from SN 1987A, a Type II supernova in the Large Magellanic Cloud. Estimates of the rate of supernova occurrence within 10 parsecs of the Earth vary from 0.05-0.5 per Ga to 10 per Ga. Several authors have based their estimates on the idea that supernovae are concentrated in the spiral arms of the galaxy, and that supernova explosions near the Sun usually occur during the ~10 million years that the Sun takes to pass through one of these regions (we are now in or entering the Orion arm). The relatively recent paper by Gehrels et al. uses a value of 3 supernovae less than 10 parsecs away per Ga. The frequency within a distance D is proportional to D³ for small values of D, but for larger values is proportional to D² because of the finite thickness of the galactic disk. Examples of relatively near supernovae are the Vela Supernova Remnant (~800 ly, ~12,000 years ago) and Geminga (~550 ly, ~300,000 years ago).

Type Ia supernovae are thought to be potentially the most dangerous if they occur close enough to the Earth. Because Type Ia supernovae arise from dim, common white dwarf stars, it is likely that a supernova that could affect the Earth will occur unpredictably and take place in a star system that is not well studied. One theory suggests that a Type Ia supernova would have to be closer than a thousand parsecs (3300 light-years) to affect the Earth. The closest known candidate is IK Pegasi. It is currently estimated, however, that by the time it could become a threat, its velocity in relation to the Solar System would have carried IK Pegasi to a safe distance.

In 1996, astronomers at the University of Illinois at Urbana-Champaign theorized that traces of past supernovae might be detectable on Earth in the form of metal isotope signatures in rock strata. Subsequently, iron-60 enrichment has been reported in deep-sea rock of the Pacific Ocean by researchers from the Technical University of Munich. This iron isotope (only 23 atoms) was found in the top 2 cm of crust and dates from the last 13 million years or so. It is estimated that the supernova must have occurred in the last 5 million years, or else it would have had to happen very close to the solar system to account for so much iron-60 still being here. A supernova occurring as close as would have been needed would probably have caused a mass extinction, which didn't happen in that timeframe. The quantity of iron seems to indicate that the supernova was less than 30 parsecs away. On the other hand, the authors estimate the frequency of supernovae at a distance less than D (for reasonably small D) as around (D/10 pc)³ per Ga, which gives a probability of only around 5% for a supernova within 30 pc in the last 5 million years. They point out that the probability may be higher because we are entering the Orion arm of the Milky Way.

Adrian L. Melott et al. estimated that gamma-ray bursts from "dangerously close" supernova explosions occur two or more times per thousand million years, and this has been proposed as the cause of the end-Ordovician extinction, which resulted in the death of nearly 60% of the oceanic life on Earth.

In 1998 a supernova remnant, RX J0852.0-4622, was found apparently in front of the larger Vela Supernova Remnant. Gamma rays from the decay of titanium-44 (half-life about 90 years) were independently discovered coming from it, showing that it must have exploded fairly recently (perhaps around 1200 AD), but there is no historical record of it. The flux of gamma rays and x-rays indicates that the supernova was relatively close to us (perhaps 200 parsecs or 660 ly). If so, this is a surprising event, because supernovae less than 200 parsecs away are estimated to occur less than once per 100,000 years.

In 2009, researchers found nitrates in ice cores from Antarctica at depths corresponding to the known supernovae of 1006 and 1054 AD, as well as from around 1060 AD. The nitrates were apparently formed from nitrogen oxides created by gamma rays from the supernovae. This technique should be able to detect supernovae going back several thousand years.>>
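To make the D³-versus-D² scaling in the quoted article concrete, here is a small sketch in Python. The local rate uses the Gehrels et al. figure quoted above (3 supernovae within 10 pc per Ga); the 100 pc effective disk half-thickness is an assumed round number, and the results scale linearly with the assumed local rate (the article's ~5% figure for 30 pc corresponds to a lower rate normalization).

```python
# Expected near-Earth supernova counts: the rate within distance D grows as
# D^3 while the sphere fits inside the galactic disk, then only as D^2.
# RATE_10PC follows the Gehrels et al. value quoted above; HALF_THICKNESS
# is an assumed round number for the effective disk half-thickness.
import math

RATE_10PC = 3.0          # supernovae per Ga (10^9 yr) within 10 pc
HALF_THICKNESS = 100.0   # assumed effective disk half-thickness, pc

def rate_within(d_pc):
    """Expected supernovae per Ga within d_pc of the Sun."""
    if d_pc <= HALF_THICKNESS:
        return RATE_10PC * (d_pc / 10.0) ** 3            # full sphere: D^3
    rate_at_h = RATE_10PC * (HALF_THICKNESS / 10.0) ** 3
    return rate_at_h * (d_pc / HALF_THICKNESS) ** 2      # slab regime: D^2

def prob_at_least_one(d_pc, window_ga):
    """Poisson probability of at least one supernova in the time window."""
    return 1.0 - math.exp(-rate_within(d_pc) * window_ga)

for d in (10, 30, 100, 300):
    print(f"D = {d:3d} pc: {rate_within(d):9.1f} per Ga, "
          f"P(>=1 in 5 Myr) = {prob_at_least_one(d, 0.005):.1%}")
```

With these assumptions a supernova within 30 pc is expected very roughly once every 10-15 million years; swapping in a local rate at the low end of the quoted 0.05-10 per Ga range stretches that to hundreds of millions of years, which is why the published estimates disagree so widely.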
Art Neuendorffer
