Supercooled isotopes increase decay rate?

Conundrum, Thu Aug 03 2006, 09:50PM

Link2

This is interesting, and if true it could have major implications.
-A
Re: Supercooled isotopes increase decay rate?
..., Thu Aug 03 2006, 11:17PM

If this does work out to be true, I guess it would be proof that you don't need to know why something works to get rich off it. :P
Re: Supercooled isotopes increase decay rate?
Coyote Wilde, Fri Aug 04 2006, 02:13PM

It doesn't say Rolfs has no idea; just that his idea doesn't conform to standard theory.

This goes along with a thought I had a while ago: if you cooled a radioactive substance so far as to get it near the Bose-Einstein condensate range, wouldn't it tend to go all at once? This could be similar, maybe?
Re: Supercooled isotopes increase decay rate?
Bored Chemist, Fri Aug 04 2006, 11:55PM

An atom does not know how hot it is.
I think this is utter rubbish.
Re: Supercooled isotopes increase decay rate?
Quantum Singularity, Sat Aug 05 2006, 05:40PM

It may be rubbish, but cool it to 0 K and I am sure the decay rate would change... if only that were possible. Even if the rate changed by a factor of 1000 at a few kelvin, you would still have to keep the waste at that temperature for hundreds or thousands of years, which would be a very expensive venture. And if the decay rate really were increased that much, the waste would be emitting roughly 1000 times the normal amount of radiation the whole time. So I see problems with the increased radiation and the cost of cooling, along with the difficulty of keeping the cooling systems running that long.
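
Rough numbers, with a purely hypothetical isotope and enhancement factor, just to show the scale:

```python
import math

# Purely hypothetical numbers, just to show the scale of the problem:
# if cooling really sped decay up by a factor k, how long would you
# have to keep the waste cold, and how "hot" is it while you do?
half_life_years = 24_000.0   # ballpark Pu-239 half-life
k = 1000.0                   # hypothetical rate enhancement

lam = math.log(2) / half_life_years   # normal decay constant, 1/year
lam_cold = k * lam                    # enhanced decay constant while cooled

# Time for the activity to fall to 1% of its start: t = ln(100) / lambda.
t_normal = math.log(100) / lam
t_cold = math.log(100) / lam_cold

print(f"uncooled: {t_normal:,.0f} years to reach 1% activity")
print(f"cooled:   {t_cold:,.0f} years to reach 1% activity")
print(f"...but while cooled it emits {k:,.0f}x the normal radiation")
```

So on paper the waste burns itself out 1000 times faster, but you have to refrigerate something emitting 1000 times the radiation (and heat) for a century or two.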
Re: Supercooled isotopes increase decay rate?
robert, Sat Aug 05 2006, 07:59PM

Bored Chemist wrote ...

An atom does not know how hot it is.
I think this is utter rubbish.

Sounds a lot like BS, but the mechanism described would make a little sense IMHO.
Needs more investigation to be sure.

Also, wouldn't it be a little impractical to supercool large amounts of radioactive waste?
It could also be that a given amount of waste contaminates a much larger amount of the apparatus material during that operation too.
Re: Supercooled isotopes increase decay rate?
Carbon_Rod, Sat Aug 05 2006, 09:18PM

BC has a point: the technology sounds pretty far-fetched and should be treated with scepticism. The detector's area of exposure would have to be taken into account. Perhaps contaminated materials (when molten) appear to change the measured decay rate of the source material per cubic centimetre, while the contamination itself keeps decaying at the known standard rates.

They should talk to a metallurgist and see if they can reclaim higher levels of the depleted materials they claim to produce so quickly.
Re: Supercooled isotopes increase decay rate?
Bored Chemist, Sun Aug 06 2006, 08:24AM

There are a few problems with this idea. First and foremost: it doesn't happen when other people try it.

"However, critics say his idea doesn't hold up, that it contradicts existing theory as well as other experimental results. Nick Stone, a retired nuclear physicist from Oxford University, told Physics Web that experiments with cooled, metal embedded alpha emitters had already been run by other physicists, and that no reduction in half life had been observed."

Secondly, the variability of electron capture rates with electron density at the nucleus is well known. It was well enough documented that it popped up in my university final exams back in '87. It's a fairly good way of measuring electron density at the nucleus (and, therefore, the s orbital contribution). But (and here's the killer) it has next to nothing to do with alpha decay. (Can anyone find me an isotope where both EC and alpha decay occur, with a half life worth bothering with?)
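
To put that in symbols (the standard textbook relation, quoted from memory, so treat it as a sketch): the EC decay constant tracks the electron density at the nucleus, so a fractional change in that density gives the same fractional change in the rate:

```latex
\[
\lambda_{\mathrm{EC}} \;\propto\; |\psi_e(0)|^2
\qquad\Longrightarrow\qquad
\frac{\Delta\lambda_{\mathrm{EC}}}{\lambda_{\mathrm{EC}}}
\;\approx\;
\frac{\Delta|\psi_e(0)|^2}{|\psi_e(0)|^2}
\]
```

The classic example is Be-7, where changing the chemical environment shifts the EC half life by less than about a percent; nowhere near a factor of a thousand, and it tells you nothing about alpha decay.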

The third problem is that, for metals, most of the electrons are in or near their (electronic) ground states at room temperature anyway; cooling them won't make much difference to the s orbital contribution.

Even if it did work (and it doesn't, as the other experiments have shown) it wouldn't help much in practice. Radioactivity generates heat, so it is difficult to maintain a radioactive sample near absolute zero.
If you reduce the half life (presumably by magic) you increase the rate of heat production, so it becomes even more difficult to keep the sample cool.
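
To put a rough number on that (a hypothetical one-gram sample with Cs-137-ish numbers and a guessed energy deposited per decay, so a sketch, not a calculation to trust):

```python
import math

# Rough scale of the self-heating problem (hypothetical 1 g sample,
# Cs-137-ish half life, and a guessed 0.6 MeV deposited per decay):
AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13

half_life_s = 30.1 * 3.156e7          # ~30-year half-life, in seconds
N = (1.0 / 137.0) * AVOGADRO          # nuclei in 1 g of A ~ 137 material
E_per_decay_MeV = 0.6                 # assumed energy deposited per decay

lam = math.log(2) / half_life_s       # normal decay constant, 1/s

for k in (1.0, 1000.0):               # normal vs hypothetical 1000x speed-up
    power_W = k * lam * N * E_per_decay_MeV * MEV_TO_J
    print(f"k = {k:6.0f}: decay heat ~ {power_W:8.2f} W per gram")
```

Even the uncooled sample puts out a few tenths of a watt per gram, while the cooling power of the coldest refrigerators is measured in microwatts; at 1000x the decay rate you'd be trying to hold hundreds of watts per gram near absolute zero.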