We’ve had a long aside in the comments of the Iridium Layer post about whether a Young Earth might be proven if one assumes that all isotopes “age” faster under water at some depth.
Let’s consider some of the repercussions if this were Truth instead of fantasy:
- Nuclear waste would not be a problem. If isotopes with surface half-lives of billions of years (U-238's is 4.5 billion) decayed in a couple of hundred days to the levels we read now, the short-lived dangerous isotopes left over after fission would decay to nothing in a day or two at depth. (A quick scale check of the required speed-up follows this list.)
- Cheap energy: Long-lived isotopes that are barely radioactive (like Lead-205, half-life about 17 million years) could be immersed in water to increase their decay rate until they give off their energy as readily as Cobalt-57 (272 days) does, and be used to run turbines.
- All isotopic dating methods would have to account for the depth and duration of immersion in water, even when looking at thousands of years instead of millions or billions.
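For scale, here is a minimal back-of-the-envelope sketch of the speed-up factor the fantasy would require (Python; the 200 days is my round-number reading of "a couple of hundred days"):

```python
# Back-of-the-envelope: how much faster would decay have to run for a
# 4.5-billion-year half-life to play out in ~200 days?

U238_HALF_LIFE_YEARS = 4.468e9
TARGET_DAYS = 200.0   # assumed round number for "a couple of hundred days"

speedup = U238_HALF_LIFE_YEARS * 365.25 / TARGET_DAYS
print(f"required speed-up: {speedup:.1e}x")   # ~8e9

# At that same factor, a 30-year fission product such as Cs-137 would
# have an effective half-life of a fraction of a second:
effective_seconds = 30 * 365.25 * 86400 / speedup
print(f"Cs-137 effective half-life: {effective_seconds:.2f} s")
```

A speed-up of some eight billion times, with the decay heat of billions of years released in months: the repercussions write themselves.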
But, to step back to reality: if water depth affected aging, why would the relatively level Atlantic ocean floor show isotope dates that run consistently from essentially zero at the Mid-Atlantic Ridge to about 180 million years approaching the continental shelves off Florida and Africa?
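As a sanity check on those numbers, the implied spreading rate is one line of arithmetic (the ridge-to-margin distance below is my assumed round figure for the Atlantic):

```python
# If seafloor ages run from ~0 at the Mid-Atlantic Ridge to ~180 million
# years at the continental margins, the implied half-spreading rate is:

ridge_to_margin_km = 3000.0   # assumed round figure
oldest_age_years = 180e6

rate_cm_per_yr = ridge_to_margin_km * 1e5 / oldest_age_years
print(f"~{rate_cm_per_yr:.1f} cm/yr")   # a couple of centimeters per year
```

A couple of centimeters per year, about the speed fingernails grow, is just what direct measurements of present-day plate motion show. No water-depth term required.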
Also, why would rocks found on some mountaintops date as significantly younger than some at sea level, yet still much older than Noah?
Here's something slightly related and highly fascinating: http://arxivblog.com/?p=596 (Do nuclear decay rates depend on our distance from the sun?)
I suppose certain creationists will take this as support for their view that decay rates could be variable and therefore compatible with a young Earth, even though the variation reported here suggests no such thing. They will simultaneously ignore that the announcement contradicts their other claim: that scientists are far too dogmatically entrenched in their worldview to even consider variable decay rates.
Note that the reported variation in decay rate, as much as 0.1%, plays out across an annual cycle; averaged over a year, decay rates are still steady. Also, they monitored only two isotopes with fairly short half-lives: Si-32 (~100 years) and Ra-226 (~1,600 years). Will the fluctuation be quantitatively different in isotopes with half-lives of entirely different orders, like Uranium-238 at 4,467,000,000 years?
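To see why an annual wobble is harmless to dating, here is a quick sketch, assuming the simplest shape consistent with the report, a sinusoidal ±0.1% modulation of the decay constant:

```python
import numpy as np

# Compare 100-year survival fractions for Ra-226 with a constant decay
# rate versus one carrying a ±0.1% annual sinusoidal wobble.

lam = np.log(2) / 1600.0                 # Ra-226 decay constant, per year
t = np.linspace(0.0, 100.0, 200_001)     # a whole number of annual cycles
lam_wobbly = lam * (1.0 + 0.001 * np.sin(2.0 * np.pi * t))

# Integrate lambda(t) dt with the trapezoid rule.
integral = np.sum(0.5 * (lam_wobbly[1:] + lam_wobbly[:-1]) * np.diff(t))

print(np.exp(-lam * t[-1]))   # constant rate
print(np.exp(-integral))      # wobbly rate: matches to numerical precision
```

Over any whole number of years the modulation integrates away; only a measurement window much shorter than a year could even see it, and then only at the 0.1% level.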
I will watch with great interest as they now explore in detail whether and why the rate changes. Is it driven by solar gravitation (general relativity), neutrino flux (weak interactions), or some other feature?
Whatever turns out to be the cause of this fluctuation, I'll bet that depth of water will not affect it.
Dan writes:
"Nuclear waste would not be a problem. If isotopes with surface half-lives of a billion years (U-238 has a 4.5 billion years half-life) decayed in a couple of hundred days to levels that we read now, the short-lived dangerous isotopes left over after fission would decay to nothing in a day or two at depth."
Karl responds:
I don't think we have yet pulled any of the nuclear waste back up to see whether it has stayed on its supposed half-life schedule or not.
Neutrons are only effective as moderators of nuclear reactions down to depths of a couple of meters. Neutrinos and cosmic rays are possibly factors down to hundreds of meters. Neutrinos can come from any direction, mostly from the sun, but some from the core of the planet.
Dan also writes:
"Cheap energy: Long-life isotopes that are barely radioactive (like Lead-205 at 15 million years) could be immersed in water to increase their decay rate to give off their energy (as does Cobalt-57, 272 days) and used to run turbines. "
That's exactly what they thought at Chernobyl, but the reactions came back to bite them big time. Putting stuff that they have not isolated and studied carefully into a water tank with a known source of neutrons makes virtually anything in the environment radioactive and resets the very nature of the radioactivity we label spontaneous.
Karl could really use an introductory course on nuclear chemistry or even quantum physics. He has no idea what research has been done, nor what the results mean.
Chernobyl has nothing to do with immersing isotopes in deep water. It was a Fermi-style reactor, of the sort that was determined by the late 1940's to be too dangerous and unstable to use commercially in the U.S.
Cosmic rays have no effect at depth; primaries are stopped in a few miles of upper air, or a few inches of something more solid. Their tertiary products can be detected on the surface, and are part of the background radiation.
Neutrinos can indeed come from anywhere. A 42,000,000-foot-thick chunk of iron and rock (the Earth) barely slows them: the overwhelming majority pass straight through the whole planet without interacting at all, which is why neutrino detectors need enormous volumes of material to catch even a handful of events. The chance of any particular nucleus being affected by a neutrino is proportionally minuscule.
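For a sense of scale, here is a rough order-of-magnitude sketch; the cross section is an assumed textbook figure for MeV-scale (solar) neutrinos, so treat every number as approximate:

```python
# Probability that a single MeV-scale neutrino interacts anywhere along
# a full diameter of the Earth (all inputs are rough assumptions).

SIGMA_CM2 = 1e-44              # assumed neutrino-nucleon cross section
EARTH_DIAMETER_CM = 1.27e9     # ~12,700 km
MEAN_DENSITY_G_CM3 = 5.5
NUCLEON_MASS_G = 1.67e-24

nucleons_per_cm3 = MEAN_DENSITY_G_CM3 / NUCLEON_MASS_G
p_interact = nucleons_per_cm3 * SIGMA_CM2 * EARTH_DIAMETER_CM
print(f"interaction probability: ~{p_interact:.0e}")   # ~4e-11
```

Roughly one chance in twenty-five billion per neutrino, per planet. So much for neutrinos driving decay at depth.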
Every measurement of decay rates has taken all these external inputs into account. Look it up.
I don't need a class in fundamental physics. I need to better explain where I see the issue.
It doesn't matter whether water or graphite is used as the moderator; when neutrons slow, they more readily interact with other atoms, making those atoms potentially unstable.
Fermi was the first to notice the effect that neutrons have on changing stable nuclides into unstable ones.
It may take a huge number of neutrinos to amount to the same effect as adding a single neutron. The effect would of course seem spontaneous, because everyone recognizes that a single isolated neutrino poses virtually no danger to a stable nucleus. Even just the energy imparted by a faster neutrino might be able to cause radioactivity all around us, just as it is detected in the huge water-filled neutrino tanks deep underground.
However, over time and with the "flip of a coin," it could appear that any individual atom was finally pushed over the edge and its time to decay had finally arrived.
Dan says: "Every measurement of decay rates has taken all these external inputs into account. Look it up."
Every study I ever look up drives itself to eliminate curved lines until it gets a straight one. That sure looks like a bias to me.
First the background radiation is claimed to be removed from the data.
Then "known environmental influences" are figured into the calculations and viola the data is then reliable.
There is that scientific universal bias word "all of the influences are taken account." Again you can accept the null hypothesis as correct and keep yourself from looking. I don't feel so inclined to trust the judgement of others in this regard.
In the top 1 to 7 meters, many calibrations of water radioactivity measuring devices consistently show a non-linear rise and then a non-linear drop-off in radioactivity. What do we suppose that means?
I haven't seen an explanation that suits me other than that neutrons are probably slowing and getting absorbed by the water, with maximum absorption happening between 4 and 5 meters.
Karl Kunker
Karl: Read "The Making of the Atomic Bomb" by Richard Rhodes to get a primer for laymen on who discovered isotope production, what neutrons do in the process, how they determine half-lives, and other such basic principles. At least read the wiki on neutrons.
Neutrinos are not small neutrons; they aren't even in the same class of particles. The similarity in names comes only from their shared lack of electric charge, one particle being enormously smaller than the other. Free neutrons decay with a mean lifetime of about 15 minutes unless they are absorbed, so they are not common outside of fission reactions or related high-energy decay chains. That's why a special neutron source (the "urchin" initiator) is needed in the core of a plutonium pit to ensure detonation; there just aren't enough free neutrons in a mass of plutonium to trigger the chain reaction at precisely the right instant!
The depth of water or paraffin, or the temperature of graphite, does not significantly affect the role of any of these substances as neutron moderators. Show me any data that says otherwise.
I have no idea what you are referring to. Please specify an example of the "many," what sort of "calibration," and what you mean by "water radioactivity."
There is a bias in mathematical modeling toward fitting a smooth curve to jittery data. If the model predicts a straight line over a certain scale and range, then data that doesn't fit is examined to figure out why it doesn't fit the model. If more data consistently misfits in the same direction, then the model is changed. A few odd points near the ends are generally ignored, within the limits of the characteristics of the experiment and the acknowledged error bars involved.
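A minimal sketch of that workflow, with synthetic data and hypothetical numbers throughout:

```python
import numpy as np

# Fit a straight line to jittery data, then inspect the residuals.
# The model gets revised when misfits are systematic, not when a few
# end points stray within the error bars.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.3, x.size)   # line plus noise

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

print(f"fit: y = {slope:.2f}x + {intercept:.2f}")
print(f"residuals: mean {residuals.mean():+.3f}, spread {residuals.std():.3f}")
```

The residuals are the whole game: a drift or curve in them sends you back to the model, while scatter inside the error bars is just the experiment being an experiment.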
Please tell me: what is this magical null hypothesis that you keep invoking? A theory is considered true until some evidence is produced to refute it. Are you simply using "null hypothesis" as a casual synonym for "theory"?
The null hypothesis is the easy way to avoid connecting the dots of interrelatedness.
One's bias is revealed in pure science by which questions one considers significant. Data can have relatedness written all over it, but for those who will not ask the proper questions, the null hypothesis can actually block proper interpretation of the data.
For example:
If I believed in an earth that was 4.6 billion years old, and
if radioactive half-lives, as they are calculated and used to date the igneous rocks of the geo-historical record, agreed with my interpretation of an old earth, and
if constant half-lives made for a nice, concise way to confirm what I suspected all along concerning the age of the earth, and
if a mathematical model of dubious nature let a spontaneous unknown cause be mathematically quantified, and
if a presumption of constant half-lives were required to keep the system intact,
then I would have no reason to want to ask the questions about radioactivity that do not fit into these nested loops of my thought process.
Karl: It's important to keep an open mind regarding bits of evidence as well as overall theories.
Thoughtful people don't enter the discussion with preconceived notions as to whether the earth is billions of years old. We examine things and follow where the evidence leads us.
It's just like coming home to find your door broken in and your belongings missing. You'd follow where the evidence led, and you wouldn't be persuaded by a neighbor who held a Holy Book high and declared that, based on a scriptural passage, your house had never been burglarized.
It seems that Karl is too young to remember the battle as the age of the Earth rose to its present level. I watched with fascination as new evidence forced the age ever upward, from the mere hundreds of millions of years of the early 1960's up to halfway across the single-digit billions. If one reads some history of discovery, one can see what happened as the null hypothesis of the Ussher timeline was relentlessly and progressively driven up to the age settled on in the 1970's by many diverse and converging lines of evidence. The age of the planet was accepted as many millions by Darwin's time, long before isotope dating.
Decay rates are not measured by how old we want the world to be. They are measured by Geiger and Cherenkov counters, and confirmed by chemical assay of the by-products. They are further confirmed by the reliable behavior of all the products that use such isotopes, like tritium watch dials, smoke detectors, nuclear medicine (thousands of different applications), atomic batteries, and so on.
The atomic batteries (based on decay, not fission) on the Voyager probes (and others) have delivered the amount of power expected (again, a measurement of the decay rate), from down here by the sun out to beyond the orbit of Pluto. If the distance from the sun does cause a measurable difference in decay rates based on the tiny change in the Earth's orbital position, the effect wasn't enough to be noticed in that application.
Before you say it, I agree: they were not looking for it. This just proves that (if such an effect does turn out to be real) it is not significant enough to affect the output of a carefully monitored isotope battery over an orders-of-magnitude change in distance from the sun and decades of observation.
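The expected output, by the way, is pure decay arithmetic. A minimal sketch, with a nominal assumed beginning-of-mission wattage (real thermocouples also degrade, which this ignores):

```python
PU238_HALF_LIFE_YR = 87.7
P0_WATTS = 470.0   # assumed nominal start-of-mission figure, Voyager-class

def rtg_power(years: float) -> float:
    """Expected decay power after `years`, converter losses ignored."""
    return P0_WATTS * 2.0 ** (-years / PU238_HALF_LIFE_YR)

for yr in (0, 10, 20, 30):
    print(f"year {yr:2d}: ~{rtg_power(yr):.0f} W")
```

Nothing in that formula knows or cares how far the probe is from the sun, and the probes have behaved accordingly.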
Atomic radiation is very reliable for measuring time. Hence, the United States Naval Observatory Master Clock:
http://tycho.usno.navy.mil/clocks.html
"Since 1967, the International System of Units has defined the second as the duration of 9,192,631,770 cycles of radiation corresponding to the transition between two energy levels of the ground state of the caesium-133 atom. This definition makes the caesium oscillator (often called an atomic clock) the primary standard for time and frequency measurements."
http://en.wikipedia.org/wiki/Atomic_clock
"Ancient rocks exceeding 3.5 billion years in age are found on all of Earth's continents. The oldest rocks on Earth found so far are the Acasta Gneisses in northwestern Canada near Great Slave Lake (4.03 Ga) and the Isua Supracrustal rocks in West Greenland (3.7 to 3.8 Ga), but well-studied rocks nearly as old are also found in the Minnesota River Valley and northern Michigan (3.5-3.7 billion years), in Swaziland (3.4-3.5 billion years), and in Western Australia (3.4-3.6 billion years).
*These ancient rocks have been dated by a number of radiometric dating methods and the consistency of the results give scientists confidence that the ages are correct to within a few percent.*
An interesting feature of these ancient rocks is that they are not from any sort of "primordial crust" but are lava flows and sediments deposited in shallow water, an indication that Earth history began well before these rocks were deposited. In Western Australia, single zircon crystals found in younger sedimentary rocks have radiometric ages of as much as 4.3 billion years, making these tiny crystals the oldest materials to be found on Earth so far."
http://pubs.usgs.gov/gip/geotime/age.html
Karl, what is this obsession you have with neutron radiation and neutron moderators? Radioactive carbon-14 decays by BETA radiation; i.e., it emits ELECTRONS. Accordingly, its half-life has nothing to do with neutron radiation or neutron moderators…which means your arguments are irrelevant and nonsensical. Get with the program.
Neutrons do turn stable nuclei into unstable ones.
Tell me that they don't and I'll leave the discussion.
Karl: Neutrons can turn a stable element into an unstable isotope, often one that then decays into the next element up. Hit carbon with a neutron and you get a heavier carbon isotope, which may in turn decay into nitrogen plus a beta particle.
Even if free neutrons were common, they would not affect the ratios between the isotopes measured for dating. Carbon-14 would become Carbon-15 (which beta-decays to Nitrogen-15 within seconds) at the same fractional rate, in proportion to its original abundance, as Carbon-12 would become Carbon-13. The C-14/C-12 ratio remains essentially unchanged except through the natural, non-neutron-involved decay of C-14 to N-14. The same goes for Potassium-Argon dating, and so on.
N-15 and C-13 are both stable, and neither would come back to affect the C-12/C-14 ratio even if neutrons were involved.
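The arithmetic behind that ratio argument is short enough to spell out; the 1% capture fraction is arbitrary, and equal fractional rates for both isotopes is the stated assumption:

```python
# A neutron flux that removes the same fraction of each carbon isotope
# cannot move the C-14/C-12 ratio: the common factor cancels.

c12, c14 = 1e12, 1.0    # illustrative atom counts (~1e-12 ratio)
captured = 0.01         # arbitrary fraction captured from each isotope

before = c14 / c12
after = (c14 * (1 - captured)) / (c12 * (1 - captured))

print(f"before: {before:.6e}")
print(f"after:  {after:.6e}")   # same: the (1 - captured) factors cancel
```

Dating ratios survive anything that scales both isotopes alike.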
Karl writes, "Neutrons do turn stable nuclei into unstable ones."
Again, Karl, your obsession with neutrons is irrelevant to the present discussion. Whether neutrons can turn stable nuclei into unstable ones is as irrelevant as whether protons or other particles can turn stable nuclei into unstable ones. We're dealing with beta decay of *unstable* carbon-14, not whether stable nuclei can be made unstable.
Okay!
Dan has just agreed that an absorbed neutron can cause a change to a nucleus.
Could it be possible now that a "non-absorbed" neutron or even a neutrino could cause a change to a nucleus?
The differing points of view come down to when the nuclide is considered to become unstable. "Natural" radioactivity either has a cause or it is purely spontaneous.
You can never prove that anything is totally spontaneous (even evolution).
But if anything natural can be proven to have interrelated factors that show true causal relationships, it is no longer purely spontaneous and should no longer be referred to as spontaneous.
Karl: You so silly! A non-absorbed neutron can change a nucleus as effectively as a non-hit baseball can cause a home-run.
Spontaneity is not the issue. The issue is predictability, measurability, and consistency. It doesn't matter what causes each isotope to decay at the rate it does. The fact is: They do.
You obviously have no clue what a neutron is or does, nor how it relates to neutrinos. Nor what the relationships are between isotopes, nor the essential difference between the chemical and the nuclear spheres of influence in matter.
Ben: Cesium atomic clocks are non-nuclear in nature. They make use of particular electron orbitals, a chemical-level characteristic of quantum physics. Cesium-133 is cesium's only stable isotope, and the radioactive ones play no part in the clock, so they don't confuse the issue.
Water stops neutrons: they are absorbed by the hydrogen in the water. If a neutron is too fast, it bounces off hydrogen nuclei and emerges slower ("moderated"). Once slowed, neutrons simply stick to hydrogen, creating stable deuterium; a second capture makes tritium, which decays (half-life 12.3 years) into stable Helium-3 that bubbles away.
That's why it was safe for me to look down at a nuclear reactor through a few dozen feet of water and see for myself the Cherenkov glow caused (in part) by the slowing of said neutrons.
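For anyone who wants numbers on that, a sketch of simple exponential attenuation; the ~10 cm removal length for fast fission neutrons in water is an assumed round figure:

```python
import math

REMOVAL_LENGTH_CM = 10.0   # assumed removal length, fast neutrons in water

def surviving_fraction(depth_cm: float) -> float:
    """Fraction of fast neutrons not yet slowed or captured at this depth."""
    return math.exp(-depth_cm / REMOVAL_LENGTH_CM)

for meters in (0.5, 1, 2, 5):
    print(f"{meters:>4} m: {surviving_fraction(meters * 100):.1e}")
```

By a couple of meters, essentially nothing is left, which is why a reactor pool makes a safe balcony.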
Dan says:
"Karl: You so silly! A non-absorbed neutron can change a nucleus as effectively as a non-hit baseball can cause a home-run."
Why couldn't a non-absorbed neutron, or even an absorbed neutrino, still possess enough energy/momentum to cause an unaccounted-for interaction of some sort with a nucleus?
Spontaneous decay rests its entire formulation upon the belief that nothing is interacting to cause the radioactive decay to happen.
Again, another circular assumption: if the interaction can't be detected, it can't exist in the popular model of natural radioactivity.
Throw a hardball at the batter and you won't get a home run, but you will sure get one irate batter.
Karl writes, "Spontaneous decay rests its entire formulation upon the belief that nothing is interacting to cause the radioactive decay to happen."
Karl, the "belief" that you refer to is based on the *fact* that no "interacting cause" for spontaneous radioactive decay has ever been observed, including (to my knowledge) in environments that would make an "interacting cause" visible; i.e., in bubble chambers. Therefore, not only is it reasonable to conclude that spontaneous radioactive decay has no "interacting cause," it would be contrary to available facts to suggest that it does. Moreover, there is, to my knowledge, no coherent theory that predicts the existence of such an "interacting cause." Bottom line: neither theory nor empirical evidence supports what you are suggesting.
But it is more elegant to presume non-interactions over unobservable evidence for the model to be invariant.
You have a right to believe what you do as do I.
Can anyone explain what Karl means? As near as I can tell, a missing apple (non-interactions) is compared to an oxymoron orange (unobservable evidence) to proclaim the elegance of the unchanging state of some unstated model, as an exception to something ("But" what?).
Very concise, but semantically null. Or did I miss something?
Is he simply obscurely restating the Razor? That it is more correct to assume nothing happened than to assume that something unobservable happened?
I'm stating, in a normally agreeable way, the unwary methodology of scientists who think they can control and isolate influences upon experiments that they either can't explain or simply choose to ignore.
Statements like these:
"The speed of light is constant (under certain conditions)" is elegant, but an assumption nonetheless.
"Light travels in straight lines unless some unusual force (like a large mass distorting space) bends them" is elegant, but an assumption nonetheless.
"Radioactive half-lives are invariant and are a reliable way to extrapolate time back beyond human observation" is elegant, but an assumption nonetheless.
Science likes to throw caution to the wind whenever an assumption about nature makes the assumptions inherent in idealistic mathematics agree with its own preconceived biases.
Karl: Two out of three of your "assumptions" have been measured in many ways. They are facts. The behaviors of bosons such as photons are very well vetted.
The third, that something always measured to be constant has therefore always been constant, even before it was first measured, rests on Occam's razor: if something has no reason to change, and there is no evidence that it ever changes, and the model developed to explain its every behavior (a model that also works well to explain other things) holds up under this extrapolation, then that is the working model.
If you can come up with any (ANY) explanation that would indicate how the decay rates of unstable nuclei could have been significantly different in the past, plus any evidence to support it, the Nobel prize is waiting. You can put together a team to do this for much less than the cost of mega-church frills. Think of the good you would do for the Young Earth cause! Go for it! Prove that the physicists, chemists, cosmologists, geologists, astronomers, and biologists are all wrong. If the evidence can be found, may God bless and support your journey.
But please desist with the uninformed conjectures. Learn the history of modern math from the philosophy of Pythagoras forward. Learn some basic chemistry and physics. Take at least a year of each subject after you've learned the necessary differential equations. When you know how Fourier's famous formula influenced Schrödinger, and what Darwin's contemporary J.C. Maxwell contributed to our later understanding of quantum physics, let us know.