Discussion Questions and Comments with Responses
from Barry Setterfield


Question 1. One of the basic principles of quantum mechanics is that light does in fact exist in discrete packets of energy. The very word "quantum" refers to this; indeed, the whole area of physics dealing with photons is named after it.

Using Setterfield's analogy of cars moving along a road: the Doppler effect measured from the cars as they pass gives a continuous kind of change. But if sound existed in quanta the way light does, then the Doppler effect would change in the same discrete units in which the redshift is measured to change.

Barry's Response: If a redshift is due to motion, it is not quantized; it is a smooth smearing depending on the velocity. There is no quantized effect. We see this smooth smearing in the velocities of stars, the rotation of stars, and the movement of stars within galaxies. What happens is that the wavelength of the photon is stretched or contracted by the velocity at the time of emission. Therefore the fact that the photon originates as a discrete packet of energy is irrelevant. The point that needs to be made is that in distant galaxies, photons of light have been emitted with a range of wavelengths. All these wavelengths are simultaneously shifted in jumps by the same fraction, and it is these jumps which Tifft has noted; they are not indicative of a Doppler shift. So some other effect is at work. (November 16, 1999)
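The distinction being drawn here can be sketched numerically. In the non-relativistic limit a Doppler redshift is the smooth function z = v/c, whereas a quantized redshift could only take values at discrete steps. This is an illustrative sketch only; the step size of 2.7 km/s and the function names are assumptions, not figures from the response above:

```python
C = 299792.458  # speed of light in km/s

def doppler_z(v_kms):
    """Non-relativistic Doppler redshift: z = v/c, a smooth function of velocity."""
    return v_kms / C

def quantized_z(v_kms, step_kms=2.7):
    """Hypothetical quantized redshift: z restricted to integer multiples
    of an assumed basic step (2.7 km/s here is illustrative only)."""
    n = round(v_kms / step_kms)
    return n * step_kms / C

# A smooth range of velocities gives a smoothly varying Doppler redshift...
vs = [10.0, 11.0, 12.0, 13.0]
smooth = [doppler_z(v) for v in vs]
# ...but under quantization, nearby velocities collapse onto the same discrete z.
stepped = [quantized_z(v) for v in vs]
```

The point of the contrast: `smooth` contains four distinct values, while `stepped` collapses them onto fewer discrete levels, which is why a velocity-caused redshift cannot look like Tifft's jumps.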


Comment 2. Observational evidence in favor of the proposition that the speed of light is time-variable is so weak as to be practically non-existent. I think the proposition can be safely ignored, simply on the grounds of lack of evidence. The interpretation of quantization as a fundamental problem for standard cosmology is also in error. Tifft's argument is that galactic redshifts have a superimposed periodicity. If the redshift is caused by a Doppler effect, and if the matter distribution is periodic, then the redshifts will be periodic too, just as Tifft argues they are (although Tifft's results are not all that strong either). So even if we accept the "quantization" (poor semantics; "periodicity" is better and more descriptive of what is actually observed), it works just fine in a modified Big Bang cosmology (one has to find a way to construct a periodic mass distribution on large scales, which should not be an onerous task).

Barry's response: A lot of cosmologists and science journal editors didn't think so. Neither did those editors who commissioned major articles on the topic.

There are in fact periodicities as well as redshift quantization effects. The periodicities are genuine galaxy-distribution effects. However, they all involve large redshift differences, such as repeats at z = 0.0125 and z = 0.0565. The latter value involves 6,200 quantum jumps of Tifft's basic value and reflects the large-scale structuring of the cosmos at around 850 million light-years. The smaller value corresponds to around 190 million light-years, which is the approximate distance between superclusters.

The point is that Tifft's basic quantum states still occur within these large-scale structures and have nothing to do with the size of galaxies or the distances between them. The lowest observed redshift quantization that can reasonably be attributed to an average distance between galaxies is the interval of 37.6 km/s that Guthrie and Napier picked up in our local supercluster. This comprises a block of 13 or 14 quantum jumps and a distance of around 1.85 million light-years. It serves to show that basic quantum states below the interval of 13 quantum jumps have nothing to do with galaxy size or distribution. Finally, Tifft has noted that there are redshift quantum jumps within individual galaxies. This indicates that the effect has nothing to do with clustering. (November 16, 1999)
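The figures quoted in the two paragraphs above can be checked against each other: the basic quantum implied by the z = 0.0565 periodicity spanning 6,200 jumps should, divided into the 37.6 km/s Guthrie-Napier interval, give the stated block of 13 or 14 jumps. A quick consistency check (variable names are mine; the input numbers are those quoted in the text):

```python
C = 299792.458  # speed of light in km/s

# Figures quoted in the response above
guthrie_napier_interval = 37.6   # km/s, quantization in the local supercluster
large_scale_z = 0.0565           # large-scale redshift periodicity
large_scale_jumps = 6200         # quantum jumps attributed to that periodicity

# Tifft's basic quantum implied by the large-scale figures, in km/s
basic_from_large_scale = large_scale_z * C / large_scale_jumps   # ~2.73 km/s

# Number of basic jumps making up the Guthrie-Napier interval
jumps_in_gn_interval = guthrie_napier_interval / basic_from_large_scale  # ~13.8
```

The result of roughly 13.8 jumps is consistent with the "13 or 14 quantum jumps" stated in the text.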


Comment: I've been reading "Impossibility: The Limits of Science and the Science of Limits" by John D. Barrow, and he has an interesting discussion of the speed of light in our geological past. On pages 186 and 187 he describes the discovery at Oklo, in the West African Republic of Gabon, of the remnants of an ancient site where an accident of geology produced, for a while, the conditions suitable for a sustained chain reaction to take place - a sort of natural nuclear reactor. It was moderated by water permeating a deposit of uranium. As the reaction proceeded, the water would heat up and vaporize, depriving the neutrons of the moderating influence of liquid water, and the reaction would slow down. As the water vapor condensed and reformed, the reaction would pick up the pace again. This didn't last very long on a geological time scale, but the reaction results are very informative. In particular, because of the way mass and energy are related, the physical constants governing the reaction could not have been at variance with our present-day observations by more than one part in ten million; otherwise the natural reactor would not have functioned. And this took place 1.8 billion years ago. Barrow cites M. Maurette, "The Oklo Reactor", Annual Review of Nuclear and Particle Science 26, 319 (1976); A. I. Shlyakhter, Nature 264, 340 (1976); and F. Dyson and T. Damour, "The Oklo Bound on the Time Variation of the Fine-Structure Constant Revisited", Nuclear Physics B480, 37 (1997).

Any significant variation in the relationship between mass and energy - the only variables that determine the speed of light in Einstein's famous E = mc² - would have to be dated prior to 1.8 billion years ago, based on this witness from God's creation.

Barry's Response: There is a discussion of the effects of radioactive decay and natural ore bodies in Ex Nihilo Technical Journal Vol. 1, 1984, pp. 126-129. My reply on those pages was sparked by a question about Oklo and other ore bodies by Bob Gentry.

The basic fact about uranium ore bodies is that slow neutrons must be captured by the uranium nucleus in order to produce the reaction. It is for that reason that water was needed at Oklo: to slow the neutrons down sufficiently for the ore body to start a chain reaction. With high c values, it can be shown that atomic particles moved faster, in proportion to c. This included the neutrons produced at Oklo. Those high-speed neutrons were not near the uranium nuclei long enough to be captured, just as high-speed neutrons today escape capture. As essentially all the neutrons were in that category when c was higher, the chance of a reaction was significantly lower. The conclusion is that neutron-induced reactions in ores, though minimal now, would have been even more minimal with higher light speed, so no chain reaction would occur. For a fuller discussion, refer to the original article.
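The capture argument above turns on the standard 1/v behaviour of slow-neutron absorption: the capture cross-section falls inversely with neutron speed, since a faster neutron spends less time near any nucleus. A minimal sketch of that scaling, with illustrative units (the function name is mine; 2.2 km/s is the conventional thermal-neutron reference speed):

```python
def capture_cross_section(v_kms, sigma_thermal=1.0, v_thermal=2.2):
    """1/v law for slow-neutron capture: cross-section scales inversely with speed.
    sigma_thermal is the cross-section at the thermal reference speed (arbitrary
    units); v_thermal = 2.2 km/s is the standard thermal-neutron reference speed."""
    return sigma_thermal * v_thermal / v_kms

# Under the article's premise that particle speeds scaled with c, a neutron
# moving k times faster would be captured roughly k times less often.
k = 10.0
ratio = capture_cross_section(k * 2.2) / capture_cross_section(2.2)  # equals 1/k
```

This is only the proportionality the argument relies on, not a model of the Oklo reactor itself.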

