20160330

The death of heat death




Rick Dutkiewicz

I just read the New Scientist article "When will the universe end? Not for at least 2.8 billion years".

The article is a "Reader's Digest" adaptation (I like to say "short-attention-span version") of a paper that can be found in the General Relativity and Quantum Cosmology section of the arXiv e-print archive at Cornell University Library, "Observational support for approaching cosmic doomsday". [ http://arxiv.org/abs/1602.06211v1 ]

I encourage everyone to take a quick scan of that paper. It shows how many of these popular woo-woo science articles come from conclusions drawn from studies of groups of models. These groups of models rest on mathematical conjectures built upon the assumptions of the Big Bang narrative: a reality born of an immense explosion caused by "random fluctuations", followed by an "Inflationary Period", continuing with an "Expanding Universe" consisting of a 4-or-more-dimensional space/time "fabric", populated mostly by dark matter and dark energy, along with a little baryonic matter, itself no more than sub-atomic particles that pop in and out of existence at the quantum level, all held together by fields of attractive force. Isn't that the overall picture we are asked to accept?

Drawing heavily on Bayesian probability theory, this paper brings together a group of cosmological models and draws conclusions after assigning probability values to each specific model. But, but, ... 
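The bookkeeping behind this kind of model averaging can be sketched in a few lines: each model gets a prior probability, each is assigned an "evidence" (marginal likelihood) for the data, and Bayes' rule converts these into posterior model probabilities. The model names and all the numbers below are invented for illustration; they are not taken from the paper:

```python
# Hedged sketch of Bayesian model comparison. The posterior probability of
# model M_i given data D is:
#   P(M_i | D) = P(D | M_i) * P(M_i) / sum_j [ P(D | M_j) * P(M_j) ]
# "evidence" stands in for P(D | M_i); every value here is made up.

models = {
    "Lambda-CDM (no doomsday)": {"prior": 0.50, "evidence": 1.0},
    "Pseudo-rip model": {"prior": 0.25, "evidence": 0.6},
    "Future-singularity model": {"prior": 0.25, "evidence": 0.9},
}

# Normalizing constant: sum of prior * evidence over all models.
total = sum(m["prior"] * m["evidence"] for m in models.values())

# Posterior probability assigned to each model.
posteriors = {name: m["prior"] * m["evidence"] / total
              for name, m in models.items()}

for name, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.3f}")
```

Note that the output is only as meaningful as the priors and evidences fed in, which is precisely the point made above: the conclusion inherits every assumption baked into the model set.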

The very idea of "heat death" contradicts the law of conservation of matter and just about every other "law of nature" that we can articulate. That's why scientists who discuss black holes, big bangs, quantum randomness, dark matter & dark energy, string theories, multi-universes, etc. always have to say "this is where the laws of physics break down". Oh really?

The ridiculousness of this kind of thinking is paralleled in many parts of modern cosmology (and religion of course). Einstein explained gravity with the idea of "gravity wells" created by the curvature of space. A more massive object attracts a smaller object because it bends the space/time "fabric" downward to create a 3-dimensional "well" that the smaller object is sucked "down" into. How is that not explaining gravity with gravity? Your doctor might be very smart, but if you hear him say, "your skin is inflamed because you have Inflamed Skin Syndrome", you'd better ask some more questions.

You haven't explained anything if you use a word or concept to define itself. It's the old "turtles all the way down" fantasy. You think you are being logical because of all your beautiful mathematical formulae scribbled across multiple chalkboards, but you are kidding yourself. Just because you can make great observations and calculations doesn't mean you have a grasp of reality. Ptolemy proved that you can have a beautifully complex and even largely workable model of reality that is completely wrong. His beautiful conceptualization was just a house of cards waiting to fall as soon as reality tapped him on the shoulder.

Mathematics (along with its study of statistical logic and probability theory) only becomes useful when your terms can be correlated to matter or the motion of matter. When you use a "random" variable in your equation, that doesn't translate to a "causeless" event in reality. When your equation puts out negative numbers or irrational numbers, it's a good hint that you are no longer correlating to real matter or motion. For example, when I have 40 apples in a basket, and I subtract 40 apples, the answer to that simple arithmetic is Zero. But, just because the 40 apples were real and the motion of removing the 40 apples was real, that doesn't mean that there is such a thing as "zero apples". Zero is a very useful concept, but it is not a real thing or action.

When physicists take real observations of matter and motion, and put them together to create a mathematical model that results in "heat death", that doesn't mean "heat death" has to be a real thing or a possible event.

What ingredients go into these "Heat Death" computer models? Models that predict events that will happen billions and billions of years from now, and that will encompass the entire Big Bang universe?

Each model recipe includes:

• 2 cups Empirical Data (finely diced with plus and minus margins of error)
• 1 bunch of statistical probability formulae (to taste)
• 1/2 cup crumbled assumptions (the more inner contradictions, the better)
• 2 Tablespoons (heaping) of random fluctuations
• Grease the pan with "a family of cosmological models featuring future singularities".
• Lightly flour the pan with your choice of statistical probability philosophy.
• Serve cold to friends hungry for reinforcement of their assumptions of indeterminism and finity.

"Does probability measure the real, physical tendency of something to occur or is it a measure of how strongly one believes it will occur?"


20160323

The Myth of “Quantum Entanglement”




George Coyne

Regressive physicists who adhere to the ideas of Quantum Mechanical Theory (QMT) accept that something called “quantum entanglement” is a reality. Theoretically, this is supposed to occur when two quantum systems such as particles or groups of particles are linked such that their linear momenta in one direction and spatial coordinates in one direction have a 1:1 relationship. The combined systems can be described as a whole unit, sort of like the ends of a dumbbell. So, ascertaining the momentum or position for a single quantum system will result immediately in setting those properties for the paired one. In the case of the dumbbell we can see the connection between the two ends, but in quantum entanglement, that is not so clear.  

I would be inclined to accept that the experiments proved quantum entanglement if the following protocol were followed, which of course would be impossible as the experiment is actually described. A laser beam fired through a certain type of crystal splits individual “photons” into pairs of “entangled photons” (photon A and photon B). These photons, which can be separated by any distance, would be tracked throughout the entire experiment. After thousands of measurements of photon A, with the corresponding changes to photon B's spin noted (e.g., from an up spin to a down spin and back to an up spin), the specific photon B would show the opposite spin every single time the physicist measured the specific photon A's spin, no matter how many observations were made. Unless the experiment is done this way, the results are meaningless.

When I described this scenario to Glenn Borchardt his response helped to clarify the situation. He said: “First of all, there are no such things as "photons." At best, many of the properties attributed to photons are the properties of individual aether particles. I doubt that any of what you mentioned is other than mathematical imagination. I have speculated that there are 10^20 aether particles in an electron (from Planck's constant and equations), so I can't imagine determining the spin of any one of them, much less a pair of them. BTW: I would love to have someone show me an experiment that does what you suggest. I don't need the math, just the data that proves it. In my opinion, data is far more significant as a proof than math.”

Also, one must take into account that entanglement experiments always involve the conservation of matter in motion. In the comments section of an October 2015 blog (http://thescientificworldview.blogspot.ca/2015/10/spooky-action-at-distance.html) Bill Westmiller accurately pointed out that complementary trajectories and attributes are inevitable when two particles (or waves) are emitted by a single object, and thus one should expect that the opposite spin will always be found on the paired object. Therefore it is ridiculous to think that there is a causal relationship involved. In referring to another problem involving more complex “entanglements,” he stated that “spin has no preferred orientation: it is never entirely positive or negative. The instruments can only measure one or the other, even when the spin axis is up to 90 degrees off perpendicular. The "quantum effect" is just a 2% error rate in the instrument”.
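Westmiller's conservation point can be sketched with a toy simulation in which each pair's opposite spins are fixed at the moment of emission and no later communication of any kind occurs. The setup below is an illustrative assumption, not a description of any actual experiment:

```python
import random

random.seed(42)  # reproducible illustration

# Toy model: a source emits pairs whose spins are set, and opposite,
# at the moment of emission (conservation). Nothing is transmitted
# between A and B after they separate.
def emit_pair():
    spin_a = random.choice(["up", "down"])
    spin_b = "down" if spin_a == "up" else "up"  # fixed at the source
    return spin_a, spin_b

pairs = [emit_pair() for _ in range(10_000)]

# Measuring A "predicts" B's result in every trial, with no signalling:
anti_correlated = all(a != b for a, b in pairs)
print("anti-correlated in all trials:", anti_correlated)
```

Perfect anti-correlation falls out of the common origin alone, which is the behavior the text argues needs no "spooky" explanation.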

For the paired particles to be “communicating” their quantum state to one another instantly, irrespective of galactic distances, would require faster-than-light transmission of information. As this violates Einstein's universal speed limit, he rejected entanglement as requiring “spooky action at a distance.” This objection is explained in a 1935 paper that he wrote with Boris Podolsky and Nathan Rosen: “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” Now known as EPR, the paper contends that the wave function cannot sustain completeness of the quantum description without violating the principle of locality.

Bell's theorem is the counter-argument to EPR. It states that no physical theory of local hidden variables can ever reproduce all of the predictions of QMT. Bell's 1964 paper, "On the Einstein-Podolsky-Rosen paradox",[1] provided an analysis of the EPR paradox. He reasoned that a measurement decision concerning one paired particle should not influence its partner. But by using a mathematical formulation based on realism and locality, Bell provided cases where such theories' predictions would not equal QMT predictions. Freedman and Clauser (1972)[2] and Aspect and others (1982)[3] showed that in this respect QMT is accurate. Although their experiments apparently “proved” Bell-inequality violations, thereby excluding all local hidden variable explanations for QMT, they do not do this for non-local hidden variables. But more importantly, as mentioned previously, the experimental protocol was so flawed that the results are without any significance.
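For readers curious about the arithmetic these Bell-type arguments turn on, here is a toy sketch of the CHSH quantity for one local hidden-variable model. The detector settings and the hidden-variable strategy are illustrative assumptions; the only point shown is that such a model's CHSH value stays within the classical bound of 2 (up to sampling noise):

```python
import math
import random

random.seed(0)  # reproducible illustration

# Toy local hidden-variable model: each pair carries a hidden angle "lam"
# fixed at the source; each side's outcome is a deterministic +/-1 function
# of lam and its own local detector setting only (no signalling).
def outcome(setting, lam):
    return 1 if math.cos(lam - setting) >= 0 else -1

def correlation(a, b, n=100_000):
    """Average product of the two sides' outcomes; B is anti-aligned."""
    total = 0
    for _ in range(n):
        lam = random.uniform(0, 2 * math.pi)
        total += outcome(a, lam) * (-outcome(b, lam))
    return total / n

# Two settings per side, the usual CHSH arrangement.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = (correlation(a1, b1) - correlation(a1, b2)
     + correlation(a2, b1) + correlation(a2, b2))
print(f"CHSH S = {S:.3f}")  # a local model's |S| stays near or below 2
```

Experiments reporting |S| above 2 are what get cited as Bell-inequality violations; the paragraph above disputes what those reports mean, not this arithmetic.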

Bell provided one non-QMT explanation for entanglement. This is the idea of absolute determinism in the universe. He states:

“Suppose the world is super-deterministic, with not just inanimate nature running on behind-the-scenes clockwork, but with our behavior, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined, including the “decision” by the experimenter to carry out one set of measurements rather than another, the difficulty disappears. There is no need for a faster-than-light signal to tell particle A what measurement has been carried out on particle B, because the universe, including particle A, already “knows” what that measurement, and its outcome, will be."[4]

Although absolute determinism, as an explanation for Bell's theorem, doesn't require bringing in the nonsensical superposition of QMT particles, it is inappropriate in many ways, including that it requires a finite universe. As Glenn Borchardt informed me while I was preparing this blog: “It cannot work because the correct analysis is based on UD which is based on infinite causality instead of finite causality.” In case you are not familiar with UD, it means Univironmental Determinism: whatever happens to a portion of the universe (a “microcosm”) depends on the infinite matter in motion within that microcosm and the infinite matter in motion external to it.

Another problem with Bell's absolute determinism explanation is that it requires the non-scientific idea that a particle or even the universe is capable of “knowing” something.

A powerful argument against Bell's theorem can be made by challenging the assumption that math is all we need to represent reality. Bell's theorem supposedly proved that those opposing entanglement were wrong. In fact, the test was incapable of doing this in the real world because it was restricted to set theory and Venn diagrams, which are purely formal mathematics. His method was used to create a set of inequalities that set limits; if those limits were violated in the real world, then entanglement theory was falsely declared correct because its realist opponents were thought to be wrong. The major problem was the incorrect assumption that equations are the "Essence of Reality." Bell's inequalities, based on his methodology, are applicable only to formal logic and purely formal relations, which are imaginary.

Rather than accepting the absurd idea of “superposition” of particles or Bell's absolute determinism, it seems more reasonable not to accept the concept that what is being observed in the experiments is caused by so-called “entanglement.” Glenn Borchardt addresses this point in this statement:

“Again, phenomena that display “action at a distance” are “spooky” only to aether deniers. Without aether, we are stuck with “curved empty space,” “curved spacetime,” or the magical “attractive force” that still makes no sense even though it has been a solipsistic favorite for centuries. What seems to be “action at a distance” is most likely a local effect produced by variations in aether pressure, as we suggested as the neomechanical cause of gravitation.”

Physicists have no possibility of knowing what state either particle of the “entangled” pair was in before the measurement. Those who subscribe to this entanglement abstraction claim that the paired particles were in a “superposition” that ended only when one of them was measured, thereby causing the particles' superposition to collapse into one particular state. This totally unintuitive and unscientific nonsense, which violates all reason and common sense, was clearly invented to support the concept of entanglement.

The entanglement abstraction is a perfect illustration of a concept that developed from mistaking a formal, imaginary mathematical description for what could possibly occur in the real universe. Most importantly, it demonstrates how physicists come to accept the most implausible and even impossible theories as a consequence of not recognizing when they are dealing with an abstraction that has no association with reality.

It amazes me that physicists have been so incredibly foolish in how they interpreted the "entanglement" experiments. The fact that they were so convinced of the reality of “entanglement” led me to believe that I was not understanding something. But apparently it is the regressive physicists who are not comprehending the facts.



[1]       Bell, John (1964). "On the Einstein Podolsky Rosen Paradox." Physics 1 (3): 195–200.

[2]       Freedman, Stuart J. and Clauser, John F. (1972). "Experimental Test of Local Hidden-Variable Theories." Phys. Rev. Lett. 28, 938. Published 3 April 1972.

[3]       Aspect, Alain, Dalibard, Jean, and Roger, Gérard (1982). "Experimental Test of Bell's Inequalities Using Time-Varying Analyzers." Phys. Rev. Lett. 49, 1804. Published 20 December 1982.

[4]       The quotation is an adaptation from the edited transcript of a 1985 radio interview with John Bell. See The Ghost in the Atom: A Discussion of the Mysteries of Quantum Physics by Paul C. W. Davies and Julian R. Brown (Cambridge University Press, 1986).

20160316

Distinguishing abstractions from what exists






George Coyne

To accurately state that a “thing” exists, it must have mass and volume. Particular objects or specific particles of any size can always be defined in this way. If this is not possible, then it is not valid to refer to a thing existing. That is why 20th century physics is mistaken in postulating “point particles” such as quarks and leptons, which are considered to have rest mass but no volume.

Applying these criteria to “matter” in order to get measurements is not possible, because there are no boundaries or mass that can be ascribed to matter. Therefore, as discussed in Blog 20160203 Matter and motion are abstractions, “matter” does not “exist”. I deliberately avoid using the word “it” when referring to matter because that would imply an existing “thing”.

As early as ancient Greece, philosophers such as Aristotle recognized that space and time are abstractions, as distinguished from the material world, which exists. These thinkers were aware that “empty space” is totally impossible. They realized that it is simply an abstraction useful in considering how objects within the universe are arranged. Aristotle and others of his era maintained that time is a measurement of motion or the cycles of change. Although Dr. Borchardt sees time as being the actual motion that is occurring, rather than its measurement, he and Aristotle both recognize time as an abstraction.

So far the best abstraction that we have for explaining reality is the infinite universe theory described in Stephen Puetz's and Glenn Borchardt's “Universal Cycle Theory: Neomechanics of the Hierarchically Infinite Universe”.

Applying the mass and volume criteria to the phrase “infinite universe” reveals that this phrase also refers to an abstraction, because by definition there are no boundaries and no specific mass associated with an “infinite universe”. When physicists and cosmologists arrive at this understanding, it will be enormously helpful in the advancement of scientific theory. Recognizing that “infinite universe” is an abstraction attempting to represent the totality of everything that exists, along with the concept of there being no ultimate macro or micro boundaries in that totality, reveals the limitations of attempting to represent reality through abstractions.

Although I am convinced that there is some form of infinity, it is impossible to prove whether the universe is finite or infinite, or whether there are other forms of matter not yet discovered in an infinite universe. 

It is futile to expect thought to be adequate in conceptualizing anything that is not finite, because all concepts require boundaries and limits. Thus, even concepts of an infinite universe are still circumscribed by the limitations of thought. It is similar to the impossibility of trying to represent “non-existence” with a concept. When one tries to think of “nothing”, one is inevitably thinking of something. Therefore, in referring to “what is” in “Infinite Universe Theory”, it is important to put the emphasis on falsifying concepts such as the theory of a finite universe, while keeping in mind the limitations of words and concepts in discussing that which is not limited.

Realizing that “infinity” is an abstraction helps to make this more manageable. When one thinks of infinity as a real thing, by contrast, there is an insurmountable problem, because it is impossible to actually conceive of infinity.
_______

In a previous blog on consciousness (http://thescientificworldview.blogspot.ca/2015/08/using-mind-and-consciousness-in-freedom.html) I explained that consciousness occurs when neurons, which form a network within the brain, fire in communication with each other and with that network. Thus consciousness is a type of motion occurring within the brain, and since motion is an abstraction (http://thescientificworldview.blogspot.ca/2016/02/matter-and-motion-are-abstractions.html), so is consciousness.