
Letters by a modern St. Ferdinand III about cults

Gab@StFerdinandIII - https://unstabbinated.substack.com/

Plenty of cults exist - every cult has its 'religious dogma', its idols, its 'prophets', its 'science', its 'proof' and its intolerant liturgy of demands.  Cults everywhere:  Corona, 'The Science' or Scientism, Islam, the State, the cult of Gender Fascism, Marxism, Darwin and Evolution, Globaloneywarming, Changing Climate, Abortion...

Tempus Fugit Memento Mori - Time Flies Remember Death 

Archive - April 2024

Cosmic Microwave Background radiation disproves the Big Bang religion.

One of many such proofs. CMB and that ‘horrible truth’ that the Earth might well be the barycentre of the universe.


 

Prologue

Cosmic Background Radiation is used as a proof of the ‘Big Bang’, a discredited theory in which a singular ‘egg’ containing all the elements of the universe supposedly ‘exploded’, creating everything, including life on this planet, and left a radiation imprint in the cosmos.  As previous posts have outlined, this theory and its CMB derivative are entirely fabricated and wrong.  However, it is even worse for the Banging-faithful.  CMB, properly understood, actually undermines and negates the Big Bang model.  Hoist by your own petard and all that.

 

What is it?

 

The Cosmic Microwave Background Radiation (CMB) is radiation in the form of microwaves (the same which are produced in a microwave oven) which is supposedly the residual energy left over from the Big Bang that was said to have occurred 13.7 billion years ago.  The original temperature of the Big Bang explosion was believed to have been about 3000 degrees Kelvin and this is said to have cooled down to the present 2.75° Kelvin of the CMB 13.7 billion years later as the universe expanded.  No evidence exists for these suppositions, just models and thought experiments. 

[A problem with this dogma is that our universe supposedly has a diameter of 93 billion light years, or roughly 550 billion-trillion miles.  13.7 billion years is not enough time for this distance to be created at the speed of light.]
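
As a quick unit check of the mileage figure in the bracketed note above, using the standard miles-per-light-year conversion (a minimal sketch, not part of the original argument):

```python
# Unit check of the "93 billion light-years ≈ 550 billion-trillion miles" figure.
LIGHT_YEAR_MILES = 5.879e12      # miles in one light-year (standard value)
diameter_ly = 93e9               # claimed diameter of the observable universe, light-years

diameter_miles = diameter_ly * LIGHT_YEAR_MILES
print(f"{diameter_miles:.2e} miles")   # ~5.5e23, i.e. roughly 550 billion-trillion miles
```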

 

In 1965, Arno Penzias and Robert Wilson ‘discovered’ the Cosmic Microwave Background Radiation or ‘CMB’.  It was hailed as one of the greatest discoveries ‘ever’ (or ‘evah’ in climate-speak), ‘confirming’ the Catholic Fr. Georges Lemaître’s theory, from the 1930s (Penzias and Wilson, 1965). 

 

There was, however, a generations-long history of CMB ‘discovery’, including Grote Reber (d. 2002), whose early-1940s observations of the CMB were widely published in peer-reviewed journals, and the Canadian astronomer Andrew McKellar (1941), who discovered interstellar gas radiating at 3º Kelvin.  The Americans Penzias and Wilson (1965) received credit for this ‘insight’ because they interpreted the CMB in line with the Big Bang theology - a burgeoning field with enormous financial support.  But the idea of the CMB dates to at least 1895, and forecasts of its temperature have been all over the place:

·       In 1895, C. E. Guillaume determined that the cosmic temperature (or CMB) should be 5° or 6° K (Guillaume, 1896).

·       In 1926 Sir Arthur Eddington posited that the space between the heated bodies of the universe would cool down to a temperature slightly above absolute zero, and his chosen figure was between 2.8° and 3.18° K (Eddington, 1926). 

·       Seven years later (1933), Erhard Regener obtained the figure of 2.8° Kelvin and stipulated that it was a homogeneous energy field.

·       Nernst posited 0.75° Kelvin in 1938; Herzberg 2.3° K in 1941; Finlay-Freundlich, using the theory of “tired light”, said it should be between 1.9° and 6° K.

 

One of the main tenets of the Big Bang theory is that the currently agreed 2.728 ºK temperature is the result of radiation released in the reaction of electrons and protons that were in the process of forming hydrogen about one million years after the initial explosion.  Since the temperature during this reactive state is said to have been 3,000 ºK, the resulting 2.728 ºK is said to be the result of a hydrogen-flash redshift factor of z ≈ 1,000, although few have an explanation for why there were no objects in the cosmos with z factors between 10 and 1,000.
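
For reference, the mainstream bookkeeping criticised here follows the standard scaling T_observed = T_emitted / (1 + z); a minimal sketch of that arithmetic, using only the figures quoted above:

```python
# Mainstream redshift-temperature bookkeeping: T_observed = T_emitted / (1 + z)
T_emitted = 3000.0    # ºK, claimed temperature at hydrogen formation
T_observed = 2.728    # ºK, present-day CMB temperature

z = T_emitted / T_observed - 1
print(f"implied redshift z ≈ {z:.0f}")   # ≈ 1099, i.e. the factor of ~1,000 cited above
```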

 

Sir Fred Hoyle dubbed this theory “The Big Bang” to register his scepticism regarding its scientific validity, although Hoyle tenaciously held to an equally weak view called “The Steady State” theory, which holds that the universe is infinite yet comes into being little by little (Physics Today, Nov 1982).  Although Big Bang advocates claim that their theory predicted the existence of the CMB, their predictions were considerably higher than the present 2.728° Kelvin, as the list above shows (Gamow, 1961).

 

It exists but so what?

Few dispute the rather obvious fact that the CMB exists, but what is disputed is precisely why it exists and what it means.  All in all, there is little to persuade the critical observer that a Big Bang produces the CMB, as opposed to merely the natural minimum of heat expected in a universe at equilibrium. As Andre Assis puts it:

 

Usually it is claimed that the CBR (cosmic background radiation) is a proof of the big bang and of the expansion of the universe as it had been predicted by Gamow and collaborators….However, we performed a bibliographic search and found something quite different from this view….we have found several predictions or estimations of this temperature based on a stationary universe without expansion, always varying between 2 K and 6 K. Moreover, one of these estimates [C. E. Guillaume] was performed in 1896, prior to Gamow’s birth in 1904!  The conclusion is that the discovery of the CBR by Penzias and Wilson in 1965 is a decisive factor in favour of a universe in dynamical equilibrium without expansion, and against the big bang (Assis, pp. 189-190).

 

Not only can the CMB be shown to be unsupportive of the Big Bang theory, but the low Kelvin temperature is also consistent with non-expanding models of the universe, e.g., geocentric models.  This is anathema to ‘The Science’.

 

Isotropy versus Anisotropy

The reality that a low “residual energy” CMB invalidates the Big Bang and actually points to the Earth as the barycentre of the cosmos is sometimes admitted by ‘The Science’.  Joseph Silk of the University of California (Berkeley) lamented:

“Studies of the cosmic background radiation have confirmed the isotropy of the radiation, or its complete uniformity in all directions.  If the universe possesses a center, we must be very close to it…” (Silk, pp. 399-400).

 

If observed from anywhere else in the universe, the CMB will appear heavily anisotropic (or heterogeneous, dissimilar).  Viewed from the Earth, the CMB appears isotropic or homogeneous.  This cannot be tolerated by ‘The Science’, so there have been furious attempts to dismiss this fact by presuming, in addition to its isotropy, that the universe is also homogeneous, since all Big Bang and Steady-State cosmologies require both isotropy and homogeneity (Ellis, p. 92).  Yet this is not what their own evidence shows.

 

The Earth-viewed observation of CMB isotropy serves as the absolute frame of reference, anathema to Special Relativity and the cult of Einstein.  But there it is.  As  V. J. Weisskopf states:

It is remarkable that we now are justified in talking about an absolute motion, and that we can measure it. The great dream of Michelson and Morley is realized….It makes sense to say that an observer is at rest in an absolute sense when the 3K radiation appears to have the same frequencies in all directions…. Nature has provided an absolute frame of reference. The deeper significance of this concept is not yet clear (Weisskopf, 1983).

 

Einstein’s make-believe world of ‘Relativity’ relied on no absolutes, no frame of reference, no ether, and an invariant speed of light, pace the ‘acceleration’ of the universe claimed by Big Bang dogma.  He was wrong on every assumption.   More here

Scientism and the Sloan Digital Sky Survey results. Mainstream cosmology in crisis.

Yet more proof that the Copernican Principle, the Big Bang, and even heliocentricity have little merit and even less observable evidence to support them.


 

Prologue

The Sloan Digital Sky Survey, or SDSS, now more than 25 years old, was financed and created to provide the most accurate mapping to date of the galaxies, quasars, and other objects in the universe.  It is a long-running project, currently in phase 5.  Hundreds of astronomers and dozens of institutions and observatories from around the globe are involved, mapping out hundreds of thousands of galaxies, quasars and other objects, and of course the ever-elusive ‘dark matter’, without which the entirety of the Big Bang theology fails.  In their own words, the Sloan Sky survey:

will map in detail one-quarter of the entire sky, determining the positions and absolute brightnesses of more than 100 million celestial objects. It will also measure the distances to more than a million galaxies and quasars… The SDSS addresses fascinating, fundamental questions about the universe…will tell us which theories are right – or whether we have to come up with entirely new ideas.  

 

The Sloan Digital Sky Survey (SDSS) is a joint project of The University of Chicago, Fermilab, the Institute for Advanced Study, the Japan Participation Group, The Johns Hopkins University, the Los Alamos National Laboratory, the Max-Planck- Institute for Astronomy (MPIA), the Max-Planck-Institute for Astrophysics (MPA), New Mexico State University, University of Pittsburgh, Princeton University, the United States Naval Observatory, and the University of Washington. Funding for the project has been provided by the Alfred P. Sloan Foundation, the participating institutions, the National Aeronautics and Space Administration, the National Science Foundation, the U.S. Department of Energy, the Japanese Monbukagakusho, and the Max Planck Society.

 

A long list of the great and good institutions.  The ‘establishment’ of academic cosmology no less.  Yet, as their own mission statement declares, entirely new ideas will need to be developed, along with the decommissioning of the Big Bang religion and much of Copernicanism.  The SDSS simply does not support either.  Not that anyone is told this.

 

In fact, ‘The Science’ as it always has done when faced with evidence which eviscerates its dogma, will simply declare that the observations in fact support and confirm their theology!  Indeed, it is ‘exactly as they expected’.  We have heard the same for 200 years from the drugs-pharmaceutical industry, not to mention the non-sciences of evolution, medicine, virology, space exploration, and climate theology.  In every sphere and cult within ‘The Science’ the above declarations are the standard mantra.  Just ignore the evidence, obfuscate, issue propaganda, delete evidence and declare in ever-so confident tones that ‘The Science’ has been vindicated.  After this confident assertion, supported by tortured data sets, collect your money. 

 

It can’t be in the center!

 

By 2003, the SDSS had already discovered that the Earth seemed to be in the center of the known universe.  Since then, the data has simply accumulated in support of this observation.  However, howls of outrage and name-calling are sure to follow if this idea is either distributed or, worse, believed.  But there it is.

 

The SDSS confirms that the Earth is in the center of two wedge-shaped galaxy segments near the ‘barycenter’, or center of universal mass.  The SDSS also shows that galaxy density decreases as the distance from Earth increases, implying a concentric proportion leading to the Earth.  This means that the Bangers cannot use the excuse that the view is from the observer, namely our Earth, and therefore ‘distorts’ the known universe’s map.

 

‘The Principle’ rubbished

If one were to perform a similar survey from another part of the universe, these concentric proportions would not appear.  The centrality of Earth shown by the Sloan Digital Sky Survey is thus consistent with the quantization of redshift values that has been accumulating for five decades or more (Varshni, Arp).  Once again, the ‘Copernican Principle’ is violated and no proof whatsoever can be offered in its defense.  The ‘Copernican Principle’ is simply the claim that the Earth is an unimportant little flattened spheroid at the centre of nothing, and that therefore, by extension, humans are a blind-chance artefact of no great import, probably evolved from panspermic space dust.

 

But the facts don’t support this misanthropy or its associated dogma.  Concentricity and the heterogeneous distribution of galaxies defy mainstream cosmology’s claims and models, including its vaunted ‘Copernican Principle’.  The fact that the universe is not isotropic and does not show the same properties in every direction, as this ‘Principle’ predicts, means we have an anisotropic, or heterogeneous, universe.  Given these facts, if the observer’s viewpoint were moved from the Earth to somewhere else in the universe, the mathematical theorems underlying galaxy formation would be shown to be wrong, and that observer would still conclude that the Earth is at the center.  Astronomer Harold Slusher wrote:

 

If the distribution of galaxies is homogeneous, then doubling the distance should increase the galaxy count eightfold; tripling it should produce a galaxy count 27 times as large. Actual counts of galaxies show a rate substantially less than this. If allowed to stand without correction, this feature of the galaxy counts implies a thinning out with distance in all directions, and that we are at the very center of the highest concentration of matter in the universe….This would argue that we are at the center of the universe.

 

When galaxy counts are adjusted for dimming effects, it appears that the number of galaxies per unit volume of space increases with distance.  From this we still appear to be at the center of the universe, but now it coincides with the point of least concentration of matter (Slusher, pp. 12-13).
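
A brief numerical restatement of the homogeneity scaling Slusher invokes: for a uniform distribution, the cumulative galaxy count grows as the cube of the survey distance (the base count of 1,000 below is an arbitrary example value):

```python
# For a homogeneous galaxy distribution, the count within distance r scales as r**3.
def expected_count(base_count, distance_factor):
    """Cumulative galaxy count when the survey radius is multiplied by distance_factor."""
    return base_count * distance_factor ** 3

print(expected_count(1_000, 2))   # doubling the distance ->  8,000 (eightfold)
print(expected_count(1_000, 3))   # tripling the distance  -> 27,000 (27 times as large)
```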

 

 

SDSS data, which again confirms anisotropy and heterogeneity, contradicts the Copernican Principle and what Big Bang theology predicts and demands.  More here

4 reasons why E=mc² is wrong. Einstein made very basic mistakes when interpreting this equation.

Another example of 'The Science' going off on the wrong path, unwilling and unable to correct itself.

 

Energy = mass × the speed of light squared.  There are many problems with this equation, which Einstein did not invent, but interpreted as part of his fantasy world of Relativity.  Prior to Einstein, various physicists including Isaac Newton, Jules Henri Poincaré, and Olinto De Pretto had proposed the equation.  Einstein derived the equation starting from the result of relativistic variation of light energy.  He appropriated the equation and its concepts without due attribution – which was a distinctive Einsteinian feature.  Why bother acknowledging the work of others?  He rarely if ever did.

 

This short post will briefly describe the basic mistakes within the E=mc² equation.  Many other posts go through the 1905 Special Theory of Relativity (STR) and why it is wrong (you can start with the fact that space is not a vacuum).  This post adds to those postulates.

 

Error #1:  Einstein rejects basic kinetics

STR as a theory is, at its core, riddled with paradoxes and contradictions (see Herbert Dingle on the clock paradox).  So too is Einstein’s interpretation of E=mc².  Einstein took this simple equation and then contorted it with paradoxical ideas of mass and energy.

 

What does the equation mean?

Einstein’s general interpretation, KE = mc², defines a relationship between mass and kinetic energy.

1.     when a body of mass is accelerated it gains mass and energy

2.     when a body of mass is decelerated it loses mass and energy

3.     the mass increase/decrease for all matter is proportional to each body’s kinetic energy (relative to a common position of rest for all matter)

 

What does this actually mean?  For Einstein, in this interpretation of E=mc², energy and mass coexist together.  The key is the kinetic energy of a body and its mass.  If we take an object and accelerate it to a given velocity, the kinetic energy of that motion contributes to the overall mass of that body.  This is counted in Joules, the measure of kinetic energy.  One Joule has a mass of about 10⁻¹⁷ kg.  A kilogram of mass will therefore weigh about 10¹⁷ Joules.  In this interpretation a Joule of energy is a quantity of energy, and it is also a quantity of mass.  Thus, bodies in motion will possess both Joules of kinetic energy and Joules of kinetic mass.  If the body in motion slows down, it will be losing kinetic energy, and Joules.  Its mass should therefore decrease as well.
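
A minimal numerical sketch of the bookkeeping described above, using m = E/c²; the 1 kg body and 1,000 m/s velocity below are arbitrary example values:

```python
# Mass equivalent of energy, m = E / c**2, as used in the interpretation above.
C = 299_792_458.0                     # speed of light, m/s

def mass_of_energy(joules):
    """Mass equivalent (kg) of a given energy (J)."""
    return joules / C**2

print(mass_of_energy(1.0))            # ~1.1e-17 kg per Joule
print(1.0 / mass_of_energy(1.0))      # ~9.0e16 J per kilogram (roughly 10**17)

# Kinetic energy of an illustrative 1 kg body at 1,000 m/s, and its tiny mass equivalent:
ke = 0.5 * 1.0 * 1_000.0**2           # classical KE = 1/2 m v**2 = 5e5 J
print(mass_of_energy(ke))             # ~5.6e-12 kg of 'kinetic mass'
```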

 

Einstein refused to accept this.  He did not believe deceleration to be a meaningful measurement or concept which can be independently differentiated from acceleration.  He also did not believe that an absolute position of rest could be determined, because the mass changes caused by motion, in his view, can never be measured locally.  Einstein did not believe in any absolutes, as they are anathema to STR.  All of these suppositions are simply wrong.

 

Error #2:  Einstein did not understand that photons have mass

 

Einstein never performed experiments.  He was a thought philosopher with mathematical skills.  Einstein did not understand that the primary meaning of E=mc² is to define the mass of photons (light) as the truest measure of a mass (the base, as it were, of mass measurement).  Einstein arbitrarily declared, based on his own ego one assumes, that the photon was a particle without mass.  This error now permeates all of science.  It is absurd.  A particle with no mass would have no momentum or motion.

 

All particles have mass.  Photons have a mass of at least 10⁻⁵⁰ kg.

In his thought experiments Einstein used Planck’s Constant to make the transformation between the mass of an atom and the energy of a massless photon.  By failing to give the photon mass, he was unable to divide Planck’s constant into its component parts, h = mλc, namely the mass of a photon times its wavelength times the speed of light.  Rejecting the mass of the photon completely upends the point of E=mc².
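
Taking the post’s h = mλc reading at face value, the implied photon ‘mass’ at any wavelength is m = h/(λc); a minimal sketch, with 500 nm chosen as an arbitrary example wavelength:

```python
# Photon 'mass' implied by the h = m * wavelength * c relation above: m = h / (wavelength * c)
H = 6.62607015e-34      # Planck's constant, J*s
C = 299_792_458.0       # speed of light, m/s

def implied_photon_mass(wavelength_m):
    return H / (wavelength_m * C)

print(implied_photon_mass(500e-9))    # ~4.4e-36 kg for 500 nm (green) light
```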

 

If we admit that the photon has a mass, there is no case where that mass is converted into energy.

·       Any mass will have energy that can be measured

·       Energy has mass that can be weighed

·       Mass and energy by definition cannot be separated into the mass of matter and the energy of photons

 

We can conclude that mass and energy are the two primary parameters of both matter and photons.  One cannot exist without the other.  There is no such thing as the long-cherished metaphysical idea of ‘pure energy’.  Back on planet Earth we only have pure ‘mass-energy’.

 

Error #3:  Einstein did not understand anti-matter

 

Positrons and anti-matter are discussed in some other posts.  Positrons were discovered from 1928-1932 and Einstein could never wrap his head around the concept that antimatter and positrons were the latticework of space.  Space has never been a ‘vacuum’.  Space is full of antimatter and positrons which are real particles with real mass.  This means that matter cannot be converted into energy. 

 

How does this work?

·       Photons are produced by atoms

·       Photons are made from equal pieces of positive matter (proton) and negative matter (electron).

·       Neither a proton nor an electron can produce a photon by itself

·       A photon is the result of a joint effort between a proton and an electron with each contributing an equal amount of their mass and energy to make the photon

 

In this process, the creation of a photon (light particle), requires an equal quantity of positive matter (positron) and negative matter (electron).  There is no way to convert ordinary matter into photons except in the extremely small quantities produced by atomic radiation.

 

A core tenet of STR is that there is nothing in space, just a frictionless vacuum.  We know this is entirely wrong.  You can start with the radiation which permeates space and makes space travel impossible.  Radiation has energy and mass.  Beyond this, the universe is absolutely filled with positive matter (protons) and negative matter (electrons).

 

The problem for Einstein’s view of mass and energy is that when two particles couple together to form a hydrogen atom for example, they emit a series of photons in a process that begins very much like the annihilation between a positron and electron.  Both the proton and the electron will lose equal amounts of mass to the emitted photons as they drop down into the ground state where the process stops and the atom becomes stable. Einstein never supported antimatter nor the real process of photon creation.

 

Error #4:  Einstein never proved relative motion

 

Einstein failed to understand that all photons travel at c (the speed of light, which we know can vary through the ether) through the same inertial reference frame (inertial meaning the existing motion of the object), and not just relative to observers.  Einstein made the speed of light relative to the observer’s frame.  While it is true, as Einstein claimed, that all observers will measure the speed of light to be c in any frame, it is not true that they measure the same quantity.

 

Einstein never proved his assumption of ‘relative’ motion, which by itself is wrong.  The Doppler effect (measuring the relative motion between a source and observer) means that we can measure the difference between acceleration and deceleration, and between motion and rest.  Just because Einstein never bothered to measure absolute rest does not mean that absolute rest does not exist.  For example, there is the absolute motion of photons, which Einstein ignored.  If photons move with absolute motion, then the motion of matter must also be absolute.  Einstein should have known that:

·       m = E/c² (to rearrange the equation), which defines a body of matter’s excess mass associated with its absolute motion relative to rest

·       When a body of matter is accelerated to any velocity (v) relative to this frame, its mass increases with its kinetic energy, KE = mc²

·       At a velocity of about 86% of the speed of light, a body’s mass is doubled, with a kinetic mass equal to its rest mass (a quick numerical check of this figure follows below)
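
A quick numerical check of the 86% figure in the last point above, using the standard Lorentz factor γ = 1/√(1 − v²/c²) that lies behind the doubling claim:

```python
import math

# Lorentz factor gamma = 1 / sqrt(1 - (v/c)**2); 'kinetic mass' equals rest mass when gamma = 2.
def gamma(v_over_c):
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

print(gamma(0.866))          # ≈ 2.0, i.e. total mass doubled
print(math.sqrt(3) / 2)      # exact speed for gamma = 2: v/c = √3/2 ≈ 0.866 (about 86.6%)
```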

 

The above can only occur within a single frame of reference for all matter.  We know that matter gains mass when it is accelerated, and it also gives up that mass when it is decelerated. All photons and all electrons in a given reference frame have identical masses. 

 

Therefore, a precious tenet of Einstein’s theorems and STR, that there are no absolutes, is fundamentally wrong.  Einstein’s most basic mistake was to conclude, from the Doppler effect, that all motion was intrinsically relative.  He failed to believe in a fixed frame connecting all forms of motion.  However, a more careful look at the Doppler effect forces one to conclude that a common absolute motion for all photons must exist.

 

Bottom Line

 

That E=mc² is wrong will never be taught at school or shown in the science propaganda.  Despite Einstein’s rather basic mistakes, ‘The Science’ has enshrined this equation with demi-god status.  Yet it is utterly incorrect and leads real science down the wrong path.

 

This equation also has little to do with ‘nuclear fusion’ or fission, the oft-cited ‘proof’ of the equation.  Nuclear reactions do not support the theorem, for the reasons outlined in this post.  As well, the ‘mass’ used in the equation must already exist.  Only existing matter can create matter.  Einstein’s fantasy world rejected this, believing that matter can magically be called upon to appear.  This alone negates the equation, but the postulates in this post add further weight to why the equation is of little use.

E=mc² is another classic example of Scientism.

Positrons and plasma disprove Einstein, STR and E=mc². Anderson's 1932 experiment.

As with the Michelson Morley experiment of 1887, 'The Science' ignores evidence it does not like. Anderson disproved E=mc², STR and quantum mechanics. Not bad.


Prologue

As previous posts have summarised, space is full of particles, matter, and fields.  Radiation by itself contains form, energy, and material.  Since the medieval period this complex of matter in space has been termed an ‘ether’.  Einstein’s STR dogma that the ‘ether’ does not exist is wrong, and that by itself nullifies his relativity theory, which was an attempt to ‘relativise’ away the literally thousands of experiments from 1810 to his time which did not find Earth mobility or a diurnal rotation.  The ‘ponderable’ ether also abrogates the world’s most famous scientific equation, E = mc².  This equation is simply wrong.

Anderson’s 1932 experiment

 

The micro-world of atoms and particles is still a recent discovery when viewed historically.  The electron was discovered in 1897 by J. J. Thomson; the proton in 1911 by Rutherford, Wien, et al.; the neutron in 1932 by James Chadwick; and the positron was first predicted in 1928 by Paul Dirac and confirmed in 1932 by the American Carl Anderson (1905-1991).

 

Anderson shared the 1936 Nobel Prize for physics with Victor Hess of Austria, Anderson’s share being for the discovery of the positron, the first known particle of ‘antimatter’.  Positrons are electrons ejected from atoms by interaction with high-energy photons (or light particles).  Anderson arrived at his discovery through his intensive research into gamma rays, x-rays and cosmic rays.  While studying photographs of cosmic rays in cloud-chambers, Anderson discovered a number of tracks whose orientation indicated they were caused by positively charged particles, but particles too small to be protons.

 

In 1932 Anderson announced that the particles were ‘positrons’ or particles with the same mass as electrons but positively charged.  Paul Dirac had predicted their existence in 1928.  Anderson’s claim was verified the next year by the British physicist Patrick M. S. Blackett.  In 1937, Anderson would also discover the short-lived meson.  Theoretical discoveries have named some two hundred more nuclear particles, but most, like the meson, are unstable.  

 

Gamma time

 

In his discovery of the positron, Anderson found that when gamma radiation of no less than 1.022 million electron volts (MeV) was discharged in any point of space, an electron and positron emerged from that point.  Anderson also found the converse to be true, that when an electron collides with a positron, the two particles disappear, and produce two gamma-ray quanta which disperse in opposite directions, but with a combined energy of 1.022 MeV. As one set of authors describe his discovery:

“On August 2, 1932, Anderson obtained a stunningly clear photograph that shocked both men. Despite Millikan’s protestations, a particle had indeed shot up like a Roman candle from the floor of the chamber, slipped through the plate, and fallen off to the left. From the size of the track, the degree of the curvature, and the amount of momentum lost, the particle’s mass was obviously near to that of an electron. But the track curved the wrong way. The particle was positive.  Neither electron, proton, or neutron, the track came from something that had never been discovered before. It was, in fact, a “hole,” although Anderson did not realize it for a while. Anderson called the new particle a “positive electron,” but positron was the name that stuck. Positrons were the new type of matter – antimatter – Dirac had been forced to predict by his theory.”  (Crease and Mann, ed., T. Ferris, p. 78)

 

An exciting discovery.  As with all ‘Science’, the key is the interpretation of what happened.  Given the above description, a valid inference is the following:

·       space is composed of a lattice of very stable electron-positron pairs

·       when the proper quanta of radiation are administered, these pairs will either temporarily deform the lattice or jolt the electrons and positrons out of alignment and release them into the view of our bubble chambers

 

This is what Anderson’s discovery amounts to.  There is no need to invent magical processes of ex-nihilo matter creation.  But of course, such an explanation is a problem for ‘The Science’.  It disproves Einstein’s Special Theory of Relativity (STR), much in vogue and well-funded by 1932, and the emerging (at that time) quantum mechanics model of the micro-universe (which also disproves STR but few are aware of this).  Anderson, like many others, had mechanically proven that space was not a void – it contained material and lots of it, a fact we know today to be true. 
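
For reference, the 1.022 MeV figure quoted earlier is exactly twice the electron (and positron) rest energy, m_e·c²; a quick check with standard constants:

```python
# The 1.022 MeV pair threshold is twice the electron rest energy, m_e * c**2.
M_E = 9.1093837015e-31       # electron mass, kg
C = 299_792_458.0            # speed of light, m/s
EV = 1.602176634e-19         # joules per electron-volt

rest_energy_mev = M_E * C**2 / EV / 1e6
print(f"electron rest energy ≈ {rest_energy_mev:.3f} MeV")      # ≈ 0.511 MeV
print(f"pair threshold       ≈ {2 * rest_energy_mev:.3f} MeV")  # ≈ 1.022 MeV
```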

 

The philosophers

After Anderson’s discovery, two eminent philosophers-of-science, Einstein and Werner Heisenberg, weighed in.  Einstein needed to save Relativity, and Heisenberg quantum mechanics.  Relativity theory holds that there is a physical relationship between energy and matter and that space is a vacuum containing no ‘ponderable ether’, to quote Einstein.  We know that both assumptions are wrong.  But few will have been told this.

 

In viewing Anderson’s 1932 result, Einstein had no choice but to conclude that the appearance and disappearance of the electron-positron pair was an example, as he called it, of the “creation and annihilation of matter.”  This ex-nihilo creation of matter is still the strongest proof offered for the formula E = mc², or energy equals the mass of an object times the speed of light squared.  Not only could energy magically become mass, but mass could magically transform into energy.  This formula has become the standard interpretation of all subatomic particles.  It is pure speculation and it is wrong.

 

Energy = what?

Einstein’s equation E = mc² gives the amount of energy that can be obtained if a mass m is completely turned into energy.  This relation can be turned around: if two gamma rays with total energy E collide, they may produce a mass m = E/c².  However, this is only possible if particles whose masses are E/c² or less can be created (visible light cannot turn into matter because there are no particles with small enough masses).  The smallest-mass particles are electrons (negatively charged) and positrons (positively charged), each with a mass corresponding to 0.511 MeV of energy.

The standard ‘Science’ description of how E=mc² works, and how matter is created ex-nihilo, is the following (a small numerical illustration follows the list):


  • Because an electric charge is never created or destroyed, electrons and positrons can only be created in pairs, one of each, with zero total charge.  


  • Two gamma rays, each of energy 0.511 MeV or more, colliding head-on, can therefore produce an electron-positron pair.  

  • If the collision is not head-on, then the necessary energy is greater.  

  • If the gamma rays have more energy than the minimum required, the extra appears as kinetic energy of the newborn matter – the electron and positron are born in motion (Katz, p. 46). 
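
A small numerical illustration of the energy bookkeeping in the description above; the two 1.0 MeV gamma-ray energies are hypothetical example values:

```python
# Head-on pair production bookkeeping, per the description above.
ELECTRON_REST_MEV = 0.511            # rest energy of an electron (and positron), MeV
threshold = 2 * ELECTRON_REST_MEV    # 1.022 MeV minimum to make an e-/e+ pair

gamma_1, gamma_2 = 1.0, 1.0          # hypothetical example: two 1.0 MeV gamma rays, head-on
surplus = (gamma_1 + gamma_2) - threshold
print(f"kinetic energy shared by the newborn pair ≈ {surplus:.3f} MeV")   # ≈ 0.978 MeV
```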

 

This means pace ‘The Science’, that matter is literally created out of nothing.  There is absolutely no proof whatsoever that matter can arise from nothing, and this contradicts the first law of thermodynamics.  So here we have Einstein and STR in direct opposition to a known and proven scientific principle. 

 

The first law of thermodynamics, also known as the law of conservation of energy, states that energy can neither be created nor destroyed, but it can be changed from one form to another.

 

Matter cannot simply be created out of nothing. For the equation to work, m must be defined and its origination declared. The equation achieves neither.

 

Dirac’s Dirge

 

English theoretical physicist Paul Dirac had predicted the discovery of the positron in 1928.  Dirac’s famous equation predicted that the entire universe was composed of electron-positron pairs, or as they are now termed, ‘electropons’.  The most unique aspect of Dirac’s analysis was that his equation required two sets of electropon pairs, positive pairs and negative pairs (Dirac, 1928).  Dirac, however, believed in an active, absolute ether, echoing the same belief found in Newtonian physics, Maxwell’s electromagnetism and the equations of Lorentz.  For ‘The Science’ the ether was anathema.  Relativity does not work with an ether.

 

In 1933 Dirac was awarded a Nobel Prize with Erwin Schrödinger for the discovery of new productive forms of atomic theory.  Dirac was famed as a founder of the quantum mechanics movement and quantum field theory, and as a critic of STR.  Ethereally self-created matter is a convenient philosophical position, not a scientific one, Dirac declared in 1933:

“To get an interpretation of some modern experimental results one must suppose that particles can be created and annihilated. Thus, if a particle is observed to come out from another particle, one can no longer be sure that the latter is composite. The former may have been created.  The distinction between elementary particles and composite particles now becomes a matter of convenience. This reason alone is sufficient to compel one to give up the attractive philosophical idea that all matter is made up of one kind, or perhaps two kinds, of bricks.” (Ferris, 1991, pp. 80- 81).

 

Even in Dirac’s world of quantum mechanics ‘space’ is filled with pairs of ‘virtual’ particles and antiparticles that are constantly materializing in pairs, separating, and then coming together again and annihilating each other.  These particles are called virtual because, unlike actual particles, they cannot be observed directly with a particle detector, yet according to Einstein, Hawking et al, they are self-created by the energy of universal gravitation (when in doubt always invoke ‘gravity’!).  ‘The Science’ maintains that these self-created pairs of matter can be measured, and their existence has been confirmed by a small shift (the “Lamb shift”) they produce in the spectrum of light from excited hydrogen atoms (Hawking, pp. 107-108).

 

Hawking and ‘The Science’ try too hard.  Dirac and many others who believed in the ether and the first law of thermodynamics proposed a more logical and less mystifying interpretation, namely, that the electron-positron pairs are not created through a gravitational-energy force but are already present, jarred loose by radiation.  Radiation itself obviously possesses mass and energy.  It is a ‘force’ which permeates space and which makes space travel, even to the moon, impossible for living creatures (the moon landing fraud).  This ‘Diracian’ interpretation would again destroy the Big Bang, Relativity and even quantum mechanics.  It is, however, the most obvious, sensible and reasonable.  Occam’s razor and all that.

 

Heisenberg’s hate

Einstein and his Relativity cult were not the only ones offended and horrified by Anderson’s positron experiment.  Quantum mechanics was also under threat.  Werner Heisenberg, the leader of the quantum movement, tried just about everything to destroy Dirac and his ether, except hiring an assassin.  Heisenberg loathed Dirac, referring to his work as “learned trash which no one can take seriously” (Werner Heisenberg, Letter to Wolfgang Pauli, February 8, 1934).  Open science, tolerance, bi-directional learning and all that. 

 

For six years Heisenberg and his colleagues tried to find an error in Dirac’s equation, but to no avail.  Failing miserably, they decided on deceit and mendacity.  Although Dirac’s equation required the negative energy electropon pairs to be raised to positive energy pairs, Heisenberg circumvented this process by claiming that the positive energy pairs were merely “created” and had no origin from negative energy.  Similarly, as Dirac’s equation required the positive energy pairs to go back intermittently to the negative energy state, Heisenberg reinterpreted this to mean that the positive pairs were “annihilated.”  

(Figure: Dirac’s equation, which baffled Heisenberg and predicted antimatter, i.e. positrons)

 

If there was any inadvertent crossover between the negative and positive, Heisenberg’s quantum mechanics coined the words “vacuum fluctuation” or “Zero-Point fluctuation” to take care of that problem.  More word salads from ‘The Science’.  Thus, we reveal the dubious origin of the “creation/annihilation” interpretation of Carl Anderson’s 1932 experiment, all due to fraud and the use of terminological inexactitudes.  Yet this corruption and deceit is now ‘mainstream science’ and ‘consensus’. 

 

Ether and the Light

The Anderson discovery was also important for another reason.  It revealed that space consists of very dense yet very stable electropon pairings, perhaps in some type of lattice or crystalline structure.  You would expect ‘The Science’ to understand that light travelling through a dense medium would be affected.  Physics had already been forced to consider this with Einstein’s Nobel Prize-winning 1905 theory of the photoelectric effect, the process by which a photon of the right frequency releases an electron from metal, confirmed by Arthur Compton in 1923.

 

It was therefore known by 1932 that light can be affected by, and produce, physical effects when it interacts with atomic particles.  The Sagnac experiment of 1913 had also revealed that the speed of light was inconstant due to the ether around the Earth.  There were literally thousands of interferometer results from 1867-1932 with measured Earth mobility of 1-4 km/sec, which identifies an ether.  This should have suggested to ‘The Science’ that light was being physically affected by some kind of substance in space.  We should already know that deeply held religious-scientific beliefs, such as the Earth moving at 108,000 km per hour, are not easily jettisoned.
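
For reference, the 108,000 km per hour figure is the conventional heliocentric orbital speed; a back-of-envelope check under the standard assumptions (circular orbit of radius 1 AU, period of one year):

```python
import math

# Conventional heliocentric orbital speed: circumference of a 1 AU circle divided by one year.
AU_KM = 1.496e8                    # astronomical unit, km
YEAR_HOURS = 365.25 * 24           # hours in a year

orbital_speed_kmh = 2 * math.pi * AU_KM / YEAR_HOURS
print(f"{orbital_speed_kmh:,.0f} km/h")   # ≈ 107,000 km/h, i.e. the ~108,000 km/h figure cited
```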

More here

Einstein was wrong. Quantum Mechanics, Plancktons and the real world of particles.

Relativity has long been dead. Scientism. Too much worship of Einstein, too much money at risk, too many world views which could implode. 'Save the phenomena' at all costs.


Prologue

One of the great myths of the modern age is that Einstein was the most prolific and important scientist in history.  This is curious since he was largely wrong and purloined most of his ideas from others.  Einstein was a philosopher with mathematical skills.  He was not a scientist, not a practical engineer, not a physical experimenter.  Einstein never bothered to perform a single mechanical proof to support his thought experiments.  He ignored his critics and would never directly respond to experiments which disproved Copernicanism, diurnal rotation or the obvious inconsistencies and illogical suppositions of Relativity. 

 

Many posts on this substack elaborate why most of his Special Theory of Relativity (STR) is wrong.  The General Theory of Relativity (GTR), which added back the ‘ether’ that STR withdrew, is only marginally more useful.  Dozens of scientists and literally hundreds of thousands of experiments have disproved Einsteinian physics, as posted on this substack.  An exception to the above would be time dilation, another concept which Einstein did not invent but which is attributed to him, and which annihilates long ages (gravity slows down ‘clocks’, meaning Earth time and space time are very different; a day on Earth could be a light year in space).

 

What most people do not understand is that Einstein was trying to ‘save the phenomena’ of Copernicanism and the Earth’s rotation, and to plug the holes in Newtonian gravity and mechanics.  Copernicanism has never been physically proven.  It is still just a theory.  Many posts outline why this is a factual statement.  Einstein failed to provide any evidence to support Copernicanism; he simply accepted it as a starting point and created make-believe theories and maths to discount and discredit 100 years or more of experiments which had thoroughly ‘debunked’ the idea of the Earth’s mobility.

 

This is why his theory was eventually supported and enshrined as dogma.  Now, unfortunately, we are mired in his paradigm of ‘Relativity’, a wrong turn which began long before Einstein, dating back to Galileo, and which is the nexus of a massive U$25 billion or larger per annum industry.  Money, power, prestige, awards, tenure and all that.  $cientism.

 

Not a vacuum

 

During the early 20th century ‘quantum mechanics’ was developed.  This micro-universe of particles and molecules was absolutely anathema to Einstein and his STR (Special Theory of Relativity) regime.  No ether, no material, no ‘ponderable matter’ was allowed to exist in Einstein’s make-believe world of space.  He called space a vacuum, the ultimate nothingness of nothing.  It was a complete void in his unproven theories.  This thought experiment is entirely incompatible with the evidence of our micro-universe and the composition of space, including radiation.  Quantum mechanics by itself disproved Einstein’s theories, but both will be replaced by the planckton universe (see below).  We should also mention the obvious: radiation permeates space and carries pressure and material impact, so space cannot be a vacuum.

 

Most if not all of today’s physical theorists believe that inner and outer space hold a vast assortment of particles and fields.  ‘Particles’ and ‘fields’ are words used interchangeably to denote some sort of physical matter in space.  One example is the concept of neutrinos.  Some believe that our universe is bathed in a primary ether particle, the neutrino.  Neutrinos are extremely small entities with a tiny mass which can apparently travel through the empty space of the atom, and do so at the speed of light.  Having no charge, they can only affect other masses by their high kinetic energy.  Fifty trillion of them are said to pass through our human body every second.

 

Neutrino-physics and the interaction with atomic particles, may help explain everything from gravity to how light travels, to how planets revolve around the Sun in either a Copernican or Tychonic system, with a neutrino ‘wind’ accounting for ‘inertia’ in planetary travel (see Tsau, 2005).  Many scientists also discuss particles that are even smaller than neutrinos, including gravitons, maximons, machions, etherons, axions, newtonites, higgsionos, bosons, etc.  String theorists like Brian Greene call these sub-neutrino particles the space-filling ether, the very ether that Einstein’s STR dismantled and his GTR later reimposed.

 

Ether complexity

 

In classical physics the idea of a vacuum is simply the absence of matter.  But this means nothingness and is fundamentally wrong.  So ‘The Science’ now redefines Einstein’s original STR idea of a vacuum, which meant nothing at all, as a ‘relativity ether’ in which the vacuum of space contains some mass and material, but not enough to affect movement or the transference of light or moving objects.  This is the classic word-salad to ‘save the phenomena’.  It is nonsense (more below).

 

In opposition to STR, quantum mechanics using the [Heisenberg] uncertainty principle forces physics to regard a ‘vacuum’ as a very complex system.  For example, a particle-antiparticle pair can ‘pop’ into existence in empty space, provided that the two annihilate each other in a time so short that the violation of energy conservation implicit in this process cannot be detected.

 

 The vacuum, then, is maybe more akin to a pan of popcorn than a featureless, vast empty hold of absolute nothing.  Nobel laureate Robert Laughlin offers a good summary:

“The existence and properties of antimatter are profoundly important clues to the nature of the universe….The simplest solution – and the one that turned out to be experimentally correct – was to describe space as a system of many particles similar to an ordinary rock. This is not a precisely correct statement, since Paul Dirac formulated the relativistic theory of the electron…but in hindsight it is clear that they are exactly the same idea…. This…has the fascinating implication that real light involves motion of something occupying the vacuum of space….The properties of empty space relevant to our lives show all the signs of being emergent phenomena characteristic of a phase of matter” (Laughlin, pp. 103-105).

 

So space is not empty; it is occupied.  Quantum mechanics has the ability to measure the effects of these particles.  It does not know what the particles are, nor can it accurately predict what these particles will do in every case (as opposed to being able to predict what atoms will do).  This is why physicists refer to particles that ‘pop in and out of existence’, and this is also why quantum mechanics theory will eventually be replaced.

“…according to quantum mechanics, empty space is not empty. Rather, the vacuum is filled with fields and particles that constantly pop in and out of existence. The problem is that when physicists estimate how much energy is contained within those fields and particles, they come up with a number…that is insanely large, 10¹²⁰ times greater than what we observe” (Discover, October 2005, p. 56).

 

There are only about 10⁸⁰ particles in the universe.  Where these particles come from and where they go is unknown (Trefil, p. 100).

 

Plancking to the Max

 

Adding to the quantum complexity are ‘Planck’ dimensions, named after the physicist Max Planck (1858 – 1947) for his formulation of the quantum of action ħ, the smallest unit of action.  It is in this world that lengths come as small as 10⁻³³ cm, mass as ethereal as 10⁻⁵ grams, and time as short as 10⁻⁴⁴ seconds.  Comparing the Planck length to the size of an atom (10⁻¹³ cm) or an electron (10⁻²⁰ cm), a Planck particle is 100 million trillion times smaller than an atom, and 10 million million times smaller than an electron!

Planck length is derived from the formula √(Għ/c³), where G is the gravitational constant, ħ is Planck’s constant of angular momentum, and c is the speed of light (see Ginzburg, 1976).
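
A short numerical check of the Planck scales quoted above, using standard constant values; the Planck time comes from the analogous formula √(Għ/c⁵):

```python
import math

# Planck length l_P = sqrt(G*hbar/c**3) and Planck time t_P = sqrt(G*hbar/c**5), in SI units.
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 299_792_458.0        # speed of light, m/s

planck_length_cm = math.sqrt(G * HBAR / C**3) * 100   # metres -> centimetres
planck_time_s = math.sqrt(G * HBAR / C**5)

print(f"Planck length ≈ {planck_length_cm:.2e} cm")   # ≈ 1.6e-33 cm, the 10⁻³³ cm above
print(f"Planck time   ≈ {planck_time_s:.2e} s")       # ≈ 5.4e-44 s, the 10⁻⁴⁴ s above
```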

 

How does modern science know ‘plancktons’ exist?  The logic of quantum physics leads them there.  Stephen Hawking describes it as part of the uncertainty principle:

“[T]he uncertainty principle means that even “empty” space is filled with pairs of virtual particles and antiparticles…(unlike real particles, they cannot be observed directly with a particle detector)….If it weren’t – if “empty” space were really completely empty – that would mean that all the fields, such as the gravitational and electromagnetic fields, would have to be exactly zero. However, the value of a field and its rate of change with time are like position and velocity of a particle: the uncertainty principle implies that the more accurately one knows one of these quantities, the less accurately one can know the other. So, if a field in empty space were fixed at exactly zero, then it would have both a precise value (zero) and a precise rate of change (also zero), in violation of that principle. Thus there must be a certain minimum amount of uncertainty, or quantum fluctuations, in the value of the field” (Hawking, pp. 122-123).

 

Quantum fluctuations must therefore exist and, by default, insanely small particles are popping in and out of existence.  The apparent appearance and disappearance of a planckton takes about 10⁻⁴⁴ seconds.  Some physicists even describe these particles as ‘virtual’, appearing and disappearing through black holes (John Wheeler, 1957).  Wheeler also wrote that black holes, if they existed, presented a huge problem for physics (“Those Baffling Black Holes,” Time, Sept. 4, 1978).

 

Wheeler also stated in a speech: “To me, the formation of a naked singularity (i.e. a black hole) is equivalent to jumping across the Gulf of Mexico. I would be willing to bet a million dollars that it can’t be done. But I can’t prove that it can’t be done” (New York Times, March 10, 1991).  Hawking agreed with Wheeler, describing space as alive with ‘turbid random activity and gargantuan masses’, while ‘wormholes’ provide passage to other universes (Hawking (b), pp. 104-123).  Before he died, Hawking recanted his support of black holes, his entire life’s work in essence, saying they are incompatible with quantum mechanics.   More here