By Tushna Commissariat
After months of rumours, speculation and some 500 papers posted to the arXiv in an attempt to explain it, the ATLAS and CMS collaborations have confirmed that the small excess of diphoton events, or “bump”, at 750 GeV detected in their preliminary data is a mere statistical fluctuation that has disappeared in the light of more data. Most folks in the particle-physics community will have been unsurprised if a bit disappointed by today’s announcement at the International Conference on High Energy Physics (ICHEP) 2016, currently taking place in Chicago.
The story began around this time last year, soon after the LHC was rebooted and began its impressive 13 TeV run, when the ATLAS collaboration saw more events than expected around the 750 GeV mass window. This bump immediately caught the interest of physicists the world over, simply because there was a sniff of “new physics” about it: the Standard Model of particle physics does not predict a particle at that energy. It was also the first interesting data to emerge from the LHC since its momentous discovery of the Higgs boson in 2012 and, had it held up, would have been one of the most exciting discoveries in modern particle physics.
According to ATLAS, “Last year’s result triggered lively discussions in the scientific communities about possible explanations in terms of new physics and the possible production of a new, beyond-Standard-Model particle decaying to two photons. However, with the modest statistical significance from 2015, only more data could give a conclusive answer.”
And that is precisely what both ATLAS and CMS did, by analysing a 2016 dataset nearly four times larger than last year’s. Sadly, the two years’ data taken together reveal that the excess is not significant enough to be a real particle. As the ATLAS statement puts it: “The compatibility of the 2015 and 2016 datasets, assuming a signal with mass and width given by the largest 2015 excess, is on the level of 2.7 sigma. This suggests that the observation in the 2015 data was an upward statistical fluctuation.” The CMS statement is succinctly similar: “No significant excess is observed over the Standard Model predictions.”
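For readers wondering what a figure like “2.7 sigma” corresponds to, the conversion from a Gaussian significance to a one-sided tail probability is a short calculation. This is only an illustrative sketch of the standard conversion, not part of either collaboration’s actual analysis:

```python
import math

def sigma_to_pvalue(sigma):
    """One-sided tail probability of a Gaussian excess of `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(sigma_to_pvalue(2.7))  # ~0.0035, i.e. roughly a 1-in-290 chance as a pure fluctuation
print(sigma_to_pvalue(5.0))  # ~2.9e-7, the conventional "5 sigma" discovery threshold
```

Note that these are *local* probabilities; the look-elsewhere effect (searching many mass windows at once) makes a 2.7 sigma coincidence between datasets even less remarkable.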
Tommaso Dorigo, blogger and CMS collaboration member, tells me that it is wisest to “never completely believe in a new physics signal until the data are confirmed over a long time” – preferably by multiple experiments. More interestingly, he tells me that the 750 GeV bump looked like a “similar signal” to the early Higgs-to-gamma-gamma data the LHC physicists saw in 2011, when they were still chasing that particle. In that case, more data were obtained and the Higgs “bump” went on to become an official discovery. With the 750 GeV bump, the opposite has happened. “Any new physics requires really really strong evidence to be believed, because your belief in the Standard Model is so high and you have seen so many fluctuations go away,” says Dorigo.
And this is precisely what Columbia University’s Peter Woit – who blogs at Not Even Wrong – told me in March this year when I asked him how he thought the bump would play out. Woit pointed out that particle physics has a long history of “bumps” that may look intriguing at first glance, but will most likely be nothing. “If I had to guess, this will disappear,” he said, adding that the real surprise for him was that “there aren’t more bumps” considering how good the LHC team is at analysing its data and teasing out any possibilities.
It may be fair to wonder just why so many theorists decided to work with last year’s unconfirmed data, seeking an explanation of what kind of particle it might have been – and indeed, Dorigo says that “theorists should have known better”. But on the flip side, the Standard Model predicted many a particle long before it was eventually discovered, so it is easy to see why many were keen to come up with the perfect new model.
Despite the hype and the eventual letdown, Dorigo is glad that this bump has got folks talking about high-energy physics. “It doesn’t matter even if it fizzles out; it’s important to keep asking ourselves these questions,” he says. The main reason for this, Dorigo explains, is that “we are at a very special junction in particle physics as we decide what new machine to build” and some input from current colliders is necessary. “Right now there is no clear direction,” he says. In light of the fact that there has been no new physics (or any hint of supersymmetry) from the LHC to date, the most likely future devices would be an electron–positron collider or, in the long term, a muon collider. But a much clearer indication is necessary before these choices are made and for now, much more data are needed.
It is true that the Standard Model still “hangs on strong”.
However, here are a few well-known problems with the SM.
1. The Standard Model is primarily a heuristic model with 26-30 fundamental parameters that have to be “put in by hand”.
2. The Standard Model did not and cannot predict the masses of the fundamental particles that make up all of the luminous matter that we can observe. QCD still cannot retrodict the mass of the proton without considerable fudging, and even then it is only good to within 5%. As for retrodicting the mass of the electron, the SM cannot even make an attempt.
3. The Standard Model did not and cannot predict the existence of the dark matter that constitutes the overwhelming majority of matter in the cosmos. The Standard Model describes heuristically the “foam on top of the ocean”.
4. The vacuum energy density crisis clearly suggests a fundamental flaw at the very heart of particle physics. The VED crisis involves the fact that the vacuum energy densities predicted by particle physicists (microcosm) and measured by cosmologists (macrocosm) differ by up to 120 orders of magnitude (roughly 10^70 to 10^120, depending on how one ‘guess-timates’ the particle physics VED).
5. The conventional Planck mass is highly unnatural, i.e., it bears no relation to any particle observed in nature, and calls into question the foundations of the quantum chromodynamics sector of the Standard Model.
6. Many of the key particles of the Standard Model have never been directly observed. Rather, their existence is inferred from secondary, or more likely, tertiary decay products. Quantum chromodynamics is entirely built on inference, conjecture and speculation. It is too complex for simple definitive predictions and testing.
7. The standard model of particle physics cannot include the most fundamental and well-tested interaction of the cosmos: gravitation, i.e., general relativity.
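Point 4 in the list above can be made concrete with a standard back-of-envelope estimate: take the Planck energy density as a stand-in for the naive particle-physics prediction (a Planck-scale cutoff on vacuum modes) and compare it with the dark-energy density inferred by cosmologists. The constants below are standard SI values; the exact exponent depends on which cutoff one chooses, which is why the comment quotes a range:

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J s
c = 3.0e8          # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

# Planck energy density: a natural (if naive) cutoff-scale estimate
rho_planck = c**7 / (hbar * G**2)   # J/m^3, ~1e113

# Dark-energy density inferred from cosmological observations
rho_obs = 6e-10                     # J/m^3, order of magnitude

orders = math.log10(rho_planck / rho_obs)
print(f"discrepancy: roughly 10^{orders:.0f}")  # roughly 10^123 with a Planck cutoff
```

Lower cutoffs (e.g. the electroweak or QCD scale) shrink the exponent, which is how estimates spanning roughly 10^70 to 10^120 arise.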
True, the SM has no single clear method for calculating the masses of its particles, for the concept of mass is debatable and difficult to conceptualise in its finalities. It is absolutely true that mass is total energy content, as in E = mc², and the question is whether all massive objects are made of the same kind of stuff.
There was no way for the SM to predict DM and DE, because of the difficulties of understanding gravitation, mass and the vacuum energy, and of establishing beyond doubt whether the expansion is accelerating. The vacuum energy is a misunderstood term. It should, however, be possible to say what it really is and to give it a proper value. I would say the value is very near to zero; that is why the Heisenberg uncertainty principle exists. Had it been much higher, the universe would not exist, for many obvious reasons.
The conventional Planck mass sounds to me like a reasonable concept, unless proved otherwise, and I cannot see how that is possible.
The root of the conceptual problems with particles, energy, mass, gravity and so on is that we do not really know what kind of thing the universe is, which is why we hope to propose entirely new concepts that will give physics, maths and philosophy far greater opportunities to interact meaningfully. It is impossible to try to explain a totally materialist universe.
Finally, as Maraboli said, the universe is such a balanced place, that the moment you see a problem there has to exist a solution. The point is that science exists to ensure we do not upset the balance, which has been the outcome of the greatest conception that can be contemplated.
The problem of mass within theoretical physics has two aspects. In the first case, mass is in fact a fundamental notion and therefore cannot be interpreted more precisely. In order to define mass, we need a larger system of fundamental notions in which the mechanisms of motion of elementary particles are described. Mass is then resistance to acceleration, and can be interpreted only in the context of those mechanisms of motion. This can be done within vacuum-medium mechanics, where the vacuum medium is a multicomponent medium. The second aspect is associated with the equivalence of mass and energy. This step lowers the resolution with which a theory sees reality, since properties of different types are treated as the same. We should therefore abandon all equivalence laws when constructing physical theories.
One should not be surprised that the diphoton “bump” at 750 GeV seen in the ATLAS and CMS data turned out to be just a statistical fluctuation, but for the future of particle physics the LHC has to keep trying, over the next few years, to find something beyond the particle SM, which, after all, is just an effective field theory.
I think we will find much more inspiration and potential for real progress coming from astrophysical research rather than from the relentless and probably mistaken absolute reductionism of particle physics. We shall see.
if not SUSY nor BSM,
ToE – (SM + GR) =
goo.gl/fZCQnt
Within the Standard Model, geometry plays too large a role. Therefore, in my opinion, the main problem of the SM rests on the description of dynamics at smaller scales. In particular, within this theory we are not able to describe, for instance, the mechanisms of biological evolution at the most fundamental level. This phenomenon is associated precisely with dynamics. The persistence of life indicates that biological evolution is governed by processes within the attracting set of an attractor with considerable power; this persistence is an experimental fact carried out by nature. The dynamics described by the SM is too poor to allow us to identify this attractor, which in turn is a consequence of the SM’s too-poor system of fundamental notions. In summary, the SM does not see dynamics sufficiently well.
All of the interactions in Nature obey local gauge symmetry and, indeed, are geometric in nature, expressed through algebraic equations.
I think that other geometries with additional symmetries, as well as new hypotheses, should be fully explored in order to estimate their connections with reality. This is the traditional way, associated with the use of mathematics, and its dominant aspect is the correctness of the application of that mathematics. However, theoretical physics has two aspects: the first is the correct application of mathematics; the second is the fitting of this mathematics to reality. The second aspect is largely a matter of methodology and is rarely discussed by physicists. I suggest that theoretical physicists make a larger effort towards the methodology of constructing physical theories. This can be done through more careful introduction of assumptions and estimation of their status. It is precisely the status and number of the introduced assumptions that determine how well a theory fits reality. Consequently, within this methodology we should modify the SM by a jump to a new, larger system of fundamental notions, in order to see dynamics better.
Trackback: The Nightmare Scenario | Not Even Wrong
Trackback: No fail no gain – TLP
@ Robert:
The statement that the SM cannot explain DM is incorrect: there is still the neutrino. If it has a mass of about 2 eV and is of Dirac type, so that it has a sterile right-handed partner, the case is still viable from the particle-physics side. The mounting evidence against sterile neutrinos refers to Majorana neutrinos. On the cosmological side, lensing from the cluster A1689 is a clear pro, while BBN, the CMB and structure formation are cons.
Well, those are mighty big ifs.
I have not seen a significant number (any?) of papers by DM researchers who are advocating this particular candidate. The consensus is that conventional neutrinos would not meet the requirements of DM.
The “sterile neutrino” candidacy was recently dealt a major blow by a newly reported experiment covered in all the major scientific sources.
100,000,000,000 MACHOs in the hand are a wiser choice than 0 mythical particles in the imagination.