A coupling is a connection in the paths of particles viewed in a diagram in which one axis represents the flow of time. A triple coupling could be one particle decaying into two or two particles colliding and forming one. In the Standard Model, W-W-photon is allowed, but Z-Z-photon is not.
Most everyday phenomena, from magnetism to the chemical processes of life, are due to exchanges of photons. The only exceptions are gravity and radioactive decay (if you consider that an “everyday phenomenon”). Despite this apparent universality, the photon is only the first in a family of intermediate particles: Z bosons are like photons except that they have mass, and W+ and W− bosons are massive and also charged.
These four bosons, Z, W+, W− and the photon, can also interact with each other. Since a W boson is charged, it can emit a photon and change its trajectory. Viewed in a space-time diagram (above) this is known as a coupling between the initial W, the final W and the photon. Similarly, W, W and Z can interact, but couplings between three or more neutral bosons, such as Z-photon-photon, Z-Z-photon and Z-Z-Z, are not allowed in the Standard Model and have never been observed. This does not mean they’re impossible, though.
New phenomena beyond the Standard Model could make these so-called anomalous couplings possible. A recent study by CMS scientists searched for evidence of such couplings by looking for two Z bosons in the same event. Why not three? Take a look at the space-time diagram above: In a three-particle coupling, at least one of the particles comes from the initial state, so only two appear in the final state.
However, other Standard Model processes can make pairs of Z bosons, so finding two Z’s is not enough. To determine whether the Z pair is due to an anomalous coupling or one of the allowed Standard Model processes, the scientists studied the momentum distribution of one of the Z’s and the relative rate of ZZ production in 7- and 8-TeV collisions. The effects of anomalous couplings would grow with collision energy.
This search revealed exactly as many pairs of Z bosons as the Standard Model predicts (within uncertainties) and no anomalous couplings. In fact, this analysis combined with a previous CMS result places the most stringent limits on anomalous couplings to date, sharpening our view of what might lie beyond the Standard Model.
Photograph of out-of-focus Christmas tree lights (large circles) that instead focuses on dust on the lens of the camera (small dark spots with rings). The rings around the dust are caused by diffraction. Photo: Jon Rista
In a bright light, you can sometimes see specks in your field of vision that resemble the picture above. If you try to turn your eyes to look at them, they move because these objects are sitting on your eyeball — blinking jostles them. They may be dust grains, cells or bubbles, but they are small enough to diffract light, making concentric rings like the ones shown in the picture.
Diffraction is a general phenomenon in which waves curl around small objects, with more or less intensity at different angles, like light around a dust grain. Protons in the LHC also diffract because protons are quantum mechanical waves. Sometimes protons simply bounce off each other, making ring-like patterns exactly like light bouncing off dust, but more often one or both of the protons break apart.
Diffraction in which one or both protons dissociate (break apart) is important to measure at LHC energies because it is a basic property of proton collisions that can’t be calculated from first principles. In a recent paper, CMS scientists measured the probability of two protons diffracting with single dissociation (one breaks apart) and double dissociation (both break apart).
This measurement is unlike most performed at the LHC. Most LHC measurements involve a search for exceedingly rare phenomena, events that occur in one out of 10 trillion collisions, but diffractive events occur in about one out of four collisions. Instead of describing how individual particles decay, this analysis studies collective effects in the combined two-proton system.
One challenge of dealing with the combined system is the problem of determining which particles came from which proton when both protons break apart. In practice, physicists distinguish the two clouds of debris by looking for cases with a large angular separation between them relative to the beamline, but the available angular range is limited by the size of the CMS detector. This analysis used an additional detector, CASTOR, which extends the combined coverage all the way from 90 degrees to a 10th of a degree from the beamline.
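To make "a 10th of a degree from the beamline" concrete, collider physicists usually express such angles as pseudorapidity, η = −ln tan(θ/2). The sketch below is my own illustration of that conversion, not code from the analysis:

```python
import math

def pseudorapidity(theta_degrees):
    """Convert a polar angle measured from the beamline (in degrees)
    to pseudorapidity, eta = -ln(tan(theta/2))."""
    theta = math.radians(theta_degrees)
    return -math.log(math.tan(theta / 2))

# Perpendicular to the beam (90 degrees) corresponds to eta = 0,
# while a 10th of a degree from the beamline is eta of about 7 --
# the very forward region that CASTOR helps cover.
print(pseudorapidity(90.0))
print(pseudorapidity(0.1))
```

The logarithm is why forward detectors like CASTOR matter so much: each extra unit of η squeezes exponentially closer to the beamline.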
Though weird, this study has far-reaching implications. By improving the world’s knowledge of collective effects in proton collisions, it allows these effects to be better simulated in predictions of nearly every other quantity measurable at the LHC.
The three-fold symmetry of electrons, muons and taus may be broken by Higgs decays. (Design adapted from a neolithic spiral and the flag of Sicily.)
In Rendezvous with Rama, Arthur C. Clarke imagined an artifact built by aliens who have three arms with three fingers each, so everything about it has a three-fold symmetry. One could argue that our fondness for bilateral symmetries (in the design of cars, planes, cathedrals, etc.) comes from the ubiquity of this shape in life on Earth, and creatures from other worlds might have developed differently. However, it is more surprising to find such a pattern imprinted on the universe itself.
All particles of matter appear in threes: three generations of leptons and three generations of quarks. The second generation is a complete copy of the first with heavier masses, and the third generation is yet another copy. For instance, a muon is a heavy version of an electron, and a tau is a heavy muon. No one knows why matter comes in triplicate like this.
For quarks, the symmetry isn’t perfect because W bosons can turn quarks of one generation into quarks of another generation. Something else transforms generations of neutrinos. But charged leptons — electrons, muons and taus — appear to be rigidly distinct. Some physicists suspect that we simply haven’t found the particle that mixes them yet.
Or perhaps we have: Theoretically, the Higgs boson could mix lepton generations the way that the W boson mixes quarks. The Higgs decay modes haven’t all been discovered yet, so it’s possible that a single Higgs could decay into two generations of leptons at once, such as one muon and one tau. CMS scientists searched for muon-tau pairs with the right amount of energy to have come from a Higgs boson, and the results were surprising.
They saw an excess of events. That is, they considered all the ways that other processes could masquerade as Higgs to muon-tau decays, estimated how many of these spurious events they should expect to find, and found slightly more. The word “slightly” should be emphasized: The excess is on the border of statistical significance, and other would-be discoveries at this level of significance (and stronger) have turned out to be flukes. On the other hand, if the effect is real, it would first appear as a weak signal like this one and strengthen as more data are collected.
Naturally, the physics community is eager to see how this develops. The LHC, which is scheduled to restart soon at twice the energy of the first run, has the potential to produce Higgs bosons at a much higher rate — perhaps enough to determine whether this three-fold symmetry of leptons is broken or not.
Despite the difference in scale, medical PET scanners and the CMS experiment both observe the remnants of particle collisions and use them to reconstruct the source.
Antimatter is becoming increasingly common in medical technology. One technique, called positron emission tomography (PET scan), follows the flow of a positron-emitting fluid through the body. Wherever an antimatter positron encounters a normal-matter electron, the pair annihilates and produces particles whose trajectories trace back to the point of annihilation, allowing a computer to reconstruct the shape of structures in the body.
Orders of magnitude smaller and at orders of magnitude higher energies, the shape of a single proton is being reconstructed by CMS scientists using a similar technique. A proton is not a smooth sphere or a pinpoint — it is a seething ball of quarks, antiquarks and gluons that are constantly annihilating each other and creating new matter-antimatter pairs. If you could zoom into a proton, you would see more high-energy, short-lived quarks and antiquarks the closer you looked. But exactly how many more at each level of magnification is so difficult to calculate theoretically that it can only be measured and parametrized.
Whenever two protons collide, most of their constituent particles miss each other — usually only one part of each proton actually hits the other. When one of the stricken particles is a quark and the other is an antiquark, they annihilate and may create an electron-positron pair or a muon-antimuon pair, which are observed by the CMS detector. The frequency of these collisions, as a function of their energies and the angles of their trajectories, allows scientists to actually see the proton at different levels of magnification.
The CMS collaboration recently published a thorough study of quark-antiquark to electron-positron and muon-antimuon pairs, known collectively as Drell-Yan scattering, as a function of several variables: the incident beam energy (7 and 8 TeV), the energy of the recoiling particle-antiparticle pair and their angles relative to the beamline. These measurements are so precise that they can be used to improve models of the shape of the proton: Some variations of the model parameters are compatible with the measurement, others are not. In that sense, it is like a scan of the shape of the proton.
This measurement has implications beyond just understanding the proton better: Nearly every other LHC result depends on these models. Measuring Drell-Yan scattering with exquisite precision is an indirect way of shrinking uncertainties in Higgs discoveries and extending the reach of hundreds of searches for new physics.
Distribution of mass calculations for pairs of jets: Black points are real data and colored peaks show what two sample models would look like (W' and excited quarks). See the paper for detailed plots.
In this new measurement, CMS scientists identified proton collisions that produce two jets and, assuming that these jets are the only products of a single-particle decay, they computed the mass of that particle. In most cases, the two jets don’t really come from a particle with a well-defined mass, so this interpretation produces a smooth distribution of masses.
Many theories of new physics predict new kinds of particles, and many of them would decay into pairs of jets. If one of these new particles exists, it would show up in the two-jet-mass plot as a narrow peak at a single mass on top of the smooth distribution. In a recent paper, CMS scientists extended the search for peaks up to 5 TeV. Since they didn’t see any, they systematically excluded theoretical models within that mass range.
I have oversimplified the search procedure a bit: The scientists used additional tools, such as methods to determine whether the jets came from b quarks, and tuned some searches for special cases, such as wide jets. This paper actually represents a constellation of new-physics searches based on the central theme of peaks in the two-jet mass distribution.
In rapid succession, the models are: (1) resonances from string theory, on the assumption that the energy scale for string theory is much lower than it appears to be; (2) scalar diquarks from grand unified theories; (3) excited quarks from oscillations of smaller particles that might be inside the quarks; (4) axigluons and colorons, which would be heavy cousins of gluons in extended models of the strong force; (5) W’ and Z’ bosons, heavy cousins of the W and Z bosons that mediate the weak force; (6) Randall-Sundrum gravitons, or gravitons oscillating between our string-theory brane and another; and (7) microscopic black holes. Yes, these are the hypothetical black holes that caused so much controversy just before the LHC turned on, and there’s still no sign of them.
A branch that is one-dimensional to a chameleon is two-dimensional to an ant, and particles or waves that travel along the short dimension can loop around and even resonate.
Perhaps the most surprising thing about the LHC is that it has the potential to discover new dimensions. This is strange because dimensions are mutually perpendicular directions, like length, width and height — a new dimension would be a direction that is perpendicular to all three. Not only is it hard to believe that such a thing could have gone unnoticed until now, but how could colliding protons reveal it?
If a fourth dimension (not counting time) were exactly like length, width and height, we would have always known about it. We would describe the size of a box with four numbers, rather than three. When physicists speak of “extra dimensions,” they mean one or more dimensions that do not affect our macroscopic world, either because we’re stuck to a three-dimensional slice of the larger-dimensional space or because the extra dimension loops back on itself: If you travel far enough along it, you end up where you started, and “far enough” is a fraction of a proton’s width.
In one popular theory, both effects are responsible for hiding extra dimensions. The dimensions are small, and all particles are stuck to our three-dimensional slice except gravitons. This theory could explain why gravity is so weak compared to electromagnetism and nuclear forces — most gravitons would be lost in the extra dimensions.
In such a scenario, colliding protons would reveal the extra dimensions by creating a resonance of gravitons spinning around the extra dimensions. That is, the collision would create gravitons that go into the extra dimensions, loop around them, and arrive where they started. At the right energy, the gravitons would resonate like a ringing bell. The final result of this resonance would be to produce more particles, which can be observed by a detector like CMS.
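The resonance condition can be made quantitative: For a dimension that loops back on itself with radius R, only whole numbers of wavelengths fit around the loop, giving a tower of resonance masses m ≈ n·ħc/R. The numbers below are my own illustrative choices, not values from any analysis:

```python
# hbar*c in GeV * femtometers (a standard conversion constant)
HBARC_GEV_FM = 0.1973

def kk_mass_gev(n, radius_fm):
    """Mass of the n-th graviton resonance for an extra dimension of
    radius R, in the simplest 'ringing bell' picture (illustrative)."""
    return n * HBARC_GEV_FM / radius_fm

# An assumed radius of 1e-4 fm (a ten-thousandth of a proton's size)
# would put the first resonance near 2000 GeV -- LHC territory.
print(kk_mass_gev(1, 1e-4))
```

The inverse relationship is the key point: the smaller the looped dimension, the heavier the resonances, and hence the higher the collision energy needed to ring the bell.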
The problem is that ordinary collisions also produce lots of particles: How would ordinary particle production be distinguished from extra dimensions? A group of CMS scientists approached the problem by measuring angular distributions of the observed particles, since extra dimensions would produce a different angular distribution than ordinary collisions. In fact, these scientists also used the angular distribution to determine if quarks, the constituents of protons, are themselves made of smaller particles.
The result was that no extra dimensions or quark substructure was seen, at scales that are 10 thousand times smaller than a proton’s radius.
Muons (red) are about 17 times lighter than tau leptons (blue), so we expect Higgs decays to muon pairs to be about 300 times less common than Higgs decays to tau pairs.
Since the discovery of the Higgs boson two years ago, about 80 analyses have helped to pin down its properties. Today, we know that it does not spin, that it is mirror-symmetric, and that it decays into pairs of W bosons, pairs of Z bosons, pairs of tau leptons, and pairs of photons (through a loop of short-lived top quarks). There are even weak hints at a fifth decay mode: decays into pairs of b quarks. All of these results are in agreement with expectations for a Standard Model Higgs boson, but they are still coarse measurements with significant uncertainties.
To say that this boson is a Standard Model Higgs is to say that it is exactly the particle that was predicted in 1964. The measurements so far agree with that prediction, but only coarsely, which leaves a lot of room for surprises. Without interference from new phenomena, the rate at which this boson decays into particle-antiparticle pairs would be proportional to the square of the mass of the particles in the pair. The best way to check a proportionality is to test it over an extreme range. Since the Higgs is believed to give mass to everything from 0.0005-GeV electrons to 173-GeV top quarks, there’s plenty of room to check.
The highest decay rates are easiest to detect, so only the heaviest particle-antiparticle pairs have been tested so far. The lightest particle-antiparticle decay that has been observed is Higgs to pairs of tau leptons, which are 1.8 GeV each. The next-lighter final state that could be observed is Higgs to pairs of muons, which are 0.1 GeV each. By the expected scaling, Higgs to muon pairs should be 300 times less common. However, muons are easy to detect and clearly identify, so they make a good target.
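The "300 times" figure follows directly from the mass-squared scaling. Here is the arithmetic, using approximate published lepton masses as inputs (my sketch, not the analysis):

```python
# Approximate lepton masses in GeV (assumed inputs)
M_TAU = 1.77686
M_MU = 0.10566

# If the decay rate scales as the particle's mass squared, the muon
# mode is suppressed relative to the tau mode by the squared mass ratio.
suppression = (M_TAU / M_MU) ** 2
print(suppression)  # roughly 283 -- "about 300 times less common"
```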
Even if you combine all the LHC data collected so far, it would not be enough to see evidence of this decay mode. However, the LHC is scheduled to restart next spring at almost twice its former energy. Higher energy and more intense beams would produce more Higgs bosons, making a future detection of Higgs to muon pairs possible.
To prepare for such a discovery and find potential problems early, CMS scientists searched for Higgs to muon pairs in the current data set. They didn’t find any, but they did establish that no more than 0.16 percent of Higgs bosons decay into muons, only a factor of 7 from the expected number, and then they used these results to project sensitivity in future LHC data. Incidentally, the Higgs boson is the first particle known to decay into tau lepton pairs much more (6.3 percent) than muon pairs (0.023 percent). All other particles decay into taus and muons almost equally.
They also searched for Higgs decays into electrons, the lighter cousin of muons and tau leptons. Since electrons are about 200 times lighter than muons, Higgs to electron pairs is expected only 0.00000051 percent of the time. None were found, though an observation would have been an exciting surprise!
Most particles produced by proton collisions originate in the point where the beams cross. Those that do not are due to intermediate particles that travel some distance before they decay.
The main concern for most searches for rare phenomena is to control the backgrounds. Backgrounds are observations that resemble the one of interest, yet aren’t. For instance, fool’s gold is a background for gold prospectors. The main reason that the Higgs boson was hard to find is that most Higgs decays resemble b quark pair production, which is a million times more common. You not only have to find the one-in-a-million event, you have to identify some feature of it to prove that it is not an ordinary event.
This is particularly hard to do in proton collisions because protons break apart in messy ways — the quarks from the proton that missed each other generate a spray of particles that fly off just about everywhere. Look through a billion or a trillion of these splatter events and you can find one that resembles the pattern of new physics that you’re looking for. Physicists have many techniques for filtering out these backgrounds — requiring missing momentum from an invisible particle, high energy perpendicular to the beam, a resonance at a single energy, and the presence of electrons and muons are just a few.
A less common yet powerful technique for eliminating backgrounds is to look for displaced particle trajectories, meaning trajectories that don’t intersect the collision point. Particles that are directly created by the proton collision or are created by short-lived intermediates always emerge from this point. Those that emerge from some other point in space must be due to a long-lived intermediate.
A common example of this is the b quark, which can live as long as a trillionth of a second before decaying into visible particles. That might not sound like very long, but the quark is traveling so quickly that it covers several millimeters in that trillionth of a second, which is a measurable difference.
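The "several millimeters" claim is a quick back-of-the-envelope calculation. The lifetime and boost factor below are typical assumed values, not numbers from the analysis:

```python
C = 3.0e8            # speed of light, in meters per second
LIFETIME = 1.5e-12   # typical b hadron proper lifetime, in seconds
BOOST = 10           # assumed Lorentz boost factor at LHC energies

# Time dilation stretches the lifetime by the boost factor, so the
# flight distance is boost * c * proper lifetime.
flight_distance = BOOST * C * LIFETIME
print(flight_distance * 1000)  # in millimeters: a few mm
```

A few millimeters is tiny, but the CMS silicon tracker resolves displacements far smaller than that, which is why the technique works.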
In a recent analysis, CMS scientists searched for displaced electrons and muons. Displaced tracks are rare, and electrons and muons are also rare, so displaced electrons and muons should be extremely rare. The only problem with this logic is that b quarks sometimes produce electrons and muons, so one other feature is needed to disambiguate. A b quark almost always produces a jet of particles, so this search for new physics also required that the electrons and muons were not close to jets.
With these simple selection criteria, the experimenters found only as many events as would be expected from standard physics. Therefore, it constrains any theory that predicts displaced electrons and muons. One of these is “displaced supersymmetry,” which generalizes the usual supersymmetry scenario by allowing the longest-lived supersymmetric particle to decay on the millimeter scale that this analysis tests. Displaced supersymmetry was introduced as a way that supersymmetry might exist yet be missed by most other analyses. Experiments like this one illuminate the dark corners in which supersymmetry might be hiding.
A recent CMS study cast a wide net and put limits on three theories of physics beyond the Standard Model: dark matter, extra dimensions and unparticles.
There is a common misconception that the LHC was built only to search for the Higgs boson. It is intended to answer many different questions about subatomic particles and the nature of our universe, so the collision data are reused by thousands of scientists, each studying their own favorite questions. Usually, a single analysis only answers one question, but recently, one CMS analysis addressed three different new-physics scenarios: dark matter, extra dimensions and unparticles.
The study focused on proton collisions that resulted in a single jet of particles and nothing else. This can only happen if some of the collision products are invisible — for instance, one proton may emit a jet before collision and the collision itself produces only invisible particles. The jet is needed to be sure that a collision took place, but the real interest is in the invisible part.
Sometimes, the reason that nothing else was seen in the detector is mundane. Particles may be lost because their trajectories missed the active area of the detector or a component of the detector was malfunctioning during the event. More often, the reason is due to known physics: 20 percent of Z bosons decay into invisible neutrinos. If there were an excess of invisible events, more than predicted by the Standard Model, these extra events would be evidence of new phenomena.
The classic scenario involving invisible particles is dark matter. Dark matter has been observed through its gravitational effects on galaxies and the expansion of the universe, but it has never been detected in the laboratory. Speculations about the nature of dark matter abound, but it will remain mysterious until its properties can be studied experimentally.
Another way to get invisible particles is through extra dimensions. If our universe has more than three spatial dimensions (with only femtometers of “breathing room” in the other dimensions), then the LHC could produce gravitons that spin around the extra dimensions. Gravitons interact very weakly with ordinary matter, so they would appear to be invisible.
A third possibility is that there is a new form of matter that isn’t made of indivisible particles. These so-called unparticles can be produced in batches of 1½, 2¾ or any other amount. Unparticles, if they exist, would also interact weakly with matter.
All three scenarios produce something invisible, so if the CMS data had revealed an excess of invisible events, any one of the scenarios could have been responsible. Follow-up studies would have been needed to determine which one it was. As it turned out, however, there was no excess of invisible events, so the measurement constrains all three models at once. Three down in one blow!
LHC scientists are eager to see what the higher collision energy of Run 2 will deliver.
If you listen in on particle physics conversations, you’ll hear a lot of alphabet soup, such as “b to s gamma,” “Z to tau tau” and “q q-bar to X.” Reactions among particles provide a view to the underlying physics: You can learn how particles are related by how willing they are to collide and how often they decay a particular way.
I’ve long held a secret hope that someone would one day discover the “Z Z top event” (two Z bosons and a top quark), but this combination is just too rare. CMS has recently announced the next best thing: top top Z.
That is, in a large collection of proton-proton collisions, CMS scientists found that some of them produced a pair of top quarks and a Z boson. This shows that top quarks can interact with Z bosons, just like all other quarks.
This is not an assertion to be taken lightly. Top quarks are different from all other quarks, primarily because of their exceptionally high mass — 35 times heavier than the second heaviest. This difference in mass is responsible for many of the unique properties of the top quark. For instance, Z bosons decay into quark-antiquark pairs for every type of quark except top. Z bosons cannot decay into top quarks because they are heavier than the Z boson itself. Thus, it is well known that Z bosons interact with all other types of quarks, but only now do we learn that they interact with top quarks as well.
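The kinematic argument is simple enough to spell out: a Z boson can decay to a quark-antiquark pair only if the quark’s mass is below half the Z mass. A minimal check, with approximate masses I’ve assumed:

```python
# Approximate masses in GeV (assumed inputs, not from the paper)
M_Z = 91.19
QUARK_MASSES = {"u": 0.002, "d": 0.005, "s": 0.095,
                "c": 1.27, "b": 4.18, "t": 172.8}

# A Z can decay to a quark-antiquark pair only if each quark's mass
# is below half the Z mass (the pair shares the Z's energy).
allowed = {q for q, m in QUARK_MASSES.items() if m < M_Z / 2}
print(sorted(allowed))  # every quark except top
```

That is why the top-Z interaction could only be seen the other way around: not in Z decays, but in collisions that produce top quarks and a Z together.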
This analysis is challenging because the detector signature for top top Z resembles many other types of events, collectively called backgrounds. In particular, top top W has many features in common with top top Z because W and Z are both weak force bosons with similar masses. This analysis improves upon a previous one in that it distinguishes the W from the Z, measuring their rates separately (though top top W by itself cannot be clearly distinguished from its backgrounds).
Another challenge is that these event types, top top W and top top Z, are both exceedingly rare: a hundred times less common than Higgs production. This analysis is therefore one example of progress beyond the Higgs.
The Higgs field was recently found to give mass to leptons and quarks (matter particles) as well as the bosons (force particles). Lines in the diagram above indicate interactions between particles: The red lines are new.
Nearly 50 years before its discovery, the Higgs field was proposed as a way to explain why particles have mass. The Standard Model would be internally inconsistent if particles could have mass on their own (that is, as an intrinsic property like charge), but it would not be inconsistent to propose a new field that gives them an effective mass by interacting with them. That new field has come to be known as the Higgs field, and particles of this field are called Higgs bosons.
This story is well known, and it was told in many ways when the Higgs boson was discovered in 2012. What is less well known is that the problem of mass was not a single problem. The reason that force particles (such as W and Z bosons) cannot have intrinsic mass is different from the reason that matter particles (such as electrons and quarks) cannot have intrinsic mass. The effective mass of force particles and matter particles could come from different sources. There could be two Higgs fields, one that only interacts with and gives mass to force particles, the other to matter particles, or perhaps the mechanisms themselves could be completely different.
Many physicists expected that a single Higgs field would pull double duty and give mass to all the particles. This, however, was a hypothesis, based on the expectation that nature is simple and elegant.
As it turns out, nature seems to be simple and elegant. CMS scientists recently published a study of Higgs boson decays to matter particles, complementing its discovery, which was through its decays to force particles. The same Higgs field interacts with both types of particles in the expected way.
Specifically, the study focused on Higgs to tau pairs (tau is a heavy cousin of the electron) and Higgs to b quarks (the b quark is a heavy cousin of the quarks found in the protons and neutrons of an atom). Since this interaction is responsible for mass, it is stronger for more massive particles. Both of these decay products are hard to distinguish from backgrounds, especially the b quarks, so the statistical significance is weak (3.8 sigma, equivalent to a one in 14,000 chance that the combined observation is spurious). However, these decays and all the decays to force particles point back to a single Higgs boson. The basic principles of physics may yet be simple enough to fit on the front of a T-shirt.
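The translation from "3.8 sigma" to "one in 14,000" is the one-sided tail probability of a Gaussian distribution. A short sketch of that conversion (my illustration, standard statistics rather than the paper's code):

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a significance
    expressed in standard deviations (sigmas)."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

# 3.8 sigma corresponds to a fluke probability of roughly 1 in 14,000.
p = one_sided_p(3.8)
print(1 / p)
```

For comparison, the conventional discovery threshold of 5 sigma corresponds to a fluke probability of about 1 in 3.5 million.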
Supersymmetric particles produced in proton collisions might have any of a variety of decay patterns, but the total energy of all decaying particles corresponds to the mass of the first particle in the cascade.
Supersymmetry is hard to kill. It is more general than most theories of physics beyond the Standard Model: It is the basic idea that particles and forces are fundamentally the same thing but appear different because something creates an effective distinction between them, similar to the way that the Higgs boson creates an effective distinction between the electric and weak parts of the electroweak force. For supersymmetry, that “something” is unknown — many different models of supersymmetry breaking have been proposed and others could be thought up tomorrow. Each variant, including minor variations in numerical parameters, yields different decay patterns involving different cascades of particles.
Seekers of supersymmetry are faced with a dilemma: Pick a model of supersymmetry breaking and hope you’re lucky enough to find it, or look at a quantity that is sensitive to a broad class of models, but with less statistical sensitivity. If you pick a specific model and don’t find it, you can set a precise exclusion limit, but only on one model — supersymmetry itself remains elusive. Broad searches, on the other hand, are hard to formulate. You have to think of a signature that is shared by many supersymmetric models yet is different enough from the Standard Model to clearly claim a discovery or set a tight limit.
One technique, dubbed “the razor,” shares aspects of both. It makes weak assumptions about the mechanism of supersymmetry breaking but also makes a sharp distinction between supersymmetric particles and the Standard Model. Assuming that supersymmetric particles are within reach of the LHC (like squarks and gluinos), and that they are produced in pairs (R-parity is not violated), and that they decay in cascading chains (as many variants of supersymmetry do), they would show up in LHC collisions as two rough bundles of particles, each with an energy that corresponds to the mass of the first supersymmetric particle in the chain. Different models predict different decay patterns within each bundle, but this technique looks only for the bundles.
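The bundle-based logic can be sketched numerically. The formulas below follow one common definition of the razor variables from the literature; this is my own illustration, under that assumption, not code from the CMS analysis:

```python
import math

def razor_variables(j1, j2, met):
    """Compute (M_R, R**2) for two 'megajets' j1, j2 given as (px, py, pz)
    and the missing transverse momentum vector met given as (mex, mey).
    Follows one common definition from the razor literature (assumed)."""
    def mag3(p):
        return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2)
    def mag2(p):
        return math.sqrt(p[0]**2 + p[1]**2)

    # M_R estimates the mass scale of the pair-produced parent particles.
    m_r = math.sqrt((mag3(j1) + mag3(j2))**2 - (j1[2] + j2[2])**2)

    # M_T^R is a transverse-mass-like combination of the megajets and
    # the missing momentum.
    met_mag = mag2(met)
    pt_sum = mag2(j1) + mag2(j2)
    dot = met[0] * (j1[0] + j2[0]) + met[1] * (j1[1] + j2[1])
    m_t_r = math.sqrt((met_mag * pt_sum - dot) / 2)

    # R**2 is small for balanced, background-like events and large for
    # events with genuine missing momentum.
    return m_r, (m_t_r / m_r)**2

# Back-to-back megajets with no missing momentum: background-like, R**2 = 0.
print(razor_variables((50.0, 0.0, 0.0), (-50.0, 0.0, 0.0), (0.0, 0.0)))
```

The search then looks for an excess of events at large M_R and large R², where Standard Model backgrounds fall off steeply.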
A group of CMS scientists selected events using the razor technique and found them to be consistent with the Standard Model. This cuts out a broad range of supersymmetric models, many more than a focused technique would. There are still others that might evade the razor’s weak assumptions, but the remaining space is getting thin.
This plot shows how well the Higgs half-life (λ) is known: Less than 20 yoctoseconds is ruled out with 95 percent confidence, but greater than 20 yoctoseconds is still possible. The standard prediction is that the half-life is 100 yoctoseconds.
It is almost two years since physicists from the CMS and ATLAS collaborations announced the discovery of a Higgs-like boson. Today, the evidence has strengthened to the point that they no longer qualify it as “Higgs-like.” The signal is now much clearer, the particle is spinless (as a Higgs boson must be), more decay modes have been observed, and the proportions of decays into those modes are about right (with 15 percent uncertainties). What more could you want?
Perhaps its decay rate: The Higgs boson is an unstable particle, so it decays within a characteristic length of time. Although the time for an individual particle to decay is random, each type of particle has an average lifespan. The time it takes for half of a large collection to decay is called its half-life. The half-life of the Higgs boson is not known, but it is predicted to be 100 yoctoseconds (septillionths of a second), which is a rather long time for a particle of its mass.
A measurement of the Higgs boson half-life would tell us a lot. Currently, only a few final states have been observed, which add up to about 2 percent of all predicted decays. For all we know, the boson could be a nonstandard Higgs, decaying into exotic, unobserved particles most of the time. Knowing the total decay rate would put an upper limit on this possibility. It would constrain even the decays that we don’t see.
CMS scientists recently attempted to measure the Higgs boson’s half-life and determined that it is at least 20 yoctoseconds. This analysis established a technique that will be applied to larger data sets, which are needed to fully measure it.
The technique is significant, because direct measurements of the half-life are far too insensitive. If, for instance, you tried to measure a Higgs’ lifespan from the distance it flies between its production and its decay, you’d be trying to measure a distance that is much smaller than an atom, beyond the capabilities of any microscope. Instead, you might take advantage of a fact from quantum mechanics, one that states that the half-life of a particle is inversely proportional to the uncertainty in its mass. Unfortunately, the detector’s mass resolution is a thousand times too insensitive to see that uncertainty. The physicists who performed this study used a clever trick involving the ratio of the real Higgs production rate to the virtual Higgs production rate and managed to constrain the half-life within a factor of six of its predicted value. No small feat!
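The 100-yoctosecond prediction follows from that same quantum-mechanical relation between lifetime and mass uncertainty. A minimal sketch in Python, assuming the Standard Model value of roughly 4.1 MeV for the mass uncertainty ("width") of a 125-GeV Higgs:

```python
# Sketch: converting a particle's mass uncertainty (decay width) into a
# half-life via the relation  mean lifetime = hbar / width.
# The 4.1 MeV width is the assumed Standard Model prediction, not a measurement.

import math

HBAR_MEV_S = 6.582e-22  # reduced Planck constant in MeV*seconds

def half_life_seconds(width_mev):
    """Mean lifetime is hbar/width; the half-life is that times ln 2."""
    return (HBAR_MEV_S / width_mev) * math.log(2)

gamma_sm = 4.1  # MeV, assumed total width of a Standard Model Higgs
t_half = half_life_seconds(gamma_sm)
print(t_half / 1e-24)  # in yoctoseconds: roughly the predicted 100
```

The same relation run in reverse is why a 20-yoctosecond lower limit on the half-life is an upper limit on the total decay rate.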
A jet of water sprayed through water loses energy and changes shape, as illustrated by this Jacuzzi jet. CMS scientists studied a similar phenomenon in an exotic liquid of quarks and gluons.
Despite the complexity of particle colliders and the instrumentation needed to analyze their results, the ultimate aim of most particle physics experiments is to understand something simple. At a fundamental level, most natural phenomena turn out to be simple in profound ways. By contrast, our macroscopic world is teeming with complexity: A bucket of water is far more complex than an electron. The exact way that water sloshes, curdles in turbulent flow and pinches into droplets when it splashes would be difficult to simulate on the world’s biggest supercomputers, even though the basic interactions between individual atoms are pretty well understood.
One part of the quantum world has this kind of complexity, however: the strong force that binds quarks. Unlike the electromagnetic force between atoms, the particles that make up the strong force are themselves attracted via the strong force, which begets more strong force. Physicists call them gluons because they make such a sticky mess. Like the bucket of water, the strong force is notoriously difficult to calculate because some of its properties are emergent — they arise from the interplay of many interactions.
One of these emergent properties is the fact that a lone quark flying away from a collision creates gluons, which create quarks, which create more gluons, until it becomes a jet of particles flying in roughly the same direction. Another is that if you get enough quarks in a small space (by colliding heavy nuclei), they undergo a phase transition into a new kind of liquid ruled by strong force interactions. Recently, scientists discovered that jets are eaten by the liquid: They are absorbed into the droplet and sometimes disappear entirely.
To get a more complete picture of this phenomenon, scientists have used the CMS experiment to study an in-between case: jets that are partially but not completely absorbed by the strong-force liquid. Like a stream of water sprayed through water, these jets come out misshapen. The angles among particles that make up the jet are noticeably wider than usual, and the exact amount of broadening tells us a little more about the nature of this new state of matter.
Top quarks or the particles involved in their creation can be indirectly studied through symmetries — or lack thereof.
Experimental physicists spend their lives thinking about uncertainty. Although the subject of their experiments may be heady stuff like the Higgs field, dark matter and gravitational waves from the beginning of time, they spend most of their time wondering, “How can I be sure my instrument isn’t lying to me?” Combating potential sources of error is easily 90 percent of the work on a typical analysis.
There are many techniques for dealing with uncertainties, and one of them is to exploit symmetries. Imagine that you have a picture of a butterfly on a piece of thin paper and you want to know if the left wing is exactly like the right wing. Measuring each of its spots with a ruler and compass is error-prone, but folding the paper and holding it over a light reveals all of the differences quickly and accurately. If a spot on the right wing is slightly larger than the corresponding spot on the left, it won’t line up exactly.
One of the last big discoveries of the Tevatron was a forward-backward asymmetry in pairs of top and antitop quarks. When protons and antiprotons collided to produce top-antitop pairs, the top quarks flew out of the collision in the direction of the original proton more often than in the direction of the antiproton (and vice-versa). Just like folding butterfly wings, this result is robust against uncertainties in the total collision rate because such an error would cancel in the forward-backward comparison. For three years now, physicists have been trying to explain this asymmetry.
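The forward-backward comparison is a simple ratio of counts, which is exactly why an error in the overall collision rate cancels out. A toy sketch (the event counts below are invented for illustration, not Tevatron data):

```python
def forward_backward_asymmetry(n_forward, n_backward):
    """(F - B) / (F + B): scaling both counts by the same factor,
    e.g. a mismeasured total collision rate, leaves it unchanged."""
    return (n_forward - n_backward) / (n_forward + n_backward)

# Illustrative counts only:
a1 = forward_backward_asymmetry(600, 400)
a2 = forward_backward_asymmetry(6000, 4000)  # ten times the rate
print(a1, a2)  # the same asymmetry both times
```

This is the "folding the butterfly" trick in arithmetic form: only the difference between the two halves survives.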
It is natural to ask if the LHC sees the same asymmetry. Unfortunately, the experiment can’t be exactly repeated because the LHC collides protons and protons — no antiprotons. However, there’s another potential asymmetry that could reveal the underlying cause. Protons are made of two energetic up quarks, an energetic down quark and a froth of low-energy quark-antiquark pairs. Antiprotons are the opposite. When the Tevatron saw an excess of backward-moving antitop quarks, it might have been because the antiproton’s antiquarks were, on average, more energetic than the proton’s antiquarks. At the LHC, this asymmetry would show up in the angles of top and antitop trajectories relative to the beamline: top quarks flying slightly closer to the line of colliding protons, antitop quarks slightly more perpendicular to it.
In a recent experiment, CMS scientists studied exactly that. They measured the angular distribution of top-antitop pairs relative to the beamline and found no asymmetry. Clearly, something else is going on here. Like a rear-view and a side-view mirror, these two views together give us a more complete picture of the mystery.
Now that a new landscape has been opened up by the LHC, it's time to survey every inch of it with precision measurements.
In the early months of the LHC, many of the studies performed were searches for striking phenomena: event patterns that could not be anything other than new physics. Before we had this window on the microscopic universe, all sorts of proposed new laws of nature were theoretically possible — they would have had negligible effects on low-energy experiments but would have been dramatically revealed in high-energy collisions. In its first two years, the LHC struck down hundreds of hypotheses such as these. In its third and fourth, it discovered and started measuring the properties of the long-sought Higgs boson, one prediction that turned out to be true.
While all of these activities are still going on, the tide is shifting toward thorough surveys of the new landscape. New laws of nature are probably still waiting to be discovered, which is to say that we probably don’t know everything there is to know just yet. But they might be subtle, lurking in the 10th decimal place. By measuring a wide variety of known processes with ever-increasing precision, scientists are casting a wide net. The surveyors might find what the conquerors missed.
One such analysis is the measurement of the production of a W boson accompanied by two b quarks. As with all experiments, there is a story behind it: Previous measurements of a W boson and at least one b quark disagreed with the theoretical calculations, especially in cases where multiple b quarks overlapped in the detector, masquerading as a single b quark. By requiring events with two distinct b quarks, this new analysis can help to resolve the issue. The discrepancy is probably not new physics, but an uncertainty in the structure of the proton at its smallest scales.
CMS scientists recently published a high-precision analysis of the W-b-b signature. What they found is consistent with the theoretical prediction, and that narrows the range of possible reasons for the previous discrepancy. In fact, it is sensitive to “next-to-next-to-leading order,” two levels of refinement deeper than the basic calculation. The new results are already providing important feedback to the theoretical community, helping them to stamp out uncertainties in the computer simulations that are used to make predictions. It will likely have a subtle effect on scientific studies for years to come, the way that accurate maps improved upon the first crude sketches of the new world.
Although we usually say that a proton contains three quarks (up, up and down), there are many more quark-antiquark pairs at fine scales.
Matter is made of molecules, which are made of atoms, which are primarily made of protons and neutrons, which are made of quarks. In each case, however, “made of” takes a subtly different meaning. Protons are not made of quarks the way that a wall is made of bricks but rather like the way that a fire is made of flames. They are seething balls of spontaneously forming and annihilating quarks.
Yet this tempest has structure. For instance, quarks and antiquarks can only be created or destroyed in pairs, so when we say that a proton contains three quarks, it is because the total number of quarks minus the total number of antiquarks is always three (two more up quarks than anti-up and one more down quark than anti-down). Adding a few more quark-antiquark pairs doesn’t change the difference.
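The bookkeeping in this paragraph is easy to make concrete: pair creation and annihilation always change the quark and antiquark counts together, so their difference never moves. A toy tally:

```python
def net_quark_number(quarks, antiquarks):
    # The conserved quantity: total quarks minus total antiquarks
    return quarks - antiquarks

# A "bare" proton versus a proton seething with seven extra
# quark-antiquark pairs: the net quark number is 3 either way.
print(net_quark_number(3, 0), net_quark_number(10, 7))  # -> 3 3
```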
The number of quarks plus the number of antiquarks depends on how closely you look. Just as a coastline seems to get longer as you zoom in (because the true coastline winds around every grain of sand on the beach), the number of quarks and antiquarks increases at finer scales. High energies are sensitive to small scales, so high-energy protons appear to be denser and are more likely to collide.
This affects the rate of production of every kind of particle made by the LHC. But because these high energies had never been explored before first collisions in 2010, no one knew for sure what the rates of particle production would be. In addition to searching for the Higgs boson and other new particles, physicists have been measuring familiar processes to get a clearer picture of how the proton’s density scales with energy.
One very precise way to do that is to measure the ratio of W+ bosons to W- bosons. A W+ boson is formed when an up quark and an anti-down quark combine, and a W- is formed from down and anti-up. Since protons contain more up quarks than down, W+ is somewhat more likely than W-. The exact ratio depends on the density of antiquarks, so CMS scientists carefully measured tens of millions of W+ and W- bosons with impressively small uncertainties (0.2 to 0.4 percent).  These measurements are already helping to nail down the structure of the proton at the smallest scales.
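To see why tens of millions of bosons matter, consider just the statistical part of the uncertainty on such a ratio. The counts below are invented for illustration; with samples this large, the counting error is far below the quoted 0.2 to 0.4 percent, which is dominated by systematic effects:

```python
import math

def ratio_with_uncertainty(n_plus, n_minus):
    """Ratio of two event counts with Poisson (counting-only) errors
    propagated: relative errors add in quadrature for a ratio."""
    r = n_plus / n_minus
    rel = math.sqrt(1.0 / n_plus + 1.0 / n_minus)
    return r, r * rel

# Illustrative counts, not the actual CMS yields:
r, dr = ratio_with_uncertainty(16e6, 12e6)
print(round(r, 3), round(100 * dr / r, 3))  # ratio and stat. error in percent
```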
Spinless particles (such as Higgs bosons) decay in any direction with equal probability, but particles with spin (such as top quarks) prefer to decay along their axis of spin.
When I was asked to write an article on the CMS experiment’s measurement of the spin of the top quark, a metaphor immediately suggested itself: spinning tops. However, the more I thought of it, the more I realized how complex the relationship is between a toy top that you can spin with your fingers and a top quark, spinning on its own according to the rules of quantum mechanics.
A toy top is made of trillions of trillions of atoms, so its rotation is easy to define. It is rotating when the lattice of atoms orbits a central axis. If its surface were perfectly smooth and it were not made of granular pieces like atoms, then it would be harder to say that it is or is not rotating around its axis of symmetry — it would look exactly the same at all times. This is a problem when describing the rotation of black holes: All one can see is a knot of pure space-time, no surface wrinkles revolving like a carousel.
Angular momentum is a more fundamental concept than rotation, one that applies equally well to featureless objects like black holes. For an object with a well-defined rotation, angular momentum is proportional to the rate of rotation, but it is also proportional to the mass and diameter squared of the object in a way that depends on shape. If we shrink a top while maintaining its angular momentum, the rotation rate must increase, as it does when a spinning figure skater contracts his or her arms. If we shrink a toy top to an infinitesimal point, like a fundamental particle, the rotation rate would become infinite.
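The figure-skater statement can be checked with the conservation law itself, L = Iω, with the moment of inertia I proportional to radius squared for a fixed mass and shape. The numbers here are illustrative:

```python
def spin_rate_after_shrink(omega0, r0, r1):
    """Conserve angular momentum L = I * omega, with I proportional to
    radius squared (mass and shape held fixed)."""
    return omega0 * (r0 / r1) ** 2

# Halving the radius quadruples the rotation rate:
print(spin_rate_after_shrink(2.0, 1.0, 0.5))  # -> 8.0
```

Let the radius go to zero and the required rotation rate diverges, which is why "spin" for a point particle cannot mean literal turning.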
A top quark “spins” in the sense that it has angular momentum and it can transfer that angular momentum to other objects. If quarks are the indivisible, fundamental particles that they appear to be, then they have no parts to turn or rotate in the conventional sense.
Top quarks are too short-lived for their spins to be measured using macroscopic devices, but the spin’s influence can be seen in the quark’s pattern of decays. Imagine an exploding toy top: More pieces would fly out perpendicular to its axis of rotation than along it. If you only saw the trajectories of the debris, you could infer how much angular momentum the toy had before it exploded. Unfortunately for my metaphor, the top quark prefers to decay along its spin axis rather than perpendicular to it, because the decay mechanism has nothing to do with centrifugal forces.
With a huge sample of top quark decays observed by CMS, physicists selected the 9,000 cleanest events and successfully measured spin-related asymmetries in their decays. These measurements are sensitive to the details of the process that produces top quarks, which aligns the top spins before they decay. Ironically, the top quark’s short lifespan is essential for this inference — all other types of quarks lose the spin correlations that they were born with because they interact with each other before decaying.
White outlines indicate the shapes of quark-gluon plasma droplets, derived from data measured with CMS.
Shapes in the subatomic realm are difficult to describe. Ask a physicist about the shape of an electron, and she might talk about the shape of the probability cloud of possible electron positions, or maybe say that the electron itself is a speck of zero size (to the limit of anyone’s ability to measure), or might just ask what you mean by “shape,” anyway. The macroscopic world that we know well is full of objects assembled from septillions of particles, and these particles are distributed in space as shapes. Fundamental particles are, by definition, not made of anything but themselves, so they can’t always be described in familiar terms like shape, color and taste (except metaphorically).
However, not all particles are fundamental. Mesons are pairs of quarks bound by the strong force, protons and neutrons are made of three quarks, and more exotic combinations may have recently been discovered. The quarks orbit one another in specific spatial patterns that could be called shapes, though the physicists who study them speak in terms of form factors and deep inelastic scattering.
Another spatially extended subatomic object is the fireball that forms when heavy nuclei collide. Unlike bound states of quarks, this substance is a kind of fluid — a rapidly expanding droplet in which quarks and gluons flow and mix. When it is first formed, the droplet is shaped like the overlap of the two nuclei: circular if the nuclei collided exactly face-on and almond-shaped if one was offset from the other (see figure in this related article). A few yoctoseconds later, the blob spreads out and changes shape according to pressure differences. The fact that this spreading is mostly elliptical is evidence that the quark-gluon fluid has almost zero (or exactly zero) viscosity.
But there’s more to that shape than a simple ellipse, as CMS scientists revealed in a recent paper. The complete picture looks like the figure above: peripheral collisions result in a mostly elliptical shape with pear-like offsets, while central collisions result in a droplet that is slightly triangular. The triangular component was at first surprising, since the overlap region of two circular nuclei would be symmetric. Nuclei are not rigidly circular, though, and asymmetries in the collision allow asymmetries in the droplet to develop.
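One common way to describe such shapes, which this sketch borrows, is to expand the droplet boundary in angular harmonics: an elliptical part that varies as cos 2φ plus a triangular part that varies as cos 3φ. The coefficients below are made up for illustration, not fitted CMS values:

```python
import math

def droplet_radius(phi, r0=1.0, eps2=0.2, eps3=0.05):
    """Toy droplet boundary: a circle of radius r0 deformed by an
    elliptical (cos 2*phi) and a triangular (cos 3*phi) component.
    eps2 and eps3 are illustrative deformation strengths."""
    return r0 * (1.0 + eps2 * math.cos(2 * phi) + eps3 * math.cos(3 * phi))

# The boundary bulges most where both deformations line up (phi = 0)
print(droplet_radius(0.0))
```

Dialing eps2 up relative to eps3 gives the peripheral-collision shape; a small eps3 on its own gives the slightly triangular central-collision droplet.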
Not only do these measurements improve our understanding of an extreme state of matter, but it is satisfying to finally see the shapes of liquid splashes a trillion times smaller than a drop of water.
The decay pictured above results in two pairs of particles: a kaon with a pion and two muons. There is a lot of information in the opening angle of each pair and in the angle between the planes that pass through each pair.
Many of the analyses we present in these columns are fundamentally simple. The experiments are difficult to perform, requiring record-setting beam energies, magnetic fields, computing resources and a lot of hard work, but the goal is easy to explain: find the new particle or prove that it does not exist.
In this article, I’d like to present one of the more subtle analyses, a study of angular distributions from a chain of particle decays. The particles involved are well-known, and the decay (a B meson to an excited kaon and two muons) was first observed 10 years ago. What makes this analysis interesting is that there is so much information in the trajectories of the final particles. Undiscovered physics, such as supersymmetry, could influence the way that the B meson disintegrates, resulting in a modified pattern of particle trajectories. Without paying close attention to these distributions, scientists could miss an important hint of the exotic hiding in the mundane.
Precision measurements are further complicated by the fact that particles decay with random trajectories, like fireworks. But even fireworks have patterns and structure. A firework that explodes from a standstill bursts symmetrically, like a rose, while a firework that explodes while still rocketing upward results in a funnel of final particle trajectories. With a careful record of a hundred random fireworks, you could learn a lot about their internal dynamics.
The fireworks analogy would be directly applicable to inferring a particle’s mass from its decay, but a recent paper from CMS studies two additional influences on the shape of the decay. One is the polarization of the excited kaon. In much the same way that light can be polarized horizontally or vertically, particles can be polarized, and the polarization of a decaying particle determines how well its remnants align (on average) with its original axis of motion. Another is the forward-backward asymmetry of the muons: how often they continue in the direction of motion of the B meson and how often they fly backward.
This particular decay has four final state particles (the excited kaon decays into a charged kaon and a pion, plus the two muons makes four), and all of the effects described above are happening simultaneously, with some randomness sprinkled in. Nature would be truly devious to hide evidence of a profound discovery in such a complicated setting, but this has never stopped her before.
The Higgs boson may decay into a Z boson and a photon through an intermediate pair of charged particles.
A particle’s branching fractions — that is, the probability that it will decay into one set of particles rather than another — are a good way to see if physicists really understand what’s happening at microscopic scales. Many things can affect a particle’s decision to decay into, say, electrons rather than photons. If the physicists’ predictions match the observed probability of decay, then the underlying mechanisms may be well understood, especially if it is a tight balance between opposing forces. If not, then there might be a new intermediate particle involved or some other new phenomena to be discovered.
Some branching fractions are determined by extremely complicated processes while others are relatively simple. The Z boson, for instance, decays into electron-positron, muon-antimuon and tau-antitau pairs with equal probability. It is as though the Z has a menu of everything less massive than it and blindly chooses from the menu.
The Higgs boson’s decays are also pretty simple: It can only decay directly into particle-antiparticle pairs, but with a probability that depends on the mass of the decay products. To fulfill its role as the origin of mass, the Higgs must couple to matter particles in proportion to their masses and to force particles in proportion to the squares of their masses, and hence it decays mostly into heavy particles. It is as though the Higgs’ menu is biased toward heavy final states.
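As a toy illustration of that biased menu, one can weight a few fermion options by mass squared. This ignores color factors, running masses, phase space and the bosonic decay modes, so the fractions are only indicative; the masses are approximate textbook values:

```python
# Approximate fermion masses in GeV (rough, for illustration only)
masses = {"b quark": 4.18, "tau lepton": 1.78, "charm quark": 1.27, "muon": 0.106}

# Decay preference grows as the square of the Higgs coupling,
# i.e. as the square of the fermion mass
weights = {name: m ** 2 for name, m in masses.items()}
total = sum(weights.values())
for name in sorted(weights, key=weights.get, reverse=True):
    print(name, round(weights[name] / total, 3))
```

Even this crude weighting shows the pattern: the b quark dominates the fermionic menu, while the muon barely registers.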
However, the Higgs sometimes decides to decay in ways that aren’t even on this menu. It was discovered, in part, by its decay into two photons. Photons are massless — their coupling to the Higgs (and hence branching fraction from the Higgs) ought to be zero. Physicists believe that this decay is possible because the Higgs first decays to a pair of heavy charged particles that then re-collide to produce two photons. (Charged particles couple to photons.) It’s somewhat more complicated, but not unprecedented.
With this interpretation, other final states, such as a Z and a photon, are also possible. (Charged particles also couple to Z bosons.) This Z-photon pair isn’t even a particle-antiparticle pair, but it should be allowed in the same way that photon pairs are. A group of CMS physicists is exploring that possibility now, searching for Z-photon pairs with the mass of a Higgs. If the theory is well-understood, they should have enough data to see this “off-the-menu” decay mode in the near future.
Tracks from a b-quark (yellow) and an ordinary quark or gluon (purple), overlaid on a photo of the CMS tracker, in approximately the position where these particles were observed.
Of the six known types of quarks, only two can be distinguished in a typical particle physics experiment. The top quark, once produced, has a dramatic signature involving cascades of decays from heavy particles into lighter ones. The bottom (b) quark also decays into lighter particles, but these are hidden in a spray of additional particles that form along with it, called a jet. A jet is essentially random: random particles moving in nearly random directions. The lighter quarks—charm, strange, up, and down—produce only jets when they decay.
In practice, this means that it’s almost impossible to distinguish an up-quark from a down-quark. Fortunately, most of the questions that scientists want to address do not rely on telling the difference.
The b-quark, however, is interesting for a variety of reasons: It can be part of a signal for new phenomena; it is part of the top-quark decay chain; and it probes fundamental symmetries in the laws of nature. Finding a way to distinguish b-jets from all other jets would help many scientists at once.
Jets from b-quarks are a little different in a lot of ways. Since b-quarks fly a small distance before they decay (about 5 millimeters), some particle trajectories trace back to this decay rather than the collision point. Jets with a b-quark are slightly narrower with slightly fewer charged particles and are more likely to include an electron or muon. No one characteristic is enough to tell us, “This is certainly a b-jet and that is not,” but the confidence adds up with each additional clue, and physicists are able to assign a probability that a given jet is a b-jet. In a recent paper, CMS scientists presented the state of their art: For an 80 percent probability of identifying a real b-jet, they have a 10 percent probability of misidentifying a non-b-jet.
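Those two numbers, the 80 percent tagging probability and the 10 percent mistag probability, combine with the prior abundance of b-jets through Bayes' rule. A sketch with invented prior fractions:

```python
def tagged_purity(prior_b, eff_b=0.80, mistag=0.10):
    """Fraction of tagged jets that are real b-jets (Bayes' rule),
    given the b-tag efficiency, mistag rate and prior b-jet fraction."""
    tagged_b = eff_b * prior_b
    tagged_other = mistag * (1.0 - prior_b)
    return tagged_b / (tagged_b + tagged_other)

# With b-jets as, say, 5% of all jets (an illustrative prior), the tagged
# sample is only about 30% pure; at a 50% prior it is about 89% pure.
print(round(tagged_purity(0.05), 2), round(tagged_purity(0.50), 2))
```

This is why the tag works best in analyses, such as top-quark studies, where b-jets are already expected to be common.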
This is a “big data” analysis, much like the ones for which Internet companies such as Google, Amazon and Facebook are now famous. Among the trillions of recorded collisions that produced jets, these scientists found the few percent that are b-jets. Also like Internet data miners, the scientists used sophisticated statistical techniques and machine learning to optimize their search. Unlike commercial big-data analyses, however, this b-jet algorithm has quietly improved scientific understanding across more than 40 analyses, including searches for new physics, detailed studies of the top quark, and measurements of the Higgs boson.
Not only would the discovery of supersymmetry double the number of known particles, but it would introduce a new type of charge: Standard Model particles have positive R-parity and supersymmetric particles have negative R-parity.
Many of the CMS results presented in this column involve supersymmetry, the idea that there is a deep relationship between matter and forces. If nature is supersymmetric, then for each type of matter particle, there would be a corresponding supersymmetric force, and for each of the four forces, there would be a corresponding particle of supersymmetric matter. No evidence of supersymmetry has yet been found, despite the fervent searches, so you might be wondering why scientists are still looking for it. There are two reasons: (1) It would greatly increase our understanding of how all known particles unify, and (2) there are so many different ways that supersymmetry might manifest itself that the searches performed so far are not exhaustive.
To illustrate this complexity, let’s examine one property that a supersymmetric theory might or might not have, known as R-parity. R-parity is a property of particles in the same sense as electric charge. Just as protons are positively charged and electrons are negatively charged, all currently known particles have positive R-parity, and their supersymmetric variants, if they exist, would have negative R-parity. R-parity is the supersymmetric-ness of a particle. Supersymmetric theories come in two broad classes: those in which the R-parity of a system is constant, like electric charge, and those in which R-parity can change with time.
Not only would the constancy or variation of R-parity reveal something fundamental about supersymmetry, but it has implications for the way scientists search for it. If R-parity is constant, then at least one supersymmetric particle must be stable. It cannot decay into normal matter because that would flip its R-parity from negative to positive. This feature could explain dark matter, because the stable supersymmetric particle would be dark (invisible). It also means that supersymmetric particles would appear as missing energy in a particle detector like CMS, due to their invisibility. Most searches for supersymmetry look for this characteristic energy imbalance.
If, on the other hand, R-parity is not constant, then all supersymmetric particles could decay, and searches based on missing energy would not find them. For this reason, a team of CMS physicists performed a different kind of supersymmetry search, one that doesn’t rely on missing energy. They used the number of leptons, b-quark identification and the distribution of energy to distinguish supersymmetric events from the background.
What they measured is consistent with known physics, further rolling back uncertainty with hard data. When dealing with the unknown, one must leave no stone unturned.
If you’ve ever seen computer displays like the one above, or old bubble chamber photographs, or even tinkered with a homemade cloud chamber, then you’ve seen particle tracks. Tracking is an important tool for particle physics experiments because tracks show you the comings and goings of individual particles. When coupled with a magnetic field, they also tell you the momentum of each particle, since low-momentum particles curve sharply in the field while high-momentum particles fly nearly straight through. Irène Joliot-Curie, daughter of Marie Curie and an early adopter of tracking for her radioactivity research in the 1930s, called it “the most beautiful phenomenon in the world, apart from childbirth.”
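The curvature-to-momentum conversion has a standard rule-of-thumb form: transverse momentum in GeV is roughly 0.3 times the magnetic field in tesla times the radius of curvature in meters. A minimal sketch, taking 3.8 tesla as the nominal CMS solenoid field:

```python
def pt_gev(field_tesla, radius_m, charge=1.0):
    """Transverse momentum of a charged track curving in a solenoid:
    p_T [GeV] ~ 0.3 * q * B [T] * R [m], a standard tracking rule of thumb."""
    return 0.3 * charge * field_tesla * radius_m

# In a 3.8 T field, a track curving with a 1-meter radius
# carries a bit over 1 GeV of transverse momentum:
print(round(pt_gev(3.8, 1.0), 2))
```

Tight spirals therefore mean soft particles, while the stiffest, straightest tracks are the most energetic ones.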
Nevertheless, tracking has some limitations. For one thing, it only reveals charged particles: Neutral particles leave no tracks unless they decay into charged particles. Also, a particle must survive long enough to enter the tracking detector to make a track. For particles with yoctosecond lifetimes (septillionths of a second), this is an issue. In practice, only five types of particles are commonly observed in tracking chambers: electrons, muons, pions, kaons and protons. The rest are inferred from the pattern of these particles’ trajectories or are identified by other techniques, such as calorimetry.
A new particle might show up as a new kind of track. The energy of collisions in the LHC is high enough to create new particles, even as-yet undiscovered particles. If a new particle is neutral and short-lived, like the Higgs boson, then it must be reconstructed from its decay products: a Higgs particle decays into two Z bosons, each of which decays into two electrons or two muons—an example involving four tracks. If, on the other hand, the new particle is charged and lives for tens of nanoseconds or more, it would pass through the detector for all the world to see.
In a recent paper, CMS scientists released results from a search for weird tracks hidden among the downpour of normal tracks from familiar particles. This analysis relied on unusual tools—the time that it took individual particles to fly through the detector and the amount of energy they lost along the way—finer detail than is needed for most analyses.
In the end, all was found to be consistent with known physics. These results rule out a variety of exotic theories with higher sensitivity than ever before.
This figure illustrates the pair production of two massive particles, each of which decays into two jets of lighter particles.
It is often said that in quantum mechanics, anything that can happen will happen. The emphasis, however, should be on the “can happen.” Not all processes are possible. Whenever two particles collide, nature checks her rule book to see which collision products are allowed and then randomly chooses from the options. Physicists generally express these rules as conservation laws, quantities that must remain equal before and after a collision, such as the sum of all mass and energy in the particles. One could say that the goal of particle physics is to discover all of the conservation laws.
Some conserved quantities, such as mass-energy, can take any value, while others seem to be limited to whole-number multiples of a fundamental constant. Angular momentum, surprisingly enough, seems to be an example of the latter: You can’t spin less than 5.27 × 10−35 joule·seconds. Conserved quantities that are restricted to strict increments like this are called quantum numbers.
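The number quoted above is half of the reduced Planck constant; a one-line check:

```python
# The smallest nonzero spin is hbar/2, half the reduced Planck constant.
HBAR = 1.054571817e-34  # reduced Planck constant in joule*seconds
print(HBAR / 2)  # roughly 5.27e-35 joule*seconds, as quoted
```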
One consequence of conserved quantum numbers is that some particles can only be produced in pairs. If a conserved quantum number describing the colliding particles is zero, then the collision cannot result in a single particle whose quantum number is +1. A +1 particle must be accompanied by a −1 particle. This applies to the quantum numbers that we know (such as angular momentum, electric charge, strong and weak force charges) as well as possible new quantum numbers that we don’t yet know. Some particles might have gone undiscovered because they can only be produced in pairs.
In a recent paper, CMS physicists used this feature to search for new particles. Instead of looking for the pattern of debris that would be consistent with the decay of a single new massive particle, they searched for the decay of two particles, the +1 and the −1 of some quantum number. Although they don’t know the mass of the new particles, they can expect them both to have equal masses. Constraints like this help to eliminate backgrounds from known physics, which is important because almost all of the collisions result in known processes. Anything that can happen, will.
A heavy variant of the Higgs boson would decay primarily into W bosons or Z bosons. This is a decay mode newly added to the search.
Evidence is mounting that the particle discovered last year is the long-sought Higgs boson. When it was announced, no one seemed more cautious about claiming that than its discoverers. But now, as experimental uncertainties shrink, they can confidently say that the particle has no intrinsic spin, it is mirror-symmetric, and it couples to other particles in rough proportion to their masses. These are all properties that the boson predicted by the Higgs mechanism must satisfy.
One property that the theory does not predict well, however, is the mass of that boson. All predictions relied on assumptions about physics beyond the Standard Model, but generally they were in the few-hundred-GeV range. When the LHC experiments began their search, they cast as wide a net as possible and seem to have made a catch at the low end, 125 GeV.
That’s not the end of the story: Even if the 125-GeV boson gives mass to the fundamental particles, it may not be acting alone. Nothing in the theory forbids multiple Higgs bosons. In fact, many of the predictions for a low-mass Higgs were based on supersymmetric extensions of the Standard Model, and these extensions require at least five Higgs bosons. So while some physicists study the properties of the boson in hand, others scour the net for more.
CMS scientists recently published a search for additional Higgs bosons from 145 to 1,000 GeV. This mass range requires different search techniques, since a heavy Higgs would decay differently than a light Higgs. Larger data sets and new search modes both contribute to making this the most stringent limit yet—heavy Higgs bosons with Standard Model decay patterns are now ruled out up to 710 GeV.
It is remarkable to live in a time when such sweeping statements can be made with data. The few-hundred-GeV region has long been marked on physicists’ maps as “thar be dragons,” the energy scale at which something breaks the unity between electromagnetism and the weak force. We’ve already caught one dragon; why not two?
In a large sample of proton-proton collisions resulting in muons and pions, a few thousand of them accumulate above the backgrounds with a mass of 3,872 MeV. This is the X(3872).
When the discovery of a new Higgs-like particle was announced last summer, it received a lot of well-deserved media attention. It is less widely known, however, that about a dozen new particles have been discovered in the past 10 years. Why the lack of excitement? Unlike the Higgs boson, these other new particles are bound states of quarks: the same old particles in new combinations.
But they should not be so quickly dismissed. These new particles may be made of quarks, but most of them defy conventional explanations of how quarks fit together.
In the standard theory of quark interactions, quarks come in three types and can only bind together in ways that would result in an equal balance of these types. By analogy with the color wheel, they are called red, green and blue, and a particle like a proton is made of one red, one green and one blue quark, which mix to form white. Antiquarks are yellow, cyan and magenta. You can form a bound state from a yellow antiquark and a blue quark, for instance, since these colors are on opposite sides of the color wheel. See this “Physics in a Nutshell” for more.
Up to 2003, the only combinations of quarks and antiquarks that had ever been seen were red-green-blue particles, yellow-cyan-magenta antiparticles, and quark-antiquark pairs. Since then, a growing number of bound states have been discovered that do not fit this scheme. The first of these, discovered by the Belle experiment in Japan, was the X(3872), named “X” because we do not know what it is, and “3872” for its mass, measured in units of MeV.
The X(3872) and its companions might be the first examples of four-quark combinations, such as red-cyan-yellow-blue. These combinations are theoretically possible since they mix to form white, but they were not expected to be stable enough to be observed. The X(3872) in particular has a mass that is very close to the sum of two well-known bound states, D0 and D*0, so it might be a bound state of bound states. It has also been suggested that these new states are part-glueball hybrids.
In a recent experiment, CMS scientists observed the X(3872) with a strong signal (see the plot above) and measured several of its properties with higher precision than ever before. Far from the glare of the spotlight, these scientists are working to solve one of nature’s underappreciated mysteries.
Although cross section has little to do with a literal slice of the top quark, it can tell us about the quark's fundamental properties.
Last week’s Physics in a Nutshell described the strange way that particle physicists use the term “cross section” to mean a reaction rate. For instance, the proton-proton to top-antitop cross section is related to the probability that two protons will interact and produce a top quark and an antitop quark. This idea of cross section has historical roots in the fact that the collision probability of a stream of spherical balls is proportional to the cross sectional area of those balls. Today, the term is used because it specifies the collision probability in a way that is independent of the number of particles in the stream, so that results from one accelerator can be compared with results from another.
But why measure collision probabilities at all? Back when particles were thought to be hard spheres, it was an indirect way of measuring their size. In our current understanding, each particle is a quantum cloud of probable positions, and even if two particle clouds pass through each other, they may or may not interact. The probability of interaction is only partly determined by position—it is also affected by a fundamental property known as coupling, the strength of connection between the colliding particles and the result of the collision. When protons collide, the probability of making a top-antitop pair depends on the coupling of gluons inside the proton with the top quarks. By measuring this cross section, scientists learn how strongly gluons and top quarks are related, one link in the web that connects all fundamental particles.
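The rate interpretation can be made concrete with a short sketch. The numbers below are illustrative placeholders, not the measured CMS values: multiplying a cross section by the integrated luminosity of a data set (and an assumed selection efficiency) gives an expected event count.

```python
# Expected number of events = cross section × integrated luminosity × efficiency.
# All numbers here are illustrative, not the actual CMS measurement.
sigma_ttbar_pb = 165.0       # proton-proton → top-antitop cross section, picobarns (roughly the 7-TeV scale)
luminosity_inv_pb = 2300.0   # integrated luminosity, inverse picobarns (illustrative)
efficiency = 0.05            # assumed fraction of events passing selection cuts

expected_events = sigma_ttbar_pb * luminosity_inv_pb * efficiency
print(f"Expected selected top-antitop events: {expected_events:.0f}")  # 18975
```

Because the cross section is independent of beam intensity, the same σ can be reused to predict yields at any accelerator once its luminosity is known.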
In a recent paper, CMS physicists presented a new measurement of proton-proton to top-antitop cross section, using 60 times more data than the previous measurement. The extra precision that the large data set provides not only tests our understanding of the gluon-top coupling, it also checks assumptions about the density of gluons in the proton and quantifies a background for other searches. As the heaviest particle known, the top quark couples more tightly than any other to the Higgs boson and anything related to the origin of mass—the more we know about it, the better.
Far more Upsilon-2S and 3S particles disappear in the droplet of quark matter than Upsilon-1S, as can be seen in this plot from the CMS experiment.
Last week’s Physics in a Nutshell described how lead nuclei melt when they collide in the LHC. The molten nuclei form a new kind of fluid, one made of the random motions of quarks and gluons rather than atoms or molecules. But this quark matter exists for only a few trillionths of a trillionth of a second—how can CMS scientists learn anything about it?
The droplet is far too small and short-lived to poke with any instruments, so scientists must rely on probes that are created along with it. Fortunately, the high-energy collision of two nuclei creates a lot of particles that are understood from previous studies. Upsilon mesons, for instance, are particles made of two unusually heavy quarks, discovered at Fermilab 35 years ago. The swarm of quarks and gluons in the fluid hits the Upsilons, sometimes hard enough to break them apart and absorb their quarks into the mix. The exact rate of this “meson melting” yields key insights into the temperature and nature of the droplet.
Not all Upsilons are alike: Upsilon-1S mesons are the smallest and least massive, while Upsilon-2S and -3S are more massive and larger. The Upsilon-3S is the biggest target and is therefore the easiest to hit and break apart. In a recent experiment, CMS scientists clearly observed fewer surviving Upsilon-2S and -3S relative to Upsilon-1S, a phenomenon known as sequential melting.
The plot above shows what this looks like: The particles observed in the aftermath of the collision can be identified by their masses (the horizontal axis). Collisions without quark matter (orange peaks) make about half as many Upsilon-2S as Upsilon-1S, but collisions with quark matter (pink peaks) result in far fewer Upsilon-2S and almost no Upsilon-3S. Since the only difference between the control and the experiment is the presence of a quark-gluon fluid, the heavy Upsilons must have been gobbled up by the fluid.
This effect was first seen by CMS scientists in an early run, but subsequent data have made it much more clear. The data sets are now large enough to see how the effect depends on the exact way that lead nuclei collide and to see it with particles other than just Upsilons. This experiment is one example of how familiar particles can be used as tools to explore a new state of matter.
High-energy collisions may reveal new lightweight particles if they are decay products of a heavy one.
To search for new phenomena, physicists need to go where no one has gone before. High-energy collisions have the potential to create particles too heavy to have been made in previous experiments at lower energy since the masses of the new particles come from the energy of collisions. High beam intensities shine more light on rare processes because a larger number of collisions means more chances to see an improbable reaction or decay. Sometimes, though, you need a little of both.
Suppose that there’s not just one new particle waiting to be discovered, but a whole system of new particles. Some of these may be hidden by their high masses while others may be hidden by very low reaction rates with ordinary matter. This possibility is sometimes called a hidden valley. If we can reach high enough in energy to create one of these new massive particles, it can decay into all the rest, revealing the whole landscape.
To test such a scenario, CMS scientists looked for new particles in their own backyard. That is, they scanned the debris of high-energy collisions for signs of low-mass particles to see if any new ones had slipped in among the familiar faces of the 1960s particle zoo. This is different from most searches, which look for high-mass signatures directly.
To be concrete, these scientists posed the question: what if the Higgs boson is not a single particle, but a family of interrelated particles? In addition to directly observing a new massive boson, like the one discovered last July, they should see the lightweight cousins of the Higgs from Higgs-to-Higgs decays. Or alternatively, what if dark matter has its own version of a photon, oxymoronically known as dark light? These dark photons would be produced along with invisible dark matter, but they would appear as visible low-mass particles.
In a careful analysis, no new lightweight particles were found. But when searching for the unexpected, it’s important to keep the whole landscape in view, including the part we thought we knew.
Charged particles passing through a detector leave behind tracks laden with information in their lengths, trajectories, curvatures and thicknesses.
When the idea of quarks was first introduced in the 1960s, one of its stranger aspects was that quark charges must be one-third and two-thirds that of an electron. Up to that point, all known particles had −2, −1, 0, +1 or +2 times as much electric charge as the electron, a pattern that suggests that charge is quantized (only integer multiples are allowed). Quarks, which never appear alone, tell us that the basic unit is three times smaller. But solitary particles with such small charges have never been observed.
A solitary quark, if it could float through space on its own, would be a fractionally charged particle. The reason we have never seen lone quarks is now known: Quarks are bound together by a force so strong that any attempt to separate them simply creates more quarks. However, there is no obvious link between the fractional charge of the quarks and the strong force holding them together. In principle, there could be other particles with fractional charges and no strong force.
Physicists have been searching for free-floating, fractionally charged particles for decades. These particles, if they exist, might have eluded discovery by being too massive to be created in previous accelerators. Thus CMS scientists analyzed LHC data to see if its much higher energy is capable of producing heavy particles with fractional charge.
This analysis uses a very different technique from most studies at particle colliders. Usually, physicists measure the lifetime and energy of particles from the lengths, trajectories and curvatures of the tracks they leave behind as they swoop through the detector. To determine the strength of a particle’s charge, however, one must use the thickness as well, the boldness of the signature’s stroke.
In a bubble chamber like the ones used to look for free quarks in the 1960s, a fractionally charged particle would have produced fewer bubbles along its path in the fluid, drawing a weaker line. In a modern experiment like CMS, a fractionally charged particle would produce smaller electric signals as it passes through strips of silicon. CMS scientists had to find voltage blips nine times weaker than those from other particles.
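The factor of nine follows from basic ionization physics: the energy a charged particle deposits per unit length scales as the square of its charge. A minimal sketch, assuming only this quadratic scaling:

```python
# Ionization energy loss scales as the square of a particle's charge (dE/dx ∝ z²),
# so a charge of e/3 leaves a signal (1/3)² = 1/9 as strong as a unit charge does.
def relative_signal(charge_in_units_of_e):
    """Signal strength relative to a particle of charge ±1e."""
    return charge_in_units_of_e ** 2

print(relative_signal(1 / 3))  # one-ninth of the standard signal
print(relative_signal(2 / 3))  # four-ninths of the standard signal
```

The same scaling explains why a charge-2e/3 particle would be easier to spot than a charge-e/3 one: its silicon signal is four times larger.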
After a thorough search, the analyzers have concluded that, if free-floating, fractionally charged particles exist, their masses must be greater than a quarter of a TeV—heavier than the heaviest known particle—to avoid being created by the LHC.
Leptoquarks would be produced in pairs, and each would decay into a lepton (such as an electron) and a quark (which becomes a jet). This is one of the leptoquark-like events found in the CMS data set. There are too few like this to rule out standard physics explanations.
Last week’s Physics in a Nutshell described a hypothetical particle called a leptoquark. The leptoquark would have features of the far more familiar leptons and quarks, the way a platypus has features of ducks and beavers. If leptoquarks do exist, they would reveal a deep connection between these two fundamental classes of particles.
Physicists have been searching for leptoquarks for years, but have never found one. If they do exist, then they must have a higher mass than previous experiments were able to reach. Leptoquarks could also allow ordinary matter to spontaneously decay, something that has never been observed. If leptoquarks have a high mass, then fluctuations in ordinary matter would rarely reach it and decays would be too infrequent to have been noticed. Both of these considerations point to a high energy scale, so it’s worth looking for leptoquarks at the LHC, the highest-energy collider in the world.
CMS scientists searched through all of the data collected in 2011, which corresponds to about 500 trillion proton-proton collisions. They were looking for events in which a leptoquark and an anti-leptoquark were produced by the energy of the collision, each decaying into a lepton and a quark (or their antimatter equivalents). Some leptons, like the electron, leave a clean track through the CMS detector, while others, like the neutrino, are invisible and have to be inferred from an imbalance in the debris. A quark always produces a spray of particles known as a jet.
The search turned up a handful of events with these characteristics, but no more than would be expected from known physics processes. Therefore, this result set the most stringent limits yet on the mass of leptoquarks. CMS scientists are already hard at work examining the 2012 data, in which protons collide with a higher energy and therefore are capable of producing more massive leptoquarks, should they exist.
Why scour a mountain of data to search for a particle that might not exist? To paraphrase George Mallory, “because it could be there.”
Analyzing jets from a particle collision is something like analyzing the aftermath of the collision of two pocket-watches. CMS scientists recently analyzed collisions in which hundreds of particles seemed to be channeled into six streams.
It is sometimes said that if particle physicists wanted to figure out how an expensive Swiss watch works, they would smash it and deduce its structure from the cogs, springs and glass that fly apart. To understand the inner life of protons, this is exactly what they do—smash two of them together and analyze the aftermath. It is not as ridiculous as it sounds. The basic laws of energy and momentum conservation make a precision science of this violent practice.
Energy and momentum are both quantities that describe the inertia of an object or a system of particles. Energy is the sum of motion in all directions, while momentum is the net motion, depending on direction. These quantities are useful because they are constants—even the messiest explosion maintains a constant momentum because leftward-going particles exactly balance rightward-going particles. The energy of an explosion is also conserved, as long as the potential energy that caused the explosion is counted. Among subatomic decays, this initial energy is the mass of the decaying particle, according to E = mc² (energy equals mass times a large constant).
To analyze a collision, physicists add up the observed energy and momentum and solve both equations to deduce the masses of the particles that decayed before they could be observed directly. In a recent paper, CMS scientists applied an extreme version of this technique. They looked at events in which hundreds of visible particles seemed to be channeled in six streams, known as jets. Assuming that each jet was caused by a cascade of decays, starting from a single quark, they applied energy and momentum conservation to find out if triplets of quarks descended from an as-yet unknown particle.
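The bookkeeping can be sketched in a few lines. This is a toy illustration, not the CMS analysis code: each observed particle is an (E, px, py, pz) tuple in units where c = 1, and the unseen parent’s mass follows from M² = (ΣE)² − |Σp|².

```python
import math

def invariant_mass(particles):
    """Mass of an unseen parent, reconstructed from its observed decay products.
    Each particle is a tuple (E, px, py, pz), in units where c = 1."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two illustrative massless jets, back to back with 62.5 GeV each:
# the momenta cancel, so the parent's full 125 GeV shows up as mass.
jets = [(62.5, 62.5, 0.0, 0.0), (62.5, -62.5, 0.0, 0.0)]
print(invariant_mass(jets))  # 125.0
```

In a real event the jets fly in many directions, but the same two conservation laws apply: the vector sum of momenta and the scalar sum of energies together pin down the decaying particle’s mass.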
Although the search turned up negative (no new particles yet), the experimenters’ ability to deal with so many jets is impressive. There are 10 different ways to identify two triplets among six quarks. If all combinations are considered, the nine wrong ones would drown out the right one, weakening the sensitivity of the analysis. These scientists noticed that wrong combinations usually produce more energetic quarks than right ones, and used this correlation to sharpen the precision of their search. Not bad for smashing watches.
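The count of ten pairings can be checked directly. This short sketch enumerates the ways to split six jets into two unordered triplets:

```python
from itertools import combinations

# C(6,3) = 20 ordered choices of a triplet, halved because swapping the
# two triplets gives the same pairing: 20 / 2 = 10 distinct splits.
jets = range(6)
splits = set()
for triplet in combinations(jets, 3):
    rest = tuple(sorted(set(jets) - set(triplet)))
    splits.add(frozenset([triplet, rest]))
print(len(splits))  # 10
```

Only one of those ten splits matches the true parentage of the jets, which is why suppressing the nine wrong combinations matters so much for the sensitivity of the search.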
Supersymmetry, the notion that matter and forces are two sides of the same coin, is an elegant idea that could explain many of the mysteries of particle physics. Searching for it, however, is not an easy task because there are so many different ways it could manifest itself. As described in last week’s Physics in a Nutshell, if supersymmetry exists, it can only be an approximate symmetry. Each model of broken supersymmetry predicts a different pattern of particles, their masses and their decay signatures. In a particle physics experiment, broken supersymmetry could look like just about anything.
How would you search for something that could look like anything? Fortunately, models of broken supersymmetry share a few broad features. For one thing, all of the models predict new particles, the superpartners of the ones we know. These would decay into familiar particles because they maintain part of their identity as they decay—for instance, supersymmetric quarks, called squarks, would decay to quarks. In many models, the lightest supersymmetric particle is invisible, like dark matter, which shows up in a particle collision as an apparent imbalance in the particle debris.
In a recent paper, CMS scientists used these as criteria in a broad search for new physics. They looked for an excess of collision events with many high-energy particles in a lopsided pattern, as though an invisible particle carried away much of the energy. They found nothing new—all was in agreement with known physics.
This result has far-reaching consequences. It rules out many of the simplest models of supersymmetry, but as described in last week’s Nutshell, more subtle models lie beyond. It makes some headway into these non-minimal models because so few model-specific assumptions were made in the analysis. It’s even general enough to address some non-supersymmetric theories, so the paper’s authors present their result in a theory-neutral way.
Sometimes, the lack of an observation can be as exciting as a discovery. While this summer’s discovery of a new massive boson was a marvelous achievement, it will confirm a widely held expectation if it turns out to really be the Higgs boson. On the other hand, the simple models of supersymmetry ruled out by this search were also widely expected, and now they are refuted.
Many different kinds of Higgs bosons have been hypothesized over the years. Any one of these ideas could be correct—or none at all.
After decades of speculation, the Higgs boson was finally discovered on the fourth of July, 2012 – or was it? Despite the headlines, scientists claimed to have seen not a Higgs boson, but “a new particle” or “a new boson.” As it often happens in science, the eureka moment is an explosion of more questions than answers.
What the experiments actually revealed was an excess of certain types of events. More collisions produced pairs of photons, pairs of Z bosons or pairs of W bosons than would be expected in a world without a Higgs. The photon and the Z boson measurements were precise enough to show that they came from decays of a single particle with a mass of approximately 125 GeV (heavier than all known particles except the top quark). The photon, Z and W are all bosons, which are particles of force, as contrasted with fermions, which are particles of matter. Fermions attract or repel each other by tossing and catching bosons.
In the Standard Model, fermions and bosons both acquire their masses by pushing through the same Higgs field, but the Standard Model may be wrong. Perhaps only bosons interact with the Higgs field. If so, then the fermions would have to get their masses some other way.
A group of CMS physicists considered this possibility, known as a “fermiophobic” Higgs. In this scenario, all of the usual assumptions about how Higgs bosons are produced in collisions and how they decay have to be modified. The physicists were able to reinterpret some existing studies, but others had to be reanalyzed for the fermiophobic case. Their result using last year’s data doesn’t support the idea that a fermiophobic Higgs boson exists, though it is not completely conclusive. The CMS collaboration has released a preliminary result using more data.
This is just one way that the Higgs can be non-standard. Some models, such as those with supersymmetry, require at least five Higgs bosons with complex Higgs-to-Higgs transitions. Rather than a finish line, this summer’s discovery was the start of a new adventure.
As physicists dig deeper into the Energy Frontier, they find new particles through their decays into known ones.
Three thousand years ago, the inhabitants of ancient Crete used a writing system called Linear B. Archaeologists in the mid-20th century managed to decipher it because they had a working knowledge of Mycenaean, an early form of Greek. Linear A, an earlier and still-undeciphered script, may someday be understood through its relationship with Linear B.
Discoveries often make other discoveries possible, even in particle physics. Most particles created in proton collisions are not produced in the first instant, but through successive chains of decay. For instance, top quarks are far too short-lived to make it from the collision point to the first layer of the CMS detectors, even at speeds approaching the speed of light. A top quark immediately decays into a bottom quark and a W boson. The bottom quark lives long enough to travel a fraction of an inch, but the W boson instantly decays into two more quarks. The quarks form dozens of mesons and baryons, some of which are unstable while others live for 10 nanoseconds or more (a long time). Only these last survivors are directly observed.
Each step in this decay chain was once considered new physics. The discovery of each stage made the next one possible: As with Linear B, scientists could reconstruct the top quark because they knew the language of bottom quarks and W bosons.
CMS scientists are currently engaged in a new expedition to search for massive particles that decay into pairs of high-energy top quarks. It is particularly challenging because the many decay products of each top quark overlap one another in the detector. These scientists had to invent sophisticated algorithms for disentangling the fragments of each W boson and bottom quark—so that they could reconstruct energetic top quarks, so that they might reconstruct a new particle, Z-prime, decaying into them.
Each step brings us closer to understanding the origins and fundamental structure of the universe, but it relies on the accumulated wisdom of generations of physicists.
A droplet of quark-gluon fluid created from the partial collision of two lead nuclei. The droplet is taller than it is wide and expands asymmetrically.
Have you ever noticed how cooking oil flows on a hot pan? Unheated, the oil is thick and spreads slowly, but on a hot pan it splashes and almost looks like water. Physicists call this difference one of viscosity: cold oil has about ten times the viscosity of hot oil.
When lead nuclei collide in the LHC, they form an exotic fluid of quarks and gluons that flows like a liquid, except that it has no measurable viscosity.
All liquids and gases in everyday experience are made of molecules or atoms that swarm and collide with one another. The plasma in the sun is made of ions rather than whole atoms, which causes it to flow in strange and complicated ways. For a brief moment after the collision of two lead nuclei in the LHC, quarks and gluons mix as a fluid made of the smallest particles known.
To measure this fluid’s viscosity, scientists use a technique similar to splashing oil on a pan to see how fast it spreads. Cold oil tends to smear out in a circle regardless of how it is thrown, but hot oil splatters in the direction that has the greatest initial difference in pressure. Similarly, a quark-gluon fluid with initial pressure differences would spread out in a circle if it were viscous, but not if it were runny.
The droplet of quark-gluon fluid has these pressure differences because of the way that it is made. Most nuclei don’t collide head-on, but are a little bit offset when they hit each other. The collision takes a bite out of each nucleus as it flies by, and the part that collides and liquefies is shaped like the overlap region: taller than it is wide (see figure). This asymmetry causes the drop to burst outward on each side more than top-to-bottom.
In a recent paper, CMS scientists measured the shape of the flow of the quark-gluon fluid. It acts like a cohesive liquid, but one with zero or nearly zero viscosity. Superfluids such as helium have this strange property when they are cooled to nearly absolute zero, but the droplet of quark soup is hundreds of thousands of times hotter than the center of the sun, the temperature of the first microsecond of the Big Bang.
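One standard way that heavy-ion physicists quantify this lopsided flow is the second Fourier coefficient of the particles’ azimuthal angles, v₂ = ⟨cos 2(φ − Ψ)⟩, measured relative to the event plane Ψ. The paper summary above does not spell out the formula, so take this as a hedged toy simulation rather than the actual analysis: isotropic emission (a viscous, circular splash) gives v₂ near zero, while a sideways-bursting fluid gives v₂ > 0.

```python
import math
import random

def v2(phis, psi=0.0):
    """Elliptic-flow coefficient: the average of cos 2(phi - psi)."""
    return sum(math.cos(2 * (phi - psi)) for phi in phis) / len(phis)

def sample_phi(v2_true, rng):
    """Rejection-sample an angle from dN/dphi ∝ 1 + 2·v2·cos(2·phi)."""
    while True:
        phi = rng.uniform(-math.pi, math.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2_true) < 1.0 + 2.0 * v2_true * math.cos(2.0 * phi):
            return phi

rng = random.Random(42)
isotropic = [rng.uniform(-math.pi, math.pi) for _ in range(50000)]
elliptic = [sample_phi(0.10, rng) for _ in range(50000)]
print(f"isotropic v2 = {v2(isotropic):+.3f}")  # close to zero
print(f"elliptic  v2 = {v2(elliptic):+.3f}")   # close to the injected 0.10
```

A measured v₂ well above zero, as in the lead-lead data, is the signature of a fluid runny enough to convert its initial pressure asymmetry into an asymmetric burst.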
In the above reaction, a bottom quark (b) and a strange quark (s) transform into two muons (μ) by way of a loop involving a top quark (t), two W bosons (W), and a neutrino (ν). Part of this loop goes backward in time.
Many particles created in proton collisions are short-lived, with lifetimes measured in trillionths of a trillionth of a second. Scientists reconstruct them from the trajectories of their remnants. But some particles are even more transient than that. They live and die entirely within a reaction: their brief existence is just a step in the transformation of the colliding particles into their remnants.
As a bizarre consequence of quantum mechanics, these ephemeral particles can even travel backward in time. Normal particles trace lines through space and time that are always forward in the time direction. A particle in the midst of a collision can actually trace a loop, connecting its birth with its death, but only if it can’t be observed.
Consider the diagram above. Two quarks, bottom and strange, collide and result in two muons. To satisfy conservation laws, a top quark, two W bosons and a neutrino must be created and destroyed in a sequence that ends with the last one begetting the first. The fact that the collision had far too little energy to create the mass of a top quark or even a W boson does not make this reaction impossible—it only makes it rare.
With no hard limit on the mass of the participants, anything could be running around the loop, even undiscovered particles and particles that are too massive for the LHC to directly create. These strangers would be noticed by their effect on the probability of the reaction: if there are more particles willing to help a bottom and a strange quark become two muons, it could happen more often.
If only known particles contribute, this reaction would be extremely infrequent—only a few bottom-strange encounters per billion would result in a pair of muons. That just makes it easier to see the effect that new particles would have. It’s easier to see a flea riding an ant than a flea riding a horse.
CMS scientists are searching for this kind of reaction with ever-increasing precision. Recently, they published a measurement sensitive to eight parts per billion. While the reaction hasn’t been definitively observed yet, this year’s dataset should be large enough to see it even if there are no new particles. We are sure to see the ant; time will tell if it has a rider.
Scattering reveals structure: the path of a bowling ball would hardly be deflected by a pin, but the ball could even bounce backward if it hit something heavy and hard, like a lamppost.
One hundred years ago, Hans Geiger and Ernest Marsden puzzled over the apparatus on their workbench, arguably the first particle physics experiment in history. Geiger had earned his doctorate only three years earlier and Marsden was an undergraduate. In this experiment, a vial of radon gas directed a beam of charged particles at a tissue-thin leaf of gold. The particles hit the gold atoms and bounced off at random angles. The physicists counted the number of recoiling particles at each angle through a swiveling microscope. They were astounded to see them scattering everywhere, even backward.
The beam of particles was one of the most energetic known at the time, and it was expected to glance off of the gold atoms like a curling stone, not fly off like a hockey puck. Geiger, Marsden and their mentor Ernest Rutherford had discovered substructure within the atom: particles rebounded at such wild angles because they were hitting tiny, massive nuclei of protons and neutrons.
A century later, physicists around the world pore over data from the LHC. Beams of particles a million times more energetic than Geiger and Marsden’s collide head-on in the middle of a 40-foot detection system known as CMS. Some of these physicists, including students and postdocs at Fermilab, are looking for signs of substructure within quarks, just as quarks are within protons and neutrons, which are within nuclei, within atoms.
The majority of LHC collisions result in debris that scatters at small angles from the beams. But if quarks contain small, hard structures like the nucleus within the atom, then the highest-energy collisions would result in wide-angle scattering. In a recent paper, these physicists showed that there is no sign of substructure down to almost 10⁻²⁰ meters, which is 370,000 times smaller than a gold nucleus. If Geiger and Marsden had discovered a nucleus the size of a city block, these scientists are looking for amoebas crawling on the sidewalk.
Are quarks the smallest parts of the nucleus, or is there a deeper layer? Though the instruments of discovery have become more sophisticated, the curiosity remains the same.
When probed at smaller scales, protons seem to contain more and more quarks and gluons, represented here as colored blobs.
The LHC may help us discover new laws of physics, but it is already improving our understanding of the laws we thought we knew. Quantum Chromodynamics (QCD) is the force law that describes the attraction between quarks. It is believed to be correct because it made many predictions that were confirmed experimentally in the 1970s and 1980s. However, QCD is too mathematically complex to solve in general - only special cases have been precisely calculated.
One problem that has not been solved is the detailed structure of protons and neutrons, which together make up most of the everyday world. Both contain three primary quarks (up, up, down for protons and down, down, up for neutrons), a froth of quark-antiquark pairs appearing out of the vacuum and annihilating with each other and a swarm of gluons holding them all together. But how many? How strongly do they all stick together? These are hard questions to answer from the fundamental theory.
Experiments reveal this structure in protons by colliding them and observing how much energy is released at different angles. Higher-energy collisions are privy to finer detail, like microscopes with higher resolving power, because they are quick enough to get a snapshot of the short-lived quark-antiquark pairs and gluons before they fluctuate away. The LHC provides the sharpest image yet.
The experiments show that the closer you look, the more of these quarks and gluons you see, especially gluons. In fact, there are so many at the scales probed by the LHC that sometimes more than two of them collide when two protons pass through each other. In a recent paper, CMS physicists demonstrate that their measurements are more consistent with multiple quark/gluon interactions than any other known process.
This has many implications. It may indicate a transition to a new regime of QCD, where the density of quarks and gluons saturates - that is, they become so dense that they start overlapping and interacting in new ways. These effects are interesting in their own right, but they are also necessary to predict probabilities for creating new known and unknown particles.
When a proton collision creates a W or Z boson and jets, it is about five times less likely to make two jets than one, and another five times less likely to make three jets than two, and so on...
About a hundred million pairs of protons collide in the LHC each second. The rate of collisions must be so high for two reasons: to improve the chances of seeing something rare and to allow patterns to emerge out of the noise.
Each collision results in a unique splatter of debris, and no one can predict what will result from any individual impact. Some collisions produce photons - single-particle flashes of light. Some make heavy relatives of photons known as W and Z bosons. Many collisions create narrow streams of particles known as jets.
If we group collision events by type, they begin to form patterns. For example, consider the collisions that result in a W or Z boson and one or more jets. Collisions producing one jet are more common than those that make two jets, which are more common than those that make three and so on. Just as in bird-watching, we cannot predict what the next collision will bring, but Canada Geese are more common than Cackling Geese.
With enough data, the pattern becomes sharp. A recent CMS paper presents an analysis of the rates of these collisions. The scientists showed, with unprecedented precision, that each additional jet makes the collision type more rare by the same factor, approximately a factor of five. That is, for every 25,000 cases with one jet, there were about 5,000 cases with two jets, about 1,000 cases with three jets and about 200 with four jets.
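The staircase pattern described above is a geometric falloff: the count drops by the same factor for each extra jet. A minimal sketch in Python, using the factor of five and the counts quoted in the text:

```python
# "Staircase scaling": each additional jet suppresses the rate by
# roughly the same factor (about five, per the CMS measurement).
factor = 5.0
one_jet = 25_000  # example count of W/Z + one-jet events from the text

counts = [one_jet / factor**n for n in range(4)]
for njets, count in enumerate(counts, start=1):
    print(f"{njets} jet(s): ~{count:,.0f} events")
```

Dividing by five at each step reproduces the quoted sequence: 25,000, 5,000, 1,000, 200.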
Patterns like this reveal the inner world of quarks and gluons. The standard theory of the force that binds them dictates that an energetic quark or gluon is about as likely to appear as any other energetic quark or gluon. These particles then split up into jets like the ones observed in CMS. By counting jets, the CMS scientists showed that energetic quarks and gluons do seem to be produced independently.
This measurement is also important for searches for new phenomena. Many theoretical particles are expected to disintegrate into W bosons, Z bosons and jets, so they would look just like the cases studied here. In fact, the CDF collaboration observed an unexpected excess of cases with a W boson and two jets. Without a precise understanding of how many jets to expect at the LHC, a similar excess would be inconclusive. This measurement is a substantial improvement in our understanding and puts us in a much better position as we push our searches into the unknown realm of discovery.
This is the proton collision data delivered by the LHC to CMS. The red triangles indicate how much data was delivered up to a given day in 2011.
Datasets are the currency of physics. As data accumulate, measurement uncertainty ranges narrow, which increases the potential of discoveries and makes non-observations more stringent, with more far-reaching consequences. In collider physics, the amount of data is measured by the total number of collisions observed and the rate of those collisions, or the luminosity.
In 2011, the LHC produced more collisions than scientists dared to expect, breaking the world record luminosity in April and then continuing to grow seven-fold. By the end of the proton collision run in November, 240 million pairs of protons were colliding each second. Top quarks, once considered rare, were produced at a rate of one pair every two seconds. This provided CMS scientists with enough data to measure known processes with unprecedented precision, improving our understanding of the way protons collide and sharpening theoretical predictions of new phenomena.
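The quoted top-quark rate follows from the standard relation: event rate equals cross section times instantaneous luminosity. A rough Python check, assuming a top-pair cross section of about 165 picobarns at 7 TeV and a peak luminosity of about 3e33 per square centimeter per second (typical 2011 figures; both numbers are assumptions, not from the text):

```python
# Event rate = cross section * instantaneous luminosity.
PB_TO_CM2 = 1e-36              # one picobarn in cm^2
sigma_ttbar = 165 * PB_TO_CM2  # assumed top-pair cross section at 7 TeV
luminosity = 3e33              # assumed peak luminosity in cm^-2 s^-1

rate = sigma_ttbar * luminosity  # top-quark pairs per second
print(f"~{rate:.2f} top pairs per second, one every {1/rate:.1f} seconds")
```

With these assumed inputs, the rate comes out near one pair every two seconds, consistent with the figure above.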
One of the most pervasive ideas in theoretical physics is supersymmetry. This is the idea that matter and forces were unified in the early universe. Although this symmetry principle may appear in many guises, CMS scientists cast a wide net of search strategies to look for signs of supersymmetric particles. None were observed, and this means that many models that had been plausible in 2010 are now ruled out. Supersymmetry is still possible, but it has less wiggle room than it did before.
Many other models of new phenomena were studied as well: extra dimensions, substructure within familiar quarks and leptons, excited variants, new generations, conglomerates of these particles, new forces, new long-lived or stable particles, and even microscopic black holes. In each case, the fact that these phenomena did not appear rolls back the scope of speculation with hard facts. Curled-up extra dimensions may exist, but if so, they must be curled 2.5 times smaller than previously supposed.
The last four weeks of the year were dedicated to collisions of lead ions and physics of an entirely different character. When lead ions collide, so many quarks and gluons are produced that the fireball forms a fluid like the one that existed in the early universe. The December 2010 data revealed that this fluid is lumpier and more asymmetric than expected, that it absorbs even very energetic quarks and gluons, and that it melts mesons. It is too soon to say what will be found in this year’s lead ion dataset, which is 16 times larger.
The most dramatic revelation of 2011 concerns the Higgs boson, the as-yet undiscovered cornerstone of the Standard Model of particle physics. Hundreds of CMS scientists have pooled their data to show that the Standard Model Higgs, if it exists, can only have a mass in the narrow range between 115 and 127 GeV. This could mean that the Higgs is hiding within this range, or that the real Higgs boson is not a Standard Model Higgs but has exotic properties, or that it simply does not exist. LHC scientists anticipate a large enough dataset in 2012 to make a definitive statement. Regardless of the outcome, the stage is set for an exciting year.
We would like to thank the LHC accelerator staff for a highly successful physics run, and wish everyone a restful break and a happy new year!
Z bosons are identified by the particles that they decay into (μ+ and μ− in this case). Then, the Z bosons are used to study the collision that made them.
All of the matter and energy of our everyday experience is made of only five basic particles: electrons, up and down quarks, gluons to glue the quarks together and photons, which are particles of light. This is just a corner of the particle landscape as we currently know it: including these five, 29 different types of fundamental particles and antiparticles are routinely produced in high-energy collisions. Some of the others are antimatter, some immediately decay, and some interact so weakly with normal matter that they disappear from view, leaving those that make up our material world.
One of these unfamiliar particles is the Z boson, a high-energy analogue of the photon. Whereas a photon is massless and stable, a Z boson is very heavy and decays within a trillionth of a trillionth of a second. Yet the two are twin aspects of the same electroweak force. Processes that produce photons also produce Z bosons, if they have enough energy to create the Z particles’ large mass.
The high energy of collisions in the LHC is sufficient to produce Z bosons in abundance. Compared to the handful of Z particles observed at the time of their discovery in 1983, the LHC at peak luminosity produces millions of them per day. The flashes of impact in the LHC are approximately as bright in Z-light as they are in ordinary light.
In a recent paper, CMS scientists analyzed the angular distribution and energy spectrum of Z bosons emerging from these collisions. These distributions provide a view of the first instant of collision since Z bosons are unaffected by all of the other particles in the collision debris.
If this sounds familiar, there’s a good reason. An article two months ago presented the same kind of study using photons instead of Z bosons. The energy spectrum, or color, of the light emerging from the collisions provides an early snapshot of the interaction because light also streams out unaffected by the other particles.
Thirty years ago, the Z boson was an exotic, theoretical particle. Today, it is a new pair of eyes to reach deeper into the microcosm.
The magenta arrows indicate the paths of the protons before collision, the four red lines are particles resulting from direct quark or gluon annihilation, while much of the yellow and blue are fragments of the protons that missed each other.
In his freshman physics lectures, Richard Feynman compared the principle of energy conservation to a child playing with blocks. At the end of the day, the child seemed to have fewer blocks than he was given at the beginning of the day—until his mother looked hard enough and found them hidden under the bed. Similarly, we believe that energy cannot be created or destroyed because whenever some energy appears to be missing, we eventually find it hiding in another form.
When the LHC collides protons, the resulting energy forms new particles. The new particles are usually known types, like W and Z bosons, though new types, such as Higgs bosons, would also be produced if they exist. However, only a fraction of the collision energy produces particles in this way, and the fraction varies from one event to another—many collisions convert only a tenth or a hundredth of the impact energy into new particles, while some rare events convert nearly all of it.
Where is the energy hiding? The key is that protons are composed of more fundamental particles—quarks and gluons—and it is these that collide. When two protons pass through each other, usually only one quark or gluon from each proton annihilates to make new particles. The rest of the fragments are deflected and carry most of the energy as a spray of particles close to the beamline.
In every collision event, the shrapnel from these near-misses overlaps the products of the direct impact. This complicates the data, but physicists are able to untangle them. The near-misses are interesting in their own right: a recent CMS paper presents a study of the number and energy of these particles. A thorough understanding of the quarks that miss improves the accuracy of identifying the quarks that hit, as well as the exact points in space where each collision took place. It also yields deeper insight into the inner structure of protons, which are perhaps the most familiar and yet most complex particles in physics.
The photons measured by CMS differ from visible light only in energy: light that we see has 2-3 electron-volts per photon, while photons from the LHC have billions of electron-volts.
When protons collide in the LHC, they sometimes emit a flash of light. Light is made of particles called photons, and the energy of the photons determines the color of the light. Red light is approximately 2 electron volts (eV) while blue light is about 3 eV, with all visible colors in between. Photons that are more energetic than blue light are usually called ultraviolet (around 100 eV), X-rays (around 10 thousand eV) and gamma rays (100 thousand eV or more). Many of the photons produced in LHC collisions carry 30 to 300 billion electron volts (GeV).
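The energy-to-color correspondence here is the Planck relation: wavelength equals hc divided by energy. A quick conversion sketch in Python, using hc of about 1240 eV·nm:

```python
# Convert a photon's energy (in eV) to its wavelength (in nm)
# via the Planck relation: wavelength = hc / energy.
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def wavelength_nm(energy_ev):
    return HC_EV_NM / energy_ev

for energy_ev, label in [(2.0, "red light"), (3.0, "blue light"),
                         (30e9, "typical LHC photon")]:
    print(f"{energy_ev:g} eV ({label}): {wavelength_nm(energy_ev):.3g} nm")
```

A 2 eV photon comes out near 620 nm (red) and a 3 eV photon near 413 nm (blue), while a 30 GeV photon has a wavelength billions of times shorter than anything the eye can see.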
CMS scientists recently measured the color of LHC collisions. That is, they measured the energy of each photon as it emerged from the collision debris and plotted the distribution of all of them. The figure above shows what that distribution looks like: Most of the measured photons have an energy of 30 GeV, with fewer and fewer at higher energies. One could say that the collisions are reddish, if the whole distribution were scaled down to the visible range.
Why is this interesting? Photons are unaffected by all of the other particles in the collision debris, so they let us see the first instant of collision. Most of the particles in the debris are hadrons such as pions, kaons, etas and rhos—the so-called particle zoo. Hadrons are made from coalescing quarks and gluons. This process changes their trajectories after the collision and hides their initial energies. Photons from the initial collision, however, stream directly into the CMS electromagnetic calorimeter to be measured. These measurements quantify the accuracy of current quark collision models, which are used in nearly all interpretations of LHC data.
The photons studied in this analysis are also the background noise that scientists need to sift through to search for a Higgs boson, assuming that it is hiding in the parameter range that is not yet ruled out. A Higgs boson can decay into two photons with a very narrow range of energy, like a pure but faint color in a ruddy mix of hues.
Since the bottom quarks (b and b̄) inside the Upsilon 1S (left) are held together more tightly than in the Upsilon 3S (right), the Upsilon 1S is less likely to fall apart when clobbered by quarks and gluons in a hot plasma.
There have been some hot days this summer; not quite hot enough to cook an egg on the sidewalk, but hot enough to bake cookies in a car. But we don’t need a thermometer to quantify temperature. Besides baking in our cars, we can note what melts and what doesn’t. When nuclei collide in the LHC, the temperatures produced are hot enough to melt subatomic particles called mesons. They are a million times hotter than the center of the sun, just as the center of the sun is many thousands of times hotter than summer in Chicago.
The upsilon 1S, 2S and 3S are a family of mesons, all of which are made of a bottom quark and antiquark, bound by a stream of gluons. In an analysis of lead-nucleus collisions, CMS scientists saw far fewer upsilon 2S and 3S mesons relative to upsilon 1S. This is what we expect if the temperature were hot enough to melt them.
For one month toward the end of each year, the LHC collides lead nuclei instead of protons. Lead nuclei are made of more than 200 protons and neutrons; when they collide, there are so many interactions in a small region of space that the debris mixes and becomes a fluid, called quark-gluon plasma. If the plasma is hot enough, the whirlwind of quarks and gluons can knock apart the bottom and antibottom quarks of an upsilon meson. This is called melting, by analogy with ice: molecules of hot air can knock apart the bonds holding water molecules together in ice. However, upsilon mesons are much smaller and harder to break up than ice.
The CMS analysis revealed that 60 percent of upsilon 1S mesons and almost 90 percent of upsilon 2S and 3S seem to disappear when immersed in the hot aftermath of lead nucleus collisions. The fact that upsilon 2S and 3S are more suppressed is the most interesting part: scientists believe that it is because the 2S and 3S are more loosely bound than the 1S. Because the bottom quarks of the upsilon 2S and 3S dangle more loosely, less force is needed to knock them apart.
This effect introduces a new way to study quark-gluon plasmas—a new state of matter so hot and so short-lived that subatomic particles are the only thermometer.
What looks like a one-dimensional world to a large creature might actually be two-dimensional at smaller scales.
One of the goals of the LHC is to search for evidence of extra dimensions— spatial directions such as length, breadth and height, but curled or curved in a way that hides their existence in everyday life. In a recent paper, CMS scientists presented new bounds on such a scenario.
Extra dimensions could hide from our perception by curling into tiny loops. A particle that drifts a microscopic distance in this new direction would circumnavigate the loop and arrive at its starting point almost immediately. The universe might have several microscopic dimensions in addition to the three large dimensions that we know well.
Collisions in the LHC could reveal new dimensions by creating particles that are small enough to fit inside them. High-energy colliders are often called microscopes because the energetic particles they make fill less space than low-energy particles, which allows them to probe the smallest structures in nature. If extra dimensions exist, then some energy would seem to disappear from view as particles traverse the tiny loops. In the CMS detector, this would look like an energy imbalance in the highest-energy collisions.
In this analysis, the scientists studied collisions resulting in a large energy imbalance. Their observations are consistent with the conventional three dimensions, so if a fourth, fifth or sixth dimension does exist, then it must be curled up even smaller than previously hypothesized. For example, if there are six new dimensions, then they can be no larger than about 20 femtometers, which is as small as an atomic nucleus. This is half the previously allowed size.
In a nice feat of recycling, the same experimental measurement applies to a completely different theory. Hypothetical particles without a well-defined mass, called unparticles, would produce similar patterns in the collision debris. Wasting not, the authors tightened the world’s knowledge about unparticles as well.
The top quark is the most massive particle in the Standard Model, and might be pointing to what lies beyond.
Regular readers of Fermilab Today may be familiar with the top quark. For 15 years after its discovery in 1995, top quarks could only be produced by the Fermilab Tevatron. This changed with last year’s start-up of the LHC, when scientists saw top quarks in the CMS detector.
The top quark is, in a sense, at the extreme edge of the Standard Model. It is by far the most massive fundamental particle known to exist. Its enormous mass may be key to understanding the mystery of why particles have any mass at all— whatever that reason is, it most strongly manifests in top quarks.
Today’s featured CMS result is a measurement of the top quark’s mass and production rate at the LHC. Unlike any other quark, top quarks decay rapidly into a W boson and a b quark. Each of these can decay many different ways, giving scientists a choice in how to search for it. In this paper, scientists look for the following pattern: top and anti-top quarks produced in pairs, which together decay into two W bosons and two b quark jets, with both W bosons decaying into a lepton (electron e or muon μ) and a neutrino. Though the neutrino escapes undetected, it makes its presence known by shifting the balance of the other particles. Everything is observed at once in the detector, making these events very complicated to study.
Precision measurements of known particles provide important clues about the physics beyond them. Before the top quark was discovered, precision measurements of the W and Z bosons predicted the top quark’s mass fairly accurately. Similarly, the W, Z, and top quark properties together imply that the as-yet undiscovered Higgs boson has a mass just beyond the current searches. Indeed, the rocky shores of this distant outpost are ideal for disembarking into the unknown.
A proton contains two up quarks, a down quark and a soup of quark-antiquark pairs, seething below the surface.
It is often said that a proton is made of three quarks: two of the same type, called up quarks, and one of a different type called a down quark. But that’s not the whole story. In the space between these three stable quarks there is a boiling soup of quark–antiquark pairs. That is, a quark and an antimatter quark spontaneously come into existence, drift a while, and then recombine, destroying one another. This happens all the time— in every proton in every atom of every cell of our bodies, and in all of the matter in the universe.
When two protons collide in the LHC, most of the individual quarks miss each other. Often only one quark or antiquark from each proton collides directly. When an up quark collides with an anti-down quark, the two can combine to form a W+ boson; similarly, a down quark and an anti-up quark can combine to form a W− boson. In both cases, an antiquark is involved. Thus, each of the millions of W bosons produced at the LHC must come from at least one of these transient particles, caught before it had a chance to sink back into the soup.
CMS scientists recently measured the ratio of W+ to W− production in proton collisions at the LHC. The number of W+ bosons exceeds the number of W− bosons by about 40 percent, partly because each proton has two stable up quarks for every stable down quark. However, the exact ratio also depends on the density of the quark-antiquark soup.
Counting W+ and W− bosons yields new insight into the dynamic structure of protons, which is too complicated to compute from first principles with current techniques. It also informs predictions of new physics: The rate at which hypothetical particles would be produced depends on the density of quark-antiquark pairs, for the same reason that the W boson rate does. It is important to know the thickness of this soup when imagining what else might spring from it.
A schematic of the sort of collision debris that would hint at supersymmetry: a muon and an electron accompanied by jets and missing energy (invisible particles inferred from the lopsidedness of the rest of the debris). The vast majority of proton collisions produce only jets.
If you smash two protons together what would come out in the debris? In 99.9999 percent of the collisions, the result would be nothing but quarks and gluons, each of which then becomes a jet — a spray of particles made of more quarks and gluons. Though these complex events contain many particles, less than half of the known fundamental particles are represented.
Leptons such as electrons, muons and invisible neutrinos only emerge in the remaining 0.0001 percent of cases. However, detectors such as CMS are designed to be especially sensitive to leptons. They can even see invisible leptons by measuring missing energy, the lopsidedness of all the rest of the debris.
Many theories of new physics, some incorporating the supersymmetry principle, result in more balanced event pictures such as the one shown above. This is not true of all supersymmetric theories, but considering the Standard Model’s million-to-one preference for jets over leptons, searching for events with one or two leptons eliminates huge backgrounds from interactions that are already understood. Squarks, sleptons, and other supersymmetric variants of known particles could lead to events containing a little of everything: jets, leptons and missing energy.
CMS physicists recently searched for this kind of particle signature: two or more electrons or muons, two or more jets, and missing energy. Despite the enormous number of proton collisions in the 2010 dataset, the Standard Model predicts that only about one of them would fit such a strikingly balanced menu. One event was found, confirming the old theory and ruling out some, but not all, supersymmetric models. As the 2011 data pour in, CMS physicists are using this technique to uncover yet more elusive theories.
A photon, a particle of light, has no mass. A Z boson has most of the same properties as a photon, but it is very massive. Even heavier photon-like particles could exist, but a new result from CMS sets a lower limit on their mass: at least 12.5 times the mass of the Z boson.
You may have heard that when matter and antimatter collide, they annihilate into pure energy. That is, when a negatively charged electron encounters its positively charged twin, the positron, the two form a new particle with no charge and twice the energy. The reverse is true as well: a particle of pure energy can decay into an electron and a positron, or a muon and an antimuon, or any other pair of opposites. The lightest particle of pure energy is the photon - light itself - but heavier variants have been discovered in colliders. The heaviest is the Z boson. There are many reasons to believe that more are waiting to be found.
CMS recently presented results on a search for a new particle, dubbed Z’ (pronounced Z prime), which would decay into an electron and a positron or a muon and an antimuon. If any such particle exists, it would have to be more than twelve and a half times heavier than the Z boson.
Though CMS physicists were searching for a deceptively simple signal - two oppositely charged particles - this result is an impressive demonstration of the precision of the CMS detector. When a super-heavy Z’ decays, its mass becomes the energy of motion of the two charged particles, making them the fastest electrons and muons ever produced in a laboratory. As these electrons and muons zip through the detectors, it is difficult to get a precise measurement of their energy. But despite the difficulties, the CMS detector performed beautifully, yielding the most definitive test to date.
There are as many ways to interpret this result as there are theories that predict new particles of pure energy. Some are related to the unification of all the forces in the Standard Model, others are inspired by superstrings. Still others are related to warped extra dimensions. As the size of the LHC dataset and collision energy increase, so will sensitivity to even heavier light. The next result could be a discovery - we are in uncharted territory.
Collision events in the figure above are arranged according to degree of momentum imbalance: balanced events pile up on the left while unbalanced events are put on the right. One of the supersymmetric models that CMS ruled out is shown in yellow: if this model were true, the physicists would have observed twice as many unbalanced events (See figure 2 in the paper for details).
What’s so super about supersymmetry? A lot of things, actually. Originally proposed to relate the particles that make up matter (fermions) to the particles that make up forces (bosons), the theory would also solve many other mysteries.
Supersymmetry predicts an energy scale at which fundamental forces combine (grand unification). It also explains why the Higgs boson mass would need to be much lower than this energy scale for the Higgs mechanism (the mechanism by which particles obtain mass) to work. Additionally, it provides a natural candidate for dark matter, the substance scientists believe is holding the universe together. As a principle of nature, supersymmetry seems to have everything going for it except evidence.
That’s why searches for supersymmetry are among the most anticipated experiments that physicists will perform at the LHC. Though the veil will be drawn back gradually as the dataset grows, the first results on supersymmetry from the LHC were recently posted by CMS.
There are many ways to search for supersymmetry; this week’s featured CMS result looks for two generic features in the aftermath of the proton collisions: unusually energetic jets of debris and invisible particles. The collaboration was able to rule out one model of supersymmetry.
The invisible particle that CMS physicists are searching for is the dark matter candidate predicted by supersymmetry, which is undetectable by definition. Supersymmetric interactions would be incredibly rare, too, so this search would be like looking for a needle in a haystack, except that the needle is invisible.
Fortunately, physicists have a way of distinguishing events with invisible particles from events with only visible particles. Debris from any collision must be balanced in momentum— the speed and mass of debris flying out on one side must be balanced by the speed and mass of debris on the other. An unseen particle would look like a momentum imbalance because we don’t see the particle that balances the momentum. If we arrange the data by the degree of imbalance, putting the hundreds of thousands of well-balanced events on the left and the handful of unbalanced events on the right, we get the plot shown above. The few unbalanced events on the right do contain invisible particles, but only neutrinos, which are a well-known part of the Standard Model. Some supersymmetric models are ruled out by these data, while many more await the additional techniques and larger datasets that are on their way.
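The momentum-balance bookkeeping described above can be sketched in a few lines: sum the transverse momenta of everything visible, and whatever is needed to cancel that sum is attributed to unseen particles. A toy illustration in Python (the particle momenta are invented purely for the example):

```python
import math

# Toy event: transverse momentum vectors (px, py), in GeV, of the
# visible collision debris. These numbers are invented for illustration.
visible = [(50.0, 10.0), (-20.0, 30.0), (-5.0, -15.0)]

# The missing transverse momentum is the negative of the visible vector
# sum: whatever cancels the visible debris is ascribed to unseen particles.
px = -sum(p[0] for p in visible)
py = -sum(p[1] for p in visible)
met = math.hypot(px, py)
print(f"momentum imbalance: {met:.1f} GeV")
```

A large imbalance could signal a neutrino or, in supersymmetric models, a dark matter candidate; a well-balanced event gives a value near zero.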