Hyped Signal of Decaying Dark Matter Vanishes in Updated Analysis

In 2014, scientists observed X-ray activity from distant galaxies that was thought to be the first evidence of dark matter decay — a landmark discovery that could significantly advance efforts to characterize this puzzling substance. However, a new study from the Flatiron Institute and collaborators suggests that imperfect analysis methods used to detect the activity — called the 3.5 keV line — likely produced a phantom signal.

Two views of the Perseus galaxy cluster — one of the original sites thought to exhibit a 3.5 keV line — captured by the XMM-Newton and Chandra telescopes. Chandra: NASA/CXC/SAO/E. Bulbul et al.; XMM-Newton: ESA

In 2014, astrophysicists caught a glimpse of what they thought was their white whale: evidence of the nature of the mysterious and elusive dark matter that makes up 85 percent of the universe’s material. They spotted X-ray activity thought to result from decaying dark matter, as typical matter would not have been able to produce such a signal. With this exciting discovery, a window seemed to have finally opened into dark matter’s secrets.

The problem, however, is that according to new research, the signal (called the 3.5 keV line) probably never existed in the first place. By re-creating the original studies’ analysis techniques and applying new, more comprehensive tools, a team of astrophysicists concluded that the 3.5 keV line originally arose from flaws in data analysis. The team reports their findings in the April 1 issue of The Astrophysical Journal.

“This is an important result because we’re showing that these previous methodologies used to study dark matter decay may not be optimal and could be giving spurious results,” says study lead author Christopher Dessert, a postdoctoral fellow at the Flatiron Institute’s Center for Computational Astrophysics and New York University.

Dessert co-authored the study with Benjamin Safdi and Yujin Park of the University of California, Berkeley and Lawrence Berkeley National Laboratory, as well as Joshua Foster of the Massachusetts Institute of Technology.

The Case for the 3.5 keV Line

It was while scouring the cosmos in 2014 that both the XMM-Newton and Chandra space telescopes picked up a surprising signal of X-rays, each with around 3.5 kiloelectronvolts (keV) of energy, from multiple galaxies and galaxy clusters. This activity caught scientists’ attention because it didn’t seem to come from the usual suspects that produce X-ray energy, such as ionized gases.
“There are ways those lines can show up which can be explained by standard astrophysics, but none explained why this would be showing up at 3.5 keV specifically,” says Dessert. “So it seemed it really could be coming from dark matter decay.”

And if it was, the implications were profound. It would be the first concrete evidence of dark matter and could usher in a new era of astrophysics. As theoretical physicist Neal Weiner of New York University told Wired in 2014: “We may in the future say this was when dark matter was discovered.”

Detecting dark matter is difficult because although it makes up much of the universe, it is completely invisible. It doesn’t interact with light and can only be observed through its gravitational effects on its surroundings. One way to look for it, however, is through dark matter decay, in which dark matter is thought to spontaneously turn into other types of particles that we can measure.

We know that dark matter has existed for almost all of the universe’s 13.8-billion-year history. So, if dark matter decays, it must decay extremely slowly; otherwise, little of it would remain today.
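A back-of-the-envelope sketch makes the point, assuming simple exponential decay. The 10^27-second lifetime used here is an illustrative assumption, not a figure from the study:

```python
import math

AGE_OF_UNIVERSE_S = 13.8e9 * 3.156e7   # 13.8 billion years, in seconds
LIFETIME_S = 1e27                       # illustrative decay lifetime (assumption)

# For exponential decay, the fraction of dark matter lost after time t
# is 1 - exp(-t / lifetime).
fraction_decayed = 1.0 - math.exp(-AGE_OF_UNIVERSE_S / LIFETIME_S)
print(f"fraction decayed over cosmic history: {fraction_decayed:.2e}")
# prints: fraction decayed over cosmic history: 4.36e-10
```

Even at that glacial rate, less than one part in a billion of the dark matter has decayed since the Big Bang, which is why searches must target regions with enormous amounts of it.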

Luckily, dark matter makes up around 85 percent of the universe’s mass, so there’s still plenty of opportunity to see this decay. To try to catch the process in action, scientists have been focusing on areas with especially high concentrations of dark matter: galaxies and galaxy clusters. This is what XMM-Newton and Chandra set their sights on in 2014.

“There’s enough dark matter there that even if it decays extremely slowly, we can catch those decays, which can show up as ‘lines,’ or specific energies in X-rays,” Dessert says.
And so, with repeated detections across galaxies, the 3.5 keV line was the field’s new hot topic. But the story wasn’t over.

Lucy Reading-Ikkanda/Simons Foundation

Diving Back Into the Data

Why dispute the earlier findings? When data from the Milky Way didn’t show evidence for the 3.5 keV line, Dessert’s team suspected something was amiss.

“If the 3.5 keV line were coming from dark matter decay, then it would be strongest in the dark matter lining the Milky Way as opposed to other galaxies where people were previously looking,” because the Milky Way dark matter is so much closer to Earth, he says. “When we didn’t see a signal there, we started to think maybe something had gone wrong in the original data analysis.”

The scientists reexamined all the original evidence for the line (six datasets from XMM-Newton and Chandra), recapitulating the original studies’ analysis as closely as possible.

They first repeated the analyses in a wide energy range (looking at a broad spectrum of energies around 3.5 keV, as the original studies did). They then gradually narrowed this window, zeroing in on 3.5 keV.

“We did this because we thought that earlier studies were analyzing in too wide an energy range and might be picking up on X-rays that were from something other than the putative line,” says Dessert.
Sure enough, as they approached 3.5 keV, evidence for the line diminished — a strong indicator that the line didn’t exist.
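The window-shrinking idea can be sketched with mock data containing no line at all. Everything here is an illustrative assumption, not the study’s actual model: a power-law continuum, a linear background fit, and a fixed-width Gaussian template for the putative line.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock X-ray spectrum: a smooth continuum with noise and NO 3.5 keV line.
E = np.linspace(2.0, 5.0, 600)                    # energies in keV
counts = 100.0 * E**-1.5 + rng.normal(0.0, 1.0, E.size)

def fit_line_amplitude(E, y, center=3.5, sigma=0.05, half_width=1.0):
    """Fit a linear background plus a Gaussian line at `center`, using only
    data within `half_width` of the line; return the best-fit line amplitude."""
    m = np.abs(E - center) < half_width
    x = E[m]
    # Design matrix: constant term, slope, and the Gaussian line template.
    A = np.column_stack([np.ones_like(x), x,
                         np.exp(-0.5 * ((x - center) / sigma) ** 2)])
    coef, *_ = np.linalg.lstsq(A, y[m], rcond=None)
    return coef[2]                                # fitted Gaussian amplitude

for hw in (1.0, 0.5, 0.25, 0.1):
    amp = fit_line_amplitude(E, counts, half_width=hw)
    print(f"window ±{hw:.2f} keV -> fitted line amplitude {amp:+.2f}")
```

In a wide window, the simple background model cannot track the continuum’s curvature, and the fit can misattribute the leftover residuals to a spurious line; in a narrow window, the background approximation holds and the fitted amplitude falls toward the noise level, mirroring what the team saw as they zeroed in on 3.5 keV.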

In fact, even analyzing in the wider energy windows yielded no evidence of the line in any dataset but one. Dessert’s team suspects the original findings resulted from a reliance on algorithms called local optimizers. Optimizers search for the parameters that best fit a model to the data. For studies like this one, with many parameters to account for, a local optimizer can settle on a poor fit. Because of this, Dessert’s team instead fit the data with global optimizers, which search the parameter space more thoroughly and give more accurate results.
“A global optimizer looks at more parameters to find those that match the data best, whereas a local optimizer might get stuck and decide it’s done when it hasn’t really found the right options yet,” says Dessert.

Local optimizers were simply more standard practice at the time of the original studies, Dessert notes. However, global optimizers are better suited to this kind of work, and they turn up no evidence for a line, he says.
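The difference can be seen in a toy one-dimensional fit. This is a schematic illustration only, not the team’s pipeline: the double-well “loss” function, the gradient-descent local optimizer, and the brute-force grid search standing in for a global optimizer are all assumptions for demonstration.

```python
import numpy as np

def loss(x):
    """Toy measure of fit quality with two minima: a shallow one near
    x = +1 and a deeper (global) one near x = -1."""
    return (x**2 - 1.0)**2 + 0.3 * x

def grad(x):
    """Derivative of the toy loss."""
    return 4.0 * x * (x**2 - 1.0) + 0.3

# Local optimizer: gradient descent from a starting guess of x = 0.9.
x = 0.9
for _ in range(500):
    x -= 0.05 * grad(x)
x_local = x                              # stuck in the shallow minimum near +1

# Global strategy: scan the whole parameter range before settling.
grid = np.linspace(-2.0, 2.0, 4001)
x_global = grid[np.argmin(loss(grid))]   # finds the deeper minimum near -1

print(f"local optimizer: x = {x_local:+.2f}, loss = {loss(x_local):+.3f}")
print(f"global search:   x = {x_global:+.2f}, loss = {loss(x_global):+.3f}")
```

The local optimizer declares victory at the nearest minimum and never crosses the barrier between the wells, while the global search compares candidates across the whole range before choosing, which is the behavior Dessert describes.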

Taken together, the results paint a sobering picture of a chase after a red herring. Or, as Dessert puts it, “We were trying to understand this line, but there was never a line to be understood at all.”

Shifting Course in the Search for Dark Matter Decay

“Going forward, our work shows that scientists looking for dark matter decay should use smaller energy windows, but if not, global optimizers are necessary to find correct results,” Dessert says.
This is especially critical because XMM-Newton and Chandra are about to retire after 25 years of roaming the sky, and their high-tech successors will carry on the hunt for dark matter decay.

“We just launched a new telescope called XRISM, and there will be another called Athena in the 2030s, which will be able to give us much more precise results,” says Dessert. “Which is very exciting, but we have to make sure when we do the analysis on the data they produce, we do it in the right way.”

Dessert is looking forward to continuing his search for the secrets of dark matter, and he is more optimistic than ever, thanks to the unique capabilities and computing power of the Flatiron Institute.

“Dark matter makes up so much of the universe, and yet we still don’t have much of a handle on it,” he says. “There’s a lot left to learn, and it’s certainly going to involve huge feats of data analysis, and Flatiron is really one of the best places to be doing that.”
