Sunday, February 26, 2017

The Murders of My Colleagues

Photo: Socorro Rosas López, mother of the murdered journalist Pedro Tamayo Rosas, with her other sons at Mr. Rosas’ funeral in July 2016 in Tierra Blanca, Veracruz, Mexico. Credit: Daniel Berehulak for The New York Times
MEXICO CITY — A year ago, at least eight gunmen in military fatigues stormed the home of the crime reporter Anabel Flores near the city of Orizaba and dragged her away from her pleading family. The next day her body was found on a road; she was dead at 32, just a few weeks after giving birth to her second child.
In May and August, police arrested two suspected members of the Zetas drug cartel for the killing, but haven’t released their names or more details, leading the Committee to Protect Journalists to report that “the case remained opaque” — like the homicides of so many of her colleagues here.
Last year was one of the deadliest for Mexican reporters in recent history. Even the total number of victims is hard to pin down, thanks to botched investigations and confusion about how many of the dead officially worked as journalists. But most press groups count at least nine slain here in 2016, and some count as many as 16. Reporters Without Borders said Mexico was the third most perilous country in the world for journalists, after Syria and Afghanistan — in other words, the most perilous outside a declared war zone.
When these annual numbers were released in December, they didn’t make much of a splash. People have become accustomed to grisly stories of Mexican gangsters dragging reporters from their homes, ambushing them in their cars or leaving severed heads outside their newsrooms. Since 2000, the total journalist body count here has reached 100, according to the press freedom group Article 19. The murder of Mexican journalists is old news.
Reporting from Mexico since 2001, I have written stories about journalists being shot or decapitated or disappearing more times than I can count. At one point, I was working at an international news agency when a colleague raised the question of whether we should continue covering media murders at all. Are they actually more important than the other 20,000 or so homicides that happen every year in Mexico?
I argued we should. The least we can do is publicize their names, give a last tribute to their work. But it’s more than that. The murder of journalists is not only the killing of human beings; it is also an attack on free speech. It has turned many parts of the country into black holes, granting immunity to corrupt officials.
Most journalists murdered in Mexico work for smaller media outlets in the provincial towns and cities where the state is weakest and organized crime strongest. The killers feel they can target reporters with minimal consequences. And they are right.
The national and international news media need to keep covering these journalists’ stories to both show solidarity and create pressure for justice. In an age when journalism is under attack from all sides, we need to defend our profession, and that starts with stopping our colleagues from being murdered.
Press freedom groups, including Mexico’s Periodistas de a Pie, have done great work training reporters in security measures like identifying escape routes before they go into difficult areas. Hundreds of threatened journalists have also enrolled in a protection program run by the federal government since 2012, getting panic buttons with which they can alert police if they are in trouble, or occasionally assistance moving to a different, safer city.
But these measures don’t do anything to fix the problem at its source. If we want criminals to stop killing journalists, we need something else: justice. Murder investigations are often handled by ineffective, and sometimes suspect, state prosecutors. In some cases, victims are kidnapped in one state and killed in another, leading to different prosecutors handling, or too often manhandling, the evidence.
A key problem is that local officials are often working with the drug cartels, and occasionally have even been involved in the attacks themselves. A police officer in Oaxaca State was arrested in the killing of a journalist last year. The year before, an arrest warrant was issued for a mayor in Veracruz State for the murder of another reporter.
There is a designated federal prosecutor for crimes against journalists, but his office takes only a limited number of cases that fulfill particular legal requirements — like evidence that local police were involved. But the circumstances around a lot of the killings are murky, so many of the probes are left to those local officials.
The end results are poor. In at least two-thirds of journalist murders last year, no one was even arrested. When there were arrests, the police often acted suspiciously, sometimes refusing to release the names of the suspects.
One solution would be to have the federal prosecutor for crimes against journalists lead the investigation of every single media homicide. The office needs the resources and teeth to take on the difficult cases and reduce the impunity. And it should be responsible for solving those cases; if it doesn’t, the prosecutor should be replaced.
By taking the cases away from the states, you move them farther from the corruption networks that may be complicit in the journalists’ murders. And you push them into one central office that press groups can deal with — and pressure for results. But it needs to be transparent in its investigations. The Committee to Protect Journalists puts profiles of each case on its website, information that the special prosecutor’s office itself should be providing.
Mexico is knee deep in problems, including rampant corruption that strangles the economy. Journalists are a key part of the solution, but they can expose the country’s rot only if they have basic protection from being murdered. We should not become accustomed to the killing of our colleagues.

Tuesday, February 21, 2017

Cosmos Controversy: The Universe Is Expanding, but How Fast?

There is a crisis brewing in the cosmos, or perhaps in the community of cosmologists. The universe seems to be expanding too fast, some astronomers say.
Recent measurements of the distances and velocities of faraway galaxies don’t agree with a hard-won “standard model” of the cosmos that has prevailed for the past two decades.
The latest result shows a 9 percent discrepancy in the value of a long-sought number called the Hubble constant, which describes how fast the universe is expanding. But in a measure of how precise cosmologists think their science has become, this small mismatch has fostered a debate about just how well we know the cosmos.
“If it is real, we will learn new physics,” said Wendy Freedman of the University of Chicago, who has spent most of her career charting the size and growth of the universe.
The Hubble constant, named after Edwin Hubble, the Mount Wilson and Carnegie Observatories astronomer who discovered that the universe is expanding, has long given astronomers fits. In an expanding universe, the farther something is away from you, the faster it is receding. Hubble’s constant tells you by how much.
But measuring it requires divining the distances of lights in the sky — stars and even whole galaxies that we can never visit or recreate in the lab. The strategy since Hubble’s day has been to find so-called standard candles, stars or whole galaxies whose distances can be calculated by how bright they look from Earth.
But the calibrators themselves need to be calibrated, which has led to a rickety chain of assumptions and measurements in which small errors and disagreements — about, say, how much dust is interfering with observations — can build up to cosmic proportions. Only three decades ago, renowned astronomers could not agree on whether the universe was 10 billion or 20 billion years old. Now everybody has settled on its age as about 13.8 billion years.
Using a new generation of instruments like the Hubble Space Telescope, astronomers have steadily whittled down the uncertainty in the Hubble constant.
Photo: Albert Einstein, left, and Edwin Hubble, second from left, at the Mount Wilson Observatory in 1931. Credit: Imagno/Getty Images

Getting Closer

In 2001, a team led by Dr. Freedman reported a value of 72 kilometers per second per megaparsec (a megaparsec is about 3.3 million light-years), in the galumphing units astronomers prefer. It meant that for every 3.3 million light-years a galaxy was farther away from us, it was moving 72 kilometers a second faster.
Hubble’s original estimate was much higher, at 500 in the same units of measurement.
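For readers who want to see the arithmetic behind that definition, the constant encodes a simple proportionality, the Hubble law, between a galaxy’s recession velocity and its distance; the 100-megaparsec distance below is purely an illustrative figure plugged into Dr. Freedman’s value of 72:
$$ v = H_0 \, d \quad\Rightarrow\quad v \approx 72\ \tfrac{\text{km/s}}{\text{Mpc}} \times 100\ \text{Mpc} = 7{,}200\ \text{km/s} $$
In other words, a galaxy roughly 330 million light-years away should, on average, be receding from us at about 7,200 kilometers a second; a slightly different value of the constant shifts that speed proportionally.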
Dr. Freedman’s result had an error margin that left it happily consistent with other, more indirect calculations that had gotten a slightly lower value of 67 for the Hubble constant. Those were derived from studies of microwaves emitted and still lingering in the sky from the primordial Big Bang fireball.
As a result, in recent years, astronomers have settled on a recipe for the universe that is as black and as decadent as a double dark chocolate chunk brownie. The universe consists of roughly 5 percent atomic matter by weight, 27 percent mysterious dark matter and 68 percent of the even more mysterious dark energy that is speeding up the cosmic expansion. Never mind that we don’t know exactly what all this dark stuff is. Astronomers have a good theory about how it behaves, and that has allowed them to tell a plausible story about how the universe evolved from when it was a trillionth of a second old until today.
But now the precision of the Hubble constant measurements has seemingly improved, and the universe might be in trouble again.
Last summer a team led by Adam Riess of Johns Hopkins University and the Space Telescope Science Institute, using the Hubble Space Telescope and the giant Keck Telescope on Mauna Kea in Hawaii and supernova explosions as the ultimate distance markers, got a value of 73 plus or minus only 2.4 percent for the elusive constant.
That made waves because it meant that, if true, the Hubble constant as observed today was now clearly incompatible with the lower value of 67 inferred from data obtained in 2013 by the European Planck spacecraft, which mapped the relic radiation from the Big Bang. The Planck observations, which show the universe when it was only 380,000 years old, are considered the gold standard of cosmology.
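The 9 percent discrepancy cited at the top of this article is simply the fractional gap between these two numbers, taking the Planck-based value as the baseline:
$$ \frac{73 - 67}{67} \approx 0.09 $$
that is, a mismatch of roughly 9 percent.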
Photo: Cosmic microwave radiation left over from the Big Bang, as seen by the Planck spacecraft. Credit: NASA
Whether the standard cosmic recipe might now need to be modified — for example, to account for a new species of subatomic particles streaming through space from the Big Bang — depends on whom you talk to. Some say it is too soon to get excited about new physics sneaking through such a small discrepancy in a field noted for controversy. With more data and better understanding of statistical uncertainties, the discrepancy might disappear, they say.
“No explanation I know of is less ugly than the problem,” Lawrence M. Krauss, a theorist at Arizona State, said.
Others say this could be the beginning of something big. David Spergel, a cosmologist at Princeton and the Simons Foundation, called the discrepancy “very intriguing,” but said he was not yet convinced that this was the signature of new physics. Michael S. Turner of the University of Chicago said, “If the discrepancy is real, this could be a disruption of the current highly successful standard model of cosmology and just what the younger generation wants — a chance for big discoveries, new insights and breakthroughs.”
Dr. Riess and his colleague Stefano Casertano got roughly the same answer of 73 later last summer, strengthening the claim for a mismatch of Hubble constants. They used early data from the European spacecraft GAIA, which is measuring the distances of more than a billion stars by triangulation, thus allowing astronomers to skip some of the lower rungs on the distance ladder.
They calculated that the odds of this mismatch being a statistical fluke were less than one part in a hundred — which might sound good in poker but not in physics, which requires odds of less than one in a million to cement a claim of a discovery.
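As a rough translation into the sigma language physicists often use (this assumes a simple Gaussian error model, which the article does not spell out), one-in-a-hundred odds correspond to a bit over two standard deviations, while the conventional five-sigma discovery standard sits at odds of a few parts in ten million:
$$ P(Z > 2.33) \approx 10^{-2} \ (\approx 2.3\sigma), \qquad P(Z > 5) \approx 2.9\times 10^{-7} \ (5\sigma) $$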
“I think it’s a potentially serious issue,” said Alex Filippenko, a University of California astronomer who is part of the team. “In this line of research the devil is in the details. And after getting the details right, we’re left with a major puzzle.”
Photo: An illustration showing the GAIA spacecraft and the Milky Way. Credit: ESA/ATG medialab; background: ESO, via S. Brunier
George Efstathiou, of the University of Cambridge and one of the leaders of the Planck mission responsible for its cosmological analysis, said Dr. Riess and his team had underestimated the errors in their measurement.
“So, in summary, I think that the Planck results are secure,” he wrote in an email. “They,” he said, referring to the other astronomers, “may be right and we have to modify their standard model, but the evidence looks weak to me.”
Dr. Riess and his colleagues have stood by their work, however, and the plot thickened further in December when a group called H0LiCOW (don’t ask) from the Max Planck Institute for Astrophysics in Garching, Germany, reported its own value of 72 for the Hubble constant, also inconsistent with the Planck space mission’s analysis.
Led by Sherry Suyu of Max Planck, the group measured the delays experienced by light rays from five distant flickering quasars as they followed different paths around massive galaxies on the way from Out There to us. The technique, they say, depends only on geometry and Einstein’s theory of gravity, general relativity, making it independent of other assumptions about dust or the makeup of stars.
Last year, a group known as BOSS, the Baryon Oscillation Spectroscopic Survey, came up with a Hubble constant of about 68, based on how 1.5 million galaxies were clustered in space and time, but it used data from the cosmic microwave background for calibration.

What Comes Next?

There is wiggle room, Dr. Riess and others say, for both the modern and the primordial results to be right, because Planck measures the Hubble constant only indirectly as one of several parameters in the standard model of the universe. Other parameters could be tweaked.
Photo: A supernova similar to those used to measure the universe’s expansion. Credit: NASA
That is where new physics might come in.
The most likely candidate to fill the gap, Dr. Riess said, might be a new form of the ghostly particles called neutrinos, which are already known to be abundant in the cosmos. They come in three types that can change into one another as they traverse space; some physicists have suggested there could be a fourth kind, sterile neutrinos, which would not interact with anything at all.
Their discovery could unlock new realms in particle physics and perhaps shed light, so to speak, on the quest to understand the dark matter that suffuses space and provides the gravitational scaffolding for galaxies.
Another possibility is that the most popular version of dark energy — known as the cosmological constant, invented by Einstein 100 years ago and then rejected as a blunder — might have to be replaced in the cosmological model by a more virulent and controversial form known as phantom energy, which could cause the universe to eventually expand so fast that even atoms would be torn apart in a Big Rip billions of years from now.
“This is a very interesting tension,” Dr. Riess said. “This is why we play the game. We look for something not fitting.”
He added, “Clues about the dark sector or about fundamental physics are in play.”
This is the age of “precision cosmology,” and while everybody agrees that it is still too soon to tell, the avalanche of data from GAIA and the coming James Webb Space Telescope is just beginning, Dr. Freedman said. In the next few years she hopes the Hubble constant can be measured to 1 percent accuracy.
“And that’s what makes it interesting — this is feasible, and a lot of work is now ongoing that will allow us to resolve this within the next couple of years,” she said. “It’s what makes me want to work on this again!”
She said the situation reminded her of the late 1990s, when discrepant distances to distant supernova explosions led to the discovery that the expansion of the universe was accelerating under the influence of dark energy. Dr. Riess won a Nobel Prize for his part in that, and dark energy took its place in cosmic orthodoxy.
“It’s not quite ‘déjà vu,’” he wrote in an email, “but it’s funny that whenever my colleagues and I look at the contemporary universe with our radar guns, it’s expanding too fast for the contemporary expectations!”