Science journalism can be evidence-based, compelling — and wrong
A ranking of the best science-news outlets misjudges the relationship between research and reporting.
07 March 2017
Many science journalists rely on peer review to check that their stories are accurate.
There has been much gnashing of teeth in the science-journalism community this week, with the release of an infographic that claims to rate the best and worst sites for scientific news. According to the American Council on Science and Health, which helped to prepare the ranking, the field is in a shoddy state. “If journalism as a whole is bad (and it is),” says the council, “science journalism is even worse. Not only is it susceptible to the same sorts of biases that afflict regular journalism, but it is uniquely vulnerable to outrageous sensationalism”.
News aggregator RealClearScience, which also worked on the analysis, goes further: “Much of science reporting is a morass of ideologically driven junk science, hyped research, or thick, technical jargon that almost no one can understand”.
How, without bias or outrageous sensationalism, of course, do they judge the newspapers and magazines that emerge from this sludge? Simple: they rank each by how evidence-based and how compelling they subjectively judge its content to be. Modesty (almost) prevents us from naming the publication graded highest on both (okay, it's Nature), but some outlets score lower than they might like. Big hitters including The New York Times, The Washington Post and The Guardian fare relatively poorly.