Here are three things from last year’s UK Conference of Science Journalists (UKCSJ).
I was one of three people selected to pitch to real-life science editors in a Dragons’ Den-style session. My idea was an article on the rising prevalence of shortsightedness (myopia) around the world. Scary stat: nearly one in three people might not have 20/20 vision by the year 2020. I didn’t get commissioned, but you can watch the video here anyway.
After that intimidating experience, I reviewed two sessions. The first was about the use and misuse of statistics in science journalism:
“The Statistics in Science Journalism session at UKCSJ 2014 was a head-on collision between passionate journalists and the confusing monstrosity that is statistics. Deborah Cohen, the BMJ’s investigations editor, produced this session to help us understand how not to get things wrong.
Ivan Oransky, vice president of MedPage Today and co-founder of the excellent Embargo Watch and Retraction Watch blogs, led proceedings by taking us on a slide-by-slide journey through a realm of shoddy studies and equally shoddy reporting.”
Oransky’s presentation should be on his SlideShare page, but I couldn’t find it just now. He has plenty of other deeply interesting things on there, though.
The other session was about the issue of reproducibility in science:
“Science is in crisis, they say. Negative results don’t get published, while gibberish occasionally does; shaky studies are under-powered and over-reported; peer reviewers miss obvious mistakes and accept results that agree with their biases, regardless of merit; field-defining results cannot be replicated.
The current culture of ‘publish or perish’ doesn’t help matters. A scientist’s worth is judged based on how many papers they publish, how many times those papers are cited, and how much money they pull in.
Scientists, science journalists and others are beginning, however, to rage against the machine.”
Professor Chris Chambers, one of the speakers, has put his excellent presentation — on pre-registering studies, replicating them and making data open — online here.