The BBC and Dead Journalists

The BBC runs a lot of stories that are little more than rewrites of some organization or another’s press release. I don’t necessarily have a problem with that, except when they can’t be bothered to be even minimally accurate while doing so. I mean, if all you’re doing is rewriting someone’s press release, how hard can it be to get things right?

Apparently, it’s rather difficult for the BBC. Take this story from January 2005, for example, about a report on the number of journalists killed in 2004. The story notes in alarming language that 129 journalists were killed last year according to the International Federation of Journalists — the most since that group began keeping such statistics.

Here’s how the BBC describes the deaths of journalists:

The IFJ said that in almost every corner of the globe journalists were targeted and killed by the enemies of press freedom.

Another dangerous place to work was the Philippines where 13 journalists were murdered, many of them for reporting on corruption, crime and drugs trafficking.

The IFJ said governments have a duty to do more to protect journalists and to find out how and why they died.

Working conditions, particularly for local investigative reporters, were becoming more and more risky, the group added.

The clear implication is that journalists are being murdered right and left for trying to report the truth — which, to some extent, they are. But the BBC deceives by not bothering to repeat the IFJ’s clear caveat that its statistics on dead journalists include quite a few who died accidentally.

For example, it includes in those 129 deaths several individuals who died when their plane crashed while trying to set up a perfect shot for a photographer. It also includes cases of reporters who died in car accidents while heading to cover a story. It even includes the tragic death of a young Texas reporter who was electrocuted when the large boom antenna on a mobile broadcast van hit power lines.

All tragedies, but hardly the persecution of the press that the BBC implies all 129 deaths represent.


‘Deadliest’ year for journalists. Chris Morris, The BBC, January 18, 2005.

Send that BBC Reporter Back to College

Another example of a reporter — this time at the BBC — completely screwing up a story. In a February 27, 2004 story, Scientists doubt animal research, the BBC reports on a study published in the British Medical Journal which was basically nothing more than an animal rights attack on animal research. But whoever wrote the BBC story screwed up and didn’t have a clue what the BMJ paper actually said. Here’s how the BBC describes the paper:

In reaching their conclusions, the London team carried out a systematic review of all animal experiments which purported to have clinical relevance to humans.

Conducting a review of all animal experiments that had some sort of clinical relevance would be a gargantuan task that would likely take years to accomplish. Many studies from the 19th and 20th centuries likely don’t even exist in electronic formats yet.

Which is why the researchers didn’t even attempt to do what the BBC claims they accomplished. As the paper makes extraordinarily clear:

We searched Medline to identify published systematic reviews of animal experiments (see for the search strategy). The search identified 277 possible papers, of which 22 were reports of systematic reviews. We are also aware of one recently published study and two unpublished studies, bringing the total to 25. Three further studies are in progress (M Macleod, personal communication).

Leave it to the BBC to think that doing a Medline search for studies that systematically reviewed certain animal experiments is the same thing as systematically reviewing “all animal experiments which purported to have clinical relevance to humans.”

This is an especially pernicious error because one of the major problems with this study is selection bias. If a drug is supposed to do X but doesn’t, then researchers have a good chance of publishing a review of the animal research to see if there was anything that could have indicated the problem before the drug went to market (ironically, in the reviews mentioned by the study, the problem was not the animal research itself but rather how the animal research results were used — or more accurately not used — by clinical researchers). Of course, if a drug does exactly what it was supposed to do, then good luck getting an analysis of why nothing went wrong published. The researchers did the equivalent of tracking down “man bites dog” stories and concluding that more people bite dogs than vice versa.

But you’d never know that from the BBC, which can’t even accurately describe the study.