JANUARY 17, 2014
Why "statistics" based on media reports are invalid
Earlier this week, I posted a report from an advisory group that was appointed by the Bonner Springs, KS City Council. In the report, they provided a very detailed explanation as to why they unanimously favored repealing the city's 24-year-old breed ban.
One of the things they noted was that there seemed to be very little support for breed bans -- and that what support existed was based on websites that relied solely on media reports for their data. I'm going to post their statements again:
"The research for articles and statistics presented difficulties as the majority were anti Breed Specific Ordinances. The few that were in favor of BSOs generally justified their positions with statistical data generated by dogsbite.org. Research of this website found the data to be extremely distorted with many myths presented as facts....because no one, including the CDC, maintains statistics of attacks by breed, the party who maintains the website gathers statistics based on a review of newspaper articles for reports of dog attacks. This method would not be embraced by any statistician, as this would lead to greatly skewed and inaccurate results."
I was glad to see this called out, as it is true of dogsbite.org and is even more true of the absurd Merritt Clifton Report. Both rely entirely on news reports to compile their data, and then present that data as if it were valid statistics. It's not.
To demonstrate this, I thought I'd share an example from a completely unrelated area that I came across a few weeks ago.
Apparently, Slate was working on a report on gun violence in the United States and was relying on media reports to gather data. However, when they were done, they realized that their database of people killed by guns was roughly 1/3 of the CDC's count of the number of people killed by guns.
Why the discrepancy? It turns out, it's suicides. Because the media seldom reports suicides (a rationale I think is justified, BTW), basing an analysis of gun deaths on media reports produced a completely distorted view of gun violence in America. Here are a couple of charts from Marginal Revolution:
The Media’s Picture of Gun Violence (suicides in red)
The CDC’s Picture of Gun Violence (suicides in red)
As you can see, the media reporting in this case is neither a comprehensive portrayal of gun violence in America, nor is it a statistically representative sample. In fact, statistically, it's entirely misleading.
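To make the sampling problem concrete, here's a small sketch of the arithmetic. The numbers are hypothetical round figures chosen only to mirror the rough shape of the Slate/CDC story (suicides being about two-thirds of gun deaths, and the media sample capturing about a third of the total); the real CDC figures differ.

```python
# Hypothetical illustration of how a biased sample distorts proportions.
# All figures below are made-up round numbers, not real CDC data.

total_deaths = 30_000                      # hypothetical annual gun deaths
suicides = total_deaths * 2 // 3           # 20,000 (roughly two-thirds)
homicides = total_deaths - suicides        # 10,000

# Assume the media covers nearly all homicides but almost no suicides.
covered_suicides = suicides * 2 // 100     # 2% coverage  -> 400
covered_homicides = homicides * 95 // 100  # 95% coverage -> 9,500

media_sample = covered_suicides + covered_homicides

true_suicide_pct = suicides / total_deaths
media_suicide_pct = covered_suicides / media_sample

print(f"True suicide share of gun deaths:   {true_suicide_pct:.0%}")
print(f"Suicide share in the media sample:  {media_suicide_pct:.0%}")
print(f"Media sample as share of all deaths: {media_sample / total_deaths:.0%}")
```

Even though the media sample still captures about a third of all deaths, the suicide share collapses from two-thirds to a few percent -- the sample isn't just smaller, it's systematically unrepresentative.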
The same is true for dog bites, of course. The media doesn't attempt to cover every dog bite that happens in this country (nor should it), nor does it necessarily attempt to cover every major dog bite. There is even evidence to suggest the media doesn't cover every fatal dog attack.
The National Canine Research Council has done an excellent job over the years of reporting how dog attacks involving certain breeds of dogs garner far more media attention than others.
And then there are the ridiculous stories like this one about a child that was "attacked" by a pit bull. You can see the child's devastating injury from the "attack" at the left (click to enlarge). (Thanks Dog Hero for pointing me to the story).
I think we could all agree that if the media reported every dog bite that was equal in severity to this one, we'd get very little else accomplished on our local newscasts.
This story, and others like it, highlights exactly why anyone who forms their opinions on dangerous dog policies based on media reports (or based on the opinion of anyone who does) is destined to reach inaccurate conclusions from data that is neither comprehensive nor a statistically representative sample.
If you want to start making communities safer, a Community Approach to dog bite prevention is a great place to start.