Report finds FBI undercounts police killings by half
A study released Tuesday by the federal Bureau of Justice Statistics (BJS) found that the FBI's official annual tally of police homicides has persistently undercounted police killings by more than half.
Even before the BJS study, the FBI's annual statistics on justifiable homicides were widely known to be a significant undercount. A list compiled on the website Killed by Police of every police killing mentioned in the American media includes more than 2,000 deaths since May 2013.
The BJS study found that, on average, police killed 928 people per year from 2003 through 2009 and in 2011, almost two and a half times the FBI figure of 383. A total of 2,103 killings went unacknowledged by the FBI during that period.
The FBI's figures are based on voluntary reporting by local police agencies, with no standard reporting methodology, and are estimated to cover 46 percent of officer-involved homicides at best, according to the report. This is despite the fact that regular annual reporting to the federal government on police brutality statistics has been legally required since the 1994 passage of the Violent Crime Control and Law Enforcement Act.
http://www.wsws.org/en/articles/2015/03/07/brut-m07.html
libdem4life
(13,877 posts)
The underbelly of the Dark Side. Self-reporting. Voluntary reporting. A gallows-humor LOL.
Igel
(35,309 posts)
And it's much more complicated and nuanced than wsws is ever likely to represent with an accuracy of 50% or better. The main problem is that the report is not an analysis of the data and doesn't claim to be; it's an analysis of the data collection methodologies used over an 8-year period: which is best, how to improve upon them, and how to interpret them. As such, it uses broad yet clear definitions for data classes but relatively precisely identifies and categorizes data sources and analytical methods.
For 2011, the share of uncaptured "law enforcement homicides" is 31-41%. That's a clue to the accuracy of the number in the headline. Given the assumptions and analytical methods, they estimate that the undercount for a series of years (oddly excluding 2010, and ending in 2011) using one method is 47-50% at a 95% confidence level. Using a second method, you get 35-38%. And a third method gets 46% at best. In sum, if you combine the methods, they estimate that 28% of the homicides will be entirely missed.
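The estimates above come from comparing incomplete, overlapping lists of the same deaths and inferring how many cases every list misses. A standard tool for that kind of problem is capture-recapture estimation. Here is a minimal sketch in Python using the two-sample Lincoln-Petersen estimator with made-up counts; these numbers are purely illustrative and this is not the BJS report's actual data or necessarily its exact methodology:

```python
# Illustrative capture-recapture sketch (Lincoln-Petersen estimator).
# All counts below are hypothetical, not from the BJS report.

def lincoln_petersen(n1, n2, overlap):
    """Estimate the true population size from two overlapping samples.

    n1, n2  -- number of cases on list A and list B
    overlap -- number of cases that appear on both lists
    Assumes the two lists capture cases independently.
    """
    if overlap == 0:
        raise ValueError("need at least one case on both lists")
    return n1 * n2 / overlap

# Hypothetical: list A records 600 deaths, list B records 500, 400 on both.
estimated_total = lincoln_petersen(600, 500, 400)   # 600 * 500 / 400 = 750.0
captured_by_either = 600 + 500 - 400                # 700 distinct cases seen
missed_fraction = 1 - captured_by_either / estimated_total

print(f"estimated true total: {estimated_total:.0f}")
print(f"missed by both lists: {missed_fraction:.0%}")
```

The key output is the last line: even after merging both lists, some fraction of cases is estimated to have been seen by neither source, which is the sense in which the report can say a percentage of homicides is "entirely missed."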
One key point is that accuracy and coverage vary by state and jurisdiction and by data-collection methodology. When they give a percentage of homicides likely to have been missed, that's for the US as a whole, not any specific jurisdiction. So one city might have spot-on reporting of law-enforcement deaths while another has really horrible reporting. They don't quantify this; that's done elsewhere.
"Law enforcement homicide" is well defined. It covers deaths during police or corrections contact at the level of state and local jurisdictions. If an FBI agent is working with local police, the death is to be included in the report, regardless of who's responsible. If the local police aren't there and just federal agents or correctional facilities are responsible, then the death isn't to be included. Presumably there are other means for collecting such data. But it means that such reports exclude a set of what I'd call "police killings," so that maximalist "50%" number might need to be nudged up.
"Law enforcement homicides" include not just police shootings but also cancer deaths and cardiac-related deaths in penitentiaries. If the police chase somebody and he runs into traffic and is killed, it's a "law enforcement homicide." I remember one case last year in which some juveniles killed another in juvenile detention; that was a "law enforcement homicide." So the definition is rather broad. Many of those I wouldn't call "police killings." The report is quite specific in its definitions and uses them consistently. It includes gunshot deaths and choking deaths, but also when a prisoner dies at age 80 from a heart attack or suicide.
The report does not distinguish between types of law enforcement deaths, and you'd have to stop and think about what's more or less likely for a jurisdiction filing the reports: Would they be more likely to include an incident where a cop killed a suspect, or one where an inmate died of a heart attack? That I can't answer--or, more accurately, any answer I give reflects my attitudes, emotions, and biases more than any kind of actual statistical analysis based on data. (In short, "easy thinking" rather than "hard thinking.")
WSWS is reporting on the report it wants to have had in front of it, not on the report it had in front of it; it qualifies as "motivated reporting." It applies its own definitions as it reads the report, a particularly difficult pseudo-fallacy to overcome or, in many cases (almost invariably when the actual report isn't in front of you, or you don't feel like wading through it), even to spot.
jakeXT
(10,575 posts)http://www.npr.org/2015/03/06/391269342/system-for-reporting-police-killings-unreliable-study-finds
rock
(13,218 posts)
Enthusiast
(50,983 posts)
Collusion?
djean111
(14,255 posts)
I don't think it was in ANY law enforcement body's self-interest to keep an accurate count and description.