So, what we have here is two people looking at a bunch of statistics showing that women suffer disproportionate violence from men… and concluding that the responsibility for this falls on women for not getting married. It’s a textbook example of victim blaming — and also a convenient case study of how to make pretty much every mistake in the book when it comes to the interpretation of statistics.
For a start, there’s the conflation of correlation with causation. We have numbers showing that married men are statistically less abusive to women and their children than their unmarried counterparts. So what’s more likely: that the sort of man who abuses women is statistically less likely to have any interest in marriage, or that the act of standing in a church in a white dress on one day of your life means that your children have a magical shield around them protecting them from violence? I’m gonna go with the former, but then, that’s a reflection of my ideology.
Which brings us to the second point: confirmation bias. The hoary old cliché about lies, damned lies, and statistics is a cliché for a reason: you can use numbers to demonstrate pretty much anything. W&W’s article notes that “married women are less likely to be raped, assaulted, or robbed than their unmarried peers.” OK, great. But is that an effect of marriage? Is there any evidence of causation? Because another way of approaching these statistics might be to do a little more research (seriously, guys, two fucking Google searches) and discover that marriage is less prevalent in poorer communities, where violence against women and children is more common. In other words: violence and marriage rates both correlate with socioeconomic status, with one declining as the other increases.
It’s not like the DHHS report didn’t address this point, by the way. Even giving it only the most cursory of read-throughs was enough to turn up the finding (no pretty graphs, alas) that children in families with low socioeconomic status are about five-and-a-half times more likely to experience maltreatment than their more privileged peers.
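To make that last point concrete, here’s a throwaway sketch in Python. This is my own toy illustration, not anything from W&W’s article or the DHHS report, and every number in it is invented. It simulates a world where socioeconomic status drives both marriage rates and violence, with zero causal link between the two, and the married group still comes out looking safer:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: socioeconomic status (higher = better off).
ses = rng.normal(size=n)

# Marriage is more likely at higher SES; violence is more likely at lower SES.
# Crucially, neither one depends on the other -- there is no causal link here.
p_married = 1 / (1 + np.exp(-ses))
p_violence = 0.2 / (1 + np.exp(ses))
married = rng.random(n) < p_married
violence = rng.random(n) < p_violence

print("violence rate, married:  ", round(violence[married].mean(), 3))
print("violence rate, unmarried:", round(violence[~married].mean(), 3))
# The married group comes out "safer" even though marriage does nothing at all
# in this toy world -- SES drives both variables.
```

The same trick works for the selection-effect story above, for what it’s worth: make abusive men less likely to marry in the simulation and you get the same gap, again with no protective effect from the wedding itself.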
W&W don’t address this point at all, instead arguing that “marriage also seems to cause men to behave better.” Well, if you say so. But even if you do believe that putting a ring on a man’s finger somehow has the mystical power to curb his violent urges, the conclusions W&W draw from their numbers are asinine in the extreme: women should marry to minimize their risk of violence, and if they don’t, well, whose fault is that? I mean, no doubt the data also suggests that staying in bed every day is far safer than getting up and going to work, because who knows, you might get hit by a bus, fall in front of a subway train, or get booted in the face by one of the “it’s showtime!” kids. Would W&W thus conclude that if any such thing happens to you, it’s your own fault for leaving the house in the first place?
One would hope not, although the original editorial is so fucking stupid that it’s hard to say with any certainty. The point is that however you interpret these statistics, the fact remains that they’re very much open to interpretation. And this is the problem with data journalism. Statistics are just that: statistics. They don’t prove anything, especially not in a situation like this, when you’re looking at one of a gazillion variables that are in play. Underneath the veneer of “but it’s just the numbers!” impartiality, there’s just as much ideology-pushing as there is in any other area of the print media.
And further, the whole idea of “impartial” journalism represents a fundamental misunderstanding of the purpose of journalism in the first place. The way those statistics are interpreted and presented shapes the way they’re received. And the way they’re interpreted and presented reflects the preconceptions of the people doing the interpreting and presenting.
Just as relevant in this case: the very act of reporting something reflects an ideology, because the choice of whether something is or isn’t news is an inherently ideological one. We’re taught that good journalism relates “only the facts,” but data journalism exposes the limits of that idea, as the hapless Wilcox and Wilson amply demonstrate: there is no such thing as an impartial, mute fact. They have chosen a few graphs from a 455-page report and used them as “proof” of the view that naughty women should just get married and stop “taking lovers” already.
As an example of how meaningless this all is, you could conceivably use the same data W&W used for their article to conclude that men are statistically a menace to society and should thus be castrated en masse and cast into outer darkness for all eternity. That conclusion is just as valid and reasonable as the one W&W actually draw, which is to say it’s simplistic, ridiculous, and would be laughed out of the room by anyone outside a lunatic fringe. The only difference is that in this case the lunatic fringe sits at the super-radical feminist end of the ideological spectrum rather than the misogynist fuckwit end.
The Internet is already giving this article a richly deserved pillorying, and I think I’ve done enough to add my voice to the chorus. But dear god, can this be the end of the data journalism trend? Perhaps if our studious antiheroes had gotten away from the numbers and actually gone to report on the people they were writing about, they might have saved themselves this embarrassment — and everyone else the galling experience of reading it.