OKCupid Founder Says Experimenting on Users Is Just “How Websites Work” — But It Doesn’t Have to Be


OKCupid published a pretty fascinating post on its OKTrends blog yesterday. The piece was written by the company’s implausibly monikered co-founder Christian Rudder, and it discussed the recent publicity around Facebook’s grand experiment in influencing the moods of its users by filling their news feeds with largely positive or largely negative posts and seeing how they reacted. Under the title “We Experiment on Human Beings!,” Rudder wrote that OKCupid had conducted similar studies, including a case in which it provided users with false matching data to evaluate how much that information influences their decisions about whether or not to contact (and keep talking to) other users. This was no different, Rudder argued, from what Facebook did, and no different from what a bazillion other sites are doing all the time.

Strictly speaking, he’s right — every website experiments on its users to some extent, for the simple reason that it’s the best way to work out what they do and don’t like. At its most basic level, this can be something as simple as A/B testing two different menu styles on your site to see which one users click on more frequently. Conceptually, there’s not a great deal of difference between this and what OKCupid has done — instead of a menu style, it’s A/B testing two versions of its matching algorithm or, perhaps more accurately, experimenting with how strongly various factors are weighted in that algorithm.
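To make the comparison concrete, here is a minimal sketch of what menu-style A/B testing amounts to in practice. The variant names and the simulated click rates are invented for illustration; this is not anyone’s production code, just the general shape of the thing.

```python
import random
from collections import defaultdict

# Minimal A/B test sketch: each user is deterministically assigned to one of
# two menu variants, and we compare click-through rates across variants.
# Variant names and click probabilities are invented for illustration.

VARIANTS = ("menu_a", "menu_b")
impressions = defaultdict(int)
clicks = defaultdict(int)

def assign_variant(user_id: int) -> str:
    # Stable 50/50 split: a returning user always sees the same menu.
    return VARIANTS[user_id % 2]

def record_visit(user_id: int, clicked: bool) -> None:
    variant = assign_variant(user_id)
    impressions[variant] += 1
    if clicked:
        clicks[variant] += 1

# Simulate traffic, pretending variant B converts slightly better.
for user_id in range(10_000):
    p_click = 0.10 if assign_variant(user_id) == "menu_a" else 0.12
    record_visit(user_id, random.random() < p_click)

for variant in VARIANTS:
    rate = clicks[variant] / impressions[variant]
    print(f"{variant}: {clicks[variant]}/{impressions[variant]} clicks ({rate:.1%})")
```

The only thing separating this from what Rudder describes is what gets swapped between the two arms: a menu style in one case, the inputs to a matching algorithm in the other.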

Is this ethical, or morally defensible? Honestly, it varies from case to case. The one key difference between what OKCupid has been doing and what Facebook did is that in the former case, there’s at least some sort of utility for users: the better OKCupid’s algorithm gets, the better chance a user has of getting laid (sorry: of meeting a wonderful person to share their life with). This is also of benefit to OKCupid — real cynics might suggest the site doesn’t want to make perfect long-term matches, because doing so would remove the need for the people involved to keep using its service, but it’s hard to imagine the site works well enough for its designers to be that manipulative.

In the case of OKCupid, then, the needs of users and the site’s designers are pretty much aligned: everyone wants an algorithm that works as well as possible. It’s true that providing false positives and sending unwitting users on unsuitable dates is ethically questionable, but the site could easily argue that this was a valid experiment in testing the weighting of various factors in its matching algorithm. Ultimately, a more accurate method of matching users benefits everyone.
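For what it’s worth, “testing the weighting of various factors” is easy to picture. A match score is typically some weighted combination of signals, and the experiment is simply which set of weights a given bucket of users gets. Here is a hypothetical sketch (the factor names, weights, and bucketing are all invented; OKCupid’s actual algorithm isn’t public and the blog post doesn’t describe it in detail):

```python
# Hypothetical sketch of a weighting experiment in a matching score.
# Factors and weights are invented for illustration only.

# Two candidate weightings of the same factors: the "experiment" is simply
# which set of weights a given user bucket gets.
WEIGHTS = {
    "control":   {"shared_interests": 0.5, "answer_agreement": 0.4, "proximity": 0.1},
    "treatment": {"shared_interests": 0.3, "answer_agreement": 0.6, "proximity": 0.1},
}

def match_score(factors: dict[str, float], arm: str) -> float:
    """Weighted sum of normalised factor scores (each factor in [0, 1])."""
    weights = WEIGHTS[arm]
    return sum(weights[name] * factors.get(name, 0.0) for name in weights)

def arm_for_user(user_id: int) -> str:
    # Stable bucketing: the same user always lands in the same arm.
    return "treatment" if user_id % 2 else "control"

# Example: the same pair of users gets a different displayed match
# percentage depending on which arm the viewer happens to be in.
pair_factors = {"shared_interests": 0.8, "answer_agreement": 0.4, "proximity": 0.9}
for user_id in (101, 102):
    arm = arm_for_user(user_id)
    print(f"user {user_id} ({arm}): {match_score(pair_factors, arm):.0%} match")
```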

If this all sounds a bit idealistic, you can rest assured that if the interests of the site’s owners and the needs of its users weren’t aligned, the latter would go out the window. Which brings us to Facebook, where there’s constant tension between the way users want the site to work and the way Facebook wants people to use it: the former want to see their friends’ status updates and the occasional cat video; the latter wants to create a brave new world where everything is Sharable Content™ and privacy is a curious concept that old people occasionally reference with a sort of inexplicable nostalgia.

A recent example: you might want to see everything your friends have posted, in chronological order, like you used to do in the good old days of 2009. Facebook, however, wants you to use its Top Stories algorithm, whereby it controls what you see. This might be sold as a benefit to the user — streamlined feed! See only the posts you want to see! — but ultimately, it exists to enforce Facebook’s vision of how Facebook works. The same is true of a lot of what Facebook does, when you think about it.

Again, cynics might say that if you’re not paying for the product, you are the product, and in the case of Facebook, that’s certainly true — its services are provided to you in a sort of unspoken quid pro quo for the information you volunteer to Facebook. The fact that Facebook can do pretty much whatever it wants with your data is written into the site’s terms and conditions, to which you assented when you signed up. Whether these T&Cs are legally enforceable is probably open to argument, but good luck with the Zuckerberg Armored Legal Battalion if you decide to put them to the test.

Still, all of this doesn’t — or at least shouldn’t — give Facebook carte blanche to conduct experiments that’d probably land you in all sorts of trouble with the Dean of the Psychology Faculty if you tried them in a clinical environment. In this respect, perhaps the most remarkable thing about the Great Facebook Experiment of June 2014 is that it may well have passed without comment had its ethical implications not been questioned by ANIMAL New York writer (and former Flavorpill employee!) Sophie Weiner.

It’s indicative of a certain fatalism about the web — raise the topic of privacy or ethics with people, and they’ll most likely give you a sad smile and tell you that this is just the way things are, that this is how the web works, that nothing really comes for free. At least two of those statements are true — this is indeed how the web works, and nothing ever comes for free. But is this really just the way things have to be? No, of course not. That is the greatest lie that people like Mark Zuckerberg have sold us: that things on the Internet were predestined to be the way they are.

It’s this way because the people in charge of sites like Facebook and OKCupid have made it this way. It’s disingenuous, then, for OKCupid’s Rudder to start his blog post with the condescending declaration, “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.” Websites work any way the people who created and maintain them want them to. They don’t have to work this way.

It’s not a fait accompli. The Internet could have evolved in a bazillion different ways, and it’s not like the Internet we’ve ended up with is the best of all possible worlds. No doubt the likes of AT&T would like you to believe that net neutrality was a utopian concept that had no chance of working in the real world, whereas in actuality it worked just fine as far as users were concerned, until the FCC went and fucked it all up.

There’s no reason that the web has to be a medium for corporations like Facebook to mine for data, and there’s no reason why a bunch of assholes in San Francisco should be able to sit around in their well-stocked corporate canteen and enthuse about how cool it is that they can use all this data to change people’s moods, man, before they ride their Segways off into the sunset. There’s no reason that Twitter should be free to tolerate nonsense like #twitterpurge, or that it can’t get its act together to institute a more effective reporting system for harassment. And so on. It doesn’t have to be this way. But every time we shrug our shoulders at something questionable that Facebook does, we bring that vision a little closer to fruition.