The fact that something happens all the time is almost never an argument for it being ethically defensible. This seems simple enough, but apparently OkCupid co-founder Christian Rudder, along with a host of social media data scientists, needs a reminder.
Following the public furor over Facebook secretly conducting experiments on its users’ emotional responses, OkCupid came forward this week to admit that it, too, has been experimenting on its users. Specifically, the online dating site intentionally recommended matches that its algorithms had determined were incompatible, to see whether a connection would develop between those users anyway.
“Guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work,” wrote Rudder in a post entitled “We Experiment on Human Beings!” at the company’s OkTrends blog — which, it’s worth noting, is basically a trollish plug for his upcoming book.
Rudder’s comments are offensive not because they are false, but because they are patronizing and reflect a deeply flawed ethic regarding experimentation on humans. Yes, websites are born of experimentation: algorithms are coded and tweaked, and human experience, expressed as data points, is reduced to inputs and outputs for whatever consumer, social, or mercenary purpose a website might have. No spoilers there.
The OkCupid co-founder’s point is that everyone should stop freaking out about being experimented on, because the whole thing was always an experiment: adjustments to a site’s algorithms redirect your experience. As Rudder commented to BuzzFeed, “at OkCupid if the algorithm changes, yeah, they go on different dates, discover different people, maybe even marry somebody different. But that’s not me playing god, that’s just a fact of the service. Any decision the site makes has those implications because people are really using these services in their lives.”
It’s quite true that it is a mistake to treat data manipulations as wildly distinct from the ordinary functioning of social media sites. We are not talking about some neutral or pure interactive space that evil scientists then interfered with to manipulate our emotions without our consent.
Outrage over these experiments could suggest that we too readily trust that data is essentially neutral or “honest.” But clearly, social media informs the structure of our interactions and how we exist and identify ourselves through them. As Rudder points out, our formation as online subjects is already manipulated and shaped by the scientists who create these digital platforms. Consistently, the interests of consumer capital are a driving force here. We were never just sets of pure data, floating together in neutral cyberspace.
That said, there are a couple of useful ripostes to Rudder’s logic. First, he is wrong to suggest that there is no discomfort with, or desire to resist, how our daily lives are mediated through the strictures of social media, even in the absence of deceitful experiments. We know too much about corporate and government surveillance, especially in light of Edward Snowden’s leaks, to ignore the troubling repercussions of living networked lives online. Questions of consent are by no means settled here; the extent of our allowance of, or active participation in, our own surveillance remains a nagging issue under advanced techno-capitalism.
But I think the key mistake at play in the debate over manipulations at Facebook or OkCupid is where we’re locating the problem. This is not an issue unique to cyberspace. Social media theorist Nathan Jurgenson has rightly challenged the fallacy of “digital dualism” and argued against the sharp distinction drawn between online and “real” lives. I believe that if we forget for a second that we’re talking about our cyberlives, the problems of the Facebook or OkCupid experiments become clearer.
The shaping and manipulation of our interactions in ways we neither see nor explicitly agree to are not confined to cyberspace. Don’t urban planners, developers, politicians, and law enforcement authorities influence the ways in which we are permitted, encouraged, and coded to act? Physical space, like online space, is not a neutral terrain. We are ushered into, and live through, sets of social relations that we didn’t explicitly consent to as autonomous agents. We navigate this world as consumers, workers, girlfriends, and any number of other identities and conformities that constitute our social reality. To be sure, none of this is simple or unproblematic.
Yet it seems relatively uncomplicated “in real life” to distinguish between the codings that affect how we ordinarily live (problematic as they may be) and other methods of manipulation or abuse. For example, within the context of late capitalism we can still point out the specific mendacity of, say, touting subprime mortgages or high-interest credit cards to poor and poorly informed communities. We can decry instances of rape culture, even while recognizing the structural problem of patriarchy that informs it.
The point being: data scientists are hiding behind an invalid dualism. The world — digital, physical, and otherwise — is plagued and shaped by codes that go unquestioned, beholden to the vagaries of existing power structures. This is the world we navigate. But within this world, we are able to point to specific acts of deception and bad faith and say, “that’s fucked up.” In this way, the Facebook and OkCupid manipulations are fucked up by virtue of being in bad faith.
The point, then, is that experiments on our online lives are not issues unique to digital living. The failure of data scientists to admit wrongdoing in these instances reflects their inability to integrate the digital and the physical worlds. We are accustomed to living with the over-determining structural violences of capitalism, patriarchy, and racism, but we maintain the ability to point out egregious violations within these contexts. In the same way, we know our digital lives and selves to be coded and shaped, but that does not mean we cannot point out egregious examples of manipulation. Drawing these lines within already problematic contexts is what ethics is all about.
Follow Natasha Lennard on Twitter: @natashalennard
Image via Wikimedia Commons