Our collective ambivalence towards the social media sites we love to mindlessly scroll changed forever in 2016. The now-infamous story goes like this: Facebook had allowed data to be harvested through a quiz called “This Is Your Digital Life”, created by academic Aleksandr Kogan, which was later given to Cambridge Analytica. Cambridge Analytica then used the information collected from those quizzes to target on-the-fence voters with provocative, often hateful videos during recent votes such as the UK’s Brexit referendum and Trump’s presidential campaign. At the centre of this huge digital reckoning are numerous other mini scandals, pointing us to the big, unanswered question: why aren’t our data rights considered human rights?
The Great Hack, a new film by Jehane Noujaim and Karim Amer, follows this story through the eyes of its main characters. The Netflix documentary, out this week, features Guardian journalist Carole Cadwalladr, who broke the story, alongside David Carroll, a US academic currently suing Cambridge Analytica for his data, and Brittany Kaiser, who worked at the company at the time, all of them caught in the midst of the scandal. The doc itself made headlines after Brexit funder Arron Banks threatened to sue Netflix over its release.
VICE spoke to Noujaim and Amer about piecing together the story, how they secured the trust of the main characters, and why nothing has yet been done to stop another big data breach.
VICE: Hi both. Where do you even begin with putting together a film like this?
Jehane Noujaim: We grew up in Egypt, and we made a film called The Square where we saw technology be an incredible force for good, for change, for holding power accountable. But then we saw that pendulum swing in the other direction, where those very same tools were being used to micro-target. In 2016, an election happened, Brexit was happening [and there was] the spread of fake news. We felt this was a very urgent film to make, yet ‘it’ was invisible. There was a deficit of language. Like the environmental movement, it was just starting up, and people found [the data breach] very hard to talk about. But we also didn’t have melting ice caps, we didn’t have the bird covered in oil.
Karim Amer: There were no images that could transcend what was happening.
Noujaim: So, how do we make a film that brings this story to life in a very urgent way? The answer was to find characters who could lead us into the belly of the beast.
It feels like a very character-driven film. How easy was it to get people to talk to you?
Amer: It’s always difficult. I met Carole [Cadwalladr] and Chris Wylie early, before they came out publicly, and we were following that. Our criteria for finding a character is someone who’s in a high-stakes environment, who’s compelling and just about to jump off a cliff. They all understood that they were in the midst of history. There was this kind of awakening: “Wait, this balance of power is not where we thought it was, technology may be more of a wrecking ball than we imagined, data can be weaponised, truth has been collapsed into anything we want it to be, and Western democracy is more fragile than we imagined.”
Who’s the villain of the story?
Amer: I don’t think there is one villain. It’s not Alexander Nix’s fault that the American presidential election is a multibillion-dollar business, and it’s the biggest market of elections in the world. It’s not [Nix’s] fault that we’ve agreed to commoditise all of our personal behaviour and Facebook’s agreed to make it accessible for people. It’s not [Nix’s] fault that he was pursuing the dream of making a start-up that could “move fast and break things.”
That doesn’t mean [Nix] didn’t do a lot of ethically complicated things that he should be held accountable for, but what I’m saying is that what we’re awakening to is a structural problem. Consent has never been more contested in its definition than now. Is it consensual when you don’t really understand what’s happening to your data, because you just click through since it’s easy and don’t bother to read it? Is it consensual when you thought that Facebook was just a place like Disneyland, where nothing nefarious could happen?
One of the things Cambridge Analytica was best at was mapping out people’s neuroticism and anxiety and creating – with the help of Facebook – the best way to exploit people’s anxiety and make them more anxious, and often, stop them from going to vote. Does this reflect a democratic society? And if it doesn’t, then what are we going to do about it? Are we going to sit there and let them get away with it while we’re on our phones, twiddling away?
At a time of global social tension, to what extent is Facebook aiding that and to what extent is it just representing that?
Amer: [An algorithm] is going to allow you to see and paint the world based off your confirmation bias, and not have to make concessions with other humans in your society. Is Facebook alone to blame for that? No. Is Facebook alone to blame for the fascist rise in Europe? No, of course not. But when Facebook’s tools are being used to foment this further, or being used to exploit this, then that is part of what ‘The Great Hack’ is, right? A hack is a vulnerability and an ability for everyone to be exploited. What Facebook has allowed to happen, in my opinion, is because of gross corporate negligence.
In the oil era, when you had a spill, we could see the visuals like, “Holy shit, look at how it’s destroying the marine life, look at how there’s a bird painted in tar”. It was horrible, and we understood what was happening. Even if we didn’t fully understand what caused it, we were like, “This is wrong”. Whereas with this [data], which we can’t see, there’s no outcry in that regard.
This stuff happened on Facebook’s watch. Whether Facebook caused it, whether Facebook is incentivising it, whether this is the basis of Facebook’s business model, that’s for Facebook to answer. But right now, Facebook has refused to appear before Parliament. Facebook has refused to tell the American and British government what ads were shown in the darkness and to whom, paid for by what entity and how. The inability to do so has left us vulnerable and we’re about to enter into another set of elections.
I think when the Cambridge Analytica scandal erupted, there was definitely a level of disbelief and questioning around the extent to which they were actually responsible.
Amer: The degree to which the campaign worked or didn’t work – that’s the oldest question of marketing. No one in advertising can tell you how it worked or didn’t work, but if advertising didn’t work, it wouldn’t be a trillion-plus-dollar industry. So surely, it must’ve worked and more importantly, it has brought our attention to a much larger structural problem that challenges the survival of the sanctity of the democratic process.
Noujaim: Advertising has always existed, but it’s never been so personalised and individualised at such a massive scale. I think that’s what is terrifying and with this film, people have said that they felt like they were watching an episode of Black Mirror that turned out to be real.
What do you think it would take for data rights to become human rights?
Amer: I don’t know. I think it goes back to awareness and consent.
But this scandal was huge, and many more people know about data breaches and yet – nothing.
Amer: Yes, that’s true. I think we’re going to have to continue to see other wreckage sites – beyond the political landscape and hopefully, those are not more cataclysmic – [to] get us to act. We just have to realise that we’ve ventured into this new era, and if we don’t protect humanity, which is what we’ve always needed ethics to do, it’s on us where we end up.
Thanks both.
The Great Hack is on Netflix (UK) from Wednesday July 24.