Between 2016 and 2018, Ian Beer found more than 30 vulnerabilities in the iPhone’s operating system. These bugs were worth millions of dollars on the open market.
Beer’s ultimate goal is to make the iPhone even more secure, but he doesn’t work at Apple. He’s a hacker for Google’s Project Zero, an elite team focused on finding bugs in popular software and products made by companies like Apple and Microsoft, and by Google itself.
In between hunting bugs, Beer has also released a tool to help other researchers jailbreak the iPhone in order to find bugs in iOS, and taunted Apple during a talk at a security conference where he asked the company to donate his would-be rewards to a charity.
“I’d love to get a chance to sit down with you and discuss how together we can make iOS even more secure for all our users. Cheers,” Beer wrote in a tweet directed at Apple’s CEO Tim Cook at the time. (Apple has tried to hire Beer, according to two sources with knowledge of the matter, but for now he remains at Google.)
In other words, Beer wants to avoid precisely the scenario he laid out in a bombshell analysis he published at the end of August, where he detailed a shocking years-long campaign to hack iPhones in China.
“The reality remains that security protections will never eliminate the risk of attack if you’re being targeted,” Beer, who used to work at GCHQ, the UK’s intelligence agency, wrote in a blog post. “To be targeted might mean simply being born in a certain geographic region or being part of a certain ethnic group. All that users can do is be conscious of the fact that mass exploitation still exists and behave accordingly; treating their mobile devices as both integral to their modern lives, yet also as devices which when compromised, can upload their every action into a database to potentially be used against them.”
The researcher did not name them, but that “certain ethnic group” was later revealed to be the persecuted Uyghur minority in China’s Xinjiang province. Apple confirmed news reports that the Uyghurs were the targets of the campaign in a statement that challenged some details of Google’s report.
This latest research has earned Google headlines all over the world, but Beer is only one member of a team of superstar hackers, which includes Natalie Silvanovich, Tavis Ormandy, and Jung Hoon Lee, the 25-year-old who was once dubbed one of the world’s best hackers by The Register. Ever since Project Zero was announced in 2014, these hackers have taken apart software used by millions of people—and predominantly written by other companies’ engineers—with a mission to “make zero-day hard.”
Do you work at Google, Apple, or in the security team of another tech giant? Are you a vulnerability researcher? If you have any tips about vulnerabilities and exploits, and the market for them, using a non-work phone or computer, you can contact Lorenzo Franceschi-Bicchierai securely on Signal at +1 917 257 1382, OTR chat at lorenzofb@jabber.ccc.de, or email lorenzofb@vice.com.
Zero-days are vulnerabilities or bugs in software that are unknown to the software maker, meaning they haven’t been patched yet. Hence the name: the company has known about the issue, and had the chance to fix it, for zero days. As well as vulnerabilities, the term zero-day can also refer to exploits, the techniques and code used to take advantage of those bugs.
In five years, Project Zero researchers have helped find and fix more than 1,500 vulnerabilities in some of the world’s most popular software, according to Project Zero’s own tally. In Apple products, Beer and his colleagues have found more than 300 bugs; in Microsoft’s products, more than 500; in Adobe’s Flash, more than 200. Project Zero has also found critical issues in CloudFlare, several antivirus apps, and chat apps such as WhatsApp and FaceTime. A Project Zero researcher was also part of the group that found the infamous Spectre and Meltdown flaws in Intel chips.
These numbers show Project Zero has had a massive impact on the security of devices, operating systems, and applications used by millions of people every day.
For Google, these disclosures generate good publicity by showing how much the company cares about the security not just of its users, but of everyone else too. In assembling one of the most elite hacking teams on the planet, Google is signaling to its customers that it takes security very seriously. Along the way, Google has given itself an excuse to probe its competitors’ products and software, doubtless learning from others’ security mistakes. Project Zero has been able to poke holes in the bulletproof mystique of the iPhone, widely believed to be the hardest consumer device to hack. In doing so, Google is able to insert itself into conversations it might not otherwise be a part of.
Regardless of Project Zero’s true mission, there’s no doubt that the team has had a profound influence on the cybersecurity industry in the last five years.
For one, Project Zero has normalized something that years ago was more controversial: a strict 90-day deadline for companies that receive its bug reports to patch the vulnerabilities. If they don’t patch within that time frame, Google publishes the details of the bugs itself. Microsoft, in particular, was not a fan of this policy at the beginning. Today, most companies that interact with Project Zero respect the 90-day deadline as an industry standard, a sea change in the always controversial debate over so-called “responsible disclosure”—the idea that security researchers who find vulnerabilities should first disclose them to the affected company, so that it can fix them before the bugs are exploited by hackers. According to its own tally, around 95 percent of bugs reported by Project Zero get patched within that deadline.
“People looked at the way the wind was blowing and then decided that—maybe just maybe—instead of creating a fuss, creating a fix within 90 days was just easier,” said Chris Evans, Project Zero’s original team leader.
But perhaps no accolade is more significant than how much people on the other side of Project Zero’s fence, whom Evans would call the “insecurity industry,” hate the Google hackers. This “insecurity industry” is made up of companies like Azimuth Security and NSO Group, government contractors whose job is to find bugs and write exploits. But instead of reporting the vulnerabilities to the companies that make the software, these firms sell them to governments, who turn them into tools to hack and surveil targets.
“Fuck those guys,” said a researcher who works for a company that does offensive security, referring to Project Zero. “They don’t make the world safer.”
The researcher, who spoke on condition of anonymity because they are not allowed to talk to the press, said that zero-day vulnerabilities are sometimes used to go after terrorists or dangerous criminals. So when Project Zero kills those bugs, it may be killing tools used by intelligence agencies to go after the bad guys, according to the researcher.
In the end, however, Project Zero isn’t really stopping the trade of exploits to governments. If anything, Project Zero is highlighting areas where hackers can find more bugs. According to the researcher, by finding and reporting high-quality bugs, Project Zero is driving up the cost of other bugs and exploits, as they become rarer and harder to find.
“The price goes up and they’re making us all rich,” the researcher said. “Life does get hard, we just charge more. Keep doing what you’re doing, cause I’m getting richer.”
Ben Hawkes, Project Zero’s current team leader, thinks that being open about zero-days and detailing them in blog posts ultimately benefits users by pressuring companies to improve the security of their products, and by showing them what skilled hackers can do.
“We want to help provide accurate understanding of how attacks work to a wider audience so that users, customers, can ask the right questions and ask for the right things from their vendors and suppliers,” Hawkes said when we met in Las Vegas this summer.
But some think Project Zero may actually be helping law enforcement and intelligence agencies learn from its research and develop what are known as N-day or 1-day exploits. These are hacks based on zero-days that have been disclosed—hence their name—but work until the user applies the patch. According to some critics, the idea here is that malicious hackers could lift the code published by Google researchers as part of their reports and build on it to target users who have yet to update their software.
Earlier this year in Las Vegas, during the Pwnie Awards ceremony, a mostly satirical affair that recognizes the best and worst hacks of the year, Azimuth Security founder Mark Dowd joked about this idea. While introducing an award, Dowd called NSO Group “the commercial arm” of Google Project Zero.
There is no evidence that government hackers have taken the exploits published in Project Zero’s research, and turned them into hacking tools. But—at least in theory—it could have happened, or could happen in the future.
“There are a lot of ways in which attackers can try and create exploits for things that are known,” Evans said. “It’s an age-old problem of security disclosure, right? If you disclose something, will the bad people turn around and use it?”
That shouldn’t stop Project Zero, or others, from sharing detailed knowledge of bugs and exploits, according to Evans.
“Let’s be clear, this will happen. Eventually, some researcher, maybe Project Zero, maybe someone else, they’ll publish something, and some future harm will occur. But you know, they did the right thing by sharing what they found, so that we can all learn from it,” Evans said. “You got to reserve your anger for the person that did the bad thing, not the security researcher who was just on the journey we’re all on to share things and learn together and grow together.”
Hawkes also agrees that there is a risk of something like this happening, but says it’s a matter of keeping an eye on it.
“We have to constantly monitor and assess to make sure that our disclosure policy results in more good than harm,” Hawkes said.
According to him, the reality is that the window of time to exploit users with an N-day is shrinking on many platforms that have gotten better at forcing users to patch, such as iOS. Plus, Hawkes said, it’s not that easy to turn one of Project Zero’s bugs into a reliable N-day exploit. It’s one thing to know there’s a bug in a piece of software, and another to turn it into malware that bad guys can use to hack users. (For example, the much-hyped Windows vulnerability known as BlueKeep was disclosed in May, and hackers have yet to turn it into an exploit that works seamlessly and can cause significant damage.)
“There’s actually a substantial amount of additional research and development that you have to perform to take a Project Zero report and turn it into something that is pragmatically useful as an attacker in the wild,” Hawkes said. That’s because in practice attackers need to chain several different exploits to hack a target, so Project Zero may give them information on one of those links in the chain, but then the attackers need to use that exploit with others.
“The set of people that have that capability, that can perform that research and development. The theory is that they also have the capability to find the zero day themselves. […] That set of people appears to be very very small,” he added.
In other words, there aren’t many teams outside of Project Zero and offensive-focused hacking outfits that can find highly valuable bugs and write these kinds of exploits.
That’s what makes Project Zero almost unique. It’s a team of extremely talented hackers who devote a lot of time and resources into finding bugs in the software of anyone they think is an interesting target. This means that more often than not, highly paid Google hackers are looking at code written by other companies.
“I think we’ll look back on Project Zero as previous generations did on AT&T Bell Labs: such a concentration of brilliant minds allowed to direct their own research is a beautiful thing, for however long it lasts, even if not long for this world,” said Katie Moussouris, a security researcher who launched the Microsoft Security Vulnerability Research (MSVR) team in 2008. MSVR is a pioneering program with goals very similar to those of Project Zero: namely, to find, report, and help fix bugs in third-party software. But MSVR has always been more of a program than a team. There are no dedicated full-time researchers on MSVR, while around a dozen researchers work for Project Zero.
Hawkes is convinced that the world needs more Project Zeros. With that goal in mind, he said he and his colleagues have started to talk to other companies about building a “coalition” of Project Zeros across the industry, academia, and nonprofits.
It’s unclear what this coalition would look like, or who would be willing to be part of it.
Eric Doerr, who manages MSVR at Microsoft, said that there are no plans to have researchers work on that program full time, at least for now. We also asked Facebook, Amazon, and Apple if they had any plans to follow in Google’s footsteps. Facebook declined to comment, while Apple and Amazon did not respond.
Moussouris agreed with Hawkes that the world needs more security researchers who focus on hard-to-find bugs and then tell the world how their attacks worked, no matter who they work for and where the bugs are.
“Without this level of technical detail in the public eye,” she said, “defenders don’t stand a chance.”
Correction: a previous version of this story quoted Mark Dowd as saying Project Zero was NSO Group’s commercial arm. In reality, Dowd’s joke was the other way around. We regret the error.