Hospitals Don’t Trust Google’s Servers Enough to Use Glass

Image: SMU SMC

We’re all pretty afraid of Google Glass, for lots of reasons. Somewhere there was a line, and a computer you’re always wearing on your face is crossing it. There’s an element of Levinas-like reverence for the face, an aversion to an ever more web-mediated experience, and then there are the privacy concerns.

While many have (justifiably) focused on the privacy of those in front of Glass’ omnipresent lens, questions about where all that collected data goes—and who has access to it—are an issue for whoever is wearing the frames as well. Targeted advertising, YouTube identity merging, and NSA-related revelations have taken some of the luster off Google’s halcyon “Don’t Be Evil” days. These days it’s not hard to think of things you see that you don’t want on Google’s servers.


Take, for example, the results of the first “systematic evaluation of Google Glass in the healthcare environment,” which were just published in the International Journal of Surgery. Working with counterparts in Germany, a pediatrician in New York wore Glass for four consecutive weeks and logged how it went, aiming for a more thorough assessment of Glass’ upsides and downsides than the casual reports of early explorers.

There were occasional Glass gaffes that sometimes skewed comical—in the operating room, where sanitary concerns meant that Glass couldn’t be touched, the doctor had the device set to turn on when he tilted his head back 20 to 30 degrees. If it didn’t wake up on the first try or if Glass shut off after a short lag, the doctor had to repeat the motion. “A nurse watching this resulting repetitive backward head bobbing commented that this ‘can’t be healthy,’” the study dryly noted.

Throughout the hospital, though, there wasn’t that knee-jerk anti-Glasshole reaction. “Wearing Glass throughout the day for the study interval was well tolerated. Colleagues, staff, families and patients overwhelmingly had a positive response to Glass,” the study stated.

They reported some software and hardware problems—a short battery life, a lack of hands-free scrolling, some audio problems—but then there were the privacy concerns, which limited where Glass could be tested, even when the patient granted consent.

The study’s lead author and Glass-wearer, Oliver Muensterer, told me via email that, “We were going to wear them in the OR, and then have the other institution [in Germany] proctor or telementor the person doing so. However, our risk management office didn’t allow us to have any connection to the internet while we were taking pictures of patients, and neither did their ethics board.”

But Google Glass’s design means it pretty much has to sync up with the cloud. The main issue with privacy was that Glass synchronized data to the Google server whenever it was charging and connected to a stable WiFi network. “We avoided such a breach by temporarily deactivating the internet connection as well as downloading and deleting all patient data from Glass before the automatic synchronization could take place,” the study stated.

The hospital’s ethics board didn’t believe that the data would be safe in Google’s hands, for some increasingly familiar-sounding reasons. “Recent political events have revealed that government agencies use mass surveillance tactics and have the capability to break into even the most heavily secured networks, including the mobile phones of allied heads of state,” the study stated.

Google’s servers being compromised is one thing, but there are also concerns about Google itself having access to the information collected by Glass. The Electronic Privacy Information Center’s website outlines why this is problematic: “All of the data captured by Glass, including photos, videos, audio, location data, and other sensitive personal information, is stored on Google’s cloud servers. Google will possess the data and may analyze it to develop profiles of individuals.”

EPIC points out that this wouldn’t be out of character for Mountain View. “Google currently scans the contents of emails of its Gmail users in order to target advertising, so it is foreseeable that it could do the same with Glass data.”

Therein lies the rub. The data breach dilemma isn’t just starting; it’s here. Dr. Muensterer and I were exchanging emails—Gmail to Gmail. It might seem simple enough to say, “Well, if Glass isn’t secure, don’t use it,” but it just isn’t that simple. Concerns about Glass data are concerns we should’ve had the whole time. While denying doctors Glass might not seem all that unreasonable now, would you deny them, say, email?

I asked Dr. Muensterer what steps his hospital was taking to ensure that emails, texts, or even phone conversations about patients are kept as securely as the law requires:

“Most places simply forbid it—no texting or emailing of sensitive patient data except when from a secure email (so, for example, within the organization, so that the correspondence never goes beyond the institution’s firewall).

You and I and everybody knows that this is not practical. In today’s health care environment, we rely on data transmission. When I am on call as the pediatric surgery attending, residents text or email me x-rays, or even pictures of wounds or clinical findings, and we make important, timely decisions based on these data. Acuity and patient turnover has reached a level in which we could not function efficiently if we didn’t use modern communication equipment, like smartphones and email. They are great, and really help clinicians do their job well in the interest of the patient.”

Likewise, Google Glass demonstrated “clear utility” in the hospital setting: hands-free two-way communication, photo and video documentation in the sterile operating room, real-time online searches of complex medical conditions and rare syndromes, and lookups of diagnosis and procedure codes for billing were all praised in the report.

The question, then, is whether Google Glass’ data privacy problem is something that, like the inadequate battery life or the lack of medical-specific apps, can be fixed in Glass’ next incarnation. Or is it a problem with the essence of the device? Can Glass ever broadcast something besides a demonstration with a doll?

For what it’s worth, Dr. Muensterer and his team believe that there’s a fairly plausible and easy solution on the tech company’s side. “In the long run, information companies such as Google or Apple may exhibit an interest in working with the medical community to design a secure network and server strategy that could be certified by hospitals for use by their staff,” the study said.

Of course, offering a secure network is one thing, but in Muensterer’s estimation that’s only part of the solution.

“I think institutions should be much more proactive to engage these technologies,” he said. “Bureaucrats and regulators have not kept up with the pace of technology, so the almost universal answer from institutions is that these activities are not allowed.”

Which comes back to the issue of trust. Even as Google reassured international privacy advocates last year that “Glass continues to be reviewed for privacy considerations as part of Google’s comprehensive privacy program,” the number of questions has only grown.

In the medical community, the issue is more than one of changing norms—there’s a very real legal question that comes with constantly videotaping (and transmitting data about) your patients. Experts say that the market won’t exist for Glass for another two years, so the company has two years to work something out. The size of that market is going to depend on how much privacy Google can guarantee, and what Google’s word is worth in 2016.