This Bot Tweets Photos and Names of People Who Bought ‘Drugs’ on Venmo

On Tuesday, Motherboard reported on a project called Public By Default, in which a researcher took publicly available data on Venmo transactions and was able to identify granular details of users’ lives, including a married couple’s trips to the vet, shopping at Walmart, and orders of particular takeout food.

With that project, the researcher did not publicly release her dataset, and only reported the anonymized findings. Now, a separate programmer has gone in the opposite direction. A new bot called “Who’s buying drugs on Venmo” tweets the usernames and photos of Venmo users who have marked their transaction with a particular drug keyword or emoji.

The bot raises questions not only about whether users are truly aware of the sort of data they expose by leaving certain account settings on their defaults, but also about the point at which already public data takes on a qualitatively different meaning or significance once it is compiled in aggregate and presented in new formats. The bot also potentially exposes people in ways they didn’t originally anticipate, touching on issues of consent and privacy.

“I wanted to demonstrate how much data Venmo was making publicly available with their open API and their public by default settings and encourage people to consider their privacy settings,” Joel Guerra, the creator of the bot, told Motherboard in an email.

A screenshot of the Venmo bot. Image: Twitter

Venmo, owned by PayPal, is a cross between a social network and a mobile payment service, allowing users to easily and quickly send money to each other. Users can make all of their transactions and activity private, but by default, Venmo publicly displays the username, name, photo, and message sent with the money within the service’s app for others to see.

On Tuesday, Motherboard wrote a very basic script to confirm that all of this information was easily available for download in bulk. At the time of writing, this can all be done without any sort of authorization; unlike other platforms such as Twitter, Venmo does not require developers to sign up for API keys.
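To give a sense of what that kind of script can look like, here is a minimal sketch in Python. It is not Motherboard’s script: the endpoint and response fields are assumptions based on the public feed Venmo was reported to expose at the time (venmo.com/api/v5/public), and they may since have changed or been restricted.

```python
# Minimal sketch of pulling public Venmo transactions without authentication.
# Not Motherboard's script: the endpoint and field names below are assumptions
# based on the public feed reported at the time and may no longer work.
import requests

PUBLIC_FEED = "https://venmo.com/api/v5/public"  # assumed public feed endpoint


def fetch_public_transactions(limit=50):
    """Fetch one page of public transactions; no API key is required."""
    resp = requests.get(PUBLIC_FEED, params={"limit": limit}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("data", [])


if __name__ == "__main__":
    for tx in fetch_public_transactions():
        actor = tx.get("actor", {}).get("name", "unknown")
        print(f"{actor}: {tx.get('message', '')}")
```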

As for the new drug-focused bot, according to the Python code on the project’s Github, the script looks for certain drug-related keywords written in transaction messages. These include heroin, marijuana, cocaine, meth, and pills. Some terms also relate to sex, such as blowjob, porn, and hookers.
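The matching itself is straightforward. The following is a simplified illustration of that kind of keyword filter, not the bot’s actual code; the word list and emoji here are examples.

```python
# Simplified illustration of a keyword filter like the one described above.
# This is not the bot's actual code; the word list and emoji are examples.
DRUG_KEYWORDS = {"heroin", "marijuana", "cocaine", "meth", "pills"}
DRUG_EMOJI = {"\U0001F48A"}  # pill emoji, as an example


def matches_drug_terms(message: str) -> bool:
    """Return True if a transaction message contains a flagged word or emoji."""
    words = set(message.lower().split())
    if words & DRUG_KEYWORDS:
        return True
    return any(emoji in message for emoji in DRUG_EMOJI)
```

A filter this naive has no sense of context, which is part of why messages like “not drugs” or “God speed” can end up flagged.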

“Publix/alcohol – ubers/drinks at Bodega,” one message tweeted on Thursday reads. “Cotton balls and rubbing alcohol,” reads another, and a third simply has pill and beer emojis. (While reporting this piece, Motherboard also used a script to collect data on Venmo transactions; the bot appears not to have tweeted some messages in Motherboard’s collection that contain drug terms, suggesting the bot is not catching all of the relevant transactions.)

Guerra said he thought “posting drugs references that I’m sure are mostly (if not exclusively jokes) would be the funniest way” to prove his intended point about Venmo’s privacy.

Indeed, some of these transactions are quite clearly not about illegal narcotics, or even really about drugs. One transaction posted on Thursday included the message “Your love is my drug,” along with the user’s profile picture, which appears to show the user and their spouse or significant other.

The bot is also quite crude at picking out phrases. One tweet includes the message “not drugs,” and another contains the phrase “Funding for your Scotland & Ireland trip. God speed,” with the bot presumably only picking up on the word “speed.”

With that ambiguity in mind, some Venmo users may not appreciate having their username and photograph tweeted to a wider audience with a drug association, even if made in jest.

Guerra seemingly anticipates this. The bio of the Twitter bot reads “if you want yourself removed @ me.”

“This is just for fun. Don’t be mad,” Guerra writes on his Github.

On Friday, after the original publication of this article, Guerra announced in a Medium post he was stopping the bot, and claimed the project’s reception had been “overwhelmingly positive.” (To be clear, the project faced fierce criticism; a privacy engineer tweeted on Thursday that “the fact that data is public does not make it okay to give more exposure to it.”) Guerra also deleted the “This is just for fun” language from the bot’s Github page.

“Hundreds of thousands of people were reading tweets and articles about the bot and discussing data privacy. I saw no further value in tweeting out anyone’s personal transactions anymore,” Guerra’s Medium post continued.

There are myriad examples of coders accessing other datasets and then distributing them in ways that irk the subjects of the data or others. In 2016, a student at a Danish university scraped the profiles of 70,000 people on the dating site OkCupid, then publicly released that information in one easy-to-download set. Academics fiercely pushed back against the project, saying that just because data may be public, that does not absolve anyone of ethical responsibility in handling it.

In another case, an activist recently downloaded the LinkedIn profiles of Immigration and Customs Enforcement (ICE) employees en masse, and pushed them in aggregate to multiple sites. Github, Twitter, and Medium subsequently deleted the database from their own platforms. When another technologist launched a similar project years earlier around the LinkedIn presence of members of the intelligence community, they ultimately faced a number of death threats.

Venmo did not immediately respond to a request for comment.

Update: This piece has been updated to include extra information on the apparent crudeness of the bot, to note that Guerra has since stopped running the bot, and to clarify some criticism of the project.