
Wikipedia Is Being Ripped Apart By a Witch Hunt For Secretly Paid Editors

Image: Evan Lorne/Shutterstock

Wikipedia is a strange beast: a latter-day Library of Alexandria that has to work overtime to keep erasing graffiti from its walls. For a brief moment in 2014, goalkeeper Tim Howard, star of that year’s US World Cup soccer team, was listed on the site as the “secretary of defense,” and earlier this year House Speaker Paul Ryan found himself included on a list of invertebrates. Vandalism has plagued Wikipedia since its inception in 2001, yet the site has flourished in spite of it. Paid editors, who create and edit text to suit the needs of their clients, present a thornier challenge, since some of the work they do is permissible under the Wikimedia Foundation’s terms of use as long as they disclose these conflicts of interest on their user pages. Not all paid editors make these disclosures, perhaps preferring the advantages offered by the appearance of objectivity, and the same policies that have made Wikipedia so useful—anonymity and consensus—are now at the heart of a controversy over how to prevent undisclosed paid editors from altering the encyclopedia’s text.

James Heilman, a medical doctor and university professor who edits Wikipedia under the username Doc James, is one of many Wikipedians who have urged closer oversight of this practice. “The problem of undisclosed paid promotional editing, or UPPE, has grown as Wikipedia has become more popular in the search rankings,” he told Motherboard. “We currently have multiple companies that send emails out both to those who currently have Wikipedia pages asking if they would like them ‘improved’ as well as to companies and individuals to create pages when they do not exist. Nearly all the people involved in UPPE are in breach of both community and Wikimedia Foundation policies.”


Michael Skirpan, a PhD candidate in computer science at the University of Colorado whose recent work has focused on surveillance and data collection, believes that Wikipedia’s viability hinges on a successful resolution to matters that might raise doubts about the site’s integrity and reliability. “Wikipedia’s status as an advertisement-free, well-referenced source for information and reference is integral to its utility,” he explained to Motherboard. “If it will remain a truly open platform, protecting this goal will require investigations into prolonged disputes in which a suspected but undisclosed paid actor continues to edit in misleading ways.”

Lots of people have meddled with Wikipedia for public relations reasons, including government actors. Congressional staffers have edited the entries for Vice President Mike Pence and former NSA contractor Edward Snowden, making Pence look better while describing Snowden as a “traitor.” But Wikipedians can quickly resolve such obvious issues, much as they can supervise the work of paid editors whose corporate affiliations are disclosed. It is far more difficult to oversee and fact-check the work of undisclosed editors who earn a living by generating Wikipedia entries for not-so-public figures and fluffing the profiles of corporations and other major players.

“Those involved in UPPE simply do not write neutral content. This isn’t the same as disclosed paid editing. If Consumer Reports or the CDC wants to pay people to help improve Wikipedia’s health content, I am all for that. They must still disclose, but that is positive ‘paid editing,’ so long as they aren’t hiring people to improve the articles about their own organizations,” Heilman said. “We are in need of more tools to get these UPPE issues under control. Part of what we struggle with is that the Wikimedia Movement is partly born from a group that hold anonymity in exceedingly high esteem.”


In early January, Heilman and several other Wikipedians asked the Wikimedia Foundation (WMF), which hosts Wikipedia and other wikis, to address that very problem, including the question of how far WMF’s legal department might go in using lawsuits to stop the practice. A prior set of guidelines for acceptable paid editing, proposed in 2010, had failed to gain approval; this new discussion focused more narrowly on editors’ failure to disclose their conflicts of interest and on the best way of investigating such editors.

Jacob Rogers, legal counsel for the Wikimedia Foundation, explained that courtroom maneuvers can only go so far, writing in an email, “legal actions are a blunt instrument: courts use the same solutions to every problem, and have not kept up with the speed of modern technology.” In Rogers’ opinion, the best role for the Wikimedia Foundation was “helping to improve community systems and technological tools so that legal action isn’t necessary in many cases.”

Rogers and his legal department colleagues then clarified their response by issuing an advisory statement on “paid editing and outing.” The statement emphasized that the first line of defense in most cases is the community itself, which can respond by educating users, warning them, and, finally, blocking them if necessary. It also reminded Wikipedians that “outing,” or “doxing,” others was generally frowned upon and considered “sufficient grounds for an immediate block.” While a “fair investigation” of undisclosed paid editing can sometimes be difficult to distinguish from “hound[ing]” the person under investigation, WMF affirmed it had “seen the Wikimedia communities successfully address [other] issues that require making difficult judgments.”

Wikimedia’s legal team told users that it functions best as the last line of defense in these situations. It collects data on paid editors, can pursue legal action against those who use Wikipedia trademarks (such as its logo) on their business pages, and may send cease-and-desist letters to repeat violators who continue to disregard Wikipedia’s terms of use.

Heilman, who had published an op-ed about undisclosed paid editing back in July 2016, then created a workpage in response to the WMF statement, writing in an email that such a page would allow fellow Wikipedians to “more easily apply the G5 speedy deletion criteria [a policy that allows for the rapid removal of pages created by blocked or banned users].” He added that he would not be updating the list until these community policy discussions were complete.


Skirpan, the University of Colorado PhD candidate working on surveillance and data collection, believes undisclosed paid editors shouldn’t be able to exploit Wikipedia by availing themselves of the anonymity the community values so highly. “A user who refuses to disclose an affiliation should be interpreted as having taken a risk that may result in certain loss of rights on the platform,” he said. “Investigations of undisclosed paid editors need not end in full exposure of an actor’s identity, but rather with a statement that the editor was found to be an affiliate of an organization with a conflict of interest on the article. On the other hand, for severe or repeated cases, Wikipedia may need to modify their harassment policy to state that if an editor makes repeated attempts to mislead the public, they are risking their privacy rights as investigations are central to maintaining content integrity.”

Heilman, the medical doctor who edits Wikipedia under the name Doc James, has several targeted solutions that, if implemented, could address the problem. “For starters, we can allow hyperlinking to job postings to buy or sell Wikipedia pages and allow the creation of lists of known bad actors [one of which he has already started], which will not only help with detection of further sock accounts from these groups but hopefully warn those who may consider hiring them not to do so,” he wrote in an email. “Furthermore, we can work with online marketplaces such as Fiverr to remove accounts on these platforms that are in breach of our terms of use, develop AI tools to help human editors pick up further socks [people who use multiple Wikipedia accounts for improper purposes], and strengthen our notability criteria for creating articles in order to make them less susceptible to being gamed.”

As this situation continues to develop, Wikipedians and others may wish to consult some early research about the nature of contributions to the site. In 2006, programmer and hacktivist Aaron Swartz made a compelling argument that “occasional contributors” have been integral to the site’s development and that “instead of trying to squeeze more work out of those who spend their lives on Wikipedia, we need to broaden the base of those who contribute a little bit.” A year later, researchers Denise Anthony, Sean Smith, and Tim Williamson published a paper on open-source production that found “the highest quality writing” on the site came from “vast numbers of anonymous ‘Good Samaritans’ who contribute only once.” The number of active editors has fallen in recent years, however, and problems related to editorial misconduct have only been magnified now that a smaller number of people wield more clout.

In Skirpan’s opinion, striking the right balance between encouraging contributors and supervising them will remain a collective effort. “Jumping to either extreme of locking down the platform to select editors or giving up on supervising integrity will hurt Wikipedia in the long run,” he said.