Wiki's worth, on a different turf

Avril Lavigne's fans launched a fake campaign to make her music video the most watched clip on YouTube

An Indian duo, a programmer and a mathematician, has developed a tool to expose anonymous writers and cleanse Wikipedia of rogue editors

Bangalore-based Kiran Jonnalagadda, a Web programming guru, and Hans Varghese Mathews, a mathematician, are new entrants to the emerging field of Wikipedia research. The duo is credited with building Wiki Analysis, a tool that helps researchers understand the growing phenomenon of astroturfing, the practice of faking grass-roots support on Wikipedia and other websites. Wikipedia is the first Google result for most searches, and this has made it a popular destination for those trying to manipulate public opinion on the Internet. Corporations, governments and even pop artists have been caught astroturfing in the past.

Jonnalagadda and Mathews are among 34 researchers from 17 countries attending WikiWars, a two-day conference in Bangalore that concludes today. The conference is taking a fresh look at many aspects of the world’s biggest encyclopaedia, the sixth most popular website on the Internet.

The first generation of astroturfing on Wikipedia has been largely unsophisticated, with little attention paid to covering up digital evidence. Remember the campaign Avril Lavigne’s fans launched last year that turned her music video Girlfriend into the most viewed clip on YouTube? Or Wal-Mart Stores Inc., which contracted its public relations firm Edelman to maintain a fake website called “Working Families for Wal-Mart”, where Edelman staff posed as ordinary citizens countering the firm’s union-backed critics.

It is well known that platforms such as Twitter and Facebook, with opaque management procedures, are susceptible to astroturf campaigns. Supporters of open licensing and peer production have always held that Wikipedia and other community-managed platforms are protected thanks to their transparency in policies and practices. But as far as Wikipedia researchers are concerned, the jury is still out.

Microsoft tried to pay technology blogger Rick Jelliffe to edit Wikipedia articles connected to OOXML (Office Open XML) during the ISO (International Organization for Standardization) approval process, in an attempt to influence the global vote. OOXML was the new file format for MS Office documents that urgently needed approval to check the growing popularity of Open Office. A user called “Ril_editor”, active between September 2007 and May 2008, who claimed to be working out of Reliance Industries Ltd chief Mukesh Ambani’s offices, tried to expunge pages connected to negative publicity about Reliance. Scientologists were blocked by Wikipedia’s arbitration committee when they were found trying to systematically undermine Wikipedia’s NPOV (neutral point of view) policy. NPOV is Wikipedia’s particular spin on non-partisanship, providing equal space to all opinions. However, some Wikipedia researchers, such as Geert Lovink, head of the Institute of Network Cultures, Amsterdam, and co-organizer of the WikiWars conference, believe that the dominance of English and the requirement of textual citations have meant that NPOV is never translated into practice.

A team based out of the Santa Fe Institute in the US has developed WikiScanner, a public database of IP addresses that helps reveal the organizations behind anonymous edits on Wikipedia. WikiScanner has been used to expose the US Central Intelligence Agency’s manipulation of pages. It doesn’t yet work for edits by authenticated users. The WikiScanner team has also developed another tool, Potential Sock Puppetry, which exposes those who use multiple user accounts from the same IP address. However, both tools can be circumvented by purchasing multiple data cards or by getting people to work from public access points such as coffee shops and cyber cafés.
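
The core idea behind a WikiScanner-style lookup is straightforward to illustrate: anonymous edits are logged under the editor's IP address, which can be matched against the IP ranges registered to an organization. The Python sketch below is a hypothetical illustration only; the ranges, names and helper function are invented and do not reflect WikiScanner's actual code or data.

import ipaddress

# Illustrative organizational IP ranges (invented, not real registrations)
ORG_RANGES = {
    "Example Corp": ipaddress.ip_network("198.51.100.0/24"),
    "Example Agency": ipaddress.ip_network("203.0.113.0/24"),
}

def attribute_edit(edit_ip: str) -> str:
    """Return the organization whose registered range contains edit_ip."""
    addr = ipaddress.ip_address(edit_ip)
    for org, network in ORG_RANGES.items():
        if addr in network:
            return org
    return "unknown"

# Anonymous Wikipedia edits are recorded under the editor's IP address,
# so a lookup like this can suggest where an edit originated.
print(attribute_edit("198.51.100.42"))  # -> "Example Corp"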

It is this gap the Indian duo’s tool tries to plug. The first version of their Wiki Analysis tool clusters users into potential lobbies based on the pages they edit within a date range. The tool’s next version will cluster users into lobbies based on the words they consistently add and delete across pages. Says Jonnalagadda, “Wikipedia is now close to a decade old and has many articles that have existed since its earliest days and have been edited by thousands of individuals.” It is now the primary encyclopaedic destination for Internet users, and that makes it a ripe target for astroturfing. At no point in the history of human civilization have so many collaborated over so long to produce one canonical document on any article of human knowledge.
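
To make the clustering idea concrete, here is a minimal sketch of the first-version approach described above: editors are grouped by the overlap in the pages they edit within a date range. The toy data, similarity measure and threshold are invented for illustration; this is not the authors' actual Wiki Analysis code.

from itertools import combinations

# user -> set of pages edited within the chosen date range (toy data)
edits = {
    "user_a": {"Reliance Industries", "Mukesh Ambani", "Dhirubhai Ambani"},
    "user_b": {"Reliance Industries", "Mukesh Ambani"},
    "user_c": {"Climate change", "Renewable energy"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two editors' page sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Pairs of editors whose page sets overlap strongly are candidates
# for belonging to the same lobby.
THRESHOLD = 0.5
lobby_pairs = [
    (u, v)
    for u, v in combinations(edits, 2)
    if jaccard(edits[u], edits[v]) >= THRESHOLD
]
print(lobby_pairs)  # -> [('user_a', 'user_b')]

The planned second version, which clusters editors by the words they consistently add and delete, would replace the page sets above with the sets of terms each editor inserts and removes across pages.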

“Wikipedia users rarely bother to check how a page was edited, but that information is all there, available to anyone who cares to look. We’re building the tools to help make sense of it,” Jonnalagadda says. Once Wiki Analysis is ready, you will be able to check if, for example, the editors of the climate change page on Wikipedia are more interested in ecology or energy.
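
The revision data Jonnalagadda refers to is available to anyone through Wikipedia's public MediaWiki API. A minimal sketch, assuming the third-party requests library and the API's standard query parameters, that lists the editors behind a page's most recent revisions:

import requests

def recent_editors(title: str, limit: int = 50) -> list[str]:
    """Fetch the usernames behind the most recent revisions of a page."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvlimit": limit,
            "rvprop": "user|timestamp",
            "format": "json",
        },
    )
    pages = resp.json()["query"]["pages"]
    revisions = next(iter(pages.values())).get("revisions", [])
    return [rev["user"] for rev in revisions]

# For example, the recent editors of the climate change article:
print(recent_editors("Climate change"))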

Original article on Livemint
