By Andreas Kolbe
One of the worst things about Wikipedia is how it provides a platform for malicious, anonymous slander. It did not have to be this way.
Israeli journalist Gideon Levy’s father was recently defamed as a Nazi collaborator on Wikipedia, and the hoax spread instantly to other websites, including one news site that reported the spurious information had been removed and claimed the article had been “censored”. Levy had to engage Haaretz’s lawyer to have the item withdrawn, an option not open to everyone, as he rightly observes:
Wikipedia had published, for one day apparently, information planted there, that my father, Dr. Heinz Levy, had collaborated with the Nazis and therefore was awarded the position of district legal adviser under that horrific regime. When he came to Israel, he changed his name from Heinz to Zvi in order to blur his past, it added. All of this was reported by Rotter and a picture was added of the page in Wikipedia before it was “censored.”
I was in shock. I have been the subject of quite a few aspersions before but never anything like that. What can be done about slander of this type? How does one start to refute a revolting lie which in another second will spread like wildfire among the virtual thorn fields of the Internet? […]
Wikipedia published the information, even if only for a very short time. Only the decisive intervention of Haaretz’s lawyer, attorney Tali Lieblich, who sent a sharply worded letter to the management of Rotter, led to the (immediate) removal of the slanderous item. Not everyone has a lawyer, or a friend who brings to his attention the fact that he has been slandered on the Web. From my point of view, the affair has ended but it has not concluded: The reports about my father are continuing to circulate on the Web.
This type of defamation happens time and again on Wikipedia. The British journalist Johann Hari edited the biographies of people he disliked under an assumed name, describing them as alcoholics, antisemites or homophobes. The changes remained live for days, sometimes weeks. When he was eventually unmasked, Hari was mortified and proffered an apology:
I took out nasty passages about people I admire – like Polly Toynbee, George Monbiot, Deborah Orr and Yasmin Alibhai-Brown. I factually corrected some other entries about other people. But in a few instances, I edited the entries of people I had clashed with in ways that were juvenile or malicious: I called one of them anti-Semitic and homophobic, and the other a drunk. I am mortified to have done this, because it breaches the most basic ethical rule: don’t do to others what you don’t want them to do to you.
Had Wikipedia not allowed anonymous editing of biographies, Hari would never have been tempted to act in this manner, and the episode would never have happened. Anonymous editing of biographies naturally invites this sort of abuse.
There has been more than one case of an academic’s career or family life being ruined by malicious editors turning their biography into an attack piece. The Turkish scholar Taner Akçam, for example, was defamed as a terrorist in his Wikipedia biography and promptly refused entry to Canada, on the strength of anonymous defamation in a Wikipedia article. Robert Fisk wrote an article on the case for the UK’s Independent a few years ago.
Earlier this year, the feminist Anita Sarkeesian had porn images plastered over her Wikipedia biography when she commented on gender roles online. She rightly described it as harassment:
This was not done by just one or two trolls but was a coordinated cyber mob style effort involving a whole gang working together. The screenshot below was downloaded directly from one of the internet forums organizing the harassment. They were proudly posting this image as a trophy to boast about what they were doing and to encourage others to join in.
The vandalism included changing the text, changing the page categories, changing the external links to reroute to porn sites and adding a drawing of a woman with a man’s penis in her mouth captioned with “Daily Activities”.
These abuses are a reflection of how irresponsibly the English Wikipedia is managed by its anonymous crowd, which appears to savour its “right” to write about – and defame – people anonymously. The site’s obsession with anonymity, and its sanctions for violating it, are quite as pronounced as they are on Reddit, which last week banned Gawker from its pages after Adrian Chen, one of Gawker’s journalists, revealed the identity of a Reddit moderator who had specialised in creating deliberately offensive Reddit forums:
Judging from his internet footprint, Brutsch, 49, has a lot to sweat over. If you are capable of being offended, Brutsch has almost certainly done something that would offend you, then did his best to rub your face in it. His speciality is distributing images of scantily-clad underage girls, but as Violentacrez he also issued an unending fountain of racism, porn, gore, misogyny, incest, and exotic abominations yet unnamed, all on the sprawling online community Reddit. At the time I called Brutsch, his latest project was moderating a new section of Reddit where users posted covert photos they had taken of women in public, usually close-ups of their asses or breasts, for a voyeuristic sexual thrill. It was called “Creepshots.” Now Brutsch was the one feeling exposed and it didn’t suit him very well.
The striking hypocrisy here is that Reddit considered it a vital element of free speech for people to be able to post upskirt shots of unsuspecting teenage girls, but felt that free speech should not extend to revealing the identity of the 49-year-old who posted such shots and encouraged others to do so.
Wikimedians display the same combination of disregard for the rights of image subjects and biography subjects on the one hand, and a claim of entitlement to total and complete privacy for themselves on the other. The result is that the people written about in Wikipedia suffer anonymous defamation, with little recourse available to them. Subjects and authors of sexual images taken in private settings have at times found it almost impossible to get such an image deleted, even when it was uploaded without their consent. Posses of pseudonymous editors will block the deletion.
Biographies of living people in the English Wikipedia are generally the first Google hit for the person’s name. What a bizarre situation! People withholding their names – who at times include professional rivals, recently divorced ex-husbands, jilted lovers, angry neighbours, disgruntled former lodgers and trolls – are allowed to write the top Google hit for a named person. Wikimedia became aware of the scale of the problem some time ago. In 2009,
Wikipedia announced that it planned a move that many saw as a step away from its freewheeling ethos of anyone can edit. The plan, called “flagged revisions,” would be limited to articles about living people, and would require that material be signed off on by an experienced editor before it would be seen by the general reader. In essence, there would be a layer of review that would prevent some “edits” from appearing immediately.
At the time, the initiative was championed by Jimmy Wales and reflected concerns over several biographical mishaps:
Wales posted his plea to implement a pre-screening effort called Flagged Revisions after several unfortunate, but not unfamiliar incidents last week on the site including edits falsely reporting the deaths of Sens. Robert Byrd (D-W.Va.) and Edward Kennedy (D-Mass.). Janis Joplin’s entry was also tampered with and eventually locked down after a “30 Rock” episode aired involving the cast messing with the entry themselves. “This nonsense would have been 100% prevented by Flagged Revisions,” Wales wrote.
Wales was correct, and the same goes for the more recent acts of vandalism discussed above. But in the end, the English Wikipedia “community” of pseudonymous editors blocked the initiative. Despite all the media fanfare, the site-wide introduction of Flagged Revisions for English Wikipedia biographies never happened.
So while a few language versions of Wikipedia (German and Polish, for example) use the Flagged Revisions feature on all articles, which ensures that changes are looked at by a trusted editor before they are shown to the public, in the English Wikipedia everything goes live immediately – including harassment and defamation in biographies. The result is real harm to living people, all for fear that any quality control would reduce the site’s page views and participation.
And this is the crux: Wikipedia is not managed to produce a reliable reference work. It is managed to maximise page views and donations, and to minimise responsibility and accountability for its anonymous editors.
Wikimedia Foundation donors are often under the impression that their donations will be used to increase the quality and reliability of Wikipedia content. But the Wikimedia Foundation takes no active part at all in the generation and control of Wikipedia content, in part to prevent the possibility of being held legally responsible for any of it. The Wikimedia Foundation defines itself as a provider of an electronic service, like an e-mail service provider, rather than as a publisher of information. The content is all in the hands of its generally anonymous or pseudonymous editors, under a system which the Wikimedia Foundation has designed to minimise responsibility and accountability, both for itself and its contributors.
According to its own financial statements, the Wikimedia Foundation currently has more than $30 million in unspent net assets. Its revenue has increased more than 12-fold over the past five years, and in every year it has taken in far more money than it has spent. Critics say that a gravy train is developing for national Wikimedia chapters, which find they have more money than they know what to do with. Yet the fundraising banners will soon be back, along with the harassment and defamation cases. What will the millions be spent on?
You will look in vain for “measures to improve content quality and reliability” among the WMF Executive Director’s key organizational priorities.
Photo credit: Flickr/myeralan — licensed under Creative Commons Attribution 2.0 Generic (CC BY 2.0)