Why this Site?

Our Mission: We exist to shine the light of scrutiny into the dark crevices of Wikipedia and its related projects; to examine the corruption there, along with its structural flaws; and to inoculate the unsuspecting public against the torrent of misinformation, defamation, and general nonsense that issues forth from one of the world’s most frequently visited websites, the “encyclopedia that anyone can edit.”

Experiment concludes: Most misinformation inserted into Wikipedia may persist

by Gregory Kohs

A months-long experiment to deliberately insert misinformation into thirty different Wikipedia articles has been brought to an end, and the results may surprise you. In 63% of cases, the phony information persisted not for minutes or hours, but for weeks and months. Have you ever heard of Ecuadorian students dressed in formal three-piece suits, leading hiking tours of the Galapagos Islands? Did you know that during the testing of one of the first machines to make paper bags, two thumbs and a toe were lost to the cutting blade? And would it surprise you to learn that pain from inflammation is caused by the human body’s release of rhyolite, an igneous, volcanic rock?

None of these are true, but Wikipedia has been presenting these “facts” as truth now for more than six weeks. And the misinformation isn’t buried on seldom-viewed pages, either. Those three howlers alone have been viewed by over 125,000 Wikipedia readers thus far.

The second-craziest thing may be that when I sought to roll back the damage I had caused Wikipedia, after fixing eight of the thirty articles, my user account was blocked by a site administrator. The most bizarre thing is what happened next: another editor set about restoring the falsehoods, on the theory that a blocked editor’s edits must be reverted on sight.

How reliable is Wikipedia?

When Wikipedia first entered the scene in 2001, lots of people thought how amazing it would be to have an encyclopedia about everything, but a few asked whether we could trust an encyclopedia that any old goofball can edit. Would the average reader be able to tell whether what they encounter on Wikipedia is factual truth, or just creatively vandalized misinformation?

Over the years, a myth took hold as fact for many people, even those with an advanced education: that Wikipedia’s community of editors was on constant watch for vandals, zapping their mischievous bits of misinformation, such that nearly all vandalism was reverted within minutes, if not seconds. The news media could report itself blue in the face about Wikipedia hoaxes like Amelia Bedelia, John Seigenthaler’s supposed role in the Kennedy assassination, the record-breaking fake article about the Aboriginal deity Jar’Edo Wens, or long-running personal vendettas like the one carried out for years by “Qworty”. But Wikipedia’s true believers went on record to spread their faith in the online encyclopedia’s mythically near-perfect record against misinformation, dismissing these egregious cases as rare outliers.

The same news media that would eat up those Wikipedia hoaxes would also regurgitate the evidence from “studies” that praised Wikipedia’s self-healing powers:

* Daily Dot said, “Numerous studies have shown that vandalism — sneaking curse words or lies or libel into articles — gets cleaned up pretty quickly on Wikipedia.”

* The Economist touted, “Normally, such vandalism is corrected within minutes on Wikipedia because other people see it, remove it and improve the entry with their own genuine insight.”

* National Public Radio featured a professor who said, “And I tried to simply delete that reference, and when I did so, within minutes, that page was restored”.

* Dan Gillmor’s 2004 book, We the Media, proclaimed, “Wikipedia draws strength from its volunteers who catch and fix every act of online vandalism. When the bad guys learn that someone will repair their damage within minutes, and therefore prevent the damage from being visible to the world, they tend to give up and move along to more vulnerable places.”

* The Washington Post explained, “Generally, clear violations are taken down within minutes, as was the case with recent commentary on President Bush.”

* And finally, professor Alex Halavais in 2004 ran a small test of Wikipedia’s ability to detect vandalism. He made 13 changes that he described as vandalism, and he planned to leave them there for two weeks. “He expected to prove that Wikipedia is unreliable and un-vetted. Instead, all the changes were detected and fixed within a couple of hours, and Halavais conceded that he was impressed with Wikipedia’s self-correcting nature.” The problem with Halavais’ facile test was that his edits were all made from one account, so once one blooper was discovered, they were all easily discovered.

And that’s why I felt the need for my own experiment in 2015. Is the widely proclaimed “self-correcting” Wikipedia still as defensive against damage as it supposedly was a decade ago? My plan was to vandalize 30 articles, roughly one per day, using a different IP address for each attempt. Generally, I progressed from infrequently viewed articles about obscure subjects (like Rufus Barringer and the Koegel Meat Company) to highly viewed articles about well-known subjects like inflammation, the movie Up in the Air, and Newcastle upon Tyne. Most of my vandalism attempts were “buried” in the middle or at the end of an article, but a few were placed toward the beginning, where people are more likely to read. Sometimes I would provide a legitimate-looking reference source even though it did not support my editorial claim (something I call a “feint” in the research documentation); other times I would provide no source at all. And sometimes I would try to “disguise” my bad content among other helpful, accurate content in the same edit; other times I’d just plunk down the phony content on its own.

The ethics of experimentation

As my experiment was underway and I started to talk with others about it, more than one person told me that mine wasn’t an ethical approach to research. I was even asked whether I had any “empirical evidence for the existence of the belief” that Wikipedia quickly fixes most of the errors deliberately introduced to it. Cue the April 5th episode of 60 Minutes, featuring millionaire co-founder of Wikipedia Jimmy Wales telling Morley Safer: “I actually think of the problems we have, vandalism is one of the minimal ones, because it’s so thoroughly monitored and so carefully looked after.” There’s your empirical evidence. I also believe that when undercover law enforcement agents test airport security systems, it is a necessary measurement process, even if it inconveniences many travelers.


Journalist Dan Murphy came to my defense, too, regarding the ethics of my experiment. He said:

The convenient lie in all this has been the definition of ‘vandalism’ as a 12-year-old inserting at the top of an article “Ralphy is a penis-head.” But lies and distortions? They persist for years and years. Wikipedia lies when it says most vandalism is removed quickly. It doesn’t even have a working definition of vandalism.

Furthermore, even though Wikipedia’s parent company, the Wikimedia Foundation, collected $5.7 million in surplus cash beyond expenses last fiscal year, the organization has not only never spent a dime to evaluate vandalism on Wikipedia; it has never conducted any study of Wikipedia’s content quality at all. So I conclude that I have served an ethical purpose, doing the sort of due diligence that the Wikimedia Foundation has cleverly avoided for over a decade.

The secret to sticky misinformation

While my study of only 30 articles was limited in scope, it demonstrates an important point: if you properly format nonsense inserted into Wikipedia, and especially if you include a reference source (even a bogus one) that conforms to Wikipedia’s style guidelines, there is a very good chance your vandalism will persist indefinitely.

Had I not attempted to unravel my deliberate mistakes, I am quite sure that Wikipedia would still say that the Sagami Railway in Japan was initially set up in 1917 to transport corn and fresh spicy shrimp (can you imagine the odor?) along the Sagami River valley. Likewise, a letter from Abraham Lincoln to Edwin Stanton would still be falsely directed to Albert E. H. Johnson. And the legend of Bodhidharma turning a bridegroom into a goldfish would still be Wikipedia’s version of truth.

Furthermore, because my attempts to fix Wikipedia back to its previous state were halted by a Wikipedia administrator, at the time of this publication there are still eleven more boneheaded falsehoods in Wikipedia, waiting for some other volunteer to correct them.


One of the more delicious outcomes of my experiment has been the revelation of Wikipedia’s ridiculous governing rule sets and how they are policed. In other words, if a user named “Bumperdinck”, without any stated credentials whatsoever, works extensively on the article about inflammation, his Talk page is showered with praise from other editors. But if I try to correct Wikipedia so that it doesn’t say that inflamed human tissue produces volcanic rock, I cannot, because my account is blocked. Indeed, after my vandalism of the article about the Chenango Canal, I was still welcome to edit Wikipedia. But if I point out that another editor named “Hlkliman”, who soon made several edits to the same article, might plausibly be Harvey L. Kliman, webmaster of the Chenango Canal Association website, then that is “outing”: my postulation is removed from public view, and my account is blocked.

Perhaps in the coming weeks, I will publish more detailed accounts of the interesting stories related to my vandalism experiment, but in the meantime you can browse through my detailed notes on the project’s analysis sheet stored as a Google Document. Regardless of what you think about what I’ve done, there should be one clear takeaway from my results: Wikipedia’s purported “self-correcting” prowess is more myth than reality.


Image credits: Flickr/mufinn, Flickr/roger4336, Flickr/Ian E. Abbott, licensed under Creative Commons Attribution 2.0 Generic

Bhutanese Passport – what does the hoax say?

by Jar’edo Wens, Special Correspondent on Wikipedia hoaxes

The most viewed article on Wikipedia for the week ending 28 March was a short article about the passports of the tiny country of Bhutan, which drew 1,771,673 page loads. The link to the article was widely shared on social media sites. Did the world develop a sudden interest in Bhutanese passports? No; people were looking at that particular Wikipedia article because the page included an audio file of what sounded like an auto-tuned racist caricature of an Asian accent reading the article text.

No Wikipedia admin had the sense to delete the audio file as the tasteless joke that it so clearly was. It was easy for people flooding in from Facebook, Twitter, Reddit, and elsewhere to overwhelm the “consensus” and have the file remain on the page. Several people argued that because the uploader of the file, KuchenZimjah, claimed to be from Bhutan, it

…continue reading Bhutanese Passport – what does the hoax say?

Wikipedia: a Bot’s-Eye View

By Hersch

As the Twenty-First Century drags on, more and more aspects of our daily lives are dominated by digital gizmos, and more and more common tasks are automated. So, then, why not Wikipedia? In recent years, automated programs, also known as robots or “bots,” have demonstrated that they can sign comments left on talk pages, revert vandalism, check for copyright violations on new pages, add or remove protection templates, and archive talk pages more expeditiously, with fewer errors, and with more civility and less drama than the human editors. Should we be looking forward to the day when Wikipedia will be fully automated, where bots will trawl the net for news sources and automatically include every last tidbit of gossipy trivia about celebrities or fictional television characters, rendering Wikipedia’s human editors entirely unnecessary?

Ah, but I can hear the objections already. Can bots be programmed to be snarky and disingenuous? Will they be able

…continue reading Wikipedia: a Bot’s-Eye View