The case against Wikipedia

Wikipedia was established in 2001 with a simple idea: build an online encyclopedia that anyone could add to, and edit as needed to correct errors.

It made sense, in theory. No one is an expert on everything, but there are experts in everything, or at least people who can offer insightful knowledge on any given subject. They could write accurate entries about what they know, building up a hub that would, in theory, become a reliable source for anything you might need details about.

In my experience, Wikipedia had a golden era when it was simply the go-to reference. What's that? I need to look up an event or moment in history? To check an actor's age, or the difference between a llama and an alpaca? Wikipedia was where you went.

But then it seemed to become just the baseline, default check, the result that popped up first in online searches. It was what you consulted out of convenience rather than as part of doing more than the bare minimum to verify something that, as it turned out, may or may not actually be correct. If you did cite anything from there, you mentioned where the info came from, as if to append a large asterisk to whatever you were referencing: by the way, this was only Wikipedia, so y'know… may not actually be the case.

John Naughton, an author and journalist for The Guardian, has a daily newsletter I subscribe to. A short while back, he was lauding Wikipedia for what it represented and stood for. He said it's the best of what the internet should always have been: a global effort to collect and provide information that others want or need access to. He allowed that some people know Wikipedia contains errors, but made the fair point that if people are aware of errors, they should fix them, which gets the site back to providing the accurate information Wikipedia was founded to provide.

But what I think he overlooks in his assessment is this: while the vast majority of people do far more good than bad and certainly mean well, the momentum of whatever they're doing at the moment, combined with a simple lack of real interest, makes correcting Wikipedia entries a bridge too far for many. Just because we know something on Wikipedia is wrong doesn't mean we want to log in just to correct it. It's at once an extremely easy thing to do and more than most people will bother doing.

And in fact, some people actively use the incorrect information to their advantage. There's a Florida rock band called Shinedown with a guitarist named Zach Myers. His Wikipedia page long included a family-history detail claiming that, speaking of llamas, his family owns a llama farm. That detail is, as verified by a friend of his, 100% incorrect. He not only knew the claim was on his page, he intentionally left it there to expose lazy interviewers. If they brought up the llama farm, he knew Wikipedia was where they got that tidbit, and that their dive into his background likely hadn't gone any deeper. (The llama-farm angle of his page has, as of this writing, been removed, though I saw it there myself less than a year ago.)

Then we get to a broader problem with Wikipedia, which is that the world is increasingly becoming a place where "facts" are based on opinion instead of verifiable evidence. If you were to ask a certain percentage of Republicans in the U.S. whether Joe Biden won the 2020 election, they would tell you no. Not that it's their opinion that he didn't win, but that it's a hardline fact that he didn't. That the election was stolen, and so on.

If you were to ask a certain portion of the global population whether the Covid vaccines work, or how they work (or don't), you'd get some drastically different takes. Even people with no formal training in biology or epidemiology wouldn't call those views opinions; they'd call them facts.

There are certain (and seemingly growing) areas where some hold opinion as equivalent to education and informed experience: my online research has told me the truth about vaccines, and I'm as informed, even more informed, than the doctors and scientists promoting them.

The concern, in this context: if anyone who merely believes something is true has as much access to change Wikipedia's content as someone educated and experienced enough to know what's true, how much can we rely on what it says?

And then there's the problem that even informed people disagree about the facts. I subscribe to a newsletter called Non-Obvious Insights, whose curator and publisher, Rohit Bhargava, recently linked to an article explaining that Wikipedia's "Recession" page had been edited 180 times in a single week because financial experts can't agree on whether the U.S. is in a recession. Forget yahoos who think their opinions equate to others' education. These are professionals in their field, all with access to the same economic data, who seemingly employ different metrics and so are at odds over something as significant as what financial phase the country with arguably the world's most important economy is in.

If even something that big can’t be agreed upon, what hope do we have of ever creating a single, genuinely reliable reference source for everything else, as Wikipedia aspires to be?

I think Wikipedia is a great idea in theory that, from the foibles of human nature to the variable interpretations of hard data, doesn’t work in practice as the go-to source for reliable information. For the foreseeable future, any reference to Wikipedia will come with an inherent large asterisk, and everyone from students working on papers to people just wanting to get the facts of something will — and should — be expected to verify the facts elsewhere.