Tuesday, July 11, 2006

Death by Wikipedia

The Washington Post has an interesting article by Frank Ahrens called Death by Wikipedia: The Kenneth Lay Chronicles. Frank looks at the hot topic of how much we can trust Wikipedia given that it is entirely user-editable. He looks at the way Kenneth Lay's death was handled as the classic case of this and concludes:
Wikipedia's greatest strength is its greatest weakness.

If the statement that "history is written by the winners" is too gross, it does speak to an underlying truth: All definitive encyclopedia authorship comes with the point of view of its times. It is unavoidable. As august and reliable as the Britannica is, one need only look back to 19th-century versions to see its Anglo-centric viewpoint and curious study of others that treated foreigners (say, Africans) as anthropological subjects rather than human equals.

An encyclopedia written from many points of view should, in theory, help eliminate that flaw. Further, as well-girded in research as encyclopedia authors are, there are countless experts on thousands of topics that know more than the Wikipedia authors; every topic has its fetishists, and thank goodness. If the goal is the ultimate compilation of truth-tested facts, Wikipedia could be a powerful tool.


But here's the dread fear with Wikipedia: It combines the global reach and authoritative bearing of an Internet encyclopedia with the worst elements of radicalized bloggers. You step into a blog, you know what you're getting. But if you search an encyclopedia, it's fair to expect something else. Actual facts, say. At its worst, Wikipedia is an active deception, a powerful piece of agitprop, not information.
(emphasis mine)

I think that's a fair call; there is some very questionable material there. Perhaps more importantly, the basic content of any Wikipedia article is dynamic. This can give people all sorts of ideas about a topic, or indeed about Wikipedia's accuracy, when the offending material may be reverted, edited or removed within moments of their leaving that web page.

For example, Dave Winer mentioned on his blog that a friend had looked at the Wikipedia page about Dave and thought it pretty terrible stuff. Dave's critics on the Eye on Winer blog found this laughable, as (to them) the article steers a fairly neutral course between the polarised views on Dave's contributions to blogging, RSS and podcasting.

However, a stroll through the History tab on his Wikipedia page shows a number of vandalism attempts that have been reverted (usually within a few hours), as well as some toning down of material deemed too "fanboy" by one or another named Wikipedia user. Vandalism is usually done by anonymous users rather than registered ones, a fact that helps Wikipedia editors identify it more quickly (any edit by an anonymous user is worth checking out, especially on disputed articles).

If Dave's friend had seen the article at the wrong time, it might have looked very bad, and certainly that is the impression they got, whilst later viewers see nothing wrong with it (leaving aside the obvious biases of the two sets of viewers). The ephemeral nature of Wikipedia edits is one of the main problems I have with trusting it for anything other than fun research on hobby topics (like Vikings). Yet, increasingly, we see news websites referring people to Wikipedia for detailed information on specialty topics (I've seen this at least twice myself from news.com.au).

I doubt that there's an easy answer to this issue, other than suggesting that people use more than one source for facts, and take anything they read with a grain of salt. This does speak to the issue of whether wikis are appropriate tools for the enterprise environment. Whilst they are great at capturing corporate knowledge, they are just as good at enshrining stupidity and promoting divisive views. But perhaps that is their real strength: the ability to lift the veil between people at work and reveal the real talent, or lack thereof? To expose dramas and stresses that are hidden below normally civil exteriors?


  1. As you identify, the trick is knowing how to use a tool effectively, to get maximum value (and minimum misdirection/bad info) from that tool. In the case of wikipedia, I routinely check history and context -- i.e. verify the robustness of the web of trust -- before relying on an entry. I'll be going to the Wikimania conference in Boston next month to learn about other effective practices and how to end up with the mythical "Neutral Point of View" (NPOV) more frequently.

  2. raines,

The problem with Wikipedia is that the mass audience it attracts may not realise that they need to apply that level of thinking when interpreting 'facts' on Wikipedia. However, that should become a core skillset for any web user, and one we teach our children before we let them use the internet on their own.

    In an enterprise environment you can certainly train users to think like this - I wonder whether enterprises will see this as a feature that disqualifies wikis from use within the enterprise.

    James Dellow deals with this issue a lot on his ChiefTech blog.