Wiki’in it.
Got into a huge argument a few days ago with a friend over the academic acceptability of Wikipedia. It started because the UC system, or at least my school, is considering a complete and permanent ban on Wikipedia as a research tool and a citable source. My guess is that in 5 years, you'll be able to quote it in papers. Perhaps it's because I remember how, not too long ago, the ENTIRE internet was off limits as far as academic research was concerned. Why? Ignorance. Now, of course, that stance has been recanted, and every college student knows how to use the web for scholarly research. So why do I think it will be allowed?
–The argument that Glenn Reynolds puts forth in Army of Davids is worth repeating. Let's say Encyclopedia Britannica has 100 total articles and Wikipedia has 1,000 (more like a trillion, but let's keep it simple), and the scale of accuracy runs from 0 to 10. Because Wikipedia lacks certification, each of its articles scores a 7, while Britannica's more rigorous process earns each of its articles a 9. The catch, however, is that every topic Britannica neglects to cover counts as a zero. Average across the full range of topics, and Wikipedia comes out overwhelmingly more accurate simply because it covers so much more material (a quick numerical sketch follows after this list).
–Errors on Wikipedia get corrected incredibly fast. I can't track the numbers down right now, but I remember the average correction time being something like under 10 minutes.
–A large group of semi-informed people is statistically more accurate than an incredibly small group of experts. Read Wisdom of Crowds, or anything on "Future Games." And those techniques are normally applied to things that haven't even happened yet. If a crowd can predict the future, it's ridiculous to assert it can't handle the far simpler job of recording the past.
–The majority of information on Wikipedia isn't even up for dispute. It's dates, birthplaces, timelines, etc., which, if anyone stopped to think for a minute, they'd realize are the ONLY things people quote encyclopedias for anyway. Only the especially juvenile use secondary sources for the crucial parts of their research papers. As a general rule, in the meat of a paper you never quote dictionaries, encyclopedias, or textbooks. There is no reason a student shouldn't be able to say "Wikipedia places his birth in France during the mid-14th century."
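Here's the quick numerical sketch promised above. The article counts and accuracy scores are just the made-up numbers from the Reynolds example, not real measurements:

```python
# Coverage-weighted accuracy, using the hypothetical numbers above
# (100 Britannica articles scored 9, 1,000 Wikipedia articles scored 7).
TOPICS = 1000                # total topics a reader might look up

britannica_covered = 100     # each covered topic scores a 9
wikipedia_covered = 1000     # each covered topic scores a 7

# A topic an encyclopedia doesn't cover counts as a 0.
britannica_avg = (britannica_covered * 9) / TOPICS
wikipedia_avg = (wikipedia_covered * 7) / TOPICS

print(britannica_avg)  # 0.9 -> high quality per article, but huge gaps
print(wikipedia_avg)   # 7.0 -> lower quality per article, but full coverage
```

Per article Britannica still wins (9 versus 7); the advantage only shows up when you average over everything a reader might actually look up.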
The problem here is essentially a conflict of interest. Professors, whose livelihoods depend on it, have a vested interest in preventing the mass proliferation of knowledge. At its core, Wikipedia renders the average professor obsolete. It removes the human limitations of the middleman and replaces him with an infinitely large and more accurate source of information. Of course they're going to fight it. Students are biased too. They'd like to exploit the ease of Wikipedia, letting others collect and synthesize the vastness of academia for them.
So the solution lies somewhere in the middle. Professors, or perhaps the CollegeBoard, should get together and, either with Wikipedia's help or independently, begin to certify articles that meet their collective standard. Out of the millions of entries, some are obviously unacceptable. But the majority of them are detailed and helpful. With a seal of approval, students should be able to incorporate this wealth of human experience into their journey.
The university system is supposed to be a collection of the world's greatest minds that have come together for one purpose: to teach and educate. It's terribly ironic, then, that when a computer database comes along and automates the work of those great minds, they fight it tooth and nail. It is simply too easy to dismiss Wikipedia as inaccurate or unscholarly. Statistically, that assertion is flat wrong. It defies the massive advancements we've made in psychology, economics, politics, and mathematics. A larger selectorate is smarter than a smaller one. Fact.
I remember reading somewhere that Wikipedia is doing something like what you described. They're collecting a series of articles to be edited and "finalized" for release on CD. Right now it's almost exclusively made up of the sorts of articles students would find useful.
Can you be more specific when you mention "Future Games"? I've tried looking for what you mean, but I only find Fleetwood Mac and articles about next-gen consoles.
Yeah. Pick up Wisdom of Crowds if you want a detailed look, but essentially, Future Games harness the uncanny ability of large groups of people to make accurate predictions about everything from box office returns to possible terrorist attacks.
For instance, if you held up a jar of beans and asked 500 strangers how many were in it, the group's mean guess would be disproportionately accurate. Or sometimes they set up fictional markets (functioning like a stock exchange) and have people buy and sell shares in where they think a terrorist attack will occur or who will win a presidential election. As the group synthesizes information, the shares go up and down and normally show a surprising amount of foresight. Often more accurate than polls or expert opinion.
I'm not quite sure why it is, but the larger the selectorate, the greater the collective wisdom.
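To make the bean-jar example concrete, here's a tiny simulation. The bean count and the error spread are invented numbers, and it assumes the guesses scatter around the true count rather than all missing in the same direction:

```python
import random

# 500 strangers each guess badly on their own, but the mean of their
# guesses lands close to the true count.
TRUE_BEANS = 850
GUESSERS = 500

# Each guess is the true count plus a lot of individual noise.
guesses = [random.gauss(TRUE_BEANS, 200) for _ in range(GUESSERS)]

crowd_guess = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - TRUE_BEANS) for g in guesses) / len(guesses)

print(round(crowd_guess))           # typically within ~10 beans of 850
print(round(avg_individual_error))  # typically ~160 beans off
```

The individual guesses are way off on average, but their errors largely cancel when you take the mean; that cancellation is most of what the prediction markets are exploiting.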
The first point means that Wikipedia is more comprehensive, but it does NOT mean it's more accurate. It still has an average value of 7.