The wiki-based encyclopedia, created by anonymous volunteer contributors, has a growing audience.
The idea is to apply the principles of open source to an encyclopedia, and therefore to knowledge: the project is free for all to use, and anyone can collaborate by providing a building block. The sum of the contributions, however modest each one is, results in a comprehensive online encyclopedia.
Articles are freely licensed: they belong to everyone and can be republished on any website (beware, however, of how search engines treat duplicate content). The Wikipedia host is simply a server that provides the community with documents written by the community.
Wikipedia has its laws, its citizens (readers and contributors), a kind of administration, a police force (against spam, do not worry), its judges... All of this has grown over three years, in response to problems the encyclopedia has encountered.
The site looks rather appealing, with its freedom, its openness to contributions, and its numerous articles, sometimes comprehensive and often informative. But it also has a dark side.
To analyze the contributions of editors to a Wikipedia article, and to reveal when most contributions come from a single editor who can thus influence its content, researchers created a tool, WikiDashboard.
It aims to compensate for the weaknesses of the system established by the organization, which simply marks a page as « subject to controversy », a label that does not necessarily trigger the necessary corrections.
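The core measurement behind such a tool can be sketched in a few lines: count edits per editor and report the top editor's share of all edits. This is a minimal illustration, not WikiDashboard's actual code; the edit-log format (a list of editor names, one per edit) is an assumption.

```python
from collections import Counter

def dominance(edit_log):
    """Return the top editor and their share of all edits.

    edit_log: list of editor names, one entry per edit
    (a hypothetical format for illustration).
    """
    counts = Counter(edit_log)
    editor, n = counts.most_common(1)[0]
    return editor, n / len(edit_log)

# Example: one editor made 6 of the 8 edits to an article.
log = ["Alice"] * 6 + ["Bob", "Carol"]
editor, share = dominance(log)
print(editor, share)  # Alice 0.75
```

A share close to 1.0 would flag an article whose content is effectively controlled by one person.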
But knowing the anonymous Wikipedia editors, one can already guess that the tool's results will themselves be biased by series of superficial edits on articles...
WikiScanner is another tool, open source and created in 2007 by Virgil Griffith, which analyzes information about unregistered editors, such as their IP addresses, and uses this data to unmask organizations that edit articles about themselves to paint a more favorable picture. Among the list of these "manipulators" are: the FBI, the CIA, the Vatican, the Church of Scientology, the United Nations, Microsoft and Apple, and many political parties!
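The principle behind WikiScanner is simply to match the IP address recorded with each anonymous edit against publicly known organizational IP ranges. A minimal sketch of that lookup, using illustrative ranges (the networks and organization names below are made up, not real allocations):

```python
import ipaddress

# Hypothetical mapping of organizational IP ranges to their owners.
ORG_RANGES = {
    ipaddress.ip_network("192.0.2.0/24"): "Example Corp",
    ipaddress.ip_network("198.51.100.0/24"): "Example Agency",
}

def attribute(ip_string):
    """Return the organization whose range contains this IP, or None."""
    ip = ipaddress.ip_address(ip_string)
    for net, org in ORG_RANGES.items():
        if ip in net:
            return org
    return None

print(attribute("192.0.2.42"))   # Example Corp
print(attribute("203.0.113.9"))  # None
```

The real tool cross-referenced Wikipedia's public edit history with WHOIS-style ownership databases at a much larger scale, but the matching logic is the same.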
Even if this violates the terms of the site, these contributions may have been made to correct errors and not necessarily to hide embarrassing truths.
Wikidata is a knowledge database of interest to web developers, because any website can use it to display continuously updated information, such as the population of a city. As of April 2013, it contained 12 million items.
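A site would typically fetch such data through the public Wikidata web API. The sketch below only builds the request URL (no network call is made); Q90 is the Wikidata item for Paris, and the population would then be read from claim P1082 in the JSON response.

```python
from urllib.parse import urlencode

def wikidata_url(item_id):
    """Build a Wikidata API request URL for one item, e.g. 'Q90' (Paris)."""
    params = {"action": "wbgetentities", "ids": item_id, "format": "json"}
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

print(wikidata_url("Q90"))
```

Because the data lives in one central store, every site using this API shows the updated population figure as soon as the item is edited.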
The main flaw was pointed out by an expert (references below):
« Although experts on a subject may edit a page, they ultimately have no more control over the content of that page than anyone else. »
A specialist is thus placed on the same level as a complete ignoramus.
And the President of Encyclopaedia Britannica, Jorge Cauz:
« If I were to be the CEO of Google or the founders of Google I would be very [displeased] that the best search engine in the world continues to provide as a first link, Wikipedia, » he said. « Is this the best they can do? Is this the best that [their] algorithm can do? »
« I am interested in letting many people know that Wikipedia is a flawed and irresponsible research tool. »
So says John Seigenthaler, after a Wikipedia article presented him as the instigator of the murders of J.F. Kennedy and his brother Bobby. It was actually a hoax, but it remained on Wikipedia for five months and was cited by reputable sites!
Contributions may be made without registering. The IP address is recorded, but if it is dynamically assigned by the provider at each session, this is a guarantee of anonymity.
A proxy IP address may also be used, allowing contributions in strict anonymity.
But the most invested wiki hackers use a completely transparent user account. Making many edits every day gives them authority, and it is unlikely that another editor will revert their contributions, even when complaints are raised.
The Wikimedia Foundation is responding to a study that shows a change in the way Wikipedia works: it is increasingly becoming a closed community, hostile to occasional editors. Researchers at the Palo Alto Research Center in California warn that this evolution of the wiki could jeopardize its future. Content creation is in decline: fewer and fewer articles are created, and fewer edits are made.
One could explain this by the fact that many pages have already been created (3 million for the English subdomain), so the need to fill out the site becomes less obvious. But it seems the problem lies elsewhere.
Not only is the number of articles created per month falling, but the number of monthly contributions (5.5 million) and of editors, registered or not (750,000), is falling too.
Occasional editors are apparently the object of ostracism: 25% of their contributions are now deleted, against 10% a few years ago. According to the researchers, veteran editors resist the contributions of others, which tends to discourage newcomers and lowers the quality of the site's content.
This worries the Wikimedia Foundation, which has launched an operation to try to understand the phenomenon.
According to Google Trends for Websites, traffic to Wikipedia.org as a whole has been declining slowly since 2007. This confirms the conclusions drawn by the authors of the study.
The Google search engine, the source of most of the site's traffic, has always given it a preferential position in results pages. But Google also tends to promote fresh content and news. The stagnation in the renewal of the site's content is consistent with the decline in its audience.