Wikipedia, the balancing act

I've been mostly inactive online lately: tweets here and there, some edits, a few sprees of emptying proposed deletion categories. Amidst all of the lab reports and other frantic work, I've taken little moments to consider a tricky question: Why does Wikipedia work? Why should it work? The oft-quoted "zeroth law of Wikipedia" has long been the following:
Wikipedia only works in practice. In theory, it can never work.
Wikipedia's size and complexity mean that generalizations quickly become over-generalizations, and many of my initial thought experiments had to be discarded. I finally hit on a model for experimentation that works: Wikipedia is a balancing act.

Wikipedia almost always falls between two extremes, at a point where most of the advantages of each can be maximized and the disadvantages minimized. With my chemistry background, I suppose I like to think of it as an equilibrium of sorts: there's a mixture of either "extreme" whose balance can change based on environmental conditions. Heating a chemical solution, for example, can shift a chemical equilibrium to favour an otherwise low-yield product; heating a discussion with a flame war can shift a social equilibrium to favour the otherwise low-use technique of removing members from the community. More seriously, whenever social conditions or attitudes change, any practice dependent on those attitudes will change correspondingly.

Wikipedia welcomes new content, so historically the barriers to creating new content have been low: to this day any registered user can create any page on a whim. Since Wikipedia also wants a certain quality of article, however, those whimsical pages might be deleted mere moments after they are created. This is one of the classic equilibria of Wikipedia, one that's been argued over countless times: inclusionism versus deletionism. It's also a good example because it's shifted over time: the early Wikipedia was radically inclusionist, because any content was better than no content. As Wikipedia's content base has grown, so has deletionism: new articles are no longer so highly valued, quality is increasingly valued, and low-quality articles are more likely to be deleted. Deletionism is entirely a product of a community that does not want to include content below a certain minimum of quality. Wikipedia depends on maintaining the balance between inclusionism and deletionism: too much deletionism and you lose good content; too much inclusionism and you risk massive amounts of mediocre content.

I've not fully considered what facets of Wikipedia can be considered a balancing act in this regard, but some of the applications of the idea of balance seem obvious:
  • Governance: anarchy (open wiki editing), bureaucracy (wiki policies, the Arbitration Committee), and other systems find a balance (incidentally, it reminds me somewhat of the idea of "sociocracy" I found recently). In general, the anarchism of open wiki editing prevails as the most open approach, but when problems come up, bureaucracy can be used to make a relatively final decision rather than leaving continuous edit wars.
  • Page protection: open editing generally works, but as vandals arrive or edit wars begin, open editing becomes less tenable. High-profile articles like "George W. Bush", "Pie", or "Abortion" thus tend to end up protected. The forthcoming flagged protection feature will make this balance smoother by adding more possible states of protection between the existing tiers of unprotection, semi-protection, and full protection.
Ultimately, the idea of equilibrium is too simple: it doesn't explain how each balance arose in the first place, or how stable it is at the moment. It can, however, inform ideas on how to improve Wikipedia (e.g. the current strategic planning). People looking at problems in Wikipedia can target imbalances in the community, the software, or the available resources, and attempt to compensate. Ideas for optimizing Wikipedia for some desirable trait should take into account that Wikipedia depends on running its high-wire act between openness and standards, inclusiveness and quality, anarchism and bureaucracy—falling off on either side would have unforeseen consequences.


Flagged Revisions: a confusing development

In the past day or so, a number of news organizations, beginning with the New York Times, have been publishing stories about how Wikipedia is closing off editing by adding "flagged revisions" software. While there are plans to add an implementation of the "FlaggedRevs" extension to the English Wikipedia, these plans are being critically misinterpreted by the media.

Among the critical errors are assertions that Wikipedia is closing off otherwise open editing with the new software, or confusions of various proposed implementations of the software. It's a cloud of doubt that does not help Wikipedia: there is more than enough fear, uncertainty and doubt around all things "flagged revisions", and that is unhealthy for community discussions on how to run the project. As such, this post will serve to point out key facts about flagged revisions and its history, and outline where the future might be headed.

The FlaggedRevs software

  1. FlaggedRevs is an extension to the MediaWiki software that runs Wikipedia.
  2. FlaggedRevs is highly configurable. Different settings can produce vastly different review systems, some of which can even operate simultaneously.
  3. FlaggedRevs, the software, should not be confused with any given implementation of it.

The planned implementation

  1. The planned implementation of FlaggedRevs on the English Wikipedia is a test. The test is expected to run for around two months, after which the community will evaluate its impact and debate longer-term plans.
  2. The planned implementation has two parts, "flagged protection" and "patrolled revisions".
    • Flagged protection is a system not unlike Wikipedia's current page protection system. In the current system, pages can be "protected" or "semi-protected" by administrators, with the former restricting editing to administrators and the latter to "autoconfirmed" users (any user with at least ten edits whose account is at least four days old). With flagged protection, all users will be able to edit an article, but only certain users will be able to mark a version of the article as acceptable so that it appears as the main version of the article. Only articles that would otherwise be protected will end up "flag-protected".
    • Patrolled revisions is a review feature. While it will apply to all articles, its primary effect will be to mark versions of an article as reviewed. This will help keep out vandalism, since it will be possible to easily check all the changes made since the last patrolled revision. It will have no other visible effect on articles.
  3. The current test was ratified in a poll held from 17 March 2009 through 1 April 2009; 324 users participated and 259, or approximately 80%, supported the implementation.
  4. The planned implementation should not be confused with the original suggested implementation, which is substantially different.
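The rules described above (the "autoconfirmed" threshold, and who may edit under each protection level) can be sketched in a few lines. This is purely illustrative: the thresholds (ten edits, four days) come from the post, but the function names and structure are my own, not MediaWiki's actual API.

```python
# Illustrative sketch only, not MediaWiki code. Thresholds per the post:
# an "autoconfirmed" user has >= 10 edits and an account >= 4 days old.
from datetime import datetime, timedelta

def is_autoconfirmed(edit_count: int, account_created: datetime,
                     now: datetime) -> bool:
    """True for accounts with at least ten edits and at least four days' age."""
    return edit_count >= 10 and (now - account_created) >= timedelta(days=4)

def can_edit(protection: str, is_admin: bool, autoconfirmed: bool) -> bool:
    """Who may edit under each protection level, per the post's description.

    Under flagged protection ("flagged"), anyone may edit; what's restricted
    is the separate right to mark a revision as the displayed version.
    """
    if protection == "full":
        return is_admin
    if protection == "semi":
        return autoconfirmed
    # "none" and "flagged" both allow anyone to edit
    return True
```

The key contrast is that a brand-new account can edit a flag-protected page but not a semi-protected one, which is why flagged protection is arguably the more open option.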

The original implementation

  1. This implementation was the original design of the software.
  2. This implementation can be called "flagged revisions", but one must be careful not to confuse that phrase with other instances of the FlaggedRevs software.
  3. The German Wikipedia elected to use this implementation of the software, and continues to do so, having started in May 2008. Language versions of Wikipedia are largely independent and can use separate articles, policies, and software extensions.
  4. The original implementation would have effectively applied "flagged protection" to every article on the wiki. All edits would have to be manually reviewed by established editors before they appeared as the main version of the page.
  5. A test of the original implementation was proposed in a poll on the English Wikipedia, but failed to gain consensus, with only 59.6% support from 720 users. On Wikipedia, majority does not suffice; "consensus" for large numbers of users typically entails at least a 75% supermajority, though 80% or greater is preferred.
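The arithmetic behind those two polls is worth making explicit. A minimal sketch follows; note that Wikipedia has no fixed numeric rule for consensus, so the 75% figure below is just the informal convention described above, and the function names are my own.

```python
# The ~75% supermajority is an informal community convention, not a rule.
ROUGH_CONSENSUS = 0.75

def support_ratio(supports: int, total: int) -> float:
    """Fraction of poll participants supporting a proposal."""
    return supports / total
```

By this yardstick, the flagged protection & patrolled revisions poll (259 of 324, roughly 80%) clears the bar, while the original-implementation poll (59.6% of 720 users) falls well short of it.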

Availability of FlaggedRevs

The FlaggedRevs software has been available since at least 2007. Its controversial nature in the community has been the main bar to its implementation: the Wikimedia developers typically do not implement such major software changes without broad community support.

Nevertheless, the English Wikipedia presents a unique technical challenge because of its size, and so despite the success of the poll requesting the flagged protection & patrolled revisions implementation, FlaggedRevs is not available on the English Wikipedia yet. The specific details of the implementation also cause trouble, both because they stretch the original design of the software and because FlaggedRevs is controversial enough that arguments may erupt over the minutiae of any implementation.

In any event, a test implementation is up and available at http://flaggedrevs.labs.wikimedia.org/wiki/, and developers have indicated that the software will go live within the next few weeks.

Debating the original proposal

The original implementation of FlaggedRevs, as an essentially restrictive measure, is very controversial within Wikipedia and, I get the impression, outside it as well. It has certain evident advantages:
  • With flagged revisions, effective vandalism is virtually impossible, and in any event loses its thrill when a vandal's handiwork will typically be seen by no one but the reviewer who removes it.
  • By extension, libel and other problems are similarly unlikely when manual review is required for all edits. Wikipedia's real-world responsibilities, e.g. to avoid negatively affecting the lives of those it describes, become more manageable.
  • Flagged revisions offers unique opportunities for long-term review of changes to Wikipedia articles. While tools for examining the differences (or "diffs") between articles have long been a part of the core MediaWiki software, there are not yet tools for "flagging" revisions as having particular levels of quality.
Flagged revisions has certain potential disadvantages as well:
  • In large projects, it's hard to tell how much backlog of edits would be incurred, or how that backlog would be distributed. There's a significant concern that the backlog could be days or weeks long, much like the existing backlog for reviewing new pages.
  • Flagged revisions represents a serious barrier to openness. The Wikipedia community not only prides itself on its openness, but is fueled by it: many new users become long-term members after discovering editing spontaneously.
  • Were disruptive editors to abuse the flagged revisions system by "flagging" inappropriate material, the damage would be greater than in a system agnostic of editorial approval, since a flag implies legitimacy.
These concerns and advantages are each far from trivial, and they represent a crossroads: can Wikipedia integrate further restrictions without sacrificing its essential character? I think not. While the community at large would like some implementation of FlaggedRevs, there has not been enough support to justify the original implementation.

A number of groups on Wikipedia are pushing either way, and though I do not wish to draw misleading borders, they generally value either prevention of vandalism and libel, or the openness of the site, more highly than the other. Some would go as far as to suggest that all contributors be required to provide real-name authentication; others, like me, worry that Wikipedia would lose its greatest strength were it to become less open.

Openness of the planned system

The planned system, flagged protection & patrolled revisions, represents in my view a serious improvement on the original flagged revisions proposal. It removes much of that system's restrictiveness, even at the expense of some of its potential gains. Most of the restrictions now apply only where users would already be restricted from editing, or to entirely new features. The focus is on the process of review, rather than the process of editing.

In this planned system, flagged protection would likely help make Wikipedia more open. Where pages are currently protected so that certain groups of users are completely restricted from editing, it would surely be more open to let them edit but require review of those edits. There already exists a system for requesting edits to protected pages; I imagine it could be superseded by a simpler method of requesting confirmation of edits. Flagged protection can replace older, cruder methods of protecting pages, and thus be more inclusive.

Patrolled revisions is a feature that will be restricted to established users, but even so, it's important to note that it does not remove any permission from any user. It is merely a sensitive permission that, to prevent abuse, won't be given to all users. In the long term, it is my hope that patrolled revisions can serve as a sort of insurance for Wikipedia's reliability: people worried about encountering vandalism can view a reviewed version of an article, while the current revision remains completely open for editing.

The future of Wikipedia

FlaggedRevs as software highlights much of Wikipedia's character: there is a constant balancing act between editorial oversight and openness, and it's always tempting to see what would happen were it pushed one way or the other. On the one side lies potential stagnancy through lack of contributions; on the other, stagnancy through a glut of mediocre content. Developing Wikipedia serves as a challenge that is also an experiment: no group of people has before managed such an ambitious project so openly, let alone had the success that Wikipedia has enjoyed. I, along with many others, worry that whatever is implemented will shape public opinion of Wikipedia, influence the size and shape of its contributor base, or be a platform for increasing (or decreasing) restrictions on Wikipedia's content. We can only hope that the future Wikipedia will continue to improve and succeed.


Scientology "banned" from Wikipedia

In an effort to close some long-standing conflicts on Scientology-related topics, the Wikipedia Arbitration Committee (ArbCom, for short) has used some interesting measures in an attempt to settle the problem more thoroughly.

Most prominently, ArbCom has called for a blanket ban on editing from Scientology-associated IP addresses. Specifically, the sanction is the following:

2) All IP addresses owned or operated by the Church of Scientology and its associates, broadly interpreted, are to be blocked as if they were open proxies. Individual editors may request IP block exemption if they wish to contribute from the blocked IP addresses.
Passed 10 to 1 at 13:31, 28 May 2009 (UTC)
This is an interesting development, but I think it's being misinterpreted to some degree in the media. Given Scientology's reputation for attempting to influence media coverage in its favour, it's entirely understandable that a group in Wikipedia's position would want to bar it from contributing, so I don't blame journalists and the public for mistaking the IP ban for an organizational ban—but they can't be excused for missing the fact that a number of anti-Scientology activists were topic-banned (disallowed from editing Scientology-related articles, on penalty of blocking) as part of the same decision.
To a certain degree, the sentiment is accurate: Wikipedia doesn't need the kind of relentless view-pushing that Scientologists present. Whether they're "fighting religious discrimination" or "suppressing the truth", their drive to stamp out criticism of the movement is undeniable, and on Wikipedia, that's unacceptable.

It's evidently not the case that Wikipedia is outright banning the organization: the above sanction blocks the IP addresses rather than banning the organization itself. If you read the principles defined for the decision, this becomes more evident. The following principle is especially interesting:

11) It is rarely possible to determine with complete certainty whether several editors from the same IP or corporate server are sockpuppets, meat puppets, or acquaintances who happen to edit Wikipedia. In such cases, remedies may be fashioned which are based on the behavior of the user rather than their identity. The Arbitration Committee may determine that editors who edit with the same agenda and make the same types of edits be treated as a single editor.
Passed 7 to 4 (with 1 abstention) at 13:31, 28 May 2009 (UTC)

In my opinion, this principle not only justifies the block—since multiple uniformly pro-Scientology editors using Scientology IP addresses can't be distinguished from a single editor abusing multiple accounts—but is also one of the more controversial principles in the case.

I find this curious, so I looked at the votes placed on the principle. Risker opposed it with the concern that it could be misused: users might be attacked under this principle based on the point of view that their edits support.*
*Having edits which support a particular point of view is, in my opinion, not inherently in violation of Wikipedia's policy of neutrality; even if someone's edits uniformly present a particular point of view, correcting other imbalances, or presenting material favourable to one view neutrally, is not necessarily a problem.

As I was writing this post, I stumbled upon a Huffington Post article which deals with this kind of concern. It's interesting partly for the points it misses, and partly for its relevance to the subtler implications of the ArbCom decision, which it suggests may set a dangerous precedent. In "Wikipedia Removes Semi-Protection from Civil Liberties", Leah Anthony Libresco argues that Wikipedia's decision to ban Scientology is misguided. Libresco argues that setting the precedent of banning an organization like Scientology, a whole class of people, is akin to taking away civil liberties. There is some confusion: Libresco says, for example, that WikiScanner can "identify the sources of anonymous edits made on Wikipedia by analyzing the IP addresses of the perpetrators", when this isn't really the case (WikiScanner correlates edits made without a user account with a database of known IPs of organizations, but can't equal tools like CheckUser which can investigate registered users). I think that Libresco misunderstands the depth of the problem (that Wikipedia's methods for investigating abuse are insufficient for the Scientology IP addresses) and the seriousness of the remedy, while falling into the usual assumption of an organizational ban rather than a technical block. Libresco does, however, make some cogent points about the need for open discussion—if speech is suppressed, neutrality becomes more difficult to create and, worse, to expect.

This kind of sentiment was echoed in another article criticizing the decision. In "Why Wikipedia was wrong to ban Scientology", Evgeny Morozov attacks the ArbCom decision as one which suppresses the group from joining the debate about itself. Two passages summarize the article well for me:

I am no fan of Scientology, but I think that banning them from Wikipedia is going to be counterproductive. Unfortunately, it presents the Wikipedia admins/editors as a non-neutral group that opposes a particular set of ideas. In an ideal world, I don't think that the Wikipedia editors should be making any value judgements on whether a particular idea is good or bad, for it undermines the trust that users place in an open encyclopedia, no matter how innovative it is.


However, bowing down to Scientology-bashers is almost guaranteed to trigger similar requests from people who hate satanism, fascism, or even pokemons. [sic] Granted it's harder to identify and ban the more decentralized community of, say, satanists than that of scientologists [sic] (who have registered physical addresses), but I am sure that very soon somebody will request that another group is excluded from online deliberations over what kind of materials to publish about it. In a way, Wikipedia's decision opens Pandora's box : why allow Christians to edit articles on Christianity, for example?

It misses many of the obvious considerations, and has been thoroughly criticized in the comments, but it does raise a good point in the larger scheme of things: is a decision to block particular sets of IP addresses on these grounds tenable? It's certainly possible that it may—as the first quoted passage and some other articles have suggested—have negative public-image effects for Wikipedia. It's a reminder that, especially in administrative actions, even the appearance of impropriety can be damaging without any true abuse.

On another note, the criticism raises questions. Libresco's article in particular draws a parallel to civil liberties in suggesting that the suppression of any particular group is troubling. While I think it's fallacious to compare Wikipedia directly to many existing political and economic systems, there are parallels that should not be ignored. I plan to outline some of them in a future article.

Is the block dangerous, justified, or merely ugly and unfortunate? I'd like to hear your opinion.


Hello, world!

I've thought a while about various Wikipedia-related issues, and I follow Wikipedia-related news, and I've decided that I should blog about them. Here are some basic ground rules:
  • The blog will follow NPOV, where NPOV is "Nihiltres' point of view". :) While it would be nice to follow a Wikipedia-like neutral point of view, I don't feel like working endlessly to appease critics. I generally think Wikipedia is a good thing, but not a perfect one—criticism here will aim to be constructive where applicable.
  • I plan to allow and encourage comments, but I reserve the right to delete comments for any reason. Specifically, comments which are unhelpful or derogatory will be targeted.
  • You can ask me to write about something! Contact me on my Wikipedia user talk page, by email (wiki dot nihiltres at gmail dot com), through Twitter (@wiki_nihiltres), through IRC ("Nihiltres" on freenode), or even "wikinihiltres" on AIM, and I'll consider writing about whatever topic you suggest.
  • I plan to try to make this blog accessible to non-Wikipedians, though it'll inevitably describe things that only Wikipedians are really interested in. Keep your WP:WTF (acronyms) out of comments, please!
I'm currently in the process of writing a first ("real") article, so watch this space.