Wikipedia:Why is Wikipedia losing contributors - Thinking about remedies



Introduction


Wikipedia has worldwide influence: "Dr. Wikipedia," for instance, is now the #1 source for healthcare information.[1] Wikipedia is regularly among the top 10 most visited websites.[2] It has become the arbiter of who is who, and what is what.[3]

Yet Wikipedia is losing editors.[citation needed] While the number of internet users continues to grow, the number of new Wikipedia editors is falling. This may be a long-term risk for Wikipedia. This page looks at why Wikipedia is losing contributors, and what possible remedies could help.

Please contribute to this page...

  • Try, whenever possible, to integrate your comment into existing sections
  • Keep comments as short as possible, but not so short that they become unclear!
  • Pointing out a problem is only half the work - you are invited to also suggest a solution!
  • Add first-person comments to the talk page
  • Add general comments to the main page
GOAL1: let's try to keep cool and objective/impersonal
GOAL2: a simple, not-too-long, structured article is much more readable
GOAL3: let's find multiple solutions to each problem
GOAL4: the best solutions are described by short use cases
  • Think about the big picture, but try to make specific suggestions that can realistically be implemented by the existing community.

Who contributes to Wikipedia?


Contributions come from diverse demographic and ethnographic segments:

What motivates contribution?


Several motivations lead people to contribute:

Examples of problems

  • Conflict
    • Too many editors are flamed, and new edits are automatically reverted rather than properly discussed...
    • As a first-time user, spending time writing a well-researched article only to see it deleted with no detailed explanation or help is frustrating, and can deter you from ever wanting to contribute again.
    • There is no real recourse against incivility (only endless RfCs).
    • I want to be free to call someone X when he deserves it!
    • There are aggressive users, ready to bite me at every step I take...
    • There are frustrated users looking for somebody to witch-hunt / to stalk...
  • Technically difficult
    • How can I possibly correct the error in this table if I can't see it?
    • Why do we need to learn a markup language in 2012?
    • Shouldn't signing posts be done automatically like on news sites?
  • Confusing policies
    • Editing is too stressful.
    • There's too much to learn and too many guidelines to read!
    • I tried, but no one would help me.
  • Misuse
    • There are users who love to apply any existing WP:* against me / against the article, and are proud to think of themselves as defensor fidei
    • Some articles are too biased! WP is slowly turning into a newspaper/a blog/...
    • Agenda-pushing, such as claiming that a person, book, recording, or speech is significant in some area in order to further an agenda. For example, claiming that a significant historical figure may have had disorder X in order to promote disorder X, when this could never be proven as fact because the person lived 500 or more years ago.

Lack of collaboration and "lone wolf" culture repels new editors


The MediaWiki platform, at present, is fair to poor at supporting collaboration, and moribund WikiProject corpses are increasingly blocking the path forward toward a new paradigm for a more collaborative Wikipedia. WikiProject History's collaboration of the month is listed as October... October 2007, from the previous decade.

Some attempts to revive teamwork and restart collaboration that were never answered by anyone:

  • WikiProject History–here
  • WikiProject Chinese History–here

We must put ourselves in the shoes of a brand new, yet talented, editor approaching Wikipedia for collaboration on an article rewrite. Among editors, especially newer editors, WikiProjects create the impression that collaboration is ongoing when it often isn't. Thus, they help prevent new blood from launching new collaborations, stifling the collaborative environment that improves articles, fosters peace and understanding, and retains talented writers. The absence of a cordial, supportive, collaborative platform hurts retention and, it's fair to say, often leaves behind a caustic "lone wolf" culture that can repel women from the project.

The Wikimedia team has understood for years that in order to close the startling gender gap, editing must become a much more social experience. The "whittling down" each year of our pool of talented writers is the greatest threat to Wikipedia, as we must increase the number of editors, especially expert editors, to be able to fix the sprawling hellscape of weak, inaccurate and incomplete articles that drag down the project (especially in the social sciences and humanities, as User:Sue_Gardner correctly pointed out at the 2011 Wikipedia in Higher Education summit). Losing good people is the greatest danger to Wikipedia's success; though edit warring gets more attention, WP:DONTBEADICK and dispute resolution are crucial to the extent that they affect retention of editors. We really need to keep good editors around, and I believe a more social, collaborative platform would go a long way toward that goal. We also must transition to what the Wikimedia Strategic Plan to 2015 foresees as "topical groups" based on editing interests.

Deletionism


The philosophy of deletionism has the side-effect of killing too much good along with the bad. Nobody gets excited to join a project when they write up something meaningful only to see someone scrap it all, then use "dispute resolution" to bring them to a halt.

There is a risk of editors ignoring NOR or BLP when taking material out of an article because they "don't think it's important", or right, based on a three-minute thumbnail knowledge of the topic, without regard to the sources which do think it's important.

Deletionists often appear more interested in removing content than in fixing it. For instance, if interesting content is added in the wrong section, rather than moving it to the right section, older users seem more interested in scrapping the content altogether. This often happens to new users, causing them to lose interest in expanding existing articles, or in editing altogether.

Deletionists often don't fix things. They revert your entire edit, and may write many paragraphs into a talk or discussion page.

On the other hand, overenthusiastic fans often bloat articles with trivia and minutiae, often poorly researched and cited, if cited at all, that are of no use to the general public and best belong on fan sites. If deletionism keeps those kinds of editors away, it may be a good thing. What gets deleted, in most cases, is content that fails to live up to core policies such as no original research, neutral point of view, and reliable-source verification.

Abuse of power (mainly by Deletionists)


Wikipedia editors can hold a few types of power over newbies:

  • detailed knowledge of Wikipedia guidelines, procedures and biases - Everyone knows intuitively what an encyclopedia is -- it's a place to look up information -- but people should not be smacked down for being confused about all the things Wikipedia is not (see Wikipedia:What Wikipedia is not), or oddities like its inclusion of Pokémon but exclusion of most academics, or the peculiar guidelines for Wikipedia:Biographies of living persons. There is still too steep a learning curve for newbies to get the Wikipedia approach. There should be at least 100 templates or article guides for various types of topics. It's not uncommon for multiple policies to be tossed into deletion discussions, and the editor has no chance to defend their article without taking a crash course in Wikipedia war tactics.
  • understanding of the conflict style, which is semi-legalese delivered in a punchy "alpha-male" style of writing - Effective Wikipedia arguing in AfD or other discussions is very curt. As in a courtroom, the arguments become quite mathematical/logical, and the subjective/qualitative feel of the discussion is often missing. Politeness and courtesy are often missing too, or they are added in an insincere way (thanks for your good faith edit, but I just deleted it as per WP:XYZ).
  • patience and persistence - Newbie or casual editors just want to help out; it's easy to out-wait an editor if you spend two weeks dedicated to getting something deleted or changed. The perception of another editor was "Deletionists can't be bargained with. They can't be reasoned with. They don't feel pity, or remorse, or fear. And they absolutely will not stop, ever, until your contributions are reverted."
  • administrative rights - 1,545 editors actually hold them - but I don't think admin abuse is that common, nor is it the focus of this discussion.
  • bots - Bots or Firefox extensions create an imbalance because it could take someone 30-60 minutes to work on new content, but 1-3 seconds for it to be tagged for deletion or other problems. This makes it too easy for fly-by tagging (e.g., WP:user warning templates) that alienates editors. If tagging was more difficult, then editors would only tag if they really mean it.
  • friends - You will (eventually) make friends on Wikipedia and you will (eventually) find it very hard to rule against them even when they deserve it.

Facts and truth


At the very center of Wikipedia is a rejection of thousands of years of philosophy holding that truth has an independent existence. Without that basic principle at the center, any notions of "knowledge," "authority," and "reference" fall apart. And they collapse not just in the airy abstract, but in the gritty day-to-day. Most "edit wars" here are between people who value facts and others who do not. Wikipedia's organizing principle holds that fact and truth are whatever a consensus of editors say they are. That's how Wikipedia works, both philosophically and practically. The result is that all editors are equal, except those who have somehow obtained administrative rights, in which case they are equal to other administrators; that an editor or administrator need not be familiar with a subject to edit or administer an article; and that all issues will be decided by consensus, which by definition at Wikipedia is correct.

Any expert or authority on a topic immediately learns that appeals to fact are meaningless. If a majority at Wikipedia decided that 2 + 2 = 5, then that's what Wikipedia will publish. Similarly, if a majority decides that "4" shall not be mentioned, then it will not be mentioned. On Wikipedia, fact and truth have no validity in themselves. This means that they are subject to political determination, i.e., the process of obtaining agreement. Those who believe that 2 + 2 = 4 will need to lobby for 4. Wikipedia has an exhaustive set of rules that purport to guide such discussions, but those rules are routinely ignored. For instance, debates will rage within an article over whether a bit of information is "notable" enough for inclusion. Wikipedia's clear rule is that "notability" applies only to whether an assertion has significant coverage in verifiable sources, not to whether or not it is true.

Editorial equality is unrealistic, perhaps unsustainable


Wikipedia regards all editors as equal. Editors enjoy equal ownership of contributions, which is to say: none. Articles enjoy equal permanence, which is to say: none. What a PhD contributed today a high school kid reverts tomorrow, or next week, or next year.

An editor who has worked hard develops a proprietary interest and becomes protective of his work. This is human nature; this is how we behave in the real world--never mind Wikipedia's "no ownership" ideal. An editor who knows something about a subject is infuriated to find a know-nothing undoing his contribution. And that good editor is battling not only the know-nothings of today, but the never-ending future crops of know-nothings. He can only protect a good article for so long; there comes a time when finally all editors do become equal, the only genuine equality, the equality of the grave.

So good editors, after some experience of know-nothing reversions, and abuse, and watching yesterday's perfect page become today's shambles, decide it just is not worth the effort. Editorial equality, universal transience--the policy drives away experienced editors by the very experience of it.

It is not a policy that afflicted the print encyclopedias of yesteryear. They did not make the mistake of assuming all contributors, and contributions, are equal. They vetted very carefully. And it was prestigious to be invited to contribute. Perhaps it is time for Wikipedia to create a mechanism for separating the wheat from the chaff.

A presiding judge keeps order in his court with the contempt power, an instant strike of his gavel. There is no judge keeping order in a Wikipedia discussion. With all editors having equal power (which is to say: none) to stop abuse, naysaying and escalating hostility, there is no effective recourse. Especially not against wily editors who have learned what they can get away with. Reasonable, well-meaning editors flee, driven away; unreasonable, power-seeking editors concentrate, like an acid. As good editors leave, bad ones multiply; new editors find savagery, a deadly acid bath awaiting them--a death spiral develops. Wikipedia withers.

Possible solutions


Any solutions need to acknowledge that Wikipedia, for all its flaws, is unparalleled as a huge, productive crowdsourcing machine. More than a century ago, humanity began to industrialize manual work. A century later, humanity began to industrialize intellectual work. But can the machine be tweaked? Any tweaks must work for the vast majority of contributors, support the motivations people have to contribute, and reduce the turnoffs.

Wikipedia could evolve by aiming to simplify and automate some user interaction and avoid the most common sources of stress. This would be consistent with Wikipedia's main goal, since editors are not here to act as a social forum.

Counterbalance the image of older contributors


One of the causes of the departure of contributors is that older contributors are too self-confident. It's a well-known phenomenon that also appears on forums, where the terms oldfag and newfag are used. The principle is that older members become regulars, form circles, and do not welcome newbies. The longer you are here, the more you are known, the more you have contributed, and the more you are respected.

One solution is to discredit older contributors by associating each contributor with an extinct animal related to their arrival year. It's already done on the French Wikipedia with userboxes like "I'm a smilodon" (2006), but I think we must do more than a userbox. Only a few people will have one, and in several years no active users will have one. It should automatically appear on one user-related page (user page, talk page, contributions, any...) for contributors older than one year. Comments?

Slowing down


Most edit warring takes place in a short time. The worst behaviours emerge from a very short editing life cycle. Allowing users to commit to the same article only once per hour/day/... would encourage first-think-then-post. An intermediate sandbox tool could be very helpful. For example, Linux operating systems provide a full per-user view of the system; similarly, everybody could commit whenever they want to an intermediate sandbox (and view the result as user X immediately). Once an hour/a day/... the system would merge the intermediate sandbox with a simple algorithm: edits to distinct sections are merged immediately; edits to a common section are resolved with a policy X (First In First Out / Last In First Out / ...).
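The merge cycle described above can be sketched in Python. All names (`SandboxEdit`, `merge_sandbox`) and the "last-applied edit wins" rule are illustrative assumptions, not an existing MediaWiki feature:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SandboxEdit:
    """One pending edit in the intermediate sandbox (hypothetical)."""
    user: str
    section: str
    new_text: str

def merge_sandbox(article, pending, policy="FIFO"):
    """Merge pending sandbox edits into the article once per cycle.

    article: dict mapping section name -> section text.
    Edits to distinct sections are merged directly; edits to a
    common section are ordered by the chosen policy (FIFO/LIFO),
    and the last-applied edit wins.
    """
    by_section = defaultdict(list)
    for edit in pending:               # 'pending' is in arrival order
        by_section[edit.section].append(edit)
    for section, edits in by_section.items():
        if policy == "LIFO":
            edits = list(reversed(edits))
        for edit in edits:             # last applied edit wins
            article[section] = edit.new_text
    return article
```

With FIFO, the most recent arrival to a contested section ends up on top; with LIFO, the earliest one does. A real implementation would diff and merge text rather than overwrite whole sections.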

User dashboards


People love dashboards: Google Analytics, Bitly stats, and various Twitter statistics, for example.

Currently, only experts can find their way through the maze of wikilinks to statistics, like these.

Wikimedia could recruit interface designers to come up with ideas for key graphs and statistics (this could be crowdsourced; there would be an outpouring of designs), and also improve the speed of data from toolserver.org. The tools chosen would have to be computationally realistic.

Statistics should include smart breakdowns, perhaps including how many reverts are successful or undisputed. We'd have to think about the metrics -- but set it up so that what's measured and readily viewed directly from a user's user page supports constructive work.

This could include a visualization, in the edit history itself, of the results of a rating system (from user Adjwilley in the discussion). For example, there could be little thumbs-up buttons, so when you see an edit you like in the edit history, you hit the button, which records your action. It would be a means of giving some validation to editors who go around making good edits.

Channel deletionists


A priori deletionism. Deletionists probably act in good faith, and they play an incredibly important role in keeping spam out of Wikipedia. The problem is when they become immune to the value of new content, or insensitive to the motivations of other contributors. Perhaps one type of solution is to help deletionists differentiate between valuable and reckless edits. This could be done via statistics (see above). Deletionists should see a visual overview of how many of their deletions are good (e.g., removing spam) vs. unnecessary or harmful.

On the other hand, Wikipedia places a high value on edits containing no original research, a neutral point of view, and reliable-source verification. It may be counter-productive to make deleting questionable content harder than adding it in the first place.

Users vs. Admins


Some users are concerned about a lack of freedom. Other users are concerned about an excess of freedom. Freedom, like human nature itself, is very difficult to investigate. Other wiki-parameters are much simpler to talk about:

  • There are millions of users and millions of articles
  • Users come from different cultures, with different personal/interaction attitudes
  • Newbies need to learn - through interaction - a lot before they become Wikipedians

The learning and interaction process has, through the years, proven to scale well, and could perhaps work even when applied to billions of users and billions of articles. This works perfectly from an abstract point of view. Other web-based platforms[citation needed] also scale well and become wiki competitors because of:

  • The lack of required initial training: newbies are likely to shift towards simpler web platforms
  • The lack of strict user/admin interaction: newbies are not likely to be pleased by a there's somebody watching over you feeling.

Strict human interaction is very likely to give rise to rage and conflict. Being prevented by Mr. X from accessing a page is likely to be felt as a dumb decision/a revenge/a misunderstanding. Being prevented by automatic filtering from accessing a page (because you posted a well-known X-bad-link, because you called somebody a 'Nazi', ...) is more impersonal. Mr. X is still the one who reads the filter log and applies the ban/docks you -1 AL/... but you won't take it as a personal or direct offense.

Let the Jury Decide - For technical submissions especially, the author probably has first-hand experience and knowledge of his subject, based on employment that requires his knowledge to be accurate. Citing published references for the key points of a submission to the satisfaction of a single admin is almost insurmountable. The jury should be his peers out in industry or academia who, if given access to a reply page, could approve or critique the submission for review by the administrator. Obviously some trial period for public review would be necessary. This could apply to external links as well. Also helpful would be the opportunity for the author to respond to a critique.

Problematic users


Too much freedom really tempts problematic users. There are a lot of people who feel powerful when they can open accusations against other users. RfC - a simple and polite invitation to other editors to join a talk-page discussion and review - should be the only process users refer to. Administrative tools should be requested of, and used by, admins only. Some may feel this is an authoritarian evolution, but it could be the chance to dramatically simplify interaction and attract a lot of new users - who are now shifting towards social networks.

Problematic admins


Too much freedom really tempts problematic admins. Some admins may have too close an editor/admin conflict of interest. Editors and admins should be kept as separate roles, under a non-cumulative group policy. Trusted people willing to help as admins should, after being elected, give up their editor access.

Requirements that are more reasonable in the North than in the South


There seems to be a “luck of the draw” at Wikipedia, where some editors, unlike others, belittle newspaper articles as references and require mention in several books. These editors seem unaware that publication of books is nowhere near as common or cost-effective in the South as in the affluent countries of the North. Also, the multiplicity of languages spoken in many Southern countries makes the publication of books a rarity, usually at one’s own expense. Unless Wikipedia is to become largely a Gringo preserve, I suggest editors be restrained from excessive demands for books to serve as references and for prominence in the South. Newspaper articles and websites independent of the subject of the article should be accepted with less skepticism.

Lack of response


Getting no response. One problem could be that a lot of users get no response when they ask a question on a talk page, ask for feedback on articles they created, or ask for an editor review. This is very discouraging, and creates the impression that their input is not important to the community. In many ways, this is unavoidable for a talk page where there is no active watcher of the article, but in other contexts it could be avoided.

  • 1 - Every user is granted the right not to interact with you, even if asked on the talk page, unless you're pointing out some serious misconduct...
  • 2 - Every user has the right to ask other users for a review/RfC
  • 3 - Discussion about articles taking place on a personal talk page is one-to-one, not many-to-many; other users will not see it and get involved...
  • 4 - There are so many bureaucratic pages that users, in their spare time, are not very likely to read the RfC page...
  • 5 - Getting rid of all those bureaucratic pages and having editors refer ONLY to a main, categorized RfC page would greatly help interaction...
  • 6 - No matter who answers you - getting answered and reviewed is the goal

A big categorized RfC page, and nothing more... - Application 1 : Lack of response


Categories will play a growing role. A single, undifferentiated RfC page is like a message in a bottle thrown into the sea... Editors would "subscribe" to channels and promptly assist with new RfCs. Editors subscribing to RfC channel X would also be able to help review X-related content, not only user conduct or written English style.

Solution: categorized RfC channels + users subscribing RfC channels = you're never alone + no more orphan articles
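The subscription model above can be sketched minimally; the `RfCBoard` class and its methods are hypothetical names, not part of MediaWiki:

```python
from collections import defaultdict

class RfCBoard:
    """One central RfC page split into categorized channels (a sketch).

    Editors subscribe to the channels matching their interests; each
    new RfC is routed to that channel's subscribers, so no request
    should go unanswered for lack of an audience.
    """
    def __init__(self):
        self.subscribers = defaultdict(set)   # channel -> set of editors
        self.open_rfcs = defaultdict(list)    # channel -> list of RfC titles

    def subscribe(self, editor, channel):
        self.subscribers[channel].add(editor)

    def post(self, channel, title):
        """File an RfC and return the editors to notify."""
        self.open_rfcs[channel].append(title)
        return sorted(self.subscribers[channel])
```

Posting `board.post("history", "Review my draft")` would return every subscriber of the history channel, each of whom could then review both the request and related content.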

Software innovation


Looking up all the template codes is really time-consuming. Another possibly overlooked reason is the way the wiki software works. Even experienced internet users, when looking at a talk-page discussion, give up and walk away. The interface isn't designed for creating clear threaded discussions, and the text really just runs together. Most of us here are used to it, but that took time. Also, on the page-editing side, people have to remember strange codes and templates in order to do certain things.

WP being recognized by UNESCO (tenwiki:World Heritage) could be a great chance to invest in the required software innovation!

Moving to new platforms for collaboration


New platforms for collaboration. In order to reverse the troubling trend of editors leaving Wikipedia (i.e. to improve the recruitment and retention of new writers), we must move beyond moribund WikiProjects to new platforms for collaboration. This is already addressed, in part, by the "strategy:Attracting and retaining participants" portion of the current Wikimedia Strategic Plan. My proposal deals with how we get from where we are now (an en.wikipedia littered with moribund WikiProjects) to where the Strategic Plan takes us: the introduction of more social/collaborative tools to the wiki, including "Users would be able to join topical groups, based on their editing interests (e.g., "18th-century American history")".

Proposal: Moving beyond moribund WikiProjects to a new platform for collaboration.  

Your thoughts are appreciated!

Consolidating Wikiprojects. During the 2000s, many wikiprojects over-expanded, creating large sets of complex, interconnecting pages. This fracturing of a wikiproject exacerbates the appearance of abandonment. Consolidating back to fewer pages concentrates activity which in turn creates a more welcoming atmosphere (e.g. in 2015 WP:MCB went from 33 pages to 4 pages).

To get concrete, might OStatus compatibility, and thus the ability to interact with distributed open-API social networks such as GNU social and Friendica, be useful? What form might it take? Would it reduce the aforementioned "lone wolf" cultural aspects that alienate some users that like the social contact?

Automated user conduct handling


Adaptive granular access control (AC)


Let's imagine a unix-like access level.

  • Each user is given an access level (AL): -1 (= banned), 0 (= just created), ..., 10 (= allowed to edit any non-system page)
  • Wikipedia articles are given a protection level (PL): -1 (= editable even by banned users), 0 (= edit with AL >= 0), ..., 10 (= edit with AL == 10)
  • A user automatically gains +1 AL every X time units and Y good edits (edits not triggering any accusation/WP:DICK/wikilove/... filter)

An adaptive and granular access control system could handle article protection very effectively - like Linux. Say an article rates PL 7: you've just joined WP at AL 0, so you can only read it. You actively take part in WP N times without triggering any filter, and after a week/a month/... you gain an extra point... Normal articles rate PL 0, and disputed articles rate PL 10, so they can be edited only by cool-headed editors.

Human interaction is still guaranteed by an admin who reviews the filter report and decides whether to apply a -1 AL penalty.

True behavioural redemption and conformity with the guidelines are ensured by the time and the number of good edits required to regain AL points.
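The AL/PL mechanics could be sketched as below; the function names and the threshold parameters (`x_days`, `y_edits`) are placeholders for whatever the community would actually choose:

```python
def can_edit(user_al, article_pl):
    """A user may edit an article only if their access level (AL)
    meets the article's protection level (PL). Note that PL -1
    pages remain editable even by banned (AL -1) users."""
    return user_al >= article_pl

def promote(user_al, days_active, good_edits, x_days=30, y_edits=50):
    """Gain +1 AL per period of clean activity (no filters
    triggered), capped at the maximum level 10."""
    if days_active >= x_days and good_edits >= y_edits and user_al < 10:
        return user_al + 1
    return user_al

def penalize(user_al):
    """An admin reviewing the filter log may dock one AL point;
    the floor of -1 corresponds to a ban."""
    return max(user_al - 1, -1)
```

A brand-new account (AL 0) can edit normal PL 0 articles but only read a disputed PL 10 article; clean activity slowly raises or restores access, which is the "behavioural redemption" described above.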

Adaptive AC - Application 1 : auto-protecting articles of a certain level

Articles reaching GA, FA or FL status. IP edits and non-autoconfirmed users should be prevented (using protection) from editing GA+ articles. The major driving force that makes this more of a chore than a pleasure is having to wade through edit after edit to find out which is vandalism, which is misguided, and which actually provides useful information - and new users can affect even a locked article.

On GA+ articles, the problem becomes bigger because a group or individual has generally put a lot of effort into these pages including research, writing, sourcing, etc. Then, in the case of something like Transformers (film), a new film in the series comes out and people feel it necessary to change these articles, almost always by bloating plot but in other ways as well.

If these articles were given moderate protection that required newer users to request edits, it would go a long way toward preserving editors' work on what should be the flagship articles of the project, and perhaps lessen the impact that negative editing can have on an editor's work. It would also mean that there may be an endpoint to your work: you get an article to a certain status and can then trust that edits being made are almost certainly positive, coming from established and/or knowledgeable users.

Solution: granting articles an adaptive PL: Stubs rate PL 0, ... FAs rate PL 10

Having a "work in progress" status - and approval before publishing for new editors


I have a little proposal for this whole method of handling articles. Letting anyone publish new articles, and then deleting them if they are not found good enough, seems a bit strange. There are many publication services where you can submit and edit your work, and then get it approved by an editorial panel. If not approved, it keeps the status of "work in progress" until you re-submit for approval. In this way, new authors of articles can go about creating good content without getting their fingers rapped, as happens today.

Creating good articles can sometimes be a time-consuming task. There is no way to mark articles as "work in progress, do not list". Having an article in progress, where you decide on your own when it is ready to be evaluated/published, would in general be a good approach. Such articles could be linked for ease of reference when discussing them with the editorial panel, but would not be available for search.

Auto-approval could be a status set by administrators in the user profile, for those who have gained a certain experience level.

The way it is done today is not very inspiring for those who want to write. Ole Kristian Ek Hornnes (talk) 17:50, 17 December 2012 (UTC)

WP:Original Research/WP:Reliable Sources and common sense


OR/RS and common sense intersect and conflict quite often. It should be allowed to split articles: the first part has to be completely referenced and readable by any inexperienced user; the second part can be expert-oriented and refer to advanced topics that are usually taken for granted.

Create a better introductory system


We need a better introduction to editing. While we have several pages designed to help newcomers, they are either too basic (you can edit this page, see how!) or too detailed (such as Help:Wikipedia: The Missing Manual – a great page, but overly detailed for a casual newbie). The Wikipedia:New contributors' help page, viewed by 500+ people a day, is a particularly bad introduction.

We need a beginner's guide that gives a basic overview of policy as well as formatting and community expectations, written solely to prevent newbies' first edits from being reverted. Most importantly, the page should be made prominent so that new editors find it before they make their first edit. Perhaps we could link it in bold from the template that pops up when you edit from an IP. When new editors have a basic understanding of our expectations, their edits will be reverted less often and their self-confidence (as a whole) will be boosted. This should (if the recent data in the Signpost is correct) lead to a higher retention rate. A page like Wikipedia:Your first article would be good, but written from a broader perspective. Perhaps create a page called Wikipedia:Your first edits? Or revamp and repurpose one of the existing help pages?

Facts vs. Truth, Inclusionism vs. Deletionism


WP has been created according to a few principles, designed like the rules of a complex adaptive system game: you put a lot of people together, thousands, millions... You design just a few behavioural guidelines (not even related to principles), and then you start playing. The players (we), after a lot of game iterations, will design their own principles as a result of emerging collective behaviour.

The only problem could be a non-homogeneous emerging collective behaviour, with this big experiment clearly split into two secluded macro-areas: inclusionism and deletionism. Inclusionist players are always reluctant to remove something, because they fear censorship and incompleteness, and are not interested in objective truth. Deletionist players are always reluctant to add something, because they fear losing WP's public reliability, and are not interested in completeness or in extending WP.

Solution: there are no wrong people, just the right people in the wrong jobs.

Article arbiter: all are equal but some more equal than others


To address the problem that not all editors, or editorial contributions, are really equal--why not allow an article to have a designated Arbiter? That is, one active editor, or maybe three, who by consensus have made a lot of good contributions to it? The Arbiter would have a fixed term of one year, or three years, or whatever. If there is an edit war, an Arbiter would have the power to freeze the page, blocking changes for 24 hours to let things cool; or block warring editors from contributing to that page--again, just for a specified time. This would have several good effects. It would be a pat on the back for constructive editors. It would help preserve good text; work is no longer flung into the void. Edit warring or incivility would have an instant and effective response, which now it does not. And finally, this is consistent with Wikipedia's general policy of equality since the Arbiter is appointed by consensus.

From Wikipedia Signpost


There's a ton of outside writing about Wikipedia, much related to the challenge of losing contributors. Here's some starting points...

Articles of interest


See also


References

  1. ^ Feltman, Rachel (January 21, 2014). "The #1 doctor in the world is Dr. Wikipedia". Quartz. Retrieved 2014-01-23.
  2. ^ Wagstaff, Keith (January 20, 2014). "Can you hear him now? Wikipedia's Jimmy Wales gets into the cellphone game". NBC News. Retrieved 2014-01-23.
  3. ^ Newman, Judith (January 8, 2014). "Wikipedia-Mania: Wikipedia, What Does Judith Newman Have to Do to Get a Page?". The New York Times. Retrieved 2014-01-23.