
Authority metric

This page is kept for historical interest. Any policies mentioned may be obsolete. If you want to revive the topic, you can use the talk page or start a discussion on the community forum.

I have an idea about article validation, but let me say first that I don't think Wikipedia is the right place to implement it. There will never be consensus support for such a thing; it should be implemented as a fork. In any case, none of us knows which model is best, so why not try a diversity of approaches in parallel? Let the best model win. Still, there's lots of related discussion here, so it seems like a good place to start.

A Wikipedia article approaches perfection by w:Brownian motion. With some edits it gets better, with others it gets worse. On average, there are more good edits than bad edits, so articles diffuse towards perfection. But even if, hypothetically speaking, an article temporarily reaches perfection, it's only a matter of time before some random person dislodges it by adding a misconception or applying idiosyncratic formatting. The equilibrium state has an average article quality that is somewhat less than perfect, with a distribution of quality from very good to very bad.

What we need to do is reduce the temperature: restrict editing as an article gets closer to perfection. We slow down good edits, and we slow down bad edits. Nupedia made the mistake of freezing the system before the articles were any good; it started with a near-zero temperature, allowing very little improvement. By reducing the temperature, you improve the average equilibrium article quality. I could make an analogy with planetary atmospheres, but this is getting a bit silly.
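To make the temperature metaphor concrete, here is a toy simulation in the spirit of simulated annealing. It is a minimal sketch under assumed numbers, not part of the proposal itself: edits nudge a quality score up or down, improvements are always kept, and regressions survive with a probability set by the temperature.

```python
import random

def equilibrium_quality(steps=50_000, temperature=1.0, seed=0):
    """Toy model: article quality as a random walk on [0, 1].

    Each step an edit proposes to move quality by +/- 0.01.
    Improvements are always applied; regressions are applied with
    probability `temperature`.  All constants are illustrative.
    """
    rng = random.Random(seed)
    quality, tail = 0.5, []
    for i in range(steps):
        delta = rng.choice((0.01, -0.01))
        if delta > 0 or rng.random() < temperature:
            quality = min(1.0, max(0.0, quality + delta))
        if i >= steps - 10_000:   # average over the late, settled phase
            tail.append(quality)
    return sum(tail) / len(tail)

for t in (1.0, 0.5, 0.1):
    print(f"temperature {t}: average quality ~ {equilibrium_quality(temperature=t):.2f}")
```

Lowering the temperature raises the average equilibrium quality, which is all the metaphor is meant to claim.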

What I'm saying with these dense metaphors is pretty mundane, and it has been said before by other people. The canonical example of this is the idea that we should protect articles once they become "good enough", and then only allow edits by some small group of people, say sysops, in response to comments on the talk page. But that's a bit coarse for my liking, I'm thinking of something more fine-grained.

Wikipedia is a battleground. It's full of edit wars and conflicts that go on forever. A consensus is reached, but then a month later some new zealot will come along and reopen the whole thing. I was wondering, how is this problem solved in the real world? The answer is a set of social conventions that we are all very familiar with -- w:authority. Wikipedia has a culture which eschews authority, and that's part of the reason I think this idea is impractical except as a fork.

In the real world, whoever has the most authority wins the argument. Authority is tied to power, to the means of disseminating information. Those with authority are more likely to win the support of journalists and reviewers of scientific journals. Disputes between people with the same amount of authority are often solved by appeal to a higher authority, as is the case in the management structure of corporations. In academia, discourse between people in the highest levels of authority is very well informed and relatively civil. Authority doesn't solve every argument, but it provides a quick method to dismiss the claims of people who are uninformed.

On Wikipedia, power is dissociated from authority. In an edit war between someone with a high school education and a professor, whoever has the most time to waste wins. The professor is obliged to explain basic principles of his or her field to every idiot who comes along, talking them out of their misconceptions.

My idea is to have a site where authority is reconnected with power. Authority should be measured in the same way it is measured in the real world -- qualifications, experience and respect from others who have authority in the same field. Authority is measured by assigning a metric for each field of knowledge a given person is interested in. Typically, an academic would have a high authority in a specialised field, and lower authority values in related fields. People wishing to obtain a high authority metric would have to provide information about themselves -- verifiable information about qualifications, lists of publications, etc. Trusted users, drawn from the general community, would be responsible for verifying such claims.

We would then give users with authority special rights over the articles in their field. They could, for example, protect an article such that only people with a similar or higher authority metric could edit it -- other edits would be consigned to a moderation queue.

Editing for style and grammar does not require authority in the given field. We could use a general trust metric to allow trusted users to make such edits, but there would be a notice above the edit box warning them not to change matters of fact. Because there is an entry barrier, users violating this condition could easily be dealt with.
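As one concrete reading of the last three paragraphs, here is a minimal Python sketch. The names, thresholds and the moderation queue are my own assumptions, not a specification: each user carries a per-field authority value plus a general trust flag, and an edit either goes live or lands in the queue.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    authority: dict[str, float] = field(default_factory=dict)  # per-field authority metric
    trusted: bool = False  # general trust metric: may make style/grammar edits

@dataclass
class Article:
    title: str
    subject: str
    protection_level: float = 0.0  # 0.0 = open to everyone, as on Wikipedia
    moderation_queue: list = field(default_factory=list)

def submit_edit(user: User, article: Article, edit: str, style_only: bool = False) -> str:
    """Apply the edit if the user has similar or higher authority; else queue it."""
    if user.authority.get(article.subject, 0.0) >= article.protection_level:
        return "applied"
    if style_only and user.trusted:
        return "applied (style only; facts must not change)"
    article.moderation_queue.append((user.name, edit))
    return "queued for moderation"

professor = User("prof", authority={"physics": 0.9})
passerby = User("anon")
qm = Article("Quantum mechanics", "physics", protection_level=0.7)
print(submit_edit(professor, qm, "fix the Hamiltonian"))  # applied
print(submit_edit(passerby, qm, "add my pet theory"))     # queued for moderation
```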

Only articles which are well-written and polished would be protected in this way. New or incomplete articles would be open to editing, just like on Wikipedia. And we could implement simple methods to carry over changes from Wikipedia to the proposed site.

Any comments? -- Tim Starling 03:01, 17 Nov 2004 (UTC)

Sorry, no forks. We need to develop something in-house. --Daniel Mayer

That's not for you to decide. Before you get angry: I'm not really serious about forking the project; for a start, I don't have enough money. But the whole idea of forking is that you can try things which aren't politically possible on the original site. A conservative solution, suiting the existing community, might not be the best solution. How can anyone tell except by experimentation? -- Tim Starling 05:04, 19 Nov 2004 (UTC)
FORKS ARE GOOD! The RightToFork is why we are building things with open, fork-friendly licenses in the first place. However, I'm not sure why the above suggestions would require working outside of Wikimedia; this sounds an awful lot like a moderated 'Wikipedia 1.0' fork, which would be a Wikimedia-sponsored effort and which has been vaguely planned for ages. --brion 01:25, 20 Nov 2004 (UTC)
  • forks: many developments may benefit from the existing articles database. Fetching the HTML versions is not very efficient, so a fork author will probably ask for read access to the WP articles database. Refusing it may be stupid, because the fork will then fetch data from the public server (through HTTP). Accepting leads, if the fork is very successful, to a non-negligible load (DB servers, network...) on WP servers, which may annoy WP people who don't like the fork's features. But if most WP people like the fork... well, it will no longer be a fork. My conclusion is: 'try hard to create a proposal that doesn't disrupt anything'.
  • authority metric: let's forget about 'expertise' in all non-scientific and non-technical fields. Every article may have a status, chosen for example from the following (see the sketch after this list):
    • 'raw', meaning 'latest standard content' (any existing article has this 'raw' status, and, in my humble opinion, all non-technical and non-scientific articles will always keep it)
    • 'unpolluted', meaning 'free from any vandalism'; an article is blessed as such by an admin
    • 'validated', meaning 'a Wikipedia commission of people who know the field validated it'; an article gains this status thanks to a 'WP expert'. WP experts are nominated by admins, for a given field, after an analysis of all articles in order to give 'scores' to all authors. The bottom line is simple: an author who wrote many articles in the field which are stable (not often modified) and read by many... is an expert.
    • 'expertised', meaning 'a world-renowned expert in the field checked it and found it sound'. Those world-renowned experts are elected by WP experts.
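Here is the sketch referred to above: the status ladder as a small Python enum, with one promotion rule per rung. The role names are assumptions standing in for the admin / WP-expert / world-expert distinctions in the list.

```python
from enum import Enum

class Status(Enum):
    RAW = 1          # 'latest standard content'; every article starts here
    UNPOLLUTED = 2   # an admin certifies it free of vandalism
    VALIDATED = 3    # a WP expert, nominated by admins, validated it
    EXPERTISED = 4   # a world-renowned expert, elected by WP experts, checked it

# Assumed role required to grant each rung (not specified in the post).
PROMOTER = {
    Status.UNPOLLUTED: "admin",
    Status.VALIDATED: "wp_expert",
    Status.EXPERTISED: "world_expert",
}

def promote(current: Status, actor_role: str) -> Status:
    """Move an article one rung up the ladder, if the actor may grant it."""
    nxt = Status(current.value + 1)   # raises ValueError past EXPERTISED
    if PROMOTER[nxt] != actor_role:
        raise PermissionError(f"a {actor_role} cannot grant {nxt.name}")
    return nxt

s = promote(Status.RAW, "admin")   # -> UNPOLLUTED
s = promote(s, "wp_expert")        # -> VALIDATED
print(s.name)
```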

Adding another reader-given score is possible. I'm afraid that every attempt to let people judge must be backed by robust technical measures to prevent any tricks, so we may use crypto to 'seal' their published opinions. My proposal is much more complete (and IMHO more convincing :-) ); it is published
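One way to read the crypto 'seal' is a plain digital signature over each published opinion. Below is a minimal sketch using Ed25519 from the third-party `cryptography` package; how reviewers' keys are issued and tied to real identities is the hard part, and is not shown.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Each reviewer holds a private key; the site publishes the opinion,
# the signature, and the reviewer's public key next to the article.
key = Ed25519PrivateKey.generate()
opinion = b"article:Quantum_mechanics score:4/5 reviewer:prof"
seal = key.sign(opinion)

public_key = key.public_key()
try:
    public_key.verify(seal, opinion)   # raises if the opinion was tampered with
    print("seal verified")
except InvalidSignature:
    print("opinion was altered after sealing")
```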
