
Censorship

From Meta, a Wikimedia project coordination wiki

old discussion archived on the talk page

This is a discussion page about Censorship, Filtering content, User preferences for reading, Ethical relativity, and other subjects.


The censorship problem


Wikipedia and other Wikimedia projects are currently blocked in various parts of the world, depriving people of access to them.

  • China
  • Schools and libraries that block Wikipedia

Some individual families, teachers, and communities disallow access to the projects as well.

Reasons include:

  • availability of politically sensitive material (as in China)
  • availability of images of lightly clothed or nude people, or of sexual acts (as in Iran and some schools)
  • possibility of encountering vandalism (as in some classrooms)

Individual project members also regularly request that Wikimedia delete articles or images on grounds of 'inappropriate' content, either to censor others (because they object to the subject matter) or to self-censor (in an effort to minimize the impact of the external censorship efforts described above). Most such requests are denied, but some material is removed for legal, ethical, or practical reasons.

The filtering problem


Some audiences want to be able to vet websites via third-party filters. Schools often fall into this category, from primary school through university: they typically use a standard tool to determine whether a site is appropriate for students to visit on school computers.

Some filters require no support from the content provider; others do. Spending time to develop such support can encourage poorly defined filtering to take place.

Most 'filters' use negative, simplistic, and unbalanced classification schemes, rather than widely used descriptive classifications such as the existing Wikimedia categories (or Library of Congress or Dewey Decimal classifications).
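To make the contrast concrete, here is a minimal sketch of the two approaches. Everything in it is hypothetical: the category labels, the page_categories table, and both filter functions are invented for this illustration, and neither reflects an existing MediaWiki or filter-vendor interface.

    # A hedged sketch: a crude keyword blocklist versus a filter that
    # consults neutral, descriptive categories (all names hypothetical).

    BLOCKLIST = {"nudity", "war"}  # a typical negative, simplistic scheme

    def crude_filter(page_title: str) -> bool:
        """Allow a page only if its title contains no blocklisted word."""
        title = page_title.lower()
        return not any(word in title for word in BLOCKLIST)

    # Descriptive scheme: neutral labels, like Wikimedia categories.
    page_categories = {
        "Venus de Milo": {"Ancient Greek sculpture", "Nude art"},
        "Trench warfare": {"Military history", "World War I"},
    }

    def descriptive_filter(page_title: str, excluded: set) -> bool:
        """Allow a page unless it carries a category the policy excludes."""
        return not (page_categories.get(page_title, set()) & excluded)

    # The keyword filter blocks a history article merely for mentioning
    # war; the descriptive filter excludes only what a policy names.
    assert not crude_filter("Trench warfare")
    assert descriptive_filter("Trench warfare", {"Nude art"})
    assert not descriptive_filter("Venus de Milo", {"Nude art"})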

The user preferences problem


Some audiences want to self-limit what they see while browsing the projects. Muslims may not want to be confronted with images of Muhammad when reading articles about Islam or Muhammad. Christians and Muslims may not want to see images of naked people, even when reading about the human body.

However, making such user preferences available raises the possibility that such settings will be chosen by some intermediary power without the consent of the reader.
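To make the consent concern concrete, here is a hedged sketch of a reader-side preference; the ReaderPrefs structure and the render_image hook are invented for this illustration and are not part of MediaWiki.

    # A hedged sketch of a reader-side viewing preference (names invented).
    from dataclasses import dataclass, field

    @dataclass
    class ReaderPrefs:
        hidden_categories: set = field(default_factory=set)

    def render_image(image_categories: set, prefs: ReaderPrefs) -> str:
        """Collapse an image matching one of the reader's hidden categories."""
        if image_categories & prefs.hidden_categories:
            return "[image hidden by your preferences - click to show]"
        return "<img ...>"

    # Chosen by the reader, this is self-limitation; the problem is that
    # nothing here stops a proxy, school, or ISP from injecting the same
    # preferences without the reader's consent.
    prefs = ReaderPrefs(hidden_categories={"Depictions of Muhammad"})
    print(render_image({"Islamic calligraphy"}, prefs))     # shown inline
    print(render_image({"Depictions of Muhammad"}, prefs))  # collapsed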

Some free-knowledge and anti-censorship proponents feel that allowing people to self-limit what they see harms the integrity and effectiveness of a resource, and that supporting audiences who wish to self-censor is itself offensive.

The ethics problems


Specificity: A classification scheme that would help sensitive readers avoid topics they oppose requires those who classify to anticipate every set of beliefs and fears that might take issue with part of an image or text. This can amount to 'doing censorship for others' and raises deep moral issues. For example, some people might not (or might not want to) classify:

  • articles with the words "shit" or "damn" as 'articles with cursing'
  • casual images of affection between people of the same sex as 'same-sex kissing'
  • passing nudity in Renaissance artworks as 'nudity in art'
  • historical images showing the realities of war as 'possible torture'

Someone might strongly feel that, if people are to be educated, they sometimes need to know what is really going on. Someone might be deeply disturbed at the idea of having to help prevent children from developing an ethical conscience by whitewashing the world for them.

There is also a completeness question: for such a scheme to work, must every image be classified right away? If not, the scheme is largely meaningless, since it will not actually provide the censorship asked for. This means that the people uploading or creating the material would likely be asked to assist in censoring their own work.
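A short sketch of why completeness matters under the two obvious defaults (the data and function names here are hypothetical): a default-allow scheme guarantees nothing until classification is complete, while a default-deny scheme over-blocks everything unclassified.

    # A hedged sketch of the completeness question (all data hypothetical).
    # Unclassified images simply have no entry in the table yet.
    classified = {
        "old_master_painting.jpg": {"nudity in art"},
    }

    def filter_default_allow(image: str, excluded: set) -> bool:
        """Default-allow: an unclassified image always slips through."""
        return not (classified.get(image, set()) & excluded)

    def filter_default_deny(image: str, excluded: set) -> bool:
        """Default-deny: an unclassified image is always hidden."""
        cats = classified.get(image)
        return cats is not None and not (cats & excluded)

    excluded = {"nudity in art"}
    print(filter_default_allow("new_upload.jpg", excluded))  # True: no guarantee
    print(filter_default_deny("new_upload.jpg", excluded))   # False: over-blocks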
