
Human Rights Policy Community Conversations

[Image: Eleanor Roosevelt reviewing the Universal Declaration of Human Rights]

Introduction


The Wikimedia Foundation's Global Advocacy team will hold a series of community conversations in May 2022 to receive input on how the Human Rights Policy can best support individual members of the community. This page contains background information on the Human Rights Policy, the commitments the Foundation has made under it, and information about how members of the community can influence how it is carried out. Smaller focus groups will also be organized to explore sensitive topics in more focused, intimate settings, and additional community conversations will be organized in the future.

The discussions and recommendations from these conversations will guide how the Foundation implements the Human Rights Policy in the coming months and years. Throughout this process, our team will take notes and provide a summary of challenges and recommendations identified during these community conversations on this page.

Background


The Wikimedia Foundation's Board of Trustees approved the Human Rights Policy on December 8, 2021. This policy serves as a compass for our broader work in advocating for policies and technologies that advance the movement, and provides a framework for respecting and protecting the rights of everyone, from staff to volunteer contributors, across all Foundation operations and movement activities. Critically, the policy will inform how we respond to and protect members of our movement against demands and threats from non-state actors and governments seeking to violate their human rights. Under this policy, the Foundation commits to the following:

  • Conduct ongoing human rights due diligence, including periodic human rights impact assessments, in addition to regular engagement with rights holders. Our human rights due diligence aims to ensure that we identify how all aspects of our operations and projects affect human rights, and in turn work to mitigate and prevent harm.
  • Track and publicly report on our efforts to meet our human rights commitments, as part of our broader commitment to public transparency and openness through all of our websites and platforms, in addition to our dedicated Transparency Reports.
  • Use our influence with partners, the private sector, and governments to advance and uphold respect for human rights. The Wikimedia Movement Strategy’s commitment to increase our global advocacy activities both reflects and enables this effort to advance the realization of rights we are uniquely positioned to support, especially the right to share and access information.
  • Provide access to effective remedies. In cases where the Foundation’s prevention and mitigation strategies have not prevented our products, platforms, or operations from contributing to the curtailment, infringement or violation of human rights, we commit to maintain and improve mechanisms for reporting harms or abuses. We also commit to work with experts and stakeholders to develop or support appropriate forms of redress, proportionate to the type and manner of harm. We commit to avoid obstructing access to other forms of remedy including judicial remedies.

Community Conversations


This policy commits the Wikimedia Foundation to protect and respect the human rights of all people in the Wikimedia movement, and its implementation should ultimately benefit members of this community. To understand how the policy can best do so, the Global Advocacy team is launching a series of community conversations to hear directly from individuals.

Objectives


These community conversations are opportunities for members of the Wikimedia community to provide critical input and feedback into how the Foundation implements the Human Rights Policy. The objectives for these conversations include:

  1. For the Foundation to better understand how your human rights have been impacted as a result of your work and involvement with the Wikimedia movement.
  2. To receive ideas, suggestions, and recommendations about how the Foundation can better support you or other members of the Community when your human rights are threatened.

The Global Advocacy team recognizes that human rights are threatened in every region of the world. For the purposes of these conversations, we will focus on the most pressing and grave human rights challenges facing our communities.

Format & Registration


These conversations will take place via Zoom and will last 60 minutes. Representatives of the Foundation will provide opening remarks and updates (10 minutes), and the remaining time (50 minutes) will be open for members of the community to help shape policy implementation. These conversations will be governed by Wikimedia's Friendly Spaces Policy and the Universal Code of Conduct. We encourage participants to come ready to engage in dialogue and to share concrete ideas and recommendations. Register for a session by clicking on the corresponding link in the table below!

Town Hall Community Conversations
Region | Date | Time (UTC) | Time (Local)
Latin America & Caribbean (Conducted in Spanish) | Tues 3 May | 23:00 | 19:00 (Santiago, Chile)
Central & Eastern Europe | Mon 9 May | 17:00 | 19:00 (Berlin, Germany)
Africa | Thurs 12 May | 18:00 | 19:00 (Abuja, Nigeria)
Asia | Mon 16 May | 12:00 | 20:00 (Jakarta, Indonesia)

Language Interpretation


We will work to provide interpretation for any language with four or more interested community members. To request interpretation, email rgaines(_AT_)wikimedia.org at least 5 days before the meeting so that we can make the necessary arrangements.

Unable to Participate?


We understand that many of you may not have been able to participate in these Community Conversations for a number of reasons, such as scheduling conflicts or safety concerns. To accommodate these situations, an anonymous survey is now available for you to share your thoughts, comments, ideas, and recommendations. The survey is conducted via a third-party service, LimeSurvey, which may subject it to additional terms; for more information on privacy and data handling, see the survey privacy statement. The survey will remain open through 30 June 2022.

Topics of Focus


An organization-wide human rights impact assessment completed in 2021 identified five key risks associated with human rights across Wikimedia's platforms. These community conversations will focus on how these risks have impacted members of the Wikimedia community and how they are manifested in various regions and across our platforms. These risks include:

  1. Harmful content, such as the spread of hate speech, disinformation, or dangerous content that can contribute to self-harm or the harm of others
  2. Harassment, such as doxing individuals or attacking or threatening individuals based on their race, ethnicity, gender, gender identity, sexual orientation, or other personal characteristics
  3. Government surveillance and censorship, such as monitoring individuals' online activity, blocking websites, requesting user data, or requesting content removal or alterations
  4. Risks to child rights, such as the potential for inappropriate communication with children, exposing children to harmful content, or the spread of content that exploits children
  5. Limitations on knowledge equity, such as information bias and individual discrimination that can be the result of gender inequities and the under-representation of racial, ethnic, and geographic groups

Note: The human rights impact assessment mentioned above will be published in the coming weeks.

Questions to Frame these Conversations


The Global Advocacy team will moderate each conversation to facilitate insightful, productive dialogues to better understand how community members' human rights have been impacted and how the Foundation can better support them in such situations in the future. Ahead of these conversations, consider the following questions:

  • Are you aware of any harm that has occurred in your community as a result of harmful content, harassment, or government surveillance and/or censorship on any Wikimedia platform?
  • Could this harm have been predicted? How could this harm have been mitigated?
  • Have you or anyone you know ever communicated with the Wikimedia Foundation when harm was imminent or actively occurring?
  • How did the Foundation support you or your community? What worked well and what could be improved?
  • How can we prevent such harm from occurring again and affecting other members of your community?
  • If you became aware of possible harm, do you know how to raise your concerns to the Foundation?
  • What unique challenges is your community facing?
  • How can the Foundation better understand emerging threats to your human rights on our platforms in your region, country, community, etc.?

If you have questions you would like to submit before each conversation, please leave them on the Discussion page and indicate which region your question relates to.

Safety and Security of Participants


The Global Advocacy team prioritizes the safety and security of all participants in these events. If you feel the need, we welcome you to use a pseudonym, keep your video off, and/or submit questions in the live chat.

Summary of Community Conversations


Latin America

  • Online Harassment: Volunteers shared experiences of being harassed both on- and off-wiki; they noted that when writing about politically sensitive or socially controversial topics, they can be targeted by journalists or others who take issue with their content; another volunteer added that political polarization in the region contributes to this challenge; volunteers expressed that they frequently don’t know how to deal with these challenges and need support.
  • Privacy: A volunteer elaborated on the issue of harassment and noted that if a Wikimedian has been active long enough, there is likely enough information about them online that an astute person could infer their identity, share it publicly, and begin to target them for their work; this is particularly problematic on Commons, where removing your own content is extremely difficult; a process should exist to remove your own information.
  • Community Recommendation: Volunteers recommended that resources and training be made available for them to better protect their own human rights; training would be helpful for the broader community to understand what human rights are impacted online and on-wiki, and how they can respond and access resources when they are threatened.

Asia

  • Doxxing: Volunteers shared experiences about cases of doxxing that bled over into Twitter and other off-wiki platforms; one volunteer was (mistakenly) told WMF could not help since the doxxing was not occurring on a WMF platform; WMF staff recommended users contact talktohumanrights@ in such situations, and the Human Rights team will provide what assistance they can.
  • Takedown Requests: A volunteer inquired about takedown requests, whether they have occurred, and how the Foundation responds; WMF staff pointed to the Transparency Report and legal review process; a BoT member discussed a case where a page’s edit history was removed in an extreme situation to protect users.
  • Due Diligence/Events and Conferences: A volunteer inquired about taking human rights concerns in certain countries into consideration as communities plan in-person events; WMF staff affirmed that the HRP commits the Foundation to conducting due diligence, and that supporting communities to carry out due diligence around event planning could be a productive area of collaboration and co-design in the near future.
  • Contacting the Foundation on Human Rights Concerns: Multiple volunteers inquired about contacting the Foundation around human rights concerns or threats, noting certain situations can be scary. WMF staff encouraged them to reach out to the Human Rights Team through whichever channels they felt most secure with for assistance, noting that details are kept private for security. The team can also support volunteers with digital security training, analyzing risks, and minimizing the amount of their information that is publicly available.

Africa

  • Government Surveillance: A volunteer inquired whether the Foundation can detect government surveillance; another volunteer shared that they cannot edit certain Wikipedia articles for fear of government surveillance and possible retribution, while another shared how his affiliate group is a legally registered NGO that is, therefore, subject to government monitoring of its activities; WMF staff explained that detecting surveillance is difficult and that many governments do carry out mass surveillance, but that the Foundation advocates for such surveillance to be narrow and lawful.
  • Legal Defense of Volunteers: A volunteer inquired about the Foundation providing legal defense to individual volunteers; WMF staff explained that such cases are evaluated on a case-by-case basis, but we do advocate for intermediary liability protections and work with local communities and partners to mitigate such situations.
  • Disinformation: A volunteer shared that "edit wars" and disinformation can become significant challenges in their community around elections; neutral volunteers want to address such disinformation, but do not know how; WMF staff highlighted our growing work to address disinformation on WMF platforms and noted that the Foundation hopes to provide training to volunteers.

Central & Eastern Europe

  • Doxxing: WMF staff shared that this has been the most frequent challenge in the region in recent weeks, including cases in which volunteers’ information was shared via Telegram; one volunteer who was doxxed was arrested; volunteers should reach out to the Human Rights team to flag any cases of doxxing and the channels through which it is occurring; the Human Rights team can also provide digital security best practices and recommendations on what to do if an individual is being doxxed.
  • Foundation Commitment to Human Rights: The armed conflict in Ukraine is an extreme situation that touches many parts of the Foundation, including grantmaking, trust and safety issues, technology, product, etc.; in such situations the BoT supports the Foundation in taking extraordinary steps, beyond what is normally done, to support affected volunteers, staff, and contractors; this support can’t always be made public for security reasons, but the community can always share what their needs are.
    • A volunteer inquired about the best way to stay in contact on these matters; WMF staff explained that volunteers can contact the Human Rights team 24x7 through a variety of channels, using whichever they feel most comfortable with.
  • Government Censorship: A volunteer raised the topic of the Turkish government blocking Wikipedia in 2017 and inquired about Foundation communications to the community in such situations; WMF staff explained that the Foundation communicated what they could at the time and that, in some situations, sharing information can make the situation worse; WMF staff also explained the process for reviewing takedown requests from governments and pointed towards the Foundation’s Transparency Report.
  • Community Recommendation: A volunteer recommended a central hub be established where resources and information can be made available to them.
