
Grants:Project/Ocaasi/Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience



status: draft
Misinformation And Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience
summary: To prevent disinformation on Wikipedia, we need to explore the current research, community practices, external expertise, and potential interventions that can address it.
target: English Wikipedia, smaller and non-English Wikipedias
amount: 27,500ドル USD
grantee: Ocaasi
contact: jorlowitz(_AT_)gmail.com
this project needs: contact, organization, volunteer, join, endorse
created on: 21:09, 17 February 2020 (UTC)


Project idea

What is the problem you're trying to solve?

What problem are you trying to solve by doing this project? This problem should be small enough that you expect it to be completely or mostly resolved by the end of this project. Remember to review the tutorial for tips on how to answer this question.

Disinformation is an urgent topic of study and concern, sadly because it's eroding our civil society and trust in media through viral activity on massive social networks. Misinformation [1] is defined as merely wrong information, which may or may not be intentional. Disinformation [2], however, is intentionally wrong--and also harmful.

On Wikipedia, disinformation can come from good-faith editors who (mis)use sources that are actually (dis)information, single-purpose accounts that seek to skew and bias articles, paid editors with a financial interest in promoting client interests [3], coordinated groups that look to "brigade" topics where they have an agenda, and state actors, where governments seek to undermine political dissent or insert propaganda.

Wikipedia is widely considered a generally reliable source [4]. It maintains this reputation despite a long history [5] of edit wars at controversial articles [6]. In many areas the encyclopedia has demonstrated remarkable resilience against attempts to introduce falsehoods and bias. In other cases, however, even simple hoaxes [7] expose the gaps in our defenses and their broader negative impact ([8] [9] [10]).

Considering the wholesale havoc that misinformation and disinformation have wrought on sites such as YouTube ([11] [12] [13] [14] [15]), Facebook ([16] [17] [18] [19] [20] [21]), Twitter ([22] [23] [24] [25] [26]), and WhatsApp ([27] [28] [29] [30] [31]), Wikipedia may even be in a relatively enviable position. At the same time, an encyclopedia holds itself to a higher standard of reliability than other social web properties--and perception of trustworthiness is arguably more important for a project that prides itself on good information.

As information spreads with increasing speed through social networks, as news outlets struggle to combat lies and propaganda, and as political and government attempts to bias the information landscape become more widespread and complex, how do we as a neutral encyclopedia address the growing threat of disinformation on our own projects?

There are troubling and recent examples: a made-up Nazi death camp [32], inserting pro-gun bias [33], whitewashing celebrity scandals [34], pro-Iranian networks [35], coronavirus disinformation [36], disappearing concentration camps [37], fake chemical plant explosions [38], misplaced blame for missile attacks on planes [39], state propaganda [40], hoaxes about famous accessories [41], false rap album release dates [42], cryptocurrency lies [43], government censorship [44], and historical revisionism [45].

Whatever defenses we have amassed, we are still under constant attack.

Larger Wikipedias are likely doing better than most in robustness and resilience against misinformation, merely due to the size of their active contributor bases. Yet even our larger wikis are not immune from nefarious attempts to bias them, and the vulnerability of smaller wikis, with many fewer editors, patrollers, and tools, is presumably far greater.

And for highly sophisticated disinformation campaigns [46] by paid hackers or intelligence agencies funded by foreign governments, we are not yet in a position to say whether there have been serious attempts, or if so, at what scale. Our vulnerabilities may be overstated [47], or we could be dangerously blind to what is happening already and what is coming. The nature of Wikipedia's anonymous or pseudonymous contributors could allow a well-funded, targeted effort to develop trusted users who could be influential, or to compromise existing trusted users (e.g. admins). We need to do deeper research, because we don't know what we don't know, and covert, sophisticated campaigns would be, well, covert and sophisticated.

When it comes to this massive existential problem, we don't know its scope and we haven't identified the range of solutions. We haven't spoken with enough experts, or synthesized their wisdom and guidance. We have yet to distill community knowledge and connect people working on the same or similar issues from different perspectives. We are at the frontier of a disturbing trend in our digital information age, and we are working blind. We don't know the nature of the problem, we don't have a roadmap for the fixes, and we don't have a story that ties them together.

What is your solution to this problem?

For the problem you identified in the previous section, briefly describe how you would like to address it. We recognize that there are many ways to solve a problem. We’d like to understand why you chose this particular solution, and why you think it is worth pursuing. Remember to review the tutorial for tips on how to answer this question.

This will be an investigative exploration, a narrative project with the goal of practical recommendations for implementation and expansion. The focus is on five core questions:

  • What policies, practices, and tools make some Wikipedias more robust against disinformation?
  • How do we vet neutrality and reliability of sources on articles about polarizing or controversial topics across multiple languages?
  • How are smaller or less active Wikipedias and Wiki communities more vulnerable to disinformation?
  • What trainings and tools would improve our resilience to disinformation attempts?
  • How do we share our expertise about disinformation throughout the movement and beyond?

First, we must understand what disinformation means and how it operates. Then we must investigate the community members whose regular practice involves fighting disinformation (a preliminary list of interviewees will be provided to the grants committee to protect confidentiality). We must look to Foundation staff and outward to external experts who are actively planning ways to combat disinformation. Finally, we must compile, analyze, and recommend actionable interventions.

Targeted solutions range from appropriate policies to citation tagging to community noticeboards to project trainings. One of the fundamental outcomes of this project is to identify different approaches to fighting disinformation, speak with those pioneering the approaches, contextualize where each approach would have the most value, and work to create a vision for how the approaches can work together to address disinformation.

Wikimedia Legal Terms of Use

Our projects need clear prohibitions on the intentional misuse of information. This begins with the Wikimedia Foundation's Terms of Use, which state that "Engaging in False Statements, Impersonation, or Fraud" is a violation [48]. Second, the Foundation implemented a conflict-of-interest policy that requires disclosure of "your employer, client, and affiliation with respect to any contribution for which you receive, or expect to receive, compensation" [49]. To complicate matters, however, national legislative attempts to make disinformation illegal could actually endanger full and accurate coverage of controversial subjects.

Wikimedia Foundation Research

Disinformation is already an active area of research at the Wikimedia Foundation. The Wikipedia Library program, Wikimedia Foundation Research, the Legal and Public Policy team, and the Wikicite initiative are each involved in addressing components of disinformation. An initial dive has been done through a literature review [50] of disinformation. There is a broader plan to address and improve Knowledge Integrity [51]. We now know more about how social media and Wikipedia articles interact [52], how readers use (or don't use) citations [53], which citations are being added and removed [54], and how to detect sockpuppets [55].

Outside Expertise

Outside the Wikimedia movement, numerous organizations have formed or focused their efforts to understand and limit the harm of disinformation. Groups like the Credibility Coalition [56], MediaWell [57], MisinfoCon [58], Meedan [59], Demos [60], the Hewlett Foundation [61], and the Data & Society Research Institute [62] are among dozens looking to tackle this sprawling and complex problem. They are producing their own useful research on various dimensions of the subject ([63] [64] [65] [66] [67] [68] [69] [70] [71]). There has also been a proliferation of fact-checking efforts ([72] [73] [74] [75] [76] [77] [78] [79] [80] [81]). In a tertiary way, Wikipedia is one of them.

Active Editors, Diversity, and Health

Having a sufficient number of informed editors, from a variety of backgrounds, in a civil editing environment, may be the most effective bulwark against disinformation. Larger Wikipedias have more capacity; more diverse Wikipedias have more willpower and resistance against bias; and lower-harassment communities invite more people to contribute positively. These fundamental community components are essential to preventing problems such as a single national narrative becoming the narrative of an entire language version of Wikipedia. Creating an informed, large, diverse, and healthy community may ultimately be more important than any specific or sophisticated disinformation intervention.

Editing Policy

Inside our projects, communities benefit from clear principles that outline the correct ways to handle information and verification. This includes strong consensus on Neutral Point of View [82], Verifiability [83], Reliable Sources [84], Advocacy [85], and Conflict of Interest [86]. Strong communities also have forums to help apply these principles, such as English Wikipedia's Reliable Sources Noticeboard [87], Conflict of Interest Noticeboard [88], and WikiProject Reliability [89].

Capacity and Training

No community arises fully formed, and there is a large role to play in basic capacity development and training. The Community Capacity team [90] at the Foundation has developed basic and useful materials [91] that give editors the understanding they need to apply core principles and policies of neutrality and verifiability. The Community Development team is building "The Learning Platform" to help spread these lessons in an engaging format. Top priority among the modules is basic information literacy. Many editors have simply never encountered concepts relating to finding, evaluating, and paraphrasing sources. Of note, there is a tension between communities that are more accustomed to "orality" and the heavily text-biased nature of Wikipedia--this has systemic bias implications as well.

Citation Categorization

One of the most promising areas for fighting disinformation is better classifying, flagging, and labeling which sources are likely reliable or likely to contain disinformation. While many community members have advanced knowledge of reliable sources, globally there is no way for any one person or project to evaluate all possible reliable sources. We could develop a global news index with ratings by category, perhaps using Wikicite and Wikidata. We could invite librarians or other information professionals to help us in this ranking process. We could rely on outside organizations' indexes [92]. There are already some impressive yet incomplete attempts at distilling citation reliability knowledge ([93] [94] [95] [96] [97] [98] [99] [100]). These are not shared across all projects, however, and none of them are globally representative. An exciting prototype tool [101] could expose these rankings or labels to all readers as an aid for citation literacy.
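To make the idea concrete, here is a minimal, purely illustrative sketch of how a tool or gadget might consult a shared source-reliability index when presenting a citation. The domains, labels, and caveats below are hypothetical placeholders, not an actual Wikimedia dataset or API; a real index would need community-governed ratings, per-topic nuance, and multilingual coverage.

```python
# A minimal sketch of a shared source-reliability lookup.
# All domains, labels, and caveats are illustrative placeholders.
from urllib.parse import urlparse

# Hypothetical index: domain -> (reliability label, topic caveats)
SOURCE_INDEX = {
    "example-journal.org": ("generally reliable", []),
    "example-tabloid.com": ("generally unreliable", ["politics", "health"]),
    "example-blog.net": ("no consensus", []),
}

def rate_citation(url: str) -> str:
    """Return a human-readable reliability note for a cited URL."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    label, caveats = SOURCE_INDEX.get(domain, ("unrated", []))
    note = f"{domain}: {label}"
    if caveats:
        note += f" (use caution for: {', '.join(caveats)})"
    return note

if __name__ == "__main__":
    print(rate_citation("https://www.example-tabloid.com/story/12345"))
    # -> example-tabloid.com: generally unreliable (use caution for: politics, health)
```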

Citation Access

Being able to read citations is a precursor to using good sources, as many of them are unfortunately locked behind paywalls. Projects like The Wikipedia Library [102] give editors more tools to find and use good scholarly and academic reliable sources. Partnerships with organizations like the Internet Archive [103] meanwhile give readers a better chance at being able to look up and verify information in citations. We know from basic research that more open citations are clicked on more often; access is a precondition for dispelling misinformation with sound research.

Algorithmic Assistance

The initial conclusion of the Foundation's literature review on disinformation was that our best opportunity is to help article patrollers [104] through machine-assisted edit scoring--to more efficiently and effectively identify and review potential disinformation. Algorithms have also been developed that help identify which statements likely need a citation [105], and these may integrate nicely with tools that help editors fix those statements [106].
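As a rough illustration of machine-assisted patrolling, the sketch below asks ORES, the Foundation's edit-scoring service, how likely a single revision is to be damaging, and flags it for human review above a threshold. The endpoint path and response shape are assumed from the public ORES v3 API, and the revision id and threshold are arbitrary examples, not a prescribed workflow.

```python
# A hedged sketch of machine-assisted patrolling with ORES edit scoring.
# Endpoint and response shape are assumed from the public ORES v3 API;
# the revision id and review threshold are arbitrary examples.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}"

def damaging_probability(wiki: str, rev_id: int) -> float:
    """Return the ORES 'damaging' probability for one revision."""
    resp = requests.get(
        ORES_URL.format(wiki=wiki),
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    score = resp.json()[wiki]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

if __name__ == "__main__":
    p = damaging_probability("enwiki", 963110639)  # hypothetical revision id
    if p > 0.6:  # arbitrary threshold for routing to a human patroller
        print(f"Flag for patroller review (damaging probability {p:.2f})")
```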

Which of these solutions are most effective? How are they currently employed, on which projects, and by whom? A major goal of this project is to explore each area of intervention to determine its usefulness and its relation to other approaches. The best way to do this is direct conversation with the people leading these efforts.

Project goals

What are your goals for this project? Your goals should describe the top two or three benefits that will come out of your project. These should be benefits to the Wikimedia projects or Wikimedia communities. They should not be benefits to you individually. Remember to review the tutorial for tips on how to answer this question.

I will provide a fuller picture around the problem of disinformation on Wikimedia projects: where it is most severe, and what current factors are making it less so. I will tie together multiple threads of media, research, and community knowledge to create an enlightened narrative to guide discussions and future interventions. As a result, our community will be better informed and prepared to address a growing threat to our reputation and integrity.

I will investigate and distill broader expertise from staff, community members, scholars, and organizations about disinformation through targeted interviews. By deeply researching what the most experienced people on the subject know, I will capture the wisdom, innovations, and leadership residing inside and outside of our projects. As a result, we will highlight our community knowledge and transform that wisdom into actionable insights.

I will provide clear recommendations for most effectively addressing disinformation, with proposed pathways towards implementation. I will compare and connect different approaches and analyze which are appropriate in different circumstances and how they can work together. As a result, staff and community should be able to better envision, plan, and execute next steps for their interventions.

I will share knowledge gained from the project through blog posts and conference presentations. I will use social media to spread learnings and further invite commentary. As a result, more people will know about the findings and know they have a contact who can connect them to resources and other experts.

Project impact

How will you know if you have met your goals?

For each of your goals, we’d like you to answer the following questions:

  1. During your project, what will you do to achieve this goal? (These are your outputs.)
  2. Once your project is over, how will it continue to positively impact the Wikimedia community or projects? (These are your outcomes.)

For each of your answers, think about how you will capture this information. Will you capture it with a survey? With a story? Will you measure it with a number? Remember, if you plan to measure a number, you will need to set a numeric target in your proposal (i.e. 45 people, 10 articles, 100 scanned documents). Remember to review the tutorial for tips on how to answer this question.

Goals | Impacts (outcomes) | Result (measurement)
Provide a fuller picture of the landscape | Our community will be better informed and prepared to address a growing threat to our reputation and integrity | Report is written and published on Meta
Distill varied knowledge from community and experts | Highlight our community's expertise and transform that knowledge into actionable insights | Number of interviews conducted: 25 by voice, 25 additional by text
Provide clear recommendations and a roadmap for interventions | Staff and community should be able to better envision, plan, and execute next steps for their interventions | Report is written and published on Meta
Share knowledge gained | More people will know about the findings and have key contacts who can connect them to resources and other experts | Blog post is written and 2 presentations are given at community or media conferences

Do you have any goals around participation or content?

Are any of your goals related to increasing participation within the Wikimedia movement, or increasing/improving the content on Wikimedia projects? If so, we ask that you look through these three metrics, and include any that are relevant to your project. Please set a numeric target against the metrics, if applicable.

As this is an investigation and research project, the only goals around participation would come at a future grant or project phase.

Project plan

Activities

Tell us how you'll carry out your project. What will you and other organizers spend your time doing? What will you have done at the end of your project? How will you follow up with people who are involved with your project?

Activities | Products | Followup
Intensive research into academic and media papers, studies, and sources | Written report hosted on a Meta Wikimedia project page | Share landscape report on Meta and through community messaging channels, asking for commentary and suggestions
Deep interviews with community leaders and outside experts through both voice and text questions and conversations | Written qualitative synthesis of interviews with high-level findings and key quotes, published on Meta | Share findings with interviewees, offer to make connections to and between interviewees, and invite scholars to interact with our community
Synthesize recommendations for best approaches to fighting disinformation | Written document hosted on a Meta Wikimedia project page | Engage with Meta page project participants and alert the community about developments on messaging channels and social media groups
Share findings widely inside and outside of the community | Blog posts, social media engagement, and conference presentations | Suggest a second round of conversations with community members and external experts

Budget

How will you use the funds you are requesting? List bullet points for each expense. (You can create a table later if needed.) Don’t forget to include a total amount, and update this amount in the Probox at the top of your page too!

Project Element | Cost
Disinformation landscape review | 3,000ドル
Interviews with English Wikipedians | 2,500ドル
Interviews with Stewards and Meta-pedians | 2,500ドル
Interviews with non-English Wikipedians | 3,000ドル
Interviews with Wikimedia Foundation Staff | 2,500ドル
Interviews with external media experts | 3,000ドル
Qualitative synthesis of interviews | 3,000ドル
Review of potential interventions | 2,500ドル
Recommendations for tools and trainings | 2,500ドル
Presentation at misinformation conferences | 3,000ドル
TOTAL | 27,500ドル

Community engagement

How will you let others in your community know about your project? Why are you targeting a specific audience? How will you engage the community you’re aiming to serve at various points during your project? Community input and participation helps make projects successful.

I will create a meta project page about disinformation where I collect research and resources. I will host and publish the landscape review, intervention menu, and actionable recommendations on that page. There will be a place for people to sign up for updates and to contribute to the broader initiative of finding solutions for disinformation.

I will target active editors who do the most patrolling, editors who work on controversial subjects, editors from the Global South, and editors who have experience combating disinformation. Of particular interest are communities that are unusually successful, and those that are unusually unsuccessful, at addressing disinformation relative to the size of their active editor base. These are the people most affected by disinformation and best positioned to expose and address it.

I will leverage social media to disseminate findings widely through blog posts, Facebook groups, mailing list posts, and attendance at community and online information conferences.

Get involved

Participants

Please use this section to tell us more about who is working on this project. For each member of the team, please describe any project-related skills, experience, or other background you have that might help contribute to making this idea a success.

Jake Orlowitz (User:Ocaasi) founded The Wikipedia Library and ran it from 2011 to 2019. By the time he left the program at the Wikimedia Foundation, TWL had a half-million-dollar budget and a 6-person team on 4 continents. Through The Wikipedia Library, Jake developed partnerships with 70 leading scholarly publishers to provide free access to 100,000 scholarly journals and reference texts. 25,000 editors now have access to those sources through the Wikipedia Library Card Platform. Jake created the viral #1Lib1Ref and #1Bib1Ref citation campaigns, which now add 10-20 thousand new references each year from librarians around the world to Wikipedia. He started the Wikipedia Visiting Scholars program, the Books & Bytes newsletter, the Wikipedia + Libraries Facebook group, the Wikimedia and Libraries User Group, and the @WikiLibrary Twitter account.

Jake negotiated the collaboration with Turnitin to fix copyright violations on Wikipedia, started the collaboration with the Internet Archive to rescue 10 million dead citation links, integrated OCLC ISBN citation data into Wikipedia's reference autogeneration interface, and began a project to add Citoid to Wikidata. He developed the OAbot web app, and is a founding member of the Open Scholarship Initiative. He co-released a dataset of Wikipedia's most cited sources and the proportion of free-to-read sources on Wikipedia. Jake created The Wikipedia Adventure interactive guided tutorial and facilitated the first-ever for-credit Wikipedia editing course at Stanford Medical School. He is an English Wikipedia administrator, a two-time Wikimedia Foundation grantee, a former Individual Engagement Grants Committee member, a founding board member of Wiki Project Med Foundation, a former Organizing Committee member for Wikicite, a Linked Data 4 Libraries Program Committee member, and founder of the Wikimedia Foundation's Knowledge Integrity Program.

Jake has presented about Wikipedia, citations, and reliability at five Wikimanias, Stanford University, Internet Librarian, the American Library Association, OCLC, and IFLA. He is a primary author of "The Plain and Simple Conflict of Interest Guide", "Conflict of Interest editing on Wikipedia", "Librarypedia: The future of Libraries, and Wikipedia", "The New Media Coalition Horizon Report for Libraries", "The Wikipedia Adventure: Field Evaluation", "Writing an open access encyclopedia in a closed access world", "The Wikipedia Library: The world's largest encyclopedia needs a digital library, and we are building it", "You're a researcher without a library: what do you do?", the Wikipedia "Research Help" portal, "Why Medical Schools Should Embrace Wikipedia", and the forthcoming Wikipedia @20 chapter "How Wikipedia Drove Professors Crazy, Made Me Sane, and Almost Saved the Internet." He has been interviewed by Publishers Weekly in "Discovery Happens Here" and by the Tow Journalism School for "Public Record Under Threat", and was featured in the documentary "Paywall: The Business of Scholarship".

Community notification

You are responsible for notifying relevant communities of your proposal, so that they can help you! Depending on your project, notification may be most appropriate on a Village Pump, talk page, mailing list, etc. Please paste links below to where relevant communities have been notified of your proposal, and to any other relevant community discussions. Need notification tips?

  • Wikimedia-l
  • Wikipedialibrary-l
  • Openaccess-l
  • Wikicite-discuss-l
  • Facebook: Wikipedia Weekly
  • Facebook: Wikimedia + Libraries

Endorsements

Do you think this project should be selected for a Project Grant? Please add your name and rationale for endorsing this project below! (Other constructive feedback is welcome on the discussion page).

  • Support Support Gamaliel (talk) 19:24, 19 February 2020 (UTC)
  • Support Support. Fascinating! I'd love to be involved, and in any case I think is very needed. Pundit (talk) 19:33, 19 February 2020 (UTC)
  • Support Support. Excellent idea and much needed. SarahSV talk 20:18, 19 February 2020 (UTC)
