Wikimedia is an antidote to disinformation: Introducing a repository of anti-disinformation projects


“Don’t be a target of hoaxes!” says a participant in an online safety and anti-disinformation training for older people in Quilicura, Chile. Screenshot of Wikimedia Chile video, CC BY-SA 3.0, via YouTube.

During the confusing first months of the COVID-19 global pandemic, reliable sources of information about the disease were hard to find, and even reporting from major news media showed inconsistencies and contradictions. Conspiracy theories, false information, and disinformation campaigns were rampant, sowing confusion and causing immediate, toxic, and sometimes deadly harm to individuals and communities worldwide.

Concerned, a group of Wikipedia volunteers decided to act. They organized task forces; they collated, curated, and fact-checked COVID-19-related information in a single location; they worked in multiple languages, across the world and around the clock. WikiProject COVID-19 was launched at record speed, with its first edits dated 16 March 2020, roughly a month and a half after the World Health Organization (WHO) declared a public health emergency of international concern. By October of the same year, Wikipedia had more than 5,200 articles in 175 languages related to COVID-19, and the WHO and the Wikimedia Foundation established a collaboration to better understand what gaps in information needed to be filled and to provide WHO resources to Wikimedians so that they could continue to fill them.

Through WikiProject COVID-19, the Wikimedia community helped to ensure that readers across the globe had access to timely, trustworthy, and accurate information that was essential for their health and well-being. Without this public interest volunteer effort, people worldwide might have more easily fallen for disinformation, falsehoods, and conspiracy theories that could have cost them and their loved ones their health, or even their lives.

Wikimedia volunteers work daily on the front lines of the search for reliable, trustworthy information: to counter the growing tide of mis- and disinformation online and offline, to reduce knowledge gaps, and to promote and increase knowledge equity. Many Wikimedia communities address these challenges and goals systematically, and have developed initiatives and projects that can provide valuable tools and resources to other communities. Until now, however, information about these important projects has not been collected in one place to show the breadth and depth of activities happening across the Wikimedia communities.

For this reason, in August 2022, the Foundation launched a public mapping exercise to collect anti-disinformation initiatives and tools developed at the local level across Wikimedia projects. Today we are excited to announce a repository of almost 70 projects, which we are confident will help the hundreds of thousands of volunteer editors who contribute to the projects curb false and misleading information worldwide!

The purpose of this repository is to give Wikimedians all over the world access to other communities' projects, fresh ideas and tools, and contact details for people who can support their work to create and share trustworthy information. We also hope that this mapping will help to scale some of these projects, as well as trainings, research, and community operations. Last but not least, our intention is also to provide a picture of just how much work our communities do for trustworthy information: one that helps explain why the Wikimedia communities and projects are an antidote to disinformation.

Building a repository: Why Wikimedia is an antidote to disinformation

The work to create reliable information and combat mis- and disinformation takes many forms. Those forms depend on the particular circumstances of the information ecosystem in which a specific community is immersed, the challenges and problems that arise from it, and the experience and skillset of that community's members. This sometimes means working to close knowledge gaps, since filling in missing information is essential to understanding our world and to preventing the spread of false information. Knowledge gaps risk becoming a liability for trustworthy information, and contradictory reports are dangerous for knowledge integrity. Wikimedia volunteers know this, so they set out to verify and correct the information included in articles and other project content, as well as in the digital databases that are used as reliable and up-to-date primary sources.

Wikimedia projects both contribute directly to the information ecosystem through their open model and rely on this ecosystem being healthy for their content. This means working to support trustworthy sources. It also means helping volunteers and readers better understand the sources and information they share. Numerous Wikimedians recognize this clearly, and have created training and media literacy courses for different groups in various countries. Wikimedia Chile collaborated with Google and the municipality (comuna) of Quilicura in Chile to develop a training called ¡Que no te pille la tecnología! Personas Mayores Informadas (Don't let technology catch you! Informed Senior Citizens), which helped older people stay safe on the internet and covered specific techniques for countering disinformation.

In environments where mis- and disinformation are particularly rampant, Wikimedia volunteers benefit greatly from training in information and media literacy, that is, the critical skills needed to spot disinformation campaigns. In 2020, Wikimedia Ukraine partnered with a local organization, Media Detector, to organize two media literacy trainings in Ukraine. Promoting media literacy as well as transparency is critical for maintaining a healthy and informed public discourse. It ensures that democratic institutions can communicate and function effectively.

These examples show that Wikimedians understand that their work is multifaceted, and that without their commitment and efforts to promote knowledge equity, disinformation would have a much easier time infiltrating our information ecosystem worldwide. This is why we say that Wikimedia is an antidote to disinformation.

Wikimedia volunteers’ tireless efforts to fact-check and increase the reliability of Wikimedia projects have led them to organize numerous initiatives throughout the years to support this aspect of their work. Some communities work with museums and other cultural institutions to expand public knowledge of important moments in their histories. Others develop media literacy programs and hold fact-checking workshops, or collate lists of reliable or unreliable sources for other editors’ reference. Still others develop machine learning (ML) software and bots to automate parts of the projects’ patrolling work. These initiatives, exceptional examples of the strengths and advantages that the Wikimedia model offers in terms of content curation and moderation, play a vital role in fostering a healthy information ecosystem. Furthermore, they provide critical understanding and tools that support other volunteer editors’ work.

Introducing a repository of Wikimedians’ anti-disinformation projects

Through many conversations with Wikimedia affiliates, with researchers working on Wikimedia projects, and within the Foundation itself, we have already identified close to 70 projects that provide valuable lessons and tools.

As we learn of new projects, the Foundation will continue adding them to the repository we have created, which anyone can easily access and discuss on its Talk page.

We hope that this collection will help volunteer editors who are looking for solutions or inspiration, make it easier for these initiatives to be replicated or scaled, and generally support the work of Wikimedians. We are confident that it will create connections and support new volunteer editors who are just joining the Wikimedia movement as well as seasoned Wikimedia volunteers. We hope that showcasing these initiatives and projects together will make Wikimedia communities’ work to curb false and misleading information online clearer to others who are interested in combating disinformation—from researchers to journalists, and from government bodies to civil society organizations. We also trust that it will clarify how volunteers not only protect free and open knowledge on Wikimedia projects, but also help such content flourish across the web.

We recognize that this repository is a partial catalog of the important contributions by Wikimedia communities as well as external organizations that constantly improve the accuracy and reliability of Wikipedia and the other projects. For that reason, please let us know what we missed! If there is any project that you would like to see included, please reach out to the Foundation’s Anti-Disinformation Strategy Lead, Costanza (csciubbacaniglia@wikimedia.org), and/or the Global Advocacy team (globaladvocacy@wikimedia.org). Together we can make sure that Wikimedia continues to serve as an antidote to disinformation!
