“Don’t Blink”: Protecting the Wikimedia model, its people, and its values in April 2025

Image collage for the April 2025 issue of ‘Don’t Blink.’ Image by the Wikimedia Foundation, CC BY-SA 4.0, via Wikimedia Commons.

Welcome to “Don’t Blink”! Every month we share developments from around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked last month, here are the most important public policy advocacy topics that have kept the Wikimedia Foundation busy.

The Global Advocacy team works to advocate for laws and government policies that protect the volunteer community-led Wikimedia model, Wikimedia’s people, and the Wikimedia movement’s core values. To learn more about us and the work we do with the rest of the Foundation, visit our Meta-Wiki webpage, follow us on LinkedIn or on X (formerly Twitter), and sign up for our quarterly newsletter or the Wikimedia public policy mailing list.

________

Knowledge is human: Foundation’s new three-year artificial intelligence (AI) strategy puts humans first
[Find out more about the Foundation’s AI strategy]

The Foundation has published a new AI strategy that celebrates and centers the volunteers behind Wikipedia’s success. This strategy outlines how the Foundation plans to approach the rapid expansion in the development and use of AI, including what kind of tools we will create for the Wikimedia projects. In the announcement, Chris Albon (Director of Machine Learning) and Leila Zia (Director and Head of Research) share how the Foundation will prioritize helping the volunteers who create Wikipedia to automate tasks and workflows in order to have more time to create, share, and curate knowledge. The Foundation intends to:

  • Support Wikimedia moderators and patrollers through AI-assisted workflows for repetitive tasks,
  • Improve how information is discovered on Wikipedia,
  • Automate translation of common topics so Wikimedia contributors can spend more time sharing their local expertise, and
  • Provide guided mentorship for new Wikimedia volunteers to help with onboarding.

The AI strategy reflects deeply held Wikimedia values and principles. Any future work with AI will prioritize human agency and decision-making, transparency, and the use of open-weight and open-source AI models, as well as preserving the fundamentally multilingual nature of Wikipedia and other Wikimedia projects.

We are not the only organization thinking about how AI may influence the future of sharing and accessing knowledge on the internet. Ongoing conversations around AI will have significant impacts on the future of free knowledge and the online information ecosystem that feeds public interest projects like Wikipedia. At a recent lunch lecture at Columbia University, Stan Adams (Lead Public Policy Specialist for North America) joined the General Counsel of the New York Times to discuss how generative AI affects journalism, traditional news media, and the services that rely upon those sources.

Reporting estimates that “Wikipedia is probably the most important single source in the training of AI models” and that it is one of the three biggest websites used for English-language AIs. At the same time, Wikipedians themselves rely on outside sources like newspapers and academic journals to support their work on the project. Preserving a healthy ecosystem of human information creation, gathering, and sharing in the face of widespread adoption of technologies like generative AI is fundamental.

There are two other concerns worth mentioning. The first is the proliferation of AI-generated content online, oftentimes low-quality and riddled with errors, which has some people concerned about threats to information integrity. The second is a technical phenomenon called model collapse: when AI models are trained on content generated by other AI, the quality of the new model’s output degrades over time. These trends are why it is so important that our new AI strategy prioritizes helping our volunteer editors do their work, and that our positive vision for the future of the internet includes protections for knowledge coming from journalism and academic research.

Find out more about the Foundation’s AI strategy.

Sharing lessons from Wikimedians at digital rights conferences around the world
[Read our RightsCon reflections, and learn more about Wikimedians’ participation at the Digital Rights and Inclusion Forum (DRIF) in this Diff blog post]

Attending conferences focused on digital rights is one of the many ways that Wikimedians can share their knowledge and expertise with the world, and also advocate for policies that protect and promote public interest projects like Wikipedia. Wikimedians are often in the best position to discuss their work and the policies or regulations that affect them, and joining international digital rights forums like these helps them bring unique perspectives to others who work on these topics across the globe.

In April, Wikimedians from across Africa attended the Digital Rights and Inclusion Forum (DRIF), a conference hosted by the Paradigm Initiative and held this year in Lusaka, Zambia. Wikimedians’ presentation topics included: empowering women and communities through digital inclusion; using technology to promote information integrity and keep people informed during elections; and setting an example for inclusive artificial intelligence by amplifying underrepresented voices on Wikipedia. This is the fourth year that the Foundation has sponsored DRIF; each year the connections made and lessons learned have proven valuable to the work Wikimedians do all year round. By supporting this event, we also contribute to a broader ecosystem of digital rights activists and civil society organizations who work tirelessly to shape inclusive and rights-respecting policies for the internet.

This month, we also shared reflections from Wikimedians who participated in RightsCon, a conference focused on human rights in the digital age that took place in Taipei, Taiwan, this past February. The Wikimedians attending the conference represented affiliate groups and allies from Africa, Australia, Taiwan, Uganda, and the United Kingdom. In their reflections, they had a lot to say about what they learned and what they were able to teach others. Reflecting on his experience, Liang-chih Shang Kuan (Wikimedia Taiwan) said:

I was inspired to consider how my insights from RightsCon can be incorporated into ongoing Wikimedia projects. One way […] is preserving Indigenous languages. It was motivating to see how Wikimedians’ work on this issue resonates with a global audience, which highlights the need to continue investing in building linguistically inclusive online communities. […] RightsCon offered a valuable platform to engage with key stakeholders, including human rights activists and journalists, whose work aligns with Wikimedia’s mission.

Our blog post contains more insights from community members as well as reflections from the Foundation’s Global Advocacy team, who highlighted the urgency of protecting digital rights in our current era of shifting geopolitical dynamics and accelerating technological change. In addition, the team shared insights from a global conversation about the need for creative approaches to fundraising for digital rights organizations and for a positive vision for information integrity.

Read our RightsCon reflections, and learn more about Wikimedians’ participation at the Digital Rights and Inclusion Forum (DRIF) in this Diff blog post.

Discussing human rights impact assessments (HRIAs) at Wikimedia Europe’s (WMEU) General Assembly
[View our presentation about how we identify risks as a part of complying with the EU’s Digital Services Act (DSA)]

Wikimedians from across Europe and members of the Global Advocacy team joined WMEU for their annual General Assembly, held in Prague, Czechia. At the event, participants gathered to discuss the logistics of coordinating among affiliates, present research about how legal regulations affect Wikipedia’s impact, and share work on important topics ranging from promoting open cultural heritage to AI. 

Ricky Gaines (Human Rights Policy and Advocacy Lead) and Phil Bradley-Schmieg (Lead Counsel) led a discussion on how the Foundation has approached assessing systemic risk as required by the EU’s DSA. They gathered ideas and feedback from the experienced Wikimedians and affiliate leaders who participated about the risks they experience in their work and how to consider these in future risk assessment exercises. Under the DSA, Wikipedia is the only nonprofit website that has been designated as a Very Large Online Platform (VLOP), and it is also the only platform to have received a “positive with comments” result during the first independent audit of our compliance with the DSA. The global human rights impact assessments (HRIAs) we have already conducted—which include our organization-wide HRIA, child rights impact assessment, and soon-to-be-published AI and machine learning (ML) HRIA—help us to identify areas where we can improve the safety and integrity of our projects, while providing a transparent record to hold the Foundation accountable to its commitments. 

The WMEU General Assembly marked an important opportunity to celebrate the hard work of Wikimedia communities across Europe, and to build closer collaboration among affiliates to help navigate rapidly shifting legal, technological, and political landscapes that present new challenges for unique community-led, public interest projects like Wikipedia.

View our presentation about how we identify risks as a part of complying with the EU’s Digital Services Act (DSA).

Defending privacy for electronic communications in Snap v. Pina legal case
[Read our Diff blog post about the lawsuit and the amicus brief we cosigned]

The Foundation recently filed a “friend-of-the-court” or amicus brief in a case that could have significant implications for the electronic privacy of internet users in the United States (US). The case, Snap v. Pina, involves a dispute about what kinds of private information a court can force online services and platforms to provide during a criminal court case. Here, the defendant in a murder trial requested a subpoena—that is, a demand with the power of the court behind it—for electronic communications that took place on platforms owned by Snap Inc. and Meta Platforms, Inc. Both companies argued that the 1986 Stored Communications Act (SCA)—a leading electronic communications privacy statute in the US—would prohibit service providers from disclosing a user’s private data to anyone. The Court of Appeals of California ruled otherwise, and held that the SCA does not apply to private electronic data held by a platform host when said host uses that data for its own “business purposes,” for instance, targeted advertising.

We cosigned an amicus brief challenging the court’s “business purpose” interpretation; our aim is to get this important legal precedent right, not to support any specific party to the dispute. The ruling could have severe consequences for privacy online: it would weaken the ability of US-based services and platforms to push back on both domestic and foreign requests for data, removing some of the steps that require cooperation between US and foreign authorities before a search warrant can be issued. This is especially important at a time when foreign governments, especially authoritarian ones, are increasingly pressuring technology companies to comply with their demands, legal or otherwise. We therefore asked the court to overturn this new “business purpose” theory of the SCA and protect the privacy and safety of internet users everywhere from governmental repression.

Read our Diff blog post about the lawsuit and the amicus brief we cosigned.

________

Follow us on LinkedIn or on X (formerly Twitter), visit our Meta-Wiki webpage, sign up for our quarterly newsletter to receive updates, and join the Wikimedia public policy mailing list. We hope to see you there!

Can you help us translate this article?

In order for this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?