“Don’t Blink”: Protecting the Wikimedia model, its people, and its values in December 2023

Image collage for the December 2023 issue of ‘Don’t Blink.’ Image by the Wikimedia Foundation, CC BY-SA 4.0, via Wikimedia Commons.

Welcome to “Don’t Blink”! Every month we share developments from around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked last month, here are the most important public policy advocacy topics that have kept the Wikimedia Foundation busy.

The Global Advocacy team works to advocate for laws and government policies that protect the volunteer community-led Wikimedia model, Wikimedia’s people, and the Wikimedia movement’s core values. To learn more about us and the work we do with the rest of the Foundation, visit our Meta-Wiki webpage, follow us on X (formerly Twitter) (@WikimediaPolicy), and sign up to our Wikimedia public policy mailing list or quarterly newsletter.


Protecting Wikimedia’s people
(Work related to privacy and countering surveillance)

Signing a letter to support legislative efforts to reform US mass surveillance
[Read our letter urging US Congress to limit the National Security Agency’s surveillance]

As part of the Wikimedia Foundation’s efforts to move United States (US) Congress toward surveillance reform, we signed a letter alongside the Mozilla Foundation and various other organizations and companies urging legislators to support reform proposals like the Government Surveillance Reform Act (GSRA) and the Protecting Liberty and Ending Warrantless Surveillance Act (PLEWSA).

Furthermore, we asked members of Congress to strengthen these proposals by limiting the scope of surveillance targeting so that fewer people are swept up in the National Security Agency’s (NSA) massive surveillance system. We also warned lawmakers that the continuation of widely documented abuses not only impacts the privacy of people online, but also erodes the trust of communities and individuals in the internet, weakening its economic and social potential. You can read our letter here.

Protecting Wikimedia’s values
(Work related to human rights and countering disinformation)

Input on UN Code of Conduct for information integrity on digital platforms
[Read our submission, available on Wikimedia Commons]

The United Nations (UN) is increasing its focus on digital policies and on threats specific to the online information environment. As part of this work, the UN decided to develop a Code of Conduct for information integrity on digital platforms. The result of this public consultation will be a document released by the UN Secretariat outlining best practices and recommendations to improve the quality of information online, particularly on digital platforms. The Wikimedia Foundation participated in the open call for submissions, offering input based on the Wikimedia model, which is uniquely rooted in transparency, user protection, and community-led content development. For more details, read our input, which is available on Wikimedia Commons.

Discussing the role of generative AI in mis- and disinformation at the NYC Tech Salon

Costanza Sciubba Caniglia, our Anti-Disinformation Strategy Lead, spoke at an event organized by the Technology Salon NYC (TSNYC) titled “What Can We Do About GenAI Fueling Disinformation?” Members of international organizations, corporations, civil society, and academia such as WITNESS, Google, UNESCO, and the Associated Press (AP) attended to discuss the increasingly significant role of generative AI in both propagating and combating misinformation and disinformation. Attendees agreed that it is critical to center global and frontline voices in the prioritization of needs and solutions around these threats, which extend to other closely related technologies such as deepfakes and synthetic media.

The event also represented an opportunity to speak about the use of machine learning (ML) and AI technologies on Wikipedia and other Wikimedia projects. Costanza spoke about how AI works best as an augmentation of the work that humans do, how Wikipedia can be an antidote to disinformation, and how ML and AI technologies can help. In addition, she highlighted the crucial role of Wikipedia data in training large language models (LLMs), and discussed with attendees how to make LLMs more reliable in general and ensure a healthy online information environment.

Protecting the Wikimedia model
(Work related to access to knowledge and freedom of expression)

Wikimedia Foundation files a brief asking US Supreme Court to strike down Texas and Florida laws

[Read the Foundation’s public statement and our blog post on the lawsuits and the brief]

In early December 2023, the Wikimedia Foundation filed an amicus (“friend-of-the-court”) brief with the US Supreme Court supporting legal challenges to laws in Texas and Florida that limit how internet platforms can engage in content moderation. The laws threaten the right to freedom of expression as guaranteed by the First Amendment of the US Constitution, which also protects the right not to speak or express a particular viewpoint. We filed the brief calling on the Court to strike down both laws as unconstitutional.

An amicus brief is a document filed by individuals or organizations who are not part of a lawsuit but who have an interest in the outcome of the case and want to educate the court about their concerns. In our brief, we explained why and how these laws pose a significant risk to the Foundation’s ability to host Wikipedia and other Wikimedia projects, and to the ability of Wikimedians to continue governing the projects as they have for more than 20 years. On the one hand, the laws are too vague and broadly written, and could create unnecessary uncertainty around legal risks for volunteer editors. On the other hand, even if they were written more clearly, forcing websites or the communities that manage them to host speech they do not want would violate their constitutional rights as well as overwhelm the projects with content that is not encyclopedic, neutral, and/or verifiable.

For more information, read the Foundation’s public statement and our blog post on the lawsuits and the brief. 

Discussing the Wikimedia Foundation’s perspective on the EU Digital Services Act

[Read our blog post about the EU’s content moderation law and how it can promote access to knowledge and protect online communities]

In August 2023, the European Union’s (EU) new Digital Services Act (DSA) started to apply to the largest platforms and search engines in the EU, which includes Wikipedia. The law aims to establish common rules that will govern online content moderation and will significantly impact the Wikimedia projects and online spaces worldwide. A subset of DSA rules will apply from mid-February 2024. Worldwide, there is much interest in seeing how the law is implemented and performs in practice, and also how internet platforms will comply and respond.

We believe that the obligations the DSA establishes can not only improve people’s experience of the internet, but also protect their rights online. However, we caution that governments and regulators should be careful not to harm smaller community-led platforms when implementing the new requirements, and warn legislators outside of the EU that they cannot copy and paste the DSA without providing adequate protections for free expression and user rights.

Read our blog post to learn about our engagement in the DSA legislative process, the new obligations that will apply to Wikimedia projects, the law’s global implications, and how governments can promote and protect communities’ ability to own and govern online spaces together.

Discussing the importance of public policy that promotes digital public goods, technology, and the internet

Amalia Toledo, our Lead Public Policy Specialist for Latin America and the Caribbean, attended the “Seminário Big Techs, Informação e Democracia na América Latina” (in English, “Seminar on Big Tech, Information, & Democracy in Latin America”) held in São Paulo, Brazil. The event aimed to discuss the impacts of technology in Latin American and Caribbean democracies, offer proactive agendas on how to curb the growing power of large digital platforms in the region, and contribute with regional perspectives to global debates. Participants at the event included civil society representatives, university students, academic researchers, and journalists from independent and alternative media.

Amalia attended the event in order to share with important policy actors the significance of regulation that can protect and promote digital public goods, technology, and the internet—in other words, how these offer vital frameworks to promote the vision that the Wikimedia Foundation and volunteer communities share of a world where everyone, everywhere, can participate in the sum of human knowledge. In addition, she shared reflections on how the Wikimedia projects can strengthen journalism worldwide. You can read more about the event here (in Portuguese).

Organizing a workshop with the Information Society Project at Yale University

The Wikimedia Foundation partnered with Yale University’s Information Society Project (ISP) and the Center for Democracy and Technology (CDT) to organize a day-long workshop in December. During the event, Foundation staff joined a small group of experts from civil society and academia for in-depth discussions about the past, present, and future of internet regulations. Workshop participants learned more about the Wikimedia model, and those attending brainstormed ideas on how to protect and promote public interest projects like Wikipedia at times when policymakers are seeking to curb the perceived harms of social media platforms and other internet phenomena. 

ICYMI: Explaining why Section 230 is important to Wikimedia projects and the diversity of the online information ecosystem

[Read our three-part blog post series on Section 230: part 1, part 2, and part 3]

Much of the modern internet, including Wikipedia and the Wikimedia projects, could not exist without Section 230. This fundamental US law protects internet platforms from lawsuits over content shared by users online and over the content moderation decisions made about it. The statute provides the legal certainty that empowers volunteer editors to create free knowledge on the Wikimedia projects, and that enables the Wikimedia Foundation to host them.

In the US, there is much discussion, by both legislators and courts, about changing or terminating the law. The Foundation is working to explain how doing so could ruin Wikipedia. We have opposed bills like the EARN IT Act in the past because of their unintended consequences.

Since Section 230 is becoming more widely discussed, we published a blog post series to inform US members of Congress, their staff, and the voters who elect them about some key issues: 1) Why and how websites and services depend on the statute; 2) Misunderstandings and assumptions in legislative reforms that can lead to negative impacts on the online information ecosystem; and, 3) Alternatives that can constructively help to refocus internet regulation and empower communities and individuals to transform the internet for the better.

For more, explore the first post, the second post, and/or the third and final post in our series about Section 230. 


Follow us on X (formerly Twitter), visit our Meta-Wiki webpage, join our Wikimedia public policy mailing list, and sign up for our quarterly newsletter to receive updates. We hope to see you there!

Can you help us translate this article?

In order for this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?