
Welcome to “Don’t Blink”! Every month we share developments from around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked last month, here are the most important public policy advocacy topics that have kept the Wikimedia Foundation busy.
The Global Advocacy team works to advocate for laws and government policies that protect the volunteer community-led Wikimedia model, Wikimedia’s people, and the Wikimedia movement’s core values. To learn more about us and the work we do with the rest of the Foundation: visit our Meta-Wiki webpage; follow us on LinkedIn, X (formerly Twitter), and Bluesky; and, sign up for our quarterly newsletter or Wikimedia public policy mailing list.
________
THANKS FOR YOUR RESPONSES SO FAR! IF YOU HAVE NOT DONE SO YET, PLEASE TAKE OUR SURVEY TO HELP US IMPROVE “DON’T BLINK”!
We want to hear from you, our readers! Do you have a literal minute to spare? Your answers to this short multiple-choice survey about our monthly advocacy recap will help us share the news that is most relevant to you in the format that best meets your needs.
For more information about how we will use your feedback, please read our survey privacy statement.
________
Knowledge is human: Discussing the information ecosystem in the age of AI
The Wikimedia Foundation joined the British Library and Wikimedia UK to host “Knowledge is Human: The Information Ecosystem in the Age of AI.” The event convened experts from the Digital Public Goods Alliance (DPGA), Khalili Foundation, Wellcome Trust, Open Data Institute, 5Rights Foundation, Financial Times, and other organizations to discuss how the rise of AI is changing the way information is collected, curated, and shared, especially in digital spaces. The event highlighted the importance of the humans who contribute knowledge to the world, and allowed participants from journalism, science, cultural heritage, and the open web to strategize about the best ways to adapt to emerging technologies.
Participants from the Foundation and Wikimedia UK discussed their perspectives on the challenges and opportunities that come with AI, and their work to overcome those challenges. On the topic of building an inclusive internet, Fiona Romeo (Director of Content Enablement) shared stories of the work that volunteer communities around the world do to incubate smaller language projects and address knowledge gaps, illustrating the unrecognized human labor in the public interest that lies behind the content that AI systems collect for profit. Franziska Putz (Senior Movement Advocacy Manager) chaired a discussion that highlighted examples of advocacy in action: the people fighting for the policies and principles required to make it possible for every open archive, shared dataset, and digital public good to exist. Birgit Müller (Director of Product) presented on the impact that AI crawler bots have had on the infrastructure of the Wikimedia projects. From Wikimedia UK, Daria Cybulska (Director of Programmes and Evaluation) discussed how the chapter’s initiatives to promote media literacy are essential in an increasingly complicated online information ecosystem, and Lucy Crompton-Reid (Chief Executive) led a session exploring how scientific and cultural institutions can champion openness, accessibility, and transparency.
Learn more about the event and read recaps from Wikimedia UK and speaker Matt Rogerson (Director of Global Public Policy and Platform Strategy, Financial Times).
Supporting access to cultural heritage through the Open Heritage Statement
Wikimedia has signed the Open Heritage Statement, which advocates for equitable and unrestricted access to and use of cultural heritage in the public domain. We endorsed this statement alongside Wikimedia affiliates and other members of the TAROCH (Towards a Recommendation on Open Cultural Heritage) Coalition.
This statement recognizes that public domain heritage is essential for community building, knowledge sharing, and the preservation of cultural and linguistic diversity. It urges policymakers to dismantle legal and technical barriers that restrict open learning and cultural rights worldwide. In line with Wikimedia’s mission to promote free and open knowledge for all, we endorsed this statement to highlight the everyday challenges faced by our volunteer communities and our partners at galleries, libraries, archives, and museums (GLAM) worldwide.
For years, the Wikimedia Foundation and volunteer communities have witnessed outdated and overly restrictive intellectual property laws limiting access to works that should belong to everyone. The statement identifies some of these barriers, including the creation of new intellectual property (IP) rights over public domain materials, the imposition of arbitrary licenses, and the unnecessary use of restrictive technical protection measures. These practices undermine people’s rights to culture and freedom of expression, stifling creativity and free and open knowledge initiatives in the process.
The Open Heritage Statement represents a first step toward establishing a clear, harmonized international framework for the protection and accessibility of public domain cultural heritage. We join others in calling upon UNESCO to recognize the urgency of this issue and begin developing a binding international instrument, such as a Recommendation, on the digitization and open access of cultural heritage.
Read the Open Heritage Statement and our blog post supporting this global call.
Discussing information integrity and democratic resilience at the Paris Peace Forum
The Paris Peace Forum gathers representatives from governments, civil society, business, and international organizations every year to work toward global peace and sustainable prosperity. Bringing together key decision-makers from a wide variety of stakeholder groups across every region of the globe nurtures practical cooperation, dialogue, and collaborative solutions to some of the world’s most pressing problems.
Rebecca MacKinnon (Special Advisor) spoke on two panels to share the perspectives of the Wikimedia Foundation and volunteer communities on two important global challenges. On one panel, Rebecca discussed democratic resilience against the rise of disinformation online. She explained the importance of protecting the people and institutions that do the hard work of discovering, verifying, and sharing reliable information in the public interest. This not only includes dedicated contributors to the Wikimedia projects, but also journalists, academics, and researchers, whose work is an important source both for the Wikimedia projects and the world at large. Rebecca emphasized that threats to these communities—from direct physical violence and propaganda to how AI is negatively affecting their livelihoods—put the world at risk of being overwhelmed with low-quality and intentionally false information online.
Rebecca also spoke on a panel about information integrity related to climate change. She called on governments that have committed to protect and support public interest media to also protect and support Wikipedia along with other digital public goods. While the panel’s moderator referred to “internet platforms” as a category, Rebecca explained that Wikipedia is different from commercial platforms: as a noncommercial, community-governed project, Wikipedia’s sole purpose is to enable everyone to share and access knowledge, unlike commercial social media platforms, whose business models incentivize targeting users with controversial content in order to promote viral engagement and monetize targeted advertising. Volunteer contributors working on climate change-related topics on the Wikimedia projects, including initiatives to improve local and multilingual content about environmental topics, are bound by community commitments with no purpose other than to provide neutral, accurate, and verifiable information in the public interest.
Watch videos of our “Disinformation and Democratic Resilience” and “Information Integrity and Independent Media” panels.
Sharing how the Wikimedia Foundation assesses risk at the Big Fat Brussels Meeting
Wikimedians interested in public policy advocacy gathered once again this year at the tenth Big Fat Brussels Meeting, ready to discuss some of the most pressing legal and public policy issues affecting their work as well as to share information and strategies. This year’s gathering focused on two topics that will have impacts on the Wikimedia movement for years to come: AI and child protection online. During the event, participants discussed themes across the spectrum of AI applications: the commercial reuse of Wikipedia content; the burden of AI scraping bots; and how to address issues like deepfakes and disinformation. Those attending also learned about trends in child protection policy, including a worldwide boom in age verification laws.
Ricky Gaines (Human Rights Policy and Advocacy Lead) and Phil Bradley-Schmieg (Lead Counsel) presented at a special event on the Foundation’s ongoing compliance efforts with the European Union’s (EU) Digital Services Act (DSA). In addition to Wikimedians, this event was attended by regulators, journalists, and political advisors to legislators. Ricky and Phil shared how the Foundation assesses and manages risk, including through human rights impact assessments (HRIAs) like the recently published AI and machine learning HRIA. They discussed some of the highest priority risks that have been identified through these processes—which include disinformation on the projects and the harassment of volunteers—along with steps that the Foundation and volunteer communities have taken to start addressing the risks in question. Finally, the presentation covered some of the difficulties of compliance as a nonprofit organization, such as having to divert resources from development or community support to compliance as well as the inflexible deadlines for reporting compliance to the EU.
Learn more about Big Fat Brussels 2025 and read about our presentation on risk assessment under the DSA (in German).
Promoting information integrity at the World Economic Forum’s Annual Meeting of Global Future Councils
The World Economic Forum’s (WEF) Annual Meeting of Global Future Councils is an event that brings together experts in different fields to address critical global issues. Costanza Sciubba Caniglia (Anti-Disinformation Strategy Lead) is a member of the Global Future Council on Information Integrity and joined the annual meeting to engage in internal planning with the Council and share how the Wikimedia projects promote information integrity.
The Wikimedia Foundation’s perspective is an important bridge at WEF between the interests of nonprofit, public interest internet platforms and commercial social media platforms. Since the work of the Global Future Councils will be used by WEF to inform and shape future agendas at the global, regional, and industry levels, we were enthusiastic about sharing lessons learned about how to protect information integrity with other participants, highlighting that community-led models are also an effective way of meeting this worldwide challenge.
Learn more about the Global Future Council on Information Integrity.
Recommending 10 steps to strengthen stakeholder engagement in the WSIS+20 review
This year it is especially important for Wikimedia to participate in policy discussions about internet governance, like the United Nations’ ongoing review of the World Summit on the Information Society (WSIS) outcomes. In December 2025, as a result of this review, a decision will be made about whether internet governance will continue to be a multistakeholder matter. In other words, whether civil society groups (like the Wikimedia Foundation and affiliates), the private sector, and technical communities can continue to have a say in how the internet works … or whether only governments will get to make those crucial decisions.
The Foundation in partnership with an international coalition of Wikimedia affiliates has been engaging with this process—called WSIS+20 because it reviews the past 20 years. Our shared goal is to ensure that the reviewed guidelines promote human rights online, a multistakeholder model, and an internet that protects community-led spaces. As the year comes to a close and with it the evaluation of WSIS outcomes, we have joined a statement from Global Partners Digital, an international nonprofit that works to advance human rights-based, inclusive, and accountable digital policy around the world. The statement urges the UN and Member States to make sure that stakeholders are meaningfully consulted during this last phase of the review process.
The statement includes recommendations to: organize regular information exchanges with non-governmental organizations; create more permanent advisory roles for these actors in the implementation of the WSIS+20 outcomes; transparently publish new drafts of the outcome document; hold open proceedings; and more. Representatives from civil society, industry, academia, and other non-governmental organizations must be included in this crucial, final phase in order to reach a truly inclusive outcome in December.
Read the joint statement with Global Partners Digital and learn more about how affiliates are engaging around global internet governance.
Centering human rights at the Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE) annual workshop
We recently attended the annual workshop of the Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE), a research center at the University of Palermo that provides its findings to civil society, journalists, government institutions, and academics dedicated to promoting freedom of expression, primarily in Latin America.
During the event, Rebecca MacKinnon joined a workshop dedicated to strengthening multistakeholder engagement in AI governance processes, including the upcoming 2025 India AI Impact Summit. Rebecca described how AI companies fail certain minority language communities, and how important the Wikimedia volunteers and language activists who create content in local languages are to ensuring that these communities are not left behind as technology advances. She called for direct support for these communities as part of broader AI governance and public policy.
Ricky Gaines also attended to present the results of the Foundation’s AI and machine learning HRIA report. In addition, Ricky discussed how best to engage stakeholders when assessing risk as well as the ways in which public policies related to AI might affect the Foundation. He introduced the participants to the Wikipedia Test, a policy tool and call to action used by the Foundation and other allied public interest organizations to evaluate whether a law or regulation might negatively affect the best parts of the internet. If a policy raises red flags when using the Wikipedia Test—that is to say, it fails to uphold important protections for freedom of expression, privacy, and community-led governance—it will likely have a negative impact on other public interest internet websites and services as well. As one participant put it: “If you’re not for protecting Wikipedia, then what do you even stand for?”
Read this recap of a session we attended as Global Network Initiative (GNI) members.
Exploring how journalism and Wikimedia projects benefit each other at Trust Conference and Newsgeist
Journalism and journalists are an essential source of knowledge in the world, both in print and online. One of the most common ways in which volunteer contributors meet Wikimedia projects’ verifiability standard is by citing news media articles, since every fact can then be supported by a reliable source. For this reason as well as shared values, threats to journalism are also threats to the Wikimedia projects.
The Wikimedia Foundation attended two events in October to explain these connections between Wikimedia projects and journalism, and to share how we can address common challenges that our communities face: for instance, the increase in AI bots scraping content online and the rise of censorship around the world.
At Newsgeist, an annual media gathering led for the first time this year by the Center for News, Technology & Innovation (CNTI), Lauren Dickinson (Head of Global Media Relations) and Rebecca MacKinnon gave a presentation titled “Journalism and Wikipedia: Sink or Swim Together,” which explored the symbiotic relationship between Wikipedia and journalism. Lauren and Rebecca shared how strong protections for freedom of expression, including the First Amendment in the US, protect the ability of both journalists and Wikimedians to speak truth to power and share neutral, verifiable facts.
At the Trust Conference held by the Thomson Reuters Foundation, Jimmy Wales (Wikipedia founder and Foundation Board Member) spoke on the conference’s opening panel, “Democracy Under Siege: Upholding Freedom in an Autocratic Age.” Alongside representatives of major institutions safeguarding access to knowledge such as Jelani Cobb (Dean, Journalism School, Columbia University) and Nabiha Syed (Executive Director, Mozilla Foundation), Jimmy shared his perspective about how to build trust and democratic resilience in an increasingly polarized society. Rising disinformation attacks on civil society and a surge in new technologies eroding checks and balances, he noted, are fueling distrust in traditional centers of knowledge and authority. The Foundation has identified the same trends, which are reshaping the online information environment in which Wikimedia projects operate.
The conference spotlighted the similarities between civil society groups like Wikimedia affiliates and local media organizations, which are frontline groups in the information ecosystem and both the first to hold powerful actors to account and the first to be attacked. Strategic lawsuits against public participation (SLAPPs) were a prominent example of attacks levied against journalistic outlets and Wikimedia volunteer contributors. Using Wikipedia as an example, Jimmy called for the broader media community to help reassess the information systems on which we rely and to develop intentional, community-centered technologies—something Wikimedia has long embodied.
Watch a video of our presentation at Newsgeist.
________
Follow us on LinkedIn, X (formerly Twitter), and Bluesky; visit our Meta-Wiki webpage; sign up for our quarterly newsletter to receive updates; and, join our Wikimedia public policy mailing list. We hope to see you there!
Can you help us translate this article?
In order for this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?
Start translation