Welcome to “Don’t Blink”! Every month we share developments from around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked last month, here are the most important public policy advocacy topics that have kept the Wikimedia Foundation busy.
The Global Advocacy team works to advocate for laws and government policies that protect the volunteer community-led Wikimedia model, Wikimedia’s people, and the Wikimedia movement’s core values. To learn more about us and the work we do with the rest of the Foundation, visit our Meta-Wiki webpage, follow us on X (formerly Twitter) (@WikimediaPolicy), and sign up for our Wikimedia public policy mailing list or quarterly newsletter.
________
Protecting the Wikimedia model
(Work related to access to knowledge and freedom of expression)
Reflecting on Wikimedians’ participation in the Digital Rights and Inclusion Forum
[Read about participants’ reflections in English and French, and find out more about this year’s panels]
This was a big year for Wikimedians at the Digital Rights and Inclusion Forum (DRIF) in Accra, Ghana. DRIF is a conference hosted by Paradigm Initiative where digital policies in Africa are debated and shaped, and partnerships are formed for action.
The Wikimedia Foundation celebrated its third year of sponsoring DRIF by granting scholarships to four Wikimedia volunteers so they could attend the event. The grantees were part of a group of over twenty Wikimedians, including locals and Foundation staff, who shared learnings on topics like preventing electoral disinformation, promoting access to education in offline environments, and supporting the inclusion of women in online spaces. We recently interviewed a few of those attendees, organizers, and speakers to learn about their experiences and the benefits of participating in regional conferences like DRIF.
Participants discussed how they were able to share information about the Wikimedia model with others at the conference. Ceslause Ogbonnaya (Igbo Wikimedians User Group) said:
It always makes me smile when I meet people who call us “ghost workers” because they don’t see the work that we do to add content to Wikipedia, but read our edits on Wikipedia. I enjoy letting them know that we are regular people, just like them, who chose to make a change by documenting free and open knowledge using the Wikimedia projects.
When asked about learnings and insights from their participation, some themes that emerged were the importance of digital security in the run-up to elections and the power of local librarians to become advocates for digital rights and inclusion within their communities. Other participants highlighted the value of the connections they made at the conference, both with other actors in the digital rights space across West Africa and the wider region, and with fellow Wikimedians at a reception generously hosted by Open Foundation West Africa.
For more information, read about participants’ reflections in English and French, and find out more about this year’s panels.
Weighing in on public consultations about Artificial Intelligence
[Read our blog post summarizing the comments]
Recently, various departments and teams throughout the Foundation have worked together to submit comments in response to several governmental and intergovernmental consultations related to Artificial Intelligence (AI), its development, applications, and governance. In the comments and replies we submitted, we shared lessons drawn from Wikipedia’s decades of experience with AI and machine learning (ML).
These submissions include:
- A response to a request for information on AI in the context of international development, conducted by the United States Agency for International Development (USAID). In our comments, we encouraged the Agency to prioritize the inclusion of all stakeholders and to make its consultations as specific as possible, so that the issues are clearly explained and the right voices are heard.
- Input on the regulation of “dual-use” or open foundation AI models to the National Telecommunications and Information Administration (NTIA). In our comments, we argued that making information about AI models and weights more open and available to the public would lead to more benefits than attempting to keep model information locked behind proprietary doors. Those benefits include allowing researchers to identify flaws and vulnerabilities, counteract biases, and improve the performance of AI tools.
- Comments on AI and copyright protections to the US Copyright Office. In our feedback, we emphasized the need for proper attribution when AI systems use Wikimedia projects, including with hyperlinks to sources. Providing attribution this way not only gives proper credit to the authors, but also enables people who use AI systems to access additional information that allows them to verify the answers to their questions.
- Feedback on the United Nations AI Advisory Body’s interim report addressing the global governance of AI. In our comments, we echoed several of the points we raised in our submission to USAID, aimed at improving conditions for a more diverse set of stakeholders to participate in conversations about AI governance.
- Input at several stages of the Global Digital Compact process, including an open letter in which we asked for AI and ML to support and empower people, rather than replace them.
Find out more in our blog post, which summarizes our comments and their common themes.
US Supreme Court rules on important NetChoice legal cases
[Read our analysis of the decision on Diff]
Since 2021, we have been following a pair of cases in Texas and Florida challenging laws that claimed to prevent “censorship” of certain viewpoints on social media and that restricted social media platforms’ ability to enforce their own content policies. NetChoice, a trade association representing large social media platforms and other tech companies, sued in both states to block these laws from taking effect, and federal courts granted preliminary injunctions halting their enforcement.
The cases then made their way up to the US Supreme Court, where the Wikimedia Foundation filed a “friend-of-the-court” brief (.pdf file) explaining how the laws could infringe upon the First Amendment rights of Wikimedia volunteers. While it remains unclear whether these laws would apply to the Wikimedia projects, enforcement of such a law could disrupt Wikimedia communities’ decision-making processes and damage the quality and reliability of Wikipedia by forcing the projects to include non-encyclopedic content.
In July 2024, the Supreme Court issued its decision in these cases. The ruling was ultimately procedural: it held that the lower courts needed to conduct additional analysis before they could rule on the cases, because the laws had not yet been applied. NetChoice argued that these laws were designed to infringe its members’ First Amendment rights to free expression, and that their constitutionality should be assessed on that basis; instead, the Supreme Court directed the lower courts to base their analysis on the full scope of the laws’ possible applications.
In the short term, this means that the two cases will be reconsidered by the lower courts, which will rule on the laws’ constitutionality. Though the laws are still on hold for now, the preliminary injunctions blocking them from taking effect may not last forever. In the long term, the Court’s insistence that lower courts address all possible applications of broad laws in order to determine their constitutionality raises questions about whether it will become more difficult to challenge broadly drafted laws in the future. We will continue to monitor the status of these state laws, and will provide updates on any court decisions that may impact their implementation.
For now, read our full analysis of the case on Diff or learn more about our “friend-of-the-court” brief in this blog post.
Providing input at stakeholder meeting for Global Digital Compact
[Read our open letter following the first draft of the Compact]
As part of the ongoing process to draft and revise the Global Digital Compact, the UN held a meeting in June to present the second revision of the draft compact to stakeholders. Costanza Sciubba Caniglia (Anti-Disinformation Strategy Lead) represented the Wikimedia Foundation, sharing our input on how the draft could better reflect the positive vision we laid out in our open letter to the drafting committee. This was one of the last opportunities to influence the compact before it is finalized, and we are grateful to the UN for seeking input from all stakeholders, including those representing public interest projects like the Wikimedia projects.
Protecting Wikimedia’s values
(Work related to human rights and countering disinformation)
Highlights from the Anti-Disinformation Repository
[Read the blog post series and more on Diff]
The Anti-Disinformation Repository is a collection of tools and activities identified during a mapping exercise that explored how Wikimedia communities and the Foundation address the challenges of disinformation. Throughout June, we took a closer look at three stories from the repository, all of which are powerful examples of how Wikimedia can serve as an antidote to disinformation.
The first story highlights how an initiative on Wikidata has helped to improve public information about victims of political disappearances in Brazil. Wikidata is a free and open knowledge base that can be read and edited by both humans and machines, and it functions as a common source of structured data for other Wikimedia projects. Using Wikidata, volunteers from Wiki Movimento Brasil and partners worked to fix a gap in public recordkeeping about victims of political disappearances between 1964 and 1985, bringing together multiple incomplete records into a single dataset on the project. Hosting the information on Wikidata ensured its integrity and visibility over time, which will hopefully contribute to more truth, justice, and accountability for victims and their families. Read this story on Diff for more details.
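For readers curious about what “read and edited by machines” means in practice, Wikidata exposes its structured data through the public Wikidata Query Service. The minimal sketch below, assuming Python and the requests library, runs a generic SPARQL query against that service; it is only an illustration of programmatic access, not the actual query or tooling used by the Wiki Movimento Brasil initiative.

# Minimal sketch: read structured data from Wikidata via its public SPARQL endpoint.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

# A generic example query: fetch a handful of items that are instances of
# "human" (Q5) via the "instance of" property (P31), with their labels.
query = """
SELECT ?person ?personLabel WHERE {
  ?person wdt:P31 wd:Q5 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,pt". }
}
LIMIT 5
"""

response = requests.get(
    SPARQL_ENDPOINT,
    params={"query": query, "format": "json"},
    headers={"User-Agent": "illustrative-example/0.1"},
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["person"]["value"], row["personLabel"]["value"])

The same public endpoint is what lets other Wikimedia projects and external reusers build on Wikidata as a common, machine-readable source.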
The second entry looked at Wikimedians’ responses to disinformation during the COVID-19 pandemic, which involved a multifaceted approach to providing neutral, fact-based information. Highlights of this work include initiatives like WikiProject COVID-19, a coordinated global volunteer effort to provide reliable resources about the pandemic, as well as official collaborations, including the Foundation’s work with the World Health Organization (WHO) to create accessible public health assets under a Creative Commons ShareAlike license. By coming together on projects like this, Wikimedians showcased their signature commitment to reliable and accurate information at a time when the world needed it most. Learn more about this story on Diff.
Our final story details some of the tools created and used by Wikimedians to counter disinformation on the projects. Volunteer editors are at the heart of addressing false or misleading information on the Wikimedia projects. Over the years, these volunteers have developed multiple tools to help them detect harmful editing practices and safeguard the public from misleading information. Tools that help volunteer editors focus their efforts on information integrity include Huggle, an editing interface that allows volunteers to quickly identify vandalism, and Citation Hunt, a tool that identifies claims in articles that lack citations. Explore this story further on Diff.
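To give a flavor of the kind of signal such tools work with, the sketch below, assuming Python and the requests library, uses the public MediaWiki search API to list a few English Wikipedia articles containing unsourced claims flagged with the “Citation needed” template. It is an illustrative example only, not the actual implementation of Citation Hunt or Huggle.

# Illustrative sketch: find a few articles with unsourced claims via the MediaWiki API.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

# Search for articles that transclude the "Citation needed" template.
params = {
    "action": "query",
    "list": "search",
    "srsearch": 'hastemplate:"Citation needed"',
    "srlimit": 5,
    "format": "json",
}

response = requests.get(
    API_URL,
    params=params,
    headers={"User-Agent": "illustrative-example/0.1"},
)
response.raise_for_status()
for result in response.json()["query"]["search"]:
    print(result["title"])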
Discussing the Global Digital Compact at the 2024 Wiki Workshop
[Explore research and video presentations from the Wiki Workshop, and check out our latest on the Global Digital Compact]
On 20 June, researchers exploring all aspects of the Wikimedia projects gathered virtually for Wiki Workshop 2024. The workshop, which is the largest Wikimedia research event of the year, featured presentations on over forty extended abstracts, seven hall sessions focused on exchanging ideas and community building, and a keynote presentation from Dr. Brent Hecht, a Director of Applied Science at Microsoft.
Costanza Sciubba Caniglia (Anti-Disinformation Strategy Lead) attended the conference and hosted a hall session discussing our campaign around the Global Digital Compact, which involves close partnerships with Wikimedia affiliates and an open letter that had reached 650 signatures at the time of this publication. The session was well attended and provided a fruitful opportunity to share with researchers interested in public policy how we think about and conduct global advocacy campaigns.
To read the research abstracts or watch recordings of the research presentations, visit the Wiki Workshop 2024 website. To learn more about the partnership between the Foundation and Wikimedia affiliates doing advocacy around the Global Digital Compact, check out this blog post.
Discussing AI as a public good at the second Seminar on Big Tech, Information, and Democracy
[Watch the recorded session (in Spanish) on YouTube]
On 6–7 June, the second Seminar on “Big Tech, Information, and Democracy” was held in Bogotá, Colombia, bringing together experts, academics, and civil society to discuss important topics around internet regulation, agree upon possible research agendas, and identify actions that can be taken to advance digital rights. The seminar, which built upon work conducted at the first edition in December 2023, was hosted by the Information and Democracy Forum, Observatorio Latinoamericano de Regulación, Medios y Convergencia (OBSERVACOM), and Intervozes.
Amalia Toledo (Lead Public Policy Specialist for Latin America and the Caribbean) was a panelist in a roundtable discussion called “AI as a Public Good: Ensuring Democratic Control of AI in the Information Space,” which discussed OBSERVACOM’s publication of the same name (.pdf file, in Spanish). In this session, Amalia shared four key lessons from the Wikimedia model that could serve as a guide when developing regulations around AI.
First, any regulation should prevent AI from replacing humans, who are needed more than ever for knowledge building. Second, it should guide the development of these technologies toward more transparent and open ways of working. Third, it should provide guidelines that increase trust by fostering collaborative governance models, or, at the very least, more meaningful stakeholder engagement. Fourth and finally, regulation should offer direction so that these technologies represent and serve a multilingual and diverse world.
Watch the recorded session (in Spanish) on YouTube.
Discussing disinformation responses on Wikipedia in an interview
[Read the interview (in German) on Silicon]
Silicon, a German technology news website, published an interview with Costanza Sciubba Caniglia about fighting disinformation on Wikipedia. In the interview, Costanza emphasized the importance of the “human factor” in ensuring that the online encyclopedia contains neutral, fact-based information.
Wikipedia is shaped by all of the people who work on it, including the hundreds of thousands of volunteers who add content in accordance with the project’s editorial guidelines and the administrators who uphold the rules and procedures that keep information on the projects reliable. Costanza discussed how these consensus-based decisions, which are made public in articles’ history and Talk pages, can create the conditions for high-quality, reliable, and neutral knowledge.
She also highlighted the Foundation’s Disinformation Response Taskforce, which works with established volunteers to quickly identify and report potential information attacks during times of high risk like elections.
Read the interview (in German) on Silicon.
Announcements from our team
The third Global Advocacy quarterly newsletter is out!
In the most recent issue, we explained why the Foundation and Wikimedia affiliates published an open letter calling on UN Member States to commit to protecting public interest spaces on the internet like the Wikimedia projects. Other highlights include: key lessons from Rebecca MacKinnon’s (Vice President, Global Advocacy) interview about Section 230, a cornerstone of US internet law; an update on the reauthorization of Section 702 of the Foreign Intelligence Surveillance Act (FISA) in the US; and reflections on how the Wikimedia projects’ 20+ years of experience shape the public comments we submit to governments and international institutions in relation to AI.
Subscribe to our newsletter for more updates on our public policy work, interviews, upcoming events, and other news from the Wikimedia movement!
________
Follow us on X (formerly Twitter), visit our Meta-Wiki webpage, join our Wikimedia public policy mailing list, and sign up for our quarterly newsletter to receive updates. We hope to see you there!
Can you help us translate this article?
In order for this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?
Start translation