Don’t Blink: Public Policy Snapshot for May 2023


Welcome to the “Don’t Blink” series! Every month we share developments from around the world that shape people’s ability to participate in the free knowledge movement. In case you blinked this month, here are the most important public policy advocacy topics that have kept the Wikimedia Foundation busy.

The Global Advocacy team works to advocate for laws and policies that protect the volunteer community-led Wikimedia model, Wikimedia’s people, and the Wikimedia movement’s core values. To learn more about us and the work we do with the rest of the Foundation, visit our Meta-Wiki webpage, follow us on Twitter (@WikimediaPolicy), or sign up to our Wikimedia public policy mailing list.


Protecting the Wikimedia model
(Work related to access to knowledge and freedom of expression)

Wikipedia Designation as a “Very Large Online Platform” (VLOP)
[Read our blog post, and share your feedback]

The European Commission designated Wikipedia as a “Very Large Online Platform” (VLOP) under the new Digital Services Act (DSA) in late April. The DSA places Wikipedia and 18 other platforms with over 45 million “monthly active users” within the European Union (EU) under greater regulatory scrutiny and additional obligations. As a VLOP, Wikipedia must periodically assess any “systemic risks” that it could be contributing to in the EU (e.g., disinformation), and ensure that adequate mitigations are in place. The Foundation is already taking a number of actions necessary to comply with the DSA and demonstrate the level of responsibility that comes with Wikipedia’s size and social impact. As with anything that concerns the Wikimedia movement, community initiative and empowerment will be crucial to meeting this new regulatory challenge. We published a blog post on Diff to explain the VLOP designation in detail, as well as to consult with Wikipedia editors across the EU about how the law affects their work on the projects. The Foundation is requesting help from Wikimedians, within the EU and also worldwide, in shaping how the DSA is applied and enforced for volunteer-run, community-governed projects.

Potential Effects of UK Online Safety Bill (OSB) on Wikimedia Projects
[Read our blog post, and learn how to stay up-to-date]

The current version of the Online Safety Bill drafted by UK lawmakers may require significant changes to how Wikipedia and other Wikimedia projects operate. New obligations foreseen by the Bill include: preventing minors from encountering yet-undefined categories of lawful but “harmful” content; allowing adult users to filter out that same “lawful but harmful” content, as well as other content (neither illegal nor harmful) posted by users who have not gone through identity verification checks; and collecting additional data on users to carry out age verification. The current draft also requires assessing whether certain content is illegal under UK law, or presents risks of being used in connection with other UK criminal offenses, including illegal immigration. Lucy Crompton-Reid, Chief Executive of Wikimedia UK, and the Foundation’s Phil Bradley-Schmieg (Lead Counsel) published a Diff blog post on the draft bill. The post shows the various ways in which the Foundation and, especially, the Wikimedia movement already work to offer reasonable safety to volunteers and readers. Importantly, this is done without compromising user privacy, community autonomy, free speech, or individuals’ ability to pursue their curiosity. The post also explains the steps that the movement is already taking to provide greater clarity on what is unacceptable conduct on the projects. Lastly, it calls on legislators to consider whether these new compliance obligations are necessary for projects such as ours, and shares with the community how to stay informed and/or get involved in advocacy activities around the UK OSB.

Implications for the Wikimedia Model from US Supreme Court Rulings
[Read our blog post and learn more]

On 18 May 2023, the Supreme Court of the United States released its opinions on two related cases with important implications for Wikipedia and other Wikimedia projects: Gonzalez v. Google and Twitter, Inc. v. Taamneh. In resolving these cases, the Court declined to rule on Section 230 of the 1996 Communications Decency Act (CDA), preserving the future of online platforms that enable people to share content on the internet, for now. We noted three takeaways from the rulings for Wikimedia projects as well as free knowledge advocates worldwide. First, the rulings mean Wikimedia volunteers can continue to share free knowledge globally. Second, despite these rulings, threats remain to Section 230 protections for the projects. And third, when considering changes to Section 230, US legislators and courts should not forget Wikipedia. You can read more about the rulings themselves and the three takeaways in the blog post we published after the Court released its opinions.

Protecting Wikimedia’s Values
(Work related to human rights and countering disinformation)

Nobel Prize Summit
[Watch our studio conversation and panel discussion]

In late May, Rebecca MacKinnon (Vice President of Global Advocacy), Costanza Sciubba Caniglia (Anti-Disinformation Strategy Lead), and Stan Adams (Lead Public Policy Specialist for North America) participated in the Nobel Prize Summit in Washington, DC, which brings together laureates, leading experts, and the general public. This year’s theme, “Trust, Truth, and Hope,” provided an opportunity for conversations on how we can combat misinformation, restore trust in science, and create a hopeful future. We attended the summit in order to share why governments and industry must prioritize the safety and security of the people upon whom a trustworthy information ecosystem depends, caution against a “one-size-fits-all” approach to regulating internet platforms, and explain how Wikipedia works to counter disinformation. Rebecca spoke during a studio conversation and on a panel titled “All information is local,” where she highlighted the importance of diversity and inclusion in countering mis- and disinformation, and warned against the above-mentioned “one-size-fits-all” regulatory approach. She reinforced the message that countering disinformation requires a strong commitment to human rights protections, and to security for the least powerful, most vulnerable users.

Digital Rights Asia-Pacific Assembly 2023 (DRAPAC23)

The Digital Rights Asia-Pacific Assembly 2023 (DRAPAC23) brought together changemakers of many kinds from across the region to strengthen solidarity and networks, champion diversity and inclusion, and bridge the media, technology, and human rights fields. The five-day event, which featured 150 sessions, was organized by EngageMedia, a nonprofit organization promoting digital rights, open and secure technology, and documentaries on social issues. Members of Wikimedia Indonesia and Wikimedia Thailand hosted a session entitled “Protecting Online Freedom with Community-Led Content Moderation,” moderated by Rachel Arinii Judhistari (Lead Public Policy Specialist for Asia), which was so well-attended it had to be moved to a larger auditorium! Rachel also spoke on a panel titled “Content Moderation and the Global Majority,” and attended meetings with regional parliamentarians and civil society organizations.


Follow us on Twitter, visit our Meta-Wiki webpage, or join our Wikimedia public policy mailing list for updates. We hope to see you there!

Can you help us translate this article?

To help this article reach as many people as possible, we would like your help. Can you translate it to get the message out?