Tuesday, 25 April 2023 was a significant day in Wikipedia’s history: the European Commission designated Wikipedia as a “Very Large Online Platform” (VLOP) under the new Digital Services Act (EU DSA).
VLOP designation, which places platforms under greater regulatory scrutiny, isn’t based on how inherently risky they are, but rather on the size of their readership: Wikipedia is one of 19 platforms estimated to have more than 45 million “monthly active users” in the European Union (EU). Our designation was based on the EU userbase estimates that we published in February 2023 and will periodically refresh. Other VLOPs include Facebook, LinkedIn, TikTok, Twitter, Google Search, and YouTube.
The EU DSA imposes a wide range of obligations on platforms, search engines, and app stores of all sizes. All Wikimedia projects are covered by these baseline obligations. The handful of services that get designated as VLOPs—in our case, only Wikipedia—face supplementary obligations and shorter implementation deadlines. In announcing the initial list of VLOPs, European Commissioner for Internal Market Thierry Breton declared: “With great scale comes great responsibility.”
What changes does the EU DSA bring for Wikimedians?
The EU DSA came into force in 2022, and our implementation work is in full swing, starting with a few necessary Terms of Use (ToU) changes—alongside others that are not DSA-related—that have been under community consultation since February 2023.
In the near term, we expect to include more data in future Transparency Reports, and to make a few procedural tweaks to how we handle Office Actions (i.e., rare cases where the Foundation itself takes a content moderation action, rather than the community doing so) and how we receive complaints.
Starting in late August this year, VLOP status under the DSA brings mandatory “systemic risk assessment and mitigation” (SRAM) obligations for Wikipedia. That means an annual, honest look at whether Wikipedia is contributing to any systemic risks in the EU (for instance, electoral disinformation), and whether the Wikimedia movement is doing its part to mitigate them.
For now, we hope to base that assessment heavily on the existing Human Rights Impact Assessment work, our regular human rights due diligence for specific features or policy changes, and the upcoming Child Rights Impact Assessment. The latter is timely: the safety of young people online is currently a topic of discussion across Europe and in other places around the world.
For the most part, we anticipate that actions already underway—at both community and Foundation level—will help us demonstrate the necessary level of responsibility that comes with VLOP status. The need for our movement to make any further changes (i.e., to introduce and/or refine some mitigations for systemic risks) will depend on how well the European Commission judges us to be addressing those risks. Under the law, our compliance will also be independently audited once a year, starting in 2024, which will begin to give us a better picture of where we should focus our attention in the next few years.
As with all things Wikimedia, community initiative and empowerment remain absolutely critical to meeting the challenge. As we’ll explain below, we certainly welcome suggestions from the community about where important risks lie, and how we can best empower the entire movement to address them.
Will more countries adopt laws like this?
We are observing a global trend, driven primarily by concerns about social media’s impact on society, toward introducing laws like the EU DSA.
Done badly, these new laws can be a threat to the Wikimedia movement and others like it. Certain countries are already using such laws as cover for ideologically motivated content-blocking orders, violating the fundamental rights of both platforms and their users. Others may have nobler intentions, but impose demanding and highly localized compliance burdens—“red tape”—that weigh heavily on organizations with limited resources. This could make it harder for them to compete with the Web’s larger, for-profit technology corporations.
Our hope is that lawmakers considering such laws will emulate the DSA’s drafters by listening to the concerns of Wikimedia communities and taking the distinct Wikimedia model into account. Unlike other laws and proposed bills, the DSA does not apply a “one size fits all” approach to internet platforms. It also preserves, at its core, the all-important notice-and-takedown paradigm for intermediary liability, rather than forcing the platform operator to systematically scan and block user-generated content that may be illegal in particular jurisdictions. The notice-and-takedown model has served Europe well for over 20 years, and it has worked effectively in the United States for handling copyright claims without disrupting our movement. Notice-and-takedown remains fundamental to the emergence and survival of projects like our own. Volunteer editors and communities need to be the main decision-makers; platform operators—in our case, the Foundation—should not be unduly forced into a Big Brother-like role.
Importantly, and thanks in part to the efforts of the Foundation’s Global Advocacy team and Wikimedia Europe (formerly known as FKAGEU or Free Knowledge Advocacy Group EU), the DSA’s rules are tailored to recognize the difference between the moderation decisions taken by platform operators, and the rules enforced by volunteer communities. This understanding is critical to preserving communities’ freedoms to act and govern themselves with limited interference from the platform operator.
For these reasons, we agree with many of the DSA’s core premises. Internet platforms should respect and protect the human rights of everyone who uses them, whether they look up a few facts now and then or contribute hundreds of thousands of edits on a variety of topics. Platforms should conduct risk assessments so that they can understand and prevent potential harms. Platforms should be transparent with the public about how content is moderated, amplified, and/or targeted. Platform rules need to be set and enforced in a fair and accountable manner.
The Foundation, and many Wikimedia affiliates, have made strong human rights commitments, and we are prioritizing work with Wikimedia communities to provide for the safety and inclusion of everyone, everywhere, who wants to access or share free knowledge.
What can Wikimedians do now?
While we think our movement is already doing a good job of meeting the expectations that come with Wikipedia being a VLOP, compliance with the EU DSA is nonetheless a journey into uncharted territory that the Wikimedia movement cannot avoid taking. As we move forward together, there are things that Wikimedians across the EU and worldwide can do to help shape the law’s future enforcement and application.
First, please tell us what you think about the changes we are required to make, how they affect you, and what impact you think they have on your broader community and society.
Second, if you are located in the EU and interested in engaging in dialogue with regulators in Brussels or in your home country about how the law affects you and your community, please get in touch with the Foundation’s Global Advocacy team (globaladvocacy@wikimedia.org) and/or our close allies at Wikimedia Europe (eupolicy@wikimedia.be).