Good intentions, bad effects: Wikimedia projects and the UK’s draft Online Safety Bill


The Wikimedia Foundation intends to serve the UK’s diverse communities while working to keep vulnerable users safe. However, we must be allowed to retain our existing freedom to do so without compromising the nonprofit, public-interest, volunteer-led model of Wikipedia and our other projects, or our movement’s core values.

The Wikimedia Foundation will not require readers and volunteer editors to verify their ages before accessing Wikipedia unless our own community demands it. We believe that reasonable safety for vulnerable users—regardless of age—can be achieved in other ways.

The UK’s proposed Online Safety Bill (OSB), as currently drafted, may require a number of significant changes to how Wikipedia and other Wikimedia projects operate. Among its more than 250 pages of proposed rules, Clause 11 would require us to “prevent” users under the age of 18 (“under-18s”) from “encountering” certain still-undefined categories of lawful content. Another rule, Clause 9, would require us to prevent all UK users (including adults) from “encountering” content or conduct that looks like it might be illegal under dozens of UK laws (without requiring confirmation of actual illegality).

Recently, many UK headlines (in outlets including the BBC, The Guardian, The Register, and The Telegraph) focused on the proposed “under-18 exclusion” rule and what it could mean for Wikipedia. That media coverage prompted us to share a closer look at those age-specific provisions in this blog post.

To prevent readers and volunteer editors under the age of 18 from “encountering” certain content, Clause 11 of the OSB requires in-scope services to either:

  1. Prevent such content from being uploaded, which means the Foundation would need to tell communities what counts as unacceptable content (i.e., an action that is generally at odds with our movement’s spirit), and then filter that out through proactive, generalized monitoring and analysis of content uploads and edits; or,
     
  2. Age-gate the content, which would mean profiling our users to learn their location (UK?) and age (under the age of 18?), and then denying them access to open knowledge and information on that basis. This option would also mean making adult readers go through burdensome age checks, which could lead many users to prefer alternative, more “fringe” sites that are not subject to the OSB. Their information diets, and our projects (including Wikipedia), would suffer greatly.

Such obligations fail to protect and support volunteer-led, not-for-profit platforms like ours. Worse still, many obligations, including those in Clause 11, are qualified by vague terms, such as a duty to take “proportionate” steps to meet the obligations in question. This offers no certainty that we will ever do enough to comply: The local regulator, the Office of Communications (Ofcom), is free to disagree at any time, and can then impose supplemental measures or penalties.

Reasonable safety is possible without “age-gating”

Since Wikipedia’s creation, the Wikimedia projects have been subject to many laws allowing government authorities and the public to report illegal content, which can then be analyzed (e.g., Is it actually illegal? Would removing it be consistent with website policies and human rights?) and, if found to be problematic, removed. In practice, most problematic content is dealt with directly by the projects’ own users. This is a major benefit of our projects, compared to traditional social media.

More recently, laws worldwide have started to add general duties of care towards vulnerable users, sometimes specifically concentrating on children. One example is the simple, but potentially powerful, Article 28 of the EU Digital Services Act (EU DSA). It requires reasonable child safety, but importantly for us, it does so without specifying the measures to take. It is also highly privacy-protective, unlike the UK OSB, which instead emphasizes age-based discrimination.

Wikimedia projects already offer reasonable safety, and there is more to come

Our movement’s work to make sure our projects are accessible and welcoming to new volunteers and readers precedes the EU DSA, UK OSB, and their other regulatory siblings. With the Foundation’s support, volunteer editors and communities themselves have shaped the rules and governance of Wikimedia projects: carefully debating content policies, moderating most project content, and building tools that help them do their work more effectively.

The Foundation’s Trust and Safety team is available to support volunteers if community mechanisms are unable to address and solve issues such as the physical safety of volunteers and readers, or disinformation. By arrangement with our communities, our Trust and Safety team also plays an important role in keeping child sexual abuse material (CSAM) off our platforms, and reporting it to the appropriate authorities.

The Foundation also employs a team of human rights professionals whose work is informed by a detailed, externally-conducted Human Rights Impact Assessment (HRIA). The HRIA produced many recommendations that are shaping our work and strategy. That same line of work will soon deliver a Child Rights Impact Assessment, focusing specifically on risks (and, conversely, on our projects’ value) to under-18s. We’re also supporting community efforts to combat disinformation and undisclosed paid editing (UPE) on our platforms.

Our movement is not stopping there. The ongoing rollout of the Universal Code of Conduct (UCoC), backed by modernized Terms of Use (ToU), provides greater clarity on unacceptable conduct across all Wikimedia projects, big or small. The Foundation is working on a new Incident Reporting Tool, and will be updating its Office Action (i.e., notice-and-takedown) processes for ease of use. We’ll be engaging in an annual project to assess the systemic risks related to use of Wikipedia, and will propose additional mitigations where appropriate.

More importantly still, these recent efforts build on over twenty years of sustained community and Foundation attention to making our projects—such as Wikipedia—safe and inviting places. “No personal attacks” has been a Wikipedia policy since its early days. We also don’t engage in for-profit behavioral advertising, or try to trap users in addictive content-consumption loops: while we want to encourage participation on the projects, it’s not at the reader’s expense, whether in time, money, or personal data.

The most crucial point is this: We recognize that it is not the Foundation that can take the most credit for our projects being wonderful to use and to participate in. That comes from our movement’s grassroots strength and resilience. So the Foundation’s efforts must at all times respect and encourage the volunteer community’s governance and participation, and maximize the ability of good, altruistic people to do what they do best: discuss and decide, for themselves, what should be on the projects. Laws should not force the Foundation to become the main decision-maker.

It is also important to note that our projects make their content freely available for reuse by anyone, anywhere, supported by extensive application programming interfaces (APIs) and even services that offer customized solutions to specific needs. This means it would be possible for schools and other interested parties to develop their own interfaces to Wikipedia project content, filtering as they see fit. If that is a genuine and popular need, there is nothing preventing market mechanisms or civil society groups from answering it—without the need for government mandates, or an impact on the core platforms.
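To make that concrete, here is a minimal sketch of what such third-party, filtered reuse could look like: a small Python script that fetches article summaries through the public Wikimedia REST API (the page-summary endpoint under en.wikipedia.org/api/rest_v1/) and applies its own, reuser-defined filter before showing anything. The blocklist and filtering policy below are purely hypothetical placeholders, not anything Wikimedia provides or endorses.

```python
# Minimal sketch of a third-party "filtered reader" built on top of the
# public Wikimedia REST API. The filtering policy is a hypothetical
# placeholder -- a school or civil society group would define its own.
import requests

SUMMARY_API = "https://en.wikipedia.org/api/rest_v1/page/summary/"

# Hypothetical, reuser-defined blocklist (not part of any Wikimedia API).
BLOCKED_TITLES = {"Example article a reuser chooses to hide"}


def fetch_summary(title: str) -> str:
    """Fetch the plain-text summary of a Wikipedia article."""
    response = requests.get(
        SUMMARY_API + title.replace(" ", "_"),
        headers={"User-Agent": "filtered-reader-demo/0.1 (contact: example@example.org)"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("extract", "")


def filtered_summary(title: str) -> str:
    """Apply the reuser's own policy before returning any content."""
    if title in BLOCKED_TITLES:
        return "This article is not available in this reader."
    return fetch_summary(title)


if __name__ == "__main__":
    print(filtered_summary("Wikipedia"))
```

Because the filtering in such a tool happens entirely in the reuser’s own interface, Wikipedia itself, and readers using any other interface, remain unaffected.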

Well-intentioned laws and regulators rightly demand that the owners of large commercial websites do more to ensure their users’ wellbeing. However, legislators, regulators, and judges should not demand the same from our projects, which are deliberately not governed “from the top”, and which are not trying to maximize profits.

The Foundation, rather than being forced into the Big Brother-like role these laws are tailored for, should be allowed to continue in its current roles, both as a catalyst for community-driven improvements and a safety net for when communities require our input. The UK OSB threatens this.

There is a very real risk of new laws becoming a heavy distraction from our movement’s existing and ongoing efforts to enable everyone, everywhere, to have free access to the sum of all human knowledge. Those efforts are harmed by conflicting requirements, extensive reliance on secondary rulemaking (requiring constant engagement from our Global Advocacy team and local chapters), and heavy bureaucratic burdens. This takes time and other resources away from the efforts of Foundation and affiliate staff and volunteer communities to actually improve the content on the Wikimedia projects.

Other concerns with the UK OSB

The UK OSB’s child safety rules are by no means the full extent—nor even the focus—of our concerns with the draft law.

We have written in the past about our broader UK OSB concerns. Despite ongoing Parliamentary scrutiny, the concerns remain and, in some cases, are growing even stronger. For example, OSB Clause 8(5) would require the Foundation to specifically assess the risk of Wikimedia platforms and content being used for a range of UK criminal offenses. New amendments would include “abetting” illegal immigration into the UK as one of those offenses. Are sailing directions from France to the UK helping people smugglers? What about the sentence “If migrants arrive in England through illegal means, upon arrival the UK Government is unlikely to reject their claims to asylum,” which is found in this Wikipedia article?

Following such an assessment, the Foundation must, wherever “proportionate,” prevent adults from accessing those systems and content, either through geoblocking (Clause 9(2)(a)) or deletion. It must do this even if nobody has raised concerns about that content (Clause 9(3)(a)). Ofcom, the UK regulator, will issue guidance on what is “proportionate,” and can punish platforms if it believes they’ve got the balance wrong.

These are bad obligations to impose on nonprofits, especially those operating globally across hundreds of languages, national jurisdictions, and cultures. We are therefore calling on legislators to consider whether all of these new compliance obligations are genuinely necessary and proportionate for projects such as ours. We call upon them to either fix the underlying requirements, for everyone, or consider tailoring their laws to focus exclusively on the Web’s most genuinely problematic platforms.

To stay informed—or get involved—please subscribe to the UK mailing list for Wikimedians and to the Wikimedia public policy mailing list, which will be posting updates on the UK OSB’s progress and both the Foundation’s and Wikimedians’ advocacy activities.
