2024 Wikimedia Foundation litigation review: Defending the Wikimedia projects and volunteers in court, and shaping the future of the internet

In 2024, the Wikimedia Foundation participated in legal cases both as a defendant and through impact litigation (i.e., lawsuits that aim to bring legal change in the public interest). The Foundation looks globally for opportunities to legally protect free knowledge and the Wikimedia volunteers who contribute it, and to clarify what material is part of the shared knowledge commons. This blog post highlights some of our work last year to protect free and open knowledge broadly, and the Wikimedia volunteers and projects in particular.

A photograph of the Justice statue by John van Nost the Younger on the Gates of Fortitude and Justice at Dublin Castle
The Justice statue by John van Nost the Younger on the Gates of Fortitude and Justice at Dublin Castle.
Image by Jan-Herm Janßen, CC BY-SA 3.0, via Wikimedia Commons.

The Wikimedia Foundation’s mission is to empower and engage people around the world with the aim of developing free and open encyclopedic content for everyone, everywhere, to access and share. One way that we advance this mission is by working to improve the laws and regulations that make it possible to host Wikipedia and other free knowledge Wikimedia projects along with those that enable Wikimedia volunteer communities to effectively collaborate and contribute to the projects.

There are two key ways in which our litigation efforts seek to ensure that the legal landscape protects and promotes contributions to free and open knowledge and the community-led governance and content moderation model of the projects. First, as the projects’ host, we respond to legal demands related to the projects in order to defend them. Second, we engage in targeted impact litigation—strategic legal filings that serve to enact legal change in the public interest—by, for example, submitting amicus (“friend-of-the-court”) briefs in important, precedent-setting lawsuits that concern third parties across the world and could affect the projects’ content or volunteers. This blog post explains the strategy behind our litigation efforts at a moment when, on the whole, the lawsuits in which we have participated show a more complex picture of the legal environment than we have ever seen in the past.

There is presently a trend of increased government oversight and enforcement of laws that challenge the concept of a free, global, and open internet. This trend has begun to change the types of legal demands that are brought to the Foundation and the expectations of countries around the world: not just for the Foundation, but for all platform hosts. In 2024, our Legal department was involved in litigation both as a defendant and as a third party engaging in impact litigation globally. We engaged in a range of cases covering intermediary liability protections, defamation and freedom of expression, and the right to privacy. These include some cases we believe are Strategic Lawsuits Against Public Participation (SLAPPs)—lawsuits designed to censor information on legitimate matters of public interest—approached from both impact and defensive litigation standpoints. In all of these cases, we aim to protect and advance freedom of expression, the public interest served by the Wikimedia projects and similar community-led public interest platforms, and the benefits of a reliable and accurate online information ecosystem.

Defending the Wikimedia projects and their volunteers, and shaping laws to safeguard and promote free knowledge

Defensive litigation: Protecting the Wikimedia projects and volunteers in court

The Foundation and Wikimedia volunteer communities take great care to keep the Wikimedia projects from becoming subject to legal demands. Nevertheless, court cases based on content on the projects do occur. When lawsuits come to us from anywhere in the world, we undertake defensive litigation to protect the projects and the people who edit them in good faith. We also look for ways to clarify the law in order to help volunteers, wherever they may live, understand what material is available as free knowledge and what material might be restricted or unlawful.

When we get a new case, we always aim to make sure we understand the content being fought over, because the quality of community-led work is a key aspect of protecting the projects. For instance, when controversial parts of a Wikipedia article are written in line with a volunteer community’s policy on a neutral point of view and supported by reliable sources, the Foundation is typically well-positioned to defend both the content and the people who wrote it. On the other hand, if the work is unsourced or its sources are inaccurate, the Foundation is very rarely positioned to be able to independently investigate and prove its accuracy, making it difficult to defend.

Similarly, if content on Wikipedia, Wikimedia Commons, Wikidata, or other projects offers details about a person’s private life, it is important that we can demonstrate that it is valuable to the public. It is therefore crucial that the volunteer communities continue to define, debate, and apply balanced standards around proportionate respect for private life (e.g., on Wikidata and English-language Wikipedia), the educational value of content, and using imagery of identifiable people—as they have so successfully done for the past 24 years. Particularly where robust and fair community evaluations are done in the open, such as on Talk pages and in Deletion discussions, the Foundation’s lawyers will be able to show our legal opponents, regulatory authorities, and judges that privacy and public interest have already been comprehensively weighed up and that, on balance, the law should protect the contributions of Wikimedia volunteers.

Lawsuits in 2024

Over the past year, the Foundation has been the subject of several lawsuits and legal demands. Our transparency reports show how many legal demands we receive, distinguishing between those related to content and those related to user data. While the vast majority of legal complaints to the Foundation do not result in litigation, the legal situation for hosting a platform has begun to change significantly. The Wikimedia model does much to help resolve legal demands, since we can educate potential opponents about the community-led processes that can address their concern—like the Edit Request Wizard on English-language Wikipedia—and, on occasion, can even relay the complaint directly to community members on the requester’s behalf. A small number of cases present a clear legal obligation for the Foundation, resulting in some legal demands being granted as required by law. To date, only a handful of complaints each year have led to litigation.

It is important to note that some defensive litigation cannot be publicly discussed: this varies based on the different norms of different countries and the nature of the case. For instance, the Foundation may have limited ability to repeat accusations that a court has already found to be defamatory or may not be able to discuss details of cases still in active litigation. Thankfully, there are a few cases where we can highlight some key victories protecting the projects and volunteers.

The Foundation had several legal victories in Germany in 2024. We won a privacy case against a wealthy gambling magnate who, in what we consider a SLAPP based on vague claims of potential harm, tried to substantially expand the scope of German and European privacy laws in order to force us to censor factual and neutral references to him on Wikipedia. We also won a key case that will dissuade people all over the world from trying to take advantage of strict German defamation rules, in what we think is one of a few obvious “forum shopping” cases that we have recently faced.

“Forum shopping” is the attempt to choose a court in a jurisdiction whose laws might be more favorable to the plaintiff. We defeated another such case in the UK, brought by a former lawyer who lost his defamation lawsuit against us in the England and Wales High Court and then tried to appeal the decision. In March 2025, the Court of Appeal resoundingly rejected his attempt and endorsed the High Court’s ruling, including an important finding: a 12-month statute of limitations can still protect Wikipedia articles that have been edited or reverted, as long as the result remains “substantially the same” as the disputed passages first written up to a year earlier. Lord Justice Warby declared many aspects of the former lawyer’s case “totally without merit,” and warned that the plaintiff could receive a civil restraint order if he threatened to bring other meritless claims.

We also had some key successes related to user data this year, although we do not write individually about these cases so as to avoid inviting speculation or attempts to identify users via case details. Broadly, the Foundation had two cases in Japan where we were successful in protecting against the disclosure of user data. Similarly, we had a case in Brazil in which we were able to protect against the disclosure of users’ data. In India, the Foundation reached an agreement in litigation to continue a case without providing user data to the other party.

The César do Paço case in Portugal, which has slowed down immensely on questions of Portuguese law before its higher court, remains open and continues to present a range of issues related to privacy protections for users as well as protections for platform hosting. This is another case that we have previously identified as a SLAPP attempt. In this case, the Foundation is working to ensure that editors are safe to work on and improve important biographies, including about political figures and financiers.

Finally, in India, we were also successful in seeking the dismissal of a lawsuit regarding a BBC documentary about India’s Prime Minister, Narendra Modi. The content in question was never hosted on Wikipedia in the first place; it was only linked to as a source.

We also had some setbacks this past year. In France, we had two lawsuits in which French courts differed with the views of French volunteers, leading the Foundation to delete two articles based on our legal obligations under French law. We also began discussing how to approach these cases with the French community. We often encourage local volunteer communities to work on material that may be subject to litigation: communities nearly always have more options than the Foundation if they do want to work on articles subject to court orders. For example, a partial deletion and partial rewrite may comply with a court order, whereas the Foundation never writes new article content in response to a court and can only delete an article based on an order.

We have several active cases at the moment as well, which have not been fully decided. In India, the Asian News International (ANI) case remains ongoing, with the next hearing delayed until May 2025. In Italy, we are litigating an order to “noindex” a Wikipedia article related to a senior member of the Vatican who was accused of possessing child pornography. This is important for knowledge sharing, as de-indexing has become a new approach for people trying to keep information out of public awareness by means of the “right to be forgotten”—with the idea that if content stays out of searches, then it is effectively secret from the public, even if it remains published somewhere. We also have a few more open cases in France and Germany, and one that began in Ukraine, to which we were added mid-case.

Meanwhile, we are keeping a very close watch on a new wave of “online safety” laws around the world—designed with good intentions, but often ill-adapted to the diversity of services online. Wherever such laws might represent a threat to the Wikimedia model, it is possible that some legal questions may need to be answered by courts.

Impact litigation: Bringing about legal change in the public interest

The strategic legal filings that the Foundation submits to courts around the world aim to enact legal change in the public interest in ways that promote and protect the Wikimedia projects and volunteer communities that contribute to them.

Our mission covers many important aspects of the law, which is why the amicus briefs that we file cover a large variety of topics. For instance, we seek to ensure that content that is created, shared, curated, and moderated online remains accessible to those who are interested in it. In addition, we try to ensure that laws and regulations do not encroach on freedoms that protect the users of the projects or their ability to research and share free and open knowledge. For example, we advocate that good laws should not be used to curb freedom of expression in SLAPPs. And, last but not least, we strive to make sure that good legal standards, such as liability protections for platform hosts, remain consistent throughout the globe in order to actually protect the platforms in question.

Amicus briefs in 2024

NetChoice, LLC v. Paxton and Moody v. NetChoice, LLC are two of the most important recent cases in the United States (US) for Wikipedia’s community-led governance and content moderation model, along with the right to freedom of expression. In 2021, Texas and Florida enacted state laws that were designed to restrict social media platforms’ ability to enforce their own content policies. These laws were a response to certain high-profile content moderation decisions taken by large social media platforms, which the states alleged constituted censorship of some users’ viewpoints. The Foundation filed an amicus brief at the end of 2023 at the US Supreme Court, where we argued that laws restricting community-led content moderation would infringe the US First Amendment rights of Wikipedia volunteers and could damage the quality and reliability of Wikipedia by forcing the inclusion of non-encyclopedic content.

For now, the issue is unresolved: In July 2024, the Supreme Court decided that the lower courts, when first ruling on the cases, had taken an overly narrow approach to considering the constitutional impact of these state laws. The Court sent the cases back to the lower courts with instructions to try again—essentially pressing the reset button. This has stopped the laws from being applied until the lower courts rule on them again.

The Foundation filed a joint amicus brief with Creative Commons and Project Gutenberg in another US lawsuit that is key for platform content hosting: Hachette v. Internet Archive. Four major publishers accused the Internet Archive of encouraging copyright infringement through its Open Library service. In response to the COVID-19 pandemic shutting physical libraries and bookstores, the Internet Archive had decided to remove lending restrictions on the 1.4 million digitized books it lent out to the public. A US district court rejected the nonprofit’s fair use defense, emphasizing that the Internet Archive’s solicitation of donations was a significant enough factor to find that the copyrighted works were being exploited in a way that harmed the owners of copyrighted material. We argued that the court’s interpretation of fair use could wrongly classify nonprofit secondary uses as commercial, impacting all nonprofit organizations’ ability to utilize copyrighted material, including Wikipedia. The Internet Archive ultimately lost the case when the US Court of Appeals for the Second Circuit affirmed the district court’s decision; however, the court of appeals did find that the solicitation of donations does not amount to commercial use.

As explained previously, ensuring accessibility of content that is of interest to the public is an aspect of our mission. For this reason, we cosigned an amicus brief with Pinterest, Google, and the Organization for Transformative Works in Elliot McGucken v. Valnet in early 2024. In this case, a photographer named Elliot McGucken accused Valnet, an entertainment media company, of embedding some of his social media posts and related photographs in its travel website. McGucken alleged that the embedding infringed his exclusive right, as copyright owner, to choose how his images are exhibited to the public. We argued that granting McGucken’s request would profoundly distort copyright law: it would make millions of platform operators into infringers, since it would allow copyright owners to prohibit third parties from pointing audiences to a work that is already publicly displayed online. McGucken went on to lose the case.

The Foundation’s impact litigation extends outside of the US as well, to clarify uncertain laws around the world. As mentioned before, another aspect of our mission is to protect the rights and freedoms of project users everywhere to access and share free and open knowledge. In May 2024, the Foundation cosigned an amicus brief with Wikimedia France that asked the French Constitutional Council to invalidate certain articles of the new SREN (“sécuriser et réguler l’espace numérique,” that is, “securing and regulating digital space”) law. Even before SREN had become law, Wikimedia Europe and the Foundation had informed French lawmakers that it would be unconstitutionally broad in scope due to requirements such as: unreasonably short takedown times; quasi-global blocking orders (meaning that the French government would be forcing takedowns in other countries); and the risk of criminal fines or imprisonment for sharing content that could cause “outrage” or “offense.” The Constitutional Council issued its decision soon after, invalidating several parts of SREN, including the “digital outrage or offense” provision.

Our mission also includes highlighting when good laws are misused in SLAPPs to limit freedom of expression, as noted earlier. The Foundation engaged in Gisele Zuni Mousques v. Christian Chena before the Paraguayan Supreme Court of Justice in July 2024. In this lawsuit, Mousques sued Chena for allegedly violating a Paraguayan law that aims to fight gender discrimination, in an attempt to suppress factually correct public interest information about herself, threatening others’ freedom of expression and access to reliable and accurate information. In our amicus brief, cosigned with TEDIC and the Center for Studies on Freedom of Expression and Access to Information (CELE), we argued that the Justice of the Peace not only failed to consider the human rights implications of the ruling on freedom of expression and journalism, but also did not apply the appropriate test to determine whether a restriction of the affected party’s human rights was legitimate. An incorrect test for restricting speech could impact many future cases and limit the ability of Wikimedia volunteers to share notable public information. The case is still awaiting judgment before the Supreme Court of Justice.

Lastly, as we explained previously, our mission requires that we ensure that good, existing legal standards remain in place—for instance, those that protect platform providers from unjustifiable liabilities. The Foundation submitted an amicus brief in the case of Ulrich Richter Morales and Claudia Ramírez Tavera v. Google Inc. and Google México before Mexico’s Supreme Court of Justice in August 2024. In this case, Ulrich Richter Morales sued Google for not removing an allegedly defamatory blog post on the Blogger platform. We argued that it was vital to maintain the existing protections provided by the internet intermediary liability legal framework, since they are essential for the collaborative and neutral nature of platforms like Wikipedia. This case, too, is still awaiting a decision from the Mexican Supreme Court.

Conclusions

On the whole, these cases present a more complex picture of the legal environment than we have ever seen in the past. Courts in many major countries are willing to find jurisdiction over the Foundation and Wikimedia volunteers based on biography articles about notable people or organizations in those countries. In addition, court cases have become more complicated, since some plaintiffs argue that even factual information on Wikipedia and the other Wikimedia projects might be too old, too irrelevant, too inaccurate, or even too incomplete to allow it to be legally hosted. The best thing that the communities can do in response to this is to make sure that articles are of good quality and important to the general public: the clearer it is that content is accurate, neutral, and serves the public interest, the better the defense that the Foundation can offer to protect content and refuse demands to disclose data.

These are only some of the cases the Wikimedia Foundation engages in on a daily basis. The Foundation, its affiliates, and allies continue to take an active role in upholding our mission of providing free and open educational content for all. Engaging actively in global litigation is one of the core avenues we use to move the world towards our vision of free and open knowledge for everyone, everywhere.

Can you help us translate this article?

In order for this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?