Mark Bernstein is a Russian-language Wikipedia volunteer based in Belarus. He is one of the top 50 editors of Russian Wikipedia, where he has contributed more than 200,000 edits. Early in 2022, his Wikipedia account, social media accounts, and workplace address were doxxed (that is, publicly identified) on the public Telegram channel of a Belarusian law enforcement service. Authorities detained Mark shortly afterward, in March 2022, publishing a video of the arrest and a photograph of him in the same channel and stating that he was “distributing fake anti-Russian information.” They did not state the official charges or say which edits, if any, led to Mark’s detention.
Mark was initially sentenced to 15 days in jail for “disobeying law enforcement,” but was then detained for over three months without explanation. In June 2022, the authorities again failed to state official charges, yet found him “guilty of organizing and preparing activities that disrupt social order.” State-controlled media eventually confirmed that the cause was his contributions to Wikipedia articles about the conflict in Ukraine. Mark was sentenced to three years of restricted freedom, under which he must be home at specified hours and cannot engage in certain activities or leave the country.
In countries like Belarus, where news media organizations are suppressed and where censorship, disinformation, and internet controls are rampant, Wikipedia has become a vital lifeline for people who want to stay informed. Thousands of Wikimedians work tirelessly in difficult conditions to make well-sourced factual information freely available to everyone, everywhere, and they often do so at great personal risk. This includes the life-and-death threats that volunteer editors in Ukraine and other conflict zones face. (A tragic example is that of Volodymyr Vakulenko, a writer, activist, and Wikimedian, who was disappeared by the Russian military following the invasion of his homeland, and later found murdered.)
As Mark’s case shows, volunteers in several countries must be careful to avoid being publicly identified, which can have serious consequences not only for themselves, but for their loved ones as well. Wikimedians can be intimidated and prosecuted in court on questionable, if not fabricated, legal charges simply for sharing fact-based, neutrally written, well-sourced, and verifiable information. The determination that leads volunteers to speak the truth can be met with surveillance and harassment, retaliatory fines, prison sentences that deprive them of their freedom, or even worse.
The world is experiencing its 16th consecutive year of decline in global freedom, and in 2021, the number of journalists arrested worldwide reached a record high. Authoritarians and other bad actors are becoming increasingly adept at exploiting vulnerabilities in online technologies, communities, and platforms. During times that are not always hospitable to free knowledge, the Wikimedia Foundation is doubling down on its responsibility to protect the human rights of Wikimedians like Mark, and thousands of unnamed others, to access and participate in open knowledge privately and safely.
This past December 10 marked Human Rights Day, a date that commemorates the adoption of the Universal Declaration of Human Rights by the United Nations General Assembly in 1948. The Declaration, which is recognized by and applies to every country in the world, establishes that every person has the right to freedom of expression and opinion without interference, and the right to access and share information and ideas through any media and beyond borders. To reinforce the Foundation’s longstanding commitment to protect, respect, and uphold the rights of all those who contribute to and use Wikimedia projects, we announced the approval of our Human Rights Policy around this time last year.
Since the Policy’s approval, we’ve been hard at work strengthening our continuous efforts to protect and support Wikimedians for the long haul. Human rights risks to Wikimedians represent real and dangerous threats to freedom, as we’ve seen with volunteers like Mark.
Conversations with the volunteer community allowed us to hear and learn more about volunteers’ greatest concerns, as well as what they need most from us right now to stay safe. To mark Human Rights Day this year, we want to share what else we’ve been doing over the past year to protect hundreds of thousands of Wikimedians and billions of readers from these threats, and what we hope to do with your help in the year ahead.
Talking with the Volunteer Community
Protecting Wikimedians requires close collaboration and communication between Foundation staff and people across the movement. After the Board of Trustees approved the Human Rights Policy in 2021, we wanted to make sure volunteers would have plenty of opportunities to share their experiences, concerns, ideas, and proposed solutions.
To do this, we gathered feedback on the Human Rights Policy through two global conversation hours, an online survey translated into 11 languages, a private focus group with an at-risk community, and four regionally focused community conversations. After publishing the Foundation’s first Human Rights Impact Assessment (HRIA) in July, we held a number of conversations with the community about it: a presentation and discussion at the Community Affairs Committee, a workshop at Wikimania, staff office hours, and two community conversation hours. To help keep Wikimedians safe in the immediate term, we also organized six digital security trainings for volunteers in vulnerable jurisdictions in Asia, the Middle East and North Africa, and Sub-Saharan Africa, and held regular community consultations regarding the situation in Ukraine and Russia.
From these events, we heard and learned more about the communities’ worries and what else people need from the Foundation to keep themselves safe. Wikimedians’ greatest concerns include government surveillance and censorship, harassment and intimidation, and harmful content. Through our conversations, we learned that the volunteer community most needs the following:
- Resources and Training to understand how volunteers’ and readers’ human rights are impacted online and on-wiki, best practices to protect themselves, and information on how to respond and access resources when threatened;
- Improved Volunteer Moderation and Enforcement, which requires greater resources to improve volunteer moderation and enforcement tools and processes, as well as consultation with volunteers on which tools and workflows need such support;
- Community Consultations to involve volunteers early on in identifying and reducing human rights risks and benefit from their institutional and technical knowledge, and to better understand specific challenges and feasible solutions.
Keeping Wikimedians Safe
At the same time that we talked with the volunteer community, we continued the behind-the-scenes work to keep Wikimedians safe over the long haul. This work involves not only the daily efforts of our staff, but also those of the Human Rights Steering Committee, a group of leaders from across the Foundation tasked with guiding the implementation of the Policy. The Steering Committee convened four times this year to drive this work forward.
The human rights work that we do at the Foundation to protect Wikimedians, which is increasingly carried out in closer communication and coordination with the communities, can be mostly understood in terms of due diligence, transparency, and advocacy.
Due Diligence: By identifying, analyzing, and reducing the likelihood and impact of human rights threats to our projects, we work proactively to prevent harms from occurring on Wikimedia projects. This year we:
- Activated the Ukraine-Russia Coordination Committee to respond to emerging and rapidly evolving threats to Wikimedians and Wikimedia projects, which are happening as a result of the Russian invasion of Ukraine;
- Initiated a child rights impact assessment, and consulted experienced volunteers to identify and analyze human rights risks to minors using Wikimedia projects, so we can better reduce those risks and make Wikimedia a safer environment for kids;
- Completed a human rights impact assessment for a proposed technical project, and consulted experienced volunteers to better understand the human rights opportunities and risks of a new tool before the Foundation makes a decision on its release;
- Developed a human rights due diligence framework to guide when and how the Foundation will carry out human rights due diligence in a cohesive and systematic way, which also takes into account our limited resources.
Transparency: Tracking and publicly reporting on our efforts to meet our human rights commitments enables the community to hold the Foundation accountable for meeting them. This year, we’ve made additional information available to the community and the broader public by publishing the following reports:
- Foundation-wide Human Rights Impact Assessment (HRIA), which evaluated human rights risks across the Wikimedia movement and projects, and proposed a series of recommendations to reduce those risks;
- 2021 Transparency Reports, which continue to explain how the Foundation handles requests from governments and others to alter or remove content from Wikimedia projects, what kinds of requests we received during six-month periods, and how we responded to those requests.
We also updated the Board of Trustees Executive Committee on emerging and dynamic threats facing our movement, and the work we’ve completed so far to meet these challenges.
Advocacy: To make sure that Wikimedia projects can continue to prosper, the Foundation encourages governments to adopt and reform laws that keep the internet open and that respect human rights principles. We created the Global Advocacy team in late 2021 to lead these efforts, and ever since, it has been working hard to protect free knowledge in regions across the globe:
- Asia-Pacific: Highlighting at the C20 Summit how Wikimedia serves as a model for online communities, and meeting with government officials to explain why human rights safeguards and standards need to be included in internet policy;
- Europe and the United States: Leading a teach-in with British officials shaping the UK’s Online Safety Bill to explain why regulations need to protect community-governed platforms, and successfully urging the UK and US governments to ensure that economic sanctions against Russia following the invasion of Ukraine don’t prevent the Russian people from accessing the open internet;
- Latin America: Meeting with government officials and civil society organizations in Chile to raise awareness about the Wikimedia model, and providing constructive recommendations to a bill seeking to regulate online platforms.
Looking Ahead to 2023
Moved by the successful conversations and steady efforts to fulfill our human rights commitments, we’re looking forward to an equally ambitious year of action in 2023 to protect and uphold the rights of all those who contribute to and use Wikimedia projects! We plan to build upon the collaboration and various areas of work achieved this year in order to keep Wikimedians safe from the emerging and dynamic threats facing our movement in the future.
To this end, we’ll continue to talk with the volunteer community about human rights concerns and needs, hold regular office hours, and launch a Learn.Wiki course to help educate volunteers on digital security. We’ll also propose a Human Rights Due Diligence Framework to Foundation leadership. In addition, we’ll finalize and publish a child rights impact assessment, along with the human rights impact assessment for a proposed technical project and the next two biannual transparency reports. Finally, we will present a self-assessment of our human rights protections to the Global Network Initiative (GNI), refine Foundation-wide crisis response systems to support Wikimedians under threat, and provide a second annual update to the Board of Trustees on all of the above toward the end of 2023.
Do You Want to Help?
As we wrote at the beginning of this post, we’re counting on your help to strengthen all of this human rights work. If you’d like to get involved in any of it and/or talk to us about any human rights concerns or needs that you have, please join the Human Rights Interest Group by signing up here!
You can also learn more about the work of our Global Advocacy and Human Rights teams and how they complement each other on Meta-Wiki.
The experiences and input of volunteers are critical to the success of this work, especially as the threats to free knowledge and our movement continue to evolve. There are a number of ways in which you can contribute to our work going forward. We’ll hold frequent meetings and consultations with the volunteer community, so don’t miss out and please help us celebrate the human rights of Wikimedians across the world, this and every year, with action!