In March 2015, the Wikimedia Foundation launched its first Inspire Campaign with the goal of improving gender diversity within our movement. The campaign invited ideas on how we as a movement could improve the representation of women within our projects, both in their content and among their contributors.
The response and effort from volunteers have been remarkable. Across the funded ideas, there was a diversity of approaches, such as:
- An edit-a-thon series in Ghana to develop content on notable Ghanaian women
- A tool to track how the gender gap is changing on Wikipedia projects
- A pilot on mentorship-driven editing between high school and college students
These and other initiatives have resulted in concrete and surprising outcomes, such as:
- Creating or improving over 12,000 articles, including 126 new biographies on women,
- Engaging women as project leaders, volunteers, experienced editors and new editors,
- Correcting gender-related biases within Wikipedia articles.
As this campaign draws to a close, we’d like to celebrate the work of our grant-funded projects: the leaders, volunteers, and participants who contributed (many of whom were women), and the achievements that have moved us forward in addressing this topic.
Protecting user privacy through prevention
The internet can be a hostile place, and in this age of information, we have become more cautious about what we reveal about ourselves to others online. You can imagine, then, that in a campaign designed to attract women, privacy became a central concern for both program leaders and participants.
Program leaders were sensitive to this challenge, and cultivated spaces where women could contribute without compromising their need for privacy. For instance, we asked program leaders to report the number of women who attended their events. Many program leaders pushed back, citing the need to protect privacy. They raised two good points: that some editors choose not to disclose their gender online as a safety measure, and that by even associating their name or username with a public event designed for women, they could inadvertently compromise their privacy. Consequently, the total number of women who participated in these programs was underreported.
In spite of this tension, it was clear that women made up the majority of participants across funded projects. In projects hosting multiple events for training or improving project content, such as those hosted by AfroCROWD in New York, the Linguistics Editathon series in Canada and the U.S., and WikiNeedsGirls in Ghana, well over 50% of participants across events were women. Furthermore, in the mentorship groups formed through the Women of Wikipedia (WOW!) Editing Group, all 34 participants were women. These women showed strong commitment to the program, and in a follow-up survey, many said they wanted to continue contributing with their mentorship group beyond the program.
Who is missing on Wikipedia?
There is an impressive amount of information on Wikipedia today: over 43 million articles across 284 languages. In English Wikipedia alone, there are over 5 million articles. A fair number of these articles are dedicated to people: across all languages, biographies about notable individuals amount to over 6 million articles, and this number continues to increase every year.
It can be difficult to see what is missing within this sea of information, and biographies are one well-defined area where the question of “Who is missing?” is particularly pertinent. Today, biographies about women amount to just over 1 million articles across all languages: one million out of 6 million biographies, or 16% of biographies in total, and one million out of 43 million articles, or 2% of Wikipedia content overall (whereas biographies of men account for 12%). This is one way to understand how women are underrepresented on Wikipedia today, and we know even less about the extent of underrepresentation for other non-male gender identities.
One Inspire grant sought to address the visibility of this issue through the development of a tool: Wikipedia Human Gender Indicators (WHGI). WHGI uses Wikidata to track the gender of newly-created biographies by language Wikipedia (and other parameters, such as country or date of birth of the individual), and provides reports in the form of weekly snapshots.
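To make the mechanism concrete, here is a minimal sketch, assuming the public Wikidata Query Service, of the kind of question WHGI answers. This is not WHGI's actual implementation; the SPARQL endpoint and the Wikidata terms P31 (instance of), Q5 (human), P21 (sex or gender), and Q6581072 (female) are real, while everything else is illustrative.

```python
# A minimal sketch (not WHGI's actual implementation) that counts
# English Wikipedia articles about humans whose Wikidata item records
# their gender as female, via the public Wikidata Query Service.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT (COUNT(DISTINCT ?person) AS ?count) WHERE {
  ?person wdt:P31 wd:Q5 ;          # instance of: human
          wdt:P21 wd:Q6581072 .    # sex or gender: female
  ?article schema:about ?person ;
           schema:isPartOf <https://en.wikipedia.org/> .
}
"""

resp = requests.get(
    SPARQL_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "whgi-sketch/0.1 (illustrative example)"},
    timeout=120,
)
resp.raise_for_status()
count = resp.json()["results"]["bindings"][0]["count"]["value"]
print(f"English Wikipedia biographies about women: {count}")
```

A live query this broad can strain the public endpoint's time limits; WHGI's weekly snapshots presumably come from batch processing of Wikidata data rather than on-demand queries like this one.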
The project has seen solid evidence of usage since its completion. In February 2016 alone, the site received approximately 4,000 pageviews and over 1,000 downloads of its available datasets. The team also received important feedback from users of the tool: participants in WikiProject Women in Red (a volunteer project that has created more than 44,000 biographies about women) characterized the project as valuable to their work, as it helps them identify notable women to write about.
The first step to addressing a problem is to identify it. WHGI helps us to do that in a concrete, data-driven way.
Why does “Queen-Empress” redirect to “King-Emperor”?
Addressing the gender gap goes beyond filling gaps in content. It also means igniting conversations and addressing bias within content, bias that may be more subtle or even invisible to casual readers.
Just for the record is an ongoing Gender Gap Inspire grant that focuses on these more subtle forms of content bias on English Wikipedia. One of their events analyzed the process of Wikipedia editing to investigate the possibilities and challenges of gender-neutral writing.
They looked specifically at how pages are automatically redirected to others (e.g. “Heroine” automatically redirects to “Hero”) and the direction of those redirects: female to gender-neutral, male to gender-neutral, female to male, and male to female. An analysis of almost 200 redirects on English Wikipedia showed that ~100 redirect from male or female terms to gender-neutral terms, and ~100 from female terms to male terms. For example, “Ballerina” redirects to “Ballet dancer” and “Queen-Empress” redirects to “King-Emperor”.
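For readers curious how such an audit can be run, here is a minimal sketch using the standard MediaWiki API's redirect resolution. It is not the grantees' actual methodology, and the term list is a hypothetical sample.

```python
# A sketch of checking redirect direction with the standard MediaWiki API.
# The "redirects" parameter asks the API to resolve redirects and report
# each from/to pair; the term list below is an illustrative sample only.
import requests

API = "https://en.wikipedia.org/w/api.php"
gendered_terms = ["Heroine", "Ballerina", "Queen-Empress"]

resp = requests.get(
    API,
    params={
        "action": "query",
        "titles": "|".join(gendered_terms),
        "redirects": 1,   # resolve and report redirects
        "format": "json",
    },
    headers={"User-Agent": "redirect-audit-sketch/0.1 (illustrative example)"},
    timeout=30,
)
resp.raise_for_status()

# Each followed redirect appears as e.g. {"from": "Heroine", "to": "Hero"}.
for r in resp.json().get("query", {}).get("redirects", []):
    print(f"{r['from']} -> {r['to']}")
```

Classifying each from/to pair as gendered or gender-neutral, as the grantees did, then yields the direction counts described above.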
These redirections may seem like minor technical issues, but they result in an encyclopedia that is rife with systemic bias. Raising awareness of these types of bias, starting discussions on and off wiki, and directly editing language were some of the main approaches Inspire grantees took to address bias.
Learn more!
These and other outcomes can be read in more detail in our full report. We encourage you to read on, learn more about what our grantees achieved, and join us in celebrating these project leaders and their participants! You can also learn more about Community Resources’ first experiment with proactive grantmaking and what we learned from this iteration.
Sati Houston, Grants Impact Strategist
Chris Schilling, Community Organizer
Wikimedia Foundation