Across the globe, Wikimedia organizations and volunteers are engaging in online and offline activities to get more editors to contribute to Wikimedia projects. There are expansive efforts to attract new editors and to mobilize existing editors who can contribute diverse and high-quality content. With so much activity and energy, it is important to take a deep breath and reflect:
- What are the programs expected to achieve (i.e., what are the program goals)?
- What does it mean for a program to have “impact”?
- How much “impact” equals success?
- How might our programs achieve the most impact?
These are the big questions the Program Evaluation members of the Learning and Evaluation team in the WMF Grantmaking department have begun to explore along with the community. This past month, we completed a beta version of evaluation reports that begins to put systematic numbers behind a handful of popular programs.
The picture is clear: Wikimedia volunteers do incredible work to create free knowledge and to promote the free knowledge movement. But this picture is incomplete without the data to help tell the story. Putting numbers behind our stories and activities helps the community and the public better understand what is actually happening on the ground and how our movement programs are making an impact. The evaluation reports measure programs systematically against shared goals, helping us see which programs drive impact toward various movement goals. From here, we can reflect on what existing programs are achieving and what remains to be done in our strategies to nurture and grow a community of editors and advocates around free knowledge.
A grand total of 119 implementations of 7 programs were analyzed from over 30 countries!
For the first round of reports, data were reviewed from 119 implementations of seven popular Wikimedia programs: edit-a-thons, editing workshops, on-wiki writing contests, the Wikipedia Education Program, GLAM content partnerships, Wiki Loves Monuments, and other photo initiatives. The data represented more than 60 program leaders (individual volunteers or organizations) and program implementations in over 30 countries. These reports provide a basic sketch and a pilot of high-level analysis of how these programs are influencing the movement. They also paint a picture of what these programs aim to achieve and help surface the gaps in data and metrics. Here are just a few highlights:
So, what’s next?
- Examining additional programs! In FY 2014/2015, the goal is to expand the data related to these seven programs and to examine three additional programs: Hackathons, Conferences, and Wikimedian-in-Residence. Through these reports, the evaluation portal, and other pathways, we will continue conversations with the global community to work toward a shared view of program “impact” throughout the movement.
- Help us improve the reports! If you are running a Wikimedia program, start tracking it using the Reporting and Tracking toolkit. You will not only learn a lot about your own programs; by sharing your data with us, you will also enable stronger analysis of popular Wikimedia programs, so we can learn from one another and improve our programs.
- Have you recently implemented a Wikimedia program? Tell us about your program or publish any tips you may have to share in the Learning Pattern Library!
- Questions? Comments? Reach out to us in the comments below or at email@example.com. You can also find us on the Evaluation Portal!
Edward Galvez, Program Evaluation Associate
Can you help us translate this article?
For this article to reach as many people as possible, we would like your help. Can you translate this article to get the message out?