Let's start talking about program evaluation


Most Wikipedians I know – myself included – care deeply about our mission. We are highly passionate about our vision to provide free knowledge to every single person on the planet. And many of us invest an incredible amount of our free time into making this vision come true. Even though I have been editing Wikipedia since 2005, I’m still amazed when I look at the daily stream of Wikipedia’s recent changes. And I have a deep respect for the volunteers who invest their precious time and energy into never letting this stream of small and big improvements run dry.
For many years now, Wikipedians have not only worked on increasing the amount and improving the quality of free content available on the web. We have also launched a wide variety of programmatic activities intended to strengthen free knowledge: raising public awareness of Wikipedia through exhibitions and presentations, recruiting new editors through Wikipedia workshops, and starting programs like “Wiki Loves Monuments”, the popular Wikimedia Commons photo competition.
We have not, however, been very good at measuring the impact of those programmatic activities. “Measuring impact” in this case refers to quantifying the long-term effects those activities have on Wikipedia and its sister projects. In practice, this means, for example, analyzing how many of the people who attended a specific Wikipedia workshop actually turned into Wikipedians. How many of them started editing articles as a result of the workshop, and how many continued doing so on a regular basis?
Here’s where program evaluation comes into play.

  • If you’re supporting programmatic work as a volunteer, you most likely want to know whether your activities are worth the effort. Whether you help run photo competitions or sign up as a speaker at workshops for Wikipedia beginners, you want to know whether you’re making a real difference with what you’re doing. Program evaluation will help you answer this question. By measuring the outcome of the programmatic activity you signed up for, you will know whether your photo competition was successful and whether the people who participated in your workshop really learned the skills they need to write great articles on Wikipedia. This knowledge will make your work as a volunteer more fulfilling.
  • If you’re helping to improve existing programs, you’re most likely eager to find out which changes will make your program more effective and efficient. Imagine you could achieve the same result with fewer volunteer hours spent. And what if you could double the number of people who actually start contributing to Wikipedia after your workshop, simply by making the workshop more engaging and fun? Improving an existing program requires that you measure its effectiveness. Here’s where integrating evaluation into your program design can make a difference.
  • If you’re thinking about starting a new program, you will want to have some evidence that your new program is working. How else would you convince others to participate in your program? And how else would you convince a funder to provide you with a grant so you can eventually execute your program and grow it over time? Program evaluation will help you to make a strong case for your idea. And it will also prevent you from embarking on activities that have no impact.
  • If you’re serving on the board of a Wikimedia chapter or a thematic organization, you might want to know which kinds of programmatic activities produce the “biggest bang for the buck”. You might ask whether it makes more sense to put money and effort into in-person workshops or to spend those resources on creating an online training. How many hours of volunteer or staff time are you going to need in order to produce a specific result? Are the in-person workshops going to be more effective than the online training? Which of the two options will be more efficient? And which one is going to have the bigger long-term impact? In this sense, program evaluation can be a decision-making tool that helps you determine which programmatic activities to embark on.

Finally, with the Funds Dissemination Committee (FDC) in place since last year, there’s another reason why program evaluation will be more important than ever: after the first round of funding for 2012/2013, the FDC requested more information about program impact, so that it has a better foundation for making recommendations on what to fund in the future. This means that from now on, funding decisions will rely heavily on the ability of grantees to demonstrate what impact their programmatic activities have. Grantees will therefore have to start thinking about program evaluation if they plan to apply for movement funds through the FDC process.
I’ve started a series of documents on Meta (“Program evaluation basics”) aimed at providing program leaders with an introduction to the main terms and concepts. Currently, three documents are available:

I invite you to take a look at the documents and to share your thoughts with me. I will also be available for an IRC office hour on Thursday, March 21, at 17:00 UTC. Let’s start talking about program evaluation…
Frank Schulenburg
Senior Director of Programs, Wikimedia Foundation

Archive notice: This is an archived post from blog.wikimedia.org, which operated under different editorial and content guidelines than Diff.



Thanks for the update! Looking forward to working with you.

Dear Frank Schulenburg,
I am very interested in your program, so I hope that you can provide information about the update of your programs routinely and regularly to me.
Thank-You Very Much for Your Kind Attention, Your Help and Your Concern.
May GOD Bless You Always!
Best Regards,
Claudius Erwin Mulialim
Owner Q-Tech Computer – Ruteng
(CV. Montée Vista Media Vision)