As Wikimedia program leaders and evaluators work together toward more systematic measurement and evaluation strategies, we face a common challenge: obtaining both systematic quantitative data and qualitative information. With both in hand, we can establish successful practices in outreach programs and understand the depth and variety of programs across our vast Wikimedia landscape. To meet this challenge, we are working to combine efforts and knowledge for two key purposes:
- To generalize program knowledge and design patterns of Wikimedia programs and
- To deepen our understanding of Wikimedia programs and projects and how they impact our communities.
Oftentimes, program leaders and evaluators ask whether quantitative methods and measures are preferred over qualitative ones, and whether qualitative outcomes can be given the same weight as quantitative outcomes.
A good evaluation requires numbers and stories – one does not have meaning without the other. How those stories and numbers are collected can – and will – vary greatly, as each program leader seeks to understand their project or program story. Methods and measures for program evaluation should be designed to track what your project or program intends to do, how it is progressing, and how well the expected outcomes are reached. Whether those methods and measures are qualitative or quantitative will vary depending on your interests, your measurement point and your evaluation resources, but, no matter what, they should be useful in telling the story of what you did and whether or not your work was a success. (Read more on the Evaluation portal.)
Most often, today’s social researchers triangulate quantitative and qualitative measures, defining approximate measures or “proxies” for the phenomenon of interest. Through triangulated measures, qualitative and quantitative information can tell a better story of outcomes than either can alone. For instance, consider the phenomenon of volunteer editing behavior:
QUANTITATIVE (“edit count,” “bytes added/removed,” “page views”) + QUALITATIVE (“article subjects,” “categories,” “quality” ratings) = A BETTER STORY
In a mixed-methods world, quantitative and qualitative tend to be two sides of the same measurement coin: related and nearly inseparable in practice.
How do numbers and qualitative attributes come together?
All quantitative measures are based on qualitative judgments; all qualitative measures can be coded and analyzed quantitatively.
Numbers do not mean anything without a description attached to them; anecdotes mean nothing without numbers.
Whether we are asking about physical counts or about attitudes, we must create the meaning of numbers, and numbers with meaning, in our measurement. (Read more here on a detailed example of how to ask yourself these questions during an education program with students, leading to a combined quantitative and qualitative approach.)
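Here is a small sketch of coding qualitative data so it can be analyzed quantitatively (the responses and keyword rules below are hypothetical and deliberately simplistic): open-ended participant feedback is first assigned a qualitative code, and those codes are then counted.

```python
from collections import Counter

# Open-ended feedback from participants (hypothetical responses).
responses = [
    "I learned how to add citations",
    "Met other editors in my city",
    "The wiki markup was confusing at first",
    "I plan to keep editing after the event",
]

def code_response(text: str) -> str:
    """Assign a qualitative code to a free-text response (simplistic keyword rules)."""
    text = text.lower()
    if "learned" in text:
        return "skill development"
    if "met" in text:
        return "community connection"
    if "keep editing" in text:
        return "motivation to continue"
    if "confusing" in text:
        return "barrier reported"
    return "other"

# Quantify the qualitative codes so they can be counted and compared.
code_counts = Counter(code_response(r) for r in responses)
for code, n in code_counts.most_common():
    print(f"{code}: {n} response(s)")
```

In real evaluations the coding step is done by people reading each response, but the principle is the same: once a qualitative judgment assigns the code, the codes themselves become countable.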
Next steps for Program Leaders
Trying to choose the best measures for your Wikimedia project or program?
Check out the helpful Measures for Evaluation matrix of common outcomes by program goal; we are working to map measures and tools to it.
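One simple way to think about such a matrix is as a lookup from program goal to candidate measures; the goals and measures below are illustrative examples only, not the contents of the actual matrix:

```python
# Illustrative sketch of a goal-to-measures lookup (hypothetical entries).
measures_by_goal = {
    "increase content": {
        "quantitative": ["articles created", "bytes added", "media files uploaded"],
        "qualitative": ["article subjects covered", "quality assessments"],
    },
    "recruit and retain editors": {
        "quantitative": ["new accounts created", "editors still active after 3 months"],
        "qualitative": ["participant feedback", "reported motivation to continue"],
    },
}

for goal, measures in measures_by_goal.items():
    print(f"{goal}: {measures['quantitative'] + measures['qualitative']}")
```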
Last week, the Program Evaluation and Design team initiated the Foundation’s second round of voluntary programs reporting. We invite all program leaders and evaluators to participate yet again in the most epic data collection and analysis of Wikimedia programs we’ve done so far. This year we will examine more than ten different programs.
Did you lead or evaluate any of these programs during the time from September 2013 through September 2014? If so, we need your data! For the full announcement visit our portal news pages.
Reporting is voluntary, but the more people do it, the better we can represent programs. This voluntary reporting will help us understand the depth and impact of programs across different contexts. It allows us to come together and generate a bird’s eye view of programs so that we can examine further what works best to meet our shared goals for Wikimedia. Together we can grow the AWESOME in Wikimedia programs!
Jaime Anstee, Ph.D., Program Evaluation Specialist, Wikimedia Foundation
Can you help us translate this article?
In order for this article to reach as many people as possible we would like your help. Can you translate this article to get the message out?