Posted by nnlmneo on June 24th, 2016
Children who grow vegetables are more likely to eat them. Likewise, stakeholders who have a hand in interpreting evaluation data are more likely to use the findings.
Traditionally, the data analysis phase of evaluation has been relegated to technical experts with research skills. However, as the field sharpens its focus on evaluation use, more evaluators are developing methods for engaging groups of stakeholders in data analysis. While evaluation use is one objective of this approach, evaluators are also drawn to participatory data analysis for its other benefits, such as tapping stakeholders' firsthand knowledge of the program and, as noted below, spreading the analysis workload.
Last week, Karen and I attended Participatory Sense-making for Evaluators: Data Parties and Dabbling in Data, a webinar offered by Kylie Hutchinson of Community Solutions and Corey Newhouse of Public Profit. They shared their strategies for getting groups of stakeholders to roll up their sleeves and dig elbow-deep into data. Such events go by many names: data parties, sense-making sessions, results briefings, and data-driven reviews. Hutchinson also shared her data party resource page on Pinterest, so I decided to dedicate this week's blog post to a review of the resources I found most helpful.
One of my favorites is a paper by The Innovation Network, an organization known for creative participatory methods and tools. The article, Participatory Analysis: Expanding Stakeholder Involvement in Evaluation (Pankaj, Welsh, and Ostenso), describes several projects that used participatory data analysis, along with key lessons learned from each one. It is a great introduction to how and why to conduct participatory analysis.
The Pankaj et al. article also mentions a data visualization tool called "data placemats" that can enhance data party facilitation. These are placemat-sized reports with graphic summaries of evaluation data. There is an art to designing them: findings must be easy to understand, but they must also be presented objectively so the display does not bias your participants' analysis. For more information about data placemats, check out Pankaj's slides from her 2012 American Evaluation Association presentation, available here.
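If you are curious what a placemat-style summary might look like in practice, here is a minimal sketch of my own (not from Pankaj's slides or the webinar) using Python and matplotlib. It lays out two plainly labeled charts of placeholder survey numbers on a single landscape page, with descriptive rather than interpretive titles so the display does not nudge participants toward a particular conclusion.

```python
# A minimal sketch of a "data placemat"-style one-page summary.
# All numbers below are placeholders for illustration only.
import matplotlib.pyplot as plt

# Hypothetical workshop evaluation data
satisfaction = {"Very satisfied": 18, "Satisfied": 25, "Neutral": 9, "Dissatisfied": 3}
attendance_by_month = {"Jan": 40, "Feb": 52, "Mar": 47, "Apr": 61}

# Roughly placemat-sized landscape page with two side-by-side charts
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(11, 4.25))

# Chart 1: satisfaction counts, with a descriptive (not evaluative) title
ax1.bar(list(satisfaction.keys()), list(satisfaction.values()), color="steelblue")
ax1.set_title("Participant satisfaction (n = 55)")
ax1.set_ylabel("Responses")
ax1.tick_params(axis="x", labelrotation=20)

# Chart 2: attendance over time
ax2.plot(list(attendance_by_month.keys()), list(attendance_by_month.values()), marker="o")
ax2.set_title("Workshop attendance by month")
ax2.set_ylabel("Attendees")

fig.suptitle("Data placemat: Workshop evaluation, Q1")  # descriptive, not interpretive
fig.tight_layout()
fig.savefig("data_placemat.pdf")  # print one per table for the data party
```

The point of the sketch is the restraint: no trend lines, callouts, or "key takeaway" boxes, so the interpretation is left to the people around the table.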
Newhouse talked about another great resource, Dabbling in Data, published by Public Profit. This guidebook is aimed at teachers and students, so it is laid out in lesson plan format. The first exercises build basic statistical skills to prepare non-researchers to work with data. Later exercises cover approaches to group analysis of quantitative and qualitative data, along with ways to share key findings.
Successful data parties rely on well-summarized, well-displayed data that lends itself to group discussion. Participatory qualitative data analysis therefore poses its own challenges because of the sheer volume of data that qualitative methods produce. However, group analysis can be a boon to researchers: groups can be structured to analyze raw data, spreading the workload across many individuals. I found two references on Hutchinson's resource page dedicated specifically to procedures for participatory qualitative data analysis. One is by Taylor and Drake of Learning for Action. A second article, by Suzanne Jackson, describes an action research project that trained members of marginalized communities to conduct participatory qualitative data analysis. (Unfortunately, the full article is not open access.)
Finally, a good participatory data analysis session requires skilled facilitation. Hutchinson recommended A Facilitator’s Guide to Participatory Decision Making by Kaner (3rd edition, April 2014, Jossey-Bass). I have not yet reviewed this book but plan to soon.
So there you have it: a few party guides to help your stakeholders dive into the data and make sense of your evaluation findings.
If you use these techniques, let us know and we’ll feature your project in a future blog post.