A Participatory Evaluation HOW TO: tips and tools for sensemaking, storytelling, and more

It was college reunion weekend at my alma mater (wahoowa!), and in addition to taking a few (literal) walks down memory lane, I thought a lot about feeling a sense of community and what magic combination of ingredients is needed to create that.

Shared experiences - especially those where you gain something of value and even more so those where you create something of value - are part of that secret sauce.

And participatory evaluation, when done well, checks all those boxes.

If you missed the last issue of Community Threads, be sure to check it out first, as it provides an overview of PoP Health’s participatory evaluation approach. With that approach in mind, today’s issue is diving into HOW we actually do participatory evaluation. Let's jump right in.

Can you share some tips for participatory evaluation?

Begin with the end goal in mind, and design your evaluation and learning process accordingly. What do community members, coalition members, and the evaluation and learning team most want to learn and why?

Consider how community members can co-own - and meaningfully benefit from - every stage of the evaluation and learning process, from initial brainstorming and planning through data collection and analysis to sensemaking and storytelling. How is each stage structured to allow community members to drive or co-lead the process? How is each stage designed to ensure community members walk away with new capacity, connections, resources, and supports?

Develop infrastructure for responsive feedback and continuous quality improvement. You’ve heard this tip from me before, and I’m repeating it here because it can’t be overstated - and hardly anyone truly does it! Yes, a key goal of evaluation is to understand impact, but an equally (if not more) important goal is to improve the work. It’s vital to set up infrastructure (time, resources, systems) from the outset so you are flexible and nimble enough to implement course corrections and improvements in real time, based on analysis of monitoring data.

Diversify and tailor your evaluation deliverables, and make them modular where possible. Here is a slide from PoP Health’s Evaluation 101 workshop showing a range of possible deliverables for sharing evaluation results and stories (and there are many more beyond what’s listed here).

Don’t limit yourself here - get creative! Data dashboards are all the rage these days, but I love Stephanie Evergreen’s take on them, especially for something community-facing - make it a webpage instead. You can also tailor content to different audiences - we often do this via briefs/two-pagers: one for community members, one for policymakers, one for funders, and so on, since each audience cares about different things. We have also had success making our briefs modular, with “modules” (short sections of the brief) that can be toggled in or out depending on who needs X background information or who cares about Y data.

What are some specific strategies for participatory evaluation?

There are many, but here are a few I especially appreciate for the meaningful role community and coalition members play:

Community-based system dynamics modeling: A key part of evaluation and learning in public health is understanding dynamic, complex, messy systems. In the case of our work on school mental health in DC, students, family members, teachers, school administrators, policymakers, and others may have completely different understandings of the school mental health system. So we engaged in what’s called Community Based System Dynamics, in partnership with the Social System Dynamics Lab. This process uses participatory group model building to explore the system in question. We held modeling workshops with students, with caregivers, with teachers, and with our multisector Stakeholder Learning Community. During the workshops, groups huddled around large sheets of paper - discussing, writing things down, crossing things out, drawing arrows, and so on. They produced a series of causal loop diagrams, which I then synthesized and integrated into this version of our systems map. Much more on this in a prior PoP Health newsletter here.

Data placemats for collective sensemaking: Community members and coalition members have experiences, expertise, and perspectives that lead them to insights your evaluation team, program team, and funders are apt to miss entirely. So don't make the mistake of leaving them out of the conversation. I love using highly visual data placemats and data posters to bring coalition members and community members into the process of making sense of data and drawing insights - about what we've learned so far, how to continually improve the initiatives we work on, and what other information we need to gather moving forward. We’ve recently used data placemats (during a coalition-wide data sensemaking session) and data posters (during a community-wide symposium) to share initial data from the evaluation of the BIRTH Plan with our community and coalition partners in Pittsburgh. We pair the visuals with discussion questions that help elicit what they take away from the data, their insights about how to improve the work, and what additional information they’d most like to see in the future.

Sharing personal narratives: Nothing is quite as powerful as a story in someone’s own voice. Capturing the personal narratives of community members, program participants, and those influenced by a policy is a vital participatory evaluation strategy. In addition to focus groups and interviews, there are many creative ways to do this, including video journals, audio diaries, photovoice, and more. I especially love the idea of a participatory video process focused on stories of significant change: participants take part in a Participatory Video process at baseline; stories of Most Significant Change are collected via structured story circles at midline; and each circle selects one story to record on video. A participatory analysis then identifies themes and recommendations, and, given consent, the videos can be shared so stakeholders learn directly from participants’ stories.

Community collaboration strategies: The community collaboration strategies we have featured previously (i.e., focus groups in a box, data walks, and street stalls) can also all be used in the context of evaluation and learning.

What are some resources to help me engage in participatory evaluation?

I’d like to leave you with a few resources I’ve found especially practical and useful for participatory evaluation. As always, drop me a note to share other helpful resources or tools you’ve come across!

We share these tips, strategies, and resources in the hope that they help you understand HOW to engage in participatory evaluation.

Participatory evaluation is likely to be messier, slower, and more expensive than traditional evaluation approaches. But on the flip side… You’ll be driven by community. You’ll learn more. You’ll create greater and more sustained change in your community.

The pros definitely outweigh the cons in my book - what about for you?

Sign up to receive future newsletters directly in your inbox at www.pophealthllc.com!
