Ripple Effect Mapping: A “Radiant” Way to Capture Program Impacts

A group of leaders in college Extension programs created a participatory group process designed to document the results of Extension educational efforts within complex, real-life settings. The method, known as Ripple Effect Mapping, uses elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to engage program participants and other community stakeholders in reflecting upon and visually mapping the intended and unintended changes produced by Extension programming. The result is not only a powerful technique for documenting impacts, but also a way to engage and re-energize program participants.

Ripple Effect Mapping can be used to help unearth and document the divergent outcomes that result from dialogue and deliberation programs.

This article in the Journal of Extension was published in October 2012 (Volume 50, Number 5). Authors include Debra Hansen Kollock of Stevens County Extension, Lynette Flage of North Dakota State University Extension, Scott Chazdon of University of Minnesota Extension, Nathan Paine of the University of Minnesota, and Lorie Higgins of the University of Idaho.

Introduction

Evaluating the changes in groups, organizations, or communities resulting from Extension programming is challenging (Smith & Straughn, 1983), yet demonstrating impacts is critical for continued investment (Rennekamp & Arnold, 2009).

Ripple Effect Mapping (REM) is a promising method for conducting impact evaluation that engages program and community stakeholders to retrospectively and visually map the “performance story” (Mayne, 1999; Baker, Calvert, Emery, Enfield, & Williams, 2011) resulting from a program or complex collaboration. REM employs elements of Appreciative Inquiry, mind mapping, and qualitative data analysis.

REM was used to conduct an impact analysis of the Horizons program, an 18-month community-based program designed to strengthen leadership and reduce poverty. The method (Kollock, 2011) was piloted in Washington, Idaho, and North Dakota Horizons communities to illustrate outcomes of the program over time. While the process varied slightly in each state, in all three the REM technique used maps both to show community members what had been accomplished and to build their enthusiasm for taking action on issues.

Background

Historically, the standard approach for impact evaluation has been experimental research. Yet critics of experimental approaches emphasize that these designs are often politically unfeasible and yield very little useful information on a program’s implementation or its context (Patton, 2002).

REM, an example of qualitative methodology based on open-ended group interviewing, provides “respectful attention to context” (Greene, 1994, p. 538) and better addresses the concerns of program stakeholders. The participatory group aspect of this method engages participants and others to produce high-quality evaluation data and increases the likelihood of future collective action.

REM is a form of mind mapping, a diagramming process that represents connections hierarchically (Eppler, 2006, p. 203). A fundamental concept behind REM is radiant thinking (Buzan, 2003), which refers to the brain’s associative thought processes that radiate from a central point and form links between integrated concepts (Wheeler & Szymanski, 2005; Bernstein et al., 2000). This radiant structure makes REM an ideal tool for brainstorming, organizing, and remembering ideas.

Description of the Method

The steps involved in Ripple Effect Mapping are:

  1. Identifying the intervention: REM is best conducted for in-depth program interventions or collaborations that are expected to produce broad or deep changes in a group, organization, or community.
  2. Scheduling the event and inviting participants: The REM process includes both direct program participants and non-participant stakeholders. This latter group offers a unique perspective and a form of external validation to verify the “performance stories” of program participants. A group of 8 to 15 participants is ideal.
  3. Appreciative Inquiry Interviews: At the beginning of the REM event, participants are paired up and instructed to interview each other about particular ways the program affected their lives or particular achievements or successes they have experienced as a result of the program (Cooperrider & Whitney, 2007).
  4. Mapping: The core of the session involves group mapping, using Mind Mapping software (Donaldson, 2010) or paper and tape on a wall, to brainstorm and hierarchically map the effects or “ripples” of the intervention. This process engages the entire group and provides opportunities for participants to make connections among program effects. The process is co-led by a facilitator and a “mapper” and is typically completed in one to two hours.
  5. Cleaning, Coding, and Analysis: After the session, the evaluator may need to reorganize the mind map and collect additional detail by interviewing other participants. The data produced in the mapping process can be downloaded into a spreadsheet program and coded in a variety of ways. For example, the “ripples” can be coded as short-term knowledge, skill, or attitude changes; medium-term behavior changes; and long-term changes in conditions. Furthermore, these changes in conditions can be coded using the Community Capitals Framework (Emery & Flora, 2006; Rasmussen, Armstrong, & Chazdon, 2011).
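
As a purely hypothetical illustration of the coding step, the Python sketch below tallies exported “ripples” by time horizon and by Community Capital. The ripple texts and code labels are invented for this example; in practice they would come from the mapping session’s spreadsheet export and the evaluator’s own coding scheme.

```python
# Hypothetical sketch: tallying coded "ripples" exported from a mind map.
# Each ripple gets a time-horizon code (short/medium/long-term) and, for
# long-term changes in conditions, a Community Capitals code.
from collections import Counter

ripples = [
    {"text": "Learned grant-writing skills",        "term": "short",  "capital": None},
    {"text": "Started a community garden",          "term": "medium", "capital": None},
    {"text": "New partnership with the Tribe",      "term": "long",   "capital": "social"},
    {"text": "Grants secured for cultural programs","term": "long",   "capital": "financial"},
]

# Summarize how the program's effects are distributed over time horizons.
term_counts = Counter(r["term"] for r in ripples)

# Summarize which Community Capitals the long-term changes touch.
capital_counts = Counter(r["capital"] for r in ripples if r["capital"])

print(term_counts)
print(capital_counts)
```

A summary like this could accompany the visual map when reporting to stakeholders, showing at a glance how many ripples reached long-term changes in conditions.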

Benefits and Limitations

REM is:

  • Simple and cheap. Mind mapping software is available for free or at low cost. It is efficient to gather participants together for one face-to-face meeting rather than to conduct individual interviews.
  • Able to capture impacts of complex work. The technique successfully documents both intended and unintended effects of Extension work. For example, Extension programming often succeeds at building social capital (trust and connections among people). This method allows participants to describe the connections they’ve built as well as what these connections led to.
  • An effective communication tool. The visual nature of ripple maps makes them very useful as a tool to share program effects with stakeholders such as funders or local officials.
  • Motivating. As REM engages participants and stakeholders, it also creates positive energy for further collective action.

The limitations of REM are the risks of bias in participant selection and in data collection. The assembled participants may not have complete information about all the outcomes of a program and may not volunteer examples of negative consequences. One way to mitigate these limitations is to probe for negative consequences during the session and to conduct supplementary interviews with additional stakeholders afterward.

Example with Map

Figure 1 shows a portion of one community’s Ripple Effect Map from the Horizons program. This section of the map features examples of first-, second-, and third-order “ripples” from the program. The map illustrates the Fort Yates Horizons program, which conducted a study circles conversation that led to the development of a community garden. The community garden project spurred a Native Garden partnership between the town and the Tribe, which ultimately led to significant grants to support cultural understanding and assist those with limited resources.

Figure 1.
A Segment of a Ripple Effect Map


Conclusion

REM is a useful tool for impact analysis of Extension programming and may be particularly well suited for complex interventions or collaborations. Compared with other methods, it is straightforward, cost effective, and, most important, has the potential to generate further movement towards group, organizational, or community goals. We invite program staff and evaluators in other states to try this method out and engage with us in dialogue about the many uses, benefits, and limitations of this approach.

References

Baker, B., Calvert, M., Emery, M., Enfield, R., & Williams, B. (2011). Mapping the impact of youth on community development: What are we learning? [PowerPoint slides]. Retrieved from: http://ncrcrd.msu.edu/uploads/files/133/Mapping%20Impact%20of%20Youth%20on%20Com%20Dev%2012-3-10.pdf

Bernstein, D. A., Clarke-Stewart, A., Penner, L. A., Roy, E. J., & Wickens, C. D. (2000). Psychology (5th ed.). Boston: Houghton Mifflin Company.

Buzan, T. (2003). The mind map book. London: BBC Books.

Cooperrider, D. L., & Whitney, D. (2007). Appreciative inquiry: A positive revolution in change. In P. Holman & T. Devane (Eds.), The change handbook (2nd ed., pp. 73-88). San Francisco: Berrett-Koehler Publishers, Inc.

Donaldson, J. (2010). Getting acquainted with free software. Journal of Extension [On-line], 48(3), Article 3TOT7. Available at: http://www.joe.org/joe/2010june/tt7.php

Emery, M., & Flora, C. B. (2006). Spiraling-up: Mapping community transformation with community capitals framework. Community Development: Journal of the Community Development Society 37(1), 19-35.

Eppler, M. J. (2006). A comparison between concept maps, mind maps, conceptual diagrams, and visual metaphors as complementary tools for knowledge construction and sharing. Information Visualization, 5, 202-210.

Greene, J. C. (1994). Qualitative program evaluation: Practice and promise. Pp. 530-544 in Denzin, N.K. and Lincoln, Y.S., eds. Handbook of qualitative research. Thousand Oaks, CA: Sage.

Kollock, D. A. (2011). Ripple effects mapping for evaluation. Washington State University curriculum. Pullman, WA.

Mayne, J. (1999). Addressing attribution through contribution analysis: Using performance measures sensibly. Retrieved from: http://www.oag-bvg.gc.ca/internet/docs/99dp1_e.pdf

Patton, M. (2002). Qualitative research and evaluation methods. London: Sage Publications.

Rasmussen, C., Armstrong, J., & Chazdon, S. (2011). Bridging Brown County: Captivating social capital as a means to community change. Journal of Leadership Education, 10(1), 63-82.

Rennekamp, R., & Arnold, M. (2009). What progress, program evaluation? Reflections on a quarter-century of Extension evaluation practice. Journal of Extension [On-line], 47(3) Article 3COM1. Available at: http://www.joe.org/joe/2009june/comm1.php

Smith, M. F., & Straughn, A. A. (1983). Impact evaluation: A challenge for Extension. Journal of Extension [On-line], 21(5). Available at: http://www.joe.org/joe/1983september/83-5-a9.pdf

Wheeler, R., & Szymanski, M. (2005). What is forestry: A multi-State, Web-based forestry education program. Journal of Extension [On-line], 43(4) Article 4IAW3. Available from: http://www.joe.org/joe/2005august/iw3.php

Resource Link: http://www.joe.org/joe/2012october/tt6.php

Democracy Pays

The U.K.-based Democratic Society, in association with Public-i Ltd., produced a white paper on how democratic engagement can help local government save money in a time of cuts.

Executive summary:

This is a time of fiscal pressure and service cuts. Councils are restructuring services and looking to cut back on non-essential areas of spending. Is democratic engagement one of those areas?

Aside from the moral argument for democratic engagement, there is evidence that investment in strong democratic participation is important if reformed local government approaches are to result in more efficient spending and better-targeted services.

Evidence from self-directed support and personalised budgets shows that involving citizens and users in service provision can produce better-tailored services that operate at lower overall cost.

Where councils need to cut expenditure, high-quality democratic engagement in the budget setting process can provide them with better information, while increasing participants’ opinion of the council.

In countries with a tradition of more participatory democracy, higher levels of participatory democracy correlate with more efficient services and greater willingness to pay tax.

Creating a single architecture for public consultation and engagement can also reduce the cost of duplication in consultation exercises.

If they can create an attractive offer on democratic engagement, councils should be able to realise these benefits: there is a large untapped market of people who want to get involved in their local area, and online democratic engagement tools extend the reach and range of that engagement.

Resource Link: www.demsoc.org/democracy-pays-white-paper-how-digital-engagement-can-save-councils-money/

Direct Download: http://www.demsoc.org/static/Financial-Case-white-paper.pdf