Intentional methods and thoughtful data collection
The CDC (2014) notes that culturally competent evaluations should be sensitive to preferences for nonlinearity in logic model design, integrate multiple perspectives when determining what counts as credible evidence, rely on culturally appropriate data collection instruments, consider cultural and linguistic needs when planning data collection, and ensure that data collection processes are respectful of cultural mores (e.g., communication styles, conceptualizations of time, social hierarchies).
Social science methodologies are socially constructed and centered on the Western white male perspective (Chouinard & Cram, 2019). Culturally responsive evaluation (CRE) originally favored qualitative over quantitative methodologies to allow for the emergence of cultural nuances but now recommends mixed methods (Hood et al., 2015). For the purposes of CRE, the evaluation design does not need to be elaborate; it just needs to be appropriate for the context. Instruments should be culturally valid; that is, they should resonate with the culture of those participating in the program being evaluated (Public Policy Associates, 2015). This may require pilot testing and adaptation to ensure cultural sensitivity and responsiveness. The process of data collection should also be responsive to cultural context. Failure to ensure that those collecting qualitative data are attuned to cultural context may lead to invalid data (Frierson et al., 2002).
Culturally responsive and equitable evaluation (CREE) is an approach that should be infused into all evaluation methodologies. Because CREE draws together culturally responsive and equity-focused evaluation approaches, it emphasizes intentional methodology centered on cultural validity and credible evidence. As part of the CREE Learning Series, Donna Mertens describes CREE methods that include “building of relationships, contextual analysis, having inclusion and diversity, addressing power differences, and focusing on the use of the data for transformative purposes” (Expanding the Bench, 2021, video 2.2).
With regard to Indigenous methods and data collection, Bowman et al. (2015) state, “Tribal communities must not lose sight of the quest to create or attain a culturally responsive evaluation system that embraces their hegemonic ability to dictate the mission, infrastructure, or organizational framework” (p. 354). To that end, Indigenous evaluation addresses issues related to the cultural/traditional context that define the design of the evaluation, including gaining access to the right knowledge and understanding how that knowledge is transmitted (Bowman et al., 2015).
Methods and data collection are designed to place evaluation in the hands of community and staff members to facilitate ownership, enhance credibility, and promote action. Communities are involved in the development of tools designed to help with assessment, planning, implementation, and self-evaluation of their program. All methods, including those that are innovative, are considered viable if appropriate for the situation (Fetterman et al., 2015). Although quasi-experimental designs are considered both possible and probable, experimental designs are not explicitly mentioned; that said, there is no indication that an experimental design would be discounted under the appropriate conditions.
Appreciation for mixed methods has increased in both the international and U.S. domestic contexts. In their seminal publication on equity-focused evaluations in international contexts, Bamberger and Segone (2011) noted that “irrespective of the size and nature of the intervention, an evaluation design which applies a mixed-method approach will usually be the most appropriate to generate an accurate and comprehensive picture of how equity is integrated into an intervention. Mixing qualitative and quantitative approaches, while ensuring the inclusion of different stakeholders (including the worst-off groups), will offer a wide variety of perspectives and a more reliable picture of reality.” International evaluations now almost exclusively rely on qualitative methods such as key informant interviews and document review, as quantitative surveys can be expensive and contribute to assessment fatigue.

In both international and U.S. domestic contexts, “existing methodology may not be sufficient. … through experimentation and creativity, evaluators can show how using a different set of tools or tests might allow for new questions to be asked and answered” (Dean-Coffey et al., 2014). Dean-Coffey urges evaluators to “move beyond methodological approaches and evaluator demographics to address culture and context, and in so doing, unpack our definitions of evidence, knowledge, and truth, so that we may create new ones grounded in this time, place, and set of intentions” (2018). The Equitable Evaluation Framework™ (EEF) calls for consideration of history, location, power, voice, relationship, time, plasticity, and reflexivity to ensure validity of evaluation findings (EEI, 2020; EEI & GEO, 2021).
Evaluators should use mixed methods to capture the complexity of the problem under study (Mertens, 2020). “The transformative methodological assumption includes the concepts of inclusion of the full range of stakeholders in the process of decision-making about the evaluation methods, conduct of contextual analyses that identify cultural factors and issues of power as a basis for building trusting relationships, use of mixed methods to capture the cultural complexity and the data necessary to be appropriately responsive and inclusive of diverse stakeholders, building on community assets, and designing the evaluation to challenge assumptions that impede progress toward positive social change. … a mixed methods approach has the potential to provide a fuller picture of the complexities of problems, as well as the design, implementation, and determination of the effectiveness of solutions” (Mertens, 2016). Mertens proposes a cyclical model of mixed-methods research “as a way of involving the community in research decisions and the collection of data that can be used for social justice purposes” (Mertens, 2007; Mertens, 2009).
© 2022 SLP4i and The Colorado Trust, authored by Katrina Bledsoe, Felisa Gonzales, and Blanca Guillen-Woods. This work is protected by copyright laws. No permission is required for its non-commercial use, provided that the authors are credited and cited.
For full citation use: Bledsoe, K., Gonzales, F., & Guillen-Woods, B.* (2022). The Eval Matrix™. Strategic Learning Partners for Innovation. https://slp4i.com/the-eval-matrix
*These authors contributed equally to this work with support from the Annie E. Casey Foundation and The Colorado Trust.