Collaborative Approaches to Evaluation. Group of authors
this research is that evaluation policies appear to lean heavily toward supporting accountability-oriented approaches to evaluation. Yet, there is evidence to show that policies and practices that privilege learning as a central and desirable function of evaluation are more likely to connect with organization and program community personnel (Al Hudib, 2018). In our opinion, it would be entirely worthwhile to consider augmenting the content of evaluation policies with due treatment of learning-oriented approaches, CAE being exemplary in this respect. It is likely to be through the direct experience of success with the evaluation that organizational actors will become more willing to embrace evaluation as leverage for change.
Translating and Applying in Cross-cultural Contexts
Many international events have underscored the rapidly growing global interest in evaluation. One such series of events, sponsored by EvalPartners, was held in 2015, the International Year of Evaluation,9 and was intended to raise awareness and foster organizational and individual capacity building on a global scale. Much of the work of international development evaluation, as we have observed above, has been heavily weighted toward the interests and needs of bi- and multilateral donor agencies as well as public sector governance institutions. Yet there is growing interest in evaluation field building (e.g., Hay, 2010), which implicates the engagement of a much wider range of stakeholder interests in evaluation. We are inclined to think that the CAE principles could help to move this field building agenda forward. Doing so would require the official translation of the CAE principles into languages of interest and their retrospective or prospective application to evaluation projects at the local level. As we discuss below, we have already translated the principles and support documents into Spanish and French. Translation and application into other languages and contexts is certainly possible and desirable. A caveat, however, is that the CAE principles reflect a western set of underlying assumptions and ways of thinking; translating them into other languages is one thing, but actually applying the process in quite different cultural circumstances would be quite another.
9 www.evalpartners.org/evalyear/international-year-of-evaluation-2015
In western culture, we often seem to equate development contexts with international development, but of course many of the considerations and principles we have in mind apply equally to indigenous populations in our own jurisdictions. Such contexts provide yet another cross-cultural opportunity to work with and apply the principles. Regardless of whether the setting is international or local/indigenous, applications are not likely to be straightforward given differences in cultural norms. It will be of high interest to see, for example, the extent to which the principles as we have laid them out integrate with indigenous and other ways of knowing.
Conceptually Framing Research on Evaluation
Some time ago, an extensive review of 121 empirical studies on CAE found that the vast majority took the form of reflective case narratives (Cousins & Chouinard, 2012). The review was later extended to CAE in development contexts, with similar results (Chouinard & Cousins, 2015). While reflective narratives offer considerable value for understanding complex psychosocial phenomena such as program implementation and impact, they are largely unverifiable given the propensity to underreport methods. We have therefore argued in favor of greatly expanding the range of research designs used to gain a better understanding of practice and its implications for growing the evaluation knowledge base. The CAE principles implicitly provide a conceptual framework that may be entirely useful in this regard. We envision the development and use of qualitative, quantitative, and mixed-methods research designs to enable deeper understanding of the antecedents, practices, and consequences of CAE. Of particular interest would be comparative designs, through which observations about the implementation of CAE in different contexts could be systematically compared. Longitudinal designs that chart the trajectory of relationships and other important considerations over time would also be of high value.
In the foregoing paragraphs, we have offered some suggestions about potentially powerful uses of the CAE principles not only to guide practice but to enable deeper understanding about CAE than is presently the case. In our opinion, the principles show great promise to stimulate dialogue and deliberation, analysis, and reflective practice in the field. But of course, the question as to their potential merit remains an empirical one. In the next section, we describe how we went about launching the principles, promoting them globally, and requesting collegial interest in field testing the principles.
The Global Test Drive of the CAE Principles
Rationale
From the point of decision to actually develop and validate a set of CAE principles, we knew that what we would be able to produce would only be preliminary. It will be through ongoing use and reflective, systematic assessment that we can learn about the extent to which the principles are effective and how they might be improved to make them more effective. Here is how we put it on previous occasions:
The principles would not be written in stone, but rather they would be the subject of continuous analysis and renewal through dialogue and systematic inquiry…. Moreover, we would propose that a set of working principles be subject to field testing and inquiry and that such inquiry should be, in and of itself, collaborative. (Cousins et al., 2013, p. 19)
Our sense is that the principles, when used as a set to guide and reflect on collaborative practice holds strong potential for enhancing the success of such evaluations, and we encourage ongoing, well-documented field trials to confirm this hunch…. It is our conviction that the principles require solid test driving opportunities, and they should be revised and perhaps reengineered sometime not too far down the road. (Shulha et al., 2016, p. 213)
To paraphrase what we said earlier in the chapter, laying out proposals favoring specific courses of action comes with a certain amount of risk. That is to say, it is one thing to come up with direction for the field, but it is quite another to walk the talk. This book is our attempt to do just that: to make good on a commitment to test driving the principles in a range of contexts around the globe, and to do so through the collaborative involvement of many of our evaluation colleagues. In this section, we describe our launch of the principles, our efforts to promote them, and the global call for empirical field studies to test the principles in action.
Promotion and Launch
In the latter stages of development and validation of the initial set of principles, we began promotional activities in various locations at home and abroad (see Appendix 1). In January 2017, we officially launched the CAE principles in English, French, and Spanish through a wide range of networks and channels. As shown in Appendix 2, the launch included two appended documents: (i) a brochure-style document giving a descriptive overview of the principles, suggestions for their use and application, and contact information for requesting further details; and (ii) an indicators document, which also provided a descriptive overview of the principles along with the actions and indicator questions listed in Table 1. Both documents were translated from English into French and Spanish.
Call for Field Studies
Along with the launch, which encouraged evaluators and evaluation community members to use and apply the principles in practice, we simultaneously issued a call for proposals for field studies. The text of the call was foreshadowed in e-mails and listserv postings (see Appendix 2), where we provided a link to an online fillable proposal form (see Appendix 3). The text of the call provided background information, a rationale for the call, details about the peer-review and publication process, suggestions about content focus including a list of research or field test questions of interest, and finally details about proposal format and evaluation. The call