“…evaluation tells us what works and what does not in a project or policy”. This simple description from the UK Evaluation Society (UKES) makes clear the vital role that evaluators worldwide play in assessing the impact of, for example, public and private investments on people’s lives. Despite this, the evaluation profession is still developing formal qualifications, and ‘anybody’ can claim to be an evaluator, especially researchers, which is how many of us, myself included, started out. There are few suitable evaluation degree courses and rather more short courses, but too few integrate the reflective practice that is so critical to becoming a skilled evaluator.
Two organisations that promote reflective practice grounded in research and evaluation experience, as well as international development practice, are UKES and the Institute of Development Studies (IDS). IDS staff affiliated with the Centre for Development Impact (CDI) have recently joined UKES as institutional members, and the two organisations are exploring how reflective practice can be strengthened in evaluation.
Reflective practice is a means of studying personal experiences in order to build the confidence to strengthen professional and personal skills. The UKES Voluntary Evaluator Peer Review (VEPR) has this focus at its core, and a new round has just been announced. Within IDS, reflective practice, which forms an important part of our MA Power, Participation and Social Change degree, addresses self-awareness of how socially derived knowledge and values shape relationships and underlying structures in research and teaching. But unless reflective practice leads to something tangible, it remains an abstract concept in people’s minds. The VEPR’s endpoint is therefore a plan of action to do things differently as a result. This can be difficult, as the newly emerged reflective evaluator faces entrenched institutional barriers. But in the process, the focus of the profession shifts, making it more effective.
Benefits of the UKES Voluntary Evaluator Peer Review
Last year I undertook the VEPR: a structured peer review by evaluation practitioners, for evaluation practitioners, to develop professional practice and competency. UKES has developed a Framework of Evaluation Capabilities covering three areas: Evaluation Knowledge, such as the social and political uses of evaluation, designs, approaches and methodologies; Professional Practice, which includes managing evaluations, interpersonal skills and upholding values; and Qualities and Dispositions, such as being adaptable, exercising rigorous and fair judgements, independence of mind, and integrity. These anchor the VEPR process. The evaluator brings together a set of practical skills (methods of data collection, analysis and reporting), a theoretical approach (the evaluation model used) and the interpersonal skills required to put these into practice. Self-reflection is the process that merges them.
The VEPR addresses some thirty capabilities in two ways: first by asking candidates to grade themselves, with evidence, referring to education, training or work experience; and second by choosing two issues that relate to specific capabilities and producing a portfolio of work. My choices were, first, ‘designing an evaluation appropriate to the task’ in the context of short summative (ex-post) evaluations, and second, ‘enabling the role of formative (ex-ante) evaluations to strengthen programme design’, which covers the evaluability of the programme at design stage. The portfolio of work was reviewed in a two-hour discussion with two senior evaluation professionals. It was not an examination (although I now have a nice certificate on my wall) but an opportunity to talk through professional areas for improvement using this structure.
My interest in evaluation design stems, first, from the limits imposed by available resources and, second, from the need to make programmes ‘evaluable’ (the extent to which an activity can be evaluated in a reliable and credible fashion); the two are interconnected.
Making room for participation in evaluation
The two main conclusions from my VEPR were, first, that when resources are limited, managerial issues come into greater play in the choice of method, and the evaluator cannot create a space for people to express themselves: there is little room for participation, something explored in greater detail in IDS’ Participatory Monitoring and Evaluation for Learning professional development short course. The second conclusion was that formative evaluations also bring designers into the evaluation process, and I was advised to review developmental evaluation processes which support innovation (as championed by Michael Quinn Patton) and the different roles of designers and evaluators, issues that are again explored in a course on Contribution Analysis for Impact Evaluation offered by IDS colleagues. I was struck by this close intertwining of evaluation and managerial issues, with power dynamics remaining submerged between evaluation commissioner, evaluator and evaluation participants. In international development, short summative evaluations rest mostly on accountability approaches rather than any open-ended sharing of perspectives and interests.
Importance of power relations in evaluations
The VEPR uses reflective practice to examine work experience and to reflect on what went well, what was challenging, what conclusions can be drawn and what can be learned. It then explores what the evaluator can do with this learning. The work of IDS incorporates power analysis into reflective practice by looking at personal experiences of power and how these are shaped by identities, values and world views. In development studies, power relations are integral: understanding how power works in society, whether as a productive or an oppressive force.
Reflective practice that directly addresses power dynamics requires a process of re-constructing our mental maps. As powerful as this process can be for the learner, it does not necessarily lead to change. IDS’s work calls for more innovative learning practices which stimulate both the conceptual and rational review of working perspectives, and also the experiential, making sense of personal experiences of power and realising personal capacity to shift power. An evaluation may appear rather technocratic, but it is a value-laden demonstration of ‘us’ and ‘them’ with contradictory roles and power imbalances. As UKES embarks on its new round of VEPR, candidates and reviewers should be more alert to the issues of embedded power.
With the demand for evaluators rising, the growing collaboration between IDS and UKES has the potential to expand the opportunities for mutual learning across UK and international development sectors and build a community of skilled and engaged practitioners.