Agulhas has increasingly taken a collaborative approach to evaluations with client organisations. The aim of working closely with our partners is to ensure that any recommendations resulting from our findings are meaningful to their people, grounded in an understanding of their unique challenges, and can therefore support the sustainable changes they wish to see in their organisation.
How do we achieve collaboration with clients and users of evaluation products, while maintaining our principles, remaining independent, and achieving our project goals? Here are some tips from Agulhas to enable meaningful collaboration within evaluation and research projects.
Why is collaboration in evaluation important?
The results, conclusions and recommendations of evaluations can have lasting impacts in many areas, including budgets, recruitment and an organisation’s overall direction. They can, and should, be an essential part of an organisation’s strategy, feedback and accountability tools.
However, evaluations are not apolitical. They do not exist in a vacuum; they are a product and reflection of an organisation’s inner workings, its people, and the wider sector. Collaborating with organisations to understand the internal and external dynamics within which they work, as part of an evaluation process, can lead to more sustainable change. It is important to understand the specific contexts an organisation works in and the challenges it faces, in order to make an evaluation more insightful and impactful. In this blog we set out both why collaborative evaluations are worthwhile (in terms of buy-in, relevance, impact and sustainability) and how to go about ensuring a collaborative process.
Start small and start early
Collaboration in an evaluation or research process can start small, but it must be ingrained from the very beginning. Agreeing terminology in regular meetings is essential, so that everyone is working to the same definitions and shared fundamental understandings. This is particularly important when it comes to maintaining principles such as diversity, equity and equality: these terms are often vaguely defined and inconsistently used, and different people attach different meanings to them, which can lead to misunderstandings.
It might even be possible to recommend changes to the proposed methodology during inception and design meetings. Suggesting changes or clarifications, without pressuring an organisation, may lead them to reach new, potentially more impactful, questions and methods for understanding the proposed research or evaluation. Engaging with key stakeholders as early as possible on these fundamentals can ensure a collaborative approach from the very start and help avoid misinterpretations or challenges further along in the process.
Build relationships based on mutual trust
Managing changing political climates and internal politics can be challenging for both an organisation and an evaluator. Developing a good working relationship is essential. Fostering even a little buy-in early on with decision makers and project leads will help make any resulting evaluation recommendations more actionable.
Consider using collaborative workshops with stakeholders from a variety of levels, regions and roles within an organisation, ahead of presenting final conclusions and recommendations. This ensures that a diverse group of voices can provide feedback on suggestions and on the challenges specific to their organisation, and it shares out responsibility for actions and accountability for driving those actions forward. As a result the recommendations, collectively refined and prioritised, are truly actionable. Bring the organisation ‘on the evaluation journey’, rather than just presenting findings at the end, to ensure organisational buy-in and follow-through, making a lasting impact after the evaluation is published.
Give people ownership within the process
Our role as evaluators may differ from assignment to assignment. Sometimes it is to give our client the space to consider what they really need to do to enable the change they want to see. A key question we ask ourselves is ‘how do we ensure the organisation owns and fully supports our findings?’. It is essential that we get feedback on the solutions, not just on the evidenced problems we uncover in research.
Collaborative mixed methods, supported by practical collaborative learning tools such as Padlet and Miro, can help organisations identify their own findings and lessons learned. In a recent project, our team presented the emerging themes and preliminary recommendations to the client as part of a facilitated series of workshops, conducted via Teams and across multiple time zones, to incorporate the feedback and voices of staff from across the programme. These recommendation workshops served as a vehicle to galvanise buy-in and agree the practical steps that would enable staff to put each recommendation into practice. Participants were placed into breakout groups, allowing them to engage in rich discussion and begin the process of assigning ownership for proposed recommendation areas.
Language and accessibility considerations are key in these workshops, particularly when working internationally. These include adapting online platforms to the internet bandwidth available, providing explainers for online tools ahead of the workshops for those who are less familiar with them, and ensuring workshops are run in different languages where possible.
Apply strong evaluation principles to ensure independence and equity
As independent evaluators we are often in a unique and privileged position, being able to listen to the views of a diverse range of stakeholders. This comes with a responsibility to ensure we uphold principles of equity in our partnerships, creating safe spaces and making room for all voices through an inclusive approach.
Core to this approach is ensuring that the people we engage through focus group discussions, interviews, workshops and surveys are as representative as possible of the organisation (i.e. not limiting our interactions to senior staff, or to staff at HQ). An inclusive approach means speaking with staff across pay grades and locations, to ensure that the nuances of the organisation or project being evaluated are captured. We are also mindful of the power dynamics that are inevitably at play. For example, during collaborative sessions at client level, we ensure that participants are grouped with peers, helping create a comfortable environment for participation. The level of detail to be shared in the final report is clearly outlined at the start of the session, and it is clarified that participants will have anonymity. Care must be taken that this anonymity holds in practice: for example, ensuring that participants’ identities cannot be gleaned from descriptions of their work, location or other characteristics that only a few staff members might share.
Underpinning these methods is a strong foundation of research ethics principles. Agulhas promotes the offering of anonymous feedback platforms and provides information on support for those impacted by the process (which is particularly important for evaluations of projects focused on sensitive topics, such as safeguarding).
Diverse teams are successful teams
It is not uncommon to see evaluation teams with very similar backgrounds and experience. While historically this may have been seen as a strength, a diverse team with varied skills and expertise lends strength and credibility to an evaluation process and its outputs.
Evaluation teams with flatter structures can value the inputs of leaders, experts and analysts alike, allowing for creativity, innovation and novel thinking. These teams may include experts in the field of the evaluation alongside experts in other areas, to avoid bias and to ensure balance, geographical understanding and attention to nuances that can easily be overlooked. Involving third parties through quality assurance and peer review processes ensures the process and recommendations are of the highest standard and based on up-to-date thinking within the field.
Don’t reinvent the wheel
Using an existing framework to structure analysis can help maintain independence and methodological rigour. Standing by your principles at every point of the process, and guiding your partners and clients along the evaluation journey can ensure they understand the process, have the opportunity to provide context and insight during the development of an evaluation, and ultimately support a collaborative, open and transparent working relationship.
Collaboration is a work in progress
When it comes to collaborative evaluations, one small ripple can make a big wave. Even starting small, such as agreeing definitions and language with your partners and clients, can begin a process of cooperation throughout a project, avoiding misunderstandings and, hopefully, challenges to the findings and recommendations that are meant to help an organisation improve. With just a little buy-in from stakeholders, encouraged by giving them space to consider what they really need to do to enable the change they want to see, we can see a shift in mindset when it comes to actioning recommendations. With a collaborative approach to evaluation, the output is theirs to own.
All our tips at a glance:
- Start small and start early
- Give people ownership within the process
- Build relationships based on mutual trust
- Diverse teams are successful teams
- Apply strong evaluation principles to ensure independence and equity
- Don’t reinvent the wheel – stand by your principles and use existing frameworks
- Collaboration is a work in progress – one small ripple can create a big wave