Free Resources and Tips for Evaluation Planning

Our clients do fabulous work in their communities!

So, it's no wonder they want to tell the story of their achievements. We love using evaluation to reflect on what happened, how it went, and why it matters. This way, we can capture wins and encourage our clients to refine their work to stay relevant in changing contexts.

But, evaluation can be tricky. This blog post addresses four common mistakes organizations make when starting an evaluation project.

To counter mistakes, we offer tips from our practice, free resources, examples, and reflections on how we conduct program evaluations at AND implementation.


1. Set Clear Goals and Ask Corresponding Questions

It is great for organizations to be curious about their work and set ambitious goals. But, too often, organizations scope unmanageable evaluation projects and set themselves up for disappointment.

The best evaluation is timely, feasible, and directly linked to use.

To stay on track, we invest time and energy upfront to craft meaningful goals and questions. This helps us to identify clear next steps and avoid getting distracted by shiny - but irrelevant - data.

Setting Goals

To set meaningful goals, we ask our clients to imagine where they would like to be by the end of the project.

What will they be able to do next? How will they use the evaluation? Who will be making decisions with them?

Once we know where we are going, we refine aspirational statements into a few clearly articulated and achievable goals. Sensible and specific goals help to articulate the scope of the project.

Crafting Questions

After we have clear goals, we get nerdy with our clients. To achieve those goals, what do we need to find out? To craft questions, we ask clients to think about what they want to learn or discover during the evaluation.

Once we know the information we need to make informed decisions, we craft questions that satisfy curiosity and act as a gift to our clients' future selves.

Sometimes, it takes a few tries to set meaningful goals and ask relevant questions - especially when working in large teams. That's okay! Investing time at this stage always pays off. If you're not sure where to begin, below is a handy planning guide to help you get started.

Evaluation Planning_ANDimplementation 2023
Download DOTX • 48KB

2. Start at the Right Time

It is critical to start an evaluation at the right time. If the evaluation begins too soon, relevant information won't be available yet. But, if the evaluation begins too late, data may no longer be available. To find the right starting point:

  1. Identify which information is most relevant to the evaluation question(s) (e.g., details about specific program activities or data about expected outcomes)

  2. Map that information to the time point at which it will become available (e.g., short-term outcomes should be observable immediately after a program ends, whereas longer-term outcomes may be observable after a few years)

For example...

Suppose Jennica and I are conducting a summative evaluation of AND's flagship course: Arts-Based Methods for Evaluators. We may want to understand (a) whether students gained the skills and knowledge we expected (shorter-term outcomes) and (b) how applying those skills influenced their practice (longer-term impacts).

To understand what students gain from each session, Jennica and I can collect feedback after each class while the information is fresh. This will let us act on feedback and adapt subsequent sessions to better achieve learning goals for students. To understand longer-term impacts (the "so what" of our course), however, we will have to be patient and wait until the course ends and everyone has had a chance to apply new skills and information in their practice. But, as time passes, former students will be less likely to respond to requests for input and may struggle to attribute learning or changes in their practice directly to our course. So, it will be prudent to identify the sweet spot for feedback and plan data collection accordingly.

3. Use The Right Technique for the Context

Just because it worked last time does not mean the same technique will work again. To keep evaluations fresh, adapting to each new context is vital. This means preparing to try new things and moving away from a template or "drag and drop" approach. There are many wonderful approaches and techniques to experiment with in every evaluation. Here are just a few examples of ways we practice:

Appreciative inquiry, arts-based methods, case study, developmental evaluation, empowerment evaluation, most significant change, outcome mapping, principle-based evaluation, theory-based evaluation

There are fabulous free resources available from BetterEvaluation (part of the Global Evaluation Initiative). No, this is not a sponsored post, just an authentic appreciation shout-out.

4. Keep Track of Available Data

Most organizations need help managing and using data effectively. No judgement! Often, organizations have too much data and find it overwhelming to sift through large datasets. Meanwhile, organizations that do not think they have any data often have more than they realize, but struggle to identify and use what they have.

So, Jennica and I spend time taking stock of available data before we develop strategies to collect more. This way, valuable information that already exists does not go to waste, and we can create simple data collection strategies to be sustained after the evaluation ends. Below is a free resource to help organizations identify what they have.

Evaluation Data Inventory AND implementation 2022
Download DOTX • 51KB


There is no one right way to practice evaluation. Everyone has their own style that works for them. Whether you are an organization or a fellow evaluator, remember to start from your strengths, anchor your project in the sweet spot for available information, and stay true to your goals.

If you have questions about an evaluation or any of the resources shared, feel free to send us an email at

Subscribe to our blog to get new posts delivered right to your inbox!
