Wednesday, 07 October 2015 10:42

Program Evaluation: What keeps you up at night?

Written by  Gail Vallance Barrington

Anyone who has commissioned a program evaluation knows what it is like to have a sleepless night. The reason is simple: you are worried about having lost track of the original question. The hyper-caffeinated effect is made worse by the slow and measured pace of the evaluation itself. Staring at a ceiling you can't see, your mind sifts through the research questions, survey questions, interview questions, and analysis questions. Has the focus blurred, or is the core question lost altogether? What can you do about this?

With thirty years in the field as an independent evaluator, I have come to believe that sometimes the evaluation serves the research better than the program. This can be due to poor communication between evaluator and client. Certainly the hands-off approach preferred by some managers does not foster success. A manager who says “Just give me the bottom line” is courting disaster.

The disconnect between what you want and what you get can be avoided. This need not take a lot of your time, nor should you play the role of the researcher. Instead, if you develop a high-level, ongoing relationship with your evaluator, you can ensure that the evaluation stays on track.

Here are some strategies:

Clarify the Policy Question in the Design Phase

Make sure the evaluator understands your policy question and the type of recommendations you need. Are they about:

  • Program improvement?
  • Sustainability?
  • Expansion?
  • Future funding?
  • Closure?

Policy questions are complex and difficult to unpack, so make sure all the relevant topics are articulated. Don’t just wait for them to show up in the research findings.

Determine the Type of Evidence Needed, then Match it to Methodology

What kind of evidence will support your decision? Will hard numbers be persuasive? Do you need to demonstrate, in a narrow sense, some kind of cause-effect relationship? Is your question essentially how much, how many, or whether? If so, a quantitative approach should produce the right information. On the other hand, if you really want to understand a phenomenon, why it happens, and how it affects those involved, your question is more likely why or how. In this case, you need a qualitative approach.

And there is a third option. If you need to ask both kinds of questions, a mixed methods approach is best. This requires more thought because the sequence of research activities is important. Do you have a qualitative question which will highlight topics for more focused numbers or is it the reverse — quantitative data that needs further explanation and description? Whatever approach you choose, make sure that the policy question drives the methodology — not the other way around.

Review the Time Planned for Evaluation Tasks

Take a look at the evaluator’s proposed list of tasks. Extensive time may be allocated for tool design and data collection but data analysis and report writing can be collapsed into a couple of weeks. The evaluator may be trying to respond to the parameters of the Request for Proposals in which too much is being requested in too short a time frame. What this means is that in the rush to produce the final report, you will lose a lot of the meaning buried deep within the data. Be prepared to sacrifice a little on the data collection side. With more time for synthesis, interpretation and reflection, you will get much stronger findings.

Discuss the Findings with the Evaluator

It is important to prioritize the findings once the data has been analyzed. Ask the evaluator to marshal the evidence beside each of your original questions. What are the strongest findings (i.e., most likely to be true across most cases)? Where do the findings tend to cluster? Where are there inconsistencies or gaps? Which findings are likely to be the most politically sensitive? What conclusions fall out of these findings? Is there enough evidence for you to move ahead or do you need further analysis?

Craft the Recommendations with Key Stakeholders

Before the report is finalized, work with key stakeholders on the recommendations. Schedule a special meeting well in advance so they know their involvement is expected. Ask the evaluator to present an overview. Provide participants with thumbnail findings and key topics for recommendations. Have them consider such issues as feasibility, appropriateness, cost, intended and unintended consequences, and political impact. Ask the counterfactual: what happens if nothing is done? Once a draft list of recommendations is developed, consider the language carefully and re-check with stakeholders as needed. It is better to resolve concerns about the recommendations before the report is finalized than afterwards.

If you really want the evaluation to have an impact, make sure your schedule allows you to invest time and interest in the process so that the results are truly useful. You will sleep better too!


Gail Vallance Barrington is a Credentialed Evaluator who has conducted over 130 program evaluation studies since founding her consulting firm, Barrington Research Group, Inc.

Last modified on Wednesday, 07 October 2015 10:52





Copyright © 1995 - 2017 1618261 Ontario Inc. O/A Navatar Press