
Friday, 28 November 2014 08:50

Of the essence: Coordinated training for government evaluators

Written by Robert P. Shepherd    Volume: 20  Issue: 9

There has never been a time since the introduction of program evaluation into our public services when training evaluators has been more important.

The problems in the field of Canadian evaluation are significant. The first, and not the least, is the field’s continued inability to deliver on its promise to provide real evidence of program effectiveness. The second is the challenge of providing consistent and rigorous training to evaluators in a way that meets the competencies identified by the Canadian Evaluation Society (CES) and, most recently, by the Treasury Board of Canada for federal evaluation.

With regard to the first problem, the Auditor General reported in 2009 that departments were ill-equipped to evaluate all of their programs over a five-year period as required under the new Evaluation Policy. More important, the AG reported that departments fell well below their coverage targets, and that “the rate of coverage was even lower because many of the effectiveness evaluations we reviewed did not adequately assess program effectiveness.” These conclusions were reinforced in the fall 2013 report, which concluded that federal departments had made some progress toward meeting their coverage targets, but that, again, “program evaluators noted constraints in their ability to address program effectiveness” (s.1.84).

Scholar Donald Savoie has written that despite significant federal investment in evaluation, the “contribution to the government’s policy and decision-making was negligible” and that “little has changed…apart from the fact that more money and staff are now being dedicated to the evaluation industry.” Most damning, however, is his contention that managers and decision makers attach little currency to evaluation as a means of valuing programs appropriately and realistically, owing to various conceptual problems in the field.

I argue that federal evaluation, at least, neither engages with the right questions nor has the capacity or training to address those questions appropriately, especially questions of effectiveness, leading to major concerns about the very legitimacy of the federal evaluation function.

These effectiveness challenges are perpetuated by the second problem. The 2009 Auditor General report states that, “despite having increased funding and staffing, the audited departments found it challenging to hire enough qualified, experienced evaluation staff to meet the needs for effectiveness evaluation.”

Funding for federal evaluation has since dropped considerably, but the capacity problem has always loomed large. In fact, we know that of the approximately 500 federal internal evaluators, at least half have fewer than five years of experience, and most of that experience is in project management, not actual evaluation research.

The combination of a lack of effectiveness evidence and inexperienced evaluators has resulted in a pervasive lack of confidence in the function and the field. Some of this can be attributed to a flawed federal policy, but in fairness these are field-wide issues. The pressure on the field to provide evidence-based analysis to guide public policy mounts as provinces, territories and municipalities turn to evaluation as a way to guide budgetary decisions.

The challenge with our approach to training is that there is no approach: it is an assemblage of multiple actors working at cross-purposes in an open market. These professional bodies, private and non-profit sector entities, and quasi-academic and academic institutions each adhere vigorously to their own mandates and approaches. When such actors have come together to share their expertise, it has led to important improvements, including the CES Credentialing Program. But without a Canadian vision for addressing the growing demand for evaluation training, we are inviting others to step in where we are divided, with products that cost much and offer little.

As Canada inches toward the next federal election, Canadians will be asking what the record has been on some important matters, including social health, the environment, employment and the economy. When governmental evaluators are asked what they’ve contributed, can they say more than simply that they found “savings” in their existing programs?

Our governmental evaluation units have obsessed over cost reductions long enough: it is time to get back to the knitting and answer the questions Canadians actually care about, including whether our tax dollars have been used to resolve public problems.

Getting qualified evaluators who understand this into our governmental internal units must be the first order of business. Providing a competency-based, coordinated, and Canadian approach to training is paramount if this is to happen.

The status quo will not hold: no one, including evaluators themselves, has the patience for it any longer, and the longer this drags on, the greater the disservice to us all.

Robert P. Shepherd

Dr. Robert P. Shepherd is associate professor in the School of Public Policy and Administration at Carleton University and co-Chair of the Consortium of Universities for Evaluation Education.
