
Wednesday, 19 February 2014 09:56

Purposeful performance stories or pointless propaganda?

Written by Robert P. Shepherd    Volume: 20  Issue: 2

On December 3, 2013, the Performance and Planning Exchange held a workshop entitled "Putting Performance Reporting to the Test: The 2013 DPR Rating Bee." It was an opportunity for students in the Diploma in Policy and Program Evaluation program at Carleton University to put their evaluative skills to the test.

They were asked to assess five Departmental Performance Reports (DPRs) through two lenses: first, two of the 13 criteria (reliability and validity, and relevance) from the Guide to Preparing Public Performance Reports (May 2007) developed by the Public Sector Accounting Board of Canada (PSAB); and second, whether, overall, they thought the DPRs told a coherent performance story, drawing on John Mayne's excellent discussion paper for the Office of the Auditor General of Canada and on the guidance in the September 2010 Report of the Standing Committee on Public Accounts, in which the committee made recommendations for improving departmental performance reports.

Mayne sets out five elements of a coherent performance story: i) sufficient context to allow readers to understand the report, including a results chain and the risks faced; ii) a description of planned spending linked to expected outputs and outcomes; iii) a comparison of expected results with actual results; iv) an indication of lessons learned and of how the organization will benefit from them in future iterations of programming; and v) an indication of whether steps are in place to ensure the quality of the data used in the report.

The students made several important observations; here are some of their major themes.

First, in terms of Mayne's first element, there is no clear sense of purpose for these reports. The result, according to one student, is an "incoherent jumble of data streams that are not linked in any understandable way to plain language outcomes." The reports ought to inform public debate about governmental action, in addition to accounting for the expenditure of public funds under the Financial Administration Act.

Second, the reports do not present a vision and mission that tells the reader where programming is aimed. According to another student commenting on Mayne's second element, the reports "provided a ton of information on political tactics…which is not all that important to [citizens]. We need to start talking about high level policy goals and objectives that are relevant and agreed upon by all stakeholders." These reports demonstrate little coherence on overall progress toward public policy objectives such as reducing poverty or crime.

Third, the reports lack evidentiary rigour. The measures tend to be at the activities level and do not shed light on results or outcomes. According to one student commenting on Mayne's third and fourth elements, "a review of the performance indicators…does not yield much useful information on how [the department] used the information to inform decision-making and shape programs." That is, the links between activities and results are unclear, which also leads one to question whether the links between planned and actual spending are reliable and valid. The absence of results chains, logic models, or links to vision statements shows a clear divergence between departmental activities and collective progress toward coherently expressed results.

Fourth, given the political climate of the day, all of the reports examined present a good-news story that the limited evidence does not support. According to a student commenting on Mayne's fifth element, "most of these reports spend more time describing the few specific things they are doing well, while purposely avoiding shedding any bad light on what they are not doing well."

Several improvements were put forward by the students, including:

• Preparing separate sections and messages for specific audiences, including citizens;
• Greater use of third-party assessors, within and outside government, to gauge report accuracy;
• Coherent guidance for departments on how to prepare DPRs in a standard format;
• A stronger focus on results and on progress toward long-term policy outcomes, in a coordinated and multi-jurisdictional way; and
• Plain language grounded in valid, evidence-based performance stories.

If students trained in evaluation have difficulty with the limited rigour, coherence and readability of these reports, one can only imagine what the general public must think. The assumption is that Canadians want to engage in public policy conversations in an informed way. That we have a vehicle such as the DPR is a positive contribution, but in their current state, one must question the reports' value for democratic discussion. If this workshop told us anything, it is that we can collectively do much better.

Robert P. Shepherd

Dr. Robert P. Shepherd is associate professor in the School of Public Policy and Administration at Carleton University and co-Chair of the Consortium of Universities for Evaluation Education.

2 comments

  • Posted by Palmiro Campagna, Thursday, 27 February 2014 14:52

    Mr. Levesque raises an excellent point in terms of which departments are being reviewed. Departments that are operationally driven, vice program driven, are, I would submit, very difficult to evaluate using socio-scientific principles, as one gets outputs more so than outcomes. For example, as one of the biggest spenders, Defence purchases equipment primarily for its own use and for a cause that presumably benefits all Canadians as a whole, not to effect specific social changes as many program-driven departments do. What then is the outcome?
    On the other hand, looking at it from the students' perspective, certain government policy on program reporting seems designed to force-fit every department into the program-driven model, or at least that may be how such policy is being interpreted and put into force. If that is the case, then the result may come across as a hodgepodge of bewildering facts and figures with no outcomes or logic models to speak of.
    Perhaps, then, what the students' analysis is really telling us is that the definitions of a program, program spending, reporting and so on, as noted in policy documents, need to be clearer (a point also noted recently by the OAG) and more explicit, so as to differentiate between program-driven and operationally driven departments. Make no mistake, the latter can still be measured and assessed, but one would likely use performance audit principles more so than evaluation methodology, and the resulting reports just might make more sense.

  • Posted by Eric Levesque, Friday, 21 February 2014 10:20

    A few comments, if I may:
    (1) The students looked at five DPRs without additional context being provided: social departments (e.g., aboriginal affairs, public health)? Internal services (e.g., treasury board, finance, public works)? Service to Canadians (e.g., human resources, Canada revenue agency)? Security focus (e.g., defence, public safety, Canadian border)? Policy/regulatory heavy (e.g., immigration, agriculture)? Each sphere will be in a position to show results (short/medium/long term) differently.
    (2) The author didn't provide enough context around the validity of the criteria chosen for analysis. Why those criteria? Are they applicable to all departments and all performance reports?
    (3) By focussing solely on students' comments that are "negative", the author fails to provide a complete picture (the good and the bad), which is one of the criticisms levelled at the DPRs themselves. It could be said that the series of choices made, from the title of the article to the student comments, makes this article "useless propaganda".
