Guide for the Review of Evaluation Reports

Archived information

Archived information is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.


 

Centre of Excellence for Evaluation
Treasury Board of Canada Secretariat

January 2004


 

1. Introduction

The purpose of this guide is to provide analysts from the Centre of Excellence for Evaluation (CEE) with general guidelines on how to approach the evaluation review process and how to provide feedback to TBS Program Sector analysts and departmental program managers.



 

2. Overview of Treasury Board Secretariat and Evaluation

The Treasury Board Secretariat (TBS) is dedicated to helping the Government of Canada manage its human, financial, information and technology resources prudently and in a manner that best supports government objectives and priorities.  TBS recommends and provides advice to the Treasury Board (TB) on policies, directives, regulations, and program expenditure proposals.  The Comptrollership Branch is an integral part of the stewardship business line dealing with the appropriate management of government resources.  The Results-based Management (RBM) Directorate falls within this business line and supports policy work to help departments generate accurate and timely performance information, in order to help managers make sound and effective decisions on their policies, programs and initiatives.

The CEE is part of the RBM Directorate and provides guidance on the appropriate application of the TB Evaluation Policy across federal government departments subject to it, as well as departments whose programs have evaluation requirements related to the TB Policy on Transfer Payments.  The Evaluation Policy supports the generation of accurate, timely, objective and evidence-based information to help managers make sound, more effective decisions on their policies, programs and initiatives, and through this provide results for Canadians.

The Transfer Payments Policy (TPP) further reinforces the Evaluation Policy with the objective of ensuring sound management of, control over and accountability for transfer payments (grant or contribution payments).  In accordance with the TPP, departments are required to assess and report back on the effectiveness of the transfer payments through a formal program evaluation or similar review when requesting a renewal of Terms and Conditions for the program, policy or initiative. The TPP also requires the development of a Results-based Management and Accountability Framework (RMAF) as a component of TB Submissions involving transfer payments.

Ultimately, these policies and requirements support the TB Submission process.  A Treasury Board Submission is an official document submitted by a Minister on behalf of his/her department when a new program is being developed or an existing program is being renewed.  The Submission is used to seek approval(s) or authority(ies) from Treasury Board Ministers where there is a requirement for TB approval before a proposal can be implemented.

The TB submission process requires the coordination of several areas of expertise within TBS, as follows:

  • Program Sector Analysts provide advice and monitor departmental activities.  They are the single window of contact for departments with respect to Treasury Board Submissions and Terms and Conditions.  They manage this process, call on policy expertise within TBS for assistance on Submissions, and make recommendations to TB Ministers based on their consultations with their TBS colleagues in other areas of responsibility.
  • The Financial Management Policy Division (FMPD) is responsible for the provision of policy advice and interpretation, as well as the review and approval of Terms and Conditions associated with TB Submissions for programs with transfer payments.
  • The Centre of Excellence for Evaluation (CEE) provides advice and guidance with respect to RMAFs and reviews evaluation reports that accompany TB Submissions and associated Terms and Conditions.  Since the RMAF is part of the Terms and Conditions of a program being considered for renewal, the CEE analyst signs off on the RMAF.  Further, as part of policy monitoring requirements, the CEE also reviews departmental program evaluation reports that accompany a TB Submission, to ensure the validity of the conclusions made therein, in support of decision-making related to the TB Submission.
  • The Centre of Excellence for Internal Audit (CEIA) provides advice and guidance with respect to Risk-based Audit Frameworks (RBAFs), approves them, and reviews internal audit plans and reports developed by departments.  The CEIA is responsible for the implementation of the Internal Audit Policy (http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/ia-vi/pia-pvi_e.asp).


 

3. Purpose of Reviewing Evaluation Reports

Section 7.3.7 of the TB Policy on Transfer Payments, which deals with the approval of program terms and conditions, states that "departments must assess, through a formal program evaluation or similar review and report back on the effectiveness of the transfer payments when requesting renewal of terms and conditions".  The expectation is that an evaluation or review must focus on the effectiveness of the program in question, through an assessment of relevance, progress/success at achieving program objectives, and cost-effectiveness.  Evaluation results should inform future decisions related to program design and implementation.

TBS supports the practice of evaluation in departments by providing advice on best practices, setting standards and developing tools for effective evaluations, providing workshops and training on RMAF development, and monitoring departmental evaluation capacity.  Departments should use these evaluation resources to inform and guide their decision-making.  The evaluation results should provide credible information that supports decision-making related to the TB Submission.  In pursuit of this goal, the CEE reviews the validity of the evaluation conclusions and ensures that evaluation findings are accurately reflected in the TB Submission.

The review of evaluation reports provides the Program Sector and CEE analysts with insight into how a program is being managed, whether results are being measured, and the progress being made towards planned objectives.  Furthermore, the completion of evaluation reports encourages program managers to implement RMAFs and to incorporate lessons learned in the design and re-design of programs.  The comments resulting from this review are provided to the Program Sector analyst for possible inclusion in the Précis supporting the TB Submission.  This is of particular importance when the CEE identifies contentious issues with respect to the validity of conclusions made in an evaluation report.  It is therefore important that the CEE analyst provide a clear and concise assessment of the evaluation report.



 

4. Evaluation Review Process

This section presents a step-by-step process for the review of evaluation reports.  It is important to note that when reviewing evaluation reports, CEE analysts are not conducting a standard peer review of the study.  As such, the review should not focus on a critique of the methodology used per se, but rather on the conclusions and recommendations drawn from it.  Nonetheless, the methodologies used should be appropriate to address the issues, and the data should support the analysis.  The following are the core elements to be addressed through the review conducted by the CEE.

  • The evaluation covers outcomes and issues as presented in the RMAF (or other planning document available at the time of the evaluation review, including an updated RMAF) related to relevance, progress/success, and cost-effectiveness.
  • The methodology used is appropriate for the intended objectives of the study.
  • The findings and conclusions are based on evidence drawn from the evaluation research.
  • The recommendations are built on the conclusions and present corrective actions dealing with the findings of the study.

Step 1: Receiving the Evaluation Report

In many instances, the Program Sector analyst will send the CEE analyst a draft of the evaluation report for review and comment.  This report may be made available before it is presented to the departmental Audit and Evaluation Committee for approval, providing an opportunity to adjust findings and conclusions when deemed necessary.

At the time of the evaluation review, the CEE analyst should also access supporting documents such as the management response, TB Submission and RMAF (or previous evaluation framework) in order to get background information on the program, and clarify any potential issues the analyst may encounter during the evaluation review.

Step 2: Reviewing the Evaluation Report

When reviewing the evaluation report, the CEE analyst should pay attention to the key review criteria outlined in Table 1.  Note, however, that the CEE analyst's comments should also consider the report's inclusion of:

  • An Executive Summary providing an overview of the methodology, findings, conclusions, and recommendations.
  • A brief context piece describing the objectives and timing of the evaluation work, the policy, program or initiative that is being evaluated, and how it fits into the overall operation of the organization.
  • A description of the methodology and data sources used by the evaluation as well as the limitations of the evaluation in terms of its scope and methodology.
  • A Management Response, which outlines the action plan based on the recommendations put forth in the evaluation report.

TABLE 1:  KEY REVIEW CRITERIA

Evaluation Issues
  • The report identifies evaluation issues in accordance with evaluation policy requirements (i.e. relevance, success and cost-effectiveness) and their relationship to the RMAF logic model.
  • The report identifies evaluation issues that are consistent with the issues addressed within other related documents, i.e. RMAFs.
  • The report discusses other factors that contribute to the success, relevance and cost-effectiveness of the program, such as funding or partnering considerations.
  • The report includes an action plan drawn from recommendations.

Findings
  • The report presents findings that address and/or relate to the identified evaluation issues.
  • In light of the methodologies used, the report presents logical, valid, and evidence-based findings that do not contradict one another.
  • The report outlines factors that have influenced the success, relevance and cost-effectiveness of the program.
  • The report presents appropriate and realistic measures to improve the program, if necessary.

Conclusions
  • The report concludes by addressing the evaluation issues raised within the evaluation.
  • The report presents valid conclusions drawn from the evaluation findings, in light of the methodologies used.
  • The report contains a section that discusses the future implications of the findings of the evaluation research.
  • The report will be used in future redesign and implementation of the program.

Recommendations
  • The report recommends corrective measures that are evidence-based and linked to the report's evaluation findings and the issues being addressed.
  • The report highlights responsibilities related to the implementation of proposed recommendations, including time frames for management responses.
  • The report incorporates future opportunities, areas of improvement, and/or future funding or resource possibilities within its recommendations.
  • The report recommends realistic and practical corrective measures, including timelines.

Step 3: Providing Comments

In addition to providing comments in reference to the key review criteria as presented within Table 1, the CEE analyst should, where applicable:

  • Highlight any potential implications arising from the evaluation's failure to meet any of the key review criteria.
  • Discuss the evaluation issues covered, noting whether they have been adequately addressed by the study and whether all the outcomes (as stated in the RMAF) have been measured.  Any evaluation issues that have not been appropriately addressed by the study should be mentioned in the comments.
  • Discuss the validity of the findings, conclusions and recommendations in terms of their support through evidence-based information.  Where this is not the case, it should be noted in the comments, with a brief justification.
  • Verify that improvements to program performance measurement and evaluation strategies are included in the current RMAF if the current evaluation study has encountered challenges on that front.
  • Verify whether the management response is available and that the related action plan is incorporated in the TB Submission.







Inquiries about this review guide should be addressed to:
Senior Director,
Centre of Excellence for Evaluation
Results-Based Management Directorate
Comptrollership Branch
L'Esplanade Laurier, 9th floor, West Tower
300 Laurier Avenue West
Ottawa, Ontario
K1A 0R5

 

Appendix: Core Elements of an Evaluation Report, and Detailed Self-Assessment Criteria

The table below, drawn from Health Canada's April 2003 Evaluation Report Assessment Guide, identifies the elements that should be included in an evaluation study, as well as the specific criteria by which these elements should be assessed.


1. EXECUTIVE SUMMARY

 

Issues/Requirements

Criteria

1.1  Summary
  • Briefly present the following:
    • description of the policy, program or initiative evaluated;
    • why the evaluation was done;
    • who the client and intended audience of the evaluation are; and,
    • the main evaluation findings, conclusions, and recommendations.
(Suggestion:  the Executive Summary should be about 3 pages.)


 

2. INTRODUCTION AND CONTEXT

 

Issues/Requirements

Criteria

2.1  Description
  • The policy, program or initiative evaluated is clearly described, including the logic of cause-and-effect links between inputs, activities, outputs, outcomes, and external factors contributing to success or failure, i.e. policy or program theory and assumptions.
  • The program's reach (intended beneficiaries) is clearly described.
  • The program resources are clearly described so that the reader can understand how program monies are allocated and have been spent.
2.2  Evaluation Context
  • The report provides the reader with appropriate context for the evaluation by clearly explaining or describing:
    • why the evaluation was conducted and why "now";
    • how the results will be used;
    • the objectives and scope for the evaluation;
    • the client, audience and key stakeholders for the evaluation;
    • the timing of the evaluation work; and
    • the (clear, useful and answerable) evaluation issues/questions being addressed by the evaluation and falling within the areas of enquiry (relevance, implementation, effectiveness, cost-effectiveness and efficiency).
  • Depending on the nature, purpose and timeliness of a particular evaluation study, the following evaluation questions should be considered for inclusion:
    • Is the program still relevant to the needs of Canadians?
    • Are the program's resources being used in the most efficient and effective way to deliver appropriate results?
    • Is it necessary for the federal government to operate this program, or could it be transferred to other levels of government, or to the private or voluntary sector?
    • Is there scope for considering more effective program structures and service delivery arrangements?
    • Are departmental management practices appropriate and of sufficient quality?


 

3. METHODOLOGY / DESIGN / DATA

 

Issues/Requirements

Criteria

3.1 Description of the Methodology/Design
  • The design of the evaluation is described to the extent that the study can be replicated;  e.g. the relationship between the data collection and the analysis is described clearly.
  • The evaluation design is appropriate to the intended objectives of the study.
  • The data collection is appropriate to the design (the methodology, instruments and sample are described in sufficient detail to make an assessment of methodological rigor); e.g. valid and reliable data.
  • Methods are carried out appropriately (e.g. valid sample size).
  • The analysis is appropriate.  The data supports the analysis (as determined by, for example, significance tests, and response rates).
  • ALL stakeholders are included, and their input is fairly depicted in a balanced way.
3.2  Multiple Lines of Evidence
  • The evaluation relies on more than one line of evidence and uses a mix of quantitative and qualitative approaches, one of which should be a literature review.
3.3  Data Quality
  • The data used in the evaluation are accurate and reliable.
3.4  Limitations
  • The limitations and trade-offs of the methodologies, data sources and data uses in the evaluation are clearly described.  For example:
    • actual and potential biases in, and the reliability of, the data are identified and explained in terms of their impact on stated findings.
    • the constraints of the evaluation and the perspective from which the intervention is evaluated are clear and the reader can assess the validity of the evaluators' judgment.
3.5  Accuracy
  • The information in the report is free of errors of fact or logic.


 

4. KEY FINDINGS

 

Issues/Requirements

Criteria

4.1  Evaluation Issues
  • The evaluation issues/questions are adequately addressed.
4.2  Objectivity
  • All significant findings are presented, are testable, and do not go beyond what the evidence will support.
  • Balanced perspective – reflects the range and intensity of the observations and other evaluation input received; e.g. quotes of interviewees should indicate how prevalent the quoted sentiment or opinion is among all interviewees.
  • The results are sufficiently qualified to help readers draw substantiated inferences.
4.3  Clarity and Conciseness
  • Uses plain language and avoids specialized technical language to the extent possible.
  • Report is not overloaded with details.  Detailed information and analyses are included in technical appendices.
4.4  Evidence-based Findings
  • The findings are substantiated by the evidence, as described in the evaluation report.


 

5. KEY CONCLUSIONS

 

Issues/Requirements

Criteria

5.1  Supportable Conclusions
  • The conclusions address the evaluation questions and are supported by the findings.
  • The conclusions fit the entire analysis, are valid, and drawn from the evaluation findings, in light of the methodologies used.


 

6. RECOMMENDATIONS

 

Issues/Requirements

Criteria

6.1 Evidence-based Recommendations
  • The recommendations are supported by and flow logically from the findings and conclusions.
  • The recommendations address significant issues, i.e. they are not unprioritized "shopping lists".
  • To the extent possible, an assessment of the potential impact (on the policy, program or initiative evaluated) of implementing each recommendation is provided.
  • The recommendations include proposed timing for management action and some indication of quantity and quality, e.g. a simple statement that "funding should be increased" or "consultations should be expanded" without some "benchmark" objective that provides an idea of "by how much" and what "sufficient" or "good enough" could look like would be insufficient.
  • The recommendations are practical and can be realistically implemented.
  • The recommendations are addressed to specific parties.


 

7. DOCUMENT LENGTH

 

Issues/Requirements

Criteria

7.1  Length of the Report
  • To help bring better focus on the "truly important", the main body of the evaluation report should be limited to approximately 25 pages.  Other information could be provided in appendices and annexes.


 

8. MANAGEMENT RESPONSE / ACTION PLAN

 

Issues/Requirements

Criteria

8.1 Action Plan
  • The Management Response/Action Plan adequately addresses findings and recommendations.
  • The Action Plan describes the desired objective(s) of the action (what will be done) and the timelines for action (when it will be done and by whom).

 