Message From the Secretary of the Treasury Board

Evaluation serves many purposes in the Government of Canada. It can be used by deputy ministers, ministers and parliamentarians to improve government policies and programs, ensure effective and efficient allocation of resources to priorities, and provide assurance to Canadians that they are getting value for money from their tax dollars. Since the renewal of the Policy on Evaluation in April 2009, the federal evaluation function has been shifting its focus and increasing its coverage of federal programs in order to provide more comprehensive support to ministers for spending proposals and expenditure reviews.

Deputy heads of departments and agencies should continue to build the foundation for supporting sound expenditure management and for strengthening their respective Minister's proposals, for example, by:

As leader for the federal evaluation function, I want to underscore that investing in broad evaluation coverage is important for supporting accountability and responsible government spending overall. Having information available about the relevance and performance of programs supports decision makers in making informed spending choices.

In achieving broad evaluation coverage, investments in individual evaluations should be "right-sized" to suit decision-making needs. As evaluation coverage builds from year to year, providing a foundation of information to support decision making, departments should actively seek to implement more cost-effective, calibrated evaluation designs. Deputy heads and heads of evaluation alike should note that the Policy on Evaluation enables cost-effective evaluations by giving departments considerable flexibility for customizing the scope and scale of evaluations and for grouping programs together for evaluation purposes.

As the Government of Canada's base of evaluation information grows, I encourage deputy heads to benefit fully from the support that evaluations can provide for decision making by calling upon their departmental heads of evaluation to share strategic insights and knowledge integrated from the full breadth of evaluations.

 

Michelle d'Auray
Secretary of the Treasury Board

 

Highlights

Evaluation resources

Resources for the federal evaluation function were relatively stable from 2009–10 to 2010–11. In 2010–11, annual financial resources for evaluation across all large departments and agencies (LDAs) were approximately $67.4 million, with a median of $1.6 million. As in previous years, salaries represented the largest component of total resources for the function in 2010–11, at 55 per cent. A modest increase of 4.8 per cent over 2009–10 levels brought the number of full-time equivalents (FTEs) to 497 in 2010–11, with a median of 13.4. FTEs include evaluation specialists, support staff and executives.

Grants and Contributions (Gs&Cs) programs

The Financial Administration Act requires that all ongoing programs of Gs&Cs be evaluated every five years. The first five-year period for meeting this legal requirement ends in December 2011. Across all LDAs, evaluation coverage continued to expand between 2009–10 and 2010–11, but the annual pace of evaluating ongoing Gs&Cs slowed from 33.2 per cent to 7.4 per cent. The Treasury Board of Canada Secretariat's monitoring shows that cumulative government-wide coverage of Gs&Cs over the four-year period from April 1, 2007, to March 31, 2011, reached 63 per cent. One year remains in the first five-year period for departments to meet the legal requirement for evaluating all ongoing Gs&Cs.

Other direct program spending

The Policy on Evaluation requires all other types of ongoing direct program spending (i.e., excluding Gs&Cs) to be evaluated every five years beginning in April 2013. Until the policy's transitional period ends on March 31, 2013, coverage of this spending remains risk-based and at the discretion of individual deputy heads. Although the pace of expanding Gs&Cs coverage slowed significantly in 2010–11, coverage of other types of ongoing direct program spending grew at a faster rate: the dollar value of non-Gs&Cs direct program spending evaluated in 2010–11 was twice that of 2009–10 ($3.70 billion compared with $1.83 billion).

Looking at all types of direct program spending (i.e., Gs&Cs and non-Gs&Cs together), a slower pace of expansion in coverage was observed from 2009–10 to 2010–11, with annual increases in coverage measuring 14.2 per cent and 6.7 per cent, respectively. While several factors may explain why slower growth in overall coverage was observed in 2010–11, the Secretariat is taking steps to support further progress on coverage. Based on its continuous monitoring of the function, the Secretariat has preliminary indications that the annual coverage rate will rebound significantly in 2011–12.

Governance and support for the evaluation function

In general, departments have put governance structures in place to ensure neutral evaluation functions. All LDAs have a departmental evaluation committee, of which three quarters are chaired by a deputy head. Most departments have ensured that their head of evaluation has direct and unencumbered access to the deputy head by having the heads of evaluation report to the deputy head either administratively (51 per cent) or functionally (31 per cent).

The level of support to evaluations provided by program-led ongoing performance measurement was similar to that observed in 2009–10. Sixty per cent of LDAs assessed under the Management Accountability Framework (MAF) assessment process had evaluations that usually or almost always cited difficulties in assessing program effectiveness due to insufficient performance data. To compensate for insufficient data, evaluators have to spend additional time and resources to collect the needed data themselves or to implement other approaches.

Quality of evaluations

The quality of evaluation reports remained high overall across LDAs, with 88 per cent of departments receiving quality ratings of "acceptable" or "strong" through the MAF assessment process. These ratings were based on, among other criteria, the quality of the evaluation methodologies used, whether evaluations addressed program value for money, and the quality of recommendations.

Use of evaluations

As viewed through the MAF assessment process, evaluation use was "acceptable" or "strong" in 81 per cent of LDAs. Departments reported greater consideration of evaluations in Treasury Board submissions, Departmental Performance Reports and in the Strategic Review process than in Memoranda to Cabinet.

In consultations held in November 2011, departmental heads of evaluation indicated that when working with limited evaluation resources, choices must often be made between the range of uses that each evaluation will support and the overall evaluation coverage of the department's program base that can be achieved.

Beyond departments' internal use of evaluations for decision making, Treasury Board of Canada Secretariat analysts use evaluation evidence when examining and providing advice on funding proposals to the Treasury Board for consideration, and on proposals put forward during expenditure reviews such as Strategic Reviews. During the Strategic Reviews conducted in 2010–11, the Secretariat observed that departments with low evaluation coverage were usually among the least effective at explaining program results meaningfully, whereas those with comprehensive evaluation coverage of their programming generally provided more meaningful input. Secretariat analysts also indicated that although evaluations with a targeted focus were very useful for informing decisions on individual Gs&Cs programs, supporting the full range of expenditure management decisions required an appropriate balance between high-level and targeted evaluation coverage.

A large majority of LDAs (80 per cent) reported that they have systematic tracking of management action plans arising from evaluation recommendations. Of all the management action plan items that LDAs planned to complete in 2009–10 (1,247 items), 53 per cent had been fully implemented and 36 per cent had been partially implemented.

Leadership provided by the Treasury Board of Canada Secretariat

The Secretariat developed a 2011–12 action plan to outline priority activities for supporting policy implementation and for promoting overall progress in the federal evaluation function. The action plan encompassed the following four broad areas:

To increase access to skilled evaluators, the Secretariat undertook a series of initiatives. It introduced Leadership Competencies for Federal Heads of Evaluation, putting the Government of Canada at the international forefront of establishing a set of expected skills and abilities for leaders within the evaluation function. Other initiatives included finalizing a Framework for Professional Development for Evaluators and conducting a survey to identify the learning needs of those who work in the federal evaluation function. The Secretariat is currently engaging partners to develop training opportunities that will support government-wide evaluation learning priorities.

The Secretariat continued developing guidance on effective evaluation approaches, such as guidance on the governance of evaluations of horizontal initiatives, on the evaluation of policy functions and programs, on theory-based evaluation approaches, on resource-inclusive evaluations, and on calibrated evaluations. A guide to developing a departmental evaluation plan was completed in late 2010–11, and wide distribution of the guide among departments took place in 2011–12.

The Secretariat used several other means for sharing effective approaches among departments, such as quarterly meetings with heads of evaluation; workshops on leading practices, presentations and panel appearances at conferences; and exchanges through an online forum for members of the federal evaluation community.

The Secretariat continued to promote the quality of evaluations and their use in the Government of Canada by continuously monitoring evaluation quality and use, as well as by advising and supporting other analysts of the Treasury Board of Canada Secretariat who consider evaluation evidence when reviewing departmental proposals to the Treasury Board and to Cabinet. Through the annual MAF assessments of the quality and use of evaluation in departments, the Secretariat continued to identify and communicate areas for improvement in departmental evaluation functions.

To enhance its leadership for the federal evaluation function during 2010–11, the Secretariat built strong engagement from the ADM Champion Committee on Managing for Results, an interdepartmental advisory committee of assistant deputy ministers chaired by the Deputy Assistant Secretary of the Secretariat's Expenditure Management Sector. The committee's mandate is to champion advances in results-based management and evaluation within departments and agencies, and to promote better integration of performance information in expenditure management decision making. Since its launch in October 2010, the committee has focused on sharing approaches and good practices for integrated planning and for capitalizing on performance information generated by the evaluation function in relation to the Policy on Management, Resources and Results Structures.

Summary and conclusions

This 2011 annual report documents encouraging signs within the government-wide evaluation function during the first two years of implementing the 2009 Policy on Evaluation: stable financial and human resources, progress on the Secretariat's collaborative efforts to increase access to skilled evaluators, established departmental governance structures, and the continuing high quality of evaluation reports. Evidence from consultations with the Secretariat's program sectors, whose analysts use evaluation evidence when providing advice to the Treasury Board, also shows that some departments are successfully designing their evaluations and evaluation coverage to inform key decision-making processes.

This annual report also reveals that although resources for the government-wide function remained relatively stable in 2010–11, coverage of ongoing Gs&Cs and overall coverage of direct program spending accumulated at a slower pace. This report describes several factors affecting coverage that the Secretariat will explore further. The Secretariat's ongoing monitoring in 2011–12 gives preliminary indications that the growth rate of evaluation coverage will increase significantly over 2010–11 levels.

In addition to the positive signs noted above, this report highlights some challenges and areas for further progress in the function, including the following:
