Treasury Board of Canada Secretariat

ARCHIVED - Evaluation of the Treasury Board Submission Process



5. Evaluation Issues and Methodology

Table 2 presents the evaluation issues and questions that were derived from the logic model developed by the evaluation team. The detailed evaluation matrix, which outlines the indicators and data sources used to address the evaluation questions, is presented in Appendix B.

Table 2: Summary of the Evaluation Issues and Questions
Evaluation Issue Evaluation Questions
Relevance
  • What need does the Treasury Board submission process fulfill?
Effectiveness: Short Term
  • Have federal organizations and Secretariat employees demonstrated, over the years, an increased understanding of the elements of Treasury Board submissions, policies, and processes?
  • Does the Secretariat offer services that enable federal organizations to put forth draft submissions that comply with Treasury Board authorities, policies, and directions?
  • Does the Secretariat's review process for Treasury Board submissions ensure that they comply with government authorities and policies?
  • Are Treasury Board decisions well-informed and consistent with the advice, guidance, and recommendations of Secretariat analysts?
  • Are mechanisms in place to ensure Treasury Board decisions are carried out?
Effectiveness: Intermediate and Long Term
  • What is the level of quality of Treasury Board submissions?
  • Does the Treasury Board submission process contribute to ensuring that departmental and government management, programs, and spending are aligned with Government of Canada priorities?16
  • Are there factors impeding the achievement of results?
  • Have there been any unexpected outcomes?17
Economy
  • What resources are allocated to the submission process?
  • Is the process efficient? What could be done to make the process more efficient?
  • Is the Treasury Board submission process risk-based?
  • Are Canadians getting value for their tax dollars?

(a) Data sources and methods

The evaluation framework uses multiple lines of evidence and complementary research methods to ensure the reliability and validity of the data. The following research methods were used:

  • document review;
  • interviews with stakeholders and experts;
  • a working session with assistant secretaries from the Secretariat and assistant deputy ministers (ADMs) from selected federal organizations (ADM Working Session);
  • review of administrative, financial, and statistical data; and
  • a survey of analysts at the Secretariat and in federal organizations.

Each of these methods is described in more detail below.

Document review. Three main types of documents were reviewed during the evaluation:

  • general background documentation (e.g. documents describing the Treasury Board submission process’s history, rationale, and legislative framework);
  • documents specific to the submission process (e.g. the Guide and other related documents, such as the Analyst Survival Guide, and information on relevant committees, branches, and other groups involved in the process); and
  • past studies (e.g. research specific to the Treasury Board submission process, international studies).

For the full list of documents reviewed, please see Appendix C. Note that the evaluation did not review or assess Treasury Board submissions and précis for quality.

Interviews. Twenty-six interviews were completed (Table 3 and Appendix D). Interviewees included

  • program analysts and COE analysts:18
    • program analysts are the main point of contact for a submission; and
    • COE analysts represent sectors within the Secretariat (e.g. Chief Information Officer Branch, Corporate Services Sector, EMS) and provide policy advice to program analysts.
  • representatives from 12 selected federal organizations that put forward a Treasury Board submission within the last five years. Four interviews were conducted with organizations in each of the following categories:
    • occasional submitters (less than one submission per month);
    • moderate submitters (one to two per month); and
    • heavy submitters (three or more per month).
  • external stakeholders, meaning individuals who are close to but not directly involved in the Treasury Board submission process (e.g. individuals from the Privy Council Office and from the Department of Finance Canada).
Table 3: List of Interview Groups
Interview Group Number of Interviews
Program analysts and COE analysts 11
Representatives from federal organizations 12
External stakeholders 3
Total 26

All interviews were conducted by telephone. Interviewees were sent an interview guide (see Appendix E) in advance.

Survey. A survey was administered over the Internet to program analysts, COE analysts, and representatives of federal organizations that put forward a Treasury Board submission in the last five years. A total of 547 individuals were asked to complete the questionnaire; 220 usable responses were received, for an overall response rate of 40% (see Table 4).19

Table 4: Survey Response Rates
Survey Group Total Sent Received Removed Total Kept Response Rate Confidence Interval (95% confidence level)
Program analysts 135 60 0 60 44.4% ± 9.5%
COE analysts 181 66 0 66 36.5% ± 9.6%
Federal organizations 231 99 5 94 40.7% ± 7.8%
Total 547 225 5 220 40.2%  
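
The confidence intervals in Table 4 are consistent with the standard margin-of-error formula for a proportion at the 95% confidence level, with a finite population correction applied. The short sketch below is illustrative only and is not part of the original evaluation (the report does not state which formula was used, so the formula and the assumed proportion of p = 0.5 are assumptions); it reproduces the reported figures from the Total Sent and Total Kept columns.

    # Illustrative sketch: margin of error for a proportion with a finite
    # population correction, assuming p = 0.5 and a 95% confidence level (z = 1.96).
    import math

    def margin_of_error(population, sample, p=0.5, z=1.96):
        """Margin of error for a proportion, with finite population correction."""
        se = math.sqrt(p * (1 - p) / sample)                       # standard error
        fpc = math.sqrt((population - sample) / (population - 1))  # finite population correction
        return z * se * fpc

    # Figures taken from Table 4 (Total Sent, Total Kept).
    groups = {
        "Program analysts": (135, 60),       # reported: +/- 9.5%
        "COE analysts": (181, 66),           # reported: +/- 9.6%
        "Federal organizations": (231, 94),  # reported: +/- 7.8%
    }

    for name, (sent, kept) in groups.items():
        rate = kept / sent
        moe = margin_of_error(sent, kept)
        print(f"{name}: response rate {rate:.1%}, margin of error +/- {moe:.1%}")

Running this sketch yields response rates of 44.4%, 36.5%, and 40.7% and margins of error of approximately ± 9.5%, ± 9.6%, and ± 7.8%, matching the figures reported in Table 4.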

All of the Secretariat's program analysts were invited to participate in the survey. They were also asked to provide contact information for all COE analysts they had consulted for advice on Treasury Board submissions during the last five years. Furthermore, the program analysts were asked to provide contact information for the individuals in federal organizations (including the Secretariat) that put forward a submission within the last year. Federal organizations were encouraged to forward the survey to any individual within the organization who had been involved in the Treasury Board submission review process.

Survey results are provided in Appendix G.

Working session with ADMs. A two-hour working session was conducted to gather qualitative information on the relevance, effectiveness, and economy of the Treasury Board submission process. Assistant secretaries from the Secretariat and selected ADMs responsible for corporate and/or strategic planning, as well as selected departmental chief financial officers (CFOs) from federal organizations, were invited to participate. The issues to be discussed during the session (see Appendix F) were provided to the participants in advance.

Administrative, financial, and statistical data. Administrative, financial, and statistical data were gathered for the purpose of assessing the effectiveness and efficiency of the submission process. The evaluation team worked with the Treasury Board Submission Centre to review data related to Treasury Board submissions and to gain a greater understanding of the Submission Tracking System (STS). The evaluation team also reviewed data from the Management Accountability Framework (MAF) database.

Costing. Evaluating economy requires an analysis of the costs involved throughout the submission process. A costing exercise was undertaken; however, because direct costs related to Treasury Board submissions are not tracked separately, only the level of effort of some participants involved in the submission process was available. Section 6(c)(i), "Resources allocated to the submission process," therefore does not identify an estimated cost for the process.

Related international practice. The evaluation team undertook a limited review of submission process models used in other jurisdictions.21 Given the cursory nature of this review, the evaluation team could not draw conclusions on the appropriateness of other models compared with the Canadian context. Findings from this review are therefore not presented in this report, although the review did yield some noteworthy observations. For example, the role played by the Secretariat's assistant secretaries, who present a federal organization's submission to Treasury Board, may be unique internationally. As in the Canadian federal model, Secretariat equivalents in other jurisdictions are responsible for logistical and technical functions related to sessions of Cabinet, strategic and work planning, policy advice, legal functions, some monitoring functions, and their own internal management functions.22 They scrutinize material presented to Cabinet, ensuring that legal and policy considerations have been accounted for within structured submissions. In six of ten surveyed Organisation for Economic Co-operation and Development (OECD) countries,23 the Secretariat equivalent prepares a recommendation on how the submission should be handled at the Cabinet-level meeting. In these jurisdictions, however, it is the deputy minister of the submitting organization, not the equivalent of a Secretariat assistant secretary, who presents the submission at the Cabinet-level meeting. Assistant secretary equivalents can therefore focus their attention on submissions that are most strategic or sensitive or for which their recommendation runs counter to that of the submitting organization.

Other information. Once the data were collected and analyzed, information gaps were discovered in a few key areas. To fill these gaps, the following methodologies were used:

  • Review of additional human resources (HR), financial, and statistical data—The evaluation team reviewed additional HR and financial data to gain further insight into the cost of the Treasury Board submission process. Documentation on program sector boot camps was also reviewed.
  • Interview with a senior advisor from the Secretariat—The evaluation team met with a senior advisor who has extensive knowledge of and experience with the submission process and its related management tools and structure.
  • Working sessions with program directors—Four sessions were held with the Secretariat’s Program Directors Group (PDG) to validate findings. In addition, assistant secretaries from the program sectors reviewed the final draft of this report.
  • Interviews with Expenditure Management Information System (EMIS) business analysts—The intent of these meetings was to gain a greater understanding of the Secretariat's Budget Office Systems Renewal (BOSR) Project and the change management work that occurred during implementation of EMIS.

(b) Limitations of the evaluation

Time frame. A federal election was called shortly after the evaluation was launched. Not long after the election, Parliament was prorogued. These events delayed approval for the opinion research to be conducted for the evaluation; consequently, the time available to perform the research was limited. Another consequence of the election and subsequent prorogation was that the evaluation team could not interview Treasury Board ministers regarding the support they receive through the Treasury Board submission process.

Logic model. The evaluation team believes that the ultimate outcome proposed in the logic model it developed is a valid description of the purpose of the Treasury Board submission process and therefore a valid basis for the evaluation. It was not possible, however, to validate this ultimate outcome with an appropriate cross-section of stakeholders.

Review of performance measurement data. Performance data have not been collected on all aspects of the Treasury Board submission process. For instance, data are not collected on whether the Treasury Board submissions officially submitted by federal organizations are actually required, on their compliance with policies and processes, or on the extent to which Treasury Board decisions reflect the recommendations in the précis. In the absence of this information, the evaluation team was unable to assess the quality of submissions and précis. The evaluation therefore relied more heavily on survey and interview data to assess the effectiveness of the process.

Administrative data. Limited administrative data were available on the submission process and its results. Furthermore, as noted in the Treasury Board of Canada Secretariat Audit of Leave and Overtime (2008),24 some overtime data are not reliable, thereby limiting the extent to which the evaluation team could use such data to assess the amount (and related costs) of overtime claimed by program analysts in connection with Treasury Board submissions. In addition, other administrative data, such as the number of days between a Treasury Board decision and the issuance of the decision letter, would have provided additional lines of evidence had they been available.

Surveys and interviews. Given the highly variable nature of Treasury Board submissions, it stands to reason that the submission process experience would differ significantly from one submission to the next. More interviews would have provided better data on the effect of variations in a submission's size, scope, value, and complexity. The evaluation methodology attempted to address this limitation by inviting individuals from all federal organizations to participate in the Web-based survey. The results of the survey were cross-validated with the interview responses.

Costing methodology. Cost information to support a complete and accurate costing of the Secretariat's involvement in the Treasury Board submission process was not available.

Single-window approach to service delivery. Because the Secretariat uses a single-window approach for submissions, in which program analysts and their directors or executive directors are the point of contact for representatives from federal organizations, interaction between COE analysts and representatives from federal organizations is extremely limited. As a result, despite COE analysts having a clear role and contribution at the pre-submission and draft stages of the submission process, representatives from federal organizations may not be fully aware of the extent of this role. This may have led to the evaluation's greater focus on program analysts.

While the evaluation methodology has some limitations, multiple lines of evidence were used to draw conclusions about the Treasury Board submission process, strengthening the reliability and validity of the evaluation results. Despite the limitations, the methodology meets the requirements of the Treasury Board Policy on Evaluation and associated standards.