Treasury Board of Canada Secretariat

ARCHIVED - IM/IT Investment Evaluation Guide



Questionnaire

Information Technology Investment Evaluation Guide Questions

PLANNING PROCESS

Screening New Projects

  1. Does the organization have a defined process for submitting and screening new funding proposals for management consideration? Is this process established in policy guidance?
  2. Does the process define:
    • what information is to be submitted?
    • who must approve (screen) the information prior to formal submission?
    • how a determination will be made about where in the organization the project will be reviewed?
    • the stages of management review?
  3. Are roles, responsibilities, and authority for people and offices involved in the screening process clearly defined?
  4. What information is required for submitting projects for funding? (Please check off those that apply and include others.)
    • For most projects this information may include:
      • business case justification, including:
        • clear, designated senior management sponsorship from a program/business unit
        • links to business/program/mission objectives that the project is helping to achieve, as well as an explanation of how the IM/IT investment will directly or indirectly help achieve the intended outcomes associated with these objectives, and
        • clear identification of proposed benefits (both quantitative and qualitative)
        • cost/benefit estimates
      • alternative and sensitivity analyses
      • compliance with the information technology architecture
      • risk assessments
      • other

  5. Do defined thresholds for benefit/cost ratios, return on investment calculations, risk assessments, etc. exist? Are these thresholds clearly defined and understood?
  6. Are all funding proposals treated the same or does the organization have different process requirements for funding proposals of different size, scope, etc.? Are these different requirements adequately documented?
  7. If exceptions to the screening process are allowed, are the conditions for allowing exceptions clearly documented?
  8. Does the process clearly stipulate potential actions that can be taken for projects that are funded without evidence of following the screening process?
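
The threshold screening described in question 5 can be sketched as a simple check. This is an illustrative sketch only: the threshold values, field names, and risk scale below are invented, not prescribed by the guide.

```python
# Illustrative threshold screen for a new funding proposal.
# Thresholds, fields and the 0-10 risk scale are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Proposal:
    name: str
    benefits: float    # estimated lifecycle benefits ($)
    costs: float       # estimated lifecycle costs ($)
    risk_score: float  # 0 (low risk) to 10 (high risk)

# Example thresholds an organization might define in policy guidance.
MIN_BENEFIT_COST_RATIO = 1.2
MAX_RISK_SCORE = 7.0

def screen(p: Proposal):
    """Return (passes, reasons) for one proposal against the thresholds."""
    reasons = []
    bc_ratio = p.benefits / p.costs
    if bc_ratio < MIN_BENEFIT_COST_RATIO:
        reasons.append(f"benefit/cost ratio {bc_ratio:.2f} is below {MIN_BENEFIT_COST_RATIO}")
    if p.risk_score > MAX_RISK_SCORE:
        reasons.append(f"risk score {p.risk_score} exceeds {MAX_RISK_SCORE}")
    return (not reasons, reasons)
```

Recording the reasons for a failed screen supports the documented, consistent screening decisions the questionnaire asks about.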

    Analyzing and Ranking All Projects Based on Benefit, Cost and Risk Criteria


  9. Does the organization require that the information and data submitted with funding proposals be validated (accuracy, reliability, completeness)? Does the process stipulate who is responsible for performing this validation (sponsor, project team, independent validation and verification teams, etc.)?
  10. Does the process stipulate where exceptions to validation are permitted? If exceptions are allowed, are other clearly defined conditions required to be met?
  11. Does the organization have an established, documented process for comparing and ranking all proposed IM/IT-related funding?
  12. Has the organization defined explicit criteria that will be used to help compare and rank projects? If no, go to 17. Do these criteria include cost, risk, and benefit elements (e.g., benefit/cost ratios, anticipated or actual impact on mission improvement priorities, risks versus benefits, etc.)? Do the criteria include both quantitative and qualitative criteria (e.g., return on investment calculations, risk modeling scores, capability assessments, alignment with critical needs, etc.)?
  13. Are the decision criteria weighted? If no, go to 14. If scoring weights are attached to different items for management consideration, are these clearly defined and understood by participants? Has management consensus been established on use of the weighting scheme? Are these weights being applied consistently to all project proposals? If not, has the organization established different weighting schemas for different project types?
  14. Does the process explain how the decision criteria are to be applied?
  15. If the organization uses a scoring model or decision support tool associated with the decision criteria to help measure the relative costs, benefits, and risks of each project, are the scoring elements precisely defined and differentiated?
  16. Is the process for analyzing and comparing IM/IT projects required throughout the organization, regardless of where actual funding decisions/approvals are made?
  17. Are the criteria used to compare and rank projects weighted? How were the weights determined? Is the weighting scheme reassessed periodically?
  18. Does the process include incentives or disincentives to ensure compliance? Are roles and responsibilities for enforcing the process defined?
  19. Does the organization require that management evaluations, as well as scoring, ranking, and prioritization results be documented (either manually or through the use of automated applications such as a decision support tool)?
  20. Does the organization require that this information, together with approved project data on cost, schedule, and performance, be tracked at an aggregate level? At a project level?
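
The weighted comparison and ranking described in questions 12, 13, and 17 can be illustrated with a minimal scoring model. The criteria, weights, and scores below are hypothetical; a real scheme would be established by management consensus and reassessed periodically.

```python
# Minimal weighted-scoring sketch for comparing and ranking proposals.
# Criteria, weights and scores are invented for illustration.

# Management-agreed weights; normalized so they sum to 1.0.
WEIGHTS = {"benefit_cost": 0.40, "mission_alignment": 0.35, "risk": 0.25}

def weighted_score(scores: dict) -> float:
    """Each criterion is scored 0-10 (risk scored so higher = less risky)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

projects = {
    "Project A": {"benefit_cost": 8, "mission_alignment": 9, "risk": 5},
    "Project B": {"benefit_cost": 6, "mission_alignment": 7, "risk": 9},
}

# Rank highest score first; the scores and ranking would be documented
# for management review, as question 19 requires.
ranking = sorted(projects, key=lambda p: weighted_score(projects[p]), reverse=True)
```

Documenting both the scores and the resulting ranking keeps the exercise repeatable and auditable across funding cycles.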

    Selecting a Portfolio of Projects


  21. Does the organization require that this information be entered into a recognized management information system? Does it require the data to be maintained in a uniform fashion?
  22. Does the organization have a formal systematic process for determining priorities and making funding decisions? Does the process clearly establish who in the organization has the responsibility and authority for making final IM/IT related funding decisions? Is the process clear in establishing this responsibility and authority for various organizational levels (department/branch/ sector/directorate/region/units, etc.)?
  23. Has the organization established an IM/IT investment review group (or some other review function)?
    • Who is on this review group?
    • Does the process cover the roles, responsibilities and authority of this group?
    • Is it clear on the role the group will play in:
      • selecting, controlling and evaluating IM/IT investments, and
      • suggesting changes to organizational policies, procedures and practices?
    • Does the group have authority to:
      1. approve, cancel or delay projects
      2. approve plans for mitigating risks
      3. validate expected returns
      4. place constraints on investment proposals regarding project size and duration?
    • Does the process stipulate the operating rules and procedures for this group (i.e., when it will meet, how it will function, how decisions will be made, what steps will be taken to resolve conflicts, etc.)?
    • Is it clear what projects the investment review group (or similar management group) will review (e.g., all IM/IT spending proposals, or only those that meet or exceed defined decision thresholds based on cost, level of risk, cross-functional, bureau, or office impact, or that involve common infrastructure needs such as telecommunications, data centres or networks)?
  24. Are IM/IT decisions made as part of an overall capital planning process or are IM/IT projects separated out? Does the process explain how decisions made on IM/IT spending will be incorporated into the organization's overall budgeting or capital programming decision making process?
  25. Does the process require that data on obligations, outlays, and actual expenditures be maintained for all IM/IT spending? Are categories of IM/IT spending defined within the organization (e.g., hardware, infrastructure, telecommunications, operations and maintenance, applications development, data processing services, personnel, etc.)?
  26. Has the organization conducted a review (in-house or via an outside consultant/expert) of its current IM/IT spending portfolio to assess alignment with mission needs, priorities, strategic direction, or major process re-engineering?
    • Has the rate and type of IM/IT spending been aligned with management expectations?
    • Has trend analysis been done to show how patterns of investment and spending are changing?
    • Has an analysis been conducted to show how spending patterns could be affected by the proposed IM/IT portfolio?
    • Does the organization have a process for documenting and disseminating results of this review?
  27. Does the process define how unit or office level IM/IT decisions will be reviewed?

    Establishing Project Review Schedules

  28. Does the process stipulate how approved projects are to be monitored by senior management in regular investment meetings? Are there any procedures for informing project managers of decisions about monitoring schedules made by the investment review group?
  29. Is the review process clear on criteria for deciding the kinds of projects that will receive regular management monitoring and oversight by an investment review group versus those that will be monitored exclusively by management sponsors?
  30. Does the process allow the investment review group to call special project reviews outside of regular meetings if the group deems it necessary?
  31. Does the process require any additional certification or reviews before high-risk projects are allowed to proceed (e.g., risk mitigation plans, additional cost certifications, etc.)?

    Evidence that Each Project has Met Project Submission Requirements

  32. For IM/IT proposals that were submitted for funding consideration, was all of the required data/information prepared and submitted in accordance with the prescribed process?
  33. Is there evidence that the data/information (cost, schedule, performance, risk) for submitted projects has been validated either independently or using a self-assessment process?
  34. Did the information/data presented in the proposals come from organization recognized information systems (automated or otherwise)?
  35. Is the information/data easily accessible for further review and consideration?

    Analyses of Each Project's Costs, Benefits and Risks


  36. Are project cost data fully identified (direct, indirect, ongoing)? Do the data include full lifecycle costs? Did these cost data come from a recognized financial system?
  37. Have benefits of the investment been identified using quantitative and/or qualitative data/information that relate directly to mission support and performance improvement? Are expected cost savings, productivity gains, and improvements in quality identified and timeframes specified as to when these should occur?
  38. Have all foreseeable risks been identified? These risks may include technical, managerial, capability, procurement, organizational impact, stakeholder, etc. Have all concerns and potential problem issues been resolved or accounted for in a risk mitigation strategy for the project?
  39. Have business owners been involved in constructing and certifying the accuracy of the data that are submitted?
  40. Were baseline cost and performance data used as a basis for analysis? Is this information reliable and accurate? If baseline data were not available, were the estimates generated using a prescribed approach or method?
  41. For projects that are requesting continued or additional funding (for a new phase, new module, or as a result of increased costs), is there evidence that the newly submitted data reflect any changes that may have occurred to the cost, benefit, or risk information?
  42. Has project schedule information been reviewed in light of competing priorities; the skills, capabilities, and availability of organization staff; contractor expertise and experience; etc.?
  43. Was the cost and return information that was submitted constructed using accepted methods and techniques (prescribed by the organization, legislative provisions, and/or accepted industry practice)?

    Data on the Existing Portfolio


  44. Does the organization maintain data on its current IM/IT spending portfolio (e.g., are major categories of spending and investment defined and tracked, such as operations and maintenance, applications and systems development, hardware acquisitions, telecommunications, personnel, contracted services, data administration, research and development, etc.)?
  45. Are the costs and returns of the IM/IT spending portfolio presented on an aggregate basis (past, current, future)? On a project basis?
  46. Was the portfolio information (IM/IT spending categories, cost/benefit estimates, average development costs, etc.) derived from recognized organizational management information systems? Are standard definitions and reporting elements used to maintain consistency and uniformity?
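
Maintaining portfolio data by spending category (questions 44-45) amounts to aggregating project-level records. The categories, projects, and amounts below are invented examples of how such records might be rolled up.

```python
# Illustrative aggregation of IM/IT spending records by category and project.
# All projects, categories and amounts are invented for illustration.
from collections import defaultdict

# (project, spending category, fiscal year, amount in $K) records, as they
# might be drawn from a recognized financial system.
records = [
    ("Project A", "operations and maintenance", 2023, 300),
    ("Project A", "applications development",   2023, 500),
    ("Project B", "operations and maintenance", 2023, 200),
    ("Project B", "telecommunications",         2023, 150),
]

by_category = defaultdict(float)  # aggregate view of the portfolio
by_project = defaultdict(float)   # project-level view
for project, category, year, amount in records:
    by_category[category] += amount
    by_project[project] += amount
```

Standard category definitions and reporting elements (question 46) are what make such roll-ups comparable across years and organizational units.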

    Scoring and Prioritization Outcomes


  47. Are summary data presented on each project's costs, benefits, risks for senior management (investment review group, etc.) to consider?
  48. Does the management review group conduct scoring exercises to vote on the relative strengths and weaknesses of proposals? Are these scoring exercises recorded and documented? Are the criteria that are used as the basis for the scoring instruments defined and used consistently?
  49. Can the costs of the approved list of projects for funding be tracked to available funds and/or reflected in the budget requests?
  50. When management approves funding for projects that fall outside accepted thresholds (high risks, high project costs, noncompliance with the architecture, etc.), is an explanation/rationale provided for this decision? Are additional management and project reporting requirements stipulated?

    Project Review Schedules


  51. Once projects are approved for funding by an investment group and/or organization head, are any additional project management or investment review reporting requirements (data, information, analysis) established for high risk, high cost projects (beyond what may be specified by existing processes)?

    • If so, have these requirements been clearly documented and communicated to the responsible project team?
    • Is it clear why this data/information is being requested and what it will be used for?
    • Has an explanation been given to the project team explaining how this information and its assessment by senior management may influence project continuation, delay or cancellation?
  52. Did each project that was approved have an outline or strategy developed establishing how any necessary acquisitions would be handled? Is this strategy appropriate given the project type?

    Determining Whether Projects Met Process Stipulated Requirements


  53. Are decisions being made about project readiness for review using the data/information requirements established by a project screening process?

    • Are project submissions being reviewed consistently (using the process and information required)?
    • Are screening decisions being recorded?
    • Is there evidence of projects being rejected?
    • Were explanations for submission rejections documented and communicated to the business sponsor?
    • If exceptions are being made to screening criteria, is the explanation documented and forwarded with the project proposal?

    Deciding Upon the Mixture of Projects in the Overall IM/IT Investment Portfolio


  54. Do the systems that were selected for the current portfolio represent the best match with mission needs? Do all of the selected projects support objectives or goals established in the organization's strategic plan or annual performance plans? Are all of the selected projects justified on the basis of their relative costs, benefits and risks? Are there any other factors that appear to have influenced the executives' decisions?
  55. For ongoing projects, have projected versus actual data on costs and interim results been used to make decisions about project continuation?
  56. Were project decisions made at the appropriate organizational level?
  57. Do you know the ratio of funding between the various project types (new, proposed, under development, operational, etc.) that were selected? Does this ratio appear to effectively support the organization's business plan and objectives?

    TRACKING & ASSESSMENT

    Consistently Monitoring Projects

  58. Does the organization have a defined, documented, and repeatable process for monitoring and reviewing IM/IT projects? Does this process define what the focus of the investment reviews will be? Some key elements that may be included in the review include the following:

    • project status including where the project stands in relation to the development of other projects
    • business sponsor evaluation of the project
    • estimated vs. actual costs to date
    • estimated schedule vs. actual schedule
    • actual outcomes from modular testing, pilots, prototypes or limited site testing vs. estimates
    • technical performance as well as estimated impact on program or business performance
    • updates on risk mitigation efforts including identification of any new risks and steps to address technical issues or problems that have arisen
    • contractor performance and delivery
    • review of the methodology or systems development process
    • new unexpected issues affecting project progress or outcomes
  59. Does the process stipulate what project data and information must be submitted for management evaluation?
  60. Does the process indicate how data and information being presented for management review are to be verified and validated? Are roles and responsibilities for conducting this verification and validation spelled out?
  61. Does the process define a specific group (or groups) of managers that are responsible for conducting IM/IT investment control meetings? What is this group called? Are procedural rules for how the investment review group will operate and make decisions clearly defined and documented? Are the roles, responsibilities, and authority of this group(s) clearly defined? Is the purpose of the investment review group clearly stated? Are the types of decisions that will be made at the IM/IT investment control meeting defined (e.g., project continuation, delay, cancellation, termination, acceleration, etc.)?
  62. If investment review processes are used across different units of the Branch (e.g., department, branch, sector, region, directorate, unit, etc.), do the different units have consistent policies, practices, and procedures for monitoring projects and reaching decisions?
  63. Does the process make accommodations for flexible reviews (frequency, required report submissions, etc.) for different kinds of projects (high risk/low return vs. low risk/low return)?
  64. Does the process define who is accountable for acting on review decisions? Is it clear what types of actions fall under different people's responsibilities (e.g., project team, CIO staff, business sponsor, financial staff)?
  65. Does the process define how open action items are to be addressed? Does the process define roles and responsibilities for making these decisions, as well as criteria for evaluating actions that are taken and determining whether the open item has been resolved?
  66. Are there mechanisms in place to help ensure compliance with the review process? What are the mechanisms? Are there disincentives in place for noncompliance? Is there someone responsible for overseeing the process? If so, who is that person? Is the process consistently maintained?

    Involving the Right People


  67. Who is involved in ongoing project reviews and decisions? Do the review groups include staff from program, IM/IT, and financial offices? Is there membership by a quality assurance or some other outside assessment group?
  68. Are project managers or site executives included in devising and approving actions to address deficiencies that were identified?

    Documenting All Major Actions and Decisions


  69. Does the organization define the various pieces of information that are to be updated and maintained? This information may include the following:

    • project related decisions that are made
    • actions that are to be taken as well as criteria or measures for evaluating improvement
    • project outcomes
    • cost/benefit analysis and other associated project information
    • business case information
  70. Does the process stipulate where these data are to be maintained (e.g., official organization information system with uniform data standards and entry procedures)?

    Feeding Lessons Learned Back Into the Planning Phase


  71. Does the organization have a process for evaluating current decision-making processes and suggesting changes to these processes based on lessons that are learned from investment control reviews?

    • Does the process account for and distinguish between senior management decision-making changes and project-level management changes?
    • Does the process specify someone who is accountable for identifying lessons that are learned from investment control reviews?
  72. Is there a process for refining or updating the selection criteria (both screening and ranking) based on lessons that are learned?
  73. Does the organization have a process for aggregating data/information across all major IM/IT projects (or spending categories) in order to compile an overall organizational track record on costs and benefits attributable to IM/IT?

    • Is someone (or office) charged with this specific responsibility?
    • Does the organization have procedures for presenting this information to the IM/IT investment review group?
    • For presenting it to all agency executives?

    Measures of Interim Results


  74. Are specific measures of performance being used to track costs, schedule, benefits, and risks? Are these measures updated throughout each project's life-cycle as costs, benefits, and risks become better known?
  75. Are data being used to track actual project performance (interim results) against estimates that were used to justify the project?
  76. Are gaps or differences between estimates and actuals being analyzed and explanatory factors documented for positive or negative variances?
  77. Is there documentation to support that interim cost, schedule, benefit, and risk information has been validated or verified? If risks have changed, is this supported by documented data or analyses?
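
The estimate-versus-actual comparison in questions 75-76 reduces to a variance calculation on each tracked measure. The figures and the 10% tolerance below are illustrative assumptions, not values from the guide.

```python
# Sketch of interim estimate-vs-actual variance tracking for one project.
# All figures and the tolerance threshold are invented for illustration.

def variance(estimate: float, actual: float):
    """Return (absolute, percentage) variance; positive means over estimate."""
    diff = actual - estimate
    return diff, 100.0 * diff / estimate

# measure -> (estimate, actual)
tracked = {
    "cost ($K)":        (1200.0, 1350.0),
    "schedule (weeks)": (40.0, 42.0),
}

flags = {}
for measure, (est, act) in tracked.items():
    diff, pct = variance(est, act)
    # Variances outside the agreed tolerance would be flagged for the
    # investment review group, with explanatory factors documented.
    flags[measure] = "REVIEW" if abs(pct) > 10.0 else "ok"
```

In this example, cost is 12.5% over estimate and would be flagged for review, while schedule at 5% over would not.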

    Updated Analyses of Each Project's Costs, Benefits, Schedule and Risks


  78. In investment control meetings, has the information in project business cases been updated to reflect the current state (including project costs to date, current risks and mitigation plans, interim benefit or performance results achieved, etc.)?
  79. Are project-level data (current and historical) being maintained and updated using organization approved databases/information systems? What are the databases or information systems that are used?
  80. Are changing business assumptions or environmental factors (political, funding, stakeholder support) identified and documented? If so, are the impacts of these factors on project outcomes evaluated?
  81. If a project is behind schedule, do risk mitigation plans get updated and are explanatory factors thoroughly evaluated?
  82. If contractor assistance is being utilized, are contractor performance reports conducted and results made available for management consideration?

    Deciding Whether to Cancel, Modify, Continue or Accelerate a Project


  83. Is the organization reviewing projects and using data analysis to make decisions about the future of each project? Were the decisions that were made reasonable given the situation and condition of the project? For those projects whose status remained stable, was a conscious decision made that the project should proceed or was it implicitly assumed that the project would continue?
  84. If problems were identified, was a decision made about what actions should be taken? Who made this decision? Is there some explanation or justification given for this decision? Were the actions that were taken appropriate to the problem (i.e., was the problem an IM/IT problem or a business/process problem)?
  85. What evidence is there, if any, that actions have been taken based on the results of project reviews? For instance, if the organization determined that new requirements have been introduced, what actions were taken to respond to these additional requirements? Were these actions documented? Did these actions fully address the requirements?
  86. If decisions are made that affect a project's funding, such as suspending project funds or cancelling a project, is there evidence in budget documents and spending information that reflects this decision? Are there criteria identifying what must be done for funding to resume?
  87. Are future "cascading" actions resulting from project decisions clearly identified and delineated?
  88. Is a management group following organization policies, procedures, and practices in making regular decisions about the continuation of major IM/IT projects?
  89. Are project data being used to make decisions and take action on IM/IT projects that are subjected to investment reviews?
  90. Are decisions being made at the right time (i.e., as prescribed by the agency process or as agreed to by senior management as part of project approval)?
  91. Are decisions regarding projects being executed? Is accountability and follow-up clearly defined?
  92. Is an independent review conducted afterward to ensure that actions were taken and to make further changes as necessary?
  93. If projects are allowed to proceed when investment review data and analyses raise serious questions about the project, has documentation been provided detailing how management reached its decision?

    Aggregating Data and Reviewing Collective Actions Taken to Date


  94. Has the organization aggregated data in order to assess organizational performance and to identify patterns or trends? At what levels (unit, division, agency, departmental) has information been aggregated? Is this information being fed back to decision makers to help make future decisions?

    EVALUATION PROCESS

    Conduct Post Implementation Reviews (PIRs) Using a Standard Methodology


  95. Does the organization have a defined, documented process for conducting post implementation reviews (PIR) of IM/IT projects?

    • Is the purpose(s) of the PIR process clearly explained and communicated?
    • Is the process clear about when PIRs are to be conducted?
    • Are PIRs required on a regular basis to ensure that completed projects are reviewed in a timely manner?
    • Does the process delineate roles, responsibilities and authorities for people and offices involved in conducting the PIRs?
    • Does the process help ensure that these assessments are objective?
    • Does the process stipulate how conclusions and recommendations resulting from PIRs are to be communicated to and reviewed by senior management?
  96. Does the organization have a standardized methodology for conducting PIRs? Does this methodology, at a minimum, include assessments of customer satisfaction, mission/programmatic impact and technical performance/capability? Is the methodology required at all levels of the organization?
  97. Does the process define how the outcomes of PIRs are to be addressed? Are roles and responsibilities defined for taking action to address any concerns or problems that are identified?
  98. What steps does the organization require to ensure that PIRs are conducted independently and objectively? Are the results of the PIRs validated or verified?
  99. Are the causes of project and process problems identified as part of the PIR?

    Feeding Lessons Learned Back into the Planning and Tracking & Oversight Phases


  100. Does the organization process or methodology for conducting PIRs include provisions for:

    • changing or improving existing management decision making processes; and
    • strengthening project level management?
  101. Does the organization have a mechanism for tracking (over time and for different kinds of projects) and aggregating the results of PIRs that are conducted?

    • Are the results of PIRs collected and maintained (in a manual or automated database)?
    • Is the information reported in a timely manner?
    • Are the results easily accessible? Has the organization identified roles and responsibilities for collecting and analyzing PIR report information?
  102. Does the organization have procedures for regularly reporting PIR results to senior management? Is this information reliable, accurate and easily accessible?
  103. Does the organization have procedures for regularly assessing the PIR process for completeness, quality and contribution to project level and executive decision making?

    Measurements of Actual vs. Projected Performance


  104. Is the organization collecting projected versus actual cost, benefit and risk data as part of the post implementation reviews?

    • Has the cost, benefit and risk information that was used for initial project justification been preserved?
    • Have updates that have been made to costs, benefits, or risks been noted? Have these updates also been preserved?
    • Have project benefits that were obtained been quantified?
    • If not, are qualitative measures being used to determine impact?
    • Have the cost data been verified or validated?
    • Are these data contained in a recognized organization financial management/accounting database?
  105. Does the PIR include assessments of customer satisfaction (end users, business or program unit sponsor, etc.)?
  106. Does the PIR include assessments of technical capability (e.g., conformance to a recognized systems development methodology, architecture compliance, contractor performance and oversight)?

    Documented "Track Record" (Project and Process)


  107. Does the organization routinely record its evaluation activities?
  108. Has the organization conducted trend analyses (descriptive or statistical) using the results of PIRs that have been conducted?

    • Do these analyses attempt to correlate actual project performance results with factors that may have caused these results (positive and negative)?
    • Are the results of these analyses presented to management as a regular part of the investment decision making process?
    • Are special reports issued to executive management?
  109. Are recommendations for specific projects and senior management decision making processes presented in PIRs?

    • Do these recommendations cover changes to process, data or decision making procedures used for both the Planning and Tracking & Oversight phases?
    • Are these recommendations well documented and supported by existing analyses?

    Assessing Projects' Impact on Mission Performance and Determining Future Prospects for the Project


  110. What decisions have been made by senior management (or investment review group) regarding whether implemented projects met defined criteria for successful outcomes?
  111. Were corrective actions for specific projects included in senior management's decisions?

    • Were timetables and steps for implementing these changes established as part of the decision?
    • Were follow-up management reviews established?
    • Has a clear purpose for these reviews been defined?
  112. Has a plan been developed detailing how future O&M and disposition costs will be addressed?
  113. Are decisions that are being made on specific projects cognizant of the potential (or actual) impact the decision may have on other related projects?
  114. Have decisions regarding the status of projects been finalized? Have expected changes been communicated to the project manager?

    Revising the Planning and Tracking & Oversight Phases Based on Lessons Learned


  115. What decisions have been made to modify existing organizational IM/IT investment management processes?

    • Have these decisions been communicated to staff?
    • Are changes to existing processes, operating procedures and data requirements aligned with conclusions and recommendations documented in PIRs?
    • Has the organization clearly established and communicated when these changes to existing management processes will take effect?