This page has been archived.
This chapter describes the Systems Development Life Cycle and the roles within that Cycle, in enough detail that an auditor can perform an audit of development at any phase of any department's interpretation of the SDLC into its own Systems Development Process (SDP). This means that the auditor should be able to assess any project's progress, lay out the tangible accomplishments for comparison to those accomplishments deemed appropriate for the sequential stages of development considered standard by this guide, and so determine "where" the project is relative to the standard. This will then enable the auditor to select, from all audit objectives given to that point in the standard SDLC, the audit objectives appropriate for the particular project.
Treasury Board's "Management of Information Technology" policy (June 1990) supersedes Treasury Board Administrative Policy Manual 1978, Chapter 440.3 (Appendix J of this guide). Chapter 440 defined the Systems Development Life Cycle on which this guide was based. Although adherence to a specific SDLC is no longer prescribed by Treasury Board, this audit guide remains of value in defining the audit of an SDLC, which is still an accepted systems development practice.
The purpose of an SDLC is to allow system innovators and users to produce a controlled, economical, efficient and effective system. The following phases of the development process were suggested in Treasury Board Administrative Policy Manual 1978 Chapter 440.
While the Standard SDLC describes seven life cycle stages, individual Departmental SDLCs may contain more or fewer than these seven. However, from the work content of each stage or combination of stages in a particular SDLC, parallels of progress can be drawn by comparison to the seven-stage standard of project work accomplishment (see Figure 3). Therefore, as was previously stated, appropriate audit objectives and audit criteria (discussed in Chapter 3) can be selected for a particular system's audit from those applicable at the same and previous stages of this Guide's sequenced set of objectives.
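The matching exercise described above can be sketched in code. The following is a minimal, hypothetical illustration only: the objective labels are invented for the example and are not taken from this guide. It shows the mechanics of selecting, for a project mapped to a given standard stage, all audit objectives applicable at that stage and every previous one.

```python
# Illustrative sketch: once a departmental stage has been mapped to one of
# the seven standard stages (by comparing deliverable content), select the
# cumulative set of audit objectives up to and including that stage.
# The objective labels below are invented placeholders, not the guide's text.

STANDARD_STAGES = [
    "Project Initiation", "Feasibility Study", "General Design",
    "Detailed Design", "Implementation", "Installation", "Post-Installation",
]

OBJECTIVES = {
    "Project Initiation": ["terms of reference approved"],
    "Feasibility Study": ["alternatives analysed", "cost/benefit justified"],
    "General Design": ["functional specifications signed off"],
    "Detailed Design": ["controls and test plans documented"],
    "Implementation": ["programs and manuals tested"],
    "Installation": ["conversion completed and approved"],
    "Post-Installation": ["project performance reviewed"],
}

def applicable_objectives(current_standard_stage: str) -> list:
    """Return the objectives for the current stage and all prior stages."""
    idx = STANDARD_STAGES.index(current_standard_stage)
    selected = []
    for stage in STANDARD_STAGES[: idx + 1]:
        selected.extend(OBJECTIVES[stage])
    return selected
```

A project assessed as being at the General Design stage, for example, would attract the objectives of Project Initiation and Feasibility Study as well as its own.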
On the same note, one current school of thought holds that, in this day of microcomputers, prototyping, and fourth-generation languages, organizations cannot afford the constraints of a formal life cycle methodology. Nonetheless, the auditor's responsibility is to ensure that adequate management control points exist, whatever the individual life cycle in place. To this end, the content or deliverables of the development phases must exist and should have been completed in a logical sequence consistent with the Standard SDLC.
Before presenting a generic comparison of the Standard SDLC with an example that uses different terminology, we should discuss one recent development technique in more detail.
Application prototyping is defined in this Guide as "dynamic visual modelling that provides a communication tool for the user and developer that is more effective than either narrative prose or static visual models for portraying functionality". It is an approach intended to simulate the ultimate system. The technique is an adjunct to a development methodology, not a replacement. If a conscious decision has been made to use the technique at all, prototyping should be used at the Feasibility and General Design Stages to determine functional and data requirements by permitting the user "hands-on" involvement at the earliest stage possible. When the technique is chosen, the auditor should examine the decision of the project team and the control over the use of prototyping at the Feasibility and General Design stages.
Note that the auditor should ensure that Prototyping is not confused with Piloting. A prototype may be built with non-production software and thus cannot be gradually expanded into the production version. A Pilot system is intended from inception to be expanded into the production version.
The auditor must ensure that the difference is recognized by the project team or that formal, signed-off decisions exist to extend the "prototype" beyond the General Design (or equivalent) stage.
The auditor should be aware of the current tendency for departments to manage their data formally and the effect on systems development that data administration and data base administration are having or should have in their environments. An excellent reference is "Information Management Strategy For Common Systems Report - 1989", by the TBS Advisory Committee for Information.
Figure 2, below, illustrates a sample development methodology, in flow chart form, using data management and structured design techniques. This is not the Standard SDLC approach. However, many of the terms and deliverables of the stages are similar to those used in the Standard SDLC. By matching the deliverable content of the standard stages and the deliverables of the audited system under development, the auditor will be able to select the equivalent standard stage objectives from this guide. This will provide the auditor with a core set of objectives, to be augmented dependent on the nature of the particular system, in order to yield optimum audit coverage during development (whatever the local SDLC and particular system characteristics).
Figure 3 (below, following Figure 2) is a summary of deliverables by stage in the Standard SDLC.
Figure 2: A Sample of a Non-Standard Systems Development Life Cycle
Note 1: An active data dictionary exercises greater computer control over metadata (data about data in the system) than the passive dictionary.
Figure 3: Standard SDLC - Deliverables by Stage
At this stage terms of reference for the project should be formally defined and the project control parameters established.
Procedures involve performing a preliminary review of the existing system (if any) to assess the need for change and the nature of the suggested changes. The "problem" must be defined. A potential solution should be conceptualized for reference during the feasibility study phase. The description of the solution should not be so detailed that it prejudices the alternatives examined during the feasibility study.
At this time all external and internal constraints (cost, time, legislation, departmental guidelines, user needs, etc.) should be determined and their impact on the problem and the solution assessed. Security, including disaster recovery requirements, and privacy issues should be assessed during this phase.
This phase produces a Project Initiation Report.
When this stage is complete, an appropriate solution to the problem should have been determined and a preliminary plan for its implementation designed.
Users' Requirements may be documented or established by prototyping, thus providing a basis for identifying a solution.
It is of prime importance that enough alternative approaches be examined. A detailed analysis, at the conceptual level, of the various alternatives should support a formal justification for the suggested solution. This analysis should include cost benefit analysis (or similar techniques), consideration of financial and operational controls, and organization compatibility. As in the project initiation phase, care must be taken that evaluations are objective and complete and that there is no "built-in" bias towards one particular solution.
Resource requirements for the remainder of the project should be identified and time and costs estimated for management approval. Broken into appropriate project phases, these factors will be used to maintain and monitor project development.
Documentation of the above should be contained in a Feasibility Study Report.
Work during this phase will translate the proposed conceptual solution, determined during the feasibility study, into a workable solution ready for detailed design and implementation.
This will require:
Documentation of the information gathered in this stage will typically be contained in a General Design Report. Some departments may prefer to prepare two reports, the second to highlight the Business System Design by itself. Either way, these elements of the system must be clearly documented.
Based on the functional specifications from the general design stage, detailed procedures and computer specifications are produced. All controls, procedures, work flows, input/output documents, processing logic, file/data base layouts, and data elements will be finalized.
Management and user approval of this design stage is paramount. Therefore, the final product of this phase, the Detailed Design Report, should contain, in addition to detailed program specifications, workflows, etc., a non-technical description of the entire system. This should encompass:
Appropriate members of management should review the detailed specifications and technical requirements.
Documented system test plans and implementation and conversion plans should also be produced at this stage, and, in addition, a plan on how the activities in the implementation and installation phases will be coordinated. This will include preparing instruction manuals (users and operators), training, security, back-up and conversion procedures.
This stage creates all computer programs, forms, manuals and training material needed for an operational system.
Detailed program logic will be designed and application software coded.
User, operations and training manuals will be finalized and should cover, where appropriate:
All aspects of the system, including program logic and operational procedures, should be thoroughly tested. All procedures required for the installation of the system should be defined and scheduled.
This stage converts the system to operational status. The work includes converting existing files (if any) or creating the initial information base, training all personnel involved with the system (user and EDP), and instituting control and operational procedures through pilot or parallel run phase-in. All documentation from previous phases should be finalized. Conversion and installation procedures should be reviewed and tested. The project manager should issue a formal Project Completion Notice for approval.
Work during this stage consists of examining the project performance and system performance against the original project documentation of system cost/benefit and project cost and time schedules.
A period of settling in is normally allowed between installation and the post-installation audit. The audit team could be changed at this point as well, to maximize objectivity, though the objectivity gained may be offset by decreased audit efficiency.
Thus, project reviews are important soon after system installation to assess the success of the systems development process and to identify any differences between the controls as designed and the controls as they operate.
As we have noted, adequate departmental standards should exist and be adhered to for each SDLC stage to ensure consistent and complete management control over implementation. However, it may be appropriate for the department to have defined and approved a separate set of SDLC activities based on the type of project being undertaken (i.e. major or minor system development). It is normal to document management approval of the deviation from departmental standards.
In many cases of micro- or mini-computer end user development, examining the importance of the data/information to the corporate body may indicate that some or all of the control points of an SDLC should be present.
Lastly, in evaluating whether or not system development or change is minor enough to justify grouping or eliminating some of the SDLC stages, the auditor should keep in mind that some relatively small system changes could be very significant from a control point of view.
The typical roles in the systems development process illustrate the contribution of each stage in the SDLC model to management's assurance of control, economy, effectiveness and efficiency in systems development. These are very basic descriptions, but they serve as examples of the roles an auditor should expect to find in a controlled environment. These roles, or their equivalents, and others are illustrated in Appendix A as interviewees for the questions related to the suggested Objectives and Criteria for each audit stage.
Management has a review role, to ensure that the developed system meets the ultimate goals of the organization. Management sets priorities on projects, budgets, and target dates. Management establishes departmental policies and standards for system development, and then takes the appropriate occasions to exercise its control over the development process by ensuring that an SDLC is in place and is functioning as designed.
A major management responsibility is to decide how much risk can be tolerated in any project.
Management may need third-party technical help with these management responsibilities.
Each departmental organization usually appoints a sign-off authority for each stage of the development process. Taken together, they represent the approval authority. Some departments have an EDP Steering Committee at the DM level, which should consider all systems audit reports. This is sometimes the final approval authority. The key issue for internal audit is that some evidence of a formal approval process, with senior-level sign-off, be in place.
The designer/analyst works with the user requirements to develop a system that meets the objectives and needs of the user. The designer is responsible for ensuring that the system design is comprehensive and workable. The designer/analyst is also responsible for overall system control over data that transcends or integrates individual program controls. The designer also bears the responsibility for choosing the optimal technical design alternative. The auditor should ensure that the analyst's control role is not compromised by the project manager or anyone else.
The programmer creates an effective and efficient program from a specification drawn up by the analyst/functional representative. The program could be a dialogue or module of the overall system, and the controls in the specification must ensure that the data entered into the program retain their integrity throughout the program's processing. Control over data must be programmed into the input editing process, internal EDP program processing, and output, whether displayed, communicated, or printed.
It is the user who in the early stages of development clearly defines and supports the objectives and requirements to be satisfied by the system. It is also the user's responsibility to establish control requirements and to ensure that the resulting system delivers the required control. The user may need third-party technical help to ensure that the required control is in place.
The Treasury Board's Security Policy and Standards, 1989, outlines the security responsibilities of the Department, the Royal Canadian Mounted Police (RCMP), the Department of Supply and Services, the Department of Communications, the Department of Public Works, and the Security Evaluation Inspection Team (SEIT). The roles of the Departmental Security Officer and the Security Advisory Committee, within the two activities of Security Co-ordination and Security Administration, are briefly outlined. Every department should provide, directly or by consultation with the RCMP SEIT, advice, standards and evaluation of the physical (versus logical) controls required within that department over data, information and physical assets. Sign-off from the approval authority should be evident at each required stage in the SDLC.
Other security references are contained in Item 12 of Appendix J.
New emphasis is being placed by many departments on managing data and information as a critical corporate asset. Data Administration can be described as the functions of planning, administration, and control of the data-related activities of an organization, and the Data Administrator is that person or organization responsible for carrying out the Data Administration.
A person from the Data Management or Administration area should be identified as a key project team member.
Data Base Administration plans, controls, and performs any other functions that directly lead to or have an immediate impact on operational data bases. The Data Base Administrator is the person or organization responsible for the functions of Data Base Administration. Where there is a technical distinction between the analyst and the data base administrator, the auditor should ensure that data base administration is represented on the project team.
The auditor should review and evaluate the management controls used in developing new application systems or major enhancements. The auditor will look for evidence that there has been adequate user participation in the design and acceptance of the system and that there is adequate attention in the detail system and procedures design to accomplish general and application control.
The exact extent of the auditor's participation in systems development is determined by the risk to the organization of the development activity. The risk comprises elements of development cost, operational cost and the organization's dependence on the information processed. Today's systems design and development activity consumes a growing portion of organizational time and expense, and organizations rely more than ever on the continued functioning of their EDP systems. In much the same manner that the auditor would establish the materiality of the findings, the auditor should also establish the reason for choosing certain criteria over others in the Planning Phase of the audit. This is accomplished by establishing the extent of the risk to the department should a particular management control be poorly executed. In some cases, this approach may enable very few audit resources to handle very large systems development projects.
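The risk-based scoping described above can be illustrated with a small sketch. The weights, scales, and thresholds below are illustrative assumptions only; the guide prescribes no particular scoring model.

```python
# Hypothetical sketch of risk-based audit scoping. Risk is treated as a
# weighted combination of the three elements named in the text: development
# cost, operational cost, and the organization's dependence on the
# information processed. All numbers here are assumed for illustration.

def project_risk(dev_cost, op_cost, dependence, weights=(0.3, 0.2, 0.5)):
    """Combine the three risk elements (each scored 0-10) into one score."""
    w_dev, w_op, w_dep = weights
    return w_dev * dev_cost + w_op * op_cost + w_dep * dependence

def audit_effort(risk_score):
    """Translate the score into a coarse level of audit participation."""
    if risk_score >= 7:
        return "full participation at every SDLC stage"
    if risk_score >= 4:
        return "review at key sign-off points"
    return "post-installation review only"
```

Under this sketch, a costly project on which the organization is heavily dependent would attract audit attention at every stage, while a small, low-dependence change might warrant only a post-installation review.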
The auditor may find very complex documents, deemed necessary to explain the system development role relationship between product managers, communication system designers, data base administrators, data owners, users, clients (sometimes called users' users) and a host of other titles that have sprung up to deal with the more complex Information Technology world described in Chapter 1.
Except when developing system requirements for the audit group, as a user of the system, the auditor must never be held directly responsible for any project activity. Auditors are "outside" the project team, even though they may offer advice on control, by letters or reports, to the project team. Through all project development stages, the auditor must verify that all of the issues and role/reporting concepts that arise during the project are well documented.
In all of their systems development review activities, auditors must ensure that the independence of the audit function is not compromised for later ongoing system reviews. This is normally accomplished by assigning different auditors once the system has been installed, and through the manner in which the system development auditor makes observations and recommendations for the improvement of control. The auditor must always resist being involved in the actual system design.