Service Improvement Planning Self-Assessment

Steps | Low Performance (1) | In Transition (2) | High Performance (3)
1. Internal Assessment: identify the internal and external client(s), product(s), service(s), partner(s) and stakeholder(s) | Little or no definition of client groups, partners, or stakeholders | Some attempt to identify clients; some attempt to identify partners and stakeholders | Everyone in the organization clearly understands who the clients for each product or service are, who the partners and stakeholders are, and how they are involved
2. Assess Current State: establish a client feedback strategy; identify current levels of employee and client satisfaction, expectations and priorities | No consistent mechanisms to obtain feedback from clients and employees | Sporadic client and/or staff surveys | A strong strategy to continually measure client and employee satisfaction, expectations, and priorities for improvement
3. Desired Future State: create service vision and mission statements | Citizen-centred approach not articulated in the mission and vision statements | "Let's improve service" mission and vision statements communicated to staff | A mission and a vision statement that support citizen-centred service improvement, developed with and supported by staff
4. Priorities for Improvement: identify areas for potential improvement | No organizational priorities have been defined | Some priorities to improve service have been defined and communicated within the organization | Priorities have been identified and communicated around the five drivers of satisfaction, visibility, access, human resources management, partnerships, and the use of IT for service improvement
5. Set Standards and Targets: set improvement targets; set client-driven service standards | None or unclear; performance not measured | Service standards are not client-driven; occasional performance appraisals | Client-driven standards established and published; performance measured at all levels and reported regularly to all staff
6. Design Improvement Plan: develop an action plan to obtain improvements for each goal; identify responsibilities, define a schedule, allocate resources | No improvement plan; no specific actions to improve service; not linked with client priorities for improvement | Not formalized; not integrated within the planning cycle; a one-shot effort | A service improvement plan is in place that includes actions to improve service, responsibilities, monitoring of results, clear accountability, and a schedule for completion and renewal
7. Implementation: implement the improvement plan | No implementation | Actions defined in the plan are partially implemented; no clear responsibility and accountability for implementing the plan | Actions defined in the plan are fully implemented; responsibility and accountability for implementing the plan are clear to all employees
8. Monitor: monitor and measure progress, ensure accountability for results | None | Occasional review of the current state of each improvement action | Regular review of current improvement actions as well as improvement priorities; regular updates of standards; re-planning
9. Recognition: establish, monitor and maintain an employee recognition program | None | Occasional award ceremonies; management-driven; part of performance appraisal | Formal and systematic; recognizes performance improvement, client satisfaction, and the achievement of targets

Subtotal A: ____ | Subtotal B: ____ | Subtotal C: ____

Add up your scores: A + B + C = ____

Low SIP Performance: 9 - 15
In Transition SIP Performance: 16 - 23
High SIP Performance: 24 - 27
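For readers who wish to tally the self-assessment programmatically, the following is a minimal sketch in Python; it is not part of the original guide. The nine-step structure and the scoring bands come from the table above, while the function and variable names and the sample ratings are illustrative assumptions only.

# Illustrative sketch: tally a Service Improvement Planning self-assessment.
# Each of the nine steps is rated 1 (Low), 2 (In Transition) or 3 (High).

SIP_BANDS = [
    (range(9, 16), "Low SIP Performance"),              # 9 - 15
    (range(16, 24), "In Transition SIP Performance"),   # 16 - 23
    (range(24, 28), "High SIP Performance"),            # 24 - 27
]

def sip_band(step_scores):
    """Sum the nine step ratings and map the total to a performance band."""
    if len(step_scores) != 9 or not all(s in (1, 2, 3) for s in step_scores):
        raise ValueError("Expected nine ratings, each 1, 2 or 3")
    total = sum(step_scores)
    for score_range, label in SIP_BANDS:
        if total in score_range:
            return total, label

# Hypothetical example: an organization that rates itself mostly "In Transition".
print(sip_band([2, 2, 1, 2, 2, 3, 2, 2, 2]))  # (18, 'In Transition SIP Performance')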
Continuous Feedback Strategy - sample

Data required | Method | Frequency | Responsibility
Citizen expectations and needs | Focus groups | Annually | Director, Strategic Planning
Client satisfaction | Exit survey; postal survey | Continuous; annually | Front-line staff; Director, Client Services
Client priorities for improvement | Exit survey; postal survey | Continuous; annually | Front-line staff; Director, Client Services
Client complaints | Complaint management system | Semi-annually | Director, Client Services
Employee satisfaction | Short electronic employee survey | Semi-annually | Director, Human Resources
Employee ideas for improvement | Quality circle meeting | Bi-monthly | Director, Client Services
A focus group is a small group, usually between six and twelve people, brought together to provide views on particular services and products in a consensus-building discussion. These groups stimulate discussion on specific topics and are useful for gathering balanced and detailed input from a variety of clients with different perspectives. Focus groups encourage innovative thinking and consensus building around a specific product, service or service delivery process. A discussion guide often helps to direct the group, focusing the discussion on specific topics to ensure the information required is obtained.
Within the service improvement initiative, focus groups can be used in the assessment process to identify what is important to clients in terms of service and what would constitute good service in terms of standards. Focus groups can also be used in the middle of the process to identify solutions or actions to improve service, or to better understand the meaning of specific comments provided by clients in a survey or on a comment card.
Tips on conducting successful client focus groups
The Common Measurements Tool (CMT) provides public organizations with a set of standard questions and standard measurement scales for use in surveying their clients. It must be stressed that it is a tool, not a "ready-to-use" client satisfaction survey. Rather, it is a comprehensive collection of potential survey questions from which an organization may select to custom-design a client satisfaction survey that meets its information requirements. Organizations are encouraged to select those sections that are most appropriate to their services and clients. The use of standard questions allows the organization to benchmark progress on its service improvement journey over time, and since the questions are standard, organizations can compare results with other organizations within the same business line. To ensure this ability to benchmark performance, several core questions will be required for inclusion in all surveys. These are presented below (see the Level of Satisfaction Table - Core Questions).
The CMT is also a client survey, not a citizen survey. A client survey deals with service delivery at an operational level and with specifics of the service delivery experience, such as the time required to deliver the service, whether staff were courteous, and the accessibility of the service. In contrast, a citizen survey addresses issues only indirectly related to the delivery of services, such as service delivery mechanisms and structures.
Designed to provide client feedback to any public organization and ensure that all aspects of client service are considered, the CMT is conceived around five key elements: client expectations, perceptions of the service experience, satisfaction levels, levels of importance, and priorities for service improvements. These are the basis for the types of questions asked in the CMT, which is arranged around five dimensions of service delivery: responsiveness, reliability, access and facilities, communications, and cost (where applicable).
With a focus on these five elements, the organization is able to know the degree of client satisfaction with various aspects of service delivery, and what clients consider important in service delivery. Once the priorities for improvement are considered and the expectations known, the organization can focus its efforts where they will best serve to close the service gap in meeting the needs, expectations, and priorities of clients.
Comprehensive information on the five key service delivery elements provides a solid foundation for decision-making, such as choosing where to focus improvement efforts and how to allocate resources. It may also help in managing client expectations, where those expectations are unrealistic or unachievable, through better communication with clients.
In addition to surveys, comment cards are a common method to gather feedback from clients. Both tools provide some of the same information, but each is intended for a specific purpose. A survey is intended to gather information that can be analyzed and results benchmarked over time. The process uses a methodology that ensures that the results are representative and statistically valid. This allows the organization to know with a degree of certainty the extent to which service improvement efforts have impacted customer satisfaction, and to make strategic decisions based on the information.
Comment cards only provide broad opinions, which are often valid but which cannot be used to track changes. As they do not follow a rigorous methodology, they are not considered statistically valid or representative of the client base. Since anyone can complete a comment card, in many cases they are completed only after a negative experience and have been referred to as "complaint cards." The primary purpose of the comment card is to provide information to staff quickly so that operational problems can be corrected as soon as possible, notably on the key drivers of service (e.g. timeliness, staff courtesy, staff competence, quality of information, fairness, and outcome of the service). In addition, comment cards emphasize open-ended questions to capture broader comments on the service experience. Comment cards, as such, serve to supplement rather than replace a customer satisfaction survey. A generic comment card can be found here in PDF (107KB) or RTF (453KB) formats.
Organizations may conduct employee surveys as part of their service improvement initiative for a number of reasons. Employee involvement, commitment and participation are key elements for any organization that wants to improve service for citizens. Management needs the opinions of the workforce to identify areas for improvement and should, therefore, provide regular opportunities for employees to participate in the decision-making process.
A well-handled employee survey can catalyze or enhance communication, partnerships with employees, and motivation. Morale, productivity, commitment and organizational vitality can be substantially improved by listening to and acting on employee suggestions.
Employee surveys can:
To ensure the quality and the validity of the survey some methodological and strategic decisions will be required to:
As part of the survey process, the organization should determine if the survey met its stated objectives. The organization might assess success against the following criteria:
An employee survey should focus on satisfaction and priorities for improvement in five areas:
For more information on employee surveys, please consult:
Canada. Statistics Canada, 1992. Guide to Conducting an Employee Opinion Survey in the Federal Public Service. Special Surveys Group, Statistics Canada.
Canada. Treasury Board of Canada Secretariat, 1999. Public Service Employee Survey.
Canada. Treasury Board of Canada Secretariat, 1995. Employee Surveys. Quality Service Guide VI.
Edwards, J. E., M. D. Thomas, P. Rosenfeld, and S. Booth-Kewley, 1996. How to Conduct Organizational Surveys: A Step-by-Step Guide. Thousand Oaks, California: Sage Publications.
Harwood, Paul L, 1998. Employee Surveys in the Public Service: Experience and Success Factors. Ottawa: Canadian Centre for Management Development.
As noted in Step 4 in the How-to Guide: Setting Improvement Priorities, one way to determine priorities for service improvement is to create a service improvement matrix. This allows decision-makers to visualize potential areas for service improvement based on client survey responses through a plot of client satisfaction and the importance of each service element.
By plotting the two ratings on a two-dimensional grid, it can quickly be determined which improvements are both necessary (low satisfaction ratings) and important to clients (high importance ratings). The location of each plotted service element isolates those that are service improvement priorities (see the legend below).
The following data illustrates the use of the matrix, based on the five-point scale of the CMT.
Dimension of Service | Satisfaction | Importance
a) Hours of service | 1 | 3
b) Comfort of waiting area | 2 | 4
c) Waiting times | 2 | 2
d) Parking | 5 | 2
e) Staff courtesy | 4 | 5
The Service Improvement Matrix
(also available in PDF (45KB) and RTF (106KB) formats)
The Four Quadrants of the Matrix

Priorities for Improvement: service elements in this quadrant have low satisfaction levels but are among the service dimensions most important to clients. These are the elements that require immediate attention.
Strengths: this quadrant contains the service elements that clients consider important and with which they have a high level of satisfaction. No improvement is required for these elements.
Redeployment?: elements in this quadrant have high satisfaction levels but are not important to clients. No improvement is required, and an opportunity may exist to reallocate resources in support of other improvement priorities.
Opportunities: clients have low levels of satisfaction with these elements, but the elements are also not important to them. Improvements here are not a priority at the moment.
Note: by moving the crossed centre lines of the matrix (up-down, left-right), how strictly improvement priorities are screened can be adjusted. The example above reflects such an adjustment.
Based on the data in this example (which uses an adjusted centre-line), Hours of Service and Waiting Times emerge as improvement priorities.
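For readers who wish to reproduce the quadrant classification programmatically, the following is a minimal sketch in Python; it is not part of the original guide. The element ratings come from the example data above, while the function names and the centre-line cut-off values are illustrative assumptions. Because the guide's figure uses adjusted centre lines, output produced with these midpoint cut-offs will not necessarily match the priorities stated above.

# Illustrative sketch only: classifying service elements into the four quadrants
# of the Service Improvement Matrix. The cut-off values (centre lines) are a choice;
# midpoint cut-offs are assumed here, not the adjusted lines used in the guide's figure.

ELEMENTS = {  # element: (satisfaction, importance) on 5-point CMT scales
    "Hours of service": (1, 3),
    "Comfort of waiting area": (2, 4),
    "Waiting times": (2, 2),
    "Parking": (5, 2),
    "Staff courtesy": (4, 5),
}

def quadrant(satisfaction, importance, sat_cut=3.0, imp_cut=3.0):
    """Place one service element in a quadrant of the matrix."""
    low_satisfaction = satisfaction < sat_cut
    high_importance = importance >= imp_cut
    if low_satisfaction and high_importance:
        return "Priority for improvement"
    if high_importance:
        return "Strength"
    if low_satisfaction:
        return "Opportunity"
    return "Redeployment?"

for name, (sat, imp) in ELEMENTS.items():
    print(f"{name}: {quadrant(sat, imp)}")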
When designing your questionnaires and testing them with focus groups, verify that they will yield the data needed to construct a service improvement matrix. Keep in mind that several priorities may emerge, or that the priorities of minority groups may be squeezed out of the "Priorities for Improvement" quadrant. It is therefore essential that the importance ratings be supported in the questionnaire by a question asking respondents to identify their top priorities for improvement.
1 The description of the Service Improvement Matrix is taken from Listening to Customers: An Introduction prepared by S.A. Woodhouse et al. for the Service Quality B.C. Secretariat, Government of British Columbia, 1993. [return]
What are citizen expectations for the speed of service in various delivery channels? The 1998 Citizens First survey documents Canadians' expectations for service standards in the areas of telephone service, counter service, voice mail, mail service, e-mail, and referrals.
For more information on service standards in the Government of Canada, please consult:
Canada. Treasury Board of Canada Secretariat, 1995. Quality Services - Guide VII: Service Standards. Ottawa: Minister of Supply and Services Canada.
Canada. Treasury Board of Canada Secretariat, 1996. Service Standards: a Guide to the Initiative. Ottawa: Minister of Supply and Services Canada.
Canada. Treasury Board of Canada Secretariat, 1996. Quality and Affordable Services for Canadians: Establishing Service Standards in the Federal Government (An Overview). Ottawa: Minister of Supply and Services Canada.
A Service Standard and Satisfaction Targets Template is available in RTF (79KB) and PDF (67KB) formats.
The following pages present several options that departments and agencies, at different organizational levels, may wish to consider in the creation of their service improvement plans.
The service improvement plan is, in essence, a comprehensive summary document that captures the information collected and synthesized in Steps 1 to 5 and presents it in a concise manner. It identifies the clients, partners and stakeholders of the organization, and states the organization's mission statement, including a service vision that provides focus and serves as a reminder of the mission while the plan is being drafted. It identifies the client feedback mechanisms used and the current levels of client and employee satisfaction, expectations, and priorities. From this, it identifies the priorities that clients have set for service improvement, leading to satisfaction improvement targets and client-driven service standards. The plan then states the actions needed to achieve the targets, defines the schedule, and allocates resources and responsibilities for improvements. It also looks forward to the later steps by identifying how progress will be measured for each target.
The examples found on the following pages are examples only. They are not intended to represent the definitive ways to structure a service improvement plan, but to serve as suggestions, for organizations from the work-unit level up to the department- or agency-wide level, of what such a plan might look like.
The examples are presented in a variety of formats, but generally all contain the same basic information elements. Remember, the service improvement plan is designed primarily to serve the internal needs of the organization in structuring and planning actions for improvement. As such, the structure of the plan should reflect the needs of the organization so that it is understandable by management and staff. A secondary consideration is any information needed for reporting purposes; in these cases, the organization may wish to use the same format in the plan as it will use in the reporting process, to simplify report preparation.
The following guidelines apply to all departments and agencies which have been identified as having significant service delivery activities with Canadians under the Service Improvement Initiative. However, all departments and agencies are encouraged to apply these guidelines where the information is available.
Results for Canadians commits the Government of Canada to measurable improvements in client satisfaction. Through the Service Improvement Initiative, the Treasury Board of Canada commits the government to achieve, at a minimum, a 10% increase in client satisfaction with key, significant direct service delivery activities by the year 2005. This initiative, which last year was integrated with the Government On-Line Initiative at Treasury Board, encourages service improvement from a citizen-centred perspective, focusing on achieving real improvement in client satisfaction with service quality. Departments are required to use their Departmental Performance Report (DPR) to specify the activities, targets and achievements that are moving the government toward the 10% target. Ongoing service improvement depends on departments' and agencies' ability to measure levels of client satisfaction, set targets for improving client satisfaction with their key services to the public, monitor implementation, and report progress on improvement in client satisfaction for key services to the public.
Departments should report on four key elements of their service delivery performance:
The establishment of a formal and structured service improvement plan by business line, program, delivery channel (or geographic basis) is one of the key tools to achieve improvements in client satisfaction. In the report, it is important to identify what programs and services are formally covered by a service improvement plan, and how they are linked with an on-going client feedback strategy which allows the department to understand client needs, expectations and priorities for improvement, as well as monitor progress toward satisfaction targets and update its service improvement strategy.
Departments must demonstrate that the most significant programs and services from the citizen's point of view are appropriately covered.
Departments should also provide an overview of the key client priorities for service improvement identified this year, and the main actions taken to address them.
Measuring client satisfaction with service quality is the most effective way to assess whether actions undertaken to improve service have had a real impact. While each department and agency will set its own targets for improving client satisfaction results, departments will report on annual progress in improving client satisfaction and on meeting their annual improvement targets. In particular, departments will report their progress toward improving the timeliness of service, service accessibility, outcome, and overall satisfaction; each of these aspects of service was highlighted in recent research as requiring specific attention. Departments must indicate, where applicable, whether services are surveyed on a cyclical basis (i.e. other than annually). Where pertinent, detailed survey results regarding client satisfaction levels and priorities for improvement are to be made available through an appropriate link to the departmental Internet site.
Level of Satisfaction Table - Core Questions - Example (Tabulate by Service)

Issues | Questions | Baseline | 2002-2003 | 2002-2003
Timeliness | Overall, how satisfied were you with the amount of time it took to get the service/product? | XX % | XX % | XX %
Access | Overall, how satisfied were you with the accessibility of the service/product? | XX % | XX % | XX %
Overall Satisfaction | How satisfied were you with the overall quality of service delivery? | XX % | XX % | XX %
Outcome | In the end, did you get what you needed? | XX % | XX % | XX %
Common questions on client satisfaction have been defined for government-wide reporting. To ensure consistency and to be able to benchmark results, these common questions should be integrated into client satisfaction surveys undertaken by departments using the Common Measurements Tool (CMT). Detailed client satisfaction results should be made available on the departmental WWW site with the appropriate URL reference in the table. Departments are encouraged to use the CMT Benchmarking Database in their service improvement work. The CMT Benchmarking Database provides the opportunity to anonymously compare your CMT survey results with peer organizations across Canada and internationally. For more information visit the Institute for Citizen-Centred Service's website at http://www.iccs-isac.org/eng/default.htm
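The following is a minimal sketch in Python of how the "XX %" cells in a table like the one above might be tabulated from individual survey responses; it is not part of the guide. The assumption that a respondent counts as satisfied when giving a rating of 4 or 5 on the CMT's 5-point scale is ours, and the question labels and sample data are hypothetical.

# Illustrative sketch only: tabulating percent-satisfied figures per core question
# from 5-point CMT-style responses, assuming "satisfied" means a rating of 4 or 5.

from collections import defaultdict

def percent_satisfied(responses, satisfied_ratings=(4, 5)):
    """Return, for each question, the share of respondents who gave a satisfied rating."""
    counts = defaultdict(lambda: [0, 0])  # question -> [satisfied, total]
    for question, rating in responses:
        counts[question][1] += 1
        if rating in satisfied_ratings:
            counts[question][0] += 1
    return {q: round(100.0 * sat / total, 1) for q, (sat, total) in counts.items()}

# Hypothetical responses: (core question, rating on a 1-5 scale).
sample = [
    ("Timeliness", 4), ("Timeliness", 2), ("Timeliness", 5),
    ("Access", 3), ("Access", 4),
    ("Outcome", 5), ("Outcome", 5),
]
print(percent_satisfied(sample))
# e.g. {'Timeliness': 66.7, 'Access': 50.0, 'Outcome': 100.0}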
Although improvement of client satisfaction is the key measure of service improvement and service quality, service standards continue to play an important role in the overall service improvement strategy. For each service standard, developed from knowledge of client expectations, departments are required to measure actual performance against the standard. Overall performance is to be reported in the DPR, while additional information about service standards and performance against them is to be made available through an appropriate link to the departmental Internet site. An example of a service standards table is provided below.
Service Standards Table - Example

Areas | Service Standards | Performance expected | Results
Accessibility | We will provide at least eight hours of service each business day, and we will post our hours in each office. | XX % | XX %
Timeliness | When you leave a message in a voice mailbox, we will return your call within 2 hours. | XX % | XX %
Information | If you send us a request about your personal file, we will respond to it within five business days of receiving it. | XX % | XX %
Language | Our services are available in both official languages. You will be served in the official language of your choice. | XX % | XX %

Detailed service standards and results are available on our Web site.
Service standards, performance results, as well as client survey feedback are to be published and made available to clients on the department's WWW site. The appropriate URL must be referenced in the table.
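As an illustration of how results against a standard like the voice-mail example above could be measured, here is a minimal sketch in Python; it is not part of the guide. The record format, field names, and sample call log are assumptions made for illustration only.

# Illustrative sketch only: measuring results against a timeliness standard such as
# "we will return your call within 2 hours". The call-log format is assumed.

from datetime import datetime, timedelta

def percent_within_standard(records, standard=timedelta(hours=2)):
    """Share of calls returned within the standard, as a percentage."""
    met = sum(1 for received, returned in records if returned - received <= standard)
    return round(100.0 * met / len(records), 1) if records else None

# Hypothetical call log: (time message received, time call returned).
calls = [
    (datetime(2003, 4, 1, 9, 0), datetime(2003, 4, 1, 10, 15)),   # within 2 hours
    (datetime(2003, 4, 1, 9, 30), datetime(2003, 4, 1, 13, 0)),   # missed the standard
    (datetime(2003, 4, 1, 14, 0), datetime(2003, 4, 1, 15, 45)),  # within 2 hours
]
print(percent_within_standard(calls))  # 66.7 -> the "Results" cell for this standard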