Performance Measurement in Canadian Government Informatics
Bryan Shane and Gary Callaghan
A balanced performance measurement system requires that certain principles be followed to define the scope and provide a philosophy of operations.
This is the third and final article in a series on developing and implementing a performance measurement system (PMS) for an informatics function in a public service environment.1 The first article presented a conceptual approach to performance measurement based upon the Balanced Scorecard2 approach, while the second article described the steps to developing and implementing a PMS. This third article describes the realization of a PMS in a Canadian government informatics organization, the Application Management Service (AMS) of Public Works and Government Services Canada (PWGSC). Specifically, this article describes the mission, services and clients of AMS; the rationale for embarking on the performance measurement project; and the development and implementation process. Finally, it provides an assessment of the benefits and risks associated with implementing such an endeavour.
AMS is a sector within the Government Telecommunications and Informatics Services (GTIS) branch of PWGSC. Its vision is to develop information management and information technology (IM/IT) partnerships with the department’s other business lines and with Special Operating Agencies (SOAs) to maximize PWGSC’s value as a Common Services Organization. As one of the department’s eight major business lines, GTIS is recognized by the federal government community as the principal agent for the management of common IM/IT infrastructure and services. GTIS provides a wide range of IM/IT services to PWGSC and other departments of the Canadian government.
AMS has 418 employees and is augmented by a similar number of consultants. It has an annual budget allocation of $42 million and an in-house supported technology inventory valued at approximately $300 million. Its mission is to provide development and support services in respect of administrative processes, and education to government IM/IT professionals. It delivers value by working closely with clients to reduce business risk, while increasing flexibility and performance.
To achieve its mission, AMS provides a number of services to its clients, including:
- Application support – includes operational-support and functional-enhancement services primarily for PWGSC applications.
- Application development – includes major functional enhancements, architectural re-engineering services and IT project management.
- Application environment – includes architecture and environment services, such as standard development tools and techniques and quality management, primarily for AMS.
Other AMS services include business process reengineering, Internet support, data warehousing and software exchange.
Figure 1 illustrates the client base of AMS. The majority of AMS clients (87 percent) work within PWGSC, with the remainder coming from outside the department. Most clients use more than one AMS service, with 71 percent using at least three services. The most widely used services include:
- application support (84 percent),
- application development (75 percent),
- application environment (41 percent),
- Internet (39 percent),
- data warehousing (26 percent), and
- software exchange (24 percent).3
PWGSC and GTIS are committed to the review and renewal of federal programs and services to ensure that they are affordable, accessible and responsive to the needs of Canadians. This new environment created a number of challenges and opportunities in the delivery of these programs and services.
- There is a need to deliver high-quality programs and services across the country in an environment of ongoing fiscal restraint. Staff reductions and cost-saving initiatives have created an environment in which employees often have difficulty managing their workloads, with a resulting negative effect on morale.
- There are opportunities for more effective leadership and simplified management structures that permit long-term, viable alternative approaches to delivering programs and services at reduced costs, based upon new forms of collaboration across branch and department lines.
- There is a need for a means to facilitate the sharing of best practices in the delivery of programs and services.
These challenges and opportunities, which are directly related to the effective delivery of programs and services to the Canadian public, were further compounded by the need for effective performance measurement practices in AMS. Until the Balanced Scorecard approach was implemented, the performance of programs and services was not measured in a holistic way.
At present, the established standard in the federal government for reporting performance is based solely on a financial perspective. This one-dimensional approach fails to provide performance information related to client and employee satisfaction, quality of service or continuous improvement. Many performance measurement systems do not provide the information needed to determine whether objectives and strategies are being effectively implemented. Much performance information is reactive rather than proactive. Moreover, it fails to recognize outstanding group or individual performance, which tends to undermine motivation and morale within the organization. This situation is further compounded by the fact that while approximately 10 percent of all informatics organizations have formal performance measurement programs, only one organization in six that starts such a program achieves a successful implementation.
Development and implementation approach
To address these problems, AMS undertook a holistic approach to the development and implementation of a PMS. The objective was to develop and implement a balanced and systematic PMS to assess the effectiveness of AMS’s operations from various points of view:
- quality of programs and services
- client satisfaction
- employee satisfaction
- continuous improvement.
The PMS was designed to provide feedback at all three levels: strategic, tactical and operational. It was also designed to evaluate the effectiveness of strategies and plans, to improve decision making, to enable proactive problem correction and to promote continuous improvement within the organization.
Developing and implementing a balanced PMS requires that a number of principles be followed to define the scope of the project and provide a philosophy of operations. AMS was no different. These principles indicate:
- Performance measurement requires time, effort, skills, expertise and, perhaps most importantly, the active support of senior management.
- Performance measurement must be organized around the department’s planning and budgeting cycles.
- Performance measurement business processes for finances, quality of service, employee satisfaction, client satisfaction and continuous improvement must be created or adapted to collect and analyze the data.
- An Office of Performance Measurement (OPM), in this case the AMS Program Office, must be established and held responsible for the planning, implementation and ongoing operation of the PMS. The OPM must report directly to, and be supported by, senior management.
- The PMS should be developed and implemented in phases.
- Simple and existing information sources should be used. There is no need to pursue new, expensive and/or time-consuming approaches to data generation.
- The PMS must be designed so that multiple lines of evidence are generated for each perspective at different organizational levels and at different times. The information generated from one perspective must be confirmed by another so that there is a convergence of information to support the diagnosis of, and action on, any issue.
Steps in developing and implementing a performance measurement system
As noted above, the PMS for AMS was implemented in phases. Phase 1 was a pilot project conducted in three core divisions, two offering development services and one providing support services. In Phase 2, the project was extended to include all high-profile projects, those directly involving clients and some internal to AMS. This expansion covered most directorates within the sector. Phase 3, now underway, will extend the project to the two remaining directorates. The steps used in developing and implementing each phase of the PMS are briefly described below.
Step 1: Project orientation
The first step was to provide an orientation to management and staff on the balanced approach to performance measurement. This orientation covered the roles and responsibilities of participants, the stages of development and implementation, the time frames or milestones, the key principles of performance measurement, and anticipated and unforeseen issues.
Step 2: Readiness assessment
The second step was to assess the readiness of the AMS directorates and/or divisions to accept the balanced approach to performance measurement. This assessment would demonstrate how the organization’s mandate and functions supported or reinforced the business plans of AMS, GTIS and the department. Those responsible for developing the PMS were looking for evidence of management and staff commitment, as well as an assurance that the necessary resources to carry out the project would be made available. Following a positive analysis of the organization’s readiness, it was necessary to select core functional areas in which to begin a pilot.
Step 3: Performance Measurement Architecture (PMA)
The third step was the development of the Performance Measurement Architecture (PMA) to provide the design, content and structure of the PMS. In developing the PMA, business objectives were developed first, then informatics objectives were designed to support the business directions. Performance measures and performance indicators were then derived for the informatics function. These measures and associated indicators are objective and output oriented; whether qualitative or quantitative, all are quantifiable in nature (see “Information quality” below for details).
The PMA was an evolving document, changing as measures were implemented and feedback was received. Once the PMA was approved, it became the sanctioned PMS architecture for AMS (see Table 1) and it was important to place it under change control, i.e., proposed changes were scrutinized and approvals were required.
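To make the chain from business objectives down to indicators concrete, a PMA of this kind can be sketched as a nested data structure. The objectives, measures and indicators below are hypothetical illustrations only; the article does not publish the actual AMS entries.

```python
# Hypothetical fragment of a Performance Measurement Architecture (PMA).
# Business objective -> supporting informatics objective -> measures ->
# concrete indicators. All entries are illustrative assumptions.
pma = {
    "business_objective": "Deliver affordable, responsive common services",
    "informatics_objectives": [
        {
            "objective": "Deliver applications on time and within budget",
            "measures": [
                {
                    "measure": "Project management effectiveness",
                    "indicators": [
                        "percentage of projects on schedule",
                        "percentage variance from approved budget",
                    ],
                },
            ],
        },
    ],
}

# A simple consistency check in the spirit of change control: every
# measure must carry at least one concrete indicator, so that nothing
# in the architecture is left unmeasurable.
for objective in pma["informatics_objectives"]:
    for measure in objective["measures"]:
        assert measure["indicators"], "each measure needs an indicator"
```

Holding the architecture in one sanctioned structure like this makes it straightforward to review proposed changes before they are approved.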
Step 4: Implementation
As noted above, Phase 1 began on a pilot basis in three core areas. This approach was used to demonstrate the value of performance measurement and to build the competencies of the performance measurement team. After a successful three-month pilot, Phase 2 was started. The second phase lasted six months and concentrated on high-priority informatics projects involving AMS. Phase 3, currently underway, is addressing the remaining parts of the AMS organization.
The activities required to implement the AMS Performance Measurement System included:
- Performance Measurement Profile (PMP) – A PMP was developed from the PMA to assess the degree to which existing operational processes needed to be developed or adapted in order to implement the PMS. The profile also identified areas where information and processes concerning the performance measures and related performance indicators existed, and where information and processes needed to be adapted or created (see Appendix). Further, it identified the availability of baseline and benchmark information.
- Implementation Strategy – An implementation strategy was developed for each of the performance measurement perspectives:
- quality of service
- client satisfaction
- employee satisfaction
- continuous improvement.
These strategies provide the direction to guide the implementation effort by defining the roles and responsibilities of participants, the frequency of reporting and the data collection methods. They also outline the methods for analyzing, reporting and interpreting the information, as well as for identifying any issues that could impede progress for each measurement perspective.
- Business Processes – Various day-to-day, month-to-month business processes were developed or adapted to support data gathering for each of the performance measurement perspectives.
- Information Capture – Once business processes were adapted or created for each measurement perspective, the information generated from these processes was captured according to established reporting formats for each of the performance measurement perspectives (e.g., monthly status reports, procurement activity reports and action logs).
- Interpretation – This performance measurement information was interpreted to ensure internal validity and to provide a comparative external view. The interpretation of the information was accomplished in several ways: rating systems, baseline information, benchmarks, experience within the organization and the use of multiple lines of evidence.
- Reporting – The monthly performance information relates to financial performance and the quality of services. A full performance measurement report, the Dashboard, is published quarterly. Both monthly and quarterly reports discuss the organizational mission from various points of view. Cumulatively, this information provides a causal link between the goals of the mission and any strategic, tactical and operational issues interfering with their attainment. When issues are raised, a briefing note is prepared and presented to management in advance for discussion at the monthly review (see Appendix).
- Communication – Communicating results to senior management and staff is imperative so that changes can be made to keep the organization on course towards its mission, to recognize and reward the efforts of individuals and teams, and to encourage continued positive results.
The benefits of the AMS performance measurement project must be assessed from several points of view. A discussion of the salient perspectives appears below.
Knowledge transfer
There was a considerable transfer of knowledge from the subject matter experts to AMS staff. Some key areas of knowledge transfer included:
- the methodology for developing performance measurement architectures;
- the means to establish baseline information against which to compare future performance;
- methods for collecting, analyzing and interpreting information;
- ways of developing or adapting business processes to generate performance information related to all measurement perspectives; and
- the use of facilitation techniques and “Straw Dog Models” to increase individual and organizational support for the PMS.
As a result, AMS is relatively self-sufficient in knowledge and is self-sustaining in its ability to operate and update the PMS.
AMS was successful in developing an organizational culture that values and supports balanced and comprehensive feedback as an essential element in examining issues and providing the information necessary for effective decision making. By improving communication, a common language and understanding developed in AMS concerning priorities, constraints and opportunities, as well as problems related to finances, quality of programs and services, client and employee satisfaction and continuous improvement. The PMS fostered a higher level of motivation and morale among staff through a greater appreciation of the issues and a greater involvement in formulating strategies and plans to resolve them. It is now viewed as a necessary and valued component of the management regime in AMS and has brought discipline to many of the sector’s business processes.
By providing AMS managers with multi-dimensional sources of information, the PMS provides a framework for decision making on cost effectiveness; improving, modifying or continuing programs or services; and improving client and employee satisfaction. It also provides information on program and service enhancements arising from changing client, employee or technology needs or innovations. Moreover, it provides a means of translating the AMS mission into concrete objectives, strategies and plans, which can be monitored and adjusted in response to ongoing environmental changes. Lastly, it provides a frame of reference for management and staff, working together, to sustain and enhance excellence in program and service delivery to their clients.
Information quality
The use of indices helps manage the complexity of the information generated about any of the measurement perspectives in the PMS. The PMA for AMS includes four indices:
- Project Management
- Client Satisfaction
- Employee Satisfaction
- Continuous Improvement.
The use of these indices allows qualitative and quantitative information to be combined, yet provides the ability to quantify both so that all measures become output oriented. For example, the Project Management index provides an assessment of qualitative and quantitative measures of high-priority projects used to deliver services to clients. Quantitative measures used include whether the project is on time, within budget, within scope, and whether it meets all functional and technical quality requirements. Qualitative measures include the effective use of estimation, risk management, methodologies and tools, the quality of user involvement, and the effective use of staff and consultants. These dimensions of project management were expanded or reduced in response to changing conditions or experience. By using an index, these diverse measures were reduced to a common overall score representing all dimensions of project management. The use of indices provides AMS with a powerful and flexible tool for generating comprehensive information on the areas mentioned.
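As a sketch of how such an index might roll diverse measures into one score: the weights, rating scale and dimension names below are assumptions for illustration, not the actual AMS formula, which the article does not specify.

```python
# Illustrative sketch only: the weights, dimensions and 1-5 rating scale
# are assumptions, not the actual AMS Project Management index definition.

def project_management_index(quantitative, qualitative, quant_weight=0.5):
    """Combine binary quantitative checks and 1-5 qualitative ratings
    into a single 0-100 index score."""
    # Quantitative checks: True/False flags such as on time, within budget.
    quant_score = sum(quantitative.values()) / len(quantitative) * 100
    # Qualitative ratings: 1 (poor) to 5 (excellent), rescaled to 0-100.
    qual_score = sum((r - 1) / 4 * 100 for r in qualitative.values()) / len(qualitative)
    # Weighted blend of the two kinds of evidence.
    return quant_weight * quant_score + (1 - quant_weight) * qual_score

score = project_management_index(
    quantitative={"on_time": True, "within_budget": True,
                  "within_scope": False, "meets_quality": True},
    qualitative={"risk_management": 4, "user_involvement": 5,
                 "use_of_staff": 3},
)
```

Because every dimension is rescaled to a common range before weighting, dimensions can be added or removed as conditions change without redefining the meaning of the overall score.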
Interpretation of the information
Several techniques are used to increase the rigour with which the information provided by the AMS Performance Measurement System is interpreted. Rating systems, baseline information, benchmarks, organizational knowledge and multiple lines of evidence ensure that the issues revealed by the PMS are interpreted with internal validity and a comparative external perspective.
Identification of best practices
The use of the PMS has also resulted in the identification of a number of best practices at both the project and sector level. By sharing these best practices, AMS has given employees the knowledge and skills necessary to deliver programs and services in a constantly changing IM/IT environment.
There were a number of risks inherent in the development and implementation of a PMS in AMS.
Performance measurement requires the active support of management in communicating the rationale and benefits of the system, to help break down individual and organizational resistance to what may be perceived as a threatening project. Without the active and ongoing support of management in promoting the PMS and using the information to improve the functioning of the organization, the exercise would have been viewed as another form of unnecessary overhead. Even though there is strong support by the AMS executive for ongoing performance measurement, there is always the risk that insufficient time will be devoted to discussing and reviewing information at monthly and quarterly presentations.
Developing and implementing a successful PMS requires time, effort and money. It took just over a year to build the processes and framework for AMS. For most of that time, the assistance of consulting expertise was needed until the process became self-sustaining. In particular, the process required the support of an already overburdened staff.
The PMS was built at the sector level but fits within the planning and reporting framework (Performance Reporting and Accountability Structure) for GTIS and PWGSC. Many of the financial, human resource and other information systems did not produce the type of information required at the proper time or to the required level of quality. There was also a reliance on staff members outside the sector, operating in a “virtual organization,” to produce certain types of performance information. Together, these elements tended to restrict the ability of those implementing the PMS to produce reports with the frequency and quality desired to meet departmental business and financial cycles.
By publishing monthly and quarterly performance measurement reports, AMS provides a complete picture of the financial, quality-of-service, client/employee-satisfaction and continuous-improvement issues that it has dealt with during that period. By providing greater exposure to the organization’s strengths and weaknesses, AMS is identifying the issues and actions required to resolve them. While such exposure is beneficial to a proactive organization such as AMS, it could be detrimental when no action is taken, or when issues can only be resolved at the branch or departmental level.
The AMS performance measurement system is now well launched and is developing at an acceptable pace. It continues to provide insights into issues needed to steer AMS towards its mission. It also offers lessons that can assist other informatics organizations in the smooth development and implementation of a balanced PMS, an approach that is highly portable. To date, the insights and lessons learned indicate:
- The development of a balanced PMS must be tailored to fit the unique requirements of each organization in terms of specific measures, timing, sequence of activities and knowledge transfer.
- The use of pilots is necessary to provide evidence of the utility of performance measurement, to build acceptance and support, and to provide the experience needed to mold the PMS to the needs of the organization and its sub-organizations.
- The implementation of a balanced PMS requires the full support of management in terms of providing the leadership to communicate the necessity and importance of the approach and to supply adequate resources.
- The development of a PMA must begin at the top of the organization and be implemented downwards.
- The development and implementation of a PMS must avoid expensive data gathering and implementation approaches. Use a “Just Do It” mentality with existing sources of information, “work around” strategies, and available methods for interpreting and maintaining progress. Once established, the performance measurement process tends to be self-correcting and self-sustaining.
- Most importantly, the performance information, along with the associated analysis, must be used to take corrective actions in the form of strategies and plans to deal with issues interfering with the achievement of an organization’s mission.
1. The first article in this series, “Improved performance measurement: a prerequisite for better service delivery,” appeared in Optimum, Vol. 27, No. 4, pp. 1-5. The second article, “Implementing a performance measurement system in a public service informatics function,” appeared in Optimum, Vol. 28, No. 3, pp. 36-44.
2. R.S. Kaplan and D.P. Norton, The Balanced Scorecard: Translating Strategy into Action (Boston: Harvard Business School Press, 1996).
3. These numbers represent the total where multiple selections were available and are not intended to be a cumulative total.
Bryan Shane is a senior partner with BPC Management Consultants. For more than 20 years, he has provided consulting services in the areas of strategic management, information technology and performance consulting to a wide variety of public and private sector organizations. Mr. Shane has a BA in Political Science from Carleton University and a BEd and MEd from the University of Ottawa. He has also completed graduate studies in statistics and evaluation.
Gary Callaghan is the Manager of the Application Management Service (AMS) Program Office for the Government Telecommunications and Informatics Services Branch of Public Works and Government Services Canada. He has over 20 years of experience in the information management and information technology field, all with the Canadian federal government. Mr. Callaghan has been a project manager for the past seven years and has spent the last two years developing and implementing the AMS performance measurement framework.