A balanced performance measurement system requires that certain principles be followed to define its scope and provide a philosophy of operations.
This is the third and final article in a series on developing and implementing a performance measurement system (PMS) for an informatics function in a public service environment.[1] The first article presented a conceptual approach to performance measurement based upon the Balanced Scorecard[2] approach, while the second article described the steps to developing and implementing a PMS. This third article describes the realization of a PMS in a Canadian government informatics organization, the Application Management Service (AMS) of Public Works and Government Services Canada (PWGSC). Specifically, this article describes the mission, services and clients of AMS; the rationale for embarking on the performance measurement project; and the development and implementation process. Finally, it provides an assessment of the benefits and risks associated with implementing such an endeavour.
AMS is a sector within the Government Telecommunications and Informatics Services (GTIS) branch of PWGSC. Its vision is to develop information management and information technology (IM/IT) partnerships with the department’s other business lines and with Special Operating Agencies (SOAs) to maximize PWGSC’s value as a Common Services Organization. As one of the department’s eight major business lines, GTIS is recognized by the federal government community as the principal agent for the management of common IM/IT infrastructure and services. GTIS provides a wide range of IM/IT services to PWGSC and other departments of the Canadian government.
AMS has 418 employees and is augmented by a similar number of consultants. It has an annual budget allocation of $42 million and an in-house supported technology inventory valued at approximately $300 million. Its mission is to provide development and support services in respect of administrative processes and education to government IM/IT professionals. It delivers value by working closely with clients to reduce business risk, while increasing flexibility and performance.
To achieve its mission, AMS provides a number of services to its clients, including:
Other AMS services include business process reengineering, Internet support, data warehousing and software exchange.
Figure 1 illustrates the client base of AMS. The majority of AMS clients (87 percent) work within PWGSC, with the remainder coming from outside the department. Most clients use more than one AMS service, with 71 percent using at least three services.[3] The most widely used services include:
PWGSC and GTIS are committed to the review and renewal of federal programs and services to ensure that they are affordable, accessible and responsive to the needs of Canadians. This new environment created a number of challenges and opportunities in the delivery of these programs and services.
These challenges and opportunities, which are directly related to the effective delivery of programs and services to the Canadian public, were further compounded by the need for effective performance measurement practices in AMS. Until the Balanced Scorecard approach was implemented, the performance of programs and services was not measured in a holistic way.
At present, the established standard in the federal government for reporting performance is based solely on a financial perspective. This one-dimensional approach fails to provide performance information related to client and employee satisfaction, quality of service or continuous improvement. Many performance measurement systems do not provide the information needed to determine whether objectives and strategies are being effectively implemented. Much performance information is reactive rather than proactive. Moreover, it fails to recognize outstanding group or individual performance, which tends to undermine motivation and morale within the organization. This situation is further compounded by the fact that while approximately 10 percent of all informatics organizations have formal performance measurement programs, only one organization in six that starts such a program achieves a successful implementation.
Development and implementation approach
To address these problems, AMS undertook a holistic approach to the development and implementation of a PMS. The objective was to develop and implement a balanced and systematic PMS to assess the effectiveness of AMS’s operations from various points of view:
The PMS was designed to provide feedback at all three levels: strategic, tactical and operational. It was also designed to evaluate the effectiveness of strategies and plans, to improve decision making, enable proactive problem correction and promote continuous improvement within the organization.
Developing and implementing a balanced PMS requires that a number of principles be followed to define the scope of the project and provide a philosophy of operations. AMS was no different. These principles indicate:
Steps in developing and implementing a performance measurement system
As noted above, the PMS for AMS was implemented in phases. Phase 1 was a pilot project used in three core divisions, two offering development services and one providing support services. In Phase 2, the project was extended to include all high-profile projects, those directly involving clients and some internal to AMS. This expansion covered most directorates within the sector. Phase 3, now underway, will extend the project to the two remaining directorates. The steps used in developing and implementing each phase of the PMS are briefly described below.
Step 1: Project orientation
The first step was to provide an orientation to management and staff on the balanced approach to performance measurement. This orientation covered the roles and responsibilities of participants, the stages of development and implementation, the time frames or milestones, the key principles of performance measurement, and anticipated and potential issues.
Step 2: Readiness assessment
The second step was to assess the readiness of the AMS directorates and/or divisions to accept the balanced approach to performance measurement. This assessment would demonstrate how the organization’s mandate and functions supported or reinforced the business plans of AMS, GTIS and the department. Those responsible for developing the PMS were looking for evidence of management and staff commitment, as well as an assurance that the necessary resources to carry out the project would be made available. Following a positive analysis of the organization’s readiness, it was necessary to select core functional areas in which to begin a pilot.
Step 3: Performance Measurement Architecture (PMA)
The third step was the development of the Performance Measurement Architecture (PMA), which provides the design, content and structure of the PMS. In developing the PMA, business objectives were developed first, then informatics objectives were designed to support the business directions. Performance measures and performance indicators were then derived for the informatics function. These measures and associated indicators are objective and output oriented; they encompass both qualitative and quantitative dimensions, yet all are quantifiable in nature (see “Information quality” below for details).
The PMA was an evolving document, changing as measures were implemented and feedback was received. Once the PMA was approved, it became the sanctioned PMS architecture for AMS (see Table 1), and it was important to place it under change control, i.e., proposed changes were scrutinized and approvals were required.
Step 4: Implementation
As noted above, Phase 1 began on a pilot basis in three core areas. This approach was used to demonstrate the value of performance measurement and to build the competencies of the performance measurement team. After a successful three-month pilot, Phase 2 was started. The second phase lasted six months and concentrated on high-priority informatics projects involving AMS. Phase 3, currently underway, is addressing the remaining parts of the AMS organization.
The activities required to implement the AMS Performance Measurement System included:
The benefits of the AMS performance measurement project must be assessed from several points of view. A discussion of the salient perspectives appears below.
Knowledge transfer
There was a considerable transfer of knowledge from the subject matter experts to AMS staff. Some key areas of knowledge transfer included:
As a result, AMS is relatively self-sufficient in knowledge and is self-sustaining in its ability to operate and update the PMS.
AMS was successful in developing an organizational culture that values and supports balanced and comprehensive feedback as an essential element in examining issues and providing the information necessary for effective decision making. By improving communication, a common language and understanding developed in AMS concerning priorities, constraints and opportunities, as well as problems related to finances, quality of programs and services, client and employee satisfaction and continuous improvement. The PMS fostered a higher level of motivation and morale among staff through a greater appreciation of the issues and a greater involvement in formulating strategies and plans to resolve them. It is now viewed as a necessary and valued component of the management regime in AMS and has brought discipline to many of the sector’s business processes.
By providing AMS managers with multi-dimensional sources of information, the PMS provides a framework for decision making on cost effectiveness; improving, modifying or continuing programs or services; and improving client and employee satisfaction. It also provides information on program and service enhancements arising from changing client, employee or technology needs or innovations. Moreover, it provides a means of translating the AMS mission into concrete objectives, strategies and plans, which can be monitored and adjusted in response to ongoing environmental changes. Lastly, it provides a frame of reference for management and staff, working together, to sustain and enhance excellence in program and service delivery to their clients.
Information quality
The use of indices reduces the complexity of the information generated about any of the measurement perspectives in the PMS. The PMA for AMS includes four indices:
The use of these indices allows qualitative and quantitative information to be combined, yet provides the ability to quantify both so that all measures become output oriented. For example, the Project Management index provides an assessment of qualitative and quantitative measures of the high-priority projects used to deliver services to clients. Quantitative measures include whether the project is on time, within budget and within scope, and whether it meets all functional and technical quality requirements. Qualitative measures include the effective use of estimation, risk management, methodologies and tools, the quality of user involvement, and the effective use of staff and consultants. These dimensions of project management were expanded or reduced in response to changing conditions or experience. By using an index, these diverse measures were reduced to a common overall score representing all dimensions of project management. The use of indices provides AMS with a powerful and flexible tool for comprehensive information on the areas mentioned.
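The mechanics of such an index can be sketched as a weighted average of per-dimension scores. The following is a minimal illustration only: the article does not disclose the actual AMS scoring scheme, so the dimensions, weights and 0-10 rating scale below are assumptions chosen for the example.

```python
# Hypothetical sketch of a composite project index. The dimension names,
# weights and 0-10 rating scale are illustrative assumptions, not the
# actual AMS Project Management index formula.

def project_index(ratings, weights):
    """Combine per-dimension ratings (0-10) into one weighted overall score."""
    total_weight = sum(weights[d] for d in ratings)
    return sum(ratings[d] * weights[d] for d in ratings) / total_weight

# Quantitative dimensions are scored from hard project data (e.g. schedule
# variance); qualitative dimensions are scored by assessor judgement on the
# same 0-10 scale, so both kinds of measure contribute to one number.
weights = {
    "on_time": 2.0, "within_budget": 2.0,          # quantitative
    "within_scope": 1.5, "quality_requirements": 1.5,
    "risk_management": 1.0, "user_involvement": 1.0,  # qualitative
}
ratings = {
    "on_time": 8, "within_budget": 9, "within_scope": 7,
    "quality_requirements": 8, "risk_management": 6, "user_involvement": 7,
}

score = project_index(ratings, weights)
print(round(score, 2))  # a single overall score, e.g. 7.72 for these inputs
```

Because every dimension is reduced to the same scale before weighting, adding or dropping a dimension (as AMS did in response to experience) only requires changing the two dictionaries, not the scoring logic.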
Interpretation of the information
Several techniques are used to increase the rigor with which the information provided by the AMS Performance Measurement System is interpreted. Rating systems, baseline information, benchmarks, and organizational knowledge and information from multiple lines of evidence are used so that the issues revealed by the PMS are interpreted with internal validity and a comparative external perspective.
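The idea of combining an internal baseline with an external benchmark can be shown in a small sketch. This is an illustration only, not the AMS system: the thresholds, labels and example values are assumptions.

```python
# Illustrative sketch (not the actual AMS interpretation logic): classify a
# "higher is better" measure against an internal baseline (last period's
# value) and an external benchmark (comparative reference point).

def interpret(value, baseline, benchmark):
    """Return a short reading of the measure against both reference points."""
    notes = []
    notes.append("above baseline" if value >= baseline else "below baseline")
    notes.append("meets benchmark" if value >= benchmark else "below benchmark")
    return ", ".join(notes)

# Hypothetical example: a satisfaction score of 7.9 against last year's
# internal baseline of 7.5 and an external benchmark of 8.0.
reading = interpret(7.9, baseline=7.5, benchmark=8.0)
print(reading)  # "above baseline, below benchmark"
```

Reading a measure against both reference points at once is what gives the interpretation internal validity (improvement over the organization's own past) and an external comparative perspective (standing relative to peers).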
Identification of best practices
The use of the PMS has also resulted in the identification of a number of best practices at both the project and sector level. By sharing these best practices, AMS has given employees the knowledge and skills necessary to deliver programs and services in a constantly changing IM/IT environment.
There were a number of risks inherent in the development and implementation of a PMS in AMS.
Performance measurement requires the active support of management in communicating the rationale and benefits of the system, to help break down individual and organizational resistance to what may be perceived as a threatening project. Without the active and ongoing support of management in promoting the PMS and using the information to improve the functioning of the organization, the exercise would have been viewed as another form of unnecessary overhead. Even though there is strong support by the AMS executive for ongoing performance measurement, there is always the risk that insufficient time will be devoted to discussing and reviewing information at monthly and quarterly presentations.
Developing and implementing a successful PMS requires time, effort and money. It took just over a year to build the processes and framework for AMS. For most of that time, the assistance of consulting expertise was needed until the process became self-sustaining. In particular, the process required the support of an already overburdened staff.
The PMS was built at the sector level but fits within the planning and reporting framework (Performance Reporting and Accountability Structure) for GTIS and PWGSC. Many of the financial, human resource and other information systems did not produce the type of information required at the proper time or to the required level of quality. There was also a reliance on staff members outside the sector, operating in a “virtual organization,” to produce certain types of performance information. Together, these elements tended to restrict the ability of those implementing the PMS to produce reports with the frequency and quality desired to meet departmental business and financial cycles.
By publishing monthly and quarterly performance measurement reports, AMS provides a complete picture of the financial, quality-of-service, client/employee-satisfaction and continuous-improvement issues that it has dealt with during that period. By providing greater exposure to the organization’s strengths and weaknesses, AMS is identifying the issues and actions required to resolve them. While such exposure is beneficial to a proactive organization such as AMS, it could be detrimental when no action is taken, or when issues can only be resolved at the branch or departmental level.
The AMS performance measurement system is now well launched and is developing at an acceptable pace. It continues to provide the insights needed to steer AMS towards its mission. It also provides insights that can assist other organizations in the smooth development and implementation of a balanced PMS, one that is portable and highly applicable to other informatics organizations. To date, the insights and lessons learned indicate:
1. The first article in this series, “Improved performance measurement: a prerequisite for better service delivery,” appeared in Optimum, Vol. 27, No. 4, pp. 1-5. The second article, “Implementing a performance measurement system in a public service informatics function,” appeared in Optimum, Vol. 28, No. 3, pp. 36-44.
2. R.S. Kaplan and D.P. Norton, The Balanced Scorecard: Translating Strategy into Action (Boston: Harvard Business School Press, 1996).
3. These numbers represent the total where multiple selections were available and are not intended to be a cumulative total.
Bryan Shane is a senior partner with BPC Management Consultants. For more than 20 years, he has provided consulting services in the areas of strategic management, information technology and performance consulting to a wide variety of public and private sector organizations. Mr. Shane has a BA in Political Science from Carleton University and a BEd and MEd from the University of Ottawa. He has also completed graduate studies in statistics and evaluation.
Gary Callaghan is the Manager of the Application Management Service (AMS) Program Office for the Government Telecommunications and Informatics Services Branch of Public Works and Government Services Canada. He has over 20 years of experience in the information management and information technology field, all with the Canadian federal government. Mr. Callaghan has been a project manager for the past seven years and has spent the last two years developing and implementing the AMS performance measurement framework.