Implementing a Performance Measurement System in a Public Service Informatics Function
The development and implementation of a balanced PMS must begin at the top.
Federal government departments are attempting to provide programs and services to their clients that are affordable, accessible and responsive to their needs in a time of fiscal restraint and ongoing change. A major strategy to accomplish this has been the use of information management and information technology (IM/IT) to streamline operations, improve levels of service and provide better information for decision making. This has been, however, fraught with problems. Projects have seldom been delivered on time, to budget and to specification. To maximize the contribution of IM/IT in developing and supporting outstanding government programs and services, there is a need to improve the decision-making process. This can be brought about through an effective performance measurement system (PMS). This article focuses on the stages involved in the development and implementation of a balanced approach to performance measurement in an informatics function of a public sector organization.
Performance measurement defined
A balanced approach to performance measurement can be defined from three points of view:
- It is a philosophy of continuous learning in which feedback is used to make ongoing adjustments to the course of an organization in the pursuit of its vision.
- It is a continuous and ongoing process that begins with the setting of objectives and the development of strategies/plans to achieve those objectives in support of the vision.
- It is a structure in which strategic, tactical and operational actions are linked via a feedback process to provide the information required to intervene and/or improve the program or service on a systematic basis.
Balanced performance measurement is a management system – an ongoing best practice that provides the means to assess the effectiveness of an organization’s operations from four perspectives: financial, client satisfaction, quality of service and innovation/learning. It is used to provide feedback at all levels – strategic, tactical or operational – on how well strategies and plans are being met. This performance feedback provides the information necessary to improve decision making within the organization, to enable proactive problem correction and to promote continuous improvement.
Implementation of a balanced PMS in an informatics organization requires that a number of principles be followed to define the scope and provide a philosophy of operations. These principles include
- understanding that the implementation and integration of a PMS into the organization’s culture requires time, effort, skills/expertise and, perhaps most importantly, the active support of senior management
- recognizing that performance measurement indicators must be organized around the department’s planning and budgeting cycles
- creating and/or adapting the performance measurement processes of financial planning and reporting, quality of service, status reporting, client satisfaction, employee satisfaction and continuous improvement to the department’s operational processes
- establishing a central office of performance measurement (OPM) that is responsible for the planning, implementation and ongoing operation of the PMS. (The OPM must report directly to and be supported by senior management.)
- using a phased implementation process to develop and implement the PMS
- adopting a just-do-it mentality
- designing the PMS so that multiple lines of evidence are generated for each perspective at different organizational levels at different times. (The information generated from one perspective must be confirmed by another perspective so that there is a convergence of information to support the diagnosis of and action on a particular issue.)
Acceptance of a PMS involves a gradual process of change in organizational culture. The focus is on identifying and dealing with the issues necessary for achieving the organizational mission and on linking strategic plans with operational decision making. Over time, this approach allows for the development of an organizational culture that values and supports balanced and comprehensive feedback as an essential element in both examining the issues and providing the information necessary for effective decision making.
Roles and responsibilities
There are a number of roles and responsibilities associated with the development and implementation of a PMS.

Office of performance measurement

The OPM holds overall responsibility for the development and implementation of a PMS and must report directly to the Chief Information Officer.
Specific responsibilities of the OPM include
- building the performance measurement architectures (PMAs)
- designing the strategies required to implement performance measurement for each of the four measurement perspectives included in the balanced approach to performance measurement
- implementing the PMS on a pilot basis and, if it is successful, extending it across the organization
- establishing and integrating performance measurement processes into the operations of the organization for finance, quality of programs and services, client satisfaction, employee satisfaction and continuous improvement
- providing a program to transfer knowledge, skills and abilities from the subject area experts to the informatics programs and service areas
- marketing the PMS to promote and improve understanding, acceptance and support for it across the informatics organization
- providing performance measurement information to informatics programs/services senior management
- identifying the issues and suggesting actions to senior management to resolve problems.
IM/IT program and services area
Implementation cannot rest solely on the efforts of the OPM. Cooperation and action are required from the IM/IT organization itself. Specific responsibilities include:
- participating in the development process of the PMS by providing feedback, attending meetings, modifying processes, being open to change, etc.
- providing performance measurement information related to the quality of informatics programs and services
- providing the resources necessary to participate effectively in the development, implementation and ongoing maintenance of the PMS and to facilitate knowledge transfer
- using the information provided to improve decision making and to develop strategies and plans for dealing with issues revealed by the PMS.
Steps in developing and implementing a PMS

This section briefly discusses the steps involved in the development and implementation of a PMS for a public sector informatics organization.

Step 1: Project orientation

Provide an orientation to management and staff on the balanced approach to performance measurement. This orientation is designed to alleviate fears associated with performance measurement by describing what it is and is not; to explain the roles and responsibilities required of participants, the stages involved in development and implementation and the time frames and principles involved; and to address anticipated problems. This step is necessary for improving understanding and building support for the project among the uninitiated members of the informatics organization.
Step 2: Readiness assessment
Assess the readiness of the informatics organization to accept the balanced approach. Several criteria may be used to complete this assessment:
- a business plan, mandate, vision and organizational philosophy
- a strategic IM/IT plan that reinforces and supports the business plan
- an organizational culture, relatively open to change, that values feedback as essential to providing the information necessary for decision making
- collection, examination and reporting of corporate sources of information, especially finance, quality of programs and services, client satisfaction, employee satisfaction and innovation
- the skills and abilities of staff to implement a PMS and/or willingness to accept knowledge transfer
- a commitment to resource the project appropriately for one to two years – the time it will take to inculcate a PMS into the organization.

This assessment must provide a clear picture of an informatics organization with a mandate/vision of how the functions support and reinforce the business plans of the department. Further, there must be evidence to demonstrate the commitment of management and staff and the resources to carry out the project.

Based on a positive analysis of organizational readiness to develop and implement a PMS, it is necessary to select a core functional area to act as a pilot.
Step 3: Performance measurement architecture
Develop the PMA, which is the design, content and structure of the PMS prior to its implementation, based on the departmental and informatics organization’s mission, vision and philosophy. This can only be done from the top down – business objectives must be developed first; then the informatics objectives, plans and strategies can be designed to support the business directions. Performance measures/indicators (PM/PIs) can then be derived for the informatics function. The PM/PIs are objective and output oriented; most are quantifiable, although some are qualitative in nature.
The development and validation of interpretive models, known as the “straw dog” approach, should be used. This method is time effective and ensures that the involvement and participation of senior management and staff are focused and time efficient.
- The development of a straw dog business model for the informatics organization specifies the organizational mission/vision, functions, related activities and critical success factors. Once this vehicle is constructed, it must be validated with staff members and revised according to their comments. This business model provides the information and insight from which the PMA is developed.
- The construction of a straw dog PMA, based on the information gained from the business model, has the following elements:
  - the organizational mission/vision
  - objectives for each of the four perspectives: financial, quality of programs and services, client satisfaction, and innovation/learning (the last divided into employee satisfaction and continuous improvement)
  - PM/PIs developed from the objectives for each perspective.
The PMA provides information on past performance using traditional financial measures. It also provides information on current operations by focusing on programs and services and the level of client satisfaction. Further, it provides input on future requirements that may arise from changing technology, client needs or staff needs. The PMA provides feedback across all the dimensions of an informatics organization needed for effective decision making. It is a causal link between the organizational vision/objectives and the information needed to deal with the multifaceted strategic and operational issues that hinder progress toward its attainment.
Once the PMA is developed and approved, it becomes the sanctioned PMS architecture for the informatics organization. It is a living document that evolves and changes as measures are implemented and feedback on their effectiveness is received. It must be placed under change control.
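To make the top-down derivation concrete, an approved PMA can be pictured as a simple hierarchy: mission/vision at the top, objectives for each perspective beneath it, and PM/PIs derived from those objectives. The sketch below is a hypothetical illustration only – the perspective names follow the article, but the sample mission, objectives and indicators are invented for the example, not taken from any actual PMA.

```python
from dataclasses import dataclass, field

@dataclass
class Perspective:
    name: str
    objectives: list   # objectives derived from the mission/vision
    indicators: list   # PM/PIs derived from the objectives

@dataclass
class PMA:
    mission: str
    perspectives: list = field(default_factory=list)

# Sample content is hypothetical, not drawn from an actual PMA.
pma = PMA(
    mission="Deliver affordable, accessible, responsive IM/IT services",
    perspectives=[
        Perspective(
            "Financial",
            objectives=["Operate within allocated budget"],
            indicators=["Expenditure vs. reference level", "Funding ratio"],
        ),
        Perspective(
            "Client satisfaction",
            objectives=["Sustain high satisfaction with projects/services"],
            indicators=["Client satisfaction index", "Client retention rate"],
        ),
    ],
)

# Top-down check: every perspective carries objectives, and every set of
# indicators traces back to non-empty objectives.
for p in pma.perspectives:
    assert p.objectives and p.indicators
```

Under change control, each revision of such a structure would be versioned and re-approved before the measures it defines are altered.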
Table 1 provides a generic PMA of an actual IM/IT organization.
Step 4: Implementation

Implementation of a PMS requires a phased approach. It should begin on a pilot basis within one or two core informatics business lines to demonstrate the value of performance measurement and to build the competencies of the performance measurement team. After a successful pilot of 3 to 6 months, the second phase can be initiated. This phase concentrates only on high-priority informatics areas and/or projects. The third and final phase addresses the remaining parts of the informatics organization.
The activities required to implement a PMS are described below.
Development of a performance measurement profile
The measurement profile for each PM/PI provides an assessment of the degree to which existing operational processes must be developed or adapted in order to implement the PMS. It indicates areas where information and processes related to the PM/PI exist and where they must be adapted and/or created. It provides the basis for scheduling the implementation of different measures, together with an estimated degree of effort.
Development of an implementation strategy for each measurement perspective
The development of an implementation strategy for each of the four measurement perspectives provides a guide for defining the roles and responsibilities of participants; the frequency of reporting; the data collection methods; the methods of analyzing, reporting and interpreting the information; and the issues that may impede progress for each measurement perspective. The strategy also provides the guidelines for developing and operating the performance measurement processes once the implementation effort is complete. These strategies also assist in creating a common understanding and acceptance of all the required performance measurement elements prior to their development and use.
Integration of performance measurement processes into operational activities
The next activity in the implementation of a PMS is the development or adaptation of existing business processes to support data gathering for each of the four performance measurement perspectives.

In some cases, existing sources can provide the required information without much adaptation. This is normally the situation for the financial perspective, where information on revenues, expenditures, capital expenditures and funding ratios is usually readily available. Changes that may be required in the existing financial processes relate to establishing reference levels; year-end reconciliation procedures; financial coding for projects; reporting of revenues/expenditures down to the project level; and reconciliation of financial/time utilization information at the project level.

Measurement of performance can involve the creation of a new business process. This is often the case in measuring client satisfaction. With regard to quality of service, employee satisfaction and continuous improvement, the measurement of performance may involve either the amendment of existing processes or the creation of new ones.

Once business processes for each measurement perspective have been adapted or created, the information generated must be captured according to predetermined reporting formats.
- Financial Report – provides (at a minimum) data on revenues, expenditures, capital and funding ratios. This information is usually reported on a monthly basis.
- Quality of Program/Service Composite Status Report – summarizes the performance of all IM/IT projects related to each performance measure – project management index, functional/technical quality, etc. All these performance measures are averaged to determine an overall grading for each project according to the strategy developed for this perspective. This is also a monthly report.
- Client Satisfaction Report – summarizes all the information in the client satisfaction index related to projects/services, staff/consultants, communication, and improvements to existing programs/services/processes or requirements for new ones. Client retention information is also published. This is usually a semi-annual report that is refreshed on an ongoing basis by client satisfaction information gathered at the project level.
- Employee Satisfaction Report – outlines all the information in the employee satisfaction index related to plans/goals, role clarity, decision making/communication, team building, staff utilization, rewards/recognition and productivity. This may be either a semi-annual or annual report, depending on the initial degree of satisfaction assessed.
- Continuous Improvement Report – defines all the information in the continuous improvement index related to staffing, suggestions for improvement, professional development and rewards/recognition. This is a quarterly report.

These reports are collated and published quarterly in a composite report called the Dashboard.
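The collation of the perspective reports into the quarterly Dashboard can be sketched as a simple aggregation step. This is an illustrative sketch only: the report names follow the article, but the function, field names and sample figures are invented for the example, not taken from any actual reporting format.

```python
from datetime import date

def build_dashboard(quarter_end, reports):
    """Collate per-perspective reports into one composite Dashboard record."""
    expected = {
        "financial",                # monthly
        "quality_of_service",       # monthly composite status
        "client_satisfaction",      # semi-annual, refreshed per project
        "employee_satisfaction",    # semi-annual or annual
        "continuous_improvement",   # quarterly
    }
    missing = expected - reports.keys()
    return {
        "quarter_end": quarter_end,
        "reports": {k: reports[k] for k in expected & reports.keys()},
        "missing": sorted(missing),  # perspectives with no data this quarter
    }

# Hypothetical quarter with only two reports available so far.
dash = build_dashboard(date(2000, 3, 31), {
    "financial": {"revenues": 1_200_000, "expenditures": 1_150_000},
    "quality_of_service": {"overall_grade": "green"},
})
```

Flagging the missing perspectives explicitly supports the multiple-lines-of-evidence principle: the Dashboard makes visible not only what the data says but which sources of corroboration are absent.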
Interpretation of results
The performance information presented in the Dashboard outlines issues related to the achievement of organizational vision from the financial, quality of service, client satisfaction and innovation/learning perspectives. Cumulatively, this information provides a causal link between the achievement of the vision and the strategic, tactical and operational issues interfering with this goal. Yet the basic question remains of how to interpret the Dashboard information so that the necessary corrective actions can be taken. This can be accomplished in several ways:
- Rating system
Within each of the four measurement perspectives, the implementation strategy provides interpretation guidelines. For example, within the quality-of-service perspective, all the performance information related to programs and services is captured in a project status report. When it comes to interpreting this information, the system provides a green rating if all dimensions of the project – cost, schedule, functional/technical quality, etc. – are within 10 percent of plan. A yellow rating is provided if these dimensions deviate between 10 and 20 percent, and a red rating if they deviate by more than 20 percent. For non-quantitative quality-of-service dimensions such as effective use of staff, a judgment of these dimensions is provided by the project manager. This judgment and the other information on the status report are verified by providing a copy to the client.

Another rating system is based upon the interpretation of questionnaire information. On a five-point scale, an answer provided by clients (client satisfaction questionnaire) or by employees (employee satisfaction questionnaire) is rated red, yellow or green, based upon average scores of 1 to 2.5, 2.6 to 3.5, and 3.6 to 5.0 respectively.
- Use of baseline information
Baseline information provides a historical perspective on the performance dimensions that permits the analysis of trends over time. This is powerful and necessary to understanding any changes in the organization. It is also difficult to obtain. Financial and employee information are probably the only types available and are not likely to reflect all performance measurement dimensions. However, it is worth the effort to gather as much of this information as possible in a reasonable length of time.
- Use of benchmarks
Benchmarking allows comparisons with the business processes of leading organizations in order to provide the information needed to improve existing operations. As an example, a staff turnover rate of 11 percent annually is highly stressful for most organizations and indicates a major problem, but in the informatics world, that is a normal level of attrition. Without this benchmark information, the interpretation of this performance indicator would be faulty.
- Experience with the organization
This experience is also a key requirement for interpreting any performance information. Major changes must be interpreted within the context of the history of the organization. For example, an overall employee satisfaction score of 3.5 out of 5.0 may indicate major problems in various dimensions. Knowing, however, that Program Review has just resulted in major cuts and that this rating is an improvement over that of the previous year provides a powerful interpretation of this score.
- Multiple lines of evidence
The PMA is designed in such a way that multiple lines of evidence are provided for each perspective at different levels in the organization at different times. For example, financial information is obtained from financial performance indicators and is also provided by the project status reports. Client satisfaction information is provided through client satisfaction questionnaires but is also included as a dimension of the project status report. The multiple lines of evidence from the PMS are designed to converge, with each piece of information reinforcing the others. Any issue outlined in any perspective of the Dashboard should, where possible, have corroborative information from another perspective.
The most effective interpretation of performance measurement information can be obtained from using as many of these techniques as possible in combination. Their use is especially important in writing the executive summaries in the Dashboard.
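The two rating rules described above are mechanical enough to express directly. The sketch below implements them as stated – project dimensions within 10 percent of plan rate green, between 10 and 20 percent yellow, and beyond 20 percent red; five-point questionnaire averages of 1 to 2.5, 2.6 to 3.5 and 3.6 to 5.0 rate red, yellow and green respectively. The function names and the treatment of boundary values are assumptions made for the illustration.

```python
def rate_deviation(pct_off_plan):
    """Rate a project dimension (cost, schedule, quality) by % deviation from plan."""
    pct = abs(pct_off_plan)
    if pct <= 10:
        return "green"
    if pct <= 20:
        return "yellow"
    return "red"

def rate_score(avg):
    """Rate a five-point questionnaire average (client or employee survey)."""
    if avg >= 3.6:
        return "green"
    if avg >= 2.6:
        return "yellow"
    return "red"

# Hypothetical project dimensions and their resulting ratings:
# cost 8% off plan -> green, schedule 15% -> yellow, quality 25% -> red.
dimension_ratings = {
    dim: rate_deviation(pct)
    for dim, pct in {"cost": 8, "schedule": 15, "quality": 25}.items()
}
```

Encoding the thresholds once, rather than re-deriving them per report, keeps the green/yellow/red interpretation consistent across the project status reports and the questionnaire summaries that feed the Dashboard.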
Communicating the results

Communicating the results to both senior management and staff is imperative. It is only through communication and decision making that changes can be made, either to correct the course of the organization toward its vision or to reward the efforts made and encourage continued results.

The Dashboard provides feedback to the management committee that contains the actual results on how well strategies and plans are being met and identifies the issues at all levels – strategic, tactical and operational. This feedback provides the information necessary for improving decision making within the organization.
To ensure that the Dashboard is used effectively,
- the quarterly publishing date should be standardized at six weeks after the end of the quarter so that all members of the management committee expect it as a regular feature of their management meetings
- it should be delivered to members of the management committee one week before the next management meeting to allow them sufficient time to read and understand its contents
- at least a one-hour time slot should be allotted for the discussion of
  - the progress achieved since the last meeting
  - the issues/impacts related to each perspective
  - the recommendations to deal with the important issues
  - the decisions required.
Step 5: Institutionalization of a PMS into the organization
Two strategies are essential to ensure that performance measurement becomes integrated with the organization’s standard operating procedures.
Knowledge must be transferred from the subject matter experts to the organization to make the informatics organization self-sufficient and self-sustaining in its ability to operate and update the PMS. Specific areas of knowledge transfer include
- the methodology for developing performance measurement architectures
- the means to establish baseline information in order to compare future performance
- methods for developing and testing questionnaires
- methods for collecting, analyzing and interpreting data
- development of an evaluation strategy to assess the effectiveness of the performance measurement pilot
- business process renewal
- project management
- facilitation and structured interviewing techniques
- training techniques
- presentation techniques.
As the OPM is responsible for the development and implementation of the PMS, it should become the repository of these performance measurement skills/abilities. Sufficient dedicated resources are necessary to ensure this expertise is retained within the organization.
The development of an organizational culture supportive of the PMS is essential to breaking down organizational and individual resistance to change. There are techniques that will gradually do this and increase ownership of the system:
- Assessment of the readiness of the organization to adopt this approach to performance measurement is an indicator to staff that change is underway and may well raise their expectations for the organization.
- Use of straw dog models to present new concepts or tools allows staff members to participate, examine, review and revise these documents, thereby increasing their ownership and commitment and their expectation of continued involvement in the change process.
- Employment of facilitation and structured interviewing techniques helps in gaining effective feedback and in presenting new information and concepts. This assists in opening the lines of communication and encourages free exchange of ideas, issues and solutions.
- Holding regular weekly meetings helps deal with ongoing implementation issues and increases the involvement of staff in supporting the course for the organization.
- Use of flexible methods to integrate the PMS into the operational processes of the organization improves staff morale, as staff members see that their skills and talents are recognized and appreciated.
- Holding training/orientation sessions to transfer knowledge and skills to the program office and the program and service areas within the informatics organization encourages and supports staff growth and development.
- Using an issue log tracks problems hampering the implementation effort. Action on these issues by managers indicates to staff that management is serious about the changes effected by the PMS.
- Using an issue briefing note to describe a problem and outline its implications, solution options and recommended actions provides management with the information required to make sound, practical decisions to drive the organization toward its vision.
- Ongoing leadership and communication by senior management are needed to support the cultural change created by the balanced PMS – for example, communicating the need for staff to participate and cooperate fully and the actions taken by managers to address identified performance measurement issues.
Experience has shown that several best practices can be used to assist in the smooth development and implementation of a balanced PMS:
- The development and implementation of a balanced PMS is both an art and a science. The science involves the use of the steps previously described to provide a consistent philosophy, structure and process so that comprehensive information can be provided across all dimensions of the organization. The art involves tailoring the design and implementation process to the unique requirements of each organization in terms of specific measures, timing, sequence of activities and knowledge transfer.
- The use of pilots provides evidence of the utility of this performance measurement approach. It gradually builds acceptance of and support for a potentially threatening project and provides the experience needed for tailoring the PMS to the unique requirements of the organization while developing expertise within it.
- The development and establishment of a balanced PMS must begin at the top of the organization and be implemented progressively downward. This requires the full commitment and support of management in terms of providing a supportive environment and communicating the necessity and importance of this approach. It also requires a financial investment to develop, implement and maintain the operation.
- Adopt a just-do-it approach to the development and implementation of the PMS. Don’t be tied to expensive data gathering and implementation methods. Use work-around strategies to find simple but effective solutions. Where progress is delayed in developing and implementing the measurement of one perspective, focus more attention on the others. Because the whole process is self-improving and self-correcting, the key is to go with the information, processes and interpretation already developed.
- Unless the information generated from the balanced PMS is used to take corrective actions in the form of strategies/plans to steer the organization toward its mission and vision, the development and implementation effort is wasted. This performance information must be used to move the organization forward progressively in terms of finances, quality of programs/services, client/employee satisfaction and continuous adaptation to changing circumstances. When employees see that performance information is used to improve the functioning of the organization, the PMS will become self-sustaining.
The delivery of informatics programs and services that are affordable, accessible and responsive to the needs of Canadians can be achieved only if public sector informatics managers are able to obtain effective feedback through a balanced approach to performance measurement. Such feedback provides the information needed for the informatics organization to establish and sustain excellence in program and service delivery to the public.
*1 This is based on the Balanced Scorecard ApproachÂ developed by R.S. Kaplan and D.P. Norton.
Bryan Shane is senior partner of BPC Management Consultants in Ottawa. Since 1981 he has provided change management consulting services to both public and private sector organizations. Mr. Shane has a BA in Political Science from Carleton University and a BEd and MEd from the University of Ottawa. He also completed postgraduate studies in statistics and evaluation.