Benchmarking  (from Municipal Benchmarks, David N. Ammons, Thousand Oaks, CA:  Sage Publications, 2001)


1.  Benchmarking is a tool used by local governments for performance measurement and monitoring.

  • Distinguish between simple comparisons and full-fledged, corporate-style benchmarking, which involves:
    • Analysis of performance gaps between one’s own organization and best-in-class performers
    • Identification of process differences that account for the gap
    • Adaptation of key processes for implementation in one’s own organization in an effort to close the gap
  • Reactive (speed/skill in responding to problems) vs. proactive (anticipating problems so they don’t occur) management
  • Aggregate statistics as camouflage
  • Unaudited data (inconsistent definitions)
  • Time frame differences

 

2.  Why Measure Performance?  A means of keeping score on how various operations are doing.

  • Accountability
  • Planning/Budgeting
  • Operational Improvement
  • Program Evaluation/Management-by-Objectives/Performance Appraisal
  • Reallocation of Resources
  • Directing Operations/Contract Monitoring

 

3.  Types of Performance Measures

  • Workload (output) measures (amount of work performed/amount of service received)
  • Efficiency measures (relationship between work performed and resources used to perform it; e.g., unit cost)
  • Effectiveness (outcomes) measures (degree to which performance objectives are achieved)
  • Productivity measures (combine effectiveness and efficiency in a single indicator; e.g., unit cost per effective unit of work/service performed)
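The four measure types above can be illustrated with a small calculation. This is a hypothetical sketch: the service (refuse collection) and all figures are invented for illustration, not drawn from Ammons.

```python
# Hypothetical refuse-collection figures (invented for illustration).
tons_collected = 5_200       # workload (output): amount of work performed
total_cost = 416_000.00      # resources used to perform it ($)
scheduled_pickups = 26_000
missed_pickups = 130

# Efficiency: work performed relative to resources used (unit cost).
cost_per_ton = total_cost / tons_collected

# Effectiveness: degree to which the performance objective
# (completing scheduled pickups) is achieved.
completion_rate = 1 - missed_pickups / scheduled_pickups

# Productivity: effectiveness and efficiency combined in a single
# indicator, here the unit cost per effectively collected ton.
cost_per_effective_ton = total_cost / (tons_collected * completion_rate)

print(f"Efficiency:    ${cost_per_ton:.2f} per ton")
print(f"Effectiveness: {completion_rate:.1%} of scheduled pickups completed")
print(f"Productivity:  ${cost_per_effective_ton:.2f} per effectively collected ton")
```

Note how the productivity indicator penalizes missed pickups: the effective cost per ton rises above the raw unit cost whenever effectiveness falls short of 100%.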

 

4.  Criteria for Performance Measures

  • Valid
  • Reliable
  • Understandable
  • Timely
  • Resistant to undesired behavior
  • Comprehensive
  • Nonredundant
  • Sensitive to data collection cost
  • Focused on controllable facets of performance

 

5.  Sources of Performance Data

  • Existing records
  • Time logs
  • Citizen/client surveys
  • Trained observer ratings
  • Specially designed data collection processes

 

6.  Resistance to Performance Measurement

  • Supervisors threatened by vulnerability to accusations of poor performance
  • Reliance on political negotiations over data-driven decisions
  • Frontline employees and supervisors not consulted on choice of measures and fairness of measurement
  • Common complaints:
    • “You can’t measure what I do!”
      • Usually, office characterized by nonroutine work and no data collection system
      • Ask, “If your office were closed for a few weeks, who would suffer the greatest impact, and what aspects of your work would be missed the most?”
    • “You’re measuring the wrong thing!”
      • Involve the frontline employees
      • Disagreements & misunderstandings should be resolved
    • “It costs too much, and we don’t have the resources!”
      • Like the logger who, faced with a stack of uncut logs, felt he could not take the time to sharpen his dull saw.
      • Demonstrate how the measures will lead to improved services, more efficient services, or more equitable services.

 

7.  Developing a Performance Measurement and Monitoring System

  • Secure management commitment
  • Assign responsibility for coordinating efforts to develop sets of performance measures
  • Select departments/activities/functions for development of performance measures
  • Identify goals & objectives
  • Design measures that reflect performance relevant to objectives
    • Emphasize service quality and outcomes (rather than inputs or workload)
    • Include no more measures than necessary
    • Solicit rank-and-file as well as management input/endorsements
    • Identify the work unit’s customers and emphasize delivery of services to them.
    • Consider periodic surveys of citizens, service recipients, or users of selected facilities
    • Include effectiveness and efficiency measures
  • Determine desired frequency of performance reporting
  • Assign departmental responsibility for data collection and reporting
  • Assign centralized responsibility for data receipt, monitoring, and feedback
  • Audit performance data periodically
  • Ensure that analysis of performance measures incorporates a suitable basis of comparison
  • Ensure a meaningful connection between performance measurement system and important decision processes
  • Continually refine performance measures, balancing need for refinement with the need for constancy in examining trends.
  • Incorporate selected measures into public information reporting.

 

8.  Benchmarking

  • Compare jurisdiction’s performance marks against some relevant peg
    • Compare to same measure in previous reporting period
    • Compare to different units in same jurisdiction providing similar services
    • Rarely, compare to performance data from other jurisdictions or from national, state, or private-sector organizations
  • Anchor desired performance results in either
    • Professional standards (e.g., ICMA data)
    • Experience of respected municipalities
  • Problems:
    • Data availability
    • Comparability
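The comparison bases above can be sketched in a few lines. All district names and unit-cost figures here are hypothetical, and in practice the data-availability and comparability problems just noted must be addressed before such comparisons mean anything.

```python
# Hypothetical unit-cost figures (invented for illustration).
current = {"period": "FY2009", "cost_per_ton": 80.40}
previous = {"period": "FY2008", "cost_per_ton": 76.10}

# Peg 1: the same measure in a previous reporting period.
change = (current["cost_per_ton"] - previous["cost_per_ton"]) / previous["cost_per_ton"]

# Peg 2: different units in the same jurisdiction providing similar services.
peer_units = {"District A": 72.50, "District B": 84.90, "District C": 79.00}
best_in_class = min(peer_units, key=peer_units.get)

# The performance gap to be analyzed and, through process changes, closed.
gap = current["cost_per_ton"] - peer_units[best_in_class]

print(f"Change vs. {previous['period']}: {change:+.1%}")
print(f"Best-in-class unit: {best_in_class} (${peer_units[best_in_class]:.2f} per ton)")
print(f"Gap to close: ${gap:.2f} per ton")
```

The gap itself is only the starting point; full-fledged benchmarking then asks which process differences account for it.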

 

9.  Examples

Further Reading:

Ammons, David N.  2001.  Municipal Benchmarks, 2nd Ed.  Thousand Oaks, CA:  Sage Publications.

Hatry, Harry P.  1999.  Performance Measurement:  Getting Results.  Washington, DC:  The Urban Institute Press.

Hatry, Harry P. and others.  1992.  How Effective Are Your Community Services? 2nd Ed.  Washington, DC:  ICMA. 

 

MSU

 

© 2006 A.J.Filipovitch
Revised 7 October 2009