Taking the lead in risk management
Like their counterparts in other departments of local government, risk managers are interested in determining the “best practices” in their field. Identifying local government risk managers and risk management programs with reputations for excellence is a good starting point.
Last year, the Department of Public Administration at the University of North Carolina, Chapel Hill, mailed questionnaires to more than 2,000 local government risk management professionals across the United States and Canada, asking them to identify individual local government officials, departments or entire local government organizations (“reputational leaders”) that would be worthy models for others wishing to excel in risk management. The department received responses from 47 states, five Canadian provinces and the District of Columbia.
The tops
Altogether, 404 cities, counties, school districts and affiliated organizations were cited by respondents for having outstanding risk management programs. Seventy-two percent of those were cited by one respondent each; 17.3 percent were named by two respondents; 3.5 percent by three; and 7.2 percent by four or more respondents. The 29 most frequently cited programs clearly enjoy national reputations as leaders in local government risk management.
Seventeen cities, seven counties, one combined municipal/county operation and four school districts made up the 29. Led by Anaheim, Calif., which was cited by 25 respondents, the most often-cited local governments span the country and serve populations as large as 3.2 million persons (Harris County, Texas) and as small as 60,110 (West Hartford, Conn.).
In many cases, especially among programs cited three times or fewer, a program’s favorable reputation extended throughout its own state but not beyond. In contrast, several of the 29 top programs derived part of their reputational strength from respondents in other states. In fact, out-of-state citations alone would have been sufficient to place three of the programs (Anaheim; Pima County, Ariz.; and Washoe County, Nev.) on the list of top programs, and very nearly a fourth (Charlotte/Mecklenburg County, N.C.), without any support at all from in-state respondents.
As was the case with risk management programs, most of the officials cited as outstanding risk managers were named by only one respondent. A total of 437 persons were cited – 332 (76 percent) by only one respondent each. Fifty-seven (13 percent) were named by two respondents each, 21 (4.8 percent) by three each, and 27 (6.2 percent) by four or more respondents each.
Heading the list of reputational leaders were four officials who were cited 10 or more times as outstanding risk managers. Like his city, Anaheim Risk Manager Tom Vance led in citations, with 23 respondents naming him. The other three managers with 10 or more citations – Dave Parker of Pima County; Tom Phillips of Santa Monica, Calif.; and Dan Pliszka of Charlotte/Mecklenburg County – also were affiliated with frequently cited risk management programs.
The search for best practices
Administrators and analysts searching for best practices are routinely advised to seek leads on outstanding programs from officials of relevant professional associations, writers and researchers in the field, consultants and other experts, as well as from their own colleagues. The survey provided that information in concise form.
The questionnaires were sent primarily to members of the Public Risk Management Association (PRIMA), as well as to officers, board members and executive directors of the International Municipal Lawyers Association, the Public Agency Risk Managers Association, and the Public Risk Database Project. State league risk pool administrators, recent recipients of awards for public sector risk management, authors of books or articles on the subject, private consultants and officials of state risk management agencies also were surveyed.
PRIMA members generally agreed on the programs and risk managers with excellent national reputations. Most consultants and authors declined to provide recommendations, either because they were unprepared to offer that kind of advice or because they preferred not to do so. Results of the survey suggest that their advice might differ substantially from that of local government risk managers, but whether it would be better or worse remains an open question.
The eight responding consultants nominated 14 risk management programs as “best practices” models. Only Charlotte, N.C., and Montgomery County, Md., drew three or more of those nominations, and the consultants cited only one of the 27 risk managers most frequently named by the other respondents. The four responding authors cited none of the programs or managers identified as reputational leaders; however, two commended Cincinnati for its risk management program.
Pool administrators and state agency officials offered more nominations, but, again, their choices rarely matched those offered by other respondents. That does not mean that they – or the consultants and authors – have made invalid recommendations; it merely means that their choices tend to differ from those of PRIMA members in general, association officials and award winners. A benchmarking project built on the recommendations of pool administrators, consultants and authors likely would look quite different from one built on the recommendations of PRIMA members, association officials and award winners.
Outstanding public sector officials often toil in anonymity, and their commitment to excellence all too often goes unnoticed and unappreciated. Even in their own communities, the contributions of dedicated and proficient administrators may be taken for granted by others who have grown to accept and expect top-notch performance every year. The 29 risk management programs and 27 risk managers cited in the survey enjoy reputations for excellence among knowledgeable persons in their own states and, in some cases, regionally or even nationally.
Still, having a reputation for excellence is not the same as having a confirmed record of performance excellence. The survey examined reputations only; no attempt was made to confirm whether the local governments and individual officials named are worthy of the reputations they have gained. (However, it is reasonable to expect a strong positive correlation between having a reputation for excellence and actually possessing that quality.) Benchmarkers hoping to improve their own operations by using the survey results for leads should keep those limitations in mind.
David Ammons is an associate professor of public administration at the University of North Carolina’s Institute of Government. He is the author of “Municipal Benchmarks: Assessing Local Performance and Establishing Community Standards,” and he consults with local governments on organizational and management concerns. Support for the study was provided by the Public Entity Risk Institute, the North Carolina League of Municipalities, Salisbury, N.C., and the Institute of Government at the University of North Carolina.