A Question
Ben Lawson’s response to the post “I measure therefore I manage” (do read it!) reminded me of a question I have recently been asking myself – What’s the difference between ‘populism’ and ‘community participation’? Is it, for example, the difference between uninformed (and often knee-jerk) reaction and informed and considered judgement? And, if so, what can we do to lift the level of informed response?
Good and Bad Practice
Back when I was writing “Strategic Asset Management” it was my job to note both good and bad practices in the measurement of community satisfaction. The clue? Good practices set themselves apart from the pack by the amount of real understanding of the problem they generated (both the costs and the consequences).
Citizen Juries
One example was the use of Citizen Juries, such as the exercise in Boroondara where a representative sample (about 20 people) of the citizenry was engaged (and paid!) to learn, over a period of six weeks, about the issues and options facing the city and then – as informed citizens – to give their considered opinions. Notably, the City then implemented the jury’s recommendations, and the jury members became the strongest advocates for both the City and the Council.
Sydney Opera House
A smaller example was the decision about replacing the broken flagstones in the courtyard of the Opera House. The question was: should all cracked flagstones be replaced, or only those that posed major problems? Three costings were developed according to the degree of replacement, and pictures were provided of both the current situation and what the final result would look like under each option. Then local visitors (i.e. those who would be taxed to pay for it) were polled to see which level they supported. With both the costs and the consequences in evidence, those polled chose a moderate degree of renewal, not the most expensive option. Note that good practice asked specific questions and provided both costs and consequences.
Poor practice?
Current local government ‘satisfaction surveys’, on the other hand, ask general questions and provide neither costs nor consequences.
How would you prefer that decisions on your rates and taxes be determined?
Satisfaction surveys are poor measures: uninformed responses are valueless for decision-making, except as an exercise in expediency. Exercises that let respondents “set a budget” while being informed of the consequences of their selections are a far better approach. Watching your rate-reducing choices result in “garbage in the streets” is an excellent way to learn to give meaningful feedback.