Outrage Management and Precaution Advocacy [en]

[fr] A very interesting interview about risk communication. A risk is an objective hazard, and also a subjective reaction, "outrage". The two are not linked: we see very emotional reactions to very low risks, and high risks that don't worry people at all. The challenge is therefore to find techniques to "calm" excessive worry about minor dangers (= "outrage management") and to heighten the sense of danger for dangers that don't worry people enough (= "precaution advocacy"). Fascinating.

Listening to an old episode of On The Media, I came upon this super interesting segment about risk communication (titled Terrorists vs. Bathtubs — listen to the piece, it’s just over 10 minutes, or read the transcript).

Brooke interviews Peter Sandman, an expert in the field. He presents risk as a combination of hazard and outrage: hazard is the real danger, and outrage is how upsetting it is. There is no correlation between the two, and that is what makes risk communication tricky.

When I was studying chemistry I had a class on risk management. It was one of my most interesting classes, and had I stayed in chemistry, I might have delved deeper into the subject. What I learned (and it changed the way I view the world) is that a risk is the product of the probability that something will happen and the amount of damage if it does. Peter Sandman adds another dimension to the equation: the human reaction.
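Spelled out, the two framings look roughly like this (a sketch; the labels are mine, though Sandman's own shorthand is usually quoted as "Risk = Hazard + Outrage"):

    Risk = Probability × Damage    (the classic risk management definition)
    Risk = Hazard + Outrage        (Sandman's risk communication framing)

The first formula is about what actually happens in the world; the second is about how people respond to it, which is the part risk communicators have to work with.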

Outrage management is what you do when you're faced with people who are excessively angry or frightened about something that is not that dangerous. Precaution advocacy is what you do to make people more worried or scared about something they are not concerned enough about.

Trust and control play a big role in how much outrage a risk will generate. If I trust you and you say it's no big deal, I'll calm down. If I control the risk, I'll be less outraged than if I don't (quoting from the interview transcript):

Trust is a biggie. If I trust you, I’m going to find the risk that you are exposing me to much more acceptable than if I don’t trust you. If you trust the government to tell you that surveillance is no big deal and they’re gonna do it responsibly, you’re gonna have a different response than if you think the government is not to be trusted. So trust is one.

Control is one. If it’s under my control I’m going to be less upset than if it’s under your control. Memorability goes in the other direction. If you can remember awful things happening or you can imagine awful things happening, that makes the risk more memorable, that makes it more a source of outrage. But what’s key here is that outrage has a much higher correlation with perceived hazard than hazard has with perceived hazard.

Peter gives an example of how to manage outrage:

Let’s take a situation that most of your listeners are going to think is genuinely low hazard, like vaccination. But if you’re the CDC or you’re some public health department and you’re dealing with a parent who’s anxious, it’s not mostly telling the parent that it’s foolish to worry about vaccine. It’s much more listening to the parent’s concerns. It’s partly acknowledging that there is some truth to those concerns. The strongest argument in the toolkit of opponents of vaccination is the dishonesty of vaccination proponents about the very small risk that’s real. If you’re 98 percent right and pretending to be 100 percent right, then the advocates of that two percent nail you!

And here’s an example of the opposite, precaution advocacy, when you actually try and increase outrage to encourage people into safer behaviours:

One of the things that demonstrably works well with seatbelts and, well, generally in precaution advocacy is scaring people. So those scary driver's ed movies that, you know, they make teenagers watch actually do a lot of good. Role models work.

One of the most effective things in persuading people to get vaccinated against the swine flu pandemic a couple of years ago was when President Obama got his children vaccinated. One example of a strategy that’s very powerful is if you can get people to do a behavior that doesn’t necessarily make sense to them, because they don’t have the attitude to support that behavior, once they have done the behavior, they begin to wonder why they did it. This is called cognitive dissonance. And, and cognitive dissonance is a very strong motivator for learning things that you wouldn’t otherwise want to learn.

A nice example of this is most people who have ever tried to ask people to sign petitions notice that more people sign your petition and then read your literature than read your literature and then sign your petition. They sign the petition to be courteous, and then the act of signing the petition makes them wonder, what did I do, what did I sign? Then they read the literature, in order to teach themselves that what they did made sense and, and to develop an attitude that supports the behavior.

The conversation goes on to talk about the NSA and surveillance and terrorism (this is not long after the Snowden leaks), as well as the narrative around fracking, which Peter has since written about on his website. (His website is full of good stuff, by the way, including musings on his legacy, as he’s pretty much semi-retired.)

What I was really interested in, though, was this concept of outrage, and how trying to calm outraged people down with facts doesn't really work.