Phobia Indoctrination, Part II: Rational assessment of danger and how it goes wrong
In this series, we consider the applications of an important concept from the study of cults-- phobia indoctrination. Cults keep their members in the fold by instilling in them irrational fears of the dangers posed by the outside world, and by those outside the cult. Thus, a cult can subject its members to psychological abuse including, but not limited to, gaslighting, and since members are isolated within the cult, too afraid to interact with those outside it, they have no external reality check-- no one to rein in the exaggeration of those fears. It is an insidious tactic, dangerous both for the damage it does directly and for the way in which it traps people in a cult. The more I read about it, the more I think it has explanatory power in modern American politics, even taking care to avoid Haslam's "concept creep." In order to see how phobia indoctrination works more broadly, though, we need a baseline of comparison.
There is no such thing as a safe life. Five to one, baby, one in five. No one here gets out alive. Yet one can live a safer, better and happier life. The question is balancing the costs of mitigation against the risks themselves. Few Americans think about the actual, statistical risks of driving. Roads are dangerous. The rational response is not to hide in one's home, but rather, to drive defensively, avoid stupid things like drinking and driving, texting and driving, caring whether or not the guy who just sped past you gets his due, or anything like that. Live in the world, mitigate the risks through rational action, and understand that there is no such thing as a safe life.
When we consider any potential danger, the rational questions to ask are these. What, specifically, may happen? What is the cost if it happens? What is the probability that it will happen? What steps can I take to mitigate the risk of the event? How much can I decrease the probability, and at what cost? Think of the following basic risk reduction problem, put in utility terms. In microeconomics, we use Von Neumann-Morgenstern utility functions, which have the following property: if you get x utils with probability p, that is worth xp utils. Also, this stuff is additive.
So let's say you face the risk of an adverse outcome, b, that costs you b utils, where b < 0, if it occurs. The probability of b is p. If b does not occur, nothing happens, and you get nothing, as Willy Wonka says, so 0 utils. Your expected utility is 0(1-p) + bp = bp < 0.
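To make that concrete, here is a minimal sketch in Python; the values of b and p are made up for illustration, not drawn from any real data.

```python
# Expected utility of facing the risk unprotected, with hypothetical numbers.
b = -100.0  # utility of the adverse outcome (b < 0)
p = 0.001   # probability that it occurs

# You get 0 with probability (1 - p) and b with probability p.
expected_utility = 0 * (1 - p) + b * p
print(expected_utility)  # -0.1, i.e., bp < 0
```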
But what if you could pay a cost, c, which reduces p? Is it worth it? That depends on c, and the reduction in p.
If bp' - c > bp where p' < p, then the protection you bought is worth it. We can simplify that.
bp' - bp > c
b(p' - p) > c
-b(p - p') > c
If I had been thinking about the positives and negatives in my notation at the start, I might have done that differently, but whatever. Here's the result, which makes sense, translated from the algebra. Buy the protection-- sorry for the mafia-speak-- if it costs less than the product of two quantities: the magnitude of what you are trying to avoid, and the reduction in the probability that it occurs.
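If you would rather see the rule in code than in algebra, here is a short Python sketch; the function name and every number in it are mine, purely for illustration.

```python
# Decision rule from above: pay c to cut the probability from p to p'
# only if b*p' - c > b*p, i.e., only if c < -b*(p - p').

def protection_worth_it(b: float, p: float, p_prime: float, c: float) -> bool:
    """True if buying the protection beats doing nothing."""
    return b * p_prime - c > b * p

b = -1000.0      # magnitude of the bad outcome
p = 0.01         # baseline probability
p_prime = 0.001  # probability after mitigation
# -b*(p - p_prime) = 1000 * 0.009 = 9, so protection is worth up to 9 utils.
print(protection_worth_it(b, p, p_prime, c=5.0))   # True
print(protection_worth_it(b, p, p_prime, c=20.0))  # False
```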
Much of the time, the events under consideration are not presented to you as your probability of dying in an individual year. You may see the number of events. You may see that broken down by demographic or other relevant category. You then need the denominator, which you may have to find over at the Census, and through a simple act of division, you calculate for yourself the actual probability of dying from Cause X. It is usually pretty low, but death is a pretty big penalty, so we turn to the multiplication issue in a Von Neumann-Morgenstern utility function. Low probability event, but a big penalty. The thing is, people usually forget how low the probability is when assessing their fears, and hence fail to consider what any given probability reduction would do.
And hence, how much to pay for protection.
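The division itself is trivial, which is rather the point. A sketch, with an invented count standing in for whatever you pulled from the CDC or FBI:

```python
# Turning a raw event count into an annual probability. The numerator here
# is invented; the denominator is the rough US population from the Census.
deaths_from_cause_x = 3_300
us_population = 330_000_000

p = deaths_from_cause_x / us_population
print(p)  # 1e-05: a 0.001% annual probability of dying from Cause X
```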
What happens instead? You see an anecdote, and you are subject to a cognitive bias. Tversky and Kahneman wrote about the heuristics people use to assess probabilities, and the biases that result. Consider, for example, the availability heuristic. If it is easy to recall a type of event-- for example, because you have heard the anecdote-- you overestimate the probability of that type of event.
Meaning, you get scared of the shit you see in the news. Remember my terminology here: "the paradox of news." An event is new and different and hence, newsworthy because it is out of the ordinary, but observing the event (though mediated) causes you to think that the event is ordinary, which is the exact opposite of the truth.
Apply this.
It is really just an application of news practices plus Tversky & Kahneman. Following the if-it-bleeds-it-leads principle, the news exacerbates your fears and blows them way out of proportion. The way to get them in check is to do something that no normal person does, which is hang around on the CDC and FBI websites, cross-referencing them with Census data to figure out the right denominator.
And what can either a malevolent actor, or even a good-faith but mathematically inept actor, do here? Focus on the anecdotes that cause you to overestimate the probability of one set of adverse outcomes, and hence overpay for protection from them, while you cower in a closed information bubble that talks about that exaggerated risk constantly, because you need to be so so so scared of it. Maybe they believe it, and maybe they don't, but the effect on you will be the same.
Remember, it isn't just focusing on the risk, or even just exaggerating the risk, but convincing you to stay within a closed informational bubble, and overpay for protection based on an exaggerated estimate of the probability of the adverse event.
What's the other phobia indoctrination angle? Consider b. Death is pretty bad. Well, kinda. To Hobbes, it was the painful death that was bad, death itself being inevitable. I referenced Epictetus last week, and Epictetus didn't even give a shit about pain. Death was natural and life was temporary, so do right and accept it.
Dude, go to the fucking doctor. Yeah, memento mori, but go to the fucking doctor.
But what about the non-death bad outcomes, and the non-injury bad outcomes? There are plenty of adverse outcomes for which we cannot simply look up the numbers in CDC and FBI databases. Maybe we are making projections into the future based on scientific and social scientific models, or less precise predictive models, or maybe even worse-- and this is where it gets really tricky-- we are projecting our own reactions and states in response to something, be it a policy that is not literally life-or-death, or even just social situations and civil society.
Have you ever thought you would have X reaction to a situation, and had your reaction instead be Y?
Yes. Because you are bad at predicting your own reactions, to say nothing of the imprecision of predictions when they are not based on models that have been replicated over and over and over again. I can tell you, with a high degree of certainty, that incumbents will win House elections next year more than 90% of the time.
Your response: yeah, but that's boring! We all know that!
Yeah, you know that because we have observed, tested and replicated it.
I can also tell you that a bunch of challengers are going to spend fuckloads of money and still lose, and then you'll forget them because it won't fit the model you have in your head of money buying elections.
Is that less boring? We've replicated that one too.
Your own reactions to complex situations? You don't know those.
And here is an opportunity for mischief. If the cost is vague and unspecified, it is easy for someone to tell you that it will be high. If you believe that, then even if the probability is low or the protection does little, you will pay for it, and indeed overpay for it, while, worse yet, cloistering yourself amongst those who wallow in fear of the unspecified.
Amid all of this, I implore the reader to remember former VP Dick Cheney's 1% doctrine. The idea was as follows: if there was even a 1% chance of the terrorists getting a nuke, we needed to treat that as an absolute certainty. The left recoiled, but it was actually about half-way to Von Neumann-Morgenstern. Consider the magnitude of the disaster, should al Qaeda get a nuke. That magnitude, b, is so large that even multiplied by .01, the product swamps most other considerations. The problem with the argument was that it ignored the reduction in probability from any given risk mitigation, and the cost of that mitigation, which is why it was only half right, or choose your proportion, but it was not entirely foolish. The point is the product of the disaster and the probability of it occurring.
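To see both the half that works and the half that is missing, here is the doctrine run through the earlier rule; every number is invented solely to show the structure.

```python
# The 1% doctrine in expected utility terms, with invented numbers.
b = -10_000_000.0  # a catastrophic outcome: enormous magnitude
p = 0.01           # Cheney's one percent

print(b * p)  # -100000.0: the product is still huge; that half of the logic works

# The missing half: what does a given mitigation cost, and how much does it
# actually reduce p? Apply the earlier decision rule.
p_prime = 0.009  # suppose an expensive response only shaves off a tenth
c = 150_000.0    # and costs this much
print(-b * (p - p_prime) > c)  # False: even against a catastrophe,
                               # this particular mitigation fails the test
```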
Watch how easy it is to make a mistake, and then, having pointed to a potential adverse outcome, to exaggerate its risk. Even when the risks are pointed out to be low, those who listen forget the low probability.
Probabilities are hard. Assessing your own response to a situation is hard. Any error in this process can be exploited, and along the way, what remains necessary is the external reality check, which is exactly what cults seek to deny their members, and exactly what any insular group seeks to deny its members. When the source of the reality check becomes a perceived danger, you're fucked, because to whom do you turn?
Numbers should be easy. The big problem is usually the denominator. Numerators are generally simple to find, and whoever is trying to create the fear will focus on the numerator, hoping that you forget that the denominator is 330,000,000, or some smaller slice with a similarly smaller numerator. And when we aren't talking about deaths, there is the persistent problem of predicting your own reaction. Nevertheless, math is always an option. Stray from that, and problems ensue. More to come.
Zamla Mammaz Manna, "Ventilation Calculation," from Familjesprickor.