How do we know it isn't true?
The Herod Effect is a name I've decided to give to the situation where a lie is widely repeated as truth because it makes such a great story that people want it to be true. It's a bit wishy-washy because there are lots of reasons people might want a false rumour to be true. I mentioned two of them in my last post.
First, a kind of sour grapes. If someone is particularly rich or attractive or successful or whatever, many people feel a kind of schadenfreude when they're "taken down a peg or two". Gere was an impossibly handsome and successful man, living many people's dreams. Presumably people felt that the idea of his being gay (he isn't) somehow devalued him (it wouldn't).
Second, a wish to have one's biases confirmed. Gamergater types often feel that men somehow own all aspects of the video game space and that women are 'invading' when they participate or write about it. They felt that the false story of Quinn's supposed sexual and unethical behaviour matched their feelings about how women are ruining video games.
There are significant differences here in the reasons people wanted to believe the different stories. Perhaps the most significant is that Gamergaters often didn't seem to care whether the original story was true or not. They repeated it as a weapon and a justification for doing even more horrible things.
Even so, I still think the concept of The Herod Effect is reasonably sound. It's about the things viral lies have in common rather than an attempt to classify them. I use it as an example of a particular kind of threat to privacy. In a follow-up post, I'll discuss how The Herod Effect harms privacy in more detail, with some examples. In this one, I'll describe some of the reasons people might have for believing and/or repeating rumours without evidence.
Other reasons may occur to you; feel free to comment.
Bad reasons to believe stuff
- We're scared or anxious. A lot of false stories cater to our fears. Vaccines cause autism. Immigrants are all rapists. Women want to take 'our' jobs. When we're already worried about something, stories that confirm our fears are oddly comforting. They seem to confirm that we're right to be scared of that thing, justifying the investment we've put into our crackpot theories. We tend to require less evidence for things that confirm our fears. This is somewhat complicated by the fact that a lot of the things people are scared of are manufactured by the media in the first place. Presumably we want to believe in the existence of threats to satisfy some other bias (immigrants are terrorists, scientists are lying to us or whatever) and then want to believe in the things that confirm our newly manufactured biases. Humans are weird.
- The rumours are surprising but not too surprising. We like stories that shock us, but not so much that we have to actually expend effort in working out whether they're true. Stories that are shocking but still fit with our biases are the most durable. For example, we're accustomed to Trump saying idiotic things, so it would be easy to accept yet another idiotic Trump quote as true - perhaps no matter how extreme it was - because at this point it's pretty much background. Actually, that might be a bad example because there's literally nothing terrible that Trump wouldn't say. The stupidest, most ignorant, terrifying thing in the world might as well be attributed to Trump whether he said it or not. Let's say Bush instead. He said a lot of stupid things too, but one in particular that springs to mind is when he said that the French have no word for "entrepreneur". It wouldn't surprise me if he had said that, and it would probably make me hate him a little bit more, but he didn't say it.
- We trust people who tell us things above people who actually know things. People like Alex Jones are either fantastically uneducated about every single aspect of the world or they prefer to promote lies because they see it as advancing their cause. It's not always possible to tell the difference. If we believe a story someone tells us because of some other bias, we're more likely to believe other things they say, even if they don't fully tick our bias boxes. A person constantly spewing "interesting" "facts" is more credible than someone wedded to a highly specific conspiracy theory, even if the conclusion is the same. This is different to believing someone because they have previously presented exemplary evidence for the things they say. No evidence of previous careful research is required, we believe them because they shotgun "facts" at us. Once one seems credible due to bias, others are likely to seem more credible too.
- We trust people who are experts in one field to have expertise in others. We see this a lot with physicists, unfortunately. It might seem like a stereotype but examples abound. There is a surprising number of physicists who think they know all about biology without ever having studied it, for example, and say very stupid things about it. We have a tendency to believe experts when they say things we want to hear, even if their expertise is in something completely different.
- Conversely, we hate experts. The hatred and distrust of expertise is part of what put Trump in the White House. It is perfectly clear that people unqualified for a job aren't the best candidates for it, but Trump successfully argued the opposite. "Joe the plumber" and "the man on the Clapham omnibus" are other examples. The common person on the street is certainly entitled to an opinion and to have that opinion heard, but we wouldn't want that person to fly the plane we're on or perform surgery. Yet we do want non-experts to tell us what to think about things that require expertise. It's perverse.
- Repetition is very convincing. We tend to believe stories more when we hear them often, even if what we hear about them is debunking them. Climate change deniers' beliefs don't lessen when they consume stories debunking their silly arguments. Quite the reverse: mentioning climate change as either real or fake will tend to strengthen whichever belief you already happen to have.
- We are susceptible to what's already on our minds. The zeitgeist shifts, and what terrified us a few decades ago might not be so convincing as what terrifies us now. Some time ago people were frightened of witches and believed all sorts of rumours with tragic consequences. Belief in witchcraft is much depleted (although by no means gone), but the susceptibility to believing claims without evidence is now based on the fear of terrorism, among other things. The comparison between draconian measures against immigrants and actual witch hunts is apt. It's strange that we find it so hard to see this.
- We tend to believe things that are easily explained. Vaccines cause autism. X causes cancer. Obama is a Muslim. These are the sorts of fake ideas that tend to perpetuate. However:
- Conspiracy theories are justifications. People like simple fake facts ("X ordered 9/11") but they really love complicated explanations of why that's true, regardless of how they fit with the facts. Simple statements with complicated explanations are gold. The more complicated the justification, the better, especially if (as they inevitably do) they drag in other simple fake facts. I think what's happening here is that the complexity of the conspiracy argument causes people to treat it as abstract rather than something that supposedly happened in the actual real world. When we treat something as abstract it's easy to ignore inconvenient detail. That's what abstract thinking is.
- Numbers, even ludicrous numbers, are key. In my day, the pop star in question was Marc Almond. I've heard the same story about all sorts of other people, though; I won't provide a list. The rumour is that a given pop star was admitted to hospital and had a large quantity of sperm removed from his or her stomach. There is always a precise quantity of sperm specified. Sometimes it's a gallon. Sometimes it's ten gallons(!). It's always a wildly improbable amount. Ten gallons. That's 80 pints. Clearly more than a human stomach could possibly contain. And I'm not going to work out how many ejaculations that would require. But there's a number, so people believe it. If it were "some" sperm, nobody would. Weird, bullshit specifics are somehow more convincing than actual evidence.
There are other bad reasons to believe things, but this post is already long enough and I'm not sure I've made my point yet. The point is that these things have more in common than not: bad reasons to believe things fuel The Herod Effect, and that effect is worth considering in itself because of the horrible consequences it can have for privacy and other things. It's important to know why people believe stupid things, but also important to recognise when it's happening even when we don't understand the specifics.
In my next post I want to tie all this stuff in with privacy violations. The Herod Effect is a specific threat to privacy which is not often considered. I'll explain why.