David McRaney. How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion

McRaney's book is worth reading in its entirety; below I have excerpted the core concepts. As I read it, McRaney is reporting on a contemporary form of dialectic, a Socrates in jeans and running shoes. The old Socrates came off as adversarial: he said he wanted what was best for his dialectical partners, but they felt all he wanted was to best them. McRaney's primary insight is that you can't talk someone out of a belief by giving them facts or counter-arguments, because belief is not about reason but about identity -- tribal affiliation (Burke's "Identification"). To ask someone to change their mind is therefore to invite them to have an identity crisis.

As he points out, there are people whose identity is grounded in keeping an open mind: academics, scientists, lawyers perhaps, hopefully judges. These people experience cognitive dissonance when they are tempted to jump to a conclusion, the opposite of most people. But even for them, open-mindedness is often domain specific; outside their field of expertise they may be just as jumpy as anyone else. I advocate cultivating sophisticated skepticism to help people take their beliefs less personally, to recognize that from the right perspective many perspectives are intelligible. That meta-perspective encourages a person to understand before they advocate or condemn.

And obviously people resist psychic threats. Rather than challenging the validity of a partner's belief, McRaney found it more effective to first ask what level of commitment they have to that belief: on a scale of 1 to 10, how certain are you? Once the respondent has declared their level of commitment, the interviewer asks why they believe what they believe and how they got there. The interviewer wants to keep the respondent from settling on an a priori justification, like The Truth, The Bible, or Science says so. So the interviewer keeps asking why, in varied but gentle ways, until the respondent recognizes they don't really know why. By "helping" the respondent moderate their level of commitment, the interviewer primes them to rethink their position, at least a little, and that small shift reduces the psychic threat that changing their mind presents. Rather than becoming a different person, they remain the same person with a slightly different attitude toward just one of their beliefs. Sometimes the respondent doesn't even notice they changed their mind. The only persuasion is self-persuasion.

Engaging in this kind of dialectical discourse is dangerous because it is psychologically invasive, and one should never participate unless one is truly willing to let the other person think out loud without trying to guide them to a preferred conclusion; the goal is self-discovery, not validation of an opinion. Along the way, the interviewer should also be willing to explore the how and why of their own beliefs. The interviewer isn't trying to change the respondent's mind. He or she is simply "helping" them better understand how they came to believe what they think they believe, which most of the time undermines their commitment, because even a couple of whys will show that most of our beliefs are not well-founded. There's a bit of a lie here, however. The people McRaney interviewed were engaged in door-to-door political activism. Their goal was to have the people they interviewed talk themselves into changing their own minds, or at least become open to changing them later.

The methods McRaney reports on appear to be replicably effective. They certainly seem gentler than Socrates' method, which often irritated his partners and entrenched their positions.


Page xvi the surprising psychology behind how people modify and update their beliefs, attitudes, and values; and how to apply that knowledge to whatever you believe needs changing, whether it's within one mind or a million.

Page xvii We will see that the speed of change is inversely proportional to the strength of our certainty, and certainty is a feeling: somewhere between an emotion and a mood, more akin to hunger than to logic. Persuasion, no matter the source, is a force that affects that feeling.

Page xvii Persuasion is not coercion, and it is also not an attempt to defeat your intellectual opponent with facts or moral superiority, nor is it a debate with a winner or a loser. Persuasion is leading a person along in stages, helping them to better understand their own thinking and how it could align with the message at hand. You can't persuade another person to change their mind if that person doesn't want to do so, and as you will see, the techniques that work best focus on a person's motivations more than their conclusions.

Page xviii All persuasion is self-persuasion.

Page xx we must avoid debate and start having conversations. Debates have winners and losers, and no one wants to be a loser. But if both sides feel safe to explore their reasoning, to think about their own thinking, to explore their motivations, we can each avoid the dead-end goal of winning an argument. Instead, we can pursue the shared goal of learning the truth.

Page 25 [the researchers] emphasized something they called "radical hospitality," a form of selfless concern and energetic friendliness akin to what you might experience at a family reunion. From the moment volunteers arrived at ...

Page 29 "There is no superior argument, no piece of information that we can offer, that is going to change their mind," he said, taking a long pause before continuing. "The only way they are going to change their mind is by changing their own mind--by talking themselves through their own thinking, by processing things they've never thought about before, things from their own life that are going to help them see things differently."

Page 30 it's their story that should take up most of the conversation. You want them to think about their own thinking. Once that real, lived memory was out in the open, you could (if done correctly) steer the conversation away from the world of conclusions with their facts googled for support, away from ideological abstractions and into the world of concrete details from that individual's personal experiences. It was there, and only there, he said, that a single conversation could change someone's mind.

Page 35 The idea is to move forward, make the person feel heard and respected, avoid arguing over a person's conclusions, and instead work to discover the motivations behind them. To that end, the next step is to evoke a person's emotional response to the issue.

Page 36 Instead of arguing, the canvasser listens, helping the voter untangle their thoughts by asking questions and reflecting back their answers to make certain they are hearing them correctly. If people feel heard, they further articulate their opinions and often begin to question them.

Page 36 As people explain themselves, they begin to produce fresh insights into why they feel one way or another.

Page 36 Instead of defending, they begin contemplating, and once a person is contemplating, they often produce their own counterarguments, and a newfound ambivalence washes over them. If enough counterarguments stack up, the balance may tip in favor of change. ... if he could evoke a memory from her own life that contradicted the reasoning she had shared, she might notice the conflict without him having to point it out.

Page 83 People with broadly similar experiences and motivations tend to disambiguate in broadly similar ways, and whether they find one another online or in person, the fact that trusted peers see things their way can feel like all the proof they need: they are right and the other side is wrong factually, morally, or otherwise.

Page 87 "cognitive empathy": an understanding that what others experience as the truth arrives in their minds unconsciously, so arguments over conclusions are often a waste of time. ... The better path ... would be for both parties to focus on their processing, on how and why they see what they see, not what.

Page 105 When we first suspect we may be wrong, when expectations don't match experience, we feel viscerally uncomfortable and resist accommodation by trying to apply our current models of reality to the situation. It's only when the brain accepts that its existing models will never resolve the incongruences that it updates the model itself by creating a new layer of abstraction to accommodate the novelty. The result is an epiphany, and like all epiphanies it is the conscious realization that our minds have changed that startles us, not the change itself.

Page 107 Kuhn was suggesting that when we update, it isn't the evidence that changes, but our interpretation of it.

Page 116 Unless otherwise motivated, the brain prefers to assimilate, to incorporate new information into its prior understanding of the world. In other words, the solution to "I might be wrong" is often "but I'm probably not."... To orient ourselves properly, we update carefully. So if novel information requires us to update our beliefs, attitudes, or values, we experience cognitive dissonance until we either change our minds or change our interpretations.

Page 165 humans value being good members of their groups much more than they value being right, so much so that as long as the group satisfies those needs, we will choose to be wrong if it keeps us in good standing with our peers.

Page 166 In times of great conflict, where groups are in close contact with each other, or communicating with each other a lot, individuals will work extra hard to identify themselves to each other as us and not them.

Page 168 The average person will never be in a position where beliefs on gun control or climate change or the death penalty will affect their daily lives. The only useful reason to hold any sort of beliefs on those issues, to argue about them, or share them with others is to "convey group allegiance."

Page 170 Scientists, doctors, and academics are not immune. But lucky for them, in their tribes, openness to change and a willingness to question one's beliefs or to pick apart those of others also signals one's loyalty to the group. Their belonging goals are met by pursuing accuracy goals. For groups like truthers, the pursuit of belonging only narrowly overlaps with the pursuit of accuracy, because anything that questions dogma threatens excommunication.

Page 194 Research shows people are incredibly good at picking apart other people's reasons. We are just terrible at picking apart our own in the same way.

Page 203 psychology defines beliefs as propositions we consider to be true. The more confidence you feel, the more you intuit that a piece of information corresponds with the truth. The less confidence, the more you consider a piece of information to be a myth. ... Attitudes, however, are a spectrum of evaluations, feelings going from positive to negative that arise when we think about, well, anything really. We estimate the value or worth of anything we can categorize, and we do so based on the positive or negative emotions that arise when that attitude object is salient. Those emotions then cause us to feel attracted or repulsed by those attitude objects, and thus influence our motivations. Most importantly, attitudes are multivalent. We express them as likes or dislikes, approval or disapproval, or ambivalence when we feel both.

Page 226 [Street Epistemology] HERE ARE THE STEPS:
1. Establish rapport. Assure the other person you aren't out to shame them, and then ask for consent to explore their reasoning.
2. Ask for a claim.
3. Confirm the claim by repeating it back in your own words. Ask if you've done a good job summarizing. Repeat until they are satisfied.
4. Clarify their definitions. Use those definitions, not yours.
5. Ask for a numerical measure of confidence in their claim.
6. Ask what reasons they have to hold that level of confidence.
7. Ask what method they've used to judge the quality of their reasons. Focus on that method for the rest of the conversation.
8. Listen, summarize, repeat.
9. Wrap up and wish them well.

Page 233 Anthony emphasized that street epistemology is about improving people's methods for arriving at confidence, not about persuading someone to believe one thing more than another.

Page 234 Ask outright, "With your consent, I would like to investigate together the reasoning behind your claims, and perhaps challenge it so that it either gets stronger or weaker—because the goal here is for both of us to walk away with better understandings of ourselves," or something like that. And if that isn't your goal, it won't work. You can't fake it.

Page 236 First, ask a nonthreatening question that's open-ended. Something like, "I've been reading a lot about vaccines lately, have you seen any of that?" Next, just listen for a while. Then communicate your curiosity and establish rapport by asking a nonjudgmental follow-up question. Next, reflect and paraphrase. Summarize what you've heard so far to make the other person feel heard and respected. Then look for common ground in the person's values. You might not agree with their argument, but you can communicate that you too have values like theirs, fears and anxieties, concerns and goals like they do. You just think the best way to deal with those issues is slightly different. Then share a personal narrative about your values to further connect. Finally, if your views have changed over time, share how.

Page 276 Researchers say, in short, it was about trust—we don't live in a post-truth world, but a post-trust world. A general distrust of media, science, medicine, and government makes a person very unlikely to get vaccinated no matter how much information you throw at them, especially when the people they do trust share their attitudes.