Groupthink and false consensus were already dangerous; then Facebook showed up

Ever thought something was a bad idea -- like walking down a dark street, or adding a feature to a product you thought no one would want -- but did it anyway because everyone else around you seemed to think it was ok?
Group process can be a wonderful thing. But it can be a terrible thing. Immerse yourself in a crowd of like-minded thinkers and they're quite likely to convince you to make some pretty big mistakes; they'll take whatever blind spot you have and make it bigger and bigger. Even the smartest people in the world fall for this. Groupthink rather famously was blamed for the "go fever" that led to the Challenger Space Shuttle disaster, as you'll see below. The only antidote for false consensus and groupthink is to actively seek out diverse opinions and invite disagreement. In fact, wonderful things happen to groups -- from juries to corporate boards -- when they are constructed with well-crafted, diverse membership. You *know* this is true somewhere inside. Have you ever muttered, "They sure didn't have a woman help build this bathroom," or "They didn't have a mechanic at the table when the engineers designed this impossible-to-repair car engine"?
Sadly, right when the world seems to need that most, technology is undermining our ability to avoid these kinds of bubbles.
I looked at this issue recently for PeopleScience.com. Below is an excerpt. You can read the entire piece at their site.
When was the last time you used software or a website and couldn't figure out how to do something simple, like click a “buy” button or save a file? Probably five minutes ago. Almost certainly, your confusion was born in a cubicle farm, where a set of programmers and designers convinced each other that their design was cool, or hip, or easy to use. And if those geeks could see you in your moment of confusion, it's likely they'd secretly call you stupid.
In all likelihood, you're not dumb (after all, you are reading PeopleScience.com). Instead, they were swayed by the false consensus effect and groupthink when making their design decisions.
False consensus is baked deep into human nature. Most people feel better about themselves when they are surrounded by people who think like they do – or they believe that most people think like they do. It's not always a bad thing. Anyone who's ever tried to pick a lunch spot with five friends knows that reaching consensus can sometimes be hard. Having people in the group who tend to just do what everyone else does helps create an easier path to consensus.
But sometimes, false consensus and groupthink can lead to a disaster. Just consider pretty much every bubble the stock market has ever endured, and all the people who've lost their shirts by convincing each other about the latest “can't miss” investment.
False consensus and groupthink are two sides of the same coin; let’s take false consensus first. The term was popularized by a group of Stanford professors, led by Lee Ross, in a 1976 study. For the most part, it means “Everyone thinks that most people think the way they think.” The professors ran a series of clever experiments to demonstrate this. For example: You get a $20 speeding ticket, but the cop made mistakes filling it out. Would you contest it? Would others contest it? Or this funny example: the team asked subjects to wear a sandwich board around campus as an advertisement. Would you do it? How many others do you think would?
Invariably, whatever answer test-takers gave about themselves, they believed a majority of others would do the same. For example, sandwich-board wearers presumed 64% of others would say yes, too. Those who refused to perform the stunt assumed half as many people would. Meanwhile, those who just paid the $20 ticket assumed nearly three-quarters of others would make the same decision. Those who said they would contest it believed more than half of others would do the same. And so on.
Like web designers who think, "Everyone would look for that buy button on the lower-right-hand corner," Ross's subjects just assumed most people think like they do.
You see this all the time. If you care about the environment, you assume most people do, too. If you hate the Yankees, you believe everyone else does, too.
There is a darker side to Ross's tests, however. When asked about the supposed minority group who thinks differently than they do, test subjects ascribed all kinds of extreme characteristics to them – characteristics like laziness, miserliness or selfishness. So not only do they think differently, they are different, too.
"To put it a little crassly: people tend to assume that those who don’t agree with them have something wrong with them,” wrote Psychologist Jeremy Dean on PsyBlog. “It might seem like a joke, but it is a real bias that people demonstrate.”
That should make you shudder a little. It's easy to imagine how false consensus could poison political debate. Read any Facebook discussion about politics and you can see this kind of belittling of other points of view in sharp relief. You also might wonder if technology makes false consensus even worse than it already is, and there’s a good argument that it does. False consensus is partly a function of availability bias: We tend to draw conclusions based on the information that is most available to us. Algorithmically fed echo chambers seem designed to fuel availability bias, false consensus and the demonizing of those who think differently.
Groupthink is a little different; it's most obvious during market frenzies. Economist Floyd Norris once described it as being “colorblind in a sea of red flags.” You've probably seen it at your office, too, when a bad idea seems to take hold of co-workers and they hold on for dear life. It's sometimes known by its more colloquial name, “wishful thinking,” or its darker name, "willful blindness."
The term groupthink was first introduced to the wider world in the November 1971 issue of Psychology Today, in an article by psychologist Irving Janis. It's often associated with the Vietnam War, but it reached real notoriety during the painstaking investigation of the Space Shuttle Challenger disaster. The investigation blamed a “go fever” at NASA; solid rocket booster project manager Lawrence Mulloy blamed “groupthink” during his Congressional testimony. Rogers Commission member Richard Feynman pointed to a wide discrepancy in risk assessments between engineers and NASA managers, who were suffering from a collective delusion about the risks posed by that doomed mission.
“It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life,” he wrote. “The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management. What are the causes and consequences of this lack of agreement? Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask ‘What is the cause of management’s fantastic faith in the machinery?’”
Janis, when he defined groupthink, offered up several root causes: illusions of invulnerability, unquestioned beliefs, stereotyping, self-censorship and the presence of mindguards – people in the group who consciously or unconsciously control information flow to decrease the odds of dissent.
Tragically, some research has found that denial-fueled groupthink is more likely when the group involved is likely to suffer losses, hinting at the reason for NASA’s go fever.
The best way to fight groupthink and false consensus is to invite dissent. That’s always easier said than done, but ensuring diversity in groups is a good head start. A growing body of research shows that diverse groups perform better. For example, racially mixed juries deliberate more and reach better decisions, according to one study.
Akshaya Kamalnath, a lecturer at Deakin Law School in Melbourne, recently summarized research into efforts to create more gender-diverse corporate boards as a solution to corporate groupthink.
“Quantitative studies ... suggest that an increase in the number of female directors is associated with an amplified discussion of ‘tough issues,’” she wrote.
Appointing a devil’s advocate for groups is often cited as a good step, too, but better still is to make sure that burden doesn’t fall on a single person. Organizations should also reward truth-tellers and invite free speech; equip staff for difficult conversations; and build in time to reflect on and revisit tough decisions, suggests University of North Carolina professor Claudia Fernandez.
Also, note that when a group feels threatened, it’s more likely to assume a bunker mentality and slip into groupthink. The more NASA felt the need to justify the space program, the less its management felt it could afford a launch delay.
As for false consensus: Good interactive software designers live by the mantra, “You are not the user! The user is the user!” If you think everyone thinks like you, know you are probably wrong. So no matter what you make, test. And whatever you make – software, socks, automobiles – don’t simply validate your creations. Investigate instead.