Conformity and the Dangers of Group Polarization

—Cass R. Sunstein

The following is an excerpt from Cass R. Sunstein’s new book Conformity: The Power of Social Influences, and is adapted from a post that originally appeared in Quillette.

When people talk to one another, what happens? Do they compromise? Do they move toward moderation? The answer is now clear, and it is not what intuition would suggest: members of deliberating groups typically end up in a more extreme position, consistent with their tendencies before deliberation began. This is the phenomenon known as group polarization. Group polarization is the usual pattern with deliberating groups, having been found in hundreds of studies involving more than a dozen countries, including the United States, France, and Germany. It helps account for many terrible things, including terrorism, stoning, and “mobbing” in all its forms.

It follows that a group of people who think that immigration is a serious problem will, after discussion, think that immigration is a horribly serious problem; that those who approve of an ongoing war effort will, as a result of discussion, become still more enthusiastic about that effort; that people who dislike a nation’s leaders will dislike those leaders quite intensely after talking with one another; and that people who disapprove of the United States, and are suspicious of its intentions, will increase their disapproval and suspicion if they exchange points of view. Indeed, there is specific evidence of the latter phenomenon among citizens of France.

When like-minded people talk with one another, they usually end up thinking a more extreme version of what they thought before the conversation began. It should be readily apparent that enclaves of people, inclined to rebellion or even violence, might move sharply in that direction as a consequence of internal deliberations. Political extremism is often a product of group polarization.

In the United States, group polarization helped both Barack Obama and Donald Trump to ascend to the presidency. Speaking mostly with one another, Obama supporters and Trump supporters became intensely committed to their candidate. On Facebook and Twitter, we can see group polarization in action every minute of every day. As enclaves of like-minded people proliferate online, group polarization becomes inevitable. Sports fans fall prey to group polarization; so do companies deciding whether to launch some new product. It should be easy to see that group polarization is at work on university campuses, in feuds, in ethnic and international strife, and in war.

One of the characteristic features of feuds is that members of a group embroiled in a feud tend to talk only to one another, fueling and amplifying their outrage, and solidifying their impression of the relevant people and events. Many social movements, both good and bad, become possible through the heightened effects of outrage; consider the civil rights movement of the 1960s (and the contemporary #MeToo movement). Social enclaves are breeding grounds for group polarization, sometimes for better and sometimes for worse.

There is another point, of special importance for purposes of understanding extremism and tribalism: In deliberating groups, those with a minority position often silence themselves or otherwise have disproportionately little weight. The result can be “hidden profiles”—important information that is not shared within the group. Group members often have information but do not discuss it, and the result is to produce bad decisions (or even worse).

Consider a study of serious errors within working groups, both face-to-face and online.1 The purpose of the study was to see how groups might collaborate to make personnel decisions. Résumés for three candidates, applying for a marketing manager position, were placed before the groups. The attributes of the candidates were rigged by the experimenters so that one applicant was clearly the best for the job described. Packets of information were given to subjects, each containing a subset of information from the résumés, so that each group member had only part of the relevant information. The groups consisted of three people, some operating face-to-face, some operating online.

Two results were especially striking. First, group polarization was common, as groups ended up in a more extreme position in the same direction as the original thinking of their members. Second, almost none of the deliberating groups made what was conspicuously the right choice, because they failed to share information in a way that would permit the group to make an objective decision. Members tended to share positive information about the winning candidate and negative information about the losers, while also suppressing negative information about the winner and positive information about the losers. Their statements served to reinforce the movement toward a group consensus rather than to add new and different points or to promote debate.

This finding fits with the more general claim, backed by a lot of evidence, that groups tend to dwell on shared information and to neglect information that is held by few members. It should be unnecessary to emphasize that this tendency can lead to big blunders—in governments, in think tanks, and on the Left and the Right.

Keep reading on Quillette…

Cass Sunstein will be in conversation with Harper’s Magazine at Book Culture, 450 Columbus Ave, New York, NY 10024, on Friday, June 14, 2019, at 7:00 PM. Join us there to learn more about the book and get your signed copy!


Cass R. Sunstein is the Robert Walmsley University Professor at Harvard. From 2009 to 2012, he served as the Administrator of the White House Office of Information and Regulatory Affairs. He is the bestselling co-author of Nudge: Improving Decisions about Health, Wealth, and Happiness and The World According to Star Wars.



1. See R. Hightower and L. Sayeed, “The Impact of Computer-Mediated Communication Systems on Biased Group Discussion,” Computers in Human Behavior 11, no. 1 (Spring 1995).
