Taking a Stand is Most Important When You Seem Alone

A couple of bits on the importance of taking a stand, even if — especially if — it seems like you’re just one person trying to hold back the tide.

One comes from a review of the war tax resistance film An Act of Conscience. Pete Seeger appears in the film and says:

[There is] a big seesaw. At one end of the seesaw is a basket of rocks that’s down on the ground. At the other end of the seesaw is a basket half-full of sand. And some of us got teaspoons and we’re tryin’ to fill up that basket. Of course, most people are laughing at us. “Don’t you see? The sand is leaking out of the basket as fast as you’re puttin’ it in.” We say, “That’s true, but we’re getting more people with teaspoons all the time.” Some day you’re going to see that whole basket full of sand, and that whole seesaw’s gonna go [the other way] just like that. People will say, “Gee, how did it happen so sudden, us and our goddamn teaspoons.”

The next comes from FSK’s Guide to Reality, which points to another blog’s discussion of The Asch Conformity Experiment.

In this experiment, you’re shown a set of three vertical lines of different lengths, and then a fourth line, and then you’re asked which of the first three lines is the same length as the fourth one. It’s not a hard test. The correct answer is pretty obvious.

The catch is that you’re in a room full of people who, though they all look like test-takers like yourself, are actually confederates of the experimenter. They all give the same wrong answer, out loud. Now it’s your turn. Are you going to buck the popular consensus and give the right answer, or are you going to assume it’s you who’s crazy and give the consensus answer?

“Three-quarters of the subjects in Asch’s experiment gave a ‘conforming’ answer at least once. A third of the subjects conformed more than half the time.” But

Adding a single dissenter — just one other person who gives the correct answer, or even an incorrect answer that’s different from the group’s incorrect answer — reduces conformity very sharply, down to 5–10%.

However:

When the single dissenter suddenly switched to conforming to the group, subjects’ conformity rates went back up to just as high as in the no-dissenter condition. Being the first dissenter is a valuable (and costly!) social service, but you’ve got to keep it up.

This page has an example of one of the test cards, and a video of the test in progress. FSK’s blog provides, and links to, further commentary on the test and its implications for those of us with fringe opinions.

(And on that note, here are a few versions of The Emperor’s New Clothes.)


From time to time I mull over Arendt’s “banality of evil” concept. For instance, last May I mentioned Arne Johan Vetlesen’s criticisms. Now, this month’s edition of The Psychologist features an article — Questioning the banality of evil — that continues to build on (or tear down, depending on how you see it) Arendt’s influential theory.

There is a widespread consensus amongst psychologists that tyranny triumphs either because ordinary people blindly follow orders or else because they mindlessly conform to powerful roles. However, recent evidence concerning historical events challenges these views. In particular, studies of the Nazi regime reveal that its functionaries engaged actively and creatively with their tasks.

Re-examination of classic social psychological studies points to the same dynamics at work. This article summarises these developments and lays out the case for an updated social psychology of tyranny that explains both the influence of tyrannical leaders and the active contributions of their followers.

I don’t think the paper treats Arendt’s work very fairly, but it does give some good pointers to conflicting data and new interpretations.


Some short bits: