Thursday 14 May 2009

Accepting Differences

Being different is all part of being human and makes life the rich tapestry that it is, rather than a pretty dull monochromatic print. In our relationships, be they at work or at home, it is fundamentally important that we learn how to accept differences as just that: different is not good, not bad, just different. From here we can find better solutions, rather than shoe-horning each other or events into any one agenda. This week's blog, however, looks more at the organisational effects of not accepting differences.

So what if you work with someone who finds it hard to accept differences, or in an organisation where conformity is a strongly held value? This is a fairly typical symptom of teams that are stuck and of organisations that are not performing particularly well. One important dimension of a healthy team or organisation is its ability to invite and incorporate differences of opinion, different perspectives and even disagreement, in order to achieve balanced decision-making. No one person can possibly anticipate all eventualities, and where challenge or objection is not permitted, decision-making becomes stunted and the group extremely vulnerable to being left behind in the marketplace, to losing good staff members or even to catastrophic events. There are many examples of how this lack of open challenge has contributed to unfortunate events, including disasters. The Space Shuttle Challenger disaster, for example, occurred on January 28, 1986, when the US Space Shuttle Challenger broke apart 73 seconds into its flight, killing its seven crew members, including the civilian teacher Christa McAuliffe. The disaster resulted in a 32-month hiatus in the shuttle program and the formation of the Rogers Commission, a special commission appointed by US President Ronald Reagan to investigate the accident. The Rogers Commission found that NASA's organisational culture and decision-making processes had been a key contributing factor in the accident.

To cut a long story short, NASA managers had known since 1977 that the design of the solid rocket boosters (SRBs) by engineering contractor Morton Thiokol contained a potentially catastrophic flaw in the O-rings, but had failed to address it properly. They also disregarded warnings from engineers about the dangers of launching on such a cold day and failed to report these technical concerns adequately to their superiors. This could be explained as a certain arrogance attached to holding more influential positions in the hierarchy and dismissing voices from lower down (and from outside NASA), or it could be attributed to flawed decision-making processes. The Rogers Commission report made a much more general observation: it was the culture of the organisation to make these "top-down", closed decisions, despite plenty of evidence being supplied to the contrary.

An entire area of psychology has been devoted to studying this phenomenon of the group "filtering out" inconvenient truths or differences of opinion, which has been called "Groupthink". Groupthink is a mode of thought exhibited by group members who try to minimise conflict and reach consensus by shutting down, or simply not inviting, critical testing, analysis and evaluation of ideas. Individual creativity, uniqueness and independent thinking are lost in the pursuit of group cohesiveness, as are the advantages of the reasonable balance in choice and thought that might normally be gained by making decisions as a group. Irving Janis, the researcher who pioneered the study of groupthink, provides seven pointers on how organisations can overcome the natural but potentially limiting, or even fatal, tendency for groups to seek agreement:

1. Leaders should assign each member the role of “critical evaluator”. This allows each member to freely air objections and doubts.

2. Higher-ups should not express an opinion when assigning a task to a group.

3. The organisation should set up several independent groups, working on the same problem.

4. All effective alternatives should be examined.

5. Each member should discuss the group's ideas with trusted people outside of the group.

6. The group should invite outside experts into meetings. Group members should be allowed to discuss with and question the outside experts.

7. At least one group member should be assigned the role of Devil's advocate. This should be a different person for each meeting.

By following these guidelines, groups can minimise the risk of groupthink. After the Bay of Pigs invasion fiasco, John F. Kennedy sought to avoid groupthink during the Cuban Missile Crisis. During meetings, he invited outside experts to share their viewpoints and allowed group members to question them carefully. He also encouraged group members to discuss possible solutions with trusted members within their separate departments, and he even divided the group into various sub-groups to partially break the group cohesion. JFK was deliberately absent from the meetings, so as to avoid pressing his own opinion. Ultimately, the Cuban Missile Crisis was resolved peacefully, thanks in part to these measures. (This example has been adapted from Wikipedia.)

Clearly in organisational settings, we are all vulnerable to groupthink, a point that continues to be demonstrated to this day in groups and organisations across the world, including some of those I have worked with.

I haven't discussed how seeking cohesion at all costs affects intimate relationships, but clearly there is a downside here too, where cohesion becomes more important than solving real problems. I will try to write more about this in next week's blog.

Here are a few references on the effects of groupthink on organisational decision-making, including what sorts of environments make groups particularly vulnerable:

McCauley, C. (1989). The nature of social influence in groupthink: compliance and internalization. Journal of Personality and Social Psychology, Vol. 57, No. 2, pp. 250-260.

Schafer, M. and Crichlow, S. (1996). Antecedents of groupthink: a quantitative study. The Journal of Conflict Resolution, Vol. 40, No. 3, pp. 415-435.

Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press.
