My opinion, my
conviction, gains infinitely in strength and success, the moment a second mind
has adopted it.
— Novalis
In the aftermath of the shocking revelations of scandal we’ve seen all too often in the buddhist and yoga communities lately, one salient point is frequently ruminated upon, and just as often with great anger: why do so many practitioners “enable” the abusive behavior? How is it
that people can know a guru/teacher is engaging in inappropriate and damaging behavior
and not speak up? And most hypotheses revolve around psychological causes and
conditions. Here I’d like to look into another potential cause that
metacognition studies (involving thinking about how we think and perceive)
offer: the imagined agreement of others and the exaggerated impressions of
social support.
It should be obvious that what we believe is heavily
influenced by what we think others believe. One typical example I’ve often
found humorous is the office collection for a co-worker’s gift to celebrate the
birth of her new baby. When asked for a donation towards the gift, most of us
try to find out how much others have given and then decide our own contribution
accordingly. I’ve often wondered what operations lie behind the first person’s calculations!
Within limits, the tendency to be influenced by the
beliefs of others is valid and justified. What others think and how they behave
provide us with important sources of information about what is correct, valid
or appropriate. However, our ability to effectively utilize this information is
compromised by a systematic defect in our ability to accurately estimate the
beliefs and attitudes of others. We tend to exaggerate the extent to which
others hold the same beliefs we hold, and because of this tendency to think our
beliefs are shared by others these beliefs are more resistant to change than
they would be otherwise.
In a form of projection (which we usually think of in Freudian terms as the unwitting projection onto others of unwanted or distasteful characteristics one is unaware of possessing), we also tend to attribute to others characteristics that we do know we possess. Thus, we tend to over-estimate how many people like what we like. This tendency has come to be called the “false consensus effect.”
The false consensus effect refers to the tendency
for people’s own beliefs, values and behavior to bias their estimates of how
widely such views and behaviors are shared by others. For example, fans of country music estimate that more people like country music than non-fans do; yoga practitioners estimate that more people practice yoga than non-practitioners do. Perhaps relevant to the “guru scandals” that have
come to our attention, one university experiment involved asking students if
they would be willing to walk around campus wearing a sandwich-board sign with
the message “REPENT.” There were fairly substantial percentages of those who
would be willing and of those who would not. After agreeing or declining to
wear the sign, the students were then asked to estimate the percentage of their
peers who would agree or decline. As the false consensus effect would predict,
the students’ estimates reflected their own choices: those who had agreed to
wear the sign estimated that 60% would do so while those who refused thought
only 27% would agree to wear it!
Note that the false consensus effect is relative in nature; it is not that people think their beliefs are shared by a majority of other people, but simply that people’s estimates of the commonness of a given belief are positively correlated with their own beliefs. It is not that religious fundamentalists believe most people share their beliefs, but rather that their estimates of the percentage of religious fundamentalists in the general population can be counted on to exceed the estimates made by their more secular peers.
Why should this be so? Research seems to point to
the mediating role of a host of cognitive and motivational variables. For
example, one motivational factor stems from our desire to maintain a positive
valuation of our own belief or judgment. If we have a strong emotional
investment in a belief we tend to exaggerate the extent of perceived social
support for the belief. Interestingly, research shows that people are
particularly likely to exaggerate the extent to which attractive, respected and
well-liked people share their beliefs.
A major factor behind the false consensus effect
that we can definitely see in cults and cult-like communities (such as we’ve
seen in the Anusara, Diamond Mountain, and Mt. Baldy communities to name three of the more recent and
infamous scandals) is the more generalized tendency to selectively expose
ourselves to information that supports our beliefs. Conservatives read
conservative periodicals and watch Fox News and thus receive support for their
conservative political ideology; religious creationists read creationist
literature rather than contemporary evolutionary biology and thus bolster their
creationist beliefs. With the fracturing of discourse found on the internet,
where we can choose to follow blogs and websites that support our views, it takes
a concerted effort and willingness to seek out opposing viewpoints.
Besides selectively exposing ourselves to a biased set of information relevant to a particular belief, we are also exposed to a biased sample of people and their views and beliefs. Liberals associate with other liberals; yoga practitioners associate with other yoga practitioners. Similarity of beliefs, values and habits is in fact one of the primary determinants of those with whom we associate. This is consciously valued, celebrated and encouraged in the buddhist and yoga communities as “the company of like-minded people,” or sangha.
And while such association does indeed have great benefit, if a sangha grows insular and isolated, it can lead to the cultishness we also often see. The importance of “transparency” for the health of a community becomes quite clear and pronounced once we understand that if we become insular in our associations, the false consensus effect will lead us to see our beliefs as “common” because they are shared by “everyone we know.”
There are other factors that contribute to the false
consensus effect. One more I would
like to touch upon here is the mechanism that involves the resolution of the
ambiguities inherent in most issues, choices, or situations. Before we decide
what we think about some issue, we have to be clear about the terms. For
instance, if I’m asked for my opinion about Christianity, it would be helpful
to know what the term
“Christianity” refers to: the Pope and Catholicism, Billy Graham’s Evangelicalism, or Radical Christianity. Knowing
what is meant will not only help determine my own opinion, but will also
influence my estimates of the preference of others.
With the false consensus effect seen to be as
prevalent as it is, the question becomes why aren’t our misconceptions about
what other people think corrected by the feedback we receive from others?
Shouldn’t we expect others to let us know if our beliefs or assumptions about
them are wrong? While in the most bizarre and erroneous cases we can count on
being called out, the fact is that generally such corrective feedback is not as
common as we might think. And this is yet another factor that leads to the cult-like,
group-think behind the silence that allows dysfunction to breed and persist in
closed communities. To some extent, cult members don’t get the corrective
feedback from others that their beliefs may be wrong, irrational and harmful or
that certain behaviors may be dysfunctional, because they are associating with
those who share their beliefs, values and behavior. However, even more telling,
it has been shown that even when we do cross paths with those whose beliefs and
attitudes conflict with our own, we are rarely challenged. People are generally reluctant to openly question
other people’s beliefs.
I should clarify that it’s adults who are generally reluctant to do so; children tend to be brutally honest and revealing. Just think: as an adult have you ever
gone to the restroom while out at a social gathering to find your zipper undone
or some green salad remnant
obviously caught in your teeth? Yet I’m sure we can all remember how
gleefully our grammar-school friends would chant and point out our open fly or
the bit of food caught in our teeth!
I can speak from my own experience as a naturalist: when I lecture at yoga centers and hear some bit of new age, magical thinking, it has taken me years to get over my reluctance and discomfort in offering contradictory information and evidence. Our reluctance to voice our
disagreements has been repeatedly demonstrated in psychological research:
people generally try to avoid potential conflict with others. Such reticence is
exacerbated in yoga and buddhist communities where any hint of dissent or
critical thinking is often met with silence or charges of “wrongful speech.” In
fact, in many contemporary communities, “right speech” has become a kind of
yoga/buddhist political correctness, marginalizing and devaluing any real
difference of opinion.
The emphasis on “right speech” amplifies the
cognitive tendency we already possess to avoid the unpleasant emotions produced
by disagreement and criticism. In social situations, people feign agreement to
head off conflict and disharmony. Social psychology tells us that we tend to
like people who are like ourselves and so the flip side of this, that if we
express disagreement we risk being disliked and ostracized, keeps us from
speaking up.
The buddhist and yoga communities tend to be
extremely uncomfortable with disagreement, conflict and criticism. And again,
such discomfort is a more particularized example of a general tendency shared
by us all. In everyday life, the hesitancy to speak up often has only minor consequences.
However, there are situations where this tendency can contribute to great harm
for individuals and for the community.
Psychologist Irving Janis’ work on “Groupthink” shows that even members of highly cohesive advisory
groups whose task is to suggest effective courses of action can become
paralyzed by the concern with maintaining apparent consensus within the group
and will sometimes censor their personal reservations to accomplish it. Janis
quotes Arthur Schlesinger’s account of the Bay of Pigs debacle
where Schlesinger confesses to “having kept so silent during those crucial
discussions in the Cabinet Room…. I can only explain my failure to do more than
raise a few timid questions by reporting that one’s impulse to blow the whistle
on this nonsense was simply undone by the circumstances of the discussion.”
Over and over again, whenever a scandal is finally
revealed, the questions immediately arise as to how such behavior had been
allowed to continue. When the John Friend scandal broke, it became clear that
many senior teachers had known of his breach of ethics; when the tragedy at
Diamond Mountain was made known, it became clear that many had known of Lama Christie’s apparent magical thinking, irrationality and narcissistic disconnect
from reality and her
husband Ian’s instability and aggressive tendencies; and when it was finally revealed that Sasaki Roshi had been a sexual predator over the course of 50 years or more, it also became clear that many had known about his despicable behavior all along.
Because of these cognitive tendencies, which are then exacerbated by yogic and buddhist teachings that can be twisted to inculcate a culture that represses expression and diversity of opinion, the failure to
express dissent is all too prevalent and has led to severe and painful
consequences. Because of the culture of silence and the false consensus effect,
our beliefs and behavior all too often lack healthy scrutiny and debate. This
lack of critical discussion leads us to exaggerate the consensus for our
beliefs and behavior. Bolstered by such a false sense of acceptance and social
support, our beliefs may strike us as more valid than is actually the case, and
they become ever more resistant to logical and empirical challenge.
I wish to end by expressing my gratitude and
appreciation to those, like Matthew Remski, who have taken on the generally
thankless task of speaking up and speaking out. May such critical thinking and compassionate inquiry continue to grow within the contemporary buddhist and yoga communities so that perhaps we can finally correct (and compensate for) some of our cognitive errors.