Innocent corruption. In all institutions that do not feel the sharp wind of public criticism (as, for example, in scholarly organizations and senates), an innocent corruption grows up, like a mushroom.
How do good people go bad?
How do bad people see themselves as good?
Those simplistic questions cover a complex topic.
We all want to view ourselves as good people. We know the difference between right and wrong, yet we often act wrongly anyway.
Many of the reasons for this involve biases and assumptions. A great number of these exist, but one particular issue I came across speaks especially to those who maintain some kind of religious, philosophical, or spiritual stance.
On his blog, Overcoming Bias, economist Robin Hanson wrote a post called Group Moral Licensing. It drew on a study called Vicarious moral licensing: The influence of others’ past moral actions on moral behavior, which demonstrated that people who recognized the moral character of their past actions tended to use that recognition as something like a buffer, allowing them to act in less than moral ways in the present.
This phenomenon takes the label of moral credentialing.
One can define numerous kinds of moral credentialing. All of these contain biases and rely on assumptions, which I outline below while relating them specifically to Buddhist contexts:
Past moral actions

This position rests morality in past actions. If a person has a track record of group-defined “good” moral behavior, the likelihood increases that behavioral vigilance relaxes. This provides one reason why we feel shocked when historically good people turn out to have developed moral failings. Relying on this position states, “I’ve always been good, therefore I am and always will be good, so I don’t have to bother with checking myself too seriously.” This solidified “good” moral identity can erode for quite some time before the individual and social cognitive dissonance becomes evident. We can note this in Buddhist circles with phrases such as “I have belonged to the sangha (or meditated) for X number of years. That proves I’m good and what I say or do is right.”
Group membership

If we maintain membership in a moral group, such as a religious institution, or identify with a particular moral position that a group bolsters, a tendency appears by which individuals assume the moral character of the group and abdicate their individual moral responsibility. This position summarizes as, “I’m a Buddhist, therefore anything I do is automatically good and what I say or do is right.”
Attribution by others
Deeming someone moral by way of a prize, honor, or other social recognition can lead a person to define themselves as morally good without examining their own actual moral positions. This expresses as, “I’ve won the Nobel Peace Prize, therefore the world deems me good, so I can do what I want because it’s all good.” In Buddhist contexts this expresses as, “I’m the abbot/board chair/senior student, so people recognize my goodness and everything I say or do is right.”
Repetition

When someone hears repeatedly, and/or from numerous sources, that their behavior and actions have a high moral quality, a tendency develops for that praise to generalize and for the person to believe it of themselves without examination. This expresses as, “So many people have told me I’m good, so whatever I say or do is right,” or “People say I am such a good Buddhist, therefore I don’t need to examine the truth of those claims much.”
Power

A definite correlation exists between power and corruption. See the third study below for one examination of this. This position summarizes as, “I occupy a position of deciding for others, therefore they trust me. So I must be right and good, as are all of my actions, or I wouldn’t hold this position.” A bit of a tautology (circular reasoning) lurks there. Positions of deluded moral power develop with the assistance of some of the other factors, but mostly with the help of social reinforcement. The more power one has, the less likely one questions the rightness of one’s own actions.
You can also see that interrelationships exist between these positions. Power, based on attribution by others, amplified and of long duration, can coalesce into quite a delusional moral trip. Think cult-like delusion when you put them all together. They all rest on false premises that undermine reality testing in the present moment. Questioning seems absurd when these social conditioning factors come into play. All of these can serve as justifications for immoral behavior and actions when people escape into a state of denial. When these positions appear, the expressions often come across as “holier than thou” and become increasingly irrational and entrenched (e.g., Fox News). Objectivity decreases the more these factors appear, the longer they remain evident, and the more intense they become.
All of these biases, assumptions, and overlooked spots serve to shield an individual from taking responsibility for their own moral positions and their own actions.
I call all of these positions moral credentialing by proxy, in that they don’t deal with the reality of the situation at hand but rely on conditioned biases as the basis for moral decision making.
Related studies and their summarized conclusions from their abstracts [additional notes of mine appear in brackets]:
Striving for the moral self: the effects of recalling past moral actions on future moral behavior. People’s desires to see themselves as moral actors can contribute to their striving for and achievement of a sense of self-completeness. The authors use self-completion theory to predict (and show) that recalling one’s own (im)moral behavior leads to compensatory rather than consistent moral action as a way of completing the moral self. [I would call this a guilt effect. The compensatory behavior, similar to atonement, doesn’t occur in morally positive or neutral situations.]
In a very different voice: unmasking moral hypocrisy. Overall, results suggested motivation to appear moral yet still benefit oneself. Such motivation is called moral hypocrisy. [Even when reminded of moral stakes people maintain a certain amount of desire for self-benefit and this influences outcomes]
Power increases hypocrisy: moralizing in reasoning, immorality in behavior. In five studies, we explored whether power increases moral hypocrisy (i.e., imposing strict moral standards on other people but practicing less strict moral behavior oneself). In Experiment 1, compared with the powerless, the powerful condemned other people’s cheating more, but also cheated more themselves. In Experiments 2 through 4, the powerful were more strict in judging other people’s moral transgressions than in judging their own transgressions. A final study found that the effect of power on moral hypocrisy depends on the legitimacy of the power: When power was illegitimate, the moral-hypocrisy effect was reversed, with the illegitimately powerful becoming stricter in judging their own behavior than in judging other people’s behavior. This pattern, which might be dubbed hypercrisy, was also found among low-power participants in Experiments 3 and 4. We discuss how patterns of hypocrisy and hypercrisy among the powerful and powerless can help perpetuate social inequality.
Moral hypocrisy: appearing moral to oneself without being so. [People use various strategies in order to avoid comparing themselves to moral standards.]
Moral credentials and the expression of prejudice. Three experiments supported the hypothesis that people are more willing to express attitudes that could be viewed as prejudiced when their past behavior has established their credentials as nonprejudiced persons.
Moral credentialing by association: the importance of choice and relationship closeness. People express more prejudice if they have established their "moral credentials."
Note: This post attempts an E-prime perspective. See the next post for a definition of that. If I’ve missed any spots, let me know so I can correct them. Quotations don’t count.