A few years ago, the Israeli game theorist Ariel Rubinstein got the idea of examining how the tools of economic science affected the judgment and empathy of his undergraduate students at Tel Aviv University. He made each student the CEO of a struggling hypothetical company and tasked them with deciding how many employees to lay off. Some students were given an algebraic equation that expressed profits as a function of the number of employees on the payroll. Others were given a table listing the number of employees in one column and the corresponding profits in the other. Simply presenting the layoff/profits data in a different format had a surprisingly strong effect on students' choices: fewer than half of the "table" students chose to fire as many workers as was necessary to maximize profits, whereas three quarters of the "equation" students chose the profit-maximizing level of pink slips. Why? The "equation" group simply "solved" the company's problem of profit maximization without thinking about the consequences for the employees they were firing.
Rubinstein's classroom experiment serves as one lesson in the pitfalls of the scientific method: it often seems to distract us from considering the full implications of our calculations. The point isn't that it's necessarily immoral to fire an employee (Milton Friedman famously claimed that the sole purpose of a company is indeed to maximize profits), but rather that the students who were encouraged to think of the decision to fire someone as an algebra problem didn't seem to think about the employees at all.
The experiment is indicative of the challenge faced by business schools, which devote themselves to teaching management as a science, without always acknowledging that every business decision has societal repercussions. A new generation of psychologists is now thinking about how to create ethical leaders in business and in other professions, based on the notion that good people often do bad things unconsciously. It may transform not just education in the professions, but the way we think about encouraging people to do the right thing in general.
At present, the ethics curriculum at business schools can best be described as an unsuccessful work-in-progress. It's not that business schools are turning Mother Teresas into Jeffrey Skillings (Harvard Business School, class of '79), despite some claims to that effect. It's easy to come up with examples of rogue MBA graduates who have lied, cheated, and stolen their way to fortunes (the recently convicted Raj Rajaratnam is a graduate of the University of Pennsylvania's Wharton School of Business; his partner in crime, Rajat Gupta, is a Harvard Business School alum). But a huge number of companies are run by business school grads, and for every Gupta and Rajaratnam there are scores of others who run their companies in perfectly legal anonymity. And of course, there are the many ethical missteps by non-MBA business leaders: Bernie Madoff was educated as a lawyer; Enron's Ken Lay had a Ph.D. in economics.
In actuality, the picture suggested by the data is that business schools have no impact whatsoever on the likelihood that someone will cook the books or otherwise commit fraud. MBA programs are thus damned by faint praise: "We do not turn our students into criminals" would hardly make for an effective recruiting slogan.
If it's too much to expect MBA programs to turn out Mother Teresas, is there anything that business schools can do to make tomorrow's business leaders more likely to do the right thing? If so, it's probably not by trying to teach them right from wrong: moral epiphanies are a scarce commodity by age 25, when most students enroll in MBA programs. Yet this is how business schools have taught ethics for most of their history. They've often quarantined ethics into the beginning or end of the MBA education. When Ray began his MBA classes at Harvard Business School in 1994, the ethics course took place before the instruction in the "science of management" in disciplines like statistics, accounting, and marketing. The idea was to provide an ethical foundation that would allow students to integrate the information and lessons from the practical courses with a broader societal perspective. Students in these classes read philosophical treatises, tackle moral dilemmas, and study moral exemplars such as Johnson & Johnson CEO James Burke, who took responsibility for and responded quickly to the series of deaths from tampered Tylenol pills in the 1980s.
It's a mistake to assume that MBA students seek only to maximize profits: there may be eye-rolling at some of the content of ethics curricula, but not at the idea that ethics has a place in business. Yet once the pre-term ethics instruction is out of the way, it is forgotten, replaced by more tangible and easier-to-grasp matters like balance sheets and factory design. Students get too distracted by the numbers to think very much about the social reverberations (and in some cases legal consequences) of employing accounting conventions to minimize tax burdens or firing workers in the process of reorganizing the factory floor.
Business schools are starting to recognize that ethics can't be cordoned off from the rest of a business student's education. The most promising approach, in our view, doesn't even try to give students a deeper personal sense of mission or social purpose; it's likely that no amount of indoctrination could have kept Jeff Skilling from blowing up Enron. Instead, it helps students to appreciate the unconscious ethical lapses we commit every day without even realizing it and to think about how to minimize them. If finance and marketing can be taught as a science, then perhaps so too can ethics.
These ethical failures don't occur at random. Countless experiments in psychology and economics labs and out in the world have documented the circumstances that make us most likely to ignore moral concerns, what social psychologists Max Bazerman and Ann Tenbrunsel call our moral blind spots. These result from numerous biases that exacerbate the sort of distraction from ethical consequences illustrated by the Rubinstein experiment. A classic sequence of studies illustrates how readily these blind spots can occur in something as seemingly straightforward as flipping a fair coin to determine rewards. Imagine that you are in charge of splitting a pair of tasks between yourself and another person. One job is fun, with a potential payoff of $30; the other is tedious, with no financial reward. Presumably, you'd agree that flipping a coin is a fair way of deciding, and most subjects do. However, when sent off to flip the coin in private, about 90 percent of subjects come back claiming that their coin flip assigned them to the fun task, rather than the 50 percent one would expect with a fair coin. Some people end up ignoring the coin; more interestingly, others respond to an unfavorable first flip by seeing it as "just practice" or deciding to make it two out of three. That is, they find a way of temporarily adjusting their sense of fairness to obtain a favorable outcome.
There are many such examples of what Bazerman and Tenbrunsel would argue are unintentional ethical failings. People fall prey to self-serving bias: an accountant whose future business depends on maintaining the approval of the companies he's meant to be auditing is genuinely more likely to believe his clients' books are in order. We discriminate unconsciously against those who aren't like us, passing them over for promotion or low-balling them in negotiations. And even when we lie, cheat, or steal for personal gain, we often disengage, at least temporarily, from the set of values that would normally lead us to look down upon those who lie, cheat, and steal.