People who count their chickens before they are hatched, act very wisely,
because chickens run about so absurdly that it is impossible to count
them accurately.
Oscar Wilde, 1854 - 1900
From Efficient Markets Theory to Behavioral Finance
Journal of Economic Perspectives-Volume 17, Number 1-Winter 2003-Pages 83-104
Robert J. Shiller
For a more detailed discussion of behavioral finance, see “A Survey of Behavioral Finance”, Nicholas Barberis and Richard Thaler (both of University of Chicago) in Handbook of the Economics of Finance, 2003, Elsevier Science B.V.
The field of behavioral finance has its origins in the work of the economist and computer scientist Herbert Simon, who recognized the computational limits to rationality, and of the psychologists Amos Tversky and Daniel Kahneman, who demonstrated the behavioral basis for risk aversion.
By way of motivation for the discussion of behavioral finance, we present the following passage (Special Issue: Artificial Intelligence Applications on Wall Street, S. Slade (ed.), Applied Artificial Intelligence, 10(6), 1996.)
The domain of finance and investing is replete with open problems. It is a matter of faith among economists that business managers maximize profits. A rational decision maker will increase revenues and decrease costs such that their difference is maximized. …
On the other hand, computer scientists, in the tradition of Simon, are aware that opportunities to exercise such rationality are rare. Realistic decisions are usually not formulated as equations. A problem search space is a more appropriate model, albeit subject to combinatorial explosion.
For example, consider the game of chess. There are 20 possible opening moves by white, to which there are 20 possible responses by black, for a total of 400 possible board positions after the first set of moves. In Table 1, we calculate the total number of possible board positions, assuming that the number of choices at any move continues to be 20. Thus, after 20 plies (10 complete sets of moves), there are about 10^26 possible configurations. After 40 plies (20 moves), there are over 10^52 positions. (Many chess games last for 30 moves or more.)
Assuming we have a computer chess program that can evaluate a board position every nanosecond (10^-9 second), the program would require 10^9 years to assess all the alternatives in a 10-move (20-ply) game and select the best. Moreover, if we had a massively parallel computer comprising 5 billion of these nanosecond chess machines, it would still take 8 months to evaluate just the first 10 sets of moves.
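The passage's arithmetic can be checked directly; a quick verification (the 20-wide branching factor and the 1 ns evaluation time are the passage's own assumptions):

```python
# Chess search-space arithmetic from the passage:
# branching factor 20, one board evaluation per nanosecond.
SECONDS_PER_YEAR = 365 * 24 * 3600

positions_20_ply = 20 ** 20          # ~1.05e26, i.e., "about 10^26"
positions_40_ply = 20 ** 40          # ~1.1e52, i.e., "over 10^52"

serial_years = positions_20_ply * 1e-9 / SECONDS_PER_YEAR
parallel_years = serial_years / 5e9  # 5 billion nanosecond machines

print(f"{positions_20_ply:.2e} positions after 20 plies")
print(f"{serial_years:.2e} years on one machine")            # ~3.3e9 years
print(f"{parallel_years * 12:.1f} months on 5e9 machines")   # ~8 months
```

The serial figure comes out near 3 x 10^9 years, consistent with the passage's order-of-magnitude "10^9 years," and the parallel figure is just under 8 months.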
Game playing has a rich history in both artificial intelligence and economics. However, the economic game theorists cannot repeal the laws of combinatorics. The problems that make chess an unsolvable game are writ large in the world of Wall Street.
Consider the New York Stock Exchange as a game (a “big board” game, if you will). There are over 3000 stocks listed, each of which we may consider a move in our game. Thus, our branching factor is no longer 20, but 3000.
Also, unlike the chess player, the investor may make more than one “move” at a time. For example, instead of deciding between 100 shares of Ford or 100 shares of General Motors, the investor may purchase 50 shares of each. If we merely consider combinations of two stocks, our branching factor is now 3000 × 3000. If we allow portfolios of N stocks from the New York Stock Exchange, the branching factor is 3000^N. Now we are talking about real numbers. … The computational complexity of the investment world far exceeds that of chess, which is itself intractable.
Economists assert that the computational problems are finessed by the market as a whole. The efficient market arrives at a stock’s proper and fair price through the repeated bid/ask process for every stock. However, if we look at a relatively simple case of creating a portfolio of 10 stocks from the NYSE, we immediately see the problem. The table below presents the cruel facts. Evaluating just the first move would take a 5 billion node parallel computer over 300 million years.
Parameter | Value
Stocks in portfolio | 10
New York Stock Exchange issues | 3,000
Number of portfolios | 3,000^10 ≈ 5.9 × 10^34
Years at one evaluation per nanosecond | 1.8 × 10^18
Years at 5 billion parallel CPUs | 374,486,301
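The table's figures can be reproduced directly (note that the passage's 3,000^10 counts ordered selections with repetition, per its branching-factor model, rather than the smaller number of distinct 10-stock combinations):

```python
# Portfolio-search arithmetic from the table:
# 3,000 NYSE issues, portfolios of 10, one evaluation per nanosecond.
SECONDS_PER_YEAR = 365 * 24 * 3600

portfolios = 3000 ** 10                                  # ~5.9e34
serial_years = portfolios * 1e-9 / SECONDS_PER_YEAR      # one CPU
parallel_years = serial_years / 5e9                      # 5 billion CPUs

print(f"{portfolios:.1e} portfolios")                    # 5.9e34
print(f"{serial_years:.1e} years serially")              # ~1.9e18
print(f"{parallel_years:,.0f} years in parallel")        # ~374 million
```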
Herbert Simon won the Nobel Prize in economics for establishing the principle of bounded rationality. That is, there are limits to rationality imposed by the computational demands of processing all available information. The fact is that for any realistic problem, it is not possible to behave as a rational economic agent.
The corollary to this is that the market is not efficient. Prices do not reflect everything that is known about a security. It is simply impossible.
[On a personal note, I have always found the efficient market theory not merely implausible, but delusional. It would be as if doctors worked from the assumption that people are immortal. The theory of immortality would be founded on the incontrovertible truth that no human being currently alive had ever died. The fact that people do in fact die would be labeled an anomaly. I suppose I lack sufficient faith to believe in the EMT. It is like the peace of God, for it passeth all understanding.]
In addition to the bounded rationality theory, another line of argument was developed by the psychologists Amos Tversky and Daniel Kahneman. They conducted a range of decision-making experiments demonstrating that, even in the absence of staggering computational demands, people do not behave according to the predictions of economic decision theory. For example, subjects showed quantifiable risk aversion: they would not accept a proposition with an expected value of zero if it carried prominent downside risk. Kahneman won the Nobel Prize in economics in 2002. (Tversky had already died.)
Prospect Theory (Kahneman and Tversky) is an alternative to economic decision theory. Rather than focusing on absolute expected value, people frame outcomes relative to an earlier reference level. This is consistent with human perception of other attributes such as brightness, loudness, or temperature. The focus is thus on gains and losses rather than on final wealth. Example: (Barberis and Thaler, p 1069)
In addition to whatever you own, you have been given $1,000. Now choose between
A = ($1,000, 50%)
B = ($500, 100%)
B was the more popular choice. The same subjects were then asked:
In addition to whatever you own, you have been given $2,000. Now choose between
C = (-$1,000, 50%)
D = (- $500, 100%)
This time, C was more popular. Note that the two problems are identical in terms of their final wealth positions and yet people choose differently. The subjects are apparently focusing only on gains and losses. Indeed, when they are not given any information about prior winnings, they choose B over A and C over D.
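The equivalence of the two problems is easy to verify by enumerating final-wealth outcomes; a quick check:

```python
# Final-wealth distributions for the Kahneman-Tversky framing problems.
# Each lottery is a list of (final_wealth, probability) pairs.
A = [(1000 + 1000, 0.5), (1000 + 0, 0.5)]   # endowed $1,000, then ($1,000, 50%)
B = [(1000 + 500, 1.0)]                      # endowed $1,000, then ($500, 100%)
C = [(2000 - 1000, 0.5), (2000 - 0, 0.5)]   # endowed $2,000, then (-$1,000, 50%)
D = [(2000 - 500, 1.0)]                      # endowed $2,000, then (-$500, 100%)

assert sorted(A) == sorted(C)   # identical final-wealth lotteries
assert sorted(B) == sorted(D)
print("A == C and B == D in final wealth; only the framing differs")
```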
Many studies have demonstrated that the typical loss-aversion ratio is about 2.5. That is, to be indifferent to a bet with a 50% chance of losing $100, the upside would have to be $250. Strikingly, a similar ratio appears to be innate in other species: Professor Keith Chen of Yale has quantified loss aversion in monkeys. (See http://www.som.yale.edu/Faculty/keith.chen/ )
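The 2.5 figure corresponds to an indifference condition under loss aversion. A sketch using a piecewise-linear value function (a simplification; Kahneman and Tversky's actual value function is also concave over gains and convex over losses):

```python
def bet_value(upside, downside, lam):
    """Value of a 50/50 bet with loss-aversion coefficient lam:
    gains count at face value, losses are weighted by lam."""
    return 0.5 * upside - 0.5 * lam * downside

lam = 2.5
assert bet_value(250, 100, lam) == 0.0   # indifference point from the text
print(bet_value(200, 100, lam))          # -25.0: a $200 upside is still refused
```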
Shiller attacks the efficient market hypothesis from another angle. He begins by acknowledging that for years finance took the efficient market theory as axiomatic. Merton and others simply assumed rational expectations. There were reports of “anomalies” that were dismissed as small.
Econophysics
As an aside, we note that physicists have been enticed to examine the data. Eugene Stanley and his colleagues have applied quantitative data-analysis methods from physics to finance and economics. Stanley notes that both realms have oceans of data. This hybrid field is called “econophysics,” which, Stanley notes, does not connote low-priced physics, à la Econo-Lodge, but rather a multidisciplinary approach to examining financial data.
Stanley also observes that physicists are less likely to dismiss data that fail to conform to predicted values as “anomalies.” If he held out his hand and dropped his pen, only to have it float to the ceiling, Stanley would not call that an anomaly to the theory of gravity. He would find it an important observation that would require an explanation.
The 1980’s and Excess Volatility
One anomaly was excess volatility beyond that predicted by the efficient market theory. Most of the volatility in the stock market was unexplained.
Let P*_t denote the present value at time t of all actual subsequent dividends, accruing according to the optimal forecast at time t.

The efficient market theory asserts that the price of a stock is the mathematical expectation of this value conditional on public information available at time t:

P_t = E_t P*_t

Any surprising movements in the stock market must reflect new information about the fundamental value P*_t. Thus,

P*_t = P_t + U_t

where U_t is a forecast error uncorrelated with information available at time t. “The maximum possible variance of the forecast is the variance of the variable forecasted, and this can occur only if the forecaster has perfect foresight and forecasts correlate perfectly with the variable forecasted.”
However, Shiller showed that for the period 1871–2002, the present value of the S&P index behaved like a stable trend, suggesting that there is excess volatility in the stock market.
Simply stated, the volatility of returns is higher than the volatility of dividend growth.
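The variance bound behind this claim follows from the decomposition above: since the forecast error U_t is uncorrelated with the forecast P_t, var(P*) = var(P) + var(U) ≥ var(P), so an efficient price should be less volatile than the ex post fundamental. A simulation sketch with made-up parameters (the 100/10/5 figures are arbitrary):

```python
import random
import statistics

random.seed(0)
n = 100_000
prices = [random.gauss(100, 10) for _ in range(n)]      # rational forecasts P_t
errors = [random.gauss(0, 5) for _ in range(n)]         # forecast errors U_t
fundamentals = [p + u for p, u in zip(prices, errors)]  # P*_t = P_t + U_t

var_p = statistics.pvariance(prices)
var_star = statistics.pvariance(fundamentals)
assert var_p < var_star   # efficiency bounds price volatility from above
print(f"var(P) = {var_p:.1f}, var(P*) = {var_star:.1f}")
```

Shiller's point is that observed prices violate this bound: measured price volatility exceeds the volatility of the dividend-based fundamental.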
Shiller describes various challenges and responses to his work on excess volatility.
Samuelson argued that while the market as a whole was inefficient, individual stocks were efficient. “Dividend-price ratios on individual stocks do serve as forecasts of long-term future changes in their future dividends, as efficient markets assert.”
The Blossoming of Behavioral Finance
In the 1990’s there was a shift of focus toward developing psychological models of investing.
Feedback Models
Word of mouth feedback – rising stock prices feed on themselves. Without correction, become bubbles. (Think dot-com and “new paradigm” explanations.)
Feedback can also drive prices down.
Standard example: Tulip bulb bubble in Holland in 1630’s. (Instead of tulips, think Cisco, AOL, Yahoo, Netscape, yadayada.com)
Shiller’s book, Irrational Exuberance, came out at the peak of the technology bubble – March 2000. Pretty good timing.
One can create a feedback model with a difference equation (or recurrence relation), e.g., a Fibonacci sequence.
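The Fibonacci rule F_t = F_{t-1} + F_{t-2} is the simplest positive-feedback difference equation: each term feeds on its own past, and the sequence grows geometrically (successive ratios approach the golden ratio, about 1.618), much as an uncorrected bubble would. A minimal sketch:

```python
def feedback(a, b, n):
    """Iterate x_t = x_{t-1} + x_{t-2}: each value feeds on its own history."""
    seq = [a, b]
    for _ in range(n - 2):
        seq.append(seq[-1] + seq[-2])
    return seq

fib = feedback(1, 1, 20)
print(fib[:8])               # [1, 1, 2, 3, 5, 8, 13, 21]
print(fib[-1] / fib[-2])     # ~1.618: unchecked feedback grows geometrically
```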
Tversky and Kahneman demonstrated representativeness bias. For example, in one study subjects were told the following:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in anti-nuclear demonstrations.
When asked which of “Linda is a bank teller” (statement A) and “Linda is a bank teller and active in the feminist movement” (statement B) is more likely, subjects typically assign greater probability to B. This is, of course, impossible. (Barberis and Thaler, p 1064)
Another bias is sample-size neglect: people make inferences based on too few data points (e.g., an analyst makes four good recommendations; therefore, he is a good analyst). Similar is the “hot hand” phenomenon in sports, whereby fans become convinced that a basketball player who has made three shots in a row is on a hot streak and will score again; this was refuted by Tversky and colleagues in a 1985 study. This belief that even small samples will reflect the properties of the parent population is sometimes known as the “law of small numbers.” (Barberis and Thaler, p 1065)
Other biases include conservatism, belief perseverance, anchoring, and availability bias. (Barberis and Thaler, p 1066)
These biases can affect investing behavior.
Biased self-attribution. (“I was smart to buy Amazon. The market cratered because of Greenspan.”)
Ponzi schemes are real-life examples of feedback models. Albania in 1996-1997: 2,000 people died.
Stock market confidence index http://icf.som.yale.edu/confidence.index/ (Click on the graphs to get the detail.)
[Aside: Inflation, like stock prices, has a feedback component. If you expect prices to go up, you will spend or save accordingly.]
Simple feedback models do not imply strong serial correlation. There is a distributed lag. Other factors may be more important.
Short term momentum effects. Long term mean reversion.
Smart Money vs. Ordinary Investors (aka, the Limits of Arbitrage)
Not everyone is a rational optimizer, but smart money balances things out.
When irrational investors buy, smart money sells, and vice versa.
Problem: smart money may amplify feedback trades by buying ahead.
Problem: smart money may not completely offset irrational trades due to risk.
Evidence of feedback/smart dichotomy in momentum vs. contrarian traders in Fidelity study. Traders had consistent patterns.
Problem: if irrational optimists (zealots) drive up the price, the smart money may not be able to go short due to limited supply.
Example: the March 2000 sale by 3Com of a 5% stake in Palm. Palm’s price was so high that it implied a negative value for the residual 3Com business. Shorting Palm became too expensive (35% interest).
Put option prices on Palm became expensive, violating put-call parity due to inability to short Palm.
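The arithmetic of the negative residual (“stub”) value is simple. Each 3Com share carried a claim to roughly 1.5 Palm shares, so 3Com’s price net of its Palm stake should have been positive; the first-day closing prices below (approximate figures from Lamont and Thaler’s account of the episode, illustrative rather than exact) imply otherwise:

```python
# Approximate closing prices, March 2, 2000 (illustrative figures).
palm_price = 95.06     # Palm on its IPO day
com_price = 81.81      # 3Com the same day
palm_per_3com = 1.5    # approx. Palm shares embedded in each 3Com share

stub = com_price - palm_per_3com * palm_price
print(f"implied value of 3Com ex-Palm: {stub:.2f} dollars per share")  # about -61
```

Arbitrage (buy 3Com, short Palm) should have closed this gap, but the 35% borrow cost on Palm made the short leg uneconomic, which is exactly the limits-of-arbitrage point.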
“Short-sale constraints could be a fatal flaw in the basic efficient markets theory.”
Other reasons to short stocks, e.g., hedging.
Look for correlation between cost of shorting and subsequent performance. Data not generally available.
Data are available for 1926 to 1933, published in the WSJ. They corroborate that stocks that are expensive to short carry higher prices and earn lower average returns.
1929 crash had been blamed on short sellers. Shorting fell into disrepute.
http://www.equilend.com since 2002 provides market for borrowing and lending.
Psychological cost of shorting. No limits to downside risk. Cites pain of regret and Tversky and Kahneman.
Result: investors avoid selling losers.
Result: investors want to avoid covering their shorts in a losing situation.
“People prefer to avoid putting themselves in situations that might confront them with psychologically difficult decisions in the future.”
The stock market of today is built on the principle of limited liability.
Proposed alternatives in 19th century:
unlimited liability (sue the Exxon stockholders for oil spills).
unlimited proportional liability (sue stockholders based on size of holdings).
double liability (twice the capital subscribed).
Around 1830, the law arrived at limited liability, which makes a share of stock like a lottery ticket.
Given this structure, shorting stocks is not attractive.
Very little short selling occurs in practice – probably less than 2% of all outstanding shares.
Therefore, short selling cannot supply the corrective mechanism that the efficient market theory requires.
As I was heading out of my Upper East Side apartment building to a leisurely breakfast, I happened across a neighbor whose husband is a well-known stock trader on Wall Street. Her darling 5-year old-son had overheard Daddy on the phone with a client, and was quite concerned.
“I know Daddy sells things at his job,” he remarked with consternation, “but why, oh why, did he say he would sell my shorts?”
New York Times, Metropolitan Diary, February 12, 2007
Conclusion
“Efficient market theory may lead to drastically incorrect interpretations of events such as major stock market bubbles.”
Eugene Fama’s rebuttals to the critiques: (1) anomalies occur as underreactions as well as overreactions, and (2) anomalies disappear over time or with changes in methodology.
Responses: (1) behavioral finance makes no claim for a single direction of reaction; (2) disappearance is the natural course of research, but the excess volatility does not subside.
“It would seem peculiar to argue that irrational markets should display regular and lasting patterns!”
“The recent worldwide stock market boom, and then crash after 2000, had its origins in human foibles and arbitrary feedback relations and must have generated a real and substantial misallocation of resources.”
Discussion question 1: if the market is not really efficient, is that good or bad for Columbia Management?
Discussion question 2: what are the implications of Prospect Theory for the practice of risk management?