Behavioral Finance

People who count their chickens before they are hatched, act very wisely, because chickens run about so absurdly that it is impossible to count them accurately.
Oscar Wilde, 1854–1900


From Efficient Markets Theory to Behavioral Finance

Journal of Economic Perspectives, Volume 17, Number 1, Winter 2003, Pages 83–104

Robert J. Shiller


For a more detailed discussion of behavioral finance, see “A Survey of Behavioral Finance” by Nicholas Barberis and Richard Thaler (both of the University of Chicago), in Handbook of the Economics of Finance, 2003, Elsevier Science B.V.


The field of behavioral finance has its origins in the work of the economist and computer scientist Herbert Simon, who recognized the computational limits to rationality, and of the psychologists Amos Tversky and Daniel Kahneman, who demonstrated the behavioral basis for risk aversion.


By way of motivation for the discussion of behavioral finance, we present the following passage (Special Issue: Artificial Intelligence Applications on Wall Street, S. Slade (ed.), Applied Artificial Intelligence, 10(6), 1996).


The domain of finance and investing is replete with open problems. It is a matter of faith among economists that business managers maximize profits. A rational decision maker will increase revenues and decrease costs such that their difference is maximized. …


On the other hand, computer scientists, in the tradition of Simon, are aware that opportunities to exercise such rationality are rare. Realistic decisions are usually not formulated as equations. A problem search space is a more appropriate model, albeit subject to combinatorial explosion.


For example, consider the game of chess. There are 20 possible opening moves by white, to which there are 20 possible responses by black, for a total of 400 possible board positions after the first set of moves. In Table 1, we calculate the total number of possible board positions, assuming that the number of choices at any move continues to be 20. Thus, after 20 plies (10 complete sets of moves), there are about 10^26 possible configurations. After 40 plies (20 moves), there are over 10^52 positions. (Many chess games last for 30 moves or more.)


Assuming we have a computer chess program that can evaluate a board position every nanosecond (10^-9 second), the program would require about 10^9 years to assess all the alternatives in a 10-move (20-ply) game and select the best. Moreover, if we had a massively parallel computer comprising 5 billion of these nanosecond chess machines, it would still take 8 months to evaluate just the first 10 sets of moves.
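As a sanity check on these figures, here is a minimal Python sketch of the arithmetic. The branching factor of 20 and the one-evaluation-per-nanosecond rate are the passage's assumptions, not measured values.

    # Back-of-envelope check of the chess search-space arithmetic above.
    BRANCHING = 20              # assumed legal moves at every ply
    EVALS_PER_SEC = 1e9         # one board evaluation per nanosecond
    SECONDS_PER_YEAR = 3.15e7
    MACHINES = 5e9              # the hypothetical parallel computer

    for plies in (20, 40):
        positions = BRANCHING ** plies
        years_one = positions / EVALS_PER_SEC / SECONDS_PER_YEAR
        years_par = years_one / MACHINES
        print(f"{plies} plies: {positions:.1e} positions; "
              f"{years_one:.1e} years on one machine; "
              f"{years_par:.2e} years on 5 billion machines")

For 20 plies this yields about 10^26 positions and roughly 0.7 years (about 8 months) on the parallel machine, matching the passage.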


Game playing has a rich history in both artificial intelligence and economics. However, the economic game theorists cannot repeal the laws of combinatorics. The problems that make chess an unsolvable game are writ large in the world of Wall Street.


Consider the New York Stock Exchange as a game (a “big board” game, if you will). There are over 3000 stocks listed, each of which we may consider a move in our game. Thus, our branching factor is no longer 20, but 3000.


Also, unlike the chess player, the investor may make more than one “move” at a time. For example, instead of deciding between 100 shares of Ford or 100 shares of General Motors, the investor may purchase 50 shares of each. If we merely consider combinations of two stocks, our branching factor is now 3000 x 3000. If we allow portfolios of N stocks from the New York Stock Exchange, the branching factor is 3000^N. Now we are talking about real numbers. … The computational complexity of the investment world far exceeds that of chess, which is itself intractable.


Economists assert that the computational problems are finessed by the market as a whole. The efficient market arrives at a stock’s proper and fair price through the repeated bid/ask process for every stock. However, if we look at a relatively simple case of creating a portfolio of 10 stocks from the NYSE, we immediately see the problem. The table below presents the cruel facts. Evaluating just the first move would take a 5 billion node parallel computer over 300 million years.


Parameter                                        Value
-----------------------------------------------  -------------------
Stocks in portfolio                              10
New York Stock Exchange issues                   3,000
Number of portfolios                             3,000^10 = 5.9E+34
Years to evaluate all, at one per nanosecond     1.8E+18
Years to evaluate all, with 5 billion CPUs       374,486,301
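The table's figures can be reproduced directly. A minimal sketch in Python, taking the 3,000-issue universe and the one-portfolio-per-nanosecond evaluation rate as given by the passage:

    # Reproduce the portfolio-enumeration arithmetic in the table above.
    ISSUES = 3000               # NYSE issues assumed in the passage
    PORTFOLIO_SIZE = 10
    EVALS_PER_SEC = 1e9         # one portfolio evaluated per nanosecond
    SECONDS_PER_YEAR = 3.1536e7
    CPUS = 5e9

    portfolios = ISSUES ** PORTFOLIO_SIZE        # 3,000^10, about 5.9e34
    years_one = portfolios / EVALS_PER_SEC / SECONDS_PER_YEAR
    years_par = years_one / CPUS

    print(f"portfolios:      {portfolios:.1e}")   # ~5.9e+34
    print(f"years, one CPU:  {years_one:.1e}")    # ~1.9e+18
    print(f"years, 5e9 CPUs: {years_par:,.0f}")   # ~374 million

Note that 3,000^10 counts ordered selections with repetition, which is how the passage's branching-factor argument treats the problem; restricting to distinct, unordered 10-stock portfolios would shrink the count by a constant factor without changing the conclusion.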


Herbert Simon won the Nobel Prize in economics for establishing the principle of bounded rationality. That is, there are limits to rationality imposed by the computational demands of processing all available information. The fact is that for any realistic problem, it is not possible to behave as a rational economic agent.


The corollary to this is that the market is not efficient. Prices do not reflect everything that is known about a security. It is simply impossible.


[On a personal note, I have always found the efficient market theory not merely implausible, but delusional. It would be as if doctors worked from the assumption that people are immortal. The theory of immortality would be founded on the incontrovertible truth that no human being currently alive had ever died. The fact that people do in fact die would be labeled an anomaly. I suppose I lack sufficient faith to believe in the EMT. It is like the peace of God, for it passeth all understanding.]


In addition to the bounded rationality theory, another line of argument was developed by the psychologists Amos Tversky and Daniel Kahneman. They conducted a range of experiments in decision making demonstrating that, even in the absence of staggering computational demands, people do not behave according to the predictions of economic decision theory. For example, people exhibited quantifiable risk aversion: they would not accept a proposition with an expected value of zero if there was prominent downside risk. Kahneman won the Nobel Prize in economics in 2002. (Tversky had already died.)


Prospect Theory (Kahneman and Tversky) is an alternative to economic decision theory. Rather than focusing on absolute expected value, people frame outcomes relative to a reference level, much as human perception of other attributes such as brightness, loudness, or temperature is relative. The theory also focuses on gains and losses rather than final wealth. Example (Barberis and Thaler, p. 1069):


In addition to whatever you own, you have been given $1,000. Now choose between


A = ($1,000, 50%)

B = ($500, 100%)


B was the more popular choice. The same subjects were then asked:


In addition to whatever you own, you have been given $2,000. Now choose between


C = (-$1,000, 50%)

D = (- $500, 100%)


This time, C was more popular. Note that the two problems are identical in terms of their final wealth positions and yet people choose differently. The subjects are apparently focusing only on gains and losses. Indeed, when they are not given any information about prior winnings, they choose B over A and C over D.
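A minimal sketch of how Prospect Theory accounts for this reversal, using the Tversky–Kahneman value function v(x) = x^0.88 for gains and v(x) = −2.25·(−x)^0.88 for losses (the 0.88 and 2.25 are their published 1992 estimates; probability weighting is omitted here for simplicity):

    # Prospect-theory value of each gamble, computed on gains and losses
    # relative to the reference point (the amount already handed out).
    ALPHA = 0.88    # diminishing sensitivity to larger outcomes
    LAMBDA = 2.25   # loss aversion coefficient

    def v(x):
        """Tversky-Kahneman value function over gains and losses."""
        return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

    def value(gamble):
        """Prospect value of a list of (outcome, probability) pairs."""
        return sum(p * v(x) for x, p in gamble)

    A = [(1000, 0.5), (0, 0.5)]    # 50% chance to gain $1,000
    B = [(500, 1.0)]               # gain $500 for sure
    C = [(-1000, 0.5), (0, 0.5)]   # 50% chance to lose $1,000
    D = [(-500, 1.0)]              # lose $500 for sure

    print(value(A), value(B))   # ~218 vs ~237: B preferred to A
    print(value(C), value(D))   # ~-491 vs ~-534: C preferred to D

Concavity over gains makes the sure $500 more attractive than the gamble, while convexity over losses makes the gamble more attractive than the sure loss, which is exactly the observed reversal.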


Many studies demonstrated that the normal risk aversion ratio is about 2.5. That is, to be indifferent to a bet with a 50% chance of losing $100, the upside would have to be $250. Surprisingly, this ratio appears to be innate in other species: Professor Keith Chen of Yale has quantified risk aversion in monkeys. (See http://www.som.yale.edu/Faculty/keith.chen/ )
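Assuming a linear value function with loss aversion coefficient λ = 2.5, the indifference point follows from a one-line calculation: indifference to the bet requires 0.5 × G − 0.5 × λ × $100 = 0, so G = 2.5 × $100 = $250.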


Shiller attacks the efficient market hypothesis from another angle. He begins by acknowledging that for years finance took the efficient market theory as axiomatic. Merton and others simply assumed rational expectations. There were reports of “anomalies” that were dismissed as small.


Econophysics

As an aside, we note that physicists have been enticed to examine the data. Eugene Stanley and his colleagues have applied quantitative data analysis methods from physics to finance and economics. Stanley notes that both realms have oceans of data. This hybrid field is called “econophysics,” which, Stanley notes, does not connote low-priced physics, à la Econo Lodge, but rather a multidisciplinary approach to examining financial data.


Stanley also observes that physicists are less likely to dismiss data that fail to conform to predicted values as “anomalies.” If he held out his hand and dropped his pen, only to have it float to the ceiling, Stanley would not call that an anomaly to the theory of gravity. He would find it an important observation that would require an explanation.


The 1980s and Excess Volatility


One anomaly was excess volatility beyond that predicted by the efficient market theory. Most of the volatility in the stock market was unexplained.


Let P*_t denote the present value at time t of all actual subsequent dividends. If the market's forecast at time t were perfect, the price of the stock would equal this fundamental value:


P_t = P*_t


The efficient market theory asserts that the price equals the mathematical expectation of this value, conditional on public information available at time t:


P_t = E_t[P*_t]


Any surprising movements in the stock market must then reflect new information about the fundamental value P*_t. Thus,


P*_t = P_t + U_t


where U_t is a forecast error. Since an optimal forecast error must be uncorrelated with the forecast itself, var(P*_t) = var(P_t) + var(U_t), and so the forecast P_t can be no more volatile than the variable it forecasts. As Shiller puts it: “The maximum possible variance of the forecast is the variance of the variable forecasted, and this can occur only if the forecaster has perfect foresight and forecasts correlate perfectly with the variable forecasted.”


However, Shiller showed that for the period 1871–2002, the realized present value P*_t of subsequent S&P dividends behaved like a stable trend, while the price fluctuated wildly around it, suggesting that there is excess volatility in the stock market.


Simply stated, the volatility of returns is higher than the volatility of dividend growth.
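A small simulation makes the variance bound concrete. The sketch below assumes AR(1) dividends and a constant discount factor (illustrative modeling choices, not Shiller's actual data), computes the rational price P_t = E_t[P*_t] in closed form, and confirms that its variance is smaller than that of the realized present value P*_t:

    # Illustrative check of var(P) <= var(P*) under a toy dividend model.
    import numpy as np

    rng = np.random.default_rng(0)
    T, H = 5_000, 600          # sample length; truncation horizon for P*
    phi, rho = 0.9, 0.97       # AR(1) dividend persistence; discount factor

    # Dividends: d_{t+1} = phi * d_t + eps_{t+1}
    eps = rng.normal(size=T + H)
    d = np.zeros(T + H)
    for t in range(1, T + H):
        d[t] = phi * d[t - 1] + eps[t]

    # Realized present value of actual subsequent dividends (truncated at H)
    w = rho ** np.arange(1, H + 1)
    p_star = np.array([w @ d[t + 1 : t + 1 + H] for t in range(T)])

    # Optimal forecast: E_t[d_{t+k}] = phi^k d_t, so
    # P_t = sum_k (rho*phi)^k d_t = d_t * rho*phi / (1 - rho*phi)
    p = d[:T] * (rho * phi) / (1 - rho * phi)

    print("var(P)  =", p.var())        # variance of the forecast
    print("var(P*) =", p_star.var())   # at least as large, by construction

In the actual data the inequality appears violated: prices fluctuated far more than the smooth realized present value, which is the excess-volatility anomaly.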


Shiller describes various challenges and responses to his work on excess volatility.


Samuelson argued that the market is “micro efficient but macro inefficient”: individual stocks are priced efficiently even if the market as a whole is not. “Dividend-price ratios on individual stocks do serve as forecasts of long-term future changes in their future dividends, as efficient markets assert.”


The Blossoming of Behavioral Finance



Feedback Models









Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in anti-nuclear demonstrations.


When asked which of “Linda is a bank teller” (statement A) and “Linda is a bank teller and active in the feminist movement” (statement B) is more likely, subjects typically assign greater probability to B. This is, of course, impossible: a conjunction cannot be more probable than either of its conjuncts. The error reflects the representativeness heuristic, since B better matches the stereotype that Linda's description evokes. (Barberis and Thaler, p. 1064)


Another bias is sample size neglect: people make inferences from too few data points (e.g., an analyst makes four good recommendations; therefore, he is a good analyst). A related example is the “hot hand” phenomenon in sports, whereby basketball fans become convinced that a player who has made three shots in a row is on a hot streak and will score again; this was refuted by Gilovich, Vallone, and Tversky in a 1985 study. The belief that even small samples will reflect the properties of the parent population is sometimes known as the “law of small numbers.” (Barberis and Thaler, p. 1065)
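A short simulation shows why three-in-a-row is weak evidence of a hot hand: for a player who makes 50% of shots independently, streaks of three occur constantly by chance, and the make probability immediately after three straight makes stays at the base rate. (The 50% shooting percentage and the sequence lengths are arbitrary illustrative choices.)

    # Streaks in i.i.d. "shooting": the hot hand as sampling noise.
    import numpy as np

    rng = np.random.default_rng(1)
    shots = rng.random(1_000_000) < 0.5    # 50% shooter, independent shots

    # Probability of a make immediately after three consecutive makes
    after3 = shots[3:][shots[:-3] & shots[1:-2] & shots[2:-1]]
    print("P(make | last 3 made) =", after3.mean())   # ~0.50, the base rate

    # How often does a 20-shot stretch contain a streak of 3+ makes?
    games = shots[:999_980].reshape(-1, 20)
    streaky = [(g[:-2] & g[1:-1] & g[2:]).any() for g in games]
    print("share of 20-shot stretches with a 3+ streak =", np.mean(streaky))

Most 20-shot stretches from a purely random 50% shooter contain at least one streak of three makes, so observing one tells the fan almost nothing.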


Other biases include conservatism, belief perseverance, anchoring, and availability bias. (Barberis and Thaler, p. 1066)


These biases can affect investing behavior.








Smart Money vs. Ordinary Investors (aka, the Limits of Arbitrage)


























As I was heading out of my Upper East Side apartment building to a leisurely breakfast, I happened across a neighbor whose husband is a well-known stock trader on Wall Street. Her darling 5-year old-son had overheard Daddy on the phone with a client, and was quite concerned.

“I know Daddy sells things at his job,” he remarked with consternation, “but why, oh why, did he say he would sell my shorts?”

New York Times, Metropolitan Diary, February 12, 2007



Conclusion




Discussion question 1: if the market is not really efficient, is that good or bad for Columbia Management?


Discussion question 2: what are the implications of Prospect Theory for the practice of risk management?