DOES IT MAKE SENSE to heed the advice of experts? This doesn’t seem like a hard question. I certainly listen to my doctor and to many others with specialized expertise. As a society, we all rely on experts—from civil engineers to airline pilots to firefighters—for our health and safety.
At the same time, however, human judgment seems to be riddled with flaws, and even experts get things wrong. But when it comes to investments, does that mean you should never rely on expert opinion? How should you think about this question?
The first step, in my opinion, is to distinguish between decisions that require a judgment about your own personal situation and those that require judgments about the wider world. Judgments about the wider world—including predictions about the economy, the financial markets or a specific stock—are much more difficult. That’s because of the virtually infinite number of variables involved.
Suppose you had been looking at the stocks of hotels or cruise lines two years ago, before the pandemic. No matter how thorough your analysis, there’s no way you could have known that these industries would soon have the rug pulled out from under them.
Similarly, it would have been difficult for investors to foresee past crises, such as 9/11, and yet the impact on markets was significant. The bottom line: Experts can’t really be helpful in making economic predictions because no one knows what unexpected events lie ahead.
You may be familiar with the investment classic A Random Walk Down Wall Street, which includes this famous line: “A blindfolded monkey throwing darts at a newspaper’s financial pages could select a portfolio that would do just as well as one carefully selected by the experts.”
While this sounds like hyperbole, a team at The Wall Street Journal has tested and confirmed it. The Journal’s employees pitted the hedge fund industry’s best ideas against selections made by throwing darts at the stock pages—and the randomly thrown darts beat the hedge fund managers. The lesson: Stock-picking and macroeconomic forecasting may seem like areas where you would want to listen to experts, but the data regularly suggest otherwise.
Meanwhile, you have a lot more information about your own personal finances than you do about the wider world. To be sure, there will always be unknowns. But your own financial future isn’t a complete unknown.
Suppose you were applying for a new home mortgage and trying to decide between a fixed and a variable rate. A fixed rate offers certainty, while a variable rate is a potentially double-edged sword: It offers a lower rate for a handful of years, but then the rate can rise. For that reason, most people choose the fixed-rate option.
But I recall a friend who opted for a variable rate loan. He was happy to take the risk and, in fact, didn’t even see it as a risk. He was newly married and confident he would move and pay off the loan long before the rate reset. Was he making a prediction? Yes, but it was a reasoned judgment about his own personal finances, not a wild guess about the wider world.
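The trade-off my friend weighed can be sketched with the standard amortization formula. The numbers below are purely illustrative assumptions, not actual rates: a hypothetical $300,000 loan at 5% fixed versus 4% variable during the teaser period.

```python
# Hypothetical comparison of a 30-year fixed loan vs. a variable-rate
# loan with a lower teaser rate, for a borrower who expects to sell
# and pay off the loan before the rate ever resets.
# All rates and amounts are illustrative assumptions.

def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-payment amortization formula."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 300_000
fixed = monthly_payment(loan, 0.05)       # assumed 5.0% fixed rate
variable = monthly_payment(loan, 0.04)    # assumed 4.0% teaser rate

# If the borrower sells during the teaser period, the variable loan
# saves this much every month, and the post-reset rate never matters.
monthly_savings = fixed - variable
```

The judgment here turns on something the borrower actually knows—his own plans—rather than on a forecast about where rates are headed.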
This illustrates my first recommendation: If an expert is offering advice, think critically about the information supporting that advice. How knowable is it? While all decisions require some judgment about the future, try to determine whether the expert is relying mostly on facts or on a crystal ball.
What else can you do to protect yourself from erroneous expert judgment? The concept of intrinsic value provides another useful filter. No one can predict where the stock market is going next. But because stocks have intrinsic value, it doesn’t require an act of faith to invest in the market—assuming you have a long-term perspective. Bonds and real estate also have intrinsic value. The upshot: It’s possible for experts to make reasonable judgments about these investments.
By contrast, if investments lack intrinsic value—such as gold, cryptocurrency or tulip bulbs—then experts are in a far weaker position when talking about them. Without intrinsic value, the price of these assets rests only on what the next person is willing to pay, and thus expert opinion isn’t worth much.
Another distinction to keep in mind: Some questions have a yes-or-no answer. For example, will the Fed raise interest rates this year? Will Congress raise taxes? With these questions, there are just two possible answers.
Meanwhile, other questions are totally open ended. For example, how will the rivalry between the U.S. and Russia evolve, and how will it affect our respective markets over the next 10 years? A broad question like that is much harder to get right than a narrow yes-or-no question. This is another litmus test when judging experts. The broader the question, the less stock you should put in the answer offered by any expert.
Statisticians have understood this for a long time. In 1950, Glenn Brier, a statistician with the U.S. Weather Bureau, developed a tool for assessing the accuracy of weather forecasts. His methodology is now known as Brier scoring and is used to assess all manner of forecasts. I won’t get into the details, but the key takeaway is that it’s only possible to assess the quality of a forecast when the question has a limited set of possible answers.
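For readers curious about those details: the Brier score is simply the average squared gap between a forecaster's stated probability and what actually happened. A minimal sketch, with made-up forecasts for illustration:

```python
# Brier score: mean squared difference between forecast probabilities
# and actual outcomes (1 if the event occurred, 0 if it didn't).
# Lower is better; a perfect forecaster scores 0.0.

def brier_score(forecasts, outcomes):
    """forecasts: predicted probabilities in [0, 1]; outcomes: 0/1 results."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster who confidently called two rainy days and one dry day
# scores far better than one who hedged at 50% every day.
confident = brier_score([0.9, 0.9, 0.1], [1, 1, 0])   # 0.01
hedging = brier_score([0.5, 0.5, 0.5], [1, 1, 0])     # 0.25
```

Note that the score only works because each question had a definite yes-or-no resolution—exactly the limitation described above.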
If experts are opining on something broader, like the evolution of our relationship with Russia, you can certainly listen. But recognize that they’re far out on a limb—and it’s even difficult to measure how far out they are.