IT’S A COMMONLY HELD belief that market risk is a function of time in the market. Simply put, risk falls as our holding period lengthens. This is the notion behind time diversification—the idea that more time allows us to diversify across different investment periods, resulting in reduced risk.
For example, the S&P 500 has historically generated positive returns in nearly every 20-year holding period, even after adjusting for inflation. Armed with this data, one of the first questions financial advisors ask clients concerns their time horizon. The longer clients have to invest, the more risk they can take—or so it’s assumed.
Unfortunately, this assumption is incorrect, at least according to financial theory. It’s certainly true that, as our investment time horizon lengthens, the probability that the average return will be positive increases. But if we’re concerned with the range of potential total returns, risk paradoxically rises with longer holding periods.
How can that be? In the world of finance, risk is often measured using standard deviation. Standard deviation is a measure of how far actual returns deviate from the average historical return. The higher the standard deviation, the greater the risk.
Consider the annual returns of the U.S. stock market. We’ll assume that returns are independent from year to year and have a normal, or bell-shaped, distribution. Let’s also assume an average return of 5% and a standard deviation of 30%.
Under those assumptions, annual returns would fall between -25% and +35%—that is, within one standard deviation of the average—68% of the time. A higher standard deviation implies greater risk because it means bigger swings, up or down, from the average.
For nerdier readers, here’s how this works mathematically. (Those less nerdy can skip to the next paragraph.) Because the annual returns are assumed to be independent, their variances add, so the standard deviation of the cumulative return grows with the square root of time. If the standard deviation of returns over one year is 30%, or 0.3, the standard deviation of the return over n years equals 0.3 × √n. For a 30-year time horizon (n = 30), 30-year returns have a standard deviation of 0.3 × √30, or about 1.64. In other words, standard deviation, or risk, has ballooned from 30% over one year to 164% over 30 years. The upshot: Risk is higher, not lower, over longer time periods.
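For readers who'd like to check the square-root-of-time rule themselves, here's a minimal sketch under the article's assumptions—independent annual returns with a 5% mean and 30% standard deviation. It compares the formula against a Monte Carlo simulation of 30-year return totals:

```python
import math
import random

random.seed(42)

MEAN, SD, YEARS, TRIALS = 0.05, 0.30, 30, 100_000

# Analytical result: variances of independent returns add,
# so the n-year standard deviation is sd * sqrt(n).
analytic_sd = SD * math.sqrt(YEARS)  # 0.3 * sqrt(30), roughly 1.64

# Monte Carlo check: sum 30 independent annual returns many times
# and measure the spread of those 30-year totals.
totals = [sum(random.gauss(MEAN, SD) for _ in range(YEARS))
          for _ in range(TRIALS)]
mean_total = sum(totals) / TRIALS
sim_sd = math.sqrt(sum((t - mean_total) ** 2 for t in totals) / TRIALS)

print(f"analytic 30-year sd:  {analytic_sd:.2f}")  # about 1.64, i.e. 164%
print(f"simulated 30-year sd: {sim_sd:.2f}")       # should land close to it
```

The simulated spread lands within a percentage point or so of the 164% figure, which is the whole point: spread out the same annual volatility over 30 independent years and the dispersion of cumulative outcomes grows, not shrinks.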
But what about the notion that the longer we remain invested in the market, the lower the probability of losing money? The probability of losing money in stocks does indeed fall with longer holding periods. Since 1871, there’s been just one 20-year period when the return of the S&P 500 was negative. From June 1901 to June 1921, its inflation-adjusted return was -4.3%.
Probabilities, however, don’t tell the whole story. Standard deviation also reflects the magnitude of deviations from the average. As the time horizon lengthens, the magnitude of worst-case outcomes grows.
For instance, an investor might ask the following question: “How much does my portfolio stand to lose if stock returns are in the worst 1% of possible outcomes?” In other words, what is my 1% “value at risk” in an awful market scenario?
In our example, the worst 1% one-year scenario would be a 47.7% loss. That’s painful. But the worst 1% 30-year scenario would be a bone-crushing loss of 90.2%. In short, risk—at least as measured by standard deviation—actually increases with time. Why? Just as gains compound, losses can, too. In a worst-case scenario, multiple losing years can result in a devastating cumulative loss.
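Those two figures can be reproduced with a few lines of code. One caveat: the -47.7% and -90.2% numbers line up exactly if the 5% average and 30% standard deviation are read as describing continuously compounded (log) returns, which is a common modeling convention; the sketch below assumes that reading.

```python
import math
from statistics import NormalDist

MU, SIGMA = 0.05, 0.30                # assumed mean and sd of annual log returns
Z01 = NormalDist().inv_cdf(0.01)      # 1st-percentile z-score, about -2.33

def worst_1pct_return(years: int) -> float:
    """Cumulative return at the worst-1% outcome over `years` years."""
    # Independent annual log returns sum to a normal distribution with
    # mean MU * n and standard deviation SIGMA * sqrt(n).
    log_return = MU * years + Z01 * SIGMA * math.sqrt(years)
    return math.exp(log_return) - 1   # convert back to a simple return

print(f"1-year worst 1%:  {worst_1pct_return(1):.1%}")   # -47.7%
print(f"30-year worst 1%: {worst_1pct_return(30):.1%}")  # -90.2%
```

Note how both pieces of the article's argument show up in one line of the function: the expected log return grows in proportion to the years invested (which is why the probability of a loss falls), but the dispersion grows with the square root of the years (which is why the worst-case loss deepens).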