The old school stock market I spent most of my career as an equity portfolio manager in had two characteristics: it was anticipatory and it was granular.
Armies of securities analysts, employed both by investment managers and by brokerage houses, pored over company financials, met regularly with managements (in public meetings sponsored by the local security analyst society, in private, or by phone) to discuss operations, and ultimately constructed elaborate spreadsheets from which they made informed earnings estimates. For a growth investor, the goal was to buy stock in companies where one's spreadsheets indicated earnings would be much better than the consensus realized, and for longer than the consensus dreamed possible.
In the 1990s, investment managers realized they could save money by eliminating in-house staffs and relying instead on research created by brokerage houses. During the 2008-09 financial crisis, the big investment banks laid off virtually all their highly paid veteran analysts and replaced them with neophytes, on the theory that trading, not research, was their bread and butter.
All in all, these developments are a big plus for individual investors like you and me, I think. Yes, some of today's brokerage researchers are still very good. But they're closer to polished high school athletes than to Jacob deGrom or Aaron Rodgers. So it's much easier today for us to know more than the market does by focusing on a few areas or companies we're interested in.
What do professionals do today? Given that I haven't worked in the industry for a decade and a half, I don't know for sure. Looking at the rhythms of the markets, however, I'm convinced today's Wall Street is reactive, not anticipatory, and is focused on broad economic factors, not the nuts and bolts of individual companies.
By reactive, I mean professionals now concentrate on making an immediate, strong, presumably computer-driven response to news developments. A good example, I think, is the stock and bond market plunges that followed Monday's release of the minutes of the December Fed meeting, at which the Fed discussed the possibility of raising interest rates sooner than expected. Hints of this had been floating around for a while, but hard news seems to have triggered the selling.
The decision about what to sell most aggressively seems to me to be shaped by one form or another of factor analysis, which has been used in financial markets since at least the 1980s. The idea is to describe a given stock as a bundle of characteristics, or factors, based on a statistical analysis of its past behavior, and then to trade the (smaller and simpler to figure out) set of factors rather than the big universe of stocks. Factors can be very traditional, like financial leverage or profitability. Others can be highly specific: one used early on, for example, was "sensitivity to changes in the oil price." Loss-making tech companies might be another, more contemporary one.
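To make the idea concrete, here is a minimal sketch of the statistical step described above: estimating a stock's factor exposures ("loadings") by regressing its past returns on a small set of factor returns. The factor names, the numbers, and the simulated data are all hypothetical, chosen only to illustrate the technique, not any particular firm's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 250  # roughly one year of daily returns

# Hypothetical daily returns for three factors, e.g. oil-price sensitivity,
# financial leverage, and profitability (all simulated here)
factors = rng.normal(0.0, 0.01, size=(n_days, 3))

# Simulate a stock whose returns load on those factors plus idiosyncratic noise
true_loadings = np.array([0.8, -0.3, 0.5])
stock = factors @ true_loadings + rng.normal(0.0, 0.005, size=n_days)

# Ordinary least squares: solve for the loadings that best explain the
# stock's past behavior (intercept column plus the three factor columns)
X = np.column_stack([np.ones(n_days), factors])
betas, *_ = np.linalg.lstsq(X, stock, rcond=None)
estimated_loadings = betas[1:]
print(estimated_loadings)  # recovers values near [0.8, -0.3, 0.5]
```

Once every stock in the universe has been described this way, a desk reacting to, say, an oil-price shock can trade the handful of factor exposures instead of analyzing thousands of individual companies, which is exactly the economy the approach is after.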
Factor systems have a checkered history in terms of creating outperformance. They have two advantages, though. They're cheaper and easier to maintain than hiring and training a bunch of human securities analysts. Also, potential clients feel much more comfortable buying "objective," computer-driven systematic investment schemes than relying on a potentially quirky human. It's a much better excuse if things turn out badly.
My experience is that factor systems tend to overreact. This characteristic is amplified by the twitchy nature of today’s trend toward ultra-fast reaction to news. Two implications:
–there will always be opportunities to upgrade holdings during a downdraft like the one we’re in now, and
–not so relevant for now, but we should also arguably be more willing to take partial profits on a stock that's going through the roof.