liquidity and stock price changes

daily liquidity and price movements

Liquidity has a lot of different meanings.  Right now, though, I just want to write about what I think is making stocks yo-yo to and fro on any given day.


The default response by market makers–human or machine–to a large wave of selling of the kind algorithms seem to trigger is to move the market down as fast as trading regulations allow.  This serves a number of purposes:  it minimizes the unexpected inventory a market maker is forced to take on at a given price; it allows the market maker to gauge the urgency of the seller; the decline itself eventually discourages sellers with any price sensitivity, so the selling dries up; and it reduces the price the market maker pays for the inventory he accumulates.

A large wave of buying works in the opposite direction, but with the same general result: market makers sell less, but at higher prices and end up with less net short exposure.


From my present seat high in the bleachers, it seems to me the overall stock market game–to make more/lose less than the other guy–hasn’t changed.  But we’ve gone from the old, human-driven strategy of slow anticipation of likely news not yet released to violently fast computer reaction to news as it’s announced.

Today’s game isn’t simply algorithmic noise, though.  Apple (AAPL), for example, pretty steadily lost relative performance for weeks in November, after it announced it would no longer disclose unit sales of its products.  Two points:  the market had no problem in immediately understanding that this was a bad thing (implying humans were likely involved)   …and the negative price reaction continued for the better part of a month (suggesting that something/someone constrained the race to the bottom).  As it turns out, decision #1 was good and decision #2 was bad.  Presumably, short-term traders will make adjustments.

my take

On the premise that dramatic daily shifts in the prices of individual stocks will continue for a while:

–if investors care about the high level of daily volatility, its persistence should imply an eventual contraction in the market PE multiple.  Ten years of a rising market probably imply that this won’t happen overnight, if it occurs at all.

–individual investors like you and me may have more time to research new companies and establish positions, if the importance of discounting diminishes

–professional analysts may only retain their relevance if they actively publicize their conclusions, trying to trigger algorithmic action, rather than keeping them closely held and waiting for the rest of the world to eventually figure things out

–the old (and typically unsuccessfully executed) British strategy of maintaining core positions while dedicating, say, 20% of the portfolio to trading around them, may come back into vogue.  Even long-term investors may want to establish buy/sell targets for their holdings and become more trading-oriented as well

–algorithms will presumably begin to react to the heightened level of daily volatility they are creating.  Whether volatility increases or declines as a result isn’t clear


discounting in the age of algorithms

what discounting is

In traditional Wall Street parlance, discounting is factoring into today’s prices the anticipated effect of expected future events.  Put another way, in the best possible case, it’s buying a stock for, say, $0.25 extra today, thinking that in a week, a month or a year, news will come out that makes the stock worth $1, or $10, or $100 more than it is today.
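The arithmetic behind that trade can be sketched as a simple expected-value calculation.  All of the numbers below (the probability, the size of the gain) are hypothetical, chosen only to illustrate the idea:

```python
# Toy discounting arithmetic (all figures hypothetical).
# Pay a small premium today for a chance that future news
# adds much more than that premium to the stock price.
premium = 0.25          # extra paid per share today
prob_news = 0.60        # assumed chance the good news materializes
gain_if_news = 10.00    # assumed per-share gain if it does

expected_gain = prob_news * gain_if_news - premium
print(f"expected per-share edge: ${expected_gain:.2f}")
```

The point is only that a small, certain premium paid today can be worth it when the potential payoff is many times larger, even after discounting for the chance the news never arrives.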

two components

They are:

–having/developing superior information, and

–correctly gauging what effect dissemination of the news will have on the stock.

In my experience, the first of these is the easier task.  Also, the answer to the second problem will likely be imprecise.  In most cases, “The stock will go up a lot when people understand x” is good enough.

examples

In the early days of the Apple turnaround, the company launched the iPod, which ended up doubling the company’s size.  So the key to earnings growth for AAPL was the rate of increase in iPod sales.  The heart of the iPod back then was a small form factor hard disk drive.  There were only two suppliers of this component, Hitachi and Seagate (?), so publicly available information on production of the small HDDs had some use.  Much more important, however, was that there was only one supplier of the tiny spindles the disks rotated around.  And, unknown to most on Wall Street, that small Japanese firm published monthly spindle production figures, which basically revealed AAPL’s anticipated sales.

Same thing in the early 1980s.  Intel chips ran so hot that they had to be encased in ceramic packaging–for which there was only one, again Japanese, source, Kyocera.  Again, monthly production figures, in Japanese, were publicly available.

In both cases, the production figures were accurate predictors of AAPL (INTC) unit sales a few months down the road.  Production ramp-up/cutback information, again public–though not easily accessible–data, was especially useful.

Third:  Back in the days before credit card data were widely available, retail analysts used to look at cash in circulation figures that the Federal Reserve published to gauge the temper of yearend holiday spending intentions.  The fourth-quarter rally in retail stocks sometimes ended in early December if the cash figures ticked down.

In all three cases, clever analysts found leading indicators of future earnings.  As the indicators became more widely known, Wall Street would begin to trade more on the course of the indicators rather than on the actual company results.
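The spindle and ceramic-packaging stories boil down to testing whether a supplier’s production series leads the company’s unit sales by some number of months.  A minimal sketch of that test, using entirely invented figures and a hand-rolled Pearson correlation:

```python
# Sketch: does a monthly supplier-production series lead unit
# sales by a few months?  All figures below are invented.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

spindles = [100, 110, 125, 150, 180, 220, 260, 300]  # supplier output
sales    = [ 90,  95, 100, 112, 128, 155, 182, 225]  # device sales

# Compare candidate lags: does production k months ago
# line up with sales today?
for lag in range(4):
    r = pearson(spindles[:len(spindles) - lag], sales[lag:])
    print(f"lag {lag} months: r = {r:.3f}")
```

A high correlation at a positive lag is the statistical footprint of a leading indicator; the analyst’s edge came from knowing the series existed before everyone else did.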

today’s world

Withdrawal of brokerage firms from the equity research business + downward pressure on fees + investor reallocation toward index investing have made traditional active management considerably less lucrative than it was during my working career.

A common response by investment firms has been to substitute one or two economists and/or data scientists for a room full of 10k-reading securities analysts who developed especially deep knowledge of a small number of market sectors.  As far as I can see, the approach of the algorithms the economists/programmers employ isn’t much more than to react quickly to news as it’s being disseminated.  (They may also be looking for leading indicators, but, if so, I don’t see any notable success.  Having seen several failed attempts–and having worked at the one big 1950s–1970s success in this field, Value Line–I’m not that surprised at this failure.)

My thoughts: 

–there’s never been a better time to be a contrarian.  Know a few things well and use bouts of algorithmic craziness to trade around a core position

–for anyone who is willing to spend the time watching trading during days like Wednesday, there’s also lots of information to be had from how individual stocks move.  In particular, which stocks fall the most but barely rebound?  Which fall a little but rise a lot when the market turns?  Which are just crazy volatile?

technical analysis in the 21st century

A reader asked last week what I think about technical analysis.  This is my answer.

what it is

Technical analysis in the stock market is the attempt to predict future stock prices by studying current and past patterns in the buying and selling of stocks, stock indices and associated derivatives.  The primary focus is on price and trading volume data.
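To make the definition concrete, here is one of the simplest technical rules in existence, a moving-average crossover on daily closing prices.  The prices are invented and the rule is purely illustrative, not a recommendation:

```python
# A minimal technical rule: a short moving average crossing
# above a longer one is read as "bullish" (prices invented).
closes = [10.0, 10.2, 10.1, 10.4, 10.8, 11.0, 10.9,
          11.3, 11.6, 11.4, 11.9, 12.1, 12.0, 12.4]

def sma(prices, window):
    """Simple moving average over the most recent `window` prices."""
    return sum(prices[-window:]) / window

short, long_ = sma(closes, 3), sma(closes, 10)
signal = "bullish" if short > long_ else "bearish"
print(f"3-day SMA {short:.2f} vs 10-day SMA {long_:.2f}: {signal}")
```

Notice that the rule consumes nothing but price history; that data-only diet is what separates technical from fundamental analysis.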

Technical analysis is typically contrasted with fundamental analysis, the attempt to predict future stock prices by studying macro- and microeconomic data relevant to publicly traded companies.  The primary sources of these data are SEC-mandated disclosure of publicly traded company operating results and government and industry economic statistics.

what the market is

The stock market is the intersection of the objective financial/economic characteristics of publicly traded companies with the hopes and fears of the investors who buy and sell shares.  Fundamental analysis addresses primarily the companies; technical analysis primarily addresses the hopes and fears.

ebbing and flowing

To be clear, I think there’s an awful lot of ridiculous stuff passing itself off as technical “wisdom.”  The technical analyst’s bible (which I actually read a long time ago), the 1948 Technical Analysis of Stock Trends by Edwards and Magee, is now somewhere in my basement.  I’ve never been able to make heads or tails of most of it.

On the other hand, in the US a century ago–and in markets today where reliable company financials aren’t available–individual investors had little else to guide them.

the old days–technicals rule (by default)

What individual investors looked for back then was unusual, pattern-breaking behavior in stock prices–because they had little else to alert them to positive/negative company developments.

I think this can still be a very useful thing to do, provided you’ve watched the daily price movements of a lot of stocks over a long enough period of time that you can recognize when something strange is happening.

the rise of fundamental analysis

Starting in the 1930s, federal regulation began to force publicly traded companies to make fuller and more accurate disclosure of financial results.  The Employee Retirement Income Security Act (ERISA) of 1974 mandated minimum levels of competence in the management of pension plan assets, laying the foundation for the fundamentals-driven securities analysis and portfolio management professions we have today in the US.

past the peak

The rise of passive investing and the rationalization of investment banking after the financial crisis have together reduced the amount of high-quality fundamental research being done in the US.  Academic investment theory, mostly lost in its wacky dreamworld of efficient markets, has never been a good training ground for analytic talent.

The waning of the profession of fundamental analysis is opening the door, I think, to alternatives.

algorithmic trading

Let’s say it takes three years working under the supervision of a research director or a portfolio manager to become an analyst who can work independently.  That’s expensive.  Plus, good research directors are very hard to find.  And the marketing people who generally run investment organizations have, in my experience, little ability to evaluate younger investment talent.

In addition, traditional investment organizations are in trouble in part because they’ve been unable to keep pace with the markets despite their high-priced talent.

The solution to beefing up research without breaking the bank?  Algorithmic trading.  I imagine investment management companies think that this is like replacing craft workers with the assembly line–more product at lower cost.

Many of the software-engineered trading products will, I think, be based on technical analysis.  Why?  The data are readily available.  Often, also, the simplest relationships are the most powerful.   I don’t think that’s true in the stock market, but it will probably take time for algos to figure this out.

My bottom line:  technical analysis will increase in importance in the coming years for two reasons:  the fading of traditional fundamental analysis, and the likelihood that software engineers hired by investment management companies will emphasize technicals, at least initially.


Knight Capital and its algorithmic trading snafu

Knight Capital

Though not particularly well known to individual–and even some professional–investors, Knight Capital is a very large market-making and trading broker in the US equity market.

algorithmic trading

Algorithmic traders, or “algos,” are typically IT-savvy arbitrageurs.  Like any other arb firms, their business is finding and exploiting differences in the pricing of identical, or very similar, instruments.  Algos differ from traditional arbitrageurs in that they use computer programs to do their searching for them.  That way they can cover more ground than humans, potentially trading more quickly and spotting more opportunities. Computers also execute their trades.
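At its core, the arbitrage scan an algo runs is conceptually simple: compare quotes for the same instrument across venues and flag any spread wide enough to cover trading costs.  The venues, quotes and cost figure below are all invented:

```python
# Toy version of an arbitrage scan: compare quotes for one
# instrument across venues and flag a spread that exceeds
# assumed trading costs (all figures invented).
quotes = {"venue_a": 100.02, "venue_b": 100.10, "venue_c": 100.05}
cost_per_share = 0.03  # assumed round-trip trading cost

cheap = min(quotes, key=quotes.get)   # buy here
rich = max(quotes, key=quotes.get)    # sell here
spread = quotes[rich] - quotes[cheap]

if spread > cost_per_share:
    edge = spread - cost_per_share
    print(f"buy on {cheap}, sell on {rich}: edge ${edge:.2f}/share")
```

The real-world difficulty isn’t the comparison itself but doing it across thousands of instruments fast enough that the spread still exists when the orders arrive, which is why computers do both the searching and the executing.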

On Wednesday morning, Knight Capital was running for the first time an algorithmic trading program it had apparently developed itself.  The story isn’t 100% clear, but it sounds to me as if Knight was hoping to create an algo service that could be used by individual investors.  In any event, Knight’s computers started churning out buy orders for about 150 stocks at the opening bell.  But the quantities being asked for were huge–much larger than Knight had intended.  And some of the stocks in the bundle were, well, weird.

One of the more offbeat selections was Wizzard Software (WZE).

The issue had closed on July 31st at a stock price of $3.50, on volume of 15,067 shares.  Wizzard provides home health care staff in the West, resells podcasting services from AT&T and Verizon, and, yes, it apparently also develops corporate software.  In 2011 WZE had total revenue of about $6.5 million, and lost money.

From the chart I looked at, Knight’s initial order for WZE seems to have been for an astoundingly large 150,000 shares.  That’s a bit less than 2% of the company.  It’s also at least two weeks’ total trading volume. (My guess is that it would take several months to accumulate that amount, if you wanted to do so without moving the price much.  And then, of course, absent a sharp reversal of WZE’s fortunes, you’d have much greater difficulty getting back out.)
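A back-of-the-envelope check on those two claims, using the figures from the post (the share count is my rough implied assumption, backed out from the “a bit less than 2%” remark):

```python
# Sanity-checking the size of the errant WZE order.
order_shares = 150_000
daily_volume = 15_067      # WZE volume on July 31
shares_out = 8_000_000     # implied share count (my assumption)

days_of_volume = order_shares / daily_volume
pct_of_company = order_shares / shares_out * 100
print(f"{days_of_volume:.1f} trading days of volume, "
      f"{pct_of_company:.1f}% of the company")
```

The order works out to roughly ten trading days of volume, i.e. about two calendar weeks, and just under 2% of the company, consistent with the figures above.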

It reportedly took Knight almost an hour to figure out that something had gone wrong with its software.  Rival market makers were much quicker off the mark and were providing boatloads of stock to Knight at ever-rising prices.  When the music finally stopped, WZE was close to $12.  WZE was one of six stocks where erroneous trades were cancelled by market officials.

But that left around 146 issues where the Knight orders weren’t simply torn up.  The firm accidentally owned massive (for it) amounts of stock it didn’t want.  Once it realized what was going on, Knight cancelled any remaining buy orders and began dumping the stock it had just acquired.  The company estimates it lost $440 million on Wednesday because of the software glitch!  (To be clear, I think Knight made the correct decision in selling immediately.  The gaffe was too big and too public for it to hope it might trade out of its positions slowly and quietly.)

press comment misguided

Most of the press stories about this incident have revolved around the idea that computerized trading is undermining the confidence of traditional long-only investors, especially individuals, in the integrity of the stock market and the desirability of holding equities for the long term.  I think the stories are crazy.

For one thing, the S&P 500 only rose about five points in early trading on Wednesday–and then went sideways for most of the day.  If you weren’t a day trader, it may well be that the first you heard of the Knight Capital fiasco was on the news Wednesday night.  Or it might have been the paper on Thursday morning.

The real story?

Consider what has happened to Knight because of its foray into algo trading.

–Its stock has lost about three-quarters of its market value in just the past two trading days.

–The Financial Times reports that major clients–Vanguard, E*Trade and TD Ameritrade among them–have shifted orders to other market makers.  They did this initially at Knight’s request.  Clients are remaining mum for now.  Certainly, no one I’m aware of is saying the crisis is over and they’ve gone back to business-as-usual with Knight.  The silence on this score suggests clients think Knight may be badly enough wounded that counterparty risk is a concern.

–According to the FT, Knight has hired an investment banker to help it consider its options, including a merger or sale of the firm.

In other words, it’s conceivable that the management that built the company may soon no longer be in control of it.

I can’t imagine this snafu makes anyone more eager to get involved in algorithmic trading.  Quite the opposite.  The Knight experience may become the cautionary tale that prevents the spread of algo trading away from specialists and into the mainstream of equity trading.