I’ve been thinking recently about Adam Smith, his Wealth of Nations (1776), and the concept of the “invisible hand.” This idea is that in a world where everyone acts in his own rational self-interest, somehow the “invisible hand” of the market arranges affairs so that a favorable outcome, if not the most favorable outcome, results.
The concept of the “invisible hand” has been used in modern finance and economics to justify such prominent ideas as the (supposed) self-adjusting, self-regulating character of financial markets, the stabilizing nature of speculative activity in commodities markets and the efficient markets hypothesis. In other words, it has spawned some of the most misguided and counterfactual theorizing ever.
So I decided to go back and look at what Smith actually had to say. I was surprised by what I found.
Wealth of Nations
Smith introduces the idea of the ‘invisible hand’ in an indirect way. The only general statement he makes is that he distrusts the motives of those who claim to be acting in the public interest, saying no good comes from this. Rather, he says, the best economic results usually come from those who are acting in their own economic self-interest. He also offers the example of the decision whether to manufacture goods in the home country or abroad. The best result comes, not from those who claim patriotic motives for their trade, but from those who buy and sell the most commercially viable products, no matter what their country of origin. In so doing, the merchant is led by an “invisible hand” to achieve a result he doesn’t intend, but which benefits society as a whole–namely, putting capital into the control of people who can use it most productively.
Smith, however, doesn’t assert that self-interested action always or generally achieves a favorable result, although he clearly seems to believe that it does. He’s much more certain that the claim to be acting in the public interest is a bogus one.
Leibniz’ Theodicy (1710)
What really strikes me about Smith, however, is how he seems to frame his discussion in terms of the eighteenth-century theological debate about how to reconcile the existence of evil in a world created by an all-powerful and benevolent God. That shouldn’t come as a particular surprise (although it did to me), since Smith started out studying and later teaching ethics at the university in Glasgow.
The state of the art on this topic was, I think, Leibniz’ Theodicy. His twist on the issue was to say that the world we live in, created by God, is the best of all possible worlds. One may well be able to envision better worlds than ours, but this isn’t an argument that God has created a relative clunker. Any such imagined world simply isn’t possible.
Today, we’d probably call this a linguistic trick. And Leibniz was parodied by Voltaire’s Dr. Pangloss in Candide. But Leibniz’ conclusion that the sum total of all human activity in the world, everyone acting in his own self-interest and without knowledge of God’s master plan, creates the maximum possible good was okay for eighteenth-century Europe.
Compare Leibniz with Smith.
Smith certainly knew the Leibniz argument. His is basically the same. But he presents his “invisible hand” as a generalization from experience, rather than a law–as one might expect from a colleague of David Hume. And he omits the part of the argument, such as it is, that gives it its logical force–that a transcendent being made the world so it works this way. In its place, Smith offers the notion that at some basic level people feel empathy toward one another and wish each other well.
Fine for the eighteenth century, not for the twentieth or twenty-first.
Schopenhauer and Freud
Also not fine for the nineteenth century. That’s when a series of thinkers, the first of whom was Arthur Schopenhauer in his World as Will and Representation (1818), began to take seriously the possibility that the world was not the open book previously thought, but instead a place where unconscious deception could color all we see and do.
These ideas were refined and popularized by a number of medical practitioners toward the end of the century, of whom the most famous was Sigmund Freud.
The nineteenth century, then, opened the intellectual door for theorists to consider a world where mass delusions, wars, panics, manias and bubbles–all irrational, destabilizing movements–are possible. In other words, they opened the door to the real world.
Oddly enough, it’s only in the last generation that academic economists have begun to factor these nineteenth-century achievements into their thinking. Finance professors have not yet come to the party.
what does this have to do with investing?
To me, what I’ve written above suggests that academic practitioners of economics and finance are long on mathematical technique, but short on knowledge of cultural and intellectual historical developments of the past two or three centuries.
Since dubious academic ideas have given a patina of respectability to the regulatory laxity that has produced the financial crisis we’re struggling our way out of, maybe it’s time for a more general rethink of the presuppositions regulators are using. Maybe it’s also time for universities to supplement their stables of equation jockeys with people having more well-rounded intellectual backgrounds–maybe even with people who have practical experience in economics or finance. If so, maybe we wouldn’t have a repeat of Mr. Greenspan testifying before Congress that he was shocked that financial markets didn’t heal themselves. After all, God did create the best of all possible worlds.
I’m going to write here about what I perceive the political dynamics of inflation in the United States to have been in the past century. I presume, but don’t know, that the same process can and has occurred elsewhere.
The latter part of the nineteenth century and the first half of the twentieth were times of what amounts to class struggle in the US between business and labor. Issues ran the gamut from child labor and workplace safety to wage levels and unionization.
The sides coalesced around two political parties, the Democrats representing labor and the Republicans defending business.
(This struggle is basically over today, I think–leaving both major parties trying without a great deal of success so far to redefine themselves. For good or ill, most Americans no longer draw a sharp distinction between management and labor. This is partly because the nature of work has changed, partly because most Americans consider themselves part of management.)
During the time when workers were fighting for what we would now regard as basic, and self-evident rights, inflation became a significant weapon in the battle. How so?
inflation and bonds
A conventional bond is a series of interest payments made to the holder plus return of principal at the end of the bond’s term. The present value, or value today, of the bond is the sum of all these payments, each discounted back to the present using an appropriate interest rate. The higher the interest rate employed, the lower the present value.
A rising inflation rate erodes the present value of a bond. If, for example, inflation is at 3% when the holder purchases the bond, the buyer may be content with a 6% coupon. He receives $60 a year in interest payments and his $1000 back at the end of the bond’s term. The interest payments offset inflation and provide a real return of 3% annually.
Suppose the inflation rate rises to 7% immediately after the holder purchases the bond. Suddenly, he is no longer receiving a real return on his money. Part of the purchasing power of his investment is disappearing, due to the higher rate of inflation.
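The discounting mechanics can be sketched in a few lines of Python. The $1000 face value, 6% coupon, and 3%/7% inflation figures come from the example above; the 10-year term and the assumption that the buyer still demands a 3% real return (so a 10% discount rate after inflation jumps) are illustrative choices of mine, not from the text.

```python
# Present value of a conventional bond: each coupon payment plus the
# principal repayment, discounted back to today at the chosen rate.
def bond_present_value(face, coupon_rate, years, discount_rate):
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + discount_rate) ** t for t in range(1, years + 1))
    pv_principal = face / (1 + discount_rate) ** years
    return pv_coupons + pv_principal

# Priced at par when the market demands the 6% coupon rate...
at_par = bond_present_value(1000, 0.06, 10, 0.06)

# ...but worth far less if 7% inflation pushes the required rate to 10%
# (7% inflation plus the same 3% real return).
after_inflation = bond_present_value(1000, 0.06, 10, 0.10)

print(round(at_par, 2))           # 1000.0
print(round(after_inflation, 2))  # roughly 754
```

The higher discount rate shrinks every term in the sum, which is the “erosion” the paragraph describes.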
Conversely, the seller benefits from an increase in the inflation rate, since that results in a real decline in the value of the payments he has agreed to make to the holder.
back to politics
It seems to me that during the late nineteenth and early twentieth centuries a basic assumption of the Democrats, the party of labor, was that its constituents held no physical or financial assets. In fact, many might be net borrowers, or, as the financial world would put it today, be “short” financial assets. Their main source of economic worth was their ability to sell their labor.
In contrast, Republicans thought of their constituents as the “longs,” wealthy bond-coupon clippers, with ownership of vast amounts of physical and financial assets.
two opposing agendas
These differences set the agendas of the two parties. If the Democrats were in power, they could attempt to transfer wealth from business to labor overtly by increasing taxes on the wealthy and/or by raising benefits provided by the government to workers. Or they could do so covertly by establishing economic policies that induce inflation. That would decrease the wealth of the old time robber barons–and at the same time it would lessen the real value of the loans workers had taken out from them.
When the Republicans were in power, they would start to undo the policies initiated by the Democrats, by trying to balance the government’s books and by fighting inflation with restrictive economic policies.
I think this is the way Washington worked even through the 1970s.
the new order
Not any more, though.
The nineteenth century model was one of massive capital investment in plant and equipment (think: blast furnace steel) operated by manual labor. Accelerating rates of technological change have destroyed that economic model. Who are today’s economic heroes?–Google, Apple, Amazon, Pixar, biotech… They are relatively small groups of highly educated people creating service businesses that require little physical capital, many of them using the internet as a substitute for having a large advertising budget and extensive physical distribution facilities.
the old dynamic reborn
At present, most domestic economists are praying for any sign of inflation to emerge, simply to give the US some breathing room against the possibility of deflation.
Beyond this, however, inflation has reemerged as a political issue in the US. The new dynamic has arisen from the fact that Washington has borrowed heavily from foreign governments–notably Japan and China–as well as from domestic sources.
So the drama of the first half of the twentieth century has been recast, with the Chinese in the role of big business and Washington in the role of labor. It is certainly tempting to lawmakers to attempt to repay foreign creditors in inflation-diminished dollars rather than having to raise tax revenues large enough to cover the entire real amount owed. On the other hand, China, sensing this line of thought, has been increasingly vocal over the past year or so in its concern that Washington protect the purchasing power of the dollar through economic orthodoxy.
This new drama is still in rehearsals. The collapse of the euro has meant it won’t need to open on Broadway any time soon. But it will still be important to monitor how the play is shaping up.
Because the two words, inflation and deflation, look alike, they invite the conclusion that there’s a single phenomenon–“-flation”–that comes in two varieties, de- and in-. As a practical matter, despite the similar names, inflation and deflation are actually quite different in how they affect an economy. In the US at present, knowledgeable politicians (an oxymoron?) and economists have their fingers crossed that inflation somehow resurfaces and that deflation will not become an issue.
An economy with inflation is one where the price of things in general is rising. It isn’t enough that some prices are rising–even very visible prices like gasoline or movie tickets. In an inflationary economy, overall prices have to be rising, so that the cost of living steadily goes up. (I wrote about inflation more extensively in a post from May 25, 2009.)
In a developed economy like the US, the only price that really counts for inflation is the price of labor.
If inflation had a tendency to stay well-behaved, at a constant, low rate, it wouldn’t be much of a problem. But it usually doesn’t do either. One way to think about what happens is this:
in an inflationary environment, some people underestimate inflation. They think prices will rise by, say, 3% in the coming year. They ask for and get a 3% wage increase. But inflation turns out to be 4%, so in real (i.e., adjusted for price-level changes) terms they are making less than they used to. So the following year, they ask for a 6% raise. Others ask for and receive a 5% raise, so they’re better off in real terms than before. So they try to do the same thing the following year. As a result, the rate at which prices are rising tends to increase.
At some point, expectations change. Companies start to raise the prices of their output and individual wage earners up their wage demands in anticipation of, and as protection against, future inflation increases. In doing so, they create the increased inflation they fear.
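The catch-up dynamic described in the last two paragraphs can be sketched as a toy simulation. All the numbers here are illustrative: workers demand last year’s actual inflation plus a one-point protective cushion, and wage growth passes fully through into next year’s prices.

```python
# Toy wage-price spiral: each year workers demand last year's inflation
# plus a cushion, and wage growth feeds back fully into prices.
# Entirely illustrative parameters, not a forecast.
def spiral(initial_inflation=0.03, protection=0.01, passthrough=1.0, years=5):
    inflation = initial_inflation
    path = [inflation]
    for _ in range(years):
        wage_demand = inflation + protection   # catch up, plus protection
        inflation = passthrough * wage_demand  # wages feed into prices
        path.append(inflation)
    return path

print([round(x, 3) for x in spiral()])  # [0.03, 0.04, 0.05, 0.06, 0.07, 0.08]
```

Even with these tame assumptions the inflation rate ratchets up a point a year, which is the self-fulfilling acceleration the text describes.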
As inflation accelerates, people start spending more and more time defending against future price increases and trying to work the situation in their favor. This means less time doing productive work. At more advanced stages, capital investment in long-term projects slows, because figuring out its profitability may depend on forecasting accurately what inflation will be ten years hence–which has become impossible. For the same reason, no one wants to hold fixed income securities, including government debt.
In the worst case, hyperinflation (think: Japan or Germany close to a century ago, or Brazil twenty years ago), the economy comes close to collapse.
The (relative) good news about inflation is that it’s a well-understood phenomenon. Any government knows what to do to remedy the situation: restrictive policy (higher interest rates, plus maybe less government spending and higher taxes) until inflation begins to decline and expectations in the economy change. The real stumbling block to an inflation cure is having the political will to implement it and a Paul Volcker-like central banker to oversee the process.
In its definition, deflation is the opposite of inflation. It’s a steady, general fall in the price level. To my mind, three factors make deflation something different from a mirror image of inflation.
1. Deflation is weird. Other than the Great Depression or the Weimar Republic, it hasn’t occurred very often in the contemporary world. Other than maybe the PC industry, no one is set up either psychologically or institutionally for deflation. Suppose prices were falling at a steady annual rate of 2%. What would you think of a government bond where you paid $1000, received no interest income and got back $900 in ten years? Me, too. Credit creation, and all the economic activity that depends on it, would stop dead in its tracks.
2. Deflation makes outstanding debt that carries a positive nominal interest rate (in other words, all of it) a crushing burden. Prices dropping 2% per year means, among other things, wages dropping 2% annually. Let’s change the rate to 5% just to make the point easier to see. At the end of five years, you’re making 77% of what you were before deflation hit (ignore the fact that falling wages suggest widespread unemployment and other horrible economic problems). Yes, the cost of food and clothing has probably fallen in line with your income, but your mortgage and credit card payments haven’t. If your credit payments were 25% of your income pre-deflation, they’re a third–and rising–of your income now.
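The arithmetic in point 2 checks out directly, using the figures from the text (5% annual deflation, debt service starting at 25% of income):

```python
# Five years of 5% annual deflation: wages fall in step with prices,
# but a fixed nominal debt payment does not.
wage = 1.0       # pre-deflation income, normalized
payment = 0.25   # debt service starts at 25% of income
for year in range(5):
    wage *= 1 - 0.05  # wages shrink 5% a year

print(round(wage, 2))            # 0.77 -> 77% of the pre-deflation wage
print(round(payment / wage, 2))  # 0.32 -> debt service is now about a third of income
```

The debt burden rises not because the debt grew, but because the income servicing it shrank.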
The situation is worse for companies with operating leverage, whose profits can quickly disappear. Imagine, too, the state of private equity or commercial real estate, which depend on high levels of financial leverage for their viability. They’re toast.
This, of course, has knock-on negative effects on the banking system. Look at the Thirties.
What a mess!
3. Traditional monetary policy becomes ineffective. The orthodox central bank response to recession is to lower short-term interest rates until they’re negative in real terms. The fact that finance is in effect free is supposed to stimulate borrowing, and therefore reinvigorate economic activity. But the central bank can’t push nominal (i.e., not adjusted for inflation/deflation) short rates below zero. So in a deflationary environment, the central bank can’t achieve the “free money” outcome.
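The zero-bound problem in point 3 reduces to simple arithmetic: the real short rate is roughly the nominal rate minus inflation, and the nominal rate is floored at zero. A stylized sketch (the 3% inflation and 2% deflation figures are my illustrative choices):

```python
# Real short rate ~= nominal rate minus inflation. The best (lowest) real
# rate a central bank can reach is the one at a zero nominal rate.
def lowest_real_rate(inflation, nominal_floor=0.0):
    return nominal_floor - inflation

print(lowest_real_rate(0.03))   # -0.03: with 3% inflation, real rates can go negative
print(lowest_real_rate(-0.02))  # 0.02: with 2% deflation, real rates stay positive
```

Under deflation, even a zero nominal rate leaves borrowers paying a positive real rate, so “free money” is out of reach.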
This means that a country depends completely on fiscal stimulus–increased government spending–to help the economy improve. But legislative action may be slow. There’s huge potential for spending programs to be applied in pork barrel ways that will do little more than run up the government’s debt burden (think: Japan since 1990).
where are we now?
There’s good news and bad news, in my opinion. Bad news first.
Government stimulus programs seem to me to have so far been focused on whatever is “shovel ready,” without much thought about addressing long-term structural problems like education. Maybe that will change. But to date Washington looks scarily like Tokyo circa 1990.
The good news–
Europe’s pain is our gain. US government spending depends on the continuing willingness of foreigners, notably China, to lend Washington money. Prior to the Athens-induced collapse of the euro, Beijing appeared to be warming up to shifting its lending activity away from the US. Not any more. So no matter how inefficient government stimulus may be, at least it does something positive, and it won’t come to a screeching halt.
Also, lots of companies are announcing that business has become good enough that they are beginning to raise wages again and reinstitute benefits cut during the recession. Given that wages are the most important element of changes in the price level in the US, this suggests that the current near-zero inflation rate is a cyclical low point and that the price level will rise from here. To some extent, this movement in the private sector will be offset by changes in state and local government workers’ payrolls (some studies claim that municipal employees are now paid 20% more than private sector workers for the same jobs). Still, I think the private sector trend is grounds for a loud sigh of relief.
The original, very successful, Wal-mart concept was to open general merchandise stores on the outskirts of towns with a population of 250,000 or less. These Wal-Marts offered a combination of one-stop shopping and low prices that small local merchants found impossible to match, let alone beat.
As the small town market matured, Wal-Mart gradually shifted to opening supercenters, which combined a supermarket with the general merchandise store and put Wal-Mart in direct competition with the big domestic grocery store chains for the first time. The supercenters have been as dramatically successful as the original Wal-Marts were in their day, both in terms of increased profits for WMT and forced restructuring for the supermarkets.
I remember attending a retail conference some years ago where a major supermarket chain was talking about its successful adaptation to Wal-Mart’s entry into its markets. The spokesman admitted that his stores experienced a dramatic drop in revenues in the initial years. But, he said, by redesigning the store layout and refocusing the merchandise mix toward more upscale and specialty items his company was able to restore revenues to the pre-Wal-Mart level within about three years. A hand in the audience went up immediately. “What about profits?” (I’m not sure whether the questioner was short the supermarket stock or just annoyed at what he considered the speaker’s duplicity.) The speaker’s reply was that his firm got profits back up to half what they were before Wal-Mart’s arrival.
This dynamic–good for consumers, bad for incumbent supermarkets–is well-known. Wal-Mart has been unstoppable in most rural or suburban areas, where land is abundant/cheap and people are used to driving to shopping areas. California, New England and big urban areas like Chicago and New York have been another story. In New England, there’s the serious issue that store locations are hard to find. In California and the big cities, on the other hand, supermarkets have been able to muster powerful political support to prevent inroads from Wal-Mart.
In the New York area, where I live, Wal-Mart has recently been running TV ads that say that the average shopper who uses a newly opened Wal-Mart will likely save over $3000 a year on food and general merchandise. Even shoppers who don’t will lay out more than $1500 less than if there were no Wal-Mart (presumably because increased competition forces other merchants to lower prices). But that cuts no ice around here.
Chicago is an interesting example. The first Wal-Mart opened there in 2006. According to Reuters, after the City Council rejected attempts to saddle Wal-Mart with punitive operating restrictions, “unions helped defeat several pro-Walmart aldermen in Chicago’s 2007 elections.”
Until recently, that’s been the end of the Wal-Mart story in the Windy City.
What’s changed? The financial crisis-induced recession, for one thing. WMT has also become more flexible in its approach to store formats. And it has become more politically savvy. Its current approach is to position itself, along with local community leaders, as bringing jobs to blighted urban areas and fresh food at reasonable prices to neighborhoods abandoned by traditional supermarkets. WMT has also agreed to use only union labor to build dozens of planned new stores and to pay workers substantially above the minimum wage.
If successful–and I don’t see any reason why it shouldn’t be–WMT’s Chicago expansion will likely prove to be the thin edge of a wedge that opens up New York and Los Angeles to the company.
WMT (I own it) is no longer the dynamic growth stock it was for many years. It’s just too big. The Chicago developments are more evidence, though, for the view that the company can continue to grow profits and dividends at a 10%-15% annual rate for years to come. They also suggest that WMT now realizes that retailing in Munich or London or Tokyo or Chicago or Los Angeles isn’t just like Bentonville but with a different climate–and may require significant cultural adjustment as well as superb operating skills. That’s probably a bigger positive.