Tuesday, July 20, 2010

Thoughts on Historical Multiples and Value Investing



These two are somewhat connected, and let me explain how. Every once in a while, especially around sharper market moves up or down, the TV channels parade numerous experts arguing over whether the stock market is overvalued or undervalued. One of the experts' favorite tools is the 30-, 40-, 50-, 60-, or 70-year average or median P/E ratio, against which the expert inevitably makes a pronouncement about the market's current state.
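
For the record, the arithmetic behind that pronouncement is trivial. Here is a toy sketch in Python, using an invented P/E series rather than actual market history:

# Toy version of the "compare today's multiple to the long-run average" exercise.
# The P/E series below is invented for illustration, not actual market history.
from statistics import mean, median

historical_pe = [13.2, 15.8, 9.5, 7.1, 11.4, 18.9, 21.3, 17.6, 24.0, 16.2]  # one reading per year
current_pe = 19.0

print(f"long-run average P/E: {mean(historical_pe):.1f}, median: {median(historical_pe):.1f}")
print("pronouncement: overvalued" if current_pe > mean(historical_pe) else "pronouncement: undervalued")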

Here's something that, in my view, goes a bit underappreciated. Before computers and the internet, collecting corporate data was a genuinely labor-intensive exercise: I doubt many people active in the market today have ever gone "down to the library" of their institution to pull the filings of Company X and then manually filled ledgers with the numbers. Back then, nobody imagined it would one day be possible to download all of this data (some vendors even adjust for non-recurring items), plug in a few numbers, press F9, and, voila, "know" to the cent how much Company X is worth because "this is what the model says." The end result is that in the computer/internet era, financial information is far more democratic. So is corporate transparency: one can check whether Company X really has that many stores, even ogle the storefronts with Street View, or snoop on Company Y's Peruvian open-pit mine in 3D. If the management team seems fishy, one can check them out on LinkedIn or with the Federal Bureau of Prisons, LexisNexis, Facebook, basic searches, real estate records, and so on. If you don't know much about the product, check the price points online, see who sells it, read the reviews, order it and test it yourself. This drastically increased transparency reduces risk, and lower risk means a lower required return, which supports higher multiples; I think we have seen some of that. A 1920s P/E incorporates a lot more unknowns than a 2010 P/E does.

So, how is this linked to value investing? In its classic sense, value investing is about buying $1 for 50 cents. One of the most popular approaches is Ben Graham's "net-net": buying a stock for less than its net current asset value (current assets minus all liabilities), traditionally at no more than two-thirds of that figure. But here is the problem: back when Graham came up with the formula (or when Buffett did his famous cocoa liquidation trade), just running a net-net calculation was very laborious, as I describe above, and the people who did the work got rewarded. What is the situation now? One can screen thousands of companies with the click of a button. And the end result? Far fewer bargains (in my view; I have not validated it statistically). The net-nets that do turn up are usually small-cap, illiquid stocks of companies in the middle of some reorganization: these might be fine for a smart individual investor, but they are difficult for anyone running more serious money. The greater transparency has made the screaming bargains nearly extinct. Even Buffett himself rarely makes substantial open-market purchases anymore (substantial relative to the size of BRK). Outside of the railroad acquisition, his other recent investments (GS, GE, etc.) were effectively private transactions.
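
To make the screening point concrete, here is a minimal sketch of the test such a screen automates. The company names and balance-sheet figures are invented for illustration; a real screen would pull them from a data vendor:

# Minimal sketch of a Graham net-net screen. All names and figures below
# are hypothetical, not pulled from any actual filings or data feed.

def net_current_asset_value(current_assets, total_liabilities):
    """Graham's NCAV: current assets minus all liabilities."""
    return current_assets - total_liabilities

def is_net_net(market_cap, current_assets, total_liabilities, margin=2.0 / 3.0):
    """Classic test: market cap below two-thirds of NCAV."""
    ncav = net_current_asset_value(current_assets, total_liabilities)
    return ncav > 0 and market_cap < margin * ncav

# (name, market cap, current assets, total liabilities), all in $ millions
universe = [
    ("DeepValue Corp",   40.0, 120.0,  45.0),  # NCAV = 75; 2/3 * 75 = 50 > 40 -> net-net
    ("FairlyPriced Inc", 900.0, 400.0, 250.0), # NCAV = 150, far below market cap
]

for name, mcap, ca, tl in universe:
    if is_net_net(mcap, ca, tl):
        print(f"{name}: NCAV = {net_current_asset_value(ca, tl):.0f}mm, "
              f"market cap = {mcap:.0f}mm -> trades below 2/3 of NCAV")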

"This time, it's different" is a dangerous statement, but maybe, this time it really is different: higher transparency and easy information flows might be warranting higher average multiples.

1 comment:

Sivaram Velauthapillai said...

Just because you have more information doesn't necessarily mean it's different. How can you be sure that market participants are analyzing the additional data properly? I would argue that information overload is a new risk that didn't exist back then.

To sum up, I don't agree with your view that the market multiples may be higher now due to more information and more transparency.