
Archive for the ‘Macro-Economics’ Category

I hadn’t been paying enough attention to the markets this summer to write anything meaningful until last week – it figures that as soon as I start writing, the markets begin to tank…

The Good:

My first move is to add “Credit Writedowns” to my blogroll. I can’t believe I’ve missed it for so long, since it’s probably one of the best sites I’ve ever seen. There are more categories and sub-articles (including a credit crisis timeline archiving every major event of the past three years) than I’d ever care to read through, but that goes hand in hand with the extensive nature of the site.

The Bad:

Secondly, I’d say that The Big Picture is hanging on by a thread in the class of venerable blogs. The content has been lacking over the past three months – mostly because the author has been promoting his book at every turn – and his investigative journalism has largely given way to efforts to stir controversy. That said, Barry Ritholtz is still adding value with his “link fests” (which must be taking readers away from the guy at Abnormal Returns).

The Ugly:

Thirdly, I think this article from Reuters paints a very clear picture of the remaining problems with real-estate loans: many of them haven’t been marked down from their values at origination.

Emergency bailout facilities allow banks that otherwise would have failed under the weight of bad loans to hold those loans to maturity — pretending the bad ones will be paid off in full over time.

In reality, many loans will default and banks will bleed capital for years. Take commercial real estate. As the Congressional Oversight Panel has reported, few CRE loans that were originated at the peak will qualify for refinancing when they mature. Banks can pretend they will, carrying the loans at values far above what will ever be paid back. (emphasis added)

Then there’s this table – derived from SEC filings – which shows losses if loans were marked at Fair Market Value (rather than carrying value), as a percentage of Tangible Common Equity (TCE):

[Table: what-loans-are-worth]

This suggests that if property values stay depressed through the lives of these loans, the losses currently recognized understate what will materialize at maturity. In other words, if the mortgages in JP Morgan’s loan portfolio had matured in June, the losses would have wiped out about 17% of its equity (which is remarkable, since the difference between the Fair Value model’s assumptions and the actual carrying value is only 3%).
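To see how a small gap between carrying value and fair value can translate into a large equity hit, here’s a minimal sketch with hypothetical round numbers (the real figures come from the SEC filings behind the table; the $400B/$70B inputs below are my own illustration):

```python
def loss_pct_of_tce(carrying_value, fair_value, tangible_common_equity):
    """Loss if loans were settled at fair value today, as a % of TCE."""
    return 100 * (carrying_value - fair_value) / tangible_common_equity

carrying = 400.0          # $ billions: value at which the loans are carried
fair = carrying * 0.97    # fair-value model marks the loans just 3% lower
tce = 70.0                # $ billions of tangible common equity

print(round(loss_pct_of_tce(carrying, fair, tce), 1))  # -> 17.1
```

A 3% write-down on a loan book several times larger than equity maps to roughly a 17% equity loss – the leverage effect the paragraph above points to.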

More on this to come…


Read Full Post »

Here’s a chart of the economy’s flow of funds, from Option ARMageddon: (This chart was also stolen here and here)

[Chart: slide11 – flow of funds]

Mortgage payments are still responsible for a substantial portion of US debt, but government borrowing has grown year over year (and will grow more in 2010). This is the phenomenon of crowding out, whereby government borrowing raises interest rates for the private sector, resulting in a decrease in private borrowing (in today’s case, the Treasury is competing with the private sector for buyers). Meanwhile, the Fed’s statistics likely understate the Treasury’s liabilities:

The Fed only includes publicly held debt when calculating total federal government borrowings, $6.7 trillion at the end of Q1.  This excludes over $4 trillion owed to the Social Security “trust fund.”  More importantly, it excludes $60 trillion of unfunded future liabilities for Medicare and Social Security.

Read Full Post »

There is a good editorial out of TheStreet.com regarding mark-to-market accounting. Plenty of people have criticized applying fair-value accounting principles to an illiquid asset class, but until recently no one really explained how badly it hurts the financial sector.

From the article:

Here’s the problem. Let’s say a bank has purchased a series of geographically diversified securitized mortgage backed securities. How do we value them? Let’s say that within that mortgage series, 20% of those mortgages have defaulted and the prices of those defaulted houses have declined and can be sold at roughly 50% below what they were valued at when the securities were originally issued. What is the intrinsic (theoretical) value of the security? The answer is approximately 90 cents on the dollar: 1.00 – (0.20 x 0.50).

Here’s a more realistic example:

Now let’s look at the absurd situation we now find ourselves in. Some of the banks are forced to sell these long-term securities, but because of extreme credit market conditions they can only get 20 cents on the dollar. Now FASB 157 kicks in and says that this is the fair market value of these securities. Now we have an 80% ($1.00-$0.20) real loss on these bank-held assets instead of the 10% intrinsic (theoretical) decline, which means at a 20 times levered ratio, the holder has suffered a catastrophic 1600% total loss on their investment.
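The arithmetic in both excerpts can be sketched as follows (function names are mine; the inputs are the article’s illustrative figures):

```python
def intrinsic_value(default_rate, loss_on_default):
    """Theoretical value per $1 of face: losses accrue only on defaults."""
    return 1.0 - default_rate * loss_on_default

def levered_equity_loss_pct(market_price, leverage):
    """Equity loss (%) when assets bought at par are marked to market_price."""
    return (1.0 - market_price) * leverage * 100

# First excerpt: 20% of the mortgages default and the houses are sold
# 50% below their original value.
print(intrinsic_value(0.20, 0.50))          # -> 0.9 (90 cents on the dollar)

# Second excerpt: forced sales mark the security at 20 cents; at 20x
# leverage, the 80% write-down becomes a 1600% loss on equity.
print(levered_equity_loss_pct(0.20, 20))    # -> 1600.0
```

The gap between the 10% intrinsic decline and the 1600% levered loss is the whole complaint against marking illiquid assets to distressed prices.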

Rumor has it that Congress will hold a hearing next Thursday (March 12th) about suspending M2M.

Here’s something to consider: mark-to-market was used during the Great Depression. It was repealed, and we had a functioning financial system for 60 years. It was reinstated on November 15th, 2007 (13 months before the “official” recession began). The evidence is only circumstantial, but it’s still pretty shocking.

Read Full Post »

[Image: friedman1]

Bloomberg has had an interesting progression over the past 6 months on how Milton Friedman’s legacy is slowly diminishing – then again, whose isn’t?

Here’s the first article, which quarrels with the principles of laissez-faire government made possible by Friedman’s free-market ideology – the belief that, unadulterated by government policy, markets will determine the perfect prices and interest rates based on supply and demand.

The article goes so far as to assign Friedman some blame for the decay of our financial system:

In 1972, Friedman helped persuade U.S. Treasury Secretary George Shultz, former dean of Chicago’s business school, to approve the first financial futures contracts in foreign currencies.

Such derivatives grew more complex after Chicago economists created the mathematical formulas to price them, helping spawn a $683 trillion market that’s proved to be a root of today’s financial system breakdown.

The follow-up article – written yesterday – reaches some bolder conclusions:

After a three-decade run, the free-market philosophies of Friedman that shaped U.S. policy are being eclipsed by the pro-government ideas of Tobin, the late Yale economist and Nobel laureate who brought John Maynard Keynes into the modern era.

I generally like Bloomberg’s exclusives, since they’re always thought-provoking. In this instance I also agree with them; however, I don’t necessarily believe that Wall Street has given up (or will ever give up) on Friedman economics – it worked too well for them for 30 years.

I’m also not seeing this idea catch on anywhere else in the media, nor have I seen anyone blame Friedman the way Bloomberg does (most people are still focused on Alan Greenspan and Fannie Mae/Freddie Mac, neglecting the fact that “deregulation” is almost synonymous with the name “Milton Friedman”).

It would be naive to think we’ll have a paradigm shift back towards the ideals of Keynes/Galbraith (as these articles seem to suggest), since we haven’t yet identified the architect of the problem on a broad enough level – although I think Friedman is a good place to focus a lot of our criticism.

Read Full Post »

From the Financial Times:

[Chart: ft-japan-comparison1]

The point of the FT article was to describe how extremely difficult it is for the private sector to deleverage, or reduce debt levels, during periods of falling prices. As the charts show, the US is watching debt balloon to levels not seen since the 1950s (as a percentage of GDP), while asset prices have fallen precipitously.

As the author describes:

It has long been argued that the US could not suffer like Japan. This is wrong. It is true the US has three advantages over Japan: the destruction of wealth in the collapse of the Japanese bubble was three times gross domestic product, while US losses will surely be far smaller; US non-financial companies do not appear grossly overindebted; and, despite efforts by opponents of marking assets to market, recognition of losses has come far sooner.

Basically, our situation is more similar to that of Japan circa 1990–2005 than we had anticipated – and some of our economic characteristics are considerably less desirable (a global recession leaves little room for other countries to pick up the slack in our budget/trade deficits by buying our debt and consuming our exports).

Surprisingly, part of the reason the author included all of this background was to advocate for a bigger stimulus package, not to be depressing.

Unfortunately, the article offers no discernible solution to these problems other than the old “we’ll have to tough this one out” analysis:

The bigger point, however, is not that the package needs to be larger, although it does. It is that escaping from huge and prolonged deficits will be very hard. As long as the private sector seeks to reduce its debt and the current account is in structural deficit, the US must run big fiscal deficits if it is to sustain full employment.

Part II of this author’s column will run in next week’s FT, for anyone interested.

Read Full Post »

Despite the flood of cash from the government, banks are still hoarding it. What could explain this phenomenon?

From BusinessWeek:

They (bank chiefs) argue that the government funds are designed to shore up capital and support lending, but that they have no obligation to make new loans. “It’s not a one-to-one relationship,” says BofA CEO Kenneth D. Lewis. “We don’t write $15 billion in loans because we got $15 billion from the government.”

So there’s a disagreement over what the TARP money should be used for. But why won’t banks make new loans? The answer, as always, comes down to money:

Right now there’s little financial incentive to make fresh loans. In the current unease, new corporate loans are immediately marked down to between 60¢ and 80¢ on the dollar, forcing banks to take a hit on the debt. It’s more lucrative, then, for them to buy old loans that are discounted already.
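A back-of-the-envelope sketch of that incentive (the 70-cent mark is my own illustrative pick from the article’s 60–80 cent range):

```python
face = 100.0            # $ millions of face value
new_loan_mark = 0.70    # new corporate loans marked down immediately

# Originating a new loan: an instant mark-to-market hit on day one.
immediate_loss = face * (1 - new_loan_mark)

# Buying an existing loan already trading at the discounted price:
# the same exposure, acquired at fair value, with no day-one loss.
purchase_cost = face * new_loan_mark

print(round(immediate_loss, 1))  # -> 30.0
print(round(purchase_cost, 1))   # -> 70.0
```

Both routes end up holding a loan worth about 70 cents on the dollar, but only origination forces the bank to book a loss – which is the incentive the article describes.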

Just when you thought all the side effects of repealing Glass-Steagall were out of the system: now banks won’t even make new loans. Since there are so many discounted securitized mortgages on the market, they’re using the TARP money to buy outstanding mortgages instead…

Since the TARP was so ad hoc by nature, the government didn’t force them to do otherwise, so you can’t blame the banks for cutting the best deals.

The most important point concerns their capital requirements:

Under federal rules, banks are required to maintain a certain level of capital based on their assets. When they incur losses, they either have to raise more capital or sell assets to keep those ratios in check.
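A simplified sketch of that constraint (hypothetical numbers; real rules use risk-weighted assets, and selling only shrinks the balance sheet if the proceeds retire liabilities):

```python
def assets_to_sell(capital, assets, loss, required_ratio):
    """Assets a bank must shed to restore its capital ratio after a loss,
    assuming no new capital is raised and sales occur at carrying value."""
    new_capital = capital - loss    # the loss hits equity directly...
    new_assets = assets - loss      # ...and writes down the asset side
    target_assets = new_capital / required_ratio
    return max(0.0, new_assets - target_assets)

# Hypothetical bank: $1,000B assets, $80B capital, 8% requirement.
# A $20B loss drops the ratio to 60/980 (about 6.1%); restoring 8% means
# shrinking the balance sheet by $230B, or raising capital instead.
print(round(assets_to_sell(80.0, 1000.0, 20.0, 0.08), 1))  # -> 230.0
```

Note the asymmetry: a $20B loss forces over ten times that amount of asset sales, which is why losses make banks pull back so sharply.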

Read Full Post »

“Dating back to work on the random walk hypothesis by French economist Louis Bachelier (1870-1946), the efficient market hypothesis asserts that stock market prices are the best available estimates of the real value of shares since the market has taken account of all available information on an individual stock.”

Economy Professor

Now from The New York Times Magazine, which ran an interesting 10-page spread on risk valuation last Saturday:

VaR is often measured daily and rarely extends beyond a few weeks, and because it is a very short-term measure, it assumes that tomorrow will be more or less like today. Even what’s called “historical VaR” — a variation of standard VaR that measures potential portfolio risk a year or two out — only uses the previous few years as its benchmark. As the risk consultant Marc Groz puts it, “The years 2005-2006,” which were the culmination of the housing bubble, “aren’t a very good universe for predicting what happened in 2007-2008.”

How are these two phenomena related? The risk models the world of finance has been relying on for years are grounded in the Efficient Market Hypothesis; VaR models – models that price risk – would not function without the corollary that the underlying assets are priced properly. We took comfort in this convenient theory: that everything is priced appropriately, all the time.
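To make “historical VaR” concrete, here’s a minimal sketch (the function name, seed, and simulated return distribution are my own assumptions, not from the article):

```python
import random

def historical_var(returns, confidence=0.99):
    """One-day historical VaR: the loss threshold exceeded on only
    (1 - confidence) of days in the lookback window (positive fraction)."""
    ordered = sorted(returns)                        # worst day first
    cutoff = round((1 - confidence) * len(ordered))  # index of the quantile
    return -ordered[cutoff]

# Illustrative: a window of simulated "calm" daily returns produces a
# small VaR even if the true process can deliver far larger shocks –
# exactly the short-memory problem the article describes.
random.seed(0)
calm_window = [random.gauss(0.0005, 0.01) for _ in range(500)]
print(f"99% one-day VaR: {historical_var(calm_window):.2%}")
```

The number it reports is only as informative as the window it looks at: feed it 2005–2006 and it will tell you nothing about 2008.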

The counter to this argument is not another theory, but a series of real-life outliers. If markets are efficient, how does George Soros continually compound his fortune by trading according to his boom/bust empiricism (like shorting the British pound)? Warren Buffett also acknowledges fault with the theory: “I would be a bum on the street with a tin cup if the markets were always efficient.”

Every theory has shortcomings, and the admitted shortcoming of the Efficient Market Hypothesis is one of “black swans”, or in economic terms, exogenous shocks. These are events which cannot be predicted, ones which are often described as lying outside a 99% confidence interval, and ones which continually disprove the efficiency of market pricing – especially during times of panic.

The second glaring shortcoming lies in the application of the EMH – in our own perception. One assumption of the hypothesis, like an assumption in micro-economic theory, is that the participants are completely rational (much like human calculators). This assumption has proven to hold in times of tranquility – like the times LTCM succeeded in making money – but during times of deception and opacity, many of us are hopelessly irrational.

To quote Michael Lewis’s most recent book, Panic!, on the pricing of Bear Stearns:

“If the market got the value of Bear Stearns so wrong, how can it possibly believe it knows even the approximate value of any Wall Street firm?” (P. 342)

The pillar of the EMH that comes tumbling down in times like today is “known information.” On the surface, who is to say that Bear’s balance sheet was all that bad? I’d like to believe we could extrapolate everything from the footnotes (never mind having everyone read them), but the population of people who called for the failure of Bear Stearns in 2006 lies far out in the tails of the “normal distribution.” Furthermore, here’s a real-life example: how could the Nasdaq be accurately priced at 1,400 in 1997, at 5,000 in 2000, and back at 1,400 in 2002?

All of this is to argue that the notion of efficient markets, which we have taken shelter in for much of our modern financial history, stands paralyzed in times of uncertainty. That would explain why panic ensues at the very signal that we are “in the dark” about anything (it could also be a reflection of our poor sentiment; the constant belief that Wall Street will disappoint us is becoming part of our psychology). It would also explain our dependence on models as a “crutch”: quantifying risk with numerical values has granted us what turned out to be a false sense of security.

The fallout of the subprime mortgage crisis has uncovered many of the issues with deriving models from this hypothesis – the only problem is, we haven’t come up with anything better. VaR was used heavily in the late 1990s by none other than LTCM, but as the NYT article points out:

Firms viewed it as a human failure rather than a failure of risk modeling. The collapse only amplified the feeling on Wall Street that firms needed to be able to understand their risks for the entire firm. Only VaR could do that. (Page 7)

We then reverted to its use.

I will end with a phrase that has time and again been a sure way to see oneself proven wrong – maybe this time it’s different.

Read Full Post »
