Subprime Auto Loans Are Not The Next ‘Big Short’

The auto industry is facing many different headwinds right now. The industry is incredibly cyclical, and many people think we have just seen peak auto sales, which means auto manufacturers might struggle to sell cars over the next couple of years. Over the last couple of years, auto manufacturers lowered lending standards to stimulate sales. Today, the average loan term is 68 months for new cars and 63 months for used; both are record highs. The average amount financed for a new vehicle is $28,802, also a record high. Loan terms have been stretched out to give borrowers lower monthly payments, but those payments now run for a much longer period of time.

At the beginning of 2016, auto loan delinquencies crept higher as a result of low oil prices. Americans were getting laid off from the oil and gas industry and consequently were not able to pay for their vehicles. This is happening less often now that oil prices have stabilized over the last few months. When auto loan delinquencies were on the rise, many commentators took it as an opportunity to scare people into thinking another bubble was about to pop. However, I do not believe this to be the case, for many reasons. I think they will need to look elsewhere for the next big short.

The 2009 recession resulted in a major slowdown in auto sales. Only 9.3 million vehicles were sold in 2009. This created a lot of pent-up demand, which has worked its way out over the last seven years. Most Americans need financing to buy vehicles, which has resulted in an expansion of the auto loan market over this period. This is a very normal occurrence, not any kind of bubble. Additionally, synthetic securities tied to auto loans do not exist and probably never will. Synthetic securities tied to mortgages were the primary reason for the financial crisis.

People have forgotten that vehicles and homes are completely different assets. The default process for vehicles is completely different from that for houses. Looking back at data from the housing crisis, it took 194 days to complete a foreclosure in Arizona; 335 days, or almost a year, in California; 520 days, or about 1 1/2 years, in Nevada; and 858 days, or almost 2 1/2 years, in Florida. By way of comparison, in New York it takes 1,072 days to foreclose on a residence. As the foreclosure process drags on, most houses are abandoned and their values plummet. Consequently, neighboring property values also decline. Today, when borrowers default on their auto loans, the vehicles are repossessed almost immediately. The legal process is extremely fast, so minimal value is lost on repossessed vehicles. They can be resold quickly, mitigating losses from the bad loans.

Another differentiating factor from the housing crisis comes from financial institutions. Lending risk models have been closely scrutinized since the financial crisis and are much improved. Even when defaults occur, losses are minimal because loan-to-value ratios (LTVs) are heavily scrutinized. Probabilities of default (PD) and losses given default (LGD) are updated more frequently and watched closely. No financial institution will be taking on large losses, because lending models on subprime loans incorporate the risks the bank is taking. In the event that a loan defaults, the vehicle can be repossessed and resold, minimizing losses. This process has almost no drag on the economy.
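
To make the mechanics concrete, here is a minimal sketch of the standard expected-loss relationship (expected loss = PD x LGD x exposure) that these models are built on; every figure in it is hypothetical, not taken from any lender's actual book.

```python
# Minimal sketch of the standard expected-loss calculation a lender's risk
# model relies on; all figures here are hypothetical.
def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss = probability of default x loss given default x exposure at default."""
    return pd_ * lgd * ead

# A hypothetical $20,000 subprime auto loan with a 10% chance of default;
# because the car can be repossessed and resold quickly, assume only ~40%
# of the exposure is ultimately lost.
print(expected_loss(pd_=0.10, lgd=0.40, ead=20_000))  # 800.0
```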

When financial institutions underwrite a subprime auto loan, they often require the borrower to have the car equipped with a device that allows the lender to disable it remotely. These devices include GPS technology, so the lender can locate the vehicle if it needs to be repossessed. These devices make repossession incredibly fast.

In addition to the devices mentioned, there are hundreds of license-plate readers on the roads today. These are often found at airports, toll plazas, major highway entrances, parking garages, and selected intersections. Developing databases of scanned license plates is a growing business, and financial institutions are willing to pay for access to this information.

Bottom Line


Subprime auto loans are completely different from subprime mortgages. There are glaring differences between the two products. The biggest is the amount of time it takes to repossess a vehicle versus foreclose on a house. Repossessing a vehicle is quite easy, and the asset can be resold in a timely fashion, which keeps the lender's losses to a minimum.

Lending risk models have improved markedly across all loan types since the financial crisis. Financial institutions are reviewing their loan files more often and are much more conservative than they were ten years ago. It is very possible that auto sales will slow down in the next 12 months. However, I do not see any similarities to the housing crisis, and I would certainly not call the growth in subprime auto loans a "bubble."

Indexing’s Effect on Market Efficiency

Over the last thirty years, the strategy of indexing has been rapidly adopted. Indexing appeals to investors because they can achieve the return of an index while paying low fees. Various studies have shown the advantages of low-cost indexing. However, there is a certain myth around indexing that should be addressed. Indexing is widely considered to be a passive investing strategy, but I would push back on this belief. Passive investing is not just about reduced activity, low fees, and tax efficiency. A stock picker who buys one stock and holds it can be just as inactive as someone who buys one index and holds it. In reality, the stock picker is likely to be more fee efficient than the indexer, because there is no recurring cost to holding a stock. An index like the S&P 500, however, is an actively selected basket of companies that represents a slice of outstanding global stocks. Because nearly every index covers only a small piece of all equities, its composition must be actively managed and adjusted. This calls into question whether indexing is actually a passive strategy.

Market Efficiency

There has been extensive research into market efficiency and the efficient market hypothesis. Perhaps the best-known papers were published by Paul Samuelson and Eugene Fama in the 1960s. A market is commonly described as efficient when prices fully reflect all available information. Market efficiency has two pieces: 1) informational efficiency and 2) fundamental-value efficiency. Informational efficiency represents how quickly prices respond to new information. Informational efficiency alone does not imply that market prices respond to new information correctly, or even that prices respond at all. This brings us to fundamental-value efficiency. Markets are efficient in the fundamental-value sense if prices respond to new information not only quickly but accurately. The two pieces operate independently, and both must be present for a market to be truly efficient.

It is important to weigh the type of information that is being priced into a security. What happens when new information becomes available but investors must invest substantial time, trouble, or money to get it? What happens when the information is technical and difficult to understand? Do prices still change quickly? In short, the answer is "no." Certain types of information seem to be absorbed into prices far more slowly and incompletely than efficient-market believers suggest. One phenomenon that has been studied over the last 30 years is "post-earnings-announcement drift." An unexpected announcement of increased corporate earnings tends to be followed by atypical positive returns over the next several months, while firms that announce unexpectedly poor earnings see atypical negative returns over an extended period. This is evidence that the initial price response to new earnings information is incomplete, and that its full implications are priced in by the market far more slowly than previously assumed.

A recent paper by Konchitchki, Lou, Sadka & Sadka (2013) studied explanations for the post-earnings-announcement drift. Among their findings is that investors tend to underreact to new information in earnings. It often takes investors time to incorporate current earnings into future expectations, which causes a drift. The complexity of the information can also affect the drift. This research indicates that markets are not always efficient.
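
As a rough illustration of how this drift is typically measured, here is a minimal sketch that computes standardized unexpected earnings (SUE) and compares post-announcement returns across surprise buckets; the tickers, estimates, and returns are all hypothetical, not data from the paper cited above.

```python
# Minimal sketch (hypothetical data) of measuring post-earnings-announcement
# drift via standardized unexpected earnings (SUE).
import pandas as pd

# Assumed input: one row per firm-quarter with actual EPS, consensus EPS,
# estimate dispersion, and the return over the 60 trading days that follow
# the announcement.
df = pd.DataFrame({
    "ticker":        ["AAA", "BBB", "CCC", "DDD"],
    "actual_eps":    [1.10, 0.45, 2.05, 0.30],
    "consensus_eps": [1.00, 0.50, 1.80, 0.35],
    "eps_std":       [0.05, 0.04, 0.10, 0.03],
    "ret_60d_post":  [0.06, -0.03, 0.09, -0.02],
})

# Standardized unexpected earnings: the surprise scaled by estimate dispersion.
df["sue"] = (df["actual_eps"] - df["consensus_eps"]) / df["eps_std"]

# Bucket firms by surprise; drift predicts positive-surprise firms keep
# outperforming negative-surprise firms in the months after the announcement.
df["bucket"] = pd.cut(df["sue"], bins=[-float("inf"), 0, float("inf")],
                      labels=["negative surprise", "positive surprise"])
print(df.groupby("bucket", observed=True)["ret_60d_post"].mean())
```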

Indexing and Momentum Investing

Indexing has become more popular, with the objective of replicating a particular market index. Under this approach, money is allocated in proportion to the weight a particular stock represents in the index. As such, these investors pay no attention to information about a company (other than its market capitalization), much less whether it is fairly priced. Investors using indexing strategies allocate capital based mostly on investor flows or index changes rather than new, relevant information. An increase in indexing, and the consequent decrease in investors making decisions based on the fair value of stocks, has the potential to make markets inefficient.
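
To make that allocation rule concrete, here is a minimal sketch of cap-weighted allocation; the tickers and market caps are hypothetical and not taken from any real index.

```python
# Minimal sketch of market-cap-weighted allocation (hypothetical names/caps).
market_caps = {"AAA": 750e9, "BBB": 420e9, "CCC": 130e9}

total_cap = sum(market_caps.values())
weights = {ticker: cap / total_cap for ticker, cap in market_caps.items()}

# A $10,000 index investment is split purely by market-cap proportion;
# no valuation or other fundamental information enters the allocation.
portfolio = {ticker: 10_000 * w for ticker, w in weights.items()}
print(portfolio)
```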

The U.S. equity market grew from $11 trillion in 2002 to over $30 trillion in 2014. In that same period, the value of passive funds went from $710 billion to over $7 trillion. As a percentage of the equity market, passive ownership went from 6.50% in 2002 to 24.34% in 2014. The growth of this trend of investing while ignoring new information and valuations is worth noting. Today we have a market where trillions of dollars have been invested by uninformed market participants, leading to inefficient markets. A significant proportion of investment funds have been moved into "the market portfolio," in that they are invested on the basis of the proportion a particular stock represents of the index, with no reference to valuation. As such, opportunities are being created for investors who use all available information in their investment analysis.

Momentum investing is another strategy that has grown in popularity. It has been shown that momentum investing consistently generates excess returns, suggesting a market inefficiency. Momentum comes in different forms; it can be measured using past price movements, past returns, or changes in earnings. In most markets, momentum investing plays a role in the management of upwards of 50% of actively managed funds. Momentum investors represent another large proportion of the market that makes no reference to valuation when making investment decisions. Therefore, they play no role in correcting any market mispricings. An argument could be made that momentum strategies destabilize markets because they exacerbate existing trends. When investors chase trends and momentum, they frequently drive prices beyond fair value until demand from fundamental investors causes a mean reversion. These mean reversions differ in size and can cause investors to become fearful.
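
For the price-based form of momentum, here is a minimal sketch of one common signal, the trailing 12-month return that skips the most recent month; the monthly price series is hypothetical.

```python
# Minimal sketch of a simple 12-1 price-momentum signal (hypothetical prices).
prices = [100, 102, 105, 103, 108, 112, 110, 115, 118, 121, 124, 127, 130]  # 13 month-end prices

# Signal: price one month ago divided by price twelve months ago, minus one.
# A positive value puts the stock in the "buy" bucket of a simple momentum
# screen, with no reference to whether the price already exceeds fair value.
momentum_12_1 = prices[-2] / prices[-13] - 1
print(round(momentum_12_1, 4))  # 0.27
```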

Some researchers believe that indexing can become self-defeating as more and more people adopt it. As the percentage of indexers grows, the markets become more inefficient as the number of investors who perform fundamental analysis shrinks. If everyone adopted indexing, the markets would likely not move much due to the lack of inflows. If there were to be a decrease in indexing, it is likely that equities in the most popular indexes would underperform those that are not in an index.

A self-reinforcing feedback loop has been created by indexing. The performance of indexes has been boosted by indexing, which has led to more indexing. However, if this behavior begins to unwind, the performance of indexes could suffer relative to the rest of the market.

Conclusion

Markets have demonstrated on multiple occasions in recent years that they are not perfectly efficient. When complex information is introduced to markets, it often takes time to be priced in. Sometimes it can take a substantial period of time. It is common for investors to underreact to new information, which leads to a post-announcement drift. During this period, investors gradually price in the new information.

People have turned to indexing because they believe it is a good net-of-all-fees strategy. However, not many have thought about the circumstance in which indexes underperform the rest of the market. Major indexes like the S&P 500 are often used as performance benchmarks. What happens if the major indexes begin to underperform? Will investors adjust, or will they just assume they are still performing in line with "the market" benchmark? Trillions of dollars have been invested in various indexes while ignoring new information, valuations, growth, etc. It is certainly possible that investors who use all relevant information when making investment decisions will favor equities outside of the major indexes due to their discount to those within. This situation would lead to the major indexes underperforming and potentially to a reversal of the indexing craze.

Understanding S&P 500 Earnings Estimates

Q4 2015 earnings are almost finished being reported, and overall they were fairly disappointing. Revenues were down 4.5% and earnings down 8.4% (on non-GAAP numbers). Once again, we watched sell-side analysts revise their earnings estimates lower after being overly optimistic. But how optimistic are they about future earnings? I'll answer that and explain why I have a problem with their earnings estimates.

Generally Accepted Accounting Principles (GAAP)


Generally accepted accounting principles (GAAP) are the set of principles that U.S. companies must use to compile their financial statements. The Financial Accounting Standards Board (FASB) issues a codification of accounting standards that companies are required to follow. These standards are meant to ensure that financial statements are consistent and comparable for investors. Many companies have found that certain accounting standards negatively impact their "true earnings power." These companies disclose a non-GAAP earnings-per-share figure in their financial statements in an attempt to show investors what they believe are the real earnings of the business. Non-GAAP earnings are sometimes helpful when one-time items impact earnings. However, many companies will disregard fundamental accounting, like revenue recognition standards, in their non-GAAP earnings. They will paint the rosiest picture they possibly can for current and prospective investors. I will give a few examples of some offenders.

Sell-Side Analyst S&P 500 Estimates


When sell-side analysts estimate S&P 500 earnings, which do you think they use? You guessed it: non-GAAP earnings. This isn't a big problem if the variance between GAAP and non-GAAP earnings is small. However, the difference between the two is currently wide. Really wide. 2015 GAAP earnings per share (EPS) was $91.46 vs. non-GAAP of $117.92. This is the largest variance since 2008. See chart below.

Many people believe the market is trading at a P/E ratio of about 17. The only problem is that the earnings used in this calculation are non-GAAP, or "fantasy," earnings. In reality, the market is trading at a P/E ratio north of 22x on a GAAP basis. The mean P/E on the S&P 500 over the last 100 years is 15.1x using GAAP EPS. It is important to note that these ratios are not predictive of short-term market moves. They are more predictive of long-term future market returns.
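
A quick back-of-the-envelope check of that gap, using the EPS figures cited above and an index level of roughly 2,020 (my assumption for illustration, not a number from this article):

```python
# Back-of-the-envelope GAAP vs. non-GAAP P/E comparison; the index level is
# an assumed figure, while both EPS numbers come from the text above.
sp500_level  = 2020.0
gaap_eps     = 91.46
non_gaap_eps = 117.92

print(f"Non-GAAP P/E: {sp500_level / non_gaap_eps:.1f}x")  # ~17.1x
print(f"GAAP P/E:     {sp500_level / gaap_eps:.1f}x")      # ~22.1x
```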

Non-GAAP Earnings


Non-GAAP earnings serve a useful purpose for investors when used properly. Today, the metric is widely overused, with 88% of companies in the S&P 500 disclosing it. Even more worrisome is that 82% of the time the adjustments increase net income. The metric is seen as a way for some companies to juice up their numbers. Let's take a look at Tesla (NASDAQ:TSLA). They have decided to disregard revenue recognition standards in their non-GAAP earnings, recognizing all of the fees from the life of a car lease up front instead of only the amount the customer has actually paid. They also like to play with the expense side of the income statement by not accounting for stock-based compensation. These are the two largest adjustments they make, though it looks like there are five more that are smaller. Their Q4 2015 GAAP net loss shrinks from $320 million to $114 million on a non-GAAP basis. During the Q4 2015 earnings presentation, they announced they believe they can post a profit in Q1 using non-GAAP numbers.

One of the worst offenders is LinkedIn (NYSE:LNKD). LinkedIn also does not account for stock-based compensation in non-GAAP earnings. Additionally, they choose not to amortize the intangible assets they have previously acquired. These two adjustments swung their GAAP net loss of $8 million to a non-GAAP profit of $126 million. The company has been doing this ever since it IPO'd. They have never consistently made money, and it is anyone's guess if they ever will. I have my doubts. However, they will continue to muddy the waters for investors as long as they can.
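
The arithmetic behind that swing is simple; here is a minimal sketch of the reconciliation. The $8 million loss and $126 million non-GAAP profit come from the figures above, while the split between the two add-backs is a hypothetical placeholder that simply sums to the $134 million difference.

```python
# Rough sketch of a GAAP-to-non-GAAP reconciliation. GAAP loss and non-GAAP
# profit are the figures cited above; the individual add-back amounts are
# hypothetical placeholders, not LinkedIn's actual disclosure.
gaap_net_income = -8_000_000

adjustments = {
    "add back stock-based compensation":     100_000_000,  # hypothetical split
    "add back amortization of intangibles":   34_000_000,  # hypothetical split
}

non_gaap_net_income = gaap_net_income + sum(adjustments.values())
print(f"Non-GAAP net income: ${non_gaap_net_income / 1e6:.0f}M")  # $126M
```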

Q1 2016 And Full Year Earnings


Q1 2016 earnings estimates have drifted lower for many months now. See chart below.

Even as the overall market has rebounded 11% in the last four weeks, earnings estimates have continued to drop. As of today, Q1 2016 earnings are expected to be down 8.3% year over year. However, full-year 2016 earnings estimates remain elevated. See chart below.

It is evident that Q1 2016 earnings are expected to be horrible. Just in the last two weeks, Q1 estimates have dropped from -7.6% to -8.3% while the market has rallied. The amazing thing about this chart is the dramatic bounce expected in the second half of the year and into 2017. In Q3 2016 and Q4 2016, earnings are expected to grow 5.0% and 9.2%, respectively. That would be quite the turnaround considering the -8.3% drop expected in Q1. We will see if it materializes.

Conclusion


It is a complicated time to be an investor. I do my best to always keep an open mind. I continue to look for undervalued companies even though I believe the broader market is on the expensive side. In the meantime, cash is always a position. I am certainly not as bearish as many others. I think there are a lot of people out there looking for the next “big short” and the bears love to make noise. I’d rather make my own assumptions, stay patient, and wait for my pitch.