Leverage is known to play an important role in loan default, but while theoretical research on leverage exists, to our knowledge there has been virtually no long-term, data-driven empirical analysis of the impact of leverage on residential foreclosure. We assembled data from various sources to fill that void and examine the role leverage has played in mortgage foreclosures over the last five decades1. This is an especially timely topic given that policy makers have recently attempted to thaw the tight lending environment by reducing the price and expanding the quantity of low down payment real estate credit.
CoreLogic research highlights four key findings. First, while homeownership rates today are at the same level as five decades ago, foreclosure risk is two to three times higher. Second, the primary driver of default risk over this period has been leverage. Leverage has played such a strong role that it has rendered changes in income and savings insignificant drivers of default from a long-term macro perspective. Third, the stabilization in foreclosure rates in the 1970s and 1980s was driven by high inflation, which propelled nominal home prices and reduced aggregate LTV, thus lowering default risk, a reminder of real estate’s role as a hedge against inflation. Fourth, the centerpiece of government regulation to make the mortgage market safer for consumers is an income-based ability-to-repay rule that manages delinquency risk but is less aimed at the market’s foreclosure risk. Therefore leverage, the most important driver of foreclosure performance over the last five decades, remains unaddressed for the market. In the future, policy makers may need to consider exploring their ability to manage the leverage cycle to promote residential financial stability.
In the 1930s, the creation of the Federal Housing Administration (FHA) and Fannie Mae dramatically expanded access to mortgage credit. As a result, the U.S. homeownership rate increased significantly, from 44 percent in 1940 to 62 percent by 1960. Between the early 1960s and early 1990s, homeownership rates stabilized in a narrow range between 62 percent and 65 percent. In the mid-1990s, pro-homeownership policies led to an expansion in mortgage credit, and the homeownership rate peaked at 69 percent in 2004. Since then the homeownership rate has declined through the recession and stood at 64 percent as of Q4 2014.
While the homeownership rate today is similar to that of the early 1960s, foreclosure rates have increased significantly. Between 1960 and 1965, the conventional foreclosure rate averaged 0.6 percent and the FHA foreclosure rate averaged 1.4 percent. In 2014, the conventional foreclosure rate was 1.5 percent and the FHA foreclosure rate was 2.6 percent. Even though homeownership last year was at the same level as in the early 1960s, conventional foreclosure rates were about 2.5 times higher and FHA foreclosure rates were nearly two times higher. These much higher foreclosure rates are not simply a function of a market still healing from the Great Recession, because even as of 2004, well before foreclosure rates spiked, the level of risk for both conventional and FHA mortgages was three times higher than during the 1960s (Figure 1).
Given that homeownership rates today are at similar levels as in the early 1960s, what has driven the higher foreclosure rates? To better understand the drivers, we modeled foreclosure rates as a function of savings, unemployment, inflation, aggregate current LTV for all mortgage loans outstanding, and real median household income2. The model revealed only two primary factors and one very secondary factor that noticeably influenced foreclosure rates. The LTV ratio and the unemployment rate stood out, with the LTV ratio by far the most important variable.
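To make the modeling approach concrete, the sketch below fits an ordinary least squares regression of the foreclosure rate on aggregate LTV and the unemployment rate, the two variables the analysis identifies as primary. This is only an illustration of the technique: the annual observations are made up, and CoreLogic’s actual model, data, and specification are not public in this note.

```python
# Illustrative OLS sketch (hypothetical data, not CoreLogic's actual model):
# regress the foreclosure rate on aggregate LTV and the unemployment rate.

def ols(xs, ys):
    """Multiple OLS via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting.
    xs: list of rows [1.0, x1, x2, ...]; ys: list of outcomes."""
    k = len(xs[0])
    xtx = [[sum(r[i] * r[j] for r in xs) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * y for r, y in zip(xs, ys)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

# Hypothetical annual observations: (aggregate LTV %, unemployment %, foreclosure %)
data = [
    (35, 5.5, 0.690), (38, 4.8, 0.724), (40, 6.0, 0.880), (45, 7.2, 1.126),
    (50, 5.0, 1.100), (55, 6.5, 1.370), (61, 9.0, 1.750), (46, 5.6, 1.028),
]
X = [[1.0, ltv, ur] for ltv, ur, _ in data]
y = [f for _, _, f in data]
b0, b_ltv, b_ur = ols(X, y)
print(f"intercept={b0:.3f}, LTV coef={b_ltv:.4f}, unemployment coef={b_ur:.4f}")
```

Here the fitted coefficients simply summarize how much the (made-up) foreclosure rate moves with each explanatory variable; a real exercise would also compare standardized coefficients to judge which driver dominates.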
The model’s findings are consistent with the dual-trigger theory of foreclosure: a lack of equity combined with an economic shock (typically unemployment) tips a household into foreclosure. When homeowners have insufficient equity and their LTV is above 100 percent, they become vulnerable to adverse economic shocks. For example, if borrowers lose their jobs and have insufficient income to pay the mortgage, they will typically try to sell their homes. However, if the value of the property is below the current unpaid principal balance, they will be unable to sell the home and pay back the mortgage, which may tip them into foreclosure.
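The dual-trigger logic above can be sketched as a tiny predicate: foreclosure risk requires both negative equity and an income shock. This is an illustrative simplification, not a model from the source; the function name and inputs are hypothetical.

```python
# Minimal sketch of the dual-trigger idea (illustrative only): a borrower is
# at foreclosure risk only when BOTH triggers fire, namely negative equity
# (current LTV above 100%) and an income shock such as job loss. A borrower
# with equity can sell the home and repay the loan instead.

def at_foreclosure_risk(unpaid_balance: float, home_value: float,
                        income_shock: bool) -> bool:
    underwater = unpaid_balance > home_value  # current LTV > 100%
    return underwater and income_shock

# Job loss alone: the borrower can still sell and repay the mortgage.
print(at_foreclosure_risk(180_000, 200_000, income_shock=True))   # False
# Negative equity alone: payments continue as long as income holds up.
print(at_foreclosure_risk(220_000, 200_000, income_shock=False))  # False
# Both triggers: a sale cannot cover the balance.
print(at_foreclosure_risk(220_000, 200_000, income_shock=True))   # True
```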
The model’s findings are also consistent with the theoretical literature that has emerged over the last decade indicating that leverage cycles are an underappreciated driver of default3 (Figure 2). Mortgage leverage increased in the early 1950s as down payment requirements fell, which helped drive homeownership higher but also drove foreclosure rates higher. Between the early-to-mid 1960s and the mid-1980s, leverage stabilized, which led to stable foreclosure rates. Leverage stabilized because high inflation led to rapid increases in nominal home values; otherwise leverage would have continued to increase. Moreover, the rise in inflation caused an increase in nominal incomes, which made the pre-existing fixed monthly mortgage payment easier to meet.
During the early-to-mid 1990s, leverage began to increase again due to the emphasis on supporting homeownership via lower down payment lending. During the early 2000s, leverage rose even faster due to the rapid growth of cash-out refinances and home equity lending. When home prices crashed, leverage spiked and, along with the increase in unemployment, caused foreclosures to soar. Over the past four years that process has reversed and leverage has declined. Since the trough in home prices in March 2011, prices have increased 29 percent through November 2014, which reduced aggregate LTV from 61 percent in 2011 to 46 percent in 2014.
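A quick back-of-envelope check connects the price gain to the LTV decline. Assuming, purely for illustration, that aggregate mortgage debt stayed flat, LTV = debt / value, so a 29 percent price gain divides the ratio by 1.29; the small gap to the reported 46 percent would reflect changes in outstanding debt that this sketch ignores.

```python
# Back-of-envelope check (hypothetical flat-debt assumption):
# aggregate LTV = debt / value, so a 29% price gain divides LTV by 1.29.
ltv_2011 = 61.0      # aggregate LTV, percent, 2011 (from the text)
price_gain = 0.29    # home price increase, Mar 2011 - Nov 2014 (from the text)
ltv_2014_est = ltv_2011 / (1 + price_gain)
print(f"{ltv_2014_est:.1f}%")  # prints "47.3%", close to the reported 46%
```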
To illustrate the impact of the home price recovery on foreclosure rates, the model was used to break down how much of the recent decline was due to price increases versus all other variables combined. Between 2011 and 2014, foreclosure rates fell by 1.5 percentage points, and the rise in prices accounted for 1.4 percentage points of that decline. In other words, 91 percent of the drop in the foreclosure rate is due to the drop in leverage via higher home prices. Unemployment and the remaining variables accounted for the small remaining portion.
CoreLogic findings illustrate how important leverage has been, both historically and in today’s recovery. While leverage is the dominant driver of foreclosure trends, the unemployment rate captures the impact of short-term cyclical fluctuations in the economy. A less important but still influential factor has been periods of accelerating inflation, which eased the burden of the monthly mortgage payment and masked the rise in leverage via higher nominal home prices. Interestingly, the savings rate and household income were not at all important, which was a surprise given that traditional underwriting focuses on affordability. Over the next year, continuing improvement in prices should help further reduce leverage, but the renewed emphasis on low down payment lending may lead to an increase in leverage beyond 2015.
While the Federal Reserve utilized traditional monetary policy and unconventional tools to guide the economy through the recession and the recovery, one lever that is not utilized in federal regulatory policy is a leverage target, which more than 20 countries have used with success4. The Qualified Mortgage (QM) rule focused on the borrower’s ability to repay, and it will lead to better-performing mortgages; however, the QM rule lacked a leverage standard5. While the QM rule will help improve mortgage performance, it is aimed squarely at delinquency risk and less at foreclosure risk. That means the most important driver of mortgage performance over the last five decades has remained unaddressed for the market and will likely need to be addressed in the future.
© 2015 CoreLogic, Inc. All rights reserved.