Future of Compliance in the Mortgage Industry

QC 2.0: Next Generation of Quality Control for the Mortgage Industry

By Steve Spies, SWS Risk Advisory

The last mortgage war is over. The mortgage loan quality battles raged from 2008 until the tide turned around 2015. The war was won by the industry’s embrace of quality standards rivaling industries known for their precision manufacturing processes. Rolex would give a tip of the cap to the mortgage industry’s >99.5 percent significant-defect accuracy rate. But that long-needed, permanent investment in quality control was designed for a high-document, labor-intensive, low-tech world. Digitization, risk proliferation, and the competitive onslaught require a new mindset for using quality control dollars to survive in the era of Big Data. In short, it’s time to pivot from defense to offense, and this month I lay out a plan to do that.

In next month’s issue, I will take a stab at forecasting the industry’s progress over the next decade in automating the credit decision process in alignment with loan quality requirements. Spoiler alert: even though great gains will be made in the routine aspects of the business, like closing docs, underwriting and QC are a different story, because every mortgage is like a thumbprint. Each loan is so borrower-unique that big challenges in credit data standardization prevent tech from providing anything close to a 100 percent solution in the next ten years. Count me a doubter that “within two years the majority of mortgages won’t need a human touch.”

Part One

Mortgage QC 2.0—The Best Defense is a Good Offense

Quality Control as an offensive weapon has three main attributes:

  1. Proactive – 80 percent real-time, pre-closing; 20 percent post-close
  2. Profit centered – strategic, targeted QC driving continuous bottom-line improvement across the enterprise by reducing costly manufacturing defects.
  3. Data driven, human applied – Artificial Intelligence (AI) and tech rightfully power QC 2.0, but they need extreme discipline to ensure accurate data quality and interpretation.

Proactive:

It’s important to distinguish quality assurance from quality control. Quality assurance is making sure customer expectations are met before the transaction is completed. Quality control is a post-closing validation that the manufacturing process worked as intended and as required by law and investors. Most lenders intentionally invest only in back-end quality control, as an audit and compliance necessity. During the crisis, QC became synonymous with repurchase avoidance, and then a defense against stricter legal and regulatory compliance standards. Those foundational, post-manufacturing duties must continue, but the GSEs have committed to long-term repurchase risk minimization. This incentivizes reallocating resources to managing quality while creating your product or service. Embedding a quality assurance feedback loop in the product delivery process prevents mistakes from impacting your customers. Have you analyzed the cost of reworking a loan? The price of a lost customer? Getting it right the first time flows straight to the bottom line and keeps coveted customer surveys and social media ratings high. It’s not unreasonable to shift 80 percent of your overall loan quality dollars into pre-delivery, because QC technology now offers robust pre-funding QC modules, reporting, and feedback loops, possibly with an overall reduced QC budget.
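One way to answer the rework question is a back-of-the-envelope model like the minimal Python sketch below; every figure in it is a placeholder assumption, not industry data, so substitute your own volume, defect, and staffing numbers.

```python
# Minimal rework-cost sketch. All figures below are assumptions for illustration only.
loans_per_month = 1_000          # assumed monthly closed-loan volume
rework_rate = 0.08               # assumed share of loans sent back through underwriting
cost_per_rework = 500            # assumed fully loaded cost of one rework cycle (staff time, delay)
prefund_sample_rate = 0.20       # assumed share of the pipeline routed through pre-funding QC
prefund_catch_rate = 0.60        # assumed share of rework-causing defects caught via risk-based targeting
cost_per_prefund_review = 75     # assumed incremental cost of one targeted pre-funding review

reworks_avoided = loans_per_month * rework_rate * prefund_catch_rate
gross_savings = reworks_avoided * cost_per_rework
review_cost = loans_per_month * prefund_sample_rate * cost_per_prefund_review

print(f"Reworks avoided per month: {reworks_avoided:.0f}")
print(f"Net monthly savings: ${gross_savings - review_cost:,.0f}")
```

Even this toy version leaves out the price of a lost customer and the reputational cost of a blown closing date, which only push the math further toward catching defects before delivery.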

More importantly, as repurchase defects are so rare, quality assurance can focus on manufacturing errors that really cost you money. Overprocessing, waste, and delays are eating your lunch now. Mortgage lenders are well aware that the percentage of “just got lucky” defects is much higher than the approximately 0.5 percent industry repurchase error rate. These impactful defects generally don’t require repurchase when the overall loan credit profile is strong. However, the same mistake on loans with marginal credit parameters might require financial remediation if not outright repurchase. Lenders should have a sense of urgency about this; it’s time to optimize manufacturing quality before the inevitable recession and the typical erosion in loan quality that comes with it. Only proactive QC improves or preserves profits, the second pillar of next-generation QC.

Profit Centered:

I recently outlined the case that strategic lenders will view QC as a profit engine. To recap, the cost to originate a loan is pushing $10,000, which cannot last. The industry constantly bemoans over-documentation and loans sent back through underwriting five or six times due to faulty or errant documentation. And we can’t help throwing more and more money at originators. The cost of implementing all the technological bells and whistles may add hundreds of dollars per loan with questionable payback. And that assumes implementation of these tech tools is seamless and accurate. And finally, of course, lenders are absorbing a backbreaking amount of increased compliance and regulatory costs.

Forward-thinking lenders realize investing in quality control as a real-time feedback mechanism drives continuous improvement to revenue and cost. Shifting your mindset from QC as a “no news is good news” function to a powerful, proactive source of improvement may separate winners from losers. It’s likely no other function in your organization dissects every step and every document on a sample of your loans every month. QC’s timely results drive cost savings now, rather than in some imagined future state.

In case you missed it, here are the five immediate ways I suggested your QC work can lower costs and improve revenue:

  1. Underwriting overlays – needed or overly cautious?
  2. Required documents – eliminate CYA docs
  3. Rework – loan defects slowing down the process?
  4. Technology – promised savings being realized?
  5. Liquidity – repurchase/loss reserves excessive?

Here are a couple more interesting possibilities:

  1. People – coaching up those with the most defects is an easy win
  2. Loan type – self-employed loans taking twice as long?

Just a couple of hours brainstorming over your QC results will yield a long list of high-impact opportunities. Rather than mechanically sampling and reporting on repurchase and compliance, the more advanced reporting tools from QC vendors and in-house systems can be plumbed for endless insight into company operations. Engage your QC team or vendor to generate improvement ideas, gather market intel, stop overkill, and prevent mistakes.
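As a starting point, that kind of mining can be as simple as the hypothetical sketch below, which tallies defects by originator and cycle time by loan type from a QC export; the field names and sample records are invented for illustration.

```python
# Hypothetical QC-results mining: defects by originator, cycle time by loan type.
from collections import defaultdict
from statistics import mean

findings = [  # stand-in for a month of QC findings exported from your QC system
    {"originator": "A. Smith", "loan_type": "self_employed", "defects": 3, "days_to_close": 52},
    {"originator": "A. Smith", "loan_type": "w2_salaried",   "defects": 0, "days_to_close": 31},
    {"originator": "B. Jones", "loan_type": "self_employed", "defects": 1, "days_to_close": 47},
    {"originator": "B. Jones", "loan_type": "w2_salaried",   "defects": 0, "days_to_close": 28},
]

defects_by_originator = defaultdict(int)
days_by_loan_type = defaultdict(list)
for f in findings:
    defects_by_originator[f["originator"]] += f["defects"]
    days_by_loan_type[f["loan_type"]].append(f["days_to_close"])

# People: who would benefit most from coaching?
for name, total in sorted(defects_by_originator.items(), key=lambda item: -item[1]):
    print(f"{name}: {total} defects this cycle")

# Loan type: are self-employed files really taking twice as long?
for loan_type, days in days_by_loan_type.items():
    print(f"{loan_type}: average {mean(days):.0f} days to close")
```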

Data Driven, Human Applied:

There is no looking back on the digitization of business processes, and QC is no exception. In fact, because its main purpose is analyzing data, whether inside a document or as a raw data point, QC lends itself to automation better than other parts of the business. AI and machine learning excel at spotting when two or more pieces of information don’t make sense in relation to each other. Lenders’ investment in these tools requires at least two guardrails. First, implement with a purpose, either driving front-end improvement to a manufacturing weakness or spotting troubling quality issues and fraud patterns post-close. Don’t act without a plausible ROI. Second, ROI calculations must factor in that AI and robotics often generate high false-positive rates. Chasing down too many inaccurate red flags destroys the benefit and numbs the staff to real risk indicators.
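To make that second guardrail concrete, here is a minimal ROI sketch in Python; every number is an assumption chosen for illustration, not a vendor or lender figure.

```python
# Hypothetical ROI check for an automated red-flag tool, factoring in false positives.
monthly_alerts = 600             # assumed alerts generated by the tool each month
precision = 0.05                 # assumed share of alerts that turn out to be real issues
value_per_true_catch = 3_000     # assumed average loss avoided per confirmed issue
cost_to_chase_alert = 80         # assumed staff cost to research any alert, real or not
monthly_tool_cost = 12_000       # assumed subscription and implementation amortization

true_catches = monthly_alerts * precision
chase_cost = monthly_alerts * cost_to_chase_alert
net_benefit = true_catches * value_per_true_catch - chase_cost - monthly_tool_cost
break_even_precision = (chase_cost + monthly_tool_cost) / (monthly_alerts * value_per_true_catch)

print(f"True catches: {true_catches:.0f} of {monthly_alerts} alerts")
print(f"Expected monthly net benefit: ${net_benefit:,.0f}")
print(f"Break-even precision: {break_even_precision:.1%}")
```

The break-even line is the point of the exercise: a small drop in alert precision, or a rise in chase cost, can erase the benefit entirely, which is exactly the false-positive trap described above.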

The Big Data era is driving a generational shift in mortgage origination, from customer acquisition to fulfillment, underwriting, and closing. Of course, this shift creates new frontiers for quality control. Any investment in data-driven solutions comes with a big piece of yellow caution tape around it. Even though we have the technology and skill to spot defects with data, the data quality curve is just emerging. Most mortgage file credit elements lack standardization and controls around data creation, transmission, and protection. Borrowing a baseball analogy, income and employment data integrity is barely out of the first inning. Add in that humans handle this data at some point, introducing everything from ordinary error to biased interpretation to outright manipulation. By comparison, because of widely agreed-upon formats and data definitions, appraisal data standardization and digitization approaches the late innings. Regardless of where a piece of technology claims to be in the game, QC must lead in testing data quality and holding vendors accountable for data accuracy claims. Confirmation bias can be rampant amid the glamour of new technology.

To keep pace with mortgage digitization, QC must develop data quality testing expertise. But don’t be fooled by tech providers claiming to verify direct from the “source of truth.” This term entered the industry lexicon on the mistaken belief that going to the “source,” for example an employer, makes the data the “truth” for purposes of credit decisioning. I prefer the terms “primary sources” and “alternative sources.” An employer is a primary source for employment information, but its “truth” depends on the employer’s approach to an array of consistency, definitional, and reporting requirements. For starters: Did the income vendor interpret and map the employer’s paystub fields correctly? Did they manually manipulate the data in any way? Did your loan origination system import the data correctly? Your QC team must independently validate and analyze data quality at all points along its digital journey.
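One way to picture that validation is a simple lineage check that compares the same field at each handoff in its digital journey; the field names, values, and tolerance in the sketch below are all hypothetical.

```python
# Hypothetical data-lineage check: the same income figure captured at three handoff points
# should agree within a tolerance, or the file gets routed for human review.
TOLERANCE = 0.01  # assumed: flag differences greater than 1 percent

loan = {
    "paystub_ocr_monthly_income": 7250.00,   # what document extraction read from the paystub
    "vendor_feed_monthly_income": 7250.00,   # what the income-verification vendor returned
    "los_monthly_income": 7520.00,           # what landed in the loan origination system
}

baseline = loan["paystub_ocr_monthly_income"]
for field, value in loan.items():
    drift = abs(value - baseline) / baseline
    if drift > TOLERANCE:
        print(f"Review needed: {field} = {value:,.2f} differs {drift:.1%} from the source document")
```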

A single source for data also complicates independent validation. QC should lead development of secondary sources that don’t rely on the primary source to confirm accuracy and reasonableness. For instance, can LinkedIn give us independent insight on employment status? Even in today’s mostly manual world, we seldom rely on one source of corroboration. As an example, we now get up to four pieces of primary data on employment and income: a paystub, W-2, verbal verification of employment, and sometimes tax returns. Further, relying on a single source opens that source up to targeted manipulation and falsification. Fooling ourselves that QC is no longer necessary would be the biggest mistake of all; quite the opposite. By assuming an unvalidated primary source is always accurate, we potentially back into another version of no-doc loans.
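A corroboration test in that spirit might look like the sketch below: no single figure is treated as truth, only mutual reasonableness across independently obtained sources. The sources, values, and 10 percent band are assumptions for illustration.

```python
# Hypothetical multi-source corroboration of annual qualifying income.
annual_income_by_source = {
    "paystub_annualized": 87_000,
    "w2_prior_year": 84_500,
    "tax_return_prior_year": 61_000,   # e.g., business losses reduce qualifying income
}

values = list(annual_income_by_source.values())
spread = (max(values) - min(values)) / min(values)

if spread > 0.10:   # assumed 10 percent reasonableness band
    print(f"Sources disagree by {spread:.0%}; route to a human for resolution")
else:
    print("Sources corroborate one another within tolerance")
```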

Hoping to optimize speed and cost, AI advocates imagine a world where a sufficiently predictive credit decision can be made from modeling publicly available information. Proponents say it’s a straightforward matter of mining social media, credit scores, and everything else the internet knows about us, and feeding it all to an algorithm. Then, it’s a simple matter of pricing for the modeled default risk. No need for documentation or other controls. Just consider the accuracy of your own internet footprint to know that data quality concerns grow exponentially with such an approach. And pricing for risk was one of the mistakes at the heart of the last crisis. By design, a pricing-for-risk approach assumes some borrowers lose their homes and is contrary to the goal of sustainable home ownership. My view is that no matter the sophistication of the decision engine, we can never compromise the fundamentals of independently establishing a borrower’s willingness and ability to repay a debt.

With these limits on how far “data and done” can carry us, in next month’s issue I share my vision for the next decade’s wins and challenges in pursuit of the digital mortgage.

Source: https://www.mortgagebankermag.com/quality-assurance/qc-2-0-next-generation-of-quality-control-for-the-mortgage-industry/
