TheMReport — News and strategies for the evolving mortgage marketplace.
Issue link: http://digital.themreport.com/i/134983
If there was a silver lining to the housing crash and economic collapse of the last decade, it would be that the industry learned some valuable lessons about evaluating loans—even if they were learned the hard way.

"Hopefully, that's been a cautionary tale going forward for paying very close attention to potential risk in originating loans and in servicing and holding loans," Blomquist said. "I think that the lessons of the housing crisis should teach that risk management is extremely important. I guess the good news is there are many more tools to do that, and there's a lot more data out there to help evaluate and manage risk within a given portfolio than we had five or 10 years ago."

Regulation and Other Wrenches in the Works

In addition to adjusting to the sheer scope and complexity of loan-level data available in today's mortgage banking environment, lenders and other groups have to consider other factors—both in-house and from outside—that can gum up their business. The ever-watchful eyes of the Consumer Financial Protection Bureau and similar regulatory agencies represent possibly the biggest complication.

"The expectations are that you're going to hit a much higher mark in terms of how you're using data, what data you're collecting, the quality of that data, and how that folds into your risk management practices," explained John Worth of the National Credit Union Administration. "And then you've got a much higher bar on a number of levels of compliance. You really can't go back to pre-crisis-style data management, where you had very little understanding of what's going on with the nuts and bolts."

As a result of this year's rush toward compliance, Worth says he expects to see a lot more front-end quality control and compliance over time. As far as ensuring and improving quality control goes, he believes it's a matter of being able to execute second-stage verification—for instance, verifying that income was documented accurately—without opening up folders and doing costly and time-consuming loan-level reviews.

Another area of concern is managing the problems that can spring up in the rank and file of a company when manual underwriting allows loans that are likely safe but may contain problematic features.

"The friction is caused by the underwriter and loan officer having to slow down the lending process to go do more research and by causing the borrower to have to come up with some more documentation or explain a particular situation that shows up on their application documents," Sharga said. "So you can understand why, if you're fairly confident that you're issuing a good loan to a good borrower, what you'd really rather not have is someone in a compliance role or a QC [quality control] role questioning that and forcing you to go back and do more work—only to issue the loan later anyway."

However, given the recent history in the industry, Sharga remarked that "it's important to have those kind of safety checks in place," even if there are frustrations.

Cracks in the Armor

Of course, even the most well-designed plans show weaknesses in the early implementation stage, and the MPact speakers weren't shy about pointing out the flaws in the industry's new processes.
Again, there were differing opinions on where the problems lie. For Sharga, it comes back to compliance, which he believes will "result in most of the large retail banks only issuing those vanilla loans that fit within the strict confines of the QM [qualified mortgage] rule."

"It will probably require all-new risk management protocols and procedures for any lender who decides to issue a loan outside those [QM standards] so that they protect themselves from potential litigation up front," Sharga said. "And the tradeoff in having less risky loans in that scenario is that it costs the lender more to issue the loan and will ultimately cost the borrower more to get the loan."

Sharga continued, "That's probably the part of risk management that nobody writes about—none of this is free, and even though at the end of the day you can make an argument that having a portfolio loan that's performing the way it's supposed to is less expensive for a lender or servicer, it costs so much to stay within regulatory compliance … that ultimately it's going to be more expensive for borrowers to get loans."

And then there's the data that, so far at least, seems to be going unnoticed, as Blomquist points out.

"I think what the industry's probably pretty good at—although I'm sure there's room for improvement—is evaluating the risk involved in a certain borrower and in a certain type of loan given to that borrower," Blomquist said. "However, I don't think the risk that's being taken into account is the risk of property or the potential risk of properties that are securing these loans."

Blomquist noted 20 different local data points tracked by his company that speak to the quality of properties and of surrounding neighborhoods. A lot of that type of data gets ignored, he says.

"[T]hat's a piece of risk management that I don't [think] a lot of originators and servicers who are acquiring these packages are looking at," Blomquist commented. "And I think it's important to realize that the type of property, the neighborhood that property's in, and the risk factors of that neighborhood are important to take into account when looking at a potential risk of any given loan."

"The Right Tools for Managing Data and Risk" session at MPact 2013 was sponsored by MReport magazine. This piece can also be found in the MPact Data and Risk Executive Summary issued to conference attendees.