The GSE NPL sales program gives investors the opportunity to profit from non-performing loans purchased from Fannie Mae and Freddie Mac while maximizing the number of loans they can return to re-performing status. Freddie Mac and Fannie Mae (the GSEs) have been selling non-performing loans (NPLs) since 2014 and 2015, respectively, to reduce their holdings of less liquid non-performing assets and to achieve more favorable execution than holding the loans, as they have done in the past.
VIENNA, Va., May 15, 2017 – RiskSpan, a data management and predictive analytics firm that specializes in solutions for the mortgage, capital markets, and banking industries, today announced a new release of its loan-level mortgage query tool, RS Edge.
Financial institutions are constantly seeking new ways to maintain a competitive advantage and increase efficiency. As competition intensifies and the regulatory environment becomes increasingly uncertain, many institutions are turning to technology. To keep pace, these institutions are incorporating big data into their business strategies.
Complying with DFAST/CCAR requirements within an existing quantitative modeling and model risk management framework is among the most daunting of the many recent challenges facing banks, Bank Holding Companies (BHCs), and some Intermediate Holding Companies (IHCs). The Dodd-Frank Act Stress Tests (DFAST) require all financial institutions with total assets above $10 billion to conduct stress tests of their portfolios and balance sheets. The Comprehensive Capital Analysis and Review (CCAR) is generally required once a bank’s total assets exceed $50 billion. The objective of both exercises is to simulate a bank’s balance sheet performance and losses in a hypothetical severe economic downturn over the next nine quarters. Given this common objective, most risk managers complete both exercises together.
Whether to use open source data modeling tools has become a topic of debate as large organizations, including government agencies and financial institutions, come under increasing pressure to keep pace with technological innovation to remain competitive. Organizations must be flexible in development and identify cost efficiencies to reach their goals, and choosing the right tools is crucial. They must often choose between open source software, i.e., software whose source code anyone can modify, and closed-source software, i.e., proprietary software with no permission to alter or distribute the underlying code.
VIENNA, Va., April 12, 2017 – RiskSpan, the data management, data applications, and predictive analytics firm that specializes in risk solutions for the mortgage, capital markets, and banking industries, today announced it is relocating its headquarters to the Rosslyn business district in Arlington, Virginia.
The financial industry has traditionally been slow to adopt the latest data and technology trends, and open source software is no exception. While open source has been around for decades, it is only now gaining traction in the finance and mortgage industries. Many institutions are exploring its viability but hesitate to act because of the risks open source can expose them to.
VIENNA, Va., March 22, 2017 – RiskSpan, the data management, data applications, and predictive analytics firm that specializes in risk solutions for the mortgage, capital markets, and banking industries, today at IBM’s InterConnect conference in Las Vegas, announced the launch of a new data and analytics API on the IBM Bluemix platform.
Last month I highlighted the role of the front-end insurance risk share process in Credit Risk Transfer (CRT). I reviewed the current state of the front-end risk share model and noted the expanded efforts underway to broaden the pool of mortgage insurers (MIs) and reinsurers as counterparties for front-end offerings. These efforts complement the back-end CRT programs, which have been highly successful thus far. So the key question for 2017 is this: what does CRT look like in a post-housing-reform environment where much of the capital at risk is not a government credit guarantee but private capital?
The question of “build versus buy” is every bit as applicable and challenging to model validation departments as it is to other areas of a financial institution. With no “one-size-fits-all” solution, banks are frequently faced with a balancing act between the use of internal and external model validation resources. This article is a guide for deciding between staffing a fully independent internal model validation department, outsourcing the entire operation, or a combination of the two.
VIENNA, Va., March 9, 2017 – RiskSpan, the data management, data applications, and predictive analytics firm that specializes in risk solutions for the mortgage, capital markets, and banking industries, announced that it has been selected for HousingWire’s 2017 HW TECH100™ award.
This year saw the highest number of nominees in the history of HW TECH100™, which recognizes leading companies that bring tech innovation to the U.S. housing economy. Among this year’s winners are other industry-leading firms such as Accenture, CoreLogic, and Freddie Mac.
Machine learning models are increasingly being adopted by the finance community in general and the mortgage market in particular. The use of modeling and data analytics has been key to the turnaround of this market; however, anyone who has worked with mortgage loan data knows it is notorious for errors and data gaps. Despite industry-wide efforts to incorporate robust quality control programs, challenges with mortgage data persist. Fortunately, combining machine learning in finance with cloud computing shows promise in addressing mortgage data gaps and producing more accurate results than traditional approaches.
VIENNA, Va., March 2, 2017 – RiskSpan, the data management, data applications, and predictive analytics firm that specializes in risk solutions for the mortgage, capital markets, and banking industries, today announced the addition of Faith Schwartz, Senior Advisor of Accenture Credit Services (ACS), to RiskSpan’s Board of Directors.
The FHFA issued an RFI to solicit stakeholder feedback on the GSEs’ proposals to adopt additional front-end credit risk transfer structures and to consider additional credit risk transfer policy issues. Investors who have confidence in the underwriting and servicing of mortgage loans under new and improved GSE standards have shown firm interest in this new and growing risk transfer execution.
As we look forward to 2017 and the critical issues facing the nation’s housing finance system, one of the paramount matters will be the ongoing development of the Credit Risk Transfer (CRT) initiative.
In some respects, the OCC 2011-12/SR 11-7 mandate to verify model inputs could not be any more straightforward: “Process verification … includes verifying that internal and external data inputs continue to be accurate, complete, consistent with model purpose and design, and of the highest quality available.” From a logical perspective, this requirement is unambiguous and non-controversial. After all, the reliability of a model’s outputs cannot be any better than the quality of its inputs.
Even with CECL compelling banks to collect more internal loan data, we continue to emphasize profitability as the primary benefit of robust, proprietary, loan-level data. Make no mistake, the data template we outline below is for CECL modeling. CECL compliance, however, is a prerequisite to profitability. Also, while third-party data may suffice for some components of the CECL estimate, especially in the early years of implementation, reliance on third-party data can drag down profitability. Third-party data is often expensive to buy, may be unsatisfactory to an auditor, and can yield less accurate forecasts. Inaccurate forecasts mean volatile loss reserves and excessive capital buffers that dilute shareholder profitability. An accurate forecast built on internal data not only solves these problems but can also be leveraged to optimize loan screening and loan pricing decisions.
When someone asks you what a model validation is, what is the first thing you think of? If you are like most, you immediately think of performance metrics: quantitative indicators that tell you not only whether the model is working as intended, but also how its performance and accuracy hold up over time and compare to alternatives. Performance testing is the core of any model validation and generally consists of the following components:
- Benchmarking
- Back-Testing
- Sensitivity Analysis
- Stress Testing
Sensitivity analysis and stress testing, while critical to any model validation’s performance testing, will be covered by a future article. This post will focus on the relative virtues of benchmarking versus back-testing—seeking to define what each is, when and how each should be used, and how to make best use of the results of each.
On this Veterans Day, I was reminded of the Urban Institute’s 2014 article on VA loan performance and its explanation of why VA loans outperform FHA loans. The article illustrated how VA loans outperformed comparable FHA loans even after controlling for key variables like FICO, income, and DTI. It further explained the structural differences and similarities between the veterans program and FHA loans; the similarities include owner occupancy, loan size, and low down payments.
As the amount of student loan debt continues to balloon, the subject is drawing growing national attention, along with attempts to better educate potential borrowers. There is a clear need, however, to do more to help prospective college students and their families make informed financial decisions about the true cost of higher education. With student loan repayment terms extending up to 25 years (and 30 years for consolidated loans), the long-term effect of this debt on students and the future of our nation is concerning.