
Assessing Prospective CECL Methodologies: Probability of Default/ Loss Given Default

Brandy Aycock
May 5, 2017

Part of a series examining prospective CECL-compliant methodologies 

The road to CECL compliance ends in identifying the CECL methodology (or methodologies) that best suits the institution and its loan portfolios while also satisfying CECL guidance and regulator expectations.

FASB has been steadfast in not specifying or even identifying all appropriate CECL methodologies. Because each institution is different, with different types of loan portfolios and different underwriting principles, the method or methodologies you choose must be determined by your unique blend of products, portfolios and markets, as well as the quality and quantity of your available loan-level data.

That doesn’t mean a lender has to rebuild its methodology from scratch, only that it must choose a methodology for each loan pool based on its own judgment and circumstances. Most institutions have relied on historical loss analysis to estimate their allowance under the incurred loss standard, but under a rule that requires estimating loss based on assumptions that start on day one of the loan and follow it to its likely conclusion, an estimate based simply on what happened last year won’t fly – or comply. Given the need for a more sophisticated methodology, most institutions will have to consider new approaches, though their loss histories will certainly be included in the calculus.

For insights on prospective CECL methodologies, we asked the advisors of MST Advisory, consultants helping lenders through the discovery and decisions that will lead to choosing and implementing a CECL methodology. 

What is probability of default/loss given default (PD/LGD)?

When used in migration and vintage analysis, a PD/LGD methodology combines the calculation of the probability of loans experiencing default events with the losses ultimately associated with those defaults.

The “PD” component of the formula is the percentage of loans in that pool that have defaulted (as defined by the institution) over a look-back period. Lenders can use a so-called “PD balance,” the percentage of the total balance of the portfolio that has defaulted over the look-back period; as such, larger loans have more impact. Alternatively, the PD may be computed on a count basis, as the number of defaulted loans relative to the number of loans in the pool. The “LGD” component is the percentage of the defaulted loan balance (that is, the exposure at the time of default) ultimately charged off. Multiplying one by the other gives the lender a loss rate, which is then applied to the loan portfolio balance to determine expected future losses.
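To make the arithmetic concrete, here is a minimal Python sketch of the pooled calculation described above. The function name and the dollar figures are hypothetical illustrations, not values or tooling from the article or MST Advisory.

```python
# Minimal sketch of a pooled PD/LGD loss-rate calculation.
# All figures and names below are hypothetical illustrations.

def pooled_loss_rate(defaulted_balance: float,
                     pool_balance: float,
                     charged_off: float) -> float:
    """Return the pool loss rate as PD x LGD.

    PD  = defaulted balance / total pool balance over the look-back period
          (the "PD balance" approach, so larger loans weigh more heavily)
    LGD = charge-offs / balance at default (exposure at default)
    """
    pd_rate = defaulted_balance / pool_balance
    lgd_rate = charged_off / defaulted_balance
    return pd_rate * lgd_rate


# Example: $2M of a $50M pool defaulted; $800K of that was ultimately charged off.
loss_rate = pooled_loss_rate(defaulted_balance=2_000_000,
                             pool_balance=50_000_000,
                             charged_off=800_000)

current_pool_balance = 48_000_000
expected_loss = loss_rate * current_pool_balance
print(f"Loss rate: {loss_rate:.2%}, expected loss: ${expected_loss:,.0f}")
```

Because the PD here is balance-weighted, larger defaulted loans move the rate more than an equal number of smaller loans would, consistent with the “PD balance” approach described above.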

Why PD/LGD?

An advantage of the PD/LGD approach over the simpler loss rate approach is that the lender can use different look-back periods for each component: life of loan for the PD to determine default rates throughout the loan cycle, and a more recent period for the LGD to capture more current and relevant information about declining collateral values. Also, when adjusting PDs and LGDs for qualitative and quantitative factors, different adjustment factors may be more powerful for each component. For example, PDs may correlate strongly to national or regional economic conditions, while LGDs may correlate more strongly to local conditions such as local collateral value changes.
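As a rough sketch of that idea, the following Python snippet averages the PD history over a longer look-back window with one adjustment factor and the LGD history over a shorter window with a different factor. Every window length, rate, and adjustment factor below is an assumed placeholder for illustration, not a recommended value.

```python
# Hedged sketch: separate look-back windows and qualitative adjustments
# for the PD and LGD components. All inputs are illustrative assumptions.

def component_rate(history: list[float], lookback: int, q_factor: float) -> float:
    """Average a component over its own look-back window, then apply a
    qualitative/quantitative adjustment (e.g., for economic conditions)."""
    window = history[-lookback:]
    return (sum(window) / len(window)) * (1 + q_factor)

# Annual default rates (life-of-loan history) and LGD rates (recent history
# reflecting current collateral values); made-up example data.
pd_history  = [0.020, 0.025, 0.030, 0.045, 0.035, 0.030, 0.028]
lgd_history = [0.55, 0.50, 0.42, 0.38]

# PD adjusted for national/regional economic outlook; LGD adjusted for
# local collateral-value trends, mirroring the correlations noted above.
pd_rate  = component_rate(pd_history,  lookback=len(pd_history), q_factor=0.10)
lgd_rate = component_rate(lgd_history, lookback=2, q_factor=-0.05)

print(f"Adjusted loss rate: {pd_rate * lgd_rate:.2%}")
```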

By separating its two components, PD/LGD provides better insight – that is, more granularity – into loan losses. A loss rate expressed as a single variable does not reveal how much of the loss rate might be due to large numbers of loans having small charge-offs or small numbers of loans having large charge-offs. But PD/LGD is also a more complex methodology. 
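A small, purely illustrative example of that point: the two hypothetical pools below carry the same blended loss rate, but splitting it into PD and LGD reveals very different underlying problems.

```python
# Illustrative-only sketch: identical blended loss rates, different risk profiles.
pools = {
    "Pool A": {"pd": 0.10, "lgd": 0.10},  # many defaults, small loss per default
    "Pool B": {"pd": 0.02, "lgd": 0.50},  # few defaults, severe loss per default
}

for name, p in pools.items():
    print(f"{name}: PD={p['pd']:.0%}, LGD={p['lgd']:.0%}, "
          f"loss rate={p['pd'] * p['lgd']:.1%}")

# Both pools show a 1.0% loss rate, but Pool A points toward default frequency
# (underwriting) issues while Pool B points toward collateral/recovery issues.
```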

“The value of breaking down losses into the two components could more than compensate for the added complexity, but whether that balance between cost and benefit is in the lender’s favor depends on whether and how the disaggregated information is used,” Dorsey Baskin pointed out. “Both approaches to measuring past experience and applying that knowledge to estimate credit losses are acceptable for purposes of measuring incurred and expected loan losses. Both are highly functional. The breakdown gives the lender information to improve operations by focusing on either the causes and cures of defaults or the minimization of charge-offs from defaulted loans, or both.”  

“Separate measures might, for example, provide useful information about separate departments within the lender’s loan administration operations. But, if the separate information does not find its way into operating metrics that affect the lender in a meaningful manner, it is questionable whether the added complexity is justified.” 

“Your data will lead you to where you need to be,” advised Shane Williams. “You could use PD/LGD or an annual adjustment to a vintage or cohort analytic. It’s just math given those structures.” 

Williams said PD/LGD is a preferred method where an institution has an adequate quantity and quality of data because “it segregates the collateral from the borrower,” allowing for separate analysis to better understand and address risk.

For more on PD/LGD, read “An ALLL Methodology for Your Future”.

Read last week’s blog on using Vintage as a prospective CECL methodology.


About the CECL Methodology Panel

Dorsey Baskin recently retired from the National Professional Standards Group of Grant Thornton LLP and now serves as an independent consultant to MST Advisory clients. His roles at Grant Thornton included national leadership of the firm’s innovation function, technical accounting and audit advisor for the banking industry audit and consulting practice, and national professional practice director.

Shane Williams, a senior advisor for MST Advisory, works with banks and credit unions to set priorities, identify data needs, implement allowance technology, run shadow analyses and identify appropriate methodologies in preparation for accounting for loan losses under CECL. Shane has more than 25 years of financial and risk management experience as a banker, in software development and delivery, and as a consultant to major financial institutions.

For more than 30 years Shelly Biggs has provided risk management leadership as an executive with, and consultant to, the nation’s largest financial institutions. Her areas of expertise include development of risk and reporting frameworks, the ALLL, CECL, corporate finance (quarterly SEC reporting), reconciliation of finance and risk data, regulatory reporting and earnings call reports. Shelly has provided risk management guidance on lending practices for commercial (CRE and C&I), mortgage and consumer (auto and credit card) credit, including loan underwriting, due diligence, appraisal review, portfolio analysis, loan loss modeling, organization of the credit department, development of credit policies and procedures, risk and credit management reporting, and management of regulatory recommendations and complex projects. Shelly is an advisor with MST Advisory.

Learn more about MST Advisory Services.

About the Author

Brandy Aycock

Brandy Aycock is Director of Event Marketing at Abrigo.

