Dining on quality credit information
In many respects, the challenges that a chef encounters in delivering quality fare to restaurant patrons are quite analogous to those faced by bank information technology personnel in generating reports for credit analysts, loan officers, and credit committees. That is, head chefs and CIOs are both responsible for transforming numerous inputs into a finished product with characteristics that their customers desire. But in either role, access to a plethora of ingredients does not necessarily translate into satisfied customers.
Food becomes palatable as a chef combines ingredients in appropriate portions, applying the required levels of heat at each stage of the process. Similarly, data is only valuable inasmuch as it is carefully selected and synthesized to create a meaningful “information plate” that enables users to optimize their decision making. For bankers inundated by a torrent of data in the form of e-mails, spreadsheets stored on desktops and shared drives, core system reports, borrower financial statements, and credit reports from third parties, the task of creating a meaningful “information plate” can be particularly time-consuming and frustrating.
Key ingredients, such as salt and butter, are stored in areas of the kitchen where they can be readily accessed by any cook or server who requires them. Analogous to the salt and butter, debt service costs and collateral values are needed in multiple credit risk management functions: underwriting, loan tracking and risk rating, and loss provisioning. In practice, however, the crucial raw data needed for these calculations is frequently stored in separate information silos. When that data cannot be readily retrieved and integrated with information kept in other silos, it places an additional demand on bank resources that are already stretched thin in an environment of heightened regulatory scrutiny.
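To make the "salt and butter" point concrete, the sketch below shows how the same two raw inputs, annual debt service and collateral value, feed ratios that recur across underwriting, risk rating, and provisioning. The figures and function names are purely illustrative assumptions, not drawn from any particular bank's system or from the article.

```python
# Hypothetical illustration: the same raw inputs feed several common credit ratios.
# All figures and function names are made up for this sketch.

def debt_service_coverage(net_operating_income: float, annual_debt_service: float) -> float:
    """DSCR = net operating income / annual debt service."""
    return net_operating_income / annual_debt_service

def loan_to_value(loan_balance: float, collateral_value: float) -> float:
    """LTV = outstanding loan balance / appraised collateral value."""
    return loan_balance / collateral_value

# Example borrower figures (illustrative only)
net_operating_income = 180_000.0
annual_debt_service = 120_000.0
loan_balance = 750_000.0
collateral_value = 1_000_000.0

dscr = debt_service_coverage(net_operating_income, annual_debt_service)  # 1.50
ltv = loan_to_value(loan_balance, collateral_value)                      # 0.75

# The same debt service and collateral figures reappear in risk rating and
# loss provisioning, which is why keeping them in one accessible place
# spares analysts from re-keying the ingredients for every dish.
print(f"DSCR: {dscr:.2f}, LTV: {ltv:.0%}")
```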
Charles Durkin, a lender and credit analyst with Private Bank Minnesota, explains that auditors and regulators are now soliciting detailed borrower financial information that they have not sought previously. Servicing these “off-menu orders” frequently requires a prolonged effort. “The toughest part with managing the information flow is that we’re dealing with a moving target. Regulators suddenly want to dig into long-standing borrower relationships that they’ve had minimal interest in before.” While most plain-vanilla consumer loans haven’t piqued auditor curiosity, global cash flow analyses are now being requested for the majority of business loans with one or more guarantors.
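For readers less familiar with the request Durkin describes, a global cash flow analysis combines the business's cash flow with the guarantor's personal cash flow and measures it against their combined debt obligations. The sketch below shows one simplified, commonly used formulation; the figures are hypothetical and the calculation is an assumption for illustration, not a regulatory prescription or the method from the whitepaper referenced at the end of this article.

```python
# Simplified global cash flow sketch: combine business and guarantor cash flows,
# then divide by their combined debt service. All figures are hypothetical.

business_cash_flow = 220_000.0            # e.g., business EBITDA
guarantor_personal_cash_flow = 95_000.0   # guarantor income less living expenses
business_debt_service = 150_000.0
guarantor_personal_debt_service = 60_000.0

global_cash_flow = business_cash_flow + guarantor_personal_cash_flow
global_debt_service = business_debt_service + guarantor_personal_debt_service

global_dscr = global_cash_flow / global_debt_service
print(f"Global DSCR: {global_dscr:.2f}")  # 1.50 in this example
```

Assembling these inputs means pulling business statements, personal financial statements, and existing loan terms from several silos, which is exactly the kind of "off-menu order" that strains a lending team's time.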
The increased demand for meticulous documentation raises a broader question about risk management: who should bear the primary responsibility for aggregating and disseminating timely data, and what role should technology play in the process? Durkin offers a few thoughts. “Most community banks don’t have the capital or resources to hire a team of experienced people to focus exclusively on managing their credit risk function,” he says. Regarding IT enhancements, “it’s very important that [credit analysts] are trained properly. If they can input spreads in a timely manner, information systems can save time. If the analyst is not well-trained, even good software won’t help the process much.”
For many small- and mid-sized banks, the most practical solution to the information sharing problem reflects Durkin’s comments: designate several members of the lending team as risk management specialists. Train each of them to become well-versed in software that enables seamless data flow between underwriting, risk rating, loss provisioning, and stress testing functions.
Beyond optimizing the “data kitchen” to facilitate these information cascades, integrated risk management software addresses two other serious data quality problems:
1. In most banking organizations, borrower financial statements, FDIC call reports, and the core system do not speak directly to one another. Instead, credit analysts and loan officers are forced to re-key data from one information silo into another. Not only is this exercise time-intensive, but it also creates numerous opportunities for information to be lost in translation via errant keystrokes or data pastes.
2. Even when data integrity has not been compromised by manual sharing, the time required to disseminate data usually means it is done infrequently. Unfortunately, managing risk with stale information is comparable to cooking with spoiled meat. Identifying at-risk loans as early as possible requires current data. Addressing loans at the first sign of borrower cash flow problems is always preferable to (and less expensive than) recognizing impairments belatedly down the road.
Few commercial bankers would prefer to spend their time assimilating numerous pieces of borrower data instead of analyzing pertinent information that has already been aggregated for them. Presently, credit analysts and loan officers are spending too many hours making repeated trips to the commissary for raw ingredients (so to speak). But with a few adjustments to their risk management processes, they could instead be enjoying an a la carte dining experience.
For more information on global cash flow analysis and the key items to avoid when performing one, download the whitepaper, The Definitive Guide to Global Cash Flow.