BID® Daily Newsletter
Nov 26, 2012

A GOOD BOOK AND LOWER STRESS


As you get back to work today, you might be thinking about what gift you are going to get for those on your holiday shopping list. Retailers are no doubt bombarding you with ideas, as coupons fly in on your computer or jam your email. This is the banking business and not retail, so while we cannot help too much, we thought you might like to give a good book to someone on your list. If so, you might be interested to know the Top 5 best sellers on the New York Times list right now (in order) are: Killing Kennedy by Bill O'Reilly and Martin Dugard; Thomas Jefferson by Jon Meacham; Killing Lincoln by Bill O'Reilly and Martin Dugard; No Easy Day by Mark Owen; and How to Create a Mind by Ray Kurzweil. This is a fascinating list to be sure, but if a general nonfiction book isn't your thing, we'll be back tomorrow with yet another idea to help our readers reduce stress during this holiday shopping season.

One way to reduce stress in banking is to roll out a program focused on data quality. To run quality reports and perform any decent analysis, you first have to start with good data. More and more, banks are required to drill down to granular levels, so clean, reliable, reconciled data has become critical. Most bankers we know grew up using Excel spreadsheets and many have built quite a structure around this tool. At some point, when taken to an extreme, this approach becomes unwieldy at best and unsustainable at worst. Excel is a great tool, but it can also be labor intensive and is prone to manual error. That doesn't mean Excel should be thrown out, but rather that care should be taken when it is used extensively.

The key to having good data at the root level is to make sure it is clean at the outset. To do this, it is important to understand how any data that may end up feeding a report is collected, calculated, aggregated and modified from its source. Banks get into trouble because data comes from multiple sources and is often not cared for in the same way. That may sound simple, but think about the standard of care your job requires and then think about how many decisions are based on data you have collected from numerous sources. When business rules and definitions don't match up across databases, or users interpret them differently, decisions are not made in an optimal manner and can even damage the bank.

Making sure data is collected and cleaned efficiently before it is shared across the bank is important. It not only improves efficiency and effectiveness, but also reduces redundancy and the incidence of error, and it provides a more structured approach in which definitions are consistent throughout the bank. Changing processes can be difficult once they are ingrained in the daily routine, but not doing so can lead to unintended consequences and increase the risk of problems in the reports and analysis that management depends upon. Data should always be reconciled with its source and delivered consistently and quickly. Regulatory and management demands require more granular data flows throughout the bank, so getting a handle on sources and uses is a critical component of any strong data capture, reconciliation and distribution project. To do this properly, you must establish a standardized approach to sourcing, reconciling, governing and processing all data of any importance flowing into and out of the bank.
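To make the reconciliation idea concrete, here is a minimal sketch in Python of the kind of check that might run before a report is distributed. The account numbers, balances and one-cent tolerance are hypothetical and purely illustrative, not a prescription for any particular core system or reporting tool.

```python
# Minimal sketch: reconcile report-level balances back to their source system
# before the report is shared across the bank. All data below is hypothetical.

from decimal import Decimal

# Source-system records: balances keyed by account, as booked in the core.
source_balances = {
    "1001": Decimal("250000.00"),
    "1002": Decimal("125500.50"),
    "1003": Decimal("78000.00"),
}

# Report extract: the same accounts after being pulled, transformed and
# aggregated into a downstream spreadsheet or reporting database.
report_balances = {
    "1001": Decimal("250000.00"),
    "1002": Decimal("125500.00"),   # drift from rounding or manual entry
    "1004": Decimal("42000.00"),    # appears in the report but not the source
}

TOLERANCE = Decimal("0.01")  # acceptable rounding difference per account


def reconcile(source, report, tolerance=TOLERANCE):
    """Return accounts that break simple reconciliation rules."""
    missing_from_report = sorted(set(source) - set(report))
    missing_from_source = sorted(set(report) - set(source))
    out_of_tolerance = sorted(
        acct for acct in set(source) & set(report)
        if abs(source[acct] - report[acct]) > tolerance
    )
    return missing_from_report, missing_from_source, out_of_tolerance


if __name__ == "__main__":
    missing_rpt, missing_src, breaks = reconcile(source_balances, report_balances)
    print("In source but not in report:", missing_rpt)
    print("In report but not in source:", missing_src)
    print("Balances outside tolerance:", breaks)
```

Even a simple check like this, run on a schedule, surfaces mismatched definitions and manual-entry drift before they reach a report that management depends upon.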
To get going, start with the most important pieces first, think standardization along the way and keep moving forward even if the steps are small ones to start. Take this approach and before you know it, just like with your holiday shopping list, you will slowly but surely complete the task. Good luck with both.