Operational risks are also becoming more important in the large, complex financial institution as more technology and automated processes are used in all areas of operations. When banks used manual processes, errors were confined to the limited area where the employee worked. But in a modern technology setting, factors such as breakdowns in controls, errors in software code, and processing stream interruptions can have enterprisewide effects on the performance of the organization.
Recent history provides us with ample evidence that operational risk can be significant. Large financial institutions have reported operational losses from breakdowns in operating controls that, in some cases, have exceeded their credit- or market-related losses. In the area of legal risk, for example, many institutions have learned that failing to identify and promptly correct problems can result in losses that significantly exceed management's initial expectations. Over the past decade, large financial institutions have experienced more than 100 operational loss events in excess of $100 million each; some of these individual operational losses, resulting from fraud, rogue trading, and settlements stemming from questionable business practices, have exceeded $1 billion.
These remarks by Ms Susan Schmidt Bies, Member of the Board of Governors of the US Federal Reserve System, at the OpRisk USA 2006 Conference reflect a growing emphasis on operational risk. This focus translates into greater regulatory attention to the quality of the data that institutions use in their calculations.
Data quality has been a management challenge for decades, and the speed of change in the connected economy has made it an even harder one for institutions to grasp. The physical and logistical problems associated with moving, archiving, and retrieving data are only part of the puzzle. As Ms. Schmidt Bies has so clearly concluded, automation itself creates a larger field of risk to monitor and, with it, a greater opportunity for failure. The absence of data does not decrease the amount of risk. Which risks, then, should we focus on: the normal and expected risks documented in external data, or the unexpected risks that have never been encountered before?
If you think about the places where the velocity of data is greatest, you have a place to begin. The processes and business functions associated with traditional annual financial audits and other external data give us a known history of loss events that needs continuous scrutiny. However, it is the key risk indicators (KRIs) in areas where the institution's knowledge of root operational risk causes combines with little or no history of losses that remain the nexus for concern.
Active risk management scenarios should therefore focus on thinking beyond the current horizon.