Our failure to prevent the Florida school shooting illustrates a pervasive problem in modern societies: we often have access to ample warning signs but all too frequently fail to leverage that information to avoid disaster. The issue affects not only law enforcement agencies but also our financial institutions. To handle the intelligence available to them more effectively, organizations will require major structural and cultural change.
The FBI and local law enforcement reportedly had more than enough information to legally disarm and detain confessed school shooter Nikolas Cruz before he killed 17 people at Marjory Stoneman Douglas High School in Parkland, Florida, on February 14. This is not the first such intelligence failure, and it won’t be the last. Consider these examples:
- 9/11 could have been prevented had the CIA and FBI done a better job of sharing and handling intelligence.
- Russian intelligence warned the FBI about Tamerlan Tsarnaev long before he carried out the Boston Marathon bombing.
- In France, authorities failed to act on multiple clues that would have enabled them to prevent the Paris attacks that claimed 130 lives in November 2015.
In the financial industry, rating agencies and bank risk management teams failed to act in their own or their clients’ best interests when they continued to create and sell residential mortgage-backed securities despite deteriorating mortgage lending standards and the growing, disturbing volume of mortgage fraud documented by the FBI in its annual mortgage fraud reports.
A well-operating risk-management function, with a voice, would most likely have limited the potential for the cultural failures seen at the Royal Bank of Scotland, as detailed in the recently published report commissioned by the Financial Conduct Authority. The extraordinary activities of the gung-ho Global Restructuring Group at RBS in London could have been stymied immediately, as they posed reputational and business risks that far outweighed the group’s short-term revenue-generating interests. As the report explains:
“GRG enjoyed an unusual independence of action for a customer-facing unit of a major bank. It saw the delivery of its own narrow commercial objectives as paramount: objectives that focused on the income GRG could generate from the charges it levied on distressed customers. In pursuing these objectives, GRG failed to take adequate account of the interests of the customers it handled and, indeed, of its own stated objective to support the turnaround of potentially viable customers.”
These assorted failures suggest a systemic problem: either risk monitoring itself is weak, or institutions fail to incorporate it appropriately. And because the problem is systemic, it won’t be solved by firing a few bad apples. Instead, we need to understand and address the root cause.
One plausible explanation is that jobs involving risk monitoring and mitigation generally carry relatively low social status and thus do not necessarily attract the most motivated applicants.
This phenomenon is epitomized by our (often unfair) stereotype of security guards: that they are ineffective and prone to sleeping on the job. Because security jobs are low-paying, they don’t often attract type “A” individuals. The job itself is quite boring: most of the time nothing happens. While a more proactive security guard could find and act on many clues during the course of his or her day, almost all of that extra effort will be for naught. At least 99 times out of 100, the suspicious backpack won’t contain an explosive device.
Although bank risk managers and FBI call handlers undoubtedly have higher social status than security guards, they too tend to be subordinated within their organizations. At a bank, monitoring credit risk is much less glamorous and lucrative than acquiring or merging companies, underwriting deals or trading securities. And, as with the seemingly suspicious backpack, most clues won’t lead anywhere anyway: for every legitimate call a law enforcement department receives, there are many that lead nowhere; similarly, a missed credit card payment rarely presages a mortgage foreclosure.
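To make this base-rate problem concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers (a one-in-a-thousand threat rate, a 90% detection rate, a 5% false-positive rate) are purely illustrative assumptions, not drawn from any agency’s data; the point is simply that, at realistic base rates, the overwhelming majority of flagged items will be dead ends.

```python
# Illustrative base-rate arithmetic (hypothetical numbers, not real agency data).
# Suppose 1 in 1,000 tips reflects a genuine threat, and a diligent reviewer
# flags 90% of genuine threats while also flagging 5% of harmless tips.

base_rate = 1 / 1000          # P(genuine threat) for any given tip
sensitivity = 0.90            # P(flagged | genuine threat)
false_positive_rate = 0.05    # P(flagged | harmless tip)

# Bayes' rule: probability that a flagged tip is a genuine threat
p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_threat_given_flag = sensitivity * base_rate / p_flagged

print(f"Share of flagged tips that are genuine threats: {p_threat_given_flag:.1%}")
# -> roughly 1.8%: even a careful reviewer's flags are overwhelmingly dead ends.
```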
Ideally, we should elevate the status of risk-monitoring jobs and make them more exciting. More attention from senior management may help. Although most money-center banks took massive losses during the financial crisis, Goldman Sachs came out relatively unscathed. A major reason is that the bank’s Chief Financial Officer reviewed daily risk-management reports and convened a meeting in his office to call for immediate action once it was detected, in 2006, that mortgage-backed securities had begun to underperform. Goldman is also an exceptional case in that it rotated fast-track talent between moneymaking and risk-management roles, and it empowered risk-management staff to veto certain trading activities.
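As a rough illustration of what such daily attention might automate, here is a minimal sketch of a report check that escalates any book whose trailing losses exceed a tolerance. The position names, P&L figures, lookback window and threshold are all invented for illustration; this is not a description of Goldman’s actual systems.

```python
# Minimal sketch of a daily risk-report check (hypothetical positions and thresholds).
# Flags any book whose cumulative mark-to-market loss over a trailing window exceeds
# a tolerance, so escalation does not depend on someone noticing the drift by eye.

from dataclasses import dataclass

LOOKBACK_DAYS = 10
LOSS_TOLERANCE = -5.0  # e.g. millions; illustrative threshold only

@dataclass
class Book:
    name: str
    daily_pnl: list  # recent daily profit-and-loss figures, most recent last

def needs_escalation(book: Book) -> bool:
    return sum(book.daily_pnl[-LOOKBACK_DAYS:]) < LOSS_TOLERANCE

books = [
    Book("subprime_mbs_desk", [-0.4, -0.8, -1.1, -0.3, -0.9, -0.7, -1.2, -0.5, -0.6, -0.4]),
    Book("investment_grade_desk", [0.2, -0.1, 0.3, 0.1, 0.0, 0.2, -0.2, 0.1, 0.3, 0.1]),
]

for b in books:
    if needs_escalation(b):
        loss = -sum(b.daily_pnl[-LOOKBACK_DAYS:])
        print(f"Escalate: {b.name} lost {loss:.1f} over the last {LOOKBACK_DAYS} days")
```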
Although more high-level attention might help those charged with receiving and sifting through raw intelligence, the job is still a tedious one – akin to looking for a needle in a haystack.
Conviction Dilemma
In addition to the possibility that risk-monitoring personnel are, as a group, less motivated, risk personnel also tend to be more introverted than their front-office colleagues. This can make them apprehensive about speaking up to their comparatively more aggressive colleagues, and it can make them come across as indecisive or speculative. Leaders often want a strong, definitive opinion (“hedge this risk!”) and may shun or ignore a more nuanced view coming from a more cautious analyst.
In short, the personalities hired into risk-management roles often suffer from what we will term the “conviction dilemma,” a notion that emanates from the work of Philip Tetlock and others who have studied predictive expertise. Tetlock’s research found that the experts whose opinions were most valued and sought out, for example pundits on TV shows like The McLaughlin Group, were those who voiced unequivocal opinions with utter conviction, yet who were often wrong.
In sum, even strong and motivated risk experts may be introverted and hesitant when expressing themselves. Serving in a function that management regards as subordinate, they may struggle to make convincing, resolute “do this!” arguments, and management may therefore be less likely to take them seriously and act on their warnings expeditiously.
Applying Technology
In the 21st century, we have learned to assign boring or laborious jobs to computers. We can identify potential attackers earlier by entering all the clues law enforcement receives into shared databases, and we have state-of-the-art data science tools built for analyzing this mass of information. This approach need not violate privacy: social media posts, calls from tipsters and prior arrests are all legitimately available to law enforcement today.
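As a rough illustration, the sketch below shows the kind of cross-source corroboration a shared tip database makes possible. The record fields, source names, and escalation threshold are hypothetical; a real system would add identity resolution, access controls, and audit trails.

```python
# Hypothetical sketch: pooling tips from multiple agencies and flagging any subject
# corroborated by several independent sources. Schema and threshold are invented for
# illustration; this is not any agency's actual system.

from collections import defaultdict

tips = [
    {"subject": "person_A", "source": "school_resource_officer", "type": "threat_statement"},
    {"subject": "person_A", "source": "fbi_tip_line", "type": "social_media_post"},
    {"subject": "person_A", "source": "local_pd", "type": "disturbance_call"},
    {"subject": "person_B", "source": "fbi_tip_line", "type": "social_media_post"},
]

CORROBORATION_THRESHOLD = 2  # independent sources before a case is escalated for review

sources_by_subject = defaultdict(set)
for tip in tips:
    sources_by_subject[tip["subject"]].add(tip["source"])

for subject, sources in sources_by_subject.items():
    if len(sources) >= CORROBORATION_THRESHOLD:
        print(f"Escalate {subject}: corroborated by {len(sources)} independent sources")
```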
Palantir is among the most prominent of companies offering software that enables intelligence agencies to find needles in the haystacks of raw data they receive. Unfortunately, Palantir is not an inexpensive solution, and may thus be beyond the budgets of smaller law enforcement agencies.
Governments and NGOs may wish to invest in the development of free, open-source data analysis tools. Aleph is one such open-source tool that can analyze large volumes of unstructured data. Although designed for investigative journalists, it could be customized for use by law enforcement or for counterparty tracking. Whether they use licensed or open-source solutions, law enforcement and intelligence agencies should establish and apply technical standards for data sharing. Because financial firms compete openly, sharing financial intelligence between rival firms may be less appropriate, but it can be practiced more widely within institutions.
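A minimal sketch of what such a data-sharing standard might involve: normalizing records from two agencies’ (hypothetical) in-house export formats into one common schema before they are pooled or loaded into an analysis tool. Every field name and format here is an assumption made for illustration, not an existing standard.

```python
# Hypothetical sketch of a shared-data standard: two agencies' export formats are
# normalized into one common schema so records can be pooled and analyzed together.
# All field names and formats are illustrative assumptions.

import json
from datetime import datetime, timezone

def normalize_pd_record(rec: dict) -> dict:
    """Map a (hypothetical) local police department export onto the common schema."""
    return {
        "subject": rec["suspect_name"],
        "reported_on": rec["call_date"],  # already ISO 8601 in this hypothetical export
        "category": rec["incident_code"],
        "reporting_agency": "local_pd",
    }

def normalize_tipline_record(rec: dict) -> dict:
    """Map a (hypothetical) national tip-line export onto the common schema."""
    reported = datetime.fromtimestamp(rec["epoch_seconds"], tz=timezone.utc).date()
    return {
        "subject": rec["name"],
        "reported_on": reported.isoformat(),
        "category": rec["tip_type"],
        "reporting_agency": "tip_line",
    }

pooled = [
    normalize_pd_record(
        {"suspect_name": "person_A", "call_date": "2018-01-05", "incident_code": "disturbance"}
    ),
    normalize_tipline_record(
        {"name": "person_A", "epoch_seconds": 1515110400, "tip_type": "threat"}
    ),
]
print(json.dumps(pooled, indent=2))
```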
Often the information needed to prevent mass killings is hiding in plain sight. By improving organizational structures and leveraging technology, financial firms and law enforcement agencies can harvest more actionable data from legally available information. Armed with this data, they can prevent some future acts of carnage. No single policy solution, VaR limits or gun control included, can guarantee success indefinitely. But we need to be thoughtful and dynamic in limiting the frequency, and even the magnitude, of these catastrophes, and we would do well to use our tools effectively in pursuit of a goal that includes not only making money and winning clients and awards, but also limiting the downside.
---------------------
This piece was co-written by Marc Joffe, who consults for PF2, and members of PF2’s staff. For more on this topic, see our 2016 piece on the detrimental impact of short-term thinking patterns on conduct within financial firms. Among other things, we recommend a rethinking of the design of incentive structures: “The approach we put forward here is the studious linking of profit-sharing to successful and honest risk-taking and business practices.”
Marc Joffe is a Senior Policy Analyst at the Reason Foundation and a researcher in the credit assessment field. He previously worked as a Senior Director at Moody’s Analytics.