22 January 2010

Intelligence-led Investigations: DecisionAdvantage...

Operational Risk incidents surround us on a global basis. The continuity of operations in the rescue and relief efforts in Haiti. The security of information and Internet politics with Google and 30+ other companies. A growing AQAP threat after Ft. Hood and NW 253, while converts to Islam flock from US prisons to Yemen to drink the "Shariah" Kool-Aid. The economic integrity of global banking, with new rule-sets and oversight on how banks are structured in order to mitigate systemic risk.

All of these Operational Risk Management (ORM) challenges require the same intelligence-led investigations to establish the ground truth and then to enable a "DecisionAdvantage."

"Whoever wishs to foresee the future must consult the past; for human events ever resemble those of preceding times. This arises from the fact that they are produced by men who ever have been, and ever shall be, animated by the same passions, and thus they necessarily have the same results." --Machiavelli

When does information that is collected become a violation of a person's privacy or legal rights? At the point it is collected from a source, or at the point in time when it is analyzed by a human?

Intelligence-led investigations include the use of automated Internet bots to trawl the Internet and Open Source content (OSINT) for the collectors to find what they are looking for. This begins with a hypothesis and then the development of an algorithm to carry out the automated mechanism for collection.
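
To make that mechanism concrete, here is a minimal sketch (in Python) of what a single automated collection pass might look like. The hypothesis keywords and the open-source feed URLs are hypothetical placeholders; this is an illustration of the idea, not any particular agency's or vendor's collector.

    # Minimal, illustrative sketch of an automated OSINT collection pass.
    # The keywords and URLs below are hypothetical placeholders, not real sources.
    import re
    import urllib.request

    HYPOTHESIS_KEYWORDS = ["denial of service", "data breach", "wire fraud"]
    OPEN_SOURCES = [
        "https://example.com/news/feed",    # placeholder URL
        "https://example.org/alerts/feed",  # placeholder URL
    ]

    def collect(url, keywords):
        """Fetch one open source and return snippets that match the hypothesis keywords."""
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                text = resp.read().decode("utf-8", errors="replace")
        except OSError:
            return []  # source unreachable; skip it rather than fail the whole run
        hits = []
        for kw in keywords:
            for match in re.finditer(re.escape(kw), text, re.IGNORECASE):
                start = max(match.start() - 60, 0)
                hits.append((kw, text[start:match.end() + 60]))
        return hits

    if __name__ == "__main__":
        for source in OPEN_SOURCES:
            for keyword, snippet in collect(source, HYPOTHESIS_KEYWORDS):
                print(f"{source} | {keyword} | {snippet!r}")

The collector does nothing more than encode the hypothesis as keywords and sweep the named sources; the analysis, context and relevance still belong to the human.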

These intelligence-led investigations also include the use of new forensically sound methods and proven procedures for the collection of digital data from a myriad of technology platforms, including laptops, PDAs and cell phones. These methods have been proven and certified in the forensic sciences for decades and follow many of the legally bound and court-tested rules associated with evidence collection, preservation and presentation. Digital forensic tools and 21st-century capabilities enable global enterprises, law enforcement and governments not only to discover what they are looking for but to use it in a court of law to find the truth.

The monitoring and collection of information associated with people immediately raises questions about the context, relevance and legality of storing it, analyzing it and deciding when to destroy it. The ability to do this effectively inside the walls of the global enterprise corporate headquarters, the Regional Fusion Center or the National Counterterrorism Center (NCTC) is at stake.

DecisionAdvantage is a term that carries connotations of competition, safety or defeating an adversary, but only one will apply as you begin to understand the environment and the circumstances under which the information is being used. If you are selecting the safest and most suitable drop points for water, food and medical triage supplies in Haiti, decisions are being made with information collected from satellites, from humans on the ground, and from the national geological scientists at CalTech. It isn't until you put all of these elements into context and establish relevancy with human brainpower that you can make an informed decision, one that gives you an advantage of improved safety and security in achieving your goal.

Investigators or analysts who leverage software, hardware and telecommunications infrastructure to arrive more efficiently at answers to the hard hypotheses or questions being asked must improve their training, education and awareness of the associated human factors. Predicting human behavior is difficult, if not impossible. What is more realistic is the use of automated systems to assist the human in trying to achieve a DecisionAdvantage. Proving ground truth is a challenge in a court of law, in front of a jury, and it is just as hard when it comes to attributing a cyber attack to another nation state. According to Jeffrey Carr and his Grey Goose Project, here is why:

When sensitive or classified data faces cyber attack, why can’t governments – or organisations – identify the culprits with any conclusivity?

A state cannot respond to concerted assaults by hackers with anything more potent than a diplomatic protest – which will be met with a firm denial by the accused government or body. There isn’t even agreement on what constitutes “cyber warfare”. As an expert in cyber warfare intelligence, I have researched the legal complexities and multiple strains of conflict, with the aim of trying to identify which acts qualify as cyber war.

What is undeniable is that politically-motivated attacks are becoming more frequent and sustained. Amazingly, none of the assaults on security shown (right), all of which have occurred in the last 18 months, qualify as an act of “cyber war”. The only issue that has been defined by international agreement is a nation’s right to self-defence when attacked, which, for the moment at least, applies only to the traditional manner of attack, ie, “armed” attack. From some adversaries’ point of view, this makes the internet an ideal battleground.

The eight events described opposite have all been characterized by various media sources as acts of cyber war. But definitive “attribution” – the smoking gun – was rarely achieved. The problem is that the internet was not built to be a secure platform. Its architecture inherently supports anonymity. As a result, a purely technical analysis of cyber attacks has almost never been successful at producing definitive proof, the cyber equivalent of DNA evidence.

For 18 months I and my colleagues in the Grey Goose Project have investigated Russian cyber attacks on Georgia in 2008, and we believe governments must adopt a new method of determining attribution, taking into account the policy of a state, regional events and intelligence. In addition, we apply the tried and trusted criminal investigation test of means, motive, and opportunity. I hope the attack on Google and its inevitable departure from China’s internet will trigger a broader awakening about the need to define what we call cyber warfare.


"History, by appraising...[the students] of the past, will enable them to judge of the future." --Thomas Jefferson

12 January 2010

Systems Engineering: Adaptive Processes...

The Operational Risks associated with the insider threats of fraud, terrorism, intellectual property theft and economic espionage are a moving target. That variation, deviation and migration from traditional methods of criminal activity has much to do with our systems orientation and our reliance on trusted information. Until you miss one step in a process or misspell someone's name.

Systems Engineering as a discipline has its roots in understanding the business problem before designing a remedy or tool to solve the issue at hand. Whether the engineering is business-oriented or software-focused, the combined "Convergent Engineering" has the goal of being adaptive, flexible and on a trajectory toward an integrated discipline.

Adaptive systems have the opportunity to assist in the mitigation of risks, yet software information systems continue to plague us because they are still not being developed in concert with changing business processes. This operational risk has existed since the emergence of computers. The solution to this problem, the "Holy Grail," is to engineer the business or government and its supporting software as a single, integrated system. Convergent engineering involves modeling and designing the business directly in software. This has been advocated and written about since the 1990s by David A. Taylor in "Business Engineering with Object Technology" and by others advocating concurrent engineering.
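
As a minimal sketch of what "modeling the business directly in software" can mean, consider the following Python fragment. The account, transfer and transfer-policy objects and the review threshold are invented for illustration; the point is that the business rule and the software object are one artifact, so adapting the rule is the same act as adapting the system.

    # Illustrative sketch of convergent engineering: the business policy is modeled
    # directly as a software object. Classes and the threshold are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Account:
        owner: str
        balance: float = 0.0
        flagged: bool = False

    @dataclass
    class TransferPolicy:
        """The business rule, expressed directly in the software model."""
        single_transfer_limit: float = 10_000.0  # adaptive: changing this changes the system

        def requires_review(self, amount: float) -> bool:
            return amount > self.single_transfer_limit

    @dataclass
    class Transfer:
        source: Account
        destination: Account
        amount: float

        def execute(self, policy: TransferPolicy) -> None:
            if policy.requires_review(self.amount):
                self.source.flagged = True  # escalate to a human instead of failing silently
                return
            self.source.balance -= self.amount
            self.destination.balance += self.amount

There is no separate requirements document for the limit; the policy object is both the statement of the rule and its implementation.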

The failure of processes during our Global War on Terror is an operational risk that all too often goes undetected in audits, testing and scenario exercises. The Washington Post highlights the breakdown in the Christmas Day 2009 "Under Pants" bombing attempt on NW 253:

Back in November, it was a day or two after the initial Visas Viper report was received at the National Counterterrorism Center (NCTC) before analysts there realized the correct spelling of Abdulmutallab's name, based on data from other agencies. With the error corrected, he was listed, along with about 400,000 others, on the Terrorist Identities Datamart Environment (TIDE). That is a list of people, along with relevant information about them, who are suspected of, or known to be associated with, terrorist activities outside the United States.

At that time, NCTC analysts who worked on TIDE entries processed only nominations from the State Department, the CIA and other collection agencies. They checked the TIDE list to see if a name was on it, but they did not search other databases for more information. The NCTC also determined what further action, if any, was necessary, such as moving a person's name to the next level, the FBI's Terrorist Screening Center.

Meanwhile, back at the U.S. Embassy in Nigeria, State Department officials -- "out of curiosity" -- did check to see whether Abdulmutallab had a visa for entry into the United States, according to a department official who spoke on the condition of anonymity because the matter is under investigation. But because the misspelled name was used, the fact that Abdulmutallab had a multiple-entry, two-year tourist visa obtained in June 2008 was not sent to the NCTC or to other intelligence agencies.

As Crowley put it last week, "The initial search to determine if there was a visa did not -- one did not show, expressly because of this misspelling."

"This is a critical lesson learned," Crowley said. "The steps that we've put in the process beginning immediately after December 25 will, in fact, make sure that future reports do have visa information in them, so that this is . . . inserted into the process right from the outset."


The process is now adapting to the exposure of a vulnerability that could be exploited by an attacker against the system as it was designed. Could the same be said for the unfortunate incident at FOB Chapman in Afghanistan five days later? This breakdown, again reported by the Washington Post, brings the "Process Failure" point into focus:


Those at the scene on Dec. 30 had been trying to strike a balance between respect for their informant -- best demonstrated, in the regional tradition, by direct personal contact -- and caution, illustrated by the attentiveness of the security guards, according to CIA officials.

But more than a dozen current and former government officials interviewed for this article said they could not account in full for what they called a breach of operational security at the base in Afghanistan's Khost province. Advance pat-downs and other precautions are common in an age of suicide bombers, and meetings are kept small and remote. None of these sources would agree to be identified by name, in many cases because of their former or current work as covert operatives.


The continuous diligence in the discipline of Operational Risk Management calls for "All Threats & All Hazards" vigilance. However, in both of the previously mentioned cases, all of the attention to process and protocols would not have overcome the larger factor of human psychology and human emotions. These human factors will continue to be the systems engineer's worst nightmare and the single vulnerability that will never be totally mitigated.

Whether the signs and red flags are missed in government or in the private sector, our workplaces, institutions and livelihoods are at stake. ABB, a Swiss global infrastructure company, is dealing with a workplace violence incident in St. Louis, MO, USA and is now asking itself "Who Knew What When":

The man widely identified as the gunman in a fatal shooting spree at a St. Louis industrial plant was described as an amicable family man and good neighbor, who would rake an elder's leaves and bring him holiday treats.

But 51-year-old Timothy Hendron of Webster Groves, a St. Louis suburb, was unhappy at work, according to those who knew him even casually, and embroiled in a pension dispute with his company that was being litigated this week in U.S. District Court in Kansas City.

Police said the gunman showed up at ABB Group's plant in north St. Louis around 6:30 a.m. Thursday and opened fire, killing three people and wounding five before apparently killing himself. Frightened co-workers scrambled into closets and to the snow-covered roof for safety.

He was found dead inside the plant from an apparent self-inflicted gunshot wound.


Systems engineering for business or government must continue to explore the human factors. Adaptive processes, and software that has been designed with "Adaptive" abilities, will continue to challenge even the smartest and most capable Operational Risk Managers for years to come.

05 January 2010

Deja Vu: Operational Risk in Decade Past...

The WWW is dynamic, and the operational risks you take while navigating its vast depth and breadth are part of the process. Who or what should you trust? As an example, at this very moment, when you search Google for Operational Risk Management it returns this blog as the #1 link. Perhaps that is how you arrived here at this blog on Operational Risk.

You trusted Google that when you clicked on the link you would find relevant information on your desired topic. Or perhaps you navigated to this site devoted to Operational Risk Management because one of the almost 1,000 postings since 2003 covered your question, topic or issue. In both cases, the information returned may have relevancy, but only after careful examination of the words, concepts, ideas and arguments do you make the decision on whether to "Bookmark" this site.

And for the many who have bookmarked us or added us to your RSS feed, we know who you are. Our mutual quest for the relevancy of "Operational Risk Management" in the current world we live in will continue. With each new incident, accident or breach, our purpose is further defined and more extensively documented.

As we enter 2010 and the next decade, we promise to provide the content you require and the relevancy to your role in the profession. Let's go back in time for a minute and see whether any of our previous posts over the past seven years still have a point today:

28 October 2003

More banks hit by email fraud


U.S. Issues Saudi Alert Saying Terrorists Targeting Airlines


24 February 2004


Greenspan: Curb Fannie, Freddie Growth


24 June 2005

Negative Stock Price Reaction to Announcements of Operational Loss Events...


31 December 2006

Remember His Name: The Long War Ahead...


24 May 2007

Hedge Funds: Crystal Ball on Regulation...


11 October 2007

Fear: The Elements of Prediction...


31 March 2008

Volatility: Enemy #1...


08 May 2008

Legal Ecosystem: Survival of the Fittest...


22 September 2008

Decision Advantage: OPS Risk Intel...


25 April 2009

Human Factors: Early-Warning System...


17 August 2009

Business Resilience: Beyond Readiness...


Are you having a deja vu moment? A flashback to the future. Why is it that "lessons learned" are continuously ignored? Forgotten. Lost. History, and knowledge of that history, can save you. Some use log analysis of their precious computing resources, firewalls and IDS/IPS systems to learn from the past. Others don't remember the last time they fell down the stairs, slipped on the ice or banged their head. Even those individuals who have been on the other side of the desk when the "Boss" is making their position "Extremely Clear" about their performance measures are subject to having a deja vu moment.
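
For the log-analysis case, a minimal sketch, assuming a hypothetical firewall log whose lines carry a timestamp, an action and a source IP address, might look like this:

    # Illustrative sketch of learning from the past via log analysis.
    # The log path and the line format (timestamp action source_ip ...) are hypothetical.
    from collections import Counter

    def top_blocked_sources(log_path: str, top_n: int = 10):
        """Count repeat offenders so yesterday's lesson can inform today's rules."""
        counts = Counter()
        with open(log_path, encoding="utf-8") as log:
            for line in log:
                fields = line.split()
                if len(fields) >= 3 and fields[1] == "DENY":
                    counts[fields[2]] += 1  # fields[2] assumed to be the source IP
        return counts.most_common(top_n)

    if __name__ == "__main__":
        for ip, hits in top_blocked_sources("firewall.log"):
            print(f"{ip}: {hits} blocked attempts")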

Operational Risk is a daily and continuous 24x7x365 process. A way of life. Not an event or a meeting at the end of the quarter. Each person and stakeholder at your organization or institution is responsible for it and should live each day embracing it. We like to say, Operational Risk Management saves lives, protects corporate assets and enables global enterprise business resilience. That's something everyone can remember, practice and strive for every waking moment and in every situation.

What do you think?