27 February 2016

RSA 2016: Ascending into a Trust Mindset...

Building awareness of a vulnerability potentially heightens one's sensitivity to defend, or to build resilience that minimizes damage or loss.  This is one of the foundations of Operational Risk Management (ORM): understanding what your assets are and what vulnerabilities exist.  Good old-fashioned Risk Management 101 tells us to mitigate risks in the enterprise and even in our personal lives.

Is traditional Risk Management dead?  We think it is, and through the eyes and inspiration of others we can now see why.  Our ability to make "Trust Decisions" is far more complex than just an emotion.  As we have evolved away from small villages where food, water and other life-essential resources were shared, trust factors have become more distant, more shallow and less personal.  Our digital lives, spanning continents and countries at light speed, now give us a new perspective.  We must find our Trust Mindset.

As the RSA Conference opens on February 29, 2016 in San Francisco, thousands of eager professionals will converge on an event that has its foundation and its future built on "Achieving Digital Trust".  As we walk the Moscone Exhibition Halls observing, learning, and engaging in dialogue or debate, we must remind ourselves of the wisdom that comes from Jeffrey Ritter:

I have always viewed the emergence of the Internet and global computing as powerful tools to increase the velocity of the next solutions that enabled greater inter-dependence, greater accessibility to commerce, and more small steps toward peace. Through my work, however, I learned those tools were vulnerable unless, as a global society, we determine how to also build across the digital dimensions of cyberspace the capacity for humans to achieve what each transaction first requires—an affirmative decision to trust.

Jeffrey's latest book has been an inspiration for so many who have researched and lived in the Venn diagram of law, digital technology and eCommerce.  Yet what about the people who have studied and modeled the human skills and behaviors for building trust with others, but have yet to read Jeffrey's book?  What is the fusion between the factors associated with building trust human-to-human and in a world of machine-to-machine?

"Trust Decisions" are being made by humans and computers each second of each day.  And one thing is certain about the decisions to trust by people and by the machine in your pocket, brief case or purse.  It is continuously learning and sharing.

The halls of the RSA Conference will be buzzing about trust.  In all of its manifestations, the ecosystem of the event is about "Trust Decisions", and in many cases, man and machine.  The iPhone vs. the FBI.  Security vs. Privacy.  Cloud vs. Hybrid Cloud.  Secret Clearance or Top Secret Clearance.  Pre-hire background check.  FICO.  LinkedIn profile.  You name it, and the fundamental question set comes back to a "decision to trust."

Living an ethical life of integrity and willingness to share begins at an early age.  Sharing information responsibly with your peers, director or commander requires a process for building trust over time, with each transaction of information exchange either building or eroding the future decision to trust.

Here is one recent example.  Sitting in a room with a dozen strangers the other day was a mini case study.  The purpose of this particular meeting was for this group of people to establish a forum for future trusted information exchange.  We were all part of the same ISAO, if you will, not the same company.

The agenda called for each person to introduce themselves, all for the first time.  The specific rules for the introductions were not spelled out by the host or agreed to by the meeting participants.  What happened next is a classic example of trust erosion when the rules are absent.  As we proceeded around the room, each person took it upon themselves to determine how much or how little information they would share with the rest of the group.

Some people introduced themselves with their name, company affiliation and a "one liner" on the business they were in.  Others, in addition, took the opportunity to tell us all about their entire product/service line and why their solution was something we should be interested in.  The first impressions were already building or eroding our perceptions of trust.  Our own reality.

It should be our ambition to continuously heighten our sensitivity to behavior in an environment absent of rules, and to how this builds or erodes our future trust decisions.  When you share, do you always have an expectation of reciprocity?  When you boast about yourself or your organization, is it for your own ego or self-satisfaction?  Do you ever even ask the question, "How are you?" or "How can I help you?"  What are the rules?

Extraordinary trust is rare these days.  True leadership is scarce.  Courage is almost extinct.  Think about how you can stand out and at the same moment project a feeling of care, of concern and of generosity.  Giving without any expectation of return is what is going to help you build trust in your life.  And when you achieve that with your wife, husband, children, church, business partners, employees, clients and suppliers, then you know you are well on your way to substantial well-being.

If you are alone and without many true and deep relationships in your life outside of cyberspace, there is a good reason why.  Achieving and building trust inside your organization (company or family) has been written about for years.  Happy employees make happy customers.  You have heard this before, no doubt.  Building awareness of a vulnerability potentially heightens one's sensitivity to defend, or to build resilience that minimizes damage or loss.  This is where we started this blog post.

As we descend on the RSA Conference with the focus on "Trust Decisions", it will be with an ascent towards a continuous mindset of sharing, of caring and of learning.

20 February 2016

Predictive Intelligence: Data or Precogs...

The term "Predictive Intelligence" has been around for a few years in the Operational Risk Management (ORM) community.  Born from the marketing collateral of the Business Intelligence (BI) vendors, it essentially requires hundreds of gigabytes or even terabytes of historical data, which are then analyzed or data-mined for so-called insight.  The question is, why is this "Predictive Intelligence" and not just more "Information" in a different context?

Now introduce the nexus of our own "Trust Decisions" and the "Human Factors" associated with the science of cognitive decision making.  How do we as humans make our decisions to trust vs. how computers make their decisions to trust?  Are they not executing rules written by humans?  When is it information in a different format as opposed to true intelligence?
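
On the machine side, the answer is largely yes.  A minimal sketch (in Python, with hypothetical rule names, weights and a threshold chosen purely for illustration) shows what a computer's "decision to trust" often reduces to: executing rules a human wrote in advance.

  # A minimal sketch of a machine "trust decision": the computer only
  # executes rules a human wrote ahead of time.  The rule names, weights
  # and threshold below are hypothetical, chosen for illustration only.
  RULES = [
      ("certificate_valid",    lambda ctx: ctx["cert_valid"],                           40),
      ("known_device",         lambda ctx: ctx["device_id"] in ctx["known_devices"],    30),
      ("expected_geolocation", lambda ctx: ctx["country"] in ctx["expected_countries"], 20),
      ("few_failed_logins",    lambda ctx: ctx["failed_logins"] < 3,                    10),
  ]

  TRUST_THRESHOLD = 70  # cut-off chosen, again, by a human

  def trust_decision(ctx):
      """Score a transaction context against the human-written rules."""
      score = sum(weight for _, check, weight in RULES if check(ctx))
      return score >= TRUST_THRESHOLD, score

  ctx = {
      "cert_valid": True,
      "device_id": "laptop-042",
      "known_devices": {"laptop-042", "phone-007"},
      "country": "US",
      "expected_countries": {"US", "CA"},
      "failed_logins": 1,
  }
  print(trust_decision(ctx))  # (True, 100)

The machine is not weighing an emotion; it is tallying whatever the humans behind it decided was worth measuring.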

Christian Bonilla may be on to something here:
"Professionals in the foreign intelligence community take pains to distinguish between information and bona fide intelligence. Any piece of knowledge, no matter how trivial or irrelevant, is information. Intelligence, by contrast, is the subset of information valued for its relevance rather than simply its level of detail. That distinction is often lost in sector of the enterprise technology industry that is somewhat loosely referred to as Business Intelligence, or BI. This has become a bit of a catchall term for many different software applications and platforms that have widely different intended uses. I would argue that many BI tools that aggregate and organize a company’s information, such as transaction history or customer lists, more often provide information than intelligence. The lexicon is what it is, but calling something “intelligence” does not give it any more value. In order to sustainably outperform the competition, a company needs more than a meticulously organized and well-structured view of its history. Decision makers at all levels need a boost when making decisions amidst uncertainty and where many variables are exerting influence. They need what I would call predictive intelligence, or PI – the ability to narrow down the relevant variables for analysis and accurately measure their impact on the probability of a range of outcomes."
What does the fusion of human factors have to do with predictive intelligence?  That depends on how much you value the kind of innuendo and messages in the Tom Cruise movie, Minority Report.  Many aspects of the original Philip K. Dick story were adapted in its transition to film, which was shot in Washington, DC and Northern Virginia.  Is it possible to predict someone's future behavior even before they commit a crime or become violent?
The film is set in the year 2054, where "Precrime", a specialized police department, apprehends criminals based on foreknowledge provided by three psychics called "precogs".
Cruise plays the role of John Anderton, who is part of the experimental police force known as "Precrime."  These notions of clairvoyance and precognition have many skeptics, as does their use for predicting future events; a related term, presentiment, refers to information about future events that is said to be perceived as emotion.
Regardless of terms, beliefs or whether the software analytics are using historical data, the science of "Predictive Intelligence" is about forecasting the future.  Given recent global events, where forecasts built on historical data missed the economic implosion, maybe it is time to start introducing more human factors into the equation.

People who have gone on record to predict a future historical event will probably be right at some point in time.  How long will you be around to wait?  The demise of the banking sector and the extinction of Lehman Brothers, Bear Stearns and maybe even AIG were most likely predicted by someone, somewhere, in the 2007-2008 time frame.  The point is that you have to have context and relevance to the problem being solved or the question being asked.
The real story of the crash began in bizarre feeder markets where the sun doesn't shine and the SEC doesn't dare, or bother, to tread: the bond and real estate derivative markets where geeks invent impenetrable securities to profit from the misery of lower- and middle-class Americans who can't pay their debts. The smart people who understood what was or might be happening were paralyzed by hope and fear; in any case, they weren't talking.
Predictive analytics extracts relevant information from data and attempts to forecast the future.  It relies on capturing the relationships between explanatory variables and the predicted variables from past occurrences, and exploiting them to predict future outcomes.  Is it possible that there was, and is, too much reliance on the numbers and not enough on people's cognitive intuition?
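
That definition translates directly into code.  The following minimal sketch (Python with scikit-learn, synthetic history and hypothetical variable names) captures the relationship between explanatory variables and a predicted variable from past occurrences, then exploits it to estimate the probability of a future outcome:

  # A minimal sketch of predictive analytics, assuming scikit-learn is
  # installed.  The history below is synthetic and the variable names
  # (leverage, delinquency) are hypothetical, for illustration only.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  # Past occurrences: explanatory variables and the predicted variable
  # (1 = the adverse outcome occurred, 0 = it did not).
  X_history = np.array([[30, 0.02], [25, 0.01], [40, 0.09],
                        [45, 0.12], [33, 0.05], [50, 0.15]])
  y_history = np.array([0, 0, 1, 1, 0, 1])

  # Capture the relationship between the variables from past data...
  model = LogisticRegression().fit(X_history, y_history)

  # ...and exploit it to estimate the probability of a future outcome.
  new_case = np.array([[42, 0.10]])
  probability = model.predict_proba(new_case)[0, 1]
  print(f"Estimated probability of the adverse outcome: {probability:.2f}")

  # The fitted coefficients indicate which variables exert the most
  # influence: the "narrowing down" that separates intelligence from
  # plain information.
  print(dict(zip(["leverage", "delinquency"], model.coef_[0])))

The point of the sketch is not the mathematics; it is that the forecast is only as good as the history and the variables a human decided to feed it.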

This blog has documented the "11 Elements of Prediction" in the past.  Now it is time to utilize the combination of these human factors in close collaboration with the data analytics and raw numbers.  Effective execution of both will provide corporate management the situational awareness they seek, within the timeline they wish.

The future state of Predictive Intelligence will combine the science of "Trust Decisions" with the art of "Data Analytics" to achieve our desired outcomes.

14 February 2016

Workplace Violence: Cues and Clues to Teach...

Operational Risk Management (ORM) is your foundation for crisis leadership. It will also prepare the enterprise for the potential for Homegrown Violent Extremism (HVE).  Is there a nexus between the cues and clues of traditional workplace violence and domestic terrorism? A domestic terrorist differs from a homegrown violent extremist in that the former is not inspired by, and does not take direction from, a foreign terrorist group or other foreign power.

All work locations have distinct categories of threats that are relevant to the site, the people and the type of business. Assessing these violence factors was the work of retired Senior FBI profiler Mary Ellen O'Toole, whose study, "The School Shooter: A Threat Assessment Perspective," describes four categories of threat:
  1. A Direct Threat
  2. An Indirect Threat
  3. A Veiled Threat
  4. A Conditional Threat
Employees must be trained to be aware of the warning signals that typically occur before a threat and violent act becomes operational. Based on the O'Toole study, these are some of the 23 "Red Flags" that employers should be monitoring, keeping their Corporate Threat Assessment Teams on high alert:
  • Low tolerance for frustration
  • Poor coping skills
  • Failed relationships
  • Signs of depression
  • Exaggerated sense of entitlement
  • Attitude of superiority
  • Inappropriate humor
  • Seeks to manipulate others
  • Lack of trust/paranoia
  • Access to weapons
  • Abuse of drugs and alcohol
Source: O'Toole, Mary Ellen, "The School Shooter: A Threat Assessment Perspective," Critical Incident Response Group (CIRG), National Center for the Analysis of Violent Crime (NCAVC), FBI Academy.
The court and the jury will look upon your employer's ability to apply the basics of workplace violence and threat assessment. What did you know? When did you know it? What have you done about it? They will judge you on the threat assessment's utilization of insider threat intelligence, combined with the evidence of your overt training of employees in the workplace. What grade would you give your company today for these fundamentals?

Let's take it to the next step in terms of your ability to even meet the requirements of the Occupational Safety and Health Administration (OSHA) in the United States. Awareness programs are expected for the four primary types of workplace crime:
  1. Those crimes committed by people not connected to the workplace.
  2. Aggression by third parties including customers, clients, patients, students, or any others for whom you provide a service or product.
  3. Employee-to-Employee violence or a former employee who returns to the workplace with the intention to injure a former supervisor.
  4. Aggression related to a personal relationship inside or outside the workplace.
The organization that understands the foundation for creating a proactive and preventive team for incidents in the workplace should not stop there. Once you have developed the framework for Incident Command, Emergency Operations Center, Shelter in Place, Medical Triage and Evacuation, you have a good baseline to extend to a complete "Continuity of Intelligence Operations" strategy. This requires a deeper analysis of the threats inside your organization that may put you out of business entirely:
The ISIS assault on Paris and the ISIS-inspired massacre in San Bernardino, California, share a disturbing fact: no one saw them coming. Today, the biggest terrorist threat to the United States is not like al Qaeda. ISIS is wealthy, agile, sophisticated online, and operates freely in a vast territory of its own. It prefers to be called the Islamic State. The U.S. government calls it ISIL. Reporters tend to call it ISIS for the Islamic State in Iraq and Syria. But whatever the name, it has the manpower, means and ruthlessness to attack the U.S. The man who is supposed to stop that attack is John Brennan, the director of the CIA. And tonight, in a rare interview, we talk to Brennan about a world of trouble and we start with the most pressing danger.
Once the organization has adopted the "All Threats - All Hazards" intelligence mentality, it is well on its way to becoming a survivable business.  Operational Risk Management (ORM) is a discipline that incorporates this approach and equips owners, operators and business suppliers with the tools, methods and strategy to handle workplace violence incidents or a catastrophic act of Mother Nature.

07 February 2016

Trusted Enterprise: Digital Science in Business...

Digital Trust has been a cornerstone for any serious organization in our 21st-century era.  The foundation for an Operational Risk Management (ORM) design begins with the engineering science of a sound and enduring platform for "Enabling Digital Trust of Global Enterprises."
The Accenture Technology Vision 2016 identifies "Digital Trust" as one of five major trends:
As every digital advancement creates a new vector for risk, trust becomes the cornerstone of the digital economy. Without trust, digital businesses cannot use and share the data that underpins their operations. To gain the trust of individuals, ecosystems, and regulators in the digital economy, businesses must possess strong security and ethics at each stage of the customer journey. And new products and services must be ethical- and secure-by-design. Businesses that get this right will enjoy such high levels of trust that their customers will look to them as guides for the digital future.  Source:  Accenture Technology Vision 2016
The concept of data ethics as a significant component of establishing "Digital Trust" is vital.  When you introduce the concept of ethics to the dialogue on software engineering in the global enterprise, there are several key considerations.  Adding the moral governance of actions taken as a result of insights derived from the analysis of information is also a valid vector in the design of trustworthiness for modern digital applications.  Yet this means nothing without first understanding how humans make their decisions to trust.  How effective the entire ecosystem of "Digital Trust" becomes will always come back to the root.  Digital ground zero.

Ground zero for "Digital Trust" is the actual "Trust Decision" itself.  The science of the "Trust Decision" elements and process has been the focus of researchers and academic study for years.  In order for us to truly understand how to achieve digital trust in business, we must first grasp the science and evidence of the core elements and root of our "TrustDecisions."  Does "Achieving Digital Trust" in the enterprise ensure that, as a business you are "Achieving a Defensible Standard of Care"?   Not necessarily.

The two concepts are distinct, yet they still have an affinity for each other.  Accenture's Technology Vision provides the enterprise with sound reasoning about how to create a path towards improving digital trust, especially as it pertains to the reputation benefits associated with the "Brand."  Adding the element of ethics drives the consumer's thinking that the business has addressed privacy requirements in terms of the legal rules and usability factors.

Bringing the conversation about data ethics (collection and use) into the Board Room, or explaining how the enterprise must design in legal controls in order to alleviate liability, requires something new.  It requires all interested parties to go back to the root.  How does a human make a decision to trust?  How does a computer make a decision to trust another computer?
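
One concrete, everyday answer to that last question is certificate verification: one machine trusts another only if it presents a certificate that chains back to an authority the first machine already trusts, and the name on the certificate matches.  A minimal sketch, using Python's standard ssl library and a placeholder hostname:

  # A minimal sketch of a machine-to-machine trust decision, using
  # Python's standard library (3.7+).  The hostname is a placeholder.
  import socket
  import ssl

  HOST = "example.com"  # placeholder endpoint

  # The "rules" of this trust decision: verify the certificate chain
  # against the local store of trusted root authorities and confirm
  # that the certificate matches the hostname.
  context = ssl.create_default_context()

  try:
      with socket.create_connection((HOST, 443), timeout=5) as sock:
          with context.wrap_socket(sock, server_hostname=HOST) as tls:
              cert = tls.getpeercert()
              print("Trust decision: ACCEPT", cert.get("subject"))
  except ssl.SSLCertVerificationError as err:
      # The chain failed to verify, so the machine declines to trust.
      print("Trust decision: REJECT", err)

The machine never "feels" trust; it executes a verification procedure that people designed, which is exactly why someone in the enterprise has to own that design.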

The people sitting around the Board Room table are thinking about creating more wealth.  They are not asking themselves how computers trust other computers.  In our digital age, where decisions are being made as a result of the execution of zeros and ones at light speed, someone has to be designing the trust architecture with the right people in the enterprise.  The question is now at hand: who is that person or business unit?

The answer is going to be different in each business or organization.  What is the maturity of the particular digital ecosystem, and how vast is the landscape of computing assets?  One fact must be acknowledged early on: such a team probably does not entirely exist today.  The ideal unit of people and systems necessary to achieve digital trust is currently spread out across the typical silos of a business architecture: IT, Marketing, Legal, Information Security, Privacy perhaps.  However, the dedicated and funded "Digital Trust" team, task force or department has yet to be established.  So what?

Continue to operate as you are, without the advantage of truly understanding the elements of "Trust Decisions" and how they are relevant to "Achieving a Defensible Standard of Care."  A trustworthy computing division may have existed in the past at your organization, yet initially with another focused mission: "Cyber Crime" intervention.  You see, the idea of trust, and why it is so vital to the success of the information technology industry, is not new.  Smart malware researchers and software engineers understood this at the dawn of the Internet.  So why is this any different?

Trustworthy computing in the 90's is not the same as the application of "Trust Decisions" in the year 2016 and beyond, especially today, with the speed of cloud computing adoption and the outsourcing of core data transactions across borders.  The international implications of privacy laws and the routing and storing of data outside of your native country are now in play.  Negotiations by a Nation State to bypass traditional use of mutual legal assistance treaties (MLATs) are the new normal:
If U.S. and British negotiators have their way, MI5, the British domestic security service, could one day go directly to American companies such as Facebook or Google with a wiretap order for the online chats of British suspects in a counterterrorism investigation.

The transatlantic allies have quietly begun negotiations this month on an agreement that would enable the British government to serve wiretap orders directly on U.S. communication firms for live intercepts in criminal and national security investigations involving its own citizens. Britain would also be able to serve orders to obtain stored data, such as emails.  Source:  Washington Post
The requirements have changed.  The next era of "Achieving Digital Trust" requires so much more.  It now requires standing up, and providing substantial resources to, the "TrustDecisions" Unit within the enterprise.  What does this mean for the future of the Trusted Enterprise?

It means that the Chief Information Officer (CIO), Chief Privacy Officer (CPO), General Counsel and Chief Information Security Officer (CISO) will be using data and Digital Science to design a new architecture for the Trusted Enterprise.  They will deliver it to the desk of the Chief Executive Officer (CEO) very soon.