28 May 2016

Memorial Day 2016: The Risk of Service is Understood...

It is Memorial Day weekend in the U.S., and on this final Monday of May 2016 we reflect on this remembrance.

In order to put it all in context, we looked back 36 months to our 2013 blog post here.  It was only a few weeks since a colleague from Team Rubicon had ended his battle at home, after several tours of duty with AFSOC.  Neil had joined the ranks of those fallen heroes who survive deployment tagging and tracking the enemy in the Hindu Kush.  He was also one of the 22 that day in early May who could not defeat the legacy of demons he fought each night, as he fell deep asleep.

On Memorial Day 2016, we again honor Neil in Section 60 at Arlington National Cemetery, along with all the other military members who have sacrificed and defended our freedoms for 239 years. Simultaneously, we do the same for the people behind the "Stars" on a wall in Langley, VA, the officers who have done the same.

Together we are on the front lines or inside the wire at the FOB.  Whether you are in Tampa, FL, Stuttgart, Germany or Arlington, VA.  Whether you are on your beat cruising the streets of a major U.S. metro city.  Whether you are watching a monitor at IAD, LAX or DFW.  Whether you are deep in analysis of Internet malware metadata or reviewing the latest GEOINT from a UAS.  We are all the same, in that we share the mission that gets each one of us out of bed each day: our country's "Operational Risk Management (ORM)."

The Operational Risk Management mission of the U.S. Homeland is vast and encompasses a spectrum of activity, both passive and kinetic.  Digital and physical.  It requires manpower and resources far beyond what many developed countries of the world could, to this day, comprehend.  There are only a few places across the globe where a normal citizen would say that the mission and the capital expenditures are worth every dollar and every drop of blood.

Memorial Day in the United States is exactly this:
Memorial Day is a United States federal holiday which occurs every year on the final Monday of May.[1] Memorial Day is a day of remembering the men and women who died while serving in the United States Armed Forces.[2] Formerly known as Decoration Day, it originated after the American Civil War to commemorate the Union and Confederate soldiers who died in the Civil War. By the 20th century, Memorial Day had been extended to honor all Americans who have died while in the military service[3].
So this weekend, as we walk among the headstones and reflect on our colleagues who gave their service and their own lives, we will stand proud.  We understand the risks.  We know why we serve.  In the spotlight or in the shadows.  The tradition and the mission continue...

21 May 2016

Social Engineering: CxO Leadership for BEC...

In the context of cyber security, many expert practitioners are already familiar with the "Business E-Mail Compromise" (BEC).  Operational Risk Management (ORM) professionals know this:
"Amateurs attack machines, Professionals attack people"

The BEC is a global scam with subjects and victims in many countries. The IC3 has received BEC complaint data from victims in every U.S. state and 45 countries. From 10/01/2013 to 12/01/2014, the following statistics are reported:

  • Total U.S. victims: 1198
  • Total U.S. dollar loss: $179,755,367.08
  • Total non-U.S. victims: 928
  • Total non-U.S. dollar loss: $35,217,136.22
  • Combined victims: 2126
  • Combined dollar loss: $214,972,503.30
The FBI assesses with high confidence the number of victims and the total dollar loss will continue to increase.
What executives at most organizations understand is that they are a potential target for all kinds of threats from inside and outside the company.  Fortune 500 companies already have sophisticated internal accounting controls and "Personal Protection Specialists" who are doing advance work for travel that the CxO takes across town or overseas.  Yet what about the Small-to-Medium Enterprise with just tens of millions of dollars in annual revenues?  Are they as prepared as they could be for the BEC?

It does not take much for financial controls and the accounts payable process to break down at companies and organizations that have not prepared for this continuous threat, which often succeeds with the unwitting cooperation of their own insiders (employees, partners, suppliers).  The numbers tell the whole story.  Countless times each year, companies are convinced to act upon a simple e-mail crafted by clever "Social Engineering" experts and to transfer money out of their corporate banking accounts.

So what are you doing to prepare, educate and deter this continuous wave of "Social Engineering" attacking your employees and key stakeholders?  How many computers and iPhones in your business or organization receive e-mail on a daily basis?  Each one of these is a threat vector, along with each one of your employees who is the human factor behind the device.

What is amazing today is that a cyber threat like this, one that has been talked about for over a year, is still growing.  Perhaps it is a leadership problem.  Perhaps it is a public service announcement campaign problem.  In either case, you have to realize there are some very specific remedies your organization can exercise to deter, detect and defend itself from "Business E-mail Compromise" (BEC).

Executives and senior staff are busy.  They are running the business and rarely have time for that two-hour or half-day training session.  This is your organization's largest vulnerability to begin with.  An apathetic CEO or senior staff is the perfect target for any transnational organized crime (TOC) syndicate on the other side of the globe.

As a CxO, when was the last time you had a campaign within the organization to address these threats?  Weeks, Months, Years?  Why haven't you incorporated a continuous program to keep your employees and staff up to date?  If you have 1247 employees, then you have 1247 vulnerabilities walking around in your enterprise.

When you look at the line item in the Information Technology budget this year for hardware, software, maintenance and cloud computing, look a little further.  Where is the line item for the education program and the tactical awareness to keep your people on the leading edge of deterring the wave of social engineering attacks against your organization?
There has been a lot of news in 2016 about a particular species of phish, the so-called Business Email Compromise (BEC). In this scenario, the attacker poses as an executive of a company, asking someone--usually a subordinate employee--to perform a wire transfer or similar action. When the employee complies and completes the transfer, the company realizes--too late--that it has just given a large payment to a criminal. An investment company in Troy, Michigan, recently lost $495,000 from a BEC phish, so this is not a small matter.

It even hit close to my (professional) home: DomainTools’ CFO recently received a spear phish purporting to come from our CEO, asking her to make a wire transfer of funds. The sending email address was a clever look-alike of “domaintools.com,” using some substituted characters. Fortunately our CFO is very savvy and knew right away that her boss wouldn’t actually make such a request in that way. But it underscores how common this kind of BEC phish is -- and how easy it is for criminals to spoof legitimate emails.
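As a rough illustration of the look-alike domain technique described above, here is a minimal sketch of the kind of check a mail gateway or finance team could run before honoring a payment request. The trusted domain list and the similarity threshold are assumptions for the example, not a description of any vendor's product:

```python
# Minimal sketch: flag sender domains that are near misses of trusted domains.
# The trusted domain list and the threshold are illustrative assumptions.
from difflib import SequenceMatcher

TRUSTED_DOMAINS = {"domaintools.com", "example-corp.com"}  # hypothetical allow-list

def sender_domain(address: str) -> str:
    """Extract the domain portion of an e-mail address, lower-cased."""
    return address.rsplit("@", 1)[-1].strip().lower()

def looks_like_spoof(address: str, threshold: float = 0.85) -> bool:
    """Return True when the sender domain closely resembles, but does not
    exactly match, a trusted domain (e.g. swapped or substituted characters)."""
    domain = sender_domain(address)
    if domain in TRUSTED_DOMAINS:
        return False  # exact match: not a look-alike
    return any(
        SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED_DOMAINS
    )

if __name__ == "__main__":
    print(looks_like_spoof("ceo@domaintoo1s.com"))  # True: digit 1 substituted for l
    print(looks_like_spoof("ceo@domaintools.com"))  # False: legitimate domain
```

Even a crude similarity test like this will catch single-character substitutions, which is exactly the trick used in the spear phish described above.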
The DomainTools incident is just a small example of the continuing trend across the small-to-medium enterprise landscape.  You have the control and the ability to make a difference in your enterprise.  The time and the services exist for you to keep your organization safer and more secure than it is today.  When will you decide it is your "Duty of Care" to protect corporate assets and to start using some of the tools to make "Business E-mail Compromise" (BEC) extinct?

15 May 2016

Know Your Customer: ISP Future Horizon...

The American public is changing its behavior as a result of the privacy and security failures across the private sector business policy landscape.  As the latest NTIA survey data reveals again, online commerce is being impacted and government agencies are now trying to further communicate that there is a growing problem:

Lack of Trust in Internet Privacy and Security May Deter Economic and Other Online Activities
May 13, 2016 by Rafi Goldberg, Policy Analyst, Office of Policy Analysis and Development

Every day, billions of people around the world use the Internet to share ideas, conduct financial transactions, and keep in touch with family, friends, and colleagues. Users send and store personal medical data, business communications, and even intimate conversations over this global network. But for the Internet to grow and thrive, users must continue to trust that their personal information will be secure and their privacy protected.

NTIA’s analysis of recent data shows that Americans are increasingly concerned about online security and privacy at a time when data breaches, cybersecurity incidents, and controversies over the privacy of online services have become more prominent. These concerns are prompting some Americans to limit their online activity, according to data collected for NTIA [1] in July 2015 by the U.S. Census Bureau. This survey included several privacy and security questions, which were asked of more than 41,000 households that reported having at least one Internet user.

Perhaps the most direct threat to maintaining consumer trust is negative personal experience. Nineteen percent of Internet-using households—representing nearly 19 million households—reported that they had been affected by an online security breach, identity theft, or similar malicious activity during the 12 months prior to the July 2015 survey. Security breaches appear to be more common among the most intensive Internet-using households.

This survey is indeed only one facet of a much larger topic and pervasive problem.  Digital Trust is the output of making affirmative "Trust Decisions" with computing devices.  Whether machine-to-machine, person-to-machine, or machine-to-person, these decisions require several technology engineering elements and business rules that are understood and agreed upon.  The question is, by whom?

Consumers who are using the Internet for communications and commerce, and who are the victims of identity theft, stolen funds or other fraudulent schemes, are just the first wave of targets for transnational organized crime (TOC).  We have known this since the invention of virus scanners and bug bounty programs in the early days of the 21st century.

Yet fifteen-plus years later, the government is doing a study on consumers' feelings about privacy and security.  As a business or a consumer, we understand that the speed of commerce and technology is always far ahead of the regulations and the laws.  When enough people or businesses seem to be harmed, the momentum begins for policy shifts, and new laws are sometimes enacted after thousands of pages of semantic negotiation.

The answers and the outcomes we seek will come.  However, they will not first be solved by politicians and lawyers.  They will be mostly solved by our brilliant mathematicians, software engineers and data scientists.  At this point in time, we are getting so much closer to achieving digital trust through new innovations and inventions.  Just look at IBM Watson.

It is now time for business and commerce to begin the process of finding the truth.  Why do we continue to allow known bad actors to operate inside and within our networks at these levels?  It's a numbers game, and it is because the criminals also employ the smartest social engineers and data scientists.

Digital Trust in the next fifteen years will mean something different than it does today.  We will have found the formula along the journey, the new equations and the rules agreed upon by all to make online and digital commerce more safe and secure.  So what will we do today and tomorrow, until the engineers and scientists save the day?

At this point in time, it is simply called "Know Your Customer" (KYC).  If this were utilized more effectively across critical infrastructure sectors beyond finance in our digital economy, then we would be making some progress.  Where are we talking about next?

The FTC and FCC are well on their path to defining those critical elements of improving the trust that consumers have in using their digital tools with ICT and on service providers' web sites.  Yet even to this day, you can still find criminals using and leveraging our own Internet Service Providers (ISPs) to launch their attacks and perpetuate their fraudulent schemes.  How will this ever be deterred?  Could a version of KYC work with the ISPs?
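Purely as a thought experiment, and not a description of any ISP's actual onboarding process, the sketch below shows what a KYC-style risk check at account creation might look like. Every signal, weight and threshold is an invented assumption:

```python
# Hypothetical sketch of a KYC-style risk score for a new ISP account.
# Every signal and weight below is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class AccountApplication:
    payment_verified: bool          # did the payment instrument clear verification?
    identity_documents: bool        # were identity documents supplied and checked?
    prior_abuse_reports: int        # abuse reports tied to the applicant's details
    anonymizing_registration: bool  # registered through an anonymizing service?

def kyc_risk_score(app: AccountApplication) -> int:
    """Return a simple additive risk score; higher means more review is needed."""
    score = 0
    if not app.payment_verified:
        score += 30
    if not app.identity_documents:
        score += 30
    score += min(app.prior_abuse_reports, 5) * 10
    if app.anonymizing_registration:
        score += 20
    return score

def onboarding_decision(app: AccountApplication, review_threshold: int = 50) -> str:
    """Route risky applications to manual review instead of auto-provisioning."""
    return "manual_review" if kyc_risk_score(app) >= review_threshold else "provision"

if __name__ == "__main__":
    applicant = AccountApplication(False, False, 2, True)
    print(kyc_risk_score(applicant), onboarding_decision(applicant))  # 100 manual_review
```

The point is not the particular signals; it is that an affirmative decision to trust is made, and recorded, before the account is ever provisioned.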

Even with a global banking system in place, you have pockets of greed and deceit: rogue nations or territories that have become the go-to locations for transnational organized crime syndicates to flourish.  Yet we can do much better than we are today.

Just ask any "BlackHat" hacker from Eastern Europe who they prefer to do business with.  Query the experts that exist on the dark side and you will find the ISPs they prefer to do business with.  One day the regulators will realize this is where the business of e-crime has an opportunity for change and additional reform.  It will be more than just opening an account to gain access to the Internet.  It will be about scaling up our systems to a future horizon with new rules and robust real-time behavioral predictive analytics.  In the meantime:
May 11, 2016 
In testimony before Congress today, the Federal Trade Commission outlined its work over the past 40 years to protect consumers’ privacy at a hearing convened to examine privacy rules proposed by the Federal Communications Commission.

Chairwoman Edith Ramirez and Commissioner Maureen Ohlhausen testified on behalf of the Commission. The testimony before the Senate Judiciary Committee’s Subcommittee on Privacy, Technology and the Law provided background on FTC law enforcement efforts, policy work and consumer and business education programs related to protecting consumers’ privacy.

The testimony highlighted the FTC’s extensive history of privacy-related work. The testimony noted that the agency has brought more than 500 privacy-related enforcement cases in its history against online and offline companies of varying sizes, including companies across the internet ecosystem. In addition, the testimony highlighted a number of recent cases of note.

The testimony also provided information on the FTC’s policy work in the privacy area, going back to its first internet privacy workshop in 1996. The testimony noted that recent policy work has been based on principles featured in the FTC’s 2012 privacy report, and also highlighted workshops and reports related to the Internet of Things, big data, and other issues, including cross-device tracking.

The testimony also described the FTC’s extensive consumer and business education efforts related to privacy, including the FTC’s Start With Security campaign for businesses, and the newly-updated IdentityTheft.gov.

07 May 2016

The Third Offset: Seeking the Speed of Trustworthiness...

The U.S. national security "Insider Threat Score" is on its way as a result of the aftermath of the Office of Personnel Management (OPM) hack.  The National Background Investigation Bureau (NBIB) is now standing up operations within the Pentagon umbrella.  Operational Risk Management (ORM) professionals are tracking this closely for good reason.  Social media activities such as this one could one day be a factor in that score.

Simultaneously, the NIST Special Publication 800-160 2nd Draft has been released.  This document, entitled Systems Security Engineering: "Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems," addresses a key component in the national security mosaic.

So if the goal of creating the "Insider Threat Score" is to help automate and maintain the process for better understanding trustworthiness, then the NIST publication should be at the center of the table at the National Background Investigation Bureau.  Why?  Definitions in Appendix B of the SP 800-160 Second Draft:

Trustworthiness: An attribute associated with an entity that reflects confidence that the entity will meet its requirements.

Note: Trustworthiness, from the security perspective, reflects confidence that an entity will meet its security requirements while subjected to disruptions, human errors, and purposeful attacks that may occur in the environments of operation.

Trust: A belief that an entity will behave in a predictable manner in specified circumstances.

The degree to which the user of a system component depends upon the trustworthiness of another component.

Note 1: The entity may be a person, process, object, or any combination thereof and can be of any size from a single hardware component or software module, to a piece of equipment identified by make and model, to a site or location, to an organization, to a nation-state.

Note 2: Trust, from the security perspective, is the belief that a security-relevant entity will behave in a predictable manner while enforcing security policy. Trust is also the degree to which a user or a component depends on the trustworthiness of another component (e.g., component A trusts component B, or component B is trusted by component A).

Note 3: Trust is typically expressed as a range (e.g., levels or degrees) that reflects the measure of trustworthiness associated with the entity.
The future of the automation of the clearance process, continuous monitoring of "Insider Threat Scores" and the trustworthy secure systems software engineering for accomplishing this remains mission critical.  The "Cleared Community" of private sector "Defense Industrial Base" (DIB) contractors will also be impacted by the convergence of both.
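To make Note 3 a little more concrete, here is a hypothetical sketch of how a continuously monitored score might be expressed as a range of trust levels. The signals, weights and bands are assumptions for illustration only, not the NBIB's actual model:

```python
# Hypothetical sketch: expressing trust as a range (levels or degrees), per the
# SP 800-160 note above. The signals, weights and bands are invented.
SIGNAL_WEIGHTS = {
    "failed_background_check_items": -25,
    "policy_violations": -10,
    "unexplained_foreign_contacts": -15,
    "years_of_clean_service": 2,
}

def trust_score(signals: dict) -> int:
    """Aggregate weighted signal counts into a 0-100 score, starting from 100."""
    score = 100
    for name, count in signals.items():
        score += SIGNAL_WEIGHTS.get(name, 0) * count
    return max(0, min(100, score))

def trust_level(score: int) -> str:
    """Map the numeric score onto a coarse range of trust levels."""
    if score >= 80:
        return "high"
    if score >= 50:
        return "moderate"
    return "low - refer for adjudication"

if __name__ == "__main__":
    s = trust_score({"policy_violations": 2, "years_of_clean_service": 5})
    print(s, trust_level(s))  # 90 high
```

The engineering challenge, in SP 800-160 terms, is making a pipeline like this itself trustworthy: the score is only as dependable as the systems that collect and protect the underlying signals.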

So who are the personnel who could be impacted by these two converging initiatives:
  • Individuals with systems engineering, architecture, design, development, and integration responsibilities; 
  • Individuals with software engineering, architecture, design, development, integration, and software maintenance responsibilities; 
  • Individuals with security governance, risk management, and oversight responsibilities;
  • Individuals with independent security verification, validation, testing, evaluation, auditing, assessment, inspection, and monitoring responsibilities;
  • Individuals with system security administration, operations, maintenance, sustainment, logistics, and support responsibilities;
  • Individuals with acquisition, budgeting, and project management responsibilities;
  • Providers of technology products, systems, or services; and
  • Academic institutions offering systems security engineering and related programs.
As the government moves towards more trustworthy secure computing systems, the private sector will be there to assist.  Yet the future of our trusted environments will depend on how often we perform and how well we perform without error.

Software is continuously changing, and the fear of changing it too often has been one of our greatest downfalls.  That fear of change has created our largest exposures to continued exploits and attacks by our most sophisticated adversaries.  Remember, Edward Snowden worked for a private sector contractor.

There are a few trustworthy organizations that have realized this fact and are now on an accelerating path to a higher level of trust, with their software systems and their people.  However, they did this with a leap of faith and the understanding that the speed to reach more trusted computing environments was absolutely vital.

Look around the Nation's Capital Beltway and you will find a few examples of the ideal innovation architecture strategy that will propel us into that next level of trustworthiness.  An affirmative decision to trust is now before us, and the time we take to make that trust decision is our greatest challenge.  Will it be hours, minutes, seconds or nanoseconds?  Marcel Lettre, Undersecretary of Defense for Intelligence, has this perspective:
"The intelligence community’s role in what Pentagon planners call “the third offset”—the search for continuing technological advantage over enemies—will feature robotics, artificial intelligence, machine learning and miniaturization. They will be applied in the areas of “pressing for global coverage capabilities, anti-access/area denial, counterterrorism and counter-proliferation, cybersecurity and countering insider threats,” Lettre said.

He said Defense is reaching out to obtain the expertise of its industrial partners, including Silicon Valley, while workforce planners are focused on “bringing in another generation skilled at innovating in the technology sector.”

01 May 2016

Innovation: Truth in Data Provenance...

For years mathematicians and computer scientists have written about the trustworthiness of data provenance.  Relying on the integrity of data collection, transport and of course the source of data is a real science.  Our modern day zeros and ones span all aspects of our lives and Operational Risk Management (ORM) professionals have encountered the questions surrounding trust and the process of decision making long before the invention of computing machines.

At the root of decision making with integrity, the source of data is questioned: the reliability and the history of previous data from that source.  As the data was transported from Point A to Point B, was there any possibility that it was altered, modified or corrupted?  Couriers and "Hawala"-type systems have been used by traders and terrorists for hundreds of years.

"Truth in Data Provenance" is the question mark that enables our trust decisions.  This is why modern day cryptography is at the center of so many arguments and debates when it comes to the topic of trusted information.  Yet hundreds of years ago, long before telecom and ICT were invented, the trustworthiness of data provenance was a vital factor.  Transposition ciphers were already in use by the ancient Greeks.

So what?  In 2016 what does the truth in data provenance have to do with our business commerce, our transportation, our banking, even our abilities as governments to maintain our defense against attack?
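Before exploring those questions, a toy sketch shows how the cryptography mentioned above underpins data provenance: each record is chained to the one before it with a hash, so any alteration between Point A and Point B changes the final digest. It is a minimal illustration with invented values, not a full provenance system:

```python
# Toy sketch: a hash chain over data records. If any record is altered in
# transit, the final digest no longer matches what the sender computed.
import hashlib

def chain_digest(records):
    """Fold each record, in order, into a running SHA-256 digest."""
    digest = hashlib.sha256(b"provenance-root").hexdigest()
    for record in records:
        digest = hashlib.sha256((digest + record).encode("utf-8")).hexdigest()
    return digest

if __name__ == "__main__":
    sent = ["reading=42", "reading=43", "reading=41"]
    received_intact = ["reading=42", "reading=43", "reading=41"]
    received_tampered = ["reading=42", "reading=99", "reading=41"]

    assert chain_digest(sent) == chain_digest(received_intact)    # integrity holds
    assert chain_digest(sent) != chain_digest(received_tampered)  # tampering detected
    print("provenance check complete")
```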

The topic is vast and deep and worth exploration at the top level of human decision-making.  Yes, it is vital that our computing machines have high-assurance data integrity, in order for our global systems to operate day-to-day.  Yet what impact does trusted information have with humans in an environment of work and daily collaboration?  How does truth in data provenance, affect our decision making and the environments we work in?

In a recent report by LRN, the subject of trust in the work environment as a motivator has become more apparent:
Another fascinating result of the study had to do with two squishy-sounding characteristics of a company: character and trust. Companies deemed by employees to have both strong character and inspired trust performed almost four times better, using the metrics mentioned earlier, than those that had other positive cultural attributes, such as collaboration and celebrating others. (This applied to all three types of companies, though, naturally, culture and trust were much more prevalent in the self-governing ones) What’s more, “high trust” organizations were 11 times as likely to be called more innovative than their competitors. Trust, the How Report suggests, is more important than virtually any other characteristic.
How organizations address the trustworthiness of data provenance is still a new frontier in this day and age.  The use of new sensors, sophisticated analysis of "Big Data" by computer algorithms and the pace at which new data is generated by the "Internet of Things" (IOT) makes this a significant area of focus for our current executives and enlightened organizational leadership.

Why?
But what does that really mean? How does one measure the absence or presence of something as abstract as trust? The How survey defines it as “a catalyst that enhances performance, binds people together, and shapes the way people relate to each other.” High trust groups encourage risk-taking, which in turn is what is necessary for true innovation to occur. When innovation fails, says Seidman, it’s because companies don’t put enough faith in employees to let them take risks. The industries with the highest amount of trust were “computers/electronics,” followed by “software/Internet.” Coming in last? Government.
At the most fundamental level, the culture you are operating in has everything to do with the trust that exists or is absent.  It has everything to do with the trustworthiness of data provenance.  Leadership in any organization must see the connection between trust and innovation, and between innovation and risk-taking.  Your future and your culture depend on it.

23 April 2016

Trust Decisions: The Wealth of Our Cognitive and Digital Transactions...

As you embark on your journey out the door today, you will be required to make dozens of "Trust Decisions".  The interplay between you, the digital smart machines around you, and the numerous human and digital trust transactions you will encounter is quite fundamental.  Or is it?  As you walk into your office building, the surveillance cameras are watching you and recording your behavior.  The iPhone in your pocket is transmitting your unique signals to digital data sensors embedded in the lobby.  As you press the button on the elevator to go up to your office, you are making another affirmative decision to trust.

When you step off at your floor and approach the door to your office, you might utilize your small "Radio Frequency Identification" (RFID) device to swipe a small square mounted on the wall.  You hear the deadbolt unlock and you are now granted access to your office space to start your workday.  As you walk to your corner office, you glance at the top of the screen of your iPhone to see if you are connected automatically to the corporate wireless network and the VPN.  When you were granted access to the office, the corporate computer network knew you were present, and you were automatically granted access, based on your role, to numerous software applications on your computing devices.
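As a hedged illustration of the badge-and-role flow just described, the sketch below grants application access only when a badge-in event has been recorded and the person's role allows the application. The roles, applications and employee IDs are invented for the example:

```python
# Hypothetical sketch of the badge-in flow described above: a badge swipe
# records presence, and application access is then granted by role.
ROLE_APPS = {
    "finance": {"erp", "payroll", "email"},
    "engineering": {"source-control", "build-system", "email"},
}

PRESENT = set()  # employee IDs that have badged into the building

def badge_in(employee_id):
    """Record that the badge holder is now physically present."""
    PRESENT.add(employee_id)

def can_use(employee_id, role, app):
    """Grant application access only when present and the role allows the app."""
    return employee_id in PRESENT and app in ROLE_APPS.get(role, set())

if __name__ == "__main__":
    badge_in("emp-1001")
    print(can_use("emp-1001", "finance", "payroll"))  # True: present, role allows it
    print(can_use("emp-2002", "finance", "payroll"))  # False: never badged in
```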

Start your day at work, and the number of digital trust encounters has only just begun.  The "Trust Decisions" that you and your digital devices will be making could reach into the hundreds after a long 8-hour day.  Yet there are five principles, which emerged in May of 2015 from Oxford professor and author Jeffrey Ritter in his book "Achieving Digital Trust", that we should consider now:
  • Every transaction creating wealth first requires an affirmative decision to trust.
  • Building trust creates new wealth.
  • Sustaining trust creates recurring wealth.
  • Achieving trust superior to your competition achieves market dominance.
  • Leadership rises (or falls) based on trust (or the absence of trust).
Think about a day in the life of your entire organization and the number of digital trust transactions that have nothing to do with actual monetary currency transfers.  The wealth being described here may at first glance be thought of in terms of dollars or yuan or property, yet what about the wealth of human trust, a plentiful amount or an abundance of anything?  How tangible is the decision to trust the computing machine before you, or the person sitting across the desk who is a key supplier, or that new client halfway around the world just sending you a text message?

You see, we walk to work and communicate every day, making hundreds of trust decisions.  Our corporate computing devices are making tens of thousands or millions of transactions of trust each hour.  The rules, information and calculations are known, because they are being measured.  Jeffrey Ritter says it this way:
Take a moment and think about each of these with respect to what you do in your business or in your job. How does the organization acquire wealth? Where does new wealth originate? How are customers retained? What provokes them to keep coming back and paying for your goods or services? Why does the leader in your market succeed? If you are not the market leader, why not? How is the loyalty of your team maintained? 
The future is clear and becoming more revealing to us each day.  Digital trust, security and privacy of your organization and our societies are being defined before us in plain sight.  Can you see it?  The Washington Post illustrates a single example:

By Hayley Tsukayama and Dan Lamothe April 22 at 7:22 PM

Ever since Chinese computer maker Lenovo spent billions of dollars to acquire IBM’s personal-computer and server businesses, some lawmakers have called on federal agencies to stop using the company’s equipment out of concerns over Chinese spying.

This past week, those lawmakers thought the Pentagon finally heeded their warnings. An email circulated within the Air Force appeared to indicate that Lenovo was being kicked out.

“For immediate implementation: Per AF Cyber Command direction, Lenovo products are being removed from the Approved Products List and should not be purchased for DoD use. Lenovo products currently in use will be removed from the network,” stated the message. The apparent directive was generally welcomed as it circulated around Capitol Hill.

Then the Pentagon’s press office weighed in. Not so fast, it said.
The "trust decisions" you make today at work, and as you navigate home for the evening, will be more apparent.  A heightened understanding of digital trust, and of how you engage with these transactions each waking hour, may assist you in creating new wealth.  Improving the trust you have with computing machines and with others at home or work can make all the difference in life.

Where do you work and live?  Washington, DC.  London.  Moscow.  Beijing.  New Delhi.  Sydney.  It doesn't matter anymore because we are all connected by the Internet.  The opportunity for the societies of our planet to utilize "Information & Communication Technology" (ICT) to produce greater wealth is before us.  How will you proceed with your Trust Decisions?

16 April 2016

Leadership in Crisis: Building Trust with Continuous Training...

How often have you heard the leadership philosophy that you must "Train Like You Fight"?  Here is another way to look at it:
The more you sweat in peace, the less you bleed in war.
Norman Schwarzkopf
The theme is all too familiar with Operational Risk Management (ORM) teams that operate on the front lines of asymmetric threats, internal corruption, natural disasters and continuous adversaries in achieving a "Defensible Standard of Care."

As the senior leader in your unit, department or subsidiary, the responsibility for preparedness, readiness and contingency planning remains high.  Your personnel and company assets are at stake, so what have you done this month or quarter to train, sweat and prepare?  How much of your annual budget do you devote to the improvement of key skills for your people in a moment of crisis or chaos?

What will the crisis environment look like?  Will it develop with clouds, water and wind, or a significant shift of tectonic plates?  Will it begin with the insider employee copying the most sensitive merger and acquisition strategy to sell to the highest bidder?  Will it start with a single IT server displaying a warning to pay a ransom or lose all possibility of retrieving its data and operational capacity to serve your business?  Will it end up being another example of domestic terrorism or workplace violence like San Bernardino, Paris or Ft. Hood?

Leaders across our globe understand the waves of risk and the possible issues that they may encounter each year.  Many travel to Davos to the World Economic Forum, where the world tackles these disruptive events with the best minds and an exchange of information.  Why?  They understand that vulnerability is what they fear the most.

Yet what can you do in your own community, at your own branch office, to address the Operational Risks you face?  How can you wake up each day with the confidence as a leader that you have trained and prepared for the future events that will surprise you?  It begins with leadership and a will to lead your team into the places no one really likes to talk about: the scenarios that people fear to train for, because they think they will never happen.

Achieving any level of trust with your employees, your customers and your supply chain revolves around your leadership.  The discipline of "Operational Risk Management" is focused on looking at all of the interdependent pieces of your business mosaic.  The environment you operate in, even the building that houses your most precious assets.  All of these factors are considered in developing and executing your specific plan for training and readiness.

So what?  The question is "Why Don't Employees Trust Their Bosses"?
Why this lack of trust?

There is a disparity, the survey revealed, between areas that employees said were important for trust, and the performance of company leaders in these areas.

For example, half of respondents said it was important for the CEO to be ethical, take responsible actions in the wake of a crisis and behave in a transparent way. However, a much lower number of respondents actually felt their CEO was exhibiting these qualities.

This disparity is in part responsible for trust decreasing as you move down an organization’s hierarchy. So, while two-thirds of executives trust the company, less than half of rank-and-file employees do. Equally, peers were rated as much more credible than CEOs.
As a leader, your roles are multi-faceted and there is never enough time or money in the budget.  The leaders who excel in the next decade will find a way.  They will invest in their teams' training and in the systems to increase trust, by addressing Operational Risk Management (ORM) as a key component of the interdependent enterprise.

The "Trust Decisions" you require, and the understanding developed to ensure effective "Trust Decisions" by all of your stakeholders, will remain your most lofty goal as a leader.  How you train to fight and how you sweat now will make all the difference in your next war.  From the boardroom to the battlefield, your leadership is all that is needed.  Your leadership will make a difference.

09 April 2016

Trade Secrets: Gearing up for DTSA...

The Fortune Global 500 and the smallest research and development organizations in the U.S. have another ruleset to keep their eye on this week.  It is named the DTSA, or S.1890 - Defend Trade Secrets Act of 2016, which has passed the Senate.  Operational Risk Management (ORM) is preparing for the next addition to national laws.

The attribution of cyberespionage adversaries has been gearing up since the Sony Pictures hack.  The private sector has been hunting and identifying those shadow individuals and nation state special units for years.  Now the lawyers can get more aggressive with civil actions.

The question remains, will another law deter the actions by global organized crime and the intelligence community of some significant nations?  How will attribution and more aggressive civil actions in foreign jurisdictions make a difference?

As a global organization, can you access your database of confidential trade secrets?  No different from the task of identifying the information assets you are going to protect, you need an inventory.  What are they and where are they?  Everyone knows the formula for "Coca-Cola" is written on a single piece of paper that is locked up in a vault in Atlanta, GA, right?  Or is it?

There are trade secrets across America that have been stolen by operatives working inside organizations.  They may be preparing to leave the U.S. for another country outside the reach of law enforcement and the legal process for seizing the stolen property.  That is going to change soon.
The EX-Parte Seizure Order is part of the Trade Secrets bill that allows a trade secret owner to obtain an order from a judge for U.S. marshals to seize back the trade secret from the alleged bad actor without prior warning. This is to protect the trade secret owner from having the alleged bad actor skip the country or destroy the evidence before it is recaptured.
Now that Trade Secrets are in the same legal and enforcement category as patents and trademarks, you can predict that your legal budgets will need to be adjusted upwards.  In general, what is a Trade Secret?
The subject matter of trade secrets is usually defined in broad terms and includes sales methods, distribution methods, consumer profiles, advertising strategies, lists of suppliers and clients, and manufacturing processes. While a final determination of what information constitutes a trade secret will depend on the circumstances of each individual case, clearly unfair practices in respect of secret information include industrial or commercial espionage, breach of contract and breach of confidence.
The effort to make intellectual property a "Trade Secret" is another strategy in itself. The determination to designate something a trade secret is going to depend on the invention or the data itself. We understand. So what?
A Chinese businessman pleaded guilty Wednesday (March 23) in federal court in Los Angeles to helping two Chinese military hackers carry out a damaging series of thefts of sensitive military secrets from U.S. contractors.

The plea by Su Bin, a Chinese citizen who ran a company in Canada, marks the first time the U.S. government has won a guilty plea from someone involved with a Chinese government campaign of economic cyberespionage.

The resolution of the case comes as the Justice Department seeks the extradition from Germany of a Syrian hacker — a member of the group calling itself the Syrian Electronic Army — on charges of conspiracy to hack U.S. government agencies and U.S. media outlets.
Our adversaries are determined. They are already here. It has been documented for years. Let the next wave of legal indictments and seizures begin. One thing is certain. The "Insider Threat" is still present and your organization can do better. The ability to effectively utilize the correct combination of controls, monitoring, technology and internal corporate culture shifts will make all the difference. What are you waiting for?

03 April 2016

Fifth Discipline: The Evolution of Digital Intelligence...

"Learning organizations themselves may be a form of leverage on the complex system of human endeavors.  Building learning organizations involves developing people who learn to see as systems thinkers see, who develop their own personal mastery, and who learn how to surface and restructure mental models, collaboratively.  Given the influence of organizations in today's world, this may be one of the most powerful steps towards helping us "rewrite the code," altering not just what we think but our predominant ways of thinking.  In this sense, learning organizations may be a tool not just for evolution of organizations, but for the evolution of intelligence."  --Peter M. Senge -The Fifth Discipline - 1990

Many senior executives and a cadre of experienced Ops Risk professionals who are waking up across the globe today keep this textbook within arm's reach.  Why?  All 413 pages of wisdom and knowledge transfer are applicable at this moment, even though the book was written and practiced several years before the commercial Internet was born.  Our respective cadre of "Intelligence Analysts" spans the organization, continuously seeking the truth, analyzing the growing mosaic, applying new context and taking relevant actions.

In an environment now vastly more virtual, far beyond the paper pages of Senge's book, lies the contemporary intelligence of IBM's Watson.  At the fingertips of the FireEye operators or the Palantir Forward Deployed Engineer, we have new insights almost in real time.  The "Learning Organizations" are no longer in a traditional hierarchy.  They are flat, agile and capable of tremendous autonomy at light speed.

So what is the opportunity now?  How can we potentially move towards more collaborative systems thinking and "rewrite the code," even in the second decade of the 21st century?  It starts with rewriting the new digital code.  It continues as we reengineer our "Learning Organizations" for a digital environment that operates 24 x 7, is ever more fragile and depends on inherent trust.  We can still create and deploy systems thinkers to question the truth and learn from the speed and capabilities of our new intelligent machines.

Peter Senge outlines five learning disciplines in his book on three levels:
  • Practices:  What you do
  • Principles:  Guiding ideas and insights
  • Essences:  The state of being of those with high levels of mastery in the discipline
The five disciplines are:
  • Systems Thinking
  • Personal Mastery
  • Mental Models
  • Building Shared Vision
  • Team Learning
The enterprise architecture for our modern day learning organization is in its infancy.  You see, the technologies and the software have outpaced our human ability to apply them effectively with the five disciplines.  One of our continued vulnerabilities is the ignorance of information governance as it pertains to the truth of data provenance, and to how, as humans, we apply the disciplines of learning in our digital organizations.
The international hacker who allegedly accessed personal emails and photographs belonging to the family of former president George W. Bush and whose cyber-mischief revealed that Hillary Clinton was using a private email address appeared in a U.S. court for the first time Friday.

Marcel Lehel Lazar — better known by the moniker “Guccifer” that he is said to have affixed to the materials he stole — is charged with cyber-stalking, aggravated identity theft and unauthorized access of a protected computer in a nine-count indictment filed in 2014 in federal district court in Alexandria, Va. He was extradited to the United States recently from Romania, his home country, where he had been serving a sentence for hacking.
Our organizations are a "plume of digital exhaust" that is invisible to many and crystal clear to some.  As you begin to capture and document the digital footprint of today's knowledge worker, the trail is long and deep.  Even the shadow planners, logistics experts and operators cannot escape the digital encounters they have each day.  However, the apparent threat is that they will continuously become more aware and more disciplined.

The art and practice of gaining and preserving "Digital Trust" is at stake for all of us.  The vast and consistent application of understanding "trust decisions" in our digital lives, will forever provide us new found challenges and new discoveries.  How we consistently apply our digital disciplines going forward, will make all of the difference in our prosperity or our future peril.  How we reengineer our learning organizations for 2025 and beyond, is now at our doorstep.
Today, privacy, information security, cyber defenses—all revolve around the same target: achieving trust to sustain electronic commerce and create new wealth. Digital trust is not only required; achieving digital trust will prove to be the competitive differential for the winners of the next generation.  --Jeffrey Ritter
Think about your digital footprints as you interact, communicate, travel and read the news today.  Activity-based Intelligence (ABI) is a business and you are the product.  The question is, how can you and your learning organization move from the "Fifth Discipline" to the next one?  What cognitive strategies and new disciplines will you and your organization deploy this year to attain new levels of prosperity and insight?

The journey will be long and the opportunities will be explored.  It's time that more learning organizations start the reengineering with the right tools and talent.  Yes, this is the next evolution of intelligence.

26 March 2016

The Rules: Implications and Outcomes of our Decisions...

How often during an average week in your role do you make a "Trust Decision"?  When you think about the factors associated with what really is going on when you make a decision to trust, it is beyond comprehension.  Or is it?

The thousands of "Trust Decisions" that you will make as an Operational Risk Management (ORM) professional this week span every hour of your waking day.  The portfolio of decisions to trust involve other people, processes, machines, computers and rules.  As these words are typed on this computing machine from Apple, many more decisions have already been made about trust.

A recent California symposium on "Cyber 2026" looked into the crystal ball on how our society and environment will evolve in the next ten years.  Topics included the threat landscape and our levels of machine learning hygiene.  The Internet-of-Things (IoT) was mentioned, along with the latest on adding more integration between your "Smart Car" and your "Smart Phone".  This is just the beginning.

What needs to happen next?  The dialogue on digital trust is now becoming a prominent theme, with significant effort occurring in the published press and on Amazon.  Business units from PwC and Accenture are pivoting people, resources and thought leadership towards the topic for good reason.  The next reengineering revolution is ready for prime time.

It has taken us the last ten years since 2006 to evolve with the cloud and the trust associated with handing over our data to a third party.  We have migrated vital core software systems to be managed by AWS, Microsoft Azure and Google.  These managed solutions provide the Small-to-Medium-Enterprise with the opportunity to scale their business without tremendous capital expenditure.

Yet we continue to find ourselves making daily and hourly decisions to trust while interacting with computing machines, with that back-of-the-mind feeling: can this really be trusted?  Should I click on this link in my e-mail?  How shall I respond to this LinkedIn message from a person I have never met face-to-face?  As humans we are making "Trust Decisions" without even thinking about the science and systems mechanics that underlie the components and process.  We just do it.

The rules.  Now think about your daily routine and the "Trust Decisions" you make.  How often are your decisions to trust intersecting with rules?  Rules codified into laws.  Rules codified into software.  Rules codified into religion.  Our world is about rules and how we either interact with or ignore the rules:
A month after a Los Angeles hospital was crippled by crypto-ransomware, another hospital is in an "internal state of emergency" for the same reason. Brian Krebs reports that Methodist Hospital in Henderson, Kentucky, shut down its desktop computers and Web-based systems in an effort to fight the spread of the Locky crypto-ransomware on the hospital's network.

Yesterday, the hospital's IT staff posted a scrolling message at the top of Methodist's website, announcing that "Methodist Hospital is currently working in an Internal State of Emergency due to a Computer Virus that has limited our use of electronic web-based services."
Unfortunately, the trust decisions that we make each day can be catastrophic.  Whether it be online, or because we are just following the rules in Brussels:
Although this nation of 11.2 million has sent more foreign fighters per capita to the Islamic State than any other country in Europe, Belgium has a relatively small security apparatus. Brussels, the capital, is home to 2,500 international agencies and organizations, including NATO and the E.U. headquarters. Yet nationwide, the Belgian federal police have a total force of approximately 12,000.

The Belgian police have also been hampered by bizarre rules. According to Belgian Justice Minister Koen Geens, just two days after the Paris attacks Abdeslam was “likely in a flat in Molenbeek.” But because of the country’s penal code, which prohibits raids between 9 p.m. and 5 a.m. unless a crime is in progress or in case of fire, police were ordered to wait until dawn to pursue him. By then, Abdeslam was nowhere to be seen.
As we accelerate towards 2026 and beyond, it will require us to better design our systems, our society and the rules associated with operating our cities, companies and countries.  How can we hope to achieve this without understanding the root cause and the outcomes of our trust decisions?  How will we reengineer our software to assist the human or artificial intelligence (AI) in writing the rules that shall "Enable Digital Trust of Global Enterprises"?

The pace of technological change has far surpassed our ability, as humans alone, to write the new rules for our next generation and beyond.  We shall now embark on a purposeful mission to enlighten our leadership and the engineers of our vast digital environments on how to reengineer our rules, for the safety, security and privacy of a more certain future as we make our daily trust decisions.

20 March 2016

Vigilance: The Casualty of the Truth...

On the other end of a planned cyber threat are the motives and plans of a person.  Sometimes that person puts into play the use of a "Bot" to carry out many of the planned steps in their scheme.  Operational Risk Management (ORM) professionals have been classifying these cybercriminals for a decade or more, yet even now in 2016 they are getting more formal profiles:

BAE Systems, the London-based, multinational security company, recently released profiles of “six prominent types of cybercriminals” and detailed how they could hurt companies around the globe, officials say.

Threat intelligence experts at BAE Systems have compiled a list, “The Unusual Suspects,” that has been created from “research that uncovers the motivations and methods of the most common types of cybercriminals,” according to BAE. “The intention of the campaign is to help enterprises understand the various enemies they face so they can better defend against cyberattacks.” BAE Systems officials have profiled six cybercriminal types:
  • The Mule – naive opportunists that may not even realize they work for criminal gangs to launder money;
  • The Professional – career criminals who work 9-to-5 in the digital shadows;
  • The Nation State Actor – individuals who work directly or indirectly for their government to steal sensitive information and disrupt enemies’ capabilities;
  • The Activist – motivated to change the world via questionable means;
  • The Getaway – the youthful teenager who can escape a custodial sentence due to their age;
  • And The Insider – disillusioned, blackmailed or even over-helpful employees operating from within the walls of their own company.
These individuals and groups have caused billions of dollars in losses and caused significant harm to millions of people and organizations.  Now what?

It will be many more years before the laws catch up to the technology and to those who use the vector of the Internet to carry out their crimes against humanity.  Law enforcement has its hands continually tied by the laws and the geographic challenges of a global epidemic.  Governments and politicians are in constant battle over the privacy-versus-security philosophy and all the legal issues.

While the wheels of Parliament or the U.S. Congress slowly turn and the mechanisms for law enforcement become more robust for evidence collection, investigations and prosecutions, there are significant strategies of resilience on which we must focus our respective vigilance.  It is not anything new per se, just a renewed emphasis and a new commitment to redesigning our digital environments.  We can do better.

For now, what if we just pick one cybercriminal type to focus on?  The "Insider".

The "Insider" is most likely in almost every formal organization today, working diligently to mask and perpetuate their goals until they are revealed.  It is your "Duty of Care" to continuously deter, detect, defend and document within your enterprise.  The "Insider" could be anyone and so how can the organization work ever more so vigilantly?

It begins at the core of the business and the culture that surrounds those principles within your company, your team or your relationships with suppliers.  The environment you build and sustain shall have the transparency and the elements necessary for a culture in which the "Insider" is incapable of operating, where the culture itself makes it impossible for the "Insider" to operate without disclosure.

We would encourage Operational Risk Management (ORM) professionals to incorporate newfound strategies, new management tools and a renewed effort to extinguish the "Insider" threat across the globe.  The best way we can do this today is to work on the culture and to establish the foundations for future "Trust Decisions" within the enterprise.  The root of changing the culture and achieving the desired future environment begins with every single decision to trust.

The journey ahead will be long and full of newfound challenges.  The vision of the future and the outcomes received will soon be more apparent.  Now the real work begins: start the journey with your own organization, with each person, and with an understanding of the environment and culture you seek.  And remember:
"In war, truth is the first casualty."
Aeschylus
Greek tragic dramatist (525 BC - 456 BC)

12 March 2016

Rugged DevOps: Reengineering for our Next Generation...

The reengineering of the Internet is now underway for our next generation beyond the millennials.  The unification of corporate software development and information security teams is experiencing a sense of deja vu, reminiscent of scenes from the 1993 movie "Groundhog Day."  Operational Risk Management (ORM) is hopeful that we are having a new resurgence of software vulnerability management thinking.  Why?

"A weather man is reluctantly sent to cover a story about a weather forecasting "rat" (as he calls it). This is his fourth year on the story, and he makes no effort to hide his frustration. On awaking the 'following' day he discovers that it's Groundhog Day again, and again, and again. First he uses this to his advantage, then comes the realization that he is doomed to spend the rest of eternity in the same place, seeing the same people do the same thing EVERY day."  --Groundhog Day

We are seeing the reunification of 1990's Software Quality Assurance (SQA) thinking, combined with the rigor of new 21st century rapid software development disciplines.  It is called "Rugged DevOps."  Application development life cycles are getting shorter these days.  That is because modern day software development life cycles are taking a more component-based approach, with the reuse of standardized software capabilities.  This makes sense, as long as the use of software quality assurance tools and services is not abandoned and new tools and processes are embraced.

Welcome to "Rugged DevOps."  This Forrester report, "The Seven Habits of Rugged DevOps," will give you more context:

  • Habit 1: Increase Trust And Transparency Between Dev, Sec, And Ops
  • Habit 2: Understand The Probability And Impact Of Specific Risks
  • Habit 3: Discard Detailed Security Road Maps In Favor Of Incremental Improvements
  • Habit 4: Use The Continuous Delivery Pipeline To Incrementally Improve Security Practices
  • Habit 5: Standardize Third-Party Software And Then Keep Current
  • Habit 6: Govern With Automated Audit Trails
  • Habit 7: Test Preparedness With Security Games

"Enabling Digital Trust of Global Enterprises" in the next decade will require software development organizations to embrace security and risk professionals simultaneously, on a more consistent and non-adversarial basis:
DevOps practices can only increase speed and quality up to a point without security and risk (S&R) pros' expertise. Old application security practices hinder speedy releases, and security vulnerabilities represent defects that can leave a company open to cyberattacks. But DevOps practitioners can leap forward with both increased speed and quality by including S&R pros in DevOps feedback loops and including security practices in the automated life cycle. These new practices are called rugged DevOps. This report presents the seven main principles of rugged DevOps so I&O pros and developers can break down barriers with S&R pros and achieve faster releases with stronger application security.
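To make Habits 5 and 6 a little more concrete, here is a minimal, hypothetical sketch of a pipeline step that checks pinned third-party packages against a known-bad list and appends the result to an audit trail. The package names, versions and file paths are invented for illustration:

```python
# Hypothetical pipeline step: audit pinned third-party packages (Habit 5)
# and record the result as an append-only audit trail (Habit 6).
# Package names, versions and paths are invented for illustration.
import json
from datetime import datetime, timezone

KNOWN_BAD = {("examplelib", "1.0.3"), ("oldparser", "2.1.0")}  # assumed advisory data

def parse_pins(requirements_text):
    """Parse 'name==version' lines from a requirements-style file."""
    pins = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "==" in line:
            name, version = line.split("==", 1)
            pins.append((name.lower(), version))
    return pins

def audit(requirements_text, trail_path="audit-trail.jsonl"):
    """Return True when no pinned package is on the known-bad list; log either way."""
    pins = parse_pins(requirements_text)
    findings = [p for p in pins if p in KNOWN_BAD]
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "checked": len(pins),
        "findings": [f"{name}=={version}" for name, version in findings],
    }
    with open(trail_path, "a", encoding="utf-8") as trail:
        trail.write(json.dumps(entry) + "\n")
    return not findings

if __name__ == "__main__":
    sample = "examplelib==1.0.3\nrequests==2.9.1\n"
    print("pipeline gate passed:", audit(sample))  # False: examplelib 1.0.3 is flagged
```

A gate like this fails the build when a flagged dependency is pinned, while the JSON-lines trail gives auditors a record of every check that was run.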
    Chief Information Officers (CIO), Chief Privacy Officers (CPO), Chief Legal Officers (CLO), Chief Operating Officers (COO), Chief Security Officers (CSO) and maybe the Chief Executive Officers (CEO) are now paying more attention to these issues.

    Here are 9.5 million more reasons why:

    In 2007, a class action lawsuit was filed in the United States District Court of the Northern District of California against Facebook on behalf of 3.6 million users of Facebook concerning its “Beacon” program. KamberLaw represented the plaintiffs in this action and Cooley LLP represented Facebook. This suit was settled in 2009 and was granted final approval by the Hon. Richard Seeborg in March 2010. As part of the settlement, the parties created the Foundation (the Digital Trust Foundation) “the purpose of which shall be to fund projects and initiatives that promote the cause of online privacy, safety, and security.” The case settled for $9.5 million, with the Foundation receiving approximately $6.7 million after attorney’s fees, payments to plaintiffs, and administrative costs. There were four objectors to the settlement, two of whom appealed the approval to the Ninth Circuit Court of Appeals and subsequently the Supreme Court. But ultimately, in November 2013, the appeals were rejected and the Foundation was funded. The Foundation will distribute more than $6 million and will close its doors once all of the grants have been distributed and completed.

    Corporate Board of Directors conversations about the topic of "Digital Trust" are now ongoing and are the subject of new business units.  Security vs. Privacy has been a recent media frenzy between some of our technology companies and the U.S. government.  Your elected officials in the U.S. House of Representatives are also on the hot seat now to produce new, relevant legislation.  The courts are adding more privacy and data breach cases to the docket each week.  The "Digital Equilibrium Project" is being established and will hopefully include an international set of stakeholders.

    Authoring rules that everyone understands and everyone can agree on sets the stage, or the playing field, for the environment of competition to engage with some sense of civility.  Rules will be broken in plain sight and the referee (law enforcement, judges, courts, juries) will impose a penalty, while potentially millions of people watch live.  Is it a penalty kick or just a loss of down?

    Think global.  Think at the speed of light.  Think about the trust of e-commerce transactions, where millions of people rely on our computing machines every waking minute of the day and where zettabytes of data are in use.  The rules on the "Digital Playing Field" are vital to our future social and economic well-being.

    "Rugged DevOps" is another and necessary component of a safe, private and secure Internet ecosystem.  Operational Risk Management (ORM) professionals are evermore concerned, with the root cause of our current Privacy vs. (soon to be "And") Security headlines.  Digital Trust is hard to achieve and yet easy to forfeit.  It is time for us to begin "Reengineering for our Next Generation".

    05 March 2016

    Zeros and Ones: Context & Proportionality Don't Translate...

    "Context and Proportionality do not translate to Zeros and Ones."  This was a key take away from the 2016 RSA Conference last week in San Francisco.  Thousands of Operational Risk Management (ORM) professionals attended to listen to speakers with titles such as Attorney General, Secretary of Defense and Chief Technology Officer.

    Perhaps more important, however, were the actual practitioners in the legal system and those "Quiet Professionals" responsible for our national security, who were clearly outlining the digital landscape and the significant challenges ahead for our nation and the future of our social and economic destiny.

    The software engineers and companies who are writing millions of lines of software code are at risk.  Here is why.  Context and Proportionality do not translate to Zeros and Ones because lawyers are writing words with "Semantically Intentionally Ambiguous Meaning" (SIAM) in the pursuit of achieving digital trust.  Privacy and security intent has been lost in the translation from lawyers to software engineers for a long time.
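
    A hypothetical illustration of that gap: when a policy says data collection must be "proportionate to the stated purpose," the engineer who implements it still has to choose a concrete rule.  Everything below, including the threshold value, is an assumption invented for this sketch; that is precisely the point.

        # Hypothetical: encoding "collection must be proportionate to the purpose"
        # as executable logic. The legal text supplies no numbers, so the
        # engineer ends up choosing one -- and that choice becomes the policy.
        PROPORTIONALITY_THRESHOLD = 0.8   # arbitrary: where did 0.8 come from?

        def is_collection_proportionate(fields_requested, fields_needed_for_purpose):
            """Approve collection only if most requested fields serve the purpose."""
            if not fields_requested:
                return True
            relevant = len(set(fields_requested) & set(fields_needed_for_purpose))
            return relevant / len(set(fields_requested)) >= PROPORTIONALITY_THRESHOLD

        if __name__ == "__main__":
            requested = ["name", "email", "location", "contacts", "photos"]
            needed = ["name", "email"]
            print(is_collection_proportionate(requested, needed))  # False at 0.8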

    How can we summarize the entirety of what just took place this past week at RSA:
    • Visibility
    • Threat Protection
    • Compliance
    • Data Security
    These four pillars are still how the majority of the industry is categorized, yet we came across some very interesting companies and products that are creating a new buzz.  Walking the halls and observing the presentations, the mobile computing generation was in full force.  As everyone shuffled between sessions like students in overcrowded high school hallways, the only safe location was on an escalator, where you could stare at your iPhone for 20 seconds with a little peace.  Can you imagine the amount of intellectual property intelligence being collected by competitors and adversaries using digital sensors and good old-fashioned tradecraft during the week?

    So what?  In the spirit of all the talk and debate, the sales and marketing, the presentations and PowerPoint slides, what have we learned?

    "Context and Proportionality do not translate to Zeros and Ones."

    Why is this so important to grasp?

    At a certain point in the accelerating evolution of technology innovation there are disruptive bifurcations.  A bifurcation means that a particular system reaches a point in time when, instead of continuing to rise and grow along the "S" curve, it begins its descent and erosion, until it is outdated or no longer trusted as a standard.
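
    For readers who prefer a formula to a metaphor, the classic "S" curve is often modeled as a logistic function.  The minimal sketch below (all parameter values are arbitrary) shows adoption climbing and then flattening, the point at which, as argued above, a successor system begins to take over.

        # Minimal logistic ("S" curve) sketch; all parameter values are arbitrary.
        import math

        def logistic(t, ceiling=100.0, growth_rate=1.0, midpoint=0.0):
            """Adoption at time t: slow start, rapid growth, then saturation."""
            return ceiling / (1.0 + math.exp(-growth_rate * (t - midpoint)))

        if __name__ == "__main__":
            for t in range(-6, 7, 2):
                print(f"t={t:+d}  adoption={logistic(t):6.1f}")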

    We are soon to reach a new bifurcation in the digital systems that run our businesses, markets and governments.  The organizations who rely on the Internet in their daily operations need to adapt.  Quickly.  Those that are able to accomplish rapid reengineering will survive.  And those who wait or miss the signals to adapt will perish or become absorbed by the digital environment surrounding them.

    27 February 2016

    RSA 2016: Ascending into a Trust Mindset...

    Building awareness of a vulnerability potentially heightens one's sensitivity to defend, or to build resilience, in order to minimize damage or loss.  This is one of the foundations of Operational Risk Management (ORM): understanding what your assets are and what vulnerabilities exist.  Good old-fashioned Risk Management 101 tells us to mitigate risks in the enterprise and even in our personal lives.

    Is traditional Risk Management dead?  We think it is, and through the eyes and inspiration of others we can now see why.  Our ability to make "Trust Decisions" is far more complex than just an emotion.  As we have evolved away from small villages where food, water and other life-essential resources were shared, trust factors have become more distant.  More shallow and less personal.  Our digital lives, spanning continents and countries at light speed, have now given us a new perspective.  We must find our Trust Mindset.

    As the RSA Conference opens on February 29, 2016 in San Francisco, thousands of eager professionals will converge on an event that has its foundation and its future built on "Achieving Digital Trust".  As we walk the Moscone Exhibition Halls observing, learning, and engaging in dialogue or debate, we must remind ourselves of the wisdom that comes from Jeffrey Ritter:

    I have always viewed the emergence of the Internet and global computing as powerful tools to increase the velocity of the next solutions that enabled greater inter-dependence, greater accessibility to commerce, and more small steps toward peace. Through my work, however, I learned those tools were vulnerable unless, as a global society, we determine how to also build across the digital dimensions of cyberspace the capacity for humans to achieve what each transaction first requires—an affirmative decision to trust.

    Jeffrey's latest book has been an inspiration for so many who have researched and lived in the Venn diagram of the Law, Digital Technology and eCommerce.  Yet what about those people who have studied and modeled the human skills and behaviors that build trust with others, but have yet to read Jeffrey's book?  What is the fusion between the factors associated with building trust human-to-human and in a world of machine-to-machine?

    "Trust Decisions" are being made by humans and computers each second of each day.  And one thing is certain about the decisions to trust by people and by the machine in your pocket, brief case or purse.  It is continuously learning and sharing.

    The halls of the RSA Conference will be buzzing about trust.  In all of its manifestations, the ecosystem of the event is about "Trust Decisions" and, in many cases, man and machine.  The iPhone vs. the FBI.  Security vs. Privacy.  Cloud vs. Hybrid Cloud.  Secret Clearance or Top Secret Clearance.  Pre-hire background check.  FICO.  LinkedIn profile.  You name it and the fundamental question set comes back to a "decision to trust."

    Living an ethical life of integrity and willingness to share begins at an early age.  Sharing information responsibly with your peers, director or commander requires a process for building trust over time; each transaction of information exchange either builds or erodes the future decision to trust.

    Here is one recent example.  Sitting in a room with a dozen strangers the other day was a mini case study.  The purpose of this particular meeting was for this group of people to establish a forum for future trusted information exchange.  We were all part of the same ISAO, if you will, not the same company.

    The agenda called for each person to introduce themselves, all for the first time.  The specific rules for the introductions were not spelled out by the host, nor agreed to by all of the meeting participants.  What happened next is a classic example of trust erosion when the rules are absent.  As we proceeded around the room, each person took it upon themselves to determine how much or how little information they would share with the rest of the group.

    Some people introduced themselves with their name, company affiliation and a "one liner" on the business they were in.  Others, in addition, took the opportunity to tell us all about their entire product/service line and why their solution was something that we should be interested in.  The first impressions were already building or eroding our perceptions of trust.  Our own reality.

    It should be our ambition to continuously heighten our sensitivity to behavior in an environment absent of rules, and to how this builds or erodes our future trust decisions.  When you share, do you always have an expectation of reciprocity?  When you boast about yourself or your organization, is it for your own ego or self-satisfaction?  Do you ever even ask the question, "How are you?" or "How can I help you?"  What are the rules?

    Extraordinary trust is rare these days.  True Leadership is scarce.  Courage is almost extinct.  Think about how you can stand out and, at the same moment, project a feeling of care, of concern and of generosity.  Giving without any expectation of return is what is going to help you build trust in your life.  And when you achieve that with your wife, husband, children, church, business partners, employees, clients and suppliers, then you know you are well on your way to substantial well-being.

    If you are alone and without many true and deep relationships in your life outside of cyberspace, there is a good reason why.  Achieving and building trust inside your organization (company or family) has been written about for years.  Happy employees make happy customers.  You have heard this before, no doubt.  Building awareness of a vulnerability potentially heightens one's sensitivity to defend, or to build resilience, in order to minimize damage or loss.  This is where we started this blog post.

    As we descend on the RSA Conference with the focus on "Trust Decisions", it will be with an ascent towards a continuous mindset of sharing, of caring and of learning.

    20 February 2016

    Predictive Intelligence: Data or Precogs...

    The use of the term "Predictive Intelligence" has been around for a few years in the Operational Risk Management (ORM) community.  Born from the marketing collateral of the Business Intelligence (BI) vendors, it essentially requires hundreds of gigabytes or even terabytes of historical data, which is then analyzed or data-mined for so-called insight.  The question is, why is this "Predictive Intelligence" and not just more "Information" in a different context?

    Now introduce the nexus of our own "Trust Decisions" and the "Human Factors" associated with the science of cognitive decision making.  How do we as humans make our decisions to trust vs. how computers make their decisions to trust?  Are they not executing rules written by humans?  When is it information in a different format as opposed to true intelligence?

    Christian Bonilla may be on to something here:
    "Professionals in the foreign intelligence community take pains to distinguish between information and bona fide intelligence. Any piece of knowledge, no matter how trivial or irrelevant, is information. Intelligence, by contrast, is the subset of information valued for its relevance rather than simply its level of detail. That distinction is often lost in sector of the enterprise technology industry that is somewhat loosely referred to as Business Intelligence, or BI. This has become a bit of a catchall term for many different software applications and platforms that have widely different intended uses. I would argue that many BI tools that aggregate and organize a company’s information, such as transaction history or customer lists, more often provide information than intelligence. The lexicon is what it is, but calling something “intelligence” does not give it any more value. In order to sustainably outperform the competition, a company needs more than a meticulously organized and well-structured view of its history. Decision makers at all levels need a boost when making decisions amidst uncertainty and where many variables are exerting influence. They need what I would call predictive intelligence, or PI – the ability to narrow down the relevant variables for analysis and accurately measure their impact on the probability of a range of outcomes."
    What does the fusion of human factors have to do with predictive intelligence?  That depends on how much you value the kind of innuendo and messages in the Tom Cruise movie, Minority Report.  Many aspects of the original Philip K. Dick story were adapted in its transition to film, which was shot in Washington, DC and Northern Virginia.  Is it possible to predict someone's future behavior even before they commit a crime or even become violent?
    Set in the year 2054, where "Precrime", a specialized police department, apprehends criminals based on foreknowledge provided by three psychics called "precogs".
    Cruise plays the role of John Anderton, who is part of the experimental police force known as "Precrime."  These aspects of clairvoyance and precognition, and their use for predicting future events, have many skeptics; a related term, presentiment, refers to information about future events which is said to be perceived as emotions.
    Regardless of terms, beliefs or whether the software analytics are using historical data, the science of "Predictive Intelligence" is about forecasting the future.  Given the recent global events where forecasts based upon historical data missed the economic implosion, maybe it is time to start introducing more human factors into the equation.

    The interviews with people who have gone on record to predict a future historical event will probably be proven right at some point in time.  How long will you be around to wait?  The demise of the banking sector and the extinction of Lehman Brothers, Bear Stearns and maybe even AIG were most likely predicted by someone, somewhere, in the 2007/2008 time frame.  The point is that you have to have context and relevance to the problem being solved or the question being asked.
    The real story of the crash began in bizarre feeder markets where the sun doesn't shine and the SEC doesn't dare, or bother, to tread: the bond and real estate derivative markets where geeks invent impenetrable securities to profit from the misery of lower- and middle-class Americans who can't pay their debts. The smart people who understood what was or might be happening were paralyzed by hope and fear; in any case, they weren't talking.
    Predictive analytics extracts relevant information from data and attempts to forecast the future.  It relies on capturing relationships between explanatory variables and the predicted variables from past occurrences, and exploiting them to predict future outcomes.  Is it possible that there was, and is, too much reliance on the numbers and not enough on people's cognitive intuition?
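
    As a hypothetical illustration of that definition, the sketch below fits a simple relationship between one explanatory variable and past outcomes, then uses it to project a future value.  The numbers are invented for the example and the model is deliberately naive; the point is only the mechanic of learning from past occurrences in order to forecast forward.

        # Naive predictive-analytics sketch: fit y = a*x + b to past occurrences
        # by ordinary least squares, then project a future outcome. Data invented.

        def fit_line(xs, ys):
            n = len(xs)
            mean_x = sum(xs) / n
            mean_y = sum(ys) / n
            slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
                     / sum((x - mean_x) ** 2 for x in xs))
            intercept = mean_y - slope * mean_x
            return slope, intercept

        if __name__ == "__main__":
            # Explanatory variable (e.g. quarters elapsed) vs. observed losses.
            quarters = [1, 2, 3, 4, 5, 6]
            losses   = [2.0, 2.4, 3.1, 3.5, 4.2, 4.6]
            a, b = fit_line(quarters, losses)
            print(f"Projected loss in quarter 8: {a * 8 + b:.2f}")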

    This blog has documented the "11 Elements of Prediction" in the past.  Now it is time to utilize the combination of these human factors in close collaboration with the data analytics and raw numbers.  Effective execution of both will provide corporate management with the situational awareness they seek, within the timeline they wish.

    The future state of Predictive Intelligence will combine the science of "Trust Decisions" with the art of "Data Analytics" to achieve our desired outcomes.

    14 February 2016

    Workplace Violence: Cues and Clues to Teach...

    Operational Risk Management (ORM) is your foundation for crisis leadership.  It will also prepare the enterprise for the potential for Homegrown Violent Extremism (HVE).  Is there a nexus between the cues and clues of traditional workplace violence and those of domestic terrorism?  A domestic terrorist differs from a homegrown violent extremist in that the former is not inspired by, and does not take direction from, a foreign terrorist group or other foreign power.

    All work locations have distinct categories of threats that are relevant to the site, the people and the type of business.  Assessing these violent factors is the specialty of retired Senior FBI profiler Mary Ellen O'Toole, and there are four threat categories according to a study entitled "The School Shooter: A Threat Assessment Perspective":
    1. A Direct Threat
    2. An Indirect Threat
    3. A Veiled Threat
    4. A Conditional Threat
    Employees must be trained to be aware of the warning signals that typically occur before a threat or violent act becomes operational.  Based on the O'Toole study, these are some of the 23 "Red Flags" that employers should be monitoring and keeping their Corporate Threat Assessment Teams on high alert for:
    • Low tolerance for frustration
    • Poor coping skills
    • Failed relationships
    • Signs of depression
    • Exaggerated sense of entitlement
    • Attitude of superiority
    • Inappropriate humor
    • Seeks to manipulate others
    • Lack of trust/paranoia
    • Access to weapons
    • Abuse of drugs and alcohol
    Source: O'Toole, Mary Ellen, "The School Shooter: A Threat Assessment Perspective," by the Critical Incident Response Group (CIRG), the National Center for the Analysis of Violent Crime (NCAVC) and the FBI Academy.
    The court and the jury will look upon your employer's ability to apply the basics of workplace violence and threat assessment.  What did you know?  When did you know it?  What have you done about it?  They will judge you on the threat assessment's utilization of insider threat intelligence, combined with the evidence of your overt training of employees in the workplace.  What grade would you give your company today for these fundamentals?

    Let's take it to the next step in terms of your ability to even meet the requirements of the Occupational Safety and Health Administration (OSHA) in the United States.  Awareness programs are expected on the four primary types of workplace crimes:
    1. Those crimes committed by people not connected to the workplace.
    2. Aggression by third parties including customers, clients, patients, students, or any others for whom you provide a service or product.
    3. Employee-to-Employee violence or a former employee who returns to the workplace with the intention to injure a former supervisor.
    4. Aggression related to a personal relationship inside or outside the workplace.
    The organization that understands the foundation for creating a proactive and preventive team for incidents in the workplace should not stop there.  Once you have developed the framework for Incident Command, Emergency Operations Center, Shelter in Place, Medical Triage and Evacuation, you have a good baseline to extend to a complete "Continuity of Intelligence Operations" strategy.  This requires a deeper analysis into the threats inside your organization that may put you out of business entirely:
    The ISIS assault on Paris and the ISIS-inspired massacre in San Bernardino, California, share a disturbing fact: no one saw them coming. Today, the biggest terrorist threat to the United States is not like al Qaeda. ISIS is wealthy, agile, sophisticated online, and operates freely in a vast territory of its own. It prefers to be called the Islamic State. The U.S. government calls it ISIL. Reporters tend to call it ISIS for the Islamic State in Iraq and Syria. But whatever the name, it has the manpower, means and ruthlessness to attack the U.S. The man who is supposed to stop that attack is John Brennan, the director of the CIA. And tonight, in a rare interview, we talk to Brennan about a world of trouble and we start with the most pressing danger.
    Once the organization has adopted the "All Threats - All Hazards" intelligence mentality, then it is well on its way to becoming a survivable business.  Operational Risk Management (ORM) is a discipline that incorporates this approach and equips owners, operators and business suppliers with the tools, methods and strategy to handle workplace violence incidents or a catastrophic act of Mother Nature.

    07 February 2016

    Trusted Enterprise: Digital Science in Business...

    Digital Trust has been a cornerstone for any serious organization in our 21st century era.  The foundation for an Operational Risk Management (ORM) design begins with the engineering science of a sound and enduring platform for "Enabling Digital Trust of Global Enterprises."
    The Accenture Technology Vision 2016 identifies "Digital Trust" as one of five major trends:
    As every digital advancement creates a new vector for risk, trust becomes the cornerstone of the digital economy. Without trust, digital businesses cannot use and share the data that underpins their operations. To gain the trust of individuals, ecosystems, and regulators in the digital economy, businesses must possess strong security and ethics at each stage of the customer journey. And new products and services must be ethical- and secure-by-design. Businesses that get this right will enjoy such high levels of trust that their customers will look to them as guides for the digital future.  Source:  Accenture Technology Vision 2016
    The concept of data ethics as a significant component of establishing "Digital Trust" is vital.  When you introduce the concept of ethics to the dialogue on software engineering in the global enterprise, there are several key considerations.  Adding the moral governance of actions taken as a result of insights derived from the analysis of information is also a valid vector in the design of trustworthiness for modern digital applications.  Yet this means nothing without first understanding how humans make their decisions to trust.  How effective the entire ecosystem of "Digital Trust" becomes will always come back to the root.  Digital ground zero.

    Ground zero for "Digital Trust" is the actual "Trust Decision" itself.  The science of the "Trust Decision" elements and process has been the focus of researchers and academic study for years.  In order for us to truly understand how to achieve digital trust in business, we must first grasp the science and evidence of the core elements and root of our "TrustDecisions."  Does "Achieving Digital Trust" in the enterprise ensure that, as a business, you are "Achieving a Defensible Standard of Care"?  Not necessarily.

    The two concepts are not the same, yet they still have an affinity for each other.  Accenture's Technology Vision provides the enterprise with sound reasoning about how to create a path towards improving digital trust, especially as it pertains to the reputation benefits associated with the "Brand."  Adding the element of ethics drives consumer thinking that the business has addressed privacy requirements in terms of the legal rules and usability factors.

    Incorporating the Board Room conversation about data ethics (collection and use), or about how, as an enterprise, you must design in legal controls in order to alleviate liability, requires something new.  It requires all interested parties to go back to the root.  How does the human make a decision to trust?  How does a computer make a decision to trust another computer?
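
    One concrete, everyday answer to the second question is the TLS handshake: the client machine decides to trust the server machine only if the server's certificate chains to a trusted root authority and matches the requested hostname, and it refuses the connection otherwise.  A minimal sketch, using Python's standard library and a placeholder hostname:

        # Minimal sketch of a machine-to-machine trust decision: a TLS client
        # trusts the server only if its certificate chains to a trusted root
        # authority and matches the requested hostname. Hostname is a placeholder.
        import socket
        import ssl

        HOST = "example.com"   # placeholder endpoint for illustration

        context = ssl.create_default_context()   # trusted roots + hostname checking

        try:
            with socket.create_connection((HOST, 443), timeout=10) as raw_sock:
                with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
                    cert = tls_sock.getpeercert()
                    print("Trust decision: ACCEPT", cert.get("subject"))
        except ssl.SSLCertVerificationError as reason:
            print("Trust decision: REJECT", reason)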

    The people sitting around the Board Room table are thinking about creating more wealth.  They are not asking themselves how computers trust other computers.  In our digital age, where decisions are being made as a result of the execution of zeros and ones at light speed, someone has to be designing the trust architecture with the right people in the enterprise.  The question is now at hand: who is that person or business unit?

    The answer is going to be different in each business or organization.  What is the maturity of the particular digital ecosystem and how vast is the landscape of computing assets?  One fact that must be acknowledged early on is that this capability probably does not entirely exist today.  The ideal unit of people and systems necessary to achieve digital trust is currently spread out across the typical silos of a business architecture:  IT, Marketing, Legal, Info Security, Privacy perhaps.  However, the dedicated and funded "Digital Trust" team, task force or department has yet to be established.  So what?

    Continue to operate as you are.  Without the advantage of truly understanding the elements of "Trust Decisions" and how this is relevant to "Achieving a Defensible Standard of Care."  A trustworthy computing division may have existed in the past at your organization, yet initially with another focused mission:  "Cyber Crime" intervention.  You see, the idea of trust and why it is so vital to the success of the information technology industry is not new.  Smart malware researchers and software engineers understood this at the dawn of the Internet.  So why is this any different?

    Trustworthy computing in the '90s is not the same as the application of "Trust Decisions" in the year 2016 and beyond, especially today, with the speed of cloud computing adoption and the outsourcing of core data transactions across borders.  The international implications of privacy laws and the routing and storing of data outside of your native country are now in play.  Negotiations by a Nation State to bypass traditional use of the mutual legal assistance treaty (MLAT) are the new normal:
    If U.S. and British negotiators have their way, MI5, the British domestic security service, could one day go directly to American companies such as Facebook or Google with a wiretap order for the online chats of British suspects in a counterterrorism investigation.

    The transatlantic allies have quietly begun negotiations this month on an agreement that would enable the British government to serve wiretap orders directly on U.S. communication firms for live intercepts in criminal and national security investigations involving its own citizens. Britain would also be able to serve orders to obtain stored data, such as emails.  Source:  Washington Post
    The requirements have changed.  The next era of "Achieving Digital Trust" requires so much more.  It now requires standing up and providing substantial resources to the "TrustDecisions" Unit within the enterprise.  What does this mean to the future of the Trusted Enterprise?

    It means that the Chief Information Officer (CIO), Chief Privacy Officer (CPO), General Counsel and Chief Information Security Officer (CISO) will be using data and Digital Science to design a new architecture for the Trusted Enterprise.  They will deliver it to the desk of the Chief Executive Officer (CEO) very soon.