28 May 2016

Memorial Day 2016: The Risk of Service is Understood...

It is Memorial Day weekend in the U.S., and on this final Monday of May 2016 we pause to reflect and remember.

In order to put it all in context, we looked back 36 months to our 2013 blog post here.  It was only a few weeks since a colleague from Team Rubicon had ended his battle at home, after several tours of duty with AFSOC.  Neil had joined the ranks of those fallen heroes who survived deployments tagging and tracking the enemy in the Hindu Kush.  He was also one of the estimated 22 veterans lost that day in early May, unable to defeat the legacy of demons he fought each night as he fell deep asleep.

On Memorial Day 2016, we again honor Neil in Section 60 at Arlington National Cemetery, along with all the other military members who have sacrificed and defended our freedoms for 240 years.  Simultaneously, we honor the people behind the "Stars" on a wall in Langley, VA, officers who have made the same sacrifice.

Together we are on the front lines or inside the wire at the FOB.  Whether you are in Tampa, FL, Stuttgart, Germany or Arlington, VA.  Whether you are on your beat cruising the streets of a major U.S. metro city.  Whether you are watching a monitor at IAD, LAX or DFW.  Whether you are deep in analysis of Internet malware metadata or reviewing the latest GEOINT from a UAS.  We are all the same, in that we share the mission that gets each one of us out of bed each day: our country's "Operational Risk Management (ORM)."

The Operational Risk Management mission of the U.S. Homeland is vast and encompasses a spectrum of activity, both passive and kinetic, digital and physical.  It requires manpower and resources far beyond the capital that many developed countries of the world can, to this day, comprehend.  There are only a few places across the globe where a normal citizen would say that the mission and the capital expenditures are worth every dollar and every drop of blood.

Memorial Day in the United States is exactly this:
Memorial Day is a United States federal holiday which occurs every year on the final Monday of May.[1] Memorial Day is a day of remembering the men and women who died while serving in the United States Armed Forces.[2] Formerly known as Decoration Day, it originated after the American Civil War to commemorate the Union and Confederate soldiers who died in the Civil War. By the 20th century, Memorial Day had been extended to honor all Americans who have died while in the military service[3].
So this weekend, as we walk among the headstones and reflect on our colleagues who gave their service and their own lives, we will stand proud.  We understand the risks.  We know why we serve.  In the spotlight or in the shadows.  The tradition and the mission continue...

21 May 2016

Social Engineering: CxO Leadership for BEC...

In the context of cyber security, many practitioners are already familiar with the "Business E-Mail Compromise" (BEC).  Operational Risk Management (ORM) professionals know this:
"Amateurs attack machines, Professionals attack people"

The BEC is a global scam with subjects and victims in many countries. The IC3 has received BEC complaint data from victims in every U.S. state and in 45 countries. From 10/01/2013 to 12/01/2014, the following statistics are reported: 

  • Total U.S. victims: 1198
  • Total U.S. dollar loss: $179,755,367.08
  • Total non-U.S. victims: 928
  • Total non-U.S. dollar loss: $35,217,136.22
  • Combined victims: 2126
  • Combined dollar loss: $214,972,503.30
The FBI assesses with high confidence the number of victims and the total dollar loss will continue to increase.
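
For readers who want to check the IC3 arithmetic themselves, a minimal sketch in Python; the figures are exactly those quoted above:

```python
# Quick consistency check of the IC3 figures quoted above.
us_victims, non_us_victims = 1198, 928
us_loss, non_us_loss = 179_755_367.08, 35_217_136.22

assert us_victims + non_us_victims == 2126
# Compare in cents to avoid floating-point drift on dollar amounts.
assert round((us_loss + non_us_loss) * 100) == round(214_972_503.30 * 100)
print("Combined victim and dollar-loss totals match the component figures.")
```
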
What executives at most organizations understand is that they are a potential target for all kinds of threats from inside and outside the company.  Fortune 500 companies already have sophisticated internal accounting controls and "Personal Protection Specialists" doing advance work for any travel the CxO takes across town or overseas.  Yet what about the Small-to-Medium Enterprise with just tens of millions of dollars in annual revenues?  Are they as prepared as they could be for the BEC?

It does not take much for financial controls and the accounts payable process to break down in companies and organizations that have not prepared for this continuous threat, one that relies on the unwitting cooperation of your own insiders (employees, partners, suppliers).  The numbers tell the whole story.  Countless times each year, companies are convinced by a simple e-mail, crafted by clever "Social Engineering" experts, to transfer money out of their corporate banking accounts.

So what are you doing to prepare for, educate against and deter this continuous wave of "Social Engineering" attacks on your employees and key stakeholders?  How many computers and iPhones in your business or organization receive e-mail on a daily basis?  Each one of these is a threat vector, along with each one of your employees, the human factor behind the device.

What is amazing today is that a cyber threat like this, one that has been talked about for over a year, is still growing.  Perhaps it is a leadership problem.  Perhaps it is a public safety announcement campaign problem.  In either case, you have to realize that there are some very specific remedies your organization can exercise to deter, detect and defend against "Business E-mail Compromise" (BEC).

Executives and senior staff are busy.  They are running the business and rarely have time for that two-hour or half-day training session.  This is your organization's largest vulnerability to begin with.  An apathetic CEO or senior staff is the perfect target for any transnational organized crime (TOC) syndicate on the other side of the globe.

As a CxO, when was the last time you ran a campaign within the organization to address these threats?  Weeks, months or years ago?  Why haven't you incorporated a continuous program to keep your employees and staff up to date?  If you have 1,247 employees, then you have 1,247 vulnerabilities walking around in your enterprise.

When you look at the line item in the Information Technology budget this year for hardware, software, maintenance and cloud computing, look a little further.  Where is the line item for the education program and the tactical awareness that keep your people on the leading edge of deterring the wave of social engineering attacks on your organization?
There has been a lot of news in 2016 about a particular species of phish, the so-called Business Email Compromise (BEC). In this scenario, the attacker poses as an executive of a company, asking someone--usually a subordinate employee--to perform a wire transfer or similar action. When the employee complies and completes the transfer, the company realizes--too late--that it has just given a large payment to a criminal. An investment company in Troy, Michigan, recently lost $495,000 from a BEC phish, so this is not a small matter.

It even hit close to my (professional) home: DomainTools’ CFO recently received a spear phish purporting to come from our CEO, asking her to make a wire transfer of funds. The sending email address was a clever look-alike of “domaintools.com,” using some substituted characters. Fortunately our CFO is very savvy and knew right away that her boss wouldn’t actually make such a request in that way. But it underscores how common this kind of BEC phish is -- and how easy it is for criminals to spoof legitimate emails.
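
One technical control that helps with this kind of spoofing is screening inbound sender domains against the legitimate domain and flagging near-misses.  Below is a minimal sketch in Python, not DomainTools' actual method; the substitution map, similarity threshold and example domains are illustrative assumptions:

```python
from difflib import SequenceMatcher

LEGITIMATE_DOMAIN = "domaintools.com"  # the real domain being imitated (illustrative)

# Illustrative substitutions attackers commonly use; a production list would be far larger.
SUBSTITUTIONS = {"0": "o", "1": "l", "3": "e", "5": "s", "rn": "m"}

def normalize(domain: str) -> str:
    """Undo common character substitutions so look-alikes collapse onto the real name."""
    domain = domain.lower()
    for fake, real in SUBSTITUTIONS.items():
        domain = domain.replace(fake, real)
    return domain

def looks_suspicious(sender_domain: str, threshold: float = 0.85) -> bool:
    """Flag a sender domain that is not the real one but is nearly identical to it."""
    if sender_domain.lower() == LEGITIMATE_DOMAIN:
        return False
    similarity = SequenceMatcher(None, normalize(sender_domain), LEGITIMATE_DOMAIN).ratio()
    return similarity >= threshold

print(looks_suspicious("domaintoo1s.com"))   # True: '1' substituted for 'l'
print(looks_suspicious("dornaintools.com"))  # True: 'rn' imitating 'm'
print(looks_suspicious("example.org"))       # False: not similar at all
```

A production control would also check punycode and IDN homoglyphs, newly registered domains and display-name mismatches, but the principle is the same: near-misses on your own domain deserve extra scrutiny before anyone acts on a payment request.
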
This is just a small example of a continuous trend across the small-to-medium enterprise landscape.  You have the control and the ability to make a difference in your enterprise.  The time and the services exist for you to keep your organization safer and more secure than it is today.  When will you decide it is your "Duty of Care" to protect corporate assets and to start using some of the tools to make "Business E-mail Compromise" (BEC) extinct?

15 May 2016

Know Your Customer: ISP Future Horizon...

The American public is changing its behavior as a result of the privacy and security failures across the private sector business policy landscape.  As the latest NTIA survey data reveals again, online commerce is being impacted, and government agencies are now trying to communicate that there is a growing problem:

Lack of Trust in Internet Privacy and Security May Deter Economic and Other Online Activities
May 13, 2016 by Rafi Goldberg, Policy Analyst, Office of Policy Analysis and Development

Every day, billions of people around the world use the Internet to share ideas, conduct financial transactions, and keep in touch with family, friends, and colleagues. Users send and store personal medical data, business communications, and even intimate conversations over this global network. But for the Internet to grow and thrive, users must continue to trust that their personal information will be secure and their privacy protected.

NTIA’s analysis of recent data shows that Americans are increasingly concerned about online security and privacy at a time when data breaches, cybersecurity incidents, and controversies over the privacy of online services have become more prominent. These concerns are prompting some Americans to limit their online activity, according to data collected for NTIA [1] in July 2015 by the U.S. Census Bureau. This survey included several privacy and security questions, which were asked of more than 41,000 households that reported having at least one Internet user.

Perhaps the most direct threat to maintaining consumer trust is negative personal experience. Nineteen percent of Internet-using households—representing nearly 19 million households—reported that they had been affected by an online security breach, identity theft, or similar malicious activity during the 12 months prior to the July 2015 survey. Security breaches appear to be more common among the most intensive Internet-using households.

This survey is indeed only one facet of a much larger topic and pervasive problem.  Digital Trust is the output of making affirmative "Trust Decisions" with computing devices.  Whether those decisions are machine-to-machine, person-to-machine, or machine-to-person, they require several technology engineering elements and business rules that are understood and agreed upon.  The question is: by whom?

Consumers who use the Internet for communications and commerce and become the victims of identity theft, stolen funds or other fraudulent schemes are just the first wave of targets for transnational organized crime (TOC).  We have known this since the invention of virus scanners and bug bounty programs in the early days of the 21st century.

Yet fifteen-plus years later, the government is doing a study on consumers' feelings about privacy and security.  As a business or a consumer, we understand that the speed of commerce and technology is always far ahead of the regulations and the laws.  When enough people or businesses seem to be harmed, the momentum begins for policy shifts, and new laws are sometimes enacted after thousands of pages of semantic negotiation.

The answers and the outcomes we seek will come.  However, they will not come first from politicians and lawyers.  They will come mostly from our brilliant mathematicians, software engineers and data scientists.  At this point in time, we are getting so much closer to achieving digital trust through new innovations and inventions.  Just look at IBM Watson.

It is now time for business and commerce to begin the process of finding the truth.  Why do we continue to allow known bad actors to operate inside and across our networks at the levels we do?  It is a numbers game, and it is because the criminals also employ the smartest social engineers and data scientists.

Digital Trust in the next fifteen years will mean something different than it does today.  Along the journey we will have found the formula, the new equations and the rules, agreed upon by all, to make online and digital commerce safer and more secure.  So what will we do today and tomorrow, until the engineers and scientists save the day?

At this point in time, it is simply called "Know-Your-Customer" (KYC).  If this were utilized more effectively across critical infrastructure sectors beyond finance in our digital economy, then we would be making some progress.  Which sector are we talking about next?

The FTC and FCC are well on their path to defining those critical elements of improving the trust that consumers have in using their digital tools with ICT and on service providers' web sites.  Yet even to this day, you can still find the criminals using and leveraging our own Internet Service Providers (ISPs) to launch their attacks and perpetuate their fraudulent schemes.  How will this ever be deterred?  Could a version of KYC work with the ISPs?
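
What might an ISP-level KYC check look like in practice?  A minimal, purely hypothetical sketch follows; the signal names, weights and escalation threshold are invented for illustration and do not describe any provider's actual system:

```python
from dataclasses import dataclass

@dataclass
class CustomerActivity:
    """Hypothetical per-customer signals an ISP might already collect."""
    abuse_reports_30d: int          # spam/phishing complaints attributed to the account
    newly_registered_domains: int   # look-alike or disposable domains hosted for the customer
    outbound_smtp_spike: bool       # sudden jump in outbound mail volume

def kyc_risk_score(activity: CustomerActivity) -> int:
    """Toy additive risk score; a real system would use behavioral models, not fixed weights."""
    score = 0
    score += min(activity.abuse_reports_30d, 10) * 5
    score += min(activity.newly_registered_domains, 20) * 2
    score += 25 if activity.outbound_smtp_spike else 0
    return score

suspect = CustomerActivity(abuse_reports_30d=7, newly_registered_domains=15, outbound_smtp_spike=True)
if kyc_risk_score(suspect) >= 50:  # illustrative escalation threshold
    print("Escalate for enhanced due diligence before continuing service.")
```

Real deployments would replace the fixed weights with behavioral models trained on abuse telemetry, which is exactly the "real-time behavioral predictive analytics" discussed below.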

Even with a global banking system in place, you have pockets of greed and deceit: rogue nations or territories that have become the go-to locations for transnational organized crime syndicates to flourish.  Yet we can do much better than we are today.

Just ask any "BlackHat" hacker from Eastern Europe who they prefer to do business with.  Query the experts on the dark side and you will find the ISPs they favor.  One day the regulators will realize this is where the business of e-crime presents an opportunity for change and additional reform.  It will be about more than just opening an account to gain access to the Internet.  It will be about scaling up our systems to a future horizon with new rules and robust real-time behavioral predictive analytics.  In the meantime:
May 11, 2016 
In testimony before Congress today, the Federal Trade Commission outlined its work over the past 40 years to protect consumers’ privacy at a hearing convened to examine privacy rules proposed by the Federal Communications Commission.

Chairwoman Edith Ramirez and Commissioner Maureen Ohlhausen testified on behalf of the Commission. The testimony before the Senate Judiciary Committee’s Subcommittee on Privacy, Technology and the Law provided background on FTC law enforcement efforts, policy work and consumer and business education programs related to protecting consumers’ privacy.

The testimony highlighted the FTC’s extensive history of privacy-related work. The testimony noted that the agency has brought more than 500 privacy-related enforcement cases in its history against online and offline companies of varying sizes, including companies across the internet ecosystem. In addition, the testimony highlighted a number of recent cases of note.

The testimony also provided information on the FTC’s policy work in the privacy area, going back to its first internet privacy workshop in 1996. The testimony noted that recent policy work has been based on principles featured in the FTC’s 2012 privacy report, and also highlighted workshops and reports related to the Internet of Things, big data, and other issues, including cross-device tracking.

The testimony also described the FTC’s extensive consumer and business education efforts related to privacy, including the FTC’s Start With Security campaign for businesses, and the newly-updated IdentityTheft.gov.

07 May 2016

The Third Offset: Seeking the Speed of Trustworthiness...

The U.S. national security "Insider Threat Score" is on its way in the aftermath of the Office of Personnel Management (OPM) hack.  The National Background Investigations Bureau (NBIB) is now standing up operations under the Pentagon umbrella.  Operational Risk Management (ORM) professionals are tracking this closely for good reason.  Social media activities, such as this one, could one day be a factor in that score.

Simultaneously, the NIST Special Publication 800-160 2nd Draft has been released.  This document, entitled Systems Security Engineering: Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems, addresses a key component in the national security mosaic.

So if the goal of creating the "Insider Threat Score" is to help automate and maintain the process for better understanding trustworthiness, then the NIST publication should be at the center of the table at the National Background Investigations Bureau.  Why?  Consider the definitions in Appendix B of the SP 800-160 Second Draft:

Trustworthiness: An attribute associated with an entity that reflects confidence that the entity will meet its requirements.

Note: Trustworthiness, from the security perspective, reflects confidence that an entity will meet its security requirements while subjected to disruptions, human errors, and purposeful attacks that may occur in the environments of operation.

Trust: A belief that an entity will behave in a predictable manner in specified circumstances.

The degree to which the user of a system component depends upon the trustworthiness of another component.

Note 1: The entity may be a person, process, object, or any combination thereof and can be of any size from a single hardware component or software module, to a piece of equipment identified by make and model, to a site or location, to an organization, to a nation-state.

Note 2: Trust, from the security perspective, is the belief that a security-relevant entity will behave in a predictable manner while enforcing security policy. Trust is also the degree to which a user or a component depends on the trustworthiness of another component (e.g., component A trusts component B, or component B is trusted by component A).

Note 3: Trust is typically expressed as a range (e.g., levels or degrees) that reflects the measure of trustworthiness associated with the entity.
The automation of the clearance process, the continuous monitoring of "Insider Threat Scores" and the trustworthy secure systems software engineering required to accomplish both remain mission critical.  The "Cleared Community" of private sector "Defense Industrial Base" (DIB) contractors will also be impacted by the convergence of the two.
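
To make the "levels or degrees" language above concrete, here is a minimal, purely hypothetical sketch of mapping continuous-evaluation signals to a trust level; the signal names, weights and thresholds are invented for illustration and are not NBIB's or anyone's actual scoring model:

```python
# Hypothetical mapping of continuous-evaluation signals to a trust level,
# illustrating "trust expressed as a range (e.g., levels or degrees)".
SIGNAL_WEIGHTS = {
    "unreported_foreign_contact": 30,  # per incident
    "financial_distress_flag": 20,     # per flag
    "policy_violation": 10,            # per violation
    "anomalous_access_event": 5,       # per event
}

def insider_threat_score(signals: dict) -> int:
    """Sum weighted signal counts into a single score; higher means less trustworthy."""
    return sum(SIGNAL_WEIGHTS.get(name, 0) * count for name, count in signals.items())

def trust_level(score: int) -> str:
    """Map the numeric score onto discrete degrees of trust."""
    if score < 20:
        return "trusted: continue routine monitoring"
    if score < 60:
        return "elevated: route to analyst review"
    return "high risk: suspend access pending adjudication"

observed = {"policy_violation": 2, "anomalous_access_event": 4}
print(trust_level(insider_threat_score(observed)))  # elevated: route to analyst review
```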

So who are the personnel who could be impacted by these two converging initiatives:
  • Individuals with systems engineering, architecture, design, development, and integration responsibilities; 
  • Individuals with software engineering, architecture, design, development, integration, and software maintenance responsibilities; 
  • Individuals with security governance, risk management, and oversight responsibilities;
  • Individuals with independent security verification, validation, testing, evaluation, auditing, assessment, inspection, and monitoring responsibilities;
  • Individuals with system security administration, operations, maintenance, sustainment, logistics, and support responsibilities;
  • Individuals with acquisition, budgeting, and project management responsibilities;
  • Providers of technology products, systems, or services; and
  • Academic institutions offering systems security engineering and related programs.
As the government moves towards more trustworthy secure computing systems, the private sector will be there to assist.  Yet the future of our trusted environments will depend on how often we perform and how well we perform without error.

Software is continuously changing, and the fear of changing it too often has been one of our greatest downfalls.  That fear of change has created our largest exposures to continued exploits and attacks by our most sophisticated adversaries.  Remember, Edward Snowden worked for a private sector contractor.

There are a few trustworthy organizations that have realized this fact and are now on an accelerating path toward a higher level of trust, with both their software systems and their people.  However, they did this with a leap of faith and the understanding that the speed at which they could reach more trusted computing environments was absolutely vital.

Look around the Nation's Capital Beltway and you will find a few examples of the ideal innovation architecture strategy that will propel us into that next level of trustworthiness.  An affirmative decision to trust is now before us, and the time we take to make that trust decision is our greatest challenge.  Will it be hours, minutes, seconds or nanoseconds?  Marcel Lettre, Undersecretary of Defense for Intelligence, has this perspective:
"The intelligence community’s role in what Pentagon planners call “the third offset”—the search for continuing technological advantage over enemies—will feature robotics, artificial intelligence, machine learning and miniaturization. They will be applied in the areas of “pressing for global coverage capabilities, anti-access/area denial, counterterrorism and counter-proliferation, cybersecurity and countering insider threats,” Lettre said.

He said Defense is reaching out to obtain the expertise of its industrial partners, including Silicon Valley, while workforce planners are focused on “bringing in another generation skilled at innovating in the technology sector.”

01 May 2016

Innovation: Truth in Data Provenance...

For years, mathematicians and computer scientists have written about the trustworthiness of data provenance.  Relying on the integrity of data collection, transport and, of course, the source of data is a real science.  Our modern-day zeros and ones span all aspects of our lives, and Operational Risk Management (ORM) professionals were grappling with the questions surrounding trust and the process of decision making long before the invention of computing machines.

At the root of decision making with integrity, the source of the data is questioned: What are the reliability and history of previous data from that source?  As the data was transported from Point A to Point B, was there any possibility that it was altered, modified or corrupted?  Couriers and "Hawala"-type systems have been used by traders and terrorists for hundreds of years.
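
In modern systems, the "was it altered between Point A and Point B" question is typically answered with a keyed message authentication code.  A minimal sketch in Python, assuming a pre-shared key between sender and receiver; the key and messages below are placeholders:

```python
import hashlib
import hmac

SHARED_KEY = b"pre-shared key known only to sender and receiver"  # placeholder value

def tag(message: bytes) -> str:
    """Sender computes an HMAC tag that travels alongside the data."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    """Receiver recomputes the tag; any alteration in transit breaks the match."""
    return hmac.compare_digest(tag(message), received_tag)

original = b"wire $10,000 to account 12345"
sent_tag = tag(original)
print(verify(original, sent_tag))                           # True: arrived intact
print(verify(b"wire $100,000 to account 99999", sent_tag))  # False: altered in transit
```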

"Truth in Data Provenance" is the question mark that enables our trust decisions.  This is why modern day cryptography is at the center of so many arguments and debates, when it comes to the topic of trusted information.  Yet hundreds of years ago, long before telecom and ICT was invented, the trustworthiness of data provenance was a vital factor.  The use of transposition ciphers were in use by the ancient Greeks.

So what?  In 2016 what does the truth in data provenance have to do with our business commerce, our transportation, our banking, even our abilities as governments to maintain our defense against attack?

The topic is vast and deep and worth exploration at the top level of human decision-making.  Yes, it is vital that our computing machines have high-assurance data integrity in order for our global systems to operate day-to-day.  Yet what impact does trusted information have on humans in an environment of work and daily collaboration?  How does truth in data provenance affect our decision making and the environments we work in?

In a recent report by LRN, the subject of trust in the work environment as a motivator has become more apparent:
Another fascinating result of the study had to do with two squishy-sounding characteristics of a company: character and trust. Companies deemed by employees to have both strong character and inspired trust performed almost four times better, using the metrics mentioned earlier, than those that had other positive cultural attributes, such as collaboration and celebrating others. (This applied to all three types of companies, though, naturally, culture and trust were much more prevalent in the self-governing ones.) What’s more, “high trust” organizations were 11 times as likely to be called more innovative than their competitors. Trust, the How Report suggests, is more important than virtually any other characteristic.
How organizations address the trustworthiness of data provenance is still a new frontier in this day and age.  The use of new sensors, the sophisticated analysis of "Big Data" by computer algorithms and the pace at which new data is generated by the "Internet of Things" (IoT) make this a significant area of focus for our current executives and enlightened organizational leadership.

Why?
But what does that really mean? How does one measure the absence or presence of something as abstract as trust? The How survey defines it as “a catalyst that enhances performance, binds people together, and shapes the way people relate to each other.” High trust groups encourage risk-taking, which in turn is what is necessary for true innovation to occur. When innovation fails, says Seidman, it’s because companies don’t put enough faith in employees to let them take risks. The industries with the highest amount of trust were “computers/electronics,” followed by “software/Internet.” Coming in last? Government.
At the most fundamental level, the culture you are operating in has everything to do with the trust that exists or is absent.  It has everything to do with the trustworthiness of data provenance.  Leadership in any organization must see the connection between trust and innovation, and between innovation and risk-taking.  Your future and your culture depend on it.