Private Sector Case Study
In October 2013, AvMed, a health insurer in Florida, settled a class-action lawsuit. The company had reported the theft of two laptops in 2009 that contained the personal information of more than 1.2 million customers. Neither laptop was encrypted.
The allegation was that the company had underspent on information security for years and failed to have adequate policies to ensure security awareness training, new password protocols, upgrades to laptop security systems, and facility security upgrades. These security measures, among others, are legally required by the Health Insurance Portability and Accountability Act (HIPAA) regulations (45 CFR § 164). The HIPAA law was enacted by the United States Congress and signed by President Bill Clinton in 1996.
This is a clear case in which security policies were not a priority for executive management. It was also clear from the lawsuit that the company did not spend sufficiently on the data security program it should have provided to meet legal and regulatory obligations such as those of HIPAA.
This also highlights poor governance and management processes. This risk should have been considered and monitored. It's unclear whether the CISO had brought the risk to the attention of leadership. Regardless, laptop encryption and access to customer data should have been topics that a health insurer's management processes monitored. Both the governance and management processes should have flagged these defects, or at least the lack of attention to these types of risk.
Public Sector Case Study No. 1
The U.S. Department of Education discovered on March 23, 2010, that 3.3 million records were stolen from one of its vendors. The vendor was Educational Credit Management Corporation (ECMC), which processed $11 billion in student loans. The records that were stolen included personal data on individuals who received student loans. This included names, addresses, Social Security numbers, and dates of birth.
The information was downloaded to a removable media device, which was later stolen. The report indicated that the event was a "very clear violation of our company policies." The report did not indicate whether the data was encrypted or what type of device was used. The report indicated that the data could have dated back as far as 15 years.
Although it's speculation, we can assume the data was not encrypted. Typically, when an incident occurs and the stolen data is encrypted, an organization highlights that fact. If data is properly encrypted, there is no data breach: the device can be stolen, but the data on it cannot be accessed. That doesn't appear to be the case here because the report assumes the data was compromised.
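The point that an encrypted device yields no usable data can be sketched with a toy example. The snippet below uses a one-time pad (XOR with a random key) purely as an illustration; it is not a production cipher, and the record contents are invented for the demonstration. The idea is that whoever steals the device sees only ciphertext; without the separately held key, the record cannot be recovered.

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """One-time pad (XOR): a toy illustration of encryption at rest,
    not a production cipher. Encrypting twice with the same key decrypts."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

record = b"SSN: 123-45-6789"                 # illustrative customer record
key = secrets.token_bytes(len(record))       # key stored separately from the device

ciphertext = xor_pad(record, key)            # what a thief finds on the stolen media
assert ciphertext != record                  # data at rest is unreadable
assert xor_pad(ciphertext, key) == record    # only the key holder recovers the data
```

In practice, full-disk or device-level encryption (with keys held apart from the device) achieves the same effect: theft of the hardware does not become a data breach.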
This leaves the issue of enforcement. It appears the enforcement of the policy was manual. The report did not indicate the type of removable media. Portable media devices can be locked down or encrypted. That would be an example of an automated control. That type of control was apparently not in place in this case.
Public Sector Case Study No. 2
In July 2013, a United States Department of Energy system suffered a data breach. It resulted in unauthorized access to more than 104,000 individual personal records. The records included Social Security numbers, birth dates and locations, bank account numbers, and security questions and answers.
According to the Department of Energy Inspector General's report, there were "early warning signs" that personnel-related information systems were at risk. Yet no actions were taken to improve security. The Inspector General's report went on to identify "a number of technical and management issues" that were root causes. Additionally, the report cited "numerous contributing factors related to inadequate management processes."
While the details of the breach were not made public, it's clear from the language and tone of the report that a lack of management process led to the security breach. Given that this agency is governed by the NIST standards, adequate security policies are most likely not the issue. That suggests the problem was lack of governance or management process to enforce the NIST standards.
It is interesting that the report indicates there were "early warning signs" of the security risks. In the absence of specifics, here are a couple of possibilities, among many, to consider:
First, the management processes may have failed to identify the security control weaknesses. This could happen if the breach was the result of a management process that does not adequately rate risks. For example, patch management processes must identify critical patches for issues that could lead to a breach. If this process fails to properly risk-rank a patch, then applying the patch could be significantly delayed, needlessly exposing the systems to the risk. Regardless, management processes must enforce both ranking and installing patches to prevent data breaches.
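A patch risk-ranking process of the kind described above can be sketched in a few lines. The severity scores, patch names, and remediation deadlines below are illustrative assumptions, not drawn from the Inspector General's report; the point is only that ranking drives how quickly a patch must be applied.

```python
from dataclasses import dataclass

@dataclass
class Patch:
    name: str
    cvss: float           # severity score, 0.0-10.0 (hypothetical values)
    internet_facing: bool  # does the patched system face the internet?

def deadline_days(p: Patch) -> int:
    """Map a patch's risk to a remediation deadline (illustrative SLA)."""
    if p.cvss >= 9.0 or (p.cvss >= 7.0 and p.internet_facing):
        return 7    # critical: apply within a week
    if p.cvss >= 7.0:
        return 30   # high: apply within a month
    return 90       # lower risk: apply within a quarter

queue = [
    Patch("office-suite macro fix", 7.5, False),
    Patch("web-server remote-code fix", 9.8, True),
    Patch("driver stability update", 4.3, False),
]

# Rank the queue so the most urgent patches come first.
for p in sorted(queue, key=deadline_days):
    print(f"{p.name}: apply within {deadline_days(p)} days")
```

A process that mis-scores the web-server fix as low risk would push its deadline from days to months, which is exactly the kind of delay that needlessly exposes systems.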
A second possibility to consider is a lack of effective governance and management oversight. Had the risk been properly identified and raised with management, the lack of action could indicate that management was not enforcing policies. Given that lack of management action was cited as a cause, lack of policy enforcement as required by NIST standards might be a root cause.