PRIVACY AND DATA SECURITY UNDER CLOUD COMPUTING:
THE LEGAL FRAMEWORK AND PRACTICAL DOS AND DON’TS

By Joel Buckman and Stephanie Gold

This article outlines privacy and data security compliance issues facing postsecondary education institutions when they utilize cloud computing and concludes with a practical list of do’s and don’ts. Cloud computing does not change an institution’s privacy and data security obligations. It does involve reliance on a third party, which requires an institution to implement practical and legal protections to facilitate compliance with such obligations.
“Cloud computing” is a catch-phrase for accessing IT resources such as software, application development, and
infrastructure over the Internet. The cloud promises easy,
on-demand access to powerful technology at less cost than
homegrown IT systems. The former U.S. chief information
officer likened it to the running water of the information
age (Kundra 2010). But moving to the cloud is essentially
outsourcing. And as in any outsourcing arrangement,
cloud computing carries a range of business and legal risks
(see, e.g., Porter and Larner 2011). This article focuses on
just one: privacy and data security compliance.
The bottom line is that moving to the cloud in no way
alters an institution’s privacy and data security obligations, but it does force an institution to rely on the cloud
provider for compliance. Because U.S. privacy and data
security law is a patchwork, the first step is to identify the
institution’s obligations with regard to the information
moving to the cloud. Institutions then should attempt—
and in some cases will be required by law—to obtain sufficient contractual guarantees that the cloud provider
will comply with any such requirements. However, cloud
providers may be reluctant to provide such guarantees or
may do so only at a price, perhaps undermining some of
the cloud’s benefits. Particularly in those cases, whether to
move to the cloud comes down to a cost-benefit analysis.
Developing a process-based approach will help institutions make good decisions.
The first part of this article explains the basics of cloud
computing and U.S. privacy and data security law; the second part focuses on cross-cutting cloud computing privacy and data security risks and provides a more in-depth
analysis of the Family Educational Rights and Privacy Act
(FERPA); the final section provides a list of cloud computing do’s and don’ts. The chart at the end of the article summarizes the cloud implications of privacy and data security
laws commonly applicable to colleges and universities.
THE BASICS
Cloud Computing
Cloud computing is an evolving concept, and definitions
abound (Katz, Goldstein and Yanosky 2009). A straightforward if over-simplified definition is the “delivery of
scalable IT resources over the Internet, as opposed to hosting and operating those resources locally” (EDUCAUSE
2009). Typically, cloud computing is discussed in terms
of three service and four deployment models. Institutions
should understand the basics of these models because they
can affect the level of control the institution will retain
over privacy and data security (Jansen and Grance 2011).
The three service models are Software as a Service
(SaaS), Platform as a Service (PaaS), and Infrastructure
as a Service (IaaS). SaaS involves the use of prefabricated
software and applications over the Internet (think Internet-based e-mail like Yahoo!); PaaS involves an Internet
platform from which the customer develops and deploys
software and applications (think Microsoft Azure); IaaS
involves more barebones IT structures delivered over the
Internet (think servers, network equipment, CPUs). (An
analogy to home building might help clarify: Under the SaaS model, the provider offers a fully furnished, prefabricated house; under the PaaS model, the provider offers a completely built house, but the end user is free to furnish it and otherwise outfit it as he sees fit; under the IaaS model, the provider offers just the raw materials for the house, and the end user can design it from the ground up.) Generally, SaaS offers the institution the least control
over security settings, IaaS offers institutions the most
control, and PaaS falls somewhere in between.
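One way to see what is at stake in choosing among these service models is to lay out which layers of the stack the institution, rather than the provider, continues to control. The sketch below is illustrative only: the layer assignments follow the general pattern described above, and the exact split varies by provider and contract.

```python
# Illustrative control matrix for the three NIST service models.
# The "provider"/"customer" assignments mirror the rough ordering in the
# text (SaaS = least institutional control, IaaS = most); real offerings
# differ, so treat this as a diligence starting point, not vendor terms.
from __future__ import annotations

CONTROL_MATRIX = {
    "SaaS": {"application": "provider", "platform": "provider", "infrastructure": "provider"},
    "PaaS": {"application": "customer", "platform": "provider", "infrastructure": "provider"},
    "IaaS": {"application": "customer", "platform": "customer", "infrastructure": "provider"},
}

def customer_controlled_layers(model: str) -> list[str]:
    """Layers whose security settings the institution still configures itself."""
    return [layer for layer, owner in CONTROL_MATRIX[model].items() if owner == "customer"]

for model in ("SaaS", "PaaS", "IaaS"):
    layers = customer_controlled_layers(model)
    print(f"{model}: institution controls {layers or 'almost nothing directly'}")
```

Whatever falls on the provider’s side of this line is precisely what the institution must instead secure through diligence and contract.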
The service models can be deployed over a public
cloud, a community cloud, a private cloud, or a hybrid
cloud. Most relevant to higher education institutions are
public clouds (open to all), private clouds (limited to,
and in some cases built for, a particular user), and community clouds (limited to, and in some cases built for, a
group of users with common business and compliance
needs). The fourth deployment model, a “hybrid cloud,”
is a “composition of two or more clouds (private, community, or public)” that remain unique entities but are
bound by “standardized or proprietary technology that
enables application and data portability among them”
(Jansen and Grance 2011). Public clouds offer institutions
the least control over privacy and security settings; hybrid
clouds are tailored to an institution’s needs and may offer
more control; and private clouds offer the most control.
Notably, a number of major cloud providers have rolled out cloud solutions geared toward higher education (see, e.g., Google Apps for Education and other providers’ education-focused cloud offerings).
Clouds offer on-demand, scalable, powerful, pay-as-you-go (sometimes free) IT resources. For example, Internet-based e-mail services offer megabytes of storage space that can be accessed via any Internet connection for free at the click of a mouse. At the institutional level, clouds allow institutions access to IT without a large up-front capital investment or the requirement to lock in long-term fixed costs. Institutions no longer need to predict usage requirements or to host and maintain software on campus servers and computers (EDUCAUSE 2009). But with these benefits come risks: Cloud providers store immense amounts of valuable data and may become targets for hackers (Jansen and Grance 2011); providing resources over the Internet requires more administrative and technical layers and, thus, more access points to private data (Jansen and Grance 2011); and cloud providers are able to provide services cheaply in some cases by aggregating and mining data.
U.S. Privacy and Data Security Law
U.S. privacy and data security law is in fact a patchwork
of sector-specific federal laws, diverse state laws with numerous jurisdictional hooks, and various self-imposed
requirements (typically by contract). To the extent that
such laws apply “on the ground,” they also apply “in the
cloud.” Further, because many colleges and universities are
engaged in a wide range of activities, they are subject to
many sector-specific privacy and data security laws. Such
laws include:
• the Family Educational Rights and Privacy Act (FERPA), which applies to certain education institutions and protects education records (20 U.S.C. § 1232g; 34 C.F.R. part 99);
• the Gramm-Leach-Bliley Act (GLBA), which applies
to financial institutions and protects certain nonpublic personal information (15 U.S.C. §§ 6801–09;
16 C.F.R. part 313);
• the Red Flags Rule, which applies to debit and credit
card issuers, users of consumer reports, and financial
institutions and creditors holding covered accounts
and requires identity theft prevention measures
(15 U.S.C. § 1681m(e); 16 C.F.R. part 681);
• the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information
Technology for Economic and Clinical Health Act
(HITECH), which apply to, among others, certain
healthcare providers and protects certain health information (42 USC § 1320d et seq.; 45 C.F.R. parts
160, 162, 164);
• state data breach notification laws: 46 states, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands have enacted some form of legislation requiring certain entities—e.g., those doing business in the state or storing one of its residents’ data—to give certain notice to consumers affected by a data breach (see the National Conference of State Legislatures’ compilation for a listing of such laws);
• state data security laws: several states have enacted
more general data security laws, typically applying
to entities doing business in the state or owning or
licensing data about state residents (e.g., 201 Code
Mass. Regs. §§ 17.01–17.05); and
• state privacy laws: certain states have enacted either
general privacy laws or laws applicable to certain
types of information, such as health records, mental
health records, or education records (see, e.g., Electronic Privacy Information Center, State Privacy
Laws).
See the chart at the end of this article, which explains
the general applicability, basic protections, and cloud
computing implications of such laws.
These laws typically provide some mix of protections,
including data security requirements and/or rights of privacy, access, or correction. In certain cases—for example,
HIPAA—laws specifically require that an institution bind
service providers to follow the laws’ mandates.
In addition to these laws and regulations, institutions
may have contractual obligations to students or others
that specify privacy or data security standards. For example, the payment card industry requires merchants to
comply with the Payment Card Industry Data Security
Standard (PCI/DSS). PCI/DSS requires merchants to implement prevention, detection, and appropriate reaction
to security incidents. Student handbooks and IT terms of
use and privacy policies are other potential sources of self-imposed requirements.
Many colleges’ and universities’ activities are sufficiently varied to implicate much of the patchwork. For example, many institutions participate in the federal Perkins
Loan Program or provide institutional loans to students
or faculty that may trigger the Gramm-Leach-Bliley Act
and the Red Flags identity theft prevention rules (Meers
and Meade 2008). A student health center may be subject
to HIPAA with regard to treatment of faculty and staff and
to FERPA with regard to treatment of students. Because institutions commonly welcome students from all over the
country, various state data security laws may apply regardless of where the institution actually provides its services.
And if the institution accepts credit card payments for
tuition and fees, then it likely is also subject to PCI/DSS.
The list goes on. In short, because of the breadth of many
colleges’ and universities’ activities, multiple privacy and
data security regimes may apply to various IT functions
that might “move to the cloud.” Consultation with legal
counsel should be a central feature of any plan to utilize
cloud computing.
CLOUD COMPUTING PRIVACY AND DATA SECURITY
Cross-Cutting Privacy and Data Security Issues
Nearly all cloud computing privacy and data security risks
share a common origin. Moving to the cloud in no way
changes an institution’s privacy and data security obligations (Jansen and Grance 2011), but it does force the institution to rely on a third party for compliance. Thus,
colleges and universities must (1) identify applicable
privacy and data security requirements, (2) conduct due
diligence of the provider’s compliance package, and (3)
negotiate effective contractual provisions—including ef-
fective remedies for noncompliance—to ensure that the
provider will execute. Although this sounds straightforward, a number of the cloud’s features complicate matters,
including the following:
• Easy deployability means unauthorized deployments:
Precisely because cloud resources are easy to deploy,
various campus constituencies might move to the cloud
without considering the privacy and data security implications of doing so (Young 2011). This can be especially problematic at “flat” organizations—which many
colleges and universities are. For example, a professor
might begin to communicate grades to students by way
of a free, commercially available file-sharing service,
such as Dropbox. This may result in the storage of “education records,” implicating FERPA.
• Data location: The cloud model works in part because data can skip around the world instantaneously (Jansen
and Grance 2011). An institution’s home jurisdiction
may prohibit such transfers, and the transferee jurisdiction may provide less protection from government
intrusion or impose fewer data security requirements
on the provider. And because physical location is a traditional jurisdictional test, the location of a provider’s
servers could subject an institution to the laws of a
“strange” jurisdiction. Even if it is unlikely that a provider’s unilateral (and possibly unknowing) transfer of
data to a server in a faraway jurisdiction would subject
the customer to that jurisdiction’s laws, the provider
should arguably bear that risk. (The cloud provider
would likely already be subject to the laws of any jurisdiction where it maintains servers.) The U.S. Department of Education recently suggested that in its view
a cloud computing “best practice” is to store sensitive
education records within the United States (U.S. Department of Education 2012).
• Data ownership and secondary uses: Some public cloud
providers rely on data mining to create revenue streams.
Data mining ranges from behavioral advertising to the
outright sale of personally identifiable information.
This model presents compliance challenges for data
security laws that prohibit the use of data for any purpose other than that for which the data were collected
(“secondary use”). Institutions should be wary of provider agreements that claim ownership or license of the
institution’s data and should consider whether some
contractual limit on secondary use is necessary or desirable ( Jansen and Grance 2011). Be aware of indirect
ownership claims, as when a SaaS provider seeks to own
software outputs created by subscriber data inputs.
• Loss of control and lack of transparency: As in any outsourcing arrangement, a cloud customer cedes control
of some processes to the provider. Where an institution
once had the power to allocate resources and develop
a data security regime appropriate to its size and risk
profile, on the cloud it must rely on the provider’s human, physical, administrative, and technical resources
(Cloud Security Alliance 2010, Jansen and Grance
2011). Where an institution once had physical possession of its data, it now must rely on a provider not to
hold its data hostage in the event of a contract dispute
and/or at the end of the relationship. Trust must fill the
gap. Effective pre- and post-contract diligence can create trust. But for various reasons, cloud providers may
be reluctant to allow rigorous diligence (Jansen and
Grance 2011). Independent third-party audits may constitute one solution; a provider’s reputation may constitute another. Still, the institution should negotiate
mechanisms whereby it confirms that security controls
are implemented and contractual promises are kept.
• Data security risk profile: Some argue that the cloud
provides less security than on-the-ground computing
because it adds layers of administrative and technical
complexity, is portable, and becomes a target for hacker
attacks. Others contend that cloud providers are by necessity expert at data security and provide much greater
protection than any home-grown IT department
(Winkler 2011). At the very least, the cloud does raise
different data security concerns than on-the-ground
computing. For example, cloud providers often achieve
economies of scale by storing multiple subscribers’ data
on the same server and segregating the data exclusively
through technical (as opposed to technical and physical) means (Jansen and Grance 2011; see the sketch following this list). But one cannot
say that storing information in the cloud is like storing
money in a bank (as opposed to a mattress) because
some colleges and universities already have vaults. Thus,
an institution’s IT professionals should conduct a case-by-case comparative analysis.
• Refusal of providers to negotiate: Finally, although a
contract is critical to achieve privacy and data security
compliance, many cloud providers offer one-sided,
form contracts with little room for negotiation (Jansen
and Grance 2011). This is particularly true for the public, SaaS, out-of-the-box offerings. Providers assert that
standardization helps keep down costs. In certain cases,
contracts give providers the right to amend unilaterally,
creating the potential to undermine any privacy or data
security obligations an institution might obtain. Contracts also may not provide effective remedies or indemnification for breaches by the provider. (See Jansen and
Grance 2011 for a more comprehensive examination of
the cautionary implications of cloud computing.)
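To make the shared-server point above concrete, the following sketch shows what segregation “exclusively through technical means” can look like: two institutions’ records share one table, and the only barrier between tenants is a query filter. The schema is assumed for illustration, not any provider’s actual design.

```python
# Two subscribers' data co-located in one store, separated only by a
# tenant_id column: "technical (as opposed to technical and physical)"
# segregation. Simplified, assumed schema for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (tenant_id TEXT, student TEXT, grade TEXT)")
conn.executemany(
    "INSERT INTO records VALUES (?, ?, ?)",
    [("college_a", "Alice", "A"), ("college_b", "Bob", "B+")],
)

def records_for(tenant_id: str) -> list:
    # Everything standing between one institution's data and another's is
    # this WHERE clause; a bug or injection flaw here exposes a different
    # subscriber's records, which is why the risk profile differs from
    # physically separate, on-campus servers.
    return conn.execute(
        "SELECT student, grade FROM records WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

print(records_for("college_a"))  # [('Alice', 'A')] -- college_b's rows stay hidden
```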
That said, large cloud providers increasingly are attempting to meet colleges’ and universities’ needs. For example,
after initial resistance, Google reportedly agreed to comply with FERPA in its provision of Gmail to postsecondary
institutions (DeSantis 2012, Mitrano 2009). A number of
major cloud providers now have sites dedicated to education institutions, and institutions are uniting to negotiate
with cloud providers. This represents progress but likely
does not signal the end of problematic unilateral contracts.
For now, when an insufficient contract is all that is available, an institution must consider whether it is legally possible and prudent to proceed. Institutions should weigh
the sensitivity of the information involved, the potential
exposure in the event of a problem, and the cloud provider’s reputation. Sometimes an institution should proceed;
sometimes an institution should look to a different cloud
service or delivery model; and sometimes an institution
should stay “on the ground.”
The FERPA Example
Because FERPA applies to most colleges and universities,
it provides a useful example of how to evaluate a privacy
law when moving to the cloud. FERPA protects “education records” (see 34 C.F.R. § 99.3), so the first question is
whether the relevant IT function involves such records.
If so, an institution must identify contractual and other
guarantees needed to ensure compliance.
Generally, education records are any information recorded in any way that is maintained by or on behalf of
an institution and that is “directly related to a student”
(see 34 C.F.R. § 99.3). This broad definition covers any
number of IT functions, including faculty and staff e-mail
(Gilbertson and Storch 2009), student information systems, grade books, extracurricular participation records…
the list goes on. But not all campus IT functions would
qualify: For example, a professor’s research database comprising interviews with non-students likely would not
be subject to FERPA.
FERPA creates rights of privacy (34 C.F.R. part 99, subpart D), student access (34 C.F.R. part 99, subpart B), and record
correction (34 C.F.R. part 99, subpart C). Cloud compliance for access and correction is relatively straightforward.
For access, an institution should contractually prohibit the
provider from unilateral records destruction and should
confirm that the service level agreement (SLA) would allow a student to inspect records within 45 days of a request (common SLAs would so allow) (Porter and Larner
2011). The contract also should prohibit the provider from
holding education records hostage in any contract dispute
or at the end of the relationship (34 C.F.R. § 99.10[b]). For
the right of correction, an institution should ensure that it
is possible to attach electronic explanatory notes that will
be transmitted with the records (34 C.F.R. § 99.21[c]; see
also 34 C.F.R. §§ 99.20 and 99.22).
As compared to the rights of access and correction, the
right of privacy is somewhat more complicated. FERPA’s
privacy protections generally prohibit the disclosure of
“personally identifiable information” (PII) from education records without a student’s signed, written consent
that specifies details about the disclosure. The regulations
define disclosure as “to permit access to or the release,
transfer, or other communication of [PII] contained in
education records by any means . . . to any party except the
party identified as the party that provided or created the
record” (34 C.F.R. § 99.3). Institutions may be able to rely
on a number of exceptions to the consent requirement to
allow disclosure to a third party in the context of cloud
computing (for example, the exceptions for disclosures
of directory information or for disclosures in connection
with financial aid). Here we focus on the more general
exception for disclosures to school officials who have “legitimate educational interests” in the records, which the
U.S. Department of Education recently discussed in a
publication regarding cloud computing (U.S. Department
of Education 2012).
Contractors to whom an institution has “outsourced
institutional services or functions” may qualify as “school
officials” with “legitimate educational interests” when
they:
• Perform a service or function for which the institution would otherwise use employees;
• Are under the “direct control” of the institution with
respect to the use and maintenance of education records;
• Use reasonable physical or technical controls or
equally effective administrative controls to limit
employee access to education records to those who
need to know; and
• Agree not to re-disclose any PII without a student’s
consent or to make any secondary use of PII.
In addition, in its legally required annual FERPA notice, the institution must define “school officials” broadly
enough to include IT service providers.
The requirements that the provider fall under the institution’s “direct control,” limit access to records to
need-to-know personnel, and prohibit secondary use and
re-disclosure are the most burdensome. Cloud providers
may push back on a prohibition of secondary uses or re-disclosure of personally identifiable information. Even
more burdensome is the “direct control” requirement.
Read literally, “direct control” could foreclose cloud
computing. But it seems that robust contractual provisions can provide a sufficient proxy (Gilbertson and
Storch 2009). Some contend that it would suffice for a
contract to (1) reserve all information ownership rights
to the institution, (2) prohibit all secondary uses or re-disclosures of the data (as opposed to just PII), and (3)
require data security protections (Gilbertson and Storch
2009). Mandating periodic audits, sufficient remedies in
the event of a breach, and the return or destruction of the
information at the end of the agreement also would seem
prudent. These contractual provisions borrow certain features from the U.S. Department of Education’s recently released guidelines about how an institution can ensure that
a third party complies with FERPA to the “greatest extent
possible” under a different exception for nonconsensual
disclosure. Although FERPA does not include an express
data security requirement, the guidelines state that “best
practice” is to “verify the existence of a sound data security
plan” because institutions must determine whether the
third party’s protections are “adequate to prevent FERPA
violations” (76 Fed. Reg. 75,604, 75,612, 2011).
In sum, FERPA does not appear to prohibit cloud computing, and, indeed, the U.S. Department of Education
recently stated its position to that effect (U.S. Department
of Education 2012). To comply with FERPA, an institution
first should make sure that the definition of “school officials” with a “legitimate educational interest” in its annual FERPA notice encompasses contractors. Second, an
institution should obtain a provider’s contractual agreement to: (a) comply with FERPA; (b) acknowledge that
the provider is under the direct control of the institution
with respect to the “use” and “maintenance” of education
records (including, perhaps, by adopting some or all of
the contractual protections above); (c) provide physical,
administrative, and technical controls to restrict access to
education records to those persons who need to know;
and (d) prohibit the re-disclosure and secondary use of
PII from education records.
CLOUD COMPUTING DOS AND DON’TS

Do

• Develop a policy to mandate the use only of university-approved IT resources and to forbid the unofficial use of cloud services.
• Develop a process-based approach to managing cloud compliance. When considering a move to the cloud, develop an ad hoc or permanent cross-functional team to identify the institution’s needs, evaluate providers, and decide whether to move to the cloud. At a minimum, the team should include personnel qualified to provide the legal, IT, and business perspectives.
• Evaluate cloud privacy and data security compliance on a function-by-function basis (for example, the student information system might involve different considerations than an enrollment management system).
• Identify legal, contractual, or institutionally imposed privacy or data security requirements that are applicable to the information entwined in the IT function that is moving to the cloud.
• Seek transparency about and/or a third-party audit of the cloud provider’s privacy and data security measures.
• Compare the cloud provider’s security protections with your institution’s existing on-the-ground data security protections.
• Learn whether the cloud provider will sell your institution’s information directly or indirectly to advertisers or otherwise use your institution’s data for its own purposes; seek to restrict such activities when legally required or desirable.
• Seek to obligate the cloud provider to bear the risk for laws that may be imposed on your institution as a result of the locations of the provider’s servers.
• Build into the contract effective remedies for material breaches of the cloud services agreement—including breaches of privacy and data security provisions.
• Build into the contract effective transition procedures to minimize the risk of cloud provider lock-in.
• Conduct a cost-benefit analysis when it is impossible to obtain the ideal contractual guarantees from a cloud provider. Weigh the cloud provider’s reputation and potential costs in the event of a privacy or data security lapse, the sensitivity of the data, the risks to the institution, and the benefits of moving to the cloud.
• Evaluate whether other cloud solutions—for example, a private or community cloud—would make it easier to obtain the requisite privacy or data security protections.

Don’t

• Assume that your institution’s lack of knowledge about an employee’s use of cloud services would relieve the institution of its privacy and data security obligations.
• Sign or electronically agree to a cloud services agreement without consulting your legal department.
• Assume that all privacy and data security laws apply to all IT functions.
• Assume that moving to the cloud somehow lessens or transfers to the cloud provider your institution’s responsibility for privacy and data security compliance.
• Sign an agreement that does not require the cloud provider to maintain at least commercially reasonable data security standards or, ideally, no less protection than your institution provides on the ground (assuming that applicable law does not require a more stringent standard).
• Assume that the cloud provider’s security protections are either better or worse than your institution’s on-the-ground data security protections.
• Forget that free services often involve secondary uses of data.
• Necessarily refuse to contract with a cloud provider who will not disclose the location of its servers because associated risks may be manageable through contract terms.
• Sign a contract that allows a cloud provider to unilaterally change its terms.
• Assume that transitioning from one cloud provider to another will be easy.
• Expect to be able to negotiate the ideal contract with all cloud providers.
• Assume that failure either to find a suitable public cloud provider or to obtain a suitable agreement means that the IT function can never move to the cloud.
CONCLUSION
Privacy and data security requirements pose a compliance challenge to cloud computing. But that challenge is
not insurmountable. Institutions must understand that
whatever their privacy and data security obligations on
the ground, the same requirements apply in the cloud.
By planning and contracting accordingly, institutions can
take advantage of the tremendous benefits offered by the
cloud while managing legal and other risks.
REFERENCES

Cloud Security Alliance. 2010. Top Threats to Cloud Computing, Version 1.0.
Curran, B. P. 2010. U.S. export controls and cloud computing. Law360. September 10.
DeSantis, N. 2012. Google says new privacy policy has little impact on education partners. The Chronicle of Higher Education. March 1.
EDUCAUSE. 2009. Seven Things You Should Know About Cloud Computing.
Gilbertson, S. F., and J. C. Storch. 2009. Cloud contracting: Outsourcing e-mail@youruniversity.edu. NACUA Notes 8(4).
Jansen, W., and T. Grance. 2011. Guidelines on Security and Privacy in Public Cloud Computing. National Institute of Standards and Technology Special Publication 800–144.
Katz, R., P. Goldstein, and R. Yanosky. 2009. Cloud Computing in Higher Education.
Kundra, V. 2010. Testimony of the Federal Chief Information Officer, Administrator for Electronic Government and Information Technology, Office of Management and Budget, before the House Committee on Oversight and Government Reform, Subcommittee on Government Management, Organization, and Procurement. July 1.
Meers, E. B., and D. S. Meade. 2008. FTC’s red flag rule likely to affect colleges. Initiatives News. September 23.
Mitrano, T. 2009. Outsourcing and Cloud Computing for Higher Education.
Porter, P. D., and M. E. Larner. 2011. Managing the Risks of Operating in the Cloud.
U.S. Department of Education, Privacy Technical Assistance Center. 2012. Frequently Asked Questions—Cloud Computing.
Winkler, V. 2011. Cloud computing: Cloud security concerns. TechNet Magazine. November.
Young, J. R. 2011. Colleges unite to drive down cost of ‘cloud computing’: As professors’ demand for web-based service grows, institutional group buying may keep them from going rogue. The Chronicle of Higher Education. October 16.
About the Author
STEPHANIE GOLD is a partner and JOEL BUCKMAN is an associate at
Hogan Lovells US LLP. Both work in Hogan Lovells’ education practice
area and its Washington, D.C. office.
APPENDIX: Common Privacy and Data Security Regimes Applicable to Postsecondary Education Institutions and Implications for Cloud Computing

Family Educational Rights and Privacy Act (“FERPA”)

Applicability and Basic Protections

FERPA protects “education records” and provides eligible students with rights of privacy, access, and correction for such records.1
▶▶ Education Records—broadly defined to include “any information recorded in any way” that is (i) directly related to a student, and (ii) maintained by an institution or by a party acting for an institution.2
▶▶ Right of Privacy—FERPA prohibits the disclosure of “personally identifiable information” from education records without a student’s signed, written consent.3
▶▶ Right of Access—FERPA requires institutions to allow students to inspect their education records or to provide copies of such records within a “reasonable period of time” but no later than 45 days of such request. An institution may charge a copying fee but not a retrieval or inspection fee.4
▶▶ Right of Correction—FERPA requires institutions to permit students to request amendment of education records; if, after a hearing, the institution refuses to correct a record, a student has the right to attach an explanation that must be transmitted with the record.5

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Whenever education records are disclosed, it is advisable to include some contractual language regarding FERPA compliance. The precise terms recommended or required may vary with the exception to the consent requirement upon which the institution relies.
For example, assuming the institution discloses information to the cloud provider under the “school officials” with a “legitimate educational interest” exception, certain contractual requirements would be implicitly required. See “Cloud Computing Notes” below.
Other exceptions may call for somewhat different contractual obligations. For example, the student financial aid exception allows nonconsensual disclosure of information from education records when “necessary” to determine the eligibility, amount, or conditions of the aid or to enforce the terms and conditions of the aid.6 Though nothing would appear to require contractual provisions that would approximate “direct control” when relying on such provisions, the institution should nevertheless obtain contractual guarantees related to re-disclosures of information, prohibitions on secondary use, and recordkeeping.7 Such requirements would not be necessary for directory information.8

Cloud Computing Notes

In general, institutions seeking to move to the cloud can probably rely upon the exception for nonconsensual disclosures to “school officials” who have a “legitimate educational interest[]” in such records.9 Cloud providers likely qualify when—
▶▶ The institution outsources a service or function for which the institution would otherwise use employees;
▶▶ The contractor is under the direct control of the agency or institution with respect to the use and maintenance of education records (for example, by contractually requiring data security, specifying that the institution owns the information, and by requiring periodic audits for contract compliance);10
▶▶ The disclosure of PII to the contractor is conditioned on a contractor’s promise not to re-disclose the information without the students’ consent and that the officers, employees, and agents of the contractor will use the information only for the purposes for which the disclosure was made;11
▶▶ The contractor has physical, technological, and/or administrative controls to limit access by its employees only to education records in which they have a legitimate educational interest;12 and
▶▶ The institution’s annual FERPA notice defines “school officials” to include contractors.
Institutions should seek guarantees that the provider will comply with all FERPA requirements (for example, service levels that would ensure the right of access; capability to attach student explanations to records as appropriate).
The Gramm-Leach-Bliley Act (GLBA)13

Applicability and Basic Protections

The GLBA applies to institutions meeting the definition of “financial institutions,” which includes institutions that engage in common financial activities such as making, brokering, or servicing loans.14
The GLBA protects “nonpublic personal information,” which means “personally identifiable financial information” and “any list, description, or other grouping of consumers…that is derived using any personally identifiable financial information that is not publicly available.”15
The GLBA has two principal protections: (1) a Privacy Rule (governing the use and disclosure of nonpublic personal information) and (2) a Safeguards Rule (requiring a data security program).
▶▶ Privacy Rule—colleges and universities that comply with FERPA are deemed compliant with the GLBA’s privacy rule, at least with respect to student records.16
▶▶ Safeguards Rule—requires financial institutions to develop, implement, and maintain a comprehensive, written information security program, which contains administrative, technical, and physical safeguards that are appropriate to the institution’s size and complexity.17
1 See 20 U.S.C. § 1232g; 34 C.F.R. part 99.
2 34 C.F.R. § 99.3.
3 Id. § 99.30.
4 Id. §§ 99.10, 99.11.
5 Id. §§ 99.7, 99.10, 99.21.
6 Id. § 99.31(a)(4).
7 Id. § 99.33(a)–(b).
8 Id. § 99.33(c).
9 Id. § 99.33.
10 At first, this criterion seems problematic because cloud providers are not under the direct physical or even administrative control of institutions. Institutions can probably comply with this requirement, however, using contractual guarantees. Such guarantees should create the contractual equivalent of direct control.
11 Id. § 99.33(a).
12 Id. § 99.31(a).
13 15 U.S.C. §§ 6801–6809; 16 C.F.R. part 314.
14 See 15 U.S.C. § 6809.
15 16 C.F.R. §§ 314.2(b), 313.3(n).
16 16 C.F.R. § 313.1(b); 65 Fed. Reg. 33,646, 33,648 (“The Commission has noted in its final rule, therefore, that institutions of higher education that are complying with FERPA to protect the privacy of their student financial aid records will be deemed to be in compliance with the Commission’s rule.”).
17 16 C.F.R. § 314.3.
GLBA (continued)

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Yes, the Safeguards Rule requires financial institutions to “[o]versee service providers” by “(1) Taking reasonable steps to select and retain service providers that are capable of maintaining appropriate safeguards for the customer information at issue; and (2) Requiring…service providers by contract to implement and maintain such safeguards.”18

Cloud Computing Notes

In most cases, the critical GLBA component for cloud computing will be to ensure compliance with the Safeguards Rule. To do so, institutions should exercise diligence in selecting and retaining providers and should obtain contractual guarantees that the provider is implementing an information security program at least as effective as the institution’s on-the-ground program.
Even when the Privacy Rule does apply to institutions—e.g., when an institution acts as a “financial institution” to faculty—the Privacy Rule ordinarily should not prohibit institutions from moving to the cloud because the GLBA allows disclosures to service providers (provided appropriate notice is given and the provider is obligated not to disclose further or use the information other than to carry out the purposes for which the information is disclosed).19
The Red Flags Rule20

Applicability and Basic Protections

In general, the Red Flags Rule requires, among other things, that:
(1) Debit and credit card issuers develop protocols to assess the validity of change-of-address requests that are followed closely by a request for an additional or replacement card;21
(2) Users of consumer reports develop reasonable protocols to apply when they receive notices of address discrepancies from a consumer reporting agency;22 and
(3) “Financial institutions” and “creditors” holding “covered accounts” develop and implement a written identity theft prevention program “in connection with the opening of a covered account or any existing covered account.”23 “Covered accounts” are generally consumer accounts involving multiple payments or transactions, including a loan that is billed monthly.24
The second and third provisions ensnare many institutions because, for example, institutions participate in the Federal Perkins Loan program, offer institutional loans to students, faculty, or staff, or offer tuition payment plans. The regulations distinguish “stored value cards” from debit or credit cards, so the first provision may not apply to all student ID cards that students use to purchase goods or services.

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Suggested, but not required. Each financial institution or creditor that is required to implement an identity theft prevention program must “[e]xercise appropriate and effective oversight of service provider arrangements.”25 Interagency guidance provides that this includes “tak[ing] steps to ensure that the activity of the service provider is conducted in accordance with reasonable policies and procedures designed to detect, prevent, and mitigate the risk of identity theft,” which may include contractually obligating the provider to comply with the institution’s Red Flags program.26

Cloud Computing Notes

When a college or university is a “financial institution” and thus obligated to implement a data security plan, the provider should agree to provide at least as rigorous an identity theft prevention program (as tailored to its specific function) as the college or university. But the Red Flags Rule requires no specific contractual language or obligation.27 Instead, the regulations and guidance allow flexible business arrangements, so long as the service provider’s identity theft prevention is sufficient to meet the financial institution’s or creditor’s obligations under the Red Flags Rule.
Also, if an institution is not a “financial institution,” depending on the nature of the services the cloud provider performs, an institution may retain control over all necessary personnel and processes to ensure compliance. For example, an institution that uses consumer reports likely still would retain complete ability to develop reasonable protocols to respond to address discrepancies in the cloud.
18 16 C.F.R. § 314.4(d).
19 16 C.F.R. § 313.13.
20 The Red Flags Rules were enacted pursuant to the Fair and Accurate Credit Transactions Act of 2003 (“FACT Act”), Pub. L. No. 108-159, §§ 114, 315, 117 Stat. 1952, 1960, 1966 (2003) (codified as elements of 15 U.S.C. §§ 1681c, 1681m). Several federal agencies have promulgated Red Flags Rules regulations. This chart cites to the Federal Trade Commission’s regulations.
21 16 C.F.R. § 681.2.
22 16 C.F.R. § 681.2.
23 16 C.F.R. § 681.1(d).
24 16 C.F.R. § 681.1(b)(3).
25 16 C.F.R. § 681.1(e)(4).
26 16 C.F.R. Part 681, App. A(vi)(c) (Interagency Guidelines on Identity Theft Detection, Prevention, and Mitigation).
27 “Financial institutions or creditors may find it helpful to require a service provider, by contract, to have policies and procedures to detect relevant red flags that may arise in the performance of the service provider’s activities and either report the red flags to the financial institution or creditor or take its own appropriate steps to prevent or mitigate identity theft. See Section VI(c) of the Guidelines.” FTC, “Frequently Asked Questions: Identity Theft Red Flags and Address Discrepancies,” www.ftc.gov/os/2009/06/090611redflagsfaq.pdf.
HIPAA/HITECH28

Applicability and Basic Protections

HIPAA/HITECH applies to any “covered entity,” which includes certain healthcare providers that engage in standard transactions (typically because they electronically bill third-party payors).29
Generally, when a college or university healthcare clinic treats students, FERPA, not HIPAA/HITECH, will apply.30 HIPAA/HITECH applies to student healthcare clinics when they treat non-students, such as faculty members, and to affiliated university hospitals (because they generally do not provide care to students on behalf of a college or university).
HIPAA/HITECH protects “protected health information” (“PHI”), which is broadly defined to include even just demographic information when provided by or to a covered entity.31 It protects PHI through the Privacy and Security Rules.32 The Privacy Rule generally prohibits unauthorized disclosure and secondary use of PHI; the Security Rule requires physical, technical, and organizational data security safeguards. The Security Rule applies to electronic protected health information and requires a robust set of data security requirements.
HITECH also imposes a federal data breach notification requirement.

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Yes, if the cloud provider qualifies as a business associate, then a business associate agreement (“BAA”) is required. Whether a service provider qualifies as a business associate typically depends on whether it receives or has access to PHI obtained from the covered entity.33 Note that HIPAA provides a regulatory exception to the business associate requirement for “certain private couriers and their electronic equivalents that act merely as conduits for protected health information.”34 In certain cases, a cloud provider may qualify for this exception, but it is extremely narrow.

Cloud Computing Notes

The HIPAA Privacy Rule restricts the use of PHI for marketing purposes without the patient’s consent.35 In addition to the contractual obligations required by BAAs, the HITECH Act imposes a number of the Privacy Rule requirements and nearly all of the Security Rule requirements directly on business associates.36 This does not abrogate or otherwise undermine a covered entity’s obligations under HIPAA.
Note that moving student health center data to the cloud might result in FERPA applying to certain records and HIPAA applying to others.
State Data Breach Notification Laws

Applicability and Basic Protections

Forty-six states, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands have some sort of data breach notification law.37 Typically the laws apply to electronic records containing sensitive information; many contain some sort of risk-of-harm threshold.38 In all cases, service providers, such as cloud providers, have an obligation to notify the data owner (here, the institution) of a data breach; the law requires the data owner to make relevant notifications to customers.39

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Always recommended; whether required varies by state.

Cloud Computing Notes

When selecting a cloud service provider, the institution should contractually obligate the cloud provider to give the data owner timely notice of any potential data breach and should dictate who will be responsible for costs associated with the data breach (for example, notification, legal, investigation, and reputational costs).
28 Health Insurance Portability and Accountability Act of 1996 (“HIPAA”), 42 U.S.C. § 1320d et seq.
29 45 C.F.R. § 160.103.
30 See 45 C.F.R. § 160.103 (excluding “education records” and “treatment records” under FERPA from the definition of “protected health information” under the HIPAA Privacy Rule). The HIPAA Security Rule, in turn, applies only to electronic PHI and therefore also does not apply to education records and treatment records under FERPA. See also Joint Guidance on the Application of the Family Educational Rights and Privacy Act (FERPA) and the Health Insurance Portability and Accountability Act of 1996 (HIPAA) To Student Health Records (Nov. 2008).
31 65 Fed. Reg. 82,462, 82,612 (Dec. 28, 2000).
32 45 C.F.R. parts 160 and 164, subparts A & E.
33 45 C.F.R. § 164.308(b)(1); id. § 164.502(e)(2).
34 OCR, FAQ, www.hhs.gov/ocr/privacy/hipaa/faq/business_associates/245.html.
35 45 C.F.R. § 164.520(a)(3).
36 See HITECH, Pub. L. No. 111-5, part I, §§ 13401, 13404, 123 Stat. 115.
37 E.g., Cal. Civ. Code § 1798.82(a); Fla. Stat. Ann. § 817.5681; N.Y. Gen. Business Law § 899-aa; Tex. Bus. & Com. Code § 521.053. The National Conference of State Legislatures collects these laws at www.ncsl.org/default.aspx?tabid=13489.
38 See Security Breach Notification Laws, www.ncsl.org/default.aspx?tabid=13489.
39 Id.
State Data Security Laws

Applicability and Basic Protections

Several states—including, for example, California, Arkansas, Maryland, Massachusetts, Minnesota, Nevada, Oregon, Rhode Island, Texas, and Utah—have laws that impose data security requirements; some of the laws require that security requirements be passed along to vendors by contract.40
Massachusetts has the most restrictive state legal regime. Unlike others, it imposes specific data security requirements, including administrative, technical, and physical safeguards and specific encryption requirements, for electronic records containing personal information.41 (A sketch of encryption at rest follows this section.) The Massachusetts law purports to apply to any entity that owns, licenses, or stores data of a Massachusetts resident.42 The other common jurisdictional hook for state data security laws is when an institution “does business” within the state.

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Always recommended; whether required varies by state. Massachusetts, for example, requires that businesses “take reasonable steps to select and retain” third-party service providers and require such providers by contract to implement and maintain appropriate security measures for personal information.43 Together, these provisions could be read to require audits or assessments of cloud providers.

Cloud Computing Notes

Institutions should carefully assess the regulatory risk from state data security laws. In some cases, even though the laws may technically apply to certain records because the state asserts jurisdiction over any entity that licenses information from a state resident, the risk may be small due to a limited number of data owners from that state.
When state data security laws apply, institutions should impose by contract the data security requirements required of the institution and obtain some indemnification from the cloud provider. Note that a cloud provider’s lack of transparency may render it difficult to obtain the requisite assurances that the provider satisfies the requirements.
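As one illustration of the kind of technical safeguard such laws contemplate (Massachusetts, for example, imposes specific encryption requirements for personal information), the following minimal sketch encrypts a record before it is handed to a storage provider. It assumes the third-party cryptography package; it is not a compliance recipe for 201 Mass. Code Regs. 17.00 or any other regime, and real deployments need far more careful key management.

```python
# Minimal encryption-at-rest sketch using the third-party "cryptography"
# package (pip install cryptography). Fernet is symmetric, AES-based
# authenticated encryption; statutes generally do not mandate a
# particular algorithm, so treat this purely as an illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, keep this in a key-management system
fernet = Fernet(key)

record = b"Massachusetts resident: name, account number"
ciphertext = fernet.encrypt(record)  # what the cloud provider would store

# Only the key holder (the institution) can recover the plaintext.
assert fernet.decrypt(ciphertext) == record
print("Stored form is unreadable without the key:", ciphertext[:20], b"...")
```

The design point: if the institution rather than the provider holds the key, the provider’s servers hold only ciphertext, which can reduce the practical exposure from a breach and, under some statutes, the notification obligations that follow one.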
State Privacy Laws

Applicability and Basic Protections

Many states have statutory and/or common law privacy requirements. States may have specific privacy laws for education records, health records, mental health records, or others.

Specific Privacy and/or Data Security Contract Provision Required/Recommended?

Always recommended; whether required varies by state.

Cloud Computing Notes

As with other privacy laws, institutions should be especially attuned to restrictions on secondary use (that is, assuming the institution may disclose the information to a cloud provider, whether the cloud provider may use or disclose the information for purposes other than to provide the cloud services to the institution).
40 See John L. Nicholson & Meighan E. O’Reardon, Data Protection Basics: A Primer for College and University Counsel, 36 J.C. & U.L. 101, 119–31 (2009).
41 201 Mass. Code Regs. § 17.01(1).
42 See id.
43 201 Mass. Code Regs. § 17.03(3)(f).
Privacy and data security continue to be a focus for corporations, regulators, law enforcement,
and consumer groups across the globe. Imaginative ways to access and use information create
significant challenges in how to protect individuals, nations, and an interconnected world
economy. These issues touch virtually every aspect of modern life, from the use of smart phones
to global security against terrorism. This survey covers significant developments in global
privacy and data security and topics to watch in the coming year. Privacy and data security issues
such as those highlighted in this survey, as well as others, will continue to develop around the
world for the foreseeable future. Varying ideological approaches to privacy and data security in
the interconnected digital world complicate the already difficult task of balancing innovation
with reasonable protections. Uncertainty and change will be the norm in this space for years to
come.
PRIVACY IN THE CLOUD
Cloud computing services challenge traditional privacy law concepts as well as regulators who
struggle to keep up with technological developments. "Cloud" refers to a distributed internet-based infrastructure, used on a shared basis,1 in which user data may be stored in different or
multiple data centers around the world.
JURISDICTION AND ACCESS TO DATA
A key area of ongoing debate regarding the cloud is jurisdiction and territoriality, which are
central to privacy regulation. The legal framework regulating data transfers lags behind cloud
computing innovation,2 and there is no agreement on a new legal framework. Generally, there
are two bases for jurisdiction over the cloud: 1) location of the infrastructure (e.g., data centers)
and 2) location of the providers.3
The Patriot Act4 and the Foreign Intelligence Surveillance Act5 are examples in which provider-based jurisdiction potentially conflicts with infrastructure-based jurisdiction. U.S. companies
may be required to disclose the cloud data of an EU citizen stored in an EU data center to the
U.S. government under the Patriot Act.6 The U.S. laws in this regard are not unique. For
example, German law enforcement has tapped cloud data abroad using mutual law enforcement
treaties.7 EU finance regulations permit auditing of data in the cloud because it is considered
outsourcing.8 Expect continued activity as regulators struggle with jurisdiction in the cloud.
EU V. U.S. APPROACHES TO THE CLOUD
As various regulators impose a privacy framework on the cloud, their differing approaches to
privacy are fueling debate. Both the European Union and United States provided guidance
regarding cloud data last year. Not surprisingly, they are not in agreement on the topic. In
September 2012, the European Union issued an advisory communication9 that calls for greater
data protection in the cloud. By the end of 2013, the Commission expects to create model
contract terms and a model code of conduct for cloud providers.10
The European Data Protection Supervisor ("EDPS") supports rethinking data protection in the
cloud because, according to the EDPS, currently it is impossible for data controllers purchasing
cloud computing services to comply with legal data protection requirements.11 For example,
data controllers are held accountable for compliance with EU privacy laws even though they may
not know where or how their data is stored by the data processor (the cloud provider) in the
cloud.12 The EDPS suggests clearly defining a "transfer" of personal data in the cloud as well as
other solutions as the European Union moves toward increased regulation of the cloud.13
The International Trade Administration of the U.S. Department of Commerce ("ITA") has
downplayed these concerns. Currently, U.S. privacy protection does not meet EU "adequacy"
requirements, so moving data to the United States generally is not permitted unless the U.S.
importer has certified to Safe Harbor Principles or entered an approved EU standard contract
clause with the EU data exporter.14 The ITA stated that it "does not believe that 'cloud
computing' represents an entirely new business model or presents any unique issues for Safe
Harbor."15 This type of debate will continue as regulators struggle to address the cloud and other
new technology.
Mobile Privacy
Mobile applications and "bring your own device" issues were significant in global mobile
privacy debates in the last year.
Mobile Applications
Increased use of mobile devices and applications in lieu of personal computers is fueling privacy
concerns. Mobile industry trade groups are encouraging self-regulation in an effort to limit
government regulation.16 Likewise, the PCI Security Standards Council released proactive
Mobile Payment Acceptance Security Guidelines in September 2012, which provide global
guidelines for payment applications operating on consumer mobile devices.17
In the United States, California Attorney General Kamala Harris continues to take a leadership
role in the debate: After giving notice of her privacy concerns to popular mobile application
operators, in December 2012, the California Attorney General filed a legal action alleging
privacy deficiencies with a mobile application.18 In January 2013, the California Attorney
General also released a set of privacy best practice recommendations, including using clear and
conspicuous privacy policies and limiting the personally identifiable information collected.19
These actions are being watched closely by other law enforcement bodies and likely will be
replicated elsewhere.
The Federal Trade Commission ("FTC") also has been active. In 2012, it announced best practices to protect consumers' private information, focusing on privacy during product development and on greater consumer choice and transparency.20 The FTC's enforcement activity likewise
reflects its broadening approach to privacy. For example, in In re HTC America, Inc., the FTC
alleged that HTC, an upstream device and software provider with limited consumer interface,
engaged in unfair and deceptive business practices in the customization of software used in
certain mobile devices running third-party operating systems.21
Abroad, the European Union asserted that mobile applications are subject to the EU's Privacy
and Electronic Communications Regulations, which require that users be informed about cookies
and consent to their use.22 A February 2013 Working Party Opinion clarified that processing
personal data in mobile applications requires mobile application controllers to notify users of
their rights of access, rectification, and erasure, along with their right to object to data
processing.23
Bring Your Own Device ("BYOD")
Having employees use their personal devices for company business is an attractive option for
employers because of the potential for reduced IT expenses and increased productivity.
However, BYOD programs also implicate personal privacy and company security issues,
including control of company information stored on employee-owned devices.24 Solutions to
protect confidential company data if a device is lost, such as "remote wipes," may also impact
personal data, leading to potential liability for unauthorized access to the device under state and
federal computer trespass laws.25
This issue is attracting increasing attention around the world. For example, in August 2012, the
White House introduced a toolkit to support federal agencies implementing BYOD programs.26
In February 2013, the German Federal Office for Information Security provided
recommendations on security strategy for BYOD programs.27 Guidance from the United
Kingdom's Information Commissioner's Office stresses that the data controller has the ultimate
responsibility for ensuring legal compliance.28 Expect new BYOD disputes and regulatory
activity.
Global Developments in Addressing Cybersecurity Threats
Cyber-attacks around the globe commonly are headline news. They occur for many reasons, are
perpetrated by different actors, and have diverse targets. Regulators are responding to protect
regional and national interests, as well as the companies that operate in the overlapping universe
of cyberspace.
On February 7, 2013, the European Commission proposed the Directive Concerning Measures to Ensure a High Common Level of Network and Information Security Across the Union.29 The proposed Directive provides that Member States shall
ensure that public bodies, as well as operators of critical infrastructure, manage risks posed to the
security of networks and information systems they control and use.30 It also provides that
Member States shall ensure the same entities report incidents of security breaches to proper
authorities.31
After a series of failed legislative efforts,32 on February 12, 2013, President Obama issued an
Executive Order titled Improving Critical Infrastructure Cybersecurity.33 It directs the Secretary
of Homeland Security to establish a "voluntary program" to support the adoption of a
Cybersecurity Framework by owners and operators of critical infrastructure and other interested
parties.34 That same day, the Cyber Intelligence Sharing and Protection Act was re-introduced in
Congress and later passed in the House.35 A week later, the Obama Administration issued its
Administration Strategy on Mitigating the Theft of U.S. Trade Secrets.36 Each of these actions
recognizes that U.S. companies are the target of sophisticated cyber-attacks that threaten U.S.
economic interests and security.
At the 12th ASEAN Telecommunications and Information Technology Ministers Meeting in
November 2012, the ministers of ten Asian countries reviewed progress in implementing
ASEAN's Information and Communications Technology Master Plan, which incorporates a
campaign to promote cybersecurity and collaboration with private industry, and reconfirmed
formal collaboration strategies with Japan and South Korea on cybersecurity.37
These recent actions are not isolated; local, national, and regional governments are grappling
with complicated issues presented by cyber-attacks, and how to coordinate with private industry
and other governments. Critical infrastructure (including financial services, utilities, internet,
transportation, and health care) is at risk, and protecting that infrastructure is a primary focus for
many countries. Economic espionage and the theft of trade secrets also raise significant concerns
for governments and the private sector.
Despite the widespread action around the globe in the last year, cybersecurity regulation is in its
infancy. There is a robust debate on technical and practical issues relating to cybersecurity, and
regulators are adopting varying and potentially conflicting approaches.
Global Data Breach Developments
Compromises of personal data have become commonplace, and data breaches are no longer limited by geographic boundaries. The international trend toward establishing breach notification
requirements continues, reflecting the expectation that notification enhances data security.
Both the United States and the European Union currently have a patchwork of data breach notification requirements. For example, in the United States, individual states differ on the events that trigger notification: "acquisition" of or "access" to personal information suffices in some states, while others require notification only after a risk-of-harm determination.38 States also differ on when notification should be provided and to whom.39 European nations have similar variations. For example, some nations require notifications to authorities and affected individuals, but others do not have mandatory notification to either the individuals or authorities.40
A more unified approach to data breach notification may be developing. The European
Commission released a proposed General Data Protection Regulation41 in 2012 that addresses
data breach notification requirements throughout the European Union. The proposal is expected to be finalized in 2014, although it likely will be amended from its current form. Similarly, in June
2012, the Data Security and Breach Notification Act of 2012 was introduced in the U.S. Senate
to create a uniform federal privacy breach notification law to preempt the current patchwork of
state laws.42 This bill was reintroduced in the U.S. Senate on June 20, 2013, as the Data Security
and Breach Notification Act of 2013.43
Authorities elsewhere in the world also are enacting breach notification laws, reflecting
increased vigilance over data protection. For example, in August 2012, the Philippines passed its first consolidated data privacy legislation, the Data Privacy Act of 2012, influenced significantly by the European Union's current data protection laws.44 South Korea's Personal Information Protection Act, effective in April 2012, mandates notification to individuals affected by a breach, as well as to the Korean government for large-scale breaches.45 In April 2013, Australia introduced its first legislation establishing notification requirements for a "serious breach."46
Significant Global Privacy Developments
European Union
The European Union continues to forge an aggressive path in data privacy regulation, which
likely will be followed in other parts of the world. The proposed General Data Protection
Regulation ("Regulation")47 sought to address legal uncertainty caused by inconsistent
implementation of the 1995 Data Protection Directive by Member States and respond to
advancements in technology. Among other reforms, the Regulation requires data controllers to
appoint data protection officers, tightens consent rules, creates new rights for data subjects,
augments data breach notification requirements, and strengthens noncompliance sanctions.48
Various stakeholders within and outside Europe have weighed in on the Regulation, and the
basis for much of the recent debate has been proposed amendments to the Regulation in the
January 2013 draft report issued by the European Parliament's Committee on Civil Liberties,
Justice and Home Affairs ("LIBE"), the lead legislative committee for the Regulation.49 Critics
charge that LIBE's proposals generally increase burdens on data controllers, though others
suggest that the more precise, technical language proposed by LIBE may be beneficial.50 On
October 21, 2013, LIBE adopted a version of the Regulation incorporating the Committee's
proposals51; however, the text of the Regulation is by no means final. The European Parliament
and Council of the European Union must now negotiate on the final version of the Regulation
and will aim to reach agreement on this legislative reform before the May 2014 European
elections.52 Among the issues to watch are the scope of the regulation, the role of consent,
restrictions on and accessibility to data, rules regarding international data transfers, and
enforcement and remedies.
Hong Kong
In 2012, the Personal Data (Privacy) (Amendment) Ordinance53 ("Amendments") was enacted
to, among other things, strengthen restrictions on the use of personal data for direct marketing
purposes. The Amendments, effective April 1, 2013, modify Hong Kong's 1997 Personal Data
(Privacy) Ordinance ("PDPO")54 by limiting companies' ability to engage in direct marketing
without opt-in consent, which is enforced with criminal sanctions.55 Despite a grandfathering
provision, confusion and uncertainty persist on the use of personal data for direct marketing
under the Amendments.56 The Amendments also (i) generally prohibit the disclosure of
personal data without the consent of the individual from whom such data was collected ("Data
Subject"), (ii) increase the Privacy Commissioner's enforcement powers under the PDPO, (iii)
grant greater data access rights to Data Subjects, (iv) further regulate processing of personal data
in outsourcing, and (v) include new exemptions to allow the use, disclosure, and/or transfer of
personal data in specified circumstances.57 The Amendments also permit legal assistance with
claims made under the PDPO, criminalize disclosure of personal data for commercial gain and
without consent, and impose certain restrictions and obligations concerning the outsourcing of
data processing to third parties, some of which were put into operation in October 2012.58
Latin America
Latin American countries have been active in enacting privacy and data protection requirements.
For example, on March 22, 2013, Peru issued implementing regulations for its 2011 data
protection law,59 which included new rules concerning the law's territorial scope, restrictions on
data transfers, rights of data subjects in connection with notice and consent, and enforcement.
Costa Rica also recently published regulations60 to clarify its data protection law, which require
expanded data breach notice, new registration obligations for data controllers, restrictions on
personal data retention, express consent by a data subject for data processing, and direct
compliance liability for data processors.61
On October 17, 2012, Colombia passed a comprehensive data protection framework to require,
among other things, data subject notice and consent for personal data processing, restrictions on
the processing of personal data of children, new rights of access and correction for data subjects,
direct regulatory compliance obligations on service providers, international transfer restrictions,
and data controller registration requirements.62 Enforcement is entrusted to a new data protection authority delegated under the Superintendency of Industry and Commerce.
What to Expect
Privacy and data security issues such as those highlighted in this survey, as well as others, will
continue to develop around the world for the foreseeable future. Technological advances create
opportunities to access and use data in ways that were unimaginable even a few years ago, as
well as risks to individuals, companies, and countries. Varying ideological approaches to privacy
and data security in our interconnected digital world complicate the already difficult task of
balancing innovation with reasonable protections. We are far from a mature global framework
regulating privacy and data security, which means that uncertainty and change will be the norm
in this space for years to come.
Footnotes
1. Peter Mell & Timothy Grance, The NIST Definition of Cloud Computing (Sept. 2011), available at http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf.
2. Eur. Parliament Directorate-Gen. for Internal Policies, Study: Fighting Cyber Crime and Protecting Privacy in the Cloud (Oct. 2012), available at http://www.europarl.europa.eu/committees/en/studiesdownload.html?languageDocument=EN&file=79050.
3. See id. at 38.
4. USA PATRIOT Act of 2001, Pub. L. No. 107-56, 115 Stat. 272 (codified in scattered sections
of the U.S.C.).
5. 50 U.S.C. §§ 1801-1811, 1821-1829, 1841-1846, 1861-1862, 1871 (2012).
6. Zack Whittaker, Patriot Act Can "Obtain" Data in Europe, Researchers Say, CBS News (Dec. 4, 2012, 3:59 PM), http://www.cbsnews.com/8301-205_162-57556674/patriot-act-can-obtain-data-in-europe-researchers-say.
7. Johanna Laas, . . . and the Cloud Again: German Government's Response to Formal Inquiry, Privacy Eur. Blog (Apr. 29, 2013, 9:31 AM), http://www.privacy-europe.com/blog/and-the-cloud-again-german-governments-response-to-formal-inquiry/.
8. Lokke Moerel, Global Cloud Contracts: How to Navigate the EU Requirements in a Global Contract 4-5 (IAPP Global Privacy Summit, Mar. 6-8, 2013), available at https://www.privacyassociation.org/media/presentations/13Summit/S13_Closing_the_Deal_PPT.pdf.
9. Communication from the Commission, Unleashing the Potential of Cloud Computing in
Europe, at 8, COM (2012) 529 final (Sept. 27, 2012).
10. Id. at 12-13.
11. Opinion of the European Data Protection Supervisor on the Commission's Communication on "Unleashing the Potential of Cloud Computing in Europe" ¶ 25 (Nov. 16, 2012), available at http://goo.gl/FG9Dz.
12. See id. ¶¶ 24, 82.
13. Id. ¶ 74.
14. U.S. Dep't of Commerce, Clarifications Regarding the U.S.-EU Safe Harbor Framework and
Cloud Computing 1-2 (Apr. 12, 2013), available at http://goo.gl/IwqY2p.
15. Id. at 1.
16. See, e.g., A Status Update on the Development of Voluntary Do-Not-Track Standards: Before the S. Comm. on Commerce, Sci. & Transp., 113th Cong. (2013) (statement of Luigi Mastria, Managing Dir., Digital Advertising Alliance).
17. PCI Sec. Standards Council, PCI Mobile Payment Acceptance Guidelines for Developers (Sept. 2012), available at https://www.pcisecuritystandards.org/documents/Mobile_Payment_Security_Guidelines_Developers_v1.pdf.
18. Press Release, Cal. Attorney Gen., Attorney General Kamala D. Harris Files Suit Against Delta Airlines for Failure to Comply with California Privacy Law (Dec. 6, 2012), available at http://oag.ca.gov/news/press-releases/attorney-general-kamala-d-harris-files-suit-against-delta-airlines-failure. The Delta suit was dismissed based on Airline Deregulation Act preemption. Karen Gullo, Delta Wins Dismissal of California App Privacy Lawsuit, Bloomberg.com (May 9, 2013, 1:36 PM CST), http://www.bloomberg.com/news/2013-05-09/delta-wins-dismissal-of-california-app-privacy-lawsuit.html.
19. Cal. Dep't of Justice, Privacy on the Go: Recommendations for the Mobile Ecosystem (Jan. 2013), available at http://oag.ca.gov/sites/all/files/pdfs/privacy/privacy_on_the_go.pdf.
20. Fed. Trade Comm'n, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (Mar. 2012), available at www.ftc.gov/os/2012/03/120326privacyreport.pdf.
21. In re HTC Am., Inc., No. 122-3049, 2013 WL 752478 (FTC Feb. 22, 2013) (proposing Agreement Containing Consent Order); see also Katherine S. Ritchey et al., Lessons from In re HTC America, Inc.: FTC's Broadening Approach to Consumer Data Security Leaves Unwary Manufacturer or Developer with More than It Bargained For, Jones Day Publ'ns (Mar. 2013), http://www.jonesday.com/lessons_from_htc_america/.
Decision Support Systems 46 (2009) 815–825
Studying users' computer security behavior: A health belief perspective
Boon-Yuen Ng, Atreyi Kankanhalli, Yunjie (Calvin) Xu
Department of Information Systems, National University of Singapore, Singapore
Article info
Available online 21 November 2008
Keywords:
Computer security
Security awareness
Health belief model
Email attachment
Virus
Abstract
The damage due to computer security incidents is motivating organizations to adopt protective mechanisms.
While technological controls are necessary, computer security also depends on individuals' security behavior.
It is thus important to investigate what influences a user to practice computer security. This study uses the
Health Belief Model, adapted from the healthcare literature, to study users' computer security behavior. The
model was validated using survey data from 134 employees. Results show that perceived susceptibility,
perceived benefits, and self-efficacy are determinants of email related security behavior. Perceived severity
moderates the effects of perceived benefits, general security orientation, cues to action, and self-efficacy on
security behavior.
1. Introduction
Organizations increasingly rely on information systems for the
transmission, processing, and storage of information. Hence, it is
essential to protect the information within these systems and the
availability of the computer systems. However, the increase in
organizational dependence on information systems as well as the
ease of mounting attacks has led to a corresponding increase in the
number of security incidents and damage caused [26]. A computer
security incident is defined as a security-related adverse event in
which there is a loss of information confidentiality, disruption of
information or system integrity, disruption or denial of system
availability, or violation of any computer security policies [19].
According to the 2007 annual survey conducted by the Computer
Security Institute [36], 46% of respondents indicated that their
organization experienced a security incident within the last
12 months. Of these, a majority (52%) of the attacks were
virus-related. It is therefore important for organizations and employees to be aware of and protect themselves against security threats and
cybercrime.
Chung et al. [8] described three approaches at a national level to
fight against cybercrime, i.e., legal, organizational, and technological.
Countries around the world have created laws (e.g., Computer Misuse
Act in Britain and Singapore) and set up national agencies (e.g., the
Computer Analysis Response Team in the US) to combat computer
security threats. Various technologies are applied at the national level
for this purpose, such as a computer surveillance system developed by
the FBI. Further, organizational measures are important in this fight.
Organizations need to develop and implement a multi-dimensional
approach to safeguard their information assets [52].
Among the approaches, technological measures such as firewalls
for perimeter defense are common in organizations. Such solutions
are necessary but not sufficient for protection [35]. This is because the success of computer security depends on the effective behavior of
users [43]. Employees in an organization play an essential role in the
prevention and detection of security incidents. While system administrators are responsible for configuring firewalls and servers in a
secure manner, users are responsible for practicing security countermeasures such as choosing and protecting appropriate passwords.
Thus, for effective security, users have to make a conscious decision
to comply with the organization's security policies and adopt
computer security behavior. To this end, organizations have been
implementing security training and awareness programs to educate
users [35]. While many practitioner guidelines are available, there is a
lack of empirical studies concerning the design and effectiveness of
security awareness programs. An effective awareness program should
influence a user's attitude and behavior to be more security-conscious
[47]. Thus, it is critical to understand what will influence a user's
security behavior so that appropriate awareness programs can be
designed. However, there is little theoretically grounded empirical
information systems research on the behavior of individuals in
practicing secure computing.
Motivated by such theoretical and practical concerns, our research
question is, “What are the salient influences for a user to practice
computer security in an organization?” Through this study, we aim to
contribute to the better understanding of security behavior of
computer users in organizations, so that the security climate of an
organization can be improved. By identifying and understanding the determinants of computer security behavior, interventions can be designed to change behavior by targeting one or more of these determinants.
With the paucity of theoretical perspectives in this area, this study
draws upon relevant literature from other fields. Specifically, it makes
use of the well-known health belief model [40] traditionally employed
to explain preventive healthcare behavior. This perspective is
applicable because security practices can be seen as preventive
behavior to avert security incidents. The model suggests that an
individual's behavior is determined by the threat perception and
evaluation of the behavior to resolve the threat. This model offers a
new perspective to better understand the phenomenon using
constructs that have not been previously explored in IS research,
such as cues to action and general security orientation. Our research
model is tested by surveying 134 employees from multiple organizations. The findings are expected to inform theory and practice in this
area.
2. Conceptual background

2.1. Computer security behavior

There are relatively few research studies of security behavior of computer users and how behavior can be modified to practice security countermeasures. Previous studies in this area can be categorized according to their context, i.e., organizational or non-work use of computers. An example of a study in the organizational context is the investigation of end-user security behaviors and their antecedents by Stanton et al. [43]. It reveals relationships between end-user security behavior (such as password management, non-work-related computing behavior, and obtaining security training) and a combination of situational factors (such as organizational type) and personal factors (such as income level and job role). The study provides empirical insights but without theoretical bases. Yet another study in the organizational context by Aytes and Connolly [4] proposes a conceptual model of user security behavior based on risk perception. Among the rare theoretically-grounded empirical studies in this context is the study by Chan et al. [7], which explores the influence of security climate and self-efficacy on user compliance to security policies. Thus there is a lack of studies that comprehensively model and test the individual beliefs that influence computer security behavior in organizations, which is broader than compliance to organizational security policies.

Other related studies pertain to computer users in a non-work environment, which differ from organizational settings by the absence of managerial interventions and controls. For example, the factors that influence a home user's intention to practice computer security have been investigated by applying the decomposed theory of planned behavior [33]. Findings indicate that family, peer, and mass media influence, perceived usefulness, and self-efficacy are important factors that influence a home user's intention to practice computer security. Another empirical study in the non-work context surveyed students to investigate determinants of safe online behavior [29]. It finds significant influences from online safety involvement, self-efficacy, and personal responsibility but without a theoretical explanation. In another study of college students, application of protection motivation theory borrowed from healthcare showed that self-efficacy predicts online consumers' intention to practice safe online behavior, such as updating virus protection [28]. With the lack of theoretically-grounded empirical studies of determinants of computer security behavior in organizations, we now review theories that may be applicable for our study.

2.2. Applicability of IS adoption theories

Information systems (IS) research is rich in theories pertaining to technology adoption. Computer security behavior includes the adoption and use of security technologies such as anti-virus software and firewalls. Theories such as the Technology Acceptance Model [14] and the Theory of Planned Behavior [3] can be applied to study users' intention to use security technologies (e.g., [33]). However, recent research in security behavior has revealed that there are significant differences between positive technologies (used for designed utilities) and protective technologies (used to prevent negative consequences) [15]. Security technologies generally belong to the category of protective technologies as they are used to avert undesirable incidents, such as virus attacks. This recent discussion gives the impetus to look for theories that are more suitable to study the use of such protective technologies.

In addition, computer security behavior involves more than just the adoption of technology. While the use of protective technologies is critical, computer security behavior also includes other behaviors such as the choice of strong passwords, regular backing up of data, and exercising caution with suspicious email attachments. Such behaviors do not involve the adoption of any specific technology but require the computer user to consciously decide to perform additional steps for the sake of preventing unwanted situations such as loss of data. For such behaviors, IS theories such as the Technology Acceptance Model may be less suitable. Behavioral theories such as the Theory of Planned Behavior provide a general framework to study user intentions, but more could be done to explore determinants that are more specific to security behavior.

[Fig. 1. Research model (figure not reproduced).]
With the paucity of theoretical perspectives in information
systems on practicing computer security, most research studies
have turned to theories in other domains. One frequently borrowed domain is healthcare. In the non-work context, security
behavior of home wireless network users has been investigated
using the protection motivation theory [51]. This theory has
previously been used in healthcare to explain a person's coping
behavior when he/she is informed of a threatening event. This and
other studies (e.g., [27,28]) suggest the applicability of healthcare
theories to study computer security behavior. The similarities
between preventive healthcare and protective security behavior
are described below.
2.3. Relevance of healthcare behavioral theories
Parallels can be drawn between protective security behavior (such
as using a strong password to prevent unauthorized use of one's
account) and preventive healthcare behavior (such as observing a
healthy diet to avoid heart diseases). Preventive healthcare refers to
behaviors that will prolong an individual's healthy life or practices
that otherwise lessen the effects of diseases [25]. Protective security
behavior refers to behaviors that will reduce the risk and/or impact of
security incidents. There are a number of characteristics of preventive
healthcare common to practicing security countermeasures. Both
involve practicing preventive and protective behavior to avert an
unwanted situation. The success of preventive healthcare and security
practices is seen in the non-occurrence of diseases (for preventive
healthcare) and security incidents (for security practices) respectively.
The occurrence of diseases disrupts the normal functioning of one's
body whereas the occurrence of security incidents disrupts the
functioning of one's computer system and possibly affects the
organization. Practicing preventive healthcare and security countermeasures both create inconveniences for the individuals in terms of
extra effort.
Most of the theories on preventive healthcare behavior use an
expectancy-value approach. Expectancy refers to beliefs about how
well a person can perform a task or activity, and value refers to the
incentives or reasons for performing that task or activity [16].
According to the basic expectancy-value theory, a person's attitude
towards a behavior is a function of the perceived likelihood of
outcomes associated with the behavior and the expected value or
evaluation of those outcomes. The overall desirability of behavior is based on the summed products of the expectancy and value of outcomes. Several well-known behavioral models have their roots in expectancy-value theories, such as social cognitive theory, protection motivation theory, and the health belief model. The next section describes why we chose the health belief model as the lens for our study.
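In symbols, the expectancy-value principle just described can be written as (our schematic rendering, not notation from the paper)

\[ A_B = \sum_{i=1}^{n} b_i \, e_i , \]

where \(A_B\) is the overall desirability of (attitude towards) behavior \(B\), \(b_i\) is the perceived likelihood (expectancy) that performing \(B\) leads to outcome \(i\), and \(e_i\) is the value placed on outcome \(i\).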
Table 1
Constructs and items (source for each item in brackets; "self" = self-developed)

Behavior (BEH)
BEH1: Before reading an email, I will first check if the subject and the sender make sense. (agree/disagree) [38]
BEH2: Before opening an email attachment, I will first check if the filename of the attachment makes sense. (agree/disagree) [38]
BEH3: I exercise caution when I receive an email attachment as it may contain a virus. (agree/disagree) [self]
BEH4: I do not open email attachments if the content of the email looks suspicious. (agree/disagree) [self]

Perceived susceptibility (SUS)
SUS1: The chances of receiving an email attachment with virus are high. (agree/disagree) [6]
SUS2: There is a good possibility that I will receive an email attachment with virus. (agree/disagree) [6]
SUS3: I am likely to receive an email attachment with virus. (agree/disagree) [6]

Perceived severity (SEV)
SEV1: Having my computer infected by a virus as a result of opening a suspicious email attachment is a serious problem for me. (agree/disagree) [51]
SEV2: Losing organizational data as a result of opening a suspicious email attachment is a serious problem for me. (agree/disagree) [51]
SEV3: If my computer is infected by a virus as a result of opening a suspicious email attachment, my daily work could be negatively affected. (agree/disagree) [self]

Perceived benefits (BEN)
BEN1: Checking if the sender and subject make sense is (definitely/not) effective in preventing viruses from infecting my computer. [self]
BEN2: Checking if the filename of the email attachment makes sense is (definitely/not) effective in preventing viruses from infecting my computer. [self]
BEN3: Exercising care before opening email attachments is (definitely/not) effective in preventing viruses from infecting my computer. [self]

Perceived barriers (BAR)
BAR1: Exercising care when reading emails with attachments is inconvenient. (agree/disagree) [self]
BAR2: Exercising care when reading emails with attachments is time-consuming. (agree/disagree) [6,51]
BAR3: Exercising care when reading emails with attachments would require considerable investment of effort other than time. (agree/disagree) [51]
BAR4: Exercising care when reading emails with attachments would require starting a new habit, which is difficult. (agree/disagree) [6]

Cues to action (CUE)
CUE1: My organization distributes security newsletters or articles. (never/always) [self]
CUE2: My organization organizes security talks. (never/always) [self]
CUE3: My organization's IT helpdesk sends out alert messages/emails concerning security. (never/always) [self]
CUE4: My organization constantly reminds me to practice computer security. (agree/disagree) [self]

General security orientation (GEN)
GEN1: I read information security bulletins or newsletters. (agree/disagree) [self]
GEN2: I am concerned about security incidents and try to take action to prevent them. (agree/disagree) [25]
GEN3: I am interested in information about computer security. (agree/disagree) [25]
GEN4: I am constantly mindful about computer security. (agree/disagree) [self]

Self-efficacy (SEF)
SEF1: I am confident of recognizing a suspicious email. (agree/disagree) [self]
SEF2: I am confident of recognizing suspicious email headers. (agree/disagree) [self]
SEF3: I am confident of recognizing suspicious email attachment filename. (agree/disagree) [self]
SEF4: I can recognize a suspicious email attachment even if there was no one around to help me. (agree/disagree) [7,12]

Technical controls (CON1)
CON1: My organization ensures that my computer is protected from viruses by installing anti-virus software on my computer and/or the email server. (agree/disagree) [self]

Security familiarity (CON2)
CON2: How would you rate yourself in terms of familiarity with computer security practices? (very familiar/not at all familiar) [self]
2.4. Health belief model
A popular expectancy-value model used in healthcare is the health
belief model. It is one of the earliest comprehensive attempts to
explain healthcare behavior based on expectancy-value principles
[40]. It has been widely applied to all types of healthcare behavior,
such as contraceptive use, diet, and exercise. It has also been applied in
other diverse areas, such as preventive behavior against piracy threat
facing US firms [22] and emigration intention [20]. The model appears
to have implications for work motivations as well as a broad range of
human behaviors [49].
The health belief model identifies two considerations in an
individual's decision to adopt healthcare behavior in response to
the threat of illness, i.e., perceptions of illness threat and
evaluation of behavior to resolve this threat. Perception of illness
threat depends on two beliefs, i.e., the perceived susceptibility to
the illness and perceived severity of the illness. Evaluation of
behavior depends on assessing the perceived benefits of the
healthcare behavior to prevent the illness and the perceived barriers to performing the preventive healthcare behavior in order to compute the perceived net benefit [13]. Apart from perceived susceptibility, perceived severity, perceived benefits and perceived barriers, three other variables included in the health belief model are self-efficacy, cues to action, and general health orientation. Self-efficacy is a person's self-confidence in his ability to perform a behavior. This concept originates from the social cognitive theory [5] and describes individuals' responses to the challenges of changing habitual unhealthy behaviors. Cues to action are triggers that make the individual take action, such as health education and advice from others [24]. General health orientation refers to the individual's predisposition to healthcare behavior [49]. This construct captures the individual's tendency towards performing healthy behaviors.

Table 2
Demographics of respondents

Age: 20–29 (54.6%), 30–39 (33.1%), 40–49 (10%), 50 and above (2.3%)
Gender: Male (50.7%), Female (49.3%)
Job title: Senior management (2.2%), Middle management (15.7%), First-level supervisor (20.9%), Technician (6.7%), Analyst (16.4%), Administrative support (17.2%), Others (20.8%)
Functional area of job: Accounting (2.2%), Administration (9.7%), Information Technology (47.8%), R&D (9.7%), Operations (9.7%), Marketing and Sales (8.2%), Others (12.6%)
Job tenure at current organization: Less than 1 year (3.1%), 1–2 years (45%), 3–5 years (25.2%), 6–10 years (19.8%), 11–20 years (3.8%), More than 20 years (2.3%)
Industry type of organization: Government (19.4%), Education (18.7%), Finance/Banking (3.0%), Information Technology (34.3%), Telecommunications (6.0%), Health/Medical (2.2%), Military (1.5%), Others (14.9%)
Organization size: 1–20 (6.0%), 21–50 (13.4%), 51–100 (7.5%), 101–500 (9.7%), 501–1000 (11.2%), More than 1000 (52.2%)

Table 3
Descriptive statistics of constructs and inter-construct correlations (a)

Construct   Mean   SD     BEH     SUS     SEV     BEN     BAR     GEN     CUE     SEF
BEH         6.03   0.82   0.56
SUS         4.86   1.28   0.41    0.76
SEV         5.42   1.05   0.33    0.36    0.64
BEN         5.56   0.98   0.53    0.31    0.39    0.63
BAR         3.64   1.38   −0.07   0.14    0.16    0.04    0.72
GEN         5.22   1.16   0.17    0.10    0.22    0.09    −0.05   0.78
CUE         4.96   1.44   −0.04   −0.11   0.23    0.05    0.05    0.36    0.80
SEF         5.22   1.14   0.40    0.08    0.05    0.11    −0.15   0.16    −0.01   0.78

(a) Square root values of average variance extracted are indicated on the diagonal cells.
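Table 3's footnote points to the standard discriminant-validity reading of the diagonal: the square root of each construct's average variance extracted (AVE) should exceed that construct's correlations with every other construct (the Fornell-Larcker criterion). The Python sketch below runs that check on the values transcribed from Table 3; it is our illustration, not code from the paper.

# Fornell-Larcker discriminant validity check using the values in Table 3.
# Illustrative sketch only: the paper reports the table, not this code.

constructs = ["BEH", "SUS", "SEV", "BEN", "BAR", "GEN", "CUE", "SEF"]

# Lower-triangular entries of Table 3. Diagonal entries are sqrt(AVE);
# off-diagonal entries are inter-construct correlations.
lower = [
    [0.56],
    [0.41, 0.76],
    [0.33, 0.36, 0.64],
    [0.53, 0.31, 0.39, 0.63],
    [-0.07, 0.14, 0.16, 0.04, 0.72],
    [0.17, 0.10, 0.22, 0.09, -0.05, 0.78],
    [-0.04, -0.11, 0.23, 0.05, 0.05, 0.36, 0.80],
    [0.40, 0.08, 0.05, 0.11, -0.15, 0.16, -0.01, 0.78],
]

def corr(i, j):
    # Correlation between constructs i and j, read from the lower triangle.
    return lower[max(i, j)][min(i, j)]

for i, name in enumerate(constructs):
    sqrt_ave = lower[i][i]
    # Largest absolute correlation of this construct with any other construct.
    max_r = max(abs(corr(i, j)) for j in range(len(constructs)) if j != i)
    verdict = "ok" if sqrt_ave > max_r else "violated"
    print(f"{name}: sqrt(AVE) = {sqrt_ave:.2f}, max |r| = {max_r:.2f} -> {verdict}")

On these numbers every construct passes; the tightest margin is BEH, whose sqrt(AVE) of 0.56 only slightly exceeds its 0.53 correlation with BEN.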
The health belief model is comprehensive in including a number of
explanatory constructs that are not represented in IS adoption or other
healthcare theories, but important in computer security practice. The
constructs perceived susceptibility, perceived severity, cues to action
and general health orientation are not present in prior IS adoption
theories, while cues to action and general health orientation are not
present in other healthcare theories. One of the most important
components of individual security behavior is the effective management of risk. Risk management requires the identification of threats
and determination of the likelihood and impact of threats [44]. This is
similar to the concepts of perceived susceptibility and perceived
severity in the health belief model.
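In the usual risk-management shorthand (our gloss; the paper states this correspondence only in prose),

\[ \text{Risk} \approx \Pr(\text{threat occurs}) \times \text{Impact}, \]

with the likelihood term playing the role of perceived susceptibility and the impact term playing the role of perceived severity.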
Further, the constructs of general health orientation and cues to
action are likely to be relevant to computer security behavior. Cues to
action could include the organization's security awareness efforts.
General health orientation is analogous to an individual's general
orientation or predisposition to security. Applying this idea to the
security domain, this construct is mapped to an individual's "security-consciousness" or general security orientation. To the best of our
knowledge, these two constructs have not been explored in past
security behavior studies. Hence, we apply the health belief model as
an overarching theory to explain a user's computer security behavior
in an organization. In the next section, we elaborate on our research
model.
3. Research model
Fig. 1 presents our research model. While most studies based on
the health belief model consider behavioral intention or likelihood of
behavior as the dependent variable, we use self-reported actual
behavior instead. Although this variable is subject to self-report bias, it
is often easier to self-assess than intention and is more objective. This
approach has been taken in a few previous empirical preventive
healthcare studies (e.g., [25]) by asking respondents what behaviors
they engage in. Hence, self-reported computer security behavior
constitutes our dependent variable. We define each construct and
present the related hypotheses below.
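Read as a regression, the research model corresponds roughly to a moderated specification of the form below; this is our schematic based on Fig. 1 and the abstract (with perceived severity as moderator), not an equation given by the authors:

\[ \mathrm{BEH} = \beta_0 + \beta_1 \mathrm{SUS} + \beta_2 \mathrm{SEV} + \beta_3 \mathrm{BEN} + \beta_4 \mathrm{BAR} + \beta_5 \mathrm{CUE} + \beta_6 \mathrm{GEN} + \beta_7 \mathrm{SEF} + \mathrm{SEV} \times (\gamma_1 \mathrm{BEN} + \gamma_2 \mathrm{CUE} + \gamma_3 \mathrm{GEN} + \gamma_4 \mathrm{SEF}) + \varepsilon , \]

where the \(\beta\) terms are main effects on self-reported behavior and the \(\gamma\) terms are the severity interactions reported in the abstract.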
3.1. Perceived susceptibility
In the health belief model, this construct refers to the “subjective
risks of contracting a condition” [39, p. 99]. Individuals vary widely in
their perceived susceptibility. For example, an individual may deny
any possibility of contracting the...