CISSP Protect Privacy – Bk1D2T4St1

Protect Privacy

One way to think about privacy is that it is a reason we provide security: confidentiality, one part of the security triad, is what delivers privacy. Of course, security also ensures information integrity and availability.
Today, individual privacy is a growing global concern. It is helpful to have a general knowledge of the building blocks that led to the present day, in which data subjects' expectations of confidentiality, consent, and control over how their data is collected, stored, and used inform security policy and procedures. From a security perspective, it is vital to have a command of the international regulations dealing with privacy that are relevant to your organization.

Cross-Border Privacy and Data Flow Protection

Many countries around the world, especially in Europe and Latin America, have information privacy or data protection laws. In the United States, there is no comprehensive information privacy law; instead, several sectoral or industry-based laws are used. The next section provides a sample of leading privacy regulations that serve as frameworks for translating expectations of individual privacy into legal standards. Privacy law has an increasingly large domain of issues to address, including domestic surveillance, identification systems, social networking sites, video surveillance, DNA databases, and airport body scanners. Advances in technology will continue to force privacy regulations to adapt at a fast pace, in line with the impact of the Internet, advances in cryptography, and any other technological progress that affects privacy.

European Union

As contemporary privacy protection regulation took shape, the European Union (EU) published a comprehensive set of articles and principles for data protection rules in 1995. The central objective of the Data Directive (Directive 95/46/EC) was the protection of individuals with regard to the processing of personal data and the limitation of movement of such data. The general principle of the directive was that personal data should not be processed at all, except when certain conditions are met. These conditions fall into three categories: transparency, legitimate purpose, and proportionality. For the EU, the directive is directly attached to privacy and human rights law. For more than two decades, the directive was the prevailing privacy law, though each member of the EU implemented it differently. Advances in technology in an increasingly connected world exposed gaps in how privacy was protected. For these reasons, the EU General Data Protection Regulation (GDPR) was created and adopted in April 2016. The Data Directive was repealed, and the GDPR became enforceable on May 25, 2018.
The GDPR is intended to reform the Data Directive to strengthen online privacy rights in a global economic environment. There are some significant changes worth highlighting. The following list is not comprehensive:

  • A single set of rules on data protection, valid across the EU. Each EU nation has a national protection authority. However, rules are streamlined under a single market construct to reduce administrative complexity, cost, and burden for compliance.
  • Strengthening individuals’ rights so that the collection and use of personal data is limited to the minimum necessary.
  • Rights to data portability to more easily transfer personal data from one service provider to another.
  • The “right to be forgotten” is explicitly recognized. The Data Directive allowed individuals to object to processing that caused unwarranted and substantial damage or distress. The GDPR expands this right, allowing a data subject to demand erasure of their personal data anywhere in the EU when there is no compelling reason for its continued processing.
  • EU rules continue to apply to organizations outside the EU that handle the data of EU citizens, and the GDPR clarifies and strengthens data breach notification requirements.
  • EU enforcement is increased, with fines and penalties of up to 2 percent or 4 percent of an organization’s annual global turnover, depending on the severity of the data protection infraction.
  • More effective enforcement of the rules, with police involvement and criminal prosecution tied to violations of privacy rules.
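The fine exposure can be sketched with simple arithmetic. GDPR Article 83 defines two tiers: up to 2 percent of annual global turnover or EUR 10 million (whichever is greater) for lower-tier infractions, and up to 4 percent or EUR 20 million for upper-tier ones. The function name and turnover figure below are illustrative only:

```python
# Illustrative sketch of GDPR maximum fine exposure (Article 83).
# Tier thresholds are from the regulation; the example turnover
# figure is invented.

def max_fine_eur(annual_turnover_eur: float, upper_tier: bool) -> float:
    """Return the maximum possible fine for a given global turnover."""
    if upper_tier:
        return max(0.04 * annual_turnover_eur, 20_000_000)
    return max(0.02 * annual_turnover_eur, 10_000_000)

# A hypothetical company with EUR 2 billion turnover facing an
# upper-tier infraction:
exposure = max_fine_eur(2_000_000_000, upper_tier=True)
print(f"Maximum exposure: EUR {exposure:,.0f}")  # EUR 80,000,000
```

Note the "whichever is greater" logic: for small organizations the flat caps of EUR 10 million or EUR 20 million dominate, while for large multinationals the percentage of turnover does.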

Under the Data Directive, global companies could achieve safe harbor status. This meant that the EU could determine that a foreign organization had sufficient data protection policies and procedures; although not directly subject to the Data Directive, the organization voluntarily demonstrated an ability to be compliant with it. The U.S. Department of Commerce oversaw the list and required registrants to renew annually. With the advent of the GDPR, safe harbor is no longer the standard. Under the GDPR, companies have to attest and demonstrate that they meet the higher standards of the new guidance. If the EU evaluates a company’s standards and procedures as sufficient, the company can be placed on a list known as the Privacy Shield list.

Safe Harbor Transition to Privacy Shield

The International Safe Harbor Privacy Principles were developed between 1998 and 2000 and were meant to provide a level of assurance that the privacy controls of U.S. organizations were adequate to protect EU citizen data, per EU Data Directive standards. The Safe Harbor provisions allowed self-certification by U.S. companies. The U.S. Department of Commerce developed privacy frameworks to guide compliance and oversaw the program for U.S. interests. The U.S. control frameworks were deemed adequate in 2000. However, in 2015, the European Commission ruled that a new framework for transatlantic data flows was needed. This action was in response to U.S. entities, including the U.S. government, repeatedly failing to follow the directive. The EU determined that a stricter model was necessary, and the prevailing safe harbor agreement was ruled invalid.

Note Safe harbor under the Data Directive differs from the concept of safe harbor under U.S. HIPAA. Under U.S. law, if an organization can demonstrate that the risk of unauthorized disclosure of data is not present, a data incident can be considered nonreportable to regulators and affected people. A common example is a lost laptop with protected health information stored on it. If the laptop uses an encryption algorithm validated under FIPS 140-2, the loss is granted safe harbor and is not reportable, because the encryption renders the data unreadable to unauthorized parties. Safe harbor in this context is not affected by the passage of GDPR.
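The reportability decision in the note can be modeled as a minimal sketch. The class and field names below are hypothetical illustrations, not part of any real compliance API:

```python
# Hypothetical sketch of a HIPAA breach-reportability check for a
# lost device, based on the safe harbor concept described above.

from dataclasses import dataclass

@dataclass
class LostDevice:
    contains_phi: bool            # does the device store PHI at all?
    encrypted_at_rest: bool       # is the storage encrypted?
    fips_140_2_validated: bool    # is the encryption module validated?

def incident_reportable(device: LostDevice) -> bool:
    """A loss qualifies for safe harbor (non-reportable) only when
    the PHI is rendered unreadable by validated encryption."""
    if not device.contains_phi:
        return False
    safe_harbor = device.encrypted_at_rest and device.fips_140_2_validated
    return not safe_harbor

laptop = LostDevice(contains_phi=True, encrypted_at_rest=True,
                    fips_140_2_validated=True)
print(incident_reportable(laptop))  # False: safe harbor applies
```

A real determination involves legal review and more factors than three booleans; the point of the sketch is that validated encryption is the pivot on which reportability turns.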
In 2016, the European Commission and the United States launched a new framework for transatlantic data flows. This new arrangement is called the EU-US Privacy Shield and replaces the International Safe Harbor Agreement. However, the intent is still the same: to enable the transfer of personal data from EU citizens to U.S. commercial
organizations. The agreement makes data transfer easier while giving data subjects the privacy protection they expect from their own nations.

Note that the EU-US Privacy Shield designation is specific to the United States. The EU has approved other data transfer arrangements for other countries. A few countries are considered by the EU to have adequate data protection levels, so data transfers to them are permitted; these include Argentina, Canada, Israel, New Zealand, Switzerland, and Uruguay. For companies in countries that are not explicitly approved, adequacy is achieved through standard contractual clauses in individual agreements with EU data controllers, called model clauses, that meet EU requirements. Another component of compliance for non–EU companies is the use of binding corporate rules that adhere to EU requirements. Binding corporate rules are like a code of conduct for a company and its information protection practices. In general, the privacy protections in place for either a corporation, especially a multinational firm, or an entire nation undergo an EU adequacy determination before data transfer is approved.


Asia-Pacific Economic Cooperation Cross-Border Privacy Rules

To build trust in the cross-border flow of personal information, the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system was developed by APEC economies with input and assistance from industry and civil society. Currently, there are five participating APEC CBPR countries: the United States, Mexico, Japan, Canada, and the Republic of Korea. Businesses that agree to participate must adopt the data privacy policies outlined in the APEC CBPR. An independent APEC CBPR Accountability Agent—an entity authorized by APEC—assesses participating businesses for ongoing compliance, and the agent’s assessments are enforceable by law. In addition to the five countries already participating, several more are expected to agree to the framework, and the European Commission is discussing certification of the GDPR within the APEC CBPR system.

U.S. Data Privacy Laws and Guidelines

In the United States, data privacy is handled as a functional concern or regulated within an industry. A typical category of sensitive data is personally identifiable information (PII). PII is any information that can be used to identify, locate, or contact a specific individual, either by itself or combined with other easily accessible sources. Government organizations have regulations requiring adequate safeguarding of PII. Commercial organizations may be subject to those regulations, but also to contractual requirements. Some of the most prominent U.S. privacy regulations include the Fair Credit Reporting Act (FCRA), the Gramm-Leach-Bliley Act (GLBA), the Privacy Act, the Children’s Online Privacy Protection Act (COPPA), and the Family Educational Rights and Privacy Act (FERPA). Drawing on NIST SP 800-122, some common examples of PII include the following:

  • Name, such as full name, maiden name, mother’s maiden name, or alias
  • Personal identification number, such as Social Security number (SSN), passport number, driver’s license number, taxpayer identification number, patient identification number, or financial account or credit card number
  • Address information, such as street address or email address
  • Asset information, such as Internet Protocol (IP) or Media Access Control (MAC) address or other host-specific persistent static identifier that consistently links to a particular person or small, well-defined group of people
  • Telephone numbers, including mobile, business, and personal numbers
  • Personal characteristics, including photographic image (especially of face or other distinguishing characteristic), X-rays, fingerprints, or other biometric image or template data (e.g., retina scan, voice signature, facial geometry)
  • Information identifying personally owned property, such as vehicle registration number or title number and related information.
  • Information about an individual that is linked or linkable to one of the previous (e.g., date of birth, place of birth, race, religion, weight, activities, geographical indicators, employment information, medical information, education information, financial information)

This list is broad by design. Some of these identifiers are not considered sensitive by themselves.

For example, an address found in a public listing is not, by itself, considered sensitive. For information that may be available publicly, such as names, addresses, and email addresses, it is important to know how that information was obtained. Nonpublic personal information (NPI) covers identifiers that would otherwise be considered public when they are derived, in whole or in part, from personally identifiable financial information that is not publicly available, such as financial account numbers.
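To make the identifier categories concrete, a naive scan for two of them might look like the following. This is a toy sketch; real PII discovery tooling goes far beyond two regular expressions, and the patterns here are deliberately simplified:

```python
# Illustrative sketch of a simple PII pattern scan over free text.
# Covers only two of the NIST SP 800-122 identifier categories
# (personal identification numbers and email addresses).

import re

PII_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_for_pii(text: str) -> dict:
    """Return matches per identifier category found in the text."""
    return {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}

sample = "Contact jane.doe@example.com; SSN on file: 123-45-6789."
hits = scan_for_pii(sample)
print(hits)  # {'ssn': ['123-45-6789'], 'email': ['jane.doe@example.com']}
```

A production scanner would also need to handle formatting variants, context (is a nine-digit string actually an SSN?), and the many other categories in the list above.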

In addition to PII, identifiers can also be composed of a subset of sensitive information defined and regulated by another law. For example, X-ray image information is also protected health information (PHI) and is subject to HIPAA. PHI is personally identifiable data related to an individual’s medical history, including, but not limited to, health status, healthcare provision, or healthcare payment. Examples include diagnosis codes (ICD-10), dates of treatment, images, genetic information, and DNA. PHI is interpreted rather broadly and includes any medical payment history or records of an individually identifiable patient produced or received by healthcare providers, health plan operators, and health clearinghouses. PHI may relate to the past, present, or future physical or mental health or condition of an individual. In general, PHI can be used to identify a specific individual.

Additionally, PHI covers information maintained or transmitted in any form, such as electronic, paper, or spoken. In the United States, organizations that handle PHI are subject to HIPAA. Specific sections of the law include a Privacy Rule and a Security Rule that guide organizations on required and addressable (recommended but not mandatory) controls for the proper protection of PHI.

PII and PHI are regulated and of top concern in the United States because of the potential harm that loss, theft, or unauthorized access can have on individuals whose information is compromised. This is a consideration above and beyond the detrimental impact a data breach of PII or PHI can have on an organization in terms of reputational harm, financial costs, and operational loss.

The need to protect the privacy of PII and PHI has become increasingly important. The data can be used to commit crimes and disrupt the lives of individuals. For instance, a stolen medical record can be used to falsify an identity to open credit accounts and obtain medical care in the name of the affected person. In many cases, the affected person does not know their identity is being used fraudulently until they receive inexplicable bills or calls from credit agencies, or until their insurance maximum limits are reached. This impact is particularly acute for U.S. people with private health insurance.

The theft of data has been a high-profile news story over the last few years. Uber had the personal data of 57 million drivers and users stolen. Equifax was breached in 2017 and exposed the Social Security numbers, birth dates, and addresses of 143 million people. Between 1 and 3 billion users worldwide were potentially impacted by the data breach of Yahoo in 2013. The ease of committing the crimes, along with how difficult it is to catch cyber criminals, favors the adversary. With the increase in frequency and severity, global regulators have increased guidance, oversight, and enforcement to address privacy protections. Sanctions, fines, and penalties have increased in similar frequency and severity against organizations that fail to exercise due diligence and due care in handling sensitive information, concepts covered later in this chapter. The first step for you as a Certified Information Systems Security Professional (CISSP) in assuring proper information handling is to know the regulations applicable to your industry and jurisdiction and then take steps to align that guidance with organizational policy and procedures.

Note PHI may not have any traditional PII elements within the data set. For instance, the record may not include a Social Security number or a name. However, if multiple health data elements, such as diagnosis, age, gender, and date of treatment, provide sufficient information to identify a person, the record may be considered PHI and subject to HIPAA.
Tip Every organization that handles PHI on behalf of a healthcare organization is subject to HIPAA as a business associate. However, not every organization that collects, uses, and discloses health information is subject to HIPAA. Records related to education may contain health information but are subject to FERPA.
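The re-identification risk described in the note above can be illustrated by counting how many records share each combination of quasi-identifiers; a combination held by only one record can single that person out. The records below are invented for illustration:

```python
# Hedged sketch of why combined quasi-identifiers matter: count how
# many records share each (age, gender, diagnosis) combination.
# Invented data; real de-identification analysis (e.g., k-anonymity)
# is more involved.

from collections import Counter

records = [
    {"age": 42, "gender": "F", "diagnosis": "J45"},   # asthma
    {"age": 42, "gender": "F", "diagnosis": "J45"},
    {"age": 67, "gender": "M", "diagnosis": "E11"},   # type 2 diabetes
]

combos = Counter((r["age"], r["gender"], r["diagnosis"]) for r in records)
unique = [c for c, n in combos.items() if n == 1]
print(unique)  # [(67, 'M', 'E11')] -- this record is re-identifiable
```

This is the intuition behind k-anonymity: a record is safer when its quasi-identifier combination is shared by at least k other records.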

The Privacy Act of 1974 (U.S.)
Although enacted in 1974, the U.S. Privacy Act continues to play a foundational and relevant role in protecting individual privacy with respect to information collected, used, and stored by the U.S. government. Even though the act applies only to U.S. federal government agencies, the Privacy Act has certainly led to later privacy laws and regulations in both government and private-sector organizations, even where those regulations emphasized security practices. One example is HIPAA, which protects the privacy and security of information used in healthcare; another is the FCRA, enforced by the Federal Trade Commission to promote the accuracy, integrity, fairness, and privacy of consumer information.
The Privacy Act protects the creation, use, and maintenance of records that contain personal identifiers such as a name, Social Security number, or other identifying number or symbol. Individuals can seek access to information maintained on them and request corrections if something is in error. When someone makes a request, organizations subject to the Privacy Act must provide the individual with an accounting of disclosures documenting how the information has been shared. The Privacy Act does not allow disclosure of the sensitive information except under limited permissible uses. Where records are allowed to be shared, they must be registered in a System of Records Notice (SORN). The register is published in the U.S. Federal Register and posted to the Internet.

Fair Information Practice Principles
The Fair Information Practice Principles (FIPPs) are guidelines authored by the U.S. Federal Trade Commission (FTC). These guidelines are not enforceable by law. However, they are widely accepted as a necessary baseline for fair information practice in an electronic marketplace. Organizations that adhere to the FIPPs do so through self-regulation in an effort to maintain privacy-friendly, consumer-oriented data collection practices. The principles are categorized as follows:

  • Notice: Before collecting personal information, organizations should tell consumers about the information practices that the business follows.

The following information must be explicitly disclosed in the notification:

  • Identification of the entity collecting the data, the uses to which the data will be put, and any potential recipients of the data
  • The nature of the data collected and the means by which it is collected
  • Whether the data requested is provided voluntarily or required

  • The steps taken by the organization to ensure the confidentiality, integrity, and quality of the data

  • Consent: Consumers should be given choice and consent options to control how their data is used. Usually, consumers express their choice through an opt-in (affirmative consent) or opt-out (affirmative decline) selection. These choices determine whether and how an organization can use the data, especially beyond the initial purpose.
  • Access: Consumers must be able to participate at some level in the collection of their data. This means that the consumer has the ability to view the data collected as well as verify and contest its accuracy. The consumer must be able to participate in an inexpensive and timely manner.
  • Integrity: Organizations that gather personal information have an obligation to ensure that it is accurate and secure. One way to maintain the accuracy, and also abide by the access principle, is to have consumers verify the information. Integrity and security are also achieved by limiting access to the information internally and externally as part of this principle.
  • Enforcement: Although the FIPPs are not law, they can be enforced, and consumers are given methods of redress. There are three means of enforcing the FIPPs: self-regulation by information collectors or an appointed regulatory body; private remedies that give individuals whose information has been misused a civil cause of action to sue violators; and government enforcement, which can include civil and criminal penalties.
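The opt-in versus opt-out distinction described under Consent can be sketched as a small decision function. This is an illustrative model of the two defaults, not a legal test:

```python
# Illustrative sketch of opt-in vs. opt-out consent defaults.
# Under opt-in, use is allowed only after affirmative consent;
# under opt-out, use is allowed until the consumer declines.

def may_use_data(model: str, responded: bool, said_yes: bool) -> bool:
    """Return whether the organization may use the data under the
    given consent model and the consumer's response (if any)."""
    if model == "opt-in":
        return responded and said_yes
    if model == "opt-out":
        # Only an affirmative decline blocks use.
        return not (responded and not said_yes)
    raise ValueError(f"unknown consent model: {model}")

# A silent consumer blocks use under opt-in but not under opt-out:
print(may_use_data("opt-in", responded=False, said_yes=False))   # False
print(may_use_data("opt-out", responded=False, said_yes=False))  # True
```

The asymmetry in the defaults is the whole point: opt-in places the burden on the organization to obtain consent, while opt-out places the burden on the consumer to refuse.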

Personal Information Protection and Electronic Documents Act (Canada)
The Personal Information Protection and Electronic Documents Act (PIPEDA) applies to all private-sector organizations in Canada that are federally regulated and use personal information for commercial purposes. PIPEDA does not apply to government agencies or to organizations that do not engage in commercial, for-profit activities.
It sets out the ground rules for how businesses must handle personal information in the course of their commercial activity. PIPEDA establishes an obligation that any collection, use, or disclosure of personal information must only be for purposes that a reasonable person would deem appropriate given the circumstances.

PIPEDA contains 10 fair information principles:

  • Accountability: An organization is responsible for personal information under its control. It must appoint someone to be accountable for its compliance with these fair information principles.
  • Identifying Purposes: The purposes for which the personal information is being collected must be identified by the organization before or at the time of collection.
  • Consent: The knowledge and consent of the individual are required for the collection, use, or disclosure of personal information, except where inappropriate.
  • Limiting Collection: The collection of personal information must be limited to that which is needed for the purposes identified by the organization. Information must be collected by fair and lawful means.
  • Limiting Use, Disclosure, and Retention: Unless the individual consents otherwise or it is required by law, personal information can only be used or disclosed for the purposes for which it was collected. Personal information must only be kept as long as required to serve those purposes.
  • Accuracy: Personal information must be as accurate, complete, and up-to-date as possible to properly satisfy the purposes for which it is to be used.
  • Safeguards: Personal information must be protected by appropriate security relative to the sensitivity of the information.
  • Openness: An organization must make detailed information about its policies and practices relating to the management of personal information publicly and readily available.
  • Individual Access: Upon request, an individual must be informed of the existence, use, and disclosure of their personal information and be given access to that information. An individual shall be able to challenge the accuracy and completeness of the information and have it amended as appropriate.
  • Challenging Compliance: An individual shall be able to challenge an organization’s compliance with the previous principles. The challenge should be addressed to the person accountable for the organization’s compliance with PIPEDA, usually its chief privacy officer.

A variety of circumstances may affect the application of PIPEDA. For example, in some provinces, separate rules that limit the applicability of the federal law apply to municipalities, universities, schools, and hospitals. Similarly, if an entity is doing business exclusively inside Alberta, British Columbia, or Quebec, their respective provincial privacy laws apply. Consequently, applying the appropriate privacy protections to the information processed by an organization in Canada, as in most jurisdictions, requires addressing the unique regulatory environment in which the organization operates.

Nevertheless, PIPEDA has national reach and impact. PIPEDA brought Canada into alignment with the EU Data Directive, and under the newer GDPR, Canada holds a partial adequacy designation with respect to trust in data transfers from the EU to Canada.
Tip An exception to PIPEDA is information collected, used, or stored for the purpose of journalism, art, or literary material.
