Tuesday, August 14, 2018

What's a Certified Information Privacy Manager?

Not long ago, I became a Certified Information Privacy Manager.  "What is that, and why did you do it?" you ask.
[Image: IAPP CIPM logo - green circle with white text "CIPM IAPP"]

About three years ago, after practicing privacy law for a few years, I became a Certified Information Privacy Professional.  I have since left private practice to become in-house counsel.  As an in-house lawyer, one of my challenges is helping my internal clients operationalize the legal advice I provide at the point where "the rubber meets the road."  As a result, I became interested in privacy management--the practical implementation of privacy law and policies.  I studied the comprehensive guidebook, Privacy Program Management, published by the International Association of Privacy Professionals, and then decided to become a Certified Information Privacy Manager (CIPM).

This summary of the CIPM is for others who may be interested in pursuing the certification.

Studying privacy program management means learning, at a very granular level, the processes organizations use to implement privacy, along with the benefits and drawbacks of each approach.  The study materials, and the comprehensive exam given to CIPM candidates, cover the following areas:


I. Privacy Program Governance
A. Organization Level
a. Create a company vision
i. Acquire knowledge on privacy approaches
ii. Evaluate the intended objective
iii. Gain executive sponsor approval for this vision
b. Establish a privacy program
i. Define program scope and charter
ii. Identify the source, types, and uses of personal information (PI) within the organization and the applicable laws
iii. Develop a privacy strategy
1. Business alignment
a. Finalize the operational business case for privacy
b. Identify stakeholders
c. Leverage key functions
d. Create a process for interfacing within organization
e. Align organizational culture and privacy/data protection objectives
f. Obtain funding/budget for privacy and the privacy team
2. Develop a data governance strategy for personal information (collection, authorized use, access, destruction)
3. Plan inquiry/complaint handling procedures (customers, regulators, etc.)
c. Structure the privacy team
i. Governance models
1. Centralized
2. Distributed
3. Hybrid
ii. Establish the organizational model, responsibilities and reporting structure appropriate to the size of the organization
1. Large organizations
a. Chief privacy officer
b. Privacy manager
c. Privacy analysts
d. Business line privacy leaders
e. “First responders”
2. Small organizations/sole data protection officer (DPO), including when the DPO role is not the individual's only job
iii. Designate a point of contact for privacy issues
iv. Establish/endorse the measurement of professional competency
B. Develop the Privacy Program Framework
a. Develop organizational privacy policies, standards and/or guidelines
b. Define privacy program activities
i. Education and awareness
ii. Monitoring and responding to the regulatory environment
iii. Internal policy compliance
iv. Data inventories, data flows, and classification
v. Risk assessment (Privacy Impact Assessments [PIAs], etc.)
vi. Incident response and process, including jurisdictional regulations
vii. Remediation
viii. Program assurance, including audits
C. Implement the Privacy Policy Framework
a. Communicate the framework to internal and external stakeholders
b. Ensure continuous alignment to applicable laws and regulations to support the development of an organizational privacy program framework
i. Understand applicable national laws and regulations (e.g., GDPR)
ii. Understand applicable local laws and regulations
iii. Understand penalties for noncompliance with laws and regulations
iv. Understand the scope and authority of oversight agencies (e.g., Data Protection Authorities, Privacy Commissioners, Federal Trade Commission, etc.)
v. Understand privacy implications of doing business in or with countries with inadequate, or without, privacy laws
vi. Maintain the ability to manage a global privacy function
vii. Maintain the ability to track multiple jurisdictions for changes in privacy law
viii. Understand international data sharing arrangements and agreements
D. Metrics
a. Identify intended audience for metrics
b. Define reporting resources
c. Define privacy metrics for oversight and governance per audience
i. Compliance metrics (examples, will vary by organization)
1. Collection (notice)
2. Responses to data subject inquiries
3. Use
4. Retention
5. Disclosure to third parties
6. Incidents (breaches, complaints, inquiries)
7. Employees trained
8. PIA metrics
9. Privacy risk indicators
10. Percent of company functions represented by governance mechanisms
ii. Trending
iii. Privacy program return on investment (ROI)
iv. Business resiliency metrics
v. Privacy program maturity level
vi. Resource utilization
d. Identify systems/application collection points

II. Privacy Operational Life Cycle
A. Assess Your Organization
a. Document current baseline of your privacy program
i. Education and awareness
ii. Monitoring and responding to the regulatory environment
iii. Internal policy compliance
iv. Data, systems and process assessment
1. Map data inventories, flows and classification
2. Create “record of authority” of systems processing personal information within the organization
3. Map and document data flow in systems and applications
4. Analyze and classify types and uses of data
v. Risk assessment (PIAs, etc.)
vi. Incident response
vii. Remediation
viii. Determine desired state and perform gap analysis against an accepted standard or law
ix. Program assurance, including audits
b. Processors and third-party vendor assessment
i. Evaluate processors and third-party vendors, insourcing and outsourcing privacy risks
1. Privacy and information security policies
2. Access controls
3. Where personal information is being held
4. Who has access to personal information
ii. Understand and leverage the different types of relationships
1. Internal audit
2. Information security
3. Physical security
4. Data protection authority
iii. Risk assessment
1. Type of data being outsourced
2. Location of data
3. Implications of cloud computing strategies
4. Legal compliance
5. Records retention
6. Contractual requirements (incident response, etc.)
7. Establish minimum standards for safeguarding information
iv. Contractual requirements
v. Ongoing monitoring and auditing
c. Physical assessments
i. Identify operational risk
1. Data centers
2. Physical access controls
3. Document destruction
4. Media sanitization (e.g., hard drives, USB/thumb drives, etc.)
5. Device forensics
6. Fax machine security
7. Imaging/copier hard drive security controls
d. Mergers, acquisitions and divestitures
i. Due diligence
ii. Risk assessment
e. Conduct analysis and assessments, as needed or appropriate
i. Privacy Threshold Analysis (PTAs) on systems, applications and processes
ii. Privacy Impact Assessments (PIAs)
1. Define a process for conducting Privacy Impact Assessments
a. Understand the life cycle of a PIA
b. Incorporate PIA into system, process, product life cycles
B. Protect
a. Data life cycle (creation to deletion)
b. Information security practices
i. Access controls for physical and virtual systems
1. Access control on need to know
2. Account management (e.g., provision process)
3. Privilege management
ii. Technical security controls
iii. Implement appropriate administrative safeguards
c. Privacy by Design
i. Integrate privacy throughout the system development life cycle (SDLC)
ii. Establish privacy gates/PIAs-Data Protection Impact Assessments (DPIAs) as part of the standard process, system development framework
C. Sustain
a. Measure
i. Quantify the costs of technical controls
ii. Manage data retention with respect to the organization’s policies
iii. Define the methods for physical and electronic data destruction
iv. Define roles and responsibilities for managing the sharing and disclosure of data for internal and external use
b. Align
i. Integrate privacy requirements and representation into functional areas across the organization
1. Information security
2. IT operations and development
3. Business continuity and disaster recovery planning
4. Mergers, acquisitions and divestitures
5. Human resources
6. Compliance and ethics
7. Audit
8. Marketing/business development
9. Public relations
10. Procurement/sourcing
11. Legal and contracts
12. Security/emergency services
13. Finance
14. Others
c. Audit
i. Align privacy operations to an internal and external compliance audit program
1. Knowledge of audit processes
2. Align to industry standards
ii. Audit compliance with privacy policies and standards
iii. Audit data integrity and quality
iv. Audit information access, modification and disclosure accounting
v. Communicate audit findings with stakeholders
d. Communicate
i. Awareness
1. Create awareness of the organization’s privacy program internally and externally
2. Ensure policy flexibility in order to incorporate legislative/regulatory/market requirements
3. Develop internal and external communication plans to ingrain organizational accountability
4. Identify, catalog and maintain documents requiring updates as privacy requirements change
ii. Targeted employee, management and contractor training
1. Privacy policies
2. Operational privacy practices (e.g., standard operating instructions), such as
a. Data creation/usage/retention/disposal
b. Access control
c. Reporting incidents
d. Key contacts
e. Monitor
i. Environment (e.g., systems, applications) monitoring
ii. Monitor compliance with established privacy policies
iii. Monitor regulatory and legislative changes
iv. Compliance monitoring (e.g. collection, use and retention)
1. Internal audit
2. Self-regulation
3. Retention strategy
4. Exit strategy
D. Respond
a. Information requests
i. Access
ii. Redress
iii. Correction
iv. Managing data integrity
b. Privacy incidents
i. Legal compliance
1. Preventing harm
2. Collection limitations
3. Accountability
4. Monitoring and enforcement
ii. Incident response planning
1. Understand key roles and responsibilities
a. Identify key business stakeholders
1. Information security
2. Legal
3. Audit
4. Human resources
5. Marketing
6. Business development
7. Communications and public relations
8. Other
b. Establish incident oversight teams
2. Develop a privacy incident response plan
3. Identify elements of the privacy incident response plan
4. Integrate privacy incident response into business continuity planning
iii. Incident detection
1. Define what constitutes a privacy incident
2. Identify reporting process
3. Coordinate detection capabilities
a. Organization IT
b. Physical security
c. Human resources
d. Investigation teams
e. Vendors
iv. Incident handling
1. Understand key roles and responsibilities
2. Develop a communications plan to notify executive management
v. Follow incident response process to ensure meeting jurisdictional, global and business requirements
1. Engage privacy team
2. Review the facts
3. Conduct analysis
4. Determine actions (contain, communicate, etc.)
5. Execute
6. Monitor
7. Review and apply lessons learned
vi. Identify incident reduction techniques
vii. Incident metrics—quantify the cost of a privacy incident


The examination lasts two hours, and can be taken at one of the approved testing centers around the world.   Examinees who pass are notified instantly, and certificates are delivered within a matter of weeks.   Once certified, CIPMs must complete a significant amount of continuing privacy education each year in order to maintain the certification, which you can read about here.

If you would like to learn more about the CIPM credential, check out the Resource List, Body of Knowledge, Candidate Handbook, Exam Blueprint, and preparation guide.   Good luck!


Thursday, June 28, 2018

California Enacts Sweeping Privacy Law to Avoid Vote on Ballot Proposal in November

Well, they did it.  

California is re-shaping U.S. privacy law again.  At the last possible minute, California lawmakers enacted a statute and persuaded the proponent of a strict privacy ballot initiative to withdraw the proposal.  

Today, the California legislature passed, and the California Governor signed, Assembly Bill 375 (the "California Consumer Privacy Act of 2018").  I wrote last week about the proposed bill, which resembles the ballot initiative but is more business-friendly than the ballot proposal in most (but not all) ways (I described the ballot proposal here). The proponents of the ballot initiative withdrew their proposal on the final day before it would have officially qualified for the November ballot.

[Image: laptop computer with an eye on the screen and the word "California"]

The world now has until January 1, 2020 to decide how to play by the new rules in California.

Rumors are already swirling on social media that the statute could be amended (i.e., weakened) before it becomes effective. (Statutes enacted by the California legislature can be more easily amended than laws approved by voters at the ballot box.) 

You can read my initial thoughts on the bill in my earlier post.  I intend to provide a more detailed analysis soon. 

Saturday, June 23, 2018

California Lawmakers Make Last-Ditch Effort to Preempt Privacy Ballot Proposal

I recently wrote about a ballot initiative in California that, if approved by voters in November, would dramatically change privacy law in California (and very likely the rest of the United States).  Two days ago, a bill was introduced in the California legislature in an attempt to preempt the ballot initiative.  (Remember how I keep telling you how quickly things move in privacy law?!?!)




[Image: laptop with an eyeball and the word "California"]

[If you have not already, read my summary and analysis of the ballot initiative first.]

California's deadline for collecting signatures for initiatives to be included on the fall ballot is June 28 (next week).  The Consumer Right to Privacy Act of 2018 (v.2, No. 17-0039) already has far more signatures than necessary and is almost certain to be eligible for inclusion on the ballot when the deadline arrives next week.  Many industries, and specifically the digital advertising industry, are scrambling to address it before it causes massive disruption (and opportunity?) in the digital marketing world.

Two days ago, on June 21, California lawmakers (from each house) introduced AB 375 in the Assembly, titled "The California Consumer Privacy Act of 2018" (which, if you are paying attention, you will notice has the same title as the ballot proposal...probably not by accident).  If this bill is adopted by the legislature and signed by the Governor before the ballot initiative's qualification deadline next week (6/28), the proponent of the ballot initiative has agreed to withdraw the ballot proposal from consideration.

As you might imagine, the legislative bill intentionally includes several elements that are present in the ballot initiative, but is more friendly to business (especially digital marketing/advertising) in some ways than the ballot proposal.  Importantly, the bill would be enforced primarily by the California Attorney General, whereas the ballot initiative would likely leave enforcement primarily to plaintiffs' class action lawyers.  Penalties under the bill are limited to $100 to $750 per violation, and only for failing to protect data from a breach.  The bill also has important exclusions, such as data collected for one-time transactions, de-identified data, etc.  Based on my initial reading, the consumer's right to opt out under the bill appears to apply only to data sales, not to data sharing for business purposes.  The proposal's prohibition on discriminating against consumers who opt out also seems somewhat softened in the bill.


On the other hand, the bill seems to be a bit more rigorous in some ways (surprisingly!).  For example, the bill would require organizations to tell a consumer the "specific pieces" of personal information that have been collected about an individual Californian (not just the "categories" of information).  In addition, the bill includes a data deletion right similar to the EU concept of the "right to be forgotten" (but with several exceptions, some broad).

Friday, May 11, 2018

California May Be Poised To Dramatically Alter Consumer Privacy (Again)

I have previously written (for example, here, here, here, and here) that California law usually dictates U.S. privacy practices because it tends to be the most protective of consumer privacy (or aggressive, depending upon your perspective).  California may once again be poised to dramatically re-shape consumer privacy in the United States.
 
An aggressive consumer privacy proposal has gained enough signatures to be placed on the California ballot for a referendum in November. If enacted, it would effectively create a new set of standards for consumer privacy throughout the U.S., because most companies would likely adopt the California standards nationwide rather than treating California residents differently from other Americans.  
 
Background
 

The Consumer Right to Privacy Act of 2018 (specifically v.2, No. 17-0039, which I'll call the "Proposal") was filed October 12, 2017, and has gained almost twice the number of signatures necessary to qualify for the November ballot (which is usually an indication that professional petition firms have been engaged).  The qualification deadline is June 28, and it appears that nothing stands in the way of this Proposal making its way onto the ballot.  The named sponsor of the Proposal is the lobbying/law firm of Remcho, Johansen & Purcell, LLP of Oakland, California. However, it is said that Alastair A. Mactaggart, a wealthy San Francisco-based real estate investor and executive, is funding the project. He appears to be a first-time political activist, without a history of heavy involvement in ballot initiatives.
 
In a Nutshell
 

The over-simplified-but-concise explanation is that the Proposal:

  • Would give a consumer the right to demand an accounting of all disclosures made by a business of information about the consumer.
  • Would make it illegal to “sell” a consumer's information, or "disclose" it for a business purpose, once the consumer opts out.
  • Would prohibit a business from conditioning any offering or service on a consumer's opt-out decision.
  • Would require very specific disclosures on all business websites.
  • Would be enforced primarily by class action litigation rather than a state entity.  
  • Would not require that any consumer actually suffer any harm (strict liability). 
  • Would result in penalties of $1,000 per person per occurrence, and up to $3,000 per occurrence if the violation was knowing and willful.
  • Would deem a data security breach to be a violation of law by the breached company if the company's security procedures were not reasonable (judged, of course, with the benefit of hindsight).


In More Detail


The Proposal confers on a consumer the right to know what categories of personal information are being collected by a business. 
 
The Proposal gives a consumer the right, at any time, to direct a business that sells personal information about the consumer not to sell the consumer's personal information (the so-called “Opt-Out”). A business must give consumers a notice of this Opt-Out right and must honor Opt Outs after receiving them (presumably immediately). A consumer can authorize another person to Opt Out on his or her behalf, but the Proposal does not specify what form that authorization should take (e.g., a power of attorney).
 
A business cannot “discriminate against” a consumer because the consumer requested information or opted out, including by: (a) denying goods or services to the consumer; (b) charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties; (c) providing a different level or quality of goods or services to the consumer; or (d) suggesting that the consumer will receive a different price or rate for goods or services, or a different level or quality of goods or services, if the consumer exercises the consumer's rights.  (It is worth noting that this provision goes further than even the GDPR.)
 
A business must designate at least two methods for consumers to submit requests for information, including a toll-free telephone number, and if the business maintains a website, a website address.
 
Requests for information must be honored within 45 days, with no delay allowed for verifying the request. The look-back period is 12 months, and the consumer controls how the report is delivered. Only one demand may be made every 12 months.
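
As a rough, purely illustrative sketch of how a compliance team might operationalize those timing rules (the function names and the 365-day approximation below are my own, not terms of the Proposal):

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative only: timing rules as summarized above (45-day response window,
# one request per 12 months). Names and the 365-day approximation are hypothetical.

def response_due_date(request_received: date) -> date:
    """A request for information must be honored within 45 days of receipt."""
    return request_received + timedelta(days=45)

def new_request_allowed(last_request: Optional[date], today: date) -> bool:
    """Only one demand may be made every 12 months (approximated as 365 days)."""
    return last_request is None or (today - last_request).days >= 365

print(response_due_date(date(2018, 7, 1)))                       # 2018-08-15
print(new_request_allowed(date(2018, 1, 15), date(2018, 7, 1)))  # False
```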
 
Opt-out requests must be honored for at least 12 months, and then it appears that the Proposal would require an affirmative consent from the consumer in order for a business to begin sharing information again. [This provision is somewhat unclear.]
 
Website (and probably application) privacy policy statements must be revised to include a statement of the rights Californians have under the Proposal and a link to the opt-out mechanism.  The link must be “clear and conspicuous” and titled "Do Not Sell My Personal Information." 
 
There is a training provision in the Proposal that requires “all individuals responsible for handling consumer inquiries about the business's privacy practices or the business's compliance with [the Proposal]” to be aware of how to handle those inquiries.
 
A business that suffers a security breach involving consumers' personal information may be held liable if the business has failed to implement and maintain "reasonable security procedures and practices."


The Proposal includes a private right of action, and the consumer need not show that he or she suffered a loss of money or property as a result of the violation in order to bring an action. Statutory damages are set at one thousand dollars ($1,000) or actual damages, whichever is greater, for each violation, and a knowing and willful violation can result in damages of three thousand dollars ($3,000) or actual damages, whichever is greater, for each violation. Intentional violations can also result in civil penalties of up to $7,500 for each violation. A civil enforcement action can be brought by the California Attorney General, by any district attorney, by the city attorney of a city with a population in excess of 750,000, or by any full-time city prosecutor, in any court of competent jurisdiction.
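
To put those dollar figures in perspective, here is a minimal, hypothetical sketch of how statutory exposure could scale with the number of affected consumers, using only the amounts described above; the function and its inputs are illustrative, not anything defined in the Proposal:

```python
# Illustrative only: statutory damages figures as described above
# ($1,000 per violation, $3,000 if knowing and willful, or actual damages if greater).

def statutory_exposure(consumers_affected: int,
                       actual_damages_per_consumer: float = 0.0,
                       knowing_and_willful: bool = False) -> float:
    """Hypothetical estimate of aggregate statutory exposure; note that no
    showing of actual harm is required to bring an action."""
    statutory_amount = 3_000 if knowing_and_willful else 1_000
    per_consumer = max(statutory_amount, actual_damages_per_consumer)
    return consumers_affected * per_consumer

# Example: 10,000 consumers, no actual damages proven.
print(statutory_exposure(10_000))                            # 10000000 ($10MM)
print(statutory_exposure(10_000, knowing_and_willful=True))  # 30000000 ($30MM)
```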
 
Definitions


The devil is in the details, and at least some of the Proposal's terms are defined in ways that could be easily misunderstood:


The categories of personal information covered by the Proposal are:
 
(1) Identifiers such as a real name, alias, postal address, unique identifier, internet protocol address, electronic mail address, account name, social security number, driver's license number, passport number, or other similar identifiers;
 
(2) All categories of personal information enumerated in Civil Code § 1798.80 et seq., with specific reference to the category of information that has been collected (any information that identifies, relates to, describes, or is capable of being associated with, a particular individual, including, but not limited to, his or her name, signature, social security number, physical characteristics or description, address, telephone number, passport number, driver’s license or state identification card number, insurance policy number, education, employment, employment history, bank account number, credit card number, debit card number, or any other financial information, medical information, or health insurance information. “Personal information” does not include publicly available information that is lawfully made available to the general public from federal, state, or local government records.)
 
(3) All categories of personal information relating to characteristics of protected classifications under California or federal law, with specific reference to the category of information that has been collected, such as race, ethnicity, or gender;
 
(4) Commercial information, including records of property, products or services provided, obtained, or considered, or other purchasing or consuming histories or tendencies;
 
(5) Biometric data;
 
(6) Internet or other electronic network activity information, including browsing history, search history, and information regarding a consumer's interaction with a website, application, or advertisement;
 
(7) Geolocation data;
 
(8) Audio, electronic, visual, thermal, olfactory, or similar information;
 
(9) Psychometric information;
 
(10) Professional or employment-related information;
 
(11) Inferences drawn from any of the information identified above; and
 
(12) Any information pertaining to minor children of a consumer.
 
"Personal information" does not include information that is publicly available or that is de-identified.
 
The terms "sell," "selling," "sale," and "sold" include sharing orally, in writing, or by electronic or other means, a consumer's personal information with a third party, whether for valuable consideration or for no consideration, for the third party's commercial purposes.
 
"Third party" means any person who is not (i) the “business” that collects personal information from consumers or (ii) to whom the business discloses a consumer's personal information for a business purpose pursuant to a written contract, provided that the contract tightly restricts further resale, use or retention beyond the scope of the business purpose and includes a “certification” that the recipient understands the restrictions. 
 
The term "business" means any organization that is for-profit, has annual revenue of at least $50MM, or 100,000 or more consumers annually, or derives at least half of its revenue from selling consumer information. A business includes entities controlled by another (including by 50% or more voting equity), or businesses that share a common brand or trademark.


   
Opposition


Opponents are already pointing out downsides to the Proposal.  For example, there is no safety exception: some businesses might not be able to send recall notices to consumers who have opted out, and a car dealer might not be able to share consumer information with a car manufacturer for purposes of compiling recall notice lists.
 
There is also a fear that without a requirement to demonstrate any actual harm, frivolous litigation will run amok and drive up insurance costs and other costs of doing business.
 
There is also the argument that California should not be attempting to regulate the “world wide web.”  Some fear that businesses will begin to exclude California customers or will cease services in order to avoid the burdens of the Proposal.   
 
More Information


You can read the proposal in its entirety here and judge for yourself.


I intend to follow this Proposal closely, and will likely post more about developments here and on LinkedIn and Twitter.



Sunday, April 1, 2018

South Dakota and Alabama Become the Last States to Enact Data Security Breach Notification Statutes


[Image: outline map of the continental United States with South Dakota and Alabama highlighted]

South Dakota and Alabama have just become the 49th and 50th states to enact data security breach notification statutes, joining the other 48 U.S. states and four U.S. districts/territories that already have similar laws in effect. Here is what you need to know:



South Dakota's Statute (SB 62) At A Glance


  • Signed on March 21, 2018 by Governor Dennis Daugaard (before Alabama's statute) and will take effect July 1, 2018 (after Alabama's statute).

  • The statute applies to “information holders,” a term that seems to cover the concepts of data controller and data processor found in other regulatory regimes. (This is just one more reason why data controllers and data custodians will want to carefully allocate responsibility for compliance in their contracts.)

  • Notice is required to South Dakota residents within 60 days after “personal or protected information was, or is reasonably believed to have been, acquired by an unauthorized person.”

  • Unlike most state statutes, it covers two categories of protected data: “Personal Information" and "Protected Information.” Both include biometric data, in addition to other elements that are common among such state laws.

  • Personal and protected information includes health information (a recent trend in state laws that many think is unnecessarily duplicative of HIPAA's breach notice provisions).

  • Access credentials (e.g., a username and password) for an online account are covered, reflecting a recent trend in state laws.

  • Notice to the Attorney General of South Dakota is required if more than 250 residents are affected. 

  • Notification to consumers is not necessary if the breached organization conducts an investigation and determines that consumers are not likely to be harmed (but notice to the AG is still required). That determination should be supported by a written analysis, which is to be retained. The AG may disagree with the conclusion and require notice to consumers. (This consultation approach is a relatively recent trend in state data breach statutes.)

  • The AG can impose fines of up to $10,000 per day per violation.

  • Violations of the breach notice requirement may also be criminal deceptive acts or practices under South Dakota’s Deceptive Trade Practices Act (37-24-6). (Note: I am not aware of any other state data security breach notification law that criminalizes a failure to comply.  If you are, please tell me.)

  • There is no express right of civil action in the new statute, but because violations are also deemed violations of the Deceptive Trade Practices Act, civil suits seem foreseeable. 



Alabama’s Statute (S.B. 318) At A Glance


  • Signed on March 28, 2018 by Governor Kay Ivey (after South Dakota's statute) and will take effect June 1, 2018 (before South Dakota's).

  • Notice is required to Alabama residents within 45 days after discovery.

  • “Sensitive personally identifying information” includes elements that are common among other state breach notification laws.

  • Access credentials (e.g., a username and password) for an online account are also covered, reflecting a recent trend in state laws. 

  • Notice to the Attorney General of Alabama is required if more than 1,000 residents are affected. 

  • Those who knowingly violate the notification law are subject to penalties of up to $500,000 under the Alabama Deceptive Trade Practices Act, plus additional amounts up to $5,000 per day for continuing failure to comply.

  • There is no express right of civil action in the new statute, but the Alabama Attorney General may bring a “representative action” for named individual victims to recover actual damages plus attorney’s fees and costs.

At long last, every state has some sort of data breach notification law. They vary, of course, in the details. [Georgia's statute, for example, applies only to governmental "information collectors" and "data brokers" that collect and share data for compensation, severely limiting the reach of the statute.] Some of them have idiosyncrasies that preclude a one-size-fits-all breach notice. [Compare California's statute with Massachusetts' statute, for example.] For a handy reference of all states' and territories' data security breach laws, see the website of the National Conference of State Legislatures, here.
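
By way of illustration only, and drawing solely on the two summaries above, an incident response playbook might encode these state-specific parameters in a simple lookup; the field names are hypothetical, and the statutes themselves contain far more nuance:

```python
# Illustrative only: notice parameters for the two new statutes as summarized above.

STATE_BREACH_RULES = {
    "SD": {"notice_deadline_days": 60, "ag_notice_threshold_residents": 250},
    "AL": {"notice_deadline_days": 45, "ag_notice_threshold_residents": 1000},
}

def ag_notice_required(state: str, residents_affected: int) -> bool:
    """Does the number of affected residents trigger notice to the state Attorney General?"""
    return residents_affected > STATE_BREACH_RULES[state]["ag_notice_threshold_residents"]

print(ag_notice_required("SD", 300))   # True  (more than 250 South Dakota residents)
print(ag_notice_required("AL", 300))   # False (not more than 1,000 Alabama residents)
```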





It should also be noted that the U.S. Congress seems to consider a federal breach notification statute in almost every session, and almost every proposal would preempt all state breach notification statutes.  None, however, have yet been enacted (for reasons you may have heard me describe on social media or in presentations).





As a result of these two new statutes, organizations may want to update cyber incident response plans to reflect the new notice requirements and categories of data covered.