Sunday, January 4, 2026

Looking Back at Privacy and Cybersecurity Law Changes in 2025 and Looking Ahead to 2026: Children’s Privacy


The privacy of children's data was in the spotlight in 2025 and will be in 2026 as well.  What for years was treated as a narrow compliance issue—often handled with a single sentence in a privacy policy—has evolved into a complex and fast-moving legal landscape that now spans federal requirements and an expanding patchwork of state laws.

As we look back at developments in 2025 and ahead to 2026, one thing is clear: organizations that collect data about anyone under 18 need to revisit their policies and practices.

Federal COPPA Changes 

The federal Children’s Online Privacy Protection Act (COPPA) has been in effect since 2000. For a quarter of a century, it has shaped how websites and apps think about children’s data, particularly information about kids under the age of 13. That history explains why so many privacy policies still include a familiar statement along the lines of: “Our website is not directed to children under 13, and we do not knowingly collect personal information from children under 13.”

In 2025, however, COPPA entered a new phase. The Federal Trade Commission amended the COPPA regulations, effective in June 2025. These amendments raise compliance expectations in several important ways. The most significant changes include:

  • New information security program requirements
    Organizations must implement and maintain a written information security program appropriate to the sensitivity of children's personal information. That means designating personnel to manage the program, assessing internal and external risks to children's data, implementing and maintaining safeguards, regularly testing the effectiveness of those safeguards, and reviewing and revising the program at least annually. The program can be a standalone children-only program or (preferably) part of a comprehensive information security program. This brings COPPA closer to modern data security regimes and raises the bar well beyond basic safeguards.

  • Expanded definitions of “personal information”
    The definition now includes (i) biometric identifiers that can be used for the automated or semi-automated recognition of an individual, such as fingerprints, handprints, retina patterns, iris patterns, genetic data, including DNA sequences, voiceprints, gait patterns, facial templates, or faceprints, and (ii) government-issued identifiers, such as social security numbers, state identification card numbers, birth certificate numbers, or passport numbers.

  • Notice and parental consent
    The amendments require notice to parents that identifies the categories of third parties who will receive the child's personal information and the purposes for that sharing. The parental notice must explain that parents can consent to the collection and use of a child's data without agreeing to disclosure to third parties (unless the disclosure is integral to the operation of the website or online service). The notice must also describe how the operator uses persistent identifiers and include disclosures about the use of audio files.

  • Separate parental consent for third-party disclosure
    Perhaps the most impactful change: operators must obtain separate, verifiable parental consent before sharing children's personal information with third parties, even if parental consent was already obtained for collection and internal use. The amendments also expand the acceptable methods for obtaining verifiable parental consent. Operators may now use the following methods (among others the FTC may approve): (i) knowledge-based authentication through multiple-choice questions, (ii) government-issued photo ID, or (iii) text messaging coupled with additional steps, such as a follow-up text, letter, or phone call.
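
To make the "separate consent" idea concrete, here is a minimal sketch (in Python, with entirely hypothetical field and function names) of how an operator might model it. This is an illustration of the concept, not a compliance implementation:

```python
from dataclasses import dataclass

@dataclass
class ParentalConsent:
    # Consent to collection and internal use, verified by an approved method
    # (e.g., knowledge-based questions, photo ID, text plus follow-up).
    collection_verified: bool = False
    # Separate consent to third-party disclosure; under the amended rule it
    # may not simply be inferred from consent to collection.
    disclosure_verified: bool = False

def may_share_with_third_party(consent: ParentalConsent, integral_to_service: bool) -> bool:
    """Sharing requires its own verifiable consent, unless the disclosure is
    integral to the operation of the website or online service."""
    if integral_to_service:
        return consent.collection_verified
    return consent.collection_verified and consent.disclosure_verified

# Example: collection consent alone does not authorize sharing with an ad network.
c = ParentalConsent(collection_verified=True)
assert may_share_with_third_party(c, integral_to_service=False) is False
```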

Data Retention

The new regulations only allow organizations to retain children’s information for as long as reasonably necessary to fulfill the specific purposes for which it was collected.
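
As a rough illustration of purpose-bound retention, here is a short Python sketch. The purposes and retention windows are invented for the example; the rule itself sets no fixed periods:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows tied to collection purposes. The amended
# rule does not prescribe periods; it requires retention for no longer than
# reasonably necessary to fulfill the specific collection purpose.
RETENTION_BY_PURPOSE = {
    "account_servicing": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def should_delete(collected_at: datetime, purpose: str) -> bool:
    window = RETENTION_BY_PURPOSE.get(purpose)
    if window is None:
        return True  # no documented purpose means no basis to keep the data
    return datetime.now(timezone.utc) - collected_at > window

print(should_delete(datetime(2025, 1, 1, tzinfo=timezone.utc), "support_ticket"))
```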

Taken together, these changes mean that COPPA compliance is no longer just about age gates and privacy policy disclosures. It now directly implicates vendor management, advertising technologies, analytics tools, and security governance.

Why Just Claiming “We Don’t Collect Data from Children Under 13” May No Longer Be Enough

Historically, many organizations may have assumed that a disclaimer was all it took to avoid COPPA. That approach is increasingly risky.

First, the FTC has long taken the position that “actual knowledge” of children’s use—not just intent to target them—can trigger COPPA obligations. The new amendments add that the FTC may consider marketing or promotional materials, statements to consumers or third parties, reviews by users or third parties, and the age of users on similar websites or services.

Second, state laws are now expanding protections well beyond age 13, often up to age 18. As a result, organizations must think more carefully about who is actually using their products, not just who they are intended to serve.

State Laws: Expanding Protection for Minors

In parallel with the updated COPPA Rule, states have been actively passing laws that address minors’ privacy. Unlike COPPA, these laws do not all use the same age threshold, which significantly complicates compliance.

Some states focus on children under 13, while others extend protections to all minors under 18, and still others draw the line somewhere in between.

A Closer Look at Maryland

Maryland’s law, described earlier in this series as the most important new privacy law of the year, is a good example of how nuanced these statutes can be. Effective in late 2025, Maryland’s Online Data Privacy Act goes further than other states by prohibiting organizations from selling personal data or engaging in targeted advertising if they know or should have known a consumer is under 18. Maryland defines targeted advertising to include advertising directed to a person or a device using a unique identifier, which creates a practical challenge: an organization may need to suppress targeting for an entire device once it knows or should know that a minor is using it.
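
A hedged sketch of that challenge, with all field names hypothetical: once a device accumulates signals that a minor uses it, targeting for the whole device may need to be suppressed.

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    device_id: str
    known_minor_user: bool = False  # actual knowledge that a minor uses the device
    minor_signals: int = 0          # e.g., repeated views of kid-directed content

def allow_targeted_ads(device: DeviceProfile, signal_threshold: int = 3) -> bool:
    # "Should have known" is treated here, purely illustratively, as an
    # accumulation of signals; a real program would define this with counsel.
    if device.known_minor_user or device.minor_signals >= signal_threshold:
        return False
    return True

print(allow_targeted_ads(DeviceProfile("abc123", minor_signals=4)))  # False
```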

New State Laws Protecting Minors

Recent and upcoming state laws with new protections for minors include the following:

  • Arkansas HB 1717 (effective 7/1/2026) – protects children under 18

  • Colorado SB 24-041 (effective 10/1/2025) – under 18

  • Delaware HB 154 (effective 1/1/2025) – under 18

  • Maryland SB 541 (effective 10/1/2025) – under 18 (data sales and targeted advertising)

  • Montana SB 297 (effective 10/1/2025) – under 18

  • New Hampshire RSA 507-H (effective 1/1/2025) – under 13

  • Nebraska LB 504 (Age-Appropriate Design Code Act) (effective 1/1/2026) – under 13

  • Vermont SB 69 (Age-Appropriate Design Code Act) (effective 1/1/2027) – under 18

Notably, the Age-Appropriate Design Code Acts (AADCs) are not traditional privacy laws, but they operate much like them. They impose affirmative design requirements on websites and apps likely to be accessed by children or teens, including data minimization, high-privacy default settings, and restrictions on certain design features.
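
As a loose illustration of what "high-privacy default settings" can mean in practice, here is a sketch in which accounts likely to belong to minors start in the most protective configuration. Every setting name below is invented; the AADC statutes describe outcomes, not fields:

```python
# Invented setting names, for illustration only.
HIGH_PRIVACY_DEFAULTS = {
    "profile_visibility": "private",       # not discoverable by default
    "precise_geolocation": False,          # off unless strictly needed
    "targeted_advertising": False,
    "overnight_push_notifications": False,
    "data_sharing_with_partners": False,
}

def default_settings(likely_minor: bool) -> dict:
    # Minors get the protective defaults; any loosening would be an explicit,
    # informed opt-in rather than a buried toggle.
    if likely_minor:
        return dict(HIGH_PRIVACY_DEFAULTS)
    return {**HIGH_PRIVACY_DEFAULTS, "profile_visibility": "public",
            "targeted_advertising": True}

print(default_settings(likely_minor=True)["targeted_advertising"])  # False
```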

Practical Pointers: What Organizations Should Be Doing Now

For organizations operating websites or mobile apps, it’s time for a comprehensive review.

Key steps include:

  1. Revisit terms of use and privacy policies
    Many companies still rely on outdated COPPA-only language stating that their services are intended for users 13 and older. If the organization does not intend to target teens, that language may need to be clarified or tightened. If it does intend to target teens, the policy should reflect that reality and address applicable state laws.

  2. Align internal data and privacy processes with public disclosures
    External statements should match internal practices, particularly around data sharing, advertising, analytics, and security safeguards.

  3. Assess age thresholds and compliance strategy
    Decide deliberately which age groups the organization intends to serve, what age thresholds it will use, and how it will comply with the relevant federal and state requirements.

  4. Evaluate security and vendor practices
    The updated COPPA security requirements and state law obligations make information security programs and third-party risk management more important than ever.

Looking Ahead

Children’s privacy law is no longer a narrow niche. Between the amended COPPA Rule, expanding state protections for minors, and the rise of age-appropriate design requirements, 2025 and 2026 mark a turning point. Organizations need to re-evaluate their approach in light of the changing legal landscape to manage legal risk going forward.



Sunday, December 14, 2025

State Cybersecurity Law Updates in 2025 — and What’s Coming in 2026

Data breach notification laws have existed in the United States for more than two decades, and since 2018 every U.S. state has had some form of breach notification statute. These laws are far from static; states continue to amend them to reflect new technologies, new categories of sensitive data, and evolving expectations around transparency and accountability after a security incident.

In 2025, two states made particularly important amendments to their cybersecurity and data breach notification laws: New York and Oklahoma. Texas also added a notable safe harbor for small businesses. These changes highlight broader national trends that organizations should keep in mind as additional state privacy laws take effect in 2026.

 

New York: Expanded Definition of “Personal Information” and a New Notification Deadline

Medical and Health Insurance Information

New York significantly broadened its data breach notification law for 2025 by adding “medical information” and “health insurance information” to the definition of personal information under General Business Law § 899-aa. For entities not already covered by HIPAA, this will have meaningful consequences: a breach of health-related data that previously fell outside the statute may now trigger notification obligations.

New 30-Day Notification Deadline

New York also added a 30-day deadline for notifying affected individuals following a data breach. Previously, the statute required notification only “in the most expedient time possible and without unreasonable delay,” leaving room for interpretation. The new deadline removes that ambiguity and places added pressure on incident response teams to investigate, assess scope, and prepare notices quickly.

For HIPAA-covered entities, individual breach notifications must now be delivered within 30 days rather than the 60-day outer limit under HIPAA, and covered entities must still notify the New York Attorney General within five business days of notifying the U.S. Secretary of Health and Human Services, as well as the New York State Police and the New York Department of State.
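
For illustration, the deadline arithmetic under the amended statute might be computed along these lines. The dates are invented, and the business-day helper ignores holidays:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday; holidays ignored in this sketch
            days -= 1
    return d

discovery = date(2026, 3, 2)                              # hypothetical discovery date
individual_notice_due = discovery + timedelta(days=30)    # new 30-day cap

hhs_notified = date(2026, 3, 20)                          # hypothetical HHS notice (HIPAA entity)
ny_ag_due = add_business_days(hhs_notified, 5)            # AG within 5 business days

print(individual_notice_due, ny_ag_due)
```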
 

Oklahoma: Modernized Breach Law for Emerging Data Risks

 
Oklahoma lawmakers also made notable amendments to the state’s data breach notification statute in the past year, bringing it more in line with modern cybersecurity realities.
 

Expanded Categories of Personal Information

 
Oklahoma expanded its definition of “personal information” to include:
  • Biometric identifiers, such as fingerprints or other biometric data used for identification or authentication; and
  • Unique identifiers associated with a financial account, when combined with a required security code, password, or passcode.

These additions reflect the growing use of biometrics and digital credentials, and the serious risks posed when they are compromised.

New Attorney General Notification Requirement

Like many states, Oklahoma now requires organizations to notify the state Attorney General when a breach affects more than 500 residents. Notice must be provided to the Attorney General within 60 days. This change gives state authorities visibility into large-scale incidents.

Clarifying the Encryption Safe Harbor

Many state breach laws do not treat encrypted data as “breached” unless the encryption keys are also compromised. Oklahoma has formally adopted this approach. Under the amended law, encrypted data triggers breach notification obligations only if the encryption keys are accessed or obtained as well. This aligns Oklahoma with the majority of states and reinforces the importance of strong encryption and key-management practices.
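
In code, the clarified safe harbor reduces to a simple predicate. The incident record below is hypothetical:

```python
# Sketch of Oklahoma's clarified encryption safe harbor: encrypted data is
# treated as "breached" only if the keys were also accessed or obtained.
def notification_triggered(incident: dict) -> bool:
    if not incident["data_accessed"]:
        return False
    if incident["encrypted"] and not incident["keys_compromised"]:
        return False  # safe harbor: ciphertext alone, keys still protected
    return True

incident = {"data_accessed": True, "encrypted": True, "keys_compromised": False}
assert notification_triggered(incident) is False
```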

A New Safe Harbor

In 2023, Oklahoma created an affirmative defense to tort claims for breached entities that maintain a cybersecurity program meeting certain criteria. Starting in January 2026, a further safe harbor against statutory civil penalties takes effect: the penalty is cut in half (from $150,000 to $75,000) if the entity maintains “reasonable safeguards” and gives notice as required by the statute.

Oklahoma’s reasonable safeguards include risk assessments, technical and layered defenses, employee training, and an incident response plan.

New Small Business Safe Harbor in Texas

Everything’s bigger in Texas, except this: a new safe harbor from punitive damages in cybersecurity lawsuits applies only to businesses with fewer than 250 employees. To take advantage of it, a company must adopt a recognized cybersecurity framework, such as NIST or ISO.

These safe harbors join those already on the books in other states: Ohio (safe harbor from tort claims if a recognized cybersecurity framework, such as NIST, is used; 2018); Connecticut (defense against punitive damages; 2021); Utah (2021); and Tennessee (safe harbor from class actions; May 2024). On a side note, Tennessee is the only state I’m aware of with safe harbors for both privacy and cybersecurity claims. Tennessee has a safe harbor against cybersecurity class actions as long as the incident was not caused by willful or wanton conduct or gross negligence, and a safe harbor against privacy violations if the entity has adopted and implemented the NIST privacy framework.
 


Broader Trends and Looking Ahead to 2026

 

These amendments underscore several nationwide trends:
  • States are expanding what counts as protected personal information, especially health data, biometrics, and digital identifiers.
  • Legislatures continue to move away from flexible standards toward clear notification deadlines.
  • Attorney General notification requirements are becoming increasingly common.
  • Encryption remains a critical safeguard — but only if keys are properly protected.

Looking ahead to 2026, several states will implement comprehensive data protection laws that go beyond breach notification and regulate how personal data is collected, used, shared, and retained. Together with ongoing amendments to breach statutes, this means organizations must track not only new laws, but also incremental changes to existing ones.
 
 

 

Monday, November 17, 2025

New State Privacy Laws Coming In 2026

 


As we saw in Part 1 of this series, 2025 brought several important changes to state privacy laws, but more will arrive in the new year.  In 2026, three more state privacy laws take effect, and two states will roll out sweeping amendments. Here’s what to expect:

New laws effective January 1, 2026: Indiana, Kentucky, and Rhode Island

Each of these three states will add to the growing patchwork of comprehensive state privacy laws. If these laws apply to your organization and you have not already begun preparing, you should do so now. Below are the highlights, followed by a short sketch comparing the applicability thresholds:

  • The Indiana Consumer Data Protection Act will apply if 100,000 residents' personal information is collected by a company (or just 25,000, if 50% of the company's revenues come from selling data).  The consumers' rights include the right to know various things including what information has been collected and shared, the right to access the consumer's information, the right to correct inaccuracies, the right to instruct the company to delete the consumer's information, and the right to opt-out of (i) the sale of the consumer's information, (ii) targeted advertising, and (iii) profiling.  Sensitive personal information can only be handled based on affirmative consent (opt-in).  It becomes effective January 1, 2026, and will be enforced by the Attorney General (with a 30-day cure period available). 
  • The Kentucky Consumer Data Privacy Act applies if 100,000 residents' personal information is collected by a company (or just 25,000, if 50% of the company's revenues come from selling data).  The consumers' rights include the right to know various things including what information has been collected and shared, the right to access the consumer's information, the right to correct inaccuracies, the right to instruct the company to delete the consumer's information, and the right to opt-out of (i) the sale of the consumer's information, (ii) targeted advertising, and (iii) profiling.  Sensitive personal information and children’s information can only be handled based on affirmative consent (opt-in).  It becomes effective January 1, 2026, and will be enforced by the Attorney General (with a 30-day cure period available). 
  • The Rhode Island Data Transparency and Privacy Act applies if just 35,000 residents' personal information is collected by a company (or just 10,000, if 20% of the company's revenues come from selling data).  The consumers' rights include the right to know various things including what information has been collected and shared, the right to access the consumer's information, the right to correct inaccuracies, the right to instruct the company to delete the consumer's information, and the right to opt-out of (i) the sale of the consumer's information, (ii) targeted advertising, and (iii) profiling.  Sensitive personal information and children’s information can only be handled based on affirmative consent (opt-in).  It becomes effective January 1, 2026, and will be enforced by the Attorney General (with no right to cure). 
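
Here is the applicability sketch mentioned above: a rough comparison of the three statutes' thresholds as summarized in this post. A real applicability analysis must apply each law's full definitions (for example, what counts as a "sale" and how revenue is measured):

```python
# (residents, reduced_residents_if_selling, revenue_share_from_sales)
# Figures as summarized in this post; confirm against the statutes.
THRESHOLDS_2026 = {
    "Indiana":      (100_000, 25_000, 0.50),
    "Kentucky":     (100_000, 25_000, 0.50),
    "Rhode Island": (35_000, 10_000, 0.20),
}

def law_likely_applies(state: str, residents: int, sales_revenue_share: float) -> bool:
    base, reduced, share = THRESHOLDS_2026[state]
    return residents >= base or (residents >= reduced and sales_revenue_share >= share)

# A company with 40,000 Rhode Island residents' data and no data sales:
print(law_likely_applies("Rhode Island", residents=40_000, sales_revenue_share=0.0))  # True
```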

Upcoming Amendments: Connecticut

The Connecticut Data Privacy Act was amended in a variety of important ways.
  • The Act’s applicability threshold will be reduced to just 35,000 residents, but there is no minimum threshold when it comes to processing sensitive data or selling consumer personal data.
  • The exemption for financial institutions subject to the Gramm-Leach-Bliley Act (federal financial law) is removed and replaced with a data-level exemption for personal financial information subject to GLBA.
  • Connecticut expanded the definition of “consumer health data” to include “health status.” The definition of sensitive data now includes disability or treatment, status as nonbinary or transgender, information derived from genetic or biometric information, neural data, certain financial account and payment card data, and government ID numbers (such as Social Security numbers). Consumers will have the right to obtain a list of third parties to whom their personal data was sold.
  • Consumers will have the right to question profiling results, be informed of the reasons for profiling decisions, review personal data used for profiling, and correct incorrect data used in profiling decisions about housing.
  • Consent will be required for processing or selling sensitive data.
  • Minors under 18 will be treated as children for purposes of the sale of data or targeted advertising.
  • Impact assessments will be required for processing related to profiling.
  • Privacy notices will require detailed information, such as categories of personal data sold and disclosures about targeted advertising. Importantly, companies must give notice if they are collecting personal data for the purpose of training an LLM.

Upcoming Amendments: Oregon

Amendments to the Oregon Consumer Data Privacy Act (effective January 1, 2026) focus on minors and automobiles: 

  • The covered age for a minor increases to 15.
  • It prohibits processing personal data for targeted advertising, sale, or profiling used for legally significant decisions when the controller knows that the consumer is a minor (or willfully disregards age).
  • Motor-vehicle manufacturers and affiliates are specifically covered for personal data collected from the use of a vehicle or any vehicle component. Automakers must honor a consumer's request to (i) delete personal data, (ii) provide a copy of their data, and (iii) stop selling their data or using it for targeted advertising.
  • The amendments also prohibit "selling" data about a person's precise geolocation without consent.

Together, these five state law changes will reshape the U.S. privacy law landscape in ways that will have meaningful effects on many companies.

In the next post in this series, we'll look at state cybersecurity law changes in 2025 and 2026. 







Tuesday, October 28, 2025

2025 in Cybersecurity and Privacy Law - New State Privacy Laws

2025 has been an eventful year in privacy and cybersecurity law, with many changes at the state, federal and international levels, affecting almost every sector of the economy.  If you haven't been paying very close attention throughout the year, you may have missed some important changes.  In a series of blog posts (beginning with this one), I'll summarize the key changes you need to know.

Part 1 - State Privacy Laws

State privacy laws have been trickling out over the past five years, and by this time last year, comprehensive privacy laws were already in effect in several states, such as California, Colorado, Connecticut, Florida, Montana, Texas, Utah, Virginia and Washington.



Over the course of the year, laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee have become effective, as have significant amendments to existing privacy laws in California, Colorado, and Virginia.  






At a very high level, the key provisions of these new laws are as follows: 

  • Delaware's Personal Data Privacy Act applies if a mere 35,000 residents' personal information is collected by a company (or just 10,000, if 20% of the company's revenues come from selling data).  The consumers' rights include the right to know various things including what information has been collected and shared, the right to access the consumer's information, the right to correct inaccuracies, the right to instruct the company to delete the consumer's information, the right of data portability, and the right to opt-out of (i) the sale of the consumer's information, (ii) targeted advertising, and (iii) profiling.  It became effective January 1, 2025. 
  • Iowa's statute applies if 100,000 residents' personal information is collected (or 25,000 if 50% of the company's revenues come from selling data).  The consumer rights include the right to know, access, delete, and portability, and to opt-out of sales, targeted ads, and processing of sensitive personal information (not an opt-in structure, unless the consumer is known to be a child).  Notably, there is no right to correct.  It became effective January 1, 2025. 
  • Maryland's statute also applies if a mere 35,000 residents' personal information is collected (or 10,000 if 20% of the company's revenues come from selling data).  The consumer rights include the right to know, access, correct, delete, portability, and to opt-out of sale/ads/profiling (not an opt-in structure.)  It became effective October 1, 2025. As we'll explore later, Maryland's law has some other unique provisions. 
  • Minnesota's statute applies if 100,000 residents' personal information is collected (or 25,000 if 25% of the company's revenues come from selling data).  The consumer rights include the right to know, access, correct, delete, portability, and to opt-out of sale/ads/profiling (not an opt-in structure.) An opt-in is required for collecting sensitive personal information or information about a known child.  It became effective July 31, 2025. 
  • Nebraska's statute applies to any company doing business in Nebraska that isn't a small business as defined by the SBA.  The consumer rights include the right to know, access, correct, delete, portability, and to opt-out of sale/ads/and sensitive personal information (not an opt-in structure, unless the consumer is known to be a child.)  It became effective January 1, 2025. 
  • New Hampshire's statute is triggered if just 35,000 residents' personal information is collected (or 10,000 if 25% of the company's revenues come from selling data).  The consumer rights include the right to know, access, correct, delete, portability, and to opt-out of sale/ads/profiling.  An opt-in is required for sensitive personal information or for collecting information about a child.  It became effective January 1, 2025.
  • New Jersey's statute applies if 100,000 residents' personal information is collected (or 25,000 if the company earns revenues or discounts from selling data).  The consumer rights include the right to know, access, correct, delete, and portability.  An opt-out is required for the sale of information, profiling, and targeted ads, but an opt-in is required for sensitive personal information, and for the sale of information, profiling, or targeted ads to teens.  It became effective January 15, 2025.
  • In Tennessee, the law applies if the company's revenues exceed $25,000,000 and at least 175,000 residents' personal information is collected, or if 25,000 residents' information is collected and 50% of the company's annual gross revenues come from selling personal information.  The consumer rights include the right to know, access, correct, delete, and portability.  An opt-out is required for the sale of information, profiling, and targeted ads, but an opt-in is required for sensitive personal information, or to collect information about a child.  It became effective July 1, 2025.

Beyond these basic contours, a few statutory changes stand out:

In a clear reaction to the US Supreme Court's decision in Dobbs, Virginia amended its privacy law effective July 1, 2025 to make it unlawful to obtain, disclose, sell or disseminate any personally identifiable reproductive or sexual health information (other than health information under HIPAA) without the consent of the consumer.  Other states (e.g., California, Washington, Maryland) are also amending their laws to provide more protections for reproductive and sexual health data. 

California amended the California Consumer Privacy Act effective January 1, 2025, to create a new subcategory of "sensitive data" called "neural data," which means “information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.”  It's interesting to see California looking ahead to provide privacy protections as neural implants and brain scans are quickly developing. 

Colorado amended its privacy law in two important ways.  First, the age of a minor was increased from 13 to 18 (effective October 1, 2025). Data collection from minors requires a special data protection assessment, and the company must take reasonable care to avoid harming a minor's privacy.  In addition, there are prohibitions on processing minors' data for certain purposes, such as targeted advertising, profiling, sale, or processing precise geolocation data, with limited exceptions.  Like some of the Age-Appropriate Design Code laws we've seen recently, the Colorado amendments prohibit using any design feature to significantly increase, sustain, or extend a minor's use.  The other key change to Colorado law involves biometric data (effective July 1, 2025).  It requires consent for collecting biometric data (including from employees, for most purposes), as well as a data protection impact assessment.  There must be a written, publicly available policy with retention requirements and limits on the collection of biometric identifiers, among other things. Companies cannot refuse to provide a service to a person who declines to provide biometric data unless the biometric data is necessary to provide the service. The biometric data requirements apply without regard to thresholds, but a consumer's right of access to biometric data only applies to companies that meet the general thresholds of the Colorado Privacy Act (100,000 consumers, or 25,000 consumers if the company earns revenue or receives a discount on goods or services from the sale of personal data).

Finally, the most impactful state privacy law change this year might be Maryland's new Online Data Privacy Act (MODPA).  First, the threshold is low--just 35,000 consumers, or 10,000 consumers if 20% of gross revenue comes from the sale of personal data.  There is also a prohibition on processing personal data of minors for targeted advertising or selling personal data if the company knew or should have known that the consumer was under the age of 18.  Consumer health data receives special protections in Maryland, and that term includes data regarding reproductive or sexual healthcare, both of which are broadly defined. There is also a restriction on geofencing (using technology to establish a virtual boundary) within 1,750 feet of a mental health facility or sexual or reproductive health facility for the purpose of tracking, identifying, collecting data from, or sending a notification to the consumer regarding the consumer's health data.  Perhaps most remarkable of all is Maryland's enhanced data minimization requirement. The statute limits collection of personal data to what is reasonably necessary and proportionate “to provide or maintain a specific product or service requested by the consumer.”  This seems more restrictive than the phrase used in other comprehensive state privacy laws (“in relation to the specified purposes for which the data are processed”). When it comes to sensitive personal data, the statute prohibits processing unless it is strictly necessary.  Sensitive data includes consumer health data, genetic and biometric data, personal data of a child, precise geolocation data, and data revealing racial or ethnic origin, religious beliefs, sex life, sexual orientation, status as transgender or nonbinary, national origin, or citizenship or immigration status. It will be very interesting to see how Maryland's Attorney General and courts enforce this strict new data minimization requirement.
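
The geofencing restriction lends itself to a simple illustration: a great-circle distance check between a device and a sensitive facility. The coordinates below are invented, and production geofencing is considerably more involved:

```python
import math

FEET_PER_METER = 3.28084
EARTH_RADIUS_M = 6_371_000

def distance_feet(lat1, lon1, lat2, lon2):
    # Haversine formula for great-circle distance between two points.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a)) * FEET_PER_METER

def within_restricted_zone(device, facility, limit_feet: float = 1_750.0) -> bool:
    return distance_feet(*device, *facility) <= limit_feet

# Hypothetical coordinates for a device and a facility in Baltimore.
print(within_restricted_zone((39.2904, -76.6122), (39.2920, -76.6100)))  # True
```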

Stay tuned for Part 2....




(Thanks to Karin McInnis of Moore & Van Allen for partnering with me on a recent continuing legal education seminar summarizing 2025 and 2026 changes in privacy and cybersecurity law.  This series is based on that CLE and her excellent contributions.)




Monday, January 2, 2023

How To Think About Privacy As An Enterprise Risk in 2023


 

A new year is upon us, and with it will come major changes in how organizations handle personal data. Of course, this is not the first time we've seen significant changes:

  • 2018 brought enormous changes to Europe as the General Data Protection Regulation (GDPR) became effective;
  • 2020 brought major changes to the U.S. as the California Consumer Privacy Act became effective; and
  • 2021 ushered in massive change in China with the introduction of the Personal Information Protection Law (PIPL) and the Data Security Law.
This year will also be a year of change, as multiple jurisdictions implement new laws governing personal data, automation, and digital commerce. In the U.S., Canada, and Europe, strict new laws will significantly increase the level of existing regulation, and many people will gain new legal rights that they have never before had.

Rather than list all of the many new personal data protection laws coming into effect in 2023, I would like to offer some high-level thoughts about personal data risk in 2023 that organizations should consider: 

  • Overall, privacy risk is trending strongly upwards, as a result of more complex and strict privacy laws. Accordingly, past experience is a poor indicator of future results.  The likelihood and severity of a privacy violation cannot be predicted using historical data alone.  Therefore, many common risk quantification models will be insufficient to predict privacy risk.
  • It is becoming more difficult to assess risk globally. Fines and settlements are based on a variety of factors that differ from jurisdiction to jurisdiction.  An activity can be lower risk in one jurisdiction and higher risk in another.  Global organizations need to understand the risk environment in every country in which they operate.  In the past, it may have been acceptable to simply apply GDPR as a global standard, but it is probably not wise to take such a simplified approach in the future.
  • Many jurisdictions utilize an enforcement model focused on deterrence rather than consistent application. Given limited enforcement resources, they aim for a small number of very large fines which will act as a deterrent rather than aiming to catch all violations and punish them proportionately to the harm they cause.  Therefore, plenty of companies will "get away with" privacy violations, which may create a false sense of security.  Those who are targeted for enforcement are likely to be punished quite severely.
  • As many companies--especially consumer-facing companies--continue to pursue digital transformations, they are adding more and more technologies and third party data custodians.  This creates internal complexity and an ever-expanding personal data environment.  An expanding personal data environment requires more and more resources to govern effectively, and at some point can become unsustainable.  Organizations should apply a rigorous process to their digital transformations that ensure that older technologies and third party data custodians are retired as rapidly as new technologies and third party data custodians are onboarded.  This means explicitly acknowledging tradeoffs and making hard choices.
  • As organizations pursue agility and decentralization, they are granting more autonomy to individual business units to make decisions closer to the "front lines."  This can be a smart management strategy.  However, organizations should know that personal data privacy risk cannot be limited to a business unit.  If one business unit creates a privacy violation, the laws increasingly hold the entire organization (meaning the top-level parent organization and all affiliated entities) responsible, and fines are often based on the global revenue of the entire global enterprise.  For example, if a small division of a small local subsidiary violates the GDPR or China's PIPL, the result could be a massive fine equal to 4-5% of the entire global revenue of all affiliated companies.  Similarly, cyberinsurance underwriters consider risk holistically, and a poor practice by one small division can affect the insurability of an entire enterprise.  Finally, reputational risks often cannot be limited to a single brand or business unit of the organization.  Media reports have tended to name the parent organization or affiliated brands in negative press coverage, even when the privacy violation was committed by only one small division of the company. Accordingly, organizations probably should not allow small divisions to take on risk that could threaten the entire enterprise.
I hope these thoughts are helpful to anyone considering a privacy risk management strategy in 2023.