Monday, November 17, 2025

New State Privacy Laws Coming In 2026

As we saw in Part 1 of this series, 2025 brought several important changes to state privacy laws, but more will arrive in the new year.  In 2026, three more state privacy laws take effect, and two states will roll out sweeping amendments. Here’s what to expect:

New laws effective January 1, 2026: Indiana, Kentucky, and Rhode Island

Each of these three states will add to the growing patchwork of comprehensive state privacy laws. If these laws apply to your organization and you have not already begun preparing, you should do so now. Below are the highlights:

  • The Indiana Consumer Data Protection Act will apply if a company collects 100,000 residents' personal information (or just 25,000, if 50% of the company's revenues come from selling data). Consumers' rights include the right to know what information has been collected and shared, the right to access their information, the right to correct inaccuracies, the right to have their information deleted, and the right to opt out of (i) the sale of their information, (ii) targeted advertising, and (iii) profiling. Sensitive personal information can only be handled based on affirmative consent (opt-in). The Act becomes effective January 1, 2026, and will be enforced by the Attorney General (with a 30-day cure period available).
  • The Kentucky Consumer Data Privacy Act applies if a company collects 100,000 residents' personal information (or just 25,000, if 50% of the company's revenues come from selling data). Consumers' rights include the right to know what information has been collected and shared, the right to access their information, the right to correct inaccuracies, the right to have their information deleted, and the right to opt out of (i) the sale of their information, (ii) targeted advertising, and (iii) profiling. Sensitive personal information and children's information can only be handled based on affirmative consent (opt-in). The Act becomes effective January 1, 2026, and will be enforced by the Attorney General (with a 30-day cure period available).
  • The Rhode Island Data Transparency and Privacy Act applies if a company collects just 35,000 residents' personal information (or just 10,000, if 20% of the company's revenues come from selling data). Consumers' rights include the right to know what information has been collected and shared, the right to access their information, the right to correct inaccuracies, the right to have their information deleted, and the right to opt out of (i) the sale of their information, (ii) targeted advertising, and (iii) profiling. Sensitive personal information and children's information can only be handled based on affirmative consent (opt-in). The Act becomes effective January 1, 2026, and will be enforced by the Attorney General (with no right to cure).
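All three statutes use the same two-prong applicability pattern: a headline consumer count, or a lower count paired with a minimum share of revenue from selling data. As a purely illustrative sketch of that logic (not legal advice; the function and data-structure names are invented, and the figures are simply the ones summarized above), the test can be expressed as:

```python
# Illustrative two-prong applicability test used by many comprehensive
# state privacy laws. Figures come from the summaries above; the names
# and structure here are hypothetical, not drawn from any statute.

THRESHOLDS = {
    # state: (headline count, lower count, revenue share from data sales)
    "Indiana":      (100_000, 25_000, 0.50),
    "Kentucky":     (100_000, 25_000, 0.50),
    "Rhode Island": (35_000, 10_000, 0.20),
}

def law_applies(state: str, residents_processed: int,
                share_of_revenue_from_selling_data: float) -> bool:
    """Return True if either prong of the state's threshold is met."""
    headline, lower, share = THRESHOLDS[state]
    if residents_processed >= headline:
        return True
    return (residents_processed >= lower
            and share_of_revenue_from_selling_data >= share)

# A company processing 40,000 Rhode Island residents' data is covered
# even if it sells no data at all, while Indiana's higher headline
# count would not be met:
print(law_applies("Rhode Island", 40_000, 0.0))  # True
print(law_applies("Indiana", 40_000, 0.0))       # False
```

Real applicability analysis turns on statutory definitions (who counts as a "consumer," what counts as a "sale"), so a numeric check like this is only the first cut.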

Upcoming Amendments: Connecticut

The Connecticut Data Privacy Act was amended in several important ways:
  • The Act’s applicability threshold will be reduced to just 35,000 residents, but there is no minimum threshold when it comes to processing sensitive data or selling consumer personal data.
  • The exemption for financial institutions subject to the Gramm-Leach-Bliley Act (federal financial law) is removed and replaced with a data-level exemption for personal financial information subject to GLBA.
  • Connecticut expanded the definition of "consumer health data" to include "health status." The definition of sensitive data now includes disability or treatment, status as nonbinary or transgender, information derived from genetic or biometric information, neural data, certain financial account and payment card data, and government ID numbers (like SSNs). Sensitive personal data processing will require consent. Consumers will have the right to obtain a list of third parties to whom personal data was sold.
  • Consumers will have the right to question profiling results, be informed of the reasons for profiling decisions, review personal data used for profiling, and correct incorrect data used in profiling decisions about housing.
  • Consent will be required for processing or selling sensitive data.
  • Minors under 18 will be treated as children for purposes of the sale of data or targeted advertising.
  • Impact assessments will be required for processing related to profiling.
  • Privacy notices will require detailed information, such as categories of personal data sold and disclosures about targeted advertising. Importantly, companies must give notice if they are collecting personal data for the purpose of training an LLM.
Upcoming Amendments: Oregon

Amendments to the Oregon Consumer Data Privacy Act (effective January 1, 2026) focus on minors and automobiles:

  • The covered age for a minor increases to 15.
  • It prohibits processing personal data for targeted advertising, sale, or profiling used for legally significant decisions when the controller knows that the consumer is a minor (or willfully disregards age).
  • Motor‑vehicle manufacturers and affiliates are specifically covered for personal data collected from the use of a vehicle or any vehicle component. Automakers must honor a consumer's request to (i) delete personal data, (ii) provide a copy of their data, and (iii) stop selling their data or using it for targeted advertising.
  • The amendments also prohibit "selling" data about a person's precise geolocation without consent.
Together, these five state law changes will affect the U.S. privacy law landscape in ways that will have meaningful effects on many companies.

In the next post in this series, we'll look at state cybersecurity law changes in 2025 and 2026. 


Tuesday, October 28, 2025

2025 in Cybersecurity and Privacy Law - New State Privacy Laws

2025 has been an eventful year in privacy and cybersecurity law, with many changes at the state, federal and international levels, affecting almost every sector of the economy.  If you haven't been paying very close attention throughout the year, you may have missed some important changes.  In a series of blog posts (beginning with this one), I'll summarize the key changes you need to know.

Part 1 - State Privacy Laws

State privacy laws have been trickling out over the past five years, and by this time last year, comprehensive privacy laws were already in effect in several states, such as California, Colorado, Connecticut, Florida, Montana, Texas, Utah, Virginia and Washington.



Over the course of the year, laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee have become effective, as have significant amendments to existing privacy laws in California, Colorado, and Virginia.

At a very high level, the key provisions of these new laws are as follows: 

  • Delaware's Personal Data Privacy Act applies if a mere 35,000 residents' personal information is collected by a company (or just 10,000, if 20% of the company's revenues come from selling data).  The consumers' rights include the right to know various things including what information has been collected and shared, the right to access the consumer's information, the right to correct inaccuracies, the right to instruct the company to delete the consumer's information, the right of data portability, and the right to opt-out of (i) the sale of the consumer's information, (ii) targeted advertising, and (iii) profiling.  It became effective January 1, 2025. 
  • Iowa's statute applies if 100,000 residents' personal information is collected (or 25,000 if 50% of the company's revenues come from selling data). The consumer rights include the right to know, access, delete, and portability, and to opt out of sale/ads/sensitive personal information (not an opt-in structure, unless the consumer is known to be a child). There is no right to correct. It became effective January 1, 2025.
  • Maryland's statute also applies if a mere 35,000 residents' personal information is collected (or 10,000 if 20% of the company's revenues come from selling data). The consumer rights include the right to know, access, correct, delete, and portability, and to opt out of sale/ads/profiling (not an opt-in structure). It became effective October 1, 2025. As we'll explore later, Maryland's law has some other unique provisions.
  • Minnesota's statute applies if 100,000 residents' personal information is collected (or 25,000 if 25% of the company's revenues come from selling data). The consumer rights include the right to know, access, correct, delete, and portability, and to opt out of sale/ads/profiling (not an opt-in structure). An opt-in is required for collecting sensitive personal information or information about a known child. It became effective July 31, 2025.
  • Nebraska's statute applies to any company doing business in Nebraska that isn't a small business as defined by the SBA. The consumer rights include the right to know, access, correct, delete, and portability, and to opt out of sale/ads/sensitive personal information (not an opt-in structure, unless the consumer is known to be a child). It became effective January 1, 2025.
  • New Hampshire's statute is triggered if just 35,000 residents' personal information is collected (or 10,000 if 25% of the company's revenues come from selling data).  The consumer rights include the right to know, access, correct, delete, portability, and to opt-out of sale/ads/profiling.  An opt-in is required for sensitive personal information or for collecting information about a child.  It became effective January 1, 2025.
  • New Jersey's statute applies if 100,000 residents' personal information is collected (or 25,000 if the company earns revenues or discounts from selling data). The consumer rights include the right to know, access, correct, delete, and portability. An opt-out is required for the sale of information, profiling, and targeted ads, but an opt-in is required for sensitive personal information and for the sale of information, profiling, or targeted ads to teens. It became effective January 15, 2025.
  • In Tennessee, the law applies if the company's revenues exceed $25,000,000 and at least 175,000 residents' personal information is collected, or if 25,000 residents' information is collected and 50% of the company's annual gross revenues come from selling personal information. The consumer rights include the right to know, access, correct, delete, and portability. An opt-out is required for the sale of information, profiling, and targeted ads, but an opt-in is required for sensitive personal information or to collect information about a child. It became effective July 1, 2025.
Beyond these basic contours, a few state privacy statutory changes stand out:

In a clear reaction to the US Supreme Court's decision in Dobbs, Virginia amended its privacy law effective July 1, 2025 to make it unlawful to obtain, disclose, sell or disseminate any personally identifiable reproductive or sexual health information (other than health information under HIPAA) without the consent of the consumer.  Other states (e.g., California, Washington, Maryland) are also amending their laws to provide more protections for reproductive and sexual health data. 

California amended the California Consumer Privacy Act effective January 1, 2025, to create a new subcategory of "sensitive data" called "neural data," which means “information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.”  It's interesting to see California looking ahead to provide privacy protections as neural implants and brain scans are quickly developing. 

Colorado amended its privacy law in two important ways. First, the age of a minor was increased from 13 to 18 (effective October 1, 2025). Data collection from minors requires a special data protection assessment, and the company must take reasonable care to avoid harming a minor's privacy. In addition, there are prohibitions on processing minors' data for certain purposes, such as targeted advertising, profiling, sale, or processing precise geolocation data, with limited exceptions. Like some of the Age-Appropriate Design Code laws we've seen recently, the Colorado amendments prohibit using any design feature to significantly increase, sustain, or extend a minor's use. The other key change to Colorado law involves biometric data (effective July 1, 2025). It requires consent for collecting biometric data (including from employees, for most purposes), as well as a data protection impact assessment. There must be a written, publicly available policy with retention requirements and limits on collection of biometric identifiers, among other things. Companies cannot refuse to provide a service to a person who declines to provide biometric data unless the biometric data is necessary to provide the service. The biometric data requirements apply without regard to thresholds, but a consumer's right of access to biometric data only applies to companies that meet the general thresholds of the Colorado Privacy Act (100,000 consumers, or 25,000 consumers plus earning revenue or receiving a discount on goods or services from the sale of personal data).

Finally, the most impactful state privacy law change this year might be Maryland's new Online Data Privacy Act (MODPA). First, the threshold is low--just 35,000 consumers, or 10,000 consumers if 20% of gross revenue comes from the sale of personal data. There is also a prohibition on processing personal data of minors for targeted advertising, or on selling minors' personal data, if the company knew or should have known that the consumer was under the age of 18. Consumer health data receives special protections in Maryland, and that term includes data regarding reproductive or sexual healthcare, both of which are broadly defined. There is also a restriction on geofencing (using technology to establish a virtual boundary) within 1,750 feet of a mental health facility or sexual or reproductive health facility for the purpose of tracking, identifying, collecting data from, or sending a notification to the consumer regarding the consumer's health data. Perhaps most remarkable of all is Maryland's enhanced data minimization requirement. The statute limits collection of personal data to what is reasonably necessary and proportionate "to provide or maintain a specific product or service requested by the consumer." This seems more restrictive than the phrase used in other comprehensive state privacy laws ("in relation to the specified purposes for which the data are processed"). When it comes to sensitive personal data, the statute prohibits processing unless it is strictly necessary. Sensitive data includes consumer health data, genetic and biometric data, personal data of a child, precise geolocation data, and data revealing racial or ethnic origin, religious beliefs, sex life, sexual orientation, status as transgender or nonbinary, national origin, or citizen or immigration status. It will be very interesting to see how Maryland's Attorney General and courts enforce this strict new data minimization requirement.
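For context on the 1,750-foot geofencing restriction: in software, a geofence is typically enforced by comparing a device's reported coordinates against a boundary around a location. As a purely illustrative sketch (the function names and coordinates below are invented, and a real system would have to account for GPS error, the facility's actual footprint, and the statutory definitions), a simple great-circle distance check looks like this:

```python
# Hypothetical sketch of a circular geofence check, using the haversine
# formula for great-circle distance. Not a compliance tool.
from math import radians, sin, cos, asin, sqrt

FEET_PER_METER = 3.28084
EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    meters = 2 * EARTH_RADIUS_M * asin(sqrt(a))
    return meters * FEET_PER_METER

def inside_restricted_zone(device, facility, limit_feet=1_750):
    """True if the device's (lat, lon) falls within the radius."""
    return distance_feet(*device, *facility) <= limit_feet
```

Note that MODPA does not ban the distance calculation itself; what it restricts is using such a boundary near covered health facilities to track, identify, collect data from, or message consumers about their health data.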

Stay tuned for Part 2....




(Thanks to Karin McInnis of Moore & Van Allen for partnering with me on a recent continuing legal education seminar summarizing 2025 and 2026 changes in privacy and cybersecurity law.  This series is based on that CLE and her excellent contributions.)




Monday, January 2, 2023

How To Think About Privacy As An Enterprise Risk in 2023



A new year is upon us, and with it will come major changes in how organizations handle personal data. Of course, this is not the first time we've seen significant changes:

  • 2018 brought enormous changes to Europe as the General Data Protection Regulation (GDPR) became effective;
  • 2020 brought major changes to the U.S. as the California Consumer Privacy Act became effective; and
  • 2021 ushered in massive change in China with the introduction of the Personal Information Protection Law (PIPL) and the Data Security Law.
This year will also be a year of change, as multiple jurisdictions implement new laws governing personal data, automation, and digital commerce. In the U.S., Canada, and Europe, strict new laws will significantly increase the level of existing regulation, and many people will gain new legal rights that they have never before had.

Rather than list all of the many new personal data protection laws coming into effect in 2023, I would like to offer some high-level thoughts about personal data risk in 2023 that organizations should consider: 

  • Overall, privacy risk is trending strongly upwards, as a result of more complex and strict privacy laws. Accordingly, past experience is a poor indicator of future results.  The likelihood and severity of a privacy violation cannot be predicted using historical data alone.  Therefore, many common risk quantification models will be insufficient to predict privacy risk.
  • It is becoming more difficult to assess risk globally. Fines and settlements are based on a variety of factors that differ from jurisdiction to jurisdiction.  An activity can be lower risk in one jurisdiction and higher risk in another.  Global organizations need to understand the risk environment in every country in which they operate.  In the past, it may have been acceptable to simply apply GDPR as a global standard, but it is probably not wise to take such a simplified approach in the future.
  • Many jurisdictions utilize an enforcement model focused on deterrence rather than consistent application. Given limited enforcement resources, they aim for a small number of very large fines which will act as a deterrent rather than aiming to catch all violations and punish them proportionately to the harm they cause.  Therefore, plenty of companies will "get away with" privacy violations, which may create a false sense of security.  Those who are targeted for enforcement are likely to be punished quite severely.
  • As many companies--especially consumer-facing companies--continue to pursue digital transformations, they are adding more and more technologies and third-party data custodians. This creates internal complexity and an ever-expanding personal data environment. An expanding personal data environment requires more and more resources to govern effectively, and at some point can become unsustainable. Organizations should apply a rigorous process to their digital transformations that ensures older technologies and third-party data custodians are retired as rapidly as new ones are onboarded. This means explicitly acknowledging tradeoffs and making hard choices.
  • As organizations pursue agility and decentralization, they are granting more autonomy to individual business units to make decisions closer to the "front lines."  This can be a smart management strategy.  However, organizations should know that personal data privacy risk cannot be limited to a business unit.  If one business unit creates a privacy violation, the laws increasingly hold the entire organization (meaning the top-level parent organization and all affiliated entities) responsible, and fines are often based on the global revenue of the entire global enterprise.  For example, if a small division of a small local subsidiary violates the GDPR or China's PIPL, the result could be a massive fine equal to 4-5% of the entire global revenue of all affiliated companies.  Similarly, cyberinsurance underwriters consider risk holistically, and a poor practice by one small division can affect the insurability of an entire enterprise.  Finally, reputational risks often cannot be limited to a single brand or business unit of the organization.  Media reports have tended to name the parent organization or affiliated brands in negative press coverage, even when the privacy violation was committed by only one small division of the company. Accordingly, organizations probably should not allow small divisions to take on risk that could threaten the entire enterprise.
I hope these thoughts are helpful to anyone considering a privacy risk management strategy in 2023.