In July, Oregon’s governor signed into law the Oregon Consumer Privacy Act (“OCPA”), making Oregon the eleventh state to enact a comprehensive privacy law. The OCPA goes into effect on July 1, 2024. Covered businesses other than applicable non-profits must comply with the OCPA by that date. Applicable non-profits will become subject to the OCPA on July 1, 2025.
On June 30, 2023, a court in Sacramento issued an order enjoining enforcement of the implementing regulations promulgated by the California Privacy Protection Agency (CPPA) under the California Privacy Rights Act of 2020 (CPRA). If the order stands, enforcement will be delayed until March 29, 2024.
In June, Texas became the tenth state with a comprehensive privacy law. The Texas Data Privacy and Security Act (“TDPSA”) contains many provisions familiar from other state privacy laws regulating the collection, use, processing, and treatment of consumers’ personal data, but also includes Texas-specific provisions. The TDPSA will be effective as of July 1, 2024, allowing a one-year compliance period.
This month, Indiana, Montana and Tennessee passed comprehensive privacy laws. Each closely tracks the comprehensive privacy laws outside of California, but with some variations. None applies to employee data or has a private right of action. All have cure rights. Tennessee uniquely provides an affirmative defense for controllers who follow the NIST privacy framework. Tennessee’s law will go into effect July 1, 2024, giving businesses just over a year to prepare to comply. Indiana’s law affords businesses more time to comply – it will not take effect until January 1, 2026. Montana’s law will go into effect October 1, 2024. Below is a summary of key points from each law.
Last week the Florida Senate passed its version of a comprehensive privacy law (SB 262), entitled the Florida Digital Bill of Rights. If signed by Governor DeSantis, the Digital Bill of Rights will require large companies (those with at least $1 billion in annual global gross revenues that meet other metrics) to provide consumers with certain rights, including access, correction and deletion rights, opt-ins for processing of sensitive personal information and data of known children, and the right to opt out of the collection of data for targeted advertising, profiling, and voice recognition. Although the threshold for coverage is high, the obligations are significant, including reasonable security measures, fair information practices, data protection assessments, mandated data retention limits, specific disclosures if the controller is engaged in targeted advertising, and a controversial requirement for disclosure of search engine methodology. Although there is no private cause of action, the Florida Department of Legal Affairs can enforce the law and impose civil penalties up to $50,000 per violation with trebling in certain instances.
As artificial intelligence systems such as ChatGPT and Midjourney have become increasingly prominent, so have concerns about the effects that such programs may have on the economy and society at large. With more businesses incorporating artificial intelligence (“AI”) into their operations, these apprehensions about its use become more salient every day. While AI holds great potential for innovation, automation, and streamlining tasks, the algorithms powering AI are not free from the biases reflected in the data and content that they are fed, creating risks of violating civil rights and consumer protection laws.
Iowa has become the latest state to enact a consumer privacy law, joining California, Colorado, Connecticut, Utah, and Virginia. On March 28, Governor Kim Reynolds signed into law Senate File 262, which, effective January 1, 2025, will provide Iowa consumers various protections over their personal data. The law applies to businesses that either conduct business in Iowa or produce products or services targeting Iowa consumers AND that either control or process personal data of at least 100,000 consumers or control or process personal data of at least 25,000 consumers while deriving more than 50% of gross revenue from the sale of personal data. Unlike California’s comprehensive privacy law, the Iowa statute does not have a revenue threshold for application of the statute. The statute excludes from coverage financial institutions and their affiliates, data subject to the GLBA, and HIPAA covered entities, among others.
On August 11, 2022, the Consumer Financial Protection Bureau (“CFPB”) issued a circular (Circular 2022-04 or, the “Circular”) addressing whether insufficient data and information security practices can violate the prohibition against unfair acts or practices in the Consumer Financial Protection Act (“CFPA”). The CFPB concluded that inadequate security practices could give rise to a claim not only under federal data security laws like the Gramm-Leach-Bliley Act (“GLBA”), but also under the CFPA. The Circular discusses the elements of a claim under the CFPA and identifies a few specific practices that the CFPB views as likely to give rise to a violation of the CFPA. The Circular, however, does not otherwise provide direction to the industry on expected information security practices.
On May 29, 2022, Maryland amended the Maryland Personal Information Protection Act (PIPA). Effective October 1, 2022, the amendment (located here https://mgaleg.maryland.gov/2022RS/chapters_noln/Ch_502_hb0962E.pdf ) revises provisions regarding genetic information. The revisions leave the term “genetic information” undefined for purposes of notices required under PIPA. But the revisions also add a revised definition of genetic information as it applies to all other provisions of the law, including provisions requiring investigation into a data breach and the requirement that businesses implement and maintain reasonable security procedures and practices. Specifically, the revised definition includes data that results from the analysis of a biological sample of the individual, or from another source concerning genetic material that enables equivalent information to be obtained; DNA, RNA, genes, chromosomes, alleles, and genomes; alterations or modifications to DNA or RNA; single nucleotide polymorphisms; and information extrapolated, derived or inferred from such data, unless the information is encrypted, redacted or otherwise protected by a method that renders it unreadable or unusable.
Late last month the Securities and Exchange Commission (“SEC”) charged JP Morgan, UBS and TradeStation with violations of Regulation S-ID based on a range of inadequacies in their identity theft red flag policies and procedures. https://www.sec.gov/news/press-release/2022-131 The violations at issue might seem less than critical: not updating policies, merely copying over examples of red flags from Reg S-ID’s Appendix A, not incorporating specific policies into the red flag program, covering all accounts instead of conducting specific account assessments, and not providing sufficient detail in board reports. Although the SEC did not note any failure by these broker-dealers and investment advisers to actually detect and respond to identity theft red flags, the resulting orders and fines (up to $1.2 million) underline the SEC’s seriousness about protecting investors from cybercrime by requiring broker-dealers and investment advisers to up their game and focus on the details.
About Data Points: Privacy & Data Security Blog
The technology and regulatory landscape is rapidly changing, impacting how companies across all industries operate, particularly in the ways they collect, use and secure confidential data. We provide transparent and cutting-edge insight on critical issues and dynamics. Our team informs business decision-makers about the information they must protect, and what to do if and when security is breached.