This month, Indiana, Montana, and Tennessee passed comprehensive privacy laws. Each tracks closely with the comprehensive privacy laws enacted outside of California, albeit with some variations. None applies to employee data or creates a private right of action, and all provide cure rights. Tennessee uniquely provides an affirmative defense for controllers who follow the NIST privacy framework. Tennessee’s law will go into effect July 1, 2024, giving businesses just over a year to prepare to comply. Indiana’s law affords businesses more time – it will not take effect until January 1, 2026. Montana’s law will go into effect October 1, 2024. Below is a summary of key points from each law.
Last week the Florida Senate passed its version of a comprehensive privacy law (SB 262), entitled the Florida Digital Bill of Rights. If signed by Governor DeSantis, the Digital Bill of Rights will require large companies (those with at least $1 billion in annual global gross revenues that meet certain other metrics) to provide consumers with certain rights, including access, correction, and deletion rights, opt-ins for the processing of sensitive personal information and data of known children, and the right to opt out of the collection of data for targeted advertising, profiling, and voice recognition. Although the threshold for coverage is high, the obligations are significant, including reasonable security measures, fair information practices, data protection assessments, mandated data retention limits, specific disclosures if the controller is engaged in targeted advertising, and a controversial requirement to disclose search engine methodology. Although there is no private cause of action, the Florida Department of Legal Affairs can enforce the law and impose civil penalties of up to $50,000 per violation, with trebling in certain instances.
As artificial intelligence systems such as ChatGPT and Midjourney have become increasingly prominent, so have concerns about the effects that such programs may have on the economy and society at large. With more businesses incorporating artificial intelligence (“AI”) into their operations, these apprehensions about its use become more salient every day. While AI’s potential for innovation, automation, and streamlining tasks is great, the algorithms powering AI are not free from the biases reflected in the data and content they are fed, creating risks of violating civil rights and consumer protection laws.
Iowa has become the latest state to enact a consumer privacy law, joining California, Colorado, Connecticut, Utah, and Virginia. On March 28, Governor Kim Reynolds signed into law Senate File 262, which, effective January 1, 2025, will provide Iowa consumers various protections over their personal data. The law applies to businesses that either conduct business in Iowa or produce products or services targeting Iowa consumers AND that either control or process the personal data of at least 100,000 consumers, or control or process the personal data of at least 25,000 consumers while deriving more than 50% of gross revenue from the sale of personal data. Unlike California’s comprehensive privacy law, the Iowa statute does not have a revenue threshold for application of the statute. The statute excludes from coverage, among others, financial institutions and their affiliates, data subject to the GLBA, and HIPAA covered entities.
On August 11, 2022, the Consumer Financial Protection Bureau (“CFPB”) issued a circular (Circular 2022-04, or the “Circular”) addressing whether insufficient data and information security practices can violate the prohibition against unfair acts or practices in the Consumer Financial Protection Act (“CFPA”). The CFPB concluded that inadequate security practices could give rise to a claim not only under federal data security laws like the Gramm-Leach-Bliley Act (“GLBA”), but also under the CFPA. The Circular discusses the elements of a claim under the CFPA and identifies a few specific practices that the CFPB views as likely giving rise to a violation of the CFPA. The Circular, however, does not otherwise provide direction to the industry on expected information security practices.
On May 29, 2022, Maryland amended the Maryland Personal Information Protection Act (PIPA). Effective October 1, 2022, the amendment (located here https://mgaleg.maryland.gov/2022RS/chapters_noln/Ch_502_hb0962E.pdf ) revises provisions regarding genetic information. The term “genetic information” remains undefined for purposes of the notices required under PIPA. But the revisions add a revised definition of genetic information as it applies to all other provisions of the law, including the provisions requiring investigation into a data breach and the requirement that businesses implement and maintain reasonable security procedures and practices. Specifically, the revised definition includes data that results from the analysis of a biological sample of the individual, or from another source that concerns genetic material and enables equivalent information to be obtained; DNA, RNA, genes, chromosomes, alleles, and genomes; alterations or modifications to DNA or RNA; single nucleotide polymorphisms; and information extrapolated, derived, or inferred from such data – unless the information is encrypted, redacted, or otherwise protected by a method that renders it unreadable or unusable.
Late last month the Securities and Exchange Commission (“SEC”) charged JP Morgan, UBS, and TradeStation with violations of Regulation S-ID based on a range of inadequacies in their identity theft red flag policies and procedures. https://www.sec.gov/news/press-release/2022-131 The violations at issue might seem less than critical: not updating policies, merely copying over examples of red flags from Reg S-ID’s Appendix A, not incorporating specific policies into the red flag program, covering all accounts instead of conducting specific account assessments, and not providing sufficient detail in board reports. Although the SEC did not note any failure by these broker-dealers and investment advisers to actually detect and respond to identity theft red flags, the resulting orders and fines (up to $1.2 million) underline the SEC’s seriousness about protecting investors from cybercrime by requiring broker-dealers and investment advisers to up their game and focus on the details.
The American Data Privacy and Protection Act (the “ADPPA”), a bill that would establish a comprehensive federal data privacy framework in the U.S., was formally introduced in the U.S. House of Representatives on June 21, 2022. Should the ADPPA become law, the United States would join the European Union and a handful of other countries, such as Canada, Brazil, and New Zealand, in having a comprehensive data protection framework at the national level.
The U.S. Equal Employment Opportunity Commission (“EEOC”) is tasked with administrative enforcement of a variety of employment discrimination laws, including the Americans with Disabilities Act as amended (the “ADAAA”). The ADAAA prohibits discrimination against job applicants and employees based on “disabilities,” generally defined as a physical or mental impairment that substantially limits the individual in a major life activity. Employers are required to provide disabled employees with a reasonable accommodation to enable them to perform the essential functions of their job, unless the reasonable accommodation would impose an undue hardship on the employer, or in certain instances where the employee would still pose a direct threat to the health or safety of themselves or others that cannot be addressed by a reasonable accommodation. It is interesting, therefore, that the EEOC issued Technical Assistance on May 12, 2022 entitled The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees. The stated concern is that the use of AI tools will disadvantage job applicants and employees with disabilities.
The EEOC’s Technical Assistance is not law. It is not even a regulation. But it does signal how the EEOC might deal with charges of discrimination brought by applicants and employees based on an employer’s use of AI.
On May 10, 2022, Connecticut became the fifth state in the U.S. to enact a comprehensive data privacy statute.
Effective July 1, 2023, the law imposes CCPA-like requirements on covered businesses. In scope and requirements, the law more closely mirrors Virginia’s and Colorado’s comprehensive privacy laws, effective January 1, 2023 and July 1, 2023, respectively.
About Data Points: Privacy & Data Security Blog
The technology and regulatory landscape is rapidly changing, impacting the manner in which companies across all industries operate – specifically, the ways they collect, use, and secure confidential data. We provide transparent and cutting-edge insight on critical issues and dynamics. Our team informs business decision-makers about the information they must protect, and what to do if/when security is breached.