
Consumer Protection Dispatch: The Latest Developments in the World of Consumer Protection (5/22/24)

(The Consumer Protection Dispatch summarizes industry news and updates on emerging consumer protection issues, including, but not limited to, data and AI.)

This week’s edition includes the latest developments relating to AI laws passed by Colorado, Tennessee, and Utah; the U.S. Senate’s bipartisan working group on AI; a new privacy bill from Vermont and a new privacy law from Maryland; a new Colorado law protecting neural data; and updates from the California Privacy Protection Agency.

U.S. STATE AI LAWS

Colorado Passes Comprehensive AI Law
On May 17, 2024, Governor Jared Polis signed SB-205, “An Act Concerning Consumer Protections in Interactions with Artificial Intelligence Systems,” into law. The law takes effect on February 1, 2026, and applies to any person doing business in Colorado who develops an “AI system” or deploys a “high-risk AI system.” “Consumer” is defined as “an individual who is a Colorado resident.” Because the law contains no exception for the employment context, SB-205 covers consequential employment-related decisions. The law requires human oversight and documentation of AI systems and addresses AI bias. It covers both “developers” and “deployers” of high-risk AI systems and requires both parties to use reasonable care to protect consumers from any known or reasonably foreseeable algorithmic discrimination—i.e., any condition in which the use of an AI system results in unlawful treatment or impact based on certain protected classes. SB-205 also mandates that deployers complete impact assessments annually and within 90 days of any intentional or substantial modification to a high-risk AI system. SB-205 gives sole enforcement authority to the state attorney general, meaning there is no private right of action for individuals.

Tennessee’s ELVIS Act Addresses AI in the Music Industry
On March 21, 2024, Tennessee Governor Bill Lee signed the “Ensuring Likeness Voice and Image Security Act” (ELVIS Act) into law, which aims to protect individuals from unauthorized AI deepfakes. While musicians immediately come to mind, the ELVIS Act will also protect voice actors and podcasters from the unfair use of their voices. The law bars the use of AI to mimic a person’s voice without their consent and clarifies when the use of a person’s voice or likeness is deemed “fair use” and therefore not a violation of the individual’s rights. The law provides for criminal enforcement, with penalties of up to 11 months and 29 days in jail and/or fines of up to $2,500, and permits a private right of action. The law goes into effect on July 1, 2024.

Utah Enacts AI Law
On March 13, 2024, Utah Governor Spencer Cox signed the Utah Artificial Intelligence Policy Act (UAIPA), effective May 1, 2024. The law mandates that businesses using generative AI disclose its use to consumers and prohibits using AI as a defense for violating consumer protection laws by claiming AI was an intervening factor. UAIPA applies to any entity or individual using AI to interact with Utah consumers. In particular, it covers those who engage in any activity regulated by the Utah Division of Consumer Protection and those who work in “regulated occupations”—i.e., occupations regulated by the Utah Department of Commerce that require a license or certification to practice. Penalties for non-compliance include fines of up to $2,500 per violation and potential legal actions. The law also establishes the Office of Artificial Intelligence Policy to oversee AI regulation and administer a regulatory sandbox for AI development.

Bipartisan Senate Group Reveals Plan on AI
On May 15, 2024, the Bipartisan Senate AI Working Group, which consists of Senate Majority Leader Chuck Schumer (D-N.Y.) and Sens. Todd Young (R-Ind.), Martin Heinrich (D-N.M.), and Mike Rounds (R-S.D.), unveiled a comprehensive roadmap for artificial intelligence policy. The roadmap, developed after a year of AI Insight Forums, emphasizes key legislative priorities including ethical AI development, workforce implications, data privacy, and national security. It calls for spending approximately $32 billion over several years, an amount recommended by the National Security Commission on Artificial Intelligence, on (non-defense) AI innovation. The initiative aims to ensure the U.S. leads in AI innovation while mitigating risks such as job displacement and misuse of AI technologies, fostering a balanced approach to AI advancement.

U.S. STATE COMPREHENSIVE PRIVACY LAWS

Vermont Passes a Robust Privacy Law
On May 12, 2024, the Vermont House of Representatives and the State Senate passed House Bill 121, which establishes the Vermont Data Privacy Act (VDPA). The bill has been sent to Gov. Phil Scott to be signed into law and is set to be one of the strongest privacy laws in the nation.

The VDPA applies to a person who conducts business in Vermont, or who produces products or services targeted to Vermont residents, and who during the preceding calendar year either (i) controlled or processed the personal data of not fewer than 25,000 consumers (excluding personal data controlled or processed solely for the purpose of completing a payment transaction) or (ii) derived more than 50% of its gross revenue from the sale of personal data. The bill also outlines exceptions to its applicability.

Notably, starting on January 1, 2027, the VDPA provides consumers with a private right of action against data brokers and “large data holders” (those that process the personal data of more than 100,000 state residents) for processing sensitive data without consent or for processing a consumer’s personal data “in a manner that discriminates against individuals or otherwise makes unavailable the equal enjoyment of goods or services on the basis of an individual’s actual or perceived race, color, sex, sexual orientation or gender identity, physical or mental disability, religion, ancestry, or national origin.” The private right of action is scheduled to be repealed on January 1, 2029, unless the state legislature acts to extend it. The VDPA also mandates reports from the attorney general’s office. Although the private right of action gives consumers significant ability to seek redress against data brokers and large data holders for violations, this provision may lead Gov. Scott to reject the bill, as he is reportedly concerned that it would subject Vermont businesses to frivolous claims. It remains unclear whether the bill will survive a potential veto.

The bill provides for consumer rights, including the right to opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer. It also specifies that a controller may not condition the exercise of consumer rights through the use of any false, fictitious, fraudulent, or materially misleading statement or representation, or through the employment of any dark pattern.

The bill also lays down the obligations of controllers and processors, including certain duties toward minors, defined as those under the age of 18. Notably, the bill states that a controller shall not process the personal data of a known minor for the purpose of targeted advertising.

The VDPA provides Vermonters with rights to access, delete, and correct their data and requires explicit consent for the collection of sensitive data. The law contains provisions relating to data security breach notification requirements and biometric personal data. It also introduces a private right of action, allowing individuals to sue companies for data violations, and contains a number of novel provisions, including an Age Appropriate Design Code that sets out age ranges for different stages of child development.

Recognizing that minors of different age ranges have distinct needs, the VDPA specifies the following developmental stages: zero to five years of age, or “preliterate and early literacy”; six to nine years of age, or “core primary school years”; 10 to 12 years of age, or “transition years”; 13 to 15 years of age, or “early teens”; and 16 to 17 years of age, or “approaching adulthood.” The VDPA requires age estimation of users of online services and provides examples of age estimation methods, such as: analysis of behavioral and environmental data the covered business already collects about its users; comparing the way a user interacts with a device or with other users of the same age; metrics derived from motion analysis; and testing a user’s capacity or knowledge.

A covered business subject to the VDPA’s Age Appropriate Design Code must also configure all default privacy settings provided to minor consumers to a high level of privacy; prominently provide privacy information, terms of service, policies, and community standards; and provide prominent tools that enable minors (and their parents or guardians, if applicable) to exercise their privacy rights. Such a business must also honor a minor’s request to unpublish their social media platform account and provide easily accessible and age-appropriate tools for a minor to limit the ability of users or covered entities to send unsolicited communications.

The VDPA also provides for the creation of an Artificial Intelligence and Data Privacy Advisory Council, which would be tasked with giving advice and counsel on the development, employment, and procurement of artificial intelligence in the Vermont state government. In addition, the law includes data broker credentialing and registration requirements. If signed into law, the VDPA will take effect on July 1, 2025.

Maryland Passes Consumer Privacy Legislation
On May 9, 2024, Maryland Governor Wes Moore signed the Maryland Online Data Privacy Act (MODPA), a comprehensive consumer data privacy law that will take effect on October 1, 2025. The new law contains unique data minimization rules, limiting data collection to what is “reasonably necessary” to provide or maintain a product or service requested by the consumer. MODPA prohibits the sale of sensitive personal data and restricts the processing of sensitive data to what is “strictly necessary” to provide or maintain a requested product or service. Like many other state laws, MODPA gives consumers the right to access, correct, and delete their data and to opt out of data collection for targeted advertising and profiling.

MODPA also mandates data protection assessments for each processing activity that presents a “heightened risk of harm to a consumer,” including an assessment for each algorithm that is used. Other processing activities that present a heightened risk include the processing of sensitive data, the sale of personal data, and targeted advertising, as well as certain types of profiling. Like other state laws, MODPA requires, among other things, a data protection assessment to identify and weigh the benefits to the controller, the consumer, other interested parties, and the public against the risks to the consumer, as mitigated by measures taken to reduce such risks. The data protection assessment requirement applies to processing activities that occur on or after October 1, 2025, and is not required for processing activities occurring before that date. This does not mean, however, that data collected before that date will be exempt from the requirement, because retaining the data is a form of “processing” that, if it occurs after October 1, will require a data protection assessment.

Enforcement will be handled by the Maryland Attorney General’s Office, which may issue a notice of violation if it determines that a cure is possible; in that event, the entity receiving the notice has at least 60 days to cure. The use of the word “may” indicates that the regulator has discretion to bring enforcement actions without giving prior notice or an opportunity to cure. MODPA offers no private right of action, but it does contain a limited right to cure violations until 2027.

CPPA Board Meeting Updates
The California Privacy Protection Agency Board meeting on May 10, 2024, included a recognition of the work of departing member Lydia De La Torre and the introduction of Drew Liebert as a new member. Mr. Liebert is an attorney who served over two decades in various staff leadership positions in the California legislature, most recently as Chief of Staff to the Senate Majority Leader and for 18 years as Chief Counsel to the Assembly Judiciary Committee, where he advised policymakers on major technology-related measures considered by the state legislature.

Legislative updates highlighted concerns about the American Privacy Rights Act (APRA) and included discussion of state bills impacting consumer privacy, such as AB 1949. AB 1949 proposes to expand the CCPA’s current consent requirement for the sale or sharing of the personal information of children under 16 to also cover the collection, use, and disclosure of children’s personal information, and to raise the age of consumers entitled to these protections to under 18. AB 1949 also proposes to eliminate the CCPA’s actual knowledge standard—i.e., the existing standard that a business knows, rather than “should have known,” that the consumer is a child—for determining who is entitled to opt-in protections under the CCPA. The Board adopted the CPPA staff’s recommendation to support AB 1949 if the bill is amended, among other things, to keep the actual knowledge standard for determining a child’s age or to adopt an alternative standard. The Board also planned a public affairs campaign to raise cybersecurity awareness through various media channels. No significant public comments or future agenda items were noted.

There were no significant updates on rulemaking. CPPA staff is working on a package for formal rulemaking while the Agency holds in-person stakeholder sessions in Los Angeles, Fresno, and Sacramento, to be concluded by May 22, 2024.

OTHER PRIVACY LAWS

Colorado Becomes the First State to Protect Neural and Biological Data
On April 17, 2024, Colorado enacted the Protect Privacy of Biological Data Act, which expands the Colorado Privacy Act’s definition of “sensitive data” to include biological and neural data, ensuring stronger privacy protections for Colorado residents. “Biological data” is defined as information generated from the technological processing, measurement, or analysis of an individual’s biological, genetic, biochemical, physiological, or neural properties that is used for identification purposes. The law defines “neural data” as information generated from the nervous system’s activity, often processed by devices like brain-computer interfaces. The law mandates explicit consent for processing this data and allows consumers to opt out of its use for targeted advertising and profiling. This legislation sets a precedent as the first state law to protect this class of data, reflecting the growing significance of neurotechnology in data privacy. Other states, including California and Minnesota, are considering similar measures.

