
In the absence of comprehensive artificial intelligence ("AI") regulation at the federal level, state and local legislatures have been busy considering their own AI legislation. Laws regulating automated decision making have emerged as an early priority of state and local governments. While there is no uniform definition of "automated decision making," it can be understood to mean the use of AI, machine learning systems, and/or algorithms to make decisions with minimal or no human input and control.
Legislatures appear keen to mitigate the perceived risk that AI systems will be used to make consequential decisions that discriminate against, or otherwise adversely impact, consumers across a range of contexts, particularly with respect to employment and employment opportunities. Additionally, many state privacy laws provide consumers with the right to opt out of data processing for profiling based on automated decisions.
This alert focuses on laws that are currently in effect. State and local laws regulating automated decision making have been passed in Colorado, Illinois, and New York City. State privacy laws generally provide consumers with the right to opt out of data processing for profiling based on automated decisions that produce legal or similarly significant effects concerning consumers. They also require covered entities to conduct data protection assessments and meet notice and access obligations.
That said, many other states and localities are likely to follow suit. State legislatures in Connecticut, Massachusetts, New Mexico, New York, and Texas are currently considering draft laws that generally track the Colorado AI Act and would similarly regulate automated decision making. In February 2025, the state legislature in Virginia narrowly passed HB 2094, which also generally tracks the Colorado AI Act; however, the fate of HB 2094 remains uncertain, as many observers believe that Governor Youngkin is likely to veto the bill. Consumers can also expect amendments to state privacy laws, or newly proposed state privacy legislation, that mirror the opt-out rights related to automated decision making discussed in this alert.
To stay up-to-date on the latest AI regulatory developments in the United States, please reference White & Case's helpful AI Watch tracker and be on the lookout for additional Client Alerts.
State and Local Laws Regulating Automated Decision Making
Currently, Colorado, Illinois, and New York City have enacted bills that regulate automated decision making by AI systems. Each of these laws aims to prevent the use of AI decision making in a way that results in discrimination against consumers based on protected classes.
Colorado
As discussed in further detail in White & Case's previous Client Alert, the Colorado AI Act ("Act") – which goes into effect on February 1, 2026 – is the first comprehensive AI legislation in the United States and the most onerous of the extant AI laws. The Act applies to both private and government entities that are developers or deployers of high-risk AI systems. For purposes of the Act, a "developer" is an individual, corporation, or other legal or commercial entity doing business in Colorado1 that "develops or intentionally and substantially modifies" an AI system.2 A "deployer" is an individual, corporation, or other legal or commercial entity doing business in Colorado that deploys a high-risk AI system.3
Key Obligations
The Act imposes a duty of reasonable care4 on developers and deployers of "high-risk AI systems" to protect consumers5 from any known or reasonably foreseeable risks of "algorithmic discrimination" arising from the use of high-risk AI systems across eight enumerated contexts.6 Among other requirements, the Act imposes transparency and reporting obligations on deployers and developers. For example, deployers of high-risk AI systems must complete annual impact assessments for such systems.7 Additionally, deployers and developers must disclose any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended uses of a high-risk AI system.8
Enforcement
The Colorado Attorney General has exclusive enforcement authority to address violations of the Act, which constitute unfair trade practices.9 Violations may be punished with fines or injunctive relief. The Act does not provide a private right of action for aggrieved consumers.
Illinois
The Illinois state legislature amended the state's Human Rights Act ("HR Act") to regulate automated decision making by AI systems solely in various employment contexts (cf. the eight enumerated contexts set out in the Colorado AI Act). The amendments – which enter into force on January 1, 2026 – aim to prevent AI systems from being used to discriminate against consumers based on protected classes.
The HR Act applies to private and government employers.
Key Obligations
- The amendments to the HR Act clarify that employers commit a human rights violation if they use AI systems in a way that results in discrimination against consumers based on a protected class in the following employment contexts: (i) recruitment; (ii) hiring; (iii) promotion; (iv) renewal of employment; (v) selection for training/apprenticeship; (vi) discharge; (vii) discipline; (viii) tenure; or (ix) the terms, privileges, or conditions of employment.10
- Under the HR Act, protected classes are defined with reference to federal and Illinois law, including race, religion, sex, national origin, disability, sexual orientation, and immigration status.11
Enforcement
Aggrieved persons can file a complaint with the Illinois Department of Human Rights ("IDHR"), which can investigate and hold public hearings regarding the complaints.12 The IDHR can also escalate complaints to the Illinois Human Rights Commission, which is empowered to order monetary and non-monetary penalties.13
New York City
New York City's Local Law 144 ("Local Law") – which is currently in effect – applies to public and private employers and employment agencies ("Employer(s)"). The Local Law regulates Employers' use of automated employment decision tools ("AEDTs") in connection with employment decisions concerning individuals who reside in New York City.
The Local Law defines an "AEDT" as a "computational process" that "is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons."14
Key Obligations
- Transparency:
- The Local Law prohibits the use of AI systems to make employment decisions15 unless the Employer has (i) conducted an independent bias audit16 of the tool's disparate impact on persons of a protected class, and (ii) published the results of the bias audit.17
- Prior notice:
- Further, entities that use AEDTs to make, or assist in making, employment decisions must give advance notice to candidates and allow them to opt out of the use of the AEDT. Notice to candidates must be given no less than 10 business days prior to use of the AEDT.18
Enforcement
The Corporation Counsel (the head of the New York City Law Department) or others designated by the Corporation Counsel may bring an action or proceeding in court for violations of the Local Law. Individuals who believe that an Employer violated the Local Law can submit their complaints to the New York City Commission on Human Rights, which may refer the complaint to the Corporation Counsel.19 Violations of the Local Law carry monetary civil penalties ranging from $500 to $1,500 per violation.20
State Privacy Laws
State privacy laws generally impose specific obligations on covered businesses engaging in profiling activities. These obligations include providing consumers with the right to opt out of profiling that results in legal or similarly significant effects, conducting data protection assessments, and meeting certain notice and access requirements. These state privacy laws closely mirror one another and define "processing" and "profiling" to include automated processes, thereby capturing automated decision making.
Key Obligations
- Consumer's Right to Opt-Out
- Decisions that produce legal or similarly significant effects refer to decisions made by controllers in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health-care services, or access to essential goods or services.
- The definition of "profiling" varies across state privacy laws. While most states define profiling as any form of automated processing, some – such as Connecticut, Indiana, Maryland, Nebraska, and Texas – narrow the definition to solely automated processing. This means that profiling activities involving any human involvement are excluded from the scope of consumers' right to opt out of profiling.
- Colorado, however, takes a tiered approach, categorizing automated processing into three types: solely automated processing, human-reviewed automated processing, and human-involved automated processing. Consumers have the right to opt out of profiling only in cases of solely automated processing and human-reviewed automated processing.
- California privacy laws grant the California Privacy Protection Agency ("CPPA") rulemaking authority to establish regulations governing access and opt-out rights related to businesses' use of automated decision-making technology ("ADMT"), including but not limited to profiling. In its proposed regulations on Risk Assessment and ADMT ("Proposed Risk Assessment and ADMT Regulation"), the CPPA proposes recognizing the right to opt out of ADMT use for specific purposes beyond profiling, such as training ADMT capable of generating deepfakes, performing physical or biological identification, or establishing an individual's identity.21
- Notably, the definition of "personal data" in state privacy laws excludes publicly available information. As a result, businesses processing publicly available information for profiling purposes are not required to provide consumers with the right to opt out.
- Complying with Consumer's Request to Opt-Out
- Each of the state privacy laws requires data controllers to abide by a consumer's request to opt out.
- Under each of the state privacy laws, a controller must respond within 45 days of receipt of the consumer's opt-out request. This period can be extended by an additional 45 days where reasonably necessary, considering the complexity and number of requests.
- If a controller does not take action on the consumer's opt-out request, the controller must inform the consumer without undue delay, and at the latest within 45 days of receipt. The controller must also provide instructions on how the consumer can appeal this decision.
- Within 60 days of receipt of an appeal, the controller must inform the consumer in writing of any action taken or not taken in response to the appeal.
- Opt-in Requirement in New Jersey for Profiling Concerning Minors
- New Jersey's privacy rules prohibit covered organizations from processing personal data for profiling that produces legal or similarly significant effects without consent if they know, or willfully disregard, that the consumer is between 13 and 17 years old.22
- Data Protection Assessment
- State privacy laws, except those of Iowa and Utah, require covered organizations to conduct a data protection assessment for processing activities that present a heightened risk of harm to consumers, which includes profiling that presents a reasonably foreseeable risk of substantial injury to consumers. The assessment must explain how the organization weighs the benefits of the processing against the potential risks to individuals and highlight the safeguards in place to mitigate those risks.
- Colorado's privacy law and California's Proposed Risk Assessment and ADMT Regulation prescribe additional content requirements for data protection assessments by covered organizations processing personal data for profiling purposes.23 These requirements include details on policies, procedures, and training to prevent discrimination; how the business uses outputs from ADMT; and the logic behind the ADMT, including any underlying assumptions.
- Notice Requirement
- State privacy laws require organizations to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes information on how consumers can exercise their rights, such as the right to opt out of profiling activities.
- Colorado's privacy law and California's Proposed Risk Assessment and ADMT Regulation require organizations involved in profiling or ADMT use to include additional information in their privacy notices. These laws require covered organizations to include a layman's explanation of the profiling process, how profiling is used in decision making, and whether the automated decision making system has been evaluated for accuracy, fairness, or bias.24
- California's Proposed Risk Assessment and ADMT Regulation also requires organizations that process personal information to train AI or ADMT, and that make the technology available to others, to provide users with a plain-language explanation of any requirements or limitations on the use of the ADMT or AI.25
- Access Right
- State privacy laws grant consumers the right to confirm whether a controller is processing their personal data and to access that data, including information about processing for profiling purposes. California's Proposed Risk Assessment and ADMT Regulation goes further by requiring organizations to include specific details in their responses to access requests. This includes explaining how the ADMT operated for the consumer, the logic used, and key parameters that influenced the system's output.26
Enforcement
Except in California, where consumers' private right of action is limited to security breaches, state privacy laws do not provide a private right of action. Instead, the state attorney general or department of justice has exclusive enforcement authority to seek injunctive relief or monetary penalties against entities that fail to comply with state privacy laws. Notably, in addition to the CPPA's rulemaking authority in California, the Colorado and New Jersey state privacy laws grant their respective attorneys general authority to promulgate rules governing privacy, including the right to opt out.
Finally, with the exception of Rhode Island's, state privacy laws offer businesses a chance to remedy alleged violations before enforcement actions are taken (known as the "right to cure"), though in some states this opportunity expires after a certain period of time.
Burak Haylamaz (White & Case, Staff Attorney, Los Angeles) contributed to the development of this publication.
1 The concept of "doing business in Colorado" is not defined in the Act. The Colorado Department of Revenue suggests that businesses without a physical location in Colorado are "doing business in Colorado" if they "solicit business and receive orders from Colorado residents by any means whatsoever." Accordingly, we expect this phrasing to be interpreted broadly; however, the Act empowers the Colorado Attorney General to promulgate rules, as necessary, for the purpose of implementing and enforcing the Act, so a narrower scope of applicability may be applied prior to the February 2026 effective date.
2 Colo. Rev. Stat. § 6-1-1702.7.
3 Colo. Rev. Stat. § 6-1-1702.6.
4 Colo. Rev. Stat. §§ 6-1-1702.1, 6-1-1703.1.
5 A "consumer" is an individual who is a Colorado resident. See Colo. Rev. Stat. § 6-1-1701.4.
6 Education enrollment or opportunity, employment or an employment opportunity, a financial or lending service, essential government service, healthcare services, housing, insurance, and legal services.
7 Colo. Rev. Stat. § 6-1-1703.3.
8 Colo. Rev. Stat. § 6-1-1702.5.
9 Colo. Rev. Stat. § 6-1-1706.
10 775 Ill. Comp. Stat. 5/2-102(L).
11 https://dhr.illinois.gov/filing-a-charge/jurisdiction-chart.html.
12 https://hrc.illinois.gov/process/faqs.html#faq-1howisachargeinitiatedundertheillinoishumanrightsact-faq.
13 https://dhr.illinois.gov/filing-a-charge/human-rights-commission.html.
14 N.Y.C. Local Law No. 144 § 20-870.
15 "Employment decision" means to screen candidates for employment or employees for promoting within the city. N.Y.C. Local Law No. 144 § 20-870.
16 "Bias audit" means an impartial evaluation by an independent auditor that must test the AEDT to assess the tool's disparate impact. N.Y.C. Local Law No. 144 § 20-870.
17 N.Y.C. Local Law No. 144 § 20-871(a).
18 N.Y.C. Local Law No. 144 § 20-871(b).
19 N.Y.C. Local Law No. 144 § 20-873.
20 N.Y.C. Local Law No. 144 § 20-872(a).
21 See Article 11 of the CPPA Proposed Text (CCPA Updates, Cyber, Risk, ADMT, and Insurance Regulations).
22 New Jersey Privacy Act (SB 332) Section 9(a)(7).
23 Ibid., Article 10, Section 7152 of the CPPA Proposed Text and Rule 9.06(f) of the Colorado Privacy Act Rules.
24 Ibid., Article 11, Section 7220(c) of the CPPA Proposed Text and Rule 9.04 of the CPA Rules.
25 Ibid., Article 10, Section 7153(b) of the CPPA Proposed Text.
White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.
This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.
© 2025 White & Case LLP