104TH GENERAL ASSEMBLY
State of Illinois
2025 and 2026
SB2203

 

Introduced 2/7/2025, by Sen. Graciela Guzmán

 

SYNOPSIS AS INTRODUCED:
 
New Act
815 ILCS 505/2HHHH new

    Creates the Preventing Algorithmic Discrimination Act. Provides that, on or before January 1, 2027, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses or designs, codes, or produces that includes specified information. Provides that a deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision and provide specified information. Provides that a deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. Provides that, within 60 days after completing an impact assessment required by the Act, a deployer shall provide the impact assessment to the Attorney General. Amends the Consumer Fraud and Deceptive Business Practices Act to make conforming changes.


LRB104 10978 SPS 21060 b

 

 

A BILL FOR

 


AN ACT concerning business.

    Be it enacted by the People of the State of Illinois, represented in the General Assembly:

    Section 1. Short title. This Act may be cited as the Preventing Algorithmic Discrimination Act.
 
    Section 5. Definitions. As used in this Act:
    "Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by State law. "Algorithmic discrimination" does not include:
        (1) the offer, license, or use of a high-risk artificial intelligence system by a deployer for the sole purpose of:
            (A) the deployer's self-testing to identify, mitigate, or prevent discrimination or otherwise ensure compliance with state and federal law; or
            (B) expanding an applicant, customer, or participant pool to increase diversity or redress historical discrimination; or
        (2) an act or omission by or on behalf of a private club or other establishment that is not in fact open to the public, as set forth in the Civil Rights Act of 1964.
4    "Artificial intelligence system" means a machine-based
5system that, for explicit or implicit objectives, infers, from
6the input it receives, how to generate outputs such as
7predictions, content, recommendations, or decisions that can
8influence physical or virtual environments. "Artificial
9intelligence system" includes a generative artificial
10intelligence system. For the purposes of this definition,
11"generative artificial intelligence system" means an automated
12computing system that, when prompted with human prompts,
13descriptions, or queries, can produce outputs that simulate
14human-produced content, including, but not limited to:
15        (1) textual outputs, such as short answers, essays,
16    poetry, or longer compositions or answers;
17        (2) image outputs, such as fine art, photographs,
18    conceptual art, diagrams, and other images;
19        (3) multimedia outputs, such as audio or video in the
20    form of compositions, songs, or short-form or long-form
21    audio or video; and
22        (4) other content that would otherwise be produced by
23    human means
24    "Automated decision tool" means a system or service that
25uses artificial intelligence and has been specifically
26developed and marketed to, or specifically modified to, make,

 

 

SB2203- 3 -LRB104 10978 SPS 21060 b

1or be a controlling factor in making, consequential decisions.
2    "Consequential decision" means a decision or judgment that
3has a legal, material, or similarly significant effect on an
4individual's life relating to the impact of, access to, or the
5cost, terms, or availability of, any of the following:
6        (1) employment, worker management, or self-employment,
7    including, but not limited to, all of the following:
8            (A) pay or promotion;
9            (B) hiring or termination; and
10            (C) automated task allocation;
11        (2) education and vocational training, including, but
12    not limited to, all of the following:
13            (A) assessment, including, but not limited to,
14        detecting student cheating or plagiarism;
15            (B) accreditation;
16            (C) certification;
17            (D) admissions; and
18            (E) financial aid or scholarships;
19        (3) housing or lodging, including rental or short-term
20    housing or lodging;
21        (4) essential utilities, including electricity, heat,
22    water, Internet or telecommunications access, or
23    transportation;
24        (5) family planning, including adoption services or
25    reproductive services, as well as assessments related to
26    child protective services;

 

 

SB2203- 4 -LRB104 10978 SPS 21060 b

1        (6) healthcare or health insurance, including mental
2    health care, dental, or vision;
3        (7) financial services, including a financial service
4    provided by a mortgage company, mortgage broker, or
5    creditor;
6        (8) the criminal justice system, including, but not
7    limited to, all of the following:
8            (A) risk assessments for pretrial hearings;
9            (B) sentencing; and
10            (C) parole;
11        (9) legal services, including private arbitration or
12    mediation;
13        (10) voting; and
14        (11) access to benefits or services or assignment of
15    penalties.
16    "Deployer" means a person, partnership, State or local
17government agency, or corporation that uses an automated
18decision tool to make a consequential decision.
19    "Impact assessment" means a documented risk-based
20evaluation of an automated decision tool that meets the
21criteria of Section 10.
22    "Sex" includes pregnancy, childbirth, and related
23conditions, gender identity, intersex status, and sexual
24orientation.
25    "Significant update" means a new version, new release, or
26other update to an automated decision tool that includes

 

 

SB2203- 5 -LRB104 10978 SPS 21060 b

1changes to its use case, key functionality, or expected
2outcomes.
 
    Section 10. Impact assessment.
    (a) On or before January 1, 2027, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses that includes all of the following:
        (1) a statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
        (2) a description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;
        (3) a summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
        (4) an analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;
        (5) a description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;
        (6) a description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and
        (7) a description of how the automated decision tool has been or will be evaluated for validity or relevance.
    (b) A deployer shall, in addition to the impact assessment required by subsection (a), perform, as soon as feasible, an impact assessment with respect to any significant update.
    (c) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
 
    Section 15. Notification and accommodations.
    (a) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision. A deployer shall provide to a natural person notified under this subsection all of the following:
        (1) a statement of the purpose of the automated decision tool;
        (2) the contact information for the deployer; and
        (3) a plain language description of the automated decision tool that includes a description of any human components and how any automated component is used to inform a consequential decision.
    (b) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation. After a request is made under this subsection, a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.
 
    Section 20. Governance program.
    (a) A deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. The safeguards required by this subsection shall be appropriate to all of the following:
        (1) the use or intended use of the automated decision tool;
        (2) the deployer's role as a deployer;
        (3) the size, complexity, and resources of the deployer;
        (4) the nature, context, and scope of the activities of the deployer in connection with the automated decision tool; and
        (5) the technical feasibility and cost of available tools, assessments, and other means used by a deployer to map, measure, manage, and govern the risks associated with an automated decision tool.
    (b) The governance program required by this Section shall be designed to do all of the following:
        (1) identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool;
        (2) if established by a deployer, provide for the performance of impact assessments as required by Section 10;
        (3) conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this Act;
        (4) maintain for 2 years after completion the results of an impact assessment; and
        (5) evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical standards, and changes in business arrangements or operations of the deployer.
    (c) A deployer shall designate at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this Act. An employee designated under this subsection shall have the authority to assert to the employee's employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this Act. An employer of an employee designated under this subsection shall conduct a prompt and complete assessment of any compliance issue raised by that employee.
    (d) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
 
    Section 25. Public statement of policy. A deployer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:
        (1) the types of automated decision tools currently in use or made available to others by the deployer; and
        (2) how the deployer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.
 
    Section 30. Algorithmic discrimination.
    (a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.
    (b) On and after January 1, 2028, a person may bring a civil action against a deployer for violation of this Section. In an action brought under this subsection, the plaintiff shall have the burden of proof to demonstrate that the deployer's use of the automated decision tool resulted in algorithmic discrimination that caused actual harm to the person bringing the civil action.
    (c) In addition to any other remedy at law, a deployer that violates this Section shall be liable to a prevailing plaintiff for any of the following:
        (1) compensatory damages;
        (2) declaratory relief; and
        (3) reasonable attorney's fees and costs.
 
    Section 35. Submission of impact assessments.
    (a) Within 60 days after completing an impact assessment required by this Act, a deployer shall provide the impact assessment to the Attorney General.
    (b) A deployer who knowingly violates this Section shall be liable for an administrative fine of not more than $10,000 per violation in an administrative enforcement action brought by the Attorney General. Each day on which an automated decision tool is used for which an impact assessment has not been submitted as required under this Section shall give rise to a distinct violation of this Section.
    (c) The Attorney General may share impact assessments with other State entities as appropriate.
 
    Section 40. Enforcement. A violation of this Act constitutes an unlawful practice under the Consumer Fraud and Deceptive Business Practices Act. All remedies, penalties, and authority granted to the Attorney General by the Consumer Fraud and Deceptive Business Practices Act shall be available to him or her for the enforcement of this Act.
 
    Section 95. The Consumer Fraud and Deceptive Business Practices Act is amended by adding Section 2HHHH as follows:
 
    (815 ILCS 505/2HHHH new)
    Sec. 2HHHH. Violations of the Preventing Algorithmic Discrimination Act. A person who violates the Preventing Algorithmic Discrimination Act commits an unlawful practice within the meaning of this Act.