Department of Labor Issues New Guidance on the Use of Artificial Intelligence and … – JD Supra


On April 29, 2024, the Department of Labor’s Office of Federal Contract Compliance Programs (OFCCP) released guidance to federal contractors regarding the use of artificial intelligence (AI) in their employment practices. See https://www.dol.gov/agencies/ofccp/ai/ai-eeo-guide. The guidance reminds federal contractors of their existing legal obligations, warns of the potentially harmful effects of AI on employment decisions if used improperly, and identifies best practices. Arriving well ahead of the deadline set by Executive Order 14110, the guidance puts contractors on notice of their responsibilities when using AI in their employment decisions.

On October 30, 2023, President Joe Biden issued Executive Order (EO) 14110, Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. 88 Fed. Reg. 75191 (Nov. 1, 2023). Recognizing that AI “holds…both promise and peril,” the EO establishes a policy framework for a coordinated federal governmentwide approach to developing and using AI that is responsible, safe, and secure. Section 7.3 directed the secretary of labor to “publish guidance for Federal contractors regarding nondiscrimination in hiring involving AI and other technology-based hiring systems” no later than October 30, 2024. 88 Fed. Reg. at 75213.

The OFCCP guidance focuses primarily on the use of AI in equal employment opportunity (EEO) activities and applies to both federal prime contractors and subcontractors (collectively, federal contractors). The guidance consists of two parts: (1) common questions asked about the use of AI in EEO activities and (2) best practices federal contractors should consider when using AI in EEO activities.

Echoing sentiments expressed in EO 14110, the OFCCP guidance notes that while AI has the potential to increase efficiency in employment decision-making, it can also perpetuate unlawful bias and thereby automate unlawful discrimination. The guidance reminds covered federal contractors that they are prohibited by law from discriminating in employment and are required to take affirmative action to ensure employees and applicants are treated without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. See, e.g., FAR 52.222-25, -26, -35, and -36. These EEO obligations extend to a contractor’s use of automated systems when making employment decisions. The guidance defines “automated systems” broadly to mean software and algorithmic processes, including AI, that are used to automate workflows and help people complete tasks or make decisions (an algorithm being a set of instructions a computer can follow to accomplish some end). Using automated systems, including AI, does not excuse a federal contractor from its federal EEO and nondiscrimination obligations in the workplace. Recognizing that “AI has the potential to embed bias and discrimination into a range of employment decision-making processes,” the OFCCP guidance advises federal contractors to ensure AI systems are designed and implemented properly to prevent and mitigate workplace inequalities that may violate an employee’s or a job candidate’s civil rights.

I know I’ve made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal.

–HAL 9000, 2001: A Space Odyssey

The OFCCP guidance clarifies that it will investigate a federal contractor’s “use of AI during compliance evaluations and complaint investigations” to ensure a federal contractor complies with its EEO and nondiscrimination obligations. To mitigate the potentially adverse effect AI systems have on a federal contractor’s employment decision-making process, federal contractors must validate an AI system using a strategy that meets “applicable OFCCP-enforced nondiscrimination laws and the Uniform Guidelines on Employee Selection Procedures (UGESP).” In addition, federal contractors must take the following actions when using AI in employment decision-making processes:

  • Understand and clearly articulate the business needs that motivate using the AI system.
  • Analyze the job-relatedness of the selection procedure.
  • Obtain results of any assessment of system bias, debiasing efforts, and/or any study of system fairness.
  • Conduct routine independent assessments for bias and/or inequitable results.
  • Explore potentially less-discriminatory alternative selection procedures.

Set forth at 29 CFR Part 1607, the UGESP consists of a set of guidelines “designed to assist employers…to comply with requirements of Federal law prohibiting employment practices which discriminate on grounds of race, color, religion, sex, and national origin…[and] provide a framework for determining the proper use of tests and other selection procedures.” 29 C.F.R. § 1607.1(B). The regulations further provide that the validity of an employment selection procedure must be demonstrated by evidence and “[u]nder no circumstances will the general reputation of a test or other selection procedures, its author or its publisher, or casual reports of its validity be accepted in lieu of evidence of validity.” Id. at § 1607.9(A). There are three primary approaches to validating an employer’s employment selection process:

  1. Content validation: Validation is demonstrated by data showing that the content of a selection procedure is representative of important aspects of job performance. This approach evaluates whether the particular questions in a test correlate to the test’s purpose.
  2. Criterion-related validation: Validation is demonstrated by empirical data showing that the selection procedure is predictive of or significantly correlated with important elements of work behavior. This approach evaluates how predictive the test is of successful job performance.
  3. Construct validation: Validation is demonstrated by data showing that the selection procedure measures the degree to which candidates have identifiable characteristics that have been determined to be important for successful job performance. This approach examines whether the employment selection procedure measures what it claims to test (e.g., a programming code test for a programming position).

Id. at § 1607.16(D)-(F). Notably, the OFCCP has recognized since at least July 23, 2019, that AI systems can have a disparate impact on employment decision-making, noting that “[i]f OFCCP discovers that a contractor’s use of an AI-based selection procedure is having an adverse impact at a contractor’s establishment, the contractor will be required to validate the selection procedure using an appropriate validation strategy.” See https://www.dol.gov/agencies/ofccp/faqs/employee-selection-procedures.

In addition to stating what covered federal contractors must do to ensure a contractor-utilized AI system is properly validated under applicable OFCCP-enforced nondiscrimination laws and the UGESP, the OFCCP guidance provides a list of “best practices” federal contractors should incorporate into their AI-enabled employment decision-making processes. These best practices fall into the following categories:

  1. Promising practices for the development and use of AI in the EEO context: Includes having a sufficient understanding of the design, development, intended use, and effects of any AI system federal contractors use in their employment practices.
  2. Promising practices for providing notice that the federal contractor is using AI: Recommendations include providing notice to employees and applicants about the use of AI in the hiring/promotion/termination process, ensuring their privacy is safeguarded, and making the employment decision, including how the AI system contributed to it, transparent.
  3. Promising practices for implementing and monitoring the use of AI: This includes implementing a standardized system to ensure all applicants go through the same process and routinely monitoring whether the use of the AI system is causing a disparate or adverse impact.
  4. Promising practices on obtaining a vendor-created AI system: Recommendations include ensuring vendor-created AI systems obtained by federal contractors have provisions requiring vendors to maintain records consistent with all OFCCP-enforced regulatory requirements and provide OFCCP with access to such records during a compliance evaluation. Federal contractors should also verify differences between the AI system as developed and validated by the vendor versus the AI system in operational use by the federal contractor.
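The routine adverse-impact monitoring described above is often operationalized using the UGESP “four-fifths rule” (29 C.F.R. § 1607.4(D)), under which a selection rate for any group that is less than 80 percent of the rate for the group with the highest rate is generally regarded as evidence of adverse impact. A minimal sketch of that check follows; the group names and applicant counts are hypothetical, and this is an illustration of the rule rather than any tool the guidance prescribes:

```python
# Four-fifths (80%) rule from the UGESP, 29 C.F.R. § 1607.4(D):
# a group's selection rate below 80% of the highest group's rate
# is generally treated as evidence of adverse impact.

def selection_rates(outcomes):
    """outcomes maps group name -> (selected, total applicants)."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Flag each group whose rate falls below threshold * highest rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (rate / top) < threshold for g, rate in rates.items()}

# Hypothetical applicant data from an AI-screened requisition.
data = {
    "group_a": (48, 100),  # 48% selection rate (highest)
    "group_b": (30, 100),  # 30% rate -> 30/48 = 62.5% of the top rate
}
flags = adverse_impact_flags(data)
# group_b is flagged because 62.5% falls below the 80% threshold.
```

In practice, a statistically rigorous disparate-impact analysis goes well beyond this ratio test (e.g., significance testing on small samples), which is one reason the guidance recommends routine, independent assessments.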
