Harmonizing AI with EEO Requirements: OFCCP’s Blueprint for Federal Contractors

Client Alert | 4 min read | 05.13.24

Now more than ever, federal contractors find themselves at the intersection of innovation and regulation, particularly in the realm of Artificial Intelligence (AI).  AI is now incorporated into a broad range of business systems, including those with the potential to inform contractor employment decisions.  For that reason, the Office of Federal Contract Compliance Programs (OFCCP) has issued new guidance entitled “Artificial Intelligence and Equal Employment Opportunity for Federal Contractors” (the “AI Guide”).  OFCCP issued the AI Guide in accordance with President Biden’s Executive Order 14110 (regarding the “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence”), which we reported on here.  The AI Guide answers commonly asked questions about the use of AI in the Equal Employment Opportunity (EEO) context and offers “Promising Practices” that highlight a number of important considerations for federal contractors.  Focusing on contractors’ obligations and attendant risks when using AI to assist in employment-related decisions, the AI Guide also provides recommendations for ensuring compliance with EEO requirements while harnessing the efficiencies of AI.

Risks and Obligations

OFCCP confirms in its AI Guide that compliance evaluations and complaint investigations will include examination of a contractor’s use of AI in employment decisions, including, but not limited to, hiring, promotion, termination, and compensation.  The guidance defines key terms, including “AI” and “automated systems,” consistent with other official sources: it adopts the definition of AI provided under the National Artificial Intelligence Initiative, 15 U.S.C. § 9401(3), and references the examples of automated systems provided in the White House Blueprint for an AI Bill of Rights.

The Q&A section provides examples of the ways in which the use of AI in employment decisions may implicate federal contractors’ EEO obligations.  For example, a contractor’s obligation to offer reasonable accommodations to employees or applicants with disabilities extends to the contractor’s use of automated systems.  Additionally, when a selection procedure involving an automated system has an adverse impact on the members of any race, sex, or ethnic group, federal contractors must validate the system in accordance with the Uniform Guidelines on Employee Selection Procedures, including by articulating the business needs motivating the use of the AI system and the job-relatedness of the selection procedure, conducting independent assessments for bias, and exploring potentially less discriminatory alternative selection procedures.  Notably, contractors must be able to provide information and records about the impact and validity of a selection procedure, and “cannot escape liability for the adverse impact of discriminatory screenings conducted by a third party, such as a staffing agency, HR software provider, or vendor.” Contractors using AI in employment decisions must also ensure compliance with all recordkeeping requirements.  For example, contractors are required to keep records of any resume searches conducted using AI, including the substantive search criteria used.    
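As a purely illustrative aid, the sketch below shows how adverse impact is commonly measured under the Uniform Guidelines’ “four-fifths rule” (29 C.F.R. § 1607.4(D)), under which a group selection rate below 80% of the highest group’s rate is generally treated as evidence of adverse impact.  The calculation, group labels, and counts are hypothetical and are not drawn from the AI Guide; a contractor’s actual validation and impact analysis should follow the Uniform Guidelines and counsel’s advice.

    # Illustrative only: a minimal sketch of the "four-fifths rule" commonly used
    # under the Uniform Guidelines on Employee Selection Procedures to flag
    # potential adverse impact. Group names and counts below are hypothetical.

    def selection_rate(selected: int, applicants: int) -> float:
        """Share of applicants in a group who were selected."""
        return selected / applicants if applicants else 0.0

    def adverse_impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
        """Compare each group's selection rate to the highest group's rate.

        outcomes maps group -> (selected, applicants). A ratio below 0.8 is
        generally treated as evidence of adverse impact under the four-fifths rule.
        """
        rates = {g: selection_rate(s, a) for g, (s, a) in outcomes.items()}
        top = max(rates.values())
        return {g: (r / top if top else 0.0) for g, r in rates.items()}

    if __name__ == "__main__":
        # Hypothetical results from an AI-assisted resume screen.
        results = {"Group A": (48, 120), "Group B": (18, 90)}
        for group, ratio in adverse_impact_ratios(results).items():
            flag = "review for adverse impact" if ratio < 0.8 else "no flag"
            print(f"{group}: impact ratio {ratio:.2f} -> {flag}")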

Federal contractors’ use of AI in employment decisions introduces a complex array of compliance obligations (and attendant risk).  OFCCP’s AI Guide emphasizes the importance of AI systems being transparent, equitable, and devoid of biases that could lead to adverse employment actions based on race, sex, or ethnicity, underscoring the need for contractors to maintain stringent oversight of AI applications in their employment practices.

OFCCP's Recommended Practices for AI Deployment

Where a contractor uses or intends to use an AI system for employment decisions, there is a baseline expectation that relevant contractor employees understand the design, development, intended use, and effects of the AI system and are properly trained on it, and that any vendor-obtained AI is properly vetted.  To aid contractors in navigating the compliance landscape, OFCCP recommends several best practices for AI deployment.  These include applying a standard AI process to all candidates and providing clear notice to applicants and employees about the use of AI in employment decisions, including how the system will contribute to a decision and how their data will be captured and used.  For example, contractors should provide instructions on how an applicant or employee can evaluate, correct, or request deletion of data within the AI system and on how to request a reasonable accommodation.  Additionally, contractors should routinely monitor the system to ensure that it does not cause a disparate or adverse impact and should ensure meaningful human oversight of any decision supported by AI.  These practices and others addressed in OFCCP’s AI Guide aim to mitigate risk and promote an equitable employment environment.

Conclusion

The integration of AI into employment practices presents a unique set of challenges and opportunities for government contractors.  OFCCP’s AI Guide provides a road map for how OFCCP will view AI in the context of compliance evaluations.  As the AI landscape evolves, maintaining a proactive approach to compliance will enable contractors to leverage AI’s benefits effectively while upholding their commitment to equal opportunity and affirmative action.  For additional questions about the use of AI in employment decisions, and federal contractors’ obligations when using AI, please contact Crowell & Moring.

Note: Our lawyers leveraged AI in creating this client alert, including using a transcript summary created by generative AI. As we explore the potential of generative AI in the legal space, it is our intention and our practice to be transparent with our readers and to showcase the results we are achieving using generative AI with publicly available resources. Crowell’s AI group comprises lawyers and professionals across our global offices, including from Crowell & Moring International (CMI), our international public policy entity, with decades of sector-specific experience. We intend to lead by example in our own responsible use of AI, as it pertains to both the risks and benefits. Should you have questions about the use of generative AI in the legal sector or Crowell’s use of AI, please contact [email protected].
