Artificial Intelligence Executive Order: WHD and OFCCP Guidance Issued

  • DOL’s Wage and Hour Division and Office of Federal Contract Compliance Programs released guidance documents addressing the use of artificial intelligence (AI) in the workplace.
  • The documents address employers’ and federal contractors’ potential legal and business risks associated with AI, and provide recommended practices to help avoid them.

On April 29, 2024, the White House released a statement entitled, “Biden-Harris Administration Announces Key AI Actions 180 Days Following President Biden’s Landmark Executive Order.” A few hours later, the U.S. Department of Labor’s (DOL) Wage and Hour Division (WHD) and Office of Federal Contract Compliance Programs (OFCCP) released guidance documents about the use of artificial intelligence (AI) in the workplace. Both guidance documents follow President Biden’s Executive Order on the “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” (“AI Executive Order”), issued on October 30, 2023. Specifically, the AI Executive Order directed these agencies within DOL to develop best practices for employers, agencies, and federal contractors.

WHD Guidance. The AI Executive Order directed DOL to “issue guidance to make clear that employers that deploy AI to monitor or augment employees’ work must continue to comply with protections that ensure that workers are compensated for their hours worked, as defined under the Fair Labor Standards Act of 1938, 29 U.S.C. 201 et seq., and other legal requirements.” WHD’s AI guidance was issued as a Field Assistance Bulletin (FAB). According to WHD, FABs aim to “provide [WHD] investigators and staff with guidance on enforcement positions and clarification of policies or changes in the policy of WHD.”1

The thrust of the FAB is that employers remain responsible for ensuring employees are properly paid when AI tools are used for scheduling, timekeeping, employee tracking, or calculating wages owed. The FAB identifies recommended practices, such as maintaining proper human oversight so that, when AI tools or systems are used to track or monitor work, break, and waiting times, employees are paid at least the applicable minimum wage and their regular rate and overtime premium are accurately calculated and paid.
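
For context, below is a minimal sketch of the kind of regular-rate and overtime arithmetic that human reviewers would need to be able to verify when an AI-driven timekeeping or payroll tool calculates wages. The single-workweek, single-rate scenario, the function name, and the figures are hypothetical; the FAB does not prescribe any particular calculation or code.

```python
# Hypothetical example of the FLSA regular-rate and overtime math that human
# oversight of AI timekeeping/payroll tools should be able to verify.
# Assumes a single workweek, one hourly rate, and a nondiscretionary bonus.

def weekly_pay(hours_worked: float, hourly_rate: float,
               nondiscretionary_bonus: float = 0.0) -> dict[str, float]:
    straight_time = hours_worked * hourly_rate
    total_remuneration = straight_time + nondiscretionary_bonus
    # Regular rate = total remuneration for the week / total hours worked.
    regular_rate = total_remuneration / hours_worked
    overtime_hours = max(0.0, hours_worked - 40.0)
    # Straight time already pays 1.0x for all hours, so the overtime premium is an extra 0.5x.
    overtime_premium = 0.5 * regular_rate * overtime_hours
    return {
        "regular_rate": round(regular_rate, 2),
        "overtime_premium": round(overtime_premium, 2),
        "total_pay": round(total_remuneration + overtime_premium, 2),
    }

if __name__ == "__main__":
    # Hypothetical: 45 hours at $20/hour plus a $90 nondiscretionary bonus.
    print(weekly_pay(45, 20.0, 90.0))
    # regular rate = (45*20 + 90) / 45 = 22.00; premium = 0.5 * 22 * 5 = 55.00; total = 1045.00
```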

In addition to FLSA risks, the FAB also addresses risks that might arise under the Family and Medical Leave Act (FMLA) if employers use AI for processing leave requests. The FAB briefly explains that the use of AI tools to administer FMLA leave can create potential risks of violating the FMLA’s certification requirements for determining whether leave is FMLA-qualifying. Moreover, the FAB notes that AI tools used “to track leave use may not be used to target FMLA leave users for retaliation or discourage the use of such leave.”

The FAB also addresses certain risks that might arise under the Providing Urgent Maternal Protections for Nursing Mothers Act (PUMP Act), which requires covered employers to provide nursing employees with reasonable break time and space to express breast milk while at work. The FAB states that AI tools or systems that restrict or penalize employees for taking pump breaks (e.g., by limiting break length or frequency, docking productivity scores, or requiring employees to make up the time) would violate the FLSA.

In addition, the FAB addresses the Employee Polygraph Protection Act, which generally prohibits employers from using polygraph tests for applicants. The guidance explains that AI tools such as eye-tracking and voice-monitoring software, which are intended to gauge “truthfulness,” might qualify as polygraph tests. In that case, they would be regulated in the same manner as more traditional polygraph tests.

Finally, the FAB recognizes that, while AI tools may commonly be used to manage many aspects of employees’ work, the use of such tools to surveil the workforce for protected activity and to take adverse action could violate anti-retaliation laws.

The FAB concludes by acknowledging that “[w]hen used responsibly, AI has the potential to help improve compliance with the law.” WHD reinforces that “employers must ensure the responsible use of AI in order to continue to comply with the laws WHD enforces.”

The FAB has several notable shortcomings. First, it was not subject to public notice-and-comment. Indeed, the FAB fails to cite any resources WHD relied upon in making its assertions about how employers are using AI.

Second, it does not address other areas of FLSA concerns in the age of AI. Perhaps most notably, some have argued that the use of AI and algorithmic technologies may fundamentally alter workers’ primary duties, thus impacting FLSA exemption status.2

Third, WHD ignores the employer-vendor relationship. Employers usually engage third-party software vendors that develop and sell the AI-powered tools, which are then used to perform a wide variety of employment tasks. The FAB ignores this reality, which reinforces why notice-and-comment is so important for any actions on AI.

Fourth, the FAB reflects the Biden administration’s “whole of government” approach to promoting a pro-union agenda across the federal government. This approach has relied on executive orders, interagency task forces, councils, interagency agreements, individual agency actions such as rulemaking and enforcement strategies, attempts to influence Congress, and a variety of other means. The FAB notes, without citation or support, that “employers have reportedly created systems to predict the likelihood that particular locations will unionize based on employee surveys and data analytics.” The FAB further states in a footnote that “[t]he use of electronic monitoring or AI systems to identify organizing activity may raise compliance challenges under the National Labor Relations Act” and cites a memorandum issued by the general counsel of the National Labor Relations Board. WHD does not enforce the National Labor Relations Act, however, nor has it entered into an interagency agreement on AI with the National Labor Relations Board.

OFCCP Guidance. President Biden’s AI Executive Order also directed DOL to “publish guidance for Federal contractors regarding nondiscrimination in hiring involving AI and other technology-based hiring systems.” In line with this mandate, OFCCP issued guidance addressing AI in the federal anti-discrimination context vis-à-vis obligations enforced by OFCCP. As an initial matter, OFCCP’s guidance primarily addresses federal contractors’ use of predictive AI, as opposed to generative AI, a type of AI that can create new content. Indeed, AI is defined in OFCCP’s guidance, in relevant part, as “a machine-based system that can … make predictions, recommendations, or decisions influencing real or virtual environments.”

While acknowledging the benefits that may be conferred in terms of efficiency and productivity when leveraging AI systems in employment decisions, OFCCP emphasizes that federal contractors’ EEO obligations not to discriminate in employment extend to their use of AI in employment decisions. Federal contractors’ compliance obligations related to AI include:

  • Maintaining and ensuring confidentiality of records consistent with all OFCCP regulatory requirements;
  • Cooperating with OFCCP by providing information on their AI systems;
  • Reasonably accommodating the known disabilities of otherwise qualified applicants or employees unless doing so would impose an undue hardship;
  • When a selection procedure, including one utilizing AI, results in an adverse impact3 on a protected group, validating the system in accordance with applicable OFCCP-enforced non-discrimination laws and the Uniform Guidelines on Employee Selection Procedures (UGESP) (see the illustrative sketch following this list); and
  • Remaining responsible for the use of third-party products and services, including AI screening tools; contractors may not delegate their non-discrimination and affirmative action obligations.
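
By way of illustration only, a common screening heuristic under the UGESP is the “four-fifths rule”: a selection rate for any group that is less than four-fifths (80%) of the rate for the group with the highest selection rate is generally regarded as evidence of adverse impact. The minimal Python sketch below shows what such a check might look like; the group names and counts are hypothetical, and neither the FAB nor OFCCP’s guidance prescribes any particular calculation or code.

```python
# Minimal sketch of an adverse-impact screen using the UGESP "four-fifths rule."
# Group names and counts below are hypothetical.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Compute the selection rate (selected / applicants) for each group."""
    return {group: selected / applicants
            for group, (selected, applicants) in outcomes.items()}

def four_fifths_check(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    """Flag groups whose selection rate falls below 80% of the highest group's rate."""
    rates = selection_rates(outcomes)
    highest = max(rates.values())
    return {group: (rate / highest) < 0.8 for group, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical screening results: group -> (selected, total applicants)
    outcomes = {"Group A": (48, 100), "Group B": (30, 100)}
    print(selection_rates(outcomes))    # {'Group A': 0.48, 'Group B': 0.3}
    print(four_fifths_check(outcomes))  # {'Group A': False, 'Group B': True}
```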

OFCCP emphasizes that because improperly designed or implemented AI may perpetuate bias and discrimination in the workplace, federal contractors using these systems in employment decisions must be aware of the risk of infringing on workers’ civil rights.

OFCCP confirms that it will investigate federal contractors’ use of AI during compliance evaluations and complaint investigations to ensure compliance with nondiscrimination obligations.

Finally, OFCCP’s guidance stresses that, whenever AI systems are used in the employment lifecycle, federal contractors should possess a baseline understanding of the system’s design, development, intended use, and consequences. To that end, OFCCP outlines what it considers best practices pertaining to disclosing AI use to applicants and employees; using the AI system; vetting third-party vendor AI systems; and prioritizing accessibility and disability inclusion.

Many of OFCCP’s recommendations reflect recent legislative efforts at regulating AI. For instance, the guidance document recommends that contractors regularly assess their AI tools for potential bias and maintain records of those assessments. A similar requirement is already law in New York City and has appeared in state legislative proposals across the country.

U.S. Equal Employment Opportunity Commission. The White House’s announcement claims that the administration has “[r]eleased resources for job seekers, workers, and tech vendors and creators on how AI use could violate employment discrimination laws” since the AI Executive Order was issued in October. But the EEOC has not issued any AI guidance in about a year, and it has not released any new resources in the 180 days since the AI Executive Order was issued. It is also worth noting that none of the EEOC’s guidance documents have been voted on by the full Commission or subject to notice-and-comment.

Conclusion

Ultimately, the White House’s announcement and DOL’s guidance documents show how the administration is attempting to regulate AI without new legislation from Congress. More AI guidance is expected in the future, including a DOL report on how the government can support workers displaced by AI. The AI Executive Order directed DOL to issue that report by the end of April, but DOL has not yet done so.

The rapidly evolving regulatory landscape requires that employers and their compliance counsel remain especially attentive to current and developing legal authority regarding the use of AI in the workplace. Relatedly, employers and federal contractors should ensure that their AI-based algorithms are compliant with all federal and state laws and regulations. Finally, employers and federal contractors should examine ways to minimize the potential legal and business risks associated with AI such as implementing an AI usage policy and establishing internal practices.

Footnotes

1 DOL WHD, Field Assistance Bulletins, https://www.dol.gov/agencies/whd/field-assistance-bulletins#:~:text=Field%20Assistance%20Bulletins%20provide%20Wage,various%20laws%20enforced%20by%20WHD (emphasis added).

2 See Bradford J. Kelley, Wage Against the Machine: Artificial Intelligence and the Fair Labor Standards Act, 34 Stan. L. & Pol’y Rev. 261 (2023); Bradford Kelley & Stephen Malone, AI In Accounting Raises OT Exemption Questions, Law360 (Mar. 28, 2024), https://www.law360.com/employment-authority/articles/1818259/ai-in-accounting-raises-ot-exemption-questions.

3 The OFCCP guidance provides that an adverse impact results when the “procedure(s) an employer uses to make employment decisions such as hiring, promotion, and termination have a disproportionately large negative effect on a basis that is prohibited by law.”

Written by:

Littler