
New laws and concerns about lawsuits are causing HR professionals to be more cautious about their hiring practices.

At issue in 2023 are three key concerns:

  1. The growing use of artificial intelligence (AI) and machine learning (ML) in screening
  2. Evolving privacy laws that limit access to public records 
  3. Drug testing amid expanded legalization of marijuana

When you consider that 99% of Fortune 500 companies report using software to sift candidates, and that more than half of HR leaders in the U.S. use predictive algorithms as part of the hiring process, employment screening impacts nearly every organization.1,2

These concerns play out in a variety of ways.

Increased Use of AI and HR Tech Tools in Screening

Most employers are well aware of federal laws and regulations for background checks, making sure to comply with Equal Employment Opportunity Commission (EEOC) and Federal Trade Commission (FTC) regulations. However, the Consumer Financial Protection Bureau (CFPB) is increasing its scrutiny of the accuracy of the information gathered by screeners.

In doing background checks, companies may rely on information supplied by others. However, in a recent advisory, the CFPB emphasized its concern about false identity matching, in which applicants are disqualified based on records that incorrectly identify them.

“When background screening companies and their algorithms carelessly assign a false identity to applicants for jobs and housing, they are breaking the law,” said CFPB Director Rohit Chopra. “No one should lose out on a job or an apartment because of sloppy and illegal matching. Error-ridden background screening reports may disproportionately impact communities of color, further undermining an equitable recovery.”3

The CFPB said it is considering a rule change to the Fair Credit Reporting Act (FCRA) to better codify compliance. 

As a growing number of laws and regulations limit access to public records, the information that is gathered in background screening becomes even more important, since other data may not be available.

Concerns About HR Tech

At a public hearing, the EEOC also raised concerns about the rising use of AI in hiring and how it can discriminate against job applicants.

While acknowledging the benefits and potential of AI screening, the commission also pointed to the potential for civil rights violations by such technology. EEOC Chair Charlotte Burrows stressed the importance of educating “employers, workers and other stakeholders on the potential for unlawful bias so that these systems do not become high-tech pathways to discrimination.”4

A study by the Harvard Business School revealed that employers have concerns as well: 88% of those surveyed believe that qualified applicants had been filtered out by screening software.1

Similar concerns exist when employers use HR tech tools as part of the performance review or appraisal process.

A new law in New York City, set to go into effect in April 2023, requires employers to conduct independent audits of any automated tools they use to ensure nondiscriminatory practices.5 Since most tools come from third-party providers, employers will have to take extra steps to arrange an independent audit.
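To make the idea of a bias audit concrete, here is a minimal, hypothetical Python sketch of one metric such audits commonly report: the selection rate for each demographic group and the impact ratio against the highest-selected group. The data, group names, and threshold below are illustrative only; ratios under roughly 0.8 are often flagged under the EEOC's four-fifths rule of thumb, but an actual audit must follow the applicable law and your counsel's guidance.

    # Illustrative bias-audit metric: selection rates and impact ratios per group.
    # All records below are synthetic; group labels are placeholders.
    from collections import Counter

    applicants = [  # (group, advanced_by_automated_tool)
        ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
        ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
    ]

    totals, selected = Counter(), Counter()
    for group, advanced in applicants:
        totals[group] += 1
        if advanced:
            selected[group] += 1

    rates = {g: selected[g] / totals[g] for g in totals}
    highest = max(rates.values())
    impact_ratios = {g: round(rate / highest, 2) for g, rate in rates.items()}

    print(rates)          # selection rate per group, e.g. {'group_a': 0.75, 'group_b': 0.25}
    print(impact_ratios)  # ratios well below 0.8 are a common red flag

Running the same calculation on each cycle of applicant data, rather than once at rollout, is what turns this from a one-time check into the kind of ongoing audit the new rules contemplate.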

AI Modeling Bias

AI is only as good as the data it is trained on. If a model is trained on data that reflects existing biases, it can inadvertently filter out qualified candidates.

Amazon ditched its AI recruiting tool after finding that it showed bias against women.6 The model had been trained on 10 years of resumes and hiring patterns. Because men dominated the tech industry during that time, the AI “learned” that male candidates were preferred.
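A toy example shows how this can happen. The sketch below is not Amazon's system; it is a minimal Python illustration with synthetic data and made-up feature names. A simple model trained on biased historical outcomes, where equally qualified candidates with a gender-correlated resume feature were not hired, learns to penalize that feature.

    # Hypothetical illustration of training-data bias, using synthetic data.
    # The third feature is a gender-correlated proxy (e.g., a resume keyword).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Features: [years_experience, relevant_degree, gender_correlated_keyword]
    X = np.array([
        [5, 1, 0], [6, 1, 0], [4, 1, 0], [7, 0, 0],   # historically hired
        [5, 1, 1], [6, 1, 1], [4, 1, 1], [7, 0, 1],   # equally qualified, not hired
    ])
    y = np.array([1, 1, 1, 1, 0, 0, 0, 0])            # biased historical outcomes

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.coef_)  # the proxy feature receives a strongly negative weight

    # Two otherwise identical candidates are scored very differently:
    print(model.predict_proba([[6, 1, 0], [6, 1, 1]])[:, 1])

Note that nothing in this pipeline is ever told a candidate's gender; the bias rides in on a correlated feature, which is why the audits and guardrails discussed above matter.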

Workers With Disabilities

Such technology can also negatively impact people with disabilities, creating violations of the Americans with Disabilities Act (ADA). The Justice Department and EEOC warned employers that without proper safeguards, AI and algorithms can screen out workers with disabilities from consideration for jobs or promotions.

“Algorithmic tools should not stand as a barrier for people with disabilities seeking access to jobs,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.7

The EEOC makes it clear that employers risk violating federal law if they do not put guardrails in place. “Employers may utilize these tools in an attempt to save time and effort, increase objectivity, or decrease bias. However, the use of these tools may disadvantage job applicants and employees with disabilities,” warned the EEOC.8

Increasing Legislation and Privacy Laws

Employers are also facing new regulations when it comes to privacy and information access. Here are some of the more recent laws that impact how companies do business.

Clean Slate Laws

Several cities and states have passed so-called “clean slate” laws that limit access to an applicant’s criminal history or prevent its use in hiring decisions. Such laws have been passed in 10 states so far:9

  • California
  • Colorado
  • Connecticut
  • Delaware
  • Michigan
  • New Jersey
  • Oklahoma
  • Pennsylvania
  • Utah
  • Virginia

While regulations vary, most provide mechanisms for people to expunge criminal records over time and limit the amount of information that a prospective employer can consider for hiring.

Use of AI Analysis and Biometric Data

Two U.S. Supreme Court cases, Griggs v. Duke Power Company and Albemarle Paper Co. v. Moody, examined the use of standardized tests for applicants and promotions.10 These decisions put the burden on employers to show that such tests are job-related and valid in assessing performance.

According to the American Bar Association, courts may apply the same logic to AI programs and algorithms, forcing employers to show how screening tools relate to specific job requirements and performance.10 For example, AI tools that use facial recognition software to check for eye contact during interviews may create unintended discriminatory practices.

Illinois passed the nation’s first law regarding the use of AI analysis of video interviews that measure facial expressions, word choice, vocal tone, and body language, among other factors.11 The Illinois Artificial Intelligence Video Interview Act regulates the process, mandates consent, and has specific rules about confidentiality, sharing, and destruction.

Illinois had earlier passed the Biometric Information Privacy Act (BIPA), which governs the use of any biometric data, such as fingerprints or facial scans.12 Arkansas, California, Colorado, New York, Oregon, Texas, Virginia, and Washington have since passed similar laws, and more than a dozen states have similar legislation pending.

Limiting Personally Identifiable Information in Public Records

Some states, Michigan and California among them, have also banned the release of date of birth (DOB) information from public records. While these measures are intended to help protect personally identifiable information (PII), they can also make it take longer for employers to conduct background screenings and perform the due diligence needed to match records to the right applicant.
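To illustrate why the missing field matters, here is a small, hypothetical Python sketch with made-up records: a common name alone matches multiple public records, while name plus DOB isolates the correct one.

    # Hypothetical example of identity matching with and without DOB.
    records = [  # made-up public records
        {"name": "Jordan Smith", "dob": "1985-03-02", "record": "felony conviction"},
        {"name": "Jordan Smith", "dob": "1991-11-17", "record": None},
    ]
    applicant = {"name": "Jordan Smith", "dob": "1991-11-17"}

    by_name = [r for r in records if r["name"] == applicant["name"]]
    by_name_and_dob = [r for r in by_name if r["dob"] == applicant["dob"]]

    print(len(by_name))          # 2 candidate records -> risk of a false match
    print(len(by_name_and_dob))  # 1 record -> the applicant's actual history

This is the same failure mode the CFPB is targeting in its advisory on false identity matching: the fewer identifiers a screener can use, the more careful the matching logic has to be.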

Outsourcing Does Not Eliminate the Risk

The ABA also notes that using third-party providers or outsourcing your screening does not eliminate employers’ exposure. Contracting with outside vendors, particularly for the initial vetting or screening of potential applicants, does not exempt employers from liability if the tools being used discriminate against protected groups.

Employers may be held liable for violations of law by recruiting companies.

The ABA recommends employers:

  • Understand the algorithms used and be “as equally diligent in developing and modifying” the inputs used to screen and evaluate job candidates as they would be in evaluating any other hiring procedure.
  • Consider regular auditing of automated tools. Since machine learning models continue to evolve as they take in new data, it is crucial to ensure AI tools do not learn discriminatory practices over time. Companies cannot simply rely on attestations from suppliers at initial implementation.
  • Require in contracts that anyone acting on their behalf, recruiting for, or supplying screening programs comply with all existing employment laws related to screening and hiring. Companies should also require indemnification from vendors when possible.

Changing the Way You Do Employment Screenings

Before making any changes to the way you do employment screenings, you should always check with your employment counsel to ensure you are complying with applicable laws and regulations. Companies have both a legal and moral obligation to avoid discrimination.

You also need to stay on top of emerging legislation and compliance measures and partner with the right companies for professional employer organization (PEO) services.

Resourcing Edge focuses on simplifying HR and payroll administration, ensuring you remain in compliance with employee administration, and empowering companies to focus on success.

Resourcing Edge can help you with the administration of payroll, benefits, compliance, risk management, HR, HR technology, and recruiting. We can help you manage your business more efficiently, saving you time and money. Contact Resourcing Edge today to learn more or get a quote.

CITATIONS

  1. https://www.hbs.edu/managing-the-future-of-work/Documents/research/hiddenworkers09032021.pdf
  2. https://www.mercer.com/content/dam/mercer/attachments/private/global-talent-trends-2020-report.pdf
  3. https://www.consumerfinance.gov/about-us/newsroom/cfpb-takes-action-to-stop-false-identification-by-background-screeners/
  4. https://www.jdsupra.com/legalnews/eeoc-hearing-explores-potential-8819288/
  5. https://news.bloomberglaw.com/daily-labor-report/new-york-city-ai-bias-law-charts-new-territory-for-employers
  6. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
  7. https://www.justice.gov/opa/pr/justice-department-and-eeoc-warn-against-disability-discrimination
  8. https://www.eeoc.gov/laws/guidance/americans-disabilities-act-and-use-software-algorithms-and-artificial-intelligence
  9. https://www.cleanslateinitiative.org/states
  10. https://www.americanbar.org/groups/business_law/publications/blt/2020/10/ai-in-hiring/
  11. https://www.ilga.gov/legislation/fulltext.asp?DocName=&SessionId=108&GA=101&DocTypeId=HB&DocNum=2557&GAID=15&LegID=&SpecSess=&Session=
  12. https://www.bclplaw.com/a/web/320807/BIPA-Tracker-II-603732145.3.pdf

Jami Beckwith
