The legal landscape for AI use in candidate recruiting and screening is changing
According to a University of Southern California study, 55% of businesses are investing in automated recruiting measures that use artificial intelligence (AI). AI tools in employee recruiting and screening offer employers a range of potential benefits, including faster and more efficient identification of qualified candidates, a reduced workload for HR teams, and lower operational costs. However, while AI can shorten candidate screening times and increase efficiency, many observers have raised concerns that it can introduce bias and reduce transparency in the hiring process.
Numerous individuals have already filed lawsuits alleging that they were summarily denied consideration for positions because AI screening tools discriminated against them based on race, gender, age, or disability. In addition, many states have introduced legislation imposing various requirements on employers that use AI for employment-related decisions.
Recent litigation
In Mobley v. Workday, Inc., an individual filed a putative class action in a California federal court against Workday, a software company that provides algorithm-based applicant-screening tools to thousands of companies, including several Fortune 500 firms. The plaintiff alleged that he was rejected for 80 to 100 jobs by companies using Workday's screening tools because of bias embedded in the underlying algorithm.