{"id":191,"date":"2022-06-09T09:11:25","date_gmt":"2022-06-09T13:11:25","guid":{"rendered":"https:\/\/www.apslaw.com\/on-the-job\/?p=191"},"modified":"2025-08-18T16:06:24","modified_gmt":"2025-08-18T20:06:24","slug":"it-may-be-time-to-review-employment-policies-regulations-on-the-horizon-for-use-of-automated-employment-decision-tools","status":"publish","type":"post","link":"https:\/\/www.apslaw.com\/on-the-job\/2022\/06\/09\/it-may-be-time-to-review-employment-policies-regulations-on-the-horizon-for-use-of-automated-employment-decision-tools\/","title":{"rendered":"It May Be Time to Review Employment Policies: Regulations on the Horizon for Use of Automated Employment Decision Tools"},"content":{"rendered":"<p>Employers\u2019 use of artificial intelligence in assessing job applicants and employees has increased rapidly over the last decade. These tools are used in a variety of contexts, such as making hiring decisions, determining promotions, and evaluating employee performance. However, as more employers implement forms of artificial intelligence in their hiring processes, a demand for regulation has also emerged to combat resulting bias.<\/p>\n<p>In December 2021, the Federal Trade Commission (\u201cFTC\u201d) issued an advance notice of proposed rulemaking titled \u201cTrade Regulation Rule on Commercial Surveillance,\u201d which stated that the \u201cCommission is considering initiating a rulemaking under section 18 of the FTC Act to . . . ensure that algorithmic decision-making does not result in unlawful discrimination.\u201d<a href=\"#_ftn1\" name=\"_ftnref1\">[1]<\/a> This followed a statement from FTC Chair Lina M.
Khan in October 2021, in which she stated that the FTC \u201cmust explore using its rulemaking tools to codify baseline [privacy] protections,\u201d reasoning, in part, \u201cthat greater adoption of workplace surveillance technologies and facial recognition tools is expanding data collection in newly invasive and potentially discriminatory ways.\u201d<a href=\"#_ftn2\" name=\"_ftnref2\">[2]<\/a><\/p>\n<p>Last month, both the Equal Employment Opportunity Commission (\u201cEEOC\u201d)<a href=\"#_ftn3\" name=\"_ftnref3\">[3]<\/a> and the Department of Justice (\u201cDOJ\u201d)<a href=\"#_ftn4\" name=\"_ftnref4\">[4]<\/a> issued guidance on the use of artificial intelligence in employment processes to prevent violations of the Americans with Disabilities Act (\u201cADA\u201d). The DOJ warned that \u201c[e]ven where an employer does not mean to discriminate, its use of a hiring technology may still lead to unlawful discrimination.\u201d The EEOC explained that steps an employer may take to avoid discrimination on the basis of race and sex \u201care typically distinct from the steps needed to address the problem of disability bias.\u201d Both the EEOC and DOJ provided suggestions for avoiding ADA violations, including training staff to recognize and process requests for reasonable accommodation quickly, using an accessible test that measures an applicant\u2019s job skills rather than disability, and ensuring that the employer is not unlawfully seeking medical or disability-related information.<\/p>\n<p>Although federal regulation is still in the early phases, some states have already begun the process of implementing restrictions on employers\u2019 use of artificial intelligence or automated decision-making.<\/p>\n<p>For example, in March, the California Fair Employment and Housing Council (\u201cFEHC\u201d) issued \u201cDraft Modifications to Employment Regulations Regarding Automated-Decision Systems.\u201d<a href=\"#_ftn5\" name=\"_ftnref5\">[5]<\/a>
Therein, the FEHC proposes to make it \u201cunlawful for an employer or a covered entity to use qualification standards, employment tests, automated-decision systems, or other selection criteria that screen out or tend to screen out an applicant or employee or a class of applicants or employees on the basis of a characteristic protected by this Act, unless the standards, tests, or other selection criteria, as used by the covered entity, are shown to be job-related for the position in question and are consistent with business necessity.\u201d These changes, which would also impose significant recordkeeping requirements, have not yet been finalized.<\/p>\n<p>Reacting to the increased use of artificial intelligence and automated decision-making, the New York City Council passed a bill to restrict employers\u2019 use of \u201cautomated employment decision tools.\u201d<a href=\"#_ftn6\" name=\"_ftnref6\">[6]<\/a> The law \u2013 which will take effect on <strong><u>January 1, 2023<\/u><\/strong> \u2013 prohibits employers or employment agencies from using automated employment decision tools to screen a candidate for employment or promotion unless: 1) the tool has been the subject of a bias audit within one year of the tool\u2019s use, and 2) a summary of the results of the most recent bias audit (and the distribution date of the tool to which such audit applies) has been made publicly available on the website of the employer or employment agency prior to the use of the tool. Candidates also have the right to request an alternative selection process or accommodation. The law not only includes notice requirements, but also imposes significant monetary penalties for violations.<\/p>\n<p>What, if anything, should employers do?
All employers should: 1) evaluate their computer-assisted employment processes to determine if any tools might be violating the ADA, 2) consider whether any of the DOJ or EEOC suggestions should be adopted, and 3) stay apprised of applicable law changes. New York City employers should determine if they use, or will use, automated employment decision tools. Employers using such technology should: 1) find an independent auditor to conduct the required bias audit of these tools, 2) develop an alternative selection process, and 3) draft the notices that will be required under the law.<\/p>\n<p>As with any new legislation affecting employment policies and regulations, it is important to review longstanding employment policies to ensure compliance with the new law. Readers are encouraged to contact Juliana McKittrick at <a href=\"mailto:jmckittrick@apslaw.com\">jmckittrick@apslaw.com<\/a> or 401.427.6221 for a more detailed discussion or review of any employment policies.<\/p>\n<p>Thanks to Damaris Hernandez, 2021 Summer Associate and Recipient of the Honorable Walter R. Stone Diversity Fellowship, for her significant contributions to this blog post.<\/p>\n<p><a href=\"#_ftnref1\" name=\"_ftn1\">[1]<\/a> <em>Trade Regulation Rule on Commercial Surveillance<\/em>, Office of Information and Regulatory Affairs (Dec. 10, 2021), <a href=\"https:\/\/www.reginfo.gov\/public\/do\/eAgendaViewRule?pubId=202110&amp;RIN=3084-AB69\" target=\"_blank\" rel=\"noopener\">https:\/\/www.reginfo.gov\/public\/do\/eAgendaViewRule?pubId=202110&amp;RIN=3084-AB69<\/a><\/p>\n<p><a href=\"#_ftnref2\" name=\"_ftn2\">[2]<\/a> Statement of Chair Lina M. Khan Regarding the Report to Congress on Privacy and Security, Comm\u2019n File No. P06540, Federal Trade Commission (Oct. 1, 2021),
<a href=\"https:\/\/www.ftc.gov\/system\/files\/documents\/public_statements\/1597024\/statement_of_chair_lina_m_khan_regarding_the_report_to_congress_on_privacy_and_security_-_final.pdf\" target=\"_blank\" rel=\"noopener\">https:\/\/www.ftc.gov\/system\/files\/documents\/public_statements\/1597024\/statement_of_chair_lina_m_khan_regarding_the_report_to_congress_on_privacy_and_security_-_final.pdf<\/a><\/p>\n<p><a href=\"#_ftnref3\" name=\"_ftn3\">[3]<\/a> <em>The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees<\/em>, EEOC (May 12, 2022), <a href=\"https:\/\/www.eeoc.gov\/eeoc-disability-related-resources\/artificial-intelligence-and-ada\" target=\"_blank\" rel=\"noopener\">https:\/\/www.eeoc.gov\/eeoc-disability-related-resources\/artificial-intelligence-and-ada<\/a><\/p>\n<p><a href=\"#_ftnref4\" name=\"_ftn4\">[4]<\/a> <em>Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring<\/em>, DOJ Civil Rights Division (May 12, 2022), <a href=\"https:\/\/beta.ada.gov\/ai-guidance\/\" target=\"_blank\" rel=\"noopener\">https:\/\/beta.ada.gov\/ai-guidance\/<\/a><\/p>\n<p><a href=\"#_ftnref5\" name=\"_ftn5\">[5]<\/a> <em>Draft Modifications to Employment Regulations Regarding Automated-Decision Systems<\/em>, FEHC (March 15, 2022), <a href=\"https:\/\/www.dfeh.ca.gov\/wp-content\/uploads\/sites\/32\/2022\/03\/AttachB-ModtoEmployRegAutomated-DecisionSystems.pdf\" target=\"_blank\" rel=\"noopener\">https:\/\/www.dfeh.ca.gov\/wp-content\/uploads\/sites\/32\/2022\/03\/AttachB-ModtoEmployRegAutomated-DecisionSystems.pdf<\/a><\/p>\n<p><a href=\"#_ftnref6\" name=\"_ftn6\">[6]<\/a> N.Y.C. Admin.
Code \u00a7 20-871 (2022), <a href=\"https:\/\/legistar.council.nyc.gov\/LegislationDetail.aspx?ID=4344524&amp;GUID=B051915D-A9AC-451E-81F8-6596032FA3F9\" target=\"_blank\" rel=\"noopener\">https:\/\/legistar.council.nyc.gov\/LegislationDetail.aspx?ID=4344524&amp;GUID=B051915D-A9AC-451E-81F8-6596032FA3F9<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Employers\u2019 use of artificial intelligence in assessing job applicants and employees has increased rapidly over the last decade. These tools are used in a variety of contexts, such as making hiring decisions, determining promotions, and evaluating employee performance. However, as more employers&#8230;<\/p>\n","protected":false},"author":7,"featured_media":192,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[55,7,6,3],"tags":[56,16,2,9],"class_list":["post-191","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-automated-employment-decision-tools","category-employees","category-employers","category-employment-law","tag-automated-decision-systems","tag-department-of-labor","tag-employees","tag-employers"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/posts\/191","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/users\/7"}],"replies":[{"embeddable":true,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/comments?post=191"}],"version-history":[{"count":0,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/posts\/191\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-js
on\/wp\/v2\/media\/192"}],"wp:attachment":[{"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/media?parent=191"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/categories?post=191"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.apslaw.com\/on-the-job\/wp-json\/wp\/v2\/tags?post=191"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}