
Can AI Eliminate Bias from the Government Hiring Process?

Shifts in how we think about work in a post-COVID-19 world could create an opening for fairer hiring with the help of asynchronous interviews that use AI to reduce bias in recruiting.

It is not a simple proposition, but the pandemic aftertimes, coupled with the novel use of some not-so-new technology, could provide an interesting test bed for hiring the right person for the right job as public agencies backfill open positions, all with the bonus of mitigating implicit bias against job candidates.

As with most inflection points, the moment has a number of moving parts. Employers, including public agencies, are scrambling to figure out which flavor of hybrid work setup will meet the least resistance from employees while supporting basic operations. The best-case scenario would put public employees closer to the people who need government services: that probably is not the old office, but it probably is not the employees' homes either.

According to a Korn Ferry estimate, up to 40 percent of employees are contemplating a career change prompted at least in part by their pandemic experiences. Put simply, they want something else out of life, and the forced work-from-home (WFH) lockdowns gave many the sense that their work is not tied to a specific geography. (Through legislation and administrative rules, public employees may be more limited than their private-sector or nonprofit counterparts, but cubicles are no longer the only answer for most employees.)

The coincidence of last year’s WFH lockdowns along with a racial reckoning that manifested itself in many parts of the country may bring with it an opportunity to do things differently as government recruits for its next iteration as a hybrid workforce. Many public agencies are working aggressively to refresh policies to make them more welcoming or inclusive of underrepresented groups.

Just before COVID hit, no less an authority than the Harvard Business Review ran a piece called "Your Approach to Hiring Is All Wrong," which detailed employers' fears and employees' frustrations about how the next hire gets made. The discussion included the need to clean up the hiring funnel so that it yields fewer, better-suited candidates who fit both the role and the organization.

HBR quotes the editor of a newsletter who claims that “companies get five to seven pitches every day — almost all of them about hiring — from vendors using data science to address HR issues.”

As the keepers of the GovTech100 inventory of startups, we've had some visibility into this generation of HR technology, which includes tools that support asynchronous interviewing, in which applicants respond to interview questions independently. AI and machine learning can now not only administer the questions but also score the responses. The approach may have considerable merit in making the hiring process fairer and more inclusive.

First, it takes the bias out of the mechanics of panel interviews. Rather than hoping interview panel members can approach each candidate with the same energy and attitude, and ask the same questions in the same order (with the same social prompts), a screening driven by robotic process can deliver a consistent experience. It can then transcribe and score the responses without human partiality. The prospective employer can set time limits for each response and give candidates the option of multiple takes (or not), but all candidates get the same shot.
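The consistency described above comes from building every candidate's session from a single shared script. A minimal sketch, with hypothetical question text, field names and IDs invented for illustration (no actual vendor's API is shown), might look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Question:
    prompt: str
    time_limit_seconds: int
    max_attempts: int

# Every candidate receives the identical script: same questions,
# same order, same time limits, same retry allowance.
SCREENING_SCRIPT = [
    Question("Describe a project you led end to end.", 180, 2),
    Question("How do you prioritize competing deadlines?", 120, 2),
]

def build_session(candidate_id: str) -> dict:
    """Assemble a candidate's session from the shared script, so the
    experience cannot drift between candidates or interview panels."""
    return {
        "candidate": candidate_id,
        "questions": [q.prompt for q in SCREENING_SCRIPT],
        "limits": [q.time_limit_seconds for q in SCREENING_SCRIPT],
    }

session_a = build_session("A-001")
session_b = build_session("B-002")
```

Because the script is immutable and shared, any two candidates' sessions are guaranteed identical except for the candidate ID, which is the property that makes the screening auditable.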

Second, and perhaps more significantly, this technology can also be configured to mask cultural cues, including names, that could disadvantage candidates on the basis of gender or of BIPOC (Black, Indigenous and people of color) or AAPI (Asian American and Pacific Islander) identity.
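At its simplest, this kind of masking is a redaction pass over the interview transcript before anyone scores it. The sketch below is illustrative only, with an invented candidate name; real tools would handle nicknames, pronouns and other cues far more robustly:

```python
import re

def mask_identity_cues(transcript: str, candidate_name: str) -> str:
    """Replace each part of a candidate's name with a neutral token so
    reviewers score the response text without that identity cue."""
    for part in candidate_name.split():
        # Whole-word, case-insensitive match for each name component.
        transcript = re.sub(rf"\b{re.escape(part)}\b", "[CANDIDATE]",
                            transcript, flags=re.IGNORECASE)
    return transcript

masked = mask_identity_cues(
    "Hi, I'm Amara Okafor. Amara here, excited about this role.",
    "Amara Okafor",
)
```

Name redaction is the easy part; the harder work, as the next paragraph notes, is making sure the questions themselves and the scoring models do not reintroduce the bias the masking removed.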

It is incumbent on human resource professionals to carefully develop appropriate questions for this process, and on data scientists to catch biased algorithms; both are vital to giving everyone a fair shot.

At this moment in the country’s national life, it is important for organizations to act in good faith. One way to do that is to make hiring processes less of a black box affair and show our work. That is perhaps nowhere truer than in government. Technology can help.

Ironically, one of the best shots we have at a fairer, more diverse, and inclusive workforce is to take the human out of human resources.
Paul W. Taylor is the Executive Editor at e.Republic and of its flagship titles, Government Technology and Governing.