The Problem of Hidden Bias in a Neurodiverse Workplace
Everywhere you look, artificial intelligence (AI) is in the news.
While AI is generally recognized as a powerful tool to improve nearly every aspect of day-to-day living, more people are becoming aware of its many unintended consequences. This is especially true in the workplace, where companies have eagerly adopted AI tools to increase the efficiency and productivity of their workforce.
More and more, HR departments are eager to embrace AI in their screening and hiring processes. On the surface, it sounds like a great idea: save time and money by narrowing a pool of job candidates down to only the most qualified individuals. Finding team members who will fit in with your existing workforce, increasing overall productivity, and minimizing onboarding and training costs makes a lot of sense. Companies can lose tens of thousands of dollars and countless productive hours when they hire the wrong candidate, so it only seems logical to improve the process.
Now that the use of AI in screening and hiring has become practically normal, those unanticipated consequences are starting to become painfully obvious. Hidden biases in AI have led to the exclusion of perfectly capable and highly qualified neurodivergent employees from the workforce.
A while back, Ken Blackwell, who has a wonderful podcast called Insight at Work, invited me onto his show for an interview. We discussed a lot of different topics centered around supporting a neurodiverse workplace, and I invite you to listen to the entire interview by clicking here. One of the subjects we touched on was how the algorithms used in many AI screening tools are actually one of the biggest impediments to finding the best talent for your company.
Ironically, artificial intelligence can discriminate against the most innovative thinkers and eliminate some of the most amazing talent from the job applicant pool. This unfortunate reality is caused by hidden bias that particularly impacts neurodivergent workers.
How AI is Used to Screen and Hire New Employees
Modern HR departments have embraced AI technology to make it easier to filter through potential job candidates and integrate them into the workplace.
In HR departments around the world, AI is most popularly used to:
- Identify recruitment candidates
- Pre-screen job resumes
- Filter candidates through the interview process
- Onboard and train new hires
AI screening tools can review a job application or a candidate’s resume to match key skills with the job requirements of open positions. These tools save time by allowing HR professionals to tackle an enormous pile of resumes efficiently to narrow down the pool of truly qualified candidates for any given position.
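To make the skill-matching idea concrete, here is a minimal, hypothetical sketch of the kind of keyword matching a resume screener might perform. The job requirements and resume text are invented for illustration; commercial tools are far more sophisticated, but the core matching logic is similar, and it shows why candidates who describe their skills in nonstandard language can be filtered out before a human ever reads their resume.

```python
def keyword_match_score(resume_text, required_skills):
    """Return the fraction of required skills found verbatim in the resume."""
    text = resume_text.lower()
    matched = [skill for skill in required_skills if skill.lower() in text]
    return len(matched) / len(required_skills)

# Hypothetical job requirements and resume text, invented for this sketch.
required = ["Python", "SQL", "data analysis"]
resume = "Experienced analyst skilled in Python and SQL reporting."

score = keyword_match_score(resume, required)
# The resume matches "Python" and "SQL" but not the exact phrase
# "data analysis", so a rigid keyword filter scores it 2 out of 3,
# even if the candidate's actual analysis skills are strong.
```

A candidate who writes "analyzed large datasets" instead of the expected phrase "data analysis" loses credit under a literal-match filter, which is one small example of how the phrasing a tool is programmed to prefer becomes a hidden gatekeeper.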
During the interview process, AI can be used to generate reports on a particular candidate's "soft skills" and interpersonal characteristics. Facial recognition software claims to read emotions and body language, while certain types of video games are used to paint a picture of a candidate's personality traits. Without ever interviewing a person one-on-one, HR can supposedly know just how altruistic, trusting, and fair-minded an applicant may be. In fact, AI can let HR go into an in-person interview with a full cognitive and intellectual profile of the person they are considering for a job.
Once a person makes it through the hiring process, AI is increasingly used to train them in their new role. Chatbots and augmented- or virtual-reality simulations are incredibly useful for imparting information about company culture, protocols, and processes.
So what’s the problem? Why should we reconsider the use of AI in hiring and screening processes?
The Hidden Bias of AI in a Neurodiverse Workplace
Unfortunately, as I mentioned in that interview with Ken Blackwell, many of these AI tools unintentionally exclude an entire population of neurodivergent workers from their approved pools of job candidates.
Many proponents of AI cite it as an essential tool for overcoming the inherent bias that many of us carry, whether we know it or not.
In this age of inclusivity, it is impossible to pretend that bias doesn’t exist. It is human nature to be drawn to people who are similar to us. We feel more comfortable with folks who look like us, speak like us, and think like us. This has made it really difficult for folks of different ethnic and socioeconomic backgrounds to break into certain sectors of the workforce.
AI filters out and excludes countless neurodivergent workers who simply don’t fit the traditional molds that these tools are programmed to prefer. Unfortunately, many of the AI options are simply not inclusive enough for a neurodiverse world.
When it comes to neurodivergent workers, some common issues arise when using AI in hiring and screening processes. These include:
- Forced disclosure of a "disability" or neurodivergent condition. Many people in the neurodivergent community spend a lifetime defined by labels that they may come to resent. If they are obligated to ask for accommodations to be considered for a position, they may have to disclose their neurodivergence, which is essentially a violation of their privacy.
- Biased conclusions based on body language and facial expression screeners. Some AI screening tools analyze a job candidate’s body language, facial expressions, tone of voice, and rhythm of speech. These tools do not take into account the natural variations in communication that present themselves in neurodivergent individuals. Eye contact, stuttering, verbal tics, and speech patterns that differ from the “norm” are common traits of neurodivergent individuals. These tendencies may cause a person to be disqualified, even though they are poor indicators of an individual’s capability to perform specific jobs or the skill level they may possess.
- Rigid pre-employment assessments that do not consider the visual, cognitive, or sensory needs of neurodivergent job candidates. This often leads to poor performance on standardized assessment tools, which fail to consider the way a person’s neurodivergence impacts their responses.
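The last point above can be illustrated with a hypothetical sketch of a rigid pass/fail cutoff on a standardized assessment. The candidate names, scores, and threshold are all invented; the point is that a single hard cutoff discards a candidate without asking why the score was low, such as a timed format or sensory barriers in the test itself.

```python
# Invented cutoff for illustration; real assessment tools vary.
CUTOFF = 70

# Hypothetical candidates. Candidate B has strong job skills, but the
# timed, standardized format depressed their score.
candidates = [
    {"name": "A", "score": 85},
    {"name": "B", "score": 68},
    {"name": "C", "score": 72},
]

# A rigid filter keeps only candidates at or above the cutoff.
passed = [c["name"] for c in candidates if c["score"] >= CUTOFF]
# Candidate B is dropped outright; a more inclusive process might
# instead offer an alternative assessment format or a human review.
```

The filter itself is trivially simple, which is exactly the problem: the nuance is lost at the boundary, and a two-point shortfall caused by an inaccessible test format is indistinguishable from a genuine lack of skill.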
Inclusive AI for a Neurodiverse Workforce
Neurodiversity in the workplace is an asset for any company. Creativity and innovation are the inevitable products of a workforce that includes workers who are actively thinking outside the box.
This is why it is counterproductive for companies to actively adopt inclusive policies while simultaneously investing in AI technologies that filter neurodivergent workers from the job pool. So, how can we do things differently? Is there a better way?
Yes. By reconsidering our use of AI in the screening and hiring process, we can build a truly neurodiverse workforce and a company culture that values neurodivergent workers for the unique skills and talents they contribute to the workplace. I am not arguing against AI; on the contrary, I believe it is a powerful tool that can be harnessed to create more inclusive workplaces for neurodivergent workers.
The trick is in learning to recognize how certain AI technology contains potential bias against highly qualified neurodivergent job candidates.
While the hiring process can be streamlined by AI technologies, we must never forget that at the heart of every successful business is a workforce made up of people: individuals with a diversity of needs, learning and communication styles, and unique perspectives that are ultimately the creative force underlying the company's productivity. Hiring and screening protocols need to maintain a human element, one that is sensitive to the talents and gifts offered by the neurodivergent community.
Resources
- AI for Disability Inclusion: Enabling Change with Advanced Technology, a report prepared by Accenture, a global professional services company
- Creating Safe Spaces for Neurodivergence in the Workplace
- Ditch Conformity and Hire Divergent Thinkers
- Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias, a report prepared by Upturn, an organization dedicated to equity and justice in the use of technology
- What Do We Do About the Biases in AI? by James Manyika, Jake Silberg, and Brittany Presten
- Where Automated Job Interviews Fall Short by Zahira Jaser, Dimitra Petrakaki, Rachel Starr, and Ernesto Oyarbide-Magaña
Photo credit: metamorworks / iStockphoto Standard License