Illustration by Cathryn Virginia

Cost Cutting Algorithms Are Making Your Job Search a Living Hell

More companies are using automated job screening systems to vet candidates, forcing jobseekers to learn new and absurd tricks to have their résumés seen by a human.

Jeffrey Johnson was on one continuous job search, more or less, for 12 years.

After the recession shuttered the textbook publisher where he was putting his bachelor’s degree in history to use, Johnson worked office temp jobs and delivered packages, on an Uber-like gig basis, for Amazon and a medical testing company around his native Baltimore. He went back to school for a master’s in screenwriting and then a law degree. Throughout, he scrolled through sites like Indeed and ZipRecruiter looking for better, more stable work—or just something to tide him over between semesters.

About two years ago, he started getting emails asking him to take online “assessments” for jobs after he applied. Some were tests of basic office skills, like spreadsheet use and typing. Others were for legal knowledge. Some were dippy personality tests. At first, Johnson was excited. “I thought it meant I’d gotten past a gatekeeper of some kind and was in the running,” he said.

Then the tests came quicker and more frequently. One in four jobs had an assessment attached, he estimates. He got emails prompting him to take an online test seconds after he submitted an application, a sure sign no human had reviewed his résumé. Some were repeats of tests he’d already taken.

He found them demeaning. “You’re kind of being a jackass by making me prove, repeatedly, that I can type when I have two writing-heavy advanced degrees,” Johnson said, “and you are not willing to even have someone at your firm look at my résumé to see that.”

Johnson also did phone interviews with an Alexa-like automated system. For one job, he was asked to make a one-sided video “interview” of himself answering a list of company-provided questions into a webcam for hiring managers to view at their convenience. Or maybe an algorithm would scan the video and give him a score based on vocal and facial cues, as more than 100 employers are now doing with software from companies like HireVue.

Until he started as a legal writer for FreeAdvice.com last month, Johnson, 36, said he was at potential employers’ whims. “I can’t imagine I’d move to the next round if I didn’t do what they said,” he told Motherboard.

Companies are increasingly using automated systems to select who gets ahead and who gets eliminated from pools of applicants. For jobseekers, this can mean a series of bizarre, time-consuming tasks demanded by companies who have not shown any meaningful consideration of them.

“Obviously, in our society time is money,” said Ifeoma Ajunwa, an assistant professor of labor and employment law at Cornell University and author of The Quantified Worker. “So if you are asking [job applicants] to spend so much time on an application then you are transferring the labor your HR department would be doing to the applicant, so it becomes an ethical issue.”

Applicant Tracking System (ATS) software has long been used by high-end firms searching for executives and HR managers at large corporations that receive thousands of applications. The systems are increasingly being adopted in various sectors of the labor market. ATS vendor iCIMS claims it revamped Foot Locker’s process for hiring sales associates, and JazzHR brags that it helps a regional Pittsburgh construction company hire all its staff, including interns. (JazzHR and iCIMS both declined to comment for this story.)

Beating the Bots

Maneuvering around algorithmic gatekeepers to reach an actual person with a say in hiring has become a crucial skill, even if the tasks involved feel duplicitous and absurd. ATS software can also enable a company to discriminate, possibly unwittingly, by relying on biased data and culling candidates with certain psychological traits.

Lynne Williams, a Philadelphia-area career advisor, holds a seminar called “Beating the Applicant Tracking System.” Every time, she braces for a wave of anger from the audience. “I can feel their blood pressure rise when I tell them what they are doing wrong,” she said.

Their most important task, she tells crowds of jobseekers, is to parrot keywords from job descriptions. The most basic elimination function of most ATS software is searching résumés and cover letters for keywords. Many systems can’t—or don’t bother to—distinguish synonyms, like “manager” and “supervisor,” so she says to rewrite résumés with each application, mindlessly copying words from the job description. Countless online guides for “beating the bots” recommend the same.
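None of the major vendors publish their matching code, but the basic mechanics are simple enough to sketch. The following Python snippet is purely illustrative (the keywords and the résumé line are invented); it shows how a literal-minded keyword filter can dock a qualified candidate simply for writing “manager” instead of “supervisor”:

import re

# Hypothetical keywords lifted verbatim from a job posting.
REQUIRED_KEYWORDS = {"supervisor", "inventory", "scheduling", "excel"}

def keyword_score(resume_text, keywords):
    """Return the fraction of required keywords found in the résumé text."""
    tokens = set(re.findall(r"[a-z']+", resume_text.lower()))
    return len(keywords & tokens) / len(keywords)

resume = "Retail manager responsible for staff scheduling and Excel-based inventory reports."
# "Manager" is never matched against "supervisor," so this résumé scores 0.75
# and can be culled by a filter that only counts exact keyword hits.
print(keyword_score(resume, REQUIRED_KEYWORDS))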

People find this task frustrating and are indignant over its irrelevance to their fitness for the job, Williams said. Others fume about all the time spent carefully crafting applications that were probably never seen by a human.

Jack Wei, a director of product marketing for the job site SmartRecruiters, said that “the moment a candidate applies [for a posted job], a ‘smart profile’ scrapes résumé info into a digital portfolio by extracting keywords.” The employer then sees an automatically generated score, from 1 to 5, of their apparent fitness for the job. The platform distinguishes synonyms and word variants when making this score, but the employer can search using any narrow phrase or word they choose, Wei said.

According to data from the job site Glassdoor, 250 people apply to the average corporate job. Many ATS vendors sell their products on the suggestion that hiring managers are overwhelmed by applicants. When almost half of Americans work low-wage jobs, a good job of any kind will generate a long line of hopefuls, despite the official government unemployment rate sitting at a 50-year low of 3.5 percent.

ATS technology encourages applicants to find ways to cut in line, said Ajunwa. She has heard stories of people inserting common keywords in small white font on their PDF résumés, visible only to bots, to sneak into the next tier of candidates.

Applicants can also turn to services that help them beat the algorithms, like the website Jobscan, which works keywords from a job listing into a customer’s résumé for $49.95 a month.

Such tricks don’t show relevant job skills, but perhaps vital job-hunting ones, Ajunwa said. “People see that only people who are savvy get jobs,” she said, “and the others get shut out.”

How Often Does the Following Statement Describe You at Work?

The next round of the screening process is often an assessment test. Such tests have been around for decades, but ATS software has made it easy to deliver them automatically to an applicant’s inbox.

Indeed, the world’s most visited job site, has been pushing assessments in recent TV advertising. The company offers employers online tests for basic aptitudes, like attention to detail and memorization and recall; job-specific skills, like bookkeeping and first aid; and more abstract competencies, like critical thinking and problem solving.

Most take up to 30 minutes. They still piss off jobseekers.

“I’ve been finding a lot of online assessments that come with the job applications I file,” wrote a poster on the jobs subreddit. “Every time I see one of those, I immediately cringe. I HATE THEM.”

“They’re usually very long … and most of the questions I just. Can’t. Answer,” they added. “For example, a question such as ‘I prefer to work in [a] team rather than alone’ completely depends on the situation and the kind of job I’m doing.”

Indeed declined an interview but told Motherboard in a statement that “Indeed's free Assessment tool is not a burden to job seekers, it helps job seekers demonstrate their full capabilities to prospective employers,” and that the tests “help job seekers stand out based on their skills instead of their previous titles, employers, or their highest level of formal education.”

Smaller companies also provide assessments, for a fee. Atlanta-based Berke offers both aptitude and personality tests. Neil Morelli, PhD, an organizational psychologist and vice president of product and assessment, said some of the applicant outrage can “come from older assessments that last an hour or two and they can feel clinical.” He added, “These large battery assessments are being replaced by more aesthetically pleasing and enjoyable assessments.”

Morelli admits “enjoyable” is relative in this context, but said a goal of his industry is to produce tests that feel game-like but still convey useful information about a candidate.

The Berke Marketing Free Personality Assessment is a 78-question sample test, where every question is the same: “How often does the following statement describe you at work?” Start it and then comes a flood of phrases and adjectives: “lively,” “disciplined,” “leader,” “angry,” “fearless,” “puts others at ease,” “soft hearted,” etc. The test-taker picks one of four options: “almost always,” “often,” “sometimes,” or “rarely.” Morelli said some of the terms in a test are virtual synonyms to suss out applicants trying to game it. Obviously, one would hesitate to tell a potential employer they are “often” “angry” at work and “rarely” “put others at ease,” even if true.

The assessment tests applicants on psychology’s “big five” personality traits of extroversion, agreeableness, openness, conscientiousness and neuroticism. Some managers consider these traits for even minimum-wage work.
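Neither Berke nor any other vendor publishes its scoring formula, but a generic big-five questionnaire can be scored with little more than averaging. The Python sketch below is hypothetical (the items, their trait assignments and the four-point scale are invented for illustration); items like “angry” are reverse-scored so that admitting to them drags down the agreeableness average:

# Hypothetical test items mapped to a big-five trait; some are reverse-scored.
ITEMS = {
    "lively": ("extroversion", False),
    "puts others at ease": ("extroversion", False),
    "soft hearted": ("agreeableness", False),
    "angry": ("agreeableness", True),  # admitting anger lowers agreeableness
    "disciplined": ("conscientiousness", False),
}
SCALE = {"rarely": 1, "sometimes": 2, "often": 3, "almost always": 4}

def trait_scores(answers):
    """Average each trait's items, flipping the value of reverse-scored ones."""
    totals, counts = {}, {}
    for item, answer in answers.items():
        trait, reverse = ITEMS[item]
        value = SCALE[answer]
        if reverse:
            value = 5 - value  # 1 <-> 4, 2 <-> 3 on a four-point scale
        totals[trait] = totals.get(trait, 0) + value
        counts[trait] = counts.get(trait, 0) + 1
    return {trait: totals[trait] / counts[trait] for trait in totals}

print(trait_scores({
    "lively": "often",
    "puts others at ease": "rarely",
    "soft hearted": "sometimes",
    "angry": "almost always",
    "disciplined": "often",
}))

Comparing answers on near-synonymous items, scored this way, is one way a test can flag the kind of gaming Morelli describes.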

Why Robots Don’t Think Women Can Handle a Job at Amazon

In 2012, Kyle Behm took a break from university to deal with bipolar disorder. He applied online for a low-wage job at a Kroger grocery store near Atlanta. A friend who worked there told Behm he didn’t get hired because of the results of a personality test. His father, an attorney, filed suit against Kroger and five other companies that tasked Behm with big-five personality tests for a low-paying job, alleging they illegally screened for mental illness. Sadly, Behm ended his life last year before the case was adjudicated.

The Americans with Disabilities Act prohibits “employment tests that screen out or tend to screen out an individual with a disability or a class of individuals with disabilities” unless necessary for the job.

Morelli said that Berke’s personality assessment is ADA compliant because it “does not meet the criteria for being a medical examination” and “is not invasive or used to infer mental health.” But asking a candidate if they are “fearless” or “comfortable with others” may produce results indicating a condition like depression or social anxiety disorder.

Some advanced ATS have “learned” bias by incorporating variables that favor people who are already advantaged. Amazon abandoned its development of an AI-based hiring process when the predictive models favored male candidates. The system was trained on résumés submitted to the company over ten years, and because of the prevalence of men in tech jobs, it began to downgrade résumés that included all-women’s colleges or female-indicating phrases like “women’s chess club.”

The makers of more advanced applicant tracking systems are acutely aware of the bias problem, but are not certain of a solution.

Arya is a “recruiting platform” that uses predictive analytics to identify and classify job candidates. Madhu Modugu, the CEO and founder of its parent company, feels assessment tests are a poor indicator of job performance, and claims that Arya’s platform avoids placing a heavy time burden on applicants.

“[Arya] would look at my history and my past, what kind of organizations and what kind of cultures you are exposed to,” said Modugu. The system would then match the candidate against a composite profile of employees “who are the high performers in the culture of the company.” If a company can’t provide that information, Arya offers its own psychometric profile of who does well in the jobs that its customers are looking to fill.
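Arya’s actual model isn’t public, but the matching step Modugu describes, comparing a candidate to a composite of current high performers, can be pictured as simple vector similarity. In this hypothetical Python sketch, the features and every number are invented:

import math

# Invented features describing work history: years of tenure, number of past
# employers, and average past-employer size on a 0-1 scale.
high_performers = [
    [4.0, 2.0, 0.7],
    [6.0, 3.0, 0.8],
    [5.0, 2.0, 0.9],
]
candidate = [5.5, 4.0, 0.6]

# Composite profile: the average of the high performers' feature vectors.
composite = [sum(col) / len(high_performers) for col in zip(*high_performers)]

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# A score near 1.0 means the candidate "looks like" the people already
# succeeding in the role, and unlike anyone the company hasn't hired before.
print(cosine_similarity(candidate, composite))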

The term “culture” can have some problematic connotations when applied to who is suited for a job. Modugu insists Arya measures the work culture of past employers compared to that of the prospective employer.

But he said he is aware of problems like the one that plagued Amazon: If the system is fed data that shows white males have been the “best” employees—because prejudice allowed them into those roles more easily—it will conclude that white males are the “best” candidates. “The AI has interpreted the data correctly,” said Modugu, “but it has generated biased results because the information is biased.”
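The “bias in, bias out” dynamic Modugu describes is easy to reproduce in a toy model. In this illustrative Python sketch the training résumés are invented; because the historical hires skew male, a simple log-odds scorer ends up penalizing the token “women’s,” much as Amazon’s experimental system downgraded female-indicating phrases:

import math
from collections import Counter

# Invented training data: past résumés (as token strings) and hiring outcomes.
# Men were hired far more often, so "women's" co-occurs mostly with rejections.
history = [
    ("java leadership chess club", True),
    ("java women's chess club", False),
    ("python leadership hackathon", True),
    ("python women's coding society", False),
    ("java hackathon leadership", True),
    ("java women's hackathon", True),
]

hired, rejected = Counter(), Counter()
for resume, was_hired in history:
    for token in resume.split():
        (hired if was_hired else rejected)[token] += 1

def token_weight(token):
    """Smoothed log-odds of being hired, given that the token appears."""
    return math.log((hired[token] + 1) / (rejected[token] + 1))

def resume_score(resume):
    return sum(token_weight(token) for token in resume.split())

print(token_weight("women's"))     # negative: the model has "learned" the bias
print(token_weight("leadership"))  # positive: rewarded for co-occurring with past hires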

The solution is not simple, he said. Developers need to work out systems that can better fight bias and HR managers need to take active steps to increase recruitment from non-traditional pools.

On top of the issue of discrimination, the emphasis on data in recruiting can make people feel stuck in a role. Data points come from past experience and neglect factors like ideas, ambition and understanding of an industry.

Nick Thorch once worked in Microsoft’s inside sales division, selling the company’s products to other large businesses. He wanted to transition to product management. After spending time fielding customer complaints about Windows Vista, he felt he had insight into how software should be developed. He applied for thousands of product manager jobs in Silicon Valley.

“The only time a recruiter was interested in me was for another inside sales position, even if my cover letter, résumé and career objective statement strongly supported a range of business roles,” Thorch said. “ATS mentality keeps people pigeonholed in their past, rather than what they feel inspired to do.”

SmartRecruiters’ Wei said that the effect of ATS on narrowing career pathways presents a “good question” for the industry.

“On a technological level, there is only so much you can do,” he said. Recruiters need to identify people who might be viable for a career change. “You lead with people and process first, but with technology alone you can’t have lasting change.”

Asking Some Hard Questions

When it debuted in 2013, HireVue’s AI analysis of video interviews seemed like an endgame for job application automation. A candidate answers questions to a silent webcam and uploads a video. A program then scans their facial features, word choices and vocal indicators to determine—through some murky, trademarked science—if they should advance to the next round. The Utah-based company has raised $93 million in venture capital and counts more than 600 clients, including Goldman Sachs and Hilton.

Last year, artificial intelligence scientists called HireVue’s methods “pseudoscience” and “profoundly disturbing” in a Washington Post article, and the Electronic Privacy Information Center filed a complaint with the Federal Trade Commission alleging it violates jobseekers’ privacy through facial recognition.

Cornell’s Ajunwa said AI analysis of mannerisms presents some of the worst potential for discrimination in the labor market. The process would be a roadblock for people on the autism spectrum and for many from foreign countries, as acceptability would doubtlessly hinge on neurotypical, American mannerisms. “What if your culture says not smiling is respectful?” she asked. “There are many cultures where people do not laugh and smile like Americans do.”

HireVue did not respond to a request for comment. Its website no longer touts AI analysis of videos as a service. But it is one of several companies that gather self-made recordings from job applicants answering interview questions. Applicants upload the videos for a manager to, perhaps, view at their convenience, one more task in the automated application process. VidCruiter and Modern Hire, two other prominent video interview companies, also declined Motherboard’s requests for an interview.

“It’s just your typical interview questions like, ‘What’s a time when you failed at work?’” said Jeffrey Johnson. He recalls a herky-jerky system that would beep to prompt him to stop mid-answer.

He’s not sure if an AI or a person looked at his video. He submitted it on a Friday and was rejected for the job the next Sunday.

VidCruiter’s website describes the appeal of having a limitless library of candidate interviews, on tap. In a video, a “senior recruiter,” likely an actress, complains of “spending eight hours doing phone interviews” and “pulling [her] hair out” over the lousy candidates. She then describes “a system that will allow you to ask an unlimited number of candidates as many questions as you want.”

“I’m doing something else while the system is interviewing my candidates,” she says with a smile.

The message is clear: She’s offloaded much of her work to someone else.

Ajunwa said automated systems will probably continue to pile up between jobseekers and jobs. “I think that’s the way it’s going to advance,” she said. “Companies have come to count on it.” She has called for mandated auditing of algorithmic systems to guard against “bias in, bias out” preferences like the ones that affected Amazon.

Should job applicants rebel? Should they refuse to take online assessments or to upload video faux interviews or engage the next faceless gatekeeper?

She encourages candidates to take a principled stand if they are in a position to do so, such as when they are already employed or have good prospects. Everyone should watch for tests that “echo mental health” screenings or show other signs of bias, she said. But she doesn’t know exactly where to draw the line on refusing to comply with the process.

“That’s a tough question,” she said, “because if you need the job, you need the job.”