LinkedIn’s job-matching AI was biased. The company’s solution? More AI.


More and more companies are using AI to recruit and hire new employees, and AI can factor into almost any stage of the hiring process. Covid-19 spurred fresh demand for these technologies. Both Curious Thing and HireVue, companies specializing in AI-powered interviews, reported a surge in business during the pandemic.

Most job hunts, however, start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their résumés, browse job postings, and apply for openings.

The goal of these websites is to match qualified candidates with available positions. To organize all these openings and candidates, many platforms employ AI-powered recommendation algorithms. These algorithms, sometimes referred to as matching engines, process information from both the job seeker and the employer to curate a list of recommendations for each.

“The anecdote you always hear is that a recruiter spends six seconds looking at your résumé, right?” says Derek Kan, vice president of product management at Monster. “When you look at the recommendation engine we’ve built, you can reduce that time down to milliseconds.”

Most matching engines are optimized to generate applications, says John Jersin, former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information the user provides directly to the platform; data assigned to the user based on others with similar skill sets, experiences, and interests; and behavioral data, such as how often a user responds to messages or interacts with job postings.
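A matching engine that weighs these three categories can be sketched as a single relevance score per candidate. The field names, weights, and scoring formula below are illustrative assumptions for this sketch, not LinkedIn’s or Monster’s actual model:

```python
from dataclasses import dataclass, field

@dataclass
class JobSeeker:
    # Category 1: information the user provides directly to the platform
    stated_skills: set[str]
    # Category 2: skills inferred from similar users (assumed field)
    inferred_skills: set[str] = field(default_factory=set)
    # Category 3: behavioral data, e.g. reply rate to recruiter messages
    reply_rate: float = 0.0

def match_score(seeker: JobSeeker, required_skills: set[str]) -> float:
    """Combine the three data categories into one relevance score.

    The weights (1.0 / 0.5 / 0.25) are arbitrary illustrative choices:
    directly stated skills count more than inferred ones, and engagement
    nudges the score up.
    """
    direct = len(seeker.stated_skills & required_skills)
    inferred = len(seeker.inferred_skills & required_skills)
    return 1.0 * direct + 0.5 * inferred + 0.25 * seeker.reply_rate

seeker = JobSeeker(stated_skills={"python", "sql"},
                   inferred_skills={"spark"},
                   reply_rate=0.8)
print(match_score(seeker, {"python", "spark", "sql"}))  # 2.7
```

Ranking every candidate against every opening with a function like this is what lets the engine do in milliseconds what a recruiter does in seconds.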

In LinkedIn’s case, these algorithms exclude a person’s name, age, gender, and race, because including those characteristics can contribute to bias in automated processes. But Jersin’s team found that the service’s algorithms could still detect behavioral patterns exhibited by groups with particular gender identities.

For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to apply only for jobs whose qualifications they meet. The algorithm picks up on this variation in behavior and adjusts its recommendations in a way that inadvertently disadvantages women.

“You might be recommending, for example, more senior jobs to one group of people than another, even if they’re qualified at the same level,” Jersin says. “Those people might not get exposed to the same opportunities. And that’s really the impact we’re talking about here.”

Men also include more skills on their résumés at a lower degree of proficiency than women, and they often engage more aggressively with recruiters on the platform.

To address such issues, Jersin and his team at LinkedIn built a new AI designed to produce more representative results, and deployed it in 2018. It is essentially a separate algorithm that offsets recommendations skewed toward a particular group. The new AI ensures that the recommendation system includes a representative distribution of users across genders before referring the matches curated by the original engine.
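One way such a re-ranking layer can work (a hypothetical sketch, not necessarily LinkedIn’s production algorithm) is to greedily interleave the relevance-sorted candidates so that every prefix of the final list tracks a target gender distribution:

```python
from collections import defaultdict

def rerank(candidates, target_share):
    """Re-rank so each prefix of the list tracks the target distribution.

    candidates: list of (name, group) pairs, already sorted by relevance.
    target_share: dict mapping group -> desired fraction of the list.
    Every candidate's group must be a key of target_share.
    """
    # Per-group queues, each still in relevance order.
    remaining = {g: [c for c in candidates if c[1] == g] for g in target_share}
    picked = defaultdict(int)
    result = []
    for position in range(1, len(candidates) + 1):
        # Pick from the group furthest below its target share so far,
        # among groups that still have candidates left.
        deficit = {g: target_share[g] * position - picked[g]
                   for g in target_share if remaining[g]}
        group = max(deficit, key=deficit.get)
        result.append(remaining[group].pop(0)[0])
        picked[group] += 1
    return result

ranked = [("a", "M"), ("b", "M"), ("c", "M"),
          ("d", "F"), ("e", "F"), ("f", "F")]
print(rerank(ranked, {"M": 0.5, "F": 0.5}))
# ['a', 'd', 'b', 'e', 'c', 'f']
```

Here a relevance-sorted list whose top half is all male comes out interleaved, so even the first few results a recruiter sees already reflect the target distribution, while relative order within each group is preserved.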

Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations but doesn’t correct for bias the way LinkedIn does. Instead, the marketing team focuses on getting users from diverse backgrounds signed up for the service, and the company then relies on employers to report back and tell Monster whether or not it passed along a representative set of candidates.

Irina Novoselsky, CEO of CareerBuilder, says the service is focused on using the data it collects to teach employers how to eliminate bias from their job postings. For example, “when a candidate reads a job description with the word ‘rockstar,’ a materially lower percentage of women apply,” she says.
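That kind of guidance can be automated with a simple wording check. The word list below is a small illustrative sample in the spirit of research on gender-coded job-ad language, not CareerBuilder’s actual data or tooling:

```python
# Words that studies of job-ad language associate with deterring women
# applicants; this set is a tiny illustrative sample.
MASCULINE_CODED = {"rockstar", "ninja", "guru", "aggressive", "dominant"}

def flag_biased_words(job_description: str) -> list[str]:
    """Return the flagged words found in a job description, sorted."""
    # Crude tokenization: lowercase, strip commas/periods, split on spaces.
    words = job_description.lower().replace(",", " ").replace(".", " ").split()
    return sorted(set(words) & MASCULINE_CODED)

print(flag_biased_words(
    "We need a rockstar engineer with an aggressive growth mindset."))
# ['aggressive', 'rockstar']
```

A real tool would use a vetted lexicon and suggest neutral replacements, but even this toy version shows how a platform can surface biased wording to an employer before a posting goes live.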


