From Artificial Intelligence to Emotional Intelligence

Not long ago, I came across a report about recruitment in China that highlighted the role of AI in handling first-round interviews.


The report explained that for some positions, thousands of CVs were received and the first screening was done by algorithms. That is by now a classic and increasingly common practice; what caught my attention was that the second screening was also done by an AI. How did it work? Concretely, the candidate faces a screen on which a neutral face asks questions, just as a recruiter would during a video interview. The approach was presented as “fair” because it does not depend on the sensitivity and subjectivity of a recruiter: each candidate is assessed in the same way, without the mood, the moment or the person influencing the choice. And all of it ends with an analysis of the candidate’s emotions.


Basically, this is an interesting idea that even makes sense, but it raised questions for me. Economies of scale, fairness, better processes: that is everything we expect from AI. But if we extend the line a little, it could end up recruiting autonomously, even conducting annual reviews and, why not, going further still, managing people. What about the human element in all this? Scaling up recurring, mechanical tasks is now omnipresent: automated mail handling, email triage, and so on. Here, however, we are talking about identifying a person we will work with on a daily basis, with whom a rapport must be built, and where the informal and emotional aspects seem essential to me.

The underlying question that came to my mind was: Is it possible to complement artificial intelligence with emotional intelligence?

But how does it work?

Let’s take the example of recruitment. The first step consists of analyzing the multitude of CVs the company receives. CVs, although sometimes original, are in most cases “stereotyped”: a presentation of skills, experiences and keywords. A recruiter spends about 53 seconds analyzing a CV (1). Clearly, NLP algorithms and keyword searches (among other techniques) can easily and very effectively replace a recruiter digging through a database of profiles for skills and experience one by one. This is the scaling up of a mechanical task.
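To make this concrete, here is a minimal Python sketch of keyword-based CV screening. The skill list, scoring rule and sample CVs are illustrative assumptions for this sketch; real systems rely on richer NLP (entity extraction, embeddings) rather than exact matching.

```python
import re

# Hypothetical job requirements; in practice these come from the job posting.
REQUIRED_SKILLS = {"python", "machine learning", "nlp", "sql"}

def score_cv(cv_text: str) -> float:
    """Return the fraction of required skills found in a CV (naive keyword match)."""
    text = cv_text.lower()
    hits = {s for s in REQUIRED_SKILLS if re.search(r"\b" + re.escape(s) + r"\b", text)}
    return len(hits) / len(REQUIRED_SKILLS)

# Rank a batch of CVs and keep the best-matching candidates.
cvs = {
    "alice": "5 years of Python and NLP experience, strong SQL background.",
    "bob": "Project manager with a focus on agile delivery.",
}
ranked = sorted(cvs.items(), key=lambda kv: score_cv(kv[1]), reverse=True)
print(ranked)
```

Even this naive scoring turns hours of manual reading into an instant ranking, which is exactly the scaling effect described above.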


Once this first batch of candidates has been selected, the next phase is an exchange with those shortlisted to go over their background, their motivation, their understanding of the challenges of the position and their fit with the company’s values, and to test their reactions. This takes between 30 minutes and 1.5 hours (2) (presentation of the company and the position, discussion of the role, etc.). It is generally followed by one or more rounds with a reduced panel of candidates to make the final choice. It is therefore clearly worth trying to streamline this second, “cream of the crop” selection stage if you want to optimize recruitment.

However, there is a whole range of nuances between sorting CVs and capturing emotions. CV sorting uses classic text-processing techniques. Some companies go further and use AI to capture emotions in writing, observing word repetitions, turns of phrase and even the form of the email.
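As an illustration, here is what such surface-level text analysis might look like. The cue list and thresholds are assumptions for the sketch, not any company’s actual method; real systems use trained text classifiers.

```python
import re
from collections import Counter

# Illustrative hesitation markers; the list is an assumption for this sketch.
HESITATION_MARKERS = ("maybe", "perhaps", "i think", "i guess")

def text_emotion_cues(message: str) -> dict:
    """Extract simple stylistic signals sometimes used as emotion proxies."""
    lower = message.lower()
    words = re.findall(r"[a-z']+", lower)
    sentences = [s for s in re.split(r"[.!?]+", message) if s.strip()]
    repeated = [w for w, c in Counter(words).items() if c >= 3 and len(w) > 3]
    return {
        "repeated_words": repeated,           # insistence or agitation
        "exclamations": message.count("!"),   # arousal
        "hesitations": sum(lower.count(m) for m in HESITATION_MARKERS),
        "avg_sentence_length": len(words) / max(1, len(sentences)),
    }

print(text_emotion_cues("I really, really, really want this job! I think it fits."))
```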

But here, it is a question of going further still: understanding and interpreting words is one thing; identifying emotions in real time during an interview is quite another. This is clearly Emotional Intelligence.

This field is developing fast. Society is increasingly focused on feelings and emotions, so we are looking to develop AIs that can decode them.


Just as with NLP analysis, data is needed here. What becomes complex is mixing many types of data coming from different sensors (video, microphone, biometrics):

Voice data, which captures emotions in the voice, notably through tone, pace and hesitations (see the sketch after this list)

Body language, and more particularly micro-expressions on the face
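For the voice channel, a minimal sketch using the open-source librosa library could look like the following; the silence threshold and the choice of features are illustrative assumptions, and the audio file path is hypothetical.

```python
import numpy as np
import librosa

def prosodic_features(path: str) -> dict:
    """Summarize pitch, energy and pauses from a speech recording (illustrative)."""
    y, sr = librosa.load(path, sr=16000)
    # Fundamental frequency (tone): pyin returns NaN on unvoiced frames.
    f0, _, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                            fmax=librosa.note_to_hz("C7"), sr=sr)
    # Frame-level energy (intensity of delivery).
    rms = librosa.feature.rms(y=y)[0]
    # Rough proxy for pauses/hesitations: fraction of low-energy frames.
    silence_ratio = float(np.mean(rms < 0.1 * rms.max()))
    return {
        "pitch_mean_hz": float(np.nanmean(f0)),
        "pitch_var": float(np.nanvar(f0)),  # monotony vs. expressiveness
        "energy_mean": float(rms.mean()),
        "silence_ratio": silence_ratio,
    }

# features = prosodic_features("interview.wav")  # hypothetical file path
```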

To process this data, convolutional neural networks are used, with activation functions such as Leaky ReLU, to infer emotions from an image or a video.
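As a rough sketch of what such a network looks like, here is a minimal PyTorch model, assuming 48×48 grayscale face crops (a common format in public facial-expression datasets) and the seven emotion categories discussed below. The layer sizes are illustrative, not a production architecture.

```python
import torch
import torch.nn as nn

# Minimal CNN for facial emotion classification (7 classes: joy, fear, anger,
# sadness, surprise, disgust, contempt). Sizes are illustrative assumptions.
class EmotionCNN(nn.Module):
    def __init__(self, n_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 48x48 grayscale crop
            nn.LeakyReLU(0.1),   # Leaky ReLU keeps a small gradient below zero
            nn.MaxPool2d(2),     # -> 24x24
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.LeakyReLU(0.1),
            nn.MaxPool2d(2),     # -> 12x12
        )
        self.classifier = nn.Linear(32 * 12 * 12, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One face crop in -> probabilities over the 7 emotion categories out.
model = EmotionCNN()
probs = torch.softmax(model(torch.randn(1, 1, 48, 48)), dim=1)
print(probs)
```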


In the end, what do we get?

Most of the databases currently used to train Emotional Intelligence models are composed of simulated facial expressions, shot in a neutral setting and representing a limited range of emotion categories (joy, fear, anger, sadness, surprise, disgust and contempt).

On such data, the models perform quite well. Faced with more natural, everyday data, however, their performance deteriorates significantly.

This is because emotion detection algorithms assume a direct, one-to-one link between facial expressions and a few basic emotions: joy, fear, anger, sadness, surprise, disgust and contempt.

While the use of speech data helps to fill in some of the gaps, AI still struggles to capture the subtle and complex alchemy of emotions.

AI is still far from matching the human ability to understand emotions, but emotional artificial intelligence is constantly evolving: researchers are developing ever more elaborate and powerful systems on increasingly rich data.


Managers are therefore unlikely to be replaced by Artificial Intelligence any time soon, and recruitment will not be delegated entirely to a machine. It is possible, however, to rely on AI to improve human judgment (and not to replace it).


In addition to the example of recruitment, the applications of this work are numerous:

Customer relations: improving the performance of chatbots by adding video and sound

Marketing: improving the analysis of a communication’s impact on a panel of individuals

Health: helping people with autism to better understand other human beings.

Great playgrounds ahead for our data scientists!