Emotion-detecting tech should be restricted


A leading research centre has called for new laws to restrict the use of emotion-detecting tech.

The AI Now Institute says the field is “built on markedly shaky foundations”.

Despite this, systems are on sale to help vet job seekers, test criminal suspects for signs of deception, and set insurance prices.

It wants such software to be banned from use in important decisions that affect people’s lives and/or determine their access to opportunities.

The US-based body has found support in the UK from the founder of a company developing its own emotional-response technologies, but he cautioned that any restrictions would need to be nuanced enough not to hamper all work being done in the area.

“It claims to read, if you will, our inner emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way that we walk,” explained the institute's co-founder, Prof Kate Crawford.

“It’s being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to tracking which students seem to be paying attention in class.

“At the same time as these technologies are being rolled out, large numbers of studies are showing that there is… no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks.”

Prof Crawford suggested that part of the problem was that some firms were basing their software on the work of Paul Ekman, a psychologist who proposed in the 1960s that there were only six basic emotions expressed via facial expressions.

But, she added, subsequent studies had demonstrated there was far greater variability, both in terms of the number of emotional states and the way that people expressed them.

“It changes across cultures, across situations, and even across a single day,” she said.
