The United Kingdom’s Information Commissioner’s Office (ICO) has issued a warning against companies deploying biometrics-based emotional analysis algorithms, coincident with the release of two new reports.

The news was reported Tuesday by The Guardian, which spoke with Stephen Bonner, the ICO’s deputy commissioner. According to Bonner, firms should refrain from these types of technologies due to the “pseudoscientific” nature of the field. They could be fined if they ignore the warning.

“There’s a lot of investment and engagement around biometric attempts to detect emotion,” the deputy commissioner says. “Unfortunately, these technologies don’t seem to be backed by science.”

Bonner adds that while these technologies are harmless if used for entertainment, they should not be used in critical decision-making.

“If you’re using this to make important decisions about people – to decide whether they’re entitled to an opportunity or some kind of benefit, or to select who gets a level of harm or investigation, any of those kinds of mechanisms,” Bonner warns, “we’re going to be paying very close attention.”

According to the ICO, biometrics-based emotional analysis raises data protection concerns and could also breach people’s rights under related laws.

“Emotional AI” is one of four topics that the ICO has identified in a study of the future of biometrics published this week. The study is accompanied by another providing an introduction to the state of biometrics and regulation in the UK.

The regulator has also examined the difficulties of applying data protection law (obtaining individual consent, in particular) when biometric surveillance techniques such as gaze-tracking or fingerprint recognition are used on crowds of dozens, sometimes hundreds, of people at a time.

The ICO is set to release new guidelines on how to use biometric algorithms, including face, fingerprint and voice recognition, in spring of 2023.

The commissioner and others may judge emotional analysis to be “pseudoscientific,” but it is gaining traction in the legal field and elsewhere.

It is also becoming more powerful, according to a recent study by the University of Trento and Eurecat Centre Tecnològic describing a new AI system that can perform unsupervised multimodal emotion recognition.

Source: UK regulator warns firms off AI emotional analysis
