DRNO - Daily Research News
News Article no. 34091
Published October 26 2022

ICO Issues Warning on Emotional Analysis Tech

In the UK, the Information Commissioner's Office (ICO) is warning organisations to assess the public risks of emotion analysis technologies before implementing these systems. The regulator says firms which do not 'act responsibly' will be investigated.

The ICO's concerns focus on technologies which process biometric data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture testing. These technologies are used for collecting, storing and processing a range of personal data, including subconscious behavioural or emotional responses.

In a statement, the Office described algorithms as not being sufficiently developed to detect emotional cues, meaning there is a risk of 'systemic bias, inaccuracy and even discrimination'. Organisations that 'do not act responsibly, pose risks to vulnerable people, or fail to meet ICO expectations' will be investigated.

The ICO is currently developing guidance which explores public perceptions of biometric technologies and how biometric data is used, and this will be published next spring. Deputy Commissioner Stephen Bonner (pictured) comments: 'Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area'.

Web site: www.ico.org.uk.

www.mrweb.com/drno - Daily Research News Online is part of www.mrweb.com

Please email drnpq@mrweb.com with any questions.

© MrWeb Ltd