The Information Commissioner’s Office (ICO) explained that emotional analysis technologies process data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats and skin moisture. Examples include monitoring the physical health of workers through wearable screening tools, or using visual and behavioral methods, including body position, speech, and eye and head movements, to register students for exams.
Emotion analysis relies on the collection, storage and processing of a range of personal data, including subconscious behavioral or emotional responses and, in some cases, special category data. This type of data use is much riskier than that involved in traditional biometric technologies used to verify or identify a person.
The ICO says that, because the underlying algorithms are not sufficiently developed to detect emotional signals, there is a risk of systemic bias, inaccuracy and even discrimination.
The ICO says that, to enable a fairer, level playing field, it will act positively towards those who demonstrate good practices, while investigating and taking action against organizations that attempt to gain an unfair advantage through illegal or irresponsible data collection.
The ICO is also developing guidance on the wider use of biometric technologies. These technologies can include facial, fingerprint and voice recognition, which are already used successfully in industry.
The ICO says its biometric guidelines, due out in spring 2023, will aim to further empower and help businesses, as well as highlight the importance of data security. Biometric data is unique to an individual and is difficult or impossible to alter if lost, stolen or misused, the regulator said.
When developing the guidance, the ICO will hold public dialogues with the Ada Lovelace Institute and the British Youth Council, which will explore public perceptions of biometric technologies and obtain opinions on how biometric data is used.
The ICO says that supporting businesses and organizations at the development stage of biometric products and services embeds a “privacy by design” approach, thereby reducing risk factors and ensuring organizations operate safely and legally. As a result, it has released two new reports to help companies navigate the use of emerging biometric technologies.(1)
For more information on this, please contact Patrick Rennie of Wiggin by phone (+44 20 7612 9612) or email ([email protected]). Wiggin’s website can be accessed at www.wiggin.co.uk.
(1) To read the full ICO press release, which includes examples of current use of biometric technologies and relevant sectors, as well as links to the new reports, click here.