News
The growing use of emotion recognition AI is causing alarm among ethicists, who warn that the tech is prone to racial bias, doesn’t account for cultural differences, and is used for mass ...
There’s little scientific basis for emotion recognition technology, so it should be banned from use in decisions that affect people’s lives, says research institute AI Now in its annual report.
This is a big deal for everyone. Companies across the world use emotion recognition systems for hiring, law enforcement agencies use them to profile potential threats, and they’re even being ...
Representatives of a technology company contact city leaders and propose that they be granted access to video recordings captured by those cameras, in order to use emotion recognition AI and generate ...
referring to the use of data for purposes other than those for which it was collected. Even seemingly benign applications of emotion recognition tech could lead to harm, Marda noted.
Facebook recently announced that it has created video de-identification technology that can hide people from facial recognition ... autoencoder with an arbitrary prior.” Classifiers typically ...
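The Facebook item above only names the technique (an autoencoder trained against an imposed prior) without describing it. As a loose illustration of the general encoder-decoder idea behind face de-identification, here is a minimal, hypothetical PyTorch sketch: a reconstruction loss keeps the output close to the input frame while a face-embedding similarity penalty pushes it away from the recognisable identity. The DeIdAutoencoder class, the deid_loss helper, the layer sizes, and the loss weights are all assumptions for illustration, not Facebook's published method.

    # Illustrative only: a toy encoder-decoder ("autoencoder") for face
    # de-identification. Architecture, sizes, and loss weights are assumptions,
    # not the model Facebook described.
    import torch.nn as nn
    import torch.nn.functional as F

    class DeIdAutoencoder(nn.Module):
        def __init__(self):
            super().__init__()
            # Encoder: compress a 3x128x128 face crop into a 256-d latent code.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # -> 64x64
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # -> 32x32
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # -> 16x16
                nn.Flatten(),
                nn.Linear(128 * 16 * 16, 256),
            )
            # Decoder: reconstruct a similar-looking frame from the latent code.
            self.decoder = nn.Sequential(
                nn.Linear(256, 128 * 16 * 16),
                nn.Unflatten(1, (128, 16, 16)),
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    def deid_loss(frame, output, id_embedder, recon_weight=1.0, id_weight=0.1):
        # Keep the output visually close to the input frame...
        recon = F.l1_loss(output, frame)
        # ...while penalising similarity of face-recognition embeddings, so the
        # result drifts away from the recognisable identity. `id_embedder` is a
        # stand-in for any pretrained face-embedding network.
        id_sim = F.cosine_similarity(id_embedder(output), id_embedder(frame)).mean()
        return recon_weight * recon + id_weight * id_sim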
The company behind the system counts Huawei, China Mobile, and PetroChina among its clients, though it is unclear if these companies have purchased the emotion recognition system for use in their ...
The European Union's new Artificial Intelligence Act significantly restricts the use of emotion recognition systems in workplaces, according to the latest guidelines published last week ...
Since at least last year, Microsoft has been reviewing whether emotion recognition systems are rooted ... expression and emotional state across use cases, regions, and demographics," Sarah Bird ...