Citing a lack of scientific rigor, Microsoft withdraws Azure's emotion recognition system


Microsoft announced the gradual removal of public access to a series of artificial intelligence-powered facial analysis tools from its Azure platform, including one that claimed to identify a person’s emotion from videos and images.

Such “emotion recognition” tools have drawn criticism from experts. Facial expressions once considered “universal” in fact vary across populations in response to cultural factors, and critics also argue that equating outward displays of emotion with internal feelings has no scientific basis.


Microsoft will withdraw a controversial facial analysis tool that “identifies emotions” through artificial intelligence

Microsoft will limit access to some features of its facial recognition service (dubbed Azure Face) and remove others entirely, in line with the company’s updated Responsible AI Standard, which emphasizes accountability for knowing who uses its services and greater human oversight of where these tools are applied.

Microsoft’s AI ethics policies were launched in 2019, and this specific measure stems from a broader review of how they are applied.

Under the new rules, users will have to apply to use Azure Face for facial identification, for example by telling Microsoft exactly how and where they will deploy their systems.

Along with removing its emotion recognition tool from public access, Microsoft will also remove Azure Face’s ability to identify “attributes such as gender, age, smile, facial hair, hair, and makeup.” Capabilities outside that category with less potential for harm, such as automatic blurring of faces in images and videos, will remain openly available.
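To make the change concrete, here is a minimal sketch of the kind of request these restrictions affect. The endpoint path and attribute names mirror the Azure Face `Detect` REST API's `returnFaceAttributes` parameter; the resource hostname is a placeholder, and the snippet only builds the URL rather than calling the service, since new customers can no longer invoke these attributes without approved access.

```python
from urllib.parse import urlencode

# Placeholder Azure resource endpoint (no real request is made here).
endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"

# Attributes of the kind Microsoft is retiring from public access.
retired_attributes = ["emotion", "gender", "age", "smile",
                      "facialHair", "hair", "makeup"]

params = {
    "returnFaceAttributes": ",".join(retired_attributes),
    "detectionModel": "detection_01",
}
detect_url = f"{endpoint}/face/v1.0/detect?{urlencode(params)}"

# A POST of an image to detect_url would now be refused for new customers
# unless their use case has been approved under Microsoft's access policy.
print(detect_url)
```

Face blurring and similar lower-risk capabilities remain available without this application process.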

Natasha Crampton, Microsoft’s chief responsible AI officer, explained on the company blog how this decision was made. “Experts inside and outside the company have highlighted the lack of scientific consensus on the definition of ‘emotions,’ the challenges in how inferences generalize across use cases, regions, and demographics, and the heightened privacy concerns around this kind of capability,” she wrote.

Although the way the feature was originally framed is what drew objections from the scientific community, the underlying technology remains useful in other contexts. For example, Seeing AI, also from Microsoft, uses these computer vision functions to describe elements of the environment to people with visual impairments.

Sarah Bird, Microsoft group senior product manager for Azure AI, noted that tools like emotion recognition “can be valuable when used for a set of controlled accessibility scenarios.” Although the executive acknowledges that “it is easy to imagine how it could be used to inappropriately impersonate speakers and mislead listeners,” she also emphasizes that this technology “has exciting potential in education, accessibility and entertainment.”

The announced restrictions already apply to new Azure customers, while existing users will have their access revoked on June 30, 2023.

Brian Adam
Professional Blogger, V logger, traveler and explorer of new horizons.