Civil Rights Groups Call On Zoom, A Cloud Video Service Provider, To Abandon Plans To Build Emotion Analysis Software


In an open letter to Zoom, the cloud video conferencing provider, the American Civil Liberties Union (ACLU) said that using artificial intelligence (AI)-based algorithms to monitor the emotions of meeting participants likely violates users' privacy and civil rights. In view of this, civil rights organizations have called on Zoom to abandon its plan to build "emotion analysis software".

The software would reportedly use AI technology to analyze the emotional state of video conference participants.

In the open letter, sent on Wednesday to Zoom founder Eric Yuan, nearly 30 advocacy groups, including the ACLU and digital-rights nonprofits, criticized such technology as unscientific, manipulative, and conducive to discrimination.

The letter argues that while Zoom claims to care about the safety and well-being of its users, this invasive technology, which allegedly violates privacy and human rights, contradicts those claims.

In addition, collecting such "deeply personal data" could make any company deploying it "a target for snooping government authorities and malicious hackers".

Even so, Josh Dulberger, Zoom's head of product, data and AI, argued in an interview with Protocol that although such signals are available, they are not decisive.

Dulberger evidently envisions using the technology, for example by detecting when participants' moods drop, so that sales representatives can better gauge how a video call is going.

But according to civil rights advocates, emotion-tracking software is inherently biased because it assumes that everyone displays the same facial expressions, vocal patterns, and body language.

Worse, if the technology were abused, employees, students, or other Zoom users could be unfairly punished by supervisors or administrators for "wrong emotional expression".

Moreover, for certain ethnic groups and people with disabilities, allowing such hard-coded stereotypes to be deployed across millions of devices could have serious consequences.

The civil rights advocacy groups called on Zoom to commit, by May 20, to not deploying emotion-tracking AI in its products. As of press time, the company had not responded to requests for comment.
