China Testing Artificial Intelligence Emotion Detection On Uyghurs

Written by smith-willas | Published 2021/05/29
Tech Story Tags: ai | emotional-intelligence | emotion-recognition | emotion-ai | emotion-ai-discussion | artificial-intelligence | ai-software | artificialintelligence

TLDR Chinese authorities are testing systems that use AI and facial recognition to detect emotional states. Many countries accuse Beijing of genocide against the Uyghur population. The authorities have flooded the predominantly Muslim Xinjiang Uygur Autonomous Region with surveillance cameras. Emotion-detection technology has been on the market since the mid-2010s, but scientists believe it cannot be trusted because a person's emotions cannot be reliably read from facial expressions. Human rights activists fear that abuse of the technology could lead to totalitarian control of society.

Chinese authorities are testing systems that use AI and facial recognition to detect emotional states. The BBC reports this, citing an unnamed developer of the technology. Experts from Boosty Labs, a company focused on smart contract development and blockchain app development, share their thoughts on the implications of this trend. Many countries accuse Beijing of genocide against the Uyghur population.
The Chinese authorities have flooded the predominantly Muslim Xinjiang Uygur Autonomous Region with surveillance cameras. They require people who drive in the area to install government spyware, and local residents hand over DNA samples to the authorities. QR codes placed outside residents' houses let Chinese police officers pull up basic information about the occupants in seconds.
The technology developer, who wished to remain anonymous out of concern for his safety, shared with the BBC evidence of experiments in emotion detection using video surveillance. Machine learning systems are being used on Uyghur prisoners, he revealed, to determine what kind of mood they are in, and they can also serve as a new, more advanced kind of lie detector.
The system registers the prisoners' mood and generates a real-time report flagging anyone judged to be in a negative emotional state. The technology is intended to support a preliminary judgement without any additional, reliable evidence.
Emotion-detection technology has been on the market since the mid-2010s. Its developers argue that if psychologists can read a person's hidden emotions from barely noticeable cues, then computer algorithms fed images from high-definition cameras should be able to do the same.
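To make concrete what such products claim to do, here is a minimal, hypothetical sketch of the usual pipeline: detect a face in each camera frame, crop and normalize it, and pass the crop to a classifier that assigns one of a small set of expression labels. The classifier stub, the label set, and the 48x48 input size are illustrative assumptions, not details of the system described in this article.

```python
# Hypothetical sketch of a face-expression labelling pipeline.
# The classifier is a stub: nothing in the article specifies the real model,
# and the research cited below disputes that such labels are reliable at all.

import cv2
import numpy as np

EMOTION_LABELS = ["neutral", "happy", "sad", "angry", "fearful", "surprised"]

# OpenCV ships a Haar cascade for frontal-face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_crop: np.ndarray) -> str:
    """Placeholder for a trained expression classifier (e.g. a small CNN).

    A real system would load a model trained on labelled face images; here we
    always return 'neutral' so the sketch runs end to end.
    """
    return "neutral"

def analyse_frame(frame: np.ndarray) -> list[str]:
    """Detect faces in a BGR frame and return one emotion label per face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    labels = []
    for (x, y, w, h) in faces:
        crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # common input size
        labels.append(classify_expression(crop))
    return labels

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # any camera index or video file path
    ok, frame = cap.read()
    if ok:
        print(analyse_frame(frame))
    cap.release()
```

Everything contested by the researchers quoted later in this article lives in that single `classify_expression` step: the mapping from a cropped face to an inner emotional state.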
As the Financial Times wrote, emotion-detection technology was one of the main topics of discussion at China's largest video surveillance technology exhibition, which was held from October 21 to 31, 2019 in Shenzhen. The Chinese authorities argued that the new technology would help fight crime.
“The emotion-detection algorithm can quickly identify suspects by analyzing their mental state and prevent illegal actions, including terrorism and smuggling. We have already started using it,” Li Xiaoyu, a public order expert at the Public Security Bureau in Altai City, Xinjiang, told the Financial Times.
The algorithm detects signs of aggression and nervousness in a person's appearance and also estimates their stress level. “We have partnered with various [video surveillance equipment and technology manufacturers] in Xinjiang. Only companies that are seriously involved in artificial intelligence development can work in this area. The market leaders are, of course, Alibaba and Tencent,” Li Xiaoyu said.
Chinese companies are not the only ones trying to deploy artificial intelligence to read human feelings; international tech giants such as Amazon, Microsoft, and Google are working on it as well. However, scientists believe the technology cannot be trusted, because a person's emotions cannot be reliably inferred from facial expressions alone.
In July 2019, the peer-reviewed journal Psychological Science in the Public Interest published a systematic review of research on facial expressions and their relationship to emotions. The reviewers concluded that people express the same emotion in many different ways, and that the same facial movement can convey different emotions, which means companies touting emotion-detection algorithms are most likely exaggerating their effectiveness.
Many scientists are skeptical that mood can truly be detected this way. In an interview with the Financial Times, the American psychologist Paul Ekman, famous for his studies of non-verbal human behavior, pointed out that the makers of most emotion-detection technologies make no attempt to prove their effectiveness scientifically.
Regardless of whether the technology works, human rights activists fear that its abuse could lead to totalitarian government control of society. The scientific community is also questioning the legitimacy of some of the measures China is taking.
Two years ago, the Beijing authorities announced that they would use a facial recognition system to sort subway passengers. The Beijing subway has metal detector gates and airport-style X-ray machines for luggage; because everyone must pass this check, long queues form at station entrances during peak hours, and the screening is not always effective.
The authorities therefore announced that they would use algorithms to analyze people's appearance, not only their faces but also their gait, gestures, and so on, to decide who looks suspicious, send those people for inspection, and let everyone else pass freely. It was positioned as a way to streamline the process and reduce entry queues. Many Chinese scholars from major universities have written that this violates basic human rights: where is the presumption of innocence, and why is it necessary to sort people this way? It is also unclear how these algorithms work; they are opaque.
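The scholars' objection is easier to see once the sorting rule is written down. The sketch below is a hypothetical reconstruction of how such a score-and-threshold triage could look in principle; the features, weights, and threshold are invented for illustration, since nothing about the real system has been disclosed.

```python
# Hypothetical score-and-threshold triage. The reported system is described
# only at a conceptual level, so every number below is an assumption.

from dataclasses import dataclass

@dataclass
class PassengerFeatures:
    face_match_score: float   # similarity to a watchlist entry, 0..1
    gait_anomaly: float       # deviation from a "normal" gait model, 0..1
    gesture_anomaly: float    # deviation from "normal" gestures, 0..1

# The opaque part: these weights and this threshold define who counts as
# "suspicious", yet nothing about them is published or explained to the
# people being sorted.
WEIGHTS = {"face_match_score": 0.6, "gait_anomaly": 0.25, "gesture_anomaly": 0.15}
THRESHOLD = 0.5

def triage(p: PassengerFeatures) -> str:
    score = (WEIGHTS["face_match_score"] * p.face_match_score
             + WEIGHTS["gait_anomaly"] * p.gait_anomaly
             + WEIGHTS["gesture_anomaly"] * p.gesture_anomaly)
    return "send to inspection" if score >= THRESHOLD else "pass freely"

print(triage(PassengerFeatures(0.2, 0.9, 0.4)))  # -> "pass freely"
```

The entire judgement about a passenger is concentrated in an unexplained scoring function, which is exactly the opacity the scholars criticize.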
But under China's current leadership, the emphasis is on security. According to the official paradigm, China faces danger from outside and from within, so the authorities spare no expense on security measures. The country is becoming more and more of a police state.

Written by smith-willas | Smith Willas is a freelance writer, blogger, and digital media journalist. He has a management degree.
Published by HackerNoon on 2021/05/29