
Application of AI in human rights violations – The Uyghur case

By Abdulhakim Idris

It is no secret that AI holds a great deal of potential, good and bad, for many aspects of life. However, its negative impact on human liberties and freedoms remains an important and very dangerous topic for our future.

Today, when people think of AI, the first thing that comes to mind is probably an application such as ChatGPT. That app was released about a year ago, in November 2022. It suddenly attracted everyone’s attention, and the surge of public interest in AI drove the development of many other applications.

However, it is a fact that one of the places in the world where AI-supported advanced technology products are used the most is East Turkistan under Chinese occupation. For years before ChatGPT existed, the Chinese Communist Regime had been using artificial intelligence applications from both Chinese and Western technology companies to oppress Muslim Uyghurs in East Turkistan.

A documentary showing that technologies such as artificial intelligence have been used for years in East Turkistan was released in 2020, before anyone had heard of ChatGPT. It pointed out that, at the time of its release, 1,200 technology companies were operating in the region.

These technology companies first tested the applications they developed on the people of East Turkistan, in violation of human rights, and then exported them to other countries. East Turkistan today has become a giant testing field for AI development. The Chinese Communist Regime has already established a digital dystopian system within the perfect police state of East Turkistan, and the Uyghur people now live under heavy surveillance.

A report published by Human Rights Watch (HRW) in 2021 documenting crimes against humanity in East Turkistan also shows how advanced technologies are used as a tool of genocide. The report emphasizes that 2,000 people were arrested through the Integrated Joint Operations Platform (IJOP), an AI-supported digital system, in Aksu prefecture alone.

The same report states that China uses advanced technologies not only in East Turkistan but also to track Uyghurs in the diaspora, whose phones are hacked. In this way, their conversations, their meetings with their families, and their every movement are monitored.

As the report notes, the Uyghur diaspora now also suffers from this kind of transnational oppression, carried out consistently by a digital dystopian regime.

It certainly feels unreal. How else can you describe using an AI “emotion detector” on a Uyghur man to determine if he is a threat to the regime? Yet this is now a daily part of reality for Uyghurs in East Turkistan.

As reported by the BBC in 2021, an AI system is used to detect and analyze the slightest change in facial expression to determine if the subject is negative, anxious, or suspicious. A slight, out-of-place twitch or a slowed blink could be enough for the AI to decide that you are hiding something.

It goes beyond reading microexpressions: basic patterns of living are analyzed and broken down into data points for consumption. Any straying from this pattern triggers a threat warning to local police.

If a Uyghur teenager comes home and enters through the back door instead of the front door as she always does, she is deviating from her pattern and is now considered abnormal. If a Uyghur grandfather decides to drink coffee instead of tea, he deviates from his pattern and is now considered abnormal. Even ethnicity itself is analyzed and flagged for concern. This kind of information is then used to build a social credit system.

In China, censorship has never been easier than it is today. Generative AI, through the likes of ChatGPT and DALL-E, has also been used to promote or hide information. The generators have been modified to produce only the content that the CCP wants, while AI algorithms search online messages and social media posts for “politically sensitive” content.

These algorithms are also used to push specific state news and messaging to the population, the better to influence and control it. People struggle even to criticize this: if they discuss it online, the algorithm flags and removes the content, and if they attempt to organize a protest in person, the algorithm flags them and informs the police. For example, you can never find a word about the Uyghur Genocide using the Chinese search engine Baidu.

Another point that needs to be emphasized is the Chinese communist regime’s use of AI as a transnational disinformation tool. Traces of this can be seen in the systematic attacks, across a variety of social media platforms, on the activists and victims who struggle to expose the genocide in East Turkistan.

China also uses advanced technologies to collect the DNA of Uyghurs in a way that threatens the future of humanity. In this way, all the relatives of anyone declared suspicious are swept within the scope of threats.

This DNA data is also used in the illegal harvesting of Uyghurs’ organs, which are marketed as ‘halal organs’ to so-called Muslim countries. With DNA samples in hand, the Chinese Communist Regime has created a large source of organs for itself.

Finally, AI is consistently being used to target not just the Uyghur people, but minority and marginalized groups in other parts of the world. Various reports have indicated that India, Israel, the United States, Denmark, and other countries have also used AI to profile and target groups of people. Risk assessments produced by algorithms have been shown to be biased against ethnic minorities, as remarked upon by former US Attorney General Eric Holder and reported in outlets such as ProPublica.

Furthermore, biometric data, often collected and analyzed through AI algorithms, has been used to harm refugees, as in the case of India’s repatriation of Rohingya Muslims to Myanmar, where systematic genocide is still being perpetrated.

To sum up, the use of AI as a tool for genocide, racism, Islamophobia, social credit systems, and mass data collection, just as in East Turkistan, is a bitter reality that points to a dark future for all of mankind. This reality shows us clearly that AI is a powerful tool, one that can easily be turned into a weapon. It is concerning how little thought has been given to its potential as an instrument of oppression, and how little care has been shown for basic human rights and freedoms.

