Huawei reportedly worked with 4 additional companies to build surveillance tools that track people by ethnicity, following recent revelations that it tested a 'Uighur alarm'
- Huawei has worked with at least four partner companies to develop surveillance technologies that claim to monitor people by ethnicity, The Washington Post reported Saturday.
- Last week, The Post reported that Huawei in 2018 had tested a "Uighur alarm" — an AI facial recognition tool that claimed to identify members of the largely Muslim minority group and alert Chinese authorities.
- Huawei told The Post that the tool was "simply a test," but according to Saturday's report, Huawei has developed multiple such tools.
- The reports add to growing concern over China's extensive surveillance and oppression of Uyghurs and other minority groups, as well as increasing use of racially discriminatory surveillance tools and practices by US law enforcement.
In 2018, Huawei tested an AI-powered facial-recognition technology that could trigger a "Uighur alarm" for Chinese authorities when it identified a person from the persecuted minority group, The Washington Post reported last week.
At the time, Huawei spokesperson Glenn Schloss told The Post that the tool was "simply a test and it has not seen real-world application."
But a new investigation published by The Post on Saturday found that Huawei has worked with dozens of security firms to build surveillance tools, and that products it developed in partnership with four of those companies claimed to be able to identify and monitor people based on their ethnicity.
Documents publicly available on Huawei's website detailed the capabilities of those ethnicity-tracking tools as well as more than 2,000 product collaborations, according to The Post. The publication also reported that after it contacted Huawei, the company took the website offline temporarily before restoring the site with only 38 products listed.
"Huawei opposes discrimination of all types, including the use of technology to carry out ethnic discrimination," a Huawei spokesperson told Business Insider. "We provide general-purpose ICT [information and communication technology] products based on recognized industry standards."
"We do not develop or sell systems that identify people by their ethnic group, and we do not condone the use of our technologies to discriminate against or oppress members of any community," the spokesperson continued. "We take the allegations in the Washington Post's article very seriously and are investigating the issues raised within."
According to The Post, Huawei worked with Beijing Xintiandi Information Technology, DeepGlint, Bresee, and Maiyuesoft on products that made a variety of claims about estimating, tracking, and visualizing people's ethnicities. It also worked with other Chinese tech companies on tools to suppress citizens' complaints about wrongdoing by local government officials and to analyze "voiceprint" data.
Beijing Xintiandi Information Technology, DeepGlint, Bresee, and Maiyuesoft could not be reached for comment.
Human rights groups, media reports, and other independent researchers have extensively documented China's mass surveillance and detainment of as many as one million Uyghurs, Kazakhs, Kyrgyz, and other Muslim minority groups in internment camps, where reports allege they are subjected to torture, sexual abuse, and forced labor for little or no pay.
To help it build the surveillance apparatus that enables such widespread detainment, the Chinese government has at times turned to the country's technology firms.
"This is not one isolated company. This is systematic," John Honovich, the founder of IPVM, a research group that first discovered the 2018 test, told The Post. He added that "a lot of thought went into making sure this 'Uighur alarm' works."
In October 2019, the US Commerce Department added 28 Chinese government agencies and tech companies, including China's five "AI champions" (Hikvision, Dahua, SenseTime, Megvii, and iFlytek), to its "entity list," preventing US firms from exporting certain technologies to them.
Still, some of those blacklisted companies have managed to continue exporting their technologies to Western countries, and BuzzFeed News reported last year that US tech firms, including Amazon, Apple, and Google, have continued selling those companies' products to US consumers via online marketplaces.
In the US, law enforcement agencies and even schools have also increased their reliance on facial recognition software and other AI-powered surveillance technologies, despite growing evidence that such tools exhibit racial and gender bias.
But recent pushback from activists, tech ethicists, and employees has led some tech companies to temporarily stop selling facial recognition tools to law enforcement, and some US cities have issued moratoriums on their use, highlighting divides between approaches to policing in the US and China.
Contributor: Business Insider https://ift.tt/3qV2LUU