For years, Xinjiang has been a testbed for the Chinese government’s novel digital and physical surveillance tactics, as well as human rights abuses. But there is still a lot that the international human rights community doesn’t know, especially when it comes to post-2016 Xinjiang.

Last Wednesday, Human Rights Watch released a report detailing the inner workings of a mass surveillance app used by police and other officials. Officials use the application to communicate with the larger Integrated Joint Operations Platform (IJOP), the umbrella system for collecting mass surveillance data in Xinjiang.

This report uncovers what a modern surveillance state looks like, and can inform our work to end such systems. First, the report demonstrates that IJOP’s pervasive surveillance targets just about anyone who deviates from an algorithmically determined norm. Second, as a result, IJOP requires a massive amount of manual labor, all focused on data entry and on translating the physical world into digital relationships.

We stand with Human Rights Watch in calling for an end to violations of human rights within Xinjiang, and within China.

What’s going on in Xinjiang?

Xinjiang, officially the Xinjiang Uyghur Autonomous Region, is the largest province-level division in China and home to the Uyghurs and other Turkic minority groups. Since 2016, the Chinese government has cracked down on the region as part of the ongoing “Strike Hard” campaign. An estimated 1 million individuals have been detained in “political education centers,” and the IJOP’s surveillance system watches the daily lives of Xinjiang residents. While we fight the introduction and integration of facial recognition and street-level surveillance technologies in the U.S., existing research from Human Rights Watch gives us insight into how facial-recognition-enabled cameras already line the streets in front of schools, markets, and homes in Kashgar. WiFi sniffers log the unique addresses of connected devices, and police gather data from phone inspections, regular intrusive home visits, and mandatory security checkpoints.

Human Rights Watch obtained a copy of a mobile app police officers and other officials use to log information about individuals, and released its source code.

The primary purpose of the IJOP app is to let police officers record and complete “investigative missions,” which require officers to interrogate certain individuals or investigate vehicles and events, then log the interactions in the app. The application also contains functionality to search for information about an individual, perform facial recognition via Face++, and detect and log information about WiFi networks within range.

Who are they targeting? Well, basically everyone.

  On the left, a very long list of types of personal information IJOP collects. Basic information includes name, ethnicity, national ID number, address, profession, education, phone number, passport, car number, and relationship with head of household. Biometric data includes blood type, height, and photo. IJOP also collects religious and political status information. On the right, a long list of 36 "person types" that IJOP authorities are paying attention to.

The application focuses on individuals who fit one of 36 suspicious “Person Types.” These categories, and the nature of these “investigative missions,” reveal a great deal about the types of people IJOP is targeting.

When conducting an “investigation,” officers are prompted to create an extensive profile of the individual(s) being investigated. Despite the Chinese government’s claim that their surveillance state is necessary for countering “separatism, terrorism, and extremism,” most of these behavioral personas have nothing to do with any of the above:

People who travel. This includes individuals who move in or out of their area of residence often, people who have been abroad, or who have simply left Xinjiang province—even if they do it legally. If an individual has been abroad “for too long,” officials are also prompted to physically check the target’s phone. They’re prompted by the app to search for specific foreign messaging apps (including WhatsApp, Viber, and Telegram), “unusual” software that few people use, VPNs, and whether their browser history contains “harmful URLs.”

People with “problematic” content and software on their phones. When “suspicious” software (again, including VPNs or foreign messaging apps like WhatsApp or Telegram) is detected, the IJOP system will send a detailed alert to officials about the target and identifying information about the phone, including a unique device identifier and metadata that can be used to track the phone’s general location. This could be tied to the JingWang spyware app many residents are forced to install. Reverse engineering work from Red Team Lab found that JingWang focuses on inspecting the files stored on the device, and transmits a list of filenames and hashes to a server over an insecure HTTP connection.

People, phones, or cars that go “off-the-grid.” This could mean an individual has stopped using a smartphone, or lent a car to a friend. An individual’s ID going “off-grid” typically means they have left Xinjiang and are no longer in IJOP’s jurisdiction of dragnet surveillance, generally due to school, moving (legally), or tourism.

People who are related to any of the above. Following the disappearance and subsequent reappearance of poet and musician Abdurehim Heyit, the international Uyghur diaspora began an online activism campaign and reported thousands of missing relatives. The strong focus on relatives and familial ties in IJOP data profiles confirms the surveillance state’s practice of suspecting, interrogating, and even detaining individuals simply because they are related to someone who has been deemed “suspicious.”

...And people who are not. The application flags all sorts of people. People who consume too much electricity, people subject to a data entry mishap, people who do not socialize with neighbors, people who have too many children...the list goes on and on.
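The JingWang behavior described above, building an inventory of file names and hashes and shipping it over unencrypted HTTP, can be sketched in a few lines. This is an illustration of the reported technique, not JingWang's actual code: the digest algorithm, directory layout, and payload format here are assumptions.

```python
import hashlib
import os

def scan_files(root):
    """Walk a directory tree and build a (path, MD5 hex digest) inventory,
    the kind of file listing JingWang reportedly transmits to a server."""
    inventory = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.md5(f.read()).hexdigest()
            inventory.append((path, digest))
    return inventory

# Because the report says this list travels over plain HTTP, anyone on the
# network path could read it: every filename and hash is visible in cleartext.
```

The insecure transport is the point worth noticing: even if the collected data were "only" hashes, sending it without TLS exposes the entire inventory to any on-path observer.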

Despite grandiose claims, the process is still manual and labor-intensive

Any small deviation from what the IJOP system deems “normal behavior” could be enough to trigger an investigation and prompt a series of intrusive visits from a police officer. As a result, the current surveillance system is extremely labor-intensive due to the broad categorizations of “suspicious persons,” and the raw number of officials needed to keep tabs on all of them.

Officers, under severe pressure themselves to perform, overwork themselves feeding data to IJOP. According to Human Rights Watch:

These officials are under tremendous pressure to carry out the Strike Hard Campaign. Failure to fulfill its requirements can be dangerous, especially for cadres from ethnic minorities, because the Strike Hard Campaign also targets and detains officials thought to be disloyal.

The process of logging all this data is entirely manual; the app itself uses a simple decision tree to decide which bits of information an official should log. According to Human Rights Watch, although the application isn’t as sophisticated as the Chinese government has previously touted, it’s still not exactly clear what sort of analyses IJOP may be performing with this massive trove of personal data and behavior.
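A decision tree of this kind can be sketched as nested prompts, where each answer determines which additional fields the officer is asked to fill in. The questions and field names below are illustrative, loosely modeled on the report's travel example; they are not the app's actual schema.

```python
# A toy decision tree: each yes/no answer selects the next question,
# until a leaf lists the fields the officer must record.
TREE = {
    "question": "Has the person been abroad?",
    "yes": {
        "question": "Abroad longer than the permitted period?",
        "yes": {"log": ["phone_inspection", "installed_apps", "browser_history"]},
        "no": {"log": ["travel_dates", "destination"]},
    },
    "no": {"log": ["basic_profile"]},
}

def fields_to_log(tree, answers):
    """Follow a sequence of 'yes'/'no' answers down the tree and
    return the list of fields the app would prompt the officer to log."""
    node = tree
    for answer in answers:
        node = node[answer]
        if "log" in node:
            return node["log"]
    raise ValueError("not enough answers to reach a leaf")
```

The structure shows why the system is so labor-intensive: the tree automates nothing except choosing which form to show next; a human still has to answer every question and enter every field.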

IJOP’s focus on data entry and on translating physical relationships into discrete data points reminds us that digitizing our lives is the first step toward creating a surveillance state. Some parts of the application depend on already-catalogued information: the centralized collection of electricity usage, for instance. Others are intended to collect as much data as possible for use elsewhere. In Xinjiang, the police already know a huge array of invasive information about you, and it is their job to collect more.

And as all behavior is pulled into the state’s orbit, ordinary people can become instant suspects, and innocent actions have to be rigorously monitored. Using certain software becomes, if not a crime, then a reason for suspicion. Wandering from algorithmic expectations targets you for further investigation.

Invoking the “slippery slope” misses the point, because the privacy violations we predict and fear are already here. Groups like Human Rights Watch, including their brave colleagues within Xinjiang, are doing everyone a service by exposing what a modern surveillance state looks like.