An “algorithm” is a set of rules for processing data and making decisions efficiently. A recipe is an algorithm. Banks, insurance companies and investment firms use algorithms to reduce risk; courts use them to help determine guilt; and dating services use them to find promising match-ups. Because they serve so many purposes, we are all affected by them whether we know it or not.
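To make the idea concrete, here is a toy sketch of a decision algorithm of the kind a lender might use. The rules, names and numbers are invented purely for illustration; no real bank's model is this simple.

```python
def approve_loan(income, debt, late_payments):
    """Hypothetical loan-screening rules, purely for illustration."""
    if late_payments > 2:        # rule 1: reject poor payment history
        return False
    if debt > 0.4 * income:      # rule 2: reject high debt-to-income ratio
        return False
    return True                  # all rules passed

print(approve_loan(50000, 10000, 0))  # True
print(approve_loan(50000, 30000, 0))  # False
```

The point is simply that an algorithm is an explicit sequence of rules applied to data, whether the data is flour and eggs or a credit history.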

Data comes from an expanding variety of sources, including smartphones, credit cards, ads, medical implants and even toys. “Reason” magazine says that “humanity in 2025 will be generating 163 trillion gigabytes of data annually, a ten-fold increase in the global datasphere since 2016.” Currently we are being watched by 60 million cameras.

The abolition of privacy is the essence of totalitarianism. Currently, China’s government is perfecting an algorithm for recording every aspect of the lives of its 1.4 billion citizens. Data such as a citizen’s age, educational level, family ties and legal history can be accessed with the click of a mouse. It’s a gigantic experiment called the “Social Credit System,” a form of behavioral conditioning designed to instill obedience by dispensing rewards and punishments through remote control. The purpose is to install a synthetic superego in the public mind and suppress critical thoughts about the government.

China does this with 200 million cameras linked to facial-recognition facilities. If a citizen is late in paying taxes or shows any evidence of disloyalty to the regime, a low score goes into his permanent record. The offense is cross-checked against other data, such as financial, medical and legal records, which are constantly updated. The government partners with tech businesses in keeping records on the citizenry.

Activities the government approves of, such as donating blood, earn high scores, while playing video games or buying booze and tobacco earns low scores. So does associating with people believed to hold anti-government views. Other behaviors, such as an extended stay abroad or use of an unusual amount of electricity, may arouse suspicions leading to questioning. Low-scoring people are barred from luxury hotels, restaurants and the internet, and denied access to prestigious careers. At present, 15 million Chinese citizens are banned from air and rail transit.
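The mechanics described above can be sketched as a simple tally. This is a speculative illustration only: the point values, behavior names and travel-ban threshold below are invented, not drawn from any published specification of China's system.

```python
# Hypothetical point changes for observed behaviors (values invented).
SCORE_CHANGES = {
    "donated_blood": +10,
    "late_taxes": -20,
    "bought_tobacco": -5,
    "excess_video_games": -5,
}

def update_score(score, behaviors):
    """Apply each observed behavior's point change to a citizen's score."""
    for b in behaviors:
        score += SCORE_CHANGES.get(b, 0)
    return score

def is_banned_from_travel(score, threshold=50):
    """Low scorers are barred from air and rail; the cutoff is assumed."""
    return score < threshold

s = update_score(100, ["donated_blood", "late_taxes", "bought_tobacco"])
print(s)                         # 85
print(is_banned_from_travel(s))  # False
```

However crude, a sketch like this shows why such a system scales so easily: once behaviors are reduced to data, rewards and punishments can be dispensed automatically, by remote control.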

Currently, China is targeting Muslims, who are suspected of having divided loyalties. Roughly a million are in camps being indoctrinated with the Communist Party’s ideas of proper citizenship. According to the NY Times, the government has 68 billion records on its Muslims, including DNA, fingerprints, iris scans, blood samples, voice recordings and facial portraits taken from different angles.

As always, totalitarianism comes in the guise of utopianism. Shanghai says that its credit system aspires to “reach the point where no one would even consider hurting the community. If we reached this point, our work would be done.” The NY Times says that changes in habits, such as using a cell phone less frequently or using more electricity, may arouse suspicion.

The danger of thought control isn’t limited to totalitarian governments. It is inherent in surveillance technology. Two writers for “Atlantic” magazine say that “What emerges in China will not stay in China. Its repressive technologies have a pattern of diffusing to other ... regimes around the world. For this reason ... democracies must monitor and denounce this sinister creep toward an Orwellian world.”

Indeed, China’s system is spreading. To date, 54 governments have adopted algorithmic systems, many of them financed by Chinese loans. Even Australia has signed on. “Reason” magazine observes that it will likely be adopted by “every two-bit dictator on the earth.” The technology isn’t limited to passively watching citizens; it also has the capability to frame them by creating realistic video and audio forgeries.

Could it happen here?

The police department in Orlando, Fla., is experimenting with surveillance that includes CCTV cameras to watch citizens and employs algorithmic software designed by Amazon. It identifies the faces of illegal aliens, drug dealers and other assorted felons and compares them against vast databases. The purpose is to anticipate crimes before they happen, much as in the sci-fi film “Minority Report.” If Orlando’s experiment is deemed successful, there will likely be demands for similar systems elsewhere, perhaps everywhere.

Even where algorithms are well-intended, their operations are so opaque that they may shield important questions from public scrutiny. Lawyers seeking the details of a case have been rebuffed by courts because algorithmic designers have agreements with their contracting agencies that forbid disclosure.

Amazon, Google and IBM are developing technology that reads human emotions by analyzing voices. It will determine a subject’s moods, including joy, anger, sadness, fear, boredom, stress and other mental states, even when the voices are masked by background noise. How long until machines can read our thoughts?
