Wednesday, May 10, 2017

Computers Are Racist

More proof, as if we needed any, that "racist!" is a catch-all term for "bad."
Chinese professor Wu Xiaolin has expressed surprise after a study he co-authored, which claimed machines could be programmed to tell whether a person was a criminal, was slammed by Google researchers as "scientific racism".

In the paper, submitted last November to Cornell University Library's arXiv site, an online repository for scientific preprints, Wu and his student Zhang Xi wrote that they found machines could use data to make reliable inferences about whether someone was a criminal. The study was based on an analysis of almost 2,000 Chinese government ID photos of criminals and non-criminals.

“Given the race, gender and age, the faces of law-abiding members of the public have a greater degree of resemblance compared with the faces of criminals,” the pair wrote, noting that criminals tended to have eyes that were closer together.
Studying Chinese faces is now racist in China, because racist.
Professor claims AI can spot criminals 90 percent of the time just by looking at photos
In the first paper, Wu and his student Zhang Xi explained they used computer vision and machine learning to see if a computer could distinguish criminals from non-criminals by analyzing the faces of 1,856 people.

Wu and his team collected ID card photos that satisfied the following criteria: Chinese, male, between the ages of 18 and 55, no facial hair, no facial scars or other markings.

The ID photos of the 1,126 non-criminal men came from a wide range of professions and social statuses; the 730 ID photos of criminals came from those published by public security bureaus in different provinces in China.

The computer found that the angle from the nose tip to the corners of the mouth was on average smaller for criminals than for non-criminals. Also, the upper lip curvature was on average larger for criminals than for non-criminals.

Using these variables to judge photos, the AI was able to distinguish criminals and non-criminals in the data set with almost 90 percent accuracy.
The researchers got to 90% with faces alone.
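
For concreteness, here's a minimal sketch of what a classifier on those two measurements could look like. This is not Wu and Zhang's actual method (the article doesn't say what model they used), and every feature mean and spread below is an invented assumption; only the sample sizes come from the article.

# Illustrative sketch only: a toy two-feature classifier in the spirit of the
# study. Feature names, means, and spreads are invented assumptions; only the
# sample sizes (1,126 / 730) come from the article.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Fabricate a small mean shift between classes in the two measurements the
# article names: nose-tip-to-mouth-corner angle and upper lip curvature.
n_noncriminal, n_criminal = 1126, 730
noncriminal = np.column_stack([
    rng.normal(100.0, 5.0, n_noncriminal),  # angle in degrees (assumed)
    rng.normal(0.10, 0.03, n_noncriminal),  # curvature, arbitrary units (assumed)
])
criminal = np.column_stack([
    rng.normal(96.0, 5.0, n_criminal),      # smaller angle, per the article
    rng.normal(0.13, 0.03, n_criminal),     # larger curvature, per the article
])

X = np.vstack([noncriminal, criminal])
y = np.concatenate([np.zeros(n_noncriminal), np.ones(n_criminal)])

# Hold out a quarter of the photos to measure accuracy on unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")

The takeaway: once faces are reduced to a few geometric numbers, telling the two groups apart is a stock classification problem. Whether a figure like 90 percent holds up on genuinely new data is a separate question.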

Now watch this video of a "mentalist".

If he really has intuition and isn't doing magic tricks or suggestion, then he is picking up subtle signals. Software can process this kind of data today. The bottleneck is the amount of data required: watching people for all sorts of subtle tells requires constant video feeds. We're starting to get data on criminals: police are wearing body cams, and their interactions with suspects will be thoroughly investigated. If no one in the West uses the data, some researcher in China will.
