New AI Analyzes Facial Structure To See Who's A Terrorist
Dave Gershgorn
at 11:19 AM May 25 2016
We're doomed.

As humans, we're taught from an early age not to judge people by the way they look. And that's for good reason—people's natures aren't defined by their physical appearance.

Faception, a startup backed by legitimate venture capitalists, thinks you can judge a book by its cover. They claim their artificial intelligence algorithms can look at faces and tell which ones are likely to be terrorists, professional poker players, pedophiles, or (worst of all) brand promoters, as reported by The Washington Post.

Despite being comparable to the long-debunked practice of phrenology, which claimed to discern mental capability by measuring the exterior of the skull, Faception claims to have mastered a "facial personality profiling technology, at scale and in real-time," according to CEO Shai Gilboa, who presented the software at a venture capital demo. Gilboa is also Faception's Chief Ethics Officer.

The "science" behind this claim is neatly broken down into two points on Faception's website:

  1. According to Social and Life Science research, personalities are affected by genes.

  2. Our face is a reflection of our DNA.

Yes, DNA plays a role in both your physical appearance and personality. But any research suggesting that personality can be deduced from appearance is tenuous at best.

Faception works across a number of "varticals."

We don't even know what most of the human genome does. We don't know if the genes that control physical appearance are even remotely near, or affected by, those that influence behavior. As per the Human Genome Project:

Having the essentially complete sequence of the human genome is similar to having all the pages of a manual needed to make the human body. The challenge to researchers and scientists now is to determine how to read the contents of all these pages and then understand how the parts work together and to discover the genetic basis for health and the pathology of human disease.

In addition to this gaping hole in Faception's logic, Gilboa told WaPo that their A.I. system has an accuracy of 80 percent—meaning one in five of its judgments is wrong. And because actual terrorists are vanishingly rare among the general population, even that error rate guarantees that the overwhelming majority of people the system flags would be innocent.
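To see why an "80 percent accurate" classifier is so misleading for rare categories, here is a minimal base-rate calculation. The numbers are illustrative assumptions, not Faception's published figures: we assume the system catches 80 percent of true positives and correctly clears 80 percent of innocents, and that 1 in 100,000 people screened actually belongs to the flagged category.

```python
# Sketch of the base-rate problem behind an "80 percent accurate" classifier.
# All numbers are illustrative assumptions, not Faception's figures.

def positive_predictive_value(sensitivity, specificity, base_rate):
    """Probability that a person flagged by the system is a true positive."""
    true_positives = sensitivity * base_rate
    false_positives = (1 - specificity) * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

# Assumed: 80% sensitivity, 80% specificity, 1-in-100,000 base rate.
ppv = positive_predictive_value(sensitivity=0.8, specificity=0.8, base_rate=1e-5)
print(f"Chance a flagged person is a true positive: {ppv:.4%}")
# With these assumptions, roughly 0.004%—about 25,000 false alarms
# for every genuine hit.
```

Under these assumptions, nearly everyone the system flags is a false positive, which is why overall accuracy says almost nothing about a screening tool for rare traits.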

Despite this, Faception claims to have signed a $750,000 contract with "a homeland security agency." It's also funded by the same firm that backed Credit Karma, music app Smule, and Behance (which has since been acquired by Adobe).
