Language is all about repetition. Every word you're reading was created by humans and then used by other humans, creating and reinforcing context, meaning, and the very nature of language itself. As humans train machines to understand language, they're teaching machines to replicate human bias.
Today's artificial intelligence is certainly formidable. It can beat world champions at intricate games like chess and Go, or dominate at Jeopardy!. It can interpret heaps of data for us, guide driverless cars, respond to spoken commands, and track down the answers to your internet search queries.
We already interact with artificial intelligence in our daily lives. Furby and Clippy were early forms; driverless cars and Facebook's chatbots pick up the mantle today. But if AI is to continue its evolution, it'll have to get more convincingly human. Right now, its capacity for emotional depth is seriously lacking.
Mark Zuckerberg's first Q&A over Facebook Live included a special, unannounced guest. After spending a good amount of time answering basic questions about Facebook (soon, artificial intelligence will be able to automatically add subtitles to videos you upload) and speculating about the future, Zuckerberg left his couch and returned with Jerry Seinfeld, the famous comedian, who just happened to be in the building trying the Facebook-owned Oculus Rift virtual reality headset.
Yesterday at Google's I/O developers conference, CEO Sundar Pichai briefly spoke about a custom-built chip that helps give Google its edge in machine learning and artificial intelligence. The chip, dubbed a TPU or Tensor Processing Unit (in keeping with Google's A.I. platform TensorFlow), is purpose-built for running Google's decision-making algorithms. Most companies, including Facebook and Microsoft, use GPUs for their machine learning and artificial intelligence.