Facebook's DeepText is smart enough to understand you
Facebook's DeepText is almost as smart as you.
The artificial intelligence (AI) engine "can understand with near-human accuracy the textual content of several thousand posts per second, spanning more than 20 languages," Facebook announced in a blog post Wednesday.
Facebook's newest AI system will allow it to continue to personalize content for its billion-plus users, as well as weed out spam and hate speech. The engine also marks another major milestone in deep-learning technology, as Facebook, Google, and Microsoft invest heavily in teaching machines to think.
"Having good machine learning models is a force multiplier for a lot of the stuff they are doing," Bradley Hayes, a postdoctoral associate at MIT and the creator of the satirical AI chatbot @DeepDrumpf, told The Christian Science Monitor last month. "[It's about] scalability and performance."
For Facebook, the "scalability" Mr. Hayes refers to is enormous. It's about understanding the trillions of posts Facebook's 1.65 billion users post.
But before DeepText and other deep-learning technology, machines couldn't understand semantics. In traditional machine-learning approaches to natural language processing (NLP), words are converted into numbers that computer algorithms can work with, Facebook's blog explains. The word "brother" could be assigned the number "459," while "bro" could be assigned "98665." Because the two integers are unrelated, a computer couldn't associate "brother" with "bro."
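To make that limitation concrete, here is a minimal sketch in Python (not Facebook's code; the IDs and vectors are invented purely for illustration) contrasting arbitrary integer word IDs with the kind of learned word embeddings that deep-learning systems such as DeepText rely on:

```python
# A toy contrast between integer word IDs, which carry no notion of similarity,
# and word embeddings, where related words sit close together in vector space.
import math

# Traditional NLP: arbitrary integer IDs -- "brother" and "bro" look unrelated.
word_ids = {"brother": 459, "bro": 98665}

# Hypothetical toy embeddings (made up for illustration): related words
# get similar vectors, so a model can treat them as near-synonyms.
embeddings = {
    "brother": [0.91, 0.40, 0.12],
    "bro":     [0.88, 0.35, 0.15],
    "table":   [0.05, 0.80, 0.70],
}

def cosine(u, v):
    """Cosine similarity: close to 1.0 for similar vectors, lower for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(embeddings["brother"], embeddings["bro"]))    # high (~0.99)
print(cosine(embeddings["brother"], embeddings["table"]))  # much lower (~0.42)
```

The integer IDs tell the computer nothing about how the two words relate, while the vectors let it measure that "brother" and "bro" are near neighbors.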
In order for computers to think more like humans, they must understand slang and handle word-sense disambiguation, Facebook writes. They must also be able to tell whether the sentence "I like blackberry" refers to the fruit or the smartphone. To accomplish this, the DeepText engine uses deep learning that draws on several different neural network architectures.
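A rough sense of how context can resolve the "blackberry" ambiguity is sketched below (again, invented vectors and a deliberately simplified method, not DeepText's actual model): the surrounding words are averaged and compared against a prototype vector for each sense.

```python
# Toy word-sense disambiguation: pick the sense of "blackberry" whose
# prototype vector best matches the words around it in the sentence.
import math

# Hypothetical 2-D embeddings: axis 0 ~ "food-ness", axis 1 ~ "tech-ness".
vectors = {
    "ate":    [0.9, 0.1],
    "juicy":  [0.8, 0.2],
    "texted": [0.1, 0.9],
    "phone":  [0.1, 0.95],
    "blackberry_fruit": [0.85, 0.15],
    "blackberry_phone": [0.15, 0.90],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def disambiguate(context_words):
    # Average the context-word vectors, then score each sense prototype.
    ctx = [sum(vectors[w][i] for w in context_words) / len(context_words) for i in range(2)]
    senses = ["blackberry_fruit", "blackberry_phone"]
    return max(senses, key=lambda s: cosine(ctx, vectors[s]))

print(disambiguate(["ate", "juicy"]))     # -> blackberry_fruit
print(disambiguate(["texted", "phone"]))  # -> blackberry_phone
```

Real systems learn these vectors from enormous amounts of text rather than having them written by hand, but the underlying idea is the same: context decides the meaning.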
As Jeff Ward-Bailey wrote for The Christian Science Monitor in March, "neural networks are capable of sifting through data and identifying patterns and relationships on their own, so they're not limited by a hard-coded set of rules written by developers." Neural networks helped Google's AlphaGo computer defeat top-ranked Go player Lee Se-dol, as the Monitor reported.
Go, though, is just a board game. DeepText could have much more of an impact on our everyday lives, as 400,000 new stories and 125,000 comments on public posts are shared every minute, according to TechCrunch.
DeepText is already powering parts of Facebook, Quartz reports; if you write on Facebook Messenger that you need a taxi, for example, a chatbot could interject and give you the number for one. Applications of DeepText in the near future include monitoring comments and helping to identify the most relevant, or the most inappropriate, comments on other posts.
Facebook's tease of DeepText comes as it, Google, and Microsoft continue to unveil new AI technology. After AlphaGo proved it could defeat a human, Google revealed it is developing AI that is not just smart, but artistic. Microsoft is experimenting with Tay, a chatbot capable of having conversations with humans via Twitter.
As exciting as all of this is, Mike Murphy of Quartz points out one downside of DeepText and Facebook's ability to comprehend the information it accumulates: it will be able to keep users on Facebook, and away from Google, which searches the web.
As Facebook gets better at offering us personalized search results from our networks, as useful as those might be, it also keeps us in a more insular version of the web, shaped by our own geography, demographics, affinities, and beliefs. Google also does this, to an extent, but at least it searches the entire web first, not just each of our own echo chambers.