- Facebook is teaching an artificial intelligence (AI) system to navigate areas of New York City using the context of its surroundings, part of a project to help the system understand human language. A blog post from Facebook Artificial Intelligence Research (FAIR) says the "Talk the Walk" program is "introducing a degree of realism not previously found in this area."
- The trial involves two AI bots: a "tourist" that describes what it sees, and a "guide" that relies only on a 2D map to offer directions. The goal is to execute a task through "embodied AI," also called "grounded language learning," which teaches language through context rather than through large blocks of text.
- "It isn't enough for virtual assistants to offer a rote response to your voice or text," wrote FAIR researchers Douwe Kiela and Jason Weston. "For AI systems to become truly useful in our daily lives, they'll need to achieve what's currently impossible: full comprehension of human language."
Making AI work with natural human language is key to broader deployment of the technology in real-life applications, enabling bots and humans to talk to one another. FAIR researchers had the bots develop natural language grounded in data from human subjects, rather than exchanging precise, scripted messages such as, "Go to the next block, then turn right to get to the restaurant." Fast Company compares the approach to the way a baby learns to speak "by naming what it sees."
AI is expected to have major impacts on smart cities, in areas ranging from public safety to roads. Researchers at Cambridge University in England used AI to teach an autonomous vehicle to stay in its lane in just 20 minutes, a strategy that could accelerate adoption of self-driving cars. Maryland has invested more than $50 million in a system that uses AI to control traffic signals. Cities like Chattanooga, TN, have even applied AI to electricity grids to better deploy power for their residents.
Facebook’s research will help make AI systems more responsive to their human users, which could have implications for systems like smart grids and autonomous vehicles. A system that can understand and react to spoken commands from humans will make communication easier and could help smart cities respond better to residents’ needs.