In 2009, Fei-Fei Li created a data set that would change the history of artificial intelligence. Today, she's working on agents that can interact with their environments in 3-D virtual worlds. This is the broad goal of a new field known as embodied AI.
An embodied AI agent is an artificial intelligence agent that can probe and change its own environment. In robotics, the agent always lives in a robotic body, but in simulations it may have a virtual body, or it may sense the world through a moving camera vantage point.
Researchers have been able to create realistic virtual worlds for AI agents to explore. The ability came from improvements in graphics driven by the movie and video game industries. In 2017, AI agents could make themselves at home in the first virtual worlds to realistically portray indoor spaces.
This doesn't mean the work is done. "Even the best simulator is significantly less realistic than the real world," said Daniel Yamins, a computer scientist at Stanford University. With colleagues at MIT and IBM, Yamins co-developed ThreeDWorld, which puts a strong emphasis on mimicking real physics in virtual worlds: things like how liquids behave and how some objects are rigid in one area and soft in others.
Researchers say embodied AI agents detect specified objects more accurately than algorithms trained on simpler, static image tasks. When researchers compare the two, they find that in embodied agents each individual neuron is more selective about what it responds to, while nonembodied networks are less efficient.
In just a few years, a team led by Dhruv Batra has rapidly improved performance on navigation tasks.
Navigation is one of the simplest tasks in embodied AI, but researchers are still far from mastering it. Navigating to objects specified by complex language instructions like "Go past the kitchen to retrieve the glasses on the nightstand in the bedroom" remains a tough task for agents to master.
Many researchers are finding that even embodied AI agents can benefit from training in virtual worlds. In 2018, researchers at OpenAI demonstrated that skills learned in simulation could transfer to the real world. More recent successes include flying drones that learned to avoid collisions in the air and self-driving cars deployed in urban settings on two different continents.
Robotics researchers might soon send humans into simulated environments via virtual reality headsets to teach robots how to interact with people. A key goal of robotics research is to build robots that are helpful to humans in the real world. "We're going to learn the foundational technology of intelligence, or AI, that can really lead to major breakthroughs," Dieter Fox says.