Researchers from MIT have developed a learning-based particle simulator that allows robots to interact with delicate or pliable objects, such as clay, and make predictions about solid objects and liquids as well. The system could give industrial robots a more refined touch and open up fun applications for personal robots, such as molding sticky rice for sushi.
In a recently released paper, the researchers detail how a simulated model learns and remembers how different materials, represented as collections of particles, respond when pushed, poked, and prodded. Robots can then use those models to predict an object's physical makeup and how it would respond to their touch.
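The core idea can be illustrated with a toy particle step. The sketch below is not the authors' model: the mapping from relative particle states to motion here is a hand-written spring-like rule, whereas the MIT system learns that mapping from data. The function name `step` and the `radius` and `stiffness` parameters are illustrative assumptions.

```python
import numpy as np

def step(positions, velocities, radius=0.5, stiffness=5.0, dt=0.01):
    """Advance particle states one step using pairwise interactions.

    Particles within `radius` of each other exchange spring-like
    "messages" that nudge their velocities; in a learned simulator this
    relative-state-to-effect mapping is a neural network instead.
    """
    n = len(positions)
    accel = np.zeros_like(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            offset = positions[j] - positions[i]
            dist = np.linalg.norm(offset)
            if 0 < dist < radius:
                # Repulsive when closer than the rest length `radius`.
                accel[i] += stiffness * (dist - radius) * offset / dist
    velocities = velocities + dt * accel
    positions = positions + dt * velocities
    return positions, velocities

# Two nearby particles push apart; a distant third particle is unaffected.
pos = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 0.0]])
vel = np.zeros_like(pos)
pos2, vel2 = step(pos, vel)
```

Predicting how a whole object deforms then amounts to rolling this step forward for every particle in the object.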
“Humans have an intuitive physics model in our heads, where we can imagine how an object will behave if we push or squeeze it. Based on this intuitive model, humans can accomplish amazing manipulation tasks that are far beyond the reach of current robots. We want to build this type of intuitive model for robots to enable them to do what humans can do.” — Yunzhu Li, MIT CSAIL grad student and lead researcher
To demonstrate their learning-based particle simulator, the researchers employed their ‘RiceGrip’ robot, a two-fingered pinching robotic hand that reshaped deformable foam into desired shapes. A depth-sensing camera and object-recognition algorithms first identified the foam before the particle simulation system was engaged.
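As a rough sketch of that perception step, a depth image can be converted into a particle set by sampling the pixels that stand out from the background. The `depth_to_particles` helper and its simple thresholding scheme below are assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np

def depth_to_particles(depth, background=1.0, stride=2):
    """Sample particle positions wherever depth differs from the background.

    Returns an (N, 3) array of (x, y, depth) points; `stride` subsamples
    to keep the particle count manageable for simulation.
    """
    ys, xs = np.nonzero(depth < background)
    pts = np.stack([xs, ys, depth[ys, xs]], axis=1)
    return pts[::stride]

# A 4x4 depth map where a small object sits in front of the background.
depth = np.full((4, 4), 1.0)
depth[1:3, 1:3] = 0.5
particles = depth_to_particles(depth, stride=1)
```

Each sampled point then becomes one particle that the dynamics model pushes around.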
While the robot ‘knows’ how individual particles should react when touched, real-world particles sometimes respond differently from the model's predictions. When the robot encounters such a discrepancy, it adjusts its model to match the real-world behavior. The research team's next goal is to teach robots to interact with objects that are only partially visible, such as predicting how a stack of boxes will fall without knowing how the boxes underneath are oriented.
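That adjustment step can be caricatured as parameter calibration: predict an outcome, observe the real one, and keep the model that matches best. The `predict` and `calibrate` functions below are hypothetical stand-ins; the actual system refines a learned neural model online rather than selecting a scalar stiffness from a grid.

```python
def predict(push, stiffness):
    # Hypothetical forward model: displacement a given push produces.
    return push / stiffness

def calibrate(push, observed, candidates):
    # Crude stand-in for a learned update: keep the stiffness whose
    # prediction best matches what the robot actually observed.
    return min(candidates, key=lambda s: abs(predict(push, s) - observed))

# The real material behaves as if its stiffness were 4.0, but the
# robot's model starts from a coarse set of guesses.
push = 1.0
observed = push / 4.0
best = calibrate(push, observed, candidates=[1.0, 2.0, 3.0, 4.0, 5.0])
```

After one such correction, subsequent predictions use the updated model, so the simulated foam deforms more like the foam in the robot's gripper.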