A new robot can now identify objects by touch
2019-06-17 16:25 by Daniela
Tags: robot, MIT
A new robot developed by MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) is being taught to determine what an object looks like just by touching it, as well as predict what an object will feel like just by looking at it. The system was tested on a KUKA robot arm fitted with a special tactile sensor called GelSight, which was designed by another group at MIT.
The system uses just one simple web camera, which recorded nearly 200 objects around the arm, such as tools, household products, and fabrics. During testing, if the model was fed tactile data on a shoe, for instance, it could produce an image of where the shoe was most likely to be touched. The same goes for a computer mouse, box, cup, T-shirt, hammer—whatever its automated heart desires.
"By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said CSAIL PhD student and lead author on the research Yunzhu Li, who wrote the paper alongside MIT professors Russ Tedrake and Antonio Torralba and MIT postdoc Jun-Yan Zhu. "By blindly touching around, our [AI] model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects."
However, for now, CSAIL's robot can only identify objects in a controlled environment. Next, the researchers plan to enlarge the tactile/visual data set so the robot can perform tasks in various other settings.
In the future, this work could foster a closer coupling between vision and touch in robotics, especially for object recognition, grasping, and scene understanding, and could support seamless human-robot integration in assistive or manufacturing settings.
Read more -here-