MIT’s new robot can identify things by sight and by touch

For humans, it’s easy to predict how an object will feel by looking at it, or to tell what an object looks like by touching it, but this can be a big challenge for machines. Now, a new robot developed by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is attempting to bridge that gap between sight and touch.

The team took a KUKA robot arm and added a tactile sensor called GelSight, which was created by Ted Adelson’s group at CSAIL. The data collected by GelSight was then fed to an AI so it could learn the relationship between visual and tactile information.
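The article doesn’t say how that relationship is modeled, but one common way to learn a link between two sensing modalities is to train a pair of encoders that map camera frames and GelSight readings into a shared embedding space, so that images of the same moment of contact land close together. The sketch below is purely illustrative: the network sizes, the contrastive loss, and the batch setup are assumptions, not CSAIL’s actual architecture.

```python
# Minimal sketch of a shared visual-tactile embedding (illustrative only;
# not the model CSAIL used). Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Tiny CNN mapping a 3x64x64 image (camera frame or GelSight image)
    to a unit-length embedding vector."""
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x).flatten(1)
        return F.normalize(self.fc(h), dim=1)

def contrastive_loss(vis_emb, tac_emb, temperature: float = 0.07):
    """InfoNCE-style loss: each camera frame should match the GelSight
    reading captured at the same moment, and not the others in the batch."""
    logits = vis_emb @ tac_emb.t() / temperature
    targets = torch.arange(len(vis_emb))
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Usage with random stand-in data (real training would use paired frames):
vision_encoder, touch_encoder = SmallEncoder(), SmallEncoder()
camera_batch = torch.randn(8, 3, 64, 64)    # camera frames
gelsight_batch = torch.randn(8, 3, 64, 64)  # GelSight tactile images
loss = contrastive_loss(vision_encoder(camera_batch), touch_encoder(gelsight_batch))
loss.backward()
```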

To teach the AI how to identify objects by touch, the team recorded 12,000 videos of 200 objects, including fabrics, tools and household items, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
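Breaking recordings into stills like this is a routine preprocessing step. As a rough sketch of what it might look like, the snippet below samples frames from each video with OpenCV; the folder layout and sampling rate are assumptions, since the article only says the videos were split into still images.

```python
# Rough sketch of splitting touch-interaction videos into still frames
# (file layout and sampling rate are assumptions). Requires opencv-python.
import cv2
from pathlib import Path

def extract_frames(video_path: Path, out_dir: Path, every_n: int = 5) -> int:
    """Save every n-th frame of a video as a PNG and return how many were saved."""
    out_dir.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(str(video_path))
    saved = index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        if index % every_n == 0:
            cv2.imwrite(str(out_dir / f"{video_path.stem}_{index:06d}.png"), frame)
            saved += 1
        index += 1
    cap.release()
    return saved

# Example: process all recordings under a hypothetical "videos/" folder.
for video in Path("videos").glob("*.mp4"):
    n = extract_frames(video, Path("frames") / video.stem)
    print(f"{video.name}: {n} frames saved")
```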

https://www.engadget.com/2019/06/17/robot-identify-sight-touch/