Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system that could equip robots with something we take for granted: the ability to link multiple senses together.
The new system created by CSAIL involves a predictive AI that can learn to see using its ‘sense’ of touch, and vice versa. That might sound confusing, but it’s really mimicking something people do every day: looking at a surface, object or material and anticipating how it will feel once touched – i.e. whether it’ll be soft, rough, squishy, etc.
The system can also take tactile, touch-based input and translate that into a prediction about what it looks like – kind of like those kids’ discovery museums where