This AI system can learn to see through touch and feel by seeing
New York: A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a predictive artificial intelligence (AI) system that can learn to see by touching and to feel by seeing.
While our sense of touch lets us feel the physical world, our eyes help us understand the full picture of those tactile signals.
Robots that have been programmed to see or to feel, however, cannot use these signals as interchangeably.
The new AI-based system can create realistic tactile signals from visual inputs, and can predict which object, and which part of it, is being touched directly from those tactile inputs.