
MIT Designs Predictive AI To Visualize And Feel Using Touch


MIT has designed a predictive AI that can visualize and feel using touch. While our sense of touch gives us the ability to feel the physical world, our eyes help us understand the full picture of those tactile signals. Robots that have been programmed to see or to feel cannot use these signals nearly as interchangeably.

In the future, this could lead to a more harmonious relationship between vision and touch in robotics, particularly for grasping, object recognition, and better scene understanding, and could help with seamless human-robot integration in assistive or manufacturing settings.

The team used a KUKA robot arm fitted with a special tactile sensor called GelSight, designed by another group at MIT, to record touch interactions on video. Breaking the resulting 12,000 video clips down into static frames, the team compiled ‘VisGel,’ a dataset of more than three million paired visual/tactile images.
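
To make that preprocessing step concrete, here is a minimal sketch of how paired video clips could be split into aligned still frames. The directory layout, file names, and frame-sampling stride are assumptions made for illustration; they are not details from the VisGel release.

```python
import cv2
from pathlib import Path

def extract_frames(video_path, out_dir, every_n=5):
    """Save every n-th frame of a video clip as a still image."""
    out_dir.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(str(video_path))
    idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            cv2.imwrite(str(out_dir / f"frame_{saved:06d}.png"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved

# Hypothetical layout: each recorded touch interaction has a synchronized
# camera clip and a GelSight clip. Extracting frames from both at the same
# stride yields the visual/tactile image pairs that make up the dataset.
for clip_dir in Path("recordings").iterdir():
    extract_frames(clip_dir / "camera.mp4", clip_dir / "vision_frames")
    extract_frames(clip_dir / "gelsight.mp4", clip_dir / "touch_frames")
```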

The team would like to improve on this by collecting data in more unstructured environments, or by using a new MIT-designed tactile glove, to increase the size and diversity of the dataset.

Yunzhu Li, a PhD student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the lead author on the work, said: “By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge. By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses (vision and touch) together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”
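
To illustrate the idea in Li’s description, the sketch below frames touch prediction as image-to-image translation: a small encoder-decoder maps a camera frame to a predicted tactile frame and is trained against the paired GelSight image. The architecture, tensor sizes, and L1 loss here are toy assumptions for a self-contained example, not the CSAIL team’s actual model.

```python
import torch
import torch.nn as nn

class CrossModalTranslator(nn.Module):
    """Toy encoder-decoder mapping an image from one modality (a camera
    frame) to an image in the other (a GelSight tactile frame)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# One training step on dummy data: predict the tactile frame from the
# visual frame and penalize the pixel-wise difference.
model = CrossModalTranslator()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
vision = torch.rand(8, 3, 64, 64)  # stand-in camera frames
touch = torch.rand(8, 3, 64, 64)   # stand-in paired GelSight frames

opt.zero_grad()
loss = nn.functional.l1_loss(model(vision), touch)
loss.backward()
opt.step()
```

Swapping the roles of the vision and touch tensors gives the reverse direction, predicting what the robot is looking at from tactile input alone, which is the other half of the behavior Li describes.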


About the author

Sean Mendis

Sean Mendis is a reporter for MR Time. After graduating from the University of Tennessee, Sean got an internship at a morning radio show and worked as a journalist and producer. He covers national and community events for MR Time. You can contact him at ricardo@marketresearchtime.com
