Evolution of Robots: GelSight Sensors Enable Human-Like Touch

By Rebecca Alexander

Popular science fiction books and movies have long toyed with the idea of human-like robots. Recent research at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has brought reality one step closer to that possibility by improving the sensitivity and dexterity of robotic grippers. The grippers are fitted with GelSight sensors, which furnish 3D maps of the surfaces they come in contact with. These developments are detailed in two scientific articles: one focuses on the GelSight sensor’s ability to determine surface hardness, while the other deals with the robot’s capacity to handle smaller objects than previously achieved.

What Is GelSight?

GelSight is essentially a transparent rubber block in the form of a ‘gel’. One face of the block is coated with metallic paint and conforms to the shape of any object pressed against it. The metallic paint makes this surface uniformly reflective, enabling computer vision algorithms to deduce the object’s geometry. On the opposite side of the block sit three colored lights at different angles and a camera, whose images allow the algorithms to interpret the 3D shape of the object.
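The recovery of shape from the three colored lights can be illustrated with classic photometric stereo, a simplified stand-in for GelSight’s actual reconstruction pipeline. The light directions and the Lambertian shading model below are illustrative assumptions, not details taken from the papers:

```python
import numpy as np

# Assumed directions of the three colored lights (unit vectors, illustrative).
L_DIRS = np.array([
    [ 0.000,  0.700, 0.714],
    [ 0.606, -0.350, 0.714],
    [-0.606, -0.350, 0.714],
])

def estimate_normal(intensities):
    """Recover a surface normal from three per-pixel brightness values.

    Under a Lambertian shading model, intensity = L_DIRS @ normal, so the
    normal follows from a least-squares solve. Integrating such normals
    across the whole image would then yield a 3D height map of the pressed
    object, which is the kind of output GelSight provides.
    """
    n, *_ = np.linalg.lstsq(L_DIRS, np.asarray(intensities, float), rcond=None)
    return n / np.linalg.norm(n)
```

Each pixel of the painted surface is solved independently, which is why a single camera and three fixed lights suffice.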

Putting GelSight to the Test

If a robot is to function autonomously, it must be able to grip and maneuver objects. In practice, this means the robot should evaluate an object’s hardness and shape in order to predict how it will behave when moved or set down. Unlike previous robots, the current ones estimate hardness by a mechanism similar to the one humans use, and can thus distinguish objects of similar shape.

Human judgement of an object’s hardness is based on how the contact area between the fingers and the object changes as it is pressed. Wenzhen Yuan, the first author of one of the articles, created 400 groups of silicone objects based on shape, with 16 objects of varying hardness per group. To describe the deformation of a given object, she recorded the changing contact pattern as five frames captured at regular intervals. The hardness of each object, measured on a standard industrial scale, served as input to neural networks along with the deformation patterns, establishing correlations between the observed changes in contact patterns and the measured hardness. As a result, the trained system can estimate the hardness of a given object very accurately whenever frames of its changing contact pattern are available.
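A heavily simplified sketch of that training setup follows, with ridge regression standing in for the neural network and synthetic data standing in for the recorded contact frames. Every array shape and label here is an illustrative assumption, not a detail from Yuan’s paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's data: each sample is a sequence of
# five contact-pattern "frames" (tiny 8x8 grids here) recorded as the gel
# presses into an object; the label is the object's measured hardness.
n_samples, n_frames, h, w = 200, 5, 8, 8
true_weights = rng.normal(size=n_frames * h * w)

X = rng.normal(size=(n_samples, n_frames, h, w))
# Synthetic hardness labels: a noisy linear function of the frames.
y = X.reshape(n_samples, -1) @ true_weights + 0.1 * rng.normal(size=n_samples)

# Ridge regression as a minimal placeholder for the neural network that
# maps deformation sequences to a hardness value.
A = X.reshape(n_samples, -1)
lam = 1.0
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

def predict_hardness(frames):
    """Estimate hardness from a (n_frames, h, w) contact-pattern sequence."""
    return float(np.asarray(frames).reshape(-1) @ W)
```

The structure mirrors the article’s description: sequences of contact frames go in, measured hardness comes out, and the model learns the correlation between the two.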

Greg Izatt, the first author of the second article, devised control algorithms that steer the robot’s gripper towards an object using a computer vision system, then hand the location estimate over to a GelSight sensor once the object has been grasped. A common problem in combining data from different sources (a computer vision system and a separate sensor) is compatibility. Because GelSight is itself camera-based, its data integrates with visual data from other sources without such difficulty, as Izatt’s experiments demonstrate. In them, a GelSight-equipped gripper successfully picked a small screwdriver from its holster, moved it, and placed it in another holster. Although the GelSight data describes only a small part of the screwdriver, combining it with data from the vision system lets Izatt’s algorithms determine which part of the screwdriver is being held and where it sits in the robotic hand. As long as the screwdriver’s initial position was estimated to within a few centimeters, the experiments succeeded in handling it.
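The vision-to-touch handoff can be sketched as combining two estimates of the same pose: a coarse one from the camera before the grasp and a precise one from the GelSight contact patch afterwards. The snippet below models both as one-dimensional Gaussian estimates fused by inverse-variance weighting (a 1D Kalman-style update); the numbers are illustrative, not from Izatt’s paper:

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity."""
    w = var_b / (var_a + var_b)            # weight on estimate a
    mean = w * mean_a + (1 - w) * mean_b
    var = (var_a * var_b) / (var_a + var_b)
    return mean, var

# Coarse vision estimate of the screwdriver's position along one axis (cm),
# good to within a few centimeters, as in the experiments described above.
vision_pos, vision_var = 12.0, 2.0 ** 2
# After the grasp, the GelSight reading localizes the held part much more
# precisely (hypothetical values).
touch_pos, touch_var = 12.8, 0.2 ** 2

fused_pos, fused_var = fuse(vision_pos, vision_var, touch_pos, touch_var)
```

Because the touch estimate carries far less uncertainty, the fused result sits close to it, which is exactly why handing localization over to GelSight after the grasp pays off.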

As an informal test, several people were asked to rank fruits and vegetables by hardness. Interestingly, the robot equipped with GelSight produced the same ranking, confirming the qualitative accuracy of its results.

What’s Next?

Sergey Levine, an assistant professor of electrical engineering and computer science at the University of California at Berkeley, believes that high-bandwidth tactile sensors and the GelSight technology will significantly improve on current robots by equipping them with a sense of touch. This will enable human-like manual dexterity in handling small objects.

The underlying platform was originally developed as part of the Defense Advanced Research Projects Agency’s Robotics Challenge (DRC), which aimed to develop control systems for a humanoid robot performing a series of tasks in a hypothetical emergency situation. Touch-sensitive robots of this kind would be an asset in any situation where direct human intervention is impossible or risky. Given this evolutionary advantage, the possible applications are limited only by one’s imagination. It may be a while before robots pick up all the desirable attributes of human beings, but this research has certainly brought them one level up!

Image courtesy of MIT.
