
You think you've heard it all about automation in technology? Get a load of this robot that plugs in cables

You wanted an android butler? Well, first it's gotta learn to hold a wire, then we can talk about folding sheets

Video An MIT team has built a robot that can plug cables into jacks.

That may sound effortless, yet it’s a tall order for machines. Humans typically have nimble fingers, a keen sense of touch, good eyesight, and years of experience untangling USB and headphone cables and getting plugs into sockets the right way up. A robot similarly needs sensors and algorithms to work out where to hold a wire, how hard to grip it, and how to insert it into the required socket, and that ability doesn't come for free.

The two-fingered plug-wrangling robotic gripper developed at MIT, and presented at the virtual Robotics: Science and Systems conference this week, uses soft rubber GelSight fingertips that have embedded camera sensors.

The camera data is fed into a trained system that figures out how exactly the hand should hold the wire given its position. This information allows the hand to make real-time adjustments as it follows its fingers along the cable until it can place the plug at the end into a jack.
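The feedback loop described above can be sketched in a few lines. To be clear, this is an illustrative toy, not the MIT team's code: the function names, the decoded "tactile image," and the gains are all assumptions standing in for their trained model and controller.

```python
# Hedged sketch of a tactile-feedback cable-following tick. In the real
# system a neural net reads a GelSight image; here we pretend the image
# has already been decoded into an angle and an offset.

def estimate_cable_pose(tactile_image):
    """Toy stand-in for the trained model: returns the cable's angle
    (radians) and lateral offset (mm) between the fingertips."""
    angle, offset = tactile_image
    return angle, offset

def follow_cable_step(tactile_image, gain_angle=0.5, gain_offset=0.1):
    """One control tick: compute wrist corrections that keep the cable
    centred and aligned in the gripper while sliding along it."""
    angle, offset = estimate_cable_pose(tactile_image)
    rotate_by = -gain_angle * angle    # re-align gripper with the cable
    shift_by = -gain_offset * offset   # re-centre cable between fingers
    return rotate_by, shift_by

# Example tick: cable tilted 0.2 rad and 4 mm off-centre.
rotate, shift = follow_cable_step((0.2, 4.0))
```

Running this loop fast enough is what lets the hand correct its hold in real time as the cable slides through its fingers.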

Crucially, it is aware of not only how it should move to keep hold of the cable, but also the amount of frictional force required: too much, and it might stretch or break the cable; too little, and it might drop the thing. In the study's abstract, the team said it had demonstrated...

...a robot grasping a headphone cable, sliding the fingers to the jack connector, and inserting it. To the best of our knowledge, this is the first implementation of real-time cable following without the aid of mechanical fixtures.
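That "not too tight, not too loose" trade-off is essentially a bang-bang grip regulator. Here's a minimal sketch of the idea; the force bounds, step size, and slip signal are made-up illustrations, not values from the paper.

```python
# Hedged sketch of grip-force regulation: squeeze harder when the cable
# slips, relax when the hold is secure, and stay inside a safe band so
# the cable is neither dropped nor stretched.

def adjust_grip(force, slip_detected, lo=0.5, hi=2.0, step=0.1):
    """Return a new target grip force (illustrative units).

    Too much force risks damaging the cable; too little lets it slip,
    so nudge the force toward a workable band each tick."""
    if slip_detected and force < hi:
        force += step      # cable slipping: firm up slightly
    elif not slip_detected and force > lo:
        force -= step      # holding fine: relax to spare the cable
    return round(force, 3)

# Example: the cable starts to slip at 1.0, so the gripper firms up.
new_force = adjust_grip(1.0, slip_detected=True)
```

A real controller would be driven by the same GelSight images used for pose estimation, since shear patterns in the gel reveal incipient slip.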

Here’s a video of the robot fitting a headphone plug into an audio jack:

YouTube Video

“Manipulating soft objects is so common in our daily lives, like cable manipulation, cloth folding, and string knotting,” said Yu She, co-author of the study and a postdoc at MIT. “In many cases, we would like to have robots help humans do this kind of work, especially when the tasks are repetitive, dull, or unsafe."

The algorithms could be tweaked and used by other machines to grasp materials of various sizes and stiffness, automating chores as well as carrying out manufacturing and storage tasks. A more complex vision system is needed to perform more difficult tasks, such as folding clothes or picking fruit, though, we're told.

“In the future, we will explore more complex deformable objects like fabrics, clothes, fruits, which can result in extensive applications in elderly care, hospital nursing, housework performing, fruit harvesting, etc,” She told The Register.

“We will need to integrate our current tactile perception skills with existing vision perception techniques together to achieve this goal." ®
