This Jenga-playing robot could build your next phone
Jenga is a deceptively tricky game. Players have to slowly remove individual pieces from a tall block tower so the stack keeps its structural integrity, then place them on top gently enough that the tower doesn't topple over.
While it's an entertaining (and sometimes stressful) game for humans, it's fiendishly hard for robots. After all, every block looks the same, but some are loose and some are crucial to keeping the structure intact, making it difficult for a computer system equipped only with a camera to figure out which blocks to remove. Now researchers at MIT have built a robot that uses a force sensor to feel out each block and decide whether to leave it or remove it. The robot can typically play around 15 moves before the tower falls. That's better than my track record.
Nima Fazeli, a PhD student at MIT's MCube lab, which focuses on robots' physical interactions with the world, chose Jenga as the ideal challenge for a robot learning how to use touch to make decisions. First, Fazeli designed the robot, which consists primarily of a motorized arm with finger-like grippers, a camera, and a force sensor at its wrist, so it can assess how much resistance it meets when it pushes on something. The second challenge was getting the computer to interpret what that resistance means and use it to decide whether to leave a block alone or pull it out.
However, the machine learning models that often power robot decision-making wouldn't work for Jenga, because these algorithms typically need to be trained on many thousands of data points before the bot learns what works and what doesn't. That would be prohibitively time-consuming: to build a training data set, researchers would have to rebuild the tower thousands of times as the robot inevitably made mistakes, a process that isn't easily automated. Research assistants would have to do it by hand.
Instead, Fazeli and his team designed an algorithm that requires fewer data points. The robot does a shorter period of exploratory testing, similar to how a human might play Jenga: by lightly touching different blocks to see what happens. "The robot will pick blocks at random and poke them at random and see what happens," Fazeli says. "It's effectively just collecting data: If I apply this much force, this is how the tower changes."
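As a rough illustration, that exploratory loop might look something like the Python sketch below. The tower object, its poke() call, and the quantities it records are hypothetical stand-ins for the lab's actual hardware interface, which the article does not detail.

```python
import random

# Hedged sketch of the exploratory phase described above. The `tower`
# object and its poke() method are illustrative assumptions, not the
# MCube lab's real API.

def explore(tower, n_trials=300):
    """Poke random blocks with random forces and log what happens."""
    samples = []
    for _ in range(n_trials):
        block = random.randrange(len(tower.blocks))  # pick a block at random
        force = random.uniform(0.0, 5.0)             # push with a random force (newtons)
        result = tower.poke(block, force)            # apply the push, observe the tower
        samples.append({
            "force": force,
            "resistance": result.resistance,      # wrist force-sensor reading
            "displacement": result.displacement,  # how far the block moved
            "tower_moved": result.tower_moved,    # did the stack shift?
        })
    return samples
```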
Unlike people, who have an intuitive sense of physics and can reason that pushing a stuck block will likely topple the tower, the robot needs about 300 tests to learn how much force on which kinds of blocks will cause the tower to fall. Those 300 data points feed a model of which blocks are safe to grab, which the robot can then use to predict how blocks it hasn't touched will behave. Then it is ready to play.
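One way to picture that step, again purely as a sketch: the ~300 logged pokes could train a classifier that scores how safe a pull is. Logistic regression here is a generic stand-in for the team's actual learned model, and the feature set and 90% confidence threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged sketch: fit a safety predictor on the exploratory samples.
# Assumes the samples contain both outcomes (tower moved / stayed put).

def fit_safety_model(samples):
    X = np.array([[s["force"], s["resistance"], s["displacement"]]
                  for s in samples])
    y = np.array([0 if s["tower_moved"] else 1 for s in samples])  # 1 = safe
    return LogisticRegression().fit(X, y)

def should_pull(model, force, resistance, displacement, threshold=0.9):
    """Only pull a block the model is at least 90% sure is safe to move."""
    p_safe = model.predict_proba([[force, resistance, displacement]])[0, 1]
    return p_safe >= threshold
```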
The implications of the work, which was published in the journal Science Robotics, go far beyond Jenga. Fazeli says the most immediate applications for the algorithm are on factory floors, where robots already assemble parts. A robot's ability to feel whether something has been screwed in properly, even if it looks fine visually, could help with building delicate, precise products like consumer electronics.
“In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,” Alberto Rodriguez, an assistant mechanical engineering professor at MIT who supervised the work, told MIT News. “Learning models for those actions is prime real-estate for this kind of technology.”
Unfortunately, this bot isn't the AlphaGo of Jenga: it isn't quite good enough to beat skilled human Jenga players, who can perform tricks and strategies that are beyond the machine's understanding.
However, humans can still play against the MIT bot. “It’s enjoyable if you’re a little patient because it’s slow,” Fazeli says, since one move can take several minutes for the robot to execute. But at least you could say you played Jenga with a robot.