How a team of Australian engineers and their robot, named Cartman, won the Amazon Robotics Challenge
Amazon has a problem, and that problem is humans. Amazon needs humans, lots of them. But humans, as we all know, are the most unreasonable part of any business, constantly demanding things like lights and air. So Amazon has turned to robots (over 100,000 of them) to do tasks like moving things around in a warehouse. But it's proving to be much more difficult to get robots to do some other tasks. One of the hardest is picking objects from shelves and bins.
To solve this problem, Amazon is making it someone else's problem, by hosting a yearly robotics "picking" challenge. In the competition, teams have to develop robotics hardware and software that can recognize objects, grasp them, and move them from place to place. This is harder than it sounds: we're on year three and Amazon is still running this thing, though some clever Australians are making substantial progress.
The 2017 incarnation of the Amazon Robotics Challenge was held at RoboCup in Nagoya last month, and sixteen teams from around the world made the trip to Japan. What Amazon was looking for was a robot that could identify items; remove target items from storage and place them into boxes (picking); take target items from totes and place them into storage (stowing); and then do both at once in a grand, all-or-nothing final competition.
Teams brought their own robots with their own nutty gripper designs, and also their own item storage systems designed to handle all of the stuff and junk that crazy people like you buy on Amazon every day. Points were awarded for successful picks, successful stows, neat packing, and overall quickness, while points were deducted for (among other things) "major damage" to items, which is unfortunate, since a robot that could just flatten everything into a pancake would have a much easier time at this!
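To make the tradeoffs concrete, here's a minimal sketch of how a scoring rubric like the one described above might be computed. All of the point values, parameter names, and the time-bonus rule are made up for illustration; they are not Amazon's actual rubric.

```python
# Hypothetical scoring sketch for a pick-and-stow run.
# Every number here is an assumption, not the real competition rubric.

def score_run(picks, stows, neat_packing, seconds_used, damaged_items,
              pick_pts=10, stow_pts=10, packing_bonus=5,
              time_cutoff=900, damage_penalty=20):
    """Return a total score for one competition run."""
    score = picks * pick_pts + stows * stow_pts
    if neat_packing:
        score += packing_bonus
    if seconds_used < time_cutoff:
        # reward quickness: a small bonus per minute under the cutoff
        score += (time_cutoff - seconds_used) // 60
    # "major damage" deductions can easily wipe out picking gains,
    # which is why pancake-flattening is a losing strategy
    score -= damaged_items * damage_penalty
    return score
```

With numbers like these, a run with 5 picks, 3 stows, neat packing, 600 seconds, and one damaged item nets 70 points, so a single damaged item costs as much as two successful picks.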
Here’s an overview of how things went:
Team ACRV (from the Australian Centre for Robotic Vision at Queensland University of Technology), which didn't place in the top three on either the individual pick task or stow task, managed to knock it out of the park on the combined final task, taking first place and going home with US $80,000 (which is way more in Australia).
Third place went to Singapore’s Nanyang Technological University, which managed a first in the picking task and a second in the stowing task. And second place went to NimbRo, which posted this video of their final run:
A few things to note from these videos: It looks like most teams used some flavor of hybrid gripper design, relying primarily on suction and using a physical gripping mechanism when necessary. There are also plenty of instances when the first grasping attempt fails, and the robot needs to be able to detect and adapt to that, just like a human does. Additionally, the robots sometimes grasped multiple things at once by accident, or had to deal with objects (like books) that can change their shape post-grasp as they were lifted. These sorts of things are why challenges like these are important: Given the number of objects that Amazon is foisting on us, it’s hard to predict how any system will perform without trying it out in real life, or as close to real life as challenges like these allow.
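The detect-and-retry behaviour described above can be sketched as a simple control loop. Everything in this snippet (the `Item` class, the simulated success check, the suction-versus-pinch choice) is a made-up stand-in for illustration, not any team's real control stack.

```python
import random

# Self-contained simulation of a hybrid-gripper retry loop.
# All names and probabilities here are hypothetical.

class Item:
    def __init__(self, name, suction_friendly=True):
        self.name = name
        self.suction_friendly = suction_friendly

def attempt_grasp(item, rng):
    """Simulate one grasp attempt: suction by default,
    a mechanical pinch grip for suction-unfriendly items."""
    mode = "suction" if item.suction_friendly else "pinch"
    succeeded = rng.random() < 0.7  # pretend 70% of attempts hold
    return mode, succeeded

def pick_item(item, rng, max_attempts=3):
    """Retry on failure, the way the competition robots re-plan
    after a slipped or missed grasp."""
    for attempt in range(1, max_attempts + 1):
        mode, ok = attempt_grasp(item, rng)
        if ok:  # e.g. vacuum pressure or finger force confirms a hold
            return mode, attempt
        # on a real robot: re-perceive the scene and pick a new grasp pose
    return None, max_attempts  # give up and move on to the next target

rng = random.Random(42)
print(pick_item(Item("book", suction_friendly=False), rng))
```

The key design point is that failure is expected and cheap to recover from: rather than trying to guarantee a perfect first grasp, the loop verifies the hold and re-plans, which is roughly what the videos show the robots doing.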
While QUT's press release suggests that "the team has solved a key robotics problem for Amazon," namely picking items and stowing them in boxes in an unstructured environment, that strikes us as awfully optimistic. It's certainly a key robotics problem, but solving it implies a reliable robotic solution that can compete (at least to some extent) with a human picker, and based on these videos, we seem to be pretty far from that. Also worth noting is that QUT's winning robot is a stationary gantry system, suggesting that Amazon could perhaps be open to a picking solution that doesn't move, rather than a mobile manipulator.
On the other hand, maybe we shouldn't draw too many conclusions from the specific designs, and should just be happy that we're seeing tangible advancements in object recognition, grasp planning, and everything else under conditions that are somewhat close to real-world usefulness. And as soon as Amazon buys up the winning teams from one of these challenges and then cancels the following year's event, we might actually be able to figure out what its robotics fulfillment plan is.
Source: IEEE Spectrum