It’s human vs. machine in this racing drone test
As drones and their components get smaller, more efficient, and more capable, we’ve seen increasing research into getting these aircraft to fly themselves through semi-structured environments without relying on external localization. The University of Pennsylvania has done some amazing work in this area, as has DARPA’s Fast Lightweight Autonomy program.
At NASA’s Jet Propulsion Laboratory, they’ve been working on small drone autonomy for the past few years as part of a Google-funded project. The focus is on high-speed dynamic maneuvering, in the context of flying a drone as fast as possible around an indoor race course using only on-board hardware. For the project’s final demo, JPL raced their autonomous drones through an obstacle course against a professional human racing drone pilot.
The AI-powered drone is fully autonomous, meaning that there’s no external localization or off-board computer control. A Qualcomm Snapdragon Flight board is used for real-time flight control. The drone constructs its own 3D map of the course using its two wide field-of-view cameras, one pointing forward and the other downward, which together give a field of view of more than 250 degrees with the horizon always in frame. The two cameras generate a depth map from motion stereo, and in flight, the cameras plus an IMU localize to the map and perform visual-inertial odometry for motion tracking.
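The article doesn’t publish JPL’s algorithms, but the geometry behind motion-stereo depth is the standard triangulation relation: depth equals focal length times baseline divided by disparity, where for motion stereo the “baseline” is the camera’s own displacement between two frames (estimated from the IMU/odometry). A minimal illustrative sketch, with made-up numbers that are not JPL’s:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate the depth of a feature seen from two camera positions.

    For motion stereo, baseline_m is the camera's own displacement
    between the two frames; disparity_px is how far the feature's
    image shifted between them.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 20 px between frames taken 0.1 m apart,
# imaged with a 400 px focal length, lies 2 m away.
print(stereo_depth(400.0, 0.1, 20.0))  # → 2.0
```

The same relation explains why wide field-of-view lenses help: short focal lengths keep disparities small, so fast motion produces less image displacement per frame.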
While the drones are capable of straight-line speeds of over 120 km/h, JPL’s warehouse isn’t quite large enough for them to go flat out, sadly. The constrained track proved especially tricky for the professional human drone racing pilot, Ken Loo, who found its density mentally fatiguing. Once Loo learned the course, though, he could complete it in an average of just over 11 seconds, while the autonomous drone took an average of 3 seconds longer. The time difference mostly came from aggression—while the autonomous drone was smoother and more consistent, flying nearly the same time every lap, Loo accelerated and decelerated more quickly and could dynamically improvise maneuvers and shortcuts that the autonomous system couldn’t.
The project’s manager at JPL is Rob Reid, who helped develop that nifty robotic space hedgehog back in 2015. We spoke with Reid to find out why the heck they let a human win this race, and how they’re going to stop that from ever happening again.
IEEE Spectrum: Can you describe the drone autonomy research that JPL has been involved in that led to this demonstration?
Rob Reid: JPL has been researching camera-based navigation techniques for spacecraft and micro aerial vehicles (drones) for decades. Since 2013, it has collaborated with Google on Project Tango, and over the last two years, it has integrated Tango into a drone to demonstrate novel navigation algorithms. The team has explored various trajectory optimization techniques that account for effects such as aerodynamics and camera motion blur.
Why was a drone race an ideal way for you to demonstrate progress in this area?
The goal was to demonstrate high-performance autonomous flight among obstacles—an indoor drone race provides a complex track full of obstacles, along with a compelling reason to fly fast through them!
Were you expecting that the human pilot would win?
I wasn’t surprised by the outcome; we were confident that our drone system was going to be competitive; however, we weren’t sure who was going to learn an optimal trajectory (i.e. racing line) the fastest! With only one afternoon of flying, Ken was able to shave seconds off his lap time much faster than our algorithms could. In the weeks since, we have sped up our optimization approach considerably.
What are the limitations of the hardware that the drones are using to navigate, and how did that affect their performance in the race?
The biggest performance limitation for fast indoor flight comes from the shutter speed of the onboard cameras that are used to track the drone’s motion—flying too fast while too close to the ground, or rolling or pitching too quickly, can cause the image to blur and the drone to become lost. We addressed this in two ways: First, we used two wide field-of-view cameras—by pointing one forwards and the other downwards, the >250-degree field of view allows the drone to always see the horizon. Second, we adjusted trajectories to cap rotation rates and the speed-to-height ratio.
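The speed-to-height cap Reid describes can be illustrated with a tiny sketch: ground features sweep faster across a downward-facing camera the lower and faster the drone flies, so the planner bounds commanded speed by both an absolute limit and a multiple of the current height. The limit values below are illustrative assumptions, not JPL’s numbers:

```python
def limit_speed(commanded_speed: float, height_m: float,
                max_speed: float = 33.0,          # ~120 km/h absolute cap (assumed)
                max_speed_to_height: float = 4.0  # illustrative ratio, not JPL's
                ) -> float:
    """Cap a commanded speed (m/s) to avoid motion blur.

    The ratio term models the blur constraint: optical flow over a
    downward camera scales with speed divided by height, so flying
    low forces the drone to slow down.
    """
    return min(commanded_speed, max_speed, max_speed_to_height * height_m)

print(limit_speed(30.0, 2.0))   # ratio cap binds at 2 m altitude: 8.0
print(limit_speed(5.0, 10.0))   # commanded speed already safe: 5.0
```

A rotation-rate cap would be enforced analogously, clamping commanded angular velocity so that features don’t streak across the image during a single exposure.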
What will it take before drones like these are competitive with human expert pilots in structured environments?
For a typical drone race, the hardware is ready to beat human experts: Our drones are “race spec” and can pull a few g’s. We couldn’t yet fly a nighttime race, though, or one on a track with lots of visual repetition.
Are you continuing this project? If so, what can we look forward to?
The work is ongoing; unfortunately, I can’t say much about what’s next! But you can look forward to drones with the ability to sense obstacles and update their own trajectories online.
This area of robotics is progressing rapidly; things like event-based cameras could go some way toward solving the motion-blur problem and enable even more dynamic autonomous maneuvers. And Reid is right that drone hardware is poised to surpass human performance, although that’s the case with robotics in general—we’re at the point where, with a few exceptions, robotics is much more of a software challenge than a hardware challenge. That doesn’t necessarily make it any easier to solve, though, and we’re excited to see how JPL’s drones evolve.
[ JPL ]
Source: IEEE Spectrum