As an early-stage investor, I am inundated daily with pitches that overstate traction by touting long lists of pilot engagements. Astute managers know all too well that turning proofs of concept into recurring revenue is a long, arduous road requiring significant capital and time.
To guide founders through this process, I coach them on aligning their pilots with the revenue milestones of their prospects. While Amazon, Walmart, and other large retailers can budget large outlays to automate their warehouses, most of America's operators find robotics unapproachable, as the unit cost of automation still exceeds the annual salary of their workers.
As a mentor of the Deep Tech track at New York University's Endless Frontier Labs, I have recently met with two startups focused on developing technologies to bring down the unit economics of automation – Wheel.me and DreamVu. I've detailed Wheel.me's motorized caster, which potentially transforms any wooden pallet into an unmanned system. Last week, I interviewed Rajat Aggarwal, chief executive of DreamVu, who developed a 360-degree sensor called "PAL."
Aggarwal describes how his invention is changing the financial model: "Currently, sensing is a bottleneck for scaling. We felt a strong need for such a vision sensor, which can eliminate the requirement of multiple sensors. Autonomous navigation in highly cluttered and dynamic environments, teleoperation, and situational awareness are key applications where PAL is best suited."
Unlike autonomous vehicles that use GPS, indoor robots rely solely on onboard sensors to navigate their environments. To date, unmanned rovers moving cartons around warehouses have been weighed down with expensive sensors needed to build a spatial understanding of the facility and avoid collisions with humans and other equipment.
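To make the idea concrete, here is a toy sketch of how an indoor robot might turn 360-degree range readings into an occupancy grid and check a planned path for collisions. This is a generic illustration of the principle, not DreamVu's implementation; all function names, parameters, and values are hypothetical.

```python
import math

def update_occupancy_grid(grid, robot_x, robot_y, readings, cell_size=0.1):
    """Mark grid cells as occupied based on range-sensor readings.

    readings: (angle_rad, distance_m) pairs from a 360-degree scan.
    grid: dict mapping (col, row) -> True for occupied cells.
    """
    for angle, distance in readings:
        # Project each sensor hit into world coordinates.
        hit_x = robot_x + distance * math.cos(angle)
        hit_y = robot_y + distance * math.sin(angle)
        # Quantize the hit point to a grid cell.
        cell = (round(hit_x / cell_size), round(hit_y / cell_size))
        grid[cell] = True
    return grid

def path_is_clear(grid, cells):
    """A planned path is safe only if none of its cells are occupied."""
    return not any(grid.get(cell, False) for cell in cells)

# One obstacle detected 1 meter directly ahead of a robot at the origin.
grid = update_occupancy_grid({}, 0.0, 0.0, [(0.0, 1.0)])
print(path_is_clear(grid, [(5, 0), (10, 0)]))  # → False: the cell at ~1 m is blocked
```

Real systems layer far more on top of this (probabilistic occupancy, sensor fusion, ray-casting to mark free space), but the cost pressure Aggarwal describes comes from the sensors that feed such maps, not the mapping math itself.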
In comparing his solution to other offerings, Aggarwal boasts, “PAL introduces a multifold advantage in terms of cost, compute, and power requirements over other competing sensing technologies.”
While many existing robot installations rely heavily on LiDAR-enabled navigation, with sensors costing thousands of dollars apiece, he believes these devices are "overkill, given their high price points, longer ranges, and overwhelming precision."
To roll out automation on a mass scale, integrators need to “focus on manufacturability, scalability, and ability to operate in the world designed for humans.” The technologist further claims, “To enable complete autonomy, we need situational awareness, which is not just complete but also very fast. And we have to scale this sensing for the millions of these robots; the sensing cost has to go down. DreamVu’s optics technology simplifies the way we are used to capturing different types of data (2D images, 3D images, 3D point clouds) and then fusing this data to a complex and heavy AI engine.”
Today’s logistics market is a forerunner for other segments of autonomy. Aggarwal shared his long-term outlook by saying, “79 million (people) will have a robot inside their homes by 2024. These robots will need to operate autonomously in homes, which are cluttered and dynamic environments. 360 degree depth-sensing would be very critical. There is no other solution in the market that will be able to provide the required output at the given price point (<$50) and hardware specifications.”
He continued, “In the next 5 years, we see ourselves entering the drone, smart home, smartphone, and endoscopy market, providing affordable 360 degree sensing in all cases. We have an upcoming product with ultra high resolution and higher depth ranges, which can work in both indoor and outdoor environments. We are already doing early pilots in the smart city (retail, building) and smart surveillance use-cases.”
Rather than pitching investors with a collage of corporate logos, Aggarwal is following the path of Amazon by empowering the developer community with powerful tools. “Our customers are thrilled to see a new approach to surround sensing that they can try out with our evaluation kits, which are available to order. The seamless hardware and software integration allows them to test the sensor very quickly and reduces product development time for them.”
The entrepreneur optimistically predicts, “If the sensing is done right, the AI will be more accurate, and hence a robot would be more efficient and robust.”
This holiday season, DreamVu, along with a fleet of other technologies, will debut in warehouses, enabling pickers to pack orders at record-breaking speeds (and profits).
The post DreamVu PAL sensor captures real-time 360-degree, 3D video appeared first on The Robot Report.