In the latest release of its SENSR software, Seoul Robotics introduces its most advanced 3D perception features yet. SENSR 2.2 can detect objects that are partially obstructed, fast-moving, or clustered together, in addition to classifying bicycles, vehicles, and pedestrians.
The Seoul Robotics software product line consists of two major solutions: SENSR M is designed for perception on moving platforms such as autonomous vehicles and autonomous mobile robots (AMRs), while SENSR 2.2 is designed for fixed sensing applications such as cities, public spaces, logistics, manufacturing, and retail spaces.
Deep learning (DL) is at the heart of the SENSR software design. By leveraging DL, Seoul Robotics is improving perception accuracy. Unlike other 3D computer vision software that relies on machine learning and rule-based systems, Seoul Robotics now uses DL to track more than 500 objects simultaneously, with an accuracy of within 10 centimeters.
SENSR 2.2 also includes weather-filtering AI, allowing the software to detect and track objects even in severe weather conditions, including heavy rain and snow. Because of its versatility and accuracy, SENSR 2.2 is currently deployed by Seoul Robotics across the United States as well as in Japan, Korea, and numerous other countries.
“The introduction of deep learning into 3D perception software may be one of the last show-stopping enhancements in the LiDAR industry. Historically, the focus has been on advancing the LiDAR sensors themselves, but that’s changing. Moving forward, there will be heavy investments in 3D perception software that interprets the data into actionable solutions,” said HanBin Lee, CEO of Seoul Robotics. “The introduction of SENSR 2.2 is accelerating the adoption of solutions that will fuel autonomy across the globe.”
SENSR 2.2 is sensor-agnostic and compatible with more than 75 different types of 3D sensors on the market today, including LiDAR, 3D cameras, and imaging radar. It brings heightened accuracy to a range of solutions, such as smart intersections, wrong-way detection, speed detection, smart railroad crossings, crowd management, and smart retail. Seoul Robotics is rapidly expanding globally and has current partnerships with several top-tier organizations, including BMW, Mercedes-Benz, the Chattanooga Department of Transportation, Emart, and many others.
“Since we deployed Seoul Robotics’ technology into our smart city solutions, we have seen an increase in our operational efficiencies and improvements in the overall safety of our community,” said Kevin Comstock, smart city director for the City of Chattanooga. “Seoul Robotics has specifically helped the City of Chattanooga seamlessly monitor pedestrian traffic, and we are currently gathering data that will inform future capabilities of wrong-way detection. These efforts are saving money for the city, travel time for local residents, and, most importantly, lives.”
The post Seoul Robotics Launches First 3D Perception Software with Deep Learning appeared first on The Robot Report.