Design Exploration and Optimization News
December 12, 2014
Velodyne LiDAR has helped build RoboSimian, a headless, ape-like robot now in training for the 2015 DARPA Robotics Challenge. The company provided its HDL-32E LiDAR sensor, which is mounted atop the unit.
The robot was created by NASA’s Jet Propulsion Laboratory (JPL) for the competition, which consists of several disaster-related tasks. The sensor rotates a full 360º up to 20 times per second and covers a vertical field of view from 10º above horizontal to 30º below.
RoboSimian moves on four limbs, making it ideal for travel over complex terrain. JPL researchers are currently working on improving the robot’s speed.
“The NASA/JPL robot was developed expressly to go where humans could not, so the element of sight – in this case, LiDAR-generated vision – is absolutely critical,” said Wolfgang Juchmann, director of sales & marketing, Velodyne. “We’re recognized worldwide for developing real-time LiDAR sensors for all kinds of autonomous applications, including 3D mapping and surveillance as well as robotics. With a continuous 360-degree sweep of its environment, our lightweight sensors capture data at a rate of almost a million points per second, within a range of 100 meters – ideal for taking on obstacle courses, wherever they may be.”
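The figures quoted above (roughly a million points per second, up to 20 rotations per second) imply a certain scan density per sweep. The following is an illustrative back-of-envelope sketch only, using the article's round numbers; the 32-laser channel count is an assumption inferred from the HDL-32E model name, not stated in the article.

```python
# Rough scan-density estimate from the figures quoted in the article.
# These are the article's round numbers, not official datasheet values.

POINTS_PER_SECOND = 1_000_000   # "almost a million points per second"
ROTATIONS_PER_SECOND = 20       # "up to 20 times per second"
LASER_CHANNELS = 32             # assumed from the HDL-32E model name

points_per_rotation = POINTS_PER_SECOND / ROTATIONS_PER_SECOND
points_per_channel_per_rotation = points_per_rotation / LASER_CHANNELS
horizontal_step_deg = 360 / points_per_channel_per_rotation

print(points_per_rotation)                # 50000.0 points per full sweep
print(round(horizontal_step_deg, 2))      # ~0.23 degrees between firings
```

At these rates, each full 360º sweep would capture on the order of 50,000 points, dense enough to resolve nearby obstacles on a course.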
For more information, visit Velodyne LiDAR.
Sources: Press materials received from the company and additional information gleaned from the company’s website.
About the Author
DE’s editors contribute news and new product announcements to Digital Engineering.
Press releases may be sent to them via DE-Editors@digitaleng.news.