AEye, a San Francisco Bay Area-based company that develops the hardware, software, and algorithms that serve as the “eyes and visual cortex” of autonomous vehicles, has announced a new data type designed to help self-driving cars see better while consuming less energy. The data type combines pixels from digital 2D cameras with voxels from 3D LiDAR (Light Detection and Ranging) sensors. By joining these into a unified high-resolution sensor data type called Dynamic Vixels, AEye has created a format that lets autonomous cars evaluate a scene with 2D visual algorithms while also drawing on 3D, and even 4D, information about an object’s location, intensity, and velocity. AEye reports that the format yields a visual system that is faster and more accurate while using eight to ten times less energy.
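AEye has not published the internal layout of a Dynamic Vixel, but the description above suggests a record that pairs each camera pixel with its registered LiDAR return. The Python sketch below is purely illustrative: the field names and the fuse() helper are assumptions for clarity, not AEye’s actual format or API.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DynamicVixel:
    """One fused sample: camera pixel + LiDAR voxel + motion.

    All field names are hypothetical; AEye has not published the format.
    """
    rgb: Tuple[int, int, int]             # 2D camera pixel color
    position: Tuple[float, float, float]  # 3D voxel location (x, y, z), meters
    intensity: float                      # LiDAR return intensity, 0.0-1.0
    velocity: Tuple[float, float, float]  # per-point velocity estimate (the "4D" part)

def fuse(pixel_rgb, voxel_xyz, intensity, velocity):
    # Sensor-level fusion: pair the pixel with its registered voxel at
    # capture time, rather than aligning the two streams in post-processing.
    return DynamicVixel(rgb=pixel_rgb, position=voxel_xyz,
                        intensity=intensity, velocity=velocity)

# Example: a point 42 m ahead, moving toward the vehicle at 1.5 m/s.
sample = fuse((128, 64, 200), (0.0, 1.2, 42.0), 0.10, (0.0, 0.0, -1.5))
print(sample)
```

The point of fusing at the sensor, as Dussan notes below, is that downstream perception code receives one coherent record per point instead of having to cross-register separate camera and LiDAR streams.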
AEye believes that integrating these data types is essential for improving the capabilities of autonomous vehicles while reducing energy costs through more efficient computing. “There is an ongoing argument about whether camera-based vision systems or LiDAR-based sensor systems are better,” said AEye Founder and CEO Luis Dussan. “Our answer is that both are required – they complement each other and provide a more complete sensor array for artificial perception systems. We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing. The difference is an order of magnitude better performance.”
AEye’s iDAR (Intelligent Detection and Ranging) perception system is modeled on human visual perception, using biomimicry to better equip autonomous vehicles for the safest ride possible. For example, iDAR can assess which way a child is facing and estimate the probability that the child will step into the street, preparing the vehicle to stop if the child does. “There are three best practices we have adopted at AEye,” said AEye Chief of Staff Blair LaCorte. “First: never miss anything; second: not all objects are equal; and third: speed matters. Dynamic Vixels enables iDAR to acquire a target faster, assess a target more accurately and completely, and track a target more efficiently – at ranges of greater than 230m with 10% reflectivity.” AEye plans to release the AE100 artificial perception system, its first iDAR-based product, this summer. It will be available to Tier 1 suppliers and original equipment manufacturers (OEMs) with autonomous vehicle initiatives.
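LaCorte’s second principle, that not all objects are equal, implies some form of risk-weighted attention. The article does not describe how iDAR actually schedules its sensing, so the following Python sketch is only a generic illustration of priority-based revisiting; the Target class, the scores, and the labels are all hypothetical.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Target:
    priority: float                  # lower value = revisit this target sooner
    label: str = field(compare=False)

def schedule_revisits(targets):
    """Illustrative only: pop targets in risk order, mirroring the
    'not all objects are equal' principle (high-risk objects first)."""
    queue = list(targets)
    heapq.heapify(queue)
    while queue:
        yield heapq.heappop(queue).label

# A child facing the road outranks a parked car.
targets = [Target(0.1, "child facing street"),
           Target(0.9, "parked car"),
           Target(0.4, "cyclist in lane")]
print(list(schedule_revisits(targets)))
# ['child facing street', 'cyclist in lane', 'parked car']
```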
Images via AEye