
AEye announces commercially available perception software designed to run inside the sensors of autonomous vehicles


AEye announced the world’s first commercially available, 2D/3D perception system designed to run in the sensors of autonomous vehicles.


For the first time, basic perception can be distributed to the edge of the sensor network. This allows autonomous designers to use sensors to not only search and detect objects, but also to acquire, and ultimately to classify and track these objects. The ability to collect this information in real-time both enables and enhances existing centralized perception software platforms by reducing latency, lowering costs and securing functional safety.
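To illustrate what "perception at the edge of the sensor network" could look like in practice, here is a minimal, hypothetical sketch of an in-sensor step that reduces raw returns to compact object summaries, so only a few values per object, rather than the full point cloud, travel to the central perception stack. The names and clustering logic are illustrative assumptions, not AEye's API.

```python
# Minimal, hypothetical sketch of "perception at the edge": raw returns
# are reduced to compact object summaries inside the sensor, so only a
# few values per object (rather than the full point cloud) need to reach
# the central perception stack. Names are illustrative, not AEye's API.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class ObjectSummary:
    centroid_m: Tuple[float, float, float]  # (x, y, z) in the sensor frame
    num_points: int                         # evidence behind the detection


def summarise_in_sensor(points: List[Tuple[float, float, float]],
                        cell_size_m: float = 2.0) -> List[ObjectSummary]:
    """Toy edge 'detector': group points into coarse grid cells and report
    one centroid per occupied cell instead of forwarding the raw cloud."""
    cells: Dict[Tuple[int, int], List[Tuple[float, float, float]]] = {}
    for x, y, z in points:
        key = (int(x // cell_size_m), int(y // cell_size_m))
        cells.setdefault(key, []).append((x, y, z))

    summaries = []
    for cluster in cells.values():
        n = len(cluster)
        cx = sum(p[0] for p in cluster) / n
        cy = sum(p[1] for p in cluster) / n
        cz = sum(p[2] for p in cluster) / n
        summaries.append(ObjectSummary((cx, cy, cz), n))
    return summaries


if __name__ == "__main__":
    cloud = [(10.1, 2.0, 0.5), (10.3, 2.1, 0.6), (40.0, -3.0, 0.4)]
    print(summarise_in_sensor(cloud))  # two summaries instead of three raw returns
```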

This in-sensor perception system is intended to accelerate the availability of autonomous features in vehicles across all SAE levels of automation, allowing automakers to enable the right amount of autonomy for any desired use case - including the most challenging edge cases - in essence, providing autonomy “on demand” for ADAS, mobility and adjacent markets.

AEye's achievement is the result of its flexible iDAR platform that enables intelligent and adaptive sensing. The iDAR platform is based on biomimicry, and replicates the elegant perception design of human vision through a combination of agile LiDAR, fused camera and artificial intelligence. It is the first system to take a fused approach to perception - leveraging iDAR's unique Dynamic Vixels, which combine 2D camera data (pixels) with 3D LiDAR data (voxels) inside the sensor. This software-definable perception platform allows disparate sensor modalities to complement each other, enabling the camera and LiDAR to work together to make each sensor more powerful, while providing “informed redundancy” that ensures a functionally safe system.
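AEye has not published the internal format of a Dynamic Vixel, but conceptually it pairs a camera pixel with the LiDAR voxel it corresponds to, so appearance and geometry live in one record. The sketch below is an illustrative data structure only, assuming a simple RGB pixel and an (x, y, z, intensity) LiDAR return.

```python
# Illustrative sketch of a fused pixel + voxel record in the spirit of
# AEye's "Dynamic Vixels". The real format is not public; the field names
# and contents here are assumptions made for the example.
from dataclasses import dataclass
from typing import Tuple


@dataclass(frozen=True)
class Pixel:
    row: int
    col: int
    rgb: Tuple[int, int, int]   # 2D appearance from the camera


@dataclass(frozen=True)
class Voxel:
    x: float
    y: float
    z: float                    # 3D position from the LiDAR, metres
    intensity: float            # return strength


@dataclass(frozen=True)
class DynamicVixel:
    """A camera pixel fused with the LiDAR voxel it projects onto."""
    pixel: Pixel
    voxel: Voxel

    @property
    def range_m(self) -> float:
        return (self.voxel.x ** 2 + self.voxel.y ** 2 + self.voxel.z ** 2) ** 0.5


vix = DynamicVixel(Pixel(120, 640, (200, 30, 30)), Voxel(35.0, -1.2, 0.9, 0.7))
print(f"red-ish object at {vix.range_m:.1f} m")  # appearance and geometry in one record
```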

AEye's approach solves one of the most difficult challenges for the autonomous industry as it seeks to deliver perception at speed and at range: improving the reliability of detection and classification, while extending the range at which objects can be detected, classified and tracked. The sooner an object can be classified and its trajectory accurately forecasted, the more time the vehicle has to brake, steer or accelerate in order to avoid collisions.
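To make the range argument concrete, a back-of-the-envelope calculation helps: every extra metre of classification range buys reaction time equal to that distance divided by the closing speed. The speeds and ranges below are illustrative assumptions, not AEye figures.

```python
# Back-of-the-envelope: how much extra reaction time does extra
# classification range buy? The figures are illustrative assumptions,
# not AEye specifications.
def extra_reaction_time_s(extra_range_m: float, closing_speed_kmh: float) -> float:
    closing_speed_mps = closing_speed_kmh / 3.6
    return extra_range_m / closing_speed_mps


# Classifying an object at 300 m instead of 200 m while closing at 120 km/h:
print(f"{extra_reaction_time_s(100.0, 120.0):.1f} s of extra time")  # -> 3.0 s
```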

First generation robotic vision systems tried to solve the challenges of fully autonomous driving by capturing as much data as possible. This required both time and power to process. Second generation systems are designed to intelligently collect, manage and transform data into actionable information.

The unique intelligent capabilities of the iDAR platform allow for applications ranging from ADAS safety augmentation, such as collision avoidance, to selective autonomy (highway lane change), to fully autonomous use cases in closed-loop geo-fenced or open-loop scenarios.

Engineers can now experiment using software-definable sensors without waiting years for the next generation of hardware. They can adapt shot patterns in less than a second and simulate the impact of those changes to find optimal performance. They can also customise features or power usage through modular design, for instance using a smaller laser and no camera to create a specialised ADAS system for under $1000, or mixing and matching short and long range LiDAR with camera and radar for more advanced 360-degree systems for under $15,000. Unlike with the industry's previous generations of sensors, OEMs and Tier 1s can now also move algorithms into the sensors when it is appropriate.
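AEye has not published a public configuration interface, but a software-definable scan might be expressed as declarative configuration along the following lines; every field name here is a hypothetical stand-in used only to illustrate reconfiguring a shot pattern in software rather than waiting for new hardware.

```python
# Hypothetical sketch of a software-definable scan configuration. AEye's
# actual interface is not public; the structure and field names below are
# assumptions used to illustrate the idea of changing a shot pattern in
# software rather than in hardware.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class ScanRegion:
    azimuth_deg: Tuple[float, float]    # (min, max) horizontal extent
    elevation_deg: Tuple[float, float]  # (min, max) vertical extent
    point_density: float                # relative shot density, 1.0 = baseline
    revisit_hz: float                   # how often this region is rescanned


# A highway ADAS profile: dense, fast revisits straight ahead,
# sparser coverage at the periphery.
highway_profile = [
    ScanRegion(azimuth_deg=(-10, 10), elevation_deg=(-2, 5),
               point_density=4.0, revisit_hz=30.0),
    ScanRegion(azimuth_deg=(-60, 60), elevation_deg=(-5, 10),
               point_density=1.0, revisit_hz=10.0),
]

for region in highway_profile:
    print(region)
```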

“We believe the power and intelligence of the iDAR platform transforms how companies can create and evolve business models around autonomy without having to wait for the creation of full Level 5 Robotaxis,” said Blair LaCorte, President of AEye. “Automakers are now seeing autonomy as a continuum, and have identified the opportunity to leverage technology across this continuum. As the assets get smarter, OEMs can decide when to upgrade and leverage this intelligence. Technology companies that provide software-definable and modular hardware platforms now can support this automotive industry trend.”

AEye's system more quickly and accurately searches, detects and segments objects and, as it acquires specific objects, validates that classification with velocity and orientation information. This enables the system to forecast the object's behaviour, including inferring intent. By providing the smarts to capture better information faster, the system enables more accurate, timely, reliable perception, using far less power than traditional perception solutions.
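One simple way to see why true velocity enables behaviour forecasting is a constant-velocity projection. The toy example below (a simplification, not AEye's forecasting method) extrapolates an object's measured position and velocity over a short horizon and reports the smallest predicted gap to the ego vehicle.

```python
# Toy constant-velocity forecast: given an object's measured position and
# true velocity, project where it will be over the next few seconds and
# report the smallest predicted gap to the ego vehicle. This is a
# simplification for illustration, not AEye's forecasting algorithm.
from typing import Tuple


def forecast(position_m: Tuple[float, float],
             velocity_mps: Tuple[float, float],
             t_s: float) -> Tuple[float, float]:
    x, y = position_m
    vx, vy = velocity_mps
    return (x + vx * t_s, y + vy * t_s)


def min_distance_to_ego(position_m: Tuple[float, float],
                        velocity_mps: Tuple[float, float],
                        ego_speed_mps: float = 20.0,
                        horizon_s: float = 4.0,
                        step_s: float = 0.1) -> float:
    """Smallest predicted gap between the object and the ego vehicle,
    assuming the ego drives straight ahead at constant speed."""
    best = float("inf")
    t = 0.0
    while t <= horizon_s:
        ox, oy = forecast(position_m, velocity_mps, t)
        ex, ey = ego_speed_mps * t, 0.0
        best = min(best, ((ox - ex) ** 2 + (oy - ey) ** 2) ** 0.5)
        t += step_s
    return best


# A slower car 60 m ahead in the ego lane, travelling at 5 m/s:
gap = min_distance_to_ego((60.0, 0.5), (5.0, 0.0))
print(f"minimum predicted gap: {gap:.1f} m")  # a small gap signals a potential conflict
```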

This 2D/3D perception system is based on AEye's iDAR platform, whose perception advancements the company will make broadly available via a software reference library. That library includes the following features that will be resident in AEye's AE110 (Mobility) and AE200 (ADAS) sensors:

1. Detection: Identification of objects (e.g. cars, pedestrians) in the 3D point cloud and camera. The system accurately estimates their centroids, width, height and depth to generate 3D bounding boxes for the objects.
2. Classification: Classifying the type of detected objects. This helps in further understanding the motion characteristics of those objects.
3. Segmentation: Further classifying each point in the scene to identify specific objects those points belong to. This is especially important to accurately identify finer details, such as lane divider markings on the road.
4. Tracking: Tracking objects through space and time. This helps keep track of objects that could intersect the vehicle's path.
5. Range/Orientation: Identifying where the object is relative to the vehicle, and how it's oriented relative to the vehicle. This helps the vehicle contextualize the scene around it.
6. True Velocity: Leveraging the benefits of agile LiDAR to capture the speed and direction of the object's motion relative to the vehicle. This provides the foundation for motion forecasting.
7. Motion Forecasting: Forecasting where the object will be at different times in the future. This helps the vehicle to assess the risk of collision and chart a safe course.
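Taken together, the features above amount to a per-object output record. The sketch below shows one plausible shape for such a record; the field names are illustrative assumptions and do not come from AEye's reference library.

```python
# Illustrative per-object output record covering the capabilities listed
# above (detection, classification, segmentation, tracking, range and
# orientation, true velocity, motion forecasting). Field names are
# assumptions for the example, not AEye's reference library API.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PerceivedObject:
    track_id: int                                  # tracking
    label: str                                     # classification, e.g. "pedestrian"
    box_center_m: Tuple[float, float, float]       # detection: centroid
    box_size_m: Tuple[float, float, float]         # detection: width, height, depth
    point_indices: List[int]                       # segmentation: points on this object
    range_m: float                                 # range relative to the vehicle
    heading_deg: float                             # orientation relative to the vehicle
    velocity_mps: Tuple[float, float, float]       # true velocity
    forecast_positions_m: List[Tuple[float, float, float]]  # e.g. at +1 s, +2 s


obj = PerceivedObject(
    track_id=7, label="car",
    box_center_m=(42.0, -1.5, 0.8), box_size_m=(1.8, 1.5, 4.3),
    point_indices=[101, 102, 107], range_m=42.1, heading_deg=2.0,
    velocity_mps=(12.0, 0.1, 0.0),
    forecast_positions_m=[(54.0, -1.4, 0.8), (66.0, -1.3, 0.8)],
)
print(obj.label, "at", obj.range_m, "m, moving at", obj.velocity_mps[0], "m/s")
```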

AEye's iDAR software reference library will be available in Q1 2020, and will be demonstrated this January at CES.

