Detection and classification of 2D bounding boxes around objects based on camera data. The module supports 18 classes out of the box (e.g. cars, trucks, pedestrians, construction vehicles, and rarer ones such as animals). The components are flexible with respect to input image size and resolution, and they have been developed with a strong focus on adaptability to varied scenes and robust performance in diverse conditions such as changing lighting and weather.
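The module's exact output format is not specified here; as a minimal sketch, assuming detections arrive as class/score/box records (the `Detection2D` record and `filter_detections` helper are hypothetical), downstream filtering to the classes of interest might look like:

```python
from dataclasses import dataclass

# Hypothetical detection record; the module's real output format may differ.
@dataclass
class Detection2D:
    cls: str          # one of the 18 supported classes, e.g. "car"
    confidence: float # detector score in [0, 1]
    box: tuple        # (x_min, y_min, x_max, y_max) in pixels

def filter_detections(dets, wanted_classes, min_conf=0.5):
    """Keep only detections of the wanted classes above a score threshold."""
    return [d for d in dets if d.cls in wanted_classes and d.confidence >= min_conf]

dets = [
    Detection2D("car", 0.92, (10, 20, 110, 90)),
    Detection2D("animal", 0.35, (200, 40, 240, 80)),
    Detection2D("pedestrian", 0.81, (300, 30, 330, 120)),
]
kept = filter_detections(dets, {"car", "pedestrian"})
```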
Detection and classification of 2D line-like objects based on camera data. The module supports 11 classes out of the box (e.g. white lines, yellow lines, dashed lines, grass, and more).
This component estimates potentially large objects that would block the autonomous vehicle from proceeding. Importantly, it is purely geometry-based and can detect arbitrarily complex shapes. It is therefore especially suited for obstacles such as large and complex machinery, building exteriors, or piles of sand and mud.
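As a rough illustration of such a purely geometric check (not the component's actual algorithm; corridor dimensions and the height threshold are assumptions), a blockage test could ask whether any measured point lies inside the forward driving corridor and sufficiently above the ground:

```python
def corridor_blocked(points, corridor_half_width=1.5, lookahead=10.0,
                     min_height=0.3):
    """Return True if any point lies inside the forward driving corridor
    and high enough above ground to count as an obstacle.

    points: iterable of (x, y, z) in the vehicle frame, x forward, z up,
    with z measured relative to the local ground estimate.
    All thresholds are illustrative values."""
    for x, y, z in points:
        if 0.0 < x < lookahead and abs(y) < corridor_half_width and z > min_height:
            return True
    return False
```

Because the test is purely geometric, it makes no assumption about object class, which is why this style of check handles arbitrary shapes like machinery or piles of material.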
Outputs a 3D estimate of the ground surface around the vehicle. It provides height and slope estimates and allows the ground to be classified based on slope, e.g. whether it is too steep to operate on.
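A minimal sketch of slope-based classification, assuming the ground estimate is available as a height grid (the function name and the 15° limit are illustrative):

```python
import numpy as np

def steep_cells(height, cell_size, max_slope_deg=15.0):
    """Classify grid cells of a ground height map as too steep to operate on.

    height: 2D array of ground heights in metres.
    cell_size: grid resolution in metres.
    Returns a boolean mask of cells exceeding the slope limit."""
    dz_dy, dz_dx = np.gradient(height, cell_size)        # finite differences
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg > max_slope_deg
```

For example, a uniform ramp rising 0.5 m per 1 m cell has a slope of about 26.6°, so every cell would be flagged with the default 15° limit.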
Estimation of 3D bounding boxes based on point cloud clustering. Clustering is performed based on Euclidean distance and motion similarity, which improves performance in dynamic scenes with many objects.
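The distance-plus-motion criterion can be sketched as single-linkage clustering that merges two points only when they are close in space AND move similarly; this is an illustrative implementation, with made-up thresholds, not the module's actual one:

```python
import numpy as np

def cluster_points(pos, vel, pos_eps=0.5, vel_eps=1.0):
    """Single-linkage clustering: two points join the same cluster only if
    their Euclidean distance is below pos_eps AND their velocity difference
    is below vel_eps.

    pos, vel: (N, 3) float arrays. Returns a list of cluster labels."""
    n = len(pos)
    labels = [-1] * n
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        labels[i] = current
        stack = [i]
        while stack:                       # flood-fill the cluster
            j = stack.pop()
            for k in range(n):
                if labels[k] == -1 \
                        and np.linalg.norm(pos[j] - pos[k]) < pos_eps \
                        and np.linalg.norm(vel[j] - vel[k]) < vel_eps:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels
```

The motion term is what helps in dense dynamic scenes: two vehicles driving close together but at different speeds stay in separate clusters even though a purely spatial criterion would merge them.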
A fusion of 3D LIDAR bounding boxes and 2D camera detections. Enables object tracking across frames, higher robustness against short misdetections, and motion estimation. Highly configurable, it can support diverse sensor sets without retraining or major modifications.
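One common way to associate the two modalities, sketched here under the assumption that the 3D boxes have already been projected into the image plane (the greedy IoU matching below is a generic technique, not necessarily the module's method):

```python
def iou_2d(a, b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def match_boxes(projected_3d, camera_2d, min_iou=0.3):
    """Greedily associate projected LIDAR boxes with camera detections."""
    matches, used = [], set()
    for i, p in enumerate(projected_3d):
        best, best_iou = None, min_iou
        for j, c in enumerate(camera_2d):
            if j in used:
                continue
            v = iou_2d(p, c)
            if v > best_iou:
                best, best_iou = j, v
        if best is not None:
            used.add(best)
            matches.append((i, best))
    return matches
```

Because the association operates on generic boxes rather than raw sensor data, swapping in a different camera or LIDAR does not require retraining, which is the point of such a late-fusion design.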
Generates a lanelet-style local map around the vehicle for navigation and planning purposes. The functionality is mainly based on the 2D lane inputs and can consume an arbitrary number of camera inputs.
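The core geometric step behind a lanelet-style lane can be illustrated as pairing left and right boundary samples and taking their midpoints as a centerline (a deliberately simplified sketch; the real map generation involves far more):

```python
def centerline(left, right):
    """Midpoints of paired left/right boundary samples: a simple lane
    centerline from two detected boundary polylines.

    left, right: lists of (x, y) samples, assumed index-aligned."""
    return [((lx + rx) / 2.0, (ly + ry) / 2.0)
            for (lx, ly), (rx, ry) in zip(left, right)]
```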
Generates trajectories based on the behavior of a leader vehicle, which defines the target behavior of the autonomous vehicle.
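A minimal sketch of the idea, assuming the leader's positions are observed periodically and replayed as waypoints once the ego vehicle reaches them (class name and the 1 m reach threshold are illustrative):

```python
from collections import deque

class LeaderFollower:
    """Records the leader vehicle's poses and replays them as the target
    trajectory: the ego vehicle steers toward the oldest recorded leader
    position it has not yet reached."""

    def __init__(self, reach_radius=1.0):
        self.reach_radius = reach_radius
        self.path = deque()

    def observe(self, leader_pos):
        """Append the latest observed leader position (x, y)."""
        self.path.append(leader_pos)

    def next_waypoint(self, ego_pos):
        """Return the next waypoint to follow, discarding reached ones."""
        ex, ey = ego_pos
        while self.path:
            wx, wy = self.path[0]
            if ((wx - ex) ** 2 + (wy - ey) ** 2) ** 0.5 > self.reach_radius:
                return (wx, wy)
            self.path.popleft()  # already reached, drop it
        return None              # no leader path left to follow
```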
Monitors the area near the vehicle, especially in the driving direction, and initiates emergency braking when detected objects could lead to a collision.
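One standard form such a check can take is a time-to-collision threshold (shown here as a generic sketch with an assumed 1.5 s threshold, not necessarily this component's criterion):

```python
def should_emergency_brake(distance_m, closing_speed_mps, ttc_threshold_s=1.5):
    """Trigger emergency braking when the time to collision with the nearest
    object ahead drops below a threshold.

    distance_m: range to the object; closing_speed_mps: rate at which the
    gap is shrinking (negative or zero means the object is not approaching)."""
    if closing_speed_mps <= 0.0:
        return False
    return distance_m / closing_speed_mps < ttc_threshold_s
```

For example, an object 10 m ahead closing at 10 m/s gives a TTC of 1.0 s, below the 1.5 s threshold, so braking would be triggered.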
Enhances existing GNSS-based planning systems with reactive capabilities.
Enhances existing route-based autonomy systems with reactive capabilities.