Enhancements to camera-based object detection and continuous system-level performance evaluation in version 2024.02 of the Mapless Autonomy Platform
driveblocks releases a new revision of its core product, the Mapless Autonomy Platform. The release improves the platform's ability to operate in industrial vehicle settings such as construction and agriculture. In addition, it supports customers in understanding the platform's performance for their respective use cases in detail and in identifying potential improvements.
In particular, the 2024.02 release brings the following major improvements:
Full support for camera-based object detection with a pre-trained foundation model covering a common set of object classes. This serves as a baseline for refinement on customer datasets (see the sketch after this list).
Addition of a scenario evaluation framework that can be used to track system-level performance on a set of customer-specific datasets.
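To illustrate what such a baseline detection workflow can look like, the following minimal Python sketch runs a publicly available pre-trained detector. Note that torchvision's Faster R-CNN is used here purely as a stand-in for the platform's foundation model, and the image file name and confidence threshold are illustrative assumptions, not part of the release.

# Minimal sketch: inference with a pre-trained object detector as a baseline.
# torchvision's Faster R-CNN stands in for the platform's foundation model;
# the file name and confidence threshold are illustrative assumptions.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT  # pre-trained on common classes
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

image = read_image("construction_site.jpg")  # hypothetical customer image
with torch.no_grad():
    prediction = model([preprocess(image)])[0]

# Report detections above an illustrative confidence threshold.
for label, score in zip(prediction["labels"], prediction["scores"]):
    if score > 0.5:
        print(f"{weights.meta['categories'][int(label)]}: {score:.2f}")

A baseline like this can then be refined (fine-tuned) on annotated customer data so that domain-specific classes and conditions are covered.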
Additional minor improvements and bug fixes include:
More detailed documentation on several components and additional tutorials on how to work with the platform.
Support for compressed images in the data pipeline and inference nodes, saving storage and bandwidth (see the sketch after this list).
Pre-configured ROS2 nodes for lane detection, object detection, and joint features.
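As an illustration of the compressed-image support, here is a minimal rclpy sketch that subscribes to a compressed camera topic and decodes it for downstream inference. The node and topic names are illustrative assumptions, not driveblocks APIs; decoding uses the standard cv_bridge package.

# Minimal sketch: a ROS2 node consuming compressed images, analogous to the
# data pipeline and inference nodes described above. Node and topic names
# are hypothetical placeholders.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage
from cv_bridge import CvBridge


class CompressedImageListener(Node):
    def __init__(self):
        super().__init__("compressed_image_listener")  # hypothetical node name
        self.bridge = CvBridge()
        self.subscription = self.create_subscription(
            CompressedImage,
            "/camera/image/compressed",  # hypothetical topic name
            self.on_image,
            10,
        )

    def on_image(self, msg: CompressedImage):
        # Decode the compressed payload (e.g. JPEG) into a BGR OpenCV image.
        frame = self.bridge.compressed_imgmsg_to_cv2(msg, desired_encoding="bgr8")
        self.get_logger().info(f"received frame {frame.shape[1]}x{frame.shape[0]}")


def main():
    rclpy.init()
    node = CompressedImageListener()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == "__main__":
    main()

Transporting sensor_msgs/CompressedImage instead of raw images reduces both recording storage and on-vehicle bandwidth, at the cost of a decode step before inference.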
With the new release, we lay the foundation for integrating the Mapless Autonomy Platform into several new use cases, especially in industrial environments. The current focus areas are:
Leader & Follower applications: An autonomous machine follows another machine operated by a human driver while continuously monitoring the environment and avoiding collisions.
Automated Emergency Braking and Collision Avoidance: The autonomous machine performs certain tasks based on GNSS guidance and is supervised by a perception-based emergency braking and collision avoidance system.
Complementary perception and sensor-fusion path: To increase reliability and scenario coverage, customers can introduce the Mapless Autonomy Platform as a complementary perception and sensor-fusion path in their overall autonomy stack. This gives them a fully independent solution that helps mitigate common-cause failure modes in the overall system design.
Stay tuned for announcements around these specific applications in the upcoming weeks!