Building on the driverless experience gained with the previous car, FAUmax byssa was fitted with new sensors that improve its autonomous driving capabilities.
The chosen concept is essentially based on the following points:
- Simple and reliable construction
- 16” tires
- A CFRP full monocoque that houses the driver's cell, the engine, and its secondary systems, striking a compromise between serviceability, high integration, and superior airflow
- Longitudinally mounted KTM SX-F 450 engine with a displacement increase to 510 cm³, a compression ratio of 15.1:1, and increased torque
- Computing Unit: Nvidia Drive PX2 Autochauffeur
- Steering: Nanotec ST6018L3008-A
- EBS: redundant pneumatic system mechanically actuating the brake pedal
- Camera: Basler dart daA1600-60uc
- LiDAR: Hesai Pandar40P
Our Hesai Pandar40P LiDAR collects point clouds 20 times per second with 36,000 points each. We insert the points into a grid and compute statistics for each cell. Based on these statistics, we test each cell for the presence of a cone, then refine the detected cone positions using the original point data.
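The grid-based cone check can be sketched as follows. All thresholds (cell size, minimum point count, cone height limits) are illustrative assumptions, not the team's actual parameters, and the sketch presumes that ground points have already been removed from the cloud:

```python
import numpy as np

def detect_cones(points, cell_size=0.25, min_pts=5,
                 min_height=0.1, max_height=0.35):
    """Bin a LiDAR point cloud (N x 3 array, x/y/z in metres) into a 2D
    grid, compute per-cell statistics, and flag cells whose height profile
    matches a small object such as a cone. Assumes ground points were
    removed beforehand; all thresholds are placeholder values."""
    ij = np.floor(points[:, :2] / cell_size).astype(np.int64)
    # Sort points so that all points of one cell are contiguous.
    order = np.lexsort((ij[:, 1], ij[:, 0]))
    ij, pts = ij[order], points[order]
    # Split the sorted array at every cell boundary.
    splits = np.flatnonzero(np.any(np.diff(ij, axis=0), axis=1)) + 1
    cones = []
    for cell in np.split(pts, splits):
        if len(cell) < min_pts:
            continue  # too sparse to be a cone
        height = cell[:, 2].max() - cell[:, 2].min()
        if min_height <= height <= max_height:
            # Refinement step: centroid of the original points in the cell.
            cones.append(cell[:, :2].mean(axis=0))
    return np.array(cones)
```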
Our camera pipeline currently starts with a single fisheye camera. With the help of a neural network, we extract the position and type of the cones visible in the camera's field of view. Afterward, we transform the detections to real-world coordinates based purely on the known dimensions of a cone and its location in the image. This yields perception that is fast and robust enough to combine the colors observed by the camera with the precise positions from our LiDAR algorithm.
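Under a pinhole-camera assumption (the actual pipeline uses a fisheye lens, so undistortion would come first), the back-projection from a cone's bounding box to real-world coordinates can be sketched like this. The intrinsics and the cone height are placeholder values, not the team's calibration:

```python
def cone_position_from_image(bbox_top_px, bbox_bottom_px, bbox_center_x_px,
                             fx=800.0, fy=800.0, cx=720.0,
                             cone_height_m=0.325):
    """Estimate a cone's lateral offset and depth in the camera frame from
    its bounding box, using similar triangles and the known physical cone
    height. Intrinsics (fx, fy, cx) and cone_height_m are illustrative."""
    h_px = bbox_bottom_px - bbox_top_px       # apparent height in pixels
    depth = fy * cone_height_m / h_px         # Z = f * H / h
    x = (bbox_center_x_px - cx) * depth / fx  # lateral offset at that depth
    return x, depth
```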
The track is marked with colored cones on either side. The task of the pathfinder is to find a path through the track without leaving it through large gaps between the marking cones on either side. Using the merged cone data from camera and LiDAR, it selects the most likely of all possible paths through the mesh.
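As a simplified illustration of deriving a drivable line from cones on both sides (the actual pathfinder evaluates candidate paths through a mesh, whose construction the text does not detail), one can pair each left-boundary cone with its nearest right-boundary cone and take the midpoints; the gap threshold is an assumed value:

```python
import numpy as np

def centerline(left_cones, right_cones, max_gap=6.0):
    """Sketch of a centerline estimate: pair each left cone with its
    nearest right cone and return the midpoints. Pairings across
    implausibly large gaps (> max_gap metres, an assumed limit) are
    discarded so the line never crosses a hole in the track boundary."""
    left = np.asarray(left_cones, dtype=float)
    right = np.asarray(right_cones, dtype=float)
    midpoints = []
    for p in left:
        d = np.linalg.norm(right - p, axis=1)
        j = int(d.argmin())
        if d[j] <= max_gap:
            midpoints.append((p + right[j]) / 2.0)
    return np.array(midpoints)
```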
Having determined the path, our controller sends the calculated steering angle and target speed to the respective control units via CAN bus.
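A minimal sketch of how such setpoints could be packed into an 8-byte CAN payload; the layout and scaling factors are assumptions for illustration, since the actual frame format on the car's bus is defined by its control units and not given in the text:

```python
import struct

def encode_setpoint_frame(steering_angle_rad, target_speed_mps):
    """Pack steering angle and target speed into an 8-byte CAN payload as
    two little-endian int32 values. The scaling (0.0001 rad/bit and
    0.01 m/s per bit) and the layout are hypothetical; a real car's frame
    definition would come from its CAN database. A library such as
    python-can could then wrap this payload in a message and send it."""
    angle_raw = int(round(steering_angle_rad / 0.0001))
    speed_raw = int(round(target_speed_mps / 0.01))
    return struct.pack("<ii", angle_raw, speed_raw)  # 8 bytes total
```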