The future of sensor technology in unmanned driving: Will cameras replace LiDAR?

The core technologies of unmanned driving are environmental perception, precise positioning, and path planning. Complex road environments, especially mixed-traffic environments, make environmental perception particularly difficult for autonomous vehicles. Currently, the mainstream obstacle-detection sensors are the camera and LiDAR. Cameras are widely used in intelligent driving because of their low cost and their ability to capture the texture and color of targets, which are especially important for recognizing traffic lights and traffic signs.

The computer vision school is essentially betting that, before LiDAR prices fall to near those of millimeter-wave radar, artificial intelligence will let vehicles match the perception of the human eye and the reasoning of the human brain using only optical cameras for visual perception and cognitive decision-making.

For applications where either radar or cameras can be used, cameras are the lower-cost solution. Cameras can be mounted at the front, sides, and rear of the vehicle, or built into the cabin. They are mainly used for forward collision warning, lane departure warning, traffic sign recognition, parking assistance, and blind spot monitoring; a simple mapping between functions and mounting positions is sketched below.

Cameras are usually divided into monocular and binocular (stereo) types. In general, a monocular vehicle camera has a field of view of 50° to 60° and an effective range of 100 m to 200 m. A binocular camera mimics human stereo vision for 3D imaging: by comparing the image signals obtained by its two cameras, it can identify objects more reliably and derive their distance and speed algorithmically, as the sketch below illustrates.
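As a minimal sketch of how a binocular system recovers distance: for a rectified stereo pair, the depth of a point is proportional to the camera baseline and focal length and inversely proportional to the disparity between the two images (Z = f · B / d); speed then follows from the change in depth across successive frames. The numbers below are illustrative assumptions, not the specifications of any particular automotive camera.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: a 1000-pixel focal length and a 0.12 m baseline.
# A 6-pixel disparity then corresponds to a point 20 m ahead.
print(stereo_depth(focal_px=1000.0, baseline_m=0.12, disparity_px=6.0))  # 20.0
```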

Compared with LiDAR, cameras can rely on natural light during the day. In good lighting they can recognize the colors of vehicles and traffic lights and resolve distant objects at higher resolution and lower cost. However, cameras also have disadvantages: they are easily affected by rain, snow, and changing illumination, which strongly degrades recognition accuracy, and current camera technology struggles to identify distant objects in static images.

Some studies suggest that stereoscopic cameras could be used in autonomous vehicles, significantly reducing costs while improving safety. But it will be a long time before cameras replace LiDAR. As long as existing artificial intelligence falls short of human intelligence, another sensor is needed to compensate for the limited range perception of passive optical imaging and for the uncertain detection direction of millimeter-wave radar. By comparison, LiDAR solves the distance-measurement problem far better and improves the on-road safety of autonomous vehicles under current technical conditions, in a way that computer vision alone cannot.
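To illustrate why LiDAR gives a direct rather than an inferred distance measurement, the sketch below shows the basic time-of-flight calculation behind pulsed LiDAR ranging. It assumes an idealized single-return pulse and ignores real-world effects such as atmospheric attenuation and multiple returns.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range_m(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: the pulse travels to the target and back,
    so range = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse returning after 1 microsecond implies a target about 150 m away.
print(lidar_range_m(1e-6))  # ~149.9 m
```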

The German Federal Motor Transport Authority (KBA) has granted Mercedes-Benz the first approval under SAE Level 3 and UN Regulation No. 157. Its perception system includes a wetness sensor in the wheel well, LiDAR sensors, microphones, and a rear-window camera used primarily to detect the blue lights and other special signals of emergency vehicles. This technical approval was possible mainly because the system adopts LiDAR to perceive its environment.