Autonomous driving is a complex task that requires reliable and accurate perception of the environment. While cameras are widely used in L2+ and higher levels of autonomous vehicles for their affordability and efficiency, relying solely on cameras for perception may not be enough to ensure truly safe autonomous driving and driver assistance. For this reason, having two (or more!) different sensing technologies in a sensor suite is important, and combining the data from each achieves even more. Sensor fusion overcomes the limitations of individual sensors by integrating data from multiple sensor types; running a unified perception algorithm on the merged data offers the greatest benefit and, ultimately, the greatest safety.
The camera is undoubtedly a foundational sensor for any autonomous sensor suite, providing visual information essential for perception tasks. Identifying the two-dimensional shapes and colors of objects, for example, is crucial for reading lanes and pavement markings and for classification, and it is something cameras do exceptionally well. However, cameras also have limitations that make them fallible in certain scenarios.
For example, cameras may struggle with light saturation or white-balance issues caused by abrupt changes in lighting (think: the brightness of exiting a tunnel). The camera handles these scenarios by adjusting its exposure through shutter speed and aperture, sometimes combined with algorithmic corrections. Still, because cameras cannot natively differentiate detail at the boundary between light and shade, they remain vulnerable to false positives and false negatives.
[Image: Light saturation / white balance when exiting a tunnel]
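To make this failure mode concrete, here is a minimal sketch, in Python, of the kind of exposure heuristic described above. It assumes an 8-bit grayscale frame held in a NumPy array, and the threshold and target values are illustrative choices for the example, not parameters of any real camera pipeline.

```python
import numpy as np

SATURATION_LEVEL = 250      # near the 8-bit ceiling of 255 (illustrative threshold)
SATURATED_FRACTION = 0.20   # flag frames where 20%+ of pixels are blown out

def is_saturated(frame: np.ndarray) -> bool:
    """Return True when a large fraction of pixels sits near full scale."""
    return float(np.mean(frame >= SATURATION_LEVEL)) >= SATURATED_FRACTION

def exposure_scale(frame: np.ndarray, target_mean: float = 118.0) -> float:
    """Crude correction factor that would bring the frame mean back to mid-range."""
    return target_mean / max(float(frame.mean()), 1.0)

# Toy tunnel-exit frame: the upper half of an 8-bit grayscale image is blown out.
frame = np.full((480, 640), 80, dtype=np.uint8)
frame[:240, :] = 255

print(is_saturated(frame))              # True: detections in the bright half are suspect
print(round(exposure_scale(frame), 2))  # ~0.7: expose down, at the cost of the shaded half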
For the same reason, cameras struggle at night when a vehicle heading in the opposite direction forgets to dim its high beams. The light saturates the camera's image and, at some stage, the algorithm will have trouble discriminating objects on the road. This weakness is unfortunate at low speeds and close distances, and it becomes even more pronounced, and more dangerous, at high speeds and long range.
Another challenge for cameras is what is known as partial observability. Occasionally, a camera will have a line of sight to only part of an object; for instance, a guardrail may obscure the lower section of a vehicle. The camera can see the car, but it cannot see its true shape. This can lead to a false negative detection, although it is most often correctable in the algorithm. More often, partial observability creates depth perception challenges. Cameras do not measure depth directly; they rely on stereoscopic mounting, sophisticated algorithms, or comparisons across frames that introduce latency. While these corrective measures are usually effective, they are ultimately workarounds for an inherent camera weakness, and we can never be 100% confident that they alone are sufficient.
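To see why depth from a camera is indirect, consider the classic stereo relation: depth equals focal length times baseline divided by disparity. The short sketch below, using illustrative calibration numbers rather than values from any specific system, shows how sensitive that estimate is to a single pixel of matching error.

```python
def stereo_depth_m(disparity_px: float,
                   focal_length_px: float = 1000.0,  # illustrative calibration values
                   baseline_m: float = 0.12) -> float:
    """Pinhole stereo relation: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0.0:
        return float("inf")  # no disparity: the point is effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# A single pixel of matching error already shifts the estimate by more than a metre
# at this range, and the error grows quadratically with distance.
print(stereo_depth_m(10.0))  # 12.0 m
print(stereo_depth_m(9.0))   # ~13.3 m
```

Radar, by contrast, measures range directly from the returned signal, which is one reason the two sensors complement each other so well.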
All of this is to say, cameras are clearly the basis for any automotive sensor suite, but they cannot and do not perform perfectly in every possible scenario. Only through sensor fusion – the combination of data from multiple sensor types – are we able to ensure both the data redundancy and the complementary strengths needed to achieve safety in every situation.
To overcome the limitations of cameras and ensure reliable decision making, a versatile sensor suite is necessary. Leveraging different sensing technologies, such as cameras, radars, lidars, and other sensors, improves the accuracy, robustness, and redundancy of detection results. Imaging radar has emerged as a key contributor to this sensing diversity, enabling truly safe L2+ and higher autonomy levels.
Imaging radar, like cameras, provides rich and detailed data about the environment, but with some distinct and complementary advantages. For example, imaging radar can accurately detect and track objects even in challenging conditions such as low light, adverse weather, and partial visibility, and at greater distances, allowing for increased reaction time. Unlike cameras, imaging radar can also provide accurate distance and speed measurements thanks to the Doppler effect, which is available in radio-frequency sensing but not in optical sensing. Imaging radar also resolves the challenge of an obscured line of sight: because radar RF signals wrap around objects, they can create a complete map of objects across the entire field of view. High-resolution radars are also capable of detecting, confirming, or correcting camera detections, and of adding trajectory insights to those detections.
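For readers who want the Doppler point spelled out: a monostatic radar measures radial speed directly from the frequency shift of the returned signal, v = f_d * λ / 2. Here is a minimal sketch with an assumed 77 GHz carrier, typical of automotive radar bands; the numbers are illustrative.

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity_mps(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Monostatic radar Doppler relation: v_r = f_d * lambda / 2."""
    wavelength_m = C / carrier_hz
    return doppler_shift_hz * wavelength_m / 2.0

# A ~10 kHz Doppler shift at 77 GHz corresponds to roughly 19.5 m/s (~70 km/h)
# of closing speed, measured directly rather than inferred across video frames.
print(round(radial_velocity_mps(10_000.0), 2))
```

No frame-to-frame differencing is required; the speed arrives in the same measurement as the range.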
The inclusion of radar in the sensor suite very clearly improves object tracking, including inference of object distance, time to collision, and more, through data diversity, i.e., by providing data that are unique to the radar.
[Image: Radar improving inference of object distance]
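To make the time-to-collision point concrete, here is a deliberately simple, first-order sketch. It assumes a constant closing speed and uses the radar's range and Doppler-derived radial speed as inputs; real tracking stacks are, of course, far more elaborate.

```python
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """First-order TTC estimate: range divided by closing (radial) speed."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # opening or stationary: no predicted collision
    return range_m / closing_speed_mps

# Radar supplies both inputs in a single measurement: 60 m ahead, closing at 20 m/s.
print(time_to_collision_s(60.0, 20.0))  # 3.0 seconds in which to warn or brake
```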
Beyond this, only high-resolution imaging radar provides enough detail to deliver true data redundancy: if the camera becomes unavailable for any reason, the radar can stand in for it at a level sufficient to support a complex application such as free-space mapping. Overall, its complementary and supplementary relationship with the other sensors makes imaging radar a critical component of the sensor suite.
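As a rough illustration of what free-space mapping from radar alone can look like, here is a minimal occupancy-grid sketch. It assumes the radar delivers a bird's-eye-view point cloud in metres with the ego vehicle at the origin; the cell size, extent, and toy point cloud are assumptions made for the example, not properties of any particular radar.

```python
import numpy as np

def occupancy_grid(points_xy: np.ndarray,
                   cell_m: float = 0.5,
                   extent_m: float = 50.0) -> np.ndarray:
    """Mark grid cells that contain at least one radar return as occupied."""
    size = int(2 * extent_m / cell_m)
    grid = np.zeros((size, size), dtype=bool)
    # Shift coordinates so the ego vehicle sits at the grid centre.
    idx = np.floor((points_xy + extent_m) / cell_m).astype(int)
    valid = np.all((idx >= 0) & (idx < size), axis=1)
    grid[idx[valid, 1], idx[valid, 0]] = True
    return grid

# Toy point cloud: a wall of returns 10 m ahead of the ego vehicle.
cloud = np.array([[x, 10.0] for x in np.arange(-5.0, 5.0, 0.25)])
grid = occupancy_grid(cloud)
print(int(grid.sum()), "occupied cells")  # everything left unmarked is candidate free space
```

The denser the radar point cloud, the more trustworthy that free-space inference becomes.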
Arbe’s Perception Radar, with its ultra-high-resolution imaging, is a game-changer for sensor fusion with cameras. Thanks to its exceptional angular resolution and frame registration, Arbe’s radar achieves a level of detail that makes it possible to merge the two very different inputs from the camera and the radar.
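What does merging the two inputs actually involve? A common first step, sketched below purely for illustration, is to project the radar's 3-D detections into the camera's image plane using the sensors' calibration, so that each radar point, with its range and Doppler speed, can be associated with a camera detection. The intrinsics, extrinsics, and coordinate conventions here are hypothetical placeholders, not Arbe's calibration or fusion pipeline.

```python
import numpy as np

# Illustrative calibration only: camera intrinsics K and a radar-to-camera
# extrinsic transform [R | t].
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                   # radar and camera assumed axis-aligned in this toy setup
t = np.array([0.0, 0.2, 0.0])   # radar assumed mounted 20 cm below the camera

def project_radar_to_image(points_radar: np.ndarray) -> np.ndarray:
    """Map radar points (x right, y down, z forward, in metres) to pixel coordinates."""
    cam = points_radar @ R.T + t    # transform into the camera frame
    uv = cam @ K.T                  # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]   # perspective divide

# Two radar detections, 20 m and 40 m ahead; their pixel positions can now be
# matched against camera bounding boxes, attaching range and speed to each box.
pts = np.array([[ 1.0, 0.0, 20.0],
                [-2.0, 0.0, 40.0]])
print(project_radar_to_image(pts))
```

The higher the radar's angular resolution, the more precisely its points land on the right pixels, which is what makes this kind of fusion practical.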
A safe and successful vehicle sensor suite is not about competition between sensor types, but leveraging them to achieve the ideal combination. The more information each sensor type can provide – the higher the resolution and the denser the point cloud – the better and safer driver assist and autonomous driving will be.
By combining the strengths of different sensor types through fusion, vehicles can achieve a more comprehensive and reliable perception of the environment, which is critical for achieving true safety and real autonomy. Arbe’s Perception Radar is detailed and robust enough to support sensor fusion with cameras better than any other radar on the market, making it essential to ensuring accurate and reliable perception in all driving scenarios.
————————————
This blog contains “forward-looking statements” within the meaning of the Securities Act of 1933 and the Securities Exchange Act of 1934, both as amended by the Private Securities Litigation Reform Act of 1995. The words “expect,” “believe,” “estimate,” “intend,” “plan,” “anticipate,” “may,” “should,” “strategy,” “future,” “will,” “project,” “potential” and similar expressions indicate forward-looking statements. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties, including the risk that the proposed regulation will not be adopted in a form which increases the market for Arbe’s radar sensors, that other companies may offer products which are less expensive or more attractive to the market, that if the regulation is adopted, the effective date will be in the distant future and may not have any significant effect on the market for Arbe’s products, and the risks and uncertainties described in “Cautionary Note Regarding Forward-Looking Statements,” “Item 5. Operating and Financial Review and Prospects” and “Item 3. Key Information – Risk Factors” in Arbe’s Annual Report on Form 20-F/A for the year ended December 31, 2022, which was filed with the Securities and Exchange Commission on May 16, 2023, as well as other documents filed by Arbe with the SEC. Accordingly, you are cautioned not to place undue reliance on these forward-looking statements. Forward-looking statements relate only to the date they were made, and Arbe does not undertake any obligation to update forward-looking statements to reflect events or circumstances after the date they were made except as required by law or applicable regulation. Information contained on, or that can be accessed through, Arbe’s website or any other website is expressly not incorporated by reference into, and is not a part of, this blog.