Sensor Fusion

What Is Sensor Fusion?

Sensor fusion combines inputs from multiple sources to build a more accurate model of the world. The sources can be of the same type, such as two radars or multiple cameras, or of different types. Sensor fusion is used for positioning and navigation, calculating altitude in aviation, measuring traffic volume, automatic vehicle safety features, and autonomous driving systems in Software-Defined Vehicles.
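
As a toy illustration of why combining sources improves accuracy, the sketch below fuses two noisy range readings of the same object by inverse-variance weighting; the fused estimate has lower variance than either reading alone. The numbers are invented for illustration and are not from the guide.

```python
def fuse_two_ranges(r1, var1, r2, var2):
    """Return the inverse-variance weighted range and its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * r1 + w2 * r2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example: two radars report 25.3 m and 24.8 m with different noise levels.
# The fused variance is smaller than either sensor's variance on its own.
print(fuse_two_ranges(25.3, 0.5**2, 24.8, 0.3**2))
```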

Levels of Sensor Fusion

The data from sensors can be combined in multiple ways. These are usually defined in terms of Levels, according to a model initially developed by the U.S. Joint Directors of Laboratories (JDL) Data Fusion Group.

Level 0

Raw data is aligned between sensors at the pixel or signal level.

Level 1

Entities or objects detected or inferred are assessed against observations from other sensors.

Level 2

Relationships between classified and identified objects are established, creating situations that can then be assessed against situations derived from other sensors.

Level 3

The relative impacts or threats of detected situations are assessed, including future risks. These can then be compared against impact assessments from other sensors.

Level 4

Processes are refined to support Levels 0-3 by managing sensor resources adaptively to improve results. This is increasingly performed using AI/ML.

Level 5

User refinement adds another layer of improvement supported by users.

Sensor fusion levels can also be classified by the information supplied to the algorithms interpreting the inputs: the raw sensor data, features computed from each sensor, or decisions based on those computed features, such as route planning or steering adjustments for lane keep assist. These broadly parallel the levels defined by the JDL model.
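
The hypothetical sketch below shows one way the raw-, feature-, and decision-level distinction can look in code. All data structures, field names, and the voting rule are assumptions made purely for illustration, not any particular system's design.

```python
import numpy as np

def raw_level_fusion(camera_image, lidar_depth_map):
    # Raw/low-level: align and stack per-pixel data (e.g. an HxWx3 image and
    # an HxW depth map) before any detection runs on the combined data.
    return np.dstack([camera_image, lidar_depth_map])

def feature_level_fusion(camera_boxes, radar_tracks):
    # Feature/object-level: each sensor first extracts objects; matched
    # objects are merged into a single fused track list.
    fused = []
    for box in camera_boxes:
        match = min(radar_tracks, key=lambda t: abs(t["range"] - box["range"]))
        fused.append({**box, "speed": match["speed"]})
    return fused

def decision_level_fusion(camera_says_brake, radar_says_brake):
    # Decision/high-level: each pipeline makes its own call; fusion arbitrates,
    # here with a simple agreement vote to suppress false alarms.
    return camera_says_brake and radar_says_brake
```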

Sensor Fusion for Automotive

Sensor fusion is employed in several areas of automotive services. Even a GPS navigation system can blend satellite data with other sensors, such as the inertial measurement unit (IMU), which measures specific force, angular rate, and orientation. Advanced driver-assistance systems (ADAS) are the most ubiquitous use of sensor fusion. A front radar is most commonly employed to measure the distance to the car ahead for adaptive cruise control, but LiDAR and cameras can augment this. Cameras also enable lane-keeping assistance, and ultrasonic sensors assist low-speed emergency braking.
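
As a much-simplified, one-dimensional sketch of the GPS/IMU blending mentioned above: the IMU dead-reckons position between GPS fixes, and each fix corrects the drift with a Kalman-style update. Real navigation systems filter position, velocity, and orientation in three dimensions; the single axis and the noise values here are assumptions for illustration only.

```python
def predict(pos, vel, var, accel, dt, accel_var):
    """Propagate position using IMU acceleration (dead reckoning)."""
    pos = pos + vel * dt + 0.5 * accel * dt**2
    vel = vel + accel * dt
    var = var + accel_var * dt**2      # uncertainty grows between GPS fixes
    return pos, vel, var

def update(pos, var, gps_pos, gps_var):
    """Correct the prediction with a GPS fix (Kalman-style blend)."""
    gain = var / (var + gps_var)
    pos = pos + gain * (gps_pos - pos)
    var = (1.0 - gain) * var
    return pos, var

# One cycle: dead-reckon for 0.1 s at 0.2 m/s^2, then correct with a GPS fix.
pos, vel, var = predict(pos=100.0, vel=15.0, var=1.0, accel=0.2, dt=0.1, accel_var=0.5)
pos, var = update(pos, var, gps_pos=101.6, gps_var=4.0)
print(pos, var)
```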

Beyond ADAS, the emerging domain of autonomous driving systems is a key area where sensor fusion will be essential. To deliver safe self-driving, these systems must combine inputs from radar, cameras, LiDAR, and GPS to recognize and navigate their surroundings. Their driving algorithms evolve using immense amounts of data collected from many hours of live testing, combined with live information processed at the edge as the vehicle drives. New sensor data is continually sent back to the cloud to be fused with historical data, Level 4 AI/ML process refinements, and Level 5 user inputs, so autonomous driving systems constantly improve with use, just as human drivers gain experience.

Benefits of Sensor Fusion

Different types of sensors have different strengths and weaknesses. Radar works well in all kinds of weather but can be slower than other systems and has trouble distinguishing closely spaced objects from one another. LiDAR is excellent for detecting objects in three dimensions and has accurate range but isn’t tolerant of heavy rain, snow, or fog. Cameras can classify objects, detect angular position, and provide a holistic view of the scene but are also susceptible to weather, lighting levels, and dirt on the lens. Ultrasonic sensors are cheap but have a very short range.

Combining inputs from multiple sensors of the same type can improve accuracy. Combining multiple types can play to the strengths of each while compensating for its weaknesses. A sophisticated algorithm can use the data from each sensor to prevent false detections and avoid missing important objects. This is crucial in automotive safety: an automatic braking system that fails to spot a pedestrian or cyclist can cause a fatal accident, and adaptive cruise control that fails in poor weather would be dangerous to rely on. Sensor fusion therefore has a vital role in improving vehicle safety.
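
One simple way a fusion layer can trade missed detections against false alarms is to combine per-sensor confidences before acting, as in the sketch below. The weights and threshold are invented for illustration; production systems use far more sophisticated probabilistic or learned association.

```python
def pedestrian_detected(camera_conf, radar_conf, lidar_conf,
                        weights=(0.5, 0.25, 0.25), threshold=0.6):
    """Weighted vote across sensors: a single noisy sensor cannot trigger
    braking alone, but agreement between sensors can."""
    score = (weights[0] * camera_conf
             + weights[1] * radar_conf
             + weights[2] * lidar_conf)
    return score >= threshold

# Camera confidence drops in low light, but radar and LiDAR agree
# something is there, so the fused decision is still to react.
print(pedestrian_detected(camera_conf=0.4, radar_conf=0.9, lidar_conf=0.8))
```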

Interior and Exterior Sensor Fusion

Most of the focus of sensor fusion has been on exterior sensors such as radar, cameras, LiDAR, and ultrasonics. But automobiles now also take data from sensors inside the car to build even more sophisticated systems. These include sensors on the steering wheel that detect whether the driver has their hands on the wheel, which Level 2 autonomous systems require, and a cabin camera that can detect whether the driver has their eyes on the road or is drowsy.

Combining interior and exterior sensors can add another layer of safety precautions. For example, if the driver is inattentive to the road, the external systems could become more sensitive and react immediately to threats. The car can also warn you if you are looking down at the infotainment screen and something outside needs attention, such as a car unexpectedly changing lanes in front. And if you’re stationary and about to make a turn, interior and exterior sensor fusion could warn you of a pedestrian or cyclist appearing in the direction you are not looking. The result is even safer driving, thanks to sensor fusion.
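
The illustrative-only sketch below shows one way an interior driver-monitoring signal could adjust how an exterior threat is handled, reacting earlier and more forcefully when the driver is inattentive. The data classes, time-to-collision thresholds, and action names are assumptions, not any production system's logic.

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    eyes_on_road: bool
    hands_on_wheel: bool

@dataclass
class ExteriorThreat:
    time_to_collision_s: float   # from fused radar/camera/LiDAR tracking

def react(cabin: CabinState, threat: ExteriorThreat) -> str:
    # If the driver is inattentive, act earlier and escalate from a warning
    # to automatic braking.
    ttc_limit = 2.5 if cabin.eyes_on_road else 4.0
    if threat.time_to_collision_s < ttc_limit:
        return "warn" if cabin.eyes_on_road else "brake"
    return "monitor"

print(react(CabinState(eyes_on_road=False, hands_on_wheel=True),
            ExteriorThreat(time_to_collision_s=3.0)))  # -> "brake"
```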

The BlackBerry IVY® platform leverages BlackBerry® QNX®, edge computing, and the cloud to support a future-proof digital ecosystem. It gives developers and automakers a secure, reliable way to share vehicle data, deliver new features and functionality, and fuel both present and future innovation. Backed by BlackBerry expertise, it’s compatible with most OS and cloud platforms, offering advanced personalization and access to our broad development community.
