Sensor fusion is a popular technique in embedded systems in which you combine data from multiple sensors to build a more complete or accurate view of the world around your device. This might mean using several sensors of the same type, such as two separate cameras that together produce a stereoscopic image for estimating depth. Alternatively, you might combine information from different sensor types. For example, an inertial measurement unit (IMU) can estimate absolute orientation by fusing data from an accelerometer, gyroscope, and magnetometer.
This is a companion discussion topic for the original entry at https://www.edgeimpulse.com/blog/sensor-fusion-with-machine-learning-on-edge-impulse