Sensor Fusion

Sensor Fusion for ADAS, Industry 4.0, and Smart Home

ADAS Sensor Fusion for Safety and Redundancy
  • Advanced solutions to perception challenges require a multisensor approach
  • Complementary data from different sensors provide system redundancy; the goal is to turn this data into meaningful, consistent input for safe decision making
  • NOVELIC models deep-learning algorithms that serve as input for training neural networks and for deciding when to fuse data

Why Choose NOVELIC

Multi-sensor fusion provides a decisive advantage over single-sensor algorithms, enabling the development of safe and secure systems. Our expertise in different sensor technologies, together with numerous completed embedded projects involving advanced algorithm development, allows us to offer our clients high accuracy, reliability, and precision.

Major ADAS Sensors vs. Human Performance Comparison

Image: Comparison of major ADAS sensor technologies and a human

Perception Software

Our main area of interest is merging the most relevant sensor technologies, such as camera, LiDAR, radar, and inertial sensors, for research and development in ADAS, Industry 4.0, and smart home applications. Broad experience with inertial, image, radar, and other automotive and smart-electronics sensors has given us thorough knowledge of their advantages and limitations. As a perception company, NOVELIC is competent in advanced algorithms for object detection and in the fusion of data from different sensors.

Sensor Calibration, Synchronization, and Deep Learning Fusion

Precisely calibrated and synchronized sensors are a precondition for effective sensor fusion. As a next step for the mobility industry, we enable informed decisions by executing advanced sensor fusion at the edge (in-vehicle processing) on a centralized, multi-core processing platform.
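To make the synchronization step concrete, here is a minimal Python sketch (not NOVELIC's implementation): it assumes each sensor stream carries per-sample timestamps and resamples one stream onto the timebase of another by linear interpolation. The function name and the sample rates in the example are illustrative.

```python
import numpy as np

def resample_to_common_timebase(t_ref, t_sensor, samples):
    """Linearly interpolate a sensor stream onto a reference timebase.

    t_ref    -- timestamps of the reference sensor (e.g. camera frames), seconds
    t_sensor -- timestamps of the stream to be aligned (e.g. IMU), seconds
    samples  -- sensor samples, shape (len(t_sensor), n_channels) or (len(t_sensor),)
    """
    samples = np.asarray(samples, dtype=float)
    if samples.ndim == 1:
        samples = samples[:, None]
    # Interpolate every channel independently onto the reference timestamps.
    aligned = np.column_stack(
        [np.interp(t_ref, t_sensor, samples[:, ch]) for ch in range(samples.shape[1])]
    )
    return aligned

# Example: align a 100 Hz IMU stream to 30 Hz camera frame timestamps.
t_cam = np.arange(0.0, 1.0, 1 / 30)
t_imu = np.arange(0.0, 1.0, 1 / 100)
imu = np.column_stack([np.sin(t_imu), np.cos(t_imu), 0.01 * t_imu])  # placeholder gyro axes
imu_at_cam_times = resample_to_common_timebase(t_cam, t_imu, imu)
print(imu_at_cam_times.shape)  # (30, 3)
```

A real system would also compensate for per-sensor latency and clock drift, but the principle of bringing all measurements onto one timebase before fusion is the same.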

Strategies for sensor fusion:

Late fusion allows for interoperable solutions, while early fusion gives the AI richer data for its predictions; leveraging the complementary strengths of both strategies is our key advantage. The modern approach synchronizes all onboard sensors in time and space before feeding the synchronized data to a neural network for prediction. A possible scenario involves recording road data from multiple sensors for road tracking, object detection and classification, movement prediction, and so on. This data is then used for AI training or for Software-in-the-Loop (SIL) testing of a real-time algorithm that receives only a limited subset of the information (e.g. camera and maps).
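As an illustration of the late-fusion path, the sketch below associates independent camera and radar detections with a simple distance gate and merges matched pairs by confidence-weighted averaging. The Detection structure, the gate value, and the greedy association rule are hypothetical simplifications, not a description of NOVELIC's tracker.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float           # longitudinal position, metres
    y: float           # lateral position, metres
    confidence: float  # detection confidence in [0, 1]

def late_fuse(camera_dets, radar_dets, gate=2.0):
    """Associate camera and radar detections within a distance gate and
    merge each matched pair by confidence-weighted averaging.
    Unmatched detections are passed through unchanged (redundancy)."""
    fused, used_radar = [], set()
    for c in camera_dets:
        # Find the nearest unused radar detection inside the association gate.
        best, best_d = None, gate
        for i, r in enumerate(radar_dets):
            d = ((c.x - r.x) ** 2 + (c.y - r.y) ** 2) ** 0.5
            if i not in used_radar and d < best_d:
                best, best_d = i, d
        if best is None:
            fused.append(c)
            continue
        r = radar_dets[best]
        used_radar.add(best)
        w = c.confidence + r.confidence
        fused.append(Detection(
            x=(c.confidence * c.x + r.confidence * r.x) / w,
            y=(c.confidence * c.y + r.confidence * r.y) / w,
            confidence=min(1.0, w),
        ))
    fused += [r for i, r in enumerate(radar_dets) if i not in used_radar]
    return fused

# Example: one object seen by both sensors, one seen only by radar.
cam = [Detection(20.0, 1.2, 0.8)]
rad = [Detection(20.5, 1.0, 0.6), Detection(45.0, -3.0, 0.7)]
print(late_fuse(cam, rad))
```

Late fusion keeps the per-sensor pipelines independent, which is what provides the redundancy mentioned above: an object missed by one sensor can still be reported by the other. Early fusion, by contrast, would hand the raw or low-level synchronized data of both sensors to a single network.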

Sensor Fusion and Deep Learning Algorithm Development

Project Example of Sensor Fusion Design for Automated Guided Vehicles

  • Our team has developed sensor fusion algorithms for IMU sensors (accelerometer, gyroscope, and magnetometer) that estimate the vehicle's ego-motion. The algorithms are based on Kalman filtering, Bayesian filtering, and convolutional neural networks.
  • We also developed a synthetic trajectory generator (both elements are illustrated in the sketch after this list).
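The sketch below combines the two items above in the simplest possible form: a synthetic heading trajectory drives a scalar Kalman filter that integrates a biased, noisy gyro and corrects it with a noisy magnetometer heading. The noise levels, the constant gyro bias, and the one-dimensional state are illustrative assumptions, not the project's actual filter design.

```python
import numpy as np

def synthetic_heading_trajectory(n=500, dt=0.01, seed=0):
    """Generate a smooth ground-truth heading plus noisy gyro and
    magnetometer measurements (hypothetical noise levels)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) * dt
    true_heading = 0.5 * np.sin(0.5 * t)                   # rad
    true_rate = np.gradient(true_heading, dt)               # rad/s
    gyro = true_rate + rng.normal(0.0, 0.02, n) + 0.01      # rate noise + constant bias
    mag_heading = true_heading + rng.normal(0.0, 0.05, n)   # noisy absolute heading
    return t, true_heading, gyro, mag_heading

def kalman_heading_fusion(gyro, mag_heading, dt=0.01,
                          q_gyro=0.02 ** 2, r_mag=0.05 ** 2):
    """Scalar Kalman filter: predict heading by integrating the gyro,
    correct the prediction with the magnetometer heading."""
    x, p = mag_heading[0], 1.0
    estimates = []
    for w, z in zip(gyro, mag_heading):
        # Predict: integrate the angular rate and inflate the uncertainty.
        x += w * dt
        p += q_gyro * dt
        # Update: blend in the absolute (but noisy) magnetometer heading.
        k = p / (p + r_mag)
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)

t, truth, gyro, mag = synthetic_heading_trajectory()
est = kalman_heading_fusion(gyro, mag)
print("RMS heading error [rad]:", np.sqrt(np.mean((est - truth) ** 2)))
```

A production filter would carry the full 3-D orientation and the sensor biases in its state, but the structure (gyro-driven prediction, correction from an absolute sensor) stays the same.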
IMU Sensor Fusion Design for Automated Guided Vehicles

The NOVELIC team has executed all major development steps:

  • Design of Sensor Fusion Algorithms for Navigation
  • Design of Software Architecture
  • Software implementation on the microcontroller
  • Design of Hardware Architecture
Designing Algorithms for Navigation

Brief Overview

  • Multisensor Fusion
  • Deep Learning Algorithms
  • Sensor Calibration
  • Sensor Synchronization