Robust Multiobject Tracking Using Mmwave Radar-Camera Sensor Fusion

Arindam Sengupta, Lei Cheng, Siyang Cao

Research output: Contribution to journal › Article › peer-review



With the recent growth of the autonomous driving and automotive industries, sensor-fusion-based perception has garnered significant attention for multiobject classification and tracking applications. Furthering our previous work on sensor-fusion-based multiobject classification, this letter presents a robust tracking framework using high-level fusion of a monocular camera and a millimeter-wave radar. The proposed method improves localization accuracy by combining the radar's depth resolution with the camera's cross-range resolution through decision-level sensor fusion, and remains robust by continuously tracking objects despite single-sensor failures using a tri-Kalman filter setup. The camera's intrinsic calibration parameters and the height of the sensor placement are used to estimate a bird's-eye view of the scene, which in turn aids in estimating the 2-D positions of the targets from the camera. The radar and camera measurements in a given frame are associated using the Hungarian algorithm. Finally, a tri-Kalman filter-based framework is used for tracking. The proposed approach offers promising MOTA and MOTP metrics, including significantly low missed-detection rates, which could aid large-scale and small-scale autonomous or robotics applications with safe perception.
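The bird's-eye-view step described in the abstract can be illustrated with a minimal ground-plane back-projection. This is a generic sketch, not the paper's implementation: it assumes the camera's optical axis is parallel to the ground, the target point lies on the ground plane, and a pinhole model with intrinsics `K`; the function name `pixel_to_ground` is hypothetical.

```python
import numpy as np

def pixel_to_ground(u, v, K, cam_height):
    """Project an image pixel onto the ground plane (bird's-eye view).

    Assumes the camera's optical axis is parallel to the ground and the
    target point lies on the ground plane, with y pointing down in the
    camera frame. Returns (depth, cross-range) in meters.
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    dy = (v - cy) / fy              # downward ray slope per unit depth
    if dy <= 0:
        raise ValueError("pixel is at or above the horizon")
    depth = cam_height / dy         # forward distance where ray hits ground
    cross = depth * (u - cx) / fx   # lateral (cross-range) offset
    return depth, cross
```

Under these assumptions, a pixel below the principal point maps to a unique ground-plane position, giving the camera a 2-D position estimate comparable with the radar's.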
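The Hungarian-algorithm association step can be sketched with SciPy's `linear_sum_assignment` on a Euclidean-distance cost matrix. The gating threshold and the helper name `associate` are illustrative assumptions, not details from the letter.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(radar_xy, camera_xy, gate=2.0):
    """Match radar and camera detections by 2-D position.

    Cost is pairwise Euclidean distance; the Hungarian algorithm finds
    the minimum-cost assignment, and pairs farther apart than `gate`
    (meters) are rejected afterward.
    """
    cost = np.linalg.norm(radar_xy[:, None, :] - camera_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]
    unmatched_radar = set(range(len(radar_xy))) - {r for r, _ in matches}
    unmatched_camera = set(range(len(camera_xy))) - {c for _, c in matches}
    return matches, unmatched_radar, unmatched_camera
```

Unmatched detections are exactly the single-sensor cases the robust-tracking design must handle, since either sensor may miss an object in a given frame.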
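The tracking stage builds on standard Kalman filtering; a minimal constant-velocity filter for a single track is sketched below. This is a generic textbook filter, not the paper's tri-Kalman design, and the noise covariances are placeholder tuning guesses. The key property the fusion framework relies on is that `update()` can simply be skipped when a sensor fails, letting the track coast on prediction.

```python
import numpy as np

class CVKalman:
    """Constant-velocity Kalman filter for one track (state: x, y, vx, vy)."""

    def __init__(self, x0, y0, dt=0.1):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)                  # state covariance
        self.F = np.eye(4)                  # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)               # we measure position only
        self.Q = 0.01 * np.eye(4)           # process noise (placeholder)
        self.R = 0.1 * np.eye(2)            # measurement noise (placeholder)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = np.asarray(z) - self.H @ self.x          # innovation
        S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

In a tri-filter arrangement, separate filters can consume radar-only, camera-only, and fused measurements, so a single sensor failure leaves at least one filter still receiving updates.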

Original language: English (US)
Article number: 5501304
Journal: IEEE Sensors Letters
Issue number: 10
State: Published - Oct 1 2022


Keywords

  • Kalman filter
  • millimeter-wave (mmWave) radar
  • perception
  • Sensor applications
  • Sensor systems
  • sensor-fusion
  • tracking

ASJC Scopus subject areas

  • Instrumentation
  • Electrical and Electronic Engineering


