Software & Data Downloads — MMVR

Millimeter-wave Multi-View Radar (MMVR) dataset for radar-assisted indoor perception, supporting indoor vehicle/robot/humanoid navigation, building energy management, and elderly care with better efficiency, safety, and user experience.

Compared with the extensive range of automotive radar datasets supporting autonomous driving, indoor radar datasets are scarce and much smaller in scale, typically provided as low-resolution radar point clouds and usually collected in an open-space, single-room setting. In this paper, we aim to scale up indoor radar data collection using large-scale, multi-view, high-resolution heatmaps in a multi-day, multi-room, and multi-subject setting. The resulting millimeter-wave multi-view radar (MMVR) dataset consists of $345$K multi-view radar heatmap frames collected from $22$ human subjects over $6$ different rooms (e.g., open/cluttered offices and meeting rooms). Each pair of horizontal and vertical radar frames is synchronized with RGB image-plane annotations: bounding boxes, keypoints, and pixel-level occupancy, supporting the three major perception tasks of object detection, pose estimation, and segmentation, respectively. For each task, we report performance benchmarks under two protocols, a single subject in an open space and multiple subjects in several cluttered rooms, with two data splits, random and cross-environment, over $395$ 1-min data sequences. We anticipate that MMVR will facilitate radar-assisted indoor perception development for indoor vehicle/robot/humanoid navigation, building energy management, and elderly care with better efficiency, safety, and user experience.


Access data at https://doi.org/10.5281/zenodo.12611978.
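After downloading, each sample pairs a horizontal and a vertical radar heatmap with image-plane annotations (bounding boxes, keypoints, pixel-level occupancy). The sketch below shows one hypothetical way to bundle such a synchronized frame in Python; the heatmap resolutions, keypoint count, and field names here are illustrative assumptions, not the dataset's actual on-disk schema, which is documented in the Zenodo record.

```python
import numpy as np

# Assumed heatmap sizes for illustration only; consult the Zenodo
# record for the dataset's actual radar heatmap resolution.
H_SHAPE = (256, 64)  # horizontal-view heatmap (hypothetical size)
V_SHAPE = (256, 64)  # vertical-view heatmap (hypothetical size)

def make_synced_frame(hor, ver, annotations):
    """Bundle one horizontal/vertical heatmap pair with the RGB
    image-plane annotations described for MMVR: bounding boxes
    (detection), keypoints (pose estimation), and a pixel-level
    occupancy mask (segmentation)."""
    hor = np.asarray(hor, dtype=np.float32)
    ver = np.asarray(ver, dtype=np.float32)
    assert hor.shape == H_SHAPE and ver.shape == V_SHAPE
    return {
        "radar_horizontal": hor,
        "radar_vertical": ver,
        # (N, 4) boxes in the RGB image plane, N subjects per frame
        "bboxes": np.asarray(annotations["bboxes"], dtype=np.float32),
        # (N, K, 2) image-plane keypoints; K=17 is a COCO-style assumption
        "keypoints": np.asarray(annotations["keypoints"], dtype=np.float32),
        # (H_img, W_img) boolean pixel-level occupancy mask
        "occupancy": np.asarray(annotations["occupancy"], dtype=bool),
    }

# Synthetic single-subject example (random data, for shape checks only)
frame = make_synced_frame(
    np.random.rand(*H_SHAPE),
    np.random.rand(*V_SHAPE),
    {
        "bboxes": [[10, 20, 50, 120]],
        "keypoints": [[[30.0, 40.0]] * 17],
        "occupancy": np.zeros((480, 640), dtype=bool),
    },
)
print(frame["radar_horizontal"].shape, frame["bboxes"].shape)
```

A structure like this keeps the two radar views and all three annotation types addressable per frame, which mirrors how the benchmarks treat detection, pose estimation, and segmentation as tasks over the same synchronized data.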