EyeDAR: A Low-Power mmWave Tag that Senses and Communicates 3D Point Clouds to Enhance Radar Perception

Live demo at the research showcase in Washington DC (with the best team!)

Abstract

Autonomous vehicles rely heavily on vision-based sensors that struggle under poor visibility, harsh weather, and beyond line of sight. While mmWave radars work reliably in these conditions, they suffer from extremely sparse point clouds due to mirror-like reflections, where most signals bounce away rather than returning to the radar. We present EyeDAR, a low-power mmWave tag deployed as roadside infrastructure that captures these lost reflections. The tag extracts the directions of arrival and sends this information back to the radar, providing additional point clouds that enhance radar perception. Like the human eye, which uses a lens to map light angles onto different photoreceptors, EyeDAR uses a Luneburg lens to optically map arrival angles onto different antennas, replacing O(N³) direction-finding algorithms with O(N) detection. Combined with backscatter communication, the system operates at low power without power-hungry RF components. Our early prototype achieves 5.4° effective angular resolution with >15 dB passive gain at $7 fabrication cost. We experimentally demonstrate a direction-of-arrival estimation error of −0.2° ± 1.8° with a commercial 24 GHz radar.
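To give a feel for why the lens reduces direction finding to an O(N) scan, here is a minimal illustrative sketch (not from the paper): because the Luneburg lens focuses each arrival angle onto a distinct antenna, the tag can estimate the direction of arrival by finding the strongest antenna and looking up its calibrated angle, instead of running an eigen-decomposition-based search such as MUSIC. The antenna count, calibration table, and interpolation step below are hypothetical.

```python
import numpy as np

# Hypothetical calibration: each of the N antennas behind the lens
# corresponds to a known arrival angle (degrees). Spacing roughly
# matches the ~5.4 deg effective resolution reported above.
CALIBRATED_ANGLES_DEG = np.linspace(-60, 60, 23)

def estimate_doa(antenna_powers: np.ndarray) -> float:
    """Estimate direction of arrival from per-antenna received power.

    A single linear scan (argmax) over the N antennas replaces an
    O(N^3) search; a power-weighted average over the peak antenna and
    its neighbors gives sub-beamwidth refinement.
    """
    peak = int(np.argmax(antenna_powers))               # O(N) scan
    lo, hi = max(peak - 1, 0), min(peak + 2, len(antenna_powers))
    return float(np.average(CALIBRATED_ANGLES_DEG[lo:hi],
                            weights=antenna_powers[lo:hi]))

# Example: a source near +10 degrees illuminates mostly one antenna.
powers = np.exp(-0.5 * ((CALIBRATED_ANGLES_DEG - 10.0) / 4.0) ** 2)
powers += 0.02 * np.random.default_rng(0).random(powers.shape)  # noise
print(f"Estimated DoA: {estimate_doa(powers):.1f} deg")
```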

Publication
In the Proceedings of the 27th International Workshop on Mobile Computing Systems and Applications (HotMobile 26)
Kun Woo Cho
Email: kc188@rice.edu

My research interests include wireless systems, metamaterials, and AI in wireless.