
NeuRadar: Neural Radiance Fields for Automotive Radar Point Clouds


Code to be released.

About

This is the official repository for NeuRadar: Neural Radiance Fields for Automotive Radar Point Clouds.

Abstract

Radar is an important sensor for autonomous driving (AD) systems due to its robustness to adverse weather and different lighting conditions. Novel view synthesis using neural radiance fields (NeRFs) has recently received considerable attention in AD due to its potential to enable efficient testing and validation but remains unexplored for radar point clouds. In this paper, we present NeuRadar, a NeRF-based model that jointly generates radar point clouds, camera images, and lidar point clouds. We explore set-based object detection methods such as DETR, and propose an encoder-based solution grounded in the NeRF geometry for improved generalizability. We propose both a deterministic and a probabilistic point cloud representation to accurately model the radar behavior, with the latter being able to capture radar's stochastic behavior. We achieve realistic reconstruction results for two automotive datasets, establishing a baseline for NeRF-based radar point cloud simulation models. In addition, we release radar data for ZOD's Sequences and Drives to enable further research in this field.

TODOs

  • Release code

Quickstart

1. Installation: Set up the environment

Prerequisites

Our installation steps largely follow NeuRAD. You must have an NVIDIA graphics card with CUDA installed on the system; this library has been tested with CUDA 11.8. Refer to NVIDIA's CUDA installation documentation for more information.

Create environment

The models require Python >= 3.10. We recommend using Conda to manage dependencies; make sure Conda is installed before proceeding.

conda create --name neuradar -y python=3.10
conda activate neuradar
pip install --upgrade pip

Dependencies

Install PyTorch with CUDA 11.8 and tiny-cuda-nn. cuda-toolkit is required for building tiny-cuda-nn.

pip install torch==2.0.1+cu118 torchvision==0.15.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118

conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit

# Some users need to upgrade dill before installing tiny-cuda-nn
pip install dill --upgrade
pip install --upgrade pip "setuptools<70.0"

pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
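Once the steps above finish, it can be worth confirming that the key packages actually resolved before moving on. The following is a minimal sanity-check sketch, not part of the official instructions; the package names (`torch`, `torchvision`, `tinycudann`) are assumptions based on the pip commands above, so adjust them if your environment differs.

```python
# Sanity check: report whether each expected dependency is importable.
import importlib.util

def installed(name: str) -> bool:
    """Return True if a module can be found on the current Python path."""
    return importlib.util.find_spec(name) is not None

for pkg in ("torch", "torchvision", "tinycudann"):
    print(f"{pkg}: {'OK' if installed(pkg) else 'MISSING'}")
```

If `tinycudann` is reported missing, the tiny-cuda-nn build likely failed; re-check that `cuda-toolkit` and `ninja` are available.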

We refer to NeuRAD and Nerfstudio for more installation support.

Installing neuradar

git clone [email protected]:mrafidashti/neuradar.git
cd neuradar
pip install -e .

2. Data Preparation

  1. Please create a directory ./data in the root directory of the repository.

  2. Download Zenseact Open Dataset (ZOD).

  3. Download View-of-Delft (VoD).

  4. Softlink the files into the ./data directory. The structure of the data directory should be as follows:

neuradar
    ├──data
    │   ├── zod
    │   │  ├── trainval-sequences-full.json
    │   │  ├── auto_annotations
    │   │  └── sequences
    │   ├── vod
    │   │  ├── lidar
    │   │  └── radar
    └──...

Notes:

  • Only the necessary subdirectories are shown in the structure above.
  • The full ZOD sequences occupy 1.5 TB. To save space, you can keep only the sequences you’re interested in.
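Since a wrong symlink layout is easy to miss, the tree above can be checked mechanically before training. The following is a minimal sketch and an assumption on our part, not an official script; the expected paths are taken directly from the directory tree shown above.

```python
# Verify that the ./data layout matches the expected structure.
from pathlib import Path

EXPECTED = [
    "data/zod/trainval-sequences-full.json",
    "data/zod/auto_annotations",
    "data/zod/sequences",
    "data/vod/lidar",
    "data/vod/radar",
]

def missing_entries(root: str) -> list[str]:
    """Return the expected entries that are absent under the repo root."""
    return [p for p in EXPECTED if not (Path(root) / p).exists()]

if __name__ == "__main__":
    problems = missing_entries(".")
    if problems:
        print("Missing:", *problems, sep="\n  ")
    else:
        print("Data layout looks good.")
```

Run it from the repository root; only the subdirectories listed here are checked, since the tree above shows only the necessary ones.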

3. Train Model

Training is done as in nerfstudio and neurad-studio. You can simply run:

python nerfstudio/scripts/train.py neuradar zod-data

Citation

@inproceedings{rafidashti2025neuradar,
  title={NeuRadar: Neural Radiance Fields for Automotive Radar Point Clouds},
  author={Rafidashti, Mahan and Lan, Ji and Fatemi, Maryam and Fu, Junsheng and Hammarstrand, Lars and Svensson, Lennart},
  booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
  pages={2488--2498},
  year={2025}
}
