Software & Data Downloads — Gear-NeRF

Gear extensions of Neural Radiance Fields (NeRFs) for novel-view synthesis, and for tracking any object in the scene in the novel view using prompts such as mouse clicks.

This repository contains the implementation of Gear-NeRF, an approach for novel-view synthesis and for tracking any object in the scene in the novel view using prompts such as mouse clicks, described in the paper:

Xinhang Liu, Yu-Wing Tai, Chi-Keung Tang, Pedro Miraldo, Suhas Lohit, Moitreya Chatterjee, "Gear-NeRF: Free-Viewpoint Rendering and Tracking with Motion-aware Spatio-Temporal Sampling", which appeared in the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024 (Highlight).

    •  Liu, X., Tai, Y.-W., Tang, C.-K., Miraldo, P., Lohit, S., Chatterjee, M., "Gear-NeRF: Free-Viewpoint Rendering and Tracking with Motion-aware Spatio-Temporal Sampling", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), May 2024, pp. 19667-19679.
      @inproceedings{Liu2024may,
        author = {Liu, Xinhang and Tai, Yu-Wing and Tang, Chi-Keung and Miraldo, Pedro and Lohit, Suhas and Chatterjee, Moitreya},
        title = {Gear-NeRF: Free-Viewpoint Rendering and Tracking with Motion-aware Spatio-Temporal Sampling},
        booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
        year = {2024},
        pages = {19667--19679},
        month = may,
        publisher = {IEEE},
        url = {https://www.merl.com/publications/TR2024-042}
      }

    Access the software at https://github.com/merlresearch/Gear-NeRF.