sudo apt-get install libopenexr-dev
conda env create -f environment.yaml
conda activate enerf-slam
The data can be generated by a simulator. Then organize the data as follows:
Datasets/synthetic01
|---results
| |----depth000000.png
| |----depth000001.png
| ...
| ...
| |----frame000000.png
| |----frame000001.png
| ...
| ...
|---traj.txt
Note: The format of traj.txt is consistent with that of the Replica dataset.
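Since traj.txt follows the Replica convention, each line holds one camera-to-world pose as 16 space-separated floats (a 4x4 matrix flattened row-major). A minimal parsing sketch, assuming that line layout:

```python
import numpy as np

def load_replica_traj(path):
    """Parse a Replica-style traj.txt: one pose per line,
    16 floats forming a row-major 4x4 camera-to-world matrix."""
    poses = []
    with open(path) as f:
        for line in f:
            vals = [float(v) for v in line.split()]
            poses.append(np.array(vals).reshape(4, 4))
    return poses
```

The function name is illustrative; any loader that reshapes each line into a 4x4 matrix is equivalent.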
Next, run ENeRF-SLAM:
python -W ignore run.py configs/Syn/syn1.yaml
The data can be found here. Then organize the data as follows:
Datasets/hamlyn
|---results
| |----depth000000.png
| |----depth000001.png
| ...
| ...
| |----frame000000.png
| |----frame000001.png
| ...
| ...
Note: There is no ground-truth trajectory for the Hamlyn dataset.
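Because there is no ground-truth trajectory, the only dataset-level sanity check is that every color frame has a matching depth image. A small sketch based on the results/ naming in the tree above (the helper name is hypothetical):

```python
from pathlib import Path

def check_pairs(results_dir):
    """Return the six-digit indices of frameXXXXXX.png files
    in results/ that lack a matching depthXXXXXX.png."""
    results = Path(results_dir)
    frame_ids = {p.stem[len("frame"):] for p in results.glob("frame*.png")}
    depth_ids = {p.stem[len("depth"):] for p in results.glob("depth*.png")}
    return sorted(frame_ids - depth_ids)  # empty list means all frames are paired
```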
Next, run ENeRF-SLAM:
python -W ignore run.py configs/Hamlyn/rectified01.yaml
The data can be found here. Then organize the data as follows:
Datasets/desc_t4_a(trans_t2_b)
|---color_undistorted
| |----0000_color.png
| |----0001_color.png
| ...
| ...
|---depth_undistorted
| |----0000_depth.tiff
| |----0001_depth.tiff
| ...
| ...
|---pose.txt
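C3VD-style scenes split color and depth into separate folders with index-prefixed names. A minimal sketch that collects matched (color, depth) pairs, assuming only the folder and file naming shown in the tree above (the function name is illustrative):

```python
from pathlib import Path

def list_c3vd_frames(root):
    """Match NNNN_color.png in color_undistorted/ with
    NNNN_depth.tiff in depth_undistorted/ by shared index."""
    root = Path(root)
    pairs = []
    for color in sorted((root / "color_undistorted").glob("*_color.png")):
        idx = color.name.split("_")[0]
        depth = root / "depth_undistorted" / f"{idx}_depth.tiff"
        if depth.exists():
            pairs.append((color, depth))
    return pairs
```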
Next, run ENeRF-SLAM:
python -W ignore run.py configs/C3VD/trans2b.yaml
Our codebase is built on NICE-SLAM and Endo-Depth-and-Motion. We are grateful to their authors for sharing their code publicly; their contributions were instrumental in making this work possible.
If you find our code or paper useful, please cite:
@article{shan2024enerf,
  title={ENeRF-SLAM: A Dense Endoscopic SLAM With Neural Implicit Representation},
  author={Shan, Jiwei and Li, Yirui and Xie, Ting and Wang, Hesheng},
  journal={IEEE Transactions on Medical Robotics and Bionics},
  year={2024},
  publisher={IEEE}
}