# [ECCV 24] Boost Your NeRF: A Model-Agnostic Mixture of Experts Framework for High Quality and Efficient Rendering

Welcome to the **official code repository** for our paper _"Boost Your NeRF: A Model-Agnostic Mixture of Experts Framework for High Quality and Efficient Rendering"_, accepted at **ECCV 2024**.

Our framework offers **state-of-the-art performance** by combining multiple experts in a model-agnostic manner to enhance the quality and efficiency of NeRF-based rendering.

![Teaser Image 1](method.png)
<!-- ![Teaser Image 2](path_to_image2) -->

---

## Project Page
Explore the full project details and supplementary materials on our [**Project Page**](https://eidoslab.github.io/boost-your-nerf/).

---

## Datasets
- Tested datasets (an example download snippet follows the list):
  - *Bounded inward-facing*: [NeRF](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1), [NSVF](https://dl.fbaipublicfiles.com/nsvf/dataset/Synthetic_NSVF.zip), [T&T (masked)](https://dl.fbaipublicfiles.com/nsvf/dataset/TanksAndTemple.zip).
  - *Forward-facing*: [LLFF](https://drive.google.com/drive/folders/14boI-o5hGO9srnWaaogTU5_ji7wkX2S7).
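
For example, the Synthetic-NSVF scenes can be fetched directly from the link above. The snippet below is a minimal sketch: it assumes `wget` and `unzip` are available and uses a `data/` folder only as an example location; the Google Drive datasets are easier to download from a browser.

```bash
# Sketch: download and unpack Synthetic-NSVF (URL taken from the list above).
# The data/ folder is only an example location; adjust to your setup.
mkdir -p data
wget -P data https://dl.fbaipublicfiles.com/nsvf/dataset/Synthetic_NSVF.zip
unzip data/Synthetic_NSVF.zip -d data/
```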

---
## Quickstart

We provide our implementation for three main Fast-NeRF architectures: **DVGO**, **TensoRF**, and **Instant-NGP**. So far the code is available for DVGO; we will update the repository soon with the other methods.

Our method works under the same configuration as the original implementation of [**DVGO**](https://github.com/sunset1995/DirectVoxGO). So, the first step is to clone it from there and install all the dependencies:

```bash
git clone git@github.com:sunset1995/DirectVoxGO.git
cd DirectVoxGO
pip install -r requirements.txt
```

Then, clone our repository and **copy** the following files into the root directory of DVGO (a sketch of this step follows the list):

- `train_single_model.py`: A wrapper around the original DVGO training script that trains multiple models at different resolutions.
- `run_dvgo.py`: The main DVGO file with minor changes.
- `moe.py`: Contains all the logic behind our Sparse Mixture-of-Experts framework.
- `moe_main.py`: The main entry point of our method.
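
A minimal sketch of the clone-and-copy step, assuming our repository is checked out as `boost-your-nerf` next to `DirectVoxGO` (`<our-repo-url>` is a placeholder for this repository's clone URL; adjust the paths to your setup):

```bash
# Sketch only: <our-repo-url> is a placeholder, and the layout assumes both
# checkouts sit side by side in the same parent directory.
git clone <our-repo-url> boost-your-nerf
cp boost-your-nerf/train_single_model.py \
   boost-your-nerf/run_dvgo.py \
   boost-your-nerf/moe.py \
   boost-your-nerf/moe_main.py \
   DirectVoxGO/
```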

With those files in place, first pre-train models at different resolutions:

```bash
python3 train_single_model.py --dataset_name nerf --scene lego --resolutions 128 160 200 256 300 350 --datadir path_to_datadir --render_test --eval_ssim --eval_lpips_alex
```

We are now ready to train our MoE:

```bash
python3 moe_main.py --dataset_name nerf --scene lego --resolutions 128 160 200 256 300 350 --datadir path_to_datadir --render_test --eval_ssim --eval_lpips_alex --top_k 2 --num_experts 5
```

This command trains our MoE with *five experts* (the first model, trained at a resolution of 128^3, is excluded because it is used to initialize the gate).