📝 updating the documentation

otthorn 2021-06-10 16:35:08 +02:00
parent 0acee99124
commit cc9eb2cb4e

FORK.md

@@ -21,7 +21,7 @@ pip3 install -r requirements.txt
You will also need [COLMAP](colmap.github.io/)
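On recent Debian/Ubuntu systems a packaged build can be enough to get started; this is only one option, and building from source as described on the COLMAP site may be needed for GPU features:
```
# one possible install path on Debian/Ubuntu; build from source for GPU support
sudo apt install colmap
```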
## How to run
### Dataset
First you need to create or find a dataset: a large set of images (at least 30,
more if you want a 360 degree reconstruction).
In order to maximise the quality of the reconstruction it is recommended to take
@@ -34,6 +34,7 @@ with tools like ImageMagick (mogrify or convert)
```
mogrify -resize 800 *.jpg
```
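Note that `mogrify` overwrites the images in place. If you want to keep the originals, a sketch of the same resize with `convert`, writing the copies to a separate (arbitrarily named) folder:
```
# keep the originals and write 800px-wide copies into resized/
mkdir -p resized
for f in *.jpg; do convert "$f" -resize 800 "resized/$f"; done
```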
### Camera pose estimation
Then use the COLMAP wrapper `colmap_runner/run_colmap.py`.
First change the two lines corresponding to input and output at the end of the
@@ -46,8 +47,7 @@ python3 run_colmap.py
```
Then you will need to use the `format_dataset.py` script to transform the
binary COLMAP data produced by the wrapper into the data structure required by
NeRF++.
You again need to change the `input_path` and `output_path`.
@@ -55,6 +55,11 @@ You again need to change the `input_path` and `output_path`.
```
python3 format_dataset.py
```
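If you prefer to script that edit, a rough sketch is shown below; it assumes `input_path` and `output_path` are plain top-level assignments in `format_dataset.py`, so check the actual file before running it:
```
# illustrative only: the assignment form and both paths are placeholders
sed -i 's|^input_path = .*|input_path = "/path/to/colmap_output"|' format_dataset.py
sed -i 's|^output_path = .*|output_path = "/path/to/nerfpp_dataset"|' format_dataset.py
python3 format_dataset.py
```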
Before training you can visualise your camera poses using the
`camera_visualizer/visualize_cameras.py` script.
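An invocation could look like the line below; note that, like the other helper scripts, it may expect its input paths to be edited inside the file first (an assumption worth checking):
```
python3 camera_visualizer/visualize_cameras.py
```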
### Training the model
You then need to create a configuration file; copying the example in
`configs` and tweaking the values to your needs is recommended. Refer to the
help inside `ddp_train_nerf.py` if you need to understand a parameter.
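Assuming the upstream NeRF++ command-line interface is unchanged, training is then started by pointing `ddp_train_nerf.py` at your configuration file (the file name below is only an example):
```
python3 ddp_train_nerf.py --config configs/my_scene.txt
```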