
Releases: dsilvavinicius/i3d

Performance improvements and major code changes

06 Nov 11:27

Special release for SIBGRAPI 2023!
We have a tutorial session there: "How to train your (neural) dragon", on 2023-11-06 at 09:00 (GMT-3).

Major code overhaul; the most significant changes are:

  • Isolated most of our code in the i3d package
  • Changed the training flow so that the data remains on the GPU as much as possible, leading to faster training and inference. Pretrained models from past releases should work out of the box; they will be converted to the new format, indicated by the _v2.pth suffix in the file name. We have also added new pretrained models; check the attached files
  • Added functions to change the w0 value. We use this to set w0 to 1 when saving the model, removing the dependence on the experiment configuration file at inference time (see the sketch after this list)
  • Converted the experiment files from JSON to YAML for simplicity
  • Renamed main.py to train_sdf.py and isolated the reconstruction code in reconstruct.py.
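
Below is a minimal sketch of how folding w0 into the weights can work, assuming a SIREN-style layer that computes sin(w0 * (Wx + b)). The SineLayer class and set_w0 method here are hypothetical illustrations, not the actual i3d API:

import torch
from torch import nn

class SineLayer(nn.Module):
    """SIREN-style layer: sin(w0 * (W x + b)). Hypothetical, for illustration only."""

    def __init__(self, in_features: int, out_features: int, w0: float = 30.0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.w0 = w0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sin(self.w0 * self.linear(x))

    @torch.no_grad()
    def set_w0(self, new_w0: float) -> None:
        """Change w0 without changing the layer's function by folding the ratio into W and b."""
        scale = self.w0 / new_w0
        self.linear.weight.mul_(scale)
        self.linear.bias.mul_(scale)
        self.w0 = new_w0

# Sanity check: the layer computes the same function after setting w0 to 1.
layer = SineLayer(3, 16, w0=30.0)
x = torch.randn(8, 3)
before = layer(x)
layer.set_w0(1.0)
assert torch.allclose(before, layer(x), atol=1e-5)

Because the ratio old_w0 / new_w0 is absorbed into the affine parameters, the checkpoint can be saved with w0 = 1 and later reloaded without consulting the experiment configuration file.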

Note that there is a ww model parameter that we were experimenting with. Setting its value independently of w0 dramatically affects training convergence in some cases, but we have not performed extensive experiments on this yet. Most of the time ww is set to match w0, so consider anything involving different w0 and ww values very volatile code.

What's Changed

  • Handle the U=W=0 situation in principal_directions by @DavidXu-JJ in #3

New Contributors

  • @DavidXu-JJ made their first contribution in #3

Full Changelog: v0.0.1-pretrained...v1.0.0

Pretrained models for example meshes.

05 Sep 10:31

Uploading pretrained models for Armadillo, Bunny, Buddha, Dragon, and Lucy.
To reconstruct the meshes, use the tools/reconstruct.py script. An example command line would be (from the root directory of the repository):

python tools/reconstruct.py results/armadillo_biased_curvature_sdf/models/model_best.pth armadillo.ply --w0 60
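
For intuition, reconstructing a mesh from a trained SDF network roughly amounts to sampling the network on a regular grid and extracting the zero level set with marching cubes. The sketch below assumes a model that maps (N, 3) coordinates in [-1, 1]^3 to (N, 1) SDF values and uses scikit-image's marching_cubes; it is not the actual tools/reconstruct.py implementation:

import torch
from skimage.measure import marching_cubes

def sdf_to_mesh(model: torch.nn.Module, resolution: int = 128, device: str = "cpu"):
    """Sample the SDF on a regular grid over [-1, 1]^3 and extract its zero level set."""
    model = model.to(device).eval()
    axis = torch.linspace(-1.0, 1.0, resolution)
    grid = torch.stack(torch.meshgrid(axis, axis, axis, indexing="ij"), dim=-1)
    coords = grid.reshape(-1, 3).to(device)

    sdf = torch.empty(coords.shape[0])
    with torch.no_grad():
        for start in range(0, coords.shape[0], 65536):  # evaluate in chunks to bound memory
            sdf[start:start + 65536] = model(coords[start:start + 65536]).squeeze(-1).cpu()

    volume = sdf.reshape(resolution, resolution, resolution).numpy()
    spacing = 2.0 / (resolution - 1)
    verts, faces, _, _ = marching_cubes(volume, level=0.0, spacing=(spacing,) * 3)
    return verts - 1.0, faces  # shift vertices from [0, 2] back into [-1, 1]^3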

The params.json files in each experiment folder contain the parameters used to train the models, including w0, which must be passed correctly to the tools/reconstruct.py script. The w0 value was 60 for all models except the Dragon, which uses w0 = 30. The value can also be read from params.json instead of being passed by hand, as sketched below.
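
Since the exact keys inside params.json may vary between experiments, the snippet below is only a sketch of reading w0 from the file and forwarding it to tools/reconstruct.py; the top-level "w0" key is an assumption:

import json
import subprocess
from pathlib import Path

experiment = Path("results/armadillo_biased_curvature_sdf")
params = json.loads((experiment / "params.json").read_text())
w0 = params.get("w0", 60)  # "w0" key name is an assumption; 60 is the documented default

subprocess.run([
    "python", "tools/reconstruct.py",
    str(experiment / "models" / "model_best.pth"),
    "armadillo.ply",
    "--w0", str(w0),
], check=True)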

If you download the original meshes from here (as per our README) and extract them under data, you may also run the tools/estimate_mesh_curvatures.py script. It takes the original mesh and a trained model as input and saves a PLY with the curvatures calculated using the implicit models. The resulting files are saved under results/${MESH_TYPE}, where MESH_TYPE ∈ {armadillo, bunny, buddha, dragon, lucy}. This script is more of a demonstration, so it is less parameterized than tools/reconstruct.py. To run it, execute the following command in the terminal (a sketch of the underlying curvature computation follows the command):

python tools/estimate_mesh_curvatures.py
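
For intuition on what the script computes: curvatures of a level set can be obtained from derivatives of the implicit function via automatic differentiation. The sketch below estimates mean curvature at query points as half the divergence of the normalized SDF gradient; it assumes a model mapping (N, 3) points to (N, 1) SDF values and is not the actual tools/estimate_mesh_curvatures.py code (sign conventions may also differ):

import torch

def mean_curvature(model: torch.nn.Module, points: torch.Tensor) -> torch.Tensor:
    """Approximate mean curvature of the zero level set at `points` via H = 0.5 * div(grad f / |grad f|)."""
    points = points.clone().requires_grad_(True)
    f = model(points)
    grad = torch.autograd.grad(f.sum(), points, create_graph=True)[0]   # (N, 3) SDF gradient
    normal = grad / grad.norm(dim=-1, keepdim=True).clamp_min(1e-8)     # unit normals

    divergence = torch.zeros(points.shape[0], device=points.device)
    for i in range(3):  # divergence = sum_i d(normal_i)/dx_i
        divergence += torch.autograd.grad(
            normal[:, i].sum(), points, create_graph=True
        )[0][:, i]
    return 0.5 * divergence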

To visualize the curvatures, open MeshLab and select the Quality Mapper option in the toolbar. We used the Red-White-Blue Scale option in the Preset Ramps, and manually adjusted the Equalizer to fit the curvature histograms, as shown below (note that we used the Armadillo as an example here):

Quality mapper for Armadillo curvatures