NumPy, TensorFlow and PyTorch implementation of human body SMPL model and infant body SMIL model.

SMPL

NumPy, TensorFlow and PyTorch implementation of the SMPL model. For any questions, feel free to contact me.

Overview

Update on 20190127 by Lotayou

I wrote a PyTorch implementation based on CalciferZh's TensorFlow code, which supports GPU training. The implementation is hosted in smpl_torch.py along with a testing example.

The implementation is tested under Ubuntu 18.04, Python 3.6 and PyTorch 1.0.0 stable. The output is the same as that of the original TensorFlow implementation, as can be verified with test.py.

Original Overview

I wrote this because the author-provided implementation is based mainly on chumpy in Python 2, both of which have fallen out of common use. Moreover, the official version cannot run on a GPU.

This NumPy version is faster (some computations were rewritten in a vectorized manner) and, I hope, easier to understand, while the TensorFlow version can run on a GPU.
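To illustrate the kind of vectorization involved (a sketch of the general technique, not this repo's actual code): SMPL's pose parameters are axis-angle vectors that must be converted to rotation matrices via the Rodrigues formula, and the per-joint loop can be replaced by one batched NumPy computation.

```python
import numpy as np

def rodrigues_batch(r):
    """Convert a batch of axis-angle vectors (N, 3) to rotation
    matrices (N, 3, 3) in one vectorized pass, with no Python loop."""
    theta = np.linalg.norm(r, axis=1, keepdims=True)    # (N, 1)
    # Guard against division by zero for near-zero rotations.
    axis = r / np.maximum(theta, 1e-8)                  # (N, 3)
    cos = np.cos(theta)[:, :, None]                     # (N, 1, 1)
    sin = np.sin(theta)[:, :, None]                     # (N, 1, 1)
    # Build the skew-symmetric cross-product matrix K for each axis.
    zeros = np.zeros(r.shape[0])
    kx, ky, kz = axis[:, 0], axis[:, 1], axis[:, 2]
    K = np.stack([zeros, -kz, ky,
                  kz, zeros, -kx,
                  -ky, kx, zeros], axis=1).reshape(-1, 3, 3)
    # Rodrigues formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3)[None] + sin * K + (1 - cos) * (K @ K)
```

A zero vector maps to the identity matrix, and each row of the batch is processed with the same broadcasted arithmetic.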

For more details about the SMPL model, see SMPL.

Usage

  1. Download the model file here.

  2. Run python preprocess.py /PATH/TO/THE/DOWNLOADED/MODEL to preprocess the official model. preprocess.py will create a new file, model.pkl; both smpl_np.py and smpl_tf.py rely on model.pkl. NOTE: the official pickled model contains chumpy objects, so preprocess.py requires chumpy to extract the official model. You need to modify chumpy's source code slightly to make it compatible with preprocess.py (and Python 3). Here is an instruction in Chinese about this. If you don't want to install chumpy, you can download the processed file from BaiduYunDisk with extraction code vblg.

  3. Run python smpl_np.py, python smpl_tf.py, or python smpl_torch.py to see the examples.
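The preprocessing step boils down to stripping the chumpy wrappers so the model dictionary contains only plain NumPy arrays. A minimal sketch of that idea (the actual keys and logic in preprocess.py may differ; the paths in the comments are placeholders):

```python
import numpy as np

def to_numpy_dict(model):
    """Convert every array-like value in a model dictionary to a plain
    ndarray. chumpy arrays subclass ndarray, so np.asarray extracts
    their data; non-numeric entries are kept unchanged."""
    out = {}
    for key, value in model.items():
        try:
            out[key] = np.asarray(value, dtype=np.float64)
        except (TypeError, ValueError):
            out[key] = value  # e.g. string metadata stays as-is
    return out

# Hypothetical usage (paths are placeholders):
#   import pickle
#   with open('/PATH/TO/THE/DOWNLOADED/MODEL', 'rb') as f:
#       raw = pickle.load(f, encoding='latin1')  # Python-2 pickle
#   with open('model.pkl', 'wb') as f:
#       pickle.dump(to_numpy_dict(raw), f)
```

Once the chumpy objects are gone, the resulting model.pkl can be loaded in Python 3 without chumpy installed.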
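Under the hood, all three example scripts evaluate the same SMPL pipeline: blend shapes are added to a template mesh, joint locations are regressed, and the mesh is posed with linear blend skinning. A toy, self-contained illustration of the skinning step (dummy shapes, not the SMPL model itself):

```python
import numpy as np

def linear_blend_skinning(verts, weights, transforms):
    """Pose a mesh with linear blend skinning.
    verts:      (V, 3) rest-pose vertices
    weights:    (V, J) per-vertex skinning weights (rows sum to 1)
    transforms: (J, 4, 4) homogeneous world transform per joint
    Returns the posed (V, 3) vertices."""
    V = verts.shape[0]
    homo = np.concatenate([verts, np.ones((V, 1))], axis=1)  # (V, 4)
    # Blend the joint transforms per vertex: (V, 4, 4)
    blended = np.einsum('vj,jab->vab', weights, transforms)
    # Apply each vertex's blended transform to the vertex.
    posed = np.einsum('vab,vb->va', blended, homo)
    return posed[:, :3]
```

With identity transforms the mesh is unchanged; translating a joint drags the vertices weighted to it.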
