Hi, thanks for sharing the great work.👍
I am trying to reuse the motion tokenizer (VQVAE) from MotionGPT, so I am wondering:
Is there a way to save the VQVAE's weights as a separate checkpoint file, rather than saving the whole MotionGPT model?
I ask because I saw there is a load_pretrained_vae() function here: MotionGPT/mGPT/utils/load_checkpoint.py, line 17 in fac2972.
I checked the pre-trained model motiongpt_s3_h3d.tar and found that the checkpoint includes many parts (metrics, vae, lm, loss).
Why does it contain so many parameters just for metrics?
It would be a big help if someone could reply, thanks🤗
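For reference, something like the following can be used to see which parts the checkpoint contains and how many parameters belong to each. This is only a sketch; it assumes motiongpt_s3_h3d.tar is a standard PyTorch Lightning checkpoint whose "state_dict" keys are prefixed with the module names (vae., lm., metrics., loss.):

```python
# Sketch: group checkpoint parameters by their top-level module prefix.
# Assumes a PyTorch Lightning checkpoint with a "state_dict" entry.
from collections import Counter
import torch

state_dict = torch.load("motiongpt_s3_h3d.tar", map_location="cpu")["state_dict"]

counts = Counter()
for name, tensor in state_dict.items():
    prefix = name.split(".", 1)[0]  # e.g. "vae", "lm", "metrics", "loss"
    counts[prefix] += tensor.numel()

for prefix, n in counts.most_common():
    print(f"{prefix}: {n:,} parameters")
```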
Thank you for your interest and support in our work. Regarding your queries:
We currently use PyTorch Lightning's checkpoint callback to save checkpoints, which saves all modules by default. We plan to look into saving specific components, such as the VQVAE's weights, separately in future versions.
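In the meantime, a minimal sketch of how the VQVAE weights could be extracted from the full checkpoint and saved on their own (again assuming a standard Lightning checkpoint whose "state_dict" keys carry the "vae." prefix; the output file name is just an example):

```python
# Sketch: keep only the VQVAE parameters and save them as a standalone state_dict.
import torch

ckpt = torch.load("motiongpt_s3_h3d.tar", map_location="cpu")

# Filter the keys that belong to the VQVAE and strip the "vae." prefix so the
# result can be loaded directly into a standalone VQVAE module.
vae_state_dict = {
    k.replace("vae.", "", 1): v
    for k, v in ckpt["state_dict"].items()
    if k.startswith("vae.")
}

torch.save(vae_state_dict, "vqvae_only.ckpt")
```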
The checkpoint contains many parameters for metrics because our evaluation metrics are model-based. These parameters do not participate in the network's forward pass or inference; they are only used during metric computation.
We appreciate your suggestion and are looking into enhancing our model's usability in future updates.