
[FLAVA] Use boolean pretrained flag to load ckpts #365

Closed · wants to merge 2 commits into gh/ankitade/13/base from gh/ankitade/13/head

Conversation

ankitade (Contributor) commented Nov 8, 2022

Stack from ghstack (oldest at bottom):

Test plan

  1. Pretraining debug
    python -m flava.train config=flava/configs/pretraining/debug.yaml training.lightning.gpus=0 training.lightning.strategy=null
    Epoch 0: : 50it [03:10, 3.81s/it, loss=16.3, v_num=3, train/losses/mmm_text_loss=10.50, train/losses/mmm_image_loss=9.150, train/losses/itm_loss=0.838, train/losses/global_contrastive_loss=2.090, train/losses/mlm_loss=10.40, train/losses/mim_loss=9.180]

  2. Finetune debug
    python -m flava.train config=flava/configs/pretraining/debug.yaml model.pretrained=True
    Epoch 0: : 50it [00:28, 1.78it/s, loss=6.78, v_num=4, train/losses/mmm_text_loss=0.456, train/losses/mmm_image_loss=6.150, train/losses/itm_loss=0.378, train/losses/global_contrastive_loss=3.170,

  3. COCO zero shot
    python -m flava.coco_zero_shot --data_root /datasets01/COCO/022719/val2017 --annotations /datasets01/COCO/022719/annotations/captions_val2017.json

Differential Revision: [D41142483](https://our.internmc.facebook.com/intern/diff/D41142483)
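
For context, the API change being tested, sketched as call sites. This is a minimal sketch assuming the builder name `flava_model` in `torchmultimodal/models/flava/model.py` (the file touched per the Codecov report below) and the old keyword `pretrained_model_key` from the review diff further down; it is not verbatim repo code:

```python
# Minimal sketch of the call-site change, names assumed from this PR.
from torchmultimodal.models.flava.model import flava_model

# Before this PR (assumed old API): select the checkpoint by string key.
# model = flava_model(pretrained_model_key="flava_full")

# After this PR: a boolean flag is enough, since only one ckpt exists.
model = flava_model(pretrained=True)
```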

ankitade added a commit that referenced this pull request Nov 8, 2022
ghstack-source-id: 9ea01321467e8e3bc8191969f7b09a4d1fd3cd1d
Pull Request resolved: #365
facebook-github-bot added the CLA Signed label Nov 8, 2022
codecov-commenter commented:

Codecov Report

Base: 93.43% // Head: 93.43% // No change to project coverage 👍

Coverage data is based on head (ae70afe) compared to base (f4eacdd).
Patch coverage: 100.00% of modified lines in pull request are covered.

Additional details and impacted files
@@                 Coverage Diff                  @@
##           gh/ankitade/13/base     #365   +/-   ##
====================================================
  Coverage                93.43%   93.43%           
====================================================
  Files                       55       55           
  Lines                     3262     3262           
====================================================
  Hits                      3048     3048           
  Misses                     214      214           
| Impacted Files | Coverage Δ |
|---|---|
| torchmultimodal/models/flava/model.py | 95.55% <100.00%> (ø) |


ankitade (Contributor, Author) commented Nov 9, 2022

@DeAnkita has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Using a bool flag, similar to omnivore, since there is only one ckpt to load.

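A minimal sketch of what the builder body might look like with the bool flag, written as if inside `torchmultimodal/models/flava/model.py`, where `FLAVAModel`, `load_model`, and `FLAVA_MODEL_MAPPING` (names taken from the review diff below) are assumed to be defined; not a verbatim copy of the repo code:

```python
from typing import Any

def flava_model(pretrained: bool = False, **flava_model_kwargs: Any) -> "FLAVAModel":
    # FLAVAModel and FLAVA_MODEL_MAPPING are assumed from the diff below.
    flava = FLAVAModel(**flava_model_kwargs)
    if pretrained:
        # Only one ckpt to load, so a bool replaces the old string key
        # (the same pattern the omnivore builders use).
        flava.load_model(FLAVA_MODEL_MAPPING["flava_full"])
    return flava
```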
Review comment on the checkpoint-loading change (in torchmultimodal/models/flava/model.py, per the Codecov report):

```diff
-    if pretrained_model_key is not None:
-        flava.load_model(FLAVA_MODEL_MAPPING[pretrained_model_key])
+    if pretrained:
+        flava.load_model(FLAVA_MODEL_MAPPING["flava_full"])
```
ankitade (Contributor, Author):

I can move the flava_full string to a constant in a follow-up PR.
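
The follow-up mentioned here might look like this; the constant name is hypothetical:

```python
# Hypothetical follow-up: hoist the checkpoint key into a module-level
# constant so "flava_full" is not repeated as a string literal.
FLAVA_FULL_CKPT_KEY = "flava_full"  # hypothetical name

# The diff hunk above would then read:
if pretrained:
    flava.load_model(FLAVA_MODEL_MAPPING[FLAVA_FULL_CKPT_KEY])
```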

ankitade marked this pull request as ready for review November 9, 2022 05:53


ankitade mentioned this pull request Nov 9, 2022
facebook-github-bot deleted the gh/ankitade/13/head branch November 12, 2022 15:18