
Hotfix/vocab parallel layers #215

Merged — 1 commit merged into hpcaitech:develop on Feb 14, 2022
Conversation

kurisusnowdeng (Member)

  • moved env variables to global variables;
  • added branch context;
  • added vocab parallel layers (see the sketch after this list);
  • moved split_batch from load_batch to tensor parallel embedding layers;
  • updated gpt model;
  • updated unit test cases;
  • fixed a few collective communicator bugs
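
A vocab parallel embedding shards the embedding table along the vocabulary dimension across tensor-parallel ranks: each rank looks up only the token ids in its own vocabulary range and an all-reduce sums the partial results. The snippet below is a minimal PyTorch sketch of that technique, not ColossalAI's actual implementation; the class name `VocabParallelEmbedding` and the `tp_group` argument are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.distributed as dist


class VocabParallelEmbedding(nn.Module):
    """Embedding whose vocabulary dimension is sharded across tensor-parallel ranks.

    Each rank stores only vocab_size // world_size rows; token ids outside the
    local shard contribute zeros, and an all-reduce sums the partial outputs.
    (Forward-only sketch, hypothetical names.)
    """

    def __init__(self, vocab_size: int, embed_dim: int, tp_group=None):
        super().__init__()
        self.tp_group = tp_group
        rank = dist.get_rank(tp_group)
        world_size = dist.get_world_size(tp_group)
        assert vocab_size % world_size == 0, "vocab size must divide the TP world size"
        self.vocab_per_rank = vocab_size // world_size
        self.vocab_start = rank * self.vocab_per_rank
        self.vocab_end = self.vocab_start + self.vocab_per_rank
        self.weight = nn.Parameter(torch.empty(self.vocab_per_rank, embed_dim))
        nn.init.normal_(self.weight, std=0.02)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # Mask token ids that belong to other ranks' vocabulary shards.
        mask = (input_ids < self.vocab_start) | (input_ids >= self.vocab_end)
        local_ids = input_ids.clamp(self.vocab_start, self.vocab_end - 1) - self.vocab_start
        out = nn.functional.embedding(local_ids, self.weight)
        # Zero out rows for tokens owned by other ranks, then sum partials.
        out[mask] = 0.0
        dist.all_reduce(out, group=self.tp_group)
        return out
```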

@kurisusnowdeng force-pushed the main branch 2 times, most recently from 1c94101 to 19d5539 — February 13, 2022 20:31
@kurisusnowdeng marked this pull request as ready for review — February 13, 2022 20:31
@FrankLeeeee merged commit 2989434 into hpcaitech:develop — Feb 14, 2022
ver217 pushed a commit to ver217/ColossalAI that referenced this pull request Feb 14, 2022
FrankLeeeee pushed a commit that referenced this pull request Feb 15, 2022