Support symbolic for conv_tbc (#58359) #58692

Closed
wants to merge 3 commits

Conversation

@BowenBao (Collaborator) commented May 20, 2021

Stack from ghstack:

This is a fix for exporting fairseq models; see:

```python
model = torch.hub.load(github, 'conv.wmt14.en-fr', tokenizer='moses', bpe='subword_nmt')
model = torch.hub.load(github, 'conv.wmt17.en-de', tokenizer='moses', bpe='subword_nmt')
```

With this fix, and with the single `GradMultiply` line in the model script commented out, these two models can be exported successfully with performance requirements met.

The original PR #57708 ran into issues when merging; use this one instead.

Co-authored-by: David <jiafa@microsoft.com>

Differential Revision: D28714809
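
For context, the change provides an ONNX symbolic for `aten::conv_tbc`, fairseq's time-batch-channel 1-D convolution. The snippet below is a minimal sketch of the general approach and not the code merged in this PR (the function name `conv_tbc_symbolic`, the attribute choices, and the manual registration are my own assumptions): permute the TBC input into the `(batch, channels, time)` layout that ONNX `Conv` expects, run `Conv`, and permute the result back.

```python
import torch
from torch.onnx import register_custom_op_symbolic
from torch.onnx.symbolic_helper import parse_args


@parse_args("v", "v", "v", "i")
def conv_tbc_symbolic(g, input, weight, bias, pad):
    # input: (time, batch, in_channels); weight: (kernel_width, in_channels, out_channels)
    x = g.op("Transpose", input, perm_i=[1, 2, 0])    # (T, B, C_in) -> (B, C_in, T)
    w = g.op("Transpose", weight, perm_i=[2, 1, 0])   # (K, C_in, C_out) -> (C_out, C_in, K)
    out = g.op("Conv", x, w, bias, pads_i=[pad, pad], strides_i=[1], dilations_i=[1])
    return g.op("Transpose", out, perm_i=[2, 0, 1])   # (B, C_out, T) -> (T, B, C_out)


# On builds that already include this PR the symbolic ships with torch.onnx,
# so this manual registration is only a workaround for older builds.
register_custom_op_symbolic("aten::conv_tbc", conv_tbc_symbolic, 9)
```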

@facebook-github-bot (Contributor) commented May 20, 2021

💊 CI failures summary and remediations

As of commit a00da05 (more details on the Dr. CI page):


  • 4/4 failures possibly* introduced in this PR
    • 1/4 non-scanned failure(s)

3 failures not recognized by patterns:

| Job | Step | Action |
| --- | --- | --- |
| GitHub Actions Lint / quick-checks | C++ docs check | 🔁 rerun |
| GitHub Actions Linux CI (pytorch-linux-xenial-py3.6-gcc5.4) / render_test_results | Install dependencies | 🔁 rerun |
| GitHub Actions Lint / mypy | Run mypy | 🔁 rerun |

This comment was automatically generated by Dr. CI.


@SplitInfinity has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


@SplitInfinity merged this pull request in b8c96e6.

@facebook-github-bot deleted the gh/BowenBao/74/head branch May 31, 2021 14:17
deniskokarev pushed a commit to deniskokarev/pytorch that referenced this pull request Jun 9, 2021
Summary:
Pull Request resolved: pytorch#58692

This is a fix for exporting fairseq models, see:
```python
model = torch.hub.load(github, 'conv.wmt14.en-fr', tokenizer='moses', bpe='subword_nmt')
model = torch.hub.load(github, 'conv.wmt17.en-de', tokenizer='moses', bpe='subword_nmt')
```
With this fix, and with the single `GradMultiply` line in the model script commented out, these two models can be exported successfully with performance requirements met.

The original PR pytorch#57708 ran into issues when merging; use this one instead.

Test Plan: Imported from OSS

Reviewed By: driazati

Differential Revision: D28714809

Pulled By: SplitInfinity

fbshipit-source-id: 71c2de6cec7ee05af68560996acf47d97af46fb2

Co-authored-by: David <jiafa@microsoft.com>
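
As a quick way to exercise the new symbolic outside of fairseq, here is a minimal, self-contained export check; it is my own sketch, not part of this PR or its test plan, and the module name and shapes are arbitrary.

```python
import torch


class TBCConv(torch.nn.Module):
    def __init__(self, in_channels=8, out_channels=16, kernel_width=3):
        super().__init__()
        # conv_tbc weight layout is (kernel_width, in_channels, out_channels)
        self.weight = torch.nn.Parameter(torch.randn(kernel_width, in_channels, out_channels))
        self.bias = torch.nn.Parameter(torch.zeros(out_channels))

    def forward(self, x):
        # x: (time, batch, in_channels)
        return torch.conv_tbc(x, self.weight, self.bias, 1)  # pad = 1


x = torch.randn(10, 2, 8)  # (T, B, C_in)
# Without a symbolic for aten::conv_tbc this export fails with an unsupported-operator
# error; with the symbolic in place it produces an ONNX graph.
torch.onnx.export(TBCConv(), (x,), "conv_tbc.onnx", opset_version=11)
```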