
Support Fusion for 1 and 2 Inputs Bert Models Converted From tf #5993

Merged
merged 2 commits into master from ziyl/kerasFusion on Dec 3, 2020

Conversation

liuziyue (Contributor) commented Dec 1, 2020

Description:
Update the transformer model optimization tool to support fusion for BERT models with 1 or 2 inputs converted from TensorFlow. Note this fix is intended for fp32 models.

liuziyue requested a review from a team as a code owner on December 1, 2020
@@ -142,12 +142,12 @@ def create_attention_node(self, mask_index, q_matmul, k_matmul, v_matmul, q_add,
                                  raw=True)
         self.model.add_initializer(bias)

-        attnetion_inputs = [input, attention_node_name + '_qkv_weight', attention_node_name + '_qkv_bias']
+        attention_inputs = [input, attention_node_name + '_qkv_weight', attention_node_name + '_qkv_bias']
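The corrected line renames a misspelled variable so the input list actually reaches the fused Attention node. A minimal sketch of how that list is assembled (the tensor and node names below are hypothetical, not taken from the PR):

```python
# Hypothetical sketch of building the fused Attention node's input list.
# The real tool derives these names from the graph; here they are fixed
# strings purely for illustration.
attention_node_name = "Attention_0"   # hypothetical fused-node name
layer_input = "layer_norm_out"        # hypothetical upstream tensor name

attention_inputs = [
    layer_input,                               # hidden states feeding attention
    attention_node_name + "_qkv_weight",       # merged Q/K/V weight initializer
    attention_node_name + "_qkv_bias",         # merged Q/K/V bias initializer
]

print(attention_inputs)
```

The typo mattered because the list built under the misspelled name `attnetion_inputs` would never be read by the code that constructs the node, leaving the fusion broken.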
wangyems (Contributor) commented:
good catch!
wangyems previously approved these changes Dec 2, 2020
tianleiwu (Contributor) commented:
Could we add some test cases to test_optimizer?

liuziyue merged commit 3b198c9 into master on Dec 3, 2020
liuziyue deleted the ziyl/kerasFusion branch on December 3, 2020

3 participants