
Normal Attention fix to get good image quality without using flash attention #37

Merged · 1 commit · Jul 2, 2023

Conversation

@ayushtues (Contributor) commented Jun 14, 2023

Hi, I was playing around with this repo while helping to integrate it with HF Diffusers (huggingface/diffusers#3492), and found that using the normal attention path (QKVAttentionLegacy) did not produce good-quality images, while using flash attention (QKVFlashAttention) gave the expected results.

The reason turned out to be a slightly different ordering of the heads and keys between the flash attention and normal attention implementations. Fixing that ordering lets us generate good images with normal attention, without flash attention (which can be a pain to install).
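
For reference, here is a minimal sketch of the layout issue, not the exact diff in this PR. The helper name `legacy_attention_fixed` and the einsum-based body are illustrative, assuming the fused qkv tensor has shape `(batch, 3 * heads * head_dim, length)` with the channel axis laid out qkv-major, i.e. `(3, heads, head_dim)`, which is what the flash attention path's `rearrange(qkv, "b (three h d) s -> ...")` expects. The old legacy code instead split the channels head-major, `(heads, 3, head_dim)`, which scrambles the pretrained projection weights:

```python
import math
import torch

def legacy_attention_fixed(qkv: torch.Tensor, n_heads: int) -> torch.Tensor:
    """Plain (non-flash) multi-head attention over a fused qkv tensor.

    Assumes qkv has shape (batch, 3 * n_heads * head_dim, length), channel
    axis laid out qkv-major: (3, heads, head_dim), matching the flash path.
    """
    bs, width, length = qkv.shape
    ch = width // (3 * n_heads)
    # qkv-major split: all q channels first, then all k, then all v.
    # The old legacy code effectively did
    #   qkv.reshape(bs * n_heads, ch * 3, length).split(ch, dim=1)
    # which assumes a head-major (heads, 3, head_dim) layout instead.
    q, k, v = qkv.reshape(bs, 3, n_heads, ch, length).unbind(dim=1)
    q = q.reshape(bs * n_heads, ch, length)
    k = k.reshape(bs * n_heads, ch, length)
    v = v.reshape(bs * n_heads, ch, length)
    # Split the 1/sqrt(ch) scale across q and k for numerical stability.
    scale = 1 / math.sqrt(math.sqrt(ch))
    weight = torch.einsum("bct,bcs->bts", q * scale, k * scale)
    weight = torch.softmax(weight.float(), dim=-1).type(weight.dtype)
    out = torch.einsum("bts,bcs->bct", weight, v)
    return out.reshape(bs, -1, length)
```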

Sending this PR with the fix. It would be good to integrate it, since flash attention is hard to install and isn't available on all devices.

The quantization might not be that important; it's just that flash attention only works with fp16, I think, so I did an explicit conversion in the normal attention path to get exactly the same results.
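
A sketch of what that explicit cast might look like, reusing the hypothetical `legacy_attention_fixed` helper from the snippet above. Flash attention computes in fp16, so casting the normal path's inputs to fp16 and casting the output back keeps the two paths numerically aligned:

```python
import torch

def attention_fp16(qkv: torch.Tensor, n_heads: int) -> torch.Tensor:
    # Mirror flash attention's fp16 compute in the normal path, then cast
    # back to the caller's dtype. fp16 matmuls generally want a GPU tensor.
    orig_dtype = qkv.dtype
    out = legacy_attention_fixed(qkv.to(torch.float16), n_heads)
    return out.to(orig_dtype)
```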

TL;DR: You can now generate good images from the pretrained checkpoints without using flash attention.

@ayushtues ayushtues changed the title Normal Attention fix to get good image quality as using flash attention Normal Attention fix to get good image quality without using flash attention Jun 14, 2023
@yang-song yang-song merged commit fb11888 into openai:main Jul 2, 2023