
Fix llama model sdpa attention forward function masking bug when output_attentions=True (#30652)

Merged
ArthurZucker merged 14 commits into huggingface:main from Aladoro:fix-llama-mask-output-attn on May 15, 2024
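The PR title points at a known limitation: PyTorch's fused SDPA kernel never materializes the attention matrix, so when a caller requests `output_attentions=True` the model must take an eager (manual) attention path, and the attention mask has to be applied correctly on that path too. Below is a minimal, hypothetical sketch of that pattern; it is not the PR's actual diff, and the function and variable names are illustrative only:

```python
# Hypothetical sketch, not the PR's diff: illustrates the common transformers
# pattern of falling back to eager attention when output_attentions=True,
# since torch.nn.functional.scaled_dot_product_attention cannot return weights.
import math

import torch
import torch.nn.functional as F


def sdpa_attention_forward(query, key, value, attention_mask=None,
                           output_attentions=False):
    """Compute attention; return (attn_output, attn_weights or None).

    query/key/value: (batch, heads, seq_len, head_dim)
    attention_mask:  additive mask broadcastable to
                     (batch, heads, q_len, kv_len), e.g. 0 / -inf causal mask.
    """
    if not output_attentions:
        # Fast path: fused SDPA kernel. The attention matrix is never
        # materialized, so no weights can be returned from here.
        attn_output = F.scaled_dot_product_attention(
            query, key, value, attn_mask=attention_mask
        )
        return attn_output, None

    # Eager fallback: materialize the weights so they can be returned.
    # The bug class this PR addresses: the mask must also be applied on
    # this path, or the returned weights (and output) ignore masking.
    scale = 1.0 / math.sqrt(query.size(-1))
    attn_weights = torch.matmul(query, key.transpose(-2, -1)) * scale
    if attention_mask is not None:
        attn_weights = attn_weights + attention_mask
    attn_weights = F.softmax(attn_weights, dim=-1)
    attn_output = torch.matmul(attn_weights, value)
    return attn_output, attn_weights
```

With a 0/-inf causal mask passed in, both branches produce the same output, and the eager branch additionally returns the (properly masked) attention weights.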

Commits

Commits on May 15, 2024