
Different training losses when flash_attention is on/off #1918

Open
6 of 8 tasks
fly-dust opened this issue Sep 18, 2024 · 0 comments
Labels
bug Something isn't working

Comments

fly-dust commented Sep 18, 2024

Please check that this issue hasn't been reported before.

  • I searched previous bug reports and didn't find any similar ones.

Expected Behavior

Enabling flash attention should not make the training loss differ significantly.

Current behaviour

I ran preliminary experiments on Gemma 2B with different datasets. When flash attention is on, the training loss is significantly lower than when it is off.

Please see the figure below. Wandb run names with the -flash suffix indicate that flash attention is on.

[Figure: wandb training-loss curves; runs suffixed -flash (flash attention on) show visibly lower loss]

However, the validation losses are normal.

Steps to reproduce

Enable and then disable flash_attention in the configuration and compare the resulting training loss curves.
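The toggle can be sketched as follows, assuming an axolotl-style YAML config (the key name `flash_attention` is taken from axolotl's config options; run it once with each value and compare):

```yaml
# Run A: flash attention enabled
flash_attention: true

# Run B: flash attention disabled (or omit the key)
flash_attention: false
```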

Config yaml

No response

Possible solution

Is there anything wrong with the loss calculation when flash_attention is off? Usually the training loss should be slightly lower than the validation loss.
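One plausible mechanism (a hypothesis, not confirmed here): if the two runs handle padding tokens differently when averaging the cross-entropy — e.g. one path masks padding out of the mean while the other does not — the reported training losses will diverge even with identical model outputs. A minimal PyTorch sketch of that denominator effect:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
vocab, seq = 10, 6
logits = torch.randn(1, seq, vocab)
# -100 is the conventional ignore label for padding positions
labels = torch.tensor([[1, 2, 3, -100, -100, -100]])

# Masked mean: padding excluded, denominator = 3 real tokens
masked = F.cross_entropy(
    logits.view(-1, vocab), labels.view(-1), ignore_index=-100
)

# Naive mean over all positions, denominator = 6 (padding mapped to class 0)
naive = F.cross_entropy(
    logits.view(-1, vocab), labels.clamp(min=0).view(-1)
)

print(masked.item(), naive.item())  # the two averages differ
```

The same logits yield different loss values purely from how padding enters the average, which is the kind of discrepancy worth ruling out when comparing the two attention paths.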

[Figure: training vs. validation loss curves for the same runs]

Which Operating Systems are you using?

  • Linux
  • macOS
  • Windows

Python Version

3.10

axolotl branch-commit

main/4d6490b

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this bug has not been reported yet.
  • I am using the latest version of axolotl.
  • I have provided enough information for the maintainers to reproduce and diagnose the issue.
@fly-dust fly-dust added the bug Something isn't working label Sep 18, 2024