
[bug] Outdated TransformerEngine #3996

Open

dbpprt opened this issue Jun 7, 2024 · 1 comment

dbpprt commented Jun 7, 2024


Concise Description:
The included version of TransformerEngine (0.12.0) is not compatible with FlashAttention > 2.0.4, whilst recent transformers versions require FlashAttention > 2.0.4.

DLC image/dockerfile:
763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-training:2.3.0-gpu-py311-cu121-ubuntu20.04-sagemaker

Current behavior:
The bundled TransformerEngine is outdated and does not support recent versions of FlashAttention.

Expected behavior:
The DLC should be usable with recent versions of FlashAttention and transformers.

Additional context:
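As additional context, here is a minimal sketch (not part of the original report) of how one could confirm the mismatch inside the container. The distribution names ("transformer-engine", "flash-attn") and the 2.0.4 boundary are assumptions taken from the description above; the actual pins in the image may differ.

```python
# Minimal sketch: query the versions installed in the DLC image and flag the
# flash-attn ceiling described above. Names and the 2.0.4 boundary are
# assumptions from the issue text, not confirmed pins.
from importlib.metadata import PackageNotFoundError, version
from packaging.version import Version


def installed(dist_name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return Version(version(dist_name))
    except PackageNotFoundError:
        return None


te = installed("transformer-engine")
fa = installed("flash-attn")
tf = installed("transformers")
print(f"transformer_engine={te}  flash-attn={fa}  transformers={tf}")

if fa is not None and fa > Version("2.0.4"):
    # TransformerEngine 0.12.0 is reported incompatible with flash-attn > 2.0.4.
    print("flash-attn > 2.0.4: expect failures with the bundled TransformerEngine 0.12.0")
elif fa is not None:
    # Recent transformers releases are reported to require flash-attn > 2.0.4.
    print("flash-attn <= 2.0.4: recent transformers FlashAttention-2 code paths may not load")
```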

sbhavani commented Sep 5, 2024

We are also working on a pip wheel for TE v1.11 (ETA 10/15) that will remove the version requirement for flash-attn and make it an optional dependency. That might be a good time to update the DLC.
