Update sharded_moe.py to support top2 gate with Tutel #6948
base: master
Conversation
Since multiple experts per token is very common, and the gather and scatter operations without Tutel are so inefficient, I added Tutel support to the top2 gate and tested it on the pipeline engine. This can be done for any k, actually; I'll push that later when I have time to test.
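A minimal sketch of the idea, for reviewers skimming the thread. The `fast_dispatcher`/`update`/`encode` calls mirror the existing top-1 Tutel path in sharded_moe.py; the two-element index/location/gate lists for k=2 are illustrative plumbing, not the PR's literal code:

```python
# Illustrative only: how top-2 gating can feed Tutel's fast dispatcher
# instead of materializing one-hot dispatch/combine tensors via einsum.
# Call names follow the existing top-1 Tutel path in sharded_moe.py.
from tutel import moe as tutel_moe


def tutel_top2_dispatch(reshaped_input, indices_s, locations_s, gates_s, num_experts, capacity):
    """indices_s/locations_s/gates_s are two-element lists: one entry per expert choice."""
    E, C, M = num_experts, capacity, reshaped_input.size(1)
    dispatcher = tutel_moe.fast_dispatcher(E, C, M, dispatch_dtype=reshaped_input.dtype)
    # Tutel's dispatcher accepts per-choice lists, so k=2 just means
    # two entries in each list instead of one.
    dispatcher.update(indices_s, locations_s, gates_s, capacity=C)
    # encode() scatters tokens into an (E*C, M) expert buffer; calling
    # decode() on the expert output later applies the gate-weighted combine.
    return dispatcher, dispatcher.encode(reshaped_input)
```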
@microsoft-github-policy-service agree company="University of Michigan"
Not sure if this check is needed. I see there is a check in deepspeed/moe/sharded_moe.py (line 252 at 66d3d3e), but when I refer to the examples from Tutel (https://github.com/microsoft/Tutel/blob/ab7937bb929bc78111d74261b490da25657a7e5c/tutel/impls/fast_dispatch.py#L143) I don't see any specific check for a non-zero mask.
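For context, if I'm reading the same code, the check in question is the Tutel-path index fixup in top1gating, roughly this (quoted from memory, so treat as approximate):

```python
if use_tutel:
    # Tutel does not accept indices that were masked out as plain zeros,
    # so tokens whose mask row sums to zero get their index clamped to -1.
    indices_mask = mask1.sum(dim=1) * num_experts - 1
    indices1_s = torch.min(indices1_s, indices_mask)
```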
deepspeed/moe/sharded_moe.py (Outdated)
```diff
@@ -517,7 +535,7 @@ def forward(self,
         elif self.k == 2:
             gate_output = top2gating(logits, self.capacity_factor if self.training else self.eval_capacity_factor,
-                                     self.min_capacity, self.drop_tokens, self.ep_group, self.top2_2nd_expert_sampling)
+                                     self.min_capacity, self.drop_tokens, self.ep_group, self.top2_2nd_expert_sampling, use_tutel)
```
@xenshinu - thanks for this PR, could you run the pre-commit formatter on your branch to resolve the "Formatting" error? I believe it just wants the `use_tutel` on a new line here.
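(For reference, assuming the repo's standard pre-commit setup, this is typically `pip install pre-commit` followed by `pre-commit run --files deepspeed/moe/sharded_moe.py` from the repo root.)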
Thanks for pointing that out. I've updated the file and it looks like the pre-commit check has passed.
Can someone take a look at this question? #6948 (comment)
Tutel has been forced off for k > 1 since #2053.
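If I'm reading the guard correctly, the restriction from #2053 lives in MOELayer's constructor, and a Tutel-aware top2gating lets it widen to admit k == 2. A rough sketch of that change (illustrative, not the literal diff):

```python
# Sketch only: where the k > 1 restriction from #2053 lives and how it relaxes.
# TUTEL_INSTALLED mirrors the tutel import probe at the top of sharded_moe.py.
try:
    from tutel import moe as tutel_moe  # noqa: F401
    TUTEL_INSTALLED = True
except ImportError:
    TUTEL_INSTALLED = False


class MOELayer:  # trimmed to the relevant line
    def __init__(self, gate, use_tutel=False):
        # Before (#2053): ... and gate.k == 1  -- Tutel only for top-1 gating.
        # After, once top2gating can drive Tutel's dispatcher:
        self.use_tutel = use_tutel and TUTEL_INSTALLED and gate.k in (1, 2)
```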