Restore is_torch_greater_or_equal_than for backward compatibility #35734
Conversation
Signed-off-by: Tyler Michael Smith <[email protected]>
@ArthurZucker WDYT? I had to do some emergency PRs to MiniMax because of this, so maybe keeping them as static
@Rocketknight1 FYI the link to my PR that does the same thing for MiniCPM3 is https://huggingface.co/openbmb/MiniCPM3-4B/discussions/39
OK to restore for backward compatibility, but let's do it as before and not just set them to True.
Signed-off-by: Tyler Michael Smith <[email protected]>
Sounds good -- updated.
@Rocketknight1 I just merged this PR
Yes, sounds good! Thanks @ydshieh!
It still happens in
Hmm, I don't think the fix is included in the patch. Would it work for you to use (for now) the dev version of 4.48, or an older version like v4.47.1? Sorry for the inconvenience.
For vLLM, we're on
Restore is_torch_greater_or_equal_than for backward compatibility (#35734)

* Restore is_torch_greater_or_equal_than for backward compatibility

Signed-off-by: Tyler Michael Smith <[email protected]>

* review comments

Signed-off-by: Tyler Michael Smith <[email protected]>

---------

Signed-off-by: Tyler Michael Smith <[email protected]>
Yep, working on the patch!
Will probably ship it now and have another one for the num items in batch issue.
Same error with transformers
The fix in this PR is not included in
This PR restores booleans such as is_torch_greater_or_equal_than_1_13 for backward compatibility. These were removed in #35358, and I've set them to True since Transformers now requires PyTorch >= 2.0.

For context, I am trying to upgrade vLLM to use transformers 4.48.0 in vllm-project/vllm#12120. One of the issues I'm running into is that MiniCPM3-4B checks is_torch_greater_or_equal_than_1_13 here: https://huggingface.co/openbmb/MiniCPM3-4B/blob/e5715484011d723a1892db91da8b59d979d14aee/modeling_minicpm.py#L63-L65.

I've put up a PR to remove that check in MiniCPM3-4B, but it seems likely that other models will hit the same issue, so I think it would be good to restore these booleans, at least for the short term.
cc @ydshieh
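To illustrate the review feedback above ("do it as before and not just set to True"), here is a minimal sketch of how such a version-gate boolean can be computed as a real comparison. This is an assumption-laden illustration, not the exact Transformers implementation: the helper names `_parse_version` and `is_greater_or_equal` are hypothetical, the example torch version string is made up, and only the Python standard library is used.

```python
# Hedged sketch: computing a backward-compatibility flag like
# is_torch_greater_or_equal_than_1_13 as an actual version comparison,
# rather than hard-coding it to True.
def _parse_version(v: str) -> tuple:
    """Parse a dotted version string into a comparable tuple of ints,
    ignoring any local build suffix such as '+cu121'."""
    core = v.split("+")[0]
    return tuple(int(part) for part in core.split(".") if part.isdigit())

def is_greater_or_equal(current: str, minimum: str) -> bool:
    """Return True if the current version is at least the minimum version."""
    return _parse_version(current) >= _parse_version(minimum)

# Assuming torch.__version__ were "2.1.0"; since Transformers now requires
# PyTorch >= 2.0, a flag like this would always evaluate to True, but
# remote modeling code (e.g. MiniCPM3's) may still import it by name.
is_torch_greater_or_equal_than_1_13 = is_greater_or_equal("2.1.0", "1.13")
```

The point of keeping the comparison (instead of a bare `True`) is that the flag stays correct even if the minimum supported PyTorch version changes again.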