
Restore is_torch_greater_or_equal_than for backward compatibility #35734

Merged
merged 3 commits into huggingface:main from tlrmchlsmth:restore_torch_greater_1_x
Jan 17, 2025

Conversation

tlrmchlsmth
Contributor

@tlrmchlsmth tlrmchlsmth commented Jan 16, 2025

This PR restores booleans such as is_torch_greater_or_equal_than_1_13 for backward compatibility reasons. These were removed in #35358, and I've set them to True since Transformers now requires PyTorch >= 2.0.

For context, I am trying to upgrade vLLM to use transformers 4.48.0 in vllm-project/vllm#12120. One of the issues I'm running into is that MiniCPM3-4B checks is_torch_greater_or_equal_than_1_13 here: https://huggingface.co/openbmb/MiniCPM3-4B/blob/e5715484011d723a1892db91da8b59d979d14aee/modeling_minicpm.py#L63-L65.

I've put up a PR to remove that check in MiniCPM3-4B, but it seems likely that other models will hit the same issue, so I think it would be good to restore these booleans at least for the short term.
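
For reference, the failing pattern in remote model code looks roughly like this (a minimal sketch, not the exact MiniCPM3 source; see the link above for that):

```python
# Sketch of the kind of remote-code usage that breaks once the flag is removed.
# On transformers 4.48.0 the import below raises ImportError, because
# is_torch_greater_or_equal_than_1_13 no longer exists in pytorch_utils.
from transformers.pytorch_utils import is_torch_greater_or_equal_than_1_13

if is_torch_greater_or_equal_than_1_13:
    ...  # code path for PyTorch >= 1.13
else:
    ...  # fallback for older PyTorch
```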

cc @ydshieh

@Rocketknight1
Member

@ArthurZucker WDYT? I had to do some emergency PRs to MiniMax because of this, so maybe keeping them as static True makes sense for backward compatibility

@tlrmchlsmth
Contributor Author

tlrmchlsmth commented Jan 16, 2025

@Rocketknight1 FYI the link to my PR that does the same thing for MiniCPM3 is https://huggingface.co/openbmb/MiniCPM3-4B/discussions/39

@ydshieh
Collaborator

ydshieh commented Jan 17, 2025

OK to restore for backward compatibility, but let's compute them as before rather than just setting them to True, to avoid unpleasant surprises.
Thank you for raising this issue and for the PR.

@tlrmchlsmth
Contributor Author

OK to restore for backward compatibility, but let's compute them as before rather than just setting them to True, to avoid unpleasant surprises. Thank you for raising this issue and for the PR.

Sounds good -- updated.
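
For reference, the updated approach derives the flags from the installed torch version rather than hard-coding them to True. A minimal sketch of that idea, using packaging (an illustration, not necessarily the exact code in this PR):

```python
# Derive the compatibility booleans from the installed torch version
# instead of hard-coding them to True.
import torch
from packaging import version

# Strip any local/dev suffix (e.g. "+cu124") before comparing.
parsed_torch_version_base = version.parse(version.parse(torch.__version__).base_version)

is_torch_greater_or_equal_than_2_0 = parsed_torch_version_base >= version.parse("2.0")
is_torch_greater_or_equal_than_1_13 = parsed_torch_version_base >= version.parse("1.13")
is_torch_greater_or_equal_than_1_12 = parsed_torch_version_base >= version.parse("1.12")
```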

@ydshieh ydshieh merged commit 10e8cd0 into huggingface:main Jan 17, 2025
21 of 23 checks passed
@ydshieh
Collaborator

ydshieh commented Jan 17, 2025

@Rocketknight1 I just merged this PR

@Rocketknight1
Member

Yes, sounds good! Thanks @ydshieh!

@noldorj

noldorj commented Jan 29, 2025

It still happens with:
Torch version: 2.5.1+cu124
Transformers version: 4.48.1

@ydshieh
Collaborator

ydshieh commented Jan 29, 2025

Hmm, I don't think the fix is included in the patch.

Would it work for you to use (for now) the dev version of 4.48 or an older version, v4.47.1?

Sorry for the inconvenience.

@tlrmchlsmth
Contributor Author

For vLLM, we're on 4.45.2, but would like to pick up 4.48 for bamba support

@tlrmchlsmth tlrmchlsmth deleted the restore_torch_greater_1_x branch January 30, 2025 03:10
ArthurZucker pushed a commit that referenced this pull request Jan 30, 2025
Restore is_torch_greater_or_equal_than for backward compatibility (#35734)

* Restore is_torch_greater_or_equal_than for backward compatibility

Signed-off-by: Tyler Michael Smith <[email protected]>

* review comments

Signed-off-by: Tyler Michael Smith <[email protected]>

---------

Signed-off-by: Tyler Michael Smith <[email protected]>
@ArthurZucker
Collaborator

yep working on the patch!

@ArthurZucker
Collaborator

will probably ship now and have another one for the num items in batch issue

@abidkhan484

For vLLM, we're on 4.45.2, but would like to pick up 4.48 for bamba support

Same error with transformers 4.48.1 and torch 2.5.1 here.

@ydshieh
Collaborator

ydshieh commented Jan 30, 2025

For vLLM, we're on 4.45.2, but would like to pick up 4.48 for bamba support

Same error with transformers 4.48.1 and torch 2.5.1 here.

The fix in this PR is not included in 4.48.1. We will have a new patch release.
