ValueError: .to is not supported for 4-bit or 8-bit bitsandbytes models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct dtype.
Hi, if I want to compute the perplexity of a quantized model, what parameters should I pass to the .compute() function?
For example,
Much thx!
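For context, the metric's `.compute()` calls `.to(device)` on the model internally, which bitsandbytes 4-bit/8-bit models reject because they are already placed on the right device in the right dtype. One workaround is to compute perplexity yourself from the model's loss, since perplexity is just the exponential of the mean negative log-likelihood per token. The helper below is a hedged sketch of only that arithmetic in plain Python; with a real quantized model you would obtain the mean NLL as `outputs.loss` from a forward pass run with `labels=input_ids`, without ever calling `.to()` on the model.

```python
import math

def perplexity_from_nll(nll_per_token):
    """Perplexity = exp(mean negative log-likelihood per token).

    With a transformers model, each entry here would come from
    `outputs.loss` (already a mean NLL over the sequence) returned
    by a forward pass with `labels=input_ids`. A bitsandbytes
    4-bit/8-bit model is used as loaded -- no `.to()` call needed.
    """
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# Sanity check: a model that assigns probability 1/4 to every
# token has perplexity exactly 4.
nll = [-math.log(0.25)] * 8
print(round(perplexity_from_nll(nll), 6))  # → 4.0
```

This sidesteps the metric object entirely: the quantized model stays wherever bitsandbytes put it, and only the scalar losses are aggregated on the host.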