I clearly understand how one-vs-rest works for classification models, but I am not sure how it is used to solve the same issue for oversampling or undersampling binary-class models in imbalanced-learn. Assuming that the multi-class extensions are indeed not present in the papers, it would be nice if this were explained in the docs.
When no references are given, it is usually because the original paper does not discuss this matter (which is the case most of the time). We could indeed expand the documentation there.
Off the top of my head, one-vs-rest would mean using the current class vs. all other classes in the nearest-neighbours search when cleaning, whereas the paper would describe something like minority vs. majority in the binary case.
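To make that concrete, here is a minimal sketch of how an ENN-style cleaning rule can be read one-vs-rest: each class is taken in turn as the class of interest and all remaining classes are lumped together for the neighbour vote. This is not imbalanced-learn's actual implementation; `one_vs_rest_enn` is a hypothetical helper written only to illustrate the scheme.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def one_vs_rest_enn(X, y, n_neighbors=3):
    """Illustrative one-vs-rest ENN-style cleaning (hypothetical helper).

    A sample is dropped when the majority of its neighbours belong to
    'the rest', i.e. to any class other than its own.
    """
    X, y = np.asarray(X), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    # neighbour indices for every sample; column 0 is the sample itself
    _, idx = nn.kneighbors(X)
    idx = idx[:, 1:]
    keep = np.ones(len(X), dtype=bool)
    for c in np.unique(y):
        members = np.where(y == c)[0]
        # binary view for this round: current class vs. everything else
        rest_votes = (y[idx[members]] != c).sum(axis=1)
        keep[members] = rest_votes <= n_neighbors // 2
    return X[keep], y[keep]
```

With only two classes, "the rest" is simply the other class, so this reduces exactly to the minority-vs-majority rule described in the binary-case papers.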
In the docs, the one-vs-rest scheme is frequently mentioned alongside the references. So far, every time I read the referenced paper, there was no one-vs-rest scheme described, nor any extension to multi-class whatsoever. Take, for instance, TomekLinks, CondensedNearestNeighbour, or EditedNearestNeighbours.
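For what it's worth, these samplers do accept multi-class `y` in practice. A quick check along these lines (assuming a recent imbalanced-learn and scikit-learn) shows each class being resampled against the rest:

```python
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.under_sampling import TomekLinks, EditedNearestNeighbours

# a toy dataset with three imbalanced classes
X, y = make_classification(n_samples=1000, n_classes=3, n_informative=4,
                           weights=[0.7, 0.2, 0.1], random_state=0)
print("original:", Counter(y))

for sampler in (TomekLinks(), EditedNearestNeighbours()):
    X_res, y_res = sampler.fit_resample(X, y)
    print(type(sampler).__name__, Counter(y_res))
```

The question raised in this issue is exactly what binary rule that multi-class behaviour generalises, since the cited papers only cover the two-class case.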