Bug in RandomProjectionForest due to Nx.dot with EXLA Backend #230
Comments
Keep in mind that Nx shows the wrong representation for f32; there is an open issue about that. I am not sure whether it affects this case, but it is worth keeping in mind. Beyond that, I am not sure how much we should sweat over this. Perhaps we add a warning recommending an f64 input tensor for better precision, and we just make sure to preserve the input precision?
I wouldn't call this a bug. Those numbers are basically the same to within 1.0e-5 precision. This is most likely due to some difference in the underlying calculation chosen by the EXLA compiler. Try comparing the calculations on the CPU and the GPU, for instance.
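To illustrate the point about precision, here is a small NumPy sketch (not Nx/EXLA, but the same IEEE-754 effect): two mathematically identical f32 dot products computed with different accumulation orders agree only approximately, which is the kind of discrepancy a compiler's choice of reduction order can introduce.

```python
import numpy as np

# Illustrative only: compare a library dot product against a strict
# left-to-right f32 accumulation of the same elementwise products.
rng = np.random.default_rng(0)
x = rng.standard_normal(1024).astype(np.float32)
y = rng.standard_normal(1024).astype(np.float32)

blas_dot = np.dot(x, y)            # whatever order the BLAS kernel uses
naive_dot = np.float32(0.0)
for a, b in zip(x, y):             # strict sequential accumulation
    naive_dot = np.float32(naive_dot + a * b)

print(blas_dot, naive_dot)         # close, but not necessarily bitwise equal
```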
By bug, I mean a bug in RandomProjectionForest, not in Nx.dot.
I would still compare the values, because Nx.dot can change the return value slightly depending on the shape and the device.
The problem is, when constructing the k-NN graph, I want […]. Switching to […]
Currently, there is a bug in RandomProjectionForest that I am not exactly sure how to resolve. Namely, when querying the tree, the projection value might not be the same as when fitting the tree. This happens due to the inner workings of XLA, as mentioned here. For example, if we have
then doing
gives
On the other hand, doing
gives
Also,
forest.medians[0][1]
gives the same value as the first result.
This is a problem when comparing projections to medians, as done in the module.
scholar/lib/scholar/neighbors/random_projection_forest.ex
Line 323 in 58ee5e3
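To make the failure mode concrete, here is a hypothetical NumPy sketch (the variable names are illustrative, not Scholar's actual code): a point whose projection equals the stored median at fit time can be routed to the other subtree at query time if the recomputed projection differs by as little as one ulp.

```python
import numpy as np

median = np.float32(0.5)                  # median stored while fitting
proj_fit = median                         # point sits exactly on the median
# Recomputed projection, one ulp higher than the stored value:
proj_query = np.nextafter(median, np.float32(1.0))

left_at_fit = bool(proj_fit <= median)    # True  -> point stored in left subtree
left_at_query = bool(proj_query <= median)  # False -> search descends right
print(left_at_fit, left_at_query)         # prints: True False
```

So even though the two projections agree to within 1.0e-5, the strict comparison against the median sends the query down a subtree that does not contain the point.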
One way to resolve this could be to use :f64 for both hyperplanes and medians, albeit more memory-expensive. Any thoughts on this?
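As a rough sanity check of the f64 idea, here is a NumPy stand-in (in Nx the type would be {:f, 64}): the gap between two summation orders of the same products is several orders of magnitude smaller in f64 than in f32, so carrying hyperplanes and medians in f64 makes the projection far less sensitive to whichever reduction order the compiler picks, at twice the memory cost.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)
h = rng.standard_normal(4096)

def order_gap(dtype):
    """Gap between sequential and pairwise summation of the same products."""
    p = x.astype(dtype) * h.astype(dtype)
    sequential = dtype(0)
    for v in p:                 # strict left-to-right accumulation
        sequential = dtype(sequential + v)
    pairwise = p.sum()          # np.sum uses pairwise accumulation
    return abs(float(sequential) - float(pairwise))

print(order_gap(np.float32), order_gap(np.float64))
```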