Performing simple inference #731
Thanks for creating such a great library. I went through the math in the README, created some simple data that would mimic the decision tree in the README, and trained a basic `DecisionTreeClassifier`. After this, I got the `state_dict` of the model, which in the debugger looked like the following:

Does this mean that if I have an input tensor, I can merely loop through the matrices, performing matrix multiplications, to get accurate outcomes? I'm asking because I am looking into performing inference with as few dependencies as possible. Ideally, I'd like to build a compiled program in Rust that just loads those weights and performs the matrix multiplications to get predictions.

---

Hi! Yes, that would work, but doing inference with matrix multiplication gives OK performance only with small trees. With deep trees it is better to use the other strategies.
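The matrix-multiplication scheme the question describes can be sketched for a tiny tree. This is an illustrative assumption, not the library's actual tensor layout or matrix names: `A` selects which feature each internal node tests, `B` holds the thresholds, `C`/`D` encode which node decisions lie on each leaf's path, and `E` maps leaves to class scores. The tree here is hypothetical (root: `f0 < 0.5` → class 0; else `f1 < 0.5` → class 1, else class 2).

```python
import numpy as np

# Hypothetical matrices for a tree with 2 internal nodes and 3 leaves.
A = np.array([[1.0, 0.0],          # (features, internal_nodes):
              [0.0, 1.0]])         # which feature each node tests
B = np.array([0.5, 0.5])           # threshold per internal node
# C: for each leaf (column), +1 if the leaf is in that node's
# "true" subtree, -1 if in the "false" subtree, 0 if off-path.
C = np.array([[ 1.0, -1.0, -1.0],
              [ 0.0,  1.0, -1.0]])
D = np.array([1.0, 2.0, 2.0])      # path sum reached only by the true leaf
E = np.array([[1.0, 0.0, 0.0],     # (leaves, classes): leaf -> class score
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

def predict(X):
    T = np.where((X @ A) < B, 1.0, -1.0)  # evaluate all node tests at once
    L = (T @ C) == D                      # one-hot over leaves
    return (L @ E).argmax(axis=1)         # pick the leaf's class
```

Every input passes through the same fixed sequence of matrix operations, which is why this is easy to port to a dependency-light Rust program, but also why cost grows with the total node count rather than the path length.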
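For deep trees, the alternative the reply alludes to is plain tree traversal over flattened node arrays, which does work proportional to the path depth instead of the full node count. A minimal sketch, using the same hypothetical tree as above (array names are illustrative, not the library's):

```python
import numpy as np

# Array-based encoding of the tree: node 0 tests f0 < 0.5
# (left -> leaf class 0, right -> node 1); node 1 tests f1 < 0.5
# (left -> leaf class 1, right -> leaf class 2).
feature   = np.array([0, 1, -1, -1, -1])        # -1 marks a leaf
threshold = np.array([0.5, 0.5, 0.0, 0.0, 0.0])
left      = np.array([2, 3, -1, -1, -1])
right     = np.array([1, 4, -1, -1, -1])
value     = np.array([-1, -1, 0, 1, 2])         # class stored at each leaf

def predict_one(x):
    node = 0
    while feature[node] != -1:                  # descend until a leaf
        if x[feature[node]] < threshold[node]:
            node = left[node]
        else:
            node = right[node]
    return value[node]
```

This touches only the nodes on one root-to-leaf path, so for a deep tree it does far less arithmetic than multiplying against matrices sized by the whole tree.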