However, the calculation here is as follows:
```python
# Compute H_Y(X), the entropy of the binary split "feat <= val" itself
def entropy_YX(self, X, Y, feat, val):
    HYX = 0
    N = len(Y)
    if N == 0:
        return 0
    Y_l = Y[X[:, feat] <= val]
    HYX += -self.aloga(len(Y_l) / N)
    Y_r = Y[X[:, feat] > val]
    HYX += -self.aloga(len(Y_r) / N)
    return HYX
```
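For reference, here is a minimal standalone sketch of what this denominator computes, assuming `aloga(p)` is the helper `p * log2(p)` (with the convention 0·log 0 = 0); the names and the even-split example are illustrative, not from the repository:

```python
import numpy as np

def aloga(p):
    # assumed helper: p * log2(p), with the convention 0 * log 0 = 0
    return p * np.log2(p) if p > 0 else 0.0

def entropy_YX(X, Y, feat, val):
    # Entropy of the two-way partition induced by "feat <= val":
    # H_Y(X) = -(N_l/N) * log2(N_l/N) - (N_r/N) * log2(N_r/N)
    N = len(Y)
    if N == 0:
        return 0.0
    p_l = np.sum(X[:, feat] <= val) / N
    return -aloga(p_l) - aloga(1 - p_l)

# Toy data: splitting 4 samples evenly (2 left, 2 right) gives 1 bit
X = np.array([[1.0], [2.0], [3.0], [4.0]])
Y = np.array([0, 0, 1, 1])
print(entropy_YX(X, Y, 0, 2.0))  # 1.0
```

Note this depends only on how many samples fall on each side of the threshold, not on the labels `Y`, which is why it serves as the gain-ratio denominator.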
The formula here contains only
I ran into the same issue and tried changing the code to match the book. The effect on the results is not significant, but the book's form is more logically consistent.
Is the denominator used here when computing the information gain ratio incorrect?