Implement relu #32
I would like ReLU in genann too, but the backpropagation is also something I don't understand here :(
Yes, in the code only the derivative of the sigmoid, d/dx σ(x) = σ(x)(1 − σ(x)), is implemented. I think we have to write a generic function for derivatives so that we can add other activation functions like tanh and ReLU.
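One possible shape for that generic approach, sketched in C. The typedef and names below are illustrative assumptions, not genann's actual API:

```c
#include <math.h>

/* Hypothetical pairing of an activation with its derivative so the
 * backward pass can stay generic. Derivatives are written in terms of
 * the neuron's output y = f(x), which is what backprop has at hand. */
typedef struct {
    double (*f)(double x);   /* activation */
    double (*df)(double y);  /* derivative, given the activation output */
} activation;

static double sigmoid(double x)   { return 1.0 / (1.0 + exp(-x)); }
static double sigmoid_d(double y) { return y * (1.0 - y); }  /* σ'(x) = σ(x)(1 − σ(x)) */

static double relu(double x)   { return x > 0.0 ? x : 0.0; }
static double relu_d(double y) { return y > 0.0 ? 1.0 : 0.0; }

static const activation ACT_SIGMOID = { sigmoid, sigmoid_d };
static const activation ACT_RELU    = { relu,    relu_d };
```

Expressing the derivative in terms of the output works for both cases: for the sigmoid it is y(1 − y), and for ReLU the output is positive exactly when the input is, so testing y > 0 gives the same result as testing x > 0 (the point x = 0 aside).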
Hi everyone,
It's right there at the top of this bug report thread, |
Hello,
I started to implement the ReLU function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR.
But I am a bit lost in the way you compute the backpropagation of the neural network. The derivative of ReLU is trivial:
(a > 0.0) ? 1.0 : 0.0
But I cannot understand where I should plug it into your formula, as I do not understand how you compute your backpropagation. Did you implement only the derivative of the sigmoid?
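For reference, this is roughly where the derivative enters a textbook backward pass. The function below is a minimal sketch, not genann's actual internals, assuming the output-layer delta is the error (target − output) scaled by the activation derivative at that neuron's output:

```c
/* Output-layer deltas for a mean-squared-error loss.
 * df is the activation derivative taken at the neuron's output,
 * e.g. sigmoid_d or relu_d from the sketch above. */
static void output_layer_deltas(const double *output, const double *target,
                                double *delta, int n,
                                double (*df)(double y)) {
    for (int i = 0; i < n; ++i) {
        delta[i] = (target[i] - output[i]) * df(output[i]);
    }
}
```

With df = sigmoid_d this reduces to (t − o) · o · (1 − o), which matches the sigmoid-only expression the earlier comment describes as the one currently implemented; swapping in relu_d at that point, plus the corresponding change for the hidden-layer deltas, is where the ReLU derivative would plug in.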