The batch norm evaluation results for all frameworks need improvement. As the text states, performance should be better than that of LeNet without BN, but training looks unstable in the plots, for example in the val_acc curve.
Since ResNet and DenseNet do not apply BN to the fully connected layers in their network heads, perhaps the issue lies somewhere else? Can you find any literature that supports removing BN after FC layers? If not, could you try something else?
http://preview.d2l.ai.s3-website-us-west-2.amazonaws.com/d2l-en/master/chapter_convolutional-modern/batch-norm.html
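For reference, here is a minimal sketch (assuming the PyTorch version; the chapter also has MXNet and TensorFlow implementations) of what the suggested variant could look like: BN-LeNet with BatchNorm kept after the convolutional layers but omitted after the FC layers. This is only an illustration of the proposal above, not the chapter's current code.

```python
# Minimal sketch: LeNet with BatchNorm on conv layers only (BN after FC layers removed).
import torch
from torch import nn

def lenet_bn_conv_only(num_classes=10):
    """LeNet for Fashion-MNIST with BN after conv layers; dense layers left plain."""
    return nn.Sequential(
        nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.BatchNorm2d(6), nn.Sigmoid(),
        nn.AvgPool2d(kernel_size=2, stride=2),
        nn.Conv2d(6, 16, kernel_size=5), nn.BatchNorm2d(16), nn.Sigmoid(),
        nn.AvgPool2d(kernel_size=2, stride=2),
        nn.Flatten(),
        # No BatchNorm1d after the dense layers -- the variant under discussion.
        nn.Linear(16 * 5 * 5, 120), nn.Sigmoid(),
        nn.Linear(120, 84), nn.Sigmoid(),
        nn.Linear(84, num_classes),
    )

# Quick shape check on a Fashion-MNIST-sized input.
net = lenet_bn_conv_only()
x = torch.randn(1, 1, 28, 28)
print(net(x).shape)  # torch.Size([1, 10])
```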