About freeze batch norm #15
Comments
Parameters in convolutional layers are weights and biases. If BN is frozen, in my view, the weights and biases will not change any more, and the performance of the net will not improve.
Why do you think the conv weights and biases will not change any more? The parameters in BN are fixed, but the parameters in the conv layers can still be updated.
On 4 Feb 2021, at 2:39 AM (+1100), sshan-zhao/GASDA <reply@reply.github.com> wrote:
> Parameters in convolutional layers are weights and biases. If frozen bn, in my view, weights and biases will not change any more,
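For concreteness, here is a minimal PyTorch-style sketch (not taken from the GASDA code; the network and function names are illustrative) of what "freezing BN" usually means: the BN layers are put in eval mode and their affine parameters are excluded from optimization, while the conv weights and biases remain trainable.

```python
import torch
import torch.nn as nn

def freeze_bn(model: nn.Module):
    """Freeze all BatchNorm2d layers: stop running-stat updates (eval mode)
    and stop gradient updates for gamma/beta, leaving conv parameters trainable."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.eval()                      # freezes running_mean / running_var updates
            for p in m.parameters():      # gamma (weight) and beta (bias)
                p.requires_grad = False

# Tiny example network: conv parameters stay trainable, BN is frozen.
net = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),
    nn.ReLU(),
)
freeze_bn(net)

# Only parameters with requires_grad=True (here the conv weight and bias)
# are handed to the optimizer, so they are what gets updated during training.
optimizer = torch.optim.Adam(
    (p for p in net.parameters() if p.requires_grad), lr=1e-4
)
```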
Hi, sorry for bothering you.
I've been wondering for a long time why you froze BN when training GASDA with the pretrained F_s, F_t and CycleGAN.
If batch norm is frozen, what parameters will be optimized during training?