If a user specifies a Layer in the constructor that's known to the backend (lasagne), simply pass it through directly. This makes it possible to build more advanced architectures without "hitting the ceiling", at the cost of some intuitiveness and simplicity.
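To make the idea concrete, here is a minimal standalone sketch of the requested pass-through behaviour. It is not sknn's actual backend code: the `build_network` helper, the spec format, and the fallback translation are all assumptions made only to illustrate handing a raw lasagne layer class straight to the backend.

```python
import lasagne.layers as ll
from lasagne.nonlinearities import rectify, softmax

def build_network(input_shape, layer_specs):
    # Sketch only (not sknn's real implementation): anything that is
    # already a lasagne Layer class is passed through to the backend
    # untouched; everything else is translated the usual way
    # (simplified here to plain dense layers).
    network = ll.InputLayer(shape=input_shape)
    for spec in layer_specs:
        head = spec[0] if isinstance(spec, tuple) else spec
        if isinstance(head, type) and issubclass(head, ll.Layer):
            kwargs = spec[1] if isinstance(spec, tuple) else {}
            network = head(network, **kwargs)   # direct pass-through
        else:
            name, units = spec                  # sknn-style ("Rectifier", 100)
            nonlin = softmax if name == "Softmax" else rectify
            network = ll.DenseLayer(network, num_units=units,
                                    nonlinearity=nonlin)
    return network

# Mixing a raw lasagne DropoutLayer into an otherwise ordinary spec list:
net = build_network((None, 784), [
    ("Rectifier", 100),
    (ll.DropoutLayer, {"p": 0.5}),
    ("Softmax", 10),
])
```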
This would really be useful! The only reason I don't use scikit-neuralnetwork for all deep learning tasks right now is that I can't easily add things like max-pooling or dropout layers.
EDIT: never mind, I just realized that max-pooling and dropout options already exist! This would still be great for implementing more exotic layers, though.
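For reference, a rough sketch of what those existing options look like with sknn's constructor. The parameter names here (`kernel_shape`, `pool_shape`, `pool_type`, `dropout`) are recalled from the sknn documentation and should be double-checked against it:

```python
from sknn.mlp import Classifier, Convolution, Layer

# Existing pooling/dropout options mentioned in the comment above.
# Parameter names are assumptions based on the sknn docs; verify there.
nn = Classifier(
    layers=[
        Convolution("Rectifier", channels=8, kernel_shape=(3, 3),
                    pool_shape=(2, 2), pool_type="max"),
        Layer("Rectifier", units=100, dropout=0.25),
        Layer("Softmax")],
    learning_rate=0.02,
    n_iter=5)
```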