This repository has been archived by the owner on Jul 10, 2021. It is now read-only.

Support custom Layer types known by backend #179

Closed
alexjc opened this issue Feb 18, 2016 · 2 comments
Comments

@alexjc
Member

alexjc commented Feb 18, 2016

If a user specifies a Layer in the constructor that's already known to the backend (Lasagne), simply pass it through directly. This makes it possible to build more advanced architectures without "hitting the ceiling", at the cost of some intuitiveness and simplicity.

@nikcheerla

This would really be useful! The only reason I don't use scikit-neuralnetwork for all deep learning tasks right now is the fact that I can't easily add things like max-pooling layers or dropout layers.

EDIT: never mind, I just realized that max-pooling and dropout options exist! This would still be great in order to implement more exotic layers though.

@alexjc alexjc mentioned this issue Apr 2, 2016
@alexjc alexjc closed this as completed Apr 2, 2016
@alexjc
Member Author

alexjc commented Apr 2, 2016

@nikcheerla The code is there, you can do things like:

layers=[Native(lasagne.layers.recurrent.LSTMLayer, <parameters>)]

A section in the documentation will follow soon.
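For readers curious how such a pass-through wrapper might work internally, here is a minimal self-contained sketch. This is not sknn's actual implementation; the `construct` method and the stand-in `FakeDenseLayer` class are illustrative assumptions, standing in for a real backend layer such as `lasagne.layers.recurrent.LSTMLayer`:

```python
class Native:
    """Stores a backend-specific layer type plus its constructor arguments,
    so the frontend can hand it straight to the backend untouched."""

    def __init__(self, backend_type, *args, **kwargs):
        self.backend_type = backend_type  # e.g. lasagne.layers.recurrent.LSTMLayer
        self.args = args
        self.kwargs = kwargs

    def construct(self, incoming):
        # Instantiate the backend layer directly, passing the previous
        # layer in the network as its input.
        return self.backend_type(incoming, *self.args, **self.kwargs)


# Stand-in for a real backend layer class, used only for illustration.
class FakeDenseLayer:
    def __init__(self, incoming, num_units):
        self.incoming = incoming
        self.num_units = num_units


# The frontend defers construction until it knows the preceding layer.
layer = Native(FakeDenseLayer, num_units=128)
built = layer.construct(incoming="previous_layer")
print(built.num_units)  # 128
```

The key design point is deferral: `Native` only records the type and arguments at configuration time, and the backend layer is constructed later, once the preceding layer in the stack exists to be wired in as `incoming`.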
