This repository has been archived by the owner on Jul 10, 2021. It is now read-only.

Support to transfer weights from autoencoder to mlp #58

Merged

@alexjc merged 3 commits into master from pretrain on May 23, 2015

Conversation

@alexjc (Member) commented May 22, 2015

This acts as a form of unsupervised layer-wise pretraining.
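For context, a minimal sketch of how this feature might be used once merged. It assumes the `transfer` method this PR adds to `sknn.ae.AutoEncoder`; the layer sizes, hyperparameters, and data shapes are illustrative, not part of the PR:

```python
import numpy as np
from sknn import ae, mlp

# Illustrative data: 1000 samples, 64 inputs, 1 target.
X = np.random.uniform(size=(1000, 64))
y = np.random.uniform(size=(1000, 1))

# Train an auto-encoder on the inputs alone; this is the
# unsupervised layer-wise pretraining step.
myae = ae.AutoEncoder(
    layers=[
        ae.Layer("Tanh", units=32),
        ae.Layer("Sigmoid", units=16)],
    learning_rate=0.002,
    n_iter=10)
myae.fit(X)

# Build an MLP whose hidden layers mirror the auto-encoder's,
# so the pretrained weights have matching destinations.
mymlp = mlp.Regressor(
    layers=[
        mlp.Layer("Tanh", units=32),
        mlp.Layer("Sigmoid", units=16),
        mlp.Layer("Linear")])

# Transfer the pretrained weights into the MLP (the method
# added by this PR, as I understand it), then fine-tune
# with supervised training as usual.
myae.transfer(mymlp)
mymlp.fit(X, y)
```

Note the hidden layers of the MLP must line up with the auto-encoder's layers for the weights to map across; the final output layer is newly initialized and learned during fine-tuning.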

@coveralls commented

Coverage decreased (-0.31%) to 99.69% when pulling 8a9701a on pretrain into d221c57 on master.

@coveralls commented

Coverage remained the same at 100.0% when pulling 9485e14 on pretrain into d221c57 on master.

@coveralls commented

Coverage remained the same at 100.0% when pulling 88e1336 on pretrain into d221c57 on master.

1 similar comment

alexjc added a commit that referenced this pull request on May 23, 2015:
Support to transfer weights from autoencoder to mlp
@alexjc merged commit 84fbaea into master on May 23, 2015
@alexjc (Member, Author) commented May 23, 2015

Merging as it contains a fix for a bigger bug with `best_valid_error`.

@alexjc deleted the pretrain branch on May 23, 2015 at 11:38