
Several improvements for lookup_embedder.penalty #97

Merged: 4 commits from lookup_embedder_penalty_changes into master, May 22, 2020

Conversation

@samuelbroscheit (Member) commented May 14, 2020

  • Improve memory usage for the unweighted penalty by using the PyTorch builtin norm().

  • Call the embeddings in penalty so that sparse embeddings work; before this fix, the unweighted penalty did not work for sparse embeddings (see the sketch after this list).
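A minimal sketch of both changes, not the actual LibKGE code; the class and attribute names here (SketchLookupEmbedder, _embeddings, p) are illustrative stand-ins:

```python
import torch
import torch.nn as nn

class SketchLookupEmbedder(nn.Module):
    """Illustrative stand-in for LibKGE's LookupEmbedder (names are made up)."""

    def __init__(self, num_embeddings: int, dim: int, p: int = 3, sparse: bool = False):
        super().__init__()
        self.p = p
        self._embeddings = nn.Embedding(num_embeddings, dim, sparse=sparse)

    def penalty(self, indexes: torch.Tensor) -> torch.Tensor:
        # Change 1: norm() computes (sum |w|^p)^(1/p) in a single reduction,
        # so raising the result back to the p-th power yields sum |w|^p
        # without materializing an intermediate |w|**p tensor of the
        # embeddings' full size.
        #
        # Change 2: going through the embedding's forward call (instead of
        # reading self._embeddings.weight directly) is what lets sparse=True
        # embeddings receive gradients from the penalty term.
        return self._embeddings(indexes).norm(p=self.p) ** self.p

emb = SketchLookupEmbedder(100, 16, sparse=True)
emb.penalty(torch.arange(100)).backward()  # weight.grad is a sparse tensor
```

With sparse=False the same code works unchanged; the forward call is a cheap indirection in that case.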

Commits:

  • apply embed also to embed_all
  • use call to embeddings in penalty to make sparse embeddings work
@rgemulla (Member) left a comment


Thanks! I only recommend getting rid of all_indexes; see comments.

Review comments on kge/model/embedder/lookup_embedder.py (outdated, resolved)
Commit: allocate all indexes dynamically
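A minimal sketch of what "allocate all indexes dynamically" can look like; the actual LibKGE change may differ in detail, and the embed_all helper below is hypothetical. The idea is to build the full index range on demand with torch.arange rather than keeping a persistent all_indexes tensor on the embedder:

```python
import torch
import torch.nn as nn

# Before (sketch): a precomputed index buffer kept for the module's lifetime.
#   self.all_indexes = torch.arange(num_embeddings)
#
# After (sketch): build the indexes where they are used, e.g. in embed_all(),
# so the embedder carries no extra state. Going through the forward call also
# keeps sparse embeddings working, as in the penalty change above.
def embed_all(embeddings: nn.Embedding) -> torch.Tensor:
    indexes = torch.arange(
        embeddings.num_embeddings, device=embeddings.weight.device
    )
    return embeddings(indexes)
```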
@rgemulla merged commit 26d98ef into master May 22, 2020

@rgemulla (Member) commented:

Thanks!

@samuelbroscheit deleted the lookup_embedder_penalty_changes branch May 22, 2020 09:43