
how to generate 228 size leaving_group.pt with USPTO50K #10

Open
295825725 opened this issue Dec 21, 2023 · 0 comments

295825725 commented Dec 21, 2023

Hi, thanks for the amazing work! However, when I use the provided code, dataset, and model, the leaving_group.pt I generate for USPTO50K has size 238, while the provided model checkpoint expects size 228. How can I make them match?
Traceback (most recent call last):
  File "entry.py", line 195, in <module>
    main()
  File "entry.py", line 67, in main
    model = build_model(args)
  File "entry.py", line 187, in build_model
    dataset_path=args.dataset
  File "/data/env-retro/anaconda3/envs/retro_liuth/lib/python3.7/site-packages/pytorch_lightning/core/saving.py", line 142, in load_from_checkpoint
    **kwargs,
  File "/data/env-retro/anaconda3/envs/retro_liuth/lib/python3.7/site-packages/pytorch_lightning/core/saving.py", line 179, in _load_from_checkpoint
    return _load_state(cls, checkpoint, strict=strict, **kwargs)
  File "/data/env-retro/anaconda3/envs/retro_liuth/lib/python3.7/site-packages/pytorch_lightning/core/saving.py", line 237, in _load_state
    keys = obj.load_state_dict(checkpoint["state_dict"], strict=strict)
  File "/data/env-retro/anaconda3/envs/retro_liuth/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1672, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for RetroAGT:
	size mismatch for lg_out_fn.0.weight: copying a param with shape torch.Size([228, 512]) from checkpoint, the shape in current model is torch.Size([238, 512]).
	size mismatch for lg_out_fn.0.bias: copying a param with shape torch.Size([228]) from checkpoint, the shape in current model is torch.Size([238]).
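For reference, here is a minimal sketch (not from the repository) of how such a mismatch can be pinpointed before load_state_dict raises: compare each tensor's shape in the checkpoint against the freshly built model. The function name report_shape_mismatches and the two Linear layers standing in for the checkpoint and the rebuilt model are hypothetical.

```python
import torch.nn as nn

def report_shape_mismatches(model: nn.Module, state_dict: dict) -> list:
    """Return (key, checkpoint_shape, model_shape) for every shared key
    whose tensor shapes differ between the checkpoint and the model."""
    own = model.state_dict()
    mismatches = []
    for key, tensor in state_dict.items():
        if key in own and own[key].shape != tensor.shape:
            mismatches.append((key, tuple(tensor.shape), tuple(own[key].shape)))
    return mismatches

# Stand-ins for the situation above: a checkpoint saved with a
# 228-way leaving-group head versus a model rebuilt with 238 classes.
saved = nn.Linear(512, 228).state_dict()   # plays the role of the checkpoint
current = nn.Linear(512, 238)              # plays the role of the rebuilt model
for key, ckpt_shape, model_shape in report_shape_mismatches(current, saved):
    print(f"{key}: checkpoint {ckpt_shape} vs model {model_shape}")
```

This only localizes the disagreement; resolving it still requires the leaving-group vocabulary used to build the model to match the one the checkpoint was trained with.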

Also, I cannot find a synthesis route for CC(CC1=CC=C2OCOC2=C1)NCC(O)C1=CC=C(O)C(O)=C1 using the provided model, even with 1000 iterations:
root : INFO Final search status | success value | iter: 47.70581531524658 | inf | 1000
root : INFO Synthesis path for CC(CC1=CC=C2OCOC2=C1)NCC(O)C1=CC=C(O)C(O)=C1 not found. Please try increasing the number of iterations.
None
