Does the DARPA SubT dataset support mini-batch training with DELORA? #25

Open
JaySlamer opened this issue Jan 5, 2023 · 0 comments

Hi, thanks for your great work.
I trained with the default hyperparameters and could not get any sensible result.
I noticed that the default batch_size is 1 in the config file, which is very likely to make training unreliable and unstable. The comments in the config say:

batch_size > 1 currently only supported if single image dims are used (vertical and horizontal cells)
and
In general: larger batches currently implemented rather primitively

I don't understand what these comments mean, or whether the DARPA SubT dataset can be trained with a larger batch size.
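My (possibly wrong) reading of the first comment is that with batch_size > 1 every scan has to be projected onto a range image of the same size (the same number of vertical and horizontal cells), because only equally sized images can be stacked into one batch tensor. A minimal sketch of what I mean (the image dims and channel count below are made up, not taken from the config):

import torch

# Hypothetical range-image dims and channel count, purely for illustration.
H, W = 64, 720                                   # vertical / horizontal cells
scans = [torch.zeros(3, H, W) for _ in range(32)]
batch = torch.stack(scans)                       # fine: every image is 3 x H x W -> batch of shape (32, 3, H, W)

mixed = [torch.zeros(3, 64, 720), torch.zeros(3, 16, 1024)]
# torch.stack(mixed)                             # fails: stack expects each tensor to be of equal size

Is that the right interpretation, and does the DARPA SubT dataset satisfy it?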
After changing the batch size to 32, I got an error when the first training epoch reached 99.8%.
Here is the error message:
Traceback (most recent call last):
  File "/home/slam/Documents/delora/bin/run_training.py", line 94, in <module>
    trainer.train()
  File "/home/slam/Documents/delora/src/deploy/trainer.py", line 122, in train
    epoch_losses = self.train_epoch(epoch=epoch, dataloader=dataloader)
  File "/home/slam/Documents/delora/src/deploy/trainer.py", line 71, in train_epoch
    self.step(
  File "/home/slam/Documents/delora/src/deploy/deployer.py", line 292, in step
    preprocessed_dict = preprocessed_dicts[batch_index]
IndexError: list index out of range
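
If I read the traceback correctly, the crash happens on the final, incomplete batch of the epoch, which would explain why it only shows up near the end (99.8%): when the number of scans is not divisible by 32, the last batch contains fewer samples than batch_size, but the loop in step() still indexes preprocessed_dicts up to batch_size - 1. A standalone sketch of that failure mode in plain PyTorch (the dataset size and dict contents are invented; this is not DELORA's actual code):

import torch
from torch.utils.data import DataLoader, TensorDataset

BATCH_SIZE = 32
dataset = TensorDataset(torch.zeros(1000, 3))     # 1000 % 32 = 8, so the last batch is short
loader = DataLoader(dataset, batch_size=BATCH_SIZE)

for (scans,) in loader:
    # one preprocessed dict per sample actually present in this batch
    preprocessed_dicts = [{"scan": s} for s in scans]
    try:
        for batch_index in range(BATCH_SIZE):     # assumes every batch is full
            preprocessed_dict = preprocessed_dicts[batch_index]
    except IndexError:
        print(f"IndexError: last batch has only {len(preprocessed_dicts)} samples")

If that is indeed the cause, either looping over range(len(preprocessed_dicts)) instead of the configured batch size, or building the DataLoader with drop_last=True so the incomplete final batch is skipped, should avoid the crash, but I am not sure which of the two is intended for DELORA.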
