Commit
Fix #699 (#700)
* Update lightning_model.py

* Update config.yaml

* Update config.yaml

Co-authored-by: Samet Akcay <samet.akcay@intel.com>
jpcbertoldo and samet-akcay authored Nov 21, 2022
1 parent 2edcdea commit e66a17c
Showing 2 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions anomalib/models/cflow/config.yaml

@@ -8,7 +8,6 @@ dataset:
   train_batch_size: 16
   test_batch_size: 16
   inference_batch_size: 16
-  fiber_batch_size: 64
   num_workers: 8
   transform_config:
     train: null
@@ -27,7 +26,8 @@ model:
   condition_vector: 128
   coupling_blocks: 8
   clamp_alpha: 1.9
-  soft_permutation: false
+  fiber_batch_size: 64
+  permute_soft: false
   lr: 0.0001
   early_stopping:
     patience: 2
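
After this change, both CFLOW-specific options live under the model: section of the config. A minimal sketch of reading them back with OmegaConf, which anomalib configs use (the path assumes the repository root as working directory; the printed values are the defaults shown in the diff):

from omegaconf import OmegaConf

# Load the CFLOW config; anomalib configs are plain OmegaConf YAML files.
config = OmegaConf.load("anomalib/models/cflow/config.yaml")

# Both keys now resolve under the model section.
print(config.model.fiber_batch_size)  # 64
print(config.model.permute_soft)      # False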
4 changes: 2 additions & 2 deletions anomalib/models/cflow/lightning_model.py

@@ -186,12 +186,12 @@ def __init__(self, hparams: Union[DictConfig, ListConfig]) -> None:
             backbone=hparams.model.backbone,
             layers=hparams.model.layers,
             pre_trained=hparams.model.pre_trained,
-            fiber_batch_size=hparams.dataset.fiber_batch_size,
+            fiber_batch_size=hparams.model.fiber_batch_size,
             decoder=hparams.model.decoder,
             condition_vector=hparams.model.condition_vector,
             coupling_blocks=hparams.model.coupling_blocks,
             clamp_alpha=hparams.model.clamp_alpha,
-            permute_soft=hparams.model.soft_permutation,
+            permute_soft=hparams.model.permute_soft,
         )
         self.hparams: Union[DictConfig, ListConfig]  # type: ignore
         self.save_hyperparameters(hparams)
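
The Python change mirrors the config move: every CflowModel argument is now read from hparams.model, so the config keys and the constructor keywords line up. A minimal sketch of the corrected wiring, written as a stand-alone helper rather than the actual anomalib class:

from omegaconf import DictConfig

def cflow_model_kwargs(hparams: DictConfig) -> dict:
    # Collect the two CflowModel keyword arguments touched by this fix.
    return {
        "fiber_batch_size": hparams.model.fiber_batch_size,  # was hparams.dataset.fiber_batch_size
        "permute_soft": hparams.model.permute_soft,          # was hparams.model.soft_permutation
    }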
