
Hyperparameter Optimization & Custom Evaluation

@epeters3 epeters3 released this 07 Apr 15:09

Two main features this release:

Hyperparameter Optimization

Hyperparameters can be optimized on the best pipeline found via the skplumber.SKPlumber.crank(..., tune=True) API, or on any single pipeline via the skplumber.tuners.ga.ga_tune method. This is accomplished using the flexga package and the hyperparameter annotations that have been added to all machine learning primitives.
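To illustrate the idea behind genetic-algorithm tuning (the real work in skplumber is delegated to the flexga package; this sketch is not skplumber's or flexga's actual implementation, and all names in it are hypothetical), a minimal GA loop over a hyperparameter dict might look like:

```python
import random

def ga_tune_sketch(score, param_ranges, pop_size=10, generations=20, seed=0):
    """Toy genetic-algorithm tuner: evolves hyperparameter dicts to
    maximize score(params). Illustrative sketch only."""
    rng = random.Random(seed)

    def random_params():
        # Sample each hyperparameter uniformly from its (low, high) range.
        return {k: rng.uniform(lo, hi) for k, (lo, hi) in param_ranges.items()}

    def mutate(params):
        # Perturb one hyperparameter with Gaussian noise, clamped to its range.
        child = dict(params)
        k = rng.choice(list(param_ranges))
        lo, hi = param_ranges[k]
        child[k] = min(hi, max(lo, child[k] + rng.gauss(0, (hi - lo) * 0.1)))
        return child

    population = [random_params() for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: keep the top half, refill with mutated parents.
        parents = sorted(population, key=score, reverse=True)[: pop_size // 2]
        children = [mutate(rng.choice(parents)) for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=score)

# Toy objective: the score peaks at alpha == 0.3.
best = ga_tune_sketch(lambda p: -(p["alpha"] - 0.3) ** 2, {"alpha": (0.0, 1.0)})
```

In skplumber itself the objective being maximized would be the pipeline's evaluation score, with the search space coming from the primitives' hyperparameter annotations.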

Custom Evaluation

Previously, skplumber.SKPlumber.crank could only evaluate pipelines with k-fold cross validation. Now any evaluation method can be used by passing in a custom evaluator, e.g. skplumber.SKPlumber.crank(..., evaluator=my_evaluator). skplumber ships with evaluators for k-fold cross validation, simple train/test splitting, and down-sampled train/test splitting.
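As a rough sketch of what a custom train/test-split evaluator could do (the callable's signature here is an assumption for illustration, not skplumber's actual evaluator interface):

```python
import random

def train_test_evaluator_sketch(fit, predict, metric, X, y, test_ratio=0.25, seed=0):
    """Toy single-split evaluator: fits on a random training portion and
    scores predictions on the held-out rest. The (fit, predict, metric)
    signature is assumed for illustration, not skplumber's interface."""
    rng = random.Random(seed)
    idx = list(range(len(X)))
    rng.shuffle(idx)
    cut = int(len(idx) * (1 - test_ratio))
    train, test = idx[:cut], idx[cut:]
    model = fit([X[i] for i in train], [y[i] for i in train])
    preds = [predict(model, X[i]) for i in test]
    return metric([y[i] for i in test], preds)

# Toy "pipeline": predict the training-set mean for every row, scored by MAE.
fit = lambda X, y: sum(y) / len(y)
predict = lambda model, x: model
mae = lambda ys, ps: sum(abs(a - b) for a, b in zip(ys, ps)) / len(ys)
score = train_test_evaluator_sketch(fit, predict, mae, [[i] for i in range(20)], [1.0] * 20)
```

A k-fold or down-sampled evaluator would differ only in how it partitions the indices before fitting and scoring.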