feat: add suffixes to models to indicate their task (#588)
### Summary of Changes

Add the suffix `Classifier` to all models for classification and
`Regressor` to all models for regression. While the names are longer, this
naming has several advantages (a short usage sketch follows the list):

* Better **readability**: Several models have variants for both
classification and regression. Previously, both variants had the same name,
so the imports had to be checked to see which one was used in the code.
* Better **auto-completion**: Now users can simply write `Classifier` or
`Regressor` to get a list of all suitable models.
* Better **understandability**: Now it is obvious that logistic
regression is used for classification.
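
A rough sketch of how the new names read in user code (illustration only, not part of the committed diff; both classes are re-exported from the subpackages shown in the diffs below):

```python
# With the task suffixes, the intent is visible at the call site; previously both
# variants were named "RandomForest" and only the import path (classification vs.
# regression) revealed which task the model solved.
from safeds.ml.classical.classification import RandomForestClassifier
from safeds.ml.classical.regression import RandomForestRegressor

classifier = RandomForestClassifier()  # model for a classification task
regressor = RandomForestRegressor()    # model for a regression task
```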

---------

Co-authored-by: megalinter-bot <129584137+megalinter-bot@users.noreply.github.com>
lars-reimann and megalinter-bot committed Mar 30, 2024
1 parent ea176fc commit d490dee
Showing 42 changed files with 370 additions and 316 deletions.
6 changes: 3 additions & 3 deletions docs/glossary.md
@@ -25,7 +25,7 @@ It classifies the predictions to be either [true positive](#true-positive-tp)
## Decision Tree
A Decision Tree represents the process of conditional evaluation in a tree diagram.

Implemented in Safe-DS as [Decision Tree][safeds.ml.classical.classification.DecisionTree].
Implemented in Safe-DS as [DecisionTreeClassifier][safeds.ml.classical.classification.DecisionTreeClassifier] and [DecisionTreeRegressor][safeds.ml.classical.regression.DecisionTreeRegressor].

## F1-Score
The harmonic mean of [precision](#precision) and [recall](#recall). Formula:
@@ -48,7 +48,7 @@ It is analogous to a column within a table.
Linear Regression is a supervised machine learning model that finds the best-fitting linear relationship
between the independent and dependent variables.

Implemented in Safe-DS as [LinearRegression][safeds.ml.classical.regression.LinearRegression].
Implemented in Safe-DS as [LinearRegressionRegressor][safeds.ml.classical.regression.LinearRegressionRegressor].

## Machine Learning (ML)
Machine Learning is a generic term for artificially generating knowledge through experience.
@@ -84,7 +84,7 @@ See here for respective references:
## Random Forest
Random Forest is an ML model that works by generating decision trees at random.

Implemented in Safe-DS as [RandomForest][safeds.ml.classical.regression.RandomForest].
Implemented in Safe-DS as [RandomForestClassifier][safeds.ml.classical.classification.RandomForestClassifier] and [RandomForestRegressor][safeds.ml.classical.regression.RandomForestRegressor].

## Recall
The ability of a [classification](#classification) model to identify all the relevant data points. Formula:
4 changes: 2 additions & 2 deletions docs/tutorials/classification.ipynb
@@ -145,9 +145,9 @@
"execution_count": null,
"outputs": [],
"source": [
"from safeds.ml.classical.classification import RandomForest\n",
"from safeds.ml.classical.classification import RandomForestClassifier\n",
"\n",
"model = RandomForest()\n",
"model = RandomForestClassifier()\n",
"fitted_model= model.fit(tagged_train_table)"
],
"metadata": {
4 changes: 2 additions & 2 deletions docs/tutorials/machine_learning.ipynb
@@ -54,9 +54,9 @@
"execution_count": null,
"outputs": [],
"source": [
"from safeds.ml.classical.regression import LinearRegression\n",
"from safeds.ml.classical.regression import LinearRegressionRegressor\n",
"\n",
"model = LinearRegression()\n",
"model = LinearRegressionRegressor()\n",
"fitted_model = model.fit(tagged_table)"
],
"metadata": {
4 changes: 2 additions & 2 deletions docs/tutorials/regression.ipynb
@@ -98,9 +98,9 @@
"execution_count": null,
"outputs": [],
"source": [
"from safeds.ml.classical.regression import DecisionTree\n",
"from safeds.ml.classical.regression import DecisionTreeRegressor\n",
"\n",
"model = DecisionTree()\n",
"model = DecisionTreeRegressor()\n",
"fitted_model = model.fit(tagged_train_table)"
],
"metadata": {
28 changes: 14 additions & 14 deletions src/safeds/ml/classical/classification/__init__.py
@@ -1,21 +1,21 @@
"""Classes for classification tasks."""

from ._ada_boost import AdaBoost
from ._ada_boost import AdaBoostClassifier
from ._classifier import Classifier
from ._decision_tree import DecisionTree
from ._gradient_boosting import GradientBoosting
from ._k_nearest_neighbors import KNearestNeighbors
from ._logistic_regression import LogisticRegression
from ._random_forest import RandomForest
from ._support_vector_machine import SupportVectorMachine
from ._decision_tree import DecisionTreeClassifier
from ._gradient_boosting import GradientBoostingClassifier
from ._k_nearest_neighbors import KNearestNeighborsClassifier
from ._logistic_regression import LogisticRegressionClassifier
from ._random_forest import RandomForestClassifier
from ._support_vector_machine import SupportVectorMachineClassifier

__all__ = [
"AdaBoost",
"AdaBoostClassifier",
"Classifier",
"DecisionTree",
"GradientBoosting",
"KNearestNeighbors",
"LogisticRegression",
"RandomForest",
"SupportVectorMachine",
"DecisionTreeClassifier",
"GradientBoostingClassifier",
"KNearestNeighborsClassifier",
"LogisticRegressionClassifier",
"RandomForestClassifier",
"SupportVectorMachineClassifier",
]
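
For downstream code this rename is a breaking change. A minimal migration sketch, assuming the old names are removed outright rather than kept as deprecated aliases (the diff above drops them from both the imports and `__all__`):

```python
# Old imports (pre-#588) no longer resolve after this change:
# from safeds.ml.classical.classification import AdaBoost, RandomForest

# New imports carry the task suffix:
from safeds.ml.classical.classification import AdaBoostClassifier, RandomForestClassifier

model = RandomForestClassifier()
```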
8 changes: 4 additions & 4 deletions src/safeds/ml/classical/classification/_ada_boost.py
@@ -15,7 +15,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class AdaBoost(Classifier):
class AdaBoostClassifier(Classifier):
"""
Ada Boost classification.
@@ -99,7 +99,7 @@ def learning_rate(self) -> float:
"""
return self._learning_rate

def fit(self, training_set: TaggedTable) -> AdaBoost:
def fit(self, training_set: TaggedTable) -> AdaBoostClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -112,7 +112,7 @@ def fit(self, training_set: TaggedTable) -> AdaBoost:
Returns
-------
fitted_classifier : AdaBoost
fitted_classifier : AdaBoostClassifier
The fitted classifier.
Raises
@@ -131,7 +131,7 @@ def fit(self, training_set: TaggedTable) -> AdaBoost:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = AdaBoost(
result = AdaBoostClassifier(
learner=self.learner,
maximum_number_of_learners=self.maximum_number_of_learners,
learning_rate=self._learning_rate,
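
As the `fit` implementation above shows, fitting works copy-on-write: it returns a new `AdaBoostClassifier` with the same hyperparameters and leaves the original instance unfitted. A minimal sketch, assuming the constructor works with its default arguments and that `tagged_train_table` is a `TaggedTable` prepared as in the tutorials:

```python
from safeds.ml.classical.classification import AdaBoostClassifier

model = AdaBoostClassifier()                  # hyperparameters (learner, maximum_number_of_learners,
                                              # learning_rate) are assumed to keep their defaults
fitted_model = model.fit(tagged_train_table)  # returns a new, fitted AdaBoostClassifier
assert fitted_model is not model              # the original instance is left untouched
```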
8 changes: 4 additions & 4 deletions src/safeds/ml/classical/classification/_decision_tree.py
@@ -14,7 +14,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class DecisionTree(Classifier):
class DecisionTreeClassifier(Classifier):
"""Decision tree classification."""

def __init__(self) -> None:
@@ -23,7 +23,7 @@ def __init__(self) -> None:
self._feature_names: list[str] | None = None
self._target_name: str | None = None

def fit(self, training_set: TaggedTable) -> DecisionTree:
def fit(self, training_set: TaggedTable) -> DecisionTreeClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -36,7 +36,7 @@ def fit(self, training_set: TaggedTable) -> DecisionTree:
Returns
-------
fitted_classifier : DecisionTree
fitted_classifier : DecisionTreeClassifier
The fitted classifier.
Raises
@@ -55,7 +55,7 @@ def fit(self, training_set: TaggedTable) -> DecisionTree:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = DecisionTree()
result = DecisionTreeClassifier()
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
8 changes: 4 additions & 4 deletions src/safeds/ml/classical/classification/_gradient_boosting.py
@@ -15,7 +15,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class GradientBoosting(Classifier):
class GradientBoostingClassifier(Classifier):
"""
Gradient boosting classification.
@@ -74,7 +74,7 @@ def learning_rate(self) -> float:
"""
return self._learning_rate

def fit(self, training_set: TaggedTable) -> GradientBoosting:
def fit(self, training_set: TaggedTable) -> GradientBoostingClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -87,7 +87,7 @@ def fit(self, training_set: TaggedTable) -> GradientBoosting:
Returns
-------
fitted_classifier : GradientBoosting
fitted_classifier : GradientBoostingClassifier
The fitted classifier.
Raises
@@ -106,7 +106,7 @@ def fit(self, training_set: TaggedTable) -> GradientBoosting:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = GradientBoosting(number_of_trees=self._number_of_trees, learning_rate=self._learning_rate)
result = GradientBoostingClassifier(number_of_trees=self._number_of_trees, learning_rate=self._learning_rate)
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
src/safeds/ml/classical/classification/_k_nearest_neighbors.py
@@ -15,7 +15,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class KNearestNeighbors(Classifier):
class KNearestNeighborsClassifier(Classifier):
"""
K-nearest-neighbors classification.
@@ -56,7 +56,7 @@ def number_of_neighbors(self) -> int:
"""
return self._number_of_neighbors

def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
def fit(self, training_set: TaggedTable) -> KNearestNeighborsClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -69,7 +69,7 @@ def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
Returns
-------
fitted_classifier : KNearestNeighbors
fitted_classifier : KNearestNeighborsClassifier
The fitted classifier.
Raises
@@ -99,7 +99,7 @@ def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = KNearestNeighbors(self._number_of_neighbors)
result = KNearestNeighborsClassifier(self._number_of_neighbors)
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
src/safeds/ml/classical/classification/_logistic_regression.py
@@ -14,7 +14,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class LogisticRegression(Classifier):
class LogisticRegressionClassifier(Classifier):
"""Regularized logistic regression."""

def __init__(self) -> None:
@@ -23,7 +23,7 @@ def __init__(self) -> None:
self._feature_names: list[str] | None = None
self._target_name: str | None = None

def fit(self, training_set: TaggedTable) -> LogisticRegression:
def fit(self, training_set: TaggedTable) -> LogisticRegressionClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -36,7 +36,7 @@ def fit(self, training_set: TaggedTable) -> LogisticRegression:
Returns
-------
fitted_classifier : LogisticRegression
fitted_classifier : LogisticRegressionClassifier
The fitted classifier.
Raises
@@ -55,7 +55,7 @@ def fit(self, training_set: TaggedTable) -> LogisticRegression:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = LogisticRegression()
result = LogisticRegressionClassifier()
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
8 changes: 4 additions & 4 deletions src/safeds/ml/classical/classification/_random_forest.py
@@ -15,7 +15,7 @@
from safeds.data.tabular.containers import Table, TaggedTable


class RandomForest(Classifier):
class RandomForestClassifier(Classifier):
"""Random forest classification.
Parameters
@@ -54,7 +54,7 @@ def number_of_trees(self) -> int:
"""
return self._number_of_trees

def fit(self, training_set: TaggedTable) -> RandomForest:
def fit(self, training_set: TaggedTable) -> RandomForestClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -67,7 +67,7 @@ def fit(self, training_set: TaggedTable) -> RandomForest:
Returns
-------
fitted_classifier : RandomForest
fitted_classifier : RandomForestClassifier
The fitted classifier.
Raises
@@ -86,7 +86,7 @@ def fit(self, training_set: TaggedTable) -> RandomForest:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = RandomForest(number_of_trees=self._number_of_trees)
result = RandomForestClassifier(number_of_trees=self._number_of_trees)
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
src/safeds/ml/classical/classification/_support_vector_machine.py
@@ -30,7 +30,7 @@ def _get_sklearn_kernel(self) -> object:
"""


class SupportVectorMachine(Classifier):
class SupportVectorMachineClassifier(Classifier):
"""
Support vector machine.
@@ -151,18 +151,18 @@ def _get_kernel_name(self) -> str:
TypeError
If the kernel type is invalid.
"""
if isinstance(self.kernel, SupportVectorMachine.Kernel.Linear):
if isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Linear):
return "linear"
elif isinstance(self.kernel, SupportVectorMachine.Kernel.Polynomial):
elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Polynomial):
return "poly"
elif isinstance(self.kernel, SupportVectorMachine.Kernel.Sigmoid):
elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Sigmoid):
return "sigmoid"
elif isinstance(self.kernel, SupportVectorMachine.Kernel.RadialBasisFunction):
elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.RadialBasisFunction):
return "rbf"
else:
raise TypeError("Invalid kernel type.")

def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
def fit(self, training_set: TaggedTable) -> SupportVectorMachineClassifier:
"""
Create a copy of this classifier and fit it with the given training data.
@@ -175,7 +175,7 @@ def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
Returns
-------
fitted_classifier : SupportVectorMachine
fitted_classifier : SupportVectorMachineClassifier
The fitted classifier.
Raises
@@ -194,7 +194,7 @@ def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
wrapped_classifier = self._get_sklearn_classifier()
fit(wrapped_classifier, training_set)

result = SupportVectorMachine(c=self._c, kernel=self._kernel)
result = SupportVectorMachineClassifier(c=self._c, kernel=self._kernel)
result._wrapped_classifier = wrapped_classifier
result._feature_names = training_set.features.column_names
result._target_name = training_set.target.name
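
A usage sketch of the renamed SVM classifier. The nested `Kernel` classes are taken from the `isinstance` checks above; constructing them without arguments (e.g. `Kernel.Linear()`) and the value of `c` are assumptions, since neither appears in this diff:

```python
from safeds.ml.classical.classification import SupportVectorMachineClassifier

model = SupportVectorMachineClassifier(
    c=1.0,                                                  # regularization strength, keyword as in the constructor call above
    kernel=SupportVectorMachineClassifier.Kernel.Linear(),  # mapped to sklearn's "linear" kernel by _get_kernel_name
)
fitted_model = model.fit(tagged_train_table)                # tagged_train_table: a TaggedTable, as in the tutorials
```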