**Describe the bug**
Calling `optimize_hparams()` on `MambularClassifier` (and similarly on `MambularRegressor`) raises:
```
TypeError: mambular.models.utils.sklearn_parent.SklearnBase.fit()
got multiple values for keyword argument 'regression'
```
The issue occurs because:
- `SklearnBase.optimize_hparams()` calls `self.fit(..., regression=...)`.
- `SklearnBaseClassifier.fit()` and `SklearnBaseRegressor.fit()` do not declare a `regression` parameter in their signatures.
- They hardcode `regression=False`/`regression=True` when calling `super().fit(...)` and also forward `**trainer_kwargs`.
This results in `regression` being passed twice to `SklearnBase.fit()`, causing the `TypeError`. This makes `optimize_hparams()` unusable for both classifier and regressor models.
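The mechanism can be reproduced in isolation. This is a minimal standalone sketch (not the actual Mambular code) of how the duplicate keyword arises when a subclass hardcodes a keyword and also forwards `**kwargs` that already contain it:

```python
class Base:
    """Stands in for SklearnBase."""
    def fit(self, X, y, regression=True):
        return regression


class Classifier(Base):
    """Stands in for SklearnBaseClassifier."""
    def fit(self, X, y, **trainer_kwargs):
        # 'regression' is hardcoded here, but the caller may also pass it
        # inside trainer_kwargs -> it reaches Base.fit() twice.
        return super().fit(X, y, regression=False, **trainer_kwargs)


def optimize_hparams(model):
    # Mirrors SklearnBase.optimize_hparams() passing regression explicitly.
    return model.fit([[0.0]], [0], regression=False)


try:
    optimize_hparams(Classifier())
except TypeError as e:
    print(type(e).__name__)  # prints "TypeError"
```

Here Python expands `**trainer_kwargs` into `regression=False` a second time, which is exactly the "got multiple values for keyword argument" failure.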
**To Reproduce**
```python
from mambular.models import MambularClassifier

model = MambularClassifier(
    d_model=64,
    n_layers=4,
)

model.optimize_hparams(
    x_train_preprocessed,
    y_train,
)
```
Error:
```
TypeError: mambular.models.utils.sklearn_parent.SklearnBase.fit()
got multiple values for keyword argument 'regression'
```
The same happens with `MambularRegressor`.
**Expected behavior**
- `optimize_hparams()` should run without raising a `TypeError`.
- `regression` should be passed to `SklearnBase.fit()` only once.
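One possible fix, sketched here on the same stand-in classes rather than the actual Mambular source (the real patch may instead change `optimize_hparams()` or the `fit()` signatures), is to drop any incoming `regression` key before hardcoding it, so the keyword reaches the base class exactly once:

```python
class Base:
    """Stands in for SklearnBase."""
    def fit(self, X, y, regression=True):
        return regression


class Classifier(Base):
    """Stands in for SklearnBaseClassifier."""
    def fit(self, X, y, **trainer_kwargs):
        # Hypothetical fix: discard a caller-supplied 'regression' before
        # hardcoding it, so Base.fit() receives the keyword only once.
        trainer_kwargs.pop("regression", None)
        return super().fit(X, y, regression=False, **trainer_kwargs)


model = Classifier()
print(model.fit([[0.0]], [0], regression=False))  # prints False, no TypeError
```

Alternatively, `fit()` could declare `regression` explicitly in its signature; either way the duplication is removed.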
**Desktop (please complete the following information):**
- OS: Ubuntu (local machine)
- Python version: 3.11
- Torch version: 2.5.1+cu121
- Mambular version: 1.5.0
**Additional context**
The problem appears to be a mismatch between `SklearnBase.optimize_hparams()` and the `fit()` signatures in `SklearnBaseClassifier` and `SklearnBaseRegressor`, leading to duplicated keyword arguments.