[ENH] skforecast integration for time series hyperparameter tuning #208

Omswastik-11 wants to merge 11 commits into hyperactive-project:main

Conversation
src/hyperactive/experiment/integrations/skforecast_forecasting.py
Hi @fkiraly !! I have added the changes and tested pre-commit on the changed files, kindly verify.
fkiraly left a comment
For the tests to run, you need to add skforecast to the python environment - I would add it to sktime-integration in pyproject.toml, that might be easiest.
Hi @fkiraly !! I have committed the changes as you suggested.
Hi,
Hi @JoaquinAmatRodrigo @fkiraly !!!
It needs to be triggered by one of the repository's maintainers.
Hi @JoaquinAmatRodrigo, @SimonBlanke, and @fkiraly, I'd appreciate your suggestions on how to proceed here. The issue is that

What I have done so far:
1. Fixes for the
fkiraly left a comment
The integration works, nice!
The remaining issues relate to dependency isolation, which you have to handle in two places:

- the `get_test_params` function, see above
- in the tags (I think you also need to add a few others)

(A sketch of the usual pattern follows after this list.)
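For readers following along, this is roughly the shape such dependency isolation usually takes in skbase-style packages. The import path of `BaseExperiment`, the tag names, and the constructor arguments below are assumptions for illustration, not the code in this PR.

```python
# Rough sketch of skbase-style dependency isolation; import paths, tag names,
# and constructor arguments are assumptions, not the actual PR code.
from hyperactive.base import BaseExperiment  # assumed import path


class SkforecastExperiment(BaseExperiment):

    _tags = {
        # lets the test framework skip this class when skforecast is not installed
        "python_dependencies": ["skforecast"],
    }

    @classmethod
    def get_test_params(cls, parameter_set="default"):
        """Return test parameters, importing skforecast only when called."""
        # lazy imports: collecting the tests must not require skforecast
        from sklearn.linear_model import Ridge
        from skforecast.recursive import ForecasterRecursive

        forecaster = ForecasterRecursive(regressor=Ridge(), lags=3)
        return [{"forecaster": forecaster, "steps": 3, "initial_train_size": 20}]
```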
Besides this, the "higher is better" property needs to be inferred from the metric and set as a tag in __init__.
@JoaquinAmatRodrigo, is there a programmatic way to do this?
We do not have a programmatic strategy for this. So far, all the regression metrics that skforecast allows to be passed as a string are intended to be minimised:

    "mean_squared_error": mean_squared_error,
    "mean_absolute_error": mean_absolute_error,
    "mean_absolute_percentage_error": mean_absolute_percentage_error,
    "mean_squared_log_error": mean_squared_log_error,
    "mean_absolute_scaled_error": mean_absolute_scaled_error,
    "root_mean_squared_scaled_error": root_mean_squared_scaled_error,
    "median_absolute_error": median_absolute_error,
    "symmetric_mean_absolute_percentage_error": symmetric_mean_absolute_percentage_error,

If the user passes a custom function as a metric, they need to indicate whether it is a maximisation or minimisation.
Hi @JoaquinAmatRodrigo !! Thanks for the clarification.
Looks like a great solution. You might want to double-check the keywords; usually libraries use 'maximize' or 'minimize'. Not sure which ones are used in this library.
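A tiny illustration of the keyword check suggested here; the accepted spellings and the parameter name are assumptions for illustration.

```python
def check_direction(direction: str) -> str:
    """Normalise a user-supplied optimisation direction to 'minimize'/'maximize'."""
    aliases = {"minimise": "minimize", "maximise": "maximize"}  # tolerate British spellings
    direction = aliases.get(direction.lower(), direction.lower())
    if direction not in {"minimize", "maximize"}:
        raise ValueError(
            f"direction must be 'minimize' or 'maximize', got {direction!r}"
        )
    return direction
```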
Thanks @JoaquinAmatRodrigo !! I checked. @fkiraly and @SimonBlanke, any suggestions?
fkiraly left a comment
Why are we making substantial changes to the CI and the depsets? Looks unnecessary. Did an AI suggest this?
Please revert the changes.
I think adding a catch-all all_integrations depset makes sense for testing, but not sure about the rest.
Hi @fkiraly !!! |
What you say in the comment you mention is not consistent with the actual changes in the

If you are using AI, please watch what it is doing.
Hi @fkiraly,
the rest is the same as what I wrote in the comments. As for the changes you mentioned, all tests are currently passing, by the way.
Hi @SimonBlanke !! Can you re-run the workflow? I have reverted the changes and added the change which solved the storage issue last time. Refer to this.
    sktime-integration = [
        "skpro",
        'sktime; python_version < "3.14"',
        'skforecast; python_version < "3.14"',
I am not sure if it makes sense to add it in here. I would rather leave it out.

Edit:

> I would add it to sktime-integration in pyproject.toml, that might be easiest.

@fkiraly Is it enough that the dependency is added to the general "integrations", or does it really belong with the sktime integrations?
SimonBlanke left a comment
Nice work so far. Some requested changes to resolve.
SimonBlanke left a comment
@Omswastik-11 Thanks for your work.
I would like to merge this soon. Just add a page in the docs to finish this: https://github.com/hyperactive-project/Hyperactive/tree/main/docs/source



Summary
This PR adds a full integration with skforecast, allowing Hyperactive to optimize hyperparameters of skforecast forecasting models using any of its optimization algorithms.
Implementation Details
SkforecastExperiment (skforecast_forecasting.py)
- Inherits from `BaseExperiment`.
- Uses `skforecast.model_selection.backtesting_forecaster` inside `_evaluate()` to perform time-series cross-validation for each parameter set (see the sketch after this list).
- Applies each candidate configuration with `set_params()` before every evaluation.
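For orientation, here is a minimal sketch of what such a backtesting call looks like, assuming skforecast >= 0.14 (the `ForecasterRecursive` and `TimeSeriesFold` names); the exact signature and return type of `backtesting_forecaster` differ between skforecast versions, so this is illustrative rather than the `_evaluate()` code in this PR.

```python
# Illustrative only: skforecast >= 0.14 names are assumed; the backtesting API
# and its return type have changed between skforecast versions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from skforecast.recursive import ForecasterRecursive
from skforecast.model_selection import TimeSeriesFold, backtesting_forecaster

y = pd.Series(np.random.default_rng(1).normal(size=200), name="y")

forecaster = ForecasterRecursive(
    regressor=RandomForestRegressor(random_state=0), lags=10
)
# skforecast forecasters forward set_params to the underlying regressor,
# so keys are regressor parameter names (dict-style call is an assumption)
forecaster.set_params({"n_estimators": 50})

cv = TimeSeriesFold(steps=12, initial_train_size=120, refit=False)
metric_df, predictions = backtesting_forecaster(
    forecaster=forecaster,
    y=y,
    cv=cv,
    metric="mean_absolute_error",
)
# recent versions return the metric as a one-row DataFrame, older ones a float
score = float(metric_df.iloc[0, 0]) if hasattr(metric_df, "iloc") else float(metric_df)
```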
SkforecastOptCV (skforecast_opt_cv.py)
- sklearn-style estimator (inherits from `BaseEstimator`).
- Works with `ForecasterRecursive` and other compatible skforecast forecasters.
- `fit()`: builds a `SkforecastExperiment` with user settings (`steps`, `initial_train_size`, `metric`, etc.).
- `predict()`: delegates to `best_forecaster_.predict()` for easy forecasting after optimization.

Configuration
- skforecast added to `pyproject.toml` under the `integrations` extra.

Verification
- Added `skforecast_example.py` showing a HillClimbing search with `ForecasterRecursive` + `RandomForestRegressor` (a hypothetical usage sketch follows below).
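To make the described interface concrete, a hypothetical usage sketch mirroring that example; the import paths, constructor arguments, and `predict()` signature are assumptions based on this description, not the confirmed API of the PR.

```python
# Hypothetical usage sketch of SkforecastOptCV as described in this PR;
# import paths, parameter names, and predict() signature are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from skforecast.recursive import ForecasterRecursive

from hyperactive.opt import HillClimbing                          # assumed path
from hyperactive.integrations.skforecast import SkforecastOptCV   # assumed path

y = pd.Series(np.random.default_rng(0).normal(size=200), name="y")

forecaster = ForecasterRecursive(
    regressor=RandomForestRegressor(random_state=0), lags=10
)

opt = SkforecastOptCV(
    forecaster=forecaster,
    param_grid={                      # regressor parameter names (assumed key style)
        "n_estimators": [50, 100, 200],
        "max_depth": [2, 4, 8],
    },
    optimizer=HillClimbing(),
    steps=12,                         # forecast horizon per backtesting fold
    initial_train_size=120,
    metric="mean_absolute_error",
)

opt.fit(y)                            # runs the Hyperactive search over param_grid
forecast = opt.predict()              # delegates to best_forecaster_.predict();
                                      # assumes the configured steps are reused
```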
Closes

Fixes #199