Stack Overflow
1 vote
1 answer
80 views

I'm using Optuna to optimize LightGBM hyperparameters, and I'm running into an issue with the variability of best_iteration across different random seeds. Current setup: I train multiple models with ...
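
One way to make that number stable enough to act on, sketched below, is to average best_iteration over a few seeds inside the Optuna objective. The data, parameter ranges, and seed list here are placeholder assumptions, not the asker's setup:

    import lightgbm as lgb
    import numpy as np
    import optuna
    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split

    # Placeholder data standing in for the asker's dataset.
    X, y = make_regression(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    def objective(trial):
        params = {
            "objective": "regression",
            "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
            "num_leaves": trial.suggest_int("num_leaves", 15, 255),
            "verbosity": -1,
        }
        scores, best_iters = [], []
        for seed in (0, 1, 2):  # average the seed-to-seed variability away
            model = lgb.train(
                {**params, "seed": seed},
                lgb.Dataset(X_tr, y_tr),
                valid_sets=[lgb.Dataset(X_va, y_va)],
                num_boost_round=2000,
                callbacks=[lgb.early_stopping(50, verbose=False)],
            )
            scores.append(model.best_score["valid_0"]["l2"])
            best_iters.append(model.best_iteration)
        # Keep the averaged best_iteration on the trial for later inspection.
        trial.set_user_attr("mean_best_iteration", float(np.mean(best_iters)))
        return float(np.mean(scores))

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=20)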
0 votes
0 answers
44 views

I’m running Optuna to tune hyperparameters for a TabM regression model (10 trials) on Kaggle (GPU: Tesla P100) to minimize RMSE. The optimization runs fine — all trials complete — but right after ...
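
A minimal skeleton of the study described, with explicit GPU cleanup after every trial, which is one common guard when a run dies right after the last trial finishes. train_tabm_model is a hypothetical stand-in, since the actual TabM training code isn't shown:

    import gc

    import optuna
    import torch

    def train_tabm_model(params):
        # Hypothetical stand-in: build, train on the GPU, and score the TabM
        # regressor here; the dummy value just keeps the skeleton runnable.
        return params["lr"] * 100

    def objective(trial):
        params = {
            "lr": trial.suggest_float("lr", 1e-4, 1e-2, log=True),
            "weight_decay": trial.suggest_float("weight_decay", 1e-6, 1e-2, log=True),
        }
        try:
            return train_tabm_model(params)
        finally:
            gc.collect()              # drop references to the finished model
            torch.cuda.empty_cache()  # release cached GPU memory between trials

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=10)
    print(study.best_trial.params)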
0 votes
0 answers
108 views

I am using Optuna for hyperparameter tuning. I get messages as shown below: Trial 15 finished with value: 6.226334123011727 and parameters: {'iterations': 1100, 'learning_rate': 0.04262148853587423, '...
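
If the goal is to quiet those per-trial messages, Optuna's logging verbosity controls them; a minimal sketch with a toy objective:

    import optuna

    # Hide the INFO-level "Trial N finished with value ..." lines.
    optuna.logging.set_verbosity(optuna.logging.WARNING)

    def objective(trial):
        x = trial.suggest_float("x", -10, 10)
        return (x - 2) ** 2

    study = optuna.create_study()
    study.optimize(objective, n_trials=20)
    print(study.best_value, study.best_params)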
2 votes
1 answer
152 views

I want to undersample 3 cross-validation folds from a dataset, using, say, RandomUnderSampler from imblearn, and then optimize the hyperparameters of various GBMs using those undersampled folds as ...
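
A sketch of one way to wire this up, assuming LightGBM as the GBM and toy data: build the three undersampled folds once, outside the objective, so every trial scores against identical resampled splits:

    import numpy as np
    import optuna
    from imblearn.under_sampling import RandomUnderSampler
    from lightgbm import LGBMClassifier
    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import StratifiedKFold

    X, y = make_classification(n_samples=5000, weights=[0.9, 0.1], random_state=0)

    folds = []
    for tr_idx, va_idx in StratifiedKFold(3, shuffle=True, random_state=0).split(X, y):
        # Undersample only the training split; leave validation untouched.
        X_tr, y_tr = RandomUnderSampler(random_state=0).fit_resample(X[tr_idx], y[tr_idx])
        folds.append((X_tr, y_tr, X[va_idx], y[va_idx]))

    def objective(trial):
        params = {
            "n_estimators": trial.suggest_int("n_estimators", 100, 1000),
            "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        }
        aucs = []
        for X_tr, y_tr, X_va, y_va in folds:  # same fixed folds every trial
            model = LGBMClassifier(**params, random_state=0).fit(X_tr, y_tr)
            aucs.append(roc_auc_score(y_va, model.predict_proba(X_va)[:, 1]))
        return float(np.mean(aucs))

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=30)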
1 vote
0 answers
79 views

I used TPESampler and set it as follows while optimizing with optuna: sampler=optuna.samplers.TPESampler(multivariate=True, n_startup_trials=10, seed=None). But during the 10 startup trials, it ...
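
For reference, the startup trials are drawn by independent random sampling before the TPE model takes over; fixing seed makes that phase reproducible from run to run. A minimal sketch with a toy objective:

    import optuna

    # seed=42 makes the 10 random startup trials (and TPE afterwards) repeatable.
    sampler = optuna.samplers.TPESampler(multivariate=True, n_startup_trials=10, seed=42)

    def objective(trial):
        x = trial.suggest_float("x", -10, 10)
        y = trial.suggest_float("y", -10, 10)
        return x**2 + y**2

    study = optuna.create_study(sampler=sampler, direction="minimize")
    study.optimize(objective, n_trials=30)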
0 votes
0 answers
33 views

My impression is that every trial is run for one step. Then some trials are pruned and the remaining continue for another step and so on. However, the logs show: Trial 0 completed Trial 1 completed ...
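
That log pattern is expected with the default setup: each trial runs its own steps to completion (or until pruned) before the next trial starts, so trials are not interleaved step by step. A minimal pruning sketch with a stand-in training loop:

    import optuna

    def objective(trial):
        lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
        score = 1.0
        for step in range(100):
            score *= 1 - lr / 10            # stand-in for one training step
            trial.report(score, step)       # intermediate value for the pruner
            if trial.should_prune():        # pruner may stop this trial mid-run
                raise optuna.TrialPruned()
        return score

    study = optuna.create_study(pruner=optuna.pruners.MedianPruner(), direction="minimize")
    study.optimize(objective, n_trials=20)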
0 votes
1 answer
68 views

If I am using stratified 10-folds for classification/regression tasks, where do I need to define the logic for hyperparameter tuning using Scikit or Wandb? Should it be inside the loop or outside? I ...
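
One common answer, sketched below with scikit-learn and toy data, is nested cross-validation: the tuner, running its own inner CV, sits inside the outer stratified loop so the outer score stays unbiased:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV, StratifiedKFold

    X, y = make_classification(n_samples=1000, random_state=0)
    outer = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

    scores = []
    for tr_idx, te_idx in outer.split(X, y):
        search = GridSearchCV(                  # tuning happens inside the loop
            LogisticRegression(max_iter=1000),
            param_grid={"C": [0.01, 0.1, 1, 10]},
            cv=3,                               # inner CV on the outer-train split
        )
        search.fit(X[tr_idx], y[tr_idx])
        scores.append(search.score(X[te_idx], y[te_idx]))  # untouched outer test

    print(f"nested-CV accuracy: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")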
0 votes
1 answer
65 views

If you use cosine decay, for example, and you have a starting learning rate and a final learning rate, can you tune those hyperparameters so that the final learning rate is some ratio of the starting learning ...
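
In Keras at least, CosineDecay exposes that ratio directly: its alpha argument is the floor of the schedule as a fraction of the initial rate, so tuning the ratio means tuning alpha. A minimal sketch (the values are placeholders):

    import tensorflow as tf

    initial_lr = 3e-3    # tuned hyperparameter (placeholder value)
    final_ratio = 0.05   # tuned hyperparameter: final_lr / initial_lr

    schedule = tf.keras.optimizers.schedules.CosineDecay(
        initial_learning_rate=initial_lr,
        decay_steps=10_000,
        alpha=final_ratio,  # the schedule bottoms out at alpha * initial_lr
    )
    optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)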
2 votes
1 answer
74 views

I am trying to use keras-tuner to tune hyperparameters, like !pip install keras-tuner --upgrade import keras_tuner as kt from tensorflow.keras.models import Sequential from tensorflow.keras.layers ...
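
A minimal keras-tuner sketch along those lines, with MNIST standing in for the real data and the search ranges chosen arbitrarily:

    import keras_tuner as kt
    from tensorflow import keras
    from tensorflow.keras.layers import Dense, Flatten
    from tensorflow.keras.models import Sequential

    def build_model(hp):
        model = Sequential([
            keras.Input(shape=(28, 28)),
            Flatten(),
            Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
            Dense(10, activation="softmax"),
        ])
        model.compile(
            optimizer=keras.optimizers.Adam(hp.Float("lr", 1e-4, 1e-2, sampling="log")),
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"],
        )
        return model

    tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)
    (x_tr, y_tr), _ = keras.datasets.mnist.load_data()
    tuner.search(x_tr / 255.0, y_tr, epochs=2, validation_split=0.2)
    best_model = tuner.get_best_models(num_models=1)[0]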
0 votes
1 answer
80 views

I'm trying to forecast a time series using the Prophet model in Python, for which I would like to find the optimal tuning parameters (like changepoint_range, changepoint_prior_scale, ...
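
A sketch of the grid-search recipe from Prophet's documentation, scoring each combination with Prophet's own cross_validation; the CSV path and the window sizes are placeholders:

    import itertools

    import pandas as pd
    from prophet import Prophet
    from prophet.diagnostics import cross_validation, performance_metrics

    df = pd.read_csv("series.csv")  # placeholder: needs Prophet's ds and y columns

    param_grid = {
        "changepoint_prior_scale": [0.001, 0.05, 0.5],
        "changepoint_range": [0.8, 0.9],
    }
    results = []
    for cps, cr in itertools.product(*param_grid.values()):
        m = Prophet(changepoint_prior_scale=cps, changepoint_range=cr).fit(df)
        # Rolling-origin evaluation; tune the windows to your series length.
        df_cv = cross_validation(m, initial="730 days", period="180 days", horizon="90 days")
        results.append({"cps": cps, "cr": cr, "rmse": performance_metrics(df_cv)["rmse"].mean()})

    print(pd.DataFrame(results).sort_values("rmse").head(1))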
0 votes
0 answers
30 views

I use mlflow and hyperopt for tuning a model, and I am trying to figure out hyperopt's sampling methods. I have used lines of code directly from the documentation, as such: my code: space = {"...
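
A sketch of the documentation-style hyperopt setup with each evaluation logged to MLflow as a nested run; the space and the toy loss are illustrative, not the asker's:

    import mlflow
    from hyperopt import STATUS_OK, Trials, fmin, hp, tpe

    space = {
        "lr": hp.loguniform("lr", -7, -1),          # samples exp(uniform(-7, -1))
        "n_layers": hp.choice("n_layers", [1, 2, 3]),
    }

    def objective(params):
        with mlflow.start_run(nested=True):          # one child run per evaluation
            mlflow.log_params(params)
            loss = (params["lr"] - 0.01) ** 2 + params["n_layers"] * 0.001  # toy loss
            mlflow.log_metric("loss", loss)
        return {"loss": loss, "status": STATUS_OK}

    with mlflow.start_run():                         # parent run for the whole search
        best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=Trials())
    print(best)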
-2 votes
1 answer
128 views

I am trying to tune some hyperparameters for my neural network for an image segmentation problem. I set up the tuner as simply as it can be, but when I run my code I get the following error: 2025-02-...
0 votes
0 answers
144 views

I am trying to tune my model with Ray Tune for PyTorch. I would really like to be able to save the tuning progress, stop the execution, and resume it from where I left off. Unfortunately, I am ...
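
A sketch of the Tuner save-and-restore pattern in Ray 2.x; note that the import path for RunConfig has moved between Ray releases, so the exact names may need adjusting for your version:

    import os

    from ray import tune
    from ray.train import RunConfig  # older releases: from ray.air import RunConfig

    def objective(config):
        # Stand-in trainable; results land under the named experiment directory.
        return {"score": config["lr"] ** 2}

    # First session: give the experiment a fixed name and storage location.
    tuner = tune.Tuner(
        objective,
        param_space={"lr": tune.loguniform(1e-4, 1e-1)},
        run_config=RunConfig(name="my_experiment",
                             storage_path=os.path.expanduser("~/ray_results")),
    )
    tuner.fit()

    # Later session, after stopping: restore from the same path and continue.
    restored = tune.Tuner.restore(
        os.path.expanduser("~/ray_results/my_experiment"), trainable=objective
    )
    restored.fit()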
0 votes
1 answer
79 views

I'm working on training a model that predicts which cache way to evict based on cache features, access information, etc. However, I have millions and millions of data samples. Thus, I cannot ...
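
Assuming the truncated complaint is that every tuning trial can't afford the full dataset, one common pattern, sketched below with placeholder data, is to tune on a stratified subsample and refit the winning configuration once on everything:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.model_selection import RandomizedSearchCV, train_test_split

    # Placeholder for the real millions-of-samples cache dataset.
    X, y = np.random.rand(100_000, 16), np.random.randint(0, 8, size=100_000)

    # Tune on a manageable stratified subsample.
    X_sub, _, y_sub, _ = train_test_split(X, y, train_size=50_000, stratify=y, random_state=0)
    search = RandomizedSearchCV(
        HistGradientBoostingClassifier(),
        {"learning_rate": [0.01, 0.05, 0.1], "max_depth": [None, 4, 8]},
        n_iter=6, cv=3, random_state=0,
    )
    search.fit(X_sub, y_sub)

    # Refit the winning configuration once on the full dataset.
    final = HistGradientBoostingClassifier(**search.best_params_).fit(X, y)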
1 vote
0 answers
92 views

I have a dataframe named hyperparam_df which looks like the following: repo_name file_name 0 DeepCoMP deepcomp/util/simulation.py ...
