I have a use case where I want to write a custom training loop with the models defined in the library. I checked the Implementing New Supervised Architectures guide, but I don't see how to apply it to my case. I also want to pass some extra data to the model at training time, similar to the approach in Learning to Reweight Examples for Robust Deep Learning.
Hey there, @yahia3200! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together!
To implement a custom training loop in pytorch_tabular, you can use the low-level API provided by the `TabularModel`. This allows you to access and modify the training process. Here's a general approach:
1. **Access the Training Loop**: Use the `prepare_dataloader`, `prepare_model`, and `train` methods of the `TabularModel` to set up and customize the training loop. This gives you control over data preparation, model initialization, and the training process itself [1] (see the sketch after this list).
2. **Pass Extra Data**: While the documentation does not explicitly cover passing additional data during training, you can modify the data preparation step to include extra data. This might involve customizing the `TabularDatamodule` to handle additional inputs [2].
3. **Custom Training Logic**: You can integrate custom logic, such as reweighting examples, by modifying the training loop. This might involve adjusting the loss computation or data sampling strategy to incorporate the additional data or reweighting logic (a sketch follows the closing paragraph below).
Unfortunately, there are no specific examples or tutorials in the documentation that demonstrate implementing a custom training loop or passing additional data in this manner [1] [2] . However, leveraging the flexibility of the low-level API and PyTorch Lightning's features should allow you to achieve your goals.
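For the reweighting part, a hypothetical sketch of a fully manual loop is below. It is not an official API: it assumes the `datamodule` and `model` from the previous sketch, that batches come as dicts with `continuous`, `categorical`, and `target` keys, and that the model's forward returns a dict with a `logits` entry (true for current pytorch_tabular models, but verify for your version). `compute_sample_weights` is a placeholder hook where your extra data or the Learning to Reweight scheme would plug in:

```python
import torch
import torch.nn.functional as F

# Placeholder for a reweighting scheme (e.g. Learning to Reweight Examples);
# here every example simply gets weight 1.0.
def compute_sample_weights(batch):
    return torch.ones(batch["target"].shape[0])

# `model` and `datamodule` come from prepare_model / prepare_dataloader above.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
model.train()

for epoch in range(10):
    for batch in datamodule.train_dataloader():
        optimizer.zero_grad()
        output = model(batch)                 # models consume the whole batch dict
        logits = output["logits"]
        target = batch["target"].squeeze(-1).long()
        # Per-sample losses so each example can be weighted individually.
        losses = F.cross_entropy(logits, target, reduction="none")
        weights = compute_sample_weights(batch)  # hook for extra data / weights
        loss = (weights * losses).mean()
        loss.backward()
        optimizer.step()
```

Because the per-sample losses are kept unreduced (`reduction="none"`), the weighting step can depend on any side information you carry along with the batch, which is the crux of the reweighting use case.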