
I am working on setting up regression models for prediction in psychometrics and ran into challenges with cross validation. Essentially, I would like cross-validated linear regression models that use as few terms as possible. So far, I have implemented model selection based on best subset regression (leaps), and the selected models are afterwards cross validated via a self-implemented repeated k-fold CV. Unfortunately this requires a lot of computational power, and I am not sure it is the best way to go.
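To make the current approach concrete, here is a minimal sketch of the best-subset step with `leaps::regsubsets`. The data frame, variable names, and `nvmax` value are placeholder assumptions (simulated data), not the actual psychometric data:

```r
# Sketch of best-subset selection with leaps, on simulated placeholder data
library(leaps)

set.seed(1)
dat <- data.frame(matrix(rnorm(100 * 6), nrow = 100))
names(dat) <- c("y", paste0("x", 1:5))

# Exhaustive best-subset search up to 5 predictors
best <- regsubsets(y ~ ., data = dat, nvmax = 5)

# Logical matrix: which predictors enter the best model of each size
summary(best)$which
```

Each row of `summary(best)$which` would then be refit and cross validated separately, which is where the computational cost comes from.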

I would like to switch to glmnet with ridge regression and cross validation, and afterwards use the selected model for further computation with `lm`. The idea is:

  1. Determine model with cv.glmnet (Ridge regression)
  2. Take the coefficients from the glmnet model with min lambda.
  3. Feed the coefficients into lm and continue working with the lm model.
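Steps 1 and 2 above can be sketched with glmnet as follows. The simulated `x` and `y` are placeholder assumptions standing in for the real data; `alpha = 0` selects ridge regression in `cv.glmnet`:

```r
# Sketch of steps 1-2: cross-validated ridge regression with glmnet
library(glmnet)

set.seed(1)
x <- matrix(rnorm(100 * 5), nrow = 100)          # 100 obs, 5 predictors (simulated)
y <- x %*% c(2, -1, 0, 0, 0.5) + rnorm(100)      # placeholder response

# Step 1: k-fold cross validation over the ridge penalty path (alpha = 0)
cv_fit <- cv.glmnet(x, y, alpha = 0, nfolds = 10)

# Step 2: coefficients at the lambda minimizing CV error
coef(cv_fit, s = "lambda.min")
```

Note that ridge shrinks coefficients toward zero but does not set them exactly to zero, so on its own it will not reduce the number of terms; `alpha = 1` (lasso) would if term reduction is the goal.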

So my question: is there a way to fix terms and weights in a linear model (the `lm` function in R)?

asked Sep 11, 2022 at 9:00

1 Answer


One way to fix a weight is to subtract the fixed term from the dependent variable and perform your regression on the new dependent variable. You would essentially be doing regression on the residuals.
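A minimal sketch of this idea, with simulated placeholder data and a hypothetical fixed coefficient `b1_fixed` (e.g. taken from the glmnet fit). The second variant uses `lm`'s `offset` argument, which is equivalent to subtracting the fixed term by hand:

```r
# Sketch: hold the coefficient of x1 at a fixed value and fit the rest with lm
set.seed(1)
x1 <- rnorm(100)
x2 <- rnorm(100)
y  <- 1 + 2 * x1 + 0.5 * x2 + rnorm(100)   # simulated placeholder data

b1_fixed <- 2                   # hypothetical coefficient from the glmnet model

# Variant 1: subtract the fixed term from the response, regress on the rest
y_adj <- y - b1_fixed * x1
fit1  <- lm(y_adj ~ x2)

# Variant 2: same fit via lm's offset argument
fit2 <- lm(y ~ x2, offset = b1_fixed * x1)
```

Both fits estimate the intercept and the `x2` coefficient while the effect of `x1` is held at `b1_fixed`.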

answered Feb 1, 2023 at 21:57
