A Kronecker product B
Description
The A_x_B() function returns the Kronecker product of A and B.
Usage
A_x_B(A,B)
Arguments
A
A matrix.
B
A matrix.
Value
A_x_B(A, B)
returns the matrix A\otimes B, the Kronecker product of A and B.
Examples
A=matrix(rep(1,6),3,2)
B=matrix(seq(1,8),2,4)
A_x_B(A,B)
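The same product is available in base R as kronecker() (also spelled as the %x% operator), which A_x_B() should agree with; a quick cross-check using only base R:

```r
# Cross-check of the Kronecker product using base R's kronecker()
# (also available as the %x% operator):
A <- matrix(rep(1, 6), 3, 2)
B <- matrix(seq(1, 8), 2, 4)
K <- kronecker(A, B)          # the matrix A \otimes B, here 6 x 8
identical(K, A %x% B)         # TRUE: %x% is an alias for kronecker()
```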
Calculating the components of the vector DELTA
Description
The DELTA() function gives the components of the vector \boldsymbol{\Delta}. See Regui et al. (2024) for the periodic simple regression model.
\mathbf{\Delta}=
\left[\begin{array}{c}
\mathbf{\Delta}_1 \\
\mathbf{\Delta}_2\\
\mathbf{\Delta}_3
\end{array}\right]\ , where \mathbf{\Delta}_1 is a vector of dimension S with component
\frac{n^{\frac{-1}{2}}}{\widehat{\sigma}_s}\sum\limits_{\underset{ }{r=0}}^{m-1}\widehat{\phi}(Z_{s+Sr}), \mathbf{\Delta}_2 is a vector of dimension pS with component \frac{n^{\frac{-1}{2}}}{\widehat{\sigma}_{s}}\sum\limits_{\underset{ }{r=0}}^{m-1}\widehat{\phi}(Z_{s+Sr})K_{s}^{(n)}\mathbf{X}_{s+Sr},
\mathbf{\Delta}_3 is a vector of dimension S with component \frac{n^{\frac{-1}{2}}}{2\widehat{\sigma}_{s}^{2}}\sum\limits_{\underset{ }{r=0}}^{m-1}\left(Z_{s+Sr}\widehat{\phi}(Z_{s+Sr})-1\right).
Usage
DELTA(x,phi,s,e,sigma)
Arguments
x
A list of independent variables with dimension p.
phi
The score function estimate \widehat{\phi}; see the phi_n function.
s
A period of the regression model.
e
The residuals vector.
sigma
The vector of estimated standard deviations \widehat{\sigma}_1,...,\widehat{\sigma}_S.
Value
DELTA()
returns the vector \mathbf{\Delta}. See Regui et al. (2024) for the simple periodic coefficients regression model.
References
Regui, S., Akharif, A., & Mellouk, A. (2024). "Locally optimal tests against periodic linear regression in short panels." Communications in Statistics-Simulation and Computation, 1–15. doi:10.1080/03610918.2024.2314662
Calculating the components of the matrix GAMMA
Description
The GAMMA() function gives the components of the matrix \boldsymbol{\Gamma}. See Regui et al. (2024) for the periodic simple regression model.
\mathbf{\Gamma}=\frac{1}{S}
\left[\begin{array}{ccc}
\left(\mathbf{\Gamma}_{11}\right)_{S \times S }&\mathbf{0} & \mathbf{\Gamma}_{13} \\
\mathbf{0} &\left(\mathbf{\Gamma}_{22} \right)_{pS\times pS } &\mathbf{0} \\
\mathbf{\Gamma}_{13} & \mathbf{0}& \left(\mathbf{\Gamma}_{33} \right)_{S\times S}
\end{array}\right]\ , where \mathbf{\Gamma}_{11}=\widehat{I}_{n}\text{diag}(\frac{1}{\widehat{\sigma}_{1}^{2}},...,\frac{1}{\widehat{\sigma}_{S}^{2}} ), \mathbf{\Gamma}_{13}=\frac{\widehat{N}_{n}}{2}\text{diag}(\frac{1}{\widehat{\sigma}_{1}^{3}},...,\frac{1}{\widehat{\sigma}_{S}^{3}} ),
\mathbf{\Gamma}_{22}=\widehat{I}_{n}\text{diag}(\frac{1}{\widehat{\sigma}_{1}^{2}},...,\frac{1}{\widehat{\sigma}_{S}^{2}} ) \otimes \mathbf{I}_{p},
\mathbf{\Gamma}_{33}=\frac{\widehat{J}_{n}}{4}\text{diag}(\frac{1}{\widehat{\sigma}_{1}^{4}},...,\frac{1}{\widehat{\sigma}_{S}^{4}} ), \widehat{I}_n=\frac{1}{nT}\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\widehat{\phi}^{2}\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right), \widehat{N}_n=\frac{1}{nT}\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{ }{r=0}}^{m-1}\widehat{\phi}^{2}\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right)\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}, \widehat{J}_n=\frac{1}{nT}\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\widehat{\phi}^{2}\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right)\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right)^{2}-1, and
\widehat{\phi}(x)=\frac{1}{b^2_n}\frac{\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\left(x-Z_{s+Sr}\right)\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) }{\sum\limits_{\underset{}{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) } \text{ with } b_n\rightarrow 0.
Usage
GAMMA(x,phi,s,z,sigma)
Arguments
x
A list of independent variables with dimension p.
phi
The score function estimate \widehat{\phi}; see the phi_n function.
s
A period of the regression model.
z
The residuals vector.
sigma
The vector of estimated standard deviations \widehat{\sigma}_1,...,\widehat{\sigma}_S.
Value
GAMMA()
returns the matrix \mathbf{\Gamma}. See Regui et al. (2024) for the simple periodic coefficients regression model.
References
Regui, S., Akharif, A., & Mellouk, A. (2024). "Locally optimal tests against periodic linear regression in short panels." Communications in Statistics-Simulation and Computation, 1–15. doi:10.1080/03610918.2024.2314662
Least squares estimator for periodic coefficients regression model
Description
The LSE_Reg_per() function computes the least squares estimates of the parameters of a periodic coefficients regression model.
Usage
LSE_Reg_per(x,y,s)
Arguments
x
A list of independent variables with dimension p.
y
A response variable.
s
A period of the regression model.
Value
beta
The estimated parameters.
X
Matrix of predictors.
Y
The response vector.
Examples
set.seed(6)
n=400
s=4
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
LSE_Reg_per(x,y,s)
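For intuition, the same periodic-coefficients least squares fit can be sketched in base R by interacting a seasonal factor with each predictor. This is an illustrative analogue, not the package's implementation, and the coefficient ordering may differ from LSE_Reg_per:

```r
# Illustrative base-R analogue of a periodic-coefficients fit: a seasonal
# factor gives season-specific intercepts, and its interaction with each
# predictor gives season-specific slopes (ordering may differ from LSE_Reg_per).
set.seed(6)
n <- 400; S <- 4
x1 <- rnorm(n, 0, 1.5)
x2 <- rnorm(n, 0, 0.9)
y  <- rnorm(n, 0, 2.5)
season <- factor(((seq_len(n) - 1) %% S) + 1)   # 1,2,...,S,1,2,...
fit <- lm(y ~ 0 + season + season:x1 + season:x2)
length(coef(fit))   # S intercepts + 2 predictors * S slopes = 12
```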
Checking the periodicity of parameters in the regression model
Description
The check_periodicity() function detects periodicity of the parameters in the regression model using pseudo_gaussian_test. See Regui et al. (2024) for the periodic simple regression model.
T^{(n)}=\left(\mathbf{\Delta}_{1}^{\circ(n)'},\mathbf{\Delta}_{2}^{\circ(n)'},\mathbf{\Delta}_{3}^{\circ(n)'} \right) \left(\begin{array}{ccc}
\mathbf{\Gamma}^{\circ}_{11} & \mathbf{\Gamma}^{\circ}_{12} & \mathbf{0} \\
\mathbf{\Gamma}^{\circ}_{12} &\mathbf{\Gamma}^{\circ}_{22} & \mathbf{0} \\
\mathbf{0} &\mathbf{0} & \mathbf{\Gamma}^{\circ}_{33}
\end{array} \right)^{-1} \left(\begin{array}{c}
\mathbf{\Delta}_{1}^{\circ(n)} \\
\mathbf{\Delta}_{2}^{\circ(n)}\\
\mathbf{\Delta}_{3}^{\circ(n)}
\end{array} \right),
where
\boldsymbol{\Delta}_{1}^{\circ(n)}= n^{\frac{-1}{2}} \sum\limits_{\underset{ }{r=0}}^{m-1} \left(\begin{array}{c}
\widehat{\phi}(Z_{1+Sr})-\widehat{\phi}(Z_{S+Sr})
\\
\vdots\\
\widehat{\phi}(Z_{S-1+Sr})-\widehat{\phi}(Z_{S+Sr})
\end{array} \right),
\mathbf{\Delta}_{2}^{\circ(n)}= \frac{n^{\frac{-1}{2}}}{2\widehat{\sigma} }\sum\limits_{\underset{ }{r=0}}^{m-1} \left(\begin{array}{c}
\widehat{\psi}(Z_{1+Sr})- \widehat{\psi}(Z_{S+Sr}) \\
\vdots\\
\widehat{\psi}(Z_{S-1+Sr})- \widehat{\psi}(Z_{S+Sr}) \\
\end{array}\right),
\mathbf{\Delta}_{3}^{\circ(n)}=n^{\frac{-1}{2}} \sum\limits_{\underset{ }{r=0}}^{m-1} \left(
\begin{array}{c}
\widehat{\phi}(Z_{1+Sr}) \mathbf{K}_1^{(n)}\mathbf{X}_{1+Sr}- \widehat{\phi}(Z_{S+Sr}) \mathbf{K}_S^{(n)}\mathbf{X}_{S+Sr}\\ \vdots\\
\widehat{\phi}(Z_{S-1+Sr})\mathbf{K}_{S-1}^{(n)}\mathbf{X}_{S-1+Sr}- \widehat{\phi}(Z_{S+Sr})\mathbf{K}_S^{(n)}\mathbf{X}_{S+Sr}
\end{array} \right),
\mathbf{\Gamma}^{\circ} _{11}=\frac{\widehat{I}_n }{S} \Sigma , \mathbf{\Gamma}^{\circ} _{22}=\dfrac{\widehat{I}_n}{4S\widehat{\sigma}^2}
\Sigma, \mathbf{\Gamma}^{\circ} _{12}=\frac{ \widehat{N}_n }{2S\widehat{\sigma}} \Sigma, and
\mathbf{\Gamma}^{\circ} _{33}=\frac{\widehat{I}_n }{S} \Sigma \otimes \mathbf{I}_{p\times p}
with
\widehat{I}_n=\frac{1}{nT}\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\widehat{\phi}^{2}\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right), \widehat{N}_n=\frac{1}{nT}\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{ }{r=0}}^{m-1}\widehat{\phi}^{2}\left(\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s}\right)\frac{\widehat{Z}_{s+Sr}}{\widehat{\sigma}_s},
\Sigma=\left[\begin{array}{cccc}
2 & 1& \ldots&1 \\
1&\ddots & \ddots& \vdots\\
\vdots& \ddots &\ddots & 1 \\
1&\ldots &1 & 2
\end{array}\right]\ ,
Z_{s+Sr}=\frac{y_{s+Sr}-\widehat{\mu}_s-\sum\limits_{\underset{}{j=1}}^{p}\widehat{\beta}^j_{s}x^j_{s+Sr}}{\widehat{\sigma}_s}, \mathbf{ X}_{s+Sr}=\left(x^1_{s+Sr},...,x^p_{s+Sr} \right)^{'}, \mathbf{K}^{(n)}_{s}=\left[\begin{array}{ccc}
\overline{(x^1_{s})^2 } & &\overline{x^i_{s}x^j_{s} }\\
&\ddots & \\
\overline{x^j_{s}x^i_{s} } & &\overline{(x^p_{s})^2 }
\end{array}\right]^{\frac{-1}{2} } ,
\overline{x^i_{s}x^j_{s} } =\frac{1}{m}\sum\limits_{\underset{ }{r=0}}^{m-1}{x^i_{s+Sr}x^j_{s+Sr}}, \overline{(x^i_{s})^2 } =\frac{1}{m}\sum\limits_{\underset{ }{r=0}}^{m-1}{(x^i_{s+Sr})^2 }, \widehat{\psi}(x)=x\widehat{\phi}(x)-1, and
\widehat{\phi}(x)=\frac{1}{b^2_n}\frac{\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\left(x-Z_{s+Sr}\right)\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) }{\sum\limits_{\underset{}{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) } with b_n\rightarrow 0.
Usage
check_periodicity(x,y,s)
Arguments
x
A list of independent variables with dimension p.
y
A response variable.
s
A period of the regression model.
Value
check_periodicity()
returns the value of the observed statistic T^{(n)}, the degrees of freedom (S-1)\times(p+2), and the p-value.
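Assuming the usual chi-squared limit under the null (consistent with the degrees of freedom reported above), the p-value can be reproduced from the statistic with base R; Tn below is a hypothetical observed value:

```r
# Chi-squared p-value for the periodicity test statistic, assuming the
# asymptotic null distribution; Tn is a hypothetical observed statistic.
S <- 4; p <- 4
Tn <- 10.3
df <- (S - 1) * (p + 2)                     # (S-1)(p+2) = 18
pval <- pchisq(Tn, df, lower.tail = FALSE)  # upper-tail p-value
```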
References
Regui, S., Akharif, A., & Mellouk, A. (2024). "Locally optimal tests against periodic linear regression in short panels." Communications in Statistics-Simulation and Computation, 1–15. doi:10.1080/03610918.2024.2314662
Examples
library(expm)
set.seed(6)
n=400
s=4
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
check_periodicity(x,y,s)
Adaptive estimator for periodic coefficients regression model
Description
The estimate_para_adaptive_method() function computes the adaptive estimates of the parameters of a periodic coefficients regression model.
Usage
estimate_para_adaptive_method(n,s,y,x)
Arguments
n
The length of vector y.
s
A period of the regression model.
y
A response variable.
x
A list of independent variables with dimension p.
Value
beta_ad
The estimated parameters.
Examples
set.seed(6)
n=400
s=4
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
estimate_para_adaptive_method(n,s,y,x)
Fitting periodic coefficients regression model by using LSE
Description
The lm_per() function computes the least squares estimates of the parameters (intercept \mu_s, slope \boldsymbol{\beta}_s, and standard deviation \sigma_s) of a periodic coefficients regression model, using the LSE_Reg_per and sd_estimation_for_each_s functions.
\widehat{\boldsymbol{\vartheta}}=\left(X^{'}X\right)^{-1}X^{'} Y where X=
\left[\begin{array}{ccccccccccc}
&\mathbf{X}^1_{1}&0&\ldots & 0& &\mathbf{X}^p_{1}&0&\ldots & 0 \\
& 0&\mathbf{X}^1_{2} &\ldots &0 & &0&\mathbf{X}^p_{2} &\ldots &0\\
\textbf{I}_{S}\otimes \mathbf{1}_{m} &0&0& \ddots&\vdots&\ldots&0& 0&\ddots&\vdots \\
& 0 &0&0 &\mathbf{X}^1_{S}& &0 &0&0 &\mathbf{X}^p_{S}
\end{array}\right]\ ,
\mathbf{X}^j_{s}=\left(x^j_{s},...,x^j_{s+(m-1)S}\right)^{'},
Y=(\mathbf{Y}_1^{'},...,\mathbf{Y}_S^{'})^{'}, \mathbf{Y}_{s} =(y_{s},...,y_{(m-1)S+s})^{'},
\mathbf{\epsilon}=(\mathbf{\epsilon}_{1}^{'},...,\mathbf{\epsilon}_{S}^{'})^{'},
\mathbf{\epsilon}_{s} =(\varepsilon_{s},...,\varepsilon_{(m-1)S+s})^{'}, \mathbf{1}_{m} is a vector of ones of dimension m, \textbf{I}_{S} is the identity matrix of dimension S, \otimes denotes the Kronecker product, and \boldsymbol{\vartheta} =\left(\boldsymbol{\mu}^{'} ,{\boldsymbol{\beta}}^{'}\right)^{'} with \boldsymbol{\mu}=(\mu_1,...,\mu_S)^{'} and \boldsymbol{\beta}=(\beta^1_{1},...,\beta^1_{S};...;\beta^p_{1},...,\beta^p_{S})^{'}.
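The block structure of X above can be sketched in base R for small m, S, and p, with observations stacked by season as in the Y layout above (hypothetical data; only the block structure matters here):

```r
# Illustrative construction of the stacked design matrix X:
# first S columns are I_S \otimes 1_m (seasonal intercepts), then one
# S-column block per predictor holding X^j_s on its season's rows.
m <- 5; S <- 3; p <- 2
n <- m * S
xs <- matrix(rnorm(n * p), n, p)                    # predictors in time order
intercepts <- kronecker(diag(S), matrix(1, m, 1))   # I_S \otimes 1_m : n x S
slope_blocks <- lapply(seq_len(p), function(j) {
  blk <- matrix(0, n, S)
  for (s in seq_len(S)) {
    rows <- (s - 1) * m + seq_len(m)          # rows of season s in the stack
    blk[rows, s] <- xs[seq(s, n, by = S), j]  # X^j_s = (x^j_s,...,x^j_{s+(m-1)S})
  }
  blk
})
X <- cbind(intercepts, do.call(cbind, slope_blocks))
dim(X)   # n rows, S*(p+1) columns
```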
Usage
lm_per(x,y,s)
Arguments
x
A list of independent variables with dimension p.
y
A response variable.
s
A period of the regression model.
Value
Residuals
The residuals, that is, the response minus the fitted values.
Coefficients
A named vector of coefficients.
Root mean square error
The root mean square error
Examples
set.seed(6)
n=400
s=4
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
lm_per(x,y,s)
Fitting periodic coefficients regression model by using Adaptive estimation method
Description
The lm_per_AE() function computes the adaptive estimates of the parameters (intercept \mu_s, slope \boldsymbol{\beta}_s, and standard deviation \sigma_s) of a periodic coefficients regression model: \widehat{\boldsymbol{\vartheta}}_{AE} =\widehat{\boldsymbol{\vartheta}}_{LSE} +\frac{1}{\sqrt{n}}\mathbf{\Gamma}^{-1}\mathbf{\Delta}.
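The one-step update above is a single linear solve; a toy base-R sketch with hypothetical \mathbf{\Gamma}, \mathbf{\Delta}, and LSE values:

```r
# Toy sketch of the one-step adaptive update
# theta_AE = theta_LSE + n^{-1/2} * Gamma^{-1} Delta;
# Gamma, Delta, and theta_lse are hypothetical values.
n <- 100
theta_lse <- c(1, 2)
Gamma <- matrix(c(2, 0.5, 0.5, 1), 2, 2)   # invertible information-type matrix
Delta <- c(0.3, -0.1)
# solve(Gamma, Delta) computes Gamma^{-1} %*% Delta without forming the inverse
theta_ae <- theta_lse + (1 / sqrt(n)) * solve(Gamma, Delta)
theta_ae   # approximately c(1.02, 1.98)
```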
Usage
lm_per_AE(x,y,s)
Arguments
x
A list of independent variables with dimension p.
y
A response variable.
s
A period of the regression model.
Value
Residuals
The residuals, that is, the response minus the fitted values.
Coefficients
A named vector of coefficients.
Root mean square error
The root mean square error
Examples
set.seed(6)
n=200
s=2
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
lm_per_AE(x,y,s)
Calculating the value of the \phi function
Description
The phi_n() function gives the value of \widehat{\phi}(x)=\frac{1}{b^2_n}\frac{\sum\limits_{\underset{ }{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\left(x-Z_{s+Sr}\right)\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) }{\sum\limits_{\underset{}{s=1}}^{S}\sum\limits_{\underset{}{r=0}}^{m-1}\exp\left(-\frac{\left(x-Z_{s+Sr} \right)^2}{2b_n^2}\right) } with b_n=0.002.
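A minimal vectorized base-R sketch of this estimator; phi_hat, z, and bn are illustrative names, not the package's internals:

```r
# Vectorized sketch of the Gaussian-kernel score estimator above.
phi_hat <- function(x, z, bn = 0.002) {
  w <- exp(-(x - z)^2 / (2 * bn^2))    # Gaussian kernel weights
  sum((x - z) * w) / (bn^2 * sum(w))   # ratio form of phi_hat(x)
}
z <- c(-0.5, 0, 0.5)
phi_hat(0, z)   # weights concentrate on z = 0, so the value is near 0
```

With such a small bandwidth the weights are sharply peaked, so the estimate at a point is driven almost entirely by the nearest residuals.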
Usage
phi_n(x)
Arguments
x
A numeric value.
Value
returns the value of \widehat{\phi}(x).
Detecting periodicity of parameters in the regression model
Description
The pseudo_gaussian_test() function gives the value of the test statistic T^{(n)} for detecting periodicity of the parameters in the regression model. See the check_periodicity function.
Usage
pseudo_gaussian_test(x,z,s)
Arguments
x
A list of independent variables with dimension p.
z
The residuals vector.
s
A period of the regression model.
Value
returns the value of the test statistic T^{(n)}.
Estimating periodic variances in a periodic coefficients regression model
Description
The sd_estimation_for_each_s() function estimates the variances \widehat{\sigma}_s^2=\frac{1}{m-p-1}\sum\limits_{\underset{ }{r=0}}^{m-1}\widehat{\varepsilon}^2_{s+Sr} for all s=1,...,S, in a periodic coefficients regression model.
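A base-R sketch of this formula, grouping a hypothetical residual vector by season with tapply():

```r
# Sketch of the seasonal variance estimate: group squared residuals by
# season and scale by 1/(m - p - 1); eps is a hypothetical residual vector.
set.seed(1)
S <- 4; p <- 2
eps <- rnorm(400)                          # residuals in time order
m <- length(eps) / S                       # observations per season
season <- ((seq_along(eps) - 1) %% S) + 1  # season index of each observation
sigma2 <- tapply(eps^2, season, sum) / (m - p - 1)
length(sigma2)   # one estimated variance per season
```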
Usage
sd_estimation_for_each_s(x,y,s,beta_hat)
Arguments
x
A list of independent variables with dimension p.
y
A response variable.
s
A period of the regression model.
beta_hat
The least squares estimation using LSE_Reg_per.
Value
returns the value of \widehat{\sigma}_s^2.
Examples
set.seed(6)
n=400
s=4
x1=rnorm(n,0,1.5)
x2=rnorm(n,0,0.9)
x3=rnorm(n,0,2)
x4=rnorm(n,0,1.9)
y=rnorm(n,0,2.5)
x=list(x1,x2,x3,x4)
beta_hat=LSE_Reg_per(x,y,s)$beta
sd_estimation_for_each_s(x,y,s,beta_hat)