Commit b0b901c ("update")
1 parent 97f6ebd
File tree: 2 files changed (+4 −4 lines)

Bermudan5F.ipynb
Lines changed: 1 addition & 1 deletion

@@ -4553,7 +4553,7 @@
      "id": "current-building",
      "metadata": {},
      "source": [
-      "The two principal components explain 99.51% of the variance of the input data, and it certainly looks like the three remaining dimensions may be safely truncated. In particular, the 5th component explains less that half a basis point of the total variance. Let us then reduce dimension to two:"
+      "The two principal components explain 99.61% of the variance of the input data, and it certainly looks like the three remaining dimensions may be safely truncated. In particular, the 5th component explains less that half a basis point of the total variance. Let us then reduce dimension to two:"
      ]
     },
     {
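The notebook paragraph above argues that two principal components capture roughly 99.6% of the variance, so the remaining dimensions can be safely truncated. A minimal sketch of that argument on synthetic data (the data, sizes, and noise level here are assumptions for illustration, not from the notebook):

```python
import numpy as np

# Synthetic 5-dimensional data dominated by two latent factors (assumed setup)
rng = np.random.default_rng(0)
factors = rng.standard_normal((1000, 2))
loadings = rng.standard_normal((2, 5))
X = factors @ loadings + 0.01 * rng.standard_normal((1000, 5))

Xc = X - X.mean(axis=0)                             # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)                     # sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)              # eigh returns ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending
ratio = eigvals / eigvals.sum()                     # explained-variance ratios

print(np.cumsum(ratio))   # first two components dominate; the tail is negligible
X2 = Xc @ eigvecs[:, :2]  # reduce dimension to two
print(X2.shape)
```

The cumulative ratio after two components is the quantity the notebook quotes; the last components contribute only a fraction of a basis point, which justifies the truncation.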

DifferentialRegression.ipynb
Lines changed: 3 additions & 3 deletions

@@ -69,18 +69,18 @@
      "Given the availability of differential labels $Z,ドル <b> differential regression </b> minimizes a combination of value and derivatives errors:\n",
      " \n",
      "$$\n",
-      " \\min \\left\\{ MSE + \\sum_{j=1}^n \\alpha_j E\\left\\{\\left[Z_j-\\beta \\cdot \\phi_j\\left(X\\right)\\right]^2\\right\\} \\right\\}\n",
+      " \\min \\left\\{ MSE + \\sum_{j=1}^n \\alpha_j E\\left\\{\\left[Z_j-\\beta \\cdot \\delta \\phi_j\\left(X\\right)\\right]^2\\right\\} \\right\\}\n",
      "$$\n",
      "\n",
-      "where $\\phi_j\\left(X\\right) = \\left[\\frac{\\partial \\phi_1\\left(X\\right)}{\\partial X_j}, ..., \\frac{\\partial \\phi_K\\left(X\\right)}{\\partial X_j}\\right] \\in \\mathbb{R}^K$ is the vector of partial derivatives of the basis functions wrt the j-th input $X_j,ドル and $Z_j$ is the j-th differential label.\n",
+      "where $\\delta \\phi_j\\left(X\\right) = \\left[\\frac{\\partial \\phi_1\\left(X\\right)}{\\partial X_j}, ..., \\frac{\\partial \\phi_K\\left(X\\right)}{\\partial X_j}\\right] \\in \\mathbb{R}^K$ is the vector of partial derivatives of the basis functions wrt the j-th input $X_j,ドル and $Z_j$ is the j-th differential label.\n",
      "\n",
      "Zeroing the gradient of the differential objective wrt weights $\\beta,ドル we obtain the differential normal equation:\n",
      "\n",
      "$$\n",
      " \\beta = \\left( C_{\\phi\\phi} + \\sum_{j=1}^n \\alpha_j C_{jj}^\\phi \\right)^{-1} \\left( C_{\\phi y} + \\sum_{j=1}^n \\alpha_j C_{j}^{\\phi z} \\right)\n",
      "$$\n",
      "\n",
-      "where $C_{jj}^\\phi = E\\left[\\phi_j\\left(X\\right) \\phi_j\\left(X\\right)^T\\right] \\in \\mathbb{R}^{K \\times K}$ and $C_{j}^{\\phi z} = E\\left[\\phi_j\\left(X\\right) z_j \\right] \\in \\mathbb{R}^K$.\n",
+      "where $C_{jj}^\\phi = E\\left[\\delta \\phi_j\\left(X\\right) \\delta \\phi_j\\left(X\\right)^T\\right] \\in \\mathbb{R}^{K \\times K}$ and $C_{j}^{\\phi z} = E\\left[\\delta \\phi_j\\left(X\\right) z_j \\right] \\in \\mathbb{R}^K$.\n",
      "\n",
      "Similarly to ridge regression, the hyperparameters $\\alpha_j$ control the relative importance of derivatives correctness in the minimization objective. Contrarily to ridge regression, however, differential regularization does not introduce bias. It follows that it has little risk of underfitting. A reasonable default is given by:\n",
      "\n",
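The differential normal equation quoted in the diff above can be sketched numerically. A minimal illustration under assumptions not in the commit: a scalar input $x,ドル a monomial basis $\phi(x) = [1, x, x^2, x^3],ドル value labels $y = \sin x,ドル and differential labels $z = \cos x$ (with a single input, the sums over $j$ collapse to one term):

```python
import numpy as np

# Assumed setup: scalar input, monomial basis, analytic derivative labels
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=256)
y = np.sin(x)   # value labels
z = np.cos(x)   # differential labels dy/dx

# Basis values phi(x) and their derivatives delta phi(x) wrt the input
phi = np.stack([np.ones_like(x), x, x**2, x**3], axis=1)
dphi = np.stack([np.zeros_like(x), np.ones_like(x), 2 * x, 3 * x**2], axis=1)

alpha = 1.0     # weight on derivative errors (hyperparameter)
n = len(x)
C_pp = phi.T @ phi / n     # C_{phi phi}
C_py = phi.T @ y / n       # C_{phi y}
C_dd = dphi.T @ dphi / n   # C_{jj}^phi (one input, so one term in the sum)
C_dz = dphi.T @ z / n      # C_j^{phi z}

# Differential normal equation: solve the regularized linear system for beta
beta = np.linalg.solve(C_pp + alpha * C_dd, C_py + alpha * C_dz)

pred = phi @ beta                    # fitted values
print(np.max(np.abs(pred - y)))      # the cubic fit tracks sin(x) closely
```

Because the derivative penalty is itself a quadratic in $\beta,ドル it adds a positive semi-definite term to the system rather than shrinking the weights toward zero, which is why, as the notebook notes, it regularizes without introducing bias.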
