Gradient-based adaptation of general gaussian kernels
Research output: Contribution to journal › Journal article › Research › peer-review
Standard
Gradient-based adaptation of general gaussian kernels. / Glasmachers, Tobias; Igel, Christian.
In: Neural Computation, Vol. 17, No. 10, 2005, p. 2099-2105.
RIS
TY - JOUR
T1 - Gradient-based adaptation of general gaussian kernels
AU - Glasmachers, Tobias
AU - Igel, Christian
PY - 2005
Y1 - 2005
N2 - Gradient-based optimization of gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is useful, for example, to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.
AB - Gradient-based optimization of gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is useful, for example, to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.
U2 - 10.1162/0899766054615635
DO - 10.1162/0899766054615635
M3 - Journal article
C2 - 16105219
VL - 17
SP - 2099
EP - 2105
JO - Neural Computation
JF - Neural Computation
SN - 0899-7667
IS - 10
ER -
ID: 32645794
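
For readers who want to experiment with the idea summarized in the abstract, below is a minimal Python sketch (not the authors' code). It assumes the general Gaussian kernel form k(x, z) = exp(-(x - z)^T M (x - z)) with M parameterized through the matrix exponential M = expm(A) of a symmetric matrix A, and it projects the gradient onto the constant-trace subspace so that tr(A), and hence det(M), stays fixed during adaptation. For brevity, the objective is the kernel-target alignment and the gradient is taken by finite differences; the paper instead derives analytic gradients and minimizes a radius-margin measure for hard margin support vector machines. All function names are hypothetical; NumPy and SciPy are assumed.

import numpy as np
from scipy.linalg import expm

def kernel_matrix(X, A):
    # Gram matrix of the general Gaussian kernel with M = expm(A) (assumed form).
    M = expm(A)
    diffs = X[:, None, :] - X[None, :, :]                  # shape (n, n, d)
    quad = np.einsum('ijk,kl,ijl->ij', diffs, M, diffs)    # (x - z)^T M (x - z)
    return np.exp(-quad)

def alignment(K, y):
    # Kernel-target alignment: a simple stand-in objective, not the
    # radius-margin measure used in the paper.
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def finite_diff_gradient(X, y, A, eps=1e-5):
    # Central finite differences along a basis of symmetric directions,
    # replacing the analytic gradient derived in the paper.
    d = A.shape[0]
    G = np.zeros_like(A)
    for i in range(d):
        for j in range(i, d):
            E = np.zeros_like(A)
            E[i, j] = E[j, i] = 1.0
            f_plus = alignment(kernel_matrix(X, A + eps * E), y)
            f_minus = alignment(kernel_matrix(X, A - eps * E), y)
            G[i, j] = G[j, i] = (f_plus - f_minus) / (2 * eps)
    return G

def project_constant_trace(G):
    # Remove the trace component of the update direction so that tr(A),
    # and therefore det(expm(A)), is preserved: the "constant trace subspace".
    d = G.shape[0]
    return G - (np.trace(G) / d) * np.eye(d)

# Toy usage: two Gaussian blobs, gradient ascent on the alignment
# while the kernel size (det M) is held fixed.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(20, 2)),
               rng.normal(1.0, 1.0, size=(20, 2))])
y = np.array([-1] * 20 + [1] * 20)

A = np.zeros((2, 2))            # start at M = expm(0) = identity
for _ in range(50):
    G = project_constant_trace(finite_diff_gradient(X, y, A))
    A += 0.1 * G                # tr(A) remains 0 after every step
print("adapted kernel matrix M =\n", expm(A))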