Author
Argyriou, A
Hauser, R
Micchelli, C
Pontil, M
Journal title
Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh, PA, 2006
DOI
10.1145/1143844.1143850
Issue
ACM 2006
Volume
148
Last updated
2024-04-06T23:22:16.713+01:00
Page
41-48
Abstract
We address the problem of learning a kernel for a given supervised learning task. Our approach consists in searching within the convex hull of a prescribed set of basic kernels for one which minimizes a convex regularization functional. A unique feature of this approach compared to others in the literature is that the number of basic kernels can be infinite. We only require that they are continuously parameterized. For example, the basic kernels could be isotropic Gaussians with variance in a prescribed interval or even Gaussians parameterized by multiple continuous parameters. Our work builds upon a formulation involving a minimax optimization problem and a recently proposed greedy algorithm for learning the kernel. Although this optimization problem is not convex, it belongs to the larger class of DC (difference of convex functions) programs. Therefore, we apply recent results from DC optimization theory to create a new algorithm for learning the kernel. Our experimental results on benchmark data sets show that this algorithm outperforms a previously proposed method.
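The abstract's search space (a convex hull of isotropic Gaussian kernels with variance in a prescribed interval) can be illustrated with a minimal sketch. This is not the paper's algorithm; it only shows, under assumed NumPy conventions, what one candidate kernel in that convex hull looks like when the continuously parameterized family is subsampled at finitely many bandwidths:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Isotropic Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def convex_combination_kernel(X, Y, sigmas, weights):
    # A kernel in the convex hull of basic Gaussians: sum_i w_i * K_{sigma_i},
    # with w_i >= 0 and sum_i w_i = 1. Here `sigmas` is a finite subsample of
    # the continuously parameterized family described in the abstract; the
    # paper's algorithm would optimize over the (infinite) family itself.
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    return sum(w * gaussian_kernel(X, Y, s) for w, s in zip(weights, sigmas))

X = np.random.RandomState(0).randn(5, 2)
K = convex_combination_kernel(X, X, sigmas=[0.5, 1.0, 2.0],
                              weights=[0.2, 0.3, 0.5])
```

Any such convex combination of valid kernels is itself a valid (symmetric, positive semidefinite) kernel, which is what makes the convex-hull search space well posed.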
Symplectic ID
319876
Download URL
https://www.maths.ox.ac.uk/people/raphael.hauser
Publication type
Chapter
ISBN-10
1-59593-383-2
Created on 29 Mar 2012 - 14:27.