TY - JOUR
T1 - A sparse large margin semi-supervised learning method
AU - Choi, Hosik
AU - Kim, Jinseog
AU - Kim, Yongdai
PY - 2010/12
Y1 - 2010/12
AB - In this paper, we propose a sparse semi-supervised learning method that combines the large margin approach with an L1 constraint. The main contribution of the paper is an efficient optimization algorithm: the objective function minimized in large margin semi-supervised learning is non-convex and non-differentiable, so special optimization algorithms are required. For this purpose, we develop a hybrid of the CCCP and the gradient LASSO algorithms. The advantage of the proposed method over existing semi-supervised learning methods is that it identifies a small number of relevant input variables while keeping prediction accuracy high. In addition, the proposed algorithm is simple enough to be applied to various real problems without severe computational burden. To confirm these advantages, we compare the proposed method with a standard semi-supervised method on simulated as well as real data sets.
KW - CCCP
KW - Gradient LASSO
KW - Semi-supervised learning
KW - Support vector machines
UR - http://www.scopus.com/inward/record.url?scp=77957948033&partnerID=8YFLogxK
U2 - 10.1016/j.jkss.2009.10.005
DO - 10.1016/j.jkss.2009.10.005
M3 - Article
AN - SCOPUS:77957948033
SN - 1226-3192
VL - 39
SP - 479
EP - 487
JO - Journal of the Korean Statistical Society
JF - Journal of the Korean Statistical Society
IS - 4
ER -