Details
The performance of semi-supervised Laplacian regularized regression with the least square loss (indexed in SCI-EXPANDED and EI) · Cited by: 7
Document type: Journal article
English title: The performance of semi-supervised Laplacian regularized regression with the least square loss
Authors: Sheng, Baohuai[1]; Xiang, Daohong[2]
Affiliations: [1] Shaoxing Univ, Dept Math, Shaoxing 312000, Zhejiang, Peoples R China; [2] Zhejiang Normal Univ, Dept Math, Jinhua 312004, Peoples R China
Year: 2017
Volume: 15
Issue: 2
Journal: INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING
Indexed in: SCI-EXPANDED (accession no. WOS:000395378300008), EI (accession no. 20170403278615), Scopus (accession no. 2-s2.0-85009967912), WOS
Funding: This work is supported by the National Natural Science Foundation of China under Grant No. 11471292. The authors thank the reviewers for their valuable suggestions and comments, which helped revise the paper into a more readable form.
Language: English
Keywords: Semi-supervised LapRLS; convex analysis; Gateaux derivative; Gauss kernel; learning rate
Abstract: The capacity convergence rate of a kernel regularized semi-supervised Laplacian learning algorithm is bounded using a convex-analysis approach. The algorithm is a graph-based regression whose structure shares features of both kernel regularized regression and kernel regularized Laplacian ranking. It is shown that the kernel reproducing the hypothesis space contributes to the clustering ability of the algorithm. If the scale parameters in the Gaussian weights are chosen properly, then the learning rate can be controlled by the unlabeled samples, and the algorithm converges as the number of unlabeled samples increases. These results show that, with a suitably chosen structure, the semi-supervised learning approach can not only improve the learning rate but also complete the learning process by increasing the number of unlabeled samples.
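The abstract describes a graph-based regression combining a least-square data term, a kernel-norm penalty, and a graph-Laplacian penalty built from Gaussian weights. A minimal sketch of this family of algorithms, using the standard closed-form LapRLS solution of Belkin, Niyogi, and Sindhwani rather than the exact formulation analyzed in this paper (all parameter names and values below are illustrative assumptions):

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gauss kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def laprls_fit(X_lab, y_lab, X_unlab, sigma=1.0, gamma_A=1e-2, gamma_I=1e-2, t=1.0):
    """Closed-form semi-supervised LapRLS (illustrative sketch).

    Minimizes (1/l) * sum_i (f(x_i) - y_i)^2 + gamma_A * ||f||_K^2
              + (gamma_I / n^2) * f^T L f
    over f in the RKHS of the Gauss kernel; `t` is the scale parameter
    of the Gaussian weights defining the graph Laplacian L.
    """
    X = np.vstack([X_lab, X_unlab])          # labeled + unlabeled points
    l, n = len(X_lab), len(X_lab) + len(X_unlab)
    K = gaussian_kernel(X, X, sigma)         # kernel on all n samples
    W = gaussian_kernel(X, X, t)             # Gaussian graph weights
    L = np.diag(W.sum(1)) - W                # unnormalized graph Laplacian
    J = np.diag(np.r_[np.ones(l), np.zeros(n - l)])  # selects labeled rows
    Y = np.r_[y_lab, np.zeros(n - l)]
    # Representer theorem: f = sum_j alpha_j K(., x_j) over all n points.
    alpha = np.linalg.solve(
        J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n ** 2) * (L @ K), Y
    )
    return X, alpha

def laprls_predict(X_train, alpha, X_test, sigma=1.0):
    return gaussian_kernel(X_test, X_train, sigma) @ alpha
```

Note how the unlabeled samples enter only through the Laplacian term: increasing their number refines the graph without adding labels, which is the mechanism the abstract's convergence statement is about.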