Details
Document type: Journal article
Title: Performance analysis of the LapRSSLG algorithm in learning theory
Authors: Sheng, Baohuai[1]; Zhang, Haizhang[2,3]
Affiliations: [1]Shaoxing Univ, Dept Math, Shaoxing 312000, Peoples R China; [2]Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Peoples R China; [3]Sun Yat Sen Univ, Guangdong Prov Key Lab Computat Sci, Guangzhou 510006, Peoples R China
Year: 2020
Volume: 18
Issue: 1
Pages: 79
Journal: ANALYSIS AND APPLICATIONS
Indexed in: SCI-EXPANDED (Accession No. WOS:000506339000004); Scopus (Accession No. 2-s2.0-85077984767); WOS
Funding: The first author was supported in part by NSFC under grant 61877039 and the Zhejiang Natural Science Foundation (No. LY20A010013). The second author was supported in part by NSFC under grants 11571377 and 11971490 and by the Guangdong Natural Science Foundation (No. 2018A030313841).
Language: English
Keywords: Learning rate; reproducing kernel Hilbert spaces; semi-supervised gradient learning; convex analysis
Abstract: One aim of semi-supervised learning is to improve prediction performance by combining a few labeled data with a large set of unlabeled data. Recently, a Laplacian regularized semi-supervised learning gradient (LapRSSLG) algorithm, associated with data adjacency graph edge weights, was proposed in the literature. The algorithm has been successful in applications, but there has been no theoretical performance analysis. In this paper, an explicit learning rate estimate for the algorithm is provided, which shows that the convergence is indeed controlled by the unlabeled data.
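The Laplacian regularization idea behind the algorithm can be illustrated with a minimal sketch. This is a generic closed-form Laplacian-regularized least squares in an RKHS (LapRLS-style), not the authors' LapRSSLG gradient iteration; the RBF kernel, the choice of adjacency edge weights, and the regularization parameters below are illustrative assumptions only:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix between row-sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def laprls_fit(X, y_labeled, n_labeled, gamma_A=1e-2, gamma_I=1e-2, sigma=1.0):
    """Closed-form Laplacian-regularized least squares (illustrative sketch).

    X contains all points with the labeled ones first; y_labeled holds the
    labels for the first n_labeled rows. The unlabeled points enter only
    through the graph Laplacian penalty, which enforces smoothness of the
    learned function along the data adjacency graph.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)           # RKHS kernel matrix
    W = rbf_kernel(X, X, sigma)           # adjacency edge weights (illustrative choice)
    L = np.diag(W.sum(axis=1)) - W        # unnormalized graph Laplacian
    J = np.zeros((n, n))
    J[:n_labeled, :n_labeled] = np.eye(n_labeled)   # selects labeled rows
    Y = np.zeros(n)
    Y[:n_labeled] = y_labeled
    l = n_labeled
    # Minimizing (1/l)||J(K a - Y)||^2 + gamma_A ||f||_K^2
    #            + (gamma_I / n^2) f^T L f leads to this linear system:
    A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n ** 2) * (L @ K)
    alpha = np.linalg.solve(A, Y)
    return lambda Xnew: rbf_kernel(Xnew, X, sigma) @ alpha
```

With two well-separated clusters and a single labeled point in each, the Laplacian term propagates the two labels to the unlabeled points in the respective clusters, which mirrors the abstract's point that the behavior of the estimator is shaped by the unlabeled data.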