Detailed information
Document type: Journal article
Title: CONVERGENCE OF ONLINE PAIRWISE REGRESSION LEARNING WITH QUADRATIC LOSS
Authors: Wang, Shuhua[1,2]; Chen, Zhenlong[1]; Sheng, Baohuai[2]
Affiliations: [1] Zhejiang Gongshang Univ, Sch Stat & Math, Hangzhou 310018, Peoples R China; [2] Shaoxing Univ, Dept Appl Stat, Shaoxing 312000, Peoples R China
Year: 2020
Volume: 19
Issue: 8
Pages: 4023
Journal: COMMUNICATIONS ON PURE AND APPLIED ANALYSIS
Indexed by: SCI-EXPANDED (Accession No. WOS:000536146500007); Scopus (Accession No. 2-s2.0-85090779317); WOS
Funding: This work is supported by the National Natural Science Foundation of China under Grant Nos. 61877039 and 11971432, and by the Humanities and Social Sciences Research Project of the Ministry of Education (18YJA910001).
Language: English
Keywords: Online pairwise learning; quadratic loss; learning rate; pairwise reproducing kernel Hilbert space; convex analysis
Abstract: Recent investigations on the error analysis of kernel regularized pairwise learning have initiated theoretical research on pairwise reproducing kernel Hilbert spaces (PRKHSs). In the present paper, we provide a method of constructing PRKHSs with classical Jacobi orthogonal polynomials. The performance of kernel regularized online pairwise regression learning algorithms based on a quadratic loss function is investigated. Applying convex analysis and Rademacher complexity techniques, explicit bounds for the generalization error are provided. It is shown that the convergence rate can be greatly improved by adjusting the scale parameters in the loss function.
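Illustration: as a rough picture of the algorithmic setting described in the abstract, the following Python sketch runs a kernel-based online pairwise regression update with quadratic loss. It is not the authors' algorithm or analysis; the Gaussian product kernel on pairs, the decaying step-size and regularization schedules, and the toy data are all assumptions made only for demonstration.

import numpy as np

def pairwise_kernel(u, u2, v, v2, sigma=1.0):
    # Assumed product-type kernel on input pairs: K((u,u2),(v,v2)) = k(u,v) * k(u2,v2),
    # with k a Gaussian kernel; the paper's PRKHS construction is different (Jacobi polynomials).
    k1 = np.exp(-np.sum((u - v) ** 2) / (2 * sigma ** 2))
    k2 = np.exp(-np.sum((u2 - v2) ** 2) / (2 * sigma ** 2))
    return k1 * k2

class OnlinePairwiseRegressor:
    def __init__(self, lam=0.05, sigma=0.5):
        self.lam = lam        # regularization parameter (assumed value)
        self.sigma = sigma    # kernel width (assumed value)
        self.support = []     # pairs (x, x') carrying the kernel expansion
        self.alpha = []       # expansion coefficients

    def predict(self, x, x2):
        # f_t(x, x') as a kernel expansion over previously stored pairs.
        return sum(a * pairwise_kernel(u, u2, x, x2, self.sigma)
                   for a, (u, u2) in zip(self.alpha, self.support))

    def update(self, x_t, y_t, history):
        # One online step: pair the new sample (x_t, y_t) with all previous samples,
        # take an averaged gradient step on the quadratic loss (f(x_t, x_j) - (y_t - y_j))^2,
        # and shrink old coefficients for the regularization term.
        if not history:
            return
        t = len(history) + 1
        eta = 0.5 / np.sqrt(t)                       # assumed polynomially decaying step size
        self.alpha = [(1 - eta * self.lam) * a for a in self.alpha]
        grads = [self.predict(x_t, x_j) - (y_t - y_j) for (x_j, y_j) in history]
        for (x_j, _), g in zip(history, grads):
            self.support.append((x_t, x_j))
            self.alpha.append(-eta * g / len(history))

# Toy usage on synthetic 1-D data (assumed target: differences of a noisy sine curve).
rng = np.random.default_rng(0)
model, history = OnlinePairwiseRegressor(), []
for _ in range(50):
    x = rng.uniform(-1.0, 1.0, size=1)
    y = float(np.sin(3.0 * x[0]) + 0.1 * rng.normal())
    model.update(x, y, history)
    history.append((x, y))
print("f(x, x') at x=0.5, x'=-0.5:", model.predict(np.array([0.5]), np.array([-0.5])))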