
Details

CONVERGENCE OF ONLINE PAIRWISE REGRESSION LEARNING WITH QUADRATIC LOSS  (indexed in SCI-EXPANDED)   Cited by: 4

Document type: Journal article

Title: CONVERGENCE OF ONLINE PAIRWISE REGRESSION LEARNING WITH QUADRATIC LOSS

Authors: Wang, Shuhua[1,2]; Chen, Zhenlong[1]; Sheng, Baohuai[2]

Affiliations: [1] Zhejiang Gongshang Univ, Sch Stat & Math, Hangzhou 310018, Peoples R China; [2] Shaoxing Univ, Dept Appl Stat, Shaoxing 312000, Peoples R China

Year: 2020

Volume: 19

Issue: 8

Starting page: 4023

Journal: COMMUNICATIONS ON PURE AND APPLIED ANALYSIS

Indexed in: SCI-EXPANDED (Accession No.: WOS:000536146500007); Scopus (Accession No.: 2-s2.0-85090779317); WOS

Funding: This work is supported by the National Natural Science Foundation of China under Grant Nos. 61877039 and 11971432, and by the Humanities and Social Sciences Research Project of the Ministry of Education (18YJA910001).

Language: English

Keywords: Online pairwise learning; quadratic loss; learning rate; pairwise reproducing kernel Hilbert space; convex analysis

Abstract: Recent investigations into the error analysis of kernel regularized pairwise learning have initiated theoretical research on pairwise reproducing kernel Hilbert spaces (PRKHSs). In the present paper, we provide a method of constructing PRKHSs with classical Jacobi orthogonal polynomials. The performance of kernel regularized online pairwise regression learning algorithms based on a quadratic loss function is investigated. Applying convex analysis and Rademacher complexity techniques, explicit bounds for the generalization error are provided. It is shown that the convergence rate can be greatly improved by adjusting the scale parameters in the loss function.
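For readers unfamiliar with the setting of the abstract, the sketch below illustrates one common form of kernel regularized online pairwise regression with a quadratic loss: the hypothesis is kept as a kernel expansion over previously seen pairs and updated by a stochastic-gradient step on the pairwise squared error plus a norm penalty. It is a minimal illustration only, not the authors' algorithm: the pairwise kernel here is a Gaussian kernel on concatenated pairs (the paper instead constructs PRKHSs from Jacobi orthogonal polynomials), and the function names and hyperparameters eta, lam, sigma are illustrative placeholders.

```python
import numpy as np

def gaussian_pair_kernel(p, q, sigma=1.0):
    """Toy pairwise kernel K((x, x'), (u, u')): a Gaussian kernel applied to
    the concatenated pair (assumption; stands in for a PRKHS kernel)."""
    d = np.concatenate(p) - np.concatenate(q)
    return np.exp(-np.dot(d, d) / (2 * sigma ** 2))

def online_pairwise_regression(X, y, eta=0.1, lam=0.01, sigma=1.0):
    """Sketch of kernel regularized online pairwise regression with quadratic loss.

    At step t the new example (x_t, y_t) is paired with one earlier example
    (x_j, y_j); the current hypothesis f_t, stored as a kernel expansion over
    seen pairs, takes a gradient step on
        (f(x_t, x_j) - (y_t - y_j))^2 + lam * ||f||_K^2 .
    """
    pairs, coefs = [], []          # support pairs and their expansion coefficients

    def f(p):                      # evaluate the current hypothesis at a pair p
        return sum(c * gaussian_pair_kernel(q, p, sigma) for c, q in zip(coefs, pairs))

    for t in range(1, len(X)):
        j = np.random.randint(t)                   # pick one earlier example
        p = (X[t], X[j])
        residual = f(p) - (y[t] - y[j])            # quadratic-loss gradient factor
        coefs = [(1 - eta * lam) * c for c in coefs]  # shrinkage from the regularizer
        pairs.append(p)
        coefs.append(-2 * eta * residual)          # gradient step adds a new kernel term
    return pairs, coefs, f

# Illustrative usage on synthetic data: the learned f approximates y_t - y_j.
X = np.random.rand(60, 2)
y = X @ np.array([1.0, -0.5])
pairs, coefs, f = online_pairwise_regression(X, y, eta=0.2)
print(f((X[0], X[1])), y[0] - y[1])
```

In this simplified scheme the step size eta and regularization weight lam play the role of the scale/step parameters whose tuning, as the abstract notes, governs the attainable convergence rate.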

