
Record Details

LEARNING RATES FOR THE KERNEL REGULARIZED REGRESSION WITH A DIFFERENTIABLE STRONGLY CONVEX LOSS  (indexed in SCI-EXPANDED)   Cited by: 7

Document type: Journal article

Title (English): LEARNING RATES FOR THE KERNEL REGULARIZED REGRESSION WITH A DIFFERENTIABLE STRONGLY CONVEX LOSS

Authors: Sheng, Baohuai[1]; Liu, Huanxiang[1]; Wang, Huimin[1]

Affiliation: [1] Shaoxing Univ, Dept Appl Stat, Shaoxing 312000, Peoples R China

Year: 2020

Volume: 19

Issue: 8

Starting page: 3973

Journal: COMMUNICATIONS ON PURE AND APPLIED ANALYSIS

Indexed in: SCI-EXPANDED (accession no. WOS:000536146500005); Scopus (accession no. 2-s2.0-85090783270); WOS

Funding: This work is supported by the National Natural Science Foundation of China under Grants No. 61877039 and No. 11501375, and by the Natural Science Foundation of Zhejiang Province under Grant No. LQ14A010005.

Language: English

Keywords: Kernel regularized regression; learning rate; differentiable strongly convex loss; conjugate loss; K-functional; maximum mean discrepancy (MMD); Hutchinson metric; reproducing kernel Hilbert space

Abstract: We consider learning rates of kernel regularized regression (KRR) based on reproducing kernel Hilbert spaces (RKHSs) and differentiable strongly convex losses, and provide some new strongly convex losses. We first show robustness with respect to the maximum mean discrepancy (MMD) and the Hutchinson metric, respectively, and along this line bound the learning rate of the KRR. We then provide a capacity-dependent learning rate and give learning rates for four concrete strongly convex losses. In particular, we provide learning rates when the hypothesis RKHS's logarithmic complexity exponent is arbitrarily small as well as sufficiently large.
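For orientation, the kernel regularized regression scheme the abstract refers to is usually written as the following minimization over the hypothesis RKHS; the notation below (sample z, loss V, regularization parameter λ) follows the standard formulation in the learning-theory literature and is an assumption, not a quotation from the paper:

\[
f_{\mathbf{z},\lambda} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
\Big\{ \tfrac{1}{m} \sum_{i=1}^{m} V\big(y_i, f(x_i)\big) \;+\; \lambda \,\|f\|_K^2 \Big\},
\]

where \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}\) is the sample, \(V\) is a differentiable strongly convex loss, \(\mathcal{H}_K\) is the reproducing kernel Hilbert space induced by the kernel \(K\), and \(\lambda > 0\) is the regularization parameter. A learning rate is then a bound on how fast the excess risk of \(f_{\mathbf{z},\lambda}\) decays as the sample size \(m\) grows.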

