Details
Title (Chinese): 基于相关熵损失的核正则化回归学习速度
Title (English): The Learning Rate of Kernel Regularized Regression Associated with a Correntropy-induced Loss
Document type: Journal article
Authors: Sun Xiaojun (孙小军) [1]; Sheng Baohuai (盛宝怀) [1,2]
Affiliations: [1] Department of Economic Statistics, International Business School, Zhejiang Yuexiu University of Foreign Languages, Shaoxing, Zhejiang 312000, China; [2] Department of Applied Statistics, Shaoxing University, Shaoxing, Zhejiang 312000, China
Year: 2024
Volume: 53
Issue: 3
Starting page: 633
Journal: 数学进展 (Advances in Mathematics (China))
Indexed in: PKU Core Journals (北大核心, 2023); CSTPCD; CSCD-E (2023-2024)
Funding: Partially supported by NSFC (No. 61877039) and the NSFC/RGC Joint Research Scheme (Nos. 12061160462, N_CityU102/20).
Language: Chinese
Keywords: learning theory; maximum correntropy criterion; kernel regularized regression; learning rate
Abstract: Maximum correntropy criterion induced regression (MCCR) is widely used in signal processing, and its convergence analysis is an active research topic in learning theory. This paper gives a new framework for analyzing the learning error: the non-convex kernel regularized optimization problem is transformed into a local convex optimization problem, and convex analysis is then applied to establish the convergence of the kernel regularized MCCR. The optimal regression function is expressed as the solution of an integral equation, the generalization error is bounded in terms of a K-functional and the best reproducing kernel Hilbert space approximation, and an upper bound on the learning rate is obtained.
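Note: for readers unfamiliar with the method named in the abstract, the following is a minimal illustrative sketch of kernel regularized MCCR, assuming the correntropy-induced (Welsch) loss l_sigma(r) = sigma^2 * (1 - exp(-r^2 / sigma^2)), a Gaussian kernel, and an iteratively reweighted least squares solver. None of this is taken from the paper; all function names and parameter values are hypothetical.

import numpy as np

def gaussian_kernel(X, Z, gamma):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X_i - Z_j||^2)
    d2 = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * d2)

def correntropy_loss(r, sigma):
    # Correntropy-induced (Welsch) loss: bounded, locally quadratic near r = 0
    return sigma**2 * (1.0 - np.exp(-(r / sigma) ** 2))

def mccr_fit(X, y, sigma, lam, gamma, n_iter=50):
    # Minimize (1/n) * sum_i l_sigma(y_i - f(x_i)) + lam * ||f||_K^2 over f = K @ alpha
    # via iteratively reweighted least squares (a standard half-quadratic scheme):
    # each residual gets weight w_i = exp(-(r_i / sigma)^2), so outliers are
    # downweighted, and each iteration solves a weighted kernel ridge system.
    K = gaussian_kernel(X, X, gamma)
    n = len(y)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)  # kernel ridge warm start
    for _ in range(n_iter):
        r = y - K @ alpha
        w = np.exp(-(r / sigma) ** 2)
        alpha = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
    return alpha

def mccr_predict(X_train, alpha, X_new, gamma):
    return gaussian_kernel(X_new, X_train, gamma) @ alpha

# Usage on synthetic data with heavy outliers, which the bounded loss tolerates:
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
y[::20] += 5.0  # gross outliers; the squared loss would be pulled toward them
alpha = mccr_fit(X, y, sigma=1.0, lam=1e-3, gamma=5.0)
y_hat = mccr_predict(X, alpha, X, gamma=5.0)

As sigma grows the loss approaches the squared loss, recovering ordinary kernel ridge regression; a small sigma gives stronger robustness but a non-convex objective, which is the regime the abstract's local-convexity argument addresses.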