Details
Document type: Journal article
Title: Convergence rate of SVM for kernel-based robust regression
Authors: Wang, Shuhua[1]; Chen, Zhenlong[1]; Sheng, Baohuai[2]
Affiliations: [1] Zhejiang Gongshang Univ, Sch Stat & Math, Hangzhou 310018, Zhejiang, Peoples R China; [2] Shaoxing Univ, Dept Appl Stat, Shaoxing 312000, Zhejiang, Peoples R China
Year: 2019
Volume: 17
Issue: 1
Journal: INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING
Indexed in: SCI-EXPANDED (Accession No. WOS:000458647200012), EI (Accession No. 20184806149468), Scopus (Accession No. 2-s2.0-85057123490), Web of Science
Funding: The authors would like to thank the anonymous referees, whose comments made the revised manuscript more readable. This work is supported by the National Natural Science Foundation of China under Grants Nos. 61877039, 11371321 and 11471292, the Humanities and Social Sciences Research Project of the Ministry of Education (18YJA910001), the National Statistical Science Research Project of China (2017LY51) and the Zhejiang Provincial Key Research Project of Statistics (18TJZZ08).
Language: English
Keywords: Support vector machine; robust regression; quasiconvex loss function; convergence rate; right directional derivative
Abstract: It is known that robust support vector (SV) regression was proposed to alleviate the performance deterioration caused by outliers; the resulting optimization problem involves a non-convex loss function, so its theoretical analysis cannot be carried out with the usual convex analysis approach. For a robust SV regression algorithm containing two homotopy parameters, a non-convex analysis method is developed based on quasiconvex analysis theory, and an error estimate is given. An explicit convergence rate is provided, and the degree to which outliers affect the performance is shown quantitatively.
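The sketch below is a minimal illustration of the general idea of robust kernel regression with a non-convex loss, not the algorithm analyzed in the paper: it fits a kernel regressor with a bounded Welsch-type loss via iteratively reweighted regularized least squares, so that large residuals (outliers) receive small weights. The Welsch loss, the kernel width sigma, the scale c, and the regularization parameter lam are all illustrative assumptions; the paper's quasiconvex loss and its two homotopy parameters are not reproduced here.

import numpy as np

def rbf_kernel(X, Y, sigma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def robust_kernel_regression(X, y, lam=1e-2, c=1.0, sigma=0.5, n_iter=30):
    """Kernel regression with a bounded Welsch-type loss
    L(r) = (c**2 / 2) * (1 - exp(-(r / c)**2)),
    minimized by iteratively reweighted kernel ridge regression.
    This loss is a hypothetical stand-in for the paper's quasiconvex loss."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)   # ordinary ridge start
    for _ in range(n_iter):
        r = y - K @ alpha                             # current residuals
        w = np.exp(-(r / c) ** 2)                     # outliers get small weight
        # weighted regularized least-squares update (half-quadratic step)
        alpha = np.linalg.solve(w[:, None] * K + lam * np.eye(n), w * y)
    return alpha

# Toy usage: a sinc curve with a few large outliers injected.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(80)
y[::10] += 3.0                                        # inject outliers
alpha = robust_kernel_regression(X, y)
X_test = np.linspace(-3.0, 3.0, 7)[:, None]
print(rbf_kernel(X_test, X) @ alpha)                  # robust fit at test points

Each reweighting step is a convex subproblem, while the overall objective with a bounded loss is non-convex; handling that non-convexity theoretically is, per the abstract, what the paper's quasiconvex analysis addresses.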