
Details

Joint Ranking SVM and Binary Relevance with Robust Low-Rank Learning for Multi-Label Classification (indexed by EI)

Document type: Journal article

English title: Joint Ranking SVM and Binary Relevance with Robust Low-Rank Learning for Multi-Label Classification

Authors: Wu, Guoqiang[1,2]; Zheng, Ruobing[3]; Tian, Yingjie[2,4,5]; Liu, Dalian[6]

First author: Wu, Guoqiang

Affiliations: [1] School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing, 100049, China; [2] Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, 100190, China; [3] Computer Network Information Center, Chinese Academy of Sciences, Beijing, 100190, China; [4] School of Economics and Management, University of Chinese Academy of Sciences, Beijing, 100190, China; [5] Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing, 100190, China; [6] Department of Basic Course Teaching, Beijing Union University, Beijing, 100101, China

First affiliation: School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing, 100049, China

Year: 2019

Source (foreign journal name): arXiv

Indexed by: EI (accession number: 20200413672)

Language: English

Keywords (English): Gradient methods; Support vector machines

Abstract: Multi-label classification studies the task where each example belongs to multiple labels simultaneously. As a representative method, Ranking Support Vector Machine (Rank-SVM) aims to minimize the Ranking Loss and can also mitigate the negative influence of the class-imbalance issue. However, due to its stacking-style approach to thresholding, it may suffer from error accumulation, which reduces the final classification performance. Binary Relevance (BR) is another typical method, which aims to minimize the Hamming Loss and needs only one-step learning. Nevertheless, it may suffer from the class-imbalance issue and does not take label correlations into account. To address the above issues, we propose a novel multi-label classification model, which joins Ranking SVM and Binary Relevance with robust Low-rank learning (RBRL). RBRL inherits the Ranking Loss minimization advantage of Rank-SVM, and thus overcomes the disadvantages of BR, namely suffering from the class-imbalance issue and ignoring label correlations. Meanwhile, it utilizes the Hamming Loss minimization and one-step learning advantages of BR, and thus avoids the disadvantage of Rank-SVM of requiring an additional thresholding learning step. Besides, a low-rank constraint is utilized to further exploit high-order label correlations under the assumption of a low-dimensional label space. Furthermore, to achieve nonlinear multi-label classifiers, we derive the kernelized RBRL. Two accelerated proximal gradient (APG) methods are used to solve the optimization problems efficiently. Extensive comparative experiments with several state-of-the-art methods demonstrate the highly competitive or superior performance of our method RBRL. Copyright © 2019, The Authors. All rights reserved.
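The abstract's optimization recipe — a smooth multi-label loss plus a low-rank regularizer, solved by accelerated proximal gradient (APG) — can be illustrated with a minimal sketch. This is not the authors' code: it substitutes a least-squares loss for the paper's hinge-style Ranking/Hamming losses, uses the trace (nuclear) norm as the low-rank penalty, and the names `svt` and `apg_low_rank` are hypothetical. The two standard APG ingredients are the singular-value-thresholding proximal step and Nesterov extrapolation, as in FISTA.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)          # shrink singular values toward zero
    return (U * s) @ Vt

def apg_low_rank(X, Y, lam=1.0, n_iter=300):
    """APG (FISTA) sketch for min_W 0.5*||XW - Y||_F^2 + lam*||W||_*.

    X: (n, d) feature matrix, Y: (n, q) label matrix, W: (d, q) weights.
    A smooth least-squares term stands in for the paper's losses.
    """
    d, q = X.shape[1], Y.shape[1]
    L = np.linalg.norm(X, 2) ** 2         # Lipschitz constant of the gradient
    W = Z = np.zeros((d, q))
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ Z - Y)          # gradient of the smooth term at Z
        W_new = svt(Z - grad / L, lam / L)  # proximal (low-rank) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Z = W_new + ((t - 1) / t_new) * (W_new - W)  # Nesterov extrapolation
        W, t = W_new, t_new
    return W
```

On data generated from a rank-2 weight matrix, the recovered `W` fits the labels closely and its trailing singular values are driven near zero, which is the low-rank behavior the penalty is meant to induce.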

