
Details

KENet: Distilling Convolutional Networks via Knowledge Enhancement (EI-indexed)

Document type: Journal article

English title: KENet: Distilling Convolutional Networks via Knowledge Enhancement

Authors: Liu, Hongzhe[1]; Zhang, Chi[1]; Xu, Cheng[1]; Li, Guofa[2]

First author: Liu Hongzhe

Affiliations: [1] Beijing Key Laboratory of Information Service Engineering, College of Robotics, Beijing Union University, Beijing, China; [2] Institute of Human Factors and Ergonomics, College of Mechatronics and Control Engineering, Shenzhen University, Shenzhen, China

First affiliation: Beijing Key Laboratory of Information Service Engineering, Beijing Union University | College of Robotics, Beijing Union University

Year: 2020

Volume: 53

Issue: 5

Pages: 385-390

Journal: IFAC-PapersOnLine

Indexed by: EI (accession number: 20212410512765)

Language: English

Keywords: Deep neural networks; Convolutional neural networks; Distillation

Abstract: For practical applications, deep neural networks need to be deployed with low memory and computing resources. To achieve this goal, we design a lightweight convolutional neural network named KENet (Knowledge Enhance Network) and propose a knowledge distillation method to improve its performance. Our proposed KENet is a lightweight convolutional network derived from a wide residual network by replacing the normal convolutions with a hybrid of group convolutions and bottleneck blocks to reduce the number of parameters. However, the use of small kernels and group convolutions loses information along both the spatial and channel-wise dimensions. To solve this problem, we further propose a knowledge distillation method to enhance the information in these two dimensions. We extract both spatial and channel-wise knowledge from a 'teacher', and improve the attention transfer features for knowledge distillation. The experimental results on multiple datasets show that KENet is computationally cheap and memory-saving with hardly any loss of precision. Moreover, we confirm that KENet can be effectively deployed in advanced detectors with strong robustness and real-time performance. © 2020 Elsevier B.V. All rights reserved.
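The abstract describes transferring both spatial and channel-wise knowledge from a teacher network via attention maps. A minimal NumPy sketch of that general idea follows; the function names, the squared-activation attention definition, and the L2 normalization are illustrative assumptions in the spirit of attention-transfer distillation, not the paper's actual implementation.

```python
import numpy as np

def spatial_attention(feat):
    """Spatial attention map of a (C, H, W) feature tensor:
    sum of squared activations over channels, L2-normalized."""
    a = (feat ** 2).sum(axis=0)                 # (H, W)
    return a / (np.linalg.norm(a) + 1e-8)

def channel_attention(feat):
    """Channel-wise descriptor: spatial mean of squared
    activations per channel, L2-normalized."""
    a = (feat ** 2).mean(axis=(1, 2))           # (C,)
    return a / (np.linalg.norm(a) + 1e-8)

def at_loss(f_student, f_teacher):
    """Distillation term: squared L2 distance between the
    normalized spatial attention maps of student and teacher."""
    s = spatial_attention(f_student).ravel()
    t = spatial_attention(f_teacher).ravel()
    return float(np.sum((s - t) ** 2))
```

Because the attention maps collapse the channel dimension, this loss can compare a slim student (few channels, e.g. from group convolutions) against a wide teacher as long as their spatial resolutions match, which is why attention-style transfer suits lightweight-network distillation.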

