
Details

A Dynamic Knowledge Distillation Method Based on the Gompertz Curve (EI-indexed)

Document type: Journal article

English title: A Dynamic Knowledge Distillation Method Based on the Gompertz Curve

Authors: Han, Yang[1]; Guangjun, Qin[1]

Affiliation: [1] Smart City College, Beijing Union University, Beijing, 100101, China

First affiliation: Smart City College, Beijing Union University

Year: 2025

Source: arXiv

Indexed in: EI (accession number: 20250495007)

Language: English

Keywords: Knowledge management; Knowledge transfer; Personnel training; Students; Teaching

Abstract: This paper introduces a novel dynamic knowledge distillation framework, Gompertz-CNN, which integrates the Gompertz growth model into the training process to address the limitations of traditional knowledge distillation. Conventional methods often fail to capture the evolving cognitive capacity of student models, leading to suboptimal knowledge transfer. To overcome this, we propose a stage-aware distillation strategy that dynamically adjusts the weight of the distillation loss based on the Gompertz curve, reflecting the student's learning progression: slow initial growth, rapid mid-phase improvement, and late-stage saturation. Our framework incorporates the Wasserstein distance to measure feature-level discrepancies and gradient matching to align backward-propagation behaviors between teacher and student models. These components are unified under a multi-loss objective, where the Gompertz curve modulates the influence of the distillation losses over time. Extensive experiments on CIFAR-10 and CIFAR-100 using various teacher–student architectures (e.g., ResNet50 → MobileNet_v2) demonstrate that Gompertz-CNN consistently outperforms traditional distillation methods, achieving up to 8% and 4% accuracy gains on CIFAR-10 and CIFAR-100, respectively. Copyright © 2025, The Authors. All rights reserved.
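The core idea of the stage-aware schedule can be sketched as a Gompertz weight that rises slowly at first, climbs steeply mid-training, and saturates late. The sketch below is a minimal illustration, not the paper's implementation: the parameter names `a`, `b`, `c`, the normalization of training progress to [0, 1], and the form of the multi-loss combination are all assumptions, since the record gives no hyperparameters.

```python
import math

def gompertz_weight(epoch, total_epochs, a=1.0, b=5.0, c=8.0):
    """Gompertz growth curve w(t) = a * exp(-b * exp(-c * t)),
    with t = epoch / total_epochs in [0, 1]. Yields slow initial
    growth, rapid mid-phase increase, and saturation toward a.
    Hyperparameters a, b, c are illustrative, not from the paper."""
    t = epoch / total_epochs
    return a * math.exp(-b * math.exp(-c * t))

def total_loss(ce_loss, distill_loss, epoch, total_epochs):
    """Hypothetical multi-loss objective: the Gompertz weight
    modulates the distillation term (e.g., a Wasserstein feature
    loss plus gradient matching) relative to the task loss."""
    w = gompertz_weight(epoch, total_epochs)
    return ce_loss + w * distill_loss
```

With these defaults the distillation term contributes almost nothing in the first epochs (w ≈ exp(−5) ≈ 0.007 at t = 0), dominates its scheduled range by mid-training, and plateaus near `a` at the end, matching the slow–rapid–saturating progression the abstract describes.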

