
Details

Temporal self-attention-based Conv-LSTM network for multivariate time series prediction (indexed in SCI-EXPANDED and EI)

Document type: Journal article

Title: Temporal self-attention-based Conv-LSTM network for multivariate time series prediction

Authors: Fu, En[1]; Zhang, Yinong[2]; Yang, Fan[3]; Wang, Shuying[2]

First author: Fu, En

Corresponding author: Zhang, YN[1]

Affiliations: [1] Beijing Union Univ, Beijing Key Lab Informat Serv Engn, Beijing 100101, Peoples R China; [2] Beijing Union Univ, Coll Urban Rail Transit & Logist, Beijing 100101, Peoples R China; [3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China

First affiliation: Beijing Key Laboratory of Information Service Engineering, Beijing Union University

Corresponding affiliation: [1] Beijing Union Univ, Coll Urban Rail Transit & Logist, Beijing 100101, Peoples R China (Beijing Union University)

Year: 2022

Volume: 501

Pages: 162-173

Journal: NEUROCOMPUTING

Indexing: EI (accession no. 20222612296299); Scopus (accession no. 2-s2.0-85132916646); SCI-EXPANDED (accession no. WOS:000822707200012)

Funding: We thank Maxine Garcia, PhD, from Liwen Bianji (Edanz) (www.liwenbianji.cn/) for editing the English text of a draft of this manuscript. This research was funded by the National Science and Technology Innovation 2030 Major Project (grant No. 2018AAA0101604) of the Ministry of Science and Technology of China and the National Natural Science Foundation of China (grant No. 61873142).

Language: English

Keywords: Self-attention mechanism; Long short-term memory; Multivariate time series; Prediction

Abstract: Time series play an important role in many fields, such as industrial control, automated monitoring, and weather forecasting. Because real-world problems often involve more than one variable, and these variables are related to each other, the multivariate time series (MTS) is introduced. Using historical observations to accurately predict MTS remains very challenging. Therefore, a new time series prediction model is proposed based on the temporal self-attention mechanism, convolutional neural network, and long short-term memory (Conv-LSTM). When the standard attention mechanism for time series is combined with a recurrent neural network (RNN), it depends heavily on the hidden state of the RNN. In particular, at the first time step, an initial hidden state (typically 0) must be artificially introduced to calculate the attention weight of that step, which adds noise to the attention-weight calculation. To address this problem and increase the flexibility of the attention layer, a new self-attention mechanism, called temporal self-attention, is designed to extract the temporal dependence of the MTS. In this attention mechanism, long short-term memory (LSTM) is adopted as a sequence encoder to calculate the query, key, and value, yielding a more complete temporal dependence than standard self-attention. Owing to the flexibility of this structure, the DA-Conv-LSTM model, a state-of-the-art attention-based method for MTS prediction, was improved. Our improved model was compared with six baseline models on multiple datasets (SML2010 and NASDAQ100) and applied to satellite state prediction (our private dataset). Experiments demonstrated the effectiveness of our temporal self-attention, and our improved model achieved the best short-term prediction performance. (c) 2022 Elsevier B.V. All rights reserved.
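Illustration: The abstract's core idea is that an LSTM encoder, rather than the usual linear projections, produces the query, key, and value sequences of a self-attention layer, so each time step's attention score carries temporal context without relying on an artificial initial RNN hidden state. The following is a minimal PyTorch sketch of that idea based only on the abstract, not the paper's actual code; the class name, the choice of three separate LSTM encoders, and all dimensions are assumptions.

import torch
import torch.nn as nn

class TemporalSelfAttention(nn.Module):
    """Sketch: self-attention whose Q/K/V come from LSTM encoders
    (hypothetical reconstruction from the abstract)."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # One LSTM encoder per projection; a single shared encoder
        # would be an equally plausible reading of the abstract.
        self.q_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.k_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.v_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.scale = hidden_dim ** 0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_dim)
        q, _ = self.q_enc(x)   # (batch, time, hidden_dim)
        k, _ = self.k_enc(x)
        v, _ = self.v_enc(x)
        # Scaled dot-product attention over the time axis; no initial
        # RNN hidden state is needed to score the first time step.
        scores = torch.softmax(q @ k.transpose(1, 2) / self.scale, dim=-1)
        return scores @ v      # (batch, time, hidden_dim)

# Example: a batch of 8 windows, 24 time steps, 5 variables each
layer = TemporalSelfAttention(input_dim=5, hidden_dim=32)
out = layer(torch.randn(8, 24, 5))
print(out.shape)  # torch.Size([8, 24, 32])

Because the LSTM outputs at every position already summarize the preceding steps, the attention weights for step 1 are computed from real data rather than a zero-initialized hidden state, which is the noise problem the abstract describes.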

