Journal of Guangxi Normal University(Natural Science Edition) ›› 2019, Vol. 37 ›› Issue (2): 82-89.doi: 10.16088/j.issn.1001-6600.2019.02.010


Predicting Financial Time Series Based on Gated Recurrent Unit Neural Network

ZHANG Jinlei, LUO Yuling*, FU Qiang   

College of Electronic Engineering, Guangxi Normal University, Guilin, Guangxi 541004, China
Received: 2018-08-13; Online: 2019-04-25; Published: 2019-04-28

Abstract: Recurrent neural networks (RNNs) suffer from the long-term dependency problem: gradients vanish or explode over long sequences, so dependencies between distant time steps are hard to learn. The gated recurrent unit (GRU) neural network, a variant of the RNN, inherits the RNN's memory of time series while mitigating this problem. Because financial time series exhibit such long-term dependencies, this paper applies the GRU to financial time series prediction and proposes a prediction model that combines a differencing operation with a GRU neural network. The model handles complex characteristics of financial time series data such as non-linearity, non-stationarity, and serial correlation, and is used here to predict the adjusted closing price of the Standard & Poor's (S&P) 500 stock index. The experimental results show that the differencing operation improves both the generalization ability and the prediction accuracy of the GRU neural network, and that the proposed method predicts financial time series better than the conventional approach, with relatively low computational overhead.
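The pipeline described in the abstract, first-order differencing of the price series followed by GRU processing, can be sketched as follows. This is a minimal illustrative sketch in NumPy: the `GRUCell` class, its random weight initialization, and the toy price series are assumptions for demonstration, not the authors' implementation or hyperparameters; the gate equations follow the standard GRU formulation of Cho et al.

```python
import numpy as np

def difference(series):
    """First-order differencing: d[t] = x[t+1] - x[t], which helps
    remove the non-stationary trend from a price series."""
    return np.diff(series)

def inverse_difference(last_value, diff_pred):
    """Recover a price forecast from a predicted difference."""
    return last_value + diff_pred

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (standard formulation, biases omitted)."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # weights for the update gate (z), reset gate (r), candidate state
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size))
        self.Ur = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)              # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)              # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate state
        return (1.0 - z) * h + z * h_tilde                  # new hidden state

# Toy price series standing in for the S&P 500 adjusted close
prices = np.array([100.0, 101.5, 101.0, 102.3, 103.1])
diffs = difference(prices)          # differenced (more stationary) inputs
cell = GRUCell(input_size=1, hidden_size=4)
h = np.zeros(4)
for d in diffs:                     # run the GRU over the differenced series
    h = cell.step(np.array([d]), h)
```

In a full model, a trained output layer would map the final hidden state to a predicted next difference, and `inverse_difference` would add it back to the last observed price to produce the price forecast.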

Key words: recurrent neural network, gated recurrent unit (GRU), differencing operation, financial time series prediction, deep learning

CLC Number: TP183