Journal of Changjiang River Scientific Research Institute ›› 2020, Vol. 37 ›› Issue (3): 137-143. DOI: 10.11988/ckyyb.20181201
Information Technology Application


Real-time Recommendation Algorithm for Water Information Distribution Based on Long-Short-Term Memory

  • LU Yan-xin1, LI Yong-feng2, XIN Ming-quan1, LI Xiao-ning2, LIU Shu-bo1

Abstract

With the steady advance of water conservancy informatization in China, the demand for real-time recommendation of water regime information is growing. Water conservancy data are highly time-sensitive, so the recommendation system is required to provide real-time recommendation services. User-based collaborative filtering and item-based collaborative filtering (ItemCF) are two algorithms commonly used in the recommendation field, but both are offline algorithms in nature and cannot meet the real-time requirement of water regime information distribution. This paper proposes and optimizes a real-time recommendation algorithm for water regime information distribution based on the Long Short-Term Memory (LSTM) neural network. Experimental results show that the LSTM-based real-time recommendation algorithm performs best in recommendation latency, while the optimized real-time algorithm that combines a binary classification model with ItemCF results performs best in recommendation accuracy; the optimized LSTM-based real-time recommendation algorithm designed and implemented in this paper achieves a good overall result, ensuring the accuracy of water regime information recommendation while guaranteeing real-time performance.
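
As an illustration of the sequence-modelling idea behind an LSTM-based recommender, the minimal sketch below (not the authors' implementation) shows how an LSTM could score candidate water-information items from a user's recent reading sequence. The vocabulary size, sequence length, layer sizes and the TensorFlow/Keras API are all assumptions made for this example.

```python
# Illustrative sketch only (not the paper's implementation): an LSTM that
# predicts which water-information item a user is likely to read next,
# given the sequence of item IDs the user has read recently.
# NUM_ITEMS, SEQ_LEN and all layer sizes are hypothetical.
import numpy as np
import tensorflow as tf

NUM_ITEMS = 5000   # hypothetical number of distinct water-information items
SEQ_LEN = 20       # length of each user's recent reading history
EMBED_DIM = 64
LSTM_UNITS = 128

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(NUM_ITEMS, EMBED_DIM),          # item-ID embedding
    tf.keras.layers.LSTM(LSTM_UNITS, dropout=0.2),            # sequence encoder
    tf.keras.layers.Dense(NUM_ITEMS, activation="softmax"),   # score every candidate item
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy training data: each row is one user's recent item-ID sequence,
# and the label is the item that user actually read next.
x = np.random.randint(0, NUM_ITEMS, size=(256, SEQ_LEN))
y = np.random.randint(0, NUM_ITEMS, size=(256,))
model.fit(x, y, epochs=1, batch_size=32, verbose=0)

# Online serving: score all items for one user's latest history and
# push the top-10 items as the real-time recommendation.
scores = model.predict(x[:1], verbose=0)[0]
top_items = np.argsort(scores)[::-1][:10]
print(top_items)
```

In a setup of this kind, scoring a user's latest history is a single forward pass through the trained network, which is what makes per-request recommendation feasible, in contrast with offline collaborative-filtering results that are only recomputed periodically.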

Key words

water regime information / distribution / real-time recommendation / ItemCF / LSTM / binary classification model / optimization

Cite this article

LU Yan-xin, LI Yong-feng, XIN Ming-quan, LI Xiao-ning, LIU Shu-bo. Real-time Recommendation Algorithm for Water Information Distribution Based on Long-Short-Term Memory[J]. Journal of Changjiang River Scientific Research Institute, 2020, 37(3): 137-143. https://doi.org/10.11988/ckyyb.20181201
CLC number: TN912

