Journal of Guangxi Normal University (Natural Science Edition) ›› 2022, Vol. 40 ›› Issue (4): 1-21. doi: 10.16088/j.issn.1001-6600.2021071102

• Review •

Survey of Ensemble Classification Methods for Complex Data Stream

ZHANG Xilong, HAN Meng*, CHEN Zhiqiang, WU Hongxin, LI Muhang

  1. School of Computer Science and Engineering, North Minzu University, Yinchuan, Ningxia 750021, China
  • Published: 2022-08-05
  • Corresponding author: HAN Meng (1982—), female (Hui nationality), born in Shangqiu, Henan; professor and Ph.D., North Minzu University. E-mail: 2003051@nmu.edu.cn
  • Funding: National Natural Science Foundation of China (62062004); Natural Science Foundation of Ningxia (2020AAC03216, 2022AAC03279); Postgraduate Innovation Project of North Minzu University (YCX21085)

Abstract: With the rapid development of big data, mining valuable knowledge may be affected by high-dimensional, large-volume, and dynamic data, and the emergence of such complex data streams degrades classification performance. To further analyze the research status and open challenges of data stream ensemble classification, this paper surveys the ensemble classification of complex data streams, focusing on the core ideas and performance of current algorithms from the perspectives of complex data streams and domain-specific data streams. For complex data streams, concept-drifting, imbalanced, and multi-label data streams, among others, are introduced; for domain-specific data streams, text, graph, and sensor streams are introduced, and the applications of ensemble learning in these domains are summarized. Evaluation of data stream classifiers is then introduced from the aspects of validation techniques and evaluation metrics. Finally, several possible directions for future research are outlined, including the challenges of ensemble classification for uncertain data streams, for multiple coexisting data streams, and for delayed data streams, as well as challenges in evaluation methods for data stream classification.

Key words: complex data stream, ensemble learning, evaluation method, domain data stream
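Two of the recurring ideas in this literature — online ensembles that adapt batch bagging to unbounded streams, and prequential (interleaved test-then-train) evaluation — can be illustrated with a minimal, self-contained sketch. This is not an algorithm from the survey itself: the base learner and the toy stream are hypothetical placeholders, and the Poisson(1)-weighted update follows the well-known online bagging scheme of Oza only in outline.

```python
import random
from collections import Counter

class MajorityClassLearner:
    """Toy base learner: predicts the most frequently seen class label."""
    def __init__(self):
        self.counts = Counter()

    def learn_one(self, x, y, weight=1):
        self.counts[y] += weight

    def predict_one(self, x):
        return self.counts.most_common(1)[0][0] if self.counts else None

class OnlineBaggingEnsemble:
    """Online bagging sketch: each arriving example is shown to each base
    learner k times, with k drawn from Poisson(1), approximating the
    bootstrap resampling of batch bagging on an unbounded stream."""
    def __init__(self, n_models=10, base=MajorityClassLearner, rng=None):
        self.models = [base() for _ in range(n_models)]
        self.rng = rng or random.Random(0)

    def _poisson1(self):
        # Knuth's algorithm for Poisson(lambda = 1), stdlib random only
        limit, k, p = 2.718281828 ** -1, 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= limit:
                return k
            k += 1

    def learn_one(self, x, y):
        for m in self.models:
            k = self._poisson1()
            if k > 0:
                m.learn_one(x, y, weight=k)

    def predict_one(self, x):
        # Unweighted majority vote over base-learner predictions
        votes = Counter(m.predict_one(x) for m in self.models)
        return votes.most_common(1)[0][0]

def prequential_accuracy(stream, ensemble):
    """Prequential evaluation: each example first tests the current model,
    then immediately updates it, so every example is used exactly once
    for testing and once for training."""
    correct = total = 0
    for x, y in stream:
        if total > 0:  # nothing to score before any training has happened
            correct += int(ensemble.predict_one(x) == y)
        ensemble.learn_one(x, y)
        total += 1
    return correct / max(total - 1, 1)
```

On a stationary stream the prequential accuracy converges toward the ensemble's holdout accuracy; under concept drift it drops at the drift point and recovers as the ensemble adapts, which is why the surveyed literature favors it over a single static holdout split.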

CLC number: TP181