
        A survey of ensemble learning approaches

        XU Jiwei, YANG Yun


          Author biography: XU Jiwei (1995-), male, from Hunan, master's student; research interests: machine learning and ensemble learning. E-mail: 420076887@qq.com
          Corresponding author: YANG Yun, yangyun@ynu.edu.cn
        • Funding: National Natural Science Foundation of China (61876166, 61663046).

        • Abstract: The process of machine learning can be viewed as a search through a hypothesis space for a learning model with strong generalization ability and high robustness, and finding a suitable model in that space is difficult. Ensemble learning, a family of combinatorial learning methods, not only combines multiple simple models into a composite model with better performance, but also lets researchers design combination schemes tailored to a specific machine learning problem to obtain a more powerful solution. This survey reviews the history of ensemble learning, summarizes its three key strategies, namely diversity generation, model training, and model combination, then describes current application scenarios of ensemble learning, and concludes with an analysis and outlook on future research directions.
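        The three strategies named in the abstract (diversity generation, model training, and model combination) can be illustrated with a minimal bagging-style sketch: bootstrap resampling supplies diversity, each resample trains a one-dimensional decision stump, and predictions are combined by majority vote. This is an illustrative reconstruction, not code from the survey; all function names and the toy data are invented for the example.

```python
import random
from collections import Counter

def train_stump(xs, ys):
    """Fit a 1-D decision stump: predict 1 on one side of a threshold."""
    best = None
    for t in set(xs):                      # candidate thresholds from the data
        for sign in (1, -1):               # which side of t predicts class 1
            acc = sum(int(sign * (x - t) > 0) == y
                      for x, y in zip(xs, ys)) / len(xs)
            if best is None or acc > best[0]:
                best = (acc, t, sign)
    _, t, sign = best
    return lambda x: int(sign * (x - t) > 0)

def bagging_ensemble(xs, ys, n_models=15, seed=0):
    """Diversity via bootstrap sampling; combination via majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]  # bootstrap sample
        models.append(train_stump([xs[i] for i in idx],
                                  [ys[i] for i in idx]))        # model training
    def predict(x):                                             # model combination
        return Counter(m(x) for m in models).most_common(1)[0][0]
    return predict
```

        On a linearly separable toy set such as xs = [0.05, 0.1, 0.2, 0.3, 0.7, 0.8, 0.9, 0.95] with ys = [0, 0, 0, 0, 1, 1, 1, 1], the ensemble's vote recovers the underlying split even though individual stumps see different bootstrap samples and may disagree near the boundary.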
        • 1. School of Software, Yunnan University, Kunming 650500, Yunnan, China
        Publication history
        • Received: 2018-08-01
        • Published: 2018-11-10
