
ML with xgboost & GBM: Training xgboost and GBM on the Higgs Boson dataset (Kaggle competition) for binary-classification prediction, with a head-to-head performance comparison

Updated: 2022-08-20 23:35:19

Output


Design Approach


 

Core Code


finish loading from csv
weight statistics: wpos=1522.37, wneg=904200, ratio=593.94
loading data end, start to boost trees
training GBM from sklearn
     Iter       Train Loss   Remaining Time
        1           1.2069           49.52s
        2           1.1437           43.51s
        3           1.0909           37.43s
        4           1.0471           30.96s
        5           1.0096           25.09s
        6           0.9775           19.90s
        7           0.9505           15.22s
        8           0.9264            9.94s
        9           0.9058            4.88s
       10           0.8878            0.00s
sklearn.GBM total costs: 50.88141202926636 seconds
training xgboost
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 1 thread costs: 24.5108642578125 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 2 thread costs: 11.449955940246582 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 4 thread costs: 8.809934616088867 seconds
[0] train-ams@0.15:3.69849
[1] train-ams@0.15:3.96339
[2] train-ams@0.15:4.26978
[3] train-ams@0.15:4.32619
[4] train-ams@0.15:4.41415
[5] train-ams@0.15:4.49395
[6] train-ams@0.15:4.64614
[7] train-ams@0.15:4.64058
[8] train-ams@0.15:4.73064
[9] train-ams@0.15:4.79447
XGBoost with 8 thread costs: 7.875434875488281 seconds
XGBoost total costs: 52.64618968963623 seconds
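The `train-ams@0.15` lines report the Approximate Median Significance (AMS), the Kaggle Higgs Boson challenge's ranking metric, evaluated after classifying the top 15% of ranked events as signal. Given the weighted sum of true positives s (signal) and false positives b (background), the challenge defines AMS with a regularization term b_r = 10; a direct transcription:

```python
import math

def ams(s, b, b_r=10.0):
    """Approximate Median Significance, as defined by the Kaggle Higgs challenge.

    s: weighted sum of true positives (signal selected as signal)
    b: weighted sum of false positives (background selected as signal)
    b_r: regularization term, fixed at 10 in the challenge
    """
    return math.sqrt(2.0 * ((s + b + b_r) * math.log(1.0 + s / (b + b_r)) - s))
```

Larger is better, which is why the xgboost log's climb from 3.70 to 4.79 over ten rounds indicates steady training progress.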