Predicting and optimizing the concrete compressive strength using an explainable boosting machine learning model
Authors: Võ Trọng Cường, Nguyễn Thị Quỳnh, Trần Viết Linh
Asian Journal of Civil Engineering
Volume:     Pages:
Supporting document: 2034_CS.pdf
Publication date: 8/2023
Abstract
Accurate and interpretable prediction of concrete compressive strength (CCS), together with determining the optimal mixture that maximizes the CCS, are crucial tasks in structural engineering. Because experimental compression-test-based CCS determination is laborious, expensive, and time-consuming, machine learning (ML) approaches can be used to predict the CCS accurately and at an early stage. However, such ML models are difficult to understand because they lack explainability. This paper explores the capacity of four boosting ML models, namely adaptive boosting (AdaBoost), gradient boosting regression tree (GBRT), extreme gradient boosting (XGBoost), and categorical gradient boosting (CatBoost), to predict the CCS. For this purpose, a comprehensive CCS database available in the literature is used to develop the four boosting ML models. The hyperparameters of the boosting ML models are determined using the Bayesian optimization (BO) algorithm and tenfold cross-validation. The results of the four boosting ML models are evaluated and compared using the correlation coefficient, the root mean square error, and the mean absolute error. The comparative results show that the XGBoost model outperforms the other models. Afterward, the SHapley Additive exPlanations (SHAP) method is used to interpret the predictions of the XGBoost model both globally and locally. Then, an efficient XGBoost-based web application (XGBoost-WA) is developed to predict the CCS rapidly. Finally, the XGBoost-based Moth-flame Optimization (MFO) algorithm, called XGBoost-MFO, is applied to mixture optimization to maximize the CCS. The result shows an 11% improvement in CCS using the proposed XGBoost-MFO model.
Keywords
Boosting machine learning, Concrete compressive strength, Moth-flame optimization, Shapley additive explanations.
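
To make the workflow described in the abstract more concrete, the following minimal sketch shows how a BO-tuned XGBoost model with tenfold cross-validation, the three reported metrics, and a SHAP interpretation could be wired together in Python. It is not the authors' code: the file name concrete_data.csv, the column names, and the hyperparameter search space are assumptions, and scikit-optimize's BayesSearchCV stands in for whatever BO implementation the paper actually uses.

```python
# Minimal sketch (not the authors' code): tuning XGBoost with Bayesian
# optimization and tenfold cross-validation, then explaining it with SHAP.
import pandas as pd
import xgboost as xgb
import shap
from skopt import BayesSearchCV
from skopt.space import Integer, Real
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Hypothetical CCS database: mixture proportions as features, strength as target.
data = pd.read_csv("concrete_data.csv")
X, y = data.drop(columns=["compressive_strength"]), data["compressive_strength"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Bayesian optimization (BO) over an assumed hyperparameter space, scored with
# tenfold cross-validation as described in the abstract.
search = BayesSearchCV(
    xgb.XGBRegressor(objective="reg:squarederror"),
    {
        "n_estimators": Integer(100, 1000),
        "max_depth": Integer(3, 10),
        "learning_rate": Real(0.01, 0.3, prior="log-uniform"),
        "subsample": Real(0.5, 1.0),
    },
    n_iter=50,
    cv=10,
    scoring="neg_root_mean_squared_error",
    random_state=42,
)
search.fit(X_train, y_train)
model = search.best_estimator_

# Evaluate with the three metrics used in the paper: correlation (R2 here), RMSE, MAE.
pred = model.predict(X_test)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
print("MAE :", mean_absolute_error(y_test, pred))

# Global and local interpretation of the tuned model with SHAP values.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)
```

Likewise, a simplified Moth-flame Optimization loop can sit on top of the tuned model to search for a mixture that maximizes the predicted CCS, in the spirit of the paper's XGBoost-MFO step. The variable bounds, population size, and iteration count below are illustrative assumptions, and `model` refers to the estimator tuned in the previous sketch.

```python
# Simplified MFO sketch for mixture optimization (assumed bounds, not the paper's).
import numpy as np

def mfo_maximize(predict, bounds, n_moths=30, max_iter=200, b=1.0, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = bounds[:, 0], bounds[:, 1]
    dim = len(lb)
    moths = rng.uniform(lb, ub, size=(n_moths, dim))
    best_x, best_f = None, -np.inf
    for it in range(max_iter):
        fitness = predict(moths)                      # predicted CCS of each moth
        order = np.argsort(-fitness)                  # sort descending (maximization)
        flames, flame_fit = moths[order].copy(), fitness[order]
        if flame_fit[0] > best_f:
            best_f, best_x = flame_fit[0], flames[0].copy()
        # Number of flames shrinks over the iterations (standard MFO schedule).
        n_flames = int(round(n_moths - it * (n_moths - 1) / max_iter))
        r = -1.0 - it / max_iter                      # r decreases from -1 toward -2
        for i in range(n_moths):
            j = min(i, n_flames - 1)                  # extra moths share the last flame
            d = np.abs(flames[j] - moths[i])
            t = (r - 1.0) * rng.random(dim) + 1.0     # t drawn from [r, 1]
            moths[i] = d * np.exp(b * t) * np.cos(2 * np.pi * t) + flames[j]
        moths = np.clip(moths, lb, ub)                # keep mixtures inside the bounds
    return best_x, best_f

# Illustrative bounds for 8 mixture variables (cement, slag, fly ash, water,
# superplasticizer, coarse/fine aggregate, age); replace with real constraints.
bounds = np.array([[100, 550], [0, 350], [0, 200], [120, 250],
                   [0, 30], [800, 1150], [600, 1000], [1, 365]], dtype=float)
best_mix, best_ccs = mfo_maximize(lambda x: model.predict(x), bounds)
print("Best predicted CCS:", best_ccs)
```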