
XGBOOST OPTIMIZATION FOR REAL-TIME VEHICLE SPEED PREDICTION: A COMPARISON OF HYPERPARAMETER TUNING METHODS

*Panji Lokajaya Arifa  -  Program Studi Statistika dan Sains Data, Sekolah Sains Data, Matematika dan Informatika, Institut Pertanian Bogor, Jl. Raya Dramaga Kampus IPB, Dramaga, Bogor 16680, Indonesia
Kusman Sadik  -  Program Studi Statistika dan Sains Data, Sekolah Sains Data, Matematika dan Informatika, Institut Pertanian Bogor, Jl. Raya Dramaga Kampus IPB, Dramaga, Bogor 16680, Indonesia
Agus M Soleh  -  Program Studi Statistika dan Sains Data, Sekolah Sains Data, Matematika dan Informatika, Institut Pertanian Bogor, Jl. Raya Dramaga Kampus IPB, Dramaga, Bogor 16680, Indonesia
Cici Suhaeni  -  Program Studi Statistika dan Sains Data, Sekolah Sains Data, Matematika dan Informatika, Institut Pertanian Bogor, Jl. Raya Dramaga Kampus IPB, Dramaga, Bogor 16680, Indonesia
Open Access. Copyright 2026 Jurnal Gaussian, licensed under http://creativecommons.org/licenses/by-nc-sa/4.0.

Abstract

Real-time vehicle speed prediction plays a vital role in the development of intelligent transportation systems aimed at improving traffic flow and safety. This study investigates the performance of the XGBoost algorithm enhanced with three hyperparameter tuning techniques: Grid Search, Bayesian Optimization, and Genetic Algorithm. A simulated dataset was constructed to reflect diverse urban traffic scenarios, incorporating environmental variables such as weather, road conditions, and traffic density. The models were assessed using 5- and 10-fold cross-validation on prediction metrics (MSE, RMSE, MAE, and R²) as well as computational efficiency in terms of training and inference time. The findings show that Bayesian Optimization achieves the highest prediction accuracy, while Grid Search offers the fastest training time. The Genetic Algorithm demonstrates a balanced trade-off between accuracy and computational efficiency, making it a competitive and practical choice. These results highlight the importance of selecting a hyperparameter tuning strategy based on the specific requirements of real-time traffic prediction systems using XGBoost.
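
As an illustration of the tuning setup described in the abstract, the following is a minimal sketch (not the authors' code) of fitting an XGBoost regressor with Grid Search and Bayesian Optimization under 5-fold cross-validation. It assumes the xgboost, scikit-learn, and scikit-optimize packages are installed; the synthetic make_regression data and the parameter ranges are placeholders rather than the study's simulated traffic dataset or search space, and the Genetic Algorithm variant is omitted.

from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from skopt import BayesSearchCV
from xgboost import XGBRegressor

# Placeholder stand-in for the simulated urban-traffic dataset described in the study.
X, y = make_regression(n_samples=2000, n_features=8, noise=10.0, random_state=42)

# Illustrative search grid; the study's actual hyperparameter ranges are not reproduced here.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 5, 7],
    "learning_rate": [0.05, 0.1],
}

# Grid Search: exhaustive evaluation of every grid combination with 5-fold CV.
grid = GridSearchCV(
    XGBRegressor(objective="reg:squarederror"),
    param_grid,
    cv=5,
    scoring="neg_root_mean_squared_error",
)
grid.fit(X, y)
print("Grid Search   best CV RMSE:", -grid.best_score_, grid.best_params_)

# Bayesian Optimization: adaptively samples a continuous space with far fewer model fits.
bayes = BayesSearchCV(
    XGBRegressor(objective="reg:squarederror"),
    {
        "n_estimators": (100, 500),
        "max_depth": (3, 9),
        "learning_rate": (0.01, 0.3, "log-uniform"),
    },
    n_iter=25,
    cv=5,
    scoring="neg_root_mean_squared_error",
    random_state=42,
)
bayes.fit(X, y)
print("Bayesian Opt. best CV RMSE:", -bayes.best_score_, bayes.best_params_)

A Genetic Algorithm search could be slotted into the same cross-validated scoring loop (for example via a package such as sklearn-genetic-opt), but its API is not shown here.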

Keywords: XGBoost; Hyperparameter Tuning; Grid Search; Bayesian Optimization; Genetic Algorithms


