
APPLYING GRADIENT BOOSTING WITH HYPEROPT TO PREDICT THE SUCCESS OF BANK TELEMARKETING

*Silvia Elsa Suryana  -  Departemen Statistika, Fakultas Sains dan Matematika, Universitas Diponegoro, Indonesia
Budi Warsito  -  Departemen Statistika, Fakultas Sains dan Matematika, Universitas Diponegoro, Indonesia
Suparti Suparti  -  Departemen Statistika, Fakultas Sains dan Matematika, Universitas Diponegoro, Indonesia
Open Access. Copyright 2021 Jurnal Gaussian, licensed under CC BY-NC-SA 4.0 (http://creativecommons.org/licenses/by-nc-sa/4.0).

Abstract

Telemarketing is a form of marketing conducted by telephone. A bank can use telemarketing to offer its products, such as term deposits. One of the most important strategies for successful telemarketing is selecting potential customers, so that campaigns are directed effectively. Machine learning can be used to predict the success of telemarketing. Gradient boosting is a machine learning method built on decision trees: it combines many classification trees, each of which is fitted to correct the errors of the previous trees. The optimal classification result cannot be separated from the role of optimal hyperparameters. Hyperopt is a Python library that can tune hyperparameters effectively because it uses Bayesian optimization; it searches prior distributions over the hyperparameters to find an optimal configuration. The data in this study comprise 20 independent variables and a binary dependent variable with 'yes' and 'no' classes. The study showed that gradient boosting reached a classification accuracy of 90.39%, a precision of 94.91%, and an AUC of 0.939, indicating that the method predicts both the 'yes' and 'no' classes relatively accurately.

Note: This article has a supplementary file: Research Instrument (CTA_Form, 127 KB).
Keywords: Telemarketing, Hyperopt, Gradient Boosting



Last update:

No citation recorded.
