Abstract:
Ensemble learning combines the predictions of multiple machine-learning algorithms (e.g., Extra Trees, Gradient Boosting, AdaBoost, Random Forest, and LightGBM) through a voting classifier. The ensemble is evaluated on the training dataset with cross-validation under both soft and hard voting, reporting accuracy scores, ROC curves, and classification reports. Using a range of classifiers aims to capture different aspects of the data and improve overall predictive accuracy. The versatility of the code is demonstrated by a commented-out XGBoost classifier, which leaves room for further customisation and experimentation. Ensemble learning thus serves as a potent tool for improving model accuracy and reliability, with comprehensive evaluation metrics supporting informed model selection and decision-making. This ensemble strategy aims to increase the model's overall predictive power by exploiting the unique strengths of each classifier.
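The voting-classifier setup described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' code: the synthetic dataset, estimator names, and hyperparameters are assumptions, and the LightGBM/XGBoost members are shown commented out (as the abstract notes for XGBoost) so the sketch runs with scikit-learn alone.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the paper's training dataset (an assumption).
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

estimators = [
    ("et", ExtraTreesClassifier(n_estimators=50, random_state=42)),
    ("gb", GradientBoostingClassifier(n_estimators=50, random_state=42)),
    ("ada", AdaBoostClassifier(n_estimators=50, random_state=42)),
    ("rf", RandomForestClassifier(n_estimators=50, random_state=42)),
    # ("lgbm", lightgbm.LGBMClassifier()),   # add if lightgbm is installed
    # ("xgb", xgboost.XGBClassifier()),      # optional, as in the paper
]

# Soft voting averages predicted class probabilities; switching to
# voting="hard" takes a majority vote over predicted labels instead.
ensemble = VotingClassifier(estimators=estimators, voting="soft")

# 5-fold cross-validated accuracy of the combined model.
scores = cross_val_score(ensemble, X, y, cv=5, scoring="accuracy")
print("mean CV accuracy:", round(scores.mean(), 3))
```

Soft voting generally benefits from well-calibrated probability estimates, while hard voting only needs each member's predicted labels; the abstract's evaluation compares both modes.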
Published in: 2024 1st International Conference on Cognitive, Green and Ubiquitous Computing (IC-CGU)
Date of Conference: 01-02 March 2024
Date Added to IEEE Xplore: 22 May 2024
ISBN Information: