The critical role of hyperparameter tuning in machine learning: Implications for reproducibility and model comparison
DOI: https://doi.org/10.5281/zenodo.18108973

Keywords: Hyperparameter tuning, Machine learning, Reproducibility, Model comparison, Optimization

Abstract
Despite being a fundamental aspect of machine learning model development, hyperparameter tuning remains underreported in the literature. This article highlights the importance of hyperparameter optimisation, outlines common hyperparameters across various algorithms, and discusses the consequences of inadequate hyperparameter documentation. We argue that the lack of transparency in hyperparameter settings impedes reproducibility, hinders fair model comparisons, and contributes to hyperparameter deception. The importance of hyperparameter tuning was demonstrated by comparing the performance of decision tree, support vector machine and random forest models on the Iris, Digits and Breast Cancer datasets using default and tuned hyperparameters. The results further justify the need to document and report both the tuning process and the hyperparameter values used in such models. To facilitate this, an architecture that encourages the documentation of hyperparameters is proposed. By emphasising the need for comprehensive reporting, this study aims to raise awareness and encourage best practices in machine learning research.
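
As a concrete illustration of the default-versus-tuned comparison described in the abstract, the following Python sketch (not the authors' code; the parameter grid, data split, and random seeds are illustrative assumptions) contrasts a scikit-learn random forest with default hyperparameters against one tuned by grid search on the Iris dataset, and prints the chosen settings alongside the scores, the kind of reporting the article advocates.

# Minimal sketch of a default-vs-tuned comparison on one of the
# datasets named in the abstract. Grid, split, and seed are assumptions.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

# Baseline: library defaults, no tuning.
default_model = RandomForestClassifier(random_state=42).fit(X_train, y_train)
default_acc = accuracy_score(y_test, default_model.predict(X_test))

# Tuned: exhaustive search over a small, illustrative grid with 5-fold CV.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 5, 10],
    "min_samples_split": [2, 5],
}
search = GridSearchCV(
    RandomForestClassifier(random_state=42), param_grid, cv=5
).fit(X_train, y_train)
tuned_acc = accuracy_score(y_test, search.best_estimator_.predict(X_test))

# Reporting the scores together with the selected hyperparameter values
# is the documentation practice the article argues for.
print(f"Default accuracy: {default_acc:.3f}")
print(f"Tuned accuracy:   {tuned_acc:.3f}")
print(f"Best params:      {search.best_params_}")
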
License
Copyright (c) 2025 Technoscience Journal for Community Development in Africa

This work is licensed under a Creative Commons Attribution 4.0 International License.