Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/224677 
Year of Publication: 
2020
Citation: 
[Title:] Proceedings of the ENTRENOVA - ENTerprise REsearch InNOVAtion Conference, Virtual Conference, 10-12 September 2020 [Publisher:] IRENET - Society for Advancing Innovation and Research in Economy [Place:] Zagreb [Year:] 2020 [Pages:] 74-83
Publisher: 
IRENET - Society for Advancing Innovation and Research in Economy, Zagreb
Abstract: 
In this research, we propose the bootstrap procedure as a method for train/test splitting in machine learning algorithms for classification. We show that this resampling method can be a reliable alternative to cross-validation and repeated random train/test splitting algorithms. The bootstrap procedure optimizes the classifier's performance by improving its accuracy and classification scores and by significantly reducing computation time. We also show that ten iterations of the bootstrap procedure are enough to achieve better performance of the classification algorithm. With these findings, we propose a solution to the problem of how to reduce computing time on large datasets, while introducing a new practical application of the bootstrap procedure.
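Illustrative sketch (not the authors' code; the paper does not specify an implementation): one common way to realize bootstrap train/test splitting is to draw the training set by sampling rows with replacement and to test on the out-of-bag rows, repeating this a small number of times. The sketch below assumes scikit-learn and NumPy; the dataset, the LogisticRegression model, and the helper bootstrap_scores are hypothetical choices made for illustration, and 10-fold cross-validation is shown only as a baseline for comparison.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)
    rng = np.random.default_rng(42)

    def bootstrap_scores(model, X, y, n_iterations=10):
        # Each iteration draws a training set by sampling row indices with
        # replacement; the rows never drawn (the out-of-bag sample) form the
        # test set for that iteration.
        n = len(y)
        scores = []
        for _ in range(n_iterations):
            train_idx = rng.integers(0, n, size=n)   # sample with replacement
            oob_mask = np.ones(n, dtype=bool)
            oob_mask[train_idx] = False              # out-of-bag rows
            model.fit(X[train_idx], y[train_idx])
            scores.append(accuracy_score(y[oob_mask], model.predict(X[oob_mask])))
        return np.array(scores)

    clf = LogisticRegression(max_iter=5000)

    # Ten bootstrap iterations, the number suggested in the abstract.
    boot = bootstrap_scores(clf, X, y, n_iterations=10)
    print("bootstrap accuracy: %.3f +/- %.3f" % (boot.mean(), boot.std()))

    # Baseline: 10-fold cross-validation for comparison.
    cv = cross_val_score(clf, X, y, cv=10)
    print("10-fold CV accuracy: %.3f +/- %.3f" % (cv.mean(), cv.std()))

Because each bootstrap iteration fits the model once on a sample of size n and tests on roughly a third of the rows, a small fixed number of iterations can be cheaper than exhaustive cross-validation on large datasets, which is the trade-off the abstract highlights.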
Subjects: 
the bootstrap
cross validation
repeated train/test splitting
JEL: 
C38
C52
C55
Creative Commons License: 
cc-by-nc
Document Type: 
Conference Paper
