Abstract:
In [Sass et al., Eur. J. Oper. Res., 316 (1): 36–45, 2024], we proposed a branch-and-bound (B&B) algorithm with growing datasets for the deterministic global optimization of parameter estimation problems based on large datasets. Therein, we start the B&B algorithm with a reduced dataset and augment it until the full dataset is reached upon convergence. However, convergence may be slowed down by a gap between the lower bounds of the reduced and the original problem, in particular for noisy measurement data. Thus, we propose using out-of-sample estimation to improve the lower bounds calculated with reduced datasets. Based on this, we extend the deterministic approach and propose two heuristic approaches. The computational performance of all approaches is compared against the standard B&B algorithm as a benchmark on real-world estimation problems from process systems engineering, biochemistry, and machine learning, covering datasets with and without measurement noise. Our results indicate that the heuristic approaches can improve the final lower bounds on the optimal objective value without cutting off the global solution. Aside from this, we prove that resampling can decrease the variance of the lower bounds calculated from random initial datasets. In our case study, resampling hardly affects the performance of the approaches, which indicates that the B&B algorithm with growing datasets does not suffer from large variances.
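The sketch below illustrates the growing-dataset idea summarized above for a sum-of-squares parameter estimation objective. It is a minimal conceptual example, not the authors' implementation: the node selection, the dataset-growth schedule, and the user-supplied routines `local_solve` and `box_lower_bound` are illustrative assumptions, and the out-of-sample improvement of the lower bounds discussed in the abstract is not included.

```python
import heapq
import itertools

import numpy as np


def bisect_widest(box):
    """Split a parameter box (lo, hi) along its widest coordinate."""
    lo, hi = box
    k = int(np.argmax(hi - lo))
    mid = 0.5 * (lo[k] + hi[k])
    left_hi, right_lo = hi.copy(), lo.copy()
    left_hi[k] = mid
    right_lo[k] = mid
    return (lo, left_hi), (right_lo, hi)


def bnb_growing_datasets(x, y, root_box, local_solve, box_lower_bound,
                         init_fraction=0.1, grow_every=50, tol=1e-4,
                         max_nodes=10_000):
    """Minimize a sum-of-squares estimation objective over `root_box`.

    Lower bounds are computed on a reduced dataset; since every residual term
    is nonnegative, such a bound is also valid for the full-data problem. Here
    the reduced dataset is enlarged on a simple periodic schedule until it
    equals the full dataset (an illustrative rule, not the paper's strategy).
    """
    n_full = len(x)
    n_red = max(1, int(init_fraction * n_full))
    order = np.random.permutation(n_full)          # random initial subset

    ub, theta_best = np.inf, None
    tie = itertools.count()                        # tie-breaker for the heap
    heap = [(0.0, next(tie), root_box)]

    for node in range(1, max_nodes + 1):
        if not heap:
            break
        lb, _, box = heapq.heappop(heap)
        if n_red == n_full and ub - lb <= tol:
            break                                  # converged on the full data

        # Upper bounding on the full dataset keeps the incumbent valid.
        theta, val = local_solve(x, y, box)
        if val < ub:
            ub, theta_best = val, theta

        # Lower bounding of the children on the reduced dataset only.
        sub = order[:n_red]
        for child in bisect_widest(box):
            lb_red = box_lower_bound(x[sub], y[sub], child)
            if lb_red < ub - tol:                  # otherwise the child is fathomed
                heapq.heappush(heap, (lb_red, next(tie), child))

        # Illustrative growth schedule: double the reduced dataset periodically.
        if n_red < n_full and node % grow_every == 0:
            n_red = min(n_full, 2 * n_red)

    return theta_best, ub
```

Because the least-squares objective is a sum of nonnegative terms, a lower bound computed on any subset of the data remains a valid lower bound for the full-data problem; this is what makes fathoming with reduced datasets rigorous, while the gap between reduced and full-data bounds is what the out-of-sample estimation in this work aims to close.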