Efficient Hyperparameter Tuning for Resource-Constrained Environments

python
research
deep learning
Author

Ethan, Kelsey, Jun, Catherine

Published

June 14, 2024

This study addresses the challenges posed by resource constraints, particularly on edge devices with limited computational power. Hyperparameter tuning is crucial for optimizing deep neural networks but is often computationally expensive. We propose an efficient framework that uses Latin Hypercube Sampling (LHS) together with surrogate models such as Gaussian Processes to reduce the computational burden. This approach significantly reduces tuning time while maintaining or improving model accuracy compared to traditional grid search.
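To make the pipeline concrete, here is a minimal sketch of the two components named above: LHS to spread trial points evenly over the hyperparameter space, and a Gaussian Process fit to those trials as a cheap surrogate for the expensive training loop. The objective function, parameter names, and search ranges below are hypothetical placeholders, not the actual experimental setup from the paper.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical stand-in for validation loss after training a network
# with a given (learning rate, dropout); a real run would train here.
def validation_loss(lr, dropout):
    return (np.log10(lr) + 3) ** 2 + (dropout - 0.3) ** 2

# 1. Latin Hypercube Sampling over a 2-D hyperparameter space.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=16)
# Scale samples: learning rate on a log scale in [1e-5, 1e-1],
# dropout in [0, 0.5].
log_lr = qmc.scale(unit[:, [0]], -5, -1).ravel()
dropout = qmc.scale(unit[:, [1]], 0.0, 0.5).ravel()

X = np.column_stack([log_lr, dropout])
y = np.array([validation_loss(10**l, d) for l, d in zip(log_lr, dropout)])

# 2. Fit a Gaussian Process surrogate to the 16 sampled losses.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

# 3. Query the cheap surrogate instead of retraining the network:
# scan learning rates at a fixed dropout and pick the predicted minimum.
grid = np.column_stack([np.linspace(-5, -1, 200), np.full(200, 0.3)])
mean, std = gp.predict(grid, return_std=True)
best_log_lr = grid[np.argmin(mean), 0]
print("surrogate-estimated best log10(learning rate):", best_log_lr)
```

Each of the 16 LHS trials costs one model training, after which the surrogate can be queried thousands of times essentially for free; a grid search at comparable resolution would need far more training runs.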

Attached below are our research findings: