In this paper we tackle a bi-objective optimization problem of execution time and power consumption for parallel applications. We propose using a discrete-event simulation environment to explore this power/time trade-off in the form of a Pareto front. The solution is verified by a case study based on a real deep neural network training application for automatic speech recognition. A simulation lasting over 2 hours on a single CPU accurately predicts real results from executions that take over 335 hours on a cluster with 8 GPUs. The simulations also allow estimating the impact of data package imbalance on the application's performance.
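To illustrate the power/time trade-off mentioned in the abstract, the sketch below computes a Pareto front over hypothetical (execution time, power) measurements. The data points are invented for illustration only and do not come from the paper; both objectives are minimized, and a configuration is kept if no other configuration is at least as good in both objectives.

```python
def pareto_front(points):
    """Return the non-dominated points when minimizing both objectives.

    A point p is dominated if some other point q is <= p in both
    coordinates (time and power); dominated points are discarded.
    """
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return sorted(front)

# Hypothetical (execution time [h], power [W]) configurations.
points = [(10.0, 300.0), (12.0, 310.0), (9.0, 320.0),
          (11.0, 260.0), (13.0, 240.0)]

# (12.0, 310.0) is dominated by (10.0, 300.0), so it is dropped.
print(pareto_front(points))
```

In the paper's setting, each point would correspond to one simulated configuration of the parallel application, and the front exposes the trade-off between faster, more power-hungry runs and slower, more frugal ones.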
Authors
Additional information
- DOI
- Digital object identifier 10.1016/j.procs.2017.05.214
- Category
- Conference activity
- Type
- conference proceedings indexed in Web of Science
- Language
- English
- Year of publication
- 2017