Workshop: Machine Learning on HPC Systems (MLHPCS)

Surprises in Deep Learning Training: Symmetry, Efficiency, and Scale

Abstract

We make a few interesting observations regarding Deep Neural Network (DNN) training:

Speaker

Daniel is an assistant professor in the Electrical Engineering Department at the Technion, working in the areas of machine learning and neural networks. His recent work focuses on resource efficiency and implicit bias in neural networks. He completed his post-doc with Prof. Liam Paninski in the Department of Statistics and the Center for Theoretical Neuroscience at Columbia University, and his Ph.D. in the Electrical Engineering Department at the Technion. He is the recipient of the Gruss Lipper Fellowship, the Taub Fellowship, the Goldberg Award, and Intel's Rising Star Faculty Award.