Adam is one of the most widely used optimizers in deep learning and is used to train many neural networks. However, it is not flawless.
One issue is that it can cause large loss spikes late in training, once the loss has become very low; these spikes are known as 'slingshots'.
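For reference, Adam (Kingma & Ba, 2015) keeps exponential moving averages of the gradient and of its elementwise square, and divides each step by the square root of the latter. In LaTeX notation, the update for parameters \theta with gradient g_t reads:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2
    \hat{m}_t = m_t / (1 - \beta_1^t), \qquad \hat{v}_t = v_t / (1 - \beta_2^t)
    \theta_t = \theta_{t-1} - \alpha\, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)

One common intuition (an assumption stated here for context, not a claim of this project) is that when the loss, and hence the gradients, become very small, \hat{v}_t shrinks as well, so the effective step size \alpha / (\sqrt{\hat{v}_t} + \epsilon) grows and a single noisy gradient can be amplified into a large parameter jump.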
In recent work (currently under review), we were, to our knowledge, the first to observe that these slingshots seem to introduce dead neurons. We want to understand this better.
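To make the notion of a dead neuron concrete: a ReLU unit is usually called dead when its activation is zero for every input, so it receives no gradient and cannot recover. Below is a minimal sketch (in PyTorch, purely illustrative; the model and data loader are hypothetical placeholders, not part of this project description) of how one might count such units, for example after a loss spike:

    import torch
    import torch.nn as nn

    @torch.no_grad()
    def count_dead_relu_units(model: nn.Module, loader, device="cpu"):
        """Count ReLU units that never activate on any sample in `loader`."""
        # For each nn.ReLU module, track whether each unit was ever positive.
        ever_active = {}
        hooks = []

        def make_hook(name):
            def hook(module, inputs, output):
                # Collapse all dimensions except the last (unit) dimension;
                # this assumes MLP-style layers where units live in the last dim.
                active = (output > 0).reshape(-1, output.shape[-1]).any(dim=0)
                if name in ever_active:
                    ever_active[name] |= active
                else:
                    ever_active[name] = active
            return hook

        for name, module in model.named_modules():
            if isinstance(module, nn.ReLU):
                hooks.append(module.register_forward_hook(make_hook(name)))

        model.eval()
        model.to(device)
        for x, _ in loader:
            model(x.to(device))

        for h in hooks:
            h.remove()

        # A unit is "dead" if it was never active on any input in the loader.
        return {name: int((~active).sum().item())
                for name, active in ever_active.items()}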
This project is suitable for either a master's research internship or a master's thesis.
Research questions are:
Familiarity with deep learning and the Adam optimizer is required, for example from the course Deep Learning Part 1 (or an equivalent course).
Supervision: Stijn van den Beemt (daily), Twan van Laarhoven
Contact: Stijn van den Beemt.
Timeframe: depends on whether this is a thesis or an internship, but is generally flexible.
References: