NeurIPS 2019: 33rd Conference on Neural Information Processing Systems

  • Vancouver Convention Centre, 1055 Canada Place, Vancouver, BC V6C 0C3, Canada

Presenting a paper with my co-authors at the Beyond First Order Methods in ML workshop at the conference.

The paper is titled "FD-Net with Auxiliary Time Steps: Fast Prediction of PDEs using Hessian-Free Trust-Region Methods," by Nur Sila Gulgec (Lehigh University), Zheng Shi (Lehigh University), Neil Deshmukh (Moravian Academy), Shamim Pakzad (Lehigh University), and Martin Takac (Lehigh University).

Higher-order methods, such as Newton, quasi-Newton, and adaptive gradient descent methods, are extensively used in many scientific and engineering domains. At least in theory, these methods possess several nice features: they exploit local curvature information to mitigate the effects of ill-conditioning, they avoid or diminish the need for hyper-parameter tuning, and they have enough concurrency to take advantage of distributed computing environments. Researchers have even developed stochastic versions of higher-order methods that achieve speed and scalability by incorporating curvature information in an economical and judicious manner. Yet higher-order methods are often "undervalued."
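Since the paper's title mentions Hessian-free trust-region methods, here is a minimal, self-contained sketch (not the paper's implementation) of what one such step can look like: curvature enters only through Hessian-vector products, approximated below by finite differences of the gradient, and the trust-region subproblem is solved with truncated (Steihaug) conjugate gradients. The function names `hvp`, `steihaug_cg`, and `_boundary_step`, and the toy quadratic at the end, are all assumptions made for this illustration.

```python
# Minimal sketch of one Hessian-free trust-region step (illustrative only,
# not the paper's method). Curvature is accessed solely via Hessian-vector
# products, so the Hessian is never formed explicitly.
import numpy as np

def hvp(grad, x, v, eps=1e-6):
    # Hessian-vector product H(x) v ~ (grad(x + eps*v) - grad(x)) / eps,
    # a simple finite-difference approximation for this sketch.
    return (grad(x + eps * v) - grad(x)) / eps

def _boundary_step(p, d, radius):
    # Positive root tau of ||p + tau*d|| = radius (quadratic in tau).
    a, b, c = d @ d, 2 * (p @ d), p @ p - radius**2
    return (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)

def steihaug_cg(grad, x, g, radius, tol=1e-8, max_iter=50):
    # Truncated CG (Steihaug) for the trust-region subproblem
    #   min_p  g^T p + 0.5 p^T H p   subject to  ||p|| <= radius.
    p = np.zeros_like(g)
    r, d = g.copy(), -g.copy()
    for _ in range(max_iter):
        Hd = hvp(grad, x, d)
        dHd = d @ Hd
        if dHd <= 0:  # negative curvature: step to the boundary
            return p + _boundary_step(p, d, radius) * d
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= radius:  # leaving the region
            return p + _boundary_step(p, d, radius) * d
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        p, r, d = p_next, r_next, -r_next + beta * d
    return p

# Toy usage: one step on the quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b
x = np.zeros(2)
step = steihaug_cg(grad, x, grad(x), radius=1.0)
print("trust-region step:", step)
```

On this toy quadratic the unconstrained Newton step exceeds the radius, so the returned step lies on the trust-region boundary; in a full optimizer the radius would then be adjusted based on how well the model predicted the actual decrease.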

This workshop will attempt to shed light on this statement. Topics of interest include, but are not limited to, second-order methods, adaptive gradient descent methods, regularization techniques, as well as techniques based on higher-order derivatives.