Friday, October 15, 2021 at 1:00pm to 2:00pm
Global Injectivity, Manifold Estimation and Universality of Neural Networks
In this talk we will cover two topics. The first topic is the construction and analysis of globally injective ReLU networks. We will present a sharp characterization of layerwise global injectivity for both fully-connected and convolutional neural networks. Along the way we will discover that injectivity of random matrices is less likely than you might think, prove that injective layers always have a positive inverse Lipschitz constant, and find that globally injective ReLU networks are universal approximators. The second topic shows that when bijective flows are combined with injective layers to form a network, it is a universal manifold approximator, subject to some important topological and geometric restrictions. We explore these restrictions through the development and use of two new theoretical devices, the embedding gap and the manifold embedding property. Using these two we again present sharp characterization results, and resolve a conjecture concerning the 'reverse' optimality of these networks.
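As a small illustration (not taken from the talk) of why layerwise injectivity of ReLU networks is delicate: if a weight matrix W admits any direction v with Wv strictly negative in every coordinate, then ReLU(W(tv)) = 0 for all t >= 0, so the layer collapses an entire ray and cannot be injective. For a square random Gaussian W this direction always exists (take v = W^{-1}(-1)), so without expansion a ReLU layer is never injective. The sketch below searches for such a v with a feasibility linear program via scipy.optimize.linprog; the variable names and setup are illustrative only, and this is merely a sufficient condition for non-injectivity, not the sharp characterization presented in the talk.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 10, 10  # no expansion: output width equals input width
W = rng.standard_normal((m, n))

# Feasibility LP: find v with W v <= -1 componentwise (scaling any
# strictly negative direction achieves the -1 margin).
res = linprog(c=np.zeros(n), A_ub=W, b_ub=-np.ones(m),
              bounds=[(None, None)] * n, method="highs")

if res.success:
    v = res.x
    # Every point t*v with t >= 0 maps to the zero vector under
    # x -> ReLU(W x), so the layer is not injective.
    for t in (0.0, 1.0, 2.0):
        assert np.allclose(np.maximum(W @ (t * v), 0.0), 0.0)
```

For the square Gaussian case above the LP is feasible with probability one, which is one concrete sense in which injectivity of random layers is "less likely than you might think"; the talk's sharp characterization identifies exactly how much expansion and what structure the weights need.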
UTD strives to create inclusive and accessible events in accordance with the Americans with Disabilities Act (ADA). If you require an accommodation to fully participate in this event, please contact the event coordinator (listed below) at least 10 business days prior to the event. If you have any additional questions, please email ADACoordinator@utdallas.edu and the AccessAbility Resource Center at email@example.com.