School of Information Technology and Electrical Engineering

Speaker: Russell (Susumu) Tsuchida
Seminar Date: Mon, 04/12/2017 - 14:00
Venue: 78-420
Host: Assoc Prof Marcus Gallagher

Seminar Type: PhD Confirmation Seminar

Abstract: 

Neural networks have recently been applied to a diverse range of problems with impressive results. Application areas include image processing, audio processing, generative models, reinforcement learning, natural language processing and robotics. These breakthroughs largely appear to be driven by application rather than by an understanding of the expressive capabilities or training behaviour of neural networks. Significant work has been done to improve our understanding of neural networks, but more is needed to bring theory in line with practice. A better understanding of the theoretical properties of neural networks will allow for better outcomes in the application of neural networks to real problems.

A recently revived approach to analysing and developing tools for neural networks and kernel machines is to examine the equivalent kernel of the neural network. This is based on the fact that a fully connected hidden layer with a certain weight distribution, activation function, and an infinite number of neurons computes a feature map whose inner product in a Hilbert space defines a kernel. Once the equivalent kernel is found, the neural network can be understood in terms of kernel machine theory.
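As a concrete illustration of this equivalence, consider a single ReLU layer with i.i.d. standard Gaussian weights: its equivalent kernel is known in closed form, being proportional to the degree-1 arc-cosine kernel of Cho and Saul (2009). The minimal Python sketch below (an illustration with hypothetical function names, not code from the seminar) compares that closed form against a Monte Carlo estimate obtained from one wide random layer.

    import numpy as np

    def equivalent_kernel(x, y):
        # Closed form of E[ReLU(w.x) * ReLU(w.y)] for w ~ N(0, I):
        # equal to the degree-1 arc-cosine kernel of Cho and Saul (2009)
        # up to a factor of two.
        nx, ny = np.linalg.norm(x), np.linalg.norm(y)
        theta = np.arccos(np.clip(x @ y / (nx * ny), -1.0, 1.0))
        return nx * ny * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

    def empirical_kernel(x, y, width=100_000, seed=0):
        # Monte Carlo estimate: average inner product of the hidden-layer
        # ReLU features of a random fully connected layer of the given width.
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((width, x.size))  # i.i.d. N(0, 1) weights
        return np.maximum(W @ x, 0.0) @ np.maximum(W @ y, 0.0) / width

    x = np.array([1.0, 2.0, -0.5])
    y = np.array([0.3, -1.0, 2.0])
    print(equivalent_kernel(x, y))  # closed-form kernel value
    print(empirical_kernel(x, y))   # approaches the closed form as width grows

As the width grows, the empirical inner product concentrates around the closed form, which is precisely the sense in which the infinitely wide layer can be viewed as a kernel.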

The proposed research is based on the following central idea: some neural network properties can be understood in terms of the distribution of parameters in the neural network. This idea is complemented by the connection between kernel machines and neural networks. Analysing neural networks in terms of their parameter distributions is expected to yield results relevant to parameter initialisation for optimisers, to understanding deep architectures, and to understanding and comparing optimisers in the context of neural networks.
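One classical payoff of reasoning about parameter distributions is variance-scaling initialisation: choosing the weight variance so that the signal neither explodes nor vanishes with depth. The short Python sketch below (an illustration of this known result, not part of the proposed research) checks that He initialisation, weight variance 2/fan_in for ReLU layers (He et al., 2015), keeps the second moment of the activations roughly constant across fifty random layers.

    import numpy as np

    rng = np.random.default_rng(0)

    def he_init(fan_out, fan_in):
        # He et al. (2015): weight variance 2 / fan_in preserves the
        # second moment of the signal through a ReLU layer in expectation.
        return rng.standard_normal((fan_out, fan_in)) * np.sqrt(2.0 / fan_in)

    x = rng.standard_normal(512)  # input with second moment close to 1
    for _ in range(50):
        x = np.maximum(he_init(512, 512) @ x, 0.0)  # one random ReLU layer
    print(np.mean(x ** 2))  # remains O(1) rather than exploding or vanishing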