Supervision: Konstantinos Pitas
Project type:
Semester project (master)
Master thesis
Available
Deriving uncertainty estimates for feedforward DNN predictions is critical in a number of tasks. Unfortunately, the outputs of the softmax layer cannot be interpreted in a principled way as probabilities over classes. Instead, modelling the DNN weights as drawn from a probability distribution and taking a Bayesian view of prediction is much better grounded in theory, and results in much better uncertainty estimates in practice. The Bayesian view still has shortcomings. While it works on toy problems, simply put, approximate posterior distributions in high dimensions have mass in all the wrong places! In this project the student will implement techniques aimed at scaling existing Bayesian approximate inference methods to realistic architectures and datasets.
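To make the Bayesian view concrete, here is a minimal sketch of the kind of building block involved: a linear layer with a mean-field Gaussian posterior over its weights, sampled via the reparameterization trick, and a predictive-entropy estimate obtained by averaging the softmax over several weight draws. All names (`BayesianLinear`, `predictive_entropy`) and hyperparameters are illustrative assumptions, not part of any specific method from the project.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer with a mean-field Gaussian posterior over weights.

    A sketch for illustration only: the variational parameters are a
    per-weight mean `mu` and a pre-softplus scale `rho`.
    """
    def __init__(self, in_features, out_features):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(out_features, in_features))
        # rho = -3 gives a small initial standard deviation (softplus(-3) ~ 0.05)
        self.rho = nn.Parameter(torch.full((out_features, in_features), -3.0))

    def forward(self, x):
        sigma = F.softplus(self.rho)      # ensure a positive std deviation
        eps = torch.randn_like(sigma)     # reparameterization trick
        w = self.mu + sigma * eps         # one sample from the posterior
        return x @ w.t()

def predictive_entropy(layer, x, n_samples=50):
    """Entropy of the Monte Carlo predictive distribution.

    Averages the softmax over `n_samples` weight draws, then computes the
    entropy of that averaged distribution per input.
    """
    probs = torch.stack(
        [F.softmax(layer(x), dim=-1) for _ in range(n_samples)]
    ).mean(dim=0)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

torch.manual_seed(0)
layer = BayesianLinear(in_features=4, out_features=3)
x = torch.randn(2, 4)
H = predictive_entropy(layer, x)  # one entropy value per input row
```

In a full variational-inference setup the sampled forward pass above would be paired with a KL term toward a prior over the weights; the point here is only how sampling weights, rather than reading the softmax directly, yields a predictive uncertainty estimate.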
The student must be highly motivated and independent, with good knowledge of TensorFlow/Keras or PyTorch, and able to implement and modify large DNN architectures such as VGG-16 and ResNet-56.
The project consists of 20% theory and 80% practice, and blends a variety of problems at the cutting edge of DNN research.
[1] Radial Bayesian Neural Networks: Beyond Discrete Support In Large-Scale Bayesian Deep Learning https://arxiv.org/pdf/1907.00865.pdf