Title

GENERATIVE MODEL BASED APPROACHES TO LEARNING WITH INCOMPLETE SUPERVISION

Abstract

Developments in deep learning have led to great advances in a variety of recognition problems. However, state-of-the-art models predominantly rely on the availability of large annotated training sets. Considerable attention has therefore shifted to developing models that can incorporate concepts of interest in the absence of carefully annotated large training sets. Commonly referred to as learning with limited supervision, these methods range from completely unsupervised learning to model training with noisy labels. In this thesis, we focus on the problem of learning classification models of novel classes from a small number of training examples, also known as few-shot learning. We approach this problem from a generative perspective: we first aim to learn a generative model and then explore its effect on a few-shot classification task. The primary focus of this study is a generative model founded on diffusion principles that incorporates a transformer to manipulate latent patches. This model uses image features obtained from a pre-trained feature extractor as its conditional input. Adopting this methodology, we provide an empirical assessment of the generative model's efficacy in a few-shot learning scenario.

Supervisor(s)

SINAN GENCOGLU

Date and Location

2023-12-05 13:30:00

Category

MSc_Thesis