Supervisor: Dr Yi-Zhe Song
This project investigates fine-grained sketch-based image retrieval (SBIR), where free-hand human sketches are used as queries for instance-level retrieval of photos. The task is extremely challenging because (i) visual comparisons must be not only fine-grained but also cross-domain, (ii) free-hand (finger) sketches are highly abstract, making fine-grained matching harder, and, most importantly, (iii) the annotated cross-domain sketch-photo datasets required for training are scarce, which challenges many state-of-the-art machine learning techniques. In this project, we will address all of these challenges, providing a step towards the capabilities that would underpin a commercial sketch-based image retrieval application. In particular, we will investigate novel means of incorporating user feedback into the retrieval loop, with the aim of (i) mitigating variation in sketching skill among users, and (ii) improving overall retrieval accuracy.
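To make the retrieval setting concrete, the following is a minimal NumPy sketch of instance-level retrieval in a shared embedding space. It is an illustration only: the projection matrices stand in for learned cross-domain encoders (which an SBIR system would typically train, e.g. with a triplet objective, so that a sketch and its matching photo embed close together), and the dimensions and `retrieve` helper are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM_SKETCH, DIM_PHOTO, DIM_EMBED = 64, 128, 32

# Hypothetical "learned" projections mapping each domain into a
# common embedding space; random here, trained in a real system.
W_sketch = rng.standard_normal((DIM_SKETCH, DIM_EMBED))
W_photo = rng.standard_normal((DIM_PHOTO, DIM_EMBED))

def embed(features, W):
    """Project features and L2-normalise, so dot product = cosine similarity."""
    z = features @ W
    return z / np.linalg.norm(z, axis=-1, keepdims=True)

def retrieve(sketch_feat, photo_feats):
    """Rank gallery photos by similarity to one sketch query."""
    q = embed(sketch_feat[None, :], W_sketch)   # (1, DIM_EMBED)
    g = embed(photo_feats, W_photo)             # (N, DIM_EMBED)
    sims = (g @ q.T).ravel()                    # cosine similarity per photo
    return np.argsort(-sims)                    # indices, best match first

# Toy gallery of 5 photo feature vectors and one sketch query.
gallery = rng.standard_normal((5, DIM_PHOTO))
query = rng.standard_normal(DIM_SKETCH)
ranking = retrieve(query, gallery)
print(ranking)
```

In the instance-level setting, success means the single photo depicting the sketched object appears at (or near) rank 1; user feedback on the returned ranking could then be used to refine the query embedding in the loop.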