Deep Learning versus Grammar in Natural Language Processing
Supervisor: Dr Mehrnoosh Sadrzadeh
Research group(s): Theory

Much recent work in computational linguistics has focussed on the capacity of machine learning methods, particularly Deep Neural Networks (DNNs), to acquire the knowledge required for a variety of natural language processing (NLP) tasks. DNN architectures have achieved a high degree of accuracy with wide-coverage systems for applications such as machine translation, sentiment analysis, speech recognition, and question answering. A particularly pressing question in NLP, however, is whether DNNs can reproduce human levels of performance on cognitively interesting tasks without being exposed to explicit syntactic representations of the input data. This is, in effect, to ask whether grammar is a necessary component of linguistic learning. The successful applicant will compare the performance of recent DNN models, such as LSTMs, with state-of-the-art parsers, such as Google's, on a set of cognitively interesting tasks, such as acceptability judgements, sentence similarity, and discourse coherence.
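As a rough illustration of one of these evaluation tasks, sentence similarity is typically scored by comparing vector representations of sentences. The sketch below uses a simple bag-of-words cosine similarity as a grammar-free baseline; the whitespace tokeniser and example sentences are assumptions for illustration only, and a DNN approach would replace the count vectors with learned embeddings.

```python
from collections import Counter
import math

def bow_vector(sentence):
    # Bag-of-words counts: a crude, grammar-free stand-in for a
    # learned sentence embedding. Tokenisation here is plain
    # whitespace splitting, assumed for illustration.
    return Counter(sentence.lower().split())

def cosine_similarity(a, b):
    # Cosine of the angle between two sparse count vectors.
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

s1 = bow_vector("the cat sat on the mat")
s2 = bow_vector("the cat sat on the rug")
s3 = bow_vector("quantum chromodynamics is hard")

print(cosine_similarity(s1, s2))  # high: the sentences share most words
print(cosine_similarity(s1, s3))  # 0.0: no word overlap
```

Note that a purely lexical measure like this ignores word order and syntax entirely, which is precisely why comparing it against parser-based and DNN-based models on such tasks is informative.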