Date: 18 May 2016
Time: 2:45 - 4:00pm
Venue: BR 3.02, Bancroft Road Teaching Rooms, Peter Landin Building, London E1 4NS
All welcome (especially students), no pre-booking required (preceded at 2:45pm by tea and followed by a reception, both in the Informatics Hub).
Squeezing Deep Learning onto Wearables, Phones and Things
Breakthroughs in deep learning are transforming how sensor data (such as images, audio, and even accelerometer and GPS readings) can be interpreted to extract the high-level information needed by mobile and embedded devices. Today, state-of-the-art computational models that, for example, recognize a face, track user emotions, or monitor physical activities are increasingly based on deep learning principles and algorithms. It is critical that the gains in recognition accuracy and robustness these models afford become embedded in the ever-growing range of sensor devices (e.g., smart watches, home appliances) that we use on a daily basis. Unfortunately, this is not yet happening; instead, in far too many cases, the phones, wearables and things around us locally process sensor data with machine learning methods that deep learning superseded years ago.
In this talk, I will describe our recent work in developing general-purpose support for deep learning-based inference on resource-constrained mobile devices. Our goal is to radically lower the mobile resources (such as energy, memory and computation) that these modeling techniques consume at inference time, a cost that currently acts as the key bottleneck preventing the widespread use of these algorithms. The foundation of this research is a rethinking of how such deep inference algorithms execute, not only to better cope with mobile and embedded device conditions, but also to increase the utilization of commodity processors (e.g., DSPs, GPUs, CPUs) now present in devices like watches, glasses and phones, as well as of emerging purpose-built deep learning processors from companies like Nvidia, Qualcomm and Movidius. Ultimately, through this work we aim to completely change how mobile sensor data is processed, and in turn what mobile and embedded devices are capable of, in the next generation of sensing devices, apps and services.
Nic Lane holds dual academic and industrial appointments as a Principal Scientist at Bell Labs Cambridge and a Senior Lecturer at University College London (UCL). At Bell Labs he is a member of the Internet-of-Things research group, while at UCL he is part of the Digital Health Institute and the UCL Interaction Centre. His research interests revolve around the systems and modeling challenges that arise when computers collect and reason about people-centric sensor data. At heart, Nic is an experimentalist and likes to build prototypes of next-generation wearable and embedded sensing devices based on well-founded computational models. His work has received multiple best paper awards, including two from ACM UbiComp (2012 and 2015). Nic’s recent academic service includes serving on the programme committees of leading venues in his field (e.g., UbiComp, MobiSys, SenSys, WWW, CIKM), and this year he will act as PC chair of HotMobile 2017. Nic received his PhD from Dartmouth College in 2011.