Showcasing the culmination of five years of digital music research, the FAST IMPACt project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), led by Queen Mary University of London, hosted an invite-only industry day at Abbey Road Studios.
27 November 2018
Presented by Professor Mark Sandler, Director of the Centre for Digital Music at Queen Mary, the event showcased to artists, journalists and industry professionals the next generation technologies that will shape the music industry – from production to consumption.
FAST is looking at how new technologies can positively disrupt the recorded music industry. Research from across the project was presented to the audience, with work from partners at the University of Nottingham and the University of Oxford shown alongside that from Queen Mary on 25 October.
Professor Sandler said: “The aim being that by the end of the FAST Industry day, people would gain some idea how AI and the Semantic Web can couple with Signal Processing to overturn conventional ways to produce and consume music. Along the way, industry attendees were able to preview some cool and interesting new ideas, apps and technology that the FAST team showcased.”
In total 120 attendees were treated to an afternoon and evening of talks, demonstrations, a Climb! performance, and an expert panel discussion with Jon Eaves (The Rattle), Paul Sanders (state51), Peter Langley (Origin UK), Tracy Redhead (award-winning musician, composer and interactive producer from University of Newcastle, Australia), Maria Kallionpää (composer and pianist from Hong Kong Baptist University) and Mark d’Inverno (Goldsmiths, University of London) who chaired the panel.
Rivka Gottlieb, harpist and music therapist, also performed some musical pieces in collaboration with the Oxford team throughout the day.
The FAST Industry Day was opened by Lord Tim Clement-Jones (Chair of Council, Queen Mary University of London) and was compered by Professor Mark d’Inverno (Professor of Computing at Goldsmiths, University of London).
Highlights of the research showcased include:
Carolan Guitar: Connecting Digital to the Physical – The Carolan Guitar tells its own story. Play the guitar, contribute to its history, scan its decorative patterns and discover where it has been. Carolan uses a unique visual marker technology that enables the physical instrument to link to the places it’s been, the people who’ve played it and the songs it’s sung, and deep learning techniques to improve event detection.
FAST DJ – A web-based automatic DJ system and plugin that can be embedded into any website. It generates transitions between successive songs and uses machine learning to adapt to the user’s taste via simple interactive decisions.
Grateful Dead Concert Explorer – A web service for the exploration of recordings of Grateful Dead concerts, drawing its information from various web sources. It demonstrates how semantic audio and linked data technologies can produce an improved user experience for browsing and exploring music collections.
Jam with Jamendo – Bringing music learners and unsigned artists together by recommending suitable songs as new and varied practice material. In this web app, users are presented with a list of songs based on their selection of chords. They can then play along with the chord transcriptions or use the audio as backing tracks for solos and improvisations. Using AI-generated transcriptions makes it trivial to grow the underlying music catalogue without human effort.
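The chord-based matching that Jam with Jamendo describes can be sketched very simply: keep only songs whose transcribed chords fall within the set a learner has selected. The catalogue, field names and song data below are illustrative assumptions, not the actual Jam with Jamendo implementation.

```python
# Sketch of chord-based practice-song selection (hypothetical data,
# not the real Jam with Jamendo catalogue or API).

def songs_for_chords(catalogue, known_chords):
    """Return songs whose transcribed chords are all playable
    by a learner who knows only `known_chords`."""
    known = set(known_chords)
    return [song for song in catalogue
            if set(song["chords"]) <= known]

catalogue = [
    {"title": "Song A", "chords": ["C", "G", "Am", "F"]},
    {"title": "Song B", "chords": ["E", "B7", "C#m"]},
    {"title": "Song C", "chords": ["C", "G"]},
]

# Songs A and C use only the learner's chords; Song B does not.
playable = songs_for_chords(catalogue, ["C", "G", "Am", "F"])
```

Because the transcriptions are machine-generated, adding a new song to such a catalogue only requires running the transcription step, which is what makes the catalogue cheap to grow.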
MusicLynx – A web platform for music discovery that collects information about artists from a range of online sources and reveals connections between them. This information is used to build a network that users can explore to discover new artists and see how they are linked.
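A network of typed links between artists, of the kind MusicLynx builds, can be modelled as a simple adjacency structure. The artists and link categories below are examples chosen for illustration, not data drawn from the actual MusicLynx service.

```python
# Sketch of a MusicLynx-style artist link graph (example data only).
from collections import defaultdict

graph = defaultdict(list)

def link(a, b, category):
    """Record a typed, bidirectional connection between two artists."""
    graph[a].append((b, category))
    graph[b].append((a, category))

link("Grateful Dead", "Jerry Garcia", "member")
link("Jerry Garcia", "David Grisman", "collaborated with")

# Exploring outward from one artist reveals the artists connected
# to it and why they are connected.
neighbours = [artist for artist, _ in graph["Jerry Garcia"]]
```

In the real system the edges would come from heterogeneous online sources, but the user-facing idea is the same: start from a known artist and follow typed links to discover unfamiliar ones.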
The SOFA Ontological Fragment Assembler – Enables the combination of musical fragments – Digital Music Objects, or DMOs – into compositions, using semantic annotations to suggest compatible choices.
Numbers into Notes – Experiments in algorithmic composition and the relationship between humans, machines, algorithms and creativity.
rCALMA Environment for Live Music Data Science – A big data visualisation of the Internet Archive Live Music Archive, using linked data to combine programme metadata with audio feature analysis.
Climb! Performance Archive – Climb! is a non-linear composition for Disklavier piano and electronics. This web-based archive creates a richly indexed and navigable archive of every performance of the work, allowing audiences and performers to engage with the work in new ways.
The FAST project brings together labs from Queen Mary’s Centre for Digital Music, University of Nottingham's Mixed Reality Lab and the University of Oxford's e-Research Centre.
Further information: semanticaudio.ac.uk, or follow @semanticaudio on Twitter.
Study at Queen Mary’s Centre for Digital Music