Improvising a live score to an interactive brain-controlled film

Richard Ramchurn, Juan Martinez-Avila, Sarah Martindale, Alan Chamberlain, Max L. Wilson, Steve Benford

Research output: Contribution to conference › Paper › peer-review

2 Citations (Scopus)

Abstract

We report on the design and deployment of systems for the performance of live score accompaniment to an interactive movie by a Networked Musical Ensemble. In this case, the audio-visual content of the movie is selected in real time based on user input to a Brain-Computer Interface (BCI). Our system supports musical improvisation between human performers and automated systems responding to the BCI. We explore the performers’ roles during two performances in which these socio-technical systems were deployed, in terms of coordination, problem-solving, managing uncertainty and musical responses to system constraints. This allows us to consider how features of these systems and practices might be incorporated into a general tool, aimed at any musician, which could scale for use in different performance settings involving interactive media.
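The record carries no implementation detail beyond the abstract, so the following is only a minimal sketch of the kind of mapping the abstract describes: a BCI-derived control value selecting between pre-authored audio-visual branches and cueing the networked performers. All names (read_bci_attention, BRANCHES, broadcast_cue), the attention thresholds, and the use of Python rather than the authors' Max/MSP/Jitter patches are assumptions made for illustration.

# Hypothetical illustration only: the deployed system described in the paper is
# built in Max/MSP/Jitter and driven by a live BCI headset; none of the names,
# thresholds, or structures below come from the paper.
import random
import time

# Assumed film structure: each branch pairs a video segment with a musical cue.
BRANCHES = [
    {"attention_range": (0.0, 0.33), "clip": "calm_scene", "cue": "sparse_drone"},
    {"attention_range": (0.33, 0.66), "clip": "tense_scene", "cue": "pulsing_motif"},
    {"attention_range": (0.66, 1.01), "clip": "action_scene", "cue": "dense_rhythm"},
]

def read_bci_attention():
    """Stand-in for a real BCI reading; returns a normalized value in [0, 1]."""
    return random.random()

def select_branch(attention):
    """Map the attention value onto one of the predefined audio-visual branches."""
    for branch in BRANCHES:
        low, high = branch["attention_range"]
        if low <= attention < high:
            return branch
    return BRANCHES[-1]

def broadcast_cue(branch):
    """Placeholder for sending the chosen clip and cue to the networked ensemble
    (a real system would use OSC or a similar network protocol)."""
    print(f"clip={branch['clip']}  cue_to_performers={branch['cue']}")

if __name__ == "__main__":
    # Poll the (simulated) BCI once per second and re-cue film and musicians.
    for _ in range(5):
        broadcast_cue(select_branch(read_bci_attention()))
        time.sleep(1)

In the system the paper actually describes, this selection and cueing logic is realised in Max/MSP/Jitter and responds to a live headset signal, with human improvisers interpreting the cues rather than playback being fully automated.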

Original language: English
Pages: 31-36
Number of pages: 6
Publication status: Published - 2019
Event: 19th International Conference on New Interfaces for Musical Expression, NIME 2019 - Porto Alegre, Brazil
Duration: 03 Jun 2019 – 06 Jun 2019

Conference

Conference: 19th International Conference on New Interfaces for Musical Expression, NIME 2019
Country/Territory: Brazil
City: Porto Alegre
Period: 03 Jun 2019 – 06 Jun 2019

Keywords

  • BCI
  • Film
  • Improvisation
  • Live score
  • Max/MSP/Jitter
