24th Art Division New Face Award
VOX-AUTOPOIESIS V -Mutual-
Media performance
KOMIYA Chiku [Japan]
Outline
This is part of the VOX-AUTOPOIESIS series, in which a musician performs live from sheet music generated on the spot. Analysis data from the musician's voice feeds an algorithm that generates a score, which the musician then sings, producing a repeating cycle of performance and generation. The score is generated four seconds ahead: the musician sings their own trace from four seconds earlier while simultaneously producing the notation of what will be sung four seconds later. In this work, two performers generate scores for each other, creating a new polyphony (music consisting of multiple independent melodic lines). The piece also presents a new relationship between machine and human body, in which the musical performance is improved through cooperation with a machine, as well as a different approach to music notation.
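To make the timing concrete, the following is a minimal sketch, in Python, of the kind of four-second feedback loop the outline describes: the performer's voice is analyzed, scheduled as notation four seconds in the future, and then displayed back for the performer to sing. All names and parameters here (capture_pitch, display_note, LOOKAHEAD) are illustrative assumptions, not the artist's implementation; in the "Mutual" configuration described above, two such loops would route each performer's analyzed voice into the other's score.

```python
# A minimal sketch of the four-second feedback loop described above.
# Everything here (pitch capture, note format, timing constants) is an
# illustrative assumption, not the artist's actual system.

import time
from collections import deque
from dataclasses import dataclass

LOOKAHEAD = 4.0  # delay, in seconds, between singing and seeing the notation


@dataclass
class Note:
    pitch_hz: float  # estimated fundamental frequency of the captured voice
    due_at: float    # monotonic time at which the note appears on the score


def run_feedback_loop(capture_pitch, display_note, steps=32, step_s=0.25):
    """Capture the performer's voice and re-present it as notation
    LOOKAHEAD seconds later, so the singer always performs their own
    trace from four seconds ago."""
    pending = deque()
    for _ in range(steps):
        now = time.monotonic()
        # Analyze the live voice and schedule it as future notation.
        pending.append(Note(pitch_hz=capture_pitch(), due_at=now + LOOKAHEAD))
        # Release any notes whose display time has arrived.
        while pending and pending[0].due_at <= now:
            display_note(pending.popleft())
        time.sleep(step_s)


if __name__ == "__main__":
    # Stand-ins for microphone analysis and score rendering.
    run_feedback_loop(
        capture_pitch=lambda: 440.0,
        display_note=lambda note: print(f"notate {note.pitch_hz:.0f} Hz"),
    )
```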
Reason for Award
By exploring the possibilities of machine-human collaboration in live electronics, the artist questions the form and structure of music. The improvised voices of two performers facing each other, and the sheet music automatically generated as traces of those voices, interrelate along the axis of past, present, and future, yet also coexist like a complex fabric woven from non-unidirectional flows of time. The score is not only a record of the voices but also a source for creating future voices. The dynamism of this work lies in how it reworks the conventional relationship between score and performance, critically examines its structure, and creates a new one. (TASAKA Hiroko)