Digital Musical Interaction - Rebuilding the Signals of Control and Perception

Florent Berthaut, Myriam Desainte-Catherine

Univ. Lille, UMR 9189 - CRIStAL - Centre de Recherche en Informatique Signal et Automatique de Lille, F-59000 Lille, France. CNRS, UMR 9189, F-59000 Lille, France. Centrale Lille, F-59000 Lille, France

Université de Bordeaux, LaBRI, UMR5800, F-33400 Talence, France. CNRS, LaBRI, UMR5800, F-33400 Talence, France. Inria, F-33400, Talence, France

Corresponding Author Email: florent@hitmuri.net

Pages: 345-363

DOI: https://doi.org/10.3166/TS.32.345-363

Received: 1 July 2015

Accepted: 10 November 2015

OPEN ACCESS

Abstract: 

Thanks to their diversity and complexity, Digital Musical Instruments open new possibilities for artists. However, they also raise many questions concerning their expressiveness, their learning curve, the limited feedback they provide to musicians, and their perception by the audience. In this article, we give an overview of the musical interaction research field and describe its main research directions. We show that these directions amount to rebuilding the signals of control and perception.

Keywords: 

musical interaction, new interfaces for musical expression, digital musical instruments, gestures, sound processes, interfaces

1. Introduction
2. Digital Musical Instruments
3. Capturing Control
4. From Control to Musical Processes
5. Signals of Perception
6. Conclusion
  References
