HASGS: Hybrid Augmented System of Gestural Symbiosis Generating Visual Information

Bibliographic Details
Main Author: Portovedo Marques, Henrique (author)
Other Authors: Ferreira-Lopes, Paulo (author)
Format: conferenceObject
Language: eng
Published: 2019
Online Access: http://hdl.handle.net/10400.14/27665
Country: Portugal
OAI: oai:repositorio.ucp.pt:10400.14/27665
Description
Summary: This paper discusses how an augmented instrument, HASGS, in the context of interactive electronic music, motivates the generation of different graphical user interfaces, taking into consideration the repertoire being composed for it. New compositional aesthetics are deeply influenced by electronic materials and sonic repositories, while new media are increasingly seen as possible extensions of instrumental practice. These media are available for creative purposes during both compositional and performative processes. As the aesthetics of acoustic and electronic sounds exert mutual influence, composers and sound designers are developing new languages, new gestural attitudes, new extended techniques, new notation methods, and new performative paradigms, including the creation of graphical interfaces for visual feedback. This augmented system for saxophone was motivated by the need to perform pieces of a common aesthetic that were written using electronic environments. Those pieces shared the need to control external devices in order to be performed. The repertoire for saxophone and electronics is growing on a huge scale, from pieces using stomp boxes or control pedals for triggering or fading, to pieces requiring the manipulation of knobs. These controllers, being by nature devices that separate sound production (synthesis) from performer gesture (control), have generated increased interest in the study of compositional mapping strategies for computer music. From our experience, we conclude that the graphical user interface (GUI) is fundamental to understanding the individuality of each piece, as well as the relation between the augmentation system and the piece itself. While this project started with the idea of contributing a new performative paradigm for the existing repertoire, new repertoire and improvisational performance situations led to the development of a hybrid system that incorporates biofeedback into the work of art while creating new graphical user interfaces for visual feedback.