As a branch of mechatronic research in interactivity and robot art, we describe the concept of implementing Playware-based tools inspired by modern AI robotic systems for audio-video performances. We develop immersive and personalizable tools that allow any user to manipulate both audio and video output with ease, thanks to mechatronic wearable interfaces. In this light, we describe two of our systems that explore run-time composition of a variety of input and output modalities, e.g. both musical and graphical expression. Specifically, we developed hardware/software tools that allow any user to create new versions of songs (e.g. the MusicTiles app) and software that translates the musical experience into a visual one (e.g. the MAG software). By interfacing these technologies with mechatronic systems, it becomes possible to create a run-time audio-video performance that is original and unique. This can further be combined with modular wearables, inspired by modular robotics, to interact with and control the performance. This mechatronic wearable concept and its implementations exemplify how to convey a user-centered experience in playware technology.
Number of pages: 353
Publication status: Published - 2015