Project report

It Takes Two – Shared and Collaborative Virtual Musical Instruments in the Musical Metaverse

Music in the metaverse. Our research brings collaborative virtual musical instruments into the musical metaverse! Three innovative prototypes foster creativity, interaction and social presence through spatial audio, data sonification and avatars.

Project participants

· Damian Dziwis
· Sascha Etezazi

University of Trento,
Department of Information Engineering:

· Alberto Boem
· Matteo Tomasetti
· Luca Turchet

Project status

Research & Publication

Our research

This project investigates shared and collaborative Virtual Musical Instruments (VMIs) for the musical metaverse. Three prototypes of such instruments, specifically designed for musical collaboration and social interaction, have been developed for shared virtual environments. These include features such as spatial audio, data sonification and avatar-based interactions.

A user study examines how these instruments can foster creativity and usability and provide a sense of social presence and mutual engagement. The results show that different instrument designs exhibit varying degrees of creativity and ease of use, and instruments with symmetrical and embodied interactions provide better social presence and mutual engagement.

Shared and collaborative musical instruments

Collaborative instruments are designed to be played by several people at once, encouraging communication and expression between the players. Social interaction is the key to a satisfying experience.

In collaborative music-making, coordination with others is crucial. Such instruments often resemble public installations and depart from the usual notion of an instrument or a musical interface. A particular type is the networked instrument, which uses the Internet to enable collaboration between geographically dispersed players.

Three prototypes for the musical metaverse

To investigate the possibilities and effects of collaborative virtual instruments in metaverse environments, we developed three prototypes. Two players can connect to a web-based metaverse environment via a browser from different locations and play the instruments together.

The three instruments are optimized for use with head-mounted displays (HMDs) and their controllers. The VMIs were designed around three distinct capabilities of virtual environments:

· Spatial Instrument

This instrument explores sound and virtual space: one player plays sounds on the interface while the other moves the sound source through the space. The position of the sound source is linked to the position of the second player.

· Sonification Instrument

Here, sound synthesis is controlled by parameters of the players' avatars, allowing the players to create music without relying on visual interfaces. The distances between the players and reference points in the virtual space control synthesizer parameters such as waveform, low-pass filter cutoff frequency, frequency modulation, modulation depth and volume.

· Body Instrument

The Body Instrument offers a more intimate form of interaction: the virtual avatar itself is the primary interface. Changes to the avatar's properties shape the musical output, enabling intuitive music-making through gestures and movement and linking the physicality of the player's body to the digital sound.
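The distance-to-parameter mapping behind the Sonification Instrument can be sketched in a few lines. This is a minimal illustration of the general technique, not the project's actual code: the function names, the assumed 0–10 m distance range and the parameter ranges are all our own placeholders.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3-D positions (x, y, z)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a value from one range onto another, clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def sonification_params(player_pos, reference_pos):
    """Map the player-to-reference distance onto synthesizer
    parameters, in the spirit of the Sonification Instrument.
    All ranges here are illustrative assumptions."""
    d = distance(player_pos, reference_pos)
    return {
        "cutoff_hz": map_range(d, 0.0, 10.0, 200.0, 8000.0),  # low-pass cutoff
        "mod_depth": map_range(d, 0.0, 10.0, 0.0, 1.0),       # modulation depth
        "volume":    map_range(d, 0.0, 10.0, 1.0, 0.0),       # closer = louder
    }

# Example: a player 5 m from a reference point
params = sonification_params((0.0, 0.0, 0.0), (5.0, 0.0, 0.0))
```

In a browser-based prototype, values like these would be fed each frame into the audio engine (for example, a filter's cutoff frequency and an oscillator's gain), so that moving through the space continuously reshapes the sound.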

Results of the user study

Our study showed that social presence is influenced by the role and design of the different instruments:


»The Sonification Instrument promotes creativity, ease of use and social presence, while the Body Instrument promotes mutual engagement.«
»The Spatial Instrument affects social presence by splitting the different user roles, suggesting that instrument designs that offer symmetrical user roles and embodied interactions are better suited for shared VMIs in the musical metaverse.«

Results of the research project

The project was presented at the IEEE 5th International Symposium on the Internet of Sounds (IS2) 2024 and published in a scientific paper:

“It Takes Two” - Shared and Collaborative Virtual Musical Instruments in the Musical Metaverse
To the publication

The program code of the three prototypes is published as open source:

To the project

Photos/screenshots: Damian Dziwis
Title picture: Alberto Boem