Project report

Programming in the metaverse: StackBeat L.O.V.E.

StackBeat L.O.V.E. (Location Oriented Virtual Extension) is a programming language that enables music to be created as code directly in three-dimensional space. In a virtual environment, coders can create blocks that are interpreted according to their distance from one another. At first this all seems very strange, and most of the time it is very noisy, but the aim is to explore the possibilities for musical action and interaction in VR.

Project participants

Matthias Nowakowski, Damian Dziwis

Project status

Development and refinement


StackBeat L.O.V.E. is based on StackBeat, an esoteric programming language that allows users to manipulate the byte code of an acoustic signal using a limited set of characters. Aesthetically, it is similar to ByteBeat, but has fewer characters and is syntactically based on reverse Polish notation.
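The idea of combining reverse Polish notation with bytebeat-style audio can be illustrated with a minimal sketch. This is not the actual StackBeat implementation or syntax, only an assumed stack machine: each sample index `t` is pushed onto a stack, an RPN token stream is evaluated, and the low byte of the result becomes the 8-bit sample.

```python
# Illustrative sketch only -- StackBeat's real character set and
# semantics differ. This is a generic RPN bytebeat evaluator.
def eval_rpn(tokens, t):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "*": lambda a, b: a * b,
        "&": lambda a, b: a & b,
        ">>": lambda a, b: a >> (b & 31),
    }
    for tok in tokens:
        if tok == "t":
            stack.append(t)           # current sample index
        elif tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))    # numeric literal
    return stack[-1] & 0xFF           # keep the low byte, bytebeat style

# The classic bytebeat formula t*(t>>8) in reverse Polish notation:
samples = [eval_rpn(["t", "t", "8", ">>", "*"], t) for t in range(8000)]
```

As with ByteBeat, a short expression like this already produces an evolving, noisy waveform once the samples are streamed to an audio output.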

A peculiarity of the language is that the acoustic result cannot be deduced directly from the characters and commands. We use this as a basis to put the (probably first-time) coder in a situation that defies musical intuition but can also be very challenging in terms of sound.

To do this, the user moves around in a virtual environment, commonly described as the metaverse: a shared digital space in which everyone can interact through movements and other physical actions. Before coding can begin, a context is created, which in our case takes the form of a red sphere; its basic sound is a sawtooth wave. Code blocks can then be placed here and are related to one another depending on their distance. Estimating a block's exact position and its relationship to the other objects is not easy. Numbers also exist as code components; they take their distance to the sphere's center as their value.
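The distance rule for number blocks might be sketched as follows. This is a hypothetical illustration: the function name, the rounding to an integer, and the coordinate representation are our assumptions, not the actual implementation.

```python
import math

# Assumed rule (illustration only): a number block takes the Euclidean
# distance from its 3D position to the centre of the red context sphere
# as its value, rounded here to the nearest integer.
def block_value(block_pos, sphere_center):
    return round(math.dist(block_pos, sphere_center))

# A block placed 3 units along x and 4 along y from the centre:
value = block_value((3.0, 4.0, 0.0), (0.0, 0.0, 0.0))  # distance 5
```

Because the value depends on where the block happens to land in space, moving a block, or moving the sphere, silently changes the numbers in the running program.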

The challenge is to create something musically satisfying despite the given limitations. It becomes especially exciting when more than one person is in the metaverse, and that raises the questions we would like to clarify: How do people behave when they are supposed to make music together but have no idea about the aesthetics or the instrument? Do negotiation processes take place musically? Do the roles of all participants remain equal, or will a “conductor” quickly emerge? After all, everyone can influence everything, contexts can overlap, and code can be shared or stolen.

Interface for creating code for StackBeat L.O.V.E.

In addition to the social aspects, we are also interested in the interaction-related perspective on the system. The interface must be comprehensible so that the blocks and spheres can be handled, but how much information can be dispensed with? And what is the best way to write code in VR while moving through 3D space?

We use such challenges to explore media-specific music for the metaverse, rather than smuggling in familiar musical practices and instruments and merely adapting them there.

StackBeat L.O.V.E. is therefore the tool with which we will pursue these questions and, we hope, perform many live coding concerts.

The theoretical basis will be presented at ISEA 2025 and has already been tested in a concert with students from New York University. The system was used again at a lecture evening at Internet of Sounds 2025 in L'Aquila (Italy).

Want to try it out for yourself? https://stackblitz.com/edit/iseaspace