StackBeat L.O.V.E. is based on StackBeat, an esoteric programming language in which users manipulate the bytecode of an acoustic signal using a limited set of characters. Aesthetically it resembles ByteBeat, but it uses fewer characters and its syntax is based on reverse Polish notation.
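StackBeat's actual character set and semantics are not spelled out here, so the following Python sketch only illustrates the general principle: a bytebeat-style formula, written in reverse Polish notation, is evaluated per sample index `t` by a small stack machine, and the result is truncated to one byte. The token names and operator set are hypothetical, not StackBeat's real ones.

```python
def eval_rpn(tokens, t):
    """Evaluate a hypothetical RPN bytebeat expression for sample index t.

    Returns one byte (0-255), as in classic bytebeat output.
    """
    stack = []
    for tok in tokens:
        if tok == "t":
            stack.append(t)          # the running sample counter
        elif tok.isdigit():
            stack.append(int(tok))   # numeric literal
        else:
            b, a = stack.pop(), stack.pop()
            if tok == "+":
                stack.append(a + b)
            elif tok == "*":
                stack.append(a * b)
            elif tok == ">>":
                stack.append(a >> b)
            elif tok == "&":
                stack.append(a & b)
            elif tok == "|":
                stack.append(a | b)
    return stack.pop() & 0xFF        # truncate to one byte

# Example: the bytebeat-style formula ((t >> 10) & 42) * t
# written in RPN as: t 10 >> 42 & t *
sample = eval_rpn(["t", "10", ">>", "42", "&", "t", "*"], 3000)
```

Feeding each successive value of `t` through such an expression yields the raw sample stream; the appeal, as in bytebeat, is that small changes to the token sequence produce hard-to-predict sonic results.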
A peculiarity of the language is that the acoustic result cannot be deduced directly from the characters and commands. We use this deliberately to place the (likely first-time) coder in a situation that defies musical intuition but can also be very rewarding in terms of sound.
To do this, the user moves through a virtual environment of the kind generally described as a metaverse: a shared digital space in which everyone can interact through movements and other physical actions. Before coding can begin, a context is created, which in our case takes the form of a red sphere; its basic sound is a sawtooth wave. Code blocks can then be placed inside this context, and they are related to one another depending on their distance. Estimating the exact position of, and relationship to, the other objects is not easy. Numbers are also available as code components; they take the distance to the center of the context as their value.
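The exact spatial mapping is not specified in the text, but as a minimal sketch, a number token placed in the scene could take the Euclidean distance from its position to the context sphere's center as its value. The function name and the linear scale factor are assumptions for illustration.

```python
import math

def token_value(token_pos, center, scale=1.0):
    """Hypothetical rule: a number token's value is its Euclidean
    distance to the context sphere's center, optionally scaled."""
    return math.dist(token_pos, center) * scale

# A token placed 5 units from the sphere's center yields the value 5.0
v = token_value((3.0, 4.0, 0.0), (0.0, 0.0, 0.0))
```

Because the coder judges this distance only by eye inside the virtual environment, the resulting numeric value is itself imprecise, which reinforces the gap between intention and acoustic outcome described above.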
The challenge is to create something musically satisfying despite these limitations. It becomes especially interesting when more than one person is in the metaverse, and this raises the questions we would like to explore: How do people behave when they are supposed to make music together but understand neither the aesthetics nor the instrument? Do negotiation processes take place musically? Do the roles of all participants remain equal, or does a “conductor” quickly emerge? After all, everyone can influence everything, contexts can overlap, and code can be shared or stolen.