The central idea behind 'ASCIImage Rhapsody' is to combine live coding performances with real-time interaction from the audience. Since most audiences are unfamiliar with programming or live coding in a specific language, a bridge is required to connect interaction and code. To develop this novel format for installation performances, research was conducted into how user interactions and the environment could be effectively incorporated into code generation.
One promising approach was to use two-dimensional images from a webcam stream for code generation.
The live image from the webcam is converted to greyscale and downsampled to a resolution of 60 × 40 pixels. Each pixel is then converted into an ASCII character corresponding to one of ORCA's programming instructions; a conversion table selects the instruction character according to the pixel's brightness.
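The brightness-to-character step can be sketched as follows. This is a minimal illustration, not the actual implementation: the character palette below is hypothetical, ordered from dark to bright, whereas the real conversion table maps brightness ranges to specific ORCA instructions chosen for the performance.

```python
# Hypothetical conversion table: characters ordered from dark to bright.
# The real table maps brightness ranges to specific ORCA instructions.
PALETTE = ".abcdefghijklmnopqrstuvwxyz"
WIDTH, HEIGHT = 60, 40  # target grid resolution from the text

def pixel_to_orca(brightness):
    """Map an 8-bit greyscale value (0-255) to one character of the palette."""
    index = brightness * len(PALETTE) // 256
    return PALETTE[index]

def frame_to_orca(frame):
    """Convert a HEIGHT x WIDTH greyscale frame (nested lists) to an ORCA grid."""
    return "\n".join(
        "".join(pixel_to_orca(p) for p in row) for row in frame
    )

# Example: a uniform mid-grey frame yields a grid of one repeated character.
frame = [[128] * WIDTH for _ in range(HEIGHT)]
code = frame_to_orca(frame)
```

In practice a real webcam frame produces a varied character grid, so the resulting ORCA program changes continuously with whatever the camera sees.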
The result is an ORCA program corresponding to the image captured by the camera, which generates MIDI data describing the notes to be played. This data is then rendered in real time by sound synthesizers written in SuperCollider, a live coding language.
For the performance in the Kunsthalle Düsseldorf, a table with three small monitors was set up in one of the exhibition rooms. Each monitor showed a different aspect of the process, from the ASCII conversion of the camera facing the audience to the ORCA code or the SuperCollider sound synthesis code.
Visitors can thus interact with the performance via the audience-facing camera, influencing the live code that ultimately shapes the musical result.