Two people interact with a digital art installation at a creative computing degree show. One person uses a physical controller with green buttons connected to an iPhone emulator projected on a wall. The screen displays an audio track interface with dynamic, colourful wave visuals in the background, reacting in real time.

Controlling Music Visually via WebSocket and iOS

May 2023
App Development · Audio Engineering · Swift · iOS Development · WebSocket

An interactive audio-visual piece using a stem player to control grouped tracks via a WebSocket-linked iPhone emulator, with real-time visuals reacting to mic input through a live-coded browser video synth.

For my final project I was required to present my piece in the annual creative computing degree show. My presentation combined previous projects in creative computing and audio-visual processing to create an interactive piece for people to play with.

Using a WebSocket hosted on a local server, I connected the green buttons on the stem player device to the iPhone emulator, so that button presses controlled the audio track outputs.
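Once a message format is fixed, the WebSocket link reduces to encoding button presses on one end and decoding them on the other. Below is a minimal sketch of that idea, assuming one JSON message per press; the field names (`type`, `button`, `state`) are illustrative, not the exact protocol used in the project.

```javascript
// Encode a green-button press into a JSON string to send over the socket.
// `button` is the button index; `state` is "down" or "up".
function encodeButtonEvent(button, state) {
  return JSON.stringify({ type: "button", button, state });
}

// Decode an incoming message on the emulator side; returns null for
// messages this handler does not understand.
function decodeButtonEvent(raw) {
  const msg = JSON.parse(raw);
  return msg.type === "button"
    ? { button: msg.button, state: msg.state }
    : null;
}

// In the browser-hosted emulator, these messages would travel over a
// WebSocket connected to the local server, e.g.:
//   const socket = new WebSocket("ws://localhost:8080");
//   socket.onmessage = (e) => handle(decodeButtonEvent(e.data));
```

Keeping the protocol to a single typed JSON object made it easy to ignore unrelated traffic on the same socket.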

Each button represented a group of tracks. The groups could be separated in a variety of different ways, since digitally created music uses more contemporary instruments than traditional ones. This is an example of stem grouping in electronic music:

  • Vocals (if present)

  • Synths 1

  • Synths 2

  • Bass

The setup for the presentation was created in a short amount of time, and there were occasional delays in response to user interactions. Remote control of an iOS device is not generally well supported, as iOS design principles prefer user interactions to be contained on the device itself.

A user interacts with a digital audio-visual installation using a stem player to control audio tracks on an iPhone emulator.

To create the visual feedback, I ran a live-coded video synth in the browser and used the microphone input to animate the screen.
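The microphone side boils down to one step: turning a frame of analyser frequency bins into a single level that drives the visuals. Here is a sketch of that mapping, with the Web Audio wiring shown only in comments since it runs in a browser; the `draw()` call is hypothetical.

```javascript
// Map a frame of byte frequency data (0-255 per bin) to a 0-1 level
// that can drive a visual parameter such as brightness or scale.
function micLevel(bins) {
  if (bins.length === 0) return 0;
  const sum = bins.reduce((acc, v) => acc + v, 0);
  return sum / (bins.length * 255);
}

// Browser wiring (illustrative):
//   const ctx = new AudioContext();
//   const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
//   const analyser = ctx.createAnalyser();
//   ctx.createMediaStreamSource(stream).connect(analyser);
//   const bins = new Uint8Array(analyser.frequencyBinCount);
//   function frame() {
//     analyser.getByteFrequencyData(bins);
//     draw(micLevel(bins));          // hypothetical draw() call
//     requestAnimationFrame(frame);
//   }
//   frame();
```

Normalising to a 0-1 range means the same level can feed any parameter of the video synth without rescaling.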