Plucking Sounds Out of the Air

The creation of a new gestural instrument for audio-visual performances

Music and movement have always been intrinsically linked, but the rise of the laptop DJ broke that connection. At worst, stagecraft has been reduced to performances that look like little more than checking email.

The AirSticks reconnect music and movement, combining the physicality of acoustic performance with the endless possibilities of digital sound. Freed from the keyboard and mouse, the performer can pluck sounds out of thin air. Utilising off-the-shelf gestural controllers and custom-built software, the AirSticks allow sound and visuals to be triggered and manipulated within a 3D playing space.
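
The project's own software is not reproduced here, but as a rough illustration of the general idea, the sketch below maps a controller's position in a normalised 3D playing space to note, loudness and timbre parameters. All names, ranges and mappings are hypothetical assumptions for illustration only, not the AirSticks' actual implementation.

    # Hypothetical sketch: mapping a gestural controller's 3D position to
    # sound parameters. Names, ranges and mappings are illustrative
    # assumptions, not the AirSticks' actual software.

    from dataclasses import dataclass

    @dataclass
    class HandPosition:
        x: float  # left-right across the playing space, normalised 0.0-1.0
        y: float  # height, normalised 0.0-1.0
        z: float  # depth towards/away from the performer, normalised 0.0-1.0

    def position_to_note(pos: HandPosition) -> dict:
        """Map a position in the 3D playing space to a note number,
        velocity and filter cutoff, clamping each axis to 0.0-1.0."""
        clamp = lambda v: max(0.0, min(1.0, v))
        return {
            "note": 36 + int(clamp(pos.x) * 24),     # two octaves across x
            "velocity": int(clamp(pos.y) * 127),     # strike height -> loudness
            "cutoff_hz": 200 + clamp(pos.z) * 8000,  # depth -> filter brightness
        }

    # Example: a strike high up, slightly right of centre, close to the body.
    print(position_to_note(HandPosition(x=0.6, y=0.9, z=0.2)))

In practice a mapping like this would feed a synthesiser or visual engine continuously, so that both discrete triggers and ongoing gestural manipulation are possible from the same 3D position data.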

Leading this practice-based research project is drummer and instrument designer Alon Ilsar, who has performed and collaborated with musicians, visual artists and dancers. Highlights include Sydney Vivid Festival, Triple J's Like a Version and NYC's MET Museum. More recently, other performers have begun using the AirSticks in their own practice as new hardware and software are developed.

This is an ongoing project and feeds into two other research projects: teaching kids to code through gesture and music making, and empowering people living with physical disability, autism and dementia to unlock their musical creativity.

In collaboration with:

Mark Havryliv
Matt Hughes

Outcomes

See videos of the AirSticks in action

Project members

Dr Alon Ilsar