Ultrasonic tactile feedback – by Matthias Kispert and Sus Garcia

Miha Ciglar presented his interface Syntact™ at the Music Hackspace on Thursday, 21 June 2012. Feeling the sound, on this occasion, meant placing your hand at a focal point where ultrasonic transducers literally blow acoustic energy onto your skin. As in AM radio, the audible signal is amplitude-modulated onto an ultrasonic carrier, so the ultrasound carries the acoustic information while remaining inaudible itself. Syntact grew out of Miha's interest in receiving tactile feedback in sound performance: the musician feels the sound and interacts with it tactilely without any physical contact with the interface.
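
To illustrate the amplitude-modulation principle described above (this is not Syntact's actual signal chain; the carrier frequency, sample rate and modulation depth are assumptions for the sake of the sketch), a minimal NumPy example:

```python
import numpy as np

FS = 192_000          # sample rate in Hz - assumed, high enough for a 40 kHz carrier
CARRIER_HZ = 40_000   # typical ultrasonic transducer frequency - assumption
DURATION = 1.0        # seconds

t = np.arange(int(FS * DURATION)) / FS

# Audible "message" signal: a 200 Hz tone, normalised to [-1, 1]
audio = np.sin(2 * np.pi * 200 * t)

# Classic AM: the audible waveform shapes the envelope of the ultrasonic carrier.
# The modulated signal only contains energy around 40 kHz, so it is inaudible,
# but its envelope (and hence the radiation pressure felt on the skin)
# follows the audio.
m = 0.8                                     # modulation depth - assumption
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
am_signal = (1 + m * audio) * carrier
```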

After a presentation of the interface, developed by Ultrasonic audio technologies, the company he founded in Slovenia in 2011, Miha Ciglar performed a short set on a no-input mixing board whose internal feedback was augmented by a Syntact interface: its output was reflected via hand gestures onto a piezo transducer and fed back into the mixer. A different mode of gestural interaction uses a USB camera that tracks hand movements and converts them into MIDI data, which can then be assigned to software parameters.
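
A rough sketch of that camera-to-MIDI idea is given below. It is purely illustrative: OpenCV-based brightness tracking and the mido library stand in for whatever Syntact's own software uses, and the CC numbers are arbitrary.

```python
import cv2
import mido

# Illustrative only: map the centroid of a bright region in the camera image
# to two MIDI control-change messages (X -> CC 20, Y -> CC 21).
cap = cv2.VideoCapture(0)                    # first USB camera
port = mido.open_output()                    # default MIDI output port

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] > 0:                         # a bright region was found
        cx = m["m10"] / m["m00"] / frame.shape[1]   # normalised 0..1
        cy = m["m01"] / m["m00"] / frame.shape[0]
        port.send(mido.Message("control_change", control=20, value=int(cx * 127)))
        port.send(mido.Message("control_change", control=21, value=int(cy * 127)))
    if cv2.waitKey(1) == 27:                 # Esc to quit
        break

cap.release()
```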

Miha also demonstrated the Acouspade™ directional speaker, which likewise uses amplitude-modulated ultrasound, but here the wave is demodulated back into the human hearing range. Because high-frequency waves are highly directional, sound can be focused onto a particular point with precision. As Miha aimed the speaker at different points in the room, it was fascinating to hear how clearly these movements could be located: the sound always appeared to emerge from a point more sharply defined than a real-life sound source would be.
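
As a toy illustration of why an amplitude-modulated ultrasound beam becomes audible at all: in the commonly cited far-field model of a parametric loudspeaker (Berktay's approximation), the demodulated pressure is proportional to the second time derivative of the squared envelope. The sketch below simply evaluates that relation numerically; it is not Acouspade's actual processing, and the signal parameters are assumptions.

```python
import numpy as np

FS = 192_000
t = np.arange(int(FS * 0.05)) / FS

audio = np.sin(2 * np.pi * 500 * t)   # audible message, 500 Hz
m = 0.8                               # modulation depth - assumption
envelope = 1 + m * audio              # AM envelope of the ultrasonic carrier

# Berktay's far-field approximation: the audible pressure generated along the
# beam is proportional to d^2/dt^2 of the squared envelope. Squaring the
# envelope introduces distortion (here, a 1 kHz component from the 500 Hz
# tone), which is why real parametric speakers pre-process the audio before
# modulating it onto the carrier.
demodulated = np.gradient(np.gradient(envelope**2, t), t)
```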


