Visual Music Performance with Machine Learning – On demand

Level: Intermediate

In this workshop you will use openFrameworks to build a real-time audiovisual instrument. You will generate dynamic abstract visuals in openFrameworks and procedural audio with the ofxMaxim addon, and then learn to control the audiovisual material by mapping controller input to audio and visual parameters with the ofxRapidLib addon.
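To give a flavour of the instrument you will build, below is a minimal sketch of an ofApp that animates a 3D primitive with Perlin noise and generates a two-operator FM voice with ofxMaxim. It follows the standard openFrameworks audio-output pattern; identifiers such as carrierFreq, modIndex and noiseSpeed are illustrative placeholders, not the workshop code.

    // ofApp.h (excerpt)
    #pragma once
    #include "ofMain.h"
    #include "ofxMaxim.h"

    class ofApp : public ofBaseApp {
    public:
        void setup();
        void draw();
        void audioOut(ofSoundBuffer &output);

        ofEasyCam cam;
        ofBoxPrimitive box;
        ofSoundStream soundStream;
        maxiOsc carrier, modulator;              // two-operator FM pair
        double carrierFreq = 220, modFreq = 110, modIndex = 80;
        float noiseSpeed = 0.5;
    };

    // ofApp.cpp (excerpt)
    void ofApp::setup(){
        maxiSettings::setup(44100, 2, 512);      // keep maximilian in sync
        ofSoundStreamSettings settings;
        settings.setOutListener(this);
        settings.sampleRate = 44100;
        settings.numOutputChannels = 2;
        settings.bufferSize = 512;
        soundStream.setup(settings);
    }

    void ofApp::draw(){
        cam.begin();
        // Perlin noise (ofNoise returns 0..1) smoothly animates the box size
        box.set(100 + ofNoise(ofGetElapsedTimef() * noiseSpeed) * 150);
        box.drawWireframe();
        cam.end();
    }

    void ofApp::audioOut(ofSoundBuffer &output){
        for (size_t i = 0; i < output.getNumFrames(); i++){
            // FM synthesis: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)) --
            // the modulator deviates the carrier's instantaneous frequency
            double sample = carrier.sinewave(carrierFreq
                                + modIndex * modulator.sinewave(modFreq));
            output[i * output.getNumChannels()]     = sample;
            output[i * output.getNumChannels() + 1] = sample;
        }
    }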

Session Learning Outcomes

By the end of this session, a successful student will be able to:

  • Create generative visual art in openFrameworks

  • Create procedural audio in openFrameworks using ofxMaxim

  • Discuss interactive machine learning techniques

  • Use a neural network to control audiovisual parameters simultaneously in real time

Session Study Topics

  • 3D primitives and Perlin noise

  • FM synthesis

  • Regression analysis using multilayer perceptron neural networks

  • Real-time controller integration
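The last two topics come together in ofxRapidLib's interactive machine learning loop: record input/output example pairs, train a multilayer perceptron regression model, then run it in real time. The sketch below extends the ofApp above and uses the mouse as the controller; the class names (regression, trainingExample) follow the RapidLib examples and may differ slightly between addon versions.

    // ofApp.h additions (inside class ofApp)
    #include "ofxRapidLib.h"

    regression mlp;                          // MLP regression model
    std::vector<trainingExample> examples;   // recorded input/output pairs
    bool trained = false;
    double boxSize = 100;

    // ofApp.cpp additions
    void ofApp::keyPressed(int key){
        if (key == 'r'){                     // 1) record an example:
            trainingExample ex;              //    controller position in,
            ex.input  = { (double)mouseX, (double)mouseY };
            ex.output = { carrierFreq, boxSize };  // parameter values out
            examples.push_back(ex);
        }
        if (key == 't'){                     // 2) train the neural network
            trained = mlp.train(examples);
        }
    }

    void ofApp::update(){
        if (trained){                        // 3) run: one set of inputs
            std::vector<double> out =        //    drives sound and image
                mlp.run({ (double)mouseX, (double)mouseY });
            carrierFreq = out[0];            // audio parameter
            boxSize     = out[1];            // visual parameter
        }
    }

In draw(), boxSize would replace the hard-coded base size above, so moving the controller reshapes the visuals and retunes the FM voice at once.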

Requirements

  • A computer and internet connection

  • A webcam and microphone

  • A Zoom account

  • An installed copy of openFrameworks

  • The ofxMaxim and ofxRapidLib addons, downloaded into your openFrameworks addons folder

  • Access to a MIDI/OSC controller (optional; a mouse or trackpad will also suffice)

About the workshop leader 

Bryan Dunphy is an audiovisual composer, musician and researcher who takes generative approaches to creating audiovisual art. His work explores the interaction of abstract visual shapes, textures and synthesised sounds, and investigates strategies for creating, mapping and controlling audiovisual material in real time. He recently completed his PhD in Arts and Computational Technology at Goldsmiths, University of London.
