
Visual Music Performance with Machine Learning - On demand
Taught by: Bryan Dunphy
What you'll learn
- Create generative visual art in openFrameworks (see the sketch after this list)
- Create procedural audio in openFrameworks using ofxMaxim
- Discuss interactive machine learning techniques
- Use a neural network to control audiovisual parameters simultaneously in real time
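To give a flavour of the first outcome, here is a minimal CPU-side sketch of a generative visual in openFrameworks: a sphere whose vertices are displaced along their normals by smooth noise. The course outline suggests this is built with shaders (Phong lighting and vertex displacement in Parts 2-4); this sketch shows the same idea directly on the mesh, and all names and values are illustrative rather than the course's actual code.

```cpp
// ofApp.h (sketch): a noise-displaced sphere; parameter values are illustrative
#pragma once
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofSpherePrimitive sphere;
    std::vector<glm::vec3> baseVerts; // undisplaced vertex positions
    ofEasyCam cam;
    ofLight light;
    float displaceAmount = 30.0f;     // a parameter a controller could drive

    void setup() override {
        ofEnableDepthTest();
        sphere.setRadius(150);
        sphere.setResolution(64);
        baseVerts = sphere.getMesh().getVertices(); // keep the originals
        light.setPosition(300, 300, 600);
    }

    void update() override {
        auto& verts   = sphere.getMesh().getVertices();
        auto& normals = sphere.getMesh().getNormals();
        float t = ofGetElapsedTimef();
        for (size_t i = 0; i < verts.size(); ++i) {
            // push each vertex along its normal by a smooth noise field
            float n = ofSignedNoise(baseVerts[i].x * 0.01f,
                                    baseVerts[i].y * 0.01f,
                                    baseVerts[i].z * 0.01f, t);
            verts[i] = baseVerts[i] + normals[i] * n * displaceAmount;
        }
    }

    void draw() override {
        ofBackground(10);
        cam.begin();
        light.enable();
        sphere.draw();
        light.disable();
        cam.end();
    }
};
```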
Course content
- Course Overview
- Requirements
- Pre-course preparation
- Work sheet with exercises
- Part 1 - Sphere setup
- Part 2 - Phong lighting
- Part 3 - Camera + Normal matrix
- Part 4 - Vertex displacement
- Part 5 - ofxMaxim setup
- Part 6 - Simple FM synth (a minimal sketch follows this list)
- Part 7 - Machine Learning - Data collection
- Part 8 - Machine Learning - Train + Run model
- Part 9 - OSC controller
- Finished Project on Github
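As a taste of Parts 5 and 6, here is a minimal two-operator FM voice rendered in an openFrameworks audio callback with ofxMaxim. It is a sketch under assumed settings (44.1 kHz, stereo, 512-sample buffers), not the course's project code; the carrier, modulator and index values are placeholders for whatever a controller or the trained model later drives.

```cpp
// ofApp.h (sketch): simple FM synthesis with ofxMaxim
#pragma once
#include "ofMain.h"
#include "ofxMaxim.h"

class ofApp : public ofBaseApp {
public:
    ofSoundStream soundStream;
    maxiOsc carrier, modulator;
    double carrierFreq = 220.0; // Hz
    double modFreq     = 110.0; // Hz
    double modIndex    = 100.0; // modulation depth, in Hz

    void setup() override {
        maxiSettings::setup(44100, 2, 512); // keep Maximilian in sync with OF
        ofSoundStreamSettings settings;
        settings.setOutListener(this);
        settings.sampleRate = 44100;
        settings.numOutputChannels = 2;
        settings.numInputChannels  = 0;
        settings.bufferSize = 512;
        soundStream.setup(settings);
    }

    void audioOut(ofSoundBuffer& buffer) override {
        for (size_t i = 0; i < buffer.getNumFrames(); ++i) {
            // classic FM: the modulator wobbles the carrier's frequency
            double mod    = modulator.sinewave(modFreq) * modIndex;
            double sample = carrier.sinewave(carrierFreq + mod);
            buffer[i * buffer.getNumChannels()]     = sample; // left
            buffer[i * buffer.getNumChannels() + 1] = sample; // right
        }
    }
};
```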
Access this course and hundreds more from £35 / month with a 7-day free trial, or buy this course and own it forever for £9.90.
Requirements
- A computer and internet connection
- A webcam and mic
- A Zoom account
- Installed version of openFrameworks
- Downloaded addons ofxMaxim, ofxRapidLib
- Access to a MIDI/OSC controller (optional - a mouse/trackpad will also suffice; see the OSC sketch after this list)
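If you do use an OSC controller, reading it in openFrameworks takes only a few lines with the core ofxOsc addon. A minimal polling sketch follows; the port and the /fader addresses are assumptions, so match them to whatever your controller app actually sends.

```cpp
// Sketch: poll an OSC controller each frame; port and addresses are assumed
#include "ofMain.h"
#include "ofxOsc.h"

ofxOscReceiver receiver;

void setupOsc() {
    receiver.setup(9000); // assumed port; match your controller app
}

void pollOsc(float& modIndex, float& displaceAmount) {
    while (receiver.hasWaitingMessages()) {
        ofxOscMessage m;
        receiver.getNextMessage(m);
        if (m.getAddress() == "/fader/1") {        // placeholder address
            modIndex = ofMap(m.getArgAsFloat(0), 0.0f, 1.0f, 0.0f, 300.0f);
        } else if (m.getAddress() == "/fader/2") { // placeholder address
            displaceAmount = ofMap(m.getArgAsFloat(0), 0.0f, 1.0f, 0.0f, 60.0f);
        }
    }
}
```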
Who is this course for
- In this workshop you will use openFrameworks to build a real-time audiovisual instrument. You will generate dynamic abstract visuals within openFrameworks and procedural audio using the ofxMaxim addon. You will then learn how to control the audiovisual material by mapping controller input to audio and visual parameters using the ofxRapidLib addon (see the sketch below).
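That mapping is the interactive machine-learning loop of Parts 7 and 8: record input/output example pairs, train a regression model, then run it on live input. A minimal sketch using RapidLib's regression class (which ofxRapidLib wraps) might look like this; the parameter names are illustrative, and depending on your RapidLib version the classes below may need a rapidLib:: namespace prefix.

```cpp
// Sketch: record -> train -> run with RapidLib regression (a small neural network)
#include "rapidLib.h"
#include <vector>

regression network;                       // a small MLP under the hood
std::vector<trainingExample> trainingSet;
bool trained = false;

// Record one example: controller position in, synth/visual parameters out
void record(double ctlX, double ctlY,
            double carrierFreq, double modIndex, double displaceAmount) {
    trainingExample ex;
    ex.input  = { ctlX, ctlY };
    ex.output = { carrierFreq, modIndex, displaceAmount };
    trainingSet.push_back(ex);
}

void trainModel() {
    trained = network.train(trainingSet);
}

// At performance time, one gesture drives every parameter at once
std::vector<double> runModel(double ctlX, double ctlY) {
    return network.run({ ctlX, ctlY });   // {carrierFreq, modIndex, displaceAmount}
}
```

Because the model outputs all parameters from one input, a single gesture can steer sound and visuals simultaneously, which is the core idea of the course.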
About the workshop leader
Bryan Dunphy completed his PhD at Goldsmiths, University of London in 2021. He specialises in immersive audiovisual performances and works, and much of his practice uses machine learning.