
Immersive AV Composition - On demand
Taught by: Bryan Dunphy
What you'll learn
- Set up and use the ImmersAV toolkit
- Discuss techniques for rendering material on VR headsets
- Implement the Csound API within a C++ application
- Create mixed raymarched and raster-based graphics
- Create an interactive visual scene using a single fragment shader
- Generate the mandelbulb fractal
- Generate procedural audio using Csound
- Map controller position and rotation to audiovisual parameters using machine learning
Course content
- Course Overview
- Requirements
- Installation of ImmersAV
- Session 1 Worksheet
- Session 2 Worksheet
- Reading Material
- Session 2 Files
- Part 1 - Project Setup
- Part 2 - Audio - Environmental Noise
- Part 3 - Audio - Granular Patch
- Part 4 - Visuals - Infinite Plane
- Part 5 - Visuals - Colour the Scene
- Part 6 - Visuals - Mandelbulb
- Part 7 - Studio - Sound Source Placement
- Part 8 - VR Rendering
- Part 1 - Setup
- Part 2 - Parameter Preparation
- Part 3 - Parameter Randomisation
- Part 4 - Neural network input
- Part 5 - Machine learning test
- Part 6 - Controller bindings
Access this course and 100s more from £35 / month with a 7-day free trial
Or buy this course and own it forever: £19.90
Requirements
- A computer and internet connection
- A web cam and mic
- A Zoom account
- Cloned copy of the ImmersAV toolkit plus dependencies
- VR headset capable of connecting to SteamVR
Who is this course for?
These workshops will introduce you to the ImmersAV toolkit. The toolkit brings together Csound and OpenGL shaders to provide a native C++ environment where you can create abstract audiovisual art. You will learn how to generate material and map parameters using ImmersAV's Studio() class. You will also learn how to render your work on a SteamVR compatible headset using OpenVR. Your fully immersive creations will then become interactive using integrated machine learning through the rapidLib library.
About the workshop leader
Bryan Dunphy completed his PhD at Goldsmiths University in 2021. He specialises in immersive audiovisual performances and artworks, and much of his work uses machine learning.