An introduction to Flora for monome norns – On demand
Level: Some experience of norns required
Flora is an L-systems sequencer and bandpass-filtered sawtooth engine for monome norns. In this workshop you will learn how L-system algorithms are used to produce musical sequences while exploring the script’s UI and features.
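To give a flavor of how an L-system can drive a sequence, here is a minimal Python sketch (illustrative only, not Flora's actual code; the rules and the symbol-to-note mapping are invented for this example). An axiom string is rewritten by production rules each generation, and the resulting string is mapped to notes:

```python
# Minimal L-system sequencer sketch (illustrative, not Flora's code).
# Each generation, every symbol is replaced by its production rule;
# the final string is then mapped to MIDI note numbers.

def expand(axiom, rules, generations):
    """Rewrite the axiom string `generations` times using the rules."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(symbol, symbol) for symbol in s)
    return s

rules = {"A": "AB", "B": "A"}   # the classic Lindenmayer example
note_map = {"A": 60, "B": 67}   # hypothetical symbol-to-MIDI mapping

sequence = [note_map[s] for s in expand("A", rules, 3)]
print(sequence)  # "ABAAB" -> [60, 67, 60, 60, 67]
```

Because the rewriting is deterministic but self-similar, short rule sets like this yield sequences that repeat at several time scales, which is what makes L-systems attractive for generative sequencing.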
By the end of the first workshop, you will be able to:
- Navigate the Flora UI and parameters menus to build and perform your own compositions
- Create dynamically shaped, multinodal envelopes to modulate Flora’s bandpass-filtered sawtooth engine
- Build generative polyrhythms and delays into your compositions
- Use crow and/or midi-enabled controllers and synthesizers to play Flora
Session study topics:
- Sequencing with L-system algorithms
- Physical modeling synthesis with bandpass filters
- Generating multi-nodal envelopes
- Norns integration with midi and/or crow
Requirements
- A computer and internet connection
- A norns device with Flora installed
- Optional: a midi-enabled controller and/or synthesizer
We have a number of sponsorship places available; if the registration fee is a barrier to you joining the workshop, please contact laura@stagingmhs.local.
About the workshop leader
Jonathan Snyder is a Portland, Oregon-based sound explorer and educator.
Previously, he worked for 22 years as a design technologist, IT manager, and educator at Columbia University’s Media Center for Art History, Method, and Adobe.
Create with MPE in Live 11 – On demand
Level: Beginner
MIDI Polyphonic Expression (MPE) offers a vast playground of opportunities for creating musical compositions and productions. Live 11 supports a range of MPE tools that allow the composer and producer to create ideas in a myriad of ways. In this workshop you will creatively explore and deploy a range of MPE techniques in a musical setting. This workshop aims to provide you with suitable skills to utilise the creative possibilities of MPE in the Ableton Live environment.
Session Learning Outcomes
By the end of this session a successful student will be able to:
- Identify the role and function of MPE
- Explore MPE-compatible devices in Live
- Utilise MPE controllers within Live 11
- Apply MPE to create novel musical and sonic elements
Session Study Topics
- Using MPE
- MPE devices in Live
- MPE controllers
- Creatively using MPE
Requirements
- A computer and internet connection
- Access to a copy of Live 11 (i.e. trial or full license)
About the workshop leader
Mel is a London-based music producer, vocalist, and educator.
She spends most of her time teaching people how to make music with Ableton Live and Push. When she’s not doing any of the above, she makes educational content and helps music teachers and schools integrate technology into their classrooms. She is particularly interested in training and supporting female and non-binary people to succeed in the music world.
Visual Music Performance with Machine Learning – On demand
Level: Intermediate
In this workshop you will use openFrameworks to build a real-time audiovisual instrument. You will generate dynamic abstract visuals within openFrameworks and procedural audio using the ofxMaxim addon. You will then learn how to control the audiovisual material by mapping controller input to audio and visual parameters using the ofxRapidLib addon.
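FM synthesis, one of the study topics below, can be sketched in a few lines even outside openFrameworks. The following is a minimal Python illustration (the workshop itself uses ofxMaxim in C++; the frequencies and modulation index here are arbitrary example values), in which a modulator sinusoid varies the phase of a carrier:

```python
import math

def fm_tone(carrier_hz, mod_hz, index, dur_s, sr=44100):
    """Simple two-operator FM: a modulator sinusoid varies the carrier's
    phase. `index` sets the modulation depth, and with it the richness
    of the sidebands around the carrier frequency."""
    samples = []
    for n in range(int(dur_s * sr)):
        t = n / sr
        samples.append(math.sin(2 * math.pi * carrier_hz * t
                                + index * math.sin(2 * math.pi * mod_hz * t)))
    return samples

# 0.1 s of a 220 Hz carrier modulated at 110 Hz (example values)
tone = fm_tone(carrier_hz=220.0, mod_hz=110.0, index=2.0, dur_s=0.1)
```

Raising `index` in real time (for instance from a controller axis) is exactly the kind of parameter mapping the workshop builds with a neural network.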
Session Learning Outcomes
By the end of this session a successful student will be able to:
- Create generative visual art in openFrameworks
- Create procedural audio in openFrameworks using ofxMaxim
- Discuss interactive machine learning techniques
- Use a neural network to control audiovisual parameters simultaneously in real time
Session Study Topics
- 3D primitives and Perlin noise
- FM synthesis
- Regression analysis using multilayer perceptron neural networks
- Real-time controller integration
Requirements
- A computer and internet connection
- A webcam and mic
- A Zoom account
- An installed version of openFrameworks
- The ofxMaxim and ofxRapidLib addons downloaded
- Access to a MIDI/OSC controller (optional – a mouse/trackpad will also suffice)
About the workshop leader
Bryan Dunphy is an audiovisual composer, musician and researcher interested in generative approaches to creating audiovisual art. His work explores the interaction of abstract visual shapes, textures and synthesised sounds. He is interested in exploring strategies for creating, mapping and controlling audiovisual material in real time. He has recently completed his PhD in Arts and Computational Technology at Goldsmiths, University of London.