TouchDesigner meetup 17th April – Audio visualisation

Date & Time: Saturday 17th April 5pm – 7pm UK / 6pm – 8pm Berlin

Level: Open to all levels

Join the online meetup for expert talks on audio visualisation. Meet and be inspired by the TouchDesigner community.

The meetup runs via Zoom. The main session features short presentations from TouchDesigner users. Breakout rooms are created on the spot for specific topics, and you can request a new topic at any time.

The theme for this session is Audio visualisation, hosted by Bileam Tschepe with presentations from the community.

In the breakout rooms, you can share your screen to show other participants something you’re working on, ask for help, or help someone else.

Presenters:

Name: Ian MacLachlan
Title: Terraforming with MIDI
Bio: Ian MacLachlan is an experimental audio/visual artist from the Detroit area with an interest in creating interactive systems for spatial transformation.
Name: Jean-François Renaud
Title: Generating MIDI messages to synchronize sound and visual effect in TouchDesigner
Description: Instead of using audio analysis to drive the rendering, we focus on building small generative machines from the basic properties of notes (pitch, velocity) and look at different ways to manage triggering. In the end, the goal is still to merge what you hear and what you see, and to bring both to life. (A minimal illustrative sketch of this idea follows the presenter list.)
Bio: Interactive media professor at École des médias, UQAM, Montréal
Vimeo: https://vimeo.com/morpholux
Name: Bileam Tschepe
Title: algorhythm – a first look into my software
Description: I’ve been working on a tool for audiovisual live performances and I’d like to share its current state and see if people are interested in collaborating and working with me
Bio: Berlin based artist and educator who creates audio-reactive, interactive and organic digital artworks, systems and installations in TouchDesigner, collaborating with and teaching people worldwide.
YouTube: Bileam Tschepe
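
As a rough illustration of the kind of "small generative machine" Jean-François describes, here is a minimal standalone Python sketch (not taken from the talk) that emits MIDI-style note events from a pitch set, a velocity range and a simple probabilistic trigger. The scale, velocity range and trigger probability are arbitrary choices for the example.

import random

SCALE = [60, 62, 64, 67, 69]     # C major pentatonic (MIDI pitches), an arbitrary choice
TRIGGER_PROBABILITY = 0.4        # chance of the machine firing a note on each step

def step():
    """Return a (pitch, velocity) event for this step, or None if nothing triggers."""
    if random.random() > TRIGGER_PROBABILITY:
        return None              # the machine stays silent this step
    return random.choice(SCALE), random.randint(50, 110)

if __name__ == "__main__":
    for beat in range(16):
        event = step()
        if event:
            print(f"beat {beat:2d}: note_on pitch={event[0]} velocity={event[1]}")

In practice each printed event would be sent out as a MIDI message (or used directly inside TouchDesigner) to trigger both the sound and the visual response from the same source.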

Requirements

  • A Zoom account
  • A computer and internet connection

Berlin Code of Conduct

We ask all participants to read and follow the Berlin Code of Conduct and contribute to creating a welcoming environment for everyone.

Melody Generation in Max – On demand

Level: Intermediate

The importance of melody in traditional musical composition is difficult to overstate. A melody is often one of the first things the ear latches onto, and writing a good one is something of an art form. Producing basic algorithmically-generated melodies in Max/MSP is quite easy, but to produce something more ‘musical’ we must refine the generation process.

In this workshop you will learn some ways of generating more complex melodies in Max. This will involve implementing occasional phrase repeats to balance predictability and surprise, locking in some of the more important rhythmic elements, and incorporating planned octave jumps alongside more restricted stepwise pitch movement.
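
To make those ideas concrete, here is a rough Python sketch of that generation strategy. The workshop itself builds this as a Max/MSP patch; the scale, phrase length and probabilities below are illustrative assumptions, not the workshop patch.

import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]   # C major, one octave (MIDI pitches)
PHRASE_LENGTH = 8
REPEAT_PROBABILITY = 0.3                    # chance of repeating the previous phrase
OCTAVE_JUMP_PROBABILITY = 0.15              # chance of a planned octave leap
REST_PROBABILITY = 0.25                     # chance a non-anchor step stays silent
ANCHOR_STEPS = {0, 4}                       # beats that always sound (locked-in rhythm)

def make_phrase():
    """Build one phrase: mostly stepwise motion, anchors always sound, other steps may rest."""
    index = random.randrange(len(SCALE))
    phrase = []
    for step in range(PHRASE_LENGTH):
        if step not in ANCHOR_STEPS and random.random() < REST_PROBABILITY:
            phrase.append(None)             # rest
            continue
        index = max(0, min(len(SCALE) - 1, index + random.choice([-1, 0, 1])))
        pitch = SCALE[index]
        if random.random() < OCTAVE_JUMP_PROBABILITY:
            pitch += random.choice([-12, 12])   # planned octave jump
        phrase.append(pitch)
    return phrase

def make_melody(num_phrases=4):
    """Chain phrases, occasionally re-using the previous one to balance predictability and surprise."""
    melody, last = [], None
    for _ in range(num_phrases):
        phrase = last if last and random.random() < REPEAT_PROBABILITY else make_phrase()
        melody.extend(phrase)
        last = phrase
    return melody

if __name__ == "__main__":
    print(make_melody())                    # None entries are rests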

By the end of the workshop you will have constructed a melody generation patch that can be set to play along with your compositions, and you will have a greater understanding of some of the ways in which we can sculpt melody in Max.

Topics

    • Max/MSP
    • Algorithmic Composition
    • Melody

Requirements

  • You should be comfortable with the general workflow and data formatting in Max.

  • Knowledge of the MIDI format and routing to DAWs (Ableton, Logic, etc.) would be a plus, although Max instruments will be provided.

  • You should have some basic knowledge of music theory: chords, scales, modes, etc.

About the workshop leader 

Samuel Pearce-Davies is a composer, performer, music programmer and Max hacker living in Cornwall, UK.

Coming from a classical music background, Sam discovered Max/MSP during his undergraduate studies at Falmouth University, which sparked his passion for music programming and algorithmic composition.

After completing a Research Masters in computer music, Sam is now studying for a PhD in music-focused AI at Plymouth University.
