BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Music Hackspace - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Music Hackspace
X-ORIGINAL-URL:https://musichackspace.org
X-WR-CALDESC:Events for Music Hackspace
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20180325T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20181028T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20190331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20191027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20200329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20201025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20210328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20211031T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200625T183000
DTEND;TZID=Europe/London:20200625T200000
DTSTAMP:20260403T232154Z
CREATED:20200615T175744Z
LAST-MODIFIED:20200615T175809Z
UID:10000829-1593109800-1593115200@musichackspace.org
SUMMARY:Audio chips\, e-textiles\, touch interfaces: 40 years of computer music research
DESCRIPTION:Adrian Freed is the former Research Director of UC Berkeley’s CNMAT\, the historic Californian research centre led for over two decades by the late David Wessel. At CNMAT\, Adrian led a number of influential projects in computer music\, including the widely used Open Sound Control (OSC) protocol\, developed with Matt Wright. \nAdrian published his first paper in 1975\, at a time when computers were out of reach\, and he started hacking with digital and analog chips. As technology progressed\, he worked on the Fairlight CMI and on processors that were powerful for their time but would cost less than $10 today. He dedicated his research to the new field of computer music\, and went on to build systems ranging from analog designs to e-textiles. \nIn this talk\, we’ll hear about Adrian’s long career\, and the exciting new project he is working on\, the FingerPhone.
URL:https://musichackspace.org/event/audio-chips-e-textiles-touch-interfaces-40-years-of-computer-music-research/
LOCATION:YouTube and Facebook
CATEGORIES:Electronics,Instrument design
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/flyer.001-e1592241992527.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200622T180000
DTEND;TZID=Europe/London:20200622T180000
DTSTAMP:20260403T232154Z
CREATED:20200616T070600Z
LAST-MODIFIED:20200616T070600Z
UID:10000834-1592848800-1592848800@musichackspace.org
SUMMARY:Andrew McPherson: Augmenting Instruments
DESCRIPTION:Dr Andrew McPherson is a Reader at Queen Mary University of London\, where he leads the Augmented Instruments Lab. He invented the Magnetic Resonator Piano\, TouchKeys and Bela\, and is on a mission to empower anyone to build their own instruments. In this talk\, Andrew will revisit his inventions and give tips on getting started building your own. Join the live stream and participate in the live chat!
URL:https://musichackspace.org/event/andrew-mcpherson-augmenting-instruments/
LOCATION:YouTube and Facebook
CATEGORIES:Instrument design,research
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/flyer.001-3-e1592290935278.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200618T183000
DTEND;TZID=Europe/London:20200618T200000
DTSTAMP:20260403T232154Z
CREATED:20200608T131325Z
LAST-MODIFIED:20200618T161832Z
UID:10000825-1592505000-1592510400@musichackspace.org
SUMMARY:SWAM: software instruments that sound real
DESCRIPTION:Sound designer and musician Stefano Lucato started creating solo string libraries in 2003\, and has been on a quest ever since to create the most accurate reproduction of acoustic instrument sounds\, available in plug-in format. He teamed up with software developer Emanuele Parravicini to build a technology that they used together to release software versions of nearly all orchestral instruments. \nStefano and Emanuele created Audio Modeling in 2017\, and run the business from a small village in Lombardy\, Italy\, between Milan and Lake Como. Their software instruments are used all over the world by composers and producers\, in the studio as well as live. \nOn the 18th June\, I will interview Emanuele Parravicini about Audio Modeling’s journey and projects\, and we’ll hear live demonstrations of the instruments. Tune in on Facebook and YouTube.
URL:https://musichackspace.org/event/swam-software-instruments-that-sound-real/
LOCATION:YouTube and Facebook
CATEGORIES:Product discovery
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/SWAM.001-e1591621982802.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200613T170000
DTEND;TZID=Europe/London:20200613T190000
DTSTAMP:20260403T232154Z
CREATED:20200528T153436Z
LAST-MODIFIED:20200604T145324Z
UID:10000813-1592067600-1592074800@musichackspace.org
SUMMARY:Workshop with Ruben Dax: Live Looping with Ableton Live
DESCRIPTION:Book a ticket here \nAbleton Live provides a great software environment for live looping – empowering a solo performer to bring an entire song to life in front of an audience in an engaging and relatable way. Outside of performing\, live looping can also be used as a composition tool\, allowing you to create new music faster and with fewer distractions. Live looping using software such as Ableton Live affords you greater flexibility in your workflow\, performance style and mixing techniques compared with traditional hardware looper devices. \nIn this workshop we will dive deep into the different techniques for live looping\, using Ableton Live 10 as our digital audio workstation. We will progress through basic recording and layering techniques\, using Ableton’s powerful ‘Looper’ device\, MIDI mapping your controls\, and even discovering how to craft automated (hands-free) live-looping performances. \nHere’s a chance to learn the methods I’ve experimented with\, developed and optimized over the years for my own multi-instrumental live-looping performances\, and to start building your own! \nTopics \n\nAbleton Live\nAbleton Live Looper device\nBasic recording and layering techniques\nMIDI mapping\nMIDI controllers\nLive looping automations \n\nRequirements \n\nAn instrument and a way to capture sound\, or a MIDI controller connected to a virtual instrument\nAbleton Live 9\, Live 10 or even Live Intro!\nUsing a MIDI controller can increase your flexibility (such as Novation Launchpad\, Akai MPC\, Ableton Push\, ROLI Lightpad…)\n\nAbout the workshop leader \nRuben Dax is a musician\, multi-instrumentalist and instrument designer who uses Ableton Live to compose songs with live loops. His YouTube channel shows a range of videos in which he creates tracks with diverse sounds and instruments\, aggregating them in loops to create rich\, layered tracks.
URL:https://musichackspace.org/event/workshop-with-ruben-dax-live-looping-with-ableton-live/
LOCATION:Online
CATEGORIES:Music software,Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/flyer.001-1-e1591282362217.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200613T160000
DTEND;TZID=Europe/London:20200613T180000
DTSTAMP:20260403T232154Z
CREATED:20200601T152900Z
LAST-MODIFIED:20200604T145716Z
UID:10000822-1592064000-1592071200@musichackspace.org
SUMMARY:Max and Node.js: the Age of Javascript
DESCRIPTION:Difficulty level: intermediate. \nJavascript is the glue of the Internet\, the universal language that runs in billions of web browsers and networked systems across the world. Max’s Node support connects the Max program into the ecosystem of Node.js\, a Javascript engine which can network\, talk to hardware\, connect to databases\, run sophisticated web servers\, and much more. \nIn the workshop we’ll look at the link between Max and Node\, and how the different worlds of graphical and textual programming connect. We’ll also look at some specific Javascript coding techniques and connect them to the 3D world of Max’s Jitter and OpenGL systems. From there we will start to explore ways to attach Max to web servers\, allowing Max’s data to flow into and out of dynamic web pages. \nTopics: \n\nMax\nJavascript\nWeb technologies\nJitter and 3D Graphics\n\nRequirements:  \nA good working knowledge of Max is expected\, as well as an awareness of how Jitter works. Some familiarity with textual programming languages and/or web technologies would be useful\, but not required. \nAbout the workshop leader: \nDr Nick Rothwell (aka Cassiel) is a composer\, performer\, software architect\, coder and visual artist. He has built media performance systems for projects with Ballett Frankfurt and Vienna Volksoper\, composed sound scores for Aydın Teker (Istanbul) and Shobana Jeyasingh Dance\, live coded in Mexico and in Berlin with sitar player Shama Rahman\, written software for Studio Wayne McGregor and the Pina Bausch Foundation\, and developed algorithmic visuals for large-scale outdoor installations in Poland\, Estonia\, Cambridge Music Festival and Lumiere (London / Durham). He also teaches at Ravensbourne University London and writes for Sound On Sound magazine.
URL:https://musichackspace.org/event/max-and-node-js-the-age-of-javascript/
LOCATION:Online
CATEGORIES:Software Classes,Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/flyer.001-2-e1591025095898.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200611T183000
DTEND;TZID=Europe/London:20200611T200000
DTSTAMP:20260403T232154Z
CREATED:20200601T152413Z
LAST-MODIFIED:20200601T152413Z
UID:10000819-1591900200-1591905600@musichackspace.org
SUMMARY:Atau Tanaka: Making music with muscle sensors
DESCRIPTION:Prof Atau Tanaka is a well-known figure in New Interfaces for Musical Expression (NIME)\, an international conference he helped get started in the early 2000s. Atau is both an academic (head of the Embodied AudioVisual Interaction group at Goldsmiths\, University of London) and an artist who performs internationally. His artistic practice with muscle sensors spans over three decades and hundreds of performances. \nOn the 11th of June\, Atau will give insights into his approach\, and tips for anyone looking to create custom interfaces to control music parameters live. Live on YouTube and Facebook.
URL:https://musichackspace.org/event/atau-tanaka-making-music-with-muscle-sensors/
LOCATION:Online
CATEGORIES:Artist Talks,Instrument design
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/flyer.001-1-e1591024989956.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200608T183000
DTEND;TZID=Europe/London:20200608T183000
DTSTAMP:20260403T232154Z
CREATED:20200601T145707Z
LAST-MODIFIED:20210914T093935Z
UID:10000816-1591641000-1591641000@musichackspace.org
SUMMARY:Building Joué instruments: meet founder Pascal Joguet
DESCRIPTION:If you’ve been making music for more than 15 years\, you might remember the first touch-screen controller designed for music applications\, JazzMutant’s Lemur. Originally launched in 2005\, the Lemur was a highly customisable multi-touch screen controller\, designed to let you create your own control user interface. The Lemur offered high-definition control by sending data over OSC\, an audacious move that no major MIDI controller manufacturer ever dared to make. \nI won’t dive too deep into the story of the Lemur. Peter Kirn did a great job of writing its obituary 10 years ago\, and you can find it here. Suffice to say that it met its demise shortly after the launch of the iPad in 2010. \nPascal Joguet was the founder of JazzMutant\, and the Lemur wasn’t the last design he had in him. A few years ago\, Pascal co-founded Joué\, aiming to create expressive instruments with a playful design\, to lower the barriers to music production. \nWith a successful Kickstarter campaign under way (ending 9th June)\, Joué will be launching the Joué Play and accessories in October 2020. On Monday 8th June\, we will host Pascal Joguet for a livestream\, where he will retrace the story of his designs over the past 20 years. Here’s the Kickstarter video of the Joué Play.
URL:https://musichackspace.org/event/building-joue-instruments-meet-founder-pascal-joguet/
LOCATION:Online
CATEGORIES:Instrument design,Product discovery
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/06/flyer.001-e1591021675661.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200604T183000
DTEND;TZID=Europe/London:20200604T200000
DTSTAMP:20260403T232154Z
CREATED:20200527T134103Z
LAST-MODIFIED:20200604T171120Z
UID:10000811-1591295400-1591300800@musichackspace.org
SUMMARY:Tim Exile: creating Endlesss
DESCRIPTION:Tim Exile worked with live looping and sampling as an artist for a decade before building his own plug-ins\, released by Native Instruments. The unique methods and concepts he developed for live performance inspired him to build a music-making flow\, Endlesss\, that would allow musicians to create music collaboratively. \nTim and the Endlesss team are preparing a special announcement on 2nd June. Our session takes place two days later\, where Tim will share his journey from musician to entrepreneur\, and the vision that led him to create the Endlesss app. \nAbout Tim Exile \nMusician and technologist Tim Exile founded Endlesss to bring the joy of spontaneous collaborative creativity to music-making\, in the way TikTok and Instagram did with other media. These platforms empowered their users to create regular short-form work in an open social space. \nAs a seasoned musician and producer\, Tim missed the elements of spontaneity\, creativity and community in his long solo studio sessions. He set out to develop his ‘Flow Machine’ instrument for electronic improvisation\, which would become the DNA of Endlesss. Tim saw the potential of what he’d built to unlock a new purpose for music-making – a fast-paced\, live-action\, game-like alternative to the complexities of music production and the competitive music industry. Tim has developed software instrument products with Native Instruments and released records on Warp Records. He’s performed live on every continent with his ‘Flow Machine’\, collaborated with a diverse range of artists such as Nile Rodgers\, Imogen Heap and Beardyman\, and has spoken multiple times at TED and TEDx conferences about music and improvisation. \nDownload Endlesss here (iOS only)
URL:https://musichackspace.org/event/tim-exile-creating-endlesss/
LOCATION:Online
CATEGORIES:Music software,Product discovery
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Thumbnails.001-1-e1590586641745.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200601T183000
DTEND;TZID=Europe/London:20200601T200000
DTSTAMP:20260403T232154Z
CREATED:20200523T150022Z
LAST-MODIFIED:20200608T152919Z
UID:10000808-1591036200-1591041600@musichackspace.org
SUMMARY:Georgina Brett: live looping vocals
DESCRIPTION:In this online session Georgina will talk about the conception and actualisation of each piece she performs\, and the various methods of composition she uses. With a very simple set-up she creates complex rolling matrices of vocalisations and lyrics. \nGeorgina Brett’s music is created using her voice and effects pedals\, creating instant choirs of sound\, often in a hypnotic style. The point of this music is not only to captivate with extraordinary timing and melodic style\, but also to help the listener relax in our increasingly distracted world. Her double album Nonsense A and Nonsense B consists of purely vocal works with no ‘deliberate’ words or lyrics. The albums show the voice as an instrument\, and as a vehicle for emotional expression. “So much music is made in order to make us feel something so as to manipulate us to buy or to follow… this album lets your thoughts be whatever they want to be.” The albums also play with the idea that we like to interpret\, constantly listening for meaning instead of just listening.
URL:https://musichackspace.org/event/georgina-brett-live-looping-vocals/
LOCATION:YouTube and Facebook
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Georgina-Brett.001-e1590245466832.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200530T160000
DTEND;TZID=Europe/London:20200530T180000
DTSTAMP:20260403T232154Z
CREATED:20200518T144641Z
LAST-MODIFIED:20200523T165150Z
UID:10000804-1590854400-1590861600@musichackspace.org
SUMMARY:Working with samples in Max and Max for Live\, with Darwin Grosse
DESCRIPTION:In this 2-hour workshop\, Darwin Grosse (Cycling ’74) will use Max/MSP to combine audio clip playback with VST effects and Max for Live devices to create a range of soundscapes\, from glitchy beats to warped ambiences. The focus will be on creating a personal performance system that can be tailored to your interests and style. Topics covered will be audio playback\, effects processing\, user-interaction development and MIDI mapping. \nA basic familiarity with Max will help ensure success! \nAbout the workshop leader \nDarwin Grosse is Director of Education at Cycling ’74\, responsible for the tutorials and help files that can be found in every release of Max. Darwin is an experienced workshop leader who enjoys building small music machines in Max\, such as drones\, rhythm generators and other re-usable projects that he can connect to hardware controllers or modular synthesizers. Since 2013 Darwin has been running a podcast on art\, music and technology\, with over 350 episodes already released. https://artmusictech.libsyn.com/
URL:https://musichackspace.org/event/working-with-samples-in-max-and-max-for-live-with-darwin-grosse/
LOCATION:Online
CATEGORIES:Software Classes,Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/flyer.001-e1590252695433.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200530T110000
DTEND;TZID=Europe/London:20200530T123000
DTSTAMP:20260403T232154Z
CREATED:20200518T144206Z
LAST-MODIFIED:20200523T143932Z
UID:10000732-1590836400-1590841800@musichackspace.org
SUMMARY:Virtual drumming workshop with Shay Dyer-Harris
DESCRIPTION:Virtual drumming workshop with Shay! \nShayanna from Kinetika Bloco will lead a drumming workshop using household objects! \nKinetika Bloco is a performance group with an exuberant mix of young brass and woodwind players\, drummers\, steel pan players and dynamic dancers\, all in costume\, creating a “unique new British Carnival sound with a decidedly London edge” (BBC Radio 2). \nDuring lockdown Shay has adapted her workshops to use household objects rather than traditional steel drums. \nThis fun and interactive workshop requires no previous skills\, and all the family can join in! \nAbout the workshop leader \nShayanna has been with the Bloco since it began. Having gone through the project from participant\, to volunteer\, to staff\, Kinetika Bloco has a special place in her heart. Shayanna can be found running workshops or managing the team on projects with any age group and all levels of knowledge. Shayanna is also a singer and songwriter\, and hosts shows with the company Blue Revolutions\, which she shares with Sam “Blue” Agard.
URL:https://musichackspace.org/event/virtual-drumming-workshop-with-shay-dyer-harris/
LOCATION:Zoom
CATEGORIES:Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Shayanna-15.001-e1590071643530.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200528T183000
DTEND;TZID=Europe/London:20200528T193000
DTSTAMP:20260403T232154Z
CREATED:20200513T145949Z
LAST-MODIFIED:20220617T102815Z
UID:10000729-1590690600-1590694200@musichackspace.org
SUMMARY:David Zicarelli: Fun with Complexity using MC in Max
DESCRIPTION:The patches demonstrated in this video are available here.\n\nMaking and controlling complex sounds on a computer\, where hundreds of events are happening simultaneously\, can be hard work. Aren’t computers supposed to make doing complex things easier? Max has a new feature called MC that can make it fun and easy to work with large numbers of simultaneous events and audio channels. You don’t need a multichannel audio system to play with MC: all the examples shown work in mono or stereo.\n\nWhat is MC?\n\nMC stands for Multi-Channel\, a way to encapsulate many audio channels and manipulate them together. It is a very useful approach for projects that have multiple channels of audio and require unique representations and flows for each of them. Check out below David’s presentation of MC at Loop in 2019.\n\nAbout David Zicarelli\nDavid is the founder and CEO of Cycling ’74\, makers of Max. David founded Cycling ’74 in 1997 to commercialise Max\, a programming language that had been invented at IRCAM in 1985 and licensed to Opcode for commercialisation in 1989. David worked at Opcode at the time\, and acquired the publishing rights to the software in 1997. Max has grown its audience over more than three decades\, in education and as a tool for artists. Ableton and Cycling ’74 later announced Max for Live\, the integration of Max into Live\, allowing Ableton Live users to use Max patches in their workflow. Cycling ’74 was acquired by Ableton in 2017\, and Max for Live has seen many improvements since\, opening up possibilities for long-time Max users to find new audiences among the Live user base.
URL:https://musichackspace.org/event/david-zicarelli-fun-with-complexity-using-mc-in-max/
LOCATION:YouTube and Facebook
CATEGORIES:Music software,Product discovery
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Thumbnails.001-e1589382043581.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200521T183000
DTEND;TZID=Europe/London:20200521T200000
DTSTAMP:20260403T232154Z
CREATED:20200511T121910Z
LAST-MODIFIED:20200521T170940Z
UID:10000726-1590085800-1590091200@musichackspace.org
SUMMARY:Amy Dickens: Inclusive Design for Digital Musical Instruments
DESCRIPTION:The 21st May 2020 is Global Accessibility Awareness Day. To celebrate\, today’s live-stream explores design practices that help make musical instruments (hardware and software) more accessible. To find out more about GAAD\, please check their website. \nShould we make instruments easier to play? This is a question that might divide professional musicians\, who have acquired their skills through hardship. But for everyone else\, there is little doubt that the fun should start as soon as we engage with an instrument. Making instruments easier to play is also a critical issue for people with disabilities\, ranging from limited mobility to visual impairment. So\, if you’re thinking of making an instrument\, why not consider lowering the barriers to playing it? \nIn this talk\, Amy Dickens will shed light on inclusive design practices for Digital Musical Instruments\, and how to make music accessible to everyone. Amy will walk us through some of the standards and best practices for accessibility\, as well as design considerations for music technology of all kinds. For those wanting to take part\, at the end of the session there will be an activity designing digital musical instruments for different levels of ability. \nAbout Amy Dickens:\nAmy Dickens is an accessibility ambassador\, Developer Advocate\, and researcher at The Mixed Reality Laboratory\, UK. As well as being an audio engineer and musician\, over the past five years Amy has been conducting research into accessible music technologies. Currently living in London with partner and Jack Russell (Moo Bean)\, Amy is working on producing a framework for accessible musical experiences and finishing their all-important PhD thesis. \nOn the subject of accessibility and music\, check out also the great work of the charity Drake Music.
URL:https://musichackspace.org/event/amy-dickens/
LOCATION:YouTube and Facebook
CATEGORIES:Instrument design
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Amy.001-e1588865712962.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200518T183000
DTEND;TZID=Europe/London:20200518T200000
DTSTAMP:20260403T232154Z
CREATED:20200507T152359Z
LAST-MODIFIED:20200518T115105Z
UID:10000723-1589826600-1589832000@musichackspace.org
SUMMARY:Tweakable: an online programming environment for music and video
DESCRIPTION:We cover Max/MSP and Jitter a lot\, as it is the most popular visual programming environment\, especially if you want to hack audio and video in your own way. But it isn’t the only visual programming language out there. We recently found out about Tweakable\, and it turns out to be quite remarkable. \nFirst\, it is entirely online. No downloads\, no sign-up: the URL above takes you directly to a web page where Tweakable runs. You might think you are launching a video when you click on the usual triangle\, but in fact you’re turning the engine on\, and everything turns out to be editable. And you can build a user interface\, too! \nI’ve managed to extract an example with Julian’s help\, see below. The original patch is available here. \n[Update] Julian has added a share option that exports the code of a patch for easy embedding. Wow. \nTweakable has a collection of simple examples to get you started\, such as the one above\, and it can get pretty complex. It’s made for musical applications\, so it has examples of algorithmic composition: canon\, fugue\, jazz\, etc. \nIt gets better: it also supports video\, through a similar patching system. The possibilities are super interesting. You can design your own audio/video project and embed it in your webpage\, and visitors can experiment with it too. \nThe author of this software is Julian Woodward. Based in Britain\, Julian (Visual Systems Ltd) began developing Tweakable five years ago as a side project\, and is now working on it full time. Julian will join us for a first public look at Tweakable on Monday 18th May. Ask questions on the live chat or on the forum.
URL:https://musichackspace.org/event/tweakable-a-online-programming-environment-for-music-and-video/
LOCATION:YouTube and Facebook
CATEGORIES:Music software,Product discovery
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Tweakable.018-e1588860085640.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200516T160000
DTEND;TZID=Europe/London:20200516T180000
DTSTAMP:20260403T232154Z
CREATED:20200501T154543Z
LAST-MODIFIED:20200501T154543Z
UID:10000797-1589644800-1589652000@musichackspace.org
SUMMARY:Introduction to Audio and Video in Max/MSP
DESCRIPTION:In this workshop\, we will use Max/MSP to create custom\, interactive A/V experiments. We will first get grounded in the concepts and workflows of Max and creative coding by making video programs in Vizzie. Then we will go through the basics of using and manipulating audio in Max through MSP objects. We will connect sound to video and vice versa\, and then create custom interfaces for our A/V experiments.  \nNo experience with programming\, music\, or video is required. Bring audio or video samples you would like to work with! \nAbout the workshop leader \nCassie Tarakajian is a software engineer and educator\, who contributes to Cycling’74 and Processing\, and teaches at New York University.
URL:https://musichackspace.org/event/introduction-to-audio-and-video-in-max-msp/
LOCATION:Online
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/cassie.001-e1588347643426.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200514T183000
DTEND;TZID=Europe/London:20200514T200000
DTSTAMP:20260403T232154Z
CREATED:20200501T175204Z
LAST-MODIFIED:20200504T102452Z
UID:10000800-1589481000-1589486400@musichackspace.org
SUMMARY:Kate Stone: interactive surfaces for music
DESCRIPTION:Kate Stone is the founder and CEO of Novalia\, a UK-based company whose mission is to create “magical” interactions. Magical is the word Kate uses to explain that she aims to hide the technology away\, and to augment our analog interaction with the world with subtle and meaningful improvements. Imagine a musical instrument\, a clarinet\, say\, equipped with invisible sensors and a Bluetooth chip\, capturing the musician’s gestures to control video or audio effects. Or a printed piece of paper with conductive ink that triggers sound. \nKate imagines a world that is more like Harry Potter than Minority Report. And although she claims she’s not a musician\, most of her work revolves around music. From the McTrax she built for McDonald’s to DJ Qbert’s album cover\, music interactions seem to be the perfect home for her technologies. And for the past year\, she’s been chair of the board of the MIDI Manufacturers Association. \nKate will be live-streaming on May 14th at 6:30pm UK time on our channel on YouTube and Facebook.
URL:https://musichackspace.org/event/kate-stone-interactive-surfaces-for-music/
LOCATION:YouTube and Facebook
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/05/Kate-Stone.001-e1588426640962.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200511T183000
DTEND;TZID=Europe/London:20200511T200000
DTSTAMP:20260403T232154Z
CREATED:20200424T143441Z
LAST-MODIFIED:20200511T140720Z
UID:10000730-1589221800-1589227200@musichackspace.org
SUMMARY:Roger Linn: designing instruments
DESCRIPTION:In 1980\, Roger Linn designed the first drum machine to use samples of a real drum kit\, the LM-1. Along with the LinnDrum and the Linn 9000\, Roger’s inventions played an important role in the sound of the 1980s\, which can be heard on recordings from Michael Jackson to Prince. \n40 years on\, Roger is still inventing instruments. His latest is the LinnStrument\, a grid-based playing surface equipped with pressure sensors. Notes are laid out in a similar way to the guitar\, and the sensors capture pressure and lateral movements to render vibratos and tremolos. \nIn this talk\, Roger will present the LinnStrument and discuss his past inventions and his approach to designing them. \nFor more information: https://www.rogerlinndesign.com/
URL:https://musichackspace.org/event/roger-linn-designing-instruments/
LOCATION:YouTube
CATEGORIES:Instrument design
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Roger-Linn.001-e1587739328449.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200507T183000
DTEND;TZID=Europe/London:20200507T193000
DTSTAMP:20260403T232154Z
CREATED:20200423T134410Z
LAST-MODIFIED:20200507T101328Z
UID:10000727-1588876200-1588879800@musichackspace.org
SUMMARY:Music in times of lockdown\, with Jordan Rudess
DESCRIPTION:Meet Jordan Rudess\, the legendary keyboardist of Dream Theater. Jordan is also very well known for embracing technology in any form that allows him to seek more expressivity and performing ability. \nJordan is at heart a music hacker who can make music with anything\, and his insight has been valued over the years by many musical instrument companies\, from KORG to ROLI and XKeys\, and he was an influential force in extending MIDI with MPE. As a touring musician\, Jordan has been affected by the lockdown\, and Dream Theater’s concerts have been canceled or postponed. \nIn this interview\, Jordan shares his thoughts on making music from home and his passion for music technology and instrument design.
URL:https://musichackspace.org/event/music-in-times-of-lockdown-with-jordan-rudess/
LOCATION:YouTube
CATEGORIES:Artist Talks
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Jordan.001-e1587649330671.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200504T183000
DTEND;TZID=Europe/London:20200504T183000
DTSTAMP:20260403T232154
CREATED:20200423T124933Z
LAST-MODIFIED:20200505T141728Z
UID:10000720-1588617000-1588617000@musichackspace.org
SUMMARY:Ocean: a new online & collaborative sequencer
DESCRIPTION:In this 8th live-stream\, we hosted Robin Hunter\, who has recently launched Ocean\, a collaborative online sequencer running in the browser. \n \nOnline collaboration for music makers has fuelled many dreams over the past 20 years\, but many of them have turned into nightmares. From Rocket Network (acquired by AVID in 2003) to LL Cool J’s Boomdizzle and Ohm Force’s ambitious Ohm Studio\, many have tried to make it easy to collaborate online. Bandlab and Soundtrap\, however\, seem to be doing quite well\, betting on growing large audiences\, in particular by targeting education. \nOcean follows in those footsteps with a very simple interface\, free of the jargon and complexity that we see in audio workstations. It’s designed to get people started immediately\, and features a fun approach to collaboration that makes it a good experience. Many more features will be needed before anyone can make a full song with Ocean\, but it’s a great first step. For those of you in education\, or in lockdown with kids\, give it a try\, it’s free! \n 
URL:https://musichackspace.org/event/ocean-a-new-online-collaborative-sequencer/
LOCATION:YouTube and Facebook
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Robin-Ocean.001-e1587646149461.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200430T183000
DTEND;TZID=Europe/London:20200430T193000
DTSTAMP:20260403T232154
CREATED:20200422T130915Z
LAST-MODIFIED:20200430T141657Z
UID:10000717-1588271400-1588275000@musichackspace.org
SUMMARY:Online talk #7: Interactive visuals with Jitter\, with Rob Ramirez
DESCRIPTION:Follow live on YouTube and Facebook. \nWhen performing live with computers\, musicians often wonder how to visually entertain the audience. Playing music from behind a laptop gives few cues as to how the music is made. The performer could be a virtuoso or writing e-mails\, and they wouldn’t behave differently. This is a departure from acoustic instruments\, where each note requires physical actions visible to the audience. \nWhen musicians work with visual artists\, they can propose a narrative and aesthetic that matches the song\, but what if you’re performing on your own? How can you produce entertaining visuals that reflect what you are doing behind the laptop? \nIn this talk\, Rob will give an overview of Jitter and show how it can be used to build visuals that connect to your audio events. \n 
URL:https://musichackspace.org/event/online-talk-7-interactive-visuals-with-jitter-with-rob-ramirez/
LOCATION:YouTube
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Rob.001-e1588251696505.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200427T183000
DTEND;TZID=Europe/London:20200427T193000
DTSTAMP:20260403T232154
CREATED:20200422T130355Z
LAST-MODIFIED:20200424T142309Z
UID:10000724-1588012200-1588015800@musichackspace.org
SUMMARY:Online talk #6: Yuri Suzuki presents the E Z Record Maker
DESCRIPTION:Yuri Suzuki is a sound artist and partner at the creative agency Pentagram. Throughout his career\, he has collaborated with Teenage Engineering\, Will.I.Am and Jeff Mills\, and designed music hardware and installations such as Ototo or the Pyramidi. \nIn this talk\, Yuri will present his latest invention\, the E Z Record Maker\, a device that allows you to cut vinyl records at home from any audio input. As of today\, the device is only available in Japan\, and will ship to Europe and the US sometime in 2020. \nhttp://www.yurisuzuki.com/
URL:https://musichackspace.org/event/yuri-suzuki-releases-vinyl-engraver/
LOCATION:YouTube
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Yuri-update.001-e1587737964377.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200423T183000
DTEND;TZID=Europe/London:20200423T193000
DTSTAMP:20260403T232154
CREATED:20200416T144210Z
LAST-MODIFIED:20200507T101300Z
UID:10000714-1587666600-1587670200@musichackspace.org
SUMMARY:Granurise: A granular synthesizer for MPE instruments
DESCRIPTION:Watch the live-stream here\, 23rd April 2020\, 18:30 BST / 10:30am PT. \nMeet Andrej Kobal\, from Slovenia. Andrej is an artist and a programmer who built GranuRise\, a rich and complex granular synthesizer for the Seaboard RISE (hence the name\, GranuRise). In this live-stream\, Andrej presents the vision he has worked on for the past 7 years\, demonstrates the product\, and shares how he built it with Max. Join the live stream to ask him questions. \n7 years in the making \nI first met Andrej Kobal in 2016\, when he shared a prototype of his granular synth\, which he had built for his own artistic practice. I was blown away by the possibilities that were already available in that version. Andrej turned out to be a perfectionist\, and it took him another 4 years to bring GranuRise to a version he felt comfortable distributing. In the meantime\, MPE (MIDI Polyphonic Expression) instruments became more popular\, and MPE support in DAWs became more pervasive\, so anyone trying GranuRise now should have an easier time than in 2016. Granular synthesis can have many control parameters\, which makes it a great synthesis technique to use with an MPE controller. You can truly feel how the Seaboard’s dimensions of control have a direct impact on the sound\, in a way you cannot achieve with a normal MIDI keyboard. \n \nGranuRise isn’t a typical VST or AU plugin: it’s a Max for Live device. As such\, it can be played within Ableton Live\, or standalone\, but unfortunately not within any other DAW\, unless you use Soundflower or similar audio re-routing software. MPE support in Ableton isn’t there yet\, but there are workarounds for Max for Live devices. \nThe user interface of GranuRise is feature-rich and intuitive. It offers micro-level controls for the grains\, and macro-level controls such as sequencing. 
The preset bank and morphing capabilities are great for storing experiments\, morphing between them to discover new sounds\, and smoothly transitioning from one state to another. Unlike most plug-ins\, GranuRise has been built for live performance\, and you can use it standalone. \nWhat is granular synthesis? \nOver the past 20 years\, granular synthesis has become a ubiquitous form of synthesis. Every mainstream plug-in company has released its own version of it\, or integrated aspects of it into its suite of plug-ins. The term was coined by Iannis Xenakis\, who conceived a theory of composition based on grains of sound (1960). Notable developments in the theory of granular synthesis were brought by Curtis Roads in his book Microsound (2001). \nEssentially\, granular synthesis proposes that sounds can be formed by the assemblage of smaller sounds\, or grains. These grains can be excerpts of a larger sound (for example the sustained part of a piano note). Each grain can be played faster or slower\, to raise or lower its pitch\, and be altered independently. The grains are then layered together with variable offsets\, frequencies\, delay lines\, feedback\, and an overall density parameter that controls the number of sounds layered together. Over time\, many more control parameters have been added. Check out Maurizio Giri’s example below for an implementation in Max. \n \nUseful links for Granular Synthesis \nList of granular synthesis software \nMax tutorial by Cycling’74 \nSupercollider tutorial by Nick Collins \n 
URL:https://musichackspace.org/event/granurise-a-granular-synthesizer-for-mpe-instruments/
LOCATION:YouTube
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Granurise-blog.005-e1587039141493.webp
END:VEVENT
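The granular synthesis walkthrough in the event description above (grains excerpted from a source sound, enveloped, and layered at varying offsets) can be sketched as a toy program. This is purely illustrative and assumes nothing about GranuRise's implementation; all function and parameter names are hypothetical, and a Hann envelope stands in for whatever grain window a real engine would use.

```python
import math
import random

def granulate(source, grain_size=2000, num_grains=50, out_len=44100):
    """Toy granular synthesis sketch (illustrative, not GranuRise's code).

    Each grain is a short excerpt of `source`, shaped by a Hann envelope to
    avoid clicks at its edges, then layered into the output buffer at a
    random offset. Raising `num_grains` gives a denser texture, which is
    the role of the "density" parameter described in the text.
    """
    # Hann window: fades each grain in and out smoothly
    env = [0.5 - 0.5 * math.cos(2.0 * math.pi * i / (grain_size - 1))
           for i in range(grain_size)]
    out = [0.0] * out_len
    for _ in range(num_grains):
        src = random.randrange(0, len(source) - grain_size)  # where to read
        dst = random.randrange(0, out_len - grain_size)      # where to place
        for i in range(grain_size):
            out[dst + i] += source[src + i] * env[i]         # layer the grain
    return out

# Example: granulate one second of a 440 Hz sine tone into a half-second texture
tone = [math.sin(2.0 * math.pi * 440.0 * n / 44100.0) for n in range(44100)]
texture = granulate(tone, grain_size=1000, num_grains=40, out_len=22050)
```

A real engine would add per-grain pitch shifting, feedback and delay lines on top of this skeleton, as the description notes.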
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200420T183000
DTEND;TZID=Europe/London:20200420T203000
DTSTAMP:20260403T232154
CREATED:20200415T153209Z
LAST-MODIFIED:20200415T161313Z
UID:10000721-1587407400-1587414600@musichackspace.org
SUMMARY:Physical modeling synthesis with Pat Scandalis
DESCRIPTION:Join the live stream on 20th April 2020\, 6:30pm BST / 10:30am PT\, or watch it later. \nIn music\, physical modeling synthesis is a technique that reproduces the sound of physical instruments by simulating them in software. Our guest speaker\, Pat Scandalis\, along with Julius Smith and Nick Porcaro\, has worked since the 90s at CCRMA\, Stanford\, to create physical models of instruments. It isn’t a very common form of synthesis\, due to the complexity of doing it well and the CPU power it requires to run. \nPhysical modeling synthesis is the key technique used in Pat Scandalis’ app GeoShred\, which he built for iOS and iPad with MoForte and Jordan Rudess. Check out the impressive reel below. \n \n  \nPhysical modeling synthesis has come a long way and is becoming a very attractive method for producing expressive sounds. Think of a physical model as the instrument itself: if it is the model of a violin\, then it has 4 strings and a body\, and it can be played with a bow\, staccato\, etc. Each of these parts plays a role in producing the sound. In a physical model\, as with the real instrument\, it is possible to alter several of these parameters at the same time. When connected to a MIDI controller with multiple dimensions of control\, a physical model instrument will offer a more reactive and richer response than traditional sampling engines\, because of the variety of dimensions that can be controlled at once. \nA great way to start experimenting with physical modeling is the Karplus-Strong model\, which gives the basics for modeling a plucked-string or percussion instrument. Sam’s tutorial\, below\, gives a quick account of how to implement it in Max (it’s nearly 10 years old\, but still applies!). \n \nThe JUCE library also has a basic implementation of the Karplus-Strong algorithm in its examples\, and you can check out the code on GitHub here.
URL:https://musichackspace.org/event/physical-modeling-synthesis-with-pat-scandalis/
LOCATION:Music Hackspace\, Somerset House\, West Service Yard\, Victoria Embankment\, London\, WC2R 1LA\, United Kingdom
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/physical-model-for-youtube.001-e1586966484288.webp
END:VEVENT
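The Karplus-Strong model recommended in the event description above fits in a few lines. This is a minimal illustrative sketch in plain Python, not the Max tutorial or JUCE example the text links to; the function and parameter names are my own.

```python
import random

def karplus_strong(frequency=110.0, sample_rate=44100, duration=1.0,
                   damping=0.996):
    """Minimal Karplus-Strong plucked-string sketch (illustrative only).

    A delay line the length of one period is filled with noise (the
    "pluck"). Averaging the two oldest samples on each pass acts as a
    low-pass filter, so the tone darkens and decays like a real string.
    """
    n = int(sample_rate / frequency)                    # delay length sets pitch
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]
    out = []
    for _ in range(int(sample_rate * duration)):
        out.append(buf[0])
        # average the two oldest samples, damp slightly, feed back in
        buf.append(damping * 0.5 * (buf[0] + buf[1]))
        buf.pop(0)
    return out

# One short pluck at 220 Hz
pluck = karplus_strong(frequency=220.0, duration=0.5)
```

The same structure maps directly onto a Max patch (a `tapin~`/`tapout~` delay line with a one-pole filter in the feedback path) or the JUCE example mentioned in the text.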
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200416T183000
DTEND;TZID=Europe/London:20200416T203000
DTSTAMP:20260403T232154
CREATED:20200411T112112Z
LAST-MODIFIED:20200415T160514Z
UID:10000718-1587061800-1587069000@musichackspace.org
SUMMARY:Make Music with the Data Universe
DESCRIPTION:Meet Milton Mermikides and Phelan Kane\, two artists on a journey to make music with data. Milton and Phelan’s approach reveals the musicality of the world’s elements. With the support of programming\, computer vision and sampling\, they identify patterns in images\, videos or abstract data to create rich compositions and sound design. Watch Milton’s TEDx talk below for a moving – yet funny – overview of his work. In our live-stream\, Milton and Phelan will explain some of the techniques used to create the compositions you can hear in the talk. \n \nEver since Pythagoras and the theory of the Music of the Spheres\, composers have been inspired to use patterns found in the universe to create music. In the 19th century\, that approach developed across the arts\, not only in music. French poet Stéphane Mallarmé explored the position of words and their relationship with meaning\, and his work famously inspired the composer Debussy\, whose work in turn inspired the painter Kandinsky. Each time\, the structure found in one work was translated to another art medium. \nOskar Fischinger’s 1938 Optical Poem\, in turn\, offers a visualisation of music\, an approach which was then popularised in Walt Disney’s Fantasia. \n \nOn Thursday 16th April\, we will hear more about this fascinating approach that inspired so many artists\, and that can inspire today’s artists and engineers to create new material from the world of data that surrounds us. \nMilton is a Reader in Music\, Head of Composition and Director of the MMus Programme at the University of Surrey. He is also a professor of Jazz Guitar at the Royal College of Music in London. \nPhelan is a music producer\, recording and mix engineer\, musician\, mastering engineer and music programmer. He is based in London and Berlin\, and will join the live-stream from Berlin.
URL:https://musichackspace.org/event/make-music-with-the-data-universe/
LOCATION:YouTube
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2020/04/Milton-and-Phelan.001-e1586966686593.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200409T183000
DTEND;TZID=Europe/London:20200409T193000
DTSTAMP:20260403T232154
CREATED:20200403T174437Z
LAST-MODIFIED:20200411T102953Z
UID:10000715-1586457000-1586460600@musichackspace.org
SUMMARY:Online talk #2: Create music in Virtual Reality
DESCRIPTION:Join the live stream on Thursday 9th April\, 6:30pm BST / 10:30am PT. \nOver the past three decades\, Virtual Reality has been a wall onto which many dreams have been projected. From Minority Report to Ready Player One\, VR inspires worlds of possibilities. Today’s world\, however\, seems far from the film industry’s depiction of a possible future. VR has been successfully adopted by a number of hardcore gamers\, but despite heavy investment from Facebook\, Samsung\, Microsoft and many others\, it is still unclear how VR will deliver entertaining experiences to general consumers. \nSo\, what about VR for music? There are options out there\, such as Drum Hero or Wave Beta. They are almost all released on gaming platforms such as Steam or Oculus\, which is where customers equipped with the necessary headsets can be found. Our guest is no exception\, having released Tranzient on both platforms. \n \nJim Simons is the founder of Alive in Tech\, the studio that released Tranzient. Jim previously worked at Focusrite and Yamaha\, and is an experienced musician. He will lead the second session of our weekly online talks with a demo of Tranzient\, and will answer live questions. You can watch the stream below when it’s live\, or after it has happened! \n 
URL:https://musichackspace.org/event/online-talk-2-create-music-in-virtual-reality/
LOCATION:YouTube
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20200402T183000
DTEND;TZID=Europe/London:20200402T210000
DTSTAMP:20260403T232154
CREATED:20200303T170927Z
LAST-MODIFIED:20200331T115225Z
UID:10000712-1585852200-1585861200@musichackspace.org
SUMMARY:Max meetup live streamed
DESCRIPTION:This Meetup will be live streamed here. \nExperienced Max users Valeria Radchenko\, JJ Burred\, Mike Zbyszynski and Nick Rothwell will share some of their practices and patches during this first online event. You will be able to ask questions in the chat and access the patches on the Music Hackspace forum. \nMax is extensively taught at Goldsmiths University\, the new home of the Music Hackspace\, where students and academics are doing remarkable things with it. Check out Prof Atau Tanaka’s research on music creation with bio-sensors\, Dr Mike Zbyszyński’s software experiments\, and much more in the Computing and Music departments. There are even some short Max courses available to anyone\, taught by Daniel Ross\, who also runs SEEM. \nWhat is Max? \n \nMax is a visual programming environment for music and multimedia that lets you build interactive media by connecting audio\, video and gestures in an infinity of combinations. You can download the Max 8 demo here.
URL:https://musichackspace.org/event/max-meetup/
LOCATION:St James Hatcham Building\, Goldsmiths\, 25 St James\, London\, SE14 6AD\, United Kingdom
CATEGORIES:Meet Ups
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20190920T140000
DTEND;TZID=Europe/London:20190920T190000
DTSTAMP:20260403T232154
CREATED:20190705T115230Z
LAST-MODIFIED:20190918T073245Z
UID:10000708-1568988000-1569006000@musichackspace.org
SUMMARY:A Human-Data Interaction workshop on Music and AI
DESCRIPTION:Will AI-generated music one day replace human-composed music? Can machine learning help us model listening habits or instrumentalists’ gestures? Can deep learning model musical style or create novel forms of sound synthesis? \nWe seek to separate fantasy from reality\, and to identify realistic research challenges in the midst of all the hype\, by organising a half-day event at Somerset House. On Friday September 20\, 2019\, from 2:00pm to 6:00pm\, we will hold a workshop on Music and Artificial Intelligence. It takes place as part of a UK Research & Innovation (UKRI) funded research network\, HDI – Human-Data Interaction: Legibility\, Agency\, Negotiability. \nThe event will bring together leading researchers\, musicians and social commentators to discuss concrete challenges and opportunities that technologies of information processing bring to music and the creative industries. \n \nThe discussions will help to shape a subsequent Call for Proposals to fund projects that will put academics into partnership with artists and industry to create new systems and critical studies looking at the deployment of AI and machine learning technologies in creative acts of music making\, performing and listening. \nThe workshop is organised by Prof Atau Tanaka of the Embodied Audiovisual Interaction (EAVI) unit at Goldsmiths\, in collaboration with Music Hackspace. \nThe venue is located on the ground floor of Somerset House and is fully wheelchair accessible with an accessible toilet. If you have any other access requirements\, please let us know and we will do our best to accommodate them. \nIf you have any questions about this event or any of our workshops\, please contact workshops@stagingmhs.local. \nImage credit: John Hersey
URL:https://musichackspace.org/event/human-data-interaction-workshop-music-ai/
LOCATION:Somerset House Studios\, River Rooms\, New Wing\, Somerset House\, Strand\, London\, WC2R 1LA\, United Kingdom
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2019/07/credit_John_Hersey-e1562327538334.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20190904T183000
DTEND;TZID=Europe/London:20190904T210000
DTSTAMP:20260403T232154
CREATED:20190514T082346Z
LAST-MODIFIED:20190903T163636Z
UID:10000794-1567621800-1567630800@musichackspace.org
SUMMARY:Spatial Sound Meetup #4 with BBC R&D
DESCRIPTION:A new series of monthly meetups taking place at Somerset House Studios\, focusing on new spatial sound technologies\, with talks and discussions around creative uses of spatial and AR audio. \nThe meetups will take place on the following dates\, and you can register here: \n12th June – Professor Zoran Cvetkovic from King’s College \n3rd July – Contributors TBA \n7th August – Contributors TBA \n4th September – Contributors TBA \nEach session will be hosted by BBC R&D and will feature a presentation by invited guests. \nBBC R&D has been at the forefront of innovation in media technology since the founding of public service broadcasting in the UK. \nBased in research labs in the North and South of the UK\, the department comprises just over 200 highly specialised research engineers\, scientists\, ethnographers\, designers\, producers and innovation professionals working on every aspect of the broadcast chain\, from Audiences\, Production and Distribution right through to the Programmes themselves. \nThe venue is located on the ground floor of Somerset House and is fully wheelchair accessible with an accessible toilet. If you have any other access requirements\, please let us know and we will do our best to accommodate them.
URL:https://musichackspace.org/event/spatial-sound-meetup-4-bbc-rd/
LOCATION:Somerset House Studios (G18)\, Somerset House Studios\, Lancaster Place\, London\, United Kingdom
CATEGORIES:Meet Ups
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2019/05/Copy-of-_MG_0157-1-e1557401880436.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20190831T110000
DTEND;TZID=Europe/London:20190831T180000
DTSTAMP:20260403T232154
CREATED:20190508T164315Z
LAST-MODIFIED:20190629T164039Z
UID:10000703-1567249200-1567274400@musichackspace.org
SUMMARY:Creative Electronics Laboratory with XNAME
DESCRIPTION:This workshop guides participants through the construction of a small robot animated by sunlight. After a brief introduction to the rudiments of electronics\, the focus is on building a small oscillator\, soldering all components directly onto the microchip. When hit by light\, the robots emit sound. When connected to a speaker\, they turn into synthesisers that can be modulated with intermittent and coloured lamps. Depending on the components selected and the creativity of each participant\, every bot will be unique in sound and appearance. \nAll the necessary components will be distributed at the beginning of the workshop\, and every participant can keep the synth they build. No previous experience in electronics is necessary to take part in this workshop. \n\n \n  \nXNAME is a musician and instrument designer. You can find out more at her website (contains flashing lights). \nIf you have any questions about this event or any of our workshops\, please contact workshops@stagingmhs.local. \nWe can offer a student discount of 10% off this workshop. Please email us for a discount code. \nThe venue is located on the ground floor of Somerset House and is fully wheelchair accessible with an accessible toilet. If you have any other access requirements\, please let us know and we will do our best to accommodate them.
URL:https://musichackspace.org/event/creative-electronics-laboratory-xname/
LOCATION:Somerset House Studios\, New Wing\, Room G16\, Entrance via Lancaster Place\, Somerset House\, London\, WC2R 1LA\, United Kingdom
CATEGORIES:Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2019/05/BOT6-e1557824428893.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20190830T183000
DTEND;TZID=Europe/London:20190830T210000
DTSTAMP:20260403T232154
CREATED:20190509T162206Z
LAST-MODIFIED:20190629T163934Z
UID:10000791-1567189800-1567198800@musichackspace.org
SUMMARY:Ritual Laboratory #6; a technology to transform consciousness
DESCRIPTION:We continue our series of participatory rituals throughout the summer\, based on the 12 mythic Labours of Herakles. Forming part of an ongoing research and development project\, each event considers the relationship between sound and movement performance\, and altered states of consciousness. \nEach session of the laboratory focuses on balancing a sequential zodiacal energy. We begin with Aries – The Head (Thoughts) and work down to Pisces – The Feet (Deeds). The summer series includes: \n28th May – Gemini (Polarity) \n25th June – Cancer (Intuition) \n31st July – Leo (Expression) \n30th August – Virgo (Cultivation) \nIn each session we will utilise a variety of techniques that function as rites of passage. \n \nThe embodiment of each myth functions as an interface to construct the components of ritual form\, and to explore the subjective impact of the trance-inducing use of sound\, movement\, light\, scent\, and embodied narrative. \nEach Labour of Herakles\, as a narrative score\, catalyses a dialogue between musicians and performers\, facilitating an immersive experience of the myth. These 12 Labours share significant parallels with the astronomical movement of the Sun\, projected through the symbolism of the zodiacal constellations in a calendar year. Harnessing the cyclical changes in our natural environment as a basis for structure and objectivity\, we aim to fall into rhythm with the cycles of the natural world. \nAn experimental laboratory of 12 events. No prior knowledge required. Everyone welcome! \nWe will begin each session by discussing the symbolism of the myth and how the Labour will play out. Secondly\, we will explore features of trance induction\, hypnotic phenomena and their role in creative practice. Lastly\, we will perform the myth together\, and share feedback afterwards. \nInstrumentation will vary as we explore the intersection of traditional trance instruments and contemporary production tools. 
\nThis series is organised by Nicole Bettencourt Coelho. Nicole is a qualified hypnotherapist\, practicing sound therapist and member at Music Hackspace / Somerset House Studios. She is presently working on the composition of improvised choreographic scores\, drawing from her experience as a touring session musician to foster an interdisciplinary approach. Conducting an investigation at the intersection of traditional therapeutic instruments and modern production tools\, she is exploring the ability of sound and ritual form to support and direct the transformative power of performance\, as well as generating dialogue around the sociological impact of evolving tools on culture and on fulfilling creative practice. Her current studies focus on communal performance in the context of astronomical myth\, as a technology to transform consciousness\, combining archetypal symbols and storytelling with therapeutic and artistic applications of intuitive sound and movement to induce and support inward journeys within a collective experience. Using a variety of techniques to consciously compose and utilise alternate states of awareness\, she seeks to create opportunities to foster a healthy relationship with the creative impulse on an individual and collective level. \nAs part of her therapeutic practice CURVE ASC\, she is currently exploring the interplay of energies moving through the body and the cyclic rhythms of the environment in the context of herbalism\, hosting a series of seasonal sound baths with local herbs. This is an embodied study of the natural rhythms of the dialogue between body and eco-system: a reciprocal dance through time. These are opportunities for reconnection\, restoration and reflection. \nThe venue is located on the ground floor of Somerset House and is fully wheelchair accessible with an accessible toilet. If you have any other access requirements\, please let us know and we will do our best to accommodate them.
URL:https://musichackspace.org/event/ritual-laboratory-6-technology-transform-consciousness/
LOCATION:Somerset House Studios\, New Wing\, Room G16\, Entrance via Lancaster Place\, Somerset House\, London\, WC2R 1LA\, United Kingdom
CATEGORIES:Workshops
ATTACH;FMTTYPE=image/webp:https://musichackspace.org/wp-content/uploads/2019/05/8W0A6646-e1557400289103.webp
END:VEVENT
END:VCALENDAR