This thread is devoted to discussion of 3D immersive music creation.
This caught my attention because of this video, a masterclass on quadraphonic electronic music by Suzanne Ciani made in cooperation with Loopop. The video is 40 minutes of intense investigation – and you really need to watch the whole thing.
The principal hardware involved here is the Buchla 227e, an Eventide H9 adapted with CV control, and an iPad.
This seems like something that really needs to grow as a method of electronic music creation. I’d love to hear your thoughts and ideas about this. Everyone is welcome, especially if you currently make 3D immersive music.
There’s a festival in Germany, Antaris, where they have a quadraphonic setup.
I know Eat Static played a quadraphonic set there.
I guess its selected individual outs just get quadraphonic panning applied.
I can’t watch the video now (at work), will check it out tonight, but I looked into what quadraphonic sound is and found it’s now called 4.0 surround sound… so would quadraphonic music be like panning, but instead of just left & right, it’s front left, front right, back left & back right (4 corners)? Each sound in its own space? Is this correct?
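That’s the basic idea, yes: instead of one pan axis you get two, left/right and front/back, and each sound gets its own position between the four corner speakers. As a rough sketch of what a quad panner does under the hood (just illustrative Python, not anything from the video), you apply an equal-power crossfade on each axis and multiply the results:

```python
import numpy as np

def quad_pan(mono, x, y):
    """Equal-power pan of a mono signal across four corner speakers.

    x: left/right position, 0.0 = full left, 1.0 = full right
    y: front/back position, 0.0 = full back, 1.0 = full front
    Returns (front_left, front_right, back_left, back_right).
    """
    # Equal-power crossfade on each axis keeps perceived loudness constant
    left, right = np.cos(x * np.pi / 2), np.sin(x * np.pi / 2)
    back, front = np.cos(y * np.pi / 2), np.sin(y * np.pi / 2)
    return (mono * left * front,   # front left
            mono * right * front,  # front right
            mono * left * back,    # back left
            mono * right * back)   # back right

# Example: a 440 Hz tone placed in the front-right quadrant
sr = 48000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
fl, fr, bl, br = quad_pan(tone, x=0.8, y=0.9)
```

Animate x and y over time and you get the kind of circling, swooping movement Ciani does with the 227e’s joystick.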
Another force driving quadraphonic and 3D immersive sound is the phenomenal development and investment in VR over the past few years, and the need for better audio to support it.
Here’s a non-profit, Envelop, based in San Francisco, CA, that has a performance space with 32 speakers for a complete spherical 3D immersive audio experience. They also offer free Max for Live software, which can work on systems with as few as 4 speakers.
Gray Area (SF) - Don Buchla memorial festival (I believe this included Suzanne Ciani and possibly one other spatialized performance)
Lampo (Chicago) - Keith Fullerton Whitman performance
Exploratorium / Kanbar Forum (SF) - Thomas Dimuzio (spatialized performance in a 100+ speaker venue)
It’s really a treat to hear someone who knows what they’re doing envelop you in four different sounds in all four corners of the room and play with your audiospatial perception. Highly recommended if you get the chance.
This is great! Good on Zoom for being so progressive.
And it’s only $350.
And look at/listen to this. It’s a 360 video (on YouTube), so you can grab the screen while it’s playing, pan around, and look up and down. As you move around, the sound in front of you changes. The equipment for shooting this would probably cost a bit, but doing the audio for it doesn’t – at least not anymore.
Good to know – I was looking at the picture showing the equipment used for this shoot, which I couldn’t even begin to detail, but it looks to include 16 items around the outside, plus three fish-eye thingies, and then the Zoom Apollo Mission Capsule docked to its Ambisonic mic at the top. My ignorance is showing.
Excellent question – I don’t know. I did notice that the 360 video references a production company, VR Playhouse of Los Angeles, so they’d be the ones to ask. Hey, that would be a neat place to work, wouldn’t it?
It might be some of their own software; I see they create that too.
Here’s another video from Zoom with detail on the H3-VR:
ADDED: Here’s at least a little part of an answer to your question, Ryan. According to the Zoom H3-VR website:
The H3-VR’s companion Zoom Ambisonics Player software takes the work out of post-production by easily converting your Ambisonics files to Stereo, Binaural, 5.1ch surround and more. Play, trim, re-orientate and export your audio files in minutes.
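For anyone wondering what that conversion involves: a first-order Ambisonics (B-format) file has four channels, W (omni), X (front/back), Y (left/right) and Z (up/down), and every output format is basically a different weighted mix of them. Here’s a very rough sketch of the idea in Python (assuming traditional FuMa channel scaling, and definitely not Zoom’s actual code):

```python
import numpy as np

def bformat_to_stereo(w, x, y, z):
    """Decode first-order B-format to stereo with two virtual cardioid
    mics aimed hard left and hard right (FuMa-style W scaling assumed).

    w, x, y, z: NumPy arrays holding the four B-format channels
    (omni, front/back, left/right, up/down). Z is ignored because this
    is a horizontal-only decode.
    """
    left = 0.5 * (np.sqrt(2.0) * w + y)   # virtual mic pointing +90 degrees
    right = 0.5 * (np.sqrt(2.0) * w - y)  # virtual mic pointing -90 degrees
    return left, right

def rotate_soundfield(w, x, y, z, yaw_deg):
    """'Re-orientate' the recording after the fact by rotating the
    horizontal components, e.g. to change which direction counts as front."""
    a = np.radians(yaw_deg)
    x_rot = np.cos(a) * x - np.sin(a) * y
    y_rot = np.sin(a) * x + np.cos(a) * y
    return w, x_rot, y_rot, z
```

Binaural and 5.1 decodes work on the same principle, just with more virtual directions (plus HRTF filtering in the binaural case), which is why one Ambisonics recording can be exported to all of those formats.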