(This is a post about my visualist set for the Octopus Project’s 8-channel audio performance, Hexadecagon.)
Early Friday was a marathon of putting out technical ‘fires’, figuring out how things would fit in the actual tent that was built, praying to heathen idols to keep everything working, and then clenching our teeth and jumping in.
I’m happy to say that everything worked better than we could have hoped, and the turnout was so large that by the second show we simply had to open the flaps of the tent to include the people who couldn’t get the entry wristbands we were using to keep count of capacity.
I’m counting on some reviews of the show from the audience’s point of view to go up shortly, so this article is going to be a technical run-through of the video portion of the show. Hopefully some of the things I learned will help other people experimenting with live visuals.
Our base VJ environment is VDMX, and I put the set together as a series of full presets- one for each song. Our setup was slightly unusual: we used two dedicated MacBook Pros, each running a VDMX set, and each connected to two HD projectors via a Matrox DualHead2Go (more on those later).

The source clips were all shot, animated, and composited by myself and the band. All the motion graphics were done in Apple Motion, and all the footage was shot with a Casio EX-F1, a still camera that shoots at incredibly fast speeds (up to 1200 fps, with diminishing resolution the faster you shoot). I covered up some of the artifacting in the slow-motion footage with a lot of composited film grain. Toto Miranda from the band cranked out a lot of interesting character designs, and I animated them either in Motion or in ToonBoom, an application I kind of hate, but (oddly) I don’t know Flash, and it’s the only animation app I could find that lets you do traditional-style animation with a vertical X-sheet.
In addition to our original clips, I asked Dan Winckler, a visualist and programmer I knew from his excellent OpenEmu project, to help with some programming. VDMX lets you trigger video clips with MIDI, OSC, or keyboard commands, but we wanted some additional small animations with transparent backgrounds to trigger along with individual notes from the set. Dan built us a Quartz Composer patch that optimized playback performance for the small movies we created, and we used it in a few different ways during the set. He also made us a very cool Max/MSP patch that (along with another Quartz composition) listens to the pitch of the theremin on stage, letting us accompany one song with minimalist color fields that react to the sound of the theremin. We got the final version of this one at the last second, and I managed to lose sound input to it during the set, but luckily Dan had provided alternate controls, so I was able to follow along with Yvonne’s theremin playing and adjust the colors manually. (This setup worked well in subsequent shows, and we eventually added a DMX bridge that allowed us to control RGB stage lights with the same patch. -w)
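For the curious, the core of the pitch-to-color idea is small enough to sketch out. This isn’t Dan’s patch (that was Max/MSP plus Quartz Composer); it’s a rough Processing equivalent of the same concept, and it assumes a pitch tracker somewhere else is sending the theremin’s pitch as a float over OSC to an address I’ve made up:

```processing
// Rough Processing equivalent of the pitch-to-color-field idea.
// NOT Dan's actual patch: it assumes a pitch tracker elsewhere is
// sending the theremin's pitch (in Hz) to the made-up OSC address
// /theremin/pitch on port 9000.
import oscP5.*;

OscP5 osc;
float hue = 0;  // current hue, driven by incoming pitch

void setup() {
  size(1024, 384);
  colorMode(HSB, 360, 100, 100);
  osc = new OscP5(this, 9000);  // listen for OSC on port 9000
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/theremin/pitch")) {
    float hz = m.get(0).floatValue();
    // Map a rough theremin range onto the full hue wheel
    hue = map(constrain(hz, 100, 1500), 100, 1500, 0, 360);
  }
}

void draw() {
  // A flat, minimalist color field at the current hue
  background(hue, 55, 90);
}
```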
Josh and I are both really interested in the work of John Whitney, and the last song in the set featured one of Whitney’s Pascal programs, re-written in Processing by a member of the OpenProcessing community named Jim Bumgardner, who was generous enough to let us use his code. I wrote QuickTime video out from a portion of his sketch using Processing’s MovieMaker library and manipulated it in Motion. Jim deserves a shout-out here as well.
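Jim’s code is his to share, but the kernel of Whitney’s idea is simple enough to paraphrase: put a row of points in motion and advance each one at a rate proportional to its index, so the whole field drifts in and out of alignment. A minimal Processing version of that idea (mine, not Jim’s) looks something like this:

```processing
// A minimal paraphrase of Whitney's differential-motion idea
// (my sketch, not Jim Bumgardner's code): each point orbits at a
// rate proportional to its index, so the field drifts in and out
// of harmonic alignment.
int n = 180;   // number of points
float t = 0;   // master clock

void setup() {
  size(768, 768);
  noStroke();
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  fill(255);
  for (int i = 1; i <= n; i++) {
    float angle = t * i;               // the i-th point moves i times as fast
    float r = i * (width * 0.45 / n);  // points spaced evenly along the radius
    ellipse(r * cos(angle), r * sin(angle), 4, 4);
  }
  t += 0.0005;
}
```

From a sketch like this, the MovieMaker class in Processing 1.x’s video library (addFrame() at the end of draw(), finish() when you’re done) will write the frames out to a .mov you can pull into Motion.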
Once we had all our content ready, everything was placed in VDMX, and clips were given MIDI triggers from the corresponding Ableton Live sets that accompanied most of the songs. The show was an 8-channel audio experiment- Ableton routed the audio for some songs, while a hardware solution, the 4ms Pedals Bend Matrix, handled others. Ableton also handled samples throughout the set. To trigger other clips, and to open each consecutive song preset, I used a Nintendo Wiimote; VDMX has a built-in Wiimote plugin. I connected the two computers via ethernet and used an OSC output plugin on one computer to repeat the Wiimote commands to the second machine- thus I was able to control both computers at once with a very simple and responsive wireless controller. I found a pink Wiimote that matched our show poster at a local game store (the guy working the counter looked at me like I’d wandered in wearing a tutu when I asked to buy it).
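VDMX’s plugins handled the actual forwarding for us, but the repeat idea itself is tiny; here’s the concept as a Processing sketch, with the IP and ports made up for illustration:

```processing
// The repeat idea in miniature (in practice, VDMX's OSC output
// plugin did this for us): every OSC message received locally is
// re-sent, verbatim, to the second machine over the ethernet link.
// The IP and ports are made up for illustration.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress secondMachine;

void setup() {
  osc = new OscP5(this, 9000);                          // local listen port
  secondMachine = new NetAddress("192.168.1.2", 9000);  // the other MacBook
}

void oscEvent(OscMessage m) {
  osc.send(m, secondMachine);  // forward every message as-is
}

void draw() {
  // nothing to draw- the sketch just needs to stay running
}
```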
We had initially done a lot of work trying to send the MIDI data over OSC from the Ableton machine to the VDMX machines using a variety of software: a custom app from the VDMX developers, Max for Live, and OSCulator. In the end, we became so worried about the complexity of the setup and about wireless interference that we used wired MIDI instead, and I’m glad we did. You can string incredibly long MIDI cables without issue, and it worked the instant it was plugged in- no additional software or configuration needed.
The projector setup created a giant field of pixels to output to: two 2048×768 ‘screens’, each split across two projectors. (The original “8 screens of video” concept became more and more abstract and meaningless as we worked on the project.) That meant we had to be able to push many layers of HD video from our machines without dropping frames. Early in creating the show we had terrible performance problems, even with HD clips at very low data rates (we did all our compression using MPEG Streamclip, a free program that has saved my skin professionally more times than I can count). Finally, we realized that one of our primary problems, in addition to drive speed and fragmentation, was that the vibrations from the massive noise we were creating during rehearsal were causing read errors on our drives. After a little research I found these ExpressCard solid-state volumes, and once our media was on them, the performance problems stopped. I kept the clips at half size though (1024×384), and found that was the best balance of performance and resolution.
One of the biggest headaches of the whole show for me was dealing with the Matrox DualHead2Go boxes. These essential pieces of hardware got us running by creating a giant virtual monitor out of two separate projectors, but they made us pay for it every step of the way. The Matrox boxes use incredibly hacky, un-Mac-like configuration software to inject extra resolutions into the computer’s display preferences. The Matrox software is a Java-app-looking UI salad that immediately plops you painfully into Windows-land: ambiguous controls in a slurry of confusing and unnecessary interfaces. The box draws power from USB, which it tells you to connect to your computer. Our projectors were positioned far from the VDMX computers and we didn’t have USB cables long enough to reach them, so we ran USB from the Matrox boxes to AC adapters. This worked sometimes. Other times I had to race from the video pit to where the Matrox boxes were mounted, plug into them directly, select the right resolution, put the computer to sleep, then race back to my area and plug back into the VGA there. Scary, stupid… I wish there were a better solution. Next time I will have a crazy-long USB cord made.
I’m happy to answer any questions anyone has about the video setup. It was a stressful but rewarding experience. The band is, and has always been, a joy to work with: incredibly hard-working, cool, and generally some of the plain old nicest people I have ever known. All the crew people on site were great, and we couldn’t have done it without them. Dan Winckler is a badass, and I hope to keep working with him into the future; I think we made a valuable friend. Danielle Thomas, our producer, did an amazing job of wrangling a million disparate elements and keeping things afloat. Michael Bepko from Whole Foods achieved the remarkable goal of allowing something this weird to even exist in such a big way. Thanks, all.
The show seems to have been a success. There may be additional developments in the months to come.