Read about the work done on Arcade Fire’s “The Wilderness Downtown”. The 3D was done by Magoo in Söderhamn.
Working with integrated production shop B-Reel, director Chris Milk and Google Creative Lab Technology Lead Aaron Koblin teamed up once again on a music-inspired digital experience, The Wilderness Downtown, an HTML5 interactive music video/Google Chrome Experiment that makes use of the browser’s features and various Google offerings to create a surprisingly nostalgic and emotional interpretation of the track “We Used to Wait,” off Arcade Fire’s album The Suburbs.
Speaking of the suburbs, that’s what you’ll experience here if you grew up in them. The site asks visitors to input the address of their childhood home, which gets integrated into the story via Google Maps and Street View. Multiple windows open up in time with the track’s various beats, and scenes of your hometown streets become peppered with new life forms, from flocks of birds to lush trees. The growth doesn’t stop there, however. An interactive postcard visitors create completes the experience in the real world: it becomes potential content for the visuals of the upcoming Arcade Fire Suburbs tour and gets entered into The Wilderness Machine, which will actually print their message on a seed-embedded card that, when planted, will grow a real, live tree.
B-Reel Creative Director Ben Tricklebank and Producer Nicole Muniz give us the lowdown on how the project came together.
On What the Project Is:
Nicole Muniz: Featuring Arcade Fire’s new single “We Used To Wait” from their latest album The Suburbs, The Wilderness Downtown is an interactive music video built in HTML5, using Google Maps and Street View, for Google Chrome Experiments. The film takes a more personal approach by prompting users to input an address from their childhood, which in turn places them at the center of the film’s narrative. Viewers then see themselves in the film as they run through the streets of their old neighborhood and finally reach their childhood home. This is tied very closely to the lyrics of the song, making for an even greater emotional experience.
On How it All Came Together:
Ben Tricklebank: We started talking to Google while we were finishing up the Chrome Fastball project, which went live at the end of June this year. Sandra Nam at Google sent over Chris’s treatment, which we were all blown away by. After several meetings and conference calls with Chris and Aaron we were ready to begin. Initially the collaboration was handled in a very remote way: Chris and Aaron on the West Coast, some of our developers and our 3D partners in Europe, and us here in New York. This worked well in the early stages, as there was a great deal of R&D and structure to be established before we could get into creating the individual components. At the end of July Chris came to New York so we could work very closely together on finalizing all the assets. Working with Chris and Aaron was a really enjoyable and collaborative experience. One of the most important aspects, and one that we believe helped determine the overall success, was the open collaboration that we achieved between everyone involved.
On the Creative & Technical Process:
BT: One of the major challenges of this project was finding a method to control and sequence the multiple windows needed to display the components of the film in sync with the music. At the early stages we created the framework of a custom application that would allow us total control over the timing, size and position of the windows at any point in the song. This tool evolved over the course of the project into a pretty sophisticated utility that was capable of not only controlling the windows but also assigning content within them with extreme precision and publishing directly to a preview link running from our development servers. Simultaneously we worked to develop the interactive and dynamic components as self-contained units. This meant that each piece could be created and tested in a stand-alone environment before being combined into the sequence. We tested various forms of 2D and 3D flocking interaction for the birds. Google Maps and Street View posed many unique challenges due to the way we needed to control very precise distances and movements while compositing animated elements in real time. The drawing and typing functionality for the postcard, which was based on a previous HTML5 experiment called the Harmony tool, was modified and adapted to create a very organic and tactile brush that resembled the growth of tree roots and branches. All these elements then had to be tested running alongside our film assets to ensure the result would perform correctly for the end user.
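The window-sequencing idea described above can be sketched in a few lines. This is a minimal illustration, not B-Reel’s actual tool: the cue list, window names and sizes are all invented, and in the browser the actions would map to `window.open`, `moveTo`, `resizeTo` and `close`, driven by the audio element’s `currentTime` so the choreography stays locked to the song.

```javascript
// Hypothetical cue list: what should happen to which window, and when
// (seconds into the song). All values here are illustrative.
const cues = [
  { time: 0.0,  window: 'main',  action: 'open',  x: 100, y: 80,  w: 640, h: 360 },
  { time: 12.5, window: 'birds', action: 'open',  x: 760, y: 80,  w: 320, h: 240 },
  { time: 30.0, window: 'birds', action: 'move',  x: 760, y: 340 },
  { time: 45.0, window: 'birds', action: 'close' },
];

// Return the cues that should fire between the previous tick and now.
// Reading `now` from the audio clock (rather than counting frames) keeps
// the windows in sync with the music even if rendering stutters.
function dueCues(cues, prevTime, now) {
  return cues.filter(c => c.time >= prevTime && c.time < now);
}

// Example tick loop. In the browser this would run on requestAnimationFrame
// with `now = audio.currentTime`; here we simulate a few coarse ticks.
let prev = 0;
for (const now of [10, 20, 31, 50]) {
  for (const cue of dueCues(cues, prev, now)) {
    console.log(`${now}s: ${cue.action} "${cue.window}" window`);
  }
  prev = now;
}
```

Keeping the cue data separate from the playback loop is also what makes a sequencing tool like the one described possible: the timeline can be edited, previewed and republished without touching the runtime code.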
On Production Hiccups:
BT: When we set out on this project there was definitely a sense of stepping into unknown territory. With multiple videos running in separate windows while simultaneously layering animation over dynamic content from Google Maps and Street View, we definitely had some concerns over performance and the experience for viewers. That said, we went through iterations that pushed us way beyond the capabilities of most machines. This way we could really identify how far we could go in the final piece. We believe in the end we made only minor changes to the original plans. In the first act, for instance, the transition from the birds in the sky to the overhead maps view had originally been planned as a 270-degree camera tilt, which turned out to be way beyond the capabilities of the current platform. Also in the third act, where the CGI runner is being attacked by the birds and dodging exploding trees, we had experimented with layering up to six simultaneous video windows and switching layer orders to really build on the energy in the final part of the song. When we look back now, though, we think we found the right balance; after all, there’s a great deal of visual information for people to take in.
On What Saved the Production:
BT: Aside from the obvious technical and creative challenges of working in a new environment we also had to deal with working across multiple time zones. For a little over the first half of the project Chris was working from L.A., [designer/developer Ricardo Cabello, aka] Mr Doob in London, Magoo (3D partner) in Sweden and us in New York. Needless to say Skype was an essential part of our workflow.
On Something Cool You May Not Know:
NM: One of the key moments of interactivity in the experience appears in the second act, when one of the pop-up windows invites users to type or draw a message of advice to their younger self. This is what we call the “postcard.” At the end of the experience, viewers can continue to engage with the postcard by continuing their message (or starting over if they wish). Once happy with the message, they can submit it to “The Wilderness Downtown.” What viewers may not know is what happens then: their postcard is given a code, which is reflected in the bottom-right corner. They also get a special URL for their virtual “post box.” Now is where it gets interesting: their postcard is submitted as potential visuals for the Arcade Fire Suburbs tour and is submitted to be printed by The Wilderness Machine.
The Wilderness Machine is a special creation that will print submitted postcards (with their code) on special cards that have birch seeds embedded within. Fans at the concerts can print a postcard and reply to the sender using the code; the sender can see this response in their Wait Area. On top of that, fans who receive a printed postcard from The Wilderness Machine can plant it and a tree will grow!
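The organic, root-like brush described earlier (adapted from Mr.doob’s Harmony experiment) rests on a simple trick that can be sketched in plain JavaScript. This is an illustration of the general technique behind Harmony-style “sketchy” brushes, not the project’s actual code: as the pointer moves, each new stroke point is also connected back to earlier points within some radius, so dense strokes sprout a web of fine lines resembling roots and branches. The radius and inward-pull factor here are invented values.

```javascript
// Assumed maximum distance for a connector line (illustrative value).
const NEIGHBOR_RADIUS = 30;

// Given the stroke points drawn so far and a new point `p`, return the extra
// line segments to draw: one per earlier point that lies close enough.
// Pulling each segment's endpoints slightly toward each other (factor 0.3)
// shortens the connectors and gives the stroke a tapered, organic look.
function connectorSegments(points, p, radius = NEIGHBOR_RADIUS) {
  const segments = [];
  for (const q of points) {
    const dx = q.x - p.x, dy = q.y - p.y;
    const d = Math.hypot(dx, dy);
    if (d > 0 && d < radius) {
      segments.push({
        x1: p.x + dx * 0.3, y1: p.y + dy * 0.3, // start, pulled toward the neighbor
        x2: q.x - dx * 0.3, y2: q.y - dy * 0.3, // end, pulled back toward the new point
      });
    }
  }
  return segments;
}

// In the browser each segment would be stroked on a <canvas> 2D context with
// low alpha; here we just count how many connectors a tight cluster produces.
const stroke = [{ x: 0, y: 0 }, { x: 10, y: 0 }, { x: 20, y: 5 }, { x: 200, y: 200 }];
console.log(connectorSegments(stroke, { x: 15, y: 2 }).length); // 3: the far point is skipped
```

Because the connector count grows with point density, brushes like this naturally reward slow, deliberate strokes with thicker, more tangled growth, which fits the tree-roots feel the team describes.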