As part of their Remix program, Adobe has spent the last 4 years asking the creative community to remix their iconic ‘A’ logo. This year, they reached out to us here at S1T2 and invited us to put our spin on the logo for the 2017 Adobe Marketing Symposium at the Sydney Opera House.
With only a month to work with, we threw caution to the wind, opening the show with a live remix that used dance and music performance data to generate real-time visuals.
Generating the idea
As automation and artificial intelligence (AI) encroach on even the creative industry, we wanted to use this opportunity to explore the relationship between a creative and their tools. Our goal was to create a moving, symbiotic duet between a pianist and a dancer, and then to use the live data from this performance to visually explore the creative process through real-time computer graphics.
In the performance, our pianist, Gavin Ahearn, played the role of creative, exploring his canvas through the rhythmic beats of the piano's gradually building melody. Meanwhile, our dancer, Naomi Hibberd (decked out in a ninja-like mocap suit), took on the role of AI muse, translating Gavin’s inputs into a whole new plane of creativity.
We then used real-time graphics to bring this relationship to life on stage. Using data captured from both Gavin and Naomi during the performance, we mapped the music and movement onto the screen in a vibrant display of dynamically generated particle effects and ribbon systems. By the end of the performance, these effects would crescendo into our reimagining of the iconic Adobe logo as a creative collaboration between human and computer.
Experimental tech
Christopher Panzetta, Project Lead
We have a motion capture setup in the studio, so the original idea was to install these cameras on stage – which is what our timelines and concept were counting on.
But it quickly became clear that we weren’t going to be able to make this happen, due to the realities of bump-in and rehearsal times.
So we sent out an SOS beyond our shores and hooked up with the team at Rokoko in Denmark, who were developing an experimental, inertia-based motion capture suit. The suit essentially works as if you had mounted 19 iPhones onto a leotard and used their live data to drive a skeletal system in real time.
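We obviously can’t reproduce Rokoko’s own SDK here, but the underlying idea is simple enough to sketch: each of the 19 sensors streams an orientation for one joint, and a forward-kinematics pass walks the skeleton hierarchy to recover bone positions every frame. The structures and names below are purely illustrative.

```cpp
// Illustrative sketch only - not Rokoko's actual SDK. Each sensor reports an
// orientation for one joint; world positions fall out of walking the skeleton
// hierarchy (forward kinematics) once per frame.
#include <array>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

// Hamilton product: combine two rotations.
Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Rotate a vector by a unit quaternion.
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 u{q.x, q.y, q.z};
    Vec3 t{2*(u.y*v.z - u.z*v.y), 2*(u.z*v.x - u.x*v.z), 2*(u.x*v.y - u.y*v.x)};
    return { v.x + q.w*t.x + (u.y*t.z - u.z*t.y),
             v.y + q.w*t.y + (u.z*t.x - u.x*t.z),
             v.z + q.w*t.z + (u.x*t.y - u.y*t.x) };
}

struct Joint {
    int  parent;         // index of parent joint, -1 for the hip/root
    Vec3 restOffset;     // bone offset from the parent in the rest pose
    Quat sensorRotation; // latest orientation streamed from that joint's sensor
    Quat worldRotation;  // accumulated down the chain each frame
    Vec3 worldPosition;  // what the visuals actually consume
};

// Joints are assumed to be ordered parent-before-child.
void solveSkeleton(std::array<Joint, 19>& joints) {
    for (auto& j : joints) {
        if (j.parent < 0) {                          // root joint
            j.worldRotation = j.sensorRotation;
            j.worldPosition = {0, 0, 0};
            continue;
        }
        const Joint& p = joints[j.parent];
        j.worldRotation = mul(p.worldRotation, j.sensorRotation);
        Vec3 o = rotate(p.worldRotation, j.restOffset);
        j.worldPosition = { p.worldPosition.x + o.x,
                            p.worldPosition.y + o.y,
                            p.worldPosition.z + o.z };
    }
}
```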
With no cameras, we were free to jump on and off stage in no time. But we had traded that problem for a host of others, as you always do with new tech and creative applications. In particular, the time zones and international delivery delays demanded a unique level of creative problem-solving. However, with both teams showing the same level of insanity, we made it onstage.
Music that builds on the idea
Gavin Ahearn, Composer + Pianist
We wanted to create a theme that represented the process of creativity. For that reason, the main theme of the music is accumulative, starting with one note, then building to two, with the final iteration of the theme consisting of seven notes.
The original ideas were improvised and then honed over a few versions. The essential ingredient in this process was the degree of feedback and interaction from S1T2 and Naomi (choreographer) regarding the structure and vibe of the piece. The three elements of the content – visual, dance and music – were all fluid throughout the creation period, with each element feeding off the other two.
A dance inspired by technology
Naomi Hibberd, Dancer + Choreographer
I was contacted by S1T2 to be a part of their remix project – a live motion capture performance incorporating live music, dance and all-new motion capture technology. I was first sent a storyboard from Chris (project lead) that helped me create small snippets of movement that I thought would work well at showing off the technology. I filmed them in the studio and sent the footage on to Gavin (composer), who improvised and made a rough score to match my movements.
After that, we came together and started to piece the puzzle together, with the help of Iain for artistic direction. When we finally got the motion capture suit it was a very exciting day, though we soon found that some movements – like rolling on the floor – had to go in the bin. After many trial-and-error rehearsals we finally came up with the finished product, which I am very proud of.
Bringing it to life through code
Liam Stephens, Creative Tech Lead
Given the short amount of time for iteration, we decided to use OpenFrameworks and OpenGL shaders to create the particle systems necessary to fit the aesthetic brief. Our artists initially mocked up the particle motion feel using pre-rendered video, giving us direction in terms of what kind of physics should be applied to particles and when.
After the initial research and development phase, we decided to use three kinds of particle effects. The first was a robust particle system that would be attracted to and repelled by the dancer during certain scenes, with a few other functions like spraying particles when her foot hit the floor above a certain velocity threshold. The second was a system that would ‘inherit’ the motion of those particles, drawing trails which followed particles with accelerated motion. The third was an ambient static particle system, rendered to create a slight sense of depth within the scene. We also created a ribbon system that would draw ribbons using the dancer's bone positions to emphasise the correlation between the dancer's physical and virtual self.
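To make that a little more concrete, here is a minimal OpenFrameworks-style sketch of the first system and the ribbons: particles pulled towards (or pushed away from) the dancer, a burst when a foot lands above a velocity threshold, and a trail of recent bone positions. The class names, constants and thresholds are our own illustrative choices, not the production code.

```cpp
// Minimal sketch of the attract/repel particles and the bone ribbons.
// Names and constants are illustrative only.
#include "ofMain.h"
#include <algorithm>
#include <deque>
#include <vector>

struct Particle {
    ofVec3f pos, vel;
};

class DancerParticles {
public:
    std::vector<Particle> particles;
    float attractStrength = 60.0f;  // flip negative to repel instead
    float drag            = 0.98f;
    float sprayThreshold  = 2.5f;   // foot speed that triggers a burst, tuned by eye

    void setup(int count) {
        particles.resize(count);
        for (auto& p : particles) {
            p.pos = ofVec3f(ofRandom(-1, 1), ofRandom(0, 2), ofRandom(-1, 1));
            p.vel = ofVec3f(0, 0, 0);
        }
    }

    void update(const ofVec3f& dancerPos, const ofVec3f& footPos,
                const ofVec3f& footVel, float dt) {
        for (auto& p : particles) {
            // Pull each particle towards the dancer, falling off with distance.
            ofVec3f toDancer = dancerPos - p.pos;
            float dist = std::max(toDancer.length(), 0.1f);
            p.vel += toDancer.getNormalized() * (attractStrength / (dist * dist)) * dt;
            p.vel *= drag;
            p.pos += p.vel * dt;
        }
        // Spray a handful of particles outwards when the foot strikes hard enough.
        if (footVel.length() > sprayThreshold) {
            int burst = std::min<int>(50, (int)particles.size());
            for (int i = 0; i < burst; ++i) {
                size_t idx = std::min((size_t)ofRandom((float)particles.size()),
                                      particles.size() - 1);
                particles[idx].pos = footPos;
                particles[idx].vel = ofVec3f(ofRandom(-1, 1), ofRandom(0.5, 2), ofRandom(-1, 1));
            }
        }
    }
};

// Ribbons: keep a short history of a bone's world position and draw a line
// through it, so the trail shadows the dancer's virtual skeleton.
class BoneRibbon {
public:
    std::deque<ofVec3f> points;
    size_t maxPoints = 120;

    void addBonePosition(const ofVec3f& bonePos) {
        points.push_back(bonePos);
        if (points.size() > maxPoints) points.pop_front();
    }

    void draw() const {
        for (size_t i = 1; i < points.size(); ++i)
            ofDrawLine(points[i - 1], points[i]);
    }
};
```

In the production build the heavy lifting – rendering and the trail ‘inheritance’ – was pushed into the OpenGL shaders mentioned above; the sketch here only covers the CPU-side simulation logic.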
And finally, the performance
The performance itself spans a series of distinct beats, beginning with our Creative’s inquisitive play and expression as he explores the canvas. Gradually, his exploration becomes more purposeful, awakening the AI Muse, who slowly gains an affinity with the Creative’s input.
Both characters then enter into a dialogue together. With a growing comfort and expanding powers, Creative and Muse alike begin to revel in the duet, racing towards a momentary breath. Emboldened and empowered by one another, the experience builds to a burst of creativity before their collaborative creation transforms into its final form.