Adobe Remix

Live performance data triggers real-time interactive remix of the Adobe logo.

  • Sydney Opera House
  • 1 month to create
  • 3k attendees

Translating live performance data into a real-time data visualisation, our remix of the iconic Adobe logo explores the relationship between a creative and their tools. The experiential marketing artwork brings together human input and computational output to create a dynamic particle visualisation examining the future of creative industries in a world increasingly dominated by automation and artificial intelligence.

Data visualisation with a twist

The Conference
The final S1T2 Adobe Remix logo, made with a data visualisation generated from real-time performance data.

Every year, Adobe invites artists and studios around the world to remix their iconic logo. When they asked us to be the first creative technology agency to have a go in 2017, we decided to explore how changes in the technological landscape are affecting the creative industry.

Launching at the MAKE IT Conference in Sydney, our Adobe Remix created a symbiotic duet between a pianist and a dancer. Then, using live performance data, we generated a real-time data visualisation of the creative process – culminating in a vibrant reimagining of the Adobe logo.

A story of creativity + technology

Music + Dance

Composer and pianist Gavin Ahearn played the human creative in our Adobe Remix, exploring his canvas through the rhythmic beats of the piano’s gradually deepening melody. This melody was specially composed to represent the process of creativity through an accumulative theme – starting with one note and gradually building to a final seven-note theme.

Composer Gavin Ahearn plays the piano during practice of the S1T2 Adobe Remix performance.

Meanwhile, dancer and choreographer Naomi Hibberd took on the role of AI muse, translating Ahearn’s human inputs into a new plane of creativity through technology. Decked out in an experimental, inertia-based motion capture suit from Rokoko, her movements would respond to and build upon the music, closing the loop on a symbiotic duet between pianist and dancer, creative and technology.

Chris Panzetta fits a black Rokoko motion capture suit to Naomi Hibberd for S1T2 Adobe Remix data visualisation.

Mapping real action to the virtual world

Data Mapping
Practice session of the creative technology agency S1T2’s Adobe Remix sees real-time data from a dancer’s movements translated into a digital data visualisation.

Throughout Gavin and Naomi’s performance, we were capturing real-time data behind the scenes from both the piano and the motion capture suit, providing the live input for the final part of our Adobe Remix. Here, we used real-time graphics to bring their relationship to life through a dynamically generated data visualisation.

Mapping the two live inputs, music and movement, we generated a vibrant, real-time display of particle effects and ribbon systems. By the end of the performance, this data visualisation had built to a satisfying crescendo, finally exploding into our reimagining of the Adobe logo – the product of both a creative and their technology.
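As a rough sketch of how two live inputs might be fused into one control signal, the function below blends piano note velocity (MIDI, 0–127) and a dancer’s joint speed into a particle emission rate. The names, weights and ranges are illustrative assumptions, not the production code.

```cpp
#include <algorithm>

// Hypothetical fusion of the two live inputs: MIDI note velocity
// from the piano and the dancer's joint speed from the mocap suit,
// each normalised to 0..1 and blended into an emission rate.
int emissionRate(int noteVelocity, double jointSpeed) {
    double music  = std::clamp(noteVelocity / 127.0, 0.0, 1.0);
    double motion = std::clamp(jointSpeed / 3.0, 0.0, 1.0); // ~3 m/s cap
    double intensity = 0.5 * music + 0.5 * motion;          // equal weighting
    return static_cast<int>(intensity * 2000);              // particles per second
}
```

In practice the weighting and smoothing of a signal like this would be tuned by eye against the performance, but the basic shape – normalise each input, blend, scale – stays the same.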

A collaborative process

Collaboration

Our Adobe Remix was created through a collaborative process of creative technology. The three elements of the content – visual, dance and music – were all fluid throughout the creation period, with each element feeding into and building on the other two.

Dancer Naomi Hibberd sits on stage of the Sydney Opera House in a Rokoko motion capture suit.

The music was improvised and honed to a melody that represented the cumulative nature of creativity. Meanwhile, the choreography was generated in a similarly collaborative fashion, with Hibberd filming small snippets of dance that the team could then fit together to match the music. Then, when we got our hands on the motion capture suit, these elements were all iterated on and refined to form a cohesive performance able to generate good data for our visualisation.

Adobe Remix background visualisation.

Capturing real-time performance data

Motion Capture

Our initial plan for motion capture was to replicate the setup we have in the studio. This would mean installing a system of motion capture cameras on a fixed rig on stage. However, as we moved into production, it quickly became clear that this wasn’t going to be possible given the realities of bump-in and rehearsal times.

Chris Panzetta from creative technology agency S1T2 fits a motion capture suit from Rokoko to Naomi Hibberd.

Instead, we connected with the team at Rokoko in Denmark, who were developing an experimental, inertia-based motion capture suit. Their system did away with cameras altogether, instead using 19 smart sensors mounted on a leotard to detect position and movement, mapping a skeletal system from the live data in real time.
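To give a sense of how sensor data becomes a skeleton, here is a minimal forward-kinematics sketch: each sensor reports its bone’s orientation, and joint positions are recovered by walking the hierarchy from the root. It is simplified to 2D with angles for brevity – a real inertial suit works in 3D with quaternions – and all names and numbers are illustrative.

```cpp
#include <cmath>
#include <vector>

const double kPi = std::acos(-1.0);

// One bone per sensor: a parent index, a length, and the absolute
// orientation the sensor reports (simplified here to a 2D angle).
struct Bone {
    int parent;      // index of parent bone, -1 for the root
    double length;   // bone length in metres
    double angle;    // absolute orientation, radians
};

struct Vec2 { double x, y; };

// Walk the hierarchy root-first, placing each bone's end joint
// at its parent's joint plus the rotated bone vector.
std::vector<Vec2> solveSkeleton(const std::vector<Bone>& bones) {
    std::vector<Vec2> joints(bones.size());
    for (size_t i = 0; i < bones.size(); ++i) {
        Vec2 origin = bones[i].parent < 0 ? Vec2{0.0, 0.0}
                                          : joints[bones[i].parent];
        joints[i] = { origin.x + bones[i].length * std::cos(bones[i].angle),
                      origin.y + bones[i].length * std::sin(bones[i].angle) };
    }
    return joints;
}
```

Because every joint is derived from orientations alone, no external cameras or line of sight are needed – which is exactly what made the approach workable on a stage with tight bump-in times.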

With no cameras to install, the performers could rehearse and perform with ease. However, as always with emerging technology, solving one problem created a number of others. In this case, time zones and international delivery delays demanded some pretty creative problem-solving to get all the different components working together in time.

Transforming live data into generative art

Programming

As both the music and the choreography were being developed, we mapped out the general beats of the performance through a storyboard. Before moving forward with the visualisation, we mocked this up with a pre-rendered animatic video. This would give our programmers a clear sense of the data visualisation’s artistic direction and aesthetic. 

Given the short amount of time for iteration, we decided to work with OpenFrameworks and OpenGL shaders to bring the on-screen data visualisation to life. Within these frameworks, we were able to map the music and motion data onto the screen, building to create a final Adobe Remix that involved four different systems of visualisation. 

Programmer Liam Stephens from creative technology agency S1T2 looks at a screen showing the real-time data from a dancer’s motion capture suit.

A stunning real-time visualisation

Particle Visualisation

To create a satisfying real-time visualisation throughout the Adobe Remix performance, we used a number of different particle simulations. First was a robust particle system that could be attracted and repulsed by the dancer. Another system would then ‘inherit’ the motion of these first particles, drawing trails according to their velocity and movement. The third system consisted of ambient particles that helped create a sense of depth. Finally, a ribbon system mapped the dancer’s bone positions on-screen, emphasising the correlation between the performers’ physical and virtual selves throughout the Adobe Remix performance.
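The first of those systems – particles attracted to or repulsed from the dancer – can be sketched in a few lines. This is a generic attract/repulse update with inverse-square falloff and damping, not the production shader code; constants and names are assumptions for illustration.

```cpp
#include <cmath>
#include <vector>

struct Particle { double x, y, vx, vy; };

// Pull every particle toward (or push it away from) the dancer's
// tracked position: strength > 0 attracts, strength < 0 repulses.
void update(std::vector<Particle>& ps, double dancerX, double dancerY,
            double strength, double dt) {
    for (Particle& p : ps) {
        double rx = dancerX - p.x, ry = dancerY - p.y;
        double dist = std::sqrt(rx * rx + ry * ry) + 1e-6; // avoid div by 0
        double f = strength / (dist * dist);  // inverse-square falloff
        p.vx += f * (rx / dist) * dt;
        p.vy += f * (ry / dist) * dt;
        p.vx *= 0.98;                         // damping keeps the swarm stable
        p.vy *= 0.98;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
    }
}
```

The ‘inheriting’ trail system and the ribbon system then read positions and velocities out of an update like this one – the trails following the particles’ velocities, the ribbons following the skeleton’s bone positions directly.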

Adobe Remix artistic visualisation.
