Oscillations

Immersive Virtual Experiences in the Performing Arts

Advancements in neuroscience and immersive technologies offer mechanisms for engineering an entirely new mode of performance art, one that engages audiences to an unprecedented degree. Using the latest VR production techniques, students combined motion capture and machine learning to teach a computer to improvise a performance, creating an engaging VR experience.

The Knight Lab partnered with Oscillations, which brought together a team of movement artists, musicians, scientists, and technologists to explore immersive performance art that synthesizes innovative, polyrhythmic music with the performing arts’ and entertainment industry’s most creative, physically adroit dancers and athletes.

Faculty and Staff Leads

Zach Wise

Professor, Journalism

Emmy-winning interactive producer and Associate Professor at Northwestern University’s Knight Lab. Formerly of The New York Times. Creator of TimelineJS and StoryMapJS.

Results

  • Oscillations Audience Engagement Research Findings

    During the Winter 2018 quarter, the Oscillations Knight Lab team was tasked with exploring the question: what constitutes an engaging live movement arts performance for audiences? Oscillations’ Chief Technology Officer, Ilya Fomin, told the team at the quarter’s start that the startup aims to create performing arts experiences that are "better than reality." In response, our team spent the quarter using qualitative research to understand what reality is. Three members of the team interviewed more...

    Continue Reading

  • Comparing Motion Capture Techniques for Movement Art

    With Oscillations’ connection to the movement arts, it made sense to experiment with existing motion capture technology to find accurate, consistent, and scalable ways to obtain three-dimensional motion data for purposes such as animation or machine learning to augment performances in virtual reality (a minimal data-preparation sketch follows this list). An additional motivation to learn more about motion capture came from our early experiments with spatial audio (read more about them in our spatial audio blog post). Apart from using ambisonic...

    Continue Reading

  • Prototyping Spatial Audio for Movement Art

    One of Oscillations’ technical goals for this quarter’s Knight Lab Studio class was an exploration of spatial audio: sound that exists in three dimensions. It is a perfect complement to 360 video because sound sources can be localized to specific parts of the video (a minimal encoding sketch follows this list). Oscillations is especially interested in using spatial audio to reinforce audiovisual synchrony, a neuroscientific principle it aims to emphasize in its productions. Existing work in spatial...

    Continue Reading

  • The Hammer Without A Nail: Oscillations is a New Art Form Making Meaningful Impact

    Imagine a classroom of elementary-aged students. They appear to be sitting cross-legged on the floor or walking around the room, but in reality, they’re dancing. Dancers flash across the screens of the virtual reality headsets the students wear. The performers jump and turn just inches from the young audience members’ eyes. The children can see the subtle flexibility of the dancers, connecting the shifts of movement to the music’s tempo. Looking up, down, and side...

    Continue Reading
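
The motion capture post above doesn't specify a data format or pipeline; as a minimal sketch of what "obtaining three-dimensional motion data" for animation or machine learning can look like, the following assumes a hypothetical CSV export of per-frame joint positions. The loader name, joint count, and target frame rate are illustrative assumptions, not the team's actual tooling.

```python
"""Minimal sketch: preparing 3D motion capture data for animation or ML.

Assumes a hypothetical CSV export with one row per frame: a timestamp,
then x,y,z for each joint. Format, joint count, and frame rate are
illustrative, not Oscillations' actual pipeline.
"""
import numpy as np

def load_mocap_csv(path, num_joints=17):
    """Load frames as a (T, num_joints, 3) position array plus (T,) timestamps."""
    raw = np.loadtxt(path, delimiter=",")
    timestamps = raw[:, 0]                     # assumed monotonically increasing
    positions = raw[:, 1:].reshape(len(raw), num_joints, 3)
    return timestamps, positions

def resample(timestamps, positions, fps=30.0):
    """Linearly interpolate onto a uniform clock so downstream ML sees consistent timing."""
    uniform_t = np.arange(timestamps[0], timestamps[-1], 1.0 / fps)
    T, J, _ = positions.shape
    flat = positions.reshape(T, -1)
    resampled = np.stack(
        [np.interp(uniform_t, timestamps, flat[:, k]) for k in range(flat.shape[1])],
        axis=1,
    )
    return resampled.reshape(len(uniform_t), J, 3)

def normalize(positions, root_joint=0):
    """Center each frame on the root joint (e.g., pelvis) and scale to unit height,
    so captures from different performers and stages are comparable."""
    centered = positions - positions[:, root_joint : root_joint + 1, :]
    height = np.ptp(centered[..., 1])  # vertical extent, assuming y is up
    return centered / max(height, 1e-8)
```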
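Likewise, the spatial audio post mentions ambisonics; here is a minimal sketch of first-order ambisonic (FuMa B-format) encoding, the textbook way a mono source is "localized" to a direction so a 360 player can rotate the sound field with the viewer's head. This is a standard formulation, not necessarily the tooling the team used.

```python
"""Minimal sketch: first-order ambisonic (FuMa B-format) encoding.

Places a mono signal at a direction (azimuth, elevation) in the sound
field. Textbook formulation, independent of any particular 360 player.
"""
import numpy as np

def encode_first_order(mono, azimuth_deg, elevation_deg):
    """Return (4, N) channels W, X, Y, Z for a mono source at the given direction."""
    az = np.radians(azimuth_deg)    # 0 deg = front, positive = to the left
    el = np.radians(elevation_deg)  # 0 deg = horizon, positive = up
    w = mono / np.sqrt(2.0)             # omnidirectional component
    x = mono * np.cos(az) * np.cos(el)  # front-back axis
    y = mono * np.sin(az) * np.cos(el)  # left-right axis
    z = mono * np.sin(el)               # up-down axis
    return np.stack([w, x, y, z])

# Example: a 1 kHz tone placed 90 degrees to the listener's left.
sr = 48000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
bformat = encode_first_order(tone, azimuth_deg=90, elevation_deg=0)
```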

Project Details

2018 Winter

Sample Milestones
  • Weeks 1-2: Explore, practice, and create motion capture performances.
  • Week 3: Review and evaluate the motion capture process.
  • Weeks 4-7: Iterate on the motion capture process and invent a way to sync motion with video facial capture; record performances.
  • Weeks 8-10: Complete post-production of the performances and document/publish best practices on integrating performance motion capture into VR.
Outcome

Students will increase their understanding of the latest production techniques and tools for creating VR experiences, and will publish new techniques and findings related to VR motion capture.

Presentation Slides
Students

Joo-Young Lee

Ben Singer

Device Lab Fellow

Harriet White

2018 Fall

Outcome

Students will increase their understanding of the latest production techniques and tools for creating VR experiences, and will publish new techniques and findings related to VR motion capture. Students will use machine learning (ML) to teach a computer to dance: the model’s output will drive a dancing 3D model, connected to Unity for VR playback (a minimal sketch follows).
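
The outcome above names no specific model; as one plausible shape for "teaching a computer to dance," here is a minimal sketch of an autoregressive next-pose predictor in PyTorch. The pose dimension, architecture, and generation loop are assumptions, not the course's actual model; in a Unity pipeline, the generated pose frames would be retargeted onto a rigged 3D character for VR playback.

```python
"""Minimal sketch: a next-pose sequence model for generating dance movement.

An LSTM learns to predict frame t+1 from frames up to t on captured motion
data, then is rolled forward to improvise new movement. All sizes and
training details are illustrative assumptions.
"""
import torch
import torch.nn as nn

POSE_DIM = 51  # e.g., 17 joints x 3 coordinates (assumed)

class DanceLSTM(nn.Module):
    def __init__(self, pose_dim=POSE_DIM, hidden=256):
        super().__init__()
        self.lstm = nn.LSTM(pose_dim, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)

    def forward(self, poses, state=None):
        out, state = self.lstm(poses, state)   # poses: (batch, time, POSE_DIM)
        return self.head(out), state

def generate(model, seed, steps=300):
    """Autoregressively roll the model forward from a seed clip of shape (1, T, POSE_DIM)."""
    model.eval()
    frames, state, x = [], None, seed
    with torch.no_grad():
        for _ in range(steps):
            pred, state = model(x, state)
            nxt = pred[:, -1:, :]   # last predicted frame becomes the next input
            frames.append(nxt)
            x = nxt
    # (1, steps, POSE_DIM): pose frames ready to retarget onto a rigged model in Unity
    return torch.cat(frames, dim=1)
```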

Students

Nicole Fallert

Janet Lee

2018 Spring

Sample Milestones
  • Weeks 1-2: Explore, practice, and create motion capture performances.
  • Week 3: Review and evaluate the motion capture process.
  • Weeks 4-10: Machine learning; connect the AI to a 3D model that performs the generated dance moves.
Outcome

Students will increase their understanding of the latest production techniques and tools for creating VR experiences, and will publish new techniques and findings related to VR motion capture. Students will use machine learning (ML) to teach a computer to dance: the model’s output will drive a dancing 3D model, connected to Unity for VR playback.

Students

Abizar Bagasrawala