AI-Generated Choreography - Dance Beyond Music

Description

Recent years have seen the advent of AI-generated choreography using models trained on motion capture of a single dancer (see e.g. https://arxiv.org/abs/1907.05297), and in GSoC 2024 two contributors developed cutting-edge projects to understand improvisational dance duets through the lens of neural networks, including GNNs and Transformers. However, many dance traditions view dance as far more than a visual art, and treating dance purely as a movement-prediction problem risks an overly reductive perception of dance in digital form. Moreover, while existing multimodal dance embeddings focus primarily on pairings of music (e.g. “beats”) to movement, human movement can incorporate and be influenced by diverse modalities well beyond music, including speech, imagery, writing, touch, architecture, proprioception, sculpture, and more. By exploring diverse multimodal embeddings of dance in an artist-driven framework, this project will imagine how to use AI to bring expansiveness, rather than reductiveness and conformity, into our digital renderings of dance.
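To make the embedding idea concrete, here is a toy sketch (not project code) of the cross-modal pairing described above: two stand-in encoders map a motion-capture sequence and a non-music modality (here, a bag-of-words text description) into a shared vector space, where cosine similarity measures alignment. All shapes, names, and the random projection weights are illustrative assumptions; in the actual project the encoders would be learned models such as the Transformers or GNNs mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_motion(frames: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Mean-pool joint positions over time, project, and L2-normalize."""
    pooled = frames.mean(axis=0)          # (joints * 3,)
    z = W @ pooled
    return z / np.linalg.norm(z)

def encode_text(counts: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Project a bag-of-words count vector into the same space and L2-normalize."""
    z = W @ counts
    return z / np.linalg.norm(z)

# Hypothetical shapes: 120 frames of 17 joints in 3D; a 50-word vocabulary;
# a 16-dimensional shared embedding. Random weights stand in for trained encoders.
motion = rng.standard_normal((120, 17 * 3))
text = rng.random(50)
W_motion = rng.standard_normal((16, 17 * 3))
W_text = rng.standard_normal((16, 50))

# Dot product of unit vectors = cosine similarity, bounded in [-1, 1].
sim = float(encode_motion(motion, W_motion) @ encode_text(text, W_text))
print(f"cosine similarity: {sim:.3f}")
```

The same pattern extends to any of the modalities listed above (speech, imagery, touch, ...) by swapping in an appropriate encoder for that modality.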

Duration

Total project length: 175 hours

Expected results

Requirements

Participants should be comfortable with standard data science software, including Python, Git, NumPy, Matplotlib, and pandas. Previous experience with machine learning, in either TensorFlow or PyTorch, is preferred. While prior experience in dance or the performing arts is not needed, an interest in the artistic and open-ended aesthetic dimensions of the project is required. Strong interpersonal and communication skills are essential.

Project difficulty level

Hard

Mentors

Please DO NOT contact mentors directly by email. Instead, please email human-ai-choreo@cern.ch with the project title, and include your CV and test results. The mentors will then get in touch with you.

Corresponding Project

Participating Organizations