e-space
Manchester Metropolitan University's Research Repository

    “F O R M S”: creating visual composition through the movement of dance and Artificial Intelligence

    Nogueira, Maria Rita, Menezes, Paulo and Carvalho, José Maçãs de (2022) “F O R M S”: creating visual composition through the movement of dance and Artificial Intelligence. In: The Paris Conference on Arts and Humanities 2022 (PCAH2022), 16 June 2022 - 19 June 2022, La Maison de la Chimie, Paris, France.

    Published Version
    Available under License In Copyright.


    Abstract

    What relationship exists between dance and the visual arts? How can dance visually express lines, shapes, and compositions in space? The performing arts and the visual arts share methodologies and are connected to each other; however, how can the audience perceive that relationship? The present work intersects art with technology, specifically dance movement and machine learning techniques, to create a new visual representation of the body's movement in space. Advances in artificial intelligence have made machine learning techniques such as human-pose estimation available for exploring body movement. The integration of machine learning with dance has resulted in different approaches, but how can this relationship contribute to involving the audience? FORMS mirrors this dancer-machine dialogue in an interactive installation performance. Body language is the vehicle that drives the visual outcome of the interactive experience, creating a novel real-time visual expression of the dance movement. The hybrid format of the installation offers the audience both a live performance and an open experience in which anyone can play with FORMS through their own movement. It contributes to cultivating body awareness, understanding the dance movement in greater detail, and enriching the art experience.
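    The abstract describes a pipeline in which human-pose estimation drives a real-time visual composition from the dancer's movement. The sketch below is a minimal illustration of that idea, not the FORMS implementation: it assumes the MediaPipe Pose model and an OpenCV webcam capture, and the choice of landmarks and connecting lines is purely illustrative, since the paper does not specify the model or the rendering approach.

        import cv2
        import numpy as np
        import mediapipe as mp

        mp_pose = mp.solutions.pose

        # Illustrative pairs of MediaPipe landmark indices to connect with lines
        # (wrists, shoulders, hips), forming simple geometric shapes in space.
        CONNECTIONS = [(15, 16), (11, 12), (23, 24), (15, 23), (16, 24)]

        cap = cv2.VideoCapture(0)
        with mp_pose.Pose(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as pose:
            while cap.isOpened():
                ok, frame = cap.read()
                if not ok:
                    break
                # Estimate body landmarks on the current frame.
                results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                canvas = np.zeros_like(frame)  # black canvas for the composition
                if results.pose_landmarks:
                    h, w = frame.shape[:2]
                    pts = [(int(lm.x * w), int(lm.y * h))
                           for lm in results.pose_landmarks.landmark]
                    # Draw straight lines between selected keypoints.
                    for a, b in CONNECTIONS:
                        cv2.line(canvas, pts[a], pts[b], (255, 255, 255), 2)
                cv2.imshow("pose-driven composition", canvas)
                if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                    break
        cap.release()
        cv2.destroyAllWindows()

    In an installation setting the same loop could feed a richer renderer; the structure, however, remains the same: pose estimation on each frame followed by drawing primitives derived from the detected keypoints.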
