Bringing the magic of playing music to the virtual world

Date posted: January 3, 2025
JAMS press release

Researchers are aiming to bring the magic of playing music in person to the virtual world.

The Joint Active Music Sessions (JAMS) platform, created at the University of Birmingham, uses avatars created by individual musicians and shared with fellow musicians to create virtual concerts and practice sessions, or to enhance music teaching.

Dr Massimiliano (Max) Di Luca from the University of Birmingham explains: “A musician records themselves and sends the video to another musician. The software creates a responsive avatar that plays in perfect synchrony with the music partner. All you need is an iPhone and a VR headset to bring musicians together for performance, practice, or teaching.”

The JAMS platform has the potential to develop into a social network like Spotify or Myspace, where musicians can interact to learn, connect, perform, develop new music, and create virtual concerts that reach larger audiences.

JAMS has the distinct flavour of a platform developed with and for musicians, whether established performers or early-stage learners.

The avatars capture the unspoken moments that are key in musical performance, allowing practice partners or performers to watch the tip of the violinist’s bow, or make eye contact at critical points in the piece. They also adapt in real time, responding dynamically to the musician wearing the VR headset and so delivering a unique, personalised experience.

Delivery by VR headset recreates the musician’s world and provides an immersive backdrop with a realistic rendering of other musicians and cues used in the real-life setting.  It also keeps the faces at eye level, which adds to the feeling of connectedness.

Critically, there is no ‘latency’ in the JAMS user experience. Dr Di Luca explains: “Latency is the delay between when a sound is produced and when it reaches the listener. Performers can start to feel the effects of latency at delays as low as 10 milliseconds, which can throw them ‘off-beat’, break their concentration, or distract them from the technical aspects of playing.”

JAMS is underpinned by an algorithm created during the Augmented Reality Music Ensemble (ARME) project, which captures the dynamic timing adjustments between performers. The project brought together researchers from six disciplines (psychology, computer science, engineering, music, sport science, and maths), whose input realised the vision of building a computational model that reproduces a musician’s body movements with precision and delivers an avatar that meets the needs of co-performers.

Dr Di Luca adds: “We’re aiming to bring the magic of playing music in person to the virtual world. You can adapt the avatar that other people play with, or learn to play better through practice with a maestro.”

JAMS allows musicians to perform in an interactive virtual group, and can be adapted for lip-syncing or dubbing in media. It can also gather unique user data to create digital twins of musicians, offering licensing opportunities for various applications and further exploitation of catalogues and publishing rights.

Commercial enquiries should be directed to the ARME project website at: https://arme-project.co.uk/contact/
