Despite the popularity of virtual musicians, most cannot play together with human musicians by following their tempo, nor generate their own behaviors without a human operator behind the character. The authors of a recent paper created a virtual violinist with these capabilities.
It tracks the music in real time and adapts to the human pianist's tempo, which varies over time and between performances, so that the two voices stay synchronized. The virtual musician's body movements are generated directly from the music: a motion generator trained on a dataset of violin-performance videos produces a pose sequence synchronized with the live performance.
These features let a human musician practice, rehearse, and perform with the virtual musician much as with a real partner, simply by playing the music. The proposed system has performed successfully in a ticketed concert, playing a movement from Beethoven's Spring Sonata.
Virtual musicians have become a remarkable phenomenon in contemporary multimedia arts. However, most virtual musicians today have not been endowed with the ability to create their own behaviors or to perform music with human musicians. In this paper, we create a virtual violinist who can collaborate with a human pianist to perform chamber music automatically, without any human intervention. The system incorporates techniques from several fields, including real-time music tracking, pose estimation, and body movement generation. In our system, the virtual musician's behavior is generated from the music audio alone, which yields a low-cost, efficient, and scalable way to produce co-performances between human and virtual musicians. The proposed system has been validated in public concerts. Objective quality assessment approaches and possible ways to systematically improve the system are also discussed.
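To make the tempo-following idea concrete, here is a minimal sketch of one way a virtual performer could adapt its clock to a live pianist. This is an illustrative assumption, not the authors' algorithm (real-time score following typically uses more sophisticated methods such as online dynamic time warping): it fits a line through recently observed (score time, live time) beat pairs to estimate the pianist's current tempo ratio, then schedules the virtual violinist's upcoming events accordingly. The class name and interface are hypothetical.

```python
# Hypothetical sketch of live tempo adaptation (not the paper's exact method):
# estimate the live-to-score tempo ratio from recent aligned beats and use it
# to schedule the virtual violinist's next events.

class TempoFollower:
    def __init__(self, window=8):
        self.window = window   # number of recent beat observations to fit
        self.pairs = []        # list of (score_time, live_time) pairs

    def observe(self, score_time, live_time):
        """Record that the pianist reached `score_time` at wall clock `live_time`."""
        self.pairs.append((score_time, live_time))
        self.pairs = self.pairs[-self.window:]

    def tempo_ratio(self):
        """Least-squares slope of live time vs. score time (1.0 = score tempo;
        > 1.0 means the pianist is playing slower than the score tempo)."""
        if len(self.pairs) < 2:
            return 1.0
        xs = [p[0] for p in self.pairs]
        ys = [p[1] for p in self.pairs]
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den if den else 1.0

    def predict_live_time(self, score_time):
        """Predict when (in live time) the event at `score_time` should occur,
        extrapolating from the most recent observation at the current tempo."""
        if not self.pairs:
            return score_time
        s_last, l_last = self.pairs[-1]
        return l_last + (score_time - s_last) * self.tempo_ratio()


# Example: the pianist takes 1.25 s of real time per score second.
follower = TempoFollower()
for beat in range(4):
    follower.observe(beat, beat * 1.25)
print(round(follower.tempo_ratio(), 2))          # -> 1.25 (slower than score)
print(round(follower.predict_live_time(5.0), 2)) # -> 6.25
```

In a real system the (score time, live time) pairs would come from an audio tracker aligning the pianist's playing to the score, and the predicted times would drive both the synthesized violin audio and the generated pose sequence.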