Department of Computational Perception
Johannes Kepler Universität Linz




COMPUTATIONAL PERFORMANCE STYLE ANALYSIS FROM AUDIO RECORDINGS

Project Title: Computational Performance Style Analysis from Audio Recordings

Sponsor: Austrian National Science Fund (Fonds zur Förderung der wissenschaftlichen Forschung, FWF),
Project Number: P19349-N15

Duration: 36 months (Feb. 2007 – Jan. 2010)

In cooperation with: Austrian Research Institute for Artificial Intelligence (ÖFAI), Vienna (Machine Learning, Data Mining, and Intelligent Music Processing Group)

Persons involved:

Gerhard Widmer (Project Leader)
Werner Goebl
Maarten Grachten
Sebastian Flossmann
Bernhard Niedermayer


Abstract

The project aims at investigating the fascinating, but elusive phenomenon of individual artistic music performance style with quantitative, computational methods. In particular, the goal is to discover and characterise significant patterns and regularities in the way great music performers (classical pianists) shape the music through expressive timing, dynamics, articulation, etc., and express their personal style and artistic intentions.

The starting point is a unique and unprecedented collection of empirical measurement data: recordings of essentially the complete works for solo piano by Frédéric Chopin, made by a world-class pianist (Nikita Magaloff) on the Bösendorfer computer-controlled SE290 grand piano. This huge data set, which comprises hundreds of thousands of played notes, gives precise information about how each note was played, including onset time, duration, and loudness. State-of-the-art methods of intelligent data analysis and automatic pattern discovery will be applied to these data in order to derive quantitative and predictive models of various aspects of performance, such as expressive timing, dynamic shaping, articulation, etc. This will give new insights into the performance strategies applied by an accomplished concert pianist over a large corpus of music. Moreover, by automatically matching these precisely measured performances against sound recordings by a large number of famous concert pianists, comparative studies will be performed which, for the first time, will permit truly quantitative statements about individual artistic performance style.

All this requires extensive research into new methods for intelligent audio analysis (e.g., extraction of expressive parameters from audio, and precise alignment of different sound recordings) and intelligent data analysis and modelling (e.g., sequential pattern discovery, hierarchical probabilistic models, etc.).
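To give a concrete impression of the alignment problem mentioned above: matching two recordings of the same piece is commonly approached with dynamic time warping (DTW) over frame-wise audio features. The sketch below is purely illustrative and not the project's actual implementation; it assumes feature sequences (e.g., chroma vectors per frame) have already been extracted, and computes a classic unconstrained DTW path.

```python
import numpy as np

def dtw_align(X, Y):
    """Align two feature sequences (frames x dims) with classic DTW.

    Returns the optimal warping path as a list of (i, j) frame pairs.
    Illustrative sketch only; real audio-to-audio alignment would use
    chroma or spectral features and path constraints for efficiency.
    """
    n, m = len(X), len(Y)
    # Pairwise Euclidean distances between all frames of X and Y
    D = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    # Accumulated cost matrix with the usual three-step recursion
    C = np.full((n + 1, m + 1), np.inf)
    C[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            C[i, j] = D[i - 1, j - 1] + min(C[i - 1, j],      # step in X only
                                            C[i, j - 1],      # step in Y only
                                            C[i - 1, j - 1])  # step in both
    # Backtrack from (n, m) to recover the warping path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin((C[i - 1, j - 1], C[i - 1, j], C[i, j - 1]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]
```

Given the hop size of the feature extraction, the frame pairs on the path translate directly into a mapping between time points in the two recordings, so that, for instance, corresponding note onsets in two pianists' performances can be compared.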

The project can be seen as a continuation and extension of previous work of ours, in which we managed to show that expressive music performance is indeed amenable to computational modelling and analysis, and which has contributed to establishing an international research trend in computational music performance research. An easily readable account of that earlier work can be found in

Widmer, G., Dixon, S., Goebl, W., Pampalk, E., and Tobudic, A. (2003).
In Search of the Horowitz Factor. AI Magazine 24(3), 111-130.

The current project will go beyond earlier work by working with new empirical data of unprecedented size and quality, and by focusing on the very elusive, but fascinating question of the individual style of great artists.



Publications

The Magaloff Project: An Interim Report

S. Flossmann, W. Goebl, M. Grachten, B. Niedermayer, and G. Widmer

Journal of New Music Research (in press).


Investigations into between-hand synchronization in Magaloff's Chopin

W. Goebl, S. Flossmann, and G. Widmer

Computer Music Journal (in press).


Strategies towards the Automatic Annotation of Classical Piano Music

B. Niedermayer and G. Widmer

In Proceedings of the 7th Sound and Music Computing Conference (SMC 2010),
Barcelona, Spain, 2010.

>> PDF

Simple Tempo Models for Real-time Music Tracking

A. Arzt and G. Widmer

In Proceedings of the 7th Sound and Music Computing Conference (SMC 2010),
Barcelona, Spain, 2010.


Evidence for Pianist-specific Rubato Style in Chopin Nocturnes

M. Molina-Solana, M. Grachten, and G. Widmer

In Proceedings of the 11th International Society for Music Information Retrieval Conference (ISMIR 2010),
Utrecht, The Netherlands, 2010.


A Multi-Pass Algorithm for Accurate Audio-to-Score Alignment

B. Niedermayer and G. Widmer

In Proceedings of the 11th International Society for Music Information Retrieval Conference (ISMIR 2010),
Utrecht, The Netherlands, 2010.

>> PDF

The Magaloff Corpus: An Empirical Error Study

S. Flossmann, W. Goebl, and G. Widmer

In Proceedings of the 11th International Conference on Music Perception and Cognition (ICMPC),
Seattle, WA, USA, 2010.


On the Use of Computational Methods for Expressive Music Performance

W. Goebl and G. Widmer

In T. Crawford and L. Gibson (Eds.), Modern Methods for Musicology: Prospects, Proposals and Realities,
London: Ashgate Publishing.


YQX plays Chopin

G. Widmer, S. Flossmann, M. Grachten

In AI Magazine 30(3), 35-48.


Phase-plane representation and visualization of gestural structure in expressive timing

M. Grachten, W. Goebl, S. Flossmann, G. Widmer

In Journal of New Music Research 38(2), 183–195.

>> View

Maintaining skill across the life span: Magaloff's entire Chopin at age 77

S. Flossmann, W. Goebl, and G. Widmer

In Proceedings of the International Symposium on Performance Science (ISPS 2009), 15–18 December 2009,
Auckland, New Zealand, 2009.
Published by the European Association of Conservatoires (AEC),
Utrecht, NL, pp. 119–124.

>> PDF

Who is who in the end? Recognizing pianists by their final ritardandi

M. Grachten, G. Widmer

In Proceedings of the 10th International Society for Music Information Retrieval Conference (ISMIR 2009),
Kobe, Japan, 2009.

>> PDF

The ISMIR Cloud: A Decade of ISMIR Conferences at Your Fingertips

M. Grachten, M. Schedl, T. Pohle, and G. Widmer

In Proceedings of the 10th International Society for Music Information Retrieval Conference (ISMIR 2009),
Kobe, Japan, 2009.

>> PDF

Improving Accuracy of Polyphonic Music-to-Score Alignment

B. Niedermayer

In Proceedings of the 10th International Society for Music Information Retrieval Conference (ISMIR 2009),
Kobe, Japan, 2009.

>> PDF

Computational investigations into between-hand synchronization in piano playing: Magaloff's complete Chopin

W. Goebl, S. Flossmann, G. Widmer

In Proceedings of the 6th Sound and Music Computing Conference (SMC 2009),
Porto, Portugal, 2009.

>> PDF

The kinematic rubato model as a means of studying final ritards across pieces and pianists

M. Grachten, G. Widmer

In Proceedings of the 6th Sound and Music Computing Conference (SMC 2009),
Porto, Portugal, 2009.

>> PDF

Towards Audio to Score Alignment in the Symbolic Domain

B. Niedermayer

In Proceedings of the 6th Sound and Music Computing Conference (SMC 2009),
Porto, Portugal, 2009.

>> PDF

Expressive Performance Rendering: Introducing Performance Context

S. Flossmann, M. Grachten, and G. Widmer

In Proceedings of the 6th Sound and Music Computing Conference (SMC 2009),
Porto, Portugal, 2009.

>> PDF

Non-Negative Matrix Division for the Automatic Transcription of Polyphonic Music

B. Niedermayer

In Proceedings of the 9th International Conference on Music Information Retrieval (ISMIR 2008),
Philadelphia, PA, USA, 2008.

>> PDF

Automatic Page Turning for Musicians via Real-Time Machine Listening

A. Arzt, G. Widmer, and S. Dixon

In Proceedings of the 18th European Conference on Artificial Intelligence (ECAI 2008),
Patras, Greece, 2008.

>> PDF

Experimentally Investigating the Use of Score Features for Computational Models of Expressive Timing

S. Flossmann, M. Grachten, and G. Widmer

In Proceedings of the 10th International Conference on Music Perception and Cognition (ICMPC10),
Sapporo, Japan, 2008.

>> PDF

Intuitive visualization of gestures in expressive timing: A case study on the final ritard

M. Grachten, W. Goebl, S. Flossmann and G. Widmer

In Proceedings of the 10th International Conference on Music Perception and Cognition (ICMPC10),
Sapporo, Japan, 2008.

>> PDF

Phase-plane visualizations of gestural structure in expressive timing

M. Grachten, W. Goebl, S. Flossmann and G. Widmer

In Proceedings of the Fourth Conference on Interdisciplinary Musicology (CIM08),
Thessaloniki, Greece, 2008.

>> PDF

Towards Phrase Structure Reconstruction from Expressive Performance Data

M. Grachten and G. Widmer

In Proceedings of the International Conference on Music Communication Science (ICOMCS),
Sydney, Australia, 2007.

>> PDF


last edited by bn on 2010-06-29