Generating musical accompaniment through functional scaffolding

Amy K. Hoover, Paul A. Szerlip, Kenneth O. Stanley

Research output: Contribution to conference › Paper › peer-review

8 Scopus citations


A popular approach to music generation in recent years is to extract rules and statistical relationships by analyzing a large corpus of musical data. The aim of this paper is to present an alternative to such data-intensive techniques. The main idea, called functional scaffolding for musical composition (FSMC), exploits a simple yet powerful property of multipart compositions: the patterns of notes and rhythms in different instrumental parts of the same song are functionally related. That is, in principle, one part can be expressed as a function of another. The utility of this insight is validated by an application that assists the user in exploring the space of possible accompaniments to preexisting parts through a process called interactive evolutionary computation. In effect, without the need for musical expertise, the user explores transforming functions that yield plausible accompaniments derived from preexisting parts. In fact, a survey of listeners shows that participants cannot distinguish songs with computer-generated parts from those that are entirely human-composed. Thus, this one simple mathematical relationship yields surprisingly convincing results even without any real musical knowledge programmed into the system. With future refinement, FSMC might lead to practical aids for novices aiming to fulfill incomplete musical visions.
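To make the central idea concrete, here is a minimal sketch of an accompaniment expressed as a function of a preexisting part. Note this is only an illustration of the functional relationship the abstract describes: the actual FSMC system searches a space of such transforming functions through interactive evolution, whereas the fixed `harmonize` transform, its interval and rhythm parameters, and the note representation below are all hypothetical stand-ins.

```python
def harmonize(melody, interval=-5, rhythm_scale=2):
    """Derive an accompaniment as a function of an existing part.

    Each note is a (pitch, duration) pair, with pitch as a MIDI
    note number. This toy transform shifts every pitch by a fixed
    interval and stretches every duration, so the output part stays
    functionally tied to the input part -- the property FSMC exploits.
    In FSMC the transforming function itself would be evolved
    interactively rather than fixed in advance.
    """
    return [(pitch + interval, duration * rhythm_scale)
            for pitch, duration in melody]

# A short preexisting part: C4, E4, G4 with simple durations.
melody = [(60, 1), (64, 1), (67, 2)]
accompaniment = harmonize(melody)
print(accompaniment)  # [(55, 2), (59, 2), (62, 4)]
```

Because the accompaniment is derived from the melody rather than generated independently, it inherits the source part's contour and rhythm, which is why even simple transforms can sound plausibly related to the original.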

Original language: English (US)
State: Published - 2011
Externally published: Yes
Event: 8th Sound and Music Computing Conference, SMC 2011 - Padova, Italy
Duration: Jul 6 2011 – Jul 9 2011


Other: 8th Sound and Music Computing Conference, SMC 2011

All Science Journal Classification (ASJC) codes

  • General Computer Science


