By Matt Micucci
Researchers say that jazz-playing computers and robots could soon yield clues about how to help people collaborate with machines.
The theory is being put to the test by a new project called MUSICA, short for Musical Improvising Collaborative Agent. The project aims to develop a musical device that can improvise a jazz solo in response to human partners, much as real jazz musicians improvise alongside one another.
MUSICA is part of a new program from DARPA, the Defense Advanced Research Projects Agency, the branch of the US military responsible for developing new technologies. The program is designed to explore new ways in which people can interact with computers and robots.
“A lot of us are familiar with various methods of interacting with computers, such as text-based and touch-based interfaces, but language-based interfaces such as Siri or Google Now are extremely limited in their capabilities,” Ben Grosser, an assistant professor of new media at the University of Illinois at Urbana-Champaign, told Live Science.
Grosser and his colleague Kelland Thomas, an associate professor of music at the University of Arizona, are developing MUSICA to explore how people can communicate with one another without language.
To develop a machine capable of playing improvisational jazz, the researchers will create a database of jazz solos from a variety of musicians and have computers analyze the recordings to figure out the various processes that come into play when a musician improvises.
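To give a rough sense of what such an analysis might involve, here is a deliberately simplified sketch, not MUSICA's actual method: a toy "database" of jazz phrases represented as note names, from which a program counts which note tends to follow which and then generates a new phrase from those learned patterns. The phrases and note names are invented for illustration.

```python
# Illustrative sketch only: a tiny "solo database" and a simple
# Markov-chain analysis of note-to-note transitions. The real MUSICA
# analysis of recorded solos is far more sophisticated.
from collections import defaultdict
import random

# Hypothetical phrases standing in for transcribed jazz solos.
solo_database = [
    ["C", "E", "G", "A", "G", "E"],
    ["C", "E", "G", "E", "C", "A"],
    ["E", "G", "A", "G", "E", "C"],
]

def learn_transitions(phrases):
    """Count how often each note is followed by each other note."""
    transitions = defaultdict(lambda: defaultdict(int))
    for phrase in phrases:
        for current, nxt in zip(phrase, phrase[1:]):
            transitions[current][nxt] += 1
    return transitions

def improvise(transitions, start, length, rng):
    """Generate a new phrase by a weighted walk over the learned table."""
    phrase = [start]
    for _ in range(length - 1):
        options = transitions[phrase[-1]]
        if not options:
            break  # dead end: no recorded continuation from this note
        notes = list(options)
        weights = [options[n] for n in notes]
        phrase.append(rng.choices(notes, weights=weights)[0])
    return phrase

transitions = learn_transitions(solo_database)
print(improvise(transitions, "C", 8, random.Random(0)))
```

A bigram model like this captures only the crudest statistical fingerprint of a style; the point is simply that recordings can be mined for regularities a machine can reuse.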
The researchers will then develop a performance system that analyzes the components of a human jazz performance, including beat, pitch, harmony and rhythm, and draws on what it has learned about jazz solos to communicate and respond musically in real time.
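One small piece of that kind of listening can be sketched in code. The example below, an assumption-laden illustration rather than anything from the MUSICA system, synthesizes a pure 440 Hz tone (concert A) and recovers its pitch by autocorrelation, the same basic idea a machine could use to track what a human partner is playing.

```python
# Hedged sketch: estimating the pitch of an audio snippet by finding the
# lag at which the signal best correlates with itself.
import math

SAMPLE_RATE = 8000  # samples per second (illustrative choice)

def synthesize(freq_hz, n_samples, rate=SAMPLE_RATE):
    """Generate a pure sine tone at the given frequency."""
    return [math.sin(2 * math.pi * freq_hz * t / rate) for t in range(n_samples)]

def estimate_pitch(samples, rate=SAMPLE_RATE, lo_hz=80, hi_hz=1000):
    """Find the lag with maximum autocorrelation and convert it to Hz."""
    min_lag = rate // hi_hz
    max_lag = rate // lo_hz
    best_lag, best_score = min_lag, float("-inf")
    for lag in range(min_lag, max_lag + 1):
        score = sum(samples[i] * samples[i - lag]
                    for i in range(lag, len(samples)))
        if score > best_score:
            best_lag, best_score = lag, score
    return rate / best_lag

tone = synthesize(440.0, 2000)
print(round(estimate_pitch(tone)))  # close to 440
```

Real performance analysis must do this, and much more, continuously and within milliseconds, across pitch, beat, harmony and rhythm at once, which is what makes real-time musical collaboration such a hard test for machines.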