Can 'soul-less' robots make us weep?
By Liz Tay on Jul 24, 2008 7:57AM

Researchers studied the electrical brain responses of twenty non-musicians who listened to classical piano sonatas played either by a computer or a musician.
By investigating how the autonomic nervous system responds to music, the researchers aimed to inform music-therapeutic treatments for disease.
Computerised music was found to elicit an emotional response during unexpected chords and changes in tonal key.
However, listeners responded far more strongly to recordings of human performances.
Stefan Koelsch, a University of Sussex psychologist involved in the study, attributed the difference in listeners’ responses to human musicians’ ability to perform with musical expression, evidenced by variations in loudness and tempo.
Likening the study’s findings to how the brain forms meaning from language, Koelsch speculates that the brain could be looking for musical meaning from human performers and not computers.
“Humans played with expression -- variations in tempo and loudness -- [while] computers played the notes without expression, which sounds like a robot or so,” Koelsch said.
“When humans play, then the meaning of the music does not only arise from the musical material composed by the composer, but also by the means of musical expression due to the performance of the player,” he explained.
Koelsch noted that the study did not use computer software designed to emulate human performances, but even so, he does not expect computers ever to perform in a manner comparable to human musicians.
“Computers do not have a soul,” he said. “I believe that only the soul can give the music the full meaning -- but this is rather philosophical, of course.”
According to John Judge of Australian research organisation NICTA, musical expression is one of several challenges in the development of robotic musicians.
Judge recently led a collaborative effort between NICTA and the University of NSW to build a robotically-operated, computer-driven clarinet, which last month won first place in the international Artemis Orchestra competition.
The robotically-operated clarinet was eight months in the making and used a fully embedded computer system, with the human interface provided by a USB-attached keyboard, an LCD screen and LEDs.
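The article does not detail how the clarinet's embedded system works, but the general pattern such a system follows can be sketched: a selected note is mapped to a fingering, which is translated into per-key actuator states. The fingering table, bit layout and function names below are purely illustrative assumptions, not NICTA's actual design.

```python
# Hypothetical sketch of a note-to-actuator mapping for a robotic clarinet.
# The fingering bitmasks below are made up for illustration; a real system
# would use the instrument's actual key layout and an actuator driver.
FINGERINGS = {
    "G4": 0b1110000,  # one bit per key pad: 1 = actuator closes the hole
    "A4": 0b1100000,
    "B4": 0b1000000,
}

def set_actuators(bitmask, n_keys=7):
    """Expand a fingering bitmask into the open/closed state of each key."""
    return [(bitmask >> (n_keys - 1 - i)) & 1 for i in range(n_keys)]

def play_note(name):
    """Look up a note's fingering and return the actuator states to apply."""
    mask = FINGERINGS.get(name)
    if mask is None:
        return None  # a real system might report the error on the LCD
    return set_actuators(mask)

states = play_note("A4")
```

In a real embedded system, the returned states would drive solenoids or servos over GPIO, with the USB keyboard and LCD described in the article serving as the input and status display around this core loop.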
“There’s a few challenges for robotic musicians,” Judge told iTnews.
“The first barrier to overcome is to competently play the musical instruments. Once you can play the instrument competently, it’s [musical expression] another huge leap all by itself,” he said.
The Artemis Orchestra competition is an annual contest for technical students to showcase the capability of embedded systems.
Winning entrants from the 2008 event include a clarinet from NICTA/UNSW, an acoustic guitar from the Netherlands, and a piano from Finland.
“There was such a huge improvement [from entrants this year],” Judge said. “It was a larger improvement than we all expected; entries were actually quite pleasant to listen to.”
“The entrants into the competition last year may have been quite difficult to listen to for extended periods of time,” he explained, delicately.
Judge described 2008 competition entrants as “good amateur players”. To reach the level of human performers, a robotic musician would have to interpret the music in a human-like manner.
And while there is software that could mimic musical expression using artificial intelligence, or by adding an extra processing step when interpreting sheet music, Judge expects such performances still to fall short of those of human musicians.
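The kind of "extra processing step" mentioned above can be sketched in a few lines: take a flat, mechanically timed note sequence and apply a phrase-shaped curve to its durations and volumes, imitating the tempo and loudness variations Koelsch describes. This is a minimal illustrative sketch, not the study's stimuli or any competition entrant's software.

```python
# Minimal sketch of rule-based musical expression: an arch-shaped curve
# slows the tempo and raises the volume toward the middle of a phrase.
import math

def add_expression(notes, rubato=0.1, dyn_range=0.2):
    """notes: list of (pitch, duration_s, velocity 0-1) played flat.
    Returns the notes with arch-shaped tempo and dynamics applied."""
    n = len(notes)
    shaped = []
    for i, (pitch, dur, vel) in enumerate(notes):
        # Arch curve: near 0 at the phrase edges, 1 at the midpoint.
        arch = math.sin(math.pi * (i + 0.5) / n)
        dur_expr = dur * (1 + rubato * (arch - 0.5) * 2)   # slight mid-phrase rubato
        vel_expr = min(1.0, vel * (1 + dyn_range * arch))  # crescendo to midpoint
        shaped.append((pitch, round(dur_expr, 3), round(vel_expr, 3)))
    return shaped

flat = [("C4", 0.5, 0.6)] * 8          # a "robotic" phrase: identical notes
expressive = add_expression(flat)       # same notes, humanised timing/volume
```

Real expression-rendering systems are far more elaborate, modelling phrasing, harmony and performer style, but the principle of perturbing timing and dynamics is the same.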
“When you go to the Opera House and listen to someone play, they’re not mechanically reproducing what’s on the page; they’re interpreting the music for performance,” he explained.
“I think there are some people who would regard human musicians as having a dialogue with the audience when they perform.”
“Can a robot ever play like a human? I don’t really know,” he said. “That gets a little too philosophical for me to answer.”
In the lead-up to Artemis Orchestra 2009, NICTA is sponsoring final-year engineering student projects at the University of Queensland and the University of Adelaide.
The NICTA/UNSW clarinet is expected to be one of four projects that will be considered by NICTA for entry into the international contest.
Engineers are currently working on improving the musical competency of robotic musicians.
In future, Judge expects that robotic musicians could manipulate an unmodified instrument in more ways than is possible for a human, which could let human musicians investigate technical aspects of performance such as fingerings.
“Building machines to play musical instruments allows us to investigate musical instruments in new ways,” he said. “It [findings] can actually feed back to human musicians.”
“No one’s approached this project with the assumption of replacing human musicians,” he said.