Music, Correct At Least Twice a Day

Written by Carl Stone

My last column, “Sound Ecologies,” sure got the comments flowing gratifyingly, and it was fun to lie back in my lawn chair and watch the to and fro. Early in the discussion we had a post from David Ocker, who pointed me to an article from BBC Music Magazine about how “Japanese researchers have come up with a novel alternative to the car radio, with the invention of a musical road surface.” It seems that a team from Hokkaido Industrial Research has built several of what they call “melody roads”: expressways with ridged surfaces where the spacing between the ridges is “tuned” to specific distances. As cars pass over the surface, their tires produce sounds whose frequencies change with the spacing of the grooves, producing discernible melodies. So far three roads in Japan are reported to implement this effect, one of them tuned to reproduce a Japanese pop song.
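The arithmetic behind this is simple enough to sketch. A tire crossing ridges spaced d meters apart at speed v meters per second produces a tone of roughly f = v / d hertz, so the spacing for each note follows directly from the intended driving speed. The little function below is my own back-of-the-envelope illustration of that relationship, not anything published by the Hokkaido team:

```python
def groove_spacing_m(speed_kmh: float, freq_hz: float) -> float:
    """Ridge spacing in meters needed to sound freq_hz at speed_kmh.

    Illustrative physics only (f = v / d): a hypothetical helper,
    not the actual design method used on the Japanese melody roads.
    """
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms / freq_hz

# Example: to hum A440 under a car traveling at 60 km/h,
# the ridges would need to sit about 38 mm apart.
spacing = groove_spacing_m(60.0, 440.0)
print(f"{spacing * 1000:.1f} mm")  # prints "37.9 mm"
```

Note what the formula implies: the melody is only in tune at the posted speed. Drive faster and the whole tune transposes upward; slow down and it sags flat.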

The thing is, I had vague memories of hearing about a similar system used in ancient Japan, where people would drag bamboo sticks over grooves in a road to create a vocalization effect. This has been cited by a scholar as one of the first, if not indeed the first, examples of “sampling.” So off I went on a search for a reference, and eventually landed in the middle of the back issues of the Computer Music Journal (MIT Press). There I happened upon a long-time favorite article from 1991, entitled “Machine Tongues 1: Music and the Electronic Media” by Roger Johnson. I thought that a couple of the things Johnson wrote about, while not immediately apropos of the musical roads question, might prove worthy of your comment. Here is a brief excerpt:

Today the computer has become completely familiar and ubiquitous. It has gone through stages very similar to those of all other electronic media, namely novelty giving way to familiarity, invisibility and then denial. Jeremy Rifkin has proposed that the computer is driving a fundamental revolution in our deepest understanding of time and our relationships to it. We have moved from a clock-based culture to a computer based culture. Just as the mechanical clock radically transformed western culture in the 14th Century, so the computer is restructuring our present sense of time in ways we have only begun to understand.

It is interesting and significant that musical notation, with its similarly profound effect on Western music, evolved rapidly at the same time that the clock did. Notation is a system to control time and encode memory. The clock and writing itself are both media. Musical notation embodies the clock’s conception of organized linear time just as it allowed the structuring of time and the coordination of simultaneous events. Like writing, it is a sign system and gave primacy to those musical events that could be effectively represented within the system. Notation fostered the development of complex polyphony as well as harmonic rhythm. It has been central to the entire development of western musical thought. It defined the musical commodity as a written entity rather than a sound, even though the notation had to be transformed into sound to be fully experienced. Most importantly, notation has encoded the memory on which music history is based.

Considerably more thought needs to be given to the implications of the computer as a musical medium. Both computers and music are about time, both are about memory. If the computer is in the process of remaking our culture, how has recent music been involved with this process? What effects does the computer have on the making, storing and using of music? How will the tendency of digital information to become easily decentralized affect the commodification and exchange of music in the future? Digital information is less fixed and more transferable, and thus less controllable by a hierarchical industry than was the case for most of the earlier media. Ironically, the technology creates conditions which transform the society which created it. Perhaps finally there is a chance to decentralize music and other media, to empower the outsider, to allow for a much wider range of communication.

When Johnson talks about “denial” he means that people tend, incorrectly, to think of the computer as a neutral tool of musical communication: not in computer music, of course, but in all the other aspects of musical production in which it is used (in other words, everywhere). Secondly, he shows real prescience in anticipating, back in 1991, what has happened to the music industry in the years since as a result of the computer and worldwide computer networks like the internet. Interestingly, in the first part of the article, Johnson talks about how the computer has enabled the commodification of music and thereby empowered “an enormous industry, one of the largest in the world [that] has grown up in the past 75 years or so to produce, control and market electronic musical ‘software’ ” (i.e., CDs and the like). Only now, 16 years later, is the irony fully apparent: the technology that created a Goliath industry has also served as the stone that threatens to topple it.

Your thoughts?