Finding a Home for the Longest Opera Ever Written

My apartment has become the temporary lodging place for the complete works of the Hungarian-American composer Gabriel von Wayditch (1888-1969), whose output includes 14 operas. Several of these are more than five hours in duration, and one—Eretnekek (which translates into English as The Heretics)—lasts over eight hours and is cited in the Guinness Book of World Records as the longest opera ever written.

In practical terms, this means large, heavy volumes of orchestral scores and piano reductions, a good many running to 1,200 manuscript pages. The piano reduction of Eretnekek fills some 1,531 pages compiled in four separately bound volumes, and its orchestral score, also bound in four volumes, runs to 2,850 pages. (He died before he completed the orchestration.)

What makes this particular story so unusual is that this vast body of work was created over a period of more than 60 years with little-to-no incentive. Only one of Wayditch’s operas, Horus, was ever performed during his lifetime and apparently not particularly well. (Two contemporaneous reviews survive.) The operas offer a variety of potential production pitfalls in addition to their unwieldy duration: they require large forces (110-piece orchestras), many require scene changes every few minutes, and all were written in Hungarian. Since they were not composed with specific performances in mind, there are also problems with the performance materials: e.g., there are no parts, and many of the piano reductions, in an effort to be representative of the orchestration, are virtually unplayable with two hands and a single piano. The elaborate plots of many of these works—he wrote his own librettos as well—recall the wild fantasies of outsider artist Henry Darger, whose obsessive life’s work, also created in obscurity, invites comparison with Wayditch.

Some back story for why I’m currently housing these materials is probably in order here. I first learned about this music 25 years ago when I chanced upon LP recordings of two short Wayditch operas, The Caliph’s Magician and Jesus Before Herod, which were issued by Musical Heritage Society. The booklet notes included a phone number to call for a Gabriel von Wayditch Music Foundation if interested in learning more about the music. I was intrigued enough by the music I heard on those LPs to call the phone number. It turned out to be the home phone number of the composer’s son, a tax accountant who financed those recordings and had dedicated his life to championing his father’s music.

Through the years I stayed in touch with the composer’s son, helping to get those LPs re-issued on CD (and thankfully they are still commercially available from VAI), writing the Grove Dictionary entry on Wayditch, trying to interest people in performing these works, and most recently—now that the composer’s son has died and his grandson has entrusted me with them—finding a permanent home for these unique manuscripts.

I’ve been intimately acquainted with this story for decades, but having these scores surrounding me has made the questions I’ve long harbored about this music all the more concrete: What force would drive someone to create such a vast body of work without any opportunities, either financial or promotional? How many other composers like Gabriel von Wayditch are there out there and what keeps their music from being more known? What could be done to preserve the legacies of such composers and keep the flames of support for them burning after they or their descendants are no longer around to carry the torch?

NewMusicBox provides a space for those engaged with new music to communicate their experiences and ideas in their own words. Articles and commentary posted here reflect the viewpoints of their individual authors; their appearance on NewMusicBox does not imply endorsement by New Music USA.

81 thoughts on “Finding a Home for the Longest Opera Ever Written”

  1. Dennis Bathory-Kitsz

    You have touched on an issue close to me as well. My dear friend Gilles Yves Bonneau composed some 300-plus works, one of which (“Timesweep”) is a 24-hour-long oratorio for soloists, chorus, pianos, organ, and large orchestra. The entirety of Gilles’s performances, however, fill about an hour.

    Gilles died in a tragic home accident a few years ago, after having moved from Vermont to Seattle. He bequeathed his entire lifetime of work to me — 17 boxes arrived after his death. (I am also a small publisher and had issued three of his pieces.)

    Our Vermont Composers Consortium will be doing a Bonneau concert in the next year or two, but the same questions arise. Why keep writing with no audience? Why so much? Why so detailed? He always answered me, “Does the violet that blooms in the woods need anyone to appreciate its beauty?” And so “Violet in the Woods” became a small retrospective CD of his music that I issued three years ago.

    But for me the question had no satisfactory answer, and the burden is what to do with his music. He was at best a regional composer, private, seldom pursuing performances, but a strong advocate for his composer colleagues. The university he attended is not interested. His family is not interested. No libraries want the material.

    So now it sits, carefully wrapped in plastic with rodent-repelling mothballs, in my barn’s hayloft. I’m 59. It won’t be too long before there are two sets of life work to be disposed of. Somehow.


  2. dalgas

    From the gargantuan to the minuscule…

    My friend Chris DeLaurenti wrote a small tribute to a Seattle-area composer who passed away at the end of 2005, Thomas Peterson (b. 1931).

    Tom and I ended up having pieces on the same concert, back around 1987. It was the only time I met Tom and heard any of the wonderfully energetic & astringent work of this man, who lived in a very poor flop-house in Seattle’s Chinatown (a far cry from studying in Vienna years earlier). I seem to remember that he made a good portion of what little money he had by working as a postal temp over the Christmas season.

    Chris’ description of what remains of Tom’s creations is rather heartrending; I can only hope that they keep being preserved by a few caring folks.

    Steve Layton

  3. species7th

    About twenty years ago I met with Karl Oldberg, the son of Arne Oldberg, a Chicago-based composer who apparently wrote upwards of nine symphonies and a huge collection of chamber works, piano pieces, and the like. His son told me that his father did very little to promote his music during his own lifetime. Karl, who at that time was in his late eighties, was happy at my show of interest in publishing a couple of his father’s scores—a piano sonata in B-flat minor and a set of piano miniatures. I am not aware that there’s been a revival of Arne Oldberg’s music, but from what little I’ve heard of it, I’d have to say that he’s every bit as clever as the best of the New England classicists—Arthur Foote, Amy Beach, John Knowles Paine, et al. Why so few performers have taken up his cause I can’t explain, but I think Oldberg’s time will eventually come. I said the same about Korngold and Ernst Toch over twenty-five years ago, when everybody laughed at me and thought I was crazy.

    In any case, reviving the music of unknown dead masters does them little good. We do it for our own edification, and we feel gratitude towards them. While this is a subject that interests me tremendously, I want to point out that there are some living masters who get far less attention than they deserve. For the record, I’ll name several: Joseph Fennimore, Henry Martin, Tison Street, Tomas Svoboda, Victor Steinhardt, Hsueh-Yung Shen, and Ladislav Kupkovic. I hope some talented performers take up their cause before they leave us.

    Gary Noland

  4. MarkNGrant

    The eponymous Wayditch to obscurity
    It is probable that on the grand canvas of historical time, more composers – and more creative artists of all genres of art, generally – fall into the Wayditch (could any unknown artist possibly sport a more cruelly appropriate surname?) than escape it. As some wag said, the spoils of history books go to the victors (I’m garbling it). A truer portrait of artistic creative activity in all eras would encompass the unfamed battalions of foot-soldiers hitching their wagons to unreached stars. Why does one stubbornly stay at such a thankless post? Because composers/creators experience an intrinsic fulfillment from creating their brainchildren that is in itself a compensation that no amount of money, recognition, or performances can match.

  5. philmusic

    Gerry, Steve and Frank, I was very touched by your comments. I think it reminds us that in some ways we are all in this together.

    When you consider that it is just not possible, during your lifetime, to hear all the music composed during your lifetime – something is bound to be missed.

    As to the “victors writing the history” Well, Bach gave Telemann the slip, but it took time.

    Phil Fried

  6. RBonotto

    I agree, Phil. We do a lot of pie-in-the-sky thoughts, but one that might help, seeing the excellent site (that links to the music itself) and article devoted to Thomas Peterson, is to construct a 50-state site that lists artists, composers, theatre folk, choreographers, etc.

    To explain: the flutist John Ranck has a site (I think it’s in “Music-related” and it’s at — very much worth a look) that links to composers/musicians who died of AIDS, among many, many other sites.

    There was also a long thread recently of regional vs. urban composers and visibility on the Orchestra-L listserv.

    And on my part, I feel that here in Boston the art, music, and theatre communities don’t know what the h*ll each other is doing. A state-by-state site (it would start as a memorial site) would be a good beginning, letting, say, orchestras look up composers past and present resident in their states; theatre people look for composers; choreographers look for the work of local artists; et cetera.

    Composers living in far-flung regions might (eventually) benefit from this most, since otherwise the Internet is an unreliable way of making their music visible. (I’d never heard of Peterson before, for example.)

    Of course, such an undertaking would be huge, but could start out as a memorial site, expand to an alphabetical listing per state, and have a built-in device that displayed a different state-artist/composer/etc. whenever one accessed the site. The name brought up would simply list a few biographical details and a series of links (like the Peterson article did).

    With the recent, easier methods of designing internet pages, this might not be as hard as it was only a couple of years ago.

  7. dalgas

    I might mention that Sequenza21/NetNewMusic hosts the beginnings of a new-music Wiki, that includes a section for links to any and all contemporary composers worldwide.

    Anyone is welcome to add themselves or any other relevant composer, big or small, living or dead (you do have to register, but it’s a minor process that keeps the spam-bots at bay). Feel free to contribute any biographic, discographic, or compositional information to any entry as well. More generally, folks are encouraged to post any practical, historical and theoretical offering to the other wiki categories, too.

    Steve Layton

  8. William Osborne

    It is very noble and good-hearted of you, Frank, to house Wayditch’s materials, especially considering that NYC apartments usually aren’t that large.

    Future archives will be almost completely digital. More than 1 million rare or fragile books have been digitized through the Google-Univ. of Michigan partnership since it began in 2004, with an estimated 6 million to go. Twenty-eight universities are participating in the Google book digitization program. The ultimate goal is to digitize the estimated 50 to 100 million books in existence and make them available online.

    A composer’s entire life’s work, with all of her scores, instrumental parts, writings, art work, photos, video, and compressed recordings, could easily be placed on a single DVD-ROM. Exploring the materials would be very simple because related topics could be hyperlinked to each other. In addition, the DVD-ROM could be uploaded to a server and made easily available to the entire world. Scores could also be sent via email. Copies of the DVD could be made in minutes, and sending the disc to libraries would cost about as much as a letter. Since the disc would take up a minimum of shelf space (about ¼ inch), multiple libraries would have few reasons not to accept the materials. Compare that to the work and costs of paper archives. Software could easily be designed to scan DVD-ROMs and catalog the contents for a library’s cataloging system.

    If necessary, uncompressed video and sound recordings could be included on separate disks, but that would also require little shelf space.

    I archive almost everything I do on my website, which is now up to about 500 megs. For ten dollars a month I get 190 gigs of disk space on my server, so I have only used a tiny fraction of the space available – about 1/380th so far. I suppose when I have everything on my site it will be about a gig in size. This means that for ten dollars a month one could archive the entire life work (scores, recordings, writings, biographical materials, photos, etc. etc.) for at least 190 composers even if the materials for each took up a gig of space, which is very unlikely.
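    The back-of-envelope arithmetic above can be checked with a quick script. All the figures are the rough estimates given in this comment (500 MB used, 190 GB rented, roughly 1 GB per composer), not measured values:

```python
# Back-of-envelope check of the hosting arithmetic above.
# All figures are the commenter's rough estimates, not measurements.
site_size_mb = 500            # current archive: about 500 MB
server_space_gb = 190         # rented server space: 190 GB for $10/month

# Fraction of the server currently used (decimal GB for simplicity)
fraction = site_size_mb / (server_space_gb * 1000)
print(f"Used: about 1/{round(1 / fraction)} of the server")   # 1/380

# If one composer's complete life work takes roughly 1 GB,
# the same $10/month server could archive:
gb_per_composer = 1
print(f"{server_space_gb // gb_per_composer} composers")      # 190
```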

    Scanning all of Wayditch’s papers would be a big task, and I wonder if it will ever be done. Future composers will not face this problem, since most digitally notate their scores. Converting such scores to digital files is very simple.

    Of course, none of this will change the fact that much of the music will be considered garbage. As I discussed a few weeks ago, we will live in a culture of digital detritus, but at least it will be extremely well-compacted garbage. Ever the cynic, but these archives will someday allow for wonderful historical research and perhaps even some pleasant discoveries.

    William Osborne

  9. Dennis Bathory-Kitsz


    Thanks very much for bringing this up. Let me talk a little bit about digital archiving, as I have been archiving my own material and others’ as well as restoring audio media for about 35 years.

    The simplest thing to say is that it isn’t simple.

    First, about size. Compression is the most serious front-end problem because the detail is lost. Detail is very important and anyone who archives material and compresses it has (way)ditched an enormous amount of information. A compressed video is a faint copy of the original, losing character and motion. In audio, compression and even digitization itself are significant. (Google an essay called “Don’t Destroy Those Archives” by Michael Gerzon.)

    Certainly digitizing something is better than letting everything go to the dump, but it’s not a true solution, at least not yet. As many digitization projects have learned, media change and deteriorate. Massive quantities of information were stored on 9-track tapes (including the detailed U.S. Census data). Not only are the machines rapidly breaking down, the tapes themselves have not been manufactured in years. Projects such as sound environments and installations require expensive curating to keep functional; some need time and money to emulate as hardware vanishes. (I emulated a rare old analog synth; believe me, it’s a few months of work.)

    The life of a CD-R or DVD-R is nowhere near what is claimed. A Dutch study showed some data deteriorating in as little as five years on disks kept in good storage. That means time and money for transfer to new formats as data degrades. A recent archiving project has suggested a thousand-plus-year life for digital archiving with ordinary hard drives in a self-repairing cluster. However, it is not even the media that become the problem, but formats, connectors, etc. Already, less interesting digitized information is being allowed to ‘rot’ rather than take the time to up-convert it to contemporary formats.

    Your estimates of what fits on these media, even in compressed form, are very low. My archive of projects done for myself and others over the past ten years occupies thousands of CDs and several hundred DVDs. This archive includes drafts and sketches, which historians find critical to understanding.

    Let’s talk about Google’s archiving project. I used about a dozen of these for a book I finished last week (out in September — announcement soon!). Hemenway’s 19th century Vermont Gazetteer (images of text only) consumed nearly 80MB without Google’s metadata and indexing (which are quite impressive). Photos, scans, text, etc., that made up my modest book totaled 8.6GB.

    Manuscripts are entirely more complicated. If the idea is merely to grab a concept, home or small library scanning works. But what about ambiguous details of pencil sketches? Material covered over or impressions in the paper? Even oversized pages or three-dimensional content? More is needed. Curating any archive is a commitment, as Frank already knew from the AMC’s massive pre-digital library.

    Archiving also requires the judgment of history, which is in short supply. (“What do we need these Fifth Symphony sketches for? We have the published version, right?”)

    (More on this from six years ago is found in my essay, Preserving the Future’s Past. Much is already outdated.)

    I agree that digital archiving is important, and I’m glad you brought it up. But it will not be a solution for quite a while, if ever. The more we can keep, the more we do keep. (There’s another topic going on at Sequenza21 called Take Out the Trash about how composers do or don’t keep their own works. Some, like me, are packrats; others are historical minimalists.)

    So in the case of Wayditch or Bonneau or the other composers mentioned, curating their archive, whether in its original format or converted to digital, is a daunting task.


  10. William Osborne

    You are right, Dennis, digital archiving presents many challenges, but then all archiving does. The purpose of the DVD-ROMs I mentioned would be ease of access, transport, reproduction, availability, and quick hyperlinked referencing. It also allows for online resources that instantly reach worldwide publics. And as I mentioned, uncompressed recordings could be included in archives as well.

    And let’s not forget that good compressed digital formats still sound better than most old scratchy LPs and tinny cassettes. And how much longer will the machinery for those types of recordings be around? It is exactly because of digital technology that our standards have been raised.

    You are right that storing scanned images of paper documents requires a lot more space (though a single Blu-ray disc could store thousands and thousands of such pages). Some composers consider every one of their little chicken scratches a priceless document that will induce quasi-erotic fantasies and profound debate among future musicologists feverishly studying their work. I’m afraid those “geniuses” will face considerable problems creating their archives.

    Most hard-copy text documents, however, can be converted to digital text with OCR software, which is very accurate for most typed texts. This reduces storage space by a factor of tens of thousands. There is now fairly good OCR software for printed music (as opposed to handwritten), and I suspect the software will continue to improve.

    I too have read alarming reports about the durability of digital media, though there seems to be little consensus about how long the media can last. One-terabyte hard disks are now quite affordable and could store the work of about 300 to 400 of even the most prolific composers. And as you noted, a hard disk could last a thousand years. Compare that to paper. And I seriously doubt that accessing legacy formats and operating systems would pose a challenge as difficult as maintaining tons of thinly pressed organic matter.

    All things considered, digital media will unquestionably make it more likely that current and future music will survive and be widely accessible. Small rows and columns of inscriptions on thinly pressed sheets of organic matter, hauled around by carbon-burning vehicles, put us much closer to the ancient Egyptians than to what the future of archiving will be. All those immortal geniuses who want to preserve images of paper documents might as well get some papyrus and a donkey.

    Sorry, I was going to send this to you on paper from Germany by coal burning ship, but I couldn’t find a stamp with that horse hoof glue.

    William Osborne

  11. Dennis Bathory-Kitsz


    I guess my point was lost.

    Compared with the stable formats of paper or film (or stone tablets, if you like), digital archiving of pre-digital materials is less accurate and all digital materials demand significantly more creation and maintenance time and cost (media and up-conversion). Neither Wayditch nor Bonneau would have lasted as long as they did in some storeroom.

    Current digital technology is astoundingly useful (all those access and transport points you make) but also astoundingly fragile.


  12. Chris Becker

    “And let’s not forget that good compressed digital formats still sound better than most old scratchy LPs and tinny cassettes.”

    This may sound insane but I have actually considered remastering some of my music to vinyl for archiving purposes. This might work for some kinds of music and not so well with others though.

    Dennis, are there online storage options and services you are aware of and recommend that composers might want to consider for archiving? Or is it in your opinion just as easy for a composer to ftp anything they’ve got in a digital format to their own rented server space?

  13. Dennis Bathory-Kitsz


    It depends on how paranoid you are. Online services disappear overnight (anybody remember lockers?).

    FTP to reliable storage is cheaper (I use pair Networks) and keeps one copy of your digital material off-site but under your own control.

    I have a copy of all my digitized scores and audio, but also have 2.5TB of hard drive storage at home along with all those recordable CDs and DVDs (which also have client stuff that needs to be protected).

    Right now there’s no answer but multiple redundancy coupled with ease of maintenance. My remote server has 40GB of stuff on it; it crashed two months ago and fortunately pair Networks does daily backups — but it still took more than 12 hours to restore. Now I have a shadow drive installed there. (It’s not just my stuff to protect — VoxNovus, Composers21, Kalvos & Damian, Vermont Composers Consortium, Trans/Media and 20 other arts sites reside on my server.)


  14. rtanaka

    Just as an aside, I work at a library as part of a book digitization project. If the process is done in a professional environment with sufficient means of backup, digital files are generally much safer and more accessible, if you’re thinking about their preservation value. What seems to make people nervous is hard-drive failure and corruption, which happen quite frequently, even with advanced technology. In order for it to be “safe”, there need to be multiple copies stored in multiple locations, preferably by an organization or organizations that aren’t about to disappear any time soon.

    We’re currently digitizing books whose copyrights are about to run out — so predominantly books written before the 1930s. Some of it has to do with the problematic acetate materials they were using in the older days, but I do often run into books which are disintegrating, and it’ll probably be only a matter of time before they disappear completely. (Part of the reason the project exists to begin with — it’s as much a preservation project as it is a digitizing one.) Then there’s the matter of human error and natural disasters — books get lost, stolen, or damaged, or get mixed up in massive unorganized piles, never to be seen again. Working behind the desk for the last few years, I know all about the sorts of things that fall through the cracks due to pure negligence.

    Also something to consider — shelf space. The world is becoming more crowded, so there’s generally less space to go around. But at the same time, more people means more books and more ideas floating around. Most libraries right now are hurting for shelf space, and they’re forced to be more selective in the types of books that they buy.

    Anyway, some things to consider if you’re thinking about where your output might go in the long run. I do think that the digital route is probably going to be the way things are handled in the future, at least within the library system. It may be expensive for individuals but the overhead costs aren’t so bad for institutions if you consider the fact that these computer systems can handle millions of copies of books at once.

  15. Dennis Bathory-Kitsz

    There’s some wandering from the issue. Is digital preservation good? Yes. It’s numerically immutable, infinitely replicable and universally transmissible. There are resolution problems but one hopes time will solve these. There are digital preservation issues but one also hopes time will solve those.

    The problem is what to preserve (and up-convert) as time passes. The curatorial problem is not improved by digital technology. I don’t know what condition the Wayditch scores are in, but the Bonneau scores are pencil manuscripts.

    They contain no metadata. They need to be handled, sorted, indexed, examined, scanned, and proofed by people with time and money. For what gets through the triage, the next step is preparing editions. In other words, we have the same problem that existed before.

    Remember, nobody wants these. An entire compositional life has been turned down by family and university. That’s how problematic archiving is. Digital preservation of books is little more than up-converting what has already passed editorial muster. This is the question of judging a compositional life. I wish M. J. Leach were reading along and could point out the struggle with finding and preparing the music of Julius Eastman — a composer already legendary. When I prepared part of the archive of fluxus artist Kenneth Dewey 30 years ago, it went to the Franklin Furnace archive … until they ran out of money and, as far as I know today, back to the family.

    Frank entitled the thread “Finding a home…” The issue is not whether one can scan a bunch of pages and get them onto a server, but rather finding a home — not just one more storeroom, this time a digital one, where the work will again be forgotten.


  16. Chris Becker

    Thanks, Dennis. I use too – but I haven’t really utilized my deal to archive my own work.

    After Katrina, the artist who created the artwork for my Saints & Devils CD (who ended up with a lot of his creative work thoroughly baptized in mud) insisted I start backing up my stuff. That’s when I bought an external drive. I’m grateful for the info here and push to investigate options.

  17. William Osborne

    Dennis comments: The issue is not whether one can scan a bunch of pages and get them onto a server, but rather finding a home — not just one more storeroom, this time a digital one, where the work will again be forgotten.

    Perhaps that is exactly the point. Big bulky collections like the Wayditch are difficult to house, so libraries don’t want to give them a “home.” On the other hand, two or three DVDs can contain an enormous amount of material from a composer’s life, and libraries could hardly turn down such small objects. Second, the DVDs can be cross-indexed with hyper-links and thus made easily researchable. This becomes an enormous factor in the “triage,” research, and housing issues you raise.

    This will be a significant factor for storing the work of composers from this point on, but composers of the old paper medium still have a problem, so let’s look at that. How many pages of material are in the Wayditch collection? Shall we say 10,000 as a wild estimate (and one vastly removed from the norm). Even a fairly high scanning resolution would be around 1.5 megs a page for JPEG files. His entire work would thus require only 15 gigs – a small fraction of the hard disks we have now, or three standard DVDs. Double the estimate to 30 gigs and it is still not a storage problem. Every twenty years an archive might have to spend a couple of hours copying the DVDs onto new disks. A small task.
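    The wild estimate above can be run as a quick sketch. The page count and per-page file size are the guesses from this comment, not measured values:

```python
# Rough storage estimate for scanning a large manuscript collection.
# Page count and per-page file size are the comment's wild guesses.
pages = 10_000               # estimated pages in the collection
mb_per_page = 1.5            # fairly high-resolution JPEG scan

total_gb = pages * mb_per_page / 1000
print(f"Estimated size: {total_gb:.0f} GB")   # 15 GB

# Doubling the page estimate still stays small by modern standards.
print(f"Doubled: {2 * total_gb:.0f} GB")      # 30 GB
```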

    Sadly, I doubt anyone will scan Wayditch’s work, but the example illustrates the options we now have.

    There are aspects of fragility to digital media, but I think those problems will soon be solved, because the solutions will make a lot of money.

    BTW, in 1998 I recorded a 50-minute quadraphonic work in stereo. At that point, surround sound recording software was still very expensive and I didn’t have it. I saved all the unedited wave files from the recording sessions on DVDs in case I wanted to re-master the work later. Eight years later my recording software had improved immensely, and included surround sound, so I dug out the old DVDs with the unedited wave files to re-edit them. They all loaded without a single glitch. I was able to create a much better recording, including a DVD-A with 5.1 surround sound.

    I have quite a few paper scores from the 80s. I am slowly re-notating them with Finale when I have time. In my case, PDF files will have a much greater chance of remaining than those boxes full of paper. Whether they should remain is another question….

    William Osborne

  18. rtanaka

    Perhaps that is exactly the point. Big bulky collections like the Wayditch are difficult to house, so libraries don’t want to give them a “home.” On the other hand, two or three DVDs can contain an enormous amount of material from a composer’s life, and libraries could hardly turn down such small objects. Second, the DVDs can be cross-indexed with hyper-links and thus made easily researchable. This becomes an enormous factor in the “triage,” research, and housing issues you raise.

    This is largely what I wanted to bring up — being sensitive to the practical issues behind archiving will, I think, help composers increase their chances of finding a place for their output in the long run. Although there may be a resurgence in some areas in utilizing older technologies (DJs often use vinyl for scratching, for instance), the fact of the matter is that vinyl records are brittle, heavy, and take up quite a bit of shelf space compared to CD and DVD formats. Although CDs are also subject to potential damage, they’re not as subject to the wear-and-tear that naturally results from the use of vinyl. Digitization solves many of these issues, and that’s the direction most libraries seem to be going in right now. LPs are largely on their way out, and I wouldn’t be surprised if in 10–20 years they started to do the same with CDs and DVDs as well. Everything that can be put online will be put online, eventually.

    For individual composers, my recommendation would be to digitize everything that can be digitized — convert scores into PDFs and recordings into MP3s if possible, then find some place online to house them for free. There are plenty of such places now, and since these sites will likely be around for a very long time, the chances of the material sticking around are pretty good. (The IT people at those companies or institutions will do your backing up for you, which is pretty convenient.) You can also increase the chances of people finding and taking interest in your work if it is categorized appropriately.

    I understand the concerns about the loss of fidelity, and putting “labels” on where your music might fit in can be kind of a drag. But I think that the loss of nuance is something that has to be generally expected when doing any kind of archiving work. Of course a DVD isn’t going to replicate the experience of the live performance, even if it’s well documented. There are digital photographs of paintings, for instance, which some people are adamantly opposed to, because you lose a lot of the subtle gestures of brush strokes and texture. But the idea is that it’s supposed to give whoever’s looking or listening a general impression of what the work is about, and I think this is useful especially for people who don’t normally have much experience dealing with artworks.

    I don’t want to discourage people from doing things outside of the standard score and recording formats, but this will make it more likely that the artist will end up competing for limited shelf space with people like Crumb, Xenakis, and other composers who did things in that vein. Those scores tend to be an anomaly and require their own special section because they don’t typically fit anywhere else. And unfortunate as it is, libraries are occasionally forced to do some spring-cleaning when new material comes in, and things that don’t get looked at too often are often given away, thrown away (!), or stowed away at a storage facility such as the one I work at. I guess the artist would have to ask themselves if it’s really worth the hassle — it’s important to be realistic about it, I think.

  19. William Osborne

    Just another comment in regard to Denis’ well-considered thoughts about the fragility of digital media. Another option would be to archive information on memory sticks (flash RAM). An 8 GB unit is about the size of a house key and could easily hold the life work of a prolific composer. They are quite durable because they have no moving parts.

    The durability of a memory card can be measured by the following parameters:

    Humidity: Flash memory cards can tolerate extensive fluctuations in humidity, from 5% up to 96%.

    Temperature: Flash memory cards can tolerate operating temperatures from -13F / -25C to +185F / +85C. The tolerance in storage temperature is even more extreme, from -40F / -40C to +212F / +100C.

    Drop Shock/Vibration: Flash memory cards can withstand an operating shock of up to 2,000 Gs, which is equivalent to withstanding a free-fall drop from up to 10 feet / 3 meters.

    More info here:

    They can also be locked with a password so that they can’t be overwritten or erased.

    So you can raise them to boiling temperature, freeze them, drop them from ten feet, or keep them in the humidity of a desert or jungle and they still work. There are no doubt some drawbacks, but it’s probably better than the papyrus-and-a-donkey technology currently used by most composers. And what library could refuse something so small? Well…in my case maybe quite a few, but it’s an idea for the rest of you.

    William Osborne

  20. Dennis Bathory-Kitsz

    William, or should I say Wiliam, :)

    This will be long, so I’ll make clear that I use digital archiving and do not advocate for papyrus and donkey where it is the inferior format.

    So let’s assume that storage continues to grow and continues to be stable, and that conversion resolution continues to grow as well.

    (The current problems with flash RAM are write cycles, and with any digital storage media, the EMP — a different sort of social issue. And that 1,000-year hard drive is a self-repairing cluster; these don’t yet exist, would cost about $40K per TB to build with present technology, and would grow obsolete long before they would fail.)

    (And let’s also assume that we’re not talking about barn-door composing — that is, the composers who, once they’ve written enough music, say there’s too much music in the world and we don’t need any more and people should just dump their old stuff — but then keep their own.)

    That leaves two problem areas: the scope of the digital replica, and the curatorial problem.

    Last first. The curatorial problem is, for libraries, the most severe. Preservation alone is not the issue; libraries are full of material that has sat for years in relatively stable conditions but is simply not curated. It has no indexing and often little identification. Composers are notorious for crappy identification of their own works, sketches, source materials, etc.

    I mentioned earlier that I do archiving; it’s not casual. I was hired to do the chronology and annotated bibliography of playwright Kenneth Sawyer Goodman for the Newberry Library (the resulting book is long OOP, but can be found on antiquarian sites like Abe Books). I worked on that project one day a week, and several weeks on-site in Chicago. It took three years to identify, describe, and cross-reference what were essentially ten boxes of well-kept materials, the bulk of which were created from 1911-1918.

    In other words, it took the full-time equivalent of 30 weeks to properly curate just nine years of a single artist’s work. There was no metadata — and its absence would have mattered just as much to the process if the entire body of work existed on a memory stick or in neatly packed boxes. Had the family not paid the cost, it would still be inaccessible. Yes, it’s still in boxes in a vault, but now everything is tagged and the reference book exists.

    For others who would like their material in a library — forget it. I know one composer who set aside a trust fund to maintain his archive at his alma mater.

    And in the interim — before stable, long-term storage — there is also the need to up-convert formats before they become technologically inaccessible or incur damage that, for digital materials, is fatal. What is a blemish on a paper carton destroys the entire contents of a disk or memory card.

    Next. We are mostly a bunch of artists talking about this. And I assume that artists by and large take their work seriously and struggle with details and sensibility. That is why I am surprised that some of those same artists believe digital technology is their friend. There is a dismissal of the effects of digitization. Based on follow-up comments, I gather they haven’t read that Gerzon essay, but it’s worth it. Here it is.

    Creating an archive of paper scores is not so hard, even given the triage and indexing issues — and the sheer time demand of scanning and photography. But graphical and 3D works, performances, improvisations and electroacoustic pieces are another story. Archiving pre-digital electroacoustics isn’t as simple as plugging the old TEAC into the SoundBlaster and wailing away. You end up with lo-fi crap, even if the tape still plays (and mastering tapes from the 1980s probably won’t), and then it ends up in a format that might itself be frangible.

    So what about archiving digital EA or converted analog — what multitrack software will read files even a decade old? Even electronic versions of paper: How about files in Finale format? When will PDF be, to use the kind term, deprecated? How about costumes, tools, custom instruments?

    Where the technological push is always forward, there’s little occasion to look back. Backward-compatibility enthusiasts are freaks. I have a drawer full of hard drives that I up-converted last week because the parallel interface is about to go all dodo on us.

    When Chris mentioned archiving to vinyl, he wasn’t entirely off the wall. The key is archiving to a stable, future-proof format. But he notes the problem: fidelity. Again, we are artists, and artists deal in subtlety.

    It’s not an idle question to ask what’s lost during lossy compression. Most compression is perceptual, removing elements masked by ear or eye during normal reproduction. Have you ever tried to re-master or re-balance a compressed file? You get an ugly, blotchy monster. Sections that are buried or unclear have been eliminated. You’re not getting them back.
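    (A toy numerical sketch in Python may help here. It is not a real psychoacoustic codec; coarse quantization simply stands in for lossy compression. But it shows the mechanism: a quiet component buried under a loud one falls below the error floor, and “re-balancing” afterward recovers mostly noise.)

    ```python
    import math

    # Hypothetical illustration: 8-bit quantization as a crude stand-in
    # for lossy compression. A quiet detail masked by a loud tone drops
    # below the quantization floor and cannot be recovered afterward.
    N = 1000
    loud  = [math.sin(2 * math.pi * 5 * n / N) for n in range(N)]           # strong tone
    quiet = [0.001 * math.sin(2 * math.pi * 50 * n / N) for n in range(N)]  # buried detail
    mix   = [a + b for a, b in zip(loud, quiet)]

    def quantize(x, bits=8):
        """Round a sample to a coarse amplitude grid (the 'compression')."""
        levels = 2 ** (bits - 1)
        return round(x * levels) / levels

    decoded = [quantize(s) for s in mix]

    # Attempt to "re-balance": subtract the loud tone, hoping to expose the quiet one.
    residual = [d - a for d, a in zip(decoded, loud)]

    # Compare the recovery error against the quiet signal's own power.
    err = sum((r - q) ** 2 for r, q in zip(residual, quiet)) / N
    sig = sum(q ** 2 for q in quiet) / N
    print(err > sig)  # the error swamps the quiet signal: it is gone
    ```

    The quiet tone’s amplitude (0.001) sits below half a quantization step (about 0.004), so what the subtraction exposes is dominated by quantization noise, not the buried material — the “ugly, blotchy monster” in miniature.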

    I bring these things up because aside from being abysmal keepers of their own work, artists are also prey to technological fads. Don’t think that because you’ve stuck your Sibelius files or mp3 recordings or jpeg performance photos & diagrams or QuickTime videos onto a DVD-R or uploaded them to someone else’s service business that you’ve created an archive. More likely you have created a fragile mess of future-fail information with gaping informational holes that will shortly become inaccessible not only to you but also to future archivists.

    Back to Chris. His vinyl archive might be tongue-in-cheek, but it is a significant point. Much of this will sound exquisitely self-centered, but if you value your work beyond today, consider:

    1. Index your work — identify what it is and where it is located, as well as what has been destroyed.
    2. Use the highest available technological quality, and (per Gerzon) keep the original.
    3. Annotate it, even if briefly — when it was done, where the sketches are if you keep them, the sources, previous versions, and next steps. Use text and photos and scans, unless you like to preserve the mystery. And do it when you make the work, because you won’t remember later.
    4. Make redundant copies — hard drive, CD/DVD, online, paper, film, vinyl, stone tablet — of the artistic object and its annotations. Keep them in different places, and if you use an online service, use two.
    5. Beware of paper. For that final copy, use acid-free paper and archive-quality printing (hi-temp laser).
    6. Up-convert. This is a hateful job, but it means recognizing when your technology is about to vanish and copying or converting to a new format, then (again per Gerzon) saving the original.
    7. Use no technologies dependent on digital rights management (DRM) and avoid software that is tethered (requiring a license key; see this 2003 essay for more).
    8. Document. This is a real issue, and some performers are particularly reticent about this. So bootleg if you have to. Only history will know what matters.
    9. Appoint a “curator” and set up a maintenance fund. This might be spouse, friend, child, or institution. Make clear what should be kept and where to find the metadata. Nothing is more overwhelming than shoveling out closets and drawers full of unmarked paper and reels and film and disks and slides and drives. The maintenance fund need not be big, but should cover at least the institutional triage that will follow.
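    (As a concrete, if hypothetical, sketch of guidelines 1 and 3, one might write a small “sidecar” annotation beside each work at the moment it is made. The field names below are invented for illustration, not part of any standard.)

    ```python
    import json
    import os
    import tempfile

    # Minimal sidecar-annotation sketch: a JSON file saved next to the work,
    # recording when it was done, where the sketches are, and prior versions.
    def annotate(work_path, **fields):
        """Write <work>.json beside the work file with basic provenance notes."""
        sidecar = work_path + ".json"
        with open(sidecar, "w") as f:
            json.dump(fields, f, indent=2, sort_keys=True)
        return sidecar

    # Demo with a throwaway file standing in for a real scanned score.
    folder = tempfile.mkdtemp()
    score = os.path.join(folder, "quartet_no2.pdf")
    open(score, "wb").close()

    sidecar = annotate(
        score,
        title="String Quartet No. 2",
        completed="1994-03",
        sketches="box 4, folder 12",
        previous_versions=["quartet_no2_draft1.pdf"],
    )
    with open(sidecar) as f:
        print(json.load(f)["sketches"])  # box 4, folder 12
    ```

    Plain-text JSON is a deliberate choice here: it survives format churn better than any proprietary container, and it travels with the file through every redundant copy made under guideline 4.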

    Okay, that’s it. Sorry this was more like an article than a post, but I worry about a certain glib faith in one’s work preserving itself. (And, alas, I spend more time curating and restoring others’ works than my own…)


  21. rtanaka

    Nice post Dennis. And you’re right that artists are the ones who take their work the most seriously. Boy, do I see a lot of stuff get lost or misplaced due to pure negligence or not-caring on the part of some staff, especially student workers. If you’re intent on leaving your work for future generations, it’ll largely be up to the composer to do it. Then maybe make some friends with librarians…:)

    For my purposes, my music usually fits well within the scope of the standard formats so I don’t tend to sweat it too much. I don’t use pencil or paper anymore so I’m not particularly concerned with the texture, and everything that I need can be done in Sibelius with maybe an occasional graphic file embedded within. Should my apartment burn down and everything get stolen, it’ll still be on Sibeliusmusic where I can just print out another copy. Sure, the midi playback is really bad, but it’s not something you generally expect from it anyway.

  22. William Osborne

    Thank you for your thoughts and research, DenNis. (Sorry, I must be dyslexic.) Yes, composers need to properly collect, organize and index their work, since no one else will likely do it for them.

    In fact, I think digital formats are an excellent means for organizing work. The plethora of composer websites is an example. The sites are often broken down into carefully thought-out categories and they are usually kept pretty well up to date – even if a lot of material is not included (and often probably doesn’t need to be). Organizing paper, on the other hand, is a big problem. An example is how it took you nine years to archive the papers of that playwright. And paper is not even remotely as accessible to the public as digital media.

    And surely you wouldn’t suggest that paper could take the stresses I described above that a memory stick can withstand. (E.g. 212 degrees and 96% humidity.) At the very least, it shows that there are two sides to the story.

    I am also not so sure that legacy digital formats can be compared to legacy analog formats. The latter require specific machinery to reproduce them, and that machinery is often hard to come by – e.g. eight-track recorders. Computer-oriented digital information, by contrast, can be converted to match new software and operating systems by adapting programming or creating emulators, which are far easier tasks than finding, maintaining, and repairing antique machines. (The plethora of free downloads that can create, edit, and merge Adobe’s PDF files is an example. Another is the excellent Atari emulators like Steem, which are now better than Ataris ever were.) Legacy formats can be easily archived, copied and shipped by post, but old machines are essentially irreplaceable and shipping them is extremely risky. The digital formats will also always be much more widely distributed. Probably a billion people are using Windows Media Player, but how many used eight-track recorders?

    And if a library will accept a DVD-ROM, they will likely also accept a dozen uncompressed CDs or DVDs of audio-visual materials. Space is not a big issue with recordings. (Most of my recordings are quad works on DVD-A with 5.1 surround sound. There is no effective compression format for surround sound.)

    One should carefully consider the risks of digital media, but I feel you are overlooking facts such as these, and that your comments thus have a slightly alarmist character. It would be great if we could all persuade libraries to harbor our papers for eternity, and if the opportunity presents itself we should grab it, but I think many of us will need to take a more pragmatic approach. And regardless of the format, digital or organic, we will all have to rely on curators willing to maintain the collection long after we and our families are gone.

    William Osborne

  23. Dennis Bathory-Kitsz

    I haven’t even begun to be alarmist. You want alarmist, I can scare the skivvies off you. You name the digital medium, and I can tell you how everyday behavior can destroy it forever in much easier ways than a flood or house fire can destroy paper.

    My point is only to balance the digital enthusiasm of infallibility with some serious reality about fragility, conversion problems, and future-proofing. (I have STEEM too, but it isn’t reading those 5-inch diskettes.)

    Fact check: That playwright archiving took 165 days over three years; his works occupied a period of nine years. And it would not have taken me less time if it were in digital format because it had no metadata. That is the curatorial challenge — to provide reference and context.

    I can always tell when a book used software indexing without review by an editor. References are not prioritized, categories are absent, cross references (esp. the ‘see also’ kind) are missing. Software is good at counting and finding patterns in raw data, but presently incompetent at context.

    Let’s go back to Wayditch and Bonneau, and Frank’s questions:

    What force would drive someone to create such a vast body of work without any opportunities, either financial or promotional? How many other composers like Gabriel von Wayditch are there out there and what keeps their music from being more known? What could be done to preserve the legacies of such composers and keep the flames of support for them burning after they or their descendants are no longer around to carry the torch?

    The first two haven’t been addressed, and should be. The last was not about mere preservation of documents, because Frank has the manuscripts as I have Gilles’s. It was about preserving legacies and keeping the flames of support burning.

    To be simplistic, legacy and support derive from metadata. Only when these works are given context (artistic, societal, temporal [even within a composer’s life], length, orchestration, style, even idiomatic writing) do they become supportable. My first response to William ended with, “So in the case of Wayditch or Bonneau or the other composers mentioned, curating their archive, whether in its original format or converted to digital, is a daunting task.”

    I’d already written that digital is numerically immutable, infinitely replicable and universally transmissible. I think we’ve dealt adequately with digital vs. donkey, so let’s explore (even though this topic has now dropped off the front page) how that preserved data becomes a supported legacy.


  24. William Osborne

    I’d already written that digital is numerically immutable, infinitely replicable and universally transmissible. I think we’ve dealt adequately with digital vs. donkey, so let’s explore (even though this topic has now dropped off the front page) how that preserved data becomes a supported legacy.

    It would seem that this statement provides, in part, its own answer. Digital media is easily transmissible to worldwide publics. And by its very nature it is imminently indexable. (Google and Yahoo are good examples.) Won’t factors such as these play an enormous role in creating “support legacies”?

    William Osborne

  25. Dennis Bathory-Kitsz

    Excellent! Thank you for volunteering for this “imminently indexable” task. When can you begin identifying, sorting, scanning, indexing, referencing, annotating, contextualizing and uploading Bonneau’s work? We have a comfortable hayloft and you can stay as long as you like!


  26. lisa_hirsch

    Dennis isn’t the slightest bit alarmist. To claim otherwise indicates that you’re not especially familiar with the technologies you’re espousing. To start with, there’s this:

    And surely you wouldn’t suggest that paper could take the stresses I described above that a memory stick can withstand. (E.g. 212 degrees and 96% humidity.) At the very least, it shows that there are two sides to the story.

    That doesn’t mean memory sticks don’t fail. The other point you’re leaving out is the format problem: digital formats die. Immeasurable quantities of data, from census records to NASA missions to business records, are out there in file formats that can’t be read: the computers that ran the software that could read them are dead, and even where the machines exist, the software no longer does. Digital translation, which you represent as comparatively easy to do, actually isn’t.

    Some 25 years ago, in another life, I worked for Mass Mutual Life. Among other services, they offered admin-only health plans, which were funded by the insured company and administered by Mass Mutual. They had a few data tapes belonging to a couple of customers and needed to find out which customers. I arranged for a data recovery company to look at these tapes – there was exactly one company I could find in the Bay Area that had the equipment to read what was on those tapes. Mass Mutual itself couldn’t do it, and presumably they had at some point owned the machines that produced the data tapes.

    The situation has not gotten better since then. Digital media aren’t particularly stable – you’re kidding yourself if you think hard drives are stable. They fail all the time. Redundancy solves part of that problem, but…

    And as Dennis has said repeatedly, the organizational issues with archives are profound.

  27. pgblu

    For those who are frightened by the current conversation, keep in mind that the time when what you do is most relevant and urgent is NOW. Don’t let the worry about permanence get in the way of the ineluctable experience of impermanence. Don’t let all the cameras that are there to capture your performance from every imaginable angle prevent the people that are watching in the HERE and NOW from seeing what the hell you’re doing.

    Sorry, as you were, gentlepersons. This is needless to say an extremely important discussion. Can’t we get some people from the cutting edge of archiving technology in here? For all their hard work on the topic, Dennnis and Willliam are not exactly professional experts in all things archiving. What are the brilliant people at Google coming up with? Surely they are as aware as we are of the pressing nature of this problem?

    [pressing: no pun intended]

  28. William Osborne

    Dennis, a digital archive of Bonneau’s work would indeed be far less work to index than boxes full of papers. It could also be cross-indexed with hyperlinks and put online. It’s too bad he was not able to leave his work in some sort of digital form in addition to the paper. His work would already be much more widely known. But all the same, I’ll be happy to share your barn with your donkey and boxes of papyrus… (-:

    I also think Phil makes a good point. Lisa and Dennis take somewhat polarizing standpoints. No one here is denying the serious problems with digital archiving. Of course hard disks fail, and even more stable formats like memory sticks are still tricky. Do you think you are telling anyone here anything new?

    Here is the point: If the view becomes too fatalistic people are discouraged from creating digital archives of their work that could be extremely helpful and useful. (And this is to say nothing of the fact that they can be printed out if that type of archiving is needed.)

    We should also consider that new technologies almost always go through an initial burst of innovation (usually about 20 years) and then level off. The first automobiles were motorized horse buggies, but by the mid-’30s cars had essentially evolved to the basic form they have to this day. Jets were invented during the Second World War, but since the mid-sixties the machines have not radically changed. Nuclear technology peaked in a similar manner.

    I think this will also be true with home and business computing technology. Progress will no doubt continue, but not at the blazing rate that left the initial exploratory technologies so rapidly out of date. Five-inch disks and many forms of tape storage were evolving formats that were clearly part of an initial technological development. To use them as intimidating examples can be somewhat misleading and unnecessarily discouraging. (But again, no one here is denying the problems. We are just trying to put them in fuller perspective.)

    Mind boggling quantities of information are now being archived digitally, by universities, corporations, and governments. I hardly think the experts involved are so foolish as to not be working on the problems involved – especially given past failures. We are just now reaching the point where digital archiving is becoming a truly important issue, and it seems very likely solutions will evolve. If you are not organizing your materials digitally, you will be left out. By all means, organize your papers and try to find a home for them, but don’t live in the fatalistic, static world that says digital technology will doom your work. You will neglect preparations of your materials that will unquestionably be very useful.

    William Osborne

  29. William Osborne

    Another thought occurred to me about digital technology. It can allow work to become more widely known and appreciated, and thus increase its chances of survival. One of the first things I did with my Atari STE in 1988 was write a 22-page article with about 90 footnotes thoroughly documenting the discrimination my wife faced in the Munich Philharmonic and the years of litigation she used to fight back. In 1997, Monique Buzzarte put the article on her website, where it won a “Best of the Web Award.” In 2001, I put it on our own website. In 2005, Malcolm Gladwell (a writer for the New Yorker) came across the article and used the material for the concluding chapter of his book Blink, which was on the New York Times non-fiction Best Seller list for 18 weeks. (It was number one for three weeks.)

    So now Abbie’s story is in some hundreds of thousands of copies of that book. Digitalization can lead to paper – lots of paper. It is not paper or digits that preserve work, but rather the care of inspired people.

    And again, I am just trying to show this is not a black and white issue. No need to reduce my thoughts to straw man arguments.

    William Osborne

  30. Dennis Bathory-Kitsz

    William says, “a digital archive of Bonneau’s work would indeed be far less work to index than boxes full of papers.”

    Side-by-side paper originals and high-quality digital copies are essentially the same. The process of digitization does not by itself include the metadata essential to archiving. That is still a laborious process. The single success to date has been OCR of printed text, but even that is not contextual.

    Objects that start out in digital format or that can be converted at a high level of content accuracy (unfortunately not the case with pencil or even ink musical manuscripts) are eminently archivable, but their raw form is not an archive. They make up, like the Wayditch or Bonneau paper manuscripts, only a warehoused body of data.

    Let me simplify this back to a few issues — none of which are meant to deny that, from a participation standpoint, digital documents and digital conversion of analog documents are critical to their distribution.

    I provided guidelines above. They are not some amateur ramblings, but the result of ongoing experience in archiving, from the paper days (when I was a government documents research librarian) through running a computer company in the 1980s to current research in holographic storage and RFID retrieval. (I do this work and, despite Phil’s dismissal, also consult in current trends in RFID technologies. Here is an interesting recent discussion not by me.)

    That William believes this issue is recent (“We are just now reaching the point where digital archiving is becoming a truly important issue, and it seems very likely solutions will evolve.“) does not make it so. Digital archiving has been going on for thirty-plus years, with the problems evident shortly after it began, as the disasters of reckless digitizing (and even conversion to other analog formats) and discarding originals became apparent. Gerzon’s article was written 30 years ago, and Lisa mentions a concrete example. These issues are legion, from then until now. This is not news. Conferences on digital archiving have long been regular events.

    Here are those issues:

    1. Archives must be curated to be useful.
    2. Stable analog technology is well known and long-lived and for the foreseeable future should not be supplanted but rather supplemented by digital technology.
    3. Digital technology will for that same foreseeable future be fragile and require high maintenance and continual up-conversion.
    4. Digitizing is useful but inaccurate.

    Again, none of my cautions are meant to dissuade composers from digitally preserving their work. I encourage artists to preserve, document, and distribute widely — especially the latter. Indeed, if they take care of their own documentation, the process of archiving will ultimately be easier.

    Ryan wrote above, “the loss of nuance is something that has to be generally expected when doing any kind of archiving work. Of course a DVD isn’t going to replicate the experience of the live performance, even if it’s well documented. There are digital photographs of paintings, for instance, which some people are adamantly opposed to, because you lose a lot of the subtle gestures of brush strokes and texture. But the idea is that it’s supposed to give whoever’s looking or listening a general impression of what the work is about, and I think this is useful especially for people who don’t normally have much experience dealing with artworks.”

    He, William and I are saying the same thing in different ways. Distribution is enhanced quite literally by orders of magnitude by digitizing material and placing it online. Digitization also pressures an artist to assess the work more closely, because the process often includes reference data even as simple as a file name and DVD storage location — the beginnings of metadata.

    None of the positives mitigate the impending crisis, however, where an entire generation of digitized information is likely to be lost due to recklessness, ignorance and, ultimately, the economic disincentive to up-convert rotting digital data.

    So now back to solving the Wayditch/Bonneau legacy problems practically. A passive-voice construction of what might be done is not the same as what actually can be done. What steps would Frank or I take that do not turn us into full-time curators of this work, yet provide access to it?


  31. William Osborne

    Dennis, as I mentioned in my first post raising the digital issue, more than 1 million rare or fragile books have been digitized through the Google-Univ. of Michigan partnership since it began in 2004, with an estimated 6 million to go. Twenty-eight universities are participating in the Google book digitalization program with the ultimate goal of digitizing 50 to 100 million books.

    If the system is so fraught with peril, why are the experts going through the massive amounts of work necessary to digitize these archives? I appreciate your view, and it is extremely important, but I would question that it is the norm in the field.

    Projects like these also explain my view that it is only relatively recently that really serious efforts have been made to study and increase the reliability of digital archiving. At this point, we might still need to wait a bit before making solid conclusions.

    And are you sure that a digitized collection of works on a DVD wouldn’t be considerably easier to archive? Even on the most obvious level, scrolling through the collection would be much easier than rummaging through boxes. You could go through it without leaving your seat, making the work vastly more efficient. And much of the typing and labeling would already be done. Search strings could be used to find documents, working indexes generated automatically, and alphabetization applied automatically. And one could immediately begin cross-referencing the works with hyperlinks. These advantages would be a considerable factor for archivists who might consider accepting a collection.
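    (As an illustration of the kind of automatic indexing being described, and also of its limits, here is a toy inverted index in Python; the document names and contents are invented. A search string instantly finds every matching document, though, as noted elsewhere in the thread, such an index counts and matches without supplying context.)

    ```python
    from collections import defaultdict

    # Invented stand-ins for digitized documents in a composer's archive.
    docs = {
        "opera1.txt": "orchestral score in four volumes, Hungarian libretto",
        "opera2.txt": "piano reduction, four volumes, unfinished orchestration",
        "letter1.txt": "letter about the Hungarian premiere",
    }

    # Build an inverted index: each word maps to the set of documents containing it.
    index = defaultdict(set)
    for name, text in docs.items():
        for word in text.lower().replace(",", " ").split():
            index[word].add(name)

    # A search string finds every document containing the term...
    print(sorted(index["hungarian"]))  # ['letter1.txt', 'opera1.txt']
    # ...but the index knows nothing about which document is a draft,
    # a final score, or a letter — that context is the curator's job.
    ```

    This is roughly what full-text search over a scanned collection gives you for free, and why it is genuinely useful without being a substitute for curation.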

    I can see how digitizing might be inaccurate in some cases, but if done with reasonable care it would seem the quality would be quite good. Perhaps you could elaborate on this point. It might be helpful for those creating digital archives.

    William Osborne

  32. Dennis Bathory-Kitsz

    William’s questions are very good and to the point.

    He describes the extent of the Google book project, and asks, “If the system is so fraught with peril, why are the experts going through the massive amounts of work necessary to digitize these archives?”

    There are several good reasons. One is a key point you have mentioned: distribution. But distribution is meaningless without the resources to create, curate, maintain and, as needed, up-convert the archive; Google has the resources. Second, this is a small portion of writing — ‘certified’ through publication, so to speak. Third, well-scanned text can be automatically indexed for searching, which is extraordinary if you need that resource. My own upcoming book was made easier because I could both search and download entire volumes that were otherwise a day’s drive or a few weeks’ wait away.

    The Google books project is an ideal example of what digitization is presently best at — even if the indexing is not contextual and books’ illustrations are often inferior.

    William asks, “And are you sure that a digitized collection of works on a DVD wouldn’t be considerably easier to archive?”

    Yes. It would be improved somewhat by having a multiple-screen setup and by being able to dump the contents onto a hard drive for renaming, adding various connecting metafiles, and, as you suggest, handling alphabetization, similar chores, and the post-organization aspects.

    But the real work of archiving (vs. just storing) is ‘finding stuff’ and making it part of the resulting access tools. I mentioned upthread the poor quality of automated indexing because of its inability to assess context — a quality that gets even worse if there is no text to scan, as with manuscripts. Further, the physical nature of the originals can be important. Which draft is it? What corrections were made?

    For this kind of examination, have a look at the 1971 annotated The Waste Land (ISBN 0-15-694870-2), which includes photos of different drafts in typescript and manuscript. The significance of this edition as a guide to digital archivists is first that it is not just the final published book and second that having the paper originals allows examination of erasures and pressure on the paper that would not be otherwise visible.

    A physical archive is not a particular pleasure (though handling an original has its own rewards), but a real archivist will be making many judgments that the technician who scanned the material into random files will not. And if the archivist is also that person, then the material will have been deliberately organized first. The handling of material for scanning is monumental, especially if the original is not to be damaged.

    Once the material is organized in digital form, it will be easier to handle and reference indexing will build quickly and richly exactly as you say — even if it still demands return to the manuscripts for clarification.

    And that doesn’t begin to touch on the archiving of multimedia and software (and its sources). Your next question leads to that:

    William asks, “I can see how digitizing might be inaccurate in some cases, but if done with reasonable care it would seem the quality would be quite good. Perhaps you could elaborate on this point.”

    I will do this tomorrow; I’m giving a talk at Hartt this afternoon, and have to rent a car and drive down, just shy of four hours. So I’m on the road for now….


  33. William Osborne

    Now we are getting to some interesting details that show how archiving can be a matter of subjective desires and perspectives, instead of universal laws. Some composers will not consider evaluations of their erasures or pencil pressure terribly relevant. The “geniuses” will, of course, and there is no shortage of them in our modest field. These conceptions derive, in part, from 19th century views of the composer as a divinely inspired artist-prophet – often speaking as the voice of his nation. Jesus, Buddha and Wagner. Since we must feverishly study the pencil pressures of these patriarchal figures, we dare not blur the page of an immortal when scanning images.

    I hope that others will take a more modest and pragmatic view that will not encumber their ability to preserve final copies of their scores, writings, and recordings. Access can change hearts, and that might provide more permanence than the profound revelations of erasures. Try to get your papers archived, but also be realistic about how the modern world works.

    William Osborne

  34. lisa_hirsch

    The question about erasures and so on is not about what composers consider important at the moment of creation, but what scholars will consider important at a later time when they’re examining the gestation of a work. And a major problem with scanning is the degradation of the digital image in comparison with the paper original. Information is lost and it’s impossible to know in advance whether it’s important information or who the information might be important to.

    It is possible to create extremely high-resolution photographs or digital scans, but it’s also extremely difficult and expensive. Darned if I can recall all the details, but I read not long ago about the weeks-long process of creating a high-resolution digital photograph set of one painting. It took a super-hi-resolution camera and weeks of time. It’s non-trivial to make high-quality digital scans or photographs of analog artifacts.

  35. lisa_hirsch

    “Fragile” LPs?
    I want to respond to a previous comment referring to LPs as fragile. I would contest that assertion: LPs are fairly difficult to break, and the information on them remains readable despite wear, even though the quality of the sound deteriorates. As a format, LPs are about as robust as you can get, because the encoding is analog and the equipment needed to read the format is quite simple: a needle of the right size and an uncomplicated, well-documented set of electronics.

  36. William Osborne

    The question about erasures and so on is not about what composers consider important at the moment of creation, but what scholars will consider important at a later time when they’re examining the gestation of a work.

    I feel composers should have the right to determine what they make available to the future. They might not want all of their work studied. That is their decision, not scholars’. And more to the point, the chances of finding space in an archive might be increased if the collection is kept to a more moderate size. Questions of triage can be very subjective. They should also be pragmatic.

    And to stress a point, the U of Michigan is in the process of scanning six million books. If each were only 150 pages that would still be almost a billion pages of material. Would they be doing so much work if it were inordinately unstable, or would become obsolete? Technology is dynamic and fluid, so our thinking about it must also evolve.

    Jesus Fernando Lloret Gonzalez, a professor at the Conservatory of Malaga, is writing his dissertation about the media work my wife and I do. He has given us a beautifully hardbound edition of the first stage of his work. It is now part of the collection of the university’s library, which gives me a feeling that at least some part of our work might remain. He discovered our work through our website. Electrons can lead to paper in archives.

    William Osborne

  37. lisa_hirsch

    I support composers’ desire to destroy or preserve whatever they please. That is a completely separate issue from how any material that’s preserved is reproduced or archived.

    I work for Google, and I can’t comment on why we do anything beyond noting that our official mission is to organize the world’s information and make it universally accessible and useful. There have been articles in the press about our infrastructure and about Google Book, however, which you’re welcome to consult.

  38. William Osborne

    I have a little more time today, and want to address Lisa’s belief that legacy machines will be easier to employ than emulators for legacy digital sound formats. That idea is dead wrong.

    Let’s take an example very relevant to our field. Throughout the ’90s many composers used ADAT machines for multi-track recording. In fact, they were pretty much the industry standard — especially their “light pipe” system. They are already obsolete because we now have multi-channel sound cards and computers that can record direct to disk. ADATs were very prone to breakdowns because recording and playing 8 tracks of digital audio on a video head requires very precise alignments and a flawless head. Many, many composers still have their multi-channel works stored on ADAT tapes.

    A hundred years from now working ADAT players will be rare to the extreme. ADAT tapes will be almost completely unusable.

    On the other hand, composers can transfer their ADAT recordings to uncompressed WAV or AIF files and store them on DVDs, CD-Rs, hard disks, or memory sticks. Contrary to what Lisa has implied in her posts, WAV and AIF files are not complex, and creating emulators in the future to play them will not in any way be a serious challenge. (To claim otherwise is complete nonsense.) And most importantly, once those emulators are created, they can be put on-line and made available to the entire world, while the few remaining ADAT machines will probably be locked away in vaults.
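    The simplicity claimed here for WAV is real: the format is so thoroughly documented that even a scripting language’s standard library can read and write it with no extra machinery. A small illustrative sketch in Python (the tone parameters are arbitrary, not anyone’s actual archival settings):

```python
# Sketch of how little machinery the WAV format needs: Python's
# standard-library wave module writes and reads it directly, which is
# one reason a WAV file is a safer archival target than tape formats
# tied to a specific machine.
import math
import struct
import wave

def write_tone(path, freq=440.0, seconds=1.0, rate=48000):
    """Write a 16-bit mono sine tone as an uncompressed WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)     # mono
        w.setsampwidth(2)     # 2 bytes per sample = 16-bit
        w.setframerate(rate)
        for i in range(int(rate * seconds)):
            sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / rate))
            w.writeframes(struct.pack("<h", sample))

def describe(path):
    """Read the header back: everything needed to decode is right there."""
    with wave.open(path, "rb") as r:
        return (r.getnchannels(), r.getsampwidth(),
                r.getframerate(), r.getnframes())
```

    Every parameter a future decoder needs (channel count, sample width, rate, frame count) sits in the self-describing header, which is the heart of the “not complex” argument.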

    One should also consider Internet culture. For some reason there are entire Internet communities interested in maintaining legacy formats and making them available as freeware. That will be an area where a lot of work will be done that will probably be easily accessible. Using old operating systems and legacy formats will present some challenges, as will maintaining old DVDs and CD-Rs, but these problems will not be nearly as great as trying to find old devices like working ADAT players. Don’t be misinformed.

    Another interesting example is the old Ircam X4 machines that used hardware specifically designed for NeXT computers. NeXT went out of business and unbelievable amounts of Ircam’s work had to be ported over to Macs. Ircam vowed never again to create hardware-based instruments. There is a basic principle to be understood here: Software is like DNA that can continually evolve, but machines are frozen in time.

    William Osborne

  39. William Osborne

    I had another thought I find very interesting. I think that in the future there will be specific branches of musicology devoted to legacy sound file formats. It will be the future equivalent of the science of handling ancient manuscripts (there’s a word for that but I can’t think of it.) Musicologists and librarians will probably have very scholarly, professional on-line archives of legacy formats and operating systems.

    William Osborne

  40. lisa_hirsch

    This is a misrepresentation of what I said:

    I have a little more time today, and want to address Lisa’s belief that legacy machines will be easier to employ than emulators for legacy digital sound formats.

    I believe no such thing, as you’ll see if you re-read all of my comments. I have stated explicitly that digital formats can’t be relied on because formats change and go out of date. That’s one of the strongest arguments against digital archives.

    My comment about “easily re-created, well-documented electronics” is in an aside about the stability of LONG-PLAYING RECORDS, an analog format. I have not advocated using “legacy machines” for preserving “legacy digital sound formats.”

  41. William Osborne

    I’m sorry if I misunderstood your postings, or inferred too much from them, Lisa. Here is one of your comments that I felt could be misleading:

    As a format, LPs are about as robust as you can get, because the encoding is analog and the equipment needed to read the format is quite simple: a needle of the right size and an uncomplicated, well-documented set of electronics.

    There seems to be an implication here that information is more “robust” if the encoding is analog. Many of the most common analog devices used by composers, however, were not at all “uncomplicated.” A good example would be the Fostex 8-track recorders that used ¼-inch tape. They were quite common in the ’80s and well into the ’90s, but are now entirely obsolete. A hundred years from now the machines will be virtually non-existent, and it will not at all be easy to repair, replace, or build a recorder head that can read 8 tracks, especially on quarter-inch tape. These types of problems are further compounded because there were many different types of multi-track tape recorders, using a variety of different tape widths, and even cassettes. Finding the old machines to play these tapes is already becoming difficult.

    By contrast, legacy software formats and the emulators to play them are not especially complex to create and keep updated, and they are infinitely reproducible and easily transportable at virtually no cost. LP players might be a bit simpler to come by or rebuild than analog tape recorders, but we need to be clear that analog recordings are not necessarily better for archival purposes. This distinction is especially important, because many composers used multi-track analog recorders. The recordings were often not published, and thus present special archival problems.

    And as noted, this also applies to proprietary digital recording formats like ADAT players. Legacy software can mutate with time to match the needs of archives, but machines cannot, or not nearly so easily. This illustrates why digital archiving of audio materials has some advantages.

    William Osborne

  42. lisa_hirsch

    I was not trying to make the kind of general statement about analog formats that you’re reading into what I wrote. I was specifically commenting on the physical robustness of LPs and why it’s easy to get sound off them. Nothing more.

    You may think that it’s easy to write emulators to reproduce various software file formats, but that assumes that the owners of proprietary formats have documented the formats and published the specifications, and also that the format can be separated from the machine that produced it. It’s not that easy. I’ve already mentioned NASA’s warehouses of unreadable tapes containing data from various space missions, to provide one example. That’s a giant amount of useful data, yet where are the emulators? Where are the researchers investigating that data? It’s not only not that easy, it’s expensive, and any such project would involve dealing with a very fragile medium.

  43. William Osborne

    Wiki has an interesting article about emulators.
    They describe the pros and cons along with references for each point:

    • Emulators maintain the original look, feel, and behavior of the digital object, which is just as important as the digital data itself.
    • Despite the original cost of developing an emulator, it may prove to be the more cost efficient solution over time.
    • Reduces labor hours, because rather than continuing an ongoing task of continual data migration for every digital object, once the library of past and present operating systems and application software is established in an emulator, these same technologies are used for every document using those platforms.
    • Many emulators have already been developed and released under the GNU General Public License through the open-source environment, allowing for wide-scale collaboration.

    • Intellectual property – Many technology vendors implemented non-standard features during program development in order to establish their niche in the market, while simultaneously implementing ongoing upgrades to remain competitive. While this may have advanced the technology industry and increased vendors’ market share, it has left users lost in a preservation nightmare with little supporting documentation due to the proprietary nature of the hardware and software.
    • Copyright laws are not yet in effect to address saving the documentation and specifications of proprietary software and hardware in an emulator module.

    I doubt proprietary closed-source code would be a big problem for the video and audio content that would affect our field. Those codecs are based on widely used standards like MPEG, MP3, QuickTime, Windows Media, RealPlayer, etc., that will likely remain well-documented, even if they are held as trade secrets. These financial interests might even serve to preserve the code. I suspect at some point the industry and archivists will also see the necessity of carefully archiving these codecs precisely so that our historical legacy can remain alive.

    And to expand our discussion a bit, it is also worth noting that artists working in digital multimedia do not have alternatives to digital storage. Emulators that would maintain the original look, feel and behavior of digital multimedia art works would be quite important. Wiki puts it this way:

    “Because of its primary use of digital formats, New Media Art relies heavily on emulation as a preservation strategy. Artists such as Cory Arcangel specialize in resurrecting obsolete technologies in their artwork and recognize the importance of a decentralized and deinstitutionalized process for the preservation of digital culture. In many cases, the goal of emulation in New Media Art is to preserve a digital medium so that it can be saved indefinitely and reproduced without error, so that there is no reliance on hardware that ages and becomes obsolete.”

    Thank you for your engagement with this issue, Lisa. It is good to know that people are carefully considering these issues, especially at powerful institutions like Google.

    William Osborne

  44. William Osborne

    I forgot one important point about Lisa’s interesting comments. She writes:
    You may think that it’s easy to write emulators to reproduce various software file formats, but that assumes that the owners of proprietary formats have documented the formats and published the specifications, and also that the format can be separated from the machine that produced it.

    Some hardware and software would be quite difficult to emulate and that is exactly my point. This would be especially true for formats that were rarer and transitory. Through conversion to standard and much more universal formats like WAV, AIF, MP3, and MPEG4 the future of both digital and analog art works could be ensured.

    William Osborne

  45. Dennis Bathory-Kitsz

    Things seem to have continued apace while I was gone.

    To reiterate: I believe computer technology can be helpful. Heck, I was online computer-to-computer in 1978 and sent my first email in 1981. Kalvos & Damian helped pioneer nonpop online with audio in 1995 and was the second ASCAP/Deems Taylor Award Winner (right after NewMusicBox!).

    But I also don’t wear rose-pixelated glasses. These technologies are transitory and become obsolete; there is no permanence to JPEG or WAV or AIFF or MPEG4 any more than there was permanence to analog formats.

    The argument won’t be settled here, and in some cases is a philosophical or even ethical question (who gets to determine what is preserved, for example). Instead, in order to answer William’s question of two days ago, I’ll do a quick summary of issues and present one specific example. He wrote, “I can see how digitizing might be inaccurate in some cases, but if done with reasonable care it would seem the quality would be quite good. Perhaps you could elaborate on this point.”

    Despite its age — Michael Gerzon died in 1996 — his article is considered a landmark document and still applicable; I’ll assume it’s been read, because it covers most of what needs to be said about the inaccuracy of audio transfer from analog originals.

    The problem extends to video and movie film as well. If the conversion is done frame by frame at high resolution to an uncompressed format, a document of reasonable quality is created. Funds will be needed for storage and up-conversion as formats deteriorate, and as newer conversion technologies are developed, the process will need to be repeated to take advantage of the improved quality — assuming, of course, that the original has also been maintained. Compressed distribution video can then be knocked off from the high-resolution original. (Note that digital decay has already arisen as a serious issue in Hollywood studios, so who knows how much money will solve this.)

    The same video or film converted through more typical means — taking it to a conversion service — will result in a compressed, inaccurate and often misrepresentative version of the artist’s work. This again comes into the philosophical/ethical questions already raised, but there is a difference in quality nonetheless. The high-resolution copy is archival; the 2 Hour Photo booth version is not. Unfortunately, most people (including many artists) follow the low-quality route. This is not a suggestion not to make digital copies, just not to believe that digital copies have either accuracy or relative permanence.

    This conversion dilemma applies to slides, negatives and prints that are scanned. There is a shift of color. Even at very high resolution, making an accurate copy of Kodachrome is a challenge not only because of its color but also because the slide film has layers of distinct thickness. Most digitization is done with a kind of best-guess approach either because it’s the only economical choice or because someone in the ‘preservation chain’ doesn’t know any better.

    Prints are often lower resolution, but not always. I have an 8×10 platinum print made in the 1920s of two stores side by side, one burned. Though invisible at ordinary magnification or when scanned at a typical 72-144dpi, this photo, when scanned at extremely high resolution or viewed under a magnifying glass, reveals an axe stuck in a chopping block. It tells the story of work interrupted by the fire, much more interesting than just two buildings.

    The data that results from ordinary scanning has a built-in brick wall (for details) that is the selected resolution (no matter what they do on CSI), whereas the resolution of the original print or negative is as small as the precipitate in the emulsion. Again, it is a problem created by typical casual digitization and made worse by the data destroyed during compression (such as JPEG), and applies to the ‘interim’ generations until conversion hardware/software/storage catch up to analog resolution. It is the current few generations of art and music that are in jeopardy through misplaced faith in the technology (or, as Steve Layton noted, faith in the archiver).
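    The resolution “brick wall” is easy to quantify. Here is a back-of-the-envelope sketch in Python; the half-inch detail size and the dpi figures are illustrative assumptions, not measurements of the print described above.

```python
# Rough arithmetic for the scanning "brick wall": a detail's size in
# pixels is fixed by the scan resolution, no matter how fine the grain
# of the original emulsion. All numbers are illustrative assumptions.

def detail_pixels(detail_inches, dpi):
    """Pixels spanning a physical detail at a given scan resolution."""
    return detail_inches * dpi

def image_pixels(width_in, height_in, dpi):
    """Total pixels in a scan of the whole print."""
    return (width_in * dpi) * (height_in * dpi)

# A hypothetical half-inch detail (say, an axe head) on an 8x10 print:
low = detail_pixels(0.5, 72)       # at a casual 72 dpi scan
high = detail_pixels(0.5, 2400)    # at an archival 2400 dpi scan

# The storage cost that comes with the higher resolution:
small_scan = image_pixels(8, 10, 72)
big_scan = image_pixels(8, 10, 2400)
```

    At 72 dpi the detail spans about 36 pixels, a smudge; at 2400 dpi it spans 1,200 pixels and is clearly legible. But the 2400 dpi scan is roughly 1,100 times more data than the 72 dpi one, which is why the economical choice so often wins.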

    (I apologize for cramming this all into just a few paragraphs, but the issues are enormous.)

    The already digital world can create even greater complexity. Contemporary multichannel audio does not start life as a final 5.1 DVD-A (where life cycle and up-conversion issues are also inevitable). There may be hundreds of intermediary steps of raw material and assembled ‘takes’ (acoustic or generated). If one wishes to preserve this archive, several issues arise:

    • Emulation. You have already discussed this, but emulation does not include all variants of an operating system, nor does it address hardware issues. Obsolescence by choice (the required upgrade) comes into play.
    • Tethering. The software application or operating system itself may have been locked. Hacks may or may not be successful in actual use, especially under emulation. As the damage caused by tethering and DRM becomes evident, this problem should vanish in the future — not yet.
    • Hardware dependence. Software does not always replace hardware. (More below.)
    • Desire/competence. If the hardware and operating system needed are not considered interesting enough to get competent volunteers or if there is no pot of cash to hire them, the art is toast.

    Here is one example among dozens that I have personally encountered. (Please keep in mind that I’m not predicting the future. Irrespective of what solutions to future digital material may be developed, it is the first generation [or two] of digital art as well as digital conversions of art that are in peril.)

    In 1986, I created an interactive sound environment (website here) as part of a collaboration with Fernanda D’Agostino. The modified computers and hand-built hardware are still in storage; the software modified itself as it ‘learned’.

    The hardware exists, but most could be emulated anyway (not the transducers), so what is the problem? The unexpected. Careful archiver that I am, I kept a paper copy of the custom operating system and application software that I had written for it … and a good thing, because all the EPROMs that contained the system, programs and data had erased themselves. At the time the project was created, there was no life-cycle information for EPROMs like these; they were sold as essentially permanent storage unless exposed to ultraviolet light. Nuh-unh, no UV. Yet still gone. Blank.

    In the intervening years, I had failed to up-convert the data exactly as would be required for any kind of data storage. This was a groundbreaking piece of work, likely the first of its kind, and — had I not kept a paper copy that could be re-keyed into a future emulator — it would have vanished.

    Barn-door composers will say, fine, it’s gone, so what? They may feel that way about me or Bonneau or Wayditch or Osborne. But not all of us feel that way. The unique work of minor composers with limited resources interests me as much as the well-endowed archives of Stockhausen.


  46. William Osborne

    Dennis, thank you for this very interesting and useful information. I think conversion of film to DVD format (MPEG-2) is a very good example of lossy compression. Even the most expensive film studio converters are clearly lossy.

    One might note, however, that most of us are not filmmakers. Our compression formats involve audio and multimedia created directly in digital format. No analog-to-digital conversion is involved. (Some audiophiles argue that CDs are unacceptable and that we should have stayed with LPs, but that is more of a philosophic or taste issue.)

    I can hear some loss, however, when I convert my files to the 5.1 dolby surround format used for DVD-Video, but it is pretty much within tolerable limits. For 5.1 Dolby surround in the DVD-Audio format there is no loss at all. (It is 24 bit 48 kHz and sounds incredibly good.)

    I hope that HD video and Blu-ray discs will solve some of the problems of lossy compression for DVDs. This affects even digital media artists. When Abbie and I tour with our music theater video works, we play the videos directly from uncompressed DV tapes. The colors are much richer, and the cross-fade transitions are not shortened. It is a big hassle to use DV tape because it is stereo-only. We solve this by putting a SMPTE track on the tape so we can sync it to a laptop with uncompressed audio. We thus get uncompressed DV richness in our images with uncompressed 24-bit, 48kHz audio.

    To tell the truth though, I think 99% of our public would not notice any difference whatsoever if we just played our video works from DVD-Video with surround sound.

    Since DV tape does not have surround-sound, there is no way digital media artists using surround-sound can export their works from their computer in an uncompressed format. No such uncompressed export codec even exists.

    So for contemporary composers and digital media artists, who almost invariably work digitally to begin with, conversion is not a significant issue when it comes to archiving.

    William Osborne

  47. lisa_hirsch

    Regarding this:

    It is good to know that people are carefully considering these issues, especially at powerful institutions like Google.

    Once more, you are making incorrect assumptions about me and what I’m posting.

    I hang around NewMusicBox because I have a tiny second career as a music writer and I blog about classical music.

    My comments here are strictly my own opinions and views in my capacity as a music writer and musician; I represent only myself and do not in any way speak for Google. Nothing I say has anything to do with any Google project or anything going on internally.

  48. William Osborne

    Hmm. The suggestion that digital archives are inordinately unstable while your employer is in the process of scanning 50 million books might be a problem.

    Anyway, my apologies if I misunderstood.

    William Osborne

  49. Dennis Bathory-Kitsz

    So we’ve had a long side visit to the comparative worlds of analog and digital, but have yet to address Frank’s questions, especially this question, in a practical manner: “What could be done to preserve the legacies of such composers and keep the flames of support for them burning after they or their descendants are no longer around to carry the torch?”

    Groundwork has already been done: “helping to get those LPs re-issued on CD,” “writing the Grove Dictionary entry,” “trying to interest people in performing these works” and “finding a permanent home for these unique manuscripts.”

    We have discussed digitization and web presence.

    There are archives of writers’ manuscripts, collections of art in museums and homes, but even the American Music Center itself no longer accepts composers’ scores (much less media material or preparatory documents). Does this reflect a larger problem of either archiving or interest in nonpop, especially of the recent variety? Wayditch had a son who did what he could and now it has become Frank’s responsibility. Gilles had me. I have nobody. How about Bill? Other senior composers here? What are your plans? How do your plans apply to Frank’s room full of scores?


  50. William Osborne

    Some universities collect the scores of their graduates who devote their lives to composition. My undergrad school, the University of New Mexico, has (or had) such a program. Perhaps universities should be encouraged to develop such programs. Most larger universities have the resources for that sort of archival work.

    The Paul Sacher Foundation in Basel, Switzerland is devoted to archiving the materials of composers – though they are very selective. See:

    Are there any institutions like the Sacher Foundation in America?

    The Mid American Music Center housed at BGSU archives the works of composers who participate in their programs. See:

    Has anyone compiled a list of archives for new music, and the criteria they use for their collections? That would certainly be helpful. Dennis and Frank could study the list and develop concrete ideas for where they might send the scores they are temporarily harboring.

    William Osborne

  51. lisa_hirsch

    Hmm. The suggestion that digital archives are inordinately unstable while your employer is in the process of scanning 50 million books might be a problem.

    You might think about the distinctions between the kind of digital archive we’re discussing and the kind Google Book represents, not to mention the resources behind all Google products.

  52. William Osborne

    I would guess that most archives for new music would be housed in major universities. Do you think their digital archives will have the same long-term stability as Google’s? Any thoughts welcome, and of course, understood as purely private perspectives.

    William Osborne

  53. lisa_hirsch

    Surely you must realize that I can’t answer a question like that, because it involves comparisons with Google. Again, I suggest that you investigate published reports yourself, and talk with some university librarians about archiving, digital archiving, and related subjects.

  54. William Osborne

    Well, yes, we should all do research, but the premise here is dialog. You mention that there are distinctions between the kind of digital archives we are discussing and Google that should be considered, but you say you can’t elaborate for professional reasons.

    Fair enough. So I will simply put it this way: Most of the archives we are speaking about would be in major universities whose resources and abilities for maintaining digital archives would probably be quite comparable to Google’s. You clearly leave the implication that the archives we are speaking of would be inferior, but that is not necessarily true. Again, this is not to deny the value of paper archives, but as we have seen, digital archiving also has many values, and would solve some of the problems many composers face.

    For those interested, there was an interesting article in the LA Times yesterday about archiving media art and the difficulties of maintaining its legacy hardware and software.


  55. William Osborne

It is interesting how the mediums we use to create art shape the nature of the art itself. I wonder if someday music might be divided into three large epochs: quills, graphite, and graphical user interfaces.

    It was difficult to revise when writing with a feather. One couldn’t easily erase. The music had to flow rather freely. Composers were often very good improvisers because the nature of the feather required an easy fluency when composing. By the 20th century, composers had shifted to scraping graphite on bleached wood pulp. The inscriptions could be rubbed off, allowing for easy revision during the compositional process. Cogitation replaced flowing emotional impulses as the source of music. It was not so much late Romantic chromaticism that led to 20th century serialism as it was the pencil.

Fewer and fewer composers now “write” by scraping graphite on bleached wood pulp. Their “inscriptions” are merely virtual, represented on video screens. These machines also play their inscriptions back without reliance on human corporeality. And more and more often these virtual realizations are the only sonic form the music ever assumes. Music has become largely dematerialized. Since performers are less and less needed, the inscriptions of music are merely a passing part of the compositional process that play little or no role in its final sonic realization.

    Will this dematerialization also represent an epochal change in how we archive music? We have broken from a 5000 year old tradition of physical inscriptions ranging through stone, clay, wax, papyrus, vellum, parchment, and paper. The images of script now only exist virtually. They pass for a time on a screen and then become magnetism. Music is created that does not have a meaningful system of visual representation that can be put on physical matter.

    This dematerialization of the music-making process is also taking the body out of music. The virtual notation is given a virtual realization that does not require a performer. This is also an epochal shift because music has always been an inherently corporeal art. In spite of our glorifications of the mind, music has always existed far more in our legs than our brains.

Western culture, from Pythagoras to Stockhausen, defined music as a sort of harmony of the spheres, some sort of vibration with cosmic meanings. This is a glorification based on Western idealism. In reality, music has been a phenomenon of a creature that enjoys vibrating the viscous gas in which it exists, based on the unique impulses of its bipedal nature. The tic-tak bipolar oscillation of bipedalism created our musical identity. We dance and hop and twirl and bounce like no other creature. The neural impulses of organisms with four, six, or eight legs are so complex that they are much more reliant on the vegetative nervous system, but bipedalism is simple enough that the use of the legs is much more volitional. Hence the human as a music-making creature. His chimpanzee-related neural system, given to howls and chatters when excited, contributes to this bipedal musicality.

As music dematerializes, its bipedal physicality will go through epochal changes, including the way we archive it. Music will often never even exist as physical notation. Its manifestations as sound will be created by digital code readable only by machines. The foundation of historical realization and study will be digital emulation of historical artifacts. Archives of music will exist as complex patterns of magnetism.

    William Osborne

  56. Chris Becker

    Going back to Dennis’ question…

Archiving can be a part of the compositional process. On one level, Charles Mingus’ work is a history of black American music with pieces dedicated to or overtly referencing seminal artists (My Jelly Roll Soul, Reincarnation Of A Lovebird, even Eat That Chicken which evokes Fats Waller) that also sing with Mingus’ unique multifaceted compositional voice.

    Are there examples of similar works by “new music” composers? I’m sure there are, but my knowledge is such that I’d have to think about it awhile.

And of course utilizing recordings (samples) as part of a composition is another way to combine archiving with music making. And the research one may do in order to put such a sample into historical context (or to deal with certain legal issues depending upon the nature of the piece) can be a part of the compositional process.

So maybe we composers can’t house and care for hundreds of scores. But there are creative compositional approaches to perpetuating music that is in danger of being forgotten.

  57. Dennis Bathory-Kitsz

    Chris, I don’t have an answer to your question, but I think the question itself and the surrounding concept are brilliant. Many minor composers were ‘memorialized’ and ‘archived’ in such variations and influences in the past. (Kind of a Lal and Data, if you recall that.)

    Today? I don’t know. But I suspect a little copyright-shyness might come into the mix.

    Thanks for a brilliant decoding.


  58. William Osborne

    Interestingly, jazz is largely a non-literate tradition. The music is usually improvised and not notated, except for the basic melody and perhaps some chord changes. It is archived through audio recordings. Quoting passages of music in compositions in some senses keeps the quoted music alive, but of course, it is not the same as archiving it.

    As an undergraduate I had a work-study job as the university’s music library sound technician. One of my projects was transferring recordings of ethnic Hispanic music from Northern New Mexico made on a wire recorder during the 30s and 40s to quarter inch tape on 15 inch reels. The wire ran through the machine at two feet per second and had a quality similar to a 78 rpm record. There were a huge number of spools and the project took some weeks. I hope someone has migrated the collection to CD-Rs. One big advantage would be navigation. It was very difficult to find specific songs on the wire and mylar reels. With CD-Rs you need only punch in a number and there it is.

    Wiki has an interesting article about Google Book. Regarding accuracy it comments:

    “Many of the books are scanned using Google’s undisclosed proprietary method, most likely through the use of a robotic book scanner, where books are placed into the machine by a human operator and ‘scanned’ (in practice, a digital camera is used at a distance) at a rate of 1,000 pages per hour. The rapidity of the scanning precludes checking the pages. Hence, some pages are not scanned or are scanned in such a fashion as to make them unreadable.”

    Steve Lacy provided a link to a botched page above.

I think that is why composers should try to create their own digital archive. They can ensure its quality. And as Chris notes, archival work is part of the compositional process. For younger composers who do most everything digitally anyway, creating a digital archive is not such a big problem.

Google says it is scanning more than 3,000 books per day, a rate that translates into more than 1 million annually. Many major institutions are participating in this project, including Harvard, Columbia, Princeton, Univ. of Michigan, Stanford, Cornell, The Bavarian State Library, and the New York Public Library.

    Anyway, it’s a development we would do well not to overlook.

    William Osborne

  59. Chris Becker

    Thanks, Dennis.

    “Interestingly, jazz is largely a non-literate tradition. The music is usually improvised and not notated, except for the basic melody and perhaps some chord changes.”

    I think its “tradition” has always been a much broader more fluid collection of techniques than you imply. But I don’t want to get into a back and forth about it.

    Sue Mingus put out a great collection of Charles Mingus scores called Charles Mingus More Than A Fake Book. Each score in the book is a hybrid combining Mingus’ scores with transcriptions from various recordings of a piece. There are additional notes regarding instrumental color, improvisational approach and biographical subtext. I created a string quartet and voice version of Mingus’ “Weird Nightmare” with this book as a point of reference.

  60. William Osborne

My apologies for a delayed response. Yes, the jazz tradition is wide and fluid. My point is that wire and Mylar recordings illustrate how university archives have been dealing with the problems of obsolescence and format migration for a long time. I think they have shown themselves quite equal to the task. University archives are very different from the examples listed above (insurance companies and census bureaus) that just stuff old materials in basements until one day they discover they are unusable. University archives are specifically focused on the challenges of archival work.

I do not know of any statistically significant evidence that would show that university archives let their collections deteriorate without migrating them. And of course, they are also much more scholarly and precise in their archival work than institutions like Google.

    William Osborne

  61. Dennis Bathory-Kitsz


    I would think an old lefty like you would understand that archiving belongs to the wealthy, powerful and well connected, whether it’s Presidential libraries or university settings.

    First, you won’t get hard statistics in an anecdotal field. Who knows the condition of those tapes which you helped transfer from wire recordings during your undergraduate days? Have those tapes been maintained? Re-transferred? Digitized? From the originals or after the deterioration of several tape generations? Who paid for this? Who pays now?

Next, I think you unnecessarily disrespect the public agencies who oversee their own archives. Public funding does not cover archiving, but I have watched these people work desperately hard to keep those materials accessible. Both when I lived in New Jersey and today in Vermont, I have been hired to help in that very preservation — a process with little money and lots of “please” and “thank you”.

But back to the money. Upthread I talked briefly about the Kenneth Sawyer Goodman archive. It wasn’t the Newberry Library that made that possible — it was the wealthy lumber-fortune family that did. They paid me, they paid the editor, they paid for the publication and the maintenance, as well as for this little theater you may know that bears his name in Chicago. Goodman was the kind of author that you and I and Bonneau and others here are as composers: minor and eminently forgettable. His best plays were not his unique voice but rather collaborations with Ben Hecht. The archive is accessible because of money.

    Universities don’t preserve archives because they have a mission, but because money has been set aside. I mentioned the composer who arranged for the maintenance of his archive at his alma mater. For another archive, I designed and hand-built special audio equipment for a restoration project of glass recordings of folk songs, paid for by private funds that maintain the collection. And you can bet that family trust-fund artists won’t be forgotten.

    On the other hand, the four archives that I maintain — mine, Bonneau, Trans/Media (post-fluxus arts coop) and Vermont Composers Consortium — have no home and likely will have no home. Bonneau’s material won’t be touched without a fund. The library at Rutgers, my alma mater and that of several Trans/Media artists, won’t even answer my letters. The Trans/Media material also won’t be accepted by the New Jersey State Museum or even the city museum in Trenton (where Trans/Media, a 60-member arts coop, created hundreds of major events, including three city-wide festivals). I have approached all the well-funded universities in Vermont, and none will accept the Vermont Composers Consortium archive — scores, recordings, photos, programs, publications, personal letters, etc.

    When you write, “university archives have been dealing with the problems of obsolescence and format migration for a long time. I think they have shown themselves quite equal to the task,” do you know that? What lets you believe that “University archives are specifically focused on the challenges of archival work”? How were they doing before Google showed up, for example? The glass recording archive I mentioned sat for forty years in boxes until a blob of money appeared. Heck, I’d be “equal to the task” if I could hire an archiving team and build an archiving space funded in perpetuum. I should make a bumper sticker, “I’m equal! Pay me!”

    The archives of university libraries, at least those here in the U.S. east coast that I know, are desperate for effective preservation research and for money, and the likes of Wayditch — already maintained only by his son and now Frank — are just one more burden to them.

    (Maybe a wealthy funder is reading this right now, and Wayditch will get his day.)


  62. William Osborne

    Thank you for your very interesting thoughts, Dennis. If only more people could create engaging, helpful dialog like you!

    At your suggestion, I did a web search for the archives at the University of New Mexico. The results made me very happy. The tapes and wires I copied are now part of the university’s Center for Southwest Research.

The wires and tapes I copied are housed there. (They were part of the Fine Arts Library when I was a work-study student copying them.) My largest migration project was the John D. Robb Field Recording Collection. Not only has it now been digitized, the collection has also been converted to mp3 files and put online! I was astounded to see this. I didn’t know. Yeah!

    The CSWR archive website has an entire section called the “Digital Collection.” It includes not only the Robb Field Recordings, but also Native American Oral History, and image files of a lot of historical photographs and documents. The CSWR is very well funded and treasured by UNM and the state of New Mexico.

The website notes that the audio archive “was created in 1964, and has grown to nearly 33,000 entries on more than 1,600 reels of audio tape. The archive’s collections preserve examples of the rich cultural milieu of Southwestern music.”

    I can’t say for sure if other states have such archives. Surely New Jersey must have an extensive archive with photos of 350 species of cockroaches, with mp3 recordings of the strange squeaks and squeals they make while carrying your refrigerator out the door…. Sorry, couldn’t resist.

Seriously though, it seems that one thing our public universities have been able to do really well is build wonderful libraries. And state universities, by their nature, are often inclined toward documenting and archiving regional history. That is also the sort of thing politicians like to fund. Local history and culture invariably makes hay with the voters.

With a little organization, some foundation funding, and some backing from organizations like ASCAP and the AMC, I think a program could be initiated to encourage universities to archive the compositional work of their graduates and other composers who were long-term residents of the state. And as I said, I think these universities would do a pretty good job of maintaining the collections. And it would be especially wonderful if the materials were available online, similar to what the CSWR has done.

What we need now is some hard data about how many of these types of archives already exist. I haven’t looked into it, but I know that UNM has some sort of program for archiving the works of its graduates. I think many other universities do as well.

    William Osborne

  63. William Osborne

Virginia Tech has some interesting information about digital archives. They use a system called ETD (Electronic Theses and Dissertations) to archive all of their theses and dissertations.

Virginia Tech is also a member of the MetaArchive Cooperative. This is an independent, international membership association for securing digital archives.

The MetaArchive Cooperative employs the open-source LOCKSS software to harvest, cache, and validate files in a geographically distributed network. It is based on the same rhizomic forms of backup and distribution used by the Internet (originally designed as a communications system that could withstand a nuclear attack). The MetaArchive Cooperative states that “ETDs Can Be as Secure as Their Paper Predecessors.” In fact, that is the caption on most of their web materials.

The MetaArchive Cooperative also offers workshops for organizations that would like to join. The next is on June 4th.

    Dennis is quite right when he says that archival work costs money. And he is probably also correct when he suggests that these archives are unduly influenced by the interests of the financial elite (like just about everything else in our society.)

    A digital archive for composers might solve some of these problems, or at least considerably reduce their impact. As we have discussed, the life work of a composer, including all of her uncompressed recordings, can generally be contained in a few gigs of digital storage space. Today it is a relatively simple task for composers to organize their materials into a digital archive. These could be stored by archives for a very minimal cost. And as the MetaArchive Cooperative illustrates, these digital documents can not only be as secure as their paper predecessors, they can also make a composer’s work widely available for countless future generations.

    As another alternative, a university could host a national archive, or a group of universities could host a network of regional archives for composers. To reduce costs composers could be required to organize their own digital materials based on guidelines and standards that the archive establishes. The archive(s) could join the MetaArchive Cooperative and provide a very secure home for a huge body of work that would preserve a great deal of art and also be very useful for historians. The costs would be minimal.

    There is no reason that composers capable of organizing their work into a digital archive should not have it preserved.

    William Osborne

  64. greyfeeld

Since the beginning of this thread, one of my best friends has passed on: Paul Nelson of Brown University (he studied with Creston, Piston, and Hindemith). He was 78, with few relations, and in a month or so I’m meeting with a few other friends to talk over what should be done with his work. I may be asking for you folks’ advice. Thanks, Robert Bonotto

  65. bdrogin

    What an interesting discussion among familiar colleagues (although Frank kicked it off, he never returned, so it’s primarily Dennis and William (that I know) who have been carrying it forward).

What should be clearest to all is that these are a bunch of fogies of a certain generation (to which I also belong) who don’t acknowledge that Wayditch and others belonged to a pre-software age. If Wayditch had been born today and lived a hundred years, his entire oeuvre would start as digital, and he would be able to create MIDI realizations of his work as well. The question would completely shift from scanning his work to printing out his work (and obtaining performances of it).

    We of a certain age went through several revolutions in our lifetimes. At one point, music with graphic notation was embraced, then roundly rejected. For quite some time, music notation software was insufficient, then it improved. Rough sketches had to be meticulously inked – nowadays, even those scores will be rejected, typeset scores and sonic realizations are required. Pieces that incorporate improvisation and sonic processing raise the challenges of preservation to a new level.

    First, let’s get reasonable. Bank records are digital. Dennis is harping about bit-flipping problems, but there are error-correction algorithms galore out there which could be used to ensure the integrity of digital data – you’d just have to run a digital scrub periodically and you’d “preserve” the original content to acceptable levels of error prevention.
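The periodic “digital scrub” Barry describes can be made concrete. Here is a minimal sketch (the function names `build_manifest` and `scrub` are illustrative, not from any real archiving tool): record a checksum for every file in an archive, then periodically re-hash and flag any file whose bits have silently changed, so it can be restored from a backup copy.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large scores and recordings fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict[str, str]:
    """Checksum every file under the archive root."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def scrub(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the files whose current checksum no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if sha256_of(root / name) != digest]
```

A checksum only detects corruption; the repair itself still requires a second, intact copy, which is why scrubbing and geographically distributed mirrors go together.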

    As we create more and more content digitally, the idea of preserving the process of original creation and early drafts becomes entirely lost. My process is now to start with pencil on paper, then enter the partial product into Finale, then continue composition in Finale, then print out the Finale onto paper, then mark up the printout, then return to Finale, etc., etc. I tend to throw out the great majority of my printouts, and I don’t save as rev 1, 2, 3 the intermediate digital files, so so much for preserving the original process of creation!

    I have lost articles and NewMusicBox postings. Thankfully, the Wayback machine has restored some of these lost writings to me. Likewise, that Google cache has come in handy on occasion. Thank you, free digital archives!

Dennis does raise some interesting points about metadata, but this may end up being just a limitation of current search-engine design. It is not inconceivable that, in the future, we may be able to hum a few notes into our computer, or input a fragment of a recording, and be able to search and retrieve the score, recordings, websites containing biographical data, and so on. Meta-tags are completely ignored by Google, so much for the philosophy of metadata!

As a culture, we are moving away from the entire concept of “the original” (as to originality, that is a different subject). Taken from this point of view, I side more with William – dissemination is the greatest ally of preservation. If there is only one paper copy, a single fire or the ravages of weather and it’s gone, forever. If a digital copy is transmitted and re-transmitted, some fans will download it to their local hard drives, some will print it out (or transfer the sound media to whatever is the latest and greatest), and the odds of preservation of our digital detritus (great term!) increase.
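The arithmetic behind “dissemination is the greatest ally of preservation” is simple: if each independent copy is lost with probability p, the work itself disappears only if every copy is lost. A tiny back-of-envelope sketch (the loss probabilities here are invented for illustration):

```python
def loss_probability(p: float, n: int) -> float:
    """Chance that all n independent copies of a work are destroyed,
    assuming each copy is lost with probability p."""
    return p ** n

# Even at a grim 50% chance of losing any single copy, ten scattered
# copies leave well under a 0.1% chance the work vanishes entirely.
one_copy = loss_probability(0.5, 1)    # 0.5
ten_copies = loss_probability(0.5, 10) # about 0.001
```

The independence assumption is the catch: ten copies in one basement flood together, which is why distribution across sites matters as much as the raw number of copies.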

    I agree with Dennis that someone has to WANT to protectively encode the data, renew the data, quality check the data, store the data, and I do not take his concerns lightly. Here’s an idea – libraries should make the use of large-format scanners, with transfer to your own USB drive, freely available. I’ve got some old yellowing college newspapers I’d like to preserve…

    Yes, yes, don’t throw out the original. Okay, enough already. But we live in an age where instant access to information is more important than the fact that it’s sitting in a box somewhere. The archeologists of a century ago would dig up the past, photograph it, destroy the site and claim they had “preserved” it! The modern archeologist tries to use LIDAR and such to investigate and record the site non-invasively, and then distribute the findings.

    Music as object? Listen to my favorite John McGrath quote (from memory): “a playscript is what is left over after an act of theatre has occurred.” The composer’s original intention? How about the conductor’s markings? The markings in the parts made by performers? Rehearsal recordings? Sorry, musicologists of the future, you had to be there!

    Barry Drogin
    Not Nice Music

  66. bdrogin

    P.S. There is an enormous amount of data on this page, and I scrolled through with a combination of reading and scanning. At second scan, I see that some of the points I make are made by others, and I apologize for implying that they are not.

    P.P.S. Went searching on Google Books for the exact McGrath quote, but it got hidden on page 6, which is not available on-line. I guess when these books lapse into the public domain Google will make all of the content available?

    Assuming there still is a Google…

  67. William Osborne

The LOCKSS software for securing digital documents I mentioned above was developed at Stanford University (the MetaArchive Cooperative itself grew out of a Library of Congress initiative led by Emory University). The system is based on a geographically dispersed network of backup archives. The LOCKSS software continually scans the networked archives. If it finds a document that differs from the documents mirrored on the other sites, it corrects it. In other words, it is a self-correcting system of mirrored documents spread over a wide geographical area. The system is also not very expensive, which allows it to be based on sustainable business models.
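The self-healing idea can be sketched as majority voting over mirrored copies. This toy version is only the core intuition, not the actual LOCKSS protocol (which uses cryptographic polling among peers): each site holds a copy, a poll compares hashes, and any site in the minority replaces its copy with the majority version.

```python
import hashlib
from collections import Counter

def repair(copies: dict[str, bytes]) -> dict[str, bytes]:
    """Overwrite divergent mirror copies with the majority version.

    `copies` maps a site name to that site's copy of one document."""
    digests = {site: hashlib.sha256(data).hexdigest()
               for site, data in copies.items()}
    # The most common digest across sites wins the poll.
    majority_digest, _ = Counter(digests.values()).most_common(1)[0]
    canonical = next(data for site, data in copies.items()
                     if digests[site] == majority_digest)
    return {site: canonical for site in copies}
```

With only two mirrors there is no majority when they disagree, which is one reason such systems rely on many geographically dispersed copies rather than a single backup.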

    It is expected that over the next ten years most large institutions will move their paper archives to digital formats. For example, Stanford University, New York University, John Jay College of Criminal Justice and Albert Einstein College of Medicine are using a company named MicroMedia to convert major repositories of legal, financial, and student records stored in paper and micrographic formats.

    Scanning technology is quickly evolving. Some programs include enhanced image and data validations and quality controls and assurances. Now that so many institutions are moving paper documents to digital form, there will probably also be large improvements in scanning technology over the next ten years.

If it turns out that no archives can be found for the papers of Bonneau and Wayditch, perhaps a grant could be obtained for their digital conversion. If the documents were then placed in a MetaArchive Cooperative using the LOCKSS software, they would be secure. There might be some minor loss through the scanning process, but that would be better than the garbage dump. I notice that many companies offer a fixed-unit pricing model, without hourly fees or charges. The costs could be determined in advance and used in the grant application process. The scores and other documents would be available as PDF files. Both streamable and uncompressed recordings could be included in the archive. All would be easily available to countless future generations. That would probably be more than either composer ever dreamed of.

    William Osborne

  68. Dennis Bathory-Kitsz


    Very good reference. I read the entire site and watched all the videos, and took some of the off-site links to participants. LOCKSS appears to be oriented toward print/web preservation of already digital content in transportable formats. That may simply be its starting point, but it’s not entirely clear whether this would be workable in some of our interim circumstances (I’m thinking of material in proprietary media formats, for example). Good possibilities certainly suggest themselves.

    I asked the project’s contact person (Victoria Reich) if she would join us here and offer some comments. (According to the auto-bounce, she will be back Monday.)


  69. William Osborne

    I agree with all of your impressions, Dennis. LOCKSS doesn’t address the scanning issue. (The new developments with scanning would probably occupy another 70 posts.)

I Googled “scanning services” and “oversize scanning services” and found a lot of interesting information. Because of the move toward digitizing paper documents, there are now hundreds of scanning services across the country. There seems to be plenty of competition, which is hopefully driving prices down. That would be good, since grant funds for Bonneau and Wayditch might be limited. It might be useful to shop around for price, and for people who would appreciate the specific problems of scanning scores and cultural archiving.

These scanning services (along with the MetaArchive Cooperative) could be very useful for the old paper composers, and composers like me who bridge the divide and have a lot of paper scores from the 80s. I will never get to them all with Finale.

And of course, we need to see if MetaArchive-style cooperatives for the life work of composers will be established. Perhaps Victoria Reich could give us some advice about that. This is also where the AMC, ASCAP, BMI, universities, and foundations could do something very important. It wouldn’t even cost that much, so I think such archives will appear in the next few years. In the meantime, I think it would be very wise to get homeless collections like the Bonneau and Wayditch in digital form if grants could be obtained. The costs would probably not be that high, especially when one considers that an entire life’s work would be saved.


  70. Dennis Bathory-Kitsz

    I sent the following to Victoria Reich, the director of the LOCKSS program at Stanford:

    Hello Ms. Reich,

    There is presently a preservation debate at NewMusicBox, the forum of the American Music Center.


    I have spent some time on the LOCKSS pages and do not find an answer to some questions, in particular:

    1. The use of LOCKSS for the preservation of audio content.
    2. How LOCKSS handles proprietary data formats (Finale, Sonar, Max/MSP, etc.)
    3. How LOCKSS handles scanned information (i.e., material not OCRed, such as musical manuscripts).

    It appears that LOCKSS is presently a system oriented toward publishers and libraries rather than the content producers (i.e., artists, writers, composers) themselves. It also appears that using LOCKSS gains preservation of the ‘bits’ without necessarily preserving the content.

    Consider multimedia projects created under proprietary programs running under certain generations of Mac or Windows (or earlier) operating systems. Does (or can) LOCKSS preserve operating system emulations, programs, and data in such a way that digital projects can be retrieved and made to function?

    Might you join this discussion on NewMusicBox?

    Victoria Reich read our discussion, and sent me this email:

    Hello Dennis,

Very interesting debate. LOCKSS preserves web published content. If the content is delivered via the web, the content’s genre, format, etc. is irrelevant.

    The LOCKSS team provides support to those libraries and archives who are
    helping to fund our work. However, LOCKSS is open source software and is
    freely available.

    Best, Vicky

    So we still have no solution, at least not if the content isn’t web-centric to start with.


  71. William Osborne

“So we still have no solution, at least not if the content isn’t web-centric to start with.”

    Your interpretation of Vicky’s email is somewhat hyperbolic and misleading, Dennis. The blanket statement that “we still have no solution,” is much too broad. And in fact, it is harmful, because it could mislead composers and institutions into neglecting digital archiving that could be very important for the distribution and preservation of their work.

A digital archive would indeed be by nature rather “web-centric,” but far from being a problem, that is the advantage. This would apply even to older paper composers, since their files could be put in PDF form and their recordings in streamable formats, where the possible need for future emulation would not be a significant problem.

Emulation problems for programs like Sonar or Max/MSP will exist regardless of what sort of storage system is used. In fact, a digital archive would be exactly the system that could preserve the data. We should also remember that for most works, the output from these programs can be stored in formats (PDF, WAV, AIF) that will not pose large emulation problems.

    Keep things in perspective. It’s almost like harboring the Bonneau archive has made you fatalistic. Keep your spirits and hopes alive and realistic, my friend. Digital archives do not solve all the problems, but they can be extremely helpful.

    William Osborne

  72. Dennis Bathory-Kitsz

    William, it seems you are dismissing the extent of the recent past, present and mid-near future issues.

    Let’s assume it all gets better in, say, 2025, meeting your vision of a kind of archiving utopia. By then all applications run online, all data is in open rather than proprietary formats, intelligent up-conversion is automatic, and storage is so cheap and distributed that nothing can be lost. (Don’t count me in as a believer for now.)

    There is still a massive amount of material that cannot be funneled into this online form. The issues of past data (the Wayditch and Bonneau) have already been explored. They cannot be solved by themselves; only time and money will help, and even those will not complete the picture. Let’s set that aside.

    Even if we do, there is also a tech-art bubble. As one who has been creating what one might call technological works since before 1970 (ack!), I know how quickly these are lost to eclipsed formats (both software and hardware) and technological locks (proprietary formats and tethered software). And you are right — I am not optimistic about the work of two to three generations of artists being salvageable. It is not a night’s work to emulate the hardware and software needed to place and operate a technological artwork online (even if the data itself survives bit rot in the interim).

    And tech companies, from musical instruments to multitrack software, have no incentive to do it. Businesses are in business to sell us new stuff, not coddle old stuff, unless there is a demand. There is little enough demand for current artistic ‘products,’ much less for stuff that’s fading with time.

    There are tens of thousands of extinct formats and hardware platforms now. There is no reason to believe that number will slow its rise, and no reason to believe any silicon granola wholesomeness will overtake marketing minds to help these be emulated, much less actually have them function online. For, again, the present and near-mid future, the trend is opposite that. (Just look at NBC invoking the broadcast flag a few days ago so that DVRs can’t record their programs. And Google, for all its accomplishments, is doing little more than taking and indexing snapshots of books, some [see Layton upthread] less accomplished than others.)

    And, frankly, look how adeptly Ms. Reich simply sidestepped the question of platforms. That to me is evidence of the scope of the problem. When she wrote, “LOCKSS preserves web published content. If the content is delivered via the web, the content’s genre, format, etc. is irrelevant,” she was speaking only to content on the web, that is, the eventual data, not the mechanism of preparation for delivery to the web — the essential but frequently unavailable on-ramp for “obsolete” content. Today this content resides on metallic spinning disks and bits of plastic connected to faltering hardware subservient to tethered programs and proprietary operating systems (and that’s not even counting the non-digital hardware or sets or costumes or dimensionality of live, interactive multimedia).

    I think we agree that we are already terrible archivists of our own work, and tend to be venal about allowing our flaws to be preserved along with what we see as our important contributions. (I’m reminded of a short story whose name and author escape me. A new astronaut and his journalist roommate are close friends, but the astronaut has to go on a time-bending exploration to another star. When he returns a century later, he learns many others have made the trip and returned before him due to new technology. They don’t care about his contributions. Instead, he is brought to a conference to speak about his roommate, who ended up being considered the greatest writer of the century.)

    We are selective self-archivists, erratic self-archivists, and (money or not) we face an increasing weight of digital up-conversion and expiring formats. Our position is made more rather than less precarious because of unreasonable expectations surrounding digital technology, and that is what LOCKSS (and anything else save pure [and expensive] brute force) cannot yet solve.
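    For contrast, the half of the problem that LOCKSS does address — keeping many copies and repairing the ones that decay — is simple to illustrate. This is only a toy sketch of the “lots of copies keep stuff safe” principle, not LOCKSS’s actual polling protocol:

    ```python
    from collections import Counter

    def repair(copies: list[bytes]) -> list[bytes]:
        """Given several peers' copies of the same file, replace any copy
        that disagrees with the majority. Assumes most copies are intact."""
        majority, _ = Counter(copies).most_common(1)[0]
        return [majority] * len(copies)

    # One peer's copy has suffered bit rot; the majority repairs it.
    peers = [b"orchestral score", b"orchestral score", b"orchestral sc0re"]
    repaired = peers != repair(peers) and repair(peers) or repair(peers)
    ```

    What this sketch cannot do, of course, is reach back through the on-ramp problem: it assumes the bytes made it onto the network in the first place.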


  73. Dennis Bathory-Kitsz

    I was re-reading pg’s comment upthread and it reminded me that I have a donkey in this race.

    My creations are often aggregate — works that I have done earlier serve as sources for new work. With scored music, that’s not much of a problem. But with electroacoustic music, those sources are inevitably entombed in earlier formats.

    Early concepts appeared in the interactive Rando’s Poetic License (1978) for microcomputers and audience, and snippets of Rando were folded into the interactive digital sculpture Nighthawk (1985) — with programming for a new processor and with newly built hardware. Nighthawk became one source for the stage work Echo (1985) and for the multi-computer sound environment In Bocca al Lupo (1986), and Bocca was the source for TDev (which never got beyond development in 1989), and all were the source of the outdoor interactive installation Wolf5 (1991). These might have been the actual sources for the multi-sensory Xirx (1996), but by then the programming and hardware had become complex and were entirely incompatible and emulation has never been done. So Xirx (part of the collaborative Circular Screaming) was created only from the previous concepts and the actual series was broken. Xirx then became a source for the multi-year asynchronous Detritus of Mating (1997) but lost its interactivity due to hardware incompatibility, and Detritus became the source material for the much simpler Zonule Glaes II (1999) for electronics and string quartet. Parts of Zonule have become sources for pieces as recent as the playback-only smuttle (2008) for the 60×60 project.

    Many artists (probably most) aggregate, at least conceptually, with occasional epiphanies. But keeping in mind pg’s comment, that aggregation is a compositional process always in renewal — and a process that can be distorted or even fatally interrupted by the kinds of archiving failures that I point to in my post above.

    Some might say, “Good! It’ll get you off your conceptual ass and do something new!” So it may. But it also leaves long-term work unfinished.


  74. William Osborne

    These are all very good thoughts, Dennis. It seems our subject has shifted a bit. First we were speaking about the possibilities of creating digital archives that would be stable. LOCKSS and the MetaArchive Collective are a very good start in that direction. They solve the basic problems of digital decay, so to speak.

    Now you are focusing on the problems of obsolete, specialized soft- and hardware. Neither physical nor digital archives can solve those problems, though the characteristics of digital archives might at least preserve the raw data, even if it could not be emulated by future programs. On the other hand, more basic forms of data like typical sound, video, and image files will not present very serious emulation problems. (By that I mean formats like mp3, wav, wmv, aif, mov, jpg, pdf, gif, etc.)
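    Preserving raw data in this way comes down to fixity checking — recording a checksum for every file and periodically verifying it, which is a standard digital-preservation practice. A minimal Python sketch (filenames hypothetical):

    ```python
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Compute the SHA-256 digest of a file, reading in chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def build_manifest(archive_dir: Path) -> dict[str, str]:
        """Record a digest for every file in the archive."""
        return {str(p.relative_to(archive_dir)): sha256_of(p)
                for p in sorted(archive_dir.rglob("*")) if p.is_file()}

    def verify(archive_dir: Path, manifest: dict[str, str]) -> list[str]:
        """Return the files whose current digest no longer matches."""
        return [name for name, digest in manifest.items()
                if sha256_of(archive_dir / name) != digest]
    ```

    A manifest like this is exactly what lets an archive notice decay in a WAV or PDF long before the file becomes unreadable, even if the program that produced it is long gone.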

    I read somewhere that Kaija Saariaho, who wrote several excellent interactive works using IRCAM’s soft- and hardware, has decided to no longer compose anything but “tape” pieces when using electronics, because she is sick and tired of having her works become unplayable due to obsolescence. She was speaking mostly about IRCAM’s 4X hardware.

    Is there not an inherent ephemerality to digital culture? This is why I have spoken in several past threads about our culture of digital detritus. There is something about digital culture that is inherently trashable. Is this something that we should accept as part of the genre? Is the 19th century concept of cultural legacy something relevant to the 21st century?

    I work only in digital media, and I strive to create works that will be durable. I try to see that their final form will be universal enough that they can be adapted to future soft- and hardware. Basically, that means I try to create audio and video recordings of the final works in standard, widely used formats that will likely be easy to emulate or migrate in the future. These DVDs and CDs can also be used to perform the works.

    I am not sure what composers are thinking who write for rare programs that will probably not remain around for long. Perhaps they are like sculptors who work with ice; they accept that their works will melt away and consider that part of the genre.

    William Osborne

  75. philmusic

    If you need to get your druthers, other opinions you must smother.

    No need for hyperbole-i-age use tons and tons of verbiage

    After you’ve dismissed ’em you can take their thoughts and list them

    No need for any tension when you’re working in your “zone,” every idea is your own invention.

    Phil Fried Skid-Row University cocktail hour 24/7

  76. William Osborne

    Well Phil, I wonder if your compositions are as remarkably profound as your italicized poetry…. Contrary to your suggestion, this discussion is not one where either side should win. There are many, many problems with digital archiving, and at the same time a great deal of potential. Both sides need to be carefully considered.

    I hope one result of this discussion will be an understanding that composers need archives as a resource for preserving their work once they die, and that institutions like university libraries and even the AMC could do much more in this area.

    A while back I saw an ad for the AMC that listed the benefits offered to members. Due to the peculiarities of my life, none of the benefits would be very helpful to me. On the other hand, if the AMC could get the funding, material resources, and a part-time staff member for a digital archive for composers who have reached their later years, or who are no longer with us, that would be something I would support just out of principle.

    If organized correctly, such an archive wouldn’t be all that expensive. Basic membership for the AMC in the MetaArchive Collective would only be $200 per year. Additional storage space could be purchased as needed. Composers could be required to digitally organize their own materials (or have it done) based on guidelines and standards established by the archive, which would greatly reduce labor costs.

    A digital archive would have some limitations, and it would need to be properly set up to be secure, but it would be far, far better than nothing, and in fact, a truly valuable resource for preserving American music and making it available to the world. What could be more within the mission of the American Music Center and some of our wealthy research universities and foundations?

    William Osborne

  77. philmusic

    Is my music as profound as my poetry?

    Beats me!

    Phil Fried SkidRoe U, Drink’in leads to think’in

Comments are closed.