Wednesday, April 9, 2008

Rock History And How It's Made

Several blog entries ago I discussed Art Laboe's first Oldies But Goodies (1959) compilation, a collection of mid-50s doo wop and R&B consisting largely of L.A.-based groups such as The Penguins (“Earth Angel”) and The Medallions (“The Letter”). By issuing the Oldies But Goodies album in 1959, so I argued, Laboe was the first to historicize rock ‘n’ roll, to lend it the dignity and distinction of a “classic” or “golden” era, represented by the album title itself emblazoned in gold. While I think that observation was correct, in retrospect I hadn't fully considered its implications at the time I wrote the entry. What I should have said in that earlier post is that the initial Oldies But Goodies collection serves to mark or distinguish the first generation of rock ‘n’ rollers from the second.

Although he’s writing about the idea of “nationhood” and the formation of modern nations, Benedict Anderson makes the trenchant observation in Imagined Communities that since it was impossible for the generation that came of age after the historic ruptures of 1776 (America) and 1789 (France) to recapture the spirit and inspiration that gave rise to these revolutionary moments, the following, or second, generation began “the process of reading nationalism genealogically—as the expression of an historic tradition of serial continuity” (1991 paperback ed., p. 195). Reading nationalism genealogically, as a process unfolding serially in time, gave rise to the study of history—to history as a profession, to the historian. Those who, for example, take upon themselves the duty of constructing The History of Rock ‘n’ Roll perform the same sort of activities as other historians, selecting representative figures, moments, and events from the past and then ascribing to them value and distinction in a larger pattern of meaning.

Take, for example, the claim widely attributed to Brian Eno, that although just a few thousand people bought the first album by the Velvet Underground, virtually everyone who did so was inspired to start a band. While one might legitimately ask how he (or whoever actually uttered the remark) managed to acquire such information and to possess such grand, omniscient knowledge, that’s really not the point. My point is that he’s taking on the role of the historian—like all historians, his role a self-appointed one—constructing a cause-and-effect narrative history of rock, giving it a genealogy and hence a tradition. In this case, he’s ascribing to the Velvet Underground a key or foundational role in a larger, sequential narrative called the history of rock, asserting that those who came within earshot of that VU album were the inheritors—the torchbearers—of the spirit and innovation of the band (the proper names of the group normally would follow). By analogy, think of the genealogical style of Biblical chronicles: x begat y, y begat z, and so on.

He has every right to make remarks like that, of course, as Benedict Anderson points out, since those who come after, the second, third, and subsequent generations, have the right to speak for the dead--even when those on whose behalf they speak could never have understood themselves as such (198). (Anderson notes, for instance, that Michelet, the self-appointed historian of the French Revolution, claimed to speak on behalf of those who sacrificed themselves for the nation of France, saying what they "really" meant and what they "really" wanted.) In the creation of a narrative in which the Velvet Underground serves as the grand ur-precursor to every subsequent avant-garde, experimental, glam rock, punk, post-punk, new wave, goth, and indie rock band to follow, the historian is actually speaking his own history--his own desire--articulating a faith, for he is really designating as a precursor a band whose members authored a future that they could have neither predicted nor fully comprehended.

Here’s the same general point, stated more poetically, by Gertrude Stein:

No one is ahead of his time, it is only that the particular variety of creating his time is the one that his contemporaries who are also creating their own time refuse to accept. And they refuse to accept it for a very simple reason and that is that they do not have to accept it for any reason. . . . Those who are creating the modern composition authentically are naturally only of importance when they are dead because by that time the modern composition having become past is classified and the description of it is classical. That is the reason why the creator of the new composition in the arts is an outlaw until he is a classic, there is hardly a moment in between and it is really too bad very much too bad naturally for the creator but also very much too bad for the enjoyer. . . . For a very long time everybody refuses and then almost without a pause almost everybody accepts. (“Composition as Explanation,” in Selected Writings of Gertrude Stein, ed. Carl Van Vechten [The Modern Library, 1962], 514-15.)

Why is the construction of such genealogical histories so important to us? Because to claim that there is no rationally directed development is to open oneself to the realization, as Karl Popper observed in the 1940s, that history has no discernible meaning or pattern, that the future is radically contingent. His argument has never been answered because it is unanswerable (except by an appeal to faith, a belief in teleology). Popper claimed that the human future will be as it has always been, dominated by technological changes. The history of rock has been dominated by technological change; a book ought to be written exploring the role of technology rather than, as most histories do, tracing genealogical influence. What would rock music be if not for the electric guitar? The programmable synthesizer? And way back when: how else would Elvis have burst into the national spotlight if not for television?

Genealogical history has the virtue of connecting the present to a past that consequently becomes meaningful, and hence providing the semblance of continuity from one generation to the next. But as for the creation of rock histories, influence (however defined) is a faith, and hence undemonstrable.

Monday, April 7, 2008

Charlton Heston, 1923-2008

There is a story that Jean-Luc Godard, although he despised John Wayne’s politics, nonetheless burst into tears at the moment in The Searchers (1956) when the John Wayne character, Ethan Edwards, rather than killing his niece Debbie as we think he’s going to, sweeps the girl up in his arms and says to her, tenderly, “Let’s go home, Debbie.” Thus Godard understood early on something about the cinema that a filmmaker such as Michael Moore, apparently, never has: no amount of ideological demystification can diminish the sheer power of the movies—or movie stars, for that matter. In my undergraduate days at a major midwestern university, I remember standing in line to see Citizen Kane; there were several screenings that day, each of them with packed audiences. As the earlier crowd was leaving the theater so that we, the next audience, could take our seats, a spoilsport who’d just seen the film yelled out, loudly, so that all of us waiting for the next screening might hear him, “Rosebud’s a sled!” But the joke was on him: did he really think that we were all waiting in line to see Citizen Kane merely in order to learn the final revelation of its grand enigma, as if that is what going to the movies is all about?

John Wayne became the biggest movie star of all time because he understood the fundamental principle about being a movie star: play yourself. Robert B. Ray writes:

Film stars, in fact, have always been less actors than personalities, paid to personify (rather than impersonate) a certain character type. As one film historian (Ronald L. Davis) has written, “Most of the old studio stars created a persona, and they acted that persona no matter what role they played. Audiences flocked to the theaters more to see their favorite stars than to watch realistic performances. . . . Most of the great Hollywood stars were almost pure personality, like Clark Gable, who didn’t much like acting.” (“The Riddle of Elvis-the-Actor,” 103-04)

Charlton Heston was a great Hollywood movie star because he was pure personality--he played himself. He can thank Cecil B. DeMille in large part for his magnificent career, for it was DeMille—who didn’t care two pins for so-called “realistic” acting, despite his claims to the contrary—who early on realized that Heston never would be effective at playing “slice-of-life” drama: his personality was too strong, his acting skills too rudimentary, to succeed at that sort of performance. Thus it was only appropriate that his star-making performance should have been in The Greatest Show on Earth (1952), a superb melodrama that’s really more like vaudeville than “slice-of-life.” As circus boss Brad, Heston commanded that heterogeneous group of circus performers by sheer force of his personality, not by patient, carefully reasoned argument. That movie was the proverbial harbinger of things to come: he played the same role for DeMille again, but under a different name, in The Ten Commandments (1956).

He won an Academy Award for his performance in the Biblical epic Ben-Hur--this time reprising his role from The Ten Commandments rather than The Greatest Show on Earth--and thus became a Big Star. But had Charlton Heston by chance died after making Ben-Hur, he would have become simply the answer to a trivia question: the actor who played Moses. Or perhaps the actor who appeared in one of Orson Welles' best later films, Touch of Evil (1958).

Though it contradicts the standard (sanctioned) career retrospective, I would argue that his greatness as an actor lies in the films he made in the 1960s and 1970s, films such as El Cid (1961), Khartoum (1966), and Planet of the Apes (1968). It is the latter movie that placed Charlton Heston on that privileged Mount Rushmore of Hollywood stars inside my head--it has remained one of my Top 10 favorite movies for forty years. Planet of the Apes gave Charlton Heston one of the greatest moments--and greatest punch lines--in Hollywood history, and only Heston could have delivered that line addressed to a "damned dirty ape" with such memorable panache, a mixture of arrogance, contempt, loathing, recalcitrance, and seething hatred.

Moreover, after the triumph of Planet of the Apes, he performed in a series of apocalyptic films that I still find remarkable:

Beneath the Planet of the Apes (1970)
The Omega Man (1971)
Soylent Green (1973)

He dies the reluctant martyr in each one, going one step further than Brando, who often suffered physical punishment but seldom died. It has been observed, correctly, that Marlon Brando brought to the movie screen an eroticized violence, in films such as On the Waterfront (1954), One-Eyed Jacks (1961), and The Chase (1966). I think Charlton Heston learned from these films, and like Brando began to take on roles in which he had to suffer great physical violence at the hands of his enemies: he suffered, but like Samson, took his enemies with him. I love his performance as the cynical, corrupt cop in Soylent Green: he's by turns slimy, nasty, thuggish, sentimental, and teary-eyed--and has to suffer a terrible beating by someone who's even more slimy, nasty, thuggish, and corrupt than he is, Chuck Connors. And at the end of Soylent Green he gets to utter another famous line, but this time not yelled out with arrogance or hatred, but with revulsion and disgust mixed with resignation: "Soylent green is people."

And then, in the films he made after that amazing stretch from 1968-1973, he was the movie star, always playing himself, as certain as gravity, his face as instantly recognizable as one's own. He would later appear on a couple of episodes of Saturday Night Live, shows which I've seen in rerun; he genuinely seems to be enjoying himself, and having fun puncturing his own image. Always the actor, he couldn't turn down the limelight, accepting the controversial role as figurehead for the NRA--leader once again, defending the U. S. Bill of Rights as Moses defended the ten commandments. But unlike Michael Moore, I never found his politics especially interesting. Counterculture figures such as Frank Zappa defended the Bill of Rights, too--remember the Grand Funk Railroad album Zappa produced, Good Singin' Good Playin' (1976), the album which contained the song "Don't Let 'em Take Your Gun"?

Charlton Heston will forever remain one of my favorite Hollywood stars, one of the stars who in my lifetime conjured up the magic of the cinema, and drew me under its spell. He did so for a reason I hope he would take as a sincere compliment: by sheer force of his personality.

Friday, April 4, 2008

Trans

My friend Tim Lucas posted a comment in response to my previous entry, “His Master’s Voice,” containing a number of interesting ideas that prompted me to pursue yet another line of speculation regarding the meaning of the Moog synthesizer in sixties popular music. I'll admit to being especially intrigued by an observation made by Trevor Pinch and Frank Trocco in Analog Days: The Invention and Impact of the Moog Synthesizer (Harvard University Press, 2002), one which I cited in my earlier post:

The Moog was a machine that empowered . . . transformations. The [Moog] synthesizer . . . was not just another musical instrument; it was part of the sixties apparatus for transgression, transcendence, and transformation. (305)

In addition to the grouping of transgression, transcendence, and transformation, one could add any number of words containing the prefix trans: transmission, transistor, translation, transvestite, transferal—and transsexual. Pinch and Trocco speculate as to whether Walter Carlos’ transformation into Wendy Carlos--which roughly coincided with the time she began work on the hugely successful synthesizer album Switched-On Bach (1968)--occurred “around the time she was developing as a synthesist,” and whether the transformation “had anything to do with the Moog, and with synthesis itself” (137). Admittedly, as Pinch and Trocco themselves point out:

The question of gender and the synthesizer is a tricky one. Certainly electronic music technologies have traditionally been used for building masculine identities—the boys and their latest toys. But different sorts of masculinity can be involved in how men interact with technologies, and several women we interviewed for this book, notably Suzanne Ciani and Linda Fisher, have developed intense personal relationships with their synthesizers....If, as Judith Butler argues, gender identities have to be performed, a key prop in the performance of these synthesists is the machine with which they spent most of their hours interacting—the synthesizer. What we want to suggest with Wendy [Carlos] and her synthesizer is that it may have helped provide a means whereby she could escape the gender identity society had given her. Part of her new identity became bound up with the machine. (138)

While I’d like to pursue some implications of these speculations by Pinch and Trocco, I'll digress for a moment in order to point out how their speculations contribute to a theory about how we might possibly interpret a musician’s particular use of the synthesizer during live performance:

Keith Emerson (Emerson, Lake, and Palmer): genital prosthesis/phallic symbol
Rick Wakeman (Yes): genital prosthesis/phallic symbol (but more synths than Emerson, therefore his is “bigger”)
Allen Ravenstine (Pere Ubu): non-instrumentally, as noise, a child playing with a complicated toy, thus conforming perfectly with David Thomas’ odd stage persona as a prematurely large, chubby kid (Baby Huey)

In Wendy Carlos’ case, the use of the synthesizer to interpret a Baroque composer such as Bach is, of course, avant-garde in its impulse. But if one pauses to consider the synthesizer as a fetish object, the Moog--a machine whose operation rested upon its capacity to be re-wired, through the endless plugging and unplugging of patch cables across a bewildering array of panels and the tweaking of many dozens of knobs, in order to produce a different sound--is not an inappropriate object of identification for a transsexual, since gender is indeed in part a social performance--an effect. (Derrida on the fetish: “the projection operates in the choice rather than in the analysis of the model.”) In addition, engineers' coding of wire connections as "male" and "female" is highly suggestive.

Early Moog synthesizers had the capacity not only to produce “ethereal” or “unearthly” sounds but also to produce simulacra--not the sound of an actual harpsichord, for instance, but a pseudo-harpsichord--a “fake” or “trick” harpsichord. A simulacrum is like its model in every way, yet is unlike it because of an often intangible difference based on lack. For Wendy Carlos, the synthesizer is not a prosthesis for genital display (as are banks of synthesizers, or the electric guitar), but is homologous to a castrato, a castrated male who, dressed as a female, sang soprano parts in Italian opera. Although their high voices were the consequence of a physical cut, an alteration, castrati were nonetheless highly feted singers. (See Roland Barthes’ S/Z, a reading of Balzac’s “Sarrasine,” the story of a naïve French artist named Sarrasine who takes the requisite artistic pilgrimage to Rome. Ignorant of the fact that soprano parts are performed by castrati, Sarrasine falls in love with a soprano who goes by the name of La Zambinella, eventually to learn the devastating truth about the actual identity of his beloved--and that his love can never be consummated.)

How appropriate, then--and I remark upon this without irony or sarcasm—that Switched-On Bach was presented by “Trans-Electronic Music Productions.” It is also interesting to note that, as revealed by Pinch and Trocco's interview with Bernie Krause, the eccentric Paul Beaver--an early synthesist pioneer who died prematurely in 1975, and whose career has been largely overlooked in favor of Wendy Carlos' career--was bisexual, yet another provocative association with the Moog synthesizer, and those drawn to its mystery and singularity.

Wednesday, April 2, 2008

His Master's Voice

Although colloquially referred to as a “Leslie”, the Leslie Rotating Speaker System is actually a sound modification (deformation) device, not a standard speaker as such--that is, not an amplification and reproduction mechanism so accurate and so realistic in its sound that the reproduction could fool one’s faithful dog. The mythic origin of the relationship between the master, the master’s voice, and the faithful dog is ancient: it can be traced back to Homer’s Odyssey, with the relationship between Odysseus and his elderly dog, Argos. If you’ll remember, Odysseus has been gone from Ithaca for twenty years, and when he finally returns, he’s disguised as a beggar. When Odysseus, home at last after all those long years, eventually speaks, Argos, his old, dying dog—so miserably old that the only way the beast can stay warm is to lie on a composting manure pile—instantly lifts up his head in excitement, having recognized his master’s voice. The presence of his master’s voice, of course, means to the dog that his master has finally returned. Thus Nipper, the name of the dog used as a model in the painting that eventually became RCA’s logo, is really misnamed. In honor of that miserably old dog that waited twenty years just to hear--once more before he died--his master’s voice, RCA’s mascot should be re-christened Argos.

The Leslie did not originate as a speaker the purpose of which was to reproduce “his master’s voice.” Although invented in the 1940s to augment the sound of the Hammond organ, in the 1960s the Leslie--named after its inventor, Donald J. Leslie (1911-2004)--began to be put to use by rock bands in an unexpected way. Michael Jarrett writes:

The overlapping waveforms produced by the Leslie’s two speakers—not unlike the effect derived by yelling into an electric fan—generate a sonic moiré pattern (a Doppler effect): the tremulant sound associated with Hammond organs. But other instruments have also been played through Leslie cabinets....To the psychedelic mind, the Leslie and LSD were homologous; both altered everyday perception. (140)

The lead guitar part on The Beach Boys’ “Pet Sounds” was modified by a Leslie, while on “In-A-Gadda-Da-Vida” it was Ron Bushy’s drums. The Beatles’ vocals were modified by a Leslie on “Tomorrow Never Knows” (among others), as were Ozzy Osbourne’s on Black Sabbath’s “Planet Caravan.”
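(A brief technical aside. For anyone curious what that “sonic moiré pattern” amounts to in signal terms, here is a minimal sketch--my own illustration in Python, not anything drawn from Jarrett or from Donald Leslie's actual electro-mechanical design--that fakes a rotary-speaker effect by sweeping a short delay (the Doppler-like pitch wobble) and modulating the amplitude (the tremolo), then mixing the result back with the dry tone. The rotor speed and delay depth below are assumed, purely illustrative values.)

import numpy as np

SR = 44100            # sample rate in Hz
DUR = 2.0             # length of the test tone in seconds
ROTOR_HZ = 6.0        # assumed rotor speed, roughly a "tremolo" setting
t = np.arange(int(SR * DUR)) / SR

dry = np.sin(2 * np.pi * 220.0 * t)   # plain 220 Hz tone standing in for an organ note

# Doppler-like wobble: read the dry signal through a delay that sweeps
# between 0 and 2 ms as the "horn" rotates.
max_delay = 0.002
delay = max_delay * (1 + np.sin(2 * np.pi * ROTOR_HZ * t)) / 2
wobbled = np.interp(np.clip(t - delay, 0, None), t, dry)

# Tremolo: the horn alternately points toward and away from the listener.
tremolo = 0.6 + 0.4 * np.cos(2 * np.pi * ROTOR_HZ * t)
wet = tremolo * wobbled

# Mixing wet and dry is where the "overlapping waveforms" come from: the two
# copies drift in and out of phase as the delay sweeps, producing the beating
# Jarrett compares to yelling into an electric fan.
out = 0.5 * (wet + dry)   # write `out` to a WAV file to hear the effect

Needless to say, this is a toy: an actual Leslie cabinet splits the signal between a rotating treble horn and a bass rotor, each spinning inside a wooden enclosure at its own speed.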

A Few Representative Recordings Featuring the Leslie:

The Beach Boys, “Pet Sounds,” Pet Sounds (1966)
The Beatles, “Tomorrow Never Knows,” Revolver (1966)
Procol Harum, “A Whiter Shade of Pale,” Procol Harum (1967)
Steppenwolf, “Born to be Wild,” Steppenwolf (1968)
The Band, “Tears of Rage,” Music from Big Pink (1968)
Iron Butterfly, “In-A-Gadda-Da-Vida,” In-A-Gadda-Da-Vida (1968)
Black Sabbath, “Planet Caravan,” Paranoid (1970)

The Leslie was to LSD what the Moog synthesizer was to interstellar space travel. If the Leslie was light-hearted and benign, the Moog synthesizer was dark and foreboding: the Leslie was incapable of creating the sinister drone of the Moog. However, both machines reveal that for sixties rock bands, sound made all the difference. According to Trevor Pinch and Frank Trocco, in Analog Days: The Invention and Impact of the Moog Synthesizer (Harvard University Press, 2002), Donald Cammell and Nic Roeg’s Performance (filmed 1968, released 1970) “is the only movie we know of where the Moog synthesizer [a Moog Series III] itself makes a cameo appearance.” (Brian De Palma’s Phantom of the Paradise [1974] featured the synthesizer TONTO, but not its sounds. Jon Weiss actually set up a patch for Mick Jagger on the Performance set.) Pinch and Trocco write:

In a key scene . . . Turner [Mick Jagger] for a moment is the mad captain at the controls of spaceship Moog. The Moog and its sounds are the perfect prop, part of the psychedelic paraphernalia, the magical means to transmigrate a fading rock star into something else. The Moog was a machine that empowered such transformations. The synthesizer for a short while in the sixties was not just another musical instrument; it was part of the sixties apparatus for transgression, transcendence, and transformation. No wonder the sixties rock stars loved their Moogs. (305)

The synthesizer’s key place in sixties rock began in June 1967, when Paul Beaver and Bernie Krause (the recording duo of Beaver & Krause) set up a booth on the Monterey fairground as part of the Monterey International Pop Festival in order to promote, and perhaps even sell, the Moog synthesizer. They actually sold several. According to Pinch and Trocco, “Monterey was the place where the subculture became mainstream” (117).

A Few Representative Recordings Featuring the Moog Synthesizer:

Mort Garson and Bernie Krause, The Zodiac Cosmic Sounds (1967)
Johnny Mandel, Point Blank (1967) (Film Score Monthly, 2002)
The Doors, Strange Days (1967)
Paul Beaver and Bernie Krause, The Nonesuch Guide to Electronic Music (1968)
The Byrds, The Notorious Byrd Brothers (1968)
Walter [Wendy] Carlos, Switched-On Bach (1968)
Emerson, Lake, and Palmer, Emerson, Lake, and Palmer (1971)

Monday, March 31, 2008

Critical Overcomprehension

In his witty and insightful book Adventures in the Screen Trade (1983), William Goldman, a highly successful screenwriter (Butch Cassidy and the Sundance Kid) but also a wry critic of Hollywood, observes that a Hollywood studio head is very much like the manager of a baseball team: each and every day he wakes up knowing that sooner or later he is going to be fired.

No doubt the vast majority of today’s critics--of the theater, movies, music, contemporary fine arts--wake up each morning in a similarly precarious position, not necessarily thinking they will be fired from their privileged critical occupation, but that most certainly and with a creeping, unavoidable inevitability--like the day of their death--they will be wrong. What is a critic’s deepest fear? To have erred in judgment, to have made the wrong call, in short, to have missed the boat.

No music critic wants to miss the boat--to have critically underestimated, or what’s worse, to have dismissed the next Velvet Underground, for instance--so in order to avoid making such an unwitting mistake, the critic engages in what Robert Ray, employing a term coined by Max Ernst, calls overcomprehension (How a Film Theory Got Lost, Indiana University Press, 2001, p. 82). Ray writes:

Aware of previous mistakes, reviewers become increasingly afraid to condemn anything....Hence ... [one] ... of modern criticism’s ... great dangers, what Max Ernst called “overcomprehension” or “the waning of indignation”.... (82)

No critic, of course, can see beyond the curtain of time. Time is the ultimate critic, and the critic’s limited perspective doesn’t allow him to see beyond his own pitifully narrow moment in history. Critical overcomprehension--the act of giving every new record an equally glowing reception--is a result of the critic’s deep fear of being judged by history as wrong. No one wants to be, for instance, television critic Jack Gould, who reviewed the Milton Berle Show appearance of Elvis Presley for the New York Times in 1956:

Mr. Presley has no discernible singing ability. His specialty is rhythm songs which he renders in an undistinguished whine; his phrasing, if it can be called that, consists of the stereotyped variations that go with a beginner's aria in a bathtub. For the ear, he is an unutterable bore, not nearly so talented as Frank Sinatra back in the latter's rather hysterical days at the Paramount Theater. (qtd. in Robert Ray, 80)

Of course, as Ray points out, Gould’s kind of critical error had its own unintended consequences: such gross mistakes led to “rejection and incomprehensibility as promises of ultimate value” (82). In other words, if an album sold poorly, or the artist who recorded it was given scant attention--or worse, completely neglected in his time--the record must therefore be great, perhaps even a masterpiece.

I suppose we all have adopted our favorite neglected artist, the artist whose critical neglect or, if you will, martyrdom, ironically, is the sign of greatness, of ultimate value. In my own music collection, this sort of artist is represented by, among others, Tim Buckley and Phil Ochs.

But I’m wondering, what do we do with the opposite case, the artist who is the critical establishment’s darling and whose records we therefore own, but never play? (Perhaps I'm a heretic, but I find myself playing only certain selections of Trout Mask Replica, not the entire disc.) The presence of both sorts of records, side by side in our music collections, reveals the persistent problem of what Robert Ray calls the Gap, the problem of assimilation, the failure of a new or unusual artistic style to be made intelligible to the public. Although rock 'n' roll is now over fifty years old, we still find ourselves struggling to fully comprehend its challenges and complexities, rather like a person who has difficulty reading or understanding the lines indicating contours and elevations on a topographic map.

Sunday, March 30, 2008

DIDs and the Principle of Parsimony

Last night my fellow Video Watchdog kennel member Kim Newman left a comment on my “DIDs” entry (DIDs=Desert Island Discs) that I found so interesting I was prompted to share it:

I assume you know this, but sometimes bits of British pop culture are surprisingly obscure outside the UK. The term “Desert Island Discs” comes from a long-running BBC Radio 4 program--it started in 1942, and is running [!]--in which a celebrity selects the eight records they’d take to a desert island (along with one book and one “luxury”) and is interviewed about their life, work and how they’d survive in this situation. It’s such a simple format that it’s lasted forever in broadcasting terms (its creator, Roy Plomley, was the host until 1985, and only three other presenters have succeeded him). I’d be surprised if it hadn't been done in other countries.

I very much appreciate Kim taking the time to post this information, because in fact I did not know the origin of the practice of selecting Desert Island Discs. In the U.S., most lists default to a “Top 10,” so I’d always assumed a DIDs list consisted of ten albums. But, as Kim points out, the original practice was to select eight records, one book, and one “luxury.” As Tim Lucas pointed out in his comment on the DIDs entry, there are books on the subject of DIDs (the one I know about being Greil Marcus’s), but I’ll admit to never having read any of them (see Tim's comment for a discussion). As I mentioned in my earlier blog entry, I find most DIDs lists uninteresting: either they consist of a recitation of the same old titles, or they are so willfully obscure as to be intellectually impenetrable.

The fact that the practice of selecting DIDs originated in England during wartime--that is, during a time of shortages, of scarcity, of rationing (frugality mandated by the government), in short, a time of widespread lack of the necessities and comforts of life, one that demanded sacrifice of all civilians--is quite revealing, really, for in my initial post I’d connected the practice of DIDs to the Principle of Parsimony, an unstated linkage I’m now convinced, thanks to Kim’s post, is correct.

The Principle of Parsimony (parsimony generally being defined as excessive frugality or stinginess, especially with regard to money) is sometimes called “Occam’s Razor” after its putative originator, William of Occam. His specific purpose was to formulate the rules of logic that would minimize the proliferation of causal and/or explanatory hypotheses--in colloquial terms, "the simplest explanation is most often the best," or in its laconic, Dragnet formulation, "just the facts, ma'am." However, the Principle of Parsimony became more popularly formulated as, “entities should not be multiplied beyond necessity,” a utilitarian principle that not only justifies stinginess (“parsimoniousness,” sometimes referred to as “miserliness”--the Scrooge syndrome) or excessive frugality but forms the basis--seriously--of the Puritanical injunction against recreational sex: recreational sex violates the Principle of Parsimony. In strictly utilitarian terms, you have sexual intercourse when you intend to procreate--period. Parsimony, like the Reality Principle, strives to restrict or inhibit the various expressions of pleasure.

The adage, “entities should not be multiplied beyond necessity,” is just about as good a Puritanical justification as one could find for the practice of compiling DIDs lists. However, if the Principle of Parsimony is the Puritanical underpinning of DIDs lists, the actual mental activity that dictates the selection of the list itself is perversity (resistance, obstinacy). In other words, when faced with the choice of having something or nothing (even if that something is “just a little,” i.e., the Reality Principle), desire chooses something: perversely--out of necessity--it selects a single object of pleasure out of a vast number of possibilities: the rarified, fetishized object--one DID out of a possible 8 or 10 (the total set). Each element of the set is like a game piece one must select before the game starts, the game being how to negotiate the operation of pleasure with a highly restricted economy premised on lack.

There’s a Warner Brothers cartoon (I think) that expresses this mental operation of lack determining desire in a wonderfully concrete form. If my memory serves, the scene depicts a weak, starving, sad-eyed character (a dog?) placing a lone, small bean in the center of an immense plate. With his napkin, knife and fork on his left, he very carefully salts and peppers the single bean. He then ceremonially ties on a bib and raises his knife and fork over the bean . . . and then oh so delicately, with tender, loving care, cuts the bean in half, raises the parsimonious morsel to his mouth, and begins to chew it, savoring its delicate, subtle flavors.

Can someone leave a comment with the name of that cartoon? If I happen to have it, I'll try to post a frame grab on a future blog entry.

Saturday, March 29, 2008

The Ideas They Kept A-Rollin’

This morning I was pleased to discover that the number of hits on my blogspot had spiked noticeably, I suspect in part because of the stimulating exchange (stimulating to me, anyway) Tim Lucas and I have had the past couple of days regarding the relationship between psychedelia and bubblegum music. I invite all my blogspot visitors to read his comments, available through the comments link at the end of my “Bubblegum Breakthrough (Slight Return)” entry. (His initial comment, which prompted the subsequent discussion, is available at the end of the previous day’s entry.)

I am especially gratified by the number of visitors because I think he and I have, in the space of about 48 hours, generated more ideas about how to read (as in interpret) popular music than one can find on websites specifically dedicated to the task of reviewing albums. It’s true that we have been focused on a rather narrow slice of popular music history--admittedly, a slice that is perhaps not interesting to all readers. But what I’ve found so stimulating (as I think Tim has) is not so much our individual valuations of the individual albums or songs--disagreement is a healthy thing, not a “bad” thing, because it promotes further discussion that usually translates into knowledge--but the various methods we’ve employed to make the music meaningful in the first place. After all, popular music doesn’t “mean” anything at all—doesn’t gain any adherents--until it conforms to certain trends and ideas that make it valuable to listeners.

Perhaps the point is best expressed by James Lincoln Collier, a critic whose knowledge of jazz is encyclopedic in its breadth, in Jazz: The American Theme Song (Oxford University Press, 1993). Although he is writing about how jazz music came to represent the new modern spirit of America in the 1920s (“Modernism”), his point is applicable to the way all popular music is ascribed meaning and value:

The point is that a particular style or form in art gains adherents not simply from purely aesthetic considerations, but also from how well it appears to agree with fashionable social, philosophic, or even political considerations . . . . (p. 9)

It was Collier’s insight that formed the basis of my initial assertion, that psychedelia is the aural equivalent of a hallucinogenic drug trip: the particular “sound” that became known as psychedelia meant nothing until it was ascribed a certain analogical meaning.

I think exchanges of the sort Tim and I have had the past couple of days are rare in the sense that they happen only when the individual participants coincidentally have the time to dedicate to such pursuits. (He’s trying to assemble the latest issue of Video Watchdog while I’m trying to provide him with the material to do just that.) Although Tim has been writing on the cinema since he was a teenager, and I’ve been writing for Video Watchdog for the past 11 years, both of us have a keen interest in popular music and it has always been a pleasure for me to share ideas and views about music with him. I don’t think our mutual love of movies and music should be surprising to those who know us primarily through Video Watchdog, as we’re both extremely interested in what in the most general terms is called the “entertainment industry,” the way it has formed our identities and contributed to the life of our individual imaginations. We’re also interested in it because we’re both striving to understand ourselves as individuals whose identities were formed during a particular historical moment when the entertainment industry had finally achieved the cultural dominance that we now accept as a given, like a fact of nature.

In short, we take popular music very seriously. Last year he and I both submitted proposals to Continuum’s 33 1/3 series, only to have our proposals rejected by the editor. The manuscript for his book, on Jefferson Airplane’s Crown of Creation, has been completed for a year now if not longer; my manuscript, on Wall of Voodoo’s Call of the West, is perhaps half completed, as I stopped working on it once I received the rejection notice (an email). Both of us obviously were disappointed by the outcome, as we’d each completed a considerable amount of original research, and a number of original interviews. In my case, I had the complete cooperation and total support of the defunct band’s leader, Stan Ridgway, who is still active, touring and making albums. If anyone knows of a potential publisher for these books, please let Tim or me know.

Bubblegum Breakthrough (Slight Return)

Last night my friend Tim Lucas took the time to post comments to my recent entries, “Bubblegum Breakthrough” and “DIDs,” a gesture that I very much appreciate--one hopes that one’s blog entries are taken seriously by somebody. While I’d like to respond at length to the many ideas in both of his posts, for the moment I’ll confine myself to his comments on “Bubblegum Breakthrough,” simply because it’s the most recent entry.

In that entry I mentioned the co-songwriters of “The Rain, the Park & Other Things”--Artie Kornfeld and Steve Duboff--and Tim was right to remind readers that I’d overlooked the fact that Kornfeld was one of the co-producers of the 1969 Woodstock Festival. Those interested might want to visit his webpage, where one can find biographical information as well as behind-the-scenes information on the complexities of staging the famous music and arts festival. (Alternatively, a brief bio of Kornfeld is available here.)

Tim makes an intriguing link between “The Rain, the Park & Other Things” and the heavy rains that festival-goers had to endure while at Woodstock:

No wonder he [Kornfeld] . . . looks so blissed out while standing onstage and rapping to the ABC newsman about all the people sitting in the rain in the Woodstock movie. His rap is the one Charlton Heston has memorized in The Omega Man.

Having read Tim’s comment, it occurred to me that one could think of “The Rain, the Park & Other Things” as a sort of virtual rehearsal for the Woodstock festival itself, as if Kornfeld had, in some half-formed or perhaps unconscious way, the idea for the Woodstock festival in his head when he wrote the song years before, thus making the lyrical content an example of what rhetoricians call prolepsis—speaking of something that has not yet happened as if it already has happened. One wonders if Kornfeld being “blissed out” during the interview isn’t, in part, his own bewildered reaction to the literal realization that “The Rain, the Park & Other Things” was, remarkably, unfolding before him.

In response to my assertion that "The Rain, the Park, & Other Things" was bubblegum music, Tim responded:

The Cowsills may have been a bubblegum act by definition, but I would personally categorize their performance of this song as psychedelia. There is no insincerity or irony in the vocals, for one thing, and the instrumentation has a wonderfully iridescent quality. Wholesome yes, but psychedelic nonetheless--like a black light poster or a strawberry scented candle.

In response, I would say that a fundamental problem--and what makes writing about this sort of music difficult--is that the categories of “bubblegum” and “psychedelia” are ill-defined concepts: they have “fuzzy boundaries” (no pun intended). As an old philosophy professor of mine once warned me: avoid creating false dichotomies between ill-defined concepts. The problem is this: is psychedelia defined by instrumentation, that is, by sound, or by lyrical content, or, as Tim suggests, by a certain rhetorical posture toward the subject matter? (Irony being a defining feature of bubblegum as I understand his argument.)

I agree with his characterization of the song’s instrumentation (sound being essential to psychedelic music), and I also think he’s correct in his observation that there’s no “insincerity or irony in the vocals.” But we disagree over the issue of irony: actually, I would take the opposite position, and say that it is psychedelia that is defined by irony, not bubblegum, the latter being the music characterized by a certain naïvete and lack of irony--an absence of self-consciousness. In order to illustrate my point, juxtapose “The Rain, the Park, & Other Things” with, say, the Rolling Stones’ “2000 Light Years From Home”--a song which, historically speaking, has the virtue of having been released at almost exactly the same time as “The Rain, the Park, & Other Things.” Which song seems more obviously psychedelic? To me, it is “2000 Light Years From Home,” certainly the more irony-laden and self-conscious of the two. What’s more, its lyrics are more “surreal” than those of “The Rain, the Park…,” which form a more coherent narrative, even if the narrator can’t decide whether the event really happened or was a dream. In contrast, psychedelic lyrics are often highly fragmented, repetitive, and, as I mentioned earlier, surrealistic. As an example, think of the Stones’ “She’s a Rainbow”:

Have you seen her all in gold?
Like a queen in days of old
She shoots colors all around
Like a sunset going down
Have you seen a lady fairer?

She comes in colors everywhere;
She combs her hair
She's like a rainbow
Coming, colors in the air
Oh, everywhere
She comes in colors

She’s like a rainbow
Coming, colors in the air
Oh, everywhere
She comes in colors

I see psychedelic music as the aural equivalent of an hallucinogenic drug trip--“She’s a Rainbow” being the Stones’ answer to “Lucy in the Sky with Diamonds”--while bubblegum is the aural equivalent of non-alcoholic beer (or, alternatively, psychedelic music played by a band that doesn't inhale).

I think the Cowsills’ (cleaned-up) cover of “Hair” also works as wholesome psychedelia--listen to the sound effects during the “It can get caught in the trees” stanza--but “Indian Lake” is unabashed bubblegum.

Yes, and yes--although I was never a fan of either the musical Hair or the Cowsills (which doesn't mean, incidentally, that because I wasn't "for" them I was therefore "against" them). "Indian Lake" is on The Best of the Cowsills, but when I play that CD I usually press the "skip" button when "Indian Lake" cues up. To be honest, the only Cowsills record to which I really ever gave a listen was The Cowsills In Concert (which included "Hair" live), an album that a friend insisted I borrow, along with the first Vanilla Fudge album. I have to say that at the time, for some now long-forgotten reason, my tastes gravitated toward Vanilla Fudge, although the last time I listened to their first album (on CD), probably a year or so ago, I found it extraordinarily dull and turgid. Some critic once remarked about the work of the novelist Henry James, "He chewed rather more than he bit off," a remark that is an apt description of the first Vanilla Fudge album. I probably thought at the time that it was "psychedelic," but now I think it is just "pulverizedelic," a plodding, Hammond organ-heavy album that is utterly devoid of any humor or imagination. You can't imagine how many local bands at the time tried to copy its sound, bands that played so many high school proms I don't even wish to think about it. In contrast, and to its credit, The Cowsills in Concert is, now, what it was, then--completely innocuous and benign.

Looking back at my previous post, I see that the fundamental issues became even more complicated when I suggested that "Power Pop" developed out of "bubblegum." Anyone wish to chime in on, for example, the relation between . . . the Cowsills and . . . Big Star?

Thursday, March 27, 2008

1967: Bubblegum Breakthrough

It is no accident that virtually every album considered among the greatest in rock history is a studio album rather than a live one. To name some obvious examples, think of Elvis Presley's first LP for RCA (1956), The Beatles' Sgt. Pepper’s Lonely Hearts Club Band (1967) and Pink Floyd's Dark Side of the Moon (1973)--all products of improvements in studio recording and engineering technology. Moreover, in the case of Dark Side of the Moon, developments in electronic music and the advent of the synthesizer contributed to both its success and its achievement. Because of developments in electronic music and recording methods, by 1967 popular music had begun to provide an aural, electronic equivalent to the hallucinogenic drug experience, known as “psychedelic rock” or simply “psychedelia.”

What came to be referred to, pejoratively, as Bubblegum Music emerged from, and was a response to, psychedelia. The acknowledged masters of this form of pop music were Jerry Kasenetz and Jeff Katz (known as Super K Productions), who were to Bubblegum music what Alan Parsons was to psychedelic rock. Under the banner of Super K Productions, Kasenetz-Katz were responsible for hits such as “Simon Says” by The 1910 Fruitgum Company and “Yummy, Yummy, Yummy” by the Ohio Express, both released in 1968.

In my mind, though, Bubblegum’s first big hit was recorded by The Cowsills, who, as everyone knows, became the model for the musical family depicted in the TV show The Partridge Family (1970-74). The hit, released late in 1967, was titled “The Rain, The Park & Other Things.” It was written by Artie Kornfeld and Steve Duboff, who’d also written the hit “The Pied Piper” for Crispian St. Peters.

I saw her sitting in the rain, raindrops falling on her
She didn't seem to care, she sat there and smiled at me
Then I knew (I knew, I knew, I knew) she could make me happy (happy, happy!)
Flowers in her hair, flowers everywhere!
I love the flower girl! Oh, I don't know just why, she simply caught my eye
I love the flower girl! She seemed so sweet and kind, she crept into my mind
I knew I had to say hello
She smiled up at me, and she took my hand and we walked through the park alone
And I knew (I knew, I knew, I knew) she had made me happy (happy, happy!)
Flowers in her hair, flowers everywhere!
I love the flower girl! Oh, I don't know just why, she simply caught my eye
I love the flower girl! She seemed so sweet and kind, she crept into my mind
Suddenly the sun broke through (see the sun)
I turned around she was gone (where did she go?)
And all I had left was one little flower in my hand
But I knew (I knew, I knew, I knew) she had made me happy (happy, happy!)
Flowers in her hair, flowers everywhere!
I love the flower girl! Was she reality or just a dream to me?
I love the flower girl! Her love showed me the way to find a sunny day

Betraying Bubblegum’s psychedelic origins, the singer is unsure whether he’s just experienced something real or an hallucination. “The Rain, The Park & Other Things” can also be understood as a benign version of The Association’s “Along Comes Mary” with its supposedly cloaked drug reference (“Mary,” so the story goes, is short for “Mary Jane,” one of the many coded names for marijuana).

Although it provided titillations similar to rock's, for a younger, teenage set, Bubblegum was psychedelic music deprived of its substance. It was psychedelia with the malignant property removed, the 1960s equivalent of today’s decaffeinated coffee, fat-free cream, beer without alcohol, sugarless soda pop. It was The Monkees rather than The Beatles, “I Think We’re Alone Now” rather than “Let’s Spend the Night Together,” “Crimson and Clover” as the good (drug) trip rather than the bad one of “2000 Light Years From Home.”

Bubblegum’s novel flavor dissipated fast, and by the early 1970s it was gone, supplanted by what’s since become known as “Power Pop”—think of The Raspberries’ “Go All the Way” instead of Tommy James’ “Hanky Panky.”

DIDs: Of Records, Albums, and Theology

The collocation “Desert Island Discs”—DIDs—normally refers to a music critic’s list of revered recordings, usually consisting of ten albums, as in a Top 10. The term is derived from the question, “If you were stranded on a desert island, what ten albums (normally ten, out of respect to the commandments), out of all the albums you own, would you want to have with you?” Given the hypothetical nature of the question, it might just as easily be phrased as, “If your house were on fire, what ten albums would you grab on the way out?” Implicit in the question is the assumption that the critic compiling the list has hoarded, in a grossly materialistic way, more albums than he could ever possibly listen to (or rather, listen to carefully). Actually, the compilation of a “DIDs” list is a tacit admission by the critic that he really listens only to a small portion of the many hundreds (or thousands) of albums he owns.

I vividly remember a conversation I had ten years ago or so with my friend Mike Jarrett, a music critic himself and a world expert on jazz, when the topic of DIDs came up. In the context of a conversation regarding what each of us might include on a DIDs list, he paused to ask me a question that he prefaced by insisting he was asking in all seriousness. Of course, I said, ask it. Why would I think you were not asking a serious question? The question—brilliant, really, and one I’ve pondered many times in the years since—was this: What makes up God’s record collection: every record ever made, or just the best records ever made?

You don’t have to have any sort of conventional religious belief--or indeed any religious belief at all--to answer the question. How do you answer it--not in a “theoretical” way, meaning, how “would” you answer it on the off-chance that someone ever asked you--but how do you? Does the most ideal of album collections, God’s own, consist of all the albums ever made, or only the best (however the Almighty should decide that)? Is heaven (a desert island, of the tropical paradise sort) a place of plenty, of excess, of everything, or is it premised on the Puritan Principle of Parsimony—that is, DIDs? (When you go to heaven, in other words, and you’ve got only ten choices, what shall they be?) Is it all-inclusive, or exclusive? If you had your druthers, do you invite everybody, or only a select few? Certain Christian traditions, of course, tell us that those selected are an elite few—the Chosen. But I recall answering Mike’s question, “all of them. God has all of them.” Mike’s response was, “But does He listen to them all?” Isn’t this the real paradox of desire: Is desire polymorphously perverse (indiscriminate), or fetishistically perverse (rarified)?

I have never seen a list of DIDs that was really anything more than a particular critic’s fetishized list, selected from a standardized list of “Rock Greats”—the critic’s favorite Beatles album, favorite Rolling Stones album, favorite Pink Floyd album, Led Zeppelin album, Bob Dylan album—you get the idea. And outside of some occasional, unexpected flourishes—Cream, perhaps, or U2, Grateful Dead, Nirvana—the list never contains surprises. (Or, if it does, it’s the “Guilty Pleasure” sort—the fetishized sort, the “can’t explain it, just like it” sort—meaning it eludes rational explication: the critic is a mystery even to himself.) In other words, we all know the critical darlings that are going to be there—Rock music’s Great Tradition—the suspense is simply finding out which album by the canonical bands happens to be the critic’s favorite (at the moment).

The problem is that many music critics are really just fans who’ve learned how to write and found a forum to expound from, fans in the sense that their judgment is uncritical—everything by the band (Beatles, Pink Floyd, fill in the blank) is great. Every song, every album, every note by the band is just as good as every other one. Now this just can’t be true--or can it?

By way of analogy, think of the work by a major literary figure—Shakespeare, for example. As Harold Bloom points out—Bloom being one of America’s best critics—had Shakespeare died at the same age as his contemporary, playwright Christopher Marlowe, and Marlowe lived on instead, Marlowe would have been considered historically the greater playwright. Shakespeare’s early plays do not have the level of sophistication and craft of Marlowe’s early plays. The fact is that, at a comparable age, Marlowe was the stronger playwright of the two. Of course, history is radically contingent: Marlowe was murdered, and Shakespeare lived, eventually composing the great tragedies upon which his reputation largely, and justly, rests. Likewise, of all the many volumes of his writings, the crucial importance of British poet Samuel Taylor Coleridge (author of The Rime of the Ancient Mariner) rests, according to Bloom, on a mere nine poems—but what a brilliant nine they are. In popular music criticism, most critics refuse to make such keen discriminations, partly because they are afraid history will prove them wrong, and so overestimate the importance of every album ("five stars"), or else invent an ad hoc system on which to base their judgment--yet another mechanism of desire--which is presented as “objective.”

Question: Is Meet the Beatles as good a record as the White Album? Or, alternatively: Does God have all the Beatles albums, or only the very best?

Some, rightly so, will cry foul and claim a category error: first I asked about records, and then I asked about albums. In an earlier post, I claimed the two were not the same, a record being a material artifact, an album a concept. But if an album is a concept, does God, then, prefer "Greatest Hits" packages, or the individual albums, in the sense of particular records? Example: Does God have The Eagles' Hotel California, or The Eagles' Greatest Hits? Or all of the individual albums, avoiding the Greatest Hits?

Monday, March 24, 2008

Thursday, January 28, 1960: Why Are There Lyrics, Anyway?

Four years earlier than the above date, in his first national television appearance on the Dorsey Brothers’ Stage Show, broadcast January 28, 1956, Elvis Presley chose to open with “Shake, Rattle, and Roll,” a song that Frank Sinatra, for one, would never have condescended to perform. Sinatra probably snorted in derision at Elvis’s second song that evening, too: “Flip, Flop & Fly.” Three months later, when Elvis again appeared on the Stage Show, he sang “Tutti Frutti.” Forget what young females thought about Elvis’s bad boy sneer, his gyrating hips, his wiggling and shimmying, his clamorous shouts and erotic moans—that’s legendary. The more important question is, why did Elvis choose to perform, on a national stage, such “nonsensical” songs?

One of the reasons Elvis was derided early on was that his choice of material seemed so ludicrous: “Flip Flop & Fly”? “Tutti Frutti”? Why not the deep yearning of classic ballads such as “In the Wee Small Hours of the Morning” or “When Your Lover Has Gone”? Why choose songs that are so devoid of substance—so apparently trivial? Why not perform the old standards instead? Frank Sinatra was so incensed by Elvis’s music that he wrote a magazine article condemning rock as “the most brutal, ugly, degenerate, vicious form of expression it has been my displeasure to hear.” He averred that rock ‘n’ roll is “sung, played, and written for the most part by cretinous goons and by means of its almost imbecilic reiterations and sly, lewd—in plain fact—dirty lyrics, it manages to be the martial music of every sideburned delinquent on the face of the earth.” (qtd. in Kitty Kelley, His Way: The Unauthorized Biography of Frank Sinatra, Bantam Books, p. 277).

However, according to Donald Clarke, by the time Elvis appeared on the historical scene, the music business wasn’t the same as it had been when Sinatra began his career slightly over a decade earlier. By the early 1950s, “good white songs were becoming scarce. The Berlins, Gershwins and the rest had died or retired, and the classic songs they had written could not be imitated.” Hence, Elvis never had access to the sort of material to which Sinatra had access (“standards”), and perhaps that made all the difference. (As a corollary, Elvis was never offered the sort of strong dramatic roles in movies that Sinatra was offered, either.) Many of the composers of Elvis’s early songs were black songwriters writing for the black music market, with a sensibility different from that of the Berlins and Gershwins. So, in answer to the question of why Elvis chose to perform songs such as “Flip Flop & Fly” and “Tutti Frutti”: he chose songs that didn’t sound like anything else. But to Frank Sinatra, if songs weren’t standards, they were aberrations.

Simon Frith offers a way of understanding the difference between Sinatra and Elvis by referring to Dave Laing’s book, Buddy Holly (1972). Laing says that those interested in understanding rock music need to have the musical equivalent of film studies’ distinction between the auteur and the metteur en scene. According to Laing:

The musical equivalent of the metteur en scene is the performer who regards a song as an actor does his part—as something to be expressed, something to get across . . . . The vocal style of the singer is determined almost entirely by the emotional connotations of the words.

Frank Sinatra, then, was the musical equivalent of the metteur en scene. In contrast, says Laing, the rock auteur

is determined not by the unique features of the song but by his personal style, the ensemble of vocal effects that characterize the whole body of his work.

Elvis, then, was the equivalent of an auteur: the meaning of a song is organized not simply around the words but around the exceptional nature of his singing style. Sinatra condemned Elvis because he didn’t understand Elvis’s music, nor could he, at the time, quite grasp the historic rupture in American popular music that Elvis represented. In film-historical terms, Sinatra was the old, “classical” Hollywood, while Elvis anticipated the age—our age—of the independent film, the age of the auteur.

Sunday, March 23, 2008

Mondegreen Pt. 3: Melon Calling Baby

Although there are rather sophisticated Site Meter services available for monitoring website traffic, I use only the “basic” service—in other words, the free one. Other than page views, I don’t pay any attention to the various sorts of traffic data available to me—this isn’t a commercial site, after all, so such information is not critically important to me. The other day, however, I decided to take a look at the data listed in the category “By Referrals,” information on how a particular viewer finds one’s website other than by directly entering its URL (that is, data on referred traffic).

I was mildly astonished to discover how many individuals had been directed to my blogspot as a consequence of searching for the keywords “Betty and the Jets lyrics.” The reason is that a few entries ago I wrote a post on the mondegreen, “Dead Ants Are My Friends, A-Blowin’ in the Wind,” followed a few days later by a follow-up that I’d titled, as a jape, “Betty and the Jets.”

After learning how many page visits my blogspot had received as a consequence of individuals searching for the lyrics to “Betty and the Jets,” I felt a tad guilty: wasn’t I perpetuating what is clearly a rather widespread misunderstanding, adding to, rather than clarifying, the essential homophonic ambiguity surrounding “Bennie and the Jets” by disseminating the deformed, mondegreen version, “Betty and the Jets”? On the other hand, I decided, perhaps having read about the mondegreen, the searchers might, in a sort of roundabout way, figure out the song’s actual title, and hence find the lyrics they had set out to find.

In my earlier “Betty and the Jets” blog entry, I’d suggested that the existence of the mondegreen, at least insofar as lyrics are concerned, is a consequence of a message being deformed once it is subjected to electronic transmission, a technology that emphasizes the received nature of messages.

But perhaps on this Easter Sunday we might consider an entirely different theoretical issue: the (invisible) effect of homophonic ambiguity (the mondegreen) on the transmission of messages originally composed within a largely oral (that is, largely illiterate) culture, and ask whether the transmission of Biblical texts might also have been subject to deformation by the mondegreen. I’m sure such a possibility has given many a Biblical scholar a sleepless night or two (or three). I know it has been exploited for comic effect: think, for instance, of Monty Python’s Life of Brian and the line, "Blessed are the cheesemakers," although that is an intentional deformation, whereas the mondegreen is an unintentional one. Still, the point is clear enough.

There is certainly textual evidence that serves as a sort of “smoking gun” for the existence of the Biblical mondegreen, as Frank Kermode has astutely pointed out in his fascinating book, The Genesis of Secrecy. In his discussion of figura (i.e., typology, a method of Biblical interpretation premised on the assumption that events in the New Testament are “pre-figured,” or anticipated, by textual material found in the Old Testament), Kermode discusses the evidence that Old Testament texts were sometimes “christologized,” that is, “rewritten in a more convenient form.” He writes:

A famous instance is the Christian version of Psalms 96:10, as found in Justin, where the words “from the tree” are added to the original text, “The Lord has reigned.” (107)

The footnote Kermode adds to this passage is worth looking at in detail:

I have heard this example contested on the ground that we cannot be sure there were not Septuagint manuscripts that included the words apo tou xulou (“from the tree”). An explanation of how such an intrusive reading might have come about is this: a translator, coming across the Hebrew word selah, which, though it is not infrequent in the Psalms, has no certain meaning, transliterated it into Greek xela or even xyla, so that the text read “The Lord shall reign xyla.” The addition was modified to xylou, and somebody then made sense of it by inserting apo and reading apo xulou, “from the tree”—thus “manufacturing a prophecy of the crucifixion which was to be welcomed by Christian exegetes” . . . . (158-59).

In other words, the actual text was modified by the exegete’s desire for the New Testament to fulfill the promises of the Old Testament. The analogy, of course, is that one hears what one wants to hear. True, Kermode’s specific example is one of scribal corruption of the orthographic sort (orthography concerns spelling, but also the question of how sounds are represented by written symbols), yet that is only one of the many kinds of ambiguity in language. Some words have one sound but multiple spellings, with a different meaning for each spelling (e.g., main/mane): homophones in the strict sense. Other words have one spelling but multiple meanings (e.g., “bank” as in river bank, but also “bank” as in financial institution): so-called polysemous words. Obviously, both kinds of words can confuse listeners. In Kermode’s example, the argument for the addition of “from the tree” rests on the assumption that some ancient redactor, confused by the Hebrew word selah, decided it was a pseudohomophone (for a contemporary example, think of the use of “luv” for “love”) and therefore made an orthographic error, substituting either the Greek word xela or possibly xyla. A subsequent redactor then modified xela/xyla to xylou, creating yet another confusion, one subsequently resolved by yet another redactor, who added the word apo and read apo xulou, “from the tree.” Note the progressive deformation caused by the original substitution.

There’s a famous example in the history of literary criticism of a scholarly interpretation having been built upon a text containing an error of just this orthographic kind. The literary scholar in question (nameless) wrote a long, involved interpretation of the profound religious and metaphysical implications of Herman Melville’s White-Jacket, mistakenly basing his reading on a textual source in which the phrase “soiled fish of the sea” was, unknown to the critic, a corruption of the phrase found in the first and early editions of the novel, “coiled fish of the sea.” (“Soiled fish” vaguely suggests the idea of original sin, while “coiled fish” suggests the Old English “wyrm,” a serpent or dragon, and hence the Devil.) How easily some later compositor or copy-editor could have made such a mistake; how seemingly minor the simple substitution of the glyph “s” for the glyph “c”—but how astonishing the interpretive flight made possible by such a seemingly insignificant error!

I’ve made the same sort of mistake myself. I am one of those listeners who for years thought Bob Dylan sang, in “Tangled Up in Blue,”

Split up on the docks that night

rather than the lyric as published on his website:

Split up on a dark sad night

Obviously, my interpretation of the song had always rested, in part, on mishearing this particular lyric. I still prefer my version to the actual lyric (the role of desire in hearing). That the song’s narrator and the unnamed woman “split up on the docks that night” always had a wonderfully cinematic, mysterious quality: a foggy, dockside scene in chiaroscuro, two figures in silhouette, illuminated from behind by a single bulb beneath a metal canopy overhanging the entrance to some small, dilapidated shack, with the small squibs of light marking the portholes of the docked ships behind them. I thought it was a suitably romantic image for such a somber parting. How non-cinematic, to me, is the actual lyric; but the point is that one’s interpretation rests on how one has decoded the message.

And just think, all this time, you wanted to be a rockin’ polestar? Or how about a rockin' pollster? Examine not the boat in your neighbor's eye, remove the bead from your own!