If We Remembered Everything

Embed from Getty Images

(Have I used this picture before? I think I’ve used this picture before. With a better work ethic, I’d look and make sure.)

Here’s another one of those posts that has both nothing and everything to do with the ostensible subject of this website. It’s been sitting in my Drafts file in pieces for two or three years, and it still seems a little undercooked, but sometimes you gotta hit “publish” and move on.  


Old School Mono

Embed from Getty Images

I’d like to call your attention to a comment from reader Douglas, as part of the weekend’s discussion of the Stones’ “Tumbling Dice”: “In this day of Super Dolby-Digital Plus-Remastered from the Original Remaster reissues, is it possible that some bands just f-ing sound best in Old School Mono (TM)?”

F yeah.

Years ago I was doing some research at YouTube and came across a homemade stereo mix of the Rolling Stones’ “Satisfaction,” which was famously heard only in mono for the first 30-plus years of its existence. The YouTuber who posted it included the following note: “Demand that Music Companies issue British Invasion, etc in Full STEREO & NOT in monaural . . . Don’t buy mono versions, etc !!!”

This person was guilty of a fairly common prejudice: that mono is stereo’s unsophisticated cousin, and that stereo is a “true” reproduction of music where mono is not. But we think that’s true only because stereo is what we know best. In the early days of recording, there was a debate between people who thought the goal of recording should be exact reproduction of what a listener would hear sitting in the concert hall and those who believed recording could and should enhance the listening experience. The debate was going on long before stereo came on the scene in the late 1950s. Our modern-day preference for stereo basically means that the enhancers won the debate.

A few years ago, I wrote this:

As we were reminded when the Beatles’ catalog was re-released in mono, it was the mono mixes that were slaved over in the studio. The stereo mixes were secondary. (Listen to early Beatles music in stereo—how often do you hear vocals on one side and instruments on the other? That’s the quick and easy way to create a stereo effect.) And if George Martin and the boys had considered mono inferior to stereo, it’s doubtful that the Beatles would have continued to release albums in mono right up until the end of their time together. Sgt. Pepper was intended to show what could be accomplished in the studio. Why would it have been released in mono if mono was merely an inferior copy of a better stereo original?

Mono mixing is an art, and mono mixes can be works of art, as we have chronicled here again and again over the years. And when you go to a live concert, the sound you get isn’t widely separated stereo sound—it’s something much closer to mono.

Mono isn’t inferior, it’s just different.

Stereo recording has been a thing for 60 years now, and I get the sense that it’s become so “normal” today that a lot of producers don’t think about it, the way fish don’t know they’re wet. A lot of today’s mainstream country is mastered to be intensely loud with practically no dynamics. Separation doesn’t matter much in that firehose of sound. (Connoisseurs understand that mono doesn’t have to be loud; neither does it need to destroy dynamics.) In pop music, the loudness wars seem to have eased in recent years, which leaves more room for stereo to expand the soundscape, but there’s not much creative use of left and right. Maybe stuff flying around the soundscape is disorienting for earbud listeners, I don’t know. One thing I do know: stereo can certainly increase the effect of echo. Every other young pop singer is emoting from inside an empty water tank now, which is sometimes a hard listen for a geezer such as I, weaned on the dry, flat style of production and recording that dominated the 1970s.

The way we listen has always affected the way we make musical art, going back to the early 20th century debates about what recording should do. Think of how the development of the 45 RPM record and the portable radio made kids into tastemakers; how the console stereo of the 1950s opened up a market for lush instrumental music; of the symbiotic relationship between sophisticated stereo gear and certain popular styles in the 70s; how the Walkman contributed to the DIY musical culture of the 80s; and how the modern marketplace has been affected by earbuds and streaming. There’s never been a time when we could cleanly separate what we were listening to from the things we were using to listen to it. But just as stereo wasn’t intrinsically better than mono, each succeeding innovation isn’t necessarily an improvement on what came before.

So yeah, some bands (and many of their songs) just f-ing sound best in Old School Mono.

A Beginning and Ending of Things

Embed from Getty Images

This post is a companion to the previous two. Afterward, this website is going on a brief early-summer hiatus. A previously scheduled post will appear this Saturday, but otherwise I got nothin’ until late next week. Go play outside. 

I have a longtime friend who is a lot more practical and a lot less sentimental than I am. Not long ago he said to me about something I wrote (paraphrasing), “I sometimes wonder why you still think about this stuff, and why you don’t just let it go.” He’s not wrong to wonder. I’ve asked those questions myself. Why do I still think about this stuff? Why don’t I just let it go? Isn’t it a little silly for a guy my age to spend so much time remembering stuff that happened when he was 16 or 18 or 22?

But then there’s this, from Nick Hornby’s High Fidelity:

I had kind of hoped that my adulthood would be long and meaty and instructive, but it all took place in those two years; sometimes it seems as though everything and everyone that have happened to me since were just minor distractions. Some people never got over the sixties, or the war, or the night their band opened for the Rolling Stones at the Marquee, and spend the rest of their days walking backwards; I never really got over Charlie. That was when the important stuff, the stuff that defines me, went on.

He’s talking about a broken love affair, but it doesn’t have to be only that. Doesn’t everybody have a period “when the important stuff, the stuff that defines me, went on”? Perhaps not. Maybe you have spent every day of your life constantly moving forward in a perpetual process of growth and change toward some sort of idealized perfect self. I can see the results of a similar process—the person I am now is more accomplished, wiser, better than most of my younger iterations—but I also recognize that to the extent that change took place, it was always shaped by “the stuff that defines me,” a beginning of things, a long time ago.

In the spring of 1978, writing about the ending of things in a journal long since lost, I hit upon the metaphor of a door, which I elaborated on a few years ago:

Change often takes us unawares. Disaster comes with little or no warning. We get fired. Loved ones die. Very rarely in life does a major change loom fixed within our sight, like a door in the distance, one we knowingly walk up to and through, entering into whatever lies beyond.… the one between carelessness and responsibility, between young and not-quite-so-young … between today and tomorrow. 

The stakes on the other side of the door seemed pretty high. Go to college, work hard, get your degree, get a job, work hard, climb the ladder, find someone, make a life for them and you and your children like the one your parents made for themselves and you, and don’t fk it up. I was willing to take it on—given who I was and the kind of person my parents had raised me to be, there was no other path—but in retrospect, it seems like a lot.

For some people, the weight of trying to make a life never goes away. It can be a struggle in terms of the concrete stuff—find a career/prosper in it, find a partner/stay together. But it can also be a metaphysical one: why am I doing this? Should I want to do this, or is there something else I should be doing? How does one navigate this life of randomly dealt fortune and tragedy without falling into denial or surrendering to despair?  

What I learned back when “the stuff that defines me” was going on is this: dealing with the concrete stuff—the what—came easier to me than understanding the metaphysical stuff—the why. And so the latter will always be of greater interest and concern to me.

I saw the door. I knew what was behind it. I knew my friends and I had to walk through it. But why we had to walk through it, why the stuff behind the door is like it is, and what is the best way to make peace with it and find some sort of meaning in it—43 years later, I’m still thinking about that, because I don’t know any other way to be. The half-assed armchair philosopher I am today was born out of the half-assed armchair philosophizing I did in the spring of 1978, at a beginning and ending of things.

Modern Minstrels

Embed from Getty Images

(Pictured: blackface minstrels onstage in 1925.)

I recently read Stomp and Swerve: American Music Gets Hot, 1843-1924 by David Wondrich. (It’s the subject of an episode of the always-terrific Let It Roll podcast, which you can listen to here.) One of its topics is the birth and growth of the minstrel show, a popular form of entertainment from the 1840s well into the 20th century. White performers put on blackface and told jokes, sang, played instruments, and danced in appallingly racist caricatures of Black people and others. Yet despite their racist content, minstrel shows were a significant ingredient in the stew that eventually became American popular music as we would know it in our time.

As it happens, I have seen a minstrel show. More than once.

In the 1960s and 1970s, my Wisconsin hometown, population about 8,700, was 99 and 44/100 percent white, heavily Swiss and German. And every year, the local Lions Club put on what it called a “modern” minstrel show, which featured our city band and an all-local cast. It was a very popular ticket, often selling out the local school auditorium on a Friday night, Saturday night, and Sunday afternoon.

The show opened, as in days of old, with a musical number, then a group of performers took their places on the stage. In the middle was the master of ceremonies, known as the interlocutor. He was flanked by six “end men,” who told jokes and bantered with the interlocutor, and with each other. This was a variation on the traditional 19th and early 20th century shows, where there were but two end men, frequently named Mr. Bones and Mr. Tambo (as pictured above). The end men and interlocutor were played by prominent local businessmen and professional men of the kind likely to be members of the Lions Club. They appeared in the same roles year after year, as did many of the featured singers and dancers. I can’t remember if any women were part of the cast.

After a round of jokes, banter, and pratfalls by the end men, there followed the olio, in which the end men and interlocutor left the stage, and various singers and dancers performed straight. The olio was followed by the afterpiece. In traditional minstrel shows, this was often a skit set on a plantation, or a cakewalk. What it was in my town’s show I don’t remember, although the end men and interlocutor eventually came back for more comedy before a big musical finale involving the whole cast.

When the minstrel show in my town began in 1953, the performers wore blackface. By the time I was attending, in the late 60s and early 70s, the performers wore whiteface, as well as all-white costumes meant to be similar to the shabby costumes blackface minstrels sometimes wore—castoff clothing and junkyard chic, another caricature of Black people. (If this seemed odd to me, I don’t remember it; it was just the way they dressed at the minstrel show.)

I dug into the archives of a hometown Facebook group to read a discussion from a few years back about the minstrel show. A number of people who commented had fathers, uncles, or grandfathers who had been cast members. Many people pointed out that the shows were “politically incorrect” without getting much into specifics. I doubt that the modern shows were as crudely racist as the traditional shows, but traditional minstrel shows were equal-opportunity offenders: they parodied not only Black people but other non-white people and immigrants. (Facebook commenters remembered that our town’s minstrel show frequently made fun of hippies; a classmate of mine remembers appearing as a “Swiss Indian.”) I scrolled expecting somebody to go off on an anti-PC tirade, but nobody did. One commenter even noted that she has a large collection of photos from the minstrel shows but was reluctant to share them because of their potential to offend. (That nobody called her a woke liberal snowflake is a minor miracle.)

Time passed, and the Lions Club eventually decided to stop doing the minstrel show. No one in the Facebook group gave a reason; surely if it had been due to some outcry over the content of the show, somebody would have remembered that. It seems more likely that the show simply petered out because of a lack of interest among the public, or the performers. The last show was presented in 1983—astoundingly late in history for such a problematical form of entertainment to continue, even in a “modern,” sanitized, whiteface form. But in a small, lily-white town, perhaps not all that surprising.

Tune In, Turn On, Make Lunch

Embed from Getty Images

This has nothing to do with music, really, but just go with it.

It all started sometime in 1969 with scattered stories in local newspapers, but it didn’t become national news until late that summer, after the U.S. Chamber of Commerce held a series of workshops on urban issues. In a report on the link between organized crime and narcotics, the following sentence appeared, without context or elaboration: “In Chattanooga, it was learned that due to the high cost of narcotics, young people are using as mainline injections Kool Aid and peanut butter mixed with mayonnaise.”

In October 1969, the Senate Appropriations Committee held hearings on the federal budget. Dr. Stanley Yolles, director of the National Institute of Mental Health, was being quizzed about NIMH’s anti-drug programs when Hawaii Republican Hiram Fong asked, pretty much out of the blue: “When you find out that a person gets a big kick out of injecting peanut butter in his veins, what do you do?” Yolles responded, “I think the only kick they get out of peanut butter is the final kick. It is a very dangerous practice to say the least; it causes death if injected in any large quantity.”

Fong then asked what NIMH does when that happens. Yolles said (and we can imagine his patient tone, but also perhaps the internal eye-rolling) that the agency doesn’t involve itself in individual cases, but instead tries to educate people about the dangers of doing such things through “straight factual information, because we have had experience over the years with misinformation deliberately set out to scare people about using various substances and this has not worked. . . .”

The feds may have wanted to counter misinformation about substance abuse, but at least some of it was coming from inside the house.

That same month, the American Academy of Pediatrics met in Chicago. Guest speakers included Ernest A. Carabillo, Jr., described by the Associated Press as “a lawyer/pharmacist in the Federal Bureau of Narcotics,” and Frank Gulich, “a narcotics bureau official stationed in Chicago.” They spoke to reporters, and an AP story appeared in papers around the country in which Carabillo told about “an underground recipe book purporting to outline ‘culinary escapes from reality.'” Gulich said that the books “usually sell for about $1 and often give the formulas for preparing drugs such as LSD.”

But the AP story did not focus on how kids were using their chemistry sets to become little neighborhood Owsleys. Instead, it focused on the use of peanut butter, mayonnaise, and other substitutes for narcotics. Carabillo said that users “confused the bizarre and toxic reactions with the so-called ‘high’ provided by heroin or marijuana. He cited the smoking of dried banana skins, a fad of a couple of years ago, as an example.” He said that kids were also using cleaning fluid, paregoric [an anti-diarrheal derived from morphine], ethyl chloride [a cooling spray used for pain relief], and freon from aerosol cans to get high. It seems obvious to us that any of the latter would get you off better than peanut butter or mayonnaise, but Mr. and Mrs. Middle America focused on the sandwich spreads.

In December 1969, President Nixon hosted a conference of state governors to address the drug problem. Nixon would try to have it both ways in the drug fight, warning that drug abuse was a threat to American civilization and firing Yolles in 1970 for criticizing stiff marijuana sentences, but also arguing for education over incarceration, at least for some users. (A cynic might suspect it was because middle-class white kids, the sons and daughters of the Silent Majority, were getting thrown in jail.) The conference also gave the peanut-butter-and-mayonnaise phenomenon a publicity boost when TV personality Art Linkletter, whose daughter had committed suicide in October while tripping on LSD, addressed the governors. He repeated the canard and added a new one: that young people were smoking crushed aspirin to get high.

It may not surprise you to learn that there were no solid sources for any of this. The titles of those underground cookbooks went unrecorded. Linkletter’s gravitas on the issue exempted his claims from close scrutiny. News stories about the practice were entirely hearsay. No one really knew if the peanut butter-and-mayonnaise thing was something kids legitimately believed would get them high, or if it was merely a few stoners pranking The Man.

Whatever the case, by the end of 1970, nobody was talking about it anymore.

Sold to the Lowest Bidder

Embed from Getty Images

(Pictured: Olivia Rodrigo in 2019.)

As it happens, I know a little about the teaching of writing. Years ago, I had a job that required me to read 30,000 short essays written by kids in grades three through eight. Some of the older kids clearly aspired to be writerly. Some of them demonstrated the glimmerings of a gift, but the vast majority did not. It wasn’t just that they didn’t know the craft (because at age 12 or 13, how could they?); they also lacked the vocabulary. For example, when they wanted to describe something, instead of using the five senses in simile or metaphor, they’d say things like, “It was beautiful. It was so, so beautiful.”

Because I am not hip, I didn’t hear Olivia Rodrigo’s song “Driver’s License” until earlier this week, even though it spent eight straight weeks at #1 on the Hot 100 from January to March. It’s a meandering mix of textures and tempos, but I fixated on the words. A sample:

And all my friends are tired
Of hearing how much I miss you
But I kinda feel sorry for them
‘Cause they’ll never know you the way that I do
Today I drove through the suburbs
And pictured I was driving home to you

Exposition can’t be avoided, but writing teachers tell students, especially young ones, to show and not tell. Rodrigo, who wrote the lyric, spends most of her time telling. And when she wants to crank up the emotional intensity, she does this:

Red lights, stop signs
I still see your face in the white cars, front yards
Can’t drive past the places we used to go to
‘Cause I still fuckin’ love you, babe

A writing teacher would circle “red lights,” “stop signs,” and “white cars” and suggest the student find stronger words. As for the obscene adverbial intensifier, it’s as inept and immature as “so, so beautiful.” So like many an eighth-grade essay, “Driver’s License” ends up the opposite of what the writer intends: not a vivid description of a deeply felt experience, but the emotional equivalent of a grocery list.

Olivia Rodrigo is a Disney Channel star who turned 18 in February, so she isn’t that far removed from eighth grade. And at least she’s taken to heart the idea that you should write what you know. My purpose here is not to fault her. The main fault involved with “Driver’s License” lies with the people who made something that’s not especially artful into the most popular song in America for two solid months.

In 2003, Guardian columnist Stuart Jeffries wrote: “The real problem with our culture is not a dearth of ingenuity but a willingness to lend that ingenuity to devising things that should be beneath contempt.” Nearly two decades later, the beat goes on. Earlier this week, essayist John Ganz wrote about the blandness of practically every cultural rage right now. We hand over our attention and money in exchange for very little. We are happy to sell ourselves not to the highest bidder, but to the lowest one.

I’ve said before that one purpose of art is to take you out of your moment, to allow you to experience something in a way you can’t do for yourself in that moment. But anybody who ever had a busted high-school relationship has felt what Olivia Rodrigo feels. She doesn’t say anything new about it, or describe a new way of seeing or feeling it, yet millions of listeners (many long past high-school age) were eager to listen to her tell about it, again and again.

It doesn’t take much to buy our devotion. Consider Ed Sheeran: can you recite one interesting or perceptive lyric or whistle a single memorable melody he’s written? Perhaps you aren’t intended to. It’s not an accident that we talk about “music consumption” now. The job of most popular music (words chosen deliberately, as opposed to “the purpose of most popular art”) is to be there when people are hungry for it, like a bag of chips. If chips are what people want, why spend time and effort cooking a steak?

Artists used to aspire to extend themselves. Paul McCartney, for example, has released a half-dozen albums of classical music. But what’s the likelihood that Ed Sheeran will do anything on his next album that he didn’t do on his first four? What are the chances that Olivia Rodrigo’s next single will be a vivid lyrical ride that reminds people of the early Dylan?

Does it even have to be?

Did I miss something important, or am I just completely wrong? Your comments are not just welcome, but necessary.