A Beginning and Ending of Things


This post is a companion to the previous two. Afterward, this website is going on a brief early-summer hiatus. A previously scheduled post will appear this Saturday, but otherwise I got nothin’ until late next week. Go play outside. 

I have a longtime friend who is a lot more practical and a lot less sentimental than I am. Not long ago he said to me about something I wrote (paraphrasing), “I sometimes wonder why you still think about this stuff, and why you don’t just let it go.” He’s not wrong to wonder. I’ve asked those questions myself. Why do I still think about this stuff? Why don’t I just let it go? Isn’t it a little silly for a guy my age to spend so much time remembering stuff that happened when he was 16 or 18 or 22?

But then there’s this, from Nick Hornby’s High Fidelity:

I had kind of hoped that my adulthood would be long and meaty and instructive, but it all took place in those two years; sometimes it seems as though everything and everyone that have happened to me since were just minor distractions. Some people never got over the sixties, or the war, or the night their band opened for the Rolling Stones at the Marquee, and spend the rest of their days walking backwards; I never really got over Charlie. That was when the important stuff, the stuff that defines me, went on.

He’s talking about a broken love affair, but it doesn’t have to be only that. Doesn’t everybody have a period “when the important stuff, the stuff that defines me, went on”? Perhaps not. Maybe you have spent every day of your life constantly moving forward in a perpetual process of growth and change toward some sort of idealized perfect self. I can see the results of a similar process—the person I am now is more accomplished, wiser, better than most of my younger iterations—but I also recognize that to the extent that change took place, it was always shaped by “the stuff that defines me,” a beginning of things, a long time ago.

In the spring of 1978, writing about the ending of things in a journal long since lost, I hit upon the metaphor of a door, which I elaborated on a few years ago:

Change often takes us unawares. Disaster comes with little or no warning. We get fired. Loved ones die. Very rarely in life does a major change loom fixed within our sight, like a door in the distance, one we knowingly walk up to and through, entering into whatever lies beyond.… the one between carelessness and responsibility, between young and not-quite-so-young … between today and tomorrow. 

The stakes on the other side of the door seemed pretty high. Go to college, work hard, get your degree, get a job, work hard, climb the ladder, find someone, make a life for them and you and your children like the one your parents made for themselves and you, and don’t fk it up. I was willing to take it on—given who I was and the kind of person my parents had raised me to be, there was no other path—but in retrospect, it seems like a lot.

For some people, the weight of trying to make a life never goes away. It can be a struggle in terms of the concrete stuff—find a career/prosper in it, find a partner/stay together. But it can also be a metaphysical one: why am I doing this? Should I want to do this, or is there something else I should be doing? How does one navigate this life of randomly dealt fortune and tragedy without falling into denial or surrendering to despair?  

What I learned back when “the stuff that defines me” was going on is this: dealing with the concrete stuff—the what—came easier to me than understanding the metaphysical stuff—the why. And so the latter will always be of greater interest and concern to me.

I saw the door. I knew what was behind it. I knew my friends and I had to walk through it. But why we had to walk through it, why the stuff behind the door is like it is, and what is the best way to make peace with it and find some sort of meaning in it—43 years later, I’m still thinking about that, because I don’t know any other way to be. The half-assed armchair philosopher I am today was born out of the half-assed armchair philosophizing I did in the spring of 1978, at a beginning and ending of things.

Modern Minstrels


(Pictured: blackface minstrels onstage in 1925.)

I recently read Stomp and Swerve: American Music Gets Hot, 1843-1924 by David Wondrich. (It’s the subject of an episode of the always-terrific Let It Roll podcast, which you can listen to here.) One of its topics is the birth and growth of the minstrel show, a popular form of entertainment from the 1840s well into the 20th century. White performers put on blackface and told jokes, sang, played instruments, and danced in appallingly racist caricatures of Black people and others. Yet despite their racist content, minstrel shows were a significant ingredient in the stew that eventually became American popular music as we would know it in our time.

As it happens, I have seen a minstrel show. More than once.

In the 1960s and 1970s, my Wisconsin hometown, population about 8,700, was 99 and 44/100 percent white, heavily Swiss and German. And every year, the local Lions Club put on what it called a “modern” minstrel show, which featured our city band and an all-local cast. It was a very popular ticket, often selling out the local school auditorium on a Friday night, Saturday night, and Sunday afternoon.

The show opened, as in days of old, with a musical number, then a group of performers took their places on the stage. In the middle was the master of ceremonies, known as the interlocutor. He was flanked by six “end men,” who told jokes and bantered with the interlocutor, and with each other. This was a variation on the traditional 19th and early 20th century shows, where there were but two end men, frequently named Mr. Bones and Mr. Tambo (as pictured above). The end men and interlocutor were played by prominent local businessmen and professional men of the kind likely to be members of the Lions Club. They appeared in the same roles year after year, as did many of the featured singers and dancers. I can’t remember if any women were part of the cast.

After a round of jokes, banter, and pratfalls by the end men, there followed the olio, in which the end men and interlocutor left the stage, and various singers and dancers performed straight. The olio was followed by the afterpiece. In traditional minstrel shows, this was often a skit set on a plantation, or a cakewalk. What it was in my town’s show I don’t remember, although the end men and interlocutor eventually came back for more comedy before a big musical finale involving the whole cast.

When the minstrel show in my town began in 1953, the performers wore blackface. By the time I was attending, in the late 60s and early 70s, the performers wore whiteface, as well as all-white costumes meant to be similar to the shabby costumes blackface minstrels sometimes wore—castoff clothing and junkyard chic, another caricature of Black people. (If this seemed odd to me, I don’t remember it; it was just the way they dressed at the minstrel show.)

I dug into the archives of a hometown Facebook group to read a discussion from a few years back about the minstrel show. A number of people who commented had fathers, uncles, or grandfathers who had been cast members. Many people pointed out that the shows were “politically incorrect” without getting much into specifics. I doubt that the modern shows were as crudely racist as the traditional shows, but traditional minstrel shows were equal-opportunity offenders: they parodied not only Black people but other non-white people and immigrants. (Facebook commenters remembered that our town’s minstrel show frequently made fun of hippies; a classmate of mine remembers appearing as a “Swiss Indian.”) I scrolled expecting somebody to go off on an anti-PC tirade, but nobody did. One commenter even noted that she has a large collection of photos from the minstrel shows but was reluctant to share them because of their potential to offend. (That nobody called her a woke liberal snowflake is a minor miracle.)

Time passed, and the Lions Club eventually decided to stop doing the minstrel show. No one in the Facebook group gave a reason; surely if it had been due to some outcry over the content of the show, somebody would have remembered that. It seems more likely that the show simply petered out because of a lack of interest among the public, or the performers. The last show was presented in 1983—astoundingly late in history for such a problematical form of entertainment to continue, even in a “modern,” sanitized, whiteface form. But in a small, lily-white town, perhaps not all that surprising.

Tune In, Turn On, Make Lunch


This has nothing to do with music, really, but just go with it.

It all started sometime in 1969 with scattered stories in local newspapers, but it didn’t become national news until late that summer, after the U.S. Chamber of Commerce held a series of workshops on urban issues. In a report on the link between organized crime and narcotics, the following sentence appeared, without context or elaboration: “In Chattanooga, it was learned that due to the high cost of narcotics, young people are using as mainline injections Kool Aid and peanut butter mixed with mayonnaise.”

In October 1969, the Senate Appropriations Committee held hearings on the federal budget. Dr. Stanley Yolles, director of the National Institute of Mental Health, was being quizzed about NIMH’s anti-drug programs when Hawaii Republican Hiram Fong asked, pretty much out of the blue: “When you find out that a person gets a big kick out of injecting peanut butter in his veins, what do you do?” Yolles responded, “I think the only kick they get out of peanut butter is the final kick. It is a very dangerous practice to say the least; it causes death if injected in any large quantity.”

Fong then asked what NIMH does when that happens. Yolles said (and we can imagine his patient tone, but also perhaps the internal eye-rolling) that the agency doesn’t involve itself in individual cases, but instead tries to educate people about the dangers of doing such things through “straight factual information, because we have had experience over the years with misinformation deliberately set out to scare people about using various substances and this has not worked. . . .”

The feds may have wanted to counter misinformation about substance abuse, but at least some of it was coming from inside the house.

That same month, the American Academy of Pediatrics met in Chicago. Guest speakers included Ernest A. Carabillo, Jr., described by the Associated Press as “a lawyer/pharmacist in the Federal Bureau of Narcotics,” and Frank Gulich, “a narcotics bureau official stationed in Chicago.” They spoke to reporters, and an AP story appeared in papers around the country in which Carabillo told about “an underground recipe book purporting to outline ‘culinary escapes from reality.'” Gulich said that the books “usually sell for about $1 and often give the formulas for preparing drugs such as LSD.”

But the AP story did not focus on how kids were using their chemistry sets to become little neighborhood Owsleys. Instead, it focused on the use of peanut butter, mayonnaise, and other substitutes for narcotics. Carabillo said that users “confused the bizarre and toxic reactions with the so-called ‘high’ provided by heroin or marijuana. He cited the smoking of dried banana skins, a fad of a couple of years ago, as an example.” He said that kids were also using cleaning fluid, paregoric [an anti-diarrheal derived from morphine], ethyl chloride [a cooling spray used for pain relief], and freon from aerosol cans to get high. It seems obvious to us that any of the latter would get you off better than peanut butter or mayonnaise, but Mr. and Mrs. Middle America focused on the sandwich spreads.

In December 1969, President Nixon hosted a conference of state governors to address the drug problem. Nixon would try to have it both ways in the drug fight, warning that drug abuse was a threat to American civilization and firing Yolles in 1970 for criticizing stiff marijuana sentences, but also arguing for education over incarceration, at least for some users. (A cynic might suspect it was because middle-class white kids, the sons and daughters of the Silent Majority, were getting thrown in jail.) The conference also gave the peanut-butter-and-mayonnaise phenomenon a publicity boost when TV personality Art Linkletter, whose daughter had committed suicide in October, a death he blamed on LSD, addressed the governors. He repeated the canard and added a new one: that young people were smoking crushed aspirin to get high.

It may not surprise you to learn that there were no solid sources for any of this. The titles of those underground cookbooks went unrecorded. Linkletter’s gravitas on the issue exempted his claims from close scrutiny. News stories about the practice were entirely hearsay. No one really knew if the peanut butter-and-mayonnaise thing was something kids legitimately believed would get them high, or if it was merely a few stoners pranking The Man.

Whatever the case, by the end of 1970, nobody was talking about it anymore.

Sold to the Lowest Bidder


(Pictured: Olivia Rodrigo in 2019.)

As it happens, I know a little about the teaching of writing. Years ago, I had a job that required me to read 30,000 short essays written by kids in grades three through eight. Some of the older kids clearly aspired to be writerly. Some of them demonstrated the glimmerings of a gift, but the vast majority did not. It wasn’t just that they didn’t know the craft (because at age 12 or 13, how could they?); they also lacked the vocabulary. For example, when they wanted to describe something, instead of using the five senses in simile or metaphor, they’d say things like, “It was beautiful. It was so, so beautiful.”

Because I am not hip, I didn’t hear Olivia Rodrigo’s song “Driver’s License” until earlier this week, even though it spent eight straight weeks at #1 on the Hot 100 from January to March. It’s a meandering mix of textures and tempos, but I fixated on the words. A sample:

And all my friends are tired
Of hearing how much I miss you
But I kinda feel sorry for them
‘Cause they’ll never know you the way that I do
Today I drove through the suburbs
And pictured I was driving home to you

Exposition can’t be avoided, but writing teachers tell students, especially young ones, to show and not tell. Rodrigo, who wrote the lyric, spends most of her time telling. And when she wants to crank up the emotional intensity, she does this:

Red lights, stop signs
I still see your face in the white cars, front yards
Can’t drive past the places we used to go to
‘Cause I still fuckin’ love you, babe

A writing teacher would circle “red lights,” “stop signs,” and “white cars” and suggest the student find stronger words. As for the obscene adverbial intensifier, it’s as inept and immature as “so, so beautiful.” So like many an eighth-grade essay, “Driver’s License” ends up the opposite of what the writer intends: not a vivid description of a deeply felt experience, but the emotional equivalent of a grocery list.

Olivia Rodrigo is a Disney Channel star who turned 18 in February, so she isn’t that far removed from eighth grade. And at least she’s taken to heart the idea that you should write what you know. My purpose here is not to fault her. The fault with “Driver’s License,” such as it is, lies with the people who made something that’s not especially artful into the most popular song in America for two solid months.

In 2003, Guardian columnist Stuart Jeffries wrote: “The real problem with our culture is not a dearth of ingenuity but a willingness to lend that ingenuity to devising things that should be beneath contempt.” Nearly two decades later, the beat goes on. Earlier this week, essayist John Ganz wrote about the blandness of practically every cultural rage right now. We hand over our attention and money in exchange for very little. We are happy to sell ourselves not to the highest bidder, but to the lowest one.

I’ve said before that one purpose of art is to take you out of your moment, to allow you to experience something in a way you can’t do for yourself in that moment. But anybody who ever had a busted high-school relationship has felt what Olivia Rodrigo feels. She doesn’t say anything new about it, or describe a new way of seeing or feeling it, yet millions of listeners (many long past high-school age) were eager to listen to her tell about it, again and again.

It doesn’t take much to buy our devotion. Consider Ed Sheeran: can you recite one interesting or perceptive lyric or whistle a single memorable melody he’s written? Perhaps you aren’t intended to. It’s not an accident that we talk about “music consumption” now. The job of most popular music (words chosen deliberately, as opposed to “the purpose of most popular art”) is to be there when people are hungry for it, like a bag of chips. If chips are what people want, why spend time and effort cooking a steak?

Artists used to aspire to extend themselves. Paul McCartney, for example, has released a half-dozen albums of classical music. But what’s the likelihood that Ed Sheeran will do anything on his next album that he didn’t do on his first four? What are the chances that Olivia Rodrigo’s next single will be a vivid lyrical ride that reminds people of the early Dylan?

Does it even have to be?

Did I miss something important, or am I just completely wrong? Your comments are not just welcome, but necessary.

On the Subject of Pretty Songs


(Pictured: the Stylistics, lookin’ good and singin’ pretty.)

Last week, when I wrote about the 1972 hits “Betcha By Golly Wow,” “Rock and Roll Lullaby,” “Precious and Few,” and “Everything I Own,” the first word I thought of to describe them was “pretty.” But “pretty” is a loaded word. “It can be used to damn with faint praise,” I wrote, “to suggest that something is decent if you like that kind of thing, but not worth serious attention.” The Merriam-Webster Dictionary seems to concur. True, “pretty” is defined as “pleasing by delicacy or grace” and “having conventionally accepted elements of beauty,” but the next definition is “appearing or sounding pleasant or nice but lacking strength, force, manliness, purpose, or intensity.” In the section on usage, M-W says “‘Pretty’ often applies to superficial or insubstantial attractiveness.” Even the etymology of the word gets into the act: “pretty” is derived from Old English and Old Norse words for “trick.” In other words, a sort of beauty that may deceive, or be other than it appears.

Regarding the usage of “pretty” and words related to it, there’s an argument that the dictionary sense of “beautiful” could apply to “Rock and Roll Lullaby”—“whatever excites the keenest of pleasure to the senses and stirs emotion through the senses”—because I am still moved by the emotional power of that record. “Everything I Own” moves me along the same scale, but not quite as strongly, or as far.

But what about the other two? Are they “lovely,” maybe? M-W says that “lovely” is “close to ‘beautiful’ but applies to a narrower range of emotional excitation in suggesting the graceful, delicate, or exquisite.” So “lovely” fits “Betcha By Golly Wow,” because the default for any Thom Bell-produced love song is probably “exquisite.” “Precious and Few” doesn’t seem to rise to the “lovely” standard, although it is clearly pleasant, nice, and attractive. That one’s pretty.

Let’s talk about the weirdest, and the most problematic, part of the definition of “pretty”: lacking “manliness.” (The word seems alarmingly retro, and I’m surprised M-W hasn’t modified the definition.) When Casey Kasem played “Everything I Own” on American Top 40, he introduced it by saying, “this song in particular appeals to the girls.” Although he did not use the word “pretty,” it has been used by radio types to describe the kind of record that has mainly female appeal. Female appeal has always been important to radio: apart from certain rock and talk formats, most stations make music programming decisions with the intent of attracting women. Even the “classic hits” format, as distinct from male-leaning “classic rock,” is basically an attempt to jigger the music library to attract more female listeners.

But “female appeal” can also be a value judgment: “chicks will like this, but serious people [i.e., male listeners] will like it less.” And that sends us in another direction. In the history of modern pop music, girls and women were the original tastemakers. They swooned over Sinatra, hyperventilated over Elvis, and screamed for the Beatles, and they can take a great deal of the credit for making those acts into superstars. But at some point in the middle of the 1960s, around the time of Rubber Soul and Revolver, when “rock ‘n’ roll” turned to “rock” and first became a fit subject for serious cultural criticism, that changed. Now it was the opinions of men that determined the relative worth of the art. That’s not to say there weren’t female critics or that women stopped buying records. But the economic clout of girls and women buying records, as a judgment of the records’ value, started to matter less than what critics, mostly male, thought of the records.

If we’re going to start interrogating our unconscious biases—and we should—surely we should spend some time on the one that automatically assumes male opinions matter more.

Because it’s OK to be pretty. As I wrote in my earlier essay, “classic AM Top 40 radio was, to a great degree, built on pretty songs, pleasing melodies earnestly performed, for people to sing along with and/or fall in love to.” You can do worse than to listen to a pretty record, or to make one. It’s an aesthetic that’s fallen badly out of fashion in the 21st century, but that’s yet another direction, and one we’re not traveling today.

Here’s Lookin’ at You, Kids


(Pictured: Lauren Bacall and Humphrey Bogart in a colorized still from the 1946 film The Big Sleep.)

Welcome to another edition of Short Attention Span Theater, with bits that never made it into a full post. Last summer I started writing about an American Top 40 show that played the Drifters’ “On Broadway” as an extra, and I went off on a tangent about shared popular culture that I ended up cutting. Here’s a bit of it: 

Nowadays we actively hunt for something to watch on TV, and in a universe with so many channels, we almost always find something. In the three-channel days, we watched whatever was on. Your show got over and you stuck around for what was next because there wasn’t much else (and if you wanted to change the channel, you’d have to walk across the room to do it). Each of us who grew up in that time can remember how, late at night or on a weekend afternoon, we’d find ourselves engrossed in some old movie. And as the years went by, we all saw Casablanca and Double Indemnity and Rebel Without a Cause and The Maltese Falcon and Singin’ in the Rain, film noir and screwball comedy, Bogart and Bacall, the Hope and Crosby Road pictures, the Universal movie monsters—we engaged with one of the 20th century’s richest pop-culture texts, the films of classic Hollywood. Nobody gets that education passively nowadays—you gotta go and look for it, if you can find it, and most people won’t. Something like 80 percent of the movies on Netflix have been made since 2010, and I’ve actually heard people under the age of 40 say they simply cannot watch black-and-white.

But does a person need to be conversant with old-school Hollywood today? Probably not. If you want to appreciate modern Hollywood, you’re better off boning up on the DC and Marvel Comics universes, which have swallowed the movie industry whole. 

There’s another fragment on the flip. 
