Past and Future


(Pictured: this llama is my energy today—talking, but not willing to exert himself beyond that.)

It’s time for another edition of Short Attention Span Theater, in which I sift the seeds and stems in my Drafts file in hopes of finding enough to roll one post.

As of this summer, I am newly out of the gig economy, working for The Man every day for the first time in 18 years, and he’s taking up a great deal of the inspiration/research/noodling time I used to be able to devote to this website. And so I strongly feel this old paragraph: 

From the time my brothers and I were seven or eight, we had small farm chores to do: helping feed the cows and getting the equipment ready before the evening milking, or gathering eggs, back when we still had chickens. The year I turned 12, I was expected to drive a tractor on the farm, helping with the crops. The small farm chores took maybe 15 minutes at the outside. Work on the tractor—driving the cultivator to uproot weeds from the corn, or the rake to get hay ready for baling—took several hours at a stretch. Dad was good enough to pay by the hour, which was something not all of my friends’ dads did, but being forced to give up a morning or an afternoon because the job needed to be done now was different.

A job you do by choice (“Sorry, I can’t fill in on the afternoon show today, I have plans”) vibes differently than one that has the first claim on your time every day. That’s not a complaint, just an observation.

On thinking of the future and then finding oneself there, written well before events of the last 18 months: 

As a kid, I could project myself forward in time if I chose. In second grade, when I read that Halley’s Comet would return in 1986, I presumed I would be around to see it. Not long after that, I figured out that I would likely be alive in the year 2000 (and that I would be 40 years old, which was an abstract concept entirely meaningless to eight-year-old me). Since 2000, however, I’ve had a harder time with this kind of projection. It’s partly because the older you get, the shorter your future is. But it’s also because the 21st century feels like a foreign country to me, and I’m not so comfortable living in it.

This happens all the time, of course. Every generation watches the passage of time erode what it thought was settled. Every generation tries to modify its outlook to reflect the changes it knows are normal and natural. And every generation eventually throws up its hands in exasperation and yells “you damn kids stay off my lawn,” and for the same reasons: what we used to value seems to be valued less; what we understood back then is less comprehensible now; the rules that used to apply no longer do.

As of 2021, this century seems even more foreign to me. There’s no way to make these times feel normal by any standard I know. As bad as the last 18 months have been, there’s a real sense this summer that we are heading for something exponentially worse in America, and I fear it. 

Here’s the middle and end of a story:

You get over stuff like that, of course, and I did. I saw her once a few years later, but I didn’t try to speak to her, because I doubted she would remember me, and why should she, really.

More time goes by. It’s decades now. In the Facebook era, she reappears through a friend-of-a-friend connection. I look at her profile. Lives out west, married and divorced, grown children. I am not tempted to friend her.

Still more time. Then one night, there we are, in the same room. We stop. We look each other up and down, a surreptitious glance at name tags. “Hi.” “Hi.” “How are you?” “Fine.” “It’s good to see you.” And that’s pretty much it.

Not long after, I write a thing, in which I talk about the bonds that people have, bonds that surpass time. In it, I mention that among those we are bonded to are “people we need to apologize to, and people who need to apologize to us.”

And I get a message from her. She has read it, she says, and it tugged at her heartstrings. Then she writes, “I think I need to apologize to you.”

Well.

If We Remembered Everything


(Have I used this picture before? I think I’ve used this picture before. With a better work ethic, I’d look and make sure.)

Here’s another one of those posts that has both nothing and everything to do with the ostensible subject of this website. It’s been sitting in my Drafts file in pieces for two or three years, and it still seems a little undercooked, but sometimes you gotta hit “publish” and move on.  


Old School Mono


I’d like to call your attention to a comment from reader Douglas, as part of the weekend’s discussion of the Stones’ “Tumbling Dice”: “In this day of Super Dolby-Digital Plus-Remastered from the Original Remaster reissues, is it possible that some bands just f-ing sound best in Old School Mono (TM)?”

F yeah.

Years ago I was doing some research on YouTube and came across a homemade stereo mix of the Rolling Stones’ “Satisfaction,” which was famously heard only in mono for the first 30-plus years of its existence. The YouTuber who posted it included the following note: “Demand that Music Companies issue British Invasion, etc in Full STEREO & NOT in monaural . . . Don’t buy mono versions, etc !!!”

This person was guilty of a fairly common prejudice: that mono is stereo’s unsophisticated cousin, and that stereo is a “true” reproduction of music where mono is not. But we think that’s true only because stereo is what we know best. In the early days of recording, there was a debate between people who thought the goal of recording should be exact reproduction of what a listener would hear sitting in the concert hall and those who believed recording could and should enhance the listening experience. The debate was going on long before stereo came on the scene in the late 1950s. Our modern-day preference for stereo basically means that the enhancers won the debate.

A few years ago, I wrote this:

As we were reminded when the Beatles’ catalog was re-released in mono, it was the mono mixes that were slaved over in the studio. The stereo mixes were secondary. (Listen to early Beatles music in stereo—how often do you hear vocals on one side and instruments on the other? That’s the quick and easy way to create a stereo effect.) And if George Martin and the boys had considered mono inferior to stereo, it’s doubtful that the Beatles would have continued to release albums in mono almost to the end of their time together. Sgt. Pepper was intended to show what could be accomplished in the studio. Why would it have been released in mono if mono was merely an inferior copy of a better stereo original?

Mono mixing is an art, and mono mixes can be works of art, as we have chronicled here again and again over the years. And when you go to a live concert, the sound you get isn’t widely separated stereo sound—it’s something much closer to mono.

Mono isn’t inferior, it’s just different.

Stereo recording has been a thing for 60 years now, and I get the sense that it’s become so “normal” today that a lot of producers don’t think about it, the way fish don’t know they’re wet. A lot of today’s mainstream country is mastered to be intensely loud with practically no dynamics. Separation doesn’t matter much in that firehose of sound. (Connoisseurs understand that mono doesn’t have to be loud; neither does it need to destroy dynamics.) In pop music, the loudness wars seem to have eased in recent years, which leaves more room for stereo to expand the soundscape, but there’s not much creative use of left and right. Maybe stuff flying around the soundscape is disorienting for earbud listeners, I don’t know. One thing I do know: stereo can certainly increase the effect of echo. Every other young pop singer is emoting from inside an empty water tank now, which is sometimes a hard listen for a geezer such as I, weaned on the dry, flat style of production and recording that dominated the 1970s.

The way we listen has always affected the way we make musical art, going back to the early 20th century debates about what recording should do. Think of how the development of the 45 RPM record and the portable radio made kids into tastemakers; how the console stereo of the 1950s opened up a market for lush instrumental music; how sophisticated stereo gear and certain popular styles of the 70s sustained a symbiotic relationship; how the Walkman contributed to the DIY musical culture of the 80s; and how the modern marketplace has been affected by earbuds and streaming. There’s never been a time when we could cleanly separate what we were listening to from the things we were using to listen to it. But just as stereo wasn’t intrinsically better than mono, each succeeding innovation isn’t necessarily an improvement on what came before.

So yeah, some bands (and many of their songs) just f-ing sound best in Old School Mono.

A Beginning and Ending of Things


This post is a companion to the previous two. Afterward, this website is going on a brief early-summer hiatus. A previously scheduled post will appear this Saturday, but otherwise I got nothin’ until late next week. Go play outside. 

I have a longtime friend who is a lot more practical and a lot less sentimental than I am. Not long ago he said to me about something I wrote (paraphrasing), “I sometimes wonder why you still think about this stuff, and why you don’t just let it go.” He’s not wrong to wonder. I’ve asked those questions myself. Why do I still think about this stuff? Why don’t I just let it go? Isn’t it a little silly for a guy my age to spend so much time remembering stuff that happened when he was 16 or 18 or 22?

But then there’s this, from Nick Hornby’s High Fidelity:

I had kind of hoped that my adulthood would be long and meaty and instructive, but it all took place in those two years; sometimes it seems as though everything and everyone that have happened to me since were just minor distractions. Some people never got over the sixties, or the war, or the night their band opened for the Rolling Stones at the Marquee, and spend the rest of their days walking backwards; I never really got over Charlie. That was when the important stuff, the stuff that defines me, went on.

He’s talking about a broken love affair, but it doesn’t have to be only that. Doesn’t everybody have a period “when the important stuff, the stuff that defines me, went on”? Perhaps not. Maybe you have spent every day of your life moving forward in a perpetual process of growth and change toward some sort of idealized perfect self. I can see the results of a similar process—the person I am now is more accomplished, wiser, better than most of my younger iterations—but I also recognize that to the extent that change took place, it was always shaped by “the stuff that defines me,” a beginning of things, a long time ago.

In the spring of 1978, writing about the ending of things in a journal long since lost, I hit upon the metaphor of a door, which I elaborated on a few years ago:

Change often takes us unawares. Disaster comes with little or no warning. We get fired. Loved ones die. Very rarely in life does a major change loom fixed within our sight, like a door in the distance, one we knowingly walk up to and through, entering into whatever lies beyond.… the one between carelessness and responsibility, between young and not-quite-so-young … between today and tomorrow. 

The stakes on the other side of the door seemed pretty high. Go to college, work hard, get your degree, get a job, work hard, climb the ladder, find someone, make a life for them and you and your children like the one your parents made for themselves and you, and don’t fk it up. I was willing to take it on—given who I was and the kind of person my parents had raised me to be, there was no other path—but in retrospect, it seems like a lot.

For some people, the weight of trying to make a life never goes away. It can be a struggle in terms of the concrete stuff—find a career/prosper in it, find a partner/stay together. But it can also be a metaphysical one: why am I doing this? Should I want to do this, or is there something else I should be doing? How does one navigate this life of randomly dealt fortune and tragedy without falling into denial or surrendering to despair?  

What I learned back when “the stuff that defines me” was going on is this: dealing with the concrete stuff—the what—came easier to me than understanding the metaphysical stuff—the why. And so the latter will always be of greater interest and concern to me.

I saw the door. I knew what was behind it. I knew my friends and I had to walk through it. But why we had to walk through it, why the stuff behind the door is like it is, and what is the best way to make peace with it and find some sort of meaning in it—43 years later, I’m still thinking about that, because I don’t know any other way to be. The half-assed armchair philosopher I am today was born out of the half-assed armchair philosophizing I did in the spring of 1978, at a beginning and ending of things.

Modern Minstrels


(Pictured: blackface minstrels onstage in 1925.)

I recently read Stomp and Swerve: American Music Gets Hot, 1843–1924 by David Wondrich. (It’s the subject of an episode of the always-terrific Let It Roll podcast, which you can listen to here.) One of its topics is the birth and growth of the minstrel show, a popular form of entertainment from the 1840s well into the 20th century. White performers put on blackface and told jokes, sang, played instruments, and danced in appallingly racist caricatures of Black people and others. Despite their racist content, minstrel shows were a significant ingredient in the stew that eventually became American popular music as we would know it in our time.

As it happens, I have seen a minstrel show. More than once.

In the 1960s and 1970s, my Wisconsin hometown, population about 8,700, was 99 and 44/100 percent white, heavily Swiss and German. And every year, the local Lions Club put on what it called a “modern” minstrel show, which featured our city band and an all-local cast. It was a very popular ticket, often selling out the local school auditorium on a Friday night, Saturday night, and Sunday afternoon.

The show opened, as in days of old, with a musical number, then a group of performers took their places on the stage. In the middle was the master of ceremonies, known as the interlocutor. He was flanked by six “end men,” who told jokes and bantered with the interlocutor, and with each other. This was a variation on the traditional 19th and early 20th century shows, where there were but two end men, frequently named Mr. Bones and Mr. Tambo (as pictured above). The end men and interlocutor were played by prominent local businessmen and professional men of the kind likely to be members of the Lions Club. They appeared in the same roles year after year, as did many of the featured singers and dancers. I can’t remember if any women were part of the cast.

After a round of jokes, banter, and pratfalls by the end men, there followed the olio, in which the end men and interlocutor left the stage, and various singers and dancers performed straight. The olio was followed by the afterpiece. In traditional minstrel shows, this was often a skit set on a plantation, or a cakewalk. What it was in my town’s show I don’t remember, although the end men and interlocutor eventually came back for more comedy before a big musical finale involving the whole cast.

When the minstrel show in my town began in 1953, the performers wore blackface. By the time I was attending, in the late 60s and early 70s, the performers wore whiteface, as well as all-white costumes meant to be similar to the shabby costumes blackface minstrels sometimes wore—castoff clothing and junkyard chic, another caricature of Black people. (If this seemed odd to me, I don’t remember it; it was just the way they dressed at the minstrel show.)

I dug into the archives of a hometown Facebook group to read a discussion from a few years back about the minstrel show. A number of people who commented had fathers, uncles, or grandfathers who had been cast members. Many people pointed out that the shows were “politically incorrect” without getting much into specifics. I doubt that the modern shows were as crudely racist as the traditional shows, but traditional minstrel shows were equal-opportunity offenders: they parodied not only Black people but other non-white people and immigrants. (Facebook commenters remembered that our town’s minstrel show frequently made fun of hippies; a classmate of mine remembers appearing as a “Swiss Indian.”) I scrolled expecting somebody to go off on an anti-PC tirade, but nobody did. One commenter even noted that she has a large collection of photos from the minstrel shows but was reluctant to share them because of their potential to offend. (That nobody called her a woke liberal snowflake is a minor miracle.)

Time passed, and the Lions Club eventually decided to stop doing the minstrel show. No one in the Facebook group gave a reason; surely if it had been due to some outcry over the content of the show, somebody would have remembered that. It seems more likely that the show simply petered out because of a lack of interest among the public, or the performers. The last show was presented in 1983—astoundingly late in history for such a problematical form of entertainment to continue, even in a “modern,” sanitized, whiteface form. But in a small, lily-white town, perhaps not all that surprising.

Tune In, Turn On, Make Lunch


This has nothing to do with music, really, but just go with it.

It all started sometime in 1969 with scattered stories in local newspapers, but it didn’t become national news until late that summer, after the U.S. Chamber of Commerce held a series of workshops on urban issues. In a report on the link between organized crime and narcotics, the following sentence appeared, without context or elaboration: “In Chattanooga, it was learned that due to the high cost of narcotics, young people are using as mainline injections Kool Aid and peanut butter mixed with mayonnaise.”

In October 1969, the Senate Appropriations Committee held hearings on the federal budget. Dr. Stanley Yolles, director of the National Institute of Mental Health, was being quizzed about NIMH’s anti-drug programs when Hawaii Republican Hiram Fong asked, pretty much out of the blue: “When you find out that a person gets a big kick out of injecting peanut butter in his veins, what do you do?” Yolles responded, “I think the only kick they get out of peanut butter is the final kick. It is a very dangerous practice to say the least; it causes death if injected in any large quantity.”

Fong then asked what NIMH does when that happens. Yolles said (and we can imagine his patient tone, but also perhaps the internal eye-rolling) that the agency doesn’t involve itself in individual cases, but instead tries to educate people about the dangers of doing such things through “straight factual information, because we have had experience over the years with misinformation deliberately set out to scare people about using various substances and this has not worked. . . .”

The feds may have wanted to counter misinformation about substance abuse, but at least some of it was coming from inside the house.

That same month, the American Academy of Pediatrics met in Chicago. Guest speakers included Ernest A. Carabillo, Jr., described by the Associated Press as “a lawyer/pharmacist in the Federal Bureau of Narcotics,” and Frank Gulich, “a narcotics bureau official stationed in Chicago.” They spoke to reporters, and an AP story appeared in papers around the country in which Carabillo told about “an underground recipe book purporting to outline ‘culinary escapes from reality.’” Gulich said that the books “usually sell for about $1 and often give the formulas for preparing drugs such as LSD.”

But the AP story did not focus on how kids were using their chemistry sets to become little neighborhood Owsleys. Instead, it focused on the use of peanut butter, mayonnaise, and other substitutes for narcotics. Carabillo said that users “confused the bizarre and toxic reactions with the so-called ‘high’ provided by heroin or marijuana. He cited the smoking of dried banana skins, a fad of a couple of years ago, as an example.” He said that kids were also using cleaning fluid, paregoric [an anti-diarrheal derived from morphine], ethyl chloride [a cooling spray used for pain relief], and freon from aerosol cans to get high. It seems obvious to us that any of the latter would get you off better than peanut butter or mayonnaise, but Mr. and Mrs. Middle America focused on the sandwich spreads.

In December 1969, President Nixon hosted a conference of state governors to address the drug problem. Nixon would try to have it both ways in the drug fight, warning that drug abuse was a threat to American civilization and firing Yolles in 1970 for criticizing stiff marijuana sentences, but also arguing for education over incarceration, at least for some users. (A cynic might suspect it was because middle-class white kids, the sons and daughters of the Silent Majority, were getting thrown in jail.) The conference also gave the peanut-butter-and-mayonnaise phenomenon a publicity boost when TV personality Art Linkletter, whose daughter had committed suicide in October while tripping on LSD, addressed the governors. He repeated the canard and added a new one: that young people were smoking crushed aspirin to get high.

It may not surprise you to learn that there were no solid sources for any of this. The titles of those underground cookbooks went unrecorded. Linkletter’s gravitas on the issue exempted his claims from close scrutiny. News stories about the practice were entirely hearsay. No one really knew if the peanut butter-and-mayonnaise thing was something kids legitimately believed would get them high, or if it was merely a few stoners pranking The Man.

Whatever the case, by the end of 1970, nobody was talking about it anymore.