Friday, December 30, 2011
This was different. The massive cerebral hemorrhage came as a thief, stealing in upon the shadow of a 33-year-old cancer and snuffing out in mere moments a life that had ridden perilously and courageously through that shadow, brakes off and headlights dimmed, into a future of steadily diminished hopes. There was no chance to say good-bye. As we drove to the hospital, she was still apologizing to me for spoiling my birthday. The pain came later; her last words were “my head hurts so bad” and “my tongue is numb.” As the children and I were ushered from the room so they could insert a breathing tube, she spiraled down into the dark coma from which she would not awaken.
For the next three days, she lay in the ICU, and I held her and kissed her and spoke to her while I waited for her relatives and mine to arrive and join me in that anguished vigil. I told her that she had not been a bad mother to our children, as I knew she believed she had been, because she could not hear their recitals and take their telephone messages— because since her deafness they had instinctively and understandably come to me first when anything was amiss. I told her that her example of brave survivorhood in the face of the grimly unknown—the aftermath of a massive dose of radiation that had been intended to prolong her post-cancer survival to two years, if possible—was the best role model they would ever witness: that she had been the best mother I could have hoped for to my two nearly grown children. I told her that I loved her more than anyone could ever know. Finally, I told her that I was ready to let her go.
The organ donation people came in on Thursday evening, and the surgeon flew in at two the next morning. An hour later, while the children and I lay sleepless in our beds (we had all awakened at 2:30), he removed several of her organs for transplant and research. She was finally at rest. To my astonishment, she was only the 103rd organ donor in Central Texas this year.
As remarkable and generous as that gift was, though, it is not the gift about which I have chosen to write. That gift was given in stages, as we progressively lived through a series of medical reversals that changed both of our lives irrevocably. The worst was the onset of total deafness just hours before we had planned to drive to Texas and pick out a new house. It was the death of our previous ease of communication, of our ability to enjoy music together, and of so much more. We grieved it in tandem, bearing each other up in a partnership of shared sympathy that was largely unimaginable to outsiders: to those who didn't belong to our nation of two, bound together in mutual loss. We both understood that we had experienced a death, and that we could live through it only with severe discipline and infinite patience with each other. Somehow we grew, and discovered things in ourselves that we hadn't known were a part of us.
What I told Barbara on Thursday night was that because we had faced the death of a shared life together hand in hand, I was now ready to face it alone. I told her that this was the gift we had given each other, and that my time had come to redeem it.
Saturday, December 17, 2011
For me as a Beethoven scholar, this is a particularly weighty birthday. Beethoven died three months after his own 56th birthday; he never had another one. By this time in life, he had already composed all his music—the promising early works, the dazzling ones of the middle period that earned him his reputation, and the profound, mystifying (in the good sense) works of his late period, written after he lost his hearing: those works in which he has always seemed to me to be transcending the medium and communicating timeless truths.
I used to look at the late works as the product of an old age marked by profound wisdom. I am now that old myself, and I don't have that wisdom, and it astonishes me to think that I have lived through as many years as Beethoven did. I know it's a cliché, but I don't feel my age—until, that is, I look at the music Beethoven wrote, and had been writing for over a decade by the time he was 56 years old. Then I realize that a creatively miraculous life was indeed compressed into what I now have to regard as a very short human time span. Beethoven's oeuvre—one of the most significant achievements in human history by any standards—was finished when he was 56. I march on toward retirement, which I expect to be at least 14 years in the future, and I have no such sense of closure, and I'm glad I don't, because I don't want the show to stop.
There have been numerous attempts to explain what makes Beethoven's music the cultural landmark that it has become. As a teenager, I read Leonard Bernstein's "imaginary conversation" titled Why Beethoven?, in which a young Bernstein asserts that Beethoven may not have been outstandingly good at any of the individual building blocks of music, but he had the inexplicable gift of always knowing which note had to come next. Beethoven, he seems to want to say, wasn't a particularly good composer, except for the fact that he was a great one.
More recently, a colleague told me of his belief that Beethoven was no better than Haydn, except for the fact that he was more neurotic, and thus appeals to our neurotic age. I have to agree—and I said this on Facebook recently—that in virtually every respect that can be measured or quantified, Haydn was at least as good as Beethoven, and often better. I'm not sure, though, that neurosis is what has drawn so many listeners toward Beethoven's star rather than Haydn's.
It is telling that Beethoven has come in for his share of revisionism lately. One scholar asserts that his achievement has led later listeners to have unrealistic expectations about what music can do. Another has claimed that Beethoven's music embodies male libidinal impulses, and that a feminine "voice" is necessary to restore the balance. (Apologies to Scott Burnham and Susan McClary for simplifying their arguments.) At the very least, it is now widely understood that his music, which for so many years was considered "universal" and "immortal," is very much a product of the time and the culture in which it was written.
Yet I have always believed that it speaks uniquely to our culture as well, and that it embodies timeless truths that go far beyond the world of music and pertain to human spiritual development in the broadest sense. What did Beethoven, who died at 56, and Haydn, who lived to 77, have in common? A style, an approach, an epoch (broadly defined) within the unfolding drama of music and its history. What separates them? A revolution, a new century, and a dawning transformation of the way we understand human identity.
Haydn's music is wonderful, inventive, sophisticated, ingenious and playful by turns; it is everything that a music lover could possibly hope for or expect. Beethoven's music, meanwhile, speaks to the situation of a lone human individual caught in the middle of vast societal transformations and hobbled in the ability to respond. Unlike Haydn, Beethoven was forced to come to terms with those transformations, and unlike Mozart, who faced the same set of challenges decades earlier, he lived into what Carl Jung called "the second half of life": the years that play out after about age 35, when the ego has been formed and the individual must find novel, personal ways to grow or else begin to stagnate. Beethoven's hand was forced by encroaching deafness, and he went where the daemon led him. He went, as his contemporaries often recognized, into a place that was often terrifying and baffling, but that contained undreamt-of revelations as well. Like a seer, he brought those revelations up into the light and embodied them in music, and the world will never be the same.
When I was 32 years old, my life fell apart. Divorced and unemployed, I was forced down into that same primal cauldron myself. At that time, I was unimaginably grateful to know that Beethoven had been there before me, and had recorded the experience in his music. Now I am 56, and I will always know that Beethoven is more than just a living presence who has trod the same years along parallel paths. He will live on because he has shown me, as he has shown so many others, how to survive.
That's why I am always baffled when so many people—even the most musically sophisticated—hear in Beethoven's late music nothing but sublime expressions of suffering and despair. I'm not talking about the 9th symphony, with its titanic but transparent journey toward joy. I'm talking about works like the string quartet in A minor, Op. 132, which from the first time I heard it at age 15 has always been my very favorite piece of music. In it can be heard all the beautiful, hopeful humanity that has ever been breathed, compressed within a hard nut of suffering and left to germinate in the inner resonating fibers of everyone who will listen. In it is found the knowledge that life does not stop, and that biting into that nut is healthy even if your teeth crack and your gums bleed. As I now prepare to enter into the part of life that Beethoven never lived, I remain convinced that the knowledge he planted has only begun to come to fruition—within me, within humanity.
Thursday, December 8, 2011
The Great Either/Or is a fallacy in logical thinking that I have encountered my entire life, even among highly educated people. Richard Taruskin has documented its destructive effects on my own field of musicology in the introduction to his Oxford History of Western Music. Carl Dahlhaus, he points out, famously asked whether art history is the history of *art*, or the *history* of art. In a lighter vein, David Hackett Fischer spoofed this line of thinking by asking "Basil of Byzantium: Rat or Fink?" The point is that it's quite possible Basil was both a rat and a fink, and it's also quite possible that the history of *art* is actually the *history* of art, whatever that might signify. Both/And is usually the more realistic answer. Nevertheless, I have been accused more than once, after making that assertion, of wanting to repeal the Principle of Contradiction.
So here are some things I want to vouch for:
• It's quite possible to be a liberal and still believe strongly in individual responsibility.
• It's also possible to believe there is an important role for communal responsibility without being a collectivist, a socialist, or any other kind of "ist."
• Most of the reality of what politicians supposedly deal with is lived out in the gray area between extremes, and political change generally consists of shifting positions within that gray area in incremental ways.
Nevertheless, Tony Perkins, of the Family Research Council, was quoted this week as saying that Jesus was “a free marketer, not an Occupier. Jesus rejected collectivism and the mentality that has occupied America for the last few decades: that everyone gets a trophy—equal outcomes for inequitable performance. There are winners and yes, there are losers. And wins and losses are determined by the diligence and determination of the individual.”
First of all, I want to lay to rest the idea that in American schools these days, everybody gets a trophy. My son, for example, is a senior in high school. For the past four years, he has been participating in a series of auditions that will ultimately lead an elite group from all over Texas to the All-State Choir, which will perform next spring for the annual meeting of the Texas Music Educators Association in San Antonio. The competition is stiff and unrelenting. After four years of trying, Jeremy is one audition away from qualifying, and he must study a varied repertory of music from three centuries and in three or four different languages in order to even sing that audition. There's still no guarantee that the trophy will be his. In my experience as a parent, things like this are more the rule than the exception.
The broader point, though, is that Perkins makes himself irrelevant the moment he opens his mouth by implying that everybody is either an Occupier or a free marketer. The possibility that somebody—e.g., yours truly—could be both doesn't even seem to occur to him, so deeply has he bought into the Great Either/Or. If somebody doesn't agree that the diligence and determination of the individual are the Alpha and the Omega of moral values, then that person is presumably sitting in an Occupy encampment handing out trophies. There is no middle ground.
I don't mean to pick on Tony Perkins; he's too easy a target, and his comments wouldn't even be worthy of attention if they didn't serve my broader point. It must be particularly difficult, though, for a religious leader to have to rule out a priori the idea that there can be something more important than the actions of individuals. After all, as another well-known religious figure recently wrote, "It's not about you." At some level, Perkins must understand that. Nevertheless, he is so steeped in the Great Either/Or that when he looks beyond his own nose all he can see is collectivists.
So just remember, the next time somebody says that there are two kinds of people in the world, that there are indeed two kinds of people in the world: those who divide the world into two kinds of people and those who don't. In an ideal world, everybody would be in the second group.
Saturday, December 3, 2011
An Op-Ed in this week's New York Times asks what Mahatma Gandhi, whose example has been repeatedly invoked by Occupy Wall Street protesters, would make of the protests.
Gandhi, says Ian Desai, would have rejected the slogan "We are the 99%." Societies operate as wholes, and the income stratification that the OWS protests have highlighted exists because we all accept and enable it.
If Desai is right, the obvious question is "Why?" Given that the US is rated between Ghana and Senegal in terms of income inequality, and far below any other "developed" nation, can it be true that we not only allow this situation to exist but actually approve of it? When I say "we," I mean the 100% that Desai says Gandhi would have addressed. I, personally, don't approve of it, and I know many others who don't. Placing our desires in conflict with those of the collective, though, is in Gandhi's view the wrong way to go, even if we have 99% of the collective on board. It is the 100% that will have to create any meaningful change.
So what is it about the American people that makes us so resistant to such change? Malcolm Gladwell's Outliers, which I've been reading on the recommendation of a Facebook friend, provides some interesting hints. What Gladwell is interrogating in the book is, in the broadest sense, the peculiarly American doctrine of the "self-made man." It is part of our national mythology that we are an open and egalitarian society in which anybody can succeed, simply by dint of hard work, grit and determination. The outliers—the upper 1%—are the ones that have done so, and those of us who haven't shouldn't complain; we should get to work.
Gladwell documents some of the ways in which this mythology can blind us to what's really going on when people succeed beyond expectations. For example, I was born in 1955. That means that I could have been Bill Gates. Literally. If I had practiced programming computers during my adolescence with the kind of determination with which I practiced the piano, I could have been ready to get in on the ground floor of the personal computer revolution and make a fortune. There were only two problems: I wasn't very interested in computers, and I didn't have the access that Gates did growing up in a computer-obsessed area of Washington state, or that Steve Jobs did growing up in Silicon Valley. So I've ended up a moderately successful musicologist instead of a phenomenally successful computer entrepreneur.
I'm not complaining. I like what I do and I wouldn't trade places with most people who make far more than I can ever dream of making. I suspect, though, that many Americans who do complain about their lot in life still have a deep-down conviction that it's solely their own fault. In fact, there's something even more pernicious going on, and it's what has enabled the current Fox-News-driven blame-the-poor mentality. Many Americans believe that successful white baby boomers like myself grew up on a level playing field full of equal opportunity, but that the field has been eroded and pockmarked and all but destroyed by freeloaders demanding special treatment. That's why Newt Gingrich is able to denigrate the OWS movement in such blatantly demeaning terms and receive cheers from some of the very people who would most benefit if the current protests succeed.
I did not grow up on a level playing field, and I know it. I came from a family that gave me intellectual autonomy as a birthright, and treated the pursuit of knowledge as a worthy goal. I grew up in an upper-middle-class household that allowed me to waltz into an elite college and an Ivy League graduate school. I was able to spend countless hours playing the piano and listening to music because I didn't need to hold odd jobs to get by and supplement the costs of my education. I was blessed, and I regard the career that I was given as a vocation, not a birthright. I seek to give back to society some of the special insights and unique gifts that have been handed to me, because I know that I do not own those gifts.
I am just me, though. The 0.00000033%. I feel incredibly fortunate to be there, but I am also part of the 100%, and I wish we had a mythology that would enable us to act in the collective interest and not just that of 300 million individuals. That's not socialist; it's just reality. As long as we cheat ourselves of that understanding, we will, I'm afraid, continue to have the society we deserve.
Saturday, November 26, 2011
That's what a clerical friend reports hearing last week from a Fox-News-watching member of his congregation. It's hardly news to me, of course, that many Republicans have taken on that attitude. Certain well-known media personalities have been trumpeting it for years, so it's not surprising that their listeners/viewers have come to regard Democrats and/or liberals as an alien species hostile to core American values.
That widely held belief probably made the failure of the so-called "super-committee" earlier this week inevitable. Many people seem to think that if Congress had accepted the recommendations of the Simpson-Bowles deficit reduction commission, we could be on our way toward a grand bargain that will save our country from the precipice. In fact, it's clear that the Democrats on the super-committee were willing to accept an agreement that was well to the right of what Simpson-Bowles recommended, and the Republicans still rejected it. In their eyes, no compromise is possible; anything the Democrats propose must be defeated, as must the Democrats themselves.
The 112th Congress, in other words, has shipwrecked on its own hyper-partisanship. This much was clear after the debt-ceiling fiasco last summer, but what needs to be repeated as much as possible is that both parties were not complicit in the disaster. Only one party has insisted on driving us over this cliff.
Let me illustrate the significance of that point by returning to the analogy I raised last month in my post titled "The Carriage and the Brakes." Regardless of what Rush Limbaugh might think, all of us Americans are in this together. Both conservatives and liberals aim to maintain a democracy based on free-market economics; the differences are over how fast the process should be allowed to go and when, and how firmly, the brakes need to be applied. Since we're all invested in the ultimate success of the process, it is fair to say that in the broader historical sense, all Americans are really liberals. We don't see a serious political faction calling for the reinstatement of monarchy and the abandonment of representative government.
Since the success of our democracy matters to all of us, we should value and respect each other's opinions, not for idealistic reasons but for deeply practical ones. If the brakes are applied too hard and too often, no forward motion will occur. However, if the brakes are not applied, the carriage will be destroyed. The driver should value the brakeman for the same reason the brakeman values the driver; without both doing their jobs, neither of them is going to get anywhere in one piece.
What we are now witnessing, however, is a situation in which the driver wants to kick the brakeman permanently off the carriage and run it with no safety mechanism in place. There are enough people, unfortunately, whose grasp of reality is sufficiently tenuous that they find this vision appealing. In a Manichean world-view, you defeat your enemies rather than compromising with them. In a democracy, though, that doesn't work; all that will happen is that we'll end up in the ditch together. In fact, that's where we already are, and the sooner the us vs. them mentality is dropped, the better off we'll all be.
It's not the Democrats whose attitude will need to change to make that happen.
Monday, November 21, 2011
Before I elaborate, I need to mention that I recently encountered an article in the New York Times suggesting that, despite what Geoff Colvin thinks, there may be something to the idea of talent after all. Apparently, if you're not in that top .1% to begin with, it's very hard to fake it, no matter how hard you try.
Since my earlier post on this topic (See August: "Golden Oldies: Is Talent Overrated?") ended inconclusively, I see no need to revise my previous views. I'm still not really sure how big a role inborn talent plays compared to hard work or other less tangible advantages. (The latter, I take it, are the main subject of Malcolm Gladwell's Outliers; thanks, Debbie, for the recommendation.)
What I do want to discuss is exactly what it is that defines, not the top .1%, or even the top .01%, but the top .00001% or so. What makes the difference between a Mozart and a Shakespeare, who manage to write for the ages, and your average phenomenally gifted artist who just doesn't quite measure up? (I won't name any names, so as not to raise any unnecessary hackles.)
Certainly a part of the answer, as I suggested in a Facebook post yesterday, is that the truly great are not simply speaking for themselves. They manage, through the combination of their hard work and some mysterious additional factor, to give voice to things that touch upon our common humanity, in ways that are still apparent centuries later.
An essential read for anybody who is interested in this subject is Robertson Davies's novel What's Bred in the Bone, which I first encountered about 25 years ago and have reread several times since. Davies tells the life story of Francis Chegwidden Cornish, the son of a prominent Canadian banking family whose talents happen not to lie in the world of finance.
Through an elaborate metaphorical back-story (a frequent device with Davies), it is shown that Francis's life was ruled by a daemon: not your stereotypical evil spirit, but a hermaphroditic, amoral font of artistic inspiration who forces Francis inward by repeatedly denying him satisfaction in real life. Having become mixed up with a spy ring that is involved in smuggling art out of Nazi Germany, Francis creates two paintings that, if they had not been great works of art, would have been simply forgeries.
What makes them great is that Francis has managed to use figures and themes from his own life, heritage and confused religious upbringing to crystallize content from the deep unconscious. His paintings, which appear to come from a Renaissance master (even down to artificial craquelure created by baking them in ovens), are actually timeless.
David Maslanka, a fellow Oberlin grad whose Symphony No. 4 will be performed by the Baylor Wind Ensemble tonight, is unusual among modern composers in that he acknowledges deliberately trying to do the same thing. Here are a few ponderable quotes from his essay:
"After a lifetime of being myself I have come to the conclusion that the only tool I have for the perception of the 'click of rightness' is myself! On the conscious level that self is patently limited: the senses have limits, the talents have limits, the intelligence has limits. It is possible nonetheless for the conscious mind to reach 'inward' and find a universe of powerful images and feelings, and conversely to have things thrust on it from an unknown source—which means that there is the possibility for revelation."
"One of the most profound revelations that human consciousness has received is symbolic language in all its forms. ... The great tradition of musical language exists apart from the individual. I remember awakening gradually to this tradition, first as a young clarinet player, and then as a young theory student and beginning composer. I remember at age 18 the sudden realization how little I knew of this vast language, and what a complicated business composing really was. For more than 40 years I have been actively exploring that language, and understand that I will never encompass it all."
"The idea of the language using you is a profound one. It implies that the language wishes to speak, that there is a partnership between your conscious mind and the unspoken forces of the universe. My tonal musical language uses me. No matter what I do it won't be denied. In allowing myself to open to language, I open to the great common pool of human musical experience, and the language uses me. Out of tradition is invented the personal voice."
Anyone familiar with the controversies in modern music, and in post-modernism generally, will recognize these as fighting words. The idea that there are universals, revelations, and vast pools of common experience that can speak to all mankind, and always will, is, to say the least, out of fashion. When Maslanka acknowledges that the musical language that seeks to speak to him is tonal, and that one of its fonts is the Bach chorales, he is making claims for tonality and for Bach that many are still eager to dispute. Just last summer I tried to make a similar claim for Bach on my professional society's listserv, and was treated to a lecture from another contributor explaining that any advantage Bach appeared to enjoy stemmed solely from the privileging of the German tradition that he represents. The implication was that exactly the same thing could have happened to any number of other composers from other national traditions who are now considered also-rans.
I am going to say here what I did not say there: I don't believe it. The privileged positions accorded to tonality, and to Bach's particular take on tonal language, are due to the fact that both are capable of drawing vast amounts of material from the universal human unconscious. This is what any great art does, and it's what makes it great. Yes, an artist needs extraordinary gifts and needs to work extraordinarily hard. To be truly great, though, an artist needs to become a vehicle for the language of art, and there's no way to make that happen unless, like Francis Cornish, you are also the vehicle of expression for a daemon of your own.
Saturday, November 19, 2011
I team-teach an interdisciplinary humanities course, and every year we dedicate some class time to 20th-century atrocities, focusing on the Holocaust and the atomic bombing of Japan. That's the week we just finished. For the sixth year in a row, I led my students through a discussion of Primo Levi's Survival in Auschwitz. We chose Levi's book because almost everyone has read Elie Wiesel's Night by the time they finish high school. Levi was almost ten years older than Wiesel when he was deported to Auschwitz, and he looked at the experience with as much detachment as he could muster. He saw that the camp was a kind of experiment in bringing out the worst in human nature, and he understood that those who survived were the ones who pursued naked self-interest with a single-minded intensity that would brand them as criminals or lunatics in any normal society.
We also read and discuss multiple first-hand accounts of the bombing of Hiroshima, and examine historical writings that make the case both for and against its necessity. I mention having grown up in Oak Ridge, Tennessee, where many of the parents of people I knew went to work each day at the "gaseous diffusion plant": a facility that separated uranium isotopes by gaseous diffusion, turning out fissionable uranium in amounts large enough to produce nuclear weapons. I describe my lingering memories of the emotional nightmare caused by the Cuban Missile Crisis, and tell them frankly that when I was their age, I never expected to live as long as I have. With all those nuclear missiles my friends' parents were helping to make pointed at Russia, and a similar contingent of Russian bombs targeted at us, it was nearly inconceivable to me that someone, someday, would not decide to use them. When, along with our undergraduates, I read the accounts of victims of the Hiroshima bomb losing their skin and slowly succumbing to radiation poisoning, I am reminded that this is the way I long expected to die.
It is perhaps too much to expect 19- and 20-year olds to come to terms with the naked evil that the 20th century brought back into the foreground of human self-awareness. Having swallowed their obligatory Wiesel in mid-adolescence, many of them are understandably not eager to tread that ground again. The answers come a bit too easily. Freedom is God's gift to humanity. Even though Dostoyevsky's Grand Inquisitor (we read that one just before the California trip) thought people weren't up to the challenge, the gift remains. Of course some mistakes are going to be made.
Next week we celebrate Thanksgiving, and the Christmas carols are already jingling in the background. The season of forced cheerfulness is upon us. I have been following with some bemusement the concern expressed in the media that Christmas, not satisfied with having taken over December, has now nearly devoured November as well. Christmas lights were already up in San Francisco, and large red Nutcrackers suddenly appeared in the hotel lobby the day before we left.
So it's worth remembering that Christmas actually doesn't start until December 25th. The time before it, which begins next Sunday, is Advent, and it is a time for penitence, not celebration. I think this matters whether you're religious or not. Our lives need punctuation, and holidays—which we will all be celebrating soon regardless of our faith or lack of it—need to be balanced by introspection and soul-searching. No new seed can be sown without first making the ground ready, and heaven knows the ground has been poisoned this year by demonstrations of our increasingly impoverished vision of ourselves.
I speak as an American, and one who has always thought that being an American was a source of pride. In no other country in the world would a descendant of Russian Jewish immigrants have been as likely to marry a Protestant turned Catholic turned agnostic and produce someone with the mixture of spiritual deficits and advantages that have defined my life. At the moment, though, the magic that made America is not working, and the tent city that was stretched out right next to the tourist vendors in front of that San Francisco hotel was a reminder of what Americans can all too easily forget: that blessings come to us not in the middle of prosperity and chest-thumping pride, but through the back door of want, deprivation and yearning. They are defined by our very human need for what is unseen and, as yet, unimagined.
Unfortunately, that back door can also let in the demons we fear the most. Inhospitality, distrust of others and demonization of those who think and act differently are lurking there as well. Rarely have those traits of human nature been on such blatant display as in what has passed for our national conversation this past year. The blessings may be there too, but here's the thing: they're going to try to sneak in where and when we least expect it, and in all likelihood there will be no room for them at the proverbial inn. That's why the holiday season requires repentance as well as celebration. This doesn't mean we need to flagellate ourselves with guilt and regret. All we need is to open ourselves up to new possibilities and remember to look for hope in the face of the lowliest and most despised among us. We must exercise our freedom to be human, which means being broken, weak, inadequate and desperately in need of a helping hand.
So in preparation for the inevitable commercial and emotional onslaught that's sure to come, I would like to share a Christmas song I encountered last year that has become my favorite. Noel Paul Stookey, of Peter, Paul and Mary, packs more wisdom and understanding into these lines than we're likely to encounter anywhere in the commercialized Christmas season that's about to begin. All I can say in response is "May It Be So."
Saturday, November 5, 2011
Here's the good news: Michigan just passed a new anti-bullying law, called "Matt's Safe School Law," named for a gay teenager who committed suicide recently after being bullied beyond endurance. As a former victim of bullying, I have my doubts about whether passing a law is going to stop anyone from being a bully, but at least it's a nice gesture.
OK, here's the bad news: Republicans in the Michigan legislature refused to pass the law unless it included an exception for bullies who act out of religious or moral conviction.
Let's start with "moral." Words do occasionally change meaning, or acquire new connotations, so it might be a good idea to consult a current dictionary definition, or definitions. Microsoft Word Tools menu to the rescue.
1) Involving right and wrong.
I guess that means that if you're right and your victim is wrong, it's OK to be a bully. Right?
2) Derived from personal conscience.
Ah yes—personal conscience is a flexible thing. Given that a personal conscience can be pretty much anything you want it to be, this means that anybody who wants to can be a bully. Still with me?
3) According to common standards of justice.
Well (as George Will would say). That one's a little more difficult. Most common standards of justice would assert that it's wrong to bully, period.
4) Encouraging goodness and decency.
This one might leave some room for the old "be good or I'll beat you up" maneuver. Goodness and decency can't be too dearly bought, one might say. (That's up to one's personal conscience, after all.)
5) Good by accepted standards.
Ah. "Accepted by whom?," one might ask. If we're talking about the kind of standards that are accepted by society at large, it's probably fair to say that bullying isn't good by those standards, no matter who is being bullied. You have to have some pretty warped standards to think otherwise. I sense that we're getting to the core of the matter here.
6) Able to tell right from wrong.
The ad hominem definition. Somebody who can tell right from wrong is moral. Therefore, such a person can advance moral reasons for his or her behavior. Such a person is not likely to argue that bullying is right, at least if he or she is judging right and wrong by accepted standards.
7) Based on personal conviction.
Uh-oh. I see a loophole here the size of Lake Ontario. Personal conviction is something that most bullies I've encountered have lots of. Conviction that it's OK to treat other people like dirt, hurt them, belittle them, and reduce them to sobbing, defenseless helplessness. That's a personal decision, isn't it?
Let's try "religious."
1) Relating to religion.
Hmm. So if someone's religion says it's OK to bully people, it's OK with Michigan too.
2) Believing in a higher being.
Oh. Kay. ... Most bullies I know believe in a higher being. Themselves. As in: "Let me knock you down and trample on you until you could be standing on Mt. Everest and you'd still feel like a helpless shrimp in my presence." That's Bully Psych 101, in case you didn't know.
3) Thorough and conscientious.
Oh wow. Bullies will love this one. They've got all the time they want to be as thorough as they feel like being. After all, somebody can't be humiliated out of every last vestige of his or her self-respect in a day or two. It's a good thing for the bullies that they put this one in.
4) Belonging to a monastic order.
I'm not even going there.
I just can't help myself, I guess. I wanted to make sure that the idea of allowing bullying because of moral and religious convictions had some real basis, and it turns out it does. Never mind that most reasonable people would agree, in the abstract, that a moral or religious person would never bully anyone. That's just something a nerd would say. Bullies know better. (Go back to "religious," definition 2.)
So here's what Michigan has done. It's told anybody who wants to be a bully for any reason his or her personal conscience allows that it's fine and dandy with the state of Michigan for him or her to bully anybody that he or she doesn't think is a good or decent person. But don't take my word for it. Listen to the minority leader of Michigan's State Senate.
And to all you newly empowered bullies out there: Have a nice day, by accepted standards.
Saturday, October 29, 2011
Silverbridge eventually comes around to his father's point of view, which is that of an aristocratic progressive in an era when progress was taken for granted. The one thing to be avoided at all costs was for it to go too quickly, sweeping old institutions and titles away in its haste. Thus, an aristocrat could acknowledge and welcome progress but also be grateful for the brakes that keep its pace measured and steady. The working relationship between liberals and conservatives thus defined is what allowed parliamentary democracy to flourish in the Victorian Era.
It has struck me in recent weeks that in our present society, the roles have been reversed. It is now the conservatives who see themselves as the carriage and liberals who see themselves as the brake. That's more or less what Chris Hedges argues in his recent book The Death of the Liberal Class, but with a twist: we no longer have a working brake, and are thus careening toward disaster.
It requires some historical depth, I suppose, to understand how the ideals of 19th-century liberalism—free markets, economic and social mobility, competition in place of charity—have become conservative rallying points in our own time. In the 1830s, Malthus was in vogue, with his brutal doctrine that human population will inevitably grow faster than its means of support. This doctrine horrified conservatives, who believed that charity could always provide for the poor, but appealed to radical free thinkers who wanted to see that ball and chain replaced by a modern ethos of personal responsibility and unlimited human potential. Thus, it was Britain's liberals who instituted the notorious Poor Laws that so appalled Dickens and other conservatives. (Trollope thumbed his nose at the distinction by insisting that he was an "advanced conservative liberal.") To be conservative in the later 19th century was to wish to place a brake on social change: change that, left unchecked, could easily lead to radical Marxism or equally unpalatable versions of progress that people like Trollope and his fictional liberal Prime Minister were also determined to hold in check.
What Hedges calls "the liberal class" is the 20th-century version of the brake, with capitalism and free markets still filling the role of the historically inevitable, and with even the feared consequences still largely the same. Hedges is simply endorsing the view of many historians when he says that Franklin Delano Roosevelt's main achievement was that he saved capitalism. If Roosevelt and others of the "liberal class" had not put in place the braking mechanism known as the New Deal, capitalism in America could have easily ended up being consumed by the socialist revolution that many were fervently hoping for, or by even darker totalitarian forces. The most progressive politicians of the 30s despised Roosevelt because he stole their golden opportunity out from under their noses. By applying the brakes, he showed that capitalism could still work.
That's why the heritage of the New Deal is so vitally important, and why those who have been trying to scuttle it for the last 30 years are playing with fire. It's why popular anger has boiled up to the point that well over half the population of this country now supports either the Tea Party or the Occupy Wall Street movement. I've already made it clear where my sympathies lie—I think the Tea Party is based on a fundamentally flawed understanding of reality that seeks to remove all brakes from a carriage already hurtling downhill at terrifying speed. Nevertheless, both groups are there, whether they know it or not, because they realize that something that defined America for at least half a century is no longer working the way it should. The liberal class is no longer recognized or valued for its braking power, and so, even with a Democratic President and a Democratic Senate, there is no real liberal voice in our government today.
That is what has to change if we are going to avert disaster. As Trollope so clearly recognized, no carriage can work safely without a brake. Those of us who are proud of being brakes need to regain our voices before it is too late to make a difference. We need to establish, once again, that the brake is not a luxury that can be disposed of, but a vital part of the vehicle that makes up our body politic. Remove the brakes and you are driving a death trap. That is where our country is at present, and I have enough historical depth to tremble at the prospect of what may follow.
Tuesday, October 25, 2011
It never ceases to amaze me what people are willing to assume that liberals think. I have a pretty sizable chunk of money stashed away in private retirement funds, but I supposedly think that my fellow citizens can't be trusted to do the same. In the imagination of people like Rick Perry (yes, that's who I'm talking about), liberals like me want to expand the government because—we want more government. We live to think up new ways to deprive the American people of their personal autonomy and freedom. We revel in the perpetuation of something that he is pleased to call "the nanny state."
I've got some news for you, Rick. You really don't have a clue what we think.
How do I know that? For one thing, you don't ask. For another, you don't seem particularly receptive. Does it occur to you that hyping an image of yourself as a vigilante who jogs with a laser-sighted gun at the ready doesn't exactly make people want to open up and share? If so, and if you ever decide to take a break from the cowboy persona and listen, here are some things that might surprise you.
• I am a liberal, and I believe, deeply, in personal responsibility.
• I am no more fond of government bureaucracy than anybody else.
• I haven't needed a nanny since I was in diapers.
• I try very hard not to caricature the beliefs of vast swaths of my fellow citizens.
• I would really welcome the opportunity to have an intelligent conversation with somebody who doesn't make such broad assumptions about me.
So what do I believe? What makes me a liberal if I don't want government for government's sake and have no desire to reduce other Americans to a state of helpless dependency?
Are you stumped? All you have to do is ask.
Sunday, October 23, 2011
My line-by-line commentary on the sign held by a college student who claimed to be about to graduate completely debt free, however, behaved in a way I had never seen before. Within 48 hours of my posting it the Friday before last, it had received over 200 hits. It continued to be viewed regularly throughout the next week, and as of this writing it has over 600 pageviews, surpassing the previous all-time record by 50%. And it's only been up for 9 days!
Obviously this post is being read by people who go far beyond my normal Not Ready for Facebook constituency. When you search "I am a college senior about to graduate completely debt free" on Google, it is the third item to come up. Thus, anybody wanting to find out more about the sign and the claims it makes is going to be directed straight to my post. And so the hits continue to come in.
I have no idea whether this will lead to a broader readership in the long run. I am going to take the opportunity, though, to write a little more about my experience with un- and under-employment—the six years I alluded to in the last post. In an atmosphere in which the man who is now considered the front-runner for the Republican presidential nomination can say that the unemployed have brought their situation on themselves and receive loud cheers—in which the very state of being unemployed is being viewed by some as an automatic disqualification for further employment—I need to explain that long-term unemployment is something that can happen to anyone, even if you play by the rules and jump through the hoops and pat yourself on the back and expect things to fall into place.
After attending the elite private college I alluded to in my last post, I went on to an even more elite graduate school. All right, it was the same Ivy League institution that four out of the last six presidents have attended. One of those places that trains most of the Wall Street bankers and other highly successful people who make up the 1% whose privileged, insular existence the Occupy Wall Street protesters have been complaining about. While there, I did everything I was supposed to do. I broadened my knowledge in many directions in order to prepare for the college teaching career I was anticipating. I got classroom teaching experience, first as a TA and then as an adjunct instructor. I won a major award for my teaching, and got reviews from my students that topped the charts. I wrote a dissertation that was quickly snatched up by Cambridge University Press and, after some revisions, was published as a book less than three years after I received my PhD.
And I still couldn't find a job. After four years of flitting from one temporary lecturer position (academic-speak for a dirt private with no rank or privileges) to another, I found myself unable to get even an interview for a tenure-track job (sort of the equivalent of a non-commissioned officer—you still have to jump through all kinds of hoops to get into the diminishing circle of those lucky enough to have tenure). It didn't matter that I had a doctorate from Yale (the name will out after all), a book published to warm reviews, and substantial, highly successful teaching experience. I couldn't even get my foot in the door, and that situation lasted for six years.
I soon discovered that I was "overqualified" for almost any other kind of job. It wasn't that I didn't look at or consider other careers. I simply learned that the fact that I had spent most of my 20s getting a PhD in the humanities made me practically unemployable. I was reduced to signing up with temp agencies and begging the local office of Princeton Review to hire me as a tutor. I found a church with a grand piano that was willing to let me use it to teach, and I developed a private piano studio that at its high point consisted of seven students. I decided to try teaching high school, which meant that I had to go back to school and take even more classes in education. One of the low points came in the middle of my semester of student-teaching. I had earned an Ivy League PhD, gone deeply into debt with student loans and published an acclaimed book, all to be paid $14,000 a year to teach part-time at a private school in the morning before going to my unpaid gig at a public school in the afternoon, where I tried to force-feed Shakespeare and Orwell to students who simply couldn't imagine that any kind of literature was not BORING!! It was hard and often degrading work, and I was being paid a pittance for my slave labor. I nearly broke down in frustration, especially after the experience with the principal that I described in my post last June titled "Dead Poets Redux."
I literally would not have made it through that time if my wife had not been able to find work as a registered nurse at the drop of a hat. It didn't hurt that she also received a substantial financial settlement over the death of her first husband. After my first child was born, I spent nearly two years as the custodial parent while Barbara worked. I did the shopping, cooking and house-cleaning as best I could, and bonded with my infant daughter in a way few men get to experience. That part of it was nice, although there were many times when I thought I could feel my brain cells disintegrating from lack of use. Then I finally got the phone call that led to the interview that led to the job that got me onto a tenure-track and, eventually, led to a tenured job, which led to another one, which led to my current position as a full professor at a major university.
Yes, I persevered when many people would have given up. Yes, I showed a willingness to do all kinds of things as needed, and refused to let my lack of professional success define me. (I had seen others fall into this trap, and saw how easy it was for them to become consumed by anger and frustration, which only made it harder and harder for them to get a job.) Nevertheless, I know beyond a doubt that where I am today was as much a result of luck as it was of hard work. I know beyond a doubt that there are many very good people out there who have never gotten a break, despite having played by the rules and done everything right. I know that unemployment is not a choice, that the unemployed are not responsible for their plight, and that anybody who thinks otherwise has simply never been there.
That's why, even though I have a good job, a lovely family, a nice house, thriving retirement funds and adequate health insurance, I stand with the people occupying Wall Street and not with the people looking down their noses at them. Nobody is self-made. Nobody succeeds without a great deal of help from others and a measure of good luck. Nobody. To the extent that our national mythology has bought into the idea that anyone can pull him or herself up by his or her own bootstraps, we are sacrificing community, compassion and mutual support on the altar of a false and soul-destroying individualism. Beware of this false idol, because it will tear our society to shreds without an ounce of regret. Then, like all idols, it will destroy its worshipers as well.
Friday, October 14, 2011
I am a college senior, about to graduate completely debt free.
That's very good. I graduated from college debt free as well. My story is a little different, though, as you'll see.
I pay for all of my living expenses by working 30+ hrs a week making barely above minimum wage.
When I was in graduate school, I had a fellowship that paid a living stipend. It stipulated that I could not hold any other employment while receiving the fellowship. There was a reason for that. Colleges want their students to be focused on their studies. Having taught college now for most of my adult life, I can assure you that this was a good idea. If I had a full-time student who was working 30 hours a week outside of class, I would strongly advise him/her to cut back on work or attend school only part-time. The number of students I have had who can really do well in college while working that many hours is minuscule.
I chose a moderately priced, in-state public university & started saving $ for school at age 17.
17 was the age at which I started college. I went to an expensive private college because I knew my parents could afford it. (They told me so.) I had savings, but it was only because my family had been giving me checks to hoard away for college for years. My family was in pretty good shape financially, and that strongly influenced some of my other decisions, as described below.
I got decent grades in high school & received 2 scholarships which cover 90% of my tuition.
I got excellent grades in high school. However, I was not eligible for financial aid at Oberlin because of their "need-blind" admission policy, which guaranteed that any student they admitted would receive as much financial aid as he or she needed. Since I didn't need financial aid, I wasn't eligible to receive it.
Here's something else I want you to think about. I turned down the chance for a national merit scholarship. I was a finalist, so I could have applied. I understood, though, that the national merit program was designed to help people pay for college who couldn't have afforded it otherwise. It never even occurred to me that someone in my position might apply. Simply having qualified as a finalist was distinction enough.
I currently have a 3.8 GPA.
I honestly don't remember what my GPA was in college, although I know it got better with each passing year. As a freshman I didn't get the kind of grades I had hoped for, largely because I entered with a semester's worth of AP credit (which allowed me to graduate early, saving my parents some money) and took all sophomore- and junior-level classes that year.
Since I've started teaching college, I've learned to be a bit suspicious of students who boast about their GPAs. The students I know who have the highest ones are usually the ones who play it safe, taking only classes they know they will do well in and avoiding exposing themselves to new ideas. I know this is a big generalization. I've also taught students who graduated with 4.0 GPAs who were truly brilliant. I have to say, though, that you sound more like a risk avoider to me.
I live comfortably in a cheap apt, knowing I can’t have everything I want. I don’t eat out every day or even once a month. I have no credit card, new car, iPad or smart phone – and I’m perfectly ok with that.
One of the apartments that I lived in as a student had a floor on which you could place a marble and watch it roll to the other side of the room. I learned to cook for myself to save money. Students couldn't get credit cards in those days; I didn't have one until I was 28. I also didn't have a car until I was 27. Up until then, when I wanted to go anywhere, I walked, or took the bus or train. Students at my college who lived on campus (which almost everybody did) weren't even allowed to have cars. In other words, I'm with you on this one—and then some.
If I did have debt, I would not blame Wall St or the government for my own bad decisions.
I did have significant student loan debt by the time I finished graduate school, although it was probably nothing by today's standards. I didn't blame anybody for it. In fact, when I got my loan check at the beginning of each year, I put it in a money market fund, which at that time allowed me to earn 17 or 18% interest.
I remember having a discussion with another graduate student who told me that he knew people who were doing the same thing and spending their earnings on stereo equipment. I assured him that the only thing I intended to earn from my investment was enough money to finish graduate school. I'm sorry you didn't have that opportunity. At that point, investment firms were still interested in helping people with limited means.
I live below my means to continue saving for the future.
Ah, so do I. However, I didn't expect my wife to go deaf in her 40s and have to stop working. I also didn't expect her loss of employment to be accompanied by medical expenses that nearly bankrupted us. That's the way life is. You try to plan for the future, but it has a tendency to blindside you.
I expect nothing to be handed to me, and will continue to work my @$$ off for everything I have.
That’s how it’s supposed to work.
See above. Until you actually fail at something, you will likely continue to carry around a severe compassion deficit that could make it difficult or impossible for you to sympathize with the difficulties that others experience. This is likely to poison your relationships with loved ones and coworkers. The best thing that could happen to you would be to have to spend a year or two un- or under-employed (I spent six), looking for work and unable to find it. You will emerge from the experience a much stronger person and a much happier one.
I am NOT the 99% and whether or not you are is YOUR decision.
Well, actually, it was my decision not to pursue a career that could put me in the top 1% of wage-earners in our society. Instead, I chose one that would allow me to give back to society with the skills that God gave me, and to do something deeply meaningful with my life. So I am definitely part of the bottom 99% of wage earners, and that was the best decision I've ever made. You, of course, are in the bottom 99% as well. Given that you could have invested all of your savings in an effort to win the lottery instead of going to college, that was probably a very good decision on your part too.
OK, that's pretty much it. I am still trying to make up my mind whether that was a real person in the picture (the absence of a face is kind of suspicious), and not a Koch brothers plant. Assuming that it is, though, all I'm trying to say here is that I'm a real person too, warts and all. I didn't do it all myself (nobody does). I have no desire to get rich. Life has thrown me some major curveballs. Most of the time I don't complain; some of the time I do, just like everybody else I've ever known.
I am the 99%.
Sunday, October 9, 2011
Somehow I succeeded. I did my job with unfeigned dedication and tried to stay out of trouble. Neither did I hide my opinions, and when called on to do so I expressed them. I refused to kiss anybody's you-know-what, but I remained civil and respectful toward all those I had to work with, understanding that the health of the entire college depended on others' and my willingness to do so. As a result, I earned enough respect to survive the tenure gauntlet.
Ever since Henry Kissinger said it, it's become a standard gambit to compare the vicious, feuding world of academia with the comparatively civil one of politics and international diplomacy. Even Kissinger would have to admit, though, that our national political conversation has come to resemble Harvard at its worst. Thus, I want to make my own contribution to the growing debate over what exactly the Occupy Wall St. (hereafter OWS) protesters might be trying to accomplish. I can't speak for anyone else, but I sense a growing frustration that people throughout our entire political culture seem to have stopped talking to each other. Civility—the glue that holds democratic institutions together and allows them to function—has stopped working. Nobody in Congress seems to be acting in good faith, willing to put the well-being of the nation and the survival of democracy ahead of the overriding goal of demolishing the other side.
This culture of dysfunctional name-calling has been building for decades. It would be sorely out of form for me, in an essay with this subject, to point fingers and assign blame for this situation. Let me just say that I don't watch MSNBC and I never have. The few times I tried to listen to Air America I turned it off in sheer boredom. Thus, I feel completely justified in asking people to stop watching Fox News. It was, in fact, my attempt to do exactly that which led to my voluntary withdrawal late last year from Facebook as a vehicle for political discussion. Looking back at what happened, I realize that all I was trying to do was persuade a few people who were clearly getting their talking points from Fox to talk to me as well and hear a different perspective. All refused. Several called me a bully and unfriended me. Incivility gained the upper hand.
So I set up shop here and seized it back. That's why I've been writing this blog and inviting others to join me. I sense a similar type of frustration, and a similar kind of initiative, in the OWS protests. These are people who have tried to work within the established framework, but have found that the framework no longer works because glaring, habitual incivility has completely destroyed it. There is no other place left for them to say what they know needs to be said. What they and I would love more than anything is for the public square to become open again to voices like ours. That, however, would require a commitment from everybody to the process of communication—to doing the hard work of democracy, which inevitably involves swallowing your personal pride and indignation so that the health of the entire country can be maintained and we can continue to function.
If OWS and I have a single message to convey—one that trumps everything else—it's this: you won't get anywhere if you let the conversation be controlled by people who won't talk to each other. The members of this rag-tag coalition of mostly young people are currently the adults in the room. I take my hat off to them.
Wednesday, October 5, 2011
This seems like the right moment for a brief retrospective. I am in my tenth month of recording here, with what now looks to me like astonishing prolixity, my thoughts that once seemed too "hot" to appear as Facebook notes. My "friends" there, including people I had known for a very long time and others I barely knew at all, had been taking offense, so I decided not to bother them. If you've started reading my blog since then, that explains the title.
I'm glad to say that my ruse has worked. I have gotten no complaints, even though I link every new post on Facebook and now on Google+ as well. I seem to have been accepted as harmless. I've taken advantage of that, and have written here, with what I hope has occasionally bordered on poetry, of the often unbearable tension that simply following current events can create within my American soul.
I am, as I have said before, an American liberal, who was born in the 50s, grew up in the 60s and came of age in the 70s. I have always understood liberalism to mean passionate advocacy of the greatest possible freedom for the most possible people. I have been happiest when our politicians have acted with the people's interest at heart, and most miserable when they have colluded with wealth and power to squelch opportunity and enforce conformity. I do not recognize myself in the mocking caricature of the "big government liberal" to be found frequently on Fox News and elsewhere. I am the real thing.
I have recorded here my ongoing disillusionment with being forced to live out my adult life in an America vastly different from what I expected it to become. But I have also expressed my hope that my fellow Americans are better than they often seem, and that we still have it within ourselves to continue to offer the world genuine leadership. If I'm wrong, we will simply become increasingly irrelevant, while continuing to be an active drag on the world's economy and physical environment. I hope I'm not wrong, and when I have expressed disillusionment here, it has been solely in the hope of reaching out. I know, based on many private communications I've received, that I'm not just preaching to the converted. I may not have a huge readership, but I am expanding people's minds.
Today I simply want to go on record saying that the current Occupy Wall Street protests, now spreading to other urban areas throughout the country, are the most hopeful and encouraging thing I have seen in a very long time. Like many, I have been confused by what exactly they are and what the protesters hope to achieve. I may be getting a vague idea of what 60s radicalism looked like to people who were my age when it began. I will be generous enough to assume that the mainstream media have been baffled too, and have not, as can easily appear to be the case, simply been ignoring the protests. I am writing this from my sickbed because I want to make it clear that they are too significant to be ignored. The onus will increasingly be on those who are baffled to justify their bafflement, and to search their souls to figure out just what they are missing.
There is no reason to be baffled. It's been a long time coming, but we finally have a critical number of people whose awareness of wrong, and whose hope for the future, wealth and power cannot control. Whether or not the rest of us have caught on yet, they are the other 99%. They are America. If this country meets its current challenges, it will be because of them. If it fails, it will be despite them. Fortunately, it looks like they're here to stay.
Thursday, September 22, 2011
Of course there was another execution yesterday. The State of Georgia took the life of Troy Davis despite significant doubts about his guilt and highly publicized objections from around the world. Davis was further punished by being kept in suspense for four hours while the Supreme Court heard, then denied, his final appeal. This had happened before. Davis was nearly executed in 2007 and again in 2008, only to be granted last-minute stays.
I have no idea whether Troy Davis was guilty, but if the seven out of nine original witnesses who recanted their testimony are correct, then somebody else actually committed the murder he was convicted of. I sincerely hope that gives some of the people who pushed inexorably for his execution a few moments' uneasiness. To many who followed this case, the apparent inability of death penalty supporters to feel such emotions—epitomized by Rick Perry's glibness in a recent debate—is particularly baffling.
I described the Brewer execution as the "hard case," though, because it is so difficult to feel doubt or uneasiness about it. This was clearly a very bad man, who did something so indescribably awful that it revolted the nation. There was no doubt about his guilt. Furthermore, if this matters, this was a white man being executed for killing a black man. Some would even consider that progress.
Nevertheless, the simultaneity of these two executions gives us a chance to put the debate about the death penalty in perspective. What is wrong with the death penalty isn't the possibility that an innocent person could be executed—indeed, the overwhelming likelihood that many already have been. What is wrong with it is that taking human life does not convey the message that it is wrong to take human life. Instead, it cheapens human life, and thus makes us a coarser, more heartless and more violent society. The state in which I live—the state that executed Lawrence Russell Brewer yesterday—operates one of the most relentless, implacable death machines anywhere in the world. Here in Texas, the death penalty isn't solving the problem of violence and disregard for human life. It's part of the problem.
Thursday, September 15, 2011
If you're wondering, that's a line from Simon and Garfunkel's song "America." Simon and Garfunkel used to be big. Paul Simon still is. Art Garfunkel travels the country and gives concerts as a "legend." I don't think that's quite the same thing.
The line came to my mind last night after an exchange I had with a colleague about references to pop culture in textbooks. I'm writing a textbook, and I've occasionally tried to spice things up. Not a good idea, he said; such references are dated the moment the book hits the stands, and are unfathomable a few years later. So I found myself wondering if anybody else is still in love with Kathy.
I don't even know who she was, but Paul Simon fell for her in England in 1964 and wrote that song about how he had "come to doubt all that I once held as true. I stand alone without beliefs. The only truth I know is you." Kathy was probably a distant memory by the time he invoked her again in "America" a few years later, but I discovered both songs when I was in high school, and they spoke to me as I'm sure they did to many adolescents at the time. The angst was at once eloquent, a bit facile, and indescribably right. It's not great poetry, great music, or great anything, but it's a permanent part of me, and that no doubt marks me as being precisely the age that I am. Dated and now unfathomable, my memories have been replaced by many more recent generations' defining lyrics.
In my case, the Simon and Garfunkel moment was prolonged because I was romantically stymied. Not unusual for an adolescent, I know, but for me it was particularly severe because the years of bullying I had endured through most of my childhood left me afraid of my own shadow. It's one of the lesser-known—but well documented—effects of bullying that victims often find it nearly impossible to develop romantic relationships as adults. Having been there, I understand why. If you've been treated as a worthless piece of trash long enough, it may be literally impossible to believe that anybody else could care about you. What for other adolescents and young adults is a peril-filled but often comical rite of passage was for me simply out of the picture. It didn't happen. (Until it finally did, but that's another story entirely...)
Of course I fantasized about the romantic partners I feared I would never have, and they were always named Kathy. Like I said, being in love with Kathy was a permanent part of me.
I recall all of this now because what I was in love with represented something else that moved in and took up the large empty space that was available in my imagination. As a child of the 60's who came of age in the 70's, cursed by an unrequited need to love, I came to believe deep down that I was a part of something transformative. The Civil Rights movement, the protests against the Vietnam War, the flower-draped, hippy-crazed ambience: I was too young to find it threatening. Instead, it charged me with hope for what this country was becoming. Never having lived in a different America, I didn't know any better than to assume that those who were changing our country before my eyes would continue to do so. It was the only reality I knew.
Paul Simon, though older than I by over a decade, expressed the same yearning in "America." The lyrics, which famously don't rhyme, begin by saying "Let us be lovers, we'll marry our fortunes together." (Kathy and I? America and I? It wasn't quite clear.) "I've got some real estate here in my bag." (The entire country, to be sure.) "So we bought a pack of cigarettes and Mrs. Wagner pies." (Do they still make those things?) "And we walked off to look for America." The song seems to trace a journey that progresses from the Midwest toward New York City. At the end, Simon immediately drowned the introspection of the first line I quoted by asserting that he was "counting the cars on the New Jersey Turnpike. They've all come to look for America." America, it seemed, was something worth searching for, worth creating.
I am writing this to try to explain, once again, why I find the current situation of our country so brutally disappointing. The unraveling of Simon's and my dream began when Ronald Reagan was elected president—something I never, ever saw coming. (To those of you who are too young to remember, until sometime in 1980 Reagan looked exactly the way Sarah Palin and Michele Bachmann do now: extreme, dumb, and absolutely unelectable.) The man who had once used National Guard helicopters against student protesters at Berkeley began the process of dismantling everything that was good about the country I had always known. I can only compare the slow, brutal descent into an ugly, discordant reality that has unfolded ever since to an unrequited love that just goes on and on. Like so many others I knew, I so wanted to give this country what my generation grew up believing, and I've been spurned, again and again and again, with unbelievable vehemence. Kathy, I'm lost.
Tuesday, September 13, 2011
Here's what happened. Blitzer was grilling Ron Paul about national health insurance. He posed what was no doubt intended to represent an extreme case. You can watch the video at http://2012.talkingpointsmemo.com/2011/09/tea-party-debate-audience-cheers-idea-of-letting-sick-man-without-insurance-die-video.php, but I've transcribed it below.
"A healthy 30-year-old young man has a good job, makes a good living, but decides 'You know what? I'm not gonna spend 200 or 300 dollars a month for health insurance, 'cause I'm healthy, I don't need it,' but, you know, something terrible happens, all of a sudden he needs it. Who's gonna pay for, if he goes into a coma. Who pays for that?"
Paul responded that "In a society where you accept welfarism and socialism, he expects the government to take care of him."
"But what do you want?" Blitzer shot back.
"But what he should do is whatever he wants to do, and assume responsibility for himself," said Paul. "My advice to him would [be] have a major medical policy but not be forced..."
"But he doesn't have that," Blitzer reminded him. "He doesn't have it, and he needs intensive care for six months. Who pays?"
"That's what freedom is all about, taking your own risks. This whole idea that you have to prepare and take care of everybody..." is what Paul was saying when the audience broke into a round of furious applause.
"But Congressman," Blitzer interrupted, "are you saying that society should just let him die?"
Paul, to his credit, answered "No," but he was nearly drowned out by a few loud voices from the audience calling out "Yeah!" with supportive cheers from many others. As I listened to the video at about 10:00 last night, I was reminded, and promptly stated on my own Facebook page, that the expression "chills ran up my spine" is not a figure of speech. I was nauseated, sickened, horrified, and in near despair. This was the audience at a debate held by one of our two major political parties (albeit with "Tea Party" backing), and a significant contingent was cheering the idea of letting somebody die. Even though I was in pain and really needed to go to bed, I put out a feeler to see if others were as appalled as I was.
The thread that followed has run, so far, to 51 comments. Everybody who chimed in was also horrified—even a few whom I know to be Republicans. There was some doubt expressed about whether this was really the view of the Republican party, or of even more than a lunatic fringe within it. (It did sound to me like a significant portion of the audience was cheering.) One poster pointed out that Blitzer's question was really quite ingeniously phrased (which it certainly was). Another poster told the first one that it was her responsibility, as a Republican, to make sure that her party doesn't fall into the hands of extremists: a position with which I strongly agree.
I've slept on it now, and I just want to add the following. I was once thirty years old and had a good job. Fortunately, it came with health insurance, and I didn't have to contribute anything on my own; it was a pure fringe benefit. I say fortunately because I was paying back significant loans from my education, and on my "lecturer" salary I could not have spared 200 or 300 dollars a month for health insurance. I was healthy, and given the choice, I would have had to turn the insurance down.
But there is more. Christianity—the religion to which the majority of "Tea Party" members claim to belong—is founded on grace. Grace is embodied in the idea that God acted for us decisively despite the fact that we had done nothing to deserve such action. Consequently, the Christian Gospel requires us to do the same. The Gospel stands or falls on whether we accept the reality of radical, unconditional grace and internalize it. You cannot be a Christian and cheer the idea of somebody being left to die, for any reason whatsoever. Period. End of discussion.
I've spoken here before about the way that the Christian Gospel has been clashing publicly with a very different Gospel: that of Ayn Rand, who would indeed have applauded the idea of letting someone die as preferable to letting society lift a finger to help. I have said before that these two Gospels are incompatible, and the cheering at last night's debate perfectly illustrates that point. The people who cheered are not Christians. They may go to church, tithe and take communion, but they are still not Christians. You can only be a Christian by the grace of God, totally beyond and above your own deserving. As a well-known Christian parable illustrates, that man in a coma without health insurance is Christ. A country that would choose not to help him is rejecting Christ. And that's really all that needs to be said.
Thursday, September 8, 2011
That day I shared a link on my Facebook page that shows the face of American Labor. Florence Reece, whose family had been harassed by management agents during the 1931 coal miners' strike in Harlan County, Kentucky, wrote the song "Which Side Are You On?" Over 40 years later, the still feisty Reece sang the song for miners and their families as they struck against Duke Power for safer, decent working conditions. The sight of young people, who knew the song by heart, singing along with her is an image of what Labor Day really means. It is about the centuries-long struggle of American workers to achieve human decency at work and at home in the face of employers driven by profits and determined to earn them at the lowest possible cost.
This struggle, which has led to things like pensions, paid holidays, sick leaves, workplace safety laws, child labor laws, and 40-hour work weeks—things we all take for granted today—is one of the great stories in American history. For this reason, people like Florence Reece, who wrote this song, and Pete Seeger, who popularized it, should be remembered as American heroes. They contributed as much to the fabric of modern America as many presidents and generals, if not more. For most of us, what we think of as the American Way of Life is founded on their achievements.
Why did the miners of Harlan County have to struggle as they did? It wasn't just because the mining companies were greedy; it was also because the government used its power to suppress their voices and support the interests of the wealthy mine owners. This is a role that many people who call themselves conservatives are quite happy to see the government play. They are deluding themselves, though, if they think that supporting the rights of powerful corporations over the rights of the people who work for them is a formula for "small government." Their support simply puts the power of government to work increasing the concentration of wealth and promoting greater inequality. This is a process that has been going on in the US since Ronald Reagan fired the members of the air traffic controllers' union thirty years ago this summer, and it has now gained frightening momentum.
I don't mean the title of this post as a call to arms. I have no great desire to divide the public into "us and them," "sheep and goats," or whatever other combative image the title of Reece's song might suggest. The lack of civility in the public sphere has become so great that it is now probably the biggest single obstacle to progress. I'm all for promoting mutual understanding, not driving in more wedges.
For that very reason, though, I want to take the occasion of Labor Day week to clarify where the dividing line in our society really lies. It is not between proponents of small government and proponents of big government. We have big government, and we're going to keep having it, because, as I've said before on this blog, we're a big society that cannot simply run on its own.
The real issue is whether big government is going to be used in the interest of the wealthy and powerful and to the neglect of the poor and needy, or whether it is going to use at least some of its power to help the poor and needy protect their rights and keep the influence of wealth and money in check. It now costs so much to be elected to any public office that both parties are in the hands of what FDR called "organized money." That's why organized labor, and everything it represents, is just as important today as it ever was. If there isn't somebody around to remind the government to take care of those who work for a living, we just might find ourselves right back where we were in 1931. Sooner than you think.
Sunday, September 4, 2011
The world, it was noted then, changed overnight. Even before the questions of who and what began, the questions of how grasped our souls and began to change us, for we could not remain unchanged. How, in the past tense, meant what strange concatenation of evil had led up to this; how could we be shaken in so unexpected a way? How, in the future tense, meant how could we respond? For respond we would, and in all the bonding and reassuring and heroism and tears of that time, we remained torn by the knowledge that we would act, and that our actions would define us.
The first days were the hardest days, as we sat, unaccustomed, in the world's nurturing embrace, striving to become great of soul. The outpouring of love was real, and we appeared to chafe, no longer able to hold ourselves up by our own exertion. The power that is made perfect in weakness eluded us, and the power defined by bombs and dollars burst gasping from the embryonic mass. As we struck, the world recoiled, gradually withdrawing its love, then its support, then what little remained of its sympathy. Our actions, which did define us, were those that we could perform by rote, by comfortable habit, by flexing our atrophied but desperate muscles and flailing blindly.
I wish I could tell a different outcome to this story. Ten years down the road, I am haunted by unrealized possibilities. The present has become our prison, as is all too evident from the way our politics have ceased to function. The glue that held us together in the past is stretched beyond the bursting point, and the flailing continues, as our worn-out strength is unable even to hold up our own weight, while the once sympathetic world licks its wounds and regroups.
There will be a lot of empty words this week, and my guess is that the world will not much care. What it needed from us, expressed in that inchoate embrace of ten years past, is what we never tried to give in return. The new, creative thinking that once compassed our greatness has yet to appear, as we turn on ourselves and fail to lead the way out of crises as great as any faced in the last century, when our self-assurance matched our untested power.
I try not to despair. The courage and will to solve the environmental and economic crises that imperil the world are still here. I can read them in countless blog posts and comments, and hear them from friends and relatives throughout the country—even (or especially) here in central Texas. My horror this week is that the swagger and desperate grasping for self-assurance that are likely to compete for our attention on the 10th anniversary of that awful day will only serve to strengthen the process that has made us less bold, less visionary, less ourselves than we have ever been before.
May the God who is the source of all wisdom shine some light through this darkness and enable us to see. May the prayers of the intercessors, far from the seats of political power, be heard and turned to hard wisdom. May we all be moved to rediscover our core values, and give to the world what it still so desperately needs us to give.
Thursday, September 1, 2011
"Non, je ne verrai pas la déplorable fête
Où s’enivre, en espoir d’un brillant avenir,
Ce peuple condamné, que rien, hélas! n’arrête
Sur la pente du gouffre."
"No, I will not look upon this dreadful celebration
Where, drunk with hope of a brilliant future,
This doomed people, whom nothing, alas, will stop
On the slope of the abyss."
Cassandra, in Berlioz's Les Troyens
I have been deeply moved these past few weeks to see hundreds of people willing to face arrest at the White House in a massive act of civil disobedience. If you don't know—and you very well may not—they have been protesting the proposed construction of a massive pipeline, known appropriately as Keystone XL, that would link the Canadian tar sands directly with refineries in Texas.
Among those arrested are Bill McKibben, who has been speaking out with increasing urgency on the threat of climate change, and NASA's leading climate scientist, Dr. James Hansen. In Hansen's words, approval of the pipeline will mean that it is "essentially game over" for the climate. Extracting oil from the tar sands produces three times as much greenhouse gas as simply burning it. The pipeline will run through America's heartland, and has the potential to produce an ecological disaster all by itself, including possible contamination of the Ogallala Aquifer. In short, it is a truly horrible idea, and it appears beyond belief that a Democratic president who claims to be an environmentalist is poised to approve it.
Or so those who oppose the pipeline have been claiming. Every time the issue comes up, strong countering voices chime in to point out that Canada will develop the tar sands regardless of whether Keystone XL is built. In fact, it is already doing so, devastating native American habitat and primal forests in the process. If we don't build the pipeline, the argument goes, the Canadians will just build one to their own West Coast and export the oil to Asia. In the process, President Obama will have sacrificed thousands of potential American jobs and sabotaged our efforts to achieve energy independence.
In other words, this fuse is going to be lit, and there is nothing we can do to stop it. The game is already over. The tar sands are going to be developed, for the simple reason that the worldwide demand for oil is rising at the same time the supply is falling, or at least not growing fast enough to keep up.
This is not encouraging news, but it gives reason to wonder if Hansen's choice of words might have been unfortunate. If this is indeed "game over for the climate," and if it is bound to happen no matter what America does, then it will be difficult to mobilize people for any future efforts to stop climate change and reduce our dependence on fossil fuel. As readers of this blog know, this is something that I care passionately about, and I'm not willing to accept defeat. I fully expect future generations to hold all of us now alive responsible for the devastation they are likely to face. In fact, I expect my children and grandchildren (if I have them) to be mad as heck about our failure to solve this crisis, or even to face it honestly. The urgency I feel on their behalf is hard to overstate.
At the same time, I honestly wonder whether there is anything we can do. People simply aren't turning away from fossil fuel and embracing alternative energy at the rate that will be necessary to avert disaster. There may simply not be enough alternative energy for them to embrace. How about living without air conditioning when the temperatures here in Texas have been in the triple digits every day for the last three months? How about giving up fresh fruits and vegetables for most of the year? Abandoning air travel? These are decisions that people will have to be consciously willing to make, and people just don't make decisions like that. It's against human nature.
So we're doomed. Doomed, at the very least, to an ongoing series of futile protests like the ones now going on in Washington. Even if we win this battle, we'll probably lose the war. Does that mean we should give up, and guarantee that our ultimate defeat will be even worse than it might have been? Game over? Cassandra, I feel your pain.