Thursday, March 31, 2005
Hail to the Chief
In the past, this fearless blog has discussed war, abortion, euthanasia, torture, and Republicans. We’ve invited applause and brickbats from all quarters. But we’ve had our limits, too. For not until today has this blog dared to address the truly controversial and disturbing question of . . .
But before I get to “the Chief,” I just want to point out that although I know less about college basketball than I know about smooth jazz, I have Illinois beating North Carolina in the NCAA finals. In fact, if I were taking part in this March Madness competition among sports savants, I would be in second place with 860 points. And I think that the really amazing thing about the Illini’s comeback against Arizona last Saturday was not that they erased a 15-point deficit in four minutes, but that they erased an eight-point deficit in eleven seconds. So: go Illini.
And let me join King Kaufman in offering kudos to the University of Illinois for deciding to leave the Chief behind when the Illini travel to St. Louis this weekend.
I lived with the Chief for twelve years when I taught at Illinois, and for most of those years I didn’t think much about him. Sure, it’s embarrassing to have a white college kid in Native regalia dancing around a football stadium at halftime, but it really wasn’t on my list of the world’s Top Thousand Injustices. What astonished me, though, was the depth of emotion on the pro-Chief side. It wasn’t the in-your-face, I’m-politically-incorrect-and-lovin’-it demeanor of the defenders of the Confederate flag; it was a weird combination of truculence and sappy sentimentality. Yes, I know some of the Stars and Bars fans can get weepy about their “heritage,” too. But this was qualitatively different: these people honestly believed that they were paying solemn tribute to Native American people and culture, and that the Chief was a dignified figure whose halftime dances were august and reverential remembrances of the Illini of yesteryear. (Though I will never forget the acerbic graduate student who said, “you know, if historical accuracy is what they’re after, they should symbolically kill the Chief after every dance.”) Conservative politicians tried to pass a state law designating the Chief as the symbol of the university, and local figures in and around Champaign-Urbana organized a “Save the Chief” campaign that continues to this day. (You can check out the “Chief Illiniwek Educational Foundation” website for handy pro-Chief material.)
The intensity of that campaign has driven more than one chancellor from office at the University of Illinois, because for most of the people in a fifty-mile radius around Champaign-Urbana, the most important thing about the University of Illinois is not that its faculty in the sciences invented the transistor and nuclear magnetic resonance imaging, nor that its programs and conferences in the humanities are internationally known, nor even that its extraordinary library is the third largest in the nation, behind only Harvard and the Library of Congress. In the prairie precincts of the Prairie State, they don’t give a shit about the library. Nobody wants to fund that thing. No, what they care about is the Chief.
By the late 1990s, I’d had quite enough of this nonsense, so I chimed in on the anti-Chief campaign begun by Native American activist and former Illinois graduate student Charlene Teters and led, among the faculty, by Steve Kaufman of the liberally-biased Department of Cell and Structural Biology. When Steve asked for supporting letters, to be sent both to trustees and to the accreditors of the Middle States Association (we wanted them to consider the impact of the Chief in their assessment of the university, and they agreed), I wrote back and said, among other things,
the emotions and arguments of the Chief’s ardent local supporters have close analogies in minstrelsy, which was vigorously defended, 100-150 years ago, as a vehicle for and tribute to authentic African American humor. (Today, these defenses of minstrelsy are either merely laughable or utterly unthinkable, and no sensible person would seek to revive them.) Similarly, the Chief’s supporters insist on the “dignity” of this figure, and the “tradition” that underwrites his continued appearance. Yet no American university that wanted to think of itself, as Illinois rightly does, as a “world-class institution” would offer up a minstrel show at its athletic performances, regardless of how passionately attached to such shows anyone had become. Imagine, if you will, the further spectacle of alumni and trustees and state representatives testifying to their deep love of these humorous characters whose noble culture is enshrined in the revered tradition of the minstrel show. Such a spectacle would properly be seen, in 1999, not so much as a slur against African Americans as a shameful acknowledgment that the university offering the spectacle – and the people cheering it on – had no idea whatsoever that the racial discourse of 1900 was no longer appropriate to the year 2000.
Well, you get the drift. When people talk about liberal college faculty being out of step with the rest of the nation, think of this: in places like Champaign-Urbana, the college faculty are the only mass of liberals to be found for miles and miles, from Chicago to St. Louis. The contrast with the immediately surrounding environs is stark and undeniable – and it is both reflected and heightened by clashes like those over the Chief.
I saw the Chief in action precisely once. I attended a number of football and basketball games during my time at Illinois, but for one reason and another I did not see the halftime show until 1997. It was during a basketball game against Minnesota, and I was sitting with then pre-adolescent Nick and one of his friends, when suddenly a bunch of white folks in bright orange sweaters and T-shirts ran onto the court and took up positions on the perimeter, ringing the court in orange. As they clapped and smiled and bounced, on came the Chief himself. It was a profoundly cringe-inducing experience. The Chief’s supporters insist that his routine is “loosely patterned after Native American fancy dance”; now, I know even less about Native dance than I know about smooth jazz, but I am not aware of any indigenous dance forms that involve lots of splits and jumping and touching your toes in mid-air. I turned to Nick and said, “never mind the debate about whether the Chief is racist – this stuff should be banned for sheer cheesiness alone.” But I said it sotto voce.
For as I watched and cringed and cringed some more, I noticed that sure enough, people around me were cheering and tearing up. And I began to think, this is as much a cultural divide as a political one, a divide between those with a liberal cringe reflex and those without. Surely, for my fellow Illinois fans, my visceral reaction to the Chief was just the mirror image of their visceral reaction to the Chief – except that mine was defined by what they would see as an elitist, nose-pinching, PC rectitude that symbolizes everything wrong with liberal college professors. I don’t have any problem with the name “Illini,” actually—or, for that matter, with the name “Illinois.” But the Chief and his halftime dance are another order of thing altogether. Please, I thought, let this hopping-and-skipping minstrel show end, and let’s get back to basketball. I didn’t come here to meditate on town and gown – or on what we’d now call blue and red America. I just came here to watch Illinois defeat the culturally innocuous, inoffensively-named Golden Rodents Of Some Kind.
Oops! Sorry about that, Minnesota fans. Really, your little gophers are OK with me. But now, back to business: go Illini, crush the culturally innocuous, inoffensively-named Louisville Cardinals. And once again, kudos for leaving the Chief in Champaign, where he will not distract attention from a fine, fine basketball team.
Next week’s topic: the University of North Dakota hockey team and their fabulous new rink!
Wednesday, March 30, 2005
Mistah Kurtz, he dead right
Reporting on a brand-new survey of college professors, Howie nails us:
College faculties, long assumed to be a liberal bastion, lean further to the left than even the most conspiracy-minded conservatives might have imagined, a new study says.
Even the most conspiracy-minded conservatives? I don’t know any conspiracy-minded conservatives who are obsessed with liberal professors. Do you?
By their own description, 72 percent of those teaching at U.S. universities and colleges are liberal and 15 percent are conservative, says the study being published this week. The imbalance is almost as striking in partisan terms, with 50 percent of the faculty members surveyed identifying themselves as Democrats and 11 percent as Republicans.
The disparity is even more pronounced at the most elite schools, where, according to the study, 87 percent of faculty are liberal and 13 percent are conservative.
“What’s most striking is how few conservatives there are in any field,” said Robert Lichter, a professor at George Mason University and a co-author of the study. “There was no field we studied in which there were more conservatives than liberals or more Republicans than Democrats. It’s a very homogenous environment, not just in the places you’d expect to be dominated by liberals.”
Religious services take a back seat for many faculty members, with 51 percent saying they rarely or never attend church or synagogue and 31 percent calling themselves regular churchgoers. On the gender front, 72 percent of the full-time faculty are male and 28 percent female.
The findings, by Lichter and fellow political-science professors Stanley Rothman of Smith College and Neil Nevitte of the University of Toronto, are based on a survey of 1,643 full-time faculty at 183 four-year schools. The researchers relied on 1999 data from the North American Academic Study Survey, the most recent data available.
Hold the phone—1,643 full-time faculty? Folks, Penn State alone has more than 1,643 faculty. You’re telling me that this survey is based on an average of nine professors at 183 different schools? And let’s see . . . there are over two thousand four-year colleges in the United States, so . . . well, you do the math. All I can tell is that this “72 percent” figure keeps coming up: 72 percent are liberal, 72 percent are male. I’m not a specialist in statistics, but my guess is that it means that all the male faculty are liberals.
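Since the post invites you to do the math, here is a quick sketch of the back-of-the-envelope arithmetic. (The two-thousand figure for four-year colleges is the post’s own rough estimate, rounded down here for simplicity.)

```python
# A quick check of the survey's sample-size arithmetic.
surveyed_faculty = 1643
schools_sampled = 183
four_year_colleges = 2000  # the post's rough "over two thousand" figure

faculty_per_school = surveyed_faculty / schools_sampled
print(round(faculty_per_school))      # about 9 professors per school

share_of_schools = schools_sampled / four_year_colleges
print(f"{share_of_schools:.0%}")      # roughly 9% of four-year schools
```

Nine professors per campus, at fewer than a tenth of the nation’s four-year schools: that is the sample behind the headline.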
I do know, however, that the survey was undertaken for very scientific reasons:
The study appears in this month’s issue of Forum, an online political-science journal. It was funded by the Randolph Foundation, a right-leaning group that has given grants to such conservative organizations as the Independent Women’s Forum and Americans for Tax Reform.
But pay no attention to the men behind the curtain! Look at what these far-left professors actually believe:
The liberal label that a majority of the faculty members attached to themselves is reflected on a variety of issues. The professors and instructors surveyed are, strongly or somewhat, in favor of abortion rights (84 percent); believe homosexuality is acceptable (67 percent); and want more environmental protection “even if it raises prices or costs jobs” (88 percent). What’s more, the study found, 65 percent want the government to ensure full employment, a stance to the left of the Democratic Party.
Did you get that? A stance to the left of the Democratic Party! That’s gotta be some wild, far-out stuff there—completely off the Howard Kurtz map altogether, or perhaps marked only by a blank space and an ominous legend, here there be Spartacists. It’s a good thing the survey didn’t ask us how we felt about workers seizing the means of production!
But wait, let me take this thing more seriously for a moment. After all, there’s really no question that college faculty are generally more liberal than the rest of the population. OK, now, let’s see . . . I’m strongly or somewhat in favor of abortion rights, check—whatever that “somewhat” means (rape? incest? life of the mother? for Sherri Finkbine and no one else?). I believe homosexuality is acceptable, check—though I’m really curious about the five percent of “liberals” who disagreed with this one. I want more environmental protection even if it raises prices or costs jobs, check—though it all depends on whose jobs we’re talking about.
Which brings me to the bit about the government ensuring full employment. Here’s where I’ve got to part ways, yet again, with some of my brothers and sisters on my left. “Full employment” sounds nice—it’s sort of goofy and utopian, like imagining that access to health care is a human right or something—but it’s dangerously naive. Certain people should definitely be unemployed. In fact, I have a nice long list of names on my hard drive, all alphabetized and ready to go. (But readers can feel free to make their own suggestions in comments!)
In the meantime, something needs to be done about all these totalitarian leftist professors, and in this as in so much else, Florida is leading the way. The next time your liberal biology professor insists that you have to accept “evolution” in order to take his class, just sue the bastard!
Tuesday, March 29, 2005
Mister Question Man
Whenever I drive near or in actual cities with the car radio on – and by “actual cities” I mean “places of high population density and at least one ‘jammin’ oldies’ radio station playing Al Green’s ‘Still in Love with You’” – I find myself confronted with a question whose world-historical profundity is masked by its surface simplicity. And because I can contemplate the matter no longer, I’m turning this one over to you, my reasonably faithful and always thought-provoking readers.
Where did “smooth jazz” come from?
Everyone I’ve asked so far says, “it came from Kenny G,” which, however intriguing it may be as a possible horror-movie title, is ahistorical and undialectical and also wrong. Smooth jazz seems to have originated in the mid- to late 1970s; some scholars cite the work of Grover Washington Jr., some refer to Chuck Mangione’s “Feels So Good,” some point to the ubiquitous David Sanborn, and some insist that the epistemological breakthrough must be credited to George Benson’s Breezin’. All of these suggestions are plausible enough, but they displace the question of structural determinations and musical influences onto a list of “major figures,” and therefore must be rejected by a properly post-neo-Bolshevist theory of the rise of smooth jazz.
The genre overlaps to some degree with that of the “quiet storm” branch of r & b, as I was reminded while driving around Baltimore and finding Heatwave’s “Always and Forever” being played on the local smooth jazz station. But it also has affiliations with both fusion and funk. Fusion is probably the more obvious of the two: it’s just a half-step from Weather Report or Al Di Meola to some of the more musically challenging forms of smooth jazz. And as the example of David Sanborn demonstrates, smooth jazz also has an embassy in the neighbor state ruled by Steely Dan (in fact, some historians attribute Aja to “Steely Dan’s colonization by the forces of smooth jazz”). The connection to funk is probably more controversial, since people tend to like funk enough to want to absolve it of all connection with smoothness and frizzy-haired flutists. And yet for drummers, the link between smooth jazz and funk is pretty clear: unlike most jazz produced between the 20s and the 70s, smooth jazz rhythms are built around the snare and bass drum rather than the ride cymbal and hi-hat. Are they funky? Well, not exactly – remember, they have to serve as wallpaper for the Weather Channel. (Now there’s an article waiting to be written: From Weather Report to the Weather Channel.) They’re kind of like funk with all the blood and vital oils and shouts and “can I take it to the bridge”s drained out. James Brown once said that the difference between funk and disco was that disco stayed on top of a groove whereas funk got down into the groove and deepened it; smooth jazz, perhaps, represents a form of musical waterskiing over the groove. In which case we’d have to add the Love Unlimited Orchestra – and all that that implies – to our list of structural determinants and musical influences. And we’d have to admit that songs like Patrice Rushen’s “Forget Me Nots” – even though they have kickin’ bass lines that make you want to move – are closely related to Jazz That is Smooth as well.
Who knows but that we might have to consider the necessity of a comparative genealogy of smooth jazz and the post hoc genre of “jammin’ oldies” itself.
I actually like some small fraction of the stuff, particularly when I’m driving long distances and spacing out. For that matter, I also like Aja, especially “Home at Last.” I think I was the only person I knew in college who enjoyed both Aja and Never Mind the Bollocks, Here’s the Sex Pistols. This ideological eclecticism-tantamount-to-incoherence has dogged me to the present day, though at least I am not ahistorical or undialectical about it. Still, if anyone has further suggestions about the origins and affiliations of smooth jazz, now’s the time.
Monday, March 28, 2005
Greetings from the Houston Zoo
It’s Monday Cobra Statue Blogging!
And yes, that’s a jacket Jamie is wearing, even though we’re in southeast Texas: on Sunday, as we toured the zoo and the museum of natural science (which was crammed full of pro-evolution propaganda, as it turned out), we found to our dismay that the so-called “laws” of “physics” had been suspended for a 24-hour period. It was in the high 70s on Saturday and I hear that Houston is in the mid 70s today, but for our Sunday excursion it was in the low 50s with 20-mph winds. The models of the Earth’s atmosphere and climate that we consulted in the natural science museum itself suggest that this is impossible, which leads me to conclude that science does not, in fact, have all the “answers.” I therefore announce the formation of a new school of thought devoted to the proposition that there is a nonhuman and possibly divine “intelligence” behind our seemingly random daily weather patterns. I’m thinking of calling it “Intelligent Meteorology.” But I’m open to other suggestions!
Then we caught our 4 pm flight to Baltimore, got to our car at 9:30, drove home through the freezing rain and pulled into the driveway at 1 am this morning. Blogging will resume tomorrow when I’ve recovered from the trip.
Thursday, March 24, 2005
Off to Houston
Good news for all of you who are weary of the 2000-word mini-essays I’ve been posting lately (like the one below): I’m taking a break! It’s off to Rice University to deliver the keynote address at a conference on “The Post-National Nation: Ideology and Institution in the Global Era”—and this one’s a special event, because I’m bringing Jamie with me. (Many, many thanks to the conference organizers who are making this possible.) He’s been eagerly looking forward to this for months and months. How psyched is he? This psyched: the other night, after he finished his math homework (he really gets the concepts of factors and fractions; he still stumbles a bit with long division), he had to compose sentences with his spelling words, one of which was “control.” He immediately came up with “you need to control yourself when you are excited.” “Good one!” I said. “Like going to Houston,” he replied.
Well, let’s hope that’s a good sign, and that all goes well. He loves hotels, he loves hotel pools, and he really loves room service. And then on Sunday we’re off to the zoo. See you all next week.
I don’t know if anyone is talking about “advance directives” or “living wills” or the right to refuse life-sustaining medical treatment lately, but just in case anyone is, I thought it might help to confuse things beyond measure if I pointed out that
(a) some courts have insisted that advance directives have to be quite detailed with regard to specific levels of care and specific states of injury or illness;
(b) advance directives give courts and guardians guidelines for honoring patient autonomy – most importantly, an individual’s right to refuse treatment – but, of course, cannot account for the possibility that an individual might change his or her mind about refusing treatment after becoming ill or injured (and that such an individual might be incapable of saying so); thus, there is a possibility that the ideal of patient “autonomy” can be invoked both to honor the advance directive and to set it aside in favor of the argument that a patient’s radically changed circumstances, due to illness or injury, might have induced him or her to reassess his or her desires about treatment;
(c) the difficulties of entertaining the possibility that a person might “change her mind” about her advance directive become even more impossibly complex when the person’s mindedness is precisely what’s in question, as in cases of dementia, mental illness, or injuries and illnesses that leave a person conscious but incompetent; and
(d) adults with intellectual disabilities may not be competent to execute advance directives in the first place.
I’ll be more specific. The legal cases In re Wendland (28 P.3d 151 (Cal. 2001)) and In re Martin (450 Mich. 204; 538 NW2d 399 (1995)) involved eerily similar circumstances: relatively young men badly injured in auto accidents, both of whom had repeatedly expressed– orally, to their families– the desire to refuse medical treatment in what they considered to be extreme circumstances. As Mary Ann Buckley summarizes the case (this link is to a .pdf file), Robert Wendland
developed a drinking problem after the death of his father-in-law, who had been maintained on a ventilator while dying from gangrene. While watching his father-in-law in that condition, Robert told his wife, Rose, “I would never want to live like that, and I wouldn’t want my children to see me like that, and look at the hurt you’re going through seeing your father like that.” Robert told Rose that her father “wouldn’t want to live like a vegetable” and “wouldn’t want to live in a comatose state.”
Both Rose and Robert’s brother, Michael, became concerned about Robert’s safety because of his drinking. Michael told him, “I’m going to get a call from Rosie one day, and you’re going to be in a terrible accident.” Upon Michael’s warning that he would end up laying in bed “just like a vegetable,” Robert responded, “Mike, whatever you do[,] don’t let that happen. Don’t let them do that to me.” According to one of his children, Robert said during that conversation that “if he could not be a provider for his family, if he could not do all the things that he enjoyed doing, just enjoying the outdoors, just basic things, feeding himself, talking, communicating, if he could not do those things, he would not want to live.” Rose testified that Robert “made clear” to her that under no circumstances would he want to live if he had to have diapers, if he had to have life support, if he had to be kept alive with a feeding tube or if he could not be a “husband, father, provider.”
Robert was severely injured in an automobile accident in September 1993, as a result of his driving while intoxicated. He remained in a coma for sixteen months. Although he eventually regained consciousness, he was left both mentally and physically disabled.
. . . He first began to show signs of responsiveness in late 1994 and early 1995. Between January and July of 1995, Robert’s feeding tube dislodged four times. Rose authorized surgical replacement of the tube the first three times but refused the fourth.
At this point Robert’s estranged mother, Florence, and his sister filed for a restraining order to block removal of the feeding tube. And here’s where things got really difficult. Buckley writes:
Despite support for Rose’s decision by his counsel, his physician, and the ethics committee, the trial court found that Robert’s statements to his wife and brother while he was competent were not enough to show by clear and convincing evidence that he would have wanted to die if he were minimally conscious. The trial court held that a conservator could withhold artificial nutrition and hydration from a minimally conscious conservatee if shown by clear and convincing evidence to be in the conservatee’s best interest, considering any wishes the conservatee may have previously expressed. The court found that Rose had not met her burden. [My emphasis.]
The appellate court reversed, and then in 2001 the California Supreme Court reversed the appellate court in turn, arguing (among other things) that Robert’s stated wish not to live “like a vegetable” pertained only to “persistent vegetative state” rather than the “minimally conscious but incompetent” condition in which he found himself. Robert eventually died in July 2001 of pneumonia.
Martin is similarly agonizing, involving severe injuries and deeply divided families. But this time I’ll just give you the digest summary:
In determining whether a person, now incompetent, would desire to refuse life-sustaining medical treatment under the circumstances, the predominant factor is the existence of a prior directive to that effect. The prior directive may be written or oral. The weight of an oral statement depends upon the remoteness, consistency, specificity, and solemnity of the statement, and a statement made in response to another’s prolonged death does not provide clear and convincing evidence of the desire to refuse treatment. In this case, the patient was severely injured in an automobile accident, and suffered brain damage which significantly impaired his physical and mental functioning. The injuries left him partially paralyzed; he cannot walk, talk, or eat and has no bowel control. Although he remains conscious, his cognitive abilities are seriously affected. Testimony established that before his accident the patient had verbally repeatedly expressed his desire not to be maintained if incapable of performing basic functions and without hope of improvement. The patient’s expressions did not sufficiently specify the circumstances presented and did not constitute clear and convincing evidence of the desire to refuse treatment.
OK, what to make of all this? First, that the supreme courts of Michigan and California have set an astonishingly high standard as to what constitutes “clear and convincing” evidence of an individual’s wishes. Second, that these cases have implications that you attorneys out there would call “sweeping”: as Buckley points out, “Of the approximately six thousand deaths that occur daily in the United States, it is estimated that approximately seventy percent involve decisions to forego life-sustaining treatment. . . . [P]atients with dementia often retain some level of cognitive functioning before they are deemed terminally ill, yet after questions of life-sustaining treatments arise. In addition, given the rarity of PVS [persistent vegetative state], it is reasonable to assume that the majority of such decisions involve patients who are at least minimally conscious. As a result, the ‘narrow class’ that Wendland affects in fact includes a significant number of individuals.” And third, that in these cases we’re not talking about “persistent vegetative states”; we’re talking about far more ambiguous and indeterminable states of mind in which it is profoundly unclear whether the person in question is capable of perceiving his state and/or capable of deciding either to refuse or maintain life-sustaining medical treatment. But if, just if, hypothetically speaking, if there were a case where you had a person in a persistent vegetative state and a court or a legislature willing to set aside his or her guardian’s decision to refuse life-sustaining treatment, then the bar for refusal of care would be almost inconceivably high. You’d have to devise extremely specific and detailed advance directives covering all manner of states of injury, in order to provide “clear and convincing” evidence of your wishes. And even then, the possibility would remain that your wishes would not be honored.
As Buckley points out, the California Supreme Court’s “attempt to honor autonomy has the contradictory result that patients who would have refused treatment no longer have the right to have their family assert that choice on their behalf.”
Deep breath. Now for the paragraph that will lose me some friends in the disability community. Yes, our society fears and stigmatizes disability, especially when it entails significant physical or cognitive impairment. Yes, it is entirely possible that some people, upon losing some physical or cognitive functions, might come to understand the value of living with disabilities even though they had once feared and stigmatized disability themselves. Yes, we should resist brutal, utilitarian cost/benefit analyses of the value of human life, especially if we’re living in a society where health care is privatized rather than a social good. Yes, it’s possible that Robert Wendland and Michael Martin themselves had attitudes toward disability that some of us would find undesirable or objectionable. And yes, finally yes, of course yes, the disability-rights critique of these cases is substantially different from the religious-fundamentalist position, and nobody pays much attention to the former while the Culture of Life® gets all the airtime. Nevertheless. It pains me to see disability advocates taking positions that effectively undermine the very ideal of individual autonomy on which so much of the disability rights movement rests. I realize– in fact, I believe I acknowledged at the outset– that “autonomy” is a problematic ideal in these cases, all the more so when you’re dealing with (a) the decisions of surrogates and guardians or (b) persons who would not be able to meet the intellectual standards of “individual autonomy” at any point in their lives. 
But it seems to me the truly liberal imperative here should lead us to honor the wishes of others with regard to their desires to refuse medical treatment when those wishes can be ascertained by a preponderance of the available evidence [UPDATE: no, that’s too low a legal standard – “clear and convincing” is the proper standard, though I still believe that Robert Wendland’s evidence was clear and convincing], and we should likewise defer to the wishes of the legal guardians of incompetent persons, charitably granting them the assumption that they are indeed acting in what they perceive to be their charges’ best interests. And we should do so even when we ourselves disagree with other people as to their own wishes, or their perceptions of the best interests of those whom they serve as guardians.
These cases pose excruciatingly difficult moral questions and involve competing moral imperatives. They also put ordinary and extraordinary people in wrenching emotional circumstances, in case you haven’t already noticed. But amid all the tangle and the murk, this postmodern situational ethics and that neo-pragmatist antifoundationalism, the fundamental right to life and the fundamental right to refuse medical treatment, there is at least one constant, one bedrock principle, one thing that is certain beyond all doubt: Republicans are ghouls, and Tom DeLay is one of the foulest ghouls ever to have blighted our fair land.