Monday, October 24, 2005
A stupid proposal
Do you know any idiots? How about morons, or imbeciles? Retards, perhaps? People riding the short bus?
The first three items were once part of standard terminology in intelligence measurement: “moron” is the most recent of them, having been proposed in the early twentieth century by Henry Goddard. Before the twentieth century, “idiot” and “imbecile” were general insults, as they are today, though they too were once pressed into service as classifications. For those of you who don’t remember those days, “morons” had what we now call “mild” mental retardation, or IQs between 51 and 70; “imbeciles” had what we now call “moderate” mental retardation, or IQs between 26 and 50; and everyone below that threshold, whom we now call people with “severe and profound” mental retardation, was an idiot.
A century ago, “Mongoloid idiot,” for example, was not (as so many people think) a slur. It was a descriptive term, a diagnosis.
Over the past five years, the number of morons and idiots seems to have increased dramatically. Either that or the use of the terms has increased; as you know, it’s sometimes hard for us literature professors to figure out where language ends and nonlinguistic phenomena begin. But to gauge by the state of our political discourse, things are looking pretty grim. On one side you have people compiling lists of the left’s “useful idiots”; on the other side you have people calling Bush a “moron” and drawing cartoons that liken the U.S. to a classroom led by a “special ed” student. And on the Internets, “retard” is common parlance, found on every point of the political spectrum.
It’s not as if I’ve never used the terms myself. The other day, as my poor automobile was minding its own business, just humming along down the highway, it was suddenly set upon by a clump of drivers so reckless and inattentive that I referred to two of them (then in the act of cutting each other off in the left lane) as “idiots.” “You know,” said my co-pilot, “we should probably retire that word one of these days.” She was right, and I admitted as much at the time. “Besides,” I added, “these guys are really assholes.”
After all, dear reader, it’s not as if the English language is hurtin’ for terms of abuse. If you truly believe that someone is acting unwisely or thinking incompetently, you can draw upon thousands of words that speak about performance rather than capacity, which is far more appropriate anyway (as Chris Clarke has eloquently pointed out). That “moron” you revile might just as easily be a jerk, a jerkoff, or a jackass; the “idiot” in the adjacent car or adjacent voting booth might instead be a fool, a wuss, a sap, a chump, a poltroon, a schlemiel, or a patsy. Even as you read these words, thousands of people are just begging to be called scoundrels, prigs, and coxcombs. Why, there’s even an entire Shakespearean Insult Server available online for those of you who want to hurl especially colorful and vivid forms of contempt and contumely upon your adversaries, so there’s really no excuse for failing to take full advantage of the opportunities afforded by this rich and complex language of ours.
If you’re concerned about stigmatizing jackasses, however, on the grounds that you may be likening an innocent beast to a hideous human (or, conversely, figuratively dehumanizing one of your fellow men or women), you can always adopt the more politically correct term “jackass-American,” presuming, of course, that the jackass in question is -American.
So next time you’re fed up with someone and you want to call his or her intelligence or judgment into question, remember: you might be better off with insults that speak to the performance of intelligence or judgment rather than to capacity. This isn’t just a matter of politeness; it’s also a matter of proper English usage. Many, many morons and retards have very good judgment about some matters, whereas many, many ostensibly intelligent people make bafflingly, excruciatingly bad decisions. Why? Because some of them are knaves, and others gulls, and still others hoodlums and miscreants. That’s why.
Thursday, October 20, 2005
Hi everyone! It’s me again, checking in from the depths of Footnote Hell.
Of course, even though footnotin’ is hard work, it’s not all tedium and Googling and visits to the stacks. Not at all! Some of footnotin’ involves real argumentin’, just in a tinier font at the back of the book. And I thought I’d share an example with you this evening, not least because two or maybe three people have written to me to ask me how the book’s going and what it will look like when it appears next fall.
Well, the book is going fine, and when it appears it will almost surely be rectangular. But for those of you who might like a small taste of what one of the more substantial footnotes will look like—and it just happens to follow from yesterday’s post—I’m posting a draft version for your perusal.
Here’s the back story. After disposing of David Horowitz and his like once and for all, and then checking out some of the more widely reported tales of conservative students being persecuted by their Stalinist professors, I get around to explaining what I do in some of my classes. Now, the last time I got together with my editor, on a weekday evening in a midtown restaurant in New York, he flagged the opening pages of the chapter on my postmodernism seminar and said, you might want to watch the mention of Kuhn—because, as you know, there are any number of readers out there who are really tired of humanities professors citing Kuhn and getting him wrong. Likewise with Gödel and Heisenberg on “incompleteness” and “uncertainty.”
As you might imagine, this remark made me violently angry. Yanking the bottle of pinot grigio from the ice bucket to my right, I smashed it on the edge of the table, stood up, and said, “All right, man. I know all about those readers. And I’m as pissed off about sloppy appropriations of Kuhn as anyone. But let me say one thing.” At this point I had drawn the alarmed attention of all the diners-and-drinkers in the place, not least because I was waving the broken bottle around and making random stabbing motions. “I’ll put my reading of Kuhn up against anyone’s. Anyone’s, do you hear me? DO YOU HEAR ME? I’m serious, man—I don’t just go on about ‘paradigm’ this and ‘incommensurability’ that, people. I can take Kuhn’s examples about phlogiston and X-rays and shit, and I can extrapolate them to Charles Messier’s late-eighteenth-century catalog of celestial objects, or the early controversy over the determination of the Hubble constant, or the 1965 discovery of the cosmic microwave background radiation by Penzias and Wilson. GET IT? So don’t mess with my goddamn reading of Kuhn. Any of you.”
There were a few moments of silence, punctuated only by some nervous clattering of silverware. Then a conservatively dressed man in his early fifties got up from a table fifteen or twenty feet away. “People like you,” he said, trying to stare me down, “read Kuhn backwards by means of Feyerabend’s Against Method, and as a result, you make him out to be some kind of Age of Aquarius irrationalist who thinks that scientists run from paradigm to paradigm for no damn reason.” Then he tossed his napkin across the table. “And if you want to deny it, I suggest we step outside.”
Fortunately for that guy, the maître d’ intervened at just that moment, imploring me to “settle this peacefully,” preferably with a footnote to the sixth chapter. And cooler heads prevailed.
So here’s the goddamn footnote already.
the many misreadings of Kuhn among humanists: partly because humanists’ work does not proceed under the same protocols of “verifiability” as those of the natural sciences, our interpretations of Kuhn have been somewhat looser than they should be. It is commonly charged that humanists embraced Kuhn so enthusiastically because he seemed to have undermined the authority and the objectivity of the sciences, and the charge may have some merit; but I believe humanists, as well as social scientists, were attracted primarily to the idea of paradigm shifts as a way of explaining epistemic change (for it is a very good explanatory scheme) and less concerned with what Kuhn calls “normal science,” which, after all, is where all the important paradigm-building and -challenging work gets done. So, for example, humanists tend to overlook the specificity of Kuhn’s examples with regard to the discovery of oxygen or X-rays, not least because we have no direct analogy for Roentgen’s realization that, in the course of his experiments with cathode rays, something was causing a barium platinocyanide-coated screen to heat up across the room.
Because of his emphasis on the importance of “normal science” and the protocols under which it operates, Kuhn is not a relativist; on the contrary, he argues that there is such a thing as scientific “progress,” though he insists that it can only be gauged retrospectively, for it is not proceeding toward any preordained goal. For Kuhn, science is therefore evolutionary in precisely the same sense that evolution itself was evolutionary for Darwin: in an anti-teleological sense.
The developmental process described in this essay has been a process of evolution from primitive beginnings—a process whose successive stages are characterized by an increasingly detailed and refined understanding of nature. But nothing that has been or will be said makes it a process of evolution toward anything. Inevitably that lacuna will have disturbed many readers. We are all deeply accustomed to seeing science as the one enterprise that draws constantly nearer to some goal set by nature in advance. . . .
For many men the abolition of that teleological kind of evolution was the most significant and least palatable of Darwin’s suggestions. The Origin of Species recognized no goal set either by God or nature. Instead, natural selection, operating in the given environment and with the actual organisms presently at hand, was responsible for the gradual but steady emergence of more elaborate, further articulated, and vastly more specialized organisms. Even such marvelously adapted organs as the eye and hand of man—organs whose design had previously provided powerful arguments for the existence of a supreme artificer and an advance plan—were products of a process that moved steadily from primitive beginnings but toward no goal.
T. S. Kuhn, The Structure of Scientific Revolutions, 2nd ed. (Chicago: U of Chicago P, 1970): 170-72. This passage aligns Kuhn quite clearly with philosophers like Rorty, who similarly see human deliberations about things like “justice” in an antiteleological way: though Rorty prefers trial by jury to trial by ordeal, he believes it is fruitless to conceive of this progress in human affairs as proceeding toward some antecedent goal. As we will see later in the chapter, this stance puts Rorty at odds with philosophical foundationalists for whom the idea of an antecedent goal provides a benchmark, a “ground,” for notions of human progress.
In a recent complaint about humanists’ appropriation of Kuhn’s work, Thomas Nagel writes: “Much of what Kuhn says about great theoretical shifts, and the inertial role of long-established scientific paradigms and their cultural entrenchment in resisting recalcitrant evidence until it becomes overwhelming, is entirely reasonable, but it is also entirely compatible with the conception of science as seeking, and sometimes finding, objective truth about the world” (547). Nagel, “The Sleep of Reason,” rpt. in Theory’s Empire: An Anthology of Dissent, ed. Daphne Patai and Will H. Corral (New York: Columbia U P, 2005): 541-52. I agree with this if, and only if, “objective” is understood as “mind-independent,” and (as I will explain in more detail in the course of this chapter) I decline to believe that this standard of “objectivity,” as it pertains to objects like quarks and quasars, can be usefully applied to mind-dependent matters such as justice or anxiety. See, e.g., my entry on “Objectivity” in New Keywords, ed. Tony Bennett, Lawrence Grossberg, and Meaghan Morris (London: Blackwell, 2005): 244-46. Finally, like Kuhn, I see no need to tie this idea of mind-independent objectivity to a teleological idea of human or scientific progress.
Well, now. I trust that solves everything.
I hope some of you are wondering how in the world I get from the world of David Horowitz, George Will, and Sean Hannity to the world of Kuhn and Nagel and then back again, defending liberalism all the way. Because the best way for y’all to find out is to buy the book when it appears.
But first, I have to go finish it. And so back to work.
Wednesday, October 19, 2005
Since I’m on a book-finishing, light-posting schedule this week, I thought I’d offer something I’ve already published—one of my entries from New Keywords, which appeared this past June and is edited by Tony Bennett, Lawrence Grossberg, and Meaghan Morris. I figure it’s perfect for blog material, insofar as it deals with the history of the meaning of the word “objectivity” in just under one thousand words. Almost as much fun as Hideous Oldies, and sure to generate as many comments!
Particularly assiduous readers of this harried blog will remember that I mentioned New Keywords about four months ago and even linked to Blackwell’s .pdf of my entry for “experience.” New Keywords is intended, as its name suggests, as an update on Raymond Williams’ nearly lifelong project. And, of course, I present this little snippet here as part of my nearly lifelong project of trying to demonstrate to skeptical onlookers that “cultural studies” does not consist exclusively of close readings of Madonna and Die Harder.
With that, then, here’s some objectivity (and I really did try to be objective about this):
Objectivity, together with its cognates, objective, objectively, and objectivism, has what might seem to contemporary observers a placid history. There is widespread agreement that the term “objectivity” is synonymous with such things as neutrality, impartiality, and disinterestedness; the objective observer, for instance, is able to give a reliable account of events precisely because she or he has no interest in the outcome and is able to make statements and render judgments regardless of their consequences. Apparently, we have managed to agree about what an objective observer is, even though we usually disagree about whether this or that person has in fact served as an objective observer in any given case.
These disagreements are most noticeable in politics—and, to a lesser degree, in journalism—where charges of partisanship and bias are so common as to give the ideal of objectivity something of a quaint air. Indeed, many politicians seem to work with a definition of “politics” in which “politics” itself is antithetical to “objectivity”; thus it is customary to hear that a “political” consideration puts party and partisan interest above all else, rendering objective assessments irrelevant or unavailable. In this sense of the “political,” one party will oppose something simply because another party has proposed it, without regard for the (“objective”) benefits or drawbacks of the proposal itself.
In journalism, by contrast, most parties agree that reporters should be bound by a code of professional objectivity. But in the US, with its weak public sector and its private ownership of most media, left-leaning critics of the media have long insisted that journalism is in practice conservative insofar as it is owned and operated by large corporate interests, whereas right-wing critics have insisted in return that journalists themselves are tainted by a liberal bias that prevents them from reporting objectively on such matters as race, sexuality, and religion (Chomsky and Herman, 1988; Goldberg, 2001).
What’s curious about the widespread agreement as to the meaning of objectivity in these debates is that the word is one of those rare specimens whose philosophical meaning was once directly opposed to its current meaning. In medieval philosophy the terms “objective” and “subjective” respectively meant what “subjective” and “objective” have denoted in Western philosophy since the C17, and especially since the eC19: the “subjective” denoted those features proper to what we would now call an object and that could be said to exist independently of perception, and the “objective” corresponded to the features of an object as they presented themselves to what we now call the subjective consciousness of an observer. With René Descartes, however, Western philosophy began to associate subjectivity with a perceiving “I”; and since Immanuel Kant, most Western thinkers have agreed to parcel the world into objective phenomena that exist independent of mind, and subjective phenomena that are in one way or another mind-dependent (such as injustice) or wholly attributable to mindedness (such as anxiety).
Subjectivity, then, has come to be aligned with the partisan and the partial, and objectivity with all that pertains to objects as in themselves they really are (in Matthew Arnold’s phrase). One of the central questions for the philosophy of mind in the C19-C20 has accordingly been how to construe the boundary between objective and subjective phenomena, particularly with regard to matters such as color (which may or may not exist independently of our perception of them). Similarly, one of the central questions for moral philosophy has been how to parse out the potential domain and applicability of moral truth-claims, such that sentences like “it is wrong to torture another human being” might be understood to be grounded differently—that is, more objectively—than sentences like “it is wrong to eat pastrami with mayonnaise.” The idea here is that the latter judgment is a mere “subjective” matter of taste, since the eating of pastrami with mayonnaise presumably affects no one but the person eating the sandwich, however much it may offend the sensibilities of everyone else in the delicatessen. The practice of torture, by contrast, is widely felt not to be a simple matter of taste, but rather a serious moral issue calling out for intersubjective forms of agreement that will allow us to condemn torture “objectively,” without regard to who is being tortured or why.
Since the mC19, but especially in recent decades, social theorists have debated whether the standard of objectivity pertinent to the natural sciences, which pertains to things such as quasars and quarks, is appropriate to the social sciences, which involve things like kinship rituals, torture chambers, and parliamentary procedures. Proponents of objectivity in the social sciences claim that neutral, disinterested scholarship is the only medium by which we can obtain reliable knowledge in such fields as history, economics, anthropology, and sociology. Critics of objectivity counter-argue that no observation of human affairs can escape the inevitably human parameters of the observation itself, and that invocations of objectivity with regard to human affairs are therefore (knowingly or not) veils for partisan agendas that do not recognize their own partisanship. Not all critics of objectivity, however, are wont to accuse their opposite numbers of bad faith; some argue more moderately that “objectivity” is merely the wrong term for complex intersubjective forms of agreement. Richard Rorty, for example, has argued in a series of books beginning with Philosophy and the Mirror of Nature (1979) that utterances designated as “true,” whether in the realm of the natural sciences or in the realm of moral philosophy, should be understood not as accurate descriptions of mind-independent objects but as useful claims that have managed over time to “pay their way” (R. Rorty, 1982), thus providing pragmatic grounds for broad agreement among human investigators.
Some moral philosophers claim that Rorty’s position on objectivity amounts to a shallow relativism in which all value judgments are of equal standing. Be this as it may, it can be safely—and perhaps objectively—said, at the very least, that while most people agree that objectivity is akin to impartiality, philosophers continue to disagree strenuously as to whether objectivity is merely another name for human agreement.
Tuesday, October 18, 2005
Light in October
The next two weeks might be rough on the blog. Just letting you know. Next week, I am consumed with Official Committee Business on three different committees, two of which will account for a marathon four-day meeting. Some guys have all the luck! This week, I am frenetically scouring the Internets (and even some “print” material), finishing up all the footnotes to the book whose working title is now What’s So Liberal About the Liberal Arts? And you know what? Footnotin’ is hard, hard work! It’s hard! It turns out that in one kind of footnote, you not only have to find out what actually happened, but you have to provide a “source” for your “information” as well. And in another kind of footnote, you not only have to find out what somebody said or wrote, you have to find out “when” and “where,” too. It’s a lot like bloggin’ and literary alludin’, only harder!
And yes, I know I was finishing this book last fall and then again last spring. But this time I’m really finishing it. Just you wait and see. This is gonna be the most finished book I’ve ever finished.
Of course, it would have been easier to write books the Jonah Goldberg way*:
WANTED: HERBERT SPENCER EXPERT [Jonah Goldberg] I’m working on a chapter of the book which requires me to read a lot about and by Herbert Spencer. There’s simply no way I can read all of it, nor do I really need to. But if there are any real experts on Spencer out there—regardless of ideological affiliation—I’d love to ask you a few questions in case I’m missing something.
But I just couldn’t go that route, dear friends. As you know, I have dedicated much of my life over the past two years to solving the mystery of why there aren’t more conservatives on college faculties. And I think I may, at long last, have discovered an important clue.
I’ll check in when I’m takin’ a footnotin’ break here and there. In the meantime, all hail the mighty White-Stockings of Chicago. And the redoubtable Albert Pujols, too.
* edited with the help of Vance Maverick, in comments
Monday, October 17, 2005
Arbitrary but fun Monday for a change
Oldies radio lies, man.
More specifically, the “oldies” canon, having congealed over the past decade into a reliable rotation of “Bus Stop,” “Spirit in the Sky,” “You’re So Vain,” and such, nicely demonstrates the point—made twenty-odd years ago by any number of literary critics and theorists—that the process of canon formation is inevitably “partial,” in the sense that it does not (and does not attempt to) retrieve the past “as it really was.”
Instead, it presents us with the past as we now like to think it really was. There’s nothing necessarily insidious about this process; it’s not as if Oldies Radio represents history as told by the victors of some global slaughter. Besides, most of the victors, like Norman Greenbaum’s ubiquitous one-hit wonder, survive to this day because they’re really pretty decent little pop songs (or, at the very least, they have a catchy riff and a cool guitar sound that still sounds tolerably cool thirty-five years later). Granted, there are plenty of oldies—think of Seals and Crofts’ handful of contributions to Western Civ—that should be allowed to die a dignified death. But there are hundreds more that have been purged from the Oldies archives altogether. Some, like Paper Lace’s hideous “The Night Chicago Died,” have a ghostly existence as “oldies novelty” tunes, the kind of thing you have to hear every five or six years just to wonder what the hell people were thinking. Hiding behind the oldies novelty tunes, however, is a vast legion of cultural dreck that no Oldies station will touch—even though it once ruled the charts.
Sure, “Tie a Yellow Ribbon” was a horror. And it was the number one song of 1973. But what is there to say of “Say, Has Anybody Seen My Sweet Gypsy Rose?”—Tony Orlando and Dawn’s followup single, which wound up as number 34 of the year? Or, God help us, “Who’s in the Strawberry Patch with Sally?” No oral or written language known to humankind can adequately express the profound and promiscuous badness of these songs. Likewise, Gilbert O’Sullivan is justly renowned for having written the world’s most bathetic tune, “Alone Again (Naturally)” (as one critic put it, “the worst potential influence on the direction of pop music since Tiny Tim”). But how many of us remember—or care to remember—that we were subsequently treated to three or four more “hits” from O’Sullivan, each of which was even worse (though, of course, not more bathetic)?
You don’t believe me? Fine. Then you deserve this:
Told you once before, and I won’t tell you no more
Get down, get down, get down
You’re a bad dog baby
But I still want you around.
You give me the creeps, when you jump on your feet
So get down, get down, get down
Keep your hands to yourself
I’m strictly out of bounds.
Don’t make me quote O’Sullivan again. You’ll regret it.
Similarly, Helen Reddy’s bizarre, groundbreaking portraits of women with mental illness (“Delta Dawn,” “Ruby Red Dress,” “Angie Baby”) have been wiped from our collective public memory, together with Bobby Sherman’s neo-existentialist “Easy Come, Easy Go” and Bo Donaldson and the Heywoods’ searing antiwar anthem, “Billy, Don’t Be a Hero.” And while this Funes-like blog is more or less content to call to mind Looking Glass’s “Brandy,” an inoffensive piece of pop fluff that wound up at number 12 for 1972, who, I wonder, will dare to put in a good word for Wayne Newton’s “Daddy Don’t You Walk So Fast” (number 10 that same year) or Mouth and MacNeal’s “How Do You Do” (number 25) or Daniel Boone’s “Beautiful Sunday” (number 42)?
The selectiveness of the Oldies Canon is understandable enough. All of us (that is, all of us of a certain age) want to believe—and want others to believe—that we were listening to “Brick House” in ’77 when, in fact, we were being subjected to Leo Sayer’s “You Make Me Feel Like Dancing” five or six times a day (a classic Paradoxical Song, in Janet’s famous phrase, like Orleans’ “Dance With Me” insofar as it is utterly—nay, rigorously—undanceable). Even worse, if we were to be reminded of the existence of Gallery’s “It’s So Nice to Be With You” or Cher’s “Dark Lady,” we might realize that we were not merely subjected to these songs but, in fact, fond of them. And then we would not be able to face ourselves, now, would we?
(Don’t get me wrong—I wasn’t fond of that crap. Not me! Along with the rest of the seventh grade, I was hoppin’ and boppin’ to the Crocodile Rock. Which, I believe, was recorded by the Velvet Underground.)
So here’s today’s Fun Game. What’s your favorite example of an Oldie Too Hideous to Acknowledge? Extra points will be awarded to suggestions that carry with them an obvious tinge of remorse (for example, I’ve always thought that Helen Reddy’s cover of Leon Russell’s “Bluebird” was perfect for her voice, so all my Helen Reddy examples above are tinged by remorse-by-association). And extra extra points will be awarded to suggestions so hideous that they derange the entire thread.
Sunday, October 16, 2005
Ann Arbor, MI (Rooters) – In a cliffhanger game extended by “Big House” Republican leaders until they won, the Michigan Wolverines defeated the Penn State Nittany Lions by two points on Saturday, 27-25.
“Sometimes you just need a little extra time to get things done,” said new House of Representatives majority leader Roy Blunt about the controversial call that gave Michigan two extra seconds on their final-minute, game-winning drive. Michigan quarterback Chad Henne then threw a touchdown to receiver Mario Manningham with one second remaining. “Last week we had to make an important giveaway to big oil, and the House of Representatives needed almost forty extra minutes, beyond the five we’d allotted, in order to keep changing the vote until we won,” Blunt said. “This weekend the Michigan Wolverines needed only two extra seconds. That’s not such a big deal, in the end. The point is that people have to be flexible about these things until we fix them just the way we want them.”
No one has been able to account for the additional two seconds. Late in the final quarter, with Penn State leading 25-21, a completion from Henne to Carl Tabb near the left sideline got the Wolverines to the Penn State 32-yard line, but Tabb couldn’t get out of bounds before being tackled by Justin King. The Wolverines then called timeout with 28 seconds left. After some deliberation, the officials put two seconds back on the clock. Penn State head coach Joe Paterno said he did not receive an explanation for the added time.
“What can I do?” he said later. “There’s nothing I can do about it.”
The previous play, a 17-yard pass from Henne to Jason Avant, was also questionable. Television replays showed Avant’s heel landed out of bounds, but the Wolverines got the next play off before the officials could review it.
Nittany Lion fans in the stands chanted “shame, shame, shame” as the final tally was announced.
Democratic leader Nancy Pelosi also complained, saying the proceedings brought “dishonor to the Big House.”
“I don’t see why the losers are making such a fuss,” replied Rep. Fred Upton (R-Michigan). “We haven’t built a new refinery in a generation. We need more. And if we have to dangle a few of our colleagues out of their office windows by the ankles for forty minutes in order to change a vote from 210 yes - 214 no to 212 yes - 210 no, so be it. That’s just the way the House works.”
“Likewise,” Upton continued, “when Penn State plays in our house, they play by our rules. And our rules say the game isn’t over until we win. That’s just the way the Big House works.”