Scholarly Truth and the Hunger for Progress
There is a certain presumption in addressing an audience of scholars on the future of scholarship, and since my area of competence is far narrower than the range of the topic may suggest, a personal confession seems in order. I teach mainly British and American literature. Occasionally I also teach courses in a nondepartmental humanities curriculum dealing with political thought and moral philosophy from the late seventeenth to the mid-twentieth century. I am a critic and interpreter more than a scholar in the textual or archival sense. The probity and accuracy of my work depend on the prior work of such scholars, and for selfish as well as public-minded reasons, I want their research to go on and their support to be strengthened. But I have in mind today a broader sense of scholarship: a word that goes back to the Middle English scolere and is associated with the compound activity of learning and teaching.
Recent books by foundation heads, education administrators, and professors have instructed us on the dominance of the STEM (science, technology, engineering, and mathematics) disciplines in university budgets and their disproportionate attraction for ambitious students, on the diminishment of the place of the physical library as a haunt of the learned, and on the eclipse of the liberal arts ideal by vocational training and the transmission of "skill sets" in institutions that once would have scorned such a demand.1 All these circumstances go to shape our predicament, but I am concerned rather with a question about human personality. What are the protocols of scholarship doing to the people who learn whatever skills we have to teach and who become teachers in their turn? A distinct character and temper have always been required for scholarly work, and many conditions of the modern university seem to militate against the perpetuation of that character and temper. The remarks that follow, therefore, will focus on the academic environment, though I recognize a danger in conflating "scholarly knowledge" with the knowledge conveyed by universities and the people who inhabit them. Scholarship is bigger than that, of course, and harder to delimit, but universities are the institutions that matter most. In the United States, we could hardly imagine the vocation of scholarship without them.
A preliminary statement of aims for this conference alluded to "the tension between autonomy and accountability." What does this mean? The scholar possesses an individual mind, and yet the work of the scholar needs checking: what John Locke called the process of "rectification" by other minds. This is naturally done by persons well acquainted with the scholar's work—they are his or her relevant colleagues at a distance, and they may have been consulted in the course of the work itself. Such exertions of accountability are also performed, in ever more minutely calibrated ways, by a machinery of oversight that may take the form of separate advisory committees, in the divisions of the humanities, the social sciences, and the natural sciences. Those committees then look for guidance and rectification to external referees, who are asked to rank a given scholar's work against the background of other work to which it can be properly compared. At any of the major research universities, referees are asked to employ the measure of the hypothetical or actual world-class scholar, alongside whose typical achievements every other contribution to knowledge must be rated. For teachers in smaller liberal arts colleges, a broadly similar index is likely to be invoked, perhaps with a heavier emphasis on student evaluations.
"Every idiom has its idiot," as the novelist Peter De Vries wrote (2014, 108), and one advantage of the present system is that idiocy is decreased. Few persons who bear the name of scholar now command a distinctly uncommunicative idiom that baffles students and colleagues alike. As this was a common trait of the old, embarrassing, cartoonish image of the scholar, its loss is undoubtedly our gain. A desirable sort of conformity is thereby fostered, and the regulated currency of a discipline offers assurance that an agreed-on body of knowledge is being referred to. But there has been a corresponding loss of something valuable—a quality of idiosyncrasy or eccentricity—which is itself a major element of individuality in scholarship, as it is in life. The eccentric may move in overlapping regions with her colleagues without aspiring to share their center and circumference. In a promotion review, I once heard a younger colleague's work depreciated on the ground that it was "orthogonal to the interests of the field." It failed to go with the flow, it crossed the familiar and recommended currents at a right angle, and this was offered as a decisive criticism. But could it not equally have been a heterodox kind of praise? How much cooperation and conformity should we strive for in a line of work we say we have undertaken in pursuit of truth?
Most people who come into possession of some power—academic life differs little in this regard from the corporate world and politics—are apt, once their prestige is high enough, to suppose their thoughts are congenial to the truths that matter at the moment. The disciplines I am familiar with (the interdisciplinary fields included) are good at rewarding the scholar who travels in a fashionable groove. It may be tediously respectable, like rational choice theory, or it may carry avant-garde credentials, like queer theory, but sooner or later the groove becomes a rut. All the while, those who advance on the recommended lines are saying to a generation of novices, "Defy us at your peril." Young scholars are thus advised by their elders to aim at corporate approval, at the sacrifice of the difficult thing that intellectual self-knowledge may be. There has never been a more desolate title of a scholarly essay than "The Way We Think Now."
________
Public trust in the academy has been waning for more than a generation, and in that time, the public support of academic scholarship has been under an almost constant threat of drastic cuts. The response has been for universities to adapt themselves to the corporate mold, in the evident belief that this will render their physical presence and products less forbidding. Has this adjustment led to an increase in the public esteem for scholarship? I suspect the reverse is true. Our financial betters may work in corporations, but they don't always love the places where they work. Do they like us better for trying to resemble those places? As late as the 1950s, a great many colleges and universities were like monastic institutions (relatively speaking)—both in the look and feel of the campus, and in the habits of the faculty. Is it entirely good that they have become more like consultancies or the specialized units of a research park?
As accountability has been bureaucratized inside the university itself, scholarship has also come under the suspicious eye of cost-benefit accounting performed by external agencies. A pharmaceutical company that donates money to a new program in chemistry expects some return, however indirect; the Department of Defense is interested in theoretical physics because it may cash out in retina scanners to be deployed at dangerous borders; the civil rights division of the Department of Education believes racial diversity is an educational value, and government funds may be supplied or withdrawn as a college backs that belief in practice or falls short. Accountability may press so far against autonomy—the freedom-of-mind of the scholar—that "tension" seems too weak a word to describe the divergence. Scholarly autonomy, if it means anything, means that the individual scholar, answerable to no judging body higher than his scholarly peers, is able to give the law to himself concerning the nature of the work and how it is to proceed.
To the extent that universities are something other than a boutique for varieties of professional training, the humanities have always been at the heart of university education. And the judging body to which the scholar of the humanities is accountable first of all is the body of significant survivals of thoughts and imaginings from the past. T. S. Eliot observed in "Tradition and the Individual Talent" that a tradition "cannot be inherited, and if you want it you must obtain it by great labour" ([1919] 1964, 4). A good deal of the labor of the humanities goes into allowing a tradition (or some sufficiently representative understanding of a tradition) to be passed on to students. You do not inherit American literature by being an American and deciding to write a novel in English. You may come to understand a good deal about the American imagination and participate in its tradition by taking courses that assign generous readings of Edwards, Franklin, Jefferson, Paine, Madison, Emerson, Poe, Whitman, Thoreau, Douglass, Lincoln, Melville, Dickinson, Twain, Henry James, William James, Dewey, Stephen Crane, Eliot, Frost, Stevens, Moore, W. C. Williams, Veblen, Du Bois, Dreiser, Stein, Hemingway, O'Neill, Hart Crane, Faulkner, Cather, Ellison, Tennessee Williams, Bellow, Elizabeth Bishop—and let us stop there, because one of the great and gentle things about tradition is that it does not steal from youth their possession of their contemporaries as a kind of value precious to themselves and not yet settled. One bad thing about the humanities in our time is that the machine of academic legitimation has stolen this pleasure from students. The loss of contemporary culture as a free zone comes from an understandable attempt by professors to reenter the ranks of the young by teaching instant classics hot off the press or the plasma screen.
There is progress in the natural sciences. Or let us say: there is measurable advance to a subsequent stage of interest, whose legitimacy is tested against nature, without prejudice to the discoveries of the previous stage. But there is no progress in the arts, and there is no progress in the humanities. This is a truth that we who study the arts are reluctant to repeat—partly owing to sheer embarrassment at the want of a catchall justification, but partly too from causes connected with funding. The social sciences sought credibility, with some success, by imitating the natural sciences, and the humanities, since 1975 or so, have tried to follow by imitating the social sciences (see Cohen, Bromwich, and Stocking 1988). The second of these imitations may involve a larger error than the first, and among its causes has been a failure of intellectual candor. It was supposed until recently that the central subject of humanistic study was the mind and conduct of individuals, while science examined persons, animals, and things in groups. No one has ever argued the necessity of a "control group" to check the interpretation of a novel; we cannot even imagine what such a contrivance would look like, whereas a scientific experiment will hardly carry credit without a control to test the independent variable.
The humanities by their nature are backward looking. A proper defense ought to acknowledge the fact and go on to affirm that the knowledge of the past belongs to a discipline of becoming fully human. William James, in his essay "Reflex Action and Theism," developed this idea with great richness and subtlety, and you will pardon me for the length of the quotation that follows, because it is a matchless train of thought on education itself and the almost physiological demand that produces the morale of the scholar. "The conceiving or theorizing faculty," says James,
functions exclusively for the sake of ends that do not exist at all in the world of impressions we receive by way of our senses, but are set by our emotional and practical subjectivity altogether. It is a transformer of the world of our impressions into a totally different world,—the world of our conception; and the transformation is effected in the interests of our volitional nature, and for no other purpose whatsoever. Destroy the volitional nature, the definite subjective purposes, preferences, fondnesses for certain effects, forms, orders, and not the slightest motive would remain for the brute order of our experience to be remodelled at all. But, as we have the elaborate volitional constitution we do have, the remodelling must be effected; there is no escape. The world's contents are given to each of us in an order so foreign to our subjective interests that we can hardly by an effort of the imagination picture to ourselves what it is like. We have to break that order altogether,—and by picking out from it the items which concern us, and connecting them with others far away, which we say "belong" with them, we are able to make out definite threads of sequence and tendency; to foresee particular liabilities and get ready for them; and to enjoy simplicity and harmony in place of what was chaos. Is not the sum of your actual experience taken at this moment and impartially added together an utter chaos? The strains of my voice, the lights and shades inside the room and out, the murmur of the wind, the ticking of the clock, the various organic feelings you may happen individually to possess, do these make a whole at all? Is it not the only condition of your mental sanity in the midst of them that most of them should become non-existent for you, and that a few others—the sounds, I hope, which I am uttering—should evoke from places in your memory that have nothing to do with this scene associates fitted to combine with them in what we call a rational train of thought,—rational, because it leads to a conclusion which we have some organ to appreciate? We have no organ or faculty to appreciate the simply given order.
([1896] 1956, 117–18; emphasis in original)
In this wonderfully evocative passage, James is speaking of the artifice by which we construct an order from the external world that comes to us advertising no inward order of its own. Every unity is created by us and not simply given by an existing state of things. And for that reason we are bound to be creative in our most commonplace acts: we refine our understanding of the world by pulling together certain data and stimuli, by associating certain sights with certain smells for example, but the faculty by which we do this is not mechanical but constructive and inventive. We work on nature as artists of everyday life and realize—that is, we make utterly real to ourselves—the aspects and conjunctures of the world that our experience wants to use. This is not a matter of experiment but rather of function and survival.
James goes on to demonstrate for his reader just how alien the world would become if we subtracted from it this figuration and shaping by the mind. The random and miscellaneous order of events in the world at any given moment, says James,
is an order with which we have nothing to do but to get away from it as fast as possible. As I said, we break it: we break it into histories, and we break it into arts, and we break it into sciences; and then we begin to feel at home. We make ten thousand separate serial orders of it, and on any one of these we react as though the others did not exist. We discover among its various parts relations that were never given to sense at all (mathematical relations, tangents, squares, and roots and logarithmic functions), and out of an infinite number of these we call certain ones essential and lawgiving, and ignore the rest. Essential these relations are, but only for our purpose, the other relations being just as real and present as they; and our purpose is to conceive simply and to foresee. Are not simple conception and prevision subjective ends pure and simple? They are the ends of what we call science; and the miracle of miracles, a miracle not yet exhaustively cleared up by any philosophy, is that the given order lends itself to the remodelling. It shows itself plastic to many of our scientific, to many of our aesthetic, to many of our practical purposes and ends.
([1896] 1956, 119–20; emphasis in original)
Turn back to the central sentence in which James addresses our ability to learn about the world: "we break it into histories, and we break it into arts, and we break it into sciences; and then we begin to feel at home." On this view, academic study, indeed all intellectual inquiry, is far from the perverse or peculiar exertion the citizens of a business civilization take it to be. It is what we are born doing and compelled to do, and what scholars have the luck to go on doing more freely than other people. By science, of course, James means knowledge generally, and he knows that the breaking of it into histories and arts and sciences is arbitrary to a degree. It does not follow that the truths we are seeking are contingent or relative or infinitely malleable. Our point of entry is arbitrary only because it comes from the intensities of concentration that have moved us to look closely at something outside ourselves and to convey exactly what we thought we could see. That a thing happened or did not happen, that it occurred in this way and not in that—these facts matter to the imagination. It is a terrible mistake to suppose that accuracy is the enemy of imagination.
Let me give an example of a point of entry and a path of discovery from my own experience: an in-school occurrence that happened by accident. As an undergraduate, I read Perry Miller's great essay "The Marrow of Puritan Divinity" and learned about the unhappy awareness, among the second generation of New England Puritans, of their falling-away from the rigorous faith of the first generation. About the same time, I read H. R. Trevor-Roper's essay "The European Witch-craze of the Sixteenth and Seventeenth Centuries," which traced the literalism of belief in the devil in Christian Europe and the sprees of panic and credulity by which whole communities were turned against themselves or their neighbors. Some years later, teaching an introductory course in American literature, I thought I could see in Hawthorne's tale "Young Goodman Brown" certain signs of the generational self-reproach that Miller had diagnosed, as well as the fever-panic of mass psychology whose history Trevor-Roper recounted. From reading Marion Starkey's book The Devil in Massachusetts, a history of the Salem witch trials, I also knew that the people whom Hawthorne's protagonist encountered on his hapless journey through the forest bore the actual names of accused witches in Salem. So it seemed to me this famous story was not what commentators often took it to be, namely a fanciful tale of the haunting of a mind deformed by a native self-distrust. "Young Goodman Brown" seemed to be, instead, an ironic analysis of a particular historical predicament—the conscious waning of the faith of an entire community—which had produced a unique collective injustice in American colonial history.
Now, these elements were present together only because of reading I had done at different times and the assignment to teach a difficult story that invited allegorical reading. My interpretation didn't arrive as the payoff of supervised research. It was possible because of the convergence of pieces of knowledge drawn from various interests and earlier readings. As it turned out, others better equipped to do justice to a historical view of the story were working at the same time on similar lines, and some such view is now close to common acceptance, thanks to their labors (Colacurcio 1984, 286–304). But in a small way here I was a sharer in a discovery. It changed my idea of Hawthorne, and of some other things: the morale of early New England, the capacity of human nature for self-deception.
I relate this anecdote as typical rather than exceptional. It represents the kind of intellectual work we all do when we break up the given world into histories and arts and sciences. Sometimes we recombine the elements unexpectedly. To say it like that may make this sound like a parable of the blessings of interdisciplinary study, but in fact I do not think a single scholar who claims to occupy two fields or who belongs to a cooperative Arbeitsgruppe has better odds at such a discovery than anyone else. And to underline a detail that bears out my argument for the long duration of scholarly tradition: the historical essays that I was inspired by had been published a decade earlier and four decades earlier. They were not cutting-edge. When, therefore, I hear a scholar of the humanities say with disdain about a recently published piece, "this could have been written 30 years ago"—when the remark signifies not that the argument has been made before, or that it is false, but only that it pursues interests eccentric to us now—I know I am listening to someone who thinks scholarly work is perishable in roughly the same way as a line of sports cars that has been replaced by a subsequent line.
________
The convergence of fashion with the desire to imitate science has recently produced in my field the idea of "distant reading." Close reading was a name for the use of the whole mind to interpret the verbal texture of great writing, with a readiness to look into individual words, phrases, sentences, paragraphs. It was a method that critics like Johnson and Coleridge had certainly known and practiced, but it had never been formally recognized as having special value. Mid-twentieth-century criticism brought that recognition. By contrast, the innovation of distant reading was made possible by the facility of computers at scanning thousands of texts and comparing the frequency of the use of certain words and their proximity to other words. Computers can do it with a thoroughness impossible to a human reader. The results have included a network theory–inspired diagram of the characters of Hamlet, and such inferences as the definition of a protagonist as "the character that minimized the sum of the distances to all other vortices" (Schulz 2011). And yet Hamlet is a document of human experience and not a sum of vectors and vortices, and to know what to look for, even in the word counts, you would have to know, for example, something about the shadings of words and the likely presence or absence of humor at a given moment—a kind of practical wisdom the computer will not yield.
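(For readers curious what the computation behind that definition amounts to, here is a minimal sketch in Python. It treats a play as a graph whose vertices are characters and whose edges join characters who share a scene, then picks out the vertex with the smallest summed distance to all the others—the quantity the network-theoretic definition invokes. The character names are real, but the edges are invented for illustration; nothing here reproduces the diagram or data that Schulz describes.)

```python
# Toy illustration of the "distant reading" metric discussed above: call the
# "protagonist" the character whose summed shortest-path distance to every
# other vertex is smallest (equivalently, the highest closeness centrality).
# The edge list below is a made-up miniature, not data from any actual study.
from collections import deque

edges = [
    ("Hamlet", "Claudius"), ("Hamlet", "Gertrude"), ("Hamlet", "Horatio"),
    ("Hamlet", "Ophelia"), ("Hamlet", "Ghost"), ("Claudius", "Gertrude"),
    ("Claudius", "Polonius"), ("Polonius", "Ophelia"), ("Polonius", "Laertes"),
    ("Laertes", "Ophelia"), ("Horatio", "Ghost"),
]

# Build an undirected adjacency list.
graph: dict[str, set[str]] = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def distance_sum(start: str) -> int:
    """Sum of shortest-path distances from `start` to every other character."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return sum(dist.values())

# The character minimizing the total distance is the "protagonist" on this definition.
protagonist = min(graph, key=distance_sum)
print(protagonist, {c: distance_sum(c) for c in sorted(graph)})
```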
The drive to imitate the social sciences has led to this sort of thing. One might have detected a similar motive in the vogue of post-structuralism in the 1980s, and the influence a decade later of the discourse of "power/knowledge." The best antidote for such impersonal excess is to have a friend or a close associate who is intelligent without being academic. But that is a piece of luck denied to many scholars, and the result is an intellectual parochialism that starts in graduate school and may last the length of a career. All their contact with things of the mind comes from professionals like themselves; and in such closed company, it becomes impossible to translate any idea out of the prevailing jargon—the professional surround establishes both the language and the content of all we can know of things of the mind. The consequence is a loss of good sense and mother wit.
A further cause of inhibition and self-limitation in the humanities has been the rise of an academic avant-garde. The very idea of an academic avant-garde may seem paradoxical. After all, in the nineteenth century there was the academy, in which painters like Couture and Delaroche and Bouguereau flourished, and then there was the avant-garde, constituted by rebels like Courbet, Manet, and Pissarro: the one excluded the other. Yet the humanities have fostered an avant-garde that wields official authority inside the academy. This is hard for outsiders to notice, and in the case of recent scholarship, I believe it has become almost invisible to the academicians themselves. The sociological fact remains hidden in some measure because their judgments in favor of their own work adopt a language that mixes scientific and aesthetic shading: it is "pathbreaking," "groundbreaking," but it is also the bearer of "new methods," "productive hypotheses," "research programs" that explore untapped resources of knowledge. I suspect that an unarticulated wariness of this mixture of scientific signaling and aesthetic stance is what underlies much of the public bewilderment about the work universities are subsidizing today. I have in mind the reasonable questions that remain once we have allowed for and written off sheer philistinism, which in this context I would define as a distrust of learning for learning's sake. Yet it is a curious feature of the academic avant-garde that it, too, distrusts learning for its own sake. Scholarship is valued instead for the progress it may bring in the arts and in the improvement of society.
At this point, a reader may rightly demand examples of the tendency I have described. I draw mine from a recent article by Lisa Ruddick, "When Nothing Is Cool," in the Chicago journal The Point (Ruddick 2015). For research toward a book on the state of academic literary study, Ruddick interviewed 70 graduate students and noticed that among them, two groups seemed to have trouble adapting to the expectations they encountered in the field. The first group, she says, "bridle at the left-political conformity" and related attitudes from the "culture wars." The second group—the more interesting and the more recent phenomenon—"say that something in this intellectual environment is eating them alive." Ruddick quotes the head of an academic coaching service who, remarking the same phenomenon, spoke of it as "an immorality they can't put their finger on." I would describe this immorality as a defection from the work of judgment in an area where human knowledge, including knowledge from personal experience, ordinarily enters into judgment. (One could almost say that such knowledge from experience wants to enter and cries out when it is blocked.)
Ruddick believes on good evidence that this reaction has something to do with "the game of academic cool"—a game that has engendered norms "that make ruthlessness look like sophistication"—and she cites a widely appreciated article on the horror-thriller The Silence of the Lambs (Halberstam 1991). The psychotic murderer in that movie, Buffalo Bill, tortures his female victims before killing them, and wears a coat made of their skins. In the article, this character is said to be the bearer of a posthuman aesthetic; his violence is found to disclose the truth that gender itself is "always posthuman, always a sewing job which stitches identity into a body bag." For the professor who wrote the article, the metaphor of the coat, among other details, clinched the argument that we have all undergone "'a historical shift' to an era marked by the destruction of gender binaries and 'of the boundary between inside and outside'"—an era in which "not only gender but also 'identity … prove only to be skin deep.'" Ruddick concludes with words that my own professional experience confirms:
I believe that when a scholar traffics in antihumanist theories for purposes of professional advancement, his or her private self stands in the doorway, listening in. When it hears things that make it feel unwanted … it can go mute. I have spoken with many young academics who say that their theoretical training has left them benumbed. After a few years in the profession, they can hardly locate the part of themselves that can be moved by a poem or novel.
The connection is immediate here between the sacrifice of individual scholarly identity and the anxious placation of the demands of careerism. The branch of the humanities that has gone furthest to make a fetish of "resistance" and "transgression" has itself sponsored protocols of intellectual conformity as thoroughly routinized as any in academic history.
Nonconformity has become hard to carry off in the choice of a subject as well as the mode of scholarly treatment. Two senior scholars whom my department was interested in hiring told me at different times in their visits to campus that they had looked over the list of all the dissertation topics of all the graduate students and none of them was any good. How did they know? Or rather, what did they mean? Good for what purpose, and to what end? I took the dismissal to mean that these PhD candidates were out of step with the way we think now and they would not land high-prestige jobs any time soon. It is easy to see how such unchecked arrogance, whether in a department or in the professional organization identified with a field of study, could lead to effects that are oppressive for the individual scholar.
________
As a test of the present climate, let us compare the professional and bureaucratized manner I have been describing—a way of thought that aims to carry the torch for advanced thinking in our time—with the inward limitations of American higher education as it appeared a century ago. By then, private colleges and the state and land-grant colleges had taken hold with sufficient solidity to make an all-around picture of higher education plausible. Santayana left us a picture of that different climate in his lecture of 1911 on "The Genteel Tradition in American Philosophy." (Philosophy here he takes to denote academic culture generally and the intellectual ethos of the universities.) Santayana called the reigning philosophy of 1911 "transcendentalism," and he drew an ironic contrast between its remoteness and abstraction and the uninstructed pagan self-assurance of the business civilization. "What has happened," wrote Santayana,
is that the hereditary philosophy has grown stale, and that the academic philosophy afterwards developed has caught the stale odor from it. America is not simply, as I said a moment ago, a young country with an old mentality: it is a country with two mentalities, one a survival of the beliefs and standards of the fathers, the other an expression of the instincts, practice, and discoveries of the younger generations. … This division may be found symbolized in American architecture: a neat reproduction of the colonial mansion—with some modern comforts introduced surreptitiously—stands beside the skyscraper. The American Will inhabits the skyscraper; the American Intellect inhabits the colonial mansion.
(2009, 4)
Little of this description could fit the academic environment today. For a few years in the 1990s, the Graduate Center of the City University of New York was housed in a skyscraper with a view that would flatter the pretensions of a firm like McKinsey or Ernst & Young. Nevertheless (and it feels strange to admit this), our environment in other ways remains as remote as ever from the common morale of American society, and the effects of the estrangement now threaten a loss of confidence in education as well as a weakening of self-confidence among scholars.
But can the humanities be supposed to assist the utilitarian and instrumental values that recommend the natural sciences to society? Hume, in his essay "Of Essay Writing" ([1742] 1985), divided the possible milieux of intellectual argument or discussion into what he called the learned world and the conversable world. The learned world encompassed the sphere of laboratory experiment, the testing of hypotheses in areas like physics and natural history (then on its way to becoming biology) and chemistry (still on the way to being born). Men of science in this sense—and Hume could feel sure they were always men—desired no contact with the literate public and would have found such exposure rebarbative: so small a proportion of that public could begin to understand what the arguments had at stake. On the other hand, for Hume, discussions of taste, morals, and politics belonged to the conversable world. This included the liberal arts in the freest and most general sense; and the knowledge necessary for someone to join the discussion was intricate, no doubt, but it could be acquired by a literate person without a great deal of tutorial help. The conversable world, as Hume saw it, was led by women, who in fact understood the arts of conversation better than men; and if the discoveries of the laboratory were ever to be conveyed to citizens whose suffrage and opinions must count in a free commonwealth, this would have to happen through the exertions of the conversable world.
You will see where I am heading. The humanities have long supplied our best approximation of a conversable world, and people come to know that world most commonly through higher education. Yet the terms of "the conversation"—for us, a somewhat slack all-purpose word implying discussion, disputation, argument, conciliation, mediation, and therapy—have become foreign to the interests of the public at large. The terms, in fact, are set in significant measure by the academic avant-garde, which has defected from its conversable function by adopting the manners of the learned. The exception proves the rule. We do have a small and familiar body of academicians, the sort of people the British call "media dons"—in the fields of politics and history preeminently, but not only there—who in articles and interviews produce commentary that is directed to persons categorically less well-informed than themselves. They talk down to a polite audience. They are playing the role of facilitators and middlemen that Virginia Woolf characterized with unappeasable scorn in her essay "Middlebrow." By contrast, those who work and teach in the learned world—that is, in the natural sciences—for their part have retreated farther inside the laboratory and have found it expedient to entrust a great part of their teaching to novices who often lack the most rudimentary skills of pedagogy.
These are intramural defects, some of them well known and perhaps on the way to solution, others clearly getting worse. There is no tendency, however, that has done more harm to the universities and the ethics of scholarship than the academic legitimation of identity politics. This is a form of anti-intellectualism that fosters an evasion of responsibility for ideas. It mandates and enforces severe limitations on any conversation about ideas, in obedience to what are supposed the impermeable determinants of ideas—the boundaries of nation, race, and religion, all the things that together now go by the collective name of "culture." Yet the strictures of cultural identity, as promulgated in the schools and in much current scholarship, are founded on a self-contradiction. We say that we are trying to teach a power of intense sympathy, or what most people now call empathy, but empathy literally means feeling your way into a life different from your own. The doctrine that we can only know and only be permitted to speak for our sort cuts away the very basis of empathy, and makes a forsworn hope of the study of society and culture itself. The abstemious practice that follows from this theory has cheated many students of the pleasure of thinking hard about something besides themselves—an experiment in thought that is almost a necessity of life for an intelligent person, at least after adolescence. The grownups who first proposed and then built up or acquiesced in the addition of new layers to the identity bureaucracy have much to answer for. They have discouraged the inquisitive instincts that nourish education quite as effectually as the funding giants who demand to know the cash value of every course a student might take. Think about the fallacy involved. In candor, we ought to concede that it is almost as hard for a young black person as it is for a white person to imagine what it was like to be a slave. In the next generation, the same will be true for a young Jew trying to imagine what it was like to be an inmate in a concentration camp. No one has automatic access to imaginative truth by the mere virtue of racial or religious membership. A terribly wrong lesson is being taught.
The surrender of the right to speak with authority outside one's own ascriptive group has led to the weakening of other freedoms. Civil liberties were still mainly a liberal cause in the 1960s and 1970s when I was in college. It pains me to recognize that this is no longer so. Few left-liberals whom one meets under the age of 40 see anything wrong with speech codes; words like "hurtful" and "insensitive" come to them easily in characterizing modes of speech that fall far short of denigration, insult, or threat. What caused this extraordinary change? Herbert Marcuse in 1965 broached a theory of the deceptive practice that he called "repressive tolerance" (Wolff, Moore, and Marcuse [1965] 1997). The argument asserted that liberal democracy might be tolerant in its facade, but the sheer range of the available opinions in "the marketplace of ideas" actually served to conceal and exempt from critique the narrow range of views that were counted as legitimate. Marcuse recommended a left-wing counteraction against the tolerant facade that protects the liberal-industrial consensus. A radical movement of the time, he proposed, with students in the lead, would be justified in enforcing a salutary repression of wrong opinions within their milieu and indeed within their political control as far as it extended. Such a purified pattern of speech and communication might become the germ of a later and more truly tolerant society.
This idea has come down to us in diluted form. The university, it is said, ought to be an epitome of what is best in society, and to accomplish that aim it must purge itself of all traces of what is worst in society. The demand falls in with the argument for salutary repression: the necessary regimen of reform is understood to incorporate programs of reeducation, commonly supervised by extra-educational authorities, which are described as "training." All this is done in the name of community. In the week just previous to writing these sentences, I learned that Princeton University has recently constructed an array of "Affinity Spaces" for students from diverse cultural backgrounds, and Harvard has launched a university-wide task force on how "to advance from diversity to belonging." Community is the watchword here, the value almost automatically cited in support of bureaucratic interventions in the name of expanding and enriching education. And yet, there is no value more depressing to the morale of a community of scholars than the imperative of "belonging." It is one more injunction—now uttered from far above the classroom, the laboratory, or the archive—to conform to the way we think now.
Is thinking generally done in the first-person plural? Many have thought otherwise. "The intelligence is defeated," Simone Weil wrote, "as soon as the expression of one's thoughts is preceded, explicitly or implicitly, by the little word 'we.'" Corporate identity, whether belonging to a department, a profession, or any of the little platoons in society, has always been eager to enlist a thinker, against her own strongest instincts, to adhere to the program of the group, whether the reason given is social progress or brute loyalty. It was in defiance of just this call to synonymize oneself with other people that Emerson, in his oration "The American Scholar," defined the scholar as the person who has the good fortune to live with integrity. Emerson was reacting against clichés of solidarity, heated by nationalist sentiment in his day, which resemble the appeals to cultural identity and solidarity that we hear today. Yet he made his argument explicit against the separation of thinking into subdivisions of labor commanded by distinct professions:
Man is not a farmer, or a professor, or an engineer, but he is all. Man is priest, and scholar, and statesman, and producer, and soldier. In the divided or social state, these functions are parcelled out to individuals, each of whom aims to do his stint of the joint work, whilst each other performs his. The fable implies, that the individual, to possess himself, must sometimes return from his own labor to embrace all the other laborers.
(Emerson 1971, 53; emphasis in original)
The scholar is thus pictured in Emerson's argument as a figure standing out against the background of partial citizens. They have suffered reduction and disfigurement in the name of "commodity," which is really a nickname for money. They cannot escape their necessary compromise. The scholar forms the lucky exception and ought to make appreciable use of his good luck:
The state of society is one in which the members have suffered amputation from the trunk, and strut about so many walking monsters,—a good finger, a neck, a stomach, an elbow, but never a man. … Man is thus metamorphosed into a thing, into many things. The planter, who is Man sent out into the field to gather food, is seldom cheered by any idea of the true dignity of his ministry. He sees his bushel and his cart, and nothing beyond, and sinks into the farmer, instead of Man on the farm. The tradesman scarcely ever gives an ideal worth to his work, but is ridden by the routine of his craft, and the soul is subject to dollars. The priest becomes a form; the attorney, a statute-book; the mechanic, a machine; the sailor, a rope of a ship. … In this distribution of functions, the scholar is the delegated intellect. In the right state, he is, Man Thinking.
The nearly monastic poverty of young scholars Emerson takes to be an effect of self-sacrifice, which may be comfortless—he calls it a cross—and so he is impelled to ask: "What is the remedy?" And he gives this answer: "They did not yet see, and thousands of young men as hopeful now crowding to the barriers for the career, do not yet see, that if the single man plant himself indomitably on his instincts, and there abide, the huge world will come round to him. Patience,—patience."
________
Among the most troubling facts for the support of patience in scholarship has been the sheer speed of the new technologies—including the pace with which each new version displaces the last—and the metabolic changes that cannot fail, in consequence, to occur in the susceptible mind. All the powers of modern culture and the modern state combine to lengthen the distraction. And yet, it takes time for good work to sink its roots. In the dissertations I have supervised or had a hand in supervising, it has sometimes startled me to discover a scholar whose first chapter struggled to find the definition of the topic, or even to clear three consecutive sentences unobstructed by a dutiful adherence to jargon, breaking through at last to a discovery from late rereading and close thinking. This is a blessing of the library, too, for it is the book next to the one you thought you wanted that proves to be the book that matters. In the humanities, where rightness can seldom be proved by logic or quantitative evidence, but where nonetheless there are many ways of being wrong, the strength and durability of a project come from cultivating the virtue of wanting to tell the truth. And, strange to say, that also takes time.
I began by quoting William James on the instinct for organizing a chaotic mass of sensations and experience: this was the ability, he said, out of which we make for ourselves a comprehensible world. Let me conclude with another passage from James, from a talk entitled "The Social Value of the College-Bred." Addressing the American Association of Alumnae at Radcliffe College in 1907, James laid it down as an uncontroversial proposition that the humanities alone distinguish American colleges from the technical or professional schools, and he said plainly what I have been trying to suggest by selective history and allusion: that the knowledge of a deep past humanizes. It gives us a consciousness of something other than ourselves and larger than ourselves.
But I would go further. The knowledge of the past creates a sort of civic good that a modern society based on convenience and technique cannot otherwise deliver. Here is how James puts it:
You can give humanistic value to almost anything by teaching it historically. Geology, economics, mechanics, are humanities when taught with reference to the successive achievements of the geniuses to which these sciences owe their being. Not taught thus, literature remains grammar, art a catalogue, history a list of dates, and natural science a sheet of formulas and weights and measures.
(1987, 1243)
James supposed that judgments of value, as well as the study of what is valuable, lay at the heart of such humanizing knowledge: "the sense for human superiority ought, then, to be considered our line, as boring subways is the engineer's line and the surgeon's is appendicitis. Our colleges ought to have lit up in us a lasting relish for the better kind of man, a loss of appetite for mediocrities, and a disgust for cheapjacks" (1244). This learning of a relish for what is to be valued in artists and scientists, and in the productions for which both are responsible, is what James calls "the sifting of human creations" (1243). Do we, as scholars, still believe that this labor of judgment is consistent with the democratic requirement of tolerance? So long as we agree that liberty of thought and discussion is not the enemy of enlightenment, I don't see that we have a choice.
David Bromwich is the Sterling Professor of English at Yale University. He has written extensively about British Romanticism, modern poetry, and the rhetoric of political persuasion. His books include American Breakdown: The Trump Years and How They Befell Us (2019), How Words Make Things Happen (2019), and Writing Politics: An Anthology (2020).
NOTE
1. See, for example, Derek Bok, Universities in the Marketplace: The Commercialization of Higher Education (Princeton, NJ: Princeton U. Press, 2004); William G. Bowen, Higher Education in the Digital Age (Princeton, NJ: Princeton U. Press, 2013); Alvin Kernan, ed., What's Happened to the Humanities? (Princeton, NJ: Princeton U. Press, 1997); Bill Readings, The University in Ruins (Cambridge, MA: Harvard U. Press, 1996); Andrew Delbanco, College: What It Was, Is, and Should Be (Princeton, NJ: Princeton U. Press, 2012).