Andrew Karns, JHU:
Are the humanities suffering a “crisis” in the modern American university? Amid harsh budget cuts, humanities departments have been forced into deep soul-searching about their place in the university’s future. Fareed Zakaria, a well-known columnist for the Washington Post, has argued that the shift toward STEM education in the United States has become “obsessive.” As he points out, even high-profile politicians have begun to express this view: “Is it a vital interest to the state to have more anthropologists?” asked Florida Governor Rick Scott. “I don’t think so.” The American academic future seems to be economically, intellectually, and politically geared toward the hard sciences.
Could this all mean that we simply need fewer people specializing in the humanities? James Turner, a self-styled “historian of the humanities,” argues that by and large, what are today known as academic disciplines in the university, natural sciences included, all emerged historically from the study of philology. That so few students today even know what the word means may itself be part of the problem. As Turner explains, when the first universities appeared in Europe during the Middle Ages, philology (the compilation and criticism of texts) as we know it today emerged from debates over precisely how to criticize and analyze new information, a debate that is by no means a stranger to our own time.
What, then, are the humanities for? Most university departments would settle for a relatively safe, catch-all position, defining humanistic study along the lines of “studying the expression of human thought.” Hard-liners would reduce this to teaching how to properly study what cannot be quantified. If, however, we were to measure the progress of the humanities the way we tend to measure the development of the natural sciences, what exactly would “progress” look like? Scientific breakthroughs are relatively easy to identify historically: the heliocentric model of the cosmos, Einstein’s theory of relativity, and, more recently, massive information processing through computing. What, though, would count as a “humanistic breakthrough”?
Asking such a question yields some interesting responses, and it also points to equally fascinating historical developments. The invention of the printing press in Europe, the discovery of the Indo-European language family, and Freud’s “discovery” of the human subconscious all had a profound and unprecedented impact on humanistic fields of knowledge. Is such a model of progress compatible with, let alone sufficient for, today’s research-driven university? That question is difficult to answer.
In a more practical and immediate sense, these debates reflect differing understandings of the university model itself. Indeed, the debate can reach antagonistic levels. Students of the humanities often scoff at their hard-science peers, who may well spend a decade “overspecializing” in their field (a neurosurgeon can “only” perform brain surgery, after all). Those inclined to the natural sciences, in turn, may counter that humanistic disciplines “study” very little at all: one doesn’t need to understand the physics of sunsets to appreciate their beauty, just as one doesn’t need to know the history of cinematography to tell a good movie from a bad one.
Both views are oversimplified and misleading, yet they reveal the underlying tension between different understandings of the university system. The university can be thought of as a center of learning and erudition regardless of academic discipline, yet there is an alternative, prevalent view of higher education that places the university-as-business at its center.
The Economist paints a puzzling picture of American higher education. Though American universities lead the world in the publication of scholarly papers and in research funding, their students consistently rank poorly on international standardized tests that measure “numeracy and literacy.” In contrast to their European and Asian peers, an astonishing 45% of American undergraduates show no measurable gains in academic achievement during their first two years of university. These figures emerge amid surging demand for university education and exponential growth in tuition costs. The Economist attributes the puzzle to the fact that “the market for higher education, like that for health care, does not work well.” The money is in the research, not in the learning. Universities naturally turn to more “profitable” research ventures, pouring millions into new laboratories and equipment while neglecting humanistic research, which, while fundamentally different from research in the natural sciences, is just as vital to the university’s academic mission.
Perhaps this “crisis of the humanities” is simply a symptom of a larger crisis in higher education. These tensions have opened debates over whether it would be more appropriate to divorce the two functions of the university (career preparation on the one hand, research and learning for their own sake on the other) or to integrate them more closely. While humanities majors certainly find careers in the job market, their paths are far less well-defined than those of STEM graduates, and even learning-through-research curricula have faced constant pressure to find sources of funding. Is this shift a simple consequence of history, or a failed approach to education? The answer is undoubtedly complex, and though there can be no single, concrete resolution to this issue, one important step in the right direction would be to encourage a better-informed debate over the future of the university.
For better or for worse, new “trending” majors such as Gender Studies or Environmental Studies no longer reflect traditional, historical bodies of scholarship; they are instead viscerally attached to the demands and problems of our present time. Majors in English, History, or Art (or, even worse, in Letters) tend to be read in today’s job market as degrees for being “generally educated,” in spite of their long and robust histories as academic fields of knowledge. Yet Turner himself proposes that their versatility may in fact be their strength, and that the humanities should embrace a cross-disciplinary model of learning in order to integrate intensive learning and practical application more closely. “Today’s humanities divisions are not ancient, integral modes of knowledge,” he writes. “They are modern, artificial creations—where made-up lines pretend to divide the single sandbox in which we all play into each boy’s or girl’s own inviolable kingdom. It is a sham.” For Turner, we have fallen too deeply into a game of titles and labels, and we should not neglect a great body of knowledge that was, for most of its life, indistinguishable from what we now call the natural sciences.
Still, there may be much more than just “making better people” at the heart of the humanities. As Steve Jobs declared when he first unveiled the iPad, “it’s in Apple’s DNA that technology alone is not enough—that it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our hearts sing.”
Disclosure: The author is a student of the uncomfortable gray areas that make up the social sciences.
For James Turner’s fascinating book on the humanities, see James Turner, Philology: The Forgotten Origins of the Modern Humanities (Princeton: Princeton University Press, 2014).