Coda: The Human in the Humanities

My first semester of grad school was kind of a wreck: I was constantly sick, my nerves were bound tight with anxiety, and my back and wrists were in pain from the Soviet-era metal chair-desks in a basement classroom. None of this was helped by the ideological distress I found myself in. Two pieces of scholarly advice found their way to me that semester and still linger: one, there’s no such thing as the human condition; and two, your graduate program will tear you apart and remake you in its image.

A photo of a metal classroom chair with tiny desk attached at the armrest.
The chairs were still the worst part, though.

In the classroom, I mentally conceded the probable truth of the first one. My undergrad philosophy classes taught me that we have no good definition of “human.” And the conditions people live in vary so radically that there can’t really be a universal one: the Elizabethans understood the world’s workings quite differently than do the Mosuo or a New Yorker, and attempts to insist on one ideal understanding usually end up serving some hegemonic worldview to the exclusion and oppression of others. That didn’t stop the statement from messing with my heart, though.

You won’t be surprised to learn that I had recently graduated from a Jesuit college, and “the human condition” is a big part of Ignatian philosophy. My best friend and I had lofty aspirations of studying “the human condition” through literature in grad school; I still amuse myself by correctly identifying Jesuit-educated students and priests by their use of the phrase in discussions and homilies, respectively; and Christ’s entering “the human condition” through the Incarnation is the foundation of Ignatian imaginative contemplation, my graduate research, and my personal aesthetic. To be told that “the human condition” is inherently meaningless was like being told that J.K. Rowling’s prose is mediocre, only worse: both statements may be true, but I still love the object that they discredit — and “the human condition” informed my life and work more deeply and for far longer than Harry Potter.

A photo of tree-lined sidewalk leading to a redbrick academic building, which features a statue of a priest over the entry doors and a clocktower topped with a cross. The trees are bare but there is no snow on the grass.
Le Moyne College on a rare snowless day in winter.

As imposter syndrome set in and I attempted to impress my professors and fit in with my classmates by mimicking their interests and ideologies, I began to darkly wonder if there was some degree of truth to the second statement, too. As I gained confidence in my ideas, my professors were all wonderfully supportive of my research, even at critical moments of doubt, but I still felt strangely disembodied from those ideas. They were necessarily available, even susceptible, to outside influences in the name of getting a job, which could range from something as benign as entering them into a critical discourse I was unenthusiastic about to something as disheartening as avoiding theories that are no longer trendy.

Not until I took a summer creative nonfiction workshop with the magnificent Minnie-Bruce Pratt did I realize that this compulsory refashioning had nothing to do with my program, but with the state of English-language literary studies. I spent two weeks reading first-hand accounts like Toni Morrison’s Playing in the Dark: Whiteness and the Literary Imagination, in which Morrison exposes the subtle racism of American literary tradition not in the form of a journal article, but of a personal reckoning with that history. I spent three weeks writing in the first person about the body of Christ, the woman’s body, and the queer body not in the form of a seminar paper but in the form of a series of anecdotes and meditations steeped in medieval and Renaissance mysticism. I found myself applying my research to my life in ways that made the Early Moderns come alive — as we exchanged good-byes, classmates from diverse religious backgrounds told me how fascinating and important my research was, having encountered it in this genre.

The greyscale cover of Toni Morrison’s book Playing in the Dark. Morrison holds a giant floppy hat. A gold sticker proclaims that the book won the 1993 Nobel Prize in Literature.
Fantastic book, by the way: accessible first-person literary criticism. Highly recommend.

Creative nonfiction enabled me to communicate my ideas — shaped by research and critical writing — with a public upon whom they had material impact. My ideas became my own again: I had a personal investment in recovering historically obscured understandings of gender and the body, not only to locate the essential value of the queer and the female bodies in Catholicism but also to share old ways of embodying queerness and femininity that are relevant today. In creative nonfiction, my first-person voice had credibility, purpose, and an audience who otherwise wouldn’t or couldn’t have access to this knowledge.

Radical queer and feminist scholarship is somewhat better at this, leveraging the personal narrative as a source of knowledge and an act of inquiry. To assert a self in English (and, I’d wager, biology, history, math, or information studies) is to assert that you are not the implied raceless, genderless, classless entity interested only in books, but that you instead have an investment in disrupting the status quo. This trickles down into policing how we frame our inquiries: we teach our students not to use the first-person because the personal isn’t credible, and we apply the same principle to our critical essays. Consequently, I have no idea why most of my colleagues study what they do: I assume they all love literature, but if that were their only motivation they wouldn’t still be suffering through grad school. If the English scholar speaks, it is only through the voice of their subject of study, and tentatively: papers on nuns I identify with, on devotional poems that resonate with me. Our research overwhelms our selves, and obscures its own real-life applicability. And so we get accused of navel-gazing and being out of touch with reality:

Nothing like some anti-intellectual sentiment to kick-start one’s drive to inform the public.

So maybe there isn’t a single human condition, but that doesn’t mean studying the humanities can’t improve the conditions of some humans. If my experience with creative nonfiction is any indication, one of the most meaningful ways to connect with those outside the academy is to acknowledge our own subject positions, explicitly recognizing the self in order to humanize the humanities. This is what I’ve tried to do here. But now it’s your turn:

Why do you study what you do? Why do you work where you do? Who are you?

A painted full-length portrait of a nun sitting in a library, paging through a book; she wears a large icon of the Annunciation over her breast.
Also, Sor Juana Inés de la Cruz is just objectively rad.

Ashley O’Mara (@ashleymomara | ORCID 0000-0003-0540-5376) is a PhD student and teaching assistant in the Syracuse University English program. She studies how Ignatian imagination and Catholic iconology shape representations of sacred femininity in Early Modern devotional writings. In her down time, she writes creative nonfiction and snuggles her bunny Toffee.

 
