What Makes a Story

The Secret Life of Stories: From Don Quixote to Harry Potter, How Understanding Intellectual Disability Transforms the Way We Read by Michael Bérubé is out today! This excerpt appeared in The Chronicle of Higher Education.


—Michael Bérubé

My kids are no longer kids. One is in his late 20s, one in his early 20s; Nick, the firstborn, was a “gifted” child, capable of copying drawings of medieval European hill cities at the age of 5; Jamie, his brother, has Down syndrome. Jamie also has an encyclopedic knowledge of sharks and the music of the Beatles, and an astonishing memory. Both of them are natural narrative theorists — though because of the differences in their capacities for abstraction, I wound up testing their narrative theories in different ways.

I learned almost as soon as Nick could talk that he loved my stories; he even gave them numbers, though I never did learn his classification system. Nick liked stories with drama — the story of how the hockey-camp bus left Tallinn without me in 1972; the story of how the camp counselor threw me out of the pool; the story of my first day in first grade, when the teacher corrected me for saying I was 6 when I was still only 5 (a situation that got worse in the following years, after that teacher decided to skip me into the second grade because of my reading skills). But as you might imagine, sometimes Nick’s appetite for stories became wearisome. I read to him every night, and I told him stories about people in my life, and I even made up some stuff. But one day when he was about 3 and a half, on a long car trip in the Midwest, he asked for story after story. And finally I decided to conduct a little experiment.

“All right,” I sighed. “I have a new story for you. It goes like this: Red. Yellow. Orange. Blue. Violet. Green. Black. Brown. White. … ”

“Daddy!” Nick interjected, annoyed. “That is not a story.”

“No?”

“No! It is just a bunch of colors.”

“And a bunch of colors is not a story?”

“No! A story has to have things in it.”

“Ah,” I replied, Phase 1 of the experiment complete. “A story has to have things in it. You are right. OK. Tree. Cloud. Sunshine. Water. … ”

“No, no, no,” Nick insisted, more annoyed. “Things happen in a story.”

“Fair enough,” I acknowledged. “The tree blocked the cloud. The sunshine reflected off the water. The flowers grew. … ”

“Dad!!” Nick interrupted, even more annoyed. “That is not a story either.”

“But things happen in it,” I pointed out.

“But you are not telling why they happen.”

Eureka. We were very close, at this point, to E.M. Forster’s famous dictum, “‘The king died, and then the queen died’ is a story. ‘The king died, and then the queen died of grief’ is a plot.” In a story, things have to happen for a reason.

Jamie learned his alphabet before he started kindergarten at age 6; he learned to read at a first-grade level by the time he was 8. Along the way, he somehow taught himself the American Sign Language alphabet and a few simple ASL words by imitating the pictures he saw in a Sesame Street book devoted to the subject. But even though Jamie had developed a profound love of sharks, barn animals, and the movie Babe (just like any number of children his age), he didn’t really understand stories as stories. He had an amazing recall of individual scenes, particularly scenes that involved pratfalls, and he was able to repeat most of the dialogue from the exchange in which Ferdinand the Duck tells Babe the Pig why he wants Babe to go into Farmer Hoggett’s house and steal the Hoggetts’ alarm clock from the side of the bed. But he didn’t understand how much of the movie’s plot is predicated on that scene (the issue is whether animals can avoid being eaten by humans if they demonstrate that they are “indispensable,” and Ferdinand has decided that he will crow like a rooster each morning in order to stay alive), nor did he understand what it might mean for a plot to be “predicated on a scene” in the first place. Until he was 10, Jamie enjoyed narratives almost as if they were a series of entertaining vignettes.

Then, in 2001, we took him to see Harry Potter and the Sorcerer’s Stone. We feared that the film might be a bit too much for him to take in — from the opening scenes of Harry the Abject Orphan to the climactic sequence in which Voldemort’s baleful spirit speaks from the back of Professor Quirrell’s head, with all the plot miscues and misdirections along the way (most of which point to Snape rather than Quirrell as the malevolent force in Harry’s world). But we were most pleasantly surprised to find that he got it — and not because he himself had glasses just like Harry’s, not because he dreamed of going to Hogwarts himself. He roundly dismissed all comparisons between himself and Harry. Rather, he got it because he loved the story — and he loved talking about it for weeks, even after he’d seen the film three or four times. Impressed, I asked him if he’d like to read the book on which the film was based, and he responded with hand-rubbing glee.

Thus began his — and my — adventures with J.K. Rowling’s plots, and Jamie’s fascination with the intricacies of plotting. A critical index of Jamie’s increasing sophistication as a reader was that he became increasingly capable of (and delighted by) making thematic connections that enriched his understanding of his other favorite narratives. In the course of our reading of Half-Blood Prince, we came upon an extended flashback/exposition in which the young Professor Dumbledore visits an 11-year-old Tom Riddle in the orphanage in order to inform him that he is a wizard and extend him an invitation to Hogwarts. Jamie gasped at Tom’s arrogant reaction to Dumbledore’s invitation, and, despite his fatigue, stayed awake for another couple of pages.

But before we got to that point, I read the following passage: “The orphans, Harry saw, were all wearing the same kind of grayish tunic. They looked reasonably well-cared for, but there was no denying that this was a grim place in which to grow up.” I decided to say a few words about the orphanage, and about Harry’s odd, complex moment of sympathy for the friendless boy who grows up to become Voldemort. “Did Harry have a happy childhood when he was growing up?” I asked. Jamie shook his head no. “He had the Dursleys,” he said. I pointed out that Harry and Voldemort are similar in that they grow up without parents, and that the kids in the orphanages are there because they have no parents either. I added that Jamie might remember the orphanage in the film Like Mike, which was in the “heavy rotation” section of Jamie’s DVD collection for a while.

“Or Free Willy,” Jamie suggested. “Yes, that’s right,” I said with some surprise. “Free Willy is also about a kid who is growing up without parents, and who has stepparents, and he has trouble getting used to his new home.”

“Or Rookie of the Year,” Jamie said. “Not exactly,” I replied. “In Rookie of the Year, Henry has his mother, but his mother’s boyfriend is a creep, and we don’t know where his father went before he was born.”

“Star Wars too. There’s Luke,” Jamie said. “Good one! Great example!” I cried. Perfect, in fact. Star Wars is like a class reunion of the West’s major mythological motifs.

“Mrs. Doubtfire,” Jamie offered. “Nope, that’s about parents who are divorced and live in different houses,” I said, “but still, in Mrs. Doubtfire the father misses his kids and wants to see them, so he dresses up as a nanny.”

“What about Babe?” Jamie asked.

“Oh yes, that’s a very good example,” I told him. “Babe has no parents, and that’s why he is so happy when Fly agrees to be like his mother.”

“And Rex is like his father,” Jamie added. “And Ferdinand the duck is like his brother.”

Why, yes, Ferdinand is like his brother. This had never occurred to me before. But who knew that Jamie was thinking, all this time, about the family configurations in these movies? And who knew that Jamie knew that so many unhappy families, human and pig, are alike?

But my children, adept narrative theorists though they be, are not my only inspiration. I’m also informed by years of conversations with colleagues in disability studies, including two wholly unexpected encounters that subtly but decisively widened the parameters of how I could think about how an understanding of intellectual disability can transform the way we read.

The first encounter happened at the 2011 Modern Language Association convention in Los Angeles, and in retrospect is merely amusing — though at the time, it seemed like the stuff of professors’ anxiety dreams. I was on a panel titled “Narrative and Intellectual Disability,” chaired by Rachel Adams, a professor of English and comparative literature at Columbia University. I was trying to get at the question of how narrative irony works when it involves a character with an intellectual disability, a character who is rendered explicitly as someone who is incapable of understanding the story he or she inhabits. Steinbeck marks Lennie in this way from Of Mice and Men’s opening scene:

Lennie looked timidly over to him. “George?”
“Yeah, what ya want?”
“Where we goin’, George?”
The little man jerked down the brim of his hat and scowled over at Lennie. “So you forgot that awready, did you? I gotta tell you again, do I? Jesus Christ, you’re a crazy bastard.”
“I forgot,” Lennie said softly. “I tried not to forget. Honest to God I did, George.”

And just as Lennie does not understand where he is going or why, so too will he not understand what is going to happen to him in the book’s final pages; in that sense, his intellectual disability provides the structure for the narrative irony, and the narrative irony defines the novel. Lennie knows not what he does, and we know he knows not what he does. But I mentioned Of Mice and Men only in passing, opening instead with Benjy Compson of The Sound and the Fury and proceeding to a comparison between Elizabeth Moon’s The Speed of Dark and Mark Haddon’s The Curious Incident of the Dog in the Night-Time, because Haddon provides an ingenious (and quite moving) solution to the problem of writing a novel in the voice of a character who (initially) does not understand the narrative he is in, whereas Moon has to skirt that problem by giving us a second level of narrative focalized through characters who do not have autism and who can explain what is at stake in the unfolding of the narrative told by the character who does have autism.

At the last minute, one of my fellow panelists had to pull out of the convention, and Rachel Adams informed us that Rob Spirko, a senior lecturer in English at the University of Tennessee at Knoxville, would substitute instead, with a paper titled “The Human Spectrum: Human Fiction and Autism.” Rob preceded me on the program — and proceeded to deliver a paper about The Speed of Dark and The Curious Incident of the Dog in the Night-Time, making many of the points and citing many of the passages I had hoped to highlight in my paper. As I listened to Rob, I toyed with the idea of taking the podium and saying simply, “My paper is what Rob said,” but just then, he made an offhand reference to the “Rashomon-like” narrative sequence in Philip K. Dick’s Martian Time-Slip. I snapped to attention: This seemed to me to be something worth discussing. I had not written anything about Martian Time-Slip in my paper, but I had recently read it and was still trying to figure out what to make of its extraordinary strangeness. And I was thrilled to be able to discuss it at a conference with Rob Spirko, who has worked on disability and science fiction for some time.

I did wind up delivering most of my original paper; Rob’s arguments and mine did not overlap completely. But I threw in some extemporaneous remarks about how the narrative sequences in Chapters 10 and 11 of Martian Time-Slip are not, in fact, Rashomon-like. If they were, they would involve four characters telling the same story from drastically different perspectives, narrating significantly different sequences of events, such that the very idea of “the same story” becomes untenable. But they don’t. Instead, they open by telling the same story almost word for word, and then proceed into disturbing fantasias that cannot be attributed to any one character — even though each character, the following day, feels the aftereffects of the sequence as a whole. The sequence is not merely “about” the perspective of a character with an intellectual disability; it represents an attempt to render intellectual disability in the form of a disabled textuality that cannot be attributed simply to any one character’s mental operations.

And with that realization, thanks to the offhand remark of a last-minute replacement speaker giving a 15-minute paper at the MLA convention, I had a critical piece of my argument — a way of talking about intellectual disability and narrative that did not begin and end with the discussion of whether X character has Y disability.

As for the second encounter: To say that it was “wholly unexpected,” as I have done, is actually an understatement. It was pretty much the last thing in the world I might have imagined. It involved a whimsical decision to join Facebook (after years of steadfast, principled resistance) and, relatedly, to go to the 40th-anniversary reunion of my sixth-grade class (not a happy place for me when I was 10, but I thought that the details of tween angst of 1972 were not worth recalling in 2012).

Within a few days of joining Facebook, I was hailed by one Phyllis Anderson, née Phyllis Eisenson — someone I had not thought about in almost 40 years. And in the course of striking up a conversation with this person 40 years after graduation from PS 32 Queens, I happened to mention Madeleine L’Engle’s A Wrinkle in Time, about which I had just been reading; 2012 was the 50th anniversary of the book’s publication, and in 1972 at least half of our cohort had read it. And Phyllis was of course (or so I imagined) Meg Murry, the very smart girl with the long hair and prominent glasses. This rudimentary identification was complicated a bit by the fact that I identified more with Meg than with any male character in the book — with her sense of isolation, helplessness, and vulnerability above all — and by the fact that I did not stop to think that I might be inadvertently saying to my former classmate, “I remember you — you were the girl with glasses and what’s more, everybody thought you were a weirdo.”

But in the course of that brief Facebook conversation, as we caught up on partners and kids and professions (literature professor, clinical psychologist), I mentioned that I was sitting at a table across from Jamie, who had just handed me a list of 25 kinds of sharks. To which Phyllis replied, “I am sure you did not know that my brother is autistic.” Well, I could have plotzed. Needless to say, I did not know that; I did not know anything about this person, starting with the fact that she had a brother. And a brother with autism, in the 1960s (Andy Eisenson was born in 1957).

Oh my goodness, I thought at once, what that must have meant for her mother — to have a child with autism at the precise historical moment when autism was being attributed to “refrigerator mothers.” How difficult that must have been. Followed almost immediately by, oh goodness, and then along comes this very smart girl, the younger sister. I can’t even begin to imagine the family dynamics, except that no, wait, yes I can. And the story quickly went still deeper: Sylvia Eisenson, Phyllis’s mother, was in fact a psychologist in the New York City school system. She was A.B.D. from the University of Illinois at Urbana-Champaign, where I taught for 12 years. She knew very well what Bruno Bettelheim was doing with Leo Kanner’s refrigerator-mothers theory, and had actually written to Bettelheim to tell him that his work was destructive to loving parents. She received a reply from an underling, telling her to get therapy.

All this I learned in the course of one Facebook chat, which somehow went from “oh yes, I remember you” to serious familial and emotional matters in the course of a few minutes. Life in the disability community can be like that; I remember a conference at which someone introduced herself as the parent of a child with Down syndrome and we wound up talking about our then-teenaged sons’ desires for friends, especially girlfriends, within 10 minutes. Because there is so much shared terrain, casual conversations can suddenly turn into serious discussions of special-needs trusts and the ethics of getting your child or sibling to sign over his power of attorney. And then, a few days later, after Phyllis had gone back to look at her copy of A Wrinkle in Time, she wrote:

just read the first chapter of “Wrinkle”, and Meg is described with her glasses and braces and general awkwardness. And I thought — that is at least partly why I liked this book so much — there I am, though without the spunk to duke it out with the kid who said something mean about my brother.

When I read that note I had yet another “oy, what did I say” moment: Oh yes, I remember you with your glasses and braces and general awkwardness? (It turns out the braces came later. I did not remember any braces.) But it was the last clause that grabbed me. No doubt young Phyllis Eisenson, or anyone with a sibling with an intellectual disability, would read Wrinkle with that inflection, with a sense of protectiveness for the more vulnerable family member: Wasn’t this one of the lessons we learned in Reader-Response Criticism 101?

From one angle it is a rudimentary point, a truism: Of course we all bring to every text the welter of experiences, associations, encounters, and intertextual relations we have accumulated over the years. Reader-response criticism made much of this rudimentary point for much of the 1970s, with earnest Critical Inquiry forums on whether readers or texts make meaning, whether meaning is determinate or indeterminate, and whether the hypothetical “Eskimo reading” of Faulkner’s “A Rose for Emily” can be ruled out of court.

But from another angle, this exchange seemed (and seems) to me to open onto a principle of considerable breadth, one that has not yet been considered by literary criticism influenced by disability studies. It is the complement to the Rob Spirko-induced insight that disability in literary texts need not be located in, or tied to, a specific character with an identifiable disability: It is the Phyllis Eisenson-induced insight that disability in the relation between text and reader need not involve any character with disabilities at all. It can involve ideas about disability, and ideas about the stigma associated with disability, regardless of whether any specific character can be pegged with a specific diagnosis. This opens the field of criticism considerably; and I am going to insist that this is a good thing, not least because I am determined to cure disability studies of its habit of diagnosing fictional characters.

My argument is that even as disability studies has established itself in the humanities in a way that was unthinkable 20 years ago, it has still limited itself to too narrow a range of options when it comes to literary criticism; and though I am (obviously) being facetious about the idea of “curing” disability studies of anything, I am quite serious about the conviction that disability studies limits itself unnecessarily, as a new branch of criticism and theory, whenever it confines itself to determining the disability status of individual characters. Disability studies need not and should not predicate its existence as a practice of criticism by reading a literary text in one hand and the DSM-5 in the other — even when a text explicitly announces that one or more of its characters is (for example) on the autism spectrum. It is not that a character’s condition is irrelevant to how we read him or her; rather, I want to argue that we should avoid the temptation to think that a diagnosis “solves” the text somehow, in the manner of those “psychological” interpretations of yesteryear that explain Hamlet by surmising that the prince is, unbeknownst to himself, gay.

In opening the question of the potential relations between disabled and nondisabled characters (and readers’ potential relations to those relations) so that it includes characters who are merely presumed to be intellectually disabled by their fellow characters (such as Coetzee’s Michael K, from Life & Times of Michael K, and Friday, from Foe), we come to recognize intellectual disability not merely as the expression of somatic/neurological symptoms but as a trope, a critical and underacknowledged thread in the social fabric, a device for exploring the phenomenon of human sociality as such.

This is not merely a matter of remarking that the idiot and the holy fool offer strategic insight into human hierarchies and the contingency of systems of value, though it is partly that; it is also a matter of gauging how literary works depict systems of sociality in part by including characters who either are, or who are presumed by their fellow characters to be, constitutively incapable of understanding or abiding by the social systems by which their worlds operate. When you think about it that way, and you should, you realize that we have only just begun to explore the implications of disability for our understanding of the narratives we read and the narratives we live by.

Michael Bérubé is the Edwin Erle Sparks Professor of Literature and director of the Institute for the Arts and Humanities at Pennsylvania State University. He is the author of The Secret Life of Stories: From Don Quixote to Harry Potter, How Understanding Intellectual Disability Transforms the Way We Read (NYU Press, 2016).

The Difference a Mutant Makes

—Ramzi Fawaz

[This piece originally appeared on Avidly.]

Like any good origin story, I’ve told this one a thousand times: The first comic book I ever read was X-Men #80, the 35th Anniversary issue of America’s most popular comic book series, which, for over three decades, had narrated the lives, loves, and losses of a band of mutant outcasts gifted with extraordinary abilities because of an evolution in their genetic makeup. It was 1998. I was thirteen years old and the cover of this single comic book issue was a young gay boy’s dream: a shiny pink hologram with a tower of dazzling disco-attired superheroes exploding before one’s eyes.

Growing up in a queer family, sibling to a gay brother, and bullied to tears on a daily basis for my own exuberant gayness, the words “A team reunited…a dream reborn” emblazoned on that cover spoke to me of the promise and possibility of queer kinship and solidarity in the face of all odds. Above all, what struck me about that cover was the sheer variety of characters depicted—how could a man made of steel, an intangible woman, a white-haired weather goddess, a butch teen girl with bones sticking out of her skin, and a teleporting blue elf be any kind of team? Who were these people, I wondered, and what kind of dream did they share?

Like so many readers of the X-Men over the decades, I was drawn to no character more than the weather goddess Storm, a Kenyan immigrant to the U.S., the first black woman superhero in a mainstream comic book, and, by the 1990s, the X-Men’s team leader. In that same anniversary issue, at a low point in the team’s battle with an imposter group of X-Men, Storm rallies her bruised and beaten comrades by reminding them that what defines their bond is a set of shared values, a chosen kinship maintained through mutual love and respect, not by force or expectation. With my budding left-wing consciousness on one side, and my attachment to queer family on the other, I fell in love with this fictional mutant goddess and her team: this was the kind of community I longed for. How did it come to be that a thirteen-year-old Lebanese-American, suburban gay boy found common cause with an orphaned, Kenyan, mutant, immigrant X-Man?

If one were to try to answer this question by turning to recent public debates about superhero comics, we might put forward the answer: “diversity.” Yet this term and its shifting meanings—variety, difference, or representational equality—would have rung false to my thirteen-year-old ears. It was not simply the fact of Storm’s “diverse” background as Kenyan, immigrant, woman, or mutant that drew me to her, but rather her ethical orientation towards those around her, her response to human and mutant differences, and her familial bond with her fellow X-Men. These were qualities significantly shaped by her distinct differences, but not identical to them. This was not any traditional idea of diversity, then, understood as the mere fact that different kinds of people exist. Rather, what Storm and the X-Men embodied was true heterogeneity: not merely the fact of many kinds of people but what those people do in relation to their differences. As I became a dedicated comic book fan, I realized that every issue of the X-Men was an extended meditation on the fact that people are different from one another, and on the fact that this reality requires each and every person to forge substantive, meaningful, intelligent responses to those differences.

As a teenage reader, I simply took this fact for granted as part of the pleasures of reading superhero comics. As a scholar years later, I came to realize that the ability to respond to differences and forge meaningful relationships across them was a capacity, a super-power if you will, that comics could train their readers to exercise, an imaginative skill fit for a truly heterogeneous world. It was this realization that led me to write The New Mutants: Superheroes and the Radical Imagination of American Comics, in which I ask the question: what is it about the visual and narrative capacities of the comic book medium, and the figure of the mutant, cyborg, or “freak” superhero in particular, that has allowed so many readers to develop identification with characters across race, class, gender, sexuality, ability, and cultural origin?

Recent public dialogue about the rapidly diversifying ranks of superhero comic books has overwhelmingly celebrated the increased racial, gender, sexual, and religious variety of America’s greatest fictional heroes. Yet every time a news outlet lauds the major comics companies for introducing a gay superhero, or a Pakistani superhero, or a classically male superhero replaced by a powerful woman, the historian in me thinks, “but comics were doing that in 1972, so what’s the big deal now?”

Certainly, one potentially distinct element of today’s push for diversity is the range of “real-world” or identifiable differences comics are willing to name and represent on the comic book page. But in writing The New Mutants, I came to the conclusion that without an underlying democratic ethos or worldview, such real-world differences have little meaning. In The New Mutants, I argue that cultivating egalitarian and democratic responses to differences became the sine qua non of American superhero comics from the 1960s through the early 1990s.

I call this vision a “comic book cosmopolitics,” an ethos of reciprocal, mutually transformative encounters across difference that infused the visual and narrative content of comics for nearly three decades. In the 1960s and 1970s, comic book series like the Justice League of America, the Fantastic Four, and the X-Men provided readers an exceptionally diverse range of new characters and creative worlds, but most importantly, modeled what it might look like for those characters to bridge divides of race, species, kin, and kind for their mutual flourishing and the good of the world. What “doing good for the world” meant or could mean was the question that motivated these characters to engage one another, forge bonds, disagree, and take collective action. Today’s celebratory proclamations about the internal diversity of American comics ignore the fact that by 1984 Marvel Comics alone had Kenyan, Vietnamese, Native American, Russian, American working-class, Jewish, and Catholic superheroes, and even a pagan demon sorceress at the helm of one of its main titles.

What distinguished these earlier figures from their contemporary counterparts were the seemingly endless dialogues and struggles they engaged in to negotiate, respond to, rethink, and do something with their differences as a matter of changing the world. It was this negotiation within the context of characters’ actual diversity that allowed readers like me, and thousands more, to identify with a vast range of people who were, at least on the surface, radically unlike us.

In a recent op-ed for the New York Times, “That Oxymoron, The Asian Comic Superhero,” columnist Umapagan Ampikaipakan makes the counterintuitive claim that the push for racial diversity in contemporary superhero characters, rather than reflecting the progressive evolution of the superhero, might actually “dilute” the figure’s fundamental purpose: to function as a universal fantasy of belonging. The more specific or particular the superhero gets, he suggests, the less the character speaks to all kinds of readers.

Ampikaipakan explains that, as a child growing up in Kuala Lumpur, even thousands of miles away from U.S. culture, he found himself identifying with the misfit and freak Spider-Man. It didn’t matter that Spider-Man and so many of the superheroes in the Marvel Universe were white. Rather, it was the message these comics carried about the value of being a freak or an outcast that translated across both actual and virtual distance.

In the face of much public celebration of comic book diversity, Ampikaipakan’s argument is compelling because it refuses a reductive understanding of identity politics, namely that seeing oneself or one’s own particular identity reflected back in any given character is the only possible way one can feel invested in that character or their creative world. The argument is undoubtedly correct, yet severely misguided.

The mistake Ampikaipakan makes is not to claim that readers have the capacity to identify with a range of characters regardless of their social identity, but to fail to stress that it is difference and distinction itself that has made the superhero such a durable fantasy for so many readers globally, not the figure’s empty universality or the flexibility of whiteness to accommodate a variety of identifications. It is the fact that superheroes highlight (rather than overlook) the social, cultural, and biological differences that shape humankind that makes identifying with them possible—this is why one superhero is never enough. Superheroes proliferate because no matter how many there are, they can never quite capture the true heterogeneity of everyday life. The attempt to do so is what keeps us reading.

We should not settle for the mere representation of more diverse characters, as though the very existence of a female Pakistani Ms. Marvel alone were an act of anti-racism or anti-sexism; these latter categories describe not a representation or image but an ethos, a worldview and way of life—and this ethos is what Ampikaipakan was drawn to in reading Spider-Man. It was an underlying ideal of celebrating outcasts, misfits, and freaks—a democratic investment in all who did not fit into the model of “normal” American citizenship—that defined Marvel Comics in the 1960s and 1970s, and that shaped readers’ relationship to characters like Spider-Man and his universe of mutant, alien, and superhuman friends, all of whom we grew to love because of their particularities, differences, and distinctions, not their imagined universality. As readers, we must demand that the depiction of more diverse characters be motivated by an ethos attentive to human heterogeneity, its problems and possibilities; these characters must be placed into dynamic exchange with the world around them, rather than merely making us feel good that a few more of us are now included every once in a while.

Take, for example, the dramatic creative decision by writer Matt Fraction to relocate the X-Men from their long-standing home at the Xavier Institute for Higher Learning in Westchester County, New York, to San Francisco in 2008. With this momentous move to one of America’s most recognized gay holy lands, it seemed as though the X-Men series had finally made explicit its long-standing symbolic association between mutation, as a fictional category of difference, and gayness, as a lived form of minority identity. And yet, in the handful of years that the X-Men resided in San Francisco between 2008 and 2012—where they lived as billionaire jet-setters buying property in the Marin Headlands at the height of a national recession, no less—never once did they address the city’s massive housing crisis, increasing rates of violence towards the city’s queer and minority populations, or the shifting status of HIV. Did the X-Men even deign to go to a gay club in their five years in the Golden Gate city? Did the team’s one putatively “out” character, Northstar, claim any solidarity with the city’s queer community? I’m afraid not.

The series capitalized on its symbolic gesture of solidarity with minorities, queers, and misfits, but it jettisoned its earlier substantive engagement with the problem of difference: back in 1979, when Storm visited the slums of Harlem and witnessed the reality of youth homelessness and drug abuse, she was forced to contend with the realities of inner-city African American life from the perspective of a Kenyan immigrant who experiences blackness differently than African Americans and the working poor; and in the early 1990s, with the invention of the fictional mutant disease the Legacy Virus, the X-Men series used fantasy to address the AIDS crisis by thinking through the kinds of solidarities mutants and humans would have to develop to respond to a genetic disease ravaging the mutant population.

Storm Visits Harlem 1979

In today’s comic book pages, is there a single X-Man with HIV? Now that Iceman is out of the closet, will he go on PrEP, the HIV prophylactic? And, as a former ladies’ man, will his sexual health be an issue at stake in the series? The likelihood that Bobby Drake’s gayness will either be treated substantively or have a meaningful effect on the social fabric of the Marvel Universe seems very low in today’s creative environment, where the mere “outing” of characters as exceptionally diverse in their identities is presupposed as an act of social benevolence on the part of writers and artists.

My point here is not that superhero comics need greater realism in their storytelling or should be more “true to life.” Rather, superhero comics are one place where fantasy and creative worldmaking can run up against the specificities of our everyday lives, so that “real life” is presented to us anew or opened up to other possibilities. Mutation and gayness, for instance, are not the same thing. But they resonate in surprising ways.

The imagined category of mutation sheds light on the workings of a real-world social identity like gayness, or blackness, but it also reveals the limits of analogy because all of these categories are never quite identical. The ability to distinguish between the places where differences overlap and where they don’t is a political skill that fantasy can help us develop. It demands we not only see where solidarity can be forged, but also figure out what to do when sameness no longer holds true, or our differences overwhelm the ability to forge meaningful bonds.

What I was doing that summer day when I read my first issue of the X-Men was figuring something out not only about myself, but about my relationship to the world around me as someone who fundamentally understood that I was different, but didn’t yet know how to respond to being different. This is the true gift that superhero comics have given to American culture in the 20th century, but it is a creative offering increasingly taken from our grasp.

When Marvel Comics reached a creative level of near maximum mutant heterogeneity in the X-Men series around 2005—a moment of incredible promise when mutants no longer appeared as minorities but as a significant portion of the human population—Marvel spun out a barrage of storylines, from “E is for Extinction” to “House of M,” that depicted the mass slaughter of the majority of the world’s mutants by members of their own kind. The X-Men have been living in the shadow of genocide ever since: shot down by ever-more efficient mutant-killing robots, murdered and harvested for their organs, nearly eliminated from history by time-traveling mutant hunters, and now subject to M-Pox, another genetic disease threatening to wipe out the mutant race. In a sense, Marvel could not face the complexities of the world it had created, and decided to obliterate it instead: in so doing, fantasy truly became a reflection of our violent post-9/11 reality.

Contrast the exuberant, bubble-gum pink cover of the first X-Men comic book I ever read with the most recent issue I picked up: in the renumbered Uncanny X-Men #1 (2013), written by Brian Michael Bendis, the cover presents us with a picture of mutants at war. There is a revolution afoot, but it is led by a single male figure, the famed character Cyclops, reaching out to the reader from the center of the page with his army of followers, merely black and white outlines in the background. That army is composed of some of the most complex characters to ever grace the pages of the X-Men series, yet here they have been flattened into ghosts haunting the background of the X-Men’s former dream.

The X-Men now appear as a leather-clad, armored military unit, not a high-flying, exuberant, queer menagerie. At the center, Cyclops’ hand reaches out to us not in a gesture of solidarity but as a claw, perhaps ready to grip our throats. In this new chapter of the X-Men’s history, mutants are presented as divided over the right path toward the preservation of the mutant race. But instead of rich, textured disagreements, the characters appear merely as ideologues spouting flat and rigid political manifestos. There is no space for genuine debate, or loyalty amidst disagreement, or even the notion that more than one dream could exist side by side among companions. As Alex Segade has recently argued in a brilliant Artforum article on the X-Men’s decades-long mutant mythology, the recent introduction of increasingly “diverse” cast members to the series has come at extraordinary costs, including the mass deaths of entire swathes of mutants around the world.

In the X-Men, fantasy—that realm meant to transport us to a different world—has become the ground for narrating the collapse of all visions of hope, social transformation, or egalitarian action: in Bendis’ epic narrative the original five teenage members of the X-Men are teleported to the present only to see that their youthful dreams of peaceful relations between human and mutant kind have resulted in death, destruction, and seemingly endless violence.

When I recently caught up on Bendis’ X-Men plot about time-traveling mutant teenagers, it made me wonder what my thirteen-year-old self would have thought about his future had these been the comic book issues he first encountered in the summer of 1998. But I’m lucky they weren’t. In the face of the kinds of violence and death-dealing that recent comics present, I remember that there are other uses for fantasy, because it was the X-Men series itself that first showed me it was possible. Today, as years of reading, thinking, and writing about superhero comics come together with the publication of The New Mutants, I look back at that first cover image of X-Men #80 with a mix of longing and hope: I wonder now how a team can be reunited, and how new dreams can be born.

Ramzi Fawaz is an Assistant Professor of English at the University of Wisconsin, Madison. He is the author of The New Mutants: Superheroes and the Radical Imagination of American Comics (NYU Press, 2016).

The Exquisite Corpse of Asian America: Q&A with author Rachel C. Lee

Last season, Faye Qiyu Lu, one of our fall interns and an undergraduate at NYU, put together a series of questions for Rachel C. Lee, author of The Exquisite Corpse of Asian America. Check out the Q&A on the book below! 

What does the term “exquisite corpse” entail for your project?

Rachel C. Lee: The exquisite corpse, for me, is a structure for collaboration, an experimental method that values distributed sites of intelligence, despite the likely disjunctures of approach and worldview of various participants. In the early 2000s, I began working with a group of feminists in Los Angeles to create a critical-creative prose piece using the cadavre exquis as our model. That exercise led me to appreciate the conjunctive elements of scholarly endeavor, rather than simply to pretend that, for instance, a book emanates from a singular monastic researcher. At the same time, I had also been writing about performances, novels, science exhibits, and other cultural artifacts. All were concerned with racially marked populations and some used “body parts” in their compositions, for instance, wielding human detritus as art material or referring to fleshly organs in their titles. I discovered that these Asian American artists were not simply responding to how racial violence occurs through the assertion of anatomical difference between “colored” and “white” people—differences not simply in hair texture and skin color, but in diet, endurance, pain threshold, and so forth—they were also responding to the way in which biotechnology was changing Enlightenment notions about the integrity and autonomy of the human organism.

André Breton credited games like the cadavre exquis with bringing about something unexpected—a “pooling” of creativity and knowledge, perhaps an early intimation of what is now called “crowd sourcing.” That description seemed an apt figure, encapsulating the way my book had grown from an engagement with racial profiling tied to external features and body parts to an examination of such profiling in relation to risk assessments of populations based on genetic, metabolic, endocrinological, and environmental regulation.

It is argued in the book that a biosocial/biopolitical perspective would shed new light on the literary study of race (Asian American in this case). How did you first arrive at this approach?

The literature on biopolitics and biosociality—which I became familiar with through anthropologies and sociologies of medicine—helped me understand the gap between those who were studying embodiment on the scale of perception and corporeal dynamics and those who were studying it more sociologically, as properties and propensities of bodies aggregated into types. We can think of these as an approach that starts from inside a particular, situated body and an approach that starts from outside, looking over a crowd of bodies. As I explain in the first chapter of my book, literary artifacts often focalize their stories through the perspectives of individual protagonists.

In the case of canonical literature by racial subjects, readers who take up these books vicariously see from the viewpoint of these racially specific characters, taking on their speech inflections and understanding or sympathizing with the traps of these characters’ own and others’ making. This approach corresponds to the anatomical-political register of biopower—how individual bodies feel the effects of (and partially defy) the managerial, biopolitical aspects of biopower codified in institutions such as the legislature, the courts, the health clinic, the army, the Taylorized workplace, the credit and finance sector, and so forth. Public policy and the law necessarily address social problems (such as harm caused by institutional bias against racial others, the disabled, and sexual minorities) in terms of broad edicts aimed at ensuring classes of individuals are not singled out for unfair treatment. In other words, legal and policy discourses necessarily “abstract” individuals into populational patterns, but who wouldn’t feel that his or her individual instance of tragicomedy has gone unheard in the broad edicts of these bureaucracies? It is the desire to be in a particular body, or the riveting concreteness of a particular body’s story, that finds us looking to literature.

Apparently your study examines not only literature, but also various art forms. Is it pushing your own boundaries as a literary scholar?

Since the completion of my first book in 1999, I had been working on the difference between scripts treated as literary texts and live performances on the stage. Indeed, in my earlier work on standup comedienne Margaret Cho, the archived ‘text’ of a live performance was not a written transcript but a DVD. When working on performance, one pays attention not simply to the verbal emanations of the performer, but to the communicative and tonal qualities of gesticulating arms, crouched legs, pointed toes, sweat streaks, facial grimaces, costume, etc. I suppose you could say that I spent the first decade of the 2000s pushing my boundaries as a transdisciplinary scholar, not simply in terms of taking stage performances as part of my archive but also in my consideration of visual media and other forms of visual-tactile interfaces enabled by electronic platforms. Perhaps the biggest boundary I have recently pushed is that between bioscientific and humanistic approaches. However, here I am grateful to be following in the footsteps of brilliant feminist science and technology scholars such as Donna Haraway, Banu Subramaniam, Hannah Landecker, and Elizabeth Wilson.

Why did you choose the specific cases of Cheng-Chieh Yu’s dance theater, Margaret Cho’s stand-up comedy, Amitav Ghosh’s novel, and Denise Uyehara’s performance art?

The Exquisite Corpse of Asian America is also an experiment in various modes of critical writing. My introduction, first chapter, final chapter, and epilogue are all driven by argument and theory. There, I use examples from literature, scientific exhibit, clinical practice, and visual design and art in aggregate, as it were—meaning their force of evidence lies in the sum of their effect. In four chapters on the artists identified above, I ruminate at length on each artist’s corpus, dwelling in the minutiae of their choreographies, multi-media art practices, narrative structures, and pedagogical commitments. The goal is to draw out what ethical and political practices they accomplish—and urge us to accomplish—through their work. There are numerous ways in which the works of the four primary artists overlap and could be explored, but I didn’t want the length of the book to be too forbidding. For instance, while I expressly explore the boundaries among species—microorganisms and their insect and vertebrate hosts—through Amitav Ghosh’s fiction, I could also have turned to dancer Cheng-Chieh Yu, who has a series of dance performances devoted to the animal-human divide; these dances convey a sense that the movements of Chinese martial arts and the pharmacopeia of Chinese medicine acknowledge the continuity of humans and other animals. Similarly, Margaret Cho and Denise Uyehara (both queer actresses) have recently turned to the topic of having babies; nonetheless, I felt Uyehara’s earlier work on disability and incarceration was far more pressing to address.

Lastly, you mentioned the hope to “provoke a new symbiont species of inquiry.” Would you consider The Exquisite Corpse of Asian America successful at doing so?

Immodestly, I’d love to say yes, but future readers will have to answer that question! Perhaps the best I can do is point to how biotechnology on a daily basis is disaggregating and reaggregating our body parts in ever new ways. Biotechnology now allows for something called the “three-parent embryo,” basically the altering of one’s offspring’s cellular materials, such as mitochondria, while maintaining that offspring’s genetic (nuclear DNA) tie to the parent. While the three-parent embryo is not a new species (all the parts combining are from one species), it nevertheless might do as a figure that is good to think with in our current moment. Such new combinations in and across bodies are coming into being because of well-funded infrastructures enabling their realization. My book aspires to be a more modest infrastructure, enabling analogous new inquisitorial combinations across bodies of disciplinary inquiry.

We might also take a cue from artists, poets, novelists, and standup comedians themselves, who are not leaving the social and ethical implications of these new technologies up to the biotech industry, but are speculating and imagining multiple futures emerging from these changes. Asian American Studies, race studies, literary studies, and American Studies, whether in a symbiont or three-parent embryo manner, would do well to amplify their engagements with bioscience in order to continue the work of critical race studies, social justice, and ethical pedagogy in relation to these developments.

Rachel C. Lee is Associate Professor of English and Gender Studies at UCLA. She is the author of The Americas of Asian American Literature: Gendered Fictions of Nation and Transnation, co-editor of the volume Asian America.Net: Ethnicity, Nationalism, and Cyberspace, and editor of the Routledge Companion to Asian American and Pacific Islander Literature and Culture.

The satiric lesson of ‘Dear White People’

—Pamela Newkirk

[This article originally appeared at the Chronicle of Higher Education.]

Rarely is a white audience afforded a lucid and freewheeling response to the deluge of indignities blacks still endure. Instead, reaction to the barrage of stereotypes embodied in many Tyler Perry films, the one-dimensional depiction of blacks in news or reality television, or whites’ insulting appropriation and commodification of a hard-earned black urban culture, is seldom considered.

Now Dear White People, appropriately set on an elite and predominantly white university campus, delivers a timely and barely satiric lesson on why, for many blacks, tensions continue to simmer beneath the nation’s facade of racial harmony and transcendence. The film’s writer-director, Justin Simien, lays out an ambitious lesson plan to reveal how racial stereotypes play out on an elite campus that claims to celebrate diversity.

Inclusion in such settings typically means a small number of blacks fitting into preconceived notions of who and what they are. And for many, the stereotype of black life—of a monolithic, urban slang-wielding group that glorifies criminality and crass consumerism—is more salient than the reality of black individuality.

So many conflate blackness with an underachieving urban underclass that, for some white filmgoers, it’ll come as a surprise that blacks don’t wish to be viewed as products of street culture. Nor do they relish the curiosity of whites who touch their hair or inquire about its texture, manageability, or authenticity. The individuality and dignity readily accorded whites are often denied blacks, so few African-Americans manage to escape some of the slights deftly depicted on screen.

Even at elite colleges, many high-achieving African-Americans are often addressed by their white peers as if they were aliens from a rap video, rather than as fellow classmates from similar or even more-privileged backgrounds. Why white students feel entitled to use the N-word, or to affect urban slang when greeting their black classmates, is confounding and yet all too familiar.

One character in Dear White People, a prototypical nerd and gay writer, is assumed to be a member of the Black Student Union and to live in black housing when in reality he feels as alienated by many blacks as he does by the larger culture. A girl from the South Side of Chicago is so determined to fit in that she conforms to a narrowly prescribed, self-deprecating role; while the main character, Sam White, is the agitator whose provocative campus radio show, Dear White People, not only catalogs the daily slights but lashes back. In one show she mockingly counsels: “Dear White People, don’t dance.” But behind the scowl is the pain and frustration of a sensitive aspiring filmmaker who privately favors the Swedish film director Ingmar Bergman over Spike Lee.

While Sam speaks for the black students, she is understood best by her perceptive white boyfriend. He alone among the characters sees and fully appreciates the person beneath the skin. It is here where we are granted a close-up look at the intricate dance that is race, the complicated series of endlessly variable calculations that defy neat categories or lazy shorthand. It’s all covered, sometimes tumbling all at once from the screen with such velocity that one may at times miss the subtlety.

What becomes clear is that the central black characters are anything but the interchangeable cartoon cutouts they are in the imaginations of their white—and sometimes their black—peers. The cost of acceptance in predominantly white settings is often great, as is the temptation to insist that America has come so far on the racial front that whites can be considered the new victims of discrimination who can mindlessly evoke black stereotypes for fun.

In the end it’s not their race that unites these highly individual black students at the proverbial cafeteria table, but rather the barrage of indignities that effectively obliterates their differences. It’s the persisting erasure—the inability to see them as unique individuals—that cuts so deeply.

Recognition or even denial may account for some of the uneasy laughter I heard in the Upper West Side theater where I was among an age- and racially diverse New York audience. Not all will agree that the filmmaker’s incisive critique is justified, but this film is certain to be discussed, on campuses and elsewhere, for many weeks and years to come.

Pamela Newkirk is Professor of Journalism at New York University. She is the author of Within the Veil: Black Journalists, White Media (NYU Press, 2012).

‘Left Behind,’ again? The re-emergence of a political phenomenon

—Glenn W. Shuck

Critics just don’t get Left Behind, a new movie adaptation of the best-selling book series. Sure, it’s predictably awful. The acting is bad, the production is terrible, and the plot is thinner than Soviet toilet paper. But the stakes are far higher than with a typical, first-order howler. Left Behind preaches to the choir, sure, but this is no ordinary choir! The film, like the novels, doesn’t cater to Hollywood styles; it’s all about motivating people to “spread the word,” and that word is just as political as it is otherworldly.

Ten years have passed since the original Left Behind novels by Tim LaHaye and Jerry B. Jenkins concluded. In a world where news cycles grow ever shorter, ten years is several lifetimes. Sure, the Christian suspense novels helped unleash powerful political forces then, but what about now?  The series and the values it champions may have found a way to return with the debut of the new Left Behind film.

Why Left Behind?  Why now?  Financial motives, as always, play a powerful role. Just look at recent films and television serials. Apocalypse sells. God sells. Fear sells. But another motive is at hand: apocalyptic narratives are also multi-stranded; they carry, after all, a revelation. They proclaim a new way of being in the world. In short, apocalyptic narratives often motivate action.

Dr. LaHaye and Mr. Jenkins helped politically conservative evangelicals in the 1990s move beyond “single issue” and “values voters” labels to empower their political imaginations beyond narrow and predictable categories. As the series progressed and its politics came together, the novels helped an upstart and then highly disliked presidential candidate, George W. Bush, to two unlikely victories, selling millions of political primers along the way. The Left Behind phenomenon helped embolden a hyper-energized religious right.

But something went wrong with doomsayers’ forecasts of evangelical political dominance: evangelicals stopped voting at the high rates that had boosted President Bush. It wasn’t just that the original Left Behind film and its sequels were big-screen busts. “New” Republican standard-bearers Senator John McCain and former Massachusetts governor Mitt Romney never held much appeal for evangelicals. Yes, it hurt that the Left Behind series and its spin-offs and spokespersons were no longer so influential. Moreover, Bush’s unpopularity became toxic. More “moderate” Republican candidates, however, without the full support of a key voting bloc, found a different kind of apocalypse.

Fast-forward to 2014. A president is not on the ballot, but President Obama’s policies certainly are as Democrats fight to retain the Senate. Polls and pundits raise concerns for Democrats. But Democrats ought also to consider a voting bloc that has been under-engaged for a decade. Some experts have assumed that evangelicals and the Tea Party are one and the same (or similar enough), and hence that these potential voters are already accounted for. But it is simplistic to equate the Tea Party with the religious right. It takes more than faux filibusters to push high percentages of mercurial evangelical conservatives to the polls, especially in a midterm election, even one as critical as this year’s.

Re-enter the Left Behind phenomenon. The new film adaptation of the novels is earning the phenomenon, and the values it champions, a closer look. The film does not have the highest budget, but the re-boot has fared much better than the 2001 original, grossing almost twice as much in its opening weekend (roughly $7 million) as the ill-fated original earned in its entire run. Whether the film grosses $20 million or $50 million matters less, however, than the fact that it has brought conservative evangelicals back into the news cycle.

Thus in the days leading up to the 2014 midterms, Republicans have a wild card in Left Behind that just may become an ace. It is absurd to suggest a low-budget film will change the balance of power in the U.S.A., but it has resurrected the dynamics of the novels, and conservative evangelicals finally have a powerful reminder to vote.  And as any pundit will admit, it won’t take much to tip the scales in Washington.

Finally, the timing of Left Behind’s release, one month before the crucial midterms, may be coincidence. Evangelicals do not believe in coincidence, however, nor should campaigners in evangelical-filled battleground states such as North Carolina, Kansas, and Iowa. Left Behind is playing in the heartland, playing for the hearts and minds of conservative evangelical voters. Critics who dismiss Left Behind as simply an awful film, firing dull hip-shots of derision and canned clichés, miss the point. Left Behind is not about film prizes or outstanding cinematography or even good taste. It’s all about “spreading the word.” Who will be left behind if the film re-energizes its core audience and steers it into action just weeks before next month’s elections?

Glenn W. Shuck is Assistant Professor of Religion at Williams College and author of Marks of the Beast: The Left Behind Novels and the Struggle for Evangelical Identity (NYU Press, 2004).

Books That Cook: Yellow Potatoes

During the month of September, we’ve celebrated the publication of our first literary cookbook, Books That Cook: The Making of a Literary Meal, by rounding up some of our bravest “chefs” at the Press to take on the task of cooking this book! Check out the reviews, odes, and confessions from Press staff members who attempted various recipes à la minute.

Our final post comes from Monica McCormick. Read below as she shares her thoughts on finding home through the joy of cooking. 

I have moved often in my adult life. In each new apartment, preparing meals has become a way of making a home. Pulling out my well-used pots and knives, reaching for ingredients in strange new cupboards, and learning the quirks of an unfamiliar stove are all part of the ritual. Whatever I cook, it fills my new place with comforting scents and flavors, evoking other meals in other homes, long ago.

The opening lines of Ketu H. Katrak’s essay evoked this nostalgia created by food, sounds, and scents.[1] She writes of waking in her childhood home in Bombay on a visit from the U.S., roused from sleep by clanging in the kitchen:

All these sounds mingle with the aromatic spices wafting over my waking body. The sounds of prayer and smells of chapatis and vegetables weave into a pattern of belonging, of home-sounds and home-aromas.

This brought me back to a related, though in some ways opposite, experience. At age 18, I left Stockton, California, for a year as an exchange student in Mombasa, Kenya. I have often thought of my first few mornings there, waking to strange sounds and smells: voices shouting (in what language?), a rooster crowing, the cranking of an old car engine that wouldn’t turn over, foods frying in an oil I couldn’t identify, the oddly floral soapy water my host sister was sloshing on the hallway floor. I wondered how I would ever feel comfortable with all this.

I found my way home there through the kitchen. My host family, like my California one, made meals a central daily ritual. The Oderos were Luo people from Lake Victoria in western Kenya, but in Mombasa they cooked in the Swahili style. This Indian Ocean culture was wholly new to me, combining people and traditions from places like Zanzibar, Goa, Gujarat, Oman, and the Seychelles on the East African coast. I learned to roll out flaky chapatis, though my first attempts were so far from round that my sister Leonida would laugh, “It’s the shape of Kenya!”

I grated coconut, seated on a low folding stool fixed with a serrated blade, the white flakes falling to a plate below. We packed the coconut in a long, cylindrical basket and twisted it to extract the milk that thickened stews of fish, potatoes, tomato, and curry spices, or flavored large pots of long-grained basmati rice. I pounded the tiny red chilies that grew outside our back door, burning my fingers as I scooped the paste out of the big wooden mortar and pestle.

Eventually I took on the family task of going to the covered open-air market, mixing my minimal Kiswahili with English to bargain for staples: potatoes, onions, tomatoes, rice, beans, lentils, and the local bananas, mangos, and papayas. At my favorite stand, a corpulent yet dignified man in a white skull cap presided over trays mounded with brilliantly colored spices: cumin, coriander, turmeric, cayenne, paprika, mustard seeds. From his high stool he would reach out to scoop what you needed onto a metal scale, blend to your specifications, and pour the spices into newspaper cones twisted at the ends.

At school, we had a two-hour lunch break. Because my host family lived a long bus ride away, schoolmates would bring me home with them. I especially loved invitations from Bansari Shah, a girl whose tiny frame belied her healthy appetite. Her mother seemed to spend all morning preparing our lunch, in a kitchen lined with shiny metal tins of lentils, grains, and spices. She would set out a gorgeous array of vegetarian dishes: okra stewed with tomato and chili; acid-yellow turmeric potatoes flecked with black mustard seeds; green mung-bean dal; shiny white rice studded with cloves and cardamom pods. Bansari pointed out her favorites to be sure I tried them, and we would tuck in happily.

When I returned to the States, I was homesick for Mombasa. I made some of this food, trying to reproduce the methods and tastes in American kitchens. Like Katrak in Massachusetts, I found Indian grocers in Minneapolis and San Francisco where I could once again inhale the combined scent of innumerable spices, and select from bins of lentils, dried peas, and beans. I bought Indian cookbooks and made elaborate meals with many garnishes. It was all a lot of work, and over the years I’ve simplified my cooking.

But Katrak’s recipe for Yellow Potatoes reminded me of lunch with Bansari. It inspired a trip to the Indian markets on Lexington Avenue near East 29th Street. I selected fresh packets of turmeric and black mustard seeds, and asked the grocer to reach me a bunch of cilantro, a knob of ginger, and a lime from the small cooler behind the counter. Back in my little Harlem kitchen, I heated a generous slug of oil in my favorite heavy pot, let the mustard seeds pop to season the oil, and sizzled cubes of potato with the spices and minced chilies. When the potatoes were tender, I added a squeeze of lime and a few torn cilantro leaves, and gave it all a quick stir. Breathing in the flowery, sharp, tangy aromas, I took a mouthful and felt right at home.

Monica McCormick is Program Officer for Digital Scholarly Publishing at NYU Libraries and NYU Press.


[1] Food and Belonging: At ‘Home’ and in ‘Alien-Kitchens’, by Ketu H. Katrak

Books That Cook: Lettuce in Ribbons with Cream

During the month of September, we’re celebrating the publication of our first literary cookbook, Books That Cook: The Making of a Literary Meal by rounding up some of our bravest “chefs” at the Press to take on the task of cooking this book! In the next few weeks, we’ll be serving up reviews, odes, and confessions from Press staff members who attempted various recipes à la minute.

Today’s special:
Assistant Editor Caelyn Cobb, on pot brownies, Gertrude Stein, and how to cook lettuce (or “sacrifice the innocents”). 


“So does it have pot in it?” my boyfriend asked when I said I planned to make a dish from The Alice B. Toklas Cookbook for our blog.

More than any other Modernist writer, Alice B. Toklas is a household name, due largely to the success of her cookbook, a mishmash of memoir and recipes that contains one of the earliest published recipes for pot brownies. I am here to break the terrible news that Books That Cook does not contain a recipe for pot brownies. (Maybe in the second edition.)

Instead, the excerpt from The Alice B. Toklas Cookbook is a retelling of the writer’s time at a house in the French countryside, where she and her partner Gertrude Stein spent fourteen summers. It is largely about fruits and vegetables—growing them, picking them, cooking them, serving them, and eating them. I was attracted to this chapter for two reasons. First of all, vegetables seemed much easier to prepare, involving less time, fewer ingredients, and less of my money. I was sort of right about these things, but then again, I only sort of made the recipe. But more on that later.

Mainly, though, I was drawn to this chapter due to my past life as a literature student. Most interested in feminism, poetry, and Modernism, I was steered by many TAs and professors to Gertrude Stein’s most famous work, the poetry collection Tender Buttons. I was disappointed to find that I did not like this book anywhere near as much as I liked feminism, poetry, or Modernism individually. Tender Buttons is often described as “cubism for poetry,” which mostly means that you can only sometimes tell what is going on.

It wasn’t easy being a Modernist woman, especially on the American expatriate scene, and so I feel bad about not being a bigger fan of Gertrude Stein’s work. The leading lights of the Lost Generation were the greatest literary bros of their generation, ushering in a period of literary bro-ism that persists to this day. Given the time they spent watching bullfights, locking their wives in sanitariums, learning to box, and moving young ingénues into their homes (with or without their wives’ approval) because it “helped with their creativity,” it’s a wonder that they got any writing done. Gertrude Stein and Alice Toklas hosted, promoted, and befriended many of these men, introducing them to leading artists and intellectuals—and looking at their memoirs, it sounds like quite a lovely and exciting time. However, Hemingway would later memorialize Stein in his own memoirs as looking “like a Roman emperor, and that was fine if you liked your women to look like Roman emperors,” so maybe not.

I mull this over as I get ready to make my dish, getting in the mood by thinking angry feminist thoughts and listening to the moodiest French band I had on my iPod. I had selected a cooked lettuce dish called “lettuce in ribbons with cream” because I thought it was funny how Toklas only begrudgingly gives these recipes, calling them “the sacrifice of the innocents” (innocents being lettuce, I presume).

It is the simplest of the recipes but also the least specific. What kind of lettuce? What is “heavy cream sauce”? I imagine her cooking with the tiny, sweet lettuce my grandpa grows in his garden, but I can’t find anything like it in Queens and settle on two tiny heads of Boston lettuce. I contemplate making my own béchamel sauce, which I think is what she means by “cream sauce,” but I instead purchase a jar of Alfredo sauce because I do not feel like it.

The recipe does call for one specific ingredient, and that is “one teaspoon of onion juice.” They definitely do not have this at my grocery store; I turn to Google for instructions on how to make my own onion juice, but it seems like way too much work for one teaspoon. I instead buy an onion and sauté a few pieces with the lettuce. I am basically murdering this recipe, but you know, death of the author, etc.

The dish itself is pretty easy: slice up the lettuce, sauté it (with onion) in a lot of butter, then once the lettuce absorbs the butter, add salt, cover it and let simmer. I buy two heads of lettuce, and the shredded bits fill three large bowls. I cook down two and a half of them into a tiny wad of lettuce, which I then cover in Alfredo sauce. I am reminded of a Dutch dish, which involves cooking lettuce with ham and root vegetables in a white gravy. I’m sure it has a Dutch name, but in my family we just call it “Dutch lettuce.” It is not a crowd pleaser. My aunt would request it for her birthday dinners as a child just to prank her siblings. I begin to regret my choice of dish, but am too far in to turn back, much in the way I began to regret my decision to write my BA thesis on Modernist poetic criticism over winter break of my senior year. The only option is to suck it up and see it through.

Early in our courtship, my boyfriend had confirmed a deep love of vegetables, one we both share. “Don’t insult my home by bringing a salad into it,” he warned, as I offered to do this very thing. Thus, I have promised that the meal I make will not be only vegetables, and I set about preparing bacon and tomato wraps while the lettuce simmers. When I planned this, I liked the BLT symmetry.

When I serve up the lettuce, he pronounces it “very tasty.” It is not bad. It is also very heavy; cooked lettuce has an earthy taste, and paired with a creamy cheese sauce, it’s extremely rich. I recommend serving it with something lighter than a bacon sandwich, like tilapia or chicken or anything besides bacon. I feel like I just ingested a grease ball.

After we finish our meal, I jokingly whip out my copy of Tender Buttons.

“Vegetable,” I read. “What is cut. What is cut by it. What is cut by it in.”

“Okay, that’s enough of that,” my boyfriend says.

Caelyn Cobb is Assistant Editor at NYU Press.

Books That Cook: Artichokes with Beurre au Citron

During the month of September, we’re celebrating the publication of our first literary cookbook, Books That Cook: The Making of a Literary Meal by rounding up some of our bravest “chefs” at the Press to take on the task of cooking this book! In the next few weeks, we’ll be serving up reviews, odes, and confessions from Press staff members who attempted various recipes à la minute.

Read, savor, and be sure to enter our giveaway, which ends on September 21, for a chance to win a copy of the book!

Up next on the menu: Managing Editor Dorothea Stillman Halliday masters the art of l’artichaut.


My father would have loved Books That Cook, uniting as it does two of his greatest passions: books and food. My father read voraciously, books of all kinds, several at one time, and among those were cookbooks and literary food writing. He read about the history of foods and cuisines and the culinary practices of different cultures. He traveled widely and loved to taste the world. But every night at home he sat down to a disappointing dinner.

My parents lived in Europe during their early married life and then again later, after my siblings and I were part of the mix. My father’s tastes were full of herbs and spices to begin with and grew only more sophisticated after living abroad, while my mother seemed incapable of distinguishing between delicious and dreadful food. Her culinary ability and discernment were in the great American (pre–Julia Child) tradition of the blandest meat and most overcooked vegetables, puddles of mayonnaise, and a cabinet full of prefab foods. Why make fresh potatoes when you could turn potato flakes from a box into dinner just by adding hot water? Why even drink fresh milk when you could mix milk from a powder? The nutrition was what mattered to her, and that was all.

Unfortunately for my parents, who were newlyweds in the late forties, they were imprisoned in the gender roles of their time. My father went into the world and worked; my mother stayed home and cooked. He read Mastering the Art of French Cooking and watched Julia Child on TV, but he felt constrained to do no more food preparation than to put cream cheese and lox on a bagel. The wife did the cooking. The husband’s lot was to sit and be served—poor wretch. He dreamed of gastronomy; she dreamed of getting all our nutrients via vitamin pills. Needless to say, this caused considerable marital friction, and were it not for frequent dinners at the local Chinese restaurant, things might have gotten really ugly.

By the mid-seventies, the culture and my parents had both evolved enough that my father finally declared that he would do the cooking from now on. The dinner table became a happier place. And the food was much improved too. My mother was relieved to be liberated from the pressure of preparing meals that continually fell short of expectations. But she never understood what the big deal was. She always greeted any culinary preparation by expounding the nutritional value of the components: “Oh, carrots are very good for you. They’re full of vitamin A” and “Spinach is loaded with iron.”

If you’ve ever painstakingly prepared a delicious meal for someone and been greeted by this kind of response, then you know that the friction at the dinner table did not disappear altogether. My father would sigh in exasperation with her lack of appreciation but would console himself with the responses he got from the rest of us and with his own enjoyment of his meal. Once, while serving up one of his creations, he loaded up a fork and offered it to my mother. “Try this,” he said. “It’s got something in it that’s very good for you: flavor.”

When asked to select a recipe from Books That Cook to write about, I chose artichokes with beurre au citron, lemon butter sauce. The dish is both simple and fine, and it is one of a very few I remember my mother making that we all truly enjoyed. I don’t know if she learned from Julia Child’s recipe or from somewhere else, but even my mother could boil an artichoke and squeeze lemon juice into melted butter.

I remember how exotic it seemed to eat a huge flower bud. It was a gustatory adventure, even a quest: We sought the hidden treasure, the succulent heart. We peeled the petals away one at a time, avoiding the sharp points—my mother either didn’t know, or didn’t bother, to cut them off. We dipped the petals in the lemon butter and scraped the “meat” off with our teeth. We worked our way past the soft inner leaves of pale green and purple, down to the choke, which guarded the heart. Once past its defenses, we beheld the grail. And there was peace and harmony at the table.

The artichoke was good, even in the hands of an unskilled cook. And it still is.

Dorothea Stillman Halliday is Managing Editor at NYU Press.