What Makes a Story

The Secret Life of Stories: From Don Quixote to Harry Potter, How Understanding Intellectual Disability Transforms the Way We Read by Michael Bérubé is out today! This excerpt appeared in The Chronicle of Higher Education.


—Michael Bérubé

My kids are no longer kids. One is in his late 20s, one in his early 20s; Nick, the firstborn, was a “gifted” child, capable of copying drawings of medieval European hill cities at the age of 5; Jamie, his brother, has Down syndrome. Jamie also has an encyclopedic knowledge of sharks and the music of the Beatles, and an astonishing memory. Both of them are natural narrative theorists — though because of the differences in their capacities for abstraction, I wound up testing their narrative theories in different ways.

I learned almost as soon as Nick could talk that he loved my stories; he even gave them numbers, though I never did learn his classification system. Nick liked stories with drama — the story of how the hockey-camp bus left Tallinn without me in 1972; the story of how the camp counselor threw me out of the pool; the story of my first day in first grade, when the teacher corrected me for saying I was 6 when I was still only 5 (a situation that got worse in the following years, after that teacher decided to skip me into the second grade because of my reading skills). But as you might imagine, sometimes Nick’s appetite for stories became wearisome. I read to him every night, and I told him stories about people in my life, and I even made up some stuff. But one day when he was about 3 and a half, on a long car trip in the Midwest, he asked for story after story. And finally I decided to conduct a little experiment.

“All right,” I sighed. “I have a new story for you. It goes like this: Red. Yellow. Orange. Blue. Violet. Green. Black. Brown. White. … ”

“Daddy!” Nick interjected, annoyed. “That is not a story.”

“No?”

“No! It is just a bunch of colors.”

“And a bunch of colors is not a story?”

“No! A story has to have things in it.”

“Ah,” I replied, Phase 1 of the experiment complete. “A story has to have things in it. You are right. OK. Tree. Cloud. Sunshine. Water. … ”

“No, no, no,” Nick insisted, more annoyed. “Things happen in a story.”

“Fair enough,” I acknowledged. “The tree blocked the cloud. The sunshine reflected off the water. The flowers grew. … ”

“Dad!!” Nick interrupted, even more annoyed. “That is not a story either.”

“But things happen in it,” I pointed out.

“But you are not telling why they happen.”

Eureka. We were very close, at this point, to E.M. Forster’s famous dictum, “‘The king died, and then the queen died’ is a story. ‘The king died, and then the queen died of grief’ is a plot.” In a story, things have to happen for a reason.

Jamie learned his alphabet before he started kindergarten at age 6; he learned to read at a first-grade level by the time he was 8. Along the way, he somehow taught himself the American Sign Language alphabet and a few simple ASL words by imitating the pictures he saw in a Sesame Street book devoted to the subject. But even though Jamie had developed a profound love of sharks, barn animals, and the movie Babe (just like any number of children his age), he didn’t really understand stories as stories. He had an amazing recall of individual scenes, particularly scenes that involved pratfalls, and he was able to repeat most of the dialogue from the exchange in which Ferdinand the Duck tells Babe the Pig why he wants Babe to go into Farmer Hoggett’s house and steal the Hoggetts’ alarm clock from the side of the bed. But he didn’t understand how much of the movie’s plot is predicated on that scene (the issue is whether animals can avoid being eaten by humans if they demonstrate that they are “indispensable,” and Ferdinand has decided that he will crow like a rooster each morning in order to stay alive), nor did he understand what it might mean for a plot to be “predicated on a scene” in the first place. Until he was 10, Jamie enjoyed narratives almost as if they were a series of entertaining vignettes.

Then, in 2001, we took him to see Harry Potter and the Sorcerer’s Stone. We feared that the film might be a bit too much for him to take in — from the opening scenes of Harry the Abject Orphan to the climactic sequence in which Voldemort’s baleful spirit speaks from the back of Professor Quirrell’s head, with all the plot miscues and misdirections along the way (most of which point to Snape rather than Quirrell as the malevolent force in Harry’s world). But we were most pleasantly surprised to find that he got it — and not because he himself had glasses just like Harry’s, not because he dreamed of going to Hogwarts himself. He roundly dismissed all comparisons between himself and Harry. Rather, he got it because he loved the story — and he loved talking about it for weeks, even after he’d seen the film three or four times. Impressed, I asked him if he’d like to read the book on which the film was based, and he responded with hand-rubbing glee.

Thus began his — and my — adventures with J.K. Rowling’s plots, and Jamie’s fascination with the intricacies of plotting. A critical index of Jamie’s increasing sophistication as a reader was that he became increasingly capable of (and delighted by) making thematic connections that enriched his understanding of his other favorite narratives. In the course of our reading of Half-Blood Prince, we came upon an extended flashback/exposition in which the young Professor Dumbledore visits an 11-year-old Tom Riddle in the orphanage in order to inform him that he is a wizard and extend him an invitation to Hogwarts. Jamie gasped at Tom’s arrogant reaction to Dumbledore’s invitation, and, despite his fatigue, stayed awake for another couple of pages.

But before we got to that point, I read the following passage: “The orphans, Harry saw, were all wearing the same kind of grayish tunic. They looked reasonably well-cared for, but there was no denying that this was a grim place in which to grow up.” I decided to say a few words about the orphanage, and about Harry’s odd, complex moment of sympathy for the friendless boy who grows up to become Voldemort. “Did Harry have a happy childhood when he was growing up?” I asked. Jamie shook his head no. “He had the Dursleys,” he said. I pointed out that Harry and Voldemort are similar in that they grow up without parents, and that the kids in orphanages are there because they have no parents either. I added that Jamie might remember the orphanage in the film Like Mike, which was in the “heavy rotation” section of Jamie’s DVD collection for a while.

“Or Free Willy,” Jamie suggested. “Yes, that’s right,” I said with some surprise. “Free Willy is also about a kid who is growing up without parents, and who has stepparents, and he has trouble getting used to his new home.”

“Or Rookie of the Year,” Jamie said. “Not exactly,” I replied. “In Rookie of the Year, Henry has his mother, but his mother’s boyfriend is a creep, and we don’t know where his father went before he was born.”

“Star Wars too. There’s Luke,” Jamie said. “Good one! Great example!” I cried. Perfect, in fact. Star Wars is like a class reunion of the West’s major mythological motifs.

“Mrs. Doubtfire,” Jamie offered. “Nope, that’s about parents who are divorced and live in different houses,” I said, “but still, in Mrs. Doubtfire the father misses his kids and wants to see them, so he dresses up as a nanny.”

“What about Babe?” Jamie asked.

“Oh yes, that’s a very good example,” I told him. “Babe has no parents, and that’s why he is so happy when Fly agrees to be like his mother.”

“And Rex is like his father,” Jamie added. “And Ferdinand the duck is like his brother.”

Why, yes, Ferdinand is like his brother. This had never occurred to me before. But who knew that Jamie was thinking, all this time, about the family configurations in these movies? And who knew that Jamie knew that so many unhappy families, human and pig, are alike?

But my children, adept narrative theorists though they be, are not my only inspiration. I’m also informed by years of conversations with colleagues in disability studies, including two wholly unexpected encounters that subtly but decisively widened the parameters of my thinking about how an understanding of intellectual disability can transform the way we read.

The first encounter happened at the 2011 Modern Language Association convention in Los Angeles, and in retrospect is merely amusing — though at the time, it seemed like the stuff of professors’ anxiety dreams. I was on a panel titled “Narrative and Intellectual Disability,” chaired by Rachel Adams, a professor of English and comparative literature at Columbia University. I was trying to get at the question of how narrative irony works when it involves a character with an intellectual disability, a character who is rendered explicitly as someone who is incapable of understanding the story he or she inhabits. Steinbeck marks Lennie in this way from Of Mice and Men’s opening scene:

Lennie looked timidly over to him. “George?”
“Yeah, what ya want?”
“Where we goin’, George?”
The little man jerked down the brim of his hat and scowled over at Lennie. “So you forgot that awready, did you? I gotta tell you again, do I? Jesus Christ, you’re a crazy bastard.”
“I forgot,” Lennie said softly. “I tried not to forget. Honest to God I did, George.”

And just as Lennie does not understand where he is going or why, so too will he not understand what is going to happen to him in the book’s final pages; in that sense, his intellectual disability provides the structure for the narrative irony, and the narrative irony defines the novel. Lennie knows not what he does, and we know he knows not what he does. But I mentioned Of Mice and Men only in passing, opening instead with Benjy Compson of The Sound and the Fury and proceeding to a comparison between Elizabeth Moon’s The Speed of Dark and Mark Haddon’s The Curious Incident of the Dog in the Night-Time, because Haddon provides an ingenious (and quite moving) solution to the problem of writing a novel in the voice of a character who (initially) does not understand the narrative he is in, whereas Moon has to skirt that problem by giving us a second level of narrative focalized through characters who do not have autism and who can explain what is at stake in the unfolding of the narrative told by the character who does have autism.

At the last minute, one of my fellow panelists had to pull out of the convention, and Rachel Adams informed us that Rob Spirko, a senior lecturer in English at the University of Tennessee at Knoxville, would substitute instead, with a paper titled “The Human Spectrum: Human Fiction and Autism.” Rob preceded me on the program — and proceeded to deliver a paper about The Speed of Dark and The Curious Incident of the Dog in the Night-Time, making many of the points and citing many of the passages I had hoped to highlight in my paper. As I listened to Rob, I toyed with the idea of taking the podium and saying simply, “My paper is what Rob said,” but just then, he made an offhand reference to the “Rashomon-like” narrative sequence in Philip K. Dick’s Martian Time-Slip. I snapped to attention: This seemed to me to be something worth discussing. I had not written anything about Martian Time-Slip in my paper, but I had recently read it and was still trying to figure out what to make of its extraordinary strangeness. And I was thrilled to be able to discuss it at a conference with Rob Spirko, who has worked on disability and science fiction for some time.

I did wind up delivering most of my original paper; Rob’s arguments and mine did not overlap completely. But I threw in some extemporaneous remarks about how the narrative sequences in Chapters 10 and 11 of Martian Time-Slip are not, in fact, Rashomon-like. If they were, they would involve four characters telling the same story from drastically different perspectives, narrating significantly different sequences of events, such that the very idea of “the same story” becomes untenable. But they don’t. Instead, they open by telling the same story almost word for word, and then proceed into disturbing fantasias that cannot be attributed to any one character — even though each character, the following day, feels the aftereffects of the sequence as a whole. The sequence is not merely “about” the perspective of a character with an intellectual disability; it represents an attempt to render intellectual disability in the form of a disabled textuality that cannot be attributed simply to any one character’s mental operations.

And when I realized that, thanks to the offhand remark of a last-minute replacement speaker giving a 15-minute paper at the MLA convention, I had a critical piece of my argument: a way of talking about intellectual disability and narrative that did not begin and end with the discussion of whether X character has Y disability.

As for the second encounter: To say that it was “wholly unexpected,” as I have done, is actually an understatement. It was pretty much the last thing in the world I might have imagined. It involved a whimsical decision to join Facebook (after years of steadfast, principled resistance) and, relatedly, to go to the 40th-anniversary reunion of my sixth-grade class (not a happy place for me when I was 10, but I thought that the details of tween angst of 1972 were not worth recalling in 2012).

Within a few days of joining Facebook, I was hailed by one Phyllis Anderson, née Phyllis Eisenson — someone I had not thought about in almost 40 years. And in the course of striking up a conversation with this person 40 years after graduation from PS 32 Queens, I happened to mention Madeleine L’Engle’s A Wrinkle in Time, about which I had just been reading; 2012 was the 50th anniversary of the book’s publication, and in 1972 at least half of our cohort had read it. And Phyllis was of course (or so I imagined) Meg Murry, the very smart girl with the long hair and prominent glasses. This rudimentary identification was complicated a bit by the fact that I identified more with Meg than with any male character in the book — with her sense of isolation, helplessness, and vulnerability above all — and by the fact that I did not stop to think that I might be inadvertently saying to my former classmate, “I remember you — you were the girl with glasses and what’s more, everybody thought you were a weirdo.”

But in the course of that brief Facebook conversation, as we caught up on partners and kids and professions (literature professor, clinical psychologist), I mentioned that I was sitting at a table across from Jamie, who had just handed me a list of 25 kinds of sharks. To which Phyllis replied, “I am sure you did not know that my brother is autistic.” Well, I could have plotzed. Needless to say, I did not know that; I did not know anything about this person, starting with the fact that she had a brother. And a brother with autism, in the 1960s (Andy Eisenson was born in 1957).

Oh my goodness, I thought at once, what that must have meant for her mother — to have a child with autism at the precise historical moment when autism was being attributed to “refrigerator mothers.” How difficult that must have been. Followed almost immediately by, oh goodness, and then along comes this very smart girl, the younger sister. I can’t even begin to imagine the family dynamics, except that no, wait, yes I can. And the story quickly went still deeper: Sylvia Eisenson, Phyllis’s mother, was in fact a psychologist in the New York City school system. She was A.B.D. from the University of Illinois at Urbana-Champaign, where I taught for 12 years. She knew very well what Bruno Bettelheim was doing with Leo Kanner’s refrigerator-mothers theory, and had actually written to Bettelheim to tell him that his work was destructive to loving parents. She received a reply from an underling, telling her to get therapy.

All this I learned in the course of one Facebook chat, which somehow went from “oh yes, I remember you” to serious familial and emotional matters in the course of a few minutes. Life in the disability community can be like that; I remember a conference at which someone introduced herself as the parent of a child with Down syndrome and we wound up talking about our then-teenaged sons’ desires for friends, especially girlfriends, within 10 minutes. Because there is so much shared terrain, casual conversations can suddenly turn into serious discussions of special-needs trusts and the ethics of getting your child or sibling to sign over his power of attorney. And then, a few days later, after Phyllis had gone back to look at her copy of A Wrinkle in Time, she wrote:

just read the first chapter of “Wrinkle”, and Meg is described with her glasses and braces and general awkwardness. And I thought — that is at least partly why I liked this book so much — there I am, though without the spunk to duke it out with the kid who said something mean about my brother.

When I read that note I had yet another “oy, what did I say” moment: Oh yes, I remember you with your glasses and braces and general awkwardness? (It turns out the braces came later. I did not remember any braces.) But it was the last clause that grabbed me. No doubt young Phyllis Eisenson, or anyone with a sibling with an intellectual disability, would read Wrinkle with that inflection, with a sense of protectiveness for the more vulnerable family member: Wasn’t this one of the lessons we learned in Reader-Response Criticism 101?

From one angle it is a rudimentary point, a truism: Of course we all bring to every text the welter of experiences, associations, encounters, and intertextual relations we have accumulated over the years. Reader-response criticism made much of this rudimentary point for much of the 1970s, with earnest Critical Inquiry forums on whether readers or texts make meaning, whether meaning is determinate or indeterminate, and whether the hypothetical “Eskimo reading” of Faulkner’s “A Rose for Emily” can be ruled out of court.

But from another angle, this exchange seemed (and seems) to me to open onto a principle of considerable breadth, one that has not yet been considered by literary criticism influenced by disability studies. It is the complement to the Rob Spirko-induced insight that disability in literary texts need not be located in, or tied to, a specific character with an identifiable disability: It is the Phyllis Eisenson-induced insight that disability in the relation between text and reader need not involve any character with disabilities at all. It can involve ideas about disability, and ideas about the stigma associated with disability, regardless of whether any specific character can be pegged with a specific diagnosis. This opens the field of criticism considerably; and I am going to insist that this is a good thing, not least because I am determined to cure disability studies of its habit of diagnosing fictional characters.

My argument is that even as disability studies has established itself in the humanities in a way that was unthinkable 20 years ago, it has still limited itself to too narrow a range of options when it comes to literary criticism; and though I am (obviously) being facetious about the idea of “curing” disability studies of anything, I am quite serious about the conviction that disability studies limits itself unnecessarily, as a new branch of criticism and theory, whenever it confines itself to determining the disability status of individual characters. Disability studies need not and should not predicate its existence as a practice of criticism by reading a literary text in one hand and the DSM-5 in the other — even when a text explicitly announces that one or more of its characters is (for example) on the autism spectrum. It is not that a character’s condition is irrelevant to how we read him or her; rather, I want to argue that we should avoid the temptation to think that a diagnosis “solves” the text somehow, in the manner of those “psychological” interpretations of yesteryear that explain Hamlet by surmising that the prince is, unbeknownst to himself, gay.

In opening the question of the potential relations between disabled and nondisabled characters (and readers’ potential relations to those relations) so that it includes characters who are merely presumed to be intellectually disabled by their fellow characters (such as Coetzee’s Michael K and Friday, from his novel Foe), we come to recognize intellectual disability not merely as the expression of somatic/neurological symptoms but as a trope, a critical and underacknowledged thread in the social fabric, a device for exploring the phenomenon of human sociality as such.

This is not merely a matter of remarking that the idiot and the holy fool offer strategic insight into human hierarchies and the contingency of systems of value, though it is partly that; it is also a matter of gauging how literary works depict systems of sociality in part by including characters who either are, or who are presumed by their fellow characters to be, constitutively incapable of understanding or abiding by the social systems by which their worlds operate. When you think about it that way, and you should, you realize that we have only just begun to explore the implications of disability for our understanding of the narratives we read and the narratives we live by.

Michael Bérubé is the Edwin Erle Sparks professor of literature and director of the Institute for the Arts and Humanities at Pennsylvania State University. He is the author of The Secret Life of Stories: From Don Quixote to Harry Potter, How Understanding Intellectual Disability Transforms the Way We Read (NYU Press, 2016).

The Difference a Mutant Makes

—Ramzi Fawaz

[This piece originally appeared on Avidly.]

Like any good origin story, I’ve told this one a thousand times: The first comic book I ever read was X-Men #80, the 35th Anniversary issue of America’s most popular comic book series, which, for over three decades, had narrated the lives, loves, and losses of a band of mutant outcasts gifted with extraordinary abilities because of an evolution in their genetic makeup. It was 1998. I was thirteen years old and the cover of this single comic book issue was a young gay boy’s dream: a shiny pink hologram with a tower of dazzling disco-attired superheroes exploding before one’s eyes.

Growing up in a queer family, sibling to a gay brother, and bullied to tears on a daily basis for my own exuberant gayness, the words “A team reunited…a dream reborn” emblazoned on that cover spoke to me of the promise and possibility of queer kinship and solidarity in the face of all odds. Above all, what struck me about that cover was the sheer variety of characters depicted—how could a man made of steel, an intangible woman, a white-haired weather goddess, a butch teen girl with bones sticking out of her skin, and a teleporting blue elf be any kind of team? Who were these people, I wondered, and what kind of dream did they share?

Like so many readers of the X-Men over the decades, no character drew me in more than the weather goddess Storm, a Kenyan immigrant to the U.S., the first black woman superhero in a mainstream comic book, and by the 1990s, the X-Men’s team leader. In that same anniversary issue, at a low point in the team’s battle with an imposter group of X-Men, Storm rallies her bruised and beaten comrades by reminding them that what defines their bond is a set of shared values, a chosen kinship maintained through mutual love and respect, not by force or expectation. With my budding left-wing consciousness on one side, and my attachment to queer family on the other, I fell in love with this fictional mutant goddess and her team: this was the kind of community I longed for. How did it come to be that a thirteen-year-old Lebanese-American, suburban gay boy found common cause with an orphaned, Kenyan, mutant, immigrant X-Man?

If one were to try to answer this question by turning to recent public debates about superhero comics, the answer might be: “diversity.” Yet this term and its shifting meanings—variety, difference, or representational equality—would have rung false to my thirteen-year-old ears. It was not simply the fact of Storm’s “diverse” background as Kenyan, immigrant, woman, or mutant that drew me to her, but rather her ethical orientation toward those around her, her response to human and mutant differences, and her familial bond with her fellow X-Men. These were qualities significantly shaped by her distinct differences, but not identical to them. This was not any traditional idea of diversity, then, understood as the mere fact that different kinds of people exist. Rather, what Storm and the X-Men embodied was true heterogeneity: not merely the fact of many kinds of people but what those people do in relation to their differences. As I became a dedicated comic book fan, I realized that every issue of the X-Men was an extended meditation both on the fact that people are different from one another, and on the reality that each and every person must forge substantive, meaningful, intelligent responses to those differences.

As a teenage reader, I simply took this fact for granted as part of the pleasures of reading superhero comics. As a scholar years later, I came to realize that the ability to respond to differences and forge meaningful relationships across them was a capacity, a super-power if you will, that comics could train their readers to exercise, an imaginative skill fit for a truly heterogeneous world. It was this realization that led me to write The New Mutants: Superheroes and the Radical Imagination of American Comics, in which I ask the question: what is it about the visual and narrative capacities of the comic book medium, and the figure of the mutant, cyborg, or “freak” superhero in particular, that has allowed so many readers to develop identification with characters across race, class, gender, sexuality, ability, and cultural origin?

Recent public dialogue about the rapidly diversifying ranks of superhero comic books has overwhelmingly celebrated the increased racial, gender, sexual, and religious variety of America’s greatest fictional heroes. Yet every time a news outlet lauds the major comics companies for introducing a gay superhero, or a Pakistani superhero, or a classically male superhero replaced by a powerful woman, the historian in me thinks, “but comics were doing that in 1972, so what’s the big deal now?”

Certainly, one potentially distinct element of today’s push for diversity is the range of “real-world” or identifiable differences comics are willing to name and represent on the comic book page. But in writing The New Mutants, I came to the conclusion that without an underlying democratic ethos or worldview, such real-world differences have little meaning. In The New Mutants, I argue that cultivating egalitarian and democratic responses to differences became the sine qua non of American superhero comics from the 1960s through the early 1990s.

I call this vision a “comic book cosmopolitics,” an ethos of reciprocal, mutually transformative encounters across difference that infused the visual and narrative content of comics for nearly three decades. In the 1960s and 1970s, comic book series like the Justice League of America, the Fantastic Four, and the X-Men provided readers an exceptionally diverse range of new characters and creative worlds, but most importantly, modeled what it might look like for those characters to bridge divides of race, species, kin and kind for their mutual flourishing and the good of the world. What “doing good for the world” meant or could mean was the question that motivated these characters to engage one another, forge bonds, disagree, and take collective action. Today’s celebratory proclamations about the internal diversity of American comics ignore the fact that by 1984 Marvel Comics alone had Kenyan, Vietnamese, Native American, Russian, American working-class, Jewish, and Catholic superheroes, and even a Pagan demon sorceress at the helm of one of its main titles.

What distinguished these earlier figures from their contemporary counterparts are the seemingly endless dialogues and struggles they engaged in to negotiate, respond to, rethink, and do something with their differences as a matter of changing the world. It was this negotiation within the context of characters’ actual diversity that allowed readers like me, and thousands more, to identify with a vast range of people who were, at least on the surface, radically unlike us.

In a recent op-ed for the New York Times, “That Oxymoron, The Asian Comic Superhero,” columnist Umapagan Ampikaipakan makes the counterintuitive claim that the push for racial diversity in contemporary superhero characters, rather than reflecting the progressive evolution of the superhero, might actually “dilute” the figure’s fundamental purpose: to function as a universal fantasy of belonging. The more specific or particular the superhero gets, he suggests, the less the character speaks to all kinds of readers.

Ampikaipakan explains that as a child growing up in Kuala Lumpur, even thousands of miles away from U.S. culture, he found himself identifying with the misfit and freak Spider-Man. It didn’t matter that Spider-Man and so many of the superheroes in the Marvel Universe were white. Rather, it was the message these comics carried about the value of being a freak or an outcast that translated across both actual and virtual distance.

In the face of much public celebration of comic book diversity, Ampikaipakan’s argument is compelling because it refuses a reductive understanding of identity politics, namely that seeing oneself or one’s own particular identity reflected back in any given character is the only possible way that one can feel invested in them or their creative world. This argument is undoubtedly correct, yet severely misguided.

The mistake Ampikaipakan makes is not to claim that readers have the capacity to identify with a range of characters regardless of their social identity, but his failure to stress that it is difference and distinction itself that has made the superhero such a durable fantasy for so many readers globally, not the figure’s empty universality or the flexibility of whiteness to accommodate a variety of identifications. It is the fact that superheroes highlight (rather than overlook) the social, cultural, and biological differences that shape humankind that makes identifying with them possible—this is why one superhero is never enough. Superheroes proliferate because no matter how many there are, they can never quite capture the true heterogeneity of everyday life. The attempt to do so is what keeps us reading.

We should not settle for the mere representation of more diverse characters, as though the very existence of a female Pakistani Ms. Marvel alone were an act of anti-racism, or anti-sexism; these latter categories describe not a representation or image, but an ethos, a worldview and way of life—this ethos is what Ampikaipakan was drawn to in reading Spider-Man. It was an underlying ideal of celebrating outcasts, misfits, and freaks—a democratic investment in all who did not fit into the model of “normal” American citizenship—that defined Marvel Comics in the 1960s and 1970s, and that shaped readers’ relationship to characters like Spider-Man and his universe of mutant, alien, and superhuman friends, all of whom we grew to love because of their particularities, differences, and distinctions, not their imagined universality. As readers, we must demand that the depiction of more diverse characters be motivated by an ethos attentive to human heterogeneity, its problems and possibilities; these characters must be placed into dynamic exchange with the world around them, rather than merely making us feel good that some more of us are now included every once in a while.

Take for example the dramatic creative decision by writer Matt Fraction to relocate the X-Men from their long-standing home at the Xavier Institute for Higher Learning in Westchester County, New York, to San Francisco in 2008. With this momentous move to one of America’s most recognized gay holy lands, it seemed as though the X-Men series had finally made explicit its long-standing symbolic association between mutation, as a fictional category of difference, and gayness, as a lived form of minority identity. And yet, in the handful of years that the X-Men resided in San Francisco between 2008 and 2012—where they lived as billionaire jet-setters buying property in the Marin Headlands at the height of a national recession, no less—never once did they address the city’s massive housing crisis, the rising rates of violence against the city’s queer and minority populations, or the shifting status of HIV. Did the X-Men even deign to go to a gay club during their time in the Golden Gate city? Did the team’s one putatively “out” character, Northstar, claim any solidarity with the city’s queer community? I’m afraid not.

The series capitalized on its symbolic gesture of solidarity with minorities, queers, and misfits, but it jettisoned its earlier substantive engagement with the problem of difference: back in 1979, when Storm visited the slums of Harlem and witnessed the reality of youth homelessness and drug abuse, she was forced to contend with the realities of inner-city African American life from the perspective of a Kenyan immigrant who experiences blackness differently than African Americans and the working poor; and in the early 1990s, with the invention of the fictional mutant disease the Legacy Virus, the X-Men series used fantasy to address the AIDS crisis by thinking through the kinds of solidarities mutants and humans would have to develop to respond to a genetic disease ravaging the mutant population.

Storm Visits Harlem 1979

In today’s comic book pages, is there a single X-Man with HIV? Now that Iceman is out of the closet, will he go on PrEP, the HIV prophylactic? And as a former ladies’ man, will his sexual health be an issue at stake in the series? The likelihood that Bobby Drake’s gayness will either be treated substantively or have a meaningful effect on the social fabric of the Marvel Universe seems very low in today’s creative environment, where the mere “outing” of characters as exceptionally diverse in their identities is presupposed as an act of social benevolence on the part of writers and artists.

My point here is not that superhero comics need greater realism in their storytelling or should be more “true to life.” Rather, superhero comics are one place where fantasy and creative worldmaking can run up against the specificities of our everyday lives, so that “real life” is presented to us anew or opened up to other possibilities. Mutation and gayness, for instance, are not the same thing. But they resonate in surprising ways.

The imagined category of mutation sheds light on the workings of a real-world social identity like gayness or blackness, but it also reveals the limits of analogy, because these categories are never quite identical. The ability to distinguish between the places where differences overlap and where they don’t is a political skill that fantasy can help us develop. It demands that we not only see where solidarity can be forged, but also figure out what to do when sameness no longer holds, or when our differences overwhelm the ability to forge meaningful bonds.

What I was doing that summer day when I read my first issue of the X-Men was figuring something out not only about myself, but about my relationship to the world around me as someone who fundamentally understood that I was different, but didn’t yet know how to respond to being different. This is the true gift that superhero comics have given to American culture in the 20th century, but it is a creative offering increasingly taken from our grasp.

When Marvel Comics reached a creative level of near-maximum mutant heterogeneity in the X-Men series around 2005—a moment of incredible promise when mutants no longer appeared as a minority but as a significant portion of the human population—Marvel spun out a barrage of storylines, from “E is for Extinction” to “House of M,” that depicted the mass slaughter of the majority of the world’s mutants by members of their own kind. The X-Men have been living in the shadow of genocide ever since: shot down by ever more efficient mutant-killing robots, murdered and harvested for their organs, nearly eliminated from history by time-traveling mutant hunters, and now subject to M-Pox, another genetic disease threatening to wipe out the mutant race. In a sense, Marvel could not face the complexities of the world it had created, and decided to obliterate it instead: in so doing, fantasy truly became a reflection of our violent post-9/11 reality.

Contrast the exuberant, bubble-gum pink cover of the first X-Men comic book I ever read with the most recent issue I picked up: the cover of the renumbered Uncanny X-Men #1 (2013), written by Brian Michael Bendis, presents us with a picture of mutants at war. There is a revolution afoot, but it is led by a single male figure, the famed character Cyclops, reaching out to the reader from the center of the page with his army of followers rendered as mere black-and-white outlines in the background. That army is composed of some of the most complex characters ever to grace the pages of the X-Men series, yet here they have been flattened into ghosts haunting the background of the X-Men’s former dream.

The X-Men now appear as a leather-clad, armored military unit, not a high-flying, exuberant, queer menagerie. At the center, Cyclops’ hand reaches out to us not in a gesture of solidarity but as a claw, perhaps ready to grip our throats. In this new chapter of the X-Men’s history, mutants are presented as divided over the right path toward the preservation of the mutant race. But instead of rich, textured disagreements, the characters appear merely as ideologues spouting flat and rigid political manifestos. There is no space for genuine debate, or loyalty amidst disagreement, or even the notion that more than one dream could exist side by side among companions. As Alex Segade has recently argued in a brilliant Artforum article on the X-Men’s decades-long mutant mythology, the recent introduction of increasingly “diverse” cast members to the series has come at extraordinary cost, including the mass deaths of entire swathes of mutants around the world.

In the X-Men, fantasy—that realm meant to transport us to a different world—has become the ground for narrating the collapse of all visions of hope, social transformation, or egalitarian action: in Bendis’ epic narrative the original five teenage members of the X-Men are teleported to the present only to see that their youthful dreams of peaceful relations between human and mutant kind have resulted in death, destruction, and seemingly endless violence.

When I recently caught up on Bendis’ X-Men plot about time-traveling mutant teenagers, I wondered what my thirteen-year-old self would have thought about his future had these been the comic book issues he first encountered in the summer of 1998. But I’m lucky they weren’t. In the face of the kinds of violence and death-dealing that recent comics present, I remember that there are other uses for fantasy, because it was the X-Men series itself that first showed me they were possible. Today, as years of reading, thinking, and writing about superhero comics come together with the publication of The New Mutants, I look back at that first cover image of X-Men #80 with a mix of longing and hope: I wonder how a team can be reunited, and how new dreams can be born.

Ramzi Fawaz is an Assistant Professor of English at the University of Wisconsin, Madison. He is the author of The New Mutants: Superheroes and the Radical Imagination of American Comics (NYU Press, 2016).

Zach Anderson, Statutory Rape Laws, and the History of Age

—Nicholas L. Syrett

In October of this year, thanks in part to a petition that circulated on change.org, stories that went viral on Twitter, Facebook, and Reddit, and a good deal of media attention, an Indiana man named Zach Anderson was removed from the sex offender registry in two states. He had originally been added after having been convicted of statutory rape.

The story, briefly, is this: last year Anderson met a girl on the app “Hot or Not.” He drove from his home in Elkhart, Indiana, to her home in Niles, Michigan, about twenty-two miles away, where they had sex. Anderson was nineteen at the time, and the girl said that she was seventeen and had registered in the “adults” section of the app. It turned out she was only fourteen; the age of consent in Michigan is sixteen, meaning that Anderson had unknowingly committed statutory rape.

After being arrested, Anderson pled guilty to a charge of fourth-degree criminal sexual assault and spent seventy-five days in jail. Under the terms of his five-year probation, he was forbidden from using the Internet, owning a smartphone, or having any contact with anyone under the age of seventeen (save immediate family). He was also placed on the sex offender registry in Indiana and Michigan until 2040.

The case attracted the attention and outrage it did in part because Anderson’s parents were savvy users of the very Internet from which their son was banned, but also because the girl admitted to lying about her age, because she and her parents opposed the prosecution, and because the judge in the original sentencing was unbending in handing down the punishment stipulated by the statute. The case provoked broader discussions about teenage sexuality in the age of the Internet and the long-term repercussions of statutory rape laws that brand teenagers as sex offenders.

What received less attention, however much it was lurking just beneath the surface of these conversations, was the function of age itself in what happened to Zach Anderson and his youthful sexual partner. While the girl was only fourteen, either she looked like she could be seventeen or Anderson simply willed himself to believe that this was so. Michigan’s penal code stipulates that someone is guilty of “criminal sexual conduct in the fourth degree” if the victim “is at least 13 years of age but less than 16 years of age, and the actor is 5 or more years older than that other person.”

This age-gap exception to statutory rape law is meant to protect the older of two sexually active teenagers from prosecution; it is a result of late twentieth-century revisions to rape laws that came in the wake of the sexual revolution, after which high school students were more likely to have sex than when the original laws were passed in the late nineteenth and early twentieth centuries. But Anderson was not helped by this part of the law, despite still being a teenager, because he was nineteen. Had Anderson only been eighteen years old, one year younger than he was, he could not have been charged under the law, because the two would only have been separated by four years, not five. Had his birthday been perhaps a little later in the year or hers a little sooner, there would not have been a crime.

Age is a blunt and sometimes arbitrary legal instrument. Legal age cannot accommodate those who lie or who do not “look” their age. In a time when all of our ages are precise, fixed, and documented, Anderson had no wiggle room. He was a nineteen-year-old adult and she was a fourteen-year-old child; despite all the ambiguity of their meeting, in his first trial there had simply been no way around this. Age-based laws have also offered little meaningful protection to actual victims of sexual violence, such as the countless children abused by Catholic priests.

This reliance on age has a history. Only about a hundred years ago Americans were completing the process of incorporating age into their criminal and civil law, as well as their collective consciousness. Since that time, they have increasingly used it as if it were a simple, incontrovertible, biological fact. In one way that is undeniable, but as the case of Zach Anderson also demonstrates, age itself does not fully accommodate or contain all aspects of human beings’ development, capabilities, and responsibilities. It is a legal shorthand that all too often—and statutory rape law is only one example—perpetuates limitations and enacts violence when it is meant to protect or enable. Further, at a moment when the age one purports to be online can have little relation to one’s actual age, and when numerous websites (not just Hot or Not) do little to verify age, we have entered a new era in the malleability of age itself.

While Zach Anderson’s story has a (relatively) happy ending—in the form of his revised sentence—many other young men remain in jail and on sex offender registries because their age at the time of their sexual encounters placed them there.

Nicholas L. Syrett is an associate professor of history at the University of Northern Colorado, the coeditor (with Corinne T. Field) of Age in America: The Colonial Era to the Present (NYU Press, 2015), and author of The Company He Keeps: A History of White College Fraternities (2009) and the forthcoming American Child Bride: A History of Minors and Marriage in the United States.

Innocent Children and Frightened Adults: Why Censorship Fails

—Philip Nel

Few things upset American adults more than books for children and adolescents. If you look at the American Library Association’s annual list of Challenged and Banned Books, the top 10 titles are nearly always those written for or assigned to young people: J.K. Rowling’s Harry Potter novels, Dav Pilkey’s Captain Underpants series, Mark Twain’s Adventures of Huckleberry Finn, Toni Morrison’s The Bluest Eye. On those rare occasions when the books are not intended for school-age readers or given as homework, they’re on the list because young people are reading them anyway: E.L. James’ Fifty Shades of Grey, a favorite target for 2012 and 2013.

Banned and challenged books tell us very little about what is suitable for actual children. Instead, books targeted for censure offer an index of adult fears, reflecting, as David Booth says, “changing ideas about childhood and notions of suitability.”1 Censorship is also transideological, advocated by people of many political persuasions. Progressive censors seek to scrub away racism from Doctor Dolittle and Huckleberry Finn, creating Bowdlerized editions of the books. Conservative censors wish to protect children from knowledge of the human body: as a result, Robie Harris and Michael Emberley’s It’s Perfectly Normal: Changing Bodies, Growing Up, Sex, and Sexual Health frequently lands on the ALA’s Challenged-and-Banned Books list.

While censorship will not keep young people safe, censors and would-be censors are right about two things. First, books have power. Second, responsible adults should help guide young people through the hazards of the adult world.

However, like all attempts to safeguard children’s innocence, removing books from libraries and curricula is not only doomed to failure; it is an abdication of adult responsibility and, as Marah Gubar writes of associating innocence with childhood, “potentially damaging to the wellbeing of actual young people.”2 A responsible adult recognizes that innocence is a negative state — an absence of knowledge and experience — and thus cannot be sustained. Shielding children from books that offer insight into the world’s dangers puts those children at risk. As Meg Rosoff notes, “If you don’t talk to kids about the difficult stuff, they worry alone.”3 Books offer a safe space in which to have conversations about difficult subjects. Taking these books out of circulation diminishes understanding and increases anxiety.

Separating children from books also fails to recognize that peril is not distributed randomly throughout the population, but concentrated in groups identifiable by their members’ race, gender, class, sexuality, disability, or religion. Preventing teenagers from reading Laurie Halse Anderson’s Speak or Maya Angelou’s I Know Why the Caged Bird Sings impedes them from learning about what survivors of rape endure, and how peers and teachers might better help them. Blocking children from reading Justin Richardson, Peter Parnell, and Henry Cole’s And Tango Makes Three prevents them from understanding that same-sex parents appear elsewhere in the animal kingdom, too. Banning Tim O’Brien’s The Things They Carried and Walter Dean Myers’s Fallen Angels stops readers from discovering how war shapes a young psyche. Prohibiting Sherman Alexie’s The Absolutely True Diary of a Part-Time Indian impedes young people from learning about the hard realities of life on a reservation, and from getting to know the novel’s resilient, funny protagonist. These books provide mirrors for young people of similar backgrounds or experiences, and windows for those of different ones.

Furthermore, preventing children from reckoning with potentially offensive works ill prepares them for the indignities that life will inflict. They should read books that trouble them, and have serious conversations about those books. For example, while Twain was a progressive nineteenth-century white author, if his Huckleberry Finn doesn’t offend contemporary readers, then they’re not reading it carefully enough. It’s not just the repeated use of the n-word, which should make people at least uncomfortable and at most angry (news flash: it was a racial slur in the nineteenth century, too). The portrayal of slave-owning Uncle Silas as a kindly “old gentleman” (Huck calls him “the innocentest, best old soul I ever see”) offers an apology for white supremacy. Assigning Huck Finn provides an occasion not only to talk about a classic American novel, but to teach people how to read uncomfortably, and to cope with experiences that upset them.

Though the motive is protection, restricting access to books hurts the children and teens who need them most. Young readers in vulnerable populations crave stories that help them make sense of their lives. Denying them access to these books contributes to their marginalization and puts them at greater risk. In any case, children often have experiences that they do not yet have the words to express: reading books can provide them with the words, and help them better understand. As Mr. Antolini tells Holden Caulfield in J.D. Salinger’s The Catcher in the Rye (another frequently challenged book), “you’ll find that you’re not the first person who was ever confused and frightened and even sickened by human behavior.… Many, many men have been just as troubled morally and spiritually as you are right now.  Happily, some of them kept records of their troubles.  You’ll learn from them — if you want to.”4

Young people do want to learn. Concerned adults should acknowledge innocence’s inevitable evaporation, and recognize that the young likely know more than we think they do. So, respect their curiosity. Take their concerns seriously. Let them read. Let them learn.

Notes

  1. David Booth, “Censorship,” Keywords for Children’s Literature, eds. Philip Nel and Lissa Paul (NYU Press, 2011), p. 26.
  2. Marah Gubar, “Innocence,” Keywords for Children’s Literature, eds. Nel and Paul, p. 122.
  3. Meg Rosoff, “You can’t protect children by lying to them — the truth will hurt less.” The Guardian 20 Sept. 2013: <http://www.theguardian.com/lifeandstyle/2013/sep/21/cant-protect-children-by-lying>.
  4. J.D. Salinger, The Catcher in the Rye (1951; Bantam Books, 1988), p. 189.

Philip Nel has co-edited two books for NYU Press: Tales for Little Rebels: A Collection of Radical Children’s Literature (2008, with Julia Mickenberg) and Keywords for Children’s Literature (2011, with Lissa Paul). His most recent books are Crockett Johnson and Ruth Krauss: How an Unlikely Couple Found Love, Dodged the FBI, and Transformed Children’s Literature (UP Mississippi, 2012) and Crockett Johnson’s Barnaby, Vol. 1 (1942-1943) and Vol. 2 (1944-1945) (2013 & 2014, both co-edited with Eric Reynolds, Fantagraphics). He is University Distinguished Professor of English at Kansas State University.

Are Jews Too Sexy for the Censors?

—Jodi Eichler-Levine

Jewish authors are tremendously popular when it comes to banned-books lists. Judy Blume, Lesléa Newman, and Anne Frank are all represented on the American Library Association’s lists of the 100 most banned books of 1990–1999 and 2000–2009. These are just a few examples of Jews whose work has been targeted, banned, and even burned in America. All of these women’s oeuvres have one thing in common: sex. Sex, along with race and obscenity, is one of the most common rationales presented for banning books. Are Jewish authors too sexy for the library?

It is Banned Books Week, a time to reflect on how censorship has altered the American literary landscape and to take action on freedom of expression. For 2015, the week’s focus is on Young Adult literature: that mildly nebulous, wildly popular genre that crosses between “juvenile” and “mature” literature.

Jews, particularly Jewish women, have been stereotyped as licentious and sexually voracious ever since the nineteenth century brought us notions of the “exotic Jewess,” a seductive, Orientalized creature. More generally, the notion of “carnal Israel,” contrasted with “spiritual” Christianity, dominated many public discussions of Jews. As Josh Lambert argues in Unclean Lips, the very notion of obscene speech in America derives in part from reactions toward Jewish and other writing as insufficiently pure and “American.” Modern Jewish writers and filmmakers have received a disproportionate number of obscenity charges.

Judy Blume’s Are You There God? It’s Me, Margaret. caused a stir when it was first published in 1970. Censors charged that the book offends along axes of both sex and religion. On the one hand, Margaret famously discusses menarche with God, asking when she will finally get her first period; she also describes her early, and really quite limited, sexual experiences. At the same time, Margaret faces tensions as the daughter of a Jewish father and a Christian mother, and her Christian grandparents are particularly doctrinaire. As a result, the book was charged with “anti-Christian sentiment.” Blume’s Tiger Eyes, Forever, and Blubber have also been banned, typically for explicit sex scenes or “immoral content.”

Fast-forward two decades to 1989, when Lesléa Newman, an out Jewish lesbian poet and children’s book author, publishes Heather Has Two Mommies, a picture book about a lesbian family. Newman has written about her desire to have all children see their families represented in children’s books; she also connects this longing to her own experience with the lack of Jewish family representations during her youth. In the New York public schools, the book’s inclusion as a suggested reading for the “Children of the Rainbow” curriculum led to public outrage and the downfall of a schools chancellor.

Ironically, it was the protection of children—a deeply romanticized notion of marriage as primarily a venue for procreation—that drove Supreme Court Justice Anthony Kennedy’s majority opinion in the very case that legalized marriage equality, just over a quarter century after Newman’s book. Heather Has Two Mommies was re-released just a few months before the decision, with some minor changes. Heather’s mommies now wear matching rings.

Think of the children! This is the double-edged refrain behind both intellectual censorship and legal progress. Both turn upon the notion of young people as innocent blank slates who lack the agency and discernment to make their own sense out of challenging material. This irony is starkest when we consider bans on one of the most popular Jewish authors across the globe, namely: Holocaust victim Anne Frank.

Various edited versions of her journal, usually titled Anne Frank: The Diary of a Young Girl, have been wildly popular and assigned in American schools since the book was translated into English in 1951. In 2013, the diary was almost removed from the curriculum in Northville, Michigan. Most bans have stemmed from Frank’s explicit reflections on her emerging sexuality and the changes in her adolescent body. Once again, the Jewish female body is suspect: oversexed, potentially dangerous, and overtly described. Frank’s body is a particularly contested one, as her diary and other portrayals quickly hardened into forms of hagiography, leading contemporary Americans to resist associations between Frank and sex.

Does the censorship of Jewish-authored books truly stem from old, anti-Jewish caricatures? Is it mere happenstance? The causality is not simple. Intersectional identities cannot be neatly sorted out and untangled. I will say this: these particular Jewish authors do not shrink from our always-embodied lives. They trouble sexual taboos, portray powerful women, and challenge social mores, pushing for social justice in their lives and in their art. Like the much-maligned isha zara—the “foreign woman”—of the book of Proverbs, these women cross boundaries and give us new visions. What some see as transgression, others see as its flip side: wisdom.

Keep writing, Jewish women. Write strong.

Jodi Eichler-Levine is Associate Professor of Religious Studies and the Berman Professor of Jewish Civilization at Lehigh University (PA). She is the author of Suffer the Little Children: Uses of the Past in Jewish and African American Children’s Literature (NYU Press, 2015).

A Texas teenager’s arrest points to a deep and growing trend of Islamophobia

—Moustafa Bayoumi

By now you’ve heard about Ahmed Mohamed, the 14-year-old Muslim-American kid from Texas who built a clock at home and brought it to school to show to his teacher, only to be arrested on the ridiculous suspicion that his invention was a bomb.

Young Ahmed was handcuffed, taken to police headquarters, fingerprinted and questioned without his parents present. During his interrogation, as The Washington Post reports, the officers repeatedly brought up his last name.

Here is an inventive Sudanese-American teenager in a NASA T-shirt whose curiosity and ingenuity are rewarded with handcuffs and punishment.

Things turned out well for Mohamed in the end — President Obama tweeted at him, and Mohamed is fielding invitations to visit MIT and Harvard.

Cool clock, Ahmed. Want to bring it to the White House? We should inspire more kids like you to like science. It’s what makes America great.

— President Obama (@POTUS) September 16, 2015

But the national attention his absurd arrest has garnered is an exception. Most of the time, bigotry against Muslims goes unremarked upon or even gets rewarded.

The same week that Mohamed brought his clock to school, vandals spray-painted hate-filled messages on a mosque in Kentucky. Days earlier in a Chicago suburb, Inderjit Singh Mukker, a Sikh-American father of two, was repeatedly punched in the face while his attacker yelled, “Terrorist, go back to your country, bin Laden.” (Sikhs are often the victims of anti-Muslim hate crimes because of their beards, turbans and skin color.) On this year’s anniversary of the 9/11 attacks, a Florida gun shop owner offered $25 off any gun purchased online with the coupon code “Muslim.”

In case you think anti-Muslim sentiment is limited to the fringes, consider this University of Connecticut study. Researchers there last year found that job applicants with identifiably Muslim names received “32 percent fewer e-mails and 48 percent fewer phone calls than applicants from the control group, far outweighing measurable bias against the other faith groups.”

Official agencies reflect these attitudes, too. The New York Police Department was caught spying a few years ago on every facet of Muslim life around the region. This was massive, expensive surveillance performed without even the hint of any criminal activity. And federal policies such as the Countering Violent Extremism initiative stigmatize Muslim-Americans as terrorists, even though the number of terrorist attacks committed by Muslim-Americans is minuscule, far smaller than the number perpetrated by right-wing extremists.

Islamophobia infests our politics and our society. Republican presidential contender South Carolina Sen. Lindsey Graham supports the surveillance of mosques, while former Democratic presidential candidate Wesley Clark recently proposed the reintroduction of internment camps for “radicalized Americans.” Muslims across the country regularly face opposition in constructing their houses of worship and are routinely demonized in the media.

What most Americans don’t realize is how exhausting it is to live a Muslim-American life in this environment. Many see anti-Muslim attitudes not as bigoted but as common sense. Ordinary things that Muslims do, such as cleverly making a clock at home to show off at school, can be interpreted as suspicious and threatening.

Islamophobia in the United States today is real and it’s growing. Like Ahmed Mohamed, we need to be inventive, and find solutions that will help our country live up to its ideals.

Moustafa Bayoumi is the author of This Muslim American Life (NYU Press, 2015), and How Does It Feel To Be a Problem?: Being Young and Arab in America, which won an American Book Award and the Arab American Book Award for Nonfiction. He is Professor of English at Brooklyn College, City University of New York (CUNY).

[This piece originally appeared in The Progressive.]

Playing (anti-)blackness: Expanding understandings of racism in sport

—Stanley I. Thangaraj

The National Basketball Association’s (NBA) Atlanta Hawks entered the 2015 playoff run as the number one seed in the Eastern Conference, and with one of the best records in franchise history. Even with injuries to key defender Thabo Sefolosha, role player Demarre Carroll, and bull’s-eye shooter Kyle Korver, the Hawks’ efficient offensive attack and stifling defense propelled them to the Eastern Conference finals. Though the Cleveland Cavaliers defeated the Hawks, the team had much to celebrate after a very successful season of winning streaks. With their rewarding season, however, came a kind of forgetting, or even worse, a limited understanding of race. As the Hawks did well, the racial violence within sport became an invisible background to their stories of sporting success. In this essay, I will demonstrate how narrow versions of blackness (as seen in the case of Hawks general manager Danny Ferry and civil rights icon Andy Young) marginalize black migrant, queer, and trans people, further depoliticizing and delegitimizing anti-racism campaigns.

During the recruitment period in the summer of 2014, Hawks general manager Danny Ferry was on a conference call with other team executives to discuss potential free agents. Ferry, a white male and former NBA and Duke University player, looked through his data on South Sudanese American player Luol Deng and stated that Deng “has a little African in him.” Elaborating on the inflammatory comment, Ferry admitted to perusing various sources of material gathered on Luol Deng and added, “He’s like a guy who would have a nice store out front and sell you counterfeit stuff out of the back.”

Danny Ferry’s comments remind us how the anti-black racism in larger American society seeps and bleeds into the very fabric of sport. The presence of black athletes in the NBA does not make mainstream American sport “post-racial.” These comments and the events that followed them not only demonstrate the presence of racism but also the containing of blackness as identity and politics. In present-day U.S. society, we must carefully evaluate the immediate history of anti-black violence and interrogate it, if we seek to fully understand the ways in which blackness is contained.

The loaded and vile evaluations of Luol Deng resulted in Danny Ferry taking a leave of absence. Many individuals came to Ferry’s support. That support, as I will argue, reflects a problematic understanding of blackness that is out of touch with the Black Lives Matter movement and with trans women of color organizing. Organizations like the Audre Lorde Project link anti-black racism to xenophobia, anti-immigrant practices, and U.S. imperialism. We do not yet see this expansive social justice campaign in sport. Instead, after the leak of Ferry’s comments, Atlanta Hawks head coach Mike Budenholzer (who was named 2015 “coach of the year”) stated that it was the genius of Danny Ferry that played a part in the Hawks franchise’s success. This affirmation of Ferry as a professional genius and not a racist—unlike former Los Angeles Clippers owner Donald Sterling, who was pushed out of the league for his racist comments about black people sitting in his seats—is part of a new terrain of expressing race, one that is simplistic in its construction of blackness and its privileging of whiteness. Because Luol Deng was African, he was somehow outside the respectable bounds of care and thus not able or allowed to speak against racism. Certain representations of native-born blackness become iconic, while the black migrant Other is seen as duplicitous, dodgy, and untrustworthy.

To my shock, though perhaps not my surprise, former Atlanta mayor and civil rights legend Andy Young, a leader in Dr. Martin Luther King Jr.’s Southern Christian Leadership Conference, came to the side of Danny Ferry. According to ESPN staff writer Kevin Arnovitz, when asked whether Ferry should lose his job, Young responded, “Hell no.” Young said that had he been the decision-maker in the Hawks executive offices, he would have encouraged Ferry to stay on. He added that he does not believe Ferry is a racist. To make matters even more complicated, he substituted himself into the equation to free Ferry of any blame: “No more than I am,” Young told the Atlanta station. “That’s a word that you cannot define, ‘You are a racist.’ You can’t grow up white in America without having some problems. You can’t grow up black in America without having some subtle feelings.”

Andy Young’s comments, although disheartening in their disregard for the harrowing experiences of racial violence, should not be seen as exceptional. Rather, they are part and parcel of a projection of African American identity through which certain nefarious alliances are made between black and white elites. A version of blackness is created through Young’s comments: a narrow, constricted, and limited understanding of blackness that elides and dismisses entire groups of people. This version of blackness contains threads of xenophobia that justify racist acts against immigrant black individuals like Luol Deng.

I believe Young’s support of Ferry’s keeping his job is tied to a clearly bounded blackness with specific national contours. Deng’s refugee status and African identity undermined his claims to blackness and to experiences of anti-black racism. In the process of constructing what black is by stating who is not black—in this case, Luol Deng—we see the parameters of blackness and ideas of respectability come to the surface. By neither condemning Ferry’s statements nor supporting his dismissal, Andy Young manufactures and ingrains versions of blackness that make the victim of racism the middle-class, native-born, heterosexual, Christian African American man.

By not seeing Ferry’s racial statements as problematic, Young defines blackness and the attendant experiences of racism in limiting ways that fail to account for the heterogeneity and contradictions within blackness. An overemphasis on the black Atlantic is prevalent in how we think about race, racism, and activism. Roderick Ferguson, in his chapter in Strange Affinities, asks us to imagine a blackness that complicates our understandings of Africa and accounts for the various diasporic African populations on U.S. shores. Instead of centering western Africa, he asks black studies to include work on east Africans in the United States. There are, for example, large Ethiopian, Sudanese, and Somali communities in Atlanta. In fact, the Lost Boys of Sudan (the young Sudanese who fled across nations and refugee camps at the height of the civil war in 1980s Sudan) have a strong community in metro Atlanta, and there is a large African refugee community in the Atlanta suburb of Clarkston (see the fabulous book Outcasts United by Warren St. John).

When Andy Young dismisses the problematic discourse that ostracizes black refugee and immigrant bodies, it may be part of a larger societal discourse of blackness that does not attend to the interconnected issues of racism, immigration reform, poor black communities, rising xenophobia, and the entrenchment of Islamophobia (see Junaid Rana’s Terrifying Muslims and Ahmed Afzal’s Lone Star Muslims). In many ways, his encapsulated and static understanding of race is easily worked into an anti-immigrant logic that sees immigrants, especially African immigrants, as non-subjects outside the discourse of race and racial justice in the United States. As a result, the broken leg sustained by Hawks player Thabo Sefolosha, a Swiss citizen of South African descent, is not attended to by persons like Andy Young. Although the details of how Sefolosha broke his leg in an encounter with police have not surfaced, Young’s conceptualization of blackness already projects Sefolosha outside the logic of racial communities and care.

To return to the present by way of the past: the blackness that was central to the Civil Rights Movement could not and did not always accommodate blackness in radical ways. Mainstream versions of the Civil Rights Movement struggled and failed to attend to LGBTQI and immigration matters within the movement. Andy Young’s version of blackness, and its attendant productions of social justice, are therefore not expansive. Luol Deng did not fit neatly enough into a middle-class, light-skinned, American-centered version of blackness. Young’s version of blackness was not as expansive as the Pan-African claims of Marcus Garvey, Audre Lorde, and many other scholars and activists. As increasing numbers of African players join the NBA and other professional sports, how will blackness account for the far reach and radical possibilities that move beyond our shores?

Andy Young’s support of Danny Ferry plays into the xenophobia that governs how we think about U.S. identity and African American identity. There are many examples of how the histories of Africans, African diaspora communities, and African Americans have not always led to collaborative work. There are instances of tension between these groups, but “blackness” must be an open concept in order to create true change.

As a high school student in Atlanta, I came across the contradictions and entrenchments within blackness. One morning in 1990, students and teachers arrived at Druid Hills High School to find anti-black racist graffiti sprayed across the walls. This deeply affected the souls of my African American classmates and the few other students of color. We had an African student at our school, an exceptional soccer player. Despite the racist happenings at our school, on many occasions this African student heard racialized comments from African American young men telling him to go back to the “jungle” and “take care of the goats.” Instead of building the kind of “Pan-African” coalition the Civil Rights Movement once invoked through an expansive concept of blackness, there continues to be black bleeding, but in isolation and silence. Africans were outside the scope of respectability based on certain bodily comportments, phenotype, name, accent, smell, and desires that defined blackness in Atlanta. This logic, I believe, is evident in Andy Young’s support of Danny Ferry. In the process, the Atlanta Hawks can use the iconicity of Andy Young and his blackness to leverage support and wash away the racist structures within Hawks management. Thus, we have to ask: Why is there silence regarding Sefolosha’s broken leg? What does that silence tell us about Black Lives Matter when the injury took place during an encounter with New York police?

When we continue to figure violence only in terms of those people whom we think embody the best of our community, we fail to see the true reach of racism. We fall into the trap of recognizing only certain persons as respectably human and worthy of attention. What does respectability have to do with it? Why should it be a concern? When respectability becomes the crux of why we care about certain deaths and bodies over others, as Lisa Cacho shows in her wonderful book Social Death, we can account for the horrific murder of the nine people at the historic AME church in Charleston. That tragic event opened spaces for empathy because the dead included teachers, professionals, and respectable church-going people.

As we mourn the deaths of the nine people in Charleston, South Carolina, we have failed to interrogate collectively the haunting and continued silence concerning the killings of trans women of color. So many black trans women have been murdered since the deaths of Eric Garner and Mike Brown. Yet the campaign to combat anti-black racism generally does not account for these persons. Trans women of color, especially, are marginalized, feel the wrath of poverty intimately, daily encounter the police state and racial profiling, and have few resources for survival. As organizations like the Audre Lorde Project and various others open up the category of blackness, the same must happen in all aspects of society, including sporting cultures. At the ESPY awards for sporting figures, Caitlyn Jenner received the Arthur Ashe Award for courage and service. There was great applause and a superficial demonstration of unity. Although the moment brought much-needed visibility to anti-trans violence, it continued to drown out the activism of Kye Allums, a trans man of color who has been a fierce social justice advocate within sporting cultures for the last five years.

Furthermore, with the continued violence against poor African American women, will Andy Young and the misogyny of the civil rights leadership corps account for the everyday struggle of poor black women? Will this blackness accommodate young black homeless women like the ones described in anthropologist Aimee Cox’s Shapeshifters and sociologist Nikki Jones’s Between Good and Ghetto? If not, then what we have is similar to the blackness that South Asian American athletes consume and appropriate, as I describe in my book Desi Hoop Dreams: a blackness that is sellable in the larger marketplace but devoid of fierce political fires. Yet some South Asian American men consume cultural blackness as a way to critique U.S. society and the racial stratification of immigrants. There are other possibilities and openings for blackness that Andy Young and the larger Black Lives Matter movement must attend to in order to create a society for all.

We see how the politics of respectability plays out in organizing against anti-black racism. Racism is expansive and fluid, claiming a wide spectrum of black victims, yet the responses can be shallow, myopic, and limiting. Racism has always been tied to stratification, capitalism, sexism, homophobia, poverty, and imperialism. Blackness, as a point of identification and as a compass for change, must not have gatekeepers but infinite openings that make the category a vision and praxis for a just tomorrow.

Stanley I. Thangaraj is Assistant Professor of Anthropology at City College of New York and the author of Desi Hoop Dreams: Pickup Basketball and the Making of Asian American Masculinity (NYU Press, 2015).

How email ruined my life

—Catherine Zimmer

I got my first email account the fall I started graduate school, in 1995. Even then I had an inkling of the pressures that would come to be associated with this miracle of communication. My entry into grad school coincided with a relationship transitioning into a long-distance one, and what at first was a welcome mode of remaining tethered soon enough became a source, and an outlet, of demand, anxiety, guilt, and recrimination.

This early experience of the pressure and affective weight of email faded into memory alongside that relationship, and certainly at the time it did not occur to me to hold the medium responsible for the messages. But over the past couple of years, now that I am lucky enough to be firmly cemented in an academic job and stupid enough to have taken on an administrative role, that experience has reemerged for me as something of a retroactive portent of the place email would come to hold in my life. Because as anyone in an even remotely bureaucratic environment will tell you, “email” is no longer simply a way of communicating with others, though periodically a message gets through that is significant enough that the medium becomes momentarily transparent. Email is now an entity in and of itself—a gargantuan, self-perpetuating and purely self-referential monstrosity that I do not “use” but barely “manage,” a time-sucking and soul-crushing mass that I can only chip away at in an attempt to climb over or around to get to an actual task.

From epidemic-level ADHD and eye fatigue to degenerative spinal conditions at younger and younger ages—not to mention my self-diagnosed early-onset thumb arthritis—constant interaction with digital devices has arguably had widespread health consequences. It is also fraught with an expansion, intensification, and perversion of the emotions associated with that first email account. But while then I attached those affective elements to a romantic relationship, they are now purely indicative of my relationship to email “itself”: the phenomenon that makes constant and growing demands on my time, attention, and energy, requiring that I devote at least a modicum of substantive thought to each individual expression of its enormous, garbage-filled maw. Time spent on email has grown to hours every day. This is not a measure of increased “productivity.” In fact it is just the opposite, as answering email has become the forest I have to machete my way through just to get to the things that actually constitute my job. And while I do get angry at the jaw-dropping idiocy of certain student emails (Hi Mrs. Zimmer can u send me the syllabus because the final is tomorrow and i missed the last eleven weeks of class) and irritated at the endlessly accumulating details of academic work (Dear Dr. Zimmer, this is a friendly reminder that the due date for the Learning Outcomes Rubrics Departmental Self-Assessment Model is March 23rd), ultimately each one of these individually maddening iterations is just a sign of the incomprehensible sprawl of the medium. And when factored in with texting, messaging, social media, streaming television, and any number of other incoming and outgoing information flows, the sense of being “overwhelmed” seems unsurprisingly ubiquitous.

Email is of course inseparable from the character of any digital labor and the economy of which it is a part: it thus becomes a useful metonymic device to understand how convenience has become so profoundly debilitating. Though no one explicitly states it (because it would sound insane), the demand that we keep up with and process this level of information, and communicate coherently in return, is a demand that the human brain function closer to the speed of digital communications. Needless to say, it does not. Thus the unparalleled levels of prescription of amphetamines and pain medications are not merely the triumph of the pharmaceutical industry, but an attempt to make the human brain and body function alongside and through digital mediation. The relative ease of communications, the instantaneity of information exchange, does not make our lives simpler: it means that we are asked to attend to every goddamn thing that occurs to the countless people we know, institutions we are a part of, and every other organization whose mailing list you have been automatically placed on simply by having a single interaction with them. It’s like being able to hear the internal mutterings of both individual people and cultural constructs: a litany of the needs of others and the expectations of the social sphere (not to mention my own neurotic meanderings when I have to construct a careful response to someone, or an email I have sent goes unanswered). Finding it increasingly impossible to recognize and affectively react only to the articulations of each missive, I respond instead to the cacophonous noise of the whole damn thing. That noise is now constant, while its volume ebbs and flows with the rhythms of the work year. As the only constant, email becomes an end in itself. Email never goes away. Email is an asshole.

It is not surprising that this self-perpetuating mode of interaction comes alongside a proliferation of (self-)assessment and (self-)documentation—talking about what you will do, have done, or are doing instead of just doing it. Thus the ability to communicate about everything, at all times, seems to have come with the attendant requirement that we accompany every action with a qualitative and quantitative discourse about that action. Inside and in addition to this vast circularity are all those things that one’s job actually entails on a practical, daily basis: all the small questions, all the little tasks that need to be accomplished to make sure a class gets scheduled, a course description is revised, or a grade gets changed. Given how few academic organizations have well-functioning automatic systems that might allow these elements to be managed simply, and that my own university seems especially committed to individually hand-cranking every single gear involved in its operation on an ad hoc basis, most elements of my job mean that emails need to be sent to other people.

Once I send an email, I can do nothing further until someone sends an email back; in a sense, sending that email becomes a task in itself, a task now completed. More and more it is just a game of hot potato, with everyone supposedly moving the task forward by getting it off their desk and onto someone else’s, via email. Every node in this network is itself fighting to keep up with all its emails, in the back-and-forth required before anything can actually be done. The irony of the incredible speed of digital mediation is thus that it often results in an intractable slowness in accomplishing simple tasks. (My solution has been to return to the telephone, which easily reduces any 10-email exchange to a 2-minute conversation. Sidenote: I never answer my own phone.)

In case it isn’t already clear, such an onslaught of email, and the pressure of immediacy exerted sometimes explicitly but mostly by the character of the medium itself, means that we no longer get to leave work (or school, or our friends or our partners). We are always at work, even during time off. The joy of turning on our vacation auto-reply messages is fleeting, for even as we cite the “limited access” we will have to email (in, like, Vancouver), we know that we can and will check it. And of course we know that everyone else knows that it’s a lie. Even if we really do take time away from email, making ourselves unavailable (not looking at email, not answering our texts) does not mean email has not been sent to us and is not waiting for us. And we know it, with virtually every fiber of our being. Our practical unavailability does not mitigate our affective understanding that if we ignore email too long, not only will work pile up, but there will be emotional consequences. I can feel the brewing hostility of the email senders: irritated, anxious, angry, disappointed.

Even if I start to relax on one level, on another my own anxiety, irritation, and guilt begin to grow. Email doesn’t go away. It’s never over. It’s the fucking digital Babadook, a relentless, reflexive reminder of the unfathomable mass underlying every small transaction of information.

The nonstop stream of communication and its affective vortex are in part what philosopher Gilles Deleuze (and now many others) has described as “societies of control,” distinguished not by discipline but by the constant modulation and management of the flow of information. Ultimately we are exhausted by the endless negotiation of this unmappable terrain, and our personal and professional labors increasingly have the character of merely keeping ourselves afloat. Which is not to say that discipline no longer functions: those excluded from the privilege of control will often find themselves subject to the sharper baton of policing and incarceration.

There does appear to be increasingly widespread recognition that email is having a significant effect on both the amount of work one does and the increasing threat of that work to health and well-being. A widely and enthusiastically misreported news story that France had instituted a ban on sending work email after 6:00pm provided a much-needed salve for the idea that there is no outside to the onslaught. Never mind that this was a wishful, apocryphal version of a French labor agreement that in reality didn’t cut off email at any hour—the story still allowed France to perform its ongoing service as the English-speaking world’s fetish of a superior, butter-drenched, bicycle-riding quality of life, a life in which steak frites is now accompanied by possible escape from a particularly maddening incarnation of digital labor. That life is apparently now the stuff of internet rumor and media fancy.

The range of feelings I associate with the era of my first email account roll on through now and then as I check my inbox, and I could probably name them, though perhaps they were never discrete. And I understand that it is my job as a participant in digital culture to respond to email, and text, and instant messaging—in writing and in sentiment. But the truth is that I am just really tired. Perhaps the vacuum in affect attested to by the accumulation of emoticons and emojis has little to do with the flattening effect of digital communication. Maybe feelings are simply exhausted.

Catherine Zimmer is Associate Professor of Film and Screen Studies and English at Pace University in New York City. She is the author of Surveillance Cinema (NYU Press, 2015).

[This piece was originally posted on Avidly, Los Angeles Review of Books channel.]