Between the world and #Ferguson

—Jelani Cobb

[This article originally appeared in The New Yorker.]

When I was eighteen, I stumbled across Richard Wright’s poem “Between the World and Me.” The poem, a retelling of a lynching, shook me, because while the narrator relays the details in the first person, the actual victim of that brutish ritual is another man, unknown to him and unknown to us. The poem is about the way in which history is an animate force, and how we are witnesses to the past, even to that portion of it that transpired before we were born. He writes,

darkness screamed with thirsty voices; and the witnesses rose and lived:
The dry bones stirred, rattled, lifted, melting themselves
into my bones.
The grey ashes formed flesh firm and black, entering into
my flesh.

Nothing save random fortune separated the fate of the man who died from that of the one telling the story. Errin Whack and Isabel Wilkerson have both written compellingly about the long shadow of lynching. It is, too often, a deliberately forgotten element of the American past—one that is nonetheless felt everywhere in Ferguson, Missouri, where protests followed the shooting of Michael Brown, who was eighteen years old, by a police officer. One can’t make sense of how Brown’s community perceived those events without first understanding the way that neglected history has survived among black people—a traumatic memory handed down, a Jim Crow inheritance.

It took sixteen days for Brown’s body to be buried, an extended postscript that included three separate autopsies, the emergence of duelling interpretations of his last moments, and the resolution of precisely nothing about how race, media, and policing operate in the United States. A year ago, people gathered in anticipation of a verdict in the trial of George Zimmerman, the man who killed Trayvon Martin. During that case, images of people wearing hoodies, as Martin had when he was shot, proliferated on social media. This month, it has been portraits of people with their hands raised, in recognition of a number of witness accounts that Brown tried to surrender before being shot by police officer Darren Wilson. (Wilson, according to press reports, has told people that Brown was running at him.) The idea, in both instances, is that, like Wright’s narrator, any of us could be Martin, Brown, or one of the hundreds of others who have died under questionable circumstances. There is a disturbing sense that this is how we spend our summers now, submerged in outrage, demonstrating, yet again, the hard parameters of public sympathy and the damnable, tiresome burden of racism.

In the days after 9/11, it was common to hear people say that it was the first time Americans had really experienced terrorism on their own soil. Those sentiments were historically wrong, and willfully put aside acts that were organized on a large scale, had a political goal, and were committed with the specific intention of being nightmarishly memorable. The death cult that was lynching furnished this country with such spectacles for a half century. (The tallies vary, but, by some estimates, there were thirty-three hundred lynchings in the decades between the end of Reconstruction and the civil-rights era.) We know intuitively, not abstractly, about terrorism’s theatrical intent. The sight of Michael Brown, sprawled on Canfield Drive for four hours in the August sun, dead at the hands of an officer who was unnamed for a week, recalled that memory. It had the effect of reminding that crowd of spontaneous mourners of their own refuted humanity. A single death can be understood as a collective threat. The media didn’t whip up these concerns among the black population; history did that.

For fifteen days this month, people marched in heat and thunderstorms, despite the warnings of police styled as a militia, undeterred by tear gas or the obstinacy of the local bureaucracy. They persisted despite the taint that opportunistic violence and looting imposed upon their efforts.

Linda Chavez wondered on Fox News whether “the ‘unarmed teen’ mantra” really fit Brown, who was six feet four and nearly three hundred pounds and had been caught on video shoplifting—and, it perhaps bears repeating, was a teen, and was unarmed. Chavez was roundly criticized, but she was really only guilty of saying aloud what many others have thought. Whatever happened or did not happen between Michael Brown and Darren Wilson on a winding side street, in the middle of the afternoon, in a nondescript outpost on the edge of a midsized city, whatever we imagine we know of the teen-ager, the salient fact is that he did not live long enough to cultivate his own answers.

I spent eight days in Ferguson, and in that time I developed a kind of between-the-world-and-Ferguson view of the events surrounding Brown’s death. I was once a linebacker-sized eighteen-year-old, too. What I knew then, what black people have been required to know, is that there are few things more dangerous than the perception that one is a danger. I’m embarrassed to recall that my adolescent love of words doubled as a strategy to assuage those fears; it was both a pitiable desire for acceptance and a practical necessity for survival.  I know, to this day, the element of inadvertent intimidation that colors the most innocuous interactions, particularly with white people. There are protocols for this. I sometimes let slip that I’m a professor or that I’m scarcely even familiar with the rules of football, minor biographical facts that stand in for a broader, unspoken statement of reassurance: there is no danger here. And the result is civil small talk and feeble smiles and a sense of having compromised. Other times, in an elevator or crossing a darkened parking lot, when I am six feet away but the world remains between us, I remain silent and simply let whatever miasma of stereotype or fear might be there fill the void.

Fuck you, I think. If I don’t get to feel safe here, why should you?

Jelani Cobb is Associate Professor of History and Director of the Institute of African American Studies at the University of Connecticut, and the author of To the Break of Dawn: A Freestyle on the Hip Hop Aesthetic (NYU Press, 2007). Read more of Cobb’s writing via The New Yorker here.

Book giveaway: Books That Cook

To celebrate the final days of summer, we are giving away two free copies of Books That Cook: The Making of a Literary Meal, the newest title in our Fall 2014 catalog. 

Many of us at NYU Press have been waiting to get our hands on this delightful cookbook anthology since it made an appearance on the ‘forthcoming’ list a year ago—and it’s finally here!

Organized like a cookbook, Books That Cook is a collection of American literature written on the theme of food: from an invocation to a final toast, from starters to desserts.

Including writing from Maya Angelou, Sherman Alexie, and Nora Ephron, among many others, the collection reveals the range of ways authors incorporate recipes—whether the recipe flavors the story or the story serves to add spice to the recipe.

To enter our book giveaway, simply fill out the form below with your name and preferred mailing address. We will randomly select our winners on Sunday, September 21st, 2014 at 1:00 pm EST.

Plus, stay tuned to the blog—we’ll be offering a free chapter (recipe included!) from the book next month.

Diamonds and death

—Susan Falls

Engagement ring sales drive the diamond market in the United States. But people purchase diamonds to celebrate all kinds of occasions, many of which are rites of passage: births, graduations, and weddings. As people experience these events, their social status changes, or is reaffirmed. They may get a new name, a new title, or different responsibilities. In the case of an engagement, a woman moves from single to (almost) married and, often, into adulthood (at least in the eyes of some people). In a wedding, one becomes a husband or wife. And diamonds are sometimes given to new mothers or babies as a way to celebrate birth. But what about the ultimate rite of passage: death?

When I was working on my recently published book Clarity, Cut and Culture: The Many Meanings of Diamonds (NYU Press 2014), death was a theme that loomed large, even within stories of happy unions and new relationships. Many people told me about diamonds they keep hidden away in small velvet boxes because of the emotional power these glittering objects can exert upon us.

My friend Mabel described a diamond her grandfather gave her when she turned sixteen. Her grandmother had died when Mabel was a young child but had asked that the diamond be given to her when she came of age. Mabel treasures this diamond because it belonged to her beloved grandmother, and because it shows that her grandmother was already thinking of her as the woman she would miss knowing. But, Mabel told me, “I rarely wear it, and when I do, it makes me kind of sad.”

As it turns out, her grandmother had purchased it for herself. Her grandfather was “not romantic like that,” never giving her fancy jewelry. In what Mabel describes as a brave and difficult move, her grandfather came out following the loss of his wife, and so for Mabel, as much as she adores her grandfather, the ring not only represents her grandmother’s love for her, but also makes her think about “all of the things that she should have had, deserved to have—like romantic love and passion—that she did not get to experience.” The gem contains a story of generosity, family and attachment, but also of longing, even sacrifice. Perhaps Mabel would have the stone reset, or she could pass it on to another family member in the future, but it is hard to imagine gifting a stone with such a story to a new bride or fresh graduate.

In contrast, another woman, Chandra, keeps a small but well-cut diamond that belonged to her mother tucked away in the bottom of her closet. She explained that the stone was too much to wear (bear), bringing up memories of her mother’s early demise. But she knew it was part of a fulfilling marriage, passed on to her with the idea of having an heirloom for future children. And indeed, she is excited about passing it on to her nephews when they get engaged.

One thing I learned is that stories definitely stick to diamonds. But what about a stone that is not only associated with a story about someone, but is someone? The company Life Gem can transform cremation remains into diamonds, as a “memorial to their unique life,” which can then be set into a ring or pendant. The company website states that over 100 Life Gems can be made for the family in about six to nine months, and—following the 4 C’s grading criteria used by the natural diamond industry—provides information on the color, carat size, cut, and clarity of its product. The gems can be ordered in a variety of colors (from clear to varying shades of blue, yellow, red, or green), and they come in several shapes, or cuts, such as round, princess, and radiant, although all are expected to have flaws (just as most natural diamonds do). The diamonds are sized from 0.1 to 1.5 carats, but the company expects to develop the ability to make much larger stones in the future.

I know I was pretty surprised when I first learned of people making synthetic diamonds from cremation ashes or even hair, but, then again, a diamond is just carbon that has been subjected to tremendous heat and pressure. Here we have a man-made stone whose value comes not only from one’s memories, but from enjoying an actual material connection to a loved one. Barring a catastrophe, these diamonds really will be around ‘forever.’

Susan Falls teaches anthropology at the Savannah College of Art and Design in Savannah, Georgia. She is the author of Clarity, Cut and Culture: The Many Meanings of Diamonds (NYU Press 2014).

Illustration by Kay Wolfersperger.

Trans*politics, solidarity, and ENDA

—Isaac West

Having already declared June as LGBT Pride Month via a presidential proclamation, President Obama is prepared to further demonstrate his commitment to LGBT equality by signing an executive order designed to prohibit federal contractors from practicing employment discrimination against LGBT individuals. Obama’s action is necessary because the Republican leadership in the House refuses to allow the membership to vote on the Employment Nondiscrimination Act (ENDA), which the Senate passed 64-32.

In short, ENDA would incorporate sexual orientation and gender identity into the protected classes of federal employment anti-discrimination law. (The current version of ENDA is not without its problems—the National Center for Lesbian Rights, Transgender Law Center and GetEQUAL, among others, withdrew support for the current bill, citing unprecedented religious exemptions for non-religious employers.) Even though 208 co-sponsors have signed on to ENDA in the House, including eight Republicans, Speaker John Boehner will not bring it to the floor. In Boehner’s rather disingenuous reading of employment law, ENDA is redundant because LGBTs are already covered by current legislation, and he does not want to afford “special rights” to any new minority groups.

If Boehner’s interpretation of our current laws were not motivated by his catering to his right flank, he would be in good company, given that the majority of Americans think that it is already illegal to fire someone because of their sexual orientation or gender identity. Along with this common misperception, paradoxically, there is also a consensus that LGBT employment discrimination is widespread. A Kaiser Family Foundation survey of the general public revealed that 67% of respondents answered affirmatively when asked if “LGBT people experienced discrimination ‘often’ or ‘sometimes’ in applying for or keeping a job.”

Given these conditions, it is unsurprising that in a recent poll of LGBT Americans, conducted by the Pew Research Center, employment protections topped marriage rights as the most pressing legislative issue. Although same-sex civil marriage equality gets most of the media attention, LGBT advocates and allies have waged at least as vigorous a campaign for employment protections.

Like most legislation, ENDA has had a long, slow march through Congress, one that began in 1974, when Bella Abzug introduced the Equality Act of 1974, a bill that would have outlawed discrimination based on sexual orientation. After two decades of little to no movement on measures such as this, ENDA experienced numerous stops and starts during the Clinton and Bush presidencies.

Congressional momentum picked up in 2007 when Barney Frank and Tammy Baldwin, self-identified gay and lesbian members of Congress, championed the bill. The 2007 version of ENDA finally included gender identity as a category, which had been a sticking point for years, until Frank, over Baldwin’s objections, excised the gender identity protections from ENDA, justifying the move on the grounds that some members would not vote for a bill with gender identity as one of the protected categories.

In a surprising turn of events, almost every major LGBT organization, excluding the Human Rights Campaign, withdrew support for the sexual-orientation-only ENDA. Over 400 LGBT organizations joined forces to form United ENDA, pledging to actively work to delay, if not defeat, the bill if it excluded trans* protections.

In my analysis of these events, I highlight how the gender identity provisions of the bill provided an occasion for solidarity, reversing the general trend whereby trans* and gay and lesbian issues are framed as separate and competing agendas. In this case, these advocates had to choose whether they would fight for the rights of the whole LGBT community or accept a partial victory for the LGB community. After examining the legislation, United ENDA argued that gender identity protections would prevent employers from exploiting the “gender identity loophole,” whereby an employer could claim to have fired someone for their atypical gender performance rather than their sexuality.

What makes this case instructive for the future is how United ENDA placed trans* concerns at the center of its advocacy and used them as the glue for its coalition. Instead of being treated as a fringe issue, trans* and gender identity matters served a unifying purpose for rethinking what LGBT solidarity might look like. By rethinking LGB identities through a trans* perspective, the advocates understood that their identities could not be cleaved off as neatly as Frank would have liked. As we move forward, keeping in line with the actions of United ENDA, we need to make sure that LGBT politics work toward the good of the whole, and sometimes this may require us to focus more on our shared positions of vulnerability than on our differences.

Isaac West is Assistant Professor in the Departments of Communication Studies and Gender, Women’s, and Sexuality Studies at the University of Iowa. He is the author of Transforming Citizenships: Transgender Articulations of the Law (NYU Press, 2013).

Suzanna Walters kicks off Pride Month book tour

June is LGBT Pride Month!

Celebrate with author Suzanna Danuta Walters as she hits the road this month on a national book tour for The Tolerance Trap: How God, Genes, and Good Intentions Are Sabotaging Gay Equality (“out” now from NYU Press). Each stop on the tour will have an opportunity for a Q&A session with the author and book signing. If you are in any of the following cities, please stop by and meet her!

The complete list of tour dates is below. For further details, visit Suzanna’s website.

 

William Way LGBT Community Center
JUNE 3, 2014 | 6:00 PM
PHILADELPHIA, PA

Out Professionals Pre-Pride Book Party at NYU
JUNE 4, 2014 | 6:30 PM
NEW YORK, NY

Barnes & Noble, Upper West Side
JUNE 5, 2014 | 7:00 PM
NEW YORK, NY

Book Soup
JUNE 10, 2014 | 7:00 PM
LOS ANGELES, CA

Books, Inc.
JUNE 11, 2014 | 7:00 PM
SAN FRANCISCO, CA

Attorney General’s Office
JUNE 17, 2014 | 12:00 PM
WASHINGTON, DC

Harvard Book Store
JUNE 19, 2014 | 7:00 PM
CAMBRIDGE, MA

The Book Cellar
JUNE 21, 2014 | 7:00 PM
CHICAGO, IL

Provincetown Public Library
JUNE 26, 2014 | 6:00 PM
PROVINCETOWN, MA

Mensa Gathering
JULY 2, 2014 | 3:00 PM
BOSTON, MA

Stay tuned for more on The Tolerance Trap this month on our blog, including a book giveaway and Q&A with the author. Happy Pride!

Book giveaway: Open Veins of Latin America

Since its publication in 1971, Open Veins of Latin America has been translated into more than a dozen languages and has sold more than a million copies. Written by Uruguayan journalist Eduardo Galeano, the book chronicles five centuries of exploitation in Latin America—first by European empires, and later the United States. In it, Galeano argues that this “structure of plunder” led to the region’s enduring poverty and underdevelopment.

Now, according to a recent New York Times article, Galeano has disavowed the book. But has he?

In light of the controversy, we’re giving away a FREE copy of Open Veins of Latin America to three lucky winners. To enter our book giveaway, simply fill out the form below with your name and e-mail address. Winners will be randomly selected on Friday, June 6 at 12:00pm EST.

“Boys will be boys”?

—Judy Y. Chu

As a parent of a 10-year-old, I have spent a fair amount of time over the past few years observing kids playing—at schools, playgrounds, and various social functions. As a researcher who studies boys’ development, I am especially inclined to tune in to what parents and teachers say about boys. And I have found that when adults talk about boys, regardless of the context or the particular group of kids, I can expect to hear someone at some point remark that, “Boys will be boys.”

Usually, this comment comes as a response to boys’ rowdy and rambunctious play, as when they are running around, being loud, acting hyper, getting into mischief, or otherwise brimming with energy. (Incidentally, no one says anything when girls display similar behaviors). Even when said in a tone of acceptance, it seems to have a negative connotation. In my experience, this comment is not meant as a celebration, as in “Hooray! Boys will be boys!” Rather, as they say this, adults will often shrug their shoulders, smile mildly, and sigh as though in resignation: “Oh well. What can you do? Boys will be boys.”

But what does it mean for boys to be boys? And why might this be something less than desirable? When we think about it, the first question almost doesn’t make sense. Of course, boys will be boys. What else would they be? But the question gains new meaning when we consider anthropologist Margaret Mead’s observation that in many cultures and societies, boys must prove their masculinity. Somehow it is not enough to be biologically male. Boys must prove that they are “boys” or “real” boys (and, later on, “real” men). For the most part, they do this by aligning with group and cultural norms of masculinity.

Social psychologists remind us that we tend to find what we look for and favor those things that match our expectations. So, when boys behave in ways that confirm gender stereotypes and are consistent with conventions of masculinity—that emphasize, for instance, physical activity and toughness, emotional stoicism, and projected self-sufficiency— we take notice and are prompted to conclude that, “Boys will be boys.”

Conversely, we tend to overlook or discount those things that challenge our assumptions. Although we may like to think of ourselves as being receptive to new information, most of us are more comfortable with evidence that affirms what we already know and believe. It requires extra effort to truly consider and incorporate unfamiliar ideas or ways of thinking.

This might explain why I rarely, if ever, hear people remark that “Boys will be boys” when boys are calm, quiet, gentle, kind, thoughtful, generous, and considerate. Boys certainly exhibit these qualities as well. Indeed, they are a part of boys’ (as well as girls’) humanity. Nevertheless, to the extent that these qualities are considered “feminine,” and we continue to define masculinity as the opposite of femininity, we are less likely to recognize these qualities in boys, much less count them among the attributes that confirm boys’ masculine identities.

As couples therapist Terrence Real points out, when we take all of the qualities that make us human, divide them into “masculine” and “feminine,” and decide that only males should be “masculine” and only females should be “feminine,” everyone loses. While there is no doubt that boys will be boys, it is necessary to update and expand our understanding of what it means to be a boy, including what boys are capable of knowing and doing in their relationships. We know from our experiences of the boys in our lives, as well as from research studies, that gender stereotypes may misrepresent, or represent only a fraction of, boys’ capabilities and strengths.

Although we know that there is more to boys than being “boys,” it is easy to allow stereotypes to influence how we view and respond to them. When we expect boys to be “masculine” and we focus on ways in which boys’ behaviors conform to masculine norms, it can become difficult for us to acknowledge that they are capable of anything else. At times, the notion that “Boys will be boys” can even become an excuse for doing nothing about sub-standard behavior (e.g., when boys behave disrespectfully towards others or towards themselves).

To support boys’ healthy development and relationships, we need to hold them accountable to standards that exceed merely being “boys.” By moving beyond gender stereotypes, we can transform this cliché to convey greater expectations. Whether or not boys align with norms of masculine behavior, ultimately it is the qualities that make them human—such as their sense of integrity, decency, compassion, and connection to others—that will be crucial to their happiness and success.

Judy Y. Chu is Affiliated Faculty in the Program in Human Biology at Stanford University and the author of When Boys Become Boys: Development, Relationships, and Masculinity (NYU Press, 2014).

[Note: This article originally appeared on Psychology Today.]

Depictions of masculinity on television

—Amanda D. Lotz

It is revealing that so little has been written about men on television. Men have embodied such an undeniable presence and composed a significant percentage of the actors upon the small screen—be they real or fictional—since the dawn of this central cultural medium and yet rarely have been considered as a particularly gendered group. In some ways a parallel exists with the situation of men in history that Michael Kimmel notes in his cultural history, Manhood in America. Kimmel opens his book by noting that “American men have no history” because although the dominant and widely known version of American history is full of men, it never considers the key figures as men. Similarly to Kimmel’s assertion, then, we can claim that we have no history of men, masculinity, and manhood on television—or at best, a very limited one—despite the fact that male characters have been central in all aspects of the sixty-some years of US television history. It is the peculiar situation that nearly all assessments of gender and television have examined the place and nature of women, femininity, and feminism on television while we have no typologies of archetypes or thematic analyses of stories about men or masculinities.

For much of television studies’ brief history, this attention to women made considerable sense given prevailing frameworks for understanding the significance of gender representation in the media. Analyses of women on television largely emerged out of concern about women’s historical absence in central roles and the lack of diversity in their portrayals. Exhaustive surveys of characters revealed that women were underrepresented on television relative to their composition of the general populace and that those onscreen tended to be relegated to roles as wives, love interests, or sex objects. In many cases, this analysis was linked with the feminist project of illustrating how television contributed to the social construction of beliefs about gender roles and abilities, and given the considerable gender-based inequity onscreen and off, attention to the situation of men seemed less pressing. As a result, far less research has considered representations of men on television and the norms or changes in the stories the medium has told about being a man.

Transitioning the frameworks used for analyzing women on television is not as simple as changing the focus of which characters or series one examines. Analyzing men and masculinity also requires a different theoretical framework, as the task of the analysis is not a matter of identifying underrepresentation or problematic stereotypes in the manner that has dominated considerations of female characters. The historic diversity of stories about and depictions of straight white men has seemed to prevent the development of “stereotypes” that have plagued depictions of women and has lessened the perceived need to interrogate straight white men’s depictions and the stories predominantly told about their lives. Any single story about a straight white man has seemed insignificant relative to the many others circulating simultaneously, so no one worried that the populace would begin to assume all men were babbling incompetents when Darrin bumbled through episodes of Bewitched, that all men were bigoted louts because of Archie Bunker, or even that all men were conflicted yet homicidal thugs in the wake of Tony Soprano. Further, given men’s dominance in society, concern about their representation lacked the activist motivation compelling the study of women that tied women’s subordinated place in society to the way they appeared—or didn’t appear—in popular media.

So why explore men now? First, it was arguably shortsighted to ignore analysis of men and changing patterns in the dominant masculinities offered by television to the degree that has occurred. Images of and stories about straight white men have been just as important in fostering perceptions of gender roles, but they have done their work by prioritizing some attributes of masculinity—supported some ways of being a man—more than others. Although men’s roles might not have been limited to the narrow opportunities available to women for much of television history, characteristics consistent with a preferred masculinity have pervaded—always specific to the era of production—characteristics that might generally be described as the attributes consistent with what is meant when a male is told to “be a man.” In the past, traits such as the stoicism and controlled emotionality of not being moved to tears, of proving oneself capable of physical feats, and of aggressive leadership in the workplace and home have been common. Men’s roles have been more varied than women’s, but television storytelling has nevertheless performed significant ideological work by consistently supporting some behaviors, traits, and beliefs among the male characters it constructs as heroic or admirable, while denigrating others. So although television series may have displayed a range of men and masculinities, they also circumscribed a “preferred” or “best” masculinity through attributes that were consistently idealized.

The lack of comprehensive attention to men in any era of television’s sixty-some-year history makes the task of beginning difficult because there are so few historical benchmarks or established histories or typologies against which newer developments can be gauged. Perhaps few have considered the history of male portrayal because so many characteristics seemed unexceptional due to their consistency with expectations and because no activist movement has pushed a societal reexamination of men’s gender identity in the manner that occurred for women as a component of second-wave feminism. Male characters performed their identity in expected ways that were perceived as “natural” and drew little attention, indicating the strength of these constructs. Indeed, television’s network-era operational norms of seeking broad, heterogeneous audiences of men and women, young and old, led to representations that were fairly mundane and unlikely to shock or challenge audience expectations of gender roles.

One notable aspect of men’s depictions has been the manner through which narratives have defined them primarily as workers in public spaces or through roles as fathers or husbands—even though most male characters have been afforded access to both spaces. A key distinction between the general characterizations of men versus women has been that shows in which men functioned primarily as fathers (Father Knows Best, The Cosby Show) also allowed for them to leave the domestic sphere and have professional duties that were part of their central identity—even if actually performing these duties was rarely given significant screen time. So in addition to being fathers and husbands, with few exceptions, television’s men also have been workers. Similarly, the performance of professional duties has primarily defined the roles of another set of male characters, as for much of television history, stories about doctors, lawyers, and detectives were necessarily stories about male doctors, lawyers, and detectives. Such shows may have noted the familial status of these men but rarely have incorporated family life or issues into storytelling in a regular or consistent manner.

This split probably occurs primarily for reasons of storytelling convention rather than any concerted effort to fragment men’s identity. I belabor this point here because a gradual breakdown in this separate-spheres approach occurs in many dramatic depictions of men beginning in the 1980s and becomes common enough to characterize a sub-genre by the twenty-first century. Whether allowing a male character an inner life that is revealed through first-person voice-over—as in series such as Magnum, P.I., Dexter, or Hung—or gradually connecting men’s private and professional lives even when the narrative primarily depicts only one of these spheres—as in Hill Street Blues or ER—such cases in which the whole lives of men contribute to characterization can be seen as antecedents to the narratives that emphasize the multifaceted approach to male characters that occurs in the male-centered serial in the early 2000s. Though these series offer intricately drawn and complex protagonists, their narrative framing does not propose them as “role models” or as men who have figured out the challenges of contemporary life. The series and their characters provide not so much a blueprint of how to be a man in contemporary society as a constellation of case studies exposing, but not resolving, the challenges faced.

The scholarly inattention to men on television is oddly somewhat particular to the study of television. The field of film studies features a fairly extensive range of scholarship attending to changing patterns of men’s portrayals and masculinities. While these accounts are fascinating, the specificity of film as a medium very different from television in its storytelling norms (a two-hour contained story as opposed to television’s prevailing use of continuing characters over years of narrative), industrial characteristics (the economic model of film was built on audiences paying for a one-time engagement with the story while television relies on advertisers that seek a mass audience on an ongoing basis), and reception environment (one chooses to go out and see films as opposed to television’s flow into the home) prevents these studies of men on film from telling us much about men on television. Further, gender studies and sociology have developed extensive theories of masculinity and have been more equitable in extending beyond the study of women. Although theories developed in these fields provide a crucial starting point—such as breaking open the simple binary of masculinity and femininity to provide a language of masculinities—it is the case that the world of television does not mirror the “real world” and that the tools useful for exploring how societies police gender performance aren’t always the most helpful for analyzing fictional narratives. Sociological concepts about men aid assessments of men and masculinity on television, but it is clearly the case that the particularities of television’s dominant cultural, industrial, and textual features require focused and specific examination.

Why Cable Guys?

One of the motivations that instigated my 2006 book Redesigning Women: Television after the Network Era was frustration with how increasingly outdated frameworks for understanding the political significance of emerging gender representations were inspiring mis-, or at least incomplete, readings of shows and characters that indicated a rupture from previous norms. Tools established to make sense of a milieu lacking central female protagonists disregarded key contextual adjustments—such as the gradual incorporation of aspects of second-wave feminism into many aspects of public and private life—and were inadequate in a society profoundly different from that of the late 1960s. For example, it seemed that some aspects of gender scripts had changed enough to make the old models outdated, or that there was something more to Ally McBeal than the length of her skirts, her visions of dancing babies, and her longing for lost love that had led to scorn and dismissal from those applying conventional feminist analytics. Given generational and sociohistorical transitions apparent by the mid-1990s, it seemed that this series and its stories might be trying to voice and engage with adjustments in gender politics rather than be the same old effort to contain women through domesticity and conventional femininity, as was frequently asserted.

I’m struck with a similar impulse in reflecting on how stories about men, their lives, and their relationships have become increasingly complicated in the fictional narratives of the last decade. Indeed, this evolution in depictions of male identities has not received the kind of attention levied on the arrival of the sexy, career-driven singles of Sex and the City and Ally McBeal or the physically empowered tough women of Buffy the Vampire Slayer or Xena: Warrior Princess. Assessments of men in popular culture, and particularly television, haven’t been plentiful in the last decade. Most of the discussion of men on television merely acknowledges new trends in depiction—whether they be the sensitivity and everymanness of broadcast characters or the dastardly antiheroism of cable protagonists. Such trend pieces have offered little deeper engagement with the cultural and industrial features contributing to these shifts or analysis of what their consequences might be for the cultures consuming them.

While these curiosities might motivate any scholar, I suspect the motivations of a female feminist scholar embarking on an analysis of men and masculinity also deserve some explanation. In addition to curiosity about shifting depictions and stories on my television screen, for well over a decade I’ve also had the sense that “something is going on” with men of the post–Baby Boomer generation, who, like me, were born into a world already responding to the critiques and activism of second-wave feminism. Yet nothing I’ve read has adequately captured the perplexing negotiations I’ve observed. For example, on a sunny Tuesday morning just after the end of winter semester classes, I took a weekday to enjoy the arrival of spring with my toddler. We found ourselves in the sandpit at the neighborhood park, and shared it that day with two sisters—one a bit older, the other a bit younger than my nearly two-year-old son—who were being watched over by their father. He was about my age and was similarly clad in the parental uniform of exercise pants and a fleece jacket. With some curiosity I unobtrusively watched him interact with his daughters. Dads providing childcare aren’t uncommon in my neighborhood—overrun as it is with academics and medical professionals with odd hours that allow for unconventional childcare arrangements—but something in his demeanor, his willingness to go all in to the tea party of sandcakes his oldest was engaging him with, grabbed my attention for its play with gender roles. It reminded me of the many male friends with whom I share a history back to our teen years who have similarly transformed into engaged and involved dads; they’ve seemingly eradicated much of the juvenile, but also sexist, perspectives they once presented, and also have become men very different from their fathers. Then his phone rang. Immediately, his body language and intonation shifted as he became a much more conventional “guy.” Was it a brother? It was definitely another man. An entirely different performance overtook his speech and demeanor as he strolled away from the sandpit, yet, suggesting that all was not reversed, he proceeded to discuss attending a baby shower, whether he and his wife would get a sitter, and the etiquette of gift giving for second babies. When the call ended he shifted back to the self I had first observed.

Watching this made me reflect on how the gender-based complaints I might register regarding balancing work and family—such as the exhausting demands, the still-tricky negotiations of relationships that cross the working mom/stay-at-home mom divide, and the ever-ratcheting demands to be the Best Mom Ever while maintaining pre-mom employment productivity—have been well documented by others and are problems with a name. My male peers, in contrast, must feel out to sea with no land or comrades in sight. Esteemed gender historian Stephanie Coontz has gone so far as to propose the term and reality of a “masculine mystique” as an important component of contemporary gender issues.

This wasn’t the first time I’d been left thinking about the contradictory messages offered to men these days. The uncertain embodiment of contemporary manhood appears in many places. For years now I’ve wondered, even worried, about the men in my classes. In general, they seem to decrease in number each year, perhaps being eaten by the ball caps pulled ever lower on their foreheads. As a hopefully enlightened feminist scholar, I try to stay attuned to the gender dynamics of my classroom—but what I’ve commonly found was not at all what I was prepared for or expected. Consistent with the Atlantic cover story in the summer of 2010 that declared “The End of Men” and touted that women had become the majority of the workforce, that the majority of managers were women, and that three women earned college degrees for every two men, the young women in my classes consistently dominate their male peers in all measures of performance—tests, papers, class participation, attendance. I haven’t been able to explain why, but it has seemed that most—although certainly not all—of the young men have no idea why they find themselves seated in a college classroom or what they are meant to do there. Though I must acknowledge that despite evidence of female advancement in sectors of the academy like mine, men still dominate in many of the most prestigious and financially well-rewarded fields, including engineering, business, and computer science.

I brought my pondering about classroom gender dynamics home at night as I negotiated the beginning of a heterosexual cohabitation in the late 1990s and thought a lot about what it meant to become a “wife” and eventually a “mother.” There were also conversations about what it meant to be the husband of a feminist and how being a dad has changed since our parents started out, although the grounds for these talks were more uncertain and role models and gender scripts seemed more lacking. Both in charting our early years of marriage and still in facing parenthood, my husband and I have often felt adrift and without models. Although we had little to quibble with in regard to our own upbringing, neither of us was raised in households in which both parents had full-time careers, which seemed quite a game changer and has proved the source of our most contentious dilemmas. While a wide range of feminist scholarship and perspectives has offered insight into the challenges of being a mom and professor, my husband and his compatriots seem to be divining paths without a map or a trail guide. As the mother of both a son and a daughter, I feel somewhat more prepared to help my daughter find her way among culturally imposed gender norms than my son; at least for her the threats and perils are known and named.

Amanda D. Lotz is Associate Professor of Communication Studies at the University of Michigan. She is the author of Cable Guys: Television and Masculinities in the 21st Century (NYU Press, 2014).

[Read a fuller version of this excerpt from Amanda D. Lotz’s new book, Cable Guys, on Salon.com.]

Cycles of gender testing

—Ellen Samuels

A friend who cycles competitively just sent me a link to the new policy on transgender participants in the Eastern Collegiate Cycling Conference. It seems like a progressive and welcoming policy, stating that:

“The ECCC particularly recognizes the challenges facing transgender athletes. Such members of the community should compete in the gender category most appropriate to their unique personal situation.”

The release of this policy highlights the growing centrality of issues of non-normative gender and sexuality in athletic competitions as well as in the wider cultural sphere. Both the prominence of such concerns and the challenges ahead came into sharp relief in the weeks leading up to the 2014 Olympic games, as tennis great Billie Jean King called for an LGBTQ “John Carlos moment”—referring to the African American 1968 Olympic medalist who stood on the winners’ podium with lowered head and raised fist, becoming an iconic symbol for social justice.

In Sochi, despite extensive media coverage of Russian anti-gay policies, that moment never came.

Meanwhile, a little-noted story out of Iran highlighted the extent to which international sports must still contend with its own legacy of gendered injustice. In February, on the cusp of Women’s History Month, it was reported that players in Iran’s women’s soccer league were being subjected to “gender testing” and that a number of players were subsequently expelled from their teams for failing to qualify as “real women.”

Sex testing in female athletics has a long and tarnished history dating back to the 1940s, and has included requiring female athletes to parade naked before male doctors, performing invasive medical exams, and mandating genetic and hormonal testing. Indeed, from 1968 until the early 1990s, all elite athletes competing as female were required to carry “certificates of femininity,” issued by the International Association of Athletics Federations. Such universal sex testing was abandoned more than a decade ago, but female athletes who are perceived as overly “masculine” are still required to undergo sex testing and even medical treatment in order to remain eligible.

Representations of the Iranian soccer controversy in the Western media have invoked anti-Islamic stereotypes of backwardness, suggesting that gender confusion was caused by the body-masking uniforms worn by the soccer players. These stories ignore the long history of female athletes from all nations and in the skimpiest of running outfits being challenged and subjected to sex testing, their bodies closely analyzed for signs of masculine “hardness,” “strength,” and “power.”

Media reporting on the Iranian women’s soccer team also reflects a common and disturbing tendency to blur together the very different topics of transgender athletes, intersex athletes, and athletes suspected to be cisgendered men deliberately pretending to be women. The International Olympic Committee recently revised its gender policies in part to attempt to disentangle these categories—although the new policies are rife with their own problematic understandings of “sex” and “gender.”

To return to the ECCC policy, after appreciating its initial trans-positive language, I was dismayed to read the next paragraph:

“Competitors may be asked by the Conference Director(s) and/or their designee(s) to furnish two pieces of documentation from relevant legal, medical, or academic authorities documenting personal sex, gender, or gender dysphoria supporting their selected competition gender category.”

Such requirements show how assumptions about the necessity for biocertification can both underpin and undermine even the most well-meaning of policies directed toward people who do not fit neatly into gender binaries.  It is likely that, just as in international female athletics, the cyclists most likely to be asked to provide documentation are those who appear suspiciously “masculine,” yet identify as female.

However, I did notice a peculiar difference in this policy compared to those adopted in the Olympics and other sports settings: The athlete can provide material from “relevant legal, medical, or academic authorities” to support their gender identification.

To my knowledge, no other athletic gender policy allows for “academic” documentation, and I can’t help but wonder what such documentation would look like: Would a note from Judith Butler suffice? Certainly, this unusual addition to a biocertification policy indicates that queer, trans*, and feminist scholars should not discount the relevance of our work to the everyday contestations of gender in sports and other sites of global exchange.

Ellen Samuels is Assistant Professor of Gender and Women’s Studies and English at the University of Wisconsin at Madison. She is the author of Fantasies of Identification: Disability, Gender, Race (NYU Press, 2014).

The racism that still plagues America

Excerpted from The Price of Paradise: The Costs of Inequality and a Vision for a More Equitable America by David Dante Troutt (NYU Press, 2014).

The impatience that characterizes discussions of race and racism in our so-called color-blind society has its roots in the momentous legislative changes of the 1960s. The Civil Rights Acts of 1964, 1965, and 1968 reached into nearly every aspect of daily life—from segregated facilities to voting to housing—and represented a long overdue re-installation of the equality principle in our social compact. The question was what it would take—and from whom—to get to equality.

Was racial equality something that could be had without sacrifice? If not, then who would be forced to participate and who would be exempt? As implementation of the laws engendered a far-reaching bureaucracy of agencies, rules, and programs for everything from affirmative action hiring goals to federal contracting formulas, the commitment was quickly tested. For a great many who already opposed the changes, patience was quickly exhausted. As welfare rolls rapidly increased, crime surged, and the real and perceived burdens of busing took their toll, many voters pointed to the apparent failure of a growing federal government to fix the problems it was essentially paid to cure. Among Democratic voters this made for unsteady alliances and vulnerable anxieties. People don’t live in policy and statistics as much as they do through anecdote and personal burdens. A riot here, a horrific crime there, a job loss or perhaps the fiery oratory of a public personality could tip a liberal-leaning person’s thinking toward more conservative conclusions—or at least fuel her impatience. Impatience would ossify into anger, turning everything into monetary costs, and making these costs the basis for political opposition to a liberal state. As it happened, this process moved the date of our supposed final triumph over racism from the mid-1960s to at least the mid-1980s. In the end, impatience won.

What I call impatience, others have characterized as a simmering voter ambivalence—even antagonism, in the case of working-class whites—to civil rights remedies, one that was susceptible to the peculiar backlash politics that elected both Ronald Reagan and George Herbert Walker Bush president. Language was central to this strategy, and the language that stuck was colorblindness. As Thomas Byrne Edsall and Mary Edsall wrote in “Chain Reaction: The Impact of Race, Rights, and Taxes on American Politics,” “In facing an electorate with sharply divided commitments on race—theoretically in favor of egalitarian principle but hostile to many forms of implementation—the use of a race-free political language proved crucial to building a broad-based, center-right coalition.” Ronald Reagan managed to communicate a message that embodied all the racial resentments around poverty programs, affirmative action, minority set-asides, busing, crime, and the Supreme Court without mentioning race, something his conservative forebears—Barry Goldwater, George Wallace, and Richard Nixon—could not quite do. The linchpin was “costs” and “values.” Whenever “racism” was raised, it became an issue of “reverse racism” against whites. The effect was the conversion of millions of once fiscally liberal, middle-class suburban Democrats to the Republican Party. Issues identified with race—the “costs of liberalism”—fractured the very base of the Democratic Party. In the 1980 presidential election, for example, 22 percent of Democrats voted Republican.

By 1984, when Ronald Reagan and George Bush beat Walter Mondale and Geraldine Ferraro in the presidential election, many white Democratic voters had come to read their own party’s messages through what Edsall calls a “racial filter.” In their minds, higher taxes were directly attributable to policies of a growing federal government; they were footing the bill for minority preference programs. If the public argument was cast as wasteful spending on people of weak values, the private discussions were explicitly racial. For instance, Edsall quotes polling studies of “Reagan Democrats” in Macomb County—the union-friendly Detroit suburbs that won the battle to prevent cross-district school desegregation plans in 1973—that present poignant evidence of voter anger: “These white Democratic defectors express a profound distaste for blacks, a sentiment that pervades almost everything they think about government and politics. . . . Blacks constitute the explanation for their [white defectors’] vulnerability and for almost everything that has gone wrong in their lives; not being black is what constitutes being middle class; not living with blacks is what makes a neighborhood a decent place to live. These sentiments have important implications for Democrats, as virtually all progressive symbols and themes have been redefined in racial and pejorative terms.”

By 1988, these same voters had endorsed tax revolts across the country and had become steadfast suburbanites, drawing clearer lines between a suburban good life and the crime and crack-infested city. Still they were angry, as magazine articles chronicled the rising political significance of what would be known as the “Angry White Male” voter. George Bush, down seventeen points in the presidential election polls during midsummer, overcame that deficit with TV ads about murderous black convicts raping white women while on furlough. That and a pledge never to raise taxes seemed to be enough to vanquish Bush’s liberal challenger, Michael Dukakis of Massachusetts. What’s important to recognize in this transition is how as recently as twenty years ago, Americans’ social lives were very much embroiled in racial controversy—despite the obfuscatory veneer of colorblind language to the contrary. Our politics followed. The election of Bill Clinton represented a distinct centrist turn among Democrats toward Republican language and themes and away from rights, the “liberal” label, and the federal safety net. The question we might ask about our current race relations is, only a couple of decades removed from this political history, what would compel us to assume that we are beyond the legacy of our racial conflicts?

 * * *

The racial polarization that connected these political outcomes was deliberately fed by national Republican candidates in order to do more than roll back civil rights. It also served to install “supply-side economics,” a system of regressive tax-based reforms that contributed mightily to the costs of income inequality we currently face. That era—which arguably ended with the election of President Barack Obama—illustrates two points central to my examination of civic connectivity. The first is that the economic underside of racial polarization proved no more than the old okey doke. The second is that localism contains its own contradictions, which have come due in our time. Let me explain.

Only racism could achieve the ideological union of the Republican rich with the working man (and woman). Nothing else could fuse their naturally opposed interests. The essence of supply-side economics was its belief in the importance of liberating the affluent from tax and regulatory burdens, a faith not typically shared by lower-income households who might at best see benefits “trickle down” to them. In fact, they often paid more under tax-reform schemes of the 1980s.  Edsall provides data on the combined federal tax rate that include all taxes—income, Social Security, and so forth. Between 1980 and 1990, families in the bottom fifth of all earners saw their rates increase by 16.1 percent; it increased by 6 percent for those in the second-lowest fifth (the lower middle class); and it increased by 1.2 percent for those in the middle fifth (the middle middle class). But those in the second-highest fifth of all income earners saw a cut in their tax rate by 2.2 percent during that decade; and those in the top fifth got a 5.5 percent decrease in their rate. Overall, the richest 10 percent of American earners received a 7.3 percent decrease in their combined federal tax rate. The top 1 percent? A 14.4 percent cut during the 1980s. Clearly this hurt the middle class, as the vaunted trickle down never arrived. But it was working-class whites who bought the message that this model of fiscal conservatism, married to social conservatism in the form of a rollback of redistributive programs they perceived to favor blacks, would benefit them. It did not. Yet it established a popular political rhetoric by which lower-income whites can be counted on to take up against “liberal” policies that may actually serve their interests as long as opposition can be wrapped in the trappings of “traditional values,” “law and order,” “special interests,” “reverse racism,” and “smaller government.” This was pure okey doke based on an erroneous notion of zero-sum mutuality—that is, that whatever “the blacks” get hurts me.

Which also demonstrates the contradictions of localism. Remember my earlier argument that localism—or local control expressed formally through home rule grants, as it’s sometimes known—became the spatial successor to Jim Crow segregation. Through racially “neutral” land use and housing policy, it kept white communities white after the fall of legal segregation in the late 1950s and mid-1960s. Yet here’s the contradiction. While voters opposed to civil rights remedies and Great Society programs followed Republican leadership toward fiscal conservatism at the national level, they maintained their fiscal liberalism at the local level. The tax base they created for themselves through property taxes in suburbia could be contained and spent locally. Edsall describes the irony this way: “Suburbanization has permitted whites to satisfy liberal ideals revolving around activist government, while keeping to a minimum the number of blacks and the poor who share in government largess.” Of course, all of this worked best when “suburbs” meant middle-class white people and “cities” (or today’s “urban” areas) always signaled black and brown people. There was no mutuality of interests between the two kinds of places. It also worked when low property taxes—together with generous state aid—could reliably pay for great local public services like schools, libraries, and fire protection. It was a terrific deal. But that was then. Now, neither is true. The line between cities and suburbs has blurred into regions, and minorities and whites are busy crossing back and forth to work, live, and shop. Most of the fragmented municipalities that sprawled across suburbia are no longer able to sustain their own budgets, threatening the quality of their services, despite unimaginably high property taxes. The assumptions have not held.

Perhaps now we should reconsider the racially polarizing policies that became the norm under Reagan’s failed experiment. We tried them. Some believed fervently in them. But it is clear that they didn’t work and are not in our long-term national or local interest. There remains a legacy of racism, however, that continues to harm some of us disproportionately and all of us eventually. It’s to those three examples that I now turn.

Read the rest of this excerpt at Salon.com.

Dude, what’s that smell? The Sriracha shutdown and immigrant excess

—Anita Mannur and Martin Manalansan

All across America, bottles with a green cap, a rooster, and a fiery chili sauce, once exclusively the mainstay of fast-food-style Asian restaurants, have been slowly making their mark on mainstream palates. In 2011, the popular television show The Simpsons featured an episode—described by executive producer Matt Selman as a “love letter to food culture”—in which Bart Simpson’s usually pedestrian palate becomes attuned to the finer possibilities of sriracha.

In 2012, as part of a national campaign to introduce a new flavor, the Lay’s potato chip company announced Sriracha as one of three finalist flavors, along with Cheesy Garlic Bread and Chicken & Waffles. Cheesy Garlic Bread eventually went on to win the contest; some claim it was because the signature piquant taste of sriracha could barely be detected in the chip’s flavor. In 2013 the national bagel sandwich chain Bruegger’s introduced the Sriracha Egg Sandwich. Not to be outdone, Subway followed suit with its version of a chicken sriracha melt.

By the end of 2013, sriracha’s popularity seemed to be at an all-time high. From January to December of 2012, some 20 million bottles of sriracha sauce had been sold, and on October 27, 2013, the first Sriracha festival was held in downtown Los Angeles. Americans, it seemed, could not get enough of the hot sauce. That is, until it came into their own backyards.

On October 28, Huy Fong Foods, the purveyor of sriracha, was sued by the small town of Irwindale, California, for causing “burning eyes, irritated throats, and headaches” to its residents. An initial report published by the Associated Press tellingly described the odors produced by the Huy Fong plant as “a nuisance.”

Huy Fong’s owner and creator David Tran’s mistake was in assuming that the sriracha boom meant that the town of Irwindale would accept the changes that came with the presence of Asianness. In many ways, his story was that of the consummate Asian American model minority who had made his mark through hard work and perseverance in America. From origins in Vietnam to “making it” as an ethnic entrepreneur in the US, the story of sriracha, and in particular that of Huy Fong, can be understood as a quintessentially Asian American story.

David Tran, a Vietnamese refugee of Chinese origin, was among the first wave of refugees to leave Vietnam in 1979. Fleeing Vietnam aboard the Panamanian freighter “Huy Fong,” for which he later named his company, Tran started his fledgling company in the town of Rosemead, California, in the mid-1980s with an initial investment of a meager $50,000. Over the next two decades, the company, which uses only jalapeños grown in Piru, California, grew dramatically, largely by word of mouth, and its sauce has become one of the most popular condiments, with something of a cult-like following.

Food historian John T. Edge notes that part of sriracha’s success lies in its ability to position itself as malleable to many palates: “Multicultural appeal was engineered into the product: the ingredient list on the back of the bottle is written in Vietnamese, Chinese, English, French and Spanish. And serving suggestions include pizzas, hot dogs, hamburgers and, for French speakers, pâtés.” Despite sriracha’s obvious connection to Thainess—the sauce, according to a recent documentary, Sriracha (Dir. Griffin Hammond, 2013), has its origins in the town of Si Racha—Tran disavows any necessary connection to a particular national lineage, noting, “I know it’s not a Thai sriracha… It’s my sriracha.”

As the company expanded, it moved from its more modest location in Rosemead to a larger factory in Irwindale. And with the factory’s growth, resentment of the presence of Asianness has been expressed more acutely through a visceral refusal of the purported offensiveness of Asian odors. Ultimately, it is the inability of odors to remain in place, along with the toxicity and purported public-health danger ascribed to Asian-coded comestibles, that has come to characterize this stage of the sriracha story as one of racial exclusion in an Asian American context.

As Martin Manalansan has written elsewhere, “smell in America…is a code for class, racial and ethnic differences.” Yet cities are expected to function as odorless zones, both literally and psychically. Traces of immigrant excess must always be kept at bay, and where food is concerned, difference must be managed to ensure that the kind of food one finds at the table is synchronous with the mandates of a multiculturalist ethos of eating. It must not appear “too foreign,” “too different,” “too oily,” or too aberrant. In other words, it must not be too offensive, lest it upset a carefully calibrated balance of acceptable multiculturalism.

Sriracha seemed poised to become America’s next favorite condiment. But condiments have to be manufactured somewhere, and when Asianness comes to roost in the town of Irwindale, population 1,422 (2% Asian American, 47% white), the cultural odor of the town also changes. And a taste for difference, as history has suggested, can often only go so far. The winds in Irwindale not only transport the sharp smell of the chilies in sriracha sauce; they also convey the heavy weight of Western history’s fraught encounters with olfactory experiences.

Throughout the ages, smell has been used to mark things and bodies that are sinister, sinful, dangerous, foreign, diseased, and decaying. Modern cities were planned under the idealized schemes of de-odorized landscapes. Accoutrements to contemporary living include room deodorizers and air fresheners that aim to eliminate unwanted odors and showcase social uplift and class distinction. The Sriracha incident in California reeks of all these historical antecedents and cultural symptoms. The very fact that sriracha has been called a “public nuisance” and a potential health threat is part of a longer tradition that views Asianness as a public health menace. The SARS epidemic of 2002, with its concomitant xenophobic links to the fear of Asian bodies, is not far removed from the panic about Asianness discursively inherent in the charges being levied against Huy Fong Foods.

In the midst of all the accusations and counter-accusations of state overreach, cultural insensitivity, and xenophobia, smell should not be seen as merely a potential health hazard but rather as a crucial signpost of where we are as a society and as a nation in the 21st century. Indeed, to consider sriracha’s odors a public nuisance is not far removed from the kinds of racializing language that are used to put immigrants in their place. We may like our sriracha bottles on our tables, but we don’t want them too close, lest they contaminate our spaces of living. As with the Asian American bodies with which we associate the bottles of hot sauce, we prefer to limit the spread of Asianness.

On November 29, 2013, three days after a Los Angeles court ruled in favor of a partial shutdown of the company, Huy Fong Foods issued a simple public statement, placing a banner reading “No tear gas made here” outside its Irwindale factory. Those simple words summed up what is perhaps really at stake here. The underlying issues that led to the fracas about sriracha are very much about toxicity, but the banner is as much about dispelling the notion that the company’s product is toxic as it is about pointing out that the racism underlying the charges against Huy Fong is itself a more dangerous form of toxicity, one that seeks to vigilantly remind immigrants where they do and do not belong.

Anita Mannur is Associate Professor of English and Asian/Asian American Studies at Miami University. Martin F. Manalansan, IV is Associate Professor of Anthropology and Asian American Studies at the University of Illinois, Urbana-Champaign. Mannur and Manalansan are co-editors (with Robert Ji-Song Ku) of Eating Asian America: A Food Studies Reader (NYU Press, 2013).

Embracing spreadability in academic publishing

—Sam Ford

The world of academic publishing was built on a model of scarcity. The specialist knowledge of an academic discipline was considered too limited for general commercial publication, so a niche industry grew up to support the development and publication of essay- and book-length academic work. Academic presses played a vital role in this model and built their infrastructure to protect academic essays and make them available to university libraries and specialists in a particular field. And, in turn, the system for evaluating success among academics developed in tandem with this publishing model, so that publishing milestones have become the logic on which tenure processes are built.

I had the pleasure of being invited to speak to the American Association of University Presses last summer on a panel about “reaching the world.” There, I argued that university presses have to rethink their raison d’être in the 21st century.

In a world where information is now overabundant rather than scarce, might it make sense that publishers have to change their logic dramatically in order to stay relevant? Rather than protecting information and bringing it into circulation inside academia, as the old model had it, might not the role of the press be to curate and further cultivate the most important content in that vast field—and, equally important, to focus on bringing that content to new audiences outside university libraries and professionals within one discipline?

I cited—as an example—my experiences with Spreadable Media, the book I published this year (co-authored with Henry Jenkins and Joshua Green) with New York University Press. There were few arguments or examples in the book that hadn’t, in some form, been published or presented somewhere previously: various white papers, blog posts, online articles, academic essays, keynote speeches, and so on. And we have published excerpts and examples from the book in a variety of places since it came out. Further, the overall project included more than 30 essays, available freely online, in addition to the book we co-authored.

As far as I can tell, the availability of all that material hasn’t hindered interest in our book. For the few people who would have bought the book but were instead sated by finding the information available online, there were many more who discovered the book through these various materials and purchased it.

Writing more than a decade ago about piracy, Tim O’Reilly said, “Obscurity is a far greater threat to authors and creative artists than piracy.” The same can be said for concerns about “self-cannibalization.” And the logic of at least some presses’ acquisitions editors underscores this. Consider this statement from Harvard University Press: “prior availability doesn’t have a clear relationship to market viability.”

An early version of a piece that Peter Froehlich (of Indiana University Press) published in Learned Publishing in October highlights the model now employed by Harvard Business Review Press as a potential way forward: the press embraces multiple-platform publishing, treating its blog, its magazine, and its books as varying tiers of publication and embracing authors who share their ideas elsewhere—in the process developing a reputation as a catalyst for thinking and then curating the best of that thinking in increasingly formal ways.

In this model, the book acts as a thoroughly edited articulation of an idea at a moment in time: the culmination of work up to that point, the launching point of work to come. And the press helps take that idea and make it accessible, in reasonable fullness, to those who haven’t been following the development of the argument all along the way. In other words, the press’ role is about curating the information that most needs to be preserved and then making that information more visible to people outside the narrow field from which it came.

A similar model can be seen at publications like Fast Company. Authors like me write online pieces, with Fast Company receiving 24-hour exclusivity before the writing is shared elsewhere. The magazine may pull together and curate its deepest, most considered pieces. Meanwhile, thoughts I initiated at Fast Company may eventually show up elsewhere (properly attributed and sourced, of course). This is a publishing model that still provides windows for a viable business without being focused on locking content down.

This is a vital problem to figure out, not just for the current and next generations of academics but, crucially, for the next generation of college students and for all of us who benefit when ideas from within the academy spread throughout the culture and our professional worlds. It’s not just an issue for niche university presses to solve but a question for us all.

Sam Ford is Director of Audience Engagement with Peppercomm, an affiliate with both MIT Comparative Media Studies/Writing and the Western Kentucky University Popular Culture Studies Program, and co-author of Spreadable Media: Creating Value and Meaning in a Networked Culture (NYU Press, 2013). He is also a contributor to Harvard Business Review and Fast Company.