Depictions of masculinity on television

Amanda D. Lotz

It is revealing that so little has been written about men on television. Men have embodied such an undeniable presence and composed a significant percentage of the actors upon the small screen—be they real or fictional—since the dawn of this central cultural medium and yet rarely have been considered as a particularly gendered group. In some ways a parallel exists with the situation of men in history that Michael Kimmel notes in his cultural history, Manhood in America. Kimmel opens his book by noting that “American men have no history” because although the dominant and widely known version of American history is full of men, it never considers the key figures as men. Similarly to Kimmel’s assertion, then, we can claim that we have no history of men, masculinity, and manhood on television—or at best, a very limited one—despite the fact that male characters have been central in all aspects of the sixty-some years of US television history. It is the peculiar situation that nearly all assessments of gender and television have examined the place and nature of women, femininity, and feminism on television while we have no typologies of archetypes or thematic analyses of stories about men or masculinities.

For much of television studies’ brief history, this attention to women made considerable sense given prevailing frameworks for understanding the significance of gender representation in the media. Analyses of women on television largely emerged out of concern about women’s historical absence in central roles and the lack of diversity in their portrayals. Exhaustive surveys of characters revealed that women were underrepresented on television relative to their composition of the general populace and that those onscreen tended to be relegated to roles as wives, love interests, or sex objects. In many cases, this analysis was linked with the feminist project of illustrating how television contributed to the social construction of beliefs about gender roles and abilities, and given the considerable gender-based inequity onscreen and off, attention to the situation of men seemed less pressing. As a result, far less research has considered representations of men on television and the norms or changes in the stories the medium has told about being a man.

Transitioning the frameworks used for analyzing women on television is not as simple as changing the focus of which characters or series one examines. Analyzing men and masculinity also requires a different theoretical framework, as the task of the analysis is not a matter of identifying underrepresentation or problematic stereotypes in the manner that has dominated considerations of female characters. The historic diversity of stories about and depictions of straight white men has seemed to prevent the development of “stereotypes” that have plagued depictions of women and has lessened the perceived need to interrogate straight white men’s depictions and the stories predominantly told about their lives. Any single story about a straight white man has seemed insignificant relative to the many others circulating simultaneously, so no one worried that the populace would begin to assume all men were babbling incompetents when Darrin bumbled through episodes of Bewitched, that all men were bigoted louts because of Archie Bunker, or even that all men were conflicted yet homicidal thugs in the wake of Tony Soprano. Further, given men’s dominance in society, concern about their representation lacked the activist motivation compelling the study of women that tied women’s subordinated place in society to the way they appeared—or didn’t appear—in popular media.

So why explore men now? First, it was arguably shortsighted to ignore analysis of men and changing patterns in the dominant masculinities offered by television to the degree that has occurred. Images of and stories about straight white men have been just as important in fostering perceptions of gender roles, but they have done their work by prioritizing some attributes of masculinity—supported some ways of being a man—more than others. Although men’s roles might not have been limited to the narrow opportunities available to women for much of television history, characteristics consistent with a preferred masculinity, always specific to the era of production, have pervaded—attributes that might generally be described as what is meant when a male is told to “be a man.” In the past, traits such as the stoicism and controlled emotionality of not being moved to tears, of proving oneself capable of physical feats, and of aggressive leadership in the workplace and home have been common. Men’s roles have been more varied than women’s, but television storytelling has nevertheless performed significant ideological work by consistently supporting some behaviors, traits, and beliefs among the male characters it constructs as heroic or admirable, while denigrating others. So although television series may have displayed a range of men and masculinities, they also circumscribed a “preferred” or “best” masculinity through attributes that were consistently idealized.

The lack of comprehensive attention to men in any era of television’s sixty-some-year history makes the task of beginning difficult because there are so few historical benchmarks or established histories or typologies against which newer developments can be gauged. Perhaps few have considered the history of male portrayal because so many characteristics seemed unexceptional due to their consistency with expectations and because no activist movement has pushed a societal reexamination of men’s gender identity in the manner that occurred for women as a component of second-wave feminism. Male characters performed their identity in expected ways that were perceived as “natural” and drew little attention, indicating the strength of these constructs. Indeed, television’s network-era operational norms of seeking broad, heterogeneous audiences of men and women, young and old, led to representations that were fairly mundane and unlikely to shock or challenge audience expectations of gender roles.

One notable aspect of men’s depictions has been the manner through which narratives have defined them primarily as workers in public spaces or through roles as fathers or husbands—even though most male characters have been afforded access to both spaces. A key distinction between the general characterizations of men versus women has been that shows in which men functioned primarily as fathers (Father Knows Best, The Cosby Show) also allowed for them to leave the domestic sphere and have professional duties that were part of their central identity—even if actually performing these duties was rarely given significant screen time. So in addition to being fathers and husbands, with few exceptions, television’s men also have been workers. Similarly, the performance of professional duties has primarily defined the roles of another set of male characters, as for much of television history, stories about doctors, lawyers, and detectives were necessarily stories about male doctors, lawyers, and detectives. Such shows may have noted the familial status of these men but rarely have incorporated family life or issues into storytelling in a regular or consistent manner.

This split probably occurs primarily for reasons of storytelling convention rather than any concerted effort to fragment men’s identity. I belabor this point here because a gradual breakdown in this separate-spheres approach occurs in many dramatic depictions of men beginning in the 1980s and becomes common enough to characterize a sub-genre by the twenty-first century. Whether allowing a male character an inner life that is revealed through first-person voice-over—as in series such as Magnum, P.I., Dexter, or Hung—or gradually connecting men’s private and professional lives even when the narrative primarily depicts only one of these spheres—as in Hill Street Blues or ER—such cases in which the whole lives of men contribute to characterization can be seen as antecedents to the narratives that emphasize the multifaceted approach to male characters that occurs in the male-centered serial in the early 2000s. Though these series offer intricately drawn and complex protagonists, their narrative framing does not propose them as “role models” or as men who have figured out the challenges of contemporary life. The series and their characters provide not so much a blueprint of how to be a man in contemporary society as a constellation of case studies exposing, but not resolving, the challenges faced.

The scholarly inattention to men on television is, oddly, somewhat particular to television studies. The field of film studies features a fairly extensive range of scholarship attending to changing patterns of men’s portrayals and masculinities. While these accounts are fascinating, the specificity of film as a medium very different from television in its storytelling norms (a two-hour contained story as opposed to television’s prevailing use of continuing characters over years of narrative), industrial characteristics (the economic model of film was built on audiences paying for a one-time engagement with the story while television relies on advertisers that seek a mass audience on an ongoing basis), and reception environment (one chooses to go out and see films as opposed to television’s flow into the home) prevents these studies of men on film from telling us much about men on television. Further, gender studies and sociology have developed extensive theories of masculinity and have been more equitable in extending beyond the study of women. Although theories developed in these fields provide a crucial starting point—such as breaking open the simple binary of masculinity and femininity to provide a language of masculinities—it is the case that the world of television does not mirror the “real world” and that the tools useful for exploring how societies police gender performance aren’t always the most helpful for analyzing fictional narratives. Sociological concepts about men aid assessments of men and masculinity on television, but it is clearly the case that the particularities of television’s dominant cultural, industrial, and textual features require focused and specific examination.

Why Cable Guys?

One of the motivations that instigated my 2006 book Redesigning Women: Television after the Network Era was frustration with how increasingly outdated frameworks for understanding the political significance of emerging gender representations were inspiring mis-, or at least incomplete, readings of shows and characters that indicated a rupture from previous norms. Tools established to make sense of a milieu lacking central female protagonists disregarded key contextual adjustments—such as the gradual incorporation of aspects of second-wave feminism into many aspects of public and private life—and were inadequate in a society profoundly different from that of the late 1960s. For example, it seemed that some aspects of gender scripts had changed enough to make the old models outdated, or that there was something more to Ally McBeal than the length of her skirts, her visions of dancing babies, and her longing for lost love that had led to scorn and dismissal from those applying conventional feminist analytics. Given generational and sociohistorical transitions apparent by the mid-1990s, it seemed that this series and its stories might be trying to voice and engage with adjustments in gender politics rather than be the same old effort to contain women through domesticity and conventional femininity, as was frequently asserted.

I’m struck by a similar impulse in reflecting on how stories about men, their lives, and their relationships have become increasingly complicated in the fictional narratives of the last decade. Indeed, this evolution in depictions of male identities has not received the kind of attention lavished on the arrival of the sexy, career-driven singles of Sex and the City and Ally McBeal or the physically empowered tough women of Buffy the Vampire Slayer or Xena: Warrior Princess. Assessments of men in popular culture, and particularly television, haven’t been plentiful in the last decade. Most of the discussion of men on television merely acknowledges new trends in depiction—whether they be the sensitivity and everymanness of broadcast characters or the dastardly antiheroism of cable protagonists. Such trend pieces have offered little deeper engagement with the cultural and industrial features contributing to these shifts or analysis of what their consequences might be for the cultures consuming them.

While these curiosities might motivate any scholar, I suspect the motivations of a female feminist scholar embarking on an analysis of men and masculinity also deserve some explanation. In addition to curiosity about shifting depictions and stories on my television screen, for well over a decade I’ve also had the sense that “something is going on” with men of the post–Baby Boomer generation, who, like me, were born into a world already responding to the critiques and activism of second-wave feminism. Yet nothing I’ve read has adequately captured the perplexing negotiations I’ve observed. For example, on a sunny Tuesday morning just after the end of winter semester classes, I took a weekday to enjoy the arrival of spring with my toddler. We found ourselves in the sandpit at the neighborhood park, and shared it that day with two sisters—one a bit older, the other a bit younger than my nearly two-year-old son—who were being watched over by their father. He was about my age and was similarly clad in the parental uniform of exercise pants and a fleece jacket. With some curiosity I unobtrusively watched him interact with his daughters. Dads providing childcare aren’t uncommon in my neighborhood—overrun as it is with academics and medical professionals with odd hours that allow for unconventional childcare arrangements—but something in his demeanor, his willingness to go all in to the tea party of sandcakes his oldest was engaging him with, grabbed my attention for its play with gender roles. It reminded me of the many male friends with whom I share a history back to our teen years who have similarly transformed into engaged and involved dads; they’ve seemingly eradicated much of the juvenile, but also sexist, perspectives they once presented, and also have become men very different from their fathers. Then his phone rang. Immediately, his body language and intonation shifted as he became a much more conventional “guy.” Was it a brother? It was definitely another man. An entirely different performance overtook his speech and demeanor as he strolled away from the sandpit, yet, suggesting that all was not reversed, he proceeded to discuss attending a baby shower, whether he and his wife would get a sitter, and the etiquette of gift giving for second babies. When the call ended he shifted back to the self I had first observed.

Watching this made me reflect on how the gender-based complaints I might register regarding balancing work and family—such as the exhausting demands, the still-tricky negotiations of relationships that cross the working mom/stay-at-home mom divide, and the ever-ratcheting demands to be the Best Mom Ever while maintaining pre-mom employment productivity—have been well documented by others and are problems with a name. My male peers, in contrast, must feel out to sea with no land or comrades in sight. Esteemed gender historian Stephanie Coontz has gone so far as to propose the term and reality of a “masculine mystique” as an important component of contemporary gender issues.

This wasn’t the first time I’d been left thinking about the contradictory messages offered to men these days. The uncertain embodiment of contemporary manhood appears in many places. For years now I’ve wondered, even worried, about the men in my classes. In general, they seem to decrease in number each year, perhaps being eaten by the ball caps pulled ever lower on their foreheads. As a hopefully enlightened feminist scholar, I try to stay attuned to the gender dynamics of my classroom—but what I’ve commonly found was not at all what I was prepared for or expected. Consistent with the Atlantic cover story in the summer of 2010 that declared “The End of Men” and touted that women had become the majority of the workforce, that the majority of managers were women, and that three women earned college degrees for every two men, the young women in my classes consistently dominate their male peers in all measures of performance—tests, papers, class participation, attendance. I haven’t been able to explain why, but it has seemed that most—although certainly not all—of the young men have no idea why they find themselves seated in a college classroom or what they are meant to do there. Though I must acknowledge that despite evidence of female advancement in sectors of the academy like mine, men still dominate in many of the most prestigious and financially well-rewarded fields, including engineering, business, and computer science.

I brought my pondering about classroom gender dynamics home at night as I negotiated the beginning of a heterosexual cohabitation in the late 1990s and thought a lot about what it meant to become a “wife” and eventually a “mother.” There were also conversations about what it meant to be the husband of a feminist and how being a dad has changed since our parents started out, although the grounds for these talks were more uncertain and role models and gender scripts seemed more lacking. Both in charting our early years of marriage and still in facing parenthood, my husband and I have often felt adrift and without models. Although we had little to quibble with in regard to our own upbringing, neither of us was raised in households in which both parents had full-time careers, which seemed quite a game changer and has proved the source of our most contentious dilemmas. While a wide range of feminist scholarship and perspectives has offered insight into the challenges of being a mom and professor, my husband and his compatriots seem to be divining paths without a map or a trail guide. As the mother of both a son and a daughter, I feel somewhat more prepared to help my daughter find her way among culturally imposed gender norms than my son; at least for her the threats and perils are known and named.

Amanda D. Lotz is Associate Professor of Communication Studies at the University of Michigan. She is the author of Cable Guys: Television and Masculinities in the 21st Century (NYU Press, 2014).

[Read a fuller version of this excerpt from Amanda D. Lotz's new book, Cable Guys on Salon.com.]

Cycles of gender testing

—Ellen Samuels

A friend who cycles competitively just sent me a link to the new policy on transgender participants in the Eastern Collegiate Cycling Conference. It seems like a progressive and welcoming policy, stating that:

“The ECCC particularly recognizes the challenges facing transgender athletes. Such members of the community should compete in the gender category most appropriate to their unique personal situation.”

The release of this policy highlights the growing centrality of issues of non-normative gender and sexuality in athletic competitions as well as in the wider cultural sphere. The prominence of such concerns, as well as the challenges ahead, was underscored in the weeks leading up to the 2014 Olympic games, as tennis great Billie Jean King called for an LGBTQ “John Carlos moment”—referring to the African American 1968 Olympic medalist who stood on the winners’ podium with lowered head and raised fist, becoming an iconic symbol for social justice.

In Sochi, despite extensive media coverage of Russian anti-gay policies, that moment never came.

Meanwhile, a little-noted story out of Iran highlighted the extent to which international sports must still contend with its own legacy of gendered injustice. In February, on the cusp of Women’s History Month, it was reported that players in Iran’s women’s soccer league were being subjected to “gender testing” and that a number of players were subsequently expelled from the team for failing to qualify as “real women.”

Sex testing in female athletics has a long and tarnished history dating back to the 1940s, and has included requiring female athletes to parade naked before male doctors, performing invasive medical exams, and mandating genetic and hormonal testing. Indeed, from 1968 until the early 1990s, all elite athletes competing as female were required to carry “certificates of femininity,” issued by the International Association of Athletics Federations. Such universal sex testing was abandoned more than a decade ago, but female athletes who are perceived as overly “masculine” are still required to undergo sex testing and even medical treatment in order to remain eligible.

Representations of the Iranian soccer controversy in the Western media have invoked anti-Islamic stereotypes of backwardness, suggesting that gender confusion was caused by the body-masking uniforms worn by the soccer players. These stories ignore the long history of female athletes from all nations and in the skimpiest of running outfits being challenged and subjected to sex testing, their bodies closely analyzed for signs of masculine “hardness,” “strength,” and “power.”

Media reporting on the Iranian women’s soccer team also reflects a common and disturbing tendency to blur together the very different topics of transgender athletes, intersex athletes, and athletes suspected to be cisgendered men deliberately pretending to be women. The International Olympic Committee recently revised its gender policies in part to attempt to disentangle these categories—although the new policies are rife with their own problematic understandings of “sex” and “gender.”

To return to the ECCC policy, after appreciating its initial trans-positive language, I was dismayed to read the next paragraph:

“Competitors may be asked by the Conference Director(s) and/or their designee(s) to furnish two pieces of documentation from relevant legal, medical, or academic authorities documenting personal sex, gender, or gender dysphoria supporting their selected competition gender category.”

Such requirements show how assumptions about the necessity for biocertification can both underpin and undermine even the most well-meaning of policies directed toward people who do not fit neatly into gender binaries. It is likely that, just as in international female athletics, the cyclists asked to provide documentation will be those who appear suspiciously “masculine” yet identify as female.

However, I did notice a peculiar difference in this policy compared to those adopted in the Olympics and other sports settings: The athlete can provide material from “relevant legal, medical, or academic authorities” to support their gender identification.

To my knowledge, no other athletic gender policy allows for “academic” documentation, and I can’t help but wonder what such documentation would look like: Would a note from Judith Butler suffice? Certainly, this unusual addition to a biocertification policy indicates that queer, trans*, and feminist scholars should not discount the relevance of our work to the everyday contestations of gender in sports and other sites of global exchange.

Ellen Samuels is Assistant Professor of Gender and Women’s Studies and English at the University of Wisconsin at Madison. She is the author of Fantasies of Identification: Disability, Gender, Race (NYU Press, 2014).

The racism that still plagues America

Excerpted from The Price of Paradise: The Costs of Inequality and a Vision for a More Equitable America by David Dante Troutt (NYU Press, 2014).

The impatience that characterizes discussions of race and racism in our so-called color-blind society has its roots in the momentous  legislative changes of the 1960s. The Civil Rights Acts of 1964, 1965, and 1968 reached into nearly every aspect of daily life—from segregated facilities to voting to housing—and represented a long overdue re-installation of the equality principle in our social compact. The question was what it would take—and from whom—to get to equality.

Was racial equality something that could be had without sacrifice? If not, then who would be forced to participate and who would be exempt? As implementation of the laws engendered a far-reaching bureaucracy of agencies, rules, and programs for everything from affirmative action hiring goals to federal contracting formulas, the commitment was quickly tested. For a great many who already opposed the changes, patience was quickly exhausted. As welfare rolls rapidly increased, crime surged, and the real and perceived burdens of busing took their toll, many voters pointed to the apparent failure of a growing federal government to fix the problems it was essentially paid to cure. Among Democratic voters this made for unsteady alliances and vulnerable anxieties. People don’t live in policy and statistics as much as they do through anecdote and personal burdens. A riot here, a horrific crime there, a job loss or perhaps the fiery oratory of a public personality could tip a liberal-leaning person’s thinking toward more conservative conclusions—or at least fuel her impatience. Impatience would ossify into anger, turning everything into monetary costs, and making these costs the basis for political opposition to a liberal state. As it happened, this process moved the date of our supposed final triumph over racism from the mid-1960s to at least the mid-1980s. In the end, impatience won.

What I call impatience, others have characterized as a simmering voter ambivalence—even antagonism, in the case of working-class whites—to civil rights remedies, one that was susceptible to the peculiar backlash politics that elected both Ronald Reagan and George Herbert Walker Bush president. Language was central to this strategy, and the language that stuck was colorblindness. As Thomas Byrne Edsall and Mary Edsall wrote in “Chain Reaction: The Impact of Race, Rights, and Taxes on American Politics,” “In facing an electorate with sharply divided commitments on race—theoretically in favor of egalitarian principle but hostile to many forms of implementation—the use of a race-free political language proved crucial to building a broad-based, center-right coalition.” Ronald Reagan managed to communicate a message that embodied all the racial resentments around poverty programs, affirmative action, minority set-asides, busing, crime, and the Supreme Court without mentioning race, something his conservative forebears—Barry Goldwater, George Wallace, and Richard Nixon—could not quite do. The linchpin was “costs” and “values.” Whenever “racism” was raised, it became an issue of “reverse racism” against whites. The effect was the conversion of millions of once fiscally liberal, middle-class suburban Democrats to the Republican Party. Issues identified with race—the “costs of liberalism”—fractured the very base of the Democratic Party. In the 1980 presidential election, for example, 22 percent of Democrats voted Republican.

By 1984, when Ronald Reagan and George Bush beat Walter Mondale and Geraldine Ferraro in the presidential election, many white Democratic voters had come to read their own party’s messages through what Edsall calls a “racial filter.” In their minds, higher taxes were directly attributable to policies of a growing federal government; they were footing the bill for minority preference programs. If the public argument was cast as wasteful spending on people of weak values, the private discussions were explicitly racial. For instance, Edsall quotes polling studies of “Reagan Democrats” in Macomb County—the union-friendly Detroit suburbs that won the battle to prevent cross-district school desegregation plans in 1973—that present poignant evidence of voter anger: “These white Democratic defectors express a profound distaste for blacks, a sentiment that pervades almost everything they think about government and politics. . . . Blacks constitute the explanation for their [white defectors’] vulnerability and for almost everything that has gone wrong in their lives; not being black is what constitutes being middle class; not living with blacks is what makes a neighborhood a decent place to live. These sentiments have important implications for Democrats, as virtually all progressive symbols and themes have been redefined in racial and pejorative terms.”

By 1988, these same voters had endorsed tax revolts across the country and had become steadfast suburbanites, drawing clearer lines between a suburban good life and the crime and crack-infested city. Still they were angry, as magazine articles chronicled the rising political significance of what would be known as the “Angry White Male” voter. George Bush, down seventeen points in the presidential election polls during midsummer, overcame that deficit with TV ads about murderous black convicts raping white women while on furlough. That and a pledge never to raise taxes seemed to be enough to vanquish Bush’s liberal challenger, Michael Dukakis of Massachusetts. What’s important to recognize in this transition is how as recently as twenty years ago, Americans’ social lives were very much embroiled in racial controversy—despite the obfuscatory veneer of colorblind language to the contrary. Our politics followed. The election of Bill Clinton represented a distinct centrist turn among Democrats toward Republican language and themes and away from rights, the “liberal” label, and the federal safety net. The question we might ask about our current race relations is, only a couple of decades removed from this political history, what would compel us to assume that we are beyond the legacy of our racial conflicts?

 * * *

The racial polarization that connected these political outcomes was deliberately fed by national Republican candidates in order to do more than roll back civil rights. It also served to install “supply-side economics,” a system of regressive tax-based reforms that contributed mightily to the costs of income inequality we currently face. That era—which arguably ended with the election of President Barack Obama—illustrates two points central to my examination of civic connectivity. The first is that the economic underside of racial polarization proved no more than the old okey doke. The second is that localism contains its own contradictions, which have come due in our time. Let me explain.

Only racism could achieve the ideological union of the Republican rich with the working man (and woman). Nothing else could fuse their naturally opposed interests. The essence of supply-side economics was its belief in the importance of liberating the affluent from tax and regulatory burdens, a faith not typically shared by lower-income households who might at best see benefits “trickle down” to them. In fact, they often paid more under tax-reform schemes of the 1980s.  Edsall provides data on the combined federal tax rate that include all taxes—income, Social Security, and so forth. Between 1980 and 1990, families in the bottom fifth of all earners saw their rates increase by 16.1 percent; it increased by 6 percent for those in the second-lowest fifth (the lower middle class); and it increased by 1.2 percent for those in the middle fifth (the middle middle class). But those in the second-highest fifth of all income earners saw a cut in their tax rate by 2.2 percent during that decade; and those in the top fifth got a 5.5 percent decrease in their rate. Overall, the richest 10 percent of American earners received a 7.3 percent decrease in their combined federal tax rate. The top 1 percent? A 14.4 percent cut during the 1980s. Clearly this hurt the middle class, as the vaunted trickle down never arrived. But it was working-class whites who bought the message that this model of fiscal conservatism, married to social conservatism in the form of a rollback of redistributive programs they perceived to favor blacks, would benefit them. It did not. Yet it established a popular political rhetoric by which lower-income whites can be counted on to take up against “liberal” policies that may actually serve their interests as long as opposition can be wrapped in the trappings of “traditional values,” “law and order,” “special interests,” “reverse racism,” and “smaller government.” This was pure okey doke based on an erroneous notion of zero-sum mutuality—that is, that whatever “the blacks” get hurts me.

Which also demonstrates the contradictions of localism. Remember my earlier argument that localism—or local control expressed formally through home rule grants, as it’s sometimes known—became the spatial successor to Jim Crow segregation. Through racially “neutral” land use and housing policy, it kept white communities white after the fall of legal segregation in the late 1950s and mid-1960s. Yet here’s the contradiction. While voters opposed to civil rights remedies and Great Society programs followed Republican leadership toward fiscal conservatism at the national level, they maintained their fiscal liberalism at the local level. The tax base they created for themselves through property taxes in suburbia could be contained and spent locally. Edsall describes the irony this way: “Suburbanization has permitted whites to satisfy liberal ideals revolving around activist government, while keeping to a minimum the number of blacks and the poor who share in government largess.” Of course, all of this worked best when “suburbs” meant middle-class white people and “cities” (or today’s “urban” areas) always signaled black and brown people. There was no mutuality of interests between the two kinds of places. It also worked when low property taxes—together with generous state aid—could reliably pay for great local public services like schools, libraries, and fire protection. It was a terrific deal. But that was then. Now, neither is true. The line between cities and suburbs has blurred into regions, and minorities and whites are busy crossing back and forth to work, live, and shop. Most of the fragmented municipalities that sprawled across suburbia are no longer able to sustain their own budgets, threatening the quality of their services, despite unimaginably high property taxes. The assumptions have not held.

Perhaps now we should consider the racially polarizing policies that became the norm under Reagan’s failed experiment. We tried them. Some believed fervently in them. But it is clear that they didn’t work and are not in our long-term national or local interest. There remains a legacy of racism, however, that continues to harm some of us disproportionately and all of us eventually. It’s to those three examples that I now turn.

Read the rest of this excerpt at Salon.com.

Dude, what’s that smell? The Sriracha shutdown and immigrant excess

—Anita Mannur and Martin Manalansan

All across America, bottles with a green cap, a rooster, and fiery chili sauce, once exclusively the mainstay of fast-food-style Asian restaurants, have been slowly making their mark on mainstream palates. In 2011, the popular television show The Simpsons featured an episode—described by executive producer Matt Selman as a “love letter to food culture”—in which Bart Simpson’s usually pedestrian palate becomes attuned to the finer possibilities of sriracha.

In 2012, as part of a national campaign to introduce a new flavor, the Lay’s potato chip company announced Sriracha as one of the three finalist flavors, along with Cheesy Garlic Bread and Chicken & Waffles. Cheesy Garlic Bread Lay’s eventually went on to win the contest; some claim it was because the signature piquant taste of sriracha could barely be detected in the chip’s flavor. In 2013 the national bagel sandwich chain Bruegger’s introduced the Sriracha Egg Sandwich. Not to be outdone, Subway followed suit with its version of a chicken sriracha melt.

By the end of 2013, sriracha’s popularity seemed to be at an all-time high. From January to December of 2012, some 20 million bottles of sriracha sauce were sold, and on October 27, 2013, the first Sriracha festival was held in downtown Los Angeles. Americans, it seemed, could not get enough of the hot sauce. That is, until it came into their own backyards.

On October 28, Huy Fong Foods, the purveyor of sriracha, was sued by the small town of Irwindale, California for causing “burning eyes, irritated throats, and headaches” to its residents. An initial report published by the Associated Press tellingly described the odors produced by the Huy Fong plant as “a nuisance.”

Huy Fong’s owner and creator David Tran’s mistake was in assuming that the sriracha boom meant that the town of Irwindale would accept the changes that came with the presence of Asianness. In many ways, his story was that of the consummate Asian American model minority who had made his mark through hard work and perseverance in America. From origins in Vietnam to “making it” as an ethnic entrepreneur in the US, the story of sriracha, and in particular that of Huy Fong, can be understood as a quintessentially Asian American story.

David Tran, a Vietnamese refugee of Chinese origin, was among the first wave of refugees to leave Vietnam in 1979. Fleeing Vietnam aboard the Panamanian freighter “Huy Fong,” after which he later named his company, Tran started his fledgling business in the town of Rosemead, California in the mid-1980s with an initial investment of a meager $50,000. Over the next two decades, the company, which exclusively harvests jalapeños grown in Piru, California, grew dramatically, largely by word of mouth, and its sauce has become one of the most popular condiments, with something of a cult-like following.

Food historian John T. Edge notes that part of sriracha’s success is in its ability to position itself as malleable to many palates: “Multicultural appeal was engineered into the product: the ingredient list on the back of the bottle is written in Vietnamese, Chinese, English, French and Spanish. And serving suggestions include pizzas, hot dogs, hamburgers and, for French speakers, pâtés.” Despite sriracha’s obvious connection to Thainess—the sauce, according to a recent documentary, Sriracha (Dir. Griffin Hammond, 2013), has its origins in the town of Si Racha—Tran disavows the necessary connection to one particular national lineage, noting, “I know it’s not a Thai sriracha…It’s my sriracha.”

As the company expanded, it moved from its more modest location in Rosemead to a larger factory in Irwindale. And with the growth of the factory, resentment of the presence of Asianness has been expressed more acutely through a rejection of the visceral, purported offensiveness of Asian odors. Ultimately it is the inability of odors to remain in place, along with the toxicity and purported public health danger of Asian-coded comestibles, that has come to characterize this stage in the sriracha story as a story of racial exclusion in an Asian American context.

As Martin Manalansan has written elsewhere, “smell in America…is a code for class, racial and ethnic differences.” Yet cities are expected to function as odorless zones, both literally and psychically. Traces of immigrant excess must always be kept at bay and where food is concerned, difference must be managed to ensure that the kind of food one finds at the table is synchronous with the mandates of a multiculturalist ethos of eating. It must not appear “too foreign,” “too different”, “too oily” or too aberrant. In other words it must not be too offensive, lest it upset a carefully calibrated balance of acceptable multiculturalism.

Sriracha seemed poised to become America’s next favorite condiment. But condiments have to be manufactured somewhere, and when Asianness comes to roost in the town of Irwindale, population 1,422 (2% Asian American, 47% white), the cultural odor of the town also changes. And a taste for difference, as history has suggested, can often only go so far. The winds in the California city of Irwindale not only transport the sharp smell of chilies in the sriracha sauce, they also convey the heavy weight of Western history’s fraught encounters with olfactory experiences.

Throughout the ages, smell has been used to mark things and bodies that are sinister, sinful, dangerous, foreign, diseased, and decaying. Modern cities were planned under the idealized schemes of de-odorized landscapes. Accoutrements to contemporary living include room deodorizers and air fresheners that aim to eliminate unwanted odors and showcase social uplift and class distinction. The Sriracha incident in California reeks of all these historical antecedents and cultural symptoms. The very fact that sriracha has been called a “public nuisance” and a potential health threat is part of a longer tradition that views Asianness as a public health menace. The SARS epidemic of 2002, with its concomitant xenophobic links to the fear of Asian bodies, is not far removed from the panic about Asianness discursively inherent in the charges being levied against Huy Fong Foods.

In the midst of all the accusations and counter-accusations of state overreach, cultural insensitivity and xenophobia, smell should not be seen as merely a potential health hazard but rather as a crucial signpost of where we are as a society and as a nation in the 21st century. Indeed, to consider sriracha’s odors a public nuisance is not far removed from the kinds of racializing language that are used to put immigrants in their place. We may like our sriracha bottles on our tables, but we don’t want them too close, lest they contaminate our spaces of living. Like the Asian American bodies with which we associate the bottles of hot sauce, we prefer to limit the spread of Asianness.

On November 29, 2013, three days after the Los Angeles court ruled in favor of a partial shutdown of the company, Huy Fong Foods released a simple statement to the public with a banner reading, “No tear gas made here,” placed outside its Irwindale factory. Those simple words summed up what is perhaps really at stake here. The underlying issues which have led to the fracas about sriracha are very much about toxicity, but the banner is as much about dispelling the notion that the product they are making is toxic as it is about pointing out that underlying racism and charges against Huy Fong are mired in a more dangerous form of toxicity—one that seeks to vigilantly remind immigrants about where they do and do not belong.

Anita Mannur is Associate Professor of English and Asian /Asian American Studies at Miami University. Martin F. Manalansan, IV is Associate Professor of Anthropology and Asian American Studies at the University of Illinois, Urbana-Champaign. Mannur and Manalansan are co-editors (with Robert Ji-Song Ku) of Eating Asian America: A Food Studies Reader (NYU Press, 2013).

Embracing spreadability in academic publishing

—Sam Ford

The world of academic publishing was built on a model of scarcity. The specialist knowledge of an academic discipline was considered too limited for general commercial publication, so a niche industry was built to support the development and publication of essay and book-length academic publications. Academic presses played a vital role in this model and built their infrastructure to protect and make available academic essays for university libraries and specialists in a particular field. And, in return, the system for evaluating success among academics has been built in tandem with this publishing model—so that publishing milestones have become the logic on which tenure processes are built.

I had the pleasure of being invited to speak to the American Association of University Presses last summer on a panel about “reaching the world.” At it, I advocated that university presses have to rethink their raison d’être in the 21st century.

In a world where information is now overabundant rather than scarce, might it make sense that publishers have to change their logic dramatically in order to stay relevant? Rather than protecting and bringing information to circulation inside academia, as had been the old model, might not the role of the press be to curate and further cultivate the most important content in that vast field—and, equally as important—to focus on bringing that content to new audiences outside university libraries and professionals within one discipline?

I cited—as an example—my experiences with Spreadable Media, the book I published this year (co-authored with Henry Jenkins and Joshua Green) with New York University Press. There were few arguments or examples in this book that weren’t, in some form, published or presented somewhere previously: various white papers, blog posts, online articles, academic essays, keynote speeches, and so on. And we have published excerpts and examples from the book in a variety of places since it came out. Further, the overall project included more than 30 essays, available freely online, in addition to the book we co-authored.

As far as I can tell, the availability of all that material hasn’t hindered interest in our book. For the few people who would have bought the book but were instead sated by finding the information available online, there were many more who discovered the book through these various materials and purchased it.

Writing more than a decade ago about piracy, Tim O’Reilly said, “Obscurity is a far greater threat to authors.” The same can be said of concerns about “self-cannibalization.” And the logic of at least some presses’ acquisition editors underscores this. Consider this statement from Harvard University Press: “prior availability doesn’t have a clear relationship to market viability.”

An early version of a piece Peter Froehlich (with Indiana University Press) published in Learned Publishing in October highlights the model now employed by Harvard Business Review Press as a potential way forward: the press embraces multiple-platform publishing, treating its blog, its magazine, and its books as varying tiers of publication and embracing authors who share their ideas elsewhere—in the process developing a reputation as a catalyst for thinking and then curating the best of that thinking in increasingly formal ways.

In this model, the book acts as a thoroughly edited articulation of an idea at a moment in time: the culmination of work up to that point, the launching point of work to come. And the press helps take that idea and make it accessible, in reasonable fullness, to those who haven’t been following the development of the argument all along the way. In other words, the press’ role is about curating the information that most needs to be preserved and then making that information more visible to people outside the narrow field from which it came.

A similar model can be seen in publications like Fast Company. Authors like me write online pieces, with Fast Company receiving 24-hour exclusivity for our writing before it is shared elsewhere. The magazine may pull together and curate its deepest, most considered pieces. Meanwhile, thoughts I initiated at Fast Company may eventually show up elsewhere (properly attributed and sourced, of course). Such a publishing model still provides windows for a viable business without being focused on locking content down.

This is a vital problem to figure out, not just for the current and next generation of academics but, crucially, for the next generation of college students and all of us who benefit when ideas from within the academy spread throughout the culture and our professional worlds. It’s not just an issue niche university presses need to solve but rather a pressing question for us all.

Sam Ford is Director of Audience Engagement with Peppercomm, an affiliate with both MIT Comparative Media Studies/Writing and the Western Kentucky University Popular Culture Studies Program, and co-author of Spreadable Media: Creating Value and Meaning in a Networked Culture (NYU Press, 2013). He is also a contributor to Harvard Business Review and Fast Company.

 

Podcast: Josh Lambert on Jews and obscenity in America

In Unclean Lips: Obscenity, Jews, and American Culture, Josh Lambert navigates us through American Jews’ participation in the production, distribution, and defense of sexually explicit literature, plays and comedy.

From Theodore Dreiser and Henry Miller to Curb Your Enthusiasm and FCC v. Fox, Lambert explores the central role Jews have played in the struggles over obscenity and censorship in the modern United States. Below, listen to a conversation with Lambert on a recent episode of Vox Tablet’s podcast. 

[Warning: This conversation contains explicit language and content.]

What’s new about Hanukkah?

—Dianne Ashton

[This post originally appeared on the Jewish Book Council blog on November 26, 2013.]

This year, Jewish Americans will participate in an extraordinary Hanukkah celebration—they will light the first menorah candle on the evening before Thanksgiving. This has never happened before, but we came very close to it in 1888. Then, the first Hanukkah light and Thanksgiving occurred on the same day. That year, the national Jewish newspaper, the American Hebrew, dedicated its November 30 issue to the “twofold feasts.” The issue was as much “a tribute to the historic significance of Chanuka” as to “the traditions entwined about Thanksgiving Day.” The editors hoped readers would find the newspaper to be “a stimulus to the joyousness and gladness upon the observance of both.” In previous years they had described Hanukkah as a festival to thank God for the Maccabean victory, and, seeing both Thanksgiving and Hanukkah as occasions for giving thanks to God, they easily encouraged American Jews to enthusiastically celebrate both events.

But most of the time, as we know, Hanukkah occurs at a time closer to Christmas. Most years, the American Hebrew’s Hanukkah message urged its readers not to join their fellow Americans in the national festivities because it was the celebration of Jesus’ birth that enchanted their gentile neighbors. Instead, that newspaper echoed the December messages of most other Jewish publications. Jewish newspapers, synagogue bulletins, women’s and men’s club letters, rabbinical sermons, and the urgings of educators and self-styled community leaders alike urged America’s Jews to make their Hanukkah celebrations as festive as possible.

Again and again, in the years since that early American Hebrew message, American Jews wove Hanukkah’s story into their own contemporary lives in ways that reflected their changing circumstances. Those retellings kept Hanukkah’s meaning alive and relevant. They turned the simple holiday rite into an event which, like other well-loved Jewish festivals, drew families together in their own homes where they could tailor the celebration to fit their own tastes in food and décor, and to reflect their own ideas about the holiday’s significance. They could indulge their children, and be joyous.

Will we ever celebrate Hanukkah and Thanksgiving together this way again? Almost. In 2070 Thanksgiving will fall on November 27th and Hanukkah will begin the following day. In 2165, we will light the first Hanukkah candle on November 28—Thanksgiving Day. But for Hanukkah’s first light to occur the evening before Thanksgiving, as it does this year, is truly an anomaly we won’t see again.

Dianne Ashton is Professor of Religion Studies and former director of the American Studies program at Rowan University. Her most recent book, Hanukkah in America: A History (NYU Press, 2013) is now available. (Read more about the book in this review from the Jewish Book Council.)

Marriage? Meh.

—Karen M. Dunak

In the aftermath of DOMA’s overturning and state after state legalizing same-sex unions, there has been a flurry of articles suggesting the wedding industry has struck gold with the impending rush of gay and lesbian weddings. Maybe. But the New York Times suggests the onslaught may not be what vendors within the “wedding-industrial complex” have hoped for. Many gay men and women will look at the opportunity to marry, be happy for the move toward marriage equality and extension of citizenship rights, and then go about their daily lives.

To some degree, I think the best part of this article is that it uncovers the assumption that those who share a single element of identity are one community. In fact, there is never really just one community but rather multiple communities to consider. When teaching women’s history, I have to remind my students over and over that we can’t say “women” and imagine it’s a catchall term. Differences in race, class, region, religion, political affiliation, and so on make the population impossible to lump as one uniform group. So, too, with gay men and women.

As the Times article notes, “For some, marriage is an outdated institution, one that forces same-sex couples into the mainstream. For others, marriage imposes financial burdens and legal entanglements. Still others see marriage not as a fairy tale but as a potentially painful chapter that ends in divorce.” Exactly. Straight society’s elevation of the married relationship – with all its flaws – above all other relationships is just one area where homosexuals are glad to emphasize their difference from a problematic heterosexist value system.

It’s interesting to consider what influence homosexuals’ negotiation of newfound marriage rights will yield. Even as they existed outside the mainstream, gay relationship styles have been broadly influential. In the 1960s and 1970s, as homosexual relationships became increasingly visible, many couples were happy to live together outside the bonds of matrimony (and for many of the reasons outlined above). In fact, many historians (myself included) argue that gays’ rejection of marriage and celebration of the cohabitation alternative ultimately influenced the straight world, where cohabitation went from an almost non-existent occurrence in the early 1960s to one that was fairly common by the end of the 1970s. Likewise, an emphasis on egalitarianism within gay partnerships influenced a move toward greater equity in straight relationships.

I wonder if it’s possible that the younger generation – the one the Times describes as post-marriage – will wield a similar kind of power and influence. Those in their early twenties, disillusioned by a world in which expectations of marital success are fairly low and divorce is common, may celebrate the acquisition of the right to marry but likewise embrace the right to not marry. It’s possible that marriage equality will stand as a hallmark of sexual civil rights, but the reality of how people live their lives and organize their relationships will remain flexible. Here we may see a community of those committed to marriage alternatives, a community that may be influential but is likely to remain outside the mainstream. And it may well be a community linked not by sexual preference but by age and experience.

Karen M. Dunak is Assistant Professor of History at Muskingum University in New Concord, Ohio. She is the author of As Long as We Both Shall Love: The White Wedding in Postwar America (NYU Press, 2013).

[This piece originally appeared on the author's blog here.]

Bullying, teasing and the gender trap

—Emily W. Kane

With National Bullying Prevention Month underway and a focus this year on the sponsoring organization’s tagline, “The End of Bullying Begins with Me,” I find myself thinking back to what I heard from parents of three- to five-year-old children during interviews for my book, The Gender Trap: Parents and the Pitfalls of Raising Boys and Girls.

I talked to parents from all social backgrounds and all family types, and found that quite a few wanted to give their kids the freedom to pick activities, toys, colors, and approaches that were not strictly determined by gender. But even those parents who wanted to encourage a moderately more fluid approach to gender expressed fear and anxiety about how their children might be treated if they didn’t conform to typical gender expectations.

I heard reports of an everyday world teeming with social pressures, judgments from friends, relatives, their children’s peers, and even strangers if their kids didn’t stick to a pretty narrowly gendered path. These parents were very much conscious of the social costs their children might face and, consistent with decades of scholarship in gender studies, these costs and anxieties loomed larger in relation to boys. With frequent mention of phrases like “picked on” and “ostracized,” parents expressed the fear that their sons would be bullied by other children if they wandered even a little bit off that socially-dictated path.

The trap of parents pushing children toward traditionally gendered outcomes is sometimes baited by beliefs about biology, personal preferences, and unconscious actions. Even when it isn’t, though, the everyday judgments of friends, relatives, and teachers can bait that same trap. Gender nonconformity is far too often met with bullying, and if adults are not vigilant about responding to that bullying and to the more minor policing of gender expectations (which parents in my study labeled teasing), many parents will enforce gendered constraints they don’t even agree with out of fear of what their children might face.

Individual parents can try to create a less constraining world for their children, but only if the rest of us suspend our judgments, applaud their efforts, and seek to interrupt the everyday teasing and more significant bullying that are too often ignored in children’s daily worlds. Suspending our judgments, offering that applause, and executing those interruptions are all ways that the end of bullying can indeed begin with each of us.

Emily W. Kane is a Professor of Sociology at Bates College and the author of The Gender Trap: Parents and the Pitfalls of Raising Boys and Girls (NYU Press, 2012).

Q&A with authors Lisa Jean Moore and Mary Kosut: Part 2

September is National Honey Month! In celebration, we’re featuring the second half of a Q&A with Lisa Jean Moore and Mary Kosut, authors of Buzz: Urban Beekeeping and the Power of the Bee (read the first half here). In part two, Moore and Kosut talk about their experience in the field with bees, the truth about the sting, and bees as the new cause célèbre.

Question: What was it like to be in the field with the bees?

Lisa Jean Moore and Mary Kosut: The first few times with the bees were intense. We weren’t even paying attention to the beekeepers – it was all about being in the space of the bee and moving slowly and deliberately and respecting their airspace. We realized that if they just sit on your body and you aren’t freaking out, they won’t sting you. We had to get over all the ways we have been socialized by media messages that bees are bad and could attack at any moment. Bees are actually mostly docile, and if you are just smart and contemplative you will be okay. But of course, humans make mistakes.

Beekeeping is also a sensual experience. First is the sound of the bees. The hum can be kind of meditative. It is almost like water in a stream – bees can cause people to really calm down and be in the present. Also, in terms of the embodiment of beekeeping, the hive gives off this extraordinary smell, and one of the beekeepers we interviewed talked about this smell being like truffle oil. She talked about it as being like a “good, heady sex smell” – sort of like the pheromones of sex across species. And we smelled that smell.

When people go and open up the hive, they will sit and watch the bees come and go. These are the intimate embodied relationships people have with the bee.

Q: What about the sting?

LJM and MK: In Buzz, we discuss the sting a lot. There are these affective relationships with bees where fear and anxiety are involved in the practice. So there is something about the wildness and danger that is attractive to people in New York City in particular. Because beekeeping had been illegal in New York City until a couple of years ago – in part because bees sting – we found beekeepers who actually liked the illegality. It was living on the edge and exciting.

The unpredictability of the bees is also a thrill. There are thousands of them flying in the air and they are making all this noise and it is a little bit intimidating, because they can sting you. People deal with the dangers differently – some beekeepers work without prophylactics. We interviewed this one guy who does beekeeping barefoot. There is a little bit of machoness to it – to quote one of our beekeepers, there’s a “bad-ass-ness” about it. It is kind of a flirtation with wild nature, but not too wild. It is not like keeping a tiger in the city.

Getting stung – being able to say you lived through it and you are going in again – we think that is attractive to people. Because getting stung hurts. Bees die after they sting, so they actually don’t want to sting you. They do all these warning things to avoid stinging – they release pheromones to signal threat, they change their pitch to show they are angry, and they also do this thing called bonking. They fly in and basically hit you on the forehead with their bodies to warn you and make you back off.

Bees die when they sting you because their bottom half falls off. So it is very sacrificial for them – they are sacrificing their life for the security of the hive. The whole is more important than the individual – we as sociologists are very attracted to that idea.

Humans theorize about the bees, comparing the hive to a democracy where all these individuals are working together for the greater good. In Buzz, we discuss how bees are basically this model insect because they are so easily anthropomorphized and a template for how humans are supposed to behave. Historically people have been attracted to bees.

Q: Do killer bees make honey?

LJM and MK: Killer bees are sort of a misnomer. Basically these are bees taken from Africa and studied in South America – mostly Brazil – for certain characteristics. As they are smaller than European honeybees, they reproduce more quickly. But bees are an unpredictable species. They are domesticated, but they don’t always follow what humans want. So some escaped.

They have the capacity to supplant European honeybee hives with their own queen. Once they install their own queen, a hive can turn over to being Africanized. This has been moving north across the U.S.-Mexico border, reaching as far as about southern Georgia – maybe a bit further. This is tracked by the USDA and other agencies because of the presumed threat of Africanized bees.

They don’t have more venom in their sting – but if they are provoked, they will go farther and longer to sting. Because they nest inside buildings and underneath the ground, humans’ actions sometimes disrupt them more than they do European honeybees. So in Buzz, we talk about that threat of the Africanized bees and how it has been managed through existing tropes of race and racism. Bees have become another way to express anxiety about the border and race and ethnicity. Bees that are out of control are not paying attention to all the rules about entering the country. They are more robust and hardier, and that trait is capitalized on and used in a pejorative sense to make us fearful.

Q: What are some of the surprising findings about bees and urban beekeeping?

LJM and MK: One of the things we found so interesting in looking at bees, urban beekeepers, and colony collapse disorder (CCD) is that while bees are probably dying from a panoply of causes – primarily neonicotinoids – there is also a simultaneous movement to save the bees. We liken this to the 1970s save-the-whales movement.

Bees have become this new mascot or cause célèbre for people to root for or rally behind, and this has effects in the urban beekeeping landscape and also for larger corporations like Häagen-Dazs and other companies that make saving the bees part of their way of engaging with consumers. It is a pitch to get us concerned about the environment. Bees are seen as so wholesome and so threatened that we need to help them. This is a far cry from when we grew up and were taught to run from bees because of “The Swarm” and other killer bee movies.

In a short time, we have been taught to be concerned about the bees, to worry for them, to want to care for them, to want to buy products that protect them. There is a real shift from seeing bees as a threat to seeing them as completely threatened. And this becomes part of our own desire to participate and try to help them.

Fall books available on NetGalley

We’ve got quite a few gems in our NetGalley catalog this fall, all available for advance review now. Book reviewers, journalists, bloggers, librarians, professors, and booksellers – we welcome you to submit a request!

Not familiar with NetGalley? Learn more about how it works.

Buzz: Urban Beekeeping and the Power of the Bee by Lisa Jean Moore and Mary Kosut (September 27, 2013)

We think Booklist said it best: “In this fascinating blend of sociology, ecology, ethnographic research, and personal memoir, the authors range through all of the aspects of the human relationship with the honeybee.”

Ever thought of honeybees as sexy? You might after watching Mary Kosut discuss the sensual nature of beekeeping.


Cut It Out: The C-Section Epidemic in America by Theresa Morris (October 7, 2013)

In Cut It Out, Theresa Morris offers a riveting and comprehensive look at this little-known epidemic, as well as concrete solutions “that deserve the attention of policymakers” (Publishers Weekly starred review).

C-sections are just as safe as vaginal births, right? Not true, says Theresa Morris. Watch her discuss this and other misconceptions on our YouTube channel.


Hanukkah in America: A History by Dianne Ashton (October 14, 2013)

Hanukkah will fall on Thanksgiving this year for the first time ever—and the last time for another 70,000 years. Brush up on your knowledge of the holiday in time to celebrate the once-in-an-eternity event. Publishers Weekly, in another starred review, promises a “scholarly but accessible guide to the evolution of the Festival of Lights in America.”

Stay tuned for our interview with the author!

Browse all of our e-galleys available for review on NetGalley.

The secret history of gay marriage

Excerpted from As Long as We Both Shall Love: The White Wedding in Postwar America by Karen M. Dunak.

On October 10, 1987, nearly 7,000 people witnessed a wedding on the National Mall in Washington, DC. Men and women cheered and threw rice and confetti as family, friends, and community members took part in the largest mass wedding in American history. After the celebrants exchanged rings and were pronounced newlywed, guests released hundreds of balloons into the air. Brides and grooms, dressed in formal wedding attire, cried and embraced after an “emotional and festive” ceremony. Like so many brides and grooms, participants identified the wedding day as one of the happiest, most meaningful days of their lives.

But this was no ordinary wedding. And these were not typical brides and grooms. This wedding held special significance for its participants. Beyond the “mass” nature of the celebration, something else was unique. The newlyweds that fall Saturday paired off as brides and brides, grooms and grooms. “The Wedding,” as it came to be known, marked the symbolic beginning of nearly 2,000 same-sex marriages. Rejecting the idea that a wedding—and by implication, a marriage—should have one male and one female participant, the grooms and their grooms, the brides and their brides presented a striking picture. A wedding, a fairly conventional affair, became a site of radical protest. Layered in meaning, “The Wedding” celebrated the personal commitments of those being wed. At the same time, it was a direct political act that challenged the legal, religious, and social barriers against same-sex relationships. Like couples before them, gay men and lesbians found they could use their weddings to make a statement about the world and the place of their relationship in it.

Designed to reflect an alternative approach to love and marriage, “The Wedding,” part of the 1987 March on Washington for Gay and Lesbian Rights, rejected the narrow definition of marriage that limited the relationship to members of the opposite sex. “The Wedding” likewise rejected a narrow view of the standard wedding celebration. Dina Bachelor, metaphysical minister, hypnotherapist, and “Wedding” officiant, designed a new-age style ceremony. Bachelor recognized the uniqueness of the celebration and chose her words and actions carefully. Standing under a swaying arch of silver, white, and black balloons, Bachelor omitted any mention of the customary “honor and obey, till death do us part.” Including observers in the celebration, she asked witnesses to join hands and encircle the celebrants. For participants, “The Wedding” was not about fitting into a pre-arranged style. Instead, it was about expanding the celebration to include various approaches to marriage and family. Like alternative wedding celebrants of the 1960s and 1970s, same-sex partners recognized the flexibility of the wedding and used the celebration to express their views about life and love. Bachelor likewise noted the celebration’s significance and concluded the event by stating, “It matters not who we love, only that we love.”

Gay community leaders emphasized the political component of the celebration. Drawing on the activist view that the personal was political, the public pronouncement and celebration of a long-ridiculed personal lifestyle served as the ultimate political statement. Those present rejected the shame associated with their relationships and proved that many same-sex couples shared long-term, committed relationships. Courageously displaying their individual love and their membership in a community of likeminded gay men and lesbians, “Wedding” participants did not demand a social inclusion marked by assimilation or guarded emotions. Rather, they demanded full acceptance of their lifestyle and relationship choices. Reverend Troy Perry, a minister evicted from the Pentecostal Church of God for his own homosexuality and founder of the Universal Fellowship of Metropolitan Community Churches, spoke to this desire for openness and acceptance as he rallied his congregants with a shout of “Out of the closets and into the chapels!”

Hosting “The Wedding” in front of the Internal Revenue Service’s building was a symbolic choice meant to protest the tax office’s refusal to accept taxes jointly filed by same-sex couples. As activist Sue Hyde recalled, couples participated in “The Wedding” “both to protest discrimination against them and to celebrate their love and commitment to each other.” Challenging conventional views of family and marriage, groom and “Wedding” organizer Carey Junkin of Los Angeles echoed “The Wedding’s” official slogan when he said, “Love makes a family, nothing else.” Adding his own sentiment, he stated, “We won’t go back.” Marriages celebrated that day held no legal standing, but that did not diminish the emotional impact of the event. The community of couples who wed accomplished their political objective by making their private relationships part of the political discourse. The very public, very political event demanded recognition of the legitimacy of the relationship between two brides or two grooms.

As for “The Wedding” participants (composed of more male than female couples, suggesting an ongoing discomfort with weddings and marriage among politically active feminists), they expressed warm praise for the celebration, as well as a sense of anger that any members of the gay or lesbian community would criticize their decision to wed. Dressed in suits, tuxedos, and wedding gowns, albeit with little regard for normative notions of gender, the celebrants saw the day as an important turning point in their lives and relationships. Despite their unorthodox appearances, many participants noted that they would have been comfortable with an even more “traditional” ceremony. The only registered disappointment pertained to the desire that the ceremony might have been more explicit in regard to monogamy or couples’ exclusivity. The mass “Wedding” was not intended to replicate heterosexual marital relationships or wedding celebrations, but the importance given the celebration and the desire for expression of personal preference—be it for a more or even less traditional form than the ceremony before the IRS—hinted at possible similarities between same-sex weddings and their opposite-sex counterparts.

While “The Wedding” looked unlike the individual white weddings celebrated by heterosexual couples, the event incorporated familiar elements of the wedding ceremony. Most participants wore some sort of special dress; an authority figure presided over the celebration; and guests bore witness to the event. The relationships may have seemed atypical or strange in the eyes of the mainstream observer, but there could be no question as to what had transpired that October day. The familiarity of the wedding served as a valuable political tool even as it fulfilled the personal desires of same-sex couples who wished to share their lives together. For a population who had the option—admittedly the very unpleasant option—of invisibility, the choice to make public the intimacies of private life was a political statement in and of itself.

Same-sex weddings transcended the “difference vs. accommodation” debates often raised in subcultural groups and hotly contested within the queer community. In the years following the celebration of “The Wedding,” gay men and lesbians expressed a blend of intentions and motivations with their celebrations. The flexibility of the wedding, continually tested by the heterosexual marrying population in the decades since World War II, likewise served the personal as well as the political objectives of queer couples. Moving from the mass to the individual, weddings legitimated and celebrated relationships that had long been deemed wrong or strange and had thus been cloaked in secrecy. Such celebrations allowed men and women to celebrate their private lives in a public style and with the sanction of chosen and accepting family and community members. By publicly celebrating their relationships, queers challenged a political system that refused to recognize their right to wed.

Like the weddings of those before them, the white weddings hosted by same-sex couples in the 1990s and in the early years of the new century seemingly adhered to a standardized form of celebration. The similarity between opposite-sex and same-sex events, of course, was noticeable in the continued reliance on a wedding industry and adherence to wedding norms: formal dress, recitation of vows, and elaborate receptions. On the surface, this suggested a kind of queer accommodation to the standard form. Even though a gay couple might purchase a cake topper that featured two grooms, the couple still purchased a cake topper. The prerequisites of a wedding had tremendous staying power. But same-sex couples shaped their weddings in ways specific to their relationships and cultural identifications. Ceremonial alteration and amendment, whether slight or pronounced, reflected the beliefs and desires of same-sex couples.

Queer couples, like other brides and grooms, negotiated tensions created by family, cost, and the overall wedding planning procedure. Unlike heterosexual couples, same-sex brides and grooms challenged existing authority in the very act of celebrating a wedding. Couples celebrated the communities from which they came, to which they currently belonged, and those they created, if only for their weddings. They exerted individual authority over their ceremonies not only in their selection of music, dress, and wedding style, but also in their very direct rejection of a legal system that denied them access to the rights and privileges of marriage. They publicly celebrated relationships long denied public recognition. Weddings could be and could say whatever the celebrating couples wished. As various states began to recognize same-sex marriages, acceptance of same-sex unions extended even beyond the queer community. Weddings both affirmed the political victory achieved by those who had long advocated on behalf of equal rights and marked the triumph of personalization in American wedding culture.

Read the rest of this entry at Salon.com.