Dude, what’s that smell? The Sriracha shutdown and immigrant excess

—Anita Mannur and Martin Manalansan

All across America, bottles with a green cap, a rooster on the label, and fiery chili sauce inside, once the exclusive mainstay of fast-food-style Asian restaurants, have slowly been making their mark on mainstream palates. In 2011, the popular television show The Simpsons featured an episode—described by executive producer Matt Selman as a “love letter to food culture”—in which Bart Simpson’s usually pedestrian palate becomes attuned to the finer possibilities of sriracha.

In 2012, as part of a national campaign to introduce a new flavor, the Lay’s potato chip company announced Sriracha as one of the three finalist flavors, along with Cheesy Garlic Bread and Chicken & Waffles. Cheesy Garlic Bread eventually won the contest; some claim that was because the signature piquant taste of sriracha could barely be detected in the chip’s flavor. In 2013 the national bagel sandwich chain Bruegger’s introduced the Sriracha Egg Sandwich. Not to be outdone, Subway followed suit with its own chicken sriracha melt.

By the end of 2013, sriracha’s popularity seemed to be at an all-time high. From January to December of 2012, some 20 million bottles of sriracha sauce had been sold, and on October 27, 2013, the first Sriracha festival was held in downtown Los Angeles. Americans, it seemed, could not get enough of the hot sauce. That is, until it came into their own backyards.

On October 28, Huy Fong Foods, the purveyor of sriracha, was sued by the small town of Irwindale, California for causing “burning eyes, irritated throats, and headaches” to its residents. An initial report published by the Associated Press tellingly described the odors produced by the Huy Fong plant as “a nuisance.”

Huy Fong’s owner and creator David Tran’s mistake was in assuming that the sriracha boom meant that the town of Irwindale would accept the changes that came with the presence of Asianness. In many ways, his story was that of the consummate Asian American model minority who had made his mark through hard work and perseverance in America. From origins in Vietnam to “making it” as an ethnic entrepreneur in the US, the story of sriracha, and in particular that of Huy Fong, can be understood as a quintessentially Asian American story.

David Tran, a Vietnamese refugee of Chinese origin, was among the first wave of refugees to leave Vietnam in 1979. Fleeing Vietnam aboard the Panamanian freighter “Huy Fong,” for which he later named his company, Tran started the fledgling business in the town of Rosemead, California in the mid-1980s with a meager initial investment of $50,000. Over the next two decades the company, which uses only jalapeños grown in Piru, California, grew dramatically, largely by word of mouth, and its sauce has become one of the most popular condiments in the country, with something of a cult-like following.

Food historian John T. Edge notes that part of sriracha’s success is its ability to position itself as malleable to many palates: “Multicultural appeal was engineered into the product: the ingredient list on the back of the bottle is written in Vietnamese, Chinese, English, French and Spanish. And serving suggestions include pizzas, hot dogs, hamburgers and, for French speakers, pâtés.” Despite sriracha’s obvious connection to Thainess—the sauce, according to a recent documentary, Sriracha (Dir. Griffin Hammond, 2013), has its origins in the town of Si Racha—Tran disavows any necessary connection to one particular national lineage, noting, “I know it’s not a Thai sriracha…It’s my sriracha.”

As the company expanded, it moved from its more modest location in Rosemead to a larger factory in Irwindale. And with the growth of the factory, resentment of the presence of Asianness has been expressed more acutely through a rejection of the visceral, purportedly offensive character of Asian odors. Ultimately, it is the inability of odors to remain in place, along with the toxicity and purported public health danger of Asian-coded comestibles, that has come to characterize this stage in the sriracha story as a story of racial exclusion in an Asian American context.

As Martin Manalansan has written elsewhere, “smell in America…is a code for class, racial and ethnic differences.” Yet cities are expected to function as odorless zones, both literally and psychically. Traces of immigrant excess must always be kept at bay, and where food is concerned, difference must be managed to ensure that the kind of food one finds at the table is synchronous with the mandates of a multiculturalist ethos of eating. It must not appear “too foreign,” “too different,” “too oily,” or too aberrant. In other words, it must not be too offensive, lest it upset a carefully calibrated balance of acceptable multiculturalism.

Sriracha seemed poised to become America’s next favorite condiment. But condiments have to be manufactured somewhere, and when Asianness comes to roost in the town of Irwindale, population 1,422 (2% Asian American, 47% white), the cultural odor of the town also changes. And a taste for difference, as history has suggested, can often only go so far. The winds in the California city of Irwindale not only transport the sharp smell of the chilies in sriracha sauce; they also convey the heavy weight of Western history’s fraught encounters with olfactory experience.

Throughout the ages, smell has been used to mark things and bodies that are sinister, sinful, dangerous, foreign, diseased, and decaying. Modern cities were planned under the idealized schemes of de-odorized landscapes. Accoutrements to contemporary living include room deodorizers and air fresheners that aim to eliminate unwanted odors and showcase social uplift and class distinction. The Sriracha incident in California reeks of all these historical antecedents and cultural symptoms. The very fact that sriracha has been called a “public nuisance” and a potential health threat is part of a longer tradition that views Asianness as a public health menace. The SARS epidemic of 2002, with its concomitant xenophobic links to the fear of Asian bodies, is not far removed from the panic about Asianness discursively inherent in the charges being levied against Huy Fong Foods.

In the midst of all the accusations and counter-accusations of state overreach, cultural insensitivity, and xenophobia, smell should not be seen merely as a potential health hazard but rather as a crucial signpost of where we are as a society and as a nation in the 21st century. Indeed, to consider sriracha’s odors a public nuisance is not far removed from the kinds of racializing language used to put immigrants in their place. We may like our sriracha bottles on our tables, but we don’t want them too close, lest they contaminate our spaces of living. As with the Asian American bodies with which we associate the bottles of hot sauce, we prefer to limit the spread of Asianness.

On November 29, 2013, three days after a Los Angeles court ruled in favor of a partial shutdown of the company, Huy Fong Foods released a simple statement to the public, placing a banner reading “No tear gas made here” outside its Irwindale factory. Those simple words summed up what is perhaps really at stake here. The underlying issues that led to the fracas over sriracha are very much about toxicity, but the banner is as much about dispelling the notion that the product is toxic as it is about pointing out that the underlying racism of the charges against Huy Fong is mired in a more dangerous form of toxicity: one that seeks to vigilantly remind immigrants of where they do and do not belong.

Anita Mannur is Associate Professor of English and Asian/Asian American Studies at Miami University. Martin F. Manalansan, IV is Associate Professor of Anthropology and Asian American Studies at the University of Illinois, Urbana-Champaign. Mannur and Manalansan are co-editors (with Robert Ji-Song Ku) of Eating Asian America: A Food Studies Reader (NYU Press, 2013).

Podcast: Josh Lambert on Jews and obscenity in America

In Unclean Lips: Obscenity, Jews, and American Culture, Josh Lambert guides us through American Jews’ participation in the production, distribution, and defense of sexually explicit literature, plays, and comedy.

From Theodore Dreiser and Henry Miller to Curb Your Enthusiasm and FCC v. Fox, Lambert explores the central role Jews have played in the struggles over obscenity and censorship in the modern United States. Below, listen to a conversation with Lambert on a recent episode of Vox Tablet’s podcast. 

[Warning: This conversation contains explicit language and content.]

What’s new about Hanukkah?

—Dianne Ashton

[This post originally appeared on the Jewish Book Council blog on November 26, 2013.]

This year, Jewish Americans will participate in an extraordinary Hanukkah celebration—they will light the first menorah candle on the evening before Thanksgiving. This has never happened before, but we came very close to it in 1888. Then, the first Hanukkah light and Thanksgiving occurred on the same day. That year, the national Jewish newspaper, the American Hebrew, dedicated its November 30 issue to the “twofold feasts.” The issue was as much “a tribute to the historic significance of Chanuka” as to “the traditions entwined about Thanksgiving Day.” The editors hoped readers would find the newspaper to be “a stimulus to the joyousness and gladness upon the observance of both.” In previous years they had described Hanukkah as a festival to thank God for the Maccabean victory, and, seeing both Thanksgiving and Hanukkah as occasions for giving thanks to God, they easily encouraged American Jews to enthusiastically celebrate both events.

But most of the time, as we know, Hanukkah occurs at a time closer to Christmas. Most years, the American Hebrew’s Hanukkah message urged its readers not to join their fellow Americans in the national festivities because it was the celebration of Jesus’ birth that enchanted their gentile neighbors. Instead, that newspaper echoed the December messages of most other Jewish publications. Jewish newspapers, synagogue bulletins, women’s and men’s club letters, rabbinical sermons, and the urgings of educators and self-styled community leaders alike urged America’s Jews to make their Hanukkah celebrations as festive as possible.

Again and again, in the years since that early American Hebrew message, American Jews wove Hanukkah’s story into their own contemporary lives in ways that reflected their changing circumstances. Those retellings kept Hanukkah’s meaning alive and relevant. They turned the simple holiday rite into an event which, like other well-loved Jewish festivals, drew families together in their own homes where they could tailor the celebration to fit their own tastes in food and décor, and to reflect their own ideas about the holiday’s significance. They could indulge their children, and be joyous.

Will we ever celebrate Hanukkah and Thanksgiving together this way again? Almost. In 2070 Thanksgiving will fall on November 27th and Hanukkah will begin the following day. In 2165, we will light the first Hanukkah candle on November 28—Thanksgiving Day. But for Hanukkah’s first light to occur the evening before Thanksgiving, as it does this year, is truly an anomaly we won’t see again.

Dianne Ashton is Professor of Religion Studies and former director of the American Studies program at Rowan University. Her most recent book, Hanukkah in America: A History (NYU Press, 2013) is now available. (Read more about the book in this review from the Jewish Book Council.)

Why Hanukkah and Thanksgiving will never again coincide

—Joel Hoffman

[This piece originally appeared in the Huffington Post on November 24, 2013.] 

This month, Hanukkah and Thanksgiving will overlap for a joint celebration that will never happen again. Here’s why. (Try to keep up with me on this.)

Thanksgiving is the 4th Thursday in November. Hanukkah is the 25th day of the Jewish month of Kislev.

The 4th Thursday in November can range from the 22nd to the 28th. If the 29th is a Thursday, then so is the 1st, so the 29th would be the fifth Thursday, not the fourth. And if the 21st is a Thursday, then it’s only the third Thursday. On average, then, Thanksgiving falls on the 28th about every seven years. It will fall on the 28th this year, then again in 2019, 2024, 2030, and 2041, or four times in the next 28 years. (It’s not exactly every seven years because leap days throw things off a little.)
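To make that seven-year pattern concrete, here is a minimal Python sketch (standard library only; the little thanksgiving() helper is just for illustration) that finds the years in which the fourth Thursday of November lands on the 28th:

```python
from datetime import date

def thanksgiving(year):
    """Return the fourth Thursday in November of the given year."""
    nov1 = date(year, 11, 1)
    # date.weekday(): Monday is 0, ..., Thursday is 3
    first_thursday = 1 + (3 - nov1.weekday()) % 7
    return date(year, 11, first_thursday + 21)

# Years from 2013 through 2041 in which Thanksgiving falls on November 28
print([y for y in range(2013, 2042) if thanksgiving(y).day == 28])
# -> [2013, 2019, 2024, 2030, 2041], matching the years listed above
```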

The Jewish month of Kislev can currently start as early as November 3 or as late as December 2, which means that the first day of Hanukkah can come as early as November 28 or as late as December 27.

The reason for the broad range of possible dates is that the Jewish calendar is lunar-solar. The months are based on the cycles of the moon. But the calendar changes the lengths of those months, and even how many months are in a year, to make sure that Passover always falls in the spring. This complex system—put in place by Rav Shmuel in the first half of the first millennium CE—ensures that the Jewish date and the secular date match up every 19 years. (By contrast, the Muslim calendar is purely lunar, which is why Ramadan can fall during any time of the solar year. The Christian religious calendar is almost entirely solar, but Easter falls on the first Sunday after the first full moon after the spring equinox [around March 21], a calculation that involves the moon as well as the sun.)
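The arithmetic behind that 19-year cycle is easy to check: 235 lunar months come out almost exactly equal to 19 solar years, so the Jewish calendar spreads 235 months (twelve 12-month years plus seven 13-month leap years) across each cycle. A rough sketch, using approximate astronomical values rather than the calendar’s exact traditional month length:

```python
SYNODIC_MONTH = 29.530589  # approximate average days from new moon to new moon
SOLAR_YEAR = 365.2425      # average Gregorian year length in days

months_per_cycle = 19 * SOLAR_YEAR / SYNODIC_MONTH
print(round(months_per_cycle, 3))  # ~234.997, i.e. almost exactly 235 months

# 12 ordinary years of 12 months plus 7 leap years of 13 months = 235 months,
# which is why the Jewish and secular dates realign about every 19 years.
print(12 * 12 + 7 * 13)  # 235
```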

Because of this Jewish 19-year cycle, 19 years from now, in the year 2032, Hanukkah will again fall on November 28. But Thanksgiving in that year falls three days earlier, on the 25th.

On average, we would expect the 19-year Jewish cycle and the 7-year Thanksgiving-on-November-28 cycle to coincide about every 19×7 years, which is to say, approximately every 133 years. And they sort of do.

One hundred and fifty-two years ago, in 1861, the first day of Hanukkah and the 4th Thursday in November were both on November 28th. But there was no national Thanksgiving holiday back then.

One hundred and fifty-two years from now, in 2165, Thanksgiving falls on the 28th, and you’d expect Hanukkah also to fall on the 28th. But it doesn’t.

If you’ve been paying attention (and if you haven’t given up yet), you may have noticed that I said “currently” when I explained when Kislev can begin. Remember Shmuel, who fixed the details of our current Jewish calendar in the first place? He, like everyone else back then, thought that the year was 365.25 days long. This is why we have a usual year of 365 days, but every 4th year we add a leap day in February to make 366.

But Shmuel—again, like everyone else—was off by a little more than 11 minutes. The year is not quite 365.25 days long but rather closer to 365.2425 days, about 11 minutes shorter. For a long time no one noticed those 11 minutes. For a longer time no one cared. But by the time of Pope Gregory XIII in 1582, those 11 minutes per year—or about 3 days per 400 years—had added up to about ten days.

This meant that March 21, which had once been the approximate date of the spring equinox, was now 10 days later than the spring equinox. Or, conversely, the spring equinox fell on March 11. This was a problem for the Church, because the springtime holiday of Easter was shifting further and further away from spring.

Pope Gregory fixed the problem in two ways. First, he lopped off 10 days from the calendar. For Catholics, the day after Thursday, October 4, 1582 was Friday, October 15, 1582. Second, he eliminated 3 leap days every four hundred years. He decreed that years divisible by 4 would still be leap years, unless they were also divisible by 100 but not by 400. So 1600 would be a leap year (divisible by 100 and by 400), but 1700 would not (divisible by 100 and not by 400). This became known as the Gregorian calendar, and it gradually spread through the Christian world.
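Gregory’s rule is compact enough to put in a few lines of code. A minimal Python sketch comparing the old every-fourth-year rule (the one behind Shmuel’s 365.25-day year) with the corrected Gregorian rule:

```python
def julian_leap(year):
    # Old rule: every 4th year is a leap year (a 365.25-day average year).
    return year % 4 == 0

def gregorian_leap(year):
    # Gregory's correction: century years are leap years only if divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for y in (1600, 1700, 2000, 2100):
    print(y, julian_leap(y), gregorian_leap(y))
# 1600 and 2000 are leap years under both rules; 1700 and 2100 only under the old rule.
# Dropping those century leap days is the "3 days every 400 years" correction above.
```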

In 1752, the British empire adopted the Gregorian calendar, making the day after Wednesday, September 2, 1752 not the 3rd but rather the 14th. (An 11th day was necessary because 1700 was not a leap year in the Gregorian calendar.)

The Jews, of course, didn’t give a damn what Pope Gregory said. They kept using the Shmuelian calendar for their calculations. The Shmuelian calendar and the Gregorian calendar have been diverging at the rate of about 11 minutes a year, or 3 days every 400 years. Furthermore, the year 2100 will be a leap year in the Shmuelian calendar (because it’s divisible by 4) but not in the Gregorian calendar (because it’s divisible by 100 but not 400). So not long after the year 2100, the Jewish calendar and the secular calendar will diverge by an additional 1 day—though the details are even a little more nuanced, because Shmuel used a simplification of the final Jewish calendar.

This is why (remember the question from several paragraphs ago?) in the year 2165, when we’d expect Thanksgiving and Hanukkah to coincide again, Hanukkah will actually be one day later. And that is why Thanksgiving and Hanukkah will never again coincide.

Well, almost never. If the Jews don’t ever abandon the calculations based on the Shmuelian calendar, Hanukkah will keep getting later and later—moving through winter, then into spring, summer, and finally back into fall—so that tens of thousands of years from now they will again coincide. But long before then the springtime holiday of Passover will have moved deep into summer, so be on the lookout for a memo with a calendar update in the next several thousand years.

And in the meantime, don’t miss this opportunity to enjoy an exceedingly rare confluence of celebrations.

Happy Hanukkah. And Happy Thanksgiving.

Joel Hoffman is the author of In the Beginning: A Short History of the Hebrew Language (NYU Press, 2004). Hoffman is a regular contributor to the Huffington Post—read more of his entries here.

American Literatures Initiative approaches 100th book

“The American Literatures Initiative, the first of the university press collaborative publishing grants awarded by The Andrew W. Mellon Foundation, approaches the 100-book mark.”

To learn more about this exciting news, read an excerpt from the press release below! [The full version appears on the ALI website here.]

In 2007, The Andrew W. Mellon Foundation, looking for ways to encourage presses to collaborate more on basic publishing operations, and looking to support the more economical publication of first scholarly books in what they called “underserved” fields, issued a call for grant proposals to the University Press community. The call for grant proposals was open-ended: there was no definition of an underserved field, nor were any guidelines provided about the potential size of a grant, the timeframe, or the number of presses needed in the collaboration. However, each proposal had to address several key issues, including providing evidence of an underserved field, identifying more economical and transformative processes for the publication of first books, and providing a plan for long-term sustainability of publishing monographs in the discipline.

The American Literatures Initiative (ALI) was the first such grant awarded by The Andrew W. Mellon Foundation in 2008, to be followed by a half dozen additional grants during 2008 and 2009. The ALI allowed five similarly-sized university presses—NYU Press, along with Fordham University Press, Rutgers University Press, Temple University Press, and the University of Virginia Press—to collaborate on the publication of 125 first books in the field of literary studies over five years, with a grant of $1.3 million.

As the ALI approaches the publication of its 100th book, it is clear that the skepticism voiced at the beginning, that presses could not effectively collaborate in these ways, was unfounded. To find out more about the participating publishers and the project’s goals, the directors of the ALI presses were asked to provide a candid assessment of their progress to date, in the hope that some of the successes of the ALI might be adapted by individual university presses or scaled onto other university press collaborations.

What’s the future of the ALI after the next grant period ends? “We’re exploring several options,” said Steve Maikowski, Director at NYU Press. “We see the ability to make a strong case to continue funding the most successful, transformative parts of the program, scaling back the big marketing spend, continue our experimentation in collaborative production methods, and focusing on the core plant costs, which will make these niche monographs a bit more viable on our lists. Without such outside support, some of the ALI presses may have to again significantly reduce the titles published in this field.”

In spite of the new financial challenges it faces, the ALI remains, for new scholars in literary studies, a beacon of hope in an otherwise dreary publishing landscape for first books in the humanities. And the ALI has indeed achieved many of the original goals set out in the grant proposal to The Andrew W. Mellon Foundation. Perhaps most gratifying is the number of prizes won by ALI titles to date, most recently the top book prize of the American Studies Association. These distinguished prizes fulfill one of the most important goals: that the ALI would become a destination for the best, award-winning original scholarship in literary studies.

The New Southern Strategy: GOP plays the race card (again)

—Matthew W. Hughey and Gregory S. Parks

At a recent Tea Party meeting in Hood County, Texas, Rafael Cruz, the father of Sen. Ted Cruz (R-Texas), made bold statements reeking of white supremacy and Christian nativism, suggesting that the U.S. is a “Christian nation” in which the Declaration of Independence and the Constitution were a “divine revelation from God […] yet our president has the gall to tell us that this is not a Christian nation […] The United States of America was formed to honor the word of God.” And just months prior to that event, speaking to the North Texas Tea Party on behalf of his son Ted (who was then running for Senate), Rafael urged the crowd to send Obama “back to Kenya.”

In the wake of political dysfunction, gridlock, and right-wing obstruction, these statements certainly gesture toward important questions. Namely, is Rafael Cruz merely a “bad apple” that threatens to spoil the GOP, or are his comments indicative of a larger strategic rhetoric that resonates with the lesser angels of our nature?

We suggest the latter.

Since the “Southern Strategy,” whereby the GOP turned its back on civil rights and began courting white Southerners who believed they were victims of a new reverse-racism social order, the Republican Party has employed subtle (and not so subtle, as witnessed above) rhetoric that turns on implicit anti-Black and anti-immigrant messages. It is captured no better than in former Republican Party strategist Lee Atwater’s 1981 remarks, in which he laid bare the GOP’s racialized strategy:

You start out in 1954 by saying, “Nigger, nigger, nigger.”  By 1968 you can’t say “nigger”—that hurts you.  Backfires.  So you say stuff like forced busing, states’ rights and all that stuff.  You’re getting so abstract now [that] you’re talking about cutting taxes, and all these things you’re talking about are totally economic things and a byproduct of them is [that] blacks get hurt worse than whites.  And subconsciously maybe that is part of it.  I’m not saying that.  But I’m saying that if it is getting that abstract, and that coded, that we are doing away with the racial problem one way or the other.  You follow me—because obviously sitting around saying, “We want to cut this,” is much more abstract than even the busing thing, and a hell of a lot more abstract than “Nigger, nigger.”

Hence, Ronald Reagan’s evocation of “welfare queens” gaming the system sent a clear yet implicit message: your taxes are high because Lyndon Johnson’s programs are funneling your money to undeserving and lazy black women. When a group that supported George H. W. Bush’s presidential campaign ran a television advertisement blaming Michael Dukakis for murders committed by Willie Horton (a black parolee who broke into a white couple’s home), the message baited white fears about young black male violence. When Jesse Helms, a white senator from North Carolina, faced black challenger Harvey Gantt in 1990, Helms’s camp ran a television advertisement showing the hands of a white person crumpling a rejection letter, with the voiceover: “You needed that job, and you were the best-qualified. But they had to give it to a minority because of a racial quota. Is that really fair?” The ad was broadcast just days before the election and boosted Helms to victory in what had previously been a dead heat.

And in relation to Obama, we have a new Southern Strategy: Cruz’s comments fall in lockstep with GOP and Tea Party elements that have attempted to frame Obama as culturally out of place, if not legally ineligible to hold the presidency. Thanks in part to conservative conspiracy theorists, a small cadre of politicians (e.g., Missouri Republican Sen. Roy Blunt, or Nathan Deal, the Democrat-turned-Republican governor of Georgia), and already established cultural tropes that conflate whiteness and Christianity with Americanness, studies showed that by the 2008 election season U.S. residents were more likely to associate American symbols with white politicians (e.g., Hillary Clinton) or even white European politicians (e.g., Tony Blair) than with Obama. Merely viewing an American flag led American citizens to show implicit and explicit prejudice toward African Americans in general, and greater reluctance to vote for Obama, compared with those not exposed to the flag.

Simply put, Barack Obama did not fit most Americans’ implicit idea of an authentic American, and the GOP has seized upon that opportunity to engage in the latest stage of the New Southern Strategy. The playing of the race card, even in implicit fashion, remains an efficacious political strategy for those on the Right.

Matthew W. Hughey is associate professor of sociology at the University of Connecticut. Gregory S. Parks is assistant professor of law at Wake Forest University School of Law.  They are co-authors of The Wrongs of the Right: Language, Race, and the Republican Party in the Age of Obama (forthcoming in 2014 from New York University Press).

Vaginal birth for twins as safe as c-section delivery

—Theresa Morris

A study published in the New England Journal of Medicine (NEJM) on October 3rd examines what is safer for the delivery of twins: planned vaginal or planned cesarean section? A summary of the study was written by the Associated Press and distributed widely through many news outlets, including National Public Radio. The title of the AP article—“Most Twins Can Be Born Without a C-Section”—gets at the major finding of the study. The authors randomly assigned women with twin pregnancies between 32 weeks 0 days gestation and 38 weeks 6 days gestation and with the first twin in a head-down (cephalic) position to planned cesarean section (1398 women) or planned vaginal delivery (1406 women). There was no significant difference in outcomes between these two groups. That is, a planned vaginal twin delivery posed no greater risk to women and babies than a planned cesarean section twin delivery.

This study is notable. As the authors of the NEJM study observe, the rate of vaginal twin delivery has plummeted in recent years. There is no doubt that part of this decrease is due to publication of findings from the Term Breech Trial, which found worse outcomes for babies presenting in a breech (head-up) position who were delivered vaginally. Although a follow-up study published in 2004 found that outcomes at age 2 for babies born vaginally were no different from those for babies born by c-section, planned vaginal breech delivery had already been greatly curtailed worldwide.

The Term Breech Trial affected twin deliveries because many second twins (i.e., the twin that is delivered second) present in a breech position. Although the Term Breech Trial included only singleton pregnancies, maternity clinicians I interviewed for the research in my book indicated that many obstetricians stopped offering vaginal twin deliveries when the second twin was presenting breech because, after the publication of the trial’s findings, they believed they would be sued for malpractice if there were a bad outcome. The NEJM study published on October 3rd, which includes five of the authors of the Term Breech Trial publications, is a welcome corrective to obstetricians’ defensive practice of delivering most twins by c-section.

This study is good news for women pregnant with twins. If vaginal delivery is not being presented to them as an option, they should bring this publicly available article to their next prenatal appointment.

Theresa Morris is Professor of Sociology at Trinity College in Hartford, Connecticut. She is the author of Cut It Out: The C-Section Epidemic in America (NYU Press, October 2013).

NYU Press authors sweep top prizes in American studies

We are thrilled to announce that Lisa Cacho’s Social Death: Racialized Rightlessness and the Criminalization of the Unprotected has been selected by the American Studies Association (ASA) to receive the 2013 John Hope Franklin Publication Prize, which celebrates the best published book in American studies. This book is part of the Nation of Newcomers series.

We’re also honored to share that Kyla Wazana Tompkins’ Racial Indigestion: Eating Bodies in the 19th Century has won the other top prize awarded by the ASA, the 2013 Lora Romero First Book Publication Prize, which identifies the best first book in American studies that highlights intersections of race with gender, class, sexuality, and/or the nation. Racial Indigestion—part of the America and the Long 19th Century series and the American Literatures Initiative—was also awarded the 2013 Association for the Study of Food and Society Book Award earlier this year.

Congratulations to the authors, editors, and everyone who has worked on these books!

Breaking Bad breakdown: Deserving denouement

—Jason Mittell

[A longer version of this article originally appeared on the media and cultural studies blog, Antenna.]

What do we want from a finale? Should it be a spectacular episode that serves as the dramatic peak of the series? Should it be like any other episode of the series, only more so? Should it be surprising, shocking, or transformative? Or should it offer closure?

For me, the main thing that I’m looking for in any finale is for a series to be true to itself, ending in the way it needs to conclude to deliver consistency, offering the type of ending that its beginning and middle demand. That changes based on the series, obviously, but it’s what makes Six Feet Under’s emphasis on mortality and The Wire’s portrayal of the endless cycle of urban crime and decay so effective, and why Battlestar Galactica’s final act turn toward mysticism (and that goofy robot epilogue) felt like such a betrayal to many fans. And it’s why finales like The Sopranos and Lost divide viewers, as the finales cater to one aspect of each series at the expense of other facets that some fans were much more invested in.

“Felina” delivered the ending that Breaking Bad needed by emphasizing closure over surprise. In many ways, it was predictable, with fans guessing many of the plot developments—read through the suggestions on Linda Holmes’s site for claiming cockamamie theories and you’ll see pretty much everything that happened on “Felina” listed there (alongside many more predictions that did not come to pass). For a series that often thrived on delivering “holy shit” moments of narrative spectacle, the finale was quite straightforward and direct.

The big shocks and surprises were to be found in episodes leading up to this one, especially the brilliant “Ozymandias”; since then, we’ve gotten the denouement to Walt’s story, his last attempt to make his journey mean something. It’s strange to think that an episode that concludes with a robot machine gun taking down half a dozen Nazis feels like a mellow epilogue, but emotionally it was this season’s least tense and intense episode. Instead, Walt returned home a beaten-down man, lacking the emotional intensity that drove him up the criminal ladder, but driven by a plan that he had just enough energy to complete. Given that the series premise was built on the necessity of a character arc building toward finality, and that it began with that character receiving a death sentence, we always knew that closure was likely to come in the form of Walt’s death, and this episode simply showed us how his final moments played out in satisfying fashion.

While Walt’s mission to destroy the remnants of his business occupies the bulk of the episode’s plot, its emotional centerpiece is his meeting with Skyler. As always, [Bryan] Cranston and Anna Gunn make the scene crackle, conveying both the bonds and the fissures between the two characters that make their final goodbye neither reconciliation nor retribution. His visit is one of the more selfless acts we have seen from him. He has no illusions that he’ll resolve things or get her back on his side; he simply wants to give her two things. First, the coordinates for Hank and Gomie’s grave, offered to provide closure to Marie and others, as well as to assuage Walt’s guilt over this one act of violence he caused but could not stop. Second, the closest he’ll ever come to an apology: after starting in on what sounds like a typical rationalization about “the things I’ve done,” which Skyler rightly attacks as another deceptive excuse about family, Walt finally admits the truth. “I did it for me. I liked it. I was good at it. And I was alive.” This is not easy for Walt to say; it is his most brutal penance, having to admit his own selfishness to both his wife and himself. But in the end, Skyler returns the favor with the gift of a final moment with Holly, the child Walt used as a bargaining chip the last time they spoke, as she remembers the part of him that still loved his children despite his abusive treatment of them. And Walt takes his own moment to observe Flynn from afar, looking at a child who rightly despises him but whom he still loves. When I look back on this finale, this will be the scene I replay in my mind.

Of course the episode and series climax is the final confrontation. I fully believe that Walt intends to kill Jesse alongside the Nazis, as he fully believes that his protégé has both betrayed him and stolen his formula—and based on Badger’s testimony, the student has surpassed the teacher. Many fans speculated that Walt sought to “save Jesse,” but up until he sees his former partner in chained servitude, Jesse is an equal target of his wrath. Yet again, Cranston conveys Walt’s emotional shifts wordlessly, as Walt devises a plan to spare Jesse from his robo-gun once he sees that Jesse is yet again a victim of men with larger egos and more malice than he has. While this final confrontation was a satisfying moment of Walt putting the monsters he had unleashed back in the box, it was almost entirely suspense-free. I never doubted that Walt would successfully kill the Nazis and spare Jesse, that he had poisoned Lydia, or that Jesse would not pull the trigger on Walt. These were the moral necessities of a well-crafted tale; Breaking Bad was done playing games with twists and surprises, and ready to allow Walt to sacrifice himself to put down the monsters he had unleashed. Yet the scene was constructed to create suspense over whether Walt would get to the remote control in time, creating a rare moment of failed tension in the series—I awaited and anticipated the emotional confrontation between Walt and Jesse without ever doubting the outcome or feeling any tension about what might happen.

The “how it happened” was quite satisfying, however. I saw the robo-gun as an homage to one of my favorite Breaking Bad scenes: in “Four Days Out,” when Jesse thinks Walt is building a robot to engineer their rescue. This time he does, and it works in an appropriately macabre and darkly funny payoff: the excessive gunfire mirrors Walt’s frequent insistence on maximizing his inventions (as with the overpowered magnets and the drive to capture every last drop of methylamine), and it keeps firing blanks as Kenny’s body receives an endless massage. Although Jesse is no cold-blooded killer, killing Todd was a line he was happy to cross in payback for months of torture and for Todd’s own heartless killings of Drew Sharp and Andrea. However, when given a chance to kill Walt, Jesse takes a pass; instead he forces Walt to admit that being killed by Jesse is what Walt wants, and then denies him that pleasure. When Jesse sees that Walt has been shot, he decides that leaving him to die alone is what Walt deserves, especially given what happened with Jane.

What Walt deserves matters in Breaking Bad. I’m reminded of an important scene in the penultimate episode of The Wire, when one character wonders why another has plotted to kill him, asking what he’s done to deserve it (keeping names vague if you haven’t seen it). The would-be killer’s reply quotes the film Unforgiven: “Deserve’s got nothing to do with it.” But on Breaking Bad, deserve’s got everything to do with it, as it has always been a tale of morality and consequences. Jesse deserves his freedom, even though he is a broken-down shell of who he was—and while we want to know what’s next for him, I’m content with the openness that allows me to imagine him driving to Alaska and becoming a carpenter, perhaps after rescuing Brock and Lydia’s daughter from orphanhood.

Walt deserves to die, and we deserve to see it. The final musical cue in a series that excelled at their use was Badfinger’s “Baby Blue,” another classic like “Crystal Blue Persuasion” that the producers have probably been hanging onto for years. The opening line of the song is as essential as the color-specific romance: “Guess I got what I deserve.” In this final glorious sequence, Walt gets to die in the lab, as the music sings a love song to chemistry—which, in this context, serves as an ode to his own talents in perfecting the Baby Blue. His tour around the lab has prompted some debate as to what Walt is doing: is he strategically leaving his bloody fingerprints to claim ownership, a sort of turf-claiming mark of Heisenberg Was Here? I think not, but rather that Walt is admiring the precision and craft of the lab, both as a testament to his own pedagogical prowess that yielded Jesse’s talents, and as his natural habitat where he “felt alive,” as he told Skyler earlier. To the soundtrack romanticizing Walt’s own greatness, it’s a final moment of pride and arrogance that he seizes to overshadow all the carnage he has caused, an acceptance that more than his family, he did it for the chemistry.

“Felina” is far from Breaking Bad’s best episode, but it is the conclusion that the series and its viewers deserve. I think it will play even better both for viewers bingeing the season in quick succession and upon rewatch without the trappings of anticipation, hype, and suspense. Jesse escapes, Skyler and her family survive, and Walt and his one-time minions die. It all happens with less emotion and drama than what we’ve come to expect from the series, but given the strain of the journey up to this point, we’re as emotionally drained as the characters. So a low-key bloodbath is an appropriate way to exit this wonderful trip.

Jason Mittell is Professor of Film & Media Culture and American Studies at Middlebury College. He is the co-editor (with Ethan Thompson) of How to Watch Television (NYU Press, 2013).

Is the Miss America pageant good or bad for women?

Earlier this month, the NYT’s “Room for Debate” blog featured a thoughtful discussion on the Miss America pageant and its role in today’s society. Now that this year’s pageant is over, we asked Megan Seely, author of Fight Like a Girl: How to Be a Fearless Feminist, to weigh in on the controversy in light of recent racial backlash faced by its first Indian-American winner. Read her piece below.
 

Miss America Colleen Kay Hutchins (R) looking at her trophy, September 1952.

I often hear the Miss America pageant defended as a great source of scholarship funds. Indeed, it is said that this year’s winner will receive about $50,000 in scholarship money. But given that women receive fewer academic and merit scholarships than their male counterparts despite overall higher grades; that there are fewer role models for women and girls in education, particularly within science, technology, engineering, and math (STEM); and that women continue to confront pay inequity once they are in their jobs and careers, it is offensive that we defend and celebrate a ‘scholarship program’ whose main requirement is that women meet a specific and narrow definition of physical beauty.

Some have recently argued that there have been gains for women of color, citing the very small handful of women of color who have been crowned. It should be noted that we have never had a Latina Miss America or a transgender Miss America. And until the 1930s the official rulebook of the pageant required contestants to be “of the white race.” That rule may officially be gone today, but the expectation clearly remains, judging by the blatantly racist response to the 2014 winner, Nina Davuluri.

There are women who love the pageant. There are participants who defend and celebrate what Miss America has meant to their lives. They argue that they freely choose to participate in, watch, or believe in the pageant. I don’t mean to dismiss these perspectives. But I do question what choice means in a culture that so deeply holds and reinforces a beauty standard that is then required for participation in the pageant. Miss America deviates little from the thin, tall, heteronormative, and, more often than not, white ideal of beauty. Though a few have successfully challenged the whiteness of the pageant, very little has changed. If we teach women and girls that their value is in their physical appearance, then it is no wonder that many turn to Miss America for validation.

I would hope that the winners would use their public platform to create change and impact the world. But changing the pageant and the culture in which it exists is a far greater challenge. While women of color who become Miss America certainly defy the stereotypes of American beauty, they often do so while reinforcing the expectations of body size and appearance. There is still little, if any, diversity with regard to race, ethnicity, cultural identity, body size, gender, sexuality, age, and ability. As long as this remains true, and as long as all women do not see themselves routinely represented and valued in every aspect of society, we cannot justify the existence of this overtly misogynistic institution, even if we celebrate the few women who have managed to stand out within it. We cannot ignore the negative and harmful impacts this event has on thousands upon thousands of women of all ages who struggle to find their worth in a culture that emphasizes and rewards women’s physical appearance above all else. Girls are watching; we owe them more.

Megan Seely is a third wave feminist and activist, and author of Fight Like a Girl: How to Be a Fearless Feminist (NYU Press, 2007). She lives and teaches in northern California.

Fall books available on NetGalley

We’ve got quite a few gems in our NetGalley catalog this fall, all available for advance review now. Book reviewers, journalists, bloggers, librarians, professors, and booksellers: we welcome you to submit a request!

Not familiar with NetGalley? Learn more about how it works.

 
Buzz: Urban Beekeeping and the Power of the Bee by Lisa Jean Moore and Mary Kosut (September 27, 2013)

We think Booklist said it best: “In this fascinating blend of sociology, ecology, ethnographic research, and personal memoir, the authors range through all of the aspects of the human relationship with the honeybee.”

Ever thought of honeybees as sexy? You might after watching Mary Kosut discuss the sensual nature of beekeeping.

 

Cut It Out: The C-Section Epidemic in America by Theresa Morris (October 7, 2013)

In Cut It Out, Theresa Morris offers a riveting and comprehensive look at this little-known epidemic, as well as concrete solutions “that deserve the attention of policymakers” (Publishers Weekly starred review).

C-sections are just as safe as vaginal births, right? Not true, says Theresa Morris. Watch her discuss this and other misconceptions on our YouTube channel.

 

Hanukkah in America: A History by Dianne Ashton (October 14, 2013)

Hanukkah will fall on Thanksgiving this year for the first time ever—and the last time for another 70,000 years. Brush up on your knowledge of the holiday in time to celebrate the once-in-an-eternity event. Publishers Weekly, in another starred review, promises a “scholarly but accessible guide to the evolution of the Festival of Lights in America.”

Stay tuned for our interview with the author!

 
Browse all of our e-galleys available for review on NetGalley.

Constitution Day: 5 books to read now

September 17th is Constitution Day – a federally recognized day to celebrate and teach about the United States Constitution. But what are the proper “texts” for this day of teaching?

To start, we’ve selected a short list of recent NYU Press books we think every citizen should read this year. But, there are certainly others. What’s on your list? Let us know in the comments section!

5 books for Constitution Day

Why Jury Duty Matters: A Citizen’s Guide to Constitutional Action by Andrew Guthrie Ferguson

Jury duty is a constitutional duty—and a core responsibility of citizenship! The first book written for jurors, Why Jury Duty Matters provides readers with an understanding of the constitutional value of jury duty. (Also, be sure to read the author’s excellent piece in The Atlantic on ways to make the Constitution relevant to our daily lives.)

 

The Embattled Constitution
Edited by Norman Dorsen, with Catharine DeJulio

The book presents a collection of the James Madison lectures delivered at the NYU School of Law. The result is a fascinating look into the minds of the judges who interpret, apply, and give meaning to our “embattled Constitution.”

 

America’s Founding Son: John Bingham and the Invention of the Fourteenth Amendment by Gerard N. Magliocca

This book sheds light on John Bingham, the father of the Fourteenth Amendment, who helped put a guarantee of fundamental rights and equality to all Americans into the U.S. Constitution.

 

Government by Dissent: Protest, Resistance, and Radical Democratic Thought in the Early American Republic by Robert W.T. Martin

Democracy is the rule of the people. But what exactly does it mean for a people to rule? The American political radicals of the 1790s understood, articulated, and defended the crucial necessity of dissent to democracy. This is their story.

 

Bonds of Citizenship: Law and the Labors of Emancipation by Hoang Gia Phan

In this study of literature and law from the Constitutional founding through the Civil War, Hoang Gia Phan demonstrates how citizenship and civic culture were transformed by antebellum debates over slavery, free labor, and national Union.