Why has TV storytelling become so complex?

—Jason Mittell

[This post originally appeared at The Conversation.]

If you watch the season finale of The Walking Dead this Sunday, the story will likely evoke events from previous episodes, while making references to an array of minor and major characters. Such storytelling devices belie the show’s simplistic scenario of zombie survival, but are consistent with a major trend in television narrative.

Prime time television’s storytelling palette is broader than ever before, and today, a serialized show like The Walking Dead is more the norm than the exception. We can see the heavy use of serialization in other dramas (The Good Wife and Fargo) and comedies (Girls and New Girl). And some series have used self-conscious narrative devices like dual time frames (True Detective), voice-over narration (Jane the Virgin) and direct address of the viewer (House of Cards). Meanwhile, shows like Louie blur the line between fantasy and reality.


Many have praised contemporary television using cross-media accolades like “novelistic” or “cinematic.” But I believe we should recognize the medium’s aesthetic accomplishments on its own terms. For this reason, the name I’ve given to this shift in television storytelling is “complex TV.”

There is a wealth of facets to explore about such developments (enough to fill a book), but there’s one core question that seems to go unasked: why has American television suddenly embraced complex storytelling in recent years?

To answer, we need to consider major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.

A business model transformed

We can quibble about the precise chronology, but programs that were exceptionally innovative in their storytelling in the 1990s (Seinfeld, The X-Files, Buffy the Vampire Slayer) appear more in line with narrative norms of the 2000s. And many of their innovations – season-long narrative arcs or single episodes that feature markedly unusual storytelling devices – seem almost formulaic today.

What changed to allow this rapid shift to happen?

As with all facets of American television, the industry’s economic goals are a primary motivation for programming decisions.

For most of their existence, television networks sought to reach the broadest possible audiences. Typically, this meant pursuing a strategy of mass appeal featuring what some derisively call “least objectionable programming.” To appeal to as many viewers as possible, these shows avoided controversial content or confusing structures.

But with the advent of cable television channels in the 1980s and 1990s, audiences became more diffuse. Suddenly, it was more feasible to craft a successful program by appealing to a smaller, more demographically uniform subset of viewers – a trend that accelerated into the 2000s.

In one telling example, FOX’s 1996 series Profit, which possessed many of contemporary television’s narrative complexities, was quickly canceled after four episodes for weak ratings (roughly 5.3 million households). These numbers placed it 83rd among 87 prime time series.

Yet today, such ratings would likely rank the show in the top 20 most-watched broadcast programs in a given week.

This era of complex television has benefited not only from more niche audiences, but also from the emergence of channels beyond the traditional broadcast networks. Certainly HBO’s growth into an original programming powerhouse is a crucial catalyst, with landmarks such as The Sopranos and The Wire.

But other cable channels have followed suit, crafting original programming that wouldn’t fly on the traditional “Big Four” networks of ABC, CBS, NBC and FOX.

A well-made, narratively complex series can be used to rebrand a channel as a more prestigious, desirable destination. The Shield and It’s Always Sunny in Philadelphia transformed FX into a channel known for nuanced drama and comedy. Mad Men and Breaking Bad similarly bolstered AMC’s reputation.

The success of these networks has led upstart viewing services like Netflix and Amazon to champion complex, original content of their own – while charging a subscription fee.

The effect of this shift has been to make complex television a desirable business strategy. It’s no longer the risky proposition it was for most of the 20th century.

Miss something? Hit rewind

Technological changes have also played an important role.

Many new series reduce the internal storytelling redundancy typical of traditional television programs (where dialogue was regularly employed to remind viewers what had previously occurred).

Instead, these series subtly refer to previous episodes, insert more characters without worrying about confusing viewers, and present long-simmering mysteries and enigmas that span multiple seasons. Think of examples such as Lost, Arrested Development and Game of Thrones. Such series embrace complexity to an extent that they almost require multiple viewings simply to be understood.

In the 20th century, rewatching a program meant either relying on almost random reruns or being savvy enough to tape the show on your VCR. But viewing technologies such as DVR, on-demand services like HBO GO, and DVD box sets have given producers more leeway to fashion programs that benefit from sequential viewing and planned rewatching.

Serialized novels, like Charles Dickens’ The Mystery of Edwin Drood, were commonplace in the 19th century. Wikimedia Commons.

Like 19th century serial literature, 21st century serial television releases its episodes in separate installments. Then, at the end of a season or series, it “binds” them together into larger units via physical boxed sets, or makes them viewable in their entirety through virtual, on-demand streaming. Both encourage binge watching.

Giving viewers the technology to easily watch and rewatch a series at their own pace has freed television storytellers to craft complex narratives that are not dependent on being understood by erratic or distracted viewers. Today’s television assumes that viewers can pay close attention because the technology allows them to easily do so.

Forensic fandom

Shifts in both technology and industry practices point toward the third major factor leading to the rise in complex television: the growth of online communities of fans.

Today there are a number of robust platforms for television viewers to congregate and discuss their favorite series. This could mean partaking in vibrant discussions on general forums on Reddit or contributing to dedicated, program-specific wikis.

As shows craft ongoing mysteries, convoluted chronologies or elaborate webs of references, viewers embrace practices that I’ve termed “forensic fandom.” Working as a virtual team, dedicated fans embrace the complexities of the narrative – where not all answers are explicit – and seek to decode a program’s mysteries, analyze its story arc and make predictions.

The presence of such discussion and documentation allows producers to stretch their storytelling complexity even further. They can assume that confused viewers can always reference the web to bolster their understanding.

Other factors certainly matter. For example, innovative writer-producers like Joss Whedon, J.J. Abrams and David Simon have harnessed their unique visions to craft wildly popular shows. But without the contextual shifts that I’ve described, such innovations would likely have been relegated to the trash bin, joining older series like Profit, Freaks and Geeks and My So-Called Life in the “brilliant but canceled” category.

Furthermore, the success of complex television has led to shifts in how the medium conceptualizes characters, embraces melodrama, re-frames authorship and engages with other media. But those are all broader topics for another chapter – or, as television frequently promises, to be continued.

Jason Mittell is Professor of Film & Media Culture and American Studies at Middlebury College. He is the author of Complex TV: The Poetics of Contemporary Television Storytelling (NYU Press, 2015), and co-editor of How to Watch Television (NYU Press, 2013).

What lies beneath the Chapel Hill murders? More than a ‘parking dispute’

—Nadine Naber

We may never know exactly what Craig Stephen Hicks was thinking when he killed Syrian American medical student Deah Barakat, his Palestinian American wife Yusor Abu-Salha, and her sister Razan Abu-Salha. But we do know that U.S.-led war in Arab and Muslim majority countries has normalized the killing of Arabs and Muslims. It is more crucial than ever before to understand the institutionalization of racism against persons perceived to be Arab or Muslim in terms of the structures of imperial war that normalize killing and death and hold no one (other than victims themselves) accountable.

Photo: Molly Riley/UPI.

The Obama Administration may have dropped the language of the “war on terror,” but it has continued its fundamental strategy of endless war and killing in the Arab region and in Muslim-majority countries such as Afghanistan and Pakistan (without evidence of criminal activity). The unconstitutional “kill list,” for instance, allows the president to authorize murders every week, waging a private war on individuals outside the authorization of Congress. Strategies like the “kill list” resolve the guilt or innocence of list members in secret and replace the judicial process (including in cases involving U.S. citizens abroad) with quick and expedited killing. These and related practices, and their accompanying impunity, look something like this:

Al Ishaqi massacre, Iraq 2006: The U.S. Army rounded up and shot at least 10 civilians, including 4 women and 5 children. The Iraqis were handcuffed and shot in the head, execution-style. The U.S. spokesperson’s response? “Military personnel followed proper procedures and rules of engagement and did nothing wrong.”

Drone attack, Yemen 2015: A drone killed 13-year-old Mohammad Tuaiman (whose father was killed in a 2011 drone strike), his brother, and a third man. Questioned about the incident, the CIA stated that the three men were “believed to be Al Qaeda,” even though it refused to confirm that Tuaiman was an Al Qaeda militant.

The U.S.-backed Israeli killing of Palestinians reinforces the acceptability of Arab and Muslim death. In July 2014, the Israeli Defense Force killed at least 2,000 Palestinians including 500 children. It is well established that the IDF soldiers deliberately targeted civilians. The Obama Administration’s response? Explicit support for Israel.

And those left behind are forced to watch their loved ones’ bodies fall to the ground or burn like charcoal and can only conclude that, “In [the U.S. government’s] eyes, we don’t deserve to live like people in the rest of the world and we don’t have feelings or emotions or cry or feel pain like all the other humans around the world.”

Since the 1970s (when the U.S. consolidated its alliance with Israel), the corporate news media has reinforced the acceptability of Arab and Muslim death—from one-sided reporting to fostering fear of Arabs and Muslims. From Black Sunday (1977) to American Sniper (2015), Hollywood has sent one uninterrupted message: Arabs and Muslims are savage, misogynist terrorists; their lives have no value; and they deserve to die.

This interplay between the U.S. war agenda abroad and the U.S. corporate media extends directly into the lives of persons perceived to be Arab and/or Muslim in the United States. Hate crimes, firebomb attacks, bomb threats, vandalism, detention and deportation without evidence of criminal activity and more have all been well documented. Of course, such incidents escalated in the aftermath of the horrific attacks of 9/11. As the U.S. state and media beat the drums of war, anyone perceived to be Arab and/or Muslim (including Sikhs, Christian Arabs, and Arab Jews) became suspect. Muslim women who wore the headscarf became walking emblems of the state and media discourse of Islamic terrorism. Across the United States, at school, on the bus, at work, and on the streets, women wearing the headscarf have been bullied, have had their scarves torn off, and have been asked over and over why they support Al Qaeda, Saddam Hussein, terrorism, and the oppression of women.

Despite this, the corporate media (replicating the words of the police) and government officials have either reduced the North Carolina killings to a parking dispute or expressed grave confusion over why an angry white man would kill three Arab Muslim students in North Carolina execution-style. Yet the father of one of the women students stated that his son-in-law did not have any trouble with Hicks when he lived there alone. The trouble, he said, started only after Yusor, who wore a headscarf identifying her as a Muslim, moved in. Even so, Chapel Hill Mayor Mark Kleinschmidt told CNN that the community is still “struggling to understand what could have motivated Mr. Hicks to commit this crime,” adding, “It just baffles us.”

The “parking dispute” defense individualizes and exceptionalizes Hicks’ crime—in this case, through a logic that obscures the connection between whiteness, Islamophobia, and racism. And the bafflement rhetoric constructs a reality in which there are no conceivable conditions that could have potentially provoked Hicks. Both approaches deny the possible links between the killings, U.S. and Israeli military killings, the media that supports them, and the U.S. culture of militarized violence. They will also assist Hicks in attempting to avoid the more serious hate crime charge that would come with a heavy additional sentence.

Alternatively, discussions insisting on the significance of Islamophobia in this case must go beyond the call for religious tolerance and account for the projects of U.S. empire building and war that underlie Islamophobia. Contemporary Islamophobia is a form of racism and an extension of U.S.-led war abroad. As I wrote in Arab America, immigrant communities from the regions of U.S.-led war engage with U.S. racial structures, specifically anti-Arab and anti-Muslim racism, as diasporas of empire—subjects of the U.S. empire living in the empire itself. Perhaps, then, we should also avoid applying the same analysis of racism across the board—as if all racisms are the same, or as if the framework #blacklivesmatter can simply be transposed onto the killing of Arab Muslim Americans. Otherwise, we risk disremembering the distinct conditions of black struggle (and black Muslims), including the systematic state-sanctioned extrajudicial killing of black people by police and vigilantes, as well as black poverty and histories of slavery and mass incarceration. It is also important to remember the distinct conditions of the war on terror, whereby anyone and everyone perceived to be Muslim (including Arab Christians and Sikhs) is a potential target.

Rest in peace and power Deah Barakat, Yusor Abu-Salha, and Razan Abu-Salha. May your loved ones find strength and support. My heart is shattered.

Nadine Naber is Associate Professor in the Gender and Women’s Studies Program at the University of Illinois at Chicago. She is the author of Arab America: Gender, Cultural Politics, and Activism (NYU Press, 2012). 

Beyond intent: Why we need a new paradigm to think about racialized violence

—Evelyn Alsultany

Three Muslim Americans – Deah Shaddy Barakat, 23; his wife, Yusor Mohammad, 21; and her sister, Razan Mohammad Abu-Salha, 19 – were murdered last week in Chapel Hill, North Carolina, by 46-year-old resident Craig Stephen Hicks. The tragedy has sparked a debate over whether these deaths were the result of a hate crime or a parking dispute.

Women take part in a vigil for three young Muslims killed in Chapel Hill, North Carolina. Photo: Mandel Ngan/AFP/Getty Images.

Muslim Americans who claimed that this was surely a hate crime were presented with evidence to the contrary. Hicks’s Facebook and other online posts revealed that he is an atheist who is against all religion, regardless of whether it is Islam, Christianity, or Judaism, a gun enthusiast, and an advocate for gay rights. His online posts show that he is passionate about the protection of constitutional rights, especially freedom of speech and freedom of religion. His archived posts even include commentary on the “Ground Zero mosque” controversy, in which he writes in support of Muslim rights and notes the important distinction between Muslims and Muslim extremists. His wife has insisted that the murders were the result of a parking dispute, and not a hate crime. As a result, Hicks has been portrayed as not hating Muslims.

This profile of Hicks is indeed complex. He does not fit the conventional profile of a “racist” – i.e., someone who believes that all Muslims are a threat to America; who clings to essentialist and binary notions of identities; who espouses that certain groups of people do not deserve human rights; who practices intentional bigotry; who is firmly rooted in a logic that justifies inequality. I am reluctant to use the term “racist” since it conjures an image of someone who participates in blatant and intentional forms of hate. However, what this case shows us is that we need a new paradigm to understand racialized violence today. Yes, this case is complex, but that does not mean it is not a hate crime. It is complex because it does not fit the narrow way in which we have defined a hate crime.

Posing an either/or option – either this is or is not a hate crime – does not help us understand what transpired. Racism is not an either/or phenomenon. It is deeply embedded in our society and, when left unchecked, has the potential to inform our perceptions and actions in ways not captured by a caricatured understanding of its diverse forms. Racism is not always conscious or intentional. It exists across a vast spectrum of consciousness and intentionality.  As part of our subconscious, racism can manifest in the form of microaggressions that are often unintentional and sometimes even well-meaning. On the more dangerous side of the spectrum, it manifests in violence. We need to break the association of racism with intent because racism endures without it.

Our current cultural paradigm often makes a simplistic equation: Good people are well-intentioned and are therefore not racist; bad people are ill-intentioned and are therefore racist. Consequently, if the white police officers who killed Michael Brown and Eric Garner are considered “good people” by their friends, families, and colleagues, their actions cannot be deemed racist. Such a conclusion focuses solely on intent and overlooks how members of the police – like all of us – have been shaped and influenced by notions of black men as threatening, and how such cultural imagery has, in turn, structured racialized violence.

The point is not that Craig Hicks is any more or any less racist than the white police officers who murdered Michael Brown, Eric Garner, and other black men. Indeed, the question of their individual, consciously expressed or felt racism does not help us to understand what happened or how to prevent it in the future; it just provokes denial and defensiveness. Conversely, claiming that we are “post-race” and/or denying that a particular incident has anything to do with race does not help us solve the problem of racialized violence.

The point is not whether Craig Hicks is any more or less racist than any of us; the point is that Craig Hicks lives and his victims died in a society that is structured by deeply institutionalized and culturally pervasive racism that exists regardless of whether any individual “wants” it to or not, and regardless of whether we as a society want to acknowledge it or not. We need a new paradigm, a new language and framework, to understand racialized violence today. Hicks’ profile provides an opportunity to challenge ourselves to rethink our understanding of racism and hate crimes in order to prevent murder.

Evelyn Alsultany is Associate Professor in the Program in American Culture at the University of Michigan. She is the author of Arabs and Muslims in the Media: Race and Representation after 9/11 (NYU Press, 2012).

#MuslimLivesMatter, #BlackLivesMatter, and the fight against violent extremism

—Zareena Grewal

On Tuesday, February 10, 2015, Craig Stephen Hicks, 46, was charged with the first-degree murder of three Arab Muslim college students in Chapel Hill, North Carolina.

Photo: http://twitter.com/samahahmeed.

Hicks’ neighbors, Deah Shaddy Barakat, 23, and Yusor Mohammad, 21, were newlyweds—and Razan Mohammad Abu-Salha, 19, was visiting her older sister and brother-in-law at the time of the execution-style killing. After the mainstream US media’s initial silence, the homicide is now referred to as “a shooting,” sparking worldwide Twitter hashtag campaigns such as #CallItTerrorism and #MuslimLivesMatter with many speculating on how the crime might have been framed had the perpetrator been Muslim and the victims white.

The motives of Hicks, who turned himself in to police, are the source of heated debate and speculation. According to his Facebook profile, Hicks describes himself as an anti-theist, a fan of the controversial film American Sniper and atheist polemicist Richard Dawkins, and a proud gun-owner. The Chapel Hill Police Department described the crime as motivated by an on-going dispute between the neighbors over parking, while the father of the two young women insists it was a “hate-crime.” Chief Chris Blue recognizes and continues to investigate “the possibility that this was hate-motivated.”

Such language suggests that while Hicks’ violence is exceptional and excessive, his motivations could have been ordinary and benign: maybe he was there first, maybe he had dibs on that parking spot, maybe he had a bad day or a bad life and so he had a mental breakdown with a gun in hand. After all, while this murder is devastating to the family and friends of the victims, for many of us, it is not shocking. We know and expect “lone shooters” to be white, heterosexual men; we know and expect their victims to be men of color, women, youth.

But it is American Muslim leaders who will gather in DC for the Obama administration’s “Countering Violent Extremism Summit” in a few days.

Individualizing the violence of white American men into “lone wolves” conceals the regularity of such violence and the state’s inability to prevent it, to make us “secure,” even to name it. This is one of the searing lessons of the #BlackLivesMatter movement; George Zimmerman’s sense of insecurity was used to justify his murder of an unarmed, black teenager, Trayvon Martin. As the #BlackLivesMatter movement demonstrates, Zimmerman was part and parcel of a larger phenomenon of racial, homicidal violence against unarmed blacks enacted in tandem by ordinary white citizens “standing their ground” and militarized police forces.

A significant number of blacks in the U.S. are also Muslim and, therefore, vulnerable to being brutalized and murdered simply because they are black. Despite the fact that black youth are more than four times as likely as any other group to be gunned down by police, critics of #BlackLivesMatter continue to ignore this harsh reality, insisting that #AllLivesMatter.

Clearly, all lives do not matter to everyone. The #BlackLivesMatter movement brings our attention to the fact that violence in the name of white supremacy only horrifies and terrifies some of us.

Disingenuous claims about how all lives matter or how parking is frustrating hide the insidious influence of racism. In my book, Islam is a Foreign Country, I explore how American Muslim communities grapple with the pervasive racial hatred of their religion. This morning a Pakistani friend asked whether she will now have to explain to her young children that some people hate them just for being Muslim. African American Muslims know all too well that the question is not whether but when to teach their children that they are vulnerable. Hicks’ victim knew it too; she saw it in his eyes, telling her father, “He hates us for what we are and how we look.”

Zareena Grewal is Associate Professor of American Studies and Religious Studies at Yale University. She is the author of Islam is a Foreign Country: American Muslims and the Global Crisis of Authority (NYU Press, 2015).

No scrubbing away America’s racist past

—Carl A. Zimring

Last week, @deray tweeted an image of a century-old soap advertisement showing a young white boy using soap to wash the pigment off of a young African-American boy’s body. He captioned it “Ads. Bleaching. History. America.”

Had he wished to, @deray could have sent out dozens of such tweets, each with a different image. The tweeted image was but one of dozens printed between 1880 and 1915 displaying claims that soaps could literally wash dark pigment off of skin. My forthcoming book, Clean and White, reproduces similar examples from Lautz Brothers, Kirkman and Sons, and Pearline. The latter featured an illustration of an African-American woman scrubbing a young child and exclaiming “Golly! I B’leve PEARLINE Make Dat Chile White.”

These racist caricatures focused primarily but not exclusively on African-Americans. Kirkman and Sons released an advertisement sometime after 1906 that referenced that year’s Pure Food and Drug Act. The ad showed three white women washing three wealthy Turkish men’s skin from brown to white. The accompanying poem tells the story of how the women were the Turkish men’s maids. They convinced the men to let them wash their skin with the soap, transforming their features to milky white. The story ended happily, with the now-white men marrying each of the maids. Cross-racial and cross-class lines were transcended, all through the miracle of a pure, cleansing soap.

Such a message was consistent with the trope that skin darker than white was somehow impure and dirty. Products boasting of absolute purity claimed to be so powerful that they could literally wash away the stain of race.

Why do these images matter as anything beyond century-old relics of America’s racist past? These images proliferated at a time when the rhetoric and imagery of hygiene became conflated with a racial order that made white people pure, and anyone who was not considered white was somehow dirty. The order extended from caricatures to labor markets. Analysis of census data indicates the work of handling waste (be it garbage, scrap metal, laundry, or domestic cleaning) was disproportionately done by people who were not native-born white Americans.

Through World War II, this involved work by African Americans and first- and second-generation immigrants from Asia, Latin America, and Southern and Eastern Europe. In the second half of the twentieth century, the burdens of this dirty and dangerous work fell heavier on Hispanic and African-American workers, creating environmental inequalities that endure to this day. They are evident in the conditions that led to the Memphis sanitation workers’ strike in 1968, as well as the residents of Warren County, North Carolina, lying down in the street to block bulldozers from developing a hazardous waste landfill in 1982. Environmental inequalities are evident still in environmental justice movements active across the United States in 2015.

Since the end of the Civil War, American sanitation systems, zoning boards, real estate practices, federal, state, and municipal governments, and makers and marketers of cleaning products have all worked with an understanding of hygiene that assumes “white people” are clean, and “nonwhite people” are less than clean. This assumption is fundamental to racist claims of white supremacy, a rhetoric that involves “race pollution,” white purity, and the dangers of nonwhite sexuality as miscegenation. It is also fundamental to broad social and environmental inequalities that emerged after the Civil War and that remain in place in the early twenty-first century. Learning the history of racist attitudes towards hygiene allows us to better understand the roots of present-day inequalities, for the attitudes that shaped those racist soap advertisements remain embedded in our culture.

Carl A. Zimring is Associate Professor of Sustainability Studies at Pratt Institute. He is the author of Clean and White: A History of Environmental Racism in the United States from Monticello to Memphis (forthcoming from NYU Press).

‘Left Behind,’ again? The re-emergence of a political phenomenon

—Glenn W. Shuck

Critics just don’t get Left Behind, a new movie adaptation of the best-selling book series. Sure, it’s predictably awful. The acting is bad, the production is terrible, and the plot is thinner than Soviet toilet paper. But the stakes are far higher than with a typical, first-order howler. Left Behind preaches to the choir, sure, but this is no ordinary choir! The film, like the novels, doesn’t cater to Hollywood styles; it’s all about motivating people to “spread the word,” and that word is just as political as it is otherworldly.

Ten years have passed since the original Left Behind novels by Tim LaHaye and Jerry B. Jenkins concluded. In a world where news cycles grow ever shorter, ten years is several lifetimes. Sure, the Christian suspense novels helped unleash powerful political forces then, but what about now?  The series and the values it champions may have found a way to return with the debut of the new Left Behind film.

Why Left Behind?  Why now?  Financial motives, as always, play a powerful role. Just look at recent films and television serials. Apocalypse sells. God sells. Fear sells. But another motive is at hand: apocalyptic narratives are also multi-stranded; they carry, after all, a revelation. They proclaim a new way of being in the world. In short, apocalyptic narratives often motivate action.

Dr. LaHaye and Mr. Jenkins helped politically conservative evangelicals in the 1990s move beyond “single issue” and “values voters” labels to empower their political imaginations beyond narrow and predictable categories. As the series progressed and the politics behind it came together, it helped an upstart and then highly disliked presidential candidate, George W. Bush, to two unlikely victories, selling millions of political primers along the way. The Left Behind phenomenon helped embolden a hyper-energized religious right.

But something went wrong with doomsayers’ forecasts of evangelical political dominance: evangelicals stopped voting at the high rates that had boosted President Bush. It wasn’t just that the original Left Behind film and its sequels were big-screen busts. “New” Republican standard-bearers Senator John McCain and former Massachusetts governor Mitt Romney never held much appeal for evangelicals. Yes, it hurt that the Left Behind series and its spin-offs and spokespersons were no longer so influential. Moreover, Bush’s unpopularity became toxic. More “moderate” Republican candidates, however, without the full support of a key voting bloc, found a different kind of apocalypse.

Fast-forward to 2014. A president is not on the ballot, but President Obama’s policies certainly are as Democrats fight to retain the Senate. Polls and pundits raise concerns for Democrats. But Democrats ought to also consider a voting bloc that has been under-engaged for a decade. Some experts have assumed evangelicals and the Tea Party are one and the same (or similar enough), hence one can already account for these potential voters. But it is simplistic to equate the Tea Party with the religious right. It takes more than faux filibusters to push high percentages of mercurial evangelical conservatives to the polls, especially in a midterm election, albeit one as critical as this year’s.

Re-enter the Left Behind phenomenon. Left Behind, another adaptation of the novels, is earning the phenomenon, and the values it champions, a closer look. The film does not have the highest budget, but this re-boot has fared much better than the 2001 original, grossing almost twice as much in its first weekend (roughly $7 million) as the ill-fated original earned all told. Whether the film grosses $20 million or $50 million matters less, however, than the fact that it has brought conservative evangelicals back into the news cycle.

Thus in the days leading up to the 2014 midterms, Republicans have a wild card in Left Behind that just may become an ace. It is absurd to suggest a low-budget film will change the balance of power in the U.S.A., but it has resurrected the dynamics of the novels, and conservative evangelicals finally have a powerful reminder to vote.  And as any pundit will admit, it won’t take much to tip the scales in Washington.

Finally, the timely release of Left Behind, one month before the crucial midterms, may owe to coincidence. Evangelicals do not believe in coincidence, however, nor should campaigners in evangelical-filled battleground states such as North Carolina, Kansas, and Iowa, to name a few. Left Behind is playing in the heartland, playing for the hearts and minds of conservative evangelical voters. Critics who dismiss Left Behind as simply an awful film, firing dull hip-shots with dismissive derision and canned clichés, miss the point. Left Behind is not about film prizes or outstanding cinematography or even good taste. It’s all about “spreading the word.” Who will be left behind if the film re-energizes its core audience and steers them into action just weeks before the crucial elections next month?

Glenn W. Shuck is Assistant Professor of Religion at Williams College and author of Marks of the Beast: The Left Behind Novels and the Struggle for Evangelical Identity (NYU Press, 2004).

“Sounds familiar”: The revolution in #Ferguson

—Shana L. Redmond

Seven days after the murder of Black youth Michael Brown by police officer Darren Wilson in Ferguson, Missouri, state governor Jay Nixon instituted a citywide curfew between the hours of midnight and 5am. This effort to tame and remove from view those who continue to protest and rebel against the injustices suffered there and around the country was part of a larger blackout designed to conceal the escalation of the militarized police state in Ferguson. This strategy on the part of the city’s police department included disabling streetlights and attacking and arresting journalists covering the story.

My first effort to see the real-time, on-the-ground happenings in Ferguson came on that day, the same day that alternative media streams temporarily went black. I visited Activist World News Now online for the live stream of Ferguson, but I did not get a visual of that embattled community’s resolve; instead I heard it. From beyond the black screen I heard a voice leading a chant:

Solo voice: “Won’t be no police brutality…”

All: “…when the revolution come.”

Solo voice: “Won’t be mass incarceration…”

All: “…when the revolution come.”

This performance, in the step-worn and gas canister-ridden streets of Ferguson, was the sound of protest and all the evidence I needed to document this war zone. The sound showed me that police antagonized protestors. The sound showed me that there were critical and politically diverse numbers of people there, demanding change. The sound showed me that neither voices nor spirits were broken in that city under siege. And the sound showed me that the peoples’ determination to imagine and claim different futures, free from police brutality and mass incarceration (amongst other violences), is alive even when haunted and pursued by death.

Unfortunately, the sounds emanating from Ferguson also showed me that times haven’t changed as much as some insist they have—at least not for African-descended people in the U.S. The demands by protestors for alternatives to the frightening present are not new. Marcus Garvey, the inimitable leader of the Universal Negro Improvement Association (UNIA), argued a century ago for these futures that we still march for today. This truth is particularly devastating when one considers that one of his most famous speeches on Black violability was compelled by events that occurred 15 miles southeast of Ferguson.

In 1917, Garvey delivered a speech entitled “The Conspiracy of the East St. Louis Riots.” The bloody events of that massacre, which ensued just over the state line in western Illinois, began with a white mob who attacked the Black working class section of the city over perceived competition for employment. Their murderous nativism was so explosive that the Illinois National Guard was deployed. Garvey’s speech on the incident followed a large silent protest staged by the National Association for the Advancement of Colored People (NAACP) in New York City. Far from condoning this approach, Garvey argued that it was “no time for fine words, but a time to lift one’s voice against the savagery of a people who claim to be the dispensers of democracy.” He continued,

For three hundred years the Negroes of America have given their life blood to make the Republic the first among the nations of the world, and all along this time there has never been even one year of justice but on the contrary a continuous round of oppression. At one time it was slavery, at another time lynching and burning, and up to date it is wholesale butchering. This is a crime against the laws of humanity; it is a crime against the laws of the nation, it is a crime against Nature, and a crime against the God of all mankind.

The litany of brutalities described here provides a genealogy of state-sanctioned violence that continues to its logical end in contemporary Ferguson, New York City, Los Angeles, Milwaukee, Atlanta, and countless other locations across the country. As in 1917, the National Guard has again been called to greater St. Louis. Black communities continue to be the laboratories for warfare, testing the efficacy of technology (Ferguson police have employed tanks, snipers, tear gas, and rubber bullets, to name but a few weapons) and enemy narratives that turn murder victims into easily disposable criminals. This is the status of our democracy.

I do not know what special meaning the people who slaughtered the Negroes of East St. Louis have for democracy of which they are the custodians, but I do know that it has no literal meaning for me as used and applied by these same lawless people. America, that has been ringing the bells of the world, proclaiming to the nations and the peoples thereof that she has democracy to give to all … has herself no satisfaction to give 12,000,000 of her own citizens except the satisfaction of a farcical inquiry that will end where it begun, over the brutal murder of men, women and children for no other reason than that they are black people seeking an industrial chance in a country that they have labored for three hundred years to make great.[1]


Shana L. Redmond is Associate Professor of American Studies and Ethnicity at the University of Southern California. She is a former musician and labor organizer. Her book, Anthem: Social Movements and the Sound of Solidarity in the African Diaspora, is available now from NYU Press. Follow her on Twitter: @ShanaRedmond.


[1] Marcus Garvey, “The Conspiracy of the East St. Louis Riots” (1917), reprinted on the “American Experience” website, Public Broadcasting Service (PBS), http://www.pbs.org/wgbh/amex/garvey/filmmore/ps_riots.html (accessed August 17, 2014).

Maleficent: A feminist fairy tale?

—Jessie Klein and Meredith Finnerty

Maleficent makes us want to stand up and cheer—and then sit down stunned. The film distinguishes itself as the third in a trend of major studio releases that seem determined to reverse the damage of the common fairy tale motif: “Wealthy princes save skinny damsels for love ever after.” Yet, as research reveals high U.S. social isolation, the reinvented princess plots portend ominous new troubles while embracing old snares; together these phenomena suggest that human love in the U.S. may be endangered.

In the wake of Brave (2012) and Frozen (2013), Maleficent suggests that true love at best won’t be found in some random prince you meet one day, and at worst, said prince may well be seeking to destroy you to realize his own ambitions.

“You got engaged to someone you met the same day?” howls Kristoff to Anna in Frozen. These messages are a partial triumph, advising young people to work to find a forever partner, among other priorities.

The other themes, though, are foreboding: In addition to pressure to look like ever more unattainable Photoshopped images (still contributing to eating disorders at ever younger ages), young people are told to look for intimacy from parents and siblings—and consider romantic love from a spouse (or anyone else) a distant, and perhaps unachievable, goal.

Maleficent’s former love, Stefan, steals her power of flight when he absconds with her wings in order to become king. In Frozen, Anna’s fiancé, Prince Hans, tries to kill Anna and destroy the ice-power endowed to her older sister, Queen Elsa, in order to mount their throne. And Princess Merida’s suitors in Brave, chosen by her parents, are arrogant and incompetent.

In Frozen, it is Anna’s sister, Elsa, who accidentally ices Anna’s heart, and then frees her from this fate with her own true love sibling kiss. In Maleficent, the evil witch-turned-doting mother figure embodies such love; and in Brave, Merida herself liberates her mother from life as a bear, with the heart only a daughter can bestow.

What a departure from the historic themes where evil stepsisters, stepmothers, and girls generally are so competitive that they achieve each other’s demise. Such parables characterizing sisters as envious and hateful are present in, among others, Oz the Great and Powerful (2013), expected in Cinderella (2015), and a constant in contemporary film renditions of classics such as King Lear.

The depiction of sisters and “stepmothers” as devoted to one another in Frozen and Maleficent is new; and the portrayal of true love found in familial bonds reflects startling statistics. Family intimacy has remained constant while relationships of other kinds are disintegrating, as revealed by comparing the 2004 General Social Survey with the 1985 survey. The U.S. marriage rate has reached its lowest point in the past century: in 1920, there were 92.3 new marriages for every 1,000 unmarried women; today there are just 31.1, according to a 2013 study by Bowling Green State University’s National Center for Family and Marriage Research; and 40 to 50 percent of those unions end in divorce. Not least, people have fewer friends, and connect with neighbors and other community members less.

Today’s fairy tale heroines are also turning to non-human companions for support (note Maleficent’s bird and Anna’s snowman). Princess Merida and her mother see each other’s wisdom only when the mom becomes a bear. Could this be a reference to real-world declining rates of social connections outside the family? Almost 25 percent of women won’t marry unless their pets approve (per JDate and ChristianMingle’s State of Dating in America survey, 2014), suggesting that animals are replacing humans for family support. Another trend is for women to adopt dogs instead of children.

Young people watch these films at a time when social isolation has tripled and empathy and trust have decreased. Other than with Mom and Dad, a trusted sibling, and perhaps a dog, people in the U.S. have less love in their lives than past generations did.

We celebrate the victories in these reimagined legends. When before have children’s movies warned against blindly following the call to marry above any other goal—and encouraged girls to look for intimacy elsewhere, not least in the family? We appreciate the themes encouraging girls to know and use their inner power. These are among the memos we wish we and our peers had received in our formative years.

We hope, though, that future scripts will also describe, and prescribe, more hope for social relationships in America among intimate partners (gay, straight and other) and male and female human friends. We look forward to heroines who defy the still frozen frames whereby women must be blonde and stick-thin to be loved.

These standards are destructive and cruel, and have even expanded to torment men. Impossibly high-definition muscle-man images have contributed to increasing rates of eating disorders among men, including the life-threatening condition recently dubbed “bigorexia.”

Each of these tales shifts hope for the marriage in question from the classic “happily ever after” to “perhaps.” Will we see such a “maybe” embrace heroes and heroines with different body types in future films? Could friends and neighbors be the source of an expanded depiction of the many shapes of true love? Let us know.

Jessie Klein is the author of The Bully Society: School Shootings and the Crisis of Bullying in America’s Schools (NYU Press, 2012). She is Associate Professor of Sociology and Criminal Justice at Adelphi University. Meredith Finnerty is a birth doula and certified HypnoBirthing Childbirth Educator (HBCE).

[Note: This article originally appeared on Psychology Today.]