This week marks the 10th anniversary of Hurricane Katrina. In reflection, we’d like to highlight a few recent books that explore the effects of the historic storm and its impact on the resilient city of New Orleans.
Mardi Gras, jazz, voodoo, gumbo, Bourbon Street, the French Quarter—all evoke that place that is unlike any other: New Orleans. But what is it that makes New Orleans ‘authentic’? In Authentic New Orleans, Kevin Fox Gotham explains how New Orleans became a tourist town, a spectacular locale known as much for its excesses as for its quirky Southern charm. Beginning in the aftermath of Hurricane Katrina amid the whirlwind of speculation and dread surrounding the rebuilding of the city, Gotham provides a unique interpretation of New Orleans, one that goes beyond its veneer and moves into the rich cultural roots of this unique American landmark.
In Critical Rhetorics of Race, a groundbreaking collection edited by Michael G. Lacy and Kent A. Ono, scholars seek to examine the complicated and contradictory terrain of racial rhetoric, critiquing our depictions of race in innovative and exciting ways. In the powerful first chapter, Michael G. Lacy and Kathleen C. Haspel take us back in time to the post-apocalyptic New Orleans of 2005 to explore the media’s troubling representations of black looters following the devastation caused by Hurricane Katrina.
When the images of desperate, hungry, thirsty, sick, mostly black people circulated in the aftermath of Hurricane Katrina, it became apparent to the whole country that race did indeed matter when it came to government assistance. The Wrong Complexion for Protection illuminates the long history of failed government responses to a range of environmental and health threats to African Americans. Drawing on compelling case studies and jaw-dropping statistics, the book is a sobering exploration of the brutal realities of institutionalized racism in disaster response and recovery.
Since the 2012 election cycle, the role of digital politics has continued to evolve. Now the story is all about social media: Facebook, Twitter, Instagram, Pinterest, and LinkedIn are all venues for candidates to communicate with voters. (All declared, and soon to be declared, candidates have Facebook and Twitter accounts.) Hillary Clinton leads on Twitter with over 3.7 million followers. Donald Trump is not far behind with just over 3 million. Rand Paul has the most “likes” on Facebook with over 2 million. There is good reason for the candidates to use social media tools. Pew reports that in 2014, 71% of online adults used Facebook. Sixty-five percent of those share, post, and comment at least sometimes on Facebook. And almost one-third of those post and comment about the news on Facebook. Data on Millennials is even more striking. According to Pew, Facebook is their main source for news about government and politics.
Social media has also impacted the way that citizens participate in political debate. At the time of my analysis of the 2012 presidential election, the main space for people to join an online debate about political issues was the comment forum that sits below individual articles on many news sites. While the democratizing effect of this type of public debate was celebrated, the substance of the discourse was also criticized as rude and vulgar. Some believed that the language on such forums represented only the most extreme and polemic views, undermining public discourse altogether. I disputed this position in my analysis of 2012 comment forum speech leading up to the presidential election, demonstrating that the substance of most comment forum speech was, in fact, fairly similar to elite discourse about the presidential election. If there was a problem with incivility during the 2012 election cycle, the problem existed far beyond citizen comment forums.
Heading into the 2016 presidential cycle, social media has also changed the nature of comment forums. Due to the tremendous increase in social media users, as well as a desire to improve the civility of comments, many news sources either require contributors to sign in through an existing social media account, or have moved public discussion to social media sites altogether. For instance, in 2014, Huffington Post banned anonymous comments and required contributors to sign in through a social media account to ensure that their comments were attached to a real name (no more “sukonthis,” “libs_r_trouble,” or “mancreatedgod”). CNN removed its comment forums altogether at the end of 2014, opting to host discussion via its Facebook and Twitter platforms. Fox News is an interesting exception. During the months leading up to the 2012 election, Fox News disabled its comment function completely, but since the election, has brought back the comment forum for some articles. In general, all news sites now have Facebook accounts, whether or not they have retained the comment forum function on their official news sites.
So, has the move to Facebook altered the substance of online public discourse? At this stage, it is difficult to compare current Facebook discussions with my original analysis. The 2012 data came from comments generated in the final months of the general election cycle, while we are barely into the primary season for 2016. Discussion during a primary season is likely qualitatively different from discussion during a general election, when internal party disagreement decreases. Keeping in mind that this is the primary stage, with most of the cycle still ahead of us, two things stand out in comment forums. First, the changes in comment forum rules and venues have not changed the discourse. Second, conservative commenters are really angry at the Republican establishment.
First, language has not changed much as it moves to social media. Comments are still very polarized, routinely rude, and often tied to policy issues, very loosely defined, which is what I found in my first analysis. The one difference is not the speech, but the more polarized discussion spaces. As people rely on social media for their news, they are exposed to fewer perspectives, because even more than before, people see the news they want to see. It also remains true that social media comments on the 2016 presidential race track fairly closely with elite discourse, which is similar to my findings in 2012. Because the rules now make it harder (although not impossible) to post anonymously, it is increasingly difficult to dismiss comment forums as the ravings of extremists and trolls who do not represent real citizens’ views. Further, as Pew reports, “For most politically active SNS users, social networking sites are not a separate realm of political activity. They are frequently active in other aspects of civic life.” While we might want to ignore this discourse, the people posting on comment forums are likely to be a factor in the presidential election.
Second, it is abundantly clear that there is discontent amongst the conservatives represented on comment forums. Everyone knows that liberals and conservatives are polarized, but the division within the Republican Party is extremely evident online as well. Conservatives who post on these forums are very upset with the Republican establishment. They believe that their causes have faced nothing but losses – losses that are the fault of Republicans, such as a majority Republican Congress that has not delivered results (in their minds) and two significant defeats from a presumably conservative Supreme Court (on healthcare and gay marriage). Typical posts on the subject are as follows:
“Why [have] the Republicans…done NOTHING since they won a landslide victory in both houses???????????????”
“I have not missed a presidential vote since Reagan in 1980. I’m so very close to sitting this next one out. The candidate better be an uncompromised Constitutionalist or I’m out.”
“…Thus far, none of the elected Republicans have shown any backbone at all or done what they promised they would do. We still have Obamacare, it’s not defunded, or removed. We still have a budget that only serves special interests. We have the rights of Christians, gun owners, and the constitution under attack. Can ANY of YOU remember that you are elected to protect the Constitution?…”
The fury fairly leaps off the page on these forums, and it is clear that at this point in the election cycle these commenters are not at all interested in candidates who can build coalitions and consensus. They want a fighter who will defeat the opposition, not work with them.
Enter Donald Trump. Many elites scoff at Trump’s bombastic language, fairly criticizing its flaws in fact and tone. They are also surprised (and sometimes worried) at the support he has received thus far. Based on comment forum discourse, however, it is not surprising at all. The attraction of Trump is not his mastery of policy issues – it is his uncompromising, “take no prisoners” approach to our political problems. For conservatives who feel the establishment wing achieves nothing by bargaining and negotiating in the political process, his rhetoric is music to their ears. In the words of commenters,
“These main stream Republicans are running scared. The are basically no different than the democrats. Spineless. Crank it up Mr. Trump!”
“The republicans bashing trump are weak. And jealous of him. These republicans are the same ones meander [sic] with the dems behind close [sic] doors.”
“I want Ted Cruz, Carly Fiornia and other candidates – including Donald Trump included in the upcoming debates. No more shoving some weak kneed GOP candidate who will lose (again) to the Liberal Progressives who have taken over the Democrat party. If FOX can’t accomplish this simple task, why should we TRUST FOX NEWS anymore?”
There is currently great support for Trump’s candidacy. Of the first 100 comments on a Fox News Facebook post about Trump, 92 expressed support. This is typical for conservative forums, where support for Trump is currently in the majority, if not a supermajority. A tally of the comments on a CNN Facebook post about Anderson Cooper’s interview with Trump showed less support: only 24 of the first 100 comments were supportive (which is not insignificant, given CNN’s position in the media’s mainstream). Based on a reading of the first 100 comments of four CNN Facebook pieces about presidential candidates, approximately 20% support Trump. Of course, today’s frontrunners may be forgotten in a few months (or even weeks), but the anger at establishment Republicans is the force driving support for Trump and will likely continue to be a factor in the race. Trump may not be the ultimate vehicle for this element of the Republican Party, but they want a candidate who is a fighter and not interested in bargaining and compromise.
Viewed individually, comment forum posts do not provide much insight on public opinion and they mostly serve to alarm everyone about the decline of civilized discourse. If you read enough of this speech, however, overall trends emerge. In the aggregate, comment forums are particularly useful in identifying more visceral aspects of opinion. The substance of this language is similar to elite discourse, but public comments tap into an overall mood.
Every week is a lifetime in a political campaign and it is not likely that Trump’s appeal can survive the entire primary cycle. The details of his various policy pronouncements are conveniently vague, and his bold statements will not be as impressive when subjected to close scrutiny. The anger and division within the Republican Party will remain, however, and Republican candidates will have the unenviable task of placating a very active wing of the Republican Party that is not in the mood for compromise and wants nothing to do with Establishment Republicans. I would not be surprised if many Republican candidates are currently hearing this message loud and clear (which is why many of them are hesitant to simply denounce Trump) and will continue to incorporate plenty of “fighting” words in their discourse. It is telling that Scott Walker’s speech declaring his candidacy did not tout a record of building consensus and getting things done, but rather that he could fight and win.
By the time the general election rolls around, this rebellion could subside as Republicans close ranks against the Democratic candidate, but the gist of the current comment forum language is that they erred in “settling” for Mitt Romney in 2012 and are not going to make that mistake again.
Karen S. Hoffman is Director of Undergraduate Studies and Visiting Assistant Professor in the Department of Political Science at Marquette University. She is the author of Popular Leadership in the Presidency: Origins and Practice. She has also published articles on the presidency, presidential rhetoric, and political communication in Rhetoric & Public Affairs and Congress and the Presidency. Her essay on comment forum speech appears in Controlling the Message: New Media in American Political Campaigns (NYU Press, 2015).
I got my first email account the fall I started graduate school, in 1995. Even then I had an inkling of the pressures that would come to be associated with this miracle of communication. My entry into grad school coincided with a relationship transitioning into a long-distance one, and what at first was a welcome mode of remaining tethered soon enough became a source, and an outlet, of demand, anxiety, guilt, and recrimination.
This early experience of the pressure and affective weight of email faded into memory alongside that relationship, and certainly at the time it did not occur to me to hold the medium responsible for the messages. But over the past couple of years, now that I am lucky enough to be firmly cemented in an academic job and stupid enough to have taken on an administrative role, that experience has reemerged for me as something of a retroactive portent of the place email would come to hold in my life. Because as anyone in an even remotely bureaucratic environment will tell you, “email” is no longer simply a way of communicating with others, though periodically a message gets through that is significant enough that the medium becomes momentarily transparent. Email is now an entity in and of itself—a gargantuan, self-perpetuating and purely self-referential monstrosity that I do not “use” but barely “manage,” a time-sucking and soul-crushing mass that I can only chip away at in an attempt to climb over or around to get to an actual task.
From epidemic-level ADHD and eye fatigue to degenerative spinal conditions at younger and younger ages—not to mention my self-diagnosed early-onset thumb arthritis—constant interaction with digital devices has arguably had widespread health consequences. It is also fraught with an expansion, intensification, and perversion of the emotions associated with that first email account. But while then I attached those affective elements to a romantic relationship, they are now purely indicative of my relationship to email “itself”: the phenomenon that makes constant and growing demands on my time, attention, and energy, requiring that I devote at least a modicum of substantive thought to each individual expression of its enormous, garbage-filled maw. Time spent on email has grown to hours every day. This is not a measure of increased “productivity.” In fact it is just the opposite, as answering email has become the forest I have to machete my way through just to get to the things that actually constitute my job. And while I do get angry at the jaw-dropping idiocy of certain student emails (Hi Mrs. Zimmer can u send me the syllabus because the final is tomorrow and i missed the last eleven weeks of class) and irritated at the endlessly accumulating details of academic work (Dear Dr. Zimmer, this is a friendly reminder that the due date for the Learning Outcomes Rubrics Departmental Self-Assessment Model is March 23rd) ultimately each one of these individually maddening iterations is just a sign of the incomprehensible sprawl of the medium. And when factored in with texting, messaging, social media, streaming television, and any number of other incoming and outgoing information flows, the sense of being “overwhelmed” seems unsurprisingly ubiquitous.
Email is of course inseparable from the character of any digital labor and the economy of which it is a part: it thus becomes a useful metonymic device to understand how convenience has become so profoundly debilitating. Though no one explicitly states it (because it would sound insane), the demand that we keep up with and process this level of information, and communicate coherently in return, is a demand that the human brain function closer to the speed of digital communications. Needless to say, it does not. Thus the unparalleled levels of prescription of amphetamines and pain medications are not merely the triumph of the pharmaceutical industry, but an attempt to make the human brain and body function alongside and through digital mediation. The relative ease of communications, the instantaneity of information exchange, does not make our lives simpler: it means that we are asked to attend to every goddamn thing that occurs to the countless people we know, institutions we are a part of, and every other organization whose mailing list you have been automatically placed on simply by having a single interaction with them. It’s like being able to hear the internal mutterings of both individual people and cultural constructs: a litany of the needs of others and the expectations of the social sphere (not to mention my own neurotic meanderings when I have to construct a careful response to someone, or an email I have sent goes unanswered). Finding it increasingly impossible to recognize and affectively react only to the articulations of each missive, I respond instead to the cacophonous noise of the whole damn thing. That noise is now constant, while its volume ebbs and flows with the rhythms of the work year. As the only constant, email becomes an end in itself. Email never goes away. Email is an asshole.
It is not surprising that this self-perpetuating mode of interaction comes alongside a proliferation of (self-)assessment and (self-)documentation—talking about what you will, have, or are doing instead of just doing it. Thus the ability to communicate about everything, at all times, seems to have come with the attendant requirements that we accompany every action with a qualitative and quantitative discourse about that action. Inside and in addition to this vast circularity are all those things that one’s job actually entails on a practical, daily basis: all the small questions, all the little tasks that need to be accomplished to make sure a class gets scheduled, a course description is revised, or a grade gets changed. Given how few academic organizations have well-functioning automatic systems that might allow these elements to be managed simply, and that my own university seems especially committed to individually hand-cranking every single gear involved in its operation on an ad hoc basis, most elements of my job mean that emails need to be sent to other people.
Once I send an email, I can do nothing further until someone sends an email back, and thus in a sense, sending that email becomes a task in itself, a task now completed. More and more it is just a game of hot potato, with everyone supposedly moving the task forward by getting it off their desk and onto someone else’s, via email. Every node in this network is itself fighting to keep up with all its emails, in the back and forth required before anything can actually be done. The irony of the incredible speed of digital mediation is thus that it often results in an intractable slowness in accomplishing simple tasks. (My solution has been to return to the telephone, which easily reduces any 10-email exchange into a 2-minute conversation. Sidenote: I never answer my own phone.)
In case it isn’t already clear, such an onslaught of emails, and the pressure of immediacy exerted sometimes explicitly but mostly by the character of the media, means that we no longer get to leave work (or school, or our friends or our partners). We are always at work, even during time off. The joy of turning on our vacation auto-reply messages is cursory, for even as we cite the “limited access” we will have to email (in, like, Vancouver), we know that we can and will check it. And of course we know that everyone else knows that it’s a lie. Even if we really do take time away from email, making ourselves unavailable (not looking at email, not answering our texts) does not mean email has not been sent to us and is not waiting for us. And we know it, with virtually every fiber of our being. Our practical unavailability does not mitigate our affective understanding that if we ignore email too long, not only will work pile up, but there will be emotional consequences. I can feel the brewing hostility of the email senders: irritated, anxious, angry, disappointed.
Even if I start to relax on one level, on another my own anxiety, irritation, and guilt begin to grow. Email doesn’t go away. It’s never over. It’s the fucking digital Babadook, a relentless, reflexive reminder of the unfathomable mass underlying every small transaction of information.
The nonstop stream of communication and its affective vortex are in part what philosopher Gilles Deleuze (and now many others) has described as “societies of control,” distinguished not by discipline but by the constant modulation and management of the flow of information. Ultimately we are exhausted by the endless negotiation of this unmappable terrain, and our personal and professional labors increasingly have the character of merely keeping ourselves afloat. Which is not to say that discipline no longer functions: those excluded from the privilege of control will often find themselves subject to the sharper baton of policing and incarceration.
There does appear to be increasingly widespread recognition that email is having a significant effect on both the amount of work one does and the increasing threat of that work to health and well-being. A widely and enthusiastically misreported news story that France had instituted a ban on sending work email after 6:00pm provided a much-needed salve for the idea that there is no outside to the onslaught. Never mind that this was a wishful, apocryphal version of a French labor agreement that in reality didn’t cut off email at any hour—the story still allowed France to perform its ongoing service as the English-speaking world’s fetish of a superior, butter-drenched, bicycle-riding quality of life, a life in which steak frites is now accompanied by possible escape from a particularly maddening incarnation of digital labor. That life is apparently now the stuff of internet rumor and media fancy.
The range of feelings I associate with the era of my first email account roll on through now and then as I check my inbox, and I could probably name them, though perhaps they were never discrete. And I understand that it is my job as a participant in digital culture to respond to email, and text, and instant messaging—in writing and in sentiment. But the truth is that I am just really tired. Perhaps the vacuum in affect attested to by the accumulation of emoticons and emojis has little to do with the flattening effect of digital communication. Maybe feelings are simply exhausted.
Catherine Zimmer is Associate Professor of Film and Screen Studies and English at Pace University in New York City. She is the author of Surveillance Cinema (NYU Press, 2015).
On June 7th, 2015, the musical Fun Home emerged triumphant. It won 5 Tony Awards, including Best Musical, Best Original Score, Best Book of a Musical, Best Lead Actor in a Musical, and Best Direction of a Musical. The significance of these wins cannot be overstated. A musical based on a graphic memoir featuring a lesbian, her gay father, and the rest of the family has been thrust into the purview of mainstream America—and really, who can resist having ALL of the feelings when Sydney Lucas sings “Ring of Keys?” Moreover, Jeanine Tesori and Lisa Kron have made history as the first women to win a Tony for best songwriting team.
It is clear that Fun Home gives people many reasons to be proud, especially in a month when we traditionally celebrate LGBT pride. One of the things that I find most moving about the musical (and the original graphic memoir by Alison Bechdel) is the way it actually subverts traditional narratives of pride and shame based on particular understandings of identity and masochism.
One of the conventional understandings of Pride is that it exists to celebrate triumph over homophobia and prejudice against LGBT people. That this narrative privileges a particular form of progress and has been easier for particular segments of the LGBT population is something that has been written about extensively by other queer studies scholars. In this post, I’m more interested in mentioning the ways that this conventional version of identity politics shores up a particular vision of masochism. One of the main arguments in my book Sensational Flesh: Race, Power, and Masochism is that the framework that we’ve been using to understand the relationship between individuals and power is masochism. In the book that means various things, but in the context of Pride, it has meant reveling in the wounds that produce LGBT identity—triumph would not be possible if there were no obstacle to overcome, and the more wounds that are available, the more visible the triumph and the more celebrated the identity/person.
While I am not the first to describe this relationship between identity, woundedness, and masochism, I argue that this narrative frames our understanding of what it is to be an individual, so that those with the privilege of appearing wounded are able to do so because they are already part of an assumed arc of redemption and celebration, while those whose wounds are less affective and more structural in terms of access to resources cannot access this arc in the same way (see last year’s post on Kara Walker as an example).
On the surface, it would appear as though Fun Home could fall easily into this particular trope, but it smartly sidesteps the arc of progress. In her retrospective gaze at her family life and its relationship to her father’s gayness, Alison (the oldest version of the character that we see) doesn’t pity her father or frame his suicide as the effect of a bygone prejudice that she has been fortunate to avoid. The question is not what would have happened to Bruce Bechdel had he lived in an era when he could live freely as a gay man. Neither is the focus on Alison’s ability to come out as a college student and live as a butch because things are better now. The universe of the musical understands these characters as inhabiting different modes of queerness, but it doesn’t ask us to do a comparison (despite the fact that Bruce commits suicide, which would seem to be the ultimate masochistic act).
Instead, the character whose life we imagine might have been different is Bechdel’s mother, Helen, played achingly by Judy Kuhn, whose song near the end of the show, “Days and Days,” is a tearjerker—not because she is self-pitying but because she is resigned. This is structural difference at work. She knows that her suffering does not connect to later progress or triumph, but it does not diminish her work or her pain.
Where does this lacuna of feeling lie in a world structured by suffering or triumph, a world where the individual is a masochist in order to receive redemption through pity? Throughout the musical, we see so many moments when the semi-closeted world that Bruce inhabits, the world his daughter so desperately wants to remember and connect to, is not uniformly sad; there is fun—a dance with a casket, a furtive sighting of a kindred spirit (the butch that Lucas sings so movingly about). In all, it is not a play about moving through masochism to find identity, but about recognizing the many different notes being played at the same time. The arc of identity need not be neat or masochistic (so as to end in triumph), but it makes one feel, and gives reason for finding different narratives of individuality.
The online community is still pulsing with speculation about the final close-up of Don Draper meditating on the edge of the Pacific at Esalen Institute—where he found bliss, or maybe just an idea for another blockbuster ad campaign.
The writers and set decorators of Mad Men got 1970s Esalen spot on: from the lone outside pay phone at the run-down central Lodge to the dozens of long-haired hippies, former beatniks and spiritual seekers revealing themselves to each other in encounter groups. The images are so accurate that an alternative cyber universe of old Esalen hands has been speculating about how the writers were able to depict the old days so well—and whether the morning meditation leader was supposed to be Zen trailblazer Alan Watts or edgy encounter group leader Will Schutz.
None of these debates matter much to the entrepreneurs who have transformed Esalen from a rustic spiritual retreat to a polished destination resort that serves gourmet meals and offers workshops with themes like ‘capitalism and higher consciousness.’ Soon after the last episode of Mad Men aired, Yahoo Travel published an article promoting a “Don Draper Weekend Getaway” for fortunate consumers who could foot the tab. The rates vary, but on a weekend, a premium single room at Esalen costs $450 per night and the prices go way up for luxurious accommodations overlooking the sea. In a throwback to the old days, there is a ‘hardship policy’—making it possible for up to a dozen people who take weekend workshops to spend ‘only’ about $200 a night to spread out their sleeping bags in meeting rooms that they must vacate between 9:00 in the morning and 11:00 at night.
When Esalen opened its gates in the 1960s, visitors and residents traded work for housing or paid what they could afford. The founding generation believed that everyone was entitled to personal expansion and spiritual awakening through the growing Human Potential Movement. My book, The American Soul Rush, chronicles how Esalen changed from being a mystical think tank, sacred retreat and therapeutic community into a wellness spa dedicated to de-stressing affluent customers with challenges at work or in their relationships.
In the late 1960s and early 1970s very different kinds of people drove along Highway 1 to Esalen, hoping to create better lives for themselves and often hoping to repair the world as well. They were spiritually privileged, with the time and resources to select, combine and revise their religious beliefs and personal practices. However, many of them were far from wealthy, because Esalen opened at a time of economic abundance that extended far down into the white middle class and there was widespread faith in unlimited possibilities for every American.
People in small towns and distant cities read long articles about Esalen and human possibilities in Life Magazine, Newsweek and other popular periodicals. Its key encounter group leader briefly became a celebrity when he appeared regularly on the Tonight Show Starring Johnny Carson. And during Esalen’s glory days, movie stars like Natalie Wood, Cary Grant and Steve McQueen regularly drove north from Hollywood to discover more about themselves and to soak in the famous hot springs baths. But once they arrived, they stayed in simple rooms, they were called only by their first names and other workshop participants tried to honor their humanity by treating the stars as if they were just like them.
Esalen was dedicated to opening the gates to personal and spiritual expansion to everyone and it fueled a Soul Rush. It popularized many things that contemporary Americans have added to their lives and can practice almost anywhere: yoga, mindful meditation, holistic health, humanistic psychology and therapeutic massage.
But most people can no longer afford to visit Esalen itself. A leader who left Big Sur to counsel clients in disadvantaged neighborhoods summed up how much the Institute has changed over the decades: “Damn,” she said, “I guess we got gentrified just like everybody else.”
If you watch the season finale of The Walking Dead this Sunday, the story will likely evoke events from previous episodes, while making references to an array of minor and major characters. Such storytelling devices belie the show’s simplistic scenario of zombie survival, but are consistent with a major trend in television narrative.
Prime time television’s storytelling palette is broader than ever before, and today, a serialized show like The Walking Dead is more the norm than the exception. We can see the heavy use of serialization in other dramas (The Good Wife and Fargo) and comedies (Girls and New Girl). And some series have used self-conscious narrative devices like dual time frames (True Detective), voice-over narration (Jane the Virgin) and direct address of the viewer (House of Cards). Meanwhile, shows like Louie blur the line between fantasy and reality.
Many have praised contemporary television using cross-media accolades like “novelistic” or “cinematic.” But I believe we should recognize the medium’s aesthetic accomplishments on its own terms. For this reason, the name I’ve given to this shift in television storytelling is “complex TV.”
There is a wealth of facets to explore about such developments (enough to fill a book), but there’s one core question that seems to go unasked: why has American television suddenly embraced complex storytelling in recent years?
To answer, we need to consider major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.
A business model transformed
We can quibble about the precise chronology, but programs that were exceptionally innovative in their storytelling in the 1990s (Seinfeld, The X-Files, Buffy the Vampire Slayer) appear more in line with narrative norms of the 2000s. And many of their innovations – season-long narrative arcs or single episodes that feature markedly unusual storytelling devices – seem almost formulaic today.
What changed to allow this rapid shift to happen?
As with all facets of American television, the economic goals of the industry are a primary motivation for all programming decisions.
For most of their existence, television networks sought to reach the broadest possible audiences. Typically, this meant pursuing a strategy of mass appeal featuring what some derisively call “least objectionable programming.” To appeal to as many viewers as possible, these shows avoided controversial content or confusing structures.
But with the advent of cable television channels in the 1980s and 1990s, audiences became more diffuse. Suddenly, it was more feasible to craft a successful program by appealing to a smaller, more demographically uniform subset of viewers – a trend that accelerated into the 2000s.
In one telling example, FOX’s 1996 series Profit, which possessed many of contemporary television’s narrative complexities, was quickly canceled after four episodes for weak ratings (roughly 5.3 million households). These numbers placed it 83rd among 87 prime time series.
Yet today, such ratings would likely rank the show in the top 20 most-watched broadcast programs in a given week.
This era of complex television has benefited not only from more niche audiences, but also from the emergence of channels beyond the traditional broadcast networks. Certainly HBO’s growth into an original programming powerhouse is a crucial catalyst, with landmarks such as The Sopranos and The Wire.
But other cable channels have followed suit, crafting original programming that wouldn’t fly on the traditional “Big Four” networks of ABC, CBS, NBC and FOX.
A well-made, narratively complex series can be used to rebrand a channel as a more prestigious, desirable destination. The Shield and It’s Always Sunny in Philadelphia transformed FX into a channel known for nuanced drama and comedy. Mad Men and Breaking Bad similarly bolstered AMC’s reputation.
The success of these networks has led upstart viewing services like Netflix and Amazon to champion complex, original content of their own – while charging a subscription fee.
The effect of this shift has been to make complex television a desirable business strategy. It’s no longer the risky proposition it was for most of the 20th century.
Miss something? Hit rewind
Technological changes have also played an important role.
Many new series reduce the internal storytelling redundancy typical of traditional television programs (where dialogue was regularly employed to remind viewers what had previously occurred).
Instead, these series subtly refer to previous episodes, insert more characters without worrying about confusing viewers, and present long-simmering mysteries and enigmas that span multiple seasons. Think of examples such as Lost, Arrested Development and Game of Thrones. Such series embrace complexity to an extent that they almost require multiple viewings simply to be understood.
In the 20th century, rewatching a program meant either relying on almost random reruns or being savvy enough to tape the show on your VCR. But viewing technologies such as DVR, on-demand services like HBO GO, and DVD box sets have given producers more leeway to fashion programs that benefit from sequential viewing and planned rewatching.
Like 19th century serial literature, 21st century serial television releases its episodes in separate installments. Then, at the end of a season or series, it “binds” them together into larger units via physical boxed sets, or makes them viewable in their entirety through virtual, on-demand streaming. Both encourage binge watching.
Giving viewers the technology to easily watch and rewatch a series at their own pace has freed television storytellers to craft complex narratives that are not dependent on being understood by erratic or distracted viewers. Today’s television assumes that viewers can pay close attention because the technology allows them to easily do so.
Shifts in both technology and industry practices point toward the third major factor leading to the rise in complex television: the growth of online communities of fans.
Today there are a number of robust platforms for television viewers to congregate and discuss their favorite series. This could mean partaking in vibrant discussions on general forums on Reddit or contributing to dedicated, program-specific wikis.
As shows craft ongoing mysteries, convoluted chronologies or elaborate webs of references, viewers embrace practices that I’ve termed “forensic fandom.” Working as a virtual team, dedicated fans embrace the complexities of the narrative – where not all answers are explicit – and seek to decode a program’s mysteries, analyze its story arc and make predictions.
The presence of such discussion and documentation allows producers to stretch their storytelling complexity even further. They can assume that confused viewers can always reference the web to bolster their understanding.
Other factors certainly matter. For example, innovative writer-producers like Joss Whedon, J.J. Abrams and David Simon have harnessed their unique visions to craft wildly popular shows. But without the contextual shifts that I’ve described, such innovations would have likely been relegated to the trash bin, joining older series like Profit, Freaks and Geeks and My So-Called Life in the “brilliant but canceled” category.
Furthermore, the success of complex television has led to shifts in how the medium conceptualizes characters, embraces melodrama, re-frames authorship and engages with other media. But those are all broader topics for another chapter – or, as television frequently promises, to be continued.
We may never know exactly what Craig Stephen Hicks was thinking when he killed Syrian American medical student Deah Barakat, his Palestinian American wife Yusor Abu-Salha, and her sister Razan Abu-Salha. But we do know that U.S.-led war in Arab and Muslim majority countries has normalized the killing of Arabs and Muslims. It is more crucial than ever before to understand the institutionalization of racism against persons perceived to be Arab or Muslim in terms of the structures of imperial war that normalize killing and death and hold no one (other than victims themselves) accountable.
Photo: Molly Riley/UPI.
The Obama Administration may have dropped the language of the “war on terror,” but it has continued its fundamental strategy of endless war and killing in the Arab region and Muslim majority countries such as Afghanistan and Pakistan (without evidence of criminal activity). The unconstitutional “kill list,” for instance, allows the president to authorize murders every week, waging a private war on individuals outside the authorization of Congress. Strategies like the “kill list” resolve the guilt or innocence of list members in secret and replace the judicial process (including cases involving U.S. citizens abroad) with quick and expedited killing. These and related practices, and their accompanying impunity, look something like this:
The U.S.-backed Israeli killing of Palestinians reinforces the acceptability of Arab and Muslim death. In July 2014, the Israeli Defense Force killed at least 2,000 Palestinians, including 500 children. It is well established that the IDF soldiers deliberately targeted civilians. The Obama Administration’s response? Explicit support for Israel.
Since the 1970s (when the U.S. consolidated its alliance with Israel), the corporate news media has reinforced the acceptability of Arab and Muslim death—from one-sided reporting to fostering fear of Arabs and Muslims. From Black Sunday (1977) to American Sniper (2015), Hollywood has sent one uninterrupted message: Arabs and Muslims are savage, misogynist terrorists; their lives have no value; and they deserve to die.
This interplay between the U.S. war agenda abroad and the U.S. corporate media extends directly into the lives of persons perceived to be Arab and/or Muslim in the United States. Hate crimes, firebomb attacks, bomb threats, vandalism, detention and deportation without evidence of criminal activity and more have all been well documented. Of course, such incidents escalated in the aftermath of the horrific attacks of 9/11. As the U.S. state and media beat the drums of war, anyone perceived to be Arab and/or Muslim (including Sikhs, Christian Arabs, and Arab Jews) became suspect. Muslim women who wore the headscarf became walking emblems of the state and media discourse of Islamic terrorism. Across the United States, at school, on the bus, at work, and on the streets, women wearing the headscarf have been bullied, have had their scarves torn off, and have been asked over and over why they support Al Qaeda, Saddam Hussein, terrorism, and the oppression of women.
Despite this, the corporate media (replicating the words of the police) and government officials have either reduced the North Carolina killings to a parking dispute or expressed grave confusion over why an angry white man would kill three Arab Muslim students in North Carolina execution-style. Yet the father of one of the women students stated that his son-in-law did not have any trouble with Hicks when he lived there alone. The trouble, he said, started only after Yusor, who wore a headscarf identifying her as a Muslim, moved in. Even so, Chapel Hill Mayor Mark Kleinschmidt told CNN that the community is still “struggling to understand what could have motivated Mr. Hicks to commit this crime,” adding, “It just baffles us.”
Alternatively, discussions insisting on the significance of Islamophobia in this case must go beyond the call for religious tolerance and account for the projects of U.S. empire building and war that underlie Islamophobia. Contemporary Islamophobia is a form of racism and an extension of U.S.-led war abroad. As I wrote in Arab America, immigrant communities from the regions of U.S.-led war engage with U.S. racial structures, specifically anti-Arab and anti-Muslim racism, as diasporas of empire—subjects of the U.S. empire living in the empire itself. Perhaps, then, we should also avoid applying the same analysis of racism across the board—as if all racisms are the same, or as if the framework of #blacklivesmatter can simply be transposed onto the killing of Arab Muslim Americans. Otherwise, we risk disremembering the distinct conditions of black struggle (and black Muslims), including the systematic state-sanctioned extrajudicial killing of black people by police and vigilantes, as well as black poverty and histories of slavery and mass incarceration. It is also important to remember the distinct conditions of the war on terror, whereby anyone and everyone perceived to be Muslim (including Arab Christians and Sikhs) is a potential target.
Rest in peace and power Deah Barakat, Yusor Abu-Salha, and Razan Abu-Salha. May your loved ones find strength and support. My heart is shattered.
Three Muslim Americans – Deah Shaddy Barakat, 23; his wife, Yusor Mohammad, 21; and her sister, Razan Mohammad Abu-Salha, 19 – were murdered last week in Chapel Hill, North Carolina by 46-year-old resident Craig Stephen Hicks. The tragedy has sparked a debate over whether these deaths were the result of a hate crime or a parking dispute.
Women take part in a vigil for three young Muslims killed in Chapel Hill, North Carolina. Photo: Mandel Ngan/AFP/Getty Images.
Muslim Americans who claimed that this was surely a hate crime were presented with evidence to the contrary. Hicks’s Facebook and other online posts revealed that he is an atheist who is against all religion, regardless of whether it is Islam, Christianity, or Judaism, a gun enthusiast, and an advocate for gay rights. His online posts show that he is passionate about the protection of constitutional rights, especially freedom of speech and freedom of religion. His archived posts even include commentary on the “Ground Zero mosque” controversy, in which he writes in support of Muslim rights and notes the important distinction between Muslims and Muslim extremists. His wife has insisted that the murders were the result of a parking dispute, and not a hate crime. As a result, Hicks has been portrayed as not hating Muslims.
This profile of Hicks is indeed complex. He does not fit the conventional profile of a “racist” – i.e., someone who believes that all Muslims are a threat to America; who clings to essentialist and binary notions of identities; who espouses that certain groups of people do not deserve human rights; who practices intentional bigotry; who is firmly rooted in a logic that justifies inequality. I am reluctant to use the term “racist” since it conjures an image of someone who participates in blatant and intentional forms of hate. However, what this case shows us is that we need a new paradigm to understand racialized violence today. Yes, this case is complex, but that does not mean it is not a hate crime. It is complex because it does not fit the narrow way in which we have defined a hate crime.
Posing an either/or option – either this is or is not a hate crime – does not help us understand what transpired. Racism is not an either/or phenomenon. It is deeply embedded in our society and, when left unchecked, has the potential to inform our perceptions and actions in ways not captured by a caricatured understanding of its diverse forms. Racism is not always conscious or intentional. It exists across a vast spectrum of consciousness and intentionality. As part of our subconscious, racism can manifest in the form of microaggressions that are often unintentional and sometimes even well-meaning. On the more dangerous side of the spectrum, it manifests in violence. We need to break the association of racism with intent because racism endures without it.
Our current cultural paradigm often makes a simplistic equation: Good people are well-intentioned and are therefore not racist; bad people are ill-intentioned and are therefore racist. Consequently, if the white police officers who killed Michael Brown and Eric Garner are considered “good people” by their friends, families, and colleagues, their actions cannot be deemed racist. Such a conclusion focuses solely on intent and overlooks how members of the police – like all of us – have been shaped and influenced by notions of black men as threatening and how such cultural imagery has, in turn, structured racialized violence.
The point is not that Craig Hicks is any more or any less racist than the white police officers who murdered Michael Brown, Eric Garner, and other black men. Indeed, the question of their individual, consciously expressed or felt racism does not help us to understand what happened or how to prevent it in the future; it just provokes denial and defensiveness. Conversely, claiming that we are “post-race” and/or denying that a particular incident has anything to do with race does not help us solve the problem of racialized violence.
The point is not whether Craig Hicks is any more or less racist than any of us; the point is that Craig Hicks lives and his victims died in a society that is structured by deeply institutionalized and culturally pervasive racism that exists regardless of whether any individual “wants” it to or not, and regardless of whether we as a society want to acknowledge it or not. We need a new paradigm, a new language and framework, to understand racialized violence today. Hicks’ profile provides an opportunity to challenge ourselves to rethink our understanding of racism and hate crimes in order to prevent murder.