Genocide denial by default

—Nicole Rafter

The great centennial commemoration of the Armenian genocide is almost over. With parades in San Francisco, Los Angeles, and New York City, massive rallies in Argentina, prayer services in Washington, D.C., historical displays at the Library of Congress, and a formal remembrance by the European Union, Armenians and their supporters have kept alive memories of the atrocities of 1915.

In Boston, over three thousand gathered at the Armenian Heritage Park to honor the 1.5 million Armenians slaughtered by the Turks, a genocide that saw men tortured and shot, women raped and beheaded, and children forced to jump into the Black Sea to drown. Pope Francis recognized the event as “the first genocide of the 20th century.”

Trouble is, the Pope—although admirable in his intentions—was wrong. So were others who memorialized the Armenians as the first 20th-century victims of mass atrocities.

The first victims of 20th-century genocide were in fact the Herero, a group of semi-nomadic tribes in South-West Africa (now Namibia). Before colonization by Germany began in the 1880s, the Herero’s tribal confederation consisted of about 85,000 people. Caught up in the “scramble for Africa,” German settlers moved into South-West Africa as if by right, taking the natives’ cattle, building railroads on their grazing lands, raping and shooting women, and flogging men to death until the Herero decided to rise up.

The Herero knew they could not possibly win a fight against the German settlers and their army. “Let us die fighting,” counseled one chief, “rather than die as a result of maltreatment, imprisonment, or some other calamity.”

The surviving son of a Herero leader said his father “knew that if we rose in revolt we would be wiped out in battle because our men were almost unarmed and without ammunition. The cruelty and injustice of the Germans had driven us to despair, and our leaders and the people felt that death had lost much of its horror in the light of the conditions under which we lived.”

In response to the uprising, the German emperor put the colony under military rule and sent in Lieutenant General Lothar von Trotha, who had already brutally suppressed rebellious blacks in East Africa. Delivering his opinion of “race war” with Africans, von Trotha declared that “no war may be conducted humanely against nonhumans.” To his soldiers (as to the general himself), Africans seemed more like “baboons” than human beings.

Hanged, burned, shot, starved, and driven into the desert to die of thirst, few Herero survived von Trotha’s extermination order. More than three-quarters died, while survivors became virtual slaves to the German settlers.

Germany held onto the colony for another decade but was forced out by an invasion from South Africa during World War I. After that, the British took control of what had once been Herero lands.

This was the first genocide of the 20th century. If the Herero genocide is more obscure today than the Armenians’, it may be because of race, location, and geopolitics. It is wonderful that we have, in the Armenian case, monuments and memorials commemorating white people who were targeted for extermination partly because the Turks wanted their land. At the same time, we should remember these black people who were targeted for extermination because Germany wanted African land.

Genocide denial comes in many forms. We are familiar with the brazen dismissals of Holocaust deniers. We are also familiar with Turkish insistence that their country did nothing but “relocate” the Armenians. A more subtle but equally insidious form of erasure is genocide denial by default—by inadvertence or ignorance.

Unfortunately, the Pope’s claim that the Armenian genocide was “the first genocide of the 20th century” marginalizes and ignores the near-extinction of the Herero.

This too is a form of genocide denial.

Nicole Rafter is Professor of Criminology and Criminal Justice at Northeastern University. She is the author of Criminology Goes to the Movies: Crime Theory and Popular Culture (NYU Press, 2011).

How email ruined my life

—Catherine Zimmer

I got my first email account the fall I started graduate school, in 1995. Even then I had an inkling of the pressures that would come to be associated with this miracle of communication. My entry into grad school coincided with a relationship transitioning into a long-distance one, and what at first was a welcome mode of remaining tethered soon enough became a source, and an outlet, of demand, anxiety, guilt, and recrimination.

This early experience of the pressure and affective weight of email faded into memory alongside that relationship, and certainly at the time it did not occur to me to hold the medium responsible for the messages. But over the past couple of years, now that I am lucky enough to be firmly cemented in an academic job and stupid enough to have taken on an administrative role, that experience has reemerged for me as something of a retroactive portent of the place email would come to hold in my life. Because as anyone in an even remotely bureaucratic environment will tell you, “email” is no longer simply a way of communicating with others, though periodically a message gets through that is significant enough that the medium becomes momentarily transparent. Email is now an entity in and of itself—a gargantuan, self-perpetuating and purely self-referential monstrosity that I do not “use” but barely “manage,” a time-sucking and soul-crushing mass that I can only chip away at in an attempt to climb over or around to get to an actual task.

From epidemic-level ADHD and eye fatigue to degenerative spinal conditions at younger and younger ages—not to mention my self-diagnosed early-onset thumb arthritis—constant interaction with digital devices has arguably had widespread health consequences. It is also fraught with an expansion, intensification, and perversion of the emotions associated with that first email account. But while then I attached those affective elements to a romantic relationship, they are now purely indicative of my relationship to email “itself”: the phenomenon that makes constant and growing demands on my time, attention, and energy, requiring that I devote at least a modicum of substantive thought to each individual expression of its enormous, garbage-filled maw. Time spent on email has grown to hours every day. This is not a measure of increased “productivity.” In fact it is just the opposite, as answering email has become the forest I have to machete my way through just to get to the things that actually constitute my job. And while I do get angry at the jaw-dropping idiocy of certain student emails (Hi Mrs. Zimmer can u send me the syllabus because the final is tomorrow and i missed the last eleven weeks of class) and irritated at the endlessly accumulating details of academic work (Dear Dr. Zimmer, this is a friendly reminder that the due date for the Learning Outcomes Rubrics Departmental Self-Assessment Model is March 23rd) ultimately each one of these individually maddening iterations is just a sign of the incomprehensible sprawl of the medium. And when factored in with texting, messaging, social media, streaming television, and any number of other incoming and outgoing information flows, the sense of being “overwhelmed” seems unsurprisingly ubiquitous.

Email is of course inseparable from the character of any digital labor and the economy of which it is a part: it thus becomes a useful metonymic device to understand how convenience has become so profoundly debilitating. Though no one explicitly states it (because it would sound insane), the demand that we keep up with and process this level of information, and communicate coherently in return, is a demand that the human brain function closer to the speed of digital communications. Needless to say, it does not. Thus the unparalleled levels of prescription of amphetamines and pain medications are not merely the triumph of the pharmaceutical industry, but an attempt to make the human brain and body function alongside and through digital mediation. The relative ease of communications, the instantaneity of information exchange, does not make our lives simpler: it means that we are asked to attend to every goddamn thing that occurs to the countless people we know, institutions we are a part of, and every other organization whose mailing list you have been automatically placed on simply by having a single interaction with them. It’s like being able to hear the internal mutterings of both individual people and cultural constructs: a litany of the needs of others and the expectations of the social sphere (not to mention my own neurotic meanderings when I have to construct a careful response to someone, or an email I have sent goes unanswered). Finding it increasingly impossible to recognize and affectively react only to the articulations of each missive, I respond instead to the cacophonous noise of the whole damn thing. That noise is now constant, while its volume ebbs and flows with the rhythms of the work year. As the only constant, email becomes an end in itself. Email never goes away. Email is an asshole.

It is not surprising that this self-perpetuating mode of interaction comes alongside a proliferation of (self-)assessment and (self-)documentation—talking about what you will, have, or are doing instead of just doing it. Thus the ability to communicate about everything, at all times, seems to have come with the attendant requirements that we accompany every action with a qualitative and quantitative discourse about that action. Inside and in addition to this vast circularity are all those things that one’s job actually entails on a practical, daily basis: all the small questions, all the little tasks that need to be accomplished to make sure a class gets scheduled, a course description is revised, or a grade gets changed. Given how few academic organizations have well-functioning automatic systems that might allow these elements to be managed simply, and that my own university seems especially committed to individually hand-cranking every single gear involved in its operation on an ad hoc basis, most elements of my job mean that emails need to be sent to other people.

Once I send an email, I can do nothing further until someone sends an email back, and thus in a sense, sending that email becomes a task in itself, a task now completed. More and more it is just a game of hot potato, with everyone supposedly moving the task forward by getting it off their desk and onto someone else’s, via email. Every node in this network is itself fighting to keep up with its own email, in the back and forth required before anything can actually be done. The irony of the incredible speed of digital mediation is thus that it often results in an intractable slowness in accomplishing simple tasks. (My solution has been to return to the telephone, which easily reduces any 10-email exchange to a 2-minute conversation. Sidenote: I never answer my own phone.)

In case it isn’t already clear, such an onslaught of emails, and the pressure of immediacy exerted sometimes explicitly but mostly by the character of the media, means that we no longer get to leave work (or school, or our friends or our partners). We are always at work, even during time off. The joy of turning on our vacation auto-reply messages is cursory, for even as we cite the “limited access” we will have to email (in, like, Vancouver), we know that we can and will check it. And of course we know that everyone else knows that it’s a lie. Even if we really do take time away from email, making ourselves unavailable (not looking at email, not answering our texts) does not mean email has not been sent to us and is not waiting for us. And we know it, with virtually every fiber of our being. Our practical unavailability does not mitigate our affective understanding that if we ignore email too long, not only will work pile up, but there will be emotional consequences. I can feel the brewing hostility of the email senders: irritated, anxious, angry, disappointed.

Even if I start to relax on one level, on another my own anxiety, irritation, and guilt begin to grow. Email doesn’t go away. It’s never over. It’s the fucking digital Babadook, a relentless, reflexive reminder of the unfathomable mass underlying every small transaction of information.

The nonstop stream of communication and its affective vortex are in part what philosopher Gilles Deleuze (and now many others) has described as “societies of control,” distinguished not by discipline but by the constant modulation and management of the flow of information. Ultimately we are exhausted by the endless negotiation of this unmappable terrain, and our personal and professional labors increasingly have the character of merely keeping ourselves afloat. Which is not to say that discipline no longer functions: those excluded from the privilege of control will often find themselves subject to the sharper baton of policing and incarceration.

There does appear to be increasingly widespread recognition that email is having a significant effect on both the amount of work one does and the increasing threat of that work to health and well-being. A widely and enthusiastically misreported news story that France had instituted a ban on sending work email after 6:00pm provided a much-needed salve for the idea that there is no outside to the onslaught. Never mind that this was a wishful, apocryphal version of a French labor agreement that in reality didn’t cut off email at any hour—the story still allowed France to perform its ongoing service as the English-speaking world’s fetish of a superior, butter-drenched, bicycle-riding quality of life, a life in which steak frites is now accompanied by possible escape from a particularly maddening incarnation of digital labor. That life is apparently now the stuff of internet rumor and media fancy.

The range of feelings I associate with the era of my first email account roll on through now and then as I check my inbox, and I could probably name them, though perhaps they were never discrete. And I understand that it is my job as a participant in digital culture to respond to email, and text, and instant messaging—in writing and in sentiment. But the truth is that I am just really tired. Perhaps the vacuum in affect attested to by the accumulation of emoticons and emojis has little to do with the flattening effect of digital communication. Maybe feelings are simply exhausted.

Catherine Zimmer is Associate Professor of Film and Screen Studies and English at Pace University in New York City. She is the author of Surveillance Cinema (NYU Press, 2015).

[This piece was originally posted on Avidly, Los Angeles Review of Books channel.]

‘Fun Home’ and Pride

—Amber Jamilla Musser

On June 7th, 2015, the musical Fun Home emerged triumphant. It won 5 Tony Awards, including Best Musical, Best Original Score, Best Book of a Musical, Best Lead Actor in a Musical, and Best Direction of a Musical. The significance of these wins cannot be overstated. A musical based on a graphic memoir featuring a lesbian, her gay father, and the rest of the family has been thrust into the purview of mainstream America—and really, who can resist having ALL of the feelings when Sydney Lucas sings “Ring of Keys”? Moreover, Jeanine Tesori and Lisa Kron have made history as the first women to win a Tony for best songwriting team.

It is clear that Fun Home gives people many reasons to be proud, especially in a month when we traditionally celebrate LGBT pride. One of the things that I find most moving about the musical (and the original graphic memoir by Alison Bechdel) is the way it actually subverts traditional narratives of pride and shame based on particular understandings of identity and masochism.

One of the conventional understandings of Pride is that it exists to celebrate triumph over homophobia and prejudice against LGBT people. That this narrative privileges a particular form of progress and has been easier for particular segments of the LGBT population is something that has been written about extensively by other queer studies scholars. In this post, I’m more interested in mentioning the ways that this conventional version of identity politics shores up a particular vision of masochism. One of the main arguments in my book Sensational Flesh: Race, Power, and Masochism is that the framework that we’ve been using to understand the relationship between individuals and power is masochism. In the book that means various things, but in the context of Pride, it has meant reveling in the wounds that produce LGBT identity—triumph would not be possible if there were no obstacle to overcome and the more wounds that are available, the more visible the triumph and the more celebrated the identity/person.

While I am not the first to describe this relationship between identity, woundedness, and masochism, I argue that this narrative frames our understanding of what it is to be an individual, so that those with the privilege of appearing wounded are able to do so because they are already part of an assumed arc of redemption and celebration, while those whose wounds are less affective and more structural in terms of access to resources cannot access this arc in the same way (see last year’s post on Kara Walker as an example).

On the surface, it would appear as though Fun Home could fall easily into this particular trope, but it smartly sidesteps the arc of progress. In her retrospective gaze at her family life and its relationship to her father’s gayness, Alison (the oldest version of the character that we see) doesn’t pity her father or frame his suicide as the effect of a bygone prejudice that she has been fortunate to avoid. The question is not what would have happened to Bruce Bechdel had he lived in an era when he could live freely as a gay man. Neither is the focus on Alison’s ability to come out as a college student and live as a butch because things are better now. The universe of the musical understands these characters as inhabiting different modes of queerness, but it doesn’t ask us to do a comparison (despite the fact that Bruce commits suicide, which would seem to be the ultimate masochistic act).

Instead, the character whose life we imagine might have been different is Bechdel’s mother, Helen, played achingly by Judy Kuhn, whose song near the end of the show, “Days and Days,” is a tearjerker—not because she is self-pitying but because she is resigned. This is structural difference at work. She knows that her suffering does not connect to later progress or triumph, but it does not diminish her work or her pain.

Where does this lacuna of feeling lie in a world structured by suffering or triumph, a world where the individual is a masochist in order to receive redemption through pity? Throughout the musical, we see so many moments when the semi-closeted world that Bruce inhabits, and that his daughter so desperately wants to remember and connect to, is not uniformly sad; there is fun—a dance with a casket, a furtive sighting of a kindred spirit (the butch that Lucas sings so movingly about). In all, it is not a play about moving through masochism to find identity, but about recognizing the many different notes being played at the same time. The arc of identity need not be neat or masochistic (so as to end in triumph), but it makes one feel, and gives reason for finding different narratives of individuality.

Amber Jamilla Musser is Assistant Professor of Women, Gender, and Sexuality Studies at Washington University in St. Louis. She is the author of Sensational Flesh: Race, Power, and Masochism (NYU Press, 2014).

Mad Men, Esalen, and spiritual privilege

—Marion Goldman

The online community is still pulsing with speculation about the final close-up of Don Draper meditating on the edge of the Pacific at Esalen Institute—where he found bliss or maybe just an idea for another blockbuster ad campaign.

The writers and set decorators of Mad Men got 1970s Esalen spot on: from the lone outside pay phone at the run-down central Lodge to the dozens of long-haired hippies, former beatniks and spiritual seekers revealing themselves to each other in encounter groups. The images are so accurate that an alternative cyber universe of old Esalen hands has been speculating about how the writers were able to depict the old days so well—and whether the morning meditation leader was supposed to be Zen trailblazer Alan Watts or edgy encounter group leader Will Schutz.

None of these debates matter much to the entrepreneurs who have transformed Esalen from a rustic spiritual retreat to a polished destination resort that serves gourmet meals and offers workshops with themes like ‘capitalism and higher consciousness.’ Soon after the last episode of Mad Men aired, Yahoo Travel published an article promoting a “Don Draper Weekend Getaway” for fortunate consumers who could foot the tab. The rates vary, but on a weekend, a premium single room at Esalen costs $450 per night and the prices go way up for luxurious accommodations overlooking the sea. In a throwback to the old days, there is a ‘hardship policy’—making it possible for up to a dozen people who take weekend workshops to spend ‘only’ about $200 a night to spread out their sleeping bags in meeting rooms that they must vacate between 9:00 in the morning and 11:00 at night.

When Esalen opened its gates in the 1960s, visitors and residents traded work for housing or paid what they could afford. The founding generation believed that everyone was entitled to personal expansion and spiritual awakening through the growing Human Potential Movement. My book, The American Soul Rush, chronicles how Esalen changed from being a mystical think tank, sacred retreat and therapeutic community into a wellness spa dedicated to de-stressing affluent customers with challenges at work or in their relationships.

In the late 1960s and early 1970s very different kinds of people drove along Highway 1 to Esalen, hoping to create better lives for themselves and often hoping to repair the world as well. They were spiritually privileged, with the time and resources to select, combine and revise their religious beliefs and personal practices. However, many of them were far from wealthy, because Esalen opened at a time of economic abundance that extended far down into the white middle class and there was widespread faith in unlimited possibilities for every American.

People in small towns and distant cities read long articles about Esalen and human possibilities in Life Magazine, Newsweek and other popular periodicals. Its key encounter group leader briefly became a celebrity when he appeared regularly on the Tonight Show Starring Johnny Carson. And during Esalen’s glory days, movie stars like Natalie Wood, Cary Grant and Steve McQueen regularly drove north from Hollywood to discover more about themselves and to soak in the famous hot springs baths. But once they arrived, they stayed in simple rooms, they were called only by their first names and other workshop participants tried to honor their humanity by treating the stars as if they were just like them.

Esalen was dedicated to opening the gates to personal and spiritual expansion to everyone and it fueled a Soul Rush. It popularized many things that contemporary Americans have added to their lives and can practice almost anywhere: yoga, mindful meditation, holistic health, humanistic psychology and therapeutic massage.

But most people can no longer afford to visit Esalen itself. A leader who left Big Sur to counsel clients in disadvantaged neighborhoods summed up how much the Institute has changed over the decades: “Damn,” she said, “I guess we got gentrified just like everybody else.”

Marion Goldman is Professor of Sociology and Religious Studies at the University of Oregon, and author of The American Soul Rush: Esalen and the Rise of Spiritual Privilege (NYU Press, 2012).

Celebrating Revolutionary Blackness: Haitian Flag Day

—Bertin M. Louis, Jr.

[This post originally appeared on Mark Anthony Neal’s blog, NewBlackMan (in Exile).]

In communities across the globe, thousands of Haitians celebrate Haitian Flag Day every May 18 at concerts and ceremonies, on the Internet, and at festivals and parades. The flag not only reflects pride in Haitian roots; it is also the flag of the first black republic in the world. Born of the Haitian Revolution (1791-1803), it takes on renewed meaning as an anti-racist symbol of revolutionary blackness and freedom in a continuing time of white supremacy and anti-blackness.

On May 18, 1803, in the city of Arcahaie, not far from Haiti’s current capital of Port-au-Prince, Jean-Jacques Dessalines, the leader of the blacks and the first leader of an independent Haiti, and Alexandre Pétion, the leader of the mulattoes, agreed on an official flag, with blue and red bands placed vertically. Haitian heroine Catherine Flon, who also served as a military strategist and nurse, sewed Haiti’s first flag. However, the flag was modified on Independence Day (January 1st), when the blue and red bands were placed horizontally, with the blue band on top of the red. Haiti used the red and blue flag until 1964, when President-for-Life François “Papa Doc” Duvalier adopted a vertical black and red flag and added a modified version of the arms of the republic; that flag remained in use through the regime of his son Jean-Claude, which lasted from 1971 to 1986. On February 25, 1986, after Jean-Claude “Baby Doc” Duvalier fled Haiti on an American-chartered jet and the Duvalier regime fell apart, the vast majority of the Haitian people requested that the red and blue flag be brought back. The red and blue flag remains the official flag of Haiti.

Haiti was the French colony of Saint-Domingue before the revolution. A 1697 treaty between the French and the Spanish created the colony on the western third of the island of Hispaniola. Saint-Domingue was known as “the pearl of the Antilles” because the industrialization of sugar in the region enriched its French absentee owners and made it one of the most successful sugar colonies in history. The arduous labor required for sugar production resulted in the virtual eradication of the indigenous Taino Arawak population and an average seven-year life span for Africans who were brought against their will. In an area roughly the size of Maryland, enslaved Africans produced indigo, tobacco, and, at one point in history, two-fifths of the world’s sugar and almost half of the world’s coffee.

Physical and psychological violence were used to maintain plantation production processes. As sociologist Alex Dupuy writes, it was not uncommon for slave masters to “hang a slave by the ears, mutilate a leg, pull teeth out, gash open one’s side and pour melted lard into the incision, or mutilate genital organs. Still others used the torture of live burial, whereby the slave, in the presence of the rest of the slaves who were forced to bear witness, was made to dig his own grave…Women had their sexual parts burned by a smoldering log; others had hot wax splattered over hands, arms, and backs, or boiling cane syrup poured over their heads.” Within this violent and dehumanizing environment, many enslaved Africans resisted and fought against their captors and participated in the most radical revolution of the “Age of Revolution.”

The Haitian Revolution was more radical than the American Revolutionary War (1775-1783) and the French Revolution (1789-1799) because it challenged chattel slavery and racism, the foundation of the American and French empires. As the late anthropologist Michel-Rolph Trouillot wrote: “The Haitian Revolution was the ultimate test to the universalist pretensions of both the French and the American revolutions. And they both failed. In 1791, there is no public debate on the record, in France, in England, or in the United States on the right of black slaves to achieve self-determination, and the right to do so by way of armed resistance.” The Haitian Revolution led to the destruction of plantation capitalism on the island where both modern-day Haiti and the Dominican Republic are located.

Through the efforts of black people and the leadership of Toussaint Louverture, British and Spanish forces were defeated and independence from the French colonial master was achieved. The only successful slave revolt in human history resulted in the formation of Haiti as the world’s first black republic, which extended the rights of liberty, brotherhood and equality to black people. Unlike the United States, where only propertied white males had the privilege of full citizenship, and unlike France, Haiti was the first country to articulate a general principle of common, unqualified equality for all of its citizens regardless of race.

The Haitian Revolution would spawn uprisings among captive Africans throughout the Caribbean and the United States. The revolution also influenced other Western Hemispheric liberation movements. Haitian blogger Pascal Robert observes that Venezuelan military and political leader Simon Bolivar went to Haiti to receive military assistance and material support from Haiti’s then-president Alexandre Petion. Bolivar used those Haitian connections to liberate colonial territories from Spanish rule. The Haitian flag reflects and symbolizes this unique and promising moment for people of African descent – black freedom in a world dominated by white supremacy.

Haitian Flag Day celebrations take on renewed meaning when we recall the recent treatment of Haitians in the Western Hemisphere. In February 2015 a young Haitian man was lynched in the Dominican Republic. This lynching occurred at a time when the Dominican state has revoked the citizenship of Haitian-descended Dominicans. Essays from sociologist Regine O. Jackson’s edited volume Geographies of the Haitian Diaspora (Routledge, 2011) discuss how Haitians serve as repugnant cultural “others” in Jamaica, Guadeloupe, and Cuba. In Haiti, a post-earthquake cholera outbreak introduced by Nepalese soldiers from the United Nations Stabilization Mission in Haiti (MINUSTAH) has claimed 9,000 Haitian lives and affected more than 735,000 people. This preventable tragedy is in addition to earthquake aid that did not go to Haitians but mostly went “to donors’ own civilian and military entities, UN agencies, international NGOs and private contractors.” A recent essay from Latin Correspondent reporter Nathalie Baptiste recognizes anti-Haitian policies in Brazil, Canada, the Dominican Republic and the United States.

While we must attend to the differences in the local histories, varying socioeconomic factors and political situations of each country mentioned, a pattern of alienation, expulsion, elimination, marginalization and stigmatization of Haitians is evident when reviewing recent news and scholarly publications.

Anti-Haitianism is also prevalent in the Bahamas, where I conduct anthropological research and where a new immigration policy adversely affected Haitians. A brief anecdote that I discuss in my book My Soul Is in Haiti: Protestantism in the Haitian Diaspora of the Bahamas (NYU Press, 2014) illustrates this fact. Towards the end of ethnographic research in New Providence, I was invited by a Bahamian friend to speak about the importance of education to elementary school children at an afterschool program. The children, who all sat around me in a circle, were black. As I spoke to them about the importance of reading, studying, doing well on tests, and getting help when they encountered difficulties, one girl was struck with a look of astonishment when I mentioned that I was of Haitian descent. After my speech I took the opportunity to ask her why she was so stunned. She replied that I didn’t look Haitian to her but that I looked Bahamian. So I asked her, “So what does a Haitian look like?” Speaking in Bahamian Creole, she and her friends replied that Haitians were “scrubby,” meaning that they have an uneven or mottled dark complexion. They also said of Haitians that “Dey (They) black,” “Dey smell bad” and “Dey look like rat.”

These comments came from children who are of African descent (85 percent of the Bahamas is black), and the darkest black-skinned Bahamian child in that group said that Haitians were “scrubby.” This story from the field reflects the current crisis in Haitian identity in the Western Hemisphere and why it is necessary to celebrate Haitian Flag Day as a way to resist the dehumanizing effects of anti-blackness. Anti-blackness is a key component of white supremacy: “an historically based, institutionally perpetuated system of exploitation and oppression of continents, nations, and peoples of color by white peoples and nations of the European continent, for the purpose of maintaining and defending a system of wealth, power, and privilege.” In this example, young Bahamian children do the work of white supremacy through their use of anti-Haitian and anti-Black stereotypes.

The stigmatization of Haitians in the Western Hemisphere should alarm other black people because Haitian instability also reflects the current insecurity of blacks around the globe. The deaths of West African migrants in the Mediterranean on their way to Europe, Ethiopian Jews who are encouraged to either leave Israel or be imprisoned, police brutality against blacks in favelas in Brazil, and attacks against African immigrants by black South Africans should remind us of this ongoing crisis, which many people view as normative (i.e. there’s always death and destruction among Africans and in the African Diaspora). But we do not have to look outside of the borders of the United States to understand the deprivation of the humanity of black people. The current #BlackLivesMatter movement against police killings of unarmed black people is another reminder of the disposability of black life in the modern world, which continues a pattern of anti-blackness that harkens back to the transatlantic slave trade.

Anti-blackness began with the forced marches of Africans from the interiors of the continent to African coasts where they were sold as chattel and would become the engine that fueled European colonial wealth. It continued during the Middle Passage where white captains tightly packed blacks together on slave ships and threw black bodies into the Atlantic Ocean with the hope that large numbers of human cargo would offset increased deaths. Anti-blackness was codified in the colonies and territories where the legally imposed identity of slave was passed from mother to child and became associated with blackness.

Anti-blackness is prevalent during this contemporary period in the media coverage of the killings of Walter Scott and Eric Garner as corporate news channels show their video-recorded killings at the hands of American law enforcement on a loop and refer to the black youth of Baltimore rebelling against unequal treatment under the law as “thugs.” Anti-blackness is also reflected in the current relations between Haitians and the nations they live in as well as how other countries treat people of African descent.

In closing, the Haitian flag reminds us that white superiority and black inferiority are fallacies with no basis in biology, and that white supremacy can be challenged and defeated, as the Haitian Revolution demonstrated. Due to the poor treatment of Haitians throughout the Western Hemisphere, we should also understand why Haitians are proud of their heritage and celebrate the anniversary of their flag. But the Haitian flag is also a flag that belongs to people of African descent around the globe, as do other flags. It is one of many symbols that Haitians and other people of African descent should utilize in resistance to the dehumanizing and deadly effects of capitalism, state power and white supremacy on black bodies. Overall, Haitian Flag Day should remind all of us to celebrate revolutionary blackness and to continue to challenge white supremacy in the struggle to create dignified lives for black people worldwide.

Bertin M. Louis, Jr. is the author of My Soul Is in Haiti: Protestantism in the Haitian Diaspora of the Bahamas (NYU Press, 2014) and an Assistant Professor of Anthropology and Africana Studies at the University of Tennessee, Knoxville. He is also the creator of #ShamelesslyHaitian, a Twitter event where Haitians express pride and educate others about their history and culture on Haitian Independence Day and Haitian Flag Day. Follow him on Twitter @MySoulIsInHaiti.

Ferguson, race, and the disability politics of the teen brain

In an article published on Somatosphere this week, author Julie Passanante Elman discusses race, disability, and the volatile teen brain. Read an excerpt from the essay below—and be sure to check out more from the website’s series, Inhabitable Worlds.

In February 2014, University of Missouri students made national news when they formed a human wall to protest the Westboro Baptist Church’s presence on their campus. Westboro arrived to denounce Michael Sam, a gay “Mizzou Tiger” who would become the first openly gay player drafted into the NFL. Mizzou students eagerly donned “Stand with Sam” rainbow buttons and “WE ARE ALL COMOSEXUAL” t-shirts (an homage to “COMO,” or how locals refer to Columbia, MO). The nation turned its collective eye to “The Middle,” a North American region that has been associated (at times, stereotypically, by those on the coasts) with religious conservatism, provincialism, and intolerant attitudes toward cultural difference or sexual non-normativity. Rather than asking “what’s the matter with Kansas?” in frustration, onlookers celebrated Missouri’s anti-homophobic moment of conviction, its investment in creating an “inhabitable world” for queers living outside metronormativity’s coastal enclaves.

While one “Missouri Mike” made his NFL bid, another would never arrive on his campus or attend his first college class. On August 9, 2014 in Ferguson, MO, Michael Brown, an unarmed African-American teenager, was fatally shot by Darren Wilson, a white police officer. His body lay in the street for four hours, as his blood pooled on the asphalt, warmed by the same unyielding Missouri sun that shone on MU’s Francis Quadrangle as students returned in late August. Mizzou students returned to a very different campus. Many of my students were returning from their childhood homes in St. Louis and its neighboring suburbs. Many were from Ferguson. Others were the sons and daughters of St. Louis-area police officers.

In late November, Governor Jay Nixon declared a state of emergency in Ferguson nearly a week before the grand jury decided not to indict Wilson. Politically committed MU students quickly mobilized to support the Ferguson protests. Using the social media handle “MU4Mike,” students organized die-ins in the student center and City Hall and were supported by a variety of faculty, including a Vice Chancellor.

Mizzou’s Facebook page posted photos of the event (including the one above), which incited a variety of hateful responses:

  • “White lives matter too!”
  • “…[B]lack lives appear to matter to everyone but black people…the black community is the one offing themselves in record numbers, not white cops defending themselves from charging aggressors.”
  • “Raise your kids not your hands.”
  • “How stupid. All lives matter. Stop wallowing in self pity [sic]. This was and is not a race issue. Get real.”

Meanwhile, campus police monitored the MU Gaines/Oldham Black Culture Center after an anonymous threat to the center (“Let’s burn down the black culture center & give them a taste of their own medicine.”) appeared on YikYak, a mobile, anonymous social media application.

Perhaps no image better encapsulates the abruptness with which Mizzou’s political landscape shifted than this screenshot of Mizzou’s Facebook page:

Enveloped in hopeful sunlight, an African-American student stands with his hands raised in peaceful protest. He stands in stark contradistinction to racist comments (“Your [sic] a thug bro!!”) as well as a meme of a white father and son pointing, as if to the man in the photo, to proclaim, “Look son, a faggot!” Less than ten months after the campus had “Stood with Sam,” entangled racism and homophobia seemed more virulent than ever.

As an MU faculty member, I wanted to contribute my perspective to this special series—first off—to spotlight our students’ courageous (and ongoing) activism to make Mizzou a more inhabitable world for all of its students. As a critical queer/disability studies scholar contemplating Ferguson, I am thinking of the challenging questions posed by queer/disability activist Eli Clare, who invites us to map the sedimentary layers of injustice.

Continue reading on Somatosphere.

Artist as ethnographer: Jason Whitesel on Books Combined

—Jason Whitesel

[This article was originally posted on Books Combined, a collaborative blog launched by our friends at Combined Academic Publishers.]

Growing up, I found the human body an abundant source of artistic inspiration. Painting and drawing were a significant part of my life from grade school on into my early years of graduate school. I did mostly figure drawing and self-portraits – my favorite artist at the time was Egon Schiele. Certainly my emotional state pulsated through my artwork: yet it was not the inner world of my imagination that I sought to express, but always direct observation of the world around me.

Later, ethnographic research appealed to me for the same reason: it engaged me in direct observation. When I think about the books that first lit my intellectual fire and subsequently shaped my career, they were all ethnographies. I was introduced to ethnography and the sociology of everyday life when I was an undergraduate. For me, they’re a natural fit with the perspective I take in my artwork. Conducting ethnographic research allows me to pay attention to the rich details of things we usually take for granted and help the reader visualize the community/culture I am studying by painting a vivid, “thick description” of it.

Of course, I am not the only one to think of ethnography in terms of artwork. In an undergraduate class on sociological fieldwork, I learned from Writing Ethnographic Fieldnotes (1995) by Robert Emerson et al. that fieldworkers, struck by a vivid sensory impression, sketch the social scene, depicting it like a still life, providing detailed imagery from the field. Likewise, when writing my recent book, I consulted John Van Maanen’s Tales of the Field: On Writing Ethnography (1988) in which he speaks of confessional tales of ethnographers being similar to self-portraits, where one tries to show the biases and character flaws the fieldworker brings to the ethnographic table.

Among the ethnographies that I cherish is Marcia Millman’s Such a Pretty Face: Being Fat in America (1980), a social psychologically oriented comparative ethnography of three groups: the National Association to Aid Fat Americans (NAAFA) – now it reads “to Advance Fat Acceptance”; Overeaters Anonymous; and a summer diet camp. The book takes off with the idea that fat is a feminist issue. It contains autobiographical stories collected through in-depth interviews and thoughtful observations in each of the three organizations, their meetings, pamphlets, and booklets. When I first encountered this book, little did I know that approximately ten years later I would embark on a research project to expand upon this classic by engaging gay men’s perspectives as they worry about their weight in meaning-laden ways.

Carol Brooks Gardner’s Passing By: Gender and Public Harassment (1995) is another ethnography that had a significant impact on my life. Anytime I have to sit down and start writing up my own work, I pull out the book and thumb through it, feeling certain that inspiration will seep in by osmosis. Gardner, who has been my mentor, studied under Erving Goffman, a professor of Anthropology and Sociology at U Penn. In 1979, in his book Gender Advertisements, Goffman used a micro-sociological approach to decode gender displays in advertising. Gardner applies and extends his concepts to explore unwanted public attention women receive from men on the street and in semi-public places like a department store. Through 506 interviews and five years of public observation in a Midwestern city in the U.S., she documents the various indignities women and other situationally disadvantaged groups are made to suffer and how such experiences erode these groups’ trust in public civility, and wear away at their psyche, constraining the way women engage with and enjoy public places or contributing to their fear thereof.

I can trace my intellectual pedigree to Goffman not only through Carol Gardner, but also through folklorist Amy Shuman, another significant mentor of mine who was also one of Goffman’s students. In graduate school, I took “Folklore Field Methods” and a seminar on “The Rhetoric of Ethnography” with Shuman, who introduced me to Goffman’s ideas about narrative. At the time she was preparing her own book Other People’s Stories: Entitlement Claims and the Critique of Empathy (2010). Through Shuman’s eyes, I began to see Goffman’s work in a different way; it was about how people create themselves through narrative. I came to understand that Goffman was not just interested in the public performance of identity where the self emerges as a series of façades, but also in the ways narrative opens up an avenue for one to make sense of one’s self, no matter how untenable one’s position may be.

As an artist and an ethnographer, I found these books, above all others, to have helped me build bridges between my creative and scholarly ways of seeing the world.

Jason A. Whitesel is a Women’s and Gender Studies Department faculty member at Pace University. His research focuses on gay men’s rigid body image ideal and the resulting intragroup strife among them. His recent book, Fat Gay Men: Girth, Mirth, and the Politics of Stigma (NYU Press, 2014) describes events at Girth & Mirth club gatherings and examines how big gay men use campy-queer behavior to reconfigure and reclaim their sullied images and identities.

Why has TV storytelling become so complex?

—Jason Mittell

[This post originally appeared at The Conversation.]

If you watch the season finale of The Walking Dead this Sunday, the story will likely evoke events from previous episodes, while making references to an array of minor and major characters. Such storytelling devices belie the show’s simplistic scenario of zombie survival, but are consistent with a major trend in television narrative.

Prime time television’s storytelling palette is broader than ever before, and today, a serialized show like The Walking Dead is more the norm than the exception. We can see the heavy use of serialization in other dramas (The Good Wife and Fargo) and comedies (Girls and New Girl). And some series have used self-conscious narrative devices like dual time frames (True Detective), voice-over narration (Jane the Virgin) and direct address of the viewer (House of Cards). Meanwhile, shows like Louie blur the line between fantasy and reality.

 

Many have praised contemporary television using cross-media accolades like “novelistic” or “cinematic.” But I believe we should recognize the medium’s aesthetic accomplishments on its own terms. For this reason, the name I’ve given to this shift in television storytelling is “complex TV.”

There is a wealth of facets to explore about such developments (enough to fill a book), but there’s one core question that seems to go unasked: “Why has American television suddenly embraced complex storytelling in recent years?”

To answer, we need to consider major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.

A business model transformed

We can quibble about the precise chronology, but programs that were exceptionally innovative in their storytelling in the 1990s (Seinfeld, The X-Files, Buffy the Vampire Slayer) appear more in line with narrative norms of the 2000s. And many of their innovations – season-long narrative arcs or single episodes that feature markedly unusual storytelling devices – seem almost formulaic today.

What changed to allow this rapid shift to happen?

As with all facets of American television, the economic goals of the industry are a primary motivation for all programming decisions.

For most of their existence, television networks sought to reach the broadest possible audiences. Typically, this meant pursuing a strategy of mass appeal featuring what some derisively call “least objectionable programming.” To appeal to as many viewers as possible, these shows avoided controversial content or confusing structures.

But with the advent of cable television channels in the 1980s and 1990s, audiences became more diffuse. Suddenly, it was more feasible to craft a successful program by appealing to a smaller, more demographically uniform subset of viewers – a trend that accelerated into the 2000s.

In one telling example, FOX’s 1996 series Profit, which possessed many of contemporary television’s narrative complexities, was quickly canceled after four episodes for weak ratings (roughly 5.3 million households). These numbers placed it 83rd among 87 prime time series.

Yet today, such ratings would likely rank the show in the top 20 most-watched broadcast programs in a given week.

This era of complex television has benefited not only from more niche audiences, but also from the emergence of channels beyond the traditional broadcast networks. Certainly HBO’s growth into an original programming powerhouse is a crucial catalyst, with landmarks such as The Sopranos and The Wire.

But other cable channels have followed suit, crafting original programming that wouldn’t fly on the traditional “Big Four” networks of ABC, CBS, NBC and FOX.

A well-made, narratively complex series can be used to rebrand a channel as a more prestigious, desirable destination. The Shield and It’s Always Sunny in Philadelphia transformed FX into a channel known for nuanced drama and comedy. Mad Men and Breaking Bad similarly bolstered AMC’s reputation.

The success of these networks has led upstart viewing services like Netflix and Amazon to champion complex, original content of their own – while charging a subscription fee.

The effect of this shift has been to make complex television a desirable business strategy. It’s no longer the risky proposition it was for most of the 20th century.

Miss something? Hit rewind

Technological changes have also played an important role.

Many new series reduce the internal storytelling redundancy typical of traditional television programs (where dialogue was regularly employed to remind viewers what had previously occurred).

Instead, these series subtly refer to previous episodes, insert more characters without worrying about confusing viewers, and present long-simmering mysteries and enigmas that span multiple seasons. Think of examples such as Lost, Arrested Development and Game of Thrones. Such series embrace complexity to an extent that they almost require multiple viewings simply to be understood.

In the 20th century, rewatching a program meant either relying on almost random reruns or being savvy enough to tape the show on your VCR. But viewing technologies such as DVR, on-demand services like HBO GO, and DVD box sets have given producers more leeway to fashion programs that benefit from sequential viewing and planned rewatching.

Serialized novels, like Charles Dickens’ The Mystery of Edwin Drood, were commonplace in the 19th century. (Image: Wikimedia Commons)

Like 19th-century serial literature, 21st-century serial television releases its episodes in separate installments. Then, at the end of a season or series, it “binds” them together into larger units via physical boxed sets, or makes them viewable in their entirety through virtual, on-demand streaming. Both encourage binge watching.

Giving viewers the technology to easily watch and rewatch a series at their own pace has freed television storytellers to craft complex narratives that are not dependent on being understood by erratic or distracted viewers. Today’s television assumes that viewers can pay close attention because the technology allows them to easily do so.

Forensic fandom

Shifts in both technology and industry practices point toward the third major factor leading to the rise in complex television: the growth of online communities of fans.

Today there are a number of robust platforms for television viewers to congregate and discuss their favorite series. This could mean partaking in vibrant discussions on general forums on Reddit or contributing to dedicated, program-specific wikis.

As shows craft ongoing mysteries, convoluted chronologies or elaborate webs of references, viewers embrace practices that I’ve termed “forensic fandom.” Working as a virtual team, dedicated fans embrace the complexities of the narrative – where not all answers are explicit – and seek to decode a program’s mysteries, analyze its story arc and make predictions.

The presence of such discussion and documentation allows producers to stretch their storytelling complexity even further. They can assume that confused viewers can always reference the web to bolster their understanding.

Other factors certainly matter. For example, innovative writer-producers like Joss Whedon, J.J. Abrams and David Simon have harnessed their unique visions to craft wildly popular shows. But without the contextual shifts that I’ve described, such innovations would have likely been relegated to the trash bin, joining older series like Profit, Freaks and Geeks and My So-Called Life in the “brilliant but canceled” category.

Furthermore, the success of complex television has led to shifts in how the medium conceptualizes characters, embraces melodrama, re-frames authorship and engages with other media. But those are all broader topics for another chapter – or, as television frequently promises, to be continued.

Jason Mittell is Professor of Film & Media Culture and American Studies at Middlebury College. He is the author of Complex TV: The Poetics of Contemporary Television Storytelling (NYU Press, 2015), and co-editor of How to Watch Television (NYU Press, 2013).