Social Text

Published by Duke University Press
Online ISSN: 1527-1951
Print ISSN: 0164-2472
Social Text 20.3 (2002) 79-99 This essay is about how women's rights as a complicated discourse, and the burkha as a complex symbolic, are the sites from which to understand the complexity of global power struggles at this moment. But first a note of context is necessary to clear some space for thinking—openly, critically, historically—in terms of a before and after of September 11. September 11 has not changed everything. It has just made clear how much context and perspective and location matter. Ask the people of Chile about September 11—when their beloved president, Salvador Allende, was gunned down in a coup d'état supported by the United States. Ask them the meaning of trauma and grief. Think back to the Gulf War and U.S. militarist terrorism of its smart bombs. Think across and beyond to the children of Iraq, today, this minute, who need cancer drugs or textbooks for their schools and cannot have them because of the economic sanctions imposed on their country. Do what women always do—multitask, so that you are not simply concentrated on yourself, or the United States, or this moment. Please remember: The U.S. economy was in trouble before September 11; Boeing was angling for its defense contract before September 11; the airlines were in trouble before September 11. Also please think about the 3,000 wonderful people who were murdered on September 11, who came from over sixty different countries; the horrible tragedies in Nigeria and Sudan; the high school students like my daughter who were expected to wear flag pins and would not; the hundreds of thousands of workers who have lost their jobs since September 11; the incredible profits being made by the military-industrial complex on the present war; that Planned Parenthood has faced anthrax threats for years; that college campuses are being targeted as sites of antipatriotism. Try to see what is not easily visible. Rethink invisibility; rethink as overt the covert realms of power that are not being named. 
Do not give in to the falseness of the moment. This is a time of insecurity and trouble. Do not pretend that having to use a plastic spoon to spread cream cheese on your bagel in the airport—instead of a plastic knife—makes you safe. None of us will be safe until the world embraces democracy for us all. A masculinist-militarist mentality dominates on both sides of the ill-named East/West divide. The opposition implied by this divide is not simple or complete. Flows between these locations have always existed and this is the case today more than ever. Further, the two sides of the divide share foundational relations, even if differently expressed, especially in terms of male privilege. Neither side embraces women's full economic and political equality or sexual freedom. In this sense fluidity has always existed in the arena of women's rights and obligations between the two. The Taliban's insistence on the burkha and the U.S. military's deployment of women fighter pilots are used to overdraw and misrepresent the oppositional stance. At present, economic flows of the global economy simply lessen the divide further. The bin Laden family itself represents this form of globalism. The family's money is tied to multiple Western investments such as General Electric, Goldman-Sachs, Merrill Lynch, Microsoft, and Boeing. One can easily assume that bin Laden's fury is directed as much at his family as at the West, which is a deadly combination. The quick and easy East/West divide is also not helpful politically, as the United States champions democracy while banding together with military dictators and kings. As I try to think through these post-September 11 moments, I feel compelled to locate and name the privileging of masculinist power with all its destructiveness. The silencing of women's unique voices at this moment, but most especially the voices of Afghan women and feminists—who criticized the early U.S. support of the Taliban—needs to be exposed. 
Women have been fighting and resisting the Taliban as well as other forms of Muslim fundamentalist misogyny for decades. Fundamentalist...
 
Social Text 21.2 (2003) 25-48 One of the difficulties in discussing violence against Palestinians during the 1948 war is that "Palestine," the site of the violence, both persists and has ceased to exist. Its simultaneous presence and erasure occurs in part through the survival of Palestinians from the 1948 war in what has ceased to be Palestine. Their scattered yet persistent presence constitutes a thread with which one can return to that moment when Palestine was ruined. They embody the survival of Palestine, yet also stand for its death. This death continues both to impede their memory of what happened in 1948 and to structure it. The memories of the Palestinian survivors constitute a challenge to another kind of thread (mis)leading us back to 1948. The war also resulted in a birth—of a new Jewish state attempting to deny the simultaneous death that gave rise to its creation. The documents recording the birth of the state attempt to conceal the death of Palestine. Documents recording the birth of the Jewish state and memories recollecting the death of Palestine were recently put on trial in Israel. The case involved a disputed historical account of the seizure of the Palestinian village of Tantoura by Zionist forces during the 1948 war and what were called the "exceptional acts of killing" that followed the village's surrender. This essay examines the memories of the Tantoura survivors and the dynamics of death/survival that structure them. It considers how legal rules of evidence and historical argument situated these memories and produced truths that transformed and often excluded them. The essay tries to rescue these narratives from the limits of positivist historiography and law, offering an alternative way of understanding them. Theodore Katz, an Israeli Jewish graduate student in history at Haifa University, wrote a master's thesis in March 1998 about the exodus of Arabs from five Palestinian villages in 1948. 
The thesis received an exceptionally high grade. It was a product of microhistorical research on five Palestinian villages located on the Mediterranean coast between Haifa and Hedera, with a special focus on two villages: Umm al-Zeinat and Tantoura. For the chapter on Tantoura, Katz wove the stories of the Tantoura refugees with those of the veterans of the Alexandaroni Brigade, the unit of the Israeli army that captured the village, and the official records he located in various Israeli archives. The testimonies included reports concerning the killing, or massacring, of the village's inhabitants by Alexandaroni fighters on the night of 22 May 1948, after the inhabitants had surrendered. Amir Gilat, a journalist, read the thesis and published an article outlining the conclusions of the chapter on Tantoura in the widely read Hebrew newspaper Ma'ariv. Gilat interviewed some of Katz's witnesses, both Tantoura refugees and Alexandaroni veterans. The Palestinians talked about the massacre that took place after the occupation of the village, while the Alexandaroni fighters denied it. Gilat also solicited the opinion of several academics, some of whom praised the thesis while others dismissed it as a work of fabrication. However, what the thesis labeled as "exceptional acts of killing" after the occupation of the village were transformed in the media discussion that followed into talk about a massacre, partly drawing on the vocabulary of the Palestinian survivors and of the Israeli academics who praised the thesis. Following the public debate about the thesis, the Alexandaroni Brigade veterans' association sued Katz for libel, seeking NIS (New Israeli Shekels) 1.1 million ($250,000) in damages. Katz was understood to have argued that after the fall of Tantoura and its inhabitants' surrender, Zionist soldiers entered the village, deported the women, old men, and children to the nearby village of Furaydis, killed some 200 to 250 men, and took the remaining men as prisoners. 
Some of the killed were executed in groups on the shore. Others were killed in a rampage unleashed by soldiers' rage at shots fired (with lethal results for one, two, or eight of them) after the village had officially surrendered. The imprisoned men were held for a year and a half. After their detention, most of them were expelled to what...
 
Social Text 21.2 (2003) 75-94 Although widely acknowledged as a universal phenomenon, collaboration has remained understudied in the social sciences. Robinson (1972), who recognized the significance of collaboration for the functioning and the effectiveness of regimes that principally rely on political control, argued that collaboration emanates from both compelling circumstances that the regime creates and a local social system conducive to such behavior. Guha (1997) maintained that collaboration is only one aspect of such circumstances. Resistance comprises the other side of the coin. It is unfortunate that the insights of Robinson and Guha, which were developed within the discussion on colonialism, have not been elaborated and extended so as to deal with deeply divided societies governed by relations of control. The main argument in this article is that the Palestinians in Israel (those living within the 1948 Green Line) were incorporated by the Israeli state after 1948 through a system of collaboration. Theoretically the article relies on insights from Robinson (1972) and Guha (1997) as well as additional sources from the Gramscian tradition. Meanwhile, the historical framework was dictated by theoretical and practical considerations. While before 1948 organizations of the Yishuv—the Jewish community in Palestine—had been successful in attaining the collaboration of individual Palestinians through bribery, since the establishment of Israel in 1948 collaboration has become the official system of the minority's incorporation. Because the interest in this article is in collaboration as a system of governance rather than as a phenomenon relating to the behavior of individuals, 1948 will be regarded as the point of departure. 
The discussion ends in the early 1970s because reliable archival materials are unobtainable thereafter; nonetheless, the consequences for the Palestinian minority of the system of collaboration have continued to unfold into the present, a subject which will also be theorized and discussed. Within this time frame, the system of collaboration has changed, and along with that its superstructure of cultural codes, vocabulary, subtexts, and myths has also been altered. These changes will be analyzed within their historical context. The 1948 war is one of those momentous events about which the fourteenth-century Arab social historian Ibn Khaldun wrote: "When there is a general change of conditions, it is as if the entire creation had changed and the whole world been altered" (quoted in Hourani 1992, 3). Indeed, Al-Nakbah (the immense catastrophe) challenged the basic tenets of the human existence of the Palestinians; the homeland, the place of residence, the land—a major source of wealth, dignity, and influence—and the physical and cultural environment, the validity and endurance of which Palestinians had never questioned, turned out to be most insecure. Even the very existence of Palestinian society as an imagined community ceased to be taken for granted. After the end of the war, only a small minority of 160,000 out of the 900,000 Palestinians remained in the part of Palestine upon which Israel was established (Abu-Lughod 1971; see also Hadawi 1967). They were ruled by a military government that restricted their movements, controlled various aspects of their lives, and acted as a tool for the expropriation of the bulk of their lands (Jiryis 1976; Lustick 1982; Kretzmer 1987). After eighteen years, the military government was replaced by a system of control that was more elaborate and less visible (Lustick 1982). This shattering and disorienting transformation led to the accentuation of personal survival among the Palestinians. 
Moreover, their dependence upon the regime, their lack of secure prospects, and their precarious existential conditions constituted persuasive circumstances for some Palestinians to collaborate. Even during the war, the Israeli regime had begun to use those who collaborated with the Yishuv in the institutionalization of its rule over the captured Palestinian localities. Such collaborators acted as informers, and in some cases they were elevated to the status of intermediaries. One of the significant assignments that collaborators with the Yishuv were able to carry out during the war was the defection of the Druze unit from the Arab fighters' side. Jabber Dahish Muadi, Labib Abu-Rokon, and Saleh Khnefis, who played a vital role in this affair, were later awarded seats in the Knesset...
 
Social Text 20.3 (2002) 1-8 Introduction The opening ceremonies of the 2002 Winter Olympics in Salt Lake City featured, among the nods to Utah Native Americans and culturally diverse musicians, a U.S. flag disinterred from the carnage of the World Trade Center. The cause of some initial discomfort to officials of the International Olympic Committee, the wounded flag did make it to the February 8 event, carried into the stadium before a hushed crowd of 55,000. Too fragile to fly, this new symbol of global unity bore the hurt of all civilized nations. Yielded from the ground of ontological innocence, a space of victims and heroes, the flag arose phoenixlike from the ashes. Such are the conditions under which the catastrophe—encoded most simply as 911—has continued to circulate. The Olympic episode would stand as a banner for international cooperation, even as one nation exercised a supreme unilateralism that was reconciled with calls for infinite retribution. From Ground Zero, a new era dawned as the flag moved from the fallen global pinnacle to the world's level playing field. Henceforth, it was presumed, everything would be different. Whatever was building before that day—especially doubt about the fairness of the world's field—would have to be forgotten. For those of a critical disposition, the urgency would seem to be to remind the public of those other times, of those prior issues that remain. So, the Dickensian terms of 911 have emerged: the best of times, the worst of times; everything has changed, nothing has changed. Whatever the bleak remnants of 911, it continues to stand as a Manichean frame of all-or-nothing that can only wreak havoc on the Left, which is spurred to imagine its own conditions of public access as existing in a state of emergency. To accept that everything is now different invites amnesia but also manacles the future to official crisis management. 
Simple refusal of these declared new times is, at best, unnewsworthy, and at worst, self-anesthetizing to what it is now possible to say. The cult of the news that raises the specter of public access clashes with those very critical traditions that would ennoble the voices of opposition. The results are bound to be disorienting and self-censorious to radical intervention long after the dust has settled. Whatever historical and political-economic analysis can be brought to bear on the straitjacket of 911 as an event needs to be coupled with an unhinging of the conditions under which the Left intervenes. This special issue of Social Text is devoted to opening up both the analysis and the interventions, to complicate the terms of good and evil, under the shadow of which we are supposed to think our world and operate within it. Our contribution comes amid many journals of leftist tendency that have had to grapple with the problem of publishing after the fact under the presumption of continued urgency to complicate reductive terms of public reception. Manichean narratives are always tempting because they give us a false sense of moral security, wrapping us in a narcissistic cocoon, allowing us to digest the indigestible, to assimilate the unacceptable. Within this discourse, an orderly and peaceful world has been subjected to arbitrary and irrational attack, and our own regenerative violence will restore the everyday order of the world "before the fall," a prelapsarian order for which the "American Nation" is already nostalgic. The desire to narrate events in this manner is an understandable response in the wake of a traumatic crisis, but it is also our civic responsibility to be skeptical about such ahistorical narratives. Bin Laden, fingered so hastily as the incarnation of evil, was, as we know, at one point recruited and supported by the United States. 
In the 1980s, government-sponsored centers in Brooklyn recruited Muslim fundamentalists to fight the Soviet Union in Afghanistan. At that time, bin Laden was on the good side of the Manichean divide. Our government, as in the Rambo imaginary of the day, called bin Laden and his fellow mujahideen "freedom fighters." Since World War II, U.S. foreign policy has repeatedly used Muslim fundamentalists against both communism...
 
Social Text 20.1 (2002) 27-50 At 10:00 A.M. on the morning of 1 March 2001, undergraduates and faculty members met at the offices of New York University president L. Jay Oliva to deliver petitions in support of unionized graduate assistants. Twenty-five hundred students and 170 professors were urging the university to bargain with NYU's Graduate Student Organizing Committee, a duly elected union of teaching and research assistants, represented by the UAW. Later that day, the union—GSOC-UAW—was scheduled to take a strike authorization vote. A decisive statement of campus solidarity, the moment was auspicious. Two hours before the strike vote, university administrators and the union signed a letter of agreement, which set the terms for negotiations that began on 1 April. As we write in August 2001, negotiations are still in progress. While we do not speak in the name of GSOC-UAW, the authors of this essay are members of the union bargaining team. We have been privileged to participate in a historic event. Significantly, it is the first time graduate students at any private university have exercised the right to bargain over their conditions of work—the result of a landmark decision issued on 3 April 2000, by the National Labor Relations Board (NLRB). This ruling, which established employee status for graduate assistants at NYU, also established a precedent for the unionization of graduate students at other private universities. Maybe there will be a contract at NYU by the time this essay is published; maybe there will have been a strike at some point. We have no way of knowing. In one sense at least, it hardly matters. Even at this uncertain moment, it is clear that events at NYU have had far-reaching effects. GSOC's achievement has ignited UAW graduate student campaigns at several other campuses, including Brown, Columbia, Harvard, and Cornell. 
Importantly, its influence has extended beyond the realm of graduate students to touch the lives of adjuncts and full-time faculty as well. While this essay is an update on campus unionization, it is also an attempt to place current organizing drives within the context of an emerging academic labor movement. This movement has found energetic leadership in a national network of undergraduates, whose most prominent organization is United Students against Sweatshops (USAS). Since its inception in 1997, USAS has moved steadily toward a broad definition of the sweatshop that now includes the academy with its army of low-wage, contingent workers—including adjuncts, graduate assistants, and a range of service employees. Students and university workers have already found common cause within this new academic labor movement; and its potential has been evident at NYU, where the GSOC campaign has spawned an informal coalition among campus unions, student organizations, and faculty activists. While coalitions of this sort suggest a direction for the new academic labor movement, they also reveal practical problems and ideological contradictions as students, faculty, and other campus workers pursue objectives that are sometimes different and occasionally at odds. GSOC-UAW has been a remarkably harmonious group of youthful intellectuals and seasoned trade unionists in one of America's most powerful and progressive unions. Still, we have had our differences of opinion in the process of creating a union structure and a method of operation. Most graduate students, for example, are inclined toward discourse and used to reading and writing for a living. As a consequence, they itch to compose essays rather than leaflets and long for public meetings in order to debate controversial issues. 
Wisely, we believe, our colleagues in the UAW have urged us to stay focused on basic issues, keep our message concise, and avoid substituting the written word for one-on-one relationships with other graduate students. The union has been under constant pressure to pursue a winning strategy—first for the NLRB election, then for collective bargaining. In the process, we have had to grapple with some competing claims and fundamental questions: What are the issues most graduate students care most deeply about? What are the interests that unite graduate students? Which might be potentially divisive? How do we balance bread-and-butter concerns with more utopian...
 
Social Text 18.1 (2000) 31-54 For well over a decade now, practically all U.S. cities have been locked into a mode of urbanization best described as "entrepreneurial." Webster's Dictionary of American English defines entrepreneur as somebody "who organizes and manages an enterprise, especially a business, usually with considerable daring, skill and financial risk." Anybody who knows anything about the fortunes of American cities throughout the 1980s and 1990s knows what "considerable daring, skill and financial risk" really boil down to. For whole cities themselves have assumed the status of enterprises, and they too are now managed with considerable daring, occasionally with skill, but seemingly always with financial risk. Indeed, the contemporary American city hustles inexorably to the tune of the bottom line and has become very adept at enhancing its "good business climate" reputation. Creating and re-creating healthy business climates has apparently been vital not only for the growth and continued financial viability of cities but also for the survival of every city in an age of global insecurity and competitive volatility. In recent years, apologists and gurus of such a line have basically set the tone of the debate about the nature of contemporary urbanism and the trajectory of urban growth and development. For these people there's simply no alternative to the entrepreneurial paradigm. Now, the argument goes, ubiquitous capital deregulation and corporate hypermobility browbeat and cajole cities into capturing a piece of the action. If cities don't capture a piece of the action, then apparently this action—that is, the investment, the jobs, the industries, the well-heeled consumers—will go elsewhere, to another town, maybe close by, where the package is more favorable, more profitable, more capitalistically efficient. Competition here is inevitably cutthroat. But not every city can win in this zero-sum jamboree. 
Accordingly, mayors, local councillors, chambers of commerce, and business elites have all gotten together to hash out their very own growth strategy, one in which the public sector lends something of a visible hand to absorb part of the market risk, thus helping locales become more economically attractive, more business oriented, and more attuned to daring and entrepreneurship. And here a lot of heady legal, political, and economic mechanisms have been deployed to push things along. A decade ago, Urban Development Action Grants (UDAGs) were all the rage. These interest-free leverage grants, bestowed on intrepid developers, were supposed to help regenerate run-down areas, provide jobs, rekindle local economies, kick start small businesses, and attract investors to spaces and projects that might otherwise be shunned. During the 1980s, New York, Detroit, and Baltimore topped the national UDAG league. But few strict guidelines were set about what these handouts should be used for. So rather than fund low-income housing or other social infrastructure, UDAGs mostly sponsored the construction of convention centers, hotels, marinas, and expensive residential complexes. Soon they were hastening, not ameliorating, social polarization, since few of the goodies seemed to trickle down to needy people. After a while UDAGs became the greatest hotel-building venture in American history. And if that weren't enough, these new hotels barely paid employees minimum wage; large chains, like Marriott and Hyatt, also turned into big-time union busters. Clinton's "empowerment zones," set up in 1993, merely continued the trend whereby the public sector underwrites corporate interests. Here, specific areas in cities have been gerrymandered into zones that waive property taxes for a while and ensure locating companies receive income tax credits for each new person they hire. 
Meanwhile, more direct support for companies has come in the shape of Industrial Development Bonds (IDBs), federally sponsored bonds (whose earnings directly accrue to the specific business), and Tax Increment Financing Districts (TIFDs). That all this represents yet another form of corporate welfare is well proven. The Citizens for Tax Justice estimates that lost revenue from the IDB program alone between 1996 and 2000 will weigh in at around $900 million, which is more than the total federal budget for urban mass transit. Such policies pretty much follow neoliberal, supply-side economic formulas, cherished so dearly by both Reagan and Bush and which, until recently, assumed the mantle of...
 
Social Text 20.1 (2002) 61-80 It started innocently enough. It was late summer or early fall of 1989, and a friend who was teaching at New York University—let's call him Jim—invited me to take over a class on contemporary social issues. At first I hesitated. I'd seen what happened to friends who'd gone down that road. The early signs were subtle. There would be gradual changes in their appearance. Their grooming habits changed. They began to appear haggard. Later they stopped going out on the weekends and evenings, preferring to stay in to mark papers or conference with students. They were often sick—knocked out by the flu every year. And their other professional work almost always deteriorated. Artists who had been showing stopped. Writers who had published were "blocked." The longer they taught, the more resigned and bitter they seemed to grow. That was not for me. But Jim cajoled me. "Come on," he said, "what harm can it do? It's only one class. You're only signing on for one semester." He was right. What harm could it do? So I agreed, and soon I was teaching cultural theory one night a week. My partner immediately noticed the changes in my behavior. I'd get home around ten o'clock after my evening class and I'd be flying. Often, if we'd had an especially good class, I couldn't sleep: my head would be spinning with ideas for the coming week's class, with ideas for new readings or screenings we could talk about. I'd stay up late: reading, watching videos, and later, after arrival of the Internet, I'd be on-line, searching for good sites for students or corresponding with them on bulletin boards we'd set up. But I'm getting ahead of myself. . . The fall semester of 1989 came and went, and in the spring my department asked me if I wanted to teach a writing workshop. It would be more work and less money, but at that point I was hooked. Sure, I said. And so it began: a teaching habit that escalated out of control. 
Over the next decade I would teach three, four, five courses a year, all the while holding down a full-time day job that supported my teaching habit. I was what they call a "high-functioning" addict. My day job in a nonprofit arts organization, though not particularly lucrative, provided health benefits and compensation adequate to maintain appearances. I was so hooked on teaching that I even enrolled in a doctoral program, imagining that I might find a full-time position if I had a Ph.D. on top of an M.F.A. Things began to spiral out of control in late 1997. Somehow, in spite of my adjuncting habit, I'd managed to become pregnant. My partner and I were thrilled. At that point my teaching habit was so all-consuming that we'd all but given up hopes of ever starting a family. I was expecting and I was delighted. But still I wouldn't give up my teaching habit. Even in the final month of the pregnancy, when my doctor urged me to remain in bed to control pregnancy hypertension, I couldn't do it. Fearful that I'd lose my classes or seem "unprofessional," I was in the classroom, dizzy with high blood pressure, the week before my daughter was born, not surprisingly, by emergency C-section. This was insanity. This was an addict's insanity. For two years I continued teaching, leaving my daughter in the care of a college student whom I paid more per hour than I earned for teaching. Then in September 1999 I finally hit rock bottom. Alcoholics turn to AA—I turned to the AAUP—and my long road toward recovery began. A group of fellow teaching junkies had already been meeting for two years to discuss our mutual problems. We were enslaved by our passion for teaching. Our vocation had become our addiction. What else could...
 
Social Text 21.1 (2003) 1-5 —Jane Brown In the early 1990s, the adoption of children across national borders began to accelerate at an astonishing rate. Although transnational adoption originated more than fifty years ago in the aftermath of World War II and the Korean War, the current wave of adoption is unprecedented in magnitude and visibility. Immigrant orphan visas issued by the U.S. Immigration and Naturalization Service nearly tripled between 1991 and 2001: from 7,093 to 19,237. In the United States alone, more than 139,000 children have been adopted internationally in the last ten years. Over 50,000 of these children were born in China or Russia. What are the implications of this massive movement of children, almost entirely from poor nations, to the more affluent West? The essays in this issue explore transnational adoption from multiple perspectives, encompassing both "sending" and "receiving" countries: birth parents who relinquish children, adoptive parents and adopted children, and adult adoptees. All of the essays view adoption as situated in the midst of larger social and cultural transformations and, inevitably, in the space of familial intimacy and the public sphere. In its transnational mode, adoption enters into and informs the complex politics of forging new, even fluid, kinds of kinship and affiliation on a global stage. These politics start from, rather than end on, the critical insight that identity is a social construction (see Taussig 1993). Questions of belonging, race, culture, and subjectivity loom large in the discourses of transnational adoption. In an earlier era, adoption across borders was assumed to be straightforward: A child traveled to a new country and stayed there. A child born in Korea and adopted in Minnesota was expected to grow up, and remain, simply a (white) American. Parents and adoption organizations did not question that their acts were good deeds. 
The past was erased or contained in an abandoned "there"; the racialized trace of origins tended to be treated as manageable. Today, adopted people—children or adults—are expected, or at least invited, to explore their multiple identities: to retain a name, to imagine their birth families, to learn about "birth cultures," perhaps to visit the birth country. As an anthropologist and interpreter accompanying a group of Chilean adoptees and their Swedish parents who traveled "home" to Chile, Barbara Yngvesson tracks one such exploration, suggesting that these journeys unsettle the narrative of exclusive belongings, the notion of a singular identity, a self that can be made whole. Contemporary adoption discourse echoes the ambivalences discussed in Yngvesson's essay: the contradictory narratives of the child "rooted" in his or her original culture and the child as freely transferable, to new kin and culture, in the global marketplace. At least some of the popular culture of adoption has begun to acknowledge the impossibility of "exclusive belongings." An American mother wrote on the Internet of her hopes to give her daughter "what she would need to have a fulfilling, but divided life." The daughter, six-year-old Sierra Song E, echoed her mother's thoughts: "Part of me lives here now and part of my heart is in China now, you know?" Her mother replied: "That is the way it should be—you are a daughter of each of the two lands you rightfully claim as yours" (Brown 2002). Like the passage quoted in the epigraph above—Sierra...
 
Social Text 21.1 (2003) 83-109 This essay considers the place of visual media, specifically child portrait photography, in the culture of international adoption. In 1999 my husband and I began to research transnational adoption for what was to be a personal process: adoption of a toddler-aged child into our family, which included a three-year-old biological son. I found myself immersed in a visual culture in which portraits of institutionalized children available for adoption circulated with surprising ease. Images of "waiting children" performed multiple functions in the adoption process. Brochures enclosed with agency mailings displayed child images captioned with brief biographical or medical data. Agency Web sites offered photolisting links that in some cases facilitated the sorting of children by sex and region for ease of selection. These images functioned initially as lures, drawing prospective clients into the adoption market, helping them to imagine "their" child or themselves as parents of children "like these." Beyond this, some individual images became, for some clients, a coveted family photograph, representing an imagined future family member much as a fetal sonogram might do. Waiting child images also served as tools in medical screening. Many agencies advised clients, before making a commitment to one child, to submit the photographs or video footage they had received, along with the child's sometimes sketchy medical records, to one of a handful of practices specializing in adoption medicine for evaluation. In this use, the image functions somewhat like prenatal test data, in that it is screened for possible empirical signs of pathology. A physician's reading of the photograph or video thus could factor into a client's decision to accept or reject a referred child. 
Discussions among foreign officials, agency representatives, child advocates, prospective parents, and parents about child image circulation have shaped positions, policies, and practices among the agencies and countries involved in transnational adoption. Concerns have included the problematic nature of a system in which the children of poor countries become commodities and their images become advertisements in a global market, the enhanced potential for racial and aesthetic discrimination in image-based child selection, and the child's right to privacy. Transnational adoption raises these and other problems deserving critical attention. This essay contributes to these discussions by addressing the function of image classification relative to client perceptions of a child's cultural identity and health. First, I consider how child images are categorized by agencies in brochures and computer data banks, to demonstrate how visual classification gives shape to children's histories, identities, and futures relative to race, ethnicity, health, and ability. Second, I discuss the special needs classification and the problem of discerning medical or developmental information from pictures. Screening and diagnosing children at a distance is challenging: the child's body cannot be examined directly, and language and cultural differences make interpreting records difficult. Interpreting casual portraits for medical information has become a common practice in transnational adoption. My specific example is the screening of child portraits for Fetal Alcohol Syndrome (FAS). This medical technique is beset, I will argue, with the historical problem of pathologizing signifiers of cultural difference. I will consider some problems in the analysis of "racially mixed" and "Asian" faces. 
Accommodating racial or ethnic differences in the establishment of bodily norms, and discriminating between visual markers that are believed to signify identity and those that are believed to indicate pathology, are the impossible tasks of FAS image analysis. I should be clear about my stakes in this essay. I write as a participant critic in the transnational adoption world, occupying the roles of agency client, adoptive parent, and volunteer adoption coordinator. I did not shy away from the use of child images in any of these roles. My place firmly inside the world of adoption has done little to diminish my initial and ongoing concerns, as a scholar of visual media, about the troubling dynamics of this image culture. The material I reference could easily be used to support a critique of the practice of transnational adoption or its use of images. However, it would be misguided to condemn transnational adoption in total, to fault agencies for using images, or to criticize clients for...
 
Social Text 21.1 (2003) 111-127 By the end of the 1980s, just as in numerous other donor countries, it was to a great extent the increasing presence of foreign adoptive parents that led Brazilian policy makers to turn their attention to the plight of the country's children and refine policies concerning in-country adoption. At the time, Brazil was in fourth place among the world's largest furnishers of internationally adopted children (behind Korea, India, and Colombia). To the growing consternation of government authorities, by the early nineties greater numbers of Brazilian children were being legally adopted by people in France, Italy, and, to a lesser extent, the United States (Kane 1993). Rumor had it that even more were being smuggled illegally over the borders. However, despite lingering tendencies in the poorer and more remote parts of the country, by 1994 the tide of international adoption had turned, reducing the outgoing flow of children to a slow trickle. A series of factors contributed to the decline of intercountry adoption in Brazil. Local-scale influences included nationalistic zeal against what was seen by many as a predatory threat from abroad, the enforcement of increasingly stringent legislation (including widely publicized jail sentences handed out to public officials involved in irregularities), and the growing popularity of national adoption. On the global level, one should not ignore the importance of international legislation aimed explicitly at curtailing the South-North flow of children as well as the sudden availability for adoption of an immense number of Chinese and Russian children (see Fonseca forthcoming). Although today Brazil is no longer counted among the world's major furnishers of internationally adopted children, a close look at local child-raising practices and national policies on adoption still raises many issues relevant to the field. 
For example, fifteen years of intensive intercountry adoptions have left their mark on national legislation. A clause in the country's 1990 Children's Code stating that poverty alone should, under no circumstances, justify the loss of parental authority has been attributed by certain analysts (Abreu 2002) to the reaction against the plundering of Brazilian children by "rich" foreigners. Another even more consequential legacy of Brazil's experience with intercountry adoption concerns the high value placed by contemporary policy makers on in-country adoption and, in particular, plenary adoption, seen by many as the ideal solution for children in dire need. In this sense, it would seem that although the wave of intercountry adoption has receded, it has left in its wake, embedded in contemporary legal regulations, certain globalized principles based on the modern nuclear family that may or may not be relevant to many of the country's citizens. The possible difference explored here between national adoption legislation, in tune with cosmopolitan sensitivities, and local-level practices is highly relevant to general debates in the field of intercountry adoption. In many respects, the ethnographic material presented here portrays a reality similar to that described in other Third World countries that continue to send children abroad. Furthermore, the gap in Brazil between national legislation and local-level sensitivities may be taken as symptomatic of the even wider gap between values embedded in international conventions on intercountry adoption and those of poverty-stricken sending families in Third World countries. Elsewhere, I have considered in greater detail the attitude of foreign adoptants as well as the evolution of Brazilian adoption laws (Fonseca 2001, 2002). In this article, on the basis of my ethnographic fieldwork among the urban poor of Porto Alegre, Brazil, I dwell on examples that illustrate local child-raising dynamics. 
By pointing out how extremely poor women resort to a wide range of strategies—from charitable patrons and state-run boarding schools to mutual help networks involving a form of shared parenthood—I aim to contribute to the rethinking of national as well as intercountry adoption from the bottom up. I was drawn to the subject of lower-income families not exactly by design but by force of circumstances. It was in the early 1980s. I had recently settled in Porto Alegre with my ("native") husband and two toddlers and was preparing my first classes of anthropology to...
 
Social Text 21.1 (2003) 29-55 Isabelle, who is six, makes a list of all the children she knows and begins to identify those among them who are, as she is, adopted. Naming three other Asian children in her New York City first grade class, she pauses, then shakes her head: "No, but they look adopted." Isabelle's mother asks, "What does an adopted person look like?" Isabelle replies, "Chinese." In the 1990s, families in the United States began to adopt children from other regions of the world in unprecedented numbers. Although adoption across national borders had its beginnings in the 1950s, in the aftermath of World War II and the Korean conflict, it remained for decades a relatively unnoticed phenomenon. "The quiet migration" is how a demographer writing in 1984 described the movement of children for adoption across national borders (Weil 1984). That description now needs to be revised. Over the past ten years, transnational adoption has become both visible and vocal. How has this shift occurred? And how might the contemporary practice of transnational adoption provoke new ways of imagining race, kinship, and culture in North America? In 1994, when I traveled to China with my husband to adopt our daughter, I had no inkling that we were on the cusp of what would become an enormous wave of Chinese adoptions. Neither did I sense the tremendous changes in adoption practices that were under way, the heightened attention to all aspects of adoption that would become so defining of this moment, or the ways in which my lived experience would touch so intimately on contested anthropological terrain. Soon after returning to New York, however, I realized that our very personal act of creating a family through adoption was simultaneously, if unwittingly, part of a larger, collective project. In that project "culture" figured both prominently and, for me, a bit uneasily. 
This essay is the fruit of my efforts to understand parents' fascination with the imagined "birth culture" of their adopted children. I argue that this fascination may, in part, represent displaced longings for origins and absent birth mothers, and I attempt to situate such longings within historical and cultural shifts in adoption discourse and practices over the last ten years. Adoptions from China to the United States soared from 115 in 1991 to 5,081 in 2000. By the end of the 1990s, China had become the leading "sending" country of children to the United States and the world, and more than 30,000 adopted Chinese children, mostly girls, were growing up with their (mostly) white parents in North America. In February 2002, bookstore windows in Manhattan displayed Valentine's Day specials, among them I Love You Like Crazy Cakes (Lewis 2000), a children's book about a single mother adopting a baby girl from China. The mainstreaming of Chinese adoption has occurred in part through the incessant media attention that has been lavished on adopted Chinese girls over the past decade. This interest shows no signs of abating, with a steady stream of articles in disparate venues. On a page entitled boldly "How America Lives," the Ladies' Home Journal featured "Citizen Amy," an adopted five-year-old Chinese girl in Kentucky, American flag in hand (Leader 2001). Numbers and media attention do not in themselves suggest profound transformation or the normalization of adoption. They surely do not reveal the ardent embrace of a new transracial kinship; the Ladies' Home Journal knows that this is not really how most of America lives. Nonetheless, the phenomenal growth of adoption that crosses lines of nation and of race—and its media presence—hint that interesting changes are in motion, changes that must be situated within larger processes of rewriting kinship, identity, and culture in North America. 
I focus here on adoption from China, because Chinese adoption and the communities that have developed around it have become remarkably visible and vocal. Families with children from many other parts of the world, however, are dealing with similar issues, particularly when race visibly marks differences between parents and children. In contrast to the isolation and confusion articulated in recent years by many young adult Korean...
 
Social Text 21.1 (2003) 7-27 —Tomas Tranströmer, "Romanesque Arches" In the world of intercountry adoption, two stories predominate: a story of abandonment and a story about roots. In the abandonment story, a baby is found in a marketplace, on a roadside, outside a police station, or in the tour of an orphanage; alternately, a child is left by its mother at a hospital or is relinquished or surrendered to child welfare officials, a social worker, or the staff of a children's home. After passing through the hands of social workers, lawyers, and/or orphanage staff and perhaps in and out of hospitals, foster homes, and courts, this child may ultimately be declared free for adoption, a process that requires a second, legal separation that constitutes the child as a legal orphan. Similarly, a mother who relinquishes her child to state agents must consent to the irrevocable termination of her rights to the child. In international adoptions, the child will also be separated from its state of origin (a procedure that in some nations involves sealing the record of this severance and altering the child's birth certificate) so that it can be connected to a new family, a new name, a new nation. The child is given a new identity. It now belongs in a new place. This story of separation is a story about loss and the transformation of loss into a "clean break" (Duncan 1993, 51) that forms the ground for starting anew. The clean break separates the child from everything that constitutes her grounds for belonging as a child to this family and this nation, while establishing her transferability to that family and that nation. With a past that has been cut away—an old identity that no longer exists—the child can be reembedded in a new place, almost as though he or she never moved at all. Even as this legal story of separation is the official ground for constituting adoptive identities, another story competes with it in both law and adoption practice. 
This other story was a persistent counterpoint to the movement for "strong" adoptions that prevailed at the Hague Conference in the early 1990s (Duncan 1993) and was incorporated into the Hague Convention as children's right to preservation of their "ethnic, religious and cultural background" (Hague Convention 1993, Article 16c). The preservation story implies that there is no such thing as a clean break and underpins the search movement in domestic adoptions, the debate over sealed records, and the movement to keep adoptions open in the United States today (Yngvesson 1997; Carp 1998; Verhovek 2000). In this story, identity is associated with a root or ground of belonging that is inside the child (as "blood," "primal connectedness," and "identity hunger") (Lifton 1994, 67-71) and unchanging. But it is also outside the child in the sense that it is assumed to tie her to others whom she is like (as defined by skin color, hair texture, facial features, and so forth). Alienation from this source of likeness produces "genealogical bewilderment" (Lifton 1994, 68, citing Sants 1964) and a psychological need for the adopted child to return to where she really belongs. The story of a freestanding child and the story about a rooted child appear to be mutually exclusive and are associated with different adoption practices. The former is associated with race and other forms of matching that are intended to produce "as if" adoptive families that mimic natural ones (Modell 1994). Even in international transracial adoptions, where race matching is impossible, adoption practices in the 1960s and 1970s emphasized complete absorption of the adopted child into the new family and nation (Andersson 1991). By contrast, the story about roots is associated with the recognition of adoption as a distinct family form (Kirk 1981) and involves acknowledging (even underscoring) the differences between an adoptee and his or her adoptive parents, constituting the adoptive family as...
 
Social Text 20.3 (2002) 45-65 In my local library in East Brunswick, New Jersey, military artifacts, weapons, and photos are on prominent display as a reminder of the days of the world wars, the military interventions of the postwar era, and the sacrifice of young men who grew up in local neighborhoods. If you want to know Chinese culture and history, you have no difficulty finding about thirty or forty books just a few steps away. These books can be readily divided into two categories. One set idealizes a long tradition of Chinese cultural heritage; the other consists mostly of narrative accounts of harrowing experiences of living in contemporary China. Books like Red Azalea by Anchee Min, Wild Swans by Jung Chang, and Red Flower of China by Zhai Zhenhua form a genre of semiautobiography. They tell stories of personal tragedy, tortuous bildungsroman, the purgatorial experience under the "totalitarian regime." The first set seems to freeze China in a comfort zone of ancient civilization; the narratives appeal to an audience that would still like to see a "Red China" with the demonic intent of an enemy. In the wake of September 11, the proximity of the military memorabilia to books about China takes on uncanny significance. If the unconscious structure can be traced in the physical layout of mundane objects, we may detect a hidden standoff between the weapons for national security and the fantasies of China or other foreign countries as real or imagined threats. The memorabilia testify not just to the world wars but also to the more extended agenda of national security through military interventions during the Cold War. We have been told that the Cold War ended in 1989 and things have moved on to the globalization track. The Cold War, with its confrontation between sovereign nation-states of leviathan power, its mutually assured destruction policy, and its ideological conflict, has gone the way of the dinosaurs. 
We are entering a new age relieved of big-power confrontation and threat, blessed with accelerated economic momentum and free flows of capital without borders: an age in which the modernist style of international politics is obsolete, superseded by the postmodern fluid dynamics of trade and commerce, under the imperial supervision of the supranational jurisdiction of an international system. Events since September 11 came as a shattering blow to this myth of globalization. By conjuring up the specter of the Cold War, they compel us to question the neoliberal forecast of the global circuit of capital accumulation and circulation and to reevaluate in a more realistic fashion a suddenly revealed force field of power struggle. The numerous references in the aftermath of September 11 to Pearl Harbor, the world wars, and the Korean conflict, the nostalgia about "the good old days" of the citizen army and righteous heroism, and the elevation of an elusive terrorist group into "the Enemy" endowed with state sovereignty "at war" with us, suddenly turned the clock back a half-century. The tremendous display of sentiments, passion, phobia, and policy initiatives is redolent of the Cold War. It is as if America and the civilized world had lived in a soothing dream, only to be rudely awakened and thrown back to the rugged terrain of Cold War conflict, to the paranoiac security needs, the bloody conflict of giant powers, the tightening of boundaries, and the hysterical assertion of national identity. Does the specter of the Cold War signal the return of the repressed lurking beneath the discourse of globalization? Does this return signal any real change in the world system, or does it simply reveal its secret? How does this event alter the production of subjectivity in the sphere of culture? How does it affect area studies? These are the issues I will explore. The Cold War provides a parameter for assessing the so-called post-Cold War period since 1989. 
If it is true that we have entered a new era, the novelty of the current situation needs to be placed in a broader historical perspective, taking into account the interaction between modern sovereignty and capital's worldwide expansion. Capital, in its unceasing expansion and hostility to boundaries, is...
 
Social Text 21.2 (2003) 95-110 This essay focuses on discourses, anxieties, and competing strategies of white Afrikaners during the political transition in South Africa in the late 1980s and early 1990s. It draws on opinion surveys, election results, political party pronouncements, academic analyses, and personal observations in lectures, group discussions, and formal interviews throughout this period. The article seeks to explore what can be learned from the negotiated settlement of a seemingly intractable ethnoracial conflict for the unresolved strife in the Middle East. When and why did a privileged ruling group consent to negotiate itself out of power? What divisions occurred, and how were internal cleavages handled? How was the historic compromise marketed to a skeptical constituency? What role did civil society and dissidents play in the change? In short, can the South African "miracle" be replicated in the Middle East? Many activists advocate similar antiapartheid strategies (divestment, boycott) against Israel and assume that strong pressure would produce similar outcomes. There is nothing wrong with such idealistic optimism, except that it may foster illusions. The underlying assumption that the South African model of conflict resolution readily lends itself to export ignores unique historical circumstances. It may actually retard necessary new solutions by clinging to visions or processes of negotiation that may not work in another context. Above all, in South Africa an entire regime had to be changed, while in Israel the occupation and the status of the territories are the main contentious issues. Therefore, a more nuanced understanding of differences and similarities may enhance new approaches. The lessons to be drawn were probed in a more comprehensive comparison, published as Peace-making in Divided Societies: The Israel-South Africa Analogy (Adam 2002), on which this essay elaborates. 
Six elements were evaluated in both contexts: economic interdependence, religious divisions, third-party intervention, leadership, political culture, and violence. As a background to the Afrikaner debate, it seems worthwhile to summarize the main arguments in each of the six realms to clarify the similarities and differences:

2. Religion in South Africa served as a common bond to assail and delegitimize apartheid, while Judaism and Islam compete for sovereignty in Jerusalem. Religiously motivated settlers and ultraorthodox believers may not be as easily marginalized as Afrikaner extremists merely interested in territorial autonomy.

3. Both the African National Congress (ANC) and the National Party (NP) eschewed third-party intervention in their negotiations. An Israeli-Palestinian settlement depends heavily on a U.S. policy that strongly supports Israel. Sanctions (divestment and trade boycotts) are generally overrated as triggers of South African change. Only loan refusals and, to a lesser extent, moral ostracism impacted significantly on the apartheid government. Such action against Israel by the West is inconceivable at present. Unlike Afrikaners, Israelis enjoy a supportive diaspora.

4. The South African negotiations were facilitated by a cohesive and credible leadership with a widely endorsed open mandate on both sides. Leaders could sell a controversial compromise to a skeptical constituency. Both the Israeli and the Palestinian leaderships are fragmented, with militant outbidding a frequent tool of populist mobilization. The apartheid-era Westminster electoral system rewarded majority parties, in contrast to the minority influence under proportional representation in Israel.

5. Much more personal interaction in a vertical status hierarchy shaped South African race relations, compared with the more horizontal social distance between Jews and Palestinians. Paternalism characterized Afrikaner attitudes. 
Moral erosion among the ruling elite in South Africa contrasts with moral myopia in Israel, a few hundred military objectors notwithstanding. Both sides in the Middle East display a collective sense of victimhood. Apartheid clearly...
 
Social Text 19.4 (2001) 53-65 Most of the literature on social movements assumes that the media, the market, consumerism, and new technologies are adjuncts to if not the causes of subordination and oppression, particularly in a globalizing conjuncture led by "Americanization." My work on youth cultural groups, briefly exemplified here by reference to the group Afro Reggae, suggests that these venues can also be mobilized in the pursuit of social justice under certain circumstances. I also focus on the importance of public spectacle, not only as a feature of the cultural activism of groups like Afro Reggae, but more generally. The Zapatistas in Chiapas, Mexico, are an excellent example of indigenous mobilization that depends to a great degree on spectacle for the few successes that they have achieved (Yúdice 1998). The synergy of all these factors, in conjunction with the insertion of groups like Afro Reggae into the "global civil society" of grassroots community movements, NGOs, foundations, and other initiatives, produces what I call a "performative injunction." The performance of identity and cultural styles is partially overdetermined by these groups' insertion into these networks of articulations. Consequently, rather than view certain social movements' collaboration with the media and markets as simply a form of co-optation, it is also accurate to see this as the strategic management of the use that these groups make of these venues and vice versa. An interesting outcome of this kind of activity is the transformation of all cultural practices into resources that are used and counterused to gain position by any of these articulated actors. The Grupo Cultural Afro Reggae came into being in 1993. Like various other citizen initiatives, it aimed at countering the violence that racked Rio de Janeiro at the beginning of the 1990s. 
In the final months of 1992 and again in the winter of 1993, a series of arrastões (looting rampages) was carried out by youths from the outlying favelas and opportunistically overdramatized by the media, sending the middle classes into panic. But it was the brutal violence against the poor that immediately prompted the formation of a "citizen action initiative" known as Viva Rio. On 23 July 1993, eight street children were murdered by a "social cleansing" death squad made up of off-duty police in front of the Candelária Church at the major intersection of Rio's downtown avenues. At the end of August, twenty-one innocent residents of the favela Vigário Geral were massacred. Apparently, on the previous day, the local chapter of the narcotraffic gang Comando Vermelho (Red Command) had killed four cops who had tried to extort a drug shipment from them. The military police stormed the favela the next day and shot people indiscriminately. In one house they murdered all eight family members, parishioners of the evangelical church the Assembly of God. The parents died, Bible in hand. All three of these events disrupted the sense of place that cariocas (Rio's residents) associated with the spaces in which they occurred (Soares 1996, 244). The arrastões on the beaches introduced an element of fear into the space of leisure. The murders in front of the Candelária Church undid the assumed sociability among classes that is taken for granted at this inevitable space of encounter. The massacre in the favela reversed the role of the police, much like the Rodney King beating in Los Angeles. The police became the criminals who defiled a now-sacred space otherwise identified with abjection. While the arrastões saw quick action from the authorities, the response to the other two events came from "civil society." 
Viva Rio emerged not only to demand effective action from the authorities but also to communicate a new sense of citizenship, of belonging and participation, that included all classes, especially the poor, and that largely sought to use culture to bring the two parts of the divided city together. This communication also had to target the media, which reproduced and sometimes even staged the violence that Viva Rio activists sought to allay. The community movement of Vigário Geral was formed to analyze what had happened, to demand justice, and to...
 
Derek Jarman's Blue did not explode onto the cinematic world in the full glory of Hollywood hype. When the film premiered at the Venice Biennale in June 1993, McDonald's didn't organize a special promotion of blue hamburgers, and Coca-Cola stuck to its red-colored cans and brown-colored drink. Nor were there dozens of photographers hustling for the best shot of the sexiest star as the audience gathered at the Palazzo del Cinema. No, the screening of Jarman's film passed quietly: just Jarman himself, a single reporter, a small audience, and seventy-six minutes of unchanging blue celluloid backed by a soundtrack about the director's experience of living and dying with AIDS.
 
Social Text 19.4 (2001) 67-91 The main question I will address in this essay is this: What are the connections between postmodern capitalism in the era of globalization and the return to conservative, traditionalist attitudes toward life (the postmodern/traditionalist dichotomy), and how do the two converge to keep women oppressed in so-called Third World countries like Pakistan, which have witnessed the rise of Islamic "fundamentalism" in the last three decades of the twentieth century? My aim in asking and delineating possible answers to such a question is to draw attention to the complex convergence of "postmodern" economic and sociocultural factors and their ideological underpinnings, which are contributing to the rise of seemingly anachronistic or "premodern" fundamentalisms around the globe. In the case of Pakistan, this connection between postmodern capitalism and the resurgence of Islamic fundamentalism has been made explicit in the work of many alternative theater groups, such as Tehrik-I-Niswan, Ajoka, and Punjab Lok Rehas, that formed in Pakistan in the early 1980s, initially to counter the repressive religious policies being ushered into place under Zia ul-Haq. Many of the more recent plays in the repertoires of the groups I have been researching do indeed make these linkages, especially Dukhini (Suffering Woman), a play by the Ajoka Theater group that brings these issues to light through the theme of women trafficked from poverty-stricken Bangladesh across India and into Pakistan under the false promise of a "better life," only to find themselves sold into prostitution to the highest bidder. These women are victims not only of a postmodern consumerist ideology that treats women's bodies as commodities to be bought and sold in the marketplace, but also of Islamist/traditionalist ideologies that work to keep them oppressed by convincing them of their "fallen nature."
Under such an ideology, it is never the rapist or buyer of sex who is blamed but the woman who is raped or forced into prostitution: she has to bear the burden of having "dishonored" her family, who will never accept her back because of the "shame" she has brought them. In other plays by other groups, actual cases of so-called honor killings and other violent acts against women, such as stove-burnings, beatings, and rape, have been used to shape the plots. Several of these groups have also connected the rise of such hideous "Islamic" crimes against women (which are never punished by the state or the courts) to global politics and economics, such as the infiltration into Pakistani society of the Taliban movement from neighboring Afghanistan, a creation of U.S. imperialism that kept the armament and drug trade flourishing for the benefit of consumer capitalism. Saar, for example, a play by the Punjab Lok Rehas group, portrays these societal conflicts through the real-life story of a young woman named Saima, whose father brought a case against her in the courts for exercising her choice to marry without seeking his permission, citing the Islamic injunction of wali (guardian) in his favor. Around this central event the play weaves the larger, intertwined issues of unemployment and drug use among disenchanted young men such as the heroine's brother, who needs money for his habit and uses the patriarchal/religious ideology of female "shame" to blackmail his sister: having found out about her decision to elope with the man she loves, he threatens to reveal her "dishonorable" behavior to their father unless she supplies him with cash.
Thus, what I attempt to offer here is an analysis of the global conditions leading to women's oppression in Pakistan and, by implication, other "Third World" countries, in the hope that through such analysis can come the impetus for change, something that theater and other cultural activists in Pakistan are certainly aiming for through their work. Such an analysis and undertaking have become ever more crucial following the events of 11 September 2001, which have led thus far to more, rather than less, confusion in perceiving and accepting collective responsibility for the imbrication of the issues and ideologies I am outlining here.
 
Social Text 21.3 (2003) 85-108 An essay about Gay Sunshine Press and Latin America should start off with an attempt at seducing the reader, which is why we opted for the tacky allure of "beautiful" and "sinister" in the title. The phrase itself appears in an essay titled "Latin America: Myths and Realities," written by E. A. Lacey and published in the tabloid journal Gay Sunshine in 1979, and it gives a good idea of what Latin America meant for at least part of the radical San Francisco arm of the gay movement. The two words, in tense relationship with each other, describe the state of mind of at least three authors who "worked" Latin America for the press: Lacey himself, Winston Leyland, and Erskine Lane. The words register what those observers saw as they engaged in missionary work from San Francisco to Rio de Janeiro, while they blended beat culture with samba, turned political liberation into a politics of identity, and experimented in the Guatemalan highlands with a transculturated politics, two-thirds Zen Buddhism and one-third Mayan spirituality. "Latin America was seen as being—with that exasperating quality of paradox that inevitably creeps into our perception of the alien and unfamiliar—both magical and menacing, a beautiful, sinister fairyland where the usual rules of logic were suspended and anything good or bad might happen, and usually did." It is an image that belongs to the imagination of the foreign voyeur, the disenchanted white homosexual who follows the footsteps of a modern-day Rimbaud—leaving civilization for the "menacing" context that beauty provides, in a constant "dérèglement des sens" of suspended logic. The magic was all in the eyes of the beholder, as was the sense of anticipation, the beauty, the threat, and even the "exasperating" paradox that is but the result of the perception of an alien object, or a place—"fairyland" as the unexpected, equivocal term in this scenario.
As far as "fairyland" goes, Lacey was himself most probably not quoting Rimbaud but expressing the exoticism and marvel of the magical realist representation of Latin America. And one could not help but notice that the form of embodiment for this reality strokes the other meaning of fairy—"fey," or "queer." It is as if what Alejo Carpentier called "lo real maravilloso" became, by some perverse, sexy, and paradoxical trope, something emphatically fleshy, warm to the touch, hard—and, above all, male. The fairyland that Lacey talked about was an invented reality, of course, and it had all the trappings of a self-conscious invention. It was mostly an erector set meant to uphold an object called "Latin American gay literature" as this was apparently thought up in San Francisco in the 1970s by a group of non-Latin American gay white men, some of whom are still very active in the promotion and dissemination of the "sinister" fruits of that fairyland. The gay literature was there, but the links and the continental scope were provided by Lane, Lacey, and Leyland—the third of the trio also furnishing the marketing tools for its dissemination. These men were intellectuals bent on creating—to borrow Benedict Anderson's term—an "imagined community," spreading the gospel of gay liberation from San Francisco to the far reaches of the globe. The enterprise was managed by scholars, entrepreneurs, poets, historians, and translators who were keen on their sense of mission; they were serious enough about their pursuits to devote part of their lives' work to bringing out sexuality from what they perceived was its hidden closet. They were travelers and self-questing philosophers, and they created fields of knowledge heretofore unseen: queer Buddhism, Mayan post-hippie queerness projected back to pre-Columbian times, meditative set pieces that meshed quantum physics with Indian religion, Tikal with Lao-Tse, the Grateful Dead with the Aztec poet-warrior Nezahualcoyotl. 
That they worked out of sheer pleasure born of their sense of mission is not to be doubted; that the fruits of their labor sometimes verged on the objectionable is part of what lures us to explore how sinister their sense of beauty really is. It is clear that Winston Leyland...
 
Social Text 21.2 (2003) 125-139 It is dimly possible to imagine that the September 11, 2001, attacks on the World Trade Center and the Pentagon could have provided an occasion to begin a serious national conversation about why some Muslims—relatively few, to be sure—hate the United States enough to kill themselves to harm our country and its people. Instead, September 11 further consolidated an understanding of the world that draws sharp oppositions between "us" and "them" and posits Islam as the "new enemy for a new world order." President Bush declared, "Islam is not the enemy." Nonetheless, the administration and its allies—neoconservatives, the Christian Right, and pro-Israel hawks—encouraged this understanding by promoting a vision of the world divided into the forces of freedom and "the evil ones." The proposition articulated in the president's January 2002 State of the Union address that North Korea, Iran, and Iraq constitute an "axis of evil" may well be the most flawed and unsophisticated understanding of international affairs to have been offered by any head of state since the end of World War II. Israeli prime minister Ariel Sharon quickly identified with the Bush administration's post-September 11 worldview and sought to turn it to Israel's advantage. Announcing a day of mourning in Israel and appropriating rhetoric from the era of the Cold War, Sharon declared, "The fight against terror is an international struggle of the free world against the forces of darkness who seek to destroy our liberty and way of life. Together we can defeat these forces of evil." After September 11, Sharon repeatedly equated Osama bin Laden and Al-Qaeda with those he regarded as Israel's more direct enemies: Yasser Arafat and the Palestinian Authority, Hamas, Lebanese Hizbollah, Iraq, and Iran. The Bush administration, with only minimal reservations, embraced this proposition.
The consequence was to give Sharon a nearly free hand in repressing the second Palestinian Intifada, which had erupted a year before the attacks on the World Trade Center and the Pentagon. The American Council of Trustees and Alumni (ACTA) attempted to give a patina of intellectual legitimacy to the Bush administration's simplistic world outlook in a report entitled "Defending Civilization," released in November 2001. According to ACTA, criticism voiced on campuses across the country of the Bush administration's response to the September 11 attacks is tantamount to negligence in "defending civilization" and proof that "our universities are failing America." ACTA alleges that American universities have been brought to this sorry state by inadequate teaching of Western culture and American history. Consequently, students and faculty do not understand what is at stake in the fight against terrorism and are undermining the defense of civilization by asking too many questions. ACTA was founded by Lynne Cheney, wife of Vice President Dick Cheney. Former Democratic vice presidential candidate Senator Joseph Lieberman is a member of its national council; he criticized the report, though not too aggressively, after it appeared. Although Cheney is no longer officially active in ACTA, a lengthy quote from her appears on the cover of the report, suggesting that she supports its contents and lending the document the appearance of a quasi-official statement of government policy. The original version of "Defending Civilization" named and quoted comments by 117 university faculty members and students in reaction to the September 11 attacks. ACTA's ire was aroused by my statement: "If Osama bin Laden is confirmed to be behind the attacks, the United States should bring him before an international tribunal on charges of crimes against humanity."
Other remarks in the report's list of unacceptable speech included "Ignorance breeds hate" and "There needs to be an understanding of why this kind of suicidal violence could be undertaken against our country." After receiving considerable criticism, ACTA removed the appendix to the report containing the names and quotes. Of course, ACTA's attack on American universities in the name of "defending civilization" was a ruse for pursuing its shared agenda with the Bush administration: suppressing any form of dissent from the militarized policy response to the September 11 attacks. By vilifying those who attempted to engage in a debate over foreign...
 
Social Text 18.1 (2000) 143-151 Only state intervention and welfare reforms can put an end to women's economic dependency and thereby free women from men's control, that doughty feminist reformer Eleanor Rathbone argued just over eighty years ago. How her old statist rhetoric betrays her. Women's economic dependency and welfare reforms are currently on everyone's minds, but in an ambience that generates thoughts only of purging most of those receiving any benefits at all. Today in Britain, in the long shadow of the United States, the political usage of the term economic dependency is being definitively transformed. The notion of welfare benefit no longer signifies the hard-fought-for amelioration of dependency but rather its definitive symptom: the erstwhile utopian cure is resignified as the disease. It is some years since Nancy Fraser and Linda Gordon traced the genealogical transformation of dependency in the United States, noting its conjunction with a flourishing, deceptively feminist-sounding self-help literature on autonomy. This is why single mothers can be demonized if they don't work, even while married women with young children can be demonized if they do. Shifting a mother from dependency on the state to reliance on a man for economic support, in this troubling slippage, supposedly removes her from the pathologies of dependency. It is a massive deception. The continuing offensive against welfare provides perhaps the single most general threat to Western women's interests at present, at least for those many women who are not wealthy and who still take the major responsibility for caring work in the home. As feminists in the 1970s made so clear, and sought so hard to transform, women are most vulnerable to the very worst pathologies of dependency when they are most at the mercy of husbands or male partners, especially during and after pregnancy and childbirth.
Indeed, midwives in Britain have recently been asked to look for signs of abuse in just such women, following alarming reports from the United States examining the bruised bodies of pregnant women and new mothers. Similar antitheses exist in relation to needy children. Carolyn Steedman has written eloquently of how the expansion of welfare in the late 1940s gave a particular confidence to working-class children like herself, and Liz Heron echoes these sentiments, although, like Steedman, she was well aware of the limitations of such services: it was their paternalistic, undemocratic delivery that made them vulnerable to subsequent attack. Introducing her anthology of autobiographical writings by girls growing up in Britain in the 1950s, Heron writes: "Along with the orange juice and the cod-liver oil, the malt supplement and the free school milk, we may also have absorbed a certain sense of our own worth and the sense of a future that would get better and better, as if history were on our side." Not any more! The shedding of public responsibility for the welfare of poorer women threatens to devastate the lives of millions of children in Britain, just as it has already done in the United States over the last two decades. Increasingly in Britain the new myth of "dependency culture" is used to condemn those receiving any form of state service, marking them out as vulnerable to "welfare dependency." Yet, despite the hassles and indignities they now face, surveys of single mothers have shown that a majority would still prefer dependency on the state to their experience of dependency on a man. That option is now disappearing. There was no doubting Margaret Thatcher's determination, in alliance with Reagan and the American Right, to overturn all traces of the postwar Keynesian economic orthodoxy with its support for spending on welfare, while upholding and abetting warfare spending.
What is somewhat less clear is the extent to which the Blair...
 
Social Text 21.2 (2003) 7-23 From the Israeli invasion of Lebanon in the summer of 1982 to the signing of the Declaration of Principles on 13 September 1993, Palestinian politics witnessed the transformation of the PLO from a revolutionary national liberation movement to a partner in a U.S.-led peace process. The demise of the PLO as a revolutionary movement began a decade earlier with its negotiated withdrawal from Beirut and the establishment of new headquarters in Tunis. The Sabra and Shatila massacres and the painful war of the camps in the mid-1980s devastated the PLO and revealed the weakness of Arafat's leadership. Cut off from its major civilian constituencies in the refugee camps of Lebanon, Syria, and Jordan and wrecked as an armed resistance group, the post-1982 PLO was unable to formulate a coherent political strategy that could meet the demands of the constantly shifting grounds of the struggle. The beginning of the Intifada in December 1987 interrupted the PLO's slide into inaction by upsetting the status quo in the occupied territories and opening alternative fronts of opposition to Israel. Long neglected by the PLO as a strategic locus of operations, the occupied territories emerged during the Intifada as the principal site of resistance and the real stake in the conflict. The remarkable achievements of the uprising, especially in its first two years, gave a new mandate to the debilitated PLO leadership in Tunis and somehow offset the failures of the post-1982 period. The escalation of confrontations between Israeli soldiers and Palestinian youths enhanced the PLO's ability to mobilize effective international opposition to Israel's policy in the occupied territories and also forced the organization to reassess its established positions. 
At the 1988 Palestine National Council meeting in Algiers, the PLO made a "Declaration of Independence" and announced a new plan of action that indicated a reorientation centered on achieving an Israeli withdrawal from the occupied territories. By the early 1990s, however, the Intifada had run its course, and the many Palestinians living under occupation were finding it increasingly difficult to endure the worsened living conditions resulting from Israeli repression of the uprising. The effects of the 1990-91 Gulf War—on the entire region, but especially on Palestinians following the PLO's announcement of support for Iraq—put an end to the Intifada, undermined many of the gains of the three-year uprising, and left the PLO isolated. One of the major outcomes of the war was the Middle East peace process, beginning with the public meetings in Madrid in October 1991, in which the Palestinian negotiating team was composed largely of personalities who had emerged during the Intifada as leaders from inside the occupied territories. The Madrid process came to an end with the announcement that the PLO and Israel had reached an agreement in secret meetings in Oslo, followed by the signing of the Declaration of Principles in Washington. The initial—and misplaced—euphoria following the Washington handshake between Arafat and Rabin, and Arafat's subsequent arrival in Palestine in July 1994, did not last long. The actions of the Palestinian National Authority in Gaza and Jericho soon revealed Arafat's failure to learn from the Intifada and to understand the concerns expressed by the Palestinian negotiators during the post-Madrid bilateral rounds.
No longer the head of a liberation organization—now the chairman of the National Authority—Arafat has proved himself to be often insensitive to Palestinians in the occupied territories and ineffective in negotiating with the Israelis. The years from 1982 to 1993 can now be understood as the period when the PLO abandoned the politics of resistance linked to those other struggles for national liberation and embraced the politics of appeasement, defined almost entirely in terms of U.S. recognition (Hassan 2001). I have rehearsed briefly the historical trajectory of the PLO from 1982 to the early 1990s, from Beirut to Jericho, to provide a specific context within which to discuss the relationship between Palestinian literary production and national politics. Prior to 1982, in the period when...
 
Social Text 21.2 (2003) 49-74 Eurocentric and Zionist norms of scholarship have had dire consequences for the representation of the history and identity of Arab Jews/Mizrahim (that is, Jews from Arab/Muslim regions) vis-à-vis the question of Palestine. In previous publications I suggested some of the historical, political, economic, and discursive links between the question of Palestine and Arab Jews, and argued for a scholarship that investigates the erasure of such links. Here, I will trace some moments in the hegemonic production of an isolationist approach to the study of "Jewish History" as crucial to a quite anomalous project in which the state created the nation—not simply in the metaphorical sense of fabrication, but also in the literal sense of engineering the transplant of populations from all over the world. New modes of knowledge about Jews were essential to this enterprise, which placed Palestinians and European Zionist Jews at opposite poles of the civilizational clash. Yet Arab Jews presented some challenges for Zionist scholarship, precisely because their presence "messed up" its Enlightenment paradigm, which had already figured the modern Jew as cleansed of the shtetl past. In Palestine, freed of its progenitor the Ostjuden, the New Jew could paradoxically live in the "East" without being of it. Central to Zionist thinking was the concept of Kibbutz Galuiot—the "ingathering of the exiles." Following two millennia of homelessness and living presumably "outside of history," Jews could once again "enter history" as subjects, as "normal" actors on the world stage, by returning to their ancient birthplace, Eretz Israel. In this way, Jews were thought to heal a deformative rupture produced by exilic existence.
This transformation of Migola le'Geula—from diaspora to redemption—offered a teleological reading of Jewish History in which Zionism formed a redemptive vehicle for the renewal of Jewish life on a demarcated terrain, no longer simply spiritual and textual but rather national and political. The idea of Jewish return (which after the establishment of Israel was translated into legal language granting every Jew immediate access to Israeli citizenship) had been intertwined with the imaging of Palestine as an empty land. Its indigenous inhabitants could be bracketed or, alternately, portrayed as intruders expected to "return" to their Arab lands of origin (a discourse that was encoded in the various transfer plans). A corollary of the notion of Jewish "return" and continuity in Israel was the idea of rupture and discontinuity with diasporic existence. In order to be transformed into "New Jews" (later Israelis), the "Diasporic Jews" had to abandon their diasporic culture, which, in the case of Arab Jews, meant abandoning Arabness and acquiescing in assimilationist modernization, for "their own good." Within this Promethean rescue narrative, concepts of "ingathering" and "modernization" naturalized and glossed over the historical, psychic, and epistemological violence generated by the Zionist vision of the New Jew. This rescue narrative also elided Zionism's own role in provoking ruptures, dislocations, and fragmentation in Palestinian lives and—in a different way—in the lives of Middle Eastern and North African Jews. These ruptures were not only physical (the movement across borders) but also cultural (a rift in relation to previous cultural affiliations) as well as conceptual (in the very ways time and space, history, and geography were conceived).
In this essay I will examine some of the foundational premises and substratal axioms of Zionist discourse concerning Arab Jews, arguing that writing a critical historiography in the wake of nationalism—both Arab and Jewish—requires the dismantling of a number of master narratives. I will attempt to disentangle the complexities of the Mizrahi question by unsettling the conceptual borders erected by more than a century of Zionist discourse, with its lethal binarisms of savagery versus civilization, tradition versus modernity, East versus West, and Arab versus Jew. While one might examine the position of Mizrahim within the restrictive parameters of what Zionist scholarship constructed as "Jewish History," I have long argued against creating such a segregated discursive space for history, identity, and culture. Even if Mizrahi identity was "invented" within the process of the Zionist invention of the "Jewish nation," it is important to unsettle the ghettoized nationalist analytical framework. A diasporized analysis...
 
Social Text 21.4 (2003) 127-138 The month following the U.S. commemoration of the thousands who died in the attacks on the World Trade Center brought a new wave of terror: a sniper in the Washington suburbs. Unlike the mass catastrophe produced by tons of collapsing steel and burning jet fuel, the sniper's campaign wrought massive uncertainty punctuated by randomly chosen but precisely aimed shots. Ordinary people doing ordinary things were transformed into targets, the suburbs into a shooting range. Seven people fell during the first three days of the sniper's attacks. "Killed while sitting on a park bench," "killed while walking across a parking lot," "killed while doing lawn work," "killed while putting gas in his taxi cab," "killed while vacuuming her minivan," "killed on the street corner," or "shot while loading packages into her car": each victim was a mark, frozen in a rifle's telescopic sight, isolated from his or her surroundings, a target in an exercise in marksmanship. Six more would fall before the snipers, John Allen Muhammad and John Lee Malvo, were finally apprehended. In terms of numbers, thirteen shooting deaths in the space of three weeks is hardly remarkable in the gun-crazy United States. Los Angeles alone can yield six hundred shooting deaths in a year, with the notorious South Central area contributing twenty per week. Even Muhammad and Malvo's exploits might have gone unnoticed had they spread their victims out instead of concentrating them in the highly visible topography of the D.C. area. Indeed, prior shootings in Louisiana and Alabama that were subsequently attributed to the snipers might have remained in the homicide limbo of unsolved crime—just another random shooting death. Only by concentrating their attack did the snipers' serial spree take on the proportions of terror and emerge as what Jean Baudrillard might call a "singularity [in] the heart of a system of generalized exchange."
October was a month of relentless uncertainty, steeped in the awful reality that anyone could get a bullet through the head while pumping gas. The sniper attacks brought a lottery of death to the suburbs. The banal landscape of car-choked roadways and parking lots and the commerce of gas stations, convenience stores, and strip malls were reconfigured in a new terrain of risk. The cloak of uneventful malaise that passes for security was torn asunder to reveal a population gripped with fear and anxiety. The media and law enforcement officials issued palliatives meant to calm, all the while fanning our fears with more uncertainty. We were told that the likelihood of death by the sniper's bullet was a one-million-to-one shot, and we were reminded that death in a car crash is far more likely than death by a random sniper. But the stakes seemed much higher, since every victim was so unremarkably just like us. Beyond reason and gripped by the faith we place in the luck of the individual (whether good or bad), we fell prey to sniper anxiety. During the height of the sniper attacks, USA Today reported that Americans were more worried by the sniper than by impending war with Iraq and the plummeting economy. Even people far from the sniper's epicenter, in places like Montana, worried that the sniper would generate copycat versions in their own neighborhoods. We, the society of rampant individualism and the culture of commodity replication (whose apparent contradiction has recently found resolution in the reported birth of the first cloned human), are doomed to the fear that every individual event will spawn its unwanted copycat replicant. So we plotted the sniper's strikes on a mental map of crisscrossing interstates from Montgomery County, Maryland, to Ashland, Virginia, wondering if the sniper would move farther south or do something truly spectacular like take a plane to the West Coast and begin anew with a fresh set of victims. Our fear enhanced the sniper's aura.
The FBI provided us with a profile of an angry white man, to which we added an apocryphal white van seen leaving the crime scenes. And with each day, we awaited the report of yet another unlikely, but no less...
 
Social Text 18.1 (2000) 55-79 On 22 August 1996, President Bill Clinton signed into law the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA). Once fully implemented, this act has the potential to destroy the means of subsistence of millions of working and jobless poor, who will be removed from all federal assistance programs granted under the Social Security Act by the year 2002. Yet the damaging effects of PRWORA have already been felt. Those targeted to suffer the most immediate and life-threatening cuts to the welfare state were "legal," green-card-holding immigrants receiving federal aid through the Social Security Administration (SSA). In September 1997, states began implementing their specific plans to remove immigrants from SSA-administered Supplemental Security Income (SSI) for the disabled and elderly, and from food stamps for dependent immigrant children. Following this initial immigrant removal, federal cash assistance programs, namely Aid to Families with Dependent Children (AFDC), will be drastically reduced and eventually eliminated for both citizens and immigrants within five years of the signing of PRWORA. Southeast Asians in the United States -- primarily Vietnamese, Lao, Hmong, and Cambodian immigrants -- have the highest per capita rate of public assistance receipt of any racial or ethnic group in the country. Originally placed on federal welfare rolls as a temporary and "adaptive" measure under the Indochina Migration and Refugee Assistance Act of 1975, a large segment of the Southeast Asian refugees who fled their homelands in the aftermath of the U.S. invasion of Vietnam and the subsequent U.S. bombing of Cambodia is now entering a third consecutive decade of welfare dependency, contrary to government officials' predictions of a seamless transition into American labor markets.
Stripped of their refugee status in the post-Cold War era, virtually all Southeast Asians have now been reclassified as permanent residents (or "legal" immigrants) and are therefore fully subject to the impending cuts under PRWORA. The consequences of the new law's "immigrant removal" campaign are sweeping and disastrous for Southeast Asian communities that, in California alone, have shown poverty and welfare-dependency rates of nearly 80 percent for the state's entire Southeast Asian population. For many of its critics, PRWORA represents a post-civil rights, neoconservative backlash against the poor and nonwhite sectors of the United States, particularly against the black urban poor who have been labeled the new "underclass"--a damaging term that encompasses a range of racist imagery including the sexually deviant "welfare queen," the "dysfunctional" black family, and the uncontrollably violent black male. These distorted and dehumanizing constructions are reproduced in mainstream periodicals, popular literature, and even liberal-leaning policy reports. Yet, despite its wide circulation, the term underclass evades a precise and agreed-on definition. According to historian Robin Kelley, this lack of precision stems from the fact that underclass has never actually referred to a class of people but rather to a set of behaviors. "What makes the 'underclass' a class," suggests Kelley, "is members' common behavior -- not their income, their poverty level, or the kind of work they do. . . . It's a definition of class driven more by social panic than by systemic analysis." The common behaviors that comprise the underclass are wide ranging and arbitrary; they can include criminal mischief, sexual promiscuity, shiftlessness, dependency, nihilism, immorality, and deviance. In an attempt to offer some coherence to these random sets of behaviors, underclass proponents have conveniently drawn on the term culture. 
Indeed, underclass behaviors are often grouped together to represent a so-called culture of poverty. Here, the new underclass literature is appropriating and distorting the work of Oscar Lewis, who, in the early 1960s, first introduced the phrase subculture of poverty. Lewis, however, did not evoke culture to describe a set of immutable behaviors, but rather to describe the daily practices that poor people engaged in, in an effort to both survive and resist systemic inequality and class polarization. For contemporary culture of poverty theorists, the addition of culture still fails to yield a more precise definition of the term. However, its evocation is nonetheless powerful, for it designates an ethnographic field, an "area of difference" wherein the aforementioned behaviors are confined to a particular group -- the black urban poor...
 
This article explores the ways in which Pentecostal media, especially electro-acoustic media, are integrated into the everyday life of a favela in Rio de Janeiro. It argues that the popularity of Pentecostal radio has to be understood in relation to the sociocultural meaning of sound, the architecture of the favela, and the ways in which sound is employed to mark off space and express identity and alterity. Pentecostal broadcasts acquire their meaning against the background of sound that evangelicals define as “worldly” rather than godly, and the cacophony of sounds in the dense urban space reflects the ongoing power struggles and the position that evangelicals try to maintain. People are inclined to listen to evangelical radio because it signals their “sanctified” position in the harsh and complex social conditions of the favela. By way of a discussion of the qualities of sound in relation to the built environment--its appearance as an extension of material and social boundaries and its manifestation as an immediate presence that defies boundaries--this article also explores some of the more intricate transformations that the use of electro-acoustic technology brings about in the nexus of religion, technology, and bodily dispositions. Many evangelicals consider electro-acoustic technology an important means of being “in touch” with God. This emotionally charged experience subjectively confirms the status aparte of the persons who adhere to the Pentecostal churches and, as such, authenticates the social distinctions they make between themselves and the other favela inhabitants.
 
Social Text 21.3 (2003) 39-57 In 2002, a researcher looking for endocrine markers of early childhood stress and trauma compared two groups: postinstitutionalized children adopted from Romania by U.S. families, and post-foster care children adopted by U.S. families. By virtually any measure—age at adoption, aggressiveness toward peers and family, trouble getting along with other children, school problems, delinquency—these two groups of children offered the same (considerable) behavioral and emotional challenges to their adoptive families. (In an unexpected note of grace, despite their serious struggles, a very high percentage of adoptive parents—84 percent—reported that they were either satisfied or very satisfied with their relationship with their adoptive children.) Finding these children so similar in so many dimensions raises a question: why did Americans rush to Romania in 1991 (2,594 State Department visas were issued for Romanian "orphans" that year, and only 100 each in 1990 and 1992) to adopt deeply troubled kids at considerable expense to themselves and with very little formal support for raising them, when they could have adopted substantially similar children in the United States, with institutionalized support and government subsidy? While the motives of adoptive parents are as heterogeneous as the children they adopt, this pattern suggests that something, culturally, was at work that steered people away from U.S. kids in foster care. In this light, legal scholar Richard Posner's paradigm of low price increasing demand for difficult-to-place U.S. children is clearly inadequate; something far more complex than money was at stake. (Patricia Williams cynically called Posner's the "old children, cheap" model.) In 1991, at the height of the Romanian adoption boom, many suggested that the answer was, crudely, race, as these were presumptively white children being adopted from Romania. Yet the considerable numbers of white children in U.S. 
foster care (nearly 40 percent) give the lie to this as a motive in any straightforward way. Either a lot of would-be adoptive parents were behaving in an extremely irrational way—spending upward of $10,000, $20,000, even $30,000, traveling halfway across the world, and negotiating difficult visa problems in a language few of them knew to adopt children who were as old, as traumatized, of the same race, and as sick as children that they could adopt in the United States with more information about their past, free access to health care and mental health care for their children, and non-need-based subsidies of upward of $600 a month—or adoption decisions respond to different pressures than are usually articulated. We read this apparently inexplicable difference in approach to Romanian children and children in the domestic foster care system as what anthropologist Sally Falk Moore terms a "diagnostic event." By this she means those moments of powerful contradiction that lay bare cultural logics, identify the diverse stakeholders in social conflicts, and reveal the genealogies of ideas linking institutions. Not all adoption is like the Romanian context; it is a very different set of practices in diverse places. This particular time and place draw some of the contradictions most starkly, however, leading us to ask whether they inform us about important logics of childhood. We are not interested so much in the motives of individual adoptive parents—good, bad, or indifferent—but in looking at the kinds of knowledges, beliefs, and institutional and state practices that made their decisions seem natural and inevitable, that structured what potential adopters were likely to know, believe, and not know, and that may have led individuals to think that adopting U.S. children in the foster care system was far more difficult than it, in fact, is. 
The conundrum of Romanian adoption in 1991 lays bare a genealogy of the racialization and biologization of poverty through a series of academic discourses and social policy decisions that resonated with popular cultural expectations concerning the plasticity of children exposed to turbulent environments in their earliest years. In order to explore how these tropes of resilient (overseas) and toxic (U.S.) childhoods were produced, to show their interrelationship, and to make the argument that they together did powerful cultural work (often in the service...
 
This article examines the ruins of the former structures of rule in a medium-sized town in the Sultanate of Oman. It explores how people in Bahla relate to and perceive the ruins of forts, walls, and neighborhoods that had helped maintain order in the previous era and that are being transformed in the new state. While the fort has changed from local and regional political-military center to national museum, helping shape more abstract and impersonal relationships to the past, the town's crumbling wall, whose origin myth and grandeur are no longer legitimized as an emblem of the town's identity, continues to evoke the sense of a protective boundary. And the town's neighborhoods are succumbing to simultaneous centrifugal and centripetal pressures: state attempts at streamlining the administration of neighborhoods are emerging just as the control of the neighborhoods through locked doors ends, giving way to the disciplining of movement and the realignment of forms of belonging. Ruins of the former regime are selectively being legitimized, rebuilt, and incorporated into the new centralized and bureaucratic state, while memories of the past system of rule and governance, although shifting and certainly also selective, continue to mediate people's relationships to and senses of these sites. The shifts in perception and spatial experience that have accompanied the ruins of the old regime and that have emerged in the wake of the changing regime's management of order are also further reminders of the possibilities of Oman's future after the demise of the sultan and the depletion of its oil reserves.
 
This paper provides an analysis of the working of a musical form: bandiri. Bandiri is a musical genre performed by Sufi adepts in northern Nigeria who take popular Hindi film songs and change the words to sing praises to the Prophet Mohammed. In doing so they are involved in a complicated process of taking a profane genre and sacralizing it. I argue that bandiri is the result of the convergence in Kano, northern Nigeria, of three very different sorts of transnational cultural and religious networks: the long presence of Sufi brotherhoods in the north; the recent emergence of an anti-Sufi Islamist movement; and the continuing popularity of Indian films and songs. As an urban centre, Kano is made up of overlapping sets of cultural, religious and economic networks that constitute its particular configuration. These networks create structural preconditions that provide the raw material from which urban experience might be fashioned.
 
Social Text 18.4 (2000) 1-23 The word international was coined by Jeremy Bentham in a book published in 1789. In the very first instance where the term appears, it is aligned with the word jurisprudence. International jurisprudence is suggested by the author to replace the term law of nations, which he deems a misnomer. He explains his rationale for the new term in a footnote. I would like to suggest that the drive behind the neologism, as indicated by Bentham, may offer a remarkable key to the entire Benthamite edifice. The older phrase law of nations, according to Bentham, refers to a certain discursive space only through the force of custom, or convention. What he thinks to be more appropriately required, however, is a designation that would go beyond mere convention. The phrase international law is therefore purportedly linked to that which it signifies in a manner never achieved by the phrase law of nations. Law of nations, in other words, is a sign relying on the mediation of convention. Without the convention, "the force of custom," the phrase law of nations might be understood as one designating the domestic, municipal law of diverse nations. On the other hand, international, corresponding to that to which it refers in an intrinsic, immediate, or unmediated manner, is a term of presence in the Derridean sense. That is, international is a term that stands in no need of the mediation of custom and convention. The drive behind the neologism could be a unique key to Bentham's thought, for it is emblematic of a certain attitude toward meaning formation in language that significantly underwrites much of the Benthamite oeuvre. The Marquis de Sade, a contemporary of Bentham, was also troubled, notoriously so, about the mediation of custom and convention in meaning formation. In this article I bring together Bentham and de Sade to argue that the two share a strikingly similar view of language, one that is symbolized in the coining of international by Bentham. 
The distinction Bentham assumes between the terms law of nations and international law crudely corresponds to the opposition J. L. Austin famously draws between "performative" and "constative" instances of language. Accordingly, while performative utterances are essentially interpretive, requiring the mediation of such extrinsic elements as context and convention, constative utterances are mere descriptions of objects or states of affairs. Constative utterances, consequently, stand in no need of extralinguistic, contextual support. In the course of his analysis, however, Austin comes to raise more and more doubts about the validity of such a radical distinction, hinting instead at constatives as no more than a mere subcategory of a now generalized category of performatives. In this view, all language is essentially interpretive. Bentham, on the other hand, is convinced that in order to attain rigor in meaning, language needs to be cleansed, as much as possible, of performative terms, namely terms that require the extrinsic support of custom and convention. I argue in this article that de Sade is equally contemptuous of the mediation of extralinguistic elements in meaning formation in language. I then link the two authors to contemporary studies of the international, of international political theory, suggesting that the prejudice against the mediation of custom -- of the performative, the mimetic, which Bentham and de Sade prominently display -- is an equally prominent characteristic of the mainstream perceptions of international politics. This concept of global politics rests on an exclusion of the mimetic from its thematic concerns, with an emphasis instead on the "real." The history of international relations as an autonomous discipline is that of international politics studied from a perspective introduced in the late 1930s and 1940s by writers such as E. H. Carr, Hans Morgenthau, and George Kennan. 
This so-called Realist perspective on the international is built on a sharp distinction of principles...
 
Social Text 19.2 (2001) 75-101 For students of black radicalism, Richard Wright's Black Power (1954), his report on the nationalist revolution in the Gold Coast colony (present-day Ghana), inevitably frustrates the expectations raised by its prophetic title. Wright undermines his avowed anti-imperialism with numerous problematic assertions about African culture. Wright's evident disdain, in Black Power, for the folk cultures of peoples of African descent places him fundamentally at odds with major currents of black radical thought, which regard the cultural resistance of African peoples as imaginative responses to their subordination. Wright's view of the backwardness of African peoples and culture, and his inability to question his teleology of modernization, have clouded his legacy and hindered recognition of Black Power's importance in theorizing black radicalism and diaspora. In this essay, I argue that despite the considerable problems of Black Power, Wright demands our attention for his revisionist reading of the condition of blacks in the diaspora, which he understands dialectically as the product of slavery, dispersion, and oppression, and simultaneously, as the necessary condition for black modernity and the forging of an anti-imperialist critique of Western culture. However unfashionably idiosyncratic Wright may appear to us today, it is important to realize that he was not the lonely rebel he often made himself out to be. With fellow black radical intellectuals in exile, including George Padmore and C. L. R. James, Wright was engaged in theorizing about the revolutionary significance of black and African peoples' struggles against Western oppression. Moreover, like Wright, Padmore and James were vehement in their own rejections of Negritude, the politically charged assertion by some Francophone African nationalists of a transhistorical, transnational black cultural unity. 
Wright's revisionist reading of the African diaspora offers an explicit critique of the diaspora-homeland binary. This is of paramount importance, insofar as that binary powerfully underwrites nationalist and essentialist understandings of blackness and is frequently indicative of a black diaspora identity invested in an often diversionary quest for authenticity. Wright thus challenges both the black cultural commonplace of the African diaspora as the fallen condition whose resolution is obtained through reclamation of African roots and its converse: the equally irrelevant assertion that one cannot "go home" to Africa. Instead, Wright's discussion of anticolonialism recasts diaspora as the mobilization of black modernity toward a transnational, transracial community of struggle. Wright's text, read in the context of his participation in transnational politics of decolonization and African liberation, suggests yet another sense of the dialectic of diaspora. Brent Edwards has given an account of the emergence of the term during the mid-1960s as a corrective to totalizing declarations of pan-Africanism that masked the waging of sharp political and ideological disputes within decolonization movements. This corrective sense of diaspora as the pragmatic recognition of a diverse black world fragmented by language, politics, and ideology is commonly displaced by those who invoke diaspora as the sign of incommensurability. Instead of perceiving internal tensions and differences as symptomatic of political struggle, defining diaspora as incommensurability simply acquiesces to the notion that national, geographical, or cultural differences pose insurmountable obstacles to transnational solidarity. This essentializing, reductive sense of diaspora as an unbridgeable gulf reinscribes the diaspora-homeland dyad and is routinely invoked in the wake of the destruction of pan-African and global black radical projects. 
Wright's exemplary reflections on diaspora suggest a larger problem of the politics of location bearing on the transnational careers of black radical thinkers as varied as Wright, Padmore, Malcolm X, and others. If exile and expatriation have been vital and liberating for some black radical intellectuals, then what of the relationship of these intellectuals to communities of struggle, both within their countries of origin and elsewhere? What were the conditions either facilitating or mitigating the production and dissemination of knowledge between these intellectuals and audiences and vice versa? These questions move beyond issues of location framed by the diaspora "return" to the ancestral homeland as a gauge of authenticity. Their ultimate aim is an account of the interface of global black radical projects with the configurations of hegemonic power they confronted. Wright's reflections on black...
 
Social Text 22.1 (2004) 17-33 It is a hot day in Johannesburg, the last day of work before the summer vacation. December 2001. From the central foyer of the offices where I work, I can see into the inner city, shards of light on the glass building shaped like a diamond, the new taxi rank, one of four going up in the city for the 800,000 commuters passing through every day, the Market Theatre, and the Mandela Bridge starting to take shape. I am reading the Sowetan, South Africa's largest-selling daily newspaper. On the front page, a full-color, full-page photograph of Orlando Pirates and Kaizer Chiefs, ready to spar for the soccer cup on the weekend. On page 2, the "In Brief" column offers snippets. One reads: "'Fire' Held over Child Support. Sundowns star midfielder Joel 'Fire' Masilela is expected to appear in the Mamelodi Magistrate's Court today after he was arrested for alleged failure to pay maintenance." The first snippet refers to a baby rape, the rape of the youngest baby yet, one of many since the start of the year, and the one that has most upset the public. The second snippet, detailing the arrest of a well-known soccer player for failure to pay maintenance to his ex-wife, reveals the law in action, protecting the rights of women, bringing to book men who try to get away without paying child support. Two snippets, mini states of the art unfolding along two South African trajectories: violent histories of the body, and rights that have come, if intermittently and in important redemptive pockets, to be protected by the most liberal constitution in the world. As I turn to page 4, another "In Brief" snippet tells me that "Gauteng MEC for Safety, Liaison Nomvula Mokonyane, and Social Welfare MEC Angie Motshekga are expected to address hundreds of men who will be marching against children and women abuse and rape in Ivory Park, North Rand, at the weekend." Another redemptive pocket. 
If this is one kind of action taken in relation to the dramatic conundrum of men, women, and girls' bodies that is unfolding, page 7 reveals a quite different kind. High on the page is an image, drawn by hand. See figure 1. It is a chastity belt for babies, for small girl bodies. Not girl bodies in general but for specific parts of their anatomy—their vaginas, buttocks, torsos, and, up in a loop, their shoulders. The headline reads "Keeping Safe Below Belt," setting up a link with safety as it draws on the euphemistic phrase below the belt, with its origins in sport's ritualized violence. Charity Bengu (2001a, 2001b) has the story: "A chastity belt with a lock and a key has been developed in a desperate attempt to protect children from escalating incidents of rape. The device, which is designed to prevent the woman wearing it from having sexual intercourse, comes after a series of reports of child rapes." Although the first sentence refers to the protection of children, it is only girl children, and babies, who are involved. The second sentence refers to "the woman" who will be wearing the device, but it is not for women—it is for young and baby girls. Perhaps children and women carry a greater moral invocation, as political categories, than girls? The story continues: "The South African Institute of Race Relations (SAIRR) said 58 children were raped or nearly raped in South Africa every day. ... Developed after extensive research spanning ten years, the device will be officially launched in Pretoria in January next year by a group of independent researchers led by Mrs. Sunette Ehlers, a former 'heamolatology [sic] and serology' researcher at the South African Institute for Medical Research." I call Charity Bengu. Charity...
 
George W. Bush and his "coalition of the willing" wage war on the corrupt regime of Saddam Hussein. Islamic fundamentalists deride their national governments as corrupt and, accordingly, have little love for the United States, a patron of many of these regimes. The World Bank has declared that corruption is the single greatest obstacle to global development. The Michigan Militia and similar right-wing populist groups claim that federal institutions, such as the FBI and IRS, are a corruption. Left-leaning critics and reformers such as Michael Moore and Ralph Nader attack the corruption that presumably plagues American political and economic life. The list could go on and on; it seems that there is hardly any contemporary political tendency that does not contain some form of anti-corruption agenda. It is striking that so many disparate and competing political discourses all agree that corruption is a problem, oftentimes the problem. Regardless of the interpretive frame (right, left, populist, technocratic, religious, secular, etc.), the specter of corruption is a constant, and is both unavoidable and unquestioned: unquestioned in the sense that the undesirability of corruption is taken as a given and no substantive argument is needed (who is, after all, in favor of corruption?), and unavoidable in that corruption seems to refer to underlying tensions, antagonisms, and traumas that, regardless of one's conceptual toolbox and political tendencies, cannot be ignored or passed over.
 