Is Violence Cultural?


As the #YesAllWomen hashtag trended over the weekend, I tweeted out a few of my own experiences. In response to one of my tweets [about having been menaced on 3 separate campuses by male students who were antagonized by the low grades or critical evaluations that I gave], a friendly tweep asked whether my experiences could be ascribed to a culture of violence. It was an important question, and I didn’t respond at the time, as 140 characters seemed rather limiting. I want to think through one part of that question here. But I want to note: my comments are not a reflection on my friendly interlocutor; rather, I’m trying to explore my concern about the phrase.

I’m always surprised when the words ‘violence’ and ‘culture’ are placed in close proximity. Much like the phrase “social construction of race,” the notion of a “culture of violence” seems to create an artificial stopping point at what should be the beginning of an analysis. These days, the phrase ‘social construction of race’ indicates a moment in the political development of theories of race rather than some meaningful insight in itself. Similarly, the notion of a ‘culture of violence’ is often offered to explain the pro-gun discourse that marks the US in international eyes, or the massacres that seem to be occurring with increasing frequency in the United States; the most recent to come to public attention was carried out a few days ago by a young man, Elliot Rodger. The phrase ‘culture of violence’ seems to be immediately problematic in several ways. First, it obscures the specificity of various kinds of violence (a shooting in cold blood versus a woman who shoots at an ex-lover in self-defense; a serial massacre by a young man versus a military massacre of a village). I’m not suggesting that they are all equally horrific or heinous. Rather, I want to suggest that the level and quality of (dis)approval in each case is affected by the conditions and institutions which supported that action. The second, closely related, way in which the discussion of a ‘culture of violence’ is problematic is that it elides state-led policies that endorse certain kinds of violent actions based on who is committing the violence and against whom it is committed, rather than on the action in question.

Examples of the second would include executive policies such as the memo that authorized the use of drones to kill people suspected of terrorism (or a governmental body voting in favor of a federal judgeship for the lawyer who co-authored that memo); or the actions of federal judges who exculpate police officers who shoot young black men while sentencing a political protester to prison for elbowing a police officer who groped her; or a range of bills that approve the pre-emptive policing, potential detention, or profiling and entrapment of thousands of people who loosely fall into the same group as the 19 men who flew planes into the World Trade Center in 2001.

You get my point.

‘Culture,’ like ‘social construction,’ seems to smuggle in an assumption that certain traits are permanently embedded, without confessing to that assumption. It seems that culture is most often used in four different ways:

1. As a marker of identity: Indian culture, Russian culture, Irish culture, etc.

2. As a comparative descriptor, such as when praising a group of people affiliated with a certain society as having superlative values: French culture, Western culture, progressive culture.

3. To ascribe ‘primitive’ or ‘regressive’ traits to a group of people who are united on the basis of some practices or beliefs or (mutual) recognition of identity: Muslim/Islamic culture, Black culture, Masculine culture, etc.

4. To describe a set of (negative) practices that people abide by or embrace (wittingly or not), and therefore become part of that group: A culture of: consumerism, rape, terrorism, narcissism, violence.

Over a decade ago, at the first philosophy conference I attended after receiving my doctorate, my excitement melted into despair as I heard the keynote speaker, a white feminist philosopher of some renown, painstakingly describe how Palestinians and members of other Muslim cultures were more prone to a ‘culture of terrorism’ than people in Western societies. Her argument seemed to link violence to a population while avoiding references to biology, ontology, or nature. [Uma Narayan, Talal Asad and Edward Said have challenged such a link in their considerable writings, but to judge from its frequent invocation, it still seems to remain an easy go-to explanation.] And in forging this link, the keynote speaker indicated that these actions were compulsive, driven by the culture to which said people belong.

This kind of deployment of ‘culture’ is striking for its complete divorce from a discussion of historical, (geo)political, economic, social, and legal structures: what is the history of Palestine (or Iraq, Afghanistan, Kashmir, etc.)? What are the material, geopolitical, and social circumstances in which certain men and women engage in certain specific practices? What are the legal structures that punish certain men and women for acts of violence while turning a blind eye towards others? How do we construe violence or terrorism, when lone individuals or groups associated with non-state entities who blow up cafes become the prime figures of terrorism (and, if they survive, will most certainly face punishment at the hands of government or military forces), while other figures, surrounded by government security personnel as they instruct others to deploy drones against certain persons in Yemen selected by a computer algorithm, are hailed as heroes and voted repeatedly back into positions of power? All this, while those who provide legal validation for such practices are elevated to the nation’s highest courts (the most recent example being, of course, David Barron)?

Such a disarticulation from a discussion of underlying structures entrenches the belief that these practices are inherent, perhaps uniquely so, to the group with whom they are associated. So, to talk of a ‘culture of violence’ suggests that there is a set of violent practices that constitutes the fabric of a society, bringing that very society together as a unit, and which that society (or at least some part of it) doesn’t necessarily question, criticize, or challenge.

That may not be the intent behind the phrase; in none of the above four senses is culture used as a factual descriptor (even when that is the speaker’s intent), but rather as a rhetorical one. It is always possible to falsify a statement about culture that presumes that most if not all of a group’s members subscribe to a certain belief. Hindus are not all vegetarian; not all feminists believe that the hijab is oppressive; not all Muslims (women or men) believe that the hijab must be worn; the French don’t all believe in republicanism. All of these groups have internal debates about various issues, and it may be impossible, without (or even with) extensive surveys, to discover which part of the group holds or practices the belief in question, and whether that part of the group constitutes a majority.

My concern with the above deployment of the term ‘culture’ is that the speaker obscures the very structures that s/he claims to take into account by locating violence/narcissism/entitlement/rape in a generic culture. It is true that the word ‘culture’ can accurately connote a set of embedded attitudes regarding violence, rape, narcissism or consumerism. But, especially when ascribing these attitudes to a group that is already the subject of criticism, s/he implies that the actions of these populations are driven by their culture. I wonder whether ascribing certain events to a ‘culture of violence’ prevents us from having a more insightful conversation about the specific elements that drive a certain event.

Let me be clear: I do NOT want to exculpate men (or women) who benefit from patriarchy, white supremacy, or other systems validating hierarchies or endorsing oppression against groups on the basis of race, gender or nationality. These are systems, grounded in laws, economic policies, geopolitical history, and social policies of rewards and benefits, which can engender acceptance of the privileges that accrue to some persons on the basis of being, say, male or white (often without regard to class), or of being middle- or upper-class white women. And while it’s possible to talk of a set of beliefs that seem to be shared by those who benefit from patriarchy or white supremacy, I think it’s much more effective and important to prioritize a focus on systems rather than culture.

A useful followup to this rumination might be to problematize the discussion of “privilege” (as in white privilege, male privilege, etc.). That will be for a future post.

Children murdered, homes foreclosed: How the government makes “mistakes” with impunity

Anyone who’s been at the mercy of the DMV, the IRS, or a health insurance company knows that bureaucracies make mistakes; most people are accustomed to it. Even presidential administrations and the U.S. Armed Forces make mistakes.

Yet when it comes to U.S. national security policy, raising the question of mistakes that cost lives is chalked up as a minor issue: “We have to expect collateral damage in wars/drone strikes/bombings/armed conflict.”

If we know that organizations make mistakes, then it’s not that hard to see that organizations without external oversight and accountability will be empowered to make mistakes with impunity.

Not rectifying mistakes, not allowing oversight, and refusing to be accountable to an external judicial body are considered by many to be abuses of power. But abuse can only be claimed when a state promises to be accountable. If the state claims that it can’t be accountable, can’t be reviewed for mistakes, can’t rectify mistakes because such practices would be dangerous (the reason isn’t really important here), then at most levels, it’s hard to name the state’s attitude as abuse.

Moreover, as journalist Margaret Kimberley points out, the Obama Administration has claimed the right to kill American citizens without charge or trial. That’s not an abuse of power. It’s a complete usurpation of power. There is no space by which to claim the Administration should have acted differently by its own lights.

Wouldn’t it be more accurate to call this, not the abuse of, but the monopoly of power?

In 2005, Rahinah Ibrahim was “cuffed, detained, and denied a flight” to Hawaii to deliver a conference paper about sustainable housing. She was allowed to return home to Malaysia, but because her name was on a U.S. government no-fly list, Ibrahim’s visa was subsequently revoked; she was prevented from returning to the U.S., thus effectively ending her doctoral studies at Stanford. She eventually finished her dissertation in Malaysia, and sued the US government to have her name removed from the no-fly list. But the courts initially ruled that she had no legal standing to sue the US to change its policies because she is a non-citizen, and the US’s efforts to fight terrorism could not be challenged by a foreign national.

Ibrahim persisted, and at least in the most recent round, won. Despite the US’s best efforts to the contrary, Ibrahim is the first person to successfully force the US government to remove her name from the list. U.S. District Judge William Alsup’s ruling points out that the US government had erred: an FBI agent confessed to having filled out the No-Fly list form for Rahinah Ibrahim in exactly the opposite way from how he should have. Alsup had suspected as early as December 2009 that Ibrahim had been the victim of a “monumental” government error.

Murtaza Hussain, in an excellent assessment, points out that Attorney General Eric Holder abused the state-secrets privilege in the Ibrahim case. In an affidavit from April 2013, Holder invoked the state secrets privilege as the reason that the Department of Justice could not turn over the records regarding why her name was put on the no-fly list. Referring to the 2009 State Secrets Policy established under a young Obama Administration, Holder promised that he would not claim the state-secrets privilege to hide wrongdoing, incompetence, inefficiency, or embarrassment. Nor would he invoke it to “prevent or delay the release of information the release of which would not reasonably be expected to cause significant harm to national security.”

Clearly, Holder lied. The reason we know that Holder lied is because of what was revealed in Judge Alsup’s decision. In this specific instance, we have clear evidence that the Obama Administration abused its power, on the view that abuse of power occurs when a government has promised to behave within certain procedural bounds and legal limits but has stepped beyond them.

As journalists Kevin Gosztola and Marcy Wheeler demonstrate, the Obama Administration is completely indifferent to its own state-secrets policy, except as a subterfuge. It has invoked the privilege time and time again, for horrendous ends. As Shahid Buttar, head of the Bill of Rights Defense Committee, explained to Gosztola back in 2012, the invocation of the state secrets privilege preserves:


the ability of the FBI to “stand above the law” and not answer to any authority when they outright lie or make deliberate misrepresentations about what kind of operations they are or are not conducting. Also, it makes it possible for the Executive Branch to enjoy extraordinary immunity from punishment when incredible abuses of power are committed and cases on torture, warrantless wiretapping or spying are brought forward in court.

State secrets privilege is but one of multiple excuses that the Obama Administration, like the Bush Administration before it, has used to expand its own power without any accompanying review or oversight. Whether through the continued renewal of FISA (which candidate Obama voted in favor of in 2008), the NDAA of 2012 and 2013, or a myriad of other laws, the Obama Administration has endorsed the unchecked expansions of power claimed by the FBI, the CIA (often in collusion with the NYPD), and the DOJ. Countless foreigners have been rendered from Somalia, Sweden, and elsewhere, and interrogated without defense lawyers; numerous men have been placed in solitary confinement in prisons around the country, still unaware of the charges against them, with sketchy trials at best. Some of these men have been rendered stateless with the help of the British Home Office, such that their kidnappings could not be contested. Muslim communities all over the United States (in Southern California, Oregon, Minnesota, NY, Pennsylvania, and New Jersey) have been subject to spying and entrapment.

Let’s not forget Terror Tuesdays and the Disposition Matrix, where Obama Administration officials gather to determine which alleged terrorist to execute next—without evidence, without oversight, with impunity.

It has also recently been discovered that the FBI (the agency whose agent mistakenly placed Rahinah Ibrahim on the no-fly list) holds the power to delay the citizenship applications of Muslims, a policy enacted under the Bush Administration but still in effect today.

Mistakes, shmistakes.

The targeting of Abdulrahman Al-Awlaki, the 16-year-old U.S.-born son of Anwar Al-Awlaki, was a mistake.

Putting the name of post-surgery, wheelchair-bound Stanford doctoral student Rahinah Ibrahim on a federal no-fly list in 2005 was a mistake.

Hundreds of thousands of people were subject to housing foreclosures due to mistakes.

The Obama mortgage settlement allows for a threshold error rate for mistaken foreclosures.

Killing scores of civilians by drones is a mistake.

Incarcerating innocent (but not guilty) men without charges or trials is a mistake.

Holder’s behavior and that of many of his colleagues in the Obama Administration, such as DNI James Clapper, indicate that they have no problem with mistakes, or with lying about government practices, evading demands for evidence, or concealing violations of law. This may make them corrupt, on the view that there should be a higher standard of behavior for government officials, one of consistency and accountability.

To the extent that the Obama Administration has conceded to calls for oversight, it has facilitated pseudo-review boards, as when Obama appointed DNI Clapper to review the NSA’s protocols. Even the name of the group, “Director of National Intelligence Review Group on Intelligence and Communications Technologies,” indicated no interest in external oversight.

If lying, evading and concealing are part and parcel of the Obama Administration’s approach to national security (the other part being that any and all strategies will be utilized, without regard to accountability or oversight, because these are deemed necessary to protect the public at all costs), then Holder’s and Clapper’s actions don’t reveal an abuse of power, but rather the precise and intended application of power.


If the Administration promises to behave within certain procedural bounds, along with the proviso that it will be the sole arbiter of when and how to execute its power, to whom it will delegate its power, and who will be subject to its power, then we should not name that the abuse of power, but the ultimate monopoly of power, indeed its ultimate expression, and we should laud the Administration for resolutely carrying out its own promises and marvel at its rare consistency!

In fact, as many have pointed out, the Obama Presidency is following in the footsteps of the Bush Administration. It might be more accurate to say that the current Administration is carving out even bigger footsteps for itself, what with its impressive record number of drone murders, solitary-confinement-based incarcerations, domestic and global surveillance, deportations of migrants, and its pointed indifference to looting bankers. By claiming the right to wield power without apology in all areas of national security, domestic and foreign, and on behalf of Wall Street, the Obama Administration is claiming the status of the Leviathan, as the sovereign authority in Thomas Hobbes’ 17th-century treatise on politics is named.

The Leviathan claims to be both the actor and the author of the collective will: once people have handed over their consent to the sovereign (demonstrated by abrogating each individual’s right to kill), the Leviathan claims that power in the name of the people completely. The Leviathan can do no wrong and admits to no wrong. What’s more, unless a person can find a stronger protector, they have no choice but to submit to the Leviathan’s authority.

So the Obama Administration claims to be the ultimate sovereign authority, beyond challenge, dissent, or resistance: it refuses to admit that its policies are fraught with mistakes, refuses to concede that its mistakes have hurt innocents needlessly, refuses to correct those mistakes in the name of state security, and resists all attempts to hold it accountable by resorting to incarceration (John Kiriakou), mock trials (e.g., Chelsea Manning) or no trials (Barrett Brown), rescinding passports (Edward Snowden), coercing other sovereign states to incarcerate challengers to its power (Yemen/Abdulelah Haider Shaye), and killing citizens and foreigners alike without review and with impunity (whether by drones or financial starvation). It makes the same claim as the Leviathan.

At some level, the question that needs to be addressed is not whether the Obama Administration is interested in holding itself accountable (it clearly is not), but whether we are interested.

If US citizens are interested in accountability from an Administration that considers itself not only to be above the law, but to be unilaterally creating law and (by extension) determining others’ criminality through its own (often secret) standards, then we have to decide how to wrest back power from an absolutist state. By an absolutist state, I mean an Administration that considers dissent, scrutiny, and criticism from any lowly individual unforgivable, while insisting that its own mistakes (real and contrived) are necessary to its self-awarded status as the ruler of the world.

________________________________________________________

This piece was originally published at Salon.com.

Correcting the Poor: The Civilizing Impulses of Homo Corporatus and Private Charities*

This is the next post in my series on Neoliberalism and Charity. Part 1 is posted here and at New Economic Perspectives.

_________________________________

Should anyone, the state or any other source, have an obligation to interfere with you in order to bring about your best, flourishing self?

Certainly, this is a debate that philosophers such as Isaiah Berlin and libertarians such as Robert Nozick have engaged in heartily, with an eye to socialist frameworks that redistribute resources in order to produce certain kinds of outcomes. Should the state impose certain ideals and goals upon you, and why? There are certainly examples of good state-imposed expectations, such as seatbelt requirements or prohibitions against drunk driving, as well as terrible examples, such as state-imposed prohibitions on certain kinds of drugs.

In a neoliberal era, the corollary to the above question is whether non-state organizations should have the ability to interfere with you in order to bring about your best, flourishing self.

This question emerges in the wake of the heralded contrition of Sam Polk, as expressed in a New York Times opinion piece, where he offered a self-congratulatory description of his decision to give up being a Wall Street trader and “money addict,” and instead to form a charity that awards “grocery scholarships” to “poor moms.”

Polk’s charity, Groceryships, on its face appears to be a thoughtful idea.  Indeed, the basic Groceryship is a “scholarship for groceries.”

 Soon a simple one emerged: what if we bought groceries for a family for six months. I imagined a single mom, working overtime to try to put food on her table, and falling short. We wanted to give that mom some breathing room, and her kid some healthy food in the process.

The language of Groceryships is certainly neutral, but it tells a story that reveals a number of assumptions about poor folks. In his tale about how Groceryships started, Polk narrates how he and his physician wife learned about eating better, and how they might be healthier if they ate better (apparently, this was previously unknown to them). So they got to work, switching to whole foods and eliminating processed and fatty foods. Though they suffered “withdrawal” from their addiction to unhealthy foods, they were able to kick their habit. (Addiction seems to be the lens through which Polk understands many phenomena.)

We started buying tons of vegetables and whole grains, and cut down on fatty meats, sugar, and processed foods. It was hard. Very hard. Kirsten and I both experienced what we can only describe as withdrawal symptoms—nightmares, panicky feelings, irritability.

After a few weeks those symptoms faded. We found we enjoyed eating healthy and especially how good we felt. We no longer had to battle ourselves about whether to eat another Cheetos, or felt shame about eating too much cake. That everyday battle-stress just faded away. We ate at mealtimes, snacked when hungry, and felt great. After three months, Kirsten got her cholesterol levels tested. They’d been cut in half. She went off Lipitor.

Polk and his spouse were so impressed with the results that they wanted to share their newfound knowledge and to give back to society at the same time.

A few months later, we watched A Place At The Table (sic), a documentary focused on the staggering numbers of Americans, especially children, facing food insecurity. Each day 50 million people in this country (including one in four children) go hungry.

Growing up, my parents struggled, living paycheck to paycheck. But it never got so bad that food wasn’t on the table. Kirsten and I were horrified that so many people—kids!—were hungry. We were especially horrified that many of these kids lived down the street from us. Los Angeles is a segregated city. It’s easy to forget that just a few miles away people were starving.

I guess the truth is that we had known that; we’d just never taken ownership of our responsibility to do something about it. That day, we decided to help.

Polk recognizes the correlation between poverty and hunger, but he frames this correlation in the language of “choice” and options:

Hunger in America looks strange; there is a definite correlation between food insecurity and obesity. You’d think that people who can’t afford food would be rail thin, but it’s often the opposite. People that struggle to make ends meet tend to opt for the cheapest calories, processed/fast food. They often live in Food Deserts, areas where nutritious produce is simply not available. (Emphasis mine)

Perhaps the implied causation was inadvertent. Perhaps Polk recognizes that such “opting” is the result of being short of cash. In which case, the solution would be to distribute sufficient money to buy healthier food. And certainly, that seems to have been the initial idea, but Polk frames the solution in these terms:

…we realized that mom could also use some nutrition education and group support. We remembered how difficult quitting sugar and processed/fast food was for us, and we realized that a structure of support would be helpful, necessary.

It suggests, helpfully, liberally, that poor moms, perhaps through no fault of their own, don’t know much about nutrition. So, families who receive a “Groceryship” will be supported not only financially, but medically, educationally, and emotionally. Support typically means that resources are available to help one advance towards a goal, but are not mandated. By contrast, mandatory resources are not forms of support but a form of discipline: if you must avail yourself of a resource, then you are not being supported; you are being compelled.

Groceryship awards are not merely the distribution of groceries with the “option” of attending nutrition classes; rather, the classes are required. “Poor moms” who apply for the meritorious award must swear their allegiance and commitment to attending nutrition classes and “weekly meetings,” and to doing weekly homework. It’s as if they were young, naïve, subservient children.

Indeed, Polk acknowledges that his program is different from, but can be used in conjunction with, SNAP (food stamps), which “provides financial support to struggling families (link not in original), but doesn’t insist the money be spent on healthful foods, or teach families how to prepare and shop for those healthy foods.” (emphasis mine)

In that simple sentence, Polk reveals more of his (limited) worldview: the state “does not insist that the money be spent on healthful foods.”

Had Polk searched, he would have found that, if anything, the food stamp budget severely constrains the purchase of healthy foods. According to the Center on Budget and Policy Priorities, the maximum monthly budget for a family of 4 (i.e. those who have no other income) on food stamps is $632.

That boils down to $5.64 per person per day. Whole Foods, expensive as it is, accepts food stamps; there are multiple sites where families have accepted the “Thrifty Whole Foods” challenge to shop for whole foods on a food stamp budget. I’ll let them tell their stories—many of which have various helpful hints about how to shop and cook on a limited budget.
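For readers who want to see how that per-day figure falls out of the monthly maximum, here is a quick back-of-the-envelope check; the only assumption is that the month is counted as four weeks (28 days), which appears to be how the $5.64 figure was derived:

$$
\frac{\$632}{4 \text{ people} \times 28 \text{ days}} = \frac{\$632}{112} \approx \$5.64 \text{ per person per day}
$$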

In short: it is possible to cook healthy foods on a severely restricted budget. But healthy foods require adequate kitchen facilities to process and cook them. Poor families, who can presumably only afford housing that is cheap (cheap because landlords don’t make repairs to provide decent stoves, rat- and cockroach-proof storage, or the adequate refrigerators needed to store fresh foods), often do not have those facilities; tenants are therefore forced to choose processed, sealable, storable foods.

As I’ve noted elsewhere, time (or rather its scarcity) becomes a severe constraint if a “poor mom” is also working or doesn’t have access to child-care so that she can schlep to her Whole Foods easily/quickly, and also process said healthy foods. The issue of access to transportation that allows her to get to her Whole Foods will also, chances are, constrain her free cooking time further. But all of these constraints raise another urgent issue: namely, the assumption that someone who is both cash- and time-poor should be expected to cook whole foods after long, difficult days. How many working professionals are expected to cook full, healthy meals after a full day of work?

Aside from the sheer difficulty of spending money on “healthful foods,” there is also the issue of why any state should impose a certain standard on those who are dependent upon public monies for survival, when it does not impose the same expectations on the rest of its citizens.  It calls to mind Isaiah Berlin’s discussion of positive liberty.

For Berlin, positive liberty, defined as the ability to “be my own master,”[1] is least harmful when I am able to decide how to live my own life and to make my own decisions, rather than having to depend upon external forces. As a counterpart to negative liberty, namely the liberty that protects me from being harmed by others and the state, positive liberty allows me to find a way to flourish, to decide how I want to live. In this, Berlin marks an idea that also surfaces in the work of Hannah Arendt. Arendt criticizes the “Social,” that dimension of society that is subsumed by the economy, where one’s acts are instrumental, where one works in order to make a living.[2]

For Arendt, this idea undermines our very humanness. It coerces us into thinking only about life, about living, rather than acting, understood as great words and great deeds. The economy, with its inducement to consume, to work in order to live and consume, was anathema to Arendt. She was critical of the notion that one’s goals must have utility. Being healthy exemplifies this idea: health has become naturalized as an end in itself, but is in fact about usefulness: to be less of a drain on society, to be aesthetically pleasing, to appear successful.

To be fair, Arendt’s is precisely not a socialist ideal, in which one’s needs are met through a communal society where one hunts, fishes, and reads, in the model of a balanced life. Nevertheless, Arendt’s fear comports with that of Berlin, who skeptically asks:

“What, or who, is the source of control or interference that can determine someone to do, or be, this rather than that?”

To find a way to flourish without being forced to live out another’s expectations for you—this was both Arendt’s and Berlin’s concern. This question was a challenge to the authoritarian state whose creeping influence, in their experiences, had been detrimental, to say the least.

But the creeping state is not the issue at stake with regard to Sam Polk and Groceryships. Rather, the issue of state-imposed expectations has been derailed by the forceful emphasis on civil society as the arena in which to solve various social and economic problems.

Civil society, a term that G.W.F. Hegel used to indicate the arena where the public and private meet, has a distinctly different sense today. Whereas Hegel circumscribed civil society as the sphere in which the individual and the state can interact through intermediate organizations such as guilds or unions, today’s civil society is the arena in which the state has dialed back its obligations in order to allow private organizations and individuals to pick up the slack.

Polk’s charity, like many others that have sprung up in the last several decades (such as Teach for America, charter schools, Kiva), reflects the success of a paradigm that has emerged over the last three decades. This paradigm endorses private, faith-based, or “non-profit” charities as the foundation of civil society (defined as a non-government sector). These organizations, endorsed by every U.S. President since Ronald Reagan, have facilitated the evacuation of a public safety net, an evacuation that goes hand in hand with the deregulation of the banking industry and the steady erosion of unions, public pensions, and labor protections.

Certainly, it is unreasonable to expect that the state can or will address all levels of public need. But private non-governmental charities face fewer Congressional or procedural inhibitions on what they may demand of the constituents they claim to want to help, such as the ability to impose certain behavioral requirements.

Groceryships imposes many strings for the mere flaw of being poor. According to the rules for applying for a Groceryship, being poor apparently means one chooses to eat unhealthily. Being poor apparently means that one is “addicted” to fast foods and sugar (this isn’t such a far-fetched idea for Polk, who frames his past actions in finance as the result of an “addiction” to wealth).

Thus, to be eligible for a Groceryship, poor moms can’t have excessively large families (“no more than 3 children”), and must be only moderately poor. And they “must” need/want/be eager/be motivated/be ready to adopt a healthy lifestyle, to want to be healthy, to be open to new ideas. See here.

Groceryships’ expectations fit into the neoliberal paradigm that I discussed in another piece, namely that poor people, more so than the non-poor, have an obligation to be moral, aesthetically reasonable, healthy, happy, and eager about it.

The most vulnerable, or as I say elsewhere, those who are perceived to be unruly, are seen as scary, dangerous, frightful, because their “failures” are attributed to their personal character rather than to their circumstances: Why are they poor? Why don’t they eat better? Why are they fat? Why are they rude? Why are they noisy and loud?

If the poor just worked harder, smoked less, didn’t do drugs, shunned McDonald’s and cooked more, then they too could be as aesthetically pleasing, and perhaps as successful and happy, as Sam Polk and his spouse. This is one of the pernicious implications of a neoliberal economic model: the poor are expected to fulfill the upper class’s aesthetic and moral expectations of what it means to live “a good life,” to flourish. And they are subject to those who are precisely in a position to dictate life goals for the more vulnerable.

Being poor means that if one wants to have one’s poverty relieved slightly or temporarily (remember, the Groceryship is for 6 months, after which one still remains poor), one is at the mercy of the ex-money addict Sam Polk and his neoliberal buddies, who are cheered for “helping the poor.”

Let’s remember that Polk’s money-addiction days were part of a milieu: a group of traders/financiers/bankers who were engaging in a set of practices that were both induced and condoned by state power and by general pre-financial crisis societal approval. That is to say, his role at JP Morgan Chase, and the roles of other financial corporations that contributed heavily to the banking crisis (including mortgage foreclosures on working-class and minority populations), were seen as positive contributions until around 2008-9. Moreover, the state, both Congress and the Executive Branch, continues to condone such practices through (pro-banking) legislation that allowed CEOs to receive large bonuses in spite of their roles, or through supposedly punitive legislation that slapped banks lightly on their wrists and paid out less than $2000 per person, over a three-year period, to those who lost their homes. What’s more, this settlement changed nothing in the relationship between the borrower and the loan servicing company.

Framing Polk’s actions within an individualizing framework (be it therapeutic or one of moral conscience), without locating them in a larger political/cultural structure, engenders precisely the kind of glorification that is showered upon Polk, and upon Jacqueline Novogratz and many others, such as Rachel Cook, Jessica Jackley…and the Nobel Peace Prize-winning innovator of microfinance himself, Muhammad Yunus, who are engaged in similar, if not identical, shifts.

What Polk et al. appear to be doing here is making a move from a “corporate free market” to a “non-profit free market,” which in no way challenges the idea that poverty and wealth are exclusively about individual choices. Rather, Polk’s (and Novogratz’s and Yunus’) shifts still emphasize the ideology and primacy of the “free market,” coupled with a rhetorical emphasis on hard work, along with individual moral, personal, and social accountability for darker or non-American populations. In Yunus’ case, micro-lending is tested in Bangladesh; for Novogratz, it’s taken to East Africa, India, Pakistan and Ghana; and for Polk, it’s applied to the black and Latino populations of Southern California.

But there is another aspect of this that is also troublesome: the self-satisfaction experienced by these “free market successes” who reclaim their moral sensibilities through the act of walking away after making millions in profits and then turning to “help the poor” on their terms. They are cheered for their charity work (in an individualist frame) without being asked about their participation in a financially corrupt, morally bankrupt “free market” system that allowed these individuals to “flourish” at the expense of millions of individuals who are unable to access the free market system because they don’t have the connections or “moral luck” to have been born in the right place at the right time.  As economist Dean Baker clarifies in his book, The Conservative Nanny State, there is nothing “free” about the free market: it is rigged to benefit those who already have at the expense of those who don’t.

As well: this kind of neoliberal framework ensures that the ruling class will shape the poor, forcing them to behave and to reshape themselves, through these seemingly neutral or generous charities, according to Sam Polk et al.’s own ill-informed visions of what it means to be a successful citizen.

This, then, is an expression of Michel Foucault’s biopolitics: those who are induced to cultivate themselves in the image of the ruling class are those who are the most vulnerable, subject to the whims and dictates of the wealthy and the powerful. This is the success of the neoliberal paradigm: it grants Homo Corporatus (or Homo Wall Streetus), that is, those who have the money, the power, and the favor of the state, the freedom and flexibility to shape the actions and character of the most vulnerable; simultaneously, Homo Corporatus’ contributions, the results of plunder and the corporate nanny state, are read as individual/private acts of generosity to help those who are most needy, those who were rendered needy through institutional/governmental/financial practices.


[1] Isaiah Berlin, “Two Concepts of Liberty,” in Four Essays on Liberty (Oxford University Press, 1969), p. 131.

[2] Hannah Arendt, The Human Condition (University of Chicago Press, 1958), ch. 6.

*Updated version. Thanks to Robin James, Janine Jones, and Robert Prasch for their helpful comments.


Will We Ever Close Guantánamo Bay Detention Center?

I’ve drifted away from blogging the last few months, but I’m hoping to put up some original pieces soon. In the meantime, here’s a piece that I published over at Salon last month. Guantánamo has been on my mind ceaselessly, especially as I teach my Global War on Terror course this term.

I’ve been writing away, so more pieces on other topics will be posted over the next few weeks…

_______________________________________

January 11th marked the 12th anniversary of Guantánamo Bay Detention Center, which, according to former Secretary of Defense Donald Rumsfeld, is the “least worst place to house” men suspected by the U.S. government of links to al-Qaida and the Taliban.

But Rumsfeld’s statement strains credulity. Beginning with the Bush administration, the U.S. has done more than merely house these men. Through its military and medical personnel, it has inflicted physical brutality, extended torture, solitary confinement, and force-feeding upon them, all the while remaining publicly indifferent, even righteous, about the absence of charges, due process, and legitimacy of the imprisonment.

Of the nearly 800 prisoners who have been confined there, 155 remain. Eleven were released in the last five months, twice as many as were released in the previous three years.

Yet, as artist and writer Molly Crabapple pointed out in her recent Guardian column noting the prison’s anniversary, we also know — we have for some time — that over half of all the detainees who have been imprisoned there were handed over for U.S.-paid bounties, rather than because they were hostile or dangerous enemies of the U.S.

Crabapple is not asserting this as a fantasy of her own making. She cites an important but not widely known report written by Seton Hall law professor Mark Denbeaux, lawyer Joshua Denbeaux, and several Seton Hall law students. The Denbeaux are legal counsel to several of the detainees. In their report, the authors show extensive evidence that over half (55 percent) of the 517 prisoners that they profiled committed no hostile acts against the U.S. or its allies. Of those 517, only 41 (8 percent) are “characterized” as al-Qaida fighters. One hundred ninety prisoners had no connection to al-Qaida, and 86 had no links to al-Qaida or the Taliban. And of those 517, 445 were captured by Pakistan or the Northern Alliance and handed over to the United States at a time when the United States offered large bounties for the capture of suspected enemies.

Offering a large bounty doesn’t disprove the assertion that these men were a serious threat. But when a government creates these classifications without external accountability, and it is supported in this by a supine judiciary, the circumstances do present a serious — overwhelming, unmitigated — doubt about whether these prisoners are a danger to Americans. The Denbeaux have made evidence of this doubt available since 2006.

What should have amplified this doubt even further for all of the serious, fact-finding, mainstream media is that the Combatant Status Review Tribunals, established under the auspices of the U.S. Department of Defense, which has no incentive to be critical of the U.S. government, also made the same evidence of this doubt available as early as 2005.

Just as striking was a second report published by the Denbeaux group. This report pointed out that of the 72 groups recognized as terrorist organizations by the Department of Defense, 52 (72 percent) are not on any of the terrorist watch lists maintained by the State Department. By this measure, the DoD keeps its own list of terrorist groups that is neither reviewed, confirmed, nor double-checked by any other government office. As the Denbeaux report concludes,

This inconsistency leads to one of two equally alarming conclusions: either the State Department is allowing persons who are members of terrorist groups into the country or the Defense Department bases the continuing detention of the alleged enemy combatants on a false premise. (my emphasis)

Given that we have had few further terrorist acts committed within the confines of the United States by foreign nationals in the last decade, the second conclusion is more likely.

What is striking about this truth today is that it is possible to state it in print in established media such as the Guardian. Even as several more prisoners were released this past month, there appears to be a slight opening in the conversation, one enabling human rights advocates’ criticisms to echo for more than a few seconds.

This was not the case a decade ago, when early critics of the Bush administration’s policies tried to suggest that there was little proof that captives brought to Guantánamo were a danger to the U.S., and that the prison should not be treated as a “legal black hole.” Those critics’ voices included several U.N. high commissioners for human rights as well as Richard Goldstone, the former chief prosecutor of the International Criminal Tribunal for the former Yugoslavia, and American lawyers such as Michael Ratner, the head of the Center for Constitutional Rights, and Michael Posner, the head of the Lawyers Committee for Human Rights. But their criticisms were drowned out by officials and polls indicating that Americans were overwhelmingly in favor of the prison and the inhumane treatment meted out to Afghan men.

Indeed, the original head of Guantánamo, Maj. Gen. Michael Lehnert, recently confirmed his own early doubts. Writing forcefully, Lehnert insists that Guantánamo never should have been opened, and that many of the detainees should never have been sent there.

As cynics will suggest, that is how politics works, as even a casual perusal of American history reveals. After the attack on Pearl Harbor in December 1941, 120,000 men, women and children of Japanese descent were incarcerated across 10 prison camps for little reason other than the fear shared by the U.S. government and the non-Japanese populace alike. The fear, suspicion and contempt acted on by then-President Franklin Delano Roosevelt was that these civilians, if allowed to live freely among the populace, might turn their freedom toward aiding the “enemy,” the Japanese government. This fear was acted upon despite the Roosevelt administration’s knowledge that these civilians, many with American citizenship, had few ties to the country of their parents’ origin.

These same residents had been scapegoated by the U.S. for decades. In 1913, California passed a law barring Asian non-citizens from owning land. That law was a mere continuation of decades of policies designed to manage the “Japanese problem,” as historian Greg Robinson’s book, “By Order of the President,” informs us. By May 1942, many Asians, residents and citizens alike, were being ordered to board trains and buses to whichever “internment camp” they had been assigned, with only what they could carry in their own two hands. At that point, nearly all Japanese American families who still owned businesses had to forfeit them as they were dispatched to stark campsites hundreds or even thousands of miles away from their towns, away from any towns where they might be in danger of talking to other non-Asians. (See here for a remarkable pictorial spread published by the Atlantic several years ago that shows some moments from that period.) The internment had the added effect of politically and socially ostracizing the internees. Friends, if any remained or wished to claim that mantle, would have found it prohibitive to visit them.

I visited one of those former camps about six years ago: Manzanar, which sits at the foot of the Sierras, just outside of Death Valley. A U.S. park ranger with a degree in comparative literature from the University of California, Irvine, had painstakingly curated the camp, whose vast desolate grounds had been denuded of most traces of that shameful period (scroll down for photos of what Manzanar looked like in 1943). In the main auditorium, the only structure left standing, the ranger had retrieved or reconstructed several of the barracks in which these families lived. Each housed several families of four, five, seven, eight or more: grandparents, babies, young children, teenagers, newlyweds and others. According to accounts by former inhabitants of other camps, such as Tule Lake in Northern California, the sheds would be divided by makeshift curtains into smaller, closet-like sleeping areas, for some semblance of privacy into which occupants could retreat for a while. Other inhabitants remarked on the unceasing wind that threatened to drive them mad, along with the fine layer of sand that covered every possession, including tablecloths, beds, makeshift dressers and dry goods.

Outside the auditorium, the vast grounds were marked by signs indicating where the canteen had been erected and where the school for the children had been built. There were maps that indicated the locations of other structures, including the watchtowers built to ensure that none of the civilian internees escaped. Also remaining were traces of some old Buddhist gardens, created by some of the internees in an effort to bring beauty and life to that desolate, dry place.


Buddhist gardens in Manzanar (Photo credit: Falguni A. Sheth)

As well, there were several burial places, marked by stones. One was as small as two feet long, marked by the usual ring of stones and by several toys, indicating that an infant was buried there.


An infant’s grave in Manzanar (Photo credit: Falguni A. Sheth)

I remember that the map indicated a building marked as a fire station, which presumably held water to be deployed in the likely event that a blaze might decimate the brittle wood buildings that sat on the desiccated land.

Manzanar was one of 10 camps in which American citizens and residents of Japanese descent were incarcerated for the remainder of the war. There, the internees, like the prisoners in Guantánamo, attempted to resist their detention in a myriad of ways, procedurally and physically.

As well, there was another group, nearly forgotten, who were also victimized by the U.S. Several thousand Japanese Latin Americans were arrested by their own governments (mostly Peru) and shipped to U.S. camps, including one in Panama. The U.S. had hoped to trade them to Japan in exchange for American prisoners of war (it was unsuccessful). Many of these men and women, like their U.S. counterparts, had little actual connection to Japan. They had their passports confiscated, and they remained in these camps for the duration of the war. After the war, betrayed by their home countries, both groups were essentially homeless, through no fault of their own. They had no desire to return to Japan or to the countries that had betrayed them, and the U.S. had revealed itself to be a hostile land.

Even though I had previously studied the historical and political aspects of the internment of Japanese Americans, thanks to the efforts of this ranger, that trip to Manzanar foregrounded for me the extreme consequences of the unthinking panic legislated at the executive and congressional levels a little over 70 years ago.

It reminded me of the collective panic that recurred just over 12 years ago, a panic cynically exploited by U.S. leaders and representatives. Though these functionaries might have been zealous to protect their country, they could not see past their immediate interests to the moral stanchions of judicial procedures and habeas corpus, or to the effects of their short-sightedness: the ubiquitous ether of injustice that still mars this country’s reputation.

It appears that this is how politics has worked again and again. But such politics can only work when leaders and functionaries can savor the successes of their deal-making with impunity; when their decisions are not expected to be compelled by moral dictates; when they are affirmed and rewarded for their egregious human rights violations by being reelected; when military commanders and politicians prioritize “the masculine logic of the security state,” as the late philosopher Iris Marion Young called it.

This country and its leaders have never figured out how to redress wrongdoing. The U.S., beginning with President Ronald Reagan, paid out $1.6 billion, along with an apology, to more than 82,000 surviving Japanese internees and their heirs. But these “reparations” cannot make up for the damage done to an entire people, and they have little effect if no lessons are learned from such recent mistakes.

As Carol Rosenberg points out, in the intervening decade, the suspicions against these prisoners have diminished, perhaps because the panic has abated and many more have had time to reflect on the hasty actions that have led to Guantánamo. Many prisoners have been released, finally. The next remedy is obvious, but it will take a moment of courage by the current administration to enact it.

________________________________________________________

A version of this article was published on Salon.com on Jan. 16, 2014

The Clintons: Back on the campaign trail with the help of the New York Times

This article was published at Salon.com on December 4, 2013 under the headline, “New York Times’ blind spot on Clinton and race.”

____________________________________

The New York Times published a piece this week in the service of the Democratic Party’s campaign for the 2016 elections that reveals a grave misunderstanding of recent history. Reporters Amy Chozick and Jonathan Martin profiled the tactics of former Secretary of State Hillary Clinton and her husband, the 42nd president of the United States, to restore their fragile relationship with African-Americans in anticipation of the former’s 2016 presidential run. The Times frames it as an attempt “to soothe and strengthen their relationship with African-Americans,” apparently strained after Bill’s comments about Obama during the 2008 campaign.

Here is the motivation they assign to the Clintons:

This task [of courting the black vote] has taken on new urgency given the Democratic Party’s push to the left, away from the centrist politics with which the Clintons are identified. Strong support from black voters could serve as a bulwark for Mrs. Clinton against a liberal primary challenge should she decide to run for president in 2016.

It would have been illuminating, and accurate as well, to distinguish between Democratic Party functionaries and Democratic voters in their description; I don’t see much in the way of Democratic politicos’ “push to the left”: NDAA 2012/2013, bank bailouts, the ACA, among other laws, don’t strike me as overly progressive.

Chozick and Martin assiduously cover the various black leaders with whom the Clintons have consorted since Hillary’s resignation as secretary of state earlier this year. Along with that coverage is a telling, if not entirely accurate, description of Bill Clinton’s legacy, which Hillary will surely be relying on to vouch for her “progressive” credentials. Here is perhaps the most remarkable paragraph of the article:

Mr. Clinton has a rich, if occasionally fraught, history with African-Americans. He was a New South governor and a progressive on race who would eventually be called “the first black president” by the author Toni Morrison. But he infuriated blacks in 2008 when, after Mr. Obama won a big South Carolina primary victory, he seemed to dismiss the achievement by reminding the press that the Rev. Jesse Jackson had won the state twice and calling Mr. Obama’s antiwar position “the biggest fairy tale I’ve ever seen.”

Many African-Americans took Mr. Clinton’s fairy tale comment to mean that Mr. Obama’s candidacy itself was a hopeless fantasy.

It is true that black Americans were mightily irritated by Bill’s comments. But that’s hardly the only source of the injury.

(Un)surprisingly, even as Chozick and Martin tritely repeat Toni Morrison’s description of Clinton as the first black president, a description proudly embraced by Bill (and repeated ad nauseam by mainstream media), they don’t offer any context for her remarks.

Morrison, writing in the New Yorker in 1998, was reflecting on the Republicans’ move to impeach Bill Clinton in the aftermath of revelations of his affair with Monica Lewinsky, a White House intern at the time. She says:

African-American men seemed to understand it right away. Years ago, in the middle of the Whitewater investigation, one heard the first murmurs: white skin notwithstanding, this is our first black President. Blacker than any actual black person who could ever be elected in our children’s lifetime. After all, Clinton displays almost every trope of blackness: single-parent household, born poor, working-class, saxophone-playing, McDonald’s-and-junk-food-loving boy from Arkansas. And when virtually all the African-American Clinton appointees began, one by one, to disappear, when the President’s body, his privacy, his unpoliced sexuality became the focus of the persecution, when he was metaphorically seized and body-searched, who could gainsay these black men who knew whereof they spoke? The message was clear: “No matter how smart you are, how hard you work, how much coin you earn for us, we will put you in your place or put you out of the place you have somehow, albeit with our permission, achieved. You will be fired from your job, sent away in disgrace, and—who knows?—maybe sentenced and jailed to boot. In short, unless you do as we say (i.e., assimilate at once), your expletives belong to us.”

It is clear that Morrison is poetic and pained here. She analogizes the experiences faced by Clinton to those faced all too often by black men. There is much that can be said about this piece. But the cynicism of Clinton and his supporters is such that her phrase was co-opted as an endorsement of his “progressive” politics, rather than what it signaled at the very least: a searing insight into the inferior, abject status of black men in the United States at the end of the millennium. And here is Morrison in her own words in 2008.

But Chozick and Martin, in their own perhaps subconscious cynicism, merely repeat Morrison’s phrase as an endorsement and omit any discussion of Clinton’s policies during his two terms as president, or during his time as governor of Arkansas.

The first “black” president and his partner in devastation proudly designed the prototype of Clinton’s famous 1996 welfare reform bill when he was the governor of Arkansas. Women who applied for aid from the state were required, among other indignities, to name the potential fathers of their children. Yes, yes, save your objections: This policy was created to search out “deadbeat dads,” and get them to pay child support.

But somehow it never occurred to many — not the press, not white liberals, not liberal feminists, much less the Clintons (if they cared at all) — that such a reform would only be effective in further humiliating already poor women, women who, had they other options, would never have resorted to the state for help. Here’s a brilliant letter from a Seattle feminist to N.O.W. back in 2007, which sets out the various assumptions and implications of welfare reform.

Welfare reform drew its ballast from the racial antagonism against black women that was inflamed and gained momentum under Ronald Reagan’s administration. But as many, from Barbara Ehrenreich to digby to Jason DeParle, point out, the Clintons and their Democratic buddies endorsed the righteous smokescreen that “workfare” was needed to teach the poor how to keep a job rather than ask for money, and to give poor (black) women “chastity training.” Patronizing? Racist? Those words don’t even cover half of it, especially as they’re accompanied by a convenient selective amnesia about the legacy of slavery and the still-existent practice of institutional discrimination against blacks. We can see this in the history of the drug war, the prison industry, red-lining, not to mention plain old-fashioned racism as seen in our public school system, post-secondary admissions practices, and employment across multiple industries.

Hillary’s express support for welfare reform enabled Bill to get the 1996 Personal Responsibility and Work Opportunity Act passed. Peter Edelman, a senior Clinton appointee who resigned in protest of the bill, pointed out that this was the “worst thing Bill Clinton has done.” Due to the remarkable efforts of the “first black president” and his wife, and like-minded “liberals” and conservatives who believed that the poor needed to be taught to climb out of a “culture of poverty,” welfare was no longer the entitlement that it had been for decades (and should have remained as such). Rather, it was transformed into a sporadic privilege periodically and provisionally bestowed on the poor, all the while leaving millions more in poverty. As Edelman pointed out in 2011, that 1996 bill made things much worse for the poor: “There are now people who cannot find work, and who cannot get welfare.”

Needless to say, Democrats and Republicans have managed to augment, enhance, and exacerbate nationwide poverty through their support of banking deregulation and their failure to impose serious sanctions on bankers and subprime mortgage companies.

When Chozick and Martin write about Bill Clinton as a “progressive on race,” I have to wonder what criteria they use to measure. They take certain famous black politicians’ comments (such as those of Democratic Rep. James Clyburn or Rep. Elijah Cummings) or public gestures (such as sitting next to “friend and rival” and former Democratic Virginia Gov. L. Douglas Wilder at Howard University’s May 2013 commencement) at face value and out of context. To gauge racial progress by which friends a white Democrat sits next to — doesn’t this strike anyone as uncomfortably close to the “Some of my best friends” cliché?

Why not consider the effects of NAFTA and the WTO, which decimated the manufacturing industry that employed enormous numbers of African-Americans? Many journalists and left economists have detailed the detrimental impact of the offshoring of corporations, the forgiveness of taxes, and the eradication of labor protections for foreign nationals who work at formerly American companies. Why does none of this figure into the assessment of “racial progress”? Even one paragraph might have allowed for the possibility that the Times was asking some critical questions about the releases and information that it was being fed by the Clinton campaign.

Why not consider the effects of the 1996 Immigration Reform Bill, which was a precursor to the enormous anti-immigration tide that has swept the country, enhanced by the right-wing and neo-patriotic impulses of both Democrats and Republicans in the aftermath of the Sept. 11, 2001, attacks?

Why not consider the effects of the 1994 Crime Bill, which ushered in “three strikes” legislation at the federal level and was also signed by the “first black president”? Or the expansion of the death penalty in the 1996 Antiterrorism and Effective Death Penalty Act?

I can hear objections that Hillary should be able to run on her own record. OK, why not examine a few of her votes? Remember, it was Sen. Russ Feingold — not Sen. Clinton, or Sens. Dianne Feinstein and Barbara Boxer, or Secretary of State John Kerry — who stood up against the USA Patriot Act, the harbinger of a (by now) vengeful, 12-year, racist and arbitrary tide of vitriol against Muslims in the U.S., Iraq, Afghanistan, U.K., Yemen, Pakistan, Somalia and elsewhere in the world. How about the 2002 authorization to invade Iraq? AUMF 2005? The 2007 surge in Iraq? She voted in favor of them. To her credit, she voted against the 2008 FISA bill, citing checks on presidential authority, even as she has elsewhere supported expanding that authority. How does she feel about WikiLeaks? Edward Snowden? The death penalty? (She supports it, but not for Iran.)

These are hardly left votes. These are hardly liberal votes. These are hardly racially progressive votes.

Let’s not judge whether someone is a “race progressive” — especially a politician — by the utterances of his/her friends. Presumably, journalists understand that the notion of an alliance does not confirm the truth of one’s race politics; it merely demonstrates that all other concerns have been provisionally subordinated in order to further one particular goal. Sure, we can call it pragmatic, strategic, realpolitik. But regardless of the term used, journalists — of all people — know that citing such alliances does not offer a valuable insight or confirmation about the truth of one’s politics.

I tell my students that if they want to write about politics effectively and forcefully, they must major in something other than journalism: history, sociology, ethnic studies, politics — something other than a field that disciplines its students to forget that accurate narratives have a long-standing, deeply buried history that cannot simply be articulated through a repetition of sound bites aired by corporate news media or coverage of poll results. Facts, those snippets that refer to a certain state of the world, must be assembled and grounded by searching through indirect, long-buried records that have long since slipped the public’s (and corporate media’s) memory. Such an excavation requires the skills of an archaeologist and the critical distance of an outsider — not the propitiatory writing skills of someone who knows the well-worn seat of an election press bus or lunches with his subjects on a regular basis.

Of course, that assumes that establishment media such as the Times is interested in reportage from a critical perspective. Perhaps that’s the most flawed assumption of all.

Why our best students are totally oblivious

While being up in arms about popular injustices, they’re educated how not to see race, empire and colonialism

This past week, I taught my first classes of the semester. The college where I teach attracts young men and women who are generally left of center. Some of them are the children of progressive activists and academics. Many of the students who enroll in my courses hope to spend the rest of their lives ending poverty, racism, and sexual oppression, among other forms of injustice. As such, they are an extremely aware crowd.

In one of my courses, which deals with race, philosophy and legal theory, I listed a series of names on the board and asked students to describe who they were: Trayvon Martin, Yusuf Salaam, Shaker Aamer, Aafia Siddiqui, José Padilla. Nearly every student in the room was familiar with the first name, and could give in excruciating detail the facts of the case and trial, and the questionable laws used to defend George Zimmerman in public discussion. Most of the students knew immediately that Yusuf Salaam was one of the Central Park Five who, despite their innocence, had been convicted of raping a woman and had spent years in prison. They were making astute connections to New York’s stop-and-frisk policy, racial profiling, “stand your ground” laws (yes, even though these were not explicitly part of the Zimmerman trial, they are relevant). You may not have known some of these details, but they did. As I mentioned, they’re rather politically aware.

Not a single student recognized the other three names.

In another course on political philosophy that also began last week, several students had only the faintest idea that Guantánamo was a prison, and could not describe who the prisoners were, why they were there, or why it mattered.


These were illuminating reminders for me. Most of these students are not to blame for not knowing. They were born between 1992 and 1995. A few are slightly older. For them, the U.S.-led War on Terror is a constant background in their lives. They have few memories of a time when the U.S. was not waging war in the Middle East. They grew up in the shadow of the first Gulf War. But shadows are just that: observable, yet elusive, ungraspable. In the same way, the War on Terror, unless it has affected them directly, is neither unfamiliar nor completely familiar. It’s not close enough for them to know which questions to ask in order to have a clear picture; yet it’s too close for them to know what the opposite of a War on Terror would look like.

The context in which my young progressive students can know so much about some populations and nothing about other populations who face analogous circumstances is worthy of pause. It is true that most of us find it difficult to remember names and figures when they cycle through the mainstream news hour for less than a few minutes, for only a day or two. We know Trayvon Martin’s name because there were assiduous protests surrounding his death, and because the mainstream news media became interested in it. The names of so many young black men who died similarly will not be known to us because of the absence of organized protests and the lack of media interest.

Similarly, the names of Padilla, Siddiqui and Aamer have not been mentioned for quite some time in the mainstream news cycle to which my students are attuned. When they were noticed, the mentions were generally brief and in the context of the state’s successful fight against “Terror.” In certain spaces, there have been continual protests and excellent critical coverage. But few dissents against the U.S.’s sustained foray into empire — through drones, torture, indefinite detention and other means — have commanded alert and aggressive attention from our patriotic and subservient mainstream media.

My students’ lack of knowledge of most things related to the U.S.’s war on terror indicates other predictable and alarming things: The principle of preemptive policing — jailing men indefinitely without charges, torturing them — is commonplace and no longer (if ever) worthy of shock. The racial profiling of Muslim men, because it is done in the context of an explicit state-led war, is difficult to be alarmed about without challenging the moral credibility of the government that leads it.

If racism is discussed, it is, correctly, within the context of the U.S.’s morally troubling and murky history of slavery. But the discussions are not usually linked to the equally troubling history of colonialism and conquest of indigenous populations. The U.S.’s history of racism against migrants such as Asians and Latinos is perhaps better known to some. But it is difficult to be a “good citizen” and still be critical of the ideological war that the U.S. wages on Muslims — especially in the midst of the U.S.’s ever-continuing attacks — covert, drone, explicit.

My students’ lack of knowledge about the effects of the Global War on Terror on men and women in the U.S. indicates to me that they are the successful product — even in the elite grammar/high schools from which so many of them graduated — of a patriotic and “morally upstanding” education. They have learned that many institutions — like their schools — work in their favor, even on their behalf. They have not come face to face with prisons, border police, customs officials, NYPD or hostile judges. They have learned how not to see race, empire and colonialism while being up in arms about the more popular facets of injustice — even though these are closely linked: the environment, sexual and reproductive rights, and “wringing bias out” of our hearts.

The latter phrase is invoked by President Obama in a speech given after the “not guilty” verdict in the George Zimmerman trial: “Am I wringing as much bias out of myself as I can?” This question reduces racism to an individual failing, a problem of conscience, rather than one of laws (drug laws, three-strikes laws, preemptive policing, racial profiling), institutions (carceral, banking, social, state, military, cultural), ideologies (lynch law, slavery, empire, national security, surveillance, the War on Terror), and accepted culture.

The president’s follow-up question — “Am I judging people as much as I can, based on not the color of their skin, but the content of their character?” — elides the complex interplay of ideology, institutional power and political circumstances in ascribing morality to any individual person.

When young black men are arrested for petty theft, it becomes commonplace to discuss their “individual moral failings.” When senior, often white, investment bankers embezzle money, they are rewarded with bailouts, bonuses and bona fides.

When a young Somali-American woman sends less than $2,000 to Somalia to aid the poor, she is convicted of aiding terrorists and given extended prison time. When HSBC Bank skirts material support statutes by laundering $850 million, it is fined less than a month’s profits.

When young Muslim men speak critically of the U.S.-led wars against predominantly Muslim countries, they are immediately assumed to be terrorists.

Are the judgments ascribed to each of these groups about character alone? I would suggest they emerge from a history of ideological biases, cemented by unaccountable institutions, including the last two presidential administrations. These judgments are embedded in the political discourse spun by political authorities. They guarantee that only those who are poorer, darker or less powerful will pay — heavily, disproportionately, with their lives. These matters are hardly only about the bias in our hearts and judging the content of one’s character.

Within the American tradition of adventure-packed action movies and the 30-minute news cycle, individual failings are easier to focus on, to obsess over, to judge, to be outraged about.

Cultural worldviews, pernicious politics, racial histories and ideologies are more difficult to disarticulate. They require reading histories and thinking through multiple logics, and weeding through numerous laws and political contexts.

_______________________________________________________________

This article appeared in today’s edition of Salon (www.salon.com).

Edward Snowden: The Great Criminal

As Edward Snowden’s name is bandied about, with a debate emerging over whether he is a hero or a criminal, whistleblower or traitor, the words of philosopher Walter Benjamin come to mind. In his 1921 essay “Critique of Violence,” Benjamin discusses the law’s interest in maintaining a monopoly on violence:

The law’s interest in a monopoly of violence vis-a-vis individuals is not explained by the intention of preserving legal ends but, rather, by that of preserving the law itself; that violence, when not in the hands of the law, threatens it not by the ends that it may pursue but by its mere existence outside the law.

Here Benjamin restates one of the fundamental goals of classical liberal political philosophy, at least for philosophers such as Hobbes and Locke: namely, to eliminate the use of violence by everyone except the state and its duly appointed deputies. This is why, in Locke, the state ‘agrees’ to protect the rights of individuals in exchange for individuals giving up their right of retribution and punishment. The right of violence becomes the sole province of the state, whether through the death penalty, prisons, or defense of the state itself.

However, as we also know, the state monopolizes and regulates the use of violence in the interests of those who have the most influence over it: the wealthy men who decide the personification of the state. In the English North America of the 1600s, these would have been white Englishmen. In the 1910s, Benjamin was interested in the role of workers in challenging the state’s monopoly on violence.

Understood in this way, the right to strike constitutes in the view of labor, which is opposed to that of the state, the right to use force in attaining certain ends. The antithesis between the two conceptions emerges in all its bitterness in face of a revolutionary general strike. In this, labor will always appeal to its right to strike, and the state will call this appeal an abuse, since the right to strike was not “so intended,” and take emergency measures.

Perhaps unsurprisingly, unions aroused a widespread secret admiration from a public that was weary of the state’s imposition. Today, as Occupy and other movements point out, the most influential are still the 1%, though the colors, sexes, and sexualities of this privileged demographic have been somewhat expanded.

For example, Locke’s story of slavery is more accurately read as the story of colonialism and, eventually, imperialism. Strangers attack Englishmen. Englishmen fight back and win. They have the right to kill the strangers, but grant them their lives in exchange for their agreeing (at least implicitly) to be slaves. It is an apologia for the conquest of American Indians. But in the modern moment, it is a story replicated by Samuel Huntington in the “Clash of Civilizations.”

Back to Benjamin, who is thought to have committed suicide on the Franco-Spanish border as he was trying to flee the Nazis. Here is another excerpt from the “Critique of Violence”:

The same may be more drastically suggested if one reflects how often the figure of the “great” criminal, however repellent his ends may have been, has aroused the secret admiration of the public. This cannot result from his deed, but only from the violence to which it bears witness.

How might this apply to Edward Snowden? Snowden’s ‘crime,’ if you will, was that he disrupted the state’s ability to protect its monopoly of violence by exposing its widespread surveillance activities. He did this despite the widely claimed fears of interested parties that doing so would ‘undermine national security,’ and in the face of the state’s insistence that these activities are justified and justifiably secret. In this sense, the fact that he challenged the prerogatives of the state itself makes his alleged ‘crime’ so much more transgressive than, for example, merely lying to Congress about weapons of mass destruction, starting a war with a random nation in which tens of thousands die, or torturing rendered persons. None of these latter crimes is a threat to the state itself, and for that reason they may be readily forgiven and forgotten. Manning and Snowden are, however, ‘great criminals’ in that their actions embarrassed and undermined state power. They can never be forgiven or forgotten.

So, for a significant portion of the public, there seems to be an open, or perhaps grudging, admiration of Snowden because he has dared to challenge the state’s monopoly on violence. He challenges the state even as he acknowledges that it will use every resource at its disposal to exact its revenge. We know from the tragic example of Aaron Swartz that challenging the Department of Justice requires endless resources, from millions of dollars’ worth of legal know-how to the filing of endless FOIA requests. We know from the example of John Kiriakou that even going through formal channels of whistleblowing—including being

“the first CIA officer to call waterboarding ‘torture’; to reveal that the CIA’s torture program was policy rather than a few rogue agents; and to say it was wrong”

will not stop the state, even a state led by a “transformative presidency,” from making sure that no one disturbs its monopoly on violence.

In this case, therefore, the violence of which present-day law is seeking in all areas of activity to deprive the individual appears really threatening, and arouses even in defeat the sympathy of the mass against law. By what function violence can with reason seem so threatening to law, and be so feared by it, must be especially evident where its application, even in the present legal system, is still permissible.

What makes Snowden so interesting is that he appears to be an old-fashioned “believer” in the American project—someone who wanted to fight the good fight, to uphold the American principles and ideals that the US government has long professed as its own mission. He contracted to work for defense contractors who in turn worked with the NSA, and for that reason did not begin his (short-lived) post-military career with misgivings about the American imperial project. As he came to see how its affairs were being misconducted, he continued to believe in “doing the right thing.” What also makes Snowden remarkable is his awareness that

[T]he “US Persons” protection in general is a distraction from the power and danger of this system. Suspicionless surveillance does not become okay simply because it’s only victimizing 95% of the world instead of 100%. Our founders did not write that “We hold these Truths to be self-evident, that all US Persons are created equal.”

Whether or not one agrees with his actions, and whether or not his politics and ideology mesh with the ideas of the right or the left, it will always be a remarkable sight to see a lone person stand up to the Leviathan, composed as it is of myriad eyes, all watching, waiting to clamp down on any threat, no matter how trivial, to its relentless monopolistic pursuit of violence—and power.

_______________________________________

This piece was republished in Salon on June 19, 2013 as “Edward Snowden’s real crime: Humiliating the state.”