Pandaemonium

PLUCKED FROM THE WEB #76

The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

Xenophobia turns migrants into scapegoats
Jan Bornman, New Frame, 23 September 2020

In recent months, as South Africa has been dealing with the economic fallout of the Covid-19 pandemic, there have been growing calls for migrants to be deported and for the government to take action against crimes allegedly committed by people born outside the country.

A recent analysis by the Atlantic Council’s Digital Forensic Research Lab showed how a number of bots and fake accounts have been behind some of the most xenophobic propaganda online. In 2008, more than 60 people were killed in xenophobic violence. In September 2019, violence again broke out in Johannesburg after online messages were disseminated demanding migrants leave and refugee camps be opened.

The flyer circulating ahead of the violence called for a mass shutdown and for all “to come together as South Africans with one voice of enough is enough, on selling of drugs, on property theft, and on our work taken by foreign nationals.

“We as the people who fought for this freedom, find no respect from the owners of the companies and what is going to happen on the 2 September 2019 is to show our dissatisfaction, and to use our right to vote by taking out foreign nationals from our work,” the pamphlet warned.  “South Africa for South Africans. This is not xenophobia but the truth.”

Now, just over a year later, the online army of trolls that back these xenophobic messages is calling for a march on the Nigerian embassy on 23 September 2020 over “drugs and human trafficking”. This comes after a march was planned on 29 August with similar intentions, though nothing came of it. 

Commentators and the government want us to believe these incidents are sporadic and spontaneous, seemingly coming out of nowhere. But just last week, there was tension between South African and migrant vendors at a market in Yeoville. And at the end of July, mobs of South Africans marched through the streets of Phola Park in Thokoza, east of Johannesburg, evicting migrants from their homes and burning their belongings in the streets. 

While they were doing this, South Africans proudly chanted, “We are just spring cleaning,” and “We are sanitising the area.”

Read the full article in New Frame.


.

How to distribute a Covid-19 vaccine ethically
Nicole Hassoun, Scientific American, 25 September 2020

The COVID-19 Vaccines Global Access (COVAX) facility is co-led by the World Health Organization (WHO), the Coalition for Epidemic Preparedness Innovations and Gavi, the Vaccine Alliance. It lets countries support a broad portfolio of vaccine candidates, and requires distribution in line with need. The facility has proposed to give countries vaccine quantities in proportion to their populations until each country can help 20 percent of its population. But it only guarantees enough to non-contributing—mostly poor—countries to cover essential workers before donors vaccinate 20 percent of their populations. In a controversial decision, the Trump administration announced the United States will not join this facility.

Another WHO proposal aims to prioritize health care workers, the elderly and the most vulnerable. It would seek to reduce COVID-19 deaths and protect health systems by giving countries vaccines based on the number of essential health care workers, the proportion of people over 65 and those who are otherwise likely to suffer seriously if they get COVID-19.

Another approach, the “fair priority model,” tries to limit economic and health consequences. A collaboration of ethicists, spearheaded by Zeke Emanuel, former director of bioethics at the National Institutes of Health, argues that at least after countries have their pandemics under control, fair allocation requires distributing the vaccine first to those countries where it is possible to save the greatest number of life years, then considering also disability that can be prevented by the vaccine as well as the amount of poverty and aggregate economic damage the vaccine can prevent, and finally distributing the vaccine to reduce transmission rates as far as possible.

A proposal by Vanderbilt University considers contribution and capacity. Researchers at Vanderbilt propose scoring countries based on (1) their capacities to provide care, (2) their ability to distribute vaccines and (3) whether they have helped test and develop new interventions. Those with lower capacity to provide care without a vaccine, greater capacity to distribute the vaccine, and who have helped test and develop new interventions would have higher scores and thus priority access.

These proposals differ, and each one has some merit. It is important to distribute in line with need (though we need to examine which needs matter), we should try to mitigate economic as well as health effects, and we should prioritize countries that lack the ability to provide care.

But all of these proposals unfairly prioritize rich countries; they either let rich countries control their epidemics first, help all of their essential health workers (they have many more than poor countries), or even help 20 percent of their populations before letting poor countries do more than treat 3 percent.

A truly ethical proposal would treat all people equally and help countries get vaccines to people when they lack capacity to do so on their own, rather than accepting inequality in access as an unchangeable fact and bypassing the poor to help the rich, the weak to help the strong.

Read the full article in Scientific American.


.

The trouble with disparity
Adolph Reed, Jr. & Walter Benn Michaels, Nonsite, 2 September 2020

We can see how this works in a recent report from the National Women’s Law Center, which, in the context of the current health crisis, found not only that “Black women are disproportionately represented in front-line jobs providing essential public services” but also that the black women doing these jobs “are typically paid just 89 cents for every dollar typically paid to white, non-Hispanic men in the same roles.”4 Overall, the average wage disparity for black women across the eleven occupational categories the report discusses is $.20 per hour, which, as the authors note, is especially significant for low-wage workers. “This difference in wages results in an annual loss that can be devastating for Black women and their families that were already struggling to make ends meet before the public health crisis. For example, Black women in a low paid frontline occupation such as waiters and waitresses lost $7,800 due to the wage gap in 2018. Black women working as teachers lost a staggering $14,200 due to the wage gap in 2018.” This is precisely the kind of injustice that the battle against disparity is meant to address.

But it is also precisely the kind of injustice that reveals the class character of that battle. The median hourly wage for white non-Hispanic men in eight out of eleven of the occupational categories in which black women are underpaid is less than $20 an hour (and in a ninth, healthcare social workers, it’s barely over $20). Disparity tells us the problem to solve is that the black women make $.20 an hour less than the white men. Reality tells us it’s not that $.20 an hour that makes those black women workers’ economic situations precarious. Everyone receiving an hourly wage of less than $20 is in a precarious economic position. And it’s not just that this report makes no reference to the need to raise the wages of all workers in those eleven front-line occupational categories. Every time we cast the objectionable inequality in terms of disparity we make the fundamental injustice—the difference between what front-line workers make and what their bosses and the shareholders in the corporations their bosses work for make—either invisible, or worse. Because if your idea of social justice is making wages for underpaid black women equal to those of slightly less underpaid white men, you either can’t see the class structure or you have accepted the class structure.

The extent to which even nominal leftists ignore this reality is an expression of the extent of neoliberalism’s ideological victory over the last four decades. Indeed, if we remember Margaret Thatcher’s dictum, “Economics are the method: the object is to change the soul,” the weaponizing of antiracism to deploy liberal morality as the solution to capitalism’s injustices makes it clear it’s the soul of the left she had in mind. Thus, for example, the reception of Raj Chetty and his coauthors’ widely discussed 2018 study of intergenerational economic mobility made it clear that their most shocking finding was the degree to which rich black people are less likely than their white counterparts to pass their status on to their children, especially their male children. As if the difficulty rich people might experience in passing on their expropriated wealth is made into a left issue by the fact that the rich people in question are black.5 Of course, the study’s authors aren’t necessarily responsible for how news media represent its significance, but they are totally responsible for the fact that their work largely disconnects economic mobility—and racial disparities—from political economy, in both diagnosis and proposed remedies. For them “the critical question to understand the black-white gap in the long run is: do black children have lower incomes than white children conditional on parental income, and if so, how can we reduce these intergenerational gaps?” Their idea of the basic problem really isn’t that unfair advantage is being passed from generation to generation but that it’s being passed more effectively between white people than between blacks.

Read the full article in Nonsite.


.

The racial wealth gap is about the upper classes
Matt Bruenig, Jacobin, 5 July 2020

If you take the net worth of all white households and divide it by the number of white households, you get $900,600. If you do the same thing for black households, you get $140,000. The difference between these figures — $760,600 — is the best representation of the overall racial wealth gap. That is how much more wealth black people would need per household to have as much wealth as white people have per household.

But overall statistics can be misleading. If you decompose both of these totals into deciles, what you find is that nearly all white wealth is owned by the top 10 percent of white households, just as nearly all black wealth is owned by the top 10 percent of black households. The lower and middle deciles of each racial group own virtually none of their racial group’s wealth.

What this means is that the overall racial wealth disparity is being driven almost entirely by the disparity between the wealthiest 10 percent of white people and the wealthiest 10 percent of black people.

One way to illustrate this point (h/t Sam Tobin-Hochstadt) is to see what would happen to the overall racial wealth gap if we entirely closed the wealth gap that exists between the bottom 90 percent of each racial group. Put differently: What would happen to mean black wealth if the bottom 90 percent of black families were given the exact same per-household wealth as the bottom 90 percent of white families?

The answer is that mean black wealth would rise from $140,000 to $311,100. The overall racial wealth gap would thus decline from $760,600 to $589,500, a fall of 22.5 percent. This means that even after you have completely closed the racial wealth gap between the bottom 90 percent of each race, 77.5 percent of the overall racial wealth gap still remains, which is to say that the disparity between the top deciles in each race drives over three-fourths of the racial wealth gap.

In much of the popular discourse on the racial wealth gap, the emphasis is on the median white household and the median black household. To understand how misguided this emphasis is, we can do exactly what we did above but for the bottom 50 percent of each race. After topping off the current bottom 50 percent of black families so that they have as much wealth per household as the current bottom 50 percent of white families, mean black wealth rises by $23,100, cutting the racial wealth gap by 3 percent.

What this shows is that 97 percent of the overall racial wealth gap is driven by households above the median of each racial group. This explains why you can produce shocking conclusions that show a relatively small amount of money can dramatically reduce the (median) racial wealth difference: it is not that hard to get two groups who own relatively little to own the same amount of relatively little. But such measures would not make much of a dent in the overall racial wealth gap.
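For readers who want to check the arithmetic, here is a minimal sketch (in Python) that reproduces the percentages quoted above; it uses only the per-household figures Bruenig cites and assumes nothing beyond them:

```python
# Mean (per-household) wealth figures quoted in the excerpt.
mean_white = 900_600
mean_black = 140_000
overall_gap = mean_white - mean_black              # $760,600

# Scenario 1: bottom 90% of black households matched to the bottom 90%
# of white households (mean black wealth becomes $311,100, per the excerpt).
mean_black_topped_90 = 311_100
gap_after_90 = mean_white - mean_black_topped_90   # $589,500
share_closed_90 = (overall_gap - gap_after_90) / overall_gap

# Scenario 2: only the bottom 50% matched (mean rises by $23,100, per the excerpt).
share_closed_50 = 23_100 / overall_gap

print(f"Overall gap: ${overall_gap:,}")
print(f"Closed by matching bottom 90%: {share_closed_90:.1%}")  # ~22.5%
print(f"Closed by matching bottom 50%: {share_closed_50:.1%}")  # ~3.0%
```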

Read the full article in Jacobin.


.

The Supreme Court used to be openly political.
It traded partisanship for power.

Rachel Shelden, Washington Post, 25 September 2020

The court’s authority, built on the notion that it will act outside of politics, has expanded to include power over major elements of American democracy — including determining the outcome of presidential elections. Because the justices have accrued so much power based on an apolitical posture, Americans are increasingly boxed in by the idea that the court must remain above politics, even as the nomination and confirmation process becomes more and more partisan. That dynamic may not be sustainable for much longer. But if the court becomes more openly political, it’ll be returning to the way it once worked for more than 100 years — only with vastly more power than it had before it wrapped itself in the mantle of non-partisanship.

Nineteenth-century Americans were deeply partisan, and they understood that the Supreme Court would be, too. Although justices were expected to follow the law in their judicial determinations, there were no clear limitations on partisan politicking outside the courtroom. Public trust in the court did not rely on justices claiming to be apolitical; Americans were far more concerned about limiting judicial power, period. Public concerns about the court becoming “political” materialized only when justices began to accrue more constitutional authority in the first few decades of the 20th century. Early Americans would have recognized the kinds of partisan political conversations we are having about the court today — but they would have been shocked to discover how much power we have given the judiciary over our democracy.

Although most Americans today assume that the Supreme Court should have the final say in constitutional matters — what the political scientist Keith Whittington calls “judicial supremacy” — few believed that in the nation’s early years. Judges were trusted to handle legal disputes fairly, but the court’s power to determine constitutional meaning was never secure. Instead, as the historians Gerald Leonard and Saul Cornell have recently argued, by the early 19th century, most Americans believed that the people, operating through partisan mechanisms, were the ultimate arbiters of constitutional authority.

So partisan fidelity — not legal ability — was the primary consideration in presidents’ Supreme Court appointments. A significant majority of 19th-century justices were chosen because of their previous partisan allegiances: Most nominees had served in federal, state or local political positions.

Senate majorities often declined to confirm or even take up nominations by presidents from an opposing party. Of the 23 failed nominations in the 19th century, only seven were rejected by a friendly Senate, and of these, four were casualties of intraparty squabbles. When the Senate refused to give Democrat Roger Taney a hearing in 1835, it wasn’t for fear that President Andrew Jackson was politicizing the Supreme Court — it was simply because senators opposed Jackson.

Read the full article in the Washington Post.


.

How philanthropy benefits the super-rich
Paul Vallely, Guardian, 8 September 2020

Philanthropy, it is popularly supposed, transfers money from the rich to the poor. This is not the case. In the US, which statistics show to be the most philanthropic of nations, barely a fifth of the money donated by big givers goes to the poor. A lot goes to the arts, sports teams and other cultural pursuits, and half goes to education and healthcare. At first glance that seems to fit the popular profile of “giving to good causes”. But dig down a little.

The biggest donations in education in 2019 went to the elite universities and schools that the rich themselves had attended. In the UK, in the 10-year period to 2017, more than two-thirds of all millionaire donations – £4.79bn – went to higher education, and half of these went to just two universities: Oxford and Cambridge. When the rich and the middle classes give to schools, they give more to those attended by their own children than to those of the poor. British millionaires in that same decade gave £1.04bn to the arts, and just £222m to alleviating poverty.

The common assumption that philanthropy automatically results in a redistribution of money is wrong. A lot of elite philanthropy is about elite causes. Rather than making the world a better place, it largely reinforces the world as it is. Philanthropy very often favours the rich – and no one holds philanthropists to account for it.

The role of private philanthropy in international life has increased dramatically in the past two decades. Nearly three-quarters of the world’s 260,000 philanthropy foundations have been established in that time, and between them they control more than $1.5tn. The biggest givers are in the US, and the UK comes second. The scale of this giving is enormous. The Gates Foundation alone gave £5bn in 2018 – more than the foreign aid budget of the vast majority of countries.

Philanthropy is always an expression of power. Giving often depends on the personal whims of super-rich individuals. Sometimes these coincide with the priorities of society, but at other times they contradict or undermine them. Increasingly, questions have begun to be raised about the impact these mega-donations are having upon the priorities of society.

There are a number of tensions inherent in the relationship between philanthropy and democracy. For all the huge benefits modern philanthropy can bring, the sheer scale of contemporary giving can skew spending in areas such as education and healthcare, to the extent that it can overwhelm the priorities of democratically elected governments and local authorities.

Read the full article in the Guardian.


.
[Image: One of Philip Guston’s KKK paintings]

Critics, scholars – and even museum’s own curator – condemn decision to postpone Philip Guston show over Ku Klux Klan imagery
Gareth Harris, The Art Newspaper, 25 September 2020

Critics and scholars have reacted angrily to the announcement that a major touring exhibition dedicated to the late Canadian-American artist Philip Guston, and due to run at four major museums over the next two years, has been delayed until 2024. A spokesperson for the National Gallery of Art in Washington, DC—one of the four museums involved—told ARTnews that organisers were concerned about “painful” images of Ku Klux Klan characters in some of Guston’s works.

The decision was announced this week in a joint statement from the directors of all four museums—Tate; the National Gallery of Art, Washington; the Museum of Fine Arts, Boston; and the Museum of Fine Arts, Houston—which said that the reason is “the racial justice movement that started in the US… in addition to challenges of a global health crisis”.

But Mark Godfrey, who was due to curate the Guston show at Tate Modern, has questioned the move. “Cancelling or delaying the exhibition is probably motivated by the wish to be sensitive to the imagined reactions of particular viewers, and the fear of protest. However, it is actually extremely patronising to viewers, who are assumed not to be able to appreciate the nuance and politics of Guston’s works,” he writes in a lengthy statement on Instagram.

“As art museums, we are expected to show difficult art and to support artists. By cancelling or delaying, we abandon this responsibility to Guston and also to the artists whose voices animate the catalogue such as Glenn Ligon [and] Tacita Dean,” Godfrey wrote.

According to Robert Storr, author of a recent Guston biography, “the prompt was push back from staff about an anti-lynching image from the 1930s, which was in effect the predicate for all of Guston’s later Ku Klux Klan imagery”. Storr adds: “If the National Gallery of Art, which has conspicuously failed to feature many artists-of-colour, cannot explain to those who protect the work on view that the artist who made it was on the side of racial equality, no wonder they caved to misunderstanding in Trump times.”

Guston’s daughter, Musa Mayer, also released a statement this week saying she was “deeply saddened” by the decision. She added: “My father dared to unveil white culpability [in this series of works], our shared role in allowing the racist terror that he had witnessed since boyhood, when the Klan marched openly by the thousands in the streets of Los Angeles. As poor Jewish immigrants, his family fled extermination in the Ukraine. He understood what hatred was. It was the subject of his earliest works. […] This should be a time of reckoning, of dialogue. These paintings meet the moment we are in today. The danger is not in looking at Philip Guston’s work, but in looking away”…

Rachel Wetzler, the associate editor at Art in America magazine, wrote on Twitter: “Amazing to see four museums essentially admit they don’t trust themselves to adequately contextualise the work”. Jason Farago, the art critic at the New York Times, posted that “it is bleak, beyond words, not to say cowardly, for these museums to postpone their Guston retrospective for four years”.

Read the full article in The Art Newspaper.


.

Greece has a deadly new migration policy –
and all of Europe is to blame

Daniel Trilling, Guardian, 27 August 2020

The revelation by the New York Times that Greece has secretly expelled more than 1,000 asylum seekers, abandoning many of them on inflatable life rafts in the Aegean Sea, is the latest example of this disturbing trend. Since 2015, Greece has effectively been used by the rest of the EU as a buffer zone against unwanted migration, leaving thousands of refugees in unsanitary camps on islands in the Aegean and on the mainland. At the same time, a hastily arranged EU deal with Turkey saw the latter agree to act as border cop on Europe’s behalf, preventing refugees from crossing to Greece in return for financial aid and other diplomatic concessions.

This spring, amid rising geopolitical tensions, Turkey decided to send thousands of migrants towards the Greek border as a way of exerting pressure on Europe. It provoked a nationalist backlash, followed by several hardline and legally questionable border control measures from Greece’s conservative New Democracy government. Earlier this year, the New York Times also reported that Greece was operating a secret detention centre at its land border with Turkey, so that it could carry out summary deportations without giving people the right to claim asylum; the latest revelations about its actions in the Aegean fit the same pattern.

In the central Mediterranean, where people still attempt to cross to Europe via boats launched from north Africa, mainly by smugglers in Libya, the EU has for several years been trying to stop migration by closing down rescue operations. The consequence is that people are either more likely to die at sea, or they are returned to Libya, a war zone where torture, forced labour and abuse of migrants – some of which occurs at the hands of Libyan officials the EU treats as partners – are widely documented. This spring, Italy and Malta, the countries where most people rescued from the central Mediterranean disembark (and which like Greece have also been used by their European neighbours as buffer zones), closed their ports to rescue ships on the grounds that they were no longer safe havens due to the pandemic.

While Italy has since allowed some ships to dock, Malta apparently took the pandemic as an opportunity to form its own private flotilla of merchant vessels to intercept migrants at sea and hand them over to the Libyan coastguard. As the migration monitoring organisation Alarm Phone reported in April (and as I wrote about here), Malta’s reluctance to bring people into port led to a situation in which a boat carrying more than 60 people was allowed to drift at sea for several days, during which time some of the passengers died.

It would be easy to place the blame for these situations squarely on the shoulders of countries at the EU’s Mediterranean frontier. But they are acting in a way that most European governments see as beneficial. “I thank Greece for being our European aspida [shield] in these times,” declared Ursula von der Leyen, president of the European Commission, during the Greece-Turkey border crisis in March. This includes the UK, which makes use of those buffer zones regardless of Brexit: as coronavirus spread through Europe, the Home Office refused to resettle refugee children trapped in Greece – children who had relatives in the UK and the legal right to join them – only doing so belatedly under pressure from campaigners.

Read the full article in the Guardian.


.

The wages of whiteness
Hari Kunzru, New York Review of Books, 24 September 2020

Regardless of DiAngelo’s personal politics, this truth remains. Her business model depends on making people uncomfortable, but not too much, or rather only along certain axes of discomfort. She will not get hired if she asserts that the problem she is proposing to solve may be structural and best addressed by the redistribution of power and resources, rather than maximizing the human potential of the marketing department. Of necessity, in a corporate forum, solutions need to be presented in ways that do not threaten the host organization, and that inevitably leads to their being framed as matters of personal, individual behavior.

In White Fragility, DiAngelo identifies “Individualism” and “objectivity” as “two key Western ideologies.” Individualism “claims that there are no intrinsic barriers to individual success and that failure is not a consequence of social structures but comes from individual character.” She then makes a case for why social structures and group identities matter in overcoming bias. Cognitive dissonance must afflict anyone advocating for social constructivism in today’s rigidly neoliberal corporate environment. The solution, which in essence is post-1960s liberalism’s answer whenever it is called upon to address the thorny question of collectivity, is to route the argument through consciousness. Raising or changing consciousness is conceived of as a prelude to possible future collective action. Perhaps if enough minds are changed, then social or political progress will be a natural (and preferably nonviolent) consequence. The difficult questions—of collective organization, of how the individual gets subsumed into a collective project, and of course the exercise of power—all fade tastefully into the background. The time is always soon, but never now.

Essentially, a diversity consultant has to be able to tell both an activist story and a business story, while persuading each audience that theirs is the real one, the important one, and the other is secondary. Apart from any gains in productivity that might arise from a more diverse, harmoniously functioning workforce, the corporate client also receives what could be called American liberalism’s psychological wage, the good feeling of social responsibility. The pageantry of respect is cheap, or at least cheaper than paying reparations, so on Martin Luther King Jr. Day (and latterly Juneteenth) an unlikely parade of organizations, from the FBI to ExxonMobil, came down from the mountaintop to judge us by the content of our character rather than the color of our skin. There are many variants of an Internet joke that mocks the substitution of symbolism for material change: “Black People: Stop killing us. Liberals: Hey we’re renaming the Pentagon the Maya Angelou War Center.”

Read the full article in the New York Review of Books.


.

America’s eviction epidemic
Gabriel M. Schivone, NYR Daily, 16 September 2020

Brian Goldstone, a journalist and anthropologist who is just completing research for his forthcoming book The New American Homeless, has been volunteering at an emergency housing hotline that mainly serves Atlanta residents, but also receives calls from all over the state, including rural counties, for people facing eviction. The vast majority of those affected whom he encounters are black and Latinx—although, he adds, he’s now starting to see even single white men, including tech company workers laid off during the pandemic. According to the Urban Institute, between February and April, one out of every five rental households nationwide had at least one member who lost a job.

“This is only an amplification of a problem that was already taking place before coronavirus,” Goldstone said.

Some experts’ fears earlier in the pandemic have now been borne out. “We don’t want what was originally a health crisis turned into a job crisis then now to become a housing crisis and a crisis of housing instability as people are evicted from their homes,” said Ingrid Gould Ellen, a professor and faculty director at the Furman Center for Real Estate and Urban Policy at New York University, quoted by Vox in July.

When Covid-19 arrived in the US this spring, it changed the housing landscape overnight. By late March, when the public health crisis engulfed the US, hundreds of grassroots mutual aid networks had emerged around the country, in virtually every state. They could hardly do enough, but they did help many vulnerable people. And it was many of these same aid networkers who also demanded a moratorium on evictions. In one sense, they appeared to be pushing an open door: numerous authorities at city, state, county, and federal level ordered halts on evictions, based in part on the pressing need for people to stay isolated at home to tamp down community transmission of the coronavirus.

But some commentators saw fundamental flaws in these measures from the beginning. Despite the moratoriums on evictions by authorities in the public sector, for example, few officials seemed concerned that private-sector banks were let off the hook—even though “that’s where most of the evictions and foreclosures will occur,” said Joseph Stiglitz, the former chief economist for the World Bank and a senior economic adviser to Bill Clinton, back in March. As he noted in his 2019 book, People, Power, and Profits, three in five Americans do not have the cash reserves to cover a $1,000 emergency.

Part of this pattern is familiar from the Great Recession of the 2000s that preceded my eviction. Writer and activist Laura Gottesdiener records in her book, A Dream Foreclosed: Black America and the Fight for a Place to Call Home, that evictions and foreclosures by banks shuttered homes to some 10 million people between 2007 and 2013. This amounts to the combined populations of Oklahoma, Mississippi, Wyoming, Vermont, and New Mexico.

Read the full article in the NYR Daily.


.

We’re learning more about the relationships
between race, class, and police brutality

Meagan Day, Jacobin, 23 June 2020

According to a paper published today by the People’s Policy Project, Justin Feldman’s “Police Killings In The U.S.,” of the 6,451 police killings recorded between January 2015 and the present, 3,353 of the individuals killed were white, 1,746 were black, and 1,152 were Latino.

It’s important to stress that while more white Americans are killed by police in sheer numbers, white people also make up a majority of the US population and are significantly less likely to be killed by police than black people. Feldman finds that “whites had the lowest overall rate of police killings (3.3 per million) followed by Latinos (3.5 per million). The rate of police killings for the black population was more than double that of whites: 7.9 per million.”
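As a quick sanity check on the contrast Day draws between raw counts and per-capita rates, here is a minimal sketch using only the figures quoted above (no additional population data are assumed):

```python
# Counts of people killed by police, January 2015 to mid-2020, as quoted above.
killed = {"white": 3_353, "black": 1_746, "latino": 1_152}

# Rates per million residents, as reported in Feldman's paper.
rate_per_million = {"white": 3.3, "black": 7.9, "latino": 3.5}

# Raw counts: white deaths outnumber black deaths nearly two to one...
count_ratio = killed["white"] / killed["black"]                      # ~1.9

# ...but per capita the risk runs the other way: black residents are
# killed at more than double the white rate.
rate_ratio = rate_per_million["black"] / rate_per_million["white"]   # ~2.4

print(f"White-to-black count ratio: {count_ratio:.1f}")
print(f"Black-to-white rate ratio:  {rate_ratio:.1f}")
```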

Outside the Right’s reactionary echo chamber, this reality is well known. But Feldman’s paper also introduces a new analytical category, pairing data on police killings with census tract poverty data to estimate the likely socioeconomic status of the deceased. This method is imprecise, but it’s a step forward in the general body of research on police killings. Class is rarely accounted for in data analyses of fatal police violence, and its inclusion deepens our understanding of who is susceptible to it, across racial lines.

Feldman found that “the rate of police killings increased as census tract poverty increased,” with the level of police killings in the highest-poverty quintile more than three times that of the lowest-poverty quintile. In layman’s terms, you’re overall more likely to be killed by a police officer if you’re working-class or poor. Given this country’s long and continuing history of intense racial oppression, it’s little surprise that black and Latino people are more likely to live in high-poverty areas than white people: Feldman observes that “median census tract poverty was 9.4% for whites compared to 18.7% for black and 16.8% for Latino individuals.”

The paper then examines the relationship between poverty quintile and police killings across racial demographics. What Feldman finds is notable: the correlation between poverty and susceptibility to fatal police violence that exists for white people is much stronger than for black and Latino people. In other words, white people who live in the poorest neighborhoods are at high risk of getting killed by a police officer, but black people are at high risk everywhere.

Read the full article in Jacobin.


.

The culture wars are a distraction
William Shoki, Africa Is A Country, 15 September 2020

When the EFF first emerged as a political party in 2013, it was widely cheered as being a viable option to fill the void left in working-class politics in the wake of the Marikana massacre as the ruling African National Congress’ hegemony began to crumble. While the composition of its admirers included a diverse range—disgruntled local businesspeople, university students and the urban unemployed—its militant populist style was touted as left in orientation given its advocacy for policies such as nationalizing South Africa’s mines (which it is no longer that committed to), and land expropriation without compensation. (Two years later, as South Africa’s campuses erupted with #RhodesMustFall and #FeesMustFall, the EFF won SRC elections on many campuses.)

Nowadays, the party has become too loaded with contradictions for it to be considered left-wing in any credible sense, both in its ideology and practice. Besides its lack of internal democracy and the cult of personality surrounding its leader Julius Malema, some of the EFF’s lead figures have been embroiled in various financial scandals including municipal tender fraud and the ransacking of a mutual bank primarily serving informal rural friendly societies. Throughout its history, the EFF has never had any moorings in the organized working class; it lacks any trade union affiliation (it enjoyed some informal links to the Marikana workers’ union, AMCU, but these were never formalized), nor does it have any concrete ties to other social movements like those for the unemployed or in mining-affected communities. Despite this, it clings vehemently to the rhetoric of class, and proclaims its opposition to capitalism while playing almost no part in trying to build a working-class movement in South Africa. How, then, are they still venerated by most as progressive, and taken at their word even by their naysayers who believe them to be sincerely anti-capitalist?

What explains this is that the terms of radical politics in public discourse have shifted from a materialist, class-rooted mode to an identity-based, culturalist one, and the EFF has contributed to this shift and is its biggest beneficiary. In South Africa, where race is deeply embedded in everyday thinking and experience, the EFF has capitalized on and revived the idea that black people possess a distinctive social identity, therefore constituting a “people” whose political and material interests are uniform. By positing some homogenous “black interest,” the EFF is able to flatten the contradictions of its political project, which at this point looks simply like a kind of economic nationalism, less opposed to capitalism per se, and more opposed to the fact that South Africa’s capitalist class continues to be dominated by “white monopoly capital.” The EFF’s biggest problem isn’t that capitalism concentrates wealth in the hands of the few, but that this few are predominantly foreign, white or Indian.

Read the full article in Africa Is A Country.


.

The antibiotic paradox:
why companies can’t afford to create life-saving drugs

Maryn McKenna, Nature, 19 August 2020

In a bitter paradox, antibiotics fuelled the growth of the twentieth century’s most profitable pharmaceutical companies, and are one of society’s most desperately needed classes of drug. Yet the market for them is broken. For almost two decades, the large corporations that once dominated antibiotic discovery have been fleeing the business, saying that the prices they can charge for these life-saving medicines are too low to support the cost of developing them. Most of the companies now working on antibiotics are small biotechnology firms, many of them running on credit, and many are failing.

In just the past two years, four such companies declared bankruptcy or put themselves up for sale, despite having survived the perilous, decade-long process of development and testing to get a new drug approved. When they collapsed, Achaogen, Aradigm, Melinta Therapeutics and Tetraphase Pharmaceuticals took out of circulation — or sharply reduced the availability of — 5 of the 15 antibiotics approved by the US Food and Drug Administration (FDA) since 2010 (see ‘Trimming a thinning herd’)…

Bringing a new antibiotic to market represents a Herculean feat. Only about 14% of antibiotics and biologicals in phase I trials are likely to win approval, according to the World Health Organization. A team of economists estimated1 in 2016 that the cost of getting from first recognition of an active drug molecule to FDA approval in the United States was US$1.4 billion, with millions more required for marketing and surveillance after approval. When companies such as Eli Lilly or Merck made antibiotics in the mid-twentieth century, those costs could be spread across their many divisions. And when, as used to happen, big companies bought smaller ones whose new drugs showed preclinical promise, the purchase price covered any debt the small companies had incurred.

Those business models no longer exist. The trio that runs Paratek knows this because all three are big-company veterans. Loh worked at Wyeth Pharmaceuticals in Philadelphia with Adam Woodrow, Paratek’s president and chief commercial officer, and with Randy Brenner, chief development and regulatory officer, on the successful antibiotic tigecycline (Tygacil), which was approved in 2005. (Wyeth sold its antibiotic portfolio to Pfizer in 2009.)

“When you come from a big company to a small company, your focus becomes: ‘How do I make sure this company survives?’” says Brenner, who previously also worked at Pfizer in New York City and at Shire in Lexington, Massachusetts (now a subsidiary of Takeda Pharmaceutical Company in Tokyo). “Bigger companies don’t need to think like that. No matter what happens to a product, the company survives.”

Read the full article in Nature.


.

Disdain for the less educated is the last acceptable prejudice
Michael J Sandel, New York Times, 2 September 2020

Joe Biden has a secret weapon in his bid for the presidency: He is the first Democratic nominee in 36 years without a degree from an Ivy League university.

This is a potential strength. One of the sources of Donald Trump’s political appeal has been his ability to tap into resentment against meritocratic elites. By the time of Mr. Trump’s election, the Democratic Party had become a party of technocratic liberalism more congenial to the professional classes than to the blue-collar and middle-class voters who once constituted its base. In 2016, two-thirds of whites without a college degree voted for Mr. Trump, while Hillary Clinton won more than 70 percent of voters with advanced degrees.

Being untainted by the Ivy League credentials of his predecessors may enable Mr. Biden to connect more readily with the blue-collar workers the Democratic Party has struggled to attract in recent years. More important, this aspect of his candidacy should prompt us to reconsider the meritocratic political project that has come to define contemporary liberalism.

At the heart of this project are two ideas: First, in a global, technological age, higher education is the key to upward mobility, material success and social esteem. Second, if everyone has an equal chance to rise, those who land on top deserve the rewards their talents bring.

This way of thinking is so familiar that it seems to define the American dream. But it has come to dominate our politics only in recent decades. And despite its inspiring promise of success based on merit, it has a dark side.

Building a politics around the idea that a college degree is a precondition for dignified work and social esteem has a corrosive effect on democratic life. It devalues the contributions of those without a diploma, fuels prejudice against less-educated members of society, effectively excludes most working people from elective government and provokes political backlash.

Here is the basic argument of mainstream political opinion, especially among Democrats, that dominated in the decades leading up to Mr. Trump and the populist revolt he came to represent: A global economy that outsources jobs to low-wage countries has somehow come upon us and is here to stay. The central political question is not how to change it but how to adapt to it, to alleviate its devastating effect on the wages and job prospects of workers outside the charmed circle of elite professionals.

The answer: Improve the educational credentials of workers so that they, too, can “compete and win in the global economy.” Thus, the way to contend with inequality is to encourage upward mobility through higher education.

The rhetoric of rising through educational achievement has echoed across the political spectrum — from Bill Clinton to George W. Bush to Barack Obama to Hillary Clinton. But the politicians espousing it have missed the insult implicit in the meritocratic society they are offering: If you did not go to college, and if you are not flourishing in the new economy, your failure must be your own fault.

Read the full article in the New York Times.


.

Black Lives Matter on the picket line
Amir Khafagy, Discourse Blog, 27 August 2020

The idea to strike had been growing for some time, as the discontent against Metro mounted. But the coronavirus proved to be the catalyst. As the virus crept its way into the heart of Black New Orleans, silently wreaking havoc on an already fragile population, some hoppers began to fall ill. Some hesitated to call in sick or risk losing a day’s pay.

“Metro never gave us any PPEs or anything like that so hoppers was getting sick,” said Brooks. “And if they get sick, we don’t have sick pay, so they still come to work because they can’t afford not to get paid.”

“A few of us have been thinking about striking way before the pandemic, but when we weren’t getting any PPE we figured the time was right now,” said Jonathan Edwards, a 12-year veteran hopper. “Now that we are on strike we ain’t planning to lie down”…

In 1996, New Orleans privatized its public sanitation sector, birthing a private trash collection industry overnight. Metro is currently one of three private companies that are contracted with the city to collect residents’ trash. In 2017, Metro signed a seven-year, $10.7 million contract with the city. That same year, Metro’s co-owner Jimmie Woods donated $2,500 to City Councilman Joseph I. Giarrusso III, the chair of the city’s Public Works, Sanitation and Environment Committee, which evaluates contracts and has oversight over the sanitation industry.

Interestingly, Metro Service Group is a Black-owned company. It was founded in 1982 by brothers Jimmie and Glenn Woods, who were both former hoppers. Since its founding, the brothers have become some of the most respected business leaders in the city and the company has grown into one of the largest African-American owned contractors in Louisiana: its annual revenue is $19.98 million.

With nearly 40 percent of Black businesses not expected to survive the pandemic, Metro stands as a rare success story. But Daytrian Wilken, the City Waste Union spokesperson, believes that success is predicated on the exploitation of Black bodies.

“We as a community want to support Black businesses but Black exploitation does not end because the company is Black,” she said. “Their bottom line needs them to exploit Black men.”

Read the full article in Discourse Blog.


.

English universities are in peril
because of 10 years of calamitous reform
Stefan Collini, Guardian, 31 August 2020

The truth is that if you say you want more children from deprived areas to be able to go to university, then don’t faff around with entry tariffs: invest in Sure Start centres, preschool groups, subsidised childcare and properly resourced primary schools. Make benefits genuinely accessible and life-supporting. Better still, stop whole sections of society being condemned to underpaid, vulnerable, soul-destroying labour while others cream off inordinate wealth from the profits of that labour.

But don’t kid yourself that the odd bit of “widening access” will in itself provide a magic bullet and thereby allow you, the beneficiary of a rigged competition, to sleep soundly at night.

Then there is the rather less obvious contradiction between consumerism and education. Our higher education system is at present structurally consumerist. Even now, it is not widely understood how revolutionary were the changes introduced in 2010-12 by the coalition government in England and Wales (Scotland wisely followed another course). It wasn’t simply a “rise in fees”. It was a redefinition of universities in terms of a market model. The Office for Students is explicitly a “consumer watchdog”. Consumers are defined by their wants; in exchange for payment they are “entitled” to get what they ask for.

The core experience of life-changing education is almost the exact opposite of this. It involves engaging with what is not us. It’s not about studying something we can “identify” with: it’s about encountering manifold forms of otherness; it’s about coming to understand things we didn’t previously know existed; it’s about struggling with the knotty intractability of how the world is, rather than how we like to think of it. Given this contradiction, there are few more pathetic sights than Tory backbenchers bleating about “grade inflation” while insisting that universities must provide “value for money” by “giving students what they want”.

Then there is the tension, though it may be unpopular to articulate it, between thinking of universities as a continuation of school and thinking of them as having a lot of functions in addition to undergraduate education. Those who urge that we move to “comprehensive universities”, on the model of comprehensive schools, taking all 18-year-olds from a given area regardless of grades in A-levels or equivalent exams, may be animated by admirably egalitarian convictions. But quite apart from the immediate resistance such a scheme would encounter from the dominant consumerism – “I want my children to be able to choose which (better) university they go to” – this neglects all those other functions.

Read the full article in the Guardian.


.

Don’t steal this book
Matt Taibbi, Substack, 5 September 2020

The book expresses zero compassion for those who do not see themselves as involved in politics and are just trying to get by. In her NPR interview Osterweil repeats the common left-Twitter trope that looting “is not actually hurting any people” because “most stores are insured; it’s just hurting insurance companies on some level,” which simply isn’t true. Not every business is insured for this kind of damage, and even if they are, line employees and business owners alike will lose weeks or months of income while claims are paid and repairs are made, if claims are paid and repairs are made.

She clearly has no idea what it is to work, to spend years squeaking out the shitty little margins of a corner store or a restaurant, to hose a kitchen floor down at two in the morning, or wash the puke out of the back of a taxi at the end of a shift. Abbie Hoffman at least told readers to leave big tips for waitresses, if you’re going to rip off restaurants. To Osterweil, everyone’s a kulak. She says Korean store owners were “the face of capital” in early nineties Los Angeles, just as, she says, Jewish businesses were in sixties New York. When she talks about who suffers in riots, she writes:

Though the buildings destroyed may be located in a predominantly Black or proletarian neighborhood, the losses go to the white, bourgeois building and business owners, rarely the people who live near them.

Rarely! She goes on to cite anthropologist Neal Keating in comparing looting to “the potlatch, a communal practice of Indigenous nations in the Pacific Northwest,” in which “wealthy people” at births, deaths, weddings, and other festivals give possessions away and “vie with each other to destroy the most accumulated wealth in a massive bonfire.” 

The minor distinguishing detail of the potlatch being a voluntary surrender of wealth was not considered relevant enough to mention, an interesting detail given how touchy this sort of person tends to be with boundaries in other situations. All sorts of people now have to take responsibility for the mere possibility of, say, a student being fleetingly discomfited by a word or image in a novel or history book, but apparently we don’t have to worry about making someone sad by burning their house down and throwing their shit on the street to be gobbled up by strangers, an act of “communal cohesion.”

These and countless other details make In Defense of Looting more cringe-worthy in its own way than a Sean Hannity flag-and-mugshot insta-book could ever hope to be, but what makes it a perfect manifesto for the woke era is its pathos. Adherents to this theology are characterized by a boundless, almost Trumpian capacity for self-pity, even as they’re advocating setting you on fire. They can make wrapping fishwiches sound like digging coal in Matewan, being deprived of a smartphone like being whipped by Centurions, and they matter because everyone, including especially Democratic Party politicians, is afraid of the fallout that comes with telling them to shut the fuck up. So their “ideas” spread like cancer.

Read the full article on Substack.


.

Why crowdfunding is no replacement for the welfare state
Moya Lothian-McLean, Guardian, 2 September 2020

Last week, a stranger messaged me to ask if I’d buy them some groceries.

“If not, I completely understand,” concluded the message that popped into my inbox from a young mum. “Be blessed.”

Perhaps this message would have once seemed unusual. But there has been a noticeable and distressing rise in the number of individuals resorting to digital fundraising in the UK since March, when the economic effects of the pandemic really started to bite. That people are turning to “internet begging”, as it’s uncharitably referred to, is an unsurprising development: about 730,000 people have lost their jobs since the beginning of lockdown – and that’s only those who’ve made it into the official statistics. The threat of mass homelessness forced the government to extend its ban on evictions at the eleventh hour last week. Food bank usage has almost doubled, with the Trussell Trust reporting an 89% rise in demand for emergency food parcels during April compared to last year.

Behind all this sits the skeletal remains of the British welfare state, stripped to its bones by successive governments. Universal credit, the main port of call for those in need of financial support, is unfit for purpose, with its meagre payments and sluggish system pushing low-income households further into poverty (such are the problems plaguing this system, it would need an £8bn overhaul to provide a “dependable” safety net, a cross-party House of Lords committee recently concluded).

While digital crowdfunding is far from a new concept – “charity” wasn’t exactly invented in 2020 – it has become more popular during the pandemic. We’ve already seen it take root in the US, where the private healthcare system has forced many to turn to crowdfunding to meet basic medical bills: a third of all donations made to fundraising platform GoFundMe are for US healthcare-related costs. But people in the UK had been comparatively insulated from leaning too heavily on the crutch of crowdfunding. Until now.

Within a month of lockdown, the number of crowdfunders visible on my social media timeline had risen steadily. Every day it seemed there was another GoFundMe or Paypal.Me link circulating. Their purpose had changed too: while the usual solicitations for help with expensive medical treatment or raising money in memory of a lost loved one were still there, joining them were regular requests for assistance with smaller, more rudimentary costs: rent, electricity bills, another year of further education funding for students whose graduate teaching jobs had suddenly disappeared.

Read the full article in the Guardian.


.
[Image: Wynton Marsalis and Stanley Crouch, New York, 1991. Photo: Frank Stewart]

The Stanley Crouch I knew
Adam Shatz, NYR Daily, 24 September 2020

But Stanley liked sparring with antagonists. He lived for music and argument, and they went hand in hand. His passion for “the music” (as we in jazz call it) was contagious; so was the passion with which he defended his positions about what he called the “jazz fundamentals,” blues and swing. Whether you agreed with him or not, you had—to use one of his favorite verbs—to “deal” with him. And while Stanley didn’t hesitate to press his case with me, singing the praises of some new “young lion” in Marsalis’s orbit, or belittling Cecil Taylor, one of my heroes, he didn’t try to proselytize. Instead, he told stories about musicians, often about their lives, most of them unpublishable. “You see, Billy Higgins used to tell me,” he’d begin, as if he were sharing this confidence for the first time. He wasn’t; but then he’d be off and you’d have been a fool not to listen. Stanley traded on these stories throughout his career, but they gave his criticism, for all its rhetorical bombast, its allure of authenticity (another word he loved).

Many jazz critics avoid hanging out with musicians, usually because they’re afraid of jeopardizing their objectivity. Stanley was different. Not only did he think that spending time with musicians was crucial to understanding their work, he loved being around them. His appetite for the jazz life—for life, generally—seemed to know no limits. Some musicians thought he was full of shit, but even those who did mostly liked him. (Not all, of course: “Fuck him,” one musician he’d attacked wrote me just after his death.) Stanley knew jazz as intimately as he did because he lived the jazz life. If it led him to be less than objective, he didn’t see the point of being critically detached about the music he loved…

What Stanley believed in, I think, was calling things as he saw them even if it meant speaking hard truths, which used to be the business of critics. He left a number of bruises, not always in the right places: his virulent review for The New Republic of Toni Morrison’s Beloved—“a blackface Holocaust novel”—was an especially unfortunate example of his weakness for gratuitous, attention-seeking polemic. Yet he remained faithful to his belief that a critic shouldn’t mince words or genuflect to fashionable pieties. And for all his criticisms of his black intellectual peers, he argued unceasingly that American culture and black culture are inseparable, indeed almost synonymous. America, he wrote, “is not so much a melting pot as it is a rich thick soup in which every ingredient both maintains its taste and also takes on the taste of everything else.” That, I think, is the thing Stanley most wanted to impart: what he considered the real taste of American culture. The idea that “anti-blackness” is foundational to the republic didn’t shock him, but neither did it preoccupy him; he was more interested in the miracle of how blackness transformed the nation’s culture—above all, in its music. Jazz, as he saw it, was not only the expression of American democracy, but it was also the only working model of meritocracy in America, other than sports. Rather than bemoan its absence in other arenas, he wanted to build on the example.     

Read the full article in NYR Daily.


.

“White Australia” policy lives on in immigration detention
Behrouz Boochani, New York Times, 20 September 2020

Growing up in a Kurdish family in the Ilam Province of Iran, I never expected my life to be affected by Australia’s history of white supremacy and settler colonialism. I had little awareness of Australia, a faraway country founded as a penal colony, and built on the massacres of its Indigenous people and on European migration. It was to be decades before I would hear about the White Australia policy, an official state immigration policy, in effect between 1901 and 1973, barring nonwhite people from immigrating to the country and intent on making Australia a white nation.

Yet the xenophobic legacy of the White Australia policy had a significant impact on the trajectory of my life and choked the lives of thousands of asylum-seekers and migrants who were held by Australia in offshore detention centers in its former colony Papua New Guinea and on the island of Nauru, a former protectorate.

After graduating from a public university, I wrote a bit for a Kurdish magazine in Ilam but mostly contributed to Kurdish publications outside Iran and advocated the preservation of Kurdish culture, which was seen as a threat by Iranian hard-liners. In 2013, the Iranian Revolutionary Guards Corps arrested some of my journalist colleagues. I was being followed and surveilled, and I went into hiding. The pressure was relentless; I had no choice but to flee Iran.

I flew to Indonesia and from there traveled with 60 other people by boat to Australia. We were intercepted and taken by the Australian Navy to Christmas Island, an Australian territory in the Indian Ocean. Subsequently, in a shocking move by the Australian government, I, along with hundreds of other people seeking asylum, was banished from there to a remote prison in the middle of a silent ocean in Manus Province on Papua New Guinea.

I arrived there during the same week that Kevin Rudd, then the prime minister of Australia, brought in a horrific immigration policy. On July 19, 2013, he announced that asylum seekers arriving on Australian shores by boat would never be allowed to settle in Australia and would be forcibly taken to Papua New Guinea and Nauru. Australia paid the government of Papua New Guinea to keep hundreds of asylum seekers like me imprisoned in a disused naval base on Manus Island.

When I set foot on the island I was confronted with a decrepit and filthy prison, and saw a group of refugees — men, women and children — who had been imprisoned there before us. They told us they had been there since 2012. A few days after we arrived they were transferred to Australia. We were their replacements.

I had no prior knowledge of this prison, and found it extraordinary when I learned that hundreds of people had been held there in 2001. The Australian government led by Julia Gillard, the prime minister between 2010 and 2013, had reopened it in 2012.

Read the full article in the New York Times.


.

Has the Parliamentary check on the executive practically ended in India?
Shoaib Daniyal, Scroll.in, 25 September 2020

There are a number of things that go into the making of a Westminster-style Parliamentary democracy but none more critical than the principle of an executive being responsible to the elected legislature. A government holds power only so long as it has the confidence of the legislature. But even when it sits in office, it is constantly kept on its toes by legislators. From voting on bills to asking ministers questions, the legislative check on the executive is the lifeblood of any parliamentary democracy.

However, in the world’s largest Parliamentary democracy, this principle is now teetering dangerously on the edge.

Take the monsoon session of India’s Parliament that ended on Wednesday. It was a critical session, the first after the Covid-19 pandemic hit India. Moreover, the country is getting buffeted by many grave problems: from India’s ever-rising coronavirus graph to an unprecedented economic crash to the Chinese army intruding into parts of Ladakh. More than ever, India was looking to its elected representatives to discuss its issues and hold the government responsible for any lapses.

However, Parliament found itself hobbled even before it convened. The Modi government decided that the monsoon session of Parliament would be held without Question Hour, the segment of a parliament session during which MPs are allowed to ask questions of the government.

The government claimed that Question Hour requires the presence of a large number of bureaucrats in Parliament to brief ministers and would thus violate Covid-19 norms. Considering that hundreds of MPs were meeting anyway, it was an unusual excuse. It was even more unusual given that this is 2020 and so much of government work around the world is already being done via technology…

This wasn’t all. On September 16, the speaker of the Lok Sabha disallowed a debate on the India-China border situation. This came after Prime Minister Narendra Modi began the session urging Parliament to “speak in one voice” to support India’s soldiers – a not-so-subtle message for the Opposition to support the government.

It is by now clear that the Modi government has much to answer about its mishandling of the Chinese incursions. The administration has bungled something as simple as maintaining a consistent position on the exact nature of the Chinese aggression. In such a situation, not allowing MPs to question the government not only devalued Parliament, it hurt India’s national security by letting the government off the hook.

Read the full article in Scroll.in.


.

How the black vote became a monolith
Theodore R. Johnson, New York Times, 16 September 2020

The Democrats’ and Republicans’ national platforms in this period often addressed civil rights in nearly equal measure, and sometimes Republicans were more progressive on the question. President Dwight Eisenhower declared in the 1950s that racial segregation harmed the nation’s security interests. Deploying the 101st Airborne to enforce the integration of Central High School in Little Rock in 1957, he warned that “our enemies are gloating over this incident and using it everywhere to misrepresent our whole nation.” Richard Nixon held positions on civil rights similar to John F. Kennedy’s during the 1960 presidential campaign, and won nearly a third of the Black vote that year (though in the South, where the majority of the Black population still lived, Black voters were effectively barred from the polls).

It was the last time a Republican would win more than 15 percent of the Black vote in a presidential election. Stumping for Nixon in 1960, Senator Barry Goldwater, the Arizona Republican, declared that “there’s hardly enough difference between Republican conservatives and the Southern Democrats to put a piece of paper between.” When Goldwater became the 1964 Republican presidential nominee and voiced his opposition to the Civil Rights Act, Black voters bunched themselves into the Democratic Party for good, supporting Lyndon Johnson at a rate comparable with Barack Obama’s nearly a half-century later…

Within a decade, white Southern Democrats were responding favorably to the appeals of the Republican Party. Richard Nixon’s “law and order” refrain and Ronald Reagan’s renewed call for “states’ rights” were racialized, implicitly communicating opposition to progressive policies like busing and tapping into anxieties about a rapidly integrating society. With explicitly racist appeals now socially taboo, symbolic and ostensibly colorblind gestures made the transition easier by reframing the race question as one about free-market principles, personal responsibility and government nonintervention. Racial segregation could be achieved without openly championing it; the social hierarchy maintained without evangelizing it. American voters, Black and white alike, got the message.

The Republican Party’s rightward move on race was a tremendous electoral success, winning the White House in five out of six elections from 1968 to 1992 and the Senate in consecutive elections for the first time since the onset of the Great Depression. At the same time, the Democratic Party deepened its relationship with Black voters. The electoral power of Black voters produced historic firsts, like the first elected Black governor in the nation’s history in Virginia, Douglas Wilder. Jesse Jackson lost his presidential primary runs in 1984 and 1988, but his strong showings won concessions in the Democratic Party platform. More Black members arrived in Congress, won mayoral races and set the stage for the Black political identity to become synonymous with support for Democrats. Symbolic fights, like over whether to commemorate the Rev. Martin Luther King Jr. with a federal holiday, further clarified the racial divisions between the parties.

The result was that racial polarization was now less a product of partisan philosophies about the personhood or citizenship of Black Americans and more a fact of partisan identity — and a political instrument to hold and wield power. This was a subtle but profound shift, and a dangerous one. As the University of Maryland professor Lilliana Mason writes in her 2018 book, “Uncivil Agreement,” “Partisan, ideological, religious and racial identities have, in recent decades, moved into strong alignment, or have become ‘sorted,’” such that partisan attacks can become race-based, personal and unmoored from policy disputes.

Read the full article in the New York Times.


.

This soldier’s witness to the Iraq War lie
Frederic Wehrey, NYR Daily, 15 September 2020

A few weeks before I deployed to Iraq as a young US military officer, in the spring of 2003, my French-born father implored me to watch The Battle of Algiers, Gillo Pontecorvo’s dramatic reenactment of the 1950s Algerian insurgency against French colonial rule. There are many political and aesthetic reasons to see this masterpiece of cinéma vérité, not least of which is its portrayal of the Algerian capital’s evocative old city, or Casbah. One winter morning in 2014, more than a decade after I first saw the film, I took a stroll down the Casbah’s rain-washed alleys and into the newer French-built city. Scenes from the black-and-white movie—like the landmark Milk Bar café where a female Algerian guerrilla sets off a bomb that kills French civilians—jumped to life. The ensuing French military response, memorably depicted in the film, included arbitrary arrests, torture, and “false flag” bombings that only inflamed the Algerian insurrection. 

It was these moral perils of counterinsurgency that my father hinted at. “Keep your eyes open,” he told me. This was a prescient warning, one that served as the backdrop for my deployment, even if the Algerian analogy was imperfect and would become overused. As American soldiers soon faced a guerrilla and civil war in Iraq for which they were woefully ill-equipped, intellectually and militarily, The Battle of Algiers would be screened and discussed at the Pentagon. To this day, it is taught to West Point cadets as a cautionary tale. 

Still, the full weight of the film’s lessons was not apparent to me in Iraq until one morning in the summer of 2003, when I received an urgent phone call about a captured Iraqi intelligence officer. My commander wanted me to go interview him at the Baghdad hospital where he was being treated for unspecified wounds. 

I donned my Kevlar vest and grabbed my carbine for the trip to the so-called Green Zone in the city center, which was becoming increasingly dangerous because of bomb attacks and ambushes by a growing insurgency.

My own experience with this militancy was mostly of a distant nature—though my encounters were anything but impersonal. As an intelligence officer, I debriefed Iraqi sources and informants on insurgent groups and foreign fighters, which sometimes yielded detailed information that US soldiers would use to conduct raids, looking for weapons, explosives, insurgents, or wanted ex-regime figures. Since I read the after-action reports of these operations, I learned the names and ages of those who were captured. Sometimes, I even saw photographs of their faces. This established a sort of intimacy, a chain of causality between my actions and their fates. 

Read the full article in the NYR Daily.


.

How Angela Merkel’s great migrant gamble paid off
Philip Oltermann, Observer, 30 August 2020

The events of the summer of 2015 did evidently mobilise and further radicalise Germany’s rightwing extremist circles, who targeted asylum shelters with arson attacks or assassinated politicians with pro-immigration views, such as the CDU’s Walter Lübcke. No other country in Europe saw as much severe and fatal rightwing violence in 2019 as Germany.

Germany’s Federal Office of Criminal Investigations records a rise in criminal offences, including violent crime, in the years between 2014 and 2016, linking the trend to the influx of migrants. The percentage of asylum seekers found guilty of such crimes also doubled in the same period. However, the majority of these offences took place within the refugee shelters where new arrivals were initially housed. By 2017, when Trump claimed that “crime in Germany is way up” because it had taken in “all of those illegals”, the number of overall recorded crimes was decreasing. Last year, crime in Germany sank to an 18-year low.

What about the organised crime on Europe’s borders, where human traffickers prey on those willing to risk it all in the hope of a better life? In a 2017 book on reforming asylum policy, British economist Paul Collier argued that “while the industry was already well-established in the Mediterranean, the massive rise in demand triggered by the invitation from Germany further increased demand for smuggling by criminal syndicates.”

Gerald Knaus, chairman of the European Stability Initiative, a thinktank that advises EU member-states on migration policy, disagrees vehemently: “The thesis that Merkel created the refugee crisis was absurd in 2015, and it’s even more absurd in retrospect,” he says.

Empirical studies have failed to find data proving that Merkel’s Wir schaffen das (“we can do this”) significantly intensified the movement of refugees into Europe, although it is likely that the attention drawn towards Germany’s liberal stance on asylum influenced the decisions of those who were already in Europe at the time.

“The question is: what could she have done differently?” says Knaus. “Reintroduce borders and try what France did after the Bataclan attacks in November 2015, sending all irregular migrants back to Italy? That proved futile: France received twice as many asylum applications in 2019 as in 2015. You can’t seal a wide-open border with rhetoric and a few more border guards, while brutality was fortunately ruled out in Germany.”

Read the full article in the Observer.


.
Detail of an illumination from a 1409 manuscript of Pierre Salmon’s Dialogues, Bibliothèque Nationale de France

Book of Revelation
David Rundle, Lapham’s Quarterly, 17 September 2020

In medieval Western Europe, the book was an essential tool for rational debate, but its power went beyond reason. This was an age when a book could work miracles: the book was so much more than a container for words inscribed on pages.

The written word wielded power in part because of the rarity of the skills required to comprehend it, let alone compose in it. In the later Middle Ages, books were produced in increasing quantities for increasingly diverse audiences, but even then, as earlier, in Christendom, literacy was always the reserve of the few. This situation contrasted with Judaism where, at least in theory, every adult male was expected to be able to transcribe the holy texts. It was also a situation that continued to pertain in England in the early eighteenth century: most people were incapable of signing their names. If we move back from that date just three hundred years, we find that reliable estimates are difficult to extrapolate from the incomplete and incidental evidence available: some optimistic assessments would have about a third of England’s population being in some way literate; a soberer guess puts the rate of full literacy closer to one in ten.

Those contrasting figures partly reflect complexities of definition. The ability to sign does not necessarily demonstrate wider skills of writing or reading; at the same time, an inability to mark the page with more than a cross does not categorically disprove a capacity to read, even perhaps to a level of some fluency. What is certain is that there was substantial variation and some of the factors affecting that are clear. You were more likely to be able to write if you were a man, not a woman, and if you were one of the minority who lived in a town, not the countryside. Even if you were an urban male, the probability remained that you would be illiterate, unless you were fortunate enough to have been born toward the end of our period into a good family in Florence, which probably had the highest rates of literacy in Europe.

Both a result of and a reason for the increase in literacy in that city was the development there, and elsewhere in northern Italy, of education in the vernacular. In other places and at other times, school meant Latin, and instruction was primarily intended for boys who were going to enter the church. The bond between learning and religion remains in English, latent in the double meaning of clerical (contrast clerical assistant with clerical garments). The implication is that those who came to gain some facility in reading and writing in their mother tongue may not have had access to the lingua franca of communication shared across Europe. Only a minority of the small minority who were literate were so literate to the level of being literati.

Readers were few, but writing was everywhere. You might not have been able to decipher the letters, but you would have been hard pressed to escape encounters with texts. Words were written on wooden markers or chiseled on stones in your local graveyard and painted on the walls of your parish church. They were carried in your purse, as the legend—the literal meaning of which is “what must be read”—on coins. They might travel close to your flesh, as short texts stored in amulets to bring good fortune and ward off evil. Written records were also a technology of control, the documents held by landowners defining the dues they claimed from those who lived on their land—and so writing was, for some, an object of hatred, a symbol of oppression, to be destroyed when the opportunity allowed, as during the Flemish peasants’ revolts of 1323–28, the Jacquerie rebellion in northern France that began in 1358, and England’s Peasants’ Revolt of 1381.

Read the full article in Lapham’s Quarterly.


.

We’re all socially awkward now
Kate Murphy, New York Times, 1 September 2020

As the school year begins amid a pandemic, many are concerned about the negative impact that virtual or socially distanced learning may have on children’s developing social skills.

But what about grown-ups? It seems adults deprived of consistent and varied peer contact can get just as clumsy at social interactions as inexperienced kids.

Research on prisoners, hermits, soldiers, astronauts, polar explorers and others who have spent extended periods in isolation indicates social skills are like muscles that atrophy from lack of use. People separated from society — by circumstance or by choice — report feeling more socially anxious, impulsive, awkward and intolerant when they return to normal life.

Psychologists and neuroscientists say something similar is happening to all of us now, thanks to the pandemic. We are subtly but inexorably losing our facility and agility in social situations — whether we are aware of it or not. The signs are everywhere: people oversharing on Zoom, overreacting to or misconstruing one another’s behavior, longing for but then not really enjoying contact with others.

It’s an odd social malaise that can easily become entrenched if we don’t recognize why it’s happening and take steps to minimize its effects.

“The first thing to understand is that there are biological reasons for this,” said Stephanie Cacioppo, the director of the Brain Dynamics Laboratory at the University of Chicago. “It’s not a pathology or mental disorder.”

Even the most introverted among us, she said, are wired to crave company. It’s an evolutionary imperative because there’s historically been safety in numbers. Loners had a tough time slaying woolly mammoths and fending off enemy attacks.

So when we are cut off from others, our brains interpret it as a mortal threat. Feeling lonely or isolated is as much a biological signal as hunger or thirst. And just like not eating when you’re starved or not drinking when you’re dehydrated, failing to interact with others when you are lonely leads to negative cognitive, emotional and physiological effects, which Dr. Cacioppo said many of us are likely experiencing now.

Even if you are ensconced in a pandemic pod with a romantic partner or family members, you can still feel lonely — often camouflaged as sadness, irritability, anger and lethargy — because you’re not getting the full range of human interactions that you need, almost like not eating a balanced diet. We underestimate how much we benefit from casual camaraderie at the office, gym, choir practice or art class, not to mention spontaneous exchanges with strangers.

Read the full article in the New York Times.


.

Why diversity training on campus is likely to disappoint
Amna Khalid & Jeffrey Aaron Snyder, The Conversation, 5 August 2020

Called into a typical diversity training session, you may be told to complete a “privilege walk”: step forward if “you are a white male,” backward if your “ancestors were forced to come to the United States,” forward if “either of your parents graduated from college,” backward if you “grew up in an urban setting,” and so on.

You could be instructed to play “culture bingo.” In this game, you would earn points for knowing “what melanin is,” the “influence Zoot suits had on Chicano history” or your “Chinese birth sign.”

You might be informed that white folks use “white talk,” which is “task-oriented” and “intellectual,” while people of color use “color commentary,” which is “process-oriented” and “emotional.”

You will most definitely be encouraged to internalize an ever-expanding diversity lexicon. This vocabulary includes terms such as Latinx, microaggressions and white privilege.

It also features terms that are more obscure, like “adultism,” which is defined as “prejudiced thoughts and discriminatory actions against young people, in favor of the older.”

In terms of reducing bias and promoting equal opportunity, diversity training has “failed spectacularly,” according to the expert assessment of sociologists Frank Dobbin and Alexandra Kalev. When Dobbin and Kalev evaluated the impact of diversity training at more than 800 companies over three decades, they found that the positive effects are short-lived and that compulsory training generates resistance and resentment.

“A company is better off doing nothing than mandatory diversity training,” Kalev concluded.

Some of the most popular training approaches are of dubious value. There is evidence, for example, that introducing people to the most commonly used readings about white privilege can reduce sympathy for poor whites, especially among social liberals.

There is also evidence that emphasizing cultural differences across racial groups can lead to an increased belief in fundamental biological differences among races. This means that well-intentioned efforts to celebrate diversity may in fact reinforce racial stereotyping.

With its emphasis on do’s and don’ts, diversity training tends to be little more than a form of etiquette. It spells out rules that are just as rigid as those that govern the placement of salad forks and soup spoons. The fear of saying “the wrong thing” often leads to unproductive, highly scripted conversations.

This is the exact opposite of the kinds of debates and discussions that you would hope to find on a college campus.

The main beneficiaries of the forthcoming explosion in diversity programming will be the swelling ranks of “diversity and inclusion” consultants who stand to make a pretty penny. A one-day training session for around 50 people costs anywhere between US$2,000 and $6,000. Robin DiAngelo, the best-selling author of “White Fragility,” charges up to $15,000 per event.

Read the full article in The Conversation.


.

Kept from all contagion: Germ theory, disease, and the dilemma of human contact in late nineteenth-century literature by Kari Nixon
Jodie Matthews, LSE Review of Books, 24 September 2020

Kept From All Contagion takes its title from Grant Allen’s 1895 novella, The Woman Who Did, usually read for its association with the figure of the fin-de-siècle ‘New Woman’. The idea of being ‘kept from all contagion’ stands in for author Kari Nixon’s broader hypothesis that many authors of the late nineteenth century represented the conflict between the risk of contagion and vital social contact in ways that would have been immediately obvious to contemporaneous readers. This conflict is, of course, supremely legible in 2020.

In the period on which Nixon focuses, the theory that living microorganisms, spread from person to person, were the true cause of infectious disease was rapidly gaining in authority. This ‘germ theory’ displaced previous ideas such as miasma theory. As Nixon demonstrates, germ theory did not immediately and fundamentally alter medical responses to disease, but it did frame human contact in new, and often disturbing, ways.

In this book, Nixon achieves the difficult balance between a rigorous scholarly tone and engaging and always appropriate comparisons with contemporary culture. Those comparisons – with anti-vaxxers, or with Margaret Atwood’s novel The Handmaid’s Tale, for instance – demonstrate the power of reading ‘for today’, as renowned literary critic J. Hillis Miller terms it, rather than the strictly historicist method of reading with a ‘period mindset’ that has sometimes beleaguered the academic monograph in this discipline.

That Nixon allows us to read literature of the nineteenth century for today is crucial for a book about contagion, when ‘today’ is defined by a global pandemic altering every facet of her readers’ lives, one that the author could not have predicted as the manuscript was put to bed. It is impossible to read about disease and human contact, no matter the century, without seeking out the lessons, the hubris and the hope that find meaning now.

The contagion explored in this book includes plague, cholera, syphilis, tuberculosis, typhoid, puerperal fever and smallpox – a list that, even now, causes a shiver down the spine. Nixon’s twenty-first-century point of comparison is Ebola, especially the ‘Dallas Ebola Outbreak’ of 2014.

Nixon’s book asks us to think about the politics and cultural effects of ‘contaminated connectivity’. This is pushing at an open door in a world where there are mass demonstrations about wearing face masks. The more complicated idea to navigate at the moment, depending on one’s COVID-19 outlook, is ‘that to reject risk is to reject real connection with others’. Risk for a character in a nineteenth-century novel is one thing; I’m all for it – bring on the risk! Negotiating how much life to live during a real and present pandemic is quite another matter. Certainly, one’s reaction to Nixon’s conclusions will depend on precisely where and when one reads it. In full lockdown? The day that schools reopen? During a second or third wave? I hope that I will feel very differently about contaminated connectivity in a less urgently contaminated future. Ultimately, Nixon concludes that what she calls the ‘“messy” engagement with the biosocial amalgam of humanity is not about a desire for disease, but about a desire to engage with the human community that takes precedence over other, baser fears’. This certainly puts a different spin on the behaviour of those who contravene public health regulations, but also provokes a rethink of the ebb and flow of COVID-related anxiety.

Read the full article in the LSE Review of Books.


.

Banishing the idea of the “Dark Ages”
Mary Wellesley, Guardian, 25 September 2020

In The Light Ages, Seb Falk unpicks many of these popular assumptions. He points out that several accounts of the history of science begin sometime around 1600, as though scientific inquiry just popped out of the ground like a mushroom. But a mushroom is just the visible surface growth of a larger organism. And the same applies to medieval scientific thinking, which was complex, interconnected and wide-ranging. Far from being resistant to foreign ideas, medieval thinkers systematically translated works from Greek, Hebrew and Arabic by writers from Iberia to Persia. Falk speaks of the “irresistible medieval drive to tinker, to redesign, to incrementally improve or upgrade technology” and the same was true of scientific thought.

This was not an age that abhorred novelty or an age of narrow conformity, but one in which the latest ideas were hotly debated. Medieval thinkers also sought to build on the learning of earlier ages, despite viewing pre-Christian writers with a whiff of suspicion. The early church fathers likened pagan philosophy to the gold and silver that the biblical Israelites took with them on their exodus from slavery – tainted by association, but still precious. This was the period, in Europe, of the first eyeglasses, the first mechanical clocks and the first universities. The Middle Ages were anything but “dark”.

We feel for Westwyck when he was sent to Tynemouth Priory – a Northumbrian daughter house of St Albans Abbey, perched on a rocky outcrop looking out to the North Sea. Here, according to an account written by another monk, “night and day the waves rage”, there are “dense and gloomy fogs” that “dull the eyes, hoarsen the voice and constrict the throat”. He adds that, “spring with its flowers is outlawed there; summer warmth is banned”. And for the intellectually inquiring Westwyck, it must have been especially rough that the priory had only “a dozen or so books”.

Read the full article in the Guardian.


.

How history began with a counterfactual
Tom Holland, Unherd, 18 September 2020

Counter-factuals are as old as history itself. The first historian to imagine an altered timeline was the first historian. Herodotus, whose ‘historia’ — ‘enquiries’ — into the great events that had recently shaken the Greek world served to establish history as an entirely new genre, knew that some of his perspectives on the past were bound to prove controversial. Of these, the one he evidently felt most uncomfortable expressing was his conviction that the Athenians ranked as the saviours of Greece. This was, he freely admitted, “an opinion which most people will find hard to stomach.” Nevertheless, he stuck to it. Not only that, but he also sought to justify it in a most novel way.

Herodotus’ high estimation of the debt owed by Greece to the Athenians rested on his interpretation of the epic events of the year that we commemorate as 480 BC. 2,500 years ago this summer, the King of Persia crossed the Hellespont at the head of a massive army and fleet. Xerxes ruled the largest empire that the world had ever seen, and the resources available to him seemed so stupefying to the Greeks as to appear effectively limitless. Many, convinced that they had no prospect of resisting such an adversary, scrambled to collaborate.

Only a few cities, headed by Athens and the peerless warrior-state of Sparta, refused to surrender. At Thermopylae, a pass to the north of Athens, a Greek holding-force led by a Spartan king was dislodged after three days’ brutal fighting, and the Spartan king killed. Athens fell soon afterwards. The Acropolis was stormed and burned. But then, in the waters off a nearby island called Salamis, the Greek fleet won an unexpected victory. The following summer, an alliance of various Greek cities routed the Persian land forces. The liberty of mainland Greece was definitively secured. To the Greeks themselves it seemed a barely believable triumph: the most astounding victory of all time.

Who, though, had best earned the bragging rights? Since it was the Athenians who had provided by far the largest contingent of ships at Salamis, and it was Salamis that had proved the decisive engagement, no one could really deny that their role in defeating the Persian invasion had been a significant one. Nevertheless, over the course of the decades that followed, patience with the notion loudly trumpeted by the Athenians that Greece owed them her freedom came to wear very thin. Athenian triumphalism proved as wearying to other Greeks as English chants about the Second World War tend to be today to continental football fans. Nor was boasting all the Athenians did to make themselves unpopular. Cities liberated from Persian rule in the wake of Xerxes’ defeat increasingly found themselves the victims of an Athenian extortion racket. Sparta, the city that alongside Athens had led the resistance to the Persian invasion, grew ever more alarmed. Relations between the erstwhile allies fell apart. In 431, cold war exploded into open conflict — destined to rage for decades to come. The whole Greek world was made to bleed. Such was the background against which Herodotus wrote his history.

So it was that, painfully conscious of making a case that was bound to infuriate many, he invented alternative history. Suppose, he pondered, that the Athenians had refused to fight Xerxes. Suppose instead that they had set sail in their fleet for some distant land, or had actively collaborated.

Read the full article in Unherd.
