Pandaemonium

PLUCKED FROM THE WEB #72


The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

 

The urgency of critique
New Frame, 21 May 2020

In South Africa there is, to our collective shame, a long list of unarmed people who have been killed by the police in the post-apartheid era. People have been killed during protests, as well as during various forms of armed state action such as evictions and disconnection from self-organised electricity connections. There’s an equally long and shameful list of people who have died in police custody.

It is not unusual for deaths at the hands of the state to pass without any media reports or public discussion. When there are reports, the names of the dead are sometimes not given and frequently no attempt is made to establish the circumstances of the killing. Often the scandal is that there is no scandal.

When a police murder has been recorded on camera – as happened with the killing of Andries Tatane during a protest in Ficksburg in 2011 and the murder of Mido Macie in Daveyton in 2013 – there is usually some public discussion.

But not even the massacre at Marikana in 2012, shown that night on the television news, resulted in sustained mass protest. It did not result in the fall of the government or even the resignation of a president. We do not just face a situation in which we are governed by a state that routinely subjects impoverished black people to colonial forms of policing, we also inhabit a society in which there is tacit consent for colonial forms of policing.

There have been a number of deaths at the hands of the police and the army during the lockdown. It would be naive to assume there will not be more in the weeks to come.

The murder, on Good Friday, of Collins Khosa in Alexandra has received more media attention than is usually the case with these sorts of murders. This is due, in part, to the shifts in reporting that have occurred during the lockdown. The coronavirus pandemic has generated some sense that all our futures are entwined and, as a result, there has been more coverage than usual of the experiences of impoverished people, including evictions and hunger.

Khosa’s death is now subject to a legal challenge against the minister of defence and the chief of the army, which is also why it is receiving attention. This is not an anomaly. Grassroots activists have long known that middle-class actors will frequently ignore or disbelieve their claims of abuse at the hands of the state if they are not captured on video or repeated in court. State actors understand this equally well, which is why grassroots activists making use of the courts have been targeted for repression and are frequently accused, in high colonial fashion, of being manipulated by malicious white agitators or foreign governments.

Read the full article in New Frame.


.

Just because you can afford to leave the city
doesn’t mean you should

Mary T Bassett, New York Times, 15 May 2020

But everything we know so far about the coronavirus tells us that blaming density for disease is misguided.

New York City Health Department data indicate that Manhattan, the borough with the highest population density, was not the hardest hit. Deaths are concentrated in the less dense, more diverse outer boroughs. Citywide, black and Latino residents are experiencing mortality rates that are twice those of white city dwellers.

Then there is the rest of the world. While the coronavirus first exploded in Wuhan, a city of 11 million, many ‘hyperdense’ cities in Asia have been able to contain their outbreaks. The virus appeared in Singapore (5.6 million residents), Seoul (9.8 million), Hong Kong (7.5 million) and Tokyo (9.3 million), cities close in size to New York, but with much lower recorded deaths.

Cities, large and dense by definition, do not inevitably support explosive viral transmission. But factors that do seem to explain clusters of Covid-19 deaths in the United States are household crowding, poverty, racialized economic segregation and participation in the work force. The patterns of Covid-19 by neighborhood in New York City track historical redlining that some 80 years ago established a legacy of racial residential segregation.

Population density is not the same as household overcrowding. The U.S. census defines crowding as more than one person per room, excluding the kitchen and bathroom. That means a one-bedroom apartment occupied by four people is crowded. In 2013, the Bronx had New York City’s highest percentage of crowded households (12.4 percent), followed by Brooklyn (10.3 percent) and Queens (9.3 percent). Manhattan and Staten Island had 5.4 percent and 3.4 percent crowding. (Nationally, 2 percent of people live in crowded households.)…

The disease is devastating cities like New York because of the structure of health care, the housing market and the labor market, not because of their density. The spread of the coronavirus didn’t require cities — we have also seen small towns ravaged. Rather, cities were merely the front door, the first stop.

It’s not that there are too many people in cities. It’s that too many of their residents are poor, and many of them are members of the especially vulnerable black, Latino and Asian populations. That’s what underlies the erroneous idea that, for those who can, it might be best to get far away from those people who endure household crowding and its risks.

Read the full article in the New York Times.


.

Did Japan just beat the virus
without lockdowns or mass testing?
Lisa Du & Grace Huang, Bloomberg, 22 May 2020

Japan’s state of emergency is nearing its end with new cases of the coronavirus dwindling to mere dozens. It got there despite largely ignoring the default playbook.

No restrictions were placed on residents’ movements, and businesses from restaurants to hairdressers stayed open. No high-tech apps that tracked people’s movements were deployed. The country doesn’t have a center for disease control. And even as nations were exhorted to ‘test, test, test,’ Japan has tested just 0.2% of its population — one of the lowest rates among developed countries.

Yet the curve has been flattened, with deaths well below 1,000, by far the fewest among the Group of Seven developed nations. In Tokyo, its dense center, cases have dropped to single digits on most days. While the possibility of a more severe second wave of infection is ever-present, Japan has entered and is set to leave its emergency in just weeks, with the status already lifted for most of the country and likely to exit completely as early as Monday.

Analyzing just how Japan defied the odds and contained the virus while disregarding the playbook used by other successful countries has become a national conversation. Only one thing is agreed upon: that there was no silver bullet, no one factor that made the difference.

‘Just by looking at death numbers, you can say Japan was successful,’ said Mikihito Tanaka, a professor at Waseda University specializing in science communication, and a member of a public advisory group of experts on the virus. ‘But even experts don’t know the reason.’

Read the full article in Bloomberg.


.

That blood on the tracks
was of people abandoned to their fate
Salil Tripathi, Live Mint, 14 May 2020

They had names. Dhansingh Gond. Nirvesh Singh Gond. Buddharaj Singh Gond. Achchelal Singh. Rabendra Singh Gond. Suresh Singh Kaul. Rajbohram Paras Singh. Dharmendra Singh Gond. Virendra Singh Chainsingh. Pradeep Singh Gond. Santosh Napit. Brijesh Bheyadin. Munimsingh Shivratan Singh. Shridayal Singh. Nemshah Singh. Deepak Singh.

They worked hard. They were migrant workers. They had left poorer parts of India to earn their living in the country’s richer parts, so that they could look after their families and hope to build a better future for their children. They came because one of their cousins had probably found a job in a bigger town and knew of the opportunities and asked them to join him, and they came without contracts, banking on those words. And if they did sign contracts later, they knew these were not going to be honoured. Even if they were to try getting them honoured, who knew if the courts, busy as they were, would have time for their plea? When migrant workers started walking home after the first lockdown, a court had asked the government about it, and the government assured the court that it was no longer a problem. This is the sort of issue for which committees are formed to submit reports. These things take time. If only everyone respected social distancing, we would be fine.

The migrants understood what social distancing meant. But it wasn’t easy. It is hard to maintain any distance, forget social distance, in a slum. There are eight in a room, lying on mattresses next to one another; the stove is in a corner, and the bathroom is shared with many more. There is no running water to wash hands regularly, there are no hand-sanitizers. Even in a slum, rent had to be paid. But where would the money come from? The construction site or the farm had closed.

They were abandoned. Some employers had probably shuttered factories. Some employers refused to pay any wages. Some paid them a little and told them to leave. They had come from elsewhere, they were reminded that they did not belong to the town in which they worked.

They felt lonely. Some had helped build the big city. Maybe some were the watchmen of apartment complexes. Some could have driven rickshaws. Some could have been domestic help. Or perhaps waiters. The specifics don’t matter. They were part of the silent, invisible force that keeps towns and cities humming, the force that made sure that their bright lights always shone, and cities that never sleep kept moving.

Read the full article in Live Mint.


.

Why do some COVID-19 patients infect many others,
whereas most don’t spread the virus at all?

Kai Kupferschmidt, Science, 19 May 2020

Most of the discussion around the spread of SARS-CoV-2 has concentrated on the average number of new infections caused by each patient. Without social distancing, this reproduction number (R) is about three. But in real life, some people infect many others and others don’t spread the disease at all. In fact, the latter is the norm, Lloyd-Smith says: ‘The consistent pattern is that the most common number is zero. Most people do not transmit.’

That’s why in addition to R, scientists use a value called the dispersion factor (k), which describes how much a disease clusters. The lower k is, the more transmission comes from a small number of people. In a seminal 2005 Nature paper, Lloyd-Smith and co-authors estimated that SARS—in which superspreading played a major role—had a k of 0.16. The estimated k for MERS, which emerged in 2012, is about 0.25. In the flu pandemic of 1918, in contrast, the value was about one, indicating that clusters played less of a role.

Estimates of k for SARS-CoV-2 vary. In January, Julien Riou and Christian Althaus at the University of Bern simulated the epidemic in China for different combinations of R and k and compared the outcomes with what had actually taken place. They concluded that k for COVID-19 is somewhat higher than for SARS and MERS. That seems about right, says Gabriel Leung, a modeler at the University of Hong Kong. ‘I don’t think this is quite like SARS or MERS, where we observed very large superspreading clusters,’ Leung says. ‘But we are certainly seeing a lot of concentrated clusters where a small proportion of people are responsible for a large proportion of infections.’ But in a recent preprint, Adam Kucharski of LSHTM estimated that k for COVID-19 is as low as 0.1. ‘Probably about 10% of cases lead to 80% of the spread,’ Kucharski says.

That could explain some puzzling aspects of this pandemic, including why the virus did not take off around the world sooner after it emerged in China, and why some very early cases elsewhere—such as one in France in late December 2019, reported on 3 May—apparently failed to ignite a wider outbreak. If k is really 0.1, then most chains of infection die out by themselves and SARS-CoV-2 needs to be introduced undetected into a new country at least four times to have an even chance of establishing itself, Kucharski says. If the Chinese epidemic was a big fire that sent sparks flying around the world, most of the sparks simply fizzled out.
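The role of k can be made concrete with a toy branching-process simulation: each case infects a negative-binomially distributed number of others with mean R and dispersion k, and with k as low as 0.1 most chains simply die out. This is an illustrative sketch under those assumptions, not any of the cited models; R = 3 and k = 0.1 are the figures quoted above.

```python
import numpy as np

def chain_takes_off(R0=3.0, k=0.1, threshold=10_000, rng=None):
    """Run one chain of transmission. Each active case infects a
    NegBin(mean=R0, dispersion=k) number of new cases; return True if
    the chain reaches `threshold` cumulative cases before dying out."""
    rng = rng if rng is not None else np.random.default_rng()
    p = k / (k + R0)  # NumPy's parameterisation: mean = k * (1 - p) / p = R0
    active, total = 1, 1
    while active and total < threshold:
        new = int(rng.negative_binomial(k, p, size=active).sum())
        active, total = new, total + new
    return total >= threshold

rng = np.random.default_rng(0)
trials = 1_000
died_out = sum(not chain_takes_off(rng=rng) for _ in range(trials))
print(f"{died_out / trials:.0%} of introductions fizzled out on their own")
```

With these parameters a large majority of chains (around the mid-80s percent) go extinct on their own, which squares with Kucharski’s point that roughly four undetected introductions are needed for an even chance of establishment (0.84⁴ ≈ 0.5).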

Read the full article in Science.


.

Pandemic narratives and the historian
Alex Langstaff, LA Review of Books, 18 May 2020

ALEX LANGSTAFF: Historian of Medicine Charles Rosenberg famously argued in the 1980s that epidemics operate as a form of ‘dramaturgy’ — in other words, as inherently constructed events that are patterned as stories or narratives. The stories taking shape around COVID-19 seem to be featuring certain key motifs that we’ve seen many times before — for instance, the virus as malevolent actor, often ascribed to a specific ethnic or social group; and metaphors of containment (‘walls’) and destruction (‘war’). What have you noticed in the story that has been told thus far about COVID-19 as it relates to other epidemics, disasters, and crises you’ve studied? What’s been left out? How has this shaped our understanding and response to COVID-19?

JULIE LIVINGSTON: From the first moments in Wuhan, we have watched this epidemic follow the predictable dramaturgy laid out by Rosenberg. But, analytically, I find that I am more interested in the prehistory and long aftermath than the discrete event. Any dramaturgy that begins in Wuhan takes the epidemic out of the larger flow of historical time. It separates this event as somehow distinct from the massive fires that engulfed Australia a few months ago, the Ebola epidemic of 2014, the ongoing Chennai water crisis, or Hurricane Katrina. And yet zoonosis can’t be understood apart from the same larger forces that have produced the unprecedented drought cycles or the warming of our oceans: decades of land speculation, agribusiness, industrial toxicity, commodification of nature. Indeed the sixth extinction, which reminds us of the intense pressures on wild species as their habitats give way to industrial terraforming, helps us understand escalating incidents of zoonosis (here I like the work of evolutionary biologist Rob Wallace). Whenever this pandemic ends, or perhaps when it ends as ‘emergency’ and becomes sedimented in poor communities, then, insofar as life returns to ‘normal,’ it will in fact be the normal of climate-change-induced emergency. The pandemic could inspire new forms of consumption and international cooperation, and a new impetus toward redistribution and recognition of our interdependence. Or not.

DEBORAH COEN: I should say, first, that unlike the others here, I am not a historian of medicine and have never researched historical epidemics. I approach these questions from the perspective of a historian of modern science with an abiding interest in how people cope with uncertainty. In my research into the histories of climate science and seismology, I have been particularly interested in the implications of turning a natural disaster into a field site for scientific research.

One prevalent narrative in media reporting on the outbreak is that ‘fear’ of the virus is making people behave irrationally. Headlines ask, ‘Why Are We So Afraid?’ and offer guidance on ‘Managing Stress, Fear, and Anxiety.’ Newspapers seize on stories like that of the man who shot himself and his wife because he suspected (wrongly, it turned out) that they had caught the virus. Whether the story revolves around domestic violence or excessive toilet-paper buying, the message is the same: our emotional reactions to the crisis are suspect. As a historian of science, I am interested in the work that these narratives do. For one thing, they introduce a new type of expert into the crisis: the social scientist of disaster. The disaster scientist is an expert not on epidemics or earthquakes or terrorism per se but on disaster as a generalized phenomenon. The cause of the disaster is immaterial; instead, it’s the response that’s the object of study. These experts will likely tell you that they are a wholly new breed. In fact, the CDC helped launch a ‘disaster science’ research program in 2014, and new journals and degree-granting programs in ‘disaster science’ have sprouted up just in the last couple of years.

Read the full article in the LA Review of Books.


.

Stereotype threat
Angela Saini, The Lancet, 23 May 2020

Such speculation runs the risk of forgetting that the demographic categories we recognise socially do not in fact have very much biological meaning and betrays a wider problem in medicine when it comes to race. It has become routine in medical research and clinical practice to categorise people by race and ethnicity. While this is no doubt important in identifying demographic groups who might be disadvantaged by unequal treatment and to spot any environmental or social patterns affecting disease prevalence, these categories are also sometimes used to guide research, diagnosis, and treatment in ways that are not necessarily useful. At worst, they may be reinforcing damaging myths about biological differences between groups.

In making the case for the possibility of innate biological health differences between groups during the COVID-19 crisis, at least one researcher has pointed to the already-recognised increased risk of hypertension among black people of Afro-Caribbean descent in the UK and the USA. Hypertension is an example of a health condition that has been unambiguously racialised. It is so widely accepted as such that the UK National Institute for Health and Care Excellence guidelines recommend that black patients younger than 55 years with hypertension be given calcium-channel blockers instead of angiotensin-converting enzyme (ACE) inhibitors, which are given to non-black patients under 55 years.

What justifies this distinction in treatment on the basis of race? When epidemiologist Jay Kaufman, at McGill University in Canada, and cardiologist and global expert on hypertension Richard Cooper, at Loyola University Chicago in the USA, analysed studies that claimed to see racial differences in responses to ACE inhibitors, they did not find evidence that black or white patients were significantly advantaged by different prescriptions. Their conclusion about the benefit of assigning ACE inhibitors according to race was that from ‘the point of view of any individual patient, this is not meaningfully better than being assigned by the flip of a coin’. Kaufman and Cooper’s research affirmed what has long been known by population geneticists. Humans are a highly homogeneous species, even more so than our closest evolutionary cousin, the chimpanzee. By far the greatest source of human genetic variation is not group differences, but individual differences. This is perhaps why, for all the effort that has been poured into research to prove the long-held hypothesis that racial differences seen in hypertension have a genetic basis, scientists have not found anything consistent in our DNA to support it.

More pertinently, when we talk about race we are talking about groups that are socially defined. In the USA, for instance, someone may have just one grandparent of African ancestry but be categorised as black based on appearance. It makes little sense to conduct research around the assumption that a socially defined group could exhibit a genetic difference from another socially defined group when the groups are not biologically defined to begin with. To do so defies logic. It similarly defies logic to assume that all non-white people in the UK, with their diverse geographical ancestries, are so genetically different from white people that they will as a group be more innately affected by COVID-19.

Read the full article in The Lancet.


.

We should be very wary of the R value
Tom Chivers, Unherd, 12 May 2020

There’s an interesting statistical anomaly called Simpson’s paradox: a trend can run one way in each of several individual datasets, yet when you combine those datasets, the trend appears to run the other way.

That sounds quite dry, so let me use a famous example. In the autumn of 1973, 8,442 men and 4,321 women applied to graduate school at UC Berkeley. Of those, 44% of the men were admitted, compared with just 35% of the women.

But this is not the clear-cut example of sex bias that it seems. In most of the departments of the university, female applicants were more likely than male applicants to be admitted. In the most popular, 82% of women were admitted compared to just 62% of men; in the second most popular, 68% of women compared to 65% of men. Overall, there was a ‘small but statistically significant bias in favour of women’.

What’s going on? Well: men and women were applying for different departments. For instance, of the 933 applicants for the most popular department — let’s call it A — just 108 were women, but of the 714 applicants for the sixth most popular department (call it B), 341 were women.

Let’s take a look at just those two departments. Women were more likely to be admitted to both. But, crucially, the two departments had hugely different rates of acceptance: in Department A, 82% of female applicants and 62% of male ones were accepted; in Department B, 7% and 6%.

So of the 108 women applying to Department A, 89 of them were admitted (82%), while of the 825 men applying, 511 got in (62%). Meanwhile, of the 341 women who applied to Department B, just 24 (7%) were admitted; for men, it was 22 out of 373 (6%).

You see what’s going on here? In both departments, women were more likely to be accepted. But added together, it’s a different story. Of the 449 women who applied to the two departments, just 113 were accepted: 25%. Whereas of the 1,198 men who applied to the two, 533 got in: 44%.

So, to repeat: even though any individual woman applying to either department had a higher chance of being admitted, on average fewer women were admitted because they tended to apply to more competitive departments.
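The reversal is easy to verify directly from the figures quoted in the piece. A quick sketch in Python, using those per-department numbers:

```python
# Admission figures for the two Berkeley departments discussed above.
# Each entry is (admitted, applied).
data = {
    "A": {"women": (89, 108), "men": (511, 825)},
    "B": {"women": (24, 341), "men": (22, 373)},
}

def rate(admitted, applied):
    return admitted / applied

# Within each department, women are admitted at a higher rate than men.
for dept, groups in data.items():
    w = rate(*groups["women"])
    m = rate(*groups["men"])
    print(f"Dept {dept}: women {w:.0%}, men {m:.0%}")
# Dept A: women 82%, men 62%
# Dept B: women 7%, men 6%

# Pool the two departments and the trend reverses, because most women
# applied to the department with the far lower acceptance rate.
women = tuple(sum(x) for x in zip(*(d["women"] for d in data.values())))
men = tuple(sum(x) for x in zip(*(d["men"] for d in data.values())))
print(f"Combined: women {rate(*women):.0%}, men {rate(*men):.0%}")
# Combined: women 25%, men 44%
```

The pooled numbers are dominated by Department B for women and Department A for men, which is the whole trick of the paradox.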

Read the full article in Unherd.


.

Research provides no basis for pandemic travel bans
David J Bier, Cato at Liberty, 15 April 2020

The following post reviews a dozen pre-COVID-19 studies that analyze the effects of international travel restrictions—or international and domestic travel restrictions—on influenza pandemics globally or in the United States. Given the similarities to COVID-19, the influenza research would have been the most relevant research available to the government in January 2020.

A broader view of the pre-COVID-19 evidence supports the World Health Organization’s (WHO) position that travel restrictions are only effective before a pandemic has spread and only serve to delay the spread by a few days or weeks. Multiple meta-analyses and reviews by the WHO, U.S. Homeland Security Council in 2006, and United Kingdom (UK) Department of Health in 2018 also back this conclusion.

The consensus of researchers is that stopping all travel is impossible, unenforceable, and politically unrealistic—an assumption that held in the COVID-19 case—and that if it were even possible to selectively stop it (e.g. excluding all nationals of an affected country), doing so will only delay transmission until the disease spreads beyond the initial selection. With any travel, the risk of infiltration grows proportionately to the spread abroad such that even a ‘tenfold reduction in numbers of visitors delays arrival of infection for approximately as long as it takes global prevalence to increase tenfold to compensate.’
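That quoted scaling can be sanity-checked with a one-line model: if prevalence abroad grows exponentially at rate r, the rate of importations is proportional to travellers times prevalence, so cutting travellers n-fold is offset as soon as prevalence grows n-fold, a delay of ln(n)/r. A back-of-envelope sketch (the five-day doubling time is an illustrative assumption, not a figure from the studies):

```python
import math

def import_delay_days(reduction_factor, doubling_time_days):
    """Delay to the first imported case gained by cutting traveller
    numbers by `reduction_factor`, assuming prevalence abroad grows
    exponentially. An n-fold cut is cancelled once prevalence itself
    grows n-fold, so the delay is ln(n) / r."""
    r = math.log(2) / doubling_time_days  # exponential growth rate
    return math.log(reduction_factor) / r

# A 90% travel cut (a tenfold reduction) with a 5-day doubling time:
print(f"{import_delay_days(10, 5):.1f} days")  # 16.6 days
```

Even a drastic tenfold cut buys only a couple of weeks, which is the order of magnitude the studies report.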

These dozen studies are unanimous that anything less than an immediate total travel restriction has only a ‘modest,’ ‘little,’ ‘low,’ ‘negligible,’ or no effect at all on the spread of a pandemic. If a ban delays the introduction of the virus into the high flu season, it can actually worsen a pandemic. Even the most extreme restrictions on travel delay the introduction or progression of flu pandemics by at most a few weeks. For context, the U.S. Homeland Security Council in 2006 expected that pandemic influenza would reach the United States ‘within 1 to 2 months,’ which happened in the case of COVID-19.

The research also indicates that there is no benefit to international travel restrictions once an outbreak has already become an epidemic inside the destination country.

Read the full article in Cato at Liberty.


.

Socialize Big Pharma today. Save your life tomorrow
Leigh Phillips, Jacobin, 9 April 2020

AIs have been applied to other aspects of antibiotic research before — what the researchers term ‘in silico’ screening, as opposed to in vivo (studies conducted in living organisms) or in vitro (those conducted ‘in the glass’ outside organisms via test tubes, petri dishes, flasks, etc). But previous models have still needed some bootstrapping through human assumptions; none had been sufficiently accurate to identify brand-new antibiotics without such assistance. James Collins, the bioengineer who led the team, reckons halicin is one of the most powerful antibiotics ever found.

In a paper in the journal Cell detailing their work, the researchers describe how they were driven to try out this method by ‘the decreasing development of new antibiotics in the private sector that has resulted from a lack of economic incentives.’ If urgent action is not taken to both discover and develop new antibiotics, public health officials project that deaths from antibiotic resistance will hit 10 million a year by 2050.

This otherwise preventable annual calamity could arise all because it isn’t sufficiently profitable to research, test, and manufacture a commodity that, if it works, will be purchased for only a few weeks or months at most, until the infection is gone, and that works best the fewer people use it. Antibiotics work in the opposite direction to how the free market works.

Antibiotic discovery is also already just really, really difficult, and getting more so, with the same molecules being rediscovered over and over. The search for new versions of existing antibiotics ‘results in substantially more failures than leads,’ the authors of the halicin paper lament.

But fundamentally, the problem of antibiotic resistance comes from, as the researchers say: ‘the decreasing development of new antibiotics in the private sector that has resulted from a lack of economic incentives,’ exacerbating the difficulty.

‘We’re facing a growing crisis around antibiotic resistance, and this situation is being generated by both an increasing number of pathogens becoming resistant to existing antibiotics, and an anemic pipeline in the biotech and pharmaceutical industries for new antibiotics,’ says James Collins, one of the paper’s authors.

Major pharmaceutical firms have abandoned all aspects of antibiotic development and production, because it makes no business sense to be producing a drug that works best the fewer the number of people that use it, and that is also only used for a few weeks or months at a time. Drugs for chronic diseases that have to be taken every day for the rest of a patient’s life are significantly more profitable. So Big Pharma largely left this space some three to four decades ago, preferring the greener pastures of more profitable therapeutics.

Read the full article in Jacobin.


.

Death of William of Norwich, Holy Trinity church, Loddon, Norfolk

Anti-Semitism runs deep in Britain
Matthew Sweet, Unherd, 20 May 2020

To the British, anti-Semitism can seem like someone else’s problem. The Edict of Expulsion was 730 years ago; there were no pogroms in Peterborough, no Shoah in Shepperton; the Mosley men did not pass through Cable Street or the lobby at Westminster. But this complacent habit obscures a past from which it would be wrong to extract a flattering exceptionalist narrative.

The British nineteenth century, for instance, was marked by periodic displays of solidarity with persecuted Jews in mainland Europe. In 1882, Tsar Alexander III instituted the May Laws to curtail Jewish property ownership and freedom of movement: a few weeks later Victorian campaigners had raised £75,000 in aid. In 1858, the so-called Mortara Affair inspired national demonstrations in support of a seven-year-old Jewish boy kidnapped from his home in Bologna by a phalanx of Papal carabinieri. Another good cause — but objections to his treatment expressed as much anti-Catholic as philosemitic feeling.

What about the twentieth century, when Britain licked Hitler and kept the anti-Semitic far right out of power? In the 1930s, the British Union of Fascists failed to gain a single Parliamentary seat, but during the decade their gatherings were listed in the local press among tea dances and cricket matches, part of the pattern of provincial life. During the Second World War, the fight against Nazism coincided with a legible increase in anti-Semitic attitudes — the Mass Observation project recorded British chatter about Jews running the black market or bagging the best places in the communal air-raid shelters.

And after the war, when the population of Britain had gazed on the photographs from the death camps? In April 1946, the Attlee government was discovered to be publishing job advertisements that requested Jews not to apply. The Minister of Labour assured Parliament it would not happen again. Phil Piratin, the Communist MP for Mile End, urged him to speak to the Home Secretary about prohibiting racial discrimination by law, but received no answer.

To speak of this history, the image of the reservoir is not sufficient: these are turbulent waters, in which anti-Semitism swirls and surges beside other ideological currents. But there are ways of observing their movement. Watching those old black and white movies, for one.

Put Hitchcock’s The 39 Steps (1935) on your TV, and see Robert Donat and Madeleine Carroll untangle a conspiracy; then go back to John Buchan’s novel and register that the conspiracy is the same as the one in The Protocols of the Elders of Zion, perpetrated, according to one character, by ‘a little white-faced Jew in a bath chair with an eye like a rattlesnake.’

Read the full article in Unherd.


.

Jewish, Israeli scholars back
African intellectual smeared for Israel criticism

Mairav Zonszein, +972, 10 May 2020

In their letter to the German government on April 30, the 37 Israeli and Jewish scholars argued that Klein ‘has assumed a leading role in the “weaponization” of antisemitism against critics of the Israeli government,’ adding that he ‘has done a disservice to the urgent fight against real antisemitism, casting a shadow over the integrity of his public office.’

Several mainstream German intellectuals who study the Holocaust, among them Professors Micha Brumlik and Wolfgang Benz, each of whom have served as directors of centers that focus on the study of antisemitism in Germany, have signed on to another letter of solidarity with Mbembe (Brumlik also backed the call to replace Klein).

‘To accuse our colleague of trivializing the Shoa [Holocaust] or even equating the genocide against European Jewry with the racist regime of Apartheid South Africa calls into question a fundamental basis of science [history as an academic discipline] and is, therefore, wrong,’ the letter reads. ‘Historical comparisons, which serve to highlight differences and similarities between events, discourses and processes, are necessary and legitimate.’

The accusations levied against Mbembe center around two texts he has authored. One is the foreword he wrote for the 2015 book ‘Apartheid Israel: The Politics of an Analogy,’ a collection of articles by 18 scholars of Africa and its diaspora on the similarities and differences between apartheid-era South Africa and contemporary Israel.

‘The occupation of Palestine is the greatest moral scandal of our times, one of the most dehumanizing ordeals of the century we have just entered, and the biggest act of cowardice of the last half-century,’ wrote Mbembe, who concluded his foreword by asserting that ‘the time has come for global isolation.’

The second text is from his book ‘The Politics of Enmity,’ in which he compares elements of the Israeli settlement enterprise, as well as the destruction of the Jews of Europe, to the colonial fixtures of the apartheid regime in South Africa, which he calls ‘emblematic manifestations of this phantasy of separation.’

Read the full article in +972.


.

America’s cities could house everyone if they chose to
Binyamin Appelbaum, New York Times, 15 May 2020

The rise of homelessness is often portrayed as a collection of personal tragedies, the result of bad choices or bad luck. But the first law of real estate applies to homelessness, too: Location, location, location. The nation’s homeless population is concentrated in New York, the cities of coastal California and a few other islands of prosperity. Well-educated, well-paid professionals have flocked to those places, driving up housing prices. And crucially, those cities and their suburbs have made it virtually impossible to build enough affordable housing to keep up.

The government calculates that $600 is the most a family living at the poverty line can afford to pay in monthly rent while still having enough money for food, health care and other needs. From 1990 to 2017, the number of housing units available below that price shrank by four million.

Most hard-pressed people manage to stave off homelessness. While there are roughly 80,000 homeless people in New York on any given night, more than 800,000 New Yorkers — more than 10 times as many people — are scraping by, spending more than half their income on rent.

Those who do end up homeless are often those with additional burdens. They are disproportionately graduates of foster care or the prison system, victims of domestic abuse or discrimination, veterans, and people with mental and physical disabilities. Some end up on the street because of addictions; some develop addictions because they are on the street. Whatever problems they face, however, they are much more likely to become homeless in places without enough affordable housing. According to one analysis, a $100 increase in the average monthly rent in a large metro area is associated with a 15 percent increase in homelessness….

Reframing the debate — asking what is necessary to end homelessness — is an important first step for New York and for other places that are failing this basic test of civic responsibility.

The next step is simple but expensive. The federal government already provides housing vouchers to help some lower-income families. The families pay 30 percent of their monthly income toward rent; the government pays the rest. But instead of giving vouchers to every needy family, the government imposes an arbitrary cap on total spending. Three in four eligible families don’t get vouchers.

The program costs about $19 billion a year. Vouchers for all eligible households would cost an additional $41 billion a year, the Congressional Budget Office estimated in 2015. Where to get the money? Well, the government annually provides more than $70 billion in tax breaks to homeowners, including a deduction for mortgage interest payments and a free pass on some capital gains from home sales. Let’s end homelessness instead of subsidizing mansions.

Read the full article in the New York Times.


.

The Pulitzer problem
Rafia Zakaria, The Baffler, 8 May 2020

The first person you meet in New Yorker journalist Ben Taub’s Pulitzer-winning story ‘Guantánamo’s Darkest Secret’ is the kindly guard. Steve Wood, a member of the Oregon National Guard, was deployed to the Guantánamo Bay detention facility. Despite being told to ‘never turn your back’ on prisoners, Wood befriended one. Prisoner 760, as he was called, is Mohamedou Ould Salahi, a man kept in a trailer called ‘Echo Special’ whose identity was so secret that his name did not even appear in the log of America’s cruelest prison.

Putting a guard, even a friendly one, front and center of an article about torture is an interesting and clever choice: Americans raised on the lore of cowboys and G.I. Joes are more discomfited than most when Americans are cast as villains. Giving readers a friendly guard, Taub might have figured, would provide them with a good white American to root for—and a redemption story where, on the whole, there has been no American redemption. The suggestion that Wood was in over his head at the prison helps temper the thorny specter of American complicity; it offers befuddlement and ignorance (both of which Wood exhibits) as easy outs of the Guantánamo moral mess. If Wood is a kindly white guard who just wants to learn about Islam—he spends a lot of time at the base library reading up—then the new hero of the story is Taub himself. Great cruelties may have been inflicted at Guantánamo, New Yorker readers can tell themselves, but brave young journalists are out to expose them so that those educated well-off readers can sadly shake their heads.

Except that those cruelties had already been exposed. Taub’s article was published in 2019, slightly more than four years after Salahi himself published his best-selling Guantánamo Diary, which notably did not win a Pulitzer Prize. Large parts of ‘Guantánamo’s Darkest Secret’—awarded a Pulitzer this week in the Feature Writing category—particularly those that deal with Salahi, rehash with the customary ‘he wrote’ what had already been written. Yet while the content may be mostly the same, the purpose is different. Taub, unlike Salahi, is out to deliver absolution to his American reader: casting Steve Wood as an integral player is one part of this; leaving the still-constrained reality of Salahi’s present (he cannot leave tiny Mauritania) to the very end of the piece is another.

Indeed, while Salahi (whose name has also been styled as Slahi) may have told many truths in his own book, it is Taub who gets to tell them in the pages of The New Yorker. Credibility and journalistic heroism, as each year’s prizes show, reside in the pages of prestige publications; the New York Times and the Washington Post are mainstays, and since the prizes were first opened up to include magazines in 2015, The New Yorker is as well. No truth is really a truth, particularly a courageous truth, until it appears in their pages. The brown man, the accused terrorist, the actual torture survivor Mohamedou Ould Salahi may have written a great book. But the definitive story about ‘Guantánamo’s Darkest Secret’ is the one penned by Taub.

Read the full article in The Baffler.


.

Trafficked and abused:
Libya’s migrants caught in the business of war
Andrew England, Financial Times, 3 May 2020

Mohammed’s tale echoes thousands of others in Libya, one of the main gateways to Europe, where the trade in people has become part of a flourishing war economy.

‘Migrants have become income generating assets and the detention centres are the real money makers,’ said Liam Kelly, country director of the Danish Refugee Council in Tripoli. ‘The government doesn’t really control them, they are controlled by armed groups and the EU and other agencies are indirectly funding them.’

Since 2015, the EU has provided €408m to programmes intended to stem the flow of migrants, including support for the Libyan Coast Guard, which is tasked with intercepting boats and returning migrants to the north African state, often to detention centres.

Some, like the one Mohammed was held in, are run by traffickers and gangs. Others are ostensibly overseen by the weak UN-backed government in Tripoli. But even the ‘official’ centres, of which there are more than a dozen holding a total of 3,000 to 5,000 migrants, are affiliated to militias and the conditions are dire, aid officials say.

Often the centres are hangars or warehouses, with fighters’ barracks and ammunition stores nearby. This puts the migrants at risk of becoming military targets — on top of the risk of abuse they already face — which has intensified criticism of the EU’s support of the Libyan Coast Guard and other initiatives.

By financing programmes that result in the return of migrants to such camps, the EU is contributing to serious human rights violations, according to the Global Legal Action Network and two other groups that filed a complaint this week at a Brussels court calling for the bloc’s funding to be suspended.

Read the full article in the Financial Times.


.

Franz Boas

The circle
Jennifer Wilson, The Nation, 5 May 2020

In 1949 a Columbia anthropologist named Geoffrey Gorer published an essay in his study The People of Great Russia, in which he attempted to provide insight into why those living in the Soviet Union were not more resistant to Stalinist authoritarianism. It was not because they were tortured or threatened with the gulag, according to Gorer and the study’s coauthor, the psychoanalyst John Rickman; it was because they had been swaddled for too long as babies. Gorer had studied child-rearing practices across Western and Eastern Europe and found that Russian peasants tended to swaddle their children for longer periods than other parents did, sometimes up to nine months. Therein lay the explanation, Gorer and Rickman insisted, for why the Soviets preferred the warm cloak of authoritarianism to the freedoms of Western liberalism. The theory, which came to be known as the swaddling hypothesis, was roundly and rightfully mocked. One critic called it ‘diaperology.’ Gorer’s friend and fellow anthropologist Margaret Mead defended and even doubled down on his theory; she insisted that in swaddling them for so long, ‘Russians communicate to their infants a feeling that a strong authority is necessary.’

The swaddling hypothesis and the ire it justly provoked dealt a considerable blow to the prestige of the national character studies program just as it was reaching its zenith at Columbia, raising questions about the methodologies being employed there and even the value of culture as a heuristic. It also highlights a problem with the work of these anthropologists, which is often framed as revolutionary and egalitarian for insisting that human differences are rooted in culture rather than race. That such a worldview would be any less dangerous is belied by the reality of how this research—culture cracking, as it was known—was employed. From World War II into the early years of the Cold War, anthropologists in the program were repeatedly tapped by the US government to create national profiles for countries deemed threats to US national security. The most famous of these was Ruth Benedict’s wartime study of Japanese culture, later published as The Chrysanthemum and the Sword (1946), but the program produced countless reports for the government on China, Syria, Eastern European Jews, and other ‘cultures’ that needed decoding before they could be exploited.

At a time when the country’s foremost social scientists, figures like the eugenicist Madison Grant, were insisting that different cultures fell along a continuum of evolution, cultural anthropologists asserted that such a continuum did not exist. Instead of evolving in a linear fashion from savagery to civilization, they argued, cultures were in a constant process of borrowing and interpolation. Boas called this process ‘cultural diffusion,’ and it would come to be the bedrock of cultural anthropology, inspiring an entire generation of anthropologists to travel the world searching for examples of it. Hurston went to Florida to collect African American folklore, Deloria to the American Southwest to codify Native American languages, and Mead to American Samoa to ask teenagers about their sex lives. And while their findings have been heralded as revolutionary—within the social sciences and for the general public—they also laid the groundwork for a new form of liberal racism centered on cultural rather than physiological difference…

‘Culture’ often proved to be too slippery a term in the hands of these ‘gods of the upper air’ (a phrase borrowed from Hurston’s autobiography, Dust Tracks on a Road). As King traces their development, particularly Boas’s, it becomes clear that their ideas about culture and cultural differences were not as distinct as they imagined from the notions of racial difference they sought to overturn.

Read the full article in The Nation.


.

The defender of differences
Kwame Anthony Appiah, New York Review of Books,
28 May 2020

Boas can be viewed, then, as having been a spectacularly effective vehicle for introducing into American academic culture (and then, in significant ways, correcting and refining) a particular German tradition of progressive anthropology. ‘My whole outlook,’ he later wrote in a credo, ‘is determined by the question: how can we recognize the shackles that tradition has laid upon us?’ Yet his resolve to recognize those shackles itself arose from a tradition, one that proved, in the main, rather more liberating than constraining.

The year 1906, when Zumwalt’s biography stops, is when Charles King’s Gods of the Upper Air really gets going. King, a professor of international affairs and government at Georgetown, is a terrific writer and storyteller—and a disciplined one, too, who knows how to dip into the rabbit holes along his path without getting lost in them. His is also an unabashed work of tribute: if it’s routine to reject racism, sexism, homophobia, or ethnocentrism, he maintains, ‘we have the ideas championed by the Boas circle to thank for it.’

Because Boas is so associated with ‘cultural anthropology’ (a term that his students popularized), it’s easy to forget how much time he spent calculating cephalic indexes, determining who was long-headed and who short. He meant to defeat race science by turning its methods against its claims. In 1908 the Dillingham Commission—a group of senators and congressmen who worried that inferior arrivals from Italy and Eastern Europe were polluting the American stock—asked Boas to produce a report on the effects of ‘the immigration of different races into this country.’ Under his supervision, measurements were taken of nearly 18,000 subjects. ‘The adaptability of the immigrant seems to be very much greater than we had a right to suppose before our investigations were instituted,’ Boas’s study concluded. ‘While heretofore we had the right to assume that human types are stable, all the evidence is now in favor of a great plasticity of human types.’

However closely Boas followed Virchow in emphasizing mutability, mistrusting invidious racial claims, and recognizing that (as Boas later wrote of him) ‘it is dangerous to classify data that are imperfectly known under the point of view of general theories,’ he had not yet followed Virchow in becoming publicly outspoken in defense of his convictions. As Douglas Cole put it, ‘Boas’s political views remained personal.’

Read the full article in the New York Review of Books.


.

Anti-communist massacres killed Indonesia’s hopes
for national liberation and socialism
Vincent Bevins & Benjamin Fogel, Jacobin, 19 May 2020

BF: How exactly was the conflict in Indonesia more important than the Vietnam War?

VB: Indonesia is the fourth-largest country in the world by population. Within the ‘domino theory,’ it was by far the biggest domino — it had nearly three times as many people as Vietnam. In the early 1960s, everyone in the US foreign-policy establishment recognized it was more important than Vietnam as a foreign-policy issue, as Sukarno was a founding leader of the Third World movement. The Vietnam War dominated US domestic politics for many years, but geopolitically, it achieved exactly nothing. Indonesia 1965–66 changed everything.

BF: The event at the heart of your book is a mass extermination campaign directed against the Indonesian Communist Party (PKI), at the time the largest communist party outside of China and the Soviet Union. How was the party so successful and seen as such a threat to the United States’ interests?

VB: The PKI was the oldest communist party in Asia, founded before the Chinese Communist Party, and from the beginning, it was committed to collaboration with ‘national-bourgeois’ forces. They were two-stage revolutionaries that only wanted to transition to socialism way in the future, after the full development of capitalism. It was very moderate compared to what English speakers think of when they hear ‘communist’ today.

In China, the Comintern actually instructed Mao to collaborate with the Nationalists because Moscow wanted the Chinese to replicate the success Indonesian communists had working with Muslim groups. It didn’t work out so well for Mao, but the PKI stayed more or less on this path throughout its entire existence. After Sukarno and revolutionary forces expelled the Dutch in 1949, the PKI became one part of a new, independent multiparty democracy.

President Sukarno, the country’s independence hero and founding father, was not a communist. But he was a left-leaning anti-imperialist, governing in coalition with a lot of different forces. The Indonesian communists did not have weapons and didn’t even contemplate the possibility of armed struggle. Even American officials noted at the time that they were simply a really well-run organization — they had very popular cultural programs and peasant organizations and a huge feminist base, and they didn’t suffer from rampant corruption like everybody else. But they got more and more votes, which did not please Washington — so the United States tried to stop them in two ways, which both failed.

Read the full article in Jacobin.


.

The woman behind ‘Roe vs. Wade’ didn’t
change her mind on abortion. She was paid
Meredith Blake, Los Angeles Times, 19 May 2020

When Norma McCorvey, the anonymous plaintiff in the landmark Roe vs. Wade case, came out against abortion in 1995, it stunned the world and represented a huge symbolic victory for abortion opponents: ‘Jane Roe’ had gone to the other side. For the remainder of her life, McCorvey worked to overturn the law that bore her name.

But it was all a lie, McCorvey says in a documentary filmed in the months before her death in 2017, claiming she only did it because she was paid by antiabortion groups including Operation Rescue.

‘I was the big fish. I think it was a mutual thing. I took their money and they’d put me out in front of the cameras and tell me what to say. That’s what I’d say,’ she says in ‘AKA Jane Roe,’ which premieres Friday on FX. ‘It was all an act. I did it well too. I am a good actress.’

In what she describes as a ‘deathbed confession,’ a visibly ailing McCorvey restates her support for reproductive rights in colorful terms: ‘If a young woman wants to have an abortion, that’s no skin off my ass. That’s why they call it choice.’

Arriving in an election year as the Supreme Court is considering a high-profile abortion case with the potential to undermine Roe vs. Wade and several states across the country have imposed so-called ‘heartbeat laws’ effectively banning the procedure, ‘AKA Jane Roe’ is likely to provoke strong emotions on both sides of this perennial front in the culture wars.

Director Nick Sweeney says his goal was not necessarily to stir controversy, but to create a fully realized portrait of a flawed, fascinating woman who changed the course of American history but felt she was used as a pawn by both sides in the debate.

‘The focus of the film is Norma. That’s what I really want people to take away from the film — who is this enigmatic person at the center of this very divisive issue,’ he says. ‘With an issue like this there can be a temptation for different players to reduce ‘Jane Roe’ to an emblem or a trophy, and behind that is a real person with a real story. Norma was incredibly complex.’

Read the full article in the Los Angeles Times.


.

Whatever happened to the polymath?
Dan Hitchens, Unherd, 22 May 2020

The decline of polymathy, then, suggests a broader crisis. For Burke, it is a crisis of too much information. The seventeenth century was a ‘golden age of polymaths’, as explorers found new regions, the scientific method flourished, and the postal service and the proliferation of journals allowed scholars to trade ideas. But those same forces led to ‘information overload’.

Over the next 200 years, the intellectual world divided between the specialists who knew a lot about their little area, and popularisers who knew a little about a lot. Institutions, as well as individuals, had to go their separate ways: in the 1880s the Natural History Museum split off from the British Museum, and the Science Museum from what is now the V&A. The twentieth century saw some conscious efforts to foster ‘interdisciplinarity’, but the fragmentation of knowledge only accelerated — even before the internet came along.

This is Burke’s version of events, and it is obviously a large part of the story. But there is surely another reason for the decline of the polymath: namely, the intellectual revolution of the sixteenth and seventeenth centuries, when as John Donne put it in 1611, ‘new philosophy calls all in doubt.’

That new philosophy claimed, like today’s political leaders, that it was merely following the science: instead of theorising about the celestial spheres, just look through a telescope! But there was a sinister undercurrent, as Donne realised: the new philosophers sometimes seemed to imply that, if you did follow the science, you might well find a cold, dead universe in which our beliefs about the beauty, harmony and meaning of the world around us would be exposed as delusions.

When Dante gazed at the night sky, he saw ‘the love which moves the sun and other stars’; 350 years later, Pascal looked at the same thing and recorded that ‘the eternal silence of these infinite spaces terrifies me.’ The contemplation of the world, it appeared, might not lead us to sublime truths, but to disenchantment.

Read the full article in Unherd.


.

Jean-Léon Gérôme, The Snake Charmer, 1870

The worldly exile
Rashid Khalidi, The Nation, 5 May 2020

Said’s alienation and worldliness were at the heart of the complexity and richness of his work; they lent him a sharper awareness of and sympathy for other cultures and stirred inside him a pointed disdain for the placid provincialism and monoglot lack of reflection among many leading figures in the American academy. Although he shared the class and educational background of many of his peers, he insisted that we see beyond the parochial bounds of the ivory tower and the self-referential culture of the West. While this critical attitude was expressed most saliently in Orientalism, it characterized much of Said’s mature work, both critical and political. In one of his last offerings, ‘The Return to Philology’ (on what he called this ‘most unmodern’ branch of learning), his erudite analysis is informed by a sense of the larger stakes of the specific political moment: the war in Iraq and Secretary of State Madeleine Albright’s casual dismissal in 1996 of the thousands of Iraqi deaths in that decade as a result of US-mandated sanctions.

Said deftly interlaced philosophy and literature with political critique. Although his political writings could be blunt, even scalding, he most often wielded a sharp scalpel in his criticism and did so with elegance and élan. The best of the essays in After Said do likewise, often using literary analysis to make subtle political points. At the same time, they avoid the hagiography that is unfortunately prevalent in many of the works on Said. Both Abu-Manneh’s introduction and Robert Spencer’s ‘Political Economy and the Iraq War’ question the lack of an underpinning in political economy in Said’s writing on imperialism in general and on recent US policy in the Middle East in particular, although they do so while underscoring the lasting value of his interventions.

Similarly, Vivek Chibber’s ‘The Dual Legacy of Orientalism’ offers one of the most acute and fair-minded expositions of the flaws in what he nevertheless recognizes as a ‘great book.’ Although he notes the distance between Said’s ‘profound commitment to humanism, universal rights, secularism, and liberalism’ and the disavowal or at least skepticism of postcolonial theory toward these values, Chibber writes that Orientalism ‘prefigured, and hence encouraged, some of the central dogmas of postcolonial studies.’ While Said’s analysis brought a sophisticated critique of imperialism to the mainstream, Chibber observes, it fed an approach that undermined that very critique by excising its economic dimensions—a point that serves as one of the key subtexts in this collection. Although Said is one of this era’s fiercest critics of imperialism, missing from his analysis is a grounding in political economy, a failing that robbed his critique of some of its potential force and gave license to his postcolonial followers to move away from Marxism.

Read the full article in The Nation.


.

Computers don’t give a damn
Tim Crane, TLS, 15 May 2020

Smith is surely right that AI’s recent successes give us little or no reason to believe in the real possibility of genuine thinking machines. His distinction between reckoning and judgment is an important attempt to identify what it is that is missing in AI models. In many ways (despite his protest to the contrary) it echoes the criticisms of Dreyfus and others, that AI will not succeed in creating genuine thinking unless it can in some way capture ‘common sense’. And just as common sense (part of Smith’s ‘judgment’) cannot be captured in terms of the ‘rules and representations’ of GOFAI, nor can it be captured by massively parallel computing drawing patterns from data.

To make this point about judgment, Smith does not actually need the more ambitious ontological claims, that the world does not have natural divisions or boundaries, that all classification is simply a result of human interest, and so on. Maybe these claims are true, maybe not – for many centuries philosophy has wrestled with them, and they are worth debating. But we should not need to debate them in order to identify the fundamental implausibility of the idea that AGI is on the horizon.

This implausibility derives from something intrinsic to the success of AI itself. For despite the sophistication of machine learning, the fact remains that like chess, Go is still a game. It has rules and a clear outcome which is the target for players. Deep learning machines are still being used to achieve a well-defined goal – winning the game – the meaning of which can be articulated in advance of turning on the machine. The same is true of speech and face recognition software. There is a clear goal or target – recognizing the words and faces – and successes and failures in meeting this goal are the input which helps train the machine. (As Smith says, ‘recognition’ here means: correctly mapping an image onto a label: nothing more than that.)

But what might be the goal of ‘general intelligence’? How can we characterize in abstract terms the problems that general intelligence is trying to solve? I think it’s fair to say that no one – in AI, or philosophy, or psychology – has any idea how to answer this question. Arguably, this is not because it is an exceptionally difficult empirical question, but rather that it is not obviously a sensible one. I suppose someone might say, in the spirit of Herbert Simon (whose famous AI programme was called the ‘General Problem Solver’), that general intelligence is the general ability to solve cognitive problems. This might seem fine until we ask ourselves how we should identify, in general terms, the cognitive problems that we use our intelligence to solve. How can we say, in general terms, what these problems are?

Read the full article in TLS.


.

The forgotten Holocaust
Kushanava Choudhury, The Caravan, 1 May 2020

The British instituted a ‘denial policy’ across coastal districts, including Midnapore, sending agents to villages to seize surplus rice in order to deprive the Japanese of food if they landed. Bengal’s chief minister, Fazlul Huq of the Krishak Praja Party, was critical of this policy and warned of an imminent rice famine. Huq had campaigned on a slogan of dal bhat—rice and dal—for everyone. By this time, the Congress leadership was being imprisoned for its refusal to support the British war effort and for launching the Quit India movement. Mukerjee writes that the wartime cabinet in London, and the colonial government in India, appeared to care for little besides the war. The British prime minister during the war, Winston Churchill, has been famously quoted blaming Indians themselves for the famine, for ‘breeding like rabbits.’

Hundreds of thousands of Allied soldiers were arriving in Calcutta and had to be fed and clothed. Industries in Bengal, including factories, railways and civil services, were repurposed to meet wartime demands. The colonial government enacted policies that diverted food to soldiers and Indian support staff in the towns and cities, even as the death toll in rural Bengal kept rising. Indeed, as Mukerjee shows, the British supplied food only to those considered essential to the war, while being indifferent to the fate of the rest of the Bengali population, ninety percent of which lived in the countryside. At the height of the famine, she notes, not only did Churchill refuse to import grain in large amounts, but rice was also being exported out of India.

Officially, there was no famine in Bengal in the 1940s. The government never declared it as such. That would have required putting in place the Famine Code, a set of guidelines on how to provide relief that had been established after the famines of the late nineteenth century. But throughout its tenure, the British Raj pretended that the mass annihilation of people was not taking place right under its nose.

In the end, while Japanese planes did bomb Calcutta a few times, there was no full-blown invasion. By 1943, Japan was suffering losses on its home turf, and the war on Indian soil never took place. There were no battles here, the history books say, no heroes, no martyrs. And yet, it left 3 million dead, like an attack by a neutron bomb.

The names of these dead are not etched into any monument, nor are they written down in any archive. The Bengal famine simply does not inform our sense of collective history or our national imagination in quite the same way as the martyrdom of Bhagat Singh or the Jallianwala Bagh Massacre. But it is one of the largest crimes of the twentieth century. It was the mass destruction of a population, with millions of witnesses. It should be impossible to write of the Second World War, or of India’s Independence, without stories of the famine.

Read the full article in The Caravan.


.

Why nostalgia is our new normal
David Berry, The Walrus, 7 May 2020

Appropriately for the elusiveness of the concept, the word nostalgia did not originally mean what we now consider it to – also appropriately, it was coined with a longing for a time when there was no word for what it described. In 1688, a Swiss medical student named Johannes Hofer gave the name nostalgia to a malady he had noticed in young Swiss people who had been sent abroad – chiefly mercenaries, one of Switzerland’s prime exports at the time, though also household servants and others who found themselves in ‘foreign regions.’ As was the style at the time in the nascent field of ‘medicine more complicated than bleeding humours,’ Hofer used a portmanteau from an indistinctly highfalutin form of Ancient Greek: nostos roughly means ‘home’ – although it more often means ‘homecoming,’ which incidentally was also the name for an entire subcategory of Greek literature, most notably the Odyssey – while algos means, more simply, ‘pain,’ derived from Algea, the personifications of sorrow and grief, and a common classification at the time, attached to a variety of maladies that have since gotten either more precise or more vernacular names. (If you ever want to stoke excessive sympathy from, say, your boss, tell them you have cephalgia or myalgia – a headache or sore muscles, respectively.)

So nostalgia literally means ‘pain associated with home’ – or, in slightly more familiar terms, ‘homesickness.’ This is not a coincidence, but more relevantly, it’s also not a case of fancy medical-speak being dumbed down for popular consumption. At least not generally: the English word homesickness is a more or less direct translation of nostalgia. But the original term is French, maladie du pays, and not only does it specifically refer to the tendency of the Swiss to powerfully miss their home country, it precedes Hofer by at least thirty years. Hofer’s coinage brought a specifically medical dimension, insomuch as medicine as we know it existed in his time: Hofer’s observations were quite detailed but still entirely anecdotal and subject to a lot of conjecture. What he lacked in scientific rigour he made up for with linguistics, attempting to legitimize medicine’s dominion over the concept with multiple coinages, including nostomania (obsession with home, which, as you’ll see in a second, is probably more accurate to the ‘disease’ as he conceived it), philopatridomania (obsessive love of one’s homeland), and years later, in the second edition of his thesis, pothopatridalgia (pain from the longing for the home of one’s fathers, which certainly has the advantage of precision if not rhythm).

Though the difference between mere homesickness and medical nostalgia was mostly a case of ancient language, Hofer nevertheless describes a serious disease, one that could progress from simple physical ailments, like ringing in the ears or indigestion, to near-catatonia and even death. Its root cause, according to Hofer, was ‘the quite continuous vibration of animal spirits through those fibres of the middle brain in which impressed traces of ideas of the fatherland still cling.’ As Helmut Illbruck explains in his book Nostalgia: Origins and Ends of an Unenlightened Disease, essentially what that means is that the nostalgic suffers from a powerful obsession with their home that eventually makes them entirely insensate to any other experience or stimulation. Illbruck points out that the action Hofer describes does loosely capture how the brain seems to store, process, and recall memories, which may explain some of why his concept caught on, at least in the medical circles in which it persisted for the next few hundred years.

Read the full article in The Walrus.


.

Freedom now
Alex Gourevitch & Corey Robin, Polity, 15 May 2020

The left is having a moment in the United States. Policies that went unmentioned or were declared out-of-bounds during the presidential campaign of 2016—a federal jobs guarantee, single-payer health care, free college, massive tax hikes on the rich, and the Green New Deal—are commonplaces of the Democrats’ 2020 campaigns. According to recent Gallup polls, socialism is now more popular than capitalism among Democrats and young people, and support for ‘some form of socialism’ among all Americans is at 43% (compared to 25% in 1942).1 Across the country, self-declared socialists are being elected to office.2 There is a new militancy in the labor movement, with teachers striking even in the reddest of states—producing the largest number of striking workers since 1986—and flight attendants helping to bring Donald Trump’s government shutdown of 2018–19 to an end. Often led by women, in occupations traditionally associated with women, these strikes and job actions suggest a potential convergence between issues of labor and gender that is the hallmark of socialist feminism.

Yet there is a mismatch between the ambition of the moment and the ambit of its arguments. The left is hesitant about broader claims that might justify these policies and movements. There is talk of socialism, and debate about what it entails, but less discussion of socialism as an ideology and why it is desirable. The same goes for the Democratic candidates, from whom one has not heard anything on the order of Franklin Roosevelt’s Commonwealth Club speech or Reagan’s story of the free market. Instead, there is the nostalgic turn of ‘Green New Deal’ and ‘Medicare For All,’ talismans of the past meant to inaugurate a future. But aura cannot do the work of argument. If these policies are to have a chance of breaking through, they will need a grounding principle, which does what an ideology is supposed to do: name the enemy, organize the policies, orient the actions, state the destination, and provide the fuel for the movement to get us there.

We argue here that, for modern Democrats and those further to the left, that principle is freedom. Freedom is a global principle that reaches back to the birth of the left during the French Revolution and runs through various emancipation struggles since. It also has a special resonance in the United States. According to historian Eric Foner, freedom is ‘the central term in our political vocabulary.’ While voices of the left periodically worry that freedom has been lost irretrievably to the right, there is an ongoing contest in this country between elite claimants invoking freedom as a possession already had and subaltern counter-claimants envisioning freedom as a struggle to be won.

Read the full article in Polity.


.

African Jazz Pioneers at The Rainbow, Durban

Jazz for the struggle and the struggle for jazz
Rafs Mayet, New Frame, 15 May 2020

The Rainbow is a loud, rowdy and often raucous place with the audience and the band feeding off each other’s energy. It is definitely not a ‘sit down and listen carefully’ concert venue. The audience roars if a solo wins its approval. It was always a brave artist who attempted to play a ballad there – unless you were Sipho Gumede, starting the first notes of Mantombi as the shouts of recognition drowned out the first few bars of the song.

The gigs are held on Sundays. As the afternoon goes on, people get up and dance, forming a train that snakes between the seating booths, down the aisles and past the stage, waving and applauding the musicians as they pass by. iStimelen’ sase Rainbow – the Rainbow train – it’s called, which allows total immersion in the moment. As one punter memorably put it: ‘Fuck tomorrow, fuck the bosses, fuck the system, we’ll worry about that later. Now we’re just gonna let go and get down and forget about all that shit.’

The Rainbow was intimately linked with the politics of the 1980s, when South Africa was under states of emergency. Sometimes Pretorius would cheekily offer any Special Branch members in the audience free batteries for their recording devices. In 1988, when Nelson Mandela’s 70th birthday party concert at what was then the University of Durban-Westville was banned, it moved to The Rainbow. Despite efforts by the authorities to stop it – cops harangued Pretorius over his liquor licence, which was valid, while armed police and Security Branch members packed into the venue – the show went on.

From the stage, Pretorius pointed out to them, ‘This is a trumpet, not a rocket launcher, and this is a guitar, not a gun.’ After a while, they left and the music stopped. Nkosi Sikelel’ iAfrika, the original version by Enoch Sontonga, was sung and then the music continued. When workers at SA Breweries went on strike in 1988, Pretorius stopped sales of their products in solidarity, even though it meant a huge loss for the club. The National Union of Metalworkers of South Africa had offices up the road and many of the union’s members frequented the place. Next door was a hall that the South African Council for Higher Education used for meetings and events like May Day commemorations where The Rainbow’s sound system was used mahala (for free).

During the fighting that took place in the 1980s between the UDF and Inkatha, and the ANC and Inkatha’s later iteration, the Inkatha Freedom Party, in the early 1990s, many refugees from the violence in Hammarsdale and Mpumalanga were housed and fed at the Rainbow until alternative arrangements could be made. In the heady days after Mandela’s release, the ANC’s Highway Branch was launched there.

Read the full article in New Frame.


.

The erosion of deep literacy
Adam Garfinkle, National Affairs, Spring 2020

For one, the new digital technology is democratizing written language and variously expanding the range of people who use and learn from it. It may also be diffusing culture; music and film of all kinds are cheaply and easily available to almost everyone. In some respects, new digital technologies are decreasing social isolation, even if in other respects they may be increasing it. Taken together, these technologies may also be creating novel neural pathways, especially in developing young brains, that promise greater if different kinds of cognitive capacities, albeit capacities we cannot predict or even imagine with confidence.

But it is also clear that something else has been lost. Nicholas Carr’s 2010 book, The Shallows, begins with the author’s irritation at his own truncated attention span for reading. Something neurophysiological is happening to us, he argued, and we don’t know what it is. That must be the case, because if there is any law of neurophysiology, it is that the brain wires itself continuously in accordance with its every experience. A decade later, Carr’s discomfort is shared by growing legions of frustrated, formerly serious readers.

In her 2018 book, Reader, Come Home, Maryanne Wolf uses cognitive neuroscience and developmental psycholinguistics to study the reading brain and literacy development, and in doing so, helps identify what is being lost. According to Wolf, we are losing what she calls ‘deep literacy’ or ‘deep reading.’ This does not include decoding written symbols, writing one’s name, or making lists. Deep literacy is what happens when a reader engages with an extended piece of writing in such a way as to anticipate an author’s direction and meaning, and engages what one already knows in a dialectical process with the text. The result, with any luck, is a fusion of writer and reader, with the potential to bear original insight.

Deep literacy has wondrous effects, nurturing our capacity for abstract thought, enabling us to pose and answer difficult questions, empowering our creativity and imagination, and refining our capacity for empathy. It is also generative of successive new insight, as the brain’s circuitry for reading recursively builds itself forward. It is and does all these things in part because it touches off a ‘revolution in the brain,’ meaning that it has distinctive and describable neurophysiological consequences. Understanding deep literacy as a revolution in the brain has potential payoffs for understanding aspects of history and contemporary politics alike.

Deep reading has in large part informed our development as humans, in ways both physiological and cultural. And it is what ultimately allowed Americans to become ‘We the People,’ capable of self-government. If we are losing the capacity for deep reading, we must also be prepared to lose other, perhaps even more precious parts of what deep reading has helped to build.

Read the full article in National Affairs.


.

The restless cosmopolitan
George Scialabba, The Inference, May 2020

Martha Nussbaum is herself an influential theorist in the cosmopolitan tradition, and she concludes the book with a review of contemporary problems about which the tradition may have something helpful to say: pluralism, international law, foreign aid, immigration, and asylum. She sees only a moral function for international law, promulgating norms that nations may adopt or not. That may indeed be the best one can do today, though it is perhaps too much to say, as she does, that early proponents of international law were starry-eyed about its potential efficacy. In fact, the UN Charter, binding on its signatories, was well designed for keeping the peace and would have saved countless lives if the superpowers had only lived up to their obligations.

On foreign aid, Nussbaum shares the skepticism of economists William Easterly and Angus Deaton, who found that autocracy, corruption, paternalism, and ignorance of local conditions have made most foreign aid almost totally ineffective. And where it is effective, the result, allegedly, is dependency and lack of political initiative. This is doubtless often true, but I wish Nussbaum had also mentioned the many strong defenses of aid by Jeffrey Sachs and others.

The final problem, perhaps the knottiest and the most urgent, is immigration and asylum. The cosmopolitan tradition is particularly well adapted to address this problem, since its ‘basic insight is that respect for humanity requires us to furnish the basic wherewithal of human life, somehow, to those in desperate need.’ If this can be done through humanitarian aid, with a minimum of disruption to both countries, it should be. But those who must leave, because of want or persecution, should be welcomed.

This is where things get knotty. How many of them should be welcomed? Not too many: it is reasonable to limit numbers ‘in accordance with skills and job opportunities’ for the sake of economic stability, and to require that candidates for permanent residence understand and accept our political culture, that is, our constitution and laws. But not too few, either: we cannot try to ‘preserve national homogeneity’ or ‘defend dominant national ethnic or religious traditions from the pluralism and challenge that immigration typically brings.’ This is a little too general. Nussbaum’s argument might have been more persuasive if she had engaged with the defenders of homogeneity and national culture, who are not all xenophobes, or even acknowledged the existence of a controversy over whether immigrants lower wages, as many businessmen and aggrieved populists seem to believe.

Read the full article in The Inference.


.

On ancestry
Justin Smith, 6 May 2020

Two thoughts have long come unbidden to my mind whenever I hear people talking about doing their family trees, or, more recently, getting their DNA done. The first is of Bruce Willis’s character in Pulp Fiction, the boxer Butch Coolidge in the back of the taxi, who, when asked by his South American driver what his name means, replies, ‘I’m an American, baby, our names don’t mean shit.’ The other is of Seneca, who wrote in his Moral Letters to Lucilius: ‘If there is any good in philosophy, it is this, — that it never looks into pedigrees. All men, if traced back to their original source, spring from the gods.’

To be an American is to bear a name with no historical resonance, or at least none worth looking into, to orient oneself in the world without regard for lineage. To be a philosopher is to know consciously what the American feels by instinct: that the reason lineages are not worth looking into is the same for all of us, namely, that we all derive from the same divine source.

But I am, or like to think of myself as, an American philosopher, and so of course I always scoffed when my late father – who did not share my sensibility, did not see being American in the same way – used to come home with all sorts of vital-statistics records from Utah and Arkansas, with genealogical scrolls stretching back to Olde England. I always got a vague whiff of prejudice moreover from those family-history buffs more extreme than my father ever was, displaying with pride their ancestors’ tartan patterns above the fireplace, or hanging up a coat-of-arms and explaining with pride why the stag is rampant as opposed to statant, say, or offering an embroidered pillow with some implausible sentiment about Irish or Polish or Swedish superiority. No, I always thought, to hell with all that. I come from nowhere. I come from no one but the gods.

And yet, I am also among other things a scholar of the history of the concept of race, and I know full well that this is the same thing as the history of genealogy. To put it very succinctly, ‘race’ in its Latinate variants first appeared in the sixteenth century in the context of animal husbandry: paying attention to which horse, pigeon, or dog should be coupled with which other of its own kind in order to artificially create a better ‘breed’ (that is to say, in Italian, razza; in Spanish, raza; in French, race) of creature. Eventually, as Marx would later caustically point out, it came to be understood that ‘the key to aristocracy is zoology,’ and by the mid-seventeenth century it was common to speak of the ‘race’ of the Plantagenets, the ‘race’ of the Carolingians, and so on.

Read the full article on Justin Smith’s blog.


.

Bundesliga’s quiet return hints at a silent threat to home advantage
Sean Ingle, Guardian, 18 May 2020

In the 224 Bundesliga games this season before the lockdown, referees awarded 151 more fouls against away teams and handed out 62 more yellow cards. On Saturday, however, that discrepancy vanished. Indeed, slightly more fouls and yellow cards were awarded against the home teams on average.

We should expect this. As Ignacio Palacios-Huerta, who sat on the board of Athletic Bilbao from 2011 to 2018 and is also a professor of management, economics and strategy at the London School of Economics, points out, referees are unconsciously influenced by crowds.

He and his fellow academics were the first to study how officials were affected by social pressure by looking at stoppage time in La Liga matches. Strikingly, they found that when a home team was ahead by a single goal, the referee allowed almost 30% less additional time than average. However, if the home team was behind by one goal the referee allowed 35% more time than average. What’s more, when crowds were larger, the referees became more biased.

There was something else too. When the visiting team scored after the end of the regulation 90 minutes, stoppage time went on 15% longer than when the home team scored. In other words, referees were quicker to end the game when the home team scored, thus giving the visitors less time to respond, than when the visitors scored.

In Spain, two teams particularly benefit from refereeing bias – Barcelona and Real Madrid. Though as Palacios-Huerta dryly notes, ‘most fans would not need an econometric regression to confirm this’.

A subsequent study looked at what happened in Serie A in 2007 after several Italian clubs were forced to play behind closed doors following the death of a policeman in the Derby di Sicilia between Catania and Palermo. Again the results were significant. The authors found that the typical home advantage in terms of fouls, yellow cards and red cards awarded against the away side all declined dramatically – and that the same referee behaved very differently when officiating the same teams in the same stadium if there was no crowd.

Notably, however, the researchers also found there was ‘no indication that the players are differently affected in games with and without spectators’.

Another fascinating piece of research examined how 40 qualified referees judged 47 incidents from a match between Liverpool and Leicester. Half watched with crowd noise, while another group watched the action in silence. Those viewing the footage with noise awarded significantly fewer fouls (15.5%) against the home team compared with those watching in silence.

Read the full article in the Guardian.

.

The images are, from top down: ‘The Triumph of Death’ by Pieter Bruegel the Elder; ‘The Death of William of Norwich’, a painting in Holy Trinity church in Loddon, Norfolk, depicting the 1144 death that became the first known case of blood libel; Franz Boas, posing for a series of photographs in a US National Museum exhibit, photo via John Curan’s Flickr stream; Jean-Léon Gérôme’s ‘The Snake Charmer’ (1870), the painting on the cover of Edward Said’s book Orientalism; The African Jazz Pioneers at The Rainbow, from New Frame.