Pandaemonium

PLUCKED FROM THE WEB #78

The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

How they built Grenfell
Anoosh Chakelian, New Statesman, 11 December 2020

In December 2007, an Irish building materials company called Kingspan tested the fire safety of one of its insulation materials, Kooltherm K15. It was tested on a rig mocked up like a building, to mimic how the product might be used in real life, with aluminium cladding panels on a steel frame six metres tall.

It created a “raging inferno”, according to one of the test’s observers from Kingspan. The Building Research Establishment (BRE), the certification body that carried out the test, had to stop it early because it risked setting fire to the laboratory. Even after the heat source was extinguished, the product continued to burn on its own.

Nevertheless, the company went on selling the product as safe to use on high-rise buildings, relying on a successful test carried out in 2005 on a previous, different formulation of the product.

Yet even the 2005 test was declared “wholly invalid” at the Grenfell Tower Inquiry in November. Its rig used products that were not freely commercially available or widely used, such as non-combustible cement fibre cladding, and steel and graphite cavity barriers. This meant the test was unrepresentative.

The timings recorded did not reveal that the flames “had reached four metres up the six-metre rig after only five minutes of a 60-minute test, thereby demonstrating that the test would have failed but for the cavity barriers”.

K15 was one of the materials used in the refurbishment of Grenfell Tower, the west-London high-rise block where a fire broke out on 14 June 2017, killing 72 people…

In 2009, K15 was certified by the Local Authority Building Control (LABC) accreditation body as being “of limited combustibility”. This was wrong – even Kingspan’s technical adviser Ivor Meredith “couldn’t believe” what had been written in the certificate.

“We can be very convincing when we need to be,” remarked a jubilant email from Kingspan’s technical manager Philip Heath at the time. “In the end I think the LABC convinced themselves Kooltherm is the best thing since sliced bread. We didn’t even have to get any real ale down him!”

Read the full article in the New Statesman.


.

The looming long civil war that could break Ethiopia
John Young, New Frame, 10 December 2020

The civil war in Ethiopia that broke out on 4 November when Prime Minister Abiy Ahmed launched an attack on Tigray, the state administered by the Tigray People’s Liberation Front (TPLF), caught the world unaware. The transition of power in the ruling Ethiopian People’s Revolutionary Democratic Front (EPRDF), from Meles Zenawi after his death in 2012 to Hailemariam Desalegn, and then to Ahmed in 2018, seemingly went smoothly.

But this was not a simple shifting of chairs among the elite. It was a profound ideological change that threatens not only the viability of the ethnically fragmented Ethiopian state, but could also engulf neighbouring states in the Horn of Africa. 

At the top of that list is Eritrea, which supports Ahmed’s forces. Drones from the United Arab Emirates base in the Eritrean port city of Assab have attacked the TPLF. Meanwhile, neighbouring Sudan has received 45,000 mostly Tigrayan refugees, with the United Nations (UN) anticipating 200,000. The British Broadcasting Corporation and other media, however, reported that the Ethiopian army is stopping fleeing Tigrayans, presumably to prevent them from telling stories of widespread atrocities committed by soldiers. The army took control of Tigray’s capital of Mekelle on 28 November, and although Ahmed was quick to claim victory, almost all of the TPLF leadership escaped capture…

The immediate stimulus for the present war was a conflict that broke out in the northern command of the national army just outside Mekelle. But the main differences between the TPLF and Ahmed were over the fate of the EPRDF’s system of national federation, which had its origins in the TPLF’s demand for national self-determination during the anti-Derg war and became the cement that bound the various national-based movements that formed the EPRDF. 

Ethiopia was an empire state similar to tsarist Russia, and the national minorities demanded the end of suppression by an elite drawn from the Amhara, the second largest ethnic group with about 35 million or 32% of the country’s population.   

When the Derg was overthrown in 1991, there was a real danger of national groups seceding. These groups were led by the Oromos, the largest ethnic group with over 40 million people, or 36.4% of the population. As a result, the EPRDF federalist system radically decentralised state power and promised all national groups the right to self-determination. 

It is this system that Ahmed wants to end, returning Ethiopia to the centralised government against which generations of Ethiopians fought. This conflict gives impetus to Eritrean rebel groups that have long sought to overthrow the Isaias Afwerki dictatorship there, and it also encourages other aggrieved national groups in Ethiopia to rebel against Ahmed’s centralism.

Read the full article in New Frame.


.

Farmers’ protests: With emotional appeal
running high, Modi govt has lost the plot

Ajoy Ashirwad Mahaprashasta, The Wire, 12 December 2020

The union government’s climbdown on a number of contentious clauses in the newly-framed farm laws has only reinforced the perception that the Narendra Modi-led Centre prefers agitations over consultations.

Over the last few years, the ruling BJP’s non-consultative approach has precipitated multiple agitations. Yet, it has been fairly successful in using these as opportunities to polarise opinion and consolidate its rank and file, supporters, and fence-sitters by investing its political energy and monetary might.

In that respect, the unanimous rejection of the government’s proposal to reconsider the farm laws by ideologically competing farmers’ unions marks a watershed moment in Modi’s six-year tenure as prime minister.

The BJP machinery’s response to the farmers’ protests has so far been muddled, and the union government has been uncertain about how to deal with them. Taken aback by a sustained, organised campaign of protest against the laws, government officials and ministers have shown their readiness for negotiations with the farmers. But, at the same time, the BJP, its supporters and pliant media platforms have sought to defame the farmers’ movement, initially by labelling the protesters “Khalistanis” and now by projecting the movement as “Naxal-influenced”.

Farmers’ unions, on the other hand, have been unconcerned about what they see as “the BJP’s propaganda”. They have already conveyed to the Centre that they will not settle for anything less than a complete repeal of the laws, and have even threatened to intensify their agitations in the days to come.

All of this signals that the saffron party has struggled to polarise the political narrative around the farmers’ protests in its favour. The BJP hasn’t had much trouble spinning previous such agitations, which were born out of the Centre’s visibly undemocratic attitude, within its politics of Hindutva.

It predictably painted, although without any evidence, the Muslim leadership of the protests against the Citizenship (Amendment) Act, National Register of Citizens and National Population Register as “anti-national”. Similarly, the protests against a hurriedly implemented Goods and Services Tax (GST) were presented to the public by the BJP as being driven by vested political interests.

Such was the level of the government’s unilateralism that one can easily find lower-level bureaucrats complaining about difficulties in implementing the constantly changing rules. Likewise, the imminent protests against the sudden revocation of the constitutional status of Jammu and Kashmir were silenced by brute force. There are multiple such examples, small and big, in which the union government has not kept important stakeholders in the loop before taking decisions with far-reaching implications.

Read the full article in The Wire.


.

What if tropical diseases
had as much attention as COVID?

Francine Ntoumi, Nature, 17 November 2020

All year, COVID-19 has commandeered the world’s attention. It is as if no other disease has ever been more important, more contagious or more deadly.

I founded a non-profit research institute in 2008; we established the first molecular-biology laboratory in the Republic of Congo, at the country’s only public university. We monitor pathogens such as those that cause gastrointestinal diseases, malaria, HIV, tuberculosis (TB) and chikungunya — which together infect more than 250 million people each year globally, and kill more than 2.5 million. To keep treatments effective, we assess the development of resistance to antimalarial, antiretroviral and antibiotic drugs.

Our research programmes were already in place, so we could quickly pivot to diagnostic testing and blood-based epidemiological studies to understand how COVID-19 was spreading in Congo and how to keep health-care workers safe. Since March, three-quarters of our time has been spent on COVID-19.

That means I am neglecting my work on other diseases — which are not going away. And it’s not only my lab. In October, the World Health Organization (WHO) reported that progress against TB might stall: in the countries with the highest rates of the disease, the number of people diagnosed and directed to care dropped by one-quarter compared with last year’s figure. Because many countries have implemented lockdowns, hospitals and health centres have seen a significant drop in the number of people coming for treatment.

In Uganda, maternal mortality rose by 82% from January to March, and because of COVID-19, rates of HIV diagnoses and of people starting antiretroviral treatment (and treatment to prevent TB) will fall by 75% (D. Bell et al. Am. J. Trop. Med. Hyg. 103, 1191–1197; 2020). These treatments must be kept on track through active community outreach. In September, researchers at the WHO and elsewhere modelled what could happen if distribution of antimalarial medicine and insecticidal bednets to prevent malaria falls by up to 75% (D. J. Weiss et al. Lancet Infect. Dis. https://doi.org/fg3n; 2020). If this plays out, all the gains made against malaria over the past 20 years could be lost.

My message is not that efforts against COVID-19 are misguided, but that I am disheartened that such efforts have not been rallied and sustained against other infectious diseases.

Read the full article in Nature.


.

The Covid vaccine: For rich countries only
Heidi Chow, Tribune, 18 November 2020

A vaccine breakthrough is indeed great news but sadly it’s not for the whole of humanity – just a small fraction. Over 80% of the Pfizer vaccine stocks up to the end of next year have already been hoarded by rich countries such as the UK, US, EU, Japan and Canada. Collectively these countries represent just 14% of the global population.

If Pfizer’s vaccine is approved, the majority of the world’s population – living mainly in low and middle-income countries – will not be able to get anywhere near it. And it’s the same story with Moderna, which has declared that its vaccine is nearly 95% effective: 78% of its doses have already been bought up by rich countries, representing just 12% of the global population.

It is likely that global supplies will be limited even further because Pfizer and its partner BioNTech’s patent on the vaccine means no other company can make or sell that vaccine for a minimum of 20 years. This provides the basis for a legal monopoly – and with no competition, Pfizer decides who gets the vaccine and at what price.

On these terms, it’s no surprise that most of the vaccines have gone to the highest bidders and Pfizer/BioNTech are set to walk away with bonanza profits, making an estimated $13 billion next year from the vaccine.

All of this sounds hugely unfair, if not immoral. And yet that is exactly how the system has worked for decades. The pharmaceutical industry is a profit-driven machine that uses patent monopolies to charge the highest prices for life-saving treatments while reaping the highest profits.

It’s become one of the most profitable in the world, but at the expense of billions of patients who have struggled to access affordable basic and life saving treatments. This is bad enough in normal times but during a global pandemic, it could be truly disastrous.

So what can be done about it? Well, no one company can satisfy global demand. If getting actual physical stocks is the problem, then the obvious thing to do is for companies like Moderna and Pfizer to share their technological know-how and the rights to make the vaccine with other companies. Mobilising more manufacturers would increase global supply, allowing more people to access the vaccine and helping to prevent price gouging.

The World Health Organisation launched a mechanism earlier this year – the Covid-19 Technology Access Pool – to facilitate sharing of technological know-how and intellectual property rights, allowing any company or country to access much-needed vaccines and treatments. However, only 40 countries have so far joined this global pool, and pharmaceutical companies have condemned the scheme, with the Pfizer boss dismissing it as ‘nonsense.’

Read the full article in Tribune.


.

Should Big Pharma profit from Covid?
Tom Chivers, Unherd, 17 November 2020

Imagine that a drug firm produces a vaccine for some deadly disease that is affecting people in both the developed and the developing world. Roughly speaking, a new vaccine costs about a billion dollars to get to market, after putting it through animal trials, Phase I safety trials and Phase II and Phase III efficacy trials, applying for licensing, and so on. Let’s imagine it costs about a dollar a dose to make. So if you only made enough of the vaccine to dose one person, the cost of that dose would be $1,000,000,001.

But if you make two doses, the average cost per dose is $500,000,001. If you make four, it’s $250,000,001. If you make a billion, it’s $2. The more doses you make and sell, the closer your average cost per dose gets to your marginal cost of making a single dose.
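As a rough illustration of the average-cost arithmetic in the passage above, here is a minimal Python sketch using the article’s round numbers (roughly $1bn to get a vaccine to market and $1 per dose to manufacture); the variable and function names are mine, for illustration only.

```python
# Average cost per dose = (fixed development cost + marginal cost * doses) / doses,
# using the round numbers from the passage above: ~$1bn to develop, ~$1 per dose to make.
FIXED_COST = 1_000_000_000   # development cost, in dollars (illustrative)
MARGINAL_COST = 1            # cost of manufacturing one dose, in dollars (illustrative)

def average_cost_per_dose(doses: int) -> float:
    return (FIXED_COST + MARGINAL_COST * doses) / doses

for doses in (1, 2, 4, 1_000_000_000):
    print(f"{doses:>13,} doses -> ${average_cost_per_dose(doses):,.2f} per dose")
# 1 dose          -> $1,000,000,001.00
# 2 doses         -> $500,000,001.00
# 4 doses         -> $250,000,001.00
# 1 billion doses -> $2.00
```

The point of the sketch is simply that the fixed cost is spread ever thinner as volume grows, which is why the average cost converges on the marginal cost.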

The socially optimal outcome is that we charge everyone the marginal cost, $1, so it can reach the maximum number of people. But if we do that, the pharma company will never cover its fixed costs: it will be $1 billion in the red, and probably won’t want to make any more vaccines. 

On the other hand, if you let pharma companies charge the market value — that is, the amount that people are willing to pay — then they will cover their fixed costs, because people in rich countries want to be vaccinated against this disease. But poorer countries will not be able to afford it. People who would be willing to pay the marginal cost are priced out of a vaccine.

The price that will encourage industry investment in R&D, and the price that maximises the number of people covered by the vaccine, are different. It’s a simple trade-off. “It’s obvious that, if your financial model is to sell to rich consumers at above cost price, you’re going to do R&D on products that you can do that for,” says Owen Barder, a development economist who has worked on pharma incentives. “The patent model suffers a really big problem.”

One obvious solution would be to take the whole business out of pharma companies’ hands, and let governments and philanthropic organisations run the whole thing. But the profit motive is very good at one particular thing: making pharma companies drop a product that isn’t working. “I think the key skill of a pharma company is knowing when to exit,” says Barder. “And, to a gross generalisation, governments are rubbish at exit.”

Most candidate drugs don’t work, or don’t work very well. So most of the time you have to abandon the research after spending a lot of money on it. But it is very hard for a government to say, “This thing that we’ve spent £100 million on doesn’t work, and we’re going to throw all that money away.” It’s easier to throw another £100 million in and avoid the question. Drug companies, we all agree, are extremely good at making money. And part of being good at making money is not throwing good money after bad. Publicly funded research is great for blue-skies, open-ended questions that help us understand things, but it is less effective than private firms at developing specific products, because of this exit problem.

Read the full article in Unherd.


.
Joe Jones, “We Demand” (1934)

Beyond the Great Awokening
Adolph Reed Jr., New Republic, 8 December 2020

Discourse about race and politics in the United States has been driven in recent years more by moralizing than by careful analysis or strategic considerations. It also depends on naïve and unproductive ways of interpreting the past and its relation to the present. I’ve discussed a number of the political and intellectual casualties of what we might call this Great Awokening, among them a tendency to view the past anachronistically, through the lens of the assumptions, norms, and patterns of social relations of the present.

That inclination has only intensified with the proliferation of notions like Afropessimism, which postulates that much of, if not all, the history of the world has been propelled by a universal “anti-blackness.” Adherents of the Afropessimist critique, and other race-reductive thinkers, posit a commitment to a transhistorical white supremacy as the cornerstone and motive force of the history, and prehistory, of the United States, as well as imperialist and colonialist subjugation in other areas of the world. Most famously, The New York Times’ award-winning 1619 Project, under the direction of Nikole Hannah-Jones, asserts that slavery and racial subordination have defined the essence of the United States since before the founding—a brand of ahistorical moralizing that is now being incorporated into high school history curricula.

Yet, as I have argued, the premise that subordination to white supremacy has been black Americans’ definitive and unrelenting experience in the United States is undone by the most casual observation. As just one instance, I recall a panel at an early 1990s conference on black politics at Harvard Law School, organized by the school’s black student group, on which a distinguished black Harvard Law professor declaimed—with no qualification or sense of irony—that nothing had changed for black Americans since 1865. Until recently, this obviously false contention could make sense as a rhetorical gambit, indeed one that depended on its falsity for its effectiveness. It was a jeremiad dressed up as an empirical claim; “nothing has changed” carried a silent qualifier—that whatever racial outrage triggered the declaration makes it seem as though nothing had changed. This kind of provocation pivots on the tacit rhetorical claim that the offense it targets is atavistic—but in order for it to gain any significant traction, it requires that we understand that things have changed to the extent that such offenses should no longer be condoned, accepted, or taken in stride.

However, the fervor of the Great Awokening has since transformed this fundamentally rhetorical device into an assertion of fact. That is one of the most intellectually disturbing features of the irrationalist race reductionism of our own historical moment. It sacrifices or openly rejects not only nuance in historical interpretation but also the idea of historicity itself—the understanding that the relation between past and present generates meanings and nuances beyond the bounds of outworn dogmas in either era. Inflexible race reductionism also rules out, on principle, the notion that we should strive to understand ideas and actions in the past synchronically, as enmeshed in their own complex contexts of meaning, as well as in relation to ours. Race reductionist politics depends on denying historical specificity, typically through a sleight-of-hand maneuver that depicts black Americans’ challenges and struggles as set in motion by a singular, transhistorical, and idealized abstraction called “racism,” “white supremacy,” or “anti-blackness.” What’s omitted from this Borg-like model of an undeviating, and seemingly all-conquering, white supremacist opposition are the actual policies and programs that actual black people, often along with others, fought for and against. Black Metropolis shines a spotlight on that difference.

Read the full article in the New Republic.


.

The racial wealth gap is about the upper classes
Matt Bruenig, People’s Policy Project, 29 June 2020

In light of the recent resurgence of Black Lives Matter protests, there has been renewed discussion of the racial wealth gap and how to close it (Nikole Hannah-Jones, Annie Lowrey). I have written on this topic many times in the past (I, II, III, IV). One thing I have tried to emphasize over the years, which I will do again here in a different way, is that due to the extremely concentrated wealth distribution in the US, the racial wealth gap is almost entirely about the upper classes in each racial group. I say this not to imply that it is unimportant but rather because this fact must be grappled with upfront if we are going to make a serious effort to close the racial wealth gap.

If you take the net worth of all white households and divide it by the number of white households, you get $900,600. If you do the same thing for black households, you get $140,000. The difference between these figures — $760,600 — is the best representation of the overall racial wealth gap. That is how much more wealth black people would need per household to have as much wealth as white people have per household.

But overall statistics can be misleading. If you decompose both of these figures into deciles, what you find is that nearly all white wealth is owned by the top 10 percent of white households, just as nearly all black wealth is owned by the top 10 percent of black households. The lower and middle deciles of each racial group own virtually none of their racial group’s wealth.

What this means is that the overall racial wealth disparity is being driven almost entirely by the disparity between the wealthiest 10 percent of white people and the wealthiest 10 percent of black people.

One way to illustrate this point (h/t Sam Tobin-Hochstadt) is to see what would happen to the overall racial wealth gap if we entirely closed the wealth gap that exists between the bottom 90 percent of each racial group. Put differently: what would happen to mean black wealth if the bottom 90 percent of black families were given the exact same per-household wealth as the bottom 90 percent of white families?

The answer is that mean black wealth would rise from $140,000 to $311,100. The overall racial wealth gap would thus decline from $760,600 to $589,500, a fall of 22.5 percent. This means that even after you have completely closed the racial wealth gap between the bottom 90 percent of each race, 77.5 percent of the overall racial wealth gap still remains, which is to say that the disparity between the top deciles in each race drives over three-fourths of the racial wealth gap.
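To make the decomposition concrete, here is a minimal Python sketch that re-derives the percentages quoted above from the per-household figures in the excerpt; the variable names are mine, and the counterfactual figure of $311,100 is taken directly from the article rather than computed from decile data, which is not reproduced here.

```python
# Figures quoted in the excerpt above: mean net worth per household, in dollars.
white_mean = 900_600
black_mean = 140_000
overall_gap = white_mean - black_mean                    # $760,600

# Bruenig's counterfactual: give the bottom 90% of black households the same
# per-household wealth as the bottom 90% of white households. He reports that
# mean black wealth would then rise to $311,100.
black_mean_counterfactual = 311_100
remaining_gap = white_mean - black_mean_counterfactual   # $589,500

reduction = (overall_gap - remaining_gap) / overall_gap
print(f"Overall gap: ${overall_gap:,}")
print(f"Gap after closing the bottom-90% disparity: ${remaining_gap:,}")
print(f"Reduction in the overall gap: {reduction:.1%}")          # ~22.5%
print(f"Share driven by the top deciles: {1 - reduction:.1%}")   # ~77.5%
```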

Read the full article in People’s Policy Project.


.

To reduce racial inequality, raise the minimum wage
Ellora Derenoncourt & Claire Montialoux, New York Times, 25 October 2020

Our new research shows that Congress’s decision in 1966 to both raise the minimum wage and expand it to workers in previously unprotected industries led to a significant drop in earnings inequality between Black and white Americans — and explains more than 20 percent of the overall reduction during this period.

The findings suggest that raising and expanding the minimum wage could once again reduce the persistent earnings divide between white workers and Black, Hispanic and Native American workers. Though legislation to raise the wage floor would be a universal program in name and application, in practice it would be a remarkably effective tool for racial justice.

As with other major pieces of 20th-century progressive legislation, the cost of gaining Southern Democratic votes in 1938 for the federal minimum wage was a racist compromise: in this case, the exclusion of certain industries because of their high concentrations of Black workers, especially in the South.

Though it’s a fact that is often skipped over in popular histories, civil rights leaders who organized the famous March on Washington for Jobs and Freedom in 1963 demanded an increase in the minimum wage and one that applied to all employment. Modest but meaningful increases were eventually passed, and the Fair Labor Standards Act of 1966 also extended coverage to some of the excluded industries: nursing homes, laundries, hotels, restaurants, schools, hospitals and agriculture.

In 1967, the newly covered sectors employed about eight million workers ages 25 to 55, or about 21 percent of the U.S. prime-age work force. And, crucially, nearly one-third of Black workers were employed in these sectors.

White workers greatly benefited from the 1966 law; Black workers gained even more. In addition to being overrepresented in the newly covered industries, Black workers earned less on average in these industries than their white counterparts. So the earnings increase caused by the reform was 10 percent on average for Black workers in the newly covered industries, twice as much as that for white workers.

Based on our analysis, we estimate that the minimum wage increase was responsible for approximately 20 percent of the reduction in the earnings gap between Black and white workers between 1967 and 1980.

Read the full article in the New York Times.


.

“People of color” do not
belong to the Democratic Party

Jay Caspian Kang, New York Times, 20 November 2020

Every immigrant arrives in this country with an implied debt. This country was nice enough to let you in, handed you a bag of rights and will now leave you alone to make your fortune. Left and right might disagree on how many people to let into the country or how to treat them when they’re here, but both sides expect a return on their good will.

They agree that America is enough — as long as you meet opportunity with hard work, you can secure ownership in this country. In exchange, both sides expect loyalty, whether complaint-free allegiance to the country’s ideals or the acknowledgment that very open-minded and generous people worked hard to fight off the racists and the xenophobes and that you, downtrodden immigrant, should never forget those who protect your freedom to pursue the American dream.

In the wake of the election, there has been a concerted call to stop treating Latinos and, to a lesser extent, Asian-Americans as a monolith. Such a reckoning is long overdue and certainly necessary. It’s fundamentally true that a Cuban-American in South Florida shares very little in common with a Guatemalan fishery worker in New Bedford, Mass. — who, in turn, does not identify in any real way with fifth-generation Texans along the Rio Grande Valley.

Similarly, former Vietnamese refugees in Orange County, Calif., will have a different level of sensitivity toward charges of “Communism” than a second-generation Ivy League-educated Indian-American just up the freeway in suburban Los Angeles. Though the full picture of the electorate is not yet clear, it shouldn’t be surprising that some of these populations ended up ignoring or even championing the xenophobia of the first Trump administration while others found it abhorrent and against their particular interests.

This should be fairly obvious — different people from different parts of the world think differently, especially across generations — but the quintessentially American idea of the immigrant’s debt flattens all immigrants down into fixed categories. Those categories might help organize data, but they do not capture any meaningful insights into why people are voting the way they are.

It’s why the right does not understand why a group of castoffs from “shithole countries” would ever complain about America. It’s why progressives, the people who place “Immigrants Are Welcome Here” signs in their coffee shops and who appreciate all the nuances of immigrants’ native cuisines, cannot understand why those same castoffs would ever vote against their self-appointed protectors. The debt is the monolith.

Read the full article in the New York Times.


.

Reconstructing justice: Race, generational divides,
and the fight over “Defund the police”
Michael Javen Fortner, Niskanen Centre, October 2020

How do we break this impasse? Where do we go from here? We can begin to look for a path forward by reflecting on how the politics of punishment have evolved from the 1980s to today, reviewing polling data and key policy moments. While many accounts of attitudes about policing highlight “racial divides,” my analysis seeks to understand African American opinion on its own terms as well as in relation to other racial groups and seeks to capture its political significance historically and in the current moment. Instead of assuming a coherent “Black perspective” on policing and punishment, it centers the complex, and sometimes contradictory, internal politics of public safety within African American communities. While most Blacks have been less punitive than most whites, most Blacks have also been extremely punitive in their own right.

First, African American attitudes grew increasingly punitive towards crime, policing, and punishment in response to rising violence in Black communities from the 1960s to the early 1990s. The passage of the Violent Crime Control and Law Enforcement Act of 1994 (aka “the crime bill”) provides a key example. Anti-crime sentiments made African Americans a crucial member of the “get tough” coalition that defined American politics and policy in that era. Second, crime’s stunning denouement led Black opinion to moderate, as revealed by attitudes and events in New York City as reported violent crimes dropped sharply from their peak in the early 1990s, in part reflecting new policing strategies. Despite living in safer communities and continuing to see police brutality, most African Americans remained committed to effective policing as a public safety strategy. The Black Lives Matter movement emerged, in part, however, as a response to these same policing strategies and signals a major generational division in African American politics.

Third, manifestations of these generational splits were visible in the 2020 Democratic presidential primary campaign and the subsequent protests seeking to “defund the police.” Recent surveys show that most African Americans side with Clyburn more than Alexander. Most Americans, including Blacks, endorse meaningful police reforms, but they also oppose abolition, although that is favored by a plurality of Black and white millennials. The fate of defund measures in Minneapolis, Atlanta, and New York City documents the ways in which the fight over “defund the police” is as much a conflict between young and old and left and center as it is between Black and white.

My analysis then returns to the central question: Where do we go from here? Some have cheered the ethical and practical benefits of abolition. Others have championed the merits of certain reforms. Without rehashing or adjudicating between these perspectives, one can still see a policy space that heeds the constraints of contemporary attitudes and attends both to the deep and legitimate fear of crime that continues to weigh heavily on many African Americans and to the terror that police violence foments among all Blacks. Living with overpolicing and underprotection, most African Americans seek the reconstruction of public safety strategies, urban communities, and the relationship between those strategies and those communities. We need to end police brutality without ending policing.

Read the full article at the Niskanen Centre.


.

Stealing to survive: More Americans are
shoplifting food as aid runs out during the pandemic
Abha Bhattarai & Hannah Denham, Washington Post, 10 December 2020

With the United States now registering more than 150,000 new coronavirus cases a day, some communities are reintroducing restrictions in an effort to contain the virus. Most of California is now under strict stay-at-home orders, for example, while states including Nevada, Maryland and Pennsylvania have issued new indoor occupancy limits. Such orders tend to hit already vulnerable workers in low-wage service jobs in restaurants, retail and bars the hardest.

In Maryland, Jean was successfully juggling college and a job, and had just bought her first car, when the pandemic crashed down like a sneaker wave. Her son’s day-care center suddenly closed in April, forcing her to give up her $15-an-hour job as a receptionist. But quitting meant she didn’t qualify for unemployment benefits. She says she was denied food stamps at least three times and gave up on local food banks because of the lines.

With no stimulus aid and her savings gone by May, Jean said she was out of options. So she began sneaking food into her son’s stroller at the local Walmart. She said she’d take things like ground beef, rice or potatoes but always pay for something small, like a packet of M&M’s. Each time, she’d tell herself that God would understand.

“I used to think, if I get in trouble, I’d say, ‘Look, I’m sorry, I wasn’t stealing a television. I just didn’t know what else to do. It wasn’t malicious. We were hungry,’ ” said Jean, 21, who asked to be identified by her middle name to discuss her situation freely. “It’s not something I’m proud of, but it’s what I had to do.”

Retailers have historically been most concerned about staff when it comes to what they call “shrink.” Workers are typically behind about a quarter of the $25 billion in global losses reported each year, a category that includes lost merchandise, stolen cash and employee errors, security experts say.

That changed with the pandemic as customer shoplifting became more pronounced, especially in areas with high joblessness, said Fabien Tiburce, chief executive of Compliant IA, which provides loss prevention software to retailers. “There is a well-known historical correlation between unemployment and theft,” he said, a connection that is more entrenched in the United States than in countries with more robust safety nets like Canada and Australia.

Dollar Tree and Family Dollar, which often are concentrated in low-income areas, have seen “increasing instances of theft” during the past year, according to spokeswoman Kayleigh Painter. She declined to share specific data or protocols, but said the company is “continually evaluating and enhancing on-premise security and surveillance systems, as well as our associate training.”

Read the full article in the Washington Post.


.
Facial recognition image (via the BBC)

The ethical questions that
haunt facial-recognition research

Richard van Noorden, Nature, 18 November 2020

In September 2019, four researchers wrote to the publisher Wiley to “respectfully ask” that it immediately retract a scientific paper. The study, published in 2018, had trained algorithms to distinguish faces of Uyghur people, a predominantly Muslim minority ethnic group in China, from those of Korean and Tibetan ethnicity.

China had already been internationally condemned for its heavy surveillance and mass detentions of Uyghurs in camps in the northwestern province of Xinjiang — which the government says are re-education centres aimed at quelling a terrorist movement. According to media reports, authorities in Xinjiang have used surveillance cameras equipped with software attuned to Uyghur faces.

As a result, many researchers found it disturbing that academics had tried to build such algorithms — and that a US journal had published a research paper on the topic. And the 2018 study wasn’t the only one: journals from publishers including Springer Nature, Elsevier and the Institute of Electrical and Electronics Engineers (IEEE) had also published peer-reviewed papers that describe using facial recognition to identify Uyghurs and members of other Chinese minority groups. (Nature’s news team is editorially independent from its publisher, Springer Nature.)

The complaint, which launched an ongoing investigation, was one foray in a growing push by some scientists and human-rights activists to get the scientific community to take a firmer stance against unethical facial-recognition research. It’s important to denounce controversial uses of the technology, but that’s not enough, ethicists say. Scientists should also acknowledge the morally dubious foundations of much of the academic work in the field — including studies that have collected enormous data sets of images of people’s faces without consent, many of which helped hone commercial or military surveillance algorithms.

An increasing number of scientists are urging researchers to avoid working with firms or universities linked to unethical projects, to re-evaluate how they collect and distribute facial-recognition data sets and to rethink the ethics of their own studies. Some institutions are already taking steps in this direction. In the past year, several journals and an academic conference have announced extra ethics checks on studies.

“A lot of people are now questioning why the computer-vision community dedicates so much energy to facial-recognition work when it’s so difficult to do it ethically,” says Deborah Raji, a researcher in Ottawa who works at the non-profit Internet foundation Mozilla. “I’m seeing a growing coalition that is just against this entire enterprise.”

Read the full article in Nature.


.

Is facial recognition too biased to be let loose?
Davide Castelvecchi, Nature, 18 November 2020

The accuracy of facial recognition has improved drastically since ‘deep learning’ techniques were introduced into the field about a decade ago. But whether that means it’s good enough to be used on lower-quality, ‘in the wild’ images is a hugely controversial issue. And questions remain about how to transparently evaluate facial-recognition systems.

In 2018, a seminal paper by computer scientists Timnit Gebru, then at Microsoft Research in New York City and now at Google in Mountain View, California, and Joy Buolamwini at the Massachusetts Institute of Technology in Cambridge found that leading facial-recognition software packages performed much worse at identifying the gender of women and people of colour than at classifying male, white faces. Concerns over demographic bias have since been cited frequently in calls for moratoriums or bans of facial-recognition software.

In June, the world’s largest scientific computing society, the Association for Computing Machinery in New York City, urged a suspension of private and government use of facial-recognition technology, because of “clear bias based on ethnic, racial, gender, and other human characteristics”, which it said injured the rights of individuals in specific demographic groups. Axon, a maker of body cameras worn by police officers across the United States, has said that facial recognition isn’t accurate enough to be deployed in its products. Some US cities have banned the use of the technology in policing, and US lawmakers have proposed a federal moratorium.

Companies say they’re working to fix the biases in their facial-recognition systems, and some are claiming success. But many researchers and activists are deeply sceptical. They argue that even if the technology surpasses some benchmark in accuracy, that won’t assuage deeper concerns that facial-recognition tools are used in discriminatory ways.

Facial-recognition systems are often proprietary and swathed in secrecy, but specialists say that most involve a multi-stage process (see ‘How facial recognition works’) using deep learning to train massive neural networks on large sets of data to recognize patterns. “Everybody who does face recognition now uses deep learning,” says Anil Jain, a computer scientist at Michigan State University in East Lansing.

Read the full article in Nature.


.

When I step outside, I step
into a country of men who stare

Fatima Bhojani, New York Times, 17 November 2020

I am angry. All the time. I’ve been angry for years. Ever since I began to grasp the staggering extent of violence — emotional, mental and physical — against women in Pakistan. Women here, all 100 million of us, exist in collective fury.

“Every day, I am reminded of a reason I shouldn’t exist,” my 19-year-old friend recently told me in a cafe in Islamabad. When she gets into an Uber, she sits right behind the driver so that he can’t reach back and grab her. We agreed that we would jump out of a moving car if that ever happened. We debated whether pepper spray was better than a knife.

When I step outside, I step into a country of men who stare. I could be making the short walk from my car to the bookstore or walking through the aisles at the supermarket. I could be wrapped in a shawl or behind two layers of face mask. But I will be followed by searing eyes, X-raying me. Because here, it is culturally acceptable for men to gape at women unblinkingly, as if we are all in a staring contest that nobody told half the population about, a contest hinged on a subtle form of psychological violence…

This country fails its women from the very top of government leadership to those who live with us in our homes. In September, a woman was raped beside a major highway near Lahore, Pakistan’s second-largest city. Around 1 a.m., her car ran out of fuel. She called the police and waited. Two armed men broke through the windows and assaulted her in a nearby field.

The most senior police official in Lahore remarked that the survivor was assaulted because, he assumed, she “was traveling late at night without her husband’s permission.”

An elderly woman in my apartment building in Islamabad remarked, “Apni izzat apnay haath mein” — Your honor is in your own hands. In Pakistan, sexual assault comes with stigma, the notion that a woman, by being on the receiving end of a violent crime, has brought shame to herself and her family. Societal judgment is a major reason survivors don’t come forward.

Responding to the Lahore assault, Prime Minister Imran Khan proposed chemical castration of the rapists. His endorsement of archaic punishments rather than a sincere promise to undertake the difficult, lengthy and necessary work of reforming criminal and legal procedures is part of the problem. The conviction rate for sexual assault is around 3 percent, according to War Against Rape, a local nonprofit.

Read the full article in the New York Times.


.

India COVID-19 response suggests
‘scientific superpower’ tag an impossible dream

Vasudevan Mukunth, The Wire, 12 December 2020

Journalists have been on their toes, at the edge of their seats and in various other awkward positions throughout India’s COVID-19 epidemic – and more often than not because they have been brought there by inept governance. It was very heartening through the last ten months to witness many reporters and editors rise to the unfortunate occasion, intelligently covering a panoply of stories that located scientific ideas and decisions in their right social and political contexts in the face of acrimonious resistance from the ruling party and its foot-soldiers.

In the face of their work, the government’s attempts to wield the epidemic as an excuse for its failures quickly came unstuck. However, many of these stories ought not to have been needed in the first place – in that they were the result of actions and policies that are easily fixed, but weren’t, often because government officials had lied. The Serum Institute case is one of the latest examples of such a fiasco, especially now that other trial participants are contemplating legal action against the company as well as the Indian Council of Medical Research (ICMR) for not being told of a severe adverse event that may have been related to the vaccine.

Other examples include the shutting down of the Manipal Institute of Virology, the initial seroprevalence survey results, the validation of antibody testing kits, details regarding drug and vaccine clinical trials, ignoring epidemiologists’ advice, decisions about lockdowns and containment, various notices issued by the AYUSH ministry, confusion over ‘red’ and ‘orange’ zones, ridiculous statements at press conferences including a crude attempt at communalising the pandemic, and, unforgettably, the approval of hydroxychloroquine, remdesivir, favipiravir, itolizumab and tocilizumab without sufficient evidence of their efficacy.

To get to the hearts of these stories, journalists have had to wade through a swamp of lies and deceit. The Wire wrote in its July 1 editorial: “Today, no one expects ICMR to contradict the Centre’s COVID-19 response strategy on any count, irrespective of the enormity of a transgression”…

It’s baffling how a country can be – or even aspire to be – a ‘scientific superpower’ if its political leadership grossly and perhaps deliberately misunderstands what ‘science’ is. Nothing bears this out more than the ruling Bharatiya Janata Party’s intention to roll out a form of medicine that the Indian Medical Association calls ‘mixopathy’ – a mix of Ayurvedic and allopathic ideas, techniques and methods.

Read the full article in The Wire.


.

Belgium’s reckoning with a brutal history in Congo
Neil Munshi, Financial Times, 13 November 2020

The Union Minière runs like a copper seam through Belgian colonialism and the country’s contemporary relationship with that period. In 1906, Leopold II created L’Union Minière du Haut-Katanga (UMHK) along with a diamond-mining business and railway company to exploit Congo’s mineral resources. Those minerals helped to build the Belgian economy and its cities, to accelerate its industrialisation and to enrich its royal family. They also became, as with the statue on Place du Trône, the very means through which Belgium celebrated itself, in scores of bronze-and-tin tributes to its mission civilisatrice.

In 1912, Union Minière began ripping high-grade copper out of the Kalukuluku mine. The company, which was part owned by a British group, was the world’s largest copper producer by 1929, the largest producer of cobalt by the 1960s. By then, UMHK generated half of Congo’s revenues and 70 per cent of its exports. When America’s atomic bombs fell on Japan, they were fuelled by Congolese uranium extracted by Union Minière.

The company became so powerful it was known as “a state within a state”. By 1961, its dominion spread over 7,000 square miles. It was in charge of its employees’ lives from birth to death, schooling their children, who then became workers themselves. Up to a quarter of a million men were forcibly “pressed into its service” during the first 30 years of UMHK’s existence, according to historian John Higginson. The forced rural exodus transformed Congo’s agricultural production for ever.

“Belgium, and Belgian people, have to admit that if they are rich now . . . it’s because they took the money [from] somewhere — and it’s not just that they took the money, they really colonised the whole population, a whole territory, with violence,” says Anne Wetsi Mpoma, an activist and art curator who has been asked to advise the new parliamentary commission as a member of the Congolese diaspora.

Belgium was already a major industrial power by the late 19th century but its economy was immeasurably improved by the capture of Congo, says Pierre Kompany, Belgium’s most prominent politician of Congolese descent. The port of Antwerp grew to become the world’s second busiest after Liverpool, first because of Congolese rubber and ivory shipments and then because of the UMHK minerals. According to a 2007 survey, nine of the 23 richest Belgian families could trace their fortunes to colonial Congo.

Meanwhile, Leopold II spent so much on public works, including Brussels’ grand Arcade du Cinquantenaire, that he earned the nickname the “builder king”. Kompany advocates for closer, equitable ties rather than reparations, but he too is clear that Belgium must fully acknowledge its legacy. “Everyone should know from where he got his money,” he says. “If Belgium didn’t meet Congo, Belgium wouldn’t be what it is today.”

Read the full article in the Financial Times.


.

The birther myth stuck around for years.
The election fraud myth might too.

Kaleigh Rogers, FiveThirtyEight, 23 November 2020

A significant number of Americans currently believe the 2020 election was stolen, even though it wasn’t. A Reuters/Ipsos poll last week showed 52 percent of Republicans believe President Trump “rightfully won” the election. But the only “evidence” of election fraud has been widely debunked.

An optimist might think the public will gradually drop this election fraud myth as the Trump campaign’s lawsuits are thrown out, recounts and audits are conducted, and, eventually, Joe Biden is sworn in as president. But we’ve seen Trump try to falsely claim a president is illegitimate before, as he spent years claiming without evidence that President Obama wasn’t born in the United States, and thus ineligible to be president. If this recent saga is anything like the birtherism movement, it’s not going anywhere.

“If you’re asking if this is going to go away, I would bet a lot of money that it won’t,” said Adam Berinsky, a political scientist at MIT who is working on a book about political conspiracy theories.

Birtherism first emerged in 2008 during Obama’s primary campaign through the now very quaint medium of chain emails. After securing the nomination, Obama’s campaign published a copy of his certification of live birth in Hawaii. Many assumed this would put an end to the myth that he wasn’t born in America. Trump was one of the main people who ensured it did not.

In 2011, Trump began aggressively beating the birtherism drum, including via some comments he made at the Conservative Political Action Conference in February. Over the next five years, in media appearances, speeches and on Twitter, Trump repeatedly made false claims that Obama was not born in America. This continued even after Obama released his long-form birth certificate in April 2011, a piece of evidence Trump had demanded.

It wasn’t until Trump’s own campaign for president in 2016 — when his birtherism was thought to be among the reasons he wasn’t polling well among Black Americans — that Trump admitted the obvious truth: Obama was born in the United States. But that admission was brief (the statement lasted less than a minute) and included Trump blaming Hillary Clinton for originating the myth, which isn’t true.

When the birther myth first emerged, there wasn’t much in the way of public opinion polling about it. It wasn’t until Trump revived the conspiracy theory in 2011 that pollsters started to track how much of the public believed it. And over the years, even as more evidence emerged — such as the long-form birth certificate and contemporaneous newspaper announcements — proving Obama was born in the U.S., the belief has persisted. As recently as last year, a YouGov poll found that 34 percent of Americans think it’s “probably true” or “definitely true” that Obama was born in Kenya, as the birther myth often claimed. Among self-identified Republicans, that number was 56 percent.

Read the full article in FiveThirtyEight.


.
Dana Schutz, “Open Casket”

All shook up: The politics of cultural appropriation
Brian Morton, Dissent, Fall 2020

Critics of cultural appropriation believe themselves to be involved in a significant political activity, yet the objects of their criticism are usually people who are relatively powerless—the yoga teacher, the women with the burrito cart, the visual artist, the novelist who dares to venture out of her lane. It would be hard to make the case that the critique of cultural appropriation constitutes an assault on unjust hierarchies in our society, since those who hold real power are rarely the objects of this critique.

Charges of cultural appropriation are also often made against successful artists and celebrities, from Elvis Presley to Kim Kardashian to Jeanine Cummins, the author of American Dirt—but it would be fanciful to say that entertainers represent the source of power and unjust hierarchy in our society either.

In 2009, Haiti’s parliament raised the national minimum wage to 61 cents an hour. Foreign manufacturers, along with the U.S. State Department, immediately pushed back, prevailing on Haiti to lower textile workers’ minimum wage to 31 cents an hour. This came to about $2.50 per day, in a country whose estimated daily cost of living for a family of three was about $12.50.

Powerful corporations from the most powerful country on earth exerted pressure that intensified the destitution of people in Haiti. Among the corporations were Levi Strauss and Hanes, whose CEO was at that time receiving a compensation package of about $10 million a year. Yet you could have searched Facebook and Twitter and the rest of the internet for a long time before finding any Americans who cared or even knew about any of this, even after WikiLeaks and the Nation brought it to light in 2011.

In 2017, the two Portland women who’d opened a burrito cart closed their business after being assailed by online activists for appropriating the cuisine of Mexico. The following year, when the Goodyear Tire & Rubber Company fired dozens of workers who were trying to launch an independent trade union at its factory in San Luis Potosí, Mexico, few in the world of online outrage took any notice.

Of course, the pressure exerted on working people in Haiti and Mexico is the same pressure that corporate power exerts all over the world, including within this country, where capital’s long war against labor rights and social welfare provisions seems to grow more intense every year. This is true appropriation—the stealing of people’s life chances, the repression of their opportunity for leisure and health and safety, the bulldozing of any possibility of equitable local development. The malefactors here aren’t women running a burrito cart or musicians soaking up influences or white models wearing dreadlocks or writers trying to dream their way into other people’s lives, but corporate actors making decisions that degrade us all…

We can embrace a sort of cultural solipsism that holds that different groups have nothing in common, or we can understand that our lives are inextricably bound up with the lives of people we’ll never know. We can deny what we owe to one another, or we can seek to retrieve the vision of a shared humanity. We can choose to believe that it’s virtuous to try to stay in our lanes, or we can choose to learn about the idea of solidarity. It’s an old idea, but for those of us concerned with freedom and equality, it’s still the best idea we have.

Read the full article in Dissent.


.

Why are politicians suddenly talking
about their “lived experience”?

Kwame Anthony Appiah, Guardian, 14 November 2020

And what made the phrase so powerful was the unappealable authority it seemed to represent. As Walt Whitman wrote in Song of Myself, that most American of poems, “I am the man, I suffer’d, I was there.” You can debate my sociopolitical analyses – those facts and interpretations are shared and public – but not my lived experience. Lived experience isn’t something you argue, it’s something you have.

Yet if lived experience was once viewed as a way to speak truth to power, power has learned to speak “lived experience” with remarkable fluency. Consider what happened when, in the wake of the George Floyd protests, Senate Republicans set out to counter a Democratic bill for police reform with a milder proposal of their own, one backed by Senator Tim Scott, from South Carolina. Mitch McConnell, the Republican leader of the Senate, declared: “It’s a straightforward plan based on facts, on data and lived experience” – the lived experience evidently supplied by Scott, the only black Republican in the Senate. Somehow the lived experience of Cory Booker, the black senator who introduced a Democratic police reform bill, offered different lessons.

Experience, alas, is never unmediated and self-interpreting. Ideology, though it can be shaped by experience, also shapes our experiences. The twins Shelby Steele and Claude Steele – a former professor of English and a professor of psychology – draw on their lived experience to produce opposite pictures of the black American condition. Claude has emphasised the detrimental effects of racial stereotypes; Shelby sees the real threat in efforts, such as affirmative action, to remedy racial disparities. Justice Clarence Thomas, a black conservative, draws from his lived experience to confirm a bootstrapping position (If I can make it, so can you), just as the late Congressman John Lewis, hero of the civil rights left, could do so to confirm the need for social intervention (I almost didn’t make it). There’s no guarantee what message people will take from their experience: no guarantee that we’ll all be singing the Song of Myself in the same key.

When we’re thinking about policy, then, how much weight should we give to private experience? Pressed to explain what she had in mind, Harris listed some elements of her biography: growing up a black child in the US, serving as a prosecutor, having a mother who was a teenage immigrant from India. There’s no doubt, of course, that these are the sorts of experiences from which a person could learn a great deal. And stories drawn from our own experience can be powerful ways of recounting what we have learned. But identities are too multiple and complex to allow any individual’s experience to count as truly representative.

Read the full article in the Guardian.


.

Talk less, work more
Namit Arora, The Baffler, 23 November 2020

Sometime after midnight on June 25, 1975, over six hundred political leaders, social activists, and trade unionists in India were rudely awakened by knocks on their doors. By dawn, they had been placed behind bars for inciting “internal disturbance.” In parallel, the government shut off electricity to newspaper offices, blocking their next day’s editions.

“The President has proclaimed the Emergency,” Prime Minister Indira Gandhi announced in a surprise broadcast the next morning on All India Radio. “This is nothing to panic about.” The previous night, she had made a bleary-eyed President Fakhruddin Ali Ahmed trigger the Emergency provision in Article 352 of India’s constitution, which allowed her to postpone elections and suspend most fundamental rights, including those to speech, assembly, association, and movement. With the stroke of a pen, Gandhi had effectively dismantled India’s democratic infrastructure, concentrating dictatorial power in herself. Total press censorship was imposed, and foreign journalists who did not toe the line were summarily expelled, including stringers with the Washington Post, the Guardian, and the Daily Telegraph. On June 28, someone snuck a clever obituary into the Bombay edition of The Times of India: “D’Ocracy—D.E.M., beloved husband of T. Ruth, loving father of L.I. Bertie, brother of Faith, Hope, and Justice, expired on 26th June.”

The twenty-one months of Emergency that followed are regarded as the darkest chapter in independent India’s history. For those old enough to remember, the word recalls mass incarcerations without trial, a gagged press and propaganda, slum demolitions, and—most shockingly—the forced sterilizations of millions. Often cited as a cautionary tale in Indian political discourse, it is generally seen as an “exceptional” period from which India recovered admirably well, thanks in large part to the resilience of its democratic institutions and ethos. 

Two new books powerfully challenge this consensus. In India’s First Dictatorship, Christophe Jaffrelot and Pratinav Anil expose the chronic weaknesses in India’s democratic culture prior to the Emergency, revealing the role that other actors—businessmen, the middle class, even trade unionists and some communists—played in enabling Gandhi’s authoritarian rule. In Emergency Chronicles, Gyan Prakash considers how aspects of the modern Indian state, particularly its Constitution, enabled a demagogic takeover. In asking fundamental questions about the relation between state and society in India and exploring its many fault lines, these books cut through the boosterism that generally occludes the “world’s largest democracy.” Rather than viewing the Emergency as an aberration, they present it as a logical outcome of certain social and political tendencies of independent India.

Read the full article in The Baffler.


.

Toward a global history of white supremacy
Daniel Geary, Camilla Schofield & Jennifer Sutton, Boston Review, 16 October 2020

Because white nationalists are primarily concerned with the racial integrity of states, they have wrongly been assumed to be parochial in their politics, focused solely on domestic issues. In fact, transnational ties and transnational flows of culture and capital have long undergirded the pursuit of white racial nationalism. The success of Brexit, for example, emboldened Trump’s nativist supporters to see themselves as part of a global movement that could achieve power in the United States. Trump’s victory in turn inspired the Christchurch killer, who praised the U.S. president as a “symbol of renewed white identity and common purpose.” We need to understand the history of these connections if we are to grasp what has sustained white nationalism despite global trends toward liberation and equality.

White nationalism is an ideology that asserts national identity and belonging in terms of European descent. Accordingly, white nationalists see their countries as threatened by immigration and social advancement by non-whites. They contend that national identity and belonging must be built around racial whiteness—rather than culture, language, or place—and that it is the whiteness of the nation’s past, present, and future that ensures its continued historical development and survival. The fundamental ideas of white nationalists are hardly new, yet they have taken on new formulations since the mid-twentieth century as a politics of reaction to the promise of racial equality and decolonization. Though the numbers of self-identified white nationalists remain small, their ideas resonate broadly, impacting contemporary debates about global demographic change, national identity, and mass migration.

The shift of white nationalist politics from center to ostensible periphery is a relatively recent phenomenon. At the British Empire’s zenith, its apologists claimed that the rule of law, free trade, and parliamentary sovereignty were natural virtues of the “English race.” At the turn of the twentieth century, U.S. elites shared with British imperialists a discourse of English racial heritage termed Anglo-Saxonism that was used to justify the subjugation of Native Americans, the subordination of African Americans, and the possession of the United States’ own overseas empire. According to Anglo-Saxonism, white, Protestant, English-speaking men naturally made modern nations. This racialized modernity is based on the presumption that only whites can govern and that the empowerment of non-whites is therefore an existential threat to white self-government.

Anglo-Saxonism’s cherished ideal of a white man’s country reserving self-government and economic opportunity to whites may no longer be as dominant as it was a century ago, but neither has it disappeared. Popular historian Niall Ferguson still maintains that British colonial settler culture brought “modernity” to the world. Today some Brexiteers look to trade within an “Anglosphere” to reanimate this historical political tradition and harness racialized notions of kith and kin in the English-speaking world. Indeed, nostalgia for a past period of national glory in which white rule was unchallenged is a signature feature of today’s right-wing populists who seek to make their nations great again.

Read the full article in the Boston Review.


.

The subjective turn
Jon Stewart, Aeon, 2 November 2020

Hegel claims that, while the emergence of subjectivity in the ancient and medieval world was a liberating development, in the modern world the pendulum has swung too far to the opposite extreme. Beginning with the Renaissance and the Reformation, there has been an ever greater recognition of the value and importance of the subjectivity of the individual. This has produced ideas such as Luther’s notion that religious faith is a matter for individuals to decide on their own, or the Enlightenment idea that individuals are in possession of universal, God-given human rights. With the Romantic movement in the 19th century, the celebration of individuality accelerated with ideas such as the cult of genius, life as art, free love, and the rejection of bourgeois values. Elements of these ideas can also be found in the cultural movement of existentialism in the 20th century, which seemed in some cases to deny the truth of any external objective sphere and to insist on the absolute spontaneous freedom of the individual.

This development has culminated in the Western culture of the 21st century, which is sometimes characterised as an age of self-indulgence and narcissism, where we are all individual atoms pursuing our own private goals and ideas, with no regard to anything outside us. What was originally the emergence of subjectivity against the background of tradition has now become the dominance of subjectivity against the tattered remains of tradition and, indeed, any conception of an external truth.

Today, we dedicate much of our lives to developing and asserting some sense of personal self-identity that is identifiable and separable from that of others. People have become increasingly creative in the ways in which this is done. The obsession today with creating a profile for oneself on social media has often been cited as an example of the narcissism of the modern age. It lends itself to an exaggeration of the importance of one’s activities and accomplishments and tends to tune out anything in the external world, such as one’s failures or shortcomings, that doesn’t fit with the narrative one wants to tell about oneself. In all of this we see seemingly desperate attempts to create a fictional persona for ourselves, one different from that of others. Independent of any actual facts, people can become authors of their own stories – true or imaginary – that they can tell as they wish.

Read the full article in Aeon.


.

Noam Chomsky and the left: Allies or strange bedfellows?

Anjan Basu, The Wire, 7 December 2020

Why does Chomsky always seem to ‘stand at a slight angle to the universe’ of given wisdom (to borrow E.M. Forster’s memorable phrase) in everything he does? The answer is not far to seek, though it has several components. One, Chomsky always insists on thinking everything through to the end. No halfway house for him in anything he ventures on, no ceding of ground to high rhetoric, or to radical impetuosity.

As early as his 1967 essay On Resistance, when civil society mobilisation in the US against the Vietnam War was nearing its peak with Chomsky fully committed to the protest movement, he calmly surveys the options available to war resisters, weighing the pros and cons of each option down even to what look like minutiae, and stressing the likely efficacy or otherwise of every possible kind of disruptive action open to the movement. And he repeatedly cautions potential resisters against embracing spectacularly heroic, but in effect unavailing, options. This brings us to the second important ingredient of Chomsky’s thought process: the optics of social action hardly ever appeal to him; appearances mean next to nothing to Chomsky. He is focussed narrowly on the cost that a mode of resistance will likely entail for the government, and he is not prepared to worry about the acceptability of that method in the eyes of non-participants. Next, individual choice is always for him the cardinal principle. Even though the cause of the war resisters was a just and humane one, each individual participant involved in the collective action needed to be allowed absolute freedom to choose what method, if any, suited him best.

We must not, I believe, thoughtlessly urge others to commit civil disobedience, and we must be careful not to construct situations in which young people will find themselves induced, perhaps in violation of their basic convictions, to commit civil disobedience. Resistance must be freely undertaken. 

Finally, Chomsky never loses sight of the overarching moral principle:

Resistance is in part a moral responsibility, in part a tactic to affect government policy. In particular, with respect to support for draft resistance, I feel that it is a moral responsibility that cannot be shirked.

The moral underpinning of Chomsky’s attitude to social action is not by any means a philosophical construct, however. It is alive, pulsating with a sense of community that he believes encompasses all the participants in the action:

I also hope, more sincerely than I know how to say, that it (i.e., war resistance) will create bonds of friendship and mutual trust that will support and strengthen those who are sure to suffer.

This moral sense, grounded in an astute recognition of the potentialities and limitations of social action, is the pivot around which Chomsky’s activism has turned for all of his adult life. The individual is quite as important in his scheme of things as the community, the means as salient as the goal towards which they strive; and again, clear-eyed realism is no less vital to his programme than ideological integrity. In other words, he is free from all traces of dogma. And I think it is Chomsky’s abhorrence of all dogmas that the institutional Left finds hard to come to terms with.

Read the full article in The Wire.


.
Yassir Mahmoud El Haj holding a picture of his family, all but one killed in Israeli bombardment (photo: Jehad al-Saftawi/NYR)

The Gaza I grew up in
Jehad al-Saftawi, New York Review, 18 November 2020

My name is Jehad al-Saftawi. I am a photographer and journalist. For years, I clung to the idea of fleeing my country for the Western world. There is no free press in Gaza. Most of the news channels cater to political parties that use violence to silence opposition. I come from a place overflowing with weapons, where my father could easily buy a pistol and shoot it into the air while cruising the streets of our city. A place where, on any night, you could be awoken by a bomb exploding in your neighbor’s home, stored there by a member of their family who belonged to an armed faction.

Working as a journalist in Gaza is like walking barefoot in a field of thorns. You must always watch where you step. Each neighborhood comprises its own intimate social network, and traveling through them with a camera makes you a significant cause for suspicion. You’re caught between the two sides of the conflict: the rulers of Gaza limit what you can photograph and write about, imprisoning and torturing those who disobey; at the same time, the Israeli army sees you as a potential threat that must be eliminated, as has been the fate of many Palestinian journalists. Standing behind the camera, my hands shook as I documented the suffering.

I am the second son of five children. Our father, Imad al-Saftawi, grew up in an ultraconservative middle-class family that was heavily influenced by the Muslim Brotherhood. As an adult, he spent many years participating in armed struggles, both within and outside the framework of Palestinian armed organizations, which he believed to be justifiable resistance to the Israeli occupation. As a member of one of the leading armed factions in Gaza, Islamic Jihad, he killed innocent Israelis.

I condemn these actions, though many in Gaza consider my father a hero, one who carried out valiant operations for the sake of his country and religion. In the late 1990s, when I was a young child, our father’s day job was with the Ministry of Awqaf and Religious Affairs, which meant, in practical terms, he worked in the management of mosques. On top of his professional duties, he acted as the khateeb (orator) on Fridays in various mosques around the Gaza Strip, where he would lecture about religion. My mother was a housewife, overseeing our education and raising us according to our father’s methods and rules.

In 2000, when I was nine, our father was arrested by the Israeli army at the Rafah border crossing between the Gaza Strip and Egypt. He remained in prison for the next eighteen years. His influence over our family did not relent: from prison, he frequently telephoned our house and enforced religious and social strictures upon us, his children, threatening us in the event of noncompliance. Hadith (the Prophet’s quotes) lined our walls. Islamic books filled the shelves, along with animal statues my father had broken the heads off of in accordance with the Islamic rule prohibiting the portrayal and embodiment of spirits.

Read the full article in the New York Review.


.

In U.S. and UK, globalization leaves some feeling ‘left behind’ or ‘swept up’

Laura Silver, Shannon Schumacher & Mara Mordecai, Pew Research Center, 5 October 2020

The question “What is globalization?” was not easy for focus group participants to answer. Definitions were wide-ranging, touching on economic changes such as the rising influence of multinational corporations and the role of international trade; international organizations like the United Nations; immigration and the movement of people; and amorphous concepts like the exchange of ideas and cultures. As a further indication of how challenged participants were by the task of defining globalization, some offered a response, only to hurriedly seek confirmation from the moderator that their answer was “correct.”

Though participants were often unsure of how to define globalization, key themes did emerge. These centered on economics and trade, the global balance of power, immigration and cultural exchange, technological advancement, and community.

And unlike technical definitions, illustrations of globalization came relatively easily to participants. They brought up the impacts of globalization on their daily lives, like the experience of calling customer service and reaching a call center in another country. People touted the ability to order goods from the other side of the world on Amazon and have them delivered the next day. Others brought up how immigration has shifted the fabric of their country for better or worse, or how openness to foreign ideas and customs was changing their country’s culture – again, for both better and worse.

When describing key changes in their local communities, participants did not always invoke “globalization.” Yet their stories often linked to broader illustrations of what constitutes globalization. This was particularly true when participants spoke about changes due to industrial shifts, automation and the growing influence of multinational corporations. All three were consistently described as negatively impacting local communities – in contrast to growing cultural diversity or improved communication technology, which were sometimes viewed favorably.

Industrial change, automation and the influence of multinationals were prime catalysts in stories of being left behind by globalization. Being left behind was often equated with job loss and shuttered businesses. Depending on the locale, participants described either industry-specific or general job losses. Focus group participants in Pittsburgh and Newcastle were particularly animated by stories of being left behind, describing how they or people they knew had lost jobs at coal mines, steel mills and other industrial facilities.

Read the full article in the Pew Research Center.


.

Amid the monument wars, a rally for “more history”
Jennifer Schuessler, New York Times, 28 September 2020

On Saturday, a group of about 30 mustered under drizzly skies at the edge of the battlefield at Gettysburg, Pa. The site of one of the bloodiest and most important battles of the Civil War, Gettysburg has seen its share of clashes over the memory of the war in recent years. But this group was there to make a stand of a different kind.

They carried signs with quotations from 19th-century newspapers, passages from the Confederacy’s constitution extolling slavery, and facts (some of them footnoted) about Robert E. Lee’s treatment of his human property. Some in the group wore T-shirts emblazoned with a social media-ready battle cry: #wewantmorehistory.

Scott Hancock, a professor of history at Gettysburg College, urged the group to be “polite” to anyone who challenged them and reminded them they were not at a protest — or not exactly.

“Our job is to do something a bit more constructive by telling a fuller story,” he said.

The group was part of a “Call to Action” organized by the Journal of the Civil War Era, a scholarly publication. For two hours on Saturday, at about a dozen Civil War-related sites across the country, from New York to Nashville to St. Louis, historians simultaneously gathered with signs highlighting distortions in existing plaques and memorials, or things that simply weren’t being spoken of at all.

The idea was to move beyond binary debates about problematic monuments — tear down or keep? — and instead emphasize the inaccuracies and omissions of the existing commemorative landscape, including the erasure of Black history.

“Historians have different views on taking down statues,” said Gregory Downs, a professor at the University of California, Davis, and one of the organizers. “But that debate doesn’t really capture what historians do, which is to bring more history.”

Read the full article in the New York Times.


.

For Camus, it was always personal
Robert Zaretsky, LA Review of Books, 20 September 2020

In her sharp and sympathetic foreword, Alice Kaplan observes that, for readers who know Camus only as the author of The Stranger, the “lush emotional intensity of these early essays and stories will come as a surprise.” Yet I confess that, even as someone who knew this side of Camus, I was surprised again, as I reread the pieces, by their Dionysian intensity.

The lyricism is especially bracing today, when even Dionysus would think twice before throwing a bacchanalia. For Camus, the lyrical sentiments were deeply rooted in the physical and human landscape of his native Algeria. They flowed from the childhood he spent in a poor neighborhood of Algiers, where he was raised by an illiterate and imperious grandmother and a deaf and mostly mute mother in a sagging two-story building, whose cockroach-infested stairwell led to a common latrine on the landing.

That latrine plays a pivotal role in his unfinished novel, The First Man (published posthumously in 1994). Wishing to keep the change he received after going to the store, Camus’s adolescent alter ego tells his grandmother that it had fallen into the latrine pit. Without a word, she rolls up a sleeve, goes to the hole, and digs for it. At that moment, Camus writes, “he understood it was not avarice that caused his grandmother to grope around in the excrement, but the terrible need that made two francs a significant amount in this home.”

Yet, like Sisyphus with his boulder, Camus claims his impoverished childhood as his own. In his preface to his first collection of essays, The Wrong Side and the Right Side (1937), Camus recalls that his family “lacked almost everything and envied practically nothing.” This is because, he explains, “poverty kept me from thinking all was well under the sun and in history.” Yet, at the same time, “the sun taught me that history was not everything.” Poverty was not a misfortune, he insists. Instead, it was “radiant with light.”

Radiance washes across the early essays, at times so fulsomely that it is hard to keep your head above the cascade of words. During an earlier visit to Tipasa, the sun-blasted pile of Roman rubble that overlooks the Mediterranean, Camus seems quite literally enthused — filled by the gods — as he goes pagan. This place, he announces, is inhabited by gods who “speak in the sun and the scent of absinthe leaves, in the silver armor of the sea, in the raw blue sky, the flower-covered ruins, and the great bubbles of light among the heaps of stone.” It is here, he declares, “I open my eyes and heart to the unbearable grandeur of this heat-soaked sky. It is not so easy to become what one is, to rediscover one’s deepest measure.”

Yes, it is surprising to think of the iconic black-and-white figure, wrapped in a trench coat and smoking a Gauloise, as the author of these words. It is more surprising, perhaps, to learn that before he wrote these words (or, for that matter, ever wore a trench coat), Camus had declaimed them while wandering with two friends through the Roman ruins. Yet this lyricism does burst through the austere prose of his novels, as when Meursault finds himself alone on the light-blasted beach with the “Arab” in The Stranger (1942) or when Rieux and Tarrou go for their nocturnal swim in The Plague (1947).

Read the full article in the LA Review of Books.


.

The Crown’s majestic untruths
Helen Lewis, Atlantic, 5 December 2020

The second, more difficult, question is what responsibility The Crown has to history.

Drama creates order out of chaos; the writer, and then the director, turn many possible pathways into one. Direct Hamlet and you need to decide who sees the ghost, whether Claudius really killed Old Hamlet, and when the prince is crazy versus when he’s just acting. Those choices affect our sympathy for the characters in front of us.

Peter Morgan has made similar editorial decisions. This isn’t Rashomon, a rare drama that allows competing versions of the truth to remain unresolved. Morgan’s Prince Charles is apportioned more blame than Diana for their doomed relationship, because he is older than her, and fully aware from the start that there will always be three people in their marriage. Several historical details are altered to support this characterization: According to the historian Hugo Vickers, the bracelet Charles gave Camilla was truly intended as a farewell gift, and it read GF (for “Girl Friday,” or invaluable assistant) rather than F&G, as The Crown depicts (for their pet names, Fred and Gladys). Most historians agree that Charles did not contact Camilla as often in the early years of his marriage as the show suggests.

These alterations show The Crown deliberately putting its thumb on the scale. Another version of the show was possible: Charles could just as easily have been gently excused from his sneak visits, his illicit phone calls, his evident longing for his first and only great love. Since he married Camilla in 2005, there has been not a whiff of scandal around their relationship, so an equally supportable reading of the 1980s is that he was a natural monogamist forced to marry the wrong woman. Like Princess Margaret in the show’s first season, Charles was instructed to deny his feelings in the service of an outdated notion of an “appropriate” royal relationship. Yet the show grants Diana a victimhood that is denied to him.

This has led to whispers that Morgan is pursuing a secret republican agenda. It’s a cute theory, but the key change in Season 4 is just as attributable to its shift in focus from the Queen (worst habit: telling people to buck up) to her eldest son (worst habit: reminding his wife she’s very much the silver medal). Elizabeth II has never talked about her opinions or her private life in anything more than platitudes, and there are no “sides” to take in the story of her 70-year marriage to Prince Philip. But Charles and Diana’s relationship ended in a hailstorm of furious briefings to journalists and ill-advised on-the-record interviews.

There is no neutral, universally accepted version of the events of the 1980s; that fracturing of consensus itself reflects Britain’s changing media climate.

So dramatists take sides. They also create meaning. And here is a vice that The Crown shares with horse-race election coverage: the subordination of facts to narrative.

Read the full article in the Atlantic.


.

This is not a burial, it’s a resurrection
Sekese Rasephei, New Frame, 4 December 2020

Lesotho is fondly known as the Kingdom in the Sky, as its mountainous terrain elevates it more than 1 000m above sea level. This small enclave of a country, completely landlocked by South Africa, has a population of about two million, making it one of the smallest states in the world. But out of these peaks has emerged a feature film, set in the world of its landscape and written and directed by one of its own. 

Lemohang Jeremiah Mosese’s This Is Not a Burial, It’s a Resurrection has made history as the country’s first official submission to the Best International Feature Film category at the upcoming Oscars.

In a country with no semblance of film culture, Mosese’s film getting this far is an impressive feat. But it is even more so when one considers that there is not a single cinema in Lesotho. To meet the Oscars’ eligibility criteria for nomination, the film is screening for a week in South Africa. Speaking to Film Comment magazine before the Sundance Film Festival in February, Mosese said: “Where I grew up, there was nobody who made films. So, you don’t have a reference of someone who can actually make you dream of making movies.”

This Is Not a Burial, It’s a Resurrection is a masterful tale of grief, belonging and defiance. It centres on 80-year-old Mosotho matriarch ’Mantoa, portrayed by the late South African actress Mary Twala Mhlongo in a towering performance.

The film focuses on ’Mantoa’s reaction to the impending doom her small community faces in the village of Nasareta. Constantly clad in black to signify her period of mourning as a widow, ’Mantoa is also afflicted by the recent deaths of her daughter and granddaughter. On the day her one remaining love – a son working in the mines of neighbouring South Africa – is meant to return, she awaits eagerly, ululating, only to be told that her son has also died.

Overcome with grief, ’Mantoa spends the rest of her days in her small hut, listening to obituaries on the radio, hoping to die. It is when the village leader announces that the residents of ’Mantoa’s village are going to be relocated to make way for a dam that she feels a renewed sense of purpose, even if the price is her life.

Mosese has crafted a poetic ode to cinema that is breathtaking in its beauty and gut-wrenching in its sombre themes. It is a traditional slow burner, fragmented in its narrative structure but anchored in part by veteran South African actor Jerry Mofokeng Wa Makhetha as the brooding lesiba player whose ghoulish and mythic narration holds the story together.

Cryptic and singular in shape, Mosese’s direction offers a textured picture, expertly shot in a 4:3 aspect ratio whose limited width lets the film show the beautiful landscapes of the Lesotho highlands without taking too much away from the focal point of the story. And the cinematography, helmed by Pierre de Villiers, dazzles in its play of light, landscapes and close-ups of faces.

Read the full article in New Frame.
