The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
Ghosts in the land
Adam Shatz, London Review of Books, 3 June 2021
The territory it governs lies in ruins, but Hamas has reason to celebrate. While 90 per cent of its rockets were repelled by Iron Dome, Israel’s defence system, 100 per cent hit their other target: the Palestinian Authority, which looks even more impotent than usual. Hamas’s performance in the war has not only raised its prestige among Palestinians; it has made them forget for the moment its mismanagement and authoritarian rule inside Gaza. If the PA held an election, Hamas would almost certainly win, which may be the real reason that, in late April, President Mahmoud Abbas indefinitely postponed the legislative election scheduled for 22 May.
Privately, Netanyahu and the Israeli army have always had an interest in keeping Hamas in power in Gaza. Israel allowed the movement to flourish in its early years as a counterweight to the secular nationalists of the PLO. Hamas’s rule in Gaza kept the Palestinians divided, and Palestinian political fragmentation has always been a key Israeli objective. Several Israeli pundits have suggested that Netanyahu deliberately provoked Hamas in order to prevent his opponents from establishing a governing coalition. Israel has had four elections in two years, and if he fails to hold on to power, he could face corruption charges and a prison term. In the lead-up to Hamas’s rocket barrage, he pursued a series of flagrantly reckless policies: closing off the plaza outside the Damascus Gate during Eid – Muslim families gather there to celebrate the end of the fast – and violently raiding the prayer rooms of the mosque itself. Oppression alone seldom detonates revolt; humiliation – what the Algerians call hogra – is also necessary. Netanyahu supplied it in abundance. As soon as Hamas responded to the provocations in Jerusalem, the right-wing politician Naftali Bennett, of the Orthodox Zionist party Jewish Home, pulled out of talks to form an anti-Netanyahu coalition. Yair Lapid, another Netanyahu opponent (centrist by Israeli standards, right-wing by any other), praised the military campaign. On Israeli television, Ariel Sharon’s son explained what he considered the appropriate response to rocket fire from Gaza: ‘You strangle them. No water, no electricity, no food, no gas, no medical treatments. Nothing.’ Ayman Abu al-Ouf, the head of the Covid-19 response team at Gaza’s largest hospital, was among the victims.
The war was more of a gift to Hamas than Netanyahu bargained for, however. He failed to consider that Hamas stood to benefit, not least because of its declining reputation in Gaza and the political confinement imposed by the blockade, now into its fourteenth year. Hamas knew that the Palestinian Authority – weakened and humiliated by Israel – could do nothing about the expulsions in Sheikh Jarrah or the attack on al-Aqsa. Always more skilled at mobilising than building, Hamas not only exploited the leadership vacuum, it neutralised the blockade by demonstrating that it would not, and didn’t have to, stand by while Palestinians suffered aggression in Jerusalem. The result has been symbolically to unify Palestinians from the river to the sea, as the chant goes, and across the vast Palestinian diaspora. For the first time in Israel’s history, its security forces found themselves simultaneously engaged in the Gaza Strip, in East Jerusalem, in the occupied West Bank and – most troubling of all for Netanyahu – in Israel’s so-called ‘mixed cities’, where well-organised Jewish militias attacked Palestinian citizens, who in turn attacked Jews and set fire to synagogues. The attacks by Jewish mobs inside Israel, often with the police standing by, have stirred painful memories among Palestinian citizens of the killing of thirteen protesters in October 2000, and intensified already acute feelings of disenfranchisement and discrimination. These are grievances that a ceasefire between Israel and Hamas is powerless to address, any more than it can address the root causes of the war.
Read the full article in the London Review of Books.
Israel is bleeding from within. Restoring the peace shouldn’t mean a return to the way we lived before
Ayelet Gundar-Goshen, Time, 17 May 2021
The Arab minority, many of whom prefer to be called Palestinian citizens of Israel, constitutes 21.1% of the Israeli population, and yet over half of the poorest families in Israel are Arab families, and Arab municipalities are the poorest municipalities within Israel. According to recent research, about 40% of Arab men aged 18–35 are NEET (not in education, employment or training). The percentage of Arab citizens who graduated from Israeli universities during 2017 was 9.7%, compared to 44.9% among Jewish males. Since the riots started, the Jewish citizens of Israel have been mourning the loss of co-existence with our Arab neighbors. But for many Arab citizens of the country, this was already an imaginary co-existence.
I believe most middle-class Israelis will have to agree that when we say “we interact a lot with Arabs” it usually means that we meet the Arab citizens when they come to clean our house as we go to work, serve us hummus as we dine at a restaurant, or change our sheets when we go on vacation. Jewish towns like Afula did everything in their power to prevent Arabs from entering public parks and swimming pools. The Israeli Supreme Court had to intervene in order to stop discrimination against Arabs applying for jobs. And as much as I cherish the solidarity between Arab and Jewish staff members in my ward, I can’t ignore the fact that all of the doctors and the executives are Jews, while the cleaners and nurses/medical workers are mostly Arabs.
During Netanyahu’s regime, discriminatory laws were passed against the Arab minority. The “nation-state law” downgraded Arabic from an official language and imposed other measures that officially rendered Arab-Israelis second-class citizens, alienating us even more from one another. The “Kaminitz Law” is used to prevent Arab citizens from illegal building—a common practice in the absence of a legal alternative, as the Israeli government refuses to allow Arab municipalities to expand. “We’re feeling under siege,” say my friends from kibbutzim in the Galilee and the Negev, after their Arab neighbors threw stones at their cars and torched tires at the kibbutz entrance. Indeed, we all must condemn any sort of violence. But when you look at the living conditions of the Bedouins in the Negev—no electricity, no running water—and compare them to the luxurious villas in the nearby kibbutz, you cannot be surprised by the amount of hate gathered inside.
The violence against fellow citizens in our streets is heartbreaking, and must be stopped. But violence isn’t only burning tires and rioting. Violence is also the variety of means taken by Israeli governments to control and exclude the Arab minority. For some of the Jewish majority, “restoring the peace” means that Jews will return to their comfortable lives, while Arabs continue to suffer from poverty and discrimination. Due to our traumatic history as a nation, many Jewish-Israelis are blind to this discrimination—after years of being persecuted, the Jewish mentality has a strong sense of collective victimhood. The current violence reinforces this sense of victimhood, with more victims added to the list—10 Israeli civilians dead. In the Palestinian territories, more than 217 casualties, among them 34 women and 58 children.
Read the full article in Time.
Teshuvah: A Jewish case for Palestinian refugee return
Peter Beinart, Jewish Currents, 11 May 2021
In Jewish discourse, this refusal to forget the past—or accept its verdict—evokes deep pride. The late philosopher Isaiah Berlin once boasted that Jews “have longer memories” than other peoples. And in the late 19th century, Zionists harnessed this long collective memory to create a movement for return to a territory most Jews had never seen. “After being forcibly exiled from their land, the people kept faith with it throughout their Dispersion,” proclaims Israel’s Declaration of Independence. The State of Israel constitutes “the realization” of this “age-old dream.”
Why is dreaming of return laudable for Jews but pathological for Palestinians? Asking the question does not imply that the two dreams are symmetrical. The Palestinian families that mourn Jaffa or Safed lived there recently and remember intimate details about their lost homes. They experienced dispossession from Israel-Palestine. The Jews who for centuries afflicted themselves on Tisha B’Av, or created the Zionist movement, only imagined it. “You never stopped dreaming,” the Palestinian poet Mahmoud Darwish once told an Israeli interviewer. “But your dream was farther away in time and place . . . I have been an exile for only 50 years. My dream is vivid, fresh.” Darwish noted another crucial difference between the Jewish and Palestinian dispersions: “You created our exile, we didn’t create your exile.”
Still, despite these differences, many prominent Palestinians—from Darwish to Edward Said to law professor George Bisharat to former Knesset member Talab al-Sana—have alluded to the bitter irony of Jews telling another people to give up on their homeland and assimilate in foreign lands. We, of all people, should understand how insulting that demand is. Jewish leaders keep insisting that, to achieve peace, Palestinians must forget the Nakba, the catastrophe they endured in 1948. But it is more accurate to say that peace will come when Jews remember. The better we remember why Palestinians left, the better we will understand why they deserve the chance to return.
Even for many Jews passionately opposed to Israeli policies in the West Bank and Gaza Strip, supporting Palestinian refugee return remains taboo. But, morally, this distinction makes little sense. If it is wrong to hold Palestinians as non-citizens under military law, and wrong to impose a blockade that denies them the necessities of life, it is surely also wrong to expel them and prevent them from returning home. For decades, liberal Jews have parried this moral argument with a pragmatic one: Palestinian refugees should return only to the West Bank and Gaza, regardless of whether that is where they are from, as part of a two-state solution that gives both Palestinians and Jews a country of their own. But with every passing year, as Israel further entrenches its control over all the land between the Jordan River and the Mediterranean Sea, this supposedly realistic alternative grows more detached from reality. There will be no viable, sovereign, Palestinian state to which refugees can go. What remains of the case against Palestinian refugee return is a series of historical and legal arguments, peddled by Israeli and American Jewish leaders, about why Palestinians deserved their expulsion and have no right to remedy it now. These arguments are not only unconvincing but deeply ironic, since they ask Palestinians to repudiate the very principles of intergenerational memory and historical restitution that Jews hold sacred. If Palestinians have no right to return to their homeland, neither do we.
Read the full article in Jewish Currents.
The violence that began at Jerusalem’s ancient holy sites is driven by a distinctly modern zeal
Yair Wallach, Guardian, 13 May 2021
The modern Jewish national movement, calling for a return to Zion, wanted to reclaim the wall. From the early 20th century, Zionist leaders called to “redeem” it by purchasing the houses in its vicinity and paving a plaza for worshippers. They sought to transform it into a monument of national revival. But the wall itself, as a remnant of the destroyed Temple’s compound, was a symbol of ruin, and nothing could change that fact. For Judaism, the wall was a constant reminder of God’s exile – an exile that the modern Zionist promise to “ingather the Jewish Diasporas” could not overcome. This simple and insurmountable contradiction has never ceased to haunt the Zionist engagement with the wall.
This ambivalence was noticeable in early Zionist attitudes. The wall was largely absent from early Zionist iconography, and appeared (if at all) as a metaphor for destruction, contrasted with symbols of Zionist revival such as the agricultural colonies. The Labour-dominated Zionist movement sought to harness Jewish religious symbols in favour of secular nationalism, but was strongly opposed to ideas of the reconstruction of the Temple. So much so that, as historian Hillel Cohen revealed, in 1931 the Zionist Hagana militia murdered a Jew who planned to blow up the Islamic sites of the Haram.
After the Israeli occupation of East Jerusalem in 1967, Israeli officials were in direct control of the holy sites. They pledged to maintain the status quo on the Haram, which remained under effective Palestinian Muslim control. When it came to the Western Wall, the desire to make the site into a national Jewish monument was finally achieved. Within days, the Mughrabi Quarter, a medieval neighbourhood that stood next to the wall, was entirely depopulated and razed to the ground to make room for a huge plaza. From a hidden wall, seen only from close proximity, it became a monumental stage, used not only for prayer but also for state and military ceremonies.
But the transformation did not resolve the basic contradictions embedded in the wall, and indeed has only served to accentuate them. Now much more than before, the wall’s liminal position as a sharp border between Jews (below) and Muslims (above), between ruin (the wall) and redemption (the unattainable Temple Mount), was rendered visible. The wall remains a memorial of destruction, a site of absence, while the Muslim sites loom from above.
After 1967, the secular Labour movement lost its position as the Zionist vanguard. Religious settlers claimed the language of Zionism as they spearheaded the colonisation of the occupied territories. The secular-Zionist project of “normalisation” – making Jews a territorial nation “like any other” – was overtaken from within by those who continued the colonising mission, but interpreted the biblical promise of the land literally as manifest destiny. In that context, the holy sites – now under Israeli control – assumed a new meaning, and became a new frontier. Some religious Zionists were no longer content with the Western Wall, given that the Temple Mount was within reach.
In the 1980s, there were two attempts by Jewish militant groups to blow up the Islamic sites on the Haram. Since then, the Temple Mount Faithful, calling for Israel to assert Jewish control of the Haram, has grown from a tiny fringe group to a movement with political backing. The Temple Institute in the Old City, funded partly by the Israeli government, produces ritual objects for the Temple, in anticipation of its reconstruction, while performances of simulated ritual sacrifices by priests in white robes are held annually before Passover, in close proximity to the Haram al-Sharif. Such practices represent no less than a reinvention of Judaism – given that it has been shaped for 2,000 years by the Temple’s destruction.
Read the full article in the Guardian.
Taking liberties: Covid-19 and the anatomy of a constitutional catastrophe
Adam Wagner, Prospect, 26 March 2021
As the real threat of the virus—finally, hopefully—begins to recede, can we go back to normal? In principle, of course we can. In practice, the first question is whether a virus that has never yet listened to politicians’ pronouncements will mutate in vaccine-resistant ways.
But at some point, presumably, science will get it under control. Will we then permanently restore the tenets of our liberal democracy? Or might we find that the authorities have developed a taste for their mighty new powers? Will politicians, having discovered the ease with which they can accrue more, make it a habit? And could “extraordinary” threats become a rather ordinary occurrence?
If these questions strike you as paranoid, remember that governing is always difficult. Even in routine situations, politicians and officials have to strike fraught balances between competing objectives while trying to do a lot of complicated things at speed. Anything that promises to make their life more convenient is tempting: just look at the sweeping surveillance powers over our communications and internet use that were revealed by Edward Snowden in 2013; it transpired that the privacy of the citizen simply did not get a look in. For people tasked with surveilling security threats, a “just in case it comes in handy” approach to state powers is understandable. But theirs is not the only point of view that should be taken into account.
Pandemic law-making will have left an indelible stain on the rule of law and on the relationship between citizen and the state. The public has been left confused; the police have sometimes acted arbitrarily, and been shamed by recent events; and throughout, a radical contempt has been shown for parliamentary democracy.
As the government endlessly iterated and experimented, it has shown a worrying lack of concern for explaining exactly what it is doing and why. While most of the time it has seemed motivated by the sincere hope of containing the virus, this has not been entirely consistent; recall the 10pm curfew on pubs, which made for easy headlines but was seemingly unconnected to any serious scientific advice. The effective total ban on protest seems to have emerged not from scientific advice but the home secretary’s private instruction. Under government without explanation, an awful lot has to be taken on trust.
We have seen how fast the most basic of so-called constitutional “norms” can be jettisoned in a crisis. Come the next great threat, we have set the marker: parliament will not have a say and will hardly raise a whimper; decisions will be made on a whim by whichever person, however capricious, happens to be behind a particular ministerial desk. And remember, while the next crisis could be another virus that threatens the community as a whole and allows for a reasonably unifying political response, it could also be something very different, like a terrorist attack. In the cases of the Troubles in Northern Ireland and the “war on terror,” we saw that the response sometimes divided communities. In such circumstances it is especially important that everyone should be entitled to a voice, but nobody can have confidence that they will be heard if decisions are taken without discussion or debate.
Read the full article in Prospect.
How Amazon exploited a weakened America
Sarah Leonard, TNR, 2 April 2021
Bezos has spoken with admiration about the founder of Walmart, and Amazon has followed in its tradition of allowing government welfare programs to make up for the unlivably low compensation it offers. Because of low wages, Walmart’s employees have infamously subsisted on food stamps. A 2018 report found that one in 10 Amazon employees in Ohio was on food stamps. There are the millions that states give in explicit subsidies, and then there are these hidden ones.
Only places wealthier than Dayton have been able to refuse Amazon’s demands. In 2019, New Yorkers famously rejected the package of incentives negotiated by the governor and mayor in secret in exchange for Amazon locating its headquarters in Queens. Amazon may have finally pulled out of the deal in order to continue union busting. During discussions, Governor Andrew Cuomo brokered a meeting between Amazon executives and labor leaders, who wanted assurances that, if the company expanded in New York, it would not aggressively discourage employees from organizing. The next day, Jay Carney, the senior vice president for global corporate affairs at Amazon, called to say the deal was officially off.
It’s at the national level where Amazon has been making some of its most astonishing gains, and this is where Carney really shines as a smarmy supervillain of the inequality era. Carney was a journalist for 20 years before going to work in the Obama administration, eventually becoming press secretary. He went from government to Amazon, making millions smoothing the way for the corporate behemoth that crushes unions and drains local government. If average Americans lack faith in the civic virtues of those in government, they need look no further than the revolving door between government and business and people like Carney, who profit by passing through it. If many are skeptical that the Democratic Party stands with the worker and against big business, well, they need only look to the tech industry to have their fears confirmed a hundred times over. (There’s little difference between Democrats and Republicans in this book, because both parties have close relationships with the company.) In 2020, Amazon set a lobbying record in the second quarter by spending $4.38 million. The federal government’s billion-dollar procurement and web hosting contracts may be Amazon’s real holy grail. It has made major gains in federal, state, and local procurement, along the way destroying small businesses accustomed to filling those orders.
In other words, America’s inequality has been Amazon’s gain: It exploits the desperation of small cities and towns, to sponge off their meager infrastructure and battered workforces, and floods the sparkling centers of influence with cash.
Read the full article in TNR.
The liberals who can’t quit lockdown
Emma Green, Atlantic, 4 May 2021
Last year, when the pandemic was raging and scientists and public-health officials were still trying to understand how the virus spread, extreme care was warranted. People all over the country made enormous sacrifices—rescheduling weddings, missing funerals, canceling graduations, avoiding the family members they love—to protect others. Some conservatives refused to wear masks or stay home, because of skepticism about the severity of the disease or a refusal to give up their freedoms. But this is a different story, about progressives who stressed the scientific evidence, and then veered away from it.
For many progressives, extreme vigilance was in part about opposing Donald Trump. Some of this reaction was born of deeply felt frustration with how he handled the pandemic. It could also be knee-jerk. “If he said, ‘Keep schools open,’ then, well, we’re going to do everything in our power to keep schools closed,” Monica Gandhi, a professor of medicine at UC San Francisco, told me. Gandhi describes herself as “left of left,” but has alienated some of her ideological peers because she has advocated for policies such as reopening schools and establishing a clear timeline for the end of mask mandates. “We went the other way, in an extreme way, against Trump’s politicization,” Gandhi said. Geography and personality may have also contributed to progressives’ caution: Some of the most liberal parts of the country are places where the pandemic hit especially hard, and Hetherington found that the very liberal participants in his survey tended to be the most neurotic.
The spring of 2021 is different from the spring of 2020, though. Scientists know a lot more about how COVID-19 spreads—and how it doesn’t. Public-health advice is shifting. But some progressives have not updated their behavior based on the new information. And in their eagerness to protect themselves and others, they may be underestimating other costs. Being extra careful about COVID-19 is (mostly) harmless when it’s limited to wiping down your groceries with Lysol wipes and wearing a mask in places where you’re unlikely to spread the coronavirus, such as on a hiking trail. But vigilance can have unintended consequences when it imposes on other people’s lives. Even as scientific knowledge of COVID-19 has increased, some progressives have continued to embrace policies and behaviors that aren’t supported by evidence, such as banning access to playgrounds, closing beaches, and refusing to reopen schools for in-person learning.
“Those who are vaccinated on the left seem to think overcaution now is the way to go, which is making people on the right question the effectiveness of the vaccines,” Gandhi told me. Public figures and policy makers who try to dictate others’ behavior without any scientific justification for doing so erode trust in public health and make people less willing to take useful precautions. The marginal gains of staying shut down might not justify the potential backlash.
Read the full article in the Atlantic.
1619, 1776, and us
Cathy Young, The Bulwark, 3 March 2021
The disputes about specific facts point to a more fundamental question about the nature of the American experiment—and of modern-day America. Was the United States born as a flawed democracy that failed to extend its ideals of liberty and equality to black Americans, or as a faux democracy whose ideals of liberty and equality were merely a smokescreen for the enslavement and oppression of blacks? Did black Americans fight for their freedom and human rights almost entirely on their own, or in a multiracial alliance in which white abolitionists and civil rights advocates played a key role? Are America’s enduring racial problems and inequities primarily the result of a legacy of past oppression and complicated societal dynamics, or of a white supremacy that remains “baked into” our institutions by design?
These are incredibly complicated questions. And yet consistently choosing the second answer, which is the core narrative of the 1619 Project, not only encourages a crude monocausal interpretation of complex social problems but also leads to a bleak outlook that delegitimizes the American political system and enshrines racial balkanization.
Of course, the 1776 Report is a good example of how not to make a “flawed democracy” argument. The section that deals with the painful question of the Founders’ slaveholding, for instance, falls back on the but-they-personally-opposed-it cliché, asserting that George Washington and Thomas Jefferson wanted to end slavery but had to accept the political compromises necessary to hold the republic together. The reality, as the work of recent historians shows, is far messier…
But where the 1776 Report absolves too much, the 1619 Project (and other literature in the same vein) demonizes. “On the issue of American slavery, I am an absolutist: enslavers were amoral monsters,” wrote New York Times opinion columnist Charles Blow last June, in a piece about monument removal with the self-explanatory title, “Yes, Even George Washington.” Blow dismisses the view that slaveholders were “simply men and women of their age”: “There were also men and women of the time who found slavery morally reprehensible. The enslavers ignored all this and used anti-black dehumanization to justify the holding of slaves and the profiting from slave labor.” But both Washington and Jefferson did condemn slavery as morally reprehensible, and the denunciation of the slave trade that Jefferson originally wrote for the Declaration of Independence stressed that it was traffic in “MEN,” with capitals for emphasis. Even Jefferson’s Notes on the State of Virginia, written in the early 1780s and containing some appallingly racist passages, deplored slavery as evil and expressed hope for “total emancipation.”
Read the full article in The Bulwark.
Welcome to Germany
Thomas Rogers, New York Review, 29 April 2021
Much of the international coverage of the refugee situation in Germany has focused not on the experiences of the migrants themselves but on the simultaneous rise of the far right in the country. The influx, many have argued, has been a direct cause of this development. Far-right extremism has indeed surged. In 2015 the official number of attacks on refugee housing registered by the Federal Criminal Police Office increased fivefold over the previous year’s, to more than one thousand. There have also been a number of high-profile attacks on pro-refugee politicians, including a knife attack on a Cologne mayoral candidate that severed her windpipe and the murder in 2019 of Walter Lübcke, a regional politician in Hesse, by a far-right extremist.
It’s true that the influx helped fuel the rapid growth of the Alternative for Germany (AfD), which was founded as a Euro-skeptic party in 2013 but drifted further toward the anti-immigrant far right after 2015. Its politicians have raced to outdo one another in venal, attention-grabbing rhetoric. In 2016 Beatrix von Storch, the party’s deputy chairperson, wrote on Facebook that German border guards should shoot women and children trying to illegally cross the border into the country. (She later backtracked, saying that only the women should be shot.) The party garnered a surprising 13 percent of the vote in the next year’s federal election, making it the biggest opposition party and the first far-right party to sit in the German parliament since 1961.
But the media’s focus on the AfD has overshadowed the continued support for migrants among ordinary Germans. Although the Willkommenskultur has faded from public view, it has not disappeared. A study by the Family Ministry from 2018 showed that one in five Germans helped the refugees in some capacity. Silke Radosh-Hinder, a pastor working with the Refugee Church, a religious organization helping migrants in Berlin, recently told me that many volunteers remained as committed as during the height of the influx: “I still know an incredible number of people who support and mentor refugees, and what I think has ebbed is the put-on aspect, where people want to be publicly recognized for helping people.”
In his wide-ranging 2019 history of migration in Germany, Das Neue Wir (The New Us), the German historian Jan Plamper argued that the past decade has been unfairly defined in the public eye by the rise of the far right: “In truth, during the 2010s, politics overall has grown polarized—both the left-wing and the right-wing extremes became stronger.” This development on the left, he writes, has coincided with the rise of a vocal and well-organized pro-refugee movement, largely organized by migrants themselves, that has laid the groundwork for a lasting social acceptance of migration. Willkommenskultur, he makes clear, was not a onetime event but part of a larger trend.
Read the full article in the New York Review.
The demise of “political blackness”
Ralph Leonard, UnHerd, 22 March 2021
But political blackness was born in a different era, one where “coloureds” — anyone who was not white — were treated as part of a de facto underclass in the UK. Institutional racism meant they occupied menial jobs, were denied decent housing and education and were excluded from many social spaces due to the “colour bar”.
At the time, the mainstream Left did little to change this state of affairs. Trade unions, for instance, were generally hostile to the cause of anti-racism. Political blackness, therefore, wasn’t simply a critique of Tory governments; it also attacked the Labour Party for being co-conspirators in institutional racism. “What Enoch Powell says today, the Conservative Party says tomorrow, and the Labour Party legislates on the day after,” was Sivanandan’s sardonic assessment. After Powell’s Rivers of Blood speech, as fears abounded of Kenyan Asians “swarming” into Britain, it was Harold Wilson’s Labour government that rushed through the racist 1968 Commonwealth Immigration Act. Where previously Commonwealth citizens had had unlimited entry to Britain, as British subjects, they were now barred.
In this context, it made sense for the descendants of Britain’s colonies to band together. In 1968, Jagmohan Joshi co-founded the Black People’s Alliance (BPA). Its founding congress was attended by various Caribbean and Asian militant groups — including the Pakistani Workers’ Association, the West Indian Standing Conference (WISC) and the Afro-Asian Liberation Front. The BPA took to the streets a year later, 8,000-strong, in a “march for dignity” to Downing Street, calling for the repeal of the Immigration Act. And over the next decade, it would, along with other groups, continue to play an important role in protesting police brutality and defending ethnic minority communities from far-Right attacks.
But political blackness is now a fossil. Today it’s widely accepted that it flattens out the differences between blacks and South Asians into an abstract non-white identity. For Kehinde Andrews, a modern-day black nationalist, blackness is the sacred unifying glue of the “African diaspora”. Turning it into a political umbrella, he claims, “erases the history of political organising based around shared African ancestry” in Britain.
Moreover, discussions surrounding race in Britain have been further fudged by the super-diversity that has resulted from recent decades of mass migration from Eastern Europe, Africa and the Middle East. Eastern European migrants, for example, have been the frequent object of xenophobia in the past couple of decades, despite being “white”. At the same time, the emergence of identity politics in recent years has had the effect of breaking up what was once a unified “black struggle” into ethnic and religious fragments.
Read the full article in UnHerd.
The disintegration of the ACLU
James Kirchick, Tablet, 31 March 2021
No one embodied that late-20th-century cultural archetype of the fiercely outspoken, intellectual, principled, and Jewish ACLU activist more than Ira Glasser. From his appointment as national executive director in 1978 until his retirement in 2001, Glasser transformed the ACLU from a mom and pop outfit into a “nationwide civil liberties powerhouse,” broadening its mandate to include issues such as sexual orientation discrimination and abortion rights. Through his ubiquitous and spirited media appearances, Glasser became the face of civil liberties in America. When Vice President George H.W. Bush campaigned to succeed his boss in 1988—and spoke like a Connecticut blueblood’s idea of a Texas hayseed—he derided his opponent Michael Dukakis as a “card-carrying member of the ACLU.” It was guys like Glasser whom Bush was trying to conjure up in the minds of the voting public.
Glasser and his friends rarely encountered anybody who wasn’t Jewish, much less Black, within the 12-block neighborhood of East Flatbush that comprised the world of their childhood (New York City, he says in the film, was less the fabled “melting pot” of popular American sentiment than “a collection of insular segregated tribes”). But while listening to play-by-play announcer Red Barber’s report about the Dodgers’ road trip to St. Louis, then the southernmost city in the National League, they understood two important things: that Jackie Robinson was a god, and that the treatment he endured—the racist invective from Cardinals fans, the segregation that kept him from eating in the same restaurants or sleeping in the same hotels as his teammates—constituted a form of blasphemy.
A hatred of Jim Crow and a passion for civil rights developed from Glasser’s dedication to the Dodgers, as did a theory of the nature of sports fandom itself: If rooting for the Dodgers situated one on the right side of what was then the country’s central moral struggle, then cheering for the Yankees (the third to last team to hire a Black player) signified a belief in “oil depletion allowances.” Writing a quarter century after his beloved team abandoned Brooklyn for sunnier Los Angeles and Ebbets Field was razed to the ground, Glasser observed that “Dodger fans became egalitarians who would often be found working at the ACLU.”
Viewers may question the relevance of a documentary about a civil liberties veteran who retired over 20 years ago. The contours of contemporary debates surrounding issues of free speech have evolved as a result of technological advances, demographic changes, the presidency of Donald Trump, and other phenomena too numerous to cite, and a man like Glasser reliving the highlights of his storied career may strike some as indistinguishable from an aging Brooklyn Dodgers fan reminiscing about his long-gone team. The decision by directors Nico Perrino, Chris Maltby, and Aaron Reese to start their film with Glasser waxing nostalgic about Ebbets Field surely risks drawing that conclusion, but what makes this particular trip down memory lane meaningful is its illustration of how dramatically institutions like the ACLU have changed. As much as Mighty Ira is intended as a tribute to an individual life, it cannot help but also be a lament for the endangered values to which that life was dedicated. Ebbets Field may not be the only thing we’ve lost.
Read the full article in Tablet.
A mayday call, a dash across the Mediterranean… and 130 souls lost at sea
Emmanuel Chaze, Observer, 25 April 2021
As the storm raged and lashed around them and the boat was violently tossed from side to side, below deck the medical team went through inventory checks of supplies and first-aid drills to treat multiple casualties. “We knew we wouldn’t arrive until morning. If there would be any survivors they would have been in the water for hours. They would be freezing, seasick and have hypothermia,” says Tanguy Louppe, a former soldier and firefighter turned sea rescuer who now heads the search and rescue team on the Ocean Viking.
Yet the mood had changed onboard the Viking. Without immediate assistance, the deteriorating weather and the gathering darkness meant the boat would capsize or be torn apart. The Viking continued to power through the waves, but the storm made progress painfully slow. With every hour that went by, the chance of finding anyone alive slipped further away.
Louppe gathered the crew together and told them to prepare for a mass casualty plan. “We know we won’t be there until morning. We have to expect the worst,” he told them.
On the bridge, Albera was clinging on to hope. Three merchant boats had also responded to the mayday call. None of them would be able to carry out a rescue, but if they located the rubber dinghy they might be able to give it shelter until the Viking arrived.
At 5am on Thursday, the Viking finally reached the last-known location of the dinghy. With no sign of any help from the Italian or Libyan authorities, the three merchant vessels had coordinated their efforts to mount a search, and once again Albera called Frontex to request aerial support to assist.
For over six hours, the four ships scoured the waves for any sign of life. Then, at 12.24pm, one of the merchant vessels radioed to say that three people had been spotted in the water. Ten minutes later, Frontex announced that it had spotted the remains of a boat.
When Albera and her crew arrived, they found a scene of desolation: an open cemetery in an otherwise breathtakingly pretty, deep blue sea.
The rubber boat hadn’t stood a chance against the fury of the storm. The deck of the boat had disappeared. Only a few grey floating buoys remained. Around them, dozens of lifeless bodies floated in the waves. The Ocean Viking, with a team of trained rescuers and medics onboard, had arrived too late. Among the men, women and children they found in the water, there were no survivors.
Read the full article in the Observer.
4 years after an execution, a different man’s DNA is found on the murder weapon
Heather Murphy, New York Times, 7 May 2021
For 22 years, Ledell Lee maintained that he had been wrongly convicted of murder.
“My dying words will always be, as it has been, ‘I am an innocent man,’” he told the BBC in an interview published on April 19, 2017 — the day before officials in Arkansas administered the lethal injection.
Four years later, lawyers affiliated with the Innocence Project and the American Civil Liberties Union say DNA testing has revealed that genetic material on the murder weapon — which was never previously tested — in fact belongs to another man. In a highly unusual development for a case in which a person has already been convicted and executed, the new genetic profile has been uploaded to a national criminal database in an attempt to identify the mystery man.
Patricia Young, Mr. Lee’s sister, has been fighting for years to prove that it was not her brother who strangled and fatally bludgeoned the 26-year-old Debra Reese in Jacksonville, Ark., a suburb of Little Rock, in 1993.
“We are glad there is new evidence in the national DNA database and remain hopeful that there will be further information uncovered in the future,” Ms. Young said in a statement last week. In response to a lawsuit filed by Ms. Young in January, Jacksonville city officials released the bloody wooden club recovered from the victim’s bedroom, a bloody white shirt wrapped around the club and several other pieces of evidence for testing.
The Innocence Project and the A.C.L.U. had previously pushed for additional DNA testing, including on the eve of Mr. Lee’s execution. The request was denied. A federal judge rejected Mr. Lee’s request for a stay of the execution, saying that he had “simply delayed too long,” according to a complaint filed by Ms. Young.
Mr. Lee’s execution, on April 20, 2017, was the first in Arkansas in more than a decade. Some accused the state of rushing Mr. Lee and several other prisoners to their deaths that month before the expiration of its supply of a lethal injection drug.
At a news conference on Tuesday, Gov. Asa Hutchinson defended Mr. Lee’s execution. “It’s my duty to carry out the law,” he said, adding that “the fact is that the jury found him guilty based upon the information that they had.” He called the new DNA evidence that has emerged “inconclusive.”
In a statement, lawyers from the A.C.L.U. and the Innocence Project were cautious about stating what, exactly, could be extrapolated from the newly tested DNA from the shirt and the murder weapon — beyond the facts that both samples appeared to belong to the same man and that that man was not Mr. Lee.
Read the full article in the New York Times.
How to study racial disparities
Bryan Schonfeld & Sam Winter-Levy, Scientific American, 14 August 2020
Studying race, and in particular the relationship between race and social outcomes like health or police violence, comes with both statistical and conceptual challenges, which make understanding exactly why Black people are dying from COVID-19 at higher rates harder than it might seem.
Perhaps the biggest issue arises out of what statisticians call “post-treatment bias.” Because racial identity is assigned at birth, it affects a wide range of other aspects of people’s lives—where someone lives, how they’re educated, the sorts of opportunities they have, and how much money they earn. To understand the effect of race on a certain outcome—say, police violence, or the likelihood of death among COVID-19 patients—scholars will often control for factors like education, income, health status or occupation.
But all these variables are “post-treatment,” or downstream, of race, in the sense that race itself can shape how a person is raised, educated and employed. Controlling for these variables can distort any results that scholars may find.
Consider the following analogy: if researchers set out to investigate whether smoking leads to death, but controlled for whether someone gets lung cancer, they might find that smoking doesn’t increase mortality—because they’ve effectively removed an important pathway by which smoking influences health. What’s more, by controlling for lung cancer, they’re now comparing the life spans of smokers who don’t get lung cancer, who are likely to be unusually healthy, to nonsmokers without lung cancer (and comparing nonsmokers who get lung cancer, a highly unusual group, to smokers who get lung cancer). In the context of race, controlling for almost any socioeconomic or health variable—as most studies on ethnic disparities in COVID-19 deaths do—can create serious biases in an analysis, calling many empirical results into question.
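The collider (“post-treatment”) bias in the smoking analogy above can be illustrated with a short simulation. This is a minimal sketch with made-up probabilities, not numbers from any study: smoking raises cancer risk, and both smoking and cancer raise mortality; conditioning on cancer then makes smoking look far less deadly than it is.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative assumptions: smoking raises lung-cancer risk (30% vs 2%),
# and both cancer and smoking (via other pathways) raise mortality.
smoker = rng.random(n) < 0.5
cancer = rng.random(n) < np.where(smoker, 0.30, 0.02)
p_death = 0.05 + 0.40 * cancer + 0.05 * smoker
death = rng.random(n) < p_death

def death_rate(mask):
    """Mortality rate within the subgroup selected by the boolean mask."""
    return death[mask].mean()

# Unadjusted comparison captures the full effect of smoking on mortality.
unadjusted_gap = death_rate(smoker) - death_rate(~smoker)

# "Controlling" for cancer by restricting to the cancer-free stratum
# blocks the main causal pathway, leaving only the small direct effect.
gap_no_cancer = death_rate(smoker & ~cancer) - death_rate(~smoker & ~cancer)

print(f"unadjusted mortality gap:       {unadjusted_gap:.3f}")   # ~0.16
print(f"gap among the cancer-free only: {gap_no_cancer:.3f}")    # ~0.05
```

Under these assumed parameters the adjusted comparison shrinks the smoking effect roughly threefold, which is the distortion the article warns about when studies of racial disparities control for income, education, or health status that are themselves downstream of race.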
Similar issues abound in the study of race and policing. Take the recent debate over whether there is evidence of racism in American policing. Roland Fryer, an economist at Harvard, found that police shoot white, Black and Hispanic Americans whom they’ve stopped at equal rates. However, as political scientists Dean Knox, Will Lowe, and Jonathan Mummolo point out, if there is initial discrimination in who gets stopped in the first place, estimating racial disparities in how people are treated once they’ve been stopped becomes much more complicated—especially since police officers are more likely to stop Black and Hispanic people than white people, and more of those stops are unjustified. If Black people are stopped by police for lesser (or nonexistent) offenses, “equal treatment” in terms of the use of force would actually indicate deeply unequal policing overall. Being stopped by the police is “post-treatment” to race, and failing to account for this bias can lead to erroneous conclusions that may mask the extent of racism in American institutions.
A second challenge, as the political scientists Maya Sen and Omar Wasow point out, comes from the instability of racial labels. As one study concluded, “No two measures of race will capture the same information.” In one 19-year survey of thousands of Americans, a full 20 percent of the sample changed either how they were racially classified by others or how they identified themselves. Survey respondents even changed identification in response to life events: incarceration, unemployment or having an income below the poverty line made respondents more likely to identify as Black, while people who get married are more likely to be seen as and identify as white.
Read the full article in Scientific American.
First monkey–human embryos reignite debate over hybrid animals
Nidhi Subbaraman, Nature, 15 April 2021
Scientists have successfully grown monkey embryos containing human cells for the first time — the latest milestone in a rapidly advancing field that has drawn ethical questions.
In the work, published on 15 April in Cell, the team injected monkey embryos with human stem cells and watched them develop. They observed human and monkey cells divide and grow together in a dish, with at least 3 embryos surviving to 19 days after fertilization. “The overall message is that every embryo contained human cells that proliferate and differentiate to a different extent,” says Juan Carlos Izpisua Belmonte, a developmental biologist at the Salk Institute for Biological Studies in La Jolla, California, and one of the researchers who led the work.
Researchers hope that some human–animal hybrids — known as chimaeras — could provide better models in which to test drugs, and be used to grow human organs for transplants. Members of this research team were the first to show in 2019 that they could grow monkey embryos in a dish for up to 20 days after fertilization. In 2017, they reported a series of other hybrids: pig embryos grown with human cells, cow embryos grown with human cells, and rat embryos grown with mouse cells.
But the latest work has divided developmental biologists. Some question the need for such experiments using closely related primates — these animals are not likely to be used as model animals in the way that mice and rats are. Non-human primates are protected by stricter research ethics rules than are rodents, and they worry such work is likely to stoke public opposition.
“There are much more sensible experiments in this area of chimaeras as a source of organs and tissues,” says Alfonso Martinez Arias, a developmental biologist at Pompeu Fabra University in Barcelona, Spain. Experiments with livestock animals, such as pigs and cows, are “more promising and do not risk challenging ethical boundaries”, he says. “There is a whole field of organoids, which can hopefully do away with animal research.”
Izpisua Belmonte says that the team does not intend to implant any hybrid embryos into monkeys. Rather, the goal is to better understand how cells of different species communicate with each other in the embryo during its early growth phase.
Attempts at growing human–mouse hybrids are still preliminary and chimaeras need to be more effective and healthier before they can be useful. Scientists suspect that such hybrids might have trouble thriving because the two species are evolutionarily distant, so the cells communicate through different means. But observing cellular cross-talk in monkey–human embryo chimaeras — which involve two more closely related species — could suggest ways to improve the viability of future human–mouse models, Izpisua Belmonte says.
Read the full article in Nature.
Everything that’s wrong about the future of the UEFA Champions League
Jonathan Wilson, SI, 31 March 2021
But the finances of the Champions League created a self-perpetuating elite: the superclubs. The Champions League final has become the preserve of the three Spanish giants, Juventus, Bayern Munich, PSG and a smattering of Premier League teams. When Ajax reached the semifinal two seasons ago, it felt like a fairy tale, so disadvantaged are Dutch clubs. Yet Ajax is a four-time European champion. The distribution of prize money is such that Barcelona made twice as much from reaching the last four that season as Ajax did, just because it comes from a country with a bigger TV market.
It’s a structure that has broken domestic football. France, Italy and Germany are effectively monopolies, even if COVID-19 has created some unpredictability this season. Spain is essentially the preserve of two clubs, with Atlético Madrid providing an occasional challenge. Even in the Premier League, where vast domestic TV rights mean Champions League revenues are less relevant, the last three seasons have seen the champions win with 98 or more points, totals that would have seemed unimaginable even a decade ago.
Which brings us to the existential question: what is football for? Is it about something inherent in the game, about competitiveness—and skill and beauty—for its own sake? Or is it about the production of content to generate revenue for the big brands? Increasingly, it feels the latter is true.
José Ángel Sánchez, the general director of Real Madrid, has compared the club to Disney. He was an architect two decades ago of the galácticos project, which was of limited footballing success but raised the profile of the club enormously. Juventus may come to see its signing of Ronaldo in the same way, despite a series of early exits from the Champions League. Increasingly, it feels, football is moving away from a focus on actually winning matches. At least one club is exploring the possibility of producing a semi-fictionalized soap opera set in its offices. Everything is about the generation of content and the promotion of the brand.
That has profound implications for how football may look. In traditional terms, Porto’s success against Juventus was a brilliant game of football. But in a content production sense, it may be that the more attractive game was Juventus’s victory away to Barcelona at the end of the group stage. The result didn’t matter, as both sides had already qualified, but there was Lionel Messi against Ronaldo, two enormous names wearing the shirts of enormous brands, pitted against each other. Who cares if three months later barely anybody could recall the result; content generation is not about posterity.
Read the full article in SI.
Is Musa Al-Gharbi the last academic who can tell the truth?
B Duncan Moench, Tablet, 4 May 2021
“Basically, what happened was: I published this essay in this journal, Middle East Policy”, Al-Gharbi told me. “It’s the No. 1 cited journal for Middle East studies, it was a big deal … it was exciting … but there’s sort of a cottage industry of anti-Muslim Muslims—people who are ex-Muslims who go on right-leaning media and say something like ‘Muslims are horrible, I would know.’ So one of the people who’s part of this industry tried trolling me on Twitter about my article, which was about ISIS, and I pretty much demolished the guy. It was great in the moment. I felt super proud of myself, in part because as he started losing the empirical argument, he tried shifting to talking about things like ‘liberty’ and stuff like this. I’m guessing he assumed I didn’t know a lot about American political theory, but I actually graduated from one of the top programs in the world for political philosophy. I went on his own home turf and just demolished him there as well. It really got under his skin. Looking back on it now, I wish I would have tried to spike the ball less. Basically, he started this smear campaign to try to get me fired from my positions [at University of Arizona].”
A short time later, Al-Gharbi found himself opening rejection letters from both of the Ph.D. programs he applied to at the University of Arizona—despite already holding the titles of outreach scholar and research fellow at the institution. “It just seemed like I’d become persona non grata. In fact, I did find out later that I had been blackballed, at least where hiring is concerned.” Al-Gharbi spoke about this episode with a remarkable lack of bitterness or anger.
Al-Gharbi put forth a leftist position, for which he was pilloried by Fox News. He is an African American Muslim, which presumably should have given him intersectional identity cred. Yet before America even had the term “cancel culture,” he became one of its casualties. How that happened helps illuminate a general misunderstanding about cancel culture, and the nature of the university system where it largely originated.
Conservatives claim that American universities are bastions of left-wing radicalism, but the reality is far more complicated, and ridiculous. At its core, America’s higher education industry is neither left nor right, per se. Like the Wall Street investment firms its graduates pour into after graduation, elite American higher education prioritizes risk aversion above all else. By necessity, its decision-makers hold the worldview of a fretful public relations manager: Avoid bad press. Don’t make waves. Keep the ship afloat.
It’s true that universities are dominated by a niche version of radical social thought, but it’s generally the kind of “leftism” Goldman Sachs can endorse—a transhumanist love affair with machines (and data) combined with calls for representational correctness, all to the accompaniment of the declining arts, which dance around in the background, flailing their limbs and speaking in tongues, compelled by the god of anti-racism.
Read the full article in Tablet.
Study: Employment rose among those in free money experiment
Adam Beam, AP News, 4 March 2021
After getting $500 per month for two years without rules on how to spend it, 125 people in California paid off debt, got full-time jobs and reported lower rates of anxiety and depression, according to a study released Wednesday.
The program in the Northern California city of Stockton was the highest-profile experiment in the U.S. of a universal basic income, where everyone gets a guaranteed amount per month for free. Announced by former Mayor Michael Tubbs with great fanfare in 2017, the idea quickly gained momentum once it became a major part of Andrew Yang’s 2020 campaign for president.
Supporters say a guaranteed income can alleviate the stress and anxiety of people living in poverty while giving them the financial security needed to find good jobs and avoid debt. But critics argue free money would eliminate the incentive to work, creating a society dependent on the state.
Tubbs, who at 26 was elected Stockton’s first Black mayor in 2016 after endorsements from Oprah Winfrey and Barack Obama, wanted to put those claims to the test. Stockton was an ideal place, given its proximity to Silicon Valley and the eagerness of the state’s tech titans to fund the experiment as they grapple with how to prepare for job losses that could come with automation and artificial intelligence.
The Stockton Economic Empowerment Demonstration launched in February 2019, selecting a group of 125 people who lived in census tracts at or below the city’s median household income of $46,033. The program did not use tax dollars, but was financed by private donations, including a nonprofit led by Facebook co-founder Chris Hughes.
A pair of independent researchers at the University of Tennessee and the University of Pennsylvania reviewed data from the first year of the study, which did not overlap with the pandemic. A second study looking at year two is scheduled to be released next year.
When the program started in February 2019, 28% of the people slated to get the free money had full-time jobs. One year later, 40% of those people had full-time jobs. A control group of people who did not get the money saw a 5 percentage point increase in full-time employment over that same time period.
Read the full article in AP News.
Another essentializing moment?
Chris Vasantkumar, Africa is a Country, 13 May 2021
In his introduction to the recently released volume of Stuart Hall’s writings on race and difference, historian Paul Gilroy argues strongly for the contemporary relevance of Hall’s thought. Yet, as social theorist Sindre Bangstad notes, in “one particular respect,” Gilroy “places … Hall resolutely in the past.” Specifically, Gilroy identifies the “considerable hostility” among contemporary anti-racists towards the “open … notion of blackness” as “a political color accessible to all non-whites,” a notion that figured centrally in Hall’s classic essays on the politics of difference in 1980s Britain. Bangstad is skeptical of Gilroy’s claim that this version of blackness as a kind of self-identification akin in many respects to class consciousness is “now anachronistic.” Instead, he suggests that Hall would react in an “open” and “pragmatic” manner to the ideas of contemporary anti-racists.
By contrast, I would argue that Gilroy is correct in his assessment; indeed, the passing (or, more optimistically, eclipse) of race as a political rather than ontological category highlights just how much the terrain of struggle has shifted since Hall produced his classic works on the subject. Yet something important has been lost in the bargain by which the racial categories that Hall so presciently revealed to be social, historical, and political have re-ossified into apparently inarguable natural forms. For at a time when, as American political scientist Jodi Dean suggests, “those taken to share an identity are presumed to share a politics, as if the identity were obvious and the politics didn’t need to be built,” we seem to have slipped into what Hall might have termed “another essentializing moment.”
In his classic 1992 essay, “What Is This ‘Black’ in Black Popular Culture?,” Hall criticized “the essentializing moment” he inhabited for naturalizing and de-historicizing difference, for “mistaking what is historical and cultural for what is natural, biological, and genetic.” Moreover, he suggested that such essentialism was incompatible with anti-racism: “the moment the signifier ‘black’ is torn from its historical, cultural and political embedding and lodged in a biologically constituted racial category,” he wrote, “we valorize, by inversion, the very ground of the racism we are trying to deconstruct.” The end point of such racial absolutism, he warned, is unwarranted faith in the absurd notion that “we can translate from nature to politics using a racial category to warrant the politics of a cultural text and as a line against which to measure deviation.” Ironically, the kind of naturalization that Hall located in Thatcherite racism has taken root at the heart of anti-racist rhetoric itself.
British sociologist Claire Alexander describes “an important shift during the 1990s [in the UK] from ‘political blackness’ to ethnically defined identities, such as black British or British Asian,” or, more recently and somewhat controversially, BAME (Black, Asian, and Minority Ethnic). According to Alexander, the dynamics of political blackness were once very much like those of class formation. In the 1970s and 1980s, she says, “young people of color identified as black and campaigned together to fight racial discrimination,” adding that “at the heart of political blackness was a shared feeling of being unwanted.” Yet from the perspective of the present in both the US and the UK, the idea that blackness could denote anything other than African ancestry seems absurd. Alexander, who describes herself as an Asian woman in her 50s, says: “I still use black, but I realize you can’t really get away with that because you look at young people and you describe yourself as black, they will look at you like you’re deranged.”
Read the full article in Africa is a Country.
Is this the end of French intellectual life?
Christopher Caldwell, New York Times, 5 March 2021
At the end of last summer, Le Débat, France’s most prestigious intellectual review, accompanied its 40th-anniversary issue with a wholly unexpected announcement: It would cease publication forthwith. Le Débat and its three or four thousand loyal readers had maintained an allegiance to the political left since the Cold War — but the meaning of “left” has been shifting. Rivals now claim the term, particularly social movements that arose in France in the 1980s to champion what is variously called identity politics or social justice. After waging a decades-long twilight struggle against these movements, Le Débat has lost.
Intellectuals of all persuasions have been debating what that defeat means for France, and they have reached a conclusion: The country’s intellectual life has come under the sway of a more ideological, more identity-focused model imported from the United States.
Le Débat was always resistant to American imports. It never fully made its peace with the free market in the way that self-described social democrats in America did under Bill Clinton. Nor did it climb aboard the agenda of humanitarian invasions and democracy promotion, as left-leaning American intellectuals like Paul Berman and George Packer did. That was all fine. But Le Débat’s reluctance to partake of identity politics as it arose in France, always a couple of steps behind (and always in imitation of) American civil rights advances, brought the review into disrepute with a new generation of leftists.
Many French people see American-style social-justice politics as a change for the worse. President Emmanuel Macron does. In the wake of the death of George Floyd in police custody last spring, protests and riots across America brought the dismantling of statues and other public symbols — sometimes on the spot, sometimes after further campaigning and agitation. Aware that such actions had found a sympathetic echo among some of his fellow citizens, Mr. Macron warned that France would not follow suit. “It will not erase any trace or name from its history,” he said. “It will not forget any of its works. It will not topple any statues.”
By last fall Mr. Macron was also inveighing against foreign university traditions. “I’m thinking of the Anglo-Saxon tradition, which has another history, and it is not ours,” he said, before singling out “certain social-science theories imported from the United States of America.”
To look at how Le Débat unraveled is to see that these tensions have been developing for years, if not decades. They bode poorly for the future of intellectual life in France — and elsewhere.
Read the full article in the New York Times.
Life expectancy in adulthood is falling for those without a BA degree, but as educational gaps have widened, racial gaps have narrowed
Anne Case & Angus Deaton, PNAS, 16 March 2021
Our main aim here is to document the patterns in Figs. 1 to 3, that the fall in period life expectancy between 25 and 75 in the US population is confined to those without a 4-y college degree, and that this is true for men and women and for Black and White people. The widening educational differences have meant that education is now a sharper differentiator of expected years of life between 25 and 75 than is race, a reversal of the situation in 1990. The causes of death behind these patterns have been well researched and are summarized, for example, by Sasson and by Sasson and Hayward. Deaths of despair, especially drug overdoses, rose rapidly beginning in the mid-1990s, and cardiovascular disease, which was the main engine of mortality decline after 1970, stopped declining and started to rise for Black and White men and women around or after 2012. If actual deaths are compared with the deaths that would have occurred had previous trends continued, what has happened to cardiovascular disease is by far the largest factor. Obesity is likely implicated in this. Smoking-related deaths have been important, especially for women who were slower than men both to start and to stop, behavior that also contributes to heart disease mortality. However, the timing of the stalling of progress against cardiovascular disease is too uniform by race and sex for either obesity or smoking to provide a complete explanation.
Our main interest is to document the mortality or period life expectancy premium that comes with the BA, by race, and by gender in exactly the same way that labor economists have long documented the parallel premium in earnings. Discussions of the mechanisms behind those premia and how they change over time are of paramount interest but are not our main focus here. Even so, we hazard some brief remarks.
Our preferred account is that changes in labor markets, especially automation and the increased demands for more educated workers to operate the robots as well as the rising costs of employer-provided healthcare, have reduced the supply of good, well-paid jobs for people without a BA. In the early 1980s, median wages of prime-aged (25 to 54) workers with a 4-y degree were 40% higher than those without. This college wage premium had soared to 80% by the late 20-teens, in part through the rise in real wages for those with a BA and in part through the decline in real wages for those without. If people set their standard of success as doing at least as well as their parents, it is possible that, since 1970, a rising number see themselves as unsuccessful.
The decline in wages has been paired with a long-run decline in labor market attachment for those without a BA. In the early 1980s, 6% of prime-aged men without a BA were not participating in the labor force. This grew to 14% by the late 20-teens. By comparison, men with a BA experienced a much smaller (3 percentage point) reduction in labor force participation over that period. Reduced wages and labor force participation for those without a BA have had negative effects on family life. Marriage is often postponed until at least one partner has a job with prospects. In 1980, 80% of adults without a BA were married at age 40. By the late 20-teens, that figure had dropped to 60%. American adults without a BA are increasingly more likely to report pain in midlife to the point that those now in middle age report more pain than the elderly, something not observed for those with a 4-y degree (5). These forces work to deprive working-class life in America of meaning and social structure, conditions that since Durkheim have been seen as fertile ground for self-destruction through suicide, alcoholism, obesity, or drugs.
Read the full paper in PNAS.
The American who brought modern masterpieces to Iran
Tim Cornwell, The Art Newspaper, 3 May 2021
The story of the Tehran Museum of Contemporary Art (TMoCA) has been irresistible to journalists for four decades, laden as it is with period glamour, political intrigue and eye-catching art. To briefly recap: in the mid-1970s, the third wife of the Shah of Iran, Shahbanu Farah Pahlavi, was patron of a crash museum-building programme.
Surging oil prices had made Iran’s ruling classes rich, and Western economies, and therefore the art market, weak. In October 1977, TMoCA opened for Pahlavi’s birthday with a hastily but deftly assembled collection of Modern masterpieces from Gauguin to Giacometti, Picasso to Pollock. But scarcely a year after its glamorous opening, the Iranian Revolution toppled the Pahlavi dynasty. The museum and its collection went into deep storage—but remarkably survived almost intact.
“Have you heard there’s this American girl who’s going to start a new museum in Tehran?” That was the question circulating in New York gallery circles in 1974, when a young assistant curator in the department of prints and illustrated books at the Museum of Modern Art (MoMA), Donna Stein, was tapped to work for Her Imperial Majesty’s Private Secretariat.
More than four decades later, Stein has written a memoir of those days, The Empress and I, staking her claim to have laid the groundwork for TMoCA’s famous collection. Success has many fathers, or mothers, but Stein clearly played a major behind-the-scenes role in putting the Western part of the museum’s collection together, and curated two of its opening exhibitions.
Stein first visited Iran in 1973, after working for six years on the curatorial staff at MoMA, on a fellowship to study the cultural impact of world’s fairs. A connection with Fereshteh Daftari, a member of one of Iran’s elite families who had interned at MoMA, opened doors.
A letter came from Tehran, dangling a possible job. “I literally jumped for joy, bobbing up and down in my high heels,” Stein writes. Responsibilities were to include “purchasing prints that would represent the various movements and tendencies up to the present day”, running and organising catalogues and exhibitions, and “training Iranians to run the department after your departure”.
Stein flew to Tehran to interview for a post with the royal regime with her “eyes wide shut”—but asked for, and got, a $25,000 salary. On a site visit to the new museum, whose design mixed Solomon R. Guggenheim-style circular walkways with traditional Iranian wind towers, she was not afraid to point out “various flaws” to the architect and cousin of the queen, Kamran Diba. They included poured concrete walls that would make it hard to hang art for changing exhibitions, cramped staff offices and only one washroom. She was also unimpressed by early purchases of lithographs by Pablo Picasso that showed images of “women who looked like the Empress” but were not the best examples of his printmaking.
Read the full article in The Art Newspaper.
Revisiting the life and intellectual legacy of Primo Levi
Enzo Traverso, Jacobin, 11 April 2021
The second widespread misunderstanding of Primo Levi deals with his Jewishness: the tendency to classify him as a Jewish writer. Undoubtedly, Levi was a Jew. He never tried to hide this obvious fact: he had been persecuted and deported to Auschwitz as a Jew and spent most of his intellectual life bearing testimony to the Nazi extermination of the European Jews.
Nonetheless, he was not a “Jewish writer” like Elie Wiesel, Aharon Appelfeld, or Philip Roth, to mention some of his contemporaries. The Italian-Jewish writers of the twentieth century deeply differed from their Israeli fellows, as well as from the New York intellectuals, however diverse the latter could be. Not only did he never consider himself the representative of a religious community — his attachment to the tradition of science and the Enlightenment implied a radical form of atheism, which his experience of deportation strongly reinforced, even if he always expressed respectful feelings toward believers — but he probably never felt part of a Jewish milieu with clearly defined social and cultural boundaries.
Rather than as an Italian Jew — a definition in which Jew is the substantive and Italian the adjective — he preferred to depict himself as an italiano ebreo, a “Jewish Italian.”
Interviewed by Risa Sodi after his successful lecture tour of the United States in 1985, he stressed that in Italy the notion of “Jewish writer” was very difficult to define. There, he said, “I am known as a writer who, among other things, is Jewish,” whereas in the United States he felt “as if [he] had worn again the Star of David!” Of course, he was joking, but he wished to emphasize that his education and his cultural formation had not been particularly Jewish, and that most of his friends as well as the overwhelming majority of the Italian readers of his books were not Jewish. In a lecture given in 1982, he admitted that he had finally resigned himself to accept the label of “Jewish writer,” but “not immediately and not without reservations.” This remark could be extended to most Jewish writers of twentieth-century Italian literature, from Italo Svevo to Alberto Moravia, from Giorgio Bassani to Natalia Ginzburg, and many others.
Between 1938 and the end of the Second World War (i.e., between the promulgation of fascist racial laws and his liberation from Auschwitz), Levi probably fit the famous Sartrian definition of the Jew: “The Jew is one whom other men consider a Jew . . . for it is the anti-Semite who makes the Jew.” In a conversation with Ferdinando Camon, he mentioned his Jewishness as “a purely cultural fact.” “If not for the racial laws and the concentration camp,” he said, “I probably would no longer be a Jew, except for my last name. Instead this dual experience, the racial law and the concentration camp, stamped me the way you stamp a steel plate: at this point I am a Jew, they have sewn the star of David on me and not only on my clothes.”
Levi certainly was a “Godless Jew” (gottloser Jude), as Peter Gay depicted Sigmund Freud, but he probably would not have inscribed himself into the noble gallery of those whom Isaac Deutscher called the “non-Jewish Jews” (i.e., the Jewish heretics). After the war, Primo Levi did not feel targeted by antisemitism and considered emancipation from religious alienation and obscurantism a legacy of the Enlightenment rather than a task of the present. He did not consider himself an iconoclast or a dissenter within Judaism. He simply was not a believer or a religious man.
Read the full article in Jacobin.
Orientalism and its afterlives
Vivek Chibber, Catalyst, Fall 2020
Few works have had a greater influence on the current Left than Edward Said’s Orientalism. In the first instance, it has become the lodestone for critical scholarship around the colonial experience and imperialism. But, more expansively, in its status as a founding text of postcolonial studies, its imprint can be discerned across the moral sciences — in race studies, history, cultural theory, and even political economy. Indeed, it is hard to think of many books that have had a greater influence on critical scholarship over the past half century. There are some respects in which Said’s placement of colonialism at the center of the modern era has had a salutary effect, not just on scholarship, but also on politics. Even as the Left went into retreat in the neoliberal era, even as working-class parties either shrank in influence or were absorbed into the mainstream, the centrality of anti-imperialism surprisingly remained close to the center of Left discourse — an achievement in no small part attributable to Said’s great book. And even as class politics is reemerging after its long hiatus, it is impossible to imagine a future in which the Left in the core countries will ever repeat its sometimes baleful disregard for imperial aggression, and for the aspirations of laboring classes in the Global South. In this recalibration of the Left’s moral compass, Said’s Orientalism continues to play an important role.
Precisely because of its classic status, and its continuing influence, Orientalism deserves a careful reexamination. Its importance as a moral anchor for the anti-imperialist Left has to be balanced against some of the other, less auspicious aspects of its legacy. In particular, alongside its excoriation of Western colonialism and its deep investigation of colonialism’s ideological carapace, the book undeniably took several steps backward in the analysis of colonial expansion. It was this very weakness that proved to be so attractive to the emerging field of postcolonial studies in the 1980s, and that enabled its proponents to don the mantle of anti-imperial critique even as they were engaging in the very essentialism and exoticization of the East that was emblematic of colonial ideology. It is no small irony that Said, a deeply committed humanist, secularist, and cosmopolitan, is now associated with an intellectual trend that traduces those very values. This apparent paradox, I will argue, is, in fact, not so mysterious. It reflects real weaknesses in Orientalism’s basic arguments — weaknesses that were exposed very early by critics from the South, but that were brushed aside by the New Left in its flight from materialism. As the Left gathers its intellectual resources once again and takes up the challenge of confronting imperial power, an engagement with Orientalism has to be high on its agenda.
Read the full article in Catalyst.
The demobilization of the South African masses
Russell Grinker, Africa is a Country, 27 April 2021
Spontaneous and fragmented protest seems to be growing. But the mass organized movements of the working class and poor that bloomed in the wake of the 1973 Durban strikes and grew into the fragmented, but massive United Democratic Front in the 1980s, hardly seem to exist now. And any radical-sounding state-driven developmental project has also long since been consigned to the scrap heap. The once widely popular Reconstruction and Development Programme (RDP) is now forgotten and current glossy and much-hyped government plans have no popular traction whatsoever. Popular cynicism and demoralization are about all we have left.
As those of us who were around in the early 1990s soon discovered, controlled post-apartheid decolonization required the exclusion of the masses from political life. The defeat of the mass base of our nationalist movement and unions and the containment of more radical nationalists and leftists since then, were preconditions for the realization of this process. The stabilization of capitalism in post-apartheid South Africa demanded the neutralizing of grassroots aspirations towards social change. In the sphere of politics, the main priority of the ANC regime was to ensure that the urban and rural proletariat should be deprived of its own organizational and political voice—its ability to represent itself. Old militant mass structures had to be put to bed (the UDF was disbanded in the transition).
The post-apartheid ANC and its Alliance partners (the trade unions and communists) initially represented an uneasy alliance between radical and moderate nationalists. However, driven by the demoralization and disorientation of the radicals and old Communists following the collapse of the Soviet Bloc, it was the moderates who were in the ascendancy. The radicals were put on the defensive. Individual radical leaders like Chris Hani, Joe Slovo, and Harry Gwala were merely tolerated during the negotiations phase of the early 1990s in order to strengthen the party’s radical nationalist credentials. Grassroots civics representatives were also humored for a while under the SANCO banner (the South African National Civic Organization). For a time, those affiliated to the Congress of South African Trade Unions (which, along with the South African Communist Party, is allied to the ANC) played a significant policy role and maintained a militant posture, but they grew less militant as union leaders became comfortable with their role in corporatist bodies and more radical rank-and-file structures were co-opted and tamed.
After 1990 it was only a matter of time before surviving radicals began to be marginalized. Chris Hani, leader of the SACP and spokesperson for the more radical elements of the Alliance, was assassinated in April 1993 during the unrest orchestrated by the old regime preceding the transition to democracy. The radical-sounding RDP, developed with the left’s assistance, was the radicals’ last hope. This was viewed as the cornerstone of government development policy, but it was soon replaced by the Growth, Employment and Redistribution (GEAR) macroeconomic policy framework in 1996. Cosatu, the ANC’s trade union partner in the Alliance, slammed GEAR as neoliberal but the battle for a radical programme had clearly been lost.
Read the full article in Africa is a Country.
Arendt and Roth: An uncanny convergence
Corey Robin, New York Review, 12 May 2021
The difference between the two writers is obvious. She was born in Germany in 1906; he was born in Newark in 1933. She fled Hitler and never looked back; he fled his parents and kept going home. She wrote The Human Condition; he wrote Portnoy’s Complaint.
Yet, throughout the postwar Jewish ascendancy in America, as other writers and scholars eased their way into the conversation, Arendt and Roth distinguished themselves—not by stirring up the little magazines but by contending with the Jews. Summoning the anxious wrath of a still vulnerable community, Roth and Arendt occupied a singular position: defending the margin against the marginalized, refusing the political pull and moral exaction of an embattled minority. Today, at a moment of rising anti-Semitism and increasing polarization, when the tendency, even among writers and intellectuals, is to circle the wagons in defense of team and tribe, their shared archive of heresy among the heretics repays revisiting.
That we even know of that archive is because of the work of another Roth biographer, Ira Nadel, in a little-noticed article in 2018. The story begins in August 1963, when the Princeton sociologist Melvin Tumin grumbles in a letter to Roth about the fact that Roth “liked” Arendt’s report on the trial of Adolf Eichmann, which had appeared in The New Yorker that spring. (Roth’s respect for Eichmann in Jerusalem seems not to have faded across the years. When she was in her twenties, the author Lisa Halliday had a relationship with a much older Roth, which she turned into fiction in Asymmetry. In the novel’s first section, which is set against the backdrop of the Iraq War, the Roth-based character tells the Halliday-based character, “If you want to learn about the Holocaust I’ll show you what to read.” One of the three books he recommends is Eichmann in Jerusalem.)
Eichmann in Jerusalem set off a furious reaction upon its publication. For her alleged soft-pedalling of Eichmann’s anti-Semitism and her criticism of Jewish leaders who cooperated with the Nazis, Arendt was vilified as a friend to anti-Semites and an enemy of the Jews. In the view of her critics, Arendt had not simply written a flawed book; she had revealed her bad character. She was a cretin and a criminal—heartless, vain, wicked, meretricious, cruel. The savage tenor of the campaign against her, which extended from the Anti-Defamation League and the World Jewish Congress to The New York Times and even Partisan Review, to which she’d been a longtime contributor, was captured by the words used to describe the controversy: her allies compared it to a pogrom, her antagonists to a civil war.
We don’t know what drew Roth to Arendt’s writing on Eichmann, but we do know that he, too, was unsettled by the question of Jewish collaboration with the Nazis. The “moral horror” of it, he said, “excited my imagination.”
Read the full article in the New York Review.
One for the history books
Shah Tazrian Ashrafi, Caravan, 30 April 2021
On 25 March this year, at a feminist webinar to mark fifty years since Bangladesh gained independence, the Pakistani author and oral historian Anam Zakaria spoke about the “political and cultural silencing around the birth of Bangladesh” that she witnessed around her while growing up. She clarified that this was not a “complete erasure” of 1971, which “remains one of the most defining years in Pakistan’s history and the national imagination,” with a lasting effect on education, policy-making, perceptions of neighbouring countries and so on. “But what is remembered about ’71 also in many ways determines what must not be remembered,” she continued. In her research, where she draws on textbooks, museum exhibits and military memoirs, she analyses techniques of official history-making and the dissemination of “fabricated narratives” regarding the war.
In late 1971, India participated in Bangladesh’s Liberation War against Pakistan, alongside the Mukti Bahini forces—Bangladeshi freedom fighters. For Bangladesh, which won its sovereignty on 16 December that year, it was a historic event that led to the realisation of its long-sought dreams of independence. For India, it was a brilliant triumph over its arch-enemy, Pakistan. And for Pakistan, it was another Indian attempt to destabilise its polity. Indeed, in historical or political discourse outside Bangladesh, the significance of the war is often subsumed within the overarching narrative of enmity between India and Pakistan—1971 becomes just another blotch on the relationship shared by these two countries. But for Bangladesh, 1971 is a matter of pride and national fervour, though it also carries with it a sense of unrelenting trauma.
With the fiftieth anniversary of the birth of Bangladesh this year, it is worth revisiting how the memory of 1971 has acquired different shapes in these countries. Zakaria’s 1971: A People’s History From Bangladesh, Pakistan and India, published in late 2019, seeks to shed light on the multiple ways in which memory blends with politics, etching 1971 into the collective psyches of the three countries. Meanwhile, Gary J Bass’s The Blood Telegram and Srinath Raghavan’s 1971: A Global History of the Creation of Bangladesh, two examinations of that landmark year, demonstrate that it is the shadow of the Cold War that defines international memories of it.
Zakaria’s goal in her book is not to amass or present new information about 1971. Instead, her focus is on documenting personal and national memories of the time, gathered through interviews with civilians, scholars and army personnel across the three countries. This is juxtaposed with the different ways in which the war is officially memorialised in them.
Read the full article in Caravan.