Pandaemonium

PLUCKED FROM THE WEB #79

The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

Coronavirus and the withering of the public sphere
Philip Alcabes, American Scholar, 19 September 2020

In lieu of a proper public response to coronavirus, the United States was the scene of a heated and unbridled reaction that is, as anyone who pays attention knows, still smoldering. National health officials shifted blame; state governors offered a plethora of self-congratulatory photo ops and defensive press conferences but little guidance; mainstream media featured obsessive coverage of modes of spread, routes of travel, and self-protective measures, much of it questionable; social media became a stage for a festival of untruth, mockery of public officials, and enactments of fear; armed libertarians threatened legislators and unarmed ones screamed at retail store workers who asked them to put on masks; information about potential treatments seemed to change weekly. It was hysterical and uncivil, and, clearly enough, it wasn’t really about coronavirus at all. Something deeper was at work: a dispatching of the normal ways of dealing with public problems.

Something new certainly seems to be replacing the old way. I mean not just that a state of emergency had to be declared, or that exhortations to wear a mask and stay apart from others (“social distancing!”) were valorized as if they constituted genuine public health. Admittedly, even beyond the selfless labor of service, emergency, and medical workers, many collective acts were refreshing and hopeful: the willingness of legions of Americans to stand up and cry out about the injustices and abuses that Black Americans have suffered at the hands of white police, lawmakers, school boards, and so on. How tired our Black compatriots must be of it all. How laudable to not give up. And real questions arose about how policing is done, by whom, and of whom.

But the negation of that hopeful collectivity is another part of the uprooting: the eagerness of many white Americans to decry the voicing of those truths of American history, the drawing of guns on peaceful protestors, the swinging of clubs and fists, and even the firing of weapons to stop Black people (and white allies) from saying these truths.

I don’t mean that anyone, in the first months of 2020, latched onto coronavirus specifically because they were looking for an excuse for either protest or violence. I don’t think those connections were evident. But something was already ripening. And the complete uninterest in dealing with coronavirus as a phenomenon that demanded a public response through public discourse was a sign. The sphere of collective engagement in making a society, the wide-flung conversation on decency and responsibility—that public arena seems, suddenly, to be in question.

Read the full article in American Scholar.


.

Why I became sceptical of the lockdown sceptics
Michael Fitzpatrick, Medium, 30 January 2021

Reading reports from China early last year describing an apparently novel viral pneumonia, I was initially sceptical. Terrifying threats have become the currency of public health and 24/7 news, and doctors have become used to the ensuing procession of patients unnecessarily worried about diseases such as Ebola or Zika virus. Then I encountered my first patients gasping for breath with Covid-19, amid accounts from colleagues of mounting deaths in nursing homes. It was soon evident that this new virus was a threat of a qualitatively different order from those of the recent past.

The experience of inflated health scares inevitably makes some of us sceptical. Such scares have come to play a role in ‘the medicalisation of life and the politicisation of medicine’, as I observed in 2000 when I wrote The Tyranny of Health.[1] It was already apparent then that issues of health were steadily occupying a more prominent role in the politics of the post-Cold War world, as the established polarities of Left and Right, Labour and Capital were no longer offering any convincing visions of the future. A burgeoning interaction ‘between a state seeking authority and individuals seeking reassurance, provided enormous scope for government intervention in personal life and guaranteed the popularity of such intervention’. ‘Public health’ and an unremitting focus on lifestyle and personal well-being became the order of the day. Rather than a means to our desired ends, ‘health’ was becoming the very goal of human endeavour, thus diminishing rather than enhancing the quality of our lives.

Playing its part in this process was the gross exaggeration in the late 1980s and 1990s of the threats posed to every one of us by diseases such as HIV/Aids and BSE/CJD, provoking widespread anxiety. The ‘worried well’ became a recognised disease category. New viruses in the new century — SARS in 2002–3, Avian Flu in 2005–6 and Swine Flu in 2009–10 — were accompanied by predictions of apocalyptic death rates that were rapidly revealed as disproportionate to the real threat. This was generating unwarranted fear, and also eroding trust in doctors and health authorities. In a paper in 2010 I endorsed the growing view that when a genuine threat arrived it would not be taken seriously.

So it came to pass. When Covid-19 appeared, many people, suspicious of a scare and hostile to lockdown measures, were too ready to dismiss or minimise the threat. The wish was father to the thought. Commentators such as Ross Clark and Ivor Cummins hastily claimed that Covid was no worse than seasonal flu or the flu pandemics in 1957 and 1968. They cherry-picked congenial expert criticisms and dubious expert authorities, rather than objectively examining the available range. Later, when new cases fell to low levels over the summer of 2020, they disputed the abundant indications that this reflected the impact of lockdown policies.

These are not the only sceptics giving scepticism a bad name. Some reputable scientists, such as the epidemiologists Sunetra Gupta and Carl Heneghan, suggested, on the basis of disputed interpretations of the data, that herd immunity had been reached and that a second wave of infections was now unlikely. Some even asserted that the pandemic was ‘past its peak’ and ‘over’. When these claims were patently falsified by the dramatically rising curve of infections and deaths from November onwards, few sceptics — commentators or scientists — were prepared to acknowledge their mistaken prognostications.

Read the full article on Medium.


.

Myths about poverty must be refuted so that parents
are trusted with £20 and not half a pepper

Sam Freedman, The House, 14 January 2021

The reason the Department hasn’t done the simplest, cheapest thing and just given parents a bit of extra cash is that they don’t trust them to spend it properly. Or rather, they are scared of the public perception that, as Tory MP Ben Bradley luridly put it last year, the money would be spent in crack dens and brothels.

We know this isn’t true. Cash transfer schemes have been rigorously evaluated all over the world and show that people in poverty use money provided for food to buy food. They are significantly cheaper than programmes providing food directly and come without the stigma of receiving parcels or stamps.

And yet distrust of those on benefits remains strong; the belief that poverty indicates moral failing runs deep. An Ipsos MORI study in 2013 found people thought, on average, that benefit fraud was 34 times more common than is actually the case. The 2017 British Social Attitudes survey found that, on average, people think 34% of all claimants are providing false information and a fifth of respondents said *most* welfare recipients don’t deserve help. Staggeringly, 61% of people think it is wrong for benefit claimants to use legal loopholes to increase their payments, compared with only 48% who think it is wrong to use legal loopholes to pay less tax.

These attitudes are, in part, due to the way poverty is portrayed in newspapers and on television. Right-wing tabloids have spent years presenting rare cases of extreme or unusual fraud as commonplace and “poverty porn” TV series have wrung entertainment out of distorted portrayals. Successive Conservative governments have happily participated in this myth building – no scapegoat for society’s perceived failures goes unsacrificed. Benefit freezes, the bedroom tax and the abhorrent two-child benefits limit have all been popular with the public but are disastrous from a public policy perspective, as the cost of shattered lives ends up with the state one way or another.

Meanwhile, a large part of the benefits system has been turned into an extension of the criminal justice system, without the oversight. While most people would acknowledge the need for some conditionality in benefits, the expansion of the sanctions system has been shocking, utterly dehumanising, and mostly hidden from the wider public. For instance, over a million disabled claimants were sanctioned between 2010 and 2017. Even for those who’ve avoided sanction, the brutal administrative complexity of the system would cause a revolution if the middle classes had to endure it.

Read the full article in The House.


.

From forgeries to Covid-denial,
Tim Harford on how we fool ourselves

Tim Harford, Financial Times, 28 January 2021

They called Abraham Bredius “The Pope”, a nickname that poked fun at his self-importance while acknowledging his authority. Bredius was the world’s leading scholar of Dutch painters and, particularly, of the mysterious Dutch master Johannes Vermeer.

When Bredius was younger, he’d made his name by spotting works wrongly attributed to Vermeer. Now, at the age of 82, he had just published a highly respected book and was enjoying a retirement swan song in Monaco.

It was at this moment in Bredius’s life, in 1937, that Gerard Boon paid a visit to his villa. Boon, a former Dutch MP, was an outspoken anti-fascist. He came to Bredius on behalf of dissidents in Mussolini’s Italy. They needed to raise money to fund their escape to the US, said Boon. And they had something which might be of value.

Boon unpacked the crate he had brought out of Italy. Inside it was a large canvas, still on its 17th-century wooden stretcher. The picture depicted Christ at Emmaus, when he appeared to some of his disciples after his resurrection, and in the top left-hand corner was the magical signature: IV Meer.

Johannes Vermeer himself! Was it genuine? Only Bredius had the expertise to judge.

The old man was spellbound. He delivered his verdict: “Christ at Emmaus” was not only a Vermeer, it was the Dutch master’s finest work. He penned an article for The Burlington Magazine for Connoisseurs announcing the discovery: “We have here — I am inclined to say — the masterpiece of Johannes Vermeer of Delft. Quite different from all his other paintings and yet every inch a Vermeer.”

He added, “When this masterpiece was shown to me, I had difficulty controlling my emotions.”

That was precisely the problem.

“Christ at Emmaus” was a rotten fraud, of course. But although the trickery was crude, Bredius wasn’t the only one to be fooled. Boon had been lied to as well: he was the unwitting accomplice of a master forger. Soon enough, the entire Dutch art world was sucked into the con. “Christ at Emmaus” sold to the Boijmans Museum in Rotterdam, which was desperate to establish itself on the world stage. Bredius urged the museum on and even contributed. The total cost was 520,000 guilders — compared to the wages of the time, well over $10m today.

Read the full article in the Financial Times.


.

How COVID unlocked the power of RNA vaccines
Elie Dolgin, Nature, 12 January 2021

The era of RNA vaccines has arrived — and dozens of companies are getting in the game. “All of the major pharmas are, in one way or the other, now testing out the technology,” says Jeffrey Ulmer, former head of preclinical research and development at GlaxoSmithKline’s vaccine division in Rockville, Maryland, and before that a member of Andrew Geall’s team at Novartis.

The idea of using RNA in vaccines has been around for nearly three decades. More streamlined than conventional approaches, the genetic technology allows researchers to fast-track many stages of vaccine research and development. The intense interest now could lead to solutions for particularly recalcitrant diseases, such as tuberculosis, HIV and malaria. And the speed at which they can be made could improve seasonal-flu vaccines.

But future applications of the technology will run up against some challenges. The raw materials are expensive. Side effects can be troubling. And distribution currently requires a costly cold chain — the Pfizer–BioNTech COVID-19 vaccine, for example, must be stored at −70 °C. The urgency of COVID-19 is likely to speed up progress on some of those problems, but many companies might abandon the strategy once the current crisis subsides. The question remains: where will it end up?

“The RNA technology has proved itself, but it’s not done yet,” says Philip Dormitzer, head of viral vaccines research at Pfizer, and a former colleague of Geall’s at Novartis. “And now that we’ve seen it work for COVID-19, it’s tempting to want to do more.”

Vaccines teach the body to recognize and destroy disease-causing agents. Typically, weakened pathogens or fragments of the proteins or sugars on their surfaces, known as antigens, are injected to train the immune system to recognize an invader. But RNA vaccines carry only the directions for producing these invaders’ proteins. The aim is that they can slip into a person’s cells and get them to produce the antigens, essentially turning the body into its own inoculation factory.

The idea for RNA-based vaccination dates back to the 1990s, when researchers in France (at what is now the drug firm Sanofi Pasteur) first used RNA encoding an influenza antigen in mice. It produced a response, but the lipid delivery system that the team used proved too toxic to use in people. It would take another decade before companies looking at RNA-interference therapeutics — which rely on RNA’s ability to selectively block the production of specific proteins — discovered the lipid nanoparticle (LNP) technologies that would make today’s COVID-19 vaccines possible.

Read the full article in Nature.


.

The pandemic didn’t shatter society, Zoom did
Timandra Harkness, UnHerd, 4 January 2021

These divides won’t continue in exactly these forms, though they will add to the many fractures in society. What will persist, exacerbating the splits and hampering our ability to recover from both pandemic and economic shock, is the atomisation of society.

Instead of a country united in the face of Nature’s novel threat, we split into millions of individuals, separated into households or “support bubbles” (as if every single person is only a mental health patient waiting to happen). The normal, informal, everyday encounters that remind us we’re all human, and not so very different, were severely restricted by law.

Instead, we built connections online, seeking out what scratched our itch, whether that was friendly networks to lift our spirits, or groups whose analysis of the crazy world confirmed our own instincts.

Little wonder that rational scepticism about politicians or medical authorities slid so easily into conspiracy theories. In the pub, or across the workplace coffee machine, the idea that Bill Gates is microchipping the population is quickly laughed out of existence. On the internet, you can always find a YouTube video to tell you it’s true.

The shift towards more working from home, for those who can, makes it hard to build any workplace ethos of professionalism, of teaching younger colleagues, or of solidarity with your fellow employees. Universities may be able to deliver lectures online, but it’s hard to cultivate any sense of a shared project of scholarship in virtual seminars, especially with no trip to the bar afterwards.

Transferring cultural, or political, or sporting events online revealed how much more an audience is than a number of individuals who watch and listen. Without them, performers, speakers, or players have no focus, no feedback, and no sense that this occasion is unique in happening here, now, with this specific group of people, transformed into a whole greater than its parts.

I hope that we enter 2021 with a much deeper sense of why shared, public, social life is important, and a commitment to restoring it as fast as possible. I fear that there are long-lived social forces moving against that. Re-starting the economy is going to be much easier than knitting together the unravelled strands of a society that already lacked coherence and trust.

Read the full article in UnHerd.


.
The vaccination line on April 14, 1947, at the New York City Health Department. Arthur Brower/New York Times

How New York City vaccinated
6 million people in less than a month

John Florio and Ouisie Shapiro, New York Times, 18 December 2020

“The smallpox eradication program is absolutely considered one of the crowning achievements of global public health,” Dr. DiMaggio said. “And it’s never been duplicated. Just the very idea that a disease was eradicated — a disease that ravaged humankind for millennia — is remarkable. And the reason we were able to do that is because of vaccinations.”

In 1947, most New Yorkers had been inoculated against smallpox. They’d been told the inoculation would protect them for life — but there was no guarantee. In some cases, the vaccine didn’t take. In others, the immunity wore off. Mr. Le Bar was proof of that.

Dr. Weinstein had some tough decisions to make.

The lab results reached him on Good Friday, April 4. In two days, New Yorkers would be gathering for the city’s annual Easter Parade. If only one of them had smallpox, even among a vaccinated population, the resulting outbreak could be devastating.

“Imagine the Easter Parade, and all these people crowded together on Fifth Avenue,” said Dr. Howard Markel, director of the Center for the History of Medicine at the University of Michigan. “All of them cheering and chanting, and potentially coughing and sneezing, and you have smallpox introduced into that picture. That is a public health nightmare.”

Dr. Weinstein wasted no time. Knowing there was only one way to deal with the virus — vaccination — he took action. At 2 o’clock that day, he held a news conference, urging all city dwellers to get vaccinated immediately, even if they had been inoculated as children. Re-vaccinations were necessary, he said, in case people had lost their immunity.

It was hardly without risk. Not only could the announcement cause mass hysteria, but in 1947, vaccines were not tested the way they are today. The vaccine available at the time could trigger rare but dangerous side effects, especially in people with weakened immune systems or particular skin conditions.

According to Dr. David Oshinsky, professor of medicine at NYU Langone Health, Dr. Weinstein acted in line with the scientific knowledge of the era. He made the right move, which was to vaccinate as many people as possible.

Dr. Markel agrees. “Weinstein was doing his job as best he could,” he said. “The risk of smallpox spreading and causing disease and death was far, far greater than the tiny risk of getting encephalitis or dying of the vaccine. So, I come not to bury Weinstein, but to praise him.”

Read the full article in the New York Times.


.

The other virus that worries Asia
Harriet Constable, BBC Future, 12 January 2021

But even as the world grapples with Covid-19, Wacharapluesadee is already looking to the next pandemic.

Asia has a high number of emerging infectious diseases. Tropical regions have a rich array of biodiversity, which means they are also home to a large pool of pathogens, increasing the chances that a novel virus could emerge. Growing human populations and increasing contact between people and wild animals in these regions also ups the risk factor.

Over the course of a career sampling thousands of bats, Wacharapluesadee and her colleagues have discovered many novel viruses. They’ve mostly found coronaviruses, but also other deadly diseases that can spill over to humans.

These include the Nipah virus. Fruit bats are its natural host. “It’s a major concern because there’s no treatment… and a high mortality rate [is] caused by this virus,” says Wacharapluesadee. The death rate for Nipah ranges from 40% up to 75%, depending on where the outbreak occurs.

She isn’t alone in her worry. Each year, the World Health Organization (WHO) reviews the large list of pathogens that could cause a public health emergency to decide how to prioritise their research and development funds. They focus on those that pose the greatest risk to human health, those that have epidemic potential, and those for which there are no vaccines.

Nipah virus is in their top 10. And, with a number of outbreaks having happened in Asia already, it is likely we haven’t seen the last of it.

There are several reasons the Nipah virus is so sinister. The disease’s long incubation period (reportedly as long as 45 days, in one case) means there is ample opportunity for an infected host, unaware they are even ill, to spread it. It can infect a wide range of animals, making the possibility of it spreading more likely. And it can be caught either through direct contact or by consuming contaminated food.

Someone with Nipah virus may experience respiratory symptoms including a cough, sore throat, aches and fatigue, and encephalitis, a swelling of the brain which can cause seizures and death. Safe to say, it’s a disease that the WHO would like to prevent from spreading.

Read the full article in BBC Future.


.

After record turnout, Republicans are trying
to make it harder to vote

Michael Wines, New York Times, 30 January 2021

According to the Brennan Center for Justice at New York University, state legislators have filed 106 bills to tighten election rules, generally making it harder to cast a ballot — triple the number at this time last year. In short, Republicans who for more than a decade have used wildly inflated allegations of voter fraud to justify making it harder to vote are now doing so again, this time seizing on Mr. Trump’s thoroughly debunked charges of a stolen election to push back at Democratic-leaning voters who flocked to mail-in ballots last year.

In Georgia, where the State House of Representatives has set up a special committee on election integrity, legislators are pushing to roll back no-excuse absentee voting. Republicans in Pennsylvania plan 14 hearings to revisit complaints they raised last year about the election and to propose limitations on voting.

Arizona Republicans have subpoenaed November’s ballots and vote tabulation equipment in Maricopa County, a Democratic stronghold that includes Phoenix. Legislators are taking aim at an election system in which four in five ballots are mailed or delivered to drop boxes.

Those and other proposals underscore the continuing power of Mr. Trump’s campaign to delegitimize the November election, even as some of his administration’s top election experts call the vote the most secure in history. And they reflect longstanding Republican efforts to push back against efforts to expand the ability to vote.

Democrats have their own agenda: 406 bills in 35 states, according to the Brennan Center, that run the gamut from giving former felons the vote to automatically registering visitors to motor vehicle bureaus and other state offices. And Democrats in the Senate will soon unveil a large proposal to undergird much of the election process with what they call pro-democracy reforms, with lowering barriers to voting as the centerpiece. Near-identical legislation has been filed in the House.

“There’s going to be a rush in the next year to legislate certain types of election reforms,” said Nate Persily, a Stanford University law professor and co-director of the Stanford-MIT Healthy Elections Project. “The jury is still out on whether the lesson from this election will be that we need to make voting as convenient as possible, or whether there will be a serious retrenchment that makes voting less accessible.”

In truth, who controls a given legislature will largely decide what chances a bill has.

In the 23 states wholly run by Republicans, Democratic bills expanding ballot access are largely dead on arrival. The same is true of Republican proposals to restrict ballot access in the 15 states completely controlled by Democrats.

But in some states where legislators’ control and interests align, the changes could be consequential.

Read the full article in the New York Times.


.

Riot on the Hill
Mike Davis, Sidecar, 7 January 2021

What was essentially a big biker gang dressed as circus performers and war-surplus barbarians – including the guy with a painted face posing as horned bison in a fur coat – stormed the ultimate country club, squatted on Pence’s throne, chased Senators into the sewers, casually picked their noses and rifled files and, above all, shot endless selfies to send to the dudes back home. Otherwise they didn’t have a clue. (The aesthetic was pure Buñuel and Dali: ‘Our only rule was very simple: no idea or image that might lend itself to a rational explanation of any kind would be accepted.’)

But something unexpectedly profound happened: a deus ex machina that lifted the curse of Trump from the careers of conservative war hawks and right-wing young lions, whose ambitions until yesterday had been fettered by the presidential cult. Today was the signal for a long-awaited prison break. The word ‘surreal’ has been thrown around a lot, but it accurately characterizes last night’s bipartisan orgy, with half of the Senate election-denialists channeling Biden’s call for a ‘return to decency’ and vomiting up vast amounts of noxious piety.

Let me be clear: the Republican Party has just undergone an irreparable split. By the White House’s Fuhrerprinzip standards, Pence, Tom Cotton, Chuck Grassley, Mike Lee, Ben Sasse, Jim Lankford, even Kelly Loeffler are now traitors beyond the pale. This ironically enables them to become viable presidential contenders in a still far-right but post-Trump party. Since the election and behind the scenes, big business and many mega-Republican donors have been burning their bridges to the White House, most sensationally in the case of that uber-Republican institution, the National Association of Manufacturers, which yesterday called for Pence to use the 25th Amendment to depose Trump. Of course, they were happy enough in the first three years of the regime with the colossal tax cuts, comprehensive rollbacks of environmental and labor regulation, and a meth-fed stock-market. But the last year has brought the unavoidable recognition that the White House was incapable of managing major national crises or ensuring basic economic and political stability.

The goal is a realignment of power within the Party, with more traditional capitalist interest groups like NAM and the Business Roundtable as well as with the Koch family, long uncomfortable with Trump. There should be no illusion that ‘moderate Republicans’ have suddenly been raised from the grave; the emerging project will preserve the core alliance between Christian evangelicals and economic conservatives and presumably defend most of the Trump-era legislation. Institutionally, Senate Republicans, with a strong roster of young talents, will rule the post-Trump camp and, via vicious Darwinian competition – above all, the battle to replace McConnell – bring about a generational succession, probably before the Democrats’ octogenarian oligarchy has left the scene. (The major internal battle on the post-Trump side in the next few years will probably center on foreign policy and the new cold war with China.)

That’s one side of the split. The other is more dramatic: the True Trumpists have become a de facto third party, bunkered down heavily in the House of Representatives. As Trump embalms himself in bitter revenge fantasies, reconciliation between the two camps will probably become impossible, although individual defections may occur. Mar-a-Lago will become base camp for the Trump death cult which will continue to mobilize his hardcore followers to terrorize Republican primaries and ensure the preservation of a large die-hard contingent in the House as well as in red-state legislatures. (Republicans in the Senate, accessing huge corporate donations, are far less vulnerable to such challenges.)

Read the full article in Sidecar.


.

Biden’s team and priorities show how
the Democratic Party changed in the Trump era
Perry Bacon Jr., FiveThirtyEight, 21 January 2021

Sanders, Warren and Ocasio-Cortez and the broader left wing of the Democratic Party unified behind Biden in the general election. But the Biden wing had won the primary and that was clear as Biden began to fill top jobs in Washington.

Biden hasn’t picked a lot of people for key jobs who endorsed Sanders or Warren for president or who are explicitly tied to the party’s more anti-establishment progressive wing. But he hasn’t explicitly cast off the left either. Instead, Biden has gone about filling the government and leadership of the Democratic Party with a demographically diverse group of establishment types who have moved left in recent years like Harris and Biden himself. Biden’s approach to filling out top jobs is perhaps best exemplified by his choices of Jaime Harrison, who was unsuccessful in his 2020 bid to be South Carolina’s first-ever Black Democratic U.S. senator, to be chair of the Democratic National Committee; Alejandro Mayorkas, who would be the first immigrant and first Latino to run the Department of Homeland Security; and Jake Sullivan as national security adviser. Sullivan, who is a white man, is not a unique choice based on demographic characteristics, but the one-time top adviser to Hillary Clinton’s 2016 campaign urged the party to become more populist after Clinton’s defeat.

Of course, there are some very progressive people who have been selected to key posts in Biden’s administration, including Rep. Deb Haaland as interior secretary and Gary Gensler and Rohit Chopra to lead key financial industry oversight departments. But we couldn’t do a story describing seven competing ideological wings in Biden’s Washington the way we did in 2017 when Trump came to office. Instead, in the Biden administration, there is one clear, dominant ideological view — left of Obama in 2016, not as left as Warren now.

“Left of Obama in 2016, not as left as Warren now,” of course, isn’t a precise ideology. But we are already getting some glimpses of what that means in practice. Incoming White House chief of staff Ron Klain explicitly described the four main focuses of the administration in a memo released the weekend before Biden was inaugurated: “[T]he COVID-19 crisis, the resulting economic crisis, the climate crisis, and a racial equity crisis.” It is hard to imagine that Obama would have so explicitly included racial issues as one of his top four goals in January 2009. In another leftward shift, Biden has said he will prioritize the economic standing of everyday Americans over trying to keep down the federal budget deficit; the latter had been a focus of Obama’s.

At the same time, there is little indication Biden will push for getting rid of the filibuster, forgiving most student loan debt by executive order or other priorities of the more progressive wing of the party. Having a President Biden, instead of a President Warren or President Sanders, means that the left is still largely locked out of power. The Democratic Party spent 2017 to 2020 debating the best strategy to defeat Trump. It will spend the next two years debating what, exactly, Biden should enact and push in terms of policy and what he should do to make sure Democrats do well in the 2022 midterms. And that debate is likely to feature a lot of the same left-vs.-center-left dynamics we’ve seen before.

Read the full article on FiveThirtyEight.


.

Can Twitter exist in a democracy?
Ed West, UnHerd, 11 January 2021

Back in the early 1980s, during the worst period of urban squalor and decay in the United States, a feeling of despair had set in about crime. Could cities ever be made liveable again? Who would want to raise a family in a place with so much everyday disorder and violence?

It was at this point that political scientist James Q. Wilson came up with the theory of Broken Windows, an idea that was to become hugely influential in turning the tide and restoring American cities to civility (much of which has been undone in 2020). Wilson argued that if the authorities crack down on minor incivility – graffiti, fare-dodging, panhandling – then very soon it will start to have an effect on major crimes too. It was to some degree basic common sense – give them an inch and they’ll take a mile – but then the 1960s had been a unique time of unlearning common sense in favour of exciting and fashionable new theories about human behaviour.

Broken Windows works partly because, even in the most violent places, huge amounts of serious crime are committed by a very small percentage of men. In Central American countries such as El Salvador or Honduras, which are plagued by horrific homicide rates, violence is mostly concentrated not just in a few neighbourhoods but even a few streets. Removing only a very small number of men has a drastic effect on wider society.

I often think about Wilson when perusing everyone’s favourite forum of thoughtful political debate, Twitter, which in terms of civility is somewhere around the period of The Warriors or Joker, the nadir of late 70s/early 80s urban decay.

If Twitter were a city it would be the sort of city where the authorities allow people to defecate in public or shoot up outside a school, and then express surprise when middle-class families wish to leave because of “the better quality of life” found in an exurb a four-hour commute away.

The situation has been deteriorating for some time, although users of the site have rather adopted a Golden Age myth of a non-existent time when Twitter wasn’t filled with hysterics and fanatics. But this weekend, and with the moral courage of Ecuador or Paraguay declaring war on Germany in February 1945, Twitter finally decided to ban Donald Trump. After years of winding people up, lying, inciting hatred and worse, the outgoing President had finally overstepped the mark on 6 January. Three days later, and his Twitter opus was gone.

To anyone who still believes in liberal democracy, the question of what to do with Donald Trump provides no easy answers – and anyone who says it is simple is a partisan with an axe to grind.

Read the full article in UnHerd.


.
Photo via Allanah Dore.

Xenophobia marks SA’s ongoing decline
Editorial, New Frame, 29 January 2021

The Stations of the Cross are a series of images – paintings or sculptures – used by Catholics to mark the key moments between the imposition of a death sentence on Jesus and his body being placed in the tomb. The faithful move from station to station, stopping to reflect and pray at each.

Most often carried out on Good Friday, the ritual focuses attention on the suffering of Jesus in preparation for the celebration of the resurrection, the moment of miraculous transcendence, on Easter Sunday.

In South Africa, our public life has been marked by a series of events over the past quarter of a century, many concretised in visual images. They lay out a journey that moves in the opposite direction, from the transcendence of the social sublime to a descent into horror.

In our current morass it is sometimes difficult to fully recall the scale and depth of the social hope that carried the struggles for liberation. For millions here and around the world, those hopes seemed to be concretised in the ascension of Nelson Mandela to the presidency of the country, a transcendent moment.

There are now multiple lines of descent, each marked out by a set of events and images, from that moment. The repressive violence of the new state was first generally grasped when television viewers witnessed the police murder of Andries Tatane during a protest in Ficksburg in 2011. It was swiftly followed by the Marikana massacre the following year.

The sadism carried within the state, a sadism without explicit political purpose, was drawn to public attention with the police murder of Mido Macie, a Mozambican migrant who was dragged behind a police vehicle in Daveyton in 2013 and then met his death in a holding cell. Last year, there was the series of police murders during the first Covid-19 lockdown. The sadism of the state has recently been reinforced with another indelible image – that of a Cabinet minister watching from an armoured police vehicle as the police used a water cannon on impoverished people queueing for disability grants in Cape Town.

The public understanding of the corruption of the state runs from the arms deal, through to the looting in the Jacob Zuma period and then the shock that the initial sense of social solidarity in response to the Covid-19 pandemic had been met with the cynical wholesale theft of resources allocated to ameliorate the crisis.

Read the full article in New Frame.


.

Crime and banishment
Cathy Young, Arc Digital, 17 January 2021

Among other things, Burnett’s essay piqued my interest because, as an example of painful but righteous public shaming, she cites a moment from a film based on one of my favorite books: Dangerous Liaisons (1988). In the film’s penultimate scene, the Marquise de Merteuil (Glenn Close), whose charming exterior conceals a ruthless and sexually depraved schemer and whose machinations have led to the death or ruin of several people in her social circle, arrives in her box at the Opera only to realize that she has been exposed (one of her victims has made public her damning letters to her accomplice-turned-enemy, the now-dead Vicomte de Valmont). She endures first the silent accusatory stares of hundreds of people and then a mounting chorus of boos and hisses. In the film’s final moments, when she stares into her boudoir mirror while scrubbing off her makeup, she has been transformed into a social outcast. Terrible, but just.

It’s a striking analogy. But in fact, Dangerous Liaisons — the 1782 epistolary novel by Pierre Choderlos de Laclos more than the film — offers, among many other things, a trenchant portrayal of a toxic “cancel culture.” (I wrote about the book and its adaptations a few years ago for the now-defunct Weekly Standard.)

A “cancel culture” certainly exists in the late 18th-century French aristocratic society in which the story takes place. Status and reputation are everything; violations of society’s unspoken and often arcane codes are punished by loss of reputation and, sometimes, by social death. In Merteuil’s case, the outcast status is well-deserved. But others, especially women, may suffer the same fate for nothing more than being caught up in a humiliating sexual scandal. (Having lovers is fine; scandal is not.) At one point, when Merteuil is pursued by the handsome rake Prévan, Valmont tells her about a past adventure of his which she missed due to travel abroad. Prévan, we learn, simultaneously wooed three young women who were close friends, getting them to cheat on their husbands and their regular lovers. Then, he talked the three wronged lovers out of dueling and persuaded them instead to punish the fickle women: All three are invited to a rendezvous with Prévan at the same place and time, only to have the deceived lovers show up instead — followed by Prévan. The women’s humiliation becomes the talk of upper-class Paris the next day. The story concludes with the pitiless results of their “cancellation”: “One is now in a convent while the other two languish in exile on their estates.” (For Prévan, of course, the notoriety only enhances his glamor.)

Another woman who ends up in a convent is Cécile de Volanges, the 15-year-old heiress Merteuil has plotted to corrupt for revenge on her fiancé: at the end of the novel, when the tangle of intrigue explodes in scandal, she takes the veil to avoid the ruin that awaits her if her lapses are exposed. Her offenses: a sexual relationship with Valmont (which began with de facto rape by blackmail), a pregnancy and miscarriage, and a night with Danceny, the penniless noble she loves. Earlier, Valmont uses Cécile’s fear of “cancellation” to coerce her into compliance. After getting her to give him her bedroom key so that he can have a copy made and use it to deliver Danceny’s letters, he enters her room with a very different purpose and stops her from calling for help by pointing out that she would be “ruined forever”: no one would ever believe that he came uninvited.

Read the full article on Arc Digital.


.

Which black lives matter?
Preston H Smith, Catalyst, Vol 4, No 3

Reed turns next to a nuanced dissection of race and racism, observing that Coates and Obama’s ideas about race and culture occupy two sides of the same racial coin. While Coates’s cultural nationalism makes him call out Obama’s post-racialism, Reed intimates that the two share more common perspectives about race than at first appears. Despite the fact that their views on race are “diametrically opposed,” what these “black emissaries of neoliberalism” share is taking racial inequality out of a political economy context. Race is primary because, for both of them, the cause of and solution to racial inequality remain largely in the racial domain. For Obama, the lack of black upward mobility is due, at least in part, to poor blacks’ failure to learn bourgeois norms, rather than to their inability to secure adequate income and social goods, which places him squarely in the tradition of racial uplift. Coates, on the other hand, blames an unchanging white racial prejudice and discrimination as the culprit for anti-black disparities, which only whites can remedy through atonement and compensation. Reed argues that both treat race and racism culturally without consideration for its material sources or the effectiveness of anti-racist policies to improve the material conditions of precarious black citizens. For instance, when anti-racists advocate race relations training, cultural tutelage, or even reparations, these proposals are curiously in line with neoliberal opposition to a state bent on downward redistribution and tight regulation of the market. Reed persuasively argues that, in this case, separating race from class abets neoliberalism and is a class politics on its own.

Reed reminds us that the liberal tendency to substitute culture for political economy emerged from a combination of the Cold War’s repression of radicals and a pro-growth postwar economy tasked with managing endemic poverty and inequality, both of which worked to discredit the “social democratic promise” of the New Deal. Reed identifies the crucial turn in postwar political economy when culture supplants class in both the analysis of and solutions to poverty. In utilizing a historical materialist method, he draws important distinctions between the New Deal, Cold War, and War on Poverty regimes when it comes to specifying the ways capital has limited the reach and scope of liberal government. Unlike Coates, who sees an unswerving approach to the distribution of welfare state spending from FDR to Barack Obama, Reed shows how subsequent liberal and neoliberal regimes replaced the broadly redistributive policies of the New Deal with an assortment of means-tested government programs and voluntarism, such as cultural tutelage for “disadvantaged youth.”

Read the full article in Catalyst.


.

Why the West isn’t racist
Ralph Leonard, UnHerd, 28 January 2021

“This system can no more provide freedom, justice and equality than a chicken can lay a duck egg.” A few days after the death of George Floyd, the British academic Kehinde Andrews, who specialises in Black Studies, quoted Malcolm X in an interview. Fires burned across America, statues were toppled, problematic TV episodes were erased from history and L’Oréal removed the word “fair” from its beauty products. It all seemed so profound. Yet Andrews was under no illusions. “Today’s inequality,” he argued, “is the cul-de-sac we went down when we tried to reform racism out of a fundamentally racist system.” And there’s only one way out of a cul-de-sac.

Andrews’s book The New Age of Empire is an extension of that argument. In it, he cautions sympathetic readers not to get too giddy over the embrace of “anti-racism” by officialdom. As he sees it, it will only lead to “meaningless change” and “token gestures”. A self-proclaimed “black radical”, Andrews wants to attack the root of the problem: the West itself and the “logic of empire” that organises it.

Unsurprisingly, ground zero for Andrews’s critique is the Enlightenment — the “sacred foundation of Western knowledge”, as he sardonically puts it. All our modern ideas of freedom and equality are traditionally traced back to this Age of Reason. Andrews argues against that narrative. Kant’s racist anthropology and Hume’s polygenism, he writes, “provided the universal and scientific framework of knowledge that maintained colonial logic”, which is the central organising principle of the current “political and economic system and therefore infects all interactions, institutions and ideas”.

Kant et al. placed white people firmly at the top of the racial hierarchy. And that codification of black inferiority was used to justify Western imperialism, which has gone through several mutations — from the relentless expeditions into the New World and Atlantic slavery, to 19th-century European colonialism, to the present, in which the United States inherits imperial responsibilities. In our allegedly “post-racial” age, institutions such as the IMF and the World Bank continue to exploit Africa through unfair “structural adjustment” programs that force austerity and privatisation upon poor African nations, further crippling their economies. No matter how “the logic of empire” mutates, the premise is still the same: “The West is rich because the rest is poor”.

As an epigone of Malcolm X, Andrews is not interested in interracial allyship as a solution. “The white left”, as he calls them, are too in thrall to the “psychosis of whiteness” to recognise that racism and imperialism are baked into their own politics. Echoing Maoism-Third Worldism, he chastises the white Left for failing to see that the “true revolutionary class has always resided outside the West”. As far as he’s concerned: “If you have come this far and believe that White people offering a meaningful hand of friendship is the solution, then you have missed the point.”

Read the full article in UnHerd.


.

Why black matters
Rahila Gupta, New Internationalist, 18 January 2021

My own personal journey to ‘Black’ began with me self-identifying as Indian when I first arrived in the UK in 1975. This slowly morphed into ‘Asian’ as a way of situating myself within the national discourse and stayed that way until I joined the anti-racist struggle and adopted ‘Black’ gratefully almost overnight as a marker of my politicization.

Sivanandan, Director of the Institute of Race Relations from 1972 onwards, articulated a definition of ‘black’ which left its mark on a generation of activists. For him, it was a political colour. It stood for an anti-racist, anti-imperialist and anti-capitalist politics rooted in the Global South and the independence struggles in British colonies, many of whose representatives were based in London and understood the strength that comes from unity and collective action.

This premise opened the way for inclusion of the Irish too, which reveals the extent to which Sivanandan wrenched ‘black’ away from its literal moorings. However, the concept of Black, then as now, but more so now, was shot through with tension between its ‘ethnic’ and ‘political’ roots.

As an umbrella term, Black had been knocking around in the anti-racist and independence movements for a while. The Black Panthers, who came into existence in the UK in 1968, had Asian, African and Caribbean members. Farrukh Dhondy, an Indian member of the group, said in a podcast interview with Reni Eddo-Lodge: ‘There was no colourism in the Black Panther movement, obviously there were no white members. There were supporters, associates, but the membership was basically Asian and Black. They saw it as a common fight against the ex-colonial masters.’

Whether or not ‘black’ was used as a descriptor, there were a number of organizations campaigning against the British state which drew their membership from all three communities, a form of joint organizing that is less in evidence today.

In his essay ‘From resistance to rebellion: Asian and Afro-Caribbean struggles in Britain’ for the journal Race and Class, Sivanandan traces the history of black struggles starting as early as 1945 when Asians, Africans and West Indians came together in a Subject Peoples’ Conference, followed by the setting up of the Co-ordinating Committee Against Racial Discrimination (CCARD) in Birmingham and the Conference of Afro-Asian Caribbean Organisations (CAACO) in London in 1962. All of these local developments were underpinned by the Bandung Conference of 1955, where the newly independent states of Africa and Asia had come together to form a joint force against the colonial and imperial West.

Read the full article in the New Internationalist.


.

Free speech and the question of race
Stephen Rohde, LA Review of Books, 24 January 2021

There is no question that Titley presents a consistent and elaborate argument grounded in a compassionate and genuine sympathy for the victims of racism. But his argument is deeply flawed, beginning with the elemental obligation of defining what he’s talking about. While everyone, including Titley, has their own working definition of “racism,” in a book whose essential purpose is to eliminate an entire classification of speech from public discourse, the author has a special responsibility to define what he means by “racist speech.” He is advancing a very serious project of categorically excluding what he calls “racism” from the protections afforded to the universal notion of free speech. But what ideas, ideologies, practices, and policies are encompassed within Titley’s definition of “racism” which are sufficiently false, discredited, irrational, and beyond the pale to no longer deserve to be debated? What ideas, ideologies, practices, and policies should be blocked, canceled, or disrupted? Titley never says.

In just a moment of reflection, a number of possibilities come to mind which might — or might not — qualify to be blocked, canceled, disrupted, or placed beyond the realm of debate in a world controlled by the speech regime Titley recommends: Is it racist to counter “Black Lives Matter” with a demand that “All Lives Matter”? Is it racist to call racial sensitivity training “divisive anti-American propaganda”? Is it racist to label Critical Race Theory “cult indoctrination”? Is it racist to deny that America is inherently a racist country? Is it racist to complain about multiculturalism, ethnic studies, and identity politics? Is it racist to claim that there is more Black-on-Black crime compared to police shootings of unarmed Black men? Is it racist to argue that affirmative action programs constitute reverse discrimination against whites? Is it racist to object to reparations? Is it racist to prohibit the teaching of the New York Times 1619 Project in public schools? Is it racist to deny the existence of white privilege and white fragility? Is it racist to deny that systemic racism exists in law enforcement? Is it racist to demand that athletes salute the American flag rather than take a knee? Is it racist to oppose the removal of Confederate statues? And so on.

Remember, to deem the expression of any or all of these ideas “racist” under Titley’s free speech regime is to exclude them from any further debate and to allow them to be blocked, canceled, or disrupted when (or even before) they are uttered. This is the essential problem with Titley’s book. If after 155 pages he is proposing something else, what is it? A book fails when a careful reading leaves a reader in such a quandary.

And there are other problems with his arguments. Titley bridles at the suggestion that anti-racists are trying to “censor” racist speech, but his entire argument — placing certain speech beyond debate and allowing it to be blocked, canceled, and disrupted — validates that very suggestion. Long passages from his book, some of which are quoted in this review, will be Exhibit A the next time Richard Spencer or other alt-right speakers accuse anti-racists of seeking to suppress what they have to say. This only serves to play into the white supremacists’ hands, further solidifying their claim to the hallowed role of Defenders of Free Speech.

Read the full article in the LA Review of Books.


.
Farmers’ protest, New Delhi (Photo via Times of India)

Why India has become a different country
Salil Tripathi, Foreign Policy, 27 November 2020

If ticking boxes were sufficient to evaluate democracies, then India still gets the right ticks—it holds elections periodically, it has an independent judiciary and a constitution that safeguards minority rights and recognizes individual rights, privately owned media operates, and opposition parties exist and are in parliament.

And yet, the essence of democracy is not the form, but its content; the norms, not the laws; it is not the presence of the structures, but whether those structures function the way they are meant to, and whether checks and balances set the system right when things go wrong, that determines democracy. And by those criteria, India—long described as the world’s largest democracy—has been failing persistently.

To be sure, India has always had caveats placed around its democracy. It has laws that permit detention without trial; it sends troops to quell dissent and has laws that human-rights groups say allow the army to act with impunity; the freedoms it grants its citizens come with many restrictions; between 1975 and 1977 then Prime Minister Indira Gandhi declared an internal emergency and jailed opposition leaders and suspended some laws that guaranteed fundamental rights; and it has periodically erupted in brutal violence, as in the inter-religious riots of 1984 and 2002. And yet, its descent has been sharp and steep since 2014, when Modi became the prime minister of the first right-wing, Hindu nationalist government elected on its own strength, not requiring coalition partners (although it does have parliamentary allies).

The world took notice of the deteriorating situation when, in late September, the government’s conflict with Amnesty International, the world’s leading human rights organization, played out in the open. That’s when Amnesty decided to shut its Indian operations because of what it called a witch-hunt, and several international human-rights organizations expressed alarm. India promptly dismissed Amnesty’s concerns, saying the organization was under investigation for possibly breaking Indian laws regulating foreign financial contributions to civil society groups. Amnesty has denied those charges. The Indian government has been after Amnesty for some time now, upset over its many reports critical of the government’s human rights record in Kashmir and its curbs on peaceful protests, and it has charged Amnesty with sedition for organizing events that the government considers anti-national.

India always had complicated and onerous rules regulating foreign funds going to civil society groups. The Modi administration has tightened those rules, making it harder for groups that advocate policy reforms or protection of rights to receive funds from abroad; groups that do so-called humanitarian work such as poverty alleviation or delivering services (but not questioning why poverty persists) are preferred. Globally, the space for civil society has been shrinking, and like Brazil, Hungary, and Russia, India has sharpened laws making it tougher to dissent.

Read the full article in Foreign Policy.


.

Paris 1961: a hidden massacre
Tom Whittaker, New Frame, 10 December 2020

Leaving aside situations of insurrection, revolution or civil war, the massacre of Algerian demonstrators that took place from 17 to 20 October 1961 in Paris constitutes “the bloodiest act of state repression of street protest in Western Europe in modern history”. Jim House and Neil MacMaster’s account of 17 October and its complex legacies in Paris 1961: Algerians, State Terror and Memory (Oxford University Press, 2006) is rich in detail, based upon a combination of extensive research in recently opened state archives, oral sources and secondary literature. The result is a highly significant contribution to the polarised historiography of 17 October, the Left in this debate being represented by Jean-Luc Einaudi and the Right by Jean-Paul Brunet.

Of course, when the state archives were eventually opened in the late 1990s, Brunet was given privileged access some 30 months before Einaudi. He subsequently emerged claiming that only 30 Algerians had been killed and that the events of 17 October could hardly be called a massacre. Such historical controversies are a product of the fact that the French state was, from the outset, able to erase this act of repression from public view.

On 17 October 1961, as the Algerian war for independence neared its end, 30,000 Algerian migrants set off from the bidonvilles (shanty towns) to demonstrate in Paris. They assembled to show their support for the Algerian National Liberation Front, and their opposition to a police curfew and rising levels of violence they faced at the hands of the Harkis (pro-French Algerians) and the French police. On arriving in Paris, demonstrators were met with extreme violence. Many were either shot or bludgeoned to death, their bodies dumped in the Seine. Meanwhile, thousands of demonstrators were taken to “holding centres” where they faced torture before being deported into the hands of the colonial authorities in Algeria. The police acted as they did confident in the knowledge that they would not be held to account by President Charles de Gaulle or anyone else in the French government.

Maurice Papon, the chief of police, congratulated himself upon having smashed the National Liberation Front and claimed that only two demonstrators had been killed. House and McMaster argue that it is hard to ascertain exactly how many died, but that there were at least 120 excess Algerian deaths throughout the whole of September and October 1961 due to police violence. In this sense, 17 October was the peak of a cumulative wave of repression unleashed against Algerians during autumn 1961. Others estimate the number killed at the demonstration was 200. In terms of colonial violence, this was unique in only one regard – that it took place in the capital of the imperial metropole rather than in the colony. The book also provides an account of how methods of colonial policing were imported from Algeria to France; many key figures within the Parisian police, including Papon, were veterans of repression in the colonies.

Read the full article in New Frame.


.

Gulf slave society
Sam Haselby, Aeon, 22 January 2021

The six city-states on the Arab side of the Persian Gulf, each formerly a sleepy, pristine fishing village, are now all glitzy and futuristic wonderlands. In each of these city-states one finds large tracts of ultramodern architecture, gleaming skyscrapers, world-class air-conditioned retail markets and malls, buzzing highways, giant, busy and efficient airports and seaports, luxury tourist attractions, game parks, children’s playgrounds, museums, gorgeous beachfront hotels and vast, opulent villas housing fabulously affluent denizens. The six city-states – Dubai and Abu Dhabi in the United Arab Emirates (UAE), Manama in Bahrain, Dammam in Saudi Arabia, Doha in Qatar, and Kuwait City in Kuwait – grew into these luminous metropolises beginning in the 1970s, fuelled by the discovery of oil and gas, an oligarchic accumulation of wealth, and unconditional grants of political independence from the United Kingdom, the former colonial master of the region. Thereafter, the family-run polities that took control of these city-states began to attract huge amounts of financial capital from all over the world. Abu Dhabi, the capital of the UAE, has been described as ‘the richest city in the world’, with wealth rivalling that seen in Singapore, Hong Kong or Shanghai. Like those cities, Abu Dhabi is swimming in over-the-top affluence. According to a 2007 report in Fortune magazine, Abu Dhabi’s 420,000 citizens, who ‘sit on one-tenth of the planet’s oil and have almost $1 trillion invested abroad, are worth about $17 million apiece’.

The Persian Gulf has a venerable history, stretching back to ancient times. It has always been a cosmopolitan and diverse centre of wealth and commerce. For nearly 1,000 years, Dilmun, a Bronze Age Arabian polity based in what is today Bahrain, controlled the trading routes between ancient Mesopotamia and the Indus river valley. During the Abbasid caliphate, a 500-year-long Islamic empire based in Baghdad, mercantile entities in Basra and al-Ubulla, at the head of the Gulf, dominated trade and commercial links with East Africa, Egypt, India, Southeast Asia and China. One could buy anything in this trade, including giraffes, elephants, precious pearls, silk, spices, gemstones and very expensive Chinese porcelain. Omani Arabs, who periodically controlled the maritime entrance to the Gulf at the Strait of Hormuz, were known as the ‘Bedouins of the Sea’. They came to control the trading routes with East Africa, transporting spices, precious stones and many other luxury commodities.

Slavery and slave trading formed a major part of this commercial history, particularly after the advent of Islam. Africans, Baluchis, Iranians, Indians, Bangladeshis, Southeast Asians and others from the Indian Ocean littoral were steadily and involuntarily transported into the Gulf in increasingly large numbers, for work as domestic servants, date harvesters, seamen, stone masons, pearl divers, concubines, guards, agricultural workers, labourers, and caretakers of livestock. Historians have noted that there was a great upsurge of slave trading into the region in the 18th and 19th centuries, during the heyday of the Indian Ocean slave trade. Many Persian Gulf families became very wealthy as a result of this upsurge. This is the backdrop for what turns out to be a very ugly and sad aspect of the spectacular rise of contemporary social orders in the six Gulf city-states. Each is an example – together, perhaps the only examples existing in the world today – of what the sociologist Moses Finley (1912-86) called a ‘genuine slave society’.

Read the full article in Aeon.


.

Caste does not explain race
Charisse Burden-Stelly, Boston Review, 15 December 2020

Wilkerson claims that the caste system is about power, not “feelings or morality.” However, insofar as the book draws on her own experiences as a Black professional in the United States, it suggests that perception, judgement, and assumptions sustain structural inequality. This is because, for her, the stereotypes and messaging that uphold caste derive from “automatic, unconscious, reflexive response[s] to expectations.” For example, in her discussion of scapegoating, Wilkerson presents the perception of Black poverty as a more significant problem than Black poverty itself. “Little more than one in five African-Americans, 22 percent, are poor,” she argues, “and they make up just over a quarter of poor people in America, at 27 percent.” However, when the news portrays poor people, Black families account for 59 percent of those depicted. This, she contends, “shape[s] popular sentiment” and makes “black [a] synonym for poor.” Wilkerson does not discuss why a disproportionately high number of Black Americans are impoverished or what reproduces this inequality. The chapter title “The Heart Is the Last Frontier” sums up Wilkerson’s view: self-reinforcing attitudes and behaviors sustain caste. Thus, social change only requires shifting the behaviors and attitudes of those positioned as superior. Radical change must then flow from “dominant caste” individuals who recognize the plight of the “subordinate caste” and choose to reject the system. Cox, on the other hand, posits that any effort to change the racial order of the United States must attack the racially hierarchical political economy, not perceptions and attitudes.

Wilkerson’s reasoning allows her to position Nazi Germany, India, and the United States as comparable caste systems, claiming that each society’s perceptions of and attitudes toward those at the bottom maintained its respective social hierarchies.

Wilkerson focuses on the Nazi fixation on purity of blood in determining who was Aryan. Chapters such as “The Euphoria of Hate” and “The German Girl with the Dark, Wavy Hair” give the impression that negative perception was as important to upholding the Nazi “caste system” as was the extermination of Jews and other undesirable populations. These chapters are perhaps the least rigorous and compelling sections of the book, neglecting responsible historical analysis. As Sunil Khilnani notes, “the final objective of Nazi ideology was to eliminate Jewish people, not just to subordinate them.”

Wilkerson’s analysis of caste in India is similarly superficial insofar as she treats the Indian caste system as essentially unchanged over some 4,000 years. She neglects to mention, for example, that British colonial rule used existing caste distinctions as instruments of colonial domination, much as it had in Africa, to impose a more rigid social structure. Ignoring these historical dynamics, she describes caste’s function in India today through her own observations: “I could see that the upper-caste people took positions of authority, were forthright, at ease with being in charge, correcting and talking over lower-caste people,” she explains. “On the other hand,” she continues, “the Dalits, as if trained not to bring attention to themselves, sat in the shadows, on the periphery. . . . asking few questions, daring not, it seemed, to intrude upon an upper-caste domain or conversation.” These descriptions draw on secondary sources, alongside anecdotal evidence and personal observations gathered during a brief trip she took to give a talk there.

Read the full article in the Boston Review.


.

History from below
Priya Satia, Aeon, 18 December 2020

After the Second World War, historians asked us to shift our focus from great men to the actions and experiences of ordinary people, to culture rather than institutions. This methodological shift to ‘history from below’ was political, supporting a democratic vision of political, social, intellectual and cultural agency as the Cold War stoked authoritarian impulses in the East and West. It sought to rectify historians’ paternalistic habit of writing about the people ‘as one of the problems Government has had to handle’, as E P Thompson put it, as objects rather than subjects of history. Influential as this trend was, great-man history retained a cultural hold too and, today, the would-be ‘great men’ dominating political stages around the world, however caricatured in form, challenge democratic visions of how history has been and should be made. ‘History from below’ succeeded in throwing out the chimera of great men while preserving the chimera of the nation that was the most common excuse for their invocation. Revisiting its origins might reveal why.

Thompson is perhaps the figure most popularly associated with ‘history from below’, specifically his totemic work, The Making of the English Working Class (1963). Expansive as its cast is, its geographical scope is constricting. Though set in the era of British conquest of vast swathes of the world, it barely acknowledges that reality. This is doubly strange, given that Thompson wrote it while decolonisation was forcing Britons to contend with the ethics of empire, and was himself descended from a line of colonial missionaries deeply engaged with such matters. His classic text created an island template for the most progressive British history of the late-20th century, unwittingly legitimising the nostalgic view of ‘Little England’ that has culminated in Brexit. The book’s enormous impact also ironically endowed Thompson with fairly robust great-man status himself, as the iconic historian-activist of his time.

Is there a history from below, or at least a wider genealogy, that might explain the paradoxical political and intellectual event that was ‘E P Thompson’? How does the picture shift if we recall the choir of voices harmonising with his – his working-class students, fellow British social historians such as Raphael Samuel and Christopher Hill, and European antecedents such as Georges Lefebvre and the Annales school? Or the masses of people – including Thompson – whose collective experience of the global calamities of the 1940s forced reconsideration of progress-wrought-by-great-men as the most practical or believable model of historical narration? Even this tentative shifting of scale tends to dislocate Thompson from the provincial fastness of his work. As it turns out, his ‘Little England’ focus was not a case of early Brexitism, or a patriotic move in the aftermath of Britain’s finest hour, or even a function of his belief in the nation as history’s natural subject. Rather, it was a mirage that ossified into an intellectual reality. Recovering the high-stakes global arguments within the Left in Thompson’s time reveals the cosmopolitan concerns behind his English preoccupations, their short-term cultural payoffs and long-term political and disciplinary costs.

Read the full article in Aeon.


.

No Homers club
Sam Kriss, Damage, 27 January 2021

The Wall Street Journal reports, with some panic, that “Even Homer Gets Mobbed.” How, exactly, do you mob someone who’s been dead for nearly three thousand years? The article explains: “A sustained effort is under way to deny children access to literature. Under the slogan #DisruptTexts, critical-theory ideologues, schoolteachers and Twitter agitators are purging and propagandizing against classic texts.” Unlike the carefully and sensitively crafted young-adult fiction of today, these classics risk doing harm to our children. One of the educators behind the slogan summed up its general theory: “Did y’all know that many of the ‘classics’ were written before the 50s? Think of US society before then & the values that shaped this nation afterwards. THAT is what is in those books. That is why we gotta switch it up.” Ugh, gross—the past.

In actual practice, what this mobbing of Homer amounted to was a single high school teacher proudly announcing that she’d removed the Odyssey from her English curriculum. This is a strange thing to be proud of, but not exactly a cataclysm. Even so, the article provoked a brief firestorm. It’s still worth asking, though: of all the people furious at this attack on the Odyssey, how many have actually read it? Do these culture warriors storm across the loud-roaring sea with sails unfurled, carrying a treasure of many tripods and copper bowls? Don’t be stupid. Who has the time? Never mind books of epic poetry; we can barely watch films, not without automatically messing around on our phones. Sometimes I’ll make the mental effort to turn the thing off and focus on the other, bigger screen—only to find my hands still fiddling with it a second later, tapping away of their own accord. We’re all living in a permanent daze; we need distractions from our distractions. Linear, sequential media is always too big and cumbersome for our churning algorithmic now.

The people outraged on Homer’s behalf don’t really care about his work; as always, the canon wars have very little to do with the actual texts contained in any given canon. Instead, Homer acts as a fetish-object, standing for some hazy concept of a Western tradition: the veneration of dead European males, the capitalist mode of production, the guns on the hips of the police. Poor blind Homer: they’ll pick him up by his feet and swing him around as a bludgeon. Never mind that the actual poet doesn’t really belong to anything called ‘the West,’ that he was probably not a European, and that (if you subscribe to certain eccentric Victorian theories) he might not have even been male. It doesn’t matter. Why think too much about the past? Don’t you know we’re in a war?

It’s hard to blame these people too much: nobody reads anything any more. One of the most popular thinkers in circles like #DisruptTexts is Frantz Fanon. When they rage against Achilles, it’s in his name: tear down these monuments to white supremacy! Build something better, more inclusive, more relevant in their place! It’s worth asking if any of the champions of Black Skin, White Masks have actually read their book either. Did they not notice that it’s an intricately structured Bildungsroman? Did they miss its glorious final chapter? “The Negro, however sincere, is the slave of the past. Nonetheless I am a man, and in this sense the Peloponnesian War is as much mine as the invention of the compass… I am a man, and what I have to recapture is the whole past of the world. I am not responsible solely for the revolt in Santo Domingo.” Frantz Fanon was one of the most brilliant thinkers of the twentieth century. Again, it doesn’t matter. He’s just another cudgel in the bush wars of our culture. A book is a large heavy object, useful for battering skulls.

Read the full article in Damage.


.
Developing zebrafish embryo (Photo: Philipp Keller/HHMI Janelia Research Campus via Nature)

The secret forces that squeeze and pull life into shape
Amber Dance, Nature, 13 January 2021

At first, an embryo has no front or back, head or tail. It’s a simple sphere of cells. But soon enough, the smooth clump begins to change. Fluid pools in the middle of the sphere. Cells flow like honey to take up their positions in the future body. Sheets of cells fold origami-style, building a heart, a gut, a brain.

None of this could happen without forces that squeeze, bend and tug the growing animal into shape. Even when it reaches adulthood, its cells will continue to respond to pushing and pulling — by each other and from the environment.

Yet the manner in which bodies and tissues take form remains “one of the most important, and still poorly understood, questions of our time”, says developmental biologist Amy Shyer, who studies morphogenesis at the Rockefeller University in New York City. For decades, biologists have focused on the ways in which genes and other biomolecules shape bodies, mainly because the tools to analyse these signals are readily available and always improving. Mechanical forces have received much less attention.

But considering only genes and biomolecules is “like you’re trying to write a book with only half the letters of the alphabet”, says Xavier Trepat, a mechanobiologist at the Institute for Bioengineering of Catalonia in Barcelona, Spain.

Over the past 20 years, more scientists have started paying attention to the importance of mechanics across a variety of developmental stages, organs and organisms. Researchers have begun to define the mechanisms by which cells sense, respond to and generate forces. They have done so by inventing bespoke tools and tricks, incorporating lasers and micropipettes, magnetic particles and custom-built microscopes. Most researchers are probing mechanical signals using cells or tissues cultured in a dish. But a few groups are studying whole animals, and sometimes they find different principles at work from those apparent in isolated tissues. These in vivo studies come with many challenges — such as measuring tiny amounts of force in complex tissues — but they are key to understanding the role of force in sculpting life, says Roberto Mayor, a developmental biologist at University College London.

As a handful of determined scientists begin to surmount those challenges, they’ve observed crucial forces shaping biology — from the earliest stages of an embryo’s existence to diseases that strike later in life. Down the line, this information might help scientists to design better interventions for problems such as infertility or cancer.

“Forces will operate in every single instance where shape is at play,” says Thomas Lecuit, a developmental biologist at the Developmental Biology Institute of Marseille in France.

Read the full article in Nature.


.

The quiet disappearance of Britain’s public libraries
Adele Walton, Tribune, 17 January 2021

Since the election of the Conservative-led coalition in 2010, cuts to public spending have been vast and widespread, crippling our National Health Service, undermining our social care system, and squeezing local councils into ever-tightening budgets. Among the often silent victims of this insidious process are our public libraries.

Libraries are a gravely underestimated public good with immense communal value. As well as providing free access to books, libraries improve outcomes in general health, digital literacy, and employment skills, and build community resilience. In 2014-2015, visits to libraries topped attendance at Premier League football games, the cinema, and the country’s top ten tourist attractions combined.

However, in an austerity climate, libraries have been targeted as a disposable resource. Spending on libraries stood at £1 billion in 2009, but by 2019 it had declined by a quarter. The same decade saw 773 libraries close – one fifth of all those in the UK. The Conservatives’ assault on public libraries is a deep injustice to all community members, but it’s one that disproportionately affects underprivileged children.

The pandemic, with its required home-schooling, has proven how much low-income families rely on public services for educational resources and internet access. The 773 closures had already had serious consequences for educational outcomes pre-pandemic, and with millions of families pushed into poverty by Covid, those outcomes are only going to get worse.

The Institute of Education found that children who read for pleasure make significantly more progress in maths and English than those who do not read – but the ability to participate in reading outside school is limited for those without disposable income to spend on books. This is a key factor in disparities in educational outcomes: research has found that children who own books are six times more likely to read above the expected level for their age, and with over 380,000 children in the UK not owning books, closures of public libraries are reinforcing the divide.

A widening of the cleft between rich and poor is a common feature of Conservative educational policies. Schools in more deprived areas in England have seen the largest cuts to funding per pupil since 2010. With less spent on each child, schools are stretched to provide resources and are essentially pressured to take on the role of the welfare state. In a system which sees the most prestigious schools as a funnel to elite universities and thus secure employment—often in government, business, or both—and where exam success defines wider life outcomes, closures of libraries mean the chance to succeed for poorer children is further suppressed.

Read the full article in Tribune.


.

A letter from a Florida inmate asked for help. It arrived too late.
CJ Ciaramella, Reason, 21 January 2021

I wondered how Mathis ended up in Lowell Correctional Institution, a place so wretched that the Trump Justice Department recently put Florida on notice that the conditions there violated women’s constitutional rights against cruel and unusual punishment. A 2015 Miami Herald investigation found rampant sexual abuse, along with inadequate medical care, rancid food, and vermin infestations.

A transcript from Mathis’ 2009 sentencing hearing started to fill in the blanks. Mathis had pleaded guilty to three counts of trafficking hydrocodone after selling pills to an undercover officer and a confidential informant.

“I had a dependency on prescription pills which clouded my judgement,” Mathis begged the judge. “I have a loving family waiting for me in Michigan. I also have a brand new grandson that I would like to help raise, also three sons in their twenties. I worked at General Motors for 20 years and can be a productive human being. There is a good possibility that I could get my job back and retire in 10 years. Please don’t take my life away from me.”

The Florida legislature passed harsh mandatory minimum sentences in 1999 in response to the state’s booming black market for opioid pills. The laws were supposed to target drug kingpins, but the weight thresholds to trigger a mandatory minimum sentence were so low that they mostly ensnared people selling to support their habit. 

As in many other cases Krisai and I had come across, the judge at Mathis’ sentencing was disgusted at the sentence he was required to impose, but he had no choice.

“I do not for the life of me understand why the Florida legislature has come up with the mandatory minimum sentences they have for prescription painkillers, in which a relatively—in the court’s view—small amount of prescription painkillers results in just an incredibly harsh sentence,” circuit judge Jon Morgan said. “But it is not my job or option as a judge to decide what the law should be. I’m required to follow the law as it is. And I’m sorry that it is what it is in your particular case.”

Read the full article in Reason.


.

Without God or reason
Morten Høi Jensen, Commonweal, 6 January 2021

I first read Albert Camus when I was seventeen. I borrowed my father’s well-thumbed copy of The Stranger and greedily consumed it in one sitting on a train from Frederikssund to Copenhagen. I was spellbound. How could I not be? There in the novel was the bright Algerian sun and the shimmering Mediterranean Sea; outside my window was the drab Danish sky—the color of a pâté, forever portending rain. With every page I longed more intensely for the spare beauty of Algiers, to sense what Camus elsewhere describes as “a life that tastes of warm stone”—a life that, in the hard lyricism of his prose, seemed simple yet inexhaustibly rich. At the end of the novel, when the condemned Meursault is visited in his cell by the prison chaplain, he is asked if he ever wished for another life. He answers a little evasively. After the chaplain demands to know exactly how he pictured this other life, Meursault, finally, responds with a shout: “One where I could remember this life!”

Meursault’s moving outburst is also his maker’s. More than any other twentieth-century writer, Camus struggled to affirm life as it was handed to him, without appeal to religious or secular divinities. In his early notebooks, he spoke of wanting to “hold my life between my hands” and of enduring “this experience without flinching, with complete lucidity”—words made all the more poignant by the “sentence” of tuberculosis he received when he was just seventeen years old, and which repeatedly threatened to cut his life short. It was perhaps in Algiers’s Mustapha Hospital, in the winter of 1930, that Camus first felt himself brushed by the absurd: the clash between humankind’s desire for meaning and the inscrutable silence of the universe.

The young man who loved to roam the shabby streets of the Belcourt quarter, who preferred to spend his days swimming or playing soccer, thus learned early that our lives mean nothing to the mute world that surrounds us. Only a few years after receiving his diagnosis, Camus asked the costume designer and amateur pilot Marie Viton to fly him to Djemila, a mountain village home to the ancient Roman ruins of Cuicul. There, standing atop the stony remnants of a vanished world, a hard wind burning his eyes and cracking his lips, Camus was overcome by the unrelenting indifference of the natural world. As he wrote in “The Wind at Djemila,” the essay his experience inspired:

I tell myself: I am going to die, but this means nothing, since I cannot manage to believe it and can only experience other people’s death. I have seen people die. Above all, I have seen dogs die. It was touching them that overwhelmed me. Then I think of flowers, smiles, the desire for women, and realize that my whole horror of death lies in my anxiety to live. I am jealous of those who will live and for whom flowers and the desire for women will have their full flesh and blood meaning. I am envious because I love life too much not to be selfish. What does eternity matter to me?

For the twenty-three-year-old author who wrote those words, death was not the distant terminus of old age but an awful negation that threatens at every moment to erase us. Remarkably, for someone so young when the shadow of death first darkened his way, he did not recoil from this insight. He had no interest in merely pacifying the terror of death, as the Stoic philosophers counseled. (In the Enchiridion, Epictetus recommends that we think of death daily, in order to lessen our fear of it.) Instead, Camus resolved to “gaze upon my death with the fullness of my jealousy and horror,” to face the absurdity of the human condition with total clarity. “I have no wish to lie or to be lied to,” he wrote.

Read the full article in Commonweal.


.

Modi’s ghastly Delhi dream
Kapil Komireddi, The Critic, April 2020

Let there be a new New Delhi, Narendra Modi decreed last year. It is the Indian prime minister’s “dream”, one of his underlings announced to the press, to bequeath the world’s largest democracy a capital that radiates native authenticity. The dream has its genesis in a nightmare. In the summer of 2002, the then speaker of the Indian parliament, an orthodox and often reactionary Hindu by the name of Manohar Joshi, became convinced that the building in which he worked, the circular Parliament House built by the British, was cursed. 

A string of Joshi’s senior colleagues had died in rapid succession over the preceding year. His predecessor as speaker of the Lok Sabha — India’s House of Commons — was killed in a freak helicopter crash months before parliament convened that monsoon. The vice president of India, who chaired the Rajya Sabha — the upper chamber modelled on the Lords and the US Senate — died when parliament was in session. And eight months before the vice president’s abrupt departure, more than half a dozen security personnel had lost their lives in a gun battle at the gates of Parliament House while thwarting armed militants backed by Pakistan from storming it.

A massacre had been averted, but death and division continued to haunt and paralyse the corridors of power. Indian troops, awaiting orders on the border to punch into Pakistan, were weighed down by heavy casualties. In New Delhi, the business of government was juddering to a halt. Joshi, newly installed as speaker of the Lok Sabha by the Hindu-first Bharatiya Janata Party which led a fragile coalition of a dozen minor parties in government, decided to act.

He summoned Ashwinie Kumar Bansal, a lawyer and a specialist in Vastu — the ancient Indian discipline of architecture, akin to the Chinese Feng Shui — to survey Parliament House and recommend remedies to rescue India. Bansal, who has published 30 books on Vastu and Feng Shui, spent two days wandering the verandas and halls of the colonnaded camera contrived by Herbert Baker almost as an appendage to the stupendous acropolis conceived by Edwin Lutyens: the pièce maîtresse of the city inaugurated in 1931 as “New Delhi”. For more than five decades, the most mundane and exalted deliberations of Indian democracy were enacted within Parliament House’s annular walls of heavy sandstone, which house the Lok Sabha, the Rajya Sabha, and the majestically domed Central Hall where joint sessions of both houses are held on rare occasions and where Nehru, at the stroke of midnight on 15 August 1947, had proclaimed the birth of modern India. 

History and grandeur, however, were swept aside by the “negative vibrations and energy” Bansal experienced during his inspection of the place. “It is the circular building,” he declared in a confidential memo to the speaker, “which ails the nation’s polity.” To Bansal, it was “an odd piece of architecture made according to the whims and fancies of a foreigner”. It evinced no fidelity to Hindu, Islamic, or Christian conventions of construction. And its “round shape”, evocative of a “zero” and epitomising “void and nothingness”, endowed it with a mystical power to “destroy anything that interacts with it”. 

Read the full article in The Critic.


.

How Indian cinema shaped East Africa’s urban culture
Rasna Warah, Africa Is A Country, 29 January 2021

At a time when “social distancing” is becoming the norm due to the coronavirus pandemic, it may appear self-indulgent to reminisce about a period when going to the cinema was a regular feature of East African Asians’ lives. But perhaps now that the world is changing—and many more people are watching movies at home on Netflix and other channels—it is important to document the things that have been lost in the war against COVID-19 and with the advent of technology. One of these things is the thrill of going to the cinema with the family.

What has also been lost is an urban culture embedded in East Africa’s South Asian community—a culture where movie-going was an integral part of the social fabric of this economically successful minority.

Those who pass the notorious Globe Cinema roundabout, which is often associated with pickpockets and street children, might be surprised to learn that the Globe Cinema (which no longer shows films but is used for other purposes, such as church prayer meetings) was once the place to be seen on a Sunday evening among Nairobi’s Asian community. I remember that cinema well because in the 1970s my family used to go there to watch the latest Indian—or, to be more specific, Hindi—blockbuster at 6pm on Sundays (India also produces films in regional languages like Telugu, Bengali and Punjabi). Sunday was movie day in my family, and going to the cinema was a ritual we all looked forward to. The Globe Cinema was considered one of the more “posh” cinemas in Nairobi; not only was it more luxurious than the others, but it also had better acoustics.

As veteran journalist Kul Bhushan writes in a recent edition of Awaaz magazine (which is dedicated entirely to Indian cinema in East Africa from the early 1900s to the 1980s), “Perched on a hillock overlooking the Ngara roundabout, the Globe became the first choice for cinemagoers for new [Indian] releases as it became the venue to ogle and be ogled by the old and the young.”

Indian movies were—and are—the primary source of knowledge about Indian culture among East Africa’s Asian community. The early Indian migrants had little contact with the motherland, as trips back home were not only expensive but the sea voyage from Mombasa to Bombay or Karachi took weeks. (At independence in 1947, the Indian subcontinent became two countries—India and Pakistan—hence the reference to Indians in East Africa as “Asians.”) So they relied on Indian films to learn about the customs and traditions of the country they or their ancestors had left behind.

Read the full article in Africa Is A Country.
