The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
The triumph of American idealism
Alex Hochuli, Damage, 17 June 2020
What has motivated people from Sweden to New Zealand to take to the streets, for non-Americans to express solidarity with an American cause? We are all horrified to see police brutality and think racism is bad, of course, but that hardly suffices as an explanation. The coincidence of events – most of the world east of the Americas coming out of lockdown, and the shocking video and ensuing US protests – suggests timing is at play, with pent-up energy no doubt delighted to find cathartic expression. But simultaneity is hardly enough of an explanation, either.
At the risk of stating the obvious: the causes of George Floyd’s murder and the specific demands of protesters in response are matters internal to the United States, rooted in that country’s history of racism and how it polices its citizens with a brutality that increasingly approximates that which its troops inflict abroad. It would be one thing if the international BLM protests were real solidarity protests, if they were congregations of non-Americans acting as non-Americans expressing concern about a particular American problem. But it’s clear from the demonstrations that part of the enthusiasm involves a curious short-circuit between American and non-American identities. These are non-Americans acting like they’re Americans, fantasizing their participation in America’s problems, eagerly adopting the slogans and paraphernalia of a uniquely American protest.
Memes circulate throughout Europe about “how to talk to your family about racism”, translated directly from US sources. According to a Danish friend, “everyone is writing in English for a demonstration in Denmark. Or when they’re in Danish, the syntax is taken from English, and sounds weird to the Danish ear.” (“It is white privilege for us to sit back and discuss this” is one offending importation; “sit back” is not a Danish expression). An acquaintance in Finland, journalist Sofia Hirvonen, informs me that protests there have melded US woke campus politics with the country’s own tradition of consensual politics, in phrases like, “this isn’t against anyone, we’re all together against white privilege” (the country doesn’t produce ethnicity statistics, though those speaking a non-European mother-tongue total barely 2%).
What could “white privilege” possibly mean in Finland, or in Poland or the Balkans (which also saw solidarity protests) for that matter? Little good that privilege did those countries across the 20th century. That is not to say racial prejudice does not exist in these societies, but the roots are entirely different. Only according to an idealist and transhistorical understanding of racism – “bad ideas” – could these vastly different cases be treated as one.
Read the full article in Damage.
The invention of the police
Jill Lepore, New Yorker, 13 July 2020
In 1965, President Lyndon Johnson declared a “war on crime,” and asked Congress to pass the Law Enforcement Assistance Act, under which the federal government would supply local police with military-grade weapons, weapons that were being used in the war in Vietnam. During riots in Watts that summer, law enforcement killed thirty-one people and arrested more than four thousand; fighting the protesters, the head of the L.A.P.D. said, was “very much like fighting the Viet Cong.” Preparing for a Senate vote just days after the uprising ended, the chair of the Senate Judiciary Committee said, “For some time, it has been my feeling that the task of law enforcement agencies is really not much different from military forces; namely, to deter crime before it occurs, just as our military objective is deterrence of aggression.”
As Elizabeth Hinton reported in “From the War on Poverty to the War on Crime: The Making of Mass Incarceration in America,” the “frontline soldiers” in Johnson’s war on crime—Vollmer-era policing all over again—spent a disproportionate amount of time patrolling Black neighborhoods and arresting Black people. Policymakers concluded from those differential arrest rates that Black people were prone to criminality, with the result that police spent even more of their time patrolling Black neighborhoods, which led to a still higher arrest rate. “If we wish to rid this country of crime, if we wish to stop hacking at its branches only, we must cut its roots and drain its swampy breeding ground, the slum,” Johnson told an audience of police policymakers in 1966. The next year, riots broke out in Newark and Detroit. “We ain’t rioting agains’ all you whites,” one Newark man told a reporter not long before being shot dead by police. “We’re riotin’ agains’ police brutality.” In Detroit, police arrested more than seven thousand people…
In 1968, Johnson’s new crime bill established the Law Enforcement Assistance Administration, within the Department of Justice, which, in the next decade and a half, disbursed federal funds to more than eighty thousand crime-control projects. Even funds intended for social projects—youth employment, for instance, along with other health, education, housing, and welfare programs—were distributed to police operations. With Richard Nixon, any elements of the Great Society that had survived the disastrous end of Johnson’s Presidency were drastically cut, with an increased emphasis on policing, and prison-building. More Americans went to prison between 1965 and 1982 than between 1865 and 1964, Hinton reports. Under Ronald Reagan, still more social services were closed, or starved of funding until they died: mental hospitals, health centers, jobs programs, early-childhood education. By 2016, eighteen states were spending more on prisons than on colleges and universities. Activists who today call for defunding the police argue that, for decades, Americans have been defunding not only social services but, in many states, public education itself. The more frayed the social fabric, the more police have been deployed to trim the dangling threads.
The blueprint for law enforcement from Nixon to Reagan came from the Harvard political scientist James Q. Wilson between 1968, in his book “Varieties of Police Behavior,” and 1982, in an essay in The Atlantic titled “Broken Windows.” On the one hand, Wilson believed that the police should shift from enforcing the law to maintaining order, by patrolling on foot, and doing what came to be called “community policing.” (Some of his recommendations were ignored: Wilson called for other professionals to handle what he termed the “service functions” of the police—“first aid, rescuing cats, helping ladies, and the like”—which is a reform people are asking for today.) On the other hand, Wilson called for police to arrest people for petty crimes, on the theory that they contributed to more serious crimes. Wilson’s work informed programs like Detroit’s STRESS (Stop the Robberies, Enjoy Safe Streets), begun in 1971, in which Detroit police patrolled the city undercover, in disguises that included everything from a taxi-driver to a “radical college professor,” and killed so many young Black men that an organization of Black police officers demanded that the unit be disbanded. The campaign to end STRESS arguably marked the very beginnings of police abolitionism. STRESS defended its methods. “We just don’t walk up and shoot somebody,” one commander said. “We ask him to stop. If he doesn’t, we shoot.”
Read the full article in the New Yorker.
To end police violence, fund public goods and raise wages
Dustin Guastella, Nonsite, 9 July 2020
The racial disparities in police killings are almost always the first statistics marshalled in defense of calls to abolish, dismantle or defund the police. But the focus on racial disparities can confuse as much as it clarifies. Consider the oft quoted statistic that black Americans make up 24% of the victims of police killings despite only accounting for 13% of the population. Without context, this suggests that black Americans are indiscriminately murdered by the police, regardless of where they fall on the ladder of economic inequality or even where they live.
Jeff Bezos recently quipped: “I have a 20-year-old son, and I simply don’t worry that he might be choked to death while being detained one day. […] Black parents can’t say the same.” Are we to believe that the reason the Bezos children are not likely to be killed by the police is because they were born white, or might being the children of the world’s wealthiest man have something to do with it? Merck CEO Kenneth Frazier has declared that George Floyd “could have been me,” but do we really think that black multimillionaires are more likely to be murdered by the police than poor whites? In truth, this corporate brand of anti-racism—now present in the statements of major multinational companies, featured on the splash pages of all major streaming services, and prominent in most national papers—has sought to make invisible the most significant features of American society linked to police violence: inequality and austerity.
Though we don’t have comparable statistics at the individual level, there is much evidence to suggest a startling disparity in the pattern of police violence by class. According to one analysis, a person in the poorest quintile of census tracts is 3.5 times more likely to be killed by the police than a person in the wealthiest quintile. Of all the police killings in the United States about 60% take place in census tracts falling in the two quintiles with the highest levels of poverty – despite these tracts accounting for only 39% of the population. A full 35% of all police killings occurred in the census tract quintile with the highest concentration of poverty.
The truth is police lethality is a problem almost exclusively experienced by the poor. And when we limit our analysis to this population alone we see that racial disparities in police killings are greatly diminished. In one study, Roland Fryer Jr. found that (despite robust evidence for racial discrimination, and startling disparities in the frequency and use of force) there was no statistically significant racial disparity in the use of lethal force. A 2019 study provided further confirmation, finding “no overall evidence of anti-Black or anti-Hispanic disparities in fatal shootings.”
Read the full article in Nonsite.
The dehumanizing condescension of white fragility
John McWhorter, Atlantic, 15 July 2020
And herein is the real problem with White Fragility. DiAngelo does not see fit to address why all of this agonizing soul-searching is necessary to forging change in society. One might ask just how a people can be poised for making change when they have been taught that pretty much anything they say or think is racist and thus antithetical to the good. What end does all this self-mortification serve? Impatient with such questions, DiAngelo insists that “wanting to jump over the hard, personal work and get to ‘solutions’” is a “foundation of white fragility.” In other words, for DiAngelo, the whole point is the suffering. And note the scare quotes around solutions, as if wanting such a thing were somehow ridiculous.
A corollary question is why Black people need to be treated the way DiAngelo assumes we do. The very assumption is deeply condescending to all proud Black people. In my life, racism has affected me now and then at the margins, in very occasional social ways, but has had no effect on my access to societal resources; if anything, it has made them more available to me than they would have been otherwise. Nor should anyone dismiss me as a rara avis. Being middle class, upwardly mobile, and Black has been quite common during my existence since the mid-1960s, and to deny this is to assert that affirmative action for Black people did not work.
In 2020—as opposed to 1920—I neither need nor want anyone to muse on how whiteness privileges them over me. Nor do I need wider society to undergo teachings in how to be exquisitely sensitive about my feelings. I see no connection between DiAngelo’s brand of reeducation and vigorous, constructive activism in the real world on issues of import to the Black community. And I cannot imagine that any Black readers could willingly submit themselves to DiAngelo’s ideas while considering themselves adults of ordinary self-regard and strength. Few books about race have more openly infantilized Black people than this supposedly authoritative tome.
Or simply dehumanized us. DiAngelo preaches that Black History Month errs in that it “takes whites out of the equation”—which means that it doesn’t focus enough on racism. Claims like this get a rise out of a certain kind of room, but apparently DiAngelo wants Black History Month to consist of glum recitations of white perfidy. This would surely help assuage DiAngelo’s sense of complicity in our problems, but does she consider what a slog this gloomy, knit-browed Festivus of a holiday would be for actual Black people? Too much of White Fragility has the problem of elevating rhetorical texture over common sense.
White Fragility is, in the end, a book about how to make certain educated white readers feel better about themselves. DiAngelo’s outlook rests upon a depiction of Black people as endlessly delicate poster children within this self-gratifying fantasy about how white America needs to think—or, better, stop thinking. Her answer to white fragility, in other words, entails an elaborate and pitilessly dehumanizing condescension toward Black people. The sad truth is that anyone falling under the sway of this blinkered, self-satisfied, punitive stunt of a primer has been taught, by a well-intentioned but tragically misguided pastor, how to be racist in a whole new way.
Read the full article in the Atlantic.
Economics after slavery and George Floyd
Peter Doyle, NIESR Blogs, 10 June 2020
As detailed in my paper, the simple numbers—some 12 million people were transported by all “Enlightenment” nations to the New World over 3 centuries—do not really yield a visceral intuitive sense of scale.
Instead, reset the numeraire. Broadly, an African man in his prime was sold at auction in the American Colonies and the United States for prices equivalent to what was paid for a median white middle class house there at the time of his sale. Those were the standard terms of trade.
Thus, with such a house as numeraire, some 12 million “houses” were imported into the New World by the Atlantic Trade in People.
That is almost half the number of houses in the UK today and likely considerably more than the number of houses in the UK at the time—when it was the emergent and actual global hegemon and a key global orchestrator and beneficiary of the Atlantic Trade in People.
And that daunting number excludes the vastly greater number of “houses” subsequently born into lives in slavery in the New World. To get a sense of that scale, in total, of the 12 million “houses” imported into the New World over 3 centuries, only some 400,000 were imported into the American Colonies and the United States prior to 1860. But in 1860, at the outbreak of the Civil War, 4 million “houses” (people) were enslaved there. Now ponder what all those parameters imply for the number of “houses” elsewhere in the New World.
Slavery was a gigantic business relative to the size of the global economy.
And if buyers were consistently paying such sums at auction in such numbers over centuries, it was profitable for them to do so. Reflecting relative market power, those enormous sums were divided between people-traders, the support industries behind them, and in the case of people born into slavery, the people-owners.
So, slavery wasn’t “something bad that happened for social or historical reasons which had economic consequences”; it was “something bad (and huge) that happened for economic reasons.”
So, it is quite something for us to say virtually nothing to regular economics students about such an enormous and long-enduring profit-seeking activity.
Read the full article in the NIESR Blogs.
Why affluent Indians speak up about race but stay silent about caste
Aarushi Punia, New Politics, 11 June 2020
Given South Asian solidarity with the African-American demand for political and social equality, Indians are amongst the first to speak against the racism that has now proven to be endemic in the US. However, the same Indians who abhor racism and protest racial discrimination in the US, choose to remain silent about caste and its practice in India and abroad. This is a virulent reality that is much closer to home and has been documented as a two-thousand-year-old form of discrimination practiced against Dalits (a term which means ‘oppressed’ or broken and has been self-appropriated by lower castes in India). It is practiced even today in the form of untouchability and remains uncontested by these apparently ‘woke’ Indians who publicly question race.
That is because there are two types of Indians who have tried to express solidarity with the African-American cause. The first are the bourgeois, diasporic upper castes who stand to gain directly from the abolition of racism by getting sought after jobs in the U.S. from which they have been excluded because of systemic racism. They only question racism and not casteism because they speak from a position of upper-caste privilege which can only play a limited role abroad when confronted with racism. The second are Dalits who have historically drawn strength from the African-American struggle through organizations like Dalit Panthers inspired by the Black Panthers; the solidarity between B.R. Ambedkar and W.E.B. DuBois; slogans such as #DalitLivesMatter and Dalit literature which is protest literature like African-American literature with which it has had a productive relationship.
To White Americans who are asking why the slogan #AllLivesMatter is not preferable to #BlackLivesMatter, it must be pointed out that by subsuming black lives under all lives, the systemic discrimination against Blacks and the social construction of ‘race’ is made invisible. This invisibility produces, as the civil rights advocate and legal scholar Michelle Alexander asserts in her book The New Jim Crow, a “color blindness”, which prevents us from seeing certain acts, such as a policeman pressing down on the throat of an African-American, as effects of racist ideology. This is similar to the acts of upper caste Indians who wish to rewrite history from the perspective of the upper-caste and view the inclusion of caste politics in mainstream history as muddying or polluting of the hegemonic Indian image abroad. It results in, as the social psychologist Yashpal Jogdand stresses, a “caste-blindness”, which is the product of a deliberate refusal to see the role of caste in an individual’s professional and personal success or failure.
Read the full article in New Politics.

The threat to civil liberties goes way beyond ‘cancel culture’
Leigh Phillips, Jacobin, 12 July 2020
In 2014, the University of Illinois at Urbana–Champaign withdrew an offer of employment to English professor Steven Salaita after some faculty, students, and donors asserted that his tweets critical of the Netanyahu administration during the Gaza war were antisemitic. Due to the controversy, he’s been driven out of academic employment and now works as a bus driver. Political scientist Norman Finkelstein, another critic of the Israeli occupation, was denied tenure at DePaul University in 2007 after a successful campaign by the Anti-Defamation League and lawyer Alan Dershowitz. He likewise has difficulty finding employment and says he struggles to pay the rent.
When appeals to academic freedom and due process are raised in all these cases, the response from the pro-Likudnik right has echoed the “no platform” rhetoric from the Left, arguing that criticism of the Israeli government is hate speech and thus should not be protected (and indeed, in Canada, unlike in the United States, hate speech is not constitutionally protected). They also copy the liberal-left’s demand for “stay in your lane” identitarian deference (in which only the oppressed group concerned may speak to an issue), asserting that non-Jews cannot comprehend Jewish suffering and so must shut up and listen.
Despite his cancellation, Salaita does not support the Harper’s letter. This is perhaps understandable given that English professor Cary Nelson is a signatory but was also among those who led the charge against hiring Salaita. It must be equally galling to him that New York Times opinion writer Bari Weiss, another Harper’s signatory, spent her Columbia University days campaigning against pro-Palestinian professors for alleged intimidation of Jewish students under the Orwellian guise of “Columbians for Academic Freedom.”
But while Nelson and Weiss may be guilty of egregious hypocrisy, hypocrisy does not undermine the letter’s argument for freedom of speech. Despite Finkelstein’s cancellation, or indeed precisely because he knows his cancellation to be a breach of academic freedom, he remains an adamant defender of freedom of speech. He knows that the solution to his own censorship comes not from censorship of those who censor him, but from an end to censorship entirely.
The upturning of lives and livelihoods comes not just in the arena of the Israel-Palestine conflict with respect to Salaita and Finkelstein. In some cases, the religious right’s efforts to de-platform is actively defended by the Left, such as when Iranian feminist Maryam Namazie was shouted down in 2015 by Islamic conservatives at Goldsmiths University and the university’s feminist society defended their use of the heckler’s veto.
Read the full article in Jacobin.
How capitalism drives cancel culture
Helen Lewis, Atlantic, 14 July 2020
Progressive values are now a powerful branding tool. But that is, by and large, all they are. And that leads to what I call the “iron law of woke capitalism”: Brands will gravitate toward low-cost, high-noise signals as a substitute for genuine reform, to ensure their survival. (I’m not using the word woke here in a sneering, pejorative sense, but to highlight that the original definition of wokeness is incompatible with capitalism. Also, I’m not taking credit for the coinage: The writer Ross Douthat got there first.) In fact, let’s go further: Those with power inside institutions love splashy progressive gestures—solemn, monochrome social media posts deploring racism; appointing their first woman to the board; firing low-level employees who attract online fury—because they help preserve their power. Those at the top—who are disproportionately white, male, wealthy and highly educated—are not being asked to give up anything themselves.
Perhaps the most egregious example of this is the random firings of individuals, some of whose infractions are minor, and some of whom are entirely innocent of any bad behavior. In the first group goes the graphic designer Sue Schafer, outed by The Washington Post for attending a party in ironic blackface—a tone-deaf attempt to mock Megyn Kelly for not seeing what was wrong with blackface. Schafer, a private individual, was confronted at the party over the costume, went home in tears, and apologized to the hosts the next day. When the Post ran a story naming her, she was fired. New York magazine found numerous Post reporters unwilling to defend the decision to run the story—and plenty of unease that the article seemed more interested in exonerating the Post than fighting racism. Even less understandable is the case of Niel Golightly, communications chief at the aircraft company Boeing, who stepped down over a 33-year-old article arguing that women should not serve in the military. When Barack Obama, a notably progressive president, only changed his mind on gay marriage in the 2010s, how many Americans’ views from 1987 would hold up to scrutiny by today’s standards?
This mechanism is not, as it is sometimes presented, a long-overdue settling of scores by underrepresented voices. It is a reflexive jerk of the knee by the powerful; a demonstration of institutions’ unwillingness to tolerate any controversy, whether those complaining are liberal or conservative. Another case where the punishment does not fit the offense is that of the police detective Florissa Fuentes, who reposted a picture from her niece taken at a Black Lives Matter protest. One of those pictured held a sign reading who do we call when the murderer wear the badge. Another sign, according to the Times, “implied that people should shoot back at the police.” Fuentes, a 30-year-old single mother to three children, deleted the post and apologized, but was fired nonetheless…
It is strange that “cancel culture” has become a project of the left, which spent the 20th century fighting against capricious firings of “troublesome” employees. A lack of due process does not become a moral good just because you sometimes agree with its targets.
Read the full article in the Atlantic.
The historical amnesia of culture warriors
Dorian Lynskey, Unherd, 14 July 2020
The phrase “cancel culture” might have been coined by the Devil to ensure maximum rancour and confusion. It is currently both ubiquitous and uselessly vague. The offences under its rickety umbrella range from an unguarded line in an interview to serial sexual assault; the punishments stretch from a rough week on Twitter to career annihilation; the prosecutors might be a powerful institution or a few powerless tweeters.
As if that weren’t muddled enough, the current debate is largely taking place in a state of historical amnesia, as if the issues were as novel as the terminology. The sociologist Jib Fowles called this fallacy chronocentrism: “the belief that one’s own times are paramount, that other periods pale in comparison”. The author and academic Philip Seargeant suggests “the narcissism of the present”.
For many progressives, this unknowing is indeed a kind of generational vanity: only we, in the early 21st century, have the moral clear-sightedness and mettle to reprimand behaviour that our predecessors let slide. There is a whole click-friendly genre of journalism dedicated to scolding “of its time” art in the tone of a disappointed schoolteacher, while oblivious to the fact that many of their points were made at the time.
For their opponents, meanwhile, chronocentrism magnifies the danger of current challenges to free speech: the mob is at the gates, the clock is ticking and the survival of liberalism itself hangs in the balance. Novelty inspires urgency. It doesn’t help them to point out that conservative writers were routinely warning against “liberal fascism” and “a new McCarthyism” 30 years ago, nor that some of them simultaneously endorsed censorship of work that offended them. Both versions of the fallacy imply that, roughly between the peak of the Enlightenment and the launch of Twitter, it was plain sailing.
This narcissism of the present became a little grotesque in the response to last week’s instantly notorious open letter to Harper’s, ‘A Letter on Justice and Open Debate’. Critics caricatured the signatories as a bunch of pampered, out-of-touch gatekeepers who are unaccustomed to criticism or challenge, as if decades of literary feuds, brutal reviews, boycotts and controversies had never happened.
Try telling that to Salman Rushdie, who was not only threatened with the ultimate cancellation by the Ayatollah Khomeini but had to listen to eminent figures from across the political spectrum say that, regrettable though it was, he had brought the fatwa on himself by writing the damn book in the first place, and who might therefore know a thing or two about threats to free speech. (The Algerian author Kamel Daoud has also received a fatwa.)
Other signatories, such as Noam Chomsky, Greil Marcus and Todd Gitlin, have been defending freedom of expression since the 1960s and are unlikely to draw the line at JK Rowling. Anyone disappointed by their participation hasn’t been paying attention. These people know that there are in fact worse things than being shouted at on Twitter. Most of them aren’t even on Twitter.
Read the full article in Unherd.
The American press is destroying itself
Matt Taibbi, Substack, 12 June 2020
It’s been learned in these episodes we may freely misreport reality, so long as the political goal is righteous. It was okay to publish the now-discredited Steele dossier, because Trump is scum. MSNBC could put Michael Avenatti on live TV to air a gang rape allegation without vetting, because who cared about Brett Kavanaugh – except press airing of that wild story ended up being a crucial factor in convincing key swing voter Maine Senator Susan Collins the anti-Kavanaugh campaign was a political hit job (the allegation illustrated, “why the presumption of innocence is so important,” she said). Reporters who were anxious to prevent Kavanaugh’s appointment, in other words, ended up helping it happen through overzealousness.
There were no press calls for self-audits after those episodes, just as there won’t be a few weeks from now if Covid-19 cases spike, or a few months from now if Donald Trump wins re-election successfully painting the Democrats as supporters of violent protest who want to abolish police. No: press activism is limited to denouncing and shaming colleagues for insufficient fealty to the cheap knockoff of bullying campus Marxism that passes for leftist thought these days.
The traditional view of the press was never based on some contrived, mathematical notion of “balance,” i.e. five paragraphs of Republicans for every five paragraphs of Democrats. The ideal instead was that we showed you everything we could see, good and bad, ugly and not, trusting that a better-informed public would make better decisions. This vision of media stressed accuracy, truth, and trust in the reader’s judgment as the routes to positive social change.
For all our infamous failings, journalists once had some toughness to them. We were supposed to be willing to go to jail for sources we might not even like, and fly off to war zones or disaster areas without question when editors asked. It was also once considered a virtue to flout the disapproval of colleagues to fight for stories we believed in (Watergate, for instance).
Today no one with a salary will stand up for colleagues like Lee Fang. Our brave truth-tellers make great shows of shaking fists at our parody president, but not one of them will talk honestly about the fear running through their own newsrooms. People depend on us to tell them what we see, not what we think. What good are we if we’re afraid to do it?
Read the full article on Substack.
Should we cancel Aristotle?
Agnes Callard, New York Times, 21 July 2020
There is a kind of speech that it would be a mistake to take literally, because its function is some kind of messaging. Advertising and political oratory are examples of messaging, as is much that falls under the rubric of “making a statement,” like boycotting, protesting or publicly apologizing.
Such words exist to perform some extra-communicative task; in messaging speech, some aim other than truth-seeking is always at play. One way to turn literal speech into messaging is to attach a list of names: a petition is an example of nonliteral speech, because more people believing something does not make it more true.
Whereas literal speech employs systematically truth-directed methods of persuasion — argument and evidence — messaging exerts some kind of nonrational pressure on its recipient. For example, a public apology can often exert social pressure on the injured party to forgive, or at any rate to perform a show of forgiveness. Messaging is often situated within some kind of power struggle. In a highly charged political climate, more and more speech becomes magnetically attracted into messaging; one can hardly say anything without arousing suspicion that one is making a move in the game, one that might call for a countermove.
For example, the words “Black lives matter” and “All lives matter” have been implicated in our political power struggle in such a way as to prevent anyone familiar with that struggle from using, or hearing, them literally. But if an alien from outer space, unfamiliar with this context, came to us and said either phrase, it would be hard to imagine that anyone would find it objectionable; the context in which we now use those phrases would be removed.
In fact, I can imagine circumstances under which an alien could say women are inferior to men without arousing offense in me. Suppose this alien had no gender on their planet, and drew the conclusion of female inferiority from time spent observing ours. As long as the alien spoke to me respectfully, I would not only be willing to hear them out but even interested to learn their argument.
I read Aristotle as such an “alien.” His approach to ethics was empirical — that is, it was based on observation — and when he looked around him he saw a world of slavery and of the subjugation of women and manual laborers, a situation he then inscribed into his ethical theory.
When I read him, I see that view of the world — and that’s all. I do not read an evil intent or ulterior motive behind his words; I do not interpret them as a mark of his bad character, or as attempting to convey a dangerous message that I might need to combat or silence in order to protect the vulnerable. Of course in one sense it is hard to imagine a more dangerous idea than the one that he articulated and argued for — but dangerousness, I have been arguing, is less a matter of literal content than messaging context.
Read the full article in the New York Times.
Thomas Chatterton Williams on race, identity and ‘cancel culture’
Thomas Chatterton Williams & Isaac Chotiner, New Yorker, 22 July 2020
I’ve seen you talk about the fact that, as you once put it in an interview, no one owns any topics. Essentially, we should look at the content of what people say, not their identity. Is that accurate?
I understand how experience informs insight, but I don’t think there are topics where some identities must simply be silent. What I reject is what the economist Glenn Loury calls identity epistemology, which is that I as someone partially descended from slaves have access to an understanding of American reality that can never be open to you. I reject that idea. I think that if you actually care enough to figure it out, you can understand much of my experience to the extent that anyone can understand an individual. You can, if you want to. Now, a lot of people don’t actually try hard enough.
What about people having the right to say certain things based on their identity? I was wondering if you thought that people could be privileged to say certain things or speak on certain topics, or that the most important thing was to judge the words themselves.
I studied philosophy. I genuinely believe that the most important thing is to judge the quality of the insights, the idea, the language, the argument. I don’t think that there is a Black point of view, because Black people don’t all agree on anything. When you say that somebody has more authority to speak as a Black person, what does that mean?
In “Losing My Cool,” you wrote, “Where I lived, books were like kryptonite to” the N-word [the text uses “niggas”]—“they were terrified, allergic, broke out in rashes and hives.”
I stand by everything in that book.
That’s not something a white person can really say in most polite societies. It’s also an idea that I think a lot of people would find very problematic—that books were like kryptonite to Black people.
That’s why the context is important. The whole book was about how books were my father’s life and that the Black culture that he comes from was one that prioritized education as the most important thing that a human being could participate in, the act of cultivating yourself. That comes in the context of me saying that the kind of street culture that I was in was making a false claim that books were kryptonite, that they were not for us. We were fooling ourselves in that we were participating in a culture that was monetizing the glorification of our anti-intellectualism, which is my argument against hip-hop culture. When it’s sliced into this little bit on Twitter, it’s to make me look like some type of racist who hates his Blackness. When, in fact, the book is a love letter to the kind of Black culture and tradition that my father comes from.
Read the full article in the New Yorker.

A friendship, a pandemic and death beside a highway
Basharat Peer, New York Times, 31 July 2020
Somebody took a photograph on the side of a highway in India.
On a clearing of baked earth, a lithe, athletic man holds his friend in his lap. A red bag and a half-empty bottle of water are at his side. The first man is leaning over his friend like a canopy, his face anxious, his eyes searching his friend’s face for signs of life.
The man is small and wiry, in a light green T-shirt and a faded pair of jeans. He is sick, and seems barely conscious. His hair is soaked and sticking to his scalp, a sparse stubble lines the deathlike pallor of his face, his eyes are closed, and his darkened lips are half parted. The lid of the water bottle is open. His friend’s cupped hand is about to pour some water on his feverish, dehydrated lips.
I saw this photo in May, as it was traveling across Indian social media. News stories filled in some of the details: It was taken on May 15 on the outskirts of Kolaras, a small town in the central Indian state of Madhya Pradesh. The two young men were childhood friends: Mohammad Saiyub, a 22-year-old Muslim, and Amrit Kumar, a 24-year-old Dalit, which refers to former “untouchables,” who have suffered the greatest violence and discrimination under the centuries-old Hindu caste system.
Over the next few weeks, I found myself returning to that moment preserved and isolated by the photograph. I came across some details about their lives in the Indian press: The boys came from a small village called Devari in the northern Indian state of Uttar Pradesh. They had been working in Surat, a city on the west coast, and were making their way home, part of a mass migration that began when the Indian government ordered a national lockdown to prevent the spread of the coronavirus. Despite our image-saturated times, the photograph began assuming greater meanings for me.
For the past six years, since Prime Minister Narendra Modi and his Hindu nationalist Bharatiya Janata Party took power, it has seemed as if a veil covering India’s basest impulses has been removed. The ideas of civility, grace and tolerance were replaced by triumphalist displays of prejudice, sexism, hate speech and abuse directed at women, minorities and liberals. This culture of vilification dominates India’s television networks, social media and the immensely popular mobile messaging service WhatsApp. When you do come across acts of kindness and compassion, they seem to be documented and calibrated to serve the gods of exhibitionism and self-promotion.
The photograph of Amrit and Saiyub came like a gentle rain from heaven on India’s hate-filled public sphere. The gift of friendship and trust it captured filled me with a certain sadness, as it felt so rare. I felt compelled to find out more about their lives and journeys.
Read the full article in the New York Times.
The truth about vaccines
Stuart Ritchie, Unherd, 29 July 2020
Anti-vaxxers are “nuts”. That was Boris Johnson’s summation last week, as he visited a London GP surgery to discuss expanding coverage of the flu vaccine. In a time of mealy-mouthed advice from the government on the pandemic, this was a refreshing moment of clarity.
Vaccines really do work, and have saved millions of lives — indeed, it’s perhaps the biggest irony in medicine that one of the most effective and beneficial interventions known to humanity is the one that’s regarded with the most mistrust and suspicion. With the Prime Minister, we might ask: why can’t these recalcitrant anti-vaccine fools just trust the experts?
But it might be unwise to take such a black-and-white view of the anti-vaccination movement. For although the evidence on the effectiveness and safety of vaccines is clear, scientists and doctors have been anything but immune to missteps and errors in how they’ve dealt with them over the years. Pitting the wacky, hidebound anti-vaxxers against the sober, all-knowing experts risks glossing over the mistakes those experts have made in the past — and failing to learn from the cautionary tales.
Last week was a big week for vaccines. As well as the government’s flu vaccine push, initial results from two new COVID-19 vaccine trials appeared in the journal The Lancet — one from the UK and one from China. They join a US trial from the week before in showing that their candidate vaccines have tolerable side-effects and definitely do cause an immune response. These Phase I and Phase II trials are just the first — albeit crucial — steps on the way to finding a working vaccine. Next, we move to larger, Phase III trials that involve many thousands of people, testing whether those who get the vaccine are actually less likely to catch Covid-19 than those who are injected with a placebo.
It does feel somewhat ironic that The Lancet in particular is taking the lead in publicising the research on Covid-19 vaccines: besides being a super-prestigious medical journal, the main public claim to fame of that particular journal is in publishing one of the worst and most damaging vaccine studies of all time. That was, of course, Andrew Wakefield’s notorious Lancet article linking the MMR vaccine to autism. Its appearance in 1998 fanned the flames of the anti-vaccine movement, with crushing media suspicion falling on the MMR — and a resulting deadly surge in measles cases in the UK and worldwide.
Read the full article in Unherd.
The tragedy of vaccine nationalism
Thomas J Bollyky & Chad P Bown, Foreign Affairs, 27 July 2020
Vaccine manufacturing is an expensive, complex process, in which even subtle changes may alter the purity, safety, or efficacy of the final product. That is why regulators license not just the finished vaccine but each stage of production and each facility where it occurs. Making a vaccine involves purifying raw ingredients; formulating and adding stabilizers, preservatives, and adjuvants (substances that increase the immune response); and packaging doses into vials or syringes. A few dozen companies all over the world can carry out that last step, known as “fill and finish.” And far fewer can handle the quality-controlled manufacture of active ingredients—especially for more novel, sophisticated vaccines, whose production has been dominated historically by just four large multinational firms based in the United States, the United Kingdom, and the European Union. Roughly a dozen other companies now have some ability to manufacture such vaccines at scale, including a few large outfits, such as the Serum Institute of India, the world’s largest producer of vaccines. But most are small manufacturers that would be unable to produce billions of doses.
Further complicating the picture is that some of today’s leading COVID-19 vaccine candidates are based on emerging technologies that have never before been licensed. Scaling up production and ensuring timely approvals for these novel vaccines will be challenging, even for rich countries with experienced regulators. All of this suggests that the manufacture of COVID-19 vaccines will be limited to a handful of countries.
And even after vaccines are ready, a number of factors might delay their availability to nonmanufacturing states. Authorities in producing countries might insist on vaccinating large numbers of people in their own populations before sharing a vaccine with other countries. There might also turn out to be technical limits on the volume of doses and related vaccine materials that companies can produce each day. And poor countries might not have adequate systems to deliver and administer whatever vaccines they do manage to get.
During that inevitable period of delay, there will be many losers, especially poorer countries. But some rich countries will suffer, too, including those that sought to develop and manufacture their own vaccines but bet exclusively on the wrong candidates. By rejecting cooperation with others, those countries will have gambled their national health on hyped views of their own exceptionalism.
And even “winning” countries will needlessly suffer in the absence of an enforceable scheme to share proven vaccines. If health systems collapse under the strain of the pandemic and foreign consumers are ill or dying, there will be less global demand for export-dependent industries in rich countries, such as aircraft or automobiles. If foreign workers are under lockdown and cannot do their jobs, cross-border supply chains will be disrupted, and even countries with vaccine supplies will be deprived of the imported parts and services they need to keep their economies moving.
Read the full article in Foreign Affairs.
Stop flaunting those curves! Time for stats to get down and dirty with the public
Timandra Harkness, Harvard Data Science Review, 30 July 2020
Greater use of data by governments is to be welcomed when it informs action that is more effective toward policy goals, monitors the success and failure of those actions, and helps hold politicians to account for their promises and policies.
But recent experience shows that data and statistics are also used to convey an impression of scientific certainty where none exists. In the words of Spiegelhalter, statistics in government briefings were “number theatre” (Spiegelhalter, 2020), deployed not for deeper understanding but for dramatic effect.
Some later examples from UK government briefings (2020) have the form of a graph but no mathematical underpinning at all, such as the curve that seemed to show stages of relaxing lockdown rules as the R number decreased, but with no axis labels.
There was even an ‘equation’ that contained an equal sign but made no sense either mathematically or in any other terms. Apparently COVID Alert Level = R (rate of infection) + Number of infections. For anyone who knows the difference between addition and multiplication it should come with a disturbing content warning.
These visuals are clearly aimed neither to harness nor to encourage greater public understanding of statistics or anything else. They simply invoke the authority of mathematics and data as a shield against doubt.
Perhaps the most insidious barrier against greater public understanding of statistics is the quest for certainty in an uncertain world. The desire to know the future, and to feel that somebody is in control, is so strong that even a terrible future in a world controlled by evil conspiracies holds strong appeal against rudderless and unpredictable reality.
But statistics doesn’t offer certainty, you cry. Statistics is all about engaging with uncertainty, trying to tease apart what we know, what we can reasonably infer, and what we must accept as unknowable, at least for now.
That is exactly why it’s so hard to turn this upsurge in public interest into genuine engagement and deeper understanding. Anyone should be able to read graphs, and to ask questions about how data was collected, what was left out, and what assumptions underlie the models used to predict the future. That should be as much a part of being an engaged citizen as knowing what policies each party stands for.
But the more the public gets stuck in asking awkward questions about the numbers, the clearer it will become that statistics and data are not gods, but tools for fallible humans making sense of the world. Computer models are not oracles, but machines for imagining the world in more detail than one human brain can hold.
Read the full article in the Harvard Data Science Review.
The idea of a nation
Thomas Meaney, The Point, 12 June 2020
And what about those self-proclaimed truce-makers between the right and the left, the liberal nationalists? Every decade a lonely liberal nationalist writes a large book that says the same thing: We are not yet ready as a species for international solidarity, nor even for baby steps like Habermas’s vision of “constitutional patriotism,” which defines citizenship on the basis of allegiance to abstract liberal ideals rather than shared history or social affinities. We—Americans, Israelis, the French—are separate peoples, bound by particular historical experiences, and we should not fool ourselves into thinking that national borders can be transcended. We should not fool ourselves that supranational political authority is desirable, much less practicable. To lose hold of this insight, they insist, is to forfeit a hard-won inheritance to the forces of the nationalist right.
Liberal-nationalist authors flatter their liberal readership by finding the solution to their accelerating sense of dislocation in the attic of their own tradition. They are bound together by the determination to reconcile a putative need for historical belonging with tolerance. If the cultural needs of a particular minority are very great indeed, then that people—Palestinians, Kurds, Rakhine, Kosovars—may have a legitimate claim to their own territory. But there’s an impatience, too, that characterizes liberal nationalists: you cannot play the game of Russian dolls forever, they seem to say.
What they do not say is that the United States will reserve the ultimate decision of whether your application for territory is approved. Perhaps the most dramatic failed “liberal nationalist” project of the nineteenth century was the Confederate States of America, whose leaders believed they were operating not only within the legal writ of the U.S. Constitution but also with the Mazzinian zeitgeist at their backs. The content of their cause—the maintenance and expansion of slavery—hardly impinged on their procedural liberal right to political self-determination. But as both the Confederates and Native Americans learned, rights to nationhood, no matter how enshrined in law, can always be revoked by the power that issued them.
Even if a truly tolerant brand of liberal nationalism could be imagined, it would be the equivalent of a horse and wagon on the Autobahn of global capitalism. An ideology cobbled together in the mid-nineteenth century yields poor results when confronted with 21st-century neoliberalism. When a squadron of $150 million F-22 Raptors flies over a football stadium, or a regiment performs its goose-step change of guard at the border, liberal nationalists recommend protest: Why don’t we build more national parks with that money instead? National ideology, they believe, can be channeled to whatever ends the national public chooses. But nothing in the treasure chest of liberal nationalism encourages such social instincts, which is why its ideology of legalism and proceduralism has long been so attractive to elites who already have power.
Read the full article in The Point.
Peter Beinart on the end of the two-state solution for Israel and Palestine
Peter Beinart & Hadas Thier, Jacobin, 13 July 2020
One of the things that really struck home to me in your piece is distinguishing between a Jewish state and a Jewish home. What do you think are the problems inherent to a Jewish state?
A Jewish state, as most people would define it, is a state that has obligations to Jews that it doesn’t have to the other people under its domain. Most of them are Palestinian. Right there you have a serious tension with the notion of equality under the law, which is really core to liberal democracy. And that’s just within the Green Line where Palestinians are citizens, but not equal citizens. In the West Bank, there’s no liberal democracy at all. Those Palestinians are not citizens, they don’t have the right to vote.
For a long time I hoped that Israel would end the occupation, and then inside the Green Line, it would evolve towards a more inclusive national identity. Maybe it would still have certain things like granting refuge to Jews or certain special obligations for Jews, but it would broaden its notion of Israeliness to make it more fully inclusive for Palestinian citizens. But in reality, Israel has more and more deeply entrenched its control over the West Bank. And it’s also simultaneously become more illiberal inside the Green Line.
My hopes for this trajectory were framed partly because I’m a product of the 1990s. There was a certain moment in the early 1990s, where one could squint and see that possibility a little bit in the distance. Now we’ve gone in completely the opposite direction. So, I’ve had to reconsider that.
Now I’m not a diasporist. I believe that a Jewish society in the land of Israel is deeply important. In that way, I’m influenced by people like Ahad Ha’am, who believed that there were certain things that a Jewish society in the land of Israel could create, that in diaspora Jews could not create.
When I think about a Jewish home, that’s partly what I’m thinking about — all of the cultural production and religious innovation that comes out of Jewish Israel. Not all of it’s good, but there are certain mitzvot that you can only do in the land of Israel. There’s a way in which Israel, as a Jewish society, can have a public conversation which is infused with Jewish thought and Jewish text. That’s what I think about as being a Jewish home.
I believe, and obviously many to my right will disagree with me, that this could also be a place of refuge for Jews as well as being a place of refuge for Palestinians. I’m even idealistic enough to believe what Ahad Ha’am thought about, which was that a Jewish society that would radiate and enrich the whole world might even be able to do so more powerfully if it was also equally a Palestinian home.
Read the full article in Jacobin.

How the Dutch invented our world
Ralph Leonard, Unherd, 14 July 2020
“Forward! Brave people! The goddess of liberty leads you on!” So declares Count Egmont, the protagonist in Goethe’s exquisite 1788 play, Egmont, a tragedy based on the Dutch revolt of the late 16th century. “And as the sea breaks through and destroys the barriers that would oppose its fury, so do ye overwhelm the bulwark of tyranny, and with your impetuous flood sweep it away from the land which it usurps.”
Egmont would become a martyr representing the aspirations of the Dutch people, persecuted and oppressed for their Protestantism by the corrupt and tyrannical Duke of Alba, an agent of Phillip II and his Catholic Spanish empire. This was the first great “bourgeois revolution”, a concept that will be familiar to those acquainted with the Marxist and socialist lexicon, denoting the events and processes that facilitated the development of modern bourgeois society on the basis of the capitalist mode of production. The classic example was the French Revolution of 1789, when monarchy and seigneurialism were overthrown, and the basis of the modern liberal-democratic capitalist nation established.
In recent years, however, bourgeois revolution has gone out of fashion and been subjected to revisionist critiques that wish to consign it to the dustbin of history. Part of what underlies this dismissiveness is a rather childish unwillingness to credit capitalism, and by extension, the bourgeoisie and liberalism, with any positive contribution to human development. The assumption is that because the bourgeoisie has been reactionary for so long, therefore it has never played a historically revolutionary role; because capitalism is now senile and decadent therefore it has never been historically progressive; because liberalism is now servile therefore it has never been emancipatory. None of which is true.
The Dutch Revolt that began in the late 16th century was the first such bourgeois revolution, but because it was the first, it is arguably the most ignored. Today its distance in time makes it seem remote, and it doesn’t cast a shadow over our current epoch in the way that the American and French revolutions do.
But for the figures of the Enlightenment and Romantic period the Dutch Revolt and its effects held huge resonance. Adam Smith saw the Dutch Republic as the prime exemplar of a commercial society, and in the Wealth of Nations he lauded the commercial cosmopolitanism of the republic and its emphasis on free trade as the basis of its immense wealth.
For the likes of Goethe it was the struggle for liberty that made the Dutch revolt a subject of fascination, and his play in turn inspired Beethoven to compose Egmont. Friedrich Schiller, in his history of the Dutch revolt, extolled the “spirit of independence” of the Dutch people in their liberation struggle against the Habsburg Empire.
Read the full article in Unherd.
On Afropessimism
Jesse McCarthy, LA Review of Books, 20 July 2020
No serious Black intellectual today thinks antiblack racism is not a matter of life and death. The question is still the old one: what is to be done? There has to be room for a serious debate and the flexibility of open-minded conversation on that score. It’s simply implausible that the answers are easy, obvious, or one-dimensional. The fact that Black Lives Matter has done more to explode the Overton window in American politics than any movement since the 1960s has to be fully and duly appreciated for the extraordinary achievement that it is. But Adolph Reed Jr.’s countervailing contention that Black Lives Matter is merely a rebranding and retreading of Black Power for millennials is a barb nonetheless worth reflecting on seriously.
Reed has been right before, most famously about Obama, whom he crossed paths with in Chicago in the mid-1990s and diagnosed as “a smooth Harvard lawyer with impeccable do-good credentials and vacuous-to-repressive neoliberal politics.” The ease and celerity with which multinational corporations and political elites rushed to eulogize George Floyd, instantly adopting the performative repertoire of genuflection and the mimeographed consultancy lingo of McKinsey et al. through the issuing of carefully worded “statements,” should give us pause. It is possible for a nominally leftist rhetoric, especially one that is explicitly ethno-nationalist and directed by actors professionally linked to the governing class, to weaponize superficial and symbolic gains in ways that serve to advance their own professional and middle-class interests. This work happens at the expense of broadly based and genuinely popular political strategies that could have otherwise advanced the interests of the Black poor and working classes who are most vividly affected by the forces that the movement alleges it is dismantling. Everything in Black political history suggests that the danger of this kind of cooptation is very real. As Imani Perry observes, a robust Black feminism is critical at this juncture precisely because it is so uncompromising vis-à-vis “the self-congratulatory posture of the neoliberal state” and its constant attempts to funnel the energy of righteous discontent back into market-driven and customizable “lean in” conceptions of activism.
At the same time, it seems clear that whatever its eventual failings and misfires, the spectacular and urgent appeal of BLM among the younger generation (not just in the United States but around the world) is a rational response to, and rejection of, the style of racial politics that wound-licking left-liberals fashioned in the late Clinton and Bush years, and that reached its apogee in both the persona and policy offers of Obama’s presidency. That generation’s rose-tinted conception of politics as the transactional but egalitarian rule of the demos by the best and the brightest was enshrined, as the commentator Luke Savage cannily pointed out, in the sanctimoniocracy of Aaron Sorkin’s television show The West Wing (1999–2006). The main threat in that world was understood to be the crude morality and venality of Republicans and the threat of terrorism emanating vaguely from the Middle East. But the real Vietnam that threatened this new breed of “whiz kids” (whose failures and educational pedigrees uncannily resemble those of the men in David Halberstam’s famous book on Kennedy’s “Best and Brightest”) was not brewing in the (undeniably real) quagmire abroad, but in the neglected quagmire at home, one that was captured in the other signal television show of that era, David Simon’s The Wire (2002–2008).
Read the full article in the LA Review of Books.
Ambivalent sense of belonging
Zahra Moloo, Africa is a Country, 1 July 2020
The uneasy nature of white Kenyans’ sense of belonging in the country is unraveled and analyzed in Janet McIntosh’s fascinating book Unsettled: Denial and Belonging Among White Kenyans. Based on extensive in-depth interviews, and structured poetically into different themes which explore varying components of the white Kenyan experience, McIntosh’s book reveals the complex and often ambivalent positions of white Kenyan subjectivities in contemporary Kenya. She explores their relationships to the land, to Kiswahili, to domestic workers, to other black Kenyans and to their own white community. The last chapter is dedicated to white Kenyans’ relationship to the occult and how they justify or explain their participation in practices that transcend a “rational” European worldview.
Through her interviews, McIntosh discovers an interesting dynamic at play in the white Kenyan consciousness. Their uneasy sense of belonging is expressed through the notion of a “moral double consciousness,” a term adapted from W.E.B. Du Bois’s “double consciousness,” which McIntosh uses to describe what results when white Kenyans look at themselves through the eyes of others and experience the shock of seeing that their community is being seen. They were raised to think of their settler families as good, but now have to grapple with the fact that they were, in fact, oppressors and that they are also seen through the same lens. They experience an inner self-doubt, and shift between a moral self-assurance and a sense of anxiety elicited by their critics. As they cannot for long dwell in shame about themselves or about their colonial past, some settle into a “defensive stance” in order to remain in their comfort zone and mystify their structural advantages. Others focus on their felt bonds to Kenya and insist that their personal intentions take precedence over history. A very small number try to find ways to empathize with black Kenyan perceptions. In today’s Kenya, argues McIntosh, white Kenyans are no longer looking to rule, but to belong…
Perhaps the most glaring and contentious area in which the presence of white Kenyans in the country comes to the fore is around the question of land. As McIntosh notes in her second chapter, land is already a “painful theme” across Kenya which often plays out in terms of which ethnic group was on the land first. Taking the reader back in history, she describes how the British colonial government expropriated land and imposed individual land rights to encourage agricultural production and “proper” land use. The Crown Lands Ordinance of 1902 imposed English property law and forced Africans to give up land that was not occupied or developed, enabling the colonial state to give huge swathes of it in the Rift Valley to European and South African settlers. These fertile areas, so desirable to white settlers, were places where Maasai pastoralists practiced seasonal migration under a complex system of rights to land and water. As the colonial administration created more room for white settlers, the Maasai were coerced into signing away their lands. In 1911 and 1912, thousands of Maasai were herded toward the south at gunpoint and by 1913, they had lost between 50 and 70 percent of the land which they had previously used.
The settler descendants that McIntosh interviews about this history do not seem to know about the land expropriations. Operating out of what she describes as ignorance, collective defensiveness and possibly systematic whitewashing, settler descendants spin their narratives to assert that the Laikipia territories were fairly purchased from the Maasai, or that Laikipia was a no-man’s land at the time of settler arrival, echoing the classic settler frontier ideology of terra nullius.
Read the full article in Africa Is a Country.
RA Fisher and the science of hatred
Richard J Evans, New Statesman, 28 July 2020
In 1989, the fellows of Gonville and Caius College (founded in 1348, and one of Cambridge University’s largest, wealthiest and most prestigious collegiate institutions) had the genial idea of fitting stained-glass windows in the dining hall to commemorate prominent scientists who had been among its members, counterbalancing the many lawyers and divines whose portraits adorn its walls. By the early 2000s the collection included a double helix, paying homage to Francis Crick, the co-discoverer of the structure of DNA, and other windows showing the scientific achievements of men such as the mathematician and philosopher John Venn, the physicist James Chadwick and the physiologist Charles Sherrington.
The collection also includes a “Latin Square”, a mathematical device that Ronald Fisher, widely regarded as the most important biostatistician of the 20th century, put to pioneering use in the design of experiments. Richard Dawkins has called him “the greatest biologist since Darwin”. His book Statistical Methods for Research Workers, published in 1925, exercised a huge influence, and he is often referred to as the father of modern experimental design – the subject of another important book. For a long time he taught at University College, London, where a professorship is named after him, before moving to Cambridge as Balfour Professor of Genetics and Fellow of Gonville and Caius College, where he had studied as an undergraduate between 1909 and 1912.
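(For readers unfamiliar with the device in the window: a Latin square is an n × n grid in which each of n symbols appears exactly once in every row and every column, which is what lets an experimenter balance treatments against two sources of variation at once. The sketch below is my own minimal illustration of the idea, not anything drawn from Evans’s article.)

```python
# A minimal illustration of a Latin square: each treatment appears exactly
# once in every row and every column, so row effects (e.g. field strips)
# and column effects (e.g. order of harvest) are balanced across treatments.
def latin_square(treatments):
    n = len(treatments)
    return [[treatments[(row + col) % n] for col in range(n)] for row in range(n)]

for row in latin_square(["A", "B", "C", "D"]):
    print(" ".join(row))
# A B C D
# B C D A
# C D A B
# D A B C
```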
Fisher’s work in statistics was closely integrated into the science of eugenics – the supposed improvement of the human stock through selective breeding, the encouragement of “superior” genetic stock (Fisher himself put his beliefs into practice by siring no fewer than eight children), and the discouragement, either by persuasion or by some form of compulsion (including sterilisation), of “inferior” lines of heredity. Head of the Department of Eugenics at UCL, editor from 1934 of the Annals of Eugenics, and a prominent member of the British Eugenics Society, he was also co-founder in 1947 of the journal Heredity with the Oxford Professor of Botany, Cyril Darlington. Darlington claimed that as slaves in America, Africans “improved in health and increased in numbers” because they were living in an environment far superior to that of their home continent; emancipation had destroyed this advantage, he argued, by removing the discipline under which they had lived as slaves, leading to problems of “drugs, gambling and prostitution” in the African-American community…
Fisher was less unsympathetic to Nazi eugenics than most of his British colleagues were. In the mid-1930s he campaigned for the legalisation of compulsory eugenic sterilisation, especially of the “mentally defective”. He was a British representative at the International Federation of Eugenic Organizations until the war, and through it met regularly with German colleagues involved in the compulsory sterilisation programme. In Britain, the mainstream of the Eugenics Society wanted to keep the Germans at arm’s length and did not support compulsory sterilisation, not least because it knew that a law to this effect would never get through the House of Commons.
Read the full article in the New Statesman.
Believe what you like
N J Enfield, TLS, 17 July 2020
How do you react when you come across a bear in the woods? William James begins his essay of 1884 on human emotion with the commonsense view: “we meet a bear, are frightened and run”. Wrong, says James. “This order of sequence is incorrect.” It is not that we run because we are frightened. It is that we find ourselves running and then, on experiencing this bodily reaction, call it fear. Reversal of an assumed arrow of causation is a hallmark of many conceptual breakthroughs, especially in domains where the truth is counterintuitive, or where it supports a narrative we don’t like. As with Galileo, a conceptual reversal may seem heretical at first, but in time we may see that it explains things that once made no sense.
In Not Born Yesterday, the cognitive scientist Hugo Mercier brings the conceptual reversal to a domain in desperate need of new insights: that of truth and falsehood, knowledge and ignorance. We keep hearing that this is a post-truth era, that feelings beat facts, people no longer care what’s true, and we’re heading for ruin. Opponents of Brexit and Donald Trump not only found those victories intolerable, but many refused to believe them to be legitimate, instead supposing that lies had swayed a docile population. This idea of a gullible, pliable populace is, of course, nothing new. Voltaire said, “those who can make you believe absurdities can make you commit atrocities”. But no, says Mercier, Voltaire had it backwards: “It is wanting to commit atrocities that makes you believe absurdities”.
This reversal may be unsettling but it has the merit of treating people as agents, with accountability for their choices. Mercier’s case against gullibility is grounded in an evolutionary account of human cognition and communication, in which the mind is not bug-ridden but well tuned, adapted for social interaction. If receivers of messages were inclined to believe whatever they hear, he says, human communication as we know it could not have evolved. Those wired to accept unverified and unintuitive claims would have been too easily exploited by others who, by chance, were wired differently. Gullible people would either have wised up, or they would have quickly exited the gene pool, taking their vulnerability with them.
Mercier insists that gullibility is vastly oversold. It’s true that once in a while we will fall for a prank or believe a lie. But an objective look at our behaviour shows that we are far from uncritical sponges. Would you believe me (actually, Nigel Farage) if I told you that “the doctors have got it wrong on smoking”? Is your mind changed by campaign advertisements for politicians you detest? Does more ad coverage mean greater influence? An example among many: the billionaire hedge fund manager Tom Steyer put more than $190 million into his campaign for the 2020 US Democratic presidential nomination (compared to Joe Biden’s roughly $118 million), but he couldn’t secure a single pledged delegate.
Read the full article in TLS.
Fertility rate: ‘Jaw-dropping’ global crash in children being born
James Gallagher, BBC News, 15 July 2020
The world is ill-prepared for the global crash in children being born which is set to have a “jaw-dropping” impact on societies, say researchers.
Falling fertility rates mean nearly every country could have shrinking populations by the end of the century. And 23 nations – including Spain and Japan – are expected to see their populations halve by 2100. Countries will also age dramatically, with as many people turning 80 as there are being born.
The fertility rate – the average number of children a woman gives birth to – is falling. If the number falls below approximately 2.1, then the size of the population starts to fall. In 1950, women were having an average of 4.7 children in their lifetime. Researchers at the University of Washington’s Institute for Health Metrics and Evaluation showed the global fertility rate nearly halved to 2.4 in 2017 – and their study, published in the Lancet, projects it will fall below 1.7 by 2100.
As a result, the researchers expect the number of people on the planet to peak at 9.7 billion around 2064, before falling to 8.8 billion by the end of the century.
“That’s a pretty big thing; most of the world is transitioning into natural population decline,” researcher Prof Christopher Murray told the BBC.
“I think it’s incredibly hard to think this through and recognise how big a thing this is; it’s extraordinary, we’ll have to reorganise societies.”
Why are fertility rates falling? It has nothing to do with sperm counts or the usual things that come to mind when discussing fertility. Instead it is being driven by more women in education and work, as well as greater access to contraception, leading to women choosing to have fewer children.
In many ways, falling fertility rates are a success story.
Which countries will be most affected? Japan’s population is projected to fall from a peak of 128 million in 2017 to less than 53 million by the end of the century. Italy is expected to see an equally dramatic population crash, from 61 million to 28 million over the same timeframe. They are two of 23 countries – which also include Spain, Portugal, Thailand and South Korea – expected to see their populations more than halve.
“That is jaw-dropping,” Prof Christopher Murray told me. China, currently the most populous nation in the world, is expected to peak at 1.4 billion in four years’ time before nearly halving to 732 million by 2100. India will take its place. The UK is predicted to peak at 75 million in 2063, and fall to 71 million by 2100.
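(A quick way to see why the 2.1 threshold matters: in a toy model where each generation of mothers has daughters at half the fertility rate, adjusted for girls who do not survive to childbearing age, a rate of 1.7 shrinks each generation to roughly four fifths of the previous one. The sketch below is my own back-of-the-envelope illustration, not the IHME model behind the Lancet study.)

```python
# Toy cohort projection: why a fertility rate below ~2.1 shrinks a population.
# Illustrative assumptions only (not the IHME/Lancet methodology): half of
# births are girls, and ~5% of girls do not survive to childbearing age,
# which is roughly why "replacement" is about 2.1 rather than exactly 2.0.
def project_mothers(women, fertility_rate, generations, survival=0.95):
    """Number of women of childbearing age in each successive generation."""
    sizes = [women]
    for _ in range(generations):
        women = women * fertility_rate * 0.5 * survival
        sizes.append(women)
    return sizes

for rate in (2.1, 1.7):  # roughly replacement vs. the rate projected for 2100
    print(rate, [round(n) for n in project_mothers(1_000_000, rate, 4)])
# At 2.1 the cohort stays essentially flat; at 1.7 each generation is about
# 0.81x the size of the last, so the number of mothers roughly halves within
# three to four generations.
```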
Read the full article in BBC News.

Revisiting a revolution of Mexican art in America
Anna Shapiro, NYR Daily, 20 July 2020
The Whitney’s show, “Vida Americana: Mexican Muralists Remake American Art,” is a study in revisionism, recasting the standard story so that those formerly disregarded and excluded from the canon of modern American art are instead given a place in it. Exhibitions in recent years have been doing that rewriting in accord with values newly freed from stigma, discovering or rediscovering artists who are female or non-European-American, or who simply didn’t fit the strictures of formalist Modernism. The artists in this show, however, were truly avant-garde in their social values, championing the underdogs of history when it was deeply unfashionable to do so.
Their politics and style became, in the late 1940s, the subject that dared not speak its name, and they were all but expunged from the record. They were Communists or fellow travelers, and the show could equally have been called “American Communists”: it seems the more apt title for an exhibit less about “American life” than about a sense of what that life ought, and ought not, to be.
Though the show is presented coolly enough as a reassessment of the influence Mexican artists had on North American art, I could not greet it with detachment. The three Mexican muralists central to the show—Diego Rivera, David Alfaro Siqueiros, and José Clemente Orozco—were touchstones for my lefty artist father, who had made the pilgrimage to exotic New Hampshire from New York with a group of friends just to see the 1934 Orozco mural in Dartmouth’s Baker library. A number of my father’s old buddies and teachers from the American Artists School—where a free art education was briefly to be had, between 1936 and 1941, courtesy of the John Reed Club—are in the show.
After enjoying early success, they all led lives of unexpected obscurity. I was painfully exhilarated, and haunted, seeing these household gods, who were blacklisted or simply denigrated during their lifetimes, validated on the walls of the Whitney, now that they are all dead. The show’s labels did not tell this tale. It was as if no one else knew.
That sense of their times and ideals left me feeling that the show—though impressive, beautifully put together, and full of rich works never exhibited together before—was denatured, free-floating. This art had been made in passionate espousal of the poor and downtrodden, and fury at how the powerless are crushed, yet the show distanced these concerns as quaint or merely pretty, as though these frankly propagandistic images, instead of rousing viewers to righteous action, were only entertaining as a curious side note in art history.
Read the full article in the NYR Daily.
Happy birthday, Baghdad, wonder of the world
Justin Marozzi, Unherd, 30 July 2020
In the midsummer furnace of 30 July 762, a date considered auspicious by the royal astrologers, the Abbasid caliph Al Mansur, supreme leader of the Islamic world, offered up a prayer to Allah and laid the first ceremonial brick of his new capital on the Tigris river. “Now, with the blessings of God, build on!” he ordered the assembled workers.
And build on they did. It took them four back-breaking years, slogging away in the fiercest summer temperatures of Iraq, to complete the job, and by 766, it was done. Mansur’s city was an architectural marvel from the start. “They say that no other round city is known in all the regions of the world,” wrote Al Khatib al Baghdadi, the eleventh-century author of the comprehensive History of Baghdad.
Four straight roads ran from four gates in the outer walls towards the city centre, past vaulted arcades containing the merchants’ shops and bazaars, past squares and houses. At the heart of the circular city was a vast royal precinct almost 2,000 metres in diameter, empty apart from two monumental buildings. The Great Mosque stood alongside the caliph’s Golden Gate Palace, a classically Islamic expression of the union between temporal and spiritual authority.
Built on the west bank of the Tigris, the imperial capital spread swiftly to the east, where it grew at breakneck pace. The famous Barmakid family, well-heeled viziers to the Abbasid caliphs, spent 20 million silver dirhams building an opulent palace there, and another 20 million furnishing it (to put that in perspective, a master-builder working on the construction of Baghdad was paid 1/24th of a dirham a day).
Wisely, in an age when men could lose their heads with a caliph’s nod to the ever-present executioner, the palace was presented as a gift to Al Mamun, son and heir of the great caliph Harun al Rashid, and became his official residence in the early ninth century, the glittering centrepiece of the Dar al Khilafat, the caliphal complex that was home to future generations of Commanders of the Faithful.
All this was mightily impressive, but very quickly the architectural pre-eminence of the Round City became the least of Baghdad’s merits. Flush with money, the capital presided over a cultural revolution every bit as remarkable as its burgeoning political, military and economic power. Poets and prose writers, scientists and mathematicians, musicians and physicians, historians, legalists and lexicographers, theologians, philosophers and astronomers, even cookery writers, together made this a golden age, Islam’s answer to Greece in the fifth century BC.
Read the full article in Unherd.
‘You think that’s racist?’: the generational tension in Melbourne’s high-rise migrant families
James Button and Julie Szego, Guardian, 11 July 2020
“Ahmed and I talk about this a lot,” says Nor. “Because we came here as refugees at the age of 10, we knew we were outsiders and we were OK with it. But these kids who are born in Royal Women’s hospital, all they know is Australia. When their parents take them back to Eritrea, Somalia, they’re outsiders. They don’t understand nothing there. They’re westerners, that’s how they’re seen. And then they come back here and they’re still outsiders. I think that creates resentment.”
Being unable to talk to their parents exacerbates the sense of estrangement. “We have a lot of kids that can barely put together a sentence in Arabic or Somali, and their parents can barely put together a sentence in English. Those of us who came as immigrants knew our mother tongue. We’re not genius at it, but we’re able to have a basic conversation with our parent. But for a lot of kids now that doesn’t really happen.” If they experience racism, “they can’t talk to their parents because, they say, ‘I don’t want to put that stress on them when I know they can’t do anything about it. So I just have to suck it up.’ And this is a 14-year-old kid saying that to you.”
It worries Nor that young people of African background are not registering to vote. Or that “young kids who grow up here, especially in public housing, by the ages of 12 or 13 will say to me, ‘I would never go to uni. It is a waste of time. Australia is a racist country. You think someone’s going to give me a corporate job? You think I could ever be a lawyer in this country? A journalist? No.’ That’s at 12. I always tell people that’s dangerous, because why would they even try at school? Even if he’s going to grow out of it by 18 or 19, his options are very limited now. I said to one kid, ‘Why do you think that? You’re a kid. What do you know?’ And he’s like, ‘My sister’s got two degrees. She’s been looking for a job for two years.’ And I say, ‘Maybe your application wasn’t that good. Maybe you need to work on your interview skills. Now you dismiss everything by saying everyone’s racist. Trust me, I’ve seen racists. I’ve had to sit down and watch blatant racists sit there with a smile on their face. I’ve been beaten up by Victoria police officers. But that’s not always the case.’
“Yes there is bias, all those issues. But also, [young African Australians] are very defensive now. If I don’t get something, it’s because I’m black, because that person is a racist. If you’re only within the bubble, all you hear about is how white people are racists and how they don’t give opportunities. And as soon as you don’t get that one, or two, or three opportunities, that is evidence of what you already thought.”
Read the full article in the Guardian.
Why MA Jinnah was a man of many contradictions
Salil Tripathi, Live Mint, 31 July 2020
The peculiarity of hindsight is that it depends on the point from which you look back at Pakistan’s and India’s trajectories. In the early 1970s and till the late 1990s, as Pakistan itself broke up into two and generals and mullahs controlled its politics, India could afford to be smug. In 1992, the destruction of the Babri Masjid changed that, and the consequences of India’s 2014 election are there for us to see. Some Pakistanis may feel triumphant, but the virtue of Pakistani lawyer Yasser Latif Hamdani’s new biography, Jinnah: A Life, is that it takes a sober tone.
In clear, if not sparkling, prose, Hamdani, an admirer of Jinnah, offers a nuanced perspective of the man who began as an ambassador of Hindu-Muslim unity and ended up being instrumental in dividing India along religious lines. The book adds to the growing body of literature around Jinnah, building on the work of Stanley Wolpert, whose book was regarded as the most important biography until Ayesha Jalal’s detailed and absorbing one, and on the indifferent book by former BJP minister Jaswant Singh, which gained notoriety for all the wrong reasons.
Projecting Pakistani nationalism as if it was one man’s fantasy, no matter how powerful, is misleading. Wolpert did that with his 1984 biography, Jinnah Of Pakistan, where he said: “Few individuals significantly alter the course of history. Fewer still modify the map of the world. Hardly anyone can be credited with creating a nation-state. Jinnah did all three.” As Jalal argued in The Sole Spokesman: Jinnah, The Muslim League And The Demand For Pakistan (1985), this is a misleading view. She meticulously built the case that Jinnah never wanted a theocracy. As a shrewd tactician, he wanted assurances of Muslim rights, to secure a loose federal structure that would keep the subcontinent together.
But Wolpert was not entirely wrong—powerful individuals do shape the destinies of nations, but the context matters. Focusing on what Jinnah made of Pakistan is one thing; more interesting is the question that examines the context that made Jinnah. Jalal brings us closer to that question. In 2009, Jaswant Singh’s Jinnah: India, Partition, Independence unexpectedly sold many copies after Modi, then Gujarat chief minister, decided to ban the book (a court overturned the ban) and Singh was expelled from the party. Singh’s book was tedious and didn’t tell us much that was new; it was unusual simply because it was sympathetic to Jinnah.
India may be divided today in identifying the real heroes of its freedom struggle but it is still Jinnah who gets most of the blame. He is called stubborn and difficult, intransigent even; his apparent dietary hypocrisies are recounted to question if he was a “good Muslim”; his preference for well-cut suits is mocked, and his falling in love with, and marriage to, a Parsi woman young enough to be his daughter is considered scandalous.
In Jinnah: A Life, Hamdani offers a dispassionate account of Pakistan’s founding father, which reveals the remarkable man he was, without deifying him (as many do in Pakistan), and provides arguments that make it harder to vilify him easily (as many do in India). Hamdani shows how Jinnah began his career as a leader of the Congress and also became a member of the Muslim League. He received Mohandas Karamchand Gandhi when Gandhi returned from South Africa to India in 1915, and spoke at the public felicitation for his fellow Gujarati. Hamdani discounts the more dramatized accounts of that event (did the two have a falling out there which altered history?) and reminds us that Gandhi and Jinnah both considered Gopal Krishna Gokhale their mentor.
Read the full article in Live Mint.
The complex history of Cape Muslim cuisine
Ishay Govender, New Frame, 24 July 2020
The pastel-coloured houses running up the foot of Signal Hill in the Bo-Kaap are home to one segment of the community of Cape Malays, an ideological and political term that is fraught in many circles.
Until the mid-17th century, southern Africa had escaped the shackles of the slave trade, unlike East and West Africa. The precise origins of the Cape Malays’ ancestors became hazy with time as the enslaved were stripped of their identity and records were inked by the creative hands of new masters, who doled out names that were facetious or that corresponded to a day or month, in a hegemonic naming practice.
Melayu, one of the prominent languages spoken by the sailors and the enslaved from South East Asia (and commonly used from New Guinea to Madagascar, as Yusuf da Costa and Achmat Davids point out in Cape Malay History), became the language of Islam in the Cape and, later, the precursor to the creolised Dutch created by the enslaved and subsequently used by the slave masters. This would transmute into Afrikaans, the first record of which was actually in Arabic script.
It’s said that the use of Melayu, which sounds like “Malay”, and the geographical region known as the Malay Archipelago, embodying many of the enslaved’s home territories, are some of the reasons that the misnomer “Malay” persists when describing this Cape Muslim community.
Researcher and poet Rustum Kozain says: “As a child, I learned that ‘Cape Malay’ was an apartheid term and that the primary determinant in my identity was ‘Muslim’.”
As a government directive, it served to categorise the community of Muslims under the same umbrella (even those not directly linked to the slave trade) and distinguish them from those deemed “coloured”. But mixing by way of marriage, and consequently shared food customs, happened frequently, blurring the restrictive boundaries set by the state. Curries influenced by Muslims from India, for example, became a staple that lives on in the Cape today; a Cape Malay curry, however, is distinct from the east coast Durban curry.
Read the full article in New Frame.