The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
Why the coronavirus has been so successful
Ed Yong, Atlantic, 20 March 2020
The new virus certainly seems to be effective at infecting humans, despite its animal origins. The closest wild relative of SARS-CoV-2 is found in bats, which suggests it originated in a bat, then jumped to humans either directly or through another species. (Another coronavirus found in wild pangolins also resembles SARS-CoV-2, but only in the small part of the spike that recognizes ACE2; the two viruses are otherwise dissimilar, and pangolins are unlikely to be the original reservoir of the new virus.) When SARS-classic first made this leap, a brief period of mutation was necessary for it to recognize ACE2 well. But SARS-CoV-2 could do that from day one. ‘It had already found its best way of being a [human] virus,’ says Matthew Frieman of the University of Maryland School of Medicine.
This uncanny fit will doubtlessly encourage conspiracy theorists: What are the odds that a random bat virus had exactly the right combination of traits to effectively infect human cells from the get-go, and then jump into an unsuspecting person? ‘Very low,’ Andersen says, ‘but there are millions or billions of these viruses out there. These viruses are so prevalent that things that are really unlikely to happen sometimes do.’
Since the start of the pandemic, the virus hasn’t changed in any obviously important ways. It’s mutating in the way that all viruses do. But of the 100-plus mutations that have been documented, none has risen to dominance, which suggests that none is especially important. ‘The virus has been remarkably stable given how much transmission we’ve seen,’ says Lisa Gralinski of the University of North Carolina. ‘That makes sense, because there’s no evolutionary pressure on the virus to transmit better. It’s doing a great job of spreading around the world right now.’
There’s one possible exception. A few SARS-CoV-2 viruses that were isolated from Singaporean COVID-19 patients are missing a stretch of genes that also disappeared from SARS-classic during the late stages of its epidemic. This change was thought to make the original virus less virulent, but it’s far too early to know whether the same applies to the new one. Indeed, why some coronaviruses are deadly and some are not is unclear. ‘There’s really no understanding at all of why SARS or SARS-CoV-2 are so bad but OC43 just gives you a runny nose,’ Frieman says.
Researchers can, however, offer a preliminary account of what the new coronavirus does to the people it infects. Once in the body, it likely attacks the ACE2-bearing cells that line our airways. Dying cells slough away, filling the airways with junk and carrying the virus deeper into the body, down toward the lungs. As the infection progresses, the lungs clog with dead cells and fluid, making breathing more difficult. (The virus might also be able to infect ACE2-bearing cells in other organs, including the gut and blood vessels.)
Read the full article in the Atlantic.
South Korea’s coronavirus lessons:
Quick, easy tests; monitoring
Kelly Kasulis, Al Jazeera, 19 March 2020
South Korea’s coronavirus outbreak is a lesson in early action and swift containment.
One month ago, on February 18, South Korea diagnosed its 31st patient with COVID-19, and she soon became known as the country’s ‘super-spreader.’ A middle-aged woman who took part in mass congregations at a religious group called the Shincheonji Church of Jesus, Patient 31 passed the virus on to other members of the faithful as well as other unsuspecting residents of the southeastern city of Daegu.
Suddenly, South Korea’s coronavirus cases multiplied 180-fold in a two-week span. At its peak, medical experts were diagnosing more than 900 new cases a day, making South Korea the second-largest outbreak in the world.
Now, that growth rate has significantly slowed – and there is even talk that the outbreak might have peaked…
When Chinese scientists first published the COVID-19 virus’ genetic sequence in January, at least four South Korean firms quietly began developing and stockpiling test kits alongside the government – well before the country had its first outbreak.
By the time things got bad, the country had the ability to test more than 10,000 people per day, including at makeshift drive-through testing centres and newly added consultation phone booths at hospitals. Anyone with a mobile phone in the country also received alerts about nearby infection paths so that citizens could avoid areas where the virus was known to be active.
At the same time, the South Korean government created a GPS-enabled app to monitor those under quarantine and set off an alarm if they ventured outdoors. Travellers entering the country are also being asked to record their symptoms on a state-sponsored app.
Unlike other countries, South Korea also managed to turn its outbreak around without locking down cities or banning travel. In fact, the term ‘social distancing’ originated with the South Korean president’s campaign against the virus.
However, that does not mean all other countries should follow suit. South Korea’s mass-testing and early detection may have afforded it the luxury of being able to avoid declaring a total shutdown. ‘Because Korea has the ability to sample and test faster than in other countries, there was no reason to do what other countries are doing [and lock down],’ Roh said.
‘The method of blocking off certain areas and stopping movement was what people did in the Middle Ages when they were dealing with the Black Death. It was because they didn’t know what was causing infections at the time and they didn’t know where the disease was spreading.’
Read the full article in Al Jazeera.
How pandemics change history
Frank M Snowden & Isaac Chotiner,
New Yorker, 3 March 2020
I want to start with a big question, which is: What, broadly speaking, are the major ways in which epidemics have shaped the modern world?
One way of approaching this is to examine how I got interested in the topic, which was a realization—I think a double one. Epidemics are a category of disease that seem to hold up the mirror to human beings as to who we really are. That is to say, they obviously have everything to do with our relationship to our mortality, to death, to our lives. They also reflect our relationships with the environment—the built environment that we create and the natural environment that responds. They show the moral relationships that we have toward each other as people, and we’re seeing that today.
That’s one of the great messages that the World Health Organization keeps discussing. The main part of preparedness to face these events is that we need as human beings to realize that we’re all in this together, that what affects one person anywhere affects everyone everywhere, that we are therefore inevitably part of a species, and we need to think in that way rather than about divisions of race and ethnicity, economic status, and all the rest of it.
I had done some preliminary reading and thought this was an issue that raises really deep philosophical, religious, and moral issues. And I think epidemics have shaped history in part because they’ve led human beings inevitably to think about those big questions. The outbreak of the plague, for example, raised the whole question of man’s relationship to God. How could it be that an event of this kind could occur with a wise, all-knowing and omniscient divinity? Who would allow children to be tortured, in anguish, in vast numbers? It had an enormous effect on the economy. Bubonic plague killed half the population of full continents and, therefore, had a tremendous effect on the coming of the industrial revolution, on slavery and serfdom. Epidemics also, as we’re seeing now, have tremendous effects on social and political stability. They’ve determined the outcomes of wars, and they also are likely to be part of the start of wars sometimes. So, I think we can say that there’s not a major area of human life that epidemic diseases haven’t touched profoundly.
Were you trying to make a point about how the way we respond to these things is often a function of our racial or ethnic or religious views rather than our general humanity, and that the response has shown the flaws of human beings in some way? Or were you making a different point?
I think I was trying to make two points. I think the causal chain works in both directions. Diseases do not afflict societies in random and chaotic ways. They’re ordered events, because microbes selectively expand and diffuse themselves to explore ecological niches that human beings have created. Those niches very much show who we are—whether, for example, in the industrial revolution, we actually cared what happened to workers and the poor and the condition that the most vulnerable people lived in.
Read the full article in the New Yorker.
Why the government changed tack on Covid-19
Saloni Dattani, Unherd, 17 March 2020
A central concept to understand in epidemiology is the basic reproduction number (R0), which is the number of people that are expected to be infected by an individual case, in a population that is susceptible to infection.
This means that if a virus has an R0 of 2, for example, an individual case is expected to infect 2 other people, who are expected to infect a further 2 people each, and so on. In general, a virus with a larger R0 spreads more rapidly in a population. As a consequence, the R0 determines whether a pathogen will remain endemic in a population (if it is >1) or die out (if it is <1). During early stages of the pandemic in Wuhan in January, research from different labs estimated an R0 of 2.54 on average (with a 95% confidence interval between 2.17 and 2.91).
Crucially, the R0 is not an inherent or fixed property of a pathogen: the expected number of people who will be infected by a case depends on the behaviour of individuals in a population and their environmental context, such as the length of time that cases are infectious, the number of susceptible people they are in contact with, and their general infectiousness.
This is the reason that reducing the number of contacts that individuals have (such as by social distancing) works, as does handwashing and effective treatment — these strategies reduce the R0 of a pathogen.
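To make the arithmetic concrete, here is a minimal sketch of the generational growth described above (my own illustration, not a model from Dattani’s article). It assumes each case infects R0 others on average, so case counts compound by a factor of R0 each generation; the 2.54 and 0.32 figures are the Wuhan estimates quoted in this excerpt, and the contact-halving factor is purely an illustrative assumption.

```python
# Toy branching-process sketch of R0 (illustration only, not the article's model).
# Expected new cases in each generation = previous generation's cases * R0.

def expected_new_cases(r0: float, generations: int, seed_cases: float = 1.0) -> list:
    """Expected new cases per generation under a simple branching model."""
    cases = [seed_cases]
    for _ in range(generations):
        cases.append(cases[-1] * r0)
    return cases

if __name__ == "__main__":
    # R0 > 1 compounds into an epidemic (in a fully susceptible population);
    # R0 < 1 makes the chain of infection fizzle out.
    for r0 in (2.54, 1.0, 0.32):  # 2.54 and 0.32: Wuhan estimates cited above
        trajectory = expected_new_cases(r0, generations=8)
        print(f"R0 = {r0}: " + ", ".join(f"{c:.2f}" for c in trajectory))

    # Interventions work by shrinking R0 itself: if halving contacts halves the
    # expected infections per case (an illustrative assumption), the effective
    # reproduction number halves too.
    print(f"Effective R0 after halving contacts: {2.54 * 0.5:.2f}")
```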
Countermeasures in Wuhan and elsewhere have already reduced the local R0 of COVID-19, with research suggesting the R0 was reduced all the way down to 0.32 in Wuhan in early February after extensive testing and containment measures. In Italy, which implemented aggressive countermeasures fairly late into their local epidemic, preliminary analysis suggests the R0 was reduced from 3 in late February to 1.7 in early March and the number of new cases has dramatically slowed down.
According to an analysis published in the Lancet, approximately 95% of the Wuhan population remained uninfected by the virus at the end of January, after the peak of their crisis, as a result of aggressive countermeasures. These data on their own indicate that herd immunity is not an inevitable outcome, nor is it inevitable that up to 80% of the UK population will be infected within the next year, as was claimed by Professor Chris Whitty.
Read the full article in Unherd.
Is our fight against coronavirus worse than the disease?
David L Katz, New York Times, 20 March 2020
What we know so far about the coronavirus makes it a unique case for the potential application of a ‘herd immunity’ approach, a strategy viewed as a desirable side effect in the Netherlands, and briefly considered in the United Kingdom.
The data from South Korea, where tracking the coronavirus has been by far the best to date, indicate that as much as 99 percent of active cases in the general population are ‘mild’ and do not require specific medical treatment. The small percentage of cases that do require such services are highly concentrated among those age 60 and older, and further so the older people are. Other things being equal, those over age 70 appear at three times the mortality risk of those age 60 to 69, and those over age 80 at nearly twice the mortality risk of those age 70 to 79.
These conclusions are corroborated by the data from Wuhan, China, which show a higher death rate, but an almost identical distribution. The higher death rate in China may be real, but is perhaps a result of less widespread testing. South Korea promptly, and uniquely, started testing the apparently healthy population at large, finding the mild and asymptomatic cases of Covid-19 other countries are overlooking. The experience of the Diamond Princess cruise ship, which houses a contained, older population, proves the point. The death rate among that insular and uniformly exposed population is roughly 1 percent.
We have, to date, fewer than 200 deaths from the coronavirus in the United States — a small data set from which to draw big conclusions. Still, it is entirely aligned with the data from other countries. The deaths have been mainly clustered among the elderly, those with significant chronic illnesses such as diabetes and heart disease, and those in both groups.
This is not true of infectious scourges such as influenza. The flu hits the elderly and chronically ill hard, too, but it also kills children. Trying to create herd immunity among those most likely to recover from infection while also isolating the young and the old is daunting, to say the least. How does one allow exposure and immunity to develop in parents, without exposing their young children?
The clustering of complications and death from Covid-19 among the elderly and chronically ill, but not children (there have been only very rare deaths in children), suggests that we could achieve the crucial goals of social distancing — saving lives and not overwhelming our medical system — by preferentially protecting the medically frail and those over age 60, and in particular those over 70 and 80, from exposure.
Read the full article in the New York Times.
Search for coronavirus vaccine becomes a global competition
David E. Sanger, David D. Kirkpatrick,
Sui-Lee Wee & Katrin Bennhold,
New York Times, 19 March 2020
What began as a question of who would get the scientific accolades, the patents and ultimately the revenues from a successful vaccine is suddenly a broader issue of urgent national security. And behind the scramble is a harsh reality: Any new vaccine that proves potent against the coronavirus — clinical trials are underway in the United States, China and Europe already — is sure to be in short supply as governments try to ensure that their own people are the first in line.
In China, 1,000 scientists are at work on a vaccine, and the issue has already been militarized: Researchers affiliated with the Academy of Military Medical Sciences have developed what is considered the nation’s front-runner candidate for success and are recruiting volunteers for clinical trials.
China ‘will not be slower than other countries,’ Wang Junzhi, a biological products quality control expert with the Chinese Academy of Sciences, said Tuesday at a news conference in Beijing.
The effort has taken on propaganda qualities. Already, a widely circulated photograph of Chen Wei, a virologist in the People’s Liberation Army, receiving an injection of what was advertised to be the first vaccine, has been exposed as a fake, taken before a trip she made to Wuhan, where the virus began.
President Trump has talked in meetings with pharmaceutical executives about making sure a vaccine is produced on American soil, to assure the United States controls its supplies. German government officials said they believed he tried to lure a German company, CureVac, to do its research and production, if it comes to that, in the United States.
The company has denied it received a takeover offer, but its lead investor made clear there was some kind of approach.
Asked by the German magazine Sport 1 about how the contact with Mr. Trump had unfolded, Dietmar Hopp, whose Dievini Hopp BioTech Holding owns 80 percent of the company, said: ‘I personally didn’t speak to Mr. Trump. He spoke to the company and they immediately told me about it and asked what I thought of it, and I knew immediately that it was out of the question.’
The report of the approach was enough to prompt the European Commission to pledge another $85 million to the firm, which has already had support from a European vaccine consortium.
The same day, a Chinese company offered $133.3 million for an equity stake and other consideration from another German firm in the vaccine race, BioNTech.
Read the full article in the New York Times.
Why you should stop joking that black people
are immune to coronavirus
Brentin Mock, CityLab, 14 March 2020
While some may argue that the jokes, at least, are harmless, U.S. history evinces how unsubstantiated claims about race-based resilience to disease have led to devastating outcomes, particularly for African Americans. The impacts of such beliefs still affect how people of color are medically treated — or not — today.
The 18th-century yellow fever outbreak in the Americas is instructive here. In the 1740s, yellow fever had overtaken coastal port cities such as Charleston, South Carolina, driving people into delirium, endless vomiting, hemorrhaging, and eventually death. The physician John Lining recorded his observations about the disease in Charleston after inspecting slave ships and their cargo —including captive Africans — finding that it was almost exclusively white people who were succumbing to the disease. These observations helped reinforce already-stirring beliefs that Africans had some kind of supernatural inoculation to some of the deadliest diseases floating along the American coast.
Lining’s medical briefs became the reference manuals for another physician, Dr. Benjamin Rush, when in 1793 a yellow fever outbreak took hold of Philadelphia, Pennsylvania, which at the time was the nation’s capital. Close to 20,000 people — half of the population — fled Philly that year, while many African Americans actually stayed in the city at the request of Rush, who wanted to train them to nurse, care-take, and dig graves for the thousands of people dying of yellow fever.
Rush was operating on the belief that black people were immune to the disease, and black Philadelphians believed him when he told them that they were. Rush not only was an outspoken abolitionist, but also friend of the black clergymen Absalom Jones and Richard Allen, founders of the African Methodist Episcopal church, and two of the most influential African Americans of the time.
Jones and Allen helped convince black people to stay behind to assist Rush, telling their congregations that it was their Christian duty to help care for the lives of white Philadelphians. But Rush was wrong. Many of the African Americans in his medical camp contracted the disease. Hundreds of them died. Allen became afflicted and almost died himself. While Rush was a highly respected doctor — the American Psychiatric Association would later title him the ‘father of American psychiatry’ — he was relying on faulty claims about race and health conditions that proved fatally wrong. The Philadelphia massacre became an object lesson in what happens when race gets bandied about amidst the rages of a major health maelstrom.
As Dr. Rana Hogarth wrote in her book Medicalizing Blackness about the 1793 yellow fever outbreak in Philadelphia: ‘The idea of innate black immunity placed an undue burden on the city’s black inhabitants. For those black people who did stay behind to help, it meant buying into a belief that at its core defined their bodies as being distinctive and unequal to whites.’
This is why Hogarth bristles a little every time she sees memes fly by on Twitter or Facebook pointing out fewer documented cases of coronavirus in Africa, or fewer deaths of African Americans, as indications that black people are somehow impervious to the disease. Such statements, whether made literally or comically, are rooted in racist beliefs that hearken back to the 18th century yellow fever disaster that almost decimated black Philadelphia.
Read the full article in CityLab.
Red and blue America
aren’t experiencing the same pandemic
Ronald Brownstein, Atlantic, 20 March 2020
Even a disease as far-reaching as the coronavirus hasn’t entirely crossed the chasm between red and blue America.
In several key respects, the outbreak’s early stages are unfolding very differently in Republican- and Democratic-leaning parts of the country. That disconnect is already shaping, even distorting, the nation’s response to this unprecedented challenge—and it could determine the pandemic’s ultimate political consequences as well.
A flurry of new national polls released this week reveals that while anxiety about the disease is rising on both sides of the partisan divide, Democrats consistently express much more concern about it than Republicans do, and they are much more likely to say they have changed their personal behavior as a result. A similar gap separates people who live in large metropolitan centers, which have become the foundation of the Democratic electoral coalition, from those who live in the small towns and rural areas that are the modern bedrock of the GOP.
Government responses have followed these same tracks. With a few prominent exceptions, especially Ohio, states with Republican governors have been slower, or less likely, than those run by Democrats to impose restrictions on their residents. Until earlier this week, Donald Trump downplayed the disease’s danger and overstated the extent to which the United States had ‘control’ over it, as the conservative publication The Bulwark recently documented. Conservative media figures including Rush Limbaugh and Sean Hannity likewise insisted for weeks that the media and Democrats were exaggerating the danger as a means of weakening Trump. Several Republican elected officials encouraged their constituents to visit bars and restaurants precisely when federal public-health officials were urging the opposite.
Read the full article in the Atlantic.
Mulling the allure and peril
of state power amid Covid-19
Michael Schulson, Undark, 16 March 2020
As public health officials around the world continue to confront the rapid outbreak of the coronavirus illness known as Covid-19, quarantines and other restrictions on personal movement have become a key tool of containment. China had aggressively quarantined some 50 million people in Hubei province since late January — at the time the largest ‘cordon sanitaire,’ as such quarantines of mass geographic areas are known, in history — before starting to ease movement restrictions last week. Italian leaders have upped the ante, putting the entire country of 60 million people on indefinite lockdown, with police limiting travel between towns and cities.
For its part, the U.S. has enforced quarantines for individuals entering the country who are known to have been exposed to the virus officially dubbed SARS-CoV-2. President Donald J. Trump has also declared a national state of emergency, and effectively banned foreign nationals from entering the U.S. from most of Europe, as well as China, and Iran. Meanwhile, a cascading list of states and municipalities across the U.S. have begun shuttering schools, universities, bars, restaurants, sporting events, church services, and other large gatherings of people — and encouraging Americans to limit interpersonal contact and, if possible, to simply stay at home.
One question that continues to percolate among an increasingly nervous American populace, however, is whether more widespread and aggressive state actions — forcing the infected to self-isolate, requiring exposed people to self-quarantine whether they want to or not, and even instituting cordons sanitaires — could happen here, or whether they even should.
In some ways, the answer is academic: While the federal government does have the authority to control the movements of people into and out of the U.S., as well as across state lines, the power to limit movement or take other coercive actions against individuals or groups of citizens living in a particular area generally resides with state and municipal authorities. And while some local and state officials have begun ratcheting up actions against infected individuals who ignore recommendations to stay home — police officers in Kentucky, for example, have been stationed outside the home of a recalcitrant individual with Covid-19 — individual citizens, for the most part, remain free to move about at their discretion. New York’s Gov. Andrew Cuomo, for example, set up a ‘containment area’ around the city of New Rochelle last week, but restrictions do not prevent residents from leaving the area if they choose.
Whether, where, and how more stringent containment efforts might emerge is difficult to say, and it seems certain that many Americans would defy — intentionally or inadvertently — even clear orders. More fundamentally, however, the mere notion of official edicts regarding individual behavior raises long-standing questions, still unsettled in the American public health sphere, around when, if ever, officials should resort to coercion in pursuit of a public health goal.
Read the full article in Undark.
The ugliness of coronavirus shaming
Isabel Hardman, Spectator, 22 March 2020
In the early years of the First World War, a man out of uniform had a reasonable chance of being stopped in the street by a young woman and handed a white feather. This campaign of social shame encouraged those who had not yet enlisted to do so using white feathers as a symbol of cowardice. It may have had noble roots – encouraging everyone who could serve their country to do so – but it quickly became ugly.
Men who had come home for a few days’ leave, men discharged after being injured fighting, and men in exempted professions such as doctors and train drivers, were often handed feathers by indignant, self-righteous women who had come to regard the practice as a hobby. There are many tales of the humiliation of men already scarred by the horrors of war who were handed multiple feathers just on a peaceful walk through London. It came to the point where the government issued silver badges to indicate that the man in question had already served or was contributing to the war effort at home.
We are not asking people to sign up to fight in the trenches today, but to stay away from each other and wash their hands. That so many have chosen to ignore that is deeply troubling, particularly to those on the frontline of the NHS, or those who suffer from or have loved ones with serious health conditions. That someone would think going to a crowded pub, or hanging out in a group was more important than stopping the spread of a deadly virus is baffling. The more social pressure there is for all of us to realise that careless contact costs lives, the better.
But among the earnest pleas for people to stay at home, wash their hands and save the NHS, there are white feathers. Many were being waved yesterday when photos emerged of too many people enjoying parks and beauty spots. How stupid could they get, raged others online. How selfish!
It is indeed selfish to rock up in Hyde Park for a group picnic, or Brighton beach for a barbecue with friends who you don’t live with. But many of the people walking outdoors yesterday would have been trying to heed government advice to get outdoors with appropriate social distancing. It’s often the case that we don’t realise until it’s too late that everyone has had the same good idea as we have, and has pitched up in the same place. Do they really deserve to be tarred with the same brush, handed the same white feather? We are in the middle of an unprecedented pandemic that no government in the world has worked out how to deal with, and which has ripped up all our ways of going about our lives and looking after ourselves. Is it really a surprise that some of us are making mistakes while trying to handle it?
Read the full article in the Spectator.
Revealed: the great European refugee scandal
Daniel Howden, Apostolis Fotiadis & Zach Campbell,
Guardian, 12 March 2020
The confusion at sea that night was not an isolated incident but an illustration of the painstaking lengths to which Europe has gone to ensure migrants do not reach the continent. While the level of violence at Greece’s border with Turkey has shocked many Europeans, Europe’s retreat from refugee rights did not begin last week. Greece’s decision to seal its borders and deny access to asylum is only the most visible escalation of an assault on people’s right to seek protection.
The groundwork for this was laid in the central Mediterranean, where the EU and Italy created a proxy force to do what they could not do themselves without openly violating international laws: intercept unwanted migrants and return them to Libya.
The strategy has relied on maintaining deniability of responsibility for Libyan coastguard operations. But the connivance revealed in the audio recordings is supported by previously unpublished letters between high-level EU mandarins, confirmed by inside sources and laid bare in emails from the Libyan coastguard, all obtained by the Guardian. Taken together, this evidence threatens to unravel a conspiracy in the Mediterranean that flouts international law in the name of migration control.
The Mediterranean is the theatre where Europe’s ideas of human rights do battle with continental politicians’ anxiety about African migration. Until 2009, Libya was a ‘safe’ country of return because countries such as Italy said it was. Italian vessels would intercept migrants and persuade them to clamber off their boats with promises of passage to Italy, and then put them in handcuffs and sail them to Tripoli.
Italy shipped close to 900 people back to Libya in 2009. Among those returnees were 11 Eritreans and Somalis who complained to the European court of human rights. The court’s ruling in 2012 said Italy was guilty of refoulement and had violated the men’s right to claim asylum and not to be returned to an unsafe port. In rejecting Italy’s arguments, one of the judges pointed out that ‘refugees have the right to have rights.’
This ruling, named the Hirsi ruling after one of the returnees, means any refoulement operation, even one carried out by a proxy force, would be vulnerable to international legal scrutiny if an EU state could be shown to be controlling and directing these operations. Europe had to find allies in Libya who were capable of intercepting migrants on the high seas without overt direction from the Europeans.
The project of building a proxy took off in the summer of 2017. At that time Libya, in the middle of a civil war, had no centralised coastguard and no capacity to manage its own search and rescue area. From the outset it was a joint project between Rome and Brussels: Italy provided ships while the EU trained and paid the new coastguards, often recruiting from among militias and smugglers.
Read the full article in the Guardian.
‘We discuss food banks at school gates like it’s normal’
Chris Vallance, BBC News, 18 March 2020
The food bank was set up by Jane Benyon, 11 years ago. A former social worker, whose husband was once the Conservative MP for nearby Abingdon, Jane could tell from her work that there was hidden hunger in Oxford.
People are visiting the food bank in increasing numbers. Last year it helped 3,205 adults and children, up from 2,626 in 2018. It was in 2018 that Universal Credit was rolled out to single people in Oxfordshire, and research by Oxford University student Rosie Sourbut found that 39% of those visiting a food bank in the city in 2018/2019 cited delays with Universal Credit as one of the reasons.
‘The concept was a good one but it’s had some huge teething problems,’ Jane Benyon tells me. ‘And whether those will ever be resolved I’m not sure.’
Paul Clarke a Baptist minister who is in charge of another branch of the Community Emergency Foodbank, a mile to the north of St Francis in the Oxford suburb of Barton, runs through a sheaf of blue referral forms. Even now, he says, most mention problems with Universal Credit. ‘It does feel to me we are relying on charitable organisations to sort these things out,’ he says.
The Oxford served by this branch of the food bank is not the one you see in tourist brochures. It is among the top tier of most deprived areas in England, according to the City Council.
‘These estates they’re massive, they’re dotted around and they are a lot poorer than what you see in the centre,’ says Sharon. Drug dealers thrown off other estates sometimes end up in Barton, she says.
It’s here that I meet Rachel, a mother of four, a grandmother, a former alcoholic and drug user who has turned her life around, gaining qualifications and studying for a while at university. Volunteers bring her bags of groceries and toiletries.
Rachel has lived in Barton for 18 years. She is softly spoken, but this cannot mask her anger.
She tells me some local women are so desperate they have to steal when they have their period. ‘Nicking sanitary towels, how embarrassing that people have to stoop that low,’ she says.
She tells me she usually doesn’t eat breakfast. Her priority is feeding her children, two of whom are still under 18.
When I ask if Christmas was difficult, the pain of not having enough money to give her kids what she would like rises to the surface, and we have to pause the interview.
She says a lot of people on the estate are still embarrassed to use the food bank but others have become accustomed to it.
‘We’re supposed to be one of the richest countries in the world but we’ve got to use food banks. We’re talking about food banks at the school gates like it’s normal!’
Read the full article on BBC News.
Sexism probably wasn’t what doomed Warren’s campaign
Cathy Young, Atlantic, 10 March 2020
Meanwhile, in real-life congressional elections—since as far back as the 1980s—women who run win as often as men do. Do these women, as some have suggested, need to be better than men to do as well? Testing this proposition is virtually impossible. But some evidence suggests that being female in 21st-century America is not a disadvantage in political races. Jennifer L. Lawless, a professor of government at American University, and Danny Hayes, a professor of political science at George Washington University, carried out a detailed analysis of voter surveys and media coverage from the 2010 midterms and found that ‘candidate sex does not affect journalists’ coverage of, or voters’ attitudes toward, the women and men running for office.’
Those data, I should note, came from congressional elections; some analysts believe that Americans who have no problem with female legislators may be more hesitant to elect women to executive positions, though the evidence remains inconclusive. ‘The optimistic story that we’ve been telling for 15 years is not about presidential politics,’ Lawless told me in an interview two days after Warren’s withdrawal. ‘It might be! But we don’t have systematic evidence, because we still have too few women running for president.’
Does Lawless think sexism played a role in either Clinton’s defeat or Warren’s failure? Her conclusion is that we simply don’t know. In 2016, she pointed out, the female candidate was Hillary Clinton, who had unique baggage after decades in the public eye. ‘It was hard to tell if it was sexism or Clintonism,’ Lawless told me. And this year? ‘Although some very qualified female contenders did not make it to the end of the race,’ Lawless said, ‘neither did some very qualified men.’
None of which is to say that sexism is extinct. In a June 2019 Ipsos poll, about 12 percent of Democrats and independents disagreed strongly or somewhat with the statement ‘I am comfortable with a female president.’ But far more—40 percent—felt that it was important for the Democratic Party to nominate a woman.
Lawless does believe that gender currently puts up one distinct hurdle for women: the widespread belief that America is not willing to elect a female president. Because of this assumption, some voters eager to get Trump out of office might see a male candidate as a safer choice. Other commentators have raised the same issue in recent months. ‘While most Americans claim they are ready for a woman president, far fewer see other people as quite so open to the possibility,’ the New York Times columnist Michelle Cottle wrote in January, pointing to several polls in which this pattern emerged. Warren supporters have deplored the media’s flogging of the ‘electability’ issue as a ‘self-fulfilling prophecy.’ But no one seems to be asking whether the relentless focus on the misogyny allegedly thwarting female candidates—and, specifically, Clinton in 2016—played into this self-fulfilling prophecy as well.
Read the full article in the Atlantic.
Is there a crisis of democracy in Europe?
Hanspeter Kriesi, Politische Vierteljahresschrift,
16 March 2020
From the vantage point of the Chinese elites, not only free market capitalism but also Western-style democracy has lost its attraction, and they suggest that ‘90 percent of democracies created after the fall of the Soviet Union have now failed’ (Wolf 2018). There might be some wishful thinking involved in their assessment of the state of democracy across the globe. But in Europe, too, the democratic ‘Zeitgeist’ has become more pessimistic (Brunkert et al. 2018), and the current public debate about democracy suggests that liberal democracy is in crisis. In Europe, the rise of populism from the right and the left; the imposition of austerity on southern European countries by the Troika; Brexit; and the illiberal measures taken by governments in Hungary and Poland are interpreted by pundits—academics as well as public intellectuals—as so many signs of a crisis of democracy.
The current situation in Europe reminds me of the early 1970s, when preoccupied observers identified ‘a breakdown in consensus,’ ‘a political and economic decline,’ and ‘a crisis of democracy.’ This crisis talk came in two versions (Held 2006): those arguing from the premises of a pluralist theory of politics and those arguing from Marxist theory. According to the doomsayers from left and right, the liberal democratic state had become increasingly hamstrung or ineffective in the face of growing demands that were either ‘excessive’ (Crozier et al. 1975) or the ‘inevitable result of the contradictions within which the state is enmeshed’ (Offe 1984; Habermas 1973). The crisis talk of these theories was later disconfirmed by a large comparative research program—the ‘Beliefs in Government’ program (Klingemann and Fuchs 1995)—that was triggered by the very preoccupations with the alleged democratic crisis. The results of this program showed that Western representative democracies proved to be perfectly capable of absorbing and assimilating growing pressure from societal problems, and the forms of political expression taken by such pressure could be understood as the normal manifestations of democracy in complex societies. By the time these results were published, however, nobody cared anymore about the crisis of democracy. In the meantime, the Berlin Wall had fallen, democracy had triumphed, pundits had declared the ‘end of history,’ and the public had moved on. But the dominant mood soon changed again, and the crisis talk did return in new garb. In 2000, a follow-up study under the title of ‘Disaffected Democracies’ was newly preoccupied by the lack of public confidence in leaders and institutions of democratic governance (Pharr and Putnam 2000). The study argued that the causes for the decline of confidence did not lie in the social fabric, nor were they the result of general economic conditions. The problem, it suggested, was with government and politics themselves. Similarly, the contributors of yet another study on political disaffection in contemporary democracies (Torcal and Montero 2006) highlighted the decisive role of politics and institutions in shaping political disaffection.
In the more recent past, pessimism about the state of Western democracy has again increased. Larry Diamond (2015) writes about ‘a democratic recession,’ and Marc Plattner (2017) sees democracies ‘on the defensive.’ In 2018, the academic observers’ tone became more alarmist still. Yascha Mounk (2018) published a treatise entitled The People vs Democracy, Steven Levitsky and Daniel Ziblatt (2018) discussed ‘how democracies die,’ and David Runciman (2018) similarly wrote about ‘how democracy ends.’ Is this time different? Are we heading for a truly transformative crisis of Western democracy? Although I share some of the concerns of the more alarmist colleagues, I do not think we should dramatize the current situation. I side with The Economist, which, in its June 16, 2018, issue suggested that ‘reports on the death of democracy are greatly exaggerated.’ But, it added, ‘the least bad system of government ever devised is in trouble. It needs defenders.’ And I think one way to defend it is to get the facts right and to base our expectations about the future of democracy on the best available empirical evidence.
Read the full article in Politische Vierteljahresschrift.
The American Psychological Association
keeps getting the science of video games wrong
Christopher J Ferguson, Medium, 3 March 2020
As with past moral panics regarding rock music, comic books, or Dungeons & Dragons, it is increasingly clear that video games play little role in violent crime or even prank-level aggressive behaviors. As various studies show, there is no long-term association between aggressive video games and violent crime. If anything, studies suggest that playing popular violent games like Call of Duty or Grand Theft Auto is associated with reduced crime in society.
Examinations of this issue by the U.S. Supreme Court, the U.S. School Safety Commission, as well as reviews by the governments of Australia, Sweden, and the U.K., all come to the same conclusion: evidence for even mild aggression is inconsistent at best.
The lonesome holdouts still complaining about games are professional guilds, particularly the American Psychological Association (APA). This group — full disclosure: I am a fellow of the APA, but speak only for myself — stubbornly holds onto its 2015 resolution that, though games are not related to violent crime, they do cause ‘aggression,’ which the group confusingly leaves undefined. This resolution suggests both that the evidence for aggression effects is consistent and, in some real-world way, worth worrying about despite considerable evidence to the contrary. Why does the APA hold onto such a scientifically inaccurate resolution?
I suspect the answer is complex, but largely comes down to confirmation bias: in general, humans are poor at changing their minds, even in the face of evidence that contradicts their worldviews. Add in generational issues — most individuals making these decisions are older, not unlike previous generations of older adults who railed against rock music — along with the APA’s purpose of marketing psychology and the sunk costs of staking out a bad public position, and it looks like the APA got itself into a trap with no easy escape.
In 2013, the APA announced it was putting together a task force to update its existing policies on aggressive video games. Scholars were concerned. Video game research had been contentious for decades and the APA had a history of overstating the evidence for effects. Although the APA ostensibly wanted open-minded scholars, a majority of task force members had already staked out public positions demonizing games. Despite the APA’s apparent intention to appear neutral, one member had previously published studies critiquing games, of which the task force chair appeared unaware. No task force members had publicly defended games prior to this appointment.
Given how many APA members had never taken a position on games, a task force this clearly biased was statistically implausible as a random outcome. In response, 230 scholars wrote an open letter in 2013 calling on the APA to refrain from releasing declarative statements on how games affect aggression. The APA task force didn’t even acknowledge this letter in their final report.
Read the full article on Medium.
Hannah Arendt and the hierarchy of human activity
Finn Bowring, TLS, March 2020
In The Human Condition, Arendt divided human activity into a tripartite hierarchy. The lowest activity is the laborious task of transforming the organic environment into human sustenance – what Marx referred to as the ‘eternal natural necessity which mediates the metabolism between man and nature’. Arendt called labour a ‘worldless’ activity, because it contributes nothing beyond the perpetuation of an ultimately perishable life, and because it testifies to that aspect of human existence that is shared by all organic things. The lowly status of labour in Greek antiquity is reflected in the way survival activities were hidden from sight in the household (oikos), and performed by the lowest ranked inhabitants of the Greek city state (women and slaves). Labour often required strength, if not physical violence, which was exercised both over recalcitrant nature and, by the ruler of the household, over those who laboured. But never was the necessary domination of people or nature confused with freedom, the function of slavery being to liberate citizens from the burden of labour so that they could take their public place among a community of equals: ‘if it were true that nothing is sweeter than to give commands and to rule others’, Arendt points out, ‘the master would never have left his household’.
Part of the superiority of ‘work’ over labour, Arendt argued, is that it redeems human beings from worldlessness. Work is a higher activity because to work is to fabricate a stable and lasting world of objects capable of protecting humans from the forces of nature. The highest accomplishments of homo faber are works of art, which transcend the criteria of need and usefulness and whose beauty may shine through the centuries. But the worker remains an instrumentalist at heart, and this leads to another problem, which is the haemorrhaging of meaning from a world where everything can be reduced to a disposable means.
For Arendt, the highest activity available to human beings is the public deed – to which she simply gave the term ‘action’. As the development of the polis and the Roman res publica superseded the heroic world of Homer, the substance of human action shifted from daring deeds to the faculty of speech, and in this process ‘freedom’ became the political act of participation in public dialogue. Action, Arendt argues, remedies the meaninglessness of human life by revealing our unique capacity to say and do the unforeseen, to break free of causal relations and means-ends reasoning by bringing something into being that is new and improbable. Action, she famously wrote, is ‘the one miracle-working faculty of man’.
The freedom and novelty of action transcend the utilitarianism of homo faber, but action also gives meaning to human life by bringing into existence a common world which leads us out of ourselves, relating and separating us at the same time. Arendt called this the condition of ‘plurality’. We share the world with other unique people, and the claim it makes on our affection and attentiveness grows when we exchange different perspectives on it. For us to believe in a common, three-dimensional world more solid and more lasting than our partial and fleeting selves, we need to see that world from a plurality of standpoints: ‘the more peoples there are in the world who stand in some particular relationship with one another, the more world there is to form between them, and the larger and richer that world will be’. This, one might say, is Arendt’s epistemological cubism.
Read the full article in TLS.
What causes concern about immigration
Simon Wren-Lewis, Mainly Macro, 4 February 2020
Immigration from the EU has declined dramatically, which is not surprising, but this has been partly offset by a significant rise in non-EU immigration. Are people really more concerned about EU immigrants than non-EU immigrants?
Roy Greenslade notes that the newspaper articles full of stories of immigration peril have all but disappeared. He writes
‘It was the press phenomenon of the age 10 years ago, and for at least the following six years – right up to the EU referendum. Since then, however, immigration has all but disappeared from newspaper pages.’
Could it be that the explanation for the diminished salience of immigration is the very simple one that it is no longer in the news?
The folklore comes from the fact that the increase in concern about immigration at the turn of the century coincided with the increase in immigration numbers, first from outside the EU and then from the A8 countries joining the EU. However, as I note here, there is a two or three year lag between the initial increase in immigration and public attitudes. The lag is much shorter with a time series for the number of stories in the press about immigration.
This shouldn’t be a surprise. Much of the concern about immigration is in areas that see very few immigrants. If people are getting their information ‘first hand’ from friends or relatives living in areas of high immigration you might expect a relatively short lag between numbers and concern, but if people are getting their information from the media you would require some change in how the media covered this issue before salience changed. Or to put it more crudely, salience to some extent is inevitably going to reflect what is ‘in the news’.
This does not mean salience is completely divorced from what people think. You could fill newspapers with stories about the housing problems of the very wealthy and it is unlikely that housing would start climbing the salience ranking. It is also true that rising immigration numbers helped newspapers write stories of ‘floods’ and ‘waves’. But what it does mean is that if stories about immigration start disappearing, salience will gradually decline.
Read the full article on Mainly Macro.
The birth of modern belief: Faith and judgment
from the Middle Ages to the Enlightenment
David Manning, Reviews in History, 23 March 2020
The Birth of Modern Belief is seriously good. It is erudite, insightful, and cogent; but, above all, it enables us to think hard about the relationship between our past and our present. This is no mean feat in an age when ‘consensual knowledge of the past dwindles in inverse proportion to how much is known in toto’. The form and content of the book may well divide opinion amongst historians. However, for the most part, this will be grist to the mill for a work that argues that ‘in the modern West belief has effectively become a synonym for opinion or judgement: a space of autonomy rather than a prescription for its exercise’. Here, Ethan Shagan the historian has been elevated by Ethan Shagan the essayist.
The book is premised on a professional credo that ‘the purpose of history as a discipline is to explain change over time’. A deceptively simple observation – that belief has a history – forms the basis of an essay on the problem, rather than a comprehensive study of a thing. The issue at hand concerns a demonstrable shift in the character of belief as a function of epistemology: an historical transformation from a pre-modern European condition in which ‘Christians routinely denied that other people’s claims were beliefs at all’ to a modern western situation whereby people ‘believe vastly different things’ whilst being generally accepting of the ‘epistemic status of one another’s beliefs as beliefs’. Here, the essay works to complement Kuhn’s sense of ‘paradigm’ and Foucault’s notion of ‘episteme’ with Shagan’s construct of ‘credulity’, that is to say a ‘matrix of interpretation’ which incorporates those historically contingent ‘spaces or conditions of believing’ that shape ‘religious knowledge and its relationship to other truth claims’. This nondogmatic framework allows for a ‘history of ideas’, which illuminates how religious belief has operated with respect to knowledge and opinion through medieval, confessional, and modern epochs.
Medieval ‘belief mitigated the potential hubris that adhered in every attempt to approach God, not just because it was biblically sanctioned… but because it was amphibious, a kind of knowledge-claim without implying knowledge’. A protean rationalist tradition anchored in the interventions of St Augustine pondered a symbiotic relationship between belief and knowledge. Various strains of polemical apologia tended to consider belief as an alternative to understanding, whilst mystical theology gave rise to the suggestion of John Ruysbroeck (1293–1381) that ‘we should believe the articles of faith, and not desire to understand them’. These abstract concerns were brought into stark relief by the ways in which ‘religious belief was in constant danger of collapsing back into its profane homonym, haunted by the secular knowledge-claims from which it sought to separate itself’. Thomistic scholasticism disaggregated the demonstrable knowledge of ‘science’, the lack of certain judgement of ‘opinion’, and the ‘act of believing’. The object of belief may have been the infallible authority of God but bearing true witness to such a principle was invariably premised upon obedience to the Church. ‘Augustine used the preposition ‘in’ to differentiate ordinary propositional assent from genuine Christian commitment’. To make matters more challenging, medieval theology was far from settled upon whether communicants ought to believe the Church or believe in it. Furthermore, heretics, pagans, Jews, and Muslims could be deemed believers of things, but the objects of their religious beliefs rendered them unbelievers of God in the eyes of Christians. Whilst the ‘medieval category of belief was baggy… the Christian Middle Ages were almost wholly innocent of the notion that ‘belief’ consisted in a person’s individual views on religion’.
Read the full article in Reviews in History.
On the hatred of literature
Jon Baskin, The Point, 26 January 2020
It is no secret that in contemporary America there are many people who hardly read at all, and then another sizable group who, though they keep up with news, sports and the latest fads in self-care or technology, have little interest in serious fiction, poetry or literary commentary. It would be wrong to say such people hate literature, for one has to care about something to truly hate it. What my classmate in the survey course had precociously recognized was that we were being introduced to a phenomenon both subtler and more sinister than the neglect or ignorance of literature. Our professors had a great deal invested in novels and poems; and it was probably even the case that, at some point, they had loved them. But they had convinced themselves that to justify the ‘study’ of literature it was necessary to immunize themselves against this love, and within the profession the highest status went to those for whom admiration and attachment had most fully morphed into their opposites. Their hatred of literature manifested itself in their embrace of theories and methods that downgraded and instrumentalized literary experience, in their moralistic condemnation of the literary works they judged ideologically unsound, and in their attempt to pass on to their students their suspicion of literature’s most powerful imaginative effects.
The lesson was not a new one. Going back to Plato—perhaps the first hater of literature on record—philosophers and religious authorities have attacked art for the same reasons our professors taught us to deconstruct and distrust it: because it is unpredictable, unreasonable and often inconsistent with their preferred politics or morality. It was also a lesson that was destined, in the years that followed, to seep off campus. Even as New Historicism fell out of fashion in literary studies—along with the broader postmodern notion of ‘critique’ that had produced it—the students it had trained were taking up positions in the public intellectual magazines and book reviews, where they now preside over the gradual disappearance of a distinctively literary mode of criticism: a criticism, that is, that attends to matters of form, style and character, that takes aesthetic experience seriously, and that appreciates the emotions inspired by an artwork as fully as, and as constitutive of, its politics. To the extent that this disappearance has gone unremarked, it is because the hatred of literature, though it remains almost unheard of among the general reading public, has become the default mode in the upper reaches of our literary culture. As was the case in my college survey course, the highest honors go to the most eloquent haters.
Read the full article in The Point.
A hidden war threatens Ethiopia’s transition to democracy
Economist, 19 March 2020
Arrests and summary executions have become commonplace in the far-flung reaches of Oromia, Ethiopia’s largest region. The Ethiopian security forces are waging war on armed Oromo separatists. They are also treating civilians brutally. Accounts by witnesses suggest there is indiscriminate repression of local dissent in a country supposedly on the path from one-party rule towards democracy.
This was not what Ethiopians expected from Abiy Ahmed, who became prime minister in 2018. He was a young reformer from Oromia. He promised democracy for all and redress for what Oromos claim is centuries of political and economic marginalisation. Abiy freed thousands of political prisoners and welcomed rebel groups back from exile to contest elections, now scheduled for August.
Abiy made peace with neighbouring Eritrea, for which he won the Nobel Peace Prize, as well as with rebel groups including the Oromo Liberation Front (OLF), which is now an opposition party. The group’s armed wing, the Oromo Liberation Army (OLA), agreed to put down its guns; in return its soldiers were to join Oromia’s police. Many hoped to see the end of an insurgency that began almost 50 years ago.
But the social fractures that lifted Abiy to high office continue to divide Ethiopia. Years of unrest in Oromo areas have weakened local government and left a security vacuum. In Wollega (to the west) and Guji (in the south) returning rebels stepped into the breach, sometimes working with the police to enforce order. But they soon began accusing the government of betraying the Oromo cause and reneging on promises to give them jobs in the police. The government, in turn, accused the OLA of keeping its weapons. The details of the peace deal were never disclosed, making it easier for both sides to accuse the other of failing to honour it.
By the end of 2018 the rebels had returned to the forests and were murdering officials and attacking army convoys. In 2019 the air force was reportedly bombing OLA training camps. After a third peace deal flopped in 2019 the OLF formally split from its armed wing (though they are thought to keep covert lines of communication). The government, in effect, declared a state of emergency in Wollega and Guji, with the army in charge of security. By the start of 2020 fighting in Guji had forced some 80,000 people from their homes.
Read the full article in the Economist.
To ‘review’ such supreme paintings is slightly absurd:
Titian at the National Gallery reviewed
Martin Gayford, Spectator, 21 March 2020
In 1576 Venice was gripped by plague. The island of the Lazzaretto Vecchio, on which the afflicted were crammed three to a bed, was compared to hell itself. In the midst of this horror Tiziano Vecellio, the greatest painter in Europe, died — apparently of something else.
He was in his eighties and working, it seems, almost to the end. Titian: Love, Desire, Death, which was briefly on at the National Gallery, before it was closed down this week by our own plague, contained several of the greatest masterpieces of his old age — and also of European art. It comprises just seven canvases, all done for Philip II of Spain — a villain of English history, the man who launched the Armada, but as far as Titian was concerned his most discerning patron. Philip was an avid persecutor of heretics but happy to give his favourite artist a free hand. He was rewarded by some of Titian’s most audacious work.
To ‘review’ such supreme paintings is slightly absurd. These are the touchstones from which Rubens, Velazquez and Rembrandt learnt and their successors still do. Van Dyck actually owned ‘Perseus and Andromeda’; Lucian Freud confessed that he, too, would have liked to have had one of these Titians on his wall. He couldn’t choose between ‘Diana and Actaeon’ and ‘Diana and Callisto’, which he considered jointly ‘simply the most beautiful pictures in the world’.
Frank Bowling, a contemporary master of abstraction, returns again and again to ‘The Death of Actaeon’ — in part because he finds it so modern. ‘There’s something amazing in the stirring up of the paint,’ he told me. ‘It just comes across at you — whoosh! — like a De Kooning.’
That’s absolutely right. These late pictures are executed with a looseness and freedom that startled 16th-century viewers such as Giorgio Vasari, who found Titian’s late manner ‘judicious, beautiful, and astonishing’. That was spot-on too, including the point about judiciousness.
Read the full article in the Spectator.
The real story of the birth of immigration controls
in the UK is eerily familiar
David Glover, Independent, 13 March 2020
In our near-Brexit condition, it seems almost impossible to think of Britain as a nation state without confronting the question of who has the authority to control its borders and how this can best be done.
Yet the modern practice of policing immigration is little more than a century old and can be dated very precisely: 1 January 1906. On that day, a new law was brought into operation which laid down the conditions of entry for any foreigner wishing to live and work in the UK and put the power to decide into the hands of an immigration officer, a state functionary that had never existed before.
According to the 1905 Aliens Act, those seeking admission could only apply at one of 14 named ports and an ‘alien immigrant’ was legally defined as a person who travelled on a steerage-class ticket, someone unable to afford a cabin. As soon as they stepped ashore, migrants were required to queue for a complex assessment that determined whether they were entitled to stay, sifting out ‘decent’ from ‘undesirable aliens’ according to the results of health checks (including signs of insanity or criminality), proof of financial support, the likelihood of finding a job, and access to accommodation.
Then, as now, the government could arrange for the deportation of ‘aliens’ who committed crimes while in Britain. But, unlike today, it was first necessary for a judge to recommend this course of action during sentencing and the Home Office would only endorse his advice after having carefully considered such factors as the amount of time the guilty person had lived in the country and the hardship that expulsion might cause in each individual case.
Passed into law during the final days of a Conservative administration, but attacked and amended by the Liberal opposition, the 1905 Aliens Act satisfied no one. Among those who were most disappointed were those on the far right of the Tory party who had championed immigration control and who saw their ‘anti-alien crusade’ in starkly racial terms.
Conservative politicians like Major William Evans-Gordon, the then Stepney MP, had few qualms about joining with the grassroots extra-parliamentary British Brothers’ League in January 1902 when it held its ‘Great Public Demonstration’ against alien immigration at the People’s Palace on Mile End Road, where the air was thick with cries of ‘Down with them’, ‘Wipe them out’, and ‘Jews!’
Read the full article in the Independent.
Local bookstores have a new weapon
in the fight with Amazon
Joan Verdon, Forbes, 14 February 2020
In the book industry, Amazon is Goliath, the giant who overshadows everyone else. But there’s a new David on the scene, Bookshop.org.
It doesn’t expect to topple the giant, but it has launched a weapon that could make Amazon’s shadow a little smaller, and help local bookstores fight back.
Bookshop.org, a website that went live at the end of January and is still in beta mode, is designed to be an alternative to Amazon, and to generate income for independent bookstores. And, perhaps more importantly, it seeks to give book reviewers, bloggers and publications who rely on affiliate income from ‘Buy now’ links to Amazon a different option.
Profit from books sold through Bookshop will be split three ways, with 10% of the sale price going into a pool that will be divided among participating bookstores, 10% going to the publication that triggered the sale by linking to Bookshop.org, and 10% going to Bookshop.org to support its operations.
Bookshop’s 10% commission for affiliate publications is roughly twice Amazon’s 4.5% affiliate commission.
Over 200 independent bookstores already have signed up to participate, and Bookshop has the backing of the American Booksellers Association (ABA).
‘We believe that there are consumers who shop online and would choose to support indie bookstores if there were a visible and convenient alternative to Amazon and others,’ the ABA said when it announced its partnership with Bookshop last month.
Bookshop is the brainchild of Andy Hunter. He is well-known in the independent bookstore community, with his background as a book publisher (Catapult, Counterpoint, Soft Skull Press); publisher of online literary sites (Literary Hub, CrimeReads, Book Marks); and as the founder of digital publisher Electric Literature.
Hunter said he started Bookshop because ‘I became more and more worried about what the future was going to look like if Amazon achieved total market dominance.’
John Warner, who writes about books for the Chicago Tribune, used a Star Wars analogy and cast Bookshop as the ‘rebel alliance’ preparing to stand up against the Amazon empire, with Hunter in the role of Princess Leia, leading the charge.
Hunter, however, says blowing up the Amazon empire is not his goal. Instead, he is aiming to divert some of its sales to helping independent stores.
‘We don’t have to beat Amazon for this to succeed,’ he said. ‘All we have to do is get a very small number of socially conscious consumers to choose Bookshop instead of Amazon.’
Read the full article in Forbes.
The logic of the rebel:
On Simone Weil and Albert Camus
Robert Zaretsky, LA Review of Books, 7 March 2020
Soon after, though, in 1948, Camus discovered Weil’s manuscripts. The impact was seismic not because he found something new and strange in her writings, but because he found something old and shared. Not only was the public’s understanding of Weil forever changed, but so too was Camus’s understanding of himself. It was not that he changed. Instead, he became more fully Camus.
Beginning in 1949, with the publication of L’Enracinement (The Need for Roots), Camus edited seven of Weil’s books. In these works, Camus found a thinker as taken — as possessed, really — by ancient Greece as he was. His tragic sense of life — whether embodied by Sisyphus or Doctor Rieux, the narrator of The Plague — was reinforced by Weil’s reflections on ancient Greece, particularly her remarkable essay ‘The Iliad or the Poem of Force.’ The rare and ‘luminous moments’ undergone by Homer’s heroes who face the relentless pounding of force are recreated in the scene when Rieux and his friend Tarrou go for a silent swim together while the plague rages in Oran.
Similarly, Camus’s visceral knowledge of suffering — he was raised by a deaf and mostly mute mother who worked as a housecleaner — was sharpened by Weil’s notion of le malheur, or affliction. Among the writings Camus published was Weil’s ‘Factory Journal,’ in which she records her experience, one that nearly killed her, of working in three different factories over the course of a year. As she concluded about such work, you kill yourself ‘with nothing at all to show for it […] that corresponds to the effort you put out. In that situation, you really feel you are a slave, humiliated to the very depths of your being.’ Weil explored the distinction between suffering and affliction in her essay ‘Human Personality,’ which Camus not only published but from which he also copied long passages in his notebooks.
Describing yet another manuscript of Weil’s that he published, La condition ouvrière (The Worker’s Condition), Camus insisted that she revealed in a way no one ever had before the nightmarish lives of industrial workers: ‘It is essential that the suffering of the worker, a state which dishonors our civilization, be repaired immediately.’ In his Nobel Prize address, Camus channeled Weil’s horror at the degradation of human beings — their transformation into things — in the workplace and public sphere. By becoming a writer, he declared, one becomes responsible for others. It was the writer’s duty to speak on behalf of the silenced men and women who are subjected to ‘unending misery.’
In a telling (though poorly documented) account, Camus fled to the apartment of Weil’s mother, Selma Weil — with whom he worked closely in his editing of Simone Weil’s manuscripts — in order to ‘gather his thoughts’ upon receiving news about the Nobel Prize. In an equally telling (and well documented) account, Camus made mention in Stockholm of his intimate ties to Weil. When asked by a reporter which writers he felt closest to, Camus named the poet René Char and Simone Weil. When the journalist observed that Weil was dead, Camus replied that death never comes between true friends.
Perhaps the deepest intellectual mark left by this friendship is found in what Camus called his ‘cycle of rebellion.’ When he first encountered Weil, Camus was trying to complete this particular cycle. Like his earlier ‘cycle of absurdity’ — formed by The Stranger, Caligula, and The Myth of Sisyphus — this new cycle contained a novel, a play, and a philosophical essay. He had already published the novel — The Plague — and was about to stage the play, The Just Assassins. The essay, though, was another story. Camus was struggling, riddling his notebooks and letters with expressions of doubt and despair over whether he would ever finish the book. And if he did finish it, he kept asking himself if it would ever amount to anything at all.
Read the full article in the LA Review of Books.
American dirty tricks
Sarah Ditum, The Critic, March 2020
Having been primed to expect an orgy of incompetence and sensationalism, reading American Dirt itself is disappointing. Some of its plot elements are silly, and the characters devised to convey them broadly sketched. Some of the lines are a little ripe. I confess I did not feel pleasure when I read: ‘Lydia feels like a cracked egg, and she doesn’t know if she’s the shell or the yolk or the white. She is scrambled.’
But I have read worse crimes against the English language. Torture porn is also wide of the mark for a novel that conspicuously puts its violence off the page, focusing instead on the effects of trauma (hence that bizarre egg metaphor). American Dirt is, in other words, a decent work of popular fiction. And yet, it is impossible to defend it on those terms — because in the process of writing the novel, Cummins accepted entirely the logic that her critics later brought to bear in trashing the book. All of her assertions about her right to write are founded in exactly the same terms that would later be used against her.
The problem with Cummins’s allusion to her husband being an undocumented migrant is not that, being Irish, his experiences are unrelated to those of Latina migrants: there is, surely, an underlying insecurity held in common. The problem is that she felt she had to obscure his Irishness to make him a convincing part of her defence, because she had accepted that only Mexicans can imagine Mexico. American Dirt even includes a distancing caricature of the well-meaning white woman, which (had more of Cummins’s critics bothered to read the novel) could have been effectively turned back on the author. Lydia and her son are sheltered by a friend of her husband and his American missionary wife, who makes a ‘proprietary’ show of her grief and then objects to giving any practical help. She is the definition of the ‘drive-by Samaritan’ that Cummins so feared being taken for.
What breaks American Dirt isn’t a carelessness for identity politics but a clumsy hypersensitivity to them. It is a novel that struggles under its author’s lack of confidence in her medium. Observation and experience are fundamental to the craft of fiction, but ultimately the job of the novelist is to make stuff up, and American Dirt is best in the set pieces that rely on pure literary invention, like Cummins describing the death-grazing lurch of boarding a moving train, or her depiction of the gradual processes by which Lydia and her son psychologically transition from being ‘normal people’ to being migrants. Where she heavily trails her research or layers on the moral lessons, it creaks.
Read the full article in The Critic.
What is the geometry of the universe?
Erica Klarreich & Lucy Reading-Ikanda,
Quanta, 16 March 2020
When you gaze out at the night sky, space seems to extend forever in all directions. That’s our mental model for the universe, but it’s not necessarily correct. There was a time, after all, when everyone thought the Earth was flat, because our planet’s curvature was too subtle to detect and a spherical Earth was unfathomable.
Today, we know the Earth is shaped like a sphere. But most of us give little thought to the shape of the universe. Just as the sphere offered an alternative to a flat Earth, other three-dimensional shapes offer alternatives to ‘ordinary’ infinite space.
We can ask two separate but interrelated questions about the shape of the universe. One is about its geometry: the fine-grained local measurements of things like angles and areas. The other is about its topology: how these local pieces are stitched together into an overarching shape.
Cosmological evidence suggests that the part of the universe we can see is smooth and homogeneous, at least approximately. The local fabric of space looks much the same at every point and in every direction. Only three geometries fit this description: flat, spherical and hyperbolic. Let’s explore these geometries, some topological considerations, and what the cosmological evidence says about which shapes best describe our universe.
Flat Geometry
This is the geometry we learned in school. The angles of a triangle add up to 180 degrees, and the area of a circle is πr². The simplest example of a flat three-dimensional shape is ordinary infinite space — what mathematicians call Euclidean space — but there are other flat shapes to consider too.
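(An aside not in the Quanta article, for readers who like a formula: on a surface of constant curvature K, a triangle bounded by ‘straight lines’ (geodesics) and enclosing area A has angles α, β and γ satisfying

α + β + γ = π + K·A

so the angle sum is exactly 180 degrees only when K = 0, comes out larger than 180 degrees on a sphere (K > 0), and smaller in hyperbolic space (K < 0). This is the sort of fine-grained local measurement that, in principle, tells the three geometries apart.)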
These shapes are harder to visualize, but we can build some intuition by thinking in two dimensions instead of three. In addition to the ordinary Euclidean plane, we can create other flat shapes by cutting out some piece of the plane and taping its edges together. For instance, suppose we cut out a rectangular piece of paper and tape its opposite edges. Taping the top and bottom edges gives us a cylinder. Next, we can tape the right and left edges to get a doughnut (what mathematicians call a torus).
Now, you might be thinking, ‘This doesn’t look flat to me.’ And you’d be right. We cheated a bit in describing how the flat torus works. If you actually tried to make a torus out of a sheet of paper in this way, you’d run into difficulties. Making the cylinder would be easy, but taping the ends of the cylinder wouldn’t work: The paper would crumple along the inner circle of the torus, and it wouldn’t stretch far enough along the outer circle. You’d have to use some stretchy material instead of paper. But this stretching distorts lengths and angles, changing the geometry.
Inside ordinary three-dimensional space, there’s no way to build an actual, smooth physical torus from flat material without distorting the flat geometry. But we can reason abstractly about what it would feel like to live inside a flat torus.
Imagine you’re a two-dimensional creature whose universe is a flat torus. Since the geometry of this universe comes from a flat piece of paper, all the geometric facts we’re used to are the same as usual, at least on a small scale: Angles in a triangle sum to 180 degrees, and so on. But the changes we’ve made to the global topology by cutting and taping mean that the experience of living in the torus will feel very different from what we’re used to.
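To make the wrap-around concrete, here is a minimal sketch in Python (mine, not Quanta’s; a unit-square flat torus is assumed purely for illustration). Locally everything is ordinary Euclidean geometry, but the shortest route between two points may leave through one edge of the square and re-enter through the opposite edge:

# Distances on a flat torus made by identifying opposite edges of a 1 x 1 square.
# Locally the geometry is Euclidean; globally, paths can wrap around the edges.
def torus_distance(p, q, width=1.0, height=1.0):
    """Shortest distance between points p and q on the flat torus."""
    dx = abs(p[0] - q[0])
    dy = abs(p[1] - q[1])
    dx = min(dx, width - dx)    # going the other way round may be shorter
    dy = min(dy, height - dy)
    return (dx * dx + dy * dy) ** 0.5

# Two inhabitants sitting near opposite edges of the square are actually neighbours:
print(torus_distance((0.05, 0.5), (0.95, 0.5)))  # roughly 0.1, not 0.9

A two-dimensional creature living on the torus would notice exactly this: a path heading out through the right-hand edge comes back in from the left, so something that looks far away on the unrolled sheet of paper can in fact be close by.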
Read the full article in Quanta.
The dark side of the Italian Renaissance
Charles Nicholl, Guardian, 12 March 2020
The conquest of the Romagna by the pope’s warlord son, Cesare Borgia, was swift and vicious. It was closely observed by a Florentine diplomat, Niccolò Machiavelli, who made it a case study in his book The Prince, written about 1512 and published posthumously 20 years later. This laconic breviary of amoral realpolitik – memorably described by Bertrand Russell as a ‘handbook for gangsters’ – is the keynote text of these wartorn years. The impact of the Lutheran Reformation and the retrenchment of the Counter-Reformation add further dimensions of religious ideology to the conflicts.
Fletcher navigates this difficult terrain with great skill. She creates atmosphere and drama without any surrendering of clarity. Those with only a vague knowledge of the League of Cambrai or the Council of Trent will find them crisply explained and contextualised. She also has some trenchant chapters on the sexual politics of the era – as evidenced in the enforced seclusion of women in convents, and the glorification of rape in the pornographic poetry of Pietro Aretino and his followers.
This is a powerful book, but it is also one with an argument or agenda to pursue, and in this aspect it is less satisfactory. The argument is signalled by Fletcher’s ambitious subtitle, which promises us an alternative history of the Italian Renaissance. The orthodoxy she challenges is the view of the Renaissance as an unbroken vista of exquisite art and aspirational inventiveness: a new dawn that flooded the superstitious murk of medievalism with the bright light of reason. As she points out, this somewhat utopian view was essentially a 19th-century invention, formulated in works such as Jacob Burckhardt’s The Civilization of the Renaissance in Italy (1860) and endorsed by eloquent Victorian aesthetes such as Walter Pater. For the majority of those who actually lived and worked in this supposed golden age, the reality was grim: they knew it as a time of terror more than beauty. In his Art of War (1521) Machiavelli argued that the cultivation of courtly art was a weakness. Looking back to the first French invasion of 1494, he criticised the complacency – the ‘splendour and deceit’ – of the Italian princes. ‘They were preparing themselves to be the prey of whoever assaulted them,’ he wrote, and from this arose ‘great terrors, sudden flights and miraculous losses’.
Burckhardt’s idealistic view of the Renaissance has long since been challenged – the Marxist historian Arnold Hauser dismissed it as an anachronistic attempt to ‘provide a genealogy’ for 19th-century liberalism – but in Fletcher’s view it remains entrenched as a convenient cliche, and a whole lorryload of guidebooks to Italy could be cited to confirm she is right.
So the argument of her ‘alternative history’ is that this celebratory rhetoric ‘masks the brutal realities’ of war, corruption, oppression and misogyny that are the facts of life in Renaissance Italy. So far, so good. But is it also true, as she asserts, that the great artistic achievements of the period are in themselves complicit in these brutalities and injustices? Her key example, heavily flagged up in the publisher’s promotion, concerns that most iconic of Renaissance artworks, The Mona Lisa. Here is the pitch: the woman portrayed in the painting, Lisa Gherardini, ‘was married to a slave trader’.
Read the full article in the Guardian.
‘Dead Sea Scrolls’ at the Museum of the Bible
are all forgeries
Michael Greshko, National Geographic, 13 March 2020
On the fourth floor of the Museum of the Bible, a sweeping permanent exhibit tells the story of how the ancient scripture became the world’s most popular book. A warmly lit sanctum at the exhibit’s heart reveals some of the museum’s most prized possessions: fragments of the Dead Sea Scrolls, ancient texts that include the oldest known surviving copies of the Hebrew Bible.
But now, the Washington, D.C. museum has confirmed a bitter truth about the fragments’ authenticity. On Friday, independent researchers funded by the Museum of the Bible announced that all 16 of the museum’s Dead Sea Scroll fragments are modern forgeries that duped outside collectors, the museum’s founder, and some of the world’s leading biblical scholars. Officials unveiled the findings at an academic conference hosted by the museum.
‘The Museum of the Bible is trying to be as transparent as possible,’ says CEO Harry Hargrave. ‘We’re victims—we’re victims of misrepresentation, we’re victims of fraud.’
In a report spanning more than 200 pages, a team of researchers led by art fraud investigator Colette Loll found that while the pieces are probably made of ancient leather, they were inked in modern times and modified to resemble real Dead Sea Scrolls. ‘These fragments were manipulated with the intent to deceive,’ Loll says.
The new findings don’t cast doubt on the 100,000 real Dead Sea Scroll fragments, most of which lie in the Shrine of the Book, part of the Israel Museum, Jerusalem. However, the report’s findings raise grave questions about the ‘post-2002’ Dead Sea Scroll fragments, a group of some 70 snippets of biblical text that entered the antiquities market in the 2000s. Even before the new report, some scholars believed that most to all of the post-2002 fragments were modern fakes.
‘Once one or two of the fragments were fake, you know all of them probably are, because they come from the same sources, and they look basically the same,’ says Årstein Justnes, a researcher at Norway’s University of Agder whose Lying Pen of Scribes project tracks the post-2002 fragments.
Read the full article in National Geographic.
An ‘Anthem’ for our troubled times:
Leonard Cohen’s music stirs a range of emotions
Monobina Gupta, The Wire, 21 March 2020
It took Leonard Cohen a decade to compose his famous song, ‘Anthem’ – a song that becomes a beam of light, even if a sliver, during dark passages of time. The verses, carrying within them a redemptive strain, remain meaningful in the troubled times we pass through, finding particular resonance in contemporary India.
Last Monday, human and civil rights activist Gautam Navlakha reminded us of Cohen’s song. ‘Do please listen to Leonard Cohen sing the ‘Anthem’ and remember to: Ring the Bell/ Which still can ring/ Forget your perfect/ Offering/ There is a crack/ A crack in everything/ That’s how light gets in,’ wrote Navlakha in a statement after the Supreme Court turned down the anticipatory bail petitions he had filed along with well-known academic, Anand Teltumbde, in the Bhima Koregaon case.
The song’s verses speak to us in different and relevant ways: ‘I can’t run no more/ With that lawless crowd/ While the killers in high places/ Say their prayers out loud/ But they’ve summoned, they’ve summoned up/ A thundercloud/ And they’re going to hear from me.’
It’s difficult to find easy binaries – with us or against us – in much of Cohen’s music. Moving between shadow and light, his music is about nuances. Cohen’s songs don’t necessarily rile up passions. They sometimes prod people into recalling values that are fast dying: empathy, compassion, tolerance, freedom. In a world as narrowly fragmented and narcissistic as ours, ‘Anthem’ is a reminder of all that we are losing. Or perhaps have lost already. Cohen’s lyrics speak to us not only about the sheer despair of the situation at hand, but of the small chinks in that seemingly impregnable armour of darkness. ‘There is a crack in everything/ That’s how the light gets in.’
At home, Navlakha’s invocation of Cohen stirs a range of emotions. Among the many disturbing developments that have, since 2014, changed India beyond recognition, the recent violence in Delhi, and now the impending outbreak of the coronavirus, deepen the uncertainty of our times, and of life itself. Situated on different registers, they reveal the brokenness of things, the stigma of isolation, and the rampant inequities bolstered by religious affiliation and class position. Though the nature of these ‘diseases’ differs, we find running through them a common strain of human indifference.
Read the full article in The Wire.
Women refugee photographers who changed
how post-war Britain saw itself
John March, OpenDemocracy, 18 March 2020
The latest Four Corners exhibition, ‘Another Eye’, which was planning to run until 2 May, celebrates the lives and work of the two dozen women photographers who sought refuge in Britain from Nazi Europe after 1933. (See below for the latest information on the exhibition). The scope of the exhibition is broad, and tells a series of interlocking stories. The historical backcloth is how Germany and Austria were at the forefront of photography in the 1920s and 1930s, and how photographs were used for mass communication. Alongside this the exhibition tells the stories of forced exile and how these highly skilled women photographers escaped and re-established themselves in Britain. The exhibition also shows the very diverse range of work that the women were involved in once in Britain – portraits, fashion, social documentary, book illustrations, educational photographs of artworks and adverts.
One intriguing aspect of the impact made by these women photographers was the way in which, as outsiders, they managed to capture features of British life, work and rituals, and, through their photographs, showed Britain back to the British public in new and fresh ways.
The origins of this very British story begin elsewhere, mainly in Berlin and Vienna, where the majority of the refugees came from. It was there, especially after the First World War, that photography developed rapidly, along with the opportunity for a career working in it. New portable cameras, such as the Leica, capable of ‘candid’ and ‘street photography’ since they could be used spontaneously and go close up to the subject, came onto the market in the mid-1920s. Training institutions such as the Bauhaus, the (women-only) Lette Verein in Berlin, and the Graphical Research and Teaching Institute in Vienna, all offered advanced photography courses and training. Meanwhile the illustrated press boomed as an industry, with a wide range of local and national magazines, together with illustrated pull-out sections in local and national daily newspapers. Alongside these positive factors propelling German and Austrian photography forward, modernist perspectives being projected in architecture, design and the arts found their way into photography. These fresh approaches to making photographs, and to experimenting with how they could be used, later made their way to Britain with the émigrés.
It was little surprise that middle-class young women, who had been given more civic rights in the post-war settlement and were entering the workforce in greater numbers than ever before, should choose photography as a career. Those women photographers forced into exile typically came from middle-class assimilated Jewish homes where they were often supported by their families in their career choices. Given their family heritage, when the Nazi regime came to power in Germany in 1933 and Austria in 1938, these highly trained women fled the persecution and some two dozen of them arrived in Britain.
Once in Britain, the women photographers who had established careers in their country of origin worked energetically to re-establish themselves quickly in what turned out to be their new home. A good example of this is the way in which Gerty Simon, a portraitist of some standing in Berlin, managed to mount two solo portrait exhibitions within two years of arriving in London, with celebrity sitters from the world of society, politics and the arts. The British press gave positive reviews and signalled that a new and fresh talent was on the scene. Portraits of rising stars in politics and the arts showed a boyish Aneurin Bevan in a relaxed pose, and an elegant Kenneth Clark, contemplative, standing in front of a favourite painting.
Read the full article in OpenDemocracy.
The images are, from top down: a coronavirus (Alfred Pasieka/Science Photo Library); ‘Immigrants’, created by students of St. Luke School at Colossi, Limassol, and winner of the 2016 Saatchi Gallery/Deutsche Bank Art Prize for Schools; portrait of Hannah Arendt by Fred Stein; ‘Bacchus and Ariadne’ by Titian; cosmic background radiation as seen by the satellite Planck (© ESA/Planck Collaboration).