The latest (somewhat random) collection of essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
Why we fail to prepare for disasters
Tim Harford, Financial Times, 16 April 2020
You can’t say that nobody saw it coming.
For years, people had warned that New Orleans was vulnerable. The Houston Chronicle reported that 250,000 people would be stranded if a major hurricane struck, with the low-lying city left 20ft underwater.
New Orleans’s Times-Picayune noted the inadequacy of the levees. In 2004, National Geographic vividly described a scenario in which 50,000 people drowned. The Red Cross feared a similar death toll.
Even Fema, the Federal Emergency Management Agency, was alert: in 2001, it had stated that a major hurricane hitting New Orleans was one of the three likeliest catastrophes facing the United States.
Now the disaster scenario was becoming a reality. A 140mph hurricane was heading directly towards the city. More than a million residents were warned to evacuate. USA Today warned of ‘a modern Atlantis’, explaining that the hurricane ‘could overwhelm New Orleans with up to 20ft of filthy, chemical-polluted water’.
The city’s mayor, Ray Nagin, begged people to get away. He was reluctant to make evacuation mandatory because more than 100,000 people had no cars and no way of leaving. The roads out were jammed, anyway. Thousands of visiting conference delegates were stranded; the airport had been closed.
There were no emergency shelters. Nagin mooted using a local stadium, the Louisiana Superdome, as a temporary refuge — but the Superdome was not necessarily hurricane-proof and Nagin was warned that it wasn’t equipped to be a shelter.
But then, the storm turned aside. It was September 2004, and New Orleans had been spared. Hurricane Ivan had provided the city, and the nation, with a vivid warning. It had demonstrated the need to prepare, urgently and on a dozen different fronts, for the next hurricane.
‘In early 2005, emergency officials were under no illusions about the risks New Orleans faced,’ explain Howard Kunreuther and Robert Meyer in their book The Ostrich Paradox. But the authorities did not act swiftly or decisively enough.
Eleven months later, Hurricane Katrina drowned the city — and many hundreds of its residents. As predicted, citizens had been unable or unwilling to leave; levees had been breached in over 50 places; the Superdome had been an inadequate shelter.
Surely, with such a clear warning, New Orleans should have been better prepared to withstand Hurricane Katrina? It’s easily said. But as the new coronavirus sweeps the globe, killing thousands more people every day, we are now realising that New Orleans is not the only place that did not prepare for a predictable catastrophe.
Read the full article in the Financial Times.
Models v evidence
Jonathan Fuller, Boston Review, 5 May 2020
In one camp are infectious disease epidemiologists, who work very closely with institutions of public health. They have used a multitude of models to create virtual worlds in which sim viruses wash over sim populations—sometimes unabated, sometimes held back by a virtual dam of social interventions. This deluge of simulated outcomes played a significant role in leading government actors to shut borders as well as doors to schools and businesses. But the hypothetical curves are smooth, while real-world data are rough. Some detractors have questioned whether we have good evidence for the assumptions the models rely on, and even the necessity of the dramatic steps taken to curb the pandemic. Among this camp are several clinical epidemiologists, who typically provide guidance for clinical practice—regarding, for example, the effectiveness of medical interventions—rather than public health.
The latter camp has won significant media attention in recent weeks. Bill Gates—whose foundation funds the research behind the most visible outbreak model in the United States, developed by the Institute for Health Metrics and Evaluation (IHME) at the University of Washington—worries that COVID-19 might be a ‘once-in-a-century pandemic.’ A notable detractor from this view is Stanford’s John Ioannidis, a clinical epidemiologist, meta-researcher, and reliable skeptic who has openly wondered whether the coronavirus pandemic might rather be a ‘once-in-a-century evidence fiasco.’ He argues that better data are needed to justify the drastic measures undertaken to contain the pandemic in the United States and elsewhere…
In the other corner, Harvard’s Marc Lipsitch, an infectious disease epidemiologist, agrees that we lack good data in many respects. Countering Ioannidis’s hesitation, however, Lipsitch responds: ‘We know enough to act; indeed, there is an imperative to act strongly and swiftly.’ According to this argument, we could not afford to wait for better data when the consequences of delaying action are disastrous, and did have reason enough to act decisively.
Public health epidemiologists and clinical epidemiologists have overlapping methods and expertise; they all seek to improve health by studying populations. Yet to some extent, public health epidemiology and clinical epidemiology are distinct traditions in health care, competing philosophies of scientific knowledge. Public health epidemiology, including infectious disease epidemiology, tends to embrace theory and diversity of data; it is methodologically liberal and pragmatic. Clinical epidemiology, by contrast, tends to champion evidence and quality of data; it is comparatively more methodologically conservative and skeptical. (There is currently a movement in public health epidemiology that is in some ways closer to the clinical epidemiology philosophy, but I won’t discuss it here.)
To be clear, these comparisons are fair only writ large; they describe disciplinary orthodoxy as a whole rather than the work of any given epidemiologist. Still, it is possible to discern two distinct philosophies in epidemiology, and both have something to offer in the coronavirus crisis over models and evidence. A deeper understanding of modeling and evidence is the key not only to reconciling these divergent scientific mindsets but also to resolving the crisis.
Read the full article in the Boston Review.
Profile of a killer: the complex biology powering the coronavirus pandemic
David Cyranoski, Nature, 4 May 2020
Now, as the death toll from the COVID-19 pandemic surges, researchers are scrambling to uncover as much as possible about the biology of the latest coronavirus, named SARS-CoV-2. A profile of the killer is already emerging. Scientists are learning that the virus has evolved an array of adaptations that make it much more lethal than the other coronaviruses humanity has met so far. Unlike close relatives, SARS-CoV-2 can readily attack human cells at multiple points, with the lungs and the throat being the main targets. Once inside the body, the virus makes use of a diverse arsenal of dangerous molecules. And genetic evidence suggests that it has been hiding out in nature possibly for decades.
But there are many crucial unknowns about this virus, including how exactly it kills, whether it will evolve into something more — or less — lethal and what it can reveal about the next outbreak from the coronavirus family.
‘There will be more, either out there already or in the making,’ says Andrew Rambaut, who studies viral evolution at the University of Edinburgh, UK.
Of the viruses that attack humans, coronaviruses are big. At 125 nanometres in diameter, they are also relatively large for the viruses that use RNA to replicate, the group that accounts for most newly emerging diseases. But coronaviruses really stand out for their genomes. With 30,000 genetic bases, coronaviruses have the largest genomes of all RNA viruses. Their genomes are more than three times as big as those of HIV and hepatitis C, and more than twice influenza’s.
Coronaviruses are also one of the few RNA viruses with a genomic proofreading mechanism — which keeps the virus from accumulating mutations that could weaken it. That ability might be why common antivirals such as ribavirin, which can thwart viruses such as hepatitis C, have failed to subdue SARS-CoV-2. The drugs weaken viruses by inducing mutations. But in the coronaviruses, the proofreader can weed out those changes.
Mutations can have their advantages for viruses. Influenza mutates up to three times more often than coronaviruses do, a pace that enables it to evolve quickly and sidestep vaccines. But coronaviruses have a special trick that gives them a deadly dynamism: they frequently recombine, swapping chunks of their RNA with other coronaviruses. Typically, this is a meaningless trading of like parts between like viruses. But when two distant coronavirus relatives end up in the same cell, recombination can lead to formidable versions that infect new cell types and jump to other species, says Rambaut.
Recombination happens often in bats, which carry 61 viruses known to infect humans; some species harbour as many as 121. In most cases, the viruses don’t harm the bats, and there are several theories about why bats’ immune systems can cope with these invaders. A paper published in February argues that bat cells infected by viruses rapidly release a signal that makes them able to host the virus without killing it.
Read the full article in Nature.
COVID-19 and the basics of democratic governance
Dave Archard & Hugh Whittall, Nuffield Council on Bioethics, 25 April 2020
The UK Government’s communication with the public has been admirably clear and simple: stay home. But it has been one-dimensional and one directional, whilst the challenges presented by COVID-19 are multiple, and they are far from simple.
They entail ethical questions about how we balance different interests (e.g. individual and collective; economic and social) and different risks (e.g. of COVID infection, and of poor health associated with poverty and isolation); of what and who we should prioritise when it comes to the crunch (e.g. COVID-19 over other health needs; the young, the elderly or key workers?); about who bears responsibilities for supporting those in need (Government, industry, communities, individuals); about whether we have not only national, but also international responsibilities; about how privacy will be protected when contact-tracing apps get up and running, as Matt Hancock has said they will, very soon; about the implications of mass testing for disease or immunity – what is the validity of the tests; who gets an ‘immunity certificate’, and where does that leave the rest of us?
These are critically important issues that affect many people – indeed everybody – in many ways and we need to talk about them, together. And yet the Westminster Government does not seem to want to engage or take on board other views on any of these issues; nor is it evident that they are thinking about them, or taking advice on them from a social and ethical perspective. ‘We are following the science’ is the supposedly reassuring message. But following the science is not politically or morally neutral. Every scientist will tell you that science does not provide certainty (and is usually contested); and it does not deliver policy answers – that involves values and judgements for which people are responsible and should be scrutinised, and accountable. Which values are in play and what judgements are being made? By whom? On what advice? When senior political advisors join expert advisory groups such as SAGE, what is being brought into and taken away from those discussions? We understand that there is now an ethical advisory group for the app development, but public information about this is limited and obscure.
This is not merely a matter of curiosity. It is a matter of fundamental democratic accountability. Decisions are being made and are due to be made that go to the very heart of what governments are there to do: to protect the freedom and well-being of their people. But they must do so openly, transparently, and accountably, especially where those decisions impinge on precisely that freedom or aspects of well-being. Democratic governments must be subjected to public debate and challenge. The fact of an emergency or crisis makes things difficult, but is no justification for closing down on public discourse. On the contrary, if we are all at risk, and we are all in it together, we all need to know and all need to have a voice.
Read the full article at the Nuffield Council on Bioethics.
Paying the ultimate price
Liam James Kingsley, Africa is a Country, 7 May 2020
The parallels between HIV and coronavirus are legion. A highly infectious virus with a lengthy and asymptomatic incubation period spreads silently and quickly in its geographical region of origin. By the time medical and public health professionals understand it to be a problem of significance, it has already made its way to other regions. Pre-existing social and political structures and divisions set the mold for both the spread of the disease and the response. The disease proliferates rapidly among traditionally vulnerable populations least capable of taking precautionary steps to protect themselves. Knowledge of the epidemic’s region of origin, as well as entrenched attitudes toward the vulnerable classes affected, allow national governments to create scapegoats, which are used either to ignore or downplay the crisis or to promulgate nationalist sentiment.
We see these patterns emerging in response to the coronavirus, just as they did during the HIV epidemic decades before. The virus’s origin and rapid spread in the Wuhan region of China has led Trump and many of his followers to dub COVID-19 the ‘Chinese Virus.’ Such xenophobic naming echoes the common characterization of HIV as an ‘African’ disease, implying some biological or cultural predisposition to the viral pathogen. Conspiracy theories proliferate. In China, some citizens wonder why this virus seemed to target them in particular. For explanation, many dip into existing political wells. The Chinese government has blamed American service members who visited the Wuhan region in October 2019, playing on geopolitical tensions with the United States. Similarly, many Africans still suspect that HIV was deliberately introduced by westerners seeking to debilitate the continent after the end of formal colonialism.
Yet, in truth, the explanation is far simpler and far less satisfying: the coronavirus affected China’s population most severely in its early days because it is a virus that crossed over into humans in China, just as HIV infected more Africans than any other group for no other reason than it began in Africa. Both diseases possess lengthy asymptomatic incubation periods (though they differ by several orders of magnitude) during which they are highly contagious, and thus both spread with great speed in the places they originated before health or government officials noticed their presence. It is precisely this trait that made both difficult to contain.
Viral crises unearth and widen already extant social divisions. Around the world, nations and publics continue to recycle and repackage familiar scapegoats to explain coronavirus and its spread. In Italy, far-right politicians invoke the familiar bogeyman of migrancy, despite a total lack of supporting evidence. In the US, the state closed borders and issued travel bans long after the virus had taken root. But the true problem lies in the fact that many of these latent social problems (migration, poverty, racism) are built into the very structure of societies, and facilitate the spread of disease. These structures usually render vulnerable those already most likely to be targeted by scapegoating.
Read the full article in Africa is a Country.
The precariat in the time of the pandemic
PV Srividya, The Hindu, 5 May 2020
In this contract labour economy, the company is the principal employer, the contractor is the employer; and the worker, the employee. The company pays the contractor a service charge on the cumulative wage bill. Faced with a pandemic, this contract labour economy with its underlying precarity has weakened the distribution of relief on the ground.
With no name in the employee roll as proof of employment, a vast majority of this migrant industrial workforce belongs to the shadow labour economy that gets paid in cash, without the safety net of PF and ESI and no employee IDs.
In mid-April, the disquiet inside the migrant workers cluster in Ganga Nagar was contagious. Shakir and his two friends from Bihar’s Muradabad are waiting for word from their contractor, whose mobile phone has been switched off since the lockdown. The trio came in December for a job in a buffing unit (a hazardous chemical process that gives metal a sheen and rust resistance). They buffed petrol tanks and gear box covers. ‘The contractor promised ₹100 per piece, but he has not even paid me my ₹25,000 for January,’ says Shakir, showing photographs on his phone of the neatly arranged, shiny white aluminium tanks as proof of his labour.
Ajay Behera says he came from Jagathsinghpur in Odisha 10 years ago. At 34, Behera made ₹10,000 per month as a press operator handling hydraulic machines for a small unit. His contractor paid him in cash, without ESI (employee accident insurance), or PF deductions. He remembers an Assamese co-worker, who lost his fingers under the machine and was sent back to Assam to heal on his own without ESI cover. ‘The landlord came and asked for rent today. But my contractor has not picked up my calls. I also need to send money home for my parents. What about next month? If the units don’t open, we will at least need money to go home,’ he says…
Day 16 of the lockdown is a balmy morning after night-long rain in Hosur. A group of permanent employees with greying hairlines, from the factories of some of the largest auto and ancillary companies, gathered under a tree in Mookandapalli, secure under the cover of anonymity offered by the COVID-19 protection masks.
‘In another 10 or 15 years, when the last of us retire, there will be no permanent employee in our factories. Direct recruitments at the worker level stopped long ago,’ one of them says.
If contract and casual labour through job contractors becomes the established norm, albeit with ‘employee’ status, other forms of labour will take shape on the factory floor under various trainee arrangements: trainees sourced through in-house industrial training courses offered by some companies; mandatory apprentices under the Apprentices Act; and the Central government’s various skill development schemes that push for ‘industry-led, practice-oriented’ training of unskilled labour. Together, these expand the presence of a non-permanent, rotational and precarious workforce on the production floor, as part of on-the-job trade learning at minimum cost to the company.
Read the full article in The Hindu.
In shielding its hospitals from COVID-19, Britain left many of the weakest exposed
Stephen Grey & Andrew Macaskill, Reuters, 5 May 2020
David Halpern, a psychologist who heads a behavioural science team – once nicknamed the ‘nudge unit’ – advising the UK government, had expanded on the idea in a separate media interview on March 11. As the epidemic grew, he said, a point would come ‘where you’ll want to cocoon, you’ll want to protect those at-risk groups so that they basically don’t catch the disease.’
Nonetheless, Reuters interviews with five leaders of large local authorities and eight care home managers indicate that key resources for such a cocoon approach were not in place.
There weren’t adequate supplies of protective equipment, nor lists of vulnerable people, they said. National supply chains for food were not identified, nor was there a plan in place to supply medicines, organise volunteers, or replace care staff temporarily off sick. Above all, those interviewed said, there was no plan for widespread testing in vulnerable places like care homes or prisons, let alone an infrastructure to deliver it.
On March 23, Johnson announced another shift in strategy, replacing the mitigate-plus-cocoon approach with a broader lockdown. Schools, pubs and restaurants were shuttered, sport cancelled and everyone was told to stay at home.
For local leaders, caring for the most vulnerable became increasingly challenging. Typically, they said, new plans were announced in an afternoon national press conference by a government minister, with instructions to implement them, sometimes the next day, arriving by email to councils later that night. Ministerial promises, handed off to the councils, included drawing up a ‘shield list’ of the most vulnerable, delivering food to them and organising and delivering prescription medicines. Even plans for using volunteers were announced nationally, without taking account of volunteer infrastructures that many councils had in place…
According to several care home managers, a key route for infection was opened up by an NHS decision taken in mid-March, as Britain geared up for the pandemic, to transfer 15,000 patients out of hospitals and back into the community, including an unspecified number of patients to care homes. These were not only patients from general wards. They included some who had tested positive for COVID-19, but were judged better cared for outside hospital.
In a plan issued by the NHS on March 17, care homes were exhorted to assist with national priorities. ‘Timely discharge is important for individuals so they can recuperate in a setting appropriate for rehabilitation and recovery – and the NHS also needs to discharge people in order to maintain capacity for acutely ill patients,’ the plan said.
A Department of Health guidance note dated April 2 and published online further stated that ‘negative tests are not required prior to transfers / admissions into the care home.’
Read the full article on Reuters.
Coronavirus at Smithfield pork plant: the untold story of America’s biggest outbreak
Jessica Lussenhop, BBC News, 17 April 2020
The Smithfield pork plant, located in a Republican-led state that is one of five in the US that has not issued any kind of shelter-in-place order, has become a microcosm illustrating the socioeconomic disparities laid bare by the global pandemic. While many white-collar workers around the country are sheltering in place and working from home, food industry workers like the employees at Smithfield are deemed ‘essential’ and must remain on the front lines.
‘These jobs for essential workers are lower paying than the average job across America, in some cases by significant margins. So home health aides, cashiers – absolutely essential, on the front lines, have to physically report to work,’ said Adie Tomer, a fellow at the Brookings Institution. ‘They are more predominantly African American or Hispanic than the overall working populations.’
The workforce at Smithfield is made up largely of immigrants and refugees from places like Myanmar, Ethiopia, Nepal, Congo and El Salvador. There are 80 different languages spoken in the plant. Estimates of the mean hourly wage range from $14 to $16. Those hours are long, the work is gruelling, and standing on a production line often means being less than a foot away from your co-workers on either side.
The BBC spoke to half a dozen current and former Smithfield employees who say that while they were afraid to continue going to work, deciding between employment and their health has been an impossible choice.
‘I have a lot of bills. My baby’s coming soon – I have to work,’ said one 25-year-old employee whose wife is eight months pregnant. ‘If I get a positive, I’m really worried I can’t save my wife’…
When announcing the shutdown, Smithfield CEO Sullivan warned of ‘severe, perhaps disastrous, repercussions’ for the supply of meat.
But according to Smithfield employees, their union representatives, and advocates for the immigrant community in Sioux Falls, the outbreak that led to the plant closure was avoidable. They allege early requests for personal protective equipment were ignored, that sick workers were incentivised to continue working, and that information regarding the spread of the virus was kept from them, even when they were at risk of exposing family and the broader public.
Read the full article on BBC News.
Lockdowns will starve people in low-income countries
Julian C Jamison, Washington Post, 20 April 2020
Unusually for an infectious disease, covid-19 has primarily made its presence felt in wealthier countries — so far. Now we are beginning to see its effects in the developing world, although they may not be the effects you’d expect.
Uganda, which has zero covid-19 deaths to date, preemptively forbade the movement of private vehicles on March 30. Since then, at least seven pregnant women have died after attempting to walk to health facilities to give birth, according to a human rights group there. In Nepal, which also has zero covid deaths, a national lockdown that limited the movement of people and closed nonessential shops has also forced rural laborers to work less than half the number of hours they would even in the leanest part of the agricultural season. Hunger, not disease, tops their worries, surveys find.
School closures have denied more than 1.6 billion children in 199 countries access to traditional education; 370 million no longer receive school meals that they counted on. These are stark reminders that efforts to mitigate the catastrophic effects of the pandemic can exert their own tolls — not just in terms of money but also in lost educational opportunities, social isolation, the inability to provide for one’s family and sometimes death.
We desperately need more thoughtful and nuanced strategies for dealing with the pandemic in different regions, and not just because there are three ventilators in all of Liberia compared with roughly 170,000 in the United States. Someone surviving on $3 or less per day with no savings or social safety net — a description that applies to about a quarter of the world’s population, or 2 billion people — faces radically different priorities and trade-offs than their American or European counterparts.
Sheltering in place is a luxury that most of the developing world cannot afford: Without work, some people would be unable to buy food the very next day, throwing themselves and their households into immediate jeopardy. (Think, too, of all the refugees, migrants and homeless people who have no stable place to shelter at all.) While intelligently targeted social distancing makes sense everywhere, stricter measures such as broad shutdowns carry a much higher human price in the developing world. All nations face hard choices, but rational policies can ease the trade-offs.
Of course, it is imperative to pick at least the low-hanging fruit in all countries and thereby save millions of people from the novel coronavirus. Developing nations can and should support stringent personal hygiene: avoiding physical greetings, wearing homemade masks, covering the mouth when coughing or sneezing, and washing hands whenever possible. Other smart responses include temporarily banning large indoor gatherings such as religious services, and closing clubs and bars.
Read the full article in the Washington Post.
A perfect storm for an outbreak
Kirsten Han, We the Citizens, 24 April 2020
But he noted that safe distancing just wasn’t possible a lot of the time. While the men kept a metre away from one another while queuing for the bus, they were still sitting side-by-side in the bus. The same applied at their lodgings; workers would maintain distance while waiting to scan their resident cards, but were still living in close quarters and sharing communal facilities once inside.
Safe distancing isn’t the only problem. Even in ‘normal times’ migrant workers exist in extremely precarious conditions: their work permits are tied to their employers, who are allowed to cancel them at any time, with or without reason. When so many have taken loans, pawned possessions, or leased family land to raise the thousands of dollars needed for recruitment fees, the threat of repatriation is a huge disincentive to speak out, complain, or do anything that might run the risk of displeasing their bosses.
Fordyce points out that this long-standing problem is likely also a factor in the coronavirus spread. The official advice might have been to stay home and seek medical attention if unwell, but it was unlikely any migrant worker would have heeded such a call if they feared the consequences of taking sick leave. Some employers have even been known to impose fines if men fail to show up for work.
While the government might have been concerned about the disruption of shutting down work sites, workers were worried about losing their jobs if they asked for time off. ‘In the minds of both, it was the economy,’ Fordyce says.
What was being created — from the crowded conditions in the dormitories and the inability to maintain safe distancing, to the lack of adequate labour protections that left workers too afraid to report sick — was a perfect storm for an outbreak…
Two days before Singaporeans began their ‘circuit breaker’ — a partial lockdown period, lasting until June 1, during which non-essential workplaces are closed, students undergo home-based learning, and gatherings of any size in any space are banned — the government announced the game plan.
Lawrence Wong, Minister for National Development and co-chair of the multi-ministry task force on COVID-19, laid it out at a press conference. The government was splitting locally transmitted cases into ‘two separate categories’: cases in dormitories, and everyone else. Workers would be largely confined to their lodgings, ‘so that there will be no infection to the rest of the community.’
‘With the foreign worker dormitories, looking after the workers there, taking care of their welfare, looking after all their well-being but also taking all of these precautions, I think we would be able to ring-fence and contain the infected cases in the foreign worker dormitories,’ the minister said, before moving on to talk about ‘cases within our own community, by residents in Singapore.’
Read the full article on We The Citizens.
What the great pandemic novels teach us
Orhan Pamuk, New York Times, 23 April 2020
For the past four years I have been writing a historical novel set in 1901 during what is known as the third plague pandemic, an outbreak of bubonic plague that killed millions of people in Asia but not very many in Europe. Over the last two months, friends and family, editors and journalists who know the subject of that novel, ‘Nights of Plague,’ have been asking me a barrage of questions about pandemics.
They are most curious about similarities between the current coronavirus pandemic and the historical outbreaks of plague and cholera. There is an overabundance of similarities. Throughout human and literary history what makes pandemics alike is not mere commonality of germs and viruses but that our initial responses were always the same.
The initial response to the outbreak of a pandemic has always been denial. National and local governments have always been late to respond and have distorted facts and manipulated figures to deny the existence of the outbreak.
In the early pages of ‘A Journal of the Plague Year,’ the single most illuminating work of literature ever written on contagion and human behavior, Daniel Defoe reports that in 1664, local authorities in some neighborhoods of London tried to make the number of plague deaths appear lower than it was by registering other, invented diseases as the recorded cause of death.
In the 1827 novel ‘The Betrothed,’ perhaps the most realist novel ever written about an outbreak of plague, the Italian writer Alessandro Manzoni describes and supports the local population’s anger at the official response to the 1630 plague in Milan. In spite of the evidence, the governor of Milan ignores the threat posed by the disease and will not even cancel a local prince’s birthday celebrations. Manzoni showed that the plague spread rapidly because the restrictions introduced were insufficient, their enforcement was lax and his fellow citizens didn’t heed them.
Much of the literature of plague and contagious diseases presents the carelessness, incompetence and selfishness of those in power as the sole instigator of the fury of the masses. But the best writers, such as Defoe and Camus, allowed their readers a glimpse at something other than politics lying beneath the wave of popular fury, something intrinsic to the human condition.
Read the full article in the New York Times.
The outbreak that invented intensive care
Hannah Wunsch, Nature, 3 April 2020
It was the polio epidemic of August 1952, at Blegdam Hospital in Copenhagen. This little-known event marked the start of intensive-care medicine and the use of mechanical ventilation outside the operating theatre — the very care that is at the heart of abating the COVID-19 crisis.
In 1952, the iron lung was the main way to treat the paralysis that stopped some people with poliovirus from breathing. Copenhagen was an epicentre of one of the worst polio epidemics that the world had ever seen. The hospital admitted 50 infected people daily, and each day, 6–12 of them developed respiratory failure. The whole city had just one iron lung. In the first few weeks of the epidemic, 87% of those with bulbar or bulbospinal polio, in which the virus attacks the brainstem or nerves that control breathing, died. Around half were children.
Desperate for a solution, the chief physician of Blegdam called a meeting. Asked to attend: Bjørn Ibsen, an anaesthesiologist recently returned from training at the Massachusetts General Hospital in Boston. Ibsen had a radical idea. It changed the course of modern medicine.
The iron lung used negative pressure. It created a vacuum around the body, forcing the ribs, and therefore the lungs, to expand; air would then rush into the trachea and lungs to fill the void. The concept of negative-pressure ventilation had been around for hundreds of years, but the device that became widely used — the ‘Drinker respirator’ — was invented in 1928 by Philip Drinker and Louis Agassiz Shaw, professors at the School of Public Health in Boston, Massachusetts. Others went on to refine it, but the basic mechanism remained the same until 1952.
Iron lungs only partially solved the paralysis problem. Many people with polio placed in one still died. Among the most frequent complications was aspiration — saliva or stomach contents would be sucked from the back of the throat into the lungs when a person was too weak to swallow. There was no protection of the airway.
Ibsen suggested the opposite approach. His idea was to blow air directly into the lungs to make them expand, and then allow the body to passively relax and exhale. He proposed the use of a tracheostomy: an incision in the neck, through which a tube goes into the windpipe and delivers oxygen to the lungs, and the application of positive-pressure ventilation. At the time, this was often done briefly during surgery, but had rarely been used in a hospital ward.
Read the full article in Nature.
Coronavirus: why public trust is an issue
for news media – but don’t trust those polls
Charlie Beckett, LSE Polis Blog, 24 April 2020
At a time of national division and heated argument, on top of four years of divisive Brexit debate between two sharply opposed main political parties, it’s not surprising that people are sceptical about everything. After ten years of austerity followed by an unprecedented pandemic, there’s a lot to be angry and upset about.
So it would be easy – and not entirely untrue – to dismiss this anger against the media as evidence of people wanting to shoot the messenger. Journalists bring both the bad news of reality and also a lot of views that you will disagree with. So in that sense, it’s not so much that people don’t trust the system or profession of journalism, they are reacting to the presence of a lot of journalism that makes them sad, or cross or that they disagree with. Other surveys before coronavirus have actually shown a slight uptick in ‘trust’ in journalism. See below for an Ipsos MORI chart that gives an historical pre-COVID-19 perspective.
Studies such as the Reuters Institute for the Study of Journalism’s annual digital report show more nuance. It suggests, for example, that people tend to trust news sources they agree with or consume.
As the LSE’s Truth, Trust and Technology Commission report said in 2018, ‘trust’ is socially-constructed. It is a product of our economic and political environment and current circumstances. Information is now produced and distributed as much by major tech companies and the networks they create as it is by TV or newspapers. Media is as much a symptom as a cause of public opinion or understanding. I have long argued that asking people if they ‘trust’ something – especially something as varied and subjective as attitudes to journalism – is the wrong question. When we say ‘trust’ what do we mean? Do you believe, agree with, have confidence in, rely upon, use, share or interact or act upon journalism, would all be more useful questions. Which journalism do you trust (or not) and why?
The British people are pretty well educated, relatively politically engaged and have been encouraged by the news media (and universities!) to think for themselves and be sceptical for decades. Disasters like the financial crisis or phone-hacking give them some cause to be sceptical. It is not surprising that they are reluctant to tell a pollster that they happily ‘trust’ anyone, let alone a journalist.
Arguably, it’s a healthy sign in a democracy that citizens are not deferential to authority such as the news media (or politicians or even scientists). If you want high trust ratings in media and politicians you have to go to China. Considering what happened in China regarding coronavirus in and since December/January, I’m not sure that’s a great model.
Read the full article on the LSE Polis Blog.
What the coronavirus crisis
reveals about American medicine
Siddhartha Mukherjee, New Yorker, 27 April 2020
Within hours, the magnitude of the loss was evident to Toyota. The company had adopted ‘just in time’ (J.I.T.) production: parts, such as P-valves, were produced according to immediate needs—to precisely match the number of vehicles ready for assembly—rather than sitting around in stockpiles. But the fire had now put the whole enterprise at risk: with no inventory in the warehouse, there were only enough valves to last a single day. The production of all Toyota vehicles was about to grind to a halt. ‘Such is the fragility of JIT: a surprise event can paralyze entire networks and even industries,’ the management scholars Toshihiro Nishiguchi and Alexandre Beaudet observed the following year, in a case study of the episode.
Toyota’s response was extraordinary: by six-thirty that morning, while the factory was still smoldering, executives huddled to organize the production of P-valves at other factories. It was a ‘war room,’ one official recalled. The next day, a Sunday, small and large factories, some with no direct connection to Toyota, or even to the automotive industry, received detailed instructions for manufacturing the P-valves. By February 4th, three days after the fire, many of these factories had repurposed their machines to make the valves. Brother Industries, a Japanese company best known for its sewing machines and typewriters, adapted a computerized milling device that made typewriter parts to start making P-valves. The ad-hoc work-around was inefficient—it took fifteen minutes to complete each valve, its general manager admitted—but the country’s largest company was in trouble, and so the crisis had become a test of national solidarity. All in all, Toyota lost some seventy thousand vehicles—an astonishingly small number, given the millions of orders it fulfilled that year. By the end of the week, it had increased shifts and lengthened hours. Within the month, the company had rebounded.
Every enterprise learns its strengths and weaknesses from an Aisin-fire moment—from a disaster that spirals out of control. What those of us in the medical profession have learned from the covid-19 crisis has been dismaying, and on several fronts. Medicine isn’t a doctor with a black bag, after all; it’s a complex web of systems and processes. It is a health-care delivery system—providing antibiotics to a child with strep throat or a new kidney to a patient with renal failure. It is a research program, guiding discoveries from the lab bench to the bedside. It is a set of protocols for quality control—from clinical-practice guidelines to drug and device approvals. And it is a forum for exchanging information, allowing for continuous improvement in patient care. In each arena, the pandemic has revealed some strengths—including frank heroism and ingenuity—but it has also exposed hidden fractures, silent aneurysms, points of fragility. Systems that we thought were homeostatic—self-regulating, self-correcting, like a human body in good health—turned out to be exquisitely sensitive to turbulence, like the body during critical illness. Everyone now asks: When will things get back to normal? But, as a physician and researcher, I fear that the resumption of normality would signal a failure to learn. We need to think not about resumption but about revision.
Read the full article in the New Yorker.
The pandemic is changing the face of Indian labour
Arun Kumar, The Wire, 9 May 2020
The pandemic has highlighted the real working and living conditions of most workers. The unfolding crisis does make one stop and wonder – how can this be the situation of most Indian workers, 70 years after independence?
True, conditions are not what they were in 1947. There is more education and better health facilities. Child mortality has dropped and longevity has increased. Many of the poor own mobile phones and wear chappals. Electricity and tapped water have reached many villages. But this is expected in an economy where GDP has grown by 32.2 times and per capita income by 8.2 times since 1950, as some fruits of development trickle down to some of the marginalised.
What is missing though is a dignified life which should be the right of every citizen in a democratic country. The ruling class often ignores this and argues that the marginal improvement in the material conditions of many workers is enough. They even imply that the workers ought to be grateful for this slight betterment and portray it as the success of the prevailing unequal economic system. In the present ruling economic ideology, equity is not high on the agenda.
The ruling elites thrive on the poor working and living conditions of labour for their lifestyle and profits. Consequently, neither the state nor the businesses grant the workers their rights. For instance, large numbers do not get a minimum wage or social security or protective gear at worksites. They do not have security of employment; often wages are not paid on time, muster rolls are fudged, and there is no entitlement to leave. Given their low wages, they are forced to live in squalid conditions, with many sharing a small room in a slum. Water is scarce and drinking water more so. Access to clean toilets is limited and disease spreads. There is a lack of civic amenities like sewage. Their children are often deprived of schools and playgrounds.
With COVID-19 as an excuse, state after state is reducing what little security was available to workers by eliminating or diluting various laws so as to favour businesses. In Uttar Pradesh, at least 14 labour laws like the Minimum Wages Act and Industrial Disputes Act are being suspended for three years in an effort to attract capital. Similar is the case with MP and Gujarat. The plea is that this is needed to revive economic activity. The chief minister of MP has said that this would lead to new investment in the state. Whether or not new investment will come at this time when businesses are unable to start or they face a situation of low capacity utilisation, what this would ensure is competition among states to relax and eliminate labour laws. Thus, the poor working conditions of labour will deteriorate further.
Read the full article in The Wire.
Anatomists of melancholy in the age of coronavirus
Chronicle of Higher Education, 17 April 2020
Before 2015, few people would have thought of not finishing college as a public-health issue. That changed because of research done by Anne Case and Angus Deaton, economists at Princeton who are also married. For the past six years, they have been collaboratively researching an alarming long-term increase in what they call ‘deaths of despair’ — suicides, drug overdoses, and alcoholism-related illnesses — among middle-aged white non-Hispanic Americans without a bachelor’s degree.
Change any one of those attributes (race, nationality, education), and the trend disappears. Mortality has not increased among white Americans with a bachelor’s degree, nor American people of color, nor non-Americans without a bachelor’s degree. (Indeed, all-cause mortality among those groups has continued to go down, as usual.) Something about not having a bachelor’s degree in America, especially when white, can be deadly.
The term ‘deaths of despair’ has taken on a life of its own, becoming ubiquitous in newspapers, magazines, and op-eds. It has been the subject of think-tank panels, conferences, and even government inquiry. ‘America Will Struggle After Coronavirus. These Charts Show Why,’ proclaims a New York Times article that visualizes some of their research. This past fall, Congress’s Joint Economic Committee issued its own report on ‘Long-Term Trends in Deaths of Despair.’
Case and Deaton’s new book, Deaths of Despair and the Future of Capitalism (Princeton University Press), takes their message even further. Capitalism itself, they argue, needs serious reform if it is to make good on its potential to improve the lives of all Americans. In particular, as Case pointedly observed in a lecture last year at Stanford University, ‘We don’t think [American capitalism] is working for people without a four-year college degree — and that’s two-thirds of Americans between the ages of 25 and 64.’ The coronavirus outbreak, the dire economic forecast, the millions of newly unemployed — all of these recent events raise the stakes of their research.
Read the full article in the Chronicle of Higher Education.
Harvesting the blood of America’s poor:
The latest stage of capitalism
Alan McLeod, Mint Press News, 3 December 2019
For much of the world, donating blood is purely an act of solidarity; a civic duty that the healthy perform to aid others in need. The idea of being paid for such an action would be considered bizarre. But in the United States, it is big business. Indeed, in today’s wretched economy, where around 130 million Americans admit an inability to pay for basic needs like food, housing or healthcare, buying and selling blood is one of the few booming industries America has left.
The number of collection centers in the United States has more than doubled since 2005 and blood now makes up well over 2 percent of total U.S. exports by value. To put that in perspective, Americans’ blood is now worth more than all exported corn or soy products that cover vast areas of the country’s heartland. The U.S. supplies fully 70 percent of the world’s plasma, mainly because most other countries have banned the practice on ethical and medical grounds. Exports increased by over 13 percent, to $28.6 billion, between 2016 and 2017, and the plasma market is projected to ‘grow radiantly,’ according to one industry report. The majority goes to wealthy European countries; Germany, for example, buys 15 percent of all U.S. blood exports. China and Japan are also key customers.
It is primarily the plasma – a golden liquid that transports proteins and red and white blood cells around the body – that makes it so sought after. Donated blood is crucial in treating medical conditions such as anemia and cancer and is commonly required to perform surgeries. Pregnant women also frequently need transfusions to treat blood loss during childbirth. Like all maturing industries, a few enormous bloodthirsty companies, such as Grifols and CSL, have come to dominate the American market.
But in order to generate such enormous profits, these vampiric corporations consciously target the poorest and most desperate Americans. One study found that the majority of donors in Cleveland generate more than a third of their income from ‘donating’ blood. The money they receive, notes Professor Kathryn Edin of Princeton University, is literally ‘the lifeblood of the $2 a day poor.’ Professor H. Luke Schaefer of the University of Michigan, Edin’s co-author of $2 a Day: Living on Almost Nothing in America, told MintPress News:
‘The massive increase in blood plasma sales is a result of an inadequate and in many places non-existent cash safety net, combined with an unstable labor market. Our experience is people need the money; that’s the primary reason people show up at plasma centers.’
Almost half of America is broke, and 58 percent of the country is living paycheck to paycheck, with savings of less than $1000. 37 million Americans go to bed hungry, including one-sixth of New Yorkers and almost half of South Bronx residents. And over half a million sleep on the streets on any given night, with many millions more in vehicles or relying on friends or family. It is in this context that millions in the red have turned to selling blood to make ends meet. In a very real sense then, these corporations are harvesting the blood of the poor, literally sucking the life out of them.
Read the full article in Mint Press News.
Climate refugees: The fabrication of a migration threat
Hein de Haas, 31 January 2020
The typical approach of apocalyptic climate migration forecasts has been to map climate-change-induced developments (such as sea-level rise, drought or desertification) onto settlement patterns to predict future human displacement. For instance, if climate change models predicted a sea-level rise of (say) 50 centimeters, it would be possible to map all coastal areas affected by this and work out how many people lived in such areas. The assumption then is that all these people would have to move.
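The forecasting recipe de Haas describes (overlay a projected sea-level rise on settlement elevations and count everyone below the line as displaced) can be caricatured in a few lines of code. A minimal sketch; all settlement names, elevations and populations here are invented for illustration:

```python
SEA_LEVEL_RISE_M = 0.5  # hypothetical 50 cm projection from a climate model

# (settlement, elevation above current sea level in metres, population)
settlements = [
    ("coastal_delta", 0.3, 120_000),
    ("port_town", 0.4, 45_000),
    ("upland_village", 12.0, 8_000),
]

# The contested assumption: everyone below the projected line must move.
displaced = sum(pop for _, elev, pop in settlements if elev < SEA_LEVEL_RISE_M)
print(f"naive displacement forecast: {displaced:,} people")  # 165,000
```

The article’s point is precisely that this final step, treating every resident of an affected area as a future migrant, is where such apocalyptic forecasts go wrong.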
Yet research evidence challenges the popular idea that climate change will lead to mass migration. In 2011, a group of prominent researchers conducted a study for the UK Government Office for Science on the links between migration and environmental change. They concluded that because migration is driven by many factors, it can rarely be reduced to the effects of just one form of change, such as climate change or other environmental factors.
The environment is but one of the many factors that shape migration, and this effect is indirect rather than direct. This makes it difficult to directly attribute migration to climate change and other environmental factors. In fact, migration is likely to continue regardless of climate and the environment, because it is mainly driven by powerful economic, political and social processes, such as labor demand (in destination areas) and development (in origin areas).
For instance, this challenges the popular idea that much migration within Bangladesh is an ‘obvious example’ of mass displacement due to the sea-level rise. After all, much of this movement would have happened anyway as part of more general processes of urbanization, education and the growth of urban-based industrial and service sectors. In fact, many people voluntarily migrate from rural into urban areas of greater environmental vulnerability, such as fertile deltas and cities partly built on floodplains. They do so because of improved livelihood opportunities they can expect to find there despite high population densities and environmental hazards (particularly flooding) they often encounter there.
Furthermore, we cannot simply assume that low-lying areas will be submerged through sea-level rise. Whether land is at risk of being submerged and rendered uninhabitable (unless dikes are built) depends not only on sea levels but also on natural patterns of erosion and sedimentation, as well as on land subsidence through soil compaction.
For instance, delta areas have always been highly dynamic and characterized by constantly shifting patterns of land formation and erosion. We should therefore refrain from simplistic analyses. For instance, research on Bangladesh has shown that while in some areas, land is being lost, in other areas land has been gained. A recent study revealed that, in the period between 1985 and 2015, the rate of land area growth (through sedimentation) in coastal areas of Bangladesh has been slightly higher than the rate of erosion.
Read the full article on Hein de Haas’s blog.
To fight locusts, historic rivals
India and Pakistan team up
Lou Del Bello, Undark, 20 April 2020
Since December 2019, an international locust outbreak of exceptional severity has spread across the Horn of Africa and the Middle East before moving on to Asia. Scientists say that climate change may have played a role in this incursion. In 2019, eight cyclones developed in the western part of the Indian Ocean, bringing heavy rains to farmland, says Keith Cressman, senior locust forecasting officer with the United Nations Food and Agriculture Organization (FAO). When it comes to cyclones, he says, past years have brought just one, or even none, to the region. Locust breeding is directly tied to soil moisture and food availability, so rain patterns have a strong influence on locust populations.
Scientists and government officials agree that countries in the region — including India and Pakistan — must coordinate their efforts to minimize the damage of future swarms. A locust flare-up would be an equally serious threat on either side of the India-Pakistan border, says Muhammad Tariq Khan, technical director of the Department of Plant Protection, a branch of Pakistan’s National Food Security and Research agency. So far, the two nations have been able to set aside their political differences to address the locust problem, says Khan. But cross-border conflict anywhere in the wider region has the potential to disrupt economies and food security.
When conflict does break out, the FAO can serve as a neutral broker, says Cressman. He mentions that the 1979 revolution in Iran disrupted locust management cooperation between Iran and Pakistan. In the 1990s, both countries appealed to FAO to help them rekindle the dialogue. In addition to serving as an intermediary, the FAO collects data from all affected regions around the world, putting them online, so countries can plan monitoring actions or pesticide spraying…
Last year, the strained relations between India and Pakistan reached a tipping point after a suicide attack attributed to a Pakistani terrorist group killed 40 people in the Pulwama district of India’s Jammu and Kashmir state. The incident led India to carry out its first airstrike in recent memory on Pakistani territory, escalating into one of the most dangerous standoffs between the two countries in decades.
During that period, and into the summer, communications, trade, and activities slowed down or were suspended. But even as tensions grew at the border, at least one conversation was allowed to continue, negotiated under the FAO umbrella. Pakistani and Indian officials were gathering peacefully further south along the border in a place known as Zero Point, to stave off the threat of the desert locust.
Read the full article in Undark.
Oldest evidence of a moving tectonic plate
found in Australia
Maya Wei-Haas, National Geographic, 22 April 2020
In the desolate landscape of western Australia, a rocky outcrop that formed more than three billion years ago is giving geologists an unprecedented look at the early churnings of our planet. These rocks—among the most ancient in the world—contain what may be the oldest direct evidence of the movement of tectonic plates.
The rocks formed when magma oozed up from beneath Earth’s surface into a now-vanished ocean, cooling and hardening into a bulbous mass. As detailed in a new study in Science Advances, magnetic signatures preserved in the rock suggest the region was inching across the planet 3.2 billion years ago at similar speeds to tectonic plates today—nearly half a billion years earlier than previous evidence of such movement.
‘This is kind of the smoking gun,’ says geochemist Annie Bauer of the University of Wisconsin-Madison, who was not part of the new study. ‘This is the most important evidence we can get [of early plate motion].’
Today, Earth’s tectonic plates continually shift and migrate—a process that builds mountains, carves basins, and drives volcanic eruptions. These motions sculpted a variety of ecological niches, including hydrothermal vents at the bottom of the sea and boiling pools of water on the surface—the types of environments where life is believed to have formed.
‘While piecing together the story of plate tectonics, we’re helping to piece together our own origin story,’ says the study’s lead author Alec Brenner, a Ph.D. student at Harvard University.
Our planet coalesced from a swirling cloud of gas and dust some 4.5 billion years ago, and initially it was scorching hot. Oceans of molten rock glowed on the surface, and volcanoes likely spit lava into the air. But Earth soon began to cool, and over tens of millions of years, the surface hardened into a crust.
Scientists believe this early crust was a singular cap enveloping the planet, much like the surface of Mars today. At some point—estimates vary from roughly four billion to a billion years ago—this cap fractured into a global jigsaw of crust, with pieces crashing into each other and driving rock down into the mantle or up into the sky. Plate tectonics was born.
Read the full article in National Geographic.
Denis Goldberg, a proper mensch and a true hero
Tymon Smith, New Frame, 1 May 2020
I sheltered with the people in a time of uproar
And then I joined in their rebellion.
That’s how I passed my time that was given to me on this Earth. – Bertolt Brecht, To Those Born After
When he was interviewed for a documentary in 1990 while living in England, Rivonia Trialist and former political prisoner Denis Goldberg read these lines to explain to his interviewers how he viewed the way that he had passed his time on Earth in the service of the struggle against apartheid and political oppression generally.
They were apt for a man whose life served as an example for others of a complete commitment to the greater good and prosperity of his fellow countrymen. Goldberg was then 57 years old and had spent just over 22 years in prison in Pretoria before being released in 1985. His wife Esme and their two children, Hilary and David, had lived in England for the duration of his imprisonment and in spite of the release of Nelson Mandela and the unbanning of the ANC, Goldberg did not at that stage see himself returning to South Africa because his wife and family no longer saw it as their home.
He would not return to South Africa until 2002, after Esme’s death in 2000 and his daughter Hilary’s sudden death two years later. For the remainder of his life, which ended on the night of Wednesday 29 April 2020, Goldberg remained fiercely dedicated to the fight against inequality and oppression that he had waged for most of his 87 years…
In 1957, Goldberg joined the Communist Party. In 1961, he became a member of Umkhonto weSizwe (MK). He had already experienced his first prison spell in 1960 when he and his mother were arrested for their involvement in supporting strikers in the townships in the wake of the Sharpeville massacre. Goldberg and his mother spent four months in detention without trial.
In 1962, he and fellow MK comrade Looksmart Ngudle organised the first MK training camp in South Africa at Mamre outside Cape Town. Ngudle would later become the first person to die in detention when he was found hanged in his cell in Pretoria Prison in September 1963.
Goldberg was arrested on 11 July 1963 at Liliesleaf Farm. He was part of the MK underground command that was busy planning for Operation Mayibuye, which intended to violently overthrow the apartheid regime. As an engineer, Goldberg had been tasked with manufacturing the arms necessary for the task. He was the youngest of the accused at the Rivonia Trial and when he was sentenced to four life sentences in 1964, his mother, who was unable to hear the judge, asked her son, ‘What did the judge say?’, to which he famously replied, ‘Life, and life is wonderful.’
Read the full article in New Frame.
The man who thought too fast
Anthony Gottlieb, New Yorker, 27 April 2020
‘The world will never know what has happened—what a light has gone out,’ the belletrist Lytton Strachey, a member of London’s Bloomsbury literary set, wrote to a friend on January 19, 1930. Frank Ramsey, a lecturer in mathematics at Cambridge University, had died that day at the age of twenty-six, probably from a liver infection that he may have picked up during a swim in the River Cam. ‘There was something of Newton about him,’ Strachey continued. ‘The ease and majesty of the thought—the gentleness of the temperament.’
Dons at Cambridge had known for a while that there was a sort of marvel in their midst: Ramsey made his mark soon after his arrival as an undergraduate at Newton’s old college, Trinity, in 1920. He was picked at the age of eighteen to produce the English translation of Ludwig Wittgenstein’s ‘Tractatus Logico-Philosophicus,’ the most talked-about philosophy book of the time; two years later, he published a critique of it in the leading philosophy journal in English, Mind. G. E. Moore, the journal’s editor, who had been lecturing at Cambridge for a decade before Ramsey turned up, confessed that he was ‘distinctly nervous’ when this first-year student was in the audience, because he was ‘very much cleverer than I was.’ John Maynard Keynes was one of several Cambridge economists who deferred to the undergraduate Ramsey’s judgment and intellectual prowess.
When Ramsey later published a paper about rates of saving, Keynes called it ‘one of the most remarkable contributions to mathematical economics ever made.’ Its most controversial idea was that the well-being of future generations should be given the same weight as that of the present one. Discounting the interests of future people, Ramsey wrote, is ‘ethically indefensible and arises merely from the weakness of the imagination.’ In the wake of the Great Depression, economists had more pressing concerns; only decades later did the paper’s enormous impact arrive. And so it went with most of Ramsey’s work. His contribution to pure mathematics was tucked away inside a paper on something else. It consisted of two theorems that he used to investigate the procedures for determining the validity of logical formulas. More than forty years after they were published, these two tools became the basis of a branch of mathematics known as Ramsey theory, which analyzes order and disorder. (As an Oxford mathematician, Martin Gould, has explained, Ramsey theory tells us, for instance, that among any six users of Facebook there will always be either a trio of mutual friends or a trio in which none are friends.)
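Gould’s Facebook example is the Ramsey number R(3,3) = 6, and the claim is small enough to verify exhaustively. A quick brute-force sketch (mine, not from the article), checking every friend/stranger assignment among six people, plus the standard pentagon counterexample showing that five people are not enough:

```python
from itertools import combinations, product

def has_mono_triangle(coloring, n=6):
    """True if some trio are all mutual friends or all mutual strangers."""
    return any(
        coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
        for a, b, c in combinations(range(n), 3)
    )

edges = list(combinations(range(6), 2))  # the 15 pairs among six people

# Every one of the 2^15 friend/stranger assignments contains such a trio.
assert all(
    has_mono_triangle(dict(zip(edges, bits)))
    for bits in product([True, False], repeat=len(edges))
)

# But five people do not suffice: make the 5-cycle's edges "friends" and the
# diagonals "strangers", and no monochromatic trio exists.
five = {(a, b): (b - a) in (1, 4) for a, b in combinations(range(5), 2)}
assert not has_mono_triangle(five, n=5)

print("every group of six contains a trio of mutual friends or strangers")
```

Both the 5-cycle and its complement are triangle-free, which is why six is the smallest group size for which Ramsey’s guarantee holds.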
Ramsey not only died young but lived too early, or so it can seem. He did little to advertise the importance of his ideas, and his modesty did not help. He was not particularly impressed with himself—he thought he was rather lazy. At the same time, the speed with which his mind worked sometimes left a blur on the page. The prominent American philosopher Donald Davidson was one of several thinkers to experience what he dubbed ‘the Ramsey effect.’ You’d make a thrilling breakthrough only to find that Ramsey had got there first.
There was also the problem of Wittgenstein, whose looming example and cultlike following distracted attention from Ramsey’s ideas for decades. But Ramsey rose again. Economists now study Ramsey pricing; mathematicians ponder Ramsey numbers. Philosophers talk about Ramsey sentences, Ramseyfication, and the Ramsey test. Not a few scholars believe that there are Ramseyan seams still to mine.
Read the full article in the New Yorker.
We shouldn’t be scared by ‘superintelligent A.I.’
Melanie Mitchell, New York Times, 31 October 2019
The assumption seems to be that this A.I. could surpass the generality and flexibility of human intelligence while seamlessly retaining the speed, precision and programmability of a computer. This imagined machine would be far smarter than any human, far better at ‘general wisdom and social skills,’ but at the same time it would preserve unfettered access to all of its mechanical capabilities. And as Dr. Russell’s example shows, it would lack humanlike common sense.
The problem with such forecasts is that they underestimate the complexity of general, human-level intelligence. Human intelligence is a strongly integrated system, one whose many attributes — including emotions, desires, and a strong sense of selfhood and autonomy — can’t easily be separated.
Similarly, if generally intelligent A.I. is ever created (something that will take many decades, if not centuries), its objectives, like ours, will not be easily ‘inserted’ or ‘aligned.’ They will rather develop along with the other qualities that form its intelligence, as a result of being embedded in human society and culture. The machines’ push to achieve these objectives will be tempered by the common sense, values and social judgment without which general intelligence cannot exist.
What’s more, the notion of superintelligence without humanlike limitations may be a myth. It seems likely to me that many of the supposed deficiencies of human cognition are inseparable aspects of our general intelligence, which evolved in large part to allow us to function as a social group. It’s possible that the emotions, ‘irrational’ biases and other qualities sometimes considered cognitive shortcomings are what enable us to be generally intelligent social beings rather than narrow savants. I can’t prove it, but I believe that general intelligence can’t be isolated from all these apparent shortcomings, either in humans or in machines that operate in our human world.
In his 1979 Pulitzer Prize-winning book, ‘Gödel, Escher, Bach: an Eternal Golden Braid,’ the cognitive scientist Douglas Hofstadter beautifully captures the counterintuitive complexity of intelligence by posing a deceptively simple question: ‘Will a thinking computer be able to add fast?’ Dr. Hofstadter’s surprising but insightful answer was, ‘perhaps not.’
As Dr. Hofstadter explains: ‘We ourselves are composed of hardware which does fancy calculations but that doesn’t mean that our symbol level, where ‘we’ are, knows how to carry out the same fancy calculations. Let me put it this way: There’s no way that you can load numbers into your own neurons to add up your grocery bill. Luckily for you, your symbol level (i.e., you) can’t gain access to the neurons which are doing your thinking — otherwise you’d get addlebrained.’ So, why, he asks, ‘should it not be the same for an intelligent program?’
Read the full article in the New York Times.
Worlds within a self
Helen Hayward, TLS, 10 April 2020
V.S. Naipaul’s work speaks eloquently to the contemporary world. His focus is on migration and displacement, and his abiding theme is ‘the great movement of peoples in the second half of the twentieth century’. Naipaul is ripe for reassessment now that his work can be seen as a whole, following his death in 2018 – and time has only made his legacy clearer. Moreover, Naipaul is no longer around to stir up controversy with outrageous statements in interviews – a form of deliberate provocation that George Lamming likened to carnival masquerading.
Naipaul was born in 1932 in rural Trinidad; a scholarship enabled him to study in Oxford, and so his life followed the trajectory to which Sanjay Krishnan’s subtitle alludes. Krishnan constructs a narrative out of Naipaul’s oeuvre, making it the story of postcolonial societies undergoing the disorientating transition to modernity, with Naipaul’s own life providing the starting point: his subject is ‘the worlds I contained within myself’. Naipaul works through this modern disorientation, Krishnan contends, in order to consider how formerly subject peoples can hope to understand their predicament and reshape their lives.
At the beginning of his writing life, in the 1950s, reaching an understanding of such historical forces constituted an innovation, Krishnan suggests, and it involved trying to see the postcolonial world as a globalized whole. Yet Naipaul is more interested in self-examination than in ideas of resistance and of cultural hybridity celebrated by postcolonial critics – which is part of the reason he has excited controversy, if not opprobrium. Krishnan notes that Naipaul writes frankly about racist feelings in the context of the ethnic hostilities unleashed by decolonization, in his efforts to understand the forces that shaped him. The problem is that Naipaul’s expressions of outrage at postcolonial racism risk echoing the language of the racism he condemns. His critics denounce him for peddling damaging stereotypes about the formerly colonized and their inability to govern themselves, and Krishnan at times finds himself writing in the guise of Naipaul’s advocate, taking issue with these detractors, despite his claims not to seek to defend him.
When Naipaul revisited Trinidad in the 1950s, he reckoned that agitation for independence was about to spark racial war; his outrage at the decline of multiculturalism found expression in satire. Krishnan argues that this excuses his sometimes sneering tone, as when he characterizes the mutual hostility of Asian and black Trinidadians as being ‘like monkeys pleading for evolution’. ‘Cruelty disguises distress’, Krishnan contends. He also suggests that at this stage of his career Naipaul was seeking to prove himself to a metropolitan audience – an excuse that forms precisely the basis of some postcolonial critics’ argument with his work.
Read the full article in the TLS.
Why free will is real
Peter A Graham, Notre Dame Philosophical Reviews, 4 May 2020
According to List, free will is a three-part capacity, requiring intentional agency, alternative possibilities, and causal control. These three requirements he takes to be necessary and sufficient for a being’s having free will in the sense that matters to us when we wonder whether we ourselves have free will. More particularly, List sets out three theses which he contends must be true, and which are jointly sufficient, for a being to have free will:
Intentional Agency: Any bearer of free will is an intentional agent, whose intentions support the relevant actions (i.e., those actions performed freely).
Alternative Possibilities: Any bearer of free will faces the choice, at least in relevant cases (i.e., those cases in which she exercises her free will), between two or more alternative actions, each of which is a genuine possibility for the agent.
Causal Control: The relevant actions (i.e., those that are the product of free will) of any bearer of free will are caused, not merely by some nonintentional processes, but by the appropriate mental states, viz., the intentions to perform those actions.
To show that free will is in fact real, List sets out to respond to what he takes to be the main challenges posed by aspects of our contemporary scientific worldview to these theses.
The challenge to Intentional Agency is the challenge of radical materialism. Radical materialism consists of either eliminative materialism or reductive materialism. According to eliminative materialism, scientifically speaking, there is no such thing as intentional agency. This is because, according to it, the idea of there being mental states such as intentions, desires, and beliefs is a relic of our folk/prescientific worldview which contemporary science has shown (or will show) to be just as ill-founded as it has shown the idea that thunder and lightning are the manifestations of divine displeasure to be. Rather, explaining the behavior and functioning of agents, be they human or non-human, will be shown to be the province not of folk psychology, but of neurobiology, neurochemistry, and ultimately, neurophysics. According to reductive materialism, mental states such as beliefs, desires, and intentions, though perhaps not eliminable, are reducible to neurophysical states. If either form of radical materialism were true, then there would be no non-eliminated or non-reduced-away intentional states. And, as there can be no intentional agency without intentional states, there would likewise be no real free will.
Read the full article in Notre Dame Philosophical Reviews.
Clair Wills, New York Review of Books, 14 May 2020
The living, breathing Cromwell of the early books had nothing to do with historical accuracy. Quite the reverse. Mantel’s genius was to take a well-known, verifiable historical character and make it seem as though he wasn’t one. We spent almost as much time with his children painting Easter eggs or dressing up for Christmas as we did at Henry’s court. We saw him doing ordinary things—talking in bed with his wife in the morning, or falling asleep over his desk at night, his imagination alert to his surroundings but also to the presence of people and things that weren’t there. ‘Beneath every history, another history,’ thinks Cromwell in Wolf Hall. He sees his dead wife Liz ‘whisking around a corner’ or lurking under a stairwell. After Cardinal Wolsey’s death his scarlet clothes are cut up and used in other garments: ‘Your eye will be taken by a crimson cushion or a patch of red on a banner or ensign. You will see a glimpse of them in a man’s inner sleeve or in the flash of a whore’s petticoat.’
This is a portrait of a world where ‘the dead grip the living,’ where the end comes so fast (the sweating sickness can kill in hours, before you get a chance to say goodbye) that it seems natural for the dead to hang around. But it is also the landscape of Mantel’s weird and brilliant novel Beyond Black (2005), set in English suburbia at the end of the twentieth century. The head Mantel took us inside in her portrait of Cromwell in the first two books was her own.
Alison in Beyond Black is a spirit medium, at the mercy of ‘interference’ from a set of mostly malevolent phantoms. We learn in the course of the novel that the spirits that subject her to ‘slow torture,’ now that they are dead, are the same figures who raped and tormented her when they were alive. Alison’s oppression by these beings makes her ill: her huge and unruly body is in constant pain. Another writer would have thinned the boundary between these ghosts and the unconscious, to allow us to read the spirits as emanations of Alison’s tortured thoughts. But the weight of other consciousnesses is real to Mantel. Alison is a portrait of a writer at the mercy of the voices of others, who cannot stop herself from seeing the other world that lies just behind this one, out of the corner of her eye.
Read the full article in the New York Review of Books.
The outrageous optimism of Jean-Paul Sartre
Ian Birchall, Jacobin, 15 April 2020
Sartre is often presented as a pessimistic thinker. In his novel Nausea, he wrote: ‘Every existing thing is born without reason, prolongs itself out of weakness, and dies by chance.’ Perhaps the best-known quotation of Sartre’s comes from his play In Camera — ‘hell is other people.’ But if his starting-point seems bleak — we live in a godless, meaningless universe — the logic is that all meaning, all values, come from human beings, from ourselves. In Sartre’s own phrase, we are ‘condemned to be free.’
As Sartre himself noted, it was not his alleged pessimism that outraged people so much as his powerful optimism: his insistence that we are free to act, free to change the world, and hence that we are responsible for the world as it is — responsible for war, starvation, and oppression. The fact of this freedom, experienced not pleasurably, but as anguish, is central to all of Sartre’s work, as are the strategies we develop to deny our own responsibility — what he called ‘bad faith.’
Thus Sartre insisted that there was no such thing as a natural disaster: ‘It is man who destroys his cities through the agency of earthquakes.’ In a world without human beings, an earthquake would be of no significance: just a meaningless upheaval of matter. It is only when the earthquake comes up against human projects — roads, buildings, towns — that it becomes a disaster. It is a stark reminder that in the epoch of climate change, disaster results not from nature but from human choices, human ambitions, and human brutality.
In an article written in 1948, Sartre proclaimed his ambition to ‘write for his own time.’ His aim was not to pursue universal truths, but to confront the reality of the world in which he lived. Its problems were too urgent to be neglected in favor of more long-term considerations.
To appreciate this, it is important to remember the world in which Sartre was active. Between 1939 and 1962, a period which covers much of Sartre’s work, France was only at peace for brief periods. First came the Second World War, during which German forces occupied the country, giving rise to an armed Resistance movement.
Scarcely had France achieved liberation than it was involved in two long and bitter wars, trying vainly to hang on to its vast colonial empire: an eight-year conflict in Indochina that led to humiliating defeat at Dien Bien Phu, followed by seven years of war in Algeria, characterized by brutality and torture. The violence in Algeria frequently spilled over onto the streets of mainland France.
In addition, from 1947 onwards, France was embroiled in the Cold War between the Soviet Union and the United States, with the constant threat of nuclear annihilation. Small wonder that Sartre’s work from this period is constantly preoccupied with violence.
Read the full article in Jacobin.
Migrant stories, a multi-millionaire
and the Tinder date – the rise of German cricket
Stephan Shemilt, BBC Sport, 7 May 2020
Somewhere along the border that separates Turkey and Bulgaria, Abdul Shakoor lost his way.
A trek that should have taken two days ran into six or seven. He can’t remember exactly how many, probably because he was starving and thirsty.
He had already done a similar journey before, walking 48 hours from Iran to Turkey. He was 15 years old.
Shakoor left his home in Peshawar, northern Pakistan, with the dream of reaching England. He paid $2,000, money he got from relatives, to what he calls an agent. That was the price of reaching Turkey.
More money was needed to travel across Europe. When he finally got to Bulgaria, avoiding police that would not hesitate to fire on him, he moved on through Serbia, Hungary and Austria.
In Austria, he received word from friends who were already in England that caused him to rethink his plans. Germany would provide a warmer welcome, they said. He should go there instead.
On arriving in Germany, Shakoor had the clothes he was wearing, about 100 euros in cash and a mobile phone. Nothing else.
Now, five years on, he is the opening batsman in the national team of one of the fastest-growing cricketing nations on the planet.
Even the darkest clouds can have a silver lining.
As Britain is gripped by the coronavirus pandemic, hearts have been warmed by Captain Tom Moore walking laps of his garden. Nine months after the 9/11 attacks, ante-natal classes in New York were said to be swamped by expectant mothers.
In 2015, millions of people fled Syria, Afghanistan, Iraq, other parts of Asia and Africa. Some were escaping war, others years of violence. Some were just looking for what they hoped would be a better life.
Around one million refugees and migrants ended up in Germany. One estimate is that about 180,000 were Afghans, the vast majority of whom were male and under the age of 30. In other words, cricket fans and players in a nation where the sport has never been a natural fit.
Still, it is cricket that has helped these men settle in a country where they knew little of the language, culture or heritage. As a result, the German game is on the rise, primed to make a mark on the world stage.
Read the full article on BBC Sport.
Van Schurman (1607-1678)
Project Vox, 23 April 2020
There is no more fascinating figure in early modern philosophy than the Dutch polymath Anna Maria van Schurman. A figure of renowned learning, an inspiring linguist who learned more than a dozen ancient and modern languages, van Schurman would have been a remarkable intellectual at any point in European history. The fact that she was a woman living in northern Europe in the early 17th century, a time when women were not only officially excluded from colleges, universities and intellectual academies, but also a time when they were rarely given any formal education at all, makes her all the more remarkable. She was a genius and recognized as such in her day.
However, in other respects, van Schurman defies easy categorization. Textbooks concerning the history of science and the history of philosophy in early modern Europe are full of recognizable characters. There are the ‘schoolmen’ and the ‘doctors of philosophy’ promoting the traditional, Aristotelian-focused ideas promulgated throughout the colleges of early modern Europe. And then there are the radical figures who challenged the Aristotelians—the novatores or ‘moderns’—who made history. They were led by figures who quickly became famous for their anti-Scholastic methods and ideas, people like Galileo in Italy, Bacon in England, and Descartes in France and Holland. In a deep sense, van Schurman does not fall into either camp. That is not because she lacked views about Scholasticism and the various modern challenges to it. On the contrary, she expressed those views in depth in correspondence with various figures in her day. Rather, she charted her own unique path. In terms of her method, she displayed strong sympathies for Scholasticism, which stood in stark contrast to her friend Princess Elisabeth, who was more sympathetic to modern methods such as those of Descartes (Broad 2002, 17-19).
But in terms of van Schurman’s ideas, she articulated a series of arguments in favor of women’s education that were equally challenging to the orthodoxy of European institutions at the time and to the moderns who sought to reshape those institutions in their own image. One is tempted to say that she put new wine in old bottles. But that old trope does not fully capture her unusual approach. In following a Scholastic method, especially in her famous Dissertatio, described below, she deployed an approach that would have been welcome to many of her interlocutors and certainly well known by the moderns who sought to challenge it. But in deciding to employ that method to endorse a conclusion about women’s education that was a serious challenge to nearly every male intellectual figure in Europe at that time, van Schurman greatly enriched the philosophical conversation of her day. Her work helps to underscore a profound fact that is easy to ignore. Despite their attempts to shake the very foundations of knowledge; despite their challenge to the orthodoxy of the schools throughout Europe in their day, the great ‘moderns’ like Bacon, Descartes, Galileo, and Newton did nothing to challenge the prevailing gender norms governing philosophy and education in the 17th century. Their radical agenda, which eventually did overturn an intellectual regime that had prevailed for centuries, stopped at gender’s door. In that sense, the seemingly conservative and traditional Anna Maria van Schurman was perhaps more radical in her conception of education than the greatest thinkers of her age.
Read the full article in Project Vox.
The real Lord of the Flies: what happened
when six boys were shipwrecked for 15 months
Rutger Bregman, Guardian, 9 May 2020
No one noticed the small craft leaving the harbour that evening. Skies were fair; only a mild breeze ruffled the calm sea. But that night the boys made a grave error. They fell asleep. A few hours later they awoke to water crashing down over their heads. It was dark. They hoisted the sail, which the wind promptly tore to shreds. Next to break was the rudder. ‘We drifted for eight days,’ Mano told me. ‘Without food. Without water.’ The boys tried catching fish. They managed to collect some rainwater in hollowed-out coconut shells and shared it equally between them, each taking a sip in the morning and another in the evening.
Then, on the eighth day, they spied a miracle on the horizon. A small island, to be precise. Not a tropical paradise with waving palm trees and sandy beaches, but a hulking mass of rock, jutting up more than a thousand feet out of the ocean. These days, ‘Ata is considered uninhabitable. But ‘by the time we arrived,’ Captain Warner wrote in his memoirs, ‘the boys had set up a small commune with food garden, hollowed-out tree trunks to store rainwater, a gymnasium with curious weights, a badminton court, chicken pens and a permanent fire, all from handiwork, an old knife blade and much determination.’ While the boys in Lord of the Flies come to blows over the fire, those in this real-life version tended their flame so it never went out, for more than a year.
The kids agreed to work in teams of two, drawing up a strict roster for garden, kitchen and guard duty. Sometimes they quarrelled, but whenever that happened they solved it by imposing a time-out. Their days began and ended with song and prayer. Kolo fashioned a makeshift guitar from a piece of driftwood, half a coconut shell and six steel wires salvaged from their wrecked boat – an instrument Peter has kept all these years – and played it to help lift their spirits. And their spirits needed lifting. All summer long it hardly rained, driving the boys frantic with thirst. They tried constructing a raft in order to leave the island, but it fell apart in the crashing surf.
Worst of all, Stephen slipped one day, fell off a cliff and broke his leg. The other boys picked their way down after him and then helped him back up to the top. They set his leg using sticks and leaves. ‘Don’t worry,’ Sione joked. ‘We’ll do your work, while you lie there like King Taufa‘ahau Tupou himself!’
They survived initially on fish, coconuts, tame birds (they drank the blood as well as eating the meat); seabird eggs were sucked dry. Later, when they got to the top of the island, they found an ancient volcanic crater, where people had lived a century before. There the boys discovered wild taro, bananas and chickens (which had been reproducing for the 100 years since the last Tongans had left).
Read the full article in the Guardian.
The images are, from top down: Indian migrant workers trying to get home during the lockdown (from a BBC news report); image of the Great Plague of London by Thomas Dekker; image from the cover of ‘Deaths of Despair and the Future of Capitalism’, by Anne Case and Angus Deaton; portrait of Denis Goldberg by Anastasya Eliseeva from New Frame; Mark Rylance in the BBC adaptation of ‘Wolf Hall’.