The latest (somewhat random) collection of recent essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
The ISIS files
Rukmini Callimachi, New York Times, 4 April 2018
The disheveled fighters who burst out of the desert more than three years ago founded a state that was acknowledged by no one except themselves. And yet for nearly three years, the Islamic State controlled a stretch of land that at one point was the size of Britain, with a population estimated at 12 million people. At its peak, it included a 100-mile coastline in Libya, a section of Nigeria’s lawless forests and a city in the Philippines, as well as colonies in at least 13 other countries. By far the largest city under their rule was Mosul.
Nearly all of that territory has now been lost, but what the militants left behind helps answer the troubling question of their longevity: How did a group whose spectacles of violence galvanized the world against it hold onto so much land for so long?
Part of the answer can be found in more than 15,000 pages of internal Islamic State documents I recovered during five trips to Iraq over more than a year.
The documents were pulled from the drawers of the desks behind which the militants once sat, from the shelves of their police stations, from the floors of their courts, from the lockers of their training camps and from the homes of their emirs, including this record detailing the jailing of a 14-year-old boy for goofing around during prayer….
Individually, each piece of paper documents a single, routine interaction: A land transfer between neighbors. The sale of a ton of wheat. A fine for improper dress. But taken together, the documents in the trove reveal the inner workings of a complex system of government. They show that the group, if only for a finite amount of time, realized its dream: to establish its own state, a theocracy they considered a caliphate, run according to their strict interpretation of Islam.
The world knows the Islamic State for its brutality, but the militants did not rule by the sword alone. They wielded power through two complementary tools: brutality and bureaucracy.
ISIS built a state of administrative efficiency that collected taxes and picked up the garbage. It ran a marriage office that oversaw medical examinations to ensure that couples could have children. It issued birth certificates — printed on Islamic State stationery — to babies born under the caliphate’s black flag. It even ran its own D.M.V.
Read the full article in the New York Times.
The Syrian people have been betrayed by all sides
Mehdi Hasan, The Intercept, 20 March 2018
The reality is that there are ‘no good guys in the Syrian tragedy,’ as former UN special envoy to Syria, Lakhdar Brahimi, reminded me last year. The veteran Algerian diplomat, who served as Annan’s successor between August 2012 and May 2014 before also quitting in frustration, said he placed ‘a lot of blame on the outside forces, the governments, and others who were supporting one side or the other’ but who never had ‘the interest of the Syrian people as their first priority.’
Indeed. Assad may be the biggest monster but he is far from the only monster. Some of us, therefore, refuse to pick a side; refuse to glorify regime over rebels, or Americans over Russians, or Iranians over Arabs. Brahimi had it right: a plague on all their houses. The Syrian people deserve better than Assad but they’ve been betrayed on all sides, and suffered as a result, for six long years.
A political solution based on a negotiated, power-sharing deal is now as much a chimera as a military solution in which Assad is forced from power. Syria will continue to bleed. Rather than picking between the various bad guys, and further prolonging the fighting, our time and energy would be better spent on pressuring hypocritical governments in the West and the Arab world to open their borders to Syrian refugees and also to uphold and deliver on their much-vaunted pledges of humanitarian aid. There are, in fact, many ways to help ordinary Syrians without dropping more bombs on them.
Read the full article in The Intercept.
A betrayal
Hannah Dreier, Pro Publica, 2 April 2018
If Henry is killed, his death can be traced to a quiet moment in the fall of 2016, when he sat slouched in his usual seat by the door in 11th-grade English class. A skinny kid with a shaggy haircut, he had been thinking a lot about his life and about how it might end. His notebook was open, its pages blank. So he pulled his hoodie over his earphones, cranked up a Spanish ballad and started to write.
He began with how he was feeling: anxious, pressured, not good enough. It would have read like a journal entry by any 17-year-old, except this one detailed murders, committed with machetes, in the suburbs of Long Island. The gang Henry belonged to, MS-13, had already killed five students from Brentwood High School. The killers were his friends. And now they were demanding that he join in the rampage.
Classmates craned their necks to see what he was working on so furiously. But with an arm shielding his notebook, Henry was lost in what was turning out to be an autobiography. He was transported back to a sprawling coconut grove near his grandfather’s home in El Salvador. In front of him was a blindfolded man, strung up between two trees, arms and legs splayed in the shape of an X. All around him were members of MS-13, urging him on. Then the gang’s leader, El Destroyer, stepped forward. He was in his 60s, with the letters MS tattooed on his face, chest and back. A double-edged machete glinted in his hand. He wanted Henry to kill the blindfolded man…
But now, Henry wrote, he wanted to escape the life that had followed him from El Salvador. If he stayed in the gang, he knew he would die. He needed help.
He tore out the pages and hid them inside another assignment, like a message in a bottle. Then he walked up to his teacher’s desk and turned them in.
A week later, Henry was called to the principal’s office to speak with the police officer assigned to the school. In El Salvador, Henry had learned to distrust the police, who often worked for rival gangs or paramilitary death squads. But the officer assured Henry that the Suffolk County police were not like the cops he had known before he sought asylum in the United States. They could connect him to the FBI, which could protect him and move him far from Long Island.
So after a childhood spent in fear, Henry made the first choice he considered truly his own. He decided to help the FBI arrest his fellow gang members. Henry’s cooperation was a coup for law enforcement. MS-13 was in the midst of a convulsion of violence that had claimed 25 lives on Long Island over the previous two years.
President Trump had seized on MS-13 as a symbol of the dangers of immigration, referring to parts of Long Island as ‘bloodstained killing fields.’ Police were desperately looking for informants who could help them crack how the gang worked and make arrests. Henry gave them a way in.
Under normal circumstances, Henry’s choice would have been his salvation. By working with the police, he could have escaped the gang and started fresh. But not in the dawning of the Trump era, when every immigrant has become a target and local police in towns like Brentwood have become willing agents in a nationwide campaign of detention and deportation. Without knowing it, Henry had picked the wrong moment to help the authorities.
Read the full article in Pro Publica.
Why (almost) everything reported about the Cambridge Analytica Facebook ‘hacking’ controversy is wrong
Chris Kavanagh, Medium, 26 March 2018
The real story then is not that Kogan, Wylie, and Cambridge Analytica developed some incredibly high-tech ‘hack’ of Facebook. It is that, aside from Kogan’s data selling, they used methods that were commonplace and permitted by Facebook prior to 2015. Since the story broke, Cambridge Analytica has been outed as a rather obnoxious, unethical company – at least in how it promotes itself to potential clients. But the majority of what is being reported in the media about its manipulative power is just an uncritical regurgitation of Cambridge Analytica’s (and Chris Wylie’s) self-promotional claims. The problem is that there is little evidence that the company can do what it claims and plenty of evidence that it is not as effective as it likes to pretend; see the fact that Ted Cruz is not currently president.
No one is totally immune to marketing or political messaging but there is little evidence that Cambridge Analytica is better than other similar PR or political canvassing companies at targeting voters. Political targeting and disinformation campaigns, including those promoted by Russia, certainly had an impact on recent elections but were they the critical factor? Did they have a bigger impact than Comey announcing he was ‘reopening’ the Hillary email investigation the week before the US election? Or Brexiteers claiming that £250 million was being stolen from the NHS by the EU every week? Colour me skeptical.
To be crystal clear, I’m not arguing that Cambridge Analytica and Kogan were innocent. At the very least, it is clear they were doing things that were contrary to Facebook’s data sharing policies. And similarly Facebook seems to have been altogether too cavalier with permitting developers to access its users’ private data.
What I am arguing is that Cambridge Analytica are not the puppet masters they are being widely portrayed as. If anything, they are much more akin to Donald Trump: making wildly exaggerated claims about their abilities and getting lots of attention as a result.
Read the full article on Medium.
A fate worse than death
Cathy Rentzenbrink, Prospect, 16 March 2018
We have lost our way with death. Improvements in medicine have led us to believe that a long and fulfilling life is our birthright. Death is no longer seen as the natural consequence of life but as an inconvenient and unjust betrayal. We are in an age of denial.
Why does this matter? Why not allow ourselves this pleasant and surely harmless delusion? It matters because we are in a peculiar and precise period of history where our technological advances enable us to keep people alive when we probably shouldn’t. Life or death is no longer a black and white situation. There are many and various shades of grey. We behave as though death is the worst outcome, but it isn’t.
Many years after the accident, when I wrote a book about it called The Last Act of Love, I catalogued what happened to me as I witnessed the destruction of my brother. I detailed the drinking and the depression. The hardest thing was tracking our journey from hope to despair. I still find it hard to be precise about exactly when and how I realised that Matty would be better off dead. I know I moved from being convinced that if I tried hard enough I could bring Matty back to life, to thinking I should learn to love him as he was. Eventually I asked myself the right question: would Matty himself want to be alive like this?
Of course, the answer was no.
Tony Bland, a victim of the Hillsborough disaster, had in 1993 become the first person to be allowed to die through the withdrawal of artificial nutrition and hydration. Matty became the 14th case in 1998. He was 24. He had 16 years of full life and another eight of being in a persistent vegetative state – an ugly expression for a profoundly terrible condition. We were told it would take between seven and 10 days for Matty to die but it was 13 days between the removal of his feeding tube and his final breath. I just about held it together for the first 10 days but I couldn’t deal with the extra time. You could call what happened to me a breakdown, I suppose. I’d already had a few. That’s not what it felt like though. Sitting by Matty’s bedside, unable to cope with how much I longed for his death, I felt like I was being scythed into thousands of tiny pieces.
Henry Marsh is a surgeon who has seen this from the other side of the operating table. In his excellent book Do No Harm, he describes what happens to the family of someone in a prolonged disorder of consciousness as collateral damage. He also points out that it is fairly easy to save a life with emergency brain surgery—you drill some holes and let out some blood—but that the question of what constitutes meaningful life is much more difficult to grapple with. Surgeons don’t spend much time looking at their failures. They operate from a place of hope, wishing and willing for the best, and why not? With no expert knowledge, that’s what we would all want for our stricken loved one, that the maximum effort is made.
Read the full article in Prospect.
The success of all-women shortlists risks masking the issues they were meant to solve
Helen Lewis, New Statesman, 23 March 2018
But here’s the thing. Say your industry is dominated by men, which often means white, privately educated Oxbridge men. Institute a gender quota without tackling the underlying issues and you’ll largely fill the slots with white, privately educated Oxbridge childless women (hi, call me!). And that’s not bad, as long as it’s not the end of the conversation.
Handing a golden ticket to the women who can behave most like privileged men is a very partial answer to the problem, but it massages the figures wonderfully. Similarly, using Oxford and Cambridge as a blunt indicator of privilege ignores the fact that many working-class pupils choose these universities specifically to offset the advantages of their richer, private-school peers.
Overall, I worry that we give more credit to people who do the shiny stuff and take the resulting lap of honour than those who ask the really tough questions, and argue for slower action to reform structures. This allows movements such as feminism to be co-opted as a branch of marketing: just last week, I listened to a studio executive explain that the new Lara Croft film was the most ‘badass’ yet, and realised that I was supposed to be grateful. As if watching an attractive woman in a sweaty vest kick people in the groin would, by some mysterious trickle-down effect, solve the pay gap.
Read the full article in the New Statesman.
Why are the poor blamed and shamed for their deaths?
Barbara Ehrenreich, Guardian, 31 March 2018
While the affluent struggled dutifully to conform to the latest prescriptions for healthy living – adding whole grains and gym time to their daily plans – the less affluent remained mired in the old comfortable, unhealthy ways of the past – smoking cigarettes and eating foods they found tasty and affordable. There are some obvious reasons why the poor and the working class resisted the health craze: gym memberships can be expensive; ‘health foods’ usually cost more than ‘junk food’. But as the classes diverged, the new stereotype of the lower classes as wilfully unhealthy quickly fused with their old stereotype as semi-literate louts. I confront this in my work as an advocate for a higher minimum wage. Affluent audiences may cluck sympathetically over the miserably low wages offered to blue-collar workers, but they often want to know ‘why these people don’t take better care of themselves’. Why do they smoke or eat fast food? Concern for the poor usually comes tinged with pity. And contempt…
There may well be unfortunate consequences from eating the wrong foods. But what are the ‘wrong’ foods? In the 80s and 90s, the educated classes turned against fat in all forms, advocating the low-fat diet that, journalist Gary Taubes argues, paved the way for an ‘epidemic of obesity’ as health-seekers switched from cheese cubes to low-fat desserts. The evidence linking dietary fat to poor health had always been shaky, but class prejudice prevailed: fatty and greasy foods were for the poor and unenlightened; their betters stuck to bone-dry biscotti and fat-free milk. Other nutrients went in and out of style as medical opinion shifted: it turns out high dietary cholesterol, as in oysters, is not a problem after all, and doctors have stopped pushing calcium on women over 40. Increasingly, the main villains appear to be sugar and refined carbohydrates, as in hamburger buns. Eat a pile of fries washed down with a sugary drink and you will probably be hungry again in a couple of hours, when the sugar rush subsides. If the only cure for that is more of the same, your blood sugar levels may permanently rise – what we call diabetes.
Special opprobrium is attached to fast food, thought to be the food of the ignorant. Film-maker Morgan Spurlock spent a month eating nothing but McDonald’s to create his famous Super Size Me, documenting his 11kg (24lb) weight gain and soaring blood cholesterol. I have also spent many weeks eating fast food because it’s cheap and filling but, in my case, to no perceptible ill effects. It should be pointed out, though, that I ate selectively, skipping the fries and sugary drinks to double down on the protein. When, at a later point, a notable food writer called to interview me on the subject of fast food, I started by mentioning my favourites (Wendy’s and Popeyes), but it turned out they were all indistinguishable to him. He wanted a comment on the general category, which was like asking me what I thought about restaurants.
If food choices defined the class gap, smoking provided a firewall between the classes. To be a smoker in almost any modern, industrialised country is to be a pariah and, most likely, a sneak. I grew up in another world, in the 1940s and 50s, when cigarettes served not only as a comfort for the lonely but a powerful social glue. People offered each other cigarettes, and lights, indoors and out, in bars, restaurants, workplaces and living rooms, to the point where tobacco smoke became, for better or worse, the scent of home. My parents smoked; one of my grandfathers could roll a cigarette with one hand; my aunt, who was eventually to die of lung cancer, taught me how to smoke when I was a teenager. And the government seemed to approve. It wasn’t till 1975 that the armed forces stopped including cigarettes along with food rations.
As more affluent people gave up the habit, the war on smoking – which was always presented as an entirely benevolent effort – began to look like a war against the working class. When the break rooms offered by employers banned smoking, workers were forced outdoors, leaning against walls to shelter their cigarettes from the wind. When working-class bars went non-smoking, their clienteles dispersed to drink and smoke in private, leaving few indoor sites for gatherings and conversations. Escalating cigarette taxes hurt the poor and the working class hardest. The way out is to buy single cigarettes on the streets, but strangely enough the sale of these ‘loosies’ is largely illegal. In 2014 a Staten Island man, Eric Garner, was killed in a chokehold by city police for precisely this crime.
Read the full article in the Guardian.
A litany of the ways in which Facebook corrupts the spirit of free speech
Robert Sharp, 15 March 2018
Vast swathes of political discourse take place on Mark Zuckerberg’s platform. We treat it like a public square, but it is not. At any moment, the messages we post, and the networks we have built can be taken away from us.
Whatever mechanism has been used to shut down the far right will be used to censor other groups. Campaigners will note the demise of the Britain First page and seek to have other pages similarly banned. Islamist groups and the Trans-Exclusionary Radical Feminists will be at immediate risk, but other kinds of political discussion will soon be targeted. Any legitimate political cause that contains militant elements, such as pro-Palestine or pro-Kurdish groups, could easily find its Facebook privileges revoked when those who are ideologically opposed start gaming the complaint features.
This is privatised censorship. Individuals and interest groups can and will enlist the help of a billionaire to shut up people with whom they disagree. The western liberal democracies are unlikely to participate in this shutting down of discussion, but authoritarian regimes and their avatars will get in on the act sooner or later. Embattled groups, such as the liberals in Saudi Arabia or the LGBT activists in Uganda will find themselves squeezed even tighter.
Our response to this cannot be ‘well, you can always go elsewhere’. Where exactly? MySpace? Friends Reunited? Independent websites (such as this blog) do not have the same networking opportunities and potential for ‘virality’ that the leading social media platforms offer. Social media is where our discourse happens now and all other content is filtered through these platforms. They are private spaces where we conduct very public politics. Denial of access to these spaces presents a huge barrier to expression for anyone thus suppressed. A single American company should not be the final arbiter on what organisations get to participate in British politics. We may think they have made the right call in banning Britain First… but even a stopped clock is right twice a day.
Read the full article on Robert Sharp’s blog.
Divided by DNA: The uneasy relationship between archaeology and ancient genomics
Ewen Callaway, Nature, 28 March 2018
Some archaeologists are ecstatic over the possibilities offered by the new technology. Ancient-DNA work has breathed new life and excitement into their work, and they are beginning once-inconceivable investigations, such as sequencing the genome of every individual from a single graveyard. But others are cautious.
‘Half the archaeologists think ancient DNA can solve everything. The other half think ancient DNA is the devil’s work,’ quips Philipp Stockhammer, a researcher at Ludwig-Maximilians University in Munich, Germany, who works closely with geneticists and molecular biologists at an institute in Germany that was set up a few years ago to build bridges between the disciplines. The technology is no silver bullet, he says, but archaeologists ignore it at their peril.
Some archaeologists, however, worry that the molecular approach has robbed the field of nuance. They are concerned by sweeping DNA studies that they say make unwarranted, and even dangerous, assumptions about links between biology and culture. ‘They give the impression that they’ve sorted it out,’ says Marc Vander Linden, an archaeologist at the University of Cambridge, UK. ‘That’s a little bit irritating.’
This isn’t the first time archaeologists have had to contend with transformative technology. ‘The study of prehistory today is in crisis,’ wrote Cambridge archaeologist Colin Renfrew in his 1973 book Before Civilization, describing the impact of radiocarbon dating. Before the technique was developed by chemists and physicists in the 1940s and 50s, prehistorians determined the age of sites using ‘relative chronologies’, in some cases relying on ancient Egyptian calendars and false assumptions about the spread of ideas from the Near East. ‘Much of prehistory, as written in the existing textbooks, is inadequate: some of it, quite simply wrong,’ Renfrew surmised.
It wasn’t an easy changeover — early carbon-dating efforts were off by hundreds of years or more — but the technique eventually allowed archaeologists to stop spending most of their time worrying about the age of bones and artefacts and focus instead on what the remains meant, argues Kristian Kristiansen, who studies the Bronze Age at the University of Gothenburg in Sweden. ‘Suddenly there was a lot of free intellectual time to start thinking about prehistoric societies and how they are organized.’ Ancient DNA now offers the same opportunity, says Kristiansen, who has become one of his field’s biggest cheerleaders for the technology.
Read the full article in Nature.
Why they hate Margaret Atwood
Jonathan Kay, Quillette, 15 March 2018
If you live outside Canada and recognize Atwood as the author of such renowned feminist works as Cat’s Eye, you might assume that she’d be representing the side of sound feminist doctrine in this metaphorical bout. As literary critic Carmine Starnino once noted, Atwood is the ‘best-known English-language novelist of contemporary sexual politics.’ She more or less invented the modern Anglo-Canadian feminist fiction genre, specializing in what Starnino aptly describes as ‘salty post-Freudian satires on gender inequalities, the oppressiveness of marriage and the historical animosity of women.’
In the 1980s, when I studied North American Literature as a high school elective, Atwood was the only writer with two books on our reading list. She also was the youngest writer on that list by a significant margin. Decades later, when I acted as her editor for a 2016 book about the French presence in North America, she was just as sharp and witty as I’d hoped. (In response to her complaints that my edits were too severe, I feebly protested that I’d ‘left the bones where they were, and just moved around some of the skin and hair.’ To which she replied that ‘all bones look much the same. The hair and skin are what make us recognizable.’ It’s always a thrill when your heroes put you in your place.)
And yet, this being the bizarro world of 2018, Atwood’s role in Rak’s University of Alberta event wasn’t as a feminist heroine. In fact, Atwood wasn’t even in attendance. The above-described poster was just a gimmick to promote Rak’s caricature of Atwood as the Trotsky of Canadian feminism. And the fact that Rak feels comfortable signaling this posture on publicly displayed posters shows she isn’t some outlier loon. Just the opposite: In recent years, the ideological mobbing of Atwood and other well-established writers has become a mass-participation phenomenon among young Canadian literati who mobilize daily on social media.
It’s difficult to explain the strangeness of all this to a non-Canadian. Perhaps the closest comparison I can offer would come by way of imagining the late Edward Said being denounced by Palestinian-rights advocates as a febrile Zionist—or Black Lives Matter protestors savaging the work of Ta-Nehisi Coates. As magazine writer Alicia Elliott put it recently, the world of Canadian literature (‘CanLit,’ as it’s known within the treehouse) has become ‘a raging dumpster fire’ of embittered identity politics and ideological tribalism—so much so that even speaking panels convened to discuss this dumpster fire now can be transformed by a few of Elliott’s own Tweets into meta-dumpster fires of their own.
Read the full article in Quillette.
Fatalism, freedom and the fight for America’s future
David Runciman, Boston Review, 16 March 2018
There is another conundrum. Pinker contrasts the innate pessimism of human beings with the overwhelming evidence that this attitude is wrong—people keep fearing the worst yet continued progress keeps confounding them. What that means, however, is that progress is on some level not merely consistent with pessimism but possibly contingent on it. Societies full of doomsters are the ones that have been delivering the goods. The world has been getting better while we think it has been getting worse. Of course, such societies also contain some optimists, enthusiasts and visionaries. But Pinker wants to convert us all to the cause of optimism. What good would that do? On the historical evidence, there is nothing to suggest that such societies would succeed because no such societies have ever existed. A society of optimists might well be a disaster.
Pinker’s previous book, The Better Angels of Our Nature (2011), was a more tightly argued account of progress across a particular domain – the relative decline of violence over both the long and the short term. It provoked a similar reaction – readers divided on the basis of their prior convictions about the state of the world. They read the evidence according to what they thought should be true, rather than adjusting what they thought was true in the light of the evidence. Optimists see acts of violence as the exception not the rule; pessimists see them as the rule not the exception. When a terrible act of violence takes place, we tend to filter it through the stories we tell ourselves about the possibility of progress. On the one hand, we can argue that it shouldn’t derail the good news; on the other, we can argue that it makes a mockery of the good news. Either way, we can be left with a feeling of helplessness…
Fatalism is one of the permanent temptations of modern politics, but the history of modern political thought – particularly the work of Alexis de Tocqueville and Friedrich Hayek – can help us better understand the varieties of fatalism and how the lines between optimistic and pessimistic fatalism are blurry and easily crossed.
No one wants to be a fatalist, after all. Pinker is typical in his insistence that fatalism is the enemy of progress, but he is also typical in that he is far nearer to the state of mind he is trying to confront than he appears to realise. By applying some of Tocqueville and Hayek’s insights to the challenge of, say, environmental politics, we can better highlight the risks that optimistic fatalism poses alongside the more pessimistic kind.
Read the full article in the Boston Review.
By rewriting history, Hindu nationalists aim to assert their dominance over India
Rupam Jain & Tom Lasseter, Reuters, 6 March 2018
During the first week of January last year, a group of Indian scholars gathered in a white bungalow on a leafy boulevard in central New Delhi. The focus of their discussion: how to rewrite the history of the nation.
The government of Hindu nationalist Prime Minister Narendra Modi had quietly appointed the committee of scholars about six months earlier. Details of its existence are reported here for the first time.
Minutes of the meeting, reviewed by Reuters, and interviews with committee members set out its aims: to use evidence such as archaeological finds and DNA to prove that today’s Hindus are directly descended from the land’s first inhabitants many thousands of years ago, and make the case that ancient Hindu scriptures are fact not myth.
Interviews with members of the 14-person committee and ministers in Modi’s government suggest the ambitions of Hindu nationalists extend beyond holding political power in this nation of 1.3 billion people – a kaleidoscope of religions. They want ultimately to shape the national identity to match their religious views, that India is a nation of and for Hindus.
In doing so, they are challenging a more multicultural narrative that has dominated since the time of British rule, that modern-day India is a tapestry born of migrations, invasions and conversions. That view is rooted in demographic fact. While the majority of Indians are Hindus, Muslims and people of other faiths account for some 240 million people, or a fifth of the populace.
The committee’s chairman, KN Dikshit, told Reuters, ‘I have been asked to present a report that will help the government rewrite certain aspects of ancient history.’ The committee’s creator, Culture Minister Mahesh Sharma, confirmed in an interview that the group’s work was part of larger plans to revise India’s history.
Read the full article on Reuters.
The consciousness deniers
Galen Strawson, NYRB Daily, 13 March 2018
What is the silliest claim ever made? The competition is fierce, but I think the answer is easy. Some people have denied the existence of consciousness: conscious experience, the subjective character of experience, the ‘what-it-is-like’ of experience. Next to this denial – I’ll call it ‘the Denial’ – every known religious belief is only a little less sensible than the belief that grass is green.
The Denial began in the twentieth century and continues today in a few pockets of philosophy and psychology and, now, information technology. It had two main causes: the rise of the behaviorist approach in psychology, and the naturalistic approach in philosophy. These were good things in their way, but they spiraled out of control and gave birth to the Great Silliness. I want to consider these main causes first, and then say something rather gloomy about a third, deeper, darker cause. But before that, I need to comment on what is being denied – consciousness, conscious experience, experience for short.
What is it? Anyone who has ever seen or heard or smelled anything knows what it is; anyone who has ever been in pain, or felt hungry or hot or cold or remorseful, dismayed, uncertain, or sleepy, or has suddenly remembered a missed appointment. All these things involve what are sometimes called ‘qualia’ – that is to say, different types or qualities of conscious experience. What I am calling the Denial is the denial that anyone has ever really had any of these experiences.
Perhaps it’s not surprising that most Deniers deny that they’re Deniers. ‘Of course, we agree that consciousness or experience exists’, they say – but when they say this they mean something that specifically excludes qualia.
Who are the Deniers? I have in mind – at least – those who fully subscribe to something called ‘philosophical behaviorism’ as well as those who fully subscribe to something called ‘functionalism’ in the philosophy of mind. Few have been fully explicit in their denial, but among those who have been, we find Brian Farrell, Paul Feyerabend, Richard Rorty, and the generally admirable Daniel Dennett. Ned Block once remarked that Dennett’s attempt to fit consciousness or ‘qualia’ into his theory of reality ‘has the relation to qualia that the US Air Force had to so many Vietnamese villages: he destroys qualia in order to save them.’
One of the strangest things the Deniers say is that although it seems that there is conscious experience, there isn’t really any conscious experience: the seeming is, in fact, an illusion. The trouble with this is that any such illusion is already and necessarily an actual instance of the thing said to be an illusion. Suppose you’re hypnotized to feel intense pain. Someone may say that you’re not really in pain, that the pain is illusory, because you haven’t really suffered any bodily damage. But to seem to feel pain is to be in pain. It’s not possible here to open up a gap between appearance and reality, between what is and what seems.
Read the full article in NYRB Daily.
‘Magic, illusions, and zombies’: An exchange
Daniel Dennett, NYRB Daily, 3 April 2018
I thank Galen Strawson for his passionate attack on my views, since it provides a large, clear target for my rebuttal. I would never have dared put Strawson’s words in the mouth of Otto (the fictional critic I invented as a sort of ombudsman for the skeptical reader of Consciousness Explained) for fear of being scolded for creating a strawman. A full-throated, table-thumping Strawson serves me much better. He clearly believes what he says, thinks it is very important, and is spectacularly wrong in useful ways. His most obvious mistake is his misrepresentation of my main claim:
If [Dennett] is right, no one has ever really suffered, in spite of agonizing diseases, mental illness, murder, rape, famine, slavery, bereavement, torture, and genocide. And no one has ever caused anyone else pain.
I don’t deny the existence of consciousness; of course, consciousness exists; it just isn’t what most people think it is, as I have said many times. I do grant that Strawson expresses quite vividly a widespread conviction about what consciousness is. Might people—and Strawson, in particular—be wrong about this? That is the issue.
He invokes common sense against which to contrast ‘the silliest claim ever made’ (I’m honored!), but here is some other common sense that pushes back: when you encounter people who claim to have seen a magician saw a lady in half, counsel them to postpone their extravagant hypotheses – backwards time travel, multi-world wormholes, quantum entanglement, ‘real magic’ – until they have exhausted the more mundane possibilities. Unrevolutionary science has discovered good explanations for such heretofore baffling phenomena as reproduction, metabolism, growth, and self-repair, for instance. So while it is possible that we will have to overthrow that science in order to account for consciousness, we should explore the default possibilities first. This is the pragmatic policy of naturalism, nothing more. And since we already have lots of evidence that nature has devised a cornucopia of shortcuts and indirect tricks to help animals cope with the complexities of their environments, we would be wise to check first for the possibility that we have somehow inflated our own sense of the ‘magic’ of our consciousness.
Strawson claims to know already that this is hopeless, and even urges a pre-emptive strike against the attempt. He insists, citing Bertrand Russell as his authority, that ‘we know something fundamental about the essential nature of conscious experience just in having it.’ How strange it would be for us to know something ‘fundamental’ about the ‘essential nature’ of a phenomenon simply by undergoing it! We can know something important, something that cannot be ignored, while still being in the dark about the ‘essential nature’ of a phenomenon. Some cancer sufferers think they know something fundamental about their cancer just because it is theirs; but while they no doubt know something about how it seems to them, this is not the kind of knowledge of ‘something fundamental’ to pit against empirical research.
Read the full article in NYRB Daily.
Baldwin’s lonely country
Ed Pavlic, Boston Review, 29 March 2018
Baldwin’s literary fame had been built on a complex and elusive sense of racial reconciliation, drawing together disparate but nonetheless proximate corners of the US reading public. Those connections had been tough enough to forge in books and magazines. Reconciliation in history would be much harder than that. But, as he wrote in The Fire Next Time, ‘human history in general, and Negro history in particular’, testified ‘to nothing less than the perpetual achievement of the impossible.’
The Fire Next Time had rocketed Baldwin into the role of public intellectual. Almost from the first, Baldwin employed his celebrity in increasingly politicized ways. Having gained the national spotlight, he gave lectures in support of the Congress of Racial Equality (CORE). The same month that his portrait appeared on the cover of Time, May 1963, he made the documentary Take This Hammer, which brought attention to black poverty and white gentrification in San Francisco. A week later he led an acrimonious meeting with then–U.S. attorney general Robert F. Kennedy about the dire, national implications of racism and violence in Birmingham. Outspoken in ways that kept him off the podium at the August 28 March on Washington for Jobs and Freedom, Baldwin nonetheless led an accompanying march in Paris and appeared with Brando, Sidney Poitier, Harry Belafonte, and Charlton Heston in a roundtable discussion broadcast live on TV in the United States the evening of the march.
While many continued to think of Baldwin as the spokesperson for a vision of ultimate cross-racial communion such as concluded The Fire Next Time, Baldwin’s speeches and essays grew increasingly direct about the impossibilities of saving the United States from itself. By the time of King’s murder, Baldwin had shifted his intellectual focus mainly away from black–white reconciliation to instead undertake a no-less-difficult project: facilitating a conversation connecting younger, more radical black leaders with those of his own generation.
Read the full article in the Boston Review.
‘Speak freely’
Keith E Whittington & Scott Jaschik, Inside Higher Ed, 3 April 2018
Q: Legally, the issues differ somewhat for public and private institutions. But your argument seems less legalistic and more on the role of higher education. Do you think there are any different ethical obligations on this issue for public and private institutions?
A: The law can be helpful in clarifying some of the relevant issues as we grapple with free speech controversies, but as members of the campus community, we ought to value free speech for our own purposes and not just because we are sometimes constrained to do so by legal authorities.
Public universities have to worry more about those legal restrictions, but both private and public universities have a common concern with how free speech is instrumental to the truth-seeking mission of the university and how learning to work through disagreements with reason and deliberation is an important aspect of what universities try to impart. Private institutions do have more flexibility in charting their own course, and some might choose to pursue a quite different mission. Most obviously, religious institutions might prioritize some commitments based on faith over free-ranging skeptical inquiry, but we should be clear-eyed about the compromises that are being made in such cases to the vision of a university where scholars can engage in the fearless pursuit of the truth…
Q: What do you make of ‘hate speech’? Should Richard Spencer be given the same rights to speak on campus as, for example, Peter Singer, a noted philosopher whose views offend many?
A: ‘Hate speech’ is a broad and ill-defined category. There is general agreement that threats and personal harassment have no place on campus. We should be very reluctant, however, to declare that the consideration of some ideas has no place on campus. A college campus is precisely where we should be able to scrutinize controversial, outrageous and marginal ideas. Having said that, we should be spending our time on campus trying to engage with serious ideas and in the most intellectually productive way possible. We need to make choices of what to read and discuss in the classroom when we construct a syllabus.
But we also need to make choices outside the classroom about what ideas are worth debating and how, including those ideas that we do not think are academically serious. We should want to hear from the best, most serious advocates of ideas that matter to our public debates, even if those ideas are outside the political and social mainstream — and even when we think they are misguided and wrong. We can disagree vociferously with Peter Singer and yet learn a lot from him. We would be fools to not want to hear from him and read his work. The same cannot be said for Richard Spencer. That doesn’t mean that Spencer should be specifically prohibited from speaking on campus, but I don’t see any reason why we should be inviting him to speak on a college campus.
Read the full article in Inside Higher Ed.
New brain maps with unmatched detail may change neuroscience
Monique Brouillette, Quanta Magazine, 4 April 2018
Sitting at the desk in his lower-campus office at Cold Spring Harbor Laboratory, the neuroscientist Tony Zador turned his computer monitor toward me to show off a complicated matrix-style graph. Imagine something that looks like a spreadsheet but instead of numbers it’s filled with colors of varying hues and gradations. Casually, he said: ‘When I tell people I figured out the connectivity of tens of thousands of neurons and show them this, they just go “huh?” But when I show this to people…’ He clicked a button onscreen and a transparent 3-D model of the brain popped up, spinning on its axis, filled with nodes and lines too numerous to count. ‘They go “What the _____!”’
What Zador showed me was a map of 50,000 neurons in the cerebral cortex of a mouse. It indicated where the cell bodies of every neuron sat and where they sent their long axon branches. A neural map of this size and detail has never been made before. Forgoing the traditional method of brain mapping that involves marking neurons with fluorescence, Zador had taken an unusual approach that drew on the long tradition of molecular biology research at Cold Spring Harbor, on Long Island. He used bits of genomic information to tag each individual neuron with a unique RNA sequence, or ‘bar code’. He then dissected the brain into cubes like a sheet cake and fed the pieces into a DNA sequencer. The result: a 3-D rendering of 50,000 neurons in the mouse cortex (with as many more to be added soon) mapped with single-cell resolution.
This work, Zador’s magnum opus, is still being refined for publication. But in a paper recently published by Nature, he and his colleagues showed that the technique, called MAPseq (Multiplexed Analysis of Projections by Sequencing), can be used to find new cell types and projection patterns never before observed. The paper also demonstrated that this new high-throughput mapping method is strongly competitive in accuracy with the fluorescent technique, which is the current gold standard but works best with small numbers of neurons.
Read the full article in Quanta Magazine.
American peace in an age of endless war
Samuel Moyn, Raritan Quarterly, Winter 2018
In a recent episode of Silicon Valley, the biting parody of contemporary tech culture, a startup builds an antiwar app. On the show, everyone is always promising ‘to make the world a better place.’ After you download ‘PeaceFare,’ you can send ‘virtual corn to feed virtual starving villages,’ in effect ‘turning your mobile device into an empathy machine.’ The founder has no trouble feeling good, justifying his marketing budget with effortless sanctimony: ‘We really think our company’s message is worth getting out there.’ Like much else on the program, the scene is a harsh indictment of our pseudomorality.
The conversion of peace into an opportunity for clicks is only one of the indignities visited on the historic dream to beat swords into plowshares – the dream that climaxed in the twentieth century in the midst of total war. Far worse than trivializing peace as a marketing strategy, Americans have allowed their state to embark on an endless war that shows no sign of abating. Donald Trump has made the war making more egregious, but it is built on the foundation laid by George W. Bush and Barack Obama, using a rationale of antiterror that has left a long-standing American criticism of war nearly irretrievable.
‘I lived in the first century of world wars,’ Muriel Rukeyser wrote in a 1968 poem. We live in a century of endless war. It is literally global for the first time, with American special ops present last year in 150 countries, which amounts to three-quarters of them. Yet our new brand of warfare is less spectacular and visible. ‘The news would pour out of various devices/Interrupted by attempts to sell products to the unseen’, Rukeyser continued. She responded by seeking an imaginary community, writing ‘poems for others unseen or unborn.’ But even as the literature of endless war has crystallized into an identifiable genre, the difference is that it is much easier today even for the morally sensitive to skirt the news. Our empathy machines have not worked well in the past, and now they face a new challenge – a way of war that is all but invisible.
Read the full article in the Raritan Quarterly.
The imaginative reality of Ursula K Le Guin
David Naimon, VQR, 28 March 2018
You’ve said that by diagramming sentences, you discover that they have skeletons.
I wasn’t taught that in school—that was the previous generation. My mother and my great-aunt could diagram a sentence, and they showed me how. I enjoyed it; for anyone who has that kind of mind, it’s illuminating. It’s kind of like drawing the skeleton of a horse. You go: ‘Oh, that’s how they hang together!’
It’s interesting to think that if sentences have skeletons then different sentences are, in a sense, different animals. This would bring us back to rhythm, as sentences would all have a different rhythm, a different sound, because they would walk differently.
A different gait, right. Although all the sentences in a piece would also be following a certain underlying, integrating rhythm.
In your book on writing, Steering the Craft, you say that morality and language are linked, but that morality and correctness are not the same thing. Yet we often confuse them in the realm of grammar.
The ‘grammar bullies’ – you read them in places like the New York Times – tell you what is correct: You must never use ‘hopefully.’ ‘Hopefully, we will be going there on Tuesday.’ That is incorrect and wrong and you are basically an ignorant pig if you say it. This is judgmentalism. The game that is being played there is a game of social class. It has nothing to do with the morality of writing and speaking and thinking clearly, which Orwell, for instance, talked about so well. It’s just affirming that I am from a higher class than you are. The trouble is that people who aren’t taught grammar very well in school fall for these statements from these pundits, delivered with vast authority from above. I’m fighting that. A very interesting case in point is using ‘they’ as a singular. This offends the grammar bullies endlessly; it is wrong, wrong, wrong! Well, it was right until the eighteenth century, when they invented the rule that ‘he’ includes ‘she.’ It didn’t exist in English before then; Shakespeare used ‘they’ instead of ‘he or she’ – we all do, we always have done, in speaking, in colloquial English. It took the women’s movement to bring it back to English literature. And it is important. Because it’s a crossroads between correctness bullying and the moral use of language. If ‘he’ includes ‘she’ but ‘she’ doesn’t include ‘he,’ a big statement is being made, with huge social and moral implications. But we don’t have to use ‘he’ that way—we’ve got ‘they.’ Why not use it?
This difference between grammatical correctness and the ways language engages moral questions reminds me of this quote of yours: ‘We can’t restructure society without restructuring the English language.’ That the battle is essentially as much at the sentence level as it is in the world, and it’s reflected in your work as well. I think of The Dispossessed, your novel about an anarchist utopia. There is no property in this imagined world and there are also no possessive pronouns. The world and the language of the world are reflecting back upon each other.
The founders of this anarchist society made up a new language because they realized you couldn’t have a new society and an old language. They based the new language on the old one but changed it enormously. It’s simply an illustration of what Orwell was saying in his great essay about how writing English clearly is a political matter.
Read the full article in VQR.
A jar, a blouse, a letter
Maria Dimitrova, LRB blog, 3 April 2018
State Security divided Bulgarians abroad into two camps: loyal and ‘enemy’ émigrés. A glance at Kristeva’s ‘Personal’ file – three times the size of her ‘Work’ file – reveals that she was under close surveillance from the earliest years of her career. Her private correspondence, her academic and journalistic work and her conversations with other Bulgarians were closely monitored, and information about her family was methodically collected. Sixteen officers worked on her case. The contents of one intercepted package read: ‘a jar, a blouse, a letter’.
Born in Sliven in 1941, Kristeva was first taught French by Dominican nuns in a Catholic convent. After they were expelled from Bulgaria on suspicion of espionage, she was transferred to a secular French school (the English school was open only to the children of party members). She graduated from Sofia University at the top of her class, was active in youth organisations and worked as a journalist for several publications including Narodna Mladej. In 1965 she was authorised to go to Paris to study for one year. But she stayed longer and in 1967 married Philippe Sollers. In 1970, according to a report in her ‘Personal’ file by agent ‘Petrov’, following a successful ‘recruitment talk’ she was added to the ‘agent apparatus’ under the alias ‘Sabina’.
‘In those years, there were only three ways to leave the country,’ a former member of the Dossier Commission told me. ‘You had to be a cop, have a relative in the party, or agree to collaborate with State Security.’ Everyone had a ‘verbovachna beseda’ – a recruitment talk – ‘and many never forget it for the rest of their lives.’ Agent Petrov describes Kristeva admitting she felt ‘a little uncomfortable’ because of her marriage to Sollers: before going to France, she had declared that she had no intention to marry or settle there, and was now afraid that her actions would be negatively interpreted. Petrov reports her as saying that in Paris she has become an even ‘stauncher adherent to socialism because of the trust our authorities placed in her by letting her go to Paris and allowing her parents to visit her’. ‘I asked her if she remembers our conversation in my office,’ Petrov writes. ‘She assured me that she does remember it very well, and indeed had been waiting to be contacted. To that I responded that we are patient people.’
Nothing in the files is written or signed by Kristeva. It seems the authorities expected her to ‘reveal ideological centres in France that work against Bulgaria and the USSR’ and find information about other Bulgarian intellectuals and cultural figures in France, but Kristeva did not write donosi, the personal denunciations that have become a source of painful reckoning for many Bulgarians, or supply any information that could be of use to the security services. By contrast, a classmate of Kristeva’s – alias ‘Krasimir’ – submitted a report describing her as ‘very selfish’ and ‘exceptionally ambitious’, and complaining that she treated him ‘haughtily’ in Paris.
Read the full article on the LRB blog.
The last Ottoman generation and the making of the modern Middle East
Ramazan Hakki Öztan, Reviews in History, 5 April 2018
Contrary to what nationalist historiographies have come to claim, we now know that Ottoman institutions, elites, and political culture actually survived well beyond the First World War and left behind a fragmented but resilient legacy. Provence tries to capture this reality by imagining ‘a post-Ottoman Middle East of great cities, and rural and pastoral hinterlands, inter-connected through modern infrastructure, and institutions, undivided by borders, ruling arrangements, or the constructed barriers of human consciousness’ (p. 7). He is keen on understanding the effects of mass schooling, particularly imperial military schools which – through tuition-free education and boarding facilities – attracted boys from rural sectors and of modest backgrounds. Educated under the watchful eyes of influential German officers such as Colmar von der Goltz, and infused with his notions of militarized nations, it was this last Ottoman generation that posed as ‘the saviour of the Ottoman nation’ in a series of wars that wrecked the empire from 1911 to 1918. The first chapter is devoted to a discussion of their education and broader Ottoman military culture, complete with biographies of key individuals whose careers structure the rest of the book. The transitions here from one biographical entry to another can come across as a bit disjointed, but the detailed data when combined reveal the shared career trajectories of these imperial functionaries.
The Ottoman defeat in the First World War, however, shattered the visions of imperial reconstruction long cultivated by this last Ottoman generation. The victors of the war, namely Britain and France, had made plans to partition the Ottoman territories, but it proved to be a rather complicated business to reconcile a variety of promises they had made to different parties and reach a post-war settlement. In chapter two, Provence details these negotiations and the imposition of the mandate framework onto the region, showing how the Ottoman educated elite responded by appropriating Lenin and Wilson’s language of self-determination to advance their own agendas. As Provence skilfully shows, however, what proved more effective in challenging the dictated terms of the post-war settlement was armed resistance: locally rooted but led by the war-hardened ex-Ottoman officers. In this sense, the successful Kemalist struggle against the partitions in Asia Minor became a source of inspiration for the Ottoman-Arab officers operating in Mesopotamia and Greater Syria (i.e. today’s Syria, Lebanon, Israel/Palestine, and Jordan). Due to strategic calculations, the Kemalists were also willing to help their former classmates or ‘brotherly officers’, as Provence calls them, to wage a similar struggle against Britain and France. While both imperial powers struggled greatly to establish control in Greater Syria due to a series of uprisings that featured these itinerant ex-Ottoman veterans, they managed to contain all the insurgencies, except the one led by Mustafa Kemal.
Read the full article on Reviews in History.
Scientists probe an enduring question: Can language shape perception?
Jyoti Madhusoodanan, Undark, 4 April 2018
Picture a sunlit Grecian sea or the deep hues of Santorini’s rooftops. Both are called ‘blue’ in English. But to Greek speakers, the lighter hue is ‘ghalazio’ and the darker color ‘ble.’ When researchers showed native speakers of both languages squares of light and dark blue, they found that the Greek speakers saw the two colors as more distinct from each other. Did speaking a different language influence how people perceived the two colors?
The idea that language can shape perception and thought — a hypothesis formally known as ‘linguistic relativity’ — harkens back to the 1930s. The hypothesis asserts that language doesn’t just express ideas but actively shapes them, determining how we understand the world around us. Initially met with great interest, the idea fell out of favor by the 1960s due to a lack of scientific evidence.
Still, the notion refuses to die. Many researchers now believe that language does play a role in some aspects of cognitive activity, but the nature and extent of this role remain frustratingly murky. ‘We’d like to be able to say: look we tested it, and here’s the answer,’ says Terry Regier, a linguistics researcher at the University of California, Berkeley. But evidence to support some version of linguistic relativity ‘doesn’t replicate reliably.’
Now, cognitive scientists are applying new technologies to resolve the issue. Their methods can tap directly into the brain to track blood flow and electrical activity in response to sensory input, allowing researchers to investigate the neural mechanisms by which language works to influence cognitive function. Additionally, research groups have begun to study young infants. These pudgy-cheeked humans allow us to understand the brain’s capacity to process sensory information before words are learned. Taken together, these experiments point in a surprising direction: Language does, indeed, influence our ability to perceive the world around us.
In 2016, researchers in Japan investigated whether babies can categorize colors, using a method that measured the babies’ neural responses to various colors and shades. A color switch between blue and green triggered an increase in blood flow to certain parts of the brain, indicating that the brain perceived and processed the two colors differently — well before the babies had learned the words ‘green’ and ‘blue.’ Yet when babies were shown two shades of green, there was no corresponding change in blood flow.
Read the full article in Undark.
The disappearance of books threatens to erode fine arts libraries
Sarah E Bond, Hyperallergic, 21 March 2018
The University of Texas at Austin isn’t the only large public university planning to do away with its art library. The University of Wisconsin Library System has recently released a Facilities Master Plan, which proposes to eliminate the Kohler Art Library by 2030. Like UT-Austin, UW-Madison is planning to slash the overall number of books available on campus, in this case to 15% of the total collection. The Kohler Art Library’s holdings would eventually be reduced by 50%, with the other half moved off-site. UW-Madison’s Vice Provost for Libraries and University Librarian, Edward V. Van Gemert, echoes UT-Austin Dean Dempster’s reasoning for the reduction in browsable books, saying: ‘The vision of the plan is to strengthen the role of campus libraries in the academic pursuits of the University by providing the needed spaces and services at strategic locations across campus in alignment with campus planning.’
Across the country, many university libraries are engaged in a book purge. This has meant reassessing the use of library spaces and consolidating book holdings in a bid to attract more visitors. In states like Missouri and Kansas, libraries have begun to spend more and more of their annual budgets on digital subscriptions and spaces for people, rather than on the acquisition of physical books. As in Austin and Madison, such shifts have often been met with resistance. At Syracuse University in New York, a proposal to move books to a distant warehouse provoked a faculty uproar. The struggle ultimately resulted in the university building a 20,000-square-foot storage facility nearby for over 1 million books — guaranteeing next-business-day delivery.
With space at a premium and books ever-multiplying, how can we continue to make the case for browsable libraries dedicated to fine arts on major research campuses across the United States? University of Texas sophomore Abigail Sharp (a Corporate Communication major with a minor in Art History who is working on a certificate in Museum Studies) perhaps put it best. ‘The numerous irrelevant, algorithmically selected results from an online database do not compare to the titles found so close to one another when wandering the stacks of the library,’ she told me. ‘Some of the best research and learning materials I’ve come across have been analog, and physically holding and flipping through print makes all the difference in learning and retention. There is a way to incorporate the new design major with the current Fine Arts Library and Building, and that does not require relocation of analog resources.’
Read the full article in Hyperallergic.
Advances in human behaviour came surprisingly early in Stone Age
John Tollefson, Nature, 15 March 2018
Early humans in eastern Africa crafted advanced tools and displayed other complex behaviours tens of thousands of years earlier than previously thought, according to a trio of papers published on 15 March in Science. Those advances coincided with — and may have been driven by — major climate and landscape changes.
The latest evidence comes from the Olorgesailie Basin in southern Kenya, where researchers have previously found traces of ancient relatives of modern humans as far back as 1.2 million years ago.
Evidence collected at sites in the basin suggests that early humans underwent a series of profound changes at some point before roughly 320,000 years ago. They abandoned simple hand axes in favour of smaller and more advanced blades made from obsidian and other materials obtained from distant sources. That shift suggests the early people living there had developed a trade network — evidence of growing sophistication in behaviour. The researchers also found gouges on black and red rocks and minerals, which indicate that early Olorgesailie residents used those materials to create pigments and possibly communicate ideas.
All of these changes in human behaviour occurred during an extended period of environmental upheaval, punctuated by strong earthquakes and a shift towards a more variable and arid climate. These changes occurred at the same time as larger animals disappeared from the site and were replaced by smaller creatures. ‘It’s a one-two punch combining tectonic shifts and climate shifts,’ says Rick Potts, who led the work as director of the human origins programme at the Smithsonian Institution in Washington DC. ‘That’s the kind of stuff out of which evolution arises.’
Read the full article in Nature.
The images are, from the top down: ‘The Death of Christ and Three Mourners’ by Andrea Mantegna; Bell Beaker pot (photo credit: Ashmolean Museum/University of Oxford/Bridgeman/Nature); Lucy McKenzie, ‘Untitled’ (courtesy Lucy McKenzie/Galerie Buchholz/The Museum of Modern Art/NYRB Daily); image by Olena Shmahalo/Quanta Magazine; photo of Mehmed VI, the last Ottoman Sultan (credit: Wikipedia).
(Admittedly a problem in the US rather than the UK.) We quite rightly deplore interference with freedom of speech on university campuses, and yet tolerate the existence of a whole network of accredited degree-granting institutions where even tenured professors can and do lose their jobs if they come to believe that the universe is more than a few thousand years old: https://paulbraterman.wordpress.com/2013/12/04/credit-where-none-is-due-and-creationist-colleges-and-courses/
The piece on Margaret Atwood reminds me of what you wrote earlier on the Erinyes vs the Eumenides. The Erinyes are winning.
In the piece on ‘Speak Freely’, I’m with him when he says ‘those ideas that we do not think are academically serious.’ But I also feel like saying: when will you apply the same standard to postmodernism?