Pandaemonium

PLUCKED FROM THE WEB #53


The latest (somewhat random) collection of recent essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

CRISPR babies: a view from the centre of the storm
Robin Lovell-Badge, Development, 6 February 2019

JK’s rationale was clearly to make children resistant to HIV. Apparently, there is a lot of stigma associated with being HIV-positive in China, or even having a family member who is, and children with HIV-positive parents are often ostracized. This may sound like an acceptable reason, but dealing with the social issues should really be a priority. Moreover, if the father is HIV-positive, as in this case, there are already methods to prevent HIV transmission to children that work extremely well. These involve first ensuring that the viral load is low by providing anti-viral drugs, and then washing the sperm prior to IVF. There was, therefore, no compelling reason or unmet need to use germline genome editing of CCR5.

The genome-editing technology itself is, in most people’s opinion, not yet ready for use, but this is also something that was overlooked by JK. It is clear that both babies are mosaic for the edits, i.e. not all their cells contain them. This happens when the edits occur in the late one-cell embryo or even at the two-cell stage, and current research seeks to develop methods to avoid this inefficiency. While off-target events don’t seem to be as much of a concern now, there are still issues with on-target events not necessarily being what you want. JK used simple non-homologous end joining (NHEJ) to make small mutations. Initially, he told us that he was trying to recreate the delta-32 mutation but that this hadn’t worked in one of the babies. In fact, what we know now is that it had not worked at all: none of the embryos he implanted contained delta-32 mutations. They harbour completely novel mutations, of which we have absolutely no understanding. They may fail to confer HIV-resistance and could even lead to deleterious effects on the immune system or the brain. There was no preclinical data to assess the consequences of the specific mutations engineered by JK in the embryos before they were transferred into the mothers. All this shows that JK didn’t know enough about CCR5 or the potential risks of tinkering with this gene. Of course, we could now recreate those mutations in mice or other animals and determine their effects. In theory, it would also be possible to take blood samples from Lulu and Nana when they are older and see whether their white cells are resistant to HIV. However, it’s not going to be easy to fully understand the effects of the mutations he made. The role of CCR5 in the brain is poorly understood, and it will not be simple to test the consequences of the mutations made, on either HIV resistance or CNS function. Nor will it be trivial to assess whether such genome-edited individuals are more prone to the serious effects of influenza, etc.

Read the full article in Development.


.

Everything you know about global order is wrong
Adam Tooze, Foreign Policy, 30 January 2019

By the late 1960s, barely more than 10 years old, Bretton Woods was already in terminal trouble. And when confronted with demands for deflation, U.S. President Richard Nixon reverted to economic nationalism. Between 1971 and 1973, he unhitched the dollar from gold and abandoned any effort to defend the exchange rate, sending the dollar plunging and helping to restore something closer to trade balance. If our own world has a historic birthplace, it was not in 1945 but in the early 1970s with the advent of fiat money and floating exchange rates. The unpalatable truth is that our world was born not out of wise collective agreement but out of chaos, unleashed by America’s unilateral refusal any longer to underwrite the global monetary order.

As the tensions built up in the 1960s exploded, foreign exchange instability contributed to a historically unprecedented surge in inflation across the Western world. We now know that this era of inflationary instability would be concluded by the market revolution and what Ben Bernanke dubbed the ‘great moderation.’ But once again hindsight should not blind us to the depth of the crisis and uncertainty prevailing at the time. The first attempts to restore order were not by way of the market revolution but by the means of corporatism—direct negotiations among governments, trade unions, and employers with a view to limiting the vicious spiral of prices and wages. This promised a direct control of inflation by way of price setting. But its effect was to stoke an ever-greater politicization of the economy. With left-wing social theorists diagnosing a crisis of capitalist democracy, the Trilateral Commission warned of democratic ungovernability.

What broke the deadlock was not some inclusive conference of stakeholders. The stakeholders in the 1970s were obstreperous trade unions, and that kind of consultation was precisely the bad habit that the neoliberal revolutionaries set out to break. The solution, as U.S. Federal Reserve chair Paul Volcker’s recent memoirs make embarrassingly clear, was blunt force wielded by the Fed. Volcker’s unilateral interest rate hike, the sharp revaluation of the dollar, deindustrialization, and the crash of surging unemployment dealt a death blow to organized labor and tamed inflationary pressure. The Volcker shock established so-called independent central bankers as the true arbiters of the new dispensation.

They put paid to what Margaret Thatcher referred to as the ‘enemy within.’ But the global victory of the liberal order required a more far-reaching struggle. The world of the market revolution of the 1980s was still divided between communism and capitalism, between first, second, and third worlds. The overcoming of those divisions was a matter of power politics first and foremost, negotiation second. The United States and its allies in Europe raised the pressure on the Soviet Union, and after a period of spectacularly heightened tension, Mikhail Gorbachev chose to de-escalate, unwittingly precipitating the union’s collapse.

The truth is that the postwar moment that the Davos crowd truly hankers after is not that of 1945 but the aftermath of the Cold War, the moment of Western triumph. It was finally in 1995 that the Bretton Woods vision of a comprehensive world trade organization was realized. A sanitized version of this moment would describe it as a third triumph of enlightened technocracy. After Bretton Woods and the defeat of inflation, this was the age of the Washington Consensus. But as in those previous moments, its underpinnings were power politics: at home the humbling of organized labor, abroad the collapse of the Soviet challenge and the decision by the Beijing regime to embark on the incorporation of China into the world economy.

Read the full article in Foreign Policy.


.

America’s original identity politics
Sarah Churchwell, NYRB Daily, 7 February 2019

The good news for anyone feeling perturbed is that it simply isn’t true that identity politics represents the end of America or of liberal democracy. Nor is it true that identity politics began on the left, or that the Klan was America’s first ‘identity movement.’ As historians like Fukuyama and Lilla should know, the United States was founded on identity politics, per The Economist’s description: political positions based on ethnicity, race, sexuality, and religion. There are no pre-identity politics, just as there are no pre-identity economics, in a country in which political, economic, and legal rights were only ever granted to some identity groups and not to others. The only thing new about ‘the omnipresent rhetoric of identity’ is the voices that have been added to it, reshaping it in ways that alarm and affront those who used to be its sole authors. But it was always omnipresent.

Virtually every major event in the long and troubled history of the United States was a direct consequence of identity politics. Start whenever you think America begins, and power struggles based on identity will be staring you in the face, starting with the genocide and forced resettlement of indigenous peoples by European migrants. A handful of those migrants, traveling on the Mayflower, called themselves ‘Separatists’ and decided to start a new society based on their religious beliefs, in which church membership would be a requirement of political representation. That’s identity politics.

Black people were enslaved, white people were free: it takes a colossal set of blinders to keep from seeing that as identity politics. Political judgments and legal decisions based on identity underwrote white supremacy from the start: measuring African Americans as three-fifths of a human is identity politics, a logic that led to the one-drop rule, the Dred Scott decision, Jim Crow segregation, and the Birther movement, to name just a few of the most consequential instances. Electoral colleges were established in order to solve the ‘problem of the Negroes,’ as James Madison put it, rigging the number of electors a state received in order to put a white supremacist thumb on the constitutional scale. Insofar as identity politics helped elect Donald Trump, electoral colleges seem a more proximate cause than debates over gender-neutral bathrooms.

Read the full article in the NYRB Daily.


.

Do we really have a ‘suicidal generation’?
Tom Chivers, UnHerd, 4 February 2019

There’s this trick that climate deniers used to use. They used to say ‘there’s been no warming since 1998’. And in a weird way they were right: looking at global atmospheric surface temperatures, none of the years that followed was as hot as 1998.

But they were cheating. They picked 1998 deliberately since it was an outlier – an El Niño year much hotter than the years around it. If you were, on the other hand, to measure from 1997 or 1999, then there were lots of much hotter years on record; and the clear trend was that later years, on average, were hotter than earlier ones. It was a wobbly, noisy line, with some outliers, but the average temperature really was going up, and the only way you could hide that trend was by cherry-picking statistics.

I was thinking about this as I read the Sunday Times splash this week, which (using as-yet unavailable data from the Office for National Statistics) claimed that the ‘suicide rate among teenagers has nearly doubled in eight years’. It expressed concerns that we are raising ‘a suicidal generation’.

They said that the suicide rate among 15- to 19-year-olds in 2010 was just over three per 100,000. The ONS figures, due out in September, will (apparently) show that it is now over five per 100,000. Inevitably enough, the piece links the purported rise to the growth of social media since 2010.

But this is – and I don’t want to get too technical here, but bear with me – absolute bollocks from top to bottom. It’s a masterclass in what scientists call ‘hypothesising after results are known’, or HARKing. If you have the data in front of you, then you can make it say almost anything you like.

First, it’s worth noting that very few teenagers kill themselves. The total number of suicide deaths in 2017 in England and Wales was 177, out of about 3.25 million. That means that small changes can look like big percentage swings. More important, though, the Sunday Times story did exactly what the climate deniers did. The year 2010 had the lowest rate of teen suicides of any year since at least 1981, when the ONS records begin. You could compare it with literally any other year and you’d see a rise.

Added to which, picking social media as your reason for the rise is completely arbitrary. Social media did not start in 2010. The BBC TV series Sherlock, starring Benedict Cumberbatch and Martin Freeman, did, though. Maybe we should blame that.

You could, if you wanted to, use the same trick to tell the exact opposite story. Facebook was first released in 2004, when the suicide rate among 15- to 19-year-olds in England and Wales was 4.7. But after six years of social media being available, it had dropped to 3.1! It’s a life-saver, no?

Read the full article in UnHerd.


.

Twitter is the crystal meth of newsrooms
David Von Drehle, Washington Post, 25 January 2019

In the final week of August 2012, after Tropical Storm Isaac delivered a scouring rain to Florida’s Gulf Coast, an army of scribes and pundits wilted in the humid aftermath at the gates of the Republican National Convention in Tampa. A presidential nominating convention is not the best place to find political insights, but it’s not the worst, either. One can learn something about up-and-comers by sampling the speeches that drone on all afternoon and evening. Nuggets can be mined from interviews with delegates. Most of all, a reporter can perhaps glimpse the nominee’s inner self — not in what they show so much as how they choose to show it. (For instance, after noticing that Donald Trump was staging his 2016 convention as a mash-up of reality TV and professional wrestling, a reporter could be braced for the hyperbolic and improvisational melodrama of Trump’s presidency.)

But in the coolness of the press section of what was then known as the Tampa Bay Times Forum, I observed something I’d never seen in seven prior conventions. The seats were filled, but hardly anyone glanced at the stage or the delegates. My colleagues had their laptops opened to their TweetDecks. Side by side in the dim glow of their screens, they monitored each bon mot and burp of Twitter’s commentary on events.

Now, obviously: old fogey alert. Apply your geezer filter as you see fit. In that moment, I had a bad feeling about Twitter, and little has happened in the ensuing years to make me feel any better. Twitter is the crystal meth of newsrooms — a drug that insinuates itself into our vulnerabilities only to leave us toothless and disgraced.

What are these vulnerabilities? For one, many journalists are surprisingly shy. We chose a trade that involves watching and witnessing rather than risking and daring. For many of us, the most difficult part of the job is ringing the doorbell of a bereaved family, or prying into the opinions of unwelcoming strangers. Twitter has created a seductive universe in which the reactions of a virtual community are served up in neatly quotable bits without need for uncomfortable personal interactions.

For another, many journalists are these days under intense pressure to produce quick ‘takes’ on the news to drive website traffic. Twitter offers the amphetamine hit that makes such pressure survivable. No reporter can go to the scene of a dozen events per day, observe what happens, interview those affected, sort the meaning from the dross and file a story. But Twitter offers an endless stream of faux events: fleeting sensations, momentary outrages, ersatz insights and provocative distortions. ‘News’ nuggets roll by like the chocolates on Lucy’s conveyor belt.

Read the full article in the Washington Post.


.

A section of Diego Rivera’s Detroit Industry mural

Should we mourn the loss of industrial jobs?
James Patrick Ferns,
Working Class Studies, 4 February 2019

Steelmaking is a dirty job, performed under intense heat and exceptional danger. Harmful gases and dust destroy workers’ health, and horrific accidents cripple their bodies. Steelworkers know this all too well. As former steelworker Brian Cunningham put it, a typical day could switch from ‘mundane, repetitive, monotonous, to absolute terror … When it went wrong, it went spectacularly wrong’. Peter Hamill recalled how a fellow steelworker ‘had been feeding a rope in and it had whiplashed him, cut him, killed him’. Death was a persistent reality for steelworkers. Tommy Johnston’s ‘lowest point’ was witnessing a co-worker ‘strangled in a conveyor belt’.

In light of such stories we might expect workers to be glad to see steelworks close. But they aren’t. Instead, they say they would not have left if given the choice. As Johnston said, he would ‘go back tomorrow’ if he could. Why?

These workers are not deluded by rose-tinted nostalgia; they simply recognize that the jobs they have now don’t offer the real, tangible benefits of industrial work. Following a path common to other displaced industrial workers, the former steelworkers I interviewed found new jobs as production line workers, taxi drivers, cleaners, or janitorial staff. Most of these jobs are insecure and low-paid. For James Carlin there was ‘absolutely no comparison whatsoever’ – his wage more than halved in his first new job. Likewise, Cunningham’s annual salary diminished from £24,000 to £10,000. Finding it ‘hard to say’, Johnston admitted that after twenty-five years his annual wage is now just below his steelworker wage of 1991 in absolute terms. It wasn’t just that wages were considerably lower. Workers also lost access to monthly bonuses, union-negotiated wage rises, and seniority-based promotion. Instead of improving their standard of living, these former steelworkers had to cut household spending and sacrifice hobbies, social outings, and family holidays. Some sold their cars and even forfeited ownership of their homes.

But even more than the economic loss, these workers mourn the loss of a powerful and vocal union and the workplace culture it fostered. Cunningham described labour relations in his former workplace as ‘respectful’; the authority of the union ‘always put the management on notice’. In stark contrast, his new management was aggressively anti-union: ‘If you joined a union you were sacked, you were out the door’. Carlin observed workforce bullying and harassment by an ‘almost dictatorial’ management which actively suppressed union organising by outright dismissal. Not blinded by nostalgia, workers were in fact well aware that the culture of workforce/management ‘mutual respect’ was not underpinned by benevolence, but rather necessity, as workers’ treatment corresponds to their respective power in the workplace. By contrast, Carlin summed up his later non-union workplace by observing – ‘we never had any power, we never had any voice’. The heavy unionisation of the workplace encouraged a culture of solidarity and co-operation.

Read the full article in Working Class Studies.


.

Capitalism’s new clothes
Evgeny Morozov, The Baffler, 4 February 2019

In a series of remarkably prescient articles, the first of which was published in the German newspaper Frankfurter Allgemeine Zeitung in the summer of 2013, Shoshana Zuboff pointed to an alarming phenomenon: the digitization of everything was giving technology firms immense social power. From the modest beachheads inside our browsers, they conquered, Blitzkrieg-style, our homes, cars, toasters, and even mattresses. Toothbrushes, sneakers, vacuum cleaners: our formerly dumb household subordinates were becoming our ‘smart’ bosses. Their business models turned data into gold, favoring further expansion.

Google and Facebook were restructuring the world, not just solving its problems. The general public, seduced by the tech world’s youthful, hoodie-wearing ambassadors and lobotomized by TED Talks, was clueless. Zuboff saw a logic to this digital mess; tech firms were following rational—and terrifying—imperatives. To attack them for privacy violations was to miss the scale of the transformation—a tragic miscalculation that has plagued much of the current activism against Big Tech.

This analytical error has also led many clever, well-intentioned people to insist that Silicon Valley should—and could—repent. To insist, as these critics do, that Google should start protecting our privacy is, for Zuboff, ‘like asking Henry Ford to make each Model T by hand or asking a giraffe to shorten its neck.’ The imperatives of surveillance capitalism are almost of the evolutionary kind: no clever policy, not even in Congress, has ever succeeded in shortening the giraffe’s neck (it has, however, done wonders for Mitch McConnell’s).

Zuboff’s pithy term for this regime, ‘surveillance capitalism,’ has caught on. (That this term had been previously used—and in a far more critical manner—by the Marxists at Monthly Review, is a minor genealogical inconvenience for Zuboff.) Her new, much-awaited book The Age of Surveillance Capitalism exhaustively documents its sinister operations. From Pokémon Go to smart cities, from Amazon Echo to smart dolls, surveillance capitalism’s imperatives, as well as its methods—marked by constant lying, concealment, and manipulation—have become ubiquitous. The good old days of solitary drunken stupor are now gone: even vodka bottles have become smart, offering internet connectivity. As for the smart rectal thermometers also discussed in the book, you probably don’t want to know. Let’s just hope your digital wallet is stocked with enough Bitcoins to appease the hackers.

Zuboff’s book makes clear that the promises of ‘surveillance capitalists’ are as sweet as their lobbying is ruthless. Tech companies, under the pompous cover of disrupting everything for everyone’s benefit, have developed a panoply of rhetorical and political tricks that insulate them from any pressure from below. It helps, of course, that the only pressure coming from below is usually the one directed at the buttons and screens of their data-sucking devices.

Read the full article in the Baffler.


.

Lessons of defeat
Ursula Lindsey, The Point, Issue 18, Winter 2019

In the early 1970s, the leftist student movement in Egypt organized protests that rocked the country. Arwa Salih was a leading member of the movement, and in 1991 she completed a manuscript entitled The Stillborn that dissected the failure of her generation—a generation that had once believed itself to be ‘in full possession of the future.’ In 1997, a few months after the book’s publication, Salih committed suicide by jumping off a Cairo balcony.

Mustafa Khalifa was also a member of a radical leftist opposition group, but in Syria. From 1982 to 1994, he was tortured and held without trial in secret prisons, including the country’s most infamous, Tadmor. The Shell, which was first published in French in 2005 and in Arabic a year later, is Khalifa’s deeply moving record of his time in nightmarish conditions. Today he lives in Paris.

Salih analyzes the failure and aftermath of a radical political movement; Khalifa bears witness to unimaginable cruelty, suffering and endurance. Their books are distinct in style and purpose, and address different experiences in different countries. Nevertheless, each book also continues to speak, unfortunately, to the present moment. (Both are now available in English: Salih’s translated, outstandingly, by Samah Selim and Khalifa’s by Paul Starkey.) Khalifa was one of the first to describe the barbarity of the regime of Hafez El Assad, whose cruelties have persisted on an unprecedented scale under the regime of Hafez’s son, Bashar, and underpinned the 2011 revolt in Syria. As for Salih’s account of the failure of an earlier protest movement, it has become a reference point for Egyptians trying to make sense of how the popular uprising they enthusiastically and daringly joined in 2011 came to such a dismal end.

When Salih was a student, Egypt’s defeat in the 1967 war with Israel and the death of President Nasser three years later left the country reeling. A regime that had ruled Egypt since its independence in 1956 lost credibility, and with it the ability to repress dissent, leading to an explosion of critical debate, art and political mobilization. Salih joined student activists who were Marxist as well as nationalist: they believed that liberation from colonialism and Western imperialism went hand in hand with class struggle.

They called on the regime of President Anwar Sadat to liberate the Sinai from Israel and to stand up to U.S. imperialism. Sadat did recover the Sinai after the 1973 October War and subsequent peace treaty. Yet he also made peace with Israel and aligned Egypt firmly within the U.S. camp of the Cold War—not the outcome that Salih and her comrades had hoped for. In addition, Sadat’s ‘Open Door’ policy of economic liberalization symbolized a decisive pivot away from socialism, ushering in an era of growing corruption, speculation and inequality. When bread riots broke out in Cairo in 1977 in response to new austerity measures imposed by the IMF, Sadat answered with army violence and mass arrests.

The young leftists who had briefly been at the forefront of a huge popular movement discovered themselves to be irrelevant. They had led the protests, Salih writes, ‘in the name of an insurgent dream: to change the future of the nation, to save it.’ But her generation became ‘superfluous’ before it could realize any of its dreams: ‘Barely launched on its journey into politics, art and science, it was quashed along with the world it had attempted to bring into being.’ It had grown ‘suddenly old; its children became incomplete projects—a stillborn generation.’

Read the full article in The Point.


.

A surveillance wall is not a good alternative
to a concrete wall
India McKinney, EFF, 29 January 2019

Since even before he took office, President Trump has called for a physical wall along the southern border of the United States. Many different organizations have argued this isn’t a great idea. In response, some Congressional Democrats have suggested turning to surveillance technology to monitor the border instead of a physical barrier.

Without specific legislative proposals, it’s hard to know what these suggestions actually mean. However, any bill Congress considers related to border security should avoid–at minimum–invasive surveillance technologies like biometric screening and collection, DNA collection, social media snooping, unregulated drones along the border, and automatic license plate readers aimed at interior traffic.

We have already seen several proposals authorizing the U.S. Department of Homeland Security (DHS) and its sub-agencies to collect biometric information from all people who exit the U.S., including U.S. citizens. We oppose legislation that would entrench and expand DHS’s existing program of facial recognition of all international travelers who take certain outgoing flights from U.S. airports. EFF is also opposed to TSA’s proposals to collect and share biometric data of domestic travelers with the FBI, the State Department, and local governments. Given the sensitivity of biometric information, we’re concerned about the threat that any collected data will be stolen or misused, as well as the potential for such programs to be expanded far beyond their original scope.

EFF has long opposed dragnet biometric surveillance of immigrants. Among other things, we oppose any proposal that would require DHS to collect DNA and other biometric information from ‘any individual filing an application, petition, or other request for immigration benefit or status.’ DNA surveillance raises special concerns, because DNA can expose sensitive information about familial history and personal health issues.

EFF opposes existing DHS and State Department programs of screening the social media of foreign visitors. We would also oppose any legislation that would expand and entrench DHS reviewing the social media accounts of visa applicants from so-called ‘high risk countries.’ These programs threaten the digital privacy and free speech of innocent foreign travelers, and the many U.S. citizens who communicate with them. Also, it is all-too-likely that such programs will invite ‘extreme vetting’ of visitors from Muslim nations.

Read the full article in EFF.


.

Many working-class people believe in Brexit.
Who can blame them?
Lisa McKenzie, LSE Blogs, 31 January 2019

The personal is always political when you are working class. I speak from a personal and a professional position as a working-class woman. I am heavily involved in political protest: I am an anarchist, so I put my cards on the table. I lost faith and tolerance in our current political system a long time ago. Seeing my family and my community crushed during the miners’ strike through state violence, and then purposefully devalued and denied ever since by both Conservative and Labour governments, will do that to a teenager.

Now, as a working-class academic, I mix and connect with people from all social strata. This is very different from my early life in the mining community. I never really understood, until the 1984 strike, the levels of class hatred and class prejudice in Britain. My family and community had cushioned me from those prejudices, which have always been deeply woven into British society. When the strike was over and we had lost, I realised very quickly that the working class in industrial communities were quickly deemed ‘backward’ and stupid, and it was they who were held responsible for holding the country back. The Conservatives created this narrative, but it was the Labour party that embraced it, and cultured it into the rhetoric of social exclusion. It was the working class that were excluding themselves from an otherwise modern, cosmopolitan and prosperous Britain through their ‘inferior culture’: there was no place for those that could not, or would not, emerge into New Labour’s vision of a third way utopia. My PhD mapped this New Labour-Third Way exclusion of Britain’s working class.

I finished it in 2009 as the New Labour project was on its last legs. My research within working class communities has spanned governments led by Blair, Brown, Cameron and Clegg, Cameron, and now May. As an ethnographer I have collected, analysed and disseminated the stories of British working class life for many years. Despite the change of government, for working-class people life has been a constant struggle for recognition, respect, dignity and value.

Since the referendum in 2016, I have been involved in academic, political and popular debates and arguments about why in some parts of the country working class people voted to leave the EU. I wrote a post for this site a year ago outlining how working class people had read, understood and heard the debates around the EU as exclusive, and elite, too often using language that diminished their own life experiences: ‘stupidity and racism’ has been the most common.

Read the full article on LSE blogs.


.

Skengdo and AM

Behind bars: After years of the UK banning music,
attempts to censor drill break alarming new ground

Ian McQuaid, Dummy, 26 January 2019

In his written history of British nightlife, Life After Dark, Dave Haslam discussed the controversy the first wave of jazz encountered when it hit British shores in the 1910s; ‘there were fundamental objections to live jazz… musical experts, for example, denounced jazz as ‘rhythm without melody’. Various moral guardians objected to the ‘negro’ origin of the music’. This came to a head when Leyton council (unsuccessfully) attempted to shut down any venues hosting jazz dances, the outraged council – fuelled by a racist, hysterical right wing press (sound familiar?) – describing jazz dances as ‘morally bad’.

Inevitably, as time passed, jazz became acceptable in polite society, and the fear that it would ultimately destroy us all proved to be pearl-clutching nonsense. Fortunately for editors with newspapers to sell, a new bogeyman soon presented itself, in the form of rock ‘n’ roll.

Back in the ’50s, the media continued to make no attempt to hide overtly racist rhetoric. The Daily Mail set the tone with a notorious anti-rock ‘n’ roll editorial in 1956 that declared: ‘It has something of the African tom tom and voodoo dance [about it]. It is deplorable. It is tribal. And it is from America. It follows rag-time, dixie, jazz, hot cha cha and boogie woogie, which surely originated in the jungle. We sometimes wonder whether it is the negro’s revenge’…

The ’70s saw the much-documented rise of punk, and the concurrent banning of the Sex Pistols both live and on air, alongside a court case that tried – unsuccessfully – to rule their album title ‘Never Mind the Bollocks’ to be criminally obscene. The ’80s had Frankie Goes To Hollywood’s synthetic sleaze banger ‘Relax’ causing outrage for its risqué lyrics, anarcho punks Crass having pressing plants refusing to print their records because they were ‘blasphemous’ and U2 terrifying the board holders of Island Records by singing about the brutality of the British army in Northern Ireland. In the ’90s, rave bangers steamrolled over sensibilities with a slew of tracks built around gleefully obvious drug references – from D Mob’s ‘We Call it Acieeeed’ to The Shamen’s chart-topping ode to getting smashed on Es, ‘Ebenezer Goode’.

It continues.

Jungle; garage; grime; all of it was considered to be a public menace. Don’t play ‘Pow!’ at carnival. Don’t let So Solid on telly. Don’t play bashment anywhere. Questions asked in parliament and coppers on telly stuttering over MC names. The same old merry-go-round with the same empty results. In the North, bassline was considered public enemy number one, and the spiritual home of the movement, Niche nightclub, was closed by the police. No dancing for you lot.

But none of all of this matches up to what happened last week – because no matter how many times the police have shut down a venue, or the BBC (or whichever radio station) has decided not to play a song, I cannot for the life of me find any examples of an artist being punished with a prison sentence…

Read the full article in Dummy.


.

The deported Americans
Brooke Jarvis, California Sunday, 31 January 2019

Ashley is one of 600,000 American-born children who are believed to be enrolled in K-12 schools across Mexico. Their lives are a reflection of the complicated realities of border politics: of the so-called ‘mixed-status’ families that formed on the U.S. side when a militarized border made it too difficult for workers to go back and forth; of deportation policies that don’t take the presence of children into consideration; of the wave of returns that followed the Great Recession, which, for the past ten years, has meant more Mexicans migrating out of the United States than into it. Often, parents choose to leave their American-citizen children, especially older ones, behind with family or friends, deciding that the pain of separation is a lesser burden than the pain of dislocation and displacement. Others bring their kids with them, hoping they’ll be able to find their place in a different world.

Theirs is a new and unique generation, one that academics are only beginning to name. ‘The students we share,’ a phrase meant to encompass transnational students living on both sides of the border, reflects the hope that the two countries will develop better support for students who researchers broadly agree are being failed educationally. Researchers I spoke to also used ‘American Mexicans,’ ‘the other Dreamers,’ and ‘Los Invisibles, the invisible ones.’ Children from ‘forced cross-border families,’ offered Maria Dolores Paris Pombo, who teaches sociology at the Colegio de la Frontera Norte. ‘This problem of being unable to adapt to Mexico or belong to the U.S. — it’s a generation that was left in between.’

Together, these American children now make up 3 percent of all students in Mexico, though the concentrations vary. In some municipalities where migration is particularly common, one in ten students is American. Across the country, public schools are scrambling to find the resources to accommodate them. ‘It’s a huge problem for Mexico,’ Patricia Gándara, a research professor at the UCLA Graduate School of Education, told me. ‘All these people who just kind of landed there.’

Read the full article in California Sunday.


.

We know online harms exist –
but this concept has one small weakness
Rowland Manthorpe, Sky News, 7 February 2019

Everyone agrees that something – something! – must be done about the ravages of ‘online harms’. This umbrella term is used in government circles to describe all the bad things about the internet, from revenge porn to misogyny.

It’s a useful bit of jargon, because it draws in all the topics which swirl around, in our endless circling discussions about the way things are now. Social isolation, trolling, underage sexting… everything has its place in the landscape of harms.

The concept only has one small weakness, which at this late stage of the debate I am embarrassed to even mention: no-one knows exactly what a harm is.

At first glance, this probably seems ridiculous. In 2019, the damage caused by the internet appears self-evident. But while it’s easy to cite examples, it’s far harder to point to evidence. In most cases, all we can say for sure is that something is happening – and, sometimes, it’s hard to be certain about that.

Take, for instance, cyberbullying. Speak to parents, and read reports in the media, and you’d get the impression that children are faced with an epidemic of cyberbullying.

Pressure groups and charities condemn it; politicians inveigh against it.

Yet when two leading researchers, Andrew Przybylski and Lucy Bowes, undertook the largest-ever study of cyberbullying in 2017, based on a representative sample of 120,115 adolescents, they concluded that children were far less likely to be victims of cyberbullying than traditional bullying, and that cyberbullying was not rising at a dramatic rate. The campaign against it, Przybylski said, was a ‘panic’.

Conversely, when harms are clear, it’s rarely obvious why. One of the most alarming recent trends is the sudden increase in self-harm among teenage girls. Research published in 2017 showed it had risen by 68% in just three years, while remaining fairly constant among younger girls and boys.

Was this the malign effect of social media? The girls themselves and their parents thought so – but the evidence was shaky.

The same goes for numerous other issues, from phone addiction to the link between mental health and social media use. Theories abound, but harm isn’t measured in TED talks.

Read the full article on Sky News.


.

A complete guide to understanding
immigrants and crime
Tanvi Misra, City Lab, 6 February 2019

Sure, individual immigrants commit crimes. But a review of available research (a study of studies, if you will) does not support the claim that migrants are more likely to engage in criminal behavior than native-born Americans. In fact, researchers have often observed the opposite relationship.

One (imperfect) way to think about a group’s relationship to crime is to see how many people from that group end up in prison—and why. An analysis by Michelangelo Landgrave and Alex Nowrasteh at the libertarian Cato Institute from 2016 found that legal and undocumented immigrants were less likely to be incarcerated than native-born Americans—and that likelihood appeared to be decreasing over time. Another one out of the Cato Institute focused specifically on the state of Texas. It showed that in 2015, undocumented immigrants had a criminal conviction rate 50 percent below that of native-born Americans. The conviction rate of those here legally was 66 percent below.

It does not appear that these rates are low because immigrants found committing crimes were swiftly deported. A working paper from 2007 released by the National Bureau of Economic Research (NBER) concluded that immigrants who come to the country either self-select so that they are less likely to cause crime to begin with, or they have much more to lose by committing crime and therefore are more easily deterred. (Some argue that even if people have committed crimes, they are human and still have the right to migrate. But that’s a deeper question for another time…)

Read the full article on City Lab.


.

The plight of the political convert
Corey Robin, New Yorker, 23 January 2019

The ex-Communist didn’t merely defect. He created the modern right, clearing a path for others, not just Communists and leftists, to follow. Twentieth-century conservatism is unthinkable without Chambers or Burnham or Irving Kristol, who, despite leaving the left, remained loyal to its imagination. They transmuted its energy into a movement that found traction in magazines like the National Review or journals like The Public Interest and, eventually, a home in the White House. The same goes for Frank Meyer, the ex-Communist intellectual who devised the Republican strategy of fusionism, which enabled free-market libertarians to ally with social traditionalists and statist Cold War warriors.

In this way, the right has often relied on the kindness of strangers. Though Burke launched his political career decades before left and right emerged as terms of political discourse—that happened only with the French Revolution—he spent much of his time in Parliament a committed reformer, inveighing against the suffering of the Irish and the Catholics, the American colonists and the colonized Indians, and slaves throughout the Americas. From that intimate knowledge of the reformer’s sensibility, he was able to craft a right that might lure liberty-minded defectors from the left. When he took aim at the Revolution, he knew where to shoot.

Curiously, the movement from right to left has never played an equivalent role in modern politics. Not only are there fewer converts in that direction, but those conversions haven’t plowed as fertile a field as their counterparts have. Since the end of the Cold War, there have been a handful of notable defections from the right: Arianna Huffington, Michael Lind, Bruce Bartlett, Glenn Loury, and, in Britain, John Gray. They’ve had virtually no effect on the left. The best the convert from the right can do, it seems, is say goodbye to his comrades and make his way across enemy lines.

Read the full article in the New Yorker.


.

How Diderot’s Encyclopedia challenged the king
Andrew Curran, Longreads, January 2019

During Diderot’s three-month imprisonment, his jailer the Count d’Argenson and the count’s brother the marquis had looked on with amusement while this ‘insolent’ philosophe had bowed and scraped before the authority of the state. In a diary entry from October 1749, the marquis related with glee how his brother the count had supposedly broken Diderot’s will. Solitary confinement and the prospect of a cold winter had succeeded where the police’s warnings had failed; in the end, the once-cheeky writer had not only begged for forgiveness, but his ‘weak mind,’ ‘damaged imagination,’ and ‘senseless brilliance’ had been subdued. Diderot’s days as a writer of ‘entertaining but amoral books,’ it seemed, were over.

The marquis was only half right. When Diderot was finally released from Vincennes in November 1749, he certainly returned to Paris with his tail between his legs. Entirely silenced, however, he was not. Two years after he left prison, the first volume of the Encyclopédie that he and Jean le Rond d’Alembert were editing together appeared in print. Its extended and self-important title, which indicated a systematic and critical treatment of the era’s knowledge and its trades, promised something far beyond a normal reference work…

Far more influential and prominent than the short single-authored works that Diderot had produced up to this point in his life, the Encyclopédie was expressly designed to pass on the temptation and method of intellectual freedom to a huge audience in Europe and, to a lesser extent, in faraway lands like Saint Petersburg and Philadelphia. Ultimately carried to term through ruse, obfuscation, and sometimes cooperation with the authorities, the Encyclopédie (and its various translations, republications, and pirated excerpts and editions) is now considered the supreme achievement of the French Enlightenment: a triumph of secularism, freedom of thought, and eighteenth-century commerce. On a personal level, however, Diderot considered this dictionary to be the most thankless chore of his life.

Read the full article on Longreads.


.

Frantz Fanon

Rapping with Fanon
Adam Shatz, NYRB Daily, 22 January 2019

‘Fanon’s books were in the house, but my father never told me about Fanon. That was his history, not mine, and he didn’t want to talk about it. Often when parents change country, they don’t want to start again. They want to move on. So their experiences aren’t really transmitted’—a problem he says the album was partly designed to redress. As Rocé writes in his notes, ‘It’s crucial to pass on these moments when anything was possible, so that they infiltrate and disperse the bleak mood that new generations are growing up with.’

Like many French rappers of his generation, Rocé was initially attracted to Fanon because he was ‘a warrior,’ an intellectual who joined a national liberation struggle, but what fascinates him now is ‘the link Fanon made between the deconstruction of imperialist culture and the creation of a new world.’ That link is the implicit subject of one of the album’s most striking tracks, ‘Complexium,’ recorded in 1974 in New York by the singer Dane Belany. ‘Complexium’ is a taut setting of a few lines from a play by the Martinican poet Aimé Césaire, who had been Fanon’s mentor; the piece originally appeared on an album dedicated to Fanon, Motivations. Belany’s story is among the intriguing ones unearthed by Rocé’s project (and recounted in the album’s excellent liner notes by the historians Naïma Yahi and Amzat Boukari-Yabara). The daughter of a Senegalese father and a Turkish mother, educated in Paris, Belany won acclaim in the 1960s as a ‘sexy jazz singer’ who combined the ‘charms of Paris and subjugation of Harlem.’ But at the 1969 Avignon Festival, she experienced an epiphany that radicalized her outlook. Her straightened hair had begun to kink under the sun when a black American jazz musician took a comb to her head, unfurled her curls, and told her, ‘this is how you should comb your hair.’ She became one of the first women in Paris to wear her hair in a towering Afro (featured in a triplicate image on the cover of Motivations), only to find herself insulted on the street. When she moved to New York a few years later, she immediately felt at home in the world of black musicians and artists—notably Ornette Coleman, who became a close friend…

What, ultimately, is the purpose of Par les damné.e.s de la terre? The revolutionary spirit it honors is all but extinguished, not least in France, where the gilets jaunes have been notable for their lack of ideology—and the conspicuous absence of non-white demonstrators. Rocé’s project carries more than a whiff of radical chic nostalgia, which he does little to conceal when he describes the 1960s and 1970s as ‘an epoch of struggles, of possibility.’ Yet Par les damné.e.s de la terre is an unexpectedly moving document, not only because it presents an extraordinary archive of recordings, but also because it illuminates the radical hopes that inspired them. Rocé himself, the son of a left-wing French Jew and a black Algerian Muslim who met at a meeting for the liberation of Angola, would not have existed without these hopes, and the new world they dared to imagine. Par les damné.e.s de la terre is a powerful reminder of what that world sounded like.

Read the full article in the NYRB Daily.


.

The Supreme Court case
that enshrined white supremacy in law
Louis Menand, New Yorker, 4 February 2019

‘White nationalist, white supremacist, Western civilization—how did that language become offensive?’ the Iowa congressman Steve King inquired of a Times reporter last month. After the remark blew up, King explained that by ‘that language’ he was referring to ‘Western civilization.’ He also said that he condemned white nationalism and white supremacy as an ‘evil and bigoted ideology which saw in its ultimate expression the systematic murder of six million innocent Jewish lives.’ (It’s unclear whether King thinks of Jews as nonwhite.)

However, to answer the congressman’s original question: only after a long struggle. Sixteen states had laws banning interracial marriage, which is pretty much the heart of the doctrine of white supremacy, until 1967, when the Supreme Court declared them unconstitutional. From the Compromise of 1877, which ended Reconstruction, to the Civil Rights Act of 1964 and the Voting Rights Act of 1965, American race relations were largely shaped by states that had seceded from the Union in 1861, and the elected leaders of those states almost all spoke the language of white supremacy. They did not use dog whistles. ‘White Supremacy’ was the motto of the Alabama Democratic Party until 1966. Mississippi did not ratify the Thirteenth Amendment, which outlawed slavery, until 1995.

How did this happen? How did white people in a part of the country that was virtually destroyed by war contrive to take political control of their states, install manifestly undemocratic regimes in them, maintain those regimes for nearly a century, and effectively block the national government from addressing racial inequality everywhere else? Part of the answer is that those people had a lot of help. Institutions constitutionally empowered to intervene twisted themselves every which way to explain why, in this matter, intervention was not part of the job description. One such institution was the Supreme Court of the United States.

Read the full article in the New Yorker.


.

Sex, ska and Malcolm X:
MI6’s covert 1960s mission to woo West Indians
Jamie Doward, Observer, 27 January 2019

Even by the demanding standards of the 1960s, Flamingo was considered a groundbreaking magazine. Mixing glamour, sex advice, culture and international politics, it was one of the first magazines to target Britain’s African-Caribbean community.

It ran from September 1961 until May 1965 and at its peak sold up to 20,000 copies in the UK and 15,000 in the US. It was also distributed in the Caribbean and West Africa, and published dedicated editions in Nigeria, Ghana and Liberia. It carried interviews with Malcolm X and advertisements for Island Records, which brought Jamaican ska music to Britain.

But now it has emerged that Flamingo blazed a trail for another extraordinary reason: its founder, Peter Hornsby, was an agent for the intelligence service, MI6, which used the magazine to push an anti-communist agenda among black and West Indian communities.

The revelation came to light after Hornsby’s wife, Jennifer, contacted Stephen Dorril, an author and senior lecturer in the journalism department at Huddersfield University, and told him of her husband’s exploits.

‘After the Notting Hill riots [in 1958] it was thought by my husband and MI6 that something had to be done with regard to helping the West Indian community,’ she told Dorril, an expert on the intelligence services who later received a copy of her private memoirs from her son, which contained fascinating details of Hornsby’s life as a spy. Peter died in 2000, Jennifer in 2014.

‘There were people inside MI6 who saw which way Africa was going in terms of politics and nationalism, and were willing to support black students, writers and aspiring politicians who were on the left but who could be persuaded to oppose communism,’ Dorril told the Observer.

‘They had links to centre-left politicians and student leaders in this country who were anti-racist and opposed to the white regimes in Africa. Through subtle propaganda activities such as Flamingo, support could be given to such social democrat initiatives while at the same time providing a pool of potential recruitment both here and in Africa and the Caribbean, where the CIA – MI6’s main rival – was a recent interloper.’

Read the full article in the Guardian.


.

Southern Africa’s problem of elite impunity
Tendai Biti & Greg Mills,
Daily Maverick, 11 February 2019

Liberation ties have proven more important than the democratic process or outcomes. South Africa’s favouritism towards Robert Mugabe and his ruling ZANU-PF regime has planted a seed which is now flourishing like a poisonous weed across southern Africa — explaining Zambia’s flawed elections in 2015 and 2016, why Joseph Kabila did not feel the need to hold elections for three years in the Congo, and why Zimbabwe lurches from crisis to crisis. The deafening silence from the region to Zimbabwe’s military coup in November 2017 is symptomatic of a lack of commitment to democratic norms and international law while reinforcing elite impunity.

Rigged elections have consequences. These are still being felt. In the week of Pahad’s shameless article, police opened fire on a group of opposition UPND supporters in Zambia led by Hakainde Hichilema who were holding a peaceful by-election rally in the remote Western Province town of Shesheke. Civilians in brightly coloured chitenge were attacked by police and party militia bedecked in camouflage fatigues and armed with automatic rifles.

Edgar Chagwa Lungu won the Zambian presidential by-election in January 2015 by less than 28,000 votes, an outcome that was disputed by the opposition. But few listened, and fewer elsewhere joined the UPND in raising a voice in protest. The international community again wrung its hands over the subsequent election which saw Lungu returned as president in August 2016 with a convenient 13,000 vote margin (out of nearly 3.8 million cast) over the threshold of 50%, thus avoiding a second-round run-off. International onlookers apparently did not have the spine to do anything about it, simply lacked a dog in the fight, or agreed with the result.

And they scarcely cleared their throats in protest when Emmerson Mnangagwa was declared the winner of the election in Zimbabwe in 2018 by a similarly suspiciously scant margin of 36,464 votes out of more than 4.7 million, again dodging a run-off. While there were some doubters, including the United States, on this occasion a number of outsiders desperate for positive change were initially enthusiastically behind the ruling party, notably including the former colonial power Britain.

Read the full article in the Daily Maverick.


.

The father of geopolitics
Phil Tinline, New Statesman, 30 January 2019

A hundred years ago this week, the statesmen in Versailles were building a new world. But on 1 February 1919, a British geographer completed a book which argued that if they did not consider how politics was shaped by land and sea, their settlement would collapse. Halford Mackinder’s book, Democratic Ideals and Reality, was ignored. Yet ever since, his ideas have played a striking, sometimes disturbing, role in international affairs.

A century on, democratic ideals of international community are taking a battering, and Mackinder’s ideas are back in fashion. ‘Mackinder helped shape strategic thinking about great-power rivalry,’ argues Professor Charles Kupchan, who was President Obama’s special assistant and a senior director on the National Security Council staff. ‘He focused on the importance of the Eurasian ‘Heartland’ and the importance of territory, strategic access, and material strength, and all of that is coming back into play.’

Mackinder was nearly 60 in 1919, and had spent much of his career pioneering the study of geography at Oxford, Reading and across the country. But in a rapidly changing international scene, he was increasingly drawn into politics. As a boy, he saw the news of Prussia’s victory over France in 1871 on the door of his Lincolnshire post office; three decades later, he was busy worrying about the rise of Germany at a ‘social imperialist’ dining club founded by Sidney and Beatrice Webb, alongside Leo Amery and HG Wells. In the face of the German challenge, Mackinder wanted to improve British ‘manpower’ through protectionism, better working-class housing and education, and a minimum wage. Democratic Ideals sometimes reads like imperialist post-liberalism: he lambasts laissez-faire policy for letting London suck the life out of the country, but at the same time he is against socialist centralisation. He hymns neighbourliness at every level, from local communities to the League of Nations. He sees the militaristic uniformity of Germany and Russia as emanating from their sweeping plains.

Portraits of Mackinder suggest a melancholy scholar – his only child died in infancy and his marriage failed – but he was a strong orator, and sustained a long public career, from running the London School of Economics, to becoming a Unionist Party MP in Glasgow, and latterly a knight and privy councillor. But he would be long forgotten were it not for the astonishing sweep of his geopolitical imagination. ‘Geopolitics’ was not a word Mackinder liked – yet he is widely seen as its father.

Read the full article in the New Statesman.


.

It’s the end of the gene as we know it
Ken Richardson, Nautilus, 3 January 2019

First, laboratory experiments have shown how living forms probably flourished as ‘molecular soups’ long before genes existed. They self-organized, synthesized polymers (like RNA and DNA), adapted, and reproduced through interactions among hundreds of components. That means they followed ‘instructions’ arising from relations between components, according to current conditions, with no overall controller: compositional information, as the geneticist Doron Lancet calls it.

In this perspective, the genes evolved later, as products of prior systems, not as the original designers and controllers of them. More likely as templates for components as and when needed: a kind of facility for ‘just in time’ supply of parts needed on a recurring basis.

Then it was slowly appreciated that we inherit just such dynamical systems from our parents, not only our genes. Eggs and sperm contain a vast variety of factors: enzymes and other proteins; amino acids; vitamins, minerals; fats; RNAs (nucleic acids other than DNA); hundreds of cell signalling factors; and other products of the parents’ genes, other than genes themselves.

Molecular biologists have been describing how those factors form networks of complex interactions. Together, they self-organize according to changing conditions around them. Being sensitive to statistical patterns in the changes, they anticipate future states, often creating novel, emergent properties to meet them.

Accordingly, even single cells change their metabolic pathways, and the way they use their genes to suit those patterns. That is, they ‘learn,’ and create instructions on the hoof. Genes are used as templates for making vital resources, of course. But directions and outcomes of the system are not controlled by genes. Like colonies of ants or bees, there are deeper dynamical laws at work in the development of forms and variations…

So it has been dawning on us that there is no prior plan or blueprint for development: Instructions are created on the hoof, far more intelligently than is possible from dumb DNA. That is why today’s molecular biologists are reporting ‘cognitive resources’ in cells; ‘bio-information intelligence’; ‘cell intelligence’; ‘metabolic memory’; and ‘cell knowledge’—all terms appearing in recent literature. ‘Do cells think?’ is the title of a 2007 paper in the journal Cellular and Molecular Life Sciences. On the other hand, the assumed developmental ‘program’ coded in a genotype has never been described.

Read the full article in Nautilus.



.

The why of reality
Nathanael Stein, Aeon, 7 February 2019

The easy question came first, a few months after my son turned four: ‘Are we real?’ It was abrupt, but not quite out of nowhere, and I was able to answer quickly. Yes, we’re real – but Elsa and Anna, two characters from Frozen, are not. Done. Then there was a follow-up a few weeks later that came just as abruptly, while splashing around a pool: ‘Daddy, why are we real?’

I don’t have a ready answer this time, partly because I don’t really understand the question. Four-year-olds ask Why? a lot – the stereotype is true, maybe even an understatement – and they use Why? ambiguously. Like little Aristotles with their legs dangling from their car seats, their Whys are ‘said in many different ways’. Sometimes these Whys even fall under neat, Aristotelian types: they might be asking what the point of something is, or how it’s made, or even asking for a criterion. Usually, you can feel your way by context.

But sometimes, like now, I have no idea what my son is asking me to explain. He’s learning about the world, and learning how to ask questions about it at the same time, so there are at least two moving targets. My only clue so far is that he previously wondered whether he was real, which made it sound like he was trying to sort things into real and not-real. So maybe the follow-up is a request for a definition: What makes something real? What distinguishes the real things from the unreal ones? If so, this could be a bit awkward. ‘Why’-questions at their most straightforward correspond to ‘Because’-answers, where the ‘because’ refers to something other than what we’re trying to explain. You’re cranky because you haven’t eaten; we’re driving because we need to get food; this food is healthy because it has the nutrients you need. But when the question is ‘Why am I real?’, what other thing is there to fill in the blank after ‘because’?

I have a professional interest in this query. The notion of reality is one of the most basic and most abstract ones we have. Raising questions about the very idea of what’s real has led to some of the most important, classic work in philosophy – from Parmenides to Aristotle to Avicenna to Aquinas to Immanuel Kant. It also, however, has a tendency to produce the kind of frustrating, easily caricatured work that leads people – including many philosophers – to wonder whether certain questions are simply pointless or even illegitimate, and to adopt a kind of skeptical stance towards abstract questions in general. That attitude can be helpfully critical, but it can also be facile and self-stultifying, and it likes to masquerade as pragmatic good sense.

So how does that kind of question get started? It’s easy enough to notice when a child starts confronting questions about good and bad, right and wrong. That’s one reason for thinking that these questions have good credentials. But when, if ever, does reality itself become an object of curiosity, or puzzlement, or wonder – and why?

Read the full article in Aeon.

.

The images are from top down: A section of Diego Rivera’s Detroit Industry mural (my photo); Skengdo and AM (photographer unknown); Frantz Fanon (photographer unknown)
