Pandaemonium

PLUCKED FROM THE WEB #46


The latest (somewhat random) collection of recent essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

Americans want to believe jobs are the solution to poverty. They’re not.

Matthew Desmond, New York Times, 11 September 2018

In America, if you work hard, you will succeed. So those who do not succeed have not worked hard. It’s an idea found deep in the marrow of the nation. William Byrd, an 18th-century Virginia planter, wrote of poor men who were ‘intolerable lazy’ and ‘Sloathful in everything but getting of Children.’ Thomas Jefferson advocated confinement in poorhouses for vagabonds who ‘waste their time in idle and dissolute courses.’ Leap into the 20th century, and there’s Barry Goldwater saying that Americans with little education exhibit ‘low intelligence or low ambition’ and Ronald Reagan disparaging ‘welfare queens.’ In 2004, Bill O’Reilly said of poor people: ‘You gotta look people in the eye and tell ’em they’re irresponsible and lazy,’ and then continued, ‘Because that’s what poverty is, ladies and gentlemen.’

Americans often assume that the poor do not work. According to a 2016 survey conducted by the American Enterprise Institute, nearly two-thirds of respondents did not think most poor people held a steady job; in reality, that year a majority of nondisabled working-age adults were part of the labor force. Slightly over one-third of respondents in the survey believed that most welfare recipients would prefer to stay on welfare rather than earn a living. These sorts of assumptions about the poor are an American phenomenon. A 2013 study by the sociologist Ofer Sharone found that unemployed workers in the United States blame themselves, while unemployed workers in Israel blame the hiring system. When Americans see a homeless man cocooned in blankets, we often wonder how he failed. When the French see the same man, they wonder how the state failed him.

If you believe that people are poor because they are not working, then the solution is not to make work pay but to make the poor work — to force them to clock in somewhere, anywhere, and log as many hours as they can. But consider Vanessa. Her story is emblematic of a larger problem: the fact that millions of Americans work with little hope of finding security and comfort. In recent decades, America has witnessed the rise of bad jobs offering low pay, no benefits and little certainty. When it comes to poverty, a willingness to work is not the problem, and work itself is no longer the solution.

Read the full article in the New York Times.


.

Hollywood’s vaccine wars
Gary Baum, Hollywood Reporter, 10 September 2014

Across California, thousands of children and babies are coughing so violently that their bodies convulse, uncontrollably wheezing and fighting to breathe for weeks. Nearly 8,000 pertussis cases have been reported in 2014 to the state’s Department of Public Health as of Sept. 2, and 267 of those patients have been hospitalized, including 58 requiring intensive care.

Adults can contract the disease, but 94 percent of all cases reported statewide involve children — and the youngest suffer the most. So far this year, three infants under 2 months of age have died statewide from pertussis, a disease commonly known as whooping cough (named for the high-pitched sound that kids make when they inhale after coughing).

Children’s Hospital Los Angeles is at the front line, with 72 pertussis patients this year. ‘A number of them have been in the ICU and very, very sick,’ says CHLA infectious disease specialist Dr. Jeffrey Bender. ‘They cough so hard, it turns into vomiting and broken ribs; they end up intubated, to ventilate their lungs.’

Although whooping cough was once a national scourge, killing more than 1,100 in 1950, decades of immunizations — the DTaP (diphtheria, tetanus and pertussis) vaccine and its forerunners — almost eliminated the disease. Only six Americans died of pertussis in 1995. Alas, an epidemic has arrived. (One case of pertussis typically can cause at least a dozen secondary transmissions because it is so communicable. Antibiotics do little to ameliorate the symptoms, except to shorten the period of infectiousness.) Medical officials are increasingly alarmed — especially here in L.A. No other county in California has more cases: 1,317 so far this year. ‘It’s a smoldering fire that has started and it could be a complete wildfire if vaccination rates continue to fall,’ says Dr. Deborah Lehman, associate director of pediatric infectious diseases at Cedars-Sinai Medical Center.

And it’s not just pertussis. According to the U.S. Centers for Disease Control, the number of domestic measles cases is at a 20-year high. About half of the cases in California involve unvaccinated patients.

Whether it’s measles or pertussis, the local children statistically at the greatest risk for infection aren’t, as one might imagine, the least privileged — far from it. An examination by The Hollywood Reporter of immunization records submitted to the state by educational facilities suggests that wealthy Westside kids — particularly those attending exclusive, entertainment-industry-favored child care centers, preschools and kindergartens — are far more likely to get sick (and potentially infect their siblings and playmates) than other kids in L.A. The reason is at once painfully simple and utterly complex: More parents in this demographic are choosing not to vaccinate their children as medical experts advise. They express their noncompliance by submitting a form known as a personal belief exemption (PBE) instead of paperwork documenting a completed shot schedule.

The number of PBEs being filed is scary. The region stretching from Malibu south to Marina del Rey and inland as far as La Cienega Boulevard (and including Santa Monica, Pacific Palisades, Brentwood, West Hollywood and Beverly Hills) averaged a 9.1 percent PBE level among preschoolers for the 2013-14 school year — a 26 percent jump from two years earlier. By comparison, L.A. County at large measured 2.2 percent in that period. Many preschools in this area spiked far higher, including Kabbalah Children’s Academy in Beverly Hills (57 percent) and the Waldorf Early Childhood Center in Santa Monica (68 percent). According to World Health Organization data, such numbers are in line with immunization rates in developing countries like Chad and South Sudan.

Read the full article in the Hollywood Reporter.


.

Just stop it
Daniel A Kaufman, The Electric Agora, 15 September 2018

In other words, some in our profession are trying to turn philosophy – and the university more generally – into a fever swamp.  It’s stupid and shortsighted and dangerous, and we all need to stop it, right now.

Currently, you’ll find the worst of this stuff going on in the brawl – you can’t call it a ‘conversation’ – over gender identity.  Just this week, the Times of London reported that a professor had organized a campaign to accuse gender-critical scholars of hate crimes against trans people, in an effort to get them fired from their jobs. (About one such scholar, Kathleen Stock, of the University of Sussex, a participant in this campaign wrote ‘File a hate crime report against her, and then the chairman and vice-chair… Drag them over the fucking coals.’) But nothing about what’s happening is specific to that topic.  You’ll find the interactions similarly toxic and dysfunctional in discussions of race and other areas that intersect with what used to be called ‘civil rights,’ but is now commonly referred to as ‘social justice.’

The way this goes is depressingly familiar to anyone who has been paying attention over the last few years.  A professor articulates a view on a controversial social or political subject that is at odds with the prevailing view in the academy or at least, with the view that is most fiercely promoted by academic activists.  It is then claimed that the professor in question has ‘harmed’ the relevant population, i.e. racial minorities, trans people, women, etc., and that consequently, his or her writing/speech is outside the frame of acceptable discourse.  If the professor decides to stand up for him or herself and reaffirm the position in question, even perhaps marshaling additional arguments or evidence on its behalf, what can only be called a ‘mob’ is then unleashed, first on social media, and then later, depending on the circumstances, against the professor’s home institution, with the aim of exacting some penalty, up to and including the termination of his or her employment.

That one even has to explain what’s wrong with this says everything about the current state of academia and social/political discourse more generally.  (This sort of thing is happening with increasing frequency outside of the academy as well.)  What I want to focus on, here, however, is the misuse – nay, the outright abuse – of the harm principle, for it is the linchpin to this entire sorry state of affairs and is something that, if not addressed, will destroy the basis on which the very possibility of critical scholarship – not to mention a liberal civil society – rests.

Read the full article in The Electric Agora.


.

Middle-class influence vs. working-class character
Jack Metzgar, Working Class Perspectives, 10 September 2018

‘Jesse’ is one of a cohort of 80 students sociologist Jessica Calarco observed from the 3rd through the 5th grades and then revisited in middle school for her new book, Negotiating Opportunities: How the Middle Class Secures Advantages in School.  Calarco also interviewed the students’ parents. Her research reveals that middle-class children practice ‘strategies of influence’ in school because their parents prioritize academic success, while working-class kids generally follow ‘strategies of deference’ because their parents care more about developing long-term character.

In middle school Jesse lost a homework packet and simply accepted a ‘0’ grade when the assignment was due.  Several weeks later his mother found the packet and made Jesse complete it.  When Jesse turned it in, his teacher ‘firmly, and a bit incredulously’ returned the packet ungraded, saying: ‘It’s a little too late for that now.  I mean, that [assignment] was like a month ago.’  Here’s how Calarco describes Jesse’s reaction:

Jesse does not look up.  He nods slowly, but he keeps his shoulders hunched forward and his head low.  As Ms. Cartwright heads back to her desk, Jesse glances up at me, his face and shoulders heavy with resignation.  He murmurs quietly, almost sadly: ‘It wasn’t to get a better grade.  It was to make me a better person.’

Jesse later explained to Calarco that his mother had told him to complete the late assignment not to improve his grade but because it was the right thing to do – ‘to work hard and take responsibility for his actions.’

Jesse is from a working-class family, and Calarco recounts in heart-breaking detail how the working-class kids she observed are disadvantaged in grade school by their inability and unwillingness to push teachers to give them more time on a test, help them with answers, and allow them to turn in homework late. Middle-class kids, on the other hand, often treat teachers’ instructions as but opening statements in a game of negotiating that these kids become amazingly good at as early as the 4th grade.

According to Calarco, middle-class kids are taught to question and negotiate with the authority of their teachers, who are there to serve and help them. They learn that children should ask for help and seek special accommodations when they need them. Working-class kids, conversely, are taught to defer to teachers, to do what they’re told, and not to burden teachers with unnecessary questions but to work out their problems on their own.

Read the full article in Working Class Perspectives.


.


Discovery of Galileo’s long-lost letter shows he edited his heretical ideas to fool the Inquisition

Alison Abbott, Nature, 21 September 2018

It had been hiding in plain sight. The original letter — long thought lost — in which Galileo Galilei first set down his arguments against the church’s doctrine that the Sun orbits the Earth has been discovered in a misdated library catalogue in London. Its unearthing and analysis expose critical new details about the saga that led to the astronomer’s condemnation for heresy in 1633.

The seven-page letter, written to a friend on 21 December 1613 and signed ‘G.G.’, provides the strongest evidence yet that, at the start of his battle with the religious authorities, Galileo actively engaged in damage control and tried to spread a toned-down version of his claims.

Many copies of the letter were made, and two differing versions exist — one that was sent to the Inquisition in Rome and another with less inflammatory language. But because the original letter was assumed to be lost, it wasn’t clear whether incensed clergymen had doctored the letter to strengthen their case for heresy — something Galileo complained about to friends — or whether Galileo wrote the strong version, then decided to soften his own words.

Galileo did the editing, it seems. The newly unearthed letter is dotted with scorings-out and amendments — and handwriting analysis suggests that Galileo wrote it. He shared a copy of this softened version with a friend, claiming it was his original, and urged him to send it to the Vatican.

The letter has been in the Royal Society’s possession for at least 250 years, but escaped the notice of historians. It was rediscovered in the library there by Salvatore Ricciardo, a postdoctoral science historian at the University of Bergamo in Italy, who visited on 2 August for a different purpose, and then browsed the online catalogue.

‘I thought, “I can’t believe that I have discovered the letter that virtually all Galileo scholars thought to be hopelessly lost,”’ says Ricciardo. ‘It seemed even more incredible because the letter was not in an obscure library, but in the Royal Society library.’

Read the full article in Nature.


.

The trouble with uplift
Adolph Reed Jr, The Baffler, No 41, September 2018

Dismissals of Glory and Free State of Jones, as well as DuVernay’s explanation for the historical falsifications at play in Selma, may give the impression that the detractors of white saviorship are voicing a populist sensibility, complaining that black people are represented as incapable of effective social action without a white person (usually a man) leading them. And there is ample precedent in the history of popular culture for suspicion in that regard. The Tarzan films are perhaps the crassest and best-known examples; my father often remarked sarcastically that Africans should be grateful for Tarzan’s presence, since otherwise they apparently would all have been eaten by lions and crocodiles. The 1988 film Mississippi Burning incongruously makes FBI agents (white, though does that really matter?) the heroes of the civil rights campaign. Richard Attenborough’s 1987 Cry Freedom describes the struggle against apartheid and the murder of Stephen Biko through the travails of his white friend, the journalist Donald Woods. And there are many more examples; it is in fact the long history of such narratives that makes what might otherwise be simple feel-good stories, presented with an interracial twist—Conrack (1974), Dangerous Minds (1995), and The Blind Side (2009), among many others—something more distasteful and pernicious than just a set of interchangeable thematic variations on the maudlin human-interest narrative of uplift and overcoming.

But ‘white savior’ objections to Glory and Free State are a different matter. Those films hinge largely on the prominence of black agency, which race-first critics apparently deem irrelevant. Their objection is not that blacks’ agency is absent; it is rather about who is represented as leading their efforts. Decisions by blacks to support nonblack candidates or social policies not expressed in race-first terms are interpreted as evidence of flawed, limited, misguided, or otherwise co-opted black agency. The idea that blacks, like everyone else, make their history under conditions not of their own choosing becomes irrelevant, just another instance of insufficient symbolic representation.

The notion that black Americans are political agents just like other Americans, and can forge their own tactical alliances and coalitions to advance their interests in a pluralist political order is ruled out here on principle. Instead, blacks are imagined as so abject that only extraordinary intervention by committed black leaders has a prayer of producing real change. This pernicious assumption continually subordinates actually existing history to imaginary cultural narratives of individual black heroism and helps drive the intense—and myopic—opposition that many antiracist activists and commentators express to Bernie Sanders, social democracy, and a politics centered on economic inequality and working-class concerns.

Read the full article in the Baffler.


.

China’s chilling ‘social credit system’ is straight out of dystopian sci-fi. And it’s already switched on
Peter Dockrill, Science Alert, 20 September 2018

It’s been in the pipeline for years: a sprawling, technological mass surveillance network the likes of which the world has never seen. And it’s already been switched on.

China’s ‘Social Credit System’ – which is expected to be fully operational by 2020 – doesn’t just monitor the nation’s almost 1.4 billion citizens. It’s also designed to control and coerce them, in a gigantic social engineering experiment that some have called the ‘gamification of trust’.

That’s because the massive project, which has been slowly coming together for over a decade, is about assigning an individual trust score to each and every citizen, and to businesses too.

According to China’s Communist Party, the system will ‘allow the trustworthy to roam freely under heaven while making it hard for the discredited to take a single step’.

To pull this off, the unprecedented scheme will harness the immense reach of China’s technological infrastructure: some 200 million CCTV cameras, according to a report by Australia’s Foreign Correspondent.

The idea is that these ever-watchful eyes will be hooked up to facial recognition systems, and cross-checked with financial, medical, and legal records – with the whole apparatus regulated and interpreted by advanced, big-data-crunching AI networks.

The sweeping dystopia of it all is uncannily reminiscent of the TV show Black Mirror – in particular the eerily prescient episode ‘Nosedive’ – but while several outlets have pointed out the similarities, China’s ultimate goal goes even further.

‘This is potentially a totally new way for the government to manage the economy and society,’ economist Martin Chorzempa from the Peterson Institute for International Economics told The New York Times in July.

‘The goal is algorithmic governance.’

Read the full article in Science Alert.


.

Workers with low levels of education still haven’t recovered from the Great Recession

Lauren Bauer & Jay Shambaugh, Brookings, 6 September 2018

Those with less education were disproportionately harmed by the Great Recession. We see that graduate degree holders — and to a lesser extent bachelor’s degree holders — experienced smaller reductions in employment during the recession. For those with no postsecondary degree, the employment rate gap in 2011 was 5 percent or more, while it was just 2 percent for those with a bachelor’s degree.

Recovery from the bottom of the trough occurred earlier for those with more education. The first upturn among graduate degree holders was between 2009 and 2010, and between 2010 and 2011 for those with a bachelor’s degree. By 2018, only those with bachelor’s or graduate degrees had returned to their demographically adjusted pre-recession employment rate.

The recession was particularly hard on those without a high school diploma. In 2010 and 2011, this group had an employment-to-population ratio that was fully six percentage points lower than in 2007. Those with a high school diploma and/or some college followed a similar trend through this period, with a slightly shallower trough during the worst of the recession than those who didn’t graduate from high school. In recent years, workers without a postsecondary degree have seen improving employment outcomes, though a gap remains.

Not only have less-educated groups recovered less fully from the recession; they also started at lower employment rates prior to the crisis. As a result, among those aged 25 and higher, 72.5 percent of those with a bachelor’s degree work, compared to just 55 percent of those with only a high school degree.

The Great Recession inflicted economic pain on many American families, but its burden was not equally distributed. Ultimately, the brunt of the Great Recession was borne by those without the protection of postsecondary education. College raises average lifetime earnings, and it also helps insulate workers from economic downturns, providing economic security in the times they need it most. Finally, racial disparities have been less severe in recovery than in the worst years of the Great Recession, though differences in employment rates persist. For the American labor market to be truly healthy, it needs to work for all people – not just some.

Read the full article in Brookings.


.


Machine learning confronts the elephant in the room
Kevin Hartnett, Quanta, 20 September 2018

Neural networks are adept at specific visual chores. They can outperform humans in narrow tasks like sorting objects into best-fit categories — labeling dogs with their breed, for example. These successes have raised expectations that computer vision systems might soon be good enough to steer a car through crowded city streets.

They’ve also provoked researchers to probe their vulnerabilities. In recent years there has been a slew of attempts, known as ‘adversarial attacks,’ in which researchers contrive scenes to make neural networks fail. In one experiment, computer scientists tricked a neural network into mistaking a turtle for a rifle. In another, researchers waylaid a neural network by placing an image of a psychedelically colored toaster alongside ordinary objects like a banana.

This new study has the same spirit. The three researchers fed a neural network a living room scene: A man seated on the edge of a shabby chair leans forward as he plays a video game. After chewing on this scene, a neural network correctly detected a number of objects with high confidence: a person, a couch, a television, a chair, some books.

Then the researchers introduced something incongruous into the scene: an image of an elephant in semiprofile. The neural network started getting its pixels crossed. In some trials, the elephant led the neural network to misidentify the chair as a couch. In others, the system overlooked objects, like a row of books, that it had correctly detected in earlier trials. These errors occurred even when the elephant was far from the mistaken objects.

Snafus like those extrapolate in unsettling ways to autonomous driving. A computer can’t drive a car if it might go blind to a pedestrian just because a second earlier it passed a turkey on the side of the road.

And as for the elephant itself, the neural network was all over the place: Sometimes the system identified it correctly, sometimes it called the elephant a sheep, and sometimes it overlooked the elephant completely.

‘If there is actually an elephant in the room, you as a human would likely notice it,’ said Rosenfeld. ‘The system didn’t even detect its presence.’

Read the full article in Quanta.


.

‘Eradicating ideological viruses’
Human Rights Watch, 9 September 2018

In May 2014, China launched its ‘Strike Hard Campaign against Violent Terrorism’ in Xinjiang. Since then, the number of people formally arrested has leaped three-fold compared to the previous five-year period, according to official figures and estimates by the nongovernmental organization Chinese Human Rights Defenders. The government has held people in pretrial detention centers and prisons, both of which are formal facilities, and in political education camps, which have no basis under Chinese law. Those detained have been denied due process rights and suffered torture and other ill-treatment.

International media attention on Xinjiang has thus far focused on the political education camps. Although the Chinese government provides no public information on the number of detainees in these camps, credible estimates place the number in these camps at around one million.[1] Within these secretive facilities, those held are forced to undergo political indoctrination for days, months, and even over a year.

It is not uncommon to find Uyghurs, particularly from Hotan and Kashgar in southern Xinjiang – perceived by the authorities as anti-government hotspots – reporting that half or more of their immediate family members are in a mix of political education camps, pre-trial detention, and prison. For example, an interviewee said her husband, his 4 brothers, and their 12 nephews – that is, all the men in the family – have been detained in political education camps since 2017.

There have been reports of deaths in the political education camps, raising concerns about physical and psychological abuse, as well as stress from poor conditions, overcrowding, and indefinite confinement. While basic medical care is available, people are held even when they have serious illnesses or are elderly; there are also children in their teens, pregnant and breastfeeding women, and people with disabilities. Former detainees reported suicide attempts and harsh punishments for disobedience in the facilities.

Chinese officials have denied that abuses have occurred; instead they characterize these camps as ‘vocational education and employment training centers’ for ‘criminals involved in minor offenses.’ However, they permit no independent monitoring of these facilities from the UN, human rights organizations, or the media…

Perhaps the most innovative – and disturbing – of the repressive measures in Xinjiang is the government’s use of high-tech mass surveillance systems. Xinjiang authorities conduct compulsory mass collection of biometric data, such as voice samples and DNA, and use artificial intelligence and big data to identify, profile, and track everyone in Xinjiang. The authorities have envisioned these systems as a series of ‘filters,’ picking out people with certain behavior or characteristics that they believe indicate a threat to the Communist Party’s rule in Xinjiang. These systems have also enabled authorities to implement fine-grained control, subjecting people to differentiated restrictions depending on their perceived levels of ‘trustworthiness.’

Authorities have sought to justify harsh treatment in the name of maintaining stability and security in Xinjiang, and to ‘strike at’ those deemed terrorists and extremists in a ‘precise’ and ‘in-depth’ manner. Xinjiang officials claim the root of these problems is the ‘problematic ideas’ of Turkic Muslims. These ideas include what authorities describe as extreme religious dogmas, but also any non-Han Chinese sense of identity, be it Islamic, Turkic, Uyghur, or Kazakh. Authorities insist that such beliefs and affinities must be ‘corrected’ or ‘eradicated.’

Read the full article in the Human Rights Watch report.


.

‘I want to burn things to the ground’
Tom Bartlett, Chronicle of Higher Education, 11 September 2018

As Barrett sees it, some of what the data thugs do ‘borders on harassment.’ The prime example is that of Amy Cuddy, whose power-pose study was the basis for a TED talk that’s been viewed more than 48 million times and led to a best-selling book, Presence (Little, Brown & Company, 2015). The 2010 study has failed to replicate, and the first author, Dana Carney, a psychologist at Berkeley, no longer believes in the effect. The power-pose study is held up as an example of psychology at its most frivolous and unreliable. Cuddy, though, has not renounced the research and has likened her treatment to bullying. She recently tweeted: ‘People who want to destroy often do so with greater passion and energy and time than people who want to build.’ Some psychologists, including Barrett, see in the ferocity of that criticism an element of sexism. It’s true that the data thugs tend to be, but are not exclusively, male — though if you tick off the names of high-profile social psychologists whose work has been put through the replication wringer, that list has lots of men on it, too. Barrett thinks the tactics of the data thugs aren’t creating an atmosphere for progress in the field. ‘It’s a hard enough life to be a scientist,’ she says. ‘If we want our best and brightest to be scientists, this is not the way to do it.’

Richard Nisbett agrees. Nisbett has been a major figure in psychology since the 1970s. He’s co-director of the Culture and Cognition program at the University of Michigan at Ann Arbor, author of books like Mindware: Tools for Smart Thinking (Farrar, Straus, and Giroux, 2015), and a slew of influential studies. Malcolm Gladwell called him ‘the most influential thinker in my life.’ Nisbett has been calculating effect sizes since before most of those in the replication movement were born.

And he’s a skeptic of this new generation of skeptics. For starters, Nisbett doesn’t think direct replications are efficient or sensible; instead he favors so-called conceptual replication, which is more or less taking someone else’s interesting result and putting your own spin on it. Too much navel-gazing, according to Nisbett, hampers professional development. ‘I’m alarmed at younger people wasting time and their careers,’ he says. He thinks that Nosek’s ballyhooed finding that most psychology experiments didn’t replicate did enormous damage to the reputation of the field, and that its leaders were themselves guilty of methodological problems. And he’s annoyed that it’s led to the belief that social psychology is riddled with errors. How do they know that? Nisbett asks, dropping in an expletive for emphasis.

Simine Vazire has heard that argument before. Vazire, a professor of psychology at the University of California at Davis, and one of the SIPS organizers, regularly finds herself in meetings where no one shares her sense of urgency about the replication crisis. ‘They think the status quo is fine, and we can make tweaks,’ she says. ‘I’m often the only person in the room who thinks there’s a big problem.’

It’s not that the researchers won’t acknowledge the need for improvement. Who’s against progress? But when she pushes them on what that means, the division becomes apparent. They push back on reforms like data transparency (sharing your data freely with other researchers, so they can check your work) or preregistration (saying publicly what you’re trying to discover in your experiment before you try to discover it). That’s not the way it’s normally been done. Psychologists tend to keep their data secret, arguing that it’s proprietary or that revealing it would endanger subjects’ anonymity. But not showing your work makes it easier to fudge what you found. Plus the freedom to alter your hypothesis is what leads to so-called p-hacking, which is shorthand for when a researcher goes searching for patterns in statistical noise.

Read the full article in the Chronicle of Higher Education.


.

Rewriting Poland
Marta Figlerowicz, Boston Review, 17 September 2018

Księgi Jakubowe (The Books of Jacob) continues Tokarczuk’s preoccupation with hybridity and belonging in a different key. It will be a pleasure to see how Croft translates this epic narrative, small parts of which have already begun to appear online. It will also be fascinating to see how the broader international public responds to it. The book comes to nearly a thousand pages in the original Polish. Told from a variety of perspectives, by a dozen character-narrators, it recounts the only partly fictionalized adventures of Jacob Frank. An eighteenth-century Jewish man born in Poland who grew up in Romania and the Ottoman Empire, Jacob travels around Eastern and Central Europe fleeing detractors and accumulating proponents, as he seeks to establish himself as the new Jewish Messiah, the reincarnation of the earlier Sabbatai Zevi. He adopts elements of Catholicism and Islam into his teachings, and temporarily converts to both these other faiths, which gets him into trouble with Catholics, Muslims, and Jews. By turns beloved and reviled by his companions, admired for his charisma and hated for his narcissism, Jacob becomes for Tokarczuk a vehicle to explore a slew of questions about interfaith dialogue, the viability of free love, the international systems of patronage that support thinkers and artists, and the purpose of amassing bodies of cultural and scientific knowledge.

Księgi Jakubowe pointedly reaffirms the importance of Jewish culture to Central and Eastern Europe; it also reminds its readers of the role Polish people in particular played in the gradual suppression of the region’s multiculturalism. In the context of present-day Polish politics, it thus reads as an implicit rebuke of the white ethno-nationalism that, as in much of Europe, now threatens to grab the reins of government. Tokarczuk’s unapologetic distaste for the Polish far-right has gotten her into considerable trouble. Indeed, the response from the right has been at times so unhinged that her publisher, fearing for Tokarczuk’s life, has had to hire bodyguards to protect her. Croft, Tokarczuk’s English translator, published a short essay about a 2015 incident – which followed Tokarczuk winning the Nike for the second time – to bring attention to the extreme conditions under which the author now lives and works.

I turned to the Polish press in search of some explanation for this sudden outpouring of hate, absolutely unprecedented in Tokarczuk’s longstanding and distinguished career. I found an explanation in Tokarczuk’s post-awards interview, in which she said, among other things, ‘We have come up with this history of Poland as an open, tolerant country, as a country uncontaminated by any issues with its minorities. Yet we committed horrendous acts as colonizers, as a national majority that suppressed the minority, as slave-owners and as the murderers of Jews.’

The terms of this debate—in the course of which many prominent Eastern Europeans, including Svetlana Alexievich, came to Tokarczuk’s defense—might seem less relatable, almost provincial, when compared to the cosmopolitanism of Flights. But in many ways, the themes of Księgi Jakubowe are even more internationally pressing than those of Tokarczuk’s Booker-winning volume, as we see the near entirety of the West fall under the spell of extreme nationalism. History, Księgi Jakubowe reminds us, does not offer clean narratives of our origins and cultural motivations. It hides pockets of utopian generalization where one might least expect them; it can influence and inspire us not only through its victors, but also through its half-abandoned projects and roads not taken, reminding us both of our roots and of our homelessness in spite of them. One can only hope that, buoyed by the rising ambition of this most recent work and the acclaim of the Booker, Tokarczuk’s treatment of such issues will continue to grow in complexity and scope—and that the larger literary world will keep on listening.

Read the full article in the Boston Review.


.


Dupe throat
Patrick Blanchfield, n+1, 12 September 2018

It is a bleak fact of American life that a brilliant woman can publicly speak the devastating truth about a prominent man for decades without his success being affected much at all. Joan Didion had Bob Woodward’s number back in 1996. Writing in the New York Review of Books, she skewered the famous journalist and his ‘disinclination . . . to exert cognitive energy on what he is told.’ Selling a sexily packaged ‘insider’s inside story,’ Woodward, per Didion, did more than just spin a ‘crudely personalized,’ Great Man narrative history of recent events: he offered his powerful subjects near limitless opportunities for comprehensive image rehabilitation. For all his self-proclaimed rigor and attention to detail, Woodward’s work was at core defined by its ‘deferential spirit’—a basic, transactional pact in which he would be allowed access as long as he maintained his unquestioning credulity:

As any prosecutor and surely Mr. Woodward knows, the person on the inside who calls and says ‘I want to talk’ is an informant, or snitch, and is generally looking to bargain a deal, to improve his or her own situation, to place the blame on someone else in return for being allowed to plead down or out certain charges. Because the story told by a criminal or civil informant is understood to be colored by self-interest, the informant knows that his or her testimony will be unrespected, even reviled, subjected to rigorous examination and often rejection. The informant who talks to Mr. Woodward, on the other hand, knows that his or her testimony will be not only respected but burnished into the inside story, which is why so many people on the inside, notably those who consider themselves the professionals or managers of the process – assistant secretaries, deputy advisers, players of the game, aides who intend to survive past the tenure of the patron they are prepared to portray as hapless—do want to talk to him.

In the twenty-two years since Didion’s diagnosis, Bob Woodward has gone on to write eleven further books, seven of them about contemporaneous presidencies. America’s turn to global war has been particularly good for Woodward, not least because the inside story of leaders in wartime makes for especially sexy copy – and since the reversals and tragedies of war itself mean politicians have added need for P.R. triage of the sort Woodward happily provides.

But if his four books on George W. Bush and two on Barack Obama were case studies in proving Didion’s point, Woodward’s latest, Fear: Inside the Trump White House, drives it home with almost excruciating feats of self-parody. It’s not just that Woodward’s self-consciously Serious approach to Serious People sputters and short-circuits when confronted with the ludicrously Unserious figure of Donald Trump himself (who, unlike previous Presidents, did not make himself available for Woodward to interview). Rather, Fear showcases Woodward in his most abject and pathetic role as what Christopher Hitchens, who also saw him for what he was, called a ‘stenographer to power.’ For page after dumbfounding page, Fear reproduces, with gobsmacking credulity, the self-aggrandizing narratives of factitious scoundrels. Didion was absolutely right to class Woodward’s work as fundamentally a kind of ‘political pornography.’ But Fear is to Woodward’s previous oeuvre of political pornography what Fifty Shades of Grey is to Twilight: vampiric fan-fiction repackaged as middlebrow smut.

Read the full article in n+1.


.

The life intense
Tim Adams, Guardian, 24 September 2018

In 2014 a leading life insurance company asked 2,000 of its British clients the things they would most like to do before they died. Some of the bucket list items were concrete goals – No 1 was ‘have a holiday home abroad’, while a little further down the list was ‘own a Mulberry handbag’. The great majority of the wishes, however, involved experiences rather than assets. The experience most desired by the largest number of respondents was to ‘swim with dolphins’; it was followed, variously, by ‘drive Route 66’, ‘ride a hot air balloon’, ‘hold a koala’ and ‘do a parachute jump’. Perhaps more than any previous culture we routinely associate adrenaline with enlightenment.

The bucket-list hit parade represents a triumph of what Alvin Toffler was among the first to identify in his 1970 book Future Shock. Once society had provided most people with basic needs and a level of comfort, he argued, the economy would be increasingly directed, in the absence of organised religion, toward ‘psychic gratification’. We would, he suggested, be likely to see the emergence not only of an ‘experience economy’ selling adventure, danger and sexual thrills, but also of everyday products freighted with added emotional meaning. We would not buy a pair of training shoes without believing ourselves to be purchasing the whole culture of extreme and focused passion that they represented.

The quality that underpins this culture and economy, according to the French novelist and philosopher Tristan Garcia, is the near universal desire for intensity. Whereas previous centuries may have prized integrity or harmony or steadiness or social grace, our dreams are geared toward novelty, newness, the desperate need to find escape from the mundane and the habitual. ‘When was the last time you did something for the first time?’ the rapper Drake asks. No coffee shop conversation or teenage social media thread would be complete without a string of OMG answers. Status is increasingly measured not by what we earn, or what we contribute, but by what we experience, what we photograph and what we choose to share. Our sports must be extreme, our tastes exotic, our relationships ecstatic and our drugs revelatory. The idea of intensity, of living to the limit, has become another way to fill the God-shaped hole, to prove to ourselves that we are fully alive. Fundamentally, Garcia argues, in what is both an erudite and intensely academic read, this quest – like all quests worth their grail – is doomed to end in failure.

Garcia, in the first of a trilogy entitled Letting Be, traces our desires for skiing virgin slopes and for queuing to upgrade our technology and for swiping right for the next perfect partner back to the triumph of the Enlightenment, and the reaction to it. ‘The classical age of the sciences is, in the first instance, the moment of all intensity reduced to nothingness,’ he argues. Newton killed the concepts of ‘more’ and ‘greater’. The world was suddenly a thing of atoms and gravity; anything else we ascribed to it was nothing more than ‘sentiment’. The Romantic movement was a backlash against that reductionism. Wordsworth’s natural epiphanies, De Quincey’s pharmacological experiments, Byron’s libertine lusts were all statements of overwhelming interiority, proof positive that we were alive. Goethe’s Werther expressed perhaps the first adrenaline rush in western literature: ‘I felt exalted by this overflowing fullness to the perception of the Godhead, and the glorious forms of an infinite universe became visible to my soul! Stupendous mountains encompassed me, abysses yawned at my feet, and cataracts fell headlong down before me…’ Thereafter, we were all would-be emotional skydivers, ‘searching for the strong sensations that might justify our lives’.

Read the full article in the Guardian.


.

Privilege
Matthew B Crawford, The Hedgehog Review,
Summer 2018

But simply becoming noisier about equality wouldn’t do the trick. Some conceptual innovation was needed, one that would shift the terms in such a way as to ease the contradiction. Enter ‘diversity.’

This concept claims descent from a lineage of shining democratic moments in the struggle for equal rights that we rightly celebrate: John Locke’s A Letter Concerning Toleration, Martin Luther King’s ‘Letter from Birmingham Jail,’ the statesmanship by which Nelson Mandela averted civil war in South Africa. But the family resemblance turns out to be superficial when one grasps the function ‘diversity’ serves as a principle of administration in today’s political economy.

As Michael Lind has written, ‘Neoliberalism—the hegemonic ideology of the transatlantic elite—pretends that class has disappeared in societies that are purely meritocratic, with the exception of barriers to individual upward mobility that still exist because of racism, misogyny, and homophobia.’ Marking out the corresponding classes of persons for special solicitude is thus key to sustaining the democratic legitimacy of our major institutions. Or, rather, the point is to shift the basis of that legitimacy away from democratic considerations toward ‘moral’ ones. These have the advantage that they can be managed through the control of language, which has become a central feature of institutional life.

The concept of diversity first germinated in the corporate world, and was quickly seized upon by academia in the 1990s. It arrived just in the nick of time. The previous two decades had seen the traditional mission of the university undermined, if not abandoned, under pressure from a highly politicized turn in the humanities that made its case in epistemic terms, essentially debunking the very idea of knowledge. The role that the upper-tier university soon discovered for itself, upon the collapse of ideals of liberal learning, was no longer that of training citizens for humane self-government, but rather that of supplying a cadre to staff the corporations, the NGOs, and the foundations. That is, the main function of elite schools is to supply the personnel required to run things in an economy that has become more managerial than entrepreneurial.

The institutional desideratum – the political antipode to hated ‘privilege’ – is no longer equality, but diversity. This greatly eases the contradiction Furet identified, shielding the system from democratic pressure. It also protects the self-conception of our meritocrats as agents of historical progress. As was the case with the Soviet nomenklatura, and the leading Jacobins as well, it is precisely our elite that searches out instances of lingering privilege, now understood as obstacles to fulfillment of the moral imperative of diversity. Under this dispensation, the figure of the ‘straight white male’ (abstracted from class distinctions) has been made to do a lot of symbolic work, the heavy lifting of legitimation (in his own hapless way, as sacrificial goat). We eventually reached a point where this was more weight than our electoral system could take, as the election of 2016 revealed. Whether one regards that event as a catastrophe or as a rupture that promises the possibility of glasnost, its immediate effect has been panic in every precinct where the new class accommodations have been functioning smoothly, and a doubling down on the moralizing that previously secured them against popular anger. We’ll see how that goes.

Read the full article in the Hedgehog Review.


.

The forgotten story of how ‘punching up’
harmed the science-fiction/fantasy world

Cathy Young, Quillette, 18 August 2018

In September 2014, the sci-fi/fantasy world was rocked by revelations about the bizarre online past of a much-praised young author in the field, the Thai-born, Hong Kong-based Benjanun Sriduangkaew, one of that year’s finalists for the John W. Campbell Award for Best New Writer. Sriduangkaew was outed as a notorious social justice ‘rage-blogger’ known by the fitting moniker ‘Requires Hate’ (a shortened version of the title of her blog, ‘Requires Only That You Hate’), whose vitriol-soaked takedowns and callouts of ‘problematic’ works and authors had sown fear in the SFF community since 2011. What’s more, Requires Hate also doubled as a prolific troll and cyberbully who mainly went by ‘Winterfox’ but sometimes used other handles.

After several weeks of heated debates, a lengthy, detailed, carefully researched report on Sriduangkaew’s activities under her various aliases was posted by sci-fi writer Laura Mixon on her LiveJournal blog.

It makes for a hair-raising read. Requires Hate’s rants made Jeong’s tweets sound like drawing-room pleasantries. She frequently resorted to graphic threats of murder, rape, mutilation, acid attacks, and other extreme violence. Of American sci-fi novelist Paolo Bacigalupi, whom she blasted as a ‘raging racist fuck’ and an ‘appropriative bag of feces,’ she wrote, ‘If I see [him] being beaten in the street I’ll stop to cheer on the attackers and pour some gasoline on him,’ and ‘Let him be hurt, let him bleed, pound him into the fucking ground. No mercy.’ Irish-American author Caitlyn Kiernan was branded a ‘rape apologist’ whose ‘hands should be cut off so she can never write another Asian character.’

According to Mixon, Sriduangkaew, often aided by her followers, had at various times tried to ‘suppress the publication of fiction and reviews’ and get speakers disinvited from panels and readings; cyber-stalked sci-fi fans who had crossed her; ‘chased down positive reviews’ in order to ‘frighten reviewers and fans away’ from promoting works she disliked; and ‘single-handedly destroyed several online SFF, fanfic, and videogaming communities with her negative, hostile comments and attacks.’ (All italics in the original.) Moreover, ‘At least one of her targets was goaded into a suicide attempt’…

Mixon herself was upfront about the fact that Sriduangkaew’s reign of terror was made possible by the political culture in the SFF community: since Requires Hate self-identified as an Asian lesbian, she had the backing of ‘progressives … who appreciate[d] that – despite her sometimes over-the-top rhetoric—she unapologetically sp[oke] up for people of color and queer/ LGBTQI people, calling out racist, homophobic, misogynist content in many popular SFF novels and stories.’  Interestingly, Mixon also pointed to evidence that Sriduangkaew’s abusive online behavior had begun with nasty but nonpolitical forum trolling – until ‘at some point she discovered social-justice-driven rage-speak and found it to be a particularly effective weapon.’

Read the full article in Quillette.


.


The inner voice
Philip Jaekl, Aeon, 13 September 2018

That voice isn’t the sound of anything. It’s not even physical – we can’t observe it or measure it in any direct way. If it’s not physical, then we can arguably only attempt to study it by contemplation or introspection; students of the inner voice are ‘thinking about thinking’, an act that feels vague. William James, the 19th-century philosopher who is often touted as the originator of American psychology, compared the act to ‘trying to turn up the gas quickly enough to see how the darkness looks’.

Yet through new methods of experimentation in the last few decades, the nature of inner speech is finally being revealed. In one set of studies, scans are allowing researchers to study the brain regions linked with inner speech. In other studies, researchers are investigating links between internal and external speech – that which we say aloud.

The roots of the new work trace back to the 1920s and the Russian developmental psychologist Lev Vygotsky, who said the human mind was shaped by social activity and culture, beginning in childhood. The self, he hypothesised, was forged in what he called the ‘zone of proximal development’, the cognitive territory just beyond reach and impossible to tackle without some help. Children build learning partnerships with adults to master a skill in the zone, said Vygotsky, then go off on their own, speaking aloud to replace the voice of the adult, now gone from the scene. As mastery increases, this ‘self-talk’ becomes internalised and then increasingly muted until it is mostly silent – still part of the ongoing dialogue with oneself, but more intimate and no longer pronounced to the world. This voice – at first uttered aloud but finally only internal – was, from Vygotsky’s perspective, the engine of development and consciousness itself.

Vygotsky’s theory of childhood development contrasted sharply with those of his Western counterparts. William James had a complete disdain for the study of inner speech, because, to him, it was a ghost: impossible to observe. The French developmental psychologist Jean Piaget insisted that private speech signified simple inability – it was the babble of a child without capacity for social communication, with no relation to cognitive functioning at all. Through much of the 20th century, Piaget seized the reins of child development, insisting that children had to reach a developmental stage before learning could occur. Which came first: the chicken or the egg? Vygotsky said that learning occurred, then the brain developed. Piaget said the brain developed, then learning occurred.

Over years of meticulous experiment behind the Iron Curtain, Vygotsky continued to make his case. One thing he did was study children in the zone of proximal development as they worked with adults to accomplish tasks. In the experiments, the child would be presented with a challenge and a tool for overcoming it. In the zone, Vygotsky observed what he called ‘private speech’ – self-talk that children between the ages of two and eight often engage in. This intermediate stage, he held, was connected on one end to a prior period when we had no thread of memory (and no inner voice) and on the other end to true inner speech so crucial to self-reflection, narrative memory, and development of cognitive skills.

Within the newly forming Soviet Union, Vygotsky’s research was stigmatised, in large part because it used intelligence testing to validate some concepts; IQ testing itself had been banned as a challenge to Marxist principles of equality. Despite the roadblocks, in 1934 (the year of his death) Vygotsky finally published his opus on inner speech and childhood development, Thought and Language. It was a potent challenge to Piaget but, shrouded by the Stalinist censor, his ideas remained under wraps.

Read the full article in Aeon.


.

Your DNA is not your culture
Sarah Zhang, The Atlantic, 25 September 2018

DNA-testing companies are careful not to use racial categories in their tests, instead reporting breakdowns of specific regions around the world. And they say that their tests are meant to bring people together by highlighting shared ancestry and challenging the idea that people are ‘pure.’ I don’t doubt that DNA tests have sparked meaningful explorations of family history for some people and filled in the blanks for others whose histories were lost to slavery and colonialism. I do doubt that a DNA test will solve racism.

In November 2016, I got an email from an AncestryDNA publicist with the subject, ‘Post-election spike in DNA interest.’ The publicist wanted to share some numbers: Ancestry’s DNA-kit sales jumped 33 percent compared to the week before the election. She attributed this to a viral video called The DNA Journey, where people are first seen talking about how proud they are of their heritage and dumping on others (‘I have a side of me that hates Turkish people’). Then they take a DNA test and find out that they, in fact, have mixed ancestry.

In the divisive days after the election, the publicist said readers were hungry for this message. ‘PLEASE DO THIS! Let’s stop spreading hate and start seeing that we are all one! I just ordered my kit! We all bleed the same color!’ is how she characterized viewers’ reactions. ‘This should be compulsory,’ one woman exclaims in the video. ‘There would be no such thing as, like, extremism in the world.’

It’s a nice message. But it elides history. Mixed ancestry does not necessarily mean a harmonious coexistence, past or future. African Americans have, on average, 24 percent European ancestry. To take a genetic-ancestry test is to confront a legacy of rape and slavery – perhaps even to recognize one’s own existence as the direct result of it. There is a way to use genetics and genealogy to uncover injustices and properly account for them. The 23andMe-sponsored podcast Spit, for instance, has featured some nuanced conversations about race. But it’s not through feel-good ads that paper over the past.

At the end of The DNA Journey (36 million views on Facebook and counting), participants are offered a trip around the world to visit the places revealed in the DNA test. The video was, of course, an advertisement: a collaboration between AncestryDNA and Momondo, a travel website.

Read the full article in the Atlantic.


.

Landmines in the Sahara
Matthew Porges, London Review of Books,
7 September 2018

Daha Bulahi, sixtyish, is a Sahrawi, born into a nomadic family in the northwestern Sahara. One of his eyes is fake, the eyelid mangled, and he’s missing a couple of fingers. None of this prevents him from brewing tea, which he did throughout our interview in the Sahrawi way, aerating the tea by pouring it from glass to glass and accumulating bubbles on the surface. He worked in landmine clearance for several years, and Yago, a Spanish demining technician who was working with him, told me the story of Daha’s mutilation. Lacking sophisticated equipment, he would dig underneath each mine and pick it up from below with his bare hands, avoiding the pressure-plate triggering mechanism on the top. Then he would throw it over his shoulder, letting it explode, and move on to the next one. This is about as safe as it sounds. He had cleared a vast number of mines successfully, but one day a mine exploded as he threw it, spraying him with shrapnel. Daha’s survival strained the bounds of credulity, but there he was, brewing tea with what was left of his hand.

There are something like seven million unexploded landmines in Western Sahara, a former Spanish colony now mostly occupied by Morocco. Daha, along with thousands of others, fled the Moroccan invasion in 1975 and has lived in a refugee camp in Algeria ever since. Most of the landmines here are antipersonnel mines, which are light enough to move around in the dunes with changing wind and rain patterns. In 2015 and 2016, there was unprecedented flooding. This made the issue of mobile landmines even more pressing, and Daha’s organisation, the Asociación Saharaui de Víctimas de Minas, was busier than ever.

The Spanish never got as far inland as the refugee camp where Daha now lives, but when people fled the war with Morocco, they took their languages with them, and when they built new institutions, they did so partly in the language of the former colonists. ASAVIM’s main office, in Rabouni, Algeria, consists of a single low building with four rooms. It isn’t a mine clearance operation; instead, it supports local landmine victims and their families. One of the rooms is filled with gruesome photographs; I was told there are about fourteen incidents in Western Sahara each year. In the hallway, there are prosthetic legs in flowerpots, limbs sprouting from the soil. Maybe this was meant as an artistic statement of some kind, but I didn’t ask.

ASAVIM’s director, Awala Lahbib, has a prosthetic leg. He and Daha were full of stories about nomads and landmines. One concerned a man who was herding fifty or so camels across the desert when he stepped on a mine. His camels scattered and his leg was blown off. Alone in the desert, bleeding heavily, he lit a fire and cauterised his wound, surviving the incident. According to Awala, he was 73 years old at the time. Yago, as usual, ruined a good story with his precision, probably a good quality in a demining technician: landmine explosions, he said, often cauterise wounds almost immediately anyway.

Read the full article in the London Review of Books.


.

Ubang: The Nigerian village where men and women speak different languages
Yemisi Adegoke, BBC, 23 August 2018

Dressed in a brightly coloured traditional outfit, a red chief’s cap and holding a staff, Chief Oliver Ibang calls over his two young children, eager to demonstrate the different languages.

He holds up a yam and asks his daughter what it is called.

‘It’s “irui”,’ she says, without hesitating.

But in Ubang’s ‘male language’ the word for yam, one of Nigeria’s staple foods, is ‘itong’.

And there are many other examples, such as the word for clothing, which is ‘nki’ for men and ‘ariga’ for women.

It is not clear exactly what proportion of words are different in the two languages and there is no pattern, such as whether the words are commonly used, related or linked to traditional roles for men or women.

‘It’s almost like two different lexicons,’ says anthropologist Chi Chi Undie, who has studied the community.

‘There are a lot of words that men and women share in common, then there are others which are totally different depending on your sex. They don’t sound alike, they don’t have the same letters, they are completely different words.’

She says the differences are far greater than, for example, British and American versions of English.

However, both men and women are able to understand each other perfectly – or as well as anywhere else in the world.

Read the full article on the BBC.


.


What cave art means
Justin EH Smith, Art in America, 1 September 2018

There are images on the walls of caves, whether we put them there or not. Or, more precisely, we create images on the walls of caves, whether with charcoal and manganese or simply with our imaginations. Michelangelo’s well-known claim that he simply released from stone what was already there is straightforwardly true of Paleolithic artists. They placed their lines where the contours already suggested animal motion.

When I had occasion to remark early in my cave-art education that the pair of clay bison sculptures (ca. 15,500 years before the present) located in France’s Tuc d’Audoubert cave are relative rarities, since most cave art is painted on the walls, a veteran of the field corrected me. ‘It’s all sculpture,’ she said. It is all ‘sculpture,’ though most of it was done for us by the same natural forces that brought forth the underground spaces hosting the works. The caverns’ many undulations, outcroppings, fissures, and declivities were then enhanced by human hands, and sometimes saliva: as in the common crachis technique of spitting on a surface and then rubbing on the pigments. Other techniques include using water or plant oils as mediums and applying colors by means of pads, brushes, hands, or blowing—either through a tube or directly from the mouth.

The Grotte des Merveilles cave is found in Rocamadour, France, a historic town with a medieval chapel (likely the most impressive in the region) built into the limestone cliffside as if its builders, too, were only drawing out—more elaborately, with greater refinement—what was already there. Outside the Grotte, in early May, I watched a snail inscribing its trail of slime across a road. Inside, I saw what the scholarship calls ‘punctuations’: clusters of dots left there at some point in the Magdalenian (17,000–12,000 bp)—perhaps as clan identifiers, or as trail markers for those on long journeys, or ritual or sacred symbols, or units of reckoning, or any number of other purposes besides. They are the bare minimum traces of intentionality, and to see them as existing across an ontological divide from the snail’s trail may well be an act of faith or species-based pride.

I confess I am wary of such pride, and my exposure to Paleolithic art has only reinforced this wariness. Over the past several decades anthropologists such as Philippe Descola and Tim Ingold have done considerable work to show that in most human societies there has been no presumption of a sharp boundary between the natural world and the built cultural milieu, between wilderness and settlement, and in consequence there has been no default presumption of a fundamental difference between the ways in which human art transforms the environment and those in which animals and plants do the same. The authors base their arguments on evidence from the ethnography of Amazonia and Sápmi, focusing on groups of people who, respectively, perceive peccaries and reindeer as in some deep and real sense equal actors, as persons if not as human beings, in a shared sociocosmic reality. It helps to make sense of cave art, I have found, to entertain the idea that the inhabitants of Paleolithic Europe made sense of the world around them through a similar sort of ontology.

Read the full article in Art in America.


.

The mystery of people who speak dozens of languages
Judith Thurman, New Yorker, 3 September 2018

Linguistic competence, as it happens, was the subject of my own interest in Rojas-Berscia. He is a hyperpolyglot, with a command of twenty-two living languages (Spanish, Italian, Piedmontese, English, Mandarin, French, Esperanto, Portuguese, Romanian, Quechua, Shawi, Aymara, German, Dutch, Catalan, Russian, Hakka Chinese, Japanese, Korean, Guarani, Farsi, and Serbian), thirteen of which he speaks fluently. He also knows six classical or endangered languages: Latin, Ancient Greek, Biblical Hebrew, Shiwilu, Muniche, and Selk’nam, an indigenous tongue of Tierra del Fuego, which was the subject of his master’s thesis. We first made contact three years ago, when I was writing about a Chilean youth who called himself the last surviving speaker of Selk’nam. How could such a claim be verified? Pretty much only, it turned out, by Rojas-Berscia.

Superlative feats have always thrilled average mortals, in part, perhaps, because they register as a victory for Team Homo Sapiens: they redefine the humanly possible. If the ultra-marathoner Dean Karnazes can run three hundred and fifty miles without sleep, he may inspire you to jog around the block. If Rojas-Berscia can speak twenty-two languages, perhaps you can crank up your high-school Spanish or bat-mitzvah Hebrew, or learn enough of your grandma’s Korean to understand her stories. Such is the promise of online language-learning programs like Pimsleur, Babbel, Rosetta Stone, and Duolingo: in the brain of every monolingual, there’s a dormant polyglot—a genie—who, with some brisk mental friction, can be woken up. I tested that presumption at the start of my research, signing up on Duolingo to learn Vietnamese. (The app is free, and I was curious about the challenges of a tonal language.) It turns out that I’m good at hello—chào—but thank you, cảm ơn, is harder.

The word ‘hyperpolyglot’ was coined two decades ago, by a British linguist, Richard Hudson, who was launching an Internet search for the world’s greatest language learner. But the phenomenon and its mystique are ancient. In Acts 2 of the New Testament, Christ’s disciples receive the Holy Spirit and can suddenly ‘speak in tongues’ (glōssais lalein, in Greek), preaching in the languages of ‘every nation under heaven.’ According to Pliny the Elder, the Greco-Persian king Mithridates VI, who ruled twenty-two nations in the first century B.C., ‘administered their laws in as many languages, and could harangue in each of them.’ Plutarch claimed that Cleopatra ‘very seldom had need of an interpreter,’ and was the only monarch of her Greek dynasty fluent in Egyptian. Elizabeth I also allegedly mastered the tongues of her realm – Welsh, Cornish, Scottish, and Irish, plus six others.

With a mere ten languages, Shakespeare’s Queen does not qualify as a hyperpolyglot; the accepted threshold is eleven. The prowess of Giuseppe Mezzofanti (1774-1849) is more astounding and better documented. Mezzofanti, an Italian cardinal, was fluent in at least thirty languages and studied another forty-two, including, he claimed, Algonquin. In the decades that he lived in Rome, as the chief custodian of the Vatican Library, notables from around the world dropped by to interrogate him in their mother tongues, and he flitted as nimbly among them as a bee in a rose garden. Lord Byron, who is said to have spoken Greek, French, Italian, German, Latin, and some Armenian, in addition to his immortal English, lost a cursing contest with the Cardinal and afterward, with admiration, called him a ‘monster.’ Other witnesses were less enchanted, comparing him to a parrot. But his gifts were certified by an Irish scholar and a British philologist, Charles William Russell and Thomas Watts, who set a standard for fluency that is still useful in vetting the claims of modern Mezzofantis: Can they speak with an unstilted freedom that transcends rote mimicry?

Read the full article in the New Yorker.


.

The known known
Sue Halpern, New York Review of Books, 27 September 2018

Users, who in the early days of social media were predominantly young, were largely guileless and unconcerned about privacy. In a survey of sixty-four of her students at Rochester Institute of Technology in 2006, Susan Barnes found that they ‘wanted to keep information private, but did not seem to realize that Facebook is a public space.’ When a random sample of young people was asked in 2007 by researchers from the Pew Research Center if ‘they had any concerns about publicly posted photos, most…said they were not worried about risks to their privacy.’ (This was largely before Facebook and other tech companies began tracking and monetizing one’s every move on- and offline.)

In retrospect, the tendencies toward disclosure and prurience online should not have been surprising. As Sarah Igo observes in The Known Citizen, her masterful study of privacy in the United States, the sharing and oversharing of intimacies predates the social Web; indeed, the social Web simply allowed these behaviors to proliferate on a more open and accessible platform. Igo cites the enormous popularity of An American Family, a documentary doled out in twelve installments on public television in 1973, as one of the earliest cultural watersheds in Americans’ changing appreciation of privacy. Culled from the filmmakers’ seven-month immersion in the day-to-day lives of an ordinary family, the Louds of California, the series suggested that nothing was off-limits on TV: the Louds’ marriage fell apart; their son came out as gay; his father’s infidelities were exposed. Part of what made this so sensational was that, by making the private public, voyeurism and exhibitionism became mainstream entertainments. (Decades later, with webcams built into computers, peering into other people’s homes and lives no longer seems all that unusual.)

Igo also points to the influence of confessional talk shows, like Phil Donahue’s in the 1970s and Oprah Winfrey’s in the 1980s and beyond, where guests opened up about previously taboo subjects such as incest and spousal abuse. The public also had a voracious appetite for revelatory memoirs, a genre that grew exponentially as writers, famous or not, offered up increasingly startling, true—or possibly true—confessions of drug addiction, alcohol abuse, childhood traumas, sexual misadventures, and failures of every stripe. Igo writes:

Confessional culture, 1990s style, had many taproots: the media forms and celebrity culture that made self-publicity so alluring, the critique of secrets that was transforming political culture, and the incitements to authenticity and redemption emanating in equal measure from the couch and congregation.

When the social Web came along not long afterward, people were primed to participate.

Read the full article in the New York Review of Books.


.

A poem in the Nation spurs a backlash and an apology
Jennifer Schuessler, New York Times, 1 August 2018

Since its founding in 1865, The Nation has published some of the most important voices in American poetry, including Hart Crane, Elizabeth Bishop, Amiri Baraka and Adrienne Rich.

But last week, the venerable progressive weekly published what may have been a first: an apology for one of its offerings that ran twice as long as the poem itself.

The 14-line poem, by a young poet named Anders Carlson-Wee, was posted on the magazine’s website on July 5. Called ‘How-To,’ and seemingly written in the voice of a homeless person begging for handouts, it offered advice on how to play on the moral self-regard of passers-by by playing up, or even inventing, hardship.

But after a firestorm of criticism on social media over a white poet’s attempt at black vernacular, as well as a line in which the speaker makes reference to being ‘crippled,’ the magazine said it had made a ‘serious mistake’ in publishing it.

‘We are sorry for the pain we have caused to the many communities affected by this poem,’ the magazine’s poetry editors, Stephanie Burt and Carmen Giménez Smith, wrote in a statement posted on Twitter last week, which was posted above the poem on the magazine’s website a day later, along with an editor’s note calling the poem’s language ‘disparaging and ableist.’

‘When we read the poem we took it as a profane, over-the-top attack on the ways in which members of many groups are asked, or required, to perform the work of marginalization,’ they wrote. But ‘we can no longer read the poem in that way.’

Mr. Carlson-Wee also posted his own apology. ‘Treading anywhere close to blackface is horrifying to me, and I am profoundly regretful,’ he said in a statement posted on Facebook and Twitter.

Read the full article in the New York Times.


.

Why you should read this article slowly
Joe Moran, Guardian, 14 September 2018

‘A poem is an interruption of silence, whereas prose is a continuation of noise,’ the poet Billy Collins once said. Poets and lyrically minded prose writers see the written word rather as Quaker worship sees the spoken word: they think it more powerful if it emerges out of and is separated by silence. Writing and reading online, we struggle to find this silence out of which words can materialise and be contemplated. There is too much speaking and reacting, and not enough listening and reflecting.

Perhaps we should slow down. We hear a lot today about recovering the lost virtues of slowness – by, for instance, spending time on locally sourcing and preparing a meal, or leaving children to explore the world unsupervised and at their own pace. But the slow reading movement has yet to take off in the same way. Reading is constantly promoted as a social good and source of personal fulfilment. But this advocacy often emphasises ‘avid’, ‘passionate’ or ‘voracious’ reading – none of which adjectives suggest slow, quiet absorption.

Online writing is often designed to be mined for its ‘take home’ or ‘takeaway’ lesson, or perhaps its ‘take down’ of another piece. Most below-the-line comment focuses on whether the commenter agrees with the writer. It rarely mentions what a piece of writing was actually like to read.

To a slow reader, the medium can’t be detached from its message in this way. A piece of writing is not simply a ‘take’ on something, but a rhetorical exercise in pace, rhythm, tone, texture and voice. Ultimately it is irreducible to precis or paraphrase. It can only be fully understood by immersing oneself in the words and their slow unravelling of a line of thought. The slow reader is like a swimmer who stops counting the number of pool laps they have done and just enjoys how their body feels and moves in water.

Slow reading feels to me like a more generous, collegiate form of reading – rather as listening is a more generous act than speaking, and more difficult. Slow reading gives someone else (the writer) the gift of your time, without any guaranteed return, and with the risk that you will be bored or discomforted by the writing’s strangeness or difficulty. Slow reading is a gradual encounter with the obdurate otherness of another person’s mind. Like any such encounter, it should take as long as it takes and be its own end.

The human need for this kind of deep reading is too tenacious for any new technology to destroy. We often assume that technological change happens inexorably and in one direction, so that older media like ‘dead-tree’ books are shoved out by newer, more virtual forms. In practice, older technologies are quite resilient and can coexist with new ones. The Kindle has not killed off the printed book any more than the car killed off the bicycle.

Read the full article in the Guardian.

.


The images are, from top down: Galileo’s letter (credit: The Royal Society); ‘The Elephant in the Room’ by Lynette Shelley; Sketch of Donald Trump by John Springs, from the New York Review of Books; Detail from a drawing by Sveta Dorosheva; A cave painting from Ubirr, in Kakadu National Park, Northern Territory, Australia (the photo is by me).

2 comments

  1. Dreek Freyberg

    The article you refer to about vaccination in California: “Hollywood’s vaccine wars”, Gary Baum, Hollywood Reporter, 10 September 2018, was in fact published on 10 September 2014, not 2018.
    California Senate Bill 277 (SB277), signed into law in June 2015, removed personal belief exemptions against the vaccinations that are required to attend public schools – see, for example,
    https://en.wikipedia.org/wiki/California_Senate_Bill_277 and the links therein.
Which is not to say that California is entirely sensible about vaccination. First, vaccinations are only checked a couple of times, so unvaccinated children have several years in the system; second, there are a few doctors who will issue medical exemptions with little basis in genuine medical contraindications to vaccination. Still, it is a lot better than it was four years ago.

    • My apologies. I misread the date. Thanks for spotting and for your update. I will leave it up there as a warning to myself to be more careful in reading dates.
