The Afro-Pessimist Temptation

Amy Sherald: What’s precious inside of him does not care to be known by the mind in ways that diminish its presence (All American), 2017; from the exhibition ‘Amy Sherald,’ on view at the Contemporary Art Museum St. Louis, May 11–August 19, 2018 (Private Collection/Monique Meloche Gallery, Chicago)

Not long ago in the locker room of my Harlem gym, I was the eavesdropping old head who thought Black Panther was another documentary about the militants of the Black Panther Party from the Sixties. I caught on from what the young white guy and the young black guy were talking about that Kendrick Lamar had written some of the film’s soundtrack. I almost said, “Lamar is woke,” but the memory of the first time I heard my father say a thing was “fly” rose up and shut my mouth.

In the current political backlash—the only notion the current administration has is to undo whatever President Obama did, to wipe him out—black America is nevertheless a cultural boomtown. My maternal cousins e-mailed everyone to go to Black Panther that first record-breaking weekend, like they were getting out the vote. Twenty-five years ago black people were the lost population, abandoned in inner cities overrun with drugs, exhorted by politicians and preachers to mend the broken black family. Black intellectuals were on the defensive, and bell hooks talked of the resentment she encountered from white people when she spoke of white supremacy instead of racism. Now white people are the ones who seem lost, who don’t seem to know who they are, except for those white Americans who join the resistance against white supremacy and make apologies to black friends for white privilege because, although they don’t know where else to begin, they do know that they don’t want to be associated anymore with the how-long-has-this-been-going-on.

For eight years, I didn’t care what right-wing white people had to say about anything. Obama’s presence on the international stage decriminalized at home the image of the black man; and the murdered black men around whom black women founded Black Lives Matter were regarded more as the fallen in battle than as victims. The vigils of Black Lives Matter drew strength from memories of the marches of the civil rights movement, just as the protesters of the 1960s were aware of the unfinished business of the Civil War as their moral inheritance. Obama’s presidency made black neoconservatives irrelevant. They fumed that on paper he should have added up to be one of them, but instead Obama paid homage to John Lewis. That was Eric Holder in the Justice Department. But as it turned out, not everyone was vibing with the triumphant celebrations at David Adjaye’s beautiful National Museum of African American History and Culture.

White supremacy isn’t back; it never went away, though we thought it had become marginal or been contained as a political force, and maybe it has, which only adds to the unhelpful feeling that this should not have happened, that the government has been hijacked. I think of the Harvard sociologist Lawrence Bobo in the election’s aftermath telling a meeting of the American Psychoanalytic Association that, had the same number of black people who voted in Milwaukee, Detroit, and Philadelphia in 2012 come to the polls in 2016, Hillary Clinton would have won in the Electoral College. What the 2016 presidential election demonstrated is that, as David Foster Wallace put it, there is no such thing as not voting.

I mind this happening when I am getting too old to run from it. Shit, do not hit that fan. My father’s siblings, in their late eighties and early nineties, assure me that we have survived worse. They grew up on Negro History Week. The Great Depression shaped their childhoods; McCarthyism their college years. My father lived to see Obama’s election in 2008, but not the gutting of the Voting Rights Act in 2013. He would have said that the struggle for freedom is ongoing. Look at how “they” managed to get around Brown v. Board of Education; look at Citizens United, he would say, he who hawked NAACP memberships in airport men’s rooms or read from William Julius Wilson at Christmas dinner. I longed for him to change the subject, to talk to my Jewish friends about science, not racism.

In 1895, the year Frederick Douglass died, Booker T. Washington gave an address in Atlanta cautioning black people to cast down their buckets where they were. The black and white races would be like the fingers of the hand, separate but working together on essential matters. White people took Washington to mean that blacks would accept Jim Crow and not agitate for restoration of the civil rights they had exercised during Reconstruction. They would concentrate instead on self-improvement and economic development. Washington’s conciliatory philosophy made his autobiography, Up from Slavery (1901), a best seller. He was hailed as the most influential black spokesman of his day. Theodore Roosevelt invited him to dine at the White House, much to the consternation of Washington’s white southern supporters.

Washington’s program may have won him admiration among whites, but he never persuaded black people, as far as an angry W.E.B. Du Bois was concerned. In The Souls of Black Folk (1903), Du Bois argued that the influence of three main attitudes could be traced throughout the history of black Americans in response to their condition:

a feeling of revolt and revenge; an attempt to adjust all thought and action to the will of the greater group; or, finally, a determined effort at self-realization and self-development despite environing opinion.

For Du Bois, Washington represented the attitude of submission. He had no trouble with Washington preaching thrift, patience, and industrial training for the masses, but to be silent in the face of injustice was not being a man:

Negroes must insist continually, in season and out of season, that voting is necessary to modern manhood, that color discrimination is barbarism, and that black boys need education as well as white boys.

Du Bois was not alone among black intellectuals in his condemnation of Washington, but it was not true that Washington had no black followers. For Washington, the withdrawal of black people from American political life was to be temporary. Black people would earn white respect by acquiring skills and becoming economically stable. If they couldn’t vote, then they could acquire property. However, Du Bois and his allies maintained that disenfranchisement was a significant obstacle to economic opportunity. Black prosperity was taken by whites as a form of being uppity: white people burned down the black business section of Tulsa, Oklahoma, in 1921, furious at its success. Moreover, black Marxist critics of the 1930s held that Washington’s program to produce craftsmen and laborers uninterested in unions had been made obsolete by the mass manufacturing economy. Washington’s Tuskegee Movement came to stand for backwater gradualism, of which the guesthouse for white visitors to the Tuskegee Institute was a symbol.

The Du Bois–Washington controversy described basic oppositions—North/South, urban/rural—that defined black America at the time. Identifying what Arnold Rampersad has called “an essential dualism in the black American soul,” Du Bois also explored the concept of “double-consciousness”:

One ever feels his two-ness—an American, a Negro; two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body.

The conflict between national and racial identity has had political expression—integrationist/separatist—as well as psychological meaning: good black/bad black, masked black self/real black self. “Free your mind and your ass will follow,” Funkadelic sang in 1970, by which time the authentic black was always assumed to be militant: there is a Malcolm X in every black person, the saying went.

Ta-Nehisi Coates says that he came to understand as a grown-up the limits of anger, but he is in a fed-up, secessionist mood by the end of We Were Eight Years in Power: An American Tragedy. His collection of eight essays on politics and black history written during Obama’s two terms of office, introduced with some new reflections, portrays his post-election disillusionment as a return to his senses. Coates wonders how he could have missed the signs of Trump’s coming: “His ideology is white supremacy in all of its truculent and sanctimonious power.” He strongly disagrees with those who say that racism is too simple an explanation for Trump’s victory. He was not put in office by “an inscrutable” white working class; he had the support of the white upper classes to which belong the very “pundits” who play down racism as an explanation.

The title We Were Eight Years in Power, Coates tells us, is taken from a speech that a South Carolina congressman made in 1895 when Reconstruction in the state was terminated by a white supremacist takeover. Du Bois noted at the time that what white South Carolina feared more than “bad Negro government” was “good Negro government.” Coates finds a parallel in Trump’s succeeding Obama, whose presidency was “a monument to moderation.” Obama’s victories were not racism’s defeat. He trusted white America and underestimated the opposition’s resolve to destroy him. Coates sees Obama as a caretaker, not a revolutionary, and even that was too much for white America. He writes from the perspective that that “end-of-history moment” when Obama was first elected “proved to be wrong.”

In the 1960s, frustration with integration as the primary goal of civil rights began Booker T. Washington’s rehabilitation as an early advocate of black self-sufficiency. But it’s still a surprise to find him among Coates’s influences, to be back there again. It is because Coates at first identified with the conservative argument that blacks couldn’t blame all their problems on racism, that they had to take some responsibility for their social ills. He names Washington the father of a black conservative tradition that found “a permanent and natural home in the emerging ideology of Black Nationalism.” He writes, “The rise of the organic black conservative tradition is also a response to America’s retreat from its second attempt at Reconstruction.” As a young man in 1995, Coates experienced the Million Man March in Washington, D.C., at which the Nation of Islam’s Louis Farrakhan urged black men to be better fathers.

In their emphasis on defense of black communities against racist agents of the state, the Black Panthers in the 1960s considered themselves revolutionary; so, too, did the FBI, which destroyed the movement. Black nationalism wasn’t necessarily revolutionary: some leaders of the Republic of New Afrika endorsed Nixon in 1972 so that the commune might benefit from his Black Capitalism schemes. In the Reagan era, black conservatives complained that a collective black identity was a tyranny that sacrificed their individualism. What they were really attacking was the idea of black people as a voting bloc for the Democratic Party.

Black conservatism joined with white conservatism in opposing the use of government as the enforcement arm of change. Coates eventually gave up on movements that asked blacks to shape up, even though it gave him a politics “separate from the whims of white people.” What turned him off was that, historically, conservative black nationalism assumed that black people were broken and needed to be fixed, that “black culture in its present form is bastardized and pathological.”

Siegfried Woldhek: Ta-Nehisi Coates

At every turn, Coates rejects interpretations of black culture as pathological. I am not broken. William Julius Wilson’s theories that link the deterioration of black material conditions to industrial decline “matched the facts of my life, black pathology matched none of it.” Coates holds the 1965 Moynihan Report on the black family accountable as a sexist document that has shaped policy on the mass incarceration of black men. He is done with what he might call the hypocrisy of white standards. “The essence of American racism is disrespect.” There is no such thing as assimilation. Having a father and adhering to middle-class norms have “never shielded black people from plunder.” American democracy is based on “plunder.”

The subject of reparations has been around in radical black politics for some time. But Coates takes the argument beyond the expected confines of slavery and applies the notion of plunder to whites’ relations with blacks in his history of red-lining and racial segregation as urban policy and real estate practice in postwar Chicago. He also cites the psychological and financial good that West Germany’s reparations meant for Israel: “What I’m talking about is a national reckoning that would lead to spiritual renewal.” Reparations are clearly the only solution for him, but he writes as though they will never be paid; therefore nothing else matters.

Between him and the other world, Du Bois said, was the unasked question of what it felt like to be a problem. But white people are the problem. The exclusion of black people transformed “whiteness itself into a monopoly on American possibilities,” Coates says. It used to be that social change for blacks meant concessions on the part of white people. But Coates is not looking for white allies or white sympathy. “Racism was banditry, pure and simple. And the banditry was not incidental to America, it was essential to it.” He has had it with “the great power of white innocence,” he writes. “Progressives are loath to invoke white supremacy as an explanation for anything.” The repeated use of the phrase “white supremacy” is itself a kind of provocation. “Gentrification is white supremacy.”

There may be white people who don’t believe the “comfortable” narratives about American history, but Coates hasn’t time for them either. The “evidence of structural inequality” may be “compelling,” but “the liberal notions that blacks are still, after a century of struggle, victims of pervasive discrimination is the ultimate buzzkill.” He means that the best-intentioned of whites still perceive being black as a social handicap. He wants to tell his son that black people are in charge of their own destinies, that their fates are not determined by the antagonism of others. “White supremacy is a crime and a lie, but it’s also a machine that generates meaning. This existential gift, as much as anything, is the source of its enormous, centuries-spanning power.” That rather makes it sound like hypnosis, but maybe the basic unit of white supremacy is the lynch mob.

Malcolm X thought Du Bois’s double-consciousness a matter for the black middle class—blacks living between two worlds, seeking the approval of both the white and the black and not getting either. But even when black people could see themselves for themselves, there was still the problem of whether white power could be reformed, overthrown, or escaped. The essential American soul is hard, isolate, stoic, and a killer, D.H. Lawrence said. If white supremacy is still the root of the social order in the US, then so, too, are the temptations of Hate, Despair, and Doubt, as Du Bois put it. “As we move into the mainstream,” Coates says, “black folks are taking a third road—being ourselves.”

It’s as though racism has always been the action and dealing with it the reaction. That is maybe why black thinkers and artists try to turn things around, to transcend race, to get out of white jurisdiction. When black students in the 1970s baited Ralph Ellison for his detachment from protest movements, he said that writing the best novel he could was his contribution to the struggle.

Cornel West blasted Coates for his narrow “defiance,” for choosing a “personal commitment to writing with no connection to collective action.”1 He argued that Coates makes a fetish of white supremacy and loses sight of the tradition of resistance. For West, Coates represents the “neoliberal” wing of the black freedom struggle, much like Obama himself. Obama is little more than a symbol to West (and Coates insists that symbols can mean a great deal). Coates’s position amounts to a misguided pessimism, in West’s view. Robin D.G. Kelley, author of the excellent Thelonious Monk: The Life and Times of an American Original (2009), attempted to mediate between their positions, saying, in part, that West and Coates share a pessimism of outlook and that black movements have always had a dual purpose: survival and ultimate victory.2

As a dustup encouraged by newspaper editors, West’s attack on Coates has been likened to the battle royal: that scene in Invisible Man where black youth are made to fight one another blindfolded in a ring for the amusement of white men. Richard Wright recounts in his autobiography, Black Boy, how he tried to get the other boy he was to oppose in just such an entertainment to stand with him and refuse to fight. Part of what drove Ellison was his need to one-up Wright, who got to use, in his work before Ellison, metaphors they both shared. But West, however ready he is to say impossible things before breakfast, is the older man, not Coates’s peer, which makes his name-calling—his contempt in the expression “neoliberal”—ineffectual purity.

In pre-Obama times, West warned black youth against the internal and external threats of nihilism. I remember one evening at Howard University in the early 1990s when he and bell hooks rocked the auditorium. I couldn’t hear what they were saying sometimes. But much of Coates’s audience wasn’t of reading age then.

The swagger of 1960s black militancy was absorbed into the rap music of the 1990s. In Democracy Matters: Winning the Fight Against Imperialism (2004), West interprets hip-hop culture as an indictment of the older generation, the lyrics of the young proclaiming that they were neglected by self-medicated adults: “Only their beloved mothers—often overworked, underpaid, and wrestling with a paucity of genuine intimacy—are spared.”

Coates is passionate about the music that helped him find himself and a language. His ambivalence about Obama goes away once he claims him as a member of hip-hop’s foundational generation. In his memoir Losing My Cool (2010), Thomas Chatterton Williams recalls that as a teenager immersed in hip-hop, it nagged at him that he and the other black students at his private school couldn’t say when Du Bois died or when King was born, but they were worked up over the anniversary of the assassination of Biggie Smalls. Coates is different from many other black writers of his generation in that he doesn’t come from a middle-class background. His biography is like a hip-hop story.

He grew up in “segregated West Baltimore,” where his father was chapter head of the Black Panther Party. He said he understood black as a culture, not as a minority, until he entered rooms where no one else looked like him. Early on in We Were Eight Years in Power he speaks of “the rage that lives in all African Americans, a collective feeling of disgrace that borders on self-hatred.” You wonder whom he’s speaking for, even as he goes on to say that music cured his generation’s shame, just as to embrace Malcolm X was to be relieved of “the mythical curse of Ham.” It’s been fifty years since Malcolm X talked about brainwashed Negroes becoming black people bragging about being black. It’s been half a century since those books that told us depression and grief among blacks were hatred turned on the black self.

Coates declares that when Obama first ran for president in 2008, the civil rights generation was

exiting the American stage—not in a haze of nostalgia but in a cloud of gloom, troubled by the persistence of racism, the apparent weaknesses of the generation following in its wake, and the seeming indifference of much of the country to black America’s fate.

Obama rose so quickly because African-Americans were

war-weary. It was not simply the country at large that was tired of the old baby boomer debates. Blacks, too, were sick of talking about affirmative action and school busing. There was a broad sense that integration had failed us.

Peril is generational, Coates says. He has given up on the liberal project, castigating liberal thinking for having “white honor” and the maintenance of “whiteness” at its core. King’s “gauzy all-inclusive” dream has been replaced by the reality of an America of competing groups, with blacks tired of being the weakest of the lot. Harold Cruse in The Crisis of the Negro Intellectual (1967), a vehement work of black nationalism and unique in black intellectual history, said flat out that Washington was right and that Du Bois had ended up on the wrong side, that Marxism was just white people (i.e., Jewish people) telling black people what to think. Cruse was regarded as a crank in his time, but his view of black history in America as a rigged competition is now widely shared, and Cruse was writing before Frantz Fanon’s work on the decolonized mind was available in English.

Afro-pessimism derives in part from Fanon, and maybe it’s another name for something that has been around in black culture for a while. Afro-pessimism found provocative expression in Incognegro: A Memoir of Exile and Apartheid (2008) by Frank B. Wilderson III. A Dartmouth graduate who grew up in the 1960s in the white Minneapolis suburb where Walter Mondale lived, Wilderson is West’s generation. He went to South Africa in the early 1990s and became involved with the revolutionary wing of the ANC that Mandela betrayed. White people are guilty until proven innocent, Wilderson asserts throughout. Fanon is everywhere these days, the way Malcolm X used to be, but Wilderson makes me think of Céline, not Fanon. Coates’s “critique of respectability politics” is in something of the same mood as Wilderson, and, before him, Cruse. He also has that echo of what Fanon called the rejection of neoliberal universalism.

The 1960s and 1970s showed that mass movements could bring about systemic change. Angela Davis said so.3 Unprecedented prosperity made the Great Society possible. But only black people could redefine black people, Stokely Carmichael and Charles V. Hamilton said in Black Power (1967). West has remembered entering Harvard in 1970 and feeling more than prepared by his church and family. The future of the world as he could imagine it then and how it evidently strikes Coates these days is a profound generational difference. “The warlords of history are still kicking our heads in, and no one, not our fathers, not our Gods, is coming to save us.”

Cornel West is right or I am on his side, another old head who believes that history is human-made. Afro-pessimism, with its treatment of withdrawal as transcendence, is no less pleasing to white supremacy than Booker T. Washington’s strategic retreat into self-help. Afro-pessimism threatens no one, and white audiences confuse having been chastised with learning. Unfortunately, black people who dismiss the idea of progress as a fantasy are incorrect in thinking they are the same as most white people who perhaps believe still that they will be fine no matter who wins our elections. Afro-pessimism is not found in the black church. One of the most eloquent rebuttals to Afro-pessimism came from the white teenage anti-gun lobbyists who opened up their story in the March for Our Lives demonstrations to include all youth trapped in violent cultures.

My father used to say that integration had little to do with sitting next to white people and everything to do with black people gaining access to better neighborhoods, decent schools, their share. Life for blacks was not what it should be, but he saw that as a reason to keep on, not check out. I had no idea how much better things were than they had been when he was my age, he said. That white people spent money in order to suppress the black vote proved that voting was a radical act. Bobby Kennedy happened to be in Indianapolis the day Dr. King was assassinated fifty years ago. I always thought my father had gone downtown to hear Kennedy speak. No, he told me much later, he’d been in the ghetto tavern of a crony, too disgusted to talk. Yet he wouldn’t let me stay home from school the next day.

A couple of decades later I was resenting my father speaking of my expatriate life as a black literary tradition, because I understood him to be saying that I wasn’t doing anything new and, by the way, there was no such thing as getting away from being black, or what others might pretend that meant. Black life is about the group, and even if we tell ourselves that we don’t care anymore that America glorifies the individual in order to disguise what is really happening, this remains a fundamental paradox in the organization of everyday life for a black person. Your head is not a safe space.

1. “Ta-Nehisi Coates Is the Neoliberal Face of the Black Freedom Struggle,” The Guardian, December 17, 2017.

2. “Coates and West in Jackson,” Boston Review, December 22, 2017.

3. Angela Y. Davis, Freedom Is a Constant Struggle (Haymarket, 2016).

Ratfucked Again

Bill Clark/CQ Roll Call/Getty Images: Anti-gerrymandering activists in costume as Maryland district 5 (left) and district 1 (right) in front of the Supreme Court, March 2018

A decade ago, when the Republican Party was paying the price for the various cataclysms brought on by the George W. Bush presidency—the shockingly inadequate response to Hurricane Katrina, the ill effects of the Iraq War, the great economic meltdown—the Democratic Party reached its post–Great Society zenith. It nominated and elected the country’s first African-American president—and he won decisively, against an admired war hero. It sent sixty senators to Washington, which it hadn’t done in forty years (and back then, around a dozen of those were southern conservatives).1 It also sent 257 representatives to the House, its highest number since before the Gingrich Revolution of 1994. Its governors sat in twenty-eight executive mansions, including in such improbable states as Tennessee, Kansas, Oklahoma, and Wyoming.

Then came the rise of the Tea Party and the calamitous 2010 elections. The Republicans’ net gain of sixty-three seats in the House of Representatives, giving them control over that chamber after a four-year hiatus, swallowed most of the headlines (the party also had a net gain of six Senate seats). The Democrats, as President Obama put it, took a “shellacking.”

But perhaps the more consequential results happened in the states. Democrats lost a net total of four gubernatorial races, taking them down to a minority of twenty-two governorships. They lost gubernatorial contests in some important large states: Pennsylvania and Ohio; Michigan, where Governor Rick Snyder would make his fateful decision about the source of water for the city of Flint; and Wisconsin, where Scott Walker would pass anti-union legislation and steer state government hard to starboard. Florida, governed before that election by Charlie Crist, an independent who had left the GOP and criticized it as extremist, turned to the very conservative Republican Rick Scott. And all of those improbable states listed above eventually reverted to GOP control.

Democrats likewise took a pounding in state legislative races in 2010. Pennsylvania, Michigan, and Ohio had had divided legislatures before that election, and Wisconsin a Democratic one. All four went Republican. So did Maine, New Hampshire, North Carolina, Alabama, and Minnesota. Iowa, Louisiana, Colorado, and Oregon moved from Democratic control to having divided legislatures. In many of these states, the pendulum has never swung back, or it has swung more aggressively in the Republican direction, so that we now have, for example, thirty-three Republican governors and just sixteen Democratic ones, while Republicans maintain complete control of thirty-two state legislatures to the Democrats’ mere thirteen.

It was just one year, 2010, and one election. But it was a pivotal one, because it coincided with the decennial census and the drawing, in time for the 2012 elections, of new legislative districts at the federal and state levels. These newly empowered Republican governors and legislators found themselves with enormous power to reshape politics for a decade, and boy did they use it.

It cannot be said that what they did with their power stood flagrantly outside the tradition of American representative democracy, about which there is much to be ashamed—or at the very least, much of which fails to match the inspiring story we learned as schoolchildren. But it certainly can be said that these new Republican majorities—and a few Democratic ones, too, for example in Maryland—took partisan gerrymandering to new levels. And they did so immediately, so that in the 2012 elections, as the congressional voting analyst David Wasserman of the Cook Political Report found, Democratic candidates for the House of Representatives collectively won 1.37 million more votes than their Republican opponents, or 50.6 percent of the vote—but only 46 percent of the seats.2

As we head into this fall’s elections, the Democrats are expected to make big gains: most observers believe they’ll recapture the House, which they can do with a net gain of around twenty-four seats. That would effectively forestall President Trump’s enacting any sort of legislative agenda. Retaking the Senate—considered a tougher climb, but now thought possible by the experts in a way it was not a few months ago—would mean the Democrats could bottle up presidential nominations and even return the favor of what the Republicans did to Judge Merrick Garland in 2016 by blocking a nomination to the Supreme Court, should one open up.

But as the next census approaches, state executive mansions and legislatures are at least as important, as liberals have belatedly come to realize. The Democrats actually have two election cycles to see how much ground they can regain here, as new district lines won’t be drawn until after the 2020 election results are in. The party that wins the right to draw the legislative maps of the 2020s will have enormous power to shape future Congresses and state legislatures—to determine, for example, whether districts are drawn in such a way that Republicans need only worry about winning conservative votes and Democrats liberal ones, or in a way that might push candidates toward the center; and whether districts comply with the Voting Rights Act, in a decade when much demographic change is expected, enough to perhaps turn the crucial state of Texas at least purple, if not blue. Much is at stake.

The story of what the Republicans accomplished in 2010 is ably told by David Daley, the former editor of Salon, in his book Ratf**ked: The True Story Behind the Secret Plan to Steal America’s Democracy, which Elizabeth Drew reviewed favorably in these pages in 2016.3 In sum, the story starts in the summer of 2009, when Chris Jankowski, who worked for a group called the Republican State Leadership Committee, read a story in The New York Times emphasizing the importance of the 2010 elections. Like all Republican operatives, Jankowski was down in the dumps at the time. But reading that Times article gave him a sense of purpose and mission.

Jankowski grasped the connections immediately. Map-drawing is hugely important; state legislatures control map-drawing; many state legislatures are narrowly divided; many can therefore be “flipped” from one party to another with comparatively small amounts of money, far less than it would cost to flip a congressional seat. Jankowski quickly put together a plan named REDMAP (short for “Redistricting Majority Project”), which would help the Republican Party dominate politics for the decade to come. “Win big in 2010 and Republicans could redraw the maps and lock in electoral and financial advantages for the next ten years,” Daley writes. “Push just 20 [House] districts from competitive to safely Republican, and the GOP could save $100 million or more over the next decade.”

So Jankowski got his seed money and started setting up offices in the state capitals most important to the effort. Wind filled the project’s sails in the form of the crippled economy, which gave anti-Obama voters extra motivation to turn out that fall, and the January 2010 Citizens United Supreme Court decision, which opened the door for many millions of dollars of “dark money” (untraceable back to donors) to finance both individual campaigns and independent committees. REDMAP was off to the races.

Ratf**ked describes the striking results. In Wisconsin, Republicans went into the 2010 election with a 50–45 deficit in the state assembly and an 18–15 disadvantage in the Senate; they emerged with respective majorities of 60–38 and 19–14. In Michigan, Republicans already controlled the state senate. They maintained that control, and they flipped a twenty-three-seat deficit in the lower house to a sixteen-seat advantage. In North Carolina, a Democratic 30–20 advantage dissolved into a 31–19 Republican edge in the state senate; in the state house, the Republicans went from a 68–52 disadvantage to a 67–52 edge (with one independent). And so on, and on.

In every election, corners were cut, court precedents ignored, dirty deeds performed. In Pennsylvania, a thirteen-term Democratic state representative named David Levdansky was defeated because he allegedly voted for a “$600 million Arlen Specter Library.” Such allegations were made in ads paid for by the state Republican Party and the Republican State Leadership Committee. In fact, $600 million was the entire state budget, although even that was the initially appropriated figure; actual outlays, as Levdansky explains to Daley, typically come in lower. As for the amount of that total actually earmarked for the library in honor of the longtime senator, it was around $2 million. But by the time Levdansky got around to explaining all that, most voters had stopped listening.

That same fall in North Carolina, a Democrat named John Snow found himself the target of a mailing about a black felon named Henry Lee McCollum, who was serving time for the rape and murder of an eleven-year-old girl. “Thanks to arrogant State Senator John Snow,” it read, “McCollum could soon be let off of death row.” Snow lost. Four years later, McCollum, who has an IQ in the sixties, and his half brother were cleared of the crime on DNA evidence; Henry Lee had spent more than thirty years on death row.4

The tools of map-drawing grew more and more sophisticated beginning in the 1980s, with the advent of computers. In one congressional district in Houston back then, two neighborhoods were united into the same district by inclusion of the Houston ship channel, where of course no actual voters lived. By now, districts can be drawn with such precision—including a specific census tract, excluding the one next door—that party registration of inhabitants can be calculated to the second or third decimal point. The result is districts so far removed from the “compact and contiguous” standard that courts have been known to apply that they become the butt of jokes. Pennsylvania’s current seventh congressional district, two blobs linked by a little strip of land that appears to be no more than a few miles wide, reminded one observer of nothing so much as “Goofy kicking Donald Duck.”

Through such techniques, the majority party can figure out ways to cram the voters of the minority party into as few districts as possible. Republicans in particular are assisted in this effort by the fact that Democrats and liberals tend to live in higher-density areas more often than Republicans and conservatives do. Hence, millions of Democrats are packed into comparatively fewer urban districts and suburban districts close to the city center, while Republicans are spread out over more districts. All this in turn means that Republicans can rack up impressive legislative majorities even as they are winning a minority of the vote.

This happened, as Daley documents, in state after state. In Wisconsin in 2012, for example, President Obama won 53 percent of the vote, and Democratic Senate candidate Tammy Baldwin won 51.4 percent. Democrats also won 50.4 percent of the aggregate vote for candidates for the House of Representatives, but Republicans took five of the state’s eight seats. In the state assembly, Democratic candidates overall received 174,000 more votes than GOP candidates, but Republicans won 60 percent of the seats.

A few rays of hope have recently emerged. First, Arizona is one of a handful of states (including California) that have turned over the drawing of legislative lines after the 2020 census to an independent commission. Such commissions will not be entirely free of politics, but they will surely be an improvement on legislators’ drawing districts for themselves and their friends.

Second, the courts have thrown out the egregious lines that Republicans drew in Pennsylvania, a state where Democrats outnumber Republicans, where until 2016 no Republican presidential candidate had won since 1988, where there had been twelve Democrats in the state’s House delegation to seven Republicans, but where after 2010 the congressional split went to 13–5 in the Republicans’ favor. The new map, which was drawn by the Pennsylvania Supreme Court and will be used this November, actually features districts that for the most part make some geographic sense and that most experts think will produce something more like an even split or a narrow Democratic advantage (which would reflect actual voter registration).5

In June, the Supreme Court is expected to rule on two more gerrymandering cases—one coming from Wisconsin, where Republicans drew egregious lines, and another from Maryland, where Democrats were the culprits. At issue is whether a Court majority will define discernible standards for what constitutes partisan gerrymandering. If it does so, a flood of gerrymandering litigation is likely to ensue, which reformers hope will lend momentum to the movement to take the process out of politicians’ hands once and for all.

In the beginning, the edict was simple. The fifty-five delegates to the 1787 Constitutional Convention agreed—under a committee led, ironically enough, by Elbridge Gerry, who some years later as governor of Massachusetts would lend his name to the practice under discussion here—that each member of the new House of Representatives would represent around 40,000 people. Later—on the last day of the convention—they lowered the number to 30,000. The Constitution they approved provided that every ten years, a census would be taken, and the size of House districts and number of representatives adjusted accordingly.

A census was duly conducted every decade, and the populations of congressional districts increased by a few thousand each time—37,000 in 1800, 40,000 in 1810, and so on. But the various states’ commitment to drawing fair districts was, shall we say, indifferent. This was a problem that went back to the British Parliament. As boroughs were incorporated, they demanded representation, and they were given it; but no one had yet thought (say, in the 1600s) about the problem of equal representation. As such, both towns with only a few people and fast-growing cities sent two representatives to Parliament. Nothing was done, and by 1783, writes Rosemarie Zagarri in The Politics of Size, a Commons committee reported that a majority of the body’s members was elected by just 11,075 voters—a staggering 1/170th of the population.6

The United Kingdom fixed this “rotten borough” problem with the Reform Act of 1832. In the United States, however, the boroughs just got rottener and rottener over the course of the nineteenth century and well into the twentieth. As immigrants began to arrive, and after the slaves were freed, and then as African-Americans left the southern fields for the northern cities, few states made any effort whatsoever to draw fair congressional districts every ten years. Most continued to conduct a census; they then resolutely ignored the results, openly thumbing their noses at the Constitution. The motivation, of course, was to deny cities—with their populations of immigrants and, later, black people—their rightful representation.

Here are some numbers, from J. Douglas Smith’s eye-opening 2014 book On Democracy’s Doorstep.7 The inequities nearly defy belief. In Illinois after World War II, the populations of congressional districts ranged from 112,000 to 914,000. The larger district was urban, the smaller one rural, and the larger number meant of course that urban areas had fewer representatives, and that residents of the larger district had about one eighth the voice in Congress that residents of the smaller district had. In midcentury California, the 6,038,771 residents of Los Angeles County had one state senator, the same as the 14,294 inhabitants of three rural Sierra counties. As you might guess, the numbers in the South were appalling, disenfranchising what black voters did exist. But of all the states, the worst was Michigan, where rural voters and the legislative barons of Lansing lived in mortal fear of Detroiters having their rightful political say in the state’s affairs.

Hulton Archive/Getty Images. Elbridge Gerry, circa 1800: as governor of Massachusetts he became known for manipulating voting districts, a process now called “gerrymandering”

So it went, for 170 long years. How did such states get away with this? The courts would not enforce fair districts. Aggrieved citizens filed lawsuits, and courts looked at the numbers and said “you’re right”; but they would go on to aver that this was a political matter best settled through politics. Smith tells the harrowing story of how these wrongs were finally put right in the early 1960s in two landmark Supreme Court decisions, Baker v. Carr and Reynolds v. Sims. In Baker (1962), which originated in Tennessee, the Court held that apportionment was a “justiciable” issue, i.e., one on which court intervention was appropriate. Two years later in Reynolds, which originated in Alabama, the Court held by 8–1 that all legislatures (except the United States Senate) had to meet the “one person, one vote” standard of representation, so that districts all had more or less equal numbers of voters.

It’s a riveting tale, involving Archibald Cox, later of Watergate fame but in the early 1960s President Kennedy’s solicitor general, urging caution, and Robert F. Kennedy pushing for more aggressive arguments before the Court. Earl Warren was asked numerous times to name the toughest case decided while he presided as chief justice. The man who oversaw decisions like Brown v. Board of Education and Miranda v. Arizona always answered “apportionment.”

Two titans squared off as Justice William O. Douglas emerged as the Court’s biggest champion of taking on the apportionment issue, and Felix Frankfurter as its chief opponent (during the Baker deliberations; by the time of Reynolds, Frankfurter was gone). Another justice, Charles Evans Whittaker, was so tormented by the Baker deliberations that he had a nervous breakdown and left the Court. A movement, egged on by Senate Republican leader Everett Dirksen, started immediately to call a constitutional convention to undo this judicial treachery and return to the states the right to treat legislative representation as they pleased. Thirty-three states said yes—leaving the effort one state short of success.

This is the larger historical background against which recent Republican efforts need to be understood. The history of legislative system-rigging by rural, conservative interests is a long and ignoble one. For most of our history, our democracy has been, in Smith’s memorable phrase, a “deliberately misshapen enterprise.”

Democrats have now, for the first time in modern history, set up the machinery to try to do in 2020 what the GOP accomplished in 2010. The National Democratic Redistricting Committee was established last year and is headed by Eric Holder, the former attorney general. “We have to come up with a system that is more neutral, because the reality now is that we have politicians picking their voters as opposed to citizens choosing who their representatives are going to be,” Holder said at a Harvard Kennedy School forum on April 30.

His group raised nearly $11 million in its first six months and has placed a dozen states on its “target” list and another seven on its “watch” list. In most states, the group covets the governor’s mansion, for obvious reasons, and hopes to flip at least one house of the state legislature. In Minnesota, Wisconsin, North Carolina, and Ohio, it’s also eyeing down-ballot races. It appears to be most focused on Ohio, where the offices of secretary of state (which oversees elections) and state auditor are on its list.

If successful, the Holder group’s efforts—he is also considering running for president, by the way—will bear late fruit. In the meantime, to capture the two dozen seats they need to control the House, Democratic candidates this fall will need to win considerably more than 51 percent of the total vote. Wasserman of the Cook Political Report estimates that Democrats need to beat Republicans by 7 percent overall, which is in the vicinity of the party’s lead in most polls asking respondents whether they prefer that Democrats or Republicans win this fall. But the Brennan Center for Justice issued a report in late March saying that the number needed to win is more like 11 percent. The report assumes different overall Democratic vote margins and from there projects potential Democratic seat gains in the House based on historical totals and on Brennan’s own estimates taking gerrymandering into account.8

Notice that even according to the more optimistic (at least in the lower range) historical expectation numbers, the Democrats would need to win the national vote by a margin of 6 percent to gain enough seats to retake the House. Doing that is a tall order. Even in 2012, their best year in recent times, they won by only about 2 percent overall.

Every other sign for the Democrats has been encouraging. Enthusiasm has been far greater among Democratic voters than among Republican ones. Even when Democratic candidates have lost, they’ve lost encouragingly. In a late April special congressional election in Arizona, the Democratic candidate came within five points of the Republican in a district that both Donald Trump in 2016 and Mitt Romney in 2012 carried by more than twenty points. After the results came in, Wasserman tweeted: “If the only data point you had to go on was last night’s #AZ08 result, you’d think a 30–40 seat Dem House gain in Nov. would be way low.”

So Democrats have many reasons to retain their optimism. But if they fall short, the reason may have less to do with Donald Trump than with Chris Jankowski and his work in 2010, work that stands far less athwart American political history and tradition than we’d prefer to believe.

—May 10, 2018

  1. Technically, fifty-eight; but two independents, Bernie Sanders of Vermont and Joseph Lieberman of Connecticut, caucused with the Democrats, giving them the crucial sixty votes needed to break a filibuster.

  2. The Cook paper itself is behind a paywall, but the numbers can be found at W. Gardner Selby, “Republicans Won More House Seats Than More Popular Democrats, Though Not Entirely Because of How Districts Were Drawn,” November 26, 2013.

  3. “American Democracy Betrayed,” The New York Review, August 18, 2016.

  4. See Mandy Locke and Joseph Neff, “Pardoned Brothers’ Payout Triggers Fight Over Who Gets a Cut,” The Charlotte Observer, May 1, 2017.

  5. See Nate Cohn, Matthew Block, and Kevin Quealy, “The New Pennsylvania Congressional Map, District by District,” The New York Times, February 19, 2018.

  6. The Politics of Size: Representation in the United States, 1776–1850 (Cornell University Press, 1987), p. 37.

  7. On Democracy’s Doorstep: The Inside Story of How the Supreme Court Brought “One Person, One Vote” to the United States (Hill and Wang, 2014).

  8. Laura Royden, Michael Li, and Yurij Rudensky, “Extreme Gerrymandering and the 2018 Midterm,” Brennan Center for Justice, March 23, 2018.


Remodeling Mayhem

The Image-Complex/Rafah: Black Friday/Forensic Architecture, 2015. Photographs and videos are placed within a 3D model to tell the story of one of the heaviest days of bombardment in the 2014 Israel-Gaza conflict

About five miles north of the Israeli city of Beersheba, on the edge of the Negev desert, there’s a small village named Al-Araqib that has been demolished more than a hundred times in the last eighteen years. These demolitions have ranged from the total razing by over a thousand armed policemen with trucks and bulldozers to a simple flattening of a tent by a tractor.

In its heyday, the village had about 400 inhabitants, though now only a dozen or so remain, living within the limits of the village cemetery, next to the many graves. The cemetery affords the main claim of the Bedouin villagers to the land: if they can prove that they have cultivated it since before 1948, and thus that the village has existed for longer than that, then the Israeli government will let them stay.

The plight of this village—shared by forty-six others that have been collectively dubbed “the battle of the Negev” by the Israeli media and establishment—is one of the many cases of state violence examined in “Counter Investigations,” an exhibition at the Institute of Contemporary Arts in London detailing the work of an investigative agency called Forensic Architecture. The group, founded by the Israeli architect Eyal Weizman in 2010 and based at Goldsmiths College in South London, seeks to use forensic methods of evidence-gathering and presentation against the nation states that developed them. Weizman believes that architects, who are skilled at computer modeling, presenting complex technical information to lay audiences, and coordinating projects made up of many different experts and specialists, are uniquely suited to this kind of investigation. But there’s another, simpler explanation for their involvement: “Most people dying in contemporary conflicts die in buildings.”

From missiles designed to pierce a hole in a roof before exploding inside a particular room, to army units blasting through the walls of houses, to the repeated demolitions of villages like Al-Araqib, conflict has increasingly acquired an architectural dimension. This development has prompted a wave of work by writers, artists, and academics such as Sharon Rotbard, Derek Gregory, Trevor Paglen, and Hito Steyerl focusing on the intersection between design, warfare, and the city. Steyerl, for example, writes in her recent book Duty Free Art that killing is a “matter of design” expressed through planning and policy; like Weizman she asks us not just to examine the moment a gun is fired, but also the ever-widening scope of circumstances and legal mechanisms that make such violence possible, even inevitable.

Forensic Architecture, 2016. A composite image merging 3D modeling with news footage of a home destroyed in a drone strike on Miranshah, North Waziristan, Pakistan

Weizman studied at the Architectural Association in London, and published his first book in 2000, Yellow Rhythms: A Roundabout for London, an eccentric proposal for a vast roundabout straddling the Thames in southwest London, just north of Vauxhall. This was less a sincere attempt to solve traffic congestion and more an inventive thought experiment designed to reveal the absurdities and inequities of the London real estate market. Weizman imagined a state-owned, speculative development in the center of the roundabout—a set of empty skyscrapers accumulating value that would be skimmed off and used to fund progressive policies elsewhere. With this project Weizman embraced an attitude that had flourished at the AA and a number of other innovative architecture schools since the Sixties: that architecture is a way of thinking about the world, of synthesizing and presenting knowledge, rather than just a way of designing and constructing buildings.

Within a few years, Weizman was able to put this approach to the test. He and his colleague Rafi Segal were selected to represent Israel at the World Congress of Architecture in Berlin in 2002. As part of the exhibit, the pair put together a catalogue of fourteen essays detailing the different ways in which the business and practice of architecture was part of Israeli strategy in the West Bank. The result, entitled A Civilian Occupation, was banned by the Israeli Association of United Architects, which ordered the pulping of the 5,000 printed copies (Segal and Weizman managed to save around 850, and the book was later re-released by Verso Books and the Israeli publisher Babel). Weizman built upon these ideas in his 2007 book Hollow Land, a sweeping investigation of Israeli policy from an architectural point of view that combines polemical force with minute, often surreal detail gleaned from interviews with Israeli military officers: “Derrida may be a little too opaque for our crowd,” says one, unexpectedly. “We share more with architects; we combine theory and practice. We can read, but we know as well how to build and destroy, and sometimes kill.”

Through this work Eyal Weizman, and later Forensic Architecture, has been involved in numerous court cases in Israel—most successfully as part of an action in the Israeli High Court designed to halt the construction of a section of separation wall in the village of Battir on environmental grounds. The agency provided models and animations demonstrating the environmental damage that would be caused by various army engineer proposals for a more “architecturally sustainable” and less “invasive” form of wall—leading to the idea of a wall in that area being abandoned altogether. Wider work by the agency has been used in trials from Guatemala to the International Criminal Court at the Hague, with similar aims—to use visualization and modeling to bring a complicated set of relationships vividly to life in the courtroom. This can have varying results, and their work is ignored or dismissed as often as it produces dramatic turnarounds. Battir might have been saved, but the future of the village of Al-Araqib, for example, is still precarious despite exhaustive historical research and a number of court sessions.

Although showcasing some of the agency’s work in Israel, where Weizman and his colleagues continue to collaborate with civil society groups and human rights organizations such as ActiveStills and B’Tselem, the main purpose of the London exhibition is to show how Forensic Architecture adapts its practice to provide a commentary on its immediate location. The show is squarely aimed at a British audience during a period of mounting hostility toward refugees and migrants. Immediately after the referendum to leave the European Union, for example, there was a documented spike in hate crimes in Britain, and they have yet to fall below pre-referendum levels. The exhibition has coincided with a long overdue public discussion about a policy once proudly described by Prime Minister Theresa May as the “hostile environment,” in which both non-citizens and British citizens of color have faced the same labyrinthine bureaucracy and a pervasive attitude of skepticism when trying to prove their right to be in the country. Preliminary figures suggest that thousands of citizens have suffered at the hands of this racist system in recent years, leading to destitution, lack of healthcare provision, and sometimes even deportation.

Forensic Architecture and Anderson Acoustics, 2017. Simulated propagation of sound within a digital model of the internet café where Halit Yozgat, the son of Turkish immigrants, was murdered in Kassel, Germany, 2006

The show’s curators have subtly assembled parts of the exhibition to address this political background, in which issues surrounding race and immigration regularly dominate British media. The first project in the show uses videos, a vast timeline, and outlines traced on the gallery floor to reconstruct the murder of a Turkish man by a neo-Nazi in Germany, while another uses survivors’ testimonies to construct a harrowing model of a secret torture prison in Syria called Saydnaya. The latter project, perhaps their most famous to date, was carried out by Lawrence Abu Hamdan, a member of the group specializing in sound analysis. Through an immersive video shown at the exhibition, we see Abu Hamdan working with survivors to reconstruct claustrophobic visual models of the interior of the prison, according to the sounds they heard while held there, or transported to and from torture cells, in near total darkness. Abu Hamdan’s approach preserves gaps and errors in the memories of the survivors, and makes visual the trauma they experienced—corridors stretch to impossible lengths, while sinister locked doors multiply and spread around the viewer. These eerie plans, resembling a nightmarish memory palace, testify not just to the physical existence of the hidden prison, but also to the psychological consequences of what happened there, persistent in the minds of the survivors.

Beyond the film about Saydnaya, as the exhibition moves deeper into a cavernous darkened room, three video projections break down into traumatic detail the way that a lack of coordinated sea rescue services exacts a terrible death toll among migrants attempting to cross the Mediterranean: while different European authorities, NGO ships, and Libyan coast guard vessels jostle chaotically, refugees drown, sometimes only feet away from the boats that are supposed to rescue them. (One such death is shown on film, captured by a camera fixed to the side of an NGO boat, and cannot be forgotten once seen.) According to Forensic Architecture’s research coordinator Christina Varvia, the purpose of these displays—from Germany, to Syria, to the Mediterranean Sea—is to viscerally impress upon a British audience the brutal reality of the refugee experience—whether it’s the violence they’re fleeing from, the dangerous, often deadly journey they face if they try to reach Europe (most don’t), or the racism they often encounter once they get there.

It is a testament to the group’s media fluency and inventive presentations that the sheer quantity of depressing, disturbing information on display in this exhibition rarely feels wearing or boring. For all its focus on precisely reconstructed factuality, many acutely emotional moments linger in the mind long after seeing the works. Whether or not this is “art” is a debate that Forensic Architecture long since left behind—the group is adept at presenting its findings in galleries and courts alike. (The art establishment is clearly not troubled by the question either: Forensic Architecture has been nominated for the prestigious Turner Prize.)

Why, when nation states have always committed crimes and lied about them, has an organization such as Forensic Architecture appeared only in the last decade? In Duty Free Art Hito Steyerl argues with apocalyptic brio that the world is entering a period of “post-democracy,” in which “states and other actors impose their agendas through emergency powers,” while democratic mandates weaken and “oligarchies of all kinds are on the rise.” Perhaps the group owes its existence to this new mood, in which authorities can no longer be trusted to handle the evidence, and impartiality seems increasingly impossible or, in Weizman’s opinion, even undesirable. “Having an axe to grind,” he writes, “should sharpen the quality of one’s research rather than blunt one’s claims.”

Ariel Caine/Forensic Architecture, 2016. A diagram of a well belonging to Awimer Salman Abu Medigam, Al-Araqib, north of Beersheba, Negev desert, September 2016; blue rectangles indicate the positions of the individual image frames from which the 3D information was derived

“Counter Investigations: Forensic Architecture” was at London’s Institute of Contemporary Arts through May 13. Forensic Architecture: Violence at the Threshold of Detectability, by Eyal Weizman, is distributed by MIT Press. Duty Free Art: Art in the Age of Planetary Civil War, by Hito Steyerl, is published by Verso.


Devastatingly Human

Paula Rego/Marlborough International Fine Art. Paula Rego: The Family, 1988

The gripping and dramatic show “All Too Human: Bacon, Freud and a Century of Painting Life” merits its title: it is “all too human” in the tender, painful works that form its core. But “a century of painting life” promises something wider—does it smack of marketing, a lure to bring people in? In fact, the heart of the show is narrower and more interesting, illustrating the competing and overlapping streams of painterly obsession in London in the second half of the twentieth century. It shows us how, in their different ways, painters such as Francis Bacon and Lucian Freud, Leon Kossoff and Frank Auerbach, R.B. Kitaj, and Paula Rego redefined realism. In defiance of the dominant abstract trend, they teased and stretched the practice and impact of representational art. “What I want to do,” Francis Bacon said in 1966, “is to distort the thing far beyond the appearance, but in the distortion to bring it back to a recording of the appearance.” In this show, terms like “realism” and “human” take on new meaning and power.

Private collection, Switzerland, c/o Di Donna Galleries. Chaïm Soutine: The Butcher Stall, circa 1919

The exhibition begins, cleverly, with some forebears of these London artists, pre-war painters who looked with intensity at the lives, settings, and landscapes that most affected them, and used paint in a highly personal way to convey not only what they saw, but what they felt. It feels odd, at first, to walk into a show that claims to be about “life” and find a landscape rather than a life-study, yet the urgent, textured use of paint in Chaïm Soutine’s earthy landscapes, as well as his distorted figures and the raw strength of The Butcher Stall (circa 1919), with its hanging carcasses, had a profound impact on Francis Bacon. In a similar way, all the works in this first room reach toward the future: Stanley Spencer’s portraits of his second wife, Patricia Preece, clothed and naked, stare out with the unpitying confidence of Lucian Freud’s early portraits. Walter Sickert’s dark portrayals of London prostitutes—his attempt to give “the sensation of a page torn from the book of life”—anticipate the unsentimental nudes of Freud and Euan Uglow (no relation to me). David Bomberg’s layered, arid Spanish landscapes point toward the scumbled, perspectiveless scenes of Kossoff and Auerbach.

Tate. Francis Bacon: Dog, 1952

Nothing, however, prepares one for the tender ferocity of Bacon’s isolated, entrapped figures. In the earliest of these, the large canvas of Figure in a Landscape (1945), a curled-up, almost human form appears to be submerged in a desert—we see his arm and part of his body, but the legs of his suit hang, empty, over a bench. This is masculinity destroyed. The sense of desperation is even stronger in Bacon’s paintings of animals, such as Dog (1952), in which the dog whirls like a dervish, absorbed in chasing its tail, while cars speed by on a palm-bordered freeway, or Study of a Baboon (1953), where the monkey flies and howls against the mesh of a fence. In their struggles, these animals are the fellows of Bacon’s “screaming popes”: in Study after Velazquez (1950), a businessman in a dark suit, jaws wrenched open in a silent yell, is trapped behind red bars that fall like a curtain of blood. The curators connect Bacon’s postwar angst with Giacometti’s elongated statues, isolated in space, and to the philosophy of existentialism. Yet Bacon’s vehement brushstrokes speak of energy and involvement, physical, not cerebral responses. In Study for Portrait II (after the Life Mask of William Blake) (1955), you feel the urgent vision behind the lidded eyes. He cares, passionately.

Celia Paul/Victoria Miro, London and Venice Celia Paul: Painter and Model, 2012

This was a postcolonial as well as a postwar world, a point made abruptly by devoting a room to the work of F.N. Souza, who came to London from Bombay in 1949, and worked here until he left for New York in 1967. Despite Souza’s popularity at the time, and the range of sacred and profane references that link him uneasily to Bacon, his stark religious iconography feels out of keeping with the bodily compulsion of Bacon’s work and the new streams of influence shaping what R.B. Kitaj named “the School of London.”

One of these streams flowed from the Slade, where William Coldstream was professor of Fine Art and the young Lucian Freud was a visiting tutor. Here, in a very different way to Bacon, you feel the pressure of flesh. Coldstream believed that artists should work without preconceptions, through minute, painstaking observation, fixing “reality” with measurement, allowing the subject to emerge slowly on the canvas. His Seated Nude (1952–1953) was painted over at least thirty sittings of about two hours each—no wonder the model looks glazed. His pupil Euan Uglow adopted this technique, setting his figures against a geometric grid. It gives them an eerie physicality. (I’m not the only person to stand in front of his 1953–1954 Woman with White Skirt and say, “Paula Rego.”) Uglow is famous for telling a model, “Nobody has ever looked at you as intensely as I have.” Over time, his control of detail and setting became obsessive, but his piercing gaze and careful technique remained, rendering his subjects at once solid and dreamlike, their inner spirit elusive but embodied.  

Tate Lucian Freud: Girl with a White Dog, 1950–1951

In the 1950s, Uglow’s belief in the value of minute observation was shared by Freud, who admitted, as Emma Chambers writes in the exhibition catalogue, to a “visual aggression” toward his sitters: “I would sit very close and stare. It could be uncomfortable for both of us.” His paintings from this period, delicately wrought with a fine sable brush, are almost hallucinatory in their detail, with a Pre-Raphaelite veracity of sheen and texture. We see the softness of material, the fur of the dog. And how exposed and alarmed his first wife, Kitty Garman, looks in the extraordinary Girl with a White Dog (1950–1951), in her pale green dressing gown with one white, veined breast revealed. 

Frank Auerbach/Marlborough Fine Art/Tate Frank Auerbach: Head of E.O.W. I, 1960

At the same time as Coldstream was instilling in his students the virtues of precision and measurement, David Bomberg was inspiring his pupils at the Borough Polytechnic in South London from 1946 to 1953 with a far freer, more tactile approach. To Bomberg, painting was about the “feeling” and experience of form, not its mere appearance. His own work conveyed the sense of mass in fluid, sensuous oils, and young artists such as Frank Auerbach, Dennis Creffield, Leon Kossoff, and Dorothy Mead flocked to his classes. Often working outdoors, as Bomberg did, Auerbach and Kossoff painted the settings they knew, showing a new London rising from the old, driving across the canvas in slabs of paint and thick encrustations. Auerbach’s Rebuilding the Empire Cinema, Leicester Square (1962) and Kossoff’s Building Site, Victoria Street (1961) are so tactile that they make you want to trace the lines with your hand, while the sticky ridges of Auerbach’s Head of E.O.W. I (1959–1960)—so strong from a distance, so baffling up close—seem as much sculpture as painting.   

Again and again in this exhibition, we move from the exchange of ideas and influences to the individual vision. Some works, indeed, are so drenched in emotion that they produce ripples of shock. The intimacy of Freud’s work is intensified when he moves, around 1960, from minute, close-up fidelity to large, expressive brushstrokes. In his later paintings, he catches the twist of muscles, the sweat on the skin, the pride and fullness of bodies in sleep, as in the great Leigh Bowery (1991), showing Bowery, a performance artist with a body of billowy corpulence, with his head slumped gently on his shoulder, or in Sleeping by the Lion Carpet (1996), where Sue Tilley—“Big Sue,” Bowery’s cashier at his Taboo night club and a benefits supervisor at the Charing Cross JobCentre—dozes safely before a predatory image.

Photo by Prudence Cuming Associates Ltd./The Estate of Francis Bacon/DACS, London Francis Bacon: Study for Portrait of Lucian Freud, 1964

By the 1960s, when Freud was subjecting his models to hours and days of sitting, Bacon was standing back, using photographs rather than live models. One room here shows a selection of portraits he commissioned from the photographer John Deakin. These are direct, intimate, and suggestive, but when Bacon explores the human form, the effect is very different. Bodies and heads become twisted, swollen, contorted. In his Study for a Portrait of P.L. (1962), painted in the year of Peter Lacy’s death, after ten years of their turbulent, sometimes violent relationship, the internal and sexual organs seem to bulge through their covering. Two years later, his Study for Portrait of Lucian Freud emphasized the strong torso, the fierce expression, the unnerving clarity of Freud’s gaze. These are psychological as much as physical studies. In the moving Triptych (1974–1977), an unusual outdoor, light-filled work, the body beneath the umbrella writhes on the deserted beach, as Bacon mourns the death of his lover, George Dyer. But beyond them, the clear sky suggests a slow, painful coming to terms with loss—the promise of new life, or at least oblivion, in the deep blue of the sea beyond?

Bacon’s solitary figures are, paradoxically, imbued with a feeling of relationship. The same is true of Freud’s portraits, of his wife, his mother, his daughter, his friends. The intimacy of the family is also part of what it means to be “all too human.” Michael Andrews, for example, intrigued by Bacon’s use of photography, worked from a color photograph of a holiday in Scotland for his darkly beautiful Melanie and Me Swimming (1978–1979), spray-painted in acrylic. This feels like a moment swimming out of time into memory. And sometimes the sociability of London’s artistic life is itself commemorated. In Colony Room I (1962), Andrews painted the Colony Club, where Bacon, Freud, and Deakin drank with Soho’s artists and writers. “Life,” in the sense of a community, also fills R.B. Kitaj’s brilliant group scenes, such as Cecil Court, London W.C.2. (The Refugees) (1983–1984). His crowded, colorful The Wedding (1989–1993) celebrates not only his marriage to Sandra Fisher but his friendships—with Auerbach, Freud, Kossoff, and David Hockney, among others.

The estate of R. B. Kitaj/Tate R.B. Kitaj: Cecil Court, London W.C.2 (The Refugees), 1983–1984

Hockney’s work is inexplicably absent here, and so, up to this point in the show, are works by women, apart from a blurry, atmospheric nude by Dorothy Mead. But suddenly, you turn a corner, and there is Paula Rego. The streams of the London School flow together. Rego came from Portugal when she was sixteen to finish her education, and from 1952 to 1956 she studied at the Slade under Coldstream, alongside Andrews, Uglow, and her future husband, Victor Willing. As Victor slowly declined from multiple sclerosis, her painting became increasingly personal. The Family (1988), painted in the last months of his life, shows two women helping him take off his jacket—yet there is a strange undertone here: they seem to be shuffling him into the grave. The feeling is curiously sinister, perhaps reflecting Rego’s awareness that women—always the carers—are often so intimate with death. The little shrine in the background may show St. George slaying the dragon, but above him stands St. Joan, the martyred, martial saint.

Paula Rego/Tate Paula Rego: The Betrothal: Lessons: The Shipwreck, after ‘Marriage à la Mode’ by Hogarth, 1999

Rego has often used stories to uncover the depths of our humanity, exposing the shattered dreams and desires of women across time. In Bride (1994), the bride lies back awkwardly, as if her wedding dress were a strait-jacket. In the trilogy The Betrothal: Lessons: The Shipwreck, after ‘Marriage à la Mode’ by Hogarth (1999), Hogarth’s moral tale of greed and disease, a mockery of the dream family, is reworked in the fashions of her own childhood.

By contrast, the final room—apart from Celia Paul’s Family Group (1984–1986) and the powerfully interior Painter and Model (2012)—feels like a token addition, a nervous nod to gender and diversity. Jenny Saville, Cecily Brown, and Lynette Yiadom-Boakye are fine artists, but they belong in a different narrative. A misstep, yet “All Too Human” remains an extraordinary exhibition, full of works of deep seriousness and bold, brave fidelity to life. For me, it ends with Rego’s bitingly honest work. With her bold, distinctive use of outline and color, and her mighty sympathy for human pain and longing, her paintings show life in all its senses.

Celia Paul/Victoria Miro, London and Venice Celia Paul: Family Group, 1984–1986

“All Too Human: Bacon, Freud, and a Century of Painting Life” is at the Tate through August 27.

Epigenetics: The Evolution Revolution

Cas Oorthuys/Nederlands Fotomuseum Children in Amsterdam during the Dutch Hunger Winter, 1944–1945

At the end of the eighteenth century, the French naturalist Jean-Baptiste Lamarck noted that life on earth had evolved over long periods of time into a striking variety of organisms. He sought to explain how they had become more and more complex. Living organisms not only evolved, Lamarck argued; they did so very slowly, “little by little and successively.” In Lamarckian theory, animals became more diverse as each creature strove toward its own “perfection,” hence the enormous variety of living things on earth. Man is the most complex life form, therefore the most perfect, and is even now evolving.

In Lamarck’s view, the evolution of life depends on variation and the accumulation of small, gradual changes. These are also at the center of Darwin’s theory of evolution, yet Darwin wrote that Lamarck’s ideas were “veritable rubbish.” Darwinian evolution is driven by genetic variation combined with natural selection—the process whereby some variations give their bearers better reproductive success in a given environment than other organisms have.1 Lamarckian evolution, on the other hand, depends on the inheritance of acquired characteristics. Giraffes, for example, got their long necks by stretching to eat leaves from tall trees, and stretched necks were inherited by their offspring, though Lamarck did not explain how this might be possible.

When the molecular structure of DNA was discovered in 1953, it became dogma in the teaching of biology that DNA and its coded information could not be altered in any way by the environment or a person’s way of life. The environment, it was known, could stimulate the expression of a gene. Having a light shone in one’s eyes or suffering pain, for instance, stimulates the activity of neurons and in doing so changes the activity of genes those neurons contain, producing instructions for making proteins or other molecules that play a central part in our bodies.

The structure of the DNA neighboring the gene provides a list of instructions—a gene program—that determines under what circumstances the gene is expressed. And it was held that these instructions could not be altered by the environment. Only mutations, which are errors introduced at random, could change the instructions or the information encoded in the gene itself and drive evolution through natural selection. Scientists discredited any Lamarckian claims that the environment can make lasting, perhaps heritable alterations in gene structure or function.

But new ideas closely related to Lamarck’s eighteenth-century views have become central to our understanding of genetics. In the past fifteen years these ideas—which belong to a developing field of study called epigenetics—have been discussed in numerous articles and several books, including Nessa Carey’s 2012 study The Epigenetic Revolution2 and The Deepest Well, a recent work on childhood trauma by the physician Nadine Burke Harris.3

The developing literature of epigenetics has forced biologists to consider the possibility that gene expression can be influenced, in heritable ways, by environmental factors such as stress or deprivation, which were previously believed to have no effect on it. “The DNA blueprint,” Carey writes,

isn’t a sufficient explanation for all the sometimes wonderful, sometimes awful, complexity of life. If the DNA sequence was all that mattered, identical twins would always be absolutely identical in every way. Babies born to malnourished mothers would gain weight as easily as other babies who had a healthier start in life.

That might seem a commonsensical view. But it runs counter to decades of scientific thought about the independence of the genetic program from environmental influence. What findings have made this new view possible?

In 1975, two English biologists, Robin Holliday and John Pugh, and an American biologist, Arthur Riggs, independently suggested that methylation, a chemical modification of DNA that is heritable and can be induced by environmental influences, had an important part in controlling gene expression. How it did this was not understood, but the idea that through methylation the environment could, in fact, alter not only gene expression but also the genetic program rapidly took root in the scientific community.

As scientists came to better understand the function of methylation in altering gene expression, they realized that extreme environmental stress—the results of which had earlier seemed self-explanatory—could have additional biological effects on the organisms that suffered it. Experiments with laboratory animals have now shown that these outcomes are based on the transmission of acquired changes in genetic function. Childhood abuse, trauma, famine, and ethnic prejudice may, it turns out, have long-term consequences for the functioning of our genes.

These effects arise from a newly recognized genetic mechanism called epigenesis, which enables the environment to make long-lasting changes in the way genes are expressed. Epigenesis does not change the information coded in the genes or a person’s genetic makeup—the genes themselves are not affected—but instead alters the manner in which they are “read” by blocking access to certain genes and preventing their expression. This mechanism can be the hidden cause of our feelings of depression, anxiety, or paranoia. What is perhaps most surprising of all, this alteration could, in some cases, be passed on to future generations who have never directly experienced the stresses that caused their forebears’ depression or ill health.

Numerous clinical studies have shown that childhood trauma—arising from parental death or divorce, neglect, violence, abuse, lack of nutrition or shelter, or other stressful circumstances—can give rise to a variety of health problems in adults: heart disease, cancer, mood and dietary disorders, alcohol and drug abuse, infertility, suicidal behavior, learning deficits, and sleep disorders. Since the publication in 2003 of an influential paper by Rudolf Jaenisch and Adrian Bird, we have started to understand the genetic mechanisms that explain why this is the case. The body and the brain normally respond to danger and frightening experiences by releasing a hormone—a glucocorticoid—that controls stress. This hormone prepares us for various challenges by adjusting heart rate, energy production, and brain function; it binds to a protein called the glucocorticoid receptor in nerve cells of the brain.

Normally, this binding shuts off further glucocorticoid production, so that when one no longer perceives a danger, the stress response abates. However, as Gustavo Turecki and Michael Meaney note in a 2016 paper surveying more than a decade’s worth of findings about epigenetics, the gene for the receptor is inactive in people who have experienced childhood stress; as a result, they produce few receptors. Without receptors to bind to, glucocorticoids cannot shut off their own production, so the hormone keeps being released and the stress response continues, even after the threat has subsided. “The term for this is disruption of feedback inhibition,” Harris writes. It is as if “the body’s stress thermostat is broken. Instead of shutting off this supply of ‘heat’ when a certain point is reached, it just keeps on blasting cortisol through your system.”

It is now known that childhood stress can deactivate the receptor gene by an epigenetic mechanism—namely, by creating a physical barrier to the information for which the gene codes. What creates this barrier is DNA methylation, by which methyl groups known as methyl marks (composed of one carbon and three hydrogen atoms) are added to DNA. DNA methylation is long-lasting and keeps chromatin—the DNA-protein complex that makes up the chromosomes containing the genes—in a highly folded structure that blocks access to select genes by the gene expression machinery, effectively shutting the genes down. The long-term consequences are chronic inflammation, diabetes, heart disease, obesity, schizophrenia, and major depressive disorder.

Such epigenetic effects have been demonstrated in experiments with laboratory animals. In a typical experiment, rat or mouse pups are subjected to early-life stress, such as repeated maternal separation. Their behavior as adults is then examined for evidence of depression, and their genomes are analyzed for epigenetic modifications. Likewise, pregnant rats or mice can be exposed to stress or nutritional deprivation, and their offspring examined for behavioral and epigenetic consequences.

Experiments like these have shown that even animals not directly exposed to traumatic circumstances—those still in the womb when their parents were put under stress—can have blocked receptor genes. It is probably the transmission of glucocorticoids from mother to fetus via the placenta that alters the fetus in this way. In humans, prenatal stress affects each stage of the child’s maturation: for the fetus, a greater risk of preterm delivery, decreased birth weight, and miscarriage; in infancy, problems of temperament, attention, and mental development; in childhood, hyperactivity and emotional problems; and in adulthood, illnesses such as schizophrenia and depression.

What is the significance of these findings? Until the mid-1970s, no one suspected that the way in which the DNA was “read” could be altered by environmental factors, or that the nervous systems of people who grew up in stress-free environments would develop differently from those of people who did not. One’s development, it was thought, was guided only by one’s genetic makeup. As a result of epigenesis, a child deprived of nourishment may continue to crave and consume large amounts of food as an adult, even when he or she is being properly nourished, leading to obesity and diabetes. A child who loses a parent or is neglected or abused may have a genetic basis for experiencing anxiety and depression and possibly schizophrenia. Formerly, it had been widely believed that Darwinian evolutionary mechanisms—variation and natural selection—were the only means for introducing such long-lasting changes in brain function, a process that took place over generations. We now know that epigenetic mechanisms can do so as well, within the lifetime of a single person.

It is by now well established that people who suffer trauma directly during childhood or who experience their mother’s trauma indirectly as a fetus may have epigenetically based illnesses as adults. More controversial is whether epigenetic changes can be passed on from parent to child. Methyl marks are stable when DNA is not replicating, but when it replicates, the methyl marks must be introduced into the newly replicated DNA strands to be preserved in the new cells. Researchers agree that this takes place when cells of the body divide, a process called mitosis, but it is not yet fully established under which circumstances marks are preserved when cell division yields sperm and egg—a process called meiosis—or when mitotic divisions of the fertilized egg form the embryo. Transmission at these two latter steps would be necessary for epigenetic changes to be transmitted in full across generations.

The most revealing instances for studies of intergenerational transmission have been natural disasters, famines, and atrocities of war, during which large groups have undergone trauma at the same time. These studies have shown that when women are exposed to stress in the early stages of pregnancy, they give birth to children whose stress-response systems malfunction. Among the most widely studied of such traumatic events is the Dutch Hunger Winter. In 1944 the Germans prevented any food from entering the parts of Holland that were still occupied. The Dutch resorted to eating tulip bulbs to overcome their stomach pains. Women who were pregnant during this period, Carey notes, gave birth to a higher proportion of obese and schizophrenic children than one would normally expect. These children also exhibited epigenetic changes not observed in similar children, such as siblings, who had not experienced famine at the prenatal stage.

During the Great Chinese Famine (1958–1961), millions of people died, and children born to young women who experienced the famine were more likely to become schizophrenic, to have impaired cognitive function, and to suffer from diabetes and hypertension as adults. Similar studies of the 1932–1933 Ukrainian famine, in which many millions died, revealed an elevated risk of type II diabetes in people who were in the prenatal stage of development at the time. Although prenatal and early-childhood stress both induce epigenetic effects and adult illnesses, it is not known if the mechanism is the same in both cases.

Whether epigenetic effects of stress can be transmitted over generations needs more research, both in humans and in laboratory animals. But recent comprehensive studies by several groups using advanced genetic techniques have indicated that epigenetic modifications are not restricted to the glucocorticoid receptor gene. They are much more extensive than had been realized, and their consequences for our development, health, and behavior may also be great.

It is as though nature employs epigenesis to make long-lasting adjustments to an individual’s genetic program to suit his or her personal circumstances, much as in Lamarck’s notion of “striving for perfection.” In this view, the ill health arising from famine or other forms of chronic, extreme stress would constitute an epigenetic miscalculation on the part of the nervous system. Because the brain prepares us for adult adversity that matches the level of stress we suffer in early life, psychological disease and ill health persist even when we move to an environment with a lower stress level.

Once we recognize that there is an epigenetic basis for diseases caused by famine, economic deprivation, war-related trauma, and other forms of stress, it might be possible to treat some of them by reversing those epigenetic changes. “When we understand that the source of so many of our society’s problems is exposure to childhood adversity,” Harris writes,

the solutions are as simple as reducing the dose of adversity for kids and enhancing the ability of caregivers to be buffers. From there, we keep working our way up, translating that understanding into the creation of things like more effective educational curricula and the development of blood tests that identify biomarkers for toxic stress—things that will lead to a wide range of solutions and innovations, reducing harm bit by bit, and then leap by leap.

Epigenetics has also made clear that the stress caused by war, prejudice, poverty, and other forms of childhood adversity may have consequences both for the persons affected and for their future—unborn—children, not only for social and economic reasons but also for biological ones.

  1. See our essay “Evolving Evolution” in these pages, May 11, 2006.

  2. The Epigenetic Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease, and Inheritance (Columbia University Press, 2012).

  3. The Deepest Well: Healing the Long-Term Effects of Childhood Adversity (Houghton Mifflin Harcourt, 2018).

Malaysia and the Improbable Win of an Unlikely Alliance

Lai Seng Sin/Reuters A video clip of the then jailed opposition leader Anwar Ibrahim playing in the background at an anticorruption rally with Anwar’s wife, Wan Azizah, and Anwar’s one-time nemesis but now political ally, Malaysia’s former prime minister, Dr. Mahathir Mohamad, in Petaling Jaya, Malaysia, October 14, 2017

The flag of Malaysia’s Parti Keadilan Rakyat, or People’s Justice Party (PKR), is turquoise-blue with red stripes at both ends. At its center is a stylized white “O.” It symbolizes the black eye of Anwar Ibrahim, Malaysia’s former deputy prime minister, who was a rising political star in the 1990s until he criticized the ruling National Front, a right-wing coalition led by Dr. Mahathir Mohamad, and was shipped off to jail for alleged sodomy. In September 1998, before a show trial, Anwar was beaten up by a police chief. Thereafter, a photo of Anwar’s bruised face became a symbol of opposition to the National Front, which had, in one form or another, been in power since Malaysia achieved full independence in the early 1960s.

Mahathir claimed at the time that Anwar’s black eye was “self-inflicted,” caused by his “pressing a glass over his eyes.” Anwar went to jail for six years, until 2004. Then, incredibly, in 2015 he was jailed again—again for alleged sodomy. Anwar has openly criticized many establishment figures and has long been viewed by those in power as a volatile and threatening figure. He was released from prison, upon receiving a special royal pardon, only this week.

Mahathir had ruled Malaysia with an iron fist, crushing dissenters like Anwar, from 1981 to 2003. In 2018, at the age of ninety-two, he was back, campaigning to become prime minister once more—under the PKR’s turquoise-blue flag. His running mate was Anwar Ibrahim’s wife, Wan Azizah Wan Ismail, who founded the PKR after her husband went to jail. Mahathir’s campaign promise was to obtain a pardon for Anwar if the coalition won and, eventually, to step down himself and hand over power to his former deputy. This unlikely-seeming team of former rivals buried their differences in the single-minded hope of ousting the spectacularly corrupt administration of Malaysia’s most recent prime minister, Najib Razak—and they did so knowing that they would need all the star power they could get.

Their convoluted alliance was at the heart of Malaysia’s historic general election last week. For the first time in the country’s post-independence history, an opposition coalition succeeded in unseating the National Front. Mahathir led the Pakatan Harapan, or the “Alliance of Hope,” against his own former party. Uniting this alliance was its animus against Najib, a genteel, foreign-educated former protégé of Mahathir’s, who has reportedly stolen almost $700 million from a state fund named 1Malaysia Development Berhad. That scandal, generally known by the initials “1MDB,” along with Najib’s introduction of an unpopular goods and services tax that aimed to simplify tariffs, also angered millions of voters.

Mahathir has referred to his part in supporting Najib’s rise as “the biggest mistake that I have made in my life,” and in 2016, he quit Najib’s party, the United Malays National Organization (UMNO), one of the main components of the National Front coalition. Mahathir’s show of remorse, backed by his efforts to make reparation, is something rarely seen in Malaysian public life and high office.

Even so, the opposition’s victory surprised everyone. Postmortems were written for Malaysia’s fourteenth general election months before it happened. The country was so gerrymandered, people assumed, and the National Front’s hold on government so strong, that Najib would easily win re-election, despite his unpopularity and scandals. Instead, turnout was high, reflecting popular discontent: over 12 million Malaysians, or 82 percent of those eligible, cast votes. The Alliance won 121 seats (of 222), giving it a decisive majority in parliament.


If Najib had won, the increasingly illiberal trend in Malaysian society toward a strongly conservative form of Islam would undoubtedly have continued. In recent decades, Malaysia has received extensive religious investments such as scholarships, mosques, preachers, universities, schools, and textbooks from Saudi Arabia, which has systematically propagated its brand of puritanical Salafi Islam across the Muslim world, as a matter of basic foreign policy. Under Najib, Salafi preachers and ideas gained mainstream platforms; and many members of the Salafi-identified Association of Malaysian Scholars have joined the UMNO outright.

“Najib needed Islamic legitimacy in order to boost his beleaguered image,” Ahmad Farouk Musa, a liberal Islamic scholar based in Kuala Lumpur, told me. “He got this from Malaysia’s Salafi network, many of whose figures have joined UMNO, increasingly defining its religious stances.” Meanwhile, Saudi leaders have also cultivated close personal ties with Najib; some are implicated in the 1MDB scandal, though they claim that a mysterious $681 million deposit to Najib’s personal account last year was a “genuine donation.”

Najib’s election defeat, however, does not mean that the trend of Islamicization in Malaysian society will be checked, since it predates his time in office. Malaysia has long had, for instance, parallel legal systems, with civil law for all citizens and Islamic law for the Muslims (and sharia courts in every state). Many prominent figures have railed against the Saudi investments and the “Arabization” of Malaysia’s religious traditions, including the Sultan of Johor (the constitutional monarch of that state) and Marina Mahathir, Dr. Mahathir’s eldest daughter and a prominent social activist, who has called it “Arab colonialism.” According to Mohamed Nawab Mohamed Osman, a Singapore-based security scholar, Malaysia may be the “weakest link” in Southeast Asia’s resistance to Islamist radicalization “because of the mainstreaming of puritan ideas.”

Lily Rahim, an associate professor at the University of Sydney, agrees that the popular uptake of Salafi ideas has “certainly increased” in the last decade, “encouraged by the UMNO-led government, opposition party PAS, conservative Islamic NGOs, state ulama [clerics] and the Islamic bureaucracy.” The UMNO may have suffered a heavy setback in this year’s election, but the country’s main Islamist party, the Malaysian Islamic Party (PAS), increased its share of the popular vote from 15 percent to 17 percent. (The president of the PAS, Haji Hadi Awang, is himself a Saudi university alumnus.) The PAS quit the Alliance of Hope last year because it wanted to impose hudud, or strict Islamic law, in Kelantan, a rural state that today stages public floggings of criminals.

“The natural alliance now is between UMNO and PAS [against the Hope Alliance],” Tom Pepinsky, a political scientist and Southeast Asia specialist at Cornell University, told me. This would unite the most vocally Malay party and the most vocally Muslim one. “That’s a large majority, essentially all Malay Muslim… and we know that conversations between their leadership have been happening for a while now.” Pepinsky said they would be a force to be reckoned with at the next election, which must take place by May 2023.


Malaysia has entered a period of uncertainty. No one seemed able to predict what would happen once Najib’s re-election bid was rejected. After all, Malaysia had never before seen a democratic transfer of power.

Thousands gathered outside the State Palace on May 10, a day after the election, waving and wearing PKR flags. Most had had a sleepless night after the polls closed, tracking the results on their screens and watching Mahathir claim victory, but with no official word all night from either Najib’s camp or the Election Commission. Najib finally conceded at around noon, enabling Mahathir to be sworn in by King Muhammad V, the country’s constitutional monarch. Mahathir was driven through the palace gates around 4:30 PM, in formal Malay traditional dress of a stiff black cap and a gold-threaded sarong, for the swearing-in, which was scheduled for 5 PM.

A cloudless afternoon turned to dusk, and then to night. Mahathir had still not been sworn in. Some young Malaysians camped out on a hill overlooking the palace while others stocked up on buttery roti canai flatbreads from a nearby café. No one went home. Mahathir had made Thursday a national holiday, and Friday, too. Marina Mahathir unexpectedly joined the crowd at around 8:30 PM. She had declared on Twitter that she was too exhausted to join her father in person, but when the delay became apparent, she turned her car around from the airport—she was on her way to a speaking engagement in Bangladesh—and came to the palace after all.

“Well, he doesn’t tell us anything,” she told me, talking about her father’s sudden decision to return to political life. “We had a feeling he would run, because a lot of gears were turning these last few months, but he is very much his own man.” She sat on the sidewalk, wearing a loose black vest, slacks, and ballet flats. “Given how long this is going on, I realized I just couldn’t leave the country tonight.”

A minor celebrity in her own right, she was greeted by well-wishers and people asking for photos with her on the lawn of the palace grounds. “Do you think they are discussing something inside?” asked one elderly man, whom she had greeted warmly.

“Discuss what, lah!?” she said, using the common Malay interjection to signal her distaste. The Malay royals are known to dislike Mahathir, and no one expected him to receive an especially warm welcome. Never had an election victory turned out to be so suspenseful. The entire day, with its endless delays, was a puzzle. No one was quite sure of the result, nor could they go to work, so most people just recharged their smartphones, waiting for word in a daze.

Finally, a few minutes after 9:30 PM, Mahathir’s meeting with the king was broadcast.

“It’s on Amani!” Marina told the crowd, and everyone tuned in on their phones to the network that was live-streaming the ceremony. We watched her father meet an incredibly bored-looking king, who slouched through the proceedings, a cell phone visibly bulging in his shirt pocket.

So far, though, Mahathir’s elaborate election strategy has proceeded according to plan. He is firmly in power, and arranged for Anwar to be pardoned right before Ramadan. He also prevented Najib Razak from leaving the country, purportedly so that he can face trial for his alleged corruption.

This is all promising, but there are still several more steps before Anwar can realize Mahathir’s promise of power. To start with, Anwar would need to win a parliamentary seat of his own; the most likely scenario is that his wife will resign from hers, triggering an election in which he will stand in her stead. Mahathir, meanwhile, has given himself a leisurely timeline of two years before he plans to cede his position, by which time he will be ninety-four (he is already the world’s oldest state leader).

Less encouraging is that the stunning electoral upset has done little so far to reverse the erosion of civil liberties in Malaysia—a process that began well before Najib’s administration. During his last term, Mahathir himself behaved like an autocrat, stamping out challenges from opponents through political maneuvering, sacking judges, crushing freedom of assembly, and jailing critics. He also handed out government contracts to cronies and issued antisemitic diatribes. (Although there are hardly any Jews in Malaysia, Mahathir has made ready use of demagogic conspiracy-theory tropes, claiming that Jews “rule the world by proxy.”) This week, in a blow to press freedom, Mahathir announced that he will not repeal Malaysia’s widely criticized “anti-fake news law,” enacted by Najib ahead of the election.

Mahathir has also always aggressively promoted affirmative action for ethnic Malays, who make up more than 50 percent of the population. Back in 1970, he wrote a controversial book, The Malay Dilemma, arguing that the Malay race was naturally nonconfrontational and lazy, and that these traits had led to the Malays’ subjugation by British colonists and then Chinese businessmen; this racial disadvantage, he proposed, needed the redress of affirmative action in order to preserve Malays’ status as bumiputera, sons of the soil.

Mahathir’s pro-Malay policies have been criticized in the past for causing economic stagnation, encouraging discrimination, and spurring the flight of talented ethnic Chinese and Indian Malaysians. Although Mahathir abandoned the UMNO, he transferred his allegiance to the Malaysian United Indigenous Party, which has an almost indistinguishable focus on ethnic Malays. It seems unlikely that he will abandon Malaysia’s race-based policies. Anwar, however, has said in an interview since his release that he hopes eventually to reform the race-based affirmative action policies in favor of a “more transparent” system based on merit and socioeconomic class.

Many commentators believe that if civil liberties improve at all under the new administration, it won’t be because of Mahathir but thanks to the people around him. “I’m under no illusions that Mahathir is a liberal democrat, but I do think that he allied himself with opposition parties founded on the premise that civil liberties deserve respect under Malaysian law,” Pepinsky told me. “If he doesn’t uphold at least some of them, he will have a tough time serving his coalition.”

The mood in Malaysia remains optimistic, even as the mundane logistics of power transfer and administration-building take the place of initial jubilation. The election result was an extraordinary proof that Malaysian democracy is not simply theoretical.

Outside the State Palace on May 10, I shared my phone screen with Dennis Ignatius, a former ambassador to Chile and Argentina who is now a newspaper columnist. He wears round glasses and speaks in the clipped tone of someone who spent his early years under British colonial rule. “For a long time, it felt like this whole region has gone dark,” he said. “Cambodia, Myanmar, the Philippines, Vietnam.” Today, Southeast Asia’s political establishment is packed with authoritarians. “I still can’t believe we managed to change something here.” 

For years, he has been railing against the National Front, corruption, and Saudi investments—albeit with necessarily delicate phrasing—in his columns. Like most other Alliance voters, he never imagined his vote would actually propel the Alliance to victory. “I am sixty-seven years old,” he said. “This morning, for the first time ever, I woke up as a man without a mission.”

As for Anwar Ibrahim, he was released from a prison hospital to an ecstatic crowd of admirers on May 16, a day before the first fast of the holy month of Ramadan. He emerged in a natty black suit, and spontaneously canceled the expected news conference because of the boisterous throng of supporters that awaited him.

“I have forgiven him,” he said later, when asked about Najib. “But the issue of injustice toward the people, crimes committed against the people, endemic corruption that has become a culture in this country, that he has to answer for.”

Anwar is seventy years old and has spent the last twenty of them in and out of jail. He has promised to run for parliament within Mahathir’s two-year interregnum, but not immediately. First, he said, he would need some “time and space.”


The New Europeans

La Fracture [The Fracture]

Al-Britannia, My Country: A Journey Through Muslim Britain

Europe’s Angry Muslims: The Revolt of the Second Generation

Dan Kitwood/Getty Images. Visitors at an amusement park in London during an Eid celebration, July 2014

One of the consequences of the defeat of ISIS in Iraq and Syria is that many of the estimated five to six thousand Europeans who had gone to Mesopotamia to fight for or live under its so-called caliphate are now coming home. Depending on the policies of their respective countries, they may be jailed, closely watched, or placed in rehabilitation programs (France is the toughest, Denmark among the most understanding). No one knows how much of a threat these returnees pose. They may be repentant and ready to be lawful citizens, or they may be planning acts of revenge. Regardless of the setbacks suffered by ISIS on the battlefield, the list of atrocities committed by Muslim jihadis on European soil continues to grow. From a series of bomb, vehicle, and knife assaults in Britain in early 2017—the worst of which, claiming twenty-two lives, took place during a concert at the Manchester Arena—to more recent ones in Spain and France, it is clear that the appetite of a tiny number of Muslims for killing infidels is undimmed.

Although terrorist attacks in Europe continue to attract much attention, they don’t dominate the news as much as they did when they were a horrendous novelty back in 2014 and 2015. That terrorists can create localized but not widespread panic has been proved time and again; Sir Jeremy Greenstock, the former head of the United Nations’ counterterrorism committee, has aptly described Islamist terrorism as “a lethal nuisance.”

And yet this nuisance has made some progress in achieving what the French social scientist Gilles Kepel, one of Europe’s foremost authorities on Islamist militancy, maintains was the goal behind bringing the jihad home: creating an unbridgeable “fracture” between Europe’s Muslims and non-Muslims. According to Kepel, after engaging in holy war against the Soviets in Afghanistan, the jihadis moved on to the “near abroad” (Bosnia, Algeria, and Egypt), before homing in on the West itself, first—and in most dramatic fashion—the US, and since 2012 Western Europe, in a campaign of attacks by small, often “home-grown,” cells and individuals.1 That longed-for fracture gave Kepel the title of his latest addition to the vast literature on the subject, La Fracture, published a few months after its author was included on a hit list of seven French public figures singled out for execution by the jihadi assassin Larossi Abballa. (Abballa was killed by police in June 2016 after murdering a police officer and his wife.) “If you want to kill me, kill me,” Kepel taunted the jihadis during an interview in April 2017 with The New York Times.

In contrast to attacks committed by non-Muslims such as Stephen Paddock, who massacred fifty-eight people at a concert in Las Vegas on October 1, 2017, jihadi attacks have repercussions on the communities and traditions that are believed to have encouraged them. Each atrocity increases by a fearsome multiple the distrust, surveillance, and interference to which Muslims in the West are subject. In the month following the Arena bombing, the Manchester police logged 224 anti-Muslim incidents, compared to thirty-seven in the same period a year earlier. On June 19, 2017, when a white Briton, Darren Osborne, plowed his van into a group of Muslims in North London, killing one, many Britons, including ones I spoke to, felt that the Muslims had had it coming.2 This is what Kepel means by fracture: jihadism engenders a reaction “against all Muslims,” while populist politicians “point the finger at immigrants or ‘Islam.’”

Many Europeans of Muslim heritage, of course, are contributing to the life of their adoptive nations, very often while retaining elements of their faith and culture.3 But with each jihadi attack, the oft-heard formula that Islam is a religion of peace that has been perverted by an isolated few loses currency.

This was brought home to me a few days after the North London attack as I listened to an exchange on a popular radio show between the host, Nick Ferrari, and Khola Hasan, a prominent female member of one of Britain’s sharia councils. (These bodies spend much of their time dissolving unhappy marriages contracted under Islamic law and have been criticized for arrogating the duties of the state.) It began with Hasan dismissing ISIS as a “death cult that is pretending to have grown from within the Muslim faith.” So opportunistic was the jihadists’ attachment to Islam, she went on, that the mass murderers who rampaged about shouting “Allah” could just as easily be yelling “Buddha” or “Jesus.” “Remind me,” Ferrari interjected glacially, “of the last time a group of Christians…blew up children coming out of a pop concert, because I must have been off that day,” and he went on to say that “something about the faith” had caused the current problems. “That’s the whole point!” Hasan replied, in obvious distress. “It’s not about the faith,” she said, before the telephone line mysteriously—and rather symbolically—went dead.

According to an opinion poll commissioned by Le Figaro in April 2016, 63 percent of French people believe that Islam enjoys too much “influence and visibility” in France, up from 55 percent in 2010, while 47 percent regard the presence of a Muslim community as “a threat,” up from 43 percent. A poll conducted in Britain around the same time found that 43 percent of Britons believe that Islam is “a negative force in the UK.” Many British Muslims, I was told by a Muslim community activist in Leeds, spent the hours after the Las Vegas massacre “praying that the perpetrator wasn’t a Muslim,” for had he been, it would have led to furious responses online, in addition to the usual round of ripped-off hijabs and expletives in the street, if not actual physical threats.

The integration of Muslims became a political issue in Europe in the 1980s. In Britain, Muslim activists began to split from the black community, and tensions developed between the two. In France, years of neglect by the government, combined with the popularization of Salafist ideas through the Afghan jihad, undermined the complacent old assumption that the country’s North African immigrants were inevitably socialists and secularists. French cultural control, or dirigisme, and British multiculturalism—their respective approaches to the integration of immigrants—are logical extensions of their contrasting versions of empire (France’s civilizing mission versus British laissez-faire) and informed in Britain’s case by the principle of diversity inherent in a conglomerate United Kingdom. (Germany has adopted an uneasy mixture of the two.)

When the French banlieues erupted in rioting in 2005, and again when jihadist terror struck France in January 2015, many Britons attributed France’s troubles to its unwise policy of forcing North African immigrants to adopt all aspects of French culture, including a reverence for the French language and a secular and republican ideology. The British, by contrast, allowed different communities to maintain their diverse characteristics while seducing them with symbols such as Parliament and the crown. Even the London bombings of 2005, in which fifty-two people were killed, didn’t do lasting damage to this optimistic approach, in part because the eight-year gap before the next jihadist attack encouraged Britons to think of it as an aberration.

Back in 1975, the authors of a UK government document on education argued that “no child should be expected to cast off the language and culture of the home as he crosses the school threshold.” Here was the essence of multiculturalism, under which the state saw no advantage in weakening the ties of community and tradition that bound together citizens of foreign origin. On the contrary, they should be encouraged—as the Greater London Council put it—to “express their own identities, explore their own histories, formulate their own values, pursue their own lifestyles.”

Education and public housing were two areas in which multiculturalism made substantial inroads. In many cases no efforts were made to dilute a homogeneous immigrant community, which led to monocultural areas such as—in James Fergusson’s phrase in Al-Britannia, My Country, his highly sympathetic new survey of British Muslims—“the Muslim marches of east Birmingham,” while schools were encouraged to acknowledge the histories and cultures of “back home.” As a result of Britain’s multiculturalist program, under which communities were encouraged to represent themselves, somewhat in the manner of the British Raj, Muslim-majority areas gained Muslim councilors, mayors, and members of Parliament in much higher numbers than similar areas did in France.4

Bradford, in West Yorkshire, is one of several former manufacturing towns in England that have given shape to multiculturalism in its most controversial form. Not all of Britain’s Muslims live in enclosed communities, by any means; the Muslims of East London and Leicester are substantially intermixed with other arrivals of various religions. But parts of Bradford—in the tight little streets around Attock Park, for instance, where the women dress in Pakistani jilbab gowns—are defiantly monocultural. Bradford’s Muslim community of at least 130,000 is dominated by families from a single district of Pakistan-administered Kashmir, many of whom migrated in the 1960s. The skyline is punctuated by the minarets of 125 mosques, and the city’s systems of education, philanthropy, and commerce (not to mention marriage and divorce) have been formed by the attitudes of a close-knit, extremely conservative community. Poverty and drug-related crime have afflicted the town, but I was told repeatedly during recent visits that if it weren’t for their cohesion, ethos of self-help, and faith in an exacting but compassionate God, the Muslims of Bradford would be much worse off.

In recent years, England’s encouragement of multiculturalism has weakened in response to terrorist attacks and a rapid increase in the Muslim population, which has doubled since 2000 to more than three million people. By 2020 half the population of Bradford—which, besides being one of the country’s most Muslim cities, has one of its highest birth rates—will be under twenty years old. Responding to this demographic shift and the fear of terrorism, Britain under David Cameron and, more recently, Theresa May has given the policy of multiculturalism a very public burial, a shift that seems entirely in tune with the defensive impulses that led a small majority of voters to opt for Brexit. (I was told in Bradford that many Muslim inhabitants of the city also voted for Brexit, to indicate their displeasure at the recent arrival of Polish and Roma immigrants.) Typical of the panicky abandonment of a venerable article of faith was May’s reaction to the terrorist attack on London Bridge in early June 2017, after which she demanded that people live “not in a series of separated, segregated communities, but as one truly United Kingdom.” A central element of the government’s anti-extremism policy is the promotion of “British” values such as democracy, the rule of law, individual liberty, and tolerance.

One of the observations made in the 1975 government report on education was that the mothers of some pupils in British schools might be living “in purdah, or speak no English.” The report’s tone was neutral—nowhere did it suggest that this represented a threat to the civic state. In today’s environment, however, this statement might seem to provide evidence of communities spurning the British way of life—and along with it the values of emancipation and individual fulfillment that official British culture holds dear. The ghetto, runs this line of thinking, is the first stop on a journey to extremism and terrorism.

That such a direct connection exists between Muslim communities and extremist organizations is of course widely contested. In fact, many Islamist terrorists have left suffocating communities in order to find a new “family” among globalists whose aim, in the words of another French scholar of jihadis, Olivier Roy, is “a new sort of Muslim, one who is completely detached from ethnic, national, tribal, and family bonds.”5 It is most often this new “family,” and not a Muslim-dominated hometown, that inculcates jihadist ideology. The very Muslim societies and schools that Britons criticize for their illiberal attitudes may, because they encourage family and community cohesion, be a more effective obstacle to revolutionary jihadism than any amount of government propaganda. But Britons from the liberal left and the patriotic right both view conservative Muslim communities as patriarchal, chauvinistic, and homophobic—in many cases with good reason. And the old policy of deferring to more “moderate” Islamic groups has been undermined by the perception that they are stooges, or not very moderate.

The end of multiculturalism and its replacement with heightened surveillance and the emphasis on national cultural values are dealt with in detail in Fergusson’s Al-Britannia. Under the government’s anti-radicalization program, called Prevent, teachers, hospital staff, and other public sector workers must report anyone they regard as actual or potential extremists. Everyone agrees on the necessity of identifying potential radicals, whether Islamist or neofascist, but under Prevent many Muslims have been unjustly labeled, bringing shame that lasts long beyond the “voluntary” counseling that follows referral. And the gaffes committed in the name of Prevent are legion. The room of an Oxford student (a Sikh, as it happens) was searched after she was overheard praying, and a fourteen-year-old schoolboy was questioned by the authorities after he used the word “ecoterrorism.”

Britain’s newfound hostility toward illiberal Muslim enclaves was brought to light by a national scandal in 2014, when details were revealed of an alleged plot to Islamize schools in Birmingham, the country’s second-largest city. Over the previous decade there had been a concerted effort to bring more Muslim staff into schools in Muslim-majority parts of Birmingham. Regular prayer and the Ramadan fast were assiduously promoted by school authorities, and some segregation between the sexes was introduced. On occasion teachers expressed rebarbative views on homosexuality and women’s rights. But in reinforcing these Islamist values—and downgrading the “British” values they were supposed to champion—the educators at these schools also created the conditions for a rise in academic and disciplinary standards, with a concomitant effect on exam results and employment opportunities.

After the scandal broke, teachers were suspended at three Birmingham schools, which had an adverse effect on morale and exam results. Just one disciplinary charge was upheld by the subsequent tribunal, and allegations of a plot have been discredited. But the damage to the reputation of the schools and their pupils has been considerable.

For Fergusson, the fracture in today’s Britain isn’t so much between Muslims and non-Muslims as among Muslims: the young are pushing against their parents, and the combination of sexual temptation and stalled incomes (marriage is a luxury many cannot afford) creates frustration that can slide into nihilism. Fergusson has been criticized for being too sympathetic to some Islamists who have been painted as menaces by the media, such as Asim Qureshi, a senior member of the Islamic advocacy group Cage, who in 2015 described Mohammed Emwazi, the British ISIS militant known as “Jihadi John,” as a “beautiful young man.” But Fergusson’s belief that British Muslims should be valued because of their faith, not in spite of it, is a major improvement on the self-interested toleration that has often passed for an enlightened position on the Muslim question.

The late American political scientist Robert S. Leiken, for example, whose book Europe’s Angry Muslims was recently reissued with a preface on the rise of ISIS, argued that one reason Westerners must combat anti-Muslim discrimination is that “such bigotry robs us of allies, including Muslims in the West, just when we have an imperative need for informants.” It is hard to think of a statement against intolerance more hostile to the notion that a Muslim might actually be “one of us.”

While Fergusson believes that recognizing the faith and values of Muslim communities is essential to society’s cohesion, Kepel regards it as a weapon that has been placed in the hands of Islamists by liberal bleeding hearts. Kepel’s journey is an interesting one. In 2004 he was a member of a commission that recommended—and secured—the prohibition of ostentatious religious symbols in French schools (the so-called “headscarf ban”). But he has also spent much of his career as a high-flying academic investigating the combination of poverty, cultural entrapment, and Salafist ideas that has created a sense of alienation in today’s Muslim generation. His previous book, Terror in France, was a meticulous modern history of Muslim France through various causes célèbres, from the publication of cartoons lampooning the Prophet Muhammad in 2005, to Marine Le Pen’s complaint in 2010 that sections of society were under “occupation” (for which she was acquitted on hate crime charges), to the first jihadist attacks on French soil in 2012.

Especially interesting in light of the Muslim generational divide that Fergusson concentrates on is Kepel’s description of an abortive attempt by the Union des organisations islamiques de France to impose order on the riot-hit banlieues in 2005. This body of religious and lay leaders, considered by many to be the most powerful in France, had lost prestige because of its inability to prevent the passing of the headscarf ban, and was dominated by aging Muslim Brothers out of touch with the young who were out trashing cars. The fatwa issued by the union’s theologians declaring vandalism to be haram, or illicit, had the opposite effect of the one intended. The following day more than 1,400 vehicles were destroyed and thirty-five policemen were wounded.

Kepel’s La Fracture is a collection of radio essays and commentaries on events in France and the Islamic world. Its most interesting part is the long epilogue, in which he displays his distrust, common among establishment intellectuals, of communal politics, which are held to be at odds with French republicanism’s emphasis on equal opportunity. The trouble, as Kepel sees it, lies not simply in the violence the jihadis perpetrate, but in the long game of their fellow travelers who aim to foment the fracture while working just within the framework of the law. In a secular republic like France, he writes, “a religious community cannot be represented as such in the political sphere”—but this is exactly the principle that a new generation of Muslim activists is trying to subvert.

Kepel singles out Marwan Muhammad, the energetic and well-educated young leader of a pressure group called the Collective against Islamophobia in France, as perhaps the most dangerous of these fellow travelers. Over the summer of 2016, following a jihadi attack in Nice that killed eighty-six people, several seaside towns issued a ban on burkinis. Muhammad was prominent in orchestrating the subsequent protests against what he called a “hysterical and political Islamophobia,” which gave France much unflattering publicity and ended with a judge declaring the bans illegal. Kepel regards the storm around burkinis—along with another outcry around the same time over the mistreatment of two Muslim women in a restaurant—as populist agitations masquerading as human rights campaigns. He accuses such movements of calling out instances of “Islamophobia” (the quotation marks are his) with the express purpose of making Muslims feel like victims.

There is no room for naiveté in this discussion. Plenty of Westerners were told by Ayatollah Khomeini while he was in his French exile that he had no desire to introduce Islamic rule in Iran. Within a year of his return to Iran in February 1979 the country was an Islamic republic. It is possible that Marwan Muhammad and others dream of a Muslim France. I don’t know if this is the case and neither does Kepel—though he has his suspicions. Certainly, there is much in the doctrinaire French enforcement of laïcisme that reminds me of the despairing measures of Turkey’s secular republic before it was finally overwhelmed by the new Islamists led by Recep Tayyip Erdoğan. But France, despite its large and fast-growing Muslim population—around 8.4 million, or one-eighth of the total population—isn’t about to fall to the new Islamists. And whatever the ultimate goal of activists like Muhammad, their entry into the political mainstream and adept advocacy for increased rights bode well for the goal of integrating Muslims into European institutions.

President Emmanuel Macron has made friendly overtures to France’s Muslims, and during his campaign last year he acknowledged that terrible crimes were committed by the French in Algeria. On November 1 anti-terrorism legislation came into force that transferred some of the most repressive provisions of France’s state of emergency—which ended on the same day—into ordinary law. Prefects will continue to be allowed to restrict the movement of terror suspects and shut down places of worship without a court order, even if raids on people’s homes—a particularly controversial feature of the state of emergency—are now possible only with the permission of a judge. To be Muslim will be to remain under suspicion, to be belittled, profiled, and worse. As in Britain, the short-term imperative of keeping people safe is proving hard to reconcile with the ideal of building a harmonious society.

Europe has become more anti-Muslim as it has become more Muslim. Though it is hard to find many cultural affinities between the Pakistanis of Bradford, the Algerians of Marseille, and the Turks of Berlin, Islam remains the main determinant of identity for millions of people. That this is the case in hitherto multicultural Britain and laïque France suggests that, for all the differences between the two countries’ systems and the relative tolerance of the British one, neither has been able to solve the problem of Muslim integration. As long as this remains the case, and as long as the Muslim population continues to increase so quickly, Islam will continue to cause apprehension among very large numbers of Europeans. They have made their feelings clear by supporting anti-immigration candidates in election after election across the continent, stimulated in part by Angela Merkel’s profoundly unwise decision in 2015 to admit more than one million refugees to Germany.

The panic is caused by rising numbers, sharpened by fears over terrorism. Governments can act to allay both of these things, and they are acting. But they must also recognize Islam for what it is: a European religion. Seldom does mainstream Western discourse make room for the good Islam does—stabilizing communities, minimizing crime and delinquency, and providing succor for millions. Like the Christian temperance movements of the nineteenth century, Islam wards off the alcoholic disarray that descends on many towns on Friday and Saturday nights; you only need visit the emergency room of an urban hospital in “white” Britain to see the wreckage. There are many factors in Europe’s Muslim crisis, but perhaps the most fundamental is that Islam is never part of any general consideration of values in a successful modern society. Its position is at the margins of society, spoken at rather than engaged with.

  1. In support of this thesis Kepel cites an influential online screed by a Syrian-born veteran of the Afghan jihad who goes by the nom de guerre Abu Musab al-Suri. His Call for a Global Islamic Resistance (2005) prophesies a civil war in Europe that will create the conditions necessary for the triumph of the global caliphate.

  2. See my article following that attack, “Britain: When Vengeance Spreads,” NYR Daily, June 24, 2017.

  3. The mayor of London, Sadiq Khan, for instance, enjoys much popularity among liberal-minded Londoners of all backgrounds, while not hiding the fact that his faith is important to him.

  4. Although Muslims are nowadays more represented in French public life than they once were, and an estimated fifteen of them entered the National Assembly in the 2017 elections, in some of the Muslim-majority communes around Paris one still has the impression of being in a French colony. In and out of the regional administrations step men and women “of French stock,” as the somewhat agricultural phrase goes, salaried, suited, and assured of a decent pension. They are administrators of a brown population as surely as their forefathers were in Oran or Rabat.

  5. See Olivier Roy, Jihad and Death: The Global Appeal of Islamic State, translated by Cynthia Schoch (Hurst, 2017).

The Old Lady

Chris Steele-Perkins/Magnum Photos. Margaret Thatcher at the Conservative Party Conference, Winter Gardens, Blackpool, 1985

The Bank of England is one of Britain’s distinct contributions to history. It was chartered in 1694 to lend money to King William for war on France, when a company of London merchants received from Parliament the right to take deposits in coin from the public and to issue receipts or “Bank notes.” The bank financed a summer’s fighting in the Low Countries, gave the business district of London, known as the City, a currency for trade, and lowered the rate of interest for private citizens.

In the next century, the Bank of England developed for Crown and Parliament sources of credit that permitted Britain, during 115 years of intermittent warfare, to contain and then defeat France and to amass, in Bengal and Canada, the makings of an overseas empire. Through “discounts,” or unsecured lending to merchants and bankers, the bank provided the City with cash and influenced rates of lending and profit, and thus the course of trade. In the war that followed the French Revolution of 1789, it was forced to stop paying its banknotes in gold and silver; it had issued more notes than it had gold with which to back them. Across the Channel, for want of a public bank of his own, King Louis XVI of France lost his kingdom and his head.

In the nineteenth century the Bank of England became the fulcrum of a worldwide system based on gold and known as “the bill on London.” Over a succession of City crises at approximately ten-year intervals, it took on the character and functions of a modern central bank. By World War I, it was propping up an empire living beyond its means. In conjunction with the Federal Reserve Bank of New York and the German Reichsbank, the Bank of England’s longest-serving governor, Montagu Norman, sought to develop a club of central banks that would impose on the chaos of international commerce and the caprices of government a pecuniary common law. In reality, the Bank of England had to fight ever-greater runs on sterling until, in 1992, it was routed and sterling was taken out of the European Exchange Rate Mechanism, the forerunner of the euro.

Even in the postwar period, the bank had successes. In 1986 it directed a dismantling of anticompetitive practices in London’s stock and bond markets that heralded a quarter-century of British prosperity. It also invented the phrase (“Big Bang”) by which the reforms came to be known. Nationalized in 1946, the bank recovered some of its independence in 1997. After the collapse of Lehman Brothers in New York in 2008, it flooded London with money. Shorn of many of its ancient functions and traditions, its armies of scriveners long gone to their graves, the bank is now headed by a Canadian, Mark Carney. What could be more shocking to the shades of the strict Protestants who supplied the bank’s first Court of Directors and shareholders than that the 120th governor is a Roman Catholic?


Historians such as Sir John Clapham, John Fforde, Richard Sayers, and Forrest Capie have told parts of this story. David Kynaston tells it all, or at least up to 2013. A careful and thorough writer, Kynaston made his name in the 1990s with a history in four volumes of the City of London after 1815, much of it drawn from the Bank of England’s archive. He then turned to general social history in Austerity Britain, 1945–1951; Family Britain, 1951–1957; and Modernity Britain, 1957–1962. The later books brought him readers who would not be interested in discount brokers and commercial paper.

In Till Time’s Last Sand Kynaston devotes four chapters to the lives of the thousands of clerks who powered this engine of credit and the warren in Threadneedle Street where they worked, one chapter each for the eighteenth and nineteenth centuries, and two for the twentieth. He records the arrival of round-sum banknotes, telephones, women, college graduates, computers, economists, and management consultants in an overstaffed and tedious institution. His models are the novelists of London such as Dickens, Trollope, Gissing, Galsworthy, and Wells. His title comes from an elegy for Michael Godfrey, the first deputy governor, blown to bits next to King William on a visit to the siege trenches around Namur in 1695.

What there is not in this book is political economy, which may be a fault from one or more points of view, but suits its subject. As Montagu Norman is supposed to have said, the place is “a bank and not a study group.” The Bank of England never “got” economics, but sometimes in governors’ speeches and in evidence to parliamentary committees it showed an interest in economic theory (bullionist, Keynesian, or monetarist) to please or mislead the west, or political, end of London. Later on, it published a quarterly bulletin that seemed particularly designed to obfuscate. Down the long corridor of the bank’s history, such doctrines are apparitions.

The Bank of England was always less secretive than tongue-tied. “It isn’t so much that they don’t want to tell you what’s going on,” a twentieth-century banker said. “It’s more that they don’t know how to explain it: they’re like a Northumbrian farmer.” David Kynaston explains the Bank of England.

In the 1690s King William’s ministers became intrigued by a City catchphrase, “a fund of Credit.” It meant that a secure parliamentary tax over a number of years could be used to pay the annual interest on a loan to the king by the City’s merchants. Such a tax could, as we would say, be capitalized.

In April 1694 Parliament granted the king duties on ships’ cargoes and beer and spirits up to £100,000 a year, which would be sufficient (after £4,000 in management expenses) to pay an 8 percent interest on a loan of £1.2 million to make war with France. The City merchants who put up the loan were allowed to incorporate as “the Governor and Company of the Banke of England.” As Adam Smith wrote later, the king’s credit must have been bad “to borrow at so high an interest.”

The bank’s charter, which was at first for just eleven years, was prolonged again and again, always at the price of a further payment to the Crown. Parliamentary acts of 1708 and 1742 gave the Bank of England a monopoly on joint-stock (i.e., shareholder-owned) banking in London and its environs. In 1719 the finance minister of France, the Scottish-born John Law of Lauriston, devised a scheme to convert the liabilities of the bankrupt French Crown into shares of a long-term trading company. In a sort of panic, men such as the journalist and novelist Daniel Defoe warned that unless Britain followed, it would forfeit all the country had gained from twenty years of warfare. “Tyranny has the whip-hand of Liberty,” Defoe wrote.

The bank was sucked into an auction against the South Sea Company for the privilege of refunding the British national debt, only to find that South Sea, by bribing members of Parliament with free shares, had rigged the auction. In the spring and summer of 1720, shares in South Sea rose to giddy heights and then collapsed, and the bank was there to rescue the state creditors. The prime minister, Sir Robert Walpole, swept South Sea’s bribes under the parliamentary carpet.

As a consequence, the bank gained a position in the British constitution that it has never entirely lost and a satrapy in the eastern part of London, where king and Parliament must ask leave to enter. From its quarters in Grocers’ Hall, in Poultry, the bank moved in 1732 to its present address in Threadneedle Street, where it expanded to occupy a three-acre city block. Rather against its nature, the Court employed as its architect in the later part of the century a talented bricklayer’s son, John Soane, who did more than anyone to create the neoclassical style in bank architecture that conceals, behind columns and pediments, the rampage that is deposit banking.

In 1745 the bank survived a run when supporters of the former ruling house of Stuart, which had been ousted by King William in 1688, invaded England from the north. The bank paid its notes in silver shillings and sixpences. During the “Gordon Riots” of June 1780, an armed crowd sacked London’s prisons and was repulsed from the main gate of the bank by cavalry and foot guards. The incident gave rise to the Bank Picquet, a night watch of redcoats in bearskin helmets that was only stood down in 1973.

Trustees of the British Museum. ‘Political-ravishment, or The Old Lady of Threadneedle-Street in danger!’; cartoon by James Gillray, 1797

More perilous than either of those setbacks were the demands for cash the bank faced in the 1790s from the wartime prime minister, William Pitt the Younger. Inundated with short-term government bills it was expected to cash on sight, amid a business slump and an invasion scare, the Bank of England lost almost all its reserves and, early in 1797, told Pitt it could no longer pay out in gold. In March of that year, the Irish MP and playwright Richard Brinsley Sheridan created the popular image of the bank as “an elderly lady in the City, of great credit and long standing, who…had contracted too great an intimacy and connexion at the St James’s end of town.” Two months later the cartoonist James Gillray drew a lanky, freckled Pitt stretching to land a kiss on a gaunt dame dressed in banknotes, with notes in her hair for curl papers, sitting on a chest of treasure: “POLITICAL RAVISHMENT, or The Old Lady of Threadneedle-Street in danger!”

During the Restriction, the period lasting until 1821 in which Parliament lifted the requirement that the bank convert banknotes into gold, the bank printed notes of £1 and £2 for the use of the public at large. Forgery was rife, and the bank sent hundreds of men and women to the gallows or the penal colonies for the crime. For the journalist William Cobbett, who like everybody else in those days thought gold and silver coins were money in a way that banknotes were not, there was no offense in forging a forgery. “With the rope, the prison, the hulk and the transport ship,” Cobbett wrote in 1819, “this Bank has destroyed, perhaps, fifty thousand persons, including the widows and orphans of its victims.” Behind these antique British institutions, there is always the flash of a blade.


Gold was a hard master. The bank all but stopped converting banknotes again in 1825. Having been reluctant for years to discount to Jewish houses, it was saved by a shipment of French bullion from Nathan Rothschild. As the government in Westminster became better organized and the country’s banks consolidated and grew, there were complaints (in the words of a Norfolk banker) that “the pecuniary facilities of the whole realm should thus depend on the management of a small despotic Committee.”

In 1844 the Peel Banking Act (after Sir Robert Peel, the prime minister) gave the bank a monopoly over the issuing of banknotes in England but prescribed that any increase must be matched by an increase in gold reserves. Constrained in its banking business and overtaken by giant joint-stock banks forged by mergers outside London, the bank became a lender of last resort, ready to mobilize its resources and those of the City to keep isolated cases of bad banking practice from paralyzing trade. It became the arbiter of who was fit to do business in the City and who was not.

The bank came to this position not through theory but in the tumult of events, above all the failure of the discount broker Overend Gurney in 1866 and the first rescue of the blue-blooded Baring Brothers & Co in 1890. Though it could not drop Bank rate (its short-term interest rate) too far or fast without losing its gold reserves, the bank could nudge interest rates in the direction it wanted. The central bank familiar from our times took shape.

With the outbreak of war in 1914, the bank again stopped payment in gold. When £350 million in war loans failed to sell in full, the governor, Lord Cunliffe, without telling his colleagues, let alone the public, bought the unsold £113 million for the bank’s account.

Montagu Norman, “Old Pink Whiskers” (as Franklin D. Roosevelt called him), ruled the bank from 1920 until 1944 and made so many enemies in the Labour Party and in British industry that the bank’s nationalization after World War II became inevitable. Devoted to the idea of the British Empire, Norman was too high-strung and unconventional to be part of it, like a character from Rudyard Kipling (whom he adored). Under his advocacy, Britain returned to gold in 1925, but industry could not tolerate the high interest rates needed to sustain the value of sterling on the global market. There was a general strike in 1926, and in 1931 Britain left the gold standard (no doubt for all time). Those events made the reputation of Norman’s principal opponent, John Maynard Keynes, whose doctrine of stimulating demand in bad times became British orthodoxy until it perished amid soaring consumer prices and a 17 percent Bank rate in the later 1970s.

Norman was close to both Benjamin Strong of the Federal Reserve Bank of New York and Hjalmar Schacht of the German Reichsbank. He thought of central bankers as “aristocrats,” as he wrote to Strong in 1922, who would dispel the “castles in the air” built by troubled democracies. Misled by Schacht, who was himself deceived, Norman did not see that for Adolf Hitler a central bank was just another weapon of war. Parliament was outraged in 1939 when, after Hitler’s invasion of Czechoslovakia, Norman transferred to the Reichsbank £6 million in gold deposited in London by the Czechoslovak National Bank. “By and large nothing that I did,” he wrote two years before his death in 1950, “and very little that old Ben did, internationally produced any good effect.”

Norman demolished pretty much all of Soane’s bank but the curtain wall and erected in its place the present building, designed by Sir Herbert Baker. Nikolaus Pevsner, the expert on England’s buildings, called the destruction more terrible than anything wrought by the Luftwaffe in World War II: “the worst individual loss suffered by London architecture in the first half of the 20th century.” Kynaston thinks that judgment unfair.


With the clarity of hindsight, many authors have said that the postwar Bank of England, by 1946 just an arm of the UK government in Whitehall, should never have tried to maintain sterling as a currency in which foreign countries held their reserves. Siegmund Warburg, the outstanding London merchant banker of the postwar era, said that Britain was now the world’s debtor, not creditor. As a reserve currency, he much later recalled having argued, sterling was “a very expensive luxury for us to have…. The Governor of the Bank of England at the time didn’t like this statement at all.” Charles Goodhart, one of a handful of economists who trickled into Threadneedle Street, thought the bank should have sought an alliance with the Continental central banks. “The Bank (and Whitehall) exhibited devastating Euro-blindness,” he wrote.

Kynaston has no time for such brooding. Sterling remained a reserve currency, in part because, as a consequence of the war against Germany and Japan, Britain was too poor to redeem the sterling still in use abroad. What follows is a story of Pyrrhic victories and sanguinary defeats, fading British military strength, bad judgment in Whitehall, poor management of industry, and, between 1970 and 1990, a rise in consumer prices greater than in the previous three hundred years combined. In a word, found by Kynaston in the bank’s market report for Friday, November 17, 1967, when its dealers spent £1.45 billion to defend sterling’s value against the dollar and failed: “Crucifixion.”

Yet this was also the period, as Kynaston shows, when the bank encouraged London to capture the dollars that had piled up in Europe as a result of the Marshall Plan and US imports of foreign luxuries, and turn them into loans for European industry. Who needed automakers if you had eurobonds (and the Beatles)? As a market for capital, the City regained almost all the ground it had lost since 1914.

Step by step, so as not to frighten the horses, the bank began to reform. In 1969 the management consulting firm McKinsey & Co was invited in. “I will…tell them,” John Fforde, the chief cashier, said of the two McKinsey partners, “we would be pleased to see them at lunch from time to time and will add, tactfully, that we would not expect them to lunch with us every day.” Kynaston has an ear for this sort of thing.

The highlight of the book is his account of relations between the bank and Margaret Thatcher, who came to power at No. 10 Downing Street in 1979 and at once abolished the paraphernalia of exchange control that had been in place to protect sterling since 1939. Seven hundred and fifty bank employees, housed at an ugly building by St. Paul’s Cathedral, were put to doing something else.

Thatcher and her chancellors were at first devotees of the doctrines of the Chicago economist Milton Friedman, who held that restricting the growth of money (variously defined) would halt the rise in consumer prices, rather as night follows day. As Kynaston writes, “It was monetarism or bust at No. 10.” The bank obliged with an array of monetary measures, set up targets, aimed at them, and missed.

Chancellor Nigel Lawson cooled on monetarism and instead instructed the bank to maintain sterling’s exchange rate with the deutsche mark, the West German currency, so as to impose on the UK economy the discipline and order of German industry. That policy, and its successor, the Exchange Rate Mechanism of the European Union, came to grief on “Black Wednesday,” September 16, 1992, when those who bet against sterling overwhelmed the bank. At 4 PM that day, the bank’s dealing room stopped buying pounds. A US banker recalled: “Everyone sat in stunned silence for almost two seconds or three seconds. All of a sudden it erupted and sterling just free-fell. That sense of awe, that the markets could take on a central bank and actually win. I couldn’t believe it.”

While the Continental countries moved toward the euro, a chastened Britain sought its own solution. The credit of the Bank of England was not what it was, but it was still a great deal more solid than that of any particular UK government. Why not give the bank back its independent power to set interest rates free of the pressures of Parliament and prime minister? That idea, outlined by Lawson back in the 1980s and killed by Thatcher, was executed in 1997 by the Labour chancellor Gordon Brown, who established at the bank an independent Monetary Policy Committee. There followed ten years of prosperity and price stability.

At the same time, the bank lost its regulatory function. The bank had always governed the City not through rules but by small acts of disapproval, often so subtle that they were known as the “Governor’s eyebrows.” This approach became antiquated after the Big Bang brought to London foreign bankers less attuned to the facial expressions of Englishmen, and in any case the bank had failed to detect fraud and money-laundering at the Pakistani-owned Bank of Credit and Commerce International. Bank regulation was transferred to the Financial Services Authority, which did it no better. Regulation was returned to the bank in 2013.

Kynaston had access to the bank’s archives only through 1997, and his long final chapter, covering the twenty-first century, is drawn from newspaper reports, a couple dozen interviews, and the speeches of governors Eddie George (until 2003) and Mervyn King (2003–2013). He makes no great claim for it. There is nothing here of cryptocurrencies like bitcoin, Carney’s polymer banknotes, or his far-flung speeches on subjects from Scottish independence to global warming, which would have astonished his 119 predecessors.

As it turned out, the calm that followed bank independence was perilous. With its eyes fixed on consumer prices, the bank failed to act when the price of assets such as real estate started going through the roof. As King put it just before being appointed governor, “you’ll never know how much you need to raise interest rates in order to reduce asset prices.” The banks found themselves with loans secured on hopelessly overvalued collateral. In 2008, three British banks—Royal Bank of Scotland, Bank of Scotland, and Lloyds—and a couple of building societies (savings and loans) lost their capital. It was the greatest bank failure in British history.

Like its counterparts in the US, Japan, and continental Europe, the Bank of England bought securities from the surviving commercial banks in the hope that they would use the cash created to make loans and forestall a slump in business. This program, known as quantitative easing, has added nearly £500 billion to British banks’ reserves. Whether it has stimulated trade, hampered it, or had no effect at all is impossible to say. As Eddie George put it in other circumstances, and very much in the Bank of England style, “It is easy to slip into the position of the man on the train to Brighton who kept snapping his fingers out of the window to keep the elephants away. Since he saw no elephants, his technique was self-evidently effective.”

Brexit presents the Bank of England with all the challenges of the past three hundred years plus a few more. The Tories and the Scottish Nationalists detest Mark Carney, while Labour wants to move the bank from London to Birmingham, an industrial city in the English Midlands with little by way of banking, no stock exchange, and fewer places of recreation. To survive its fourth century, the bank will need all its cunning.

Araki, Erotomaniac

Private Collection. Nobuyoshi Araki: Marvelous Tales of Black Ink (Bokujū Kitan) 068, 2007

Nobuyoshi Araki and Nan Goldin are good friends. This is not surprising, since the seventy-seven-year-old Japanese photographer and the younger American are engaged in something similar: they make an art out of chronicling their lives in photographs, aiming at a raw kind of authenticity, which often involves sexual practices outside the conventional bounds of respectability. Araki spent a lot of time with the prostitutes, masseuses, and “adult” models of the Tokyo demi-monde, while Goldin made her name with pictures of her gay and trans friends in downtown Manhattan.

Of the two, Goldin exposes herself more boldly (one of her most famous images is of her own battered face after a beating by her lover). Araki did take some very personal photographs, currently displayed in an exhibition at New York City’s Museum of Sex, of his wife Yoko during their honeymoon in 1971, including pictures of them having sex, and of her death from cancer less than twenty years later. But this work is exceptional—perhaps the most moving he ever made—and the kind of self-exposure seen in Goldin’s harrowing self-portrait is largely absent from Araki’s art.

Araki is not, however, shy about his sexual predilections, recorded in thousands of pictures of naked women tied up in ropes, or spreading their legs, or writhing on hotel beds. A number of photographs, also in the current exhibition, of a long-term lover named Kaori show her in various nude poses, over which Araki has splashed paint meant to evoke jets of sperm. Araki proudly talks about having sex with most of his models. He likes breaking down the border between himself and his subjects, and sometimes will hand the camera to one of his models and become a subject himself. But whenever he appears in a photograph, he is posing, mugging, or (in one typical example in the show) holding a beer bottle between his thighs, while pretending to have an orgasm.

Juergen Teller: Araki No.1, Tokyo, 2004

If this is authenticity, it is a stylized, theatrical form of authenticity. His mode is not confessional in the way Goldin’s is. Araki is not interested in showing his most intimate feelings. He is a showman as much as a photographer. His round face, fluffy hair, odd spectacles, T-shirts, and colored suspenders, instantly recognizable in Japan, are now part of his brand, which he promotes in published diaries and endless interviews. Since Araki claims to do little else but take pictures, from the moment he gets up in the morning till he falls asleep at night, the pictures are records of his life. But that life looks as staged as many of his photographs.

Although Araki has published hundreds of books, he is not a versatile photographer. He really has only two main subjects: Tokyo, his native city, and sex. Some of his Tokyo pictures are beautiful, even poetic. A few are on display in the Museum of Sex: a moody black-and-white view of the urban landscape at twilight, a few street scenes revealing his nostalgia for the city of his childhood, which has almost entirely disappeared.

But it is his other obsession that is mainly on show in New York. It might seem a bit of a comedown for a world-famous photographer to have an exhibition at a museum that caters mostly to the prurient end of the tourist trade, especially after having had major shows at the Musée Guimet in Paris and the Tokyo Photographic Art Museum. But Araki is the least snobbish of artists. The distinctions between high and low culture, or art and commerce, do not much interest him. In Japan, he has published pictures in some quite raunchy magazines.

The only tacky aspect of an otherwise well-curated, sophisticated exhibition is the entrance: a dark corridor decorated with a kind of spider web of ropes that would be more appropriate in a seedy sex club. I could have done without the piped-in music, too.

Sex is, of course, an endlessly fascinating subject. Araki’s obsessive quest for the mysteries of erotic desire, not only expressed in his photographs of nudes, but also in lubricious close-ups of flowers and fruits, could be seen as a lust for life and defiance of death. Some of these pictures are beautifully composed and printed, and some are rough and scattershot. One wall is covered in Polaroid photos, rather like Andy Warhol’s pictures of his friends and models, except that Araki’s show a compulsive interest in female anatomy.

Genitalia, as the source of life, are literally objects of worship in many Japanese Shinto shrines. But there is something melancholy about a number of Araki’s nudes; something frozen, almost corpse-like about the women trussed up in ropes staring at the camera with expressionless faces. The waxen face of his wife Yoko in her casket comes to mind. Then there are those odd plastic models of lizards and dinosaurs that Araki likes to place on the naked bodies of women in his pictures, adding another touch of morbidity.

But to criticize Araki’s photos—naked women pissing into umbrellas at a live sex show, women with flowers stuck into their vaginas, women in schoolgirl uniforms suspended in bondage, and so on—for being pornographic, vulgar, or obscene is rather to miss the point. When the filmmaker Nagisa Oshima was prosecuted in the 1970s for obscenity after stills from his erotic masterpiece, In the Realm of the Senses, were published, his defense was: “So, what’s wrong with obscenity?”

The point of Oshima’s movie, and of Araki’s pictures, is a refusal to be constrained by rules of social respectability or good taste when it comes to sexual passion. Oshima’s aim was to see whether he could make an art film out of hardcore porn. Araki doesn’t make such high-flown claims for his work. To him, the photos are an extension of his life. Since much of life is about seeking erotic satisfaction, his pictures reflect that.

Displayed alongside Araki’s photographs at the Museum of Sex are a few examples of Shunga, the erotic woodcuts popular in Edo Period Japan (1603–1868), pictures of courtesans and their clients having sex in various ways, usually displaying huge penises penetrating capacious vulvas. These, too, have something to do with the fertility cults of Shinto, but they were also an expression of artistic rebellion. Political protest against the highly authoritarian Shoguns was far too dangerous. The alternative was to challenge social taboos. Occasionally, government officials would demonstrate their authority by cracking down on erotica. They still do. Araki has been prosecuted for obscenity at least once.

One response to Araki’s work, especially in the West, and especially in our time, is to accuse him of “objectifying” women. In the strict sense that women, often in a state of undress, striking sexual poses, are the focus of his, and our, gaze, this is true. (Lest one assume that the gaze is always male, I was interested to note that most of the viewers at the Museum of Sex during my visit were young women.) But Araki maintains that his photographs are a collaborative project. As in any consensual sado-masochistic game, this requires a great deal of trust.

Museum of Sex Collection. Nobuyoshi Araki: Colourscapes, 1991

Some years ago, I saw an Araki show in Tokyo. Susan Sontag, who was also there, expressed her shock that young women would agree to being “degraded” in Araki’s pictures. Whereupon the photographer Annie Leibovitz, who was there too, said that women were probably lining up to be photographed by him. Interviews with Araki’s models, some of which are on view at the Museum of Sex, suggest that this is true. Several women talk about feeling liberated, even loved, by the experience of sitting for Araki. One spoke of his intensity as a divine gift. It is certainly true that when Araki advertised for ordinary housewives to be photographed in his usual fashion, there was no shortage of volunteers. Several books came from these sessions. 

But not all his muses have turned out to be so contented, at least in retrospect.

Earlier this year, Kaori wrote in a blog post that she felt exploited by him over the years (separately, in 2017 a model accused Araki of inappropriate contact during a shoot that took place in 1990). Araki exhibited Kaori’s photos without telling her or giving her any credit. On one occasion, she was told to pose naked while he photographed her in front of foreign visitors. When she complained at the time, he told her, probably accurately, that the visitors had not come to see her, but to see him taking pictures. (The Museum of Sex has announced that Kaori’s statement will be incorporated into the exhibition’s wall text.)

These stories ring true. It is the way things frequently are in Japan, not only in relations between artists and models. Contracts are often shunned. Borderlines between personal favors and professional work are blurred. It is entirely possible that Araki behaved badly toward Kaori, and maybe to others, too. Artists often use their muses to excite their imaginations, and treat them shabbily once the erotic rush wears off.

One might wish that Araki had not been the self-absorbed obsessive he probably is. Very often in art and literature, it is best to separate the private person from the work. Even egotistical bastards, after all, can show tenderness in their art. But in Araki’s case, the distinction is harder to maintain—this is an artist who insists on his life being inseparable from his pictures. The life has dark sides, and his lechery might in contemporary terms be deemed inappropriate, but that is precisely what makes his art so interesting. Araki’s erotomania is what drives him: aside from the posing and self-promotion, it is the one thing that can be called absolutely authentic.

“The Incomplete Araki: Sex, Life, and Death in the Works of Nobuyoshi Araki” is at the Museum of Sex, New York, through August 31.


Swedish Apologies

In response to:

Knifed with a Smile from the April 5, 2018 issue

To the Editors:

In the April 5 issue of The New York Review, you published an article entitled “Knifed with a Smile.” I would like to make a few comments regarding that text:

  1. The text states that none of the whistleblowers recall receiving an apology from Karolinska Institutet (KI). That could very well be the case; however, it is an irrefutable fact that on December 20, 2016, KI published a public apology in one of the most widely distributed and read Op-Ed pages of all the Swedish dailies, Dagens Nyheter: “That the whistleblowers who raised the alarm about Paolo Macchiarini’s activities were not listened to is unacceptable, and here, in this article, KI publicly apologizes to the whistleblowers.” (Text in Swedish: “Att de visselblåsare som larmade om Paolo Macchiarinis verksamhet inte blev hörda är oacceptabelt och här ber KI offentligt visselblåsarna om ursäkt för det.”) This information was sent to Professor Elliott on October 31, 2017.
  2. Furthermore, the article implies that KI is actively trying to forget or hide the Paolo Macchiarini case. This is simply not correct. There is an ongoing investigation of scientific misconduct regarding publications in which Paolo Macchiarini is named as the principal investigator. A decision by Karolinska Institutet’s vice-chancellor regarding that investigation will come later this spring. During the past two years, Karolinska Institutet has also implemented new and revised routines and guidelines and reinforced supervision as a direct consequence of this case—and there are more changes yet to come.
  3. Finally, Karolinska Institutet has made a significant effort to create and maintain total transparency throughout the process and case regarding Paolo Macchiarini and his involvement with KI. This includes openly referring to and citing criticism against KI. Please take a look at our website for more details:

Peter Andréasson
Chief Press Officer
Karolinska Institutet

Carl Elliott replies:

Peter Andréasson’s letter should give no one any confidence that the Karolinska Institute has learned from the Macchiarini scandal. It is true that Karin Dahlman-Wright, the pro-vice-chancellor of the Karolinska Institute, informed me of an article in Dagens Nyheter on December 20, 2016, which included a brief statement of apology to “the whistleblowers.” However, the supposed apology seemed almost laughably inadequate. It concerned a single finding of research misconduct by Macchiarini in an article published in Nature Communications, and the misconduct had concerned tissue engineering in rats. Yet by the time the article appeared, Macchiarini had been charged with manslaughter; the vice-chancellor of the Karolinska Institute had resigned in disgrace; an external review had identified research misconduct in six published papers by Macchiarini; and at least five patients who had gotten synthetic trachea implants from Macchiarini were dead.

In the article cited, Dahlman-Wright addressed only the single instance of research misconduct, not the suffering inflicted on patients. She did not apologize for the fact that the leaders of the Karolinska Institute and the Karolinska University Hospital had protected Macchiarini. She failed to mention that institutional leaders had reported the whistleblowers to the police and threatened to fire them. In fact, she did not even do the whistleblowers the courtesy of mentioning their names. Neither does Andréasson. For the record, the whistleblowers are Matthias Corbascio, Thomas Fux, Karl-Henrik Grinnemo, and Oscar Simonson.

When I followed up by e-mail to Dahlman-Wright, I questioned the effectiveness of an apology that the whistleblowers themselves did not consider sufficient. I also pointed out that they had been threatened with dismissal and reported to the police. In response, Dahlman-Wright referred me again to the article in Dagens Nyheter and said my questions about the threats should be directed to the Karolinska University Hospital. A spokesperson for the hospital replied that it had no plans to apologize. When I put the same questions to the newly appointed vice-chancellor, Ole Petter Ottersen, he declined to answer and referred me back to the statement by Dahlman-Wright.

For years officials at the Karolinska Institute insisted that the whistleblowers were wrong and that we should believe the administration instead. Many did, and the results were disastrous. Now Karolinska Institute officials are again insisting that the whistleblowers are wrong and that true reform is underway. Who should we believe this time?
