Sunday, January 08, 2017


The color is still off, but not as much

Thursday, January 05, 2017

"Dungeons and Dragons and Philosophy" Redux



Third Earthquake: You and your child are trapped in slowly collapsing wreckage, which threatens both of your lives. You cannot save your child's life except by using Black's body as a shield, without her consent, in a way that would crush one of her toes. If you also caused Black to lose another toe, you would save your own life. (vol. 1, 222)
For Parfit, as well as Sidgwick, the lesson is not to overgeneralize from paradigm cases. Just as promises under duress show that not all promises are equally important, Third Earthquake is used by Parfit to argue that not all cases of harming someone in pursuit of ends to which they do not consent amount to using them as a mere means. If you crush one of Black's toes to save your child's life but sacrifice your own life so that she can have nine toes rather than eight, Parfit notes, you are hardly treating her as a mere means in any intuitive sense.
Once you eliminate the self, all kinds of logical puzzles are resolved. Assume a can-opener, assume a virgin birth, assume moral truths.
"If her interests have the same value as his, then my interests must have the same value as yours."
The Research Imperative
The Trolley Problem
The Doctrine of Double Effect

Fallen rock stars of the 70s




Emerson's playing is studious and wooden. It's almost embarrassing alongside Peterson's.

Comparing G.A. Cohen and Parfit, I realized what should have been obvious. I've always thought of Cohen as a liberal because he operated in the domain of liberalism. Liberal universalism is the preference for absolutes over virtue ethics, for ideals over decisions in context, for timelessness over time, but I'd never thought enough about the roots of liberal psychology.

I've always known people who've dedicated their lives to service; who've taken, formally or informally, a vow of poverty. And I've accepted that, following my own definition of morality, they're morally superior to me. All the ratiocination and earnest puffery of philosophical liberalism, and Parfit's thousands of pages, are made in an idealistic defense of those who haven't made that choice. It's a sham idealism to assuage the guilt of a life of self-interest and self-absorption. And the sham becomes the career. I've always made fun of the pretension but never thought about its function.

And never mind Nietzsche; he never had the courage of his barbarism. He was a moralizing anti-moralist (as he would have to be), the servant of the thing he opposed, without which he'd have no purpose, the man who never left the library telling people they should never enter it, a self-hating slave, indulging another form of immaturity.


Remembering Derek Parfit
...Realising we had no idea what Parfit looked like, we asked every man leaving the room if he was Derek Parfit. They all laughed: they must have been twentysomething graduate students. Finally, out came a man with a mane of white hair and a bright red tie tucked into his trousers, wielding a large Smirnoff vodka bottle.

...Five years later, as I was starting doctoral research at Oxford, I was elected to a prize fellowship at All Souls. Parfit – who had been a fellow there since 1967 – was appointed as my college adviser. He wrote to me suggesting we have lunch in college. Over the soup I tried to describe our first meeting, hoping that he would recall my silly earnestness and his enormous generosity. At first he seemed not to hear me. When I tried again, he changed the subject. That our lives had intersected before held no interest for him. He had simply been kind to me then, and now was kind to me again; one thing didn’t have anything to do with the other. 
Instead he wanted to talk about what I intended to do with the seven years of my fellowship. He suggested I spend the first year reading novels, ‘sowing seeds’. He asked if I would like to comment on his work in progress. (The next day I received two hefty boxes of draft pages of the book that would be published in 2011 as On What Matters.) We talked about meta-ethics, and I told him I was inclined towards anti-realism, the view that moral truths are in some sense dependent on the human mind. He was visibly distressed by this – he said it implied that there was nothing wrong with torture – and I had to recant in order to stop him from leaving.

...Until a year ago, Derek read everything I wrote for publication, including my pieces for the LRB, and usually sent them back to me with detailed comments within a few hours. He would point me towards a relevant passage of Nietzsche, or suggest that a metaphor was too violent, or raise a fundamental philosophical objection. I wasn’t special to Derek; many philosophers, young and old, have similar stories. Sometimes I would pass by Derek in college and he would smile at me in a way that did not entirely convince me that I was recognised.

I don’t think it’s unfair to say that Derek didn’t see what is obvious to many others: that there are persons, non-fungible and non-interchangeable, whose immense particularity matters and is indeed the basis of, rather than a distraction from, morality. But in not seeing this, Derek was able to theorise with unusual, often breathtaking novelty, clarity and insight. He was also free to be, in some ways at least, better than the rest of us. After he retired from All Souls, Derek didn’t like to go to the college common room, so we had our last meeting in my study. While jostling his papers he knocked over a glass. He was unfazed. We sat and talked for a few hours, his feet in a pool of water and shattered glass.
Jerry Cohen, a personal appreciation
I first met Jerry during the 1981-2 academic year. I had arrived at University College London as a graduate student but had been assigned to someone else. I wanted to work with Jerry so I knocked on his door. A voice greeted me in French, so I replied in French, and that was how the whole conversation was conducted. Jerry’s disconcerting playfulness at work. Supervision sessions at UCL were usually free-form seminars on historical materialism—I was a bullshit Marxist trying to become a non-bullshit one under the influence of Karl Marx’s Theory of History. Another of his students and I worked out that if we pooled our sessions we’d get twice as many hours, so that’s what usually happened. We’d see him again during the weekly sessions of his two-year cyclical Marxism course and then, later I think, at the seminars where he was trying to grapple with self-ownership and property for the first time. All of those seminars would also feature his old friend Arnold Zuboff who would chip in with often brilliant objections or simply crack jokes. As a graduate supervisor Jerry was very generous with his time. Unlike others, I don’t remember a lot of careful commentary on my work, but I do remember a lot of conversation about matters philosophical and political, walks around Bloomsbury and Hampstead, and conversations on the phone (you could ring him up to discuss ideas!). Some things were sacred though: I once made the mistake of calling him during an episode of Dallas, his favourite TV show of the moment.

...One thing I’m going to find it difficult to convey is not just how Jerry was, but how he was the way he was. It would be easy to come up with a series of anecdotes that might appal or amuse, depending on the listener. My first supervision session, as a postgraduate student, for example, contained a long digression about Jerry’s itchy arse, and what the doctor had said about it – not really what you expect from your supervisor! But the way Jerry was made some of his chat, his disarming personal questions about bodily functions or personal relationships, and so on, different from how it might have been from someone else. For Jerry was completely lacking in inhibition, and because of that he could say things that in the mouth of a more uncomfortable person would have seemed creepy. Jerry was frequently disconcerting but never creepy. Jerry could say stuff and we’d laugh, because we were thinking it anyway but lacked his unembarrassment.
Their shared home.
There's an anachronistic opulence at All Souls College, Oxford, a college with many eminent scholars, but not a single student. College retainers serve dinner; there's port on offer, and posh cutlery on display.
repeats and repeats and...
I hate the lazy decadence of the unworldly. I hate philosophers, not lawyers.
“Doing these cases,” he wrote, “I began to find myself in a dangerous situation as an advocate. I came to believe in the truth of what I was saying. I was no longer entirely what my professional duties demanded, the old taxi on the rank waiting for the client to open the door and give his instruction, prepared to drive off in any direction, with the disbelief suspended.”

Wednesday, January 04, 2017



I began reading Ways of Seeing many years ago, and didn't get very far. It was simplistic. But watching the first episode now, I find his arguments simply vulgar and anti-intellectual.

"As if pictures were like words, rather that holy relics." He talks about paintings as images, when they're things, crafted by hand. An image of a Caravaggio has the relation to a Caravaggio at best, that a student translation of Flaubert has to Flaubert. His argument works more for film, which is an art of images, but he has no interest in art he won't talk about the construction of images, the craft of making, so he talks about the effect of the juxtaposed images when channel-flipping but he doesn't mention Kuleshov. That's almost fraud. He talks about Vertov and not Eisenstein. Benjamin was an ass. Even photographs are more than index.

Fine art is the art of the church and the monarchy, the art of authority. That's why it's the art of objects that only power can afford, and why it's related historically to the words that serve power, to philosophy and theology, not the vulgar and common, mere fictions. etc. etc. etc. all repeats.

I'd wanted to read his novels from Switzerland. I may at some point, but watching interviews with his old neighbors and watching him talk about them (all on YouTube), you really get the sense he didn't even know he was a dilettante, and that he thought he'd made them into art, in print and on stage, which they would never see. He comes off as the worst of Bloomsbury, falling in love with the innocence of children and peasants.
There are 5 million people in Damascus.

"We, the above mentioned entities and the armed opposition groups operating in Wadi Barada declare that once the ceasefire agreement is respected and all aggression operations among them ground and air operations are on hold against the civilian populated areas in Wadi Barada, we will work immediately on facilitating the entrance of the maintenance teams to the water facility in Fijeh Spring and allowing the accessories to access and we will make all the available efforts to assist the maintenance team for resuming the water supply to our people in Damascus city."

The White Helmets: repeat

Monday, January 02, 2017

The New Yorker does it again
retweeted today by economists, political scientists and philosophy professors.
Ian Bremmer, Justin Wolfers, Nigel Warburton, others.
Technocracy is not democracy.

Saturday, December 31, 2016

"That's like saying no one will judge our Khmer Rouge policy kindly. What choices did we have?"
Major Crispin Burke




I'm not sure what else there is to say.

Friday, December 30, 2016

argumente ab homine

 pretty poison
The 2016 elections gave thoughtful Americans plenty of reasons to despair about the state of our democracy. The looming Donald Trump presidency has forced us to confront ugly truths about racism, misogyny and economic inequality. But according to a new paper published in the prestigious academic journal “Philosophy & Public Affairs,” there is at least one more heretofore undetected poison floating in the cocktail that is our politics. If the philosophers behind the paper are right, this problem is amplifying every other malady afflicting American culture.

They call it “moral grandstanding.”

“Moral grandstanding is the use of moral talk for self-promotion,” says Justin Tosi, a postdoctoral fellow in the University of Michigan’s philosophy department. “It’s people using moral conversation, making moral claims, to present an impressive image of themselves to others.”
"prestigious academic journal"

Moral Grandstanding [™®© etc.]
Our basic contention is that one grandstands when one makes a contribution to public moral discourse that aims to convince others that one is “morally respectable.” By this we mean that grandstanding is a use of moral talk that attempts to get others to make certain desired judgments about oneself, namely, that one is worthy of respect or admiration because one has some particular moral quality—for example, an impressive commitment to justice, a highly tuned moral sensibility, or unparalleled powers of empathy. To grandstand is to turn one's contribution to public discourse into a vanity project.
"To grandstand is to turn one's contribution to public discourse into a vanity project."

I will resist naming the professional philosophers who should read this.

The philosophical divisions between written and spoken, between propositional and expressive, and then the discovery[!] of subtext, of the possibility of pomposity, the existence of narcissism.

Pretentious, Moi?

"Our attitudes to artworks are much more unpredictable and surprising than a lot of social theories allow for."

Monday, December 26, 2016


1983
How New York Stole the Idea of Modern Art
Serge Guilbaut
"A provocative interpretation of the political and cultural history of the early cold war years. . . . By insisting that art, even art of the avant-garde, is part of the general culture, not autonomous or above it, he forces us to think differently not only about art and art history but about society itself."—New York Times Book Review
2016
The Philosophy Scare: The Politics Of Reason In The Early Cold War
John McCumber

From the rise of formalist novels that championed the heroism of the individual to the proliferation of abstract art as a counter to socialist realism, the years of the Cold War had a profound impact on American intellectual life. As John McCumber shows in this fascinating account, philosophy, too, was hit hard by the Red Scare. Detailing the immense political pressures that reshaped philosophy departments in midcentury America, he shows just how radically politics can alter the course of intellectual history.

McCumber begins with the story of Max Otto, whose appointment to the UCLA Philosophy Department in 1947 was met with widespread protest charging him as an atheist. Drawing on Otto’s case, McCumber details the hugely successful conservative efforts that, by 1960, had all but banished the existentialist and pragmatist paradigms—not to mention Marxism—from philosophy departments all across the country, replacing them with an approach that valorized scientific objectivity and free markets and which downplayed the anti-theistic implications of modern thought. As he shows, while there have since been many instances of definitive and even explosive rejection of this conservative trend, its effects can still be seen at American universities today.
A search inside the new one at Google Books: despite the blurb's reference to formalist novels and abstract art, I found neither. The blurb writer just assumed they'd be there. Change is slow. Too fucking slow.

Friday, December 23, 2016

a repeat from a little over a year ago.
---
Another reason to hate political "science".
One way to understand this is as a manifestation of what political scientists call the expressive, as opposed to the instrumental, theory of voting. If voting is instrumental then it’s presumed that voters are primarily motivated by the results they hope to achieve: leaders and parties who can deliver real benefits. If it’s expressive then voters are more interested in signalling who they are and what they value. The case for expressive voting is partly driven by the thought that instrumental voting is a waste of time, since in any significant election no one’s vote ever decides the outcome (if your candidate wins or loses it is always by more than one vote, making your contribution incidental). But it also seems to chime with the world of social media and online communication, where self-expression rules and echo chambers proliferate.
Are social movements expressive, or instrumental?
See also "propositional" and "high-value" speech.

Objectivity is neutrality and passivity; you can only refuse to participate when participation is an option. Refusal plays a role in the outcome of events: it is still participation.
Do we all have a right to cross borders?
...But I think these philosophical arguments against a right to move are ultimately unconvincing. They fail to give sufficient weight to the essential interest that all of us have in being able to live, love, study, work and settle without being restricted by the coercive and often violent imposition of borders. In the context of massive inequality, the current border regime is even more unjustified, akin to the arbitrary and anti-human character of a global caste system.
Do we have the right to keep the jobs we were trained for?
---
The economics of open borders
On the first question, I’ll offer a bold Maybe. Kennan’s core assumption is that immigrant workers with a given level of education and (I think) experience will have the same productivity as already resident workers. So a move from a low productivity country to a high productivity country produces a big increase in their effective labor capacity. That benefits those workers, but also produces a shift in global income from labor to capital since the supply of labor has increased.
"with a given level of education and (I think) experience will have the same productivity as already resident workers."
Given the opportunity, immigrants will outpace them.
---
Nobel economist Angus Deaton on a year of political earthquakes
Over trout in Princeton, the laureate says he’s glad the Clinton era is over and it isn’t only Trump voters who feel ‘excluded’.

...Deaton is gracious about my bind and offers some advice. It helps that he looks like he has been plucked from central casting for emeritus professors: requisite tweed jacket, jumper and wire-rimmed glasses; white hair just unkempt enough to give a flicker of Ivy League eccentricity. He is also wearing a blue bow tie with vivid red stars that once belonged to one of his mentors, the late Richard Stone, fellow Nobel Prizewinner and the godfather of British national accounts.

Mistral is bright and airy despite the rain outside, and filled with music, cheer and the clanging of cutlery and plates. The noise forces us — two slightly rumpled large men — to lean across the small table to hear each other. I can’t help thinking that we are also, in the parlance of 2016, two “metropolitan elites”, sipping a smooth Oregon pinot noir and pondering death, pain and Donald Trump.
I can’t help thinking he's right.
---
Regional Policy and Distributional Policy in a World Where People Want to Ignore the Value and Contribution of Knowledge- and Network-Based Increasing Returns
All of this "what you deserve" language is tied up with some vague idea that you deserve what you contribute--that what your work adds to the pool of society's resources is what you deserve. 
This illusion is punctured by any recognition that there is a large societal dividend to be distributed, and that the government can distribute it by supplementing (inadequate) market wages determined by your (low) societal marginal product, or by explicitly providing income support or services unconnected with work via social insurance.
Instead, the government is supposed to, somehow, via clever redistribution, rearrange the pattern of market power in the economy so that the increasing-returns knowledge- and network-based societal dividend is predistributed in a relatively egalitarian way so that everybody can pretend that their income is just "to each according to his work", and that they are not heirs and heiresses coupon clipping off of the societal capital of our predecessors' accumulated knowledge and networks.

On top of this we add: Polanyian disruption of patterns of life--local communities, income levels, industrial specialization--that you believed you had a right to obtain or maintain, and a right to believe that you deserve. But in a market capitalist society, nobody has a right to the preservation of their local communities, to their income levels, or to an occupation in their industrial specialization. In a market capitalist society, those survive only if they pass a market profitability test. And so the only rights that matter are those property rights that at the moment carry with them market power--the combination of the (almost inevitably low) marginal societal products of your skills and the resources you own, plus the (sometimes high) market power that those resources grant to you.
...Now I think it is an open question whether it is harder to do the job via predistribution, or to do the job via changing human perceptions to get everybody to understand that:
  • no, none of us is worth what we are paid.
  • we are all living, to various extents, off of the dividends from our societal capital
  • those of us who are doing especially well are those of us who have managed to luck into situations in which we have market power--in which the resources we control are (a) scarce, (b) hard to replicate quickly, and (c) help produce things that rich people have a serious jones for right now.
  • "Speaking as a card-carrying neoliberal and as a proud member of the Rubinite wing of the Democratic Party,..."
  • "a welcome and tasty and affordable simulacrum of the tomato-eating experience."
new tags for Deaton, DeLong, Quiggin, Noah Smith, also Dani Rodrik. It's easier than linking and relinking to earlier posts, and I'm not going to make a tag for economics.

Tuesday, December 20, 2016

 Leiter, "The Case Against Free Speech",  again
This long-gestating paper has finally appeared in the Sydney Law Review (it was the 2013 Julius Stone Address in Jurisprudence at Sydney), and can be downloaded here for those who are interested. This year's electoral catastrophe just underlines the crucial Millian point, namely, that well-being in America will remain in danger as long as Breitbart, Drudge, Fox and the other fact-free media are allowed to operate freely.
The fish rots from the head; Weimarization seeps down from above; philosophy is authoritarian by definition; "the melancholy superiority of a schoolmaster of a school for wayward youth." etc., etc.
Fascism.

Monday, December 19, 2016

Because people are saying it's going to win a Pulitzer.

Sunday, December 18, 2016


1
This is not a pseudo-epic of redemption or revenge, with boxers and gangsters and their churchgoing moms and wives. It’s a masculine melodrama that doubles as a fable of social catastrophe. Lee, Joe and their friends would never define themselves as privileged. They have proletarian tastes and sensibilities. But they also have paid-up houses and boats, kids on track for college, decent medical care and an ironclad entitlement to the benefit of the doubt. (Observe what happens to Lee in the Manchester police station and you’ll see what I mean.) Their main problems come from women, who spoil the parties, don’t get the jokes and sometimes can’t control their drinking or keep their pants on. Some are good moms or good sports, and anyway, a man can always steal away to the boat or the basement with the guys and some beers.

Cast out of this working man’s paradise, Lee is also exiled from the prerogatives of whiteness. He lives in a basement room, earning minimum wage, answering to an African-American boss and accepting a tip from a black tenant whose toilet he has cleaned and repaired. He doesn’t complain, but it is also clear that he has chosen these conditions as a form of self-abasement, as punishment for his sins.

Maybe it sounds like I’m over-reading, or making an accusation. But to deny that “Manchester by the Sea” has a racial dimension is to underestimate its honesty and overlook its difficult relevance. Lee is guilty and angry, half-convinced that what happened was not his fault and half-certain that it was, unable to apologize or to accept apologies, paralyzed by grief and stung by a sense of grievance. He’s broken, and he’s also smart enough to realize — and Mr. Lonergan is wise and generous enough to allow him to understand — that nothing will make him whole again.
2
In his approach to work, Seb is a proud purist, perpetually oppressed and affronted by the prospect of compromise. To pay the rent, he is obliged to take what he regards as demeaning gigs: tickling out Christmas carols and show tunes at a restaurant (the manager is J. K. Simmons, the fearsome Oscar-winner from “Whiplash”); doing ’80s pop hits with a knowingly cheesy cover band; touring with a combo fronted by an old friend who has made it big.

That friend, Keith, is played by the real-life R&B star John Legend, whose affable participation presents an interesting challenge to Seb’s dogmatic traditionalism. It seems doubtful that Mr. Legend would have shown up to perform music that he thought was bad, and Keith’s unapologetic commercialism is less a strawman for Seb’s high-mindedness than a plausible counterargument. The difference between selling out and breaking through is not always clear, and “La La Land” is not so hypocritical as to pretend otherwise.

This is especially true in Mia’s case. She works as a barista at a coffee shop on the Warner Bros.’ lot, dashing off to audition for small roles in dubious films and television shows. But, of course, the line between art and junk is also blurry, partly because to qualify for the junk you must be absolutely dedicated to your art. Which Mia is, in a way that magnifies Ms. Stone’s extraordinary discipline, poise and naturalness.

The real tension in “La La Land” is between ambition and love, and perhaps the most up-to-date thing about it is the way it explores that ancient conflict. A cynical but not inaccurate way to put this would be to describe it as a careerist movie about careerism. But that would be to slight Mr. Chazelle’s real and uncomfortable insight, which is that the drive for professional success is, for young people at the present time, both more realistic and more romantic than the pursuit of boy-meets-girl happily-ever-after. Love is contingent. Art is commitment.
3
Other than some fanciful nonsense that dribbles out of Bobby Kennedy (Mr. Sarsgaard), the film mostly avoids presidential politics and policies, as well as the grim scandals, sex parties and popped pills. Instead, it explores the fantasy that becomes that scandalous house’s own double: Camelot, as Mrs. Kennedy christened it. The idea of the Kennedy years as Camelot became an enduring trope and, for some, a maddening lie. In a 2011 essay in Vanity Fair, Christopher Hitchens took a whack at Jacqueline Kennedy, arguing that her “winsome innocence,” as he put it, was “a soft cover for a specific sort of knowingness and calculation.” This knowingness seemed to repulse him; it galvanizes “Jackie.”

The film takes Jackie’s cunning and dissimulations as much for granted as it does her elegance and love of couture. Put differently, it takes her personhood for granted, which may be why Mr. Larraín shows all the snot, tears and blood, all the desperate bodily mess. In “Jackie,” Kennedy’s body — the object of obsessive inquiries — is replaced by hers in a kind of symbolic transfiguration as she assumes the role of his dignified representative, the guardian of a shining legacy. The assassination was a national and personal tragedy, one which she answered with a myth which was an act of radical will and sovereignty. She married John F. Kennedy; she also helped invent him.
4
Peluchonneau is a tragically constricted soul, but not an entirely unsympathetic character. Neruda is a heroic figure — comic and Dionysian, brilliant and naughty — but his personal Javert is in some ways the film’s protagonist. Neruda is annoyed and sometimes amused by the detective’s doggedness, but Peluchonneau is haunted by the poet’s mystique, and by a growing sense of his own incompleteness. A curious symbiosis develops between them, a dynamic more complex and strange than the simple conflict of good and evil.

Mr. Larraín is a master of moral ambiguity. His previous films about Chile — “Tony Manero,” “No” (which also starred Mr. Bernal) and “The Club” — are interested in collaboration as well as resistance, in the inner lives of the corrupt as well as the actions of the virtuous. Those movies, in particular “Tony Manero,” set during the military dictatorship in the 1970s, and “The Club,” about a group of disgraced priests, are studies in claustrophobia, with cloudy cinematography and grubby behavior.

“Neruda” has a looser story, richer colors and a more buoyant spirit. It is less abrasive than Mr. Larraín’s Chilean trilogy, and less intensely focused than “Jackie,” his new English-language film about Jacqueline Kennedy in the aftermath of her husband’s assassination. But like that unorthodox foray into history, this one approaches political issues from an oblique angle, looking for the idiosyncrasies and ironies that humanize the pursuit of ideals and the exercise of power.

The period details cast a romantic glow over Neruda’s flight, which feels more swashbuckling than desperate. But the film casts a shadow forward in time, into the darkness of Chile’s later, bloodier period of military rule, and beyond that into the political uncertainties of the present, in Latin America and elsewhere. Mr. Larraín invites us to believe that history is on the side of the poets and the humanists, and that art will make fools of politicians and policemen. But he is also aware, as Pablo Neruda was, that history sometimes has other plans.
Playwrights are scriptwriters now more than in the past. The author of the autobiographical story that became the basis for Moonlight is the new head of playwriting at Yale. He'd already won a MacArthur. The man behind Penny Dreadful is the author of Red. I google one of the scriptwriters of Gotham on a hunch.

NY Times, November, 2010
Craig Wright, who has written and produced for shows like “Six Feet Under,” “Lost” and “Brothers and Sisters,” said working in television has made him a better playwright. His new comic drama “Mistakes Were Made,” now at the Barrow Street Theater, is a taut work in which Michael Shannon plays an opportunistic Broadway producer, loosely based on the multiple Tony-winner Kevin McCollum. That it’s a sympathetic portrait of the usually demonized producer, Mr. Wright says, is partly because of his work in Hollywood. “One of the things you experience creating television,” he said, “is that there’s this fake vanity sometimes on the part of the writers who still labor under this old idea that the writers are Artists with a capital A, and the producers and studio executives are businessmen who just want to make a buck.”
This extends from the UK model. Joe Orton's first play began as a radio play. Mike Leigh started in theater and then made television plays. I've said before that the UK did television better than film; that TV was visually uninteresting because of the format; film is pictorial, and English is a literary culture. Widescreen changes things but not that much; long-form TV is still visual prose. Film now sometimes risks becoming pictorial formalism.

Thursday, December 15, 2016

FACT (Black's Law Dictionary)
The terms “fact” and “truth” are often used in common parlance as synonymous, but as employed in reference to pleading, they are widely different. A fact in pleading is a circumstance, act, event, or incident; a truth is the legal principle which declares or governs the facts and their operative effect.

Wednesday, December 14, 2016

continuing the previous post.

Kazin again.

And again it's not that he's brilliant but that his writing fits a model opposed to the academy, and what it's become. Repeats of repeats.

Bureaucrats thought of literature as parasitic, and now that they champion it they become parasites themselves, as biographers live off their subjects. Philosophers always see themselves as proscribing or granting permission.

Kazin: The Language of Pundits
It is curious that Freud, the founder of psychoanalysis, remains the only first-class writer identified with the psychoanalytic movement. It was, of course, Freud's remarkable literary ability that gave currency to his once difficult and even "bestial" ideas; it was the insight he showed into concrete human problems, the discoveries whose force is revealed to us in a language supple, dramatic, and charged with the excitement of Freud's mission as a "conquistador" into realms hitherto closed to scientific inquiry, that excited and persuaded so many readers of his books. Even the reader who does not accept all of Freud's reasoning is aware, as he reads his interpretation of dreams, of the horror associated with incest, of the Egyptian origins of Moses, that this is a writer who is bent on making the most mysterious and unmentionable matters entirely clear to himself, and that this fundamental concern to get at the truth makes dramatis personae out of his symbols and dramatic episodes out of the archetypal human struggles he has described. It is certainly possible to read Freud, even to enjoy his books, without being convinced by him, but anyone sensitive to the nuances and playfulness of literary style, to the shaping power of a great intellectual conception, is not likely to miss in Freud the peculiar urgency of the great writer; for myself, I can never read him without carrying away a deeply engraved, an unforgettable sense of the force of human desire.

By contrast, many of the analysts who turn to writing seem to me not so much writers as people clutching at a few ideas. Whenever I immerse myself, very briefly, in the magisterial clumsiness of Dr. Gregory Zilboorg, or the slovenly looseness of Dr. Theodore Reik, or the tensely inarticulate essays of Dr. Harry Stack Sullivan, or the purringly complacent formulas of Dr. Edmund Bergler, or even the smoothly professional pages of Dr. Erich Fromm, I have a mental picture of a man leaping up from his chair, crying with exultation, "I have it! The reason for frigidity in the middle-aged female is the claustrophobic constitution!," and straightway rushing to his publisher. Where Freud really tried to give an explanation to himself of one specific human difficulty after another, and then in his old-fashioned way tried to show the determination of one new fact by another, it is enough these days for Dr. Bergler to assert why all writers are blocked, or for Dr. Theodore Reik, in his long-winded and inconsequential trek into love and lust, to announce that male and female are so different as to be virtually of different species. The vital difference between a writer and someone who merely is published is that the writer seems always to be saying to himself, as Stendhal actually did, "If I am not clear, the world around me collapses." In a very real sense, the writer writes in order to teach himself, to understand himself, to satisfy himself; the publishing of his ideas, though it brings gratifications, is a curious anticlimax.

Of course, there are psychoanalyst-writers who aim at understanding for themselves, but don't succeed. Even in Freud's immediate circle, several of the original disciples, having obtained their system from the master, devoted themselves to specialties and obsessions that, even if they were more than private idees fixes, like Otto Rank's belief in the "birth-trauma," were simply not given the hard and lucid expression necessary to convince the world of their objectivity. Lacking Freud's striking combination of intellectual zeal and common sense, his balanced and often rueful sense of the total image presented by the human person, these disciples wrote as if they could draw upon Freud's system while expanding one or two favorite notions out of keeping with the rest. But so strongly is Freud's general conception the product of his literary ability, so much is it held together only in Freud's own books, by the force of his own mind, that it is extraordinary how, apart from Freud, Freudianism loses its general interest and often becomes merely an excuse for wild-goose chases.

Obviously these private concerns were far more important to certain people in Freud's own circle than was the validity of Freudianism itself. When it came to a conflict between Freudianism and their own causes (Otto Rank) or their desire to be uninhibited in mystical indefiniteness (C. G. Jung), the body of ideas which they had inherited, not earned, no longer existed for them. Quite apart from his personal disposition to remain in control of the movement which he had founded, Freud was objectively right in warning disciples like Ferenczi, Rank, Adler, and Stekel not to break away from his authority. For the analyst's interest in psychoanalysis is likely to have its origin in some personal anxiety, and some particularly unstable people (of whom there were several in Freud's circle), lacking Freud's unusual ability not only to work through his own neuroses but to sublimate everything into the grand creative exultation of founding a movement, committed themselves fruitlessly to the development of their unsystematic ideas, found it impossible to heal themselves by the ad hoc doctrines they had advanced for this purpose, and even relapsed into serious mental illness and suicide.

Until fairly recently, it was perfectly possible for anyone with a Ph.D. (in literature or Zen or philology) to be a "psychotherapist" in New York State. I have known several such therapists among the intellectuals of New York, and I distinguish them very sharply from the many skillful and devoted lay analysts, with a direct training in psychoanalysis, who are likely to have an objective concern with the malady of their patients. The intellectuals with Ph.D.'s who transferred from other professions to the practice of psychoanalysis still seem to me an extreme and sinister example of the tendency of psychoanalysis to throw up the pundit as a type. Like modern intellectuals everywhere, intellectuals as self-made analysts are likely to have one or two ruling ideas which bear obvious relation to their private history, but which, unlike intellectuals generally, they have been able to impose upon people who came to them desperately eager for orientation in their difficulties. In short, the ruling weakness of intellectuals, which is to flit from idea to idea in the hope of finding some instrument of personal or world salvation, has often become a method of indoctrination. All the great figures in psychoanalysis have been egotists of the most extreme sort; all the creative ones, from Freud himself to the late unfortunate Dr. Wilhelm Reich, were openly exasperated with the necessity of having to deal with patients at all. They were interested only in high thinking, though Freud at least tempered his impatience enough to learn from his patients; the objective power, the need to examine symptoms in others, never left him.

By contrast, the intellectual who is looking for an audience or a disciple has often, as a psychotherapist, found one in his patient. And the obvious danger of exploiting the credulous, the submissive, the troubled (as someone said, it is the analyst's love that cures the patient, and certain intellectuals love no one so much as a good listener), which starts from a doctrine held by the analyst in good faith but which may be no less narrow-minded or fanatical for all that, seems to me only an extension of the passion for explaining everything by psychoanalysis which literary intellectuals have indulged in so long. When I think of some of the intellectuals who have offered their services as therapists, I cannot but believe that to them the patient is irrelevant to their own passion for intellectual indoctrination. My proof of this is the way they write. Ever since Freud gave the word to so many people less talented than himself, it has become increasingly clear that, whatever psychoanalysis may have done for many troubled people, it has encouraged nonwriters to become bad writers and mediocre writers to affect the style of pundits. For the root of all bad writing is to be distracted, to be self-conscious, not to have your eye on the ball, not to confront a subject with entire directness, with entire humility, and with concentrated passion. The root of all bad writing is to compose what you have not worked out, de haut en bas, for yourself. Unless words come into the writer's mind as fresh coinages for what the writer himself knows that he knows, knows to be true, it is impossible for him to give back in words that direct quality of experience which is the essence of literature.

Now, behind the immense power and authority of psychoanalytical doctrines over contemporary literature — which expresses itself in the motivation of characters, the images of poetry, the symbol hunting of critics, the immense congregation of psychiatric situations and of psychiatrists in contemporary plays and novels — lies the urgent conviction, born with modern literature in the romantic period, the seedbed of Freudian ideas, that literature can give us knowledge. The Romantic poets believed in the supremacy of imagination over logic exactly as we now believe that the unconscious has stories to tell which ordinary consciousness knows nothing of. And just as the analyst looks to free association on the part of the patient to reveal conflicts buried too deep in the psyche to be revealed to the ordinarily conscious mind, so the Romantic poets believed that what has been buried in us, far from the prying disapprovals of culture, stands for "nature," our true human nature. A new world had been revealed to the Romantics, a world accessible through the imagination that creates art. And Freud, who also felt that he had come upon a new world, said that his insights had been anticipated by literary men in particular; he felt that he had confirmed, as scientific doctrine, profound discoveries about our buried, our archetypal, our passionate human nature that philosophers and poets had made as artists.

Had made as artists. Nietzsche, who also anticipated many of Freud's psychological insights, said that Dostoevsky was the only psychologist who had ever taught him anything. No doubt he meant that the characters Dostoevsky had created, the freshness of Dostoevsky's perceptions, the powerful but ironic rationality of Dostoevsky's style had created new facts for him to think of in comparison with the stale medical formulas of psychiatry in his time. Similarly, Freud said of Dostoevsky that "before genius, analysis lays down its arms," indicating that with the shaping power of the artist who can create characters like old Karamazov and Prince Myshkin, with the genius that in its gift of creation actually parallels life instead of merely commenting on it, analysis cannot compete. And in point of fact we do learn more about the human heart from a stupendous creation like the Karamazov family than we ever do from all the formulary "motivations" of human nature. Just as each human being, in his uniqueness, escapes all the dry formulas and explanations about human nature, so a great new creation in imaginative literature, a direct vision of the eternal like William Blake's or an unprecedented and unassimilable human being like old Karamazov, automatically upsets and rearranges our hardened conceptions of human nature.

There is no substitute for life, for the direct impression of life; there is no deep truth about life, such as writers bring home to us, that does not come in the form of more life. To anyone who really knows how rare and precious imaginative creation is — how small, after all, is that procession which includes Dante's Paolo and Francesca, Shakespeare's Othello, and Tolstoy's Natasha — how infinitely real in suggestion is the character that has been created in and through imagination, there is something finally unbearable, the very opposite of what literature is for, in the kind of metallic writing which now so often serves in a novel to "motivate" a character.

Maybe the only tenable literary role which novelists and poets, as well as critics and psychologists, now want to play is that of the expert — the explainer, the commentator, the analyst. Just as so many psychoanalysts want to be writers, so many writers now want to be analysts. And whenever I rise up at intervals from my dutiful immersion in certain specimens of contemporary literature, I find it hard to say who has less to contribute to literature, the psychiatrist who wants to push a few small ideas into a book or the novelist who in the course of a story breaks down into writing like a psychoanalyst.

« 2 » 

The deterioration of language in contemporary fiction into the language of pundits is not often noticed by critics — perhaps because the novelists have taken to writing like critics. But it is by no means the highbrow or intellectual novelist — like Mary McCarthy, who in a single story for Partisan Review is likely to produce so many deliberate symbols — who is the only offender against art. John O'Hara in From the Terrace wrote, of the mother of his hero, that "What had happened to her was that she unconsciously abandoned the public virginity and, again unconsciously, began to function as a woman." Of the Eaton brothers, O'Hara made it clear that "If William slapped Alfred or otherwise punished him, the difference in ages was always mentioned while William himself was being punished; and each time that that occurred the age separation contributed to a strengthening of the separation that was already there because of, among other considerations, the two distinct personalities." This is a novelist? Frankly, I have the impression that many of the younger novelists have learned to write fiction from reading the New Critics, the anthropologists and psychologists. I cannot begin to enumerate all the novels of recent years, from Ralph Ellison's Invisible Man to Vance Bourjaily's recent Confessions of a Spent Youth, which describe American social customs, from college up, as fulfilling the prescription of tribal rites laid down by the anthropologists. But whereas an angry and powerful novelist, as Ellison is in Invisible Man, whatever helpful hints he may get from psychiatrically oriented literary critics, will aim at the strongest possible image of Negro suffering and confusion in a hostile society, Vance Bourjaily, in his recent novel, has his hero preface his description of a business smoker by apologizing that "it would take the calm mind of an anthropologist to describe objectively the rites with which the advertising tribe sent its bachelor to meet his bride."

I don't know what repels me more in such writing, the low spirits behind such prosiness or the attempted irony that is meant to disguise the fact that the writer is simply not facing his subject directly but is looking for something to say about it. No wonder that a passage like this sounds not like fiction but a case history: "I had a good time with Vicky during those two or three months; at the same time, I was learning about the social structure of the town and that of the school which, with certain exceptions for unusual individuals, reflected it; Vicky was more or less middle middle. As a friend of hers, since my own status was ambiguous, it seemed to me that I must acquire hers by association." And Mr. Bourjaily's book is a case history, though so meanderingly self-absorbed, for the most part, that it comes splendidly alive when the hero describes a visit to his relatives in the Near East; for a few pages we are onto people whom Mr. Bourjaily has to describe for us, since they are new types, and then we get free of the motivational analysis that is the novelist's desperate response to people who he thinks are too familiar to be conveyed directly. This is a curious idea of a novel — as if it were the subject, rather than the point of view, which made it boring. The true writer starts from autobiography, but he does not end there; and it is not himself he is interested in, but the use he can make of self as a literary creation. Of course, it is not the autobiographical subject that makes such books as Mr. Bourjaily's flat; it is the relatively shallow level from which the author regards his own experience. The mark of this is that the writer does not even bother to turn his hero into a character; he is just a focus for the usual "ironic" psychological comment. If the writer nowadays sees himself as a pundit, he sees his hero as a patient. What, in fact, one sees in many contemporary American novelists today is the author as analyst confronting his alter ego as analysand. The novel, in short, becomes simply an instrument of self-analysis, which may be privately good for the writer (I doubt it) but is certainly boring to his readers.

 « 3 »  

The deterioration of language in contemporary "imaginative" literature — this reduction of experience to flat, vaguely orphic loose statements — seems to me most serious whenever, in our psychiatri- cally centered culture, spontaneity becomes an arbitrary gesture which people can simulate. Among the Beat writers, spontaneity be- comes a necessary convention of mental health, a way of simulating vitality, directness, rough informality, when in fact the literary works produced for this pose have no vitality, are not about anything very significant, and are about as rough as men ever are using dirty words when they cut themselves shaving. The critic Harold Rosenberg once referred scathingly to the "herd of independent minds"; when I read the Beat and spontaneous poets en bloc, as I have just done in Donald Allen's anthology of the "new" American poetry, I feel that I am watching a bunch of lonely Pagliaccis making themselves up to look gay. To be spontaneous on purpose, spontaneous all the time, spontaneous on demand is bad enough; you are obeying not yourself but some psychiatric commandment. But to convert this artificial, constant, unreal spontaneity into poetry as a way of avoiding the risks and obligations of an objective literary work is first to make a howling clown out of yourself and then deliberately to cry up your bad literature as the only good literature.

The idea of the Beat poets is to write so quickly that they will not have to stand up for the poem itself; it is enough to be caught in the act of writing. The emphasis is not on the poem but on themselves being glimpsed in the act of creation. In short, they are functioning, they are getting out of the prison house of neurosis, they are positive and free. "Look, Ma, no hands!" More than this, they are shown in the act of writing poems which describe them in the act of living, just about to write poems. "Morning again, nothing has to be done/ maybe buy a piano or make fudge/ At least clean the room up, for sure like my farther / I've done flick the ashes & buts over the bedside on the floor." This is Peter Orlovsky, "Second Poem."

Elsewhere, the hysterical demand for spontaneity as an absolute value means that everything in the normal social world becomes an enemy of your freedom. You want to destroy it so as to find an image of the ecstasy that has become the only image of reality the isolated mind will settle for. It is a wish for the apocalypse that lies behind the continued self-righteous muttering that the world is about to blow up. The world is not about to blow up, but behind the extreme literary pose that everything exists to stifle and suppress and exterminate us perhaps lies the belief, as Henry Miller plainly put it in Tropic of Cancer, that "For a hundred years or more the world, our world, has been dying. . . . The world is rotting away, dying piecemeal. But it needs the coup de grace, it needs to be blown to smithereens. . . . We are going to put it down — the evolution of this world which has died but which has not been buried. We are swimming on the face of time and all else has drowned, is drowning, or will drown."

The setting of this apocalyptic wish is the stated enmity between the self and the world, between the literary imagination and mere reality — a tension which was set up by Romanticism and which Freudianism has sharpened and intensified to the point where the extreme Romantic, the Beat writer, confesses that the world must be destroyed in order that the freedom of his imagination proceed to its infinite goal. Romanticism put so much emphasis on the personal consciousness that eventually the single person came to consider himself prior to the world and, in a sense, replacing it; under Romanticism, the self abandoned its natural ties to society and nature and emphasized the will. The more the single conscious mind saw the world as an object for it to study, the more consciousness was thrown back on itself in fearful isolation; the individual, alone now with his consciousness, preoccupied in regarding himself and studying himself, had to exercise by more and more urgent exertions of will that relationship to the world which made consciousness the emperor of all it could survey — the world was merely raw material to the inquiring mind.

Freud, himself a highly conservative and skeptical thinker with a deeply classical bias in favor of limitation, restraint, and control, could not have anticipated that his critique of repression, of the admired self-control of the bourgeoisie, would in time, with the bankruptcy of bourgeois values, become a philosophy for many of his followers. Freudianism is a critique of Victorian culture; it is not a prescription for living in the twentieth century, in a world where the individual finds himself increasingly alienated from the society to which he is physically tied. Freud once wrote in a letter to Romain Rolland: "Psychoanalysis also has its scale of values, but its sole aim is the enhanced harmony of the ego, which is expected successfully to mediate between the claims of the instinctual life [the id] and those of the external world; thus between inner and outer reality.

"We seem to diverge rather far in the role we assign to intuition. Your mystics rely on it to teach them how to solve the riddle of the universe; we believe that it cannot reveal to us anything but primitive, instinctual impulses and attitudes . . . worthless for orientation in the alien, external world."

It was the Romantics who handed down to modern writers the necessity to think of the world as "alien and external." By now so many writers mechanically think of it this way that it is no wonder that they look for a philosophy of life to the "primitive, instinctual impulses and attitudes," though, as Freud knew, they are "worthless for orientation in the alien, external world." Man cannot cheat his own mind; he cannot bypass the centrality of his own intelligence. Yet is not sole reliance on the "primitive, instinctual impulses" exactly the raison d'etre of so many Beat poems and novels; of neurotic plays dealing with people whose only weakness, they think, is that they are repressed; of literary studies whose whole thesis is that the American novel has always been afraid of sex? What is wrong with such works is not that the single points they make are incorrect, but that they rely upon a single point for a positive philosophy of life. It is impossible to write well and deeply in this spirit of Sisyphus, pushing a single stone up the mountain. It is impossible to write well if you start from an arbitrary point of view, and in the face of everything that is human, complex, and various, push home your idee fixe. It is impossible for the haunted, the isolated, the increasingly self-absorbed and self-referring self to transcend itself sufficiently to create works of literature.

Literature grows out of a sense of abundant relationships with the world, out of a sense that what is ugly to everyone else is really beautiful to you, that what is invisible to many men is pressingly alive and present to your writer's eye. We can no longer, by taking thought, transcend the life that consists in taking thought. The English novelist and philosopher Iris Murdoch has recently helped clear the air of desperate self-pity by saying that "We need to return from the self-centered concept to the other-centered concept of truth. We are not isolated free choosers, monarchs of all we survey, but benighted creatures sunk in a reality whose nature we are constantly and overwhelmingly tempted to deform by fantasy. Our current picture of freedom encourages a dream-like facility; whereas what we require is a renewed sense of the difficulty and complexity of the moral life and the opacity of persons."

By now the self-centered mind fashioned by romanticism, constantly keeping itself open only to adjurations of absolute freedom and spontaneity, has traveled about as far along the road of self-concern as it can; it has nothing to discover further of itself but fresh despair. The immediate proof of this is in the quality of so much of the literature that has been shaped by Freudianism — only because all other creeds have failed it. It is not possible to write well with one's own wishes as the only material. It is not possible any longer to think anything out without a greater reality than oneself constantly pressing one's words into dramatic shape and unexpected meaning. All our words now are for our own emotions, none for the world that sustains the writer. And this situation is impossible, for it was never the self that literature was about, but what transcended the self, what comes home to us through experience. 
 [1961]
Kazin's opinions were bog-standard in the world I grew up in. It wasn't until I was older, reading Panofsky and Arendt, Huizinga and bits of Auerbach, that I discovered the tradition his writing descends and devolves from. When I read that Panofsky laughed at the New Critics I understood, not because I'd read Kazin, but because I was raised by people who did. But reading him after rather than before, I sensed the defensiveness and snobbery, the need to push back. The older tradition didn't grow up overshadowed by positivism. They had the luxury of sadness.

Tuesday, December 13, 2016

"Our attitudes to artworks are much more unpredictable and surprising than a lot of social theories allow for"

Comic. Absurd. Pathetic. Embarrassing.

Daily Nous
Ingrid Robeyns, professor of philosophy and holder of the Ethics and Institutions Chair at Utrecht University, has won a 2 million euro grant from the European Research Council to pursue her research on “limitarianism” over the next five years.

Her project is called “Can Limitarianism Be Justified? A Philosophical Analysis of Limits on the Distribution of Economic and Ecological Resources,” or Fair Limits, for short. Here is a little about it:

Inequalities in wealth are significant and on average increasing, and various ecological sinks and resources are overused. These circumstances should prompt us to rethink what fairness entails in the distribution of economic and ecological material resources. In particular, are there good grounds to opt for upper limits in the distribution of those resources? Are there, from a moral point of view, certain limits in our appropriation or use of material resources that should not be crossed? Can we say, either individually or collectively, that at some point we are polluting too much and using too many natural resources, or that we are having too much wealth? If so, why—and if not, why not?…

The Fair Limits project will not only push the boundaries of the philosophy of distributive justice, but also pose some fundamental questions of the contemporary dominant paradigm in thinking about justice. Methodologically, this will be done by developing methods for normative political philosophy in non-ideal conditions. In addition, Fair Limits also entails a critical dialogue with non-liberal philosophies, such as Confucian philosophy, African Philosophy, and Indigenous philosophies, to reconsider the soundness of basic assumptions in contemporary liberal theories of justice. Fair Limits thus has the potential to contribute to a paradigm shift in philosophical analysis of questions of distributive justice. 
You can learn more about the project and the award here.
Chronicle of Higher Education

What's Wrong With Literary Studies?
Some scholars think the field has become cynical and paranoid
In the low-budget realm of humanities grantmaking, a University of Virginia press release this May came as a shock. The Danish National Research Foundation had awarded roughly $4.2 million to a literary-studies project led by an English professor at Virginia, Rita Felski. And this wasn’t yet another big-ticket digital-humanities effort to map the social history of the United States or crunch the cultural data stored in five million books. This money would help Felski assemble a team of scholars to investigate the social uses of literature.
For Felski, the windfall validates a nearly decade-long push to change the way literature and other art forms are studied. In a series of manifestoes, she has developed a sophisticated language for talking about our attachments to literature and prodded literary scholars to reconsider their habit of approaching texts like suspicious detectives on the hunt for hidden meanings. Felski’s message boils down to prefixes. Literary critics have emphasized "de" words, like "debunk" and "deconstruct." But they’ve shortchanged "re" words — literature’s capacity to reshape and recharge perception.

"There’s actually quite a diverse range of intellectual frameworks, politically, theoretically, philosophically," says Felski, who specializes in literary theory and method. "Yet there’s an underlying similarity in terms of this mood of vigilance, wariness, suspicion, distrust, which doesn’t really allow us to grapple with these really basic questions about why people actually take up books in the first place, why they matter to people."

Though the size of her grant may be unique, Felski’s sense of frustration is not. Her work joins a groundswell of scholarship questioning a certain kind of critique that has prevailed in literary studies in recent decades. "Critique" can be a blurry word — isn’t all criticism critique? — but in Felski’s usage it carries a specific flavor. Critique means a negative commentary, an act of resistance against dominant values, an intellectual discourse that defines itself against popular understanding.

Felski sketches the shake-up of literary studies that started in the ’60s as a shift from criticism ("the interpretation and evaluation of literary works") to critique ("the politically motivated analysis of the larger philosophical or historical conditions shaping these works"). Most frameworks taught today in a literary-theory class, such as feminism, Marxism, deconstruction, structuralism, and psychoanalysis, would count as variants of critique. 
Contemporary literary scholarship has never lacked for detractors: Down with politics in the academy! Back to the Great Books! What’s different now is that the questioning of critique is coming from people steeped in its theories. Eve Kosofsky Sedgwick, a founder of queer theory and sexuality studies, galvanized this soul-searching with a 2003 essay arguing that theory had spawned a paranoid mood in literary studies. The debate gained momentum with a special issue of the journal Representations in 2009, when Stephen Best and Sharon Marcus challenged a method of interpretation known as symptomatic reading, in which critics read texts like psychoanalysts probing for repressed meanings.

Then, last year, came Lisa Ruddick’s essay "When Nothing Is Cool," a hand grenade lobbed at her field.
I didn't catch that last reference at first. "A tortured milquetoast epistle".

It gets worse.
...That question of attachments — to novels and films, paintings and music — is at the heart of Felski’s next book. She operates from the premise that people’s everyday experience of art is much more mysterious than commonly thought. Consider the story of Zadie Smith’s changing relationship to Joni Mitchell. The novelist once dismissed Mitchell’s music as, in Felski’s words, "a white girl’s warbling." Then one day Smith could no longer listen to Mitchell’s songs without crying. Why? To think about such questions, Felski draws on the philosophical tradition of phenomenology, looking closely at first-person experience. So, in that musical epiphany, Smith is in her 30s. She and her husband are driving to a wedding in Wales, with Mitchell playing on the car radio. They bicker. They spend an afternoon at Tintern Abbey, where Smith gazes out at the green hills. And suddenly she’s humming Joni Mitchell. Felski writes about the way such different strands of experience come together to shape perceptions of art.

"Our attitudes to artworks are much more unpredictable and surprising than a lot of social theories allow for," she says. "And therefore we need to look at these specific examples of a relationship to an artwork. A lot of specific examples are going to explode our theories rather than confirm them."
The academic "discovery" of Zadie Smith is perverse of course. We're begun to move on from Shalizi and Moretti, but just slightly.

continuing in the next post.