Tuesday, January 08, 2013
R.I.P. Ada Louise
I never met Ada Louise Huxtable, though for a brief period we were both writing for the New York Times. She even quoted me in one of her books, and I certainly quoted her in The Malling of America; she was one of that book's guiding lights. She died this week at the age of 91. Here's her LA Times obit.
Architecture was her beat at the Times--she was the paper's first real architecture critic--and she was up there with Jane Jacobs and Lewis Mumford in reshaping New York City, and in reshaping how architects and planners thought about cities and how people live in them. She shaped criticism itself as a journalistic pursuit involving reporting, scholarship and taste.
In particular I found her work revelatory on the South Street Seaport in New York, when it was about to host another urban "marketplace" mall built by the Rouse Company, after its successes on the waterfronts of Boston and Baltimore.
At retirement age she instead became the architecture critic for the Wall Street Journal. She also wrote books (I'm footnoted in The Unreal America.) Her influence peaked in the 60s, but her work inspired me in the 80s and I'm sure is inspiring others right now. There's no reason it won't for a long time to come. May she rest in peace, and her work live on.
Wednesday, January 02, 2013
Big Deal on Ross Street
After a Christmas that involved waiting out cyclonic winds before setting forth on a six-hour car trip, and a return on a bus with a driver who wasn't really sure where we were going--aided by what might be described as in-person crowdsourcing--New Year's was quiet. New Year's Eve I was the only one awake as usual, watching Charlie Rose ask inane questions about Shakespeare that nevertheless elicited interesting answers, when I heard a few distant firecracker pops to note the passage of the old year. Then the Day was spent in an epic Scrabble game and watching a lovely old Italian comedy I don't think I've seen since college, Big Deal on Madonna Street. It's a kind of neo-realist comedy, a 1958 spoof of caper films, with a working-class crew in Italy. Their goal is a pawn shop safe but they wind up with pasta fazol. Marcello Mastroianni and Claudia Cardinale are part of the ensemble. A nice way to end the day and start the year.
Wednesday, October 24, 2012
True Blue
It turns out that "true blue" has a colorful meaning. In the New York Times:
For the French Fauvist painter and color gourmand Raoul Dufy, blue was the only color with enough strength of character to remain blue “in all its tones.” Darkened red looks brown and whitened red turns pink, Dufy said, while yellow blackens with shading and fades away in the light. But blue can be brightened or dimmed, the artist said, and “it will always stay blue.”
And that blue fascinates more than me. The Times story by Natalie Angier goes on:
Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency. They’re exploring the physics and chemistry of blueness in nature, the evolution of blue ornaments and blue come-ons, and the sheer brazenness of being blue when most earthly life forms opt for earthy raiments of beige, ruddy or taupe.
All that and more. Blue by the way is the color of the throat chakra. Blue voice. How about that for metaphorical fluency?
The artist of blue most people think of is Picasso and his Blue Period (brought on by the combination of a friend's suicide and--at least according to Gertrude Stein--the fact that someone gave him a lot of blue paint, and he was too poor to buy other colors.)
But I like blues in Klee, Van Gogh, O'Keeffe, Monet, Severini, Morris Graves, and Rene Magritte. Magritte paints blue skies, usually in that mysterious luminous blue that partakes of both day and night. It is the blue of dawn and the blue of dusk. And one of the features of these paintings I love is, you can't tell which it is. Everything is the dawn of something, and the dusk.
I wear a lot of blue. I enjoy it. Beginning with its third season, the George Reeves Superman TV series of the 50s was filmed in color, even though it would be shown in black and white for the next decade or so. When color TV became more widely available in the 60s, the series had a big revival. But because it was filmed in color yet shown principally on black and white TV, almost everybody in Superman wore blue--I assume that was the reason. All shades and patterns of blue. Blue sweaters with blue suits. (And they didn't seem to change their outfits very often.) Even the cars tended to be blue. I've got these on DVD. It's my kind of world.
Superman wore blue. Doctor Who in his blue box. Spock in his blue uniform. Now that new BBC Sherlock wears blue. But it probably all comes down, or up, to that blue sky. The blue ocean. This blue planet. This blue voice.
Sunday, September 16, 2012
The Amity of Influence
In a nifty collection of interviews (Talking Music), William Duckworth asks composer Lou Harrison why he brought such divergent influences into his own work. Harrison says it wasn't conscious, but the reason he does give is forthright, and true at times for most creative people.
DUCKWORTH: How did they get in there?
HARRISON: Well, because I loved them. "Me, too," that's the idea. If I like something I want that too. It's greed--that's the basis of it.
Wednesday, September 12, 2012
September Song
Martin Amis is a writer I've admired mostly from afar. I've enjoyed the novels I've read and the non-fiction collection about the 80s, The Moronic Inferno, a title that describes the 80s and a lot of the ever since. But I haven't read a lot of his work, for often his most urgent concerns are not mine--at least not of the same moment.
Maybe it's just that his life has been so different from mine. But he was quoted making an observation that I've not only never read anybody else making, I've never heard anyone else say. He was describing something that happens to him that I thought, for all intents and purposes, happened only to me.
He said that he is often caught off guard by a memory that attacks him with regret and chagrin, seemingly out of the blue, just walking down the street or in any daily situation. In fact, I referenced this on this very blog:
Several years ago I was pleased to hear novelist Martin Amis admit that small regrets hit him suddenly every day, to the point that they stop him in his tracks, literally, as he walks down the street, and he involuntarily winces and mutters to himself because of some small memory that emerged with the peculiar force of shame and the pitiless, bottomless thump of regret. I was pleased because I thought I was the only one this happened to.
Now he's done it again, in a recent interview (published at Smithsonian online and flagged by Andrew Sullivan's site.) He has identified something I am dimly aware is happening to me--that in recent days I've become more conscious of. Here's what he said:
"Your youth evaporates in your early 40s when you look in the mirror. And then it becomes a full-time job pretending you’re not going to die, and then you accept that you’ll die. Then in your 50s everything is very thin. And then suddenly you’ve got this huge new territory inside you, which is the past, which wasn’t there before. A new source of strength. Then that may not be so gratifying to you as the 60s begin [Amis is 62], but then I find that in your 60s, everything begins to look sort of slightly magical again. And it’s imbued with a kind of leave-taking resonance, that it’s not going to be around very long, this world, so it begins to look poignant and fascinating.”
Yes, there is that "huge new territory inside" which is "the past." But especially, "in your 60s, everything begins to look sort of slightly magical again."
It does. It's a bit easier to appreciate the moment. I'm very aware that this is a golden time--I'm reasonably healthy, I am without physical pain, temporarily secure--well, secure in the sense that it is certainly all temporary. But it is, right now. And the day is easier to appreciate. People, relationships that are good--and the blessings I have here, of this lovely air, especially in the sunny autumn of the North Coast. It is fascinating and it is poignant, and it's sharpened by the awareness not only that it will all soon end, but that you don't know when it will start ending, or how.
Friday, August 24, 2012
Accessing the Slow
In June Margaret and I were down in Menlo Park, visiting her daughter, son-in-law and grandson, who had just turned a year and a half old. Besides hanging out with him--even at this early age he demonstrated good taste in taking a shine to me--I spent pleasant hours at a fine cafe called Cafe Borrone. Very good coffee, very good food, excellent staff and a great atmosphere, especially in the large outdoor plaza area pictured above. I snapped that photo in the lull before the late afternoon crowd, which I was around to see. I was there at Sunday brunch time as well, so I saw a fair number of people.
The cafe is close to Stanford University, in the vicinity of the Google and Facebook headquarters and lots of other tech-related firms. I'm sure some of the young crowd sipping beers or coffee were worth millions, or soon would be. So this has to be one of the most tech-savvy places in the world. But what struck me was how few of them in that environment were plugged into cell phones, smartphones, tablets or laptops. I saw far less of that than on the HSU campus.
I did see people reading newspapers and books. I saw a woman using a pen and writing on a paper tablet. The cafe is itself adjacent to a bookstore.
It seemed to me that these people had restored some balance to their lives. Electronics have found a place, doing what they do best, but the slower media still have their functions. I could be projecting here, but it gave me some hope that people who are most familiar with these devices are not enslaved by them. And they can still enjoy simple conversation with people actually present, or a quiet newspaper or book with a cup of coffee in the sun, as have many generations before them.
Sunday, August 19, 2012
Bird in the Moonlight
"The artist struggles against indifference, yet anonymity is a protection. When he becomes known he becomes vulnerable. In what manner do we catch the eye of Polyphemus and become recognized as an individual and not one of his sheep?"
Morris Graves by Frederick S. Wight, John I.H. Baur and Duncan Phillips
painting: "Bird in the Moonlight" by Morris Graves
Wednesday, August 01, 2012
Selling Ourselves
In a very trenchant essay on why (contrary to common belief) there really isn't much innovation anymore, David Graeber expresses a frustration that has deeply affected my life and those of others near and dear. Graeber's essay in The Baffler says that real innovation and invention slowed in about 1970, and what has passed for technological breakthroughs since then are mostly recombinations of existing technologies fashioned into marketable products.
His thesis, very briefly, is that political, consumer-driven and bureaucratic priorities have dominated and stifled scientific research. He may also have put his finger on what has stifled artistic and intellectual breakthroughs as well. In any case, he describes a context that I've observed as well--though (like the previous post) I've felt I've sounded crazy for my solitary grumbling. He writes:
"What has changed is the bureaucratic culture. The increasing interpenetration of government, university, and private firms has led everyone to adopt the language, sensibilities, and organizational forms that originated in the corporate world. Although this might have helped in creating marketable products, since that is what corporate bureaucracies are designed to do, in terms of fostering original research, the results have been catastrophic.
My own knowledge comes from universities, both in the United States and Britain. In both countries, the last thirty years have seen a veritable explosion of the proportion of working hours spent on administrative tasks at the expense of pretty much everything else. In my own university, for instance, we have more administrators than faculty members, and the faculty members, too, are expected to spend at least as much time on administration as on teaching and research combined. The same is true, more or less, at universities worldwide.
The growth of administrative work has directly resulted from introducing corporate management techniques. Invariably, these are justified as ways of increasing efficiency and introducing competition at every level. What they end up meaning in practice is that everyone winds up spending most of their time trying to sell things: grant proposals; book proposals; assessments of students’ jobs and grant applications; assessments of our colleagues; prospectuses for new interdisciplinary majors; institutes; conference workshops; universities themselves (which have now become brands to be marketed to prospective students or contributors); and so on.
As marketing overwhelms university life, it generates documents about fostering imagination and creativity that might just as well have been designed to strangle imagination and creativity in the cradle. No major new works of social theory have emerged in the United States in the last thirty years. We have been reduced to the equivalent of medieval scholastics, writing endless annotations of French theory from the seventies, despite the guilty awareness that if new incarnations of Gilles Deleuze, Michel Foucault, or Pierre Bourdieu were to appear in the academy today, we would deny them tenure.
There was a time when academia was society’s refuge for the eccentric, brilliant, and impractical. No longer. It is now the domain of professional self-marketers. As a result, in one of the most bizarre fits of social self-destructiveness in history, we seem to have decided we have no place for our eccentric, brilliant, and impractical citizens. Most languish in their mothers’ basements, at best making the occasional, acute intervention on the Internet."
To this I would only add that it is largely the case not only in academia but in the world of trade publishing. The writing of book proposals is more important than the writing of books, as is the marketing of books. It seems to be becoming true of university press publishing, and it is largely true already of e-publishing.
That doesn't mean good books don't get written and published anyway, and some of these books justly find their audience. It just takes more nerve, perseverance and luck.
Monday, June 25, 2012
Let There Be Light
I've been ranting about this for years, though you would have thought I was describing an alien abduction: the frequent--very frequent--poor projection in movie theatres, specifically the lack of light behind the image.
There have been a few articles about this from time to time--I recall one at least from the 1970s, if not before. But generally, people didn't seem to credit it. The technology of movie projection changed, but the problem didn't. The power of xenon bulbs was dialed back to save moviehouse managers money, resulting in a dimmer image. (Now I learn that it didn't even save them money!) But even with today's digital tech, the problem persists--and has drawn the ire of somebody who matters, legendary film critic Roger Ebert:
The most common flaw is that the picture is not bright enough. I've been seeing that for a long time.
And Ebert quotes another and perhaps greater expert: "Yet when Martin Scorsese used people around the country to actually check theater brightness, he found most of the theaters involved were showing an underlit image."
This is still happening, as the Boston Globe and Ebert articles attest, and for the same reason: to save money. What these cynical managers are counting on is ignorance-- that people have never seen a properly projected movie. And probably most people have not.
It was a fairly long time before I did, although I don't think the problem was as serious when I started going to the movies as a child in the 1950s. But a few experiences--movies at the Orson Welles in Cambridge in the mid-70s, or even earlier, an eye-popping sparkling new black and white print of Doctor Strangelove at a cinema in San Francisco in 1966--showed me what an illuminated experience it could be.
Those images should be bright, because they are supposed to be. But they hardly ever are. It only takes the first few seconds to know what misery I'm in for--if I can see through the white letters in the title sequence, I inwardly and even sometimes audibly groan. Those white letters should be solid and they should shine.
Now the dim images have literally driven me from the movies--I simply don't go anymore. And Ebert finally informs me that I'm not alone:
" When people don't have a good time at the movies, they're slower to come back. I can't tell you how many comments on my blog have informed me that the writers enjoy a "better picture" at home on their big-screen TVs with Blu-ray discs. This should not be true."
Hey, I enjoy a better picture on my regular old TV with ordinary DVDs. It's that bad.
That first article I read years ago suggested that moviemakers were unaware of how badly projected their movies are because they only see them in cinemas near Hollywood which cater to movie industry clientele, especially around Academy Award time. But even Oscar viewing is not enough to guarantee that movies these days are properly projected, as Ebert found.
I've gotten used to seeing movies a year or more after they've been in theaters. But every once in a while I'd like to see one right away, on the big screen, in an environment where I once almost literally lived: a cinema. But I almost never do now.
I don't expect that to change. So the biggest outcome of these articles for me is vindication. Not a lot of solace, but then, that's how it is these days.
Tuesday, May 29, 2012
In a Name
A well-named character is important in fiction. We remember Babbitt not only because of what Sinclair Lewis wrote about him, but because George Babbitt is the perfect name for the character.
Some novelists make up names that are right for the character but also funny, outrageous. Charles Dickens was a master of this. For the perplexing complexity of our time, the contemporary master is acknowledged to be Thomas Pynchon, though Kurt Vonnegut played a little, Don DeLillo has indulged--particularly in one of his least loved novels, Ratner's Star (Elux Troxl, Mimsy Mope Grimmer, Desilu Espy and the punning U.F.O. Schwarz and Bhang Pao)--as has Jonathan Lethem in Chronic City.
But as Philip Roth famously wrote, nobody could make up Richard Nixon, and reality is currently impinging on this wonderland of names. What fictionist could come up with a chair of the Republican National Committee named Reince Priebus? Or the head of SpaceX, the billionaire Elon Musk? (Well, Elux Troxl is awfully close.) Or another billionaire who funds green energy initiatives, an Indian businessman named Ratan Tata?
I'm not saying we should laugh at those names (not that anyone is going to laugh at the name of a billionaire.) But they do seem like the kind of names these outrageous fictionists would invent. Now they don't have to. I'm not sure I'm reassured by that.
Sunday, May 20, 2012
Today's Prayer
"It occurred to me that lots of people have to sit through meetings every day, and I said a prayer for them as you would for those lost at sea."
Tim Kreider, one of many sadly funny lines in an illuminating testimony in the NYTimes to the cognitive dissonance of writing v. publishing a book in this new Youtubian Twitterverse.
Saturday, March 24, 2012
What's the Story?
When I grow up I still want to be a novelist. And a playwright. But the kind of storytelling I've mostly done--journalistic storytelling-- remains a satisfying form. I've realized that in a few ways recently.
One of the primary ways we think we know things is through the findings of science. While much of that is accurate in a practical sense, a lot of it pretends to be sure of more than it actually can be sure of. This is most evident in the lesser sciences such as economics and psychology, at least as they are practiced today. Through narrow and dubiously designed experiments, psychologists pretend to be able to say all sorts of things about human behavior, human nature and how brains work. These assertions get these folks tenured positions, books and TV interviews in which they purport to know a lot more than they do. Their arrogance is amazing. Especially since they must ignore the limitations of their experiments that have been defined repeatedly, most recently in a book I've just started reading by Jerome Kagan, called Psychology's Ghosts.
But I don't need to refer to this thoughtful and eminent authority--these limitations are maddeningly clear to me. For one thing, psychologists purport to say universal things based on "experiments" involving the behaviors and responses of mostly North American university undergraduates who volunteer to be subjects, perhaps for small amounts of money. But there are broader objections. I think especially of a panel discussion--in Seattle, I believe, but carried on C-Span--involving Jane Jacobs, who to my mind was one of the great minds of the 20th century, upon the publication of her final and prophetic book, Dark Age Ahead. The friendly host made an offhand comment, saying something to the effect that although much of her evidence was "anecdotal," it was nevertheless intriguing. It was not long before he was sorry he'd said it. She homed in on this point with great precision.
Because it's a common charge: the only true evidence is scientific observation, especially from experiment, or statistics. Other sources, such as "anecdotal" or self-reported (introspective) evidence, are less reliable. Jane Jacobs strongly disagreed, and she was so eloquent that I went back and transcribed what she said:
“Our science began comparing two things to each other, [to find things like] the temperature needed to melt water. That was science for centuries. It’s easy to compare two things, it’s much more difficult to compare 3 or more. It’s “bivariant comparison”—it's very reductive, you have to leave out everything but 2 things, and real life is not like that. Real life attaches anything to everything else. We have begun to learn that in biology...In due course along came ‘disorganized complexity,’ like insurance actuarial tables, or marks a child gets in school—anything that is explained in a graph with a bell shape, or seems to be explained by statistics. Things belonging in this class are based on the law of averages. So many important things are left out in these comparisons, too. They are not really very clarifying in most cases, and are actually counterproductive in a good many others.
How has the human race been getting along all these centuries and millennia, sufficiently well with such bad inputs from the real world? I think we have been doing it with stories. Stories are a means of showing how everything is attached to everything else. Our stories are based on these multiple attachments, and what they mean. We love stories, as human beings. It’s the way we understand the world, very largely. One trouble is, instead of respecting our own intuitions about these things and our own abilities to analyze them and appreciate them, we have only? as some kind of second class intellectual operation. Scientists themselves despise anecdotal evidence, and everything that is a story is called anecdotal evidence, and not valued, and yet you can’t understand the world without anecdotal evidence. Scientists may think a.e. is not important enough to occupy their time, but I think a.e. is important enough to occupy anybody’s time, beginning with very ancient poetry and up to film documentaries, which we’re learning to value more than we used to.”
Novels and other forms of fiction employ storytelling often without reference to facts derived from science (experiment, statistics, etc.), though these may be implied (and in some genres, like science fiction, employed directly, though perhaps extended beyond the current science.) But these "scientific" forms disdain story (although not in explaining themselves, when they most often employ metaphor.) Journalistic storytelling combines "scientific" fact and anecdotal information.
Further, journalistic storytelling depends on testing both kinds of information. For instance I might ask experts (and often, different kinds of experts) the same question and see how many sets of facts I get. When the facts--the numbers, the interpretations--start to converge, or at least the disagreement among them sharpens--then I know how to evaluate and use these facts. The facts are also tested by what people say, by anecdotal and introspective evidence, which in turn are tested by the facts. The interplay of all of them is part of the story, and sometimes the story itself.
From the experience of high school debate as much as the anticipation of being caught out in print, I learned to be skeptical of facts and how they were ascertained. From reading and writing fiction and plays, I learned elements of storytelling. These combinations make this a form that is somehow more comprehensive. Journalistic storytelling may not have the depth or resonance or provide the aesthetic pleasure of great novels or great drama. But it does have an aesthetic that I've found pleasing to work with.
Tuesday, January 03, 2012
Write to Life
Reviewing a biography of Kurt Vonnegut has prompted more thinking about the difficulty, if not impossibility, of writing to the highest standards while being a decent human being who is fair to others. Vonnegut's biography is one of several recent literary exposés (of Joseph Heller, J.D. Salinger, and Hemingway again) that emphasize how bad these writers were at life: Hemingway as a phony macho and a self-promoting liar, Heller as a bad father, and Vonnegut as a sad and bitter man who betrayed friends, abused one wife (and was abused by another), and scared children. He, like the others, was not like his public image. That's the big revelation, supposedly.
(I should say immediately that there is plenty of counterevidence in this biography of Vonnegut's kindnesses, loyalty, etc. But on the whole it seems to emphasize the flaws.)
Vonnegut brings this topic to me in a peculiarly personal way. I was in precisely the generation of young readers in the late 60s that made him famous, and my admiration was as a writer as well as a reader. Vonnegut had taught at the Iowa Writers Workshop just a few years before I entered there. My fiction teacher in my senior year of college had been one of his students. So the evocation of that place and that time in this biography has particular resonance.
Those years turned out to be the tail end of an era--roughly Hemingway to Vonnegut and Heller--in which the novelist was an important and influential figure. This was also a time when people drank a lot, especially writers. And they smoked a lot, especially writers. It was during the so-called sexual revolution, and before the consciousness-raising--and that's what it was--prompted by the women's movement. But even within a context where drinking, smoking and philandering were common--and even more expected among writers--there was behavior that stood out as troubling, awful, even scary. So even before Iowa (at a few writing conferences, or on campus during writers' visits) I had seen professional writers behaving badly. I saw writers who were abusive drinkers, sexual predators, liars and cowards. What I didn't see I heard, because writers could also be vicious gossips.
So even as I saw some of these writers as role models, I was troubled. The drinking and smoking were just exhausting and debilitating, though it was a long time before I gave them up, since they could occasionally lead to some memorably wild evenings--as when I found myself playing blues piano with novelist Vance Bourjaily on slide trombone. But patterns of deceit bothered me, and cruelty repulsed and frightened me. I didn't want to become that. Then there were the questions of irresponsibility.
Some of this behavior made me lose respect for these writers, and question the validity of their work. I think that's inevitable, even at a distance-- when you read that writers were cruel, it casts a pall over the writing. But as a reader, it's ultimately the words on the page that make the difference in our lives. As a writer, a beginning writer, it raises questions of identity, and what kind of a life is possible.
For even though character flaws are involved, the necessities of writing itself come into play. I also learned from writing how hard it is to maintain a balance. The world being created on the page is a very different world from the one populated by real people, in which actions have actual consequences that can't be corrected in the next draft. I know a writer now, with several very well-respected novels, who said he gave up writing novels because it was too hard on his family and his relationships. The sensitivity required of a writer writing leaves a painful vulnerability to reality, while the discipline and standards of writing so intensely can feed a monstrous impatience with the world, for its imperfections and shoddy standards, as well as its conspiracy designed to destroy your concentration.
Despite my call-me-irresponsible bravado, that question of responsibility, and doubts that I could be a writer and also responsible, was a major reason I never married. Mostly it was my inability to be financially responsible while still pursuing a vocation as a writer, but that alone was so hard that it became overwhelming to consider adding a house and children. (Vonnegut knew this--he used to say that one reason there are so many gay men in the arts is that it takes so long to establish a career that marriage and family are unaffordable.) But there was also the doubt that I could be emotionally and personally responsible both to relationships and to my writing. So this is in part why I fell into that statistically insignificant class of hetero men who never married.
I saw what the costs to a family might be. In being reminded of those days, I realized that I had been on that particular career track, and that through a combination of my own actions and the actions of women who saw all this more clearly than I did, I left it.
In the end I suppose the joke was on me, for I didn't have much success, especially success on the page, especially in my most cherished forms of fiction and plays. I'll never know what my refusal to sacrifice others to my writing struggles contributed to that, but a less than fierce and selfish dedication to that writing above all may well have contributed. I expect it still does.
It's not that I've been the height of responsibility, far from it. And as a factor in my failures this may be a self-serving delusion. But when I look back on what the writer's life was supposed to be and sometimes was, I'm glad I dodged that bullet. Like a lot of my accomplishments, this one is of what I didn't do, of pain not caused.
To some extent that was a road deliberately not taken. And to perhaps a greater extent, it remains so. I suppose if I still felt my gift was so great I'd make different choices. And I don't presume to judge those whose accomplishments are greater. Even at a distance, I found the Vonnegut biography shockingly reductive. What's in it (if it's proportionately true) may be another side of the story, but it isn't the whole story. He may have been different in life than he was in interviews and books, but the interviews and books are a big part of the story. Maybe no writer can live up to the ideals in the work, in the wit and inspiration of the words. But we can share those ideals and aspirations.
Thursday, October 06, 2011
R.I.P. Sam, Bill and Fred
Three college teachers who were important to me have passed away in the past few months. The most recent was Sam Moon, the professor who sought me out and first talked to me as I was registering for classes at Knox College. I was there on the Scholastic Magazines Writing Awards Scholarship, and he was head of the writing program (which he pretty much invented.) I can see his face across the table right now.
He wanted to talk to me before I registered in case I thought I had to take writing courses because of my writing scholarship. He said I didn't, and probably would have gotten a scholarship anyway. I had read and re-read the brochure on the writing program the previous spring, and then all summer. It was one of the main reasons I chose Knox (I had a scholarship to the University of Pittsburgh as well.) So I told him I actually wanted to take writing courses. He seemed very pleased.
I did take several writing courses from him, and he was always willing to read and discuss a manuscript even when it wasn't part of a course. He listened well, asked questions, tried to get inside your thinking. But he also let you know what he thought, in no uncertain terms. I don't remember much that's specific, just odd moments of classes or in his office. I believe--and this applies to other teachers--that a lot of what we learn from teachers that turns out to be most important is what we absorbed without realizing it, or at least without realizing and remembering where we learned it.
What I do associate most with Sam Moon, however, is the bounty of writers and others who spent time at Knox while I was there. Many of those specific individuals were there because of Sam Moon. He was himself a poet who published regularly in Chicago's prestigious Poetry magazine and elsewhere. He had the respect of writers and he was connected to their world.
So in my time I saw and heard poets from Mark Van Doren to David Ignatow. I had lunch with W.H. Auden! Gary Snyder came pretty much directly from Japan for a week one spring, and read for hours every day. We waited for him in Old Main, listening for the bells in his boots jingling down the stone hallway. Nobody who was there will ever forget the week that Robert Creeley visited, or (for other reasons) James Dickey. Robert Bly came to read at least twice. These poets changed us. John Cage came to campus several times, as did dancer Merce Cunningham--and once they were there at the same time, as the Cunningham group performed a piece by Cage. Grace Paley read her stories. They are part of my college memories--bringing tea to Denise Levertov, talking about the New York World's Fair on the Gizmo patio with John Cage ("I liked the lines," he said.) Others probably had something to do with bringing these people to Knox, but it was pretty much Sam.
He was also possibly the last Knox teacher I talked to in his Old Main office, when I visited campus in the 1980s. I'd been out of school 15 years or so by then, but he recognized me standing uncertainly at his office door, as a group of undergrad writers surrounded him at his desk. We had a coffee in the Gizmo.
He retired soon after that (I remember writing something for a book to be presented to him at his farewell dinner), and I was surprised to see that he then left Galesburg as well. He had another life, another quarter century somewhere else, in New York state. He's buried in Ontario. Our teachers are always something of a mystery to us, as young as we were, but I'll bet Sam was more of a mystery than most.
William Matthews was pretty much the entire Religion department when I was at Knox. I don't think I ever had a class with him, but for some reason he liked things I wrote for the newspaper and the campus magazines. People would tell me that he quoted them enthusiastically in his classes. I was embarrassed, since it seemed to compromise my lapsed-Catholic dogmatic anti-religion. I did talk with him from time to time, but again, I couldn't feel comfortable, thanks to 12 years of priests and nuns. I do wonder if he had anything to do with bringing another speaker to campus, who had a profound effect on me. I don't remember what he was actually talking about, there in the Commons Room of Old Main, but he made one offhand comment that reoriented me completely: he noted that after killing an animal, a Native American hunter would say a prayer thanking the animal. Eventually this moment would send me down a different road, spiritual and otherwise.
I recently learned from a fellow student who got back in touch after a very long time that Fred Newman died several months ago. Fred was a philosophy professor who changed more than my life in his time at Knox. I had only one class with him, and knew him for no more than a year or so. I could write pages on that spring of my freshman year, and its impact on me for years following.
But I completely lost track of him after Knox, and though I had seen his name from time to time--not usually in flattering contexts, as a kind of New York political eccentric--I was unaware of the extent, nature and influence of his work over the years. Which is kind of astonishing, since I used to spend a fair amount of time in New York in worlds that touched upon his. He was political, a "public philosopher", a playwright and songwriter with dozens of productions, etc. I learned most of this from his website. It even has sound files of him lecturing--talk about a blast from the past!
I'm sorry that I wasn't aware of this while he was alive. Then again, he remained a charismatic figure well beyond Knox, so it's unlikely I would have wanted to go through that exhausting experience again. From his obits I learned something else: that he lost his college teaching jobs after Knox because he insisted on giving all his male students A's, because of the draft. I wish I'd known that. I'd probably need pages more to explain that to those who weren't young men then, but it's something else I will always admire about Fred Newman.
May they rest in peace--Sam, Bill, Fred, if I may call them that. I probably can.
Wednesday, September 07, 2011
Hillman on the Author's Battlefield
I was never in any military, and although I played at war endlessly for a few years as a boy, I never developed much interest in military vocabulary. But James Hillman was in the U.S. Navy during World War II, tending to amputees and other severely injured men in hospitals. His last book (so far) was A Terrible Love of War, which, according to the paperback's back cover, somebody in the San Francisco Chronicle called "A skillfully constructed tour de force." Oh yeah, that was me. And yes, it was a pun, kind of.
Anyway, that's a long way to get to this slyly revealing passage in that book, in which Hillman talks about writing in military terms and concepts--not how I've ever thought about or approached writing, but worth considering.
"Writing books for me is anyway much like a military campaign. I confess to fighting my way through with military metaphors. There is a strategy, an overall concept, and there are tactics along the way. When stuck, don't dig in; keep moving forward. Don't obsess trying to reduce a strongpoint by sheer force or laying seige. Isolate it and in time it will fall by itself. No pitched battles with the interior voices of saboteurs, critics, adversaries. A light skirmish, a show of arrows, and disappear into the next paragraph. Camouflage your own vulnerability, your lack of reserves with showy parades and bugles---remember everyone else is equally vulnerable. Pillage the storehouses of thought, refurbish old material and use it to reinforce your lines. Abandon ground you can't exploit, but when you've got an issue on the run, take all the territory you can."
Wednesday, July 27, 2011
For Ralph Waldo Emerson (that's his Concord house in the photo), the crucial unit of living time was the day. He was inspired by what could be accomplished or revealed in each new day, and frustrated by the lost hours, the plod of seemingly wasted days.
"Days" is probably his most famous poem:
Daughters of Time, the hypocrite Days,
Muffled and dumb like barefoot dervishes,
And marching single in an endless file,
Bring diadems and fagots in their hands.
To each they offer gifts after his will,
Bread, kingdoms, stars, and sky that holds them all.
I, in my pleached garden, watched the pomp,
Forgot my morning wishes, hastily
Took a few herbs and apples, and the Day
Turned and departed silent. I, too late,
Under her solemn fillet saw the scorn.
So in some real sense, the struggle to fulfill the potential of the day is the essential creative struggle. And it is renewed...every day. Elsewhere he wrote: "The days are gods. That is, everything is divine." In his great biography, Emerson: The Mind on Fire, Robert D. Richardson virtually outdoes the eloquence of his subject, on this subject:
"The personal consequences of such perceptions was an almost intolerable awareness that every morning began with infinite promise. Any book may be read, any idea thought, any action taken. Anything that has ever been possible to human beings is possible to most of us every time the clock says six in the morning. On a day no different from the one now breaking, Shakespeare sat down to begin Hamlet and [Margaret] Fuller began her history of the Roman revolution of 1848. Each of us has all the time there is; each accepts those invitations he can discern. By the same token, each evening brings a reckoning of infinite regret for the paths refused, openings not seen, and actions not taken."
But the essence of it is "Each of us has all the time there is," and "each accepts those invitations he can discern." This is beyond the irksome questions of "time management," or the conflicting demands, needs and temptations, as well as the falsely promising dead ends. On good days, one may forgive the lapses, knowing that even apparently wasted time may contribute to something that arrives unexplained and redeems the day in a flash. On bad days, the temptation of course is to wallow in that possibility. The "divine dissatisfaction" jockeys with receptivity and acceptance, as the questions narrow with the numbered days.
"Days" is probably his most famous poem:
Daughters of Time, the hypocrite Days,
Muffled and dumb like barefoot dervishes,
And marching single in an endless file,
Bring diadems and fagots in their hands.
To each they offer gifts after his will,
Bread, kingdoms, stars, and sky that holds them all.
I, in my pleached garden, watched the pomp,
Forgot my morning wishes, hastily
Took a few herbs and apples, and the Day
Turned and departed silent. I, too late,
Under her solemn fillet saw the scorn.
So in some real sense, the struggle to fulfill the potential of the day is the essential creative struggle. And it is renewed...every day. Elsewhere he wrote: "The days are gods. That is, everything is divine." In his great biography, Emerson: The Mind on Fire, Robert D. Richardson virtually outdoes the eloquence of his subject, on this subject:
"The personal consequences of such perceptions was an almost intolerable awareness that every morning began with infinite promise. Any book may be read, any idea thought, any action taken. Anything that has ever been possible to human beings is possible to most of us every time the clock says six in the morning. On a day no different from the one now breaking, Shakespeare sat down to begin Hamlet and [Margaret] Fuller began her history of the Roman revolution of 1848. Each of us has all the time there is; each accepts those invitations he can discern. By the same token, each evening brings a reckoning of infinite regret for the paths refused, openings not seen, and actions not taken."
But the essence of it is "Each of us has all the time there is," but "each accepts those invitations he can discern." This is beyond the irksome questions of "time management," or the conflicting demands, needs, temptations as well as falsely promising dead ends. On good days, one may forgive the lapses, knowing that even apparently wasted time may contribute to something that arrives unexplained and redeems the day in a flash. On bad days, the temptation of course is to wallow in that possibility. The "divine dissatisfaction" jockeys with receptivity and acceptance, as the questions narrow with the numbered days.
Monday, June 06, 2011
"All my writings may be considered tasks imposed from within; their source was a fatal compulsion. What I wrote were things that assailed me from within myself. I permitted the spirit that moved me to speak out. I have never counted upon any strong response, any powerful resonance, to my writings. They represent a compensation for our times, and I have been impelled to say what no one wants to hear. For that reason, and especially at the beginning, I often felt utterly forlorn. I knew that what I said would be unwelcome, for it is difficult for people of our times to accept the counterweight to the conscious world. Today I can say that it is truly astonishing that I have had as much success as has been accorded me--far more than I ever could have expected. I have the feeling that I have done all that it was possible for me to do. Without a doubt that life work could have been larger, and could have been done better; but more was not within my power."
"A book of mine is always a matter of fate. There is something unpredictable about the process of writing, and I cannot prescribe for myself any predetermined course."
"My life has been in a sense the quintessence of what I have written, not the other way around. The way I am and the way I write are a unity. All my ideas and all my endeavors are myself. Thus the 'autobiography' is merely the dot on the i."
"In Bollingen, silence surrounds me almost audibly, and I live 'in modest harmony with nature.' Thoughts rise to the surface which reach back into the centuries, and accordingly anticipate a remote future. Here the torment of creation is lessened; creativity and play are close together."
Carl Jung, who died on June 6, 1961, 50 years ago today. [See also this post and its links.]
"A book of mine is always a matter of fate. There is something unpredictable about the process of writing, and I cannot prescribe for myself any predetermined course."
"My life has been in a sense the quintessence of what I have written, not the other way around. The way I am and the way I write are a unity. All my ideas and all my endeavors are myself. Thus the 'autobiography' is merely the dot on the i."
"In Bollingen, silence surrounds me almost audibly, and I live 'in modest harmony with nature.' Thoughts rise to the surface which reach back into the centuries, and accordingly anticipate a remote future. Here the torment of creation is lessened; creativity and play are close together."
Carl Jung, who died on June 6, 1961, 50 years ago today. [See also this post and its links.]
Saturday, May 14, 2011
Genius Within: The Inner Life of Glenn Gould
White Pine Pictures
Nearly 30 years after his death, pianist Glenn Gould remains an icon. He's one of the few classical musicians revered by peers who also have awestruck fans with no expertise in that area of music. I'm one of those. I've collected his recordings, read two biographies, and listened to few pieces of music--and no classical piece--more than Gould's 1981 recording of Bach's Goldberg Variations.
There are a number of videos about Gould that are limited, bathetic or outright rip-offs. There is also a fine fictional film based on Gould's life, Thirty-Two Short Films About Glenn Gould by Francois Girard (it took me years to catch on that the number corresponds to the number of movements in the "Goldberg": this film had to have been called The Glenn Gould Variations at some point.) And a 2006 film I haven't yet seen, Hereafter, by Gould confidant Bruno Monsaingeon, which adds dramatizations and encounters with Gould fanatics to archival video and audio. But this film, Genius Within by Michele Hozer and Peter Raymont, is a satisfying biographical treatment with a fair amount of music. It's been on the festival circuit, and the DVD's extra interviews really contribute.
I happened to rent this along with the docu DVD on 1970s popular singer-songwriter Harry Nilsson, and they have provocative similarities. So instead of recapping the Gould story, I’ll suggest some of their commonalities as artists.
Artistic success is often a matter of being ready with the right skills at the right time and place. It also tends to be circumscribed in time, between the damage or obsessions that motivate, and the time when their effects accumulate.
Nilsson had a troubled childhood, Gould perhaps a lonely one. They both were successful in their 20s with a clear path to an artistic career, and they both violated career rules, to their benefit and at a cost. (Both notoriously avoided live performance for the recording studio. Gould felt touring drained imagination, causing him to "grow old very quickly. It's a dreadful life.") Drugs (alcohol and 70s recreationals for Nilsson, prescription drugs for Gould) arguably fueled their work and arguably hastened their demise.
Both men mesmerized their friends, and demanded much of them. Both explored other creative areas but returned to culminating works--Gould especially, with his second Goldberg Variations in the last year of his life. They were both hyperactive and intuited an early death, and both died in their early 50s.
But their legends can lead to distortions. One woman on this DVD reveals a romantic relationship, and implies a sexual one, with the reputedly virginal Gould. And Nilsson's later years became more balanced, with a happy family life.
Yet crucially they each followed internal guides, and apart from personal failings their unorthodox career decisions turned out to be artistically right. Perhaps this sentence from Thoreau, from the journal passage quoted in the previous post, applies not only to individual works but to the arc of a career: “We must walk consciously only part way to our goal,” Thoreau wrote, “and then leap in the dark to our success.”