Wednesday, October 24, 2012

True Blue


When did I start identifying with blue?  Blue eyes maybe. Blue blankets and other blue baby stuff for a boy.  It became at some point my favorite color and has grown in that regard.  So that now blue is an essential marker of me.  And my blue voice.

It turns out that "true blue" has a colorful meaning. In the New York Times:

For the French Fauvist painter and color gourmand Raoul Dufy, blue was the only color with enough strength of character to remain blue “in all its tones.” Darkened red looks brown and whitened red turns pink, Dufy said, while yellow blackens with shading and fades away in the light. But blue can be brightened or dimmed, the artist said, and “it will always stay blue.”

 And that blue fascinates more than me.  The Times story by Natalie Angier goes on:

Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency. They’re exploring the physics and chemistry of blueness in nature, the evolution of blue ornaments and blue come-ons, and the sheer brazenness of being blue when most earthly life forms opt for earthy raiments of beige, ruddy or taupe.

All that and more.  Blue by the way is the color of the throat chakra.  Blue voice.  How about that for metaphorical fluency?

The artist of blue most people think of is Picasso.  The Blue Period (brought on by the combination of a friend's suicide and--at least according to Gertrude Stein--the fact that someone gave him a lot of blue paint, and he was too poor to buy other colors).

But I like blues in Klee, Van Gogh, O'Keeffe, Monet, Severini, Morris Graves, and Rene Magritte.  Magritte paints blue skies, usually in that mysterious luminous blue that partakes of both day and night.  It is the blue of dawn and the blue of dusk.  And one of the features of these paintings I love is, you can't tell which it is.  Everything is the dawn of something, and the dusk.

I wear a lot of blue. I enjoy it.  Beginning with its third season, the George Reeves Superman TV series of the 50s was filmed in color, even though it would be shown in black and white for the next decade or so.  When color TV became more widely available in the 60s, the series had a big revival.  But filming in color for a show seen principally on black and white TV meant that almost everybody in Superman wore blue--I assume that was the reason.  All shades and patterns of blue.  Blue sweaters with blue suits. (And they didn't seem to change their outfits very often.)  Even the cars tended to be blue. I've got these on DVD.  It's my kind of world. 

Superman wore blue.  Doctor Who in his blue box. Spock in his blue uniform.  Now that new BBC Sherlock wears blue.  But it probably all comes down, or up, to that blue sky.  The blue ocean.  This blue planet.  This blue voice.

Sunday, September 16, 2012

The Amity of Influence

In a nifty collection of interviews (Talking Music), William Duckworth asks composer Lou Harrison why he brought such divergent influences together in his own work.  Harrison says it wasn't conscious, but the reason he does give is forthright and true at times for most creative people.

DUCKWORTH: How did they get in there?

HARRISON: Well, because I loved them.  "Me, too," that's the idea.  If I like something I want that too.  It's greed--that's the basis of it.

Wednesday, September 12, 2012

September Song

Martin Amis is a writer I've admired mostly from afar.  I've enjoyed the novels I've read and the non-fiction collection about the 80s, The Moronic Inferno, a title that describes the 80s and a lot of the ever since.  But I haven't read a lot of his work, for often his most urgent concerns are not mine--at least not of the same moment. 

Maybe it's just that his life has been so different from mine.  But he was quoted making an observation that I've not only never read anybody else make, but never heard anyone else say.  He was describing something that happens to him, something I thought, for all intents and purposes, only happens to me.

He said that he is often caught offguard by a memory of something that attacks him with regret and chagrin, seemingly out of the blue, just walking down the street or in any daily situation.  In fact, I referenced this on this very blog:

 Several years ago I was pleased to hear novelist Martin Amis admit that small regrets hit him suddenly every day, to the point that they stop him in his tracks, literally, as he walks down the street, and he involuntarily winces and mutters to himself because of some small memory that emerged with the peculiar force of shame and the pitiless, bottomless thump of regret. I was pleased because I thought I was the only one this happened to.


Now he's done it again, in a recent interview (published at Smithsonian online and flagged by Andrew Sullivan's site).  He has identified something I am dimly aware is happening to me--that in recent days I've become more conscious of.  Here's what he said:

"Your youth evaporates in your early 40s when you look in the mirror. And then it becomes a full-time job pretending you’re not going to die, and then you accept that you’ll die. Then in your 50s everything is very thin. And then suddenly you’ve got this huge new territory inside you, which is the past, which wasn’t there before. A new source of strength. Then that may not be so gratifying to you as the 60s begin [Amis is 62], but then I find that in your 60s, everything begins to look sort of slightly magical again. And it’s imbued with a kind of leave-taking resonance, that it’s not going to be around very long, this world, so it begins to look poignant and fascinating.”

Yes, there is that "huge new territory inside" which is "the past."  But especially, "in your 60s, everything begins to look sort of slightly magical again."

It does.  It's a bit easier to appreciate the moment.  I'm very aware that this is a golden time--I'm reasonably healthy, I am without physical pain, temporarily secure--well, with the sense that it is certainly all temporary.  But it is, right now.  And the day is easier to appreciate.  People, relationships that are good--and the blessings I have here, of this lovely air, especially in the sunny autumn of the North Coast.  It is fascinating and it is poignant, and it's sharpened by the awareness not only that it will all soon end, but that you don't know when it will start ending, or how.

Friday, August 24, 2012

Accessing the Slow

 
Students are back at Humboldt State University here.  Perhaps the novelty of seeing so many on campus after the summer opened my eyes to noticing yet again how universally they are wedded to their cell phones--walking or standing anywhere on campus, they are either talking or texting or staring at them.

In June Margaret and I were down in Menlo Park, visiting her daughter, son-in-law and grandson, who had just turned a year and a half old.  Besides hanging out with him--even at this early age he demonstrated good taste in taking a shine to me--I spent pleasant hours at a fine cafe called Cafe Borrone.  Very good coffee, very good food, excellent staff and great atmosphere, especially in the large outdoor plaza area pictured above.  I snapped that photo in the lull before the late afternoon crowd, which I was around to see.  I was there at Sunday brunch time as well, so I saw a fair number of people. 

The cafe is close to Stanford University, in the vicinity of the Google and Facebook headquarters, and near lots of other tech-related firms.  I'm sure some of the young crowd sipping beers or coffee were worth millions, or soon would be.  So this has to be one of the most tech-savvy places in the world.  But what struck me was how few of them in that environment were plugged into cells, smartphones, tablets or laptops.  I saw far less of it than on the HSU campus.

I did see people reading newspapers and books.  I saw a woman using a pen and writing on a paper tablet.  The cafe is itself adjacent to a bookstore.

It seemed to me that these people had restored some balance to their lives.  Electronics have found a place, doing what they do best, but the slower media still have their functions.  I could be projecting here, but it gave me some hope that people who are most familiar with these devices are not enslaved by them.  And they can still enjoy simple conversation with people actually present, or a quiet newspaper or book with a cup of coffee in the sun, as have many generations before them.     

Sunday, August 19, 2012

Bird in the Moonlight


"The artist struggles against indifference, yet anonymity is a protection.  When he becomes known he becomes vulnerable.  In what manner do we catch the eye of Polyphemus and become recognized as an individual and not one of his sheep?"

Morris Graves by Frederick S. Wight, John I.H. Baur and Duncan Phillips
painting: "Bird in the Moonlight" by Morris Graves

Wednesday, August 01, 2012

Selling Ourselves

In a very trenchant essay on why (contrary to common belief) there really isn't much innovation anymore, David Graeber expresses a frustration that has deeply affected my life and those of others near and dear.  Graeber's essay in The Baffler says that real innovation and invention slowed in about 1970, and what has passed for technological breakthroughs since then are mostly recombinations of existing technologies fashioned into marketable products.

His thesis very briefly is that political, consumer-driven and bureaucratic priorities have dominated and stifled scientific research.  He may also have put his finger on what has stifled artistic and intellectual breakthroughs.  In any case, he describes a context that I've observed as well--though (like the previous post) I've felt I've sounded crazy for my solitary grumbling.  He writes:

"What has changed is the bureaucratic culture. The increasing interpenetration of government, university, and private firms has led everyone to adopt the language, sensibilities, and organizational forms that originated in the corporate world. Although this might have helped in creating marketable products, since that is what corporate bureaucracies are designed to do, in terms of fostering original research, the results have been catastrophic.

My own knowledge comes from universities, both in the United States and Britain. In both countries, the last thirty years have seen a veritable explosion of the proportion of working hours spent on administrative tasks at the expense of pretty much everything else. In my own university, for instance, we have more administrators than faculty members, and the faculty members, too, are expected to spend at least as much time on administration as on teaching and research combined. The same is true, more or less, at universities worldwide.

The growth of administrative work has directly resulted from introducing corporate management techniques. Invariably, these are justified as ways of increasing efficiency and introducing competition at every level. What they end up meaning in practice is that everyone winds up spending most of their time trying to sell things: grant proposals; book proposals; assessments of students’ jobs and grant applications; assessments of our colleagues; prospectuses for new interdisciplinary majors; institutes; conference workshops; universities themselves (which have now become brands to be marketed to prospective students or contributors); and so on.

As marketing overwhelms university life, it generates documents about fostering imagination and creativity that might just as well have been designed to strangle imagination and creativity in the cradle. No major new works of social theory have emerged in the United States in the last thirty years. We have been reduced to the equivalent of medieval scholastics, writing endless annotations of French theory from the seventies, despite the guilty awareness that if new incarnations of Gilles Deleuze, Michel Foucault, or Pierre Bourdieu were to appear in the academy today, we would deny them tenure.

There was a time when academia was society’s refuge for the eccentric, brilliant, and impractical. No longer. It is now the domain of professional self-marketers. As a result, in one of the most bizarre fits of social self-destructiveness in history, we seem to have decided we have no place for our eccentric, brilliant, and impractical citizens. Most languish in their mothers’ basements, at best making the occasional, acute intervention on the Internet."

To this I would only add that it is largely the case not only in academia but in the world of trade publishing.  The writing of book proposals is more important than the writing of books, as is the marketing of books.  It seems it's becoming true of university press publishing, and is largely true already of e-publishing.

That doesn't mean good books don't get written and published anyway, and some of these books justly find their audience.  It just takes more nerve, perseverance and luck.  

Monday, June 25, 2012

Let There Be Light

I've been ranting about this for years, though you would have thought I was describing an alien abduction: the frequent--very frequent--poor projection in movie theatres, specifically the lack of light behind the image. 

There have been a few articles about this from time to time--I recall one at least from the 1970s if not before.  But generally, people didn't seem to credit it.  The technology of movie projection changed, but the problem didn't.  The power of Xenon bulbs was dialed back to save moviehouse managers money, resulting in a dimmer image.  (Now I learn that it didn't even save them money!)  But even with today's digital tech, the problem persists--and has drawn the ire of somebody who matters, legendary film critic Roger Ebert:  

 The most common flaw is that the picture is not bright enough. I've been seeing that for a long time.

And Ebert quotes another and perhaps greater expert: "Yet when Martin Scorsese used people around the country to actually check theater brightness, he found most of the theaters involved were showing an underlit image."

This is still happening, as the Boston Globe and Ebert articles attest, and for the same reason: to save money.  What these cynical managers are counting on is ignorance--that people have never seen a properly projected movie.  And probably most people have not.

It was a fairly long time before I did, although I don't think the problem was as serious when I started going to the movies as a child in the 1950s.  But a few experiences--movies at the Orson Welles in Cambridge in the mid-70s, or even earlier, an eye-popping sparkling new black and white print of Doctor Strangelove at a cinema in San Francisco in 1966--showed me what an illuminated experience it could be.

Those images should be bright, because they are supposed to be. But they hardly ever are.  It only takes the first few seconds to know what misery I'm in for--if I can see through the white letters in the title sequence, I inwardly and even sometimes audibly groan.  Those white letters should be solid and they should shine.

   Now the dim images have literally driven me from the movies--I simply don't go anymore.  And Ebert finally informs me that I'm not alone:

" When people don't have a good time at the movies, they're slower to come back. I can't tell you how many comments on my blog have informed me that the writers enjoy a "better picture" at home on their big-screen TVs with Blu-ray discs. This should not be true."

Hey, I enjoy a better picture on my regular old TV with ordinary DVDs.  It's that bad.

That first article I read years ago suggested that moviemakers were unaware of how badly projected their movies are because they only see them in cinemas near Hollywood which cater to movie industry clientele, especially around Academy Award time.  But even Oscar viewing is not enough to guarantee that movies these days are properly projected, as Ebert found.

I've got used to seeing movies a year or more after they've been in theaters.  But every once in a while I'd like to see one right away, on the big screen, in an environment where I once almost literally lived: a cinema.  But I almost never do now. 

I don't expect that to change.  So the biggest outcome of these articles for me is vindication.  Not a lot of solace, but then, that's how it is these days.

Tuesday, May 29, 2012

In a Name

A well-named character is important in fiction.  We remember Babbitt not only because of what Sinclair Lewis wrote about him, but because George Babbitt is the perfect name for the character.

Some novelists make up names that are right for the character but also funny, outrageous.  Charles Dickens was a master of this.  For the perplexing complexity of our time, the contemporary master is acknowledged to be Thomas Pynchon, though Kurt Vonnegut played a little, and Don DeLillo has indulged, particularly in one of his least loved novels, Ratner's Star (Elux Troxl, Mimsy Mope Grimmer, Desilu Espy and the punning U.F.O. Schwarz and Bhang Pao), as has Jonathan Lethem in Chronic City. 

But as Philip Roth famously wrote, nobody could make up Richard Nixon, and reality is currently impinging on this wonderland of names.  What fictionist could come up with a chair of the Republican National Committee named Reince Priebus?  Or the head of SpaceX, the billionaire Elon Musk?  (Well, Elux Troxl is awfully close.)  Or another billionaire, who funds green energy initiatives, an Indian businessman named Ratan Tata?

I'm not saying we should laugh at those names (not that anyone is going to laugh at the name of a billionaire.)  But they do seem like the kind of names these outrageous fictionists would invent.  Now they don't have to.  I'm not sure I'm reassured by that.           

Sunday, May 20, 2012

Today's Prayer

"It occurred to me that lots of people have to sit through meetings every day, and I said a prayer for them as you would for those lost at sea."

Tim Kreider, one of many sadly funny lines in an illuminating testimony in the NYTimes to the cognitive dissonance of  writing v. publishing a book in this new Youtubian Twitterverse.

Saturday, March 24, 2012

What's the Story?

When I grow up I still want to be a novelist.  And a playwright.  But the kind of storytelling I've mostly done--journalistic storytelling-- remains a satisfying form.  I've realized that in a few ways recently.

One of the primary ways we think we know things is through the findings of science.  While much of that is accurate in a practical sense, a lot of it pretends to be sure of more than it actually can be sure of.  This is most evident in the lesser sciences such as economics and psychology, at least as they are practiced today.  Through narrow and dubiously designed experiments, psychologists pretend to be able to say all sorts of things about human behavior, human nature and how brains work.  These assertions get these folks tenured positions, books and TV interviews in which they purport to know a lot more than they do.  Their arrogance is amazing.  Especially since they must ignore the limitations of their experiments that have been defined repeatedly, most recently in a book I've just started reading by Jerome Kagan, called Psychology's Ghosts.

But I don't need to refer to this thoughtful and eminent authority--these limitations are maddeningly clear to me.  For one thing, they purport to say universal things based on "experiments" involving the behaviors and responses of mostly North American university undergraduates who volunteer to be subjects, perhaps for small amounts of money.  But there are broader objections.  I think especially of a panel discussion, in Seattle I believe, carried on C-Span, involving Jane Jacobs, who to my mind was one of the great minds of the 20th century, upon the publication of her final and prophetic book, Dark Age Ahead.  The friendly host made an offhand comment, saying something to the effect that although much of her evidence was "anecdotal," it was nevertheless intriguing.  It was not long before he was sorry he'd said it.  She homed in on this point with great precision.

Because it's a common charge: the only true evidence is scientific observation, especially from experiment, or statistics. Other sources, such as "anecdotal" or self-reported (introspective) evidence, are less reliable.  Jane Jacobs strongly disagreed, and she was so eloquent that I went back and transcribed what she said:

“Our science began comparing two things to each other, [to find things like] the temperature needed to melt water. That was science for centuries. It’s easy to compare two things, it’s much more difficult to compare 3 or more. It’s “bivariant comparison”—it's very reductive, you have to leave out everything but 2 things, and real life is not like that. Real life attaches anything to everything else. We have begun to learn that in biology...In due course along came ‘disorganized complexity,’ like insurance actuarial tables, or marks a child gets in school—anything that is explained in a graph with a bell shape, or seems to be explained by statistics. Things belonging in this class are based on the law of averages. So many important things are left out in these comparisons, too. They are not really very clarifying in most cases, and are actually counterproductive in a good many others.

How has the human race been getting along all these centuries and millennia, sufficiently well with such bad inputs from the real world? I think we have been doing it with stories. Stories are a means of showing how everything is attached to everything else. Our stories are based on these multiple attachments, and what they mean. We love stories, as human beings. It’s the way we understand the world, very largely. One trouble is, instead of respecting our own intuitions about these things and our own abilities to analyze them and appreciate them, we have only? as some kind of second class intellectual operation. Scientists themselves despise anecdotal evidence, and everything that is a story is called anecdotal evidence, and not valued, and yet you can’t understand the world without anecdotal evidence.  Scientists may think a.e. is not important enough to occupy their time, but I think a.e. is important enough to occupy anybody’s time, beginning with very ancient poetry and up to film documentaries, which we’re learning to value more than we used to.”

Novels and other forms of fiction employ storytelling often without reference to facts derived from science (experiment, statistics, etc.), though these may be implied (and in some genres, like science fiction, employed directly, though perhaps extended beyond the current science).  But these "scientific" forms disdain story (although not in explaining themselves, when they most often employ metaphor).  Journalistic storytelling combines "scientific" fact and anecdotal information. 

Further, journalistic storytelling depends on testing both kinds of information.  For instance I might ask experts (and often, different kinds of experts) the same question and see how many sets of facts I get.  When the facts--the numbers, the interpretations--start to converge, or at least the disagreement among them sharpens--then I know how to evaluate and use these facts.  The facts are also tested by what people say, by anecdotal and introspective evidence, which in turn are tested by the facts.  The interplay of all of them is part of the story, and sometimes the story itself.

From the experience of high school debate as much as the anticipation of being caught out in print, I learned to be skeptical of facts and how they were ascertained.  From reading and writing fiction and plays, I learned elements of storytelling.  These combinations make this a form that is somehow more comprehensive.  Journalistic storytelling may not have the depth or resonance or provide the aesthetic pleasure of great novels or great drama.  But it does have an aesthetic that I've found pleasing to work with.

Tuesday, January 03, 2012

Write to Life


Reviewing a biography of Kurt Vonnegut has prompted more thinking about the difficulties if not impossibility of writing to the highest standards while being a decent human being who is fair to others.  Vonnegut's biography is one of several recent literary exposés (of Joseph Heller, J.D. Salinger, and Hemingway again) that emphasize how bad they were at life.  Hemingway as a phony macho, a self-promoting liar, etc.  Heller as a bad father, and Vonnegut as a sad and bitter man, who betrayed friends, abused one wife (and was abused by another), and scared children.  He, like the others, was not like his public image.  That's the big revelation supposedly.

(I should say immediately that there is plenty of counterevidence in this biography of Vonnegut's kindnesses, loyalty, etc.  But on the whole it seems to emphasize the flaws.)

Vonnegut brings this topic to me in a peculiarly personal way.  I was in precisely the generation of young readers in the late 60s that made him famous, and my admiration was as a writer as well as a reader.  Vonnegut had taught at the Iowa Writers Workshop just a few years before I entered there.  My fiction teacher in my senior year of college had been one of his students.  So the evocation of that place and that time in this biography has particular resonance.

Those years turned out to be the tail end of an era--roughly Hemingway to Vonnegut and Heller--in which the novelist was an important and influential figure.  This was also a time that people drank a lot, especially writers.  And they smoked a lot, especially writers.  It was during the so-called sexual revolution, and before the consciousness-raising--and that's what it was--prompted by the women's movement.  But even within the context where drinking, smoking and philandering was common--and even more expected among writers-- there was behavior that stood out as troubling, awful, even scary.  So even before Iowa (at a few writing conferences, or on campus during writers visits) I had seen professional writers behaving badly.  I saw writers who were abusive drinkers, sexual predators, liars and cowards. What I didn't see I heard, because writers could also be vicious gossips.

So even as I saw some of these writers as role models, I was troubled.  The drinking and smoking was just exhausting and debilitating,  though it was a long time before I gave it up, as it could occasionally lead to some memorably wild evenings, as when I found myself playing blues piano with novelist Vance Bourjaily on slide trombone. But patterns of deceit bothered me, and cruelty repulsed me, and frightened me.  I didn't want to become that.  Then there were the questions of irresponsibility.

Some of this behavior made me lose respect for these writers, and question the validity of their work.  I think that's inevitable, even at a distance-- when you read that writers were cruel, it casts a  pall over the writing.  But as a reader, it's ultimately the words on the page that make the difference in our lives.  As a writer, a beginning writer, it raises questions of identity, and what kind of a life is possible.

For even though character flaws are involved, the necessities of writing itself come into play.  I also learned from writing how hard it is to maintain a balance. The world being created on the page is a very different world from the one populated by real people, in which actions have actual consequences that can't be corrected in the next draft. I know a writer now, with several very well-respected novels, who said he gave up writing novels because it was too hard on his family and his relationships. The sensitivity required of a writer writing leaves a painful vulnerability to reality, while the discipline and standards of writing so intensely can feed a monstrous impatience with the world, for its imperfections and shoddy standards, as well as its conspiracy designed to destroy your concentration.

Despite my call-me-irresponsible bravado, that question of responsibility, and doubts that I could be a writer and also responsible, was a major reason I never married.  Mostly it was my inability to be financially responsible while still pursuing a vocation as a writer, but that alone was so hard that it became overwhelming to consider adding house and children.  (Vonnegut knew this--he used to say that one reason there are so many gay men in the arts is that it takes so long to establish a career, marriage and family is unaffordable.)  But there was also the doubt that I could be emotionally and personally responsible to relationships and to my writing.  So this is in part why I fell into that statistically insignificant class of hetero men who never married.

I saw what the costs to a family might be.  In being reminded of those days, I realized that I had been on that particular career track, and that I left it through a combination of my own actions and the actions of women who saw all this more clearly than I did.

In the end I suppose the joke was on me, for I didn't have much success, especially success on the page, especially in my most cherished forms of fiction and plays.  I'll never know what my refusal to sacrifice others to my writing struggles contributed to that, but less than a fierce and selfish dedication to that writing above all may well have contributed.  I expect it still does.

It's not that I've been the height of responsibility, far from it. And as a factor in my failures this may be a self-serving delusion. But when I look back on what the writer's life was supposed to be and sometimes was, I'm glad I dodged that bullet.  Like a lot of my accomplishments, this one is of what I didn't do, of pain not caused.

To some extent that was a road deliberately not taken.  And to perhaps a greater extent, it remains so.  I suppose if I still felt my gift was so great I'd make different choices.  And I don't presume to judge those whose accomplishments are greater.  Even at a distance, I found the Vonnegut biography shockingly reductive.  What's in it (if it's proportionately true) may be another side of the story, but it isn't the whole story.  He may have been different in life than he was in interviews and books, but the interviews and books are a big part of the story.  Maybe no writer can live up to the ideals in the work, in the wit and inspiration of the words.  But we can share those ideals and aspirations.

This is an early demonstration of a graph of storytelling that Vonnegut refined--but not a lot--over the years.  It's about four minutes long.