Friday, May 15, 2009

Alec Guinness - Greatest Actor of the 20th Century


In the British Isles, there has been a long tradition of legitimate stage acting, which is unrivaled throughout the world. When cinema developed momentum in the 1920's--and with the arrival of the Talkies in 1927--film began to rival the stage as a venue for acting talent. The two media may have seemed quite different modes in those early days, but eventually their qualities became better defined, and their distinctions clearer. Sound is thought to have "ruined" the careers of certain actors, whose voices didn't match their silent acting abilities. "Old fashioned" acting methods--common to stage acting up to this point--were thrown into sharp relief by cinema. The degree to which the cinema actually changed the legitimate stage--particularly acting technique--is a story for another time. My point here is to suggest that certain actors/actresses were able to bridge the gulf between stage and film, to have, in effect, dual careers. This has become very much the exception since the Second World War, with film actors/actresses rarely maintaining separate skills (live and recorded).  

It's still true, however, that training on the legitimate stage is probably more effective in producing skilled actors than starting out exclusively in film. Many actors find film acting to be difficult for one reason or another, though the demands for prolonged performance are rarely as great as they are in major stage roles. (It seems possible to construct a major career as a film actor without ever having to develop any significant skills for real acting.) On the other hand, a legitimate actor may consider his talent squandered in film. The money in film, of course, plays a large part in seducing serious acting talent away from the stage. In the case of Marlon Brando, for instance, a great acting career on the stage appeared to have been derailed when he moved over exclusively to movie roles.

The audience for theatre is quite limited when compared to that of film. Who would want the greatest acting talents confined to the theatre, when a great universal medium like film is available?  

The career of Alec Guinness presents the ideal of an actor whose legitimate stage career led directly to his success as a screen actor, but nevertheless didn't cause him to abandon the stage altogether. Such dual careers are of course more common in the British Isles than here. Laurence Olivier comes to mind, but there are many others. Guinness's apprenticeship as a classically trained actor enabled him to express his talent in countless guises. Professionally modest, even meek, about his abilities, he was routinely able to place himself inside a vast array of strikingly diverse roles, both comic and heroic. 

Beginning with Great Expectations (1946), he did a string of film roles that is unrivaled by any other talent:

Oliver Twist 1948
Kind Hearts and Coronets 1949
Last Holiday 1950
The Lavender Hill Mob 1951
The Man in the White Suit 1951
The Card 1952
The Captain's Paradise 1953
To Paris with Love 1955
The Ladykillers 1955
The Bridge on the River Kwai 1957
The Horse's Mouth 1958
Our Man in Havana 1959
Tunes of Glory 1960
H.M.S. Defiant 1962
Lawrence of Arabia 1962

--and these are only the high spots! In the space of only 15 years, he established himself as the major male acting talent of the post-War era. It is sometimes said that great acting careers are the result of shrewd part selection, and that failed careers are often the result of actors "selling out" for pop roles, avoiding challenges and cashing in on a single success. This is certainly true in America, where an early plum role may lead to a string of howlers, as with Nicolas Cage, for instance, whose major performance in Moonstruck (1987) was cashed in for two dozen macho-action flicks. Why does one career go awry in this way, while another, say, like Kevin Spacey's, or Jeff Bridges', continues from one pinnacle to another? It probably has something to do with a proclivity for taking chances; but it's also the result of challenging oneself to higher standards of performance. Without a solidly grounded preparation of training, many American acting talents dither in the wind, reluctant to forego easy exploitation movies (and the easy money) in favor of more interesting parts that might relegate them to mere "character actor" status.
 
In Guinness's case, the tendency to take diverse roles seems to have worked to his advantage. Long before Dustin Hoffman and Robin Williams essayed cross-dressing parts, Guinness was playing eight separate parts in the same movie [Kind Hearts and Coronets], one of which was a woman! It's difficult to imagine any Hollywood actor or actress trying anything half that ambitious today, both because of the difficulty of execution, and of the risk it would entail to "image" and "market continuity."

The studio system in Hollywood tended to typecast actors, and though that system is quite gone now, the vestiges of that approach to role choices remain in place. Actors still tend to measure the desirability of a part in relation to what they believe the "target audience" may expect. But looking at the successful career of Guinness, "image" and "expectation" probably played little or no part in the roles he chose. He was a professional actor; each role required that he exercise his ingenuity and guile to adapt himself to its specific demands. It wasn't about preserving his reputation, but about making good theatre (or good film). 

It is often remarked about Guinness that he was a humble man, self-effacing, quietly devoted to personal obligations--his marriage, the church, his interests--who always strove to maintain a business-like, courteous approach to his craft. It's unlikely we shall ever see his like again. A modest man, without vanity or confusion, who possessed a great talent which he used to bring entertainment to millions. It's not a little thing.

Sunday, May 10, 2009

Brideshead Revisited [1981] - A Catholic Epic ?


Brideshead Revisited was adapted by Granada Television as a miniseries and released in 1981. It premiered on American PBS, and was an immediate hit, reprised several times, once with William F. Buckley (himself a Catholic) moderating episodes, once even inviting the critic Hugh Kenner to briefly discuss the implications of the plot. 

I was not raised Catholic. My parents forced me to attend a Presbyterian church for a few years as a boy (while they never went to church at all), but it never made much sense to me, and I learned practically nothing in Bible school. My first year in college, I had a Catholic roommate, but we never discussed religion, and he was soon to become "lapsed" himself. 

Therefore, a lot of the deeper emotional and philosophical issues which Waugh weaves into the drama are lost on me, or at least I don't feel them with anything like the intensity I suppose Catholics may experience. 

Waugh himself was a convert to Catholicism. It's a cliche that many of the most devout and committed members of any religious sect are converts. In Waugh's case, it seems to have been both an intellectual and personal persuasion, reflecting both a need to see life as part of a larger given structure, and a character "weakness" which expressed itself in a somewhat "dissolute" youth. 

The novel, published originally in 1945, has a nostalgic cast. It looks back with some wistfulness at the central character's romantic university years at Oxford and his later wandering years as a professional landscape painter, from the vantage of a dreary present circumstance as an officer in wartime Britain. The novel attempts to show how Charles Ryder's three failed loves--with Sebastian Flyte (a titled Lord), his college companion; Celia, his stuffy wife; and Julia Flyte (Mottram), Sebastian's married sister--lead him to a reconciliation with his faith in larger purposes. 

Waugh unashamedly celebrates the upper classes, their privileged lives, their freedom, their indulgence, which is contrasted to the moral strictures of the church, the codes of behavior against which the Brideshead family measures itself. There's a lot of hypocrisy of various kinds, but none of that seems to alter Waugh's essential attitude:  No matter who you are, what kind of life you live, you can't escape the judgment of your own conscience, and the consequences of your wicked behavior. Not having been raised a Catholic, as I say, I was not very familiar with the concept of institutionalized guilt. Particularly, the notion that no matter how evil, how naughty one might be, there was always some expiation, some forgiveness, available, an ultimate salvation from mortal sin. 

As a piece of cinema, this production is almost without equal as an episodic panorama. Presented as eleven roughly hour-long episodes, it gives a vivid picture of a certain segment of British upper-class life over a 20 year period.

Brideshead had never been dramatized before. The production (of 1981) was competent in all respects. Jeremy Irons played Charles Ryder, Anthony Andrews (most familiar to American audiences as the star of Danger UXB, and for a major part in Huston's Under the Volcano [1984]) played the difficult part of Sebastian, Diana Quick was Julia--with other major parts for Claire Bloom, John Gielgud, Jeremy Sinden, John Grillo, Charles Keating, Laurence Olivier, Stephane Audran, and Jane Asher. For American audiences, its picture of life in English country estate houses may be familiar from detective fiction, or Wodehouse comedies, but the class and sexual tensions are probably somewhat exotic. 

We have no specific counterpart to British public schools, with their rituals of punishment, dog-pack pecking-orders, and sado-masochistic homo-erotic routines. Upper-middle class Americans predictably fantasize about the trappings of titled entitlement, and the Merchant Ivory period pieces have depended as much as anything on this obsessive New World preoccupation with Old World privilege. 

Nevertheless, I must confess to having been smitten with this effete, rarified vision of English sophistication, its wool suits, its brisk contemptuous wit, its breezy presumption and limp-wristed decadence. Sebastian's Teddy-bear may be the ultimate fetish of irresponsible leisure, but what American school-child hasn't been captivated by A.A. Milne's Winnie-the-Pooh and the cosseted world-view it portrays?
 
If Charles and Sebastian are insufficiently heroic in their rebellion against the staid conservatism of late Edwardian mores and prejudice, the qualities they do exude certainly offer a refreshing contrast to the bumptious athleticism of Studs Lonigan or James Cagney. 

As the sun set on the British Empire, her chosen sons graciously sank into irresistible decay and excess. To Waugh, the only moral salvation available became the Church of Rome, a symbol of Medieval order in a world of crumbling honor and good taste. The nostalgic innocence he reveres in this novel is identified with the purified duty and devotion of faith. As in Forster, the greatest tragedy is in growing old. Empires rise and fall, and proud young men either die or slip into drunkenness and dissolution.     

Friday, May 8, 2009

Manny Ramirez Caught Using Drugs

Let me be clear:  I've never been a Dodger fan. I was raised in the San Francisco Bay Area. When the Giants moved west in 1958, I watched them play that year at Seals Stadium, the little minor league park they used while Candlestick Park was being built. 

When the Dodgers moved west to Los Angeles in the same year, a natural rivalry sprang up. Anyone who loved the Giants just naturally hated the Dodgers, and vice versa. In the annals of fandom, the more your team beats the enemy, the better you feel about it. It's part of the fun. 

This year the Dodgers started like a fire-engine, all sirens blaring, winning 13 games in a row at home, a record. They led the NL West by 6 1/2 games, and looked to be favorites as World Series contenders. Leading the charge was Ramirez. Signed last year by LA, and re-signed for 2009, he was on a torrid pace, batting .348 with 6 homers, and 20 RBI's, many of his hits coming in the clutch. Yesterday, MLB announced that Ramirez had failed a drug test, and tested positive for performance-enhancing substances. No one, not even Ramirez, seems to deny that he'd been caught red-handed. You don't have to be discovered with the needle in your butt to prove that you're juicing. 

Because the Giants are in the NL West, I've never followed the American League closely. Ramirez was just a rumor to me, until he came to play for the Dodgers. The guy isn't an appealing role-model:  He sports long dread-locks, and generally looks a little grungy, which seems to be his preferred mode. Apparently he's been considered a selfish player, even a trouble-maker, in the past. Big stats and big salaries seem to have something to do with that. I can't work up much sympathy for a fellow who, with natural skills and superior ability, and already making a mountain of money, thinks he has to cheat to get a little more edge. Bust his ass!

But the best news is that he won't help the Dodgers, a rich franchise which traditionally has had the luxury to hire expensive free agents over the years to pump up their lineups. He's been suspended for 50 games, right in the heart of the season. Is anyone sorry for the Dodgers? Maybe their fans. 

But not me. The Giants, who declined to pay Manny the astronomical salary he was demanding, can now only thank their stars they let him get away. Manny may only have been using the Giants' offer as leverage, anyway, to squeeze more cash out of LA.

This is the best Bay Area sports news since Tim Lincecum won the Cy Young last year. Now we might have an even chance to unseat those pesky Southern California rivals!                

Go Giants!   

Wednesday, May 6, 2009

The Pedestrian Book of Wrongs - How Not to Cross the Street


I've been thinking for a long time about transportation behavior. As I become older, I seem to get more impatient and frustrated with the inconveniences and irritations of driving, on freeways and highways as well as on city and suburban streets. As our environment becomes more and more crowded, each day is increasingly a negotiation of interruptions, delays, mishaps, ill-tempered dialogues, etc. 

When I was a kid growing up in Richmond, California (age 4-7), our grammar school had a formal demonstration provided by the local police department. It was meant to be frightening, so that the students wouldn't forget it. All the students were lined up on both sides of the big black-top playground, and a patrol car, driven by an officer, was accelerated from one end, up to a speed, say, of 35 mph, and then the brakes were applied abruptly, causing the car to slide, sideways, about a hundred feet, burning rubber all the way. This was frightening to watch, not least because we knew what was coming. (Probably, today, such an event wouldn't be allowed, for liability and safety reasons.) The point of the demonstration was to show us kids how difficult an automobile is to stop suddenly. We were told with methodical deliberateness: "Always look both ways before crossing the street. Always wait for cars to pass before negotiating a crosswalk. Never cross in the middle of the street." These admonitions were delivered with grave seriousness. On the other hand, bicycle training--behavior and law--was completely neglected in those years (1950's thru 1970's). Most kids didn't bother to license their bikes, and we navigated "by the seat of our pants," hardly aware of hazards or regulation. 

I'm not sure when, or why, common sense pedestrian behavior began to be abandoned, but it has steadily deteriorated over the decades. In the Greater (San Francisco) Bay Area where I've lived most of my life, it has gotten to the point that most people have either completely forgotten safe practice, or defiantly flout it with risky or unlawful behavior. Perhaps this has something to do with the demonization of the automobile, as if cars, and those who have the audacity to drive them, are somehow to be despised or "put in their place." 

In any event, driving on any urban or suburban street in the Bay Area has truly become an adventure. In poorer neighborhoods, almost anyone is likely either to stride boldly into traffic, often not even bothering to look in either direction beforehand, or to play chicken by daring drivers to stop suddenly. People routinely will cross against a light, or in the middle of the street, as if they had a right and privilege superior to vehicular traffic. On boulevards or four-lane avenues, pedestrians on islands or median-strips will unexpectedly and without warning simply wade into busy traffic lanes. Frequently, in my experience, even if there are no vehicles following my own, pedestrians will insist on crossing in front of me. And those who seem to flout law and safety are often those who seem incapable of picking up their feet, choosing instead to amble lazily along, as if to thumb their nose at those they have chosen to inconvenience. 

Women with baby carriages will often push the cart in front of them, directly in the path of moving vehicles, as if the carriage were a spearhead to clear the way. I sometimes wonder if they don't just wheel empty carriages around to facilitate quicker crossings, except that playing chicken by pushing your own infant in front of moving traffic would seem to violate a much higher moral standard.

Automobiles are expensive. Gas is expensive. Most people don't drive as a form of recreation, but out of necessity. Despite our modern technology, driving an automobile safely and with control is a risky business. Acting in a responsible way costs pedestrians nothing. Stopping unnecessarily or in emergency is expensive and dangerous. Our culture is based on the automobile--our whole way of life is made possible by its efficiency and power. If we have too many cars, it's because there are too many people. 

It's okay to hate the automobile, but it's important to respect it, too. Cars are dangerous, and expensive and crucial to our way of life. 

Pedestrians need to follow the rules, and show some consideration and good manners. Just because you are on foot, and a driver has wheels, doesn't mean you are automatically on superior moral ground, or deserve special favors. As a fallback, the law entitles pedestrians to the right of way, but that's just a legal maneuver. If you think logically about it, vehicles always have the right of way, because a pedestrian doesn't stand a chance in a collision with a car. What idiot pedestrian would seriously play chicken with a three thousand pound vehicle? And yet that's precisely what happens all the time on our city streets.   

I'd like to see police ticket pedestrians with the same alacrity they do drivers of vehicles. 

Also, they could start ticketing bicycle riders too. I'm not keeping my fingers crossed, though. 

The Archetype of Decadence - on a Photograph of Clarence John Laughlin


Take a moment to look at this photograph, by Clarence John Laughlin [1905-1985], entitled "Mother" Brown [1945]. 

At first glance, it appears to be a straightforward formal outdoor portrait of a late middle-aged African-American woman in a long dark dress or robe, with a large Xtian cross around her neck, standing in front of a formal architectural feature, a blind archway straddled by two decorative pediments. All of the information necessary to an understanding of the photographer's intention(s) is here, in black and white; we don't need to know the exact color of any part of the composition to understand the dialectic being offered.

The light is a few degrees from its zenith--perhaps it's 10:30 AM or 1:45 PM; it doesn't really matter. The shadow that is created by the archway and pediment is the real point of the light's angle. The shadow formed makes a large dark sickle which looms over the woman's head. The symbolism is perfectly obvious:  Death (the sickle) is poised over her. Will her cross protect her from the risk of mortality? 

Light and shadow (dark) are opposite poles in the dilemma of religious struggle. We don't have to know--as it happens--that "Mother" Brown was the leader of a religious cult in New Orleans which included Voodoo among its tenets. We don't need to know what kind of a building it is she's posed before. She stands in a dignified posture, secure in her beliefs, confident and convinced. Does the light reveal truths about her plight, or is this just the ironic accident of the angles and positions of the solar system and rays of the visible spectrum falling on pieces of shaped stone or stucco? 

The Gulf South--in particular New Orleans and the surrounding Delta Country--is subject to continuous melting decay, brought on by water and intense tropical heat. Bodies buried in the ground literally "float upwards" to the surface over time, hence burial practice dictates that bodies be placed in vaults instead of in the earth. In the context of Post-Bellum Southern decay and decadence, superstition is often associated with fetish-objects. In a critical statement by Laughlin, he says "the physical object, to me, is merely a stepping-stone to an inner world where the object, with the help of subconscious drives and focused perceptions, becomes transmuted into a symbol whose life is beyond the life of the objects we know and whose meaning is a truly human meaning." 
 
In this photograph, the cross--fixed symbol of Xtian dogma throughout history, worn by acolytes and followers as a talisman of their faith, or as a charm to ward off "evil"--is contrasted with a mythical symbol of mortality. But the contrast doesn't imply struggle. The diagonal lines of shadow throw the scene into vivid relief. Her body, surrounded by symbolic shadows, masquerades as subject, a hostage to art.

Is Laughlin telling us something else about life and death, belief and delusion?        

Tuesday, May 5, 2009

The Grammar Gestapo

The misuse of the word "like" has gotten completely out of hand.

Like has been under assault for decades. Back in the 1950's, "Beatniks" would say "like, yeah, man" using the word as a sort of cult placeholder for hip speech. In the last 10 years, people have begun using it in a slightly different way. 

You often hear the phrase "I felt like...." when what the speaker means to say is "I felt that...." 

You will also hear the phrase used when the speaker wants to inject a degree of qualification, even going so far as to say "I felt like that it was...."  

To be like is to indicate a comparison between one thing and another; it isn't a qualifier which can be used to modify a verb. You can "feel like" someone else, or you can "feel like" (technically a stretch--replacing as if with like) you're becoming ill. But you can't "feel like" you think or do something; that's not English.
___________________________________________________________________
In other news--
EX-UNITED STATES POET LAUREATE TRIES OUT NEW CAREER AS FASHION MODEL !!
 

Sunday, May 3, 2009

Quarantine - Are Viruses God's Little Mickey Finn?

What kind of a god would create viruses?  If the universe were created through "intelligent design," what possible purpose would viruses serve?  Viruses are so small we don't even really call them life-forms. They're like little machines, whose only function is to infect a host in order to replicate themselves and keep going.  By themselves, they're without any purpose or ultimate meaning.  

In what sense is that different from any other life form?  What is the "ultimate purpose" of a green bottle fly, buzzing irritatingly around under the shady eaves on hot days, looking for some stray dung to lay its eggs in? Aren't all living things just trying to get along, reproduce themselves so their kind can continue? That's the "life force" of which philosophers speak. But does it have any ethical meaning, which we ourselves don't give to it?

There's been much hoopla lately in the press about the "Swine Flu" scare which appears to have originated in a dusty little rural hog-farm community in Central Mexico. We already know that "corporate" farming, in which thousands of animals are squeezed into limited spaces with unimaginably dirty and unsanitary conditions is a recipe for contagion, disease, and the promotion of "super bugs".  All of us who eat meat, undoubtedly consume protein which has been grown under intolerable conditions, not just for the poor beasts themselves, but for the dangers this system entails.  

Plagues and infestations have been known for millennia. What they all have in common is crowded conditions, unsanitary practices. Dirty water, poor sewage management, tainted food, overpopulation. Many of our most stubborn prejudices and superstitions grew up over the centuries in response to unscientific notions about how and why people get sick. 

Science has taught us that bacteria and viruses thrive and may quickly get out of control where people live too closely together, or allow their infrastructure to deteriorate. We know that viruses are "everywhere," that all living beings carry thousands of these things around as "benign" "passengers," any one of which may, without warning, mutate unexpectedly into deadly killers.  

As man continues to proliferate across the planet, the natural restraints upon the explosive potential of infection are abandoned. When people once lived in small communities, or nomadic bands, there was a natural "quarantine" effect of isolating contagion within small groups or individuals. One of the classic "tools" epidemiologists and public health administrators always consider in fighting new diseases is isolation, or enforced quarantining of individuals, groups, or communities. But in the bustle and flux of the modern world, quarantining on a large scale appears unworkable.  

The solution to plagues or pandemics is to reduce the concentration of individuals across the biological spectrum. Failure to do this voluntarily will inevitably result in periodic crises of contagion. If we continue to up the ante by ignoring this deadly paradigm, these waves of disease and suffering will increase in severity and frequency. 

In Asian countries, it has become common for people to wear face masks in large cities, either to prevent spread of an infection they have contracted, or to keep themselves from being infected. Wherever people congregate closely in numbers, there's a risk. Anyone who has had a child who went to public school knows how colds spread like wildfire in classroom situations. You will hear theorists occasionally claim that common contagion is really a method of "naturally" inoculating ourselves against infections, by stimulating the natural immune systems within our bodies, that close contact is really a useful tool in maintaining a robust population. But that argument doesn't hold up with the periodic deadly mutation.   

Epidemiology tells us that we are unlikely to keep up with the rapid transformations of opportunistic viral mutations. We can't count on science to provide us with quick cures or inoculations against common or unknown invaders. Historically, we know that isolating contagion is probably the first and best line of preventive medicine. That's what's being tried now in America against the "Swine Flu"--send the kids home and don't go out.  

We haven't really come very far from the days when superstition dictated that one simply avoid the seriously ill. That's still good advice. Of course, we have to treat people, and we have to care for them, even if they're dying. But crowding and overpopulation are the ultimate recipe for armageddon. In wars and natural disasters, disease usually kills far more than bullets or falling walls or fires.