Friday, October 7, 2011

The Tree of Life, The Tree of Death

“Summing it up, one might say that the screen rectangle must be charged with emotion.” — Alfred Hitchcock

“The feel of the experience is the important thing, not the ability to verbalize or analyze it.” — Stanley Kubrick

“I can’t quite tell you what the film was about, I can’t analyze it. I’m not a film critic, I can’t tell you where Fassbinder was going with that film… But I had a certain emotional reaction to it, and that’s what I admire. The spirit of that picture permeates the attitude of a lot of the scenes in Taxi Driver. And that means everything.” – Martin Scorsese on Rainer Werner Fassbinder’s The Merchant of Four Seasons.

I generally hate articles, and for that matter films, that begin with a quote. It’s a cut-and-dried cliché that has been juiced to the brink of meaninglessness. But I make an exception here, both because the three quotes above condense the point of this post eloquently and because the film in question opens with one, a quote from God Himself in the Old Testament:

“Where were you when I laid the earth’s foundation ... while the morning stars sang together and all the sons of God shouted for joy?” – God

The filmmaking guidebook conventionally dictates that a film should set its tone in the opening ten minutes, but Terrence Malick sets a big part of the tone with that quotation itself, establishing the spirituality of the film and the spiritual themes of the origin of the universe and of life, channeling our emotions toward that undeniable poetic sensation central to the God concept, which even an atheist like me cannot deny. The spiritual tone set here, together with the first words softly uttered by Mrs. O'Brien (Jessica “Goddess-Like-Beautiful” Chastain), becomes the heart and soul of the film and suffuses smoothly into the rest, carrying with it and tying together all the elements that go into making it.

There is much to be said about all the elements that go into making it, technically and artistically, from sound mixing to editing, from cinematography to narrative structure to the hypnotic background score. It would be farcical even to try to dissect every single one of those to ascertain how the experience is brought to life. But in a Portuguese interview, Daniel Rezende, one of the editors of this film, quoted the very first piece of advice Malick gave him before the editing began, advice which Rezende says changed his approach to editing forever. That attitude in essence reflects Malick’s creative process for this film and the spirit he tried to pass on to the rest of the cast and crew he worked with while making it. (Keep in mind that there were five editors on this film, one after another over two years of editing, and different cinematographers in the different countries of shooting.)

“I know you can edit this scene. But please show me what you'd never do with this scene. Let's try to find the opposite of what you plan to do. What you know is not interesting to me. I don't care if the movie does not please all audiences. But I want to give them a new cinematic experience.” - Malick

And a new cinematic experience he gave us alright!

I want to comment a little on the central themes of this movie. The most common interpretation I’ve heard is that the film is Nature VERSUS Grace. Given the opening monologue by Mrs. O’Brien about the contrast between the Way of Nature and the Way of Grace, it’s tempting to believe it to be a versus-themed film, but I couldn’t distance myself from that opinion any further if I tried. It isn’t a Nature-versus-Grace story at all, but rather a story of how Grace originates out of Nature. The long, mesmerizing visual sequences of the origins of the universe, with their cold and heartless feel, are not immediately contrasted with the birth of the O’Briens’ children and their joyous upbringing by an angelic, playful, loving mother and an equally loving yet domineering, authoritative father. There is a transformation between the two sequences – that of the evolution of life from lifeless primeval soup to oceans bursting with life, from abiogenesis to biogenesis – where you can literally feel your emotional response turn from unsympathetic to sympathetic as you watch the first images of a beating heart in a developing fetus, and of a predator dinosaur walking away from its wounded prey out of compassion for its suffering. In other words, the Way of Grace is shown as being born out of the Way of Nature; they are not two independent entities being contrasted but one entity taking birth from another and then vanishing back into the same (the scene towards the end where the Earth is shown reduced to dust as the dying Sun swells into a red giant). Our place in the universe reduced to oblivion by one tiny puff of the Sun.

The search for our place in the universe is another recurring theme of the film, along with loss, bereavement, and deeply bonded sibling relationships. Mr. O’Brien is always on a quest to make a mark on the world, to become significant, to make his existence meaningful, while Mrs. O’Brien has already found that meaning in sharing love with her children in the brief time she gets to be with them on earth. The expressions of merry joy on the faces of the three children when only their mother is around turn to fearful, blank expressions of forced respect when Mr. O’Brien enters the vicinity. This is one of the several ways in which Malick creates the mood pattern by fluctuating the emotions – by alternating scenes of Mr. and Mrs. O’Brien spending time with their children. As a result, each scene feels like a distinct bubble of emotion and the entire film becomes a bubble bath – it’s all one big interconnected cluster of conjoined bubbles, but they are still not a continuum. This, I personally feel, is what distinguishes The Tree of Life from most other mainstream/conventional films.

Another thing worth noting is the apt title for the film. The evolutionary tale I talked about above is referenced in a very ingenious way in the film title: Tree of Life being the name biologists give to the evolutionary tree of life, referencing Nature, and Tree of Life also being the Biblical Tree of Life, referencing Grace.

When all is said and done, I thank the reclusive Mr. Terrence Malick for giving me this once-in-a-lifetime experience of watching what is quite possibly the greatest film made in my lifetime: a class never before achieved, and perhaps never to be achieved again. An ode to the potential and limitlessness of cinema, as eloquently expressed by David Lynch:

“Every medium is infinitely deep.” - DL


___________________________

Trivia 1: Zbigniew Preisner's Lacrimosa, used in the film, is not an arrangement of Mozart's Lacrimosa but an original setting of the same Latin Requiem text.
Trivia 2: Heath Ledger was originally supposed to play Mr. O'Brien, but I have absolutely no complaints, because Brad Pitt plays the character as honestly as Ledger could have, and more powerfully. This is, to me, by far the best performance of Brad Pitt's career.

Trivia 3: Note the intended irony that a film titled The Tree of Life actually begins with the death of a character. Hence the title of this post.

Wednesday, September 28, 2011

George Clooney's 100

Hokay, so George Clooney, in a recent interview for his upcoming film The Ides of March, has shared his list of the 100 best films from 1964-76, a period some (excluding me) consider to be the Golden Age of Cinema. Excluding me because such pigeonholed classifications are usually based on blind-spot biases, and every generation has its greats. Only ignorant/bigoted idiots would stoop to say that great filmmakers/artists have all perished.

The list.

The list has some really great titles, interspersed with some that are in my opinion junk yet highly overrated, and some lesser known gems. Those who use the IMDb Top 250 as a recommendation list should immediately identify the good titles in this list, so I'm going to recommend some from it that are NOT in IMDb Top 250 but are truly great films in their own right:

In order:

Deliverance (1972)
The Conversation (1974)
All The President’s Men (1976)
The Last Picture Show (1971)
Blow-Up (1966)
The Producers (1968)
The French Connection (1971)
Wait Until Dark (1967)
Marathon Man (1976)
MASH (1970)


Going to watch Badlands this weekend, and I might add it to this list pending my response. One very important and worthy film that's missing from Clooney's list is, of course, Barry Lyndon. Though not as popular as Kubrick's other endeavors, Barry Lyndon is a period masterpiece with Kubrick's signature all over it, a paragon of the artistic, dramatic and technical perfectionism he was known for.

Saturday, September 17, 2011

Thursday, September 15, 2011

"Good evening, this is Orson Welles!"

Orson Welles has been dubbed many times with many titles: prodigy, unfathomable genius, creative genius, natural, greatest, &c. I couldn't agree more with all those titles, but I prefer to dub him with my own invention: Awesome Welles. The story of how Orson Welles became Orphan Welles at age 15 and Awesome Welles at age 25, and kept going up and up the scale of awesomeness, is long and, I hate to say, sad. Sad because he was considered too much of a genius (or to use their own term, "rebellious") for his time and had to struggle, despite all his inborn talent, to earn his bread. In fact, he used to act in other filmmakers' films, sell his inimitable voice for narrations and present himself in hack TV commercials (another) for only one reason: salary.

There are many reasons for his lifelong poverty, both external and internal (he was at least partly responsible himself, bad as he was at managing money), but this post is not about any of that. I am writing this specifically to (try to) express my inexpressible appreciation of one of his films, The Trial (1962), about which Welles said:

Say what you will, but The Trial is the best film I have ever made. I have never been so happy as when I made that film.

I'm not trying to write a review of the film (if you need one, I refer you to Roger Ebert's thorough four-star review of The Trial, four stars being the highest rating he gives any film). Rather, I'll comment on the general character of the film and why it really is, as Welles said in the statement quoted above, the best film he ever made, along with, of course, Citizen Kane.


A brief history of how this film came to be: Citizen Kane was the first and, up until the 70s, the only film in which the director was given absolute control over all aspects of the production. So much control, in fact, that the producers and the studio people weren't allowed to see rushes or any part of the film until the final cut came out, and when it did, even if they hated it, they couldn't do anything to it. Welles often said that this was the reason Citizen Kane came out as good as it did, and since he was never given that kind of control again, he put up a challenge: "Give me that kind of control again, and I'll give you a film better than Citizen Kane." The only other time he came close was with The Trial, where the European producer Alexander Salkind offered him near-complete control, except over small sections of the score, which came out great anyway and about which Welles did not complain. Unfortunately, once the film was completed, it was not marketed the way Citizen Kane was, did not receive the same kind of attention, and as a result has largely remained unseen by critics and the public. Typical. The film eventually ended up in the public domain, which means no one can claim copyright to it and anyone can distribute it anywhere -- which has both pros and cons.

Anyhow...

The Trial is an adaptation of Kafka's surreal novel of the same name, and as someone accurately described it: "The film is a competition between Kafka and Welles, with Kafka coming in second place." This is one of those rare gems of adaptation where the film outperforms the book.

The film makes you feel uncomfortable and disorientates you at every possible opportunity, and there's absolutely nothing the audience can feel safe about. It punishes you for watching it. It disorientates you, makes you dizzy, psychologically upsets you, even confuses you. It's cruel to the audience. But in that cruelty lies a strange sense of amazement and attraction that showcases the directorial potential of Orson Welles as truly deserving the title of Greatest Director Ever, an undisputed master of mood. Welles himself appears in the small role of The Advocate -- a mean, harsh, overpowering character he plays with such conviction that I can't help but place him in the same rank as Marlon Brando, something I hadn't felt so strongly before watching The Trial.

At the end of the opening pin-screen-animated sequence, Welles says, "Some have said this story has the logic of a dream...of a NIGHTMARE." It's impossible to say exactly what, after watching the film, made the entire thing feel so nightmarish. It was the totality and combination of everything in the film, of course, but it's impossible to consciously assess how all those factors come together to make it feel creepy and disturbing. There is nothing overtly creepy or disturbing about it, but Welles works so much on the subtlety of its nightmarish feel that only the emotions are transferred, not the inner workings behind them. Given that Welles was an accomplished magician in real life, it only makes sense that he would do something this brilliant without letting the audience in on how he is doing it. And Kafka cannot be given the credit for its nightmare-like character, at least not entirely, because there have been numerous other adaptations of The Trial and none of them comes even a tenth of the way to the utter magnificence of Welles' impossible-to-emulate adaptation.

As for the plot of the film: a man named Joseph K. is put on trial without being made aware of the charges, and he and the audience struggle, in this surreal world he is drifting through, to find out what the charges are, while at the same time he attends the trial and defends himself as if he knows precisely what he is accused of: being himself (Ebert's review has a good exposition on this). And as is characteristic of surreal stories, interpretation is open to every member of the audience.

In a 1981 documentary/interview called Making The Trial, Welles in answer to a question said, "My vision of The Trial is not the same as Kafka's vision, and no director should ever find the need to stay true to the novel, because once adapted, it becomes your perspective, and hence, your story." I wholeheartedly agree. To me, The Trial is as much of a crowning glory in Orson Welles' oeuvre as Citizen Kane.
_____

And oh, before I forget: Anthony Perkins, who played the antagonist in Psycho, puts forth one of the best performances the screen has ever seen, and his unsettling body language and delivery of dialogue are among the things, I reckon, that make this film feel like living through a nightmare. Perkins stated in Making The Trial that the high point of his life was having acted in an Orson Welles film, having been directed by Orson Welles.

Wednesday, August 31, 2011

August: Movies


Harry Potter Part 7 Part 2 - 7/10
Being There - 8/10
The Lincoln Lawyer - 7/10
Super 8 - 4/10
Delhi Belly - 3/10
Horrible Bosses - 7/10
Limitless - 6/10
Sunset Blvd. - 8/10
Scream 4 - 7/10
Tootsie - 8/10
The Trial (1962) - 9/10
The Adjustment Bureau - 2/10
The Witness - 5/10
I Saw the Devil - 7/10
The Good, the Bad, and the Ugly - 8/10
Taxi Driver (Extended Version) - 8.5/10
I Spit on Your Grave (Remake) - 6/10

Tuesday, August 23, 2011

Writing impulses surface

So...I've been pretending to be busy for many a month now and nothing good has come of it. Meanwhile, I've been suppressing my psychotic writing fits and avoiding thinking about any topic for more than 15 minutes. But, as so often happens when you suppress urges, they manage to manifest as other behavioral aberrations: I've killed three hookers in the last month alone, and lost count before that.

And to spare those presumably innocent lives, I'll try and write something in a free-flow form, without much deliberate thinking.

As you've probably guessed, that part about killing hookers is rank bullshit. But I enjoy talking about bullshit, I enjoy seeing people bullshit, I enjoy psychoanalysing bullshit. Bullshit surrounds us as thickly and imperceptibly as the air we breathe. I don't mean bullshit in terms of lies; rather, just the idiotic shit we are better off being ignorant of. And one of those things is the way people get you to open up, to spill the beans on something, by disguising themselves as pseudoacademic experts citing their own personal expertise. Now, to be fair, we are all psychologists. We all try to understand how people's minds work, and in doing so, understand our own. Having an understanding of the nature and attitude of the people you interact with, and of people in general, gives you an edge in manipulating them. But when the line between that understanding and the pretense of understanding begins to blur, you end up shitting in the minds and lives of those you are attempting to manipulate.

Now, I feel an obligation to be specific. Remember, when you are down and out after life has completely fucked you over (we all have those moments, right?), a friend always very comfortingly tells you "talking helps, I'm here, you can share it with me"? That's bullshit. No, actually, it isn't total bullshit, because talking does help, but:
1. It comes with constraints.
2. The friend could:
a) Be genuinely trying to comfort you by letting you share your sorrow with them, or
b) Just be curious about your plight and is attempting to get you to open up. Morbid curiosity is part of us.

About the constraints: Talking about your situation or misfortune with someone helps only when the talking is done immediately after the misfortune, and that window stays open only for a brief period. I read about this in a book by Martin Seligman (a very respected, important and real psychologist, and a former President of the American Psychological Association, elected by the widest margin in the history of the association), so I'll go right to the source:

Another widely believed theory, now become dogma, that also imprisons people in an embittered past is the hydraulics of emotion. This one was perpetrated by Freud and insinuated itself, without much serious questioning, into popular culture and academia alike. Emotional hydraulics is, in fact, the very meaning of "psychodynamics". Emotions are seen as forces inside a system closed by an impermeable membrane, like a balloon. If you do not allow yourself to express an emotion, it will squeeze its way out at some other point, usually as an undesirable symptom.

In the field of depression, dramatic falsification came by way of horrible example. Aaron Beck's invention of cognitive therapy, now the most widespread and effective talk therapy for depression, emerged from his disenchantment with the premise of emotional hydraulics. The crucial experience for Tim came in the late 1950s. He had completed his Freudian training and was assigned to do group therapy with depressives. Psychodynamics held that you could cure depression by getting them to open up about the past, and to ventilate cathartically about all the wounds and losses that they had suffered.

Tim found that there was no problem getting depressed people to re-air past wrongs and to dwell on them at length. The problem was that they often unraveled as they ventilated, and Tim could not find ways to ravel them up again. Occasionally this led to suicide attempts, some fatal. [...]

Anger is another domain in which the concept of emotional hydraulics was critically examined. America is a ventilationist society. We deem it honest, just, and even healthy to express our anger. So we shout, we protest, and we litigate. [...] If we don't express our rage, it will come out elsewhere - even more destructively, as in cardiac disease. But this theory turns out to be false; in fact, the reverse is true. Dwelling on trespass and the expression of anger produces more cardiac disease and more anger.

[...]

I want to suggest another way of looking at emotion that is more compatible with the evidence. Emotions, in my view, are indeed encapsulated by a membrane - but it is highly permeable and its name is "adaptation," as we saw in the last chapter. Remarkably, the evidence shows that when positive and negative events happen, there is a temporary burst of mood in the right direction. But usually over a short time, mood settles back into its set range. This tells us that emotions, left to themselves, will dissipate. Their energy seeps out through the membrane, and by "emotional osmosis" the person returns in time to his or her baseline condition. Expressed and dwelt upon, though, emotions multiply and imprison you in a vicious cycle of dealing fruitlessly with past wrongs.

The summed-up point is: when shit happens, express it healthily to people close to you whom you trust, express it soon after the incident, and be done with it - that is of course assuming you want to express it at all. If you wish to keep it a secret and not share it with anyone, freely abandon any fear that it'll outwardly manifest in other ways. As Seligman says, and he knows what he is saying, left to their own devices all those negative feelings will dissipate over time and turn into nothing more than a faint memory of a bad incident. But dwelling on it, recollecting it to some curious bystander who pretends to comfort you by claiming it'll make you feel good to talk about it or "blow off some steam" long after the incident, is only going to make things go from bad to worse.

As I write all this, there's an avalanche of thoughts pouring into my mind on this topic, one particularly important (about the dangers and perils of reading psychology books by self-styled "pop" psychology authors who have no formal training in psychology). And there's another important topic, a sort of online experiment if you will, I wanna cover about how easily we can be influenced by a cluster of similar opinions, but it's too much to write and I'll save it for another time.

So for now, over and out, goodnight!

Wednesday, July 27, 2011

Coincidences


I've been a fan of the two Davids: David Foster Wallace (DFW) for a couple of years now, and David Lynch for longer. In the past year, my admiration for both of these guys has escalated exponentially. Generally, I try not to admire the artist himself/herself and instead constrain my admiration to the work alone. (This keeps me from forming biases towards the artist or against another artist, the way those idiotic Shahrukh Khan and Aamir Khan fanboys squabble over which of the two is the greatest human being in all of recorded history.) But I'd be lying if I said it isn't hard not to admire the Davids, especially when I'm reveling in their work to the point where I forget I'm having breakfast as I read or watch, and lose myself in it body and soul until I have no brain space left even to think of the food plate on my table.

Just recently, I was reading an essay by DFW online, and as coincidences do so often happen, the opening sentence of the essay had me hooked like a fish to the bait:

This is not because of anything having to do with me or with the fact that I'm a fanatical David Lynch fan from way back, though I did make my pro-Lynch fanaticism known when the Asymmetrical (studio) people were trying to decide whether to let a writer onto the set.

One of the two artists I fanatically admire himself fanatically admires the other of the two.

It's a small, strange postmodern world.

(The essay, on going back and reading the title which I usually skip, was about DFW's experience of visiting the shooting set of Lynch's Lost Highway, the number 4 film on my all-time favorite films list. Can be accessed here. The subtitle reads: "In which novelist David Foster Wallace visits the set of David Lynch's new movie and finds the director both grandly admirable and sort of nuts.")

Monday, May 30, 2011

On the importance of variety


I'm sure you've all heard and read - to nauseating levels in pop culture, sometimes - the saying that "variety is the spice of life." I'm sure at least some of you have read nerdlikejazzy's recent viral blogpost '18 Things I Wish Someone Would’ve Told Me at 18,' point number 4 of which elaborates on "exploring new ideas and opportunities often." In this post, which in all likelihood will be my last for the next 6 months owing to my busy schedule, I'll add a few points to support the imperative of varied experiences in life. Don't worry, I'm not gonna get into any heavy-handed, pseudophilosophical discussion about the meaning of life or the "painful journey" we call life. I'll work it out from a scientific perspective only, skimming the surface of the scientific basis that substantiates the claim.

The American psychologist Fred Attneave writes about the human brain:
"If there were just one brain cell to cope with each image that we can distinguish in all its presentations, the volume of the brain would have to be measured in cubic light years."

Meaning that only a tiny bit of information (information as in an image) can be stored in and processed by one brain cell, and since we come across literally infinite images (well, not literally!) each waking moment of our lives, each human brain would have to be the size of a galaxy to successfully process all the images (or other information) we see at any given time, all the time. But we know this isn't true, since we know how big our brains really are - anywhere from the size of a coconut, for most of us, to the size of a peanut, for some people like Arindam Chaudhuri and his disciples. Despite the astronomical size that would seem to be required, the brain processes all this information with extraordinary precision using a size that is but a billionth of that. How does it manage, though?

The brain has a workaround for this, discovered independently in the 1950s by Barlow and Attneave: it has evolved a very sophisticated and complex system for erasing redundancy. Redundancy is the opposite of information; redundancy is a measure of unsurprisingness, information a measure of surprisingness. To illustrate this with an analogy, imagine what a worthless waste of paper it would be if the newspapers reported every single day that the sun rose in the east that day. It's redundant information, and everyone knows without being reminded that the sun rises in the east every day. However, in the unlikely event that the sun rose in the west one fateful day, that would be news worth reporting. It would be a change from the daily routine of the sun's rising in the east, and the newspapers would not fail to report it the following day. Its information content would be high, measured as the 'surprise value' of the message. The brain works in a very similar way, regardless of the body sense it is responding to or the memory it is storing, since it has only a limited number of neurons to work with. In fact, the entire nervous system works this way. I'll explore two examples, of the senses in particular.
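For the mathematically inclined, that 'surprise value' is not just a metaphor: information theory measures it as the negative logarithm of an event's probability. Here's a minimal Python sketch of the idea (the probabilities are made up purely for illustration):

```python
import math

def surprisal_bits(p):
    """Information content (in bits) of an event that occurs with probability p."""
    return -math.log2(p)

# The sun rising in the east is all but certain: close to zero bits of news.
print(surprisal_bits(0.999999))  # ~0.0000014 bits

# The sun rising in the west would be a wildly improbable headline: lots of bits.
print(surprisal_bits(0.000001))  # ~19.9 bits
```

The rarer the event, the more bits it takes to report, which is exactly why the east-rising sun never makes the front page.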

The eye. Suppose you are looking at a black rectangle on a white background. The whole scene is projected onto your retina - you can think of the retina as a screen covered with a dense carpet of tiny photocells, the rods and cones. In theory, each photocell could report to the brain the exact state of the light falling upon it. But the scene we are looking at is massively redundant. Cells registering black are overwhelmingly likely to be surrounded by other cells registering black. Cells registering white are nearly all surrounded by other white-signalling cells. The important exceptions are cells on edges. Those on the white side of an edge signal white themselves, and so do their neighbours that sit further into the white area; but their neighbours on the other side are in the black area. Using a phenomenon called 'lateral inhibition,' the brain can reconstruct the whole scene with just the retinal cells on edges firing. There is a massive saving in nerve impulses. The eye tells the brain only about edges, and the brain fills in the boring bits in between. Once again, redundancy is removed and only information gets through. Any image you see, including these words as you read them, as well as anything else lying next to your computer, is reconstructed in your brain through this same process of inhibition of redundancy.
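To make the bookkeeping concrete, here is a toy Python sketch of the edges-only idea. It is emphatically not how retinal neurons actually compute (lateral inhibition is an analog, parallel process), but it shows the saving: only the positions where the value changes are transmitted, and the receiver fills in everything in between.

```python
def edges_only(row):
    """Keep only the positions where the pixel value changes (the 'edges')."""
    changes = [(0, row[0])]
    for i in range(1, len(row)):
        if row[i] != row[i - 1]:
            changes.append((i, row[i]))
    return changes, len(row)

def fill_in(changes, length):
    """Reconstruct the full row from the edge signals alone."""
    row = []
    boundaries = changes + [(length, None)]
    for (start, value), (end, _) in zip(boundaries, boundaries[1:]):
        row.extend([value] * (end - start))
    return row

# One scan line of a black rectangle on a white background:
# 20 white pixels (1), 10 black (0), 20 white (1).
scene = [1] * 20 + [0] * 10 + [1] * 20
changes, n = edges_only(scene)
print(changes)                       # [(0, 1), (20, 0), (30, 1)]: 3 signals instead of 50
assert fill_in(changes, n) == scene  # the 'boring bits' are filled back in perfectly
```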

You can see this lateral inhibition in action in the optical illusion below. Cover the central divide between the two shades of grey with your index finger and stare at your finger for a few seconds. Now both sides appear to be made of the same shade of grey, or very nearly the same. Now move your finger off the screen, and the difference between the shades drastically increases. Can you reason why?



The ear. Hearing (all audible sounds being analogous to images) works very similarly to the eye described above: the redundancy in the pitch of the sound is cut down, only the information about the variation in pitch is let through, and the brain then processes and reconstructs the sound - all in real time. But with sound there is an additional factor: time itself. To show just how dramatically the brain filters out redundancy, let me cite a study conducted by researchers at Rice University. Two groups of subjects listened to two sound pieces. The first sound, given to the first group, was of constant pitch, whereas the second sound, given to the second group, was of varying pitch. The groups were not told the duration of the sound they were listening to, and none of them had clocks or watches or any other way of measuring time. When, after listening, they were asked to guess the duration of the sound piece they had just heard, the subjects of the first group (the constant-pitch group) reported a duration much shorter than the one reported by the subjects of the second group - despite the fact that both sound pieces were of exactly the same duration! The first group's brains not only cut down on the redundancy of the constant pitch but also shrank the subjective perception of time associated with that redundant information. The second group's brains, because of the variation in pitch and thus more 'surprise content,' stored not only more information about the sound but also more of the overall time associated with it.
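You can get a crude feel for this redundancy gap with any general-purpose compressor. This is only a loose analogy (zlib is not a model of auditory memory, and the "signals" below are just made-up byte sequences), but a constant "tone" squashes down to almost nothing while a varying one resists compression:

```python
import math
import zlib

# Two fake "sound pieces" of identical length: one constant, one varying.
constant = bytes([128]) * 10_000
varying = bytes(int(128 + 100 * math.sin(i * 0.37)) for i in range(10_000))

for name, signal in [("constant pitch", constant), ("varying pitch", varying)]:
    compressed = zlib.compress(signal)
    print(f"{name}: {len(signal)} bytes -> {len(compressed)} bytes compressed")
```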

Where I am going with all these illustrations and citations should be clear at this point. The reason four years of college feel like four months is the same reason the brain remembers a constant-pitch sound as shorter than it is: repetitive, routine tasks are compacted by the brain into just a few typical, representative prototypes and stored in memory locations where the subjective perception of time is also cut short, in accordance with those compressed memories, thereby filtering out not only the redundancies of the experience but also much of the perception of the time the experience is made of. This, essentially, is how our brain retains boiled-down memories over an entire lifetime.

So, to counter this seeming 'drawback' in the design of our brain, to keep life from feeling short-lived and gone in a flash, to avoid the perception of time having flown like an arrow, it's most advisable to have varied kinds of experiences that the brain can register as information or 'surprise content,' to try new things if only for the sake of the feel, to break the patterns of tedious, recurrent routine that register as redundant, and also to collect as many photos, videos and souvenirs of the experience as you can - because remember this: there's only so much the brain can remember.

_

Tuesday, May 3, 2011

Glimpses of genius

The next suitable person you’re in light conversation with, you stop suddenly in the middle of the conversation and look at the person closely and say, “What’s wrong?” You say it in a concerned way. He’ll say, “What do you mean?” You say, “Something’s wrong. I can tell. What is it?” And he’ll look stunned and say, “How did you know?” He doesn’t realize something’s always wrong, with everybody. Often more than one thing. He doesn’t know everybody’s always going around all the time with something wrong and believing they’re exerting great willpower and control to keep other people, for whom they think nothing’s ever wrong, from seeing it. This is the way of people. Suddenly ask what’s wrong and whether they open up and spill their guts or deny it and pretend you’re off, they’ll think you’re perceptive and understanding. They’ll either be grateful, or they’ll be frightened and avoid you from then on. Both reactions have their uses, as we’ll get to. You can play it either way. This works over 90 percent of the time.

-The Pale King, pages 17-18, David Foster Wallace

Trivia: While googling a sentence from this paragraph to find the rest of it and thus save the effort and time in typing it out for this blog post, I discovered that dozens of other bloggers have blogged this exact same quote! (Example 1, Example 2, Example 3 , and many more...)

Monday, May 2, 2011

Source Code: A Review, or a Review of a Review

Under peer pressure or with sly secondary intentions, I’m being made to write this review by the two other, equally worthless assholes (Rohan & Himanshu) I watched this film with yesterday at the movies. I civilly oblige.

So, yeah. First things first. Unlike the two of them, who thought this film didn’t live up to the hype (what with all the raving reviews and a 90% rating at Rotten Tomatoes, the same site that gave Inception an appropriate 75%), I thought it surpassed the hype by a long way. Actually, leaving popular hype where it belongs (in the shitbucket), Source Code surpassed my own expectations of Duncan Jones, whose directorial debut, Moon, being awesome and everything, had me looking forward to his next project for about a year. And having now seen this celluloid masterpiece, I’m looking forward to his next project, Mute, even more than I did to this one.

OK, my personal raving aside, time for that which is implicitly promised when I review: less discussing and more diss-cuss-ing. Source Code is as bad a title for a film as it is for the sci-fi invention used in the film. Source code, as all of you proper, would-be and of-sorts engineers know, is a generic term for the text of a computer program. To name your biggest, game-changing technological creation Source Code makes me question your judgment as well as your credentials as a PhD who can tap into the residual consciousness of the dead. Additionally, naming your film after a dry, common software-engineering term can have anyone – engineers and non-engineers alike – assume, or even judge before actually watching it as most do, that it’s a computerish or softwarish or geekish film, none of which it could be further from. For future reference, Jones, use some creativity, of which you are evidently brimful, when naming your films.

And now the Holy Grail of film critiquing: Masand ki pasand. To put it in simple terms, Masand’s one review of Source Code has more insights and penetrating observations, and a rating that is objectively accurate to the fifth decimal, than the planetary congregation of insights of all those who saw the film, made the film and wrote the screenplay combined. His unnatural eloquence and insightfulness are on display in what he aptly calls Masand’s Verdict (seriously, why call it anything else when it is exactly that – a verdict, not a review), as can be seen in this exemplary concluding statement of his Source Code verdict: "It’s not perfect." Oh man, what a bummer! It’s not perfect!

Masand’s ideal definition of a perfect film is Avatar, which he gave a singular 5/5 despite acknowledging, using his remarkable I’ve-seen-more-films-than-you-duh power of dissection, that the plot was so predictable as to numb your mind like anesthesia.

In his infinite wisdom, Masand also suggests that Inception lovers and dopers should go watch Source Code. At the risk of being exiled into oblivion for disagreeing with His Insightfulness’ suggestion, I recommend you suspend any expectations of seeing an Inception or some other pop film in this, and watch it with a blank slate of preconceptions. If you go in expecting that and, when the movie ends and the titles roll, realize it’s not another Inception or Déjà Vu, you are more likely than not to feel let down. Every film has its inspirations and aspirations drawn from past films – regardless of how original it superficially appears – but to hold those films up as a ruler to measure this one against, and to plant false ideas of what to expect, only leads to despair when the film fails to meet your expectations or goes in a different direction altogether. So, put simply, let the film thrill you in its own right.

But yes, both Chris Nolan and Duncan Jones – besides hailing from England, bah! – have a similar philosophy of filmmaking, though to varying degrees. Like Nolan, Jones doesn't patronize the audience by spoon-feeding the story or preaching moral lessons at them. While much is presented comprehensibly on screen, he leaves at least some aspects of the story to our imagination and lets us work out why this or that happened by following the clues scattered here and there. He did that in Moon and he does it in SC as well, in the much-discussed ending, and he does it so efficiently that the audience feels immensely rewarded on figuring out the why after gestating on the questions. More important than these intellectual puzzles, however, and this is the hallmark of a good director, he packs and transforms the whole story into an engaging emotional medium that connects the screen to the audience and lets the drama, the action, the compassion for the characters and the progression of mood flow. He does this consistently for the stretch of the film, and at no point did I feel “left out” or disconnected; I was involved in it; I had lost myself in it. That sadly doesn’t happen with most films I watch.

That, however, is not to say it was all sentimental goo. Far, far from it – subtlety is the key. If there is any director who can pull off a happy ending without making it repulsively cloying (refer: Rajkumar Hirani’s brilliant, amazing, soul-bonding, generation-defining body of work that is saturated with didactic moral chapters on the "right way to do things"), it is Duncan Jones. He has decidedly become an object of envy for me, and I hope to God it stays that way!

8.5/10. Yeah!

Note:
1. Anish Kapoor’s famous sculpture Cloud Gate is featured towards the end of the film – partly chosen for its reflective features, mirror reflection being an important plot device in this film, and partly for artistic purposes.
2. Michelle Monaghan is very cute, damn it!
3. If you do not grasp the answer to the lingering question that’s on everyone’s mind at the end of the film, read the explanation below (not mine). But read it only after you watch the film – it contains spoilers, even in the very reading of the question.



SPOILER ALERT:

Q. How does Colter survive longer than 8 minutes in the final transfer to the train bomb scene in the end?

A. Colter's consciousness is sent from his body, attached to the source code machine in the starting universe, to Sean's body in a newly created parallel universe each time the source code is "started". When Sean's body is killed in the parallel universe, Colter's consciousness is returned to his body in the starting universe (because of the link via the source code). Even if Sean's body isn't killed, Colter's consciousness is still returned to his body in the starting universe, and Sean regains possession of his own body in the parallel universe. Once Colter's consciousness returns to the starting universe, the source code team interrogates his consciousness (via the computer terminal) to find out what Colter has learned and then sends him to another parallel universe; rinse and repeat until they get the information they need.

When Colter is sent into a new parallel universe for the final time by Goodwin, he is able to defuse the bomb and arrest the bomber within the 8 minutes, and gets ready for the last kiss of his life just as the 8 minutes end. When Goodwin, in the original universe, switches off Colter's life support at the end of the 8 minutes, she has terminated Colter's body, but his consciousness is in the final parallel universe. By switching off Colter's life support she has severed Colter's link to his body, and he is forever trapped in Sean's body in the final parallel universe (which is what he suspected and wanted to happen). Only when Colter/Sean dies (presumably of old age) in the final parallel universe will Colter's consciousness finally die.

Monday, March 28, 2011

My Latest Pet Peeve


Another in a long line of my pet peeves: the phrase “open to/beyond criticism” has really started to tick me off. I first came across the term “beyond criticism” in the context of Oprah endorsing a book into the elite class of her Oprah Book Club (does she have anything in her life whose title doesn't contain her name, that self-indulgent, narcissistic bitch?), and I didn’t give it much thought back then, for reasons I can neither recall nor, if I could recall them, justify. But with the increasing frequency of this term in print – used often by critics and reviewers of literary, cultural, religious and political phenomena – my peeve detector has finally sounded the alarm.

Since I’ve come across this phrase almost exclusively in a literary context, I’ll stick to what I know. Calling a book or some other work of literature beyond criticism categorically submits that any opinion or point of view opposing the established awesomeness of the work can either be discredited, because the (insecure) majority who pushed it past the unsafe zone of criticism disagree with it, or be restricted from being uttered altogether. Now, “open to criticism” can superficially come off as a critically superior term that, as it manifestly suggests, opens the book to criticism and allows novelty and variety of opinion to be expressed, unlike the other phrase; but really, it is just an elastic extension of ‘beyond criticism’ that, when let go, snaps back to the intolerant connotations of its base phrase, ‘beyond/above criticism.’ Being ‘open to criticism’ insinuates that a work can also, at some time in the future and with sufficient consensus attained, reach the point of being ‘closed to criticism’ – just a fancy way of saying it is now beyond criticism. Well, fuck that! A phrase which fundamentally grants any work of art that privilege is illiberal and despicable in equal measure. At what point, may I ask, does a work go from being ‘open to criticism’ to ‘beyond criticism’? In my book, never. As long as I am alive, no book ever written is above or beyond or closed to criticism!

This pet peeve, however, as I mentioned earlier, is highly context-specific. I have no qualms with the phrase being used in a few other areas of human affairs, such as science. That the earth goes around the sun and not the other way around is above and beyond criticism, and that evolution is true hasn’t been open to criticism for well over a century and a half now.

I want to go on about how blinkered we Indians are about other people’s outlook towards something that is different from our own and how closed we make our objects of praise to criticism. But that could, true to our Indian form, make this into a pissing contest and flame war, so I’ll conclude the pet peeve with my original intention of just expressing the peeve, and nothing more.

Monday, March 7, 2011

Response to a response to my Bicycle Thieves review

So hello again, Alexander. At the outset, sorry for the delay. I'd been going through a rough patch for the past two months (you know, breakup and everything), but I'm done with the grieving stage and am back to form now. Except for the middle finger of my right hand, which I broke this morning by stubbing it on a door, but that shouldn't cause much discomfort in typing this out. I'll try to keep this reply brief.

(For the uninitiated, the quoted parts are Alexander's reply to my previous post.)

Hi there, buddy! Remember me? I actually still check on your rants from time to time. I find it interesting that you write that many considered classics supposedly contain "symptoms of pretentiousness". So, am I right to assume that you dislike much of what is considered 'classic' from the pre-Kubrick era, because you find many of those films to contain "symptoms of pretentiousness" which takes away from the story? Or do you simply mean that it's just hard to find representative elements in any one film from that particular era, because of the mentioned symptoms, which in turn makes it harder to get into, and therefore just not worth the effort?

I think I'm more inclined towards the latter. Cinema has evolved over 100 years. Today's directors have a much larger database of do's and don'ts, accumulated over those 100 years from hundreds of thousands of films, and that gives them an edge over the directors of the first half-century as to what works and what doesn't. Kubrick had a very significant influence on the following generation of filmmakers, more so, in my opinion, than any other director of the second half of the 20th century. Following in Kubrick's footsteps, many filmmakers have mastered the art and craft of filmmaking, most famously Steven Spielberg, who admits it himself. But in my experience of watching films, I rarely come across films in the pre-Kubrick era that can, in totality, stand up to the post-Kubrick era, when such films began to rise in number. That said, I will be quick to admit that even with this apparent lack of totality of craftsmanship, without the pre-Kubrick films there wouldn't have been any influence or ground rules for Kubrick himself to build on, and there is no way in hell he could have started everything from scratch all by himself.

I am mostly asking because of this line: "but it’s hard to find much of what is good about cinema in a single film in many films before the 1960s." So, let's get right to the 1960s and forward. Because I'd say Kubrick produced some pretentious works, by your logic, if I am to understand you correctly. '2001: A Space Odyssey' is, to me, the very same thing I'd interpret you to describe as "pseudoartistic babble". Same goes for 'Eyes Wide Shut' and 'The Shining'. Movies that tell a story that really doesn't seem to make sense to me in a conventional way.

The Shining was not meant to be an artistic film. Because of Barry Lyndon's commercial failure (even though artistically it was highly acclaimed and still is one of the "basic references" for period filmmaking), a disappointed Kubrick wanted to make a film for a larger audience, with more entertainment value than artistic ambition, while still keeping his own creative identity as an experimental filmmaker and avoiding the conventional way of telling a horror story. Eyes Wide Shut succeeds on many levels for me while failing on many as well. Kubrick tried to carry the tone of A Clockwork Orange into this film, with long pauses within and between dialogues, but it just didn't work the way it did for ACO. The dream-like tone he intended did not come across as very dream-like, just laborious to watch. It did, however, succeed in its occult imagery - the kind of impeccable craft that makes Kubrick films what they are.

Or perhaps you don't like all Kubrick films? Please correct me if I'm assuming too much.

I don't, mate. In fact, I don't even like all the Kubrick films I do like to the same degree, and even in the ones I like, with the exception of 2001, which in my opinion is pitch perfect, there are parts I think were not directed to his true potential, especially in his last two films. And Lolita is my least favorite of his works, so much so that I refuse to believe he even directed the film. Assuming they hired some ghost director and used Kubrick's name for branding purposes gives me solace :D

Don't get me wrong, I love those movies. But isn't '2001: A Space Odyssey' something you'd imagine Fellini could've done? Initially, I got that '2001...' is all about telling a story through pictures. But there is no initial plot, and no depth to any of the characters. There is really nothing at stake. It's highly original for its time, because it breaks away from conventional storytelling. The point of '2001...' to me was the beginning and end of life, and the infinity of the universe. Humans played a smaller part, because the movie's main point to me is how insignificant we really are in the whole of the universe. '2001...' isn't supposed to have an epic ensemble of characters with a quest, because the bigger whole is the journey.

To answer your first question: no, I don't think Fellini or any other director could've done 2001, just as I don't think Kubrick could've done 8-1/2. Every director has his own directorial signatures, some of which are perhaps unknown even to him, and it is usually those that give a film the feel it has. Replacing directors and expecting to get the same feel out of a film would be ridiculous.

I can go on and on about the depth of 2001, but I'll limit myself to the points you raise. Even to this day, a large majority of those who have seen 2001 over the span of 40 years think the only commendable thing about the film was its groundbreaking use of visual effects. While that is true, it is only the cult following the film has generated over those 40 years that can truly appreciate the depth of its scope and meaning, and how well it incorporates Friedrich Nietzsche's ubermensch philosophy into an unconventional story without losing the essence of the philosophy or the artistic and dramatic merit of the film's own story. It's funny you should say that humans play a small part in the film, because the entire film is about nothing other than humans: the evolution of humans from apes to the quintessential ubermensch (which I read the film as pointing to through the giant black monolith, the Star Man), and the relationship between humans and the universe. There is not a moment - not one tiny fraction of a second - in the film in which humans are not discussed, not through words but through the subtlety of non-verbal storytelling. HAL is shown to have more human characteristics than Dave Bowman, and Dave is shown to have more mechanical characteristics than HAL - a reference to the futuristic transformation of man into ubermensch.

There are clues all over the film about its meaning, and with each clue discovered, a new layer is uncovered, opening the floodgates to a whole range of possible interpretations. For example, the symphonic piece used in the beginning, at the end, and when the ape discovers the use of the tool is Richard Strauss' Also Sprach Zarathustra, a direct reference to Friedrich Nietzsche's Thus Spake Zarathustra - the foundation of the ubermensch philosophy on which the film is based. (You really need to be familiar at least with the gist of that philosophy to get a grasp of the meaning of the film.) Or take the very title, 2001: A Space Odyssey, which references Homer's Odyssey, with which the story shares a great many major plot points, the most obvious one being Dave Bowman killing HAL with just a screwdriver, as the protagonist of the Odyssey kills his adversaries with just a bow. The film has as many interpretations as there are people who watch it, which adequately accomplishes the aim of Kubrick, who, on finishing the film and before its release, said:

"It's not a message that I ever intend to convey in words. 2001 is a nonverbal experience; out of two hours and 19 minutes of film, there are only a little less than 40 minutes of dialog. I tried to create a visual experience, one that bypasses verbalized pigeonholing and directly penetrates the subconscious with an emotional and philosophic content. To convolute McLuhan, in 2001 the message is the medium. I intended the film to be an intensely subjective experience that reaches the viewer at an inner level of consciousness, just as music does; to "explain" a Beethoven symphony would be to emasculate it by erecting an artificial barrier between conception and appreciation. You're free to speculate as you wish about the philosophical and allegorical meaning of the film -- and such speculation is one indication that it has succeeded in gripping the audience at a deep level -- but I don't want to spell out a verbal road map for 2001 that every viewer will feel obligated to pursue or else fear he's missed the point. I think that if 2001 succeeds at all, it is in reaching a wide spectrum of people who would not often give a thought to man's destiny, his role in the cosmos and his relationship to higher forms of life. But even in the case of someone who is highly intelligent, certain ideas found in 2001 would, if presented as abstractions, fall rather lifelessly and be automatically assigned to pat intellectual categories; experienced in a moving visual and emotional context, however, they can resonate within the deepest fibers of one's being."

It is the last sentence, "experienced in a moving visual and emotional context, however, they can resonate within the deepest fibers of one's being," that truly resonates with me and with the millions of others who can go beyond the supremacy of the visual effects in their appreciation of the film.

But as far as technique goes, I'd apply that to any Fellini movie, as well.

Fellini's 8-1/2 does this really well, if you ask me. The intro scene in 8-1/2, tells me that the main character is kind of... well... fucked up. Kind of stressed out, as well. His celebrity status is giving him no room to breathe. His stressful everyday life is taking its toll, and he is getting middle-age burn out. Beneath this charming older gentleman, lies a hidden sorrow, which I believe is portrayed pretty well using flashbacks of his childhood, and goofy dreams, etc. And, it is a show. Why not make it a show?

8-1/2 does, perhaps, leave more to our own interpretation than '2001...', but I think it's unfair to label it "pseudoartistic babble". Point is: I think the way 8-1/2 was made is important to tell the story of this character, even if it does skip and trip.

And, I'm only bringing 8-1/2 up because you used it as an example. I like Fellini, but I am way more into Pier Paolo Pasolini & Bernardo Bertolucci movies. I also like all the goofy Giallo flicks directed by Dario Argento, and Lucio Fulci.

But hey... that's me.

I know this is your blog, and you are free to form simple, to the point, personal opinions about anything. And, that the point of this post is the review of one particular film, and not your reflection on 'cinema' in general. Having said that, I do find it unfair to dismiss a film, or several films from an era, based solely on some of the reasons you've stated.

As for Kurosawa movies... I love Kurosawa movies, but my comment is already too long.

Fellini was actually one of the six directors Kubrick cited as his biggest influences on making films (the other five being David Lean, Ingmar Bergman, Vittorio De Sica, François Truffaut, and Max Ophüls); and clearly, without 8-1/2, Kubrick wouldn't have had the inspiration and direction to make 2001. So perhaps I was a little too harsh in judging 8-1/2, and I apologize if I hurt your sentiments towards the film. It's just that I saw 8-1/2 during a time when I was very, umm, anti-sophisticated-artistic-talk in literary, stage or film work, and 8-1/2 does that a lot, with frequent talk about different so-called "movements" of cinematic expression, which I found very pretentious both intellectually and artistically. I did like the surrealism of the opening sequence and was hoping that what followed would be along those lines, but I had my hopes up too high and the film went in a different direction altogether. Maybe, after another watch of 8-1/2, I'll have a change in perspective, so I will look into it when I can.

By the way, a brief observation I made on 8-1/2 the first time around: whenever a writer suffers from writer's block, he ends up writing the story about himself, about his own inability to compose the work. Fellini was going through writer's block back then and ended up making 8-1/2 about his mental state at the time (without, I hear, a definitive script). Charlie Kaufman did the same thing with Adaptation. They perhaps don't even know that they are following this pattern, but they end up with a work that is full of themselves. The novelist William Gass made this observation very early in his career and wrote The Tunnel, his magnum opus, which took him 27 years to write, and in which a historian of Nazi Germany, while writing a work called Guilt and Innocence in Hitler's Germany, suffers from writer's block and ends up writing it about himself and his own feelings.

Anyway, that's that. Since you mentioned a few directors: I do think Dario Argento, while certainly one of the first directors to push the limits of gore allowed in cinema, is a highly overrated filmmaker. Maybe Suspiria was good when it was released, in that age, but it bored the daylights out of me. And even though I can appreciate Kurosawa's contributions to film directing, in the end I find his films extremely tasteless and boring (yet original, I agree).

Monday, January 17, 2011

The Thieves Who Stole His Bicycle

Having not been impressed by Italian films despite my many attempts at being impressed, I finally decided to give The Bicycle Thief (1948) (or is it in the plural, as thieves? I still don’t know!) a try – my final shot at Italian cinema, failing which I would erase Italy off my copy of the world map. At the outset, I was expecting nothing more than a pretentious parade of pseudoartistic babble along the lines of Fellini’s 8-1/2, also considered an Italian classic. But as the minutes went by and the story unfolded layer by layer, I stopped rolling my eyes, began to show a hint of interest, and toned down my condescension towards the filmmaker little by little. At the end of the 90 minutes that followed, I was left with an expression that can best be put in the words of Col. Walter Kurtz (from Apocalypse Now): “My God, the genius of it!”

The genius of it lies in its unusually simple yet powerful plot, free of any sign or symptom of the pretentiousness so dominant in the so-called “classics” of world cinema. I don’t mean any disrespect to the black-and-white era, but with the exception of Orson Welles and, to some extent, Alfred Hitchcock, I largely avoid most pre-Kubrick films. Granted, there were innumerable contributions from a slew of filmmakers – some, like D. W. Griffith, more than others – over a stretch of several decades, but it’s hard to find much of what is good about cinema gathered in a single film before the 1960s. Even those who made significant contributions to film direction, like Akira Kurosawa, brought out such pathetically painful acting from their actors that it becomes almost impossible to watch the scenes, and I would rather just read the subtitles and pretend to have seen the film, as it were. But The Bicycle Thief, I’m glad to say, was a delicious surprise.

At its heart, the film explores our reaction to desperation – the presence of desperation can be subtly or strongly felt at every step of the movie – and how an external material object such as a bicycle can determine and control the life and livelihood of an entire family. The film follows a father and son’s vain attempt at finding their stolen bicycle, which would help them “live again” through the father’s new job – a job being a very lucky find in post-Fascist Italy. There is no heavy-handed symbolism and there are no overwrought metaphors at any point (although I suspect some pseudointellectual critics will read too much into it anyway – as they always do; please watch South Park Season 14 Episode 2 for a clever criticism of reading too much into symbolism; to quote Beckett, "No symbols where none intended"), and for this reason, the film has been rightly hailed as the first great neo-realist film of the century. It is realistic to the extent that the two central roles – the father and the son – were deliberately cast with non-actors.

Now, I am of the opinion that if symbolism is to be used in a story, it should be used in such a way and form that it is transmitted to the audience instantly, via an emotional channel, not through an intellectual dissection of “what could this represent” to reach the meaning. That is the difference between real art and pseudoart, and that is what art is about – an emotional transference of ideas and opinions, not an intellectual social theory or puzzle. The most glaring example of this in the film is the way it ends (a device used to the point of being hackneyed in movies today, though The Bicycle Thief was the first film to use it): the main character walks with the crowd, slowly blends into it, and becomes unidentifiable in it, until eventually we see nothing but the crowd – meaning that the story you just saw could be the story of any person in that crowd, and therefore of anyone, including you.

Critics initially observed that this is the story of a typical man in post-WWII Italy. But really, that is a very narrow way of looking at it, since the characters, the setting, and the emotions the plot evokes are so identifiable and realistic that the story is almost universally true, and for all time. 8.5/10.

The frustration in their faces: a rational response to the desperate circumstances they are in

Monday, January 3, 2011

The monkey who sold his almari

OK, so it’s a Monday night (my free night) and I’m supposed to be watching a movie, but since I already saw True Grit in the morning (whose epic disappointment I will come to shortly), watched like 7 episodes of In Treatment back to back, and read a few pages of The Recognitions, I no longer feel motivated to indulge in the lives of fictional characters. I’m bored. So please bear with my ramblings for a bit.

Coming back to True Grit. Colossal failure. Even that’s putting it mildly. There are so many things wrong with this film that I can only begin to describe how much it has let me down as an anticipated film, much less as an anticipated Coen brothers film. The trailer really had me going: I thought, this is it, we have the cinematic event of the year right here. The tone, the magical cinematography of Roger Deakins, the utterly awesome Jeff Bridges being directed by the uttttterly awesome Coen brothers – there was no way this could have been anything but pure gold blended with purer gold. But as it turns out, the entire film, set in 19th-century America – which in itself can be annoying to watch, given all the local colloquialisms we cannot follow – wallows in cheesy, corny sentimentality layered over a used-to-death, cheesier, cornier background score. Halfway through, I was just hoping for the film to end. It was that friggin’ boring! (I wonder what IMDb users found in it that they've given it an 8.5 rating. Maybe they just went with the whole hype that if it's a Coen brothers film, it will be awesome, so expect awesome and pretend to see only awesome, even if it's non-existent.) After a hat-trick of awesome films in 2007, 2008 and 2009, the Coen brothers have finally jinxed their momentum with this shite. Thank you for ruining the first movie of 2011 for me, Ethan and Joel Coen!

Now, about In Treatment. Nah, I think I’ll cover it in greater depth in another post, because there’s so much to say about its unparalleled, engaging drama that I just can't cover it in a few minutes, in a few words, and in this suicidally bored state of mind. That’s enough rambling for now.

2010 was a bad year for me. But I’m glad it’s over. Happy new year!

A joke to begin 2011 with:

An atheist and a Christian Jesus freak are having an exchange.

Jesus Freak: Do you believe in Virgin Birth?
Atheist: No, but I believe in Virgin Death - which you will encounter soon if you keep asking questions like that.

:)