Skyfall Countdown Day 24: Dr. No

“Bond… James Bond.”

It’s been a bit of a dry spell for us fans of James Bond of late, a drought not seen since the dreaded 1989-1995 hiatus when a combination of lawsuits, hostile takeovers and general public ennui made it seem like 007 had had his day.  The financial woes of the legendary MGM have kept Bond off the big screen since 2008, but as anyone who’s seen the trailers for Skyfall can attest, he’s ready to roar back in a big way, with Academy Award-winning director Sam Mendes at the helm and a powerhouse A-list cast including the likes of Javier Bardem, Ralph Fiennes and Albert Finney.  It occurred to me this morning that there are 24 days until the movie is released here in North America, and that there have been 24 James Bond films preceding this one (if you include the “non-official” 1967 Casino Royale and Never Say Never Again).  What better way to celebrate this new Bond than to review one 007 adventure a day culminating with my take on Skyfall (because you know I’ll be there on opening night)?  So let’s get down to it then – with the movie that started this 50-year rollercoaster ride.

Dr. No seemed an unlikely choice to kick off the film series in 1962 – it was Ian Fleming’s sixth James Bond novel and hardly the most cinematic of the ones he had written up to that date – to say nothing of that oddball title, a moniker probably more suited to a goofy 1930’s Flash Gordon-type serial.  True enough, producers Albert R. “Cubby” Broccoli and Harry Saltzman had wanted to make Thunderball first, but it was tied up in litigation.  And the unknown Sean Connery was not anybody’s first choice for the leading role – Fleming himself wanted David Niven, and offers had been rejected by bankable stars of the day like Cary Grant, James Mason and Patrick McGoohan.  Yet it’s difficult to imagine any of them defining the role the way Connery did, particularly in his introductory scene.  There’s a sort of laissez-faire to the way Connery announces “Bond… James Bond,” cigarette dangling from his lips, like he just doesn’t give a rat’s arse whether you care who he is – he’s that confident in his awesomeness.  (One can imagine Grant delivering the line with his customary wink and smile – James Bond would have been Cary Grant, not the other way around.)

There has been a copious amount of criticism written around the “James Bond formula” – the exotic locations, the women, the cartoonish megalomania of the villains.  Many of the elements are introduced in Dr. No, but almost seem like they’re in rough draft form; indeed, it’s difficult to look at the movie objectively 50 years on.  The plot is probably one of the simplest of the film series – a British agent is murdered in Jamaica after looking into reports of radio interference with American space launches, and James Bond is sent in to investigate.  Bond is assisted by CIA operative Felix Leiter (Jack Lord) and local boatman Quarrel (John Kitzmiller), and eventually crosses paths with the half-German half-Chinese, handless Dr. No (Joseph Wiseman), agent of SPECTRE (Special Executive for Counterintelligence, Terrorism, Revenge and Extortion), who is using his private nuclear reactor to knock the American rockets out of the sky.  And of course there’s eye candy in the form of Eunice Gayson as Sylvia Trench, Zena Marshall as Miss Taro, 1961’s Miss Jamaica Marguerite LeWars as a photographer, and most famously, the voluptuous Honey Ryder (Ursula Andress), whom Bond famously encounters as she strolls out of the ocean in a white bikini, knife on her hip, singing “Underneath the Mango Tree.”

Dr. No is a tough sell to modern audiences if it isn’t the first Bond movie you’ve ever seen.  It was made on a shoestring budget of $1 million (nowadays, that wouldn’t even pay for a third of an episode of CSI) and a lot of it does look very cheap.  The acting is pretty painful across the board, and Connery himself tends to flap his gums and yell his lines as he tries to figure out the character, not yet realizing that intensity doesn’t require volume.  Andress begins a long tradition of Bond girls having their lines completely dubbed by another actress, and the effect can be greatly distracting.  Apart from Wiseman, who is aware of his character’s cartoonishness and underplays to compensate, none of the villains are terribly menacing.  The fight and chase scenes are nothing special.  The “dragon tank” is a goofy excuse for a prop that belongs on Gilligan’s Island.  The latter half of the film, once Dr. No finally enters the picture, slows down and drags where it should be building tension to a breaking point, such that the climactic battle between Bond and the villain seems a bit like an afterthought.  Aside from the singular James Bond theme (which is regrettably hacked up in the opening credits), the musical score is cheesy and instantly forgettable.  Yet compared to the lavishness of some of the later films, there is a rawness to this adventure and more of a sense of Bond as a bruiser of a man relying on his skills, wits and fists to extricate himself from sticky situations, rather than the finely-tailored dandy with nary a hair out of place who always has the right gadget at the right time.  When a bloodied, battered Bond is crawling through an air vent to escape Dr. No’s lair, you truly worry whether he’s going to make it out alive.
And there are several memorable scenes that help to define Bond as a new kind of morally uncompromising hero, most notably when he shoots an unarmed man in cold blood, and callously turns a woman he’s just slept with over to the police.  Bond is always at his best when he’s being an unrepentant badass.

In most recaps of the Bond series, Dr. No tends to rate around the middle, which is where I’d probably place it.  It’s a little low-key for how I like my James Bond, and really shows its age in certain places, particularly in its pacing.  It has not yet acquired the panache and greater sense of fun of the mid-60’s Bond pictures, and the cheapness of its budget is quite evident throughout.  In recipe terms, Dr. No is a soufflé with all the right ingredients that doesn’t quite manage to rise all the way.  But you certainly cannot deny that without it and its success to set the stage, we would never have had the James Bond that we’ve grown up with all these decades and continue to love.  That alone tends to earn it both a pass for its faults, and a greater appreciation of what it is – a competently-executed thriller bursting with promise for what is to come.

Tomorrow:  From Russia with Love raises the bar.

Selling out circa summer 2012

Like many things in music, The Who did it best.

What is the most annoying trend in popular music?  With YouTube and Auto-Tune making celebrities out of individuals who should never have come anywhere near a microphone, and genuinely talented singers continuing to struggle for any semblance of a break that doesn’t require an uncle in a senior management position with a record company, how could we possibly distil popular music’s faults down to the most egregious offender?  It’s ultimately a matter of opinion, but if I had to pick a single irritant that most damages my appreciation for today’s sound, it’s musicians recording multiple versions of their songs for different markets.  Nothing is more insulting to listeners than this shameless pandering to commercial interests.  Every time you hear one of these bowdlerized abominations oozing through your speakers, you can feel the greasy fingerprints of the Armani-suited marketing committee as they scrape at your eardrums.  Worse, though, are singers and bands bringing material to the studio they know they’ll have to re-record to ensure maximum market penetration (an apt metaphor if there ever was one).  It speaks of greed, cynicism, contempt for the fans and a fundamental lack of anything resembling artistic integrity.  And the worst part is, it’s totally unnecessary.

One of the big hits of the summer is Maroon 5’s “Payphone.”  Maroon 5 was every mother’s favourite band for their teenage daughters:  catchy and inoffensive with an easy-on-the-eyes lead singer.  They faded away somewhat after their initial explosion onto the scene but are experiencing a resurgent popularity with Adam Levine judging on NBC’s The Voice and their infectious smash “Moves Like Jagger.”  But “Payphone” is an embarrassment.  It’s whiny emo nonsense that rings completely false – the complaints of a fifteen-year-old upset that his crush doesn’t love him anymore, with no more depth than a chewing gum wrapper.  Most irritating about the song, though, are the final two lines of the chorus:  “All those fairytales are full of shit, one more fucking love song I’ll be sick.”  What’s that, you say?  I must be making this up, you haven’t heard that?  Of course not – the radio version, the one you’ve heard, goes “All those fairytales are full of it, one more stupid love song I’ll be sick.”  And it isn’t Godzilla-esque bad dubbing either – Maroon 5 deliberately recorded two different versions of this line.  The reason?  They knew the line as originally written wouldn’t be played on adult contemporary radio, and that’s a huge audience to forfeit for the sake of some naughty words.  But that’s the thing – why did those words need to be in there in the first place?  The song isn’t great, but at least the message gets across without the potty mouth.  And don’t tell me it’s to express the depth of the singer’s anger; Gotye’s “Somebody That I Used to Know” is a much more honest scream of contempt at the woman who’s left him and contains absolutely no profanity (depending on your opinion of the weight of the word “screwed”).  “Payphone” is juvenile, a kid giggling at the dirty picture he drew on his school desk, and Adam Levine et al. should know better.
And I say this as someone who admired Levine for telling off Fox News on Twitter after they used a Maroon 5 song in one of their promos.  However, swearing in their songs is just making the case for the likes of L. Brent Bozell and whatever suspiciously well-funded “Parents” group wants to fundraise for the evangelical right on the backs of those evil Hollywood liberals corrupting your children again, and the willingness to record and release a sanitized version for mainstream radio play is evidence of the emptiness of their commitment to branding themselves as rebels, badasses or whatever the point of dropping the F-bomb in the original version was.

“Payphone” contains another example of what pop songs do to try to broaden their customer base:  include a guest rapper in the middle eight.  A few of the singles from Katy Perry’s Teenage Dream contain rap:  “California Gurls” features Snoop Dogg and “E.T.” features Kanye West.  Not that you’d know it if you’ve only heard these on the radio – they play the version where, like with profanity, the rap section has been neatly sutured out for popular consumption, in the studio long before your local DJ gets his hands on it.  I have nothing against rap or the blending of genres (Aerosmith and Run-DMC’s “Walk This Way” collaboration continues to be awesome twenty-five years on), but these aren’t it.  These are stitch jobs.  In all likelihood the rapper and the main performer aren’t even in the studio at the same time – the result is a Frankenstein’s monster of a track where disjointed parts are cobbled together for commercial appeal rather than coherent performance.  The fact that usually the rap can be lifted out without any significant effect (or even notice – it was months after I first heard “E.T.” that I discovered Kanye was on the original version) speaks to the argument that forcing it into bubblegum pop is misguided, cynical marketing at its most insidious – a way to ensure that even though we’ve got the white kids, there’s something for the black kids too.  More to the point – if the artists know they’re going to have to cut the rap for full radio exposure, why include it in the first place?  The other reason you know this whole phenomenon is marketing B.S. is that it’s never done the other way; sorry to those of you eager for that Jay-Z featuring One Direction number.  Here’s a radical thought – why not just write a better song that can appeal across color lines without pandering to them?

Since there is so much cross-pollination and cross-promotion of entertainment products these days, why not take pop music philosophy and apply it to novels?  (Oh wait, they’re already doing that – witness Pride and Prejudice and Zombies.)  But how ridiculous would it be if, for example, George R.R. Martin’s A Game of Thrones came in both regular and sanitized versions, the latter where anything potentially offensive to Aunt Ethel was eliminated, so that Cersei and Jaime Lannister are just good friends, Bran fell out the window on his own and Eddard Stark died offstage due to a nasty throat infection?  Or if somewhere about two thirds of the way in we had a guest chapter authored by Stephenie Meyer where Sansa mopes over the sparkly Tyrion, because we have to make sure to get the youth vampire audience in as well.  Better yet, let’s do this in movies.  Let’s have the second act of The Dark Knight Rises directed by Brett Ratner featuring Chris Tucker as a wise-cracking Gotham City police officer and Jackie Chan as his kung fu master partner taking on Bane (“When you touch my goddamn radio, y’all have my permission to die!”)  Does that sound like anything we’d want to read or see?  Then why do we let musicians get away with it?  Chopped up, bastardized and sewn together alternate versions of songs ultimately please no one and only embarrass the artist.

In the end, quality is quality, and it begins from the ground and proceeds organically – piling stuff on top after the fact, or half-assing out a different version, is a sign of a last-minute lack of confidence fueled by focus groups and marketing gurus who need to look up from their spreadsheets.  Like books and movies, there should be one song, and one song only.  Putting out multiple versions for different demographic markets only reinforces the concept of music as product – the last thing I suspect anyone who fancies themselves an artist wants to admit.

It’s not a great show yet, but it can be

The Newsroom has taken a lot of flak in the press for being too similar to what Aaron Sorkin has done before – a workplace drama where characters race through halls and corridors, their words flying at the same breakneck pace as their feet, while sermonizing about everything that’s wrong with the world and about the nobility of trying to fix it.  Well, what can you say, really – the man has his wheelhouse.  We’ll probably never know for certain the exact details of why Sorkin left The West Wing in the hands of John Wells after the fourth season, but I believe that he missed writing it.  On the DVD commentary for the final episode he penned, he hints at having an alternate resolution for the storyline where President Bartlet’s daughter is abducted and Bartlet steps aside to allow the Republican Speaker of the House to serve temporarily as President until she is found – but ultimately chooses to hold his peace and not pass judgement on the version penned by Wells.  When Studio 60 on the Sunset Strip came out, the old West Wing tropes crept back into a series that was ostensibly about something a light-year removed from Washington politics – a Saturday Night Live-esque comedy show.  But when Matthew Perry’s unapologetic liberal Matt Albie and Sarah Paulson’s sorta-conservative-but-not-really Harriet Hayes got into a debate on the beliefs of their respective political parties, it was almost a flare going up from Sorkin indicating that he’d rather be putting these words in the mouths of Sam Seaborn and Ainsley Hayes.  Cementing this notion, the final four episodes that closed Studio 60’s only season were an extended plot about one of the characters’ brothers going missing in Afghanistan and the rescue operation to find him.
You could tell that the limitations of potential plots about sets breaking down and guest hosts showing up drunk were chafing Sorkin’s desire to tell big, consequential stories, and by the time he knew the show was on the way out he didn’t care to make the distinction anymore.

The Newsroom is a kind of hybrid of these two disparate beasts – a show about television that now has a logical reason for dealing with political stories.  Sorkin’s thesis is that news on both the left and the right has lost its way, that scoring points and sucking up to corporate and political interests has become more important than the reporting of the truth and the willingness to challenge people on their obfuscation and misinformation.  He’s not wrong in this, even though the right is the guiltier party on this score (for all the bitching on the right about MSNBC, it is not a blatant propaganda mouthpiece for the Democrats the way Fox News is for the Republicans).  As conflicted anchor Will McAvoy, Jeff Daniels has a great moment in the pilot when he turns to the left-wing talking head (seated, unsubtly, on his left) and tells her that no one likes liberals because they lose all the time.  Again, as a liberal eternally frustrated by our collective inability to explain our message succinctly and stick it to people who don’t agree with us the way conservatives do, this is manna, something that desperately needs to be said, understood and acted upon.

But the show isn’t meant as a wake-up call to the left any more than it is a strict smackdown of the right.  It’s a request to both sides to do better.  For liberals to find their balls, and for conservatives to find their sense of decency.  Sorkin wants the debate – he wants both sides to present their ideas in their purest, most robust, intellectual form, bereft of political gamesmanship and the “my dad can beat up your dad” state of current discourse.  As a news anchor, McAvoy is positioned perfectly, in Sorkin’s view, to act as arbiter of this hoped-for grand debate, to call out liars and steer the conversation away from constant appeals to the lowest common denominator.  As the show puts it, to tell truth to stupid.  What frustrates Sorkin most is that the only thing preventing this happening in real life is not a lack of resources or opportunity, but of will.  As Sam Waterston’s network boss Charlie Skinner puts it in the line that gives the title to the pilot episode, “we just decided to.”  We can just decide to.

Noble ambitions aside, how fares the execution?  Well, The Newsroom is not without its flaws, some of which may be chalked up to first-episode jitters.  The West Wing cast was considerably more seasoned than this starting lineup when they began chewing on the “Sorkinese” in 1999, and while old pros Daniels and Waterston are excellent (and it’s fun to watch Waterston play an old drunk who doesn’t give a rat’s ass after what felt like decades as stalwart integrity warrior Jack McCoy) the younger performers haven’t quite nailed the pacing of the dialogue – fast-paced banter among them feels like they are trying too hard to make sure the lines come out in the proper order, as opposed to sounding like the character thought of them first.

One of the great things about The West Wing’s pilot was how the ensemble entered the story individually, with distinct beats that gave you a great snapshot of who they were and what they might become, before they began to interact with one another and the plot built gradually to the climactic introduction of the President.  Not so here.  We’re thrown into ACN’s news bullpen with little sense of who is who and what their function is – perhaps that matches the chaotic feel of a real newsroom, but it doesn’t necessarily allow us to latch on quickly to types we want to identify with.  And this is a personal preference, but as someone who is not the biggest fan of obvious love triangles, it would have been preferable to see the Don-Maggie-Jim subplot develop gradually a few episodes in, instead of hitting us over the head with it in the first half hour, because now, dramatically, it doesn’t have anywhere to go.  Maggie is with Don and then might end up with Jim and of course Don won’t accept that and so on and so forth.  I’m still not quite sure what Don’s function will be going forward – he is supposed to be moving to another program but is still hanging around McAvoy’s “News Night” for the time being.  Anyway – easily my least favourite character, and the one with the greatest potential to be the Mandy Hampton of this series.

As for the other major player, Emily Mortimer as MacKenzie McHale, a few histrionic moments do not provide an adequate counterbalance to Daniels’ McAvoy.  She is, in this episode, as insubstantial as the phantom vision of herself that McAvoy thinks he spots in the back row of the auditorium.  If theirs is to be the pivotal relationship around which the show revolves, I’m hoping that we see more humanizing flaws as the weeks go by, and a little less of the idealized “news goddess” with forced moments of endearment.

As a devoted fan, I’m willing to cut Sorkin a lot of slack because I love the rhythm and spirit of his writing so much, and I empathize with his opinion on the excessive devotion major media gives to the stupid and the banal.  But he has to balance his criticism with the demands of drama, and in “We Just Decided To,” I think he’s fallen a wee bit short of the mark.  As I noted earlier, one cannot impugn his main argument about the state of the media.  But if your rebuttal isn’t firing on all cylinders, you open yourself up to accusations of pontificating, and Sorkin would be the first to admit that his ultimate responsibility is to entertain.  (As an aside, I wish he’d stop beating up on bloggers – really Aaron, some of us do like you a lot, and we’re not all the cast of One Flew Over the Cuckoo’s Nest chain-smoking Parliaments in our muumuus.)

Fundamentally, television is better when it challenges us, instead of regaling us passively with the embarrassing exploits of real-life rich families.  And it’s certainly better when Aaron Sorkin is on it.  When McAvoy is asked, at the beginning of the episode, why America is the greatest country in the world, he sees MacKenzie in the audience holding up a sign that reads “It’s not, but it can be.”  That phrase, I think, is the best judgment on The Newsroom for the time being.  The elements are all there to make a challenging and entertaining show, even if they haven’t quite jelled yet.  Hopefully audiences will have the patience to go along for the ride.  I certainly do.

Even if Sorkin still hates blogs.

We need to go darker

Katy Perry in the video for “Wide Awake,” conjuring some musical magic.

Katy Perry’s “Wide Awake” has been in heavy rotation for me all week long, an incongruity even sandwiched inside an eclectic playlist that includes Hendrix, Dylan, the Byrds, Tom Petty, Richard Ashcroft, Thomas Newman, Jerry Goldsmith, Mychael Danna and Hans Zimmer.  I cannot stop listening to it.  It accomplishes the remarkable feat of being both catchy and soulful, bruised yet full of hope.  Apart from innocently fancying Ms. Perry herself (which my Alexander Skarsgard-adoring better half assures me she’s totally okay with) I’ve been indifferent toward her music until now.  Her breakout hit “I Kissed a Girl” is the giggle of a nine-year-old too chicken to truly explore questions of confused sexuality lest her parents think badly of her.  “Firework” is a well-meaning song undermined by Perry’s inability to hit and sustain high notes.  The lack of proper rhymes in “California Gurls” and the Brady Bunch-esque misdeeds of “Last Friday Night” are a saran wrap-deep package unwilling to chafe against the very successful mould in which she’s been forged.

Then her marriage to Russell Brand broke apart, and she wrote, recorded and released “Wide Awake” as a meditation on what she’d been through and where she is now.  And it’s a great song.  This isn’t a pig-tailed goofy girl jumping up and down on a beach – it’s the honest testament of an emotionally bruised woman picking herself up off the concrete.  Katy Perry has established such a niche for herself that she didn’t have to record this song – she could have released yet another ode to partying in the sunshine and achieved plenty of accolades and album sales.  But she chose to try to say something profound about who she is and how she’s feeling about the world.

I’m not going to go faux-Lester Bangs and suggest that “Wide Awake” is a watershed moment in music.  But it illuminates a larger question that I think most artists grapple with.  Is introspection by its nature a journey of sadness?  Does something have to be dark to be good?  Is the stuff of genius found only in the minor chords?  There’s an old axiom that says all real comedy is born from pain.  So too does it seem that the best music is that which reflects lessons learned at great cost.  This is not to say that everyone gets it right – it seems that every Kelly Clarkson song is about breaking up with someone and being better off because of it, but unlike Katy Perry in “Wide Awake,” you get the sense that Kelly’s just reading the lines someone else wrote for her instead of feeling them through the notes, and that’s why, at least to my ears, “Wide Awake” will have greater staying power than the grating and empty “Stronger (What Doesn’t Kill You)”.

Bob Dylan told John Lennon when they first met that he needed to get personal in his lyrics.  You begin to witness the transformation through the Beatles’ middle period, as songs like “I’m a Loser” on Beatles for Sale and “Help!” lead to angry kiss-offs like “Norwegian Wood,” the existential exploration of “Nowhere Man” and the psychedelic dream state of “Tomorrow Never Knows,” and the Sgt. Pepper era gives way to the truly dark, soul-baring primal-scream anguish that closed out the Fab Four and realized itself fully in John’s solo career.  Had Lennon and the others chosen to rest on their laurels and sing nothing but upbeat generic pop for their entire careers, they might have done very well.  They might still be touring casinos and retirement homes today.  But they wouldn’t be legends.  It was their choice to share their vulnerability, their humanity, that made them so – the gods who dared to admit they were the very same as the mortals who worshipped them.  In the documentary Imagine, there’s a scene where Lennon confronts an obsessed fan who is trespassing on his property, who wants to know how Lennon could have known so much about this fan’s life as to write songs that seemed to be about him.  Lennon responds, frankly, that “I’m singing about meself.”

The stories that have the deepest impact on us are tales of catharsis; of people like us who are tested to the limits of their endurance, who go all the way to the point of breaking and come back changed, improved, and renewed.  To find the brightest light, one must brave the darkness, because it is only in the dark that light can shine.  Every artist who starts out warbling giddily about rainbows and lollipops will face a crossroads at some point, where they will be forced to decide whether to continue skipping along the yellow brick road or stumble off into the gloomy forest – with no guarantee that something better waits on the other side, only faith that it does.  It’s a journey that is always worth taking.  The Dixie Chicks’ music improved immeasurably after their fracas with the American right over their Bush-inspired version of John Lennon’s “bigger than Jesus” moment, when they got away from karaoke-ready dreck like “Goodbye Earl” and opened up with powerful anthems like “Not Ready to Make Nice.”  Brian Wilson struggled his entire career against the goofy surfin’ tunes that characterized the Beach Boys and that his record label insisted he continue to produce, and as a result we were blessed with lasting gems like “God Only Knows.”  I have no doubt whatsoever that someday Justin Bieber will grow a goatee and release an acoustic album, and you know what – done with the right intentions, and not just as a sales gimmick, it’ll be terrific.

Until then, play “Wide Awake” again and think to yourself, damn, Katy Perry makes for one fine-looking goth.

A price above rubies

Elisabeth Moss (Peggy) and Christina Hendricks (Joan).

What price does a woman put on her soul?  How blurred is the line between integrity and compromise?

As Puritanical attitudes towards what is acceptable to a television viewing audience have softened, the portrayal of women has evolved as well, with the smiling apron-wearing June Cleaver giving way to ever more complex characters, where what it means to be a woman, in all its wonderful, contradictory glory, is examined on a psychological level – much more deeply than hacky debates on the best make of shoes or how sexually inadequate their partners may be.  Last Sunday’s episode of Mad Men, “The Other Woman,” after four and a half seasons of examining the ways in which men compromise themselves in pursuit of wealth, sex and power, took its two strongest female characters and forced them to ask themselves what their own price might be.  Joan agreed to an indecent proposal in exchange for a partnership in the company, while the lately taken-for-granted Peggy decided her worth couldn’t be expressed in numbers and chose to walk when that was all she was offered to stay.

The buxom redhead Joan has been described by the show’s creators as man-like in her full command of her sexuality, a beautiful woman who is well aware of the effect she has on those obsessed by mammaries.  To their (and Christina Hendricks’) credit, she has never been portrayed as the kind of vampy temptress such a description usually fits; she isn’t working from the Erica Kane playbook, but rather striving, consistently, to prove herself as the best at her job.  As to her relationships with the men and the women of Sterling Cooper Draper Pryce, one will forgive what sounds like a puerile argument when the symbolism of Joan as mother figure is expanded upon.  Even those who have expressed sexual desire for her, whether fulfilled like Roger Sterling or unrequited like Lane Pryce, have found themselves in the position of whimpering babes at her ample breast.  Others, like cardboard husband Greg, have been unable to cope.  (Greg escaped, ironically, to the boys-only army.)  Her relationship with serial womanizer Don is perhaps the most complex of all – ironic that the two best-looking people on the show have never taken it much beyond a brother-sister level.  Don is man enough, in the end, to recognize that Joan agreeing to sleep with a lecherous car dealer in exchange for securing the Jaguar account isn’t the path she should take.  The episode played with expectations by staging Don’s last-ditch attempt to change Joan’s mind without revealing until later that it took place well after the deed was done.  Was Joan truly as compromised as most reviewers of this episode tend to believe, or was it a logical progression in her evolution – a conclusion on her part, regardless of what we may think of its validity, that to get where she wants to be, she has to use every talent at her disposal, whatever the collateral damage to her spirit?
Coincidentally, this week’s Game of Thrones featured a scene where the ruthlessly ambitious Cersei Lannister drunkenly observed to the virginal Sansa Stark that a woman’s greatest weapon lay between her legs.  Has Joan crossed that line now?  Has she decided that being good at what she does is only going to take her so far?  One thing is for certain, in the jubilation that accompanied the announcement of SCDP’s winning the Jaguar account, newly-minted partner Joan was as out of place as a prostitute at a church picnic.  Perhaps inside, that was how she felt.

Peggy, on the other hand, while she has had her share of romances (and one ill-advised fling with Pete Campbell, whose utter indifference to her since that early episode indicates that she was strictly a novelty to him) is the little chickadee to Joan’s mother hen.  Unlike Joan, she’s never really had the option to full-out Mata Hari lecherous men into helping advance her station in life, and so her drive to prove herself comes more from a place of not having much of a choice otherwise.  She and Joan both find themselves brushing against the glass ceiling, and for Peggy, going down the road suggested by Cersei Lannister is not only unpalatable, but unnecessary.  Peggy’s worth is not tied to her future at Sterling Cooper Draper Pryce – it is, after all, only one of many companies out there where her talents can be of use, as is quickly proven by her meeting and ultimate decision to go work with Ted Chaough.  When she admits this to her mentor, Don – while incredibly empathetic in his encounter with Joan – cannot reconcile the idea that Peggy’s problem cannot be solved with just more money.  But Peggy is in as much a crisis of spirit as the one faced by Joan.  Oddly enough, Joan’s loyalty to SCDP and its people – her mother’s instinct again – was probably what led her to make the choice she did, the dangling carrot of a partnership aside.  Peggy, by contrast, realizes that to grow as a person she must, in a Buddhist sense, divest herself of her attachment to Don Draper and the old gang.  The little chickadee has to leave the nest.  It is a much healthier decision, and explains the smile on her face as she steps onto the elevator for the last time, with the Kinks’ “You Really Got Me” playing in the background in a final brushstroke of symbolism.

Proverbs 31:10 says that the worth of a virtuous woman is far above rubies.  Joan let herself be bought, some would say for far less than rubies.  Peggy didn’t.  What is most important, however, is that in the end, the choice was theirs.  They may indeed have a price, but they are going to be the ones to decide what that price is.  These women defined themselves instead of letting men do it for them – a greater achievement in the sexist era in which Mad Men takes place.  They were willing to accept the consequences of that definition, whatever they may be.  And taking absolute charge of one’s destiny is, to risk a cliche, true empowerment.

In fairness, I did like The Lord of the Rings too (Part 1)

Frodo eyeing Sting for the first time, duplicating my skeptical look at the prospect of a Lord of the Rings movie.

The Huffington Post quoted me praising Star Wars in their “battle of the franchises,” in which, following preliminary rounds that have seen spirited contenders such as Harry Potter and James Bond fall by the wayside, Jedi now fight hobbits in the quest for the ultimate prize – the top rank in a meaningless, statistically-flawed survey of genre popularity.  Judging such things is a bit like trying to assign criteria to beauty – everyone has their own preferences, for countless different reasons.  The same can be said for how I and many like me weigh Star Wars against The Lord of the Rings.  How we view them depends on who we are, what our circumstances are when we experience them for the first time, and how those experiences evolve as we grow and accrue the cynicism of wisdom to find endless fault with what once sparked only wonder.

I grew up with Star Wars, but can’t say the same for The Lord of the Rings.  I saw the Ralph Bakshi animated version at a friend’s birthday party when I was six or seven, and what I recall most was the entire group of youngsters finding it tiresome and cheap and quickly shutting it off to listen to the newest Duran Duran record instead.  As I got older, it was one of those elements of popular culture that I was always aware of, but never terribly interested in exploring further (kindly recall that this would have been when the idea of sitting down with three enormous volumes of Tolkien prose would be quickly supplanted by the sight of a shapely pair of tanned legs strolling by).  And I was jaded by cinematic fantasy throughout the 80’s and 90’s:  endless chintzy, low-budget productions with lousy special effects, cruddy-looking monsters, embarrassing writing, hammy acting by D-list performers and the infuriating cliché of the “magical portal to Los Angeles.”  After all, why pit your dashing heroes against dastardly villains in a wondrous setting of visceral imagination (you know, something you’d actually have to pay somebody talented and expensive to dream up) when you can have them duke it out on Sunset Boulevard while hip-hop plays over each swing of their enchanted swords?  On television, mainstays like Hercules and Xena were amusing diversions, but they drowned in smirking, anachronistic pop culture references, and characters’ ability to die and resurrect ad infinitum – what a friend once called “a day pass to the underworld” – undermined any sense of stakes when the scripts could be bothered to aim for it.  You got the sense that the creative sorts behind these ventures considered their target audience strictly ADD-afflicted kids.  Little consideration was given to any semblance of “the big ideas” that fantasy can tackle, or to any sense that these characters were remotely human.

Around the turn of the millennium I’d heard rumblings here and there that a new movie adaptation of The Lord of the Rings was underway.  Oh yeah, that crummy cartoon, I thought to myself.  The CV of director Peter Jackson was not encouraging either; the few minutes of The Frighteners I’d seen were silly.  When the appalling Dungeons & Dragons limped its way onto the screen in 2000, I thought it was a pretty accurate barometer of how the new LOTR would turn out.  Nobody could do this right, not with the kind of verisimilitude that fantasy cried out for, and this unknown New Zealander with a few weird-ass movies on his IMDb page certainly wasn’t going to be the first.

Then, in early 2001, someone sent me a Fellowship of the Ring promotional calendar.  And I was floored by what I saw – portraits of esteemed actors like Ian McKellen, Christopher Lee, Cate Blanchett and Ian Holm in richly detailed costumes as wizards, elves and hobbits.  Steven Tyler’s daughter looking simply radiant as Arwen.  North and Rudy as Frodo and Sam respectively.  The grizzly-looking guy who played Satan in The Prophecy as Aragorn, and what’s this… the MAN himself, Sean Bean as Boromir.  Okay, I thought, there might be something to this after all.  Especially since the quality of this calendar proved that some serious coin had been poured into this endeavour – it wasn’t a one-off “let’s-cut-our-losses-and-sell-the-rights-to-Taco-Bell” promotion.  Maybe, I dared to hope.  Maybe this time, they’ll get it right.  Thus, unbelieving me decided it was finally time to set about reading the books, so I could see how, despite all this incredible design work, the filmmakers would screw everything up.

Certainly a lot of Tolkien’s original work is decidedly uncinematic (not that that’s a bad thing; some stuff simply works better on the page).  Goofy Tom Bombadil seemed like a train wreck waiting to happen, and I cringed every time Sam burst into tears or characters broke into song at the drop of a wizard’s hat like they were starring in a Middle-earth revival of Guys & Dolls.  Realistically, I thought, for this to be adapted faithfully you’d have to turn it into a ten-hour musical.  But coming to it late, in the shadow of the upcoming films, I didn’t find any story beat I was particularly attached to, or dying to see realized in 35 millimeter.  I thought it could have made a great movie; I was just saddled with memories of 20 years of bad movies and could visualize the visible matte lines, crude animation and histrionic over-emoting under a synthesizer score that might have resulted.  Even as the months ticked away, as trailers leaked out into the world and a traveling exhibit of the movie’s props and artwork made a stop in Toronto around my birthday, part of me tempered my excitement with a pestering reminder that after all of this promise, the inevitable letdown was soon to come.  It still could have gone so wrong.

Then, just after midnight on December 17th, 2001, the lights went down and the screen came to life…

(To Be Continued)

Aaron Sorkin takes on Steve Jobs

But can it sing “I Am the Very Model of a Modern Major-General”?

He has said he loves his Mac, so I guess it’s no shock that Aaron Sorkin has agreed to write the upcoming big-screen retelling of the life of Steve Jobs.  What can we expect from this new venture?  I can see the fateful moment of the founding of the world’s biggest corporation unfolding something like this:

INT. JOBS HOME (CRIST DRIVE) – GARAGE – NIGHT – 1977

STEVE JOBS, STEVE WOZNIAK and RONALD WAYNE are standing around their first, crudely built computer.

JOBS:  What do you think?

WAYNE:  It’s ugly.

JOBS:  What do you mean it’s ugly?

WAYNE:  It’s ugly.  As in “unpleasant or repulsive in appearance.”

JOBS:  I was thinking “ugly” as in “involving or likely to involve violence.”

WAYNE:  Violence?

JOBS:  As in what I’m going to do to you if you don’t shove that Silenian gloom and doom up your ass.

WAYNE:  Forgive me for being the only one in the room worried about aesthetics.

WOZNIAK:  It is kind of ugly.

JOBS:  Kind of ugly?  There are degrees of ugly?

WOZNIAK:  Well, yeah, I suppose… there’s “yeah, whatever” ugly and “I-am-Oedipus-gouge-your-eyes-out-to-purge-the-horrible-memory” ugly.

JOBS:  It’s not that ugly.

WAYNE:  It’s pretty ugly.

JOBS:  Pretty ugly is another degree of ugly?  Like gorgeously abhorrent or beautifully hideous?

WAYNE:  Beautifully hideous, that’s good.  That suits it.

WOZNIAK:  What are we going to call this beautifully hideous thing?

JOBS:  Somehow I don’t see “beautifully hideous” as an effective selling point.

WOZNIAK:  Depends who you’re selling to.  You’d clean up with Dadaists and deconstructionists.

JOBS:  Yes, because they’re well known for their interest in computers.

WAYNE:  I can’t think of a good name.

WOZNIAK:  Me neither.

JOBS:  Come on, guys.

WOZNIAK:  I’m very good at integral and differential calculus, not naming things.

JOBS:  We need to think this thing differently.  You know, when Gautama sat under the Bodhi tree, he vowed not to rise until achieving enlightenment.  Part of enlightenment is what Buddhists call the concept of “sati” – the awareness to see things for what they are with clear consciousness and being aware of the present reality within oneself, without any craving or aversion.  Gentlemen, we are not moving from this garage until we come up with a name for this product, and I don’t care if we sit here until we are all so old and beautifully hideous that we can’t stand the sight of one another.

WAYNE:  The tree.

JOBS:  Pardon?

WAYNE:  The Bodhi tree.  What kind of tree was it?

JOBS:  A fig tree.

WOZNIAK:  “Fig Computers”?

JOBS:  No, something more primal.  Something indicative of beginnings.  Genesis.  Garden of Eden.  The fruit… the fruit of knowledge.  Apple.

WOZNIAK:  “Apple Computers.”

JOBS:  Apple Computers.

No one speaks for a moment.

WAYNE:  It’s ugly.

WOZNIAK:  Pretty ugly.  Beautifully hideous.

JOBS:  We’ll go with that then.

Not coming to theaters anytime soon…

Quis custodiet ipsos numeros?

An emergency board meeting in Margin Call.

Margin Call, written and directed by J.C. Chandor, is a 2011 movie about the 2008 financial crisis that stars Kevin Spacey, Paul Bettany, Jeremy Irons, Demi Moore, Stanley Tucci and Zachary Quinto (who also produced).  It features a topical storyline, some strong, subtle performances (particularly from Irons and Tucci), interesting characters and key ethical questions to be asked about the spiritual worth of the pursuit of money.  It is also somewhat difficult to follow if you do not have experience in high finance.  Characters drop references to commercial securities, asset valuations and market fluctuations so fast, without pausing for a breath to catch the audience up, that you almost find yourself wishing for subtitles.  Even when characters make jokes about not being able to understand what they’re looking at, and plead for facts to be explained in plain English (or as Irons says at one point, as if one is speaking to a small child or dog), what follows remains untranslated biz jargon.  Cobbling together what you do comprehend, you conclude that a major investment firm has gotten too greedy and has purchased too many high-risk assets that, due to changes in the market, are about to become worthless, necessitating a massive pre-emptive sell-off that will, in itself, precipitate a further worldwide decline, but may, it is hoped, save a portion of the firm.  (I hope you got all that because I’m still trying to figure it out.)  The moment this becomes clear is when Irons puts it into colloquial terms, declaring, “The music is about to stop and we’ll be left holding a bag of odorous excrement.”

One cannot help but be reminded of the Star Trek trope where one character proposes a long technobabbling resolution to a crisis, summed up by someone else with a much simpler metaphor:  “If we reconfigure the deflector dish to emit a synchronous stream of alpha-wave positrons along a non-linear coefficient curve, we might be able to produce a stable gravimetric oscillation that would divert the asteroid’s course.”  “Like dropping pebbles into a pond… make it so!”  As tiresome as this became, it was done for a reason.  When setting any scene in a foreign environment – be it another country, another world or simply an exotic office – the writer has to walk a tightrope between being truthful to the environment and servicing the demands of drama.  The audience has to be able to relate to what’s going on in front of them, or it might as well indeed all be playing out in Mandarin Chinese.  Yet you don’t want to dumb things down for mass consumption, and you can’t succumb to the dreaded “As you know, Bob” epic fail:  characters stopping to explain things that they already know, and would have no reason to discuss given the course of their day.  If you’re an accountant, are you going to spend any time explaining to your veteran colleague what a trial balance is?  Is Alex Rodriguez going to pause mid-game for a five-minute exegesis with Derek Jeter on the infield fly rule?  Nor does it make any sense for these experienced brokers to sermonize on the basics of brokerage.  Usually a writer gets around this by introducing a “fresh-faced intern on his first day” who can ask the “business 101” questions on behalf of us dummies watching.

There are no interns or other such clichés in Margin Call, which chooses not to explain its dialogue in digestible nuggets for the masses.  Characters in this glass-enclosed world debate, ruminate, decide what they have to do and proceed with their financial chicanery, complicit in what may turn out to be their own destruction.  And after scratching your head for an hour and a half, you discover that what is sneakily clever about Margin Call’s screenplay is how it turns the incomprehensibility of its subject matter into a revelation about its subjects – the wheelers and dealers of the Wall Street world, men and women who are as much prisoners of an impenetrable capitalist system as those of us who can scarcely be bothered to look at our mutual fund statement every month.  No one understands this stuff, not really; they just want it all to work seamlessly and invisibly to make them rich, which is part of what makes the system so vulnerable to collapse.  Depressingly, here in the real world, four years on, the same cycle of greed has circumvented the installation of proper safeguards to ensure that these mistakes are not repeated.  It’s too complicated, no one really gets it, they can’t be bothered, it’s trivial, that’s the other guy’s problem, the market will regulate itself as it always has.  But the genie is long out of the bottle.  In a moment of insight, Jeremy Irons’ character judges this world thus: “It’s just money; it’s made up. Pieces of paper with pictures on it so we don’t have to kill each other just to get something to eat.” 

The problem is we are killing each other over these pieces of paper – we are letting the numbers control our lives, and as Margin Call demonstrates, no one is truly in control of the numbers.  It’s all gambling, and as any experienced gambler will tell you, no matter how well you play, in the end the house always wins.  I’m not sure who “the house” is in this case, but I’m fairly certain that it isn’t us.

The Force should be with you, always

I first saw Star Wars on Beta.  (Those of you born after 1985 are scratching your heads right now wondering what that is.)  It was a bad, commercial-laden dub off the local TV station:  the picture quality was dreadful, the sound was worse and the story was interrupted every five minutes to try and sell me pantyhose and dish detergent.  Regardless, my young self was completely transfixed.  Set aside the sheer whiz-bang factor of cool spaceships zipping around shooting lasers at each other; for a quiet, lonely kid who grew up looking at the stars and dreaming, Star Wars was that dream given shape – the idea that from the humblest beginnings could arise an adventure to span the galaxy.  Star Wars, with its every subtle quirk – characters with a half-second of screen time, unusual inflections on innocuous line readings – burned itself into the zeitgeist and became an instant allegory for our own troubled history.  “May the Force be with you” was more than a secret sign between members of an exclusive cult; it evolved into a universal greeting of peace and goodwill.

Thirty-five years later, our post-Star Wars world is a far more cynical time, when the wide-eyed eagerness displayed by young Luke Skywalker is seen more as tragic naïveté than an admirable sense of hope and optimism.  Thus, the enormous anticipation afforded to the prequel trilogy could not help but lead to an equally enormous letdown, a sense that despite all the ingredients being there, the recipe wasn’t gelling.  One can waste gigabytes citing all the familiar criticisms:  poor acting, dodgy writing, wooden characters, Jar Jar Binks.  But it seems to me, as someone who admittedly experienced the same disappointment as The Phantom Menace unspooled, that what was missing from the equation was us.  We didn’t have the same optimism, and we weren’t looking at the stars the way we used to.

It’s no surprise, then, that the newest iteration of Star Wars would fail to penetrate that jaded shell, erected by decades of frustration with the failures of our leaders, our increasing obsession with the banal, and a realignment of our values – towards the shallow, the material and the increasingly out of reach.  How could even the most masterfully crafted Star Wars film compete against that?  The clearest indicator of our cynicism, for me, was that in the months leading up to the release of Episode I in 1999, buzz centered largely not on the question of whether it would capture our imaginations and spark a cultural phenomenon the way the first movie did, but whether it would outgross Titanic – ironic in that Star Wars has always been a victim of its own success, and to examine it only in financial terms, as we seem to do with everything these days, is to miss its fundamental meaning.

Star Wars represented something that has gone somewhat astray amidst the background noise of our modern discourse, and deserves to be brought back in full vigor.  That connection with the old stories, with the passions that have driven us since we first stood erect, and the myths we have handed down across generations almost as genetic souvenirs of what matters most to us about our collective human experience.  It has endured, because it is the best of who we are and who we have ever been.  Star Wars stokes the hunger to set out upon a journey and to emerge triumphantly at its end, not as a wealthier or more famous man, but simply a better one.  To become more than what we are.  That is what we are truly wishing each other when we say “May the Force be with you” – may your spirit be emboldened by the force it needs to achieve its greatest potential.  Not a bad sentiment to express on May the Fourth – and something worth keeping in mind all year round.

Rise of The Dark Knight

The Christopher Nolan Batman trifecta.

After groaning through a prehistoric glacier’s worth of ice puns in 1997’s Batman & Robin, I was done with the Caped Crusader.  This was back in an era when I could usually find something positive to say about any movie I went to see, and my comment upon completing a slow funereal march out of the theater along with dozens of other disappointed audience members was, “That was $100 million that could have gone to feed starving children.”  Batman & Robin was a two-hour sensory middle finger, stitched together by accountants and focus groups, like some ungodly Frankenstein’s monster, to become less than the sum of its parts.  The old Adam West-Burt Ward TV show had been an after-school ritual for me for many years, but the kitsch that worked so well in 22-minute installments in the late 60’s was excruciating when blown up for the multiplexes.  What was fun and oddly sincere in one medium became insulting in another.

Since ’97, the theaters had been flooded with one superhero movie after another, some decent but most not, as studios plumbed their back catalogue to find some obscure character in a mask whom they could dress a star as and plug into basically the same script with a hip-hop soundtrack and thus secure a pre-sold blockbuster.  Drubbed to death just as thoroughly around the same time was the concept of the prequel.  “We’re going back to show you how it all happened.”  It wasn’t enough to let a character exist with some mystery about their backstory; now it all had to be spelled out with each personality quirk given a deep, long-simmering Freudian rationale.  (We can all admit that we thought Darth Vader was much cooler before we heard his boyhood self squeal “Yippee!” in The Phantom Menace.)  So when I heard there was a new Batman movie coming out and that it was a prequel, my excitement level was roughly akin to what it would be if someone told me today’s special in our work cafeteria was a bowl of hot concrete.

The trailers for Batman Begins didn’t spur much enthusiasm either.  Liam Neeson doing his Jedi mentor routine again.  Bruce Wayne angst-ridden about his parents, even though we’d seen him coping with that in movies one through four.  The only thing that seemed promising was the casting – heavyweights like Neeson, Michael Caine, Gary Oldman and Morgan Freeman, each of whom has the freedom to pick and choose and certainly wasn’t going to sign on for the same old same old.  After Jack Nicholson stole the first Batman, successive films had tried to compete by doubling the number of villains and cramming whatever A-lister was available into the roles, regardless of whether or not the story was served by it.  Screenwriter William Goldman, when discussing working with Batman Forever‘s cowl-wearer Val Kilmer, commented on this pattern by observing that “Batman is and always has been a horrible part,” and that it existed solely for the more over-the-top villain roles to play off.  The casting of Christian Bale in the lead this time, not an unknown but not exactly a seat-packing screen presence either, seemed to suggest that there were slim pickings in the ranks of volunteers to succeed Kilmer, George Clooney and Michael Keaton.  The trailer scenes showed a very low-key approach to the storytelling as well, almost pleading “um, excuse me, if you don’t mind, that is, if you’re not busy, we kind of have a sort of new Batman movie for you.”  The director, Christopher Nolan, had made the fascinating low-budget Memento, and the plodding higher-budget Insomnia.  Truthfully, it all added up to a spectacular non-event.

Imagine one’s surprise when Batman Begins turned out to be merely spectacular.

The reasons why?  Well, Christopher Nolan made one crucial decision in crafting his film.  Aside from the usual reasons offered – treating the material seriously, dialing down the camp – he defied both expectation and tradition and deliberately made Batman/Bruce Wayne the most interesting character in the movie.  Admittedly borrowing a lesson from the casting of the first Superman, where Oscar-winners and other screen legends surrounded the unknown-at-the-time Christopher Reeve, Nolan uses his stars to reflect their light onto the lead.  The movie remains Batman’s story through and through; while there are villains, they are not given equal billing, nor is any significant screen time wasted on the complexity of their origins (the burden of all the Spider-Man movies).  Like the best villains, they exist mainly as challenges for the hero to overcome – impediments to his growth as a human being.  Even in The Dark Knight, the Joker comes out of nowhere and simply is, like a force of nature – he lies repeatedly about how he got his signature scars, in effect taking the piss out of the tired “villain’s motivation” trope.  And there is a mystery to be solved; an actual plot to unravel piece by piece, instead of the bad guys running around trying to kill Batman for two hours.  It keeps moving forward in so compelling a fashion that you forget you’re actually watching a character study that happens to have some cool fight scenes in it.

In addition, Nolan created a complexity to Bruce Wayne heretofore unexplored on screen.  He has three personas:  Batman; the private, troubled Bruce Wayne; and the flamboyant, spoiled rich 1%-er Bruce Wayne – a new dimension to the man, unseen in his Keaton/Kilmer/Clooney iterations, where Wayne seemed to be just a decent guy who happened to be extraordinarily rich.  Bale’s public Bruce is a trust fund brat, careless with his millions, the last guy you would ever expect to want to be Batman, let alone actually do it – which makes it even more logical that he would choose to act this way.  Bale’s work is so good in the part that he’s actually more interesting as Wayne than he is in the Batsuit – which is just as well, because it’s over an hour into the movie before he finally puts it on.  The Dark Knight continues this dichotomy:  Bruce Wayne continues to act like a colossal entitled douchebag, deflecting all suspicion that he could possibly be the noble, driven soul determined to save Gotham City from itself.  In Nolan’s Batman films, the true battles are not “Biff!”  “Zap!”  “KaPow!” but the ones going on inside these incredibly damaged people who are essentially representatives of the conflicts and contradictions inherent in all human beings.  Batman isn’t just a token good guy – he’s us.  He’s what we like to think we’d do, given the means, but more importantly, the will.  And like us, he is a man who must overcome significant flaws and weaknesses to push himself beyond that limit.

The forthcoming conclusion to Nolan’s trilogy, The Dark Knight Rises, takes place nine years after Batman went on the lam, blamed for the murders of Harvey Dent and several police officers.  It isn’t much of a spoiler to suggest that Bruce Wayne’s challenge in this movie may be to question whether he can truly leave the mantle of Batman behind, if the path of a hero is ultimately futile in that it has no end, no final triumph, no way to know for certain whether the entire journey has been worth it.  With apologies to William Goldman, Batman is no longer a horrible part.  Truthfully, it never was – he just happened to end up in some horrible movies.  Handled properly, he is an ideal vehicle for an exploration of the concepts of heroism, sacrifice and morality – the stuff of which the best stories are made.  So go on and rise, Batman – we’re going to miss you when the last of the credits roll.