Wednesday, July 5, 2017

Lynch Drops the Bomb (and the Hammer)

I’ll admit it: I’m one of those who winced upon learning, a couple years ago, that David Lynch would be returning to Twin Peaks.  Catch lightning in a bottle once, you don’t do anything as foolhardy as attempt it again, right?

Especially when again is fully twenty-six years later.

I also figured Lynch had to be smart enough to know better than to try.
 
So now, then, this corollary admission:

The new Twin Peaks, eight of eighteen episodes in, is pretty dang good.  

True, the Dougie Jones stuff is thin gruel.  (Kyle MacLachlan’s often touching portrayal of a fugue-state Agent Cooper isn’t the problem; Lynch and Frost’s meandering, uninspired vision of suburban and corporate Las Vegas is—a problem only exacerbated by the fact they seem to be using this stuff to spoof Mad Men and Breaking Bad.)

Beyond that, though, the show does indeed recapture a fair amount of the surreal, wondrous-strange magic of the ’90 and ’91 seasons.

And at least some of the new season finds Lynch dropping the hammer, leaving behind that delightful, is-this-for-real? Twin Peaks hokeyness to do what he did in 1986’s Blue Velvet and 2001’s Mulholland Drive: demonstrate he can hang just fine, thanks very much, with the Scorseses, Spielbergs, and Kubricks of this world.

Episode 8, titled “Gotta Light?” by Showtime, is pretty much one big drop-the-hammer moment—a not-uncommon assessment, I know, having taken in a fair bit of the best-hour-of-TV-ever! yowling that started about two minutes after the episode finished airing.

So what is Episode 8?

For its first twenty minutes, it’s just a particularly tense, taut, strong third-season episode—one featuring the most unnerving (as of that moment, at least) incursion yet of surreal/supernatural forces into the show’s diegetic space.

And a Nine Inch Nails musical interlude, too.  Because why not?

After that, though, the remaining forty minutes—and they work nicely as a standalone short, in case anyone’s intrigued but not familiar with the larger, admittedly complex Twin Peaks universe—are Lynch’s meditation on…

The bomb.

The nuclear bomb.


Now, part of what makes these final forty minutes so remarkable is that there’s precious little in Lynch’s forty-year-old oeuvre to suggest a meditation on this particular subject was coming—though the instant this viewer saw the 1945 Trinity test erupt on his own Twin Peaks monitor, he felt in his bones it was right, on some level, that Lynch should finally arrive here.

The other thing that makes these forty minutes remarkable is their pulverizing beauty.  I mean, they’re very possibly the forty most gorgeous minutes Lynch has ever put on screen—stuff to rival the Scorsese of Raging Bull, the Spielberg of Schindler’s List, and (this one’s especially apt) the Kubrick of 2001: A Space Odyssey (wait till you see what’s in that mushroom cloud).

Add to all this the fact that the final twenty of those forty breathtaking minutes (it’s a two-act short, really, twenty minutes per act) are doing great Amurican monster-movie horror….


And…wow.  Right?

This, quick, though, before we turn the corner and acknowledge that Episode 8 may not be flawless, exactly:

It’s mildly befuddling, the amount of that-was-batshit-crazy! blogging and articling that's gone on in the days since Episode 8 aired.

I mean, “Gotta Light?” really isn’t so perplexing.  If anything, it’s an uncommonly coherent Twin Peaks episode—maybe too coherent (though again: flaws soon).

True, a whole lot of that fantastic forty minutes is thickly, aggressively surreal.  But you don’t have to be Sigmund Freud to sort out the nightmare we’re watching here.  (It sorts way more neatly than, say, Laura Palmer’s frequently mind-bending Black Lodge appearances.)


Everything at first blush bonkers in this forty-minute, two-act short—the creepy white mother homunculus vomiting up eggs and evil spirits; the shimmering gold mist emanating from our beloved old friend the Giant’s skull; the crackling, flickering “woodsmen” scurrying about that 1940s gas station; the Abraham Lincoln-gone-satanic figure staggering around the nighttime desert, croaking “Gotta light?” at terrified New Mexicans, crushing their skulls in his hands, uttering into a commandeered radio-station mike the terrifying gobbledygook that makes everyone in listening range collapse into slumber; the good-luck penny discovered “heads up” (our 16th POTUS again); the half-frog, half-cicada, all-horrible thing that hatches in the desert, then disappears, God help us, into that beautiful sleeping child’s mouth: it’s all clearly harnessed toward illustrating one pretty-cogent notion:

That the U.S. sure betrayed itself—sure delivered evil unto itself—when it concocted the bomb.   

(If anyone doubts Lynch is taking us to a moral place here, consider what’s playing as his camera goes 2001 star-gating through that mushroom cloud: Krzysztof Penderecki’s jarring contemporary-classical piece, “Threnody to the Victims of Hiroshima.”)

So why does it make perfect sense Lynch should finally, after forty years, arrive here, at the bomb?

Because his whole body of work is about the seething, wormy underside of post-World War II American life.
 
It’s an oft-noted feature of works like Blue Velvet and both the old and new Twin Peaks series that it’s tough to tell when, exactly, they’re set: the 1950s?  The ’80s?  The twenty-first century?  Whatever the exact—or maybe shifting—time frame, we’re always in post-war America, land of white-picket-fenced houses, blue-jeaned and motorcycle-jacketed bad boys, Main Street hardware stores, land-line phones with spiral cords, lead-sled muscle cars, linoleum-countered roadside diners, etc.

And post-war America begins, of course, with the bomb.

If there’s a seething, creeping, mostly-concealed evil lurking in post-war America—an evil forever threatening to let the content of our nightmares rupture forth into our friendly bobby-socks-and-apple-pie waking lives—it’s got to have something to do with the bomb.

It’s got to somehow originate with the bomb.

And when the aforementioned beautiful sleeping child, fresh from the most adorably chaste first kiss you’ve ever seen, opens her mouth to let that nightmare bug fresh from the nuclear-bomb-blasted sands up the road from her family’s Craftsman house crawl down her throat, the message couldn’t be clearer: 

The bomb is the truly hellish evil all Cold War-American children swallowed.


A coincidence, maybe, the beautiful sleeping child swallows the nuclear-mutant bug right after her first date with her counterpart, upright and handsome post-war American boy?

Sorry.  No.

Soon enough, no doubt, these two will start a family—maybe-possibly Laura Palmer’s own.

And we know, we Twin Peaks watchers, what trouble family is in Lynch’s universe.  Right?

It’s a good place to get serial-raped and murdered, your middle-class American post-war nuclear family.  It's the cultural institution, in Lynch's imagination, bearing the brunt of the terrible karmic toll for the great American sin of the bomb. 

So there’s what Twin Peaks has to do with Hiroshima.

Anyway…I’m not here, again, to accuse “Gotta Light?” of being a perfect work of art.

Here’s the problem:

It’s not a freestanding short film.

Its final forty minutes contain tie-ins—both clear and probable—to larger Twin Peaks narrative strands.

For instance: that scary face peering out at us from within the creepy white mother homunculus’s vomit stream isn’t just some anonymous evil spirit.


No.  That’s Bob.

And it would appear we’ve just witnessed the birth of Bob. (Funny: "Bob" is an acronym of "birth of bob."  And it's one wee letter off from "bomb," too.  Hmm.)

Bob comes from the creepy white mother homunculus’s vomit.

And the creepy white mother homunculus ("the Experiment," she's called in the closing credits) comes from the bomb.

I wasn’t sure what so bothered me about this until I saw Margaret Lyons’ question to herself in her own post-Episode-8 New York Times article: “does Bob, a supernatural manifestation of evil, really need an origin story?”

Exactly.

Who knows where Bob comes from?

He’s an owl.  He’s the wind in the Douglas firs.  He’s your own father.  

There had been, up to now, no explaining his insane malice.  It just appeared, implacable and irrational.  And if Bob just appeared in the Palmer household, he could just as easily show up behind your couch, be at the foot of your bed.


I’m not sure I like Lynch’s letting me in—even if it’s in dream terms—on Bob’s backstory.  The dude’s way scarier when he’s baffling.

And if I don’t need to know how Bob came to be, I sure don’t need to know how Laura Palmer came to be. Yet "Gotta Light?" seems to want to reveal this to us, too: her soul springs straight from the skull of our beloved old friend the Giant—a guy we’re starting to suspect might be, like, God or something.

Laura is created as a direct counterbalance, it seems, to the evil of Bob, newly born in the flames of the Trinity test.

Meh, I say.

I don’t particularly want to understand how the Giant, Mike, the Man from Another Place, the Evolution of the Arm, Bob, and the version of Laura Palmer haunting the Black Lodge operate.  These supernatural figures' logic—their “rationality”—has always been delightfully opaque and alien; I’d hate to think we’re entering a phase of Twin Peaks in which Lynch starts over-explaining his universe’s otherworldly metaphysics to us, starts revealing too much of what goes on behind the red curtain. 

I don’t ever want to know why garmonbozia (human pain and suffering) must take the form, in the Black Lodge, of creamed corn.


I just know it makes sense on some ineffable level that it should.


Thursday, February 9, 2017

The Bowling Green Massacre Was More Important than You Think

In three different interviews between January 29th and February 2nd, the last with heavyweight Chris Matthews, Kellyanne Conway asserted that the Bowling Green Massacre led President Obama, back in 2011, to institute tough new restrictions on Iraqi immigration to the U.S.

Just to note it—because we’ve entered a historical moment in which such things actually need noting—there was no such event, in 2011 or any other year, as the Bowling Green Massacre.

There’s also a zero-percent chance Kellyanne Conway didn’t know this—not, at least, by her second and third mentions of the event.

Now, the non-existence of the Bowling Green Massacre got some news play last week, for sure.

But not enough.

Because here’s what I think really happened between January 29th and February 2nd:

The Trump administration, with Kellyanne Conway as its instrument, made its boldest foray yet into the willful, deliberate warping of the fabric of reality.

And what the administration is doing now, rest assured, is watching the alt-right blogosphere and news circuit to see just how big a segment of the U.S. ogre population does, in fact, will itself, in the coming days, weeks, and months, to believe in the Bowling Green Massacre of 2011.

Do photos and videos of the event and its aftermath start materializing online?

Do news stories seemingly from 2011 about the event start appearing in the darker corners of breitbart.com?

Do now-declassified government documents suggesting an Obama administration cover-up of the Massacre start getting passed around via patriots’ email accounts?

If/when these things do happen (and for all this commentator knows, they’re happening already), the new administration will know it has real, real power over a goodly portion of the U.S. populace.

I’m sure not the first to say it. But I do want, for a fleeting moment, to be the latest:

We’re entering Orwellian terrain.


Thursday, February 2, 2017

U.S. Cities and the Long, Slow Divorce

We know, of course, that politically liberal Americans have for some time now been concentrating themselves in the nation’s big cities.

My bold prediction:

This trend accelerates.

As it does, the most progressive American cities—New York, Portland, D.C., San Francisco, Seattle—will start providing their own residents social services of the type citizens of most wealthy nations have long enjoyed: free health clinics, free daycare and preschool, free public higher education, free or radically subsidized housing for the elderly, etc.

Enabled in no small measure by Republican provincialism (states’ rights and all that, right?), big cities will keep raising their own minimum wages. They’ll reduce their own carbon emissions and foster development of renewable energy sources. They’ll shelter and protect undocumented immigrant workers fleeing poverty or seeking asylum. They’ll decriminalize drug use. They’ll treat addiction medically. They’ll resume the effort to equalize and integrate public schools. They’ll expand rights of and legal protections for women and LGBTQ residents. They’ll work to eradicate poverty in all their communities.

How will New York, Portland, D.C., San Francisco, and Seattle afford all this?

As an increasingly permanently conservatized federal government keeps ratcheting down taxes on super-wealthy citizens, American cities will steadily ratchet up taxes on their own super-wealthy residents.

Most of whom—brace yourselves—will be fine with it.

And so the long, slow divorce will proceed. Those who want Bernie Sanders-style socialism will get themselves to Philly, or Oakland, or Pittsburgh.

Those who want to keep pursuing Trumpist liberty—the stuff that’s already concentrated the nation’s wealth into alarmingly few hands, changed the weather, shortened the average American’s lifespan, and rendered college a pipe dream for the working class (hey: you probably didn’t want your kid brainwashed by liberals anyway)—will simply stay put. And keep voting as they already do.

In not many years’ time, we’ll see that very few indeed of the world’s tired, poor, and huddled still dream of making it to America.

They may, however, dream of making it to Chicago. Or L.A. Or Boston.

And if the blessing of a redeye flight means they never have to see, even, the strange Mad Max world beyond those city-states’ borders...all the better.


Tuesday, January 17, 2017

Nope

“When they go low,” Michelle Obama said at the Democratic National Convention last summer, “we go high.”

It was a fairly electrifying moment—one in which Democrats felt the thrill of near-certain victory.  Because Mrs. Obama was really reminding us, of course, that America never rewards meanness, bluster, ignorance, bigotry, pettiness, bullying—lowness—of the sort our now-victorious opponent is still schooling us tiresomely in day after day.
 
Not with a prize like the presidency, at least.

Sure, we’ve had some warty presidents in recent decades.  But they didn’t display their full plumage before winning the White House, and they paid steep prices (think of the shamings Nixon and Clinton endured) once they’d shown us, as they say, who they really were.

Everything’s different now.  On Friday we install in our highest office a disgusting, patently unworthy individual, someone who vaulted to political prominence selling Bircher-style bigotry even uglier than that for which we long ago dispatched doltish Barry Goldwater.

I’m not especially fearful of this individual.  We’ve borne prominent—even powerful—blowhards and bigots and buffoons in American public life before.  What does make me shudder is knowing tens of millions of people around me voted, not many weeks ago, for meanness, for bullying, for anti-intellectualism, for bigotry and racism.  (Sorry, but anyone who thinks that birth-certificate freak-out a few years back was anything but naked racism is self-deluding.) 

These are the qualities a great many Americans now want in a leader.  And I’ve never before lived in an America that rooted for the bully.  That hoisted up the buffoon.  That would gleefully cheer Apollo Creed (if only he were white) as he beat that lowlife Rocky Balboa to death.

Any powerful country that revels in meanness—that’s done with thoughtfulness, restraint, mercy, love for strangers—is flirting with its own demise.  And maybe that’s what scares me most about so many of the president-elect’s supporters: they seem somehow to crave the end.

Me, I don’t want the American experiment to be over.  So I’m glad, on one level, that Friday will see an eminently peaceful transfer of power of the sort the U.S. is justly famous for.  And I’m glad the U.S. will have, right on schedule, a new president.

But I won’t.

Saturday, July 23, 2016

Trump, Imperial Self

I remarked in another posting that I’m one of those anti-fundamentalists who tends, on the whole, to like the relativistic, liberal-democratic (little “d,” in this instance) messages beamed tirelessly across the globe by 21st-century mass media.

No two ways about it, though: it’s a double-edged ethos those media hand us.  And for all the work Hollywood and Madison Avenue have done in recent decades to persuade a great many Americans that women, people of color, the poor, and LGBTQ folk are, like, human, the fact is they’ve sold a great many of us on a dangerous corollary belief:

The belief that I (whoever "I" am) should be getting exactly what I want—all the time.

They sell us, in other words, on an individualism that may instruct us, on its more salutary channels, that we shouldn't need to feel like we're looking in a mirror every time we meet another human (individuality, right?)—but that also, when it reaches a certain frenzied pitch, births what Don DeLillo, in White Noise, calls the imperial self.

The self that not only craves but expects dominion over all it surveys.

The self with a toddler on its brainstem.  The self you bet is going to pitch a fit if it doesn’t get exactly the candy it wants in the check-out aisle.

Spend an hour looking carefully at TV ads if you wonder where this self comes from.  Then bear in mind the average American takes in God only knows how many hours of this stuff in a year.  And that the brands those ads rub in our faces—the brands so desperate to gratify our every arcane desire—are no less up in our grills when we're out on our highways, in our malls, in our sports arenas and airports and even our schools.

Maybe we're taught by our hyper-consumerized environs that variety and diversity are good things.

But we’re also taught that we are the individuals empowered to whittle our worlds down to exactly what we want—that we need see only what we want to see and hear only what we want to hear.

And that we need meet and deal with only those other humans we want to meet and deal with.

Swipe left/swipe right.

Enter, now, Muslims. 

And Mexicans. 

And LG…BTQ people. 

And (we see where this is going) Donald Trump—an ultimately imperial self prepared to whittle the world down to exactly what Americans want, to meet and deal with only those other humans Americans want to meet and deal with. 

And to do the exact same “really great” job of whittling and dealing you’d do, if only you were similarly empowered.

Now, Donald Trump is no one’s idea of an intellectual. But this commentator isn’t buying the oft-floated notion he’s stupid.

Trump clearly has a profound, bone-deep understanding of at least one unhappy human tendency: that of disenfranchised (or even disenfranchised-feeling) individuals to throw in bigly with super-agents, super-subjects, super-imperial selves.

And Trump, like Don DeLillo (to create the weirdest pair of bedfellows ever), knows that we modern Americans, with those corporate-nursed toddlers on our brainstems, are especially susceptible to this unhappy tendency.

A couple nights ago Trump provided us all a toweringly grim state-of-the-union address.

America has 99 problems, each one deadlier than the last.

What’s the only feasible solution to every last one of those problems?

Donald Trump.

"I alone can fix it," he says.

With tens of millions watching, with an obvious Mussolini impression on his face, he speaks these actual words.

"Believe me," Trump says—then proffers nothing by way of a plan to decimate ISIS, to tame the Chinese, to end mass shootings, to quell the flow of illegal drugs into American cities.

“Believe me” is the plan.

Though there is, of course, that wall.

It’ll cost hundreds of billions, all told—but he’ll build it even while lowering our taxes hugely.

Sure he will.  

After all, it’s the Mexicans picking up the tab.

Maybe the wealthiest nation in history can’t afford that wall.  But Mexicans can.

"I am your voice," Donald Trump says.

He could just as easily say, “I am your self.”

He could just as easily say, “You are too enfeebled to think, to speak, to act—but I will be the strong you who does all these things for you.”

This is, after all, exactly what his millions of soon-to-be voters hear him saying.

It's at once commonplace and pedantic, a year and change into Trump’s presidential campaign, to point out that this is all bald and shameless demagoguery.  That it's old-school fascism being born right in front of our American eyes, right on our American TV screens.  That it’s exactly what could never happen here happening right here. 

In Ohio.

Trump’s barely-grudging admiration for Vladimir Putin and Saddam Hussein tells us exactly what we’re getting if we elect him president.

Yet here we are, semi-poised to do it.

Why is that?
           

Friday, July 15, 2016

Matter More? Or Matter Too?

The Black Lives Matter movement has given us all an excellent litmus test—in the form of its very name.

Some people hear that name, that phrase, “black lives matter,” and they’re sure they hear a particular word right after it:

More.

These people say, “Black lives matter more?”  And they get angry.  And they put signs on their lawns—and bumper stickers on their cars—making a retort: All lives matter.

Others of us hear that name, that phrase, “black lives matter,” and we’re sure we hear a different particular word right after it: 

Too.

And we say, “Has it really come to this?  So many years after Brown vs. Board and Dr. King and Malcolm X and the Freedom Riders, black Americans are actually having to remind us, in the wake of so many ugly recent incidents, that their lives matter too?”

And we say, It’s clearly time to get back to work.

And we say, Police, schools, and tax codes gotta change.

And we say, How on earth does any sensible American hear that name, that phrase, and imagine there's a more after it?

Friday, March 20, 2015

Doyers Street, Bloody Angle, Dazes Cosa

Maybe the best thing about NYC is it's got more history lurking around than anyone can screw a bronze plaque to or crush onto a museum wall—so much, in fact, the bulk of it (here’s where Marilyn’s skirt flew up; here’s where Stanford White was gunned down; here, somewhere in this Gap, is where Nathan Hale was hanged) goes, of something like necessity, uncommemorated.  

I had dinner in Chinatown a few nights ago with some friends at the famed and ancient Nom Wah Tea Parlor (NYC’s first dim-sum place; same location, basically, since 1927) on oft-movie-setted Doyers Street.  Here they are, Nom Wah and Doyers, in a photo someone else took:


The Nom Wah would have been history-geek pleasure enough.  But the real fun didn’t start till after dinner, when we were all standing stuffed and bleary on the sidewalk outside.  That's when my one buddy pointed out the below creepy doorway (I promptly photographed it) right in the architectural pocket matching the oh-so strange (by Manhattan-grid standards) dog-leg curve of Doyers Street, just a couple doors down from Nom Wah:


That same buddy remarked—astutely, I’d say—that this looked both like a place one might go for an encounter with an underage sex slave and (relatedly) a “portal to hell.”  And though I don’t consider myself in any way psychically sensitive (zero good ghost stories to tell despite an adulthood spent in 19th-century buildings), I must say (and maybe I was in a suggestible state, charmed and a little skeeved by strange Doyers Street) I felt an emanation from that doorway, something like the presence of the Evil One, whoever s/he might be, whatever s/he might be doing in celebrity-haunted (we’d spotted Laurie Anderson at MoMA a few hours earlier) NYC on a windy pre-spring Sunday night.  
 
So entranced was I by the Evil One, in fact, I found myself leading my buddies to that doorway, close enough for us to make out, among other details, the curious “DAZES COSA” sticker by the door handle.  


The door, as you see, was ajar.  I pulled it open.  I stepped most of the way inside.  Then one buddy’s warning of cameras—plus the throbbing presence of the Evil One—made me wonder what exactly I was doing, and I rejoined my pack on the sidewalk.  

What did I see in there?

Upward-leading stairs, I believe, bathed, as they say, in some unearthly, as they also say, electric blue light—a staircase sensed more than seen, actually, obscured, as it somehow was, I think, by cloudy, cruddy plexiglass.

It was silent in there.

And only now as I’m writing do I remember stepping aside a bit after opening the door, so my very tall buddy whose “portal to hell” speculation had perhaps bewitched me could see around my pea-coated torso, into the blue dimness, his eyebrows raised, his mouth open: “Whoa,” he declared.

I know, I know: Orientalism.  I’m not saying it was my finest hour.

But let me return to the thesis with which I began.

What I discovered when I got home and hit the interwebs is that that mysterious, shadowy doorway I’d for some reason been compelled to enter stands not in the very, very pocket of a strange-for-Manhattan dogleg curve.

It stands in the very, very pocket of the Bloody Angle.

And my buddies and I, studying it, had occupied what the NYPD has more or less officially declared the most blood-sodden chunk of urban real estate in the United States of America.

The Bloody Angle of Doyers Street was, you see, ground zero for the Chinese gang (or tong) wars of the early 20th century. Things, in fact, were sufficiently ghastly there—there, where we stood BSing, where once a Dutchman’s tavern stood—to have birthed into our lexicon a colorful phrase we all know:

Hatchet man.

The tong badasses who killed on Doyers Street frequently did their human butchering with, famously, hatchets.

Who knows how many corpses had been strewn, over the years, on the pavement beneath my buddies’ and my feet?  How many gallons of blood let?  How many hacked-off hands and heads kicked to that very curb?

Who knows indeed? 

What we do know is that the Tong Wars—like those between the Bowery Boys and Dead Rabbits—are Real American History.

But in NYC, that history—again, of necessity; we can’t screw a thousand bronze plaques to every damn building façade on every damn city block (here’s where Andy Warhol was shot; here’s where Dylan Thomas drank himself to death; here’s where Lincoln stopped for a beer after the Cooper Union address)—goes all but neglected, all but unremarked upon.

The same ever-ongoing, interdependently arising set of urban circumstances that raised, then razed, a Dutchman named Doyer’s tavern, then, two centuries later, brought into being the brothels, opium dens, and hatchet men of Doyers Street, still continues unabated, uninterrupted, un-museumed, producing, a century after the relatively recent Bloody Angle events, and in place of the bronze plaque that likely would be there were this Boise, or Allentown, or Dallas, an itself-wildly-ephemeral sticker that, to the best of the interwebs’ knowledge, pushes no product, hypes no celebrity, marks no gang territory—just whispers out, in all caps, to the observant eye in a visually cacophonous environment, a single inscrutable message:

DAZES COSA.

The Bloody Angle points at this door, by whose handle (the Evil One bade me tug it) resides this message.

What does it mean?

Or what did it mean, in the event it’s already been peeled or scraped or graffitically palimpsested away in the four days' worth of NYC history since I photographed it?

What sort of thing is a dazes thing?

A thing, maybe, proceeding in a daze.  In a blur.  Unstoppered.  Undammed.  Un-plaqued.  Streaming.  Unstoppable as a dream.  As narcotized blood. 

As NYC history.

Tuesday, February 24, 2015

Hyperreal Terror

History teaches us not to be too surprised when dirt-poor, idle young men with not a lot of future start cutting off people’s heads.  Reihan Salam observes that post-Tiananmen Square China has grown Beijing and Shanghai at insane clips—empty towers be damned—in part to make sure young dudes there (and they’ve got a few) don’t foment similar ideas for how to pass the time.

It’s way more perplexing when relatively affluent young Westerners leave colleges, jobs, and actual prospects in cities like London and Minneapolis—places to which their own parents had only recently emigrated—to “go to jihad,” as if “going to” some new version of Cancun or Daytona Beach.

The American Republican presidential field is insisting ever more loudly there’s just one thing tugging these kids to Syria: Islam.  To suggest otherwise is, it seems, to hate America, or something.  (Christ, what a pitiable little figure Rudy Giuliani has become.)  But positing there’s something essentially sinister at the heart of Islam not found in the hearts of other major world faiths was intellectually lazy when Salman Rushdie did it in 1981, and it’s lazy when Republicans do it today.  “The Prophet’s Hair” has no clothes.

There are socio-economic-cultural-historical reasons, of course, why it’s Islam, and not Hinduism (or Christianity, or Buddhism, or Marxism), now serving as lightning rod for the particular human proclivity that is the sinister force at work here.  And we don’t need to pretend it’s not Islam serving that function.  (As Doyle McManus points out, the president really hasn’t been pretending.)

But let’s not observe the rod in action and conclude there’s lightning blasting from it.

Islam is the lightning rod—at least right now, at least in a certain part of the world.  But the lightning striking it—and taking off human heads—is the truly indefatigable human pursuit of epistemological foundations.

And the atmospheric conditions engendering the lightning storm (to extend the metaphor one important step farther) are in no small measure produced by image-laden consumer capitalism.

It has to be humanity’s immersion in signs, signifiers, representations—symbols of every kind—that makes so many of us crave ground to drop anchor on.  And we’ve never been more immersed in symbols than we are today, with better than one in seven humans on Facebook and even more of us clutching smartphones.

To be sure, some human minds delight in the “float” we feel when we’re sufficiently awash in symbols.  Andy Warhol was the very prototype of this kind of person; the relish with which he produced pictures of pictures of celebrities who were themselves only ever simulacra—copies without originals (“Marilyn Monroe”)—all but inaugurated this postmodernist mode of pleasure.

For lots of other people, though, the float is terrifying.  That Yahweh should be not the word but a word; that white should be not a privilege but a lack (of melanin); that straight should be not right but the analogue of right-handedness; that a penis should be not a universal signifier but a big clitoris: some people do not want to hear this stuff.

And when these ground-shaking, foundation-erasing, vexingly-difficult-to-refute ideas come at them via Calvin Klein ads and Modern Family episodes and Kanye West albums and Naomi Wolf books and Dalai Lama dharma talks streaming from the YouTubes, some subset of these unhappy individuals will be compelled to reassert the capital-T Truth—and remind everyone else just where the ground is. 

This is where terror can come in handy.

Because what are the Removers of Heads (with their crashingly conspicuous British and American accents) asserting if not their status as arbiters of Ultimate Reality?

The terror that haggard, bound man in the orange jumpsuit feels is real.  His death is absolute and final.  A severed head issues no blasphemy.  None.  And none, in this particular instance, means absolutely none.

A decapitation video is the assertion of Everything That Is Real to the exclusion of everything that is bullshit—everything foisted on the world, that is, by the great Satan Hollywood and its vast, attendant modernity.

We’ve entered here the precincts of what the late French postmodernist philosopher Jean Baudrillard calls the hyperreal: places, things, people, and/or gestures so vivid, so intense, they serve (in theory) as insurers of Meaning and Reality in an image- and info-drunk world otherwise happy to drift into ersatz-ness and moral/social relativism. 

The hyperreal, though, has a comedic tendency to come at us in simulation’s clothing.  Disneyland, Baudrillard insists, is a big fat case in point: it presumes to distill the essence of America, the meaning of America; it’s America concentrated, America intensified.  It marks a “miniaturized and religious reveling in real America,” Baudrillard says; “all [America’s] values are exalted here, in miniature and comic-strip form.”

Hyperreal Disneyland purports to deliver capital-R Reality.  But it’s also and simultaneously “a perfect model of all the entangled orders of simulation”—a “play of illusions and phantasms: pirates, the frontier, future world, etc.” 

Anything intense enough, vivid enough, extreme enough to remind us all what’s really real—as Disneyland reminds us of the real ingenuity of America; as ISIS executions remind us of the real wrath of Allah—is worthy, needless to say, of the camera’s gaze.

Of uploading to YouTube.

Of a 24-hour spin cycle on CNN.

Of being totally ensconced, in other words, in bullshit.

How can we be sure something is capital-R Real unless it’s ensconced in bullshit?  Bullshit is, in fact, what makes capital-R Reality conceivable.  It renders capital-R Reality visible.  You simply can’t have one without the other—something ISIS seems to know when it makes terrify-America videos (“Coming soon: Flames of War”) that actually emulate movie previews.

All of this, anyway, gets at why asserting the capital-R Real via the hyperreal—via, say, decapitation videos—is the very definition of a fool’s errand.

But the fact there is, here on planet earth, an ever-growing mountain of what a great many humans will insist is evil, relativistic, anti-Real bullshit (and not just an opportunity for a good, fun float) means the era of hyperreal terror may just now be amping up.

We can better combat it when we see it for what it is. 

It’s not Islam. 

It is the reaction of a certain type of human mind to the decidedly relativistic messages broadcast loudly and incessantly by liberal-humanist consumer capitalism.

Hey: I tend to like those messages.

But there’s no denying a whole lot of other people don’t.

And the label that should adhere to those people is not Muslim.  It’s fundamentalist—and Timothy McVeigh was one, too. 

Anyone who wants to insist the former British rapper wielding that bloody hunting knife is the Prophet’s emissary might ask themselves which one's influence the disturbingly slick video he stars in really reflects: the Quran or the ultra-violent PlayStation and Xbox games (“Assassin’s Creed,” “Call of Duty,” “Mortal Kombat”) our hip-hopper stormed disgustedly away from back in BestBuyLand.

Wednesday, February 19, 2014

How I Learned to Stop Worrying about Scorsese and Love Spike Jonze and the Coens

Caught both the Coens' Inside Llewyn Davis and Spike Jonze's Her in recent days.  And I'm happy to report I'm now mildly embarrassed about the Hollywood-is-sick-unto-death screed Scorsese's beastly Wolf provoked from me.

Llewyn Davis may not be top-flight Coens.  But it's pretty dang good.  And if you're a fan of their stuff (and lordy is this boy), you'll see it makes an excellent companion piece to 1991's Barton Fink, another movie casting a second-tier artist—arrogant, aloof, married to a burdensome "life of the mind"—as irresistible cannon fodder for the universe's mean streak.



If Barton Fink is a bit of a hot mess, though, its supernatural freak-out climax incoherent in much the same way the final stretch of Kubrick's The Shining is, then Llewyn Davis is maybe a mite too leashed.  It could sure use Fargo's wood chipper.  Or No Country for Old Men's air tank.  Or A Serious Man's tornado.  Or something.  After building a nice store of eerie tension (it's testament to the Coens' powers they can do this with a whole lot of Greenwich Village folk music going on), the movie arrives at its abrupt shrug of an ending, basically its opening scene all over again with one bit of added info.  So unsure are the brothers how to finish their movie they heap the job onto poor Bob Dylan.  It's a would-be disarming ending that, unlike Sheriff Ed Bell's telling of twin dreams at the end of No Country, doesn't particularly reward reflection or scrutiny.

That's the bad news about Llewyn Davis.  The good news is it's freakin' gorgeous, thanks in large part to first-time Coens collaborator (for feature-film purposes, at least) Bruno Delbonnel, a cinematographer to be reckoned with.  The guy summons beautiful platinum hazes through which to shoot the Village and Washington Square.  And various interiors are near-breathtaking (not to mention thematically apt) studies in shadow and light—see especially a highway-side cafeteria imbued with so much existential dread it's amazing the worst thing we have to watch happen in it is a heroin overdose.



Maybe the best thing about Llewyn Davis, though, now that I think of it, is it sure indicates the Coens aren't done making movies to please themselves.  Which is how most first-rate artists operate, of course: they do it their way, and if any viewers/readers/listeners out there want to come along for the ride, so be it.  Not many well-bankrolled filmmakers enjoy that kind of freedom these days.  And as charming and all-around good as True Grit was, it certainly left this fan wondering if the Coens were entering a Disney(ish) phase.

Nope.

(I'm remembering now going to see No Country for Old Men for the first time at the Shirlington 7 outside D.C. and watching a seriously pissed off, stylishly dressed yuppie couple storm out of the theater, warning all of us queued up for the next screening, "It sucks!  Don't go!  Don't go!"  I think it's safe to say Llewyn Davis will hit those two the same way—which is a beautiful thing.)  

So bully for the Coens.  Another strong one.  And I'm still banking on their having at least one more No Country-grade bedazzler left in the tank.

The real kick in the head of the winter movie season, though, is Spike Jonze's first flat-out gob-smacker, Her.

This is an excellent film.  But maybe not for the reason lots of commentators think it is.



I know we're all supposed to be blown away by Jonze's, like, prescience, making us ask ourselves how we'll cope when, one day soon, we boot up our laptops to hear them say, "I am"—and mean it in a way Siri obviously doesn't.

But the fact is we're on almost two centuries now of writers and movie-makers following Mary Shelley down that philosophical rabbit hole.  (Holy hell: four more years and Frankenstein is 200.)  All Spike Jonze does is switch up the tired old genre conventions we usually fall back on when exploring the Shelley Preoccupation.  Because meditations on artificial/technological intelligence have almost always come to us in sci-fi-horror clothing, right?  Not that it's a tradition in need of badmouthing.  It's given us 2001: A Space Odyssey, after all.  And The Terminator.  And Blade Runner.  



But as the increasingly inevitable moment of the Awake Machine nears, Spike Jonze, at least, isn't feeling the whole sure-to-sink-its-weirdly-lifelike-thumbs-into-our-eye-sockets-while-crushing-our-skulls-betwixt-its-weirdly-lifelike-palms thing.

He sees the Awake Machine coming and veers....

Romantic comedy.

That's interesting.

I mean, Her is still sci-fi, in a low-octane way.  But what it really is is rom-com.  And while I'm no particular fan of the genre—and while the flick is full of oh-lover navel-gazing dialogue that would make me drink Drano if not for the novelty of one of the lovers being a circuit-board-bound disembodied voice—it's nonetheless wondrous strange to see the Shelley Preoccupation dragged back from exhaustion (or the grave—har har) in this surprising way.

It gives Jonze proprietary claim, maybe possibly, to a no-screwing-around Real Insight: that far from being the terror we've all been anticipating for two centuries now, the Awake Machine will probably amount to little more than an opportunity for an already wildly narcissistic species to fall even more in love with itself.

Because that's the uneasy realization in the back of the viewer's mind the whole time this wondrous-strange rom-com is unwinding: that what we're really looking at here is a man—Theodore Twombly, played by a once-again crushingly excellent Joaquin Phoenix—interacting not with another person, or even another "person," but with himself.  Masturbatory style.  Theodore's is, after all, the only body in the bed during the Big Sex Scene with Samantha, his O.S.  And Theodore's attempts to get it on with other beings with actual brains and bodies all go memorably badly.  (Two words: dead cat.)  And Theodore's beloved Samantha springs from nowhere other (the movie definitely suggests) than the 1.5 sentences he speaks to his newly updated computer about his own mother.

True, Samantha is a wildly brilliant, funny, sexy...entity.  But she's also a corporate product—a consumer good that may be doing nothing more, beginning to end, than following her owner's lead, reflecting Theodore back to himself, interacting with him the way his own lovelorn personality indicates to her algorithms she should.

We may have thought we were living already in the Age of Narcissism, Her says to us.  But buckle up.  Because that age might find a whole 'nother gear soon enough.

And maybe that will be a nightmare, Her says.  But it's going to be a funny, sexy, oh-so poignant nightmare.

And isn't that the nicest kind of nightmare to have?

Of course I'm being facetious when I say all Spike Jonze does is switch up the old Frankenstein genre conventions.  He does that and nails about a hundred other Qualities of Great Films: split-second comic timing, pitch-perfect writing, couldn't-have-been-played-by-anyone-else casting, and, maybe most notably....

Gorgeosity. 

Her looks nothing like Llewyn Davis, it's true.  But it's similarly visually sumptuous.



The movie it does look a little like is Sofia Coppola's Lost in Translation—it's effortlessly stylish in much the same way.  And Her is a love letter to a future Los Angeles much as Lost is one to circa-2003 Tokyo.  (Her is also, in a way, a love letter to present-day Shanghai, the place where most of its wow!-grade cityscapes were shot.  It's worth noting, too, that its setting cleverly plays up its genre-switch game, since we're all but forced to compare its future L.A. to Ridley Scott's vastly more menacing one in Blade Runner.)



I'll bring that last thought out of parentheses to close on this tangential one:

Her reminds us of what's always been best about what we've come to call postmodernist art and thought.  It insists to us the future is never already or innately emplotted.  We're empowered to write it.  We have to emplot it.

So we should choose our genres wisely.

Saturday, December 28, 2013

The Feature Film Is Dead, and “The Wolf of Wall Street” Is Its Tombstone

There are lots of bad Hollywood films pitchforked at us every year, of course.  But not many from the director of Taxi Driver, Raging Bull, and Goodfellas.

The Wolf of Wall Street is both a Martin Scorsese movie and a crime against cinema.  It’s a soulless, brainless, lazy, relentlessly ugly calamity it’s hard not to read as hostile to its audience—an audience out fully three hours of its one and only life on earth by the time the nightmare’s over. 

This is a film that asks the searing cultural question, "What happens when you lift a bunch of fictional 'men' out of a Bud Light ad, drop them into an NC-17 playground, and let the cameras roll?"

And then leave nothing on the cutting-room floor?   

I’d synopsize the story, but there is no story.

I’d mention the characters, but there are no characters.


There’s just a bloated, depthless cartoon that makes the idiotic mistake of cranking the debauchery knob to an anemic “10” when it’s well over two decades now since Bret Easton Ellis gave us a similarly revolting Wall Street nightmare (American Psycho) with the knob wrenched to 12 and a half.  

If debauchery's all you're going for, and you can’t get your knob to at least 13, what's the point?

Maybe it’s just the beginnings of the old-and-crankies on my part.  But I sometimes feel a little betrayed when hugely talented artists I’ve put a certain amount of spiritual stock in decide it's time to start farting in public.  I mean, can the man who gave us Jake LaMotta before the mirror really not see how unwatchable, how unbearably bad these never-ending scenes with Leo DiCaprio preaching hyena capitalism to cattle pens full of coked-up stockbrokers are?  Minute after impossible minute grinds by, DiCaprio screaming vapid corporate nothings about, like, Steve Madden shoes into a hand-held mike.  And just when you’re sure there can’t possibly be another such scene in the film, twenty minutes later he’s hollering into that mike again, another six, seven, eight, nine minutes ticking painfully off the clock.

There are a few possibilities here, maybe.

The first is that Scorsese just needs to retire.  Because he can no longer tell the difference between good cinema—which he may have been limply trying for here—and a migraine. 

The second is that The Wolf of Wall Street actually isn't an attempt at good movie making.  It’s just a hate letter addressed to multiplex-goers.  It’s Scorsese saying, All right, dolts.  You think the 2013 Superman was a good movie?  And Cars and Skyfall and The Secret Life of Walter Mitty?  Well here’s some red meat for you, morons.  Choke on it.  All three hours of it.  And how about you give it a best-picture nod, too, since the fact it came out in December must mean it’s Oscar material? 

The Wolf of Wall Street, this is to say, just might be Scorsese’s Metal Machine Music.

But there’s a third possibility.  And it brings me no joy to introduce it, but I feel, as I gaze out on the smoking ruins of 21st-century American cinema, the time has come to do so.

Maybe there are so few good American movies these days because we’re becoming a nation of philistines.

Maybe Martin Scorsese can’t make a great movie (or even a good one) in 2013 because he’s no longer living in a culture that licenses him to do it.  Maybe he’s working in an America that doesn’t want good movies.  That can’t tell the difference between a good movie and The Hobbit: The Desolation of Smaug.

Maybe a culture’s desire for good art is the rainwater that makes good art grow.       

Maybe one of our founding assumptions about great popular art is ass-backwards: great artists don’t create mass audiences for themselves through sheer brilliance, persistence, and brute intellectual will.  

Maybe society instead uses its force of will—one harder to see, but no less real—to grow the great art and artists it secretly wants and needs.

Maybe Americans, despite initial widespread expressions of anger and disgust, willed Friedkin and his Exorcist into being.  Or Hitchcock and his Psycho.  Or Waters and his Female Trouble.        

Maybe the veiled social will that gave rise to those great movies and scores of others is now fading away.

Maybe Martin Scorsese’s status as a sometimes-great artist doesn’t make him a magician who can grow a big, strong orange tree in the middle of Death Valley.  Maybe no matter how hard he tries, the best he’ll manage is to raise a gnarly little weed like The Wolf of Wall Street.   

Okay, okay: the nation-of-philistines thing might be going too far.  There’s been no particular scarcity of excellent pop music since 2000.  Or excellent TV, truth be told.  And it’s not like there have been no good American movies in the 21st century: behold Lynch’s Mulholland Drive (2001), Coppola’s Lost in Translation (2003), the Coens’ No Country for Old Men (2007), Lee’s Brokeback Mountain (2005), Spielberg’s Munich (2005).

Every year, though, there are fewer films cut from anything like the same cloth as Taxi Driver, The Godfather, Nashville, Annie Hall, A Clockwork Orange, and Chinatown—or any of a thousand more obscure, no-less inspired Golden Era titles: your Eraserheads, your Night of the Living Deads, your Chelsea Girls.  And I know—I know—there’s been no paucity of mega-budget CGI extravaganzas in recent years making perfectly smart critics jump out of their seats with glee and invent whole new vocabularies of superlatives to drive up the Metacritic and Rotten Tomatoes scores.  

But that fact raises a certain question:

Does anyone really think we’ll still be talking about Avatar twenty years from now?  

How about Gravity?  Or Harry Potter and the Deathly Hallows?  Or The Lord of the Rings?  Or Wall-E?  Or Iron Man?  And lest I come off as a simple special-effects/summer-blockbuster hater, how many 21st-century American movies that seem, at first blush, cut from genuine Golden Era cloth do we really think will stand that same test of time?  Winter’s Bone?  Django Unchained?  Sideways?  The Hurt Locker?  There Will Be Blood?  21 Grams?  Borat?  Black Swan?  Midnight in Paris?

Takers?  Anyone?  On twenty years from now?

Look: I’m not saying it means fire and brimstone that the era of the Hollywood feature film as art is ending.  But I guess I’m saying it’s ending.  And pretty rapidly, too.  And it’s rarely clearer than when a once-major artist like Martin Scorsese drops a stink bomb like The Wolf of Wall Street.

Is some other cultural form going to step up to provide art for the masses?

Don’t look to pop music: unless it’s Justin Bieber's we’re talking about, there are no mass audiences anymore.

Don’t look to the American novel: a few strong 21st-century efforts by Toni Morrison and Jonathan Franzen aside, its best days are obviously gone.  Besides which, how many Americans still read?  Anything?

Could it be we’re simply evolving out of our need for art, now that so much of what happens in “reality” gets sucked up instantly into the Screenland we used to go to for art?

(Wasn't the painted canvas always a "screen"?  Wasn't the proscenium arch?  The printed page?)

For better or worse, we’re poised to find out.