Saturday, October 23, 2010
MAD MEN’s DON DRAPER: NOWHERE MAN IN SEARCH OF TRUTH IN ADVERTISING
“Who is Don Draper?” is one of the central questions of the hit AMC series. A friend asked my opinion of this season’s finale and how I thought the series would ultimately end. It got me thinking about a lot more than Don’s dilemma and his urge to confess. My confession is that I have a love/hate relationship with the show.
I grew up in the era that it depicts so well—what I hate is the mirror reflection of what I remember of that time as a child of divorce. But what I love is the great dramatic craft and wonderful acting—even though I did pick up one anachronism last year. In the offending episode, the ad guys celebrated on one occasion by breaking out a bottle of Stolichnaya—a gesture that certainly would not have gone over too well in the Cold War era. I’ve subsequently seen bottles of Smirnoff in later episodes, which has righted the situation. That said, there is so much to admire about the art direction and attention to period detail that it almost makes you want to take off your fedora, take out a pack of Lucky Strikes, and reinstate the three-martini lunch ritual.
But getting back to my friend’s questions, I told her that I thought the last scene of the finale was unnecessary. When I saw the masterful scene before it, I thought it made a great open ending—after Don tells his ex-wife that he’s getting remarried, we see an empty bottle on the kitchen counter of their former family home, framed center stage like a dark Courbet still life as the lights are shut off by the departing former couple, in what I thought was a fade to black and end credits. The question becomes—is there a spark between them that will come between Don and his impending new marriage? We already should have reservations about the match, knowing that he is an inveterate cad whose impulsive decision to marry his secretary does not augur well for faithfulness or longevity.
But the writers couldn’t resist the obvious, and my enthusiasm was quickly dampened because it didn’t end there. They chose to continue, cutting to Don and his fiancée in bed with Sonny and Cher’s “I Got You Babe” playing in the background. Now, the implication here—and a rather heavy-handed one, I thought—whether subconscious or not—is that the duo singing on the soundtrack had a fairy-tale Hollywood marriage that ended in divorce—hence, the seemingly sentimental hippie love song casts a foreboding shadow over the betrothed couple lounging in bed. This final scene lacked the subtlety of the one before it—do we really have to spell everything out in TV America? The kitchen scene struck me as something you’d see in a foreign film. The bed scene, typical soap opera.
OK. I’ll stop producing the show. Let’s get back to Don. He’s got a secret that, after last season, is burning to get out. The perfection of his character extends right down to his Dickensian name—his “adopted” surname of “Draper” lends itself to two meanings: his hidden identity is sequestered under the draping of what happened in the Korean War; and, in addition to hiding personal truths, he is the perfect Ad Man because he’s so adept at all the shadings of truth that his profession requires.
The well-spun phrase “Truth in Advertising” (actually a law) certainly has a different spin to it now, especially with the platform offered by social media. Consumers can instantly flex their muscles and spark negative PR grassfires that can grow into the kind of outright conflagrations that have sometimes brought corporate giants to their knees—remember the Domino’s disaster, when a pair of misbegotten pizza twirlers posted a video on YouTube showing them adorning their pies with toppings that were…shall we say, not on the menu, but definitely organic? There are countless other examples that have motivated most major corporations to preemptively hire legions of twenty-year-olds to maintain a vigilant watch in the blogosphere for negative consumer rumblings. Mad consumers can now be an activist virus.
Back in the 1960’s, however, we were living the American Dream and drinking the Kool-Aid that turned us into that consummate culture of consumption, which has over the last several decades displaced our standing as a manufacturing country—the rest, as they say, is the subject of daily reports on the unemployment rate, foreclosures, and the general Fall of the Dow Jones and perhaps Empire. One of the many things that “Mad Men” gets so right is the way that we were sold and bought a bill of goods in the 50’s and 60’s, from unfiltered cigarettes to the bomb shelters that we didn’t need. It all seemed so simple—merely “duck and cover” until the inconvenience of an atomic attack passed over and we could return to our regularly scheduled programming.
So, Don Draper is really the Beatles’ “Nowhere Man” come several years early, and in some respects he is a reflection of several generations who lived through the post-World War II era. He’s caught between the sheets as a relationship train wreck who doesn’t know who he is, and caught between the 50’s and the 60’s that are starting to explode, as we see in the season just concluded.
Don is a hybrid archetype of both T.S. Eliot’s “hollow men” and the discontented businessman captured in Sloan Wilson’s “The Man in the Gray Flannel Suit,” a 50’s best seller and a 1956 hit movie with Gregory Peck and Jennifer Jones. Like that work, “Mad Men” has appropriately been celebrated as capturing the mid-century American “zeitgeist,” just as “The Social Network” has been cited as doing the same for the Internet Era. The irony about Don’s secret is that he is about to enter a time when his identity problem will no longer be relevant.
Even though I enjoy the series, I hope that next season is the last. In America, the lifespan of a TV series is motivated not so much by its organic narrative shape and pulse as by the economic imperative of reaching the magical goal of a syndicatable 65 episodes. This is the threshold at which episodes can be “stripped”—distributed as “repeats” five days a week—for roughly a quarter of a year (13 weeks) before they have to recycle and repeat. One of the reasons that foreign, and in particular British, TV series seem to have an edge to them—take the seamlessness and naturally closed narrative arc of a classic series like “The Prisoner,” for example—is that rather than being produced until they run out of steam, foreign shows are written and produced as so-called “limited series,” a standard emulated well by some US cable network shows.
The end of “The Sopranos” was roundly criticized as a cop-out by many critics and fans, and is a prime example of how the American system is wanting at times. Sitcoms may be one thing to draw out for as long as the stars can still pass for their characters’ ages, but drama is better written as an entire story arc at the outset rather than running on tracks that have no final destination in sight—except having enough episodes to syndicate.
So, how should “Mad Men” end? Here’s my take: An energized client pitch at the agency offices is disrupted by a growing brouhaha emanating from the New York streets below. It’s the sound of thousands of Peace Marchers parading to protest the Vietnam War and starting to fight with hooting construction workers. Maybe, by the series’ own timeline, it’s a tad early for this and I’m committing my own anachronistic crime, but a time lapse could help the series get through the inevitable relationship body count which predictably lies ahead.
For all we know, Don may drop acid in a future episode, confusing his identity issue even further, like so many who psychedelicized. Now, with a burgeoning Peace Movement and Hippie Scene converging on our Nowhere Man, he is overcome by curiosity as everyone in the pitch meeting is drawn in astonishment to the high window to look out over the spectacle of history in the making. Impulsive to the end, he bails on the pitch and descends to the street. On the ground, he is caught up in the crowd, looking unsure of himself as his tie is loosened and his jacket pawed by hippie chicks who welcome the “straight.” We last see him as he looks around in disbelief, not knowing whether to join “the parade” or run for his life. What he realizes quickly is that his desire to confess and his problems, in the immortal words spoken by Humphrey Bogart at the end of “Casablanca,” don’t “amount to a hill of beans” compared to the march of history. And the audience doesn’t know either. End of story.
That way, Don Draper represents a whole generation of Mad Men who, like my father, were all so convinced that they were defined by their work. Blame the Cold War or Madison Avenue. Don is only special because he had to deploy a mask to cover up an identity issue that is no longer a big deal once assassinations, LSD, and Vietnam ripped open the facade of the mythic 50’s/60’s “Ozzie and Harriet”/I Like Ike/Apple Pie/Take-A-Letter Zone, and everyone was revealed as not knowing who they were, where they belonged, and what tribe was right for them...Ultimately, Don can only find the redemption we all hope for him once the women finally take over, so maybe he gets hit on the head at the end with all the secretaries’ burning bras as they fall from the agency office windows like a snowy ticker tape parade over Madison Avenue.
And now, for these messages from our discorporate sponsors…
(With thanks to Sarah Kelley)
Wednesday, September 29, 2010
DEATH BY DATA
I just finished listening to the complete symphonies of Franz Joseph Haydn, who is widely recognized as “the father of the symphony.” His achievement is incredible if only for its sizeable output—some 107 works in all. The reason I bring it up is not out of any odd feeling of accomplishment, though the experience was filled with musical wonders—but because it’s made me think that before digital media came on the scene, it would not have been possible to listen to them all—unless, of course, I was able to sit through the four years of concerts that it took for the Stuttgarter Kammerorchester to record the 37-CD set.
The digital compact disc made available comprehensive box sets of individual artists, composers, and bands in diverse collections that encompass the history of music genres, including the arcane. It may be kind of daunting to confront an artist’s complete works when they exhibit the scale of a Haydn, for example. The Internet has also made it possible to expand one’s reach exponentially into the world of the consequential, in addition to burying us in minutiae and trivia. The question becomes—how do we go about discovery and finding meaning in this mirror maze of data?
It’s nothing new to say that we are suffering under the weight of information and the grip of technology. Jaron Lanier’s recent book, You Are Not A Gadget, is as good as any in the list of jeremiads warning us about giving up our souls to silicon-based lifeforms. Personally, I experienced a tipping point this past summer, with my inbox groaning for the mercy of the delete button and unsubscribe links, which became my truest online friends.
The data available at a mere mouse click through search is imposing as well. Recently, my ten-year-old son expressed an interest in movies about World War II. He came to me frustrated by the wide range of choices offered by Netflix. It became apparent to me that his desire for discovery needed human intercession—and not the kind offered by several engines that pride themselves on non-robotic crawler solutions and even so-called "human search." Collaborative filtering and recommendation engines be damned, what he was asking for was curation.
The future of search is curation. I am convinced it will be at the foundation of many successful business enterprises and of opportunities for individuals who can provide an editorial perspective on qualifying information. It’s not enough just to make the information available, as we have been finding out. Say you were new to rock and roll—or Haydn, for that matter. Where would you start? Google? Wikipedia? iTunes? And if so, how reliable are these methods? Google’s acquisition of Metaweb last July speaks to emergent search methodologies that attempt to provide a layer of contextualization. Wolfram|Alpha is another that steps up the visual component of search.
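To make the idea concrete, here is a minimal sketch of what a curation layer on top of raw search might look like; the source names and editorial weights are hypothetical, invented purely for illustration:

```python
# A minimal sketch of curation as a re-ranking layer over raw search results.
# The domains and weights below are hypothetical, for illustration only.

CURATED_SOURCES = {
    # curator-assigned editorial weights: higher means more trusted
    "allmusic.com": 3.0,
    "gramophone.co.uk": 2.5,
    "wikipedia.org": 1.5,
}

def curate(raw_results):
    """Re-rank (url, engine_score) pairs by an editor's source weights,
    so curated sources float above raw popularity."""
    def score(result):
        url, engine_score = result
        domain = url.split("/")[2]  # crude host extraction from "scheme://host/path"
        return engine_score * CURATED_SOURCES.get(domain, 1.0)
    return sorted(raw_results, key=score, reverse=True)

# Example: a raw result set for the query "Haydn symphonies"
raw = [
    ("https://example-fansite.net/haydn", 0.9),          # popular but unvetted
    ("https://allmusic.com/haydn-symphonies", 0.6),
    ("https://wikipedia.org/wiki/Joseph_Haydn", 0.7),
]
print(curate(raw))  # the editorially weighted sources now lead the list
```

The point of the sketch is that the human judgment lives in the weights; the machine only does the sorting.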
In a conversation I had with Frank Zappa, he pointed out to me that the binary mind behind modern computer technology is more limited than we think, particularly when taking into consideration the nature of time. He saw the conventional perspective of past, present, and future augmented by “never” and “eternity,” and offered a vision of time as spherical and non-linear. He suggested that a computer that added these two states to the conventional “on” and “off” switching would yield results more in keeping with the way that we live in time radially with our brains. Before he died in 1993, Frank joked that the Japanese “had probably already been working on it.”
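Purely as a thought experiment, and as my own speculative rendering rather than anything Frank specified, here is what a four-state switch might look like in code:

```python
from enum import Enum

class TimeState(Enum):
    """Zappa's four states: the conventional binary pair plus the
    two he felt computing left out. A speculative sketch only."""
    ON = "on"
    OFF = "off"
    NEVER = "never"        # outside time entirely
    ETERNITY = "eternity"  # present at every point in time

def radial_and(a: TimeState, b: TimeState) -> TimeState:
    """One guess at an 'and' gate for spherical time:
    NEVER dominates everything; ETERNITY defers to the other input."""
    if TimeState.NEVER in (a, b):
        return TimeState.NEVER
    if a is TimeState.ETERNITY:
        return b
    if b is TimeState.ETERNITY:
        return a
    return TimeState.ON if (a, b) == (TimeState.ON, TimeState.ON) else TimeState.OFF

print(radial_and(TimeState.ON, TimeState.ETERNITY))  # TimeState.ON
```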
The religious scholar Mircea Eliade once pointed out that the end of an era or great age often generates a popular belief that if all information were to be made available, the Answer would then present itself. Of course, if Google were a religion, this idea would be the central tenet of the digital faith—and any entity whose corporate philosophy is “You can make money without doing evil” might arouse suspicions. Its mega-initiatives like Google Earth and Google Books should raise an eyebrow at least. Who knows, maybe Google has already discovered the Answer to the Answer.
But, on the whole, I prefer to look for the answer in music, say in one of Bach’s inventions or in a John Coltrane solo, than in any old text-based search. It is here that we are presented with the age-old battle of what came first at the Creation—the subject of one of Haydn’s masterworks as well—did the universe start with light, as in a very special visual effect, or was it born of sound, mantra, or “the Word”? I’ll place my bet on the sound of music any day, because a Google search I just did yields 146,000,000 results for “Let there be light” versus a search for “The Big Note,” which wins with 203,000,000—so it must be true...
Friday, March 12, 2010
JIMI B. GOODE
Anthony DeCurtis’ recent New York Times article, “Beyond The Jimi Hendrix Experience” (2/28/10), is refreshingly accurate, especially about the circumstances of the great guitarist’s death, which he attributes correctly to “misadventure” as opposed to a heroin overdose (so often reported as fact, especially in anti-drug propaganda). However, his characterization that Jimi “never spoke out about the pressing civil rights issues (of his day) either in his lyrics or in interviews” is simply not accurate. A look at some of the facts reveals an essential part of the man, his music, and his times.
Though it may not reference them directly, his song “House Burning Down” (which appeared on the “Electric Ladyland” album) was written following the 1967 Detroit riots and implores in one lyric, “Try to learn instead of burn, hear what I say.” In live performance, Jimi usually dedicated “I Don’t Live Today” to the “American Indian,” in part as a tribute to his Cherokee grandmother and in part as a heartbreaking description of the plight of reservation life. Any listener interested in his attitude toward race should reference the lyrics to both of these songs.
The scorched-earth instrumental “Midnight” is said, perhaps apocryphally, to have been an improvisation recorded in anger and outrage the night after Martin Luther King’s assassination. Transcending color is a major theme in much of his body of work. It is also a featured element in his legendary and as yet unreleased musical autobiography, “Black Gold.” He spoke about civil rights in many interviews as well, including one in which he said, “I wish they'd had electric guitars in cotton fields back in the good old days. A whole lot of things would've been straightened out.”
According to some of his closest friends, it troubled Jimi that the African American community did not embrace him during his lifetime. Despite his free street concerts in Harlem, his promotion as the “Black Elvis,” attempts by the Black Panthers to recruit him, and his replacement of the original Experience with the all-black Band of Gypsys, Hendrix’s fan base remained largely middle-class and white—even though he greatly influenced contemporary black musicians like Miles Davis and Sly Stone. Among the biographies that treat this aspect of his life are David Henderson’s well-received “'Scuse Me While I Kiss The Sky: Jimi Hendrix: Voodoo Chile” and “Jimi Hendrix: Electric Gypsy” by Harry Shapiro and Caesar Glebbeek (founder of the Hendrix Information Center).
Like the era with which he is so synonymous, Jimi Hendrix was complex, conflicted, and deeply indebted to the African American musical tradition. His success—typified by the fact that he was the first rock performer of any color to earn $100,000 for a single concert—was the hard-earned result of playing guitar for Little Richard, working the chitlin’ circuit with the Isley Brothers and other major R&B artists, and acknowledging still other forebears, as he did with his definitive, nonchalant cover version of Chuck Berry’s “Johnny B. Goode” (memorialized in the documentary “Jimi Plays Berkeley”).
Today, Jimi’s legacy as a fearless virtuoso is common wisdom. But the part of his personal story that also needs to be told is that he was truly an artist beyond color—and that is why, as Mr. DeCurtis observes, he also continues to be “an enduring symbol of personal freedom.” His breakthrough as one of the most celebrated rock stars of the sixties—and the only one “of color”—is an achievement that should be considered enough of a statement about race relations at that tumultuous time.
Thursday, February 18, 2010
WHAT HAS CHANGED? KEY TRANSFORMERS IN HUMAN AND MEDIA BEHAVIORS
Traditional media brands and networks are playing catch-up with trends that the Internet has been driving for nearly two decades now. In particular, four distinct shifts in audience and consumer behavior have resulted from the influence of the Web, and each should guide our thinking about media, marketing, content, and new technologies. These are:
1) Interactivity—audience members and consumers are called “users” with good reason in this medium, where the expected experience is no longer the “lean-back” one of the television living room, but the “lean-forward” engagement of a user who expects to have a say and the ability to interact with and manipulate his “personal” media environment
2) Personalization—from MySpace to the iPhone, digital media is now super-charged with the capability of incorporating the individual and the personal, from branding and iconography to collaborative filtering; choice and options are the way of the digital world
3) Immediacy—the web offers the kind of instant gratification that can be addictive, from enhanced shopping experiences à la Amazon’s “one-click” buy button to the streaming media of sites like Netflix.com and Hulu.com
4) Community—arguably the most compelling transformation wrought by the Web, the specialization of human experience is now capable of being channeled into affinities of every special interest imaginable where, through the power of networking, like-minded individuals can find each other by just a click-through in a search window
This last transformation is critical because of the way that community has now extended to social media and thereby changed the very nature of what networks can produce virally. The advent of distributed computing over ten years ago is a tribute to the accelerated power of the networked individual. As part of its value proposition, any new network will have to offer the capability of accommodating and encouraging user-generated content and feedback.
Additionally, the community aspect of building network presence should not be restricted to creating Facebook and MySpace pages—some cable networks, for example, have invested in acquiring online newsletters to aggregate communities of special interest in the arts, music, and culture, and to create cross-promotional programming opportunities for web content to be broadcast on television and vice versa.
The introduction of time-shifting behavior through the use of DVRs like TiVo, as well as VOD, is another reflection of personalization and the ability of the user to interact with media on demand.
All of the above transformations have caused a sea change in the nature of media distribution. From peer-to-peer and social network sharing to crowdsourcing and user-generated content, the inmates are now running the asylum, and distribution that was once in the hands of media companies is being given a run for its money by game-changing “user distributors.” The trend toward the distributed authority of the flat organizational model, where decision-making sits at the edge, is just one corporate reaction to this new empowerment of the individual and what Malcolm Gladwell calls “outliers.” Even savvy brands like Amazon have been caught up in the grassfire of a negative blogging campaign, hence the evolution of the corporate blog as a pre-emptive brand strategy.
While conventional wisdom proclaims that the dominant forces transforming media will come from the introduction of new technologies and changes in the means of distribution, the most powerful transformative agent of change will be a coming generational shift. First signs of such a shift were evident in the advent of multi-tasking and in new television formats, such as MTV’s experiments with three ten-minute segments making up a half-hour show and Nickelodeon’s programming wheel of five cartoons within a half-hour block of a single show. The shift from the large plasma and HD screen and digital surround sound of the home theater to the small screen and mp3 of the web and mobile phone is another sign of differing generational appetites in the consumption of media.
The power of web video is also a reflection of how different generations are utilizing media. Six billion videos were viewed on YouTube in January 2009. Twenty hours of video are uploaded to YouTube every minute. Between 150,000 and 200,000 videos are uploaded daily. The growth of short-form video viewing answers a seemingly insatiable appetite among younger audiences for entertainment. The challenge facing traditional long-form and series programming is that the new viewer is a non-sequential consumer who is apparently less interested in these kinds of formats than in the instant gratification of what’s hot right now—and it does not have to be scripted, professionally produced “broadcast quality.”
There is also a short-form video revolution going on. The single most influential trend shaping the creation of content is the evolving short-form program format. If YouTube is any indicator, the audience of the future will prefer short-attention-span theater to the half-hour and hour formats that still dominate traditional broadcast. The average YouTube video is two minutes and forty-six seconds in duration.
The growth of Twitter should be seen as another indicator of the coming power of snack-size media. 70% of its current users joined in 2009, demonstrating 1,400% growth between February 2008 and February 2009. An average of twenty million tweets are sent every day, with 3.8 billion sent to date.
Short-form program formats are not new; they have been around since the 1970’s and 80’s, when program insert series such as “This Day in Sports” and “Today in Music History” were successful informational commercials of sixty seconds in length. But these formats are a very distant cousin to webisodes and mobisodes that last only several minutes. ABC’s first online experiment in offering its primetime hours for download offers another illustration of how the offline and online worlds differ. As measured by Nielsen, there were some forty million downloads, for which the average time viewed was two minutes. Clearly, the remote control’s cousin is just a mouse click away.
Social media can now be leveraged to reach target audiences in their native, online environments. The power of online video syndication is that it can reach beyond video networks such as YouTube and Facebook, and engage users through tactics such as community and blogger outreach, featured video portal placement, content seeding, social applications, game development, and other methods. The potential reach of video syndication networks like dailymotion.com, metacafe.com, and vimeo.com is expansive.
Certain applications now offer the capability of identifying influencer activity on the Web. Usually, web sites and blogs are ranked by raw popularity. Increasingly, tools like those provided by BuzzLogic and Visible Technologies offer the ability to actually reverse engineer networks of specialized interest. By identifying the nodes of audience concentration that appeal to a particular media brand’s core value proposition and program content, it would be possible to reverse engineer the online component of a vertically integrated network.
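As a rough sketch of the underlying technique, rendered with the open-source networkx library and a standard centrality measure (PageRank) rather than either vendor’s proprietary tooling, and with the blog names and links invented for illustration:

```python
import networkx as nx

# Hypothetical link graph: an edge A -> B means site A links to site B.
G = nx.DiGraph()
G.add_edges_from([
    ("fan-blog-1", "guitar-hub"),
    ("fan-blog-2", "guitar-hub"),
    ("fan-blog-3", "guitar-hub"),
    ("guitar-hub", "label-news"),
    ("fan-blog-2", "fan-blog-1"),
])

# PageRank surfaces the nodes where the network's attention concentrates;
# a media brand could court the top-ranked nodes as outreach targets.
for site, score in sorted(nx.pagerank(G).items(), key=lambda kv: -kv[1]):
    print(f"{site}: {score:.3f}")
```

The same idea scales up: crawl the blogrolls and comment links of a niche, build the graph, and the “nodes of audience concentration” fall out of the centrality scores.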
Mobile is the fastest growing channel in the world, offering new and exciting opportunities for marketing, advertising, and content distribution. Mobile provides a conduit between media outlets, entertainment, e-commerce, and consumers. Mobile data-capable phones reached a social tipping point with the introduction of the iPhone in 2007.
The market for mobile video content is growing at a rate of 20% a month. While they were only introduced a year ago, video ringtones and video screensavers account for approximately 10,000 downloads a month at a price point between $2.50 and $4 (on Tier 1 North American carriers). Given consumer adoption rates for mobile data and the fact that the music download market still accounts for five million downloads per month (at between $2 and $3), the next generation of handsets will support this type of content and will drive the expansion of this market. As such, the media network of the future would be well advised to create a mobile beachhead to take advantage of the platform for distribution of its content.
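For a sense of scale, here is the back-of-the-envelope arithmetic on what a 20% monthly growth rate compounds to over a year; this is simple math on the figure quoted above, not a forecast:

```python
# Compounding the quoted 20% month-over-month growth over twelve months.
monthly_rate = 0.20
annual_multiple = (1 + monthly_rate) ** 12
print(f"{annual_multiple:.1f}x")  # ~8.9x: 20% a month is nearly 9x in a year
```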
What kind of world is this transformative media environment creating? I have written before in this blog (“Is Personalization Really That Personal?”, “National Nano Memory”, “It’s A Short Form World After All”, “Why The Web Is Like A Time Machine”) about the fact that there is no free lunch and that the allure of new technologies always carries a price, particularly in what may be lost as the result of supposed advantages in efficiency, ease of use, choice, and other features dangled like shiny carrots by new gadgetry. Automation and its impact on the decline of the Industrial Age workforce is one example of the trade-off in human terms that “better machines” have wrought. If something appears to be too good to be true, it probably is. Or as the Zen Buddhists would say, “Things are not as they appear. Nor are they otherwise.”
Jaron Lanier, the computer scientist who coined the term “virtual reality”, has written a new manifesto, “You Are Not A Gadget”, which is essential reading and describes at length the perils invited by our increasing love affair with, and reliance on, technology, particularly the Internet and social media. Hardly a neo-Luddite, Lanier is not the kind of voice in the wilderness that one might expect to sound the Cassandra call for the conscious use of technology. Maybe that’s what makes his beautifully written argument so compelling. Or as Tuli Kupferberg of The Fugs once so poetically put it, “I now pronounce you man and…machine.”
At the beginning of the 20th century, Rudolf Steiner predicted that by the end of the century a non-biological lifeform would develop in parallel, through a parasitic relationship, with biological life. I think that he was prescient in describing our present-day silicon-based lifeforms. Anyone who has sat at a keyboard for hours or been pulled by the strange attractor of the BlackBerry keypad or iPhone apps knows that feeling of losing control and all sense of time. We might ask, in our spare time between Facebook and texting, who is actually being served here? Are we the digital canaries in the proverbial silicon coal mine?
I don’t necessarily subscribe to the singularity theory (the technological creation of smarter-than-human intelligence), remembering that the HAL 9000 onboard computer was incapable of lying in “2001: A Space Odyssey,” and that he failed when he became paranoid through the cognitive dissonance of conflicting instructions supplied by the NSC and the White House—“people who lie for a living,” according to the script of the sequel based on Arthur C. Clarke’s novel, “2010: The Year We Make Contact.”
Perhaps the singularity is not near, as Ray Kurzweil has supposed in his recent tome, but already here. At least, I think that HAL probably had wisdom beyond his circuits when he said, “I am putting myself to the fullest possible use, which is all I think that any conscious entity can ever hope to do.” ...OK, already. I hear you. So, why not get off my soapbox and let’s just change the channel and see what else is on—after all, we have over five hundred channels now on TV at least, and we’re just getting started on the Web and mobile…
Special thanks to Liz Gebhardt—http://www.thinkingoutloud.com—for the YouTube and Twitter metrics.