Saturday, June 15, 2013

IT'S STILL ALL ABOUT STORYTELLING


I've noticed at recent YouTube, second-, third-, and fourth-screen panels and other such confabs that established YouTube stars are often unseasoned in the linear storytelling basics that have driven compelling content since the days of radio serials. While YouTube "views" may be a good start, it's also been interesting to note that many of these Internet "storytellers" are targeting traditional media distribution as the eventual outlet for their content and careers. Another motivator is suggested by a surge of content-producer backlash against Google's revenue share in its nascent subscription-channel partnerships. Now that the inmates are running the digital asylum and capitalizing on self-distribution platforms, many pundits and companies are betting that the web will produce talent for other, more traditional media.

But there is a reason why newcomers to programming like Netflix, Xbox, and Amazon TV have all hired former entertainment executives to develop their new series entries. And there is probably a reason that they are defaulting to known stars and producers to deliver them. At the same time, the creative agencies in Hollywood have legions of so-called "twenty-somethings" diligently scouring the web for emergent talent to scoop up for representation. The entertainment business is hard to forecast—despite Google's recent announcement that search results can predict weekend box office. As one studio executive remarked, "Duh..." Still, the average YouTube video is 2:46 long, so short-form is definitely king for the standard-issue web viewer, and fifteen-second rich media banners are another indicator.

How much narrative can fit into such a format, and does it drive sequential viewing the way traditional long form does? A generational shift has certainly taken place with access to multiple-screen entertainment, and hopefully it will yield new storytellers and formats. But web storytellers should look at securing a foundation in what has made stories relevant ever since the tribal fireside, rather than engaging only in producing the latest novelty video. We're hopefully moving beyond the cat videos that dominated early YouTube successes, and the opportunities in a multi-channel distribution universe offer these newcomers novel ways of stepping up to the plate and potentially reaching huge audiences. There are other challenges.

I recall a well-attended confab a number of years ago at the home of Buzztone Marketing head Liz Heller, where a web video producer exuberantly proclaimed that his video had received over fifty thousand "hits". I strenuously resisted the temptation to ask whether the scenario would have changed had he asked viewers to pay for the privilege.

Even so, the behavioral shift in attention span has yielded an almost cannibalistic appetite on the part of users for novelty, rather than for what network program executives used to call "appointment viewing". Google's addition of watch-time metrics speaks to this trend because it essentially measures whether viewers are sticking with a video all the way through, as opposed to merely grazing and moving on to the next recommendation from the engine itself or from their friends. Among older viewers, binge viewing has also had an impact on the way that traditional television series are watched. Time Warner invested over $30 million last year in Maker Studios, and perhaps it will pay off. It also raises the question: does taking this level of investment mean that Maker is not making any money? Predicting the future in Hollywood—exciting as these expansive platforms and distribution channels are—may be better left for the time being to Las Vegas bookies.

(With props to Bill Sarine from Beachglass Films for input and inspiration for this post)

Tuesday, July 24, 2012

THE NETWORK OF ALL NETWORKS


 
The Web is old enough that several generations have grown up with it at this point. For those of us who can recall that lonesome, crunchy ripple of dial-up, the wonder of Mosaic, the bustling of bulletin boards, and other artifacts of the early days of the World Wide Web, it may be hard to imagine the lives of young people who have never known what it is like to live without the Internet.

Its founding quietly took place back on October 29, 1969, with a message sent from UCLA to the Stanford Research Institute. According to Jonathan Zittrain, author of The Future of the Internet and How To Stop It, the purpose of building such a network at that time “was not to offer a particular set of information or services like news or weather to customers…but to connect anyone on the network to anyone else. It was up to the people connected to figure out why they wanted to be in touch in the first place; the network would simply carry data between the two points.” Without a large war chest, the academics and corporate researchers behind its original design couldn’t contemplate the creation of a global network. Now that this has occurred, it’s fascinating to see that its pervasive role in connecting people to brands and services is not that different from how it worked in its infancy.

Its popular use ramped up back in 1995; by the end of that year, it had grown to a robust 16 million users. The latest data shows that there are over two-and-a-quarter-billion global users today—or about a third of the world’s total population.

The mobile web, social media, and social TV—networks within a network—have expanded the Internet's ecosystem beyond the affinities, communities, and other nodes of specialized interest that grew out of its original capabilities. The potential to knit together networks for brands and marketers presents a compelling new non-linear business model.

It is so much a part of our lives that we sometimes forget that the Internet is a network of networks. Its use now extends across social networks, the mobile web, landing pages, and RSS broadcast (among others), and each new extension should remind us that the power of the Web lies not just in the common URL, but in the way that it radiates from site destination points to expansively "cast a net widely".




Network Conceptualization of Twitter - http://www.cytoscape.org

Directing eyeballs to a single web site is an expensive and consuming venture. While there are certain goliath brands like Amazon and eBay that have established themselves as islands in the data stream, we are moving back to the future in a sense, from the destination orientation of the web to a radial network model. What exactly does this mean for marketing?

When a client comes in and characteristically asks for “a new web site” that is blue and has flowers, what they may really be asking for is a functional hub inside a multi-channel, branded ecosystem. This hub sits at the center of a hybrid digital and traditional media wheel that has multiple spokes (or channels), which expand on a conversational journey outward, around, and looping back. If effective, these pathways have the eventual effect of creating a dynamic platform by linking to various targeted nodes of influence and driving users, consumers—an audience—back to the mother lode at the web site home. With consumers favoring multiple media channels, it is important that brands reach out to them wherever they are and on whatever medium they favor. We are no longer living in a world where the all-too-familiar address of www. has a lemming effect.
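
To make the wheel concrete, here is a minimal sketch in Python of the radial model (the hub URL and channel names are invented placeholders, not any client's actual ecosystem): every spoke links outward from the brand hub and loops back to it.

```python
# A minimal sketch of the radial "hub and spoke" brand network model.
# The hub URL and channel names are hypothetical placeholders.
brand_hub = "www.brand-home.example"
channels = ["YouTube", "Facebook", "Twitter", "Mobile app", "Email", "RSS"]

# Each spoke is a two-way conversational pathway: outward from the hub,
# around the channel's own community, and looping back home.
edges = [(brand_hub, c) for c in channels] + [(c, brand_hub) for c in channels]

def neighbors(node):
    """Every node that a given node links to directly."""
    return [dst for src, dst in edges if src == node]

# Sanity check: every journey that starts at a spoke can return to the hub.
assert all(brand_hub in neighbors(c) for c in channels)
print(f"{brand_hub} radiates out to {len(channels)} channels, each looping back.")
```

The point is the shape rather than the code: take away the hub and the spokes no longer form a network at all.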

Social media black belt and integrated-marketing-strategy Technoratisaurus Rex Liz Gebhardt formulated a depiction of the multi-channel universe in a blog post about product modeling written over three years ago, which I have affectionately called "The Gebhardt Brand Mandala" ever since:


At the core of any network is the hub that establishes its identity and center of gravity. Gebhardt’s model is more relevant than ever—especially as video platforms converge with traditional broadcast and short-form content. Even so, mainstream television and cable broadcast networks have for years now had to rely on watermarks floating on the lower third of the screen in a feeble attempt to claw back some remaining identification. This is not easy when TV brands are whipped up in the 500-plus-channel universe blender and feature ever more redundant programming. With the addition of Internet and mobile television to the mix, as well as behavioral trends like social TV and the rise of “binge viewing” (see my next blog post), the whole concept of the video “broadcast network” is being reinvented and is up for grabs.

If YouTube is any indication, today’s average web user is videocentric, searching for constant novelty with a cannibalistic fervor, and beset by a short attention span—two minutes and forty-three seconds, to be exact. It’s also a short-form video world after all on the Internet, with 55% of global bandwidth hogged by some form of video—at any given time, it’s 30% BitTorrent, with YouTube playing catch-up at 25%. What does this volume mean for brands, services, or products in search of an audience?

Primarily, it means that if the common language of the world is now visual, then you would be wise to “speak” video to get your message across. Back in the day, Yahoo had what it called the “Three-Second Rule”, which admonished internal developers to deliver value within that time frame or risk losing the user forever. This was a far more dire prospect than in the days when the TV remote control ruled and broadcasts had to engage viewers or else—another network was only a click away. Today, we are overwhelmed with the choices that clicks can offer us, and the order of the day is to grab attention through immediacy, interactivity, personalization, community, freemium offers, and other methods of creating value for the end user. All of these elements should inform how a brand platform or network is created.

The brand network model should be customized, but its essential look is planetary with marketing channels orbiting like satellites around the brand hub:



In taking a non-linear approach, we are only mimicking nature, where radial models dominate and networks abound. Chinese Taoist philosophers called the rivers and waterways of earth its circulatory system, and the planetary water cycle bears this analogy out as science. Theodor Schwenk’s Sensitive Chaos, a classic study of patterns in water flow, calls to mind the old saying that “water seeks its own level”—an adage that is also an apt description of social network behavior.

The idea of recurring patterns in nature, their evolution in time, and their configuring and reconfiguring to enhance movement is the basis of a principle known as the constructal law. One of the designs in nature that it studies is branching. Lightning, water, cardiovascular systems—all have evolved treelike architecture because it is an efficient way of moving currents. The constructal law also extends to organization in business, politics, science, and other fields where hierarchy and flow create patterns.

The brain is famously a neural network, and the web of life can be seen as a vast set of self-organizing nodes. Why should its mirror in the Internet be any different? The creation of a brand platform network is an organic expression of those capabilities.

The history of successful networks and cable channels is that they have been branded by a defining show or personality that encapsulates a broadcast entity’s mission and identity. Early television networks ported over successful radio personalities as a way not only of bringing along their loyal audiences, but of defining their program offerings—whether comedy, drama, soaps, game shows, or other familiar formats. Though it started in 1948, the ABC network really established itself in the 1970s as differentiated from its predecessors, CBS and NBC, with the introduction of innovations like “Monday Night Football” and the “Movie of the Week”.

Later, in 1986, when Fox arrived, it distinguished itself as the “18-34” demographic network with younger-skewing, slightly more risqué shows like “The Simpsons” and “Married With Children” that took familiar formats to new extremes. MTV was not on the map until it broadcast “Remote Control”, its own version of a game show that turned a storied format on its head and signaled to both audience and advertisers that this was not your parents’ network. Successful branding by Emeril Lagasse established the Food Network with a larger-than-life chef character.

The idea of personalities defining networks is as old as Ed Sullivan, Walter Cronkite, and Dan Rather branding CBS; Johnny Carson and Bill Cosby branding NBC; and Roseanne and Barbara Walters branding ABC. Audiences like to identify with personalities and characters. A nascent network’s program development strategy should be informed by an active search for talent and for defining show concepts that can attract viewers, differentiate its value proposition, and compel advertisers to invest.

The emergence of watermarks on broadcast and cable television during the ’90s was no accident. In a cluttered landscape of multiple brands with often-redundant program offerings, they became an essential feature to help audiences know which entity they were actually watching. Whether a new network undertakes long- or short-form programming, it needs to be packaged in a way that is clearly branded. This is particularly the case for any web-related or mobile video. Current effective tactics for branding syndicated and seeded web video content include end-plate, white-wrap, and vanity-URL techniques, which ensure tracking of traffic directly related to video campaigns.

The diversity of media choices to connect brands with consumers has never been greater. The challenge presented by this opportunity is how to integrate the real with the digital world, and to make a strategic assessment of which channels represent the optimal means of reaching audiences wherever they are. The media and marketing networks of the future will be integrated. Brands will all be broadcasters.

While it is still early—especially with networks and agencies learning the hard way about paid blogging and in-your-face tweeting—organic growth and behavioral change will fuel the eventual integration of platforms. Movies did not replace radio, and neither did TV. The Internet did not kill television, as many web evangelists predicted in its late-’90s go-go years. Social media will not replace prime-time programming. On the contrary, several IPTV startups are looking at permanent integration of Twitter. Putting chips down on all of these new marketing-channel markers is strategic, but allocation of resources and investment needs to be measured, given that ROI, SROI, and analytic models are still evolving. Traffic is still a key indicator but is no longer enough of a metric; measuring influence is another step in the right direction.

Every new medium defines its own market at the same time that it forces extant media to redefine their own market share. Survival depends on companies and creative talent being able to recognize and optimize the unique value that differentiates one from another, and to provide the appropriate content accordingly. The marketing networks of the future will identify the best means to reach their intended consumers, to create value, and to deliver an optimal, personalized experience with their respective brands and content. They will also ensure that consumers are empowered with information and enabled, through the multiple channels they provide, to be active participants in evolving brands for the better.

In the future, brands should be more persuaded by the words of poets than by those of marketers. Or, in the words of Emily Dickinson:
                                                    
        Tell All The Truth

        Tell all the truth but tell it slant,
        Success in circuit lies,
        Too bright for our infirm delight
        The truth's superb surprise;

        As lightning to the children eased
        With explanation kind,
        The truth must dazzle gradually
        Or every man be blind.

We are all nodes on the network of all networks now. Or to let Shakespeare have the final word, "'Tis true, there is magic in the web of it." 

Saturday, March 26, 2011

WHAT IS "SOCIAL"?


A more recent take on Andy Warhol’s famous dictum puts us in a future where we will all have 15 friends. If you Google the word “social”, you get over 2 billion results. But what is this “social” that we all take for granted and of which we all so readily speak? The word appears in history prior to the year 1387 as sociale, apparently borrowed from the Latin via the Middle French. Routed from the Roman mother tongue, it originally meant “united or living with others” and “companion.”

Looking just one step further into the wilderness of word origins, we find its root in the Latin sequi, which means “to follow.” So here, in a nutshell, is where the Twitter transitive verb “to follow” finds its first use. If we search still further, we come upon its link to the Old Icelandic seggr, meaning “companion or man”, and ultimately to the mother lode in Sanskrit where, as sakha, it simply means “friend.”

Here we arrive at the root origin of the Facebook transitive verb “to friend”, closing the loop on a word that we use every day to describe the expanding communication ripples that bind, link, and otherwise connect us at a click. Or, to paraphrase what Terence, the Roman playwright, might tweet: “Nothing social is foreign to me.”

Friday, March 25, 2011

THE STRATIGRAPHY OF "SUCK"


I kind of like to know what I’m talking about. At least, I like to know what the words I’m using mean, even if I can’t make sense of what I am trying to make them say. We invariably use lots of words throughout our daily lives without reference to where they come from or how their original meaning has changed. Sometimes we’re even distant from slang that seems current but may be recycled. Words like “cool” have re-entered the lingo of new generations who don’t know that it came from the bebop beatniks, daddy-o. The first time I heard it after I’d first heard Edd “Kookie” Byrnes say it on “77 Sunset Strip” was in Silicon Valley in the ’90s—and it came out of the mouths of some very geeky engineers. I still get a funny feeling when I hear Bill Gates use it in one of his testimonials.

There are other words in common usage that are very distant from their origins—one in particular is almost as widespread as “like” and “awesome”. That word is “suck”, and it’s been somewhat twisted—not necessarily to mean something entirely new—but it has found wide social acceptance despite its low origins.

When I was about 11, I bought some badges at the local hippie emporium. One of them said, “Dracula Sucks”, which my father made me take right off my Sgt. Pepper’s jacket and toss in the trash. I was surprised, and he answered what must have been my hangdog look by saying that it was “just inappropriate.” That was enough for me to spend the rest of the night seeking out its deep, dark, hidden meaning. I understood better when I discovered it referred to a sex act that my teachers probably would not see eye-to-eye with as a point for extra class discussion.

But today, “suck” is so commonly used in commercials, on talk shows, by politicians, and in everyday conversation across generations that it seems to have been denuded of its original meaning. It’s used to convey a general sense of something that is awful. Its reference to a subservient position for one participant in a sex act may be hidden in the mists of time—or at least in how well-worn it’s become as part of the daily lexicon used by schoolchildren and adults.

I wondered if its popular use might be excused somehow—maybe there was another meaning that forgave its vulgar origins. After some digging into a handy dictionary of etymology, I discovered that the word was part of a once popular phrase. It just so happened that “suck” also designated the sad lot of the runt of the porcine litter who was left to suckle on the hindmost teat. Eventually, the expression gave way to “suck hind tit”, which is probably shrink-wrapped today in common usage as good old, plain “it sucks!” So, the next time you are tempted to use it, remember that words are chameleons that double up and sometimes come back to bite us like the time-tossed travelers they are.

(With thanks to Suw Charman Anderson and Peter Corbett: http://charman-anderson.com/2010/02/04/the-impenetrable-layer-of-suck/)

Saturday, October 23, 2010

MAD MEN’s DON DRAPER: NOWHERE MAN IN SEARCH OF TRUTH IN ADVERTISING


“Who is Don Draper?” is one of the central questions of the hit AMC series. A friend asked my opinion of this season’s finale and how I thought the series would ultimately end. It got me thinking about a lot more than Don’s dilemma and urge to confess. My confession is that I have a love/hate relationship with the show.

I grew up in the era that it depicts so well—what I hate is the mirror reflection of what I remember of that time as a child of divorce. But what I love is the great dramatic craft and wonderful acting—even though I did pick up one anachronism last year. In the offending episode, the ad guys celebrated on one occasion by breaking out a bottle of Stolichnaya—a gesture that certainly would not have gone over too well in the Cold War Era. I’ve subsequently seen bottles of Smirnoff in later episodes, which has righted the situation. That said, there is so much to admire about the art direction and attention to period detail that it almost makes you want to don your fedora, take out a pack of Lucky Strikes, and reinstate the three-martini-lunch ritual.

But getting back to my friend’s questions: I told her that I thought the last scene of the finale was unnecessary. When I saw the masterful scene before it, I thought that it was a great open ending—after Don tells his ex-wife that he’s getting remarried, we see an empty bottle on the kitchen counter of their former family home, framed center stage like a dark Courbet still life, as the lights are shut off by the departing former couple in what I took to be a fade to black and end credits. The question becomes: is there a spark between them that will come between Don and his impending new marriage? We should already have reservations about the match, knowing that he is an inveterate cad, and his impulsive decision to marry his secretary does not augur well for faithfulness or longevity.

But the writers couldn’t resist the obvious, and my enthusiasm was quickly dampened because it didn’t end there. They chose to continue and cut to Don and his fiancée in bed with Sonny and Cher’s “I Got You Babe” playing in the background. Now the implication here—and a rather heavy-handed one, I thought, whether subconscious or not—is that the duo singing on the soundtrack had a storybook Hollywood marriage that ended in divorce—hence, the seemingly sentimental hippie love song casts a foreboding shadow over the betrothed couple lounging in bed. This final scene lacked the subtlety of the one before it—do we really have to spell everything out in TV America? The kitchen scene struck me as something you’d see in a foreign film; the bed scene, typical soap opera.

OK. I’ll stop producing the show. Let’s get back to Don. He’s got a secret that is now burning after the last season. The perfection of his character goes right down to his Dickensian name—his “adopted” surname, “Draper”, lends itself to two meanings: first, his hidden identity is sequestered under the draping of what happened in the Korean War; second, in addition to hiding personal truths, he is the perfect Ad Man because he’s so adept at all the shadings of truth that his profession requires.

The well-spun phrase “Truth in Advertising” (actually a law) certainly has a different spin to it especially with the platform now offered by social media. Consumers can instantly flex their muscles and spark negative PR grassfires that can grow into the kind of outright conflagrations that have sometimes brought corporate giants to their knees—remember the Domino’s pizza disaster when several misbegotten pizza twirlers posted a video on YouTube showing them adorning their pies with toppings that were…shall we say, not on the menu, but definitely organic? There are countless other examples that have motivated most major corporations to preemptively hire legions of twenty-year-olds to maintain a vigilant watch in the blogosphere for negative consumer rumblings. Mad consumers can now be an activist virus.

Back in the 1960s, however, we were living the American Dream and drinking the Kool-Aid that turned us into that consummate culture of consumption, which has for the last several decades displaced our standing as a manufacturing country—the rest, as they say, is the subject of daily reports on the unemployment rate, foreclosures, and the general Fall of the Dow Jones and perhaps Empire. One of the many things that “Mad Men” gets so right is the way that we were sold and bought a bill of goods in the ’50s and ’60s, from unfiltered cigarettes to the bomb shelters that we didn’t need. It all seemed so simple—merely “duck and cover” until the inconvenience of an atomic attack passed over and we could return to our regularly scheduled programming.

So, Don Draper is really the Beatles’ “Nowhere Man” come several years early, and in some respects he is a reflection of several generations who lived through the post-World War II era. He’s caught between the sheets as a relationship train wreck who doesn’t know who he is, and caught between the ’50s and the ’60s that are starting to explode, as we see in the season just concluded.

Don is a hybrid archetype of both T.S. Eliot’s “hollow men” and the discontented businessman captured by Sloan Wilson’s “The Man in the Gray Flannel Suit”, a ’50s best seller and a hit movie with Gregory Peck and Jennifer Jones in 1956. Like that work, “Mad Men” has appropriately been celebrated for capturing the mid-century American zeitgeist, just as “The Social Network” has been cited as doing the same for the Internet Era. The irony about Don’s secret is that he is about to enter a time period when his identity problem will no longer be relevant.

Even though I enjoy the series, I hope that next season is the last. In America, the lifespan of a TV series is motivated not so much by its organic narrative shape and pulse as by the economic imperative of reaching the magical goal of a syndicatable 65 episodes. This is the threshold for the number of episodes that can be “stripped”—distributed as “repeats” five days a week—for roughly a quarter of a year (13 weeks) before they have to recycle and repeat. One of the reasons that foreign, and in particular British, TV series seem to have an edge to them—take the seamlessness and naturally closed narrative arc of a classic series like “The Prisoner”, for example—is that rather than being produced until they run out of steam, they are written and produced as so-called “limited series”, a standard emulated well by some US cable network shows.

The end of “The Sopranos” was roundly criticized as a cop-out by many critics and fans, and it is a prime example of how the American system is wanting at times. Sitcoms may be one thing to draw out for as long as the stars remain close to the ages of their characters, but drama is better written as an entire story arc at the outset, rather than running on tracks with no final destination in sight—except having enough episodes to syndicate.

So, how should “Mad Men” end? Here’s my take: An energized client pitch is disrupted at the agency offices as a growing brouhaha emanates from the New York streets below. It's the sound of thousands of Peace Marchers parading to protest the Vietnam War and starting to fight with hooting construction workers. Maybe according to the series’ lifecycle, it’s a tad early for this and I’m committing my own anachronistic crime, but time lapse could help the series get through the inevitable relationship body counts which predictably lie ahead.

For all we know, Don may have already dropped acid in a future episode, thus confusing his identity issue even further like so many who psychedelicized. Now, with a burgeoning Peace Movement and Hippie Scene converging on our Nowhere Man, he is overcome by curiosity as everyone in the pitch meeting is drawn in astonishment to the high window to look out over the spectacle of history in the making. Impulsive to the end, he bails on the pitch and descends to the street. On the ground, he is caught up in the crowd, looking unsure of himself as his tie is loosened and jacket pawed by hippie chicks who welcome the “straight”. We last see him as he looks around in disbelief, not knowing whether to join “the parade” or run for his life. What he realizes quickly is that his desire to confess and his problems, in the immortal words spoken by Humphrey Bogart at the end of “Casablanca”, don’t “amount to a hill of beans” compared to the march of history. And the audience doesn't know either. End of story.

That way, Don Draper represents a whole generation of Mad Men, who like my father, were all so convinced that they were defined by their work. Blame the Cold War or Madison Avenue. Don is only special because he had to deploy a mask to cover up an identity issue that is no longer a big deal when assassinations, LSD, and Vietnam ripped open the facade of the mythic 50's/60's “Ozzie and Harriet”/I Like Ike/Apple Pie/Take A Letter/Zone, and everyone was revealed as not knowing who they were, where they belonged, and what tribe was right for them...Ultimately, Don can only find the redemption we all hope for him once the women finally take over, so maybe he gets hit on the head at the end with all the secretaries’ burning bras as they fall from high out of the agency office windows like a snowy ticker tape parade over Madison Avenue.

And now, for these messages from our discorporate sponsors…

(With thanks to Sarah Kelley)

Wednesday, September 29, 2010

DEATH BY DATA


I just finished listening to the complete symphonies of Franz Joseph Haydn, who is widely recognized as “the father of the symphony.” His achievement is incredible if only for its sizeable output—some 107 works in all. The reason I bring it up is not out of any odd feeling of accomplishment—though the experience was filled with musical wonders—but because it has made me think that before digital media came on the scene, it would not have been possible to listen to them all—unless, of course, I was able to sit through the four years of concerts that it took the Stuttgarter Kammerorchester to record the 37-CD set.

The digital compact disc made available comprehensive box sets of individual artists, composers, and bands in diverse collections that encompass the history of music genres, including the arcane. It may be daunting to confront an artist’s complete works when they exhibit the scale of a Haydn, for example. The Internet has also made it possible to expand one’s reach exponentially into the world of the consequential, in addition to burying us in minutiae and trivia. The question becomes: how do we go about discovery and finding meaning in this mirror maze of data?

It’s nothing new to say that we are suffering under the weight of information and the grip of technology. Jaron Lanier’s recent book, You Are Not A Gadget, is as good as any in the list of jeremiads warning us about giving up our souls to silicon-based lifeforms. Personally, I experienced a tipping point this past summer, with my inbox groaning for the mercy of the delete button and unsubscribe links becoming my truest online friends.

The data available at a mere mouse click through search is imposing as well. Recently, my ten-year-old son expressed an interest in movies about World War II. He came to me frustrated by the wide range of choices offered by Netflix. It became apparent to me that his desire for discovery needed human intercession—and not the kind offered by the several engines that pride themselves on non-robotic crawler solutions and even so-called “human search.” Collaborative filtering and recommendation engines be damned—what he was asking for was curation.
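
For contrast, here is a toy sketch in Python of the kind of user-based collaborative filtering such recommendation engines lean on (the viewers, titles, and ratings below are all invented). It can surface what people with similar tastes watched, but it cannot say why a title matters—which is exactly the editorial gap curation fills.

```python
# A toy user-based collaborative filter: recommend what similar users liked.
# All viewers, titles, and ratings are invented for illustration.
from math import sqrt

ratings = {
    "alice": {"The Longest Day": 5, "Patton": 4, "Casablanca": 2},
    "bob":   {"The Longest Day": 4, "Patton": 5, "Tora! Tora! Tora!": 4},
    "carol": {"Casablanca": 5, "Roman Holiday": 4},
}

def similarity(a, b):
    """Cosine similarity between two users' rating vectors."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    norms = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norms

def recommend(user):
    """Score titles the user hasn't seen by similarity-weighted ratings."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = similarity(ratings[user], theirs)
        for title, r in theirs.items():
            if title not in ratings[user]:
                scores[title] = scores.get(title, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

# Alice gets more war movies because Bob liked them; no human judgment involved.
print(recommend("alice"))
```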

The future of search is curation. I am convinced it will be at the foundation of many successful business enterprises and of careers for individuals who can provide an editorial perspective on qualifying information. It’s not enough just to make the information available, as we have been finding out. Say you were new to rock and roll—or Haydn, for that matter. Where would you start? Google? Wikipedia? iTunes? And if so, how reliable are these methods? Google’s acquisition of Metaweb last July speaks to emergent search methodologies that attempt to provide a layer of contextualization. Wolfram|Alpha is another that steps up the visual component of search.

In a conversation I once had with Frank Zappa, he pointed out that the binary mind behind modern computer technology is more limited than we think, particularly when taking into consideration the nature of time. He saw the conventional perspective of past, present, and future augmented by “never” and “eternity” and offered a vision of time as spherical and non-linear. He suggested that a computer that added these two states to the conventional “on” and “off” switching would yield results more in keeping with the way we live in time radially with our brains. Before he died in 1993, Frank joked that the Japanese “had probably already been working on it.”
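
Purely as a thought experiment—and emphatically not anything Zappa actually designed or specified—his four temporal states might be sketched in Python as a little four-valued logic; the combination rule below is my own guess at how “never” and “eternity” would behave.

```python
# A playful guess at Zappa's four temporal states as a four-valued logic.
# Nothing here is his actual design; the combination rule is pure speculation.
from enum import Enum

class TemporalBit(Enum):
    OFF = "off"            # conventional binary 0
    ON = "on"              # conventional binary 1
    NEVER = "never"        # a state that can never hold
    ETERNITY = "eternity"  # a state that holds outside linear time

def temporal_and(a: TemporalBit, b: TemporalBit) -> TemporalBit:
    """Speculative conjunction: NEVER dominates, ETERNITY defers to the other."""
    if TemporalBit.NEVER in (a, b):
        return TemporalBit.NEVER
    if a is TemporalBit.ETERNITY:
        return b
    if b is TemporalBit.ETERNITY:
        return a
    both_on = a is TemporalBit.ON and b is TemporalBit.ON
    return TemporalBit.ON if both_on else TemporalBit.OFF

print(temporal_and(TemporalBit.ON, TemporalBit.ETERNITY))  # TemporalBit.ON
```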

The religious scholar Mircea Eliade once pointed out that the end of an era or great age often generates a popular belief that if all information were made available, the Answer would then present itself. Of course, if Google were a religion, this idea would be the central tenet of the digital faith—and any entity whose corporate philosophy is “You can make money without doing evil” might arouse suspicions. Its mega-initiatives like Google Earth and Google Books should raise an eyebrow at least. Who knows, maybe Google has already discovered the Answer to the Answer.

But, on the whole, I prefer to look for the answer in music—say, in one of Bach’s inventions or in a John Coltrane solo—than in any old text-based search. It is here that we are presented with the age-old battle of what came first at the Creation—a subject of one of Haydn’s masterworks as well—did the universe start with light, as in a very special visual effect, or was it born of sound, mantra, or “the Word”? I’ll place my bet on the sound of music any day, because a Google search I just did yields 146,000,000 results for “Let there be light” versus a search for “The Big Note”, which wins with 203,000,000—so it must be true...

Friday, March 12, 2010

JIMI B. GOODE


Anthony DeCurtis’ recent New York Times article, “Beyond The Jimi Hendrix Experience” (2/28/10), is refreshingly accurate, especially about the circumstances of the great guitarist’s death, which he attributes correctly to “misadventure” as opposed to the heroin overdose so often reported as fact, especially in anti-drug propaganda. However, his characterization that Jimi “never spoke out about the pressing civil rights issues (of his day) either in his lyrics or in interviews” is simply not accurate. A look at some of the facts reveals an essential part of the man, his music, and his times.

Though it may not reference them directly, his song “House Burning Down” (which appeared on the “Electric Ladyland” album) was written following the 1967 Detroit riots and implores in one lyric, “Try to learn instead of burn, hear what I say.” In live performance, Jimi usually dedicated “I Don’t Live Today” to the “American Indian”, in part as tribute to his Cherokee grandmother and as a heartbreaking description of the plight of reservation life. Any listener interested in his attitude toward race should reference the lyrics of both of these songs.

The scorched-earth instrumental “Midnight” is apocryphally said to have been an improvisation recorded in anger and outrage the night after Martin Luther King’s assassination. Transcending color is a major theme in much of his body of work. It is also a featured element in his legendary and as-yet-unreleased musical autobiography, “Black Gold.” He spoke about civil rights in many interviews as well, including one in which he said, “I wish they'd had electric guitars in cotton fields back in the good old days. A whole lot of things would've been straightened out.”

According to some of his closest friends, Jimi was troubled by the fact that the African American community did not embrace him during his lifetime. Despite his free street concerts in Harlem, promotion as the “Black Elvis”, attempts by the Black Panthers to recruit him, and his replacement of the original Experience with the all-black Band of Gypsys, Hendrix’s fan base remained largely middle-class and white—even though he greatly influenced contemporary black musicians like Miles Davis and Sly Stone. Among the biographies that treat this aspect of his life are David Henderson’s well-received “'Scuse Me While I Kiss The Sky: Jimi Hendrix: Voodoo Chile” and “Jimi Hendrix: Electric Gypsy” by Harry Shapiro and Caesar Glebbeek (founder of the Hendrix Information Center).

Like the era with which he is so synonymous, Jimi Hendrix was complex, conflicted, and deeply indebted to the African American musical tradition. His success—typified by the fact that he was the first rock performer of any color to earn $100,000 for a single concert—was the hard-earned result of playing as Little Richard’s guitar player, working the chitlin’ circuit with the Isley Brothers and other major R&B artists, and acknowledging still other forebears, as he did with his definitive, nonchalant cover version of Chuck Berry’s “Johnny B. Goode” (memorialized in the documentary “Jimi Plays Berkeley”).

Today, Jimi’s legacy as a fearless virtuoso is common wisdom. But the part of his personal story that also needs to be told is that he was truly an artist beyond color—and that is why, as Mr. DeCurtis observes, he also continues to be “an enduring symbol of personal freedom.” His breakthrough as one of the most celebrated rock stars of the sixties—and the only one “of color”—is an achievement that should be considered statement enough about race relations at that tumultuous time.